WorldWideScience

Sample records for pipeline processing

  1. Youpi: YOUr processing PIpeline

    Science.gov (United States)

    Monnerville, Mathias; Sémah, Gregory

    2012-03-01

    Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.
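
    As a rough illustration of the kind of reduction chain Youpi orchestrates, the sketch below drives the same TERAPIX tools (SExtractor, SCAMP, SWarp) from plain Python via their command-line interfaces. The file names, configuration files and option flags are assumptions to be checked against a local installation; Youpi itself wraps these steps in a web interface and submits the jobs to Condor.

```python
# Minimal sketch of a SExtractor -> SCAMP -> SWarp reduction chain driven from
# Python, similar in spirit to what Youpi automates on a cluster.
# Tool names and option flags are assumptions; check them against your installation.
import subprocess

images = ["exp1.fits", "exp2.fits"]          # hypothetical input exposures

# 1. Extract sources from each exposure (FITS_LDAC catalogues for SCAMP).
for img in images:
    subprocess.run(["sex", img, "-c", "default.sex",
                    "-CATALOG_NAME", img.replace(".fits", ".ldac"),
                    "-CATALOG_TYPE", "FITS_LDAC"], check=True)

# 2. Compute an astrometric/photometric solution from the catalogues.
subprocess.run(["scamp"] + [img.replace(".fits", ".ldac") for img in images]
               + ["-c", "default.scamp"], check=True)

# 3. Resample and co-add the exposures using the solution computed by SCAMP.
subprocess.run(["swarp"] + images
               + ["-c", "default.swarp", "-IMAGEOUT_NAME", "coadd.fits"], check=True)
```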

  2. Maurer computers for pipelined instruction processing

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2008-01-01

    We model micro-architectures with non-pipelined instruction processing and pipelined instruction processing using Maurer machines, basic thread algebra and program algebra. We show that stored programs are executed as intended with these micro-architectures. We believe that this work provides a new

  3. Astronomical pipeline processing using fuzzy logic

    Science.gov (United States)

    Shamir, Lior

    In the past few years, pipelines providing astronomical data have become increasingly important. The wide use of robotic telescopes has yielded significant discoveries, and sky survey projects such as SDSS and the future LSST are now considered among the premier projects in the field of astronomy. The huge amount of data produced by these pipelines raises the need for automatic processing. Astronomical pipelines introduce several well-defined problems such as astronomical image compression, cosmic-ray hit rejection, transient detection, meteor triangulation and association of point sources with their corresponding known stellar objects. We developed and applied soft computing algorithms that provide new or improved solutions to these growing problems in the field of pipeline processing of astronomical data. One new approach that we use is fuzzy logic-based algorithms, which enable automatic analysis of astronomical pipelines and allow mining the data for not-yet-known astronomical discoveries such as optical transients and variable stars. The developed algorithms have been tested with excellent results on the NightSkyLive sky survey, which provides a pipeline of 150 astronomical pictures per hour and covers almost the entire global night sky.

  4. Astronomical pipeline processing using fuzzy logic

    Science.gov (United States)

    Shamir, Lior; Nemiroff, Robert J.

    2008-01-01

    Fundamental astronomical questions on the composition of the universe, the abundance of Earth-like planets, and the cause of the brightest explosions in the universe are being attacked by robotic telescopes costing billions of dollars and returning vast pipelines of data. The success of these programs depends on the accuracy of automated real time processing of images never seen by a human, and all predicated on fast and accurate automatic identifications of known astronomical objects and new astronomical transients. In this paper the needs of modern astronomical pipelines are discussed in the light of fuzzy-logic based decision-making. Several specific fuzzy-logic algorithms have been developed for the first time for astronomical purposes, and tested with excellent results on a test pipeline of data from the existing Night Sky Live sky survey.
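
    The abstracts above do not spell out the membership functions used, so the sketch below is a purely hypothetical illustration of the fuzzy-logic style of decision-making they describe, scoring a detection as a likely cosmic-ray hit from two fuzzified features.

```python
# Hypothetical sketch of fuzzy-logic classification of a detection as a
# cosmic-ray hit, in the spirit of the algorithms described above.
# The membership functions and thresholds are illustrative only.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function rising on [a,b], flat on [b,c], falling on [c,d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def cosmic_ray_confidence(sharpness, neighbour_flux_ratio):
    # "Very sharp" profile: most flux concentrated in a single pixel.
    mu_sharp = trapezoid(sharpness, 0.6, 0.8, 1.0, 1.01)
    # "Isolated": little flux in surrounding pixels compared with a stellar PSF.
    mu_isolated = trapezoid(1.0 - neighbour_flux_ratio, 0.5, 0.7, 1.0, 1.01)
    # Fuzzy AND (minimum) of the two memberships.
    return min(mu_sharp, mu_isolated)

if __name__ == "__main__":
    conf = cosmic_ray_confidence(sharpness=0.9, neighbour_flux_ratio=0.1)
    print("cosmic-ray confidence:", conf)   # e.g. reject the pixel group if > 0.8
```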

  5. A study of processes for welding pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Weston, J. (ed.)

    1991-07-01

    A review was made of existing and potential processes for welding pipelines: fusion welding (arc, electron beam, laser, thermit) and forge welding (friction, flash, magnetically impelled arc butt, upset butt, explosive, shielded active gas, gas pressure). Consideration of J-lay operations gave indications that were reflections of the status of the processes in terms of normal land and offshore S-lay operation: forge welding processes, although having promise, require considerable development; fusion welding processes offer several possibilities (mechanized GMA welding is likely to be used in 1991-2); laser welding requires development in all pipeline areas; a production machine for electron beam welding will involve high costs. Nondestructive testing techniques are also reviewed. Demand for faster quality assessment is being addressed by speeding radiographic film processing and through the development of real-time radiography and automatic ultrasonic testing. Conclusions on the most likely future process developments are: SMAW with cellulosic electrodes is best for tie-ins and short pipe runs; SMAW continues to be important for small-diameter lines, although mechanized GMA could be used, along with mechanical joining, MIAB, radial friction, and flash butt welding; mechanized GMA welding is likely to predominate for large-diameter lines and probably will be used for the first J-lay line (other techniques could be used too); and welding of piping for station facilities involves both shop welding of sub-assemblies and on-site welding of pipe and sub-assemblies to each other (site welding uses both SMAW and GMAW). Figs, tabs.

  6. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Science.gov (United States)

    2013-05-28

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and Hazardous Materials Safety Administration, DOT. ACTION: Notice of... fitness for service processes. At this workshop, the Pipeline and Hazardous Materials...

  7. A parallel-pipelining software process model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Software process is a framework for effective and timely delivery of software systems. The framework plays a crucial role in software success. However, the development of large-scale software still faces the crisis of high risks, low quality, high costs and long cycle times. This paper proposed a three-phase parallel-pipelining software process model for improving speed and productivity and for reducing software costs and risks without sacrificing software quality. In this model, two strategies were presented. One strategy, based on subsystem-cost priority, was used to prevent waste of software development cost and to reduce software complexity as well; the other strategy, used for balancing subsystem complexity, was designed to reduce the software complexity in the later development stages. Moreover, the proposed function-detailed and workload-simplified subsystem pipelining software process model presents much higher parallelism than the concurrent incremental model. Finally, the component-based product line technology not only ensures software quality and further reduces cycle time, software costs and software risks, but also sufficiently and rationally utilizes previous software product resources and enhances the competitiveness of software development organizations.

  8. Processing of Ultralow Carbon Pipeline Steels with Acicular Ferrite

    Institute of Scientific and Technical Information of China (English)

    Furen XIAO; Mingchun ZHAO; Yiyin SHAN; Bo LIAO; Ke YANG

    2004-01-01

    An acicular ferrite microstructure was achieved in an ultralow carbon pipeline steel through an improved thermomechanical control process (TMCP), which was based on the transformation behaviour of deformed austenite in the steel. Compared with commercial pipeline steels, the experimental ultralow carbon pipeline steel possessed satisfactory strength and toughness under the improved TMCP, although it contained only approximately 0.025% C; this should mainly be attributed to the microstructural characteristics of acicular ferrite.

  9. Amateur Image Pipeline Processing using Python plus PyRAF

    Science.gov (United States)

    Green, Wayne

    2012-05-01

    A template pipeline spanning observing planning to publishing is offered as a basis for establishing a long term observing program. The data reduction pipeline encapsulates all policy and procedures, providing an accountable framework for data analysis and a teaching framework for IRAF. This paper introduces the technical details of a complete pipeline processing environment using Python, PyRAF and a few other languages. The pipeline encapsulates all processing decisions within an auditable framework. The framework quickly handles the heavy lifting of image processing. It also serves as an excellent teaching environment for astronomical data management and IRAF reduction decisions.
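
    The paper does not reproduce its reduction code, so the following is only a generic sketch of the kind of CCD calibration step such a pipeline encapsulates (bias subtraction and flat-fielding using astropy and NumPy rather than IRAF/PyRAF); the file names are placeholders.

```python
# Generic CCD calibration step (bias subtraction and flat-fielding) of the kind
# such a pipeline encapsulates; this is not the author's code, and file names
# are placeholders.
import numpy as np
from astropy.io import fits

def calibrate(raw_file, master_bias_file, master_flat_file, out_file):
    raw = fits.getdata(raw_file).astype(float)
    bias = fits.getdata(master_bias_file).astype(float)
    flat = fits.getdata(master_flat_file).astype(float)

    flat_norm = flat / np.median(flat)          # normalise the flat to unit median
    calibrated = (raw - bias) / flat_norm       # remove bias level and pixel-to-pixel gain

    fits.writeto(out_file, calibrated, overwrite=True)
    return calibrated

# calibrate("light_001.fits", "master_bias.fits", "master_flat.fits", "calib_001.fits")
```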

  10. ANALYSIS ON TECHNOLOGICAL PROCESSES CLEANING OIL PIPELINES

    Directory of Open Access Journals (Sweden)

    Mariana PĂTRAŞCU

    2015-05-01

    Full Text Available This paper presents research on the technological processes used for cleaning oil pipelines. Several technologies and materials are known for removing sludge deposits, iron and manganese oxides, dross, scale, etc. from the inner walls of drinking-water and industrial pipes. In the oil industry, the removal of waste material from pipes and from liquid and gas transport networks has long been known as a tedious and expensive operation. The main methods and their associated problems can be summarized as follows: (1) blowing with compressed air; (2) manual or mechanical brushing, or sand-blasting, wet or dry; (3) washing with a high-pressure water jet, solvent or chemical solution to remove scale and hard deposits; (4) combined methods using cleaning machines with water jets, cutters, chains, rotary cutter heads, etc.

  11. Gemini Planet Imager Calibrations, Pipeline Updates, and Campaign Data Processing

    Science.gov (United States)

    Perrin, Marshall D.; Follette, Katherine B.; Millar-Blanchaer, Max; Wang, Jason; Wolff, Schuyler; Hung, Li-Wei; Arriaga, Pauline; Savransky, Dmitry; Bailey, Vanessa P.; Bruzzone, Sebastian; Chilcote, Jeffrey K.; De Rosa, Robert J.; Draper, Zachary; Fitzgerald, Michael P.; Greenbaum, Alexandra; Ingraham, Patrick; Konopacky, Quinn M.; Macintosh, Bruce; Marchis, Franck; Marois, Christian; Maire, Jerome; Nielsen, Eric L.; Rajan, Abhijith; Rameau, Julien; Rantakyro, Fredrik; Ruffio, Jean-Baptiste; Tran, Debby; Ward-Duong, Kimberly; Zalesky, Joe; GPIES Team

    2017-01-01

    In support of GPI imaging and spectroscopy of exoplanets, polarimetry of disks, and the ongoing Exoplanet Survey, we continue to refine calibrations, improve data reduction methods, and develop other enhancements to the data pipeline. We summarize here the latest updates to the open-source GPI Data Reduction Pipeline, including recent improvements to spectroscopic and photometric calibrations and to polarimetric data processing. For the GPI Exoplanet Survey we have incorporated the GPI Data Pipeline into a larger campaign data system that provides automatic data processing, including rapid PSF subtraction and contrast measurements in real time during observations and fully automated PSF subtractions using several state-of-the-art algorithms shortly after each observation completes.

  12. Polarity Categorization with Fine Tuned Pipeline Process of Online Reviews

    Directory of Open Access Journals (Sweden)

    Prabha Natarajan

    2013-06-01

    Full Text Available The development of the Web 2.0 concept increased web storage by enabling information sharing from anywhere in the world. How to use this content effectively and efficiently is a challenging task and an important research question in the fields of Sentiment Analysis and Opinion Mining. This paper focuses on processing such online data through a pipeline applied to online reviews about products, and on generating a polarity-checking tool that provides users with decision-support information. Most research focuses on the classification of polarities rather than on pre-processing of the data; our idea is that a fine-tuned pipeline process will give better categorization. Classification has been achieved with many techniques, mainly depending on machine learning. This study also addresses ranking using different classification techniques.

  13. Image-processing pipelines: applications in magnetic resonance histology

    Science.gov (United States)

    Johnson, G. Allan; Anderson, Robert J.; Cook, James J.; Long, Christopher; Badea, Alexandra

    2016-03-01

    Image processing has become ubiquitous in imaging research, so ubiquitous that it is easy to lose track of how diverse this processing has become. The Duke Center for In Vivo Microscopy has pioneered the development of Magnetic Resonance Histology (MRH), which generates large multidimensional data sets that can easily reach into the tens of gigabytes. A series of dedicated image-processing workstations and associated software have been assembled to optimize each step of acquisition, reconstruction, post-processing, registration, visualization, and dissemination. This talk will describe the image-processing pipelines from acquisition to dissemination that have become critical to our everyday work.

  14. Processes of Turbulent Liquid Flows in Pipelines and Channels

    Directory of Open Access Journals (Sweden)

    R. I. Yesman

    2011-01-01

    Full Text Available The paper proposes a methodology for the analysis and calculation of processes pertaining to turbulent liquid flows in pipes and channels. Various modes of liquid motion in pipelines of thermal power devices and equipment are considered. The presented dependences can be used for practical calculations of friction losses in the transportation of various energy carriers.

  15. Overview of the Kepler Science Processing Pipeline

    CERN Document Server

    Jenkins, Jon M; Chandrasekaran, Hema; Twicken, Joseph D; Bryson, Stephen T; Quintana, Elisa V; Clarke, Bruce D; Li, Jie; Allen, Christopher; Tenenbaum, Peter; Wu, Hayley; Klaus, Todd C; Middour, Christopher K; Cote, Miles T; McCauliff, Sean; Girouard, Forrest R; Gunter, Jay P; Wohler, Bill; Sommers, Jeneen; Hall, Jennifer R; Uddin, Kamal; Wu, Michael S; Bhavsar, Paresh A; Van Cleve, Jeffrey; Pletcher, David L; Dotson, Jessie A; Haas, Michael R; Gilliland, Ronald L; Koch, David G; Borucki, William J

    2010-01-01

    The Kepler Mission Science Operations Center (SOC) performs several critical functions including managing the ~156,000 target stars, associated target tables, science data compression tables and parameters, as well as processing the raw photometric data downlinked from the spacecraft each month. The raw data are first calibrated at the pixel level to correct for bias, smear induced by a shutterless readout, and other detector and electronic effects. A background sky flux is estimated from ~4500 pixels on each of the 84 CCD readout channels, and simple aperture photometry is performed on an optimal aperture for each star. Ancillary engineering data and diagnostic information extracted from the science data are used to remove systematic errors in the flux time series that are correlated with these data prior to searching for signatures of transiting planets with a wavelet-based, adaptive matched filter. Stars with signatures exceeding 7.1 sigma are subjected to a suite of statistical tests including an examinat...
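
    As a toy illustration of two of the steps summarized above (simple aperture photometry over an optimal aperture, and the 7.1-sigma detection threshold), the following sketch uses NumPy; it is not the SOC code and all numbers are made up.

```python
# Simplified illustration of two pipeline steps mentioned above: simple aperture
# photometry over a fixed set of pixels, and flagging a transit-search statistic
# that exceeds the 7.1-sigma threshold. This is a toy sketch, not the SOC code.
import numpy as np

def aperture_flux(frame, aperture_mask, background_level):
    """Sum background-subtracted flux over the pixels in the optimal aperture."""
    return np.sum(frame[aperture_mask] - background_level)

def exceeds_threshold(detection_statistic, threshold=7.1):
    """Kepler flags targets whose maximum detection statistic exceeds 7.1 sigma."""
    return detection_statistic > threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.normal(100.0, 1.0, size=(10, 10))   # fake pixel frame
    mask = np.zeros((10, 10), dtype=bool)
    mask[4:7, 4:7] = True                            # fake 3x3 optimal aperture
    print("aperture flux:", aperture_flux(frame, mask, background_level=100.0))
    print("candidate?", exceeds_threshold(8.3))
```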

  16. Youpi, a Web-based Astronomical Image Processing Pipeline

    OpenAIRE

    Monnerville, M.; Sémah, G.

    2010-01-01

    Youpi stands for "YOUpi is your processing PIpeline". It is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. It is built on top of open source processing tools that are released to the community by Terapix, in order to organize your data on a computer cluster, to manage your processing jobs in real time and to facilitate teamwork by allowing fine-grain sharing of results and data. On the server side, Youpi is wri...

  17. Advances in carbon dioxide compression and pipeline transportation processes

    CERN Document Server

    Witkowski, Andrzej; Majkut, Mirosław; Rulik, Sebastian; Stolecka, Katarzyna

    2015-01-01

    Providing a comprehensive analysis of CO2 compression, transportation processes and safety issues for post combustion CO2 capture applications for a 900 MW pulverized hard coal-fired power plant, this book assesses techniques for boosting the pressure of CO2 to pipeline pressure values with a minimal amount of energy. Four different types of compressors are examined in detail: a conventional multistage centrifugal compressor, integrally geared centrifugal compressor, supersonic shock wave compressor, and pump machines. The study demonstrates that the total compression power is closely related

  18. Youpi, a Web-based Astronomical Image Processing Pipeline

    CERN Document Server

    Monnerville, M

    2010-01-01

    Youpi stands for "YOUpi is your processing PIpeline". It is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. It is built on top of open source processing tools that are released to the community by Terapix, in order to organize your data on a computer cluster, to manage your processing jobs in real time and to facilitate teamwork by allowing fine-grain sharing of results and data. On the server side, Youpi is written in the Python programming language and uses the Django web framework. On the client side, Ajax techniques are used along with the Prototype and script.aculo.us JavaScript libraries.

  19. Optimization of Memory Management in Image Processing using Pipelining Technique

    Directory of Open Access Journals (Sweden)

    P.S. Ramesh

    2015-02-01

    Full Text Available The quality of an image is affected by various phenomena that generally consume large amounts of memory and need to be addressed. Memory handling is mainly affected by disorderly arranged pixels in an image, which may lead to salt-and-pepper noise that degrades image quality. The aim of this study is to remove salt-and-pepper noise, a crucial problem in image processing. We propose a technique that combines adaptive mean filtering and the wavelet transform, based on pipeline processing, to remove intensity spikes from the image; Otsu's and CLAHE algorithms are then used to enhance the image. The implemented framework produces good results, and its robustness against salt-and-pepper noise is verified using the PSNR metric.
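
    A minimal sketch of the measurable part of this study is given below: impulse (salt-and-pepper) noise is removed with a plain median filter and the result is scored with PSNR. The published method combines adaptive mean filtering with a wavelet transform and Otsu/CLAHE enhancement, which are not reproduced here.

```python
# Minimal sketch: remove salt-and-pepper noise with a median filter and score
# the result with PSNR. A toy stand-in for the adaptive pipeline described above.
import numpy as np
from scipy.ndimage import median_filter

def add_salt_pepper(image, fraction=0.05, rng=None):
    rng = rng or np.random.default_rng(0)
    noisy = image.copy()
    mask = rng.random(image.shape) < fraction
    noisy[mask] = rng.choice([0.0, 255.0], size=mask.sum())   # impulse noise
    return noisy

def psnr(reference, test, peak=255.0):
    mse = np.mean((reference - test) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

if __name__ == "__main__":
    clean = np.full((64, 64), 128.0)
    noisy = add_salt_pepper(clean)
    restored = median_filter(noisy, size=3)
    print("PSNR noisy   :", round(psnr(clean, noisy), 1), "dB")
    print("PSNR restored:", round(psnr(clean, restored), 1), "dB")
```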

  20. Application of Hilbert-Huang signal processing to ultrasonic non-destructive testing of oil pipelines

    Institute of Scientific and Technical Information of China (English)

    MAO Yi-mei; QUE Pei-wen

    2006-01-01

    In this paper, a detection technique for locating and determining the extent of defects and cracks in oil pipelines based on Hilbert-Huang time-frequency analysis is proposed. The ultrasonic signals reflected from defect-free pipelines and from pipelines with defects were processed using Hilbert-Huang transform, a recently developed signal processing technique based on direct extraction of the energy associated with the intrinsic time scales in the signal. Experimental results showed that the proposed method is feasible and can accurately and efficiently determine the location and size of defects in pipelines.
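
    A much-simplified illustration of the pulse-echo idea is shown below: the Hilbert envelope of a synthetic A-scan gives the time of flight of the strongest echo, which is converted to a reflector distance. The full method in the paper uses the Hilbert-Huang transform (empirical mode decomposition plus Hilbert spectra), which this sketch omits; all numerical values are invented.

```python
# Toy illustration of locating a reflector from an ultrasonic A-scan using the
# Hilbert envelope. Wave speed and sampling rate are made-up values.
import numpy as np
from scipy.signal import hilbert

fs = 10e6                 # sampling rate [Hz] (assumed)
c = 3200.0                # shear wave speed in steel [m/s] (approximate)

t = np.arange(0, 200e-6, 1 / fs)
# Synthetic echo: a 2 MHz tone burst arriving 60 microseconds after excitation.
signal = np.exp(-((t - 60e-6) / 3e-6) ** 2) * np.sin(2 * np.pi * 2e6 * t)
signal += 0.05 * np.random.default_rng(1).normal(size=t.size)

envelope = np.abs(hilbert(signal))        # instantaneous amplitude
t_echo = t[np.argmax(envelope)]           # time of flight of the strongest echo
distance = c * t_echo / 2.0               # pulse-echo: divide the round trip by two
print(f"estimated reflector distance: {distance * 1000:.1f} mm")
```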

  1. Iterative Strategies for Aftershock Classification in Automatic Seismic Processing Pipelines

    Science.gov (United States)

    Gibbons, Steven J.; Kværna, Tormod; Harris, David B.; Dodge, Douglas A.

    2016-04-01

    Aftershock sequences following very large earthquakes present enormous challenges to near-realtime generation of seismic bulletins. The increase in analyst resources needed to relocate an inflated number of events is compounded by failures of phase association algorithms and a significant deterioration in the quality of underlying fully automatic event bulletins. Current processing pipelines were designed a generation ago and, due to computational limitations of the time, are usually limited to single passes over the raw data. With current processing capability, multiple passes over the data are feasible. Processing the raw data at each station currently generates parametric data streams which are then scanned by a phase association algorithm to form event hypotheses. We consider the scenario where a large earthquake has occurred and propose to define a region of likely aftershock activity in which events are detected and accurately located using a separate specially targeted semi-automatic process. This effort may focus on so-called pattern detectors, but here we demonstrate a more general grid search algorithm which may cover wider source regions without requiring waveform similarity. Given many well-located aftershocks within our source region, we may remove all associated phases from the original detection lists prior to a new iteration of the phase association algorithm. We provide a proof-of-concept example for the 2015 Gorkha sequence, Nepal, recorded on seismic arrays of the International Monitoring System. Even with very conservative conditions for defining event hypotheses within the aftershock source region, we can automatically remove over half of the original detections which could have been generated by Nepal earthquakes and reduce the likelihood of false associations and spurious event hypotheses. Further reductions in the number of detections in the parametric data streams are likely using correlation and subspace detectors and/or empirical matched

  2. The Studies of the Welding Processes and Procedures on the West-East Pipeline Project

    Institute of Scientific and Technical Information of China (English)

    Sui Yongli; Huang Fuxiang; Zhao Haihong; Yin Changhua

    2004-01-01

    The West-East pipeline project attracted attention from all over the world for its long distance, large diameter, complex geographic conditions, and the diversified welding techniques applied. In this paper the detailed welding processes and procedures used in the project are discussed, and the distinguished achievements in welding techniques of China's pipeline construction are described.

  3. Processing Earth Observing images with Ames Stereo Pipeline

    Science.gov (United States)

    Beyer, R. A.; Moratto, Z. M.; Alexandrov, O.; Fong, T.; Shean, D. E.; Smith, B. E.

    2013-12-01

    ICESat with its GLAS instrument provided valuable elevation measurements of glaciers. The loss of this spacecraft caused a demand for alternative elevation sources. In response, we have improved our Ames Stereo Pipeline (ASP) software (version 2.1+) to ingest imagery from Earth-observing satellites in addition to its support of planetary missions. This gives the open source community a free method to generate digital elevation models (DEMs) from DigitalGlobe stereo imagery and, alternatively, from other cameras using RPC camera models. Here we present details of the software. ASP is a collection of utilities written in C++ and Python that implement stereogrammetry. It contains utilities to manipulate DEMs, project imagery, create KML image quad-trees, and perform simplistic 3D rendering. However, its primary application is the creation of DEMs. This is achieved by matching every pixel between the images of a stereo observation via a hierarchical coarse-to-fine template matching method. Matched pixels between images represent a single feature that is triangulated using each image's camera model. The collection of triangulated features represents a point cloud that is then grid resampled to create a DEM. In order for ASP to match pixels/features between images, it requires a search range defined in pixel units. Total processing time is proportional to the area of the first image being matched multiplied by the area of the search range. An incorrect search range for ASP causes repeated false positive matches at each level of the image pyramid and causes excessive processing times with no valid DEM output. Therefore our system contains automatic methods for deducing what the correct search range should be. In addition, we provide options for reducing the overall search range by applying affine epipolar rectification, homography transform, or by map projecting against a prior existing low resolution DEM. Depending on the size of the images, parallax, and image
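
    The heart of the correlation stage described above can be illustrated with a single-level, brute-force version of template matching: for a window around a pixel in the left image, search a horizontal disparity range in the right image for the best normalised cross-correlation score. ASP does this hierarchically over image pyramids with automatic search-range deduction; the sketch below is only a conceptual toy.

```python
# Single-level template matching over a horizontal disparity range.
# A conceptual toy, not the ASP correlator.
import numpy as np

def best_disparity(left, right, row, col, half=5, search_range=(0, 32)):
    """Return (disparity, score) of the best NCC match for one left-image pixel."""
    tpl = left[row - half:row + half + 1, col - half:col + half + 1].astype(float)
    tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-9)
    best, best_score = None, -np.inf
    for d in range(*search_range):
        c = col - d                                   # candidate column in the right image
        if c - half < 0:
            break
        win = right[row - half:row + half + 1, c - half:c + half + 1].astype(float)
        win = (win - win.mean()) / (win.std() + 1e-9)
        score = float(np.mean(tpl * win))             # normalised cross-correlation
        if score > best_score:
            best, best_score = d, score
    return best, best_score
```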

  4. A Midas Plugin to Enable Construction of Reproducible Web-based Image Processing Pipelines

    Directory of Open Access Journals (Sweden)

    Michael Grauer

    2013-12-01

    Full Text Available Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based UI, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.

  5. A midas plugin to enable construction of reproducible web-based image processing pipelines.

    Science.gov (United States)

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.

  6. Development of Protective Coatings for Co-Sequestration Processes and Pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Bierwagen, Gordon; Huang, Yaping

    2011-11-30

    The program, entitled Development of Protective Coatings for Co-Sequestration Processes and Pipelines, examined the sensitivity of existing coating systems to supercritical carbon dioxide (SCCO2) exposure and developed a new coating system to protect pipelines from corrosion under SCCO2 exposure. A literature review was also conducted on pipeline corrosion sensors for monitoring pipes used in handling co-sequestration fluids. The research aimed to ensure safety and reliability of pipelines transporting SCCO2 from the power plant to the sequestration site to mitigate the greenhouse gas effect. Results showed that one commercial coating and one designed formulation can both be supplied as potential candidates for internal pipeline coatings for transporting SCCO2.

  7. Pigging the unpiggable: a total integrated maintenance approach of the Progreso Process Pipelines in Yucatan, Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez Graciano, Luis [PEMEX Refinacion, Mexico, MX (Mexico); Gonzalez, Oscar L. [NDT Systems and Services, Stutensee (Germany)

    2009-07-01

    Pemex Refinacion and NDT Systems and Services executed a Total Integrated Maintenance Program of the Process Pipeline System in the Yucatan Peninsula in Mexico, in order to modernize, enhance and bring the pipeline system up to the best industry standards and to ensure the integrity, reliability and safe operation of the system. This approach consisted of using multi-diameter ultrasonic inspection technology to determine the current status of the pipelines, repairing every 'integrity diminishing' feature present in the system, and establishing a Certified Maintenance Program to ensure the future reliability and safety of the pipelines. Due to the complex nature of the pipeline construction, dating from 1984, several special modifications, integrations and solutions were necessary to improve the in-line inspection survey, as for all traditionally unpiggable systems. The Progreso Pipeline System consists of three major pipelines which transport diesel, jet fuel and gasoline respectively. The outside diameter of two of the pipelines varies along their length between 12 inches, 14 inches and 16 inches, making the inspection survey more difficult and demanding a special inspection tool solution. The system is located on the coast of the Yucatan Peninsula, on the Mexican Caribbean, and its main purpose is to transport product from docked tanker ships to the Pemex Storage and Distribution Terminal. (author)

  8. Enhancement of Hydrodynamic Processes in Oil Pipelines Considering Rheologically Complex High-Viscosity Oils

    Science.gov (United States)

    Konakhina, I. A.; Khusnutdinova, E. M.; Khamidullina, G. R.; Khamidullina, A. F.

    2016-06-01

    This paper describes a mathematical model of flow-related hydrodynamic processes for rheologically complex high-viscosity bitumen oil and oil-water suspensions and presents methods to improve the design and performance of oil pipelines.

  9. The data processing pipeline for the Herschel SPIRE Fourier Transform Spectrometer

    CERN Document Server

    Fulton, T; Polehampton, E T; Valtchanov, I; Hopwood, R; Lu, N; Baluteau, J -P; Mainetti, G; Pearson, C; Papageorgiou, A; Guest, S; Zhang, L; Imhof, P; Swinyard, B M; Griffin, M J; Lim, T L

    2016-01-01

    We present the data processing pipeline to generate calibrated data products from the Spectral and Photometric Imaging Receiver (SPIRE) imaging Fourier Transform Spectrometer on the Herschel Space Observatory. The pipeline processes telemetry from SPIRE observations and produces calibrated spectra for all resolution modes. The spectrometer pipeline shares some elements with the SPIRE photometer pipeline, including the conversion of telemetry packets into data timelines and calculation of bolometer voltages. We present the following fundamental processing steps unique to the spectrometer: temporal and spatial interpolation of the scan mechanism and detector data to create interferograms; Fourier transformation; apodization; and creation of a data cube. We also describe the corrections for various instrumental effects including first- and second-level glitch identification and removal, correction of the effects due to emission from the Herschel telescope and from within the spectrometer instrument, interferogra...
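
    Two of the spectrometer steps named above, apodisation and Fourier transformation of an interferogram, can be illustrated with the toy function below; the window function and units are placeholders rather than the SPIRE calibration choices.

```python
# Minimal illustration of apodising a (symmetric, already phase-corrected)
# interferogram and Fourier-transforming it into a spectrum. Not the SPIRE pipeline.
import numpy as np

def interferogram_to_spectrum(interferogram, opd_step):
    n = interferogram.size
    window = np.hanning(n)                        # simple apodisation window (assumed)
    apodised = interferogram * window
    spectrum = np.abs(np.fft.rfft(apodised))      # magnitude spectrum
    wavenumber = np.fft.rfftfreq(n, d=opd_step)   # cycles per unit optical path difference
    return wavenumber, spectrum
```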

  10. Strategy and technical considerations of an upstream pipeline risk assessment process

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, D.; Santander, M.; Yu, B. [Husky Energy Inc., Calgary, AB (Canada)

    2010-07-01

    This paper presented an upstream pipeline integrity management program developed by Husky Energy. Three levels of risk assessment were used to optimize technical expertise and resources. The first level consisted of prioritizing gathering systems for the next assessment level. Level 2 comprised a subject matter expert (SME) based assessment performed in collaboration with industry operators and integrity management specialists. The assessment focused on the development of a mitigation strategy for higher-risk pipelines. Level 3 investigated frequent failures due to systematic mechanisms and pipelines with a high risk of failure. The risk assessment process was shown to effectively manage pipeline integrity when risk validation and mitigation actions were implemented, closely followed up, and evaluated for consistency. 9 refs., 3 tabs., 7 figs.

  11. Building a scalable global data processing pipeline for large astronomical photometric datasets

    CERN Document Server

    Doyle, Paul

    2015-01-01

    Astronomical photometry is the science of measuring the flux of a celestial object. Since its introduction, the CCD has been the principal method of measuring flux to calculate the apparent magnitude of an object. Each CCD image taken must go through a process of cleaning and calibration prior to its use. As the number of research telescopes increases, the overall computing resources required for image processing also increase. Existing processing techniques are primarily sequential in nature, requiring increasingly powerful servers, faster disks and faster networks to process data. Existing High Performance Computing solutions involving high-capacity data centres are complex in design and expensive to maintain, while providing resources primarily to high-profile science projects. This research describes three distributed pipeline architectures: a virtualised cloud-based IRAF; the Astronomical Compute Node (ACN), a private cloud-based pipeline; and NIMBUS, a globally distributed system. The ACN pipeline proce...

  12. HiCUP: pipeline for mapping and processing Hi-C data.

    Science.gov (United States)

    Wingett, Steven; Ewels, Philip; Furlan-Magaril, Mayra; Nagano, Takashi; Schoenfelder, Stefan; Fraser, Peter; Andrews, Simon

    2015-01-01

    HiCUP is a pipeline for processing sequence data generated by Hi-C and Capture Hi-C (CHi-C) experiments, which are techniques used to investigate three-dimensional genomic organisation. The pipeline maps data to a specified reference genome and removes artefacts that would otherwise hinder subsequent analysis. HiCUP also produces an easy-to-interpret yet detailed quality control (QC) report that assists in refining experimental protocols for future studies. The software is freely available and has already been used for processing Hi-C and CHi-C data in several recently published peer-reviewed studies.

  13. Data Processing Pipeline for Pointing Observations of Lunar-based Ultraviolet Telescope

    CERN Document Server

    Meng, Xian-Min; Qiu, Yu-Lei; Wu, Chao; Wang, Jing; Han, Xu-Hui; Deng, Jin-Song; Xin, Li-Ping; Cai, Hong-Bo; Wei, Jian-Yan

    2015-01-01

    We describe the data processing pipeline developed to reduce the pointing observation data of the Lunar-based Ultraviolet Telescope (LUT), which belongs to the Chang'e-3 mission of the Chinese Lunar Exploration Program. The pointing observation program of LUT is dedicated to monitoring variable objects in a near-ultraviolet (245-345 nm) band. LUT works in lunar daytime for sufficient power supply, so some special data processing strategies have been developed for the pipeline. The procedures of the pipeline mainly include stray light removal, astrometry, flat fielding employing a superflat technique, source extraction and cosmic-ray rejection, aperture and PSF photometry, aperture correction, and catalogue archiving. It has been intensively tested and works smoothly with observation data. The photometric accuracy is typically ~0.02 mag for LUT 10 mag stars (30 s exposure), with errors from background noise, residuals of stray light removal, and flat fielding. The accuracy degrades to ~0.2 mag for stars of 13....

  14. Process analysers for optimization of reinjection of contaminated product in product pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Benke, H. (Benke Instruments und Elektro GmbH, Reinbek (Germany, F.R.)); Gravert, W. (Rhein-Main-Rohrleitungstransport GmbH, Koeln (Germany, F.R.))

    1989-01-01

    Correctly applied process analysers for the measurement of mixing zones in a multi-product pipeline and for the limitation of reinjection volumes are indispensable aids to safeguard pipeline economics while maintaining required product specifications. As an example, a European multi-product pipeline with various input and output stations is described where analysers have been installed, and operational experience is reported. Measuring equipment installed ahead of an output station indicates the start and finish of a mixing zone between products, enabling the separation of the mixing zone volume into a contaminated product tank. Additionally, measuring equipment has been installed to enable injection of the contaminated product into suitable products without harming the relevant product specifications. (MOS).

  15. An image processing pipeline to detect and segment nuclei in muscle fiber microscopic images.

    Science.gov (United States)

    Guo, Yanen; Xu, Xiaoyin; Wang, Yuanyuan; Wang, Yaming; Xia, Shunren; Yang, Zhong

    2014-08-01

    Muscle fiber images play an important role in the medical diagnosis and treatment of many muscular diseases. The number of nuclei in skeletal muscle fiber images is a key bio-marker of the diagnosis of muscular dystrophy. In nuclei segmentation one primary challenge is to correctly separate the clustered nuclei. In this article, we developed an image processing pipeline to automatically detect, segment, and analyze nuclei in microscopic image of muscle fibers. The pipeline consists of image pre-processing, identification of isolated nuclei, identification and segmentation of clustered nuclei, and quantitative analysis. Nuclei are initially extracted from background by using local Otsu's threshold. Based on analysis of morphological features of the isolated nuclei, including their areas, compactness, and major axis lengths, a Bayesian network is trained and applied to identify isolated nuclei from clustered nuclei and artifacts in all the images. Then a two-step refined watershed algorithm is applied to segment clustered nuclei. After segmentation, the nuclei can be quantified for statistical analysis. Comparing the segmented results with those of manual analysis and an existing technique, we find that our proposed image processing pipeline achieves good performance with high accuracy and precision. The presented image processing pipeline can therefore help biologists increase their throughput and objectivity in analyzing large numbers of nuclei in muscle fiber images.
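
    The clustered-nuclei part of such a pipeline is often built from thresholding, a distance transform and marker-controlled watershed, as in the hypothetical scikit-image sketch below; the published pipeline additionally uses local Otsu thresholding, a Bayesian network classifier and a two-step refined watershed that are not reproduced here.

```python
# Sketch of separating touching nuclei with threshold + distance transform +
# marker-controlled watershed. A generic illustration, not the published pipeline.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_nuclei(image):
    mask = image > threshold_otsu(image)                  # foreground nuclei
    distance = ndi.distance_transform_edt(mask)           # distance to background
    peaks = peak_local_max(distance, min_distance=5, labels=mask)
    markers = np.zeros(image.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    labels = watershed(-distance, markers, mask=mask)     # split touching nuclei
    return labels
```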

  16. A pipeline for comprehensive and automated processing of electron diffraction data in IPLT.

    Science.gov (United States)

    Schenk, Andreas D; Philippsen, Ansgar; Engel, Andreas; Walz, Thomas

    2013-05-01

    Electron crystallography of two-dimensional crystals allows the structural study of membrane proteins in their native environment, the lipid bilayer. Determining the structure of a membrane protein at near-atomic resolution by electron crystallography remains, however, a very labor-intense and time-consuming task. To simplify and accelerate the data processing aspect of electron crystallography, we implemented a pipeline for the processing of electron diffraction data using the Image Processing Library and Toolbox (IPLT), which provides a modular, flexible, integrated, and extendable cross-platform, open-source framework for image processing. The diffraction data processing pipeline is organized as several independent modules implemented in Python. The modules can be accessed either from a graphical user interface or through a command line interface, thus meeting the needs of both novice and expert users. The low-level image processing algorithms are implemented in C++ to achieve optimal processing performance, and their interface is exported to Python using a wrapper. For enhanced performance, the Python processing modules are complemented with a central data managing facility that provides a caching infrastructure. The validity of our data processing algorithms was verified by processing a set of aquaporin-0 diffraction patterns with the IPLT pipeline and comparing the resulting merged data set with that obtained by processing the same diffraction patterns with the classical set of MRC programs.

  17. Business process modeling applied to oil pipeline and terminal processes: a proposal for TRANSPETRO's oil pipelines and terminals in Rio de Janeiro and Minas Gerais

    Energy Technology Data Exchange (ETDEWEB)

    Santiago, Adilson da Silva [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil); Caulliraux, Heitor Mansur [Universidade Federal do Rio de Janeiro (COPPE/UFRJ/GPI), RJ (Brazil). Coordenacao de Pos-graduacao em Engenharia. Grupo de Producao Integrada; Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Felippe, Adriana Vieira de Oliveira [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Business process modeling (BPM) using event driven process chain diagrams (EPCs) to lay out business process work flows is now widely adopted around the world. The EPC method was developed within the framework of the ARIS Toolset by Prof. August-Wilhelm Scheer at the Institut für Wirtschaftsinformatik at the Universität des Saarlandes in the early 1990s. It is used by many companies to model, analyze and redesign business processes. As such it forms the core technique for modeling in ARIS, which serves to link the different aspects of the so-called control view, which is discussed in the section on ARIS business process modeling. This paper describes a proposal made to TRANSPETRO's Oil Pipelines and Terminals Division in the states of Rio de Janeiro and Minas Gerais, which will be jointly developed by specialists and managers from TRANSPETRO and from COPPETEC, the collaborative research arm of Rio de Janeiro Federal University (UFRJ). The proposal is based on ARIS business process modeling and is presented here according to its seven phases, as follows: information survey and definition of the project structure; mapping and analysis of Campos Eliseos Terminal (TECAM) processes; validation of TECAM process maps; mapping and analysis of the remaining organizational units' processes; validation of the remaining organizational units' process maps; proposal of a business process model for all organizational units of TRANSPETRO's Oil Pipelines and Terminals Division in Rio de Janeiro and Minas Gerais; critical analysis of the process itself and the results and potential benefits of BPM. (author)

  18. Leak detection in gas pipeline by acoustic and signal processing - A review

    Science.gov (United States)

    Adnan, N. F.; Ghazali, M. F.; Amin, M. M.; Hamat, A. M. A.

    2015-12-01

    The pipeline system is the most important part of media transport, delivering fluid from one station to another. Weak maintenance and poor safety contribute to financial losses in terms of wasted fluid and environmental impacts. There are many classifications of techniques, which makes it easier to present their specific methods and applications. This paper discusses gas leak detection in pipeline systems using the acoustic method. Wave propagation in the pipeline is a key parameter of the acoustic method: when a leak occurs, the pressure balance of the pipe is disturbed and a wave is generated by friction against the pipe wall. Signal processing is used to decompose the raw signal and present it in the time-frequency domain. Findings based on the acoustic method can be used for comparative studies in the future. Acoustic signals combined with the HHT are the best method to detect leaks in gas pipelines. More experiments and simulations need to be carried out to obtain fast detection of leaks and estimation of their location.
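
    A common acoustic leak-location principle related to the methods reviewed here is cross-correlation of the signals from two sensors bracketing the leak; the time delay gives the leak position. The sketch below simulates this with invented sensor spacing, wave speed and sampling rate, and is not taken from the paper.

```python
# Toy leak localisation: estimate the time delay between two sensors by
# cross-correlation, then convert the delay to a position along the pipe.
import numpy as np

fs = 50_000.0          # sampling rate [Hz] (assumed)
c = 400.0              # acoustic wave speed in the gas [m/s] (assumed)
L = 200.0              # distance between the two sensors [m] (assumed)

rng = np.random.default_rng(2)
leak = rng.normal(size=8_000)                       # broadband leak noise
delay_samples = 1_500                               # true extra travel time to sensor B
sensor_a = leak + 0.05 * rng.normal(size=leak.size)
sensor_b = np.roll(leak, delay_samples) + 0.05 * rng.normal(size=leak.size)

xcorr = np.correlate(sensor_b, sensor_a, mode="full")
lag = np.argmax(xcorr) - (leak.size - 1)            # lag in samples (B relative to A)
dt = lag / fs
x_leak = (L - c * dt) / 2.0                         # leak distance from sensor A
print(f"estimated leak position: {x_leak:.1f} m from sensor A")
```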

  19. Youpi: A Web-based Astronomical Image Processing Pipeline

    Science.gov (United States)

    Monnerville, M.; Sémah, G.

    2010-12-01

    Youpi stands for “YOUpi is your processing PIpeline”. It is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. It is built on top of open source processing tools that are released to the community by Terapix, in order to organize your data on a computer cluster, to manage your processing jobs in real time and to facilitate teamwork by allowing fine-grain sharing of results and data. On the server side, Youpi is written in the Python programming language and uses the Django web framework. On the client side, Ajax techniques are used along with the Prototype and script.aculo.us JavaScript libraries.

  20. Safety Control on the Choking Process of Supercritical Carbon Dioxide Pipeline

    Directory of Open Access Journals (Sweden)

    Qing Zhao

    2014-09-01

    Full Text Available The transportation safety of supercritical CO2 pipelines is a key aspect of carbon capture and storage (CCS). To reduce the high pressure in a supercritical pipeline when an accident occurs, controlled release is applied using a choking process. The downstream parameters of the choking process can be predicted under the assumption of an adiabatic process. In critical choking, the velocity at the outlet is sonic. A choking pipe can be designed to buffer between successive choking orifices, sized according to the length of the turbulent region produced by the jetting momentum. To mitigate the noise hazard produced by the high jet velocity, a muffler can be fitted at the outlet of the final-stage orifice to the atmosphere. Regarding the influence of impurities on the choking process in an anthropogenic CO2 pipeline, the presence of SO2 as an impurity helps raise the downstream temperature through the choking device and thus prevents freezing, whereas the presence of N2 as an impurity leads to a lower downstream temperature. A higher initial temperature can prevent dry-ice formation at the outlet of the vent pipe when multistage choking is applied.
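
    The choked-flow conditions mentioned above can be checked with back-of-the-envelope isentropic ideal-gas relations, as in the sketch below. Real supercritical CO2 departs strongly from ideal-gas behaviour, so the gamma value, upstream state and results are purely illustrative and are not the paper's calculations.

```python
# Back-of-the-envelope critical (choked) flow conditions for an ideal gas,
# used here only as a rough stand-in for the adiabatic assumption in the paper.
import math

gamma = 1.29        # approximate specific heat ratio of CO2 (assumed)
R = 188.9           # specific gas constant of CO2 [J/(kg K)]
T0 = 310.0          # stagnation temperature upstream of the orifice [K] (assumed)
P0 = 9.0e6          # stagnation pressure upstream of the orifice [Pa] (assumed)

p_ratio = (2.0 / (gamma + 1.0)) ** (gamma / (gamma - 1.0))   # critical pressure ratio
T_throat = T0 * 2.0 / (gamma + 1.0)                          # throat temperature
a_throat = math.sqrt(gamma * R * T_throat)                   # sonic velocity at the throat

print(f"critical pressure ratio P*/P0 : {p_ratio:.3f}")
print(f"throat pressure               : {p_ratio * P0 / 1e6:.2f} MPa")
print(f"throat temperature            : {T_throat:.1f} K")
print(f"sonic velocity at the throat  : {a_throat:.0f} m/s")
```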

  1. Standardization process for pipeline right-of-way activities: the case of TRANSPETRO's Oil Pipeline and Terminals Business Unit

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Kassandra Senra de Morais M.; Goncalves, Bruno Martins [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil). Diretoria de Terminais e Oleodutos; Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico

    2009-07-01

    This paper describes the experience of PETROBRAS Transporte S.A. concerning the standardization process for its pipeline right-of-way (ROW) activities. This standardization initiative has been carried out within the Oil Pipelines and Terminals Standardization Program (PRONOT), focusing on planning, standardization and implementation of all norms and corporate procedures referring to TRANSPETRO's right-of-way activities. The process promoted the integration of isolated regional initiatives, a sense of unity and the creation of a learning network consisting of 60 employees. This paper presents the last phase's results concerning implementation of corporate standards, based upon achievements of previous phases. It covers the following topics: a general view of the whole process by way of introduction; the potential of integration of recent standardization results with TRANSPETRO's corporate management tools and information systems; definition of four performance indicators and their metrics related to pipeline right-of-way management, as well as a corporate standard for the requirements for contracting services related to rights-of-way inspection, maintenance and communication; challenges, barriers and benefits perceived by the team responsible for formulating and implementing standards and procedures in TRANSPETRO's Oil Pipelines and Terminals Unit. (author)

  2. A Design of Pipelined Architecture for on-the-Fly Processing of Big Data Streams

    Directory of Open Access Journals (Sweden)

    Usamah Algemili

    2015-01-01

    Full Text Available Conventional processing infrastructures have been challenged by the huge demand from stream-based applications. The industry responded by introducing traditional stream processing engines along with newly emerging technologies. The current paradigm embraces parallel computing as the most suitable proposition. Pipelining and parallelism have been intensively studied in recent years, yet parallel programming on multiprocessor architectures stands as one of the biggest challenges to the software industry. Parallel computing relies on parallel programs that may encounter internal memory constraints. In addition, parallel computing requires a special programming skill set as well as software conversions. This paper presents a reconfigurable pipelined architecture. The design is especially aimed at Big Data clustering, and it adopts symmetric multiprocessing (SMP) along with a crossbar switch and forced interrupts. The main goal of this promising architecture is to efficiently process big data streams on the fly, while it can also process sequential programs on a parallel-pipelined model. The system overcomes the internal memory constraints of multicore architectures by applying forced interrupts and crossbar switching. It reduces the complexity, data dependency, high latency, and cost overhead of parallel computing.
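
    A software analogue of the pipelined stream processing described above can be sketched with one process per stage connected by bounded queues, as below; this Python toy does not model the SMP hardware, crossbar switch or forced interrupts of the proposed architecture.

```python
# Toy three-stage software pipeline: producer -> transform -> consumer,
# each stage in its own process, connected by bounded queues (back-pressure).
import multiprocessing as mp

SENTINEL = None

def producer(out_q):
    for record in range(10):          # stand-in for an incoming data stream
        out_q.put(record)
    out_q.put(SENTINEL)

def transform(in_q, out_q):
    while (item := in_q.get()) is not SENTINEL:
        out_q.put(item * item)        # stand-in for per-record processing
    out_q.put(SENTINEL)

def consumer(in_q):
    while (item := in_q.get()) is not SENTINEL:
        print("result:", item)

if __name__ == "__main__":
    q1, q2 = mp.Queue(maxsize=4), mp.Queue(maxsize=4)
    stages = [mp.Process(target=producer, args=(q1,)),
              mp.Process(target=transform, args=(q1, q2)),
              mp.Process(target=consumer, args=(q2,))]
    for p in stages:
        p.start()
    for p in stages:
        p.join()
```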

  3. A software pipeline for processing and identification of fungal ITS sequences

    Directory of Open Access Journals (Sweden)

    Kristiansson Erik

    2009-01-01

    Full Text Available Background: Fungi from environmental samples are typically identified to species level through DNA sequencing of the nuclear ribosomal internal transcribed spacer (ITS) region for use in BLAST-based similarity searches in the International Nucleotide Sequence Databases. These searches are time-consuming and regularly require a significant amount of manual intervention and complementary analyses. We here present software – in the form of an identification pipeline for large sets of fungal ITS sequences – developed to automate the BLAST process and several additional analysis steps. The performance of the pipeline was evaluated on a dataset of 350 ITS sequences from fungi growing as epiphytes on building material. Results: The pipeline was written in Perl and uses a local installation of NCBI-BLAST for the similarity searches of the query sequences. The variable subregion ITS2 of the ITS region is extracted from the sequences and used for additional searches of higher sensitivity. Multiple alignments of each query sequence and its closest matches are computed, and query sequences sharing at least 50% of their best matches are clustered to facilitate the evaluation of hypothetically conspecific groups. The pipeline proved to speed up the processing, as well as enhance the resolution, of the evaluation dataset considerably, and the fungi were found to belong chiefly to the Ascomycota, with Penicillium and Aspergillus as the two most common genera. The ITS2 was found to indicate a different taxonomic affiliation than did the complete ITS region for 10% of the query sequences, though this figure is likely to vary with the taxonomic scope of the query sequences. Conclusion: The present software readily assigns large sets of fungal query sequences to their respective best matches in the international sequence databases and places them in a larger biological context. The output is highly structured to be easy to process, although it still needs
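
    The clustering rule quoted above (grouping query sequences that share at least 50% of their best matches) could be implemented roughly as below. The published pipeline is written in Perl; this Python sketch assumes tabular BLAST output with the query and subject IDs in the first two columns, and the file name is a placeholder.

```python
# Sketch of grouping query sequences that share >= 50% of their best BLAST hits.
# Assumes tabular BLAST output (-outfmt 6): query ID in column 1, subject ID in column 2.
from collections import defaultdict
from itertools import combinations

def top_hits(blast_tab, n=10):
    """Return the top-n subject IDs per query, keeping the file's hit order."""
    hits = defaultdict(list)
    with open(blast_tab) as fh:
        for line in fh:
            query, subject = line.split("\t")[:2]
            if subject not in hits[query] and len(hits[query]) < n:
                hits[query].append(subject)
    return hits

def cluster_queries(hits, min_shared=0.5):
    """Single-linkage clustering of queries sharing >= min_shared of best matches."""
    parent = {q: q for q in hits}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a, b in combinations(hits, 2):
        shared = len(set(hits[a]) & set(hits[b]))
        if shared >= min_shared * min(len(hits[a]), len(hits[b])):
            parent[find(a)] = find(b)
    clusters = defaultdict(list)
    for q in hits:
        clusters[find(q)].append(q)
    return list(clusters.values())

# clusters = cluster_queries(top_hits("its_vs_nt.blast.tab"))   # hypothetical file
```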

  4. Status of the SPIRE photometer data processing pipelines during the early phases of the Herschel Mission

    OpenAIRE

    Dowell, C. Darren; Levenson, Louis; Lu, Nanyao; Schulz, Bernhard; Schwartz, Arnold; Shupe, David L.; Xu, C. Kevin; Zhang, Lijun

    2010-01-01

    We describe the current state of the ground segment of Herschel-SPIRE photometer data processing, approximately one year into the mission. The SPIRE photometer operates in two modes: scan mapping and chopped point source photometry. For each mode, the basic analysis pipeline - which follows in reverse the effects from the incidence of light on the telescope to the storage of samples from the detector electronics - is essentially the same as described pre-launch. However, the calibration param...

  5. Modulator and VCSEL-MSM smart pixels for parallel pipeline networking and signal processing

    Science.gov (United States)

    Chen, C.-H.; Hoanca, Bogdan; Kuznia, C. B.; Pansatiankul, Dhawat E.; Zhang, Liping; Sawchuk, Alexander A.

    1999-07-01

    TRANslucent Smart Pixel Array (TRANSPAR) systems perform high performance parallel pipeline networking and signal processing based on optical propagation of 3D data packets. The TRANSPAR smart pixel devices use either self-electro- optic effect GaAs multiple quantum well modulators or CMOS- VCSEL-MSM (CMOS-Vertical Cavity Surface Emitting Laser- Metal-Semiconductor-Metal) technology. The data packets transfer among high throughput photonic network nodes using multiple access/collision detection or token-ring protocols.

  6. ArrayPipe: a flexible processing pipeline for microarray data

    Science.gov (United States)

    Hokamp, Karsten; Roche, Fiona M.; Acab, Michael; Rousseau, Marc-Etienne; Kuo, Byron; Goode, David; Aeschliman, Dana; Bryan, Jenny; Babiuk, Lorne A.; Hancock, Robert E. W.; Brinkman, Fiona S. L.

    2004-01-01

    A number of microarray analysis software packages exist already; however, none combines the user-friendly features of a web-based interface with the ability to analyse multiple arrays at once using flexible analysis steps. The ArrayPipe web server (freely available at www.pathogenomics.ca/arraypipe) allows the automated application of complex analyses to microarray data which can range from single slides to large data sets including replicates and dye-swaps. It handles output from most commonly used quantification software packages for dual-labelled arrays. Application features range from quality assessment of slides through various data visualizations to multi-step analyses including normalization, detection of differentially expressed genes, and comparison and highlighting of gene lists. A highly customizable action set-up facilitates unrestricted arrangement of functions, which can be stored as action profiles. A unique combination of web-based and command-line functionality enables comfortable configuration of processes that can be repeatedly applied to large data sets in high throughput. The output consists of reports formatted as standard web pages and tab-delimited lists of calculated values that can be inserted into other analysis programs. Additional features, such as web-based spreadsheet functionality, auto-parallelization and password protection make this a powerful tool in microarray research for individuals and large groups alike. PMID:15215429

  7. Application of a high-throughput process analytical technology metabolomics pipeline to Port wine forced ageing process

    OpenAIRE

    2014-01-01

    Metabolomics aims at gathering the maximum amount of metabolic information for a total interpretation of biological systems. A process analytical technology pipeline, combining gas chromatography–mass spectrometry data preprocessing with multivariate analysis, was applied to a Port wine “forced ageing” process under different oxygen saturation regimes at 60 °C. It was found that extreme “forced ageing” conditions promote the occurrence of undesirable chemical reactions by production of di...

  8. Fully automated rodent brain MR image processing pipeline on a Midas server: from acquired images to region-based statistics.

    Science.gov (United States)

    Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek

    2013-01-01

    Magnetic resonance imaging (MRI) of rodent brains enables study of the development and the integrity of the brain under certain conditions (alcohol, drugs etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how this pipeline can be used to find differences between populations.

  9. Accelerating root system phenotyping of seedlings through a computer-assisted processing pipeline.

    Science.gov (United States)

    Dupuy, Lionel X; Wright, Gladys; Thompson, Jacqueline A; Taylor, Anna; Dekeyser, Sebastien; White, Christopher P; Thomas, William T B; Nightingale, Mark; Hammond, John P; Graham, Neil S; Thomas, Catherine L; Broadley, Martin R; White, Philip J

    2017-01-01

    There are numerous systems and techniques to measure the growth of plant roots. However, phenotyping large numbers of plant roots for breeding and genetic analyses remains challenging. One major difficulty is to achieve high throughput and resolution at a reasonable cost per plant sample. Here we describe a cost-effective root phenotyping pipeline, on which we perform time and accuracy benchmarking to identify bottlenecks in such pipelines and strategies for their acceleration. Our root phenotyping pipeline was assembled with custom software and low-cost materials and equipment. Results show that sample preparation and handling of samples during screening are the most time-consuming tasks in root phenotyping. Algorithms can be used to speed up the extraction of root traits from image data, but when applied to large numbers of images, there is a trade-off between time of processing the data and errors contained in the database. Scaling up root phenotyping to large numbers of genotypes will require not only automation of sample preparation and sample handling, but also efficient algorithms for error detection for more reliable replacement of manual interventions.

  10. Gap-free segmentation of vascular networks with automatic image processing pipeline.

    Science.gov (United States)

    Hsu, Chih-Yang; Ghaffari, Mahsa; Alaraj, Ali; Flannery, Michael; Zhou, Xiaohong Joe; Linninger, Andreas

    2017-03-01

    Current image processing techniques capture large vessels reliably but often fail to preserve connectivity in bifurcations and small vessels. Imaging artifacts and noise can create gaps and discontinuity of intensity that hinders segmentation of vascular trees. However, topological analysis of vascular trees requires proper connectivity without gaps, loops or dangling segments. Proper tree connectivity is also important for high quality rendering of surface meshes for scientific visualization or 3D printing. We present a fully automated vessel enhancement pipeline with automated parameter settings for vessel enhancement of tree-like structures from customary imaging sources, including 3D rotational angiography, magnetic resonance angiography, magnetic resonance venography, and computed tomography angiography. The output of the filter pipeline is a vessel-enhanced image which is ideal for generating anatomically consistent network representations of the cerebral angioarchitecture for further topological or statistical analysis. The filter pipeline combined with computational modeling can potentially improve computer-aided diagnosis of cerebrovascular diseases by delivering biometrics and anatomy of the vasculature. It may serve as the first step in fully automatic epidemiological analysis of large clinical datasets. The automatic analysis would enable rigorous statistical comparison of biometrics in subject-specific vascular trees. The robust and accurate image segmentation using a validated filter pipeline would also eliminate operator dependency that has been observed in manual segmentation. Moreover, manual segmentation is time prohibitive given that vascular trees have thousands of segments and bifurcations, so interactive segmentation consumes excessive human resources. Subject-specific trees are a first step toward patient-specific hemodynamic simulations for assessing treatment outcomes.
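
    As an illustration of the kind of filtering such a pipeline automates, the sketch below applies a multi-scale Frangi vesselness filter followed by hysteresis thresholding, which helps retain weak but connected responses at bifurcations. It uses scikit-image and is a minimal stand-in, not the authors' validated filter pipeline; the scales and thresholds are placeholder assumptions.

    ```python
    # Minimal vessel-enhancement sketch: multi-scale Frangi vesselness plus
    # hysteresis thresholding to keep weak responses connected to strong ones.
    import numpy as np
    from skimage.filters import frangi, apply_hysteresis_threshold

    def enhance_vessels(volume, sigmas=(1, 2, 3, 4), low=0.02, high=0.10):
        """Return a vesselness map and a gap-tolerant binary mask."""
        vesselness = frangi(volume.astype(np.float32), sigmas=sigmas, black_ridges=False)
        mask = apply_hysteresis_threshold(vesselness, low, high)
        return vesselness, mask
    ```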

  11. Karhunen-Loeve Transform: An Exercise in Simple Image-Processing Parallel Pipelines

    OpenAIRE

    Fleury, M.; A. C. Downton; Clark, A.F.

    2012-01-01

    Practical parallelizations of multi-phased low-level image-processing algorithms may require working in batch mode. The features of a new processing model, employing a pipeline of processor farms, are described. A simple exemplar, the Karhunen-Loeve transform, is prototyped on a network of processors running a real-time operating system. The design trade-offs for this and similar algorithms are indicated. In the manner of co-design, eventual implementation on large- and fine-grained hardware ...

  12. Application of groundwater aggressiveness assessment method for estimation of the karst process at main gas pipeline construction

    Science.gov (United States)

    Ermolaeva, A. V.

    2016-03-01

    Maintenance of main pipelines is associated with hazardous engineering and geological working conditions. The article deals with the use of a groundwater aggressiveness assessment method to estimate karst process development during the construction of main gas pipelines. The possibility of using this method is analyzed using the example of the initial section of the designed gas pipeline “Power of Siberia” (section “Chayanda-Lensk”). The calculation of the nonequilibrium index Ca was made in accordance with the geotechnical survey data. The dependencies between the geomorphological features of the terrain and the aggressiveness of natural waters were determined.

  13. EST Pipeline System: Detailed and Automated EST Data Processing and Mining

    Institute of Scientific and Technical Information of China (English)

    Hao Xu; Liang Zhang; Hong Yu; Yan Zhou; Ling He; Yuanzhong Zhu; Wei Huang; Lijun Fang; Lin Tao; Yuedong Zhu; Lin Cai; Huayong Xu

    2003-01-01

    Expressed sequence tags (ESTs) have been widely used in gene survey research in recent years. The EST Pipeline System, software developed by the Hangzhou Genomics Institute (HGI), can automatically analyze EST datasets of different scales using suitable methods. All the analysis reports, including those of vector masking, sequence assembly, gene annotation, Gene Ontology classification, and some other analyses, can be browsed and searched as well as downloaded in Excel format from the web interface, saving researchers the routine data processing needed to uncover the biological rules embedded in the data.

  14. Energy efficiency evaluation of a natural gas pipeline based on an analytic hierarchy process

    National Research Council Canada - National Science Library

    Xie, Ying; Ma, Xiufen; Ning, Haifeng; Yuan, Zongming; Xie, Ting

    2017-01-01

    A long-distance natural gas pipeline system consists of considerable equipment and many pipe segments, but the conventional energy efficiency index of a natural gas pipeline is considered as a whole...

  15. Natural language processing pipelines to annotate BioC collections with an application to the NCBI disease corpus.

    Science.gov (United States)

    Comeau, Donald C; Liu, Haibin; Islamaj Doğan, Rezarta; Wilbur, W John

    2014-01-01

    BioC is a new format and associated code libraries for sharing text and annotations. We have implemented BioC natural language preprocessing pipelines in two popular programming languages: C++ and Java. The current implementations interface with the well-known MedPost and Stanford natural language processing tool sets. The pipeline functionality includes sentence segmentation, tokenization, part-of-speech tagging, lemmatization and sentence parsing. These pipelines can be easily integrated along with other BioC programs into any BioC compliant text mining systems. As an application, we converted the NCBI disease corpus to BioC format, and the pipelines have successfully run on this corpus to demonstrate their functionality. Code and data can be downloaded from http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net. © The Author(s) 2014. Published by Oxford University Press.
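
    The BioC pipelines themselves are C++ and Java wrappers around MedPost and the Stanford tools, but the stages they chain together can be illustrated with a short Python analogue. The sketch below uses NLTK purely as a stand-in for those tool sets; it is not the BioC code, and it assumes the relevant NLTK data packages ('punkt', 'averaged_perceptron_tagger', 'wordnet') are installed.

    ```python
    # Illustrative analogue of the listed stages: sentence segmentation,
    # tokenization, POS tagging and lemmatization.
    import nltk
    from nltk.stem import WordNetLemmatizer

    def annotate(text):
        """Return per-sentence annotations for a passage of biomedical text."""
        lemmatizer = WordNetLemmatizer()
        annotations = []
        for sentence in nltk.sent_tokenize(text):
            tokens = nltk.word_tokenize(sentence)
            tagged = nltk.pos_tag(tokens)
            lemmas = [lemmatizer.lemmatize(tok.lower()) for tok, _ in tagged]
            annotations.append({"sentence": sentence, "pos": tagged, "lemmas": lemmas})
        return annotations
    ```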

  16. Fast, accurate and easy-to-pipeline methods for amplicon sequence processing

    Science.gov (United States)

    Antonielli, Livio; Sessitsch, Angela

    2016-04-01

    Next generation sequencing (NGS) technologies have been established for years as an essential resource in microbiology. While on the one hand metagenomic studies can benefit from the continuously increasing throughput of the Illumina (Solexa) technology, on the other hand the spread of third generation sequencing technologies (PacBio, Oxford Nanopore) is taking whole genome sequencing beyond the assembly of fragmented draft genomes, making it now possible to finish bacterial genomes even without short read correction. Besides (meta)genomic analysis, next-gen amplicon sequencing is still fundamental for microbial studies. Amplicon sequencing of the 16S rRNA gene and ITS (Internal Transcribed Spacer) remains a well-established, widespread method for a multitude of different purposes concerning the identification and comparison of archaeal/bacterial (16S rRNA gene) and fungal (ITS) communities occurring in diverse environments. Numerous different pipelines have been developed in order to process NGS-derived amplicon sequences, among which Mothur, QIIME and USEARCH are the most well-known and cited ones. The entire process from initial raw sequence data through read error correction, paired-end read assembly, primer stripping, quality filtering, clustering, OTU taxonomic classification and BIOM table rarefaction as well as alternative "normalization" methods will be addressed. An effective and accurate strategy will be presented using the state-of-the-art bioinformatic tools and the example of a straightforward one-script pipeline for 16S rRNA gene or ITS MiSeq amplicon sequencing will be provided. Finally, instructions on how to automatically retrieve nucleotide sequences from NCBI and thereby apply the pipeline to targets other than the 16S rRNA gene (Greengenes, SILVA) and ITS (UNITE) will be discussed.
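
    In practice these steps are delegated to tools such as Mothur, QIIME or USEARCH, but a toy version of three of them — primer stripping, quality filtering and dereplication — can be sketched directly in Python. The primer sequence, Phred offset of 33 and quality threshold below are illustrative assumptions, not recommendations.

    ```python
    # Toy sketch of primer stripping, mean-quality filtering and dereplication
    # from a FASTQ file; real pipelines use dedicated tools for these steps.
    from collections import Counter

    def fastq_records(path):
        with open(path) as fh:
            while True:
                header = fh.readline().strip()
                if not header:
                    return
                seq = fh.readline().strip()
                fh.readline()                      # '+' separator line
                qual = fh.readline().strip()
                yield header, seq, qual

    def process(path, primer="GTGCCAGCMGCCGCGGTAA", min_mean_q=25):
        kept = Counter()
        for _, seq, qual in fastq_records(path):
            if not seq.startswith(primer[:8]):     # crude 5' primer check
                continue
            seq, qual = seq[len(primer):], qual[len(primer):]
            scores = [ord(c) - 33 for c in qual]   # Phred+33 assumed
            if scores and sum(scores) / len(scores) >= min_mean_q:
                kept[seq] += 1                     # dereplicate identical reads
        return kept.most_common()
    ```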

  17. IMPROVED PROCESSING FOR MICROSTRUCTURES AND MECHANICAL PROPERTIES OF A COMMERCIAL PIPELINE STEEL

    Institute of Scientific and Technical Information of China (English)

    Y.C. Wang; Y.S. Li; M.C. Zhao; K. Yang

    2005-01-01

    The transformation products of hot-deformation simulation experiments were investigated using a Gleeble-1500 hot simulator for a commercial pipeline steel. Based on the investigation results, improved thermo-mechanical control processing (TMCP) schedules containing a two-stage multi-pass controlled rolling coupled with moderate cooling rates were applied to hot rolling experiments, and an acicular ferrite dominated microstructure was obtained. Microstructures and mechanical properties of hot rolled plates were related to TMCP processing, and regression equations describing the relation between processing parameters and mechanical properties in the current TMCP were developed, which could be used to predict mechanical properties of the experimental steel during commercial processing. It was found that with an increase in cooling rate after hot rolling, grain size in the microstructure became smaller, the amount of polygonal ferrite decreased and acicular ferrite increased, and accordingly mechanical properties increased.

  18. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and the standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process involving more than 100 employees and one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implement regional and corporate procedures, focusing on the main operational processes. (author)

  19. The Herschel Data Processing System - Hipe And Pipelines - During The Early Mission Phase

    Science.gov (United States)

    Ardila, David R.; Herschel Science Ground Segment Consortium

    2010-01-01

    The Herschel Space Observatory, the fourth cornerstone mission in the ESA science program, was launched 14th of May 2009. With a 3.5 m telescope, it is the largest space telescope ever launched. Herschel's three instruments (HIFI, PACS, and SPIRE) perform photometry and spectroscopy in the 55 - 672 micron range and will deliver exciting science for the astronomical community during at least three years of routine observations. Here we summarize the state of the Herschel Data Processing System and give an overview of future development milestones and plans. The development of the Herschel Data Processing System started seven years ago to support the data analysis for Instrument Level Tests. Resources were made available to implement a freely distributable Data Processing System capable of interactively and automatically reducing Herschel data at different processing levels. The system combines data retrieval, pipeline execution and scientific analysis in one single environment. The software is coded in Java and Jython to be platform independent and to avoid the need for commercial licenses. The Herschel Interactive Processing Environment (HIPE) is the user-friendly face of Herschel Data Processing. The first PACS preview observation of M51 was processed with HIPE, using basic pipeline scripts, into a fantastic image within 30 minutes of data reception. Also the first HIFI observations on DR-21 were successfully reduced to high quality spectra, followed by SPIRE observations on M66 and M74. The Herschel Data Processing System is a joint development by the Herschel Science Ground Segment Consortium, consisting of ESA, the NASA Herschel Science Center, and the HIFI, PACS and SPIRE consortium members.

  20. Extending the Fermi-LAT Data Processing Pipeline to the Grid

    Science.gov (United States)

    Zimmer, S.; Arrabito, L.; Glanzman, T.; Johnson, T.; Lavalley, C.; Tsaregorodtsev, A.

    2012-12-01

    The Data Handling Pipeline (“Pipeline”) has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT) which launched in June 2008. Since then it has been in use to completely automate the production of data quality monitoring quantities, reconstruction and routine analysis of all data received from the satellite and to deliver science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses are also reasonably heavy loads on the pipeline and computing resources. These other loads, unlike Level 1, can run continuously for weeks or months at a time. In addition it receives heavy use in performing production Monte Carlo tasks. In daily use it receives a new data download every 3 hours and launches about 2000 jobs to process each download, typically completing the processing of the data before the next download arrives. The need for manual intervention has been reduced to less than 0.01% of submitted jobs. The Pipeline software is written almost entirely in Java and comprises several modules. It includes web services that allow online monitoring and provide charts summarizing workflow aspects and performance information. The server supports communication with several batch systems such as LSF and BQS and recently also Sun Grid Engine and Condor. This is accomplished through dedicated job control services that for Fermi are running at SLAC and the other computing site involved in this large scale framework, the Lyon computing center of IN2P3. While being different in the logic of a task, we evaluate a separate interface to the Dirac system in order to communicate with EGI sites to utilize Grid resources, using dedicated Grid optimized systems rather than developing our own. More recently the Pipeline and its associated data catalog have been generalized for use by other experiments, and are

  1. A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Berres, Anne Sabine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Adhinarayanan, Vignesh [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Turton, Terece [Univ. of Texas, Austin, TX (United States); Feng, Wu [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Rogers, David Honegger [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-12

    Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption and cognitive value of visualizations of resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are two-fold: using a simple example reduces unnecessary complexity as we know what to expect from the results. Furthermore, it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we did, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results we produced through sampling.
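
    A minimal sketch of the regular-sampling idea is shown below: scattered simulation values are interpolated onto a coarse regular grid, with the grid resolution acting as the knob that trades storage and energy against fidelity. This is an illustration under simplifying assumptions, not the pipeline described in the report.

    ```python
    # Resample an unstructured point cloud (coordinates + scalar values) onto a
    # regular grid before storage, compression or rendering.
    import numpy as np
    from scipy.interpolate import griddata

    def regular_sample(points, values, resolution=64):
        """points: (N, 3) coordinates; values: (N,) scalars from the simulation."""
        lo, hi = points.min(axis=0), points.max(axis=0)
        axes = [np.linspace(lo[d], hi[d], resolution) for d in range(3)]
        gx, gy, gz = np.meshgrid(*axes, indexing="ij")
        grid = griddata(points, values, (gx, gy, gz), method="linear")
        return np.nan_to_num(grid)        # cells outside the convex hull become 0
    ```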

  2. Automatically designing an image processing pipeline for a five-band camera prototype using the local, linear, learned (L3) method

    Science.gov (United States)

    Tian, Qiyuan; Blasinski, Henryk; Lansel, Steven; Jiang, Haomiao; Fukunishi, Munenori; Farrell, Joyce E.; Wandell, Brian A.

    2015-02-01

    The development of an image processing pipeline for each new camera design can be time-consuming. To speed camera development, we developed a method named L3 (Local, Linear, Learned) that automatically creates an image processing pipeline for any design. In this paper, we describe how we used the L3 method to design and implement an image processing pipeline for a prototype camera with five color channels. The process includes calibrating and simulating the prototype, learning local linear transforms and accelerating the pipeline using graphics processing units (GPUs).
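
    The core of the L3 idea — grouping sensor patches into classes and learning one linear transform per class — can be caricatured in a few lines of NumPy. The sketch below classifies patches simply by their mean response level and fits ridge-regularised transforms; the class definition, patch size and regularisation are simplifying assumptions, not the published method.

    ```python
    # Rough "local, linear, learned" caricature: one ridge-regression transform
    # per response-level class, mapping flattened sensor patches to target RGB.
    import numpy as np

    def fit_l3(patches, targets, n_classes=8, reg=1e-3):
        """patches: (N, P) flattened sensor patches; targets: (N, 3) desired RGB."""
        levels = patches.mean(axis=1)
        edges = np.quantile(levels, np.linspace(0, 1, n_classes + 1))
        classes = np.clip(np.digitize(levels, edges[1:-1]), 0, n_classes - 1)
        transforms = []
        for c in range(n_classes):
            X, Y = patches[classes == c], targets[classes == c]
            A = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ Y)
            transforms.append(A)
        return edges, transforms

    def apply_l3(patches, edges, transforms):
        levels = patches.mean(axis=1)
        classes = np.clip(np.digitize(levels, edges[1:-1]), 0, len(transforms) - 1)
        out = np.empty((patches.shape[0], 3))
        for c, A in enumerate(transforms):
            out[classes == c] = patches[classes == c] @ A
        return out
    ```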

  3. Numerical simulation of wave-induced scour and backfilling processes beneath submarine pipelines

    DEFF Research Database (Denmark)

    Fuhrman, David R.; Baykal, Cüneyt; Sumer, B. Mutlu

    2014-01-01

    A fully-coupled hydrodynamic/morphodynamic numerical model is presented and utilized for the simulation of wave-induced scour and backfilling processes beneath submarine pipelines. The model is based on solutions to Reynolds-averaged Navier–Stokes equations, coupled with k−ω turbulence closure......≤30 demonstrate reasonable match with previous experiments, both in terms of the equilibrium scour depth as well as the scour time scale. Wave-induced backfilling processes are additionally studied by subjecting initial conditions taken from scour simulations with larger KC to new wave climates...... characterized by lower KC values. The simulations considered demonstrate the ability of the model to predict backfilling toward expected equilibrium scour depths based on the new wave climate, in line with experimental expectations. The simulated backfilling process is characterized by two stages: (1...

  4. Standardization process aligned to integrated management system: the case of TRANSPETRO's Oil Pipelines and Terminals Unit

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Labrunie, Charles; Araujo, Dario Doria de [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil). Diretoria de Terminais e Oleodutos

    2009-07-01

    This paper presents the implementation by PETROBRAS Transporte S.A. - TRANSPETRO of its Oil Pipelines and Terminals Standardization Program (PRONOT) within the scope of the 'Integrated Management System' (IMS). This program, launched in 2006 in the regions where the company operates, aims at standardizing all of its oil pipeline and terminal operations. Its implementation was planned in two phases: the first, already successfully concluded, refers to pipeline operations, industrial maintenance and right-of-way activities management; and the second, initiated in 2009, encompasses cross-sectional activities including health, safety and environment (HSE); training and development of the oil pipeline workforce; communication with stakeholders; oil pipeline integrity; and engineering project requirements. The documental structures of TRANSPETRO IMS and PRONOT are described and represented graphically to emphasize the intentional alignment of the standardization process carried out by the Oil Pipelines and Terminals Unit to the corporate IMS, based upon national and international literature review and through practical research focusing on the best international practices. (author)

  5. A data processing pipeline for mammalian proteome dynamics studies using stable isotope metabolic labeling.

    Science.gov (United States)

    Guan, Shenheng; Price, John C; Prusiner, Stanley B; Ghaemmaghami, Sina; Burlingame, Alma L

    2011-12-01

    In a recent study, in vivo metabolic labeling using (15)N traced the rate of label incorporation among more than 1700 proteins simultaneously and enabled the determination of individual protein turnover rate constants over a dynamic range of three orders of magnitude (Price, J. C., Guan, S., Burlingame, A., Prusiner, S. B., and Ghaemmaghami, S. (2010) Analysis of proteome dynamics in the mouse brain. Proc. Natl. Acad. Sci. U. S. A. 107, 14508-14513). These studies of protein dynamics provide a deeper understanding of healthy development and well-being of complex organisms, as well as the possible causes and progression of disease. In addition to a fully labeled food source and appropriate mass spectrometry platform, an essential and enabling component of such large scale investigations is a robust data processing and analysis pipeline, which is capable of the reduction of large sets of liquid chromatography tandem MS raw data files into the desired protein turnover rate constants. The data processing pipeline described in this contribution is comprised of a suite of software modules required for the workflow that fulfills such requirements. This software platform includes established software tools such as a mass spectrometry database search engine together with several additional, novel data processing modules specifically developed for (15)N metabolic labeling. These fulfill the following functions: (1) cross-extraction of (15)N-containing ion intensities from raw data files at varying biosynthetic incorporation times, (2) computation of peptide (15)N isotopic incorporation distributions, and (3) aggregation of relative isotope abundance curves for multiple peptides into single protein curves. In addition, processing parameter optimization and noise reduction procedures were found to be necessary in the processing modules in order to reduce propagation of errors in the long chain of the processing steps of the entire workflow.
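
    The final stage of such a workflow, turning aggregated relative isotope abundance (RIA) curves into a protein turnover rate constant, can be sketched as a single-exponential fit. The snippet below is a simplified illustration, not the authors' software; pooling peptides by the median and the one-compartment exponential model are assumptions made here.

    ```python
    # Fit a first-order turnover rate constant to pooled peptide RIA curves.
    import numpy as np
    from scipy.optimize import curve_fit

    def ria_model(t, k):
        """Fraction of pre-existing (unlabelled) protein remaining at time t."""
        return np.exp(-k * t)

    def protein_turnover(times, peptide_rias):
        """times: (T,) labelling times; peptide_rias: list of (T,) RIA arrays."""
        pooled = np.nanmedian(np.vstack(peptide_rias), axis=0)   # aggregate peptides
        (k,), cov = curve_fit(ria_model, times, pooled, p0=[0.1], bounds=(0, np.inf))
        return k, np.sqrt(cov[0, 0])                             # rate and its error
    ```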

  6. Predictive modeling of colorectal cancer using a dedicated pre-processing pipeline on routine electronic medical records

    NARCIS (Netherlands)

    Kop, Reinier; Hoogendoorn, Mark; Teije, Annette Ten; Büchner, Frederike L; Slottje, Pauline; Moons, Leon M G; Numans, Mattijs E

    2016-01-01

    Over the past years, research utilizing routine care data extracted from Electronic Medical Records (EMRs) has increased tremendously. Yet there are no straightforward, standardized strategies for pre-processing these data. We propose a dedicated medical pre-processing pipeline aimed at taking on

  7. Measurement of the stressed state of welded joints in the NPP process components and circulation pipelines based on acoustoelasticity theory

    Directory of Open Access Journals (Sweden)

    A.I. Trofimov

    2016-09-01

    Full Text Available The paper presents the results of a theoretical justification and an experimental research for a method to measure the stressed state of welded joints in the nuclear power plant (NPP process components and circulation pipelines based on acoustoelasticity theory, as well as for ways to implement them technically. Devices for measuring the stressed state of welded joints in the NPP process components and circulation pipelines based on acoustoelasticity theory allow online measurement of residual stresses along the weld height and detection of crack formation points. The use of such devices will enable early crack detection in welded joints for an increased safety of the NPP operation.
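
    The measurement principle rests on the acoustoelastic relation: for small stresses, the relative change in ultrasonic travel time is approximately proportional to the applied stress. The sketch below illustrates that back-of-the-envelope conversion; the acoustoelastic coefficient used is a placeholder, and in practice it would be calibrated for the particular steel and wave mode.

    ```python
    # Acoustoelastic back-of-the-envelope: sigma ~= (dt / t0) / K, with K a
    # placeholder calibration coefficient (per MPa).
    def stress_from_travel_time(t0_us, t_us, k_per_mpa=-1.0e-5):
        """Estimate stress in MPa from unstressed (t0) and measured (t) travel times."""
        relative_delay = (t_us - t0_us) / t0_us
        return relative_delay / k_per_mpa

    # Example: a 0.05% decrease in travel time maps to ~50 MPa of tensile stress
    # under the placeholder coefficient above.
    print(round(stress_from_travel_time(10.000, 9.995), 1))
    ```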

  8. Application of Method-of-Lines to Charging-Up Process in Pipelines with Entrapped Air

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yongliang; K. Vairavamoorthy

    2006-01-01

    A mathematical model is presented for the charging-up process in an air-entrapped pipeline with moving boundary conditions. A coordinate transformation technique is employed to reduce fluid motion in time-dependent domains to ones in time-independent domains. The nonlinear hyperbolic partial differential equations governing the unsteady motion of fluid combined with an equation for transient shear stress between the pipe wall and the flowing fluid are solved by the method of lines. Results show that ignoring elastic effects overestimates the maximum pressure and underestimates the maximum front velocity of filling fluid. The peak pressure of the entrapped air is sensitive to the length of the initial entrapped air pocket.
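
    To illustrate the numerical strategy named in the abstract, the sketch below applies the method of lines to the classical fixed-domain water-hammer equations: the spatial derivatives are discretized with finite differences and the resulting ODE system is handed to a standard integrator. It deliberately omits the paper's moving filling front and entrapped-air pocket, so it is only a simplified analogue of the model, with placeholder pipe and friction parameters.

    ```python
    # Method-of-lines sketch for simplified 1D water-hammer equations.
    import numpy as np
    from scipy.integrate import solve_ivp

    a, g, f, D, length, N = 1000.0, 9.81, 0.02, 0.3, 500.0, 100   # wave speed, friction, pipe
    dx = length / N

    def rhs(t, y):
        H, V = y[:N + 1], y[N + 1:]
        dHdx = np.gradient(H, dx)
        dVdx = np.gradient(V, dx)
        dHdt = -(a ** 2 / g) * dVdx
        dVdt = -g * dHdx - f * V * np.abs(V) / (2 * D)
        dHdt[0], dVdt[-1] = 0.0, 0.0          # fixed head upstream, dead end downstream
        return np.concatenate([dHdt, dVdt])

    H0 = np.full(N + 1, 50.0)                  # initial head (m)
    H0[0] = 60.0                               # step in upstream head drives the transient
    V0 = np.zeros(N + 1)                       # fluid initially at rest
    sol = solve_ivp(rhs, (0.0, 2.0), np.concatenate([H0, V0]), max_step=1e-3)
    ```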

  9. GMAW (Gas Metal Arc Welding) process development for girth welding of high strength pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Rajan, Vaidyanath; Daniel, Joe; Quintana, Marie [The Lincoln Electric Company, Cleveland, OH (United States); Chen, Yaoshan [Center for Reliable Energy Systems (CRES), Dublin, OH (United States); Souza, Antonio [Lincoln Electric do Brasil, Guarulhos, SP (Brazil)

    2009-07-01

    This paper highlights some of the results and findings from the first phase of a consolidated program co-funded by US Department of Transportation Pipeline and Hazardous Materials Safety Administration (PHMSA) and Pipeline Research Council Inc (PRCI) to develop pipe weld assessment and qualification methods and optimize X 100 pipe welding technologies. One objective of the program is to establish the range of viable welding options for X 100 line pipe, and define the essential variables to provide welding process control for reliable and consistent mechanical performance of the weldments. In this first phase, a series of narrow gap girth welds were made with pulsed gas metal arc welding (GMAW), instrumented with thermocouples in the heat affected zone (HAZ) and weld metal to obtain the associated thermal profiles, and instrumented to measure true energy input as opposed to conventional heat input. Results reveal that true heat input is 16%-22% higher than conventional heat input. The thermal profile measurements correlate very well with thermal model predictions using true energy input data, which indicates the viability of treating the latter as an essential variable. Ongoing microstructural and mechanical testing work will enable validation of an integrated thermal-microstructural model being developed for these applications. Outputs from this model will be used to correlate essential welding process variables with weld microstructure and hardness. This will ultimately enable development of a list of essential variables and the ranges needed to ensure mechanical properties are achieved in practice, recommendations for controlling and monitoring these essential variables and test methods suitable for classification of welding consumables. (author)

  10. Application of a high-throughput process analytical technology metabolomics pipeline to Port wine forced ageing process.

    Science.gov (United States)

    Castro, Cristiana C; Martins, R C; Teixeira, José A; Silva Ferreira, António C

    2014-01-15

    Metabolomics aims at gathering the maximum amount of metabolic information for a total interpretation of biological systems. A process analytical technology pipeline, combining gas chromatography-mass spectrometry data preprocessing with multivariate analysis, was applied to a Port wine "forced ageing" process under different oxygen saturation regimes at 60°C. It was found that extreme "forced ageing" conditions promote the occurrence of undesirable chemical reactions by production of dioxane and dioxolane isomers, furfural and 5-hydroxymethylfurfural, which affect the quality of the final product through the degradation of the wine aromatic profile, colour and taste. High kinetic correlations were also found between these key metabolites and benzaldehyde, sotolon, and many other metabolites that contribute to the final aromatic profile of the Port wine. The use of kinetic correlations in time-dependent processes such as wine ageing can further contribute to biological or chemical systems monitoring, new biomarker discovery and metabolic network investigations.

  11. ALMA Pipeline: Current Status

    Science.gov (United States)

    Shinnaga, H.; Humphreys, E.; Indebetouw, R.; Villard, E.; Kern, J.; Davis, L.; Miura, R. E.; Nakazato, T.; Sugimoto, K.; Kosugi, G.; Akiyama, E.; Muders, D.; Wyrowski, F.; Williams, S.; Lightfoot, J.; Kent, B.; Momjian, E.; Hunter, T.; ALMA Pipeline Team

    2015-12-01

    The ALMA Pipeline is the automated data reduction tool that runs on ALMA data. The current version of the ALMA Pipeline produces science-quality data products for standard interferometric observing modes up to the calibration process. The ALMA Pipeline is comprised of (1) heuristics in the form of Python scripts that select the best processing parameters, and (2) contexts that are kept for book-keeping of the data processes. The ALMA Pipeline produces a "weblog" that showcases detailed plots so users can judge how each step of the calibration process is handled. The ALMA Interferometric Pipeline was conditionally accepted in March 2014 after processing Cycle 0 and Cycle 1 data sets. Since Cycle 2, the ALMA Pipeline has been used for ALMA data reduction and quality assurance for projects whose observing modes it supports. Pipeline tasks are available based on CASA version 4.2.2, and the first public pipeline release, called CASA 4.2.2-pipe, has been available since October 2014. ALMA data can be reduced with both CASA tasks and pipeline tasks using CASA version 4.2.2-pipe.

  12. Optimizing the fMRI data-processing pipeline using prediction and reproducibility performance metrics: I. A preliminary group analysis

    DEFF Research Database (Denmark)

    Strother, Stephen C.; Conte, Stephen La; Hansen, Lars Kai

    2004-01-01

    of baseline scans have constant, equal means, and this assumption was assessed with prediction metrics. Higher-order polynomial warps compared to affine alignment had only a minor impact on the performance metrics. We found that both prediction and reproducibility metrics were required for optimizing......We argue that published results demonstrate that new insights into human brain function may be obscured by poor and/or limited choices in the data-processing pipeline, and review the work on performance metrics for optimizing pipelines: prediction, reproducibility, and related empirical Receiver...... Operating Characteristic (ROC) curve metrics. Using the NPAIRS split-half resampling framework for estimating prediction/reproducibility metrics (Strother et al., 2002), we illustrate its use by testing the relative importance of selected pipeline components (interpolation, in-plane spatial smoothing...

  13. Development of an automated data acquisition and processing pipeline using multiple telescopes for observing transient phenomena

    Science.gov (United States)

    Savant, Vaibhav; Smith, Niall

    2016-07-01

    We report on the current status of the development of a pilot automated data acquisition and reduction pipeline based around the operation of two nodes of remotely operated robotic telescopes based in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes - 6" and 16" OTAs housed in two separate domes while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to throw more light on the microvariability of blazars employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of the fact that they are located in strategically separated time zones. Ultimately we wish to investigate the applicability of Shock-in-Jet and Geometric models. These try to explain the processes at work in AGNs which result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with using a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and which has been optimised for simultaneous two-band photometry on our 16" OTA.
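
    The trigger logic sketched in the abstract — differential photometry of the target against comparison stars with an alert when the newest point departs from the baseline — might look roughly like the following. This is a hypothetical illustration, not the QuickPhot code; the flux arrays are assumed to come from the aperture photometry step.

    ```python
    # Hypothetical variability trigger based on differential photometry.
    import numpy as np

    def check_variability(target_flux, comparison_flux, threshold_mag=0.05):
        """target_flux: (T,); comparison_flux: (T, C) fluxes of C comparison stars."""
        diff_mag = -2.5 * np.log10(target_flux / comparison_flux.mean(axis=1))
        baseline = np.median(diff_mag[:-1])       # history excluding the newest point
        scatter = np.std(diff_mag[:-1])
        deviation = abs(diff_mag[-1] - baseline)
        triggered = deviation > max(threshold_mag, 3 * scatter)
        return triggered, deviation
    ```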

  14. Predictive modeling of colorectal cancer using a dedicated pre-processing pipeline on routine electronic medical records.

    Science.gov (United States)

    Kop, Reinier; Hoogendoorn, Mark; Teije, Annette Ten; Büchner, Frederike L; Slottje, Pauline; Moons, Leon M G; Numans, Mattijs E

    2016-09-01

    Over the past years, research utilizing routine care data extracted from Electronic Medical Records (EMRs) has increased tremendously. Yet there are no straightforward, standardized strategies for pre-processing these data. We propose a dedicated medical pre-processing pipeline aimed at taking on many problems and opportunities contained within EMR data, such as their temporal, inaccurate and incomplete nature. The pipeline is demonstrated on a dataset of routinely recorded data in general practice EMRs of over 260,000 patients, in which the occurrence of colorectal cancer (CRC) is predicted using various machine learning techniques (i.e., CART, LR, RF) and subsets of the data. CRC is a common type of cancer, of which early detection has proven to be important yet challenging. The results are threefold. First, the predictive models generated using our pipeline reconfirmed known predictors and identified new, medically plausible, predictors derived from the cardiovascular and metabolic disease domain, validating the pipeline's effectiveness. Second, the difference between the best model generated by the data-driven subset (AUC 0.891) and the best model generated by the current state of the art hypothesis-driven subset (AUC 0.864) is statistically significant at the 95% confidence interval level. Third, the pipeline itself is highly generic and independent of the specific disease targeted and the EMR used. In conclusion, the application of established machine learning techniques in combination with the proposed pipeline on EMRs has great potential to enhance disease prediction, and hence early detection and intervention in medical practice. Copyright © 2016 Elsevier Ltd. All rights reserved.
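
    Once such a pre-processing pipeline has produced a flat feature matrix, the modelling stage is conventional. The sketch below fits one of the learners mentioned (a random forest) and reports the AUC; it assumes the feature construction, which is the actual contribution of the paper, has already happened upstream, and the hyperparameters shown are placeholders.

    ```python
    # Fit a random forest on pre-processed EMR features and report the AUC.
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    def evaluate(features, labels):
        """features: (patients, variables) array; labels: 1 = CRC diagnosis."""
        X_tr, X_te, y_tr, y_te = train_test_split(
            features, labels, test_size=0.3, stratify=labels, random_state=0)
        model = RandomForestClassifier(n_estimators=300, class_weight="balanced",
                                       random_state=0).fit(X_tr, y_tr)
        return roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    ```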

  15. 77 FR 31827 - Pipeline Safety: Pipeline Damage Prevention Programs

    Science.gov (United States)

    2012-05-30

    ... Safety: Pipeline Damage Prevention Programs AGENCY: Pipeline and Hazardous Materials Safety... excavation damage prevention law enforcement programs; establish an administrative process for making... excavation damage prevention law enforcement programs; and establish the adjudication process...

  16. Pipeline engineering

    CERN Document Server

    Liu, Henry

    2003-01-01

    PART I: PIPE FLOWS. INTRODUCTION: Definition and Scope; Brief History of Pipelines; Existing Major Pipelines; Importance of Pipelines; Freight (Solids) Transport by Pipelines; Types of Pipelines; Components of Pipelines; Advantages of Pipelines; References. SINGLE-PHASE INCOMPRESSIBLE NEWTONIAN FLUID: Introduction; Flow Regimes; Local Mean Velocity and Its Distribution (Velocity Profile); Flow Equations for One-Dimensional Analysis; Hydraulic and Energy Grade Lines; Cavitation in Pipeline Systems; Pipe in Series and Parallel; Interconnected Reservoirs; Pipe Network; Unsteady Flow in Pipe. SINGLE-PHASE COMPRESSIBLE FLOW IN PIPE: Flow Ana

  17. On-line, real-time monitoring for petrochemical and pipeline process control applications

    Energy Technology Data Exchange (ETDEWEB)

    Kane, Russell D.; Eden, D.C.; Cayard, M.S.; Eden, D.A.; Mclean, D.T. [InterCorr International, Inc., 14503 Bammel N. Houston, Suite 300, Houston Texas 77014 (United States); Kintz, J. [BASF Corporation, 602 Copper Rd., Freeport, Texas 77541 (United States)

    2004-07-01

    Corrosion problems in petroleum and petrochemical plants and pipelines may be inherent to the processes, but costly and damaging equipment losses are not. With the continual drive to increase productivity, while protecting product quality, safety and the environment, corrosion must become a variable that can be continuously monitored and assessed. This millennium has seen the introduction of new 'real-time', online measurement technologies and vast improvements in methods of electronic data handling. The 'replace when it fails' approach is receding into a distant memory; facilities management today is embracing new technology, and rapidly appreciating the value it has to offer. It has offered the capability to increase system run time between major inspections, reduce the time and expense associated with turnaround or in-line inspections, and reduce major upsets which cause unplanned shutdowns. The end result is the ability to know, on a practical basis, how 'hard' facilities can be pushed before excessive corrosion damage will result, so that process engineers can understand the impact of their process control actions and implement true asset management. This paper refers to the use of an online, real-time electrochemical corrosion monitoring system - SmartCET - in a plant running mostly organic process media. It also highlights other pertinent examples where similar systems have been used to provide useful real-time information to detect system upsets, which would not have been possible otherwise. This monitoring/process control approach has allowed operators and engineers to see, for the first time, changes in corrosion behavior caused by specific variations in process parameters. Process adjustments have been identified that reduce corrosion rates while maintaining acceptable yields and quality. The monitoring system has provided a new window into the chemistry of the process, helping chemical engineers improve their process

  18. Integrating the ODI-PPA scientific gateway with the QuickReduce pipeline for on-demand processing

    Science.gov (United States)

    Young, Michael D.; Kotulla, Ralf; Gopu, Arvind; Liu, Wilson

    2014-07-01

    As imaging systems improve, the size of astronomical data has continued to grow, making the transfer and processing of data a significant burden. To solve this problem for the WIYN Observatory One Degree Imager (ODI), we developed the ODI-Portal, Pipeline, and Archive (ODI-PPA) science gateway, integrating the data archive, data reduction pipelines, and a user portal. In this paper, we discuss the integration of the QuickReduce (QR) pipeline into PPA's Tier 2 processing framework. QR is a set of parallelized, stand-alone Python routines accessible to all users, and operators who can create master calibration products and produce standardized calibrated data, with a short turn-around time. Upon completion, the data are ingested into the archive and portal, and made available to authorized users. Quality metrics and diagnostic plots are generated and presented via the portal for operator approval and user perusal. Additionally, users can tailor the calibration process to their specific science objective(s) by selecting custom datasets, applying preferred master calibrations or generating their own, and selecting pipeline options. Submission of a QuickReduce job initiates data staging, pipeline execution, and ingestion of output data products all while allowing the user to monitor the process status, and to download or further process/analyze the output within the portal. User-generated data products are placed into a private user-space within the portal. ODI-PPA leverages cyberinfrastructure at Indiana University including the Big Red II supercomputer, the Scholarly Data Archive tape system and the Data Capacitor shared file system.

  19. WASS: an open-source stereo processing pipeline for sea waves 3D reconstruction

    Science.gov (United States)

    Bergamasco, Filippo; Benetazzo, Alvise; Torsello, Andrea; Barbariol, Francesco; Carniel, Sandro; Sclavo, Mauro

    2017-04-01

    Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community. In fact, recent advances in both computer vision algorithms and CPU processing power now allow the study of spatio-temporal wave fields with unprecedented accuracy, especially at small scales. Even if simple in theory, many details are difficult for a practitioner to master, so that the implementation of a 3D reconstruction pipeline is in general considered a complex task. For instance, camera calibration, reliable stereo feature matching and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the steps from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS, a completely Open-Source stereo processing pipeline for sea waves 3D reconstruction, available at http://www.dais.unive.it/wass/. Our tool completely automates the recovery of dense point clouds from stereo images by providing three main functionalities. First, WASS can automatically recover the extrinsic parameters of the stereo rig (up to scale) so that no delicate calibration has to be performed in the field. Second, WASS implements a fast 3D dense stereo reconstruction procedure so that an accurate 3D point cloud can be computed from each stereo pair. We rely on the well-consolidated OpenCV library both for the image stereo rectification and disparity map recovery. Lastly, a set of 2D and 3D filtering techniques both on the disparity map and the produced point cloud are implemented to remove the vast majority of erroneous points that can naturally arise while analyzing the optically complex nature of the water surface (examples are sun-glares, large white-capped areas, fog and water aerosol, etc). Developed to be as fast as possible, WASS
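
    WASS itself is a full pipeline, but its dense-stereo core builds on standard OpenCV components. The sketch below, which is not WASS code, shows the typical pattern on an already rectified image pair: semi-global block matching followed by reprojection of the disparity map to 3D. The matcher parameters and the reprojection matrix Q are assumed inputs.

    ```python
    # Dense stereo sketch: SGBM disparity on a rectified pair, then reproject to 3D.
    import cv2
    import numpy as np

    def dense_cloud(left_gray, right_gray, Q):
        """left/right_gray: rectified uint8 images; Q: 4x4 reprojection matrix."""
        matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                        blockSize=5, P1=8 * 5 ** 2, P2=32 * 5 ** 2,
                                        uniquenessRatio=10, speckleWindowSize=100)
        disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
        points = cv2.reprojectImageTo3D(disparity, Q)
        valid = disparity > 0                     # drop unmatched / erroneous pixels
        return points[valid]
    ```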

  20. Research on Precipitate Behavior during Holding Process of X80 Pipeline Steel

    Directory of Open Access Journals (Sweden)

    Niu Tao

    2016-01-01

    Full Text Available The PTT (Precipitate-Temperature-Time) curve of X80 pipeline steel was obtained by the strain relaxation method. The evolution of precipitate particle size during the holding process was simulated with kinetic calculations, and observed using TEM in samples of industrially produced X80 steel with different holding times. It is revealed that the shape of the PTT curve is a typical “C” type, with a nose temperature of 900°C and an incubation time of about 5 s. Kinetic calculation results show that the average particle size increases markedly with holding time. Meanwhile, after holding for 90 s at 950°C, observation of industrially produced X80 steel reveals that the proportion of precipitate particles larger than 60 nm increases dramatically, which is in basic agreement with the calculation results. Dissolved Nb can effectively reduce grain boundary mobility and retard recrystallization through the solute drag effect. Therefore, it is strongly recommended to shorten the holding time without increasing the holding temperature in industrial production, so as to reduce the precipitation of Nb at high temperature and increase the strength and toughness of the steel.

  1. The Pan-STARRS Image Processing Pipeline : Design, Expectations, and PSLib

    Science.gov (United States)

    Magnier, E. A.; Price, P. A.; Pan-STARRS IPP Team

    2005-12-01

    The Pan-STARRS project is nearing completion of a 1.8m wide-field survey telescope with a 1.4 Gigapixel focal plane, called PS-1. Survey operations with this telescope are expected to begin in late 2006. Although PS-1 will perform surveys which exceed several existing major surveys in scale and data volume (eg, SDSS and the CFHT-Legacy Survey), PS-1 is just a prototype for a system consisting of four co-aligned optical trains (PS-4), planned for the 2009 time period. This article discusses the Pan-STARRS Image Processing Pipeline (IPP), which is nearing its initial release. The IPP is designed for both the PS-1 and PS-4 data analysis challenges, and is sufficiently flexible to allow easy adaptation to changes in the telescope design and survey strategies. We present the design concepts, demonstrated throughput measurements, and expectations for astrometric and photometric precision from the data analysis system. We also present the data analysis foundation library which has been built for the IPP, called PSLib. Pan-STARRS is funded under a grant from the U.S. Air Force.

  2. Extending the Fermi-LAT Data Processing Pipeline to the Grid

    CERN Document Server

    Zimmer, Stephan; Glanzman, Tom; Johnson, Tony; Lavalley, Claudia; Tsaregorodtsev, Andrei; 10.1088/1742-6596/396/3/032121

    2012-01-01

    The Data Handling Pipeline ("Pipeline") has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT) which launched in June 2008. Since then it has been in use to completely automate the production of data quality monitoring quantities, reconstruction and routine analysis of all data received from the satellite and to deliver science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses are also reasonably heavy loads on the pipeline and computing resources. These other loads, unlike Level 1, can run continuously for weeks or months at a time. In addition it receives heavy use in performing production Monte Carlo tasks. The software comprises web-services that allow online monitoring and provides charts summarizing work flow aspects and performance information. The server supports communication with several batch systems such as LSF and BQS a...

  3. Extending the Fermi-LAT data processing pipeline to the grid

    Energy Technology Data Exchange (ETDEWEB)

    Zimmer, S. [Stockholm Univ., Stockholm (Sweden); The Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Arrabito, L. [Univ. Montpellier 2, Montpellier (France); Glanzman, T. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Johnson, T. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Lavalley, C. [Univ. Montpellier 2, Montpellier (France); Tsaregorodtsev, A. [Centre de Physique des Particules de Marseille, Marseille (France)

    2015-05-12

    The Data Handling Pipeline ("Pipeline") has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT) which launched in June 2008. Since then it has been in use to completely automate the production of data quality monitoring quantities, reconstruction and routine analysis of all data received from the satellite and to deliver science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses are also reasonably heavy loads on the pipeline and computing resources. These other loads, unlike Level 1, can run continuously for weeks or months at a time. Additionally, it receives heavy use in performing production Monte Carlo tasks.

  4. CLOTU: An online pipeline for processing and clustering of 454 amplicon reads into OTUs followed by taxonomic annotation

    Directory of Open Access Journals (Sweden)

    Shalchian-Tabrizi Kamran

    2011-05-01

    Full Text Available Abstract Background The implementation of high throughput sequencing for exploring biodiversity poses high demands on bioinformatics applications for automated data processing. Here we introduce CLOTU, an online and open access pipeline for processing 454 amplicon reads. CLOTU has been constructed to be highly user-friendly and flexible, since different types of analyses are needed for different datasets. Results In CLOTU, the user can filter out low quality sequences, trim tags, primers, adaptors, perform clustering of sequence reads, and run BLAST against NCBInr or a customized database in a high performance computing environment. The resulting data may be browsed in a user-friendly manner and easily forwarded to downstream analyses. Although CLOTU is specifically designed for analyzing 454 amplicon reads, other types of DNA sequence data can also be processed. A fungal ITS sequence dataset generated by 454 sequencing of environmental samples is used to demonstrate the utility of CLOTU. Conclusions CLOTU is a flexible and easy to use bioinformatics pipeline that includes different options for filtering, trimming, clustering and taxonomic annotation of high throughput sequence reads. Some of these options are not included in comparable pipelines. CLOTU is implemented in a Linux computer cluster and is freely accessible to academic users through the Bioportal web-based bioinformatics service (http://www.bioportal.uio.no).

  5. Hybrid Pluggable Processing Pipeline (HyP3): A cloud-based infrastructure for generic processing of SAR data

    Science.gov (United States)

    Hogenson, K.; Arko, S. A.; Buechler, B.; Hogenson, R.; Herrmann, J.; Geiger, A.

    2016-12-01

    A problem often faced by Earth science researchers is how to scale algorithms that were developed against few datasets and take them to regional or global scales. One significant hurdle can be the processing and storage resources available for such a task, not to mention the administration of those resources. As a processing environment, the cloud offers nearly unlimited potential for compute and storage, with limited administration required. The goal of the Hybrid Pluggable Processing Pipeline (HyP3) project was to demonstrate the utility of the Amazon cloud to process large amounts of data quickly and cost effectively, while remaining generic enough to incorporate new algorithms with limited administration time or expense. Principally built by three undergraduate students at the ASF DAAC, the HyP3 system relies on core Amazon services such as Lambda, the Simple Notification Service (SNS), Relational Database Service (RDS), Elastic Compute Cloud (EC2), Simple Storage Service (S3), and Elastic Beanstalk. The HyP3 user interface was written using Elastic Beanstalk, and the system uses SNS and Lambda to handle creating, instantiating, executing, and terminating EC2 instances automatically. Data are sent to S3 for delivery to customers and removed using standard data lifecycle management rules. In HyP3 all data processing is ephemeral; there are no persistent processes taking compute and storage resources or generating added cost. When complete, HyP3 will leverage the automatic scaling up and down of EC2 compute power to respond to event-driven demand surges correlated with natural disasters or reprocessing efforts. Massive simultaneous processing within EC2 will be able to match the demand spike in ways conventional physical computing power never could, and then tail off incurring no costs when not needed. This presentation will focus on the development techniques and technologies that were used in developing the HyP3 system. Data and process flow will be shown.
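
    A hedged sketch of the event-driven pattern described above (not the actual HyP3 implementation) is shown below: an AWS Lambda handler receives an SNS message naming a granule and launches a short-lived EC2 worker that terminates itself when done. The AMI id, instance type and user-data command are placeholders.

    ```python
    # Event-driven ephemeral processing sketch: SNS message -> Lambda -> EC2 worker.
    import json
    import boto3

    ec2 = boto3.client("ec2")

    def handler(event, context):
        for record in event["Records"]:
            granule = json.loads(record["Sns"]["Message"])["granule"]
            ec2.run_instances(
                ImageId="ami-PLACEHOLDER",                       # placeholder AMI
                InstanceType="c5.xlarge",                        # placeholder type
                MinCount=1, MaxCount=1,
                InstanceInitiatedShutdownBehavior="terminate",   # ephemeral worker
                UserData=f"#!/bin/bash\nprocess_granule {granule} && shutdown -h now\n",
            )
    ```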

  6. Error, reproducibility and sensitivity: a pipeline for data processing of Agilent oligonucleotide expression arrays

    Directory of Open Access Journals (Sweden)

    Posch Wilfried

    2010-06-01

    Full Text Available Abstract Background Expression microarrays are increasingly used to obtain large scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intrarray variability is small (only around 2% of the mean log signal), while interarray variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (about 6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identifies an underlying structure which reflects some of the key biological variables which define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and collected by a variety of operators. Conclusions This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of

  7. THE PROCESSING STEPS IN THE RENEW OF PLUG-FORMING DETAILS OF PIPELINE FITTINGS

    Directory of Open Access Journals (Sweden)

    Vladimir A. Skryabin

    2016-06-01

    Full Text Available Introduction. In the production and repair of pipeline fittings (valves), grinding-in (lapping) is considered one of the major technological operations. Its main task is to ensure the impermeability (tightness) of the shut-off element. Whatever problems arise in achieving tightness, the diagnosed reason is practically always the same: the lapping of the precision surfaces was not carried out well enough. There is a large share of truth in such an answer; however, that is not all, and the problem lies not only in the lapping. Lapping is the finishing operation for the sealing surfaces, and the effectiveness of its application depends not only on exact observance of the recommended conditions and process modes. Of major importance are the quality-forming stages that precede lapping, namely the preliminary machining of the sealing surfaces. If the prior operations are executed poorly, the efficiency of the subsequent lapping operations will be low. Materials and Methods. The article addresses the growing requirement for improved quality, increased productivity, and increased longevity and reliability of machines and products. The lapping (polishing) process makes it possible to obtain machined surfaces with high-quality characteristics. The quality of the finishing operation is assessed against the following criteria: dimensional accuracy, form error, surface waviness indices, surface roughness indices, light-reflecting ability, and the quality characteristics of the surface layer. In restoring the wedge body, the main task is to ensure the tightness of the shut-off element. For its implementation, strict requirements are imposed, namely low surface roughness and tight form and location tolerances; moreover, the precision surface of the wedge body must be homogeneous. Results. In order to attain the specified roughness of the precision surface, the trajectory of the tool motion must have a certain character. Because on this machine-tool a

  8. Pipeline Processing at the Isaac Newton Group: Using ``Live" Images for Public Understanding of Science

    Science.gov (United States)

    Greimel, Robert; Mendez, Javier; Skillen, Ian; Lennon, D. J.; Walton, Nick A.

    The Isaac Newton Group of Telescopes is currently implementing both optical and near-infrared data reduction pipelines for its imaging cameras. For quality control purposes, the quicklook pipelines generate postage-stamp and full-size images (in JPEG format) of all reduced data frames, which can easily be accessed through a web interface. For spectroscopic instruments, only the raw data frames are converted into JPEG images. In this poster we show how these images, combined with automated access to the scheduling information and the current weather and observing conditions, can be used as input to build near real-time web pages for public relations purposes. To enhance the usefulness of this service, a description of the observing project, accessible to the general public, is requested from the observer. Possible use of such a service by planetariums and museums is discussed. This will provide a valuable means of conveying the dynamic nature of the observatory to the wider public.
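    A minimal sketch of the quicklook idea described above, converting a reduced FITS frame into full-size and postage-stamp JPEGs, could look like the following; the percentile stretch, file names and down-sampling factor are illustrative assumptions, not the ING pipeline code.

```python
# Illustrative sketch of turning a reduced FITS frame into JPEG
# "postage stamp" and full-size images for a web page; not the ING
# pipeline code, and the file names are placeholders.
import numpy as np
from astropy.io import fits
import matplotlib.pyplot as plt

def fits_to_jpeg(fits_path, out_full, out_stamp, stamp_step=8):
    data = fits.getdata(fits_path).astype(float)
    # simple percentile stretch so faint structure is visible
    lo, hi = np.percentile(data, (1.0, 99.5))
    scaled = np.clip((data - lo) / (hi - lo), 0.0, 1.0)
    plt.imsave(out_full, scaled, cmap="gray", origin="lower")
    plt.imsave(out_stamp, scaled[::stamp_step, ::stamp_step],
               cmap="gray", origin="lower")

# fits_to_jpeg("r1234567.fits", "r1234567.jpg", "r1234567_stamp.jpg")
```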

  9. Slurry pipeline design approach

    Energy Technology Data Exchange (ETDEWEB)

    Betinol, Roy; Navarro R, Luis [Brass Chile S.A., Santiago (Chile)

    2009-12-19

    Compared to other engineering technologies, the design of commercial long-distance slurry pipelines is a relatively new engineering concept, which gained recognition in the mid-1960s. Slurry pipelines were first introduced to reduce the cost of transporting coal to power generating units. Since then the technology has spread worldwide to transport other minerals such as limestone, copper, zinc and iron. In South America, pipelines are commonly used to transport copper (Chile, Peru and Argentina), iron (Chile and Brazil), zinc (Peru) and bauxite (Brazil). As more mining operations expand and new mine facilities are opened, long-distance slurry pipelines will continue to present a commercially viable option. The intent of this paper is to present the design process and discuss new techniques and approaches used today to ensure better, safer and more economical slurry pipelines. (author)

  10. Leadership Pipeline

    DEFF Research Database (Denmark)

    Elmholdt, Claus Westergård

    2012-01-01

    The article analyses the foundations of the Leadership Pipeline model with a view to assessing the substance behind the model and the prospects for generalising the model to a Danish organisational context.

  11. The NOAO Pipeline Data Manager

    Science.gov (United States)

    Hiriart, R.; Valdes, F.; Pierfederici, F.; Smith, C.; Miller, M.

    2004-07-01

    The Data Manager for the NOAO Pipeline system is a set of interrelated components that are being developed to fulfill the pipeline system's data needs. It includes: (1) management of calibration files (flat, bias, bad pixel mask and crosstalk calibration data); (2) management of the pipeline stages' configuration parameters; and (3) management of the pipeline processing history information for each of the data products generated by the pipeline. The Data Manager components use a distributed, CORBA-based architecture, providing a flexible and extensible object-oriented framework capable of accommodating present and future pipeline data requirements. The Data Manager communicates with the pipeline modules, with internal and external databases, and with other NOAO systems such as the NOAO Archive and the NOAO Data Transport System.

  12. HiCUP: pipeline for mapping and processing Hi-C data [version 1; referees: 2 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Steven Wingett

    2015-11-01

    Full Text Available HiCUP is a pipeline for processing sequence data generated by Hi-C and Capture Hi-C (CHi-C) experiments, which are techniques used to investigate three-dimensional genomic organisation. The pipeline maps data to a specified reference genome and removes artefacts that would otherwise hinder subsequent analysis. HiCUP also produces an easy-to-interpret yet detailed quality control (QC) report that assists in refining experimental protocols for future studies. The software is freely available and has already been used for processing Hi-C and CHi-C data in several recently published peer-reviewed studies.

  13. Remaining Sites Verification Package for the 100-F-26:12, 1.8-m (72-in.) Main Process Sewer Pipeline, Waste Site Reclassification Form 2007-034

    Energy Technology Data Exchange (ETDEWEB)

    J. M. Capron

    2008-04-29

    The 100-F-26:12 waste site was an approximately 308-m-long, 1.8-m-diameter east-west-trending reinforced concrete pipe that joined the North Process Sewer Pipelines (100-F-26:1) and the South Process Pipelines (100-F-26:4) with the 1.8-m reactor cooling water effluent pipeline (100-F-19). In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.

  14. Improving oceanographic data delivery through pipeline processing in a Commercial Cloud Services environment: the Australian Integrated Marine Observing System

    Science.gov (United States)

    Besnard, Laurent; Blain, Peter; Mancini, Sebastien; Proctor, Roger

    2017-04-01

    The Integrated Marine Observing System (IMOS) is a national project funded by the Australian government, established to deliver ocean observations to the marine and climate science community. Now in its 10th year, its mission is to undertake systematic and sustained observations and to turn them into data, products and analyses that can be freely used and reused for broad societal benefit. As IMOS has matured as an observing system, expectations of the system's availability and reliability have also increased, and IMOS is now seen as delivering 'operational' information. In responding to this expectation, IMOS has relocated its services to the commercial cloud service Amazon Web Services. This has enabled IMOS to improve the system architecture, utilizing more advanced capabilities such as object storage (S3, the Simple Storage Service) and autoscaling, and introducing new checking procedures in a pipeline approach. This has improved data availability and resilience, while protecting against human errors in data handling and providing a more efficient ingestion process.

  15. PBAP: a pipeline for file processing and quality control of pedigree data with dense genetic markers

    Science.gov (United States)

    Nato, Alejandro Q.; Chapman, Nicola H.; Sohi, Harkirat K.; Nguyen, Hiep D.; Brkanac, Zoran; Wijsman, Ellen M.

    2015-01-01

    Motivation: Huge genetic datasets with dense marker panels are now common. With the availability of sequence data and recognition of the importance of rare variants, smaller studies based on pedigrees are again also common. Pedigree-based samples often start with a dense marker panel, a subset of which may be used for linkage analysis to reduce computational burden and to limit linkage disequilibrium between single-nucleotide polymorphisms (SNPs). Programs attempting to select markers for linkage panels exist but lack flexibility. Results: We developed a pedigree-based analysis pipeline (PBAP) suite of programs geared towards SNPs and sequence data. PBAP performs quality control, marker selection and file preparation. PBAP sets up files for MORGAN, which can handle analyses for small and large pedigrees, typically human, and its results can be used with other programs and for downstream analyses. We evaluate and illustrate its features with two real datasets. Availability and implementation: PBAP scripts may be downloaded from http://faculty.washington.edu/wijsman/software.shtml. Contact: wijsman@uw.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26231429
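    As a simple illustration of the marker-selection idea mentioned above (thinning a dense SNP panel so that linkage markers are well spaced and informative), a sketch might look like this; it is not PBAP's actual algorithm, and the spacing and minor-allele-frequency thresholds are assumptions.

```python
# Illustrative sketch of thinning a dense SNP panel for linkage analysis
# by enforcing a minimum physical spacing and a minor-allele-frequency
# floor; this is NOT PBAP's actual selection algorithm, just the general
# idea of reducing marker density (and hence local LD).
def thin_markers(snps, min_gap_bp=500_000, min_maf=0.3):
    """snps: list of (name, position_bp, maf) sorted by position."""
    selected = []
    last_pos = None
    for name, pos, maf in snps:
        if maf < min_maf:
            continue
        if last_pos is None or pos - last_pos >= min_gap_bp:
            selected.append(name)
            last_pos = pos
    return selected

panel = [("rs1", 100_000, 0.45), ("rs2", 250_000, 0.48),
         ("rs3", 700_000, 0.31), ("rs4", 900_000, 0.10)]
print(thin_markers(panel))  # ['rs1', 'rs3']
```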

  16. The core pipeline equipment localization process and application prospects in China

    Directory of Open Access Journals (Sweden)

    Zejun Huang

    2014-12-01

    Full Text Available To improve the economic efficiency of gas pipelines, core equipment such as compressor sets and large-diameter valves must be localized. For this purpose, in alliance with other related enterprises, PetroChina Company Limited established an equipment localization R&D system and a new product testing system, and successfully developed a 20 MW-class motor-driven compressor set, a 30 MW-class gas-turbine-driven compressor unit, and a high-pressure, large-diameter welded ball valve. First, the motor-driven compressor R&D focuses on three main units. The developed frequency-control device has a cascaded multilevel structure with a capacity of 25 MVA. The developed explosion-proof motor, with a speed of 4,800 rpm, can produce a power of 22 MW. The developed compressor is the PCL800, featuring high efficiency and a wide flow operating-point adjustment range. Second, the R&D of the gas-turbine-driven compressor unit proceeds in two steps (product A + product B): auxiliary supporting systems and control systems are developed for the imported GT25000 gas turbine which, together with China-made compressors, constitutes product A; simultaneously, the R&D of product B, a gas turbine intended to replace the imported one, is carried out. Third, aiming to solve the problems of sealing and welding, high-pressure, large-diameter all-welded ball valves were developed to fully replace the equivalent imported products, in three sizes: NPS40 Class 600, NPS48 Class 600, and NPS48 Class 900.

  17. Slurry pipeline technology: an overview

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, Jay P. [Pipeline Systems Incorporated (PSI), Belo Horizonte, MG (Brazil); Lima, Rafael; Pinto, Daniel; Vidal, Alisson [Ausenco do Brasil Engenharia Ltda., Nova Lima, MG (Brazil). PSI Div.

    2009-12-19

    Slurry pipelines represent an economical and environmentally friendly means of transportation for many solid materials. This paper provides an overview of the technology, its evolution and current Brazilian activity. Mineral resources are increasingly located farther away from ports, processing plants and end-use points, and slurry pipelines are an important mode of solids transport. Application guidelines are discussed. State-of-the-art technical solutions such as pipeline system simulation, pipe materials, pumps, valves, automation, telecommunications, and construction techniques that have made the technology successful are presented. A discussion of where long-distance slurry pipelines fit in a picture that also includes thickened and paste material pipelining is included. (author)

  18. Parallel pipeline networking and signal processing with field-programmable gate arrays (FPGAs) and VCSEL-MSM smart pixels

    Science.gov (United States)

    Kuznia, C. B.; Sawchuk, Alexander A.; Zhang, Liping; Hoanca, Bogdan; Hong, Sunkwang; Min, Chris; Pansatiankul, Dhawat E.; Alpaslan, Zahir Y.

    2000-05-01

    We present a networking and signal processing architecture called Transpar-TR (Translucent Smart Pixel Array-Token Ring) that utilizes smart pixel technology to perform 2D parallel optical data transfer between digital processing nodes. Transpar-TR moves data through the network in the form of 3D packets (2D spatial and 1D time). By utilizing many parallel spatial channels, Transpar-TR can achieve high-throughput, low-latency communication between nodes, even with each channel operating at moderate data rates. The 2D array of optical channels is created by an array of smart pixels, each with an optical input and an optical output. Each smart pixel consists of two sections: an optical network interface and an ALU-based processor with local memory. The optical network interface is responsible for transmitting and receiving optical data packets using a slotted token-ring network protocol. The smart pixel array operates as a single-instruction multiple-data processor when processing data. The Transpar-TR network, consisting of networked smart pixel arrays, can perform pipelined parallel processing very efficiently on 2D data structures such as images and video. This paper discusses the Transpar-TR implementation in which each node is the printed circuit board integration of a VCSEL-MSM chip, a transimpedance receiver array chip and an FPGA chip.

  19. Hydroxyl carboxylate based non-phosphorus corrosion inhibition process for reclaimed water pipeline and downstream recirculating cooling water system.

    Science.gov (United States)

    Wang, Jun; Wang, Dong; Hou, Deyin

    2016-01-01

    A combined process was developed to inhibit corrosion both in the pipeline of reclaimed water supplies (PRWS) and in downstream recirculating cooling water systems (RCWS) using the reclaimed water as makeup. Hydroxyl carboxylate-based corrosion inhibitors (e.g., gluconate, citrate, tartrate) and zinc sulfate heptahydrate, which provided Zn(2+) as a synergistic corrosion inhibition additive, were added ahead of the PRWS when the phosphate content in the reclaimed water (phosphate itself can act as a corrosion inhibitor) was below 1.7 mg/L, and no additional corrosion inhibitors were required for the downstream RCWS. Satisfactory corrosion inhibition was achieved even when the RCWS was operated at a high number of cycles of concentration. The corrosion inhibition requirement was also met by an appropriate combination of PO4(3-) and Zn(2+) when the phosphate content in the reclaimed water was more than 1.7 mg/L. The process integrated not only water reclamation and reuse and the operation of a highly concentrated RCWS, but also the comprehensive utilization of phosphate in reclaimed water and the application of non-phosphorus corrosion inhibitors. The proposed process reduced the operating cost of the PRWS and the RCWS, and lowered the environmental hazard caused by the excessive discharge of phosphate. Furthermore, larger amounts of water resources could be conserved as a result.

  20. Leadership Pipeline

    DEFF Research Database (Denmark)

    Elmholdt, Claus Westergård

    2013-01-01

    The article examines the empirical basis of the Leadership Pipeline. First, the Leadership Pipeline model of leadership passages and crossroads in upward transitions between organisational levels of management is described (Freedman, 1998; Charan, Drotter and Noel, 2001). Next, the focus turns to ... the relationship between continuity and discontinuity in leadership competencies across organisational levels is presented and discussed. Finally, the limitations of a competence-based approach to the Leadership Pipeline are discussed, and it is suggested that successful leadership depends just as much on ...

  2. 78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension

    Science.gov (United States)

    2013-09-12

    ... Process'' flowchart. PHMSA is using this notice to announce the revised ``Integrity Verification Process'' flowchart and extend the comment period from September 9, 2013, to October 7, 2013. DATES: The closing...

  3. Business process modeling of industrial maintenance at TRANSPETRO: integrating oil pipeline and marine terminals activities

    Energy Technology Data Exchange (ETDEWEB)

    Arruda, Daniela Mendonca; Oliveira, Italo Luiz [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil). Diretoria de Terminais e Oleodutos; Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), Rio de Janeiro, RJ (Brazil). Programa de Pos-Graduacao em Metrologia para Qualidade e Inovacao

    2009-07-01

    This paper describes the experience of TRANSPETRO in remodeling industrial maintenance activities, focusing on: preparing for business process modeling (BPM); mapping and analyzing the 'As-Is' process; designing the 'To-Be' process; implementing the remodeled process; and improving the process continuously. The conceptual model and the results achieved will contribute to several areas within the company, such as: reliability engineering; human resources, including employee selection processes, training and development, and certification; standardization, encompassing the adoption of standard and operational procedures in line with updated external normative references and legal requirements; and health, safety and environment (HSE) performance improvement. These are some of the potential benefits of BPM applied to TRANSPETRO's industrial maintenance area in the search for operational excellence. (author)

  4. Research on Pipeline Holdup Measurement Technology

    Institute of Scientific and Technical Information of China (English)

    LU; Wen-guang; XU; Zheng; CHENG; Yi-mei; SUI; Hong-zhi; YIN; Hong-he

    2012-01-01

    Some nuclear material can be deposited in the pipeline systems of nuclear facilities during operation. Such nuclear material retained in the pipelines is called holdup. The measurement of pipeline holdup is important not only for the nuclear material accounting and control of the facilities, but also for their safe operation.

  5. CPL: Common Pipeline Library

    Science.gov (United States)

    ESO CPL Development Team

    2014-02-01

    The Common Pipeline Library (CPL) is a set of ISO-C libraries that provide a comprehensive, efficient and robust software toolkit to create automated astronomical data reduction pipelines. Though initially developed as a standardized way to build VLT instrument pipelines, the CPL may be more generally applied to any similar application. The code also provides a variety of general purpose image- and signal-processing functions, making it an excellent framework for the creation of more generic data handling packages. The CPL handles low-level data types (images, tables, matrices, strings, property lists, etc.) and medium-level data access methods (a simple data abstraction layer for FITS files). It also provides table organization and manipulation, keyword/value handling and management, and support for dynamic loading of recipe modules using programs such as EsoRex (ascl:1504.003).

  6. Thermal Behavior of an HSLA Steel and the Impact in Phase Transformation: Submerged Arc Welding (SAW) Process Approach to Pipelines

    Science.gov (United States)

    Costa, P. S.; Reyes-Valdés, F. A.; Saldaña-Garcés, R.; Delgado, E. R.; Salinas-Rodríguez, A.

    The heat input during fusion welding generates different transformations, such as grain growth, hydrogen cracking, and the formation of brittle structures, generally associated with the heat-affected zone (HAZ). For this reason, it is very important to know the behavior of this area before welding. This paper presents a study of the thermal behavior and its effect on phase transformations in the HAZ as a function of cooling rate (0.1-200 °C/s), in order to obtain continuous cooling transformation (CCT) curves for a high-strength low-alloy (HSLA) steel. In order to determine the phases formed, optical microscopy and Vickers microhardness measurements were used. The experimental CCT curve was obtained for an HSLA steel, and the results showed that, under the cooling conditions used, the steel did not form brittle structures. Therefore, it is unlikely that welds made by submerged arc welding (SAW) will lead to hydrogen embrittlement in the HAZ, which is one of the main causes of cracking in gas transmission pipelines. In addition, with these results, it will be possible to control the microstructure to optimize pipe fabrication with the SAW process in industrial plants.

  7. An Ontology-Enabled Natural Language Processing Pipeline for Provenance Metadata Extraction from Biomedical Text (Short Paper).

    Science.gov (United States)

    Valdez, Joshua; Rueschman, Michael; Kim, Matthew; Redline, Susan; Sahoo, Satya S

    2016-10-01

    Extraction of structured information from biomedical literature is a complex and challenging problem due to the complexity of the biomedical domain and the lack of appropriate natural language processing (NLP) techniques. High-quality domain ontologies model both data and metadata information at a fine level of granularity, which can be effectively used to accurately extract structured information from biomedical text. Extraction of provenance metadata, which describes the history or source of information, from published articles is an important task to support scientific reproducibility. Reproducibility of results reported by previous research studies is a foundational component of scientific advancement. This is highlighted by the recent initiative by the US National Institutes of Health called "Principles of Rigor and Reproducibility". In this paper, we describe an effective approach to extract provenance metadata from published biomedical research literature using an ontology-enabled NLP platform developed as part of the Provenance for Clinical and Healthcare Research (ProvCaRe) project. The ProvCaRe-NLP tool extends the clinical Text Analysis and Knowledge Extraction System (cTAKES) platform using both provenance and biomedical domain ontologies. We demonstrate the effectiveness of the ProvCaRe-NLP tool using a corpus of 20 peer-reviewed publications. The results of our evaluation demonstrate that the ProvCaRe-NLP tool has significantly higher recall in extracting provenance metadata than existing NLP pipelines such as MetaMap.
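    The ProvCaRe-NLP tool itself is ontology-driven and built on cTAKES; purely as an illustration of what extracting provenance-style metadata from text involves, a much simpler pattern-based sketch is given below. The regular expressions, categories and the example sentence are illustrative assumptions only.

```python
# A deliberately simple, pattern-based illustration of pulling
# provenance-style metadata (study size, recording instrument) out of
# free text. The real ProvCaRe-NLP tool is ontology-driven and built on
# cTAKES; the regexes, categories and example below are assumptions.
import re

PATTERNS = {
    "study_size": re.compile(r"\b(\d[\d,]*)\s+(?:participants|subjects|patients)\b", re.I),
    "instrument": re.compile(r"\brecorded (?:with|using) ([\w\- ]+)", re.I),
}

def extract_provenance(text):
    findings = {}
    for label, pattern in PATTERNS.items():
        match = pattern.search(text)
        if match:
            findings[label] = match.group(1).strip()
    return findings

sentence = ("Overnight sleep was recorded using SomnoTrace polysomnography. "
            "The cohort included 2,056 participants.")
print(extract_provenance(sentence))
# {'study_size': '2,056', 'instrument': 'SomnoTrace polysomnography'}
```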

  8. Proposal and design of a natural gas liquefaction process recovering the energy obtained from the pressure reducing stations of high-pressure pipelines

    Science.gov (United States)

    Tan, Hongbo; Zhao, Qingxuan; Sun, Nannan; Li, Yanzhong

    2016-12-01

    Taking advantage of the refrigerating effect of expansion at an appropriate temperature, a fraction of the high-pressure natural gas transported by pipelines could be liquefied at a city gate station through a well-organized pressure reducing process, without consuming any extra energy. The authors proposed such a new process, which mainly consists of a turbo-expander-driven booster, throttle valves, multi-stream heat exchangers and separators, to yield liquefied natural gas (LNG) and liquid light hydrocarbons (LLHs) by utilizing the high pressure of the pipelines. Based on an assessment of the effects of several key parameters on system performance through a steady-state simulation in Aspen HYSYS, an optimal design condition of the proposed process was determined. The results showed that the new process is better suited to pressure reducing stations (PRS) on higher-pressure pipelines. For feed gas at a pressure of 10 MPa, a maximum total liquefaction rate (ytot) of 15.4% and a maximum exergy utilization rate (EUR) of 21.7% could be reached at the optimal condition. The present process could be used as a small-scale natural gas liquefaction and peak-shaving plant at a city gate station.

  9. Pipelined data processing system utilizing ideal floating point execution condition detection

    Energy Technology Data Exchange (ETDEWEB)

    Lee, H.P.S.; Rawlinson, S.J.; Si, S.S.C.

    1988-09-20

    This patent describes an instruction execution unit responsive to an instruction for providing a sequence of microcode control words to direct the processing of operand data associated with the instruction. The instruction execution unit consists of: (a) sequencing means, responsive to the instruction, for issuing first and second sequences of microcode control words corresponding to the instruction, the sequencing means including selector means for selecting the first or the second sequence of microcode control words for issuance by the sequencing means; and (b) determining means for determining from the operand data, concurrent with the issuance of a microcode control word by the sequencing means, whether the operand data is ideal with respect to the instruction, the determining means causing the selector means to select the second sequence of microcode control words for issuance to complete the processing of the operand data in response to the instruction when the operand data is ideal with respect to the instruction.

  10. Modeling Parameters of Reliability of Technological Processes of Hydrocarbon Pipeline Transportation

    Directory of Open Access Journals (Sweden)

    Shalay Viktor

    2016-01-01

    Full Text Available On the basis of methods of system analysis and parametric reliability theory, mathematical modelling of the operation of oil and gas equipment was carried out for reliability monitoring using dispatching data. To check the goodness of fit of empirical distributions, an algorithm and mathematical methods of analysis were developed for on-line use under changing operating conditions. The physical cause-and-effect mechanisms linking the key factors and the changing parameters of the technical systems of oil and gas facilities are analysed, and the basic types of parameter distributions are defined. The adequacy of the assumed distribution type for the analysed parameters is assessed using the Kolmogorov criterion, as the most universal, accurate and adequate test for the distributions of continuous processes in complex multiple-element technical systems. Calculation methods are provided for supervision by independent bodies for risk assessment and facility safety.
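    As a concrete illustration of the Kolmogorov goodness-of-fit check mentioned above, the following sketch tests whether a monitored parameter is consistent with an assumed distribution; the normal model and the synthetic sample are assumptions for illustration, not data from the paper.

```python
# Minimal sketch of a Kolmogorov(-Smirnov) goodness-of-fit check of the
# kind the abstract refers to: does a monitored parameter follow an
# assumed distribution? The normal model and the synthetic sample are
# illustrative assumptions, not data from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pressure = rng.normal(loc=5.2, scale=0.15, size=200)  # synthetic dispatch readings, MPa

# Note: estimating mu and sigma from the same sample makes the classical
# KS p-value only approximate, which is acceptable for a monitoring sketch.
mu, sigma = pressure.mean(), pressure.std(ddof=1)
statistic, p_value = stats.kstest(pressure, "norm", args=(mu, sigma))

print(f"KS statistic = {statistic:.3f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Reject the assumed normal model for this parameter.")
else:
    print("No evidence against the assumed normal model.")
```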

  11. Anchor Loads on Pipelines

    OpenAIRE

    Wei, Ying

    2015-01-01

    Anchor hooking on a subsea pipeline has been investigated in this thesis. Anchor loads on pipelines are in general rare events; however, the severity when they occur could easily jeopardize the integrity of any pipeline. Anchor loading is considered an accidental load in the design of pipelines. Pipeline loads, limit state criteria and anchor categories are defined by the DNV standards. For the pipeline, DNV-OS-F101 (08.2012), Submarine Pipeline Systems, is adopted. Offshore standard DNV-RP...

  12. Nonlinear contact between pipeline's outer wall and slip-on buckle arrestor's inner wall during buckling process

    Science.gov (United States)

    Ma, Weilin; Liu, Jiande; Dong, Sheng; Zhang, Xin; Ma, Xiaozhou

    2017-02-01

    In order to theoretically study the buckle propagation of subsea pipelines with slip-on buckle arrestors, a two-dimensional ring model was set up to represent the pipeline, and a nonlinear spring model was adopted to simulate the contact between the pipeline's inner walls, and between the pipeline's outer wall and the slip-on buckle arrestor's inner wall, during buckle propagation. In addition, reverse springs are added to prevent the left and right side walls from separating from the inner wall of the slip-on buckle arrestor. Considering large-deformation kinematic relations and the elastic-plastic constitutive relation of the material, equilibrium equations were established using the principle of virtual work. The variation of the external pressure with respect to the cross-sectional area of the pipeline was analyzed, and the lower bound of the crossover pressure of slip-on buckle arrestors was calculated based on Maxwell's energy balance method. By comparing the theoretical results with experiments and finite element numerical simulations, the theoretical method is shown to be correct and reliable.

  13. Processing Pipeline of Sugarcane Spectral Response to Characterize the Fallen Plants Phenomenon

    Science.gov (United States)

    Solano, Agustín; Kemerer, Alejandra; Hadad, Alejandro

    2016-04-01

    Nowadays, in agronomic systems it is possible to apply variable management of inputs to improve the efficiency of the agronomic industry and optimize the logistics of the harvesting process. In this context, the use of remote sensing tools and computational methods was proposed for sugarcane cultivation to identify useful areas within the cultivated fields, with the objective of applying variable management of the crop. When fallen stalks are present at the moment of harvesting, extraneous material (vegetal or mineral) is collected along with them. This extraneous material is not millable, and when it enters the sugar mill it causes significant losses of efficiency in the sugar extraction process and affects product quality. Considering this issue, the spectral response of sugarcane plants in aerial multispectral images was studied. The spectral response was analyzed in different bands of the electromagnetic spectrum. The aerial images were then segmented to obtain homogeneous regions, useful for producers to make decisions related to the use of inputs and resources according to the variability of the system (presence of fallen cane and standing cane). The obtained segmentation results were satisfactory: it was possible to identify regions of fallen cane and regions of standing cane with high precision.
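    A minimal sketch of segmenting a multispectral image into two classes by thresholding a vegetation index is shown below; the band layout, the NDVI threshold and the mapping of classes to fallen versus standing cane are assumptions for illustration, not the method used in the paper.

```python
# Illustrative sketch of segmenting a multispectral image into two
# classes by thresholding a vegetation index (NDVI); the band layout,
# the threshold value and the link between NDVI and fallen vs. standing
# cane are assumptions for illustration, not the paper's method.
import numpy as np

def segment_by_ndvi(red, nir, threshold=0.6):
    """red, nir: 2-D reflectance arrays; returns a boolean class map."""
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    return ndvi >= threshold   # True = one class, False = the other

# toy 3x3 example
red = np.array([[0.10, 0.12, 0.30],
                [0.11, 0.28, 0.31],
                [0.09, 0.10, 0.29]])
nir = np.array([[0.60, 0.58, 0.40],
                [0.62, 0.42, 0.39],
                [0.65, 0.61, 0.41]])
print(segment_by_ndvi(red, nir))
```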

  14. Effort problem of chemical pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Okrajni, J.; Ciesla, M.; Mutwil, K. [Silesian Technical University, Katowice (Poland)

    1998-12-31

    The paper addresses the assessment of the technical state of chemical pipelines working under mechanical and thermal loading. The pipelines' effort (stress state) after a long operating period has been analysed. The material, geometrical and loading conditions of crack initiation and crack growth in the chosen object are discussed, and the areas of maximum effort are determined. Changes in the material structure after the long operating period are described. The mechanisms of crack initiation and crack growth in the pipeline elements are analysed, and the mutual relations between chemical and mechanical influences are shown. (orig.) 16 refs.

  15. Development and optimization of an advanced process for non-dig installation of pipelines transporting energy and raw materials; Entwicklung und Optimierung eines neuen Verfahrens zur grabenlosen Verlegung von Rohrleitungen fuer den Energie- und Rohstofftransport

    Energy Technology Data Exchange (ETDEWEB)

    Koegler, Ruediger

    2008-04-07

    Controllable horizontal drilling is an established worldwide method for laying pipelines under natural or artificial obstacles without trenches. In 2002, an 18'' gas pipeline was laid under the river Rhone, under the most difficult topographical and geological conditions, for the French energy supplier Gaz de France by means of horizontal drilling technology. In this thesis, the Easy Pipe procedure was developed, derived from microtunnelling (MT) engineering. The procedure is introduced stepwise as a pilot process and is further developed and applied for pipeline laying. (orig./GL)

  16. ALMA Pipeline Heuristics

    Science.gov (United States)

    Muders, D.; Boone, F.; Wyrowski, F.; Lightfoot, J.; Kosugi, G.; Wilson, C.; Davis, L.; Shepherd, D.

    2007-10-01

    The Atacama Large Millimeter Array / Atacama Compact Array (ALMA / ACA) Pipeline Heuristics system is being developed to automatically reduce data taken with the standard observing modes, such as single fields, mosaics or on-the-fly maps. The goal is to make ALMA user-friendly to astronomers who are not experts in radio interferometry. The Pipeline Heuristics must capture the expert knowledge required to provide data products that can be used without further processing. The Pipeline Heuristics system is being developed as a set of Python scripts, using the Common Astronomy Software Applications (CASA[PY]) libraries and the ATNF Spectral Analysis Package (ASAP) as the data processing engines. The interferometry heuristics scripts currently provide an end-to-end process for the single-field mode, comprising flagging, initial calibration, re-flagging, re-calibration, and imaging of the target data. A Java browser provides user-friendly access to the heuristics results. The initial single-dish heuristics scripts implement automatic spectral line detection, baseline fitting and image gridding. The resulting data cubes are analyzed to detect source emission spectrally and spatially in order to calculate signal-to-noise ratios for comparison against the science goals specified by the observer.
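    Schematically, an end-to-end heuristics run of the kind described (flag, calibrate, re-flag, re-calibrate, image) can be thought of as a chain of stages operating on a shared context; the sketch below uses hypothetical placeholder stage functions, not the actual ALMA scripts or CASA task calls.

```python
# Schematic of an end-to-end single-field reduction driver of the kind
# described (flag -> calibrate -> re-flag -> re-calibrate -> image).
# The stage functions are hypothetical placeholders, not the actual
# ALMA heuristics scripts or CASA task calls.
from typing import Callable, List

Stage = Callable[[dict], dict]

def run_pipeline(context: dict, stages: List[Stage]) -> dict:
    for stage in stages:
        context = stage(context)                    # each stage updates the shared context
        context.setdefault("log", []).append(stage.__name__)
    return context

def flag(ctx):        ctx["flagged"] = True;        return ctx
def calibrate(ctx):   ctx["caltables"] = ["G0"];    return ctx
def reflag(ctx):      ctx["reflagged"] = True;      return ctx
def recalibrate(ctx): ctx["caltables"].append("G1"); return ctx
def image(ctx):       ctx["image"] = "target.img";  return ctx

result = run_pipeline({"ms": "target.ms"}, [flag, calibrate, reflag, recalibrate, image])
print(result["log"])   # ['flag', 'calibrate', 'reflag', 'recalibrate', 'image']
```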

  17. ALMA Pipeline Heuristics

    Science.gov (United States)

    Lightfoot, J.; Wyrowski, F.; Muders, D.; Boone, F.; Davis, L.; Shepherd, D.; Wilson, C.

    2006-07-01

    The ALMA (Atacama Large Millimeter Array) Pipeline Heuristics system is being developed to automatically reduce data taken with the standard observing modes. The goal is to make ALMA user-friendly to astronomers who are not experts in radio interferometry. The Pipeline Heuristics system must capture the expert knowledge required to provide data products that can be used without further processing. Observing modes to be processed by the system include single-field interferometry, mosaics and single-dish 'on-the-fly' maps, and combinations of these modes. The data will be produced by the main ALMA array, the ALMA Compact Array (ACA) and single-dish antennas. The Pipeline Heuristics system is being developed as a set of Python scripts. For interferometry, these use the CASA/AIPS++ libraries, and their bindings as CORBA objects within the ALMA Common Software (ACS), as data processing engines. Initial development has used VLA and Plateau de Bure data sets to build and test a heuristic script capable of reducing single-field data. In this paper we describe the reduction datapath and the algorithms used at each stage. Test results are presented. The path for future development is outlined.

  18. Correlation between designed wall thickness of gas pipelines and external and internal corrosion processes; Adequacao de espessura de parede projetada em funcao de processos de corrosao externa e interna em gasodutos

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Jose Antonio da Cunha Ponciano [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE). Programa de Engenharia Metalurgica

    2004-07-01

    Corrosion control on gas pipelines plays an important role in the assessment of pipeline integrity and reliability. In many countries a great extension of buried pipelines is used in transport and distribution systems. This extension will certainly increase in the near future due to the increasing consumption of natural gas. Inadequate corrosion control can lead to pipeline failures, raising the possibility of accidents in populated or environmentally protected areas, with severe economic, legal and environmental consequences. Corrosion is frequently considered a natural and inevitable phenomenon. Based upon this assumption, some recommendations are included in design standards for gas pipelines in order to compensate for its detrimental effects. The aim of this work is to present a review of the correlation between the external corrosion process and the guidelines established during the design phase of gas pipelines. It is intended to contribute to a better understanding of the impacts of corrosion on the integrity, reliability and availability of gas transport and distribution systems. Some aspects regarding external corrosion of pipelines, extracted from technical papers, are summarised. The information provided is compared to the design criteria prescribed by the NBR 12712 standard. (author)

  19. Developing an ESIP-wide Process "Pipeline" to Extract Data-driven Stories from Compelling Agriculture and Energy Research on Climate Resilience

    Science.gov (United States)

    Hoebelheinrich, N. J.; Eckman, R.; Teng, W. L.; Beltz, C.

    2016-12-01

    The classic approach to scientific storytelling, especially for publication, is to establish the research problem, describe the potential solution and the efforts to solve the problem, and end with the results - whether "successful" or not - as the "Ta Da!" of the story. This classic approach, however, does not necessarily adapt well to the kind of storytelling that policy-making and general public end-users find more compelling, i.e., with the "Ta Da!" element of the story immediately evident. Working with the U.S. Climate Resilience Toolkit (CRT) staff, two collaborative groups of the Earth Science Information Partners (ESIP), Agriculture and Climate and Energy and Climate, have begun to assist agriculture and energy researchers in making the switch in story telling approach and, thus, get more easily understood and actionable information out to potential end-users about how the research data produced can help them. The CRT is a platform for telling stories based on both end-user needs and the data that are used to meet those needs. The ESIP groups are establishing an ESIP-wide process "pipeline," through which research results and data, with the help of group discussions and the use of CRT templates, are transformed into potential stories. When appropriate, the stories are handed off to the CRT staff to be fully developed. Two case studies that are in the process of being added to the CRT involve (1) the use of the RETScreen tool by Natural Resources Canada and (2) a fallow lands mapping project with the California Department of Water Resources to monitor ongoing drought conditions in California. These two case studies will be used to illustrate the process pipeline being developed, discuss lessons learned to date, and suggest future plans for further refining and expanding the process "pipeline."

  20. Natural gas pipeline technology overview.

    Energy Technology Data Exchange (ETDEWEB)

    Folga, S. M.; Decision and Information Sciences

    2007-11-01

    transmission companies. Compressor stations at required distances boost the pressure that is lost through friction as the gas moves through the steel pipes (EPA 2000). The natural gas system is generally described in terms of production, processing and purification, transmission and storage, and distribution (NaturalGas.org 2004b). Figure 1.1-2 shows a schematic of the system through transmission. This report focuses on the transmission pipeline, compressor stations, and city gates.

  1. VLT Instruments Pipeline System Overview

    Science.gov (United States)

    Jung, Y.; Ballester, P.; Banse, K.; Hummel, W.; Izzo, C.; McKay, D. J.; Kiesgen, M.; Lundin, L. K.; Modigliani, A.; Palsa, R. M.; Sabet, C.

    2004-07-01

    Since the beginning of VLT operations in 1998, substantial effort has been put into the development of automatic data reduction tools for the VLT instruments. A VLT instrument pipeline is a complex system that has to be able to identify and classify each produced FITS file, optionally retrieve calibration files from a database, use image processing software to reduce the data, compute and log quality control parameters, produce FITS images or tables with the correct headers, optionally display them in the control room, and send them to the archive. Each instrument has its own dedicated pipeline, based on a common infrastructure and installed with the VLT Data Flow System (DFS). With the increase in the number and complexity of supported instruments and in the rate of produced data, these pipelines are becoming vital for both VLT operations and the users, and require more and more resources for development and maintenance. This paper describes the different pipeline tasks with some real examples. It also explains how the development process has been improved, to both decrease its cost and increase the pipelines' quality, using the lessons learned from the development of the first instrument pipelines.
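    As an illustration of the first pipeline task listed above, identifying and classifying each produced FITS file, a sketch based on header keywords might look like the following; the keyword names and categories are generic assumptions, not ESO's actual DFS classification rules.

```python
# Illustrative sketch of the "identify and classify each produced FITS
# file" step using header keywords; the keyword names and categories are
# generic assumptions, not ESO's actual DFS classification rules.
from astropy.io import fits

def classify_frame(path):
    header = fits.getheader(path)
    obj = str(header.get("OBJECT", "")).upper()
    if obj in ("BIAS", "ZERO"):
        return "bias"
    if obj in ("FLAT", "SKYFLAT", "DOMEFLAT"):
        return "flat"
    if header.get("EXPTIME", 0) == 0:
        return "bias"
    return "science"

# frames = ["raw/vlt_0001.fits", "raw/vlt_0002.fits"]   # placeholder paths
# for f in frames:
#     print(f, "->", classify_frame(f))
```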

  2. Underground pipeline corrosion

    CERN Document Server

    Orazem, Mark

    2014-01-01

    Underground pipelines transporting liquid petroleum products and natural gas are critical components of civil infrastructure, making corrosion prevention an essential part of asset-protection strategy. Underground Pipeline Corrosion provides a basic understanding of the problems associated with corrosion detection and mitigation, and of the state of the art in corrosion prevention. The topics covered in part one include: basic principles for corrosion in underground pipelines, AC-induced corrosion of underground pipelines, significance of corrosion in onshore oil and gas pipelines, n

  3. 77 FR 70543 - Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory...

    Science.gov (United States)

    2012-11-26

    ... for natural gas pipelines and for hazardous liquid pipelines. Both committees were established under... TRANSPORTATION Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory Committee AGENCY: Pipeline and...

  4. RiboProfiling: a Bioconductor package for standard Ribo-seq pipeline processing [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Alexandra Popa

    2016-06-01

    Full Text Available The ribosome profiling technique (Ribo-seq) allows the selective sequencing of translated RNA regions. Recently, the analysis of genomic sequences associated with Ribo-seq reads has been widely employed to assess their coding potential. These analyses led to the identification of differentially translated transcripts under different experimental conditions, and/or of ribosome pausing on codon motifs. In the context of the ever-growing need for tools to analyze Ribo-seq reads, we have developed 'RiboProfiling', a new Bioconductor open-source package. 'RiboProfiling' provides a full pipeline covering all key steps in the analysis of ribosome footprints. This pipeline has been implemented in a single R workflow. The package takes an alignment (BAM) file as input and performs ribosome footprint quantification at the transcript level. It also identifies footprint accumulation on particular amino acids or multi-amino-acid motifs. Report summary graphs and data quantification are generated automatically. The package facilitates quality assessment and quantification of Ribo-seq experiments. Its implementation in Bioconductor enables the modeling and statistical analysis of its output through the vast choice of packages available in R. This article illustrates how to identify codon motifs that accumulate ribosome footprints, based on data from Escherichia coli.

  5. Analysis of the Nitrogen Control of Pipeline Steel X70 in the LF Process

    Institute of Scientific and Technical Information of China (English)

    胡国旭

    2012-01-01

    The production process for X70 pipeline steel and the nitrogen control factors in the LF process were comprehensively analyzed. The results show that nitrogen pick-up in the liquid steel is mainly attributed to exposure of the liquid steel to the atmosphere during the desulfurization stage under white-slag conditions in the LF process. Accordingly, six key nitrogen-control technologies for pipeline steel X70, from BOF tapping to the LF end-point, were proposed. According to LF production practice, constantly perfecting and optimizing the technical parameters of each step can lay a theoretical and technological foundation for producing higher-grade steels with lower nitrogen contents.

  6. Simulation analysis of the commissioning process of a buried hot oil pipeline

    Institute of Scientific and Technical Information of China (English)

    李海涛

    2016-01-01

    In order to obtain an economical scheme for preheating the pipeline before commissioning, the pipeline simulation software SPS is used to analyse the preheating process. According to the SPS results, as preheating proceeds the fluid temperature at the end of the pipeline first rises and then falls to a stable value, while the fluid density first remains steady and then drops rapidly to a new stable value. Taking as a boundary condition that the oil/water mixture arrives at the end of the pipeline at a temperature 3 °C above the crude oil condensation point, and considering the preheating time and the amount of hot water required, the optimal and most economical commissioning scheme is selected, providing technical support for field operations.

  7. Influence of Partitioning Process on the Microstructure and Mechanical Properties of High Deformability Oil-Gas Pipeline

    Directory of Open Access Journals (Sweden)

    Jing Ma

    2014-11-01

    Full Text Available A multiphase structure of bainite and M/A constituent can be obtained in X80 oil-gas pipeline steel through a novel heat online partitioning (HOP) technology. The effects of the partitioning temperature on the microstructure and mechanical properties of the experimental steels were investigated by means of mechanical property tests, microscopic analysis, and X-ray diffraction. The results show that, with increasing partitioning temperature, the strength of the experimental steel decreases and the ductility increases because of the increase in bainite lath width, the decrease in dislocation density, the increase in retained austenite content, and carbide coarsening. The decrease in the volume fraction and stability of retained austenite is the key factor leading to the increase in strength and the decrease in plasticity in the high range of partitioning temperatures.

  8. Numerical study of gas-liquid two-phase flow in a pipeline

    Institute of Scientific and Technical Information of China (English)

    谢黎明; 朱绪胜; 王岩

    2011-01-01

    A flow-field computation model was built in Fluent to simulate the transport process of an oil-gas lubrication system in a pipe, in order to study the influence of the Reynolds number on the oil droplet size in the two-phase flow and on whether the oil droplets remain continuous. The simulations show that, when the Reynolds number takes a reasonable value, a continuous oil-gas mixture forms in the pipe and the lubricating oil is dispersed into droplets, which conforms to the technical requirements of oil-gas lubrication.

  9. Shipbuilding pipeline production quality improvement

    Directory of Open Access Journals (Sweden)

    T. Buksa

    2010-06-01

    Full Text Available Purpose: Pipeline production is one of the major processes in the shipbuilding industry. Quality improvement and risk assessment in this process can yield significant savings, both in terms of internal quality costs and in terms of customer satisfaction. Design/methodology/approach: Shipbuilding pipeline production quality improvement has been carried out by application of the FMEA (Failure Mode and Effect Analysis) method. For the successful implementation of the FMEA method, it is necessary to identify process failure modes, or possible occurrences of non-conformities, as well as their possible causes. For qualitative analysis of the key input variables of the process, an Ishikawa diagram and a p-chart are used in the paper. Findings: It is shown that the proposed approach to risk assessment in shipbuilding pipeline production is applicable to a real case scenario. The analysis identified the points in the process with the highest probability of occurrence of non-conformities, or the highest risk of error. Research limitations/implications: As the experiments were conducted in a shipyard, within the production process, the research schedule had to be set in accordance with the production pace. Also, due to the character of the production process, data collection was adapted to the production plan at that particular moment. Practical implications: Dealing with the causes of potential non-conformities in the process can significantly contribute to the reliability and robustness of the process. Corrective actions taken based on the results of the analysis significantly contributed to the level of quality in the pipeline production process. Originality/value: The paper deals with a well-known method applied in a production environment that is mostly conservative in its production approach. It was shown that successful application of the proposed approach can yield benefits, especially improved quality of the pipelines produced within the shipbuilding industry.
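    For readers unfamiliar with FMEA, the standard way failure modes are ranked is by the risk priority number, RPN = severity x occurrence x detection; a minimal sketch is given below, where the failure modes and the 1-10 ratings are illustrative assumptions, not data from the paper.

```python
# Minimal sketch of the standard FMEA risk priority number
# (RPN = severity x occurrence x detection) used to rank failure modes;
# the failure modes and 1-10 ratings below are illustrative assumptions,
# not data from the paper.
failure_modes = [
    # (failure mode,               severity, occurrence, detection)
    ("weld porosity",                     7,          4,         5),
    ("wrong pipe spool dimensions",       6,          3,         3),
    ("flange misalignment",               5,          5,         4),
]

ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"{name:32s} RPN = {rpn}")
# Highest-RPN items are the first candidates for corrective action.
```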

  10. The correlation between materials, processes and final properties in the pipeline coating system with polyethylene in triple layer; A correlacao entre materiais, processos e propriedades finais no sistema de revestimento de tubos com polietileno em tripla camada

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Luiz C.; Campos, Paulo H. [Confab Industrial S.A., Pindamonhangaba, SP (Brazil); Silva, Christian E.; Santos, Paulo T. [Soco-Ril do Brasil S.A., Pindamonhangaba, SP (Brazil)

    2003-07-01

    The use of anticorrosion coating is a common practice in industrial pipeline applications. Among the several coating types for buried and submerged pipelines, the Fusion Bonded Epoxy and Three-Layer Polyethylene coating systems above all have been widely employed. They have shown excellent performance in protecting the pipe metal from the external corrosive environment, considerably decreasing the designed cathodic protection requirements, particularly in the first years of pipeline operation. The success of a coating system depends not only on a suitable design and on materials technology, but also on the process parameters and the raw material characteristics exhibited during application. This paper presents, in a theoretical approach, how the process parameters and raw material characteristics may affect the final properties of the three-layer polyethylene anticorrosion coating. (author)

  11. Trace Software Pipelining

    Institute of Scientific and Technical Information of China (English)

    王剑; AndreasKrall; 等

    1995-01-01

    Global software pipelining is a complex but efficient compilation technique to exploit instruction-level parallelism for loops with branches. This paper presents a novel global software pipelining technique, called Trace Software Pipelining, targeted at instruction-level parallel processors such as Very Long Instruction Word (VLIW) and superscalar machines. Trace software pipelining applies a global code scheduling technique to compact the original loop body. The resulting loop is called trace software pipelined (TSP) code. The trace software pipelined code can be directly executed with special architectural support, or can be transformed into a globally software pipelined loop for current VLIW and superscalar processors. Thus, exploiting parallelism across all iterations of a loop can be accomplished by compacting the original loop body with any global code scheduling technique. This makes our new technique very promising for practical compilers. Finally, we also present preliminary experimental results to support our new approach.

  12. Pipeline rules of thumb handbook a manual of quick, accurate solutions to everyday pipeline engineering problems

    CERN Document Server

    McAllister, EW

    2014-01-01

    Presented in easy-to-use, step-by-step order, the Pipeline Rules of Thumb Handbook is a quick reference for day-to-day pipeline operations. For more than 35 years, the Pipeline Rules of Thumb Handbook has served as the "go-to" reference for solving even the most vexing day-to-day pipeline workflow problems. Now in its 8th edition, this handbook continues to set the standard by which all other piping books are judged. Along with over 30% new or updated material regarding codes, construction processes, and equipment, this book continues to offer hundreds of "how-to" methods and ha

  13. Pipeline modeling and assessment in unstable slopes

    Energy Technology Data Exchange (ETDEWEB)

    Caceres, Carlos Nieves [Oleoducto Central S.A., Bogota, Cundinamarca (Colombia); Ordonez, Mauricio Pereira [SOLSIN S.A.S, Bogota, Cundinamarca (Colombia)

    2010-07-01

    The OCENSA pipeline system is vulnerable to geotechnical problems such as faults, landslides or creeping slopes, which are well known in the Andes Mountains and in tropical countries like Colombia. This paper proposes a methodology to evaluate the pipe behaviour during the soil displacements of slow landslides. Three different cases of analysis are examined, according to site characteristics. The process starts with a simplified analytical model and develops into 3D finite element numerical simulations applied to the on-site geometry of soil and pipe. Case 1 should be used when the unstable site is subject to landslides impacting significant lengths of pipeline, the pipeline is straight, and the landslide is simple from the geotechnical perspective. Case 2 should be used when the pipeline is straight and the landslide is complex (creeping slopes and non-conventional stabilization solutions). Case 3 should be used if the pipeline presents vertical or horizontal bends.

  14. Green pipeline dreams; Gruene Pipeline-Traeume

    Energy Technology Data Exchange (ETDEWEB)

    Wiedemann, Karsten

    2010-11-15

    In theory, Germany and the other EU states would be able to cover their natural gas demand completely with pipeline-supplied biomethane. But will this be really possible in practice? The contribution takes a closer look. (orig.)

  15. Ultrasound monitoring of pipelines; Ultraschallueberwachung an Pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Kircher, W.; Skerra, B.; Kobitsch-Meyer, S. [SONOTEC Ultraschallsensorik, Halle GmbH (Germany)

    2007-01-15

    Pipelines are among the most modern, effective and safest transport systems; they are spread worldwide in a network millions of kilometres long that is enlarged by thousands of kilometres every year. If these systems are to remain safe and effective, they must be maintained adequately. A technique that provides accurate and reliable measurement data "through the wall", without interrupting pipeline operation, is ultrasonic technology. This non-intrusive technology provides data for pig detection and is also used for recognising products in pipelines, detecting levels or full/empty states, and for sediment measurement, distance measurement, position detection and leak search. The article gives a review and describes some applications of ultrasonic technology in pipeline engineering. (orig.)

  16. Pipeline risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Kariyawasam, S. [TransCanada PipeLines Ltd., Calgary, AB (Canada); Weir, D. [Enbridge Pipelines Inc., Calgary, AB (Canada)] (comps.)

    2009-07-01

    Risk assessments and risk analysis are system-wide activities that include site-specific risk and reliability-based decision-making, implementation, and monitoring. This working group discussed the risk management process in the pipeline industry, including reliability-based integrity management and risk control processes. Attendees discussed reliability-based decision support and performance measurements designed to support corporate risk management policies. New developments and technologies designed to optimize risk management procedures were also presented. The group was divided into 3 sessions: (1) current practice, strengths and limitations of system-wide risk assessments for facility assets; (2) accounting for uncertainties to assure safety; and (3) reliability based excavation repair criteria and removing potentially unsafe corrosion defects. Presentations of risk assessment procedures used at various companies were given. The role of regulators, best practices, and effective networking environments in ensuring the success of risk assessment policies was discussed. Risk assessment models were also reviewed.

  17. Oil and gas pipelines in nontechnical language

    National Research Council Canada - National Science Library

    Miesner, Thomas O; Leffler, William L

    2006-01-01

    Oil & Gas Pipelines in Nontechnical Language examines the processes, techniques, equipment, and facilities used to transport fluids such as refined products, crude oil, natural gas, and natural gas liquids...

  18. The LOFAR Known Pulsar Data Pipeline

    CERN Document Server

    Alexov, A; Mol, J D; Stappers, B; van Leeuwen, J

    2010-01-01

    Transient radio phenomena and pulsars are one of six LOFAR Key Science Projects (KSPs). As part of the Transients KSP, the Pulsar Working Group (PWG) has been developing the LOFAR Pulsar Data Pipelines both to study known pulsars and to search for new ones. The pipelines are being developed for the Blue Gene/P (BG/P) supercomputer and a large Linux cluster in order to harness enormous computational capability (50 Tflops) and process data streams of up to 23 TB/hour. The LOFAR pipeline output will use the Hierarchical Data Format 5 (HDF5) to efficiently store large amounts of numerical data, and to manage complex data encompassing a variety of data types, across distributed storage and processing architectures. We present the LOFAR Known Pulsar Data Pipeline overview, the pulsar beam-formed data format, the status of the pipeline processing as well as our future plans for developing the LOFAR Pulsar Search Pipeline. These LOFAR pipelines and software tools are being developed as the next gen...
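    As a small, hypothetical sketch of the kind of HDF5 layout described here (the group, dataset and attribute names below are illustrative assumptions, not the actual LOFAR interface control document), beam-formed data can be written with h5py as follows:

        import numpy as np
        import h5py

        with h5py.File("L2010_12345_bf.h5", "w") as f:
            f.attrs["TELESCOPE"] = "LOFAR"
            f.attrs["OBSERVATION_ID"] = "L2010_12345"
            beam = f.create_group("SUB_ARRAY_POINTING_000/BEAM_000")
            beam.attrs["TARGET"] = "B0329+54"
            # Time x frequency dynamic spectrum, chunked and compressed so that
            # distributed readers can fetch slices efficiently.
            data = beam.create_dataset(
                "STOKES_I",
                shape=(0, 3904),          # samples x channels, extensible in time
                maxshape=(None, 3904),
                chunks=(4096, 488),
                dtype="float32",
                compression="gzip",
            )
            block = np.zeros((4096, 3904), dtype="float32")  # one incoming data block
            data.resize(data.shape[0] + block.shape[0], axis=0)
            data[-block.shape[0]:, :] = block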

  19. Differences Analysis on Domestic and Foreign Process Pipeline Pressure Testing Standards for Oil and Gas Transportation Stations%国内外输油气站工艺管道试压标准差异分析

    Institute of Scientific and Technical Information of China (English)

    余运复; 张继霞; 吴俊松; 于宏庆

    2014-01-01

    Domestic process pipeline pressure testing standards for oil and gas transportation stations were classified and assessed, and the idea of drawing on foreign standards to improve the domestic design and operating level of oil and gas transportation stations is proposed. The advanced features of foreign process pipeline pressure testing standards for oil and gas transportation stations are systematically expounded, including combined strength and leakage testing, or leakage testing alone for pipelines with low operating pressure; classification principles for process pipelines; air-content inspection methods and pressure-reduction procedures during pressure testing; and pressure testing intervals for process pipelines. In addition, advanced technologies and construction cases of domestic and foreign process pipeline pressure testing for oil and gas transportation stations are introduced, such as the determination of a reasonable holding time based on a leakage mathematical model, and the determination of the air pressure testing damage radius based on a high-pressure air explosion energy calculation method. Finally, by drawing on foreign standards, recommendations are made to raise the level of Chinese process pipeline pressure testing standards for oil and gas transportation stations.

  20. 77 FR 66830 - LNG Development Company, LLC and Oregon Pipeline Company; Northwest Pipeline GP; Notice of...

    Science.gov (United States)

    2012-11-07

    ... Energy Regulatory Commission LNG Development Company, LLC and Oregon Pipeline Company; Northwest Pipeline GP; Notice of Extension of Comment Period for the Oregon LNG Export and Washington Expansion Projects This notice announces the extension of the public scoping process and comment period for the Oregon...

  1. 75 FR 4134 - Pipeline Safety: Leak Detection on Hazardous Liquid Pipelines

    Science.gov (United States)

    2010-01-26

    ... safety study on pipeline Supervisory Control and Data Acquisition (SCADA) systems (NTSB/SS-05/02). The... indications of a leak on the SCADA interface was the impetus for this study. The NTSB examined 13 hazardous... large pipeline breaks. The line balance processes incorporating SCADA or other technology are geared...

  2. Regular pipeline maintenance of gas pipeline using technical operational diagnostics methods

    Energy Technology Data Exchange (ETDEWEB)

    Volentic, J. [Gas Transportation Department, Slovensky plynarensky priemysel, Slovak Gas Industry, Bratislava (Slovakia)

    1997-12-31

    Slovensky plynarensky priemysel (SPP) operated 17 487 km of gas pipelines in 1995. The length of the long-line pipelines reached 5 191 km; the distribution network was 12 296 km. The international transit system of long-line gas pipelines comprised 1 939 km of pipelines of various dimensions. This transport and distribution system represents a multibillion investment stored in the ground, exposed to environmental influences and to pipeline operational stresses. In spite of all technical and maintenance measures performed on operating gas pipelines, gradual ageing takes place anyway, expressed in degradation processes both in the steel tube and in the anti-corrosion coating. Within a certain time horizon, consistent and regular application of the methods and means of in-service technical diagnostics and rehabilitation of existing pipeline systems makes it possible to save substantial investment funds by postponing the need for funds for a complete or partial reconstruction, or new construction, of a specific gas section. The purpose of this presentation is to report on the implementation of the programme of in-service technical diagnostics of gas pipelines within the framework of regular maintenance of SPP s.p. Bratislava high pressure gas pipelines. (orig.) 6 refs.

  3. OPUS: the FUSE science data pipeline

    Science.gov (United States)

    Rose, James F.; Heller-Boyer, C.; Rose, M. A.; Swam, M.; Miller, W.; Kriss, G. A.; Oegerle, William R.

    1998-07-01

    This paper describes how the OPUS pipeline, currently used for processing science data from the Hubble Space Telescope (HST), was used as the backbone for developing the science data pipeline for a much smaller mission. The far ultraviolet spectroscopic explorer (FUSE) project selected OPUS for its data processing pipeline platform and selected the OPUS team at the STScI to write the FUSE pipeline applications. A total of 105 new modules were developed for the FUSE pipeline. The foundation of over 250 modules in the OPUS libraries allowed development to proceed quickly and with considerable confidence that the underlying functionality is reliable and robust. Each task represented roughly 90 percent reuse, and the project as a whole shows over 70 percent reuse of the existing OPUS system. Taking an existing system that is operational, and will be maintained for many years to come, was a key decision for the FUSE mission. Adding the extensive experience of the OPUS team to the task resulted in the development of a complete telemetry pipeline system within a matter of months. Reusable software has been the siren song of software engineering and object-oriented design for a decade or more. The development of inexpensive software systems by adapting existing code to new applications is as attractive as it has been elusive. The OPUS telemetry pipeline for the FUSE mission has proven to be a significant exception to that trend.

  4. Redefining the Data Pipeline Using GPUs

    Science.gov (United States)

    Warner, C.; Eikenberry, S. S.; Gonzalez, A. H.; Packham, C.

    2013-10-01

    There are two major challenges facing the next generation of data processing pipelines: 1) handling an ever increasing volume of data as array sizes continue to increase and 2) the desire to process data in near real-time to maximize observing efficiency by providing rapid feedback on data quality. Combining the power of modern graphics processing units (GPUs), relational database management systems (RDBMSs), and extensible markup language (XML) to re-imagine traditional data pipelines will allow us to meet these challenges. Modern GPUs contain hundreds of processing cores, each of which can process hundreds of threads concurrently. Technologies such as Nvidia's Compute Unified Device Architecture (CUDA) platform and the PyCUDA (http://mathema.tician.de/software/pycuda) module for Python allow us to write parallel algorithms and easily link GPU-optimized code into existing data pipeline frameworks. This approach has produced speed gains of over a factor of 100 compared to CPU implementations for individual algorithms and overall pipeline speed gains of a factor of 10-25 compared to traditionally built data pipelines for both imaging and spectroscopy (Warner et al., 2011). However, there are still many bottlenecks inherent in the design of traditional data pipelines. For instance, file input/output of intermediate steps is now a significant portion of the overall processing time. In addition, most traditional pipelines are not designed to be able to process data on-the-fly in real time. We present a model for a next-generation data pipeline that has the flexibility to process data in near real-time at the observatory as well as to automatically process huge archives of past data by using a simple XML configuration file. XML is ideal for describing both the dataset and the processes that will be applied to the data. Meta-data for the datasets would be stored using an RDBMS (such as mysql or PostgreSQL) which could be easily and rapidly queried and file I/O would be
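    As a minimal sketch of the kind of GPU offload described here (the calibration step, array size and numbers are invented for illustration), an elementwise image-calibration kernel can be written and launched with PyCUDA as follows:

        import numpy as np
        import pycuda.autoinit                      # creates a CUDA context on import
        import pycuda.gpuarray as gpuarray
        from pycuda.elementwise import ElementwiseKernel

        # Hypothetical calibration step: dark subtraction and flat-field division,
        # executed for every pixel in parallel on the GPU.
        calibrate = ElementwiseKernel(
            "float *out, float *raw, float *dark, float *flat",
            "out[i] = (raw[i] - dark[i]) / flat[i]",
            "calibrate",
        )

        shape = (4096, 4096)
        raw = gpuarray.to_gpu(np.random.rand(*shape).astype(np.float32))
        dark = gpuarray.to_gpu(np.full(shape, 0.05, dtype=np.float32))
        flat = gpuarray.to_gpu(np.full(shape, 1.10, dtype=np.float32))
        out = gpuarray.empty(shape, dtype=np.float32)

        calibrate(out, raw, dark, flat)             # one kernel launch for all pixels
        print(out.get()[:2, :2])                    # copy a corner back to the host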

  5. Python Bindings for the Common Pipeline Library

    Science.gov (United States)

    Streicher, O.; Weilbacher, P. M.

    2012-09-01

    The Common Pipeline Library is a set of routines written by ESO to provide a standard interface for VLT instrument data reduction tasks (“pipelines”). To control these pipelines from Python, we developed a wrapper called PYTHON-CPL that allows one to conveniently work interactively and to process data as part of an automated data reduction system. The package will be used to implement the MUSE pipeline in the AstroWISE data management system. We describe the features and design of the package.
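    As a rough sketch of how such a wrapper is typically driven (the plugin path, recipe name, parameter and product tag below are illustrative assumptions, not values taken from the PYTHON-CPL documentation), a single reduction step might look like this:

        import cpl

        # Assumption: the ESO recipe plugins (shared libraries) live here.
        cpl.Recipe.path = "/usr/lib/esopipes-plugins"

        # Load a recipe by name and set one of its parameters; "muse_bias" and
        # "nifu" are placeholders, not necessarily real recipe/parameter names.
        bias = cpl.Recipe("muse_bias")
        bias.param.nifu = 1

        # Run the recipe on a list of raw frames; the result object exposes the
        # recipe products, which can be written out as FITS files.
        result = bias(["bias_0001.fits", "bias_0002.fits", "bias_0003.fits"])
        result.MASTER_BIAS.writeto("MASTER_BIAS.fits")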

  6. The investigation of dangerous geological processes resulting in land subsidence while designing the main gas pipeline in South Yakutia

    Science.gov (United States)

    Strokova, L. A.; Ermolaeva, A. V.; Golubeva, V. V.

    2016-09-01

    The number of gas main accidents has increased recently due to dangerous geological processes in underdeveloped areas located in difficult geological conditions. The paper analyses land subsidence caused by karst and thermokarst processes in the right of way, reveals the assessment criteria for geological hazards and creates zoning schemes considering the levels of karst and thermokarst hazards.

  7. Study on Heating Process of Induction Bend for Oil and Gas Pipeline%油气管道用弯管感应加热工艺研究

    Institute of Scientific and Technical Information of China (English)

    池强; 刘腾跃; 燕铸; 李小波

    2012-01-01

    Three induction heating processes for induction bends used in oil and gas pipelines were investigated: the local heating process, the continuous whole heating process and the multi-step whole heating process. The results show that X70 grade pipe can be bent using the local heating process, which provides high processing efficiency. For bending X80 grade pipe, the whole heating process should be adopted; it effectively improves the toughness of the bend tangent, reduces the strength and yield ratio of the tangent to a certain extent, and gives a more reasonable strength-toughness match between the tangent and the bent section. The multi-step whole heating process achieves results similar to those of the continuous whole heating process, and the properties of the intermediate zone subjected to double quenching show no clear change; this process is therefore suited to pipe-bending machines that cannot perform continuous whole-body induction heating.

  8. INTERNAL REPAIR OF PIPELINES

    Energy Technology Data Exchange (ETDEWEB)

    Robin Gordon; Bill Bruce; Ian Harris; Dennis Harwig; Nancy Porter; Mike Sullivan; Chris Neary

    2004-04-12

    The two broad categories of deposited weld metal repair and fiber-reinforced composite liner repair technologies were reviewed for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Preliminary test programs were developed for both deposited weld metal repair and for fiber-reinforced composite liner repair. Evaluation trials have been conducted using a modified fiber-reinforced composite liner provided by RolaTube and pipe sections without liners. All pipe section specimens failed in areas of simulated damage. Pipe sections containing fiber-reinforced composite liners failed at pressures marginally greater than the pipe sections without liners. The next step is to evaluate a liner material with a modulus of elasticity approximately 95% of the modulus of elasticity for steel. Preliminary welding parameters were developed for deposited weld metal repair in preparation for the receipt of Pacific Gas & Electric's internal pipeline welding repair system (that was designed specifically for 559 mm (22 in.) diameter pipe) and the receipt of 559 mm (22 in.) pipe sections from Panhandle Eastern. The next steps are to transfer welding parameters to the PG&E system and to pressure test repaired pipe sections to failure. A survey of pipeline operators was conducted to better understand the needs and performance requirements of the natural gas transmission industry regarding internal repair. Completed surveys contained the following principal conclusions: (1) Use of internal weld repair is most attractive for river crossings, under other bodies of water, in difficult soil conditions, under highways, under congested intersections, and under railway crossings. (2) Internal pipe repair offers a strong potential advantage over the high cost of horizontal directional drilling (HDD) when a new bore must be created

  9. Holdup Measurement of Pipeline

    Institute of Scientific and Technical Information of China (English)

    LU Wen-guang; XU Zheng

    2015-01-01

    This research mainly adopts gamma spectroscopy to detect pipeline holdup. The uranium holdup is calculated from the intensity of the 185.715 keV gamma rays emitted by 235U, and an analysis method for pipeline holdup has been established.

  10. Slurry pipeline hydrostatic testing

    Energy Technology Data Exchange (ETDEWEB)

    Betinol, Roy G.; Navarro Rojas, Luis Alejandro [BRASS Chile S.A., Santiago (Chile)

    2009-07-01

    The transportation of concentrates and tailings through long distance pipelines has been proven in recent years to be the most economic, environmentally friendly and secure means of transporting mine products. This success has led to an increase in the demand for long distance pipelines throughout the mining industry. In 2007 alone, over 500 km of pipeline was installed in South America and over 800 km more is in the planning stages. As more pipelines are installed, the need to ensure their operating integrity is ever increasing. Hydrostatic testing of long distance pipelines is one of the most economical and expeditious ways of proving the operational integrity of the pipe. The intent of this paper is to show the sound reasoning behind construction hydro testing and the economic benefit it presents. It will show how hydro test pressures are determined based on ASME B31.11 criteria. (author)
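    As a rough illustration of the arithmetic involved (the pipe data, design factor and the 1.25 test factor below are assumptions for the sketch, not values quoted from ASME B31.11), the hydrostatic test pressure can be related to the pipe geometry through Barlow's formula:

        # Illustrative numbers only.
        D = 219.1           # pipe outside diameter, mm
        t = 8.18            # wall thickness, mm
        SMYS = 359.0        # specified minimum yield strength, MPa (API 5L X52)
        F = 0.80            # design factor (assumed)
        test_factor = 1.25  # test pressure as a multiple of design pressure (assumed)

        # Barlow's formula for the internal design pressure of thin-walled pipe.
        design_pressure = 2 * SMYS * t * F / D            # MPa
        test_pressure = test_factor * design_pressure     # MPa

        print(f"design pressure ~ {design_pressure:.1f} MPa "
              f"({design_pressure * 145.038:.0f} psi)")
        print(f"hydrostatic test pressure ~ {test_pressure:.1f} MPa "
              f"({test_pressure * 145.038:.0f} psi)")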

  11. Demonstrating the Effects of Shop Flow Process Variability on the Air Force Depot Level Reparable Item Pipeline

    Science.gov (United States)

    1992-09-01

    AFLCR 57-4. Wright-Patterson AFB OH: HQ AFLC, 29 April 1983. Goldratt, Eliyahu M. and Jeff Cox. The Goal. Croton-On-Hudson NY: North... Goldratt and Cox give another example of the same impact of variability in their book The Goal when they demonstrated what

  12. The Herschel Data Processing System - HIPE and Pipelines - Up and Running Since the Start of the Mission

    CERN Document Server

    Ott, Stephan; European Space Agency

    2010-01-01

    The Herschel Space Observatory is the fourth cornerstone mission in the ESA science programme and performs photometry and spectroscopy in the 55 - 672 micron range. The development of the Herschel Data Processing System started in 2002 to support the data analysis for Instrument Level Tests. The Herschel Data Processing System was used for the pre-flight characterisation of the instruments, and during various ground segment test campaigns. Following the successful launch of Herschel on 14 May 2009, the Herschel Data Processing System demonstrated its maturity when the first PACS preview observation of M51 was processed within 30 minutes of reception of the first science data after launch. Also the first HIFI observations on DR21 were successfully reduced to high quality spectra, followed by SPIRE observations on M66 and M74. A fast turn-around cycle between data retrieval and the production of science-ready products was demonstrated during the Herschel Science Demonstration Phase Initial Results Workshop hel...

  13. INTERNAL REPAIR OF PIPELINES

    Energy Technology Data Exchange (ETDEWEB)

    Robin Gordon; Bill Bruce; Ian Harris; Dennis Harwig; George Ritter; Bill Mohr; Matt Boring; Nancy Porter; Mike Sullivan; Chris Neary

    2004-08-17

    , indicating that this type of liner is only marginally effective at restoring the pressure containing capabilities of pipelines. Failure pressures for larger diameter pipe repaired with a semi-circular patch of carbon fiber-reinforced composite liner were also marginally greater than that of a pipe section with un-repaired simulated damage without a liner. These results indicate that fiber reinforced composite liners have the potential to increase the burst pressure of pipe sections with external damage. Carbon fiber-based liners are viewed as more promising than glass fiber-based liners because of the potential for more closely matching the mechanical properties of steel. Pipe repaired with weld deposition failed at pressures lower than that of un-repaired pipe in both the virgin and damaged conditions, indicating that this repair technology is less effective at restoring the pressure containing capability of pipe than a carbon fiber-reinforced liner repair. Physical testing indicates that carbon fiber-reinforced liner repair is the most promising technology evaluated to date. Development of a comprehensive test plan for this process is recommended for use in the field trial portion of this program.

  14. INTERNAL REPAIR OF PIPELINES

    Energy Technology Data Exchange (ETDEWEB)

    Robin Gordon; Bill Bruce; Ian Harris; Dennis Harwig; George Ritter; Bill Mohr; Matt Boring; Nancy Porter; Mike Sullivan; Chris Neary

    2004-12-31

    liners, indicating that this type of liner is only marginally effective at restoring the pressure containing capabilities of pipelines. Failure pressures for larger diameter pipe repaired with a semi-circular patch of carbon fiber-reinforced composite liner were also marginally greater than that of a pipe section with un-repaired simulated damage without a liner. These results indicate that fiber reinforced composite liners have the potential to increase the burst pressure of pipe sections with external damage. Carbon fiber-based liners are viewed as more promising than glass fiber-based liners because of the potential for more closely matching the mechanical properties of steel. Pipe repaired with weld deposition failed at pressures lower than that of un-repaired pipe in both the virgin and damaged conditions, indicating that this repair technology is less effective at restoring the pressure containing capability of pipe than a carbon fiber-reinforced liner repair. Physical testing indicates that carbon fiber-reinforced liner repair is the most promising technology evaluated to date. The first round of optimization and validation activities for carbon-fiber repairs is complete. Development of a comprehensive test plan for this process is recommended for use in the field trial portion of this program.

  15. On the influence of the UOE forming process on material properties and collapse pressure of deep water pipelines: experimental work

    Energy Technology Data Exchange (ETDEWEB)

    Timms, Chris; Swanek, Doug; DeGeer, Duane [C-FER Technologies, Alberta (Canada); Mantovano, Luciano O. [Tenaris Siderca, Buenos Aires (Argentina); Ernst, Hugo A. [Tenaris Siderca, Buenos Aires (Argentina). Structural Integrity Dept.; Toscano, Rita G. [SIM y TEC, Buenos Aires (Argentina); Souza, Marcos P.; Chad, Luis C. [Tenaris Confab, Pindamonhangaba, SP (Brazil)

    2009-07-01

    Large diameter pipes for onshore and offshore applications are manufactured using the UOE process. The manufacturing process consists of the cold forming of heavy plates followed by welding and then by expansion. It has been demonstrated in previous work that, for deep water applications, the cold forming process involved in UOE pipe manufacturing significantly reduces pipe collapse strength. To improve the understanding of these effects, Tenaris has embarked on a program to model the phases of the UOE manufacturing process using finite element methods. Previous phases of this work formulated the basis for the model development and described the 2D approach taken to model the various steps of manufacture. More recent developments included modeling enhancements, some sensitivity analyses, and comparison of predictions to the results of full-scale collapse testing performed at C-FER. This work has shown correlations between manufacturing parameters and collapse pressure predictions. The results of the latest phase of the research program are presented in this paper. This work consists of full scale collapse testing and extensive coupon testing on samples collected from various stages of the UOE pipe manufacturing process including plate, UO, UOE, and thermally aged UOE. Four UOE pipe samples manufactured with varying forming parameters were provided by Tenaris for this test program along with associated plate and UO samples. Full-scale collapse and buckle propagation tests were conducted on a sample from each of the four UOE pipes including one that was thermally aged. Additional coupon-scale work included measurement of the through-thickness variation of material properties and a thermal ageing study aimed at better understanding UOE pipe strength recovery. The results of these tests will provide the basis for further refinement of the finite element model as the program proceeds into the next phase. (author)
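    For context on the quantity being measured, the sketch below evaluates the textbook elastic collapse pressure of a long, perfectly round tube under external pressure (dimensions are assumptions; the actual collapse capacity of UOE pipe is further reduced by ovality, residual stress and the compressive yield strength degradation introduced by forming, which is what the full-scale tests quantify):

        # Elastic collapse pressure of a long, round tube: P_el = 2E/(1 - nu^2) * (t/D)^3
        E = 207e9     # Young's modulus of steel, Pa
        nu = 0.3      # Poisson's ratio
        D = 0.508     # outside diameter, m (assumed)
        t = 0.0254    # wall thickness, m (assumed)

        p_el = 2 * E / (1 - nu**2) * (t / D) ** 3
        depth = p_el / (1025 * 9.81)     # equivalent seawater head, m

        print(f"elastic collapse pressure ~ {p_el / 1e6:.1f} MPa "
              f"(~{depth:.0f} m of seawater)")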

  16. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    Directory of Open Access Journals (Sweden)

    Konrad J Karczewski

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  17. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    Science.gov (United States)

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.
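    As a minimal sketch of the read-mapping stage that such a system automates behind its graphical interface (tool choice and file names are assumptions for illustration, not the STORMSeq internals), paired-end reads can be aligned and sorted as follows:

        import subprocess

        # Align paired-end reads to a reference with BWA-MEM, then coordinate-sort
        # and index the alignments with samtools. File names are placeholders and
        # the tools must be installed and on PATH.
        ref = "reference.fa"
        reads = ["sample_R1.fastq.gz", "sample_R2.fastq.gz"]

        subprocess.run(["bwa", "index", ref], check=True)
        with open("sample.sam", "w") as sam:
            subprocess.run(["bwa", "mem", "-t", "4", ref, *reads], stdout=sam, check=True)
        subprocess.run(["samtools", "sort", "-o", "sample.sorted.bam", "sample.sam"], check=True)
        subprocess.run(["samtools", "index", "sample.sorted.bam"], check=True)
        # Variant calling and annotation (e.g. with bcftools or GATK) would follow.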

  18. Dynamical Mechanisms of Effects of Landslides on Long Distance Oil and Gas Pipelines

    Institute of Scientific and Technical Information of China (English)

    MA Qingwen; WANG Chenghua; KONG Jiming

    2006-01-01

    According to investigations of oil and gas pipelines such as the Lan-Cheng-Chong pipeline and the Southwest pipeline, there are two ways of laying pipeline: pipelines (approximately) parallel to the main slide direction and pipelines (approximately) perpendicular to the main slide direction. If earth-retaining walls have been built for pipelines parallel to the main slide direction, they help prevent the land from sliding; without earth-retaining walls, sharp broken rocks in the backfill soil will scratch the protective coating of the pipeline when landslides take place. Pipelines perpendicular to the main slide direction can be classified into four types according to the relative positions of pipelines and landslides: pipelines above the slide plane, pipelines inside the fracture strip of the slide plane, pipelines below the slide plane and pipelines behind the back edge of the landslide. The different dynamical mechanisms of the process by which a landslide acts on a pipeline are analyzed according to whether the pipelines are equipped with fixed frusta, because the sliding resistance depends on whether and how many fixed frusta are installed and on the distance between them.

  19. Pipeline four-dimension management is the trend of pipeline integrity management in the future

    Energy Technology Data Exchange (ETDEWEB)

    Shaohua, Dong; Feifan; Zhongchen, Han [China National Petroleum Corporation (CNPC), Beijing (China)

    2009-07-01

    Pipeline integrity management is essential for today's operators to run their pipelines safely and cost effectively. The latest developments in pipeline integrity management around the world involve changes in regulation and industry standards and innovation in technology. What, then, is the future trend of pipeline integrity management (PIM)? This paper attempts to answer that question by introducing, for the first time, the concept of pipeline four-dimension management (P4DM). The paper analyses pipeline HSE management, pipeline integrity management (PIM) and asset integrity management (AIM), identifies the management problems that arise, and puts forward the P4DM theory. The hierarchy of P4DM and its management elements, fields, space and time are analysed. The main idea is that P4DM integrates geographic location and time in order to control and manage the pipeline system as a whole process, anywhere and at any time. It covers pipeline integrity, pipeline operation and emergency response, integrated through an IT system, so that ideas, solutions, technology, organization and managers jointly and intelligently control the management process. The paper covers the definition of pipeline 4D management, the research and development of P4DM, the theory of P4DM, the relationship between P4DM and PIM, the technological basis of P4DM, how to implement P4DM, and conclusions. P4DM indicates the future development direction of PIM and provides new ideas for PetroChina in the fields of technology and management. (author)

  20. Pipeline operators training and certification using thermohydraulic simulators

    Energy Technology Data Exchange (ETDEWEB)

    Barreto, Claudio V.; Plasencia C, Jose [Pontificia Universidade Catolica (PUC-Rio), Rio de Janeiro, RJ (Brazil). Nucleo de Simulacao Termohidraulica de Dutos (SIMDUT); Montalvao, Filipe; Costa, Luciano [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The continuous training and certification of pipeline operators at TRANSPETRO's National Pipeline Operations Control Center (CNCO) is essential to the efficiency and safety of oil and derivatives transport operations through the Brazilian pipeline network. For this purpose, a hydraulic simulator is an excellent tool: it allows the creation of different operational scenarios for training in pipeline hydraulic behaviour and for testing operators' responses to normal and abnormal real-time operating conditions. The hydraulic simulator is based on pipeline simulation software that supplies the hydraulic responses normally acquired from the pipeline remote units in the field. The simulation software has a communication interface that sends data to, and receives data from, the SCADA supervisory system database. The SCADA graphical interface is used to create and customize human-machine interfaces (HMI) from which the operator/instructor has full control of the pipeline system and instrumentation by sending commands. It is therefore possible to provide realistic training outside the real production systems, while operators acquire experience with the operation of a real pipeline during training hours. A pilot project was initiated at TRANSPETRO-CNCO to evaluate the advantages of hydraulic simulators in pipeline operator training and certification programs. The first part of the project was the development of three simulators for different pipelines. The excellent results allowed the project to be expanded to a total of twenty different pipelines, implemented in training programs for pipelines presently operated by CNCO as well as for the new ones being migrated. The main objective of this paper is to present an overview of the implementation process and the development of a training environment based on a pipeline simulation environment using commercial software. This paper also presents

  1. Dispatch optimization of gathering pipeline network based on quality indices of natural gas processing plant%基于天然气处理厂气质指标的集输管网调度优化

    Institute of Scientific and Technical Information of China (English)

    徐源; 艾慕阳; 刘武; 师春元; 刘春艳

    2013-01-01

    Based on a multi-gas-source component equilibrium calculation model built to match real working conditions, this paper takes the quality indices of the raw gas delivered to natural gas processing plants as the objective function, sets up an optimization model, and solves it to obtain the operating parameters of the gathering network that satisfy both the network operating constraints and the raw gas quality indices required by each processing plant. Field cases show that the optimized dispatch model for the gathering pipeline network can represent the differences in gas quality between source locations, meet the different raw gas quality requirements of the natural gas processing plants, and produce results that accurately reflect the gas quality status in the network, thereby ensuring the steady running of the purification units. Applied to a real pipeline network, the model provides an important reference for production and operation dispatch.
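    As a toy illustration of this kind of dispatch problem (a linear-programming sketch with invented capacities, costs and quality numbers, not the paper's model), flows from several sources can be chosen so that the blended raw gas meets a plant's quality index at minimum cost:

        import numpy as np
        from scipy.optimize import linprog

        # Choose flows q1..q3 from three gas sources into one processing plant so
        # that the blended H2S content stays below the plant's raw-gas quality
        # index while meeting demand at minimum gathering cost (illustrative data).
        cost = np.array([1.2, 1.0, 0.8])      # transport cost per unit flow
        h2s = np.array([20.0, 60.0, 110.0])   # H2S content of each source, mg/m3
        limit = 80.0                          # plant quality index, mg/m3
        demand = 100.0

        res = linprog(
            c=cost,
            A_ub=[h2s - limit],               # equivalent to h2s @ q <= limit * sum(q)
            b_ub=[0.0],
            A_eq=[np.ones(3)],
            b_eq=[demand],
            bounds=[(0, 60)] * 3,             # per-source capacities
            method="highs",
        )
        print(res.x, res.fun)                 # optimal flows and total cost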

  2. Analytic prognostic for petrochemical pipelines

    CERN Document Server

    Jaoude, Abdo Abou; El-Tawil, Khaled; Noura, Hassan; Ouladsine, Mustapha

    2012-01-01

    Pipeline tubes are part of vital mechanical systems largely used in petrochemical industries. They serve to transport natural gas or liquids. They are cylindrical tubes subject to the risk of corrosion due to the high pH concentrations of the transported liquids, in addition to fatigue cracks caused by the alternation of pressure and depression of the gas over time, which initiates micro-cracks in the tube body that can propagate abruptly and lead to failure. The development of a prognostic process for such systems greatly increases their performance and availability and decreases the global cost of their missions. This paper therefore deals with a new prognostic approach to improve the performance of these pipelines. Only the first crack mode, that is, the opening mode, is considered.
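    As a minimal sketch of a mode-I crack-growth prognostic calculation (illustrative numbers, and the classical Paris law rather than the authors' specific model), the remaining life under cyclic pressure loading can be estimated as follows:

        import math

        # Paris law: da/dN = C * DeltaK**m, with DeltaK = Y * d_sigma * sqrt(pi * a)
        C, m = 1.0e-11, 3.0   # Paris constants, m/cycle and MPa*sqrt(m) (assumed)
        Y = 1.12              # geometry factor for a shallow surface crack (assumed)
        d_sigma = 150.0       # stress range of the pressure cycle, MPa (assumed)
        a = 0.5e-3            # initial crack depth, m
        a_crit = 8.0e-3       # assumed critical depth, e.g. derived from K_IC, m
        dN = 100              # integrate the growth law in blocks of 100 cycles

        cycles = 0
        while a < a_crit:
            delta_K = Y * d_sigma * math.sqrt(math.pi * a)
            a += C * delta_K ** m * dN
            cycles += dN

        print(f"crack grows from 0.5 mm to {a_crit * 1e3:.0f} mm in ~{cycles:,} cycles")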

  3. Effects of welding wire composition and welding process on the weld metal toughness of submerged arc welded pipeline steel

    Institute of Scientific and Technical Information of China (English)

    De-liang Ren; Fu-ren Xiao; Peng Tian; Xu Wang; Bo Liao

    2009-01-01

    The effects of alloying elements in welding wires and of the submerged arc welding process on the microstructures and low-temperature impact toughness of weld metals have been investigated. The results indicate that optimal contents of alloying elements in the welding wires can improve the low-temperature impact toughness of weld metals, because the formation of proeutectoid ferrite and bainite is suppressed and the fraction of acicular ferrite increases. However, the contents of alloying elements need to vary with the welding heat input: as the welding heat input increases, the contents of alloying elements in the welding wires need to be increased accordingly. Microstructures consisting mainly of acicular ferrite can be obtained in weld metals after four-wire submerged arc welding using wires with a low carbon content and appropriate contents of Mn, Mo, Ti-B, Cu, Ni and RE, resulting in high low-temperature impact toughness of the weld metals.

  4. Data reduction pipeline for the MMT Magellan Infrared Spectrograph

    CERN Document Server

    Chilingarian, Igor; Fabricant, Daniel; McLeod, Brian; Roll, John; Szentgyorgyi, Andrew

    2012-01-01

    We describe principal components of the new spectroscopic data pipeline for the multi-object MMT/Magellan Infrared Spectrograph (MMIRS). The pipeline is implemented in IDL and C++. The performance of the data processing algorithms is sufficient to reduce a single dataset in 2-3 min on a modern PC workstation so that one can use the pipeline as a quick-look tool during observations. We provide an example of the spectral data processed by our pipeline and demonstrate that the sky subtraction quality gets close to the limits set by the Poisson photon statistics.
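    As a small illustration of what "Poisson-limited" sky subtraction means (the gain and sky level are assumed numbers and the frame is simulated, not MMIRS data), the residual noise after subtracting a perfect sky model can be compared with the photon-statistics floor:

        import numpy as np

        gain = 2.0                    # detector gain, e-/ADU (assumed)
        sky_level_adu = 500.0         # mean sky per pixel, ADU (assumed)
        rng = np.random.default_rng(0)

        # Simulated sky frame containing only Poisson photon noise.
        frame = rng.poisson(sky_level_adu * gain, size=(1024, 1024)) / gain
        residual = frame - sky_level_adu

        poisson_limit_adu = np.sqrt(sky_level_adu / gain)
        print(f"residual RMS  : {residual.std():.2f} ADU")
        print(f"Poisson limit : {poisson_limit_adu:.2f} ADU")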

  5. Natural Gas Liquid Pipelines

    Data.gov (United States)

    Department of Homeland Security — Natural gas interstate and intrastate pipelines in the United States. Based on a variety of sources with varying scales and levels of accuracy and therefore accuracy...

  6. BSEE_Pacific_Pipelines

    Data.gov (United States)

    Bureau of Ocean Energy Management, Department of the Interior — This data set contains the locations of oil and gas pipelines in the Bureau of Safety and Environmental Enforcement Pacific OCS Region

  7. Central oxygen pipeline failure

    African Journals Online (AJOL)

    Anaesthetic and critical care staff play a governing role in the comprehension of a ... complete central oxygen pipeline failure occurred throughout Tygerberg Hospital. ... emergency stations and at plant room emergency supply manifolds.

  8. A versatile and low-cost 3D acquisition and processing pipeline for collecting mass of archaeological findings on the field

    Science.gov (United States)

    Gattet, E.; Devogelaere, J.; Raffin, R.; Bergerot, L.; Daniel, M.; Jockey, Ph.; De Luca, L.

    2015-02-01

    In recent years, advances in the fields of photogrammetry and computer vision have produced several solutions for generating 3D reconstructions starting from simple images. Even if the potential of the image-based 3D reconstruction approach is nowadays very well known in terms of reliability, accuracy and flexibility, there is still a lack of low-cost, open-source and automated solutions for collecting masses of archaeological findings, especially if one considers the real (and not merely theoretical) contextual aspects of a digitization campaign in the field (number of objects to acquire, available time, lighting conditions, equipment transport, budget, etc...) as well as the accuracy requirements for in-depth shape analysis and classification purposes. In this paper we present a prototype system (integrating hardware and software) for the 3D acquisition, geometric reconstruction, documentation and archiving of large collections of archaeological findings. All the aspects of our approach are based on high-end image-based modeling techniques and designed based on an accurate analysis of the typical field conditions of an archaeological campaign, as well as on the specific requirements of archaeological finding documentation and analysis. This paper presents all the aspects integrated into the prototype: - a hardware development of a transportable photobooth for automated image acquisition, consisting of a turntable and three DSLRs controlled by a microcontroller; - an automatic image processing pipeline (based on Apero/Micmac) including mask generation, tie-point extraction, bundle adjustment, multi-view stereo correlation, point cloud generation and surface reconstruction; - a versatile (off-line/on-line) portable database for associating descriptive attributes (archaeological descriptions) with the 3D digitizations on site; - a platform for data-gathering, archiving and sharing collections of 3D digitizations on the Web. The presentation and the assessment of this

  9. Nearshore Pipeline Installation Methods.

    Science.gov (United States)

    1981-08-01

    179. 5. Aldridge, R. G., and Bomba, J. G., "Deep Water Pipelines - Interdependence of Design and Construction", ASCE Paper. 6. American Society Civil...October 13, 1967. 24. Bomba, J. G. and Seeds, K. J., "Pipelining in 600 feet of water ... A Case Study of Washington Natural Gas Company's Puget Sound...Crossing", Offshore Technology Conference, paper OTC 1188, 1970. 25. Bomba, J., "Submarine Pipe Construction Methods", Petroleum Engineer, Vol. 32

  10. Co-Production of Olefins, Fuels, and Electricity from Conventional Pipeline Gas and Shale Gas with Near-Zero CO2 Emissions. Part I: Process Development and Technical Performance

    Directory of Open Access Journals (Sweden)

    Yaser Khojasteh Salkuyeh

    2015-04-01

    A novel polygeneration process is presented in this paper that co-produces olefins, methanol, dimethyl ether, and electricity from conventional pipeline natural gas and different kinds of shale gases. Technical analyses of many variants of the process are performed, considering differences in power generation strategy and gas type. The results show that the efficiency of the plant varies between 22%-57% (HHV) depending on the product portfolio. The efficiency is higher than that of a traditional methanol-to-olefin process, which enables it to be competitive with traditional naphtha cracking plants.

  11. Pipeline coating comparison methods for northern pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Singh, P. [Shaw Pipe Protection, Calgary, AB (Canada); Purves, G.A. [Cimarron Engineering Ltd., Calgary, AB (Canada)

    2004-07-01

    Two high-quality pipe coatings designed for northern environments were compared for their relative costs and suitability for the conditions that will be encountered in the field. Coating selection should consider local conditions to achieve the optimum life-cycle costs for the system. Some of the key factors affecting the integrity of the protective coating on a pipe include the effects of cold temperature and soil types. In this study, both Fusion Bonded Epoxy (FBE) and High Performance Composite Coatings (HPCC) were evaluated for an entire pipeline installation in a northern environment, from the coating plant to the pipe trench. The evaluation focused on the advantages of better abrasion resistance of the HPCC coating. This was compared against the incremental cost of HPCC coating over FBE on large diameter NPS 30 to NPS 48 pipelines. The following parameters influenced the choice of coating: storage, transportation and handling; bending ability under cold weather conditions; pipe installation and backfilling; weld joint coatings; coating repair and cathodic protection and pipeline integrity. Some of the construction costs that are indirectly affected by the choice of pipe coating include right-of-way preparation and restoration; trenching; supervision, service and downtime and specialist crossings. It was concluded that HPCC has better resistance to abrasion than FBE and is more flexible in extremely cold temperatures. Standard FBE is about 10 per cent less expensive than HPCC. In general HPCC will require less coating protection than FBE, depending on site conditions. 3 refs., 18 tabs., 8 figs.

  12. Toward a graphical user interface for the SPIRE spectrometer pipeline

    Science.gov (United States)

    Ordenovic, C.; Surace, C.; Baluteau, J. P.; Benielli, D.; Davis, P.; Fulton, T.

    2008-08-01

    Herschel is a satellite mission led by ESA and involving an international consortium of countries. The HCSS is in charge of the data processing pipeline. This pipeline is written in Jython and includes Java classes. We present a convenient way for a user to deal with SPIRE photometer and spectrometer pipeline scripts. The provided Graphical User Interface is built up automatically from the Jython script. The user can choose tasks to be executed, parameterise them and set breakpoints during the pipeline execution. Results can be displayed and saved in FITS and VOTable formats.

  13. 77 FR 2126 - Pipeline Safety: Implementation of the National Registry of Pipeline and Liquefied Natural Gas...

    Science.gov (United States)

    2012-01-13

    ... Registry of Pipeline and Liquefied Natural Gas Operators AGENCY: Pipeline and Hazardous Materials Safety... registry of pipeline and liquefied natural gas operators. FOR FURTHER INFORMATION CONTACT: Jamerson Pender... 72878), titled: "Pipeline Safety: Updates to Pipeline and Liquefied Natural Gas Reporting...

  14. Pipeliners go regulator shopping

    Energy Technology Data Exchange (ETDEWEB)

    Byfield, M.

    1996-12-09

    The weakening of Alberta's regulatory grip on gas pipelines was discussed. Palliser Pipeline Limited has challenged Nova Corp's monopoly by applying to the National Energy Board (NEB) for permission to build a 150-mile pipeline from Calgary to the Saskatchewan border. If the $350 million project proceeds, it would mean that gas would be flowing out of Alberta for the first time through a line that is not operated by Nova Corp. Palliser would operate with a lower shipping toll, set by the NEB rather than Alberta's Energy and Utilities Board. Alliance Pipeline Ltd. will also apply to the NEB to build a 1850-mile pipeline that would originate in British Columbia, cross Alberta and terminate in Chicago. Nova Corp has implied that it might have to consider charging distance-based tolls if the Palliser bypass line proceeds. However, Palliser countered that it should not be necessary to change the postage stamp system for that small a fraction. Palliser suggested that Nova was simply reacting because it was facing competition for the first time. Final decision is in the hands of the federal government.

  15. Protecting a pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Gray, D.H (Univ. of Michigan, Ann Arbor, MI (United States)); Garcia-Lopez, M. (Ingenieria y Geotecnia Ltda., Santafe de Bogota (Colombia))

    1994-12-01

    This article describes some of the difficulties in constructing an oil pipeline in Colombia across a forested mountain range that has erosion-prone slopes. Engineers are finding ways to protect the pipeline against slope failures and severe erosion problems while contending with threats of guerrilla attacks. Torrential rainfall, precipitous slopes, unstable soils, unfavorable geology and difficult access make construction of an oil pipeline in Colombia a formidable undertaking. Add the threat of guerrilla attacks, and the project takes on a new dimension. In the country's central uplands, a 76 cm pipeline traverses some of the most daunting and formidable terrain in the world. The right-of-way crosses rugged mountains with vertical elevations ranging from 300 m to 2,000 m above sea level over a distance of some 30 km. The pipeline snakes up and down steep forested inclines in some spots and crosses streams and faults in others, carrying the country's major export--petroleum--from the Cusiana oil field, located in Colombia's lowland interior, to the coast.

  16. Overview of interstate hydrogen pipeline systems.

    Energy Technology Data Exchange (ETDEWEB)

    Gillette, J. L.; Kolpa, R. L.

    2008-02-01

    The use of hydrogen in the energy sector of the United States is projected to increase significantly in the future. Current uses are predominantly in the petroleum refining sector, with hydrogen also being used in the manufacture of chemicals and other specialized products. Growth in hydrogen consumption is likely to appear in the refining sector, where greater quantities of hydrogen will be required as the quality of the raw crude decreases, and in the mining and processing of tar sands and other energy resources that are not currently used at a significant level. Furthermore, the use of hydrogen as a transportation fuel has been proposed both by automobile manufacturers and the federal government. Assuming that the use of hydrogen will significantly increase in the future, there would be a corresponding need to transport this material. A variety of production technologies are available for making hydrogen, and there are equally varied raw materials. Potential raw materials include natural gas, coal, nuclear fuel, and renewables such as solar, wind, or wave energy. As these raw materials are not uniformly distributed throughout the United States, it would be necessary to transport either the raw materials or the hydrogen long distances to the appropriate markets. While hydrogen may be transported in a number of possible forms, pipelines currently appear to be the most economical means of moving it in large quantities over great distances. One means of controlling hydrogen pipeline costs is to use common rights-of-way (ROWs) whenever feasible. For that reason, information on hydrogen pipelines is the focus of this document. Many of the features of hydrogen pipelines are similar to those of natural gas pipelines. Furthermore, as hydrogen pipeline networks expand, many of the same construction and operating features of natural gas networks would be replicated. As a result, the description of hydrogen pipelines will be very similar to that of natural gas pipelines

  17. Pipelines. Economy's veins; Pipelines. Adern der Wirtschaft

    Energy Technology Data Exchange (ETDEWEB)

    Feizlmayr, Adolf; Goestl, Stefan [ILF Beratende Ingenieure, Muenchen (Germany)

    2011-02-15

    According to existing prognoses, more than 1 million km of gas, oil and water pipelines will be built by the year 2030, the predominant portion being gas pipelines. The safe continued utilization of ageing pipelines is a major challenge; in addition, diagnostic technology, evaluation and risk assessment have to be developed further. In the design of new oil and gas pipelines, aspects of environmental protection, the energy efficiency of transport and thus the reduction of carbon dioxide emissions, public acceptance and the market strategy of the exporters gain in importance. Offshore pipelines will soon exceed the present limit of 2,000 m water depth and penetrate into greater sea depths.

  18. Progress with the LOFAR Imaging Pipeline

    CERN Document Server

    Heald, George; Pizzo, Roberto; van Diepen, Ger; van Zwieten, Joris E; van Weeren, Reinout J; Rafferty, David; van der Tol, Sebastiaan; Birzan, Laura; Shulevski, Aleksandar; Swinbank, John; Orru, Emanuela; De Gasperin, Francesco; Ker, Louise; Bonafede, Annalisa; Macario, Giulia; Ferrari, Chiara

    2010-01-01

    One of the science drivers of the new Low Frequency Array (LOFAR) is large-area surveys of the low-frequency radio sky. Realizing this goal requires automated processing of the interferometric data, such that fully calibrated images are produced by the system during survey operations. The LOFAR Imaging Pipeline is the tool intended for this purpose, and is now undergoing significant commissioning work. The pipeline is now functional as an automated processing chain. Here we present several recent LOFAR images that have been produced during the still ongoing commissioning period. These early LOFAR images are representative of some of the science goals of the commissioning team members.

  19. The Analysis of Pipeline Transportation Process for CO2 Captured From Reference Coal-Fired 900 MW Power Plant to Sequestration Region

    Directory of Open Access Journals (Sweden)

    Witkowski Andrzej

    2014-12-01

    Three commercially available intercooled compression strategies for compressing CO2 were studied. All of the compression concepts require a final delivery pressure of 153 bar at the inlet to the pipeline. Simulations were then used to determine the maximum safe pipeline distance to subsequent booster stations as a function of inlet pressure, environmental temperature, thickness of the thermal insulation and ground-level heat flux conditions. The results show that subcooled liquid transport increases energy efficiency and minimises the cost of CO2 transport over long distances under heat transfer conditions. The study also found that the thermal insulation layer should not be laid on the external surface of the pipe under atmospheric conditions in Poland. The most important problem from the environmental protection point of view is rigorous and robust hazard identification, which indirectly affects CO2 transportation. This paper analyses ways of reducing transport risk by means of safety valves.
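    As a rough single-phase estimate of the booster-station spacing question studied here (a Darcy-Weisbach sketch with constant fluid properties and invented pipe data, not the paper's simulations), the distance over which the pressure falls from the 153 bar inlet value to an assumed minimum can be computed as follows:

        import math

        p_in, p_min = 153e5, 90e5   # inlet and assumed minimum pressure, Pa
        rho = 850.0                 # dense-phase CO2 density, kg/m3 (assumed constant)
        mdot = 130.0                # mass flow of the captured CO2 stream, kg/s (assumed)
        D = 0.40                    # pipe inner diameter, m (assumed)
        f = 0.012                   # Darcy friction factor (assumed)

        area = math.pi * D**2 / 4
        v = mdot / (rho * area)                  # mean velocity, m/s
        dp_per_m = f * rho * v**2 / (2 * D)      # frictional pressure gradient, Pa/m
        max_distance_km = (p_in - p_min) / dp_per_m / 1000

        print(f"velocity ~ {v:.2f} m/s, distance to booster ~ {max_distance_km:.0f} km")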

  20. Pipeline rehabilitation planning

    Energy Technology Data Exchange (ETDEWEB)

    Palmer-Jones, Roland; Hopkins, Phil; Eyre, David [PENSPEN (United Kingdom)

    2005-07-01

    An operator faced with an onshore pipeline that has extensive damage must consider the need for rehabilitation, the sort of rehabilitation to be used, and the rehabilitation schedule. This paper will consider pipeline rehabilitation based on the authors' experiences from recent projects, and recommend a simple strategy for planning pipeline rehabilitation. It will also consider rehabilitation options: external re-coating; internal lining; internal painting; programmed repairs. The main focus will be external re-coating. Consideration will be given to rehabilitation coating types, including tape wraps, epoxy, and polyurethane. Finally it will discuss different options for scheduling the rehabilitation of corrosion damage including: the statistical comparison of signals from inspection pigs; statistical comparison of selected measurements from inspection pigs and other inspections; the use of corrosion rates estimated for the mechanisms and conditions; expert judgement. (author)

  1. Pipelined Two-Operand Modular Adders

    Directory of Open Access Journals (Sweden)

    M. Czyzak

    2015-04-01

    Pipelined two-operand modular adders (TOMAs) are among the basic components used in digital signal processing (DSP) systems that use the residue number system (RNS). Such modular adders are used in binary/residue and residue/binary converters, residue multipliers and scalers, as well as within residue processing channels. The design of a pipelined TOMA is usually obtained by inserting an appropriate number of latch layers inside a non-pipelined TOMA structure; its area is therefore determined by the number of latches and its delay by the number of latch layers. In this paper we propose a new pipelined TOMA, based on a new TOMA that has a smaller area and a smaller delay than other known structures. Comparisons are made using data from a very large scale integration (VLSI) standard cell library.
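    As a small illustration of the arithmetic such an adder implements (a software sketch, not the paper's circuit), a value in a residue number system is held as its residues modulo pairwise coprime moduli, and addition is performed independently, channel by channel:

        MODULI = (7, 11, 13, 15)      # pairwise coprime; dynamic range = 15015

        def to_rns(x):
            return tuple(x % m for m in MODULI)

        def rns_add(a, b):
            # One modular adder per channel; in hardware each of these would be a
            # pipelined TOMA producing (a + b) mod m.
            return tuple((ai + bi) % m for ai, bi, m in zip(a, b, MODULI))

        def from_rns(r):
            # Chinese Remainder Theorem reconstruction by search (fine for a demo).
            rng = 1
            for m in MODULI:
                rng *= m
            return next(x for x in range(rng) if to_rns(x) == tuple(r))

        a, b = 1234, 4321
        s = rns_add(to_rns(a), to_rns(b))
        print(s, from_rns(s))         # residues of 5555, and 5555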

  2. 78 FR 70623 - Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory...

    Science.gov (United States)

    2013-11-26

    ... gas pipelines and for hazardous liquid pipelines. Both committees were established under the Federal... Administration [Docket No. PHMSA-2009-0203] Pipeline Safety: Meeting of the Gas Pipeline Advisory Committee and the Liquid Pipeline Advisory Committee AGENCY: Pipeline and Hazardous Materials Safety...

  3. Modeling and monitoring of pipelines and networks advanced tools for automatic monitoring and supervision of pipelines

    CERN Document Server

    Torres, Lizeth

    2017-01-01

    This book focuses on the analysis and design of advanced techniques for on-line automatic computational monitoring of pipelines and pipe networks. It discusses how to improve the systems’ security considering mathematical models of the flow, historical flow rate and pressure data, with the main goal of reducing the number of sensors installed along a pipeline. The techniques presented in the book have been implemented in digital systems to enhance the abilities of the pipeline network’s operators in recognizing anomalies. A real leak scenario in a Mexican water pipeline is used to illustrate the benefits of these techniques in locating the position of a leak. Intended for an interdisciplinary audience, the book addresses researchers and professionals in the areas of mechanical, civil and control engineering. It covers topics on fluid mechanics, instrumentation, automatic control, signal processing, computing, construction and diagnostic technologies.

  4. The Technology and Method of Inner Corrosion Testing of Crude Oil Process Pipeline

    Institute of Scientific and Technical Information of China (English)

    韩烨; 薛正林; 陈波; 王志刚; 骆苏军

    2016-01-01

    This paper introduces the structure, condition and service environment of crude oil process pipelines and systematically analyses the technical characteristics and limitations of the non-destructive testing (NDT) methods commonly applied to them. Because of the crude oil medium, large diameter, insulating layer and complex structure, such process pipelines cannot be inspected rapidly by traditional NDT methods alone. Based on experimental testing of process pipelines at a large oil transport station of SINOPEC, an inspection principle of working from far to near and from coarse screening to precise testing is proposed, applying different NDT methods to different pipe sections and service environments. By combining long-range screening with low-frequency guided waves, close-range localisation with high-frequency guided waves, rapid contact screening with magnetic memory testing, phased-array ultrasonic thickness measurement and C-scan imaging for accurate quantification, fast, efficient, accurate and reliable inspection of crude oil process pipelines can be achieved. The suitability and limitations of these techniques for process pipeline testing are summarised.

  5. Processing Technology for Underground Diaphragm Walls Crossing Underground Pipelines

    Institute of Scientific and Technical Information of China (English)

    王剑; 邢荣亮; 赵明时

    2016-01-01

    In built-up urban areas, underground pipelines are intricate and complex, and the construction of underground diaphragm walls is constrained by them. To solve this problem, a treatment technology for diaphragm walls crossing underground pipelines is proposed, based on trenching with a hydraulic grab combined with a reverse-circulation drilling rig. Five key aspects of the construction process are described: panel division of the diaphragm wall; exposing, verifying and reinforcing the existing pipelines together with guide wall construction; joint trenching with the hydraulic grab and the reverse-circulation drilling rig; fabrication and installation of the reinforcement cages; and concrete placement. The technology solves the construction difficulties, saves cost and safeguards the schedule, and provides a reference for similar projects.

  6. The TROBAR pipeline

    Science.gov (United States)

    Stefanon, Mauro

    TROBAR is a 60 cm robotic telescope installed at the Observatorio de Aras de los Olmos (OAO), approximately 100 km north-west of Valencia (Spain). It is currently equipped with a 4K×4K optical camera covering a FoV of 30×30 arcmin^2. We are now implementing a pipeline for the automatic reduction of its data. In this paper we present the main features of the pipeline, with particular attention to some of the algorithms implemented to assess the quality of the produced data, and show their application to synthetic images.

  7. Superscalar pipelined inner product computation unit for signed unsigned number

    Directory of Open Access Journals (Sweden)

    Ravindra P. Rajput

    2016-09-01

    Full Text Available In this paper, we propose a superscalar pipelined inner product computation unit for signed-unsigned numbers operating at 16 GHz. It is designed as a five-stage pipeline with four 8 × 8 multipliers operating in parallel. The superscalar pipeline computes four 8 × 8 products in parallel in three clock cycles. In the fourth clock cycle, two inner products are computed using two adders in parallel. The fifth stage computes the final product by adding the two partial inner products. Once the pipeline is filled, a new 16 × 16-bit signed-unsigned product is obtained every clock cycle. The worst delay measured among the pipeline stages is 0.062 ns, which is taken as the clock period; with this period the pipeline can be operated with a 16 GHz synchronous clock. Each superscalar pipeline stage is implemented in 45 nm CMOS process technology, and comparison of results shows that the delay is decreased by 38%, the area is reduced by 45% and the power dissipation is reduced by 32%.
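
    The decomposition behind such a unit can be illustrated with a short sketch: a 16 × 16 product is assembled from four 8 × 8 partial products computed in parallel, two partial sums formed by parallel adders, and a final addition. The Python below is a purely functional model of that dataflow for the unsigned case; the stage grouping and names are illustrative, not the authors' hardware design.

```python
# Sketch: building a 16 x 16 product from four 8 x 8 partial products,
# mirroring the stage structure described above (unsigned case only).
def split8(x):
    """Split a 16-bit operand into (high, low) 8-bit halves."""
    return (x >> 8) & 0xFF, x & 0xFF

def pipelined_product(a, b):
    # Stages 1-3 (collapsed here): four 8 x 8 multiplications in parallel.
    a_hi, a_lo = split8(a)
    b_hi, b_lo = split8(b)
    p_ll, p_lh = a_lo * b_lo, a_lo * b_hi
    p_hl, p_hh = a_hi * b_lo, a_hi * b_hi
    # Stage 4: two partial inner products formed by two adders in parallel.
    inner1 = p_ll + (p_lh << 8)
    inner2 = (p_hl << 8) + (p_hh << 16)
    # Stage 5: final addition of the two partial inner products.
    return inner1 + inner2

assert pipelined_product(0xABCD, 0x1234) == 0xABCD * 0x1234
```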

  8. Risk Analysis using Corrosion Rate Parameter on Gas Transmission Pipeline

    Science.gov (United States)

    Sasikirono, B.; Kim, S. J.; Haryadi, G. D.; Huda, A.

    2017-05-01

    In the oil and gas industry, the pipeline is a major component in the transmission and distribution of oil and gas. The distribution process sometimes takes the pipeline across various types of environmental conditions. Therefore, in the transmission and distribution of oil and gas, a pipeline should operate safely so that it does not harm the surrounding environment. Corrosion is still a major cause of failure in some components of the equipment in a production facility. In pipeline systems, corrosion can cause failures in the wall and damage to the pipeline, so pipeline systems require care and periodic inspection. Every production facility in an industry has a level of risk for damage, resulting from the likelihood and the consequences of the damage caused. The purpose of this research is to analyse the risk level of a 20-inch natural gas transmission pipeline using semi-quantitative risk-based inspection according to API 581, considering the likelihood of failure and the consequences of failure of the equipment components, and to use the result to plan the next inspections. Nine pipeline components were observed, such as straight pipe inlets, connection tees, and straight pipe outlets. The risk assessment of the nine components is presented in a risk matrix; the components are found to be at medium risk levels. The failure mechanism considered in this research is thinning. Based on the calculated corrosion rates, the remaining age of the pipeline components can be obtained, so the remaining lifetime of each component is known; the results vary for each component. The final step is planning the inspection of the pipeline components by external NDT methods.
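
    The thinning calculation outlined above reduces to a simple corrosion rate and remaining-life estimate, sketched below. The variable names, example numbers and minimum-thickness criterion are illustrative only; API 581 prescribes more detailed procedures and risk factors.

```python
def corrosion_rate(t_previous, t_current, years_between):
    """Average wall-loss rate (mm/year) between two inspections."""
    return (t_previous - t_current) / years_between

def remaining_life(t_current, t_minimum, rate):
    """Years until the wall reaches the minimum allowable thickness."""
    if rate <= 0:
        return float("inf")
    return (t_current - t_minimum) / rate

# Example: 9.5 mm measured now, 10.2 mm five years ago, 6.0 mm minimum required.
rate = corrosion_rate(10.2, 9.5, 5.0)   # 0.14 mm/year
print(remaining_life(9.5, 6.0, rate))   # 25 years
```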

  9. Deep water construction on live oil and gas pipelines using the SmartPlug high-pressure pipeline isolation tools

    Energy Technology Data Exchange (ETDEWEB)

    Parrott, Ralph; Tveit, Edd; Sauthier, Daniel [PSI, Vancouver, BC (Canada)

    2005-07-01

    The world's first successful SmartPlug operation took place at the Dimlington process plant in the UK in 1999. Since then, the SmartPlug system has been deployed on more than 80 projects worldwide, allowing operators to perform pipeline repair work, modifications, or tie-ins with the pipeline systems full of product and at full production pressure. Mid-line applications of the SmartPlug system allow the operator to tie a new pipeline into an existing pipeline without displacing the oil or gas, eliminating the time needed to flare, depressurize, empty the line and recommission it before start-up. The first SmartPlug installations were done to isolate pig trap valves or ESD valves from the pipeline, allowing the valve to be replaced at full pipeline pressure and, in some instances, while production was flowing. Some pipelines have multiple platforms or fields tied in along the pipeline, and the SmartPlug system is frequently used to isolate a single platform to allow platform removal or riser repair without impacting the production flow in the remaining part of the pipeline system. (author)

  10. Validation of pig operations through pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Tolmasquim, Sueli Tiomno [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil); Nieckele, Angela O. [Pontificia Univ. Catolica do Rio de Janeiro, RJ (Brazil). Dept. de Engenharia Mecanica

    2005-07-01

    In the oil industry, pigging operations in pipelines have been widely applied for different purposes: pipe cleaning, inspection, liquid removal and product separation, among others. An efficient and safe pigging operation requires that a number of operational parameters, such as maximum and minimum pressures in the pipeline and pig velocity, be well evaluated during the planning stage and maintained within stipulated limits while the operation is carried out. With the objective of providing an efficient tool to assist in the control and design of pig operations through pipelines, a numerical code was developed, based on a finite difference scheme, which allows the simulation of transient two-fluid flow (liquid-liquid, gas-gas or liquid-gas) in the pipeline. Modules to automatically control process variables were included so that different strategies can be employed to reach an efficient operation. Different test cases were investigated to corroborate the robustness of the methodology. To validate the methodology, the results obtained with the code were compared with a real liquid displacement operation in a section of the OSPAR oil pipeline, belonging to PETROBRAS, with 30'' diameter and 60 km length, presenting good agreement. (author)

  11. Sinopec: Pipeline Goes Ahead

    Institute of Scientific and Technical Information of China (English)

    Xie Ye

    2002-01-01

    Asia's largest refinery, Sinopec Corp, will proceed with a 1,600-kilometre oil pipeline across southern provinces of China, although speculation continues to linger that the company will scrap the plan due to a postponement of the multi-million-dollar project.

  12. Submarine Pipeline Routing Risk Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    徐慧; 于莉; 胡云昌; 王金英

    2004-01-01

    A new method for quantitative risk analysis of submarine pipeline routing was provided, developing the study from qualitative to quantitative analysis. The characteristics of the potential risk of the submarine pipeline system were considered, and grey-mode identification theory was used. The study comprised three parts: establishing the index system for quantitative routing risk analysis, establishing the grey-mode identification model for quantitative routing risk analysis, and establishing the standard for interpreting the identification results. A computed example shows that the model can directly and concisely reflect the hazard degree of a route, and it supports future route selection.

  13. Thermal Fatigue Analysis of Takeover Pipeline

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    This article uses the finite element analysis software ANSYS to analyse the fatigue life of three-link pipelines with different angles in the first-level pipe of an experimental fast reactor. The fatigue analysis follows the startup and shutdown process, which has two load steps.

  14. Data as a Service: A Seismic Web Service Pipeline

    Science.gov (United States)

    Martinez, E.

    2016-12-01

    Publishing data as a service pipeline provides an improved, dynamic approach over static data archives. A service pipeline is a collection of micro web services that each perform a specific task and expose the results of that task. Structured request/response formats allow micro web services to be chained together into a service pipeline to provide more complex results. The U.S. Geological Survey adopted service pipelines to publish seismic hazard and design data supporting both specific and generalized audiences. The seismic web service pipeline starts at source data and exposes probabilistic and deterministic hazard curves, response spectra, risk-targeted ground motions, and seismic design provision metadata. This pipeline supports public/private organizations and individual engineers/researchers. Publishing data as a service pipeline provides a variety of benefits. Exposing the component services enables advanced users to inspect or use the data at each processing step. Exposing a composite service enables new users quick access to published data with a very low barrier to entry. Advanced users may re-use micro web services by chaining them in new ways or injecting new micro services into the pipeline. This allows the user to test hypotheses and compare their results to published results. Exposing data at each step in the pipeline enables users to review and validate the data and process more quickly and accurately. Making the source code open source, per USGS policy, further enables this transparency. Each micro service may be scaled independently of any other micro service. This ensures data remains available and timely in a cost-effective manner regardless of load. Additionally, if a new or more efficient approach to processing the data is discovered, this new approach may replace the old approach at any time, keeping the pipeline running while not affecting other micro services.
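
    The chaining idea can be sketched in a few lines: each micro web service exposes a small HTTP endpoint, and a composite result is built by feeding one response into the next request. The base URL, paths and JSON fields below are hypothetical placeholders, not the actual USGS service interface.

```python
import requests

BASE = "https://example.org/ws"   # hypothetical service root

def fetch_json(path, **params):
    """Call one micro service and return its parsed JSON payload."""
    response = requests.get(f"{BASE}/{path}", params=params, timeout=30)
    response.raise_for_status()
    return response.json()

def design_ground_motion(latitude, longitude):
    # Step 1: a hazard micro service returns hazard curves for the site.
    hazard = fetch_json("hazard-curve", lat=latitude, lon=longitude)
    # Step 2: a spectra micro service turns the curves into a response spectrum.
    spectrum = fetch_json("response-spectrum", curve_id=hazard["id"])
    # Step 3: a design micro service applies risk-targeting to the spectrum.
    return fetch_json("design-values", spectrum_id=spectrum["id"])
```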

  15. Vulnerability of pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2006-07-01

    Although pipelines may be damaged due to natural sources such as stress corrosion cracking (SCC) or hydrogen-induced cracking (HIC), most pipeline damages are a result of third-party interference, such as unauthorized construction in a right of way. Pipelines are also among the prime targets for sabotage because interruptions in energy distribution can render large segments of a population debilitated. The importance of protecting critical infrastructure was emphasized in this theme issue which disseminated information on vulnerability of pipelines due to third-party intrusions, both intentional and unintentional. It summarized the 10 presentations that were delivered at a pipelines security forum in Calgary, Alberta, addressing Canadian and U.S. government and industry approaches to oil and natural gas pipeline security. The opening keynote address remarked on the evolution of international terror networks, the targeting of the energy sector, and the terrorist threat and presence in Canada. Policies towards critical energy infrastructure protection (CIP) were then examined in light of these threats. A policy shift away from traditional defensive protective security towards an offensive intelligence-led strategy to forestall terrorist threats was advocated. Energy sector representatives agreed that Canada needs an effective national lead agency to provide threat assessments, alert notification, and coordination of information pertaining to CIP. It was agreed that early warning information must come from Canadian as well as U.S. sources in order to be pertinent. The conference session on information collection concentrated on defining what sort of threat information is needed by the energy sector, who should collect it and how should it be shared. It was emphasized that government leadership should coordinate threat reporting and disseminate information, set standards, and address the issues of terrorism risk insurance. Concern was raised about the lack of

  16. Planning of a 54 km long GRP pipeline for the brine process of the Ruedersdorf gas storage facility in Brandenburg; Planung einer 54 km langen GfK-Rohrleitung fuer den Solprozess des Brandenburger Gasspeichers Ruedersdorf

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, T. [EWE AG, Oldenburg (Germany)

    2002-07-01

    EWE Aktiengesellschaft is building a cavern storage facility in Ruedersdorf, 10 km east of Berlin, for the gas supply in Brandenburg. The cavern volume is created by a controlled solution mining process using water from the water system in the neighbourhood of Ruedersdorf. For the transportation of the brine produced during the construction of the caverns, EWE will build a pipeline from Ruedersdorf to Heckelberg with a total length of 54 km and an inside diameter of 17 inches (DN 450). Pipes and fittings are made of glassfiber reinforced epoxy (GRE). In Heckelberg the brine will be injected into porous sandstone layers at a depth of nearly 1,000 m without any danger to the environment or the groundwater. The article describes the planning of the transportation pipeline with respect to the material options, alignment, technical data and safety systems. A leak detection and monitoring system buried alongside the pipeline is an important contribution to its safety. (orig.)

  17. Forecasting and Evaluation of Gas Pipelines Geometric Forms Breach Hazard

    Science.gov (United States)

    Voronin, K. S.

    2016-10-01

    During operation, main gas pipelines are subject to permanent pressure drops, which lead to lengthening and, as a result, to instability of their position in space. In dynamic systems that have feedback, phenomena preceding emergencies should be observable. The article discusses the forced vibrations of the gas pipeline's cylindrical surface under the dynamic loads caused by pressure surges, and the process of deformation of its geometric shape. The frequency of the vibrations arising in the pipeline at the stage preceding its bending is determined. Identification of this frequency can be the basis for a method of monitoring the technical condition of the gas pipeline, and forecasting possible emergency situations allows reconstruction work on pipeline sections that may deviate from the design position to be planned and carried out in due time.

  18. Pipelined Viterbi Decoder Using FPGA

    Directory of Open Access Journals (Sweden)

    Nayel Al-Zubi

    2013-02-01

    Full Text Available Convolutional encoding is used in almost all digital communication systems to obtain a better bit error rate (BER), and all applications need a high throughput rate. The Viterbi algorithm is the standard solution for the decoding process. The nonlinear and feedback nature of the Viterbi decoder makes its high-speed implementation harder. One promising approach to achieving high throughput in the Viterbi decoder is to introduce pipelining. This work applies a carry-save technique, which has the advantage that the critical path in the ACS feedback loop becomes unidirectional and the carry ripple in the "Add" part of the ACS unit is eliminated. Simulation and implementation show how this technique improves the throughput of the Viterbi decoder. The design complexities of the bit-pipelined architecture are evaluated and demonstrated using Verilog HDL simulation, and a general software algorithm that simulates a Viterbi decoder was developed. Our research is concerned with implementation of Viterbi decoders on field programmable gate arrays (FPGAs). FPGAs are generally slower than custom integrated circuits but can be configured in the lab in a few hours, as compared to fabrication, which takes a few months. The design was implemented in Verilog HDL and synthesized for Xilinx FPGAs.
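
    The core of the decoder is the add-compare-select (ACS) recursion, sketched below as a plain software model; the carry-save and pipelining techniques discussed above concern the hardware realisation of this step, not the sketch itself.

```python
def acs_step(path_metrics, branch_metrics, predecessors):
    """One add-compare-select step of the Viterbi recursion.

    path_metrics   : dict state -> accumulated metric
    branch_metrics : dict (prev_state, next_state) -> metric of that transition
    predecessors   : dict next_state -> list of prev_states feeding it
    Returns the new path metrics and, per state, the surviving predecessor.
    """
    new_metrics, survivors = {}, {}
    for state, prevs in predecessors.items():
        candidates = [(path_metrics[p] + branch_metrics[(p, state)], p) for p in prevs]
        new_metrics[state], survivors[state] = min(candidates)   # compare-select
    return new_metrics, survivors

# Toy 2-state trellis example (metrics are arbitrary illustrative numbers).
metrics = {0: 0.0, 1: 1.0}
branches = {(0, 0): 0.2, (1, 0): 0.9, (0, 1): 1.1, (1, 1): 0.3}
preds = {0: [0, 1], 1: [0, 1]}
print(acs_step(metrics, branches, preds))
```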

  19. Distributed acoustic sensing for pipeline monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Hill, David; McEwen-King, Magnus [OptaSense, QinetiQ Ltd., London (United Kingdom)

    2009-07-01

    Optical fibre is deployed widely across the oil and gas industry. As well as being deployed regularly to provide high-bandwidth telecommunications and infrastructure for SCADA, it is increasingly being used to sense pressure, temperature and strain along buried pipelines, on subsea pipelines and downhole. In this paper we present results from the latest sensing capability using standard optical fibre to detect acoustic signals along the entire length of a pipeline. In Distributed Acoustic Sensing (DAS) an optical fibre is used for both sensing and telemetry. In this paper we present results from the OptaSense(TM) system, which has been used to detect third party intervention (TPI) along buried pipelines. In a typical deployment the system is connected to an existing standard single-mode fibre, up to 50 km in length, and is used to independently listen to the acoustic / seismic activity at every 10 metre interval. We will show that through the use of advanced array processing of the independent, simultaneously sampled channels it is possible to detect and locate activity within the vicinity of the pipeline, and through sophisticated acoustic signal processing to obtain the acoustic signature and classify the type of activity. By combining spare fibre capacity in existing buried fibre optic cables, processing and display techniques commonly found in sonar, and the state of the art in fibre-optic distributed acoustic sensing, we will describe the new monitoring capabilities that are available to the pipeline operator. Without the expense of retrofitting sensors to the pipeline, this technology can provide a high-performance, rapidly deployable and cost-effective method of providing gapless and persistent monitoring of a pipeline. We will show how this approach can be used to detect, classify and locate activity such as third party interference (including activity indicative of illegal hot tapping), real-time tracking of pigs, and leak detection. We will also show how an

  20. Optimal Design of Capsule Transporting Pipeline carrying Spherical Capsules

    Science.gov (United States)

    Asim, Taimoor; Mishra, Rakesh; Ubbi, Kuldip

    2012-05-01

    A capsule pipeline transports material or cargo in capsules propelled by fluid flowing through a pipeline. The cargo may either be contained in capsules (such as wheat enclosed inside sealed cylindrical containers), or may itself be the capsules (such as coal compressed into the shape of a cylinder or sphere). As the concept of capsule transportation is relatively new, the capsule pipelines need to be designed optimally for commercial viability. An optimal design of such a pipeline would have minimum pressure drop due to the presence of the solid medium in the pipeline, which corresponds to minimum head loss and hence minimum pumping power required to drive the capsules and the transporting fluid. The total cost for the manufacturing and maintenance of such pipelines is yet another important variable that needs to be considered for the widespread commercial acceptance of capsule transporting pipelines. To address this, the optimisation technique presented here is based on the least-cost principle. Pressure drop relationships have been incorporated to calculate the pumping requirements for the system. The maintenance and manufacturing costs have been computed separately to analyse their effects on the optimisation process. A design example has been included to show the usage of the model presented. The results indicate that for a specific throughput, there exists an optimum diameter of the pipeline for which the total cost for the piping system is at its minimum.
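
    The least-cost principle can be sketched as a sweep over candidate diameters: estimate the pumping cost from a pressure-drop model, add a capital/maintenance term, and keep the diameter with the lowest total. The single-phase Darcy-Weisbach stand-in and the cost coefficients below are illustrative assumptions, not the relationships developed in the paper.

```python
import math

def pressure_drop(flow_rate, diameter, length, density=1000.0, friction=0.02):
    """Darcy-Weisbach pressure drop (Pa) for a single-phase stand-in flow."""
    area = math.pi * diameter ** 2 / 4.0
    velocity = flow_rate / area
    return friction * (length / diameter) * 0.5 * density * velocity ** 2

def total_annual_cost(diameter, flow_rate, length,
                      energy_price=0.1, hours=8000.0, pump_eff=0.7,
                      capital_per_m3=500.0):
    """Annual pumping cost plus a crude capital/maintenance term (illustrative)."""
    dp = pressure_drop(flow_rate, diameter, length)
    pumping_kw = dp * flow_rate / pump_eff / 1000.0
    pumping_cost = pumping_kw * hours * energy_price
    capital_cost = capital_per_m3 * (math.pi * diameter ** 2 / 4.0) * length
    return pumping_cost + capital_cost

# Sweep candidate diameters and keep the cheapest for a given throughput.
candidates = [d / 100.0 for d in range(10, 101, 5)]   # 0.10 m ... 1.00 m
best = min(candidates,
           key=lambda d: total_annual_cost(d, flow_rate=0.2, length=10_000.0))
print(f"optimum diameter ~ {best:.2f} m")
```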

  1. Instrumented Pipeline Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Thomas Piro; Michael Ream

    2010-07-31

    This report summarizes technical progress achieved during the cooperative agreement between Concurrent Technologies Corporation (CTC) and the U.S. Department of Energy to address the need for a low-cost monitoring and inspection sensor system, as identified in the Department of Energy (DOE) National Gas Infrastructure Research & Development (R&D) Delivery Reliability Program Roadmap. The Instrumented Pipeline Initiative (IPI) achieved the objective by researching technologies for the monitoring of pipeline delivery integrity, through a ubiquitous network of sensors and controllers to detect and diagnose incipient defects, leaks, and failures. This report is organized by tasks as detailed in the Statement of Project Objectives (SOPO). The sections all state the objective and approach before detailing the results of the work.

  2. Pipeline ADC Design Methodology

    OpenAIRE

    Zhao, Hui

    2012-01-01

    Demand for high-performance analog-to-digital converter (ADC) integrated circuits (ICs) with optimal combined specifications of resolution, sampling rate and power consumption has become dominant due to emerging applications in wireless communications, broadband transceivers, digital intermediate-frequency (IF) receivers and countless digital devices. This research is dedicated to developing a pipeline ADC design methodology with minimum power dissipation, while keeping relatively high speed an...

  3. Pipeline ADC Design Methodology

    OpenAIRE

    Zhao, Hui

    2012-01-01

    Demand for high-performance analog-to-digital converter (ADC) integrated circuits (ICs) with optimal combined specifications of resolution, sampling rate and power consumption has become dominant due to emerging applications in wireless communications, broadband transceivers, digital intermediate-frequency (IF) receivers and countless digital devices. This research is dedicated to developing a pipeline ADC design methodology with minimum power dissipation, while keeping relatively high speed an...

  4. The MIS Pipeline Toolkit

    Science.gov (United States)

    Teuben, Peter J.; Pound, M. W.; Storm, S.; Mundy, L. G.; Salter, D. M.; Lee, K.; Kwon, W.; Fernandez Lopez, M.; Plunkett, A.

    2013-01-01

    A pipeline toolkit was developed to help organize, reduce and analyze a large number of near-identical datasets. This is a very general problem, for which many different solutions have been implemented. In this poster we present one such solution that lends itself to users of the Unix command line, using the Unix "make" utility, and adapts itself easily to observational as well as theoretical projects. Two examples are given, one from the CARMA CLASSy survey, and another from a simulated kinematic survey of early galaxy-forming disks. The CLASSy survey (discussed in more detail in three accompanying posters) consists of 5 different star-forming regions, observed with CARMA, each containing roughly 10-20 datasets in continuum and 3 different molecular lines, which need to be combined into final data cubes and maps. The strength of such a pipeline toolkit shows itself as new data are accumulated and the data reduction steps are improved and easily re-applied to previously taken data. For this we employed a master script that was run nightly, and collaborators submitted improved scripts and/or pipeline parameters that control these scripts. MIS is freely available for download.

  5. The inverse electroencephalography pipeline

    Science.gov (United States)

    Weinstein, David Michael

    The inverse electroencephalography (EEG) problem is defined as determining which regions of the brain are active based on remote measurements recorded with scalp EEG electrodes. An accurate solution to this problem would benefit both fundamental neuroscience research and clinical neuroscience applications. However, constructing accurate patient-specific inverse EEG solutions requires complex modeling, simulation, and visualization algorithms, and to date only a few systems have been developed that provide such capabilities. In this dissertation, a computational system for generating and investigating patient-specific inverse EEG solutions is introduced, and the requirements for each stage of this Inverse EEG Pipeline are defined and discussed. While the requirements of many of the stages are satisfied with existing algorithms, others have motivated research into novel modeling and simulation methods. The principal technical results of this work include novel surface-based volume modeling techniques, an efficient construction for the EEG lead field, and the Open Source release of the Inverse EEG Pipeline software for use by the bioelectric field research community. In this work, the Inverse EEG Pipeline is applied to three research problems in neurology: comparing focal and distributed source imaging algorithms; separating measurements into independent activation components for multifocal epilepsy; and localizing the cortical activity that produces the P300 effect in schizophrenia.
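
    A minimal sketch of the standard formulation behind such a pipeline (generic notation, not specific to this dissertation): the scalp potentials are related to the source distribution through a lead-field matrix computed from the head model, and an inverse solution can be obtained, for example, as a regularised least-squares fit.

```latex
% Forward model and a regularised (minimum-norm style) inverse solution:
\[
\phi = \mathbf{L}\,\mathbf{j} + \mathbf{n}, \qquad
\hat{\mathbf{j}} = \arg\min_{\mathbf{j}}
  \left\lVert \phi - \mathbf{L}\,\mathbf{j} \right\rVert_2^2
  + \lambda \left\lVert \mathbf{j} \right\rVert_2^2
\]
```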

  6. Pipeline integrity : control by coatings

    Energy Technology Data Exchange (ETDEWEB)

    Khanna, A.S. [Indian Inst. of Technology, Bombay (India)

    2008-07-01

    This presentation provided background information on the history of cross-country pipelines in India. It discussed the major use of gas. The key users were described as being the power and fertilizer industries, followed by vehicles using compressed natural gas to replace liquid fuels and thereby reduce pollution. The presentation also addressed the integrity of pipelines in terms of high production, safety, and monitoring. Integrity issues of pipelines were discussed with reference to basic design, control of corrosion, and periodic health monitoring. Other topics that were outlined included integrity by corrosion control; integrity by health monitoring; coatings requirements; classification of UCC pipeline coatings; and how the pipeline integrity approach can help to achieve coatings which give design life without any failure. Surface cleanliness, coating conditions, and the relationship between temperature of Epoxy coating and the time of adhesive coating were also discussed. Last, the presentation provided the results of an audit of the HBJ pipeline conducted from 1999 to 2000. tabs., figs.

  7. Transient flow processes in pipelines. Visualisation and calculation of cavitation in pipeline systems behind fast-closing control valves; Transiente Stroemungsvorgaenge in Rohrleitungen. Visualisierung und Berechnung von Kavitation in Rohrleitungssystemen hinter schnellschliessenden Regelklappen

    Energy Technology Data Exchange (ETDEWEB)

    Dudlik, A.; Schlueter, S. [Fraunhofer-Institut fuer Umwelt-, Sicherheits- und Energietechnik UMSICHT, Oberhausen (Germany); Prasser, H.M. [Forschungszentrum Rossendorf e.V. (FZR), Dresden (Germany)

    1997-12-01

    In this article, experimental results of the ongoing BMBF research project are presented. The transient pressure curves following the fast closure of a control valve are measured and compared with the predictions of the project software; the calculated pressure peaks lie within the range of the measured values. Flow visualisation shows that the computational model of concentrated vapour cavitation is applicable when the method of characteristics is used. The pressure reduction in the cavitating system leads to the outgassing of air, which remains as a dispersed phase during condensation and damps the condensation shocks. For long periods with system pressures below the saturation partial pressures of the dissolved gases, a reduced wave propagation speed must be expected. (orig.)
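
    The method of characteristics mentioned above can be illustrated with a minimal single-phase sketch for a reservoir-pipe-valve system after instantaneous valve closure (constant wave speed, no cavitation or gas release; all parameter values are illustrative assumptions, not the project's model).

```python
import math

# Pipe and fluid parameters (illustrative values).
L, D, f = 1000.0, 0.3, 0.02          # length (m), diameter (m), friction factor
a, g = 1200.0, 9.81                  # wave speed (m/s), gravity (m/s^2)
H0, Q0 = 50.0, 0.1                   # reservoir head (m), initial flow (m^3/s)
N = 20                               # number of reaches
A = math.pi * D ** 2 / 4.0
dx, dt = L / N, (L / N) / a
B = a / (g * A)
R = f * dx / (2.0 * g * D * A ** 2)

# Initial steady state: uniform flow, head falling linearly due to friction.
Q = [Q0] * (N + 1)
H = [H0 - R * Q0 * abs(Q0) * i for i in range(N + 1)]

for step in range(200):                      # march in time
    Hn, Qn = H[:], Q[:]
    for i in range(1, N):                    # interior nodes: C+ and C- meet
        Cp = H[i - 1] + B * Q[i - 1] - R * Q[i - 1] * abs(Q[i - 1])
        Cm = H[i + 1] - B * Q[i + 1] + R * Q[i + 1] * abs(Q[i + 1])
        Hn[i] = 0.5 * (Cp + Cm)
        Qn[i] = (Cp - Cm) / (2.0 * B)
    # Upstream boundary: constant-head reservoir.
    Cm = H[1] - B * Q[1] + R * Q[1] * abs(Q[1])
    Hn[0], Qn[0] = H0, (H0 - Cm) / B
    # Downstream boundary: valve closed instantaneously at t = 0.
    Cp = H[N - 1] + B * Q[N - 1] - R * Q[N - 1] * abs(Q[N - 1])
    Hn[N], Qn[N] = Cp, 0.0
    H, Q = Hn, Qn

print(f"head at the valve after {200 * dt:.2f} s: {H[N]:.1f} m")
```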

  8. [Automatic analysis pipeline of next-generation sequencing data].

    Science.gov (United States)

    Wenke, Li; Fengyu, Li; Siyao, Zhang; Bin, Cai; Na, Zheng; Yu, Nie; Dao, Zhou; Qian, Zhao

    2014-06-01

    The development of next-generation sequencing has generated high demand for data processing and analysis. Although there are many software packages for analyzing next-generation sequencing data, most of them are designed for one specific function (e.g., alignment, variant calling or annotation). Therefore, it is necessary to combine them for data analysis and to generate interpretable results for biologists. This study designed a pipeline to process Illumina sequencing data based on the Perl programming language and the SGE system. The pipeline takes original sequence data (fastq format) as input, calls the standard data processing software (e.g., BWA, Samtools, GATK, and Annovar), and finally outputs a list of annotated variants that researchers can further analyze. The pipeline simplifies the manual operation and improves efficiency by automation and parallel computation. Users can easily run the pipeline by editing the configuration file or clicking the graphical interface. Our work will facilitate research projects using sequencing technology.
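
    The chaining of such command-line tools can be sketched from Python as below. The file names are placeholders, and the exact flags vary between tool versions (the GATK call assumes a GATK4-style invocation), so they should be checked against current documentation rather than read as the pipeline's actual commands.

```python
import subprocess

def run(cmd):
    """Run one pipeline step, echoing the command for traceability."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

reference, reads, sample = "ref.fa", "sample.fastq", "sample"

# Alignment, sorting and indexing (BWA + Samtools).
with open(f"{sample}.sam", "w") as sam:
    print("+", f"bwa mem {reference} {reads} > {sample}.sam")
    subprocess.run(["bwa", "mem", reference, reads], check=True, stdout=sam)
run(["samtools", "sort", "-o", f"{sample}.sorted.bam", f"{sample}.sam"])
run(["samtools", "index", f"{sample}.sorted.bam"])

# Variant calling (GATK4-style flags assumed).
run(["gatk", "HaplotypeCaller", "-R", reference,
     "-I", f"{sample}.sorted.bam", "-O", f"{sample}.vcf"])

# Annotation would follow here (e.g. Annovar's table_annovar.pl).
```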

  9. Remote monitoring of pipeline operations

    Energy Technology Data Exchange (ETDEWEB)

    Bost, R.C. [ERM-Southwest, Inc., Houston, TX (United States); White, D. [Glenrose Systems, Austin, TX (United States)

    1995-12-31

    The demands for monitoring of pipeline operations have recently increased greatly due to new regulatory requirements. Most companies rely upon conventional Supervisory Control and Data Acquisition (SCADA) system architecture to meet their needs. Current systems are often plagued by limited data conversion and processing capacity at the workstations. A state-of-the-art Data Acquisition Node (DAN) that relieves the workstation of much of its workload is described in this paper. Use of this DAN may eliminate the need for installing completely new systems. It facilitates marrying foreign devices to existing operation monitoring systems to satisfy new regulatory requirements. The DAN allows a system to utilize commercial communications satellites or other communication networks, real-time object-oriented programming, and different devices and data requirements without the need for custom software development.

  10. Innovative Electromagnetic Sensors for Pipeline Crawlers

    Energy Technology Data Exchange (ETDEWEB)

    J. Bruce Nestleroth

    2006-05-04

    An earlier semiannual report on this project presented experimental and modeling results. The results showed that the rotating system was more adaptable to pipeline inspection, and therefore only this system will be carried into the second year of the sensor development. In the third reporting period, the rotating-system inspection was further developed. Since this is a new inspection modality without published fundamentals to build upon, basic analytical and experimental investigations were performed. A closed-form equation for designing rotating exciters and positioning sensors was derived from fundamental principles. Signal processing methods were also investigated for detection and assessment of pipeline anomalies; a lock-in amplifier approach was chosen as the method for detecting the signals. Finally, mechanical implementations for passing tight restrictions such as plug valves were investigated. This inspection concept is new and unique; a United States patent application has been submitted. In this reporting period, a general design of the rotating permanent magnet inspection system is presented. The rotating permanent magnet inspection system is feasible for pipes ranging in diameter from 8 to 18 inches using a two-pole configuration. Experimental results and theoretical calculations provide the basis for selection of the critical design parameters. The parameters include a significant magnet-to-pipe separation that will facilitate the passage of pipeline features. With the basic values of critical components established, the next step is a detailed mechanical design of a pipeline-ready inspection system.
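
    The lock-in detection mentioned above can be illustrated numerically: multiply the measured signal by in-phase and quadrature references at the excitation frequency and average (low-pass) the products to recover amplitude and phase. The values below are synthetic and purely illustrative.

```python
import numpy as np

fs, f_ref = 10_000.0, 50.0                 # sample rate and excitation frequency (Hz)
t = np.arange(0.0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)

# Synthetic sensor trace: weak component at the excitation frequency plus noise.
signal = 0.05 * np.sin(2 * np.pi * f_ref * t + 0.6) + rng.normal(0.0, 0.2, t.size)

# Lock-in: mix with in-phase and quadrature references, then average (low-pass).
i_component = np.mean(signal * np.sin(2 * np.pi * f_ref * t))
q_component = np.mean(signal * np.cos(2 * np.pi * f_ref * t))

amplitude = 2.0 * np.hypot(i_component, q_component)   # recovers ~0.05
phase = np.arctan2(q_component, i_component)            # recovers ~0.6 rad
print(f"recovered amplitude ~ {amplitude:.3f}, phase ~ {phase:.2f} rad")
```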

  11. Kepler Science Operations Center Pipeline Framework

    Science.gov (United States)

    Klaus, Todd C.; McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Middour, Christopher; Caldwell, Douglas A.; Jenkins, Jon M.

    2010-01-01

    The Kepler mission is designed to continuously monitor up to 170,000 stars at a 30 minute cadence for 3.5 years searching for Earth-size planets. The data are processed at the Science Operations Center (SOC) at NASA Ames Research Center. Because of the large volume of data and the memory and CPU-intensive nature of the analysis, significant computing hardware is required. We have developed generic pipeline framework software that is used to distribute and synchronize the processing across a cluster of CPUs and to manage the resulting products. The framework is written in Java and is therefore platform-independent, and scales from a single, standalone workstation (for development and research on small data sets) to a full cluster of homogeneous or heterogeneous hardware with minimal configuration changes. A plug-in architecture provides customized control of the unit of work without the need to modify the framework itself. Distributed transaction services provide for atomic storage of pipeline products for a unit of work across a relational database and the custom Kepler DB. Generic parameter management and data accountability services are provided to record the parameter values, software versions, and other meta-data used for each pipeline execution. A graphical console allows for the configuration, execution, and monitoring of pipelines. An alert and metrics subsystem is used to monitor the health and performance of the pipeline. The framework was developed for the Kepler project based on Kepler requirements, but the framework itself is generic and could be used for a variety of applications where these features are needed.
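
    The plug-in idea can be sketched as a generic unit-of-work interface that the framework runs without knowing the modules' internals. The sketch below is a generic illustration in Python, not the Kepler SOC's Java implementation.

```python
from abc import ABC, abstractmethod

class PipelineModule(ABC):
    """Generic unit of work; concrete modules are the plug-ins."""

    @abstractmethod
    def process(self, unit_of_work: dict) -> dict:
        ...

class CalibrateModule(PipelineModule):
    def process(self, unit_of_work):
        unit_of_work["calibrated"] = True          # placeholder step
        return unit_of_work

class SearchModule(PipelineModule):
    def process(self, unit_of_work):
        unit_of_work["candidates"] = ["target-001"]  # placeholder result
        return unit_of_work

def run_pipeline(modules, unit_of_work):
    """The framework: run each plug-in in order and record provenance."""
    for module in modules:
        unit_of_work = module.process(unit_of_work)
        unit_of_work.setdefault("history", []).append(type(module).__name__)
    return unit_of_work

print(run_pipeline([CalibrateModule(), SearchModule()], {"star_id": 12345}))
```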

  12. Leak detection in pipelines using cepstrum analysis

    Science.gov (United States)

    Taghvaei, M.; Beck, S. B. M.; Staszewski, W. J.

    2006-02-01

    The detection and location of leaks in pipeline networks is a major problem and the reduction of these leaks has become a major priority for pipeline authorities around the world. Although the reasons for these leaks are well known, some of the current methods for locating and identifying them are either complicated or imprecise; most of them are time consuming. The work described here shows that cepstrum analysis is a viable approach to leak detection and location in pipeline networks. The method uses pressure waves caused by quickly opening and closing a solenoid valve. Due to their simplicity and robustness, transient analyses provide a plausible route towards leak detection. For this work, the time domain signals of these pressure transients were obtained using a single pressure transducer. These pressure signals were first filtered using discrete wavelets to remove the dc offset, and the low and high frequencies. They were then analysed using a cepstrum method which identified the time delay between the initial wave and its reflections. There were some features in the processed results which can be ascribed to features in the pipeline network such as junctions and pipe ends. When holes were drilled in the pipe, new peaks occurred which identified the presence of a leak in the pipeline network. When tested with holes of different sizes, the amplitude of the processed peak was seen to increase as the cube root of the leak diameter. Using this method, it is possible to identify leaks that are difficult to find by other methods as they are small in comparison with the flow through the pipe.
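
    The cepstrum step can be sketched with a synthetic signal: the real cepstrum of a pressure trace containing a delayed reflection shows a peak at the delay time, and multiplying that delay by the wave speed gives twice the distance to the reflecting feature. The signal parameters and wave speed below are illustrative assumptions.

```python
import numpy as np

fs = 2000.0                                   # sample rate (Hz)
t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)

# Synthetic transient: a short pressure pulse plus a weaker echo 0.25 s later.
pulse = np.exp(-((t - 0.10) / 0.002) ** 2)
echo = 0.4 * np.exp(-((t - 0.35) / 0.002) ** 2)
signal = pulse + echo + rng.normal(0.0, 0.001, t.size)

# Real cepstrum: inverse FFT of the log magnitude spectrum.
spectrum = np.fft.rfft(signal)
cepstrum = np.fft.irfft(np.log(np.abs(spectrum) + 1e-12))

# The echo appears as a peak at its delay ("quefrency"); skip the first samples.
quefrency = np.arange(cepstrum.size) / fs
peak = np.argmax(cepstrum[100:cepstrum.size // 2]) + 100
delay = quefrency[peak]
wave_speed = 1200.0                           # m/s, typical of water in a pipe
print(f"echo delay ~ {delay:.3f} s -> reflector ~ {wave_speed * delay / 2:.0f} m away")
```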

  13. INTERNAL REPAIR OF PIPELINES

    Energy Technology Data Exchange (ETDEWEB)

    Bill Bruce; Nancy Porter; George Ritter; Matt Boring; Mark Lozev; Ian Harris; Bill Mohr; Dennis Harwig; Robin Gordon; Chris Neary; Mike Sullivan

    2005-07-20

    The two broad categories of fiber-reinforced composite liner repair and deposited weld metal repair technologies were reviewed and evaluated for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Principal conclusions from a survey of natural gas transmission industry pipeline operators can be summarized in terms of the following performance requirements for internal repair: (1) Use of internal repair is most attractive for river crossings, under other bodies of water, in difficult soil conditions, under highways, under congested intersections, and under railway crossings. (2) Internal pipe repair offers a strong potential advantage over the high cost of horizontal directional drilling when a new bore must be created to solve a leak or other problem. (3) Typical travel distances can be divided into three distinct groups: up to 305 m (1,000 ft.); between 305 m and 610 m (1,000 ft. and 2,000 ft.); and beyond 914 m (3,000 ft.). All three groups require pig-based systems. A despooled umbilical system would suffice for the first two groups, which represent 81% of survey respondents. The third group would require an onboard self-contained power unit for propulsion and welding/liner repair energy needs. (4) The most common size range for 80% to 90% of operators surveyed is 508 mm (20 in.) to 762 mm (30 in.), with 95% using 558.8 mm (22 in.) pipe. Evaluation trials were conducted on pipe sections with simulated corrosion damage repaired with glass fiber-reinforced composite liners, carbon fiber-reinforced composite liners, and weld deposition. Additional un-repaired pipe sections were evaluated in the virgin condition and with simulated damage. Hydrostatic failure pressures for pipe sections repaired with glass fiber-reinforced composite liner were only marginally greater than that of pipe sections without

  14. Intermediate Palomar Transient Factory: Realtime Image Subtraction Pipeline

    CERN Document Server

    Cao, Yi; Kasliwal, Mansi M

    2016-01-01

    A fast-turnaround pipeline for realtime data reduction plays an essential role in discovering, and permitting follow-up observations of, young supernovae and fast-evolving transients in modern time-domain surveys. In this paper, we present the realtime image subtraction pipeline of the intermediate Palomar Transient Factory. By using high-performance computing, an efficient database, and machine learning algorithms, this pipeline manages to reliably deliver transient candidates within ten minutes of images being taken. Our experience in using high-performance computing resources to process big data in astronomy serves as a trailblazer for dealing with data from large-scale time-domain facilities in the near future.

  15. Automated HST/STIS reference file generation pipeline using OPUS

    Science.gov (United States)

    Diaz, Rosa I.; Swam, Michel; Goudfrooij, Paul

    2008-08-01

    Bias and Dark reference files are part of the basic reduction of the CCD data taken by the Space Telescope Imaging Spectrograph (STIS) aboard the Hubble Space Telescope (HST). At STScI, the STIS team has been creating these reference files using the Bias and Dark Pipeline. This pipeline system starts with automatic retrieval of bias and dark exposures from the HST archive after they have been ingested. After data retrieval, a number of automatic scripts are executed in a manner compatible with the OPUS pipeline architecture. We encourage any group looking to streamline a stepwise calibration process to look into this software.

  16. Kvitebjoern gas pipeline repair - baptism of remote pipeline repair system

    Energy Technology Data Exchange (ETDEWEB)

    Gjertveit, Erling

    2010-07-01

    On the 1st of November 2007, severe anchor damage was discovered on the 30-inch Kvitebjoern gas export pipeline. The damage constituted a localised dent and a 17° buckle, but no leakage. Statoil has invested in building an effective repair contingency structure for the large pipeline network on the Norwegian continental shelf, with particular focus on the large gas export pipelines. The repair of the Kvitebjoern pipeline was remotely operated, using two Morgrip couplings and a spool. The installation used the purpose-built Pipeline Repair System stored at Killingoey and couplings produced and tested back in 2005. This presentation will cover the initial damage investigations, the temporary operational phase, the repair preparations, the actual repair and lessons learned. (Author)

  17. Early generation pipeline girth welding practices and their implications for integrity management of North American pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Amend, Bill [DNV Columbus, Inc., Yorba Linda, CA (United States)

    2010-07-01

    In response to the interest in ensuring the continued safe operation of vintage pipelines and the integrity management challenges related to those pipelines, PRCI sponsored in 2009 a project called "Vintage Girth Weld Defect Assessment - Comprehensive Study". Its objectives focused on girth welds made with the shielded metal arc welding (SMAW) process, particularly with regard to: review of approaches for evaluating the integrity of these welds; description of the typical characteristics and properties of SMAW vintage welds; determination of gaps in available information and technology that hinder effective integrity assessment and management of vintage girth welds. A very extensive literature review was performed as part of this project. Key findings include the following. The failure rate of early generation girth welds is low, especially when considering the rate of catastrophic failures. Pipeline girth welds are unlikely to fail unless subjected to axial strains that far exceed the strains related to internal pressure alone.

  18. SERPent: Scripted E-merlin Rfi-mitigation PipelinE for iNTerferometry

    Science.gov (United States)

    Peck, Luke W.; Fenech, Danielle M.

    2013-12-01

    SERPent is an automated reduction and RFI-mitigation procedure that uses the SumThreshold methodology. It was originally developed for the LOFAR pipeline. SERPent is written in Parseltongue, enabling interaction with the Astronomical Image Processing Software (AIPS) program. Moreover, SERPent is a simple "out of the box" Python script, which is easy to set up and is free of compilers.
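
    The SumThreshold idea can be sketched in one dimension: progressively longer windows are tested against progressively lower per-sample thresholds, so faint but extended interference that survives a single-sample cut is still caught. The sketch below is a simplified illustration; the actual implementation operates on time-frequency data and includes further refinements.

```python
import numpy as np

def sum_threshold(data, base_threshold, windows=(1, 2, 4, 8, 16), rho=1.5):
    """Simplified 1-D SumThreshold: flag runs of samples whose sum exceeds a
    threshold that decreases with window length (Offringa-style)."""
    flags = np.zeros(data.size, dtype=bool)
    for m in windows:
        threshold = base_threshold / rho ** np.log2(m)
        # Replace already-flagged samples by the threshold so they neither
        # re-trigger nor mask neighbouring interference.
        work = np.where(flags, threshold, data)
        for start in range(data.size - m + 1):
            if work[start:start + m].sum() > m * threshold:
                flags[start:start + m] = True
    return flags

# Toy example: Gaussian noise with a faint but extended interference burst.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 512)
x[200:260] += 2.5                          # broad, low-level RFI burst
print(sum_threshold(x, base_threshold=6.0).nonzero()[0])
```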

  19. Transportation of coal by pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Round, G.F.

    1982-01-01

    Discusses Canada's coal resources, technology of long distance coal slurry pipelines, existing and planned coal slurry pipelines, their economics, liquid carbon dioxide, methanol and crude oil instead of water as carrier fluid, and coal slurry research in Canada.

  20. CRISPRED: A data pipeline for the CRISP imaging spectropolarimeter

    CERN Document Server

    Rodríguez, J de la Cruz; Sütterlin, P; Hillberg, T; van der Voort, L Rouppe

    2014-01-01

    The production of science-ready data from major solar telescopes requires expertise beyond that of the typical observer. This is a consequence of the increasing complexity of instruments and observing sequences, which require calibrations and corrections for instrumental and seeing effects that are not only difficult to measure, but are also coupled in ways that require careful analysis in the design of the correction procedures. Modern space-based telescopes have data-processing pipelines capable of routinely producing well-characterized data products. High-resolution imaging spectropolarimeters at ground-based telescopes need similar data pipelines. The purpose of this paper is to document a procedure that forms the basis of current state of the art processing of data from the CRISP imaging spectropolarimeter at the Swedish 1-m Solar Telescope (SST). By collecting, implementing, and testing a suite of computer programs, we have defined a data reduction pipeline for this instrument. This pipeline, CRISPRED, ...

  1. PLUGGING AND UNPLUGGING OF WASTE TRANSFER PIPELINES

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Ebadian, Ph.D.

    1999-01-01

    This project, which began in FY97, involves both the flow loop research on plugging and unplugging of waste transfer pipelines, and the large-scale industrial equipment test of plugging locating and unplugging technologies. In FY98, the related work was performed under the project name "Mixing, Settling, and Pipe Unplugging of Waste Transfer Lines." The mixing, settling, and pipeline plugging and unplugging are critical to the design and maintenance of a waste transfer pipeline system, especially for the High-Level Waste (HLW) pipeline transfer. The major objective of this work is to recreate pipeline plugging conditions for equipment testing of plug locating and removal and to provide systematic operating data for modification of equipment design and enhancement of performance of waste transfer lines used at DOE sites. As the waste tank clean-out and decommissioning program becomes active at the DOE sites, there is an increasing potential that the waste slurry transfer lines will become plugged and unable to transport waste slurry from one tank to another or from the mixing tank to processing facilities. Transfer systems may potentially become plugged if the solids concentration of the material being transferred increases beyond the capability of the prime mover or if upstream mixing is inadequately performed. Plugging can occur due to the solids' settling in either the mixing tank, the pumping system, or the transfer lines. In order to enhance and optimize the slurry's removal and transfer, refined and reliable data on the mixing, sampling, and pipe unplugging systems must be obtained based on both laboratory-scale and simulated in-situ operating conditions.

  2. Energy cost reduction in oil pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Limeira, Fabio Machado; Correa, Joao Luiz Lavoura; Costa, Luciano Macedo Josino da; Silva, Jose Luiz da; Henriques, Fausto Metzger Pessanha [Petrobras Transporte S.A. (TRANSPETRO), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    One of the key questions of modern society concerns the rational use of the planet's natural resources and energy. Due to a lack of energy, many companies are forced to reduce their workload, especially during peak hours, when residential demand reaches its maximum and there is not enough energy to fulfil the needs of all users, which affects major industries. Therefore, using energy more wisely has become a strategic issue for any company, due to the limited supply and also to the excessive cost it represents. With the objective of saving energy and reducing costs for oil pipelines, it has been identified that the increase in energy consumption is primarily related to pumping stations and also to the fact that many facilities are operated differently from what was originally designed. Recognising this opportunity to optimise the process, this article examines the possibility of gains by evaluating alternatives regarding changes in the pump scheme configuration and the non-use of pump stations at peak hours. Initially, an oil pipeline with potential to reduce energy costs was chosen, followed by an analysis of its operating history in order to confirm whether there was sufficient room to change the operation mode. After confirming the pipeline choice, the system is briefly described and the literature is reviewed, explaining how the energy cost is calculated and the main characteristics of pumping systems in series and in parallel. Next, technically feasible alternatives are studied for operating the pipeline and also for negotiating the energy demand contract. Finally, costs are calculated to identify the most economical alternative, both for a scenario with no increase in the actual transported volume of the pipeline and for another scenario that considers an increase of about 20%. The conclusion of this study indicates that the chosen pipeline can achieve a reduction in energy costs of up to 25% without the need for investments in new

  3. Diverless pipeline repair system for deep water

    Energy Technology Data Exchange (ETDEWEB)

    Spinelli, Carlo M. [Eni Gas and Power, Milan (Italy); Fabbri, Sergio; Bachetta, Giuseppe [Saipem/SES, Venice (Italy)

    2009-07-01

    SiRCoS (Sistema Riparazione Condotte Sottomarine) is a diverless pipeline repair system composed of a suite of tools to perform reliable subsea pipeline repair interventions in deep and ultra-deep water, built on the long-standing experience of Eni and Saipem in designing, laying and operating deep water pipelines. The key element of SiRCoS is a Connection System comprising two end connectors and a repair spool piece to replace a damaged pipeline section. A Repair Clamp with elastomeric seals is also available for local pipe damage. The Connection System is based on a pipe cold forging process, consisting of swaging the pipe inside connectors with a suitable profile by using high-pressure seawater. Three swaging operations have to be performed to replace the damaged pipe length. This technology has been developed through extensive theoretical work and laboratory testing, ending in a Type Approval by DNV over pipe sizes ranging from 20 inches to 48 inches OD. A complete SiRCoS system has been realised for the Green Stream pipeline, thoroughly tested in the workshop as well as in shallow water, and is now ready in the event of an emergency situation. The key functional requirements for the system are diverless repair intervention and full piggability after repair. Eni owns this technology, which is now available to other operators under a Repair Club arrangement providing stand-by repair services carried out by Saipem Energy Services. The paper gives a description of the main features of the Repair System as well as an insight into the technological developments on pipe cold forging reliability and long-term duration evaluation. (author)

  4. The e-MERLIN Data Reduction Pipeline

    CERN Document Server

    Argo, Megan

    2015-01-01

    Written in Python and utilising ParselTongue to interface with the Astronomical Image Processing System (AIPS), the e-MERLIN data reduction pipeline is intended to automate the procedures required in processing and calibrating radio astronomy data from the e-MERLIN correlator. Driven by a plain text file of input parameters, the pipeline is modular and can be run in stages by the user, depending on requirements. The software includes options to load raw data, average in time and/or frequency, flag known sources of interference, flag more comprehensively with SERPent, carry out some or all of the calibration procedures (including self-calibration), and image in either normal or wide-field mode. It also optionally produces a number of useful diagnostic plots at various stages so that the quality of the data can be assessed. The software is available for download from the e-MERLIN website or via Github.

  5. The e-MERLIN Data Reduction Pipeline

    Directory of Open Access Journals (Sweden)

    Megan Kirsty Argo

    2015-01-01

    Full Text Available Written in Python and utilising ParselTongue to interface with the Astronomical Image Processing System (AIPS), the e-MERLIN data reduction pipeline is intended to automate the procedures required in processing and calibrating radio astronomy data from the e-MERLIN correlator. Driven by a plain text file of input parameters, the pipeline is modular and can be run in stages by the user, depending on requirements. The software includes options to load raw data, average in time and/or frequency, flag known sources of interference, flag more comprehensively with SERPent, carry out some or all of the calibration procedures (including self-calibration), and image in either normal or wide-field mode. It also optionally produces a number of useful diagnostic plots at various stages so that the quality of the data can be assessed. The software is available for download from the e-MERLIN website or via Github.

  6. Proof of pipeline strength based on measurements of inspection pigs; Festigkeitsnachweis von Pipelines aufgrund der Messergebnisse von Pruefmolchen

    Energy Technology Data Exchange (ETDEWEB)

    De la Camp, H.J.; Feser, G.; Hofmann, A.; Wolf, B.; Schmidt, H. [TUeV Sueddeutschland Bau und Betrieb GmbH, Muenchen (Germany); Herforth, H.E.; Juengling, K.H.; Schmidt, W. [TUeV Anlagentechnik GmbH, Berlin-Schoeneberg (Germany). Unternehmensgruppe TUeV Rheinland/Berlin-Brandenburg

    2002-01-01

    The report is aimed at collecting and documenting the state of the art and the extensive know-how of experts and pipeline operators with regard to judging the structural integrity of pipelines. In order to assess the actual mechanical strength of pipelines based on measurement results obtained by inspection pigs, guidance is given for future processing, which eventually can be used as the basis for an industry standard. A literature study of the commercially available types of inspection pigs describes and synoptically lists their respective pros and cons. In essence, besides checklists of operating data for the pipeline and the pig runs, the report mainly comprises the evaluation of defects and the corresponding calculation procedures. Recommendations regarding maintenance planning, verification of defects and repetition of pig runs are included. (orig.)

  7. Maglev crude oil pipeline

    Science.gov (United States)

    Knolle, Ernst G.

    1994-05-01

    This maglev crude oil pipeline consists of two conduits guiding an endless stream of long containers. One conduit carries loaded containers and the other empty returns. The containers are levitated by permanent magnets in repulsion and propelled by stationary linear induction motors. The containers are linked to each other in a manner that allows them, while in continuous motion, to be folded into side by side position at loading and unloading points. This folding causes a speed reduction in proportion to the ratio of container diameter to container length. While in side by side position, containers are opened at their ends to be filled or emptied. Container size and speed are selected to produce a desired carrying capacity.
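
    A rough, back-of-the-envelope capacity estimate follows from the container geometry and line speed. The sketch below uses invented dimensions, not figures from this record, to show how container size and speed combine into a carrying capacity and how the diameter-to-length ratio sets the reduced speed at the folded loading points:

```python
import math

# Illustrative, assumed values -- not taken from the original design.
container_length = 12.0      # m
container_diameter = 0.9     # m (payload diameter)
line_speed = 15.0            # m/s in the straight conduit

# Volume carried per container and containers passing a point each second.
container_volume = math.pi * (container_diameter / 2) ** 2 * container_length
containers_per_second = line_speed / container_length

throughput_m3_per_day = container_volume * containers_per_second * 86400
print(f"throughput ~ {throughput_m3_per_day:,.0f} m^3/day")

# While folded side by side for loading, the speed drops roughly in proportion
# to the diameter-to-length ratio of the containers.
folded_speed = line_speed * container_diameter / container_length
print(f"speed at loading/unloading points ~ {folded_speed:.2f} m/s")
```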

  8. Northern pipelines : challenges and needs

    Energy Technology Data Exchange (ETDEWEB)

    Dean, D.; Brownie, D. [ProLog Canada Inc., Calgary, AB (Canada); Fafara, R. [TransCanada PipeLines Ltd., Calgary, AB (Canada)

    2007-07-01

    Working Group 10 presented experiences acquired from the operation of pipeline systems in a northern environment. There are currently 3 pipelines operating north of 60, notably the Shiha gas pipeline near Fort Liard, the Ikhil gas pipeline in Inuvik and the Norman Wells oil pipeline. Each has its unique commissioning, operating and maintenance challenges, as well as specific training and logistical support requirements for the use of in-line inspection tools and other forms of integrity assessment. The effectiveness of cathodic protection systems in a permafrost northern environment was also discussed. It was noted that the delay of the Mackenzie Gas Pipeline Project by two to three years due to joint regulatory review may lead to resource constraints for the project as well as competition for already scarce human resources. The issue of a potential timing conflict with the Alaskan Pipeline Project was also addressed as well as land use issues for routing of supply roads. Integrity monitoring and assessment issues were outlined with reference to pipe soil interaction monitoring in discontinuous permafrost; south facing denuded slope stability; base lining projects; and reclamation issues. It was noted that automatic welding and inspection will increase productivity, while reducing the need for manual labour. In response to anticipated training needs, companies are planning to involve and train Aboriginal labour and will provide camp living conditions that will attract labour. tabs., figs.

  9. PETROBRAS Amazon gas pipeline - repair logistics evaluation study

    Energy Technology Data Exchange (ETDEWEB)

    Faertes, Denise [Petrobras, Rio de Janeiro, (Brazil); Domingues, Joaquim [DNV, Rio de Janeiro, (Brazil)

    2010-07-01

    Repair logistics is often a challenge in the pipeline industry because of extreme operating conditions. This paper presents an evaluation of the repair logistics for the Urucu-Coari-Manaus gas pipeline in Brazil. The study establishes strategies for each identified failure scenario, classified by type of repair, logistics, resources and costs. Several meetings and brainstorming workshops took place, bringing together experienced teams from PETROBRAS. They provided an analysis of the operating conditions of the different pipeline sections and an evaluation of the best practices and strategies to be adopted for pipeline repair. Different repair strategies and logistics options were compared with a base-case crisis scenario to evaluate gains in terms of repair time reductions. A cost analysis was then carried out to prioritize these strategies. This study provided important support to the decision-making process with respect to the different repair resources and logistics options. It provided formal and innovative solutions.

  10. Knowledge Based Pipeline Network Classification and Recognition Method of Maps

    Institute of Scientific and Technical Information of China (English)

    Liu Tongyu; Gu Shusheng

    2001-01-01

    Map recognition is an essential data input means of Geographic Information Systems (GIS). How to solve the problems in the procedure, such as recognition of maps with crisscross pipeline networks, classification of buildings and roads, and processing of connected text, is a critical step for the continued rapid development of GIS. In this paper, a new recognition method for pipeline maps is presented, and some common patterns of pipeline connections and component labels are established. Through pattern matching, pipelines and component labels are recognized and peeled off from the maps. After this step, the maps consist only of buildings and roads, which are recognized and classified with a fuzzy classification method. In addition, the Double Sides Scan (DSS) technique is also described, through which the effect of connected text can be eliminated.
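
    The record does not give the classification rules, but the idea of fuzzy classification of the remaining map objects can be sketched with simple shape features. The features, membership functions and thresholds below are invented for illustration and are not taken from the paper:

```python
# Illustrative fuzzy classification of map objects into "building" / "road".
# Features, membership functions and thresholds are assumptions for this sketch.

def trapezoid(x, a, b, c, d):
    """Standard trapezoidal membership function."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def classify(elongation, area_m2):
    """elongation = length/width of the object's bounding box."""
    building = min(trapezoid(elongation, 0, 1, 3, 6),        # compact shapes
                   trapezoid(area_m2, 20, 50, 5000, 20000))  # moderate footprint
    road = trapezoid(elongation, 4, 10, 1e6, 2e6)            # long, thin shapes
    return max((building, "building"), (road, "road"))

print(classify(elongation=1.8, area_m2=300))    # high "building" membership
print(classify(elongation=25.0, area_m2=900))   # high "road" membership
```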

  11. Electrometrical Methods Application for Detection of Heating System Pipeline Corrosion

    Science.gov (United States)

    Vetrov, A.; Ilyin, Y.; Isaev, V.; Rondel, A.; Shapovalov, N.

    2004-12-01

    Coated steel underground pipelines are widely used for petroleum and gas transportation and for water and heat supply. The soils in which the pipelines are placed are usually highly corrosive to the pipe metal. Where the external coating is damaged, corrosion processes begin, and this can lead to pipe breakage. To ensure pipeline survivability it is necessary to monitor the condition of the pipeline, and geophysical methods are used to provide such diagnostics. The authors have studied the corrosion processes of the municipal heating-system pipelines in Saint-Petersburg (Russia) using the aerial thermal imaging method, the investigation of electromagnetic fields and spontaneous polarization, and measurements of the electrode potentials of the metal tubes. The pipeline repair works carried out this year allowed visual observation of the pipes. The investigated object comprises a pipeline composed of two parallel tubes placed 1-2 meters deep. Because the Russian Federation and the CIS countries still use a direct heat supply system, anticorrosion components cannot be added to the circulating water. The pipelines operate under high pressure (up to 5 atm) and high temperature (design temperature 150°C). The tube insulation is intended to minimize heat loss and ordinarily provides poor waterproofing. Some pipeline construction elements (sliding and fixed bearings, pressure compensators, heat enclosures) are often non-insulated, and the tube metal contacts the soil. Harsh operating conditions, ingress of technical contamination, stray currents, etc. cause a high accident rate. Geophysical diagnostics, including electrometry, is hampered in a city by underground communication systems, power lines, insulating ground cover (asphalt), and the limitation of the working area by buildings. These restrictions define the investigation conditions. In order to detect and localize insulation (coating) defects the authors

  12. DDBJ read annotation pipeline: a cloud computing-based pipeline for high-throughput analysis of next-generation sequencing data.

    Science.gov (United States)

    Nagasaki, Hideki; Mochizuki, Takako; Kodama, Yuichi; Saruhashi, Satoshi; Morizaki, Shota; Sugawara, Hideaki; Ohyanagi, Hajime; Kurata, Nori; Okubo, Kousaku; Takagi, Toshihisa; Kaminuma, Eli; Nakamura, Yasukazu

    2013-08-01

    High-performance next-generation sequencing (NGS) technologies are advancing genomics and molecular biological research. However, the immense amount of sequence data requires computational skills and suitable hardware resources that are a challenge to molecular biologists. The DNA Data Bank of Japan (DDBJ) of the National Institute of Genetics (NIG) has initiated a cloud computing-based analytical pipeline, the DDBJ Read Annotation Pipeline (DDBJ Pipeline), for a high-throughput annotation of NGS reads. The DDBJ Pipeline offers a user-friendly graphical web interface and processes massive NGS datasets using decentralized processing by NIG supercomputers currently free of charge. The proposed pipeline consists of two analysis components: basic analysis for reference genome mapping and de novo assembly and subsequent high-level analysis of structural and functional annotations. Users may smoothly switch between the two components in the pipeline, facilitating web-based operations on a supercomputer for high-throughput data analysis. Moreover, public NGS reads of the DDBJ Sequence Read Archive located on the same supercomputer can be imported into the pipeline through the input of only an accession number. This proposed pipeline will facilitate research by utilizing unified analytical workflows applied to the NGS data. The DDBJ Pipeline is accessible at http://p.ddbj.nig.ac.jp/.

  13. Pipeline clean-up : speed, environment drive pipelining equipment

    Energy Technology Data Exchange (ETDEWEB)

    Budd, G.

    2004-08-01

    Horizontal drilling technology is the single most important enhanced oil recovery technology, and it has resulted in a significant increase in pipeline utilization. Pipeline operators such as Calgary-based Denim Pipeline Construction Ltd. are responding by using the latest equipment, including excavation equipment, to avoid maintenance delays and downtime. The sales of Denim's horizontal pipe bending equipment have increased due to their attention to worker safety. Denim's horizontal bending machine does not require as much technical support, plus it is faster to install and speeds up production. The machine consists of 3 hydraulic jacks that move on a horizontal plate. Curved dies can be modified to accommodate various diameters of pipe. The bending operation is performed very near to the ground, thereby significantly reducing the risk of pipe injury. Environmental damage is minimized through the use of mechanized mulching, which has replaced the burning of unwanted trees and brush when clearing land for pipelines. 1 fig.

  14. RUSSIA AND ITS PIPELINE WEAPON

    Directory of Open Access Journals (Sweden)

    FODOR Cosmin

    2010-12-01

    Full Text Available In this paper we present the new power that Russia holds over the EU due to its great natural resources and its control over pipelines. Moscow can now exert influence upon countries in Europe not through revolutionary zeal, tanks and armies, but through its resources. And it knows how to use them and how to make the EU dependent on its will: this is a new geopolitics, a 21st-century geopolitics, centered on the control of gas pipelines in the Central Asian states and on the great dependence of EU states on the Russian pipeline system.

  15. Performance of the SDSS-III MARVELS New Data Pipeline

    Science.gov (United States)

    Li, Rui; Ge, J.; Thomas, N. B.; Shi, J.; Petersen, E.; Ouyang, Y.; Wang, J.; Ma, B.; Sithajan, S.

    2013-01-01

    As one of the four surveys in the SDSS-III program, MARVELS (Multi-object APO Radial Velocity Exoplanet Large-area Survey) monitored over 3,300 stars during 2008-2012, with each observed about 27 times over a 2-year window. MARVELS has successfully produced over 20 brown dwarf candidates and several hundred binaries. However, the early data pipeline had large long-term systematic errors and could not reliably produce giant planet candidates. Our new MARVELS pipeline team, with the assistance of the UF Department of Mathematics, has made great progress in dealing with the long-term systematic errors over the past 9 months. We redesigned the entire pre-processing procedure to handle various types of systematic effects caused by the instrument (such as trace, slant and distortion) and by observing-condition changes (such as the illumination profile). We explored several advanced methods to precisely extract the RV signal from the processed spectra. We also developed a new simulation program to model all of these effects and used it to test the performance of our new pipeline. Our goal is to deliver a new pipeline that meets the survey baseline performance (10-35 m/s for the survey stars) by the end of 2012. We will report the fundamental performance of the pipeline and lessons learned from the pipeline development.

  16. e-MERLIN data reduction pipeline

    Science.gov (United States)

    Argo, Megan

    2014-07-01

    Written in Python and utilizing ParselTongue (ascl:1208.020) to interface with AIPS (ascl:9911.003), the e-MERLIN data reduction pipeline processes, calibrates and images data from the UK's radio interferometric array (Multi-Element Remote-Linked Interferometer Network). Driven by a plain text input file, the pipeline is modular and can be run in stages. The software includes options to load raw data, average in time and/or frequency, flag known sources of interference, flag more comprehensively with SERPent (ascl:1312.001), carry out some or all of the calibration procedures (including self-calibration), and image in either normal or wide-field mode. It also optionally produces a number of useful diagnostic plots at various stages so data quality can be assessed.
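
    The staged, parameter-file-driven design described in these e-MERLIN records can be illustrated generically. The sketch below is not the actual pipeline, which drives ParselTongue/AIPS tasks; the stage names and the parameter file format are invented, and it only shows how a plain text file can select and drive a modular sequence of steps:

```python
# Generic sketch of a text-file-driven, modular reduction pipeline.
# Stage names and the parameter format are invented for illustration;
# the real e-MERLIN pipeline drives ParselTongue/AIPS tasks instead.
import os
import tempfile

def load_parameters(path):
    """Parse 'key = value' lines, ignoring blanks and '#' comments."""
    params = {}
    with open(path) as fh:
        for line in fh:
            line = line.split("#", 1)[0].strip()
            if line and "=" in line:
                key, value = (s.strip() for s in line.split("=", 1))
                params[key] = value
    return params

STAGES = {
    "load":      lambda p: print("loading raw data from", p.get("rawdata", "?")),
    "average":   lambda p: print("averaging in time/frequency"),
    "flag":      lambda p: print("flagging known interference"),
    "calibrate": lambda p: print("calibrating (including self-calibration)"),
    "image":     lambda p: print("imaging in", p.get("imaging_mode", "normal"), "mode"),
}

def run(params):
    # 'stages' lists which steps to run, so the pipeline can be re-run in parts.
    for name in params.get("stages", "load,average,flag,calibrate,image").split(","):
        STAGES[name.strip()](params)

if __name__ == "__main__":
    text = "rawdata = obs_20140101.fits\nstages = load, flag, calibrate, image\nimaging_mode = wide-field\n"
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as fh:
        fh.write(text)
    run(load_parameters(fh.name))
    os.remove(fh.name)
```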

  17. Wave Pipelining Using Self Reset Logic

    Directory of Open Access Journals (Sweden)

    Miguel E. Litvin

    2008-01-01

    Full Text Available This study presents a novel design approach combining wave pipelining and self-reset logic, which provides an elegant solution for high-speed data throughput with significant savings in power and area compared with other dynamic CMOS logic implementations. To overcome some limitations of existing SRL designs, we employ a new SRL family, namely dual-rail self-reset logic with input disable (DRSRL-ID). These gates exhibit fairly constant timing parameters, especially the width of the output pulse, for varying fan-out and logic depth, helping to accommodate process, supply voltage, and temperature (PVT) variations. These properties simplify the implementation of wave-pipelined circuits. A general timing analysis is provided and compared with previous implementations. Results of the circuit implementation are presented together with conclusions and future work.
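
    The benefit of nearly constant gate timing shows up in the standard wave-pipelining clock constraint: the clock period is bounded below by the spread between the longest and shortest path delays plus register and skew overheads, rather than by the longest path alone. The sketch below evaluates one commonly quoted form of that constraint with assumed delay numbers; it is not an analysis from the paper:

```python
# Illustrative wave-pipelining timing check; all delay values are assumptions.
d_max   = 4.2    # ns, longest combinational path delay
d_min   = 3.4    # ns, shortest combinational path delay
t_setup = 0.15   # ns, register setup time
t_hold  = 0.10   # ns, register hold time
t_skew  = 0.05   # ns, clock uncertainty at the capturing register

# Commonly quoted lower bound on the clock period for wave pipelining:
# the period must cover the delay *spread* plus register and skew overheads,
# not the full longest-path delay as in a conventional registered design.
t_clk_min = (d_max - d_min) + t_setup + t_hold + 2 * t_skew

waves_in_flight = d_max / t_clk_min   # data "waves" coexisting inside the logic
print(f"minimum clock period ~ {t_clk_min:.2f} ns "
      f"(vs {d_max + t_setup:.2f} ns for a conventional single-stage design)")
print(f"~{waves_in_flight:.1f} waves propagating through the logic at once")
```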

  18. The MUSE Data Reduction Software Pipeline

    Science.gov (United States)

    Weilbacher, P. M.; Roth, M. M.; Pécontal-Rousset, A.; Bacon, R.; Muse Team

    2006-07-01

    After giving a short overview of the instrument characteristics of the second-generation VLT instrument MUSE, we discuss what the data will look like and present the challenges and goals of its data reduction software. It is conceived as a number of pipeline recipes to be run in an automated way within the ESO data flow system. These recipes are based on a data reduction library that is being written in the C language using ESO's CPL API. We give a short overview of the steps needed for the reduction and post-processing of science data, discuss the requirements for a future visualization tool for integral field spectroscopy, and close with the timeline for MUSE and its data reduction pipeline.

  19. 75 FR 72877 - Pipeline Safety: Updates to Pipeline and Liquefied Natural Gas Reporting Requirements

    Science.gov (United States)

    2010-11-26

    ... Safety: Updates to Pipeline and Liquefied Natural Gas Reporting Requirements; Final Rule #0;#0;Federal... to Pipeline and Liquefied Natural Gas Reporting Requirements AGENCY: Pipeline and Hazardous Materials... collections from operators of natural gas pipelines, hazardous liquid pipelines, and liquefied natural......

  20. Logistics aspects of petroleum pipeline operations

    Directory of Open Access Journals (Sweden)

    W. J. Pienaar

    2010-11-01

    Full Text Available The paper identifies, assesses and describes the logistics aspects of the commercial operation of petroleum pipelines. The nature of petroleum-product supply chains, in which pipelines play a role, is outlined and the types of petroleum pipeline systems are described. An outline is presented of the nature of the logistics activities of petroleum pipeline operations. The reasons for the cost efficiency of petroleum pipeline operations are given. The relative modal service effectiveness of petroleum pipeline transport, based on the most pertinent service performance measures, is offered. The segments in the petroleum-products supply chain where pipelines can play an efficient and effective role are identified.

  1. Pipeline integrity handbook risk management and evaluation

    CERN Document Server

    Singh, Ramesh

    2013-01-01

    Based on over 40 years of experience in the field, Ramesh Singh goes beyond corrosion control, providing techniques for addressing present and future integrity issues. Pipeline Integrity Handbook provides pipeline engineers with the tools to evaluate and inspect pipelines, safeguard the life cycle of their pipeline asset and ensure that they are optimizing delivery and capability. Presented in easy-to-use, step-by-step order, Pipeline Integrity Handbook is a quick reference for day-to-day use in identifying key pipeline degradation mechanisms and threats to pipeline integrity. The book begins

  2. PIPELINES AS COMMUNICATION NETWORK LINKS

    Energy Technology Data Exchange (ETDEWEB)

    Kelvin T. Erickson; Ann Miller; E. Keith Stanek; C.H. Wu; Shari Dunn-Norman

    2005-03-14

    This report presents the results of an investigation into two methods of using a natural gas pipeline as a communication medium. The work addressed the need to develop secure system monitoring and control techniques between the field and control centers and to robotic devices in the pipeline. In the first method, the pipeline was treated as a microwave waveguide. In the second method, the pipe was treated as a leaky feeder or a multi-ground neutral, and the signal was directly injected onto the metal pipe. These methods were tested on existing pipeline loops at UMR and Battelle. The results indicate the feasibility of both methods. In addition, a few suitable communication link protocols for this network were analyzed.
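
    For the waveguide approach, the usable frequency band is set by the pipe diameter: an air- or gas-filled circular metal pipe only propagates microwaves above the cutoff frequency of its dominant TE11 mode. The sketch below evaluates that standard relation for a few assumed inner diameters; the diameters are illustrative, not the ones used in the project:

```python
import math

C = 299_792_458.0      # speed of light in vacuum, m/s
X11_PRIME = 1.8412     # first root of J1', giving the dominant TE11 mode

def te11_cutoff_hz(inner_diameter_m):
    """Cutoff frequency of the dominant TE11 mode of a circular metal pipe."""
    return X11_PRIME * C / (math.pi * inner_diameter_m)

# Assumed nominal inner diameters (m). Gas composition and wall roughness,
# which mainly affect attenuation, are ignored in this simple cutoff estimate.
for d in (0.15, 0.30, 0.60, 0.90):
    print(f"D = {d:4.2f} m  ->  TE11 cutoff ~ {te11_cutoff_hz(d) / 1e6:6.1f} MHz")
```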

  3. Nondestructive characterization of pipeline materials

    Science.gov (United States)

    Engle, Brady J.; Smart, Lucinda J.; Bond, Leonard J.

    2015-03-01

    There is a growing need to quantitatively and nondestructively evaluate the strength and toughness properties of pipeline steels, particularly in aging pipeline infrastructure. These strength and toughness properties, namely yield strength, tensile strength, transition temperature, and toughness, are essential for determining the safe operating pressure of the pipelines. For some older pipelines crucial information can be unknown, which makes determining the pressure rating difficult. Current inspection techniques address some of these issues, but they are not comprehensive. This paper will briefly discuss current inspection techniques and relevant literature for relating nondestructive measurements to key strength and toughness properties. A project is in progress to provide new in-trench tools that will give strength properties without the need for sample removal and destructive testing. Preliminary experimental ultrasonic methods and measurements will be presented, including velocity, attenuation, and backscatter measurements.

  4. The analysis of repeated failures of pipelines in Kal'chinskoe oil field

    Science.gov (United States)

    Shavlov, E. N.; Brusnik, O. V.; Lukjanov, V. G.

    2016-09-01

    The paper presents the chemical analysis of oilfield water and hydraulic analysis of the liquid flow in Kal'chinskoe oil field pipeline that allow detecting the causes of the internal corrosion processes. The inhibitor protection is suggested to reduce the corrosion rate in the pipelines of Kal'chinskoe oil field. Based on the analysis of the pipeline failures, it is suggested to replace steel pipes by fiberglass pipes.

  5. Global lateral buckling analysis of idealized subsea pipelines

    Institute of Scientific and Technical Information of China (English)

    刘润; 刘文彬; 吴新利; 闫澍旺

    2014-01-01

    To avoid the solidification of paraffin wax during transport and to reduce the difficulty of transport, oil and gas are usually transported at high temperature and high pressure. The temperature and pressure differences cause additional stress along the pipeline; because of the constraint of the foundation soil, this additional stress cannot be released freely, and when it becomes large enough it triggers buckling of the submarine pipeline. In this work, the energy method is used to derive an analytical solution suitable for the global buckling modes of an idealized subsea pipeline and to analyze the relationship between the critical buckling temperature, buckling length and amplitude under different high-order global lateral buckling modes. To obtain a consistent formulation of the problem, the principle of virtual displacements and the variational calculus for variable matching points are applied. The finite element method based on elasto-plastic theory is used to simulate the lateral global buckling of pipelines under high temperature and pressure, and the factors influencing the lateral buckling of pipelines are further studied. Based on actual engineering projects, the finite element results are compared with the analytical ones, and the influence of thermal stress, the sectional rigidity of the pipeline, the soil properties and the triggering force on high-order lateral buckling is discussed. The method of applying a small triggering force to the pipeline is reliable in global buckling numerical analysis. In practice, increasing the sectional rigidity of a pipeline is an effective measure to improve its ability to resist global buckling.
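
    The driving force behind the buckling discussed in this abstract is the compressive axial force that builds up when a restrained pipeline is heated. A minimal sketch of that build-up is given below; the pipe dimensions and temperature rise are assumed values, and the additional pressure-induced term and any soil feed-in are deliberately omitted:

```python
import math

# Assumed pipe and operating data (illustrative, not taken from the paper).
outer_diameter = 0.3239   # m (12.75 in line pipe)
wall_thickness = 0.0127   # m
E       = 207e9           # Pa, Young's modulus of steel
alpha   = 1.17e-5         # 1/degC, thermal expansion coefficient of steel
delta_T = 60.0            # degC, operating minus installation temperature

# Steel cross-sectional area of the pipe wall.
area = math.pi / 4 * (outer_diameter**2 - (outer_diameter - 2 * wall_thickness)**2)

# Fully restrained thermal compressive force; internal pressure adds a further
# compressive contribution that is left out of this simplified estimate.
axial_force = E * area * alpha * delta_T
print(f"steel area ~ {area * 1e4:.1f} cm^2, "
      f"restrained thermal force ~ {axial_force / 1e6:.2f} MN (compressive)")
```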

  6. The pipeline for the GOSSS data reduction

    CERN Document Server

    Sota, Alfredo

    2011-01-01

    The Galactic O-Star Spectroscopic Survey (GOSSS) is an ambitious project that is observing all known Galactic O stars with B < 13 in the blue-violet part of the spectrum at R~2500. It is based on version 2 of the most complete catalog to date of Galactic O stars with accurate spectral types (v1, Maíz Apellániz et al. 2004; v2, Sota et al. 2008). Given the large amount of data that we are obtaining (more than 150 nights of observations at three different observatories in the last 4 years), we have developed an automatic spectroscopic reduction pipeline. This pipeline has been programmed in IDL and automates the process of data reduction. It can operate in two modes: automatic data reduction (quicklook) or semi-automatic data reduction (full). In quicklook mode, we are able to obtain rectified and calibrated spectra of all the stars of a full night just minutes after the observations. The pipeline automatically identifies the type of image and applies the standard reduction procedure (bias subtraction, flat field c...
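
    The standard reduction steps named at the end of this abstract can be sketched with a few lines of array arithmetic. The example below is written with NumPy rather than IDL and uses synthetic frames instead of real GOSSS data; it only shows the usual bias subtraction and flat-field correction applied to a raw CCD frame:

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (512, 512)

# Synthetic frames standing in for real calibration and science data.
bias  = 300.0 + rng.normal(0.0, 2.0, shape)                 # master bias (ADU)
flat  = np.tile(1.0 + 0.05 * np.sin(np.linspace(0, 3, shape[1])), (shape[0], 1))
truth = np.tile(1000.0 * np.exp(-np.linspace(-3, 3, shape[1]) ** 2), (shape[0], 1))
raw   = bias + flat * truth + rng.normal(0.0, 5.0, shape)   # simulated raw frame

# Standard CCD reduction: subtract the bias, then divide by the normalized flat.
master_flat = flat / np.median(flat)
reduced = (raw - bias) / master_flat

print("median absolute error vs. true signal (ADU):",
      np.median(np.abs(reduced - truth)))
```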

  7. Automated pipelines for spectroscopic analysis

    Science.gov (United States)

    Allende Prieto, C.

    2016-09-01

    The Gaia mission will have a profound impact on our understanding of the structure and dynamics of the Milky Way. Gaia is providing an exhaustive census of stellar parallaxes, proper motions, positions, colors and radial velocities, but also leaves some glaring holes in an otherwise complete data set. The radial velocities measured with the on-board high-resolution spectrograph will only reach some 10 % of the full sample of stars with astrometry and photometry from the mission, and detailed chemical information will be obtained for less than 1 %. Teams all over the world are organizing large-scale projects to provide complementary radial velocities and chemistry, since this can now be done very efficiently from the ground thanks to large and mid-size telescopes with a wide field-of-view and multi-object spectrographs. As a result, automated data processing is taking on ever-increasing relevance, and the concept is being applied to many more areas, from targeting to analysis. In this paper, I provide a quick overview of recent, ongoing, and upcoming spectroscopic surveys, and the strategies adopted in their automated analysis pipelines.

  8. 75 FR 5244 - Pipeline Safety: Integrity Management Program for Gas Distribution Pipelines; Correction

    Science.gov (United States)

    2010-02-02

    ... implement integrity management programs. In addition to a minor correction in terminology, this document...: Integrity Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and...

  9. Local scour at submarine pipelines

    Institute of Scientific and Technical Information of China (English)

    Yee-Meng Chiew

    2010-01-01

    The rapid development of offshore oil fields has increased the number of submarine pipelines being constructed for the transport of crude oil to onshore refineries. Interactions between a pipeline and an erodible bed under the influence of currents and waves often lead to local scouring around the structure. When this occurs, the pipeline may be suspended above the seabed, resulting in the formation of a span. If the free span is long enough, the pipe may experience resonant flow-induced oscillations, leading to structural failure. This study examines the complex flow-structure-sediment interaction leading to the development of local scour holes around submarine pipelines. It reviews published literature in this area, which is primarily confined to the development of 2-dimensional scour holes. Despite the abundance of such research studies, pipeline scour in the field is essentially 3-dimensional in nature. Hence, most of these studies have overlooked the importance of the transverse dimension of the scour hole while emphasizing its vertical dimension. This clearly is an issue that must be re-examined in light of the potential hazard and environmental disaster that one faces in the event of a pipeline failure. Recent studies have begun to recognize this shortcoming, and attempts have been made to overcome the deficiency. The study presents the state-of-the-art knowledge on local scour at submarine pipelines, from both a 2-dimensional and a 3-dimensional perspective.
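
    The resonant flow-induced oscillations mentioned above occur when the vortex-shedding frequency of the current approaches the natural frequency of the suspended span. A rough screening check is sketched below; the pipe, span and flow values are assumed for illustration and are not taken from the review:

```python
import math

# Assumed pipe, span and flow data for a rough screening check (illustrative only).
D  = 0.273      # m, outer diameter
t  = 0.0127     # m, wall thickness
E  = 207e9      # Pa, steel
rho_steel, rho_water, rho_contents = 7850.0, 1025.0, 800.0   # kg/m^3
L  = 30.0       # m, free-span length
U  = 0.8        # m/s, near-bed current speed
St = 0.2        # Strouhal number, typical for this flow regime

Di = D - 2 * t
I  = math.pi / 64 * (D**4 - Di**4)                  # second moment of area
m  = (rho_steel * math.pi / 4 * (D**2 - Di**2)      # steel mass per metre
      + rho_contents * math.pi / 4 * Di**2          # contents
      + rho_water * math.pi / 4 * D**2)             # added mass (~displaced water)

f_natural  = (math.pi / (2 * L**2)) * math.sqrt(E * I / m)   # pinned-pinned 1st mode
f_shedding = St * U / D                                      # vortex-shedding frequency

print(f"span natural frequency ~ {f_natural:.2f} Hz, "
      f"vortex shedding ~ {f_shedding:.2f} Hz, ratio {f_shedding / f_natural:.2f}")
# Ratios approaching ~1 (cross-flow) or ~0.5 (in-line) flag a risk of resonance.
```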

  10. An overview of Samarco's pipelines and their KPI'S (Key Performance Indicators)

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Ivan; Andrade, Ricardo Bruno Nebias; Silva, Tatiana [Samarco Mineracao S.A., Belo Horizonte, MG (Brazil)

    2009-07-01

    Samarco owns and operates the world's largest slurry pipeline grid, composed of three pipelines with a total length of 801 km. This paper presents some important key performance indicators (KPIs) of Samarco's pipelines: pumped tonnage, slurry concentration, availability and safety. It also presents the main features and the flow sheet of each pipeline. The objective of this paper is to give a brief idea of Samarco's pipeline process and of how it was possible to improve the main KPIs presented. (author)

  11. PRIMO: An Interactive Homology Modeling Pipeline

    Science.gov (United States)

    Glenister, Michael

    2016-01-01

    The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO’s automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/. PMID:27855192

  12. PRIMO: An Interactive Homology Modeling Pipeline.

    Science.gov (United States)

    Hatherley, Rowan; Brown, David K; Glenister, Michael; Tastan Bishop, Özlem

    2016-01-01

    The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO's automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/.

  13. China Oil & Gas Pipeline Survey & Design Institute, Pipeline

    Institute of Scientific and Technical Information of China (English)

    Bureau of CNPC; Zhao Surong

    1995-01-01

    China Oil/Gas Pipeline Bureau (P.B.) has been the only professional organization in China specializing in oil/gas pipeline design and construction since the 1980s. It has cooperated with a number of well-known companies from Japan, the USA, Germany, Canada and Italy in the design of many large oil/gas pipeline projects, in the course of which personnel from P.B. accumulated much experience in international project design. During the execution of each project, they strictly followed the common international codes and standards, with computers as the auxiliary design system combined with self-developed software. All its clients have shown their trust in this organization and given it high praise for its outstanding survey, design and technical service.

  14. Discussion on Improving Processing Measures for Welding HAZ Microstructure and Performance of X80 Pipeline Steel

    Institute of Scientific and Technical Information of China (English)

    赵波; 李国鹏; 王旭; 谷雨; 肖福仁

    2014-01-01

    By using thermal simulation tests, the changes in the metallurgical structure and mechanical properties of the weld HAZ of an X80 pipeline steel under different cooling rates, welding heat inputs and peak temperatures were studied. The tests showed that an appropriate increase of the cooling rate in the weld HAZ can improve its microstructure, strength and toughness; that, under the same welding heat input, the impact toughness of the critical zone and the coarse-grained zone is the poorest, with a valley occurring in each of these two zones, while the strength of the fine-grained zone is the lowest; and that welding with a relatively low heat input is beneficial to the comprehensive strength and toughness of the weld HAZ. According to the analysis results, a welding procedure combining high-deposition-rate, high-efficiency multi-wire submerged arc welding at lower heat input, accelerated cooling immediately after welding, and local intermediate-frequency normalizing heat treatment of the weld and HAZ after welding can effectively improve the strength-toughness matching of the weld HAZ of X80 pipeline steel.

  15. Update on the SDSS-III MARVELS data pipeline development

    Science.gov (United States)

    Li, Rui; Ge, J.; Thomas, N. B.; Petersen, E.; Wang, J.; Ma, B.; Sithajan, S.; Shi, J.; Ouyang, Y.; Chen, Y.

    2014-01-01

    MARVELS (Multi-object APO Radial Velocity Exoplanet Large-area Survey), as one of the four surveys in the SDSS-III program, has monitored over 3,300 stars during 2008-2012, with each being visited an average of 26 times over a 2-year window. Although the early data pipeline was able to detect over 20 brown dwarf candidates and several hundred binaries, no giant planet candidates could be reliably identified because of its large systematic errors. Learning from past data pipeline lessons, we re-designed the entire pipeline to handle various types of systematic effects caused by the instrument (such as trace, slant, distortion, drifts and dispersion) and by observing-condition changes (such as illumination profile and continuum). We also introduced several advanced methods to precisely extract the RV signals. To date, we have achieved a long-term RMS RV measurement error of 14 m/s for HIP-14810 (one of our reference stars) after removal of the known planet signal based on previous HIRES RV measurements. This new 1-D data pipeline has been used to robustly identify four giant planet candidates within the small fraction of the survey data that has been processed (Thomas et al., this meeting). The team is currently working hard to optimize the pipeline, especially the 2-D interference-fringe RV extraction, where early results show a 1.5 times improvement over the 1-D data pipeline. We are quickly approaching the survey baseline performance requirement of 10-35 m/s RMS for 8-12 solar-type stars. With this fine-tuned pipeline and the soon to be processed plates of data, we expect to discover many more giant planet candidates and make a large statistical impact on exoplanet studies.

  16. Emergency preparedness of OSBRA Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Magalhaes, Milton P.; Torres, Carlos A.R.; Almeida, Francisco J.C. [TRANSPETRO, Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper presents the experience of PETROBRAS Transporte S.A. - TRANSPETRO in emergency preparedness for the OSBRA pipeline, showing specific aspects and the solutions developed. The company has a standardized approach to emergency management, based on risk analysis studies, a risk management plan and contingency plans. To cover almost 1,000 km of pipeline, the Company relies on Emergency Response Centers and an Environmental Defense Center located at strategic points. To maintain preparedness, fire-fighting and oil-leak response training is provided. Additionally, simulation exercises are performed, following a schedule drawn up according to specific criteria and guidelines. In conclusion, a picture is presented of the evolution of emergency preparedness in the OSBRA System, which bears the enormous responsibility of transporting flammable products over almost 1,000 km of pipeline, crossing 40 municipalities, 3 states and the Federal District. (author)

  17. Visual Basic: Pipeline SCADA and enterprise wide management

    Energy Technology Data Exchange (ETDEWEB)

    Verma, R. [Parijat Controlware, Inc., Houston, TX (United States)

    1997-02-01

    Visual Basic (VB), Microsoft's powerful yet user-friendly development environment, has steadily become more accepted and used in Man Machine Interface (MMI) and Supervisory Control and Data Acquisition (SCADA) applications. Known as an easy way to rapidly develop applications for the commercial, financial, and computer games industries, VB is now making its mark in the pipeline SCADA industry. For pipeline operating companies employing PLCs, flow computers and chromatographs as the front end for control and data acquisition of pumping stations, meter stations, product delivery terminals, LACT units, gas plants, and gathering and production systems, VB is an ideal software package. The author has successfully applied VB to gas pipeline turbine-driven compressor station systems with over 4,000 alarms, 200+ analog inputs, 60+ PID control loops and 400+ digital I/O, with the process control delegated to PLCs and VB performing all the MMI functions and any upstream data processing, manipulation and management functions.

  18. Application of the Fuzzy Comprehensive Assessment Technique to Optimal Selection of Pipeline Design Alternative

    Institute of Scientific and Technical Information of China (English)

    Chu Feixue; Chu Yanfan; Liu Xiumin

    2005-01-01

    Regarding the influencing factors in the optimal selection of a pipeline design alternative as fuzzy variables with different weights, a fuzzy comprehensive assessment was applied to the optimal selection of the design alternative. Taking the Lanzhou-Chengdu pipeline as an example to explain the process, the result shows that this method is acceptable.
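
    Fuzzy comprehensive assessment typically combines a weight vector for the influencing factors with a membership (evaluation) matrix and selects the alternative with the highest composite membership. The sketch below shows that standard weighted-composition step; the factors, weights and membership values are invented and are not data from the Lanzhou-Chengdu study:

```python
import numpy as np

# Invented example: three design alternatives scored on four influencing factors.
factors = ["investment", "energy cost", "constructability", "safety"]
weights = np.array([0.35, 0.30, 0.15, 0.20])   # assumed factor weights, summing to 1

# Membership matrix R: R[i, j] = degree to which alternative j satisfies factor i.
R = np.array([
    [0.7, 0.5, 0.9],   # investment
    [0.6, 0.8, 0.4],   # energy cost
    [0.8, 0.6, 0.7],   # constructability
    [0.9, 0.7, 0.6],   # safety
])

# Weighted-average composition B = W . R (one common composition operator).
B = weights @ R
best = int(np.argmax(B))
print("composite memberships:", np.round(B, 3))
print(f"preferred design alternative: #{best + 1}")
```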

  19. Simulation of Heat Transfer of Heating-System and Water Pipelines Under Northern Conditions

    Science.gov (United States)

    Stepanov, A. V.; Egorova, G. N.

    2016-09-01

    A mathematical model of the joint laying of water pipelines and city-block heating-system pipelines is considered. The effect of radiation on the combined heat transfer in the heat-insulation jacket between the construction elements is investigated. The results of mathematical simulation of the heat losses, with account taken of the radiant component, are given.
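
    As a point of reference for such simulations, the purely conductive heat loss per metre of an insulated pipe follows the standard cylindrical-shell formula q' = 2*pi*k*(T_i - T_o)/ln(r_o/r_i); radiation across internal gaps, the effect examined in the paper, adds to this. The sketch below evaluates the conduction-only estimate for assumed dimensions and temperatures:

```python
import math

# Assumed data for a single insulated heating pipe (illustrative only).
r_pipe  = 0.054   # m, outer radius of the steel pipe
r_insul = 0.104   # m, outer radius of the insulation jacket
k_insul = 0.045   # W/(m*K), insulation thermal conductivity
T_pipe  = 130.0   # degC, pipe wall temperature (taken equal to the heat carrier)
T_surf  = -40.0   # degC, assumed outer surface temperature in winter conditions

# Conduction through a cylindrical shell, per metre of pipeline.
q_per_m = 2 * math.pi * k_insul * (T_pipe - T_surf) / math.log(r_insul / r_pipe)
print(f"conduction-only heat loss ~ {q_per_m:.0f} W/m")
# Radiation across any air gap between pipe and jacket (the effect examined in
# the paper) and the thermal resistance of the soil or channel are not included.
```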

  20. INCREASING COMBINATIONAL CIRCUIT PERFORMANCE VIA PIPELINING

    Directory of Open Access Journals (Sweden)

    Yu. V. Pottosin

    2013-01-01

    Full Text Available The problem of increasing the performance of a memoryless device that produces a sequence of discrete signals is considered. The task is to divide a given multilevel combinational circuit into a given number of cascades separated by registers, providing pipelined processing of the incoming signals. To solve this problem we use a model based on the representation of the combinational circuit as a directed graph. In the process of solving the problem, the admissible frequency of the incoming signals is established; this frequency must be as high as possible.
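
    One common way to cast this problem is to levelize the directed graph of the circuit by longest arrival time, assign consecutive delay ranges to cascades, and take the admissible clock period from the worst accumulated delay inside any cascade. The toy sketch below does exactly that with invented gate delays; the balancing heuristic is illustrative and is not the algorithm of the paper:

```python
import math
from functools import lru_cache

# Toy combinational circuit as a directed graph: gate -> (delay, fan-in gates).
CIRCUIT = {
    "g1": (1.0, []), "g2": (1.0, []), "g3": (2.0, ["g1", "g2"]),
    "g4": (1.5, ["g3"]), "g5": (2.0, ["g3"]), "g6": (1.0, ["g4", "g5"]),
}
NUM_CASCADES = 2   # requested number of register-separated cascades

@lru_cache(maxsize=None)
def arrival(gate):
    """Longest combinational delay from the primary inputs to this gate's output."""
    delay, fanin = CIRCUIT[gate]
    return delay + max((arrival(g) for g in fanin), default=0.0)

critical_path = max(arrival(g) for g in CIRCUIT)
target = critical_path / NUM_CASCADES

# Heuristic: cut at equal fractions of the critical path. Because arrival times
# only grow along edges, every fan-in lands in the same or an earlier cascade.
stage_of = {g: min(math.ceil(arrival(g) / target) - 1, NUM_CASCADES - 1)
            for g in CIRCUIT}

# The worst accumulated delay inside each cascade sets the admissible clock period.
boundary = [0.0] + [max(arrival(g) for g in CIRCUIT if stage_of[g] == i)
                    for i in range(NUM_CASCADES)]
period = max(boundary[i + 1] - boundary[i] for i in range(NUM_CASCADES))

print("cascade assignment:", stage_of)
print(f"critical path {critical_path:.1f}, pipelined clock period ~ {period:.1f}")
```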

  1. GASVOL 18'' gas pipeline - risk based inspection study

    Energy Technology Data Exchange (ETDEWEB)

    Bjoernoey, Ola H.; Etterdal, Birger A. [Det Norske Veritas (DNV), Oslo (Norway); Guarize, Rosimar; Oliveira, Luiz F.S. [Det Norske Veritas (DNV) (Brazil); Faertes, Denise; Dias, Ricardo [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2003-07-01

    This paper describes a risk based approach and inspection planning as part of the Pipeline Integrity Management (PIM) system for the 95.5 km long 18'' GASVOL gas pipeline in the South eastern region of Brazil transporting circa 5 000 000 m3 dry gas per day. Pipeline systems can be subject to several degradation mechanisms and inspection and monitoring are used to ensure system integrity. Modern pipeline regulations and codes are normally based on a core safety or risk philosophy. The detailed design requirements presented in design codes are practical interpretations established so as to fulfill these core objectives. A given pipeline, designed, constructed and installed according to a pipeline code is therefore the realization of a structure, which, along its whole length, meets the applicable safety objectives of that code. The main objective of Pipeline Integrity Management (PIM) is to control and document the integrity of the pipeline for its whole service life, and to do this in a cost-effective manner. DNV has a specific approach to RBI planning, starting with an initial qualitative assessment where pipelines and damage type are ranked according to risk and potential risk reduction by an inspection and then carried forward to a quantitative detailed assessment where the level of complexity and accuracy can vary based on availability of information and owner needs. Detailed assessment requires significant effort in data gathering. The findings are dependent upon the accuracy of the inspection data, and on DNV's interpretation of the pipeline reference system and simplifications in the inspection data reported. The following specific failure mechanisms were investigated: internal corrosion, external corrosion, third party interference, landslides and black powder. RBI planning, in general words, is a 'living process'. In order to optimize future inspections, it is essential that the analyses utilize the most recent information regarding

  2. Method and system for pipeline communication

    Science.gov (United States)

    Richardson, John G. [Idaho Falls, ID

    2008-01-29

    A pipeline communication system and method includes a pipeline having a surface extending along at least a portion of the length of the pipeline. A conductive bus is formed on and extends along a portion of the surface of the pipeline. The conductive bus includes a first conductive trace and a second conductive trace, with the first and second conductive traces being adapted to conformally couple with the pipeline at the surface extending along at least a portion of its length. A transmitter for sending information along the conductive bus on the pipeline is coupled thereto, and a receiver for receiving the information from the conductive bus on the pipeline is also coupled to the conductive bus.

  3. 77 FR 19799 - Pipeline Safety: Pipeline Damage Prevention Programs

    Science.gov (United States)

    2012-04-02

    ..., excavation, tunneling, or construction activity to establish the location of underground facilities in the demolition, excavation, tunneling, or construction area; 2. Disregard location information or markings... construction activity; and 3. Fail to report excavation damage to a pipeline facility to the owner or operator...

  4. 75 FR 13342 - Pipeline Safety: Workshop on Distribution Pipeline Construction

    Science.gov (United States)

    2010-03-19

    ... practices in natural gas distribution pipeline construction management and quality control. This workshop... St. Louis at the Ballpark, 1 South Broadway, St. Louis, MO 63102. Hotel reservations under the ``U.S...-845-7354. A daily base rate of $110.00 is available. The meeting room will be posted at the hotel on...

  5. 78 FR 41496 - Pipeline Safety: Meetings of the Gas and Liquid Pipeline Advisory Committees

    Science.gov (United States)

    2013-07-10

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Meetings of the Gas and Liquid... for natural gas pipelines and for hazardous liquid pipelines. Both committees were established under.... ACTION: Notice of advisory committee meeting. SUMMARY: This notice announces a public meeting of the...

  6. California Natural Gas Pipelines: A Brief Guide

    Energy Technology Data Exchange (ETDEWEB)

    Neuscamman, Stephanie [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Price, Don [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pezzola, Genny [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Glascoe, Lee [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-01-22

    The purpose of this document is to familiarize the reader with the general configuration and operation of the natural gas pipelines in California and to discuss potential LLNL contributions that would support the Partnership for the 21st Century collaboration. First, pipeline infrastructure will be reviewed. Then, recent pipeline events will be examined. Selected current pipeline industry research will be summarized. Finally, industry acronyms are listed for reference.

  7. Recovery of blended product pipeline slops

    Energy Technology Data Exchange (ETDEWEB)

    Sloley, A.W. [Process Consulting Services, Inc., Houston, TX (United States)

    1996-09-01

    Both product pipeline operation and terminal blending often generate slops consisting of mixed hydrocarbon streams. Typical slops dispositions include local burning of the fuel for heat or power generation, or reshipment to a refinery in a crude stream. Both of these dispositions can incur significant economic penalties. An alternative is the use of a small local plant for the separation of the streams back into pipeline products. This is achievable as long as blend stocks rather than final products containing performance additives are being separated. Final products (gasoline, diesel) contain additives and blending components that are difficult to handle within the constraints of a small process unit. A proposed multi-product separation unit is presented. The case investigated shows the process configuration required for a unit to process a range of mixtures containing material from LPG to atmospheric gas oil. The material presented includes the plant flow scheme, identification of major equipment, and overall sizing of major equipment. The study results summarize the investment and operating costs of the unit compared with the values of the recoverable products.

  8. Corporate social responsibility along pipelines: communities and corporations working together

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Edison D.R.; Lopes, Luciano E.; Danciguer, Lucilene; Macarini, Samuel; Souza, Maira de [Grupo de Aplicacao Interdisciplinar a Aprendizagem (GAIA), Campinas, SP (Brazil)

    2009-07-01

    improving communities' life quality. 7. Follow-up, supporting community leaders during the dissemination of information about pipelines, project fund-raising and implementation. 8. Creation and follow-up of networks of companies to support some of the projects elaborated by the communities. 9. Impact evaluation, measuring the results accomplished by the whole project after its realization. The overall process is monitored with management and quality tools such as PDCA and with process and result indicators. The elaboration of projects by community members, organizing their needs and requests, facilitates management decisions regarding private social investment. During the follow-up, GAIA supports the communities' fund-raising from several organizations, as well as creating networks of potential local supporters. Those initiatives tend to dilute the requests from communities to companies. Thus, companies foster the communities' autonomy and citizenship, creating a situation in which both companies and communities benefit. (author)

  9. Working group 7: pipeline risk management

    Energy Technology Data Exchange (ETDEWEB)

    Kariyawasam, Shahani; Weir, David

    2011-07-01

    This seventh working group of the Banff 2011 conference provided an understanding of the risk management process in the pipeline industry, including system-wide risk assessment, risk-based integrity systems and risk control techniques. The presentations furnished a basis on which to discuss programs, processes and procedures, including reliability-based decision support and performance measures, that support a company's risk management policy. The workshop was divided into three sessions. The first session focused on the comparison between reliability methods and conventional deterministic methods in terms of accuracy, simplicity and sensitivity. Next, the importance of low-probability, high-consequence events and the processes to prevent them were discussed. The last session discussed the consequences of management processes on failures and risks. The debates following these presentations sought to identify the best management practices for reducing risks, and the regulations and requirements to be developed.

  10. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

    Full Text Available Abstract Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: (1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; (2) the Scheduler, which forms the functional core of the system and which tracks what data enter the system and determines what jobs must be scheduled for execution; and (3) the Executor, which searches for scheduled jobs and executes them on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system implementing the stated requirements. Cyrille2 enables easy creation and execution of high-throughput, flexible bioinformatics pipelines.
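
    The Scheduler/Executor split described above is a common pattern for pipeline systems, and can be illustrated with an in-memory toy version. The class names and behaviour below are invented for this sketch; Cyrille2 itself persists pipelines and jobs and dispatches them to a compute cluster:

```python
import queue

class Scheduler:
    """Watches for new datasets and turns pipeline steps into executable jobs."""
    PIPELINE = ["quality_check", "map_reads", "annotate"]   # invented step names

    def __init__(self, job_queue):
        self.job_queue = job_queue

    def on_new_data(self, dataset):
        for step in self.PIPELINE:
            self.job_queue.put((step, dataset))

class Executor:
    """Pulls scheduled jobs and runs them (here it only reports what it would do)."""
    def __init__(self, job_queue):
        self.job_queue = job_queue

    def run_pending(self):
        while not self.job_queue.empty():
            step, dataset = self.job_queue.get()
            print(f"executing {step} on {dataset}")

jobs = queue.Queue()
Scheduler(jobs).on_new_data("sample_001.fastq")
Executor(jobs).run_pending()
```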

  11. The Spanish in-kind contribution to ESO - Pipelines

    Science.gov (United States)

    de Bilbao, L.; Ballester, P.

    2011-11-01

    Ever since entering ESO in 2006, Spain has made an in-kind contribution in various fields. In the Pipeline Systems Department (PSD), this contribution has been made to a number of projects, in particular to the CPL (Common Pipeline Library), to the pipelines of the instruments ISAAC, SofI and FORS, and to the integration of the pipeline for the new VISTA telescope into the DFS (Data Flow System) at Paranal. Some of these projects started with the VLT a decade ago and have evolved since, as a consequence of standardization processes, of the search for common elements and, especially in the last couple of years, of the efforts to provide the astronomy community with 'science-ready' data products. At the same time, the PSD faces new challenges today, such as the dramatic increase in the data volume of the Paranal Observatory since the start of operations of VISTA, or the needs of the pipelines of the future instruments of the E-ELT. For the latter, more concretely, the current status of the CPL is being analyzed, with the goal of establishing the necessary changes to meet those future needs. In this article we briefly introduce these projects and detail the Spanish in-kind contribution to them.

  12. The minimal preprocessing pipelines for the Human Connectome Project.

    Science.gov (United States)

    Glasser, Matthew F; Sotiropoulos, Stamatios N; Wilson, J Anthony; Coalson, Timothy S; Fischl, Bruce; Andersson, Jesper L; Xu, Junqian; Jbabdi, Saad; Webster, Matthew; Polimeni, Jonathan R; Van Essen, David C; Jenkinson, Mark

    2013-10-15

    The Human Connectome Project (HCP) faces the challenging task of bringing multiple magnetic resonance imaging (MRI) modalities together in a common automated preprocessing framework across a large cohort of subjects. The MRI data acquired by the HCP differ in many ways from data acquired on conventional 3 Tesla scanners and often require newly developed preprocessing methods. We describe the minimal preprocessing pipelines for structural, functional, and diffusion MRI that were developed by the HCP to accomplish many low level tasks, including spatial artifact/distortion removal, surface generation, cross-modal registration, and alignment to standard space. These pipelines are specially designed to capitalize on the high quality data offered by the HCP. The final standard space makes use of a recently introduced CIFTI file format and the associated grayordinate spatial coordinate system. This allows for combined cortical surface and subcortical volume analyses while reducing the storage and processing requirements for high spatial and temporal resolution data. Here, we provide the minimum image acquisition requirements for the HCP minimal preprocessing pipelines and additional advice for investigators interested in replicating the HCP's acquisition protocols or using these pipelines. Finally, we discuss some potential future improvements to the pipelines.

  13. Optimal energy consumption analysis of natural gas pipeline.

    Science.gov (United States)

    Liu, Enbin; Li, Changjun; Yang, Yi

    2014-01-01

    There are many compressor stations along long-distance natural gas pipelines. Natural gas can be transported using different boot programs and import pressures, combined with temperature control parameters. Moreover, different transport methods have correspondingly different energy consumptions. At present, the operating parameters of many pipelines are determined empirically by dispatchers, resulting in high energy consumption. This practice does not abide by energy reduction policies. Therefore, based on a full understanding of the actual needs of pipeline companies, we introduce production unit consumption indicators to establish an objective function for achieving the goal of lowering energy consumption. By using a dynamic programming method for solving the model and preparing calculation software, we can ensure that the solution process is quick and efficient. Using established optimization methods, we analyzed the energy savings for the XQ gas pipeline. By optimizing the boot program, the import station pressure, and the temperature parameters, we achieved the optimal energy consumption. By comparison with the measured energy consumption, the pipeline now has the potential to reduce energy consumption by 11 to 16 percent.
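    The dynamic-programming idea mentioned above can be sketched as a search over discretized discharge pressures at successive compressor stations. The cost model, friction loss and candidate pressures below are placeholder assumptions for illustration; they are not the paper's production unit-consumption model or the XQ pipeline data.

        # Generic dynamic-programming sketch: choose the discharge pressure at each
        # compressor station so that total compression energy is minimized.
        # Cost and feasibility functions are placeholders, not the paper's model.
        import math

        PRESSURES = [6.0, 6.5, 7.0, 7.5, 8.0]   # candidate discharge pressures, MPa (assumed)
        N_STATIONS = 5

        def segment_inlet_pressure(p_out_prev):
            """Pressure arriving at the next station after friction loss (placeholder)."""
            return p_out_prev - 1.2

        def compression_energy(p_in, p_out):
            """Energy to compress from p_in to p_out (placeholder polytropic-style cost)."""
            if p_out <= p_in:
                return 0.0
            return 1000.0 * (math.pow(p_out / p_in, 0.283) - 1.0)

        def optimize(p_start):
            # best[p] = minimal cumulative energy to leave the current station at pressure p
            best = {p_start: 0.0}
            for _ in range(N_STATIONS):
                nxt = {}
                for p_out_prev, cost in best.items():
                    p_in = segment_inlet_pressure(p_out_prev)
                    for p_out in PRESSURES:
                        if p_out < p_in:          # do not throttle below arrival pressure
                            continue
                        c = cost + compression_energy(p_in, p_out)
                        if p_out not in nxt or c < nxt[p_out]:
                            nxt[p_out] = c
                best = nxt
            return min(best.values())

        print(f"minimal total energy (arbitrary units): {optimize(7.0):.1f}")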

  14. Pipeline information system, a tool for making decisions

    Energy Technology Data Exchange (ETDEWEB)

    Polanco, R.P.; Betancourt, E.R. [Pemex Gas y Petroquimica Basica (Mexico)

    2000-07-01

    Issues regarding the operation, maintenance, safety and inspection of the 11,000 km of pipeline located in 24 of the 32 Mexican states were discussed, with a focus on the technical information system that Pemex Gas y Petroquimica Basica established to integrate digitalized pipeline trajectories with several geographic maps and technical databases. The objective was to establish a means for technical consultation in decision-making processes. Most of the pipelines carry natural gas, liquefied petroleum gas or basic petrochemical products. Information management and worker safety in surrounding populated areas require an effective system to accurately locate transportation and distribution facilities. Pemex's technical information system makes use of the Global Positioning System (GPS) to show aboveground facilities and pipeline trajectories on digitized geographic maps. The system can also be used to manipulate technical databases, update the pipelines' cathodic protection values, and measure and pinpoint regular or serious problems detected by internal inspection equipment. Among the most important capabilities of the system is the integration of information on pipe construction codes, pipe thickness and diameter, design and construction year, and maximum allowable pipe pressure. The authors emphasized how important data transmission through digital media will be in the coming years. 1 ref., 6 figs.

  15. Pipeline building programmes in decline world-wide

    Energy Technology Data Exchange (ETDEWEB)

    1979-06-01

    Apart from current construction on the Transmediterranean gas pipeline from Tunisia to Italy, on the 747 mi crude oil line from Abqaiq to Yanbu in Saudi Arabia, on crude and gas lines in Mexico, and in the U.S.S.R., pipeline work world-wide is in decline. In the U.S., Standard Oil Co. (Ohio) has abandoned its planned $1 billion oil pipeline from Long Beach, Calif., to Midland, Tex., and the $10 billion, 4800 mi Alaska Highway gas pipeline has run into financial difficulties. In Iran, Snamprogetti has stopped work on the Iranian Gas Trunkline II because of the revolution. In the North Sea, construction this summer is limited to short inter-field link lines. The surplus refining and shipping capacity in Europe indicates there is little economic incentive to build new oil and chemical processing capacity or the pipelines to serve them. Prospects for gas-line construction are better owing to the increasing availability of gas. France plans to extend its network, and Denmark, which plans to exploit its offshore associated and nonassociated gas finds, is considering a gas grid, possibly connecting to Sweden. Possible oil and gas lines in the Shetlands, the Netherlands, offshore Spain, Greece, and Italy are discussed.

  16. Detecting abnormalities in gas pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Smati, A. (Institut National des Hydrocarbures et de la Chimie, Boumerdes (Algeria))

    1994-12-01

    The results of the measurement of the principal operating parameters can contain precious information on the condition of gas pipelines. This article explains how statistical tests may be useful in detecting anomalies that can occur on lines and in compressor stations. (author). 6 refs., 3 tabs., 1 fig.

  17. Wax deposition in crude oil pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Assuncao, Pablo Morelato; Rodrigues, Lorennzo Marrochi Nolding [Universidade Federal do Espirito Santo, Sao Mateus, ES (Brazil). Centro Universitario Norte do Espirito Santo. Engenharia de Petroleo; Romero, Mao Ilich [University of Wyoming, Laramie, WY (United States). Enhanced Oil Recovery Institute], e-mail: mromerov@uwyo.edu

    2010-07-01

    Crude oil is a complex mixture of hydrocarbons which consists of aromatics, paraffins, naphthenics, resins, asphaltenes, etc. When the temperature of crude oil is reduced, the heavy components, like paraffin, precipitate and deposit on the internal pipe wall in the form of a wax-oil gel. The gel deposit consists of wax crystals that trap some amount of oil. As the temperature gets cooler, more wax precipitates and the thickness of the wax gel increases, causing gradual solidification of the crude until eventually the oil stops moving inside the offshore pipeline. The crude oil may then not be able to be re-mobilized during restart. The effective diameter is reduced as wax deposits, resulting in several problems: higher pressure drop (and therefore additional pumping energy costs), poor oil quality, the use of chemical agents such as precipitation inhibitors or flow improvers, equipment failure, risk of leakage, and clogging of the ducts and process equipment. Wax deposition problems can become so severe that the whole pipeline is completely blocked; it would cost millions of dollars to remediate an offshore pipeline blocked by wax. Wax solubility decreases drastically with decreasing temperature, so at the low temperatures encountered in deep water production wax precipitates easily. The highest temperature below which the paraffins begin to precipitate as wax crystals is defined as the wax appearance temperature (WAT). The deposition process is a complex free surface problem involving thermodynamics, fluid dynamics, and mass and heat transfer. In this work, a numerical analysis of wax deposition by the molecular diffusion and shear dispersion mechanisms in a crude oil pipeline is studied. The diffusion flux of wax toward the wall is estimated by Fick's law of diffusion, and the shear dispersion in a similar way; the wax concentration gradient at the solid-liquid interface is obtained from the volume fraction conservation equation; and since the wax deposition
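    The Fick's-law estimate mentioned above can be illustrated with an order-of-magnitude calculation. All numerical values below (diffusivity, solubility gradient, wall temperature gradient) are assumed for illustration and are not the paper's data.

        # Order-of-magnitude estimate of the molecular-diffusion contribution to wax
        # deposition via Fick's law, J = -D_wo * (dC/dT) * (dT/dr) at the pipe wall.
        # All numerical values are illustrative assumptions, not field data.
        D_WO = 1.0e-9      # wax diffusivity in oil, m^2/s (assumed)
        DC_DT = 0.15       # dissolved-wax concentration change with temperature, kg/(m^3 K) (assumed)
        DT_DR = -800.0     # radial temperature gradient at the wall, K/m (assumed, cooling outward)

        flux = -D_WO * DC_DT * DT_DR          # kg/(m^2 s), positive toward the wall
        per_day = flux * 86400.0
        print(f"diffusive wax flux toward wall: {flux:.3e} kg/m^2/s (~{per_day:.3f} kg/m^2/day)")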

  18. Real-Time Pulsars Pipeline Using Many-Cores

    Science.gov (United States)

    Sclocco, Alessio; Van Nieuwpoort, R.; Bal, H. E.

    2014-04-01

    Exascale radio astronomy presents challenges to both astronomers and computer scientists. One of these challenges is processing the enormous amount of data that will be produced by exascale instruments, like the Square Kilometer Array (SKA). Traditional approaches, based on storing data to process them off-line, are common nowadays, but are unfeasible in the exascale era due to the high bandwidths. We investigate the use of many-core accelerators as a way to achieve real-time performance without exceeding cost and power constraints. In our current research, we aim at accelerating the pulsar searching process, and at producing a real-time and scalable software pipeline for the exascale era. Our pipeline consists of three main steps: dedispersion, folding and signal-to-noise ratio computation. It is open source and implemented using the Open Computing Language (OpenCL). To achieve our goals of real-time performance, scalability and portability, we applied three different techniques. First, we designed all steps of the pulsar pipeline to run on many-core accelerators, even the less computationally intensive ones. This way, communication between host and accelerator is minimized, avoiding a common bottleneck of many-core accelerated computing. Second, we parallelized the pipeline with a fine-grained approach. Because of this parallelization strategy, it is possible not only to distribute the input beams to different computation nodes, but also to define which part of the search space is explored by each node. This completely avoids inter-node communication, and scalability of the pipeline can simply be achieved by adding more machines. Third, we use extensive auto-tuning for both the single processing kernels and the pipeline as a whole. By using auto-tuning, we do not simply find the best possible parameter configuration, thus obtaining high performance, but also make the pipeline portable among different computing devices, and adaptable to different telescopes and observational
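    The dedispersion step named above can be sketched in a few lines: each frequency channel is shifted by its dispersion delay and the channels are summed. The synthetic data, channel frequencies and trial dispersion measure below are assumptions; the 4.15 ms constant is the standard approximation for frequencies in GHz. This is a reference model, not the paper's OpenCL kernel.

        # Minimal brute-force incoherent dedispersion sketch on synthetic data.
        import numpy as np

        def dedisperse(data, freqs_ghz, dm, dt):
            """Shift each channel by its dispersion delay and sum over channels.

            data: (n_chan, n_samp) array; freqs_ghz: (n_chan,); dt: sampling time in s.
            """
            f_ref = freqs_ghz.max()
            delays_s = 4.15e-3 * dm * (freqs_ghz**-2 - f_ref**-2)
            shifts = np.round(delays_s / dt).astype(int)
            out = np.zeros(data.shape[1])
            for chan, shift in enumerate(shifts):
                out += np.roll(data[chan], -shift)
            return out

        rng = np.random.default_rng(0)
        n_chan, n_samp, dt = 64, 4096, 1e-4
        freqs = np.linspace(1.2, 1.6, n_chan)          # GHz
        data = rng.normal(size=(n_chan, n_samp))

        # inject a dispersed pulse at DM = 50 pc cm^-3
        true_dm = 50.0
        delays = 4.15e-3 * true_dm * (freqs**-2 - freqs.max()**-2)
        for c, d in enumerate(delays):
            data[c, 1000 + int(round(d / dt))] += 8.0

        series = dedisperse(data, freqs, true_dm, dt)
        snr = (series.max() - series.mean()) / series.std()
        print(f"peak S/N after dedispersion at the true DM: {snr:.1f}")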

  19. Condensation-Induced Steam Bubble Collapse in a Pipeline

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Steam bubbles often occur in pipelines due to the pipeline structure or operational errors. The water hammer force induced by the steam bubble collapse is a hidden safety concern. This paper experimentally and numerically investigates the conditions for steam bubble formation and collapse. A series of video pictures taken in the laboratory show that steam bubbles form and collapse over several cycles. The pressure history of the steam bubbles is measured in conjunction with the pictures. In the experiment, the liquid column cavitated at the low pressures and then the cavities collapsed due to condensation causing high pressure pulses. The process was also simulated numerically. The results suggest that coolant pipeline design and operation must include procedures to avoid steam bubble formation.

  20. Detection of underground pipeline based on Golay waveform design

    Science.gov (United States)

    Dai, Jingjing; Xu, Dazhuan

    2017-08-01

    The detection of underground pipelines is an important problem in urban development, but research on it is not yet mature. In this paper, based on the principle of waveform design in wireless communication, we design an acoustic signal detection system to locate underground pipelines. According to the principle of acoustic localization, we chose the DSP-F28335 as the development board and master control chip, together with DA and AD modules. The DA module uses a complementary Golay sequence as the emission signal. The AD module acquires data synchronously, so that the echo signals containing the position information of the target are recovered through signal processing. The test results show that the method presented in this paper can not only calculate the sound velocity in the soil, but also locate underground pipelines accurately.
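    The reason complementary Golay sequences suit this kind of echo ranging is that the sum of the two autocorrelations is an ideal delta, so a single echo delay stands out sharply. The sketch below uses a toy propagation model (one echo plus additive noise); it is not the paper's DSP implementation.

        # Complementary Golay pair: generate, transmit each sequence, sum the two
        # matched-filter outputs so that autocorrelation sidelobes cancel.
        import numpy as np

        def golay_pair(n_iter):
            """Generate a complementary Golay pair of length 2**n_iter."""
            a, b = np.array([1.0]), np.array([1.0])
            for _ in range(n_iter):
                a, b = np.concatenate([a, b]), np.concatenate([a, -b])
            return a, b

        a, b = golay_pair(6)                      # length-64 pair
        delay, n = 200, 512
        rng = np.random.default_rng(1)

        # Simulated received signals: each sequence transmitted separately, echo at `delay`.
        rx_a = np.zeros(n); rx_a[delay:delay + a.size] = a
        rx_b = np.zeros(n); rx_b[delay:delay + b.size] = b
        rx_a += 0.3 * rng.normal(size=n)
        rx_b += 0.3 * rng.normal(size=n)

        # Correlate each received signal with its own sequence and sum the outputs.
        corr = np.correlate(rx_a, a, mode="full") + np.correlate(rx_b, b, mode="full")
        est = int(np.argmax(corr)) - (a.size - 1)   # peak sits at delay + len(a) - 1
        print(f"estimated echo delay: {est} samples (true: {delay})")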

  1. Simulation of pipeline in the area of the underwater crossing

    Science.gov (United States)

    Burkov, P.; Chernyavskiy, D.; Burkova, S.; Konan, E. C.

    2014-08-01

    The article studies the stress-strain behavior of a section of the Alexandrovskoye-Anzhero-Sudzhensk main oil pipeline using the Ansys software system. This method of examining and assessing the technical condition of pipeline transport facilities studies the objects and the processes that affect their technical condition, including research based on computer simulation. Such an approach allows the development of theory, calculation methods and designs for pipeline transport facilities and for machine units and parts, regardless of industry and purpose, with a view to improving existing structures and creating new structures and machines of high performance, durability, reliability and maintainability, with low material consumption and cost, which are competitive on the world market.

  2. National Energy Board (NEB) pipeline integrity management program

    Energy Technology Data Exchange (ETDEWEB)

    Hendershot, J. (National Energy Board, Calgary, AB (Canada))

    1999-01-01

    The National Energy Board (NEB) ensures the safe design, construction and operation of pipelines that cross provincial or national borders. Since 1991, there have been 22 major pipeline failures, most of which were caused by corrosion, 5 by stress corrosion cracks, and 3 by slope stability problems. After a meeting of pipeline companies with the NEB, new regulations were put in place. The new regulations include an emphasis on maintenance, a requirement for proactivity by owners, and integrity management guidelines. While the integrity management guidelines are not regulations, they represent industry best practices, allow a degree of flexibility, and allow enforcement based on intent and the use of an audit process. The guidelines are comprised of a management system, a working records management system, condition monitoring, and mitigation.

  3. National Energy Board (NEB) pipeline integrity management program

    Energy Technology Data Exchange (ETDEWEB)

    Hendershot, J. [National Energy Board, Calgary, AB (Canada)]

    1999-11-01

    The National Energy Board (NEB) ensures the safe design, construction and operation of pipelines that cross provincial or national borders. Since 1991, there have been 22 major pipeline failures, most of which were caused by corrosion, 5 by stress corrosion cracks, and 3 by slope stability problems. After a meeting of pipeline companies with the NEB, new regulations were put in place. The new regulations include an emphasis on maintenance, a requirement for proactivity by owners, and integrity management guidelines. While the integrity management guidelines are not regulations, they represent industry best practices, allow a degree of flexibility, and allow enforcement based on intent and the use of an audit process. The guidelines are comprised of a management system, a working records management system, condition monitoring, and mitigation.

  4. Hydraulic calculation of gravity transportation pipeline system for backfill slurry

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qin-li; HU Guan-yu; WANG Xin-min

    2008-01-01

    Taking the cemented coal gangue pipeline transportation system in Suncun Coal Mine, Xinwen Mining Group, Shandong Province, China, as an example, the hydraulic calculation approach and process for gravity pipeline transportation of backfill slurry were investigated. The results show that the capability of the backfill system should be higher than 74.4 m3/h, according to the mining production and backfill times in the mine; the minimum (critical) velocity and the practical working velocity of the backfill slurry are 1.44 and 3.82 m/s, respectively. Various formulae give a maximum ratio of total pipeline length to vertical height (L/H ratio) for the backfill system of 5.4, from which the reliability and capability of the system can be evaluated.

  5. Simulation of pipelining pours point depressant beneficiated waxy crude oil through China West Crude Oil Pipeline

    Institute of Scientific and Technical Information of China (English)

    李鸿英; 张劲军; 凌霄; 黄启玉; 林小飞; 贾邦龙; 李宇光

    2008-01-01

    Flow properties of waxy crude oils, particularly PPD-beneficiated waxy crude oils, are sensitive to the shear history that the crude oil has experienced, the so-called shear history effect. Simulating this shear history effect accurately is vital to pipeline design and operation. It has been demonstrated by our previous work that the energy dissipation, or entropy generation, due to viscous flow in the shear process is a suitable parameter for simulating the shear history effect. In order to further verify the reliability of this approach, experimental simulations were conducted for three PPD-beneficiated waxy crude oils transported through the China West Crude Oil Pipeline, so far the most technically and operationally complicated long-distance crude oil pipeline in China. The simulations were made using a stirred vessel, with the energy dissipation of viscous flow as the shear simulation parameter. Comparing the flow properties of the crude oils obtained from field tests with those from the experimental simulations, it is found that the gel points and viscosities from the experimental simulations are in good agreement with the field data.

  6. Optical Fiber Pipeline Security Forewarning System

    Institute of Scientific and Technical Information of China (English)

    Jiang Qishan; Ren Ruijun; Ren Peikui

    2010-01-01

    With the rapid development of China's economy, incidents affecting oil and gas pipelines, caused by industrial and agricultural activity, natural disasters, oil theft, etc., have become prevalent and have negatively influenced normal pipeline operation. Any such destructive activity first vibrates the soil around the pipeline, and the cable laid in the pipe trench then responds to the vibration. Using this principle, the Department of Science & Technology of CNPC has been researching equipment to monitor activities along pipelines since 2001.

  7. The JCSG high-throughput structural biology pipeline.

    Science.gov (United States)

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  8. Oil pipeline valve automation for spill reduction

    Energy Technology Data Exchange (ETDEWEB)

    Mohitpour, Mo; Trefanenko, Bill [Enbridge Technology Inc, Calgary (Canada); Tolmasquim, Sueli Tiomno; Kossatz, Helmut [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2003-07-01

    Liquid pipeline codes generally stipulate placement of block valves along liquid transmission pipelines, for example on each side of major river crossings where environmental hazards could cause, or are foreseen to potentially cause, serious consequences. Codes, however, do not stipulate any requirement for block valve spacing for low vapour pressure petroleum transportation, nor for remote pipeline valve operations to reduce spills. A review of pipeline codes for valve requirements and spill limitation in high consequence areas is thus presented, along with a criterion for an acceptable spill volume that could be caused by a pipeline leak or full rupture. A technique for deciding economically and technically effective pipeline block valve automation for remote operation, to reduce oil spills and control hazards, is also provided. In this review, industry practice is highlighted, and the application of the criterion for maximum permissible oil spill and of the technique for deciding valve automation, as applied to the ORSUB pipeline, is presented. ORSUB is one of the three pipelines initially selected for study. These pipelines represent about 14% of the total length of petroleum transmission lines operated by PETROBRAS Transporte S.A. (TRANSPETRO) in Brazil. Based on the implementation of valve motorization on these three pipelines, motorization of block valves for remote operation on the remaining pipelines is intended, depending on the success of these implementations, on historical records of failure and on appropriate ranking. (author)

  9. Upstream pipelines : inspection, corrosion and integrity management

    Energy Technology Data Exchange (ETDEWEB)

    Paez, J.; Stephenson, M. [Talisman Energy Inc., Calgary, AB (Canada)] (comps.)

    2009-07-01

    Accurate inspection techniques are needed to ensure the integrity of pipelines. This working group discussed methods of reducing pipeline failures for a variety of pipes. A summary of recent pipeline performance statistics was presented, as well as details of third-party damage and fiberglass pipe failures. A batch inhibitor joint industry project was described. The session demonstrated that integrity programs need to be developed at the field level as well as at the upper management level. Fiberglass pipeline failures are a significant problem for pipeline operators. Corrosion monitoring, pigging and specific budgets are needed in order to ensure the successful management of pipeline integrity. New software developed to predict pipeline corrosion rates was discussed, and methods of determining mole fractions and flow regimes were presented. The sessions included updates from regulators and standards agencies as well as discussions of best practices, regulations, codes and standards related to pipeline integrity. The working group was divided into 4 sessions: (1) updates since 2007 with input from the Canadian Association of Petroleum Producers (CAPP) and the Upstream Pipeline Integrity Management Association (UPIMA); (2) integrity of non-metallic pipelines; (3) upstream pipeline integrity issues; and (4) hot topics. tabs., figs.

  10. The MIRI Medium Resolution Spectrometer calibration pipeline

    CERN Document Server

    Labiano, A; Bailey, J I; Beard, S; Dicken, D; García-Marín, M; Geers, V; Glasse, A; Glauser, A; Gordon, K; Justtanont, K; Klaassen, P; Lahuis, F; Law, D; Morrison, J; Müller, M; Rieke, G; Vandenbussche, B; Wright, G

    2016-01-01

    The Mid-Infrared Instrument (MIRI) Medium Resolution Spectrometer (MRS) is the only mid-IR Integral Field Spectrometer on board the James Webb Space Telescope. The complexity of the MRS requires a very specialized pipeline, with some specific steps not present in the pipelines of other JWST instruments, such as fringe corrections and wavelength offsets, with different algorithms for point-source and extended-source data. The MRS pipeline also has two different variants: the baseline pipeline, optimized for most foreseen science cases, and the optimal pipeline, where extra steps will be needed for specific science cases. This paper provides a comprehensive description of the MRS Calibration Pipeline from uncalibrated slope images to final scientific products, with brief descriptions of its algorithms, input and output data, and the accessory data and calibration data products necessary to run the pipeline.

  11. Welding technique studies on the "West-East" pipeline project

    Institute of Scientific and Technical Information of China (English)

    Sui Yongli; Du Zeyu; Huang Fuxiang; Qi Lichun

    2006-01-01

    This paper describes the welding process design for the "West-East" pipeline project, which involves high pressure, large diameter and heavy wall thickness. Taking into account the different geographical conditions, climate, culture and the flexibility of the welding methods, this work recommends the semi-automatic process for the east and middle sections and the automatic process for the west section of the pipeline project. The manual process is recommended for tie-in joints and repairs. Double-joint and three-joint pipes are recommended in water-network areas and at some in-ditch welding locations to reduce the welding volume. Specially redesigned bevels are also recommended for the automatic and semi-automatic processes. Destructive tests show that the welds meet the requirements of the related standards, specifications and design documents.

  12. Russia: the pipeline diplomacy; Russie: la diplomatie du pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Bourdillon, Y

    2005-01-15

    First world producer of oil and gas, Russia wishes to use its mastery of energy distribution to recover its great-power status. The oil and gas pipeline network is the foundation on which Russia is building its hegemony in Europe. The Russian oil and gas companies are also carrying out a long-term strategy of international expansion, in particular through investments in neighboring countries for the building of new infrastructure or the purchase of oil refineries. (J.S.)

  13. Application of new experimental methods to pipeline stress corrosion cracking. Annual report, March 1992-February 1993

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, C.G.; Kobayashi, T.; Becker, C.H.; Pound, B.G.; Simons, J.W.

    1993-04-01

    The objective of the investigation is to develop a physically based understanding of the mechanisms of stress corrosion cracking (SCC) in pipeline steels by applying advanced fracture surface and electrochemical characterization techniques to samples taken from fielded pipeline. The investigations found that the effect of pressure fluctuations on the propagation of stress corrosion cracks was readily evident from an analysis of the topographies of conjugate fracture surfaces. Substantial crack blunting was produced under normal pipeline operating conditions. Corrosion deposits were removed from the fracture surfaces of a stress corrosion crack in a pipeline specimen recovered from service. The topography of the underlying metal surface appears to be preserved with little corrosion damage after crack formation. This allowed the cracking process to be reconstructed from the surface topography. In some cases, deposits on the fracture surfaces of stress corrosion cracks contain significant concentrations of metallic elements that are not found in pipeline steels but are likely to be commonplace in the surrounding environment.

  14. Logistics aspects of pipeline transport in the supply of petroleum products

    Directory of Open Access Journals (Sweden)

    Wessel Pienaar

    2008-09-01

    Full Text Available The commercial transportation of crude oil and petroleum products by pipeline is receiving increased attention in South Africa. Transnet Pipeline Transport has recently obtained permission from the National Energy Regulator of South Africa (Nersa) to construct and operate a new petroleum products pipeline of 60 cm diameter from Durban to Gauteng. At an operating speed of 10 km/h the proposed 60 cm Transnet pipeline would be able to deliver 3.54 million litres of petroleum product per hour. This is equivalent to 89 deliveries per hour using road tank vehicles with an average carrying capacity of 40 000 litres of fuel per vehicle. This pipeline throughput is also equivalent to two trains departing per hour, each consisting of 42 petroleum tank wagons with an average carrying capacity of 42 500 litres of fuel per wagon. Considering that such road trucks and rail wagons return empty to the upstream refineries in Durban, it is clear that there is no tenable long-term alternative to pipeline transport: pipeline transport is substantially cheaper than road and rail transport; pipeline transport is much safer than rail and especially road transport; and pipeline transport frees up alternative road and rail transport capacity. Pipeline transport is a non-containerised bulk mode of transport for the carriage of suitable liquids (for example, petroleum commodities, which include crude oil, refined fuel products and liquid petro-chemicals), gas, slurrified coal and certain water-suspended ores and minerals. In South Africa, petroleum products account for the majority of commercial pipeline traffic, followed by crude oil and natural gas. There are three basic types of petroleum pipeline transport systems: gathering pipeline systems, crude oil trunk pipeline systems, and refined products pipeline systems. Collectively, these systems provide a continuous link between extraction, processing, distribution, and wholesalers' depots in areas of consumption. The following
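    As a quick sanity check of the modal equivalences quoted above, the figures can be reproduced from the stated capacities; the small differences are due to rounding.

        # Arithmetic check of the throughput equivalences quoted in the abstract.
        pipeline_l_per_h = 3.54e6          # litres per hour through the proposed 60 cm pipeline

        road_truck_l = 40_000              # average road tank vehicle capacity, litres
        trucks_per_h = pipeline_l_per_h / road_truck_l
        print(f"road deliveries per hour: {trucks_per_h:.1f} (text: 89)")

        wagons_per_train = 42
        wagon_l = 42_500                   # average rail tank wagon capacity, litres
        trains_per_h = pipeline_l_per_h / (wagons_per_train * wagon_l)
        print(f"trains per hour: {trains_per_h:.2f} (text: 2)")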

  15. Global buckling assessment of high pressure and high temperature (HP/HT) offshore pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Seung-Ho; Jung, Jong-Jin; Lee, Woo-Seob [Maritime Research Institute, Hyundai Heavy Industries, Ulsan, (Korea, Republic of); Kim, Yun-Hak; Kim, Jong-Bae [Offshore Installation Engineering Department, Hyundai Heavy Industries, Ulsan, (Korea, Republic of)

    2010-07-01

    High pressure and high temperature (HP/HT) offshore pipelines are frequently subjected to lateral buckling due to excessive compressive axial force. Several control measures, such as sleepers, have been designed to reduce lateral buckling. This paper investigated the effect of introducing sleepers as buckle triggers on the behavior of HP/HT pipelines. A 3D finite element analysis using ABAQUS software was performed to simulate the concrete sleepers and the seabed profile. The analysis criteria were the buckling amplitude, the Von Mises stress, the equivalent plastic strain and the effective axial force in the pipeline. A case study for an HP/HT pipeline was carried out based on installation surveys. Comparisons between the results from a model without a buckle trigger and those from a model with a buckle trigger were carried out. It was found that the change to the support structure, adding a buckle trigger, affected the behaviour of the pipeline considerably.

  16. Markov chain model helps predict pitting corrosion depth and rate in underground pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Caleyo, F.; Velazquez, J.C.; Hallen, J. M. [ESIQIE, Instituto Politecnico Nacional, Mexico D. F. (Mexico); Esquivel-Amezcua, A. [PEMEX PEP Region Sur, Villahermosa, Tabasco (Mexico); Valor, A. [Universidad de la Habana, Vedado, La Habana (Cuba)

    2010-07-01

    Recent reports place pipeline corrosion costs in North America at seven billion dollars per year. Pitting corrosion causes the highest percentage of failures among corrosion mechanisms. This has motivated multiple modelling studies focused on corrosion pitting of underground pipelines. In this study, a continuous-time, non-homogeneous pure birth Markov chain serves to model external pitting corrosion in buried pipelines. The analytical solution of Kolmogorov's forward equations for this type of Markov process gives the transition probability function in a discrete space of pit depths. The transition probability function can be completely identified by making a correlation between the stochastic pit depth mean and the deterministic mean obtained experimentally. The model proposed in this study can be applied to pitting corrosion data from repeated in-line pipeline inspections. Case studies presented in this work show how pipeline inspection and maintenance planning can be improved by using the proposed Markovian model for pitting corrosion.
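    A pure-birth chain over discrete pit-depth states can also be explored by simulation. The intensity function below is an assumption made for illustration; the paper identifies its intensities from field inspection data, and its analytical transition probabilities are not reproduced here.

        # Monte Carlo sketch of a non-homogeneous pure-birth Markov chain over
        # discrete pit-depth states; lambda(t) decays with pit age (assumed).
        import numpy as np

        rng = np.random.default_rng(2)

        def intensity(t):
            """Birth intensity per year (assumed form, not the paper's)."""
            return 0.8 / (1.0 + 0.2 * t)

        def simulate_pit(t_end, dt=0.01):
            """Simulate the depth state (number of transitions) up to t_end years."""
            state, t = 0, 0.0
            while t < t_end:
                if rng.random() < intensity(t) * dt:   # small-step Bernoulli approximation
                    state += 1
                t += dt
            return state

        depths = np.array([simulate_pit(20.0) for _ in range(2000)])
        print(f"mean pit-depth state after 20 yr: {depths.mean():.2f} (std {depths.std():.2f})")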

  17. Natural gas pipelines for biomethane distribution

    Energy Technology Data Exchange (ETDEWEB)

    Wojcik, Monika [PGNiG SA, Warszawa (Poland). Centrala Spolki

    2011-07-01

    The study surveys high- and medium-pressure natural gas pipelines in Poland and in countries of the Baltic Sea region: Estonia, Latvia, Lithuania, Norway, Sweden, Germany, Finland and the Kaliningrad Oblast. The basic aim of the study was to assess the possibility of injecting biogas produced in biogas plants into the gas network, or of using it as CNG fuel delivered via pipeline directly to the filling station. It characterizes the quality requirements for transmitting biogas (upgraded to natural gas quality) in existing gas networks and proposes locations for biogas plants relative to the deployment of these networks. The study presents existing solutions for the distribution of biomethane in selected countries bordering the Baltic Sea, and analyzes the cross-border transmission capacity for the gas. The article also contains a characterization and assessment of the legal and economic conditions affecting the use of biomethane as a fuel for motor vehicles. It also shows the main priorities in this area and the environmental and social benefits arising from the production and use of biomethane as a motor fuel. (orig.)

  18. A closed solution for the collapse load of pressurized pipelines in free spans

    Energy Technology Data Exchange (ETDEWEB)

    Bezerra, Luciano M. [Brasilia Univ., DF (Brazil). Dept. de Engenharia Civil; Murray, David W.; Xuejun Song [University of Alberta (Canada). Civil Engineering Dept.

    2005-07-01

    Submarine pipelines for oil exploitation are generally under internal pressure and compressive thermal loading. Due to rough sea-bottom terrain, these pipelines may be supported only intermittently and span freely. The collapse of such pipelines may produce oil leakage to the environment. A common engineering practice for determining the collapse load of such pipelines is the use of finite element modeling. This paper presents an analytical method for determining the collapse load of pressurized pipelines extended over free spans. The formulation also takes into account the internal pressure and the initial imperfection generally present in these pipelines. The collapse load is determined from a deduced transcendental equation. Results of the presented formulation are compared with sophisticated finite element analyses. While sophisticated finite element analysis requires hours of computer processing, the present formulation takes practically no time to produce a good approximation of the collapse load of pressurized free span pipelines under compression. The present paper is not intended to substitute the more precise finite element analyses but to provide an easier, faster, and practical way to determine a first approximation of the collapse load of pressurized free span pipelines. (author)

  19. Analyses and application of the magnetic field at girth welds in pipelines

    Science.gov (United States)

    Huang, Xinjing; Chen, Shili; Guo, Shixu; Zhao, Wei; Zhang, Yu; Jin, Shijiu

    2013-11-01

    This paper proposes a novel method of utilizing weld identification to improve the positioning accuracy of an in-pipe detector in pipelines. The distributions of the magnetic field inside a single pipe are analysed using the equivalent magnetic charge method. Then the causes and characteristics of abnormal magnetic fields near the welds in the pipelines when pipes are welded together are discussed. A spherical carrier equipped with a magnetic sensor is designed and is used to measure the magnetic field inside an annular experimental pipeline when the carrier is pushed forward by the fluid in the pipeline and is rolling along the pipeline. Theory and experimental research show that there are very obvious abnormalities of the magnetic field at the girth welds in the pipelines. The abnormal magnetic field near the welds can be remarkably enhanced and accurately located using the signal enhancement method of continuous wavelet transformation and the peak detection method based on quadratic polynomial fitting respectively, thus enabling the identification and location of the welds. Different approaches are adopted to process magnetic field signals of different intensities in the pipelines in different directions. Finally, by considering construction information on the pipelines recorded previously, the positioning error is kept to less than 5 cm.
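    The two signal-processing steps named above, wavelet enhancement and quadratic (parabolic) peak refinement, can be illustrated on a synthetic trace. The Ricker wavelet, anomaly shape and noise level below are assumptions for illustration, not the paper's measured magnetic data.

        # Enhance a weld anomaly with a single-scale Ricker ("Mexican hat") wavelet
        # convolution, then refine the peak position with a quadratic fit through
        # the three samples around the maximum.
        import numpy as np

        def ricker(points, a):
            """Ricker (Mexican hat) wavelet of width parameter a."""
            x = np.arange(points) - (points - 1) / 2.0
            return (1 - (x / a) ** 2) * np.exp(-0.5 * (x / a) ** 2)

        n = 2000
        x = np.arange(n)
        signal = 0.05 * np.random.default_rng(3).normal(size=n)
        signal += 1.0 * np.exp(-0.5 * ((x - 1234.4) / 8.0) ** 2)     # synthetic weld anomaly

        # single-scale continuous-wavelet-style enhancement
        w = ricker(81, 8.0)
        response = np.convolve(signal, w, mode="same")

        # parabolic refinement around the discrete maximum
        k = int(np.argmax(response))
        y0, y1, y2 = response[k - 1], response[k], response[k + 1]
        offset = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
        print(f"weld located at sample {k + offset:.2f} (true centre 1234.4)")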

  20. The life of hydrotransport pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Turchaninov, S.P.

    1973-01-01

    This book summarizes information on the hydroabrasive wear of pressurized pipes transporting bulk solid materials, such as coal. An analysis is presented of the operation of pipelines, and measures are recommended for their effective use. Methods of laboratory and production studies of hydroabrasive wear of pipes are described. The regularities of hydroabrasive wear of pipes are systematized as functions of the hydraulic characteristics of transportation and other operational factors. Methods are studied for increasing the durability of pipelines, and recommendations are given for determination of their throughput capacity through their entire service life. The book is designed for engineering and technical workers, planning-design and scientific research organizations, and may also be useful to university students. (87 refs.)

  1. Design and realization of a high-speed 12-bit pipelined analog-digital converter IP block

    OpenAIRE

    Toprak, Zeynep

    2001-01-01

    This thesis presents the design, verification, system integration and the physical realization of a monolithic high-speed analog-digital converter (ADC) with 12-bit accuracy. The architecture of the ADC has been realized as a pipelined structure consisting of four pipeline stages, each of which is capable of processing the incoming analog signal with 4-bit accuracy. A bit-overlapping technique has been employed for digital error correction between the pipeline stages so that the influence of ...

  2. A novel pipeline based FPGA implementation of a genetic algorithm

    Science.gov (United States)

    Thirer, Nonel

    2014-05-01

    To solve problems for which an analytical solution is not available, more and more bio-inspired computation techniques have been applied in recent years. One efficient algorithm is the Genetic Algorithm (GA), which imitates the biological evolution process, finding the solution through the mechanism of "natural selection", where the strong have higher chances to survive. A genetic algorithm is an iterative procedure which operates on a population of individuals called "chromosomes" or "possible solutions" (usually represented by a binary code). The GA performs several processes on the population individuals to produce a new population, as in biological evolution. To provide a high-speed solution, pipelined FPGA hardware implementations are used, with an n-stage pipeline for an n-phase genetic algorithm. FPGA pipeline implementations are constrained by the different execution times of the stages and by the FPGA chip resources. To minimize these difficulties, we propose a bio-inspired technique that modifies the crossover step by using non-identical twins: two of the chosen chromosomes (parents) build up two new chromosomes (children), not only one as in the classical GA. We analyze the contribution of this method to reducing the execution time in the asynchronous and synchronous pipelines, and also the possibility of a cheaper FPGA implementation by using smaller populations. The full hardware architecture of an FPGA implementation for our target ALTERA development card is presented and analyzed.
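    The "non-identical twins" crossover described above is easy to illustrate in software: a one-point crossover that returns two complementary children instead of one. The encoding, fitness function and GA parameters below are placeholders, not the paper's FPGA design.

        # Minimal GA illustrating a one-point crossover that produces two
        # complementary children ("non-identical twins") per pair of parents.
        import random

        random.seed(4)
        N_BITS, POP, GENERATIONS = 16, 20, 30

        def fitness(chrom):                 # toy objective: number of ones
            return sum(chrom)

        def crossover_twins(p1, p2):
            """Return two children that together reuse all genes of both parents."""
            cut = random.randrange(1, N_BITS)
            return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

        def mutate(chrom, rate=0.02):
            return [b ^ 1 if random.random() < rate else b for b in chrom]

        pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
        for _ in range(GENERATIONS):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:POP // 2]
            children = []
            while len(children) < POP - len(parents):
                c1, c2 = crossover_twins(*random.sample(parents, 2))
                children += [mutate(c1), mutate(c2)]
            pop = parents + children[:POP - len(parents)]

        print("best fitness:", fitness(max(pop, key=fitness)), "of", N_BITS)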

  3. STATUS AND PROSPECT OF OIL AND GAS PIPELINES IN CHINA

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    With the exploration and development of natural gas and the increase of crude oil imports, China's oil and gas pipeline industry has witnessed rapid development; the gas pipeline industry in particular is entering a peak period of development. Thanks to the completion and operation of large-scale pipeline projects, including the West-East Gas Transportation Pipeline project, Shanxi-Beijing Gas Pipeline II, the Ji-Ning Pipeline, the Huaiyang-Wuhan Pipeline, the Guangdong LNG Pipeline, the Western Pipeline and the Pearl River Delta Oil Product Pipeline, many trans-regional gas and oil pipeline networks of initial scale have been gradually established and improved in China. Meanwhile, the metallurgy, manufacturing and construction capabilities for pipelines have been greatly developed, reaching world-class level. The next five years will still be a peak period of development for China's gas and oil pipeline industry, which will enjoy a broader prospect.

  4. The MACHO data pipeline

    CERN Document Server

    Axelrod, T S; Quinn, P J; Bennett, D P; Freeman, K C; Peterson, B A; Rodgers, A W; Alcock, C B; Cook, K H; Griest, K; Marshall, S L; Pratt, M R; Stubbs, C W; Sutherland, W

    1995-01-01

    The MACHO experiment is searching for dark matter in the halo of the Galaxy by monitoring more than 20 million stars in the LMC and Galactic bulge for gravitational microlensing events. The hardware consists of a 50 inch telescope, a two-color 32 megapixel CCD camera, and a network of computers. On clear nights the system generates up to 8 GB of raw data and 1 GB of reduced data. The computer system is responsible for all real-time control tasks, for data reduction, and for storing all data associated with each observation in a database. The subject of this paper is the software system that handles these functions. It is an integrated system controlled by Petri nets that consists of multiple processes communicating via mailboxes and a bulletin board. The system is highly automated, readily extensible, and incorporates flexible error recovery capabilities. It is implemented with C++ in a Unix environment.

  5. An integrated SNP mining and utilization (ISMU pipeline for next generation sequencing data.

    Directory of Open Access Journals (Sweden)

    Sarwar Azam

    Full Text Available Open source single nucleotide polymorphism (SNP) discovery pipelines for next generation sequencing data commonly require working knowledge of a command line interface, massive computational resources and specialist expertise, which is a daunting task for biologists. Further, the SNP information generated may not be readily usable for downstream processes such as genotyping. Hence, a comprehensive pipeline has been developed by integrating several open source next generation sequencing (NGS) tools with a graphical user interface, called Integrated SNP Mining and Utilization (ISMU), for SNP discovery and for their utilization in developing genotyping assays. The pipeline features functionalities such as pre-processing of raw data, integration of open source alignment tools (Bowtie2, BWA, Maq, NovoAlign and SOAP2), SNP prediction methods (SAMtools/SOAPsnp/CNS2snp and CbCC) and interfaces for developing genotyping assays. The pipeline outputs a list of high quality SNPs between all pairwise combinations of the genotypes analyzed, in addition to the reference genome/sequence. Visualization tools (Tablet and Flapjack) integrated into the pipeline enable inspection of the alignment and of errors, if any. The pipeline also provides a confidence score or polymorphism information content value with flanking sequences for identified SNPs, in the standard format required for developing marker genotyping (KASP and Golden Gate) assays. The pipeline enables users to process a range of NGS datasets such as whole genome re-sequencing, restriction site associated DNA sequencing and transcriptome sequencing data at high speed. The pipeline is very useful for the plant genetics and breeding community with no computational expertise, allowing them to discover SNPs and utilize them in genomics, genetics and breeding studies. The pipeline has been parallelized to process huge next generation sequencing datasets. It has been developed in Java and is available at http://hpc.icrisat.cgiar.org/ISMU as a

  6. Study on Water Hammer Suppression of Pipeline in Opening Process of Ultra-supercritical Steam Trap

    Institute of Scientific and Technical Information of China (English)

    李树勋; 娄燕鹏; 徐晓刚; 丁强伟

    2016-01-01

    Aiming at the severe vibration of the pipe downstream of an ultra-supercritical steam trap during the valve-opening process, a transient model of water hammer in the valve-controlled pipe was solved numerically by the method of characteristics (MOC). Using Matlab, time-domain curves of the flow and of the water hammer pressure head in the pipeline were obtained for steam traps with different flow characteristics and different throttling effects during opening. The analysis shows that adopting an equal-percentage flow characteristic for the steam trap can alleviate water hammer in the pipe behind the valve when it opens. Moreover, the influence of the throttling effect of the steam trap on water hammer is significant: the more pronounced the throttling effect, the smaller the peak water hammer pressure head and the gentler the water hammer fluctuation. The results provide theoretical references for steam trap design aimed at suppressing water hammer and for anti-shock analysis of the fluid in the pipeline.
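    The method of characteristics used above can be sketched for a much simpler configuration: a single pipe with an upstream constant-head reservoir and a downstream valve that opens gradually. The geometry, wave speed, friction factor and lumped valve coefficient below are illustrative assumptions, not the paper's steam-trap model.

        # Minimal MOC (method of characteristics) sketch for a valve-opening transient.
        import numpy as np

        L, a, D, f = 100.0, 1000.0, 0.1, 0.02        # pipe length, wave speed, diameter, friction (assumed)
        g, H_res = 9.81, 50.0                        # gravity, reservoir head (assumed)
        N = 20                                       # number of reaches
        dx = L / N
        dt = dx / a                                  # Courant number = 1
        A = np.pi * D**2 / 4
        B = a / (g * A)
        R = f * dx / (2 * g * D * A**2)

        H = np.full(N + 1, H_res)                    # initial head, valve closed
        Q = np.zeros(N + 1)                          # initial flow
        tau = lambda t: min(t / 0.5, 1.0)            # valve opens linearly over 0.5 s (assumed)
        CV_FULL = 1.0e-3                             # lumped full-open valve coefficient (assumed)

        for step in range(1, 401):
            t = step * dt
            Hn, Qn = H.copy(), Q.copy()
            for i in range(1, N):
                Cp = H[i-1] + B*Q[i-1] - R*Q[i-1]*abs(Q[i-1])
                Cm = H[i+1] - B*Q[i+1] + R*Q[i+1]*abs(Q[i+1])
                Hn[i] = 0.5 * (Cp + Cm)
                Qn[i] = (Cp - Cm) / (2 * B)
            # upstream boundary: constant-head reservoir
            Cm = H[1] - B*Q[1] + R*Q[1]*abs(Q[1])
            Hn[0], Qn[0] = H_res, (H_res - Cm) / B
            # downstream boundary: valve as an orifice, Q^2 = (tau*CV_FULL)^2 * H
            Cp = H[N-1] + B*Q[N-1] - R*Q[N-1]*abs(Q[N-1])
            K = (CV_FULL * tau(t)) ** 2
            Cv = K * B / 2
            Qn[N] = -Cv + np.sqrt(Cv**2 + K * max(Cp, 0.0))
            Hn[N] = Cp - B * Qn[N]
            H, Q = Hn, Qn

        print(f"head at the valve after {400*dt:.2f} s: {H[N]:.1f} m (reservoir {H_res} m)")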

  7. FPGA Implementation of Wave Pipelining CORDIC Algorithms

    Institute of Scientific and Technical Information of China (English)

    CUI Wei

    2008-01-01

    The implementation of the coordinate rotation digital computer (CORDIC) algorithm with the wave pipelining technique on a field programmable gate array (FPGA) is described. All data in FPGA-based wave pipelining pass through a number of logic gates, in the same way that all data pass through the same number of registers in a conventional pipeline. Moreover, all paths are routed using identical routing resources. Manual placement, timing-driven routing and timing analysis techniques are applied to optimize the layout and achieve good path balance. Experimental results show that a 256-LUT logic-depth circuit mapped onto an XC4VLX15-12 runs as fast as 330 MHz, which is slightly lower than the 336 MHz achieved by a conventional 16-stage pipeline in the same chip. The latency of the wave pipelining circuit is 30.3 ns, which is 36.4% shorter than the latency of the 16-stage conventional pipelining circuit.
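    For reference, the CORDIC rotation-mode iteration that such hardware pipelines, one iteration per stage, can be modelled in software. The sketch below is a floating-point reference model; the fixed-point word lengths and stage registers of the FPGA design are omitted.

        # Software model of CORDIC in rotation mode, computing cos and sin.
        import math

        def cordic_sin_cos(angle_rad, n_iter=16):
            """Compute (cos, sin) of an angle in [-pi/2, pi/2] with n_iter CORDIC steps."""
            # pre-computed gain K = prod(1 / sqrt(1 + 2^-2i))
            k = 1.0
            for i in range(n_iter):
                k *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))
            x, y, z = k, 0.0, angle_rad
            for i in range(n_iter):
                d = 1.0 if z >= 0 else -1.0
                x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
                z -= d * math.atan(2.0 ** -i)
            return x, y

        c, s = cordic_sin_cos(math.pi / 5)
        print(f"cos: {c:.6f} (math: {math.cos(math.pi/5):.6f})")
        print(f"sin: {s:.6f} (math: {math.sin(math.pi/5):.6f})")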

  8. 49 CFR 195.9 - Outer continental shelf pipelines.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Outer continental shelf pipelines. 195.9 Section... HAZARDOUS LIQUIDS BY PIPELINE General § 195.9 Outer continental shelf pipelines. Operators of transportation pipelines on the Outer Continental Shelf must identify on all their respective pipelines the specific...

  9. Tension fracture behaviors of welded joints in X70 steel pipeline

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The surface of welded joints in X70 steel pipeline was processed by laser shock waves; the mechanical behavior under tensile fracture was analyzed with tension tests, and the fracture morphologies and the distributions of chemical elements were observed with scanning electron microscopy and energy dispersive spectroscopy, respectively. The experimental results show that grain refinement occurs in the surface of welded joints in X70 steel pipeline after laser shock processing, and compressive re...

  10. Performance estimation of micro-pipeline based calculations

    Directory of Open Access Journals (Sweden)

    Antoniya Tyaneva

    2012-06-01

    Full Text Available The main focus of this article is the performance estimation of an arbitrary computational structure under various forms of control organization. Four cases are considered: sequential processing of a large number of tasks under classical control of the computing structure by a finite state machine (synchronous and asynchronous), and parallel processing of tasks organized in a pipelined computational structure (synchronously and asynchronously controlled). The latency of each operational level is approximated by a normally distributed random variable.
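    The synchronous versus asynchronous pipeline comparison can be explored with a small Monte Carlo experiment. The stage count, latency distribution and the idealized (unbounded-buffer) handshaking model below are assumptions for illustration, not the article's analytical model.

        # Monte Carlo comparison: synchronous pipeline (clock set by the worst-case
        # stage latency) vs an ideal asynchronous, handshaked pipeline.
        import numpy as np

        rng = np.random.default_rng(5)
        N_STAGES, N_TASKS = 4, 10_000
        MEAN, STD = 1.0, 0.2

        lat = np.clip(rng.normal(MEAN, STD, size=(N_TASKS, N_STAGES)), 0.01, None)

        # Synchronous: the clock period must cover the worst-case stage latency.
        clock = lat.max()
        sync_total = clock * (N_TASKS + N_STAGES - 1)

        # Asynchronous (unbounded buffers): stage s finishes task t no earlier than
        # it finished task t-1 and no earlier than stage s-1 delivered task t.
        finish = np.zeros((N_TASKS, N_STAGES))
        for t in range(N_TASKS):
            for s in range(N_STAGES):
                prev_task = finish[t - 1, s] if t else 0.0
                prev_stage = finish[t, s - 1] if s else 0.0
                finish[t, s] = max(prev_task, prev_stage) + lat[t, s]
        async_total = finish[-1, -1]

        print(f"synchronous  total time: {sync_total:,.0f}")
        print(f"asynchronous total time: {async_total:,.0f}")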

  11. Tubular lining material for pipelines having bends

    Energy Technology Data Exchange (ETDEWEB)

    Moringa, A.; Sakaguchi, Y.; Hyodo, M.; Yagi, I.

    1987-03-24

    A tubular lining material for pipelines having bends or curved portions comprises a tubular textile jacket, made of warps and wefts woven in a tubular form, overlaid with a coating of a flexible synthetic resin. It is applied onto the inner surface of a pipeline having bends or curved portions in such a manner that the tubular lining material, with a binder on its inner surface, is inserted into the pipeline and allowed to advance within the pipeline, with or without the aid of a leading rope-like elongated element, while turning the tubular lining material inside out under fluid pressure. In this manner the tubular lining material is applied onto the inner surface of the pipeline with the binder interposed between the pipeline and the tubular lining material. The lining material is characterized in that part or all of the warps are comprised of an elastic yarn around which, over its full length, a synthetic fiber yarn or yarns have been left- and/or right-handedly coiled. This tubular lining material is particularly suitable for lining a pipeline having an inner diameter of 25-200 mm and a plurality of bends, such as gas service pipelines or house pipelines, without the occurrence of wrinkles in the lining material at a bend.

  12. Transmission pipeline calculations and simulations manual

    CERN Document Server

    Menon, E Shashi

    2014-01-01

    Transmission Pipeline Calculations and Simulations Manual is a valuable time- and money-saving tool to quickly pinpoint the essential formulae, equations, and calculations needed for transmission pipeline routing and construction decisions. The manual's three-part treatment starts with gas and petroleum data tables, followed by self-contained chapters concerning applications. Case studies at the end of each chapter provide practical experience for problem solving. Topics in this book include pressure and temperature profile of natural gas pipelines, how to size pipelines for specified f

  13. A quick guide to pipeline engineering

    CERN Document Server

    Alkazraji, D

    2008-01-01

    Pipeline engineering requires an understanding of a wide range of topics. Operators must take into account numerous pipeline codes and standards, calculation approaches, and reference materials in order to make accurate and informed decisions.A Quick Guide to Pipeline Engineering provides concise, easy-to-use, and accessible information on onshore and offshore pipeline engineering. Topics covered include: design; construction; testing; operation and maintenance; and decommissioning.Basic principles are discussed and clear guidance on regulations is provided, in a way that will

  14. 77 FR 16471 - Pipeline Safety: Implementation of the National Registry of Pipeline and Liquefied Natural Gas...

    Science.gov (United States)

    2012-03-21

    ...: Implementation of the National Registry of Pipeline and Liquefied Natural Gas Operators AGENCY: Pipeline and... registry of pipeline and liquefied natural gas operators. This notice provides updates to the information... and liquefied natural gas (LNG) operators. New operators use the national registry to obtain...

  15. Retropath: automated pipeline for embedded metabolic circuits.

    Science.gov (United States)

    Carbonell, Pablo; Parutto, Pierre; Baudier, Claire; Junot, Christophe; Faulon, Jean-Loup

    2014-08-15

    Metabolic circuits are a promising alternative to other conventional genetic circuits as modular parts implementing functionalities required for synthetic biology applications. To date, metabolic design has been mainly focused on production circuits. Emergent applications such as smart therapeutics, however, require circuits that enable sensing and regulation. Here, we present RetroPath, an automated pipeline for embedded metabolic circuits that explores the circuit design space from a given set of specifications and selects the best circuits to implement based on desired constraints. Synthetic biology circuits embedded in a chassis organism that are capable of controlling the production, processing, sensing, and the release of specific molecules were enumerated in the metabolic space through a standard procedure. In that way, design and implementation of applications such as therapeutic circuits that autonomously diagnose and treat disease, are enabled, and their optimization is streamlined.

  16. Comparing MapReduce and Pipeline Implementations for Counting Triangles

    Directory of Open Access Journals (Sweden)

    Edelmira Pasarella

    2017-01-01

    Full Text Available A common method to define a parallel solution for a computational problem consists in finding a way to use the Divide and Conquer paradigm so that processors act on their own data and are scheduled in a parallel fashion. MapReduce is a programming model that follows this paradigm, and allows for the definition of efficient solutions by both decomposing a problem into steps on subsets of the input data and combining the results of each step to produce final results. Albeit used for the implementation of a wide variety of computational problems, MapReduce performance can be negatively affected whenever the replication factor grows or the size of the input is larger than the resources available at each processor. In this paper we show an alternative approach to implementing the Divide and Conquer paradigm, named dynamic pipeline. The main features of dynamic pipelines are illustrated on a parallel implementation of the well-known problem of counting triangles in a graph. This problem is especially interesting either when the input graph does not fit in memory or is dynamically generated. To evaluate the properties of dynamic pipelines, a dynamic pipeline of processes and an ad-hoc version of MapReduce are implemented in the language Go, exploiting its ability to deal with channels and spawned processes. An empirical evaluation is conducted on graphs of different topologies, sizes, and densities. Observed results suggest that dynamic pipelines allow for an efficient implementation of the problem of counting triangles in a graph, particularly in dense and large graphs, drastically reducing the execution time with respect to the MapReduce implementation.
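    For reference, the underlying triangle-counting problem has a simple sequential baseline, which both the MapReduce and the dynamic-pipeline versions parallelize. The sketch below counts each triangle once at its lowest-degree vertex; it is not the paper's Go implementation.

        # Sequential reference implementation of triangle counting.
        from itertools import combinations

        def count_triangles(edges):
            adj = {}
            for u, v in edges:
                adj.setdefault(u, set()).add(v)
                adj.setdefault(v, set()).add(u)
            # rank vertices by degree so each triangle is counted exactly once
            rank = {v: i for i, (v, _) in
                    enumerate(sorted(adj.items(), key=lambda kv: len(kv[1])))}
            count = 0
            for u in adj:
                higher = [w for w in adj[u] if rank[w] > rank[u]]
                for v, w in combinations(higher, 2):
                    if w in adj[v]:
                        count += 1
            return count

        # toy graph: a square with one diagonal -> two triangles
        edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
        print("triangles:", count_triangles(edges))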

  17. FEM analysis of impact of external objects to pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Gracie, Robert; Konuk, Ibrahim [Geological Survey of Canada, Ottawa, ON (Canada)]. E-mail: ikonuk@NRCan.gc.ca; Fredj, Abdelfettah [BMT Fleet Technology Limited, Ottawa, ON (Canada)

    2003-07-01

    One of the most common hazards to pipelines is the impact of external objects. Earth-moving machinery, farm equipment or bullets can dent or fail land pipelines. External objects such as anchors, fishing gear and ice can damage offshore pipelines. This paper develops an FEM model to simulate the impact process and presents investigations using the FEM model to determine the influence of the geometry and velocity of the impacting object; it also studies the influence of pipe diameter, wall thickness and concrete thickness, along with internal pressure. The FEM model is developed using the LS-DYNA explicit FEM software, utilizing shell and solid elements. The model allows damage to, and removal of, the concrete and corrosion coating elements during impact. Parametric studies are presented relating the dent size to pipe diameter, wall thickness and concrete thickness, internal pipe pressure, and impacting object geometry. The primary objective of this paper is to develop and present the FEM model. The model can be applied to both offshore and land pipeline problems. Some examples are used to illustrate how the model can be applied to real-life problems. A future paper will present more detailed parametric studies. (author)

  18. A 5-M approach to control external pipeline corrosion

    Energy Technology Data Exchange (ETDEWEB)

    Papavinasam, S.; Doiron, A. [Natural Resources Canada, Ottawa, ON (Canada). CANMET Materials Technology Laboratory

    2009-07-01

    The application of polymeric coatings and cathodic protection (CP) are the general methods of controlling external corrosion of pipelines. Since many pipelines are being operated beyond their original design span, several methods are used to assess their integrity, such as inline inspection (ILI), hydrostatic testing and direct assessment (DA). The 5-M approach to controlling external pipeline corrosion combines the strengths of various methodologies to overcome the weaknesses in any one approach. The 5-M approach includes mitigation, modeling, monitoring, maintenance and management. This paper described the procedures and practices of the 5Ms. The 5-M approach is useful in developing an effective integrity management program by identifying susceptible conditions under which coating deterioration may occur, by establishing priorities for repair or replacement, and by protecting the environment from pipeline leakage and ruptures. A freeware software program has been developed to coordinate various activities of the 5-M approach. The 5-M approach complements the external corrosion direct assessment (ECDA) process of the National Association of Corrosion Engineers (NACE), and provides quantitative scores to the ECDA data. 24 refs., 4 tabs., 2 figs.

  19. Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.

    Directory of Open Access Journals (Sweden)

    Brian O'Farrell

    Full Text Available Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spread-sheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherrypicking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at low cost per strain; since its establishment, 200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.

  20. STRESS AND STRAIN STATE OF REPAIRING SECTION OF PIPELINE

    Directory of Open Access Journals (Sweden)

    V. V. Nikolaev

    2015-01-01

    Full Text Available The reliability of continuous operation of pipelines is a topical problem. For this reason an effective warning system against failures and accidents of main pipelines should be developed, not only at the design and operation stages but also for selective repair. A change from the linear position, unloaded by bending, leads to a change in the stress and strain state of the pipeline, and this stress and strain state should also be determined and controlled while repair works are carried out. The article presents a mathematical model of the straining of a pipeline section in a viscoelastic setting, taking into account soil creep and the high-speed stress state of the pipeline, for the purpose of evaluating the stresses and the load-supporting capacity of the repaired pipeline section as functions of time. The stress and strain state analysis of the pipeline includes calculation of longitudinal and circular stresses with account of axis-asymmetrical straining and was fulfilled on the basis of the momentless theory of shells. To prove the consistency of the data, the calculation results were compared with solutions obtained by analytical methods for different cases (a long pipeline section strained only by a cross-axis action; a long pipeline section strained by a longitudinal stress; a long pipeline section on an elastic foundation strained by a cross-axis action). The comparison shows that the calculation error is not more than 3%. An analysis of the change of the stress-strain state of the pipeline section was carried out with the development of this model, which indicates an enlargement of the span deflection in comparison with the solution of the problem in the elastic approach. It is also shown that, for a consistent assessment of pipeline maintenance conditions, it is necessary to consider the rheological processes in the soils. On the basis of the complex analysis of pipelines, the stresses and time

  1. The CARMA Data Reduction Pipeline

    Science.gov (United States)

    Friedel, D. N.

    2013-10-01

    The Combined Array for Millimeter-wave Astronomy (CARMA) data reduction pipeline (CADRE) has been developed to give investigators a first look at a fully reduced set of their data. It runs automatically on all data produced by the telescope as they arrive in the data archive. CADRE is written in Python and uses Python wrappers for MIRIAD subroutines for direct access to the data. It applies passband, gain and flux calibration to the data sets and produces a set of continuum and spectral line maps in both MIRIAD and FITS format. CADRE has been in production for a year and this poster will discuss the current capabilities and planned improvements.

  2. Addressing the workforce pipeline challenge

    Energy Technology Data Exchange (ETDEWEB)

    Leonard Bond; Kevin Kostelnik; Richard Holman

    2006-11-01

    A secure and affordable energy supply is essential for achieving U.S. national security, in continuing U.S. prosperity and in laying the foundations to enable future economic growth. To meet this goal the next generation energy workforce in the U.S., in particular those needed to support instrumentation, controls and advanced operations and maintenance, is a critical element. The workforce is aging and a new workforce pipeline, to support both current generation and new build has yet to be established. The paper reviews the challenges and some actions being taken to address this need.

  3. A 40-MFLOPS 32-bit floating-point processor with elastic pipeline scheme

    Energy Technology Data Exchange (ETDEWEB)

    Komori, S.; Takata, H.; Tamura, T.; Asai, F.; Ohno, T.; Tomisawa, O. (LSI Research and Development Lab., Mitsubishi Electric Corp., Itami, Hyogo 664 (JP)); Yamasaki, T.; Shima, K. (Industrial Electronics and Systems Development Lab., Mitsubishi Electric Corp., Hyogo (JP)); Nishikawa, H.; Terada, H. (Osaka Univ., Dept. of Information Systems Engineering, Osaka (JP))

    1989-10-01

    This paper presents a 40-MFLOPS 32-bit floating-point processor (FP) which is a component chip for a data-driven single-board processor. The FP is the first practical LSI chip which has introduced the elastic pipeline scheme. All parts in the FP are autonomously controlled by self-timed circuits, and no system clock is needed for processing. The elastic pipeline scheme provides data buffering capability and stabilization of circuit operation at the same time. Pipelining has been extensively utilized so that high throughput over 40-MFLOPS can be achieved. An automatic power conservation technique, called latch mode control, is also described.

  4. Karst hazard assessment in the design of the main gas pipeline (South Yakutia)

    Science.gov (United States)

    Strokova, L. A.; Dutova, E. M.; Ermolaeva, A. V.; Alimova, I. N.; Strelnikova, A. B.

    2015-11-01

    The paper represents the description of the zonal and regional geological factors of geoengineering conditions which characterize the territory in South Yakutia crossed by the designed main gas pipeline. Cryogenic processes and karst are considered to be the most dangerous hazards for gas pipeline maintenance. Karst hazard assessment of the gas pipeline section made in the course of the research has involved a complex of geological methods: geoengineering, geophysical, hydrogeological, and mapping. Sections prone to karst development have been identified. The authors have suggested the measures to protect potentially hazardous sections and to ensure timely informing on sinkhole collapses.

  5. 77 FR 51848 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2012-08-27

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Information Collection Activities...) Current expiration date; (4) Type of request; (5) Abstract of the information collection activity; (6... activity. PHMSA requests comments on the following information collections: Title: Pipeline Safety:...

  6. 75 FR 30099 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2010-05-28

    ... TRANSPORTATION Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Information Collection Activities AGENCY: Pipeline and Hazardous Materials Safety Administration, DOT. ACTION: Notice. SUMMARY: In compliance with the Paperwork Reduction Act, PHMSA announces that the currently approved...

  7. Oil and Natural Gas Pipelines, North America, 2010, Platts

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Oil and Natural Gas Pipeline geospatial data layer contains gathering, interstate, and intrastate natural gas pipelines, crude and product oil pipelines, and...

  8. 18 CFR 284.227 - Certain transportation by intrastate pipelines.

    Science.gov (United States)

    2010-04-01

    ... Interstate Pipelines on Behalf of Others and Services by Local Distribution Companies § 284.227 Certain... interstate pipeline or local distribution company served by an interstate pipeline. (e) Pregrant...

  9. INTERNAL REPAIR OF PIPELINES REVIEW & EVALUATION OF INTERNAL PIPELINE REPAIR TRIALS REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Robin Gordon; Bill Bruce; Ian Harris; Dennis Harwig; George Ritter; Bill Mohr; Matt Boring; Nancy Porter; Mike Sullivan; Chris Neary

    2004-09-01

    The two broad categories of fiber-reinforced composite liner repair and deposited weld metal repair technologies were reviewed and evaluated for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Evaluation trials were conducted on pipe sections with simulated corrosion damage repaired with glass fiber-reinforced composite liners, carbon fiber-reinforced composite liners, and weld deposition. Additional un-repaired pipe sections were evaluated in the virgin condition and with simulated damage. Hydrostatic failure pressures for pipe sections repaired with glass fiber-reinforced composite liner were only marginally greater than that of pipe sections without liners, indicating that this type of liner is generally ineffective at restoring the pressure containing capabilities of pipelines. Failure pressure for pipe repaired with carbon fiber-reinforced composite liner was greater than that of the un-repaired pipe section with damage, indicating that this type of liner is effective at restoring the pressure containing capability of pipe. Pipe repaired with weld deposition failed at pressures lower than that of un-repaired pipe in both the virgin and damaged conditions, indicating that this repair technology is less effective at restoring the pressure containing capability of pipe than a carbon fiber-reinforced liner repair. Physical testing indicates that carbon fiber-reinforced liner repair is the most promising technology evaluated to-date. Development of a comprehensive test plan for this process is recommended for use in the next phase of this project.

  10. Microstructure and Mechanism of Strengthening of Microalloyed Pipeline Steel: Ultra-Fast Cooling (UFC) Versus Laminar Cooling (LC)

    Science.gov (United States)

    Zhao, J.; Wang, X.; Hu, W.; Kang, J.; Yuan, G.; Di, H.; Misra, R. D. K.

    2016-06-01

    A novel thermo-mechanical controlled processing (TMCP) schedule involving an ultra-fast cooling (UFC) technique was used to process X70 (420 MPa) microalloyed pipeline steel with a high strength-high toughness combination. A comparison of microstructure and mechanical properties is made between conventionally processed (CP) and ultra-fast cooled (UFC) pipeline steels, together with the differences in strengthening mechanisms between the two processes. The UFC-processed steel exhibited the better combination of strength and toughness. The microstructure of the CP pipeline steel mainly consisted of acicular ferrite (AF), bainitic ferrite (BF), and dispersed secondary martensite/austenite (M/A) constituent with a small fraction of fine quasi-polygonal ferrite. In contrast, the microstructure of the UFC-processed pipeline steel was predominantly composed of finer AF, BF, and dispersed M/A constituent. The primary strengthening mechanisms in the UFC pipeline steel were grain size strengthening and dislocation strengthening, with strength increments of ~277 and ~151 MPa, respectively. In the CP steel the strengthening contributions were grain size strengthening, dislocation strengthening, and precipitation strengthening, with corresponding strength increments of ~212, ~149 and ~86 MPa, respectively. The decrease in strength induced by reducing Nb and Cr in the UFC pipeline steel was compensated by the enhanced contribution of grain size strengthening in the UFC process. In conclusion, a cooling schedule of UFC combined with LC is a promising method for processing low-cost pipeline steels.

  11. Review of Oil and Gas Pipeline Construction in 2007

    Institute of Scientific and Technical Information of China (English)

    Qu Hong

    2008-01-01

    By 2008, China's pipeline industry had been developing for 50 years. In the past 10 years, more than 50,000 kilometers of long-distance oil and gas pipelines have been constructed, of which gas pipelines account for about 30,000 kilometers, crude oil pipelines about 17,000 kilometers, and product oil pipelines about 7,000 kilometers. Oil and gas pipeline networks across regions have taken shape.

  12. NCBI prokaryotic genome annotation pipeline.

    Science.gov (United States)

    Tatusova, Tatiana; DiCuccio, Michael; Badretdin, Azat; Chetvernin, Vyacheslav; Nawrocki, Eric P; Zaslavsky, Leonid; Lomsadze, Alexandre; Pruitt, Kim D; Borodovsky, Mark; Ostell, James

    2016-08-19

    Recent technological advances have opened unprecedented opportunities for large-scale sequencing and analysis of populations of pathogenic species in disease outbreaks, as well as for large-scale diversity studies aimed at expanding our knowledge across the whole domain of prokaryotes. To meet the challenge of timely interpretation of structure, function and meaning of this vast genetic information, a comprehensive approach to automatic genome annotation is critically needed. In collaboration with Georgia Tech, NCBI has developed a new approach to genome annotation that combines alignment based methods with methods of predicting protein-coding and RNA genes and other functional elements directly from sequence. A new gene finding tool, GeneMarkS+, uses the combined evidence of protein and RNA placement by homology as an initial map of annotation to generate and modify ab initio gene predictions across the whole genome. Thus, the new NCBI's Prokaryotic Genome Annotation Pipeline (PGAP) relies more on sequence similarity when confident comparative data are available, while it relies more on statistical predictions in the absence of external evidence. The pipeline provides a framework for generation and analysis of annotation on the full breadth of prokaryotic taxonomy. For additional information on PGAP see https://www.ncbi.nlm.nih.gov/genome/annotation_prok/ and the NCBI Handbook, https://www.ncbi.nlm.nih.gov/books/NBK174280/.

  13. Software for pipeline integrity administration

    Energy Technology Data Exchange (ETDEWEB)

    Soula, Gerardo; Perona, Lucas Fernandez [Gie SA., Buenos Aires (Argentina); Martinich, Carlos [Refinaria do Norte S. A. (REFINOR), Tartagal, Provincia de Salta (Argentina)

    2009-07-01

    Software for 'pipeline integrity management' was developed. It allows dealing with geographical information and a PODS (Pipeline Open Data Standard) database simultaneously, in a simple and reliable way. The premises for the design were the following: didactic, geo-referenced, multiple reference systems. Program capabilities: (1) PODS+GIS: the PODS database on which the software is based is completely integrated with the GIS module. (2) Management of different kinds of information: it allows managing information on facilities, repairs, interventions, physical inspections, geographical characteristics, compliance with regulations, training, offline events and operation measures, treating O and M information and importing specific data and studies in a massive way; it also assures the integrity of the loaded information. (3) Right-of-way survey: it allows verifying the class location and ROW occupation, identifying sensitive areas and managing landowners. (4) Risk analysis: it is done in a qualitative way, depending on the entered data, allowing the user to identify the riskiest stretches of the system. Both the results from risk analysis and the data and queries made against the database can be exported to standard formats. (author)

  14. Electrical fingerprint of pipeline defects

    Energy Technology Data Exchange (ETDEWEB)

    Mica, Isabella [STMicroelectronics Srl, via C.Olivetti 2, 20041 Agrate Brianza (Italy)]. E-mail: isabella.mica@st.com; Polignano, Maria Luisa [STMicroelectronics Srl, via C.Olivetti 2, 20041 Agrate Brianza (Italy); Marco, Cinzia De [STMicroelectronics Srl, via C.Olivetti 2, 20041 Agrate Brianza (Italy)

    2004-12-15

    Pipeline defects are dislocations that connect the source region of a transistor with the drain region. They have been widely reported to occur in CMOS and BiCMOS devices and, recently, in SOI technologies. They can reduce device yield either by affecting device functionality or by increasing the current consumption under stand-by conditions. In this work the electrical fingerprint of these dislocations is studied, with the purpose of enabling the identification of these defects as the ones responsible for device failure. It is shown that pipeline defects are responsible for a leakage current from source to drain in the transistors. This leakage has a resistive characteristic and is only lightly modulated by the body bias. It is not sensitive to temperature, whereas the off-current of a good transistor exhibits the well-known exponential dependence on 1/T. The emission spectrum of these defects was studied and compared with the spectrum of a good transistor. The paper shows that the spectrum of a defective transistor is quite peculiar: it shows well-defined peaks, whereas the spectrum of a good transistor under saturation conditions is characterized by a broad spectral light emission distribution. Finally, deep-level transient spectroscopy (DLTS) is tried on defective diodes.

  15. Offshore Pipeline Locations in the Gulf of Mexico, Geographic NAD27, MMS (2007) [pipelines_points_mms_2007

    Data.gov (United States)

    Louisiana Geographic Information Center — Offshore Minerals Management Pipeline Locations for the Gulf of Mexico (GOM). Contains the points of the pipeline in the GOM. All pipelines existing in the databases...

  16. Offshore Pipeline Locations in the Gulf of Mexico, Geographic NAD27, MMS (2007) [pipelines_vectors_mms_2007

    Data.gov (United States)

    Louisiana Geographic Information Center — Offshore Minerals Management Pipeline Locations for the Gulf of Mexico (GOM). Contains the lines of the pipeline in the GOM. All pipelines existing in the databases...

  17. PETROCHINA WEST EAST GAS PIPELINE & SALES COMPANY

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    PetroChina West East Gas Pipeline & Sales Company, a regional company directly under PetroChina Company Limited (PetroChina), is responsible for the construction and operation of the West-East Gas Pipeline Project, and for the marketing and sales of natural gas in the Chinese market.

  18. Testing the School-to-Prison Pipeline

    Science.gov (United States)

    Owens, Emily G.

    2017-01-01

    The School-to-Prison Pipeline is a social phenomenon where students become formally involved with the criminal justice system as a result of school policies that use law enforcement, rather than discipline, to address behavioral problems. A potentially important part of the School-to-Prison Pipeline is the use of sworn School Resource Officers…

  19. China Pins Hopes on Pipeline with Russia

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    China still has faith in the gigantic Sino-Russia oil pipeline, despite reports that Russia is inclined to build a competing pipeline in favor of Japan. CNPC, the company representing China in negotiations with Russia on the project, is reported to be continuing its preparation work to receive Russian crude.

  20. Pipeline Protection Has Its Own Law

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The Law of the People's Republic of China on the Protection of Oil and Natural Gas Pipelines (hereinafter called "the Law") will be implemented officially on October 1 this year. This is the first time that oil and natural gas pipelines are protected and managed on a legal basis.

  1. MECHANICAL PROPERTIES OF LONGITUDINAL SUBMERGED ARC WELDED STEEL PIPES USED FOR GAS PIPELINE OF OFFSHORE OIL

    Institute of Scientific and Technical Information of China (English)

    Z.Z. Yang; W. Tian; Q.R. Ma; Y.L. Li; J.K. Li; J.Z. Gao; H.B. Zhang; Y.H. Yang

    2008-01-01

    With the development of offshore oil and gas, increasing numbers of submarine oil and gas pipelines have been installed. All the early steel pipes for submarine pipelines had to be imported because of the strict requirements on comprehensive properties, such as corrosion resistance, resistance to pressure and so on. To research and develop domestic steel pipes for submarine pipelines, Longitudinal-seam Submerged Arc Welded (LSAW) pipes were made of steel plates cut from levelled hot-rolled coils by both the JCOE and UOE forming processes (in which the plate is successively bent into a "J", "C", "O" or "U" shape and then expanded). Furthermore, the mechanical properties of the pipe base metal and weld metal were tested, and the results were in accordance with the corresponding pipe specifications API SPEC 5L and DNV-OS-F101, which showed that domestic LSAW pipes could be used for submarine oil and gas pipelines.

  2. Pipeline leakage recognition based on the projection singular value features and support vector machine

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Wei; Zhang, Laibin; Mingda, Wang; Jinqiu, Hu [College of Mechanical and Transportation Engineering, China University of Petroleum, Beijing, (China)

    2010-07-01

    The negative pressure wave method is one of the processes used to detect leaks in oil pipelines. The development of new leakage recognition processes is difficult because it is practically impossible to collect leakage pressure samples. The method of leakage feature extraction and the selection of the recognition model are also important in pipeline leakage detection. This study investigated a new feature extraction approach, Singular Value Projection (SVP), which projects the singular values onto a standard basis. A new pipeline recognition model based on multi-class Support Vector Machines (SVM) was also developed. It was found that SVP is a clear and concise recognition feature of the negative pressure wave. Field experiments proved that the model provided a high recognition accuracy rate. This approach to pipeline leakage detection based on SVP and SVM has high application value.
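
    A minimal sketch of the general idea, not the authors' exact SVP definition: fold a windowed negative-pressure-wave segment into a matrix, use its singular values as features, and classify with a multi-class SVM; the window size, synthetic signals and class labels are all placeholders:

      import numpy as np
      from sklearn.svm import SVC

      def singular_value_features(segment, rows=8):
          # Fold a 1-D pressure segment into a matrix and return its singular values
          # as a fixed-length feature vector (8 values for an 8x8 fold of 64 samples).
          cols = len(segment) // rows
          m = np.asarray(segment[:rows * cols]).reshape(rows, cols)
          return np.linalg.svd(m, compute_uv=False)

      # Placeholder training data: class 0 = normal, 1 = leak-like step drop, 2 = slow drift.
      rng = np.random.default_rng(0)
      segments = rng.normal(0.0, 0.05, size=(30, 64))
      segments[10:20, 32:] -= 1.0
      segments[20:30, :] += np.linspace(0.0, 0.5, 64)
      labels = np.repeat([0, 1, 2], 10)

      X = np.array([singular_value_features(s) for s in segments])
      clf = SVC(kernel="rbf").fit(X, labels)   # multi-class handled one-vs-one by SVC
      print(clf.predict(X[:3]), clf.predict(X[10:13]))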

  3. Development of ecologically safe method for main oil and gas pipeline trenching

    Directory of Open Access Journals (Sweden)

    Akhmedov Asvar Mikdadovich

    2014-05-01

    Full Text Available The constructive, technical and technological reliability of a major pipeline ensures ecological safety at different stages of its life cycle, from project preparation up to the end of operation. Even at the transition into a new life-cycle stage, whether the pipeline needs major repairs or reconstruction, technical and technological solutions should be found which preserve the ecological stability of the natural-anthropogenic system. The development of environment-protecting technologies for the construction, reconstruction and major repair of main pipelines is of great importance not only for a region, but for ensuring ecological safety across the globe. The article presents a new way of trenching main oil and gas pipelines that preserves and increases ecological safety during their service. An updated technological plan is given for the overhaul of a main oil and gas pipeline using the new trenching technology. The suggested technical solution contributes to environmental preservation through the use of degradable shells: the shell material decomposes into environment-friendly components, namely carbon dioxide, water and humus. The quantity of polluting agents released into the atmosphere decreases as the construction time and the amount of technical equipment decrease.

  4. Life Cycle Analysis of Bitumen Transportation to Refineries by Rail and Pipeline.

    Science.gov (United States)

    Nimana, Balwinder; Verma, Aman; Di Lullo, Giovanni; Rahman, Md Mustafizur; Canter, Christina E; Olateju, Babatunde; Zhang, Hao; Kumar, Amit

    2017-01-03

    Crude oil is currently transported primarily by pipelines and rail from extraction sites to refineries around the world. This research evaluates energy use and greenhouse gas (GHG) emissions for three scenarios (synthetic crude oil and dilbit with and without diluent return) in which 750 000 bpd of Alberta's bitumen is transported 3000 km, to determine which method has a lower environmental impact. Each scenario has a pipeline and a rail pathway, and the dilbit without diluent return scenario has an additional heated bitumen pathway, which does not require diluent. An Excel-based bottom-up model is developed using engineering first principles to calculate mass and energy balances for each process. Results show that pipeline transportation produced between 61% and 77% fewer GHG emissions than rail. The GHG emissions decreased by 15% and 73% for rail and pipelines, respectively, as the capacity increased from 100 000 to 800 000 bpd. A Monte Carlo simulation was performed to determine the uncertainty in the emissions and found that the uncertainty was larger for pipelines (up to ±73%) and smaller for rail (up to ±28%). The uncertainty ranges do not overlap, thus confirming that pipelines have lower GHG emissions, which is important information for policy makers conducting pipeline reviews.
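
    The study's Excel model is not reproduced here; the sketch below only illustrates the kind of Monte Carlo uncertainty treatment described, with entirely hypothetical per-barrel emission factors and uncertainty bands:

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Hypothetical emission factors (kg CO2e per bbl transported) with assumed uncertainty bands.
      pipeline = rng.uniform(8.0 * 0.3, 8.0 * 1.7, n)    # wide band, cf. up to +/-73% for pipelines
      rail = rng.uniform(25.0 * 0.72, 25.0 * 1.28, n)    # narrower band, cf. up to +/-28% for rail

      saving = 1.0 - pipeline / rail
      print(f"median GHG saving of pipeline vs rail: {np.median(saving):.0%}")
      print(f"90% interval: {np.percentile(saving, 5):.0%} to {np.percentile(saving, 95):.0%}")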

  5. Unipro UGENE NGS pipelines and components for variant calling, RNA-seq and ChIP-seq data analyses.

    Science.gov (United States)

    Golosova, Olga; Henderson, Ross; Vaskin, Yuriy; Gabrielian, Andrei; Grekhov, German; Nagarajan, Vijayaraj; Oler, Andrew J; Quiñones, Mariam; Hurt, Darrell; Fursov, Mikhail; Huyen, Yentram

    2014-01-01

    The advent of Next Generation Sequencing (NGS) technologies has opened new possibilities for researchers. However, the more biology becomes a data-intensive field, the more biologists have to learn how to process and analyze NGS data with complex computational tools. Even with the availability of common pipeline specifications, it is often a time-consuming and cumbersome task for a bench scientist to install and configure the pipeline tools. We believe that a unified, desktop and biologist-friendly front end to NGS data analysis tools will substantially improve productivity in this field. Here we present NGS pipelines "Variant Calling with SAMtools", "Tuxedo Pipeline for RNA-seq Data Analysis" and "Cistrome Pipeline for ChIP-seq Data Analysis" integrated into the Unipro UGENE desktop toolkit. We describe the available UGENE infrastructure that helps researchers run these pipelines on different datasets, store and investigate the results and re-run the pipelines with the same parameters. These pipeline tools are included in the UGENE NGS package. Individual blocks of these pipelines are also available for expert users to create their own advanced workflows.

  6. The development of the strategy and plan for the decommissioning and abandonment of 36'' offshore oil export pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Richard J. [PIMS of London Ltd, London, (United Kingdom); Galvez Reyes, Marco Antonio [PEMEX Refinacion, Veracruz, (Mexico)

    2010-07-01

    The decommissioning and abandonment of platforms and pipelines are big challenges for the pipeline industry. This paper presents a review of the decommissioning and abandonment processes based on a case study, the Rabon Grande pipeline system. First, the applicable international codes, standards and regulations associated with the decommissioning of pipelines are discussed. Next, the paper reviews the decommissioning and abandonment options and considerations available for the case study. The Rabon Grande pipeline system, which was shut down and isolated in 1990 pending decommissioning, is used as an example of applying decommissioning and abandonment best practice and establishing a realistic scope of work. A decommissioning plan is developed in light of these previous studies, followed by an environmental impact assessment. It is found that, contrary to what was done in the case of the Rabon Grande pipeline, when a pipeline is to be shut down, the best-practice methodology is to temporarily or fully decommission the system as soon as possible.

  7. Efficiency improvements in pipeline transportation systems

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W. F.; Horton, J. F.

    1977-09-09

    This report identifies potential energy-conservative pipeline innovations that are most energy- and cost-effective and formulates recommendations for the R, D, and D programs needed to exploit those opportunities. From a candidate field of over twenty classes of efficiency improvements, eight systems are recommended for pursuit. Most of these possess two highly important attributes: large potential energy savings and broad applicability outside the pipeline industry. The R, D, and D program for each improvement and the recommended immediate next step are described. The eight technologies recommended for R, D, and D are gas-fired combined cycle compressor station; internally cooled internal combustion engine; methanol-coal slurry pipeline; methanol-coal slurry-fired and coal-fired engines; indirect-fired coal-burning combined-cycle pump station; fuel-cell pump station; drag-reducing additives in liquid pipelines; and internal coatings in pipelines.

  8. optimization for trenchless reconstruction of pipelines

    Directory of Open Access Journals (Sweden)

    Zhmakov Gennadiy Nikolaevich

    2015-01-01

    Full Text Available Today the technologies of trenchless reconstruction of pipelines are becoming more and more widely used in Russia and abroad. One of the most promising methods is shock-free destruction of the old pipeline being replaced, using hydraulic installations whose working mechanism is a cutting unit with knife disks and a conic expander. A design of the working mechanism that allows trenchless reconstruction of pipelines of different diameters has been optimized and patented, and its developmental prototype manufactured. The dependence of the pipeline cutting force on the bluntness of the knives of the working mechanism was determined: the cutting force for old steel pipelines increases in proportion to the bluntness of the knife. Two stands for endurance testing of the knives in a laboratory environment are proposed and patented.

  9. Cartier Pipeline : tying it all together

    Energy Technology Data Exchange (ETDEWEB)

    Brochu, S. [Enbridge Consumers Gas, Calgary, AB (Canada); Gaz Metropolitain, Montreal, PQ (Canada)

    2001-07-01

    The Cartier Pipeline is a proposed stand-alone pipeline involving an equal partnership between Alberta's Enbridge Consumers Gas and Quebec's Gaz Metropolitain to bring offshore Atlantic gas to markets in eastern Canada and the New England states. The in-service date for the proposed pipeline is November 2004. The $270-million project will require 262 km of pipeline in Quebec with an initial annual transportation capacity of 67 Bcf and cost-effective expandability to 125 Bcf. Contracted commitments so far include 30 Bcf/year from Gaz Metropolitain and the same from Enbridge. Cartier offers attractive, base-load, long-term, complementary market diversification for Atlantic production. It also provides producers with a competitive channel to Ontario storage. In addition, the pipeline will contribute to lower tolls in Canada and on the overall path to Dracut (Boston). Several graphs depicting expected costs of delivered gas supplies to Montreal were also included with this PowerPoint presentation. tabs., figs.

  10. Technical progress in pipeline design and construction

    Energy Technology Data Exchange (ETDEWEB)

    Hausken, K.B.

    1995-12-31

    This paper considers technical progress in offshore pipeline construction, limited to some general subjects covering pipeline design, installation and start-up. In the future, a limit-state pipeline design philosophy may be implemented as an alternative to the stress-based design commonly used today, giving potential for further optimisation of the pipeline design and, consequently, a reduction of the initial investment. Comprehensive research and development efforts in Norway in the second half of the 1970s made it technically feasible to cross the deep-water Norwegian Trench in the 1980s. In addition, the development of several offshore pipeline systems to date, including gas distribution systems to the European continent, has brought Norway to the forefront of technical expertise.

  11. Onset of scour below pipelines and self-burial

    DEFF Research Database (Denmark)

    Sumer, B. Mutlu; Truelsen, Christoffer; Sichmann, T.;

    2001-01-01

    locally (but not along the length of the pipeline as a two-dimensional process). The critical condition corresponding to the onset of scour was determined both in the case of currents and in the case of waves. Once the scour breaks out, it will propagate along the length of the pipeline, scour holes being......). At this point, the pipe begins to sink at the span shoulder (self-burial). It was found that the self-burial depth is governed mainly by the Keulegan-Carpenter number. The time scale of the self-burial process, on the other hand, is governed by the Keulegan-Carpenter number and the Shields parameter. Diagrams...

  12. MegaPipe: the MegaCam image stacking pipeline

    CERN Document Server

    Gwyn, Stephen D J

    2009-01-01

    This paper describes the MegaPipe image processing pipeline at the Canadian Astronomical Data Centre (CADC). The pipeline takes multiple images from the MegaCam mosaic camera on CFHT and combines them into a single output image. MegaPipe takes as input detrended MegaCam images and does a careful astrometric and photometric calibration on them. The calibrated images are then resampled and combined into image stacks. MegaPipe is run on PI data by request, data from large surveys (the CFHT Legacy Survey and the Next Generation Virgo Survey) and all non-proprietary MegaCam data in the CFHT archive. The stacked images and catalogs derived from these images are available through the CADC website. Currently, 1500 square degrees have been processed.

  13. Fatigue Crack Growth on Double Butt Weld with Toe Crack of Pipelines Steel

    OpenAIRE

    HADJOUI, Féthi; Benachour, Mustapha; Benguediab,Mohamed

    2012-01-01

    Welded structures have broad applicability (car industry, aeronautics, marine, pipelines, etc.). Welding, being an assembly process, presents both advantages and disadvantages: a single defect remaining after welding can generate a catastrophic fracture. This work studies the fatigue crack growth of a double butt weld with a toe crack. Two pipeline materials are studied, namely API 5L grades X60 and X70, under a tensile form of loading. In order to p...

  14. Pipeline integrity: ILI baseline data for QRA

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Todd R. [Tuboscope Pipeline Services, Houston, TX (United States)]. E-mail: tporter@varco.com; Silva, Jose Augusto Pereira da [Pipeway Engenharia, Rio de Janeiro, RJ (Brazil)]. E-mail: guto@pipeway.com; Marr, James [MARR and Associates, Calgary, AB (Canada)]. E-mail: jmarr@marr-associates.com

    2003-07-01

    The initial phase of a pipeline integrity management program (IMP) is conducting a baseline assessment of the pipeline system and segments as part of Quantitative Risk Assessment (QRA). This gives the operator's integrity team the opportunity to identify critical areas and deficiencies in the protection, maintenance, and mitigation strategies. As a part of data gathering and integration of a wide variety of sources, in-line inspection (ILI) data is a key element. In order to move forward in the integrity program development and execution, the baseline geometry of the pipeline must be determined with accuracy and confidence. From this, all subsequent analysis and conclusions will be derived. Tuboscope Pipeline Services (TPS), in conjunction with Pipeway Engenharia of Brazil, operate ILI inertial navigation system (INS) and Caliper geometry tools, to address this integrity requirement. This INS and Caliper ILI tool data provides pipeline trajectory at centimeter level resolution and sub-metre 3D position accuracy along with internal geometry - ovality, dents, misalignment, and wrinkle/buckle characterization. Global strain can be derived from precise INS curvature measurements and departure from the initial pipeline state. Accurate pipeline elevation profile data is essential in the identification of sag/over bend sections for fluid dynamic and hydrostatic calculations. This data, along with pipeline construction, operations, direct assessment and maintenance data is integrated in LinaViewPRO(TM), a pipeline data management system for decision support functions, and subsequent QRA operations. This technology provides the baseline for an informed, accurate and confident integrity management program. This paper/presentation will detail these aspects of an effective IMP, and experience will be presented, showing the benefits for liquid and gas pipeline systems. (author)
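
    As a small illustration of how strain follows from curvature (a bending-only simplification with hypothetical numbers, not TPS's processing), the peak longitudinal bending strain at the pipe's outer fibre is the measured curvature times half the outside diameter:

      def bending_strain(curvature_per_m, outside_diameter_m):
          # Peak longitudinal bending strain at the outer fibre: strain = curvature * D / 2.
          return curvature_per_m * outside_diameter_m / 2.0

      # Hypothetical example: 0.002 1/m curvature change on a 0.610 m (NPS 24) pipe.
      print(f"{bending_strain(0.002, 0.610):.4%}")   # about 0.06% strain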

  15. 77 FR 17119 - Pipeline Safety: Cast Iron Pipe (Supplementary Advisory Bulletin)

    Science.gov (United States)

    2012-03-23

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Cast Iron Pipe (Supplementary... operators of natural gas cast iron distribution pipelines and state pipeline safety representatives. Recent deadly explosions in Philadelphia and Allentown, Pennsylvania involving cast iron pipelines installed...

  16. High performance pipelined multiplier with fast carry-save adder

    Science.gov (United States)

    Wu, Angus

    1990-01-01

    A high-performance pipelined multiplier is described. Its high performance results from the fast carry-save adder basic cell, which has a simple structure and is suitable for the Gate Forest semi-custom environment. The carry-save adder computes the sum and carry within two gate delays. Results show that the proposed adder can operate at 200 MHz in a 2-micron CMOS process; better performance is expected in a Gate Forest realization.
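
    As a behavioural illustration of the carry-save principle described (not the gate-level Gate Forest cell), each cell reduces three input bits to a sum bit and a carry bit with no carry propagation; the word width and test values are arbitrary:

      def carry_save_cell(a, b, c):
          # One bit of a carry-save adder: sum and carry are formed in parallel.
          s = a ^ b ^ c                        # sum bit
          carry = (a & b) | (b & c) | (a & c)  # carry bit, weighted one place higher
          return s, carry

      def carry_save_add(x, y, z, width=8):
          # Reduce three words to a (sum word, carry word) pair, bit by bit.
          s = c = 0
          for i in range(width):
              si, ci = carry_save_cell((x >> i) & 1, (y >> i) & 1, (z >> i) & 1)
              s |= si << i
              c |= ci << (i + 1)
          return s, c

      s, c = carry_save_add(13, 7, 9)
      print(s + c == 13 + 7 + 9)  # True: a final carry-propagate add recovers the result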

  17. Magnetic anomaly inversion using magnetic dipole reconstruction based on the pipeline section segmentation method

    Science.gov (United States)

    Pan, Qi; Liu, De-Jun; Guo, Zhi-Yong; Fang, Hua-Feng; Feng, Mu-Qun

    2016-06-01

    In the model of a horizontal straight pipeline of finite length, the segmentation of the pipeline elements is a significant factor in the accuracy and rapidity of the forward modeling and inversion processes, but the existing pipeline segmentation method is very time-consuming. This paper proposes a section segmentation method to study the characteristics of pipeline magnetic anomalies—and the effect of model parameters on these magnetic anomalies—as a way to enhance computational performance and accelerate the convergence process of the inversion. Forward models using the piece segmentation method and section segmentation method based on magnetic dipole reconstruction (MDR) are established for comparison. The results show that the magnetic anomalies calculated by these two segmentation methods are almost the same regardless of different measuring heights and variations of the inclination and declination of the pipeline. In the optimized inversion procedure the results of the simulation data calculated by these two methods agree with the synthetic data from the original model, and the inversion accuracies of the burial depths of the two methods are approximately equal. The proposed method is more computationally efficient than the piece segmentation method—in other words, the section segmentation method can meet the requirements for precision in the detection of pipelines by magnetic anomalies and reduce the computation time of the whole process.
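
    A rough sketch of the dipole-reconstruction idea (the paper's exact segmentation and parameters are not reproduced): the pipeline is approximated as a line of point dipoles whose fields are superposed at each measurement point; the geometry, segment length and moments below are hypothetical:

      import numpy as np

      MU0_OVER_4PI = 1e-7  # T*m/A

      def dipole_field(r, m):
          # Magnetic flux density of a point dipole with moment m at offset r from the dipole.
          d = np.linalg.norm(r)
          return MU0_OVER_4PI * (3.0 * r * np.dot(m, r) / d**5 - m / d**3)

      def pipeline_anomaly(sensor_xyz, segment_centres, moment_per_segment):
          # Superpose the fields of all pipeline segments at one sensor position.
          return sum(dipole_field(sensor_xyz - c, moment_per_segment) for c in segment_centres)

      # Hypothetical geometry: pipe along x at 2 m depth, 1 m segments, uniform vertical moment,
      # sensor readings taken 1 m above ground on a profile crossing the pipe.
      centres = [np.array([x, 0.0, -2.0]) for x in np.arange(-10.0, 10.0, 1.0)]
      moment = np.array([0.0, 0.0, 50.0])   # A*m^2 per segment, assumed
      for y in (-3.0, 0.0, 3.0):
          b = pipeline_anomaly(np.array([0.0, y, 1.0]), centres, moment)
          print(f"y = {y:+.0f} m: Bz anomaly = {b[2] * 1e9:.1f} nT")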

  18. Rapid Large Scale Reprocessing of the ODI Archive using the QuickReduce Pipeline

    Science.gov (United States)

    Gopu, A.; Kotulla, R.; Young, M. D.; Hayashi, S.; Harbeck, D.; Liu, W.; Henschel, R.

    2015-09-01

    The traditional model of astronomers collecting their observations as raw instrument data is being increasingly replaced by astronomical observatories serving standard calibrated data products to observers and to the public at large once proprietary restrictions are lifted. For this model to be effective, observatories need the ability to periodically re-calibrate archival data products as improved master calibration products or pipeline improvements become available, and also to allow users to rapidly calibrate their data on-the-fly. Traditional astronomy pipelines are heavily I/O dependent and do not scale with increasing data volumes. In this paper, we present the One Degree Imager - Portal, Pipeline and Archive (ODI-PPA) calibration pipeline framework which integrates the efficient and parallelized QuickReduce pipeline to enable a large number of simultaneous, parallel data reduction jobs - initiated by operators AND/OR users - while also ensuring rapid processing times and full data provenance. Our integrated pipeline system allows re-processing of the entire ODI archive (˜15,000 raw science frames, ˜3.0 TB compressed) within ˜18 hours using twelve 32-core compute nodes on the Big Red II supercomputer. Our flexible, fast, easy to operate, and highly scalable framework improves access to ODI data, in particular when data rates double with an upgraded focal plane (scheduled for 2015), and also serve as a template for future data processing infrastructure across the astronomical community and beyond.

  19. Design and Implementation of an Urban Underground Pipeline Information Data Processing System Based on the Standard Windows Application Model

    Institute of Scientific and Technical Information of China (English)

    魏磊

    2015-01-01

    At present, pipeline mapping still relies on the traditional working mode of checking the map against the database, so modifying the attributes of one class of pipeline requires simultaneously modifying the sketch, the database, the survey field book and the pipeline map, which makes the work very inconvenient. This paper breaks with the traditional data-processing mode at the system design level and implements map-database linkage and integrated survey-to-map functions: by editing a pipeline's attribute and spatial information directly on the map, the map, the tables and the database are all modified at the same time, which greatly improves work efficiency.

  20. Current pipelines for neglected diseases.

    Directory of Open Access Journals (Sweden)

    Paolo di Procolo

    2014-09-01

    Full Text Available This paper scrutinises pipelines for Neglected Diseases (NDs), through freely accessible and at-least-weekly updated trials databases. It updates to 2012 data provided by recent publications, and integrates these analyses with information on the location of trials coordinators and patient recruitment status. Additionally, it provides (i) disease-specific information to better understand the rationale of investments in NDs, (ii) yearly data, to understand the investment trends. The search identified 650 clinical studies. Leishmaniasis, Arbovirus infection, and Dengue are the top three diseases by number of clinical studies. Disease diffusion risk seems to be the most important driver of the clinical trials target choice, whereas the role played by disease prevalence and unmet need is controversial. The number of trials is stable between 2005 and 2010, with an increase in the last two years. Patient recruitment was completed for most studies (57.6%), and Phases II and III account for 35% and 28% of trials, respectively. The primary purpose of clinical investigations is prevention (49.3%), especially for infectious diseases with mosquitoes and sand flies as the vector, and treatment (43.2%), which is the primary target for parasitic diseases. Research centres and public organisations are the most important clinical studies sponsors (58.9%), followed by the pharmaceutical industry (24.1%), and foundations and non-governmental organisations (9.3%). Many coordinator centres are located in less affluent countries (43.7%), whereas OECD countries and BRICS account for 34.7% and 17.5% of trials, respectively. Information was partially missing for some parameters. Notwithstanding, and despite its descriptive nature, this research has enhanced the evidence of the literature on pipelines for NDs. Future contributions may further investigate whether trials metrics are consistent with the characteristics of the interested countries and the explicative variables of trials location

  1. Current pipelines for neglected diseases.

    Science.gov (United States)

    di Procolo, Paolo; Jommi, Claudio

    2014-09-01

    This paper scrutinises pipelines for Neglected Diseases (NDs), through freely accessible and at-least-weekly updated trials databases. It updates to 2012 data provided by recent publications, and integrates these analyses with information on location of trials coordinators and patient recruitment status. Additionally, it provides (i) disease-specific information to better understand the rationale of investments in NDs, (ii) yearly data, to understand the investment trends. The search identified 650 clinical studies. Leishmaniasis, Arbovirus infection, and Dengue are the top three diseases by number of clinical studies. Disease diffusion risk seems to be the most important driver of the clinical trials target choice, whereas the role played by disease prevalence and unmet need is controversial. The number of trials is stable between 2005 and 2010, with an increase in the last two years. Patient recruitment was completed for most studies (57.6%), and Phases II and III account for 35% and 28% of trials, respectively. The primary purpose of clinical investigations is prevention (49.3%), especially for infectious diseases with mosquitoes and sand flies as the vector, and treatment (43.2%), which is the primary target for parasitic diseases. Research centres and public organisations are the most important clinical studies sponsors (58.9%), followed by the pharmaceutical industry (24.1%), foundations and non-governmental organisations (9.3%). Many coordinator centres are located in less affluent countries (43.7%), whereas OECD countries and BRICS account for 34.7% and 17.5% of trials, respectively. Information was partially missing for some parameters. Notwithstanding, and despite its descriptive nature, this research has enhanced the evidence of the literature on pipelines for NDs. Future contributions may further investigate whether trials metrics are consistent with the characteristics of the interested countries and the explicative variables of trials location, target

  2. Application of Morphological Segmentation to Leaking Defect Detection in Sewer Pipelines

    Directory of Open Access Journals (Sweden)

    Tung-Ching Su

    2014-05-01

    Full Text Available As one of the major underground pipelines, sewerage is an important infrastructure in any modern city. The most common problem occurring in sewerage is leaking, whose position and failure level are typically identified through closed circuit television (CCTV) inspection in order to facilitate the rehabilitation process. This paper proposes a novel computer vision method, morphological segmentation based on edge detection (MSED), to assist inspectors in detecting pipeline defects in CCTV inspection images. In addition to MSED, other mathematical morphology-based image segmentation methods, including the opening top-hat operation (OTHO) and the closing bottom-hat operation (CBHO), were also applied to defect detection in vitrified clay sewer pipelines. The CCTV inspection images of the sewer system in the 9th district, Taichung City, Taiwan were selected as the experimental materials. The segmentation results demonstrate that MSED and OTHO are useful for the detection of cracks and open joints, respectively, which are the typical leakage defects found in sewer pipelines.
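
    A minimal OpenCV sketch of the OTHO/CBHO idea (the authors' MSED edge-based variant is not reproduced); the synthetic frame and kernel size stand in for real CCTV imagery:

      import cv2
      import numpy as np

      # Synthetic stand-in for a CCTV frame: mid-grey pipe wall with a dark crack-like line.
      img = np.full((240, 320), 128, dtype=np.uint8)
      cv2.line(img, (40, 200), (280, 60), 40, 2)

      kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
      otho = cv2.morphologyEx(img, cv2.MORPH_TOPHAT, kernel)    # bright narrow features (e.g. open joints)
      cbho = cv2.morphologyEx(img, cv2.MORPH_BLACKHAT, kernel)  # dark narrow features (e.g. cracks)

      _, crack_mask = cv2.threshold(cbho, 20, 255, cv2.THRESH_BINARY)
      _, joint_mask = cv2.threshold(otho, 20, 255, cv2.THRESH_BINARY)
      print("candidate crack pixels:", int(np.count_nonzero(crack_mask)))
      print("candidate open-joint pixels:", int(np.count_nonzero(joint_mask)))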

  3. Experimental and Numerical Analysis of a Water Emptying Pipeline Using Different Air Valves

    Directory of Open Access Journals (Sweden)

    Oscar E. Coronado-Hernández

    2017-02-01

    Full Text Available The emptying procedure is a common operation that engineers have to face in pipelines. It generates subatmospheric pressure caused by the expansion of air pockets, which can produce the collapse of the system depending on the conditions of the installation. To avoid this problem, engineers install air valves in pipelines. However, if the air valves are not adequately designed, then the risk to the pipeline remains. In this research, a mathematical model is developed to simulate the emptying process in pipelines, which can be used for planning this type of operation. The proposed one-dimensional model analyzes the propagation of the water phase with a new rigid model and the effect of the air pockets using thermodynamic formulations. The proposed model is validated through measurements of the air pocket absolute pressure, the water velocity and the length of the emptying columns in an experimental facility. Results show that the proposed model can accurately predict the hydraulic characteristic variables.
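
    The authors' full formulation is not reproduced; the sketch below only illustrates the coupling they describe, a rigid water-column momentum balance driven by a polytropically expanding air pocket, with hypothetical pipe data and a simple explicit time stepping:

      import numpy as np

      # Hypothetical pipe and fluid data.
      L, D, theta = 50.0, 0.1, np.radians(5)   # pipe length [m], diameter [m], downward slope
      f, k = 0.02, 1.2                          # Darcy friction factor, polytropic exponent
      rho, g, p_atm = 1000.0, 9.81, 101325.0

      x0, p0 = 2.0, p_atm          # initial air-pocket length [m] and pressure (no air admission)
      x, v, dt = x0, 0.0, 1e-3

      for step in range(int(20.0 / dt)):        # simulate 20 s
          Lw = L - x                            # remaining water-column length
          if Lw <= 0.5:
              break
          p_air = p0 * (x0 / x) ** k            # polytropic expansion of the trapped air pocket
          # Rigid-column momentum balance on the draining water column.
          dvdt = (p_air - p_atm) / (rho * Lw) + g * np.sin(theta) - f * v * abs(v) / (2 * D)
          v += dvdt * dt
          x += v * dt                           # the air pocket grows as water drains
      print(f"t = {step * dt:.1f} s, drained length = {x - x0:.2f} m, "
            f"air pocket pressure = {p_air / p_atm:.2f} atm")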

  4. Method to reduce arc blow during DC arc welding of pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Espina-Hernandez, J. H.; Rueda-Morales, G.L.; Caleyo, F.; Hallen, J. M. [Instituto Politecnico Nacional, Mexico, (Mexico); Lopez-Montenegro, A.; Perz-Baruch, E. [Pemex Exploracion y Produccion, Tabasco, (Mexico)

    2010-07-01

    Steel pipelines are huge ferromagnetic structures and can easily be subjected to arc blow during the DC arc welding process. The development of methods to avoid arc blow during pipeline DC arc welding is a major objective in the pipeline industry. This study developed a simple procedure to compensate for the residual magnetic field in the groove during DC arc welding. A Gaussmeter was used to perform magnetic flux density measurements on pipelines in southern Mexico. These data were used to perform magnetic finite element simulations using FEMM. Different variables were studied, such as the residual magnetic field in the groove and the position of the coil with respect to the groove. An empirical predictive equation was developed from these trials to compensate for the residual magnetic field. A new method of compensating for the residual magnetic field in the groove, by selecting the number of coil turns and the position of the coil with respect to the groove, was established.

  5. Extremely high resolution corrosion monitoring of pipelines: retrofittable, non-invasive and real-time

    Energy Technology Data Exchange (ETDEWEB)

    Baltzersen, Oeystein; Tveit, Edd [Sensorlink AS, Trondheim (Norway); Verley, Richard [StatoilHydro ASA, Stockholm (Sweden)

    2009-07-01

    The Ultramonit unit is a clamp-on tool (removable) that uses an array of sensors to provide online, real-time, reliable and repeatable high accuracy ultrasonic wall thickness measurements and corrosion monitoring at selected locations along the pipeline. The unit can be installed on new or existing pipelines by diver or ROV. The system is based on the well-established ultrasonic pulse-echo method (A-scan). Special processing methods, and the fact that the unit is fixed to the pipeline, enable detection of changes in wall thickness in the micro-meter range. By utilizing this kind of resolution, it is possible to project corrosion rates in hours or days. The tool is used for calibration of corrosion inhibitor programs, verification and calibration of inspection pig data and general corrosion monitoring of new and existing pipelines. (author)
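
    A minimal illustration of the pulse-echo relation underlying this kind of monitoring (the Ultramonit processing itself is not public; numbers are hypothetical):

      def wall_thickness_m(round_trip_time_s, sound_speed_m_per_s=5920.0):
          # Pulse-echo: the pulse crosses the wall twice, so thickness = v * t / 2
          # (longitudinal wave speed in carbon steel is roughly 5900-5950 m/s).
          return sound_speed_m_per_s * round_trip_time_s / 2.0

      # Two hypothetical readings a month apart: 6.50 us vs 6.47 us round-trip time.
      t1, t2 = wall_thickness_m(6.50e-6), wall_thickness_m(6.47e-6)
      print(f"wall loss: {(t1 - t2) * 1e6:.0f} um")   # about 89 um, micrometre-range resolution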

  6. Integrity assessment of pipelines - additional remarks; Avaliacao da integridade de dutos - observacoes adicionais

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Luis F.C. [PETROBRAS S.A., Salvador, BA (Brazil). Unidade de Negocios. Exploracao e Producao

    2005-07-01

    Integrity assessment of pipelines is part of a process that aims to enhance the operating safety of pipelines. During this task, questions related to the interpretation of inspection reports and to how the impact of several parameters on pipeline integrity should be regarded normally come up. In order to satisfactorily answer such questions, the integrity assessment team must be able to suitably approach different subjects such as corrosion control and monitoring, assessment of metal loss and geometric anomalies, and third-party activities. This paper presents additional remarks on some of these questions based on the integrity assessment of almost fifty pipelines that has been done at PETROBRAS E&P Bahia over the past eight years. (author)

  7. Real-Time Low Frequency Impedance Measurements for Determination of Hydrogen Content in Pipeline Steel

    Science.gov (United States)

    Lasseigne, A. N.; Koenig, K.; Olson, D. L.; Jackson, J. E.; Mishra, B.; McColskey, J. D.

    2009-03-01

    The assessment of hydrogen content in pipeline steel is an essential requirement to monitor loss of pipe integrity with time and to prevent failures. The use of pipeline steels of increasing strength significantly reduces the threshold hydrogen concentration for hydrogen cracking. Cathodic protection and corrosion processes both contribute to accumulation of hydrogen as a function of time, which may eventually meet the cracking criteria. New and unique methodologies based on electronic property measurements offer the pipeline industry advanced non-destructive tools to provide quantified in-situ hydrogen content measurements in real-time. The use of low frequency impedance measurements as a non-contact sensor has been demonstrated for real-time determination of hydrogen content in coated pipeline steel specimens in the laboratory. Scale-up to field measurements is in progress, and the development and use of a field sensor are discussed.

  8. Basic Block of Pipelined ADC Design Requirements

    Directory of Open Access Journals (Sweden)

    V. Kledrowetz

    2011-04-01

    Full Text Available The paper describes design requirements of a basic stage (called MDAC - Multiplying Digital-to-Analog Converter) of a pipelined ADC. There exist error sources such as finite DC gain of the opamp, capacitor mismatch, thermal noise, etc., arising when the switched capacitor (SC) technique and CMOS technology are used. These non-idealities are explained and their influence on the overall parameters of a pipelined ADC is studied. The pipelined ADC including non-idealities was modeled in the MATLAB - Simulink simulation environment.
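
To make the kind of behavioral model mentioned above concrete, here is a hedged Python sketch of a single 1.5-bit MDAC stage with finite opamp DC gain, capacitor mismatch and kT/C noise, followed by a crude multi-stage conversion; the structure and all numeric values are illustrative assumptions, not the authors' Simulink model.

```python
# Behavioral sketch of a 1.5-bit pipelined-ADC stage (MDAC) including three of
# the non-idealities discussed in the paper: finite opamp DC gain, capacitor
# mismatch and kT/C thermal noise. Structure and numbers are assumptions.

import numpy as np

VREF = 1.0
KT = 1.38e-23 * 300.0  # Boltzmann constant times 300 K, in joules

def stage_1p5bit(vin, a0=1e3, cap_mismatch=1e-3, c_sample=1e-12, rng=None):
    """One 1.5-bit stage: sub-ADC decision d in {-1, 0, +1} and the residue
    Vres ~ G*(2*Vin - d*Vref), with the gain perturbed by finite DC gain and
    capacitor mismatch, plus sampled kT/C noise."""
    rng = rng or np.random.default_rng()
    d = -1 if vin < -VREF / 4 else (1 if vin > VREF / 4 else 0)
    beta = 0.5                                  # feedback factor of a gain-of-2 MDAC
    gain_err = 1.0 / (1.0 + 1.0 / (a0 * beta))  # error term from finite DC gain
    g = 2.0 * (1.0 + cap_mismatch) * gain_err   # actual interstage gain
    noise = rng.normal(0.0, np.sqrt(KT / c_sample))
    vres = g * vin - d * VREF * (1.0 + cap_mismatch) * gain_err + noise
    return d, vres

def convert(vin, n_stages=10, **stage_kwargs):
    """Pass the input through n_stages and rebuild an analog estimate from the
    stage decisions; the difference to vin shows the combined stage errors."""
    v, estimate = vin, 0.0
    for i in range(1, n_stages + 1):
        d, v = stage_1p5bit(v, **stage_kwargs)
        estimate += d * VREF / 2.0 ** i
    return estimate

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    inputs = np.linspace(-0.9, 0.9, 2001)
    errors = np.array([convert(v, a0=1e3, rng=rng) - v for v in inputs])
    print(f"rms conversion error with A0 = 1000: {1e3 * np.sqrt(np.mean(errors**2)):.3f} mV")
```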

  9. Suriname installing first crude-oil pipeline

    Energy Technology Data Exchange (ETDEWEB)

    McAllister, E.W. (E.W. McAllister Engineering Services, Houston, TX (US))

    1992-04-27

    This paper reports that the first cross-country crude-oil pipeline in the South American country of Suriname is currently under construction. The State Oil Co. of Suriname (Staatsolie) is building the 34.4-mile, 14-in. pipeline to deliver crude oil from the Catharina Sophia field (Tambaredjo) to the Tout Lui Faut terminal near the capital, Paramaribo. Crude oil from the Jossi Kreek field will be injected at mile point (MP) 3.4. Oil from these two fields is now being moved to Tout Lui Faut by Staatsolie-owned motorized ocean barges. Increased production to meet requirements of a planned refinery near Tout Lui Faut prompted the pipeline.

  10. A Survey of Visual Analytic Pipelines

    Institute of Scientific and Technical Information of China (English)

    Xu-Meng Wang; Tian-Ye Zhang; Yu-Xin Ma; Jing Xia; Wei Chen

    2016-01-01

    Visual analytics has been widely studied in the past decade. One key to make visual analytics practical for both research and industrial applications is the appropriate definition and implementation of the visual analytics pipeline which provides effective abstractions for designing and implementing visual analytics systems. In this paper we review the previous work on visual analytics pipelines and individual modules from multiple perspectives: data, visualization, model and knowledge. In each module we discuss various representations and descriptions of pipelines inside the module, and compare the commonalities and the differences among them.

  11. Management of a multi-product pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Bettoli, R.; Iacovoni, A. [Nuovo Pignone S.p.A., Roma (Italy). Div. Sistemi Automazione; Holden, D. [LICconsult, Stockton-on-Tees (United Kingdom)

    1996-09-01

    The paper describes the SCADA (System Control and Data Acquisition) Tele-supervisory System for the Kandla-Bhatinda (KBPL) multi-product pipeline. The KBPL pipeline is 1,443 km in length; it is to carry petroleum products, in a batch cycle organization, from the Kandla foreshore terminal to Bhatinda. It consists of two inlet stations, four pumping stations, five delivery stations, and two terminal stations, and is equipped with a total of 85 block valves. All the stations have the capability for launching and receiving scrapers. The SCADA system consists of 10 Station Control Centers (SCC) and one Master Control Center (MCC), all located along the pipeline.

  12. Vision-Based System of AUV for An Underwater Pipeline Tracker

    Institute of Scientific and Technical Information of China (English)

    ZHANG Tie-dong; ZENG Wen-jing; WAN Lei; QIN Zai-bai

    2012-01-01

    This paper describes a new framework for detection and tracking of underwater pipelines, comprising both a software system and a hardware system. It is designed for the vision system of an AUV based on a monocular CCD camera. First, the real-time data flow from the image capture card is pre-processed and pipeline features are extracted for navigation. A region saturation degree is introduced to remove false edge-point groups after the Sobel operation, and an appropriate way is proposed to clear the disturbance around the peak point in the Hough transform. Second, the continuity of the pipeline layout is taken into account to improve the efficiency of line extraction. Once the line information has been obtained, the reference zone is predicted by a Kalman filter; it denotes the possible position of the pipeline in the next frame, so that the pipeline information of each frame can be known in advance. Results obtained on real optical vision data in tank experiments are displayed and discussed. They show that the proposed system can detect and track an underwater pipeline online, and is effective and feasible.
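
A hedged sketch of such a detect-and-track loop is given below: synthetic frames, edge extraction (Canny here, standing in for the paper's Sobel-based scheme), a Hough transform for the dominant near-vertical line, and a small constant-velocity Kalman filter that predicts the line offset (the "reference zone") for the next frame. The paper's region-saturation and Hough peak-cleaning steps are not reproduced, and all tuning values are assumptions.

```python
# Illustrative detect-and-track loop: synthetic frames, Canny + Hough line
# detection, and a tiny Kalman filter predicting the line offset per frame.

import cv2
import numpy as np

def make_frame(offset_px, size=240):
    """Synthetic underwater frame: a bright pipeline-like stripe plus noise."""
    img = np.full((size, size), 40, np.uint8)
    cv2.line(img, (offset_px, 0), (offset_px, size - 1), 200, 6)
    noise = np.random.default_rng(0).integers(0, 25, img.shape, dtype=np.uint8)
    return cv2.add(img, noise)

def detect_line_offset(img):
    """Edges + Hough transform; return the x-offset of the strongest
    near-vertical line, or None if nothing is found."""
    edges = cv2.Canny(cv2.GaussianBlur(img, (5, 5), 0), 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 80)
    if lines is None:
        return None
    rho, theta = lines[0][0]               # strongest peak in the accumulator
    return float(rho) if abs(theta) < np.pi / 8 else None

class OffsetKalman:
    """Constant-velocity Kalman filter on the state [offset, offset_rate]."""
    def __init__(self, q=1.0, r=4.0):
        self.x = np.zeros(2)
        self.P = np.eye(2) * 100.0
        self.F = np.array([[1.0, 1.0], [0.0, 1.0]])
        self.Q = np.eye(2) * q
        self.H = np.array([[1.0, 0.0]])
        self.R = r

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[0]                   # predicted offset = centre of the search zone

    def update(self, z):
        S = self.H @ self.P @ self.H.T + self.R
        K = (self.P @ self.H.T) / S
        self.x = self.x + (K * (z - self.H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P

if __name__ == "__main__":
    kf = OffsetKalman()
    for true_offset in range(100, 140, 4):     # the pipeline drifts across the frames
        predicted = kf.predict()
        measured = detect_line_offset(make_frame(true_offset))
        if measured is not None:
            kf.update(measured)
        print(f"true {true_offset:3d}  predicted {predicted:6.1f}  measured {measured}")
```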

  13. Leak in spiral weld in a 16 inches gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Fazzini, Pablo G.; Bona, Jeremias de [GIE S.A., Mar del Plata (Argentina); Otegui, Jose L. [University of Mar del Plata (Argentina)

    2009-07-01

    This paper discusses a failure analysis after a leak in the spiral weld of a 16 inch natural gas pipeline, in service since 1974. The leak was the result of the coalescence of two different defects, one on each surface of the pipe wall, located in the center of the inner cord of the helical DSAW weld. Fractographic and metallographic studies revealed that the leak was a combination of three conditions. During fabrication of the pipe, grain boundary segregations grouped at mid-weld. During service, these segregations underwent a process of selective galvanic corrosion. One of these volumetric defects coincided with a tubular pore in the outer weld. Pigging of the pipeline in 2005 for cleaning likely contributed to the increase in leak flow by eliminating corrosion product plugs. Although these defects are likely to recur, fracture mechanics shows that a defect of this type is unlikely to cause a blowout. (author)

  14. PEARL RIVER DELTA OIL PRODUCTS PIPELINE ENTERS SHENZHEN

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    With the strong support of the Shenzhen and Huizhou municipal governments and the Shenzhen Petroleum Subsidiary Company, construction of the Shenzhen (Huizhou) section of the Pearl River Delta oil products pipeline started ahead of schedule on Sep. 30, 2004, and coordination work for most of the Shenzhen section had been completed by the end of April this year. At present, construction of the Shenzhen section is proceeding smoothly in general: along the pipeline route, 69 km has been prepared, 60 km of pipe laid and 57.7 km welded; for the process stations, the removal of the Mawan oil storage has been completed as planned, and the foundation for the Dapengwan oil storage tank is under construction.

  15. Inclusion variations and calcium treatment optimization in pipeline steel production

    Science.gov (United States)

    Liu, Jian-Hua; Wu, Hua-Jie; Bao, Yan-Ping; Wang, Min

    2011-10-01

    SiCa wire and SiCaBaFe alloy were injected into liquid pipeline steel at the end of LF refining as calcium treatment, and samples were taken from the ladles, mould and slabs. Analysis of the Ca content and inclusions shows that the Ca content in the steel decreases markedly in the process steps following calcium treatment, and that the composition, morphology and size of the inclusions also vary considerably during production: primary inclusions in the ladles prior to calcium treatment are mainly Al2O3, but they turn into fine irregular CaS-CaO-Al2O3 compound inclusions after the treatment, then become fine globular CaO-Al2O3 inclusions in the mould, and finally change into a few larger irregular CaS-CaO-Al2O3 complex inclusions in the slabs. A thermodynamic study reveals that the inclusion variations are related to the preferential reactions among Ca, Al2O3 and S and to the precipitation of S in CaO-Al2O3 inclusions with high sulfur capacity. New evaluation standards for calcium treatment in high-grade pipeline steel were put forward according to the inclusion variations and the requirements of pipeline steel on inclusion control, and the calcium treatment process was studied and optimized.

  16. 49 CFR 192.513 - Test requirements for plastic pipelines.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Test requirements for plastic pipelines. 192.513 Section 192.513 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND... Test requirements for plastic pipelines. (a) Each segment of a plastic pipeline must be tested...

  17. 75 FR 66425 - Pipeline Safety: Request for Special Permit

    Science.gov (United States)

    2010-10-28

    ... the Federal Pipeline Safety Laws, PHMSA is publishing this notice of a special permit request we have received from Gulf South Pipeline Company, LP, a natural gas pipeline operator, seeking relief from compliance with certain requirements in the Federal Pipeline Safety Regulations. This notice seeks public...

  18. 49 CFR 192.10 - Outer continental shelf pipelines.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Outer continental shelf pipelines. 192.10 Section... NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS General § 192.10 Outer continental shelf pipelines. Operators of transportation pipelines on the Outer Continental Shelf (as defined in...

  19. Corral framework: Trustworthy and fully functional data intensive parallel astronomical pipelines

    Science.gov (United States)

    Cabral, J. B.; Sánchez, B.; Beroiz, M.; Domínguez, M.; Lares, M.; Gurovich, S.; Granitto, P.

    2017-07-01

    Data processing pipelines represent an important slice of the astronomical software library that includes chains of processes that transform raw data into valuable information via data reduction and analysis. In this work we present Corral, a Python framework for astronomical pipeline generation. Corral features a Model-View-Controller design pattern on top of an SQL relational database capable of handling custom data models, processing stages and communication alerts, and also provides automatic quality and structural metrics based on unit testing. The Model-View-Controller provides concept separation between the user logic and the data models, delivering at the same time multi-processing and distributed computing capabilities. Corral represents an improvement over commonly found data processing pipelines in astronomy, since the design pattern frees the programmer from dealing with processing flow and parallelization issues, allowing them to focus on the specific algorithms needed for the successive data transformations, and at the same time provides a broad measure of quality over the created pipeline. Corral and working examples of pipelines that use it are available to the community at https://github.com/toros-astro.
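
Corral's actual API is documented in the repository linked above; as a framework-agnostic illustration of the pattern the abstract describes (data models kept separate from processing steps, with steps run in parallel workers), here is a minimal sketch with invented class names.

```python
# Minimal sketch of the model/step separation idea. NOT Corral's real API:
# the class and function names here are invented for illustration only.

from dataclasses import dataclass, field
from multiprocessing import Pool
from typing import List, Optional

@dataclass
class LightCurve:                      # the "model": what the pipeline stores
    star_id: int
    magnitudes: List[float] = field(default_factory=list)
    mean_mag: Optional[float] = None

class ComputeMeanStep:                 # a "step": user logic only, no plumbing
    def process(self, lc: LightCurve) -> LightCurve:
        lc.mean_mag = sum(lc.magnitudes) / len(lc.magnitudes)
        return lc

def run_step(lc: LightCurve) -> LightCurve:
    # Module-level wrapper so worker processes can pickle the call.
    return ComputeMeanStep().process(lc)

if __name__ == "__main__":
    batch = [LightCurve(i, [15.0 + 0.01 * i, 15.1, 14.9]) for i in range(8)]
    with Pool(processes=4) as pool:    # the framework, not the user, owns this part
        results = pool.map(run_step, batch)
    for lc in results:
        print(lc.star_id, round(lc.mean_mag, 3))
```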

  20. Environmental analysis for pipeline gas demonstration plants

    Energy Technology Data Exchange (ETDEWEB)

    Stinton, L.H.

    1978-09-01

    The Department of Energy (DOE) has implemented programs for encouraging the development and commercialization of coal-related technologies, which include coal gasification demonstration-scale activities. In support of commercialization activities the Environmental Analysis for Pipeline Gas Demonstration Plants has been prepared as a reference document to be used in evaluating potential environmental and socioeconomic effects from construction and operation of site- and process-specific projects. Effluents and associated impacts are identified for six coal gasification processes at three contrasting settings. In general, impacts from construction of a high-Btu gas demonstration plant are similar to those caused by the construction of any chemical plant of similar size. The operation of a high-Btu gas demonstration plant, however, has several unique aspects that differentiate it from other chemical plants. Offsite development (surface mining) and disposal of large quantities of waste solids constitute important sources of potential impact. In addition, air emissions require monitoring for trace metals, polycyclic aromatic hydrocarbons, phenols, and other emissions. Potential biological impacts from long-term exposure to these emissions are unknown, and additional research and data analysis may be necessary to determine such effects. Possible effects of pollutants on vegetation and human populations are discussed. The occurrence of chemical contaminants in liquid effluents and the bioaccumulation of these contaminants in aquatic organisms may lead to adverse ecological impact. Socioeconomic impacts are similar to those from a chemical plant of equivalent size and are summarized and contrasted for the three surrogate sites.

  1. 75 FR 63774 - Pipeline Safety: Safety of On-Shore Hazardous Liquid Pipelines

    Science.gov (United States)

    2010-10-18

    ... an operator's Supervisory Control and Data Acquisition (SCADA) system for controlling the pipeline... activation timing, or methods for integration of EFRD operation with an operator's SCADA and leak...

  2. Pipelines in Louisiana, Geographic NAD83, USGS (1999) [pipelines_la_usgs_1999

    Data.gov (United States)

    Louisiana Geographic Information Center — This dataset contains vector line map information of various pipelines throughout the State of Louisiana. The vector data contain selected base categories of...

  3. A Novel Method to Enhance Pipeline Trajectory Determination Using Pipeline Junctions.

    Science.gov (United States)

    Sahli, Hussein; El-Sheimy, Naser

    2016-04-21

    Pipeline inspection gauges (pigs) have been used for many years to perform various maintenance operations in oil and gas pipelines. Different pipeline parameters can be inspected during the pig journey. Although pigs use many sensors to detect the required pipeline parameters, matching these data with the corresponding pipeline location is considered a very important parameter. High-end, tactical-grade inertial measurement units (IMUs) are used in pigging applications to locate the detected problems of pipeline using other sensors, and to reconstruct the trajectories of the pig. These IMUs are accurate; however, their high cost and large sizes limit their use in small diameter pipelines (8″ or less). This paper describes a new methodology for the use of MEMS-based IMUs using an extended Kalman filter (EKF) and the pipeline junctions to increase the position parameters' accuracy and to reduce the total RMS errors even during the unavailability of above ground markers (AGMs). The results of this new proposed method using a micro-electro-mechanical systems (MEMS)-based IMU revealed that the position RMS errors were reduced by approximately 85% compared to the standard EKF solution. Therefore, this approach will enable the mapping of small diameter pipelines, which was not possible before.
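
As a hedged illustration of the junction-aiding idea, the sketch below dead-reckons a pig's along-track position from a biased speed signal and applies a scalar Kalman update whenever a junction of surveyed chainage is passed; a real implementation fuses a full 3-D MEMS IMU mechanization in the EKF, and all noise figures and junction positions here are assumptions.

```python
# Conceptual sketch: junction positions of known chainage used as absolute
# position measurements to bound dead-reckoning drift between AGMs.

import numpy as np

rng = np.random.default_rng(1)
dt, speed = 1.0, 2.0                       # s, m/s
junctions = {200: 400.0, 450: 900.0}       # time step -> surveyed chainage (m)

x, P = 0.0, 1.0                            # state: along-track position, variance
Q, R = 0.05**2, 0.5**2                     # process and junction-measurement noise
truth = 0.0
errors = []

for k in range(1, 601):
    truth += speed * dt
    # Prediction: integrate a biased, noisy speed (stand-in for IMU drift).
    x += (speed + 0.02 + rng.normal(0, 0.05)) * dt
    P += Q
    # Correction: a junction detected in the sensor data gives absolute position.
    if k in junctions:
        z = junctions[k] + rng.normal(0, 0.5)
        K = P / (P + R)
        x += K * (z - x)
        P *= (1 - K)
    errors.append(abs(x - truth))

print(f"max position error with junction updates: {max(errors):.2f} m")
```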

  4. A Novel Method to Enhance Pipeline Trajectory Determination Using Pipeline Junctions

    Directory of Open Access Journals (Sweden)

    Hussein Sahli

    2016-04-01

    Full Text Available Pipeline inspection gauges (pigs) have been used for many years to perform various maintenance operations in oil and gas pipelines. Different pipeline parameters can be inspected during the pig journey. Although pigs use many sensors to detect the required pipeline parameters, matching these data with the corresponding pipeline location is considered a very important parameter. High-end, tactical-grade inertial measurement units (IMUs) are used in pigging applications to locate the detected problems of pipeline using other sensors, and to reconstruct the trajectories of the pig. These IMUs are accurate; however, their high cost and large sizes limit their use in small diameter pipelines (8″ or less). This paper describes a new methodology for the use of MEMS-based IMUs using an extended Kalman filter (EKF) and the pipeline junctions to increase the position parameters’ accuracy and to reduce the total RMS errors even during the unavailability of above ground markers (AGMs). The results of this new proposed method using a micro-electro-mechanical systems (MEMS)-based IMU revealed that the position RMS errors were reduced by approximately 85% compared to the standard EKF solution. Therefore, this approach will enable the mapping of small diameter pipelines, which was not possible before.

  5. Communication systems vital to Colombian pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Serrato, E. [Ecopetrol, Bogota (Colombia); Mailloux, R. [Bristol Babcock Inc., Watertown, CT (United States)

    1997-02-01

    Construction of the Centro Oriente Gas Pipeline represents a major step in Colombia's goal to strengthen the emerging natural gas business. With construction beginning in 1995, the Centro Oriente is scheduled to begin operation early this year, transporting 150 MMcf/d. The 779-kilometer (484-mile) pipeline, ranging in diameter from 22 in. to 12 in., provides the central transportation link between major gas suppliers in both the northern and western regions of Colombia and new markets throughout their immediate regions as well as in the central and eastern regions. TransCanada, operating company for the Centro Oriente pipeline, will develop and manage the support organizations required to operate and maintain the system. The central control system for the CPC is the Gas SCADA system, ADACS, provided by Bristol Babcock Inc. (BBI). This control system provides the data acquisition and control capabilities necessary to operate the entire pipeline safely and efficiently from Bucaramanga.

  6. Citizenship program in near communities of pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Mascarenhas, Carina R.; Vilas Boas, Ianne P. [TELSAN Engenharia, Belo Horizonte, MG (Brazil); Bourscheid, Pitagoras [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-12-19

    During the construction of a pipeline, IENE - the Engineering Unit of PETROBRAS responsible for the construction and erection of pipelines and related plants in northeastern Brazil - crossed more than 7 states and 250 counties and implemented a social responsibility program, in particular a citizenship program. This action was the result of studies of the communities located in the pipelines' direct influence area (AID, 438 yards to the right and left of the pipeline) and of the evidence that those locations were poor and that residents had no personal documents and no citizen's standing in society. This paper intends to share IENE's experience with its citizenship program, which worked along three main lines: community mobilization, citizenship qualification and a citizenship board. The last one makes it possible for people to obtain their personal documents and exercise full citizenship. (author)

  7. Sinopec Fuels Development of Pipelines in East China

    Institute of Scientific and Technical Information of China (English)

    Li Xuemei

    2004-01-01

    Crude pipeline completed in the Yangtze River Delta: The Ningbo-Shanghai-Nanjing crude pipeline was recently completed, following completion of the subsea segment in Hangzhou Bay, the last part of the pipeline. The 711 mm diameter crude pipeline, designed and constructed by Sinopec, adopts the advanced SCADA system for full automation control. The pipeline is designed to have an annual crude transmission capacity of 20 million tons.

  8. Developing pipeline risk methodology for environmental license permit; Metodologia para avaliacao do risco em dutos, no licenciamento ambiental

    Energy Technology Data Exchange (ETDEWEB)

    Arruda, Paulo; Naime, Andre [Instituto Brasileiro do Meio Ambiente e dos Recursos Naturais Renovaveis (IBAMA), Brasilia, DF (Brazil). Diretoria de Licenciamento e Qualidade Ambiental; Serpa, Ricardo [Companhia de Tecnologia de Saneamento Ambiental (CETESB), Sao Paulo, SP (Brazil). Setor de Analise de Riscos; Mendes, Renato F. [PETROBRAS Engenharia, RJ (Brazil); Ventura, Gilmar [TRANSPETRO - PETROBRAS Transportes, Rio de Janeiro, RJ (Brazil)

    2005-07-01

    Some new pipeline undertakings aim to establish connections between gas provinces in the Southeast and consumers in the Northeast of Brazil, in order to supply medium-sized consuming centers and regions with lower development potential. Consulting companies are carrying out Environmental Assessment studies, among them the Risk Analyses of these pipeline transmission systems, in order to obtain environmental permits from IBAMA, the Federal Brazilian Environmental Agency. In addition, existing interstate pipeline systems under IBAMA regulation will also require the same attention. For the purpose of defining a Pipeline Risk Analysis Protocol with a methodology, risk criteria and the minimum risk analysis content for a comprehensive process, a joint task force of IBAMA experts and PETROBRAS engineers was formed. The risk assessment protocol focuses on the risk to communities in the neighborhood of these pipelines and on the potential damage to the environment near and far from the right-of-way (ROW). The joint work resulted in two protocols, which define the minimum contents of risk analysis studies for the environmental licensing of oil pipelines and gas pipelines. Another aspect is the environmental risk, which has been addressed through the contingency plan approach, since there are no consolidated environmental risk criteria in common worldwide use. The environmental risk mapping (MARA) methodology will indicate the areas with the potential to be affected by leakages along a pipeline system. (author)

  9. Developing Pipeline Transportation in West China

    Institute of Scientific and Technical Information of China (English)

    Yang Chenghan; Wang Wei

    1997-01-01

    Since the late 1980s, the focus of exploration and development for oil and gas has been diverted to the west of China, resulting in the discovery and development of the Shaanbei gas field, followed by large-scale exploration and development of the Tarim, Turpan and Hami basins. Responding to this situation, the focus of pipeline construction has also been transferred to west China, where large-scale development of pipeline transportation, an opportunity as well as a challenge, is expected.

  10. Consensus between pipelines in structural brain networks.

    Directory of Open Access Journals (Sweden)

    Christopher S Parker

    Full Text Available Structural brain networks may be reconstructed from diffusion MRI tractography data and have great potential to further our understanding of the topological organisation of brain structure in health and disease. Network reconstruction is complex and involves a series of processing methods including anatomical parcellation, registration, fiber orientation estimation and whole-brain fiber tractography. Methodological choices at each stage can affect the anatomical accuracy and graph theoretical properties of the reconstructed networks, meaning applying different combinations in a network reconstruction pipeline may produce substantially different networks. Furthermore, the choice of which connections are considered important is unclear. In this study, we assessed the similarity between structural networks obtained using two independent state-of-the-art reconstruction pipelines. We aimed to quantify network similarity and identify the core connections emerging most robustly in both pipelines. Similarity of network connections was compared between pipelines employing different atlases by merging parcels to a common and equivalent node scale. We found a high agreement between the networks across a range of fiber density thresholds. In addition, we identified a robust core of highly connected regions coinciding with a peak in similarity across network density thresholds, and replicated these results with atlases at different node scales. The binary network properties of these core connections were similar between pipelines but showed some differences in atlases across node scales. This study demonstrates the utility of applying multiple structural network reconstruction pipelines to diffusion data in order to identify the most important connections for further study.
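
One way to quantify the kind of between-pipeline agreement described above is sketched below: two weighted connectomes are binarised at a range of density thresholds, compared with the Dice coefficient, and a "core" is taken as the edges surviving in both at a strict threshold. The matrices are random stand-ins and the thresholding rule is an assumption, not necessarily the authors' exact procedure.

```python
# Hedged sketch: Dice similarity of two thresholded connectomes and a shared
# "core" edge set. Random matrices stand in for tractography-derived networks.

import numpy as np

def sym(m):
    return (m + m.T) / 2.0

def binarise_at_density(w, density):
    """Keep the strongest `density` fraction of off-diagonal edges."""
    iu = np.triu_indices_from(w, k=1)
    cutoff = np.quantile(w[iu], 1.0 - density)
    return (w >= cutoff) & ~np.eye(len(w), dtype=bool)

def dice(a, b):
    return 2.0 * np.sum(a & b) / (np.sum(a) + np.sum(b))

rng = np.random.default_rng(0)
n = 90                                         # e.g. a 90-node cortical parcellation
shared = sym(rng.random((n, n)))               # connectivity both pipelines mostly agree on
pipe_a = shared + 0.1 * sym(rng.random((n, n)))
pipe_b = shared + 0.1 * sym(rng.random((n, n)))

for density in (0.05, 0.10, 0.20, 0.40):
    d = dice(binarise_at_density(pipe_a, density), binarise_at_density(pipe_b, density))
    print(f"density {density:.2f}: Dice similarity = {d:.3f}")

core = binarise_at_density(pipe_a, 0.05) & binarise_at_density(pipe_b, 0.05)
print("core edges present in both pipelines:", int(core.sum()) // 2)
```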

  11. Design check against the construction code (DNV 2012) of an offshore pipeline using numerical methods

    Science.gov (United States)

    Stan, L. C.; Călimănescu, I.; Velcea, D. D.

    2016-08-01

    The production of oil and gas from offshore fields is nowadays more and more important. As a result of the increasing demand for oil, and with shallow-water reserves no longer sufficient, the industry is pushed to develop and exploit more difficult fields in deeper waters. In this paper, the new design code DNV 2012 is applied to check an offshore pipeline for compliance with the requirements of this new construction code, using Bentley AutoPIPE V8i. The August 2012 revision of the DNV offshore standard, DNV-OS-F101, Submarine Pipeline Systems, is supported by AutoPIPE version 9.6. This paper provides a quick walk-through for entering input data, analyzing and generating code compliance reports for a model with the piping code selected as DNV Offshore 2012. As shown in the present paper, the simulations comprise a geometrically complex pipeline subjected to various and variable loading conditions. At the end of the design process, the engineer has to answer a simple question: is the pipeline safe or not? The pipeline set as an example has some sections that do not comply, in terms of size and strength, with the DNV 2012 offshore pipeline code. Obviously those sections have to be redesigned so as to meet those conditions.

  12. Effects of EPI distortion correction pipelines on the connectome in Parkinson's Disease

    Science.gov (United States)

    Galvis, Justin; Mezher, Adam F.; Ragothaman, Anjanibhargavi; Villalon-Reina, Julio E.; Fletcher, P. Thomas; Thompson, Paul M.; Prasad, Gautam

    2016-03-01

    Echo-planar imaging (EPI) is commonly used for diffusion-weighted imaging (DWI) but is susceptible to nonlinear geometric distortions arising from inhomogeneities in the static magnetic field. These inhomogeneities can be measured and corrected using a fieldmap image acquired during the scanning process. In studies where the fieldmap image is not collected, these distortions can be corrected, to some extent, by nonlinearly registering the diffusion image to a corresponding anatomical image, either a T1- or T2-weighted image. Here we compared two EPI distortion correction pipelines, both based on nonlinear registration, which were optimized for the particular weighting of the structural image registration target. The first pipeline used a 3D nonlinear registration to a T1-weighted target, while the second pipeline used a 1D nonlinear registration to a T2-weighted target. We assessed each pipeline in its ability to characterize high-level measures of brain connectivity in Parkinson's disease (PD) in 189 individuals (58 healthy controls, 131 people with PD) from the Parkinson's Progression Markers Initiative (PPMI) dataset. We computed a structural connectome (connectivity map) for each participant using regions of interest from a cortical parcellation combined with DWI-based whole-brain tractography. We evaluated test-retest reliability of the connectome for each EPI distortion correction pipeline using a second diffusion scan acquired directly after the participants' first. Finally, we used support vector machine (SVM) classification to assess how accurately each pipeline classified PD versus healthy controls using each participants' structural connectome.
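
For the classification step, a minimal sketch (with synthetic data standing in for the PPMI connectomes) would vectorise the upper triangle of each connectivity matrix and run a cross-validated linear SVM, roughly as follows; the feature construction, group sizes used here and the hyperparameters are assumptions, not the authors' exact setup.

```python
# Sketch of the final analysis step: connectome features -> cross-validated SVM.
# Synthetic placeholder data, not the PPMI cohort.

import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_nodes, n_pd, n_hc = 68, 131, 58            # cortical ROIs, PD and control counts
iu = np.triu_indices(n_nodes, k=1)

def fake_connectome(group_shift):
    w = rng.random((n_nodes, n_nodes))
    w = (w + w.T) / 2
    w[:5, :5] += group_shift                  # a weak group effect in a few ROIs
    return w[iu]                              # vectorised upper triangle

X = np.array([fake_connectome(0.15) for _ in range(n_pd)]
             + [fake_connectome(0.0) for _ in range(n_hc)])
y = np.array([1] * n_pd + [0] * n_hc)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=StratifiedKFold(5, shuffle=True, random_state=0))
print(f"5-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```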

  13. Black powder in gas pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Sherik, Abdelmounam [Saudi Aramco, Dhahran (Saudi Arabia)

    2009-07-01

    Despite its common occurrence in the gas industry, black powder is a problem that is not well understood across the industry, in terms of its chemical and physical properties, source, formation, prevention or management of its impacts. In order to prevent or effectively manage the impacts of black powder, it is essential to have knowledge of its chemical and physical properties, formation mechanisms and sources. The present paper is divided into three parts. The first part of this paper is a synopsis of published literature. The second part reviews the recent laboratory and field work conducted at Saudi Aramco Research and Development Center to determine the compositions, properties, sources and formation mechanisms of black powder in gas transmission systems. Microhardness, nano-indentation, X-ray Diffraction (XRD), X-ray Fluorescence (XRF) and Scanning Electron Microscopy (SEM) techniques were used to analyze a large number of black powder samples collected from the field. Our findings showed that black powder is generated inside pipelines due to internal corrosion and that the composition of black powder is dependent on the composition of transported gas. The final part presents a summary and brief discussion of various black powder management methods. (author)

  14. Analysis of pipeline transportation systems for carbon dioxide sequestration

    Directory of Open Access Journals (Sweden)

    Witkowski Andrzej

    2014-03-01

    Full Text Available A commercially available ASPEN PLUS simulation using a pipe model was employed to determine the maximum safe pipeline distances to subsequent booster stations as a function of carbon dioxide (CO2) inlet pressure, ambient temperature and ground level heat flux parameters under three conditions: isothermal, adiabatic and with account of heat transfer. In the paper, the CO2 working area was assumed to be either in the liquid or in the supercritical state and results for these two states were compared. The following power station data were used: a 900 MW pulverized coal-fired power plant with 90% of CO2 recovered (156.43 kg/s) and the monoethanolamine absorption method for separating CO2 from flue gases. The results show that subcooled liquid transport maximizes energy efficiency and minimizes the cost of CO2 transport over long distances under isothermal, adiabatic and heat transfer conditions. After CO2 is compressed and boosted to above 9 MPa, its temperature is usually higher than ambient temperature. The thermal insulation layer slows down the CO2 temperature decrease process, increasing the pressure drop in the pipeline. Therefore in Poland, considering the atmospheric conditions, the thermal insulation layer should not be laid on the external surface of the pipeline.
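
As a back-of-envelope companion to the ASPEN PLUS analysis, the sketch below estimates the distance to the next booster station from an isothermal Darcy-Weisbach pressure gradient for dense-phase CO2; the density, viscosity, pipe diameter and roughness are assumed round numbers, not the paper's inputs.

```python
# Rough booster-spacing estimate from an isothermal Darcy-Weisbach gradient.
# Fluid and pipe properties are assumed example values.

import math

def darcy_friction_factor(reynolds, rel_roughness):
    """Swamee-Jain explicit approximation to the Colebrook equation."""
    return 0.25 / math.log10(rel_roughness / 3.7 + 5.74 / reynolds**0.9) ** 2

def max_distance_km(p_in_MPa, p_min_MPa, mass_flow=156.43, diameter=0.5,
                    density=800.0, viscosity=7e-5, roughness=4.5e-5):
    area = math.pi * diameter**2 / 4
    velocity = mass_flow / (density * area)
    reynolds = density * velocity * diameter / viscosity
    f = darcy_friction_factor(reynolds, roughness / diameter)
    dp_per_m = f / diameter * density * velocity**2 / 2        # Pa/m
    return (p_in_MPa - p_min_MPa) * 1e6 / dp_per_m / 1e3

if __name__ == "__main__":
    for p_in in (9.0, 12.0, 15.0):
        print(f"inlet {p_in:4.1f} MPa -> ~{max_distance_km(p_in, 8.0):.0f} km to next booster")
```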

  15. Architecting the Finite Element Method Pipeline for the GPU.

    Science.gov (United States)

    Fu, Zhisong; Lewis, T James; Kirby, Robert M; Whitaker, Ross T

    2014-02-01

    The finite element method (FEM) is a widely employed numerical technique for approximating the solution of partial differential equations (PDEs) in various science and engineering applications. Many of these applications benefit from fast execution of the FEM pipeline. One way to accelerate the FEM pipeline is by exploiting advances in modern computational hardware, such as the many-core streaming processors like the graphical processing unit (GPU). In this paper, we present the algorithms and data-structures necessary to move the entire FEM pipeline to the GPU. First we propose an efficient GPU-based algorithm to generate local element information and to assemble the global linear system associated with the FEM discretization of an elliptic PDE. To solve the corresponding linear system efficiently on the GPU, we implement a conjugate gradient method preconditioned with a geometry-informed algebraic multi-grid (AMG) method preconditioner. We propose a new fine-grained parallelism strategy, a corresponding multigrid cycling stage and efficient data mapping to the many-core architecture of GPU. Comparison of our on-GPU assembly versus a traditional serial implementation on the CPU achieves up to an 87 × speedup. Focusing on the linear system solver alone, we achieve a speedup of up to 51 × versus use of a comparable state-of-the-art serial CPU linear system solver. Furthermore, the method compares favorably with other GPU-based, sparse, linear solvers.
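
The solver stage alone can be illustrated with a short CPU sketch of a Jacobi-preconditioned conjugate-gradient loop; the paper's contribution is an AMG-preconditioned CG running on the GPU, so this shows only the structure that would be ported, with a random SPD matrix standing in for an assembled FEM system.

```python
# Jacobi-preconditioned conjugate gradients in NumPy (CPU sketch only).

import numpy as np

def pcg(A, b, tol=1e-8, max_iter=1000):
    """Solve A x = b for SPD A with Jacobi-preconditioned conjugate gradients."""
    m_inv = 1.0 / np.diag(A)                 # Jacobi preconditioner M^-1
    x = np.zeros_like(b)
    r = b - A @ x
    z = m_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = m_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

if __name__ == "__main__":
    # Small SPD test matrix standing in for an assembled FEM stiffness matrix.
    n = 200
    rng = np.random.default_rng(0)
    B = rng.random((n, n))
    A = B @ B.T + n * np.eye(n)
    b = rng.random(n)
    x = pcg(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))
```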

  16. Simulation of Wave-Plus-Current Scour beneath Submarine Pipelines

    DEFF Research Database (Denmark)

    Eltard-Larsen, Bjarke; Fuhrman, David R.; Sumer, B. Mutlu

    2016-01-01

    A fully coupled hydrodynamic and morphologic numerical model was utilized for the simulation of wave-plus-current scour beneath submarine pipelines. The model was based on incompressible Reynolds-averaged Navier–Stokes equations, coupled with k-ω turbulence closure, with additional bed and suspended load descriptions forming the basis for seabed morphology. The model was successfully validated against experimental measurements involving scour development and eventual equilibrium in pure-current flows over a range of Shields parameters characteristic of both clear-water and live-bed regimes. This validation complements previously demonstrated accuracy for the same model in simulating pipeline scour processes in pure-wave environments. The model was subsequently utilized to simulate combined wave-plus-current scour over a wide range of combined Keulegan–Carpenter numbers and relative current strengths...

  17. Pipeline cost reduction through effective project management and applied technology

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, A. [TransCanada Pipeline Ltd., Alberta (Canada); Babuk, T. [Empress International Inc., Westwood, NJ (United States); Mohitpour, M. [Tempsys Pipeline Solutions Inc., Vancouver, BC (Canada); Murray, M.A. [National Energy Board of Canada (Canada)

    2005-07-01

    Pipelines are regarded by many as passive structures with the technology involved in their construction and operation being viewed as relatively simple and stable. If such is the case, how can there be much room for cost improvement? In reality, there have been many technological and regulatory innovations required within the pipeline industry to meet the challenges posed by ever increasing consumer demand for hydrocarbons, the effects of aging infrastructure and a need to control operating and maintenance expenditures. The importance of technology management, as a subset of overall project management, is a key element of life cycle cost control. Assurance of public safety and the integrity of the system are other key elements in ensuring a successful pipeline project. The essentials of best practice project management from an owner/operator's perspective are set out in the paper. Particular attention is paid to the appropriate introduction of new technology, strategic procurement practice and material selection, indicating that capital cost savings of up to 15% are achievable without harming life cycle cost. The value of partnering leading to technical innovation, cost savings and improved profitability for all the participants is described. Partnering also helps avoid duplicated effort through the use of common tools for design, planning, schedule tracking and reporting. Investing in appropriate technology development has been a major source of cost reduction in recent years, and the impact of a number of these recently introduced technologies in the areas of materials, construction processes and operation and maintenance is discussed in the paper. (author)

  18. A system and approach for total pipeline integrity management

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Todd; Neidhardt, Dietmar [Tuboscope Pipeline Services, Houston, TX (United States); Gonzalez, Oscar [Tuboscope Mexico S.A. de C.V., Mexico, D.F. (Mexico)

    2005-07-01

    Pipeline rehabilitation and maintenance decisions are made using a wide variety of data, criteria, and expertise. The goal is to arrive at an optimal plan that considers risk and best return on Repair and Maintenance (R and M) expenditures for aging pipeline systems in both regulated and non-regulated environments. In order to achieve these goals, historical, operations, and assessment data is gathered, aligned and integrated as part of a baseline assessment. Integrity threats are identified based on operations and industry experience, and combined with potential consequences to public safety, the environment, and business to clearly delineate high risk exposure segments in the system. Integrity assessments are conducted in a prioritized manner, using the most appropriate technology and methods to address the threats. These include In Line Inspection technologies: MFL, Deformation, UT, INS (combinations thereof), Direct Assessment for EC and SCC threats, Hydro testing, and other indirect methods. From these results, decisions are made and R and M plans developed. To arrive at an optimal R and M plan, proper use of existing data, new integrity assessment data, and decision risk models is required. This paper presentation will detail the tactical aspect of an effective integrity management platform. Experience in decision support, operations priorities and execution of a rehabilitation plan using LinaView Pro{sup TM} integrity management system with risk-based integrity tools and maintenance planning will be presented. A process overview, results, and benefits will be given using these examples from operating oil and gas transmission pipelines. (author)

  19. Analysis of pipeline transportation systems for carbon dioxide sequestration

    Science.gov (United States)

    Witkowski, Andrzej; Majkut, Mirosław; Rulik, Sebastian

    2014-03-01

    A commercially available ASPEN PLUS simulation using a pipe model was employed to determine the maximum safe pipeline distances to subsequent booster stations as a function of carbon dioxide (CO2) inlet pressure, ambient temperature and ground level heat flux parameters under three conditions: isothermal, adiabatic and with account of heat transfer. In the paper, the CO2 working area was assumed to be either in the liquid or in the supercritical state and results for these two states were compared. The following power station data were used: a 900 MW pulverized coal-fired power plant with 90% of CO2 recovered (156.43 kg/s) and the monoethanolamine absorption method for separating CO2 from flue gases. The results show that subcooled liquid transport maximizes energy efficiency and minimizes the cost of CO2 transport over long distances under isothermal, adiabatic and heat transfer conditions. After CO2 is compressed and boosted to above 9 MPa, its temperature is usually higher than ambient temperature. The thermal insulation layer slows down the CO2 temperature decrease process, increasing the pressure drop in the pipeline. Therefore in Poland, considering the atmospheric conditions, the thermal insulation layer should not be laid on the external surface of the pipeline.

  20. The Ruptured Pipeline: Analysis of the Mining Engineering Faculty Pipeline

    Science.gov (United States)

    Poulton, M.

    2011-12-01

    The booming commodities markets of the past seven years have created an enormous demand for economic geologists, mining engineers, and extractive metallurgists. The mining sector has largely been recession-proof due to demand drivers coming from developing rather than developed nations. The strong demand for new hires as well as mid-career hires has exposed the weakness of the U.S. university supply pipeline for these career fields. A survey of mining and metallurgical engineering faculty and graduate students was conducted in 2010 at the request of the Society for Mining, Metallurgy, and Exploration. The goals of the surveys were to determine the demographics of the U.S. faculty in mining and metallurgical engineering, the expected faculty turnover by 2010 and the potential supply of graduate students as the future professorate. All Mining Engineering and Metallurgical Engineering degrees in the U.S. are accredited by the Accreditation Board for Engineering and Technology (ABET) and the specific courses required are set by the sponsoring professional society, Society for Mining, Metallurgy, and Exploration. There are 13 universities in the U.S. that offer a degree in Mining Engineering accredited as Mining Engineering and 1 university that grants a Mining Engineering degree accredited under general engineering program requirements. Faculty numbers are approximately 87 tenure track positions with a total undergraduate enrollment of slightly over 1,000 in the 2008-2009 academic year. There are approximately 262 graduate students in mining engineering in the U.S. including 87 Ph.D. students. Mining Engineering department heads have identified 14 positions open in 2010 and 18 positions expected to be open in the next 5 years and an additional 21 positions open by 2020. The current survey predicts a 56% turnover in mining faculty ranks over the next 10 years but a retirement of 100% of senior faculty over 10 years. 63% of graduate students say they are interested in

  1. 49 CFR 195.303 - Risk-based alternative to pressure testing older hazardous liquid and carbon dioxide pipelines.

    Science.gov (United States)

    2010-10-01

    ... mechanical properties, including fracture toughness; the manufacturing process and controls related to seam... Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT... lapwelded pipe is deemed susceptible to longitudinal seam failures unless an engineering analysis shows...

  2. The Reasons of Steam Pipeline Elbow Rupture

    Directory of Open Access Journals (Sweden)

    Mesjasz A.

    2016-09-01

    Full Text Available The paper analyses the reasons for the rupture of a steam pipeline elbow made of 13CrMo4-5 (15HM) steel, a grade used in the power industry. Based on the mechanical properties at ambient temperature (Rm, Rp0.2 and elongation A5) and at elevated temperature (Rp0.2t), it was found that the elbow material sampled from the ruptured area has Rp0.2 and Rp0.2t around 2% lower than the requirement for 13CrMo4-5 steel in its base state. The damage appeared as a result of a complex stress state that substantially exceeded the admissible stresses, which was the consequence of a considerable level of structural degradation. Microstructure examinations on a HITACHI S4200 microscope revealed considerable development of creep processes and an advanced stage of microstructure degradation, namely substantial decomposition of bainite and numerous precipitates of varied size, in most cases forming chains of micro-cracks. On the transverse micro-sections, creep voids were observed, forming in some places clusters of porosity and micro-pores.

  3. Flow Rate Capacity Reduction Due to Temporal and Dynamic Processes in Large Pipelines. Study with Field Measurements; Efectos dinamicos y temporales en la reduccion de la capacidad de conduccion en grandes acueductos. Estudio con medidas en prototipo

    Energy Technology Data Exchange (ETDEWEB)

    Carmona Paredes, Rafael; Ortiz Nunez, Luis Alfonso; Sanchez Huerta, Alejandro [Universidad Nacional Autonoma de Mexico (Mexico)

    2002-06-01

    More than 15 years of operation have shown that some pressurized water transport pipelines lose flow capacity faster than expected from the normal increase in roughness. As explained by the tubular pinch effect, the radial migration of suspended particles in a flow can produce a high concentration close to the pipe wall. The non-uniform particle concentration leads to higher velocities at the center of the tube, equivalent to a reduced hydraulic section that increases the head losses. A model is proposed that explains the field measurements at the Chapala-Guadalajara Aqueduct and suggests that hydraulic engineers should be more cautious when using traditional head loss formulas to analyze water transport pipelines. [Translated from the Spanish abstract] The loss of flow capacity is a serious problem in the operation of large aqueducts. More than 15 years of studies and direct inspection of the interior of the pipes of several drinking water supply systems have shown the development of layers of fine material strongly adhered to the pipe wall. In some cases, the variation of the head loss could not be explained with traditional models of internal roughness growth, so that, in order to adequately explain the field measurements made in the Chapala-Guadalajara aqueduct, it has been necessary to incorporate dynamic and temporal phenomena simultaneously. Based on the tubular pinch effect described by other authors, together with direct observations inside the pipes, this work proposes, as a possible cause of the reduction in flow capacity of the Chapala-Guadalajara aqueduct, an apparent change in the effective flow section caused by the radial migration of the particles suspended in the water towards the pipe wall. The proposed model reproduces the field measurements with differences of less than 10% and invites reflection on the conventional practices for analyzing water transport pipelines.

  4. 77 FR 6857 - Pipeline Safety: Notice of Public Meetings on Improving Pipeline Leak Detection System...

    Science.gov (United States)

    2012-02-09

    ... and Research 5:45 p.m. Wrap-Up/Next Steps 6 p.m. Adjournment Preliminary Agenda for the Public Meeting... Pipelines 12:30 p.m. Lunch 2 p.m. Panel 3: Valve Capabilities, Limitations and Research 4 p.m. Wrap-Up/Next... Administrator for Pipeline Safety. BILLING CODE 4910-60-P...

  5. The Dangers of Pipeline Thinking: How the School-to-Prison Pipeline Metaphor Squeezes out Complexity

    Science.gov (United States)

    McGrew, Ken

    2016-01-01

    In this essay Ken McGrew critically examines the "school-to-prison pipeline" metaphor and associated literature. The origins and influence of the metaphor are compared with the origins and influence of the competing "prison industrial complex" concept. Specific weaknesses in the "pipeline literature" are examined.…

  6. 78 FR 41991 - Pipeline Safety: Potential for Damage to Pipeline Facilities Caused by Flooding

    Science.gov (United States)

    2013-07-12

    ... gallons of crude oil into the Yellowstone River. The rupture was caused by debris washing downstream in... structure and by impact and/or waterborne forces. Washouts and erosion may result in loss of support for both buried and exposed pipelines. The flow of water against an exposed pipeline may also result...

  7. Marine Environmental Protection and Transboundary Pipeline Projects: A Case Study of the Nord Stream Pipeline

    NARCIS (Netherlands)

    Lott, Alexander

    2011-01-01

    The Nord Stream transboundary submarine pipeline, significant for its impact on the EU energy policy, has been a heavily debated issue in the Baltic Sea region during the past decade. This is partly due to the concerns over the effects that the pipeline might have on the Baltic Sea as a particular

  8. 76 FR 29333 - Pipeline Safety: Meetings of the Technical Pipeline Safety Standards Committee and the Technical...

    Science.gov (United States)

    2011-05-20

    ... staff to assist in the creation of a pipeline safety report to the nation. The subcommittee is made up...) and the creation of a subcommittee to assist PHMSA in the preparation of a pipeline safety report to the nation. PHMSA will host a series of meetings with a newly formed subcommittee to review and...

  9. 77 FR 61825 - Pipeline Safety: Notice of Public Meeting on Pipeline Data

    Science.gov (United States)

    2012-10-11

    ... (NAPSR) are sponsoring this public meeting to discuss how pipeline data is currently used by stakeholders... public meeting on pipeline data will be held on Monday, October 29, 2012, from 1 p.m. to 5:30 p.m. and... meeting are to: 1. Determine how stakeholders, including PHMSA, industry, and the public use the data....

  10. Pipeline design software and the simulation of liquid propane/butane-light oils pipeline operations

    Energy Technology Data Exchange (ETDEWEB)

    Peters, J. [Monenco AGRA Inc., Calgary, Alberta (Canada)

    1996-12-31

    A comprehensive and integrated suite of computer software routines has been developed to simulate the flow of liquids in pipelines. The fluid properties module accommodates Newtonian and non-Newtonian liquids or mixtures including corrections for changes in properties with temperature and pressure. The hydraulic model calculates pressure drop in single or looped pipelines based on the diameter, route (length) and profile data provided. For multi-product pipelines the hydraulics module estimates energy loss for any sequence of batches given the size and fluid properties of each batch, and the velocity in the pipeline. When the characteristics of existing or proposed pipeline pumps are included, location and size of pumps can be optimized. The effect of heat loss on pressure drop is predicted by invoking the module which calculates the fluid temperature profile based on operating conditions, fluid properties, pipe and insulation conductivity and soil heat transfer data. Modules created to simulate heater or cooler operations can be incorporated to compensate for changes in temperature. Input data and calculated results can be presented in a format customized by the user. The simulation software has been successfully applied to multi-product, fuel oil, and non-Newtonian emulsion pipelines. The simulation and operation of a refinery products pipeline for the transportation of propane, butane, gasoline, jet and diesel batches will be discussed. The impact of high vapor pressure batches (i.e., propane and butane) on the operation of the pipeline and on the upstream and downstream facilities will be examined in detail.
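
The temperature-profile module described above can be illustrated with the classical exponential (Shukhov-type) relation for a buried line; the fluid properties, overall heat-transfer coefficient and pipe size in the sketch are assumed example values, not those used in the software discussed in the paper.

```python
# Shukhov-type temperature decay toward ground temperature along a buried line:
# T(x) = T_soil + (T_in - T_soil) * exp(-U*pi*D*x / (m_dot*cp)). Example values only.

import math

def temperature_profile(x_km, t_in=60.0, t_soil=5.0, u=2.0, diameter=0.3,
                        mass_flow=50.0, cp=2100.0):
    """Fluid temperature (deg C) at x_km kilometres downstream of the inlet."""
    x = x_km * 1e3
    decay = u * math.pi * diameter * x / (mass_flow * cp)
    return t_soil + (t_in - t_soil) * math.exp(-decay)

if __name__ == "__main__":
    for km in (0, 10, 25, 50, 100):
        print(f"{km:3d} km: {temperature_profile(km):5.1f} deg C")
```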

  11. 77 FR 45417 - Pipeline Safety: Inspection and Protection of Pipeline Facilities After Railway Accidents

    Science.gov (United States)

    2012-07-31

    ... Pipeline Facilities After Railway Accidents AGENCY: Pipeline and Hazardous Materials Safety Administration... either during a railroad accident or other event occurring in the right-of-way. Further, the advisory... to identify and notify underground utilities that an incident has occurred in the vicinity of their...

  12. Reliability and risk analysis and evaluation of a port oil pipeline transportation system in variable operation conditions

    Energy Technology Data Exchange (ETDEWEB)

    Soszynska, Joanna [Gdynia Maritime University, Gdynia (Poland)

    2009-07-01

    In the paper the semi-Markov model is applied to describe the port oil pipeline transportation system operation processes and its selected parameters are determined. Multi-state systems are considered and their reliability and risk are found. Next, the joint model of the systems' operation process and the systems' multi-state reliability is applied to the reliability and risk evaluation of the port oil pipeline transportation system. (author)
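
A rough illustration of a semi-Markov operation process (the kind of model named above) is sketched below: a small embedded Markov chain with non-exponential sojourn times is simulated to estimate the long-run fraction of time spent in each operating state. The states, transition matrix and sojourn distributions are invented example numbers, not the port terminal's data.

```python
# Simulated semi-Markov operation process: embedded transition matrix plus
# state-specific sojourn-time distributions; all numbers are invented.

import numpy as np

states = ["loading", "transfer", "standby"]
P = np.array([[0.0, 0.7, 0.3],     # embedded Markov chain transition probabilities
              [0.5, 0.0, 0.5],
              [0.6, 0.4, 0.0]])
sojourn = [lambda rng: rng.gamma(2.0, 1.5),    # hours spent in "loading"
           lambda rng: rng.uniform(4.0, 8.0),  # hours spent in "transfer"
           lambda rng: rng.exponential(3.0)]   # hours spent in "standby"

rng = np.random.default_rng(0)
time_in_state = np.zeros(3)
s = 0
for _ in range(200_000):
    time_in_state[s] += sojourn[s](rng)
    s = rng.choice(3, p=P[s])

for name, frac in zip(states, time_in_state / time_in_state.sum()):
    print(f"{name:9s}: {frac:.3f} of operating time")
```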

  13. Reversal Estimation Model of Mechanical Properties for X60 Pipeline Steel Plate during U-forming, O-forming and (Mechanical) Expanding Processes

    Institute of Scientific and Technical Information of China (English)

    郭宝峰; 赵石岩; 王林锋; 金淼

    2011-01-01

    Taking X60 pipeline steel as an example, the bending deformation at important characteristic positions of pipes of different specifications during the UOE process was analyzed using FEM, in order to obtain a reverse calculation method that determines the mechanical properties of the steel plate from those of the pipe, addressing the mechanical property changes of longitudinally submerged arc welded (LSAW) large diameter line pipe during UOE forming and subsequent specimen flattening. Bending specimens with the same equivalent strain were made through four-point bending. The change in mechanical properties of X60 pipeline steel plate after bending and flattening was investigated according to the API standard flattening test methods for mechanical properties of pipe body specimens, and a reverse estimation model of the yield strength of X60 pipeline steel plate in the equivalent strain range of 0.0104-0.0586 was presented.

  14. A streamlined approach for pipeline integrity management

    Energy Technology Data Exchange (ETDEWEB)

    Porter, T.R. [Tuboscope Pipeline Services, Houston, TX (United States); Marr, J.E. [Marr and Associates, A Tuboscope Company, Calgary, AB (Canada)

    2004-07-01

    While regulations call for safe and reliable operation of pipelines, business calls for economic return and reduced liability. This paper presented a system that provides rapid, comprehensive and economic improvements for pipeline integrity decision support. The first phase of a pipeline integrity management plan (IMP) involves the identification of integrity threats to the pipeline. This may involve 22 root causes as defined by the Pipeline Research Council International (PRCI), grouped into 9 categories of related failure types, further grouped into time related defect types. Time dependent defects include external corrosion, internal corrosion and stress corrosion cracking. Stable defects include manufacturing related or welding defects, while time independent defects include mechanical damage, incorrect operations and outside forces. In designing an IMP, high consequence areas (HCAs) must be defined along with the integrity threats that could affect the pipeline. A baseline risk assessment is then performed using data from the integrity threat models to identify risk areas, individual lines, pipe segments or joints. Integrity management decisions are made based on the outcome of initial assessments, resulting in integrity assessment tools such as in line inspection (ILI) technologies, direct assessment (DA), and hydrostatic testing. Pipeline engineers benefit from having ILI, DA and other data integrated and interacting with geographic information system (GIS) data. This paper presented the LinaView PRO{sup TM} IMP tool developed by Tuboscope that enhances dig smart excavation decision making; remediation and mitigation planning; responding to one-call emergency response; implementation of government regulations; HCA identification; integration of a wide variety of data; comprehensive dynamic segmentation; and data validation in support of risk assessment. The objective of an IMP is the safe and reliable delivery of oil and gas products to markets.

  15. Study and Application of Internal Coating Technique to Drag Reduction of the Trunk Pipeline for the West-East Gas Pipeline

    Institute of Scientific and Technical Information of China (English)

    Hu Shixin; Qu Shenyang; Lin Zhu

    2004-01-01

    A coating applied to the internal wall of a pipeline reduces friction drag, increases throughput and gas transmission efficiency, reduces the frequency of pigging and the number of intermediate compressor stations, and decreases the power consumption of the compressors. Drag reduction is an advanced technique with an outstanding economic benefit. The study and application of the internal coating technique for drag reduction on the 4000 km trunk line of the West-East Gas Transmission Pipeline (WEGTP) project are described, covering the drag reduction principle, the coating process and the laboratory study of this domestically developed technique with independent intellectual property rights.

  16. The main causes of in situ internal pipeline painting failures; Fatores que podem implicar em falhas prematuras de pintura interna in situ de dutos

    Energy Technology Data Exchange (ETDEWEB)

    Quintela, Joaquim P.; Vieira, Magda M.; Vieira, Gerson V. [PETROBRAS, Rio de Janeiro, RJ (Brazil). Centro de Pesquisas; Fragata, Fernando de L.; Amorim, Cristina da C. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil)

    2005-07-01

    Coating technology has been used to increase the useful life of pipelines, to guarantee the quality of the transported product, to increase operational reliability, and to reduce maintenance costs, personal and property risks and environmental damage. In parallel, owing to the natural ageing of pipelines and to operational problems, more advanced technologies, such as the in situ internal coating process, have become an important method of pipeline rehabilitation. The aim of this work is to study the main factors that may influence the performance of an internal coating project and lead to premature coating failures in pipelines used for gas, oil and derivatives transport. (author)

  17. Effect of the welding process on the microstructure and microhardness of API 5L X80 steel welded joint used for oil transportation pipeline; Efeito do processo de soldagem sobre a microestrutura e a microdureza de juntas soldadas de aco API 5L X80 usado em tubulacoes para transporte de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Alves, R.T.P.; Albuquerque, S.F. [Universidade Federal de Campina Grande (UFCG), PB (Brazil); Maciel, T.M.; Almeida, D.M.; Santos, M.A.

    2008-07-01

    This study had as its objective the evaluation of the microstructure and microhardness of API 5L X80 steel welded joints, used in pipelines to transport oil and gas, produced by the Shielded Metal Arc Welding process with pre-heating temperatures of 200 deg C and 400 deg C and the AWS E8010G electrode as filler metal. In addition to the microhardness of the welded joint, the percentages of micro-constituents and of columnar and regenerated grains in the weld metal, as well as the average grain size and the extent of the heat affected zone, were evaluated. The percentage of acicular ferrite in the weld metal ranged from 13% to 33%, which yielded microhardness values from 114 HV to 309 HV. (author)

  18. The PREP pipeline: standardized preprocessing for large-scale EEG analysis.

    Science.gov (United States)

    Bigdely-Shamlo, Nima; Mullen, Tim; Kothe, Christian; Su, Kyung-Min; Robbins, Kay A

    2015-01-01

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference, and we examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode.
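
    The interaction between noisy channels and the reference can be illustrated with a small sketch. The following Python fragment is only a schematic illustration of the robust-referencing idea (PREP itself is a MATLAB library, and its actual criteria for detecting bad channels are more elaborate): channels with extreme robust z-scored deviations are excluded from the reference estimate, and the procedure is iterated until the set of flagged channels stabilizes.

    import numpy as np

    def robust_average_reference(data, z_thresh=5.0, max_iter=4):
        """data: (n_channels, n_samples) EEG array. Returns (referenced, bad_channels)."""
        bad = set()
        for _ in range(max_iter):
            good = [c for c in range(data.shape[0]) if c not in bad]
            reference = np.median(data[good], axis=0)        # robust reference estimate
            deviation = np.std(data - reference, axis=1)     # per-channel deviation
            med = np.median(deviation)
            mad = np.median(np.abs(deviation - med))
            z = 0.6745 * (deviation - med) / (mad + 1e-12)   # robust z-score
            new_bad = set(np.where(z > z_thresh)[0])
            if new_bad == bad:
                break
            bad = new_bad
        return data - reference, sorted(bad)

    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((32, 1000))
    eeg[7] += 50 * rng.standard_normal(1000)                 # one very noisy channel
    referenced, bad_channels = robust_average_reference(eeg)
    print("channels flagged as noisy:", bad_channels)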

  19. Analysis and prevention of welding defects of the semi-automatic welding process in submarine pipeline laying projects

    Institute of Scientific and Technical Information of China (English)

    孙志广

    2012-01-01

    Based on the application and practice of the semi-automatic welding process (STT root pass, with FCAW-S semi-automatic self-shielded flux-cored wire for the filling and cap passes) in submarine pipeline laying projects in recent years, this paper introduces the application characteristics of this combined welding process. From a random check of the inspection results of 3000 welds on submarine pipelines laid in recent years, the representative weld defects were identified, including lack of fusion, lack of penetration, porosity, slag inclusions and so on. Drawing on actual offshore construction conditions, the author analyzes the harm and causes of these representative welding defects and puts forward appropriate preventive measures for installation.

  20. Bauxite slurry pipeline: start up operation

    Energy Technology Data Exchange (ETDEWEB)

    Othon, Otilio; Babosa, Eder; Edvan, Francisco; Brittes, Geraldo; Melo, Gerson; Janir, Joao; Favacho, Orlando; Leao, Marcos; Farias, Obadias [Vale, Rio de Janeiro, RJ (Brazil); Goncalves, Nilton [Anglo Ferrous Brazil S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The Miltonia mine is located in Paragominas-PA, in the north of Brazil. The bauxite slurry pipeline starts at the Miltonia mine and finishes at the draining installation of the Alunorte refinery at the port of Barcarena-PA, located approximately 244 km from the mine. The pipeline runs through seven municipalities and passes below the stream beds of four great rivers. The system was designed as an underground 24-inch OD steel pipe to carry 9.9 million dry metric tonnes per annum (dMTA) of 50.5% solid concentration bauxite slurry, using only one pumping station. The system is composed of four storage tanks and six piston diaphragm pumps, supplying a flow of 1680 m3/h. There is a cathodic protection system along the pipeline extension to prevent external corrosion and five pressure monitoring stations to control hydraulic conditions; there is also a fiber optic cable interconnection between the pump station and the terminal station. Pipeline Systems Incorporated (PSI) was the designer and followed the commissioning program of the start-up operations. This paper describes the beginning of the pipeline operations, technical aspects of the project, the operational experience acquired in these two years, the problems faced and also the future planning. (author)
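
    The quoted throughput and flow figures can be cross-checked with a back-of-the-envelope calculation. In the Python sketch below, the bauxite solids density (about 2.45 t/m3) and round-the-clock utilization are assumptions introduced only for this illustration; they are not stated in the paper.

    dry_tonnes_per_year = 9.9e6          # dMTA quoted in the paper
    solids_fraction     = 0.505          # mass fraction of solids in the slurry
    rho_solids, rho_water = 2.45, 1.00   # t/m3 (solids density assumed)

    slurry_tonnes = dry_tonnes_per_year / solids_fraction
    rho_slurry = 1.0 / (solids_fraction / rho_solids + (1 - solids_fraction) / rho_water)
    volume_m3_per_year = slurry_tonnes / rho_slurry
    flow_m3_per_hour = volume_m3_per_year / 8760.0

    print(f"slurry density ~{rho_slurry:.2f} t/m3")
    print(f"required flow  ~{flow_m3_per_hour:.0f} m3/h (design figure: 1680 m3/h)")

    The result, roughly 1,570 m3/h of slurry, sits a little below the 1,680 m3/h design flow, which is consistent with the pumping system carrying the nominal annual tonnage with some operating margin.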

  1. Fiber optic accelerometer for pipeline surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Valente, Luiz C.G.; Cabral, Bruno S. [LUPATECH Monitoring Systems, Caxias do Sul, RS (Brazil); Braga, Arthur M.B. [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), Rio de Janeiro, RJ (Brazil). Dept. de Engenharia Mecanica

    2009-07-01

    The use of accelerometers for monitoring vibration due to impacts and other sources associated with pipeline operation is not new, but conventional electric accelerometers present practical problems when deployed in the field. In this paper we evaluate the use of both commercially available and prototype optical fiber accelerometers for this application. They all share the possibility of operating at long distances from the reading unit. All tests were performed at CTDUT facilities on free pipes as well as on a 14-inch OD, 100-meter-long pipeline loop. Using controlled impacts, several aspects of the application were analyzed, such as different ways of fixing the accelerometers to the pipeline wall, the influence of barriers between impact and sensor, and signal propagation through buried sections of pipeline. Results of measurements performed during the operation of the loop are also presented. They include passing PIGs, pumping water out of the system, and working on the tubes to open the loop. Results indicate that the accelerometers can be placed hundreds of meters from the source of vibration, and that the difference in time and frequency behavior of signals measured by sensors placed at different locations along the pipeline may be used to locate and identify that source. (author)
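
    As a hypothetical illustration of how the arrival-time difference between two sensors could be used to locate an impact between them, consider the Python sketch below; the sensor spacing, time difference and guided-wave speed are assumed example values, not measurements from the paper.

    def locate_impact(sensor_spacing_m, dt_s, wave_speed_m_s):
        """Return the impact position measured from sensor A, for sensors A and B
        a distance sensor_spacing_m apart, where dt_s = t_arrival_B - t_arrival_A."""
        return 0.5 * (sensor_spacing_m - wave_speed_m_s * dt_s)

    # Example: sensors 100 m apart, signal reaches B 10 ms after A,
    # assumed guided-wave speed of 3000 m/s in the pipe wall.
    x = locate_impact(100.0, 0.010, 3000.0)
    print(f"impact located ~{x:.1f} m from sensor A")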

  2. JGI Plant Genomics Gene Annotation Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Shu, Shengqiang; Rokhsar, Dan; Goodstein, David; Hayes, David; Mitros, Therese

    2014-07-14

    Plant genomes vary in size and are highly complex, with a large amount of repeats, genome duplication and tandem duplication. Genes encode a wealth of information useful in studying organisms, and it is critical to have high-quality, stable gene annotation. Thanks to advances in sequencing technology, the genomes of many plant species have been sequenced, and their transcriptomes have been sequenced as well. To use these vast amounts of sequence data for gene annotation or re-annotation in a timely fashion, an automatic pipeline is needed. The JGI plant genomics gene annotation pipeline, called integrated gene call (IGC), is our effort toward this aim, with the aid of an RNA-seq transcriptome assembly pipeline. It utilizes several gene predictors based on homolog peptides and transcript ORFs. See Methods for details. Here we present the genome annotation of JGI flagship green plants produced by this pipeline, plus Arabidopsis and rice, except for chlamy, which was done by a third party. The genome annotations of these species and others are used in our gene family build pipeline and are accessible via the JGI Phytozome portal, whose URL and front page snapshot are shown below.

  3. Diagnostics and reliability of pipeline systems

    CERN Document Server

    Timashev, Sviatoslav

    2016-01-01

    The book contains solutions to fundamental problems which arise from the logic of development of specific branches of science that are related to pipeline safety but are mainly subordinate to the needs of pipeline transportation. The book addresses important but not yet solved aspects of reliability and safety assurance of pipeline systems, which are vital not only for the oil and gas industry and, more generally, the fuel and energy industries, but also for virtually all contemporary industries and technologies. The volume will be useful to specialists and experts in the field of diagnostics/inspection, monitoring, reliability and safety of critical infrastructures. First and foremost, it will be useful to decision makers: operators of different types of pipelines, pipeline diagnostics/inspection vendors, designers of in-line inspection (ILI) tools, industrial and ecological safety specialists, as well as researchers and graduate students.

  4. Research for visualization of running state of long-distance water transmission pipeline based on OpenGL

    Science.gov (United States)

    Zhang, Xiaoping; Xu, Xuejun; Liu, Bing; Zhang, Zhendong

    2017-03-01

    The running conditions of a long-distance water pipeline are complicated and changeable, and the lag of the water flow is significant; this is the key technical problem to be solved in regulation and control. Considering the present situation of long-distance water conveyance projects, visualization simulation based on OpenGL technology is used to study the operation and management of the long-distance water conveyance pipeline. The system developed in this paper can combine pipeline information, working-condition data and other relevant data to provide a visualization platform for analysis and decision-making in project management and operation.

  5. Multi-diameter pigging: factors affecting the design and selection of pigging tools for multi-diameter pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Karl [Pipeline Engineering and Supply Co. Ltd., Richmond, NY (United States)

    2009-07-01

    This paper will consider the process involved in pigging tool selection for pipelines with two or more significant internal diameters which require pigging tools capable of negotiating the different internal diameters whilst also carrying out the necessary pipeline cleaning operation. The paper will include an analysis of pipeline features that affect pigging tool selection and then go on to look at other variables that determine the pigging tool design; this will include a step by step guide outlining how the tool is designed, the development of prototype pigs and the importance of testing and validation prior to final deployment in operational pigging programmes. (author)

  6. CORROSION RESISTANT CERAMIC COATING FOR X80 PIPELINE STEEL BY LOW-TEMPERATURE PACK ALUMINIZING AND OXIDATION TREATMENT

    OpenAIRE

    HUANG MIN; FU QIAN-GANG; WANG YU; ZHONG WEN-WU

    2013-01-01

    In this paper, we discuss the formation of ceramic coatings by a combined process of low-temperature pack aluminizing and oxidation treatment on the surface of X80 pipeline steel substrates, in order to improve the corrosion resistance of X80 pipeline steel. First, an Fe-Al coating consisting of FeAl3 and Fe2Al5 was prepared by low-temperature pack aluminizing at 803 K, which was achieved by adding zinc to the pack powder. Pre-treatment of X80 pipeline steel was carried out through s...

  7. Applications of the pipeline environment for visual informatics and genomics computations

    Directory of Open Access Journals (Sweden)

    Genco Alex

    2011-07-01

    Abstract Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources.

  8. PANGEA: pipeline for analysis of next generation amplicons.

    Science.gov (United States)

    Giongo, Adriana; Crabb, David B; Davis-Richardson, Austin G; Chauliac, Diane; Mobberley, Jennifer M; Gano, Kelsey A; Mukherjee, Nabanita; Casella, George; Roesch, Luiz F W; Walts, Brandon; Riva, Alberto; King, Gary; Triplett, Eric W

    2010-07-01

    High-throughput DNA sequencing can identify organisms and describe population structures in many environmental and clinical samples. Current technologies generate millions of reads in a single run, requiring extensive computational strategies to organize, analyze and interpret those sequences. A series of bioinformatics tools for high-throughput sequencing analysis, including pre-processing, clustering, database matching and classification, have been compiled into a pipeline called PANGEA. The PANGEA pipeline was written in Perl and can be run on Mac OS X, Windows or Linux. With PANGEA, sequences obtained directly from the sequencer can be processed quickly to provide the files needed for sequence identification by BLAST and for comparison of microbial communities. Two different sets of bacterial 16S rRNA sequences were used to show the efficiency of this workflow. The first set of 16S rRNA sequences is derived from various soils from Hawaii Volcanoes National Park. The second set is derived from stool samples collected from diabetes-resistant and diabetes-prone rats. The workflow described here allows the investigator to quickly assess libraries of sequences on personal computers with customized databases. PANGEA is provided for users as individual scripts for each step in the process or as a single script where all processes, except the χ2 step, are joined into one program called the 'backbone'.
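
    PANGEA itself is a set of Perl scripts; purely as a schematic illustration of the "backbone" idea of chaining the individual step scripts (pre-processing, clustering, database matching) into one driver, a minimal Python sketch might look like the following, with stand-in commands in place of the real per-step scripts.

    import subprocess
    import sys

    # Stand-in commands; in the real pipeline each entry would invoke one of the
    # per-step scripts (quality trimming, clustering, BLAST matching, ...).
    STEPS = [
        [sys.executable, "-c", "print('step 1: pre-processing / quality trimming')"],
        [sys.executable, "-c", "print('step 2: clustering of sequences')"],
        [sys.executable, "-c", "print('step 3: database matching and classification')"],
    ]

    def run_backbone(steps):
        """Run each step in order and stop the pipeline if any step fails."""
        for cmd in steps:
            result = subprocess.run(cmd)
            if result.returncode != 0:
                sys.exit(f"pipeline stopped: step failed ({cmd})")

    if __name__ == "__main__":
        run_backbone(STEPS)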

  9. HMI global helioseismology data analysis pipeline

    Science.gov (United States)

    Larson, Tim; Schou, Jesper

    2011-01-01

    The HMI global helioseismology data analysis pipeline is based largely on the MDI medium-l program. All of the modules that ran in the SOI Science Support Center have been ported for use in the SDO Joint Science Operations Center (JSOC) and given greater functionality. Many errors and approximations which are present in the standard MDI pipeline have been corrected and improvements have been added. Scripts have been written to automate the submission of compute jobs to our local cluster; it is now possible to go from dopplergrams to mode parameters with the push of a button. JSOC dataseries have been created to hold all intermediate data products, timeseries, window functions, and mode parameters. Here we discuss the operation of the pipeline, the structure of the data it generates, and access to the same.

  10. PLASTIC LIMIT LOAD ANALYSIS OF DEFECTIVE PIPELINES

    Institute of Scientific and Technical Information of China (English)

    Chen Gang; Liu Yinghua; Xu Bingye

    2003-01-01

    The integrity assessment of defective pipelines represents a practically important task of structural analysis and design in various technological areas, such as oil and gas industry, power plant engineering and chemical factories. An iterative algorithm is presented for the kinematic limit analysis of 3-D rigid-perfectly plastic bodies. A numerical path scheme for radial loading is adopted to deal with complex multi-loading systems. The numerical procedure has been applied to carry out the plastic collapse analysis of pipelines with part-through slot under internal pressure, bending moment and axial force. The effects of various shapes and sizes of part-through slots on the collapse loads of pipelines are systematically investigated and evaluated. Some typical failure modes corresponding to different configurations of slots and loading forms are studied.

  11. Key Design Properties for Shipping Information Pipeline

    DEFF Research Database (Denmark)

    Jensen, Thomas; Tan, Yao-Hua

    2015-01-01

    on paper, e-mail, phone and text message, and far too costly. This paper explores the design properties for a shared information infrastructure to exchange information between all parties in the supply chain, commercial parties as well as authorities, which is called a Shipping Information Pipeline...... Infrastructures. The paper argues why the previous attempts are inadequate to address the issues in the domain of international supply chains. Instead, a different set of key design properties is proposed for the Shipping Information Pipeline. The solution has been developed in collaboration with a network...... of representatives of major stakeholders in international trade, who evaluate it positively and are willing to invest, develop and test the prototype of the Shipping Information Pipeline....

  12. Safety installation for preventing pollution by pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Wittgenstein, G.F.

    1972-10-25

    A safety installation for preventing pollution by pipelines, particularly those used for transporting liquid hydrocarbons, is described. It is applicable to any pipeline, but particularly to underground or submarine pipelines, whether made of steel, plastics, or any other material. The four essential objects of the invention are to ensure reliable prevention of pollution of the environment due to leakage of a hydrocarbon through cracks in the pipe; to evacuate the leakage flow without delay to a vessel; to signal almost instantaneously the existence of a leak; and to effect remote control of operations by which the dynamic pressure in the pipe is cancelled. Each equipped section consists of a fluid-tight jacket of plastic material which surrounds the pipe and is sealed off at its ends. It is these seals which delimit the sections. (7 claims)

  13. Digitally assisted pipeline ADCs theory and implementation

    CERN Document Server

    Murmann, Boris

    2007-01-01

    List of Figures. List of Tables. Acknowledgements. Preface.
    1: Introduction. 1. Motivation. 2. Overview. 3. Chapter Organization.
    2: Performance Trends. 1. Introduction. 2. Digital Performance Trends. 3. ADC Performance Trends.
    3: Scaling Analysis. 1. Introduction. 2. Basic Device Scaling from a Digital Perspective. 3. Technology Metrics for Analog Circuits. 4. Scaling Impact on Matching-Limited Circuits. 5. Scaling Impact on Noise-Limited Circuits.
    4: Improving Analog Circuit Efficiency. 1. Introduction. 2. Analog Circuit Challenges. 3. The Cost of Feedback. 4. Two-Stage Feedback Amplifier vs. Open-Loop Gain Stage. 5. Discussion.
    5: Open-Loop Pipelined ADCs. 1. A Brief Review of Pipelined ADCs. 2. Conventional Stage Implementation. 3. Open-Loop Pipeline Stages. 4. Alternative Transconductor Implementations.
    6: Digital Nonlinearity Correction. 1. Overview. 2. Error Model and Digital Correction. 3. Alternative Error Models.
    7: Statistics-Based Parameter Estimation. 1. Introduction. 2. Modulation Approach. 3. R...

  14. Sino-Kazakh Crude Pipeline Starts Operation

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The Sino-Kazakh Crude Oil Pipeline, financed by China National Petroleum Corporation (CNPC) and KazMunaiGaz, the state oil company of Kazakhstan, was launched on December 15, 2005, thanks to the commitments and endeavors of the governments and constructors of both countries. The pipeline, with a diameter of 813 mm and a total length of 962.2 km, runs from Atasu in Kazakhstan in the west to Alashankou in China in the east, with a phase I design annual capacity of up to 10 million tons. The launch of the pipeline is a milestone of China-Kazakhstan energy cooperation, of great importance to both countries' economic growth, China's energy security strategy and the diversification of Kazakh oil exports.

  15. Generic Data Pipelining Using ORAC-DR

    Science.gov (United States)

    Allan, Alasdair; Jenness, Tim; Economou, Frossie; Currie, Malcolm J.; Bly, Martin J.

    A generic data reduction pipeline is, perhaps, the holy grail for data reduction software. We present work which sets us firmly on the path towards this goal. ORAC-DR is an online data reduction pipeline written by the Joint Astronomy Center (JAC) and the UK Astronomy Technology Center (ATC) and distributed as part of the Starlink Software collection (SSC). It is intended to run with a minimum of observer interaction, and is able to handle data from many different instruments, including SCUBA, CGS4, UFTI, IRCAM and Michelle, with support for IRIS2 and UIST under development. Recent work by Starlink in collaboration with the JAC has resulted in an increase in the pipeline's flexibility, opening up the possibility that it could be used for truly generic data reduction for data from any imaging, and eventually spectroscopic, detector.

  16. A New Opportunity Facing the Pipeline Sector

    Institute of Scientific and Technical Information of China (English)

    Yao Changpu

    2001-01-01

    According to the State and CNPC development programs, the nation's oil/gas pipeline sector is faced with an unprecedented historic opportunity and is about to enter its golden period of development. Three major pipelines, all either under construction or planned, have been listed as national priority infrastructure projects: Se-Ning-Lan (from Sebei of Qinghai Province to Lanzhou by way of Xining) for natural gas, Lan-Cheng-Yu (from Lanzhou to Chengdu, extending to Chongqing) for finished oils, and Zhong-Wu (from Zhongxian County of Chongqing to Wuhan) for natural gas. In addition, the double-line project of the existing Shaan-Jing (Shaanxi Province to Beijing) gas pipeline has completed the approval procedure.

  17. An Overview of Deepwater Pipeline Laying Technology

    Institute of Scientific and Technical Information of China (English)

    LI Zhi-gang; WANG Cong; HE Ning; ZHAO Dong-yan

    2008-01-01

    The technology and methods involved in pipeline laying in shallow water have evolved to the level of the routine and commonplace. However, in the face of unexpected deepwater complexity, traditional pipeline laying techniques confront many new challenges arising from increases in water depth, pipe diameter and welding difficulty, and they must be modified and/or innovated on the basis of existing mature experience. The purpose of this investigation is to outline the existing and new engineering laying techniques and the associated facilities, which can provide useful information for related research. In this context, the latest deepwater pipeline laying technology and the pipe laying barges of renowned companies from Switzerland, Norway, Italy, etc., are introduced, and the corresponding comparisons and discussion are presented as well.

  18. Fuel consumption impact on gas pipeline projects

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Sidney P. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Kurz, Rainer; Lubomirsky, Matt [Solar Turbines, Inc., San Diego, CA (United States)

    2005-07-01

    An optimized gas pipeline design requires not only qualified management of good engineering and planning, but also accurate estimates of capital investment and O&M costs. Compressor stations play a very important role in the success of a gas pipeline design, and careful selection of centrifugal compressors and drivers is a key aspect of project success. State-of-the-art designs available nowadays for this kind of equipment provide high overall thermodynamic performance and consequently minimize installed power requirements and energy usage, with significant savings in operating expenses over the economic life of the project. This paper presents a guideline for proper station design and selection of its turbo-compressors, with emphasis on the impact of fuel consumption on the economics of a gas pipeline project. (author)

  19. V-GAP: Viral genome assembly pipeline

    KAUST Repository

    Nakamura, Yoji

    2015-10-22

    Next-generation sequencing technologies have allowed the rapid determination of the complete genomes of many organisms. Although it is still difficult to reconstruct shotgun sequences from large-genome organisms into perfect contigs, each representing a full chromosome, shotgun reads from small genomes have been assembled successfully into a very small number of contigs. In this study, we show that shotgun reads from phage genomes can be reconstructed into a single contig by controlling the number of read sequences used in the de novo assembly. We have developed a pipeline to assemble small viral genomes with good reliability using a resampling method applied to the shotgun data. This pipeline, named V-GAP (Viral Genome Assembly Pipeline), will contribute to the rapid genome typing of viruses, which are highly divergent, and thus will meet the increasing need for viral genome comparisons in metagenomic studies.
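
    The key step described above is controlling the number of reads fed to the assembler. The following Python sketch shows the resampling idea in its simplest form; the target read count and the in-memory read list are illustrative assumptions rather than details of the published pipeline.

    import random

    def subsample_reads(reads, target_count, seed=42):
        """Randomly draw target_count reads (without replacement) from the input."""
        if len(reads) <= target_count:
            return list(reads)
        rng = random.Random(seed)
        return rng.sample(reads, target_count)

    if __name__ == "__main__":
        # Toy data standing in for a deep shotgun data set of a small phage genome.
        reads = [f"read_{i}" for i in range(1_000_000)]
        selected = subsample_reads(reads, target_count=50_000)
        print(f"{len(selected)} of {len(reads)} reads retained for de novo assembly")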

  20. Engineering critical assessment of PETROBRAS Camarupim pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Gordon, J.R. [Microalloying International, Houston, TX (United States); Gatlin, R.W. [Global Industries, Rio de Janeiro, RJ (Brazil); Zumpano Junior, P.; Kaspary, T. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-12-19

    This paper presents details of an Engineering Critical Assessment (ECA) performed to develop girth weld flaw acceptance criteria for the PETROBRAS Camarupim Pipeline which was installed in Espirito Santo Basin, ES, offshore Brazil in May 2008 by Global Industries. The pipeline was constructed using 24-inch diameter API Grade X65 pipe with wall thicknesses of 0.875-inch (22.2 mm) and 1.00 inch (25.4 mm). Although the Camarupim pipeline will initially transport sweet gas there is the potential for mildly sour service operation in mid to late life. To assess the effect of sour service on the material toughness properties a series of slow strain rate fracture toughness tests were performed in a Project representative sour service environment. In addition the results of sour service fatigue crack growth tests were analyzed to develop a conservative sour service fatigue crack growth law for the ECA analysis. (author)

  1. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    The aim of this study is to present an approach to introducing pipeline and parallel computing, using a model of a multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included. At the same time, the topic is among the most motivating, due to its comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. It allows an educational platform for a constructivist learning process to be implemented, enabling learners to experiment with the provided programming models, acquire competences in modern scientific research and computational thinking, and capture the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The C programming language was chosen for developing the programming models, with the Message Passing Interface (MPI) and OpenMP as parallelization tools.
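
    The study develops its models in C with MPI and OpenMP; as a language-neutral illustration of the underlying multiphase queueing model, the Python sketch below simulates a three-stage tandem queue in which each job must pass through every stage in order, much like instructions moving through a pipeline. The arrival rate and service times are arbitrary example values.

    import random

    def simulate_tandem_queue(n_jobs=1000, stage_means=(0.5, 0.4, 0.6), seed=1):
        """Simulate a tandem queue: each stage serves one job at a time (FIFO),
        and a job enters a stage only when both the job and the stage are free."""
        rng = random.Random(seed)
        stage_free_at = [0.0] * len(stage_means)   # time each stage becomes idle
        sojourn_times = []
        arrival = 0.0
        for _ in range(n_jobs):
            arrival += rng.expovariate(1.0)        # Poisson arrivals, rate 1
            t = arrival
            for i, mean in enumerate(stage_means):
                start = max(t, stage_free_at[i])   # wait for the stage to be free
                t = start + rng.expovariate(1.0 / mean)
                stage_free_at[i] = t
            sojourn_times.append(t - arrival)      # total time this job spent in the pipeline
        return sum(sojourn_times) / n_jobs

    print(f"mean sojourn time through the 3-stage pipeline: "
          f"{simulate_tandem_queue():.2f} time units")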

  2. Open source pipeline for ESPaDOnS reduction and analysis

    Science.gov (United States)

    Martioli, Eder; Teeple, Doug; Manset, Nadine; Devost, Daniel; Withington, Kanoa; Venne, Andre; Tannock, Megan

    2012-09-01

    OPERA is a Canada-France-Hawaii Telescope (CFHT) open source collaborative software project currently under development for an ESPaDOnS echelle spectro-polarimetric image reduction pipeline. OPERA is designed to be fully automated, performing calibrations and reduction and producing one-dimensional intensity and polarimetric spectra. The calibrations are performed on two-dimensional images. Spectra are extracted using an optimal extraction algorithm. While primarily designed for CFHT ESPaDOnS data, the pipeline is being written to be extensible to other echelle spectrographs. A primary design goal is to make use of fast, modern object-oriented technologies. Processing is controlled by a harness, which manages a set of processing modules that make use of a collection of native OPERA software libraries and standard external software libraries. The harness and modules are completely parametrized by site configuration and instrument parameters. The software is open-ended, permitting users of OPERA to extend the pipeline capabilities. All these features have been designed to provide a portable infrastructure that facilitates collaborative development, code re-usability and extensibility. OPERA is free software with support for both GNU/Linux and MacOSX platforms. The pipeline is hosted on SourceForge under the name "opera-pipeline".

  3. Design of a pipelined 8-bit-serial single-flux-quantum microprocessor with multiple ALUs

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M [Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603 (Japan); Kawamoto, T [Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603 (Japan); Yamanashi, Y [Yokohama National University, 79-5 Tokiwa-dai, Hodogaya-ku, Yokohama 240-8501 (Japan); Kamiya, Y [Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603 (Japan); Akimoto, A [Yokohama National University, 79-5 Tokiwa-dai, Hodogaya-ku, Yokohama 240-8501 (Japan); Fujiwara, K [Yokohama National University, 79-5 Tokiwa-dai, Hodogaya-ku, Yokohama 240-8501 (Japan); Fujimaki, A [Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603 (Japan); Yoshikawa, N [Yokohama National University, 79-5 Tokiwa-dai, Hodogaya-ku, Yokohama 240-8501 (Japan); Terai, H [National Institute of Information and Communications Technology, 588-2 Iwaoka, Nishi-ku, Kobe 651-2492 (Japan); Yorozu, S [International Superconductivity Technology Center/Superconductivity Research Laboratory, 34 Miyukigaoka, Tsukuba 305-8501 (Japan)

    2006-05-15

    We have designed a pipelined 8-bit-serial single-flux-quantum microprocessor with multiple ALUs, called CORE1 β. In the CORE1 β, two ALUs connected in cascade enable us to perform two calculations on serial data using a register-to-register instruction, to enhance the peak performance. In addition, we have introduced pipelining to boost the performance. Although the pipelining is a difficult technique that requires a complex design in the datapath, we have implemented a simplified pipeline with seven stages by using two techniques. One is the separation of clock signals for pipelining and bit processing, and the other is the introduction of new buffers driven by independent clock signals for reading and writing flexibly in order to ease the difficulty in timing design between the register file and the ALUs. According to the logic simulation, the peak performance of the designed microprocessor is estimated to be 1500 million operations per second with a power consumption of 3.3 mW. We have fabricated the CORE1 β chip by using the NEC 2.5 kA cm-2 niobium standard process, and confirmed the correct operations of several instructions using high-speed clocks.

  4. Design of a pipelined 8-bit-serial single-flux-quantum microprocessor with multiple ALUs

    Science.gov (United States)

    Tanaka, M.; Kawamoto, T.; Yamanashi, Y.; Kamiya, Y.; Akimoto, A.; Fujiwara, K.; Fujimaki, A.; Yoshikawa, N.; Terai, H.; Yorozu, S.

    2006-05-01

    We have designed a pipelined 8-bit-serial single-flux-quantum microprocessor with multiple ALUs, called CORE1 β. In the CORE1 β, two ALUs connected in cascade enable us to perform two calculations on serial data using a register-to-register instruction, to enhance the peak performance. In addition, we have introduced pipelining to boost the performance. Although the pipelining is a difficult technique that requires a complex design in the datapath, we have implemented a simplified pipeline with seven stages by using two techniques. One is the separation of clock signals for pipelining and bit processing, and the other is the introduction of new buffers driven by independent clock signals for reading and writing flexibly in order to ease the difficulty in timing design between the register file and the ALUs. According to the logic simulation, the peak performance of the designed microprocessor is estimated to be 1500 million operations per second with a power consumption of 3.3 mW. We have fabricated the CORE1 β chip by using the NEC 2.5 kA cm-2 niobium standard process, and confirmed the correct operations of several instructions using high-speed clocks.

  5. The pipeline fracture behavior and pressure assessment under HIC (Hydrogen induced cracking) environment

    Energy Technology Data Exchange (ETDEWEB)

    Shaohua, Dong [China National Petroleum Corporation (CNPC), Beijing (China); Lianwei, Wang [University of Science and Technology Beijing (USTB), Beijing (China)

    2009-07-01

    As hydrogen is transported and diffuses, after an incubation period the hydrogen concentration around a pipeline crack tip reaches a critical density, the material properties degrade, and the critical stress intensity factor is reduced; hydrogen transport and diffusion are thus the cause of pipeline hydrogen induced cracking. The stress intensity factor for hydrogen induced cracking under environmental and stress conditions is the key to assessing the material's fracture behavior. The paper studies the relationship among hydrogen concentration, crack-tip stress and strain fields, hydrogen diffusion and internal pressure for the crack-tip process zone, and then determines the length of the HIC (hydrogen induced cracking) process zone. Based on the theory of crack propagation by micro-crack nucleation, a dislocation model is developed as the fracture criterion for HIC, the influence of material and environment under HIC conditions is analyzed, and the maximum pipeline operating pressure and the threshold J-integral (J_ISCC) are calculated step by step, which is very significant for safe pipeline operation. (author)

  6. 77 FR 63309 - Constitution Pipeline Company, LLC; Notice of Public Scoping Meeting and Extension of Scoping...

    Science.gov (United States)

    2012-10-16

    ... Energy Regulatory Commission Constitution Pipeline Company, LLC; Notice of Public Scoping Meeting and Extension of Scoping Period for the Planned Constitution Pipeline Project On October 24, 2012, the Federal... Constitution Pipeline Company's (Constitution) Constitution Pipeline Project. This notice also extends...

  7. EST analysis pipeline: use of distributed computing resources.

    Science.gov (United States)

    González, Francisco Javier; Vizcaíno, Juan Antonio

    2011-01-01

    This chapter describes how a pipeline for the analysis of expressed sequence tag (EST) data can be implemented, based on our previous experience generating ESTs from Trichoderma spp. We focus on key steps in the workflow, such as the processing of raw data from the sequencers, the clustering of ESTs, and the functional annotation of the sequences using BLAST, InterProScan, and BLAST2GO. Some of the steps require the use of intensive computing power. Since these resources are not available to small research groups or institutes without bioinformatics support, an alternative is described: the use of distributed computing resources (local grids and Amazon EC2).

  8. The MUSE Data Reduction Pipeline: Status after Preliminary Acceptance Europe

    CERN Document Server

    Weilbacher, Peter M; Urrutia, Tanya; Pécontal-Rousset, Arlette; Jarno, Aurélien; Bacon, Roland

    2015-01-01

    MUSE, a giant integral field spectrograph, is about to become the newest facility instrument at the VLT. It will see first light in February 2014. Here, we summarize the properties of the instrument as built and outline the functionality of the data reduction system, which transforms the raw data, recorded separately by 4k CCDs in 24 IFUs, into a fully calibrated, scientifically usable data cube. We then describe recent work regarding the geometrical calibration of the instrument and testing of the processing pipeline, before concluding with results of the Preliminary Acceptance in Europe and an outlook to the on-sky commissioning.

  9. Experimental study on 830 MPa grade pipeline steel containing chromium

    Institute of Scientific and Technical Information of China (English)

    Yi Ren; Shuai Zhang; Shuang Wang; Wen-yue Liu

    2009-01-01

    The diversity of microstructure and properties of 830 MPa grade pipeline steel containing chromium was investigated by optical microscopy and transmission electron microscopy. The main microstructures were multiple configurations containing lath bainite and granular bainite. Mechanical property test results showed that the yield strength and tensile strength improved with increasing chromium content, while toughness and elongation decreased, so a tempering process was introduced. Applying proper tempering parameters, the toughness and elongation were improved dramatically, and the strength decreased only slightly.

  10. Multinational Gas Pipeline Hopeful on Schedule

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The Kovykta project, which will transport natural gas from Russia's Eastern Siberia to China and the Republic of Korea, might come out ahead of the Sino-Russian oil pipeline project. China and Russia are negotiating the price of the piped gas, and the result of the negotiations will likely be seen in three to four months, TNK-BP President and Chief Executive Officer Robert Dudley recently said, adding that he is confident that natural gas from Kovykta will start flowing through the pipeline by the end of 2008 as planned.

  11. Self lubrication of bitumen froth in pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, D.D. [Univ. of Minnesota, Minneapolis, MN (United States)

    1997-12-31

    In this paper I will review the main properties of water lubricated pipelines and explain some new features which have emerged from studies of self-lubrication of Syncrude's bitumen froth. When heavy oils are lubricated with water, the water and oil are continuously injected into a pipeline and the water is stable when in a lubricating sheath around the oil core. In the case of bitumen froth obtained from the Alberta tar sands, the water is dispersed in the bitumen and it is liberated at the wall under shear; water injection is not necessary because the froth is self-lubricating.

  12. Decameter Pulsars and Transients Survey of the Northern Sky. Status, First Results, Multiparametric Pipeline for Candidate Selection

    Science.gov (United States)

    Zakharenko, V. V.; Kravtsov, I. P.; Vasylieva, I. Y.; Mykhailova, S. S.; Ulyanov, O. M.; Shevtsova, A. I.; Skoryk, A. O.; Zarka, P.; Konovalenko, O. O.

    We present the results of processing the first 20% of the Northern sky pulsar and transient survey carried out with the UTR-2 radio telescope. Data processing is done by an automatic pipeline that detects and outputs a large number of transient candidates (usually dispersed bursts). We have developed a multivariate pipeline for visual inspection of these candidates. By adjusting the input parameters of the pipeline, the observer can substantially increase the signal-to-noise ratio of detected signals as well as discriminate them from residual low-intensity interference with high significance. About 450 transient signals have passed the examination by the multivariate pipeline. Their distributions in Galactic latitude and dispersion measure have been derived. The shape of the distributions suggests that these signals might be associated with cosmic sources of radio emission.

  13. 75 FR 73160 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2010-11-29

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Information Collection Activities... information collection activity. PHMSA requests comments on the following information collection: Title... information collection under Office of Management and Budget (OMB) Control No. 2137-0578, titled...

  14. 76 FR 33808 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2011-06-09

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Information Collection Activities... collection; (2) OMB control number; (3) Type of request; (4) Abstract of the information collection activity... information collection activity. PHMSA requests comments on the following information collection:...

  15. 75 FR 13807 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2010-03-23

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Information Collection Activities... information collection activity; (5) description of affected public; (6) estimate of total annual reporting... approval for each information collection activity. PHMSA requests comments on the following...

  16. 75 FR 76077 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2010-12-07

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Information Collection Activities... information collection activity. PHMSA requests comments on the following information collection: Title... on an information collection under Office of Management and Budget (OMB) Control No....

  17. 78 FR 57455 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2013-09-18

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Information Collection Activities... control number; (3) type of request; (4) abstract of the information collection activity; (5) description... activity. PHMSA requests comments on the following information collection: SUPPLEMENTARY INFORMATION:...

  18. 78 FR 36016 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2013-06-14

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Information Collection Activities... information collection activity; (5) description of affected public; (6) estimate of total annual reporting... approval for each information collection activity. PHMSA requests comments on the following...

  19. 77 FR 15453 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2012-03-15

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Information Collection Activities...; (5) Abstract of the information collection activity; (6) Description of affected public; (7) Estimate... request a three- year term of approval for the information collection activity. PHMSA requests comments...

  20. 76 FR 81013 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2011-12-27

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Information Collection Activities... request; (5) Abstract of the information collection activity; (6) Description of affected public; (7... request a three- year term of approval for the information collection activity. PHMSA requests comments...

  1. 49 CFR 195.248 - Cover over buried pipeline.

    Science.gov (United States)

    2010-10-01

    ... HAZARDOUS LIQUIDS BY PIPELINE Construction § 195.248 Cover over buried pipeline. (a) Unless specifically... (457) Any other area 30 (762) 18 (457) 1 Rock excavation is any excavation that requires blasting or...

  2. Diagnosing plant pipeline system performance using radiotracer techniques

    Energy Technology Data Exchange (ETDEWEB)

    Kasban, H.; Ali, Elsayed H.; Arafa, H. [Engineering Department, Nuclear Research Center, Atomic Energy Authority, Inshas (Egypt)

    2017-02-15

    This study presents experimental work at a petrochemical company on scanning a buried pipeline using a Tc-99m radiotracer, based on measured velocity changes, in order to determine the flow reduction along the pipeline. In this work, the Tc-99m radiotracer was injected into the pipeline and monitored by sodium iodide scintillation detectors located at several positions along the pipeline. The flow velocity was calculated between every two consecutive detectors along the pipeline. In practice, six experiments were carried out using two different data acquisition systems, each connected to four detectors. During the fifth experiment, a bypass was discovered between the scanned pipeline and another buried parallel pipeline connected after the injection point. The results indicate that the bypass had an adverse effect on the volumetric flow rate in the scanned pipeline.
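
    The velocity calculation between consecutive detectors reduces to distance over the transit time of the tracer peak, and the volumetric flow follows from the pipe cross-section. The numbers in the Python sketch below (detector positions, arrival times and pipe diameter) are made-up example values; a drop in the computed velocity between later detector pairs is the kind of signature that revealed the bypass.

    import math

    positions_m = [0.0, 40.0, 85.0, 130.0]          # detector stations along the line
    arrival_s   = [0.0, 26.7, 57.0, 87.5]           # tracer peak arrival times
    pipe_id_m   = 0.20                               # internal diameter (assumed)

    area = math.pi * (pipe_id_m / 2) ** 2
    for i in range(len(positions_m) - 1):
        v = (positions_m[i + 1] - positions_m[i]) / (arrival_s[i + 1] - arrival_s[i])
        q = v * area * 3600                          # volumetric flow in m3/h
        print(f"detectors {i}-{i + 1}: velocity {v:.2f} m/s, flow {q:.1f} m3/h")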

  3. 75 FR 38799 - ETC Tiger Pipeline, LLC; Notice of Application

    Science.gov (United States)

    2010-07-06

    ...] ETC Tiger Pipeline, LLC; Notice of Application June 25, 2010. Take notice that on June 15, 2010, ETC Tiger Pipeline, LLC (ETC Tiger), 711 Louisiana Street, Suite 900, Houston, Texas 77002, filed...

  4. QUANTITATIVE RISK MAPPING OF URBAN GAS PIPELINE NETWORKS USING GIS

    National Research Council Canada - National Science Library

    P. Azari; M. Karimi

    2017-01-01

    Natural gas is considered an important source of energy in the world. With the increasing growth of urbanization, urban gas pipelines, which transmit natural gas from transmission pipelines to consumers, have become a dense network...

  5. PetroChina Sees Gas Pipeline Profit in Four Years

    Institute of Scientific and Technical Information of China (English)

    Xiao Lu

    2002-01-01

    Sebei-Xining-Lanzhou pipeline put into operation. PetroChina, the nation's largest gas producer, recently said that the US$280-million gas pipeline in Northwest China would generate profits within four years.

  6. 78 FR 39721 - Constitution Pipeline Company, LLC; Notice of Application

    Science.gov (United States)

    2013-07-02

    ... Energy Regulatory Commission Constitution Pipeline Company, LLC; Notice of Application Take notice that on June 13, 2013, Constitution Pipeline Company, LLC (Constitution), having its principal place of...\\ Constitution further requests that the Commission grant Constitution a blanket certificate...

  7. Controlled short-circuiting MIG-MAG welding process and procedures applied to the root pass in pipeline construction; Processo de soldagem MIG/MAG em curto-circuito controlado e procedimentos aplicados ao passe de raiz na construcao de linhas dutoviarias

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Regis H.G. e; Gohr Junior, Raul; Weck, Leonardo W.A. [Santa Catarina Univ., Florianopolis, SC (Brazil). Dept. de Engenharia Mecanica. Lab. de Soldagem e Mecatronica (LABSOLDA)

    2005-07-01

    This work deals with the study and development of the Controlled Short-Circuiting MIG/MAG welding process (CCC) and procedures for the root pass on pipes in pipeline construction. The developed process (CCC) consists of a semi-automatic, slag-free operation, yielding higher productivity than the coated electrode and TIG processes, with satisfactory properties in the root weld. The significant influence of welding on the time schedule and construction cost makes the development of this technology attractive, so that it becomes available at low cost and enhances the companies' competitiveness in the globalized oil sector. The developed system, a MIG/MAG variant, retains the advantages of short-circuiting metal transfer while avoiding its inconveniences (mainly with high CO2-content gases), enabling its use in pipe root welding. This is possible through current waveform control, providing process and weld pool stability. Procedures for the root pass were determined with the CCC for each of the welding positions encountered in thick-walled pipe welding. The short welder training time was also notable. (author)

  8. Multi-Criteria Decision Making Models for Water Pipelines

    OpenAIRE

    El Chanati, Hisham; El-Abbasy, Mohammed S.; Mosleh, Fadi; Senouci, Ahmed; Abouhamad, Mona; Gkountis, Iason; Zayed, Tarek; Al-Derham, Hassan

    2016-01-01

    The deterioration of water pipelines leads to impaired water quality, increased breakage rate, and reduced hydraulic capacity. The planning of maintenance programs for water pipelines is essential to minimize health and safety concerns and ensure an adequate supply of water in a safe, cost-effective, reliable, and sustainable manner. It is essential to assess the performance of water pipelines to assist municipalities in planning inspection and rehabilitation programs for their pipelines. Sev...

  9. Geohazard assessment lifecycle for a natural gas pipeline project

    Science.gov (United States)

    Lekkakis, D.; Boone, M. D.; Strassburger, E.; Li, Z.; Duffy, W. P.

    2015-09-01

    This paper is a walkthrough of the geohazard risk assessment performed for the Front End Engineering Design (FEED) of a planned large-diameter natural gas pipeline, extending from Eastern Europe to Western Asia for a total length of approximately 1,850 km. The geohazards discussed herein include liquefaction-induced pipe buoyancy, cyclic softening, lateral spreading, slope instability, groundwater rise-induced pipe buoyancy, and karst. The geohazard risk assessment lifecycle comprised 4 stages: initially, a desktop study was carried out to describe the geologic setting along the alignment and to conduct a preliminary assessment of the geohazards. The development of a comprehensive Digital Terrain Model, topography and aerial photography data were fundamental in this process. Subsequently, field geohazard mapping was conducted with the deployment of 8 teams of geoprofessionals, to investigate the proposed major reroutes and delve into areas of poor or questionable data. During the third stage, a geotechnical subsurface site investigation was executed based on the results of the above study and mapping efforts in order to obtain sufficient data tailored for risk quantification. Lastly, all gathered and processed information was overlaid in a Geographic Information System database toward a final determination of the critical reaches of the pipeline alignment. Input from Subject Matter Experts (SME) in the fields of landslides, karst and fluvial geomorphology was incorporated during the second and fourth stages of the assessment. Their experience in that particular geographical region was key to making appropriate decisions based on engineering judgment. As the design evolved through the above stages, the pipeline corridor was narrowed from a 2-km wide corridor, to a 500-m corridor and finally to a fixed alignment. Where the geohazard risk was high, rerouting of the pipeline was generally selected as a mitigation measure. In some cases of high uncertainty in

  10. MMAPPR: mutation mapping analysis pipeline for pooled RNA-seq.

    Science.gov (United States)

    Hill, Jonathon T; Demarest, Bradley L; Bisgrove, Brent W; Gorsi, Bushra; Su, Yi-Chu; Yost, H Joseph

    2013-04-01

    Forward genetic screens in model organisms are vital for identifying novel genes essential for developmental or disease processes. One drawback of these screens is the labor-intensive and sometimes inconclusive process of mapping the causative mutation. To leverage high-throughput techniques to improve this mapping process, we have developed a Mutation Mapping Analysis Pipeline for Pooled RNA-seq (MMAPPR) that works without parental strain information or requiring a preexisting SNP map of the organism, and adapts to differential recombination frequencies across the genome. MMAPPR accommodates the considerable amount of noise in RNA-seq data sets, calculates allelic frequency by Euclidean distance followed by Loess regression analysis, identifies the region where the mutation lies, and generates a list of putative coding region mutations in the linked genomic segment. MMAPPR can exploit RNA-seq data sets from isolated tissues or whole organisms that are used for gene expression and transcriptome analysis in novel mutants. We tested MMAPPR on two known mutant lines in zebrafish, nkx2.5 and tbx1, and used it to map two novel ENU-induced cardiovascular mutants, with mutations found in the ctr9 and cds2 genes. MMAPPR can be directly applied to other model organisms, such as Drosophila and Caenorhabditis elegans, that are amenable to both forward genetic screens and pooled RNA-seq experiments. Thus, MMAPPR is a rapid, cost-efficient, and highly automated pipeline, available to perform mutant mapping in any organism with a well-assembled genome.
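
    A much-simplified sketch of the mapping idea (not the published MMAPPR code) is shown below: at each SNP the allele-frequency difference between the mutant and wild-type pools is summarized by a Euclidean distance, and the per-position scores are smoothed with a LOWESS/Loess fit so that the linked region stands out as a peak. The toy data are fabricated, and statsmodels is used here only as a convenient LOWESS implementation.

    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    rng = np.random.default_rng(0)
    pos = np.sort(rng.uniform(0, 50e6, 2000))                 # SNP positions on one chromosome
    # Mutant-pool alternate-allele frequency drifts toward 1.0 near the causative locus (25 Mb).
    mut_freq = np.clip(0.5 + 0.45 * np.exp(-((pos - 25e6) / 6e6) ** 2)
                       + rng.normal(0, 0.08, pos.size), 0, 1)
    wt_freq = np.clip(rng.normal(0.5, 0.08, pos.size), 0, 1)  # sibling pool stays near 0.5

    # Euclidean distance between the two pools' allele-frequency vectors at each SNP.
    distance = np.sqrt((mut_freq - wt_freq) ** 2 + ((1 - mut_freq) - (1 - wt_freq)) ** 2)
    smoothed = lowess(distance, pos, frac=0.2, return_sorted=False)

    peak = pos[np.argmax(smoothed)]
    print(f"predicted linked region centered near {peak / 1e6:.1f} Mb")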

  11. From Wellhead to Market. Oil Pipeline Tariffs and Tariff Methodologies in Selected Energy Charter Member Countries

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-01-15

    Freedom of energy transit is an important element of the Energy Charter process. The Energy Charter Treaty obliges its member countries to facilitate energy transit on a nondiscriminatory basis, and to refrain from imposing unreasonable delays, restrictions or charges on energy in transit. A main focus for the Energy Charter process has been the conditions for transit of natural gas. Tariffs, along with access to energy transit infrastructure, are the basis of free transit. To examine gas transit flows and tariff methodologies, the Energy Charter Secretariat published a study on gas transit tariffs in selected Energy Charter member countries in January 2006. This report follows on from the gas tariff study and examines oil transit flows and oil transit tariffs. The Energy Charter constituency in the land-locked part of the Eurasian continent has the world's largest oil pipeline system, which was originally built during the Soviet era. After the collapse of the Soviet Union, the pipeline system was divided into separate parts by the emergence of new borders, and oil transported by pipeline now has to cross multiple borders before it reaches its destination. The main objectives of this study are: to review transit tariff methodologies for existing and new oil transit pipeline systems across selected member countries of the Energy Charter; to compare transit tariff regimes with those for domestic transport; and to assess the overall consistency of these transit tariffs vis-a-vis the provisions of the Energy Charter Treaty and draft Transit Protocol. Geographically, this study covers the following key oil transit countries: in Eastern Europe, the Caucasus and Central Asia, the Russian Federation, Belarus, Ukraine, Azerbaijan, Kazakhstan and Georgia; and in Western Europe, France, Switzerland, Germany, Austria, Italy, Norway and the UK. Chapter 3 gives a brief review of the main domestic and cross-border oil flows in the countries examined. Chapter 4 describes essential

  12. Current Status and Prospects of Oil and Gas Pipelines in China

    Institute of Scientific and Technical Information of China (English)

    Pu Ming

    2010-01-01

    By the end of 2009, the total length of existing oil and gas pipelines in China had reached 75×10³ km. The pipelines include 38×10³ km of gas pipelines, 20×10³ km of crude oil pipelines and 17×10³ km of oil product pipelines, forming a trans-regional pipeline network for oil and gas delivery.

  13. 77 FR 46155 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2012-08-02

    ... Administration [Docket No. PHMSA-2012-0094] Pipeline Safety: Information Collection Activities AGENCY: Pipeline...) Current expiration date; (4) Type of request; (5) Abstract of the information collection activity; (6... activity. PHMSA requests comments on the following information collections: 1. Title: Pipeline...

  14. Optical fiber sensing technology in the pipeline industry

    Energy Technology Data Exchange (ETDEWEB)

    Braga, A.M.B.; Llerena, R.W.A. [Pontificia Univ. Catolica do Rio de Janeiro, RJ (Brazil). Dept. de Engenharia Mecanica]. E-mail: abraga@mec.puc-rio.br; roberan@mec.puc-rio.br; Valente, L.C.G.; Regazzi, R.D. [Gavea Sensors, Rio de Janeiro, RJ (Brazil)]. E-mail: guedes@gaveasensors.com; regazzi@gaveasensors.com

    2003-07-01

    This paper is concerned with applications of optical fiber sensors to pipeline monitoring. The basic principles of optical fiber sensors are briefly reviewed, with particular attention to fiber Bragg grating technology. Different potential applications in the pipeline industry are discussed, and an example of a pipeline strain monitoring system based on optical fiber Bragg grating sensors is presented. (author)
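    As background to how a fiber Bragg grating (FBG) strain-monitoring system of this kind infers strain from its optical measurement, the standard relation between Bragg wavelength shift and axial strain can be sketched as follows. The photo-elastic and thermal coefficients are typical textbook values for silica fibre, not parameters taken from the paper.

```python
# Textbook FBG strain conversion (illustrative coefficients; not from the cited system).
def fbg_strain(lambda_b_nm, delta_lambda_nm, delta_T_K=0.0,
               p_e=0.22, alpha=0.55e-6, xi=8.6e-6):
    """Convert a Bragg wavelength shift into axial strain.

    lambda_b_nm     : nominal Bragg wavelength (nm)
    delta_lambda_nm : measured wavelength shift (nm)
    delta_T_K       : temperature change (K), used to remove the thermal part of the shift
    p_e             : effective photo-elastic coefficient (typical ~0.22 for silica)
    alpha, xi       : thermal expansion and thermo-optic coefficients (typical silica values)
    """
    thermal_shift = lambda_b_nm * (alpha + xi) * delta_T_K   # shift caused by temperature alone
    strain = (delta_lambda_nm - thermal_shift) / (lambda_b_nm * (1.0 - p_e))
    return strain  # dimensionless; multiply by 1e6 for microstrain

# Example: a 1550 nm grating shifting by 12 pm with no temperature change
# corresponds to roughly 10 microstrain.
print(fbg_strain(1550.0, 0.012) * 1e6)
```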

  15. 77 FR 34458 - Pipeline Safety: Requests for Special Permit

    Science.gov (United States)

    2012-06-11

    ..., Alaska. The pipeline is intended to transport natural gas from the oil and gas producers on the Alaskan... received from Norgasco, Inc., and BreitBurn Energy Company LP, two natural gas pipeline operators, seeking... permits from two natural gas pipeline operators, Norgasco, Inc., (``NI''), and BreitBurn Energy Company...

  16. 49 CFR 192.627 - Tapping pipelines under pressure.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Tapping pipelines under pressure. 192.627 Section 192.627 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS... under pressure. Each tap made on a pipeline under pressure must be performed by a crew qualified to make...

  17. Flexibility, savings chief returns of new pipeline system

    Energy Technology Data Exchange (ETDEWEB)

    Broadbent, G.E.; Williams, T. (Pipelines Authority of South Australia, Glenside (AU))

    1990-04-30

    This paper reports on the development of an optimization system for the operation of a gas-pipeline network. The system employs a steady-state model to predict pipeline compressor configurations and setpoints. The system allowed greater accuracy in operating the pipeline network even when it was subject to highly transient flows. This article provides a detailed description of the system.
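    The kind of steady-state hydraulic relation such an optimizer evaluates for each segment can be illustrated with a Weymouth-type flow equation, in which the squared-pressure drop grows with the square of throughput. The lumped coefficient and the units below are hypothetical placeholders, not values from the system described above.

```python
# Illustrative steady-state segment model (not the optimizer described in the record):
# P_in^2 - P_out^2 = K * L * Q^2, with K lumping diameter, friction, gas properties
# and unit conversions into one assumed constant.
import math

def outlet_pressure(p_in_kpa, flow, length_km, k=2.0e-3):
    """Outlet pressure (kPa) of a segment for a given inlet pressure and flow.

    p_in_kpa  : inlet pressure (kPa absolute)
    flow      : throughput in hypothetical flow units (assumed consistent with k)
    length_km : segment length (km)
    k         : lumped resistance constant (assumed for illustration)
    """
    p_out_sq = p_in_kpa**2 - k * length_km * flow**2
    if p_out_sq <= 0:
        raise ValueError("Flow exceeds segment capacity; upstream compression is needed.")
    return math.sqrt(p_out_sq)

# A dispatcher-style check: does a 200 km segment sustain this flow without boosting?
print(outlet_pressure(p_in_kpa=7000.0, flow=3000.0, length_km=200.0))
```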

  18. Pipeline contribution to the Middle East oil trades

    Energy Technology Data Exchange (ETDEWEB)

    Khoja, B.A.

    1979-02-01

    The pipelines that physically exist in the Middle East are described. There are 7 pipelines in the Middle East that end in Mediterranean terminals with a total present capacity of 235 million tons annually. Three of the pipelines are out of service, three are being utilized only partially, and one is operating at full capacity. (MCW)

  19. 49 CFR 192.755 - Protecting cast-iron pipelines.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Protecting cast-iron pipelines. 192.755 Section... NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Maintenance § 192.755 Protecting cast-iron pipelines. When an operator has knowledge that the support for a segment of a buried...

  20. 76 FR 68828 - Pipeline Safety: Emergency Responder Forum

    Science.gov (United States)

    2011-11-07

    ... Administration [Docket ID PHMSA-2011-0295] Pipeline Safety: Emergency Responder Forum AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice of Forum. SUMMARY: PHMSA is co-sponsoring a one-day Emergency Responder Forum with the National Association of Pipeline Safety...

  1. Multibeam GPU Transient Pipeline for the Medicina BEST-2 Array

    CERN Document Server

    Magro, Alessio; Adami, Kristian Zarb

    2013-01-01

    Radio transient discovery using next generation radio telescopes will pose several digital signal processing and data transfer challenges, requiring specialized high-performance backends. Several accelerator technologies are being considered as prototyping platforms, including Graphics Processing Units (GPUs). In this paper we present a real-time pipeline prototype capable of processing multiple beams concurrently, performing Radio Frequency Interference (RFI) rejection through thresholding, correcting for the delay in signal arrival times across the frequency band using brute-force dedispersion, event detection and clustering, and finally candidate filtering, with the capability of persisting data buffers containing interesting signals to disk. This setup was deployed at the BEST-2 SKA pathfinder in Medicina, Italy, where several benchmarks and test observations of astrophysical transients were conducted. These tests show that on the deployed hardware eight 20MHz beams can be processed simultaneously for $\\s...
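    The brute-force dedispersion step mentioned above shifts each frequency channel by the cold-plasma dispersion delay before summing the band. A minimal NumPy sketch of that idea is given below; the band edges, sampling time and trial DM are chosen only for illustration and are not the deployed GPU kernel's parameters.

```python
# Minimal brute-force dedispersion sketch (illustrative; not the deployed GPU pipeline).
import numpy as np

K_DM = 4.148808e3  # dispersion constant in MHz^2 pc^-1 cm^3 s

def dedisperse(data, freqs_mhz, tsamp_s, dm):
    """Sum a (channels x samples) filterbank block along a single trial DM.

    data      : 2-D array, shape (n_chan, n_samp), one spectrum per time sample
    freqs_mhz : centre frequency of each channel (MHz)
    tsamp_s   : sampling time (s)
    dm        : trial dispersion measure (pc cm^-3)
    """
    f_ref = freqs_mhz.max()                              # align to the highest frequency
    delays_s = K_DM * dm * (freqs_mhz**-2 - f_ref**-2)   # per-channel dispersion delay
    shifts = np.round(delays_s / tsamp_s).astype(int)    # delay in whole samples
    out = np.zeros(data.shape[1] - shifts.max(), dtype=float)
    for chan, shift in enumerate(shifts):
        out += data[chan, shift:shift + out.size]        # undo the delay, then sum
    return out

# Example: 64 channels across 20 MHz, 1 ms sampling, one trial DM of 50 pc cm^-3.
rng = np.random.default_rng(0)
block = rng.normal(size=(64, 4096))
freqs = np.linspace(390.0, 410.0, 64)
series = dedisperse(block, freqs, 1e-3, dm=50.0)
```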

  2. Development and Application of Oil-Spill Risk Assessment Model for Offshore Pipeline

    Institute of Scientific and Technical Information of China (English)

    LU Yan; WANG Jia; WEI Wenpu; YANG Yong; AN Wei

    2014-01-01

    More attention has been paid to the potential oil-spill risk posed by offshore pipelines since the Dalian oil spill incident caused by a pipeline explosion, which raised the issue of how to prevent and control sudden oil spills from offshore pipelines. In this paper, we propose an optimized model that analyzes the main causes (probability) of a spill and its consequences using a fuzzy comprehensive assessment model. Considering the complicated assessment process for oil spills, an assessment factor system covering spill probability and consequence was established, based on the operating manual and statistical leakage/damage data for offshore pipelines, in order to estimate the integrated spill risk score automatically. The evaluated factors of spill probability are grouped into five aspects: corrosion, fatigue, natural damage, third party, and operational fault; the evaluated consequence factors include the hazard of the oil and the impact-control capability. With some modifications based on experts' opinions, each of the evaluated factors was assigned a relative weight and an evaluation criterion. A test example for an offshore pipeline in the Bohai waters is described to show in more detail how the model can be used for an actual case. With the oil-spill risk assessment model, it is easy to determine the risk level associated with an ongoing activity and management level and hence to take risk mitigation action immediately.
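    A fuzzy comprehensive evaluation of the kind described combines a weight vector over the evaluated factors with a membership matrix that maps each factor onto risk grades. The sketch below illustrates the mechanics only; the weights, grades and membership degrees are hypothetical placeholders, not the calibrated values from the model.

```python
# Sketch of a fuzzy comprehensive evaluation for spill probability
# (weights and membership degrees are hypothetical, for illustration only).
import numpy as np

factors = ["corrosion", "fatigue", "natural damage", "third party", "operational fault"]
grades = ["low", "medium", "high"]

# Relative weights of the five probability factors (assumed; sum to 1).
W = np.array([0.30, 0.15, 0.10, 0.30, 0.15])

# Membership matrix R: row i gives the degree to which factor i belongs to each risk grade.
R = np.array([
    [0.2, 0.5, 0.3],   # corrosion
    [0.6, 0.3, 0.1],   # fatigue
    [0.7, 0.2, 0.1],   # natural damage
    [0.1, 0.4, 0.5],   # third party
    [0.5, 0.4, 0.1],   # operational fault
])

B = W @ R                          # fuzzy composition (weighted-average operator)
B = B / B.sum()                    # normalise the grade memberships
score = B @ np.array([1, 2, 3])    # collapse to a single risk score via grade values

print(dict(zip(grades, B.round(3))), round(float(score), 2))
```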

  3. A computational platform for modeling and simulation of pipeline georeferencing systems

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, A.G.; Pellanda, P.C.; Gois, J.A. [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil); Roquette, P.; Pinto, M.; Durao, R. [Instituto de Pesquisas da Marinha (IPqM), Rio de Janeiro, RJ (Brazil); Silva, M.S.V.; Martins, W.F.; Camillo, L.M.; Sacsa, R.P.; Madeira, B. [Ministerio de Ciencia e Tecnologia (CT-PETRO2006MCT), Brasilia, DF (Brazil). Financiadora de Estudos e Projetos (FINEP). Plano Nacional de Ciencia e Tecnologia do Setor Petroleo e Gas Natural

    2009-07-01

    This work presents a computational platform for modeling and simulation of pipeline georeferencing systems, which was developed based on typical pipeline characteristics, on the dynamical modeling of the Pipeline Inspection Gauge (PIG) and on the analysis and implementation of an inertial navigation algorithm. The software environment for PIG trajectory simulation and navigation allows the user, through a friendly interface, to carry out evaluation tests of the inertial navigation system under different scenarios. It is therefore possible to define the required specifications of the pipeline georeferencing system components, such as: the required precision of the inertial sensors, the characteristics of the navigation auxiliary system (GPS-surveyed control points, odometers etc.), the pipeline construction information to be considered in order to improve trajectory estimation precision, and the signal processing techniques most suitable for the treatment of inertial sensor data. The simulation results are analyzed through the evaluation of several performance metrics usually considered in inertial navigation applications, and 2D and 3D plots of the trajectory estimation error and of the recovered trajectory in the three coordinates are made available to the user. This paper presents the simulation platform and its constituting modules and defines their functional characteristics and interrelationships. (author)
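    At its simplest, trajectory estimation for a PIG of this kind fuses odometer distance increments with an integrated heading. The planar dead-reckoning sketch below illustrates only that idea and is far simpler than the full strapdown inertial algorithm evaluated by the platform; the step sizes and yaw rates are invented for the example.

```python
# Planar dead-reckoning sketch (illustration only; the real system uses a full
# strapdown inertial algorithm aided by odometers and surveyed control points).
import numpy as np

def dead_reckon(odometer_increments_m, yaw_rates_rad_s, dt_s, heading0_rad=0.0):
    """Integrate odometer distance and gyro yaw rate into a 2-D trajectory.

    odometer_increments_m : distance travelled in each time step (m)
    yaw_rates_rad_s       : measured yaw rate in each time step (rad/s)
    dt_s                  : time step (s)
    """
    x, y, heading = 0.0, 0.0, heading0_rad
    track = [(x, y)]
    for ds, wz in zip(odometer_increments_m, yaw_rates_rad_s):
        heading += wz * dt_s              # integrate heading from the gyro
        x += ds * np.cos(heading)         # project the odometer increment onto x
        y += ds * np.sin(heading)         # and onto y
        track.append((x, y))
    return np.array(track)

# Example: a PIG moving 0.5 m per step through a gentle, constant left-hand bend.
path = dead_reckon([0.5] * 200, [0.002] * 200, dt_s=1.0)
```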

  4. Prediction Model Based on the Grey Theory for Tackling Wax Deposition in Oil Pipelines

    Institute of Scientific and Technical Information of China (English)

    Ming Wu; Shujuan Qiu; Jianfeng Liu; Ling Zhao

    2005-01-01

    Problems involving wax deposition seriously threaten crude oil pipelines both economically and operationally. Wax deposition in oil pipelines is a complicated problem with a number of uncertainties and indeterminations. The Grey System Theory is well suited to systems in which some information is clear and some is not, so it is an adequate framework for studying the process of wax deposition. In order to predict wax deposition along a pipeline accurately, the Grey Model was applied to fit the data of wax deposition rate and the thickness of the deposited wax layer on the pipe wall, and to give accurate forecasts of wax deposition in oil pipelines. The results showed that the average residual error of the Grey Prediction Model is smaller than 2%, and that the model exhibits high prediction accuracy. Our investigation proved that the Grey Model is a viable means for forecasting wax deposition. These findings offer valuable references for the oil industry and for firms dealing with wax cleaning in oil pipelines.
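    The GM(1,1) grey model underlying such a prediction fits a first-order differential equation to the accumulated series and extrapolates it. A compact sketch follows; the wax-thickness series is invented purely to exercise the code and is not data from the study.

```python
# Compact GM(1,1) grey prediction sketch (the input series is invented for illustration).
import numpy as np

def gm11_forecast(x0, n_ahead=3):
    """Fit a GM(1,1) model to a positive series x0 and forecast n_ahead further values."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                 # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # consecutive-neighbour means
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]        # develop coefficient a, grey input b
    k = np.arange(1, len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # fitted/forecast accumulated series
    x1_hat = np.concatenate([[x0[0]], x1_hat])
    return np.diff(x1_hat)                             # inverse AGO -> fitted + forecast values

# Example: hypothetical wax-layer thickness (mm) measured over six successive weeks.
thickness = [1.10, 1.32, 1.58, 1.87, 2.21, 2.60]
print(gm11_forecast(thickness, n_ahead=2).round(2))
```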

  5. The First Steps into a "Leaky Pipeline"

    DEFF Research Database (Denmark)

    Emerek, Ruth; Larsen, Britt Østergaard

    2011-01-01

    Research shows that the finding that the higher the level of academic position at a university, the lower the percentage of women among employees, also applies at Danish universities. This may be due to a historical backlog or merely to a 'leaky pipeline', as earlier studies have revealed that an increasing...

  6. Comb to Pipeline: Fast Software Encryption Revisited

    DEFF Research Database (Denmark)

    Bogdanov, Andrey; Lauridsen, Martin Mehl; Tischhauser, Elmar Wolfgang

    2015-01-01

    AES-NI, or Advanced Encryption Standard New Instructions, is an extension of the x86 architecture proposed by Intel in 2008. With a pipelined implementation utilizing AES-NI, parallelizable modes such as AES-CTR become extremely efficient. However, out of the four non-trivial NIST-recommended enc...

  7. Leadership Pipeline in the Public Sector

    DEFF Research Database (Denmark)

    Molly-Søholm, Thorkil; Dahl, Kristian Aagaard

    2013-01-01

    Over roughly the past ten years, the Leadership Pipeline theory has gained considerable ground as a descriptive framework for leadership and as a governing paradigm for the leadership infrastructure of large international corporations (Kaiser, 2011). The theory describes, among other things, which skills, priorities and work values must...

  8. Security Support in Continuous Deployment Pipeline

    DEFF Research Database (Denmark)

    Ullah, Faheem; Raft, Adam Johannes; Shahin, Mojtaba

    2017-01-01

    Continuous Deployment (CD) has emerged as a new practice in the software industry to continuously and automatically deploy software changes into production. Continuous Deployment Pipeline (CDP) supports CD practice by transferring the changes from the repository to production. Since most of the C...

  9. The Effect of Landslide on Gas Pipeline

    Directory of Open Access Journals (Sweden)

    Valkovič Vojtech

    2016-11-01

    The present paper deals with the calculation of stresses in a pipeline system embedded in a flexible substrate and loaded by a landslide, taking into account the probabilistic nature of the influences acting on the pipe, such as wall thickness, among others.

  10. System Reliability Assessment of Offshore Pipelines

    NARCIS (Netherlands)

    Mustaffa, Z.

    2011-01-01

    The title of this thesis, System Reliability Assessment of Offshore Pipelines, portrays the application of probabilistic methods in assessing the reliability of these structures. The main intention of this thesis is to identify, apply and judge the suitability of the probabilistic methods in evalua

  11. Underwater Adhesives Retrofit Pipelines with Advanced Sensors

    Science.gov (United States)

    2015-01-01

    Houston-based Astro Technology Inc. used a partnership with Johnson Space Center to pioneer an advanced fiber-optic monitoring system for offshore oil pipelines. The company's underwater adhesives allow it to retrofit older deepwater systems in order to measure pressure, temperature, strain, and flow properties, giving energy companies crucial data in real time and significantly decreasing the risk of a catastrophe.

  12. Remaining local buckling resistance of corroded pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Qishi [C-FER Technologies, Edmonton, AB (Canada); Khoo, Heng Aik [Carleton University, Ottawa, Ontario (Canada); Cheng, Roger [University of Alberta, Edmonton, AB (Canada); Zhou, Joe [TransCanada Pipelines Limited, Calgary, AB (Canada)

    2010-07-01

    The Pipeline Research Council International has undertaken a multi-year research program to investigate the local buckling (or wrinkling) of onshore pipelines affected by corrosion. Local buckling resistance depends on wall thickness and seems to be considerably reduced by metal-loss defects. Experimental data were lacking, which led to the use of overly conservative assumptions. C-FER and the University of Alberta conducted research in three phases in order to develop local buckling criteria for pipelines with corrosion defects. In Phase 1, the influence of various corrosion defect features was assessed with finite element analysis, and the ranking of key parameters was determined. On this basis, Phase 2 consisted in developing a test matrix and carrying out 10 full-scale tests to collect data. In Phase 3, finite element models were used to analyze over 150 parametric cases and develop criteria for assessing maximum moment and compressive strain limit. These criteria were applied to in-service pipelines with general corrosion features.

  13. Economic optimization of CO2 pipeline configurations

    NARCIS (Netherlands)

    Knoope, M.M.J.; Ramirez, C.A.; Faaij, A.P.C.

    2013-01-01

    In this article, an economic optimization tool is developed taking into account different steel grades, inlet pressure, diameter and booster stations for point-to-point pipelines as well as for simple networks. Preliminary results show that gaseous CO2 transport is cost effective for relatively smal
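    The trade-off such a tool explores, namely that larger diameters cost more in steel but save on booster stations and pumping energy, can be caricatured with a toy levelized-cost screen. Every coefficient below is a made-up placeholder, not a figure from the study.

```python
# Toy diameter screen for a point-to-point CO2 pipeline (all coefficients are placeholders).
def levelized_cost(diam_m, length_km, mass_mtpa,
                   pipe_cost_per_m_per_mdiam=2000.0,  # EUR per metre per metre of diameter (assumed)
                   booster_capex=5.0e6,                # EUR per booster station (assumed)
                   crf=0.1, energy_cost=2.0):          # capital recovery factor, EUR/tonne per stage (assumed)
    """Very rough levelized transport cost (EUR per tonne CO2) for one candidate diameter."""
    pipe_capex = pipe_cost_per_m_per_mdiam * diam_m * length_km * 1000.0
    # Pressure drop scales roughly with 1/diam^5 at fixed mass flow, so larger pipes need fewer boosters.
    boosters = max(0, round(length_km * (mass_mtpa / diam_m**5) * 1e-4))
    capex = pipe_capex + boosters * booster_capex
    opex = energy_cost * mass_mtpa * 1e6 * (boosters + 1) * 0.1   # placeholder pumping-energy term
    return (capex * crf + opex) / (mass_mtpa * 1e6)

# Screen a few candidate diameters for a hypothetical 150 km, 4 Mt/yr pipeline.
for d in (0.4, 0.6, 0.8, 1.0):
    print(d, round(levelized_cost(d, length_km=150.0, mass_mtpa=4.0), 2))
```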

  14. Pipeline: An Employment and Training Simulation.

    Science.gov (United States)

    Easterly, Jean L.; Meyer, David P.

    This monograph is one of a series developed for curriculum in higher education which prepares personnel for employment in local, state, and regional levels of Employment and Training Administration programs of the U.S. Department of Labor. This publication describes a simulation game called "Pipeline" rather than a regular university course. The…

  15. Foreign Giants Take Gas Pipeline Stake Equally

    Institute of Scientific and Technical Information of China (English)

    Xie Ye

    2002-01-01

    Oil giants Royal Dutch/Shell, ExxonMobil and Russia's Gazprom have agreed to take 15 percent stakes each in China's US$5.6 billion natural gas pipeline project, clearing away the final obstacles blocking the kickoff of the repeatedly delayed project, according to the latest reports from news media in early July.

  16. Tow techniques for marine pipeline installation

    NARCIS (Netherlands)

    Fernández, M.L.

    1981-01-01

    Tow techniques for marine pipelines frequently offer competitive and commercially attractive solutions over other installation methods and, on occasion, may represent the only alternative to traditional techniques. An assessment is also made of where each tow method is applicable and technically feas

  17. PANDA: a pipeline toolbox for analyzing brain diffusion images

    Directory of Open Access Journals (Sweden)

    Zaixu Cui

    2013-02-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named Pipeline for Analyzing braiN Diffusion imAges (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics (e.g., FA and MD) that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS) level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.

  18. PANDA: a pipeline toolbox for analyzing brain diffusion images.

    Science.gov (United States)

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named "Pipeline for Analyzing braiN Diffusion imAges" (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.
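    The subject-level parallelism described here, with each subject's DICOM-to-metrics chain run as an independent job across local cores, can be sketched with Python's multiprocessing module. The `process_subject` function and the directory layout are hypothetical stand-ins; PANDA itself is a MATLAB toolbox that dispatches its steps through PSOM.

```python
# Sketch of subject-level parallelism (hypothetical stand-in, not PANDA's MATLAB/PSOM machinery).
from multiprocessing import Pool
from pathlib import Path

def process_subject(subject_dir: str) -> str:
    """Placeholder for one subject's chain: convert DICOM/NIfTI, fit tensors, write FA/MD maps."""
    # ... run eddy-current correction, tensor fitting and normalisation for this subject ...
    return f"{subject_dir}: done"

def run_all(dataset_root: str, n_workers: int = 4):
    subjects = sorted(str(p) for p in Path(dataset_root).iterdir() if p.is_dir())
    with Pool(processes=n_workers) as pool:            # one independent job per subject
        for result in pool.imap_unordered(process_subject, subjects):
            print(result)

if __name__ == "__main__":
    run_all("/data/dmri_study", n_workers=4)            # hypothetical dataset path
```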

  19. Molgenis-impute: imputation pipeline in a box.

    Science.gov (United States)

    Kanterakis, Alexandros; Deelen, Patrick; van Dijk, Freerk; Byelas, Heorhiy; Dijkstra, Martijn; Swertz, Morris A

    2015-08-19

    Genotype imputation is an important procedure in current genomic analysis such as genome-wide association studies, meta-analyses and fine mapping. Although high quality tools are available that perform the steps of this process, considerable effort and expertise are required to set up and run a best practice imputation pipeline, particularly for larger genotype datasets, where imputation has to scale out in parallel on computer clusters. Here we present MOLGENIS-impute, an 'imputation in a box' solution that seamlessly and transparently automates the set up and running of all the steps of the imputation process. These steps include genome build liftover (liftovering), genotype phasing with SHAPEIT2, quality control, sample and chromosomal chunking/merging, and imputation with IMPUTE2. MOLGENIS-impute builds on MOLGENIS-compute, a simple pipeline management platform for submission and monitoring of bioinformatics tasks in High Performance Computing (HPC) environments like local/cloud servers, clusters and grids. All the required tools, data and scripts are downloaded and installed in a single step. Researchers with diverse backgrounds and expertise have tested MOLGENIS-impute at different locations and imputed over 30,000 samples so far using the 1,000 Genomes Project and new Genome of the Netherlands data as the imputation reference. The tests have been performed on PBS/SGE clusters, cloud VMs and in a grid HPC environment. MOLGENIS-impute gives priority to the ease of setting up, configuring and running an imputation. It has minimal dependencies and wraps the pipeline in a simple command line interface, without sacrificing flexibility to adapt or limiting the options of underlying imputation tools. It does not require knowledge of a workflow system or programming, and is targeted at researchers who just want to apply best practices in imputation via simple commands. It is built on the MOLGENIS compute workflow framework to enable customization with additional
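    The chromosomal chunking step mentioned above, in which each chromosome is split into fixed-size regions so imputation jobs can run in parallel and be merged afterwards, can be sketched as follows. The chunk size and the chromosome lengths are illustrative, not MOLGENIS-impute's defaults.

```python
# Sketch of chromosomal chunking for parallel imputation jobs
# (chunk size and chromosome lengths are illustrative, not MOLGENIS-impute defaults).
def make_chunks(chrom_lengths, chunk_bp=5_000_000):
    """Yield (chromosome, start, end) regions covering each chromosome in fixed-size chunks."""
    for chrom, length in chrom_lengths.items():
        start = 1
        while start <= length:
            end = min(start + chunk_bp - 1, length)
            yield chrom, start, end            # one imputation job per region
            start = end + 1

# Example: two approximate human chromosome lengths on build GRCh37.
jobs = list(make_chunks({"20": 63_025_520, "21": 48_129_895}))
print(len(jobs), jobs[0], jobs[-1])
```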

  20. 78 FR 77444 - Natural Gas Pipeline Company of America LLC; Stingray Pipeline Company, L.L.C.; Notice of...

    Science.gov (United States)

    2013-12-23

    ... Energy Regulatory Commission Natural Gas Pipeline Company of America LLC; Stingray Pipeline Company, L.L... America LLC (Natural), 3250 Lacey Road, 7th Floor, Downers Grove, Illinois 60515-7918 and Stingray Pipeline Company, L.L.C. (Stingray), 110 Louisiana Street, Suite 3300, Houston, Texas 77002, filed a...