WorldWideScience

Sample records for swath hdf4 files

  1. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

    We present a prototype study demonstrating that HDF4 file content maps can be used to organize data efficiently in cloud object storage systems and so facilitate cloud computing. Because the HDF4 file content map project began as a long-term preservation effort for NASA data that does not require the HDF4 APIs to access the data, this approach can be extended to other binary data formats and to existing big-data analytics solutions powered by cloud computing.
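    The core idea — reading a dataset by byte offset and length taken from a content map, with no HDF4 library involved — can be sketched as follows. This is an illustrative stand-in: real HDF4 file content maps are XML documents, and the map entry, header bytes, and dataset name below are invented for the example.

```python
import struct, tempfile, os

# Hypothetical content-map entry: where a dataset's bytes live inside the
# HDF4 file. Real content maps are XML; this dict is a simplified stand-in.
content_map = {"Temperature": {"offset": 8, "nbytes": 12, "dtype": "<i4"}}

# Build a stand-in "HDF4 file": 8 header bytes, then three int32 values.
payload = struct.pack("<3i", 10, 20, 30)
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x0e\x03\x13\x01HDF4")   # fake header (HDF4-like magic + filler)
    f.write(payload)
    path = f.name

# Read the dataset with plain byte-range I/O -- no HDF4 API involved.
entry = content_map["Temperature"]
with open(path, "rb") as f:
    f.seek(entry["offset"])
    raw = f.read(entry["nbytes"])
values = list(struct.unpack("<3i", raw))
print(values)  # [10, 20, 30]
os.remove(path)
```

    The same seek-and-read access pattern maps directly onto ranged GET requests against a cloud object store, which is what makes content maps attractive for cloud computing.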

  2. HDF-EOS Dump Tools

    Science.gov (United States)

    Prasad, U.; Rahabi, A.

    2001-05-01

    The following utilities, developed to dump HDF-EOS format data, are of special use for Earth science data from NASA's Earth Observing System (EOS). This poster demonstrates their use and application. The first four tools take HDF-EOS data files as input.

    HDF-EOS Metadata Dumper (metadmp): extracts metadata from EOS data granules. It operates by simply copying blocks of metadata from the file to standard output and does not process the metadata in any way. Since all metadata in EOS granules is encoded in the Object Description Language (ODL), the output of metadmp is in the form of complete ODL statements. EOS data granules may contain up to three different sets of metadata (Core, Archive, and Structural Metadata).

    HDF-EOS Contents Dumper (heosls): displays the contents of HDF-EOS files, providing detailed information on the POINT, SWATH, and GRID data sets in the files. For example, it will list the geolocation fields, data fields, and objects.

    HDF-EOS ASCII Dumper (asciidmp): extracts fields from EOS data granules into plain, human-readable ASCII text. With minor editing, asciidmp's output can be ingested by any application with ASCII import capabilities.

    HDF-EOS Binary Dumper (bindmp): dumps HDF-EOS objects in binary format. This is useful for feeding its output into existing programs that do not understand HDF, such as custom software and COTS products.

    HDF-EOS User Friendly Metadata (UFM): useful for viewing ECS metadata. UFM takes an EOSDIS ODL metadata file and produces an HTML report of the metadata for display in a web browser.

    HDF-EOS METCHECK (METCHECK): can be invoked from either a Unix or a DOS environment with a set of command-line options that direct the tool's inputs and output. METCHECK validates the inventory metadata in the .met file.
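    The behavior attributed to metadmp — copying ODL metadata blocks verbatim to standard output — can be sketched in a few lines. The ODL fragment and the regular expression below are illustrative only and are not taken from the actual tool.

```python
import re

# A toy ODL fragment of the kind metadmp copies verbatim from an EOS granule.
odl = """GROUP = INVENTORYMETADATA
  OBJECT = SHORTNAME
    VALUE = "MOD021KM"
  END_OBJECT = SHORTNAME
END_GROUP = INVENTORYMETADATA
END
"""

def extract_odl_groups(text):
    """Return the top-level ODL GROUP blocks found in `text`, verbatim."""
    pattern = re.compile(r"GROUP = (\w+).*?END_GROUP = \1\n", re.S)
    return [m.group(0) for m in pattern.finditer(text)]

blocks = extract_odl_groups(odl)
print(len(blocks))                 # 1
print(blocks[0].splitlines()[0])   # GROUP = INVENTORYMETADATA
```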

  3. Tuning HDF5 subfiling performance on parallel file systems

    Energy Technology Data Exchange (ETDEWEB)

    Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chaarawi, Mohamad [Intel Corp. (United States); Koziol, Quincey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mainzer, John [The HDF Group (United States); Willmore, Frank [The HDF Group (United States)

    2017-05-12

    Subfiling is a technique used on parallel file systems to reduce locking and contention issues when multiple compute nodes interact with the same storage target node. It provides a compromise between the single-shared-file approach, which instigates lock contention on parallel file systems, and the one-file-per-process approach, which generates a massive and unmanageable number of files. In this paper, we evaluate and tune the performance of the recently implemented subfiling feature in HDF5. Specifically, we explain the implementation strategy of the subfiling feature, provide examples of its use, and evaluate and tune its parallel I/O performance on the parallel file systems of the Cray XC40 system at NERSC (Cori), which include a burst buffer storage and a Lustre disk-based storage. We also evaluate I/O performance on the Cray XC30 system at NERSC, Edison. Our results show a performance advantage of 1.2x to 6x with subfiling compared to writing a single shared HDF5 file. We present our exploration of configurations, such as the number of subfiles and the number of Lustre storage targets used to store files, as optimization parameters for obtaining superior I/O performance. Based on this exploration, we discuss recommendations for achieving good I/O performance as well as limitations of the subfiling feature.
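    The subfiling idea itself — aggregating many writers into a few files, rather than one shared file or one file per process — can be illustrated with a toy sketch. This is not the HDF5 subfiling feature; the file naming and round-robin mapping below are invented for illustration.

```python
import os, tempfile

# Toy model of subfiling: `nranks` writers are aggregated into `nsub`
# subfiles instead of one shared file or one file per rank. The real HDF5
# feature coordinates this with MPI; plain files stand in here.
def write_subfiles(nranks, nsub, outdir):
    for rank in range(nranks):
        sub = rank % nsub                      # round-robin rank -> subfile
        name = os.path.join(outdir, f"data.h5.subfile_{sub}")
        with open(name, "ab") as f:
            f.write(rank.to_bytes(4, "little"))

outdir = tempfile.mkdtemp()
write_subfiles(nranks=8, nsub=2, outdir=outdir)
subfiles = sorted(os.listdir(outdir))
print(subfiles)   # ['data.h5.subfile_0', 'data.h5.subfile_1']
sizes = [os.path.getsize(os.path.join(outdir, s)) for s in subfiles]
print(sizes)      # [16, 16]  (4 ranks x 4 bytes each)
```

    Tuning, as the paper describes, then amounts to choosing `nsub` (and the storage targets behind each subfile) so that contention on any one file stays low without producing an unmanageable file count.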

  4. Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments.

    Science.gov (United States)

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-01-05

    We introduce Photon-HDF5, an open and efficient file format that simplifies the exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diodes, photomultiplier tubes, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamps, channel numbers, etc.) from any acquisition hardware, as well as setup and sample descriptions, information on provenance, authorship, and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples for reading Photon-HDF5 files in several programming languages and a reference Python library (phconvert) for creating new Photon-HDF5 files and converting several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
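    The self-describing layout described above can be illustrated with a minimal sketch of the hierarchy as a nested dictionary. A real Photon-HDF5 file would be written with HDF5 tooling (e.g. phconvert); the validator below is a hypothetical helper, and the specification itself should be consulted for the authoritative list of required fields.

```python
# Minimal sketch of a Photon-HDF5-style hierarchy as a nested dict. Field
# names follow the published layout (photon-hdf5.org), but this structure
# and the checker are illustrative only, not the official validator.
data = {
    "photon_data": {
        "timestamps": [120, 340, 551, 902],    # raw photon timestamps
        "detectors":  [0, 1, 0, 0],            # detector channel per photon
        "timestamps_specs": {"timestamps_unit": 12.5e-9},  # seconds per tick
    },
    "setup": {"num_spots": 1},
    "identity": {"software": "demo"},
}

def missing_fields(d, required):
    """Return the required paths (e.g. 'photon_data/timestamps') absent from d."""
    out = []
    for path in required:
        node = d
        for key in path.split("/"):
            if not isinstance(node, dict) or key not in node:
                out.append(path)
                break
            node = node[key]
    return out

required = ["photon_data/timestamps",
            "photon_data/timestamps_specs/timestamps_unit"]
print(missing_fields(data, required))  # []
```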

  5. Multiple Independent File Parallel I/O with HDF5

    Energy Technology Data Exchange (ETDEWEB)

    Miller, M. C.

    2016-07-13

    The HDF5 library has supported the I/O requirements of HPC codes at Lawrence Livermore National Laboratory (LLNL) since the late 1990s. In particular, HDF5 used in the Multiple Independent File (MIF) parallel I/O paradigm has supported LLNL codes' scalable I/O requirements and has recently been used successfully at scales as large as O(10^6) parallel tasks.
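    A minimal sketch of the MIF idea, assuming a simple round-robin assignment of tasks to files: P tasks share F files, far fewer than one file per task, and within each file's group the tasks write one at a time, in order (real MIF codes coordinate this with a "baton" passed between tasks); the scheduling function below is invented for illustration.

```python
# Toy sketch of the Multiple Independent File (MIF) pattern. Tasks in
# different groups write concurrently; the list order within each group is
# the baton order in which tasks take exclusive turns on that file.
def mif_schedule(ntasks, nfiles):
    """Map each task to a file; list order = write (baton) order per file."""
    schedule = {f: [] for f in range(nfiles)}
    for task in range(ntasks):
        schedule[task % nfiles].append(task)   # round-robin task -> file
    return schedule

sched = mif_schedule(ntasks=6, nfiles=2)
print(sched)   # {0: [0, 2, 4], 1: [1, 3, 5]}
```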

  6. Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments

    OpenAIRE

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-01-01

    We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel n...

  7. Experimental Directory Structure (Exdir): An Alternative to HDF5 Without Introducing a New File Format

    Directory of Open Access Journals (Sweden)

    Svenn-Arne Dragly

    2018-04-01

    Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. Exdir facilitates improved data storage, data sharing, reproducible research, and novel
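    The directory-based layout described above can be sketched with the standard library alone. Real Exdir stores attributes in YAML files and datasets as NumPy .npy files; to keep this sketch dependency-free, the metadata is written as plain text and the raw data as bare bytes, and the file names merely mirror the Exdir conventions.

```python
import os, tempfile

# Sketch of an Exdir-style hierarchy: groups are directories, metadata is a
# human-readable text file, and raw data lives directly in a subdirectory.
root = tempfile.mkdtemp(suffix=".exdir")

group = os.path.join(root, "session1")          # a "group" is a directory
os.makedirs(os.path.join(group, "raw"))
with open(os.path.join(group, "attributes.yaml"), "w") as f:
    f.write("experimenter: demo\n")             # human-readable metadata
with open(os.path.join(group, "raw", "trace.bin"), "wb") as f:
    f.write(bytes(range(8)))                    # raw data in a subdirectory

print(sorted(os.listdir(group)))   # ['attributes.yaml', 'raw']
```

    Because every group, attribute file, and dataset is an ordinary file-system object, version control systems and external tools can operate on the pieces directly, which is the advantage the abstract emphasizes over a single binary HDF5 file.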

  8. Experimental Directory Structure (Exdir): An Alternative to HDF5 Without Introducing a New File Format.

    Science.gov (United States)

    Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Lepperød, Mikkel E; Tennøe, Simen; Fyhn, Marianne; Hafting, Torkel; Malthe-Sørenssen, Anders

    2018-01-01

    Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. Exdir facilitates improved data storage, data sharing, reproducible research, and novel insight from

  9. CoastWatch Regions in HDF Format

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The mapped data derived from AVHRR is divided into files for CoastWatch regions of interest. Each file contains multiple data variables stored using the HDF-4...

  10. HDF-EOS Web Server

    Science.gov (United States)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A shell script has been written as a means of automatically making HDF-EOS-formatted data sets available via the World Wide Web. ("HDF-EOS" and variants thereof are defined in the first of the two immediately preceding articles.) The shell script chains together software tools developed by the Data Usability Group at Goddard Space Flight Center to perform the following actions: extract metadata in Object Definition Language (ODL) from an HDF-EOS file; convert the metadata from ODL to Extensible Markup Language (XML); reformat the XML metadata into human-readable Hypertext Markup Language (HTML); publish the HTML metadata and the original HDF-EOS file to a Web server and to an Open-source Project for a Network Data Access Protocol (OPeNDAP) server computer; and reformat the XML metadata and submit the resulting file to the EOS Clearinghouse, a Web-based metadata clearinghouse that facilitates searching for, and exchange of, Earth science data.

  11. Converting from XML to HDF-EOS

    Science.gov (United States)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A computer program recreates an HDF-EOS file from an Extensible Markup Language (XML) representation of the contents of that file. This program is one of two programs written to enable testing of the schemas described in the immediately preceding article to determine whether the schemas capture all details of HDF-EOS files.

  12. HDFITS: Porting the FITS data model to HDF5

    Science.gov (United States)

    Price, D. C.; Barsdell, B. R.; Greenhill, L. J.

    2015-09-01

    The FITS (Flexible Image Transport System) data format has been the de facto data format for astronomy-related data products since its inception in the late 1970s. While the FITS file format is widely supported, it lacks many of the features of more modern data serialization formats, such as the Hierarchical Data Format (HDF5). The HDF5 file format offers considerable advantages over FITS, such as improved I/O speed and compression, but has yet to gain widespread adoption within astronomy. One of the major obstacles is that HDF5 is not well supported by data reduction software packages and image viewers. Here, we present a comparison of FITS and HDF5 as formats for storage of astronomy datasets. We show that the underlying data model of FITS can be ported to HDF5 in a straightforward manner, and that by doing so the advantages of the HDF5 file format can be leveraged immediately. In addition, we present a software tool, fits2hdf, for converting between FITS and a new 'HDFITS' format, in which data are stored in HDF5 in a FITS-like manner. We show that HDFITS allows faster reading of data (up to 100x faster than FITS in some use cases) and improved compression (higher compression ratios and higher throughput). Finally, we show that by changing only the import lines in Python-based FITS utilities, HDFITS-formatted data can be presented transparently as an in-memory FITS equivalent.
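    The straightforward mapping the authors describe — FITS header cards becoming HDF5 attributes on a dataset that holds the image — can be sketched without either library. The header cards below are standard FITS keywords, but the dictionaries merely illustrate the correspondence; the actual conversion is performed by fits2hdf.

```python
# Illustrative sketch of the HDFITS idea: each FITS header card (keyword,
# value, comment) maps to an attribute on the HDF5 dataset holding the
# image, with the comment preserved alongside. Pure Python stands in for
# astropy/h5py here.
fits_header_cards = [
    ("SIMPLE", True, "conforms to FITS standard"),
    ("BITPIX", 32,   "bits per data value"),
    ("NAXIS",  2,    "number of data axes"),
]

attrs    = {kw: val for kw, val, _ in fits_header_cards}   # -> HDF5 attributes
comments = {kw: com for kw, _, com in fits_header_cards}   # -> comment attrs

print(attrs["BITPIX"])    # 32
print(comments["NAXIS"])  # number of data axes
```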

  13. Commonalities and differences between MDSplus and HDF5 data systems

    Energy Technology Data Exchange (ETDEWEB)

    Manduchi, G., E-mail: gabriele.manduchi@igi.cnr.i [Consorzio RFX, Euratom-ENEA Association, Corso Stati Uniti 4, Padova 35127 (Italy)

    2010-07-15

    MDSplus is a data acquisition system widely used in nuclear fusion experiments. It defines a file format for pulse files and provides a set of tools for data acquisition and management. The whole MDSplus package is used in several fusion experiments to set up and supervise the data acquisition process. Other experiments use only the data management layer of MDSplus to provide a common format for data exchange between plasma fusion laboratories. HDF5 is a file format and a data access library used by a larger community, mainly outside fusion. HDF5 is used, for example, in earth science research, defence applications and weather services. HDF5 allows managing large and complex data sets and provides a common data format among heterogeneous applications. Both MDSplus and HDF5 support a rich set of data types and a hierarchical data organization, as well as multi-language data access libraries. There are however several significant differences between the two system architectures, making each system better suited in different application contexts. The paper provides a brief overview of the data architectures of MDSplus and HDF5 and analyzes in detail the peculiar aspects of the two systems.

  14. Commonalities and differences between MDSplus and HDF5 data systems

    International Nuclear Information System (INIS)

    Manduchi, G.

    2010-01-01

    MDSplus is a data acquisition system widely used in nuclear fusion experiments. It defines a file format for pulse files and provides a set of tools for data acquisition and management. The whole MDSplus package is used in several fusion experiments to set up and supervise the data acquisition process. Other experiments use only the data management layer of MDSplus to provide a common format for data exchange between plasma fusion laboratories. HDF5 is a file format and a data access library used by a larger community, mainly outside fusion. HDF5 is used, for example, in earth science research, defence applications and weather services. HDF5 allows managing large and complex data sets and provides a common data format among heterogeneous applications. Both MDSplus and HDF5 support a rich set of data types and a hierarchical data organization, as well as multi-language data access libraries. There are however several significant differences between the two system architectures, making each system better suited in different application contexts. The paper provides a brief overview of the data architectures of MDSplus and HDF5 and analyzes in detail the peculiar aspects of the two systems.

  15. Photon-HDF5: Open Data Format and Computational Tools for Timestamp-based Single-Molecule Experiments.

    Science.gov (United States)

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-02-13

    Archival of experimental data in public databases has increasingly become a requirement of most funding agencies and journals. These data-sharing policies have the potential to maximize data reuse and to enable confirmatory as well as novel studies. However, the lack of standard data formats can severely hinder data reuse. In photon-counting-based single-molecule fluorescence experiments, data are stored in a variety of vendor-specific or even setup-specific (custom) file formats, making data interchange prohibitively laborious unless the same hardware-software combination is used. Moreover, the number of available techniques and setup configurations makes it difficult to find a common standard. To address this problem, we developed Photon-HDF5 (www.photon-hdf5.org), an open data format for timestamp-based single-molecule fluorescence experiments. Building on the solid foundation of HDF5, Photon-HDF5 provides a platform- and language-independent, easy-to-use file format that is self-describing and supports rich metadata. Photon-HDF5 supports different types of measurements by separating raw data (e.g., photon timestamps, detectors, etc.) from measurement metadata. This approach allows several measurement types and setup configurations to be represented within the same core structure and makes it possible to extend the format in a backward-compatible way. Complementing the format specifications, we provide open-source software to create and convert Photon-HDF5 files, together with code examples in multiple languages showing how to read Photon-HDF5 files. Photon-HDF5 allows sharing data in a format suitable for long-term archival, avoiding the effort of documenting custom binary formats and increasing interoperability with different analysis software. We encourage participation of the single-molecule community to extend interoperability and to help define future versions of Photon-HDF5.

  16. Incorporating ISO Metadata Using HDF Product Designer

    Science.gov (United States)

    Jelenak, Aleksandar; Kozimor, John; Habermann, Ted

    2016-01-01

    The need to store increasing amounts of metadata of varying complexity in HDF5 files is rapidly outgrowing the capabilities of the Earth science metadata conventions currently in use. Until now, data producers have had little choice but to devise ad hoc solutions to this challenge. Such solutions, in turn, pose a wide range of issues for data managers, distributors, and, ultimately, data users. The HDF Group is experimenting with a novel approach: using ISO 19115 metadata objects as a catch-all container for all the metadata that cannot be fitted into the current Earth science data conventions. This presentation will showcase how the HDF Product Designer software can be used to help data producers include various ISO metadata objects in their products.

  17. XML DTD and Schemas for HDF-EOS

    Science.gov (United States)

    Ullman, Richard; Yang, Jingli

    2008-01-01

    An Extensible Markup Language (XML) document type definition (DTD) standard for the structure and contents of HDF-EOS files and their contents, and an equivalent standard in the form of schemas, have been developed.

  18. Status of LOFAR Data in HDF5 Format

    NARCIS (Netherlands)

    Alexov, A.; Schellart, P.; ter Veen, S.; van den Akker, M.; Bähren, L.; Grießmeier, J.M.; Hessels, J.W.T.; Mol, J.D.; Renting, G.A.; Swinbank, J.; Wise, M.

    2012-01-01

    The Hierarchical Data Format, version 5 (HDF5) is a data model, library, and file format for storing and managing data. It is designed for flexible and efficient I/O and for high volume, complex data. The Low Frequency Array (LOFAR) project is solving the challenge of data size and complexity using

  19. MXA: a customizable HDF5-based data format for multi-dimensional data sets

    International Nuclear Information System (INIS)

    Jackson, M; Simmons, J P; De Graef, M

    2010-01-01

    A new digital file format is proposed for the long-term archival storage of experimental data sets generated by serial sectioning instruments. The format, known as the multi-dimensional eXtensible Archive (MXA) format, is based on the public-domain Hierarchical Data Format (HDF5). The MXA data model and its description by means of an eXtensible Markup Language (XML) file with an associated Document Type Definition (DTD) are described in detail. The public-domain MXA package is available through a dedicated web site (mxa.web.cmu.edu), along with implementation details and example data files.

  20. Trade Study: Storing NASA HDF5/netCDF-4 Data in the Amazon Cloud and Retrieving Data via Hyrax Server / THREDDS Data Server

    Science.gov (United States)

    Habermann, Ted; Jelenak, Aleksander; Lee, Joe; Yang, Kent; Gallagher, James; Potter, Nathan

    2017-01-01

    As part of the overall effort to understand the implications of migrating ESDIS data and services to the cloud, we are testing several common OPeNDAP and HDF use cases against three architectures for general performance and cost characteristics. The architectures involve retrieving entire files, retrieving datasets using HTTP range GET requests, and retrieving elements of datasets (chunks) with HTTP range GET requests. We will describe these architectures and discuss our approach to estimating cost.
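    The range-based architectures rest on a simple mechanism: asking the object store for only the byte range holding the needed chunk. A minimal sketch of constructing such a request with the Python standard library (the bucket URL and byte offsets are invented, and no request is actually sent):

```python
import urllib.request

# Sketch of the "HTTP range get" access pattern: fetch only the byte range
# holding one chunk of a dataset from object storage. Opening the URL with
# urllib.request.urlopen(req) would return just those bytes (status 206).
url = "https://example-bucket.s3.amazonaws.com/granule.h5"
offset, nbytes = 4096, 1024                     # hypothetical chunk location

req = urllib.request.Request(url)
req.add_header("Range", f"bytes={offset}-{offset + nbytes - 1}")
print(req.get_header("Range"))   # bytes=4096-5119
```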

  1. SWATH2stats: An R/Bioconductor Package to Process and Convert Quantitative SWATH-MS Proteomics Data for Downstream Analysis Tools.

    Science.gov (United States)

    Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi

    2016-01-01

    SWATH-MS is an acquisition and analysis technique for targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is popular open-source software for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis, different tools exist, such as MSstats, mapDIA, and aLFQ. However, transferring data from OpenSWATH to these downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotating the data, analyzing the variation and reproducibility of the measurements, estimating FDR, and applying advanced filtering before submitting the processed data to downstream tools. These functionalities are important for quickly assessing the quality of SWATH-MS data. Hence, SWATH2stats is a new open-source tool that combines several practical functionalities for analyzing, processing, and converting SWATH-MS data, and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.

  2. Application of HDF5 in long-pulse quasi-steady state data acquisition at high sampling rate

    International Nuclear Information System (INIS)

    Chen, Y.; Wang, F.; Li, S.; Xiao, B.J.; Yang, F.

    2014-01-01

    Highlights: • The new data-acquisition system supports long-pulse EAST data acquisition. • It can handle most of the high-frequency signals of EAST experiments. • The system's total throughput is about 500 MB/s. • The system uses HDF5 to store data. - Abstract: A new high-sampling-rate, quasi-steady-state data-acquisition system has been designed for the microwave reflectometry diagnostic of EAST experiments. To meet the requirements of long-pulse discharge and high sampling rates, it is based on PXI Express technology. A high-performance digitizer, the National Instruments PXIe-5122, with two synchronous analog input channels and a maximum sampling rate of 100 MHz, has been adopted. Two PXIe-5122 boards at 60 MSPS and one PXIe-6368 board at 2 MSPS are used in the system, for a total throughput of about 500 MB/s. To guarantee that the large amounts of data produced in a long-pulse discharge can be saved continuously, an external hard-disk data-stream enclosure, the NI HDD-8265, with a sustained read/write speed of 700 MB/s, is used; in RAID-5 mode its usable storage capacity is 80% of the total. The raw data first stream continuously into the NI HDD-8265 during the discharge; they are then transferred to the data server automatically and converted into the HDF5 file format. HDF5 is an open-source file format for data storage and management that has been widely used in various fields and is suitable for long-term storage. The details of the system are described in the paper.

  3. HDF5-Ph-Data format - Version 0.2 Draft

    OpenAIRE

    Antonino Ingargiola

    2014-01-01

    This format has been renamed Photon-HDF5 and the latest specification can be found at http://photon-hdf5.readthedocs.org/. This document contains the specifications for the HDF5-Ph-Data format. This format allows saving single-molecule spectroscopy experiments in which there is at least one stream of photon timestamps. It has been envisioned as a standard container format for a broad range of experiments involving confocal microscopy. Notable examples are confocal smFRET experiments (with or wit...

  4. CERES Clouds and Radiative Swath (CRS) data in HDF (CER_CRS_TRMM-PFM-VIRS_Edition2C)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The Clouds and Radiative Swath (CRS) product contains one hour of instantaneous Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The CRS contains all of the CERES SSF product data. For each CERES footprint on the SSF the CRS also contains vertical flux profiles evaluated at four levels in the atmosphere: the surface, 500-, 70-, and 1-hPa. The CRS fluxes and cloud parameters are adjusted for consistency with a radiative transfer model and adjusted fluxes are evaluated at the four atmospheric levels for both clear-sky and total-sky. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2000-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 hour; Temporal_Resolution_Range=Hourly - < Daily].

  5. Developing a middleware to support HDF data access in ArcGIS

    Science.gov (United States)

    Sun, M.; Jiang, Y.; Yang, C. P.

    2014-12-01

    Hierarchical Data Format (HDF) is the standard data format for NASA Earth Observing System (EOS) data products, such as the MODIS level-3 data. These data have been widely used in long-term studies of the land surface, biosphere, atmosphere, and oceans of the Earth. Several toolkits have been developed to access HDF data, such as the HDF viewer and the Geospatial Data Abstraction Library (GDAL). ArcGIS integrates GDAL, providing data users with a Graphical User Interface (GUI) to read HDF data. However, there are still some problems when using these toolkits: for example, 1) the projection information is not recognized correctly, 2) the image is displayed inverted, and 3) the tools lack the capability to read the third-dimension information stored in the data subsets. Accordingly, in this study we attempt to improve the current HDF toolkits to address the aforementioned issues. Considering the wide usage of ArcGIS, we develop a middleware for ArcGIS, based on GDAL, to solve the particular data-access problems that arise in ArcGIS, so that data users can access HDF data successfully and perform further analysis with the ArcGIS geoprocessing tools.

  6. CERES Clouds and Radiative Swath (CRS) data in HDF. (CER_CRS_Terra-FM2-MODIS_Edition2B)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The Clouds and Radiative Swath (CRS) product contains one hour of instantaneous Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The CRS contains all of the CERES SSF product data. For each CERES footprint on the SSF the CRS also contains vertical flux profiles evaluated at four levels in the atmosphere: the surface, 500-, 70-, and 1-hPa. The CRS fluxes and cloud parameters are adjusted for consistency with a radiative transfer model and adjusted fluxes are evaluated at the four atmospheric levels for both clear-sky and total-sky. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2001-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 hour; Temporal_Resolution_Range=Hourly - < Daily].

  7. CERES Clouds and Radiative Swath (CRS) data in HDF. (CER_CRS_Terra-FM2-MODIS_Edition2A

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The Clouds and Radiative Swath (CRS) product contains one hour of instantaneous Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The CRS contains all of the CERES SSF product data. For each CERES footprint on the SSF the CRS also contains vertical flux profiles evaluated at four levels in the atmosphere: the surface, 500-, 70-, and 1-hPa. The CRS fluxes and cloud parameters are adjusted for consistency with a radiative transfer model and adjusted fluxes are evaluated at the four atmospheric levels for both clear-sky and total-sky. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2001-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 hour; Temporal_Resolution_Range=Hourly - < Daily].

  8. 4 CFR 22.4 - Appeal File [Rule 4].

    Science.gov (United States)

    2010-01-01

    ... alternative organization of the appeal file is permitted, such as by document type or topic, documents within... 4 Accounts 1 2010-01-01 2010-01-01 false Appeal File [Rule 4]. 22.4 Section 22.4 Accounts... OFFICE CONTRACT APPEALS BOARD § 22.4 Appeal File [Rule 4]. (a) Duties of the Contracting Officer. (1...

  9. The effect of wide swathing on wilting times and nutritive value of alfalfa haylage.

    Science.gov (United States)

    Kung, L; Stough, E C; McDonell, E E; Schmidt, R J; Hofherr, M W; Reich, L J; Klingerman, C M

    2010-04-01

    On 3 consecutive cuttings, alfalfa from a single field was mowed with a John Deere 946 mower-conditioner (4-m cut width; Moline, IL) to leave narrow swaths (NS) ranging from 1.2 to 1.52 m wide (30-37% of cutter bar width) and wide swaths (WS) ranging from 2.44 to 2.74 m wide (62-67% of cutter bar width). Samples were collected from windrows and dry matter (DM) was monitored during wilting until a target of 43 to 45% DM was obtained. Forage from random windrows (n=4-6) was harvested by hand, chopped through a forage harvester before being packed in replicated vacuum-sealed bags, and allowed to ensile for 65 d. There was no swath width x cutting interaction for any parameter tested. Over all cuttings, the resulting silage DM was not different between the NS silage (43.8%) and the WS silage (44.9%). However, wide swathing greatly reduced the time of wilting before making silage. The hours of wilting time needed to reach the targeted DM for the NS silage compared with the WS silage at cuttings 1, 2, and 3 were 50 versus 29, 54 versus 28, and 25 versus 6, respectively. At the time of ensiling, the WS silage had more water-soluble carbohydrates (5.1%) than did the NS silage (3.7%). The WS silage had a lower pH (4.58) than did the NS silage (4.66), but swath width did not affect fermentation end products (lactic acid, acetic acid, and ethanol). The NS silage had more NH(3)-N (0.26%) than did the WS silage (0.21%). Wide swathing did not affect the concentration of ash or the digestibility of NDF, but it lowered the N content (NS=3.45%; WS=3.23%) and increased the ADF content (NS=39.7%; WS=40.9%) of the resulting silage. Wide swathing can markedly reduce the time that alfalfa must wilt before it can be chopped for silage, but under good conditions, as in this study, the resulting silage quality was generally not improved. Copyright (c) 2010 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  10. Treatment Time or Convection Volume in HDF : What Drives the Reduced Mortality Risk?

    NARCIS (Netherlands)

    de Roij van Zuijdewijn, Camiel L M; Nubé, Menso J.; ter Wee, Piet M.; Blankestijn, Peter J.; Lévesque, Renée; van den Dorpel, Marinus A.; Bots, Michiel L.; Grooteman, Muriel P C

    Background/Aims: Treatment time is associated with survival in hemodialysis (HD) patients and with convection volume in hemodiafiltration (HDF) patients. High-volume HDF is associated with improved survival. Therefore, we investigated whether this survival benefit is explained by treatment time.

  11. An Approach Using Parallel Architecture to Storage DICOM Images in Distributed File System

    International Nuclear Information System (INIS)

    Soares, Tiago S; Prado, Thiago C; Dantas, M A R; De Macedo, Douglas D J; Bauer, Michael A

    2012-01-01

    Telemedicine is a very important area of the medical field that is expanding daily, driven by the many researchers interested in improving medical applications. In Brazil, development began in 2005, in the State of Santa Catarina, of a server called the CyclopsDCMServer, whose purpose is to employ HDF for the manipulation of medical images (DICOM) using a distributed file system. Since then, several research efforts have been initiated in order to seek better performance. Our approach for this server adds a parallel implementation of the I/O operations, since HDF version 5 has a feature essential for our work: support for parallel I/O based upon the MPI paradigm. Early experiments using four parallel nodes provide good performance when compared to the serial HDF implementation in the CyclopsDCMServer.

  12. Trade Study: Storing NASA HDF5/netCDF-4 Data in the Amazon Cloud and Retrieving Data via the Hyrax Data Server

    Science.gov (United States)

    Habermann, Ted; Gallagher, James; Jelenak, Aleksandar; Potter, Nathan; Lee, Joe; Yang, Kent

    2017-01-01

    This study explored three candidate architectures, with different types of objects and access paths, for serving NASA Earth Science HDF5 data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the study were to: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and netCDF-4 data in a cloud (web object store) environment, the target environment being the AWS Simple Storage Service (S3); (2) conduct the level of software development needed to properly evaluate the solutions in the trade study and to obtain the benchmarking metrics required as input to a government decision on potential follow-on prototyping; and (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades. We describe the three architectures and the use cases, along with performance results and recommendations for further work.
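
    The cost model named in the third objective can be illustrated with a toy calculation. The sketch below is an assumption-laden stand-in for the study's actual model: the function name, the storage and request rates, and the idea that finer granulation multiplies GET requests per read are all illustrative, not quoted AWS prices.

```python
def monthly_cost(total_gb, objects_per_read, reads_per_month,
                 storage_rate_gb=0.023, get_rate_per_1000=0.0004):
    """Storage cost plus GET-request cost; all rates are placeholder values."""
    storage = total_gb * storage_rate_gb
    gets = reads_per_month * objects_per_read  # finer granules -> more GETs
    return storage + gets / 1000 * get_rate_per_1000

# Same 1 TB archive under two hypothetical granulation schemes:
# coarse objects need one GET per read, fine objects need ~100.
coarse = monthly_cost(1000, objects_per_read=1, reads_per_month=10_000)
fine = monthly_cost(1000, objects_per_read=100, reads_per_month=10_000)
```

Even this toy version shows the trade the study describes: storage cost is fixed by volume, so granulation choices surface almost entirely through request charges.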

  13. The Use of Variable Q1 Isolation Windows Improves Selectivity in LC-SWATH-MS Acquisition.

    Science.gov (United States)

    Zhang, Ying; Bilbao, Aivett; Bruderer, Tobias; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard; Varesio, Emmanuel

    2015-10-02

    As tryptic peptides and metabolites are not equally distributed along the mass range, the probability of cross fragment ion interference is higher in certain windows when fixed Q1 SWATH windows are applied. We evaluated the benefits of utilizing variable Q1 SWATH windows with regard to selectivity improvement. Variable windows based on equalizing the distribution of either the precursor ion population (PIP) or the total ion current (TIC) within each window were generated by an in-house software tool, swathTUNER. These two variable Q1 SWATH window strategies outperformed, with respect to quantification and identification, the basic approach using a fixed window width (FIX) for proteomic profiling of human monocyte-derived dendritic cells (MDDCs). Thus, 13.8 and 8.4% additional peptide precursors, which resulted in 13.1 and 10.0% more proteins, were confidently identified by SWATH using the PIP and TIC strategies, respectively, in the MDDC proteomic sample. On the basis of the spectral library purity score, some improvement warranted by variable Q1 windows was also observed, albeit to a lesser extent, in the metabolomic profiling of human urine. We show that the novel concept of "scheduled SWATH" proposed here, which incorporates (i) variable isolation windows and (ii) precursor retention time segmentation, further improves both peptide and metabolite identifications.
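
    The PIP strategy described above amounts to placing window boundaries at quantiles of the observed precursor m/z distribution, so each window isolates roughly the same number of precursors. A minimal sketch of that idea (the function name, the m/z range, and the absence of window overlap are assumptions; swathTUNER's actual boundary and overlap handling may differ):

```python
def variable_windows(precursor_mz, n_windows, mz_min=400.0, mz_max=1250.0):
    """Split [mz_min, mz_max] into n_windows holding roughly equal numbers
    of observed precursors (quantile boundaries; a PIP-style sketch)."""
    mz = sorted(m for m in precursor_mz if mz_min <= m <= mz_max)
    per_win = len(mz) / n_windows
    bounds = [mz_min]
    for k in range(1, n_windows):
        bounds.append(mz[int(k * per_win)])  # k-th quantile of the m/z list
    bounds.append(mz_max)
    return list(zip(bounds[:-1], bounds[1:]))

# Synthetic precursor list skewed toward low m/z, as tryptic digests often are
mzs = [400 + 850 * (i / 999) ** 2 for i in range(1000)]
windows = variable_windows(mzs, 5)
```

With the skewed input, the low-m/z windows come out narrower than the high-m/z ones, which is exactly the selectivity benefit the paper reports over fixed-width windows.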

  14. Swath sonar mapping of Earth's submarine plate boundaries

    Science.gov (United States)

    Carbotte, S. M.; Ferrini, V. L.; Celnick, M.; Nitsche, F. O.; Ryan, W. B. F.

    2014-12-01

    The recent loss of Malaysia Airlines flight MH370 in an area of the Indian Ocean where less than 5% of the seafloor is mapped with depth sounding data (Smith and Marks, EOS 2014) highlights the striking lack of detailed knowledge of the topography of the seabed for much of the world's oceans. Advances in swath sonar mapping technology over the past 30 years have led to dramatic improvements in our capability to map the seabed. However, the oceans are vast and only an estimated 10% of the seafloor has been mapped with these systems. Furthermore, the available coverage is highly heterogeneous and focused within areas of national strategic priority and community scientific interest. The major plate boundaries that encircle the globe, most of which are located in the submarine environment, have been a significant focus of marine geoscience research since the advent of swath sonar mapping. While the locations of these plate boundaries are well defined from satellite-derived bathymetry, significant regions remain unmapped at the high resolutions provided by swath sonars that are needed to study active volcanic and tectonic plate boundary processes. Within the plate interiors, some fossil plate boundary zones, major hotspot volcanoes, and other volcanic provinces have been the focus of dedicated research programs. Away from these major tectonic structures, swath mapping coverage is limited to sparse ocean transit lines, which often reveal previously unknown deep-sea channels and other little-studied sedimentary structures not resolvable in existing low-resolution global compilations, highlighting the value of these data even in the tectonically quiet plate interiors. Here, we give an overview of multibeam swath sonar mapping of the major plate boundaries of the globe as extracted from public archives. Significant quantities of swath sonar data acquired from deep-sea regions are in restricted-access international archives. Open access to more of these data sets would

  15. Interdisciplinary Research Scenario Testing of EOSDIS

    Science.gov (United States)

    Emmitt, G. D.

    1999-01-01

    During the reporting period, the Principal Investigator (PI) has continued to serve on numerous review panels, task forces, and committees with the goal of providing input and guidance for the Earth Observing System Data and Information System (EOSDIS) program at NASA Headquarters and NASA GSFC. In addition, the PI has worked together with personnel at the University of Virginia and the subcontractor (Simpson Weather Associates (SWA)) to continue to evaluate the latest releases of various versions of the user interfaces to the EOSDIS. Finally, as part of the subcontract, SWA has created an on-line Hierarchical Data Format (HDF) tutorial for non-HDF experts, particularly those that will be using EOSDIS and future EOS data products. A summary of these three activities is provided. The topics include: 1) Participation on EOSDIS Panels and Committees; 2) Evaluation and Tire Kicking of EOSDIS User Interfaces; and 3) An On-line HDF Tutorial. The report also includes attachments A, B, and C. Attachment A: Report From the May 1999 Science Data Panel. The topics include: 1) Summary of Data Panel Meeting; and 2) Panel's Comments/Recommendations. Attachment B: Survey Requesting Integrated Design Systems (IDS) Teams Input on the Descoping and Rescoping of the EOSDIS; and Attachment C: An HDF Tutorial for Beginners: EOSDIS Users and Small Data Providers (HTML Version). The topics include: 1) Tutorial Overview; 2) An Introduction to HDF; 3) The HDF Library: Software and Hardware; 4) Methods of Working with HDF Files; 5) Scientific Data API; 6) Attributes and Metadata; 7) Writing an SDS to an HDF File; 8) Obtaining Information on Existing HDF Files; 9) Reading a Scientific Data Set from an HDF File; 10) Example Programs; 11) Browsing and Visualizing HDF Data; and 12) Laboratory (Question and Answer).

  16. Study of Wide Swath Synthetic Aperture Ladar Imaging Technology

    Directory of Open Access Journals (Sweden)

    Zhang Keshu

    2017-02-01

    Combining synthetic-aperture imaging and coherent-light detection technology, the weak-signal identification capacity of Synthetic Aperture Ladar (SAL) reaches the photon level, and the image resolution exceeds the diffraction limit of the telescope, yielding high-resolution images irrespective of range. This paper introduces SAL, including its development path, its technology characteristics, and the restrictions on imaging swath. On this basis, we propose scanning-mode SAL technology for extending the swath. By analyzing the scanning operation mode and the signal model, the paper argues that the scanning mode will be the developmental trend of SAL technology. The paper also presents flight demonstrations of SAL and imaging results for remote targets, showing the potential of SAL in long-range, high-resolution, scanning-imaging applications. The technology and theory of the scanning mode of SAL compensate for the limitations in swath and operating efficiency of current SAL, and provide a scientific foundation for SAL systems applied to wide-swath, high-resolution Earth observation and ISAL systems applied to space-target imaging.

  17. Sandia Data Archive (SDA) file specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ao, Tommy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    The Sandia Data Archive (SDA) format is a specific implementation of the HDF5 (Hierarchical Data Format version 5) standard. The format was developed for storing data in a universally accessible manner. SDA files may contain one or more data records, each associated with a distinct text label. Primitive records provide basic data storage, while compound records support more elaborate grouping. External records allow text/binary files to be carried inside an archive and later recovered. This report documents version 1.0 of the SDA standard. The information provided here is sufficient for reading from and writing to an archive. Although the format was originally designed for use in MATLAB, broader use is encouraged.

  18. SwathProfiler and NProfiler: Two new ArcGIS Add-ins for the automatic extraction of swath and normalized river profiles

    Science.gov (United States)

    Pérez-Peña, J. V.; Al-Awabdeh, M.; Azañón, J. M.; Galve, J. P.; Booth-Rea, G.; Notti, D.

    2017-07-01

    The present-day wide availability of high-resolution Digital Elevation Models has improved tectonic geomorphology analyses in both their methodological aspects and their geological meaning. Analyses based on topographic profiles are valuable for exploring the short- and long-term landscape response to tectonic activity and climate changes. Swath profiles and river longitudinal profiles are two of the most widely used analyses for exploring these responses. Most of these morphometric analyses are conducted in GIS software, which has become a standard tool for analyzing drainage network metrics. In this work we present two ArcGIS Add-Ins to automatically delineate swath and normalized river profiles. Both tools are programmed in Visual Basic .NET and use the ArcObjects library architecture to access vector and raster data directly. The SwathProfiler Add-In allows analyzing the topography within a swath or band by representing maximum-minimum-mean elevations, the first and third quartiles, local relief, and hypsometry. We have defined a new transverse hypsometric integral index (THi) that analyzes hypsometry along the swath and offers valuable information in this kind of graphic. The NProfiler Add-In allows representing longitudinal normalized river profiles and their related morphometric indexes, such as normalized concavity (CT), maximum concavity (Cmax), and length of maximum concavity (Lmax). Both tools facilitate the spatial analysis of topography and drainage networks directly in a GIS environment such as ArcMap and provide graphical outputs. To illustrate how these tools work, we analyzed two study areas, the Sierra Alhamilla mountain range (Betic Cordillera, SE Spain) and the Eastern margin of the Dead Sea (Jordan). The first study area has recently been studied from a morphotectonic perspective, and these new tools add value to the previous studies.
The second study area has not been analyzed by quantitative tectonic geomorphology and the results suggest a landscape
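
    The swath statistics named above (maximum-minimum-mean elevations, first and third quartiles, local relief) can be sketched for a band of elevation cross-sections. This is an illustrative reimplementation, not SwathProfiler's code: the function name, the row-per-step data layout, and the simple index-based quartiles are all assumptions.

```python
def swath_profile(cross_sections):
    """For each step along the swath axis, summarize the elevations sampled
    across the band width (min/mean/max, rough quartiles, local relief)."""
    profile = []
    for xs in cross_sections:
        s = sorted(xs)
        n = len(s)
        profile.append({
            "min": s[0],
            "max": s[-1],
            "mean": sum(s) / n,
            "q1": s[n // 4],          # crude quartile by index, not a
            "q3": s[(3 * n) // 4 - 1],  # formal quantile estimator
            "relief": s[-1] - s[0],   # local relief across the band
        })
    return profile

# Two cross-sections: one with varied topography, one perfectly flat
band = [[4.0, 1.0, 3.0, 2.0], [10.0, 10.0, 10.0, 10.0]]
profile = swath_profile(band)
```

Plotting the min/mean/max series against distance along the swath axis reproduces the familiar swath-profile envelope these tools draw.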

  19. Multibeam swath bathymetry signal processing techniques

    Digital Repository Service at National Institute of Oceanography (India)

    Ranade, G.; Sudhakar, T.

    Recent mathematical advances and advances in real-time signal processing techniques have considerably improved the state of the art in bathymetry systems. These improvements have helped in developing high resolution swath...

  20. 12 CFR 5.4 - Filing required.

    Science.gov (United States)

    2010-01-01

    ... CORPORATE ACTIVITIES Rules of General Applicability § 5.4 Filing required. (a) Filing. A depository institution shall file an application or notice with the OCC to engage in corporate activities and... advise an applicant through a pre-filing communication to send the filing or submission directly to the...

  1. SWATH-MS data of Drosophila melanogaster proteome dynamics during embryogenesis

    Directory of Open Access Journals (Sweden)

    Bertrand Fabre

    2016-12-01

    Embryogenesis is one of the most important processes in the life of an animal. During this dynamic process, progressive cell division and cellular differentiation are accompanied by significant changes in protein expression at the level of the proteome. However, very few studies to date have described the dynamics of the proteome during the early development of an embryo in any organism. In this dataset, we monitor changes in protein expression across a time course of more than 20 h of Drosophila melanogaster embryonic development. Mass-spectrometry data were produced using a SWATH acquisition mode on a Sciex TripleTOF 6600. A spectral library built in-house was used to analyse these data, and more than 1950 proteins were quantified at each embryonic timepoint. The files presented here are a permanent digital map and can be reanalysed to test new hypotheses. The data have been deposited with the ProteomeXchange Consortium with the dataset identifier PRIDE: PXD0031078.

  2. Archive of side scan sonar and swath bathymetry data collected during USGS cruise 10CCT01 offshore of Cat Island, Gulf Islands National Seashore, Mississippi, March 2010

    Science.gov (United States)

    DeWitt, Nancy T.; Flocks, James G.; Pfeiffer, William R.; Wiese, Dana S.

    2010-01-01

    In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys east of Cat Island, Mississippi (fig. 1). The efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geological stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and provide protection for the historical Fort Massachusetts. For more information refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar (SSS) data. Data products herein include gridded and interpolated surfaces, surface images, and x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten FACS logs and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center assigns a unique identifier to each cruise or field activity. For example, 10CCT01 tells us the data were collected in 2010 for the Coastal Change and Transport (CCT) study and the data were collected during the first field

  3. The implementation of a data acquisition and service system based on HDF5

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y., E-mail: cheny@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); Wang, F.; Li, S. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); Xiao, B.J. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); School of nuclear science and technology, University of Science and Technology of China, Hefei, Anhui (China); Yang, F. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); Department of Computer Science, Anhui Medical University, Hefei, Anhui (China)

    2016-11-15

    Highlights: • A new data acquisition and service system has been designed and implemented for a new reversed field pinch (RFP) magnetic confinement device. • The new data acquisition and service system is based on HDF5. • It is an entire system including acquisition, storage, and data retrieval. • The system is easy to extend and maintain due to its modular design. - Abstract: A data acquisition and service system based on HDF5 has been designed. It includes four components: a data acquisition console, a data acquisition subsystem, a data archive system, and data services. The data acquisition console manages all DAQ information and controls the acquisition process. The data acquisition subsystem supports continuous data acquisition at different sampling rates, which can be divided into low, medium, and high levels. All experimental data are remotely transferred to the data archive system, which adopts HDF5 as its low-level data storage format. The hierarchical data structure of HDF5 is useful for efficiently managing the experimental data and allows users to define special data types and compression filters, which can be useful for dealing with special signals. Several data service tools have also been developed so that users can get data services via Client/Server or Browser/Server interfaces. The system will be demonstrated on the Keda Torus eXperiment (KTX) device, a new Reversed Field Pinch (RFP) magnetic confinement device. The details are presented in the paper.

  4. The implementation of a data acquisition and service system based on HDF5

    International Nuclear Information System (INIS)

    Chen, Y.; Wang, F.; Li, S.; Xiao, B.J.; Yang, F.

    2016-01-01

    Highlights: • A new data acquisition and service system has been designed and implemented for a new reversed field pinch (RFP) magnetic confinement device. • The new data acquisition and service system is based on HDF5. • It is an entire system including acquisition, storage, and data retrieval. • The system is easy to extend and maintain due to its modular design. - Abstract: A data acquisition and service system based on HDF5 has been designed. It includes four components: a data acquisition console, a data acquisition subsystem, a data archive system, and data services. The data acquisition console manages all DAQ information and controls the acquisition process. The data acquisition subsystem supports continuous data acquisition at different sampling rates, which can be divided into low, medium, and high levels. All experimental data are remotely transferred to the data archive system, which adopts HDF5 as its low-level data storage format. The hierarchical data structure of HDF5 is useful for efficiently managing the experimental data and allows users to define special data types and compression filters, which can be useful for dealing with special signals. Several data service tools have also been developed so that users can get data services via Client/Server or Browser/Server interfaces. The system will be demonstrated on the Keda Torus eXperiment (KTX) device, a new Reversed Field Pinch (RFP) magnetic confinement device. The details are presented in the paper.

  5. An HDF5-based framework for the distribution and analysis of ultrasonic concrete data

    Science.gov (United States)

    Prince, Luke; Clayton, Dwight; Santos-Villalobos, Hector

    2017-02-01

    There are many commercial ultrasonic tomography devices (UTDs) available for use in nondestructive evaluation (NDE) of reinforced concrete structures. These devices emit, measure, and store ultrasonic signals, typically in the 25 kHz to 5 MHz frequency range. UTDs are characterized by a composition of multiple transducers, also known as a transducer array or phased array. Often, UTD data are stored in a proprietary format. Consequently, NDE research data are limited to those who have prior non-disclosure agreements or the appropriate licenses. Thus, there is a need for a universal data framework in which proprietary file datasets for different concrete specimens can be converted, organized, and stored with the relevant metadata for individual or collaborative NDE research. Building upon the Hierarchical Data Format (HDF5) model, we have developed a UTD data management framework and Graphical User Interface (GUI) to promote the algorithmic reconstruction of ultrasonic data in a controlled environment for easily reproducible and publishable results.

  6. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries*

    Science.gov (United States)

    Wu, Jemma X.; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P.

    2016-01-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. PMID:27161445

  7. MODIS/Terra Calibrated Radiances 5-Min L1B Swath 1km V006

    Data.gov (United States)

    National Aeronautics and Space Administration — The MODIS/Terra Calibrated Radiances 5-Min L1B Swath 1km (MOD021KM) contains calibrated and geolocated at-aperture radiances for 36 discrete bands located in the 0.4...

  8. Effect of different types of blood purification treatment (HD and HDF) on the serum PTH, leptin and Hcy levels in patients with chronic renal failure

    International Nuclear Information System (INIS)

    Yao Yongliang

    2009-01-01

    Objective: To investigate the effect of different types of blood purification treatment (hemodialysis and hemodiafiltration) on the serum PTH, leptin and Hcy levels in patients with chronic renal failure (CRF). Methods: Serum levels of PTH (with ECLIA), leptin (with RIA) and Hcy (with biochemistry) were measured in 30 patients treated with hemodialysis (HD) and 30 patients treated with hemodiafiltration (HDF), both before and after treatment. Results: The concentration of PTH decreased significantly from 60.8±32.5 pmol/L to 28.2±17.2 pmol/L in patients treated with HDF (P 0.05). Yet the concentration of Hcy also decreased significantly from 31.5±10.5 μmol/L to 20.4±8.5 μmol/L (P<0.01) in patients treated with HD. Conclusion: HDF can eliminate serum PTH, leptin and Hcy better than HD does. (authors)

  9. Cloud Classification in Wide-Swath Passive Sensor Images Aided by Narrow-Swath Active Sensor Data

    Directory of Open Access Journals (Sweden)

    Hongxia Wang

    2018-05-01

    It is a challenge to distinguish between different cloud types because of the complexity and diversity of cloud coverage, which is a significant clutter source that impacts target detection and identification in the images of space-based infrared sensors. In this paper, a novel strategy for cloud classification in wide-swath passive sensor images is developed, aided by narrow-swath active sensor data. The strategy consists of three steps: orbit registration, most-matching donor pixel selection, and cloud type assignment for each recipient pixel. A new criterion for orbit registration is proposed so as to improve the matching accuracy. The most-matching donor pixel is selected via the Euclidean distance and the square sum of the radiance relative differences between the recipient and the potential donor pixels. Each recipient pixel is then assigned the cloud type that corresponds to its most-matching donor. Cloud classification of Moderate Resolution Imaging Spectroradiometer (MODIS) images is performed with the aid of data from the Cloud Profiling Radar (CPR). The results are compared with the CloudSat product 2B-CLDCLASS, as well as with those obtained using the method of the International Satellite Cloud Climatology Project (ISCCP), which demonstrates the superior classification performance of the proposed strategy.
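
    The donor-selection step can be sketched as follows. The abstract does not say how the Euclidean distance and the square sum of relative differences are combined, so the unweighted sum used below is an assumption, as are the function and variable names; radiances are assumed nonzero for the relative term.

```python
import math

def best_donor(recipient, donors):
    """Return the index of the donor pixel whose radiance vector best matches
    the recipient, scored by Euclidean distance plus the square sum of
    relative differences (combination weighting is an assumption)."""
    def score(d):
        eucl = math.sqrt(sum((r - x) ** 2 for r, x in zip(recipient, d)))
        rel = sum(((r - x) / r) ** 2 for r, x in zip(recipient, d))
        return eucl + rel
    return min(range(len(donors)), key=lambda i: score(donors[i]))

# Recipient radiances in two hypothetical channels, three candidate donors
idx = best_donor([100.0, 200.0], [[90.0, 210.0], [100.0, 199.0], [150.0, 250.0]])
```

The recipient pixel would then inherit the cloud type of donor `idx`, mirroring the assignment step described above.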

  10. Archive of Side Scan Sonar and Swath Bathymetry Data collected during USGS Cruise 10CCT02 Offshore of Petit Bois Island Including Petit Bois Pass, Gulf Islands National Seashore, Mississippi, March 2010

    Science.gov (United States)

    Pfeiffer, William R.; Flocks, James G.; DeWitt, Nancy T.; Forde, Arnell S.; Kelso, Kyle; Thompson, Phillip R.; Wiese, Dana S.

    2011-01-01

    In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys offshore of Petit Bois Island, Mississippi, and Dauphin Island, Alabama (fig. 1). These efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geologic stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and protection for the historical Fort Massachusetts on Ship Island, Mississippi. For more information please refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar data (SSS). Data products herein include gridded and interpolated surfaces, seabed backscatter images, and ASCII x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  11. Miniature Ka-band Automated Swath Mapper (KASM), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal discusses the development and demonstration of a swath-based airborne instrument suite intended as a calibration and validation with relevance to the...

  12. Swath width study. A simulation assessment of costs and benefits of a sensor system for agricultural application

    Science.gov (United States)

    1979-01-01

Satellites provide an excellent platform from which to observe crops on the scale and frequency required to provide accurate crop production estimates on a worldwide basis. Multispectral imaging sensors aboard these platforms are capable of providing data from which to derive acreage and production estimates. The issue of sensor swath width was examined. The quantitative trade studies necessary to resolve the combined issues of sensor swath width, number of platforms, and their orbits were generated and are included. Problems with different swath width sensors were analyzed, and an assessment of system trade-offs of swath width versus number of satellites was made for achieving Global Crop Production Forecasting.

  13. Extracting the Data From the LCM vk4 Formatted Output File

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-29

These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: vk4 file produced by Keyence VK Software, custom analysis, no off-the-shelf way to read the file, reading the binary data in a vk4 file, various offsets in decimal lines, finding the height image data, directly in MATLAB, binary output beginning of height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read vk4 file (choose a file, read the file, compute offsets, read optical image, laser optical image, read and compute laser intensity image, read height image, timing, display height image, display laser intensity image, display RGB laser optical images, display RGB optical images, display beginning data and save images to workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, observations.
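The slides describe locating the image blocks by following byte offsets within the binary container. A minimal Python sketch of that offset-chasing pattern; the offsets, field order, and the stand-in buffer below are hypothetical illustrations, not the documented vk4 layout:

```python
import struct

def read_uint32(buf, offset):
    """Read a little-endian unsigned 32-bit integer at a byte offset."""
    return struct.unpack_from("<I", buf, offset)[0]

# Build a tiny stand-in buffer: a 4-byte magic string, then a 4-byte offset
# pointing at an image header block holding width and height (values invented).
blob = (b"VK4\x00"
        + struct.pack("<I", 16)        # offset of the image header
        + b"\x00" * 8                  # padding up to that offset
        + struct.pack("<II", 1024, 768))

table_offset = read_uint32(blob, 4)            # where the image header lives
width = read_uint32(blob, table_offset)        # first header field
height = read_uint32(blob, table_offset + 4)   # second header field
```

The same pattern (read an offset, seek, decode fixed-width fields) is what the MATLAB code on the slides does against the real vk4 offsets.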

  14. Adding Data Management Services to Parallel File Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Scott [Univ. of California, Santa Cruz, CA (United States)

    2015-03-04

The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit; the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file
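The abstract does not show Damasc's declarative interface itself, but the underlying idea (evaluate a query against structured data in the file rather than hand-writing an imperative kernel over the whole byte stream) can be roughly illustrated with h5py. The file and dataset names here are invented for the sketch:

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "sim.h5")

# Write one structured dataset into an HDF5 file (stand-in for simulation output).
with h5py.File(path, "w") as f:
    f.create_dataset("temperature", data=np.linspace(200.0, 400.0, 101))

# "SELECT temperature WHERE temperature > 350" expressed as a mask,
# evaluated against just this dataset without touching anything else in the file.
with h5py.File(path, "r") as f:
    temps = f["temperature"][:]
    hot = temps[temps > 350.0]
```

A system like Damasc would push this kind of selection below the file system interface instead of materializing the full array in the client, but the selection semantics are the same.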

  15. Improving Seismic Data Accessibility and Performance Using HDF Containers

    Science.gov (United States)

    Evans, B. J. K.; Wang, J.; Yang, R.

    2017-12-01

The performance of computational geophysical data processing and forward modelling relies on both computation and data access. Significant efforts on developing new data formats and libraries have been made by the community, such as the IRIS/PASSCAL and ASDF data formats, and programs and utilities such as ObsPy and SPECFEM. The National Computational Infrastructure hosts a nationally significant geophysical data collection that is co-located with a high performance computing facility, providing an opportunity to investigate how to improve the data formats from both a data management and a performance point of view. This paper investigates how to enhance data usability from several perspectives: 1) propose a convention for the seismic (both active and passive) community to improve data accessibility and interoperability; 2) recommend the convention used in the HDF container when data is made available in PH5 or ASDF formats; 3) provide tools to convert between various seismic data formats; 4) provide performance benchmark cases using the ObsPy library and SPECFEM3D to demonstrate how different data organizations, in terms of chunk size and compression, impact performance, by comparing new data formats such as PH5 and ASDF to traditional formats such as SEGY, SEED, and SAC. In this work we apply our knowledge and experience of data standards and conventions, such as CF and ACDD from the climate community, to the seismology community. The generic global attributes widely used in the climate community are combined with the existing conventions in the seismology community, such as CMT, QuakeML, StationXML, and the SEGY header convention. We also extend the convention by including provenance and benchmarking records so that the user can learn the footprint of the data together with its baseline performance. In practice we convert example wide-angle reflection seismic data from SEGY to PH5 or ASDF using the ObsPy and pyasdf libraries. It quantitatively demonstrates how the
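The benchmark cases mentioned above vary chunk size and compression inside the HDF container. A small h5py sketch of the two knobs being compared; the chunk shape and gzip level here are illustrative choices, not the paper's settings:

```python
import os
import tempfile

import h5py
import numpy as np

# Stand-in for a block of seismic traces: 1000 traces of 100 samples each.
data = np.random.rand(1000, 100).astype("f4")
path = os.path.join(tempfile.mkdtemp(), "waveforms.h5")

with h5py.File(path, "w") as f:
    # Chunk shape and compression level are the tunable settings whose
    # performance impact the benchmark measures across formats.
    f.create_dataset("traces", data=data,
                     chunks=(100, 100),
                     compression="gzip", compression_opts=4)

with h5py.File(path, "r") as f:
    chunk_shape = f["traces"].chunks
    restored = f["traces"][:]
```

Reading a subset of traces then only decompresses the chunks that intersect the request, which is why chunk shape matters for access patterns like single-station reads.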

  16. Forage intake and wastage by ewes in pea/hay barley swath grazing and bale feeding systems

    Science.gov (United States)

    Harvested feed costs, particularly during the winter, are traditionally the highest input associated with a ruminant livestock operation. Although swath grazing has been practiced for over 100 years and literature exists for cattle use of swath grazing, no published results are available on use of s...

  17. 4 CFR 21.2 - Time for filing.

    Science.gov (United States)

    2010-01-01

    ... after bid opening or the closing time for receipt of proposals. (b) Protests untimely on their face may... 4 Accounts 1 2010-01-01 2010-01-01 false Time for filing. 21.2 Section 21.2 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES BID PROTEST REGULATIONS § 21.2 Time for filing. (a)(1) Protests based...

  18. London SPAN version 4 parameter file format

    International Nuclear Information System (INIS)

    2004-06-01

    Powernext SA is a Multilateral Trading Facility in charge of managing the French power exchange through an optional and anonymous organised trading system. Powernext SA collaborates with the clearing organization LCH.Clearnet SA to secure and facilitate the transactions. The French Standard Portfolio Analysis of Risk (SPAN) is a system used by LCH.Clearnet to calculate the initial margins from and for its clearing members. SPAN is a computerized system which calculates the impact of several possible variations of rates and volatility on by-product portfolios. The initial margin call is equal to the maximum probable loss calculated by the system. This document contains details of the format of the London SPAN version 4 parameter file. This file contains all the parameters and risk arrays required to calculate SPAN margins. London SPAN Version 4 is an upgrade from Version 3, which is also known as LME SPAN. This document contains the full revised file specification, highlighting the changes from Version 3 to Version 4

  19. ANTIMICROBIAL BIO-NONWOVEN FABRICS FOR EYES'S SWATH AND DIAPERS FOR INFANT'S INCUBATORS

    Directory of Open Access Journals (Sweden)

    ElSayed A. ElNashar

    2016-12-01

An infant incubator is a piece of equipment common to pediatric hospitals, birthing centers and neonatal intensive care units. While the unit may serve several specific functions, it is generally used to provide a safe and stable environment for newborn infants, often those who were born prematurely or with an illness or disability that makes them especially vulnerable for the first several months of life. The objective of this research was to gain a better understanding of a new approach for bio-nonwoven fabrics in the infant incubator, in terms of specific materials such as MaterBi/PCL® bioplastic, the elements of comfort and the drivers associated with it, and the biodegradation of its waste by different methods. Shortly after birth, in the first hours of life, babies may develop neonatal jaundice caused by a byproduct of red blood cell decomposition. There are many convenient features to consider for the two basic disposable eye-swathe and diaper options in an infant incubator: the cloth of the basic disposable eye swathe and diapers, with their end-use properties. The form design of the eye swathe® and diaper® shapes for the infant incubator stage should then consider convenience, cost, and environmental waste.

  20. ASTC-MIMO-TOPS Mode with Digital Beam-Forming in Elevation for High-Resolution Wide-Swath Imaging

    Directory of Open Access Journals (Sweden)

    Pingping Huang

    2015-03-01

Future spaceborne synthetic aperture radar (SAR) missions require complete and frequent coverage of the earth with a high resolution. Terrain Observation by Progressive Scans (TOPS) is a novel wide swath mode but has impaired azimuth resolution. In this paper, an innovative extended TOPS mode named Alamouti Space-Time Coding multiple-input multiple-output TOPS (ASTC-MIMO-TOPS), combined with digital beam-forming (DBF) in elevation and multi-aperture SAR signal reconstruction in azimuth, is proposed. This innovative mode achieves wide-swath coverage with a high geometric resolution and also overcomes major drawbacks in conventional MIMO SAR systems. The data processing scheme of this imaging scheme is presented in detail. The designed system example of the proposed ASTC-MIMO-TOPS mode, which has the imaging capacity of a 400 km wide swath with an azimuth resolution of 3 m, is given. Its system performance analysis results and simulated imaging results on point targets demonstrate the potential of the proposed novel spaceborne SAR mode for high-resolution wide-swath (HRWS) imaging.

  1. Effects of cross-education on the muscle after a period of unilateral limb immobilization using a shoulder sling and swathe.

    Science.gov (United States)

    Magnus, Charlene R A; Barss, Trevor S; Lanovaz, Joel L; Farthing, Jonathan P

    2010-12-01

The purpose of this study was to apply cross-education during 4 wk of unilateral limb immobilization using a shoulder sling and swathe to investigate the effects on muscle strength, muscle size, and muscle activation. Twenty-five right-handed participants were assigned to one of three groups as follows: the Immob + Train group wore a sling and swathe and strength trained (n = 8), the Immob group wore a sling and swathe and did not strength train (n = 8), and the Control group received no treatment (n = 9). Immobilization was applied to the nondominant (left) arm. Strength training consisted of maximal isometric elbow flexion and extension of the dominant (right) arm 3 days/wk. Torque (dynamometer), muscle thickness (ultrasound), maximal voluntary activation (interpolated twitch), and electromyography (EMG) were measured. The change in right biceps and triceps brachii muscle thickness [7.0 ± 1.9 and 7.1 ± 2.2% (SE), respectively] was greater for Immob + Train than Immob (0.4 ± 1.2 and -1.9 ± 1.7%) and Control (0.8 ± 0.5 and 0.0 ± 1.1%; P < 0.05). Immobilization had no effect on maximal voluntary activation or EMG. The cross-education effect on the immobilized limb was greater after elbow extension training. This study suggests that strength training the nonimmobilized limb benefits the immobilized limb for muscle size and strength.

  2. Definition of a RACK1 Interaction Network in Drosophila melanogaster Using SWATH-MS.

    Science.gov (United States)

    Kuhn, Lauriane; Majzoub, Karim; Einhorn, Evelyne; Chicher, Johana; Pompon, Julien; Imler, Jean-Luc; Hammann, Philippe; Meignin, Carine

    2017-07-05

    Receptor for Activated protein C kinase 1 (RACK1) is a scaffold protein that has been found in association with several signaling complexes, and with the 40S subunit of the ribosome. Using the model organism Drosophila melanogaster , we recently showed that RACK1 is required at the ribosome for internal ribosome entry site (IRES)-mediated translation of viruses. Here, we report a proteomic characterization of the interactome of RACK1 in Drosophila S2 cells. We carried out Label-Free quantitation using both Data-Dependent and Data-Independent Acquisition (DDA and DIA, respectively) and observed a significant advantage for the Sequential Window Acquisition of all THeoretical fragment-ion spectra (SWATH) method, both in terms of identification of interactants and quantification of low abundance proteins. These data represent the first SWATH spectral library available for Drosophila and will be a useful resource for the community. A total of 52 interacting proteins were identified, including several molecules involved in translation such as structural components of the ribosome, factors regulating translation initiation or elongation, and RNA binding proteins. Among these 52 proteins, 15 were identified as partners by the SWATH strategy only. Interestingly, these 15 proteins are significantly enriched for the functions translation and nucleic acid binding. This enrichment reflects the engagement of RACK1 at the ribosome and highlights the added value of SWATH analysis. A functional screen did not reveal any protein sharing the interesting properties of RACK1, which is required for IRES-dependent translation and not essential for cell viability. Intriguingly however, 10 of the RACK1 partners identified restrict replication of Cricket paralysis virus (CrPV), an IRES-containing virus. Copyright © 2017 Kuhn et al.

  3. H5Part A Portable High Performance Parallel Data Interface for Particle Simulations

    CERN Document Server

    Adelmann, Andreas; Shalf, John M; Siegerist, Cristina

    2005-01-01

Large parallel particle simulations in six-dimensional phase space generate vast amounts of data. It is also desirable to share data and data analysis tools such as ParViT (Particle Visualization Toolkit) among other groups who are working on particle-based accelerator simulations. We define a very simple file schema built on top of HDF5 (Hierarchical Data Format version 5) as well as an API that simplifies the reading/writing of the data to the HDF5 file format. HDF5 offers a self-describing machine-independent binary file format that supports scalable parallel I/O performance for MPI codes on a variety of supercomputing systems and works equally well on laptop computers. The API is available for C, C++, and Fortran codes. The file format will enable disparate research groups with very different simulation implementations to share data transparently and share data analysis tools. For instance, the common file format will enable groups that depend on completely different simulation implementations to share c...
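A sketch of such a schema in h5py: one group per time step, one dataset per particle property. The `Step#n` group naming follows the commonly described H5Part layout, but treat the exact names here as illustrative rather than the normative schema:

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "particles.h5")
n = 1000                         # particles per step
rng = np.random.default_rng(0)

# Write three time steps, each with per-particle coordinate arrays.
with h5py.File(path, "w") as f:
    for step in range(3):
        g = f.create_group(f"Step#{step}")        # one group per time step
        for coord in ("x", "y", "z"):
            g.create_dataset(coord, data=rng.random(n))

# Any HDF5-aware tool can now discover the steps and read one property
# without knowing anything about the simulation that produced the file.
with h5py.File(path, "r") as f:
    steps = sorted(f.keys())
    x0 = f["Step#0/x"][:]
```

Because the layout is plain HDF5 groups and datasets, the "disparate research groups" point above follows directly: no H5Part-specific reader is strictly required.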

  4. 77 FR 786 - BOST4 Hydroelectric Company, LLC, (BOST4); Notice of Application Accepted for Filing and...

    Science.gov (United States)

    2012-01-06

    ... Hydroelectric Company, LLC, (BOST4); Notice of Application Accepted for Filing and Soliciting Motions To Intervene and Protests Take notice that the following hydroelectric application has been filed with the... No.: P-12757-003. c. Date filed: February 24, 2011. d. Applicant: BOST4 Hydroelectric Company, LLC...

  5. Defining and Verifying Research Grade Airborne Laser Swath Mapping (ALSM) Observations

    Science.gov (United States)

    Carter, W. E.; Shrestha, R. L.; Slatton, C. C.

    2004-12-01

The first and primary goal of the National Science Foundation (NSF) supported Center for Airborne Laser Mapping (NCALM), operated jointly by the University of Florida and the University of California, Berkeley, is to make "research grade" ALSM data widely available at affordable cost to the national scientific community. Cost aside, researchers need to know what NCALM considers research grade data and how the quality of the data is verified, to be able to determine the likelihood that the data they receive will meet their project-specific requirements. Given the current state of the technology, it is reasonable to expect a well planned and executed survey to produce surface elevations with uncertainties less than 10 centimeters and horizontal uncertainties of a few decimeters. Various components of the total error are generally associated with the aircraft trajectory, aircraft orientation, or laser vectors. Aircraft trajectory error is dependent largely on the Global Positioning System (GPS) observations, aircraft orientation on Inertial Measurement Unit (IMU) observations, and laser vectors on the scanning and ranging instrumentation. In addition to the issue of the precision or accuracy of the coordinates of the surface points, consideration must also be given to the point-to-point spacing and voids in the coverage. The major sources of error produce distinct artifacts in the data set. For example, aircraft trajectory errors tend to change slowly as the satellite constellation geometry varies, producing slopes within swaths and offsets between swaths. Roll, pitch and yaw biases in the IMU observations tend to persist through whole flights, and create distinctive artifacts in the swath overlap areas. Errors in the zero-point and scale of the laser scanner cause the edges of swaths to turn up or down. Range walk errors cause offsets between bright and dark surfaces, causing paint stripes to float above the dark surfaces of roads. The three keys to producing

  6. Data Elevator

    Energy Technology Data Exchange (ETDEWEB)

    2017-04-29

Data Elevator: Efficient Asynchronous Data Movement in Hierarchical Storage Systems. Multi-layer storage subsystems, including SSD-based burst buffers and disk-based parallel file systems (PFS), are becoming part of HPC systems. However, software for this storage hierarchy is still in its infancy. Applications may have to explicitly move data among the storage layers. We propose Data Elevator for transparently and efficiently moving data between a burst buffer and a PFS. Users specify the final destination for their data, typically on the PFS; Data Elevator intercepts the I/O calls, stages data on the burst buffer, and then asynchronously transfers the data to the final destination in the background. This system allows extensive optimizations, such as overlapping read and write operations, choosing I/O modes, and aligning buffer boundaries. In tests with large-scale scientific applications, Data Elevator is as much as 4.2X faster than Cray DataWarp, the state-of-the-art software for burst buffers, and 4X faster than directly writing to the PFS. The Data Elevator library uses HDF5's Virtual Object Layer (VOL) for intercepting parallel I/O calls that write data to the PFS. The intercepted calls are redirected to the Data Elevator, which provides a handle to write the file in a faster, intermediate burst buffer system. Once the application finishes writing the data to the burst buffer, the Data Elevator job uses HDF5 to move the data to its final destination in an asynchronous manner. Hence, using the Data Elevator library is currently useful for applications that call HDF5 for writing data files. Also, the Data Elevator depends on the HDF5 VOL functionality.
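The staging idea can be sketched in plain Python: write to the fast tier, then drain to the final destination in the background. Here temporary directories stand in for the burst buffer and PFS tiers, and a wrapper function stands in for the interception; the real system hooks HDF5 calls via the VOL layer rather than wrapping file writes:

```python
import os
import shutil
import tempfile
import threading

burst_buffer = tempfile.mkdtemp(prefix="bb_")   # stands in for the fast SSD tier
pfs = tempfile.mkdtemp(prefix="pfs_")           # stands in for the parallel file system

def write_staged(name, payload):
    """Write to the fast tier, then move to the final destination asynchronously."""
    staged = os.path.join(burst_buffer, name)
    with open(staged, "wb") as f:
        f.write(payload)
    # Background drain: the application can continue while the move happens.
    t = threading.Thread(target=shutil.move,
                         args=(staged, os.path.join(pfs, name)))
    t.start()
    return t

mover = write_staged("checkpoint.dat", b"\x00" * 4096)
mover.join()   # a real system overlaps this wait with further computation
final = os.path.join(pfs, "checkpoint.dat")
```

The application only ever names the final destination; the staging path and the drain are hidden behind the write call, which is the transparency the abstract describes.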

  7. Consistent Continuum Particle Modeling of Hypersonic Flows and Development of HybridSimulation Capability

    Science.gov (United States)

    2017-07-01

Regions of the simulation grid (cells and all particles within them) can be interrogated by direct access to the HDF5 data file format. This avoids the process of loading the entire grid and solution into memory before post-processing. Rather, a precise region of the flow can be interrogated directly from the HDF5 solution file.

  8. MODIS/Aqua Aerosol 5-Min L2 Swath 10km V006

    Data.gov (United States)

    National Aeronautics and Space Administration — The MODIS/Aqua Aerosol 5-Min L2 Swath 10km (MYD04_L2) product continues to provide full global coverage of aerosol properties from the Dark Target (DT) and Deep Blue...

  9. Testing Suitability of Cell Cultures for SILAC-Experiments Using SWATH-Mass Spectrometry.

    Science.gov (United States)

    Reinders, Yvonne; Völler, Daniel; Bosserhoff, Anja-K; Oefner, Peter J; Reinders, Jörg

    2016-01-01

    Precise quantification is a major issue in contemporary proteomics. Both stable-isotope-labeling and label-free methods have been established for differential protein quantification and both approaches have different advantages and disadvantages. The present protocol uses the superior precision of label-free SWATH-mass spectrometry to test for suitability of cell lines for a SILAC-labeling approach as systematic regulations may be introduced upon incorporation of the "heavy" amino acids. The SILAC-labeled cell cultures can afterwards be used for further analyses where stable-isotope-labeling is mandatory or has substantial advantages over label-free approaches such as pulse-chase-experiments and differential protein interaction analyses based on co-immunoprecipitation. As SWATH-mass spectrometry avoids the missing-value-problem typically caused by undersampling in highly complex samples and shows superior precision for the quantification, it is better suited for the detection of systematic changes caused by the SILAC-labeling and thus, can serve as a useful tool to test cell lines for changes upon SILAC-labeling.

  10. Highly resolved global distribution of tropospheric NO2 using GOME narrow swath mode data

    Directory of Open Access Journals (Sweden)

    S. Beirle

    2004-01-01

The Global Ozone Monitoring Experiment (GOME) allows the retrieval of tropospheric vertical column densities (VCDs) of NO2 on a global scale. Regions with enhanced industrial activity can clearly be detected, but the standard spatial resolution of the GOME ground pixels (320×40 km²) is insufficient to resolve regional trace gas distributions or individual cities. Every 10 days within the nominal GOME operation, measurements are executed in the so-called narrow swath mode with a much better spatial resolution (80×40 km²). We use this data (1997-2001) to construct a detailed picture of the mean global tropospheric NO2 distribution. Since - due to the narrow swath - the global coverage of the high resolution observations is rather poor, it has proved to be essential to deseasonalize the single narrow swath mode observations to retrieve adequate mean maps. This is done by using the GOME backscan information. The retrieved high resolution map illustrates the shortcomings of the standard size GOME pixels and reveals an unprecedented wealth of details in the global distribution of tropospheric NO2. Localised spots of enhanced NO2 VCD can be directly associated with cities, heavy industry centers and even large power plants. Thus our result helps to check emission inventories. The small spatial extent of NO2 'hot spots' allows us to estimate an upper limit of the mean lifetime of boundary layer NOx of 17 h on a global scale. The long time series of GOME data allows a quantitative comparison of the narrow swath mode data to the nominal resolution. Thus we can analyse the dependency of NO2 VCDs on pixel size. This is important for comparing GOME data to results of new satellite instruments like SCIAMACHY (launched March 2002 on ENVISAT), OMI (launched July 2004 on AURA) or GOME II (to be launched 2005) with an improved spatial resolution.

  11. MODIS/Aqua Raw Radiances in Counts 5-Min L1A Swath V006

    Data.gov (United States)

    National Aeronautics and Space Administration — The MODIS/Aqua Raw Radiances in Counts 5-Min L1A Swath (MYD01) product contains reformatted and packaged raw instrument data. MODIS instrument data, in packetized...

  12. 4 CFR 28.11 - Filing a charge with the Office of General Counsel.

    Science.gov (United States)

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Filing a charge with the Office of General Counsel. 28.11... the two kinds of filing. (1) A charge may be filed by personal delivery at the Office of General..., DC 20002. (2) A charge may be filed by mail addressed to the Office of General Counsel, Personnel...

  13. A Detailed Examination of the GPM Core Satellite Gridded Text Product

    Science.gov (United States)

    Stocker, Erich Franz; Kelley, Owen A.; Kummerow, C.; Huffman, George; Olson, William S.; Kwiatowski, John M.

    2015-01-01

The Global Precipitation Measurement (GPM) mission quarter-degree gridded-text product has a similar file format and a similar purpose as the Tropical Rainfall Measuring Mission (TRMM) 3G68 quarter-degree product. The GPM text-grid format is an hourly summary of surface precipitation retrievals from various GPM instruments and combinations of GPM instruments. The GMI Goddard Profiling (GPROF) retrieval provides the widest swath (800 km) and does the retrieval using the GPM Microwave Imager (GMI). The Ku radar provides the widest radar swath (250 km swath) and also provides continuity with the TRMM Ku Precipitation Radar. GPM's Ku+Ka band matched swath (125 km swath) provides a dual-frequency precipitation retrieval. The "combined" retrieval (125 km swath) provides a multi-instrument precipitation retrieval based on the GMI, the DPR Ku radar, and the DPR Ka radar. While the data are reported in hourly grids, all hours for a day are packaged into a single text file that is gzipped to reduce file size and to speed up downloading. The data are reported on a 0.25° × 0.25° grid.
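Mapping a latitude/longitude to a cell of the 0.25° grid is a simple index computation. The row/column origin chosen below (rows counted north from -90°, columns east from -180°) is an assumption for illustration, not the product's documented convention:

```python
import math

CELL = 0.25  # grid spacing in degrees, per the product description

def grid_index(lat, lon):
    """Map a lat/lon in degrees to (row, col) on a global 0.25-degree grid."""
    row = int(math.floor((lat + 90.0) / CELL))    # 0..719, south to north
    col = int(math.floor((lon + 180.0) / CELL))   # 0..1439, west to east
    return row, col

row, col = grid_index(38.9, -76.8)   # roughly Washington, DC
```

With this convention the global grid is 720 × 1440 cells, matching a quarter-degree product's expected dimensions.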

  14. Swath bathymetric investigation of the seamounts located in the Laxmi Basin, eastern Arabian Sea

    Digital Repository Service at National Institute of Oceanography (India)

    Bhattacharya, G.C.; Murty, G.P.S.; Srinivas, K.; Chaubey, A.K.; Sudhakar, T.; Nair, R.R.

Multibeam (hydrosweep) swath bathymetric investigations revealed the presence of a NNW trending linear seamount chain along the axial part of the Laxmi Basin in the eastern Arabian Sea, between 15°N, 70°15'E and 17°20'N, 69°E. This chain...

  15. The European activation file EAF-4. Summary documentation

    Energy Technology Data Exchange (ETDEWEB)

    Kopeckey, J.; Nierop, D.

    1995-12-01

This report describes the contents of the fourth version of the European Activation File (EAF-4), containing cross-sections for neutron induced reactions (0-20 MeV energy range) primarily for use in fusion-reactor technology. However, it can be used in other applications as well. The starter was the file EAF-3.1. The present version contains cross section data for all target nuclides which have half-lives longer than 0.5 days, extended by actinides up to and including fermium (Z=100). Cross sections to isomeric states are listed separately, and if the isomers live longer than 0.5 day they are also included as targets. The library includes 764 target nuclides with 13,096 reactions with non-zero cross-sections (>10{sup -8} b) below 20 MeV. The library is available as point-wise data and multigroup constant data in four different energy group structures (GAM-2, VITAMIN-J, WIMS and XMAS). A complementary uncertainty file has been generated for all reactions in a one-energy-group structure for threshold reactions and three groups for (n, {gamma}) and (n, f) reactions. The error estimates for this file are adopted either from experimental information or from systematics. (orig.).

  16. 12 CFR 30.4 - Filing of safety and soundness compliance plan.

    Science.gov (United States)

    2010-01-01

    ... steps the bank will take to correct the deficiency and the time within which those steps will be taken. (c) Review of safety and soundness compliance plans. Within 30 days after receiving a safety and... AND SOUNDNESS STANDARDS § 30.4 Filing of safety and soundness compliance plan. (a) Schedule for filing...

  17. Technology Development for 3-D Wide Swath Imaging Supporting ACE

    Science.gov (United States)

    Racette, Paul; Heymsfield, Gerry; Li, Lihua; Mclinden, Matthew; Park, Richard; Cooley, Michael; Stenger, Pete; Hand, Thomas

    2014-01-01

The National Academy of Sciences Decadal Survey (DS) Aerosol-Cloud-Ecosystems Mission (ACE) aims to advance our ability to observe and predict changes to the Earth's hydrological cycle and energy balance in response to climate forcing, especially those changes associated with the effects of aerosol on clouds and precipitation. ACE is focused on obtaining measurements to reduce the uncertainties in current climate models arising from the lack in understanding of aerosol-cloud interactions. As part of the mission instrument suite, a dual-frequency radar comprised of a fixed-beam 94 gigahertz (W-band) radar and a wide-swath 35 gigahertz (Ka-band) imaging radar has been recommended by the ACE Science Working Group. In our 2010 Instrument Incubator Program project, we developed a radar architecture that addresses the challenge associated with achieving the measurement objectives through an innovative, shared-aperture antenna that allows dual-frequency radar operation while achieving wide-swath (100 kilometers) imaging at Ka-band. The antenna system incorporates two key technologies: a) a novel dual-band reflector/reflectarray and b) a Ka-band Active Electronically Scanned Array (AESA) feed module. The dual-band antenna is comprised of a primary cylindrical reflector/reflectarray surface illuminated by a point-focus W-band feed (compatible with a quasi-optical beam waveguide feed, such as that employed on CloudSat); the Ka-band AESA line feed provides wide-swath across-track scanning. The benefits of this shared-aperture approach include significant reductions in ACE satellite payload size, weight, and cost, as compared to a two-aperture approach. Four objectives were addressed in our project. The first entailed developing the tools for the analysis and design of reflectarray antennas, assessment of candidate reflectarray elements, and validation using test coupons.
The second objective was to develop a full-scale aperture design utilizing the reflectarray surface and to

  18. MODIS/Aqua Clouds 5-Min L2 Swath 1km and 5km V006

    Data.gov (United States)

    National Aeronautics and Space Administration — The MODIS/Aqua Clouds 5-Min L2 Swath 1km and 5km (MYD06_L2) product consists of cloud optical and physical parameters. These parameters are derived using remotely...

  19. 45 CFR 672.4 - Filing, service, and form of pleadings and documents.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false Filing, service, and form of pleadings and... SCIENCE FOUNDATION ENFORCEMENT AND HEARING PROCEDURES § 672.4 Filing, service, and form of pleadings and... local officer, agency, department, corporation or other instrumentality shall be made by serving a copy...

  20. ArrayBridge: Interweaving declarative array processing with high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Xing, Haoyuan [The Ohio State Univ., Columbus, OH (United States); Floratos, Sofoklis [The Ohio State Univ., Columbus, OH (United States); Blanas, Spyros [The Ohio State Univ., Columbus, OH (United States); Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Prabhat, Prabhat [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Brown, Paul [Paradigm4, Inc., Waltham, MA (United States)

    2017-05-04

    Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation at NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits statistically indistinguishable performance and I/O scalability to the native SciDB storage engine.
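
    The deduplication between array versions mentioned above can be illustrated with content-addressed chunk hashing. The following sketch is a conceptual illustration only, not ArrayBridge's actual implementation or API; the chunk size and helper names are assumptions for the example.

```python
import hashlib

CHUNK = 4  # illustrative chunk length (elements per chunk)

def chunks(arr, size=CHUNK):
    """Split a flat list into fixed-size chunks."""
    return [tuple(arr[i:i + size]) for i in range(0, len(arr), size)]

def store_version(store, arr):
    """Store an array version as a manifest of chunk hashes.

    Chunks already present in `store` are reused (deduplicated);
    only new chunk payloads are inserted.
    """
    manifest = []
    for c in chunks(arr):
        h = hashlib.sha256(repr(c).encode()).hexdigest()
        store.setdefault(h, c)  # no-op if this chunk payload is already stored
        manifest.append(h)
    return manifest

store = {}
v1 = list(range(16))
v2 = list(range(16))
v2[0] = 99  # the second version differs from the first in only one chunk
m1 = store_version(store, v1)
m2 = store_version(store, v2)
# Each version has 4 chunks, but only 5 distinct chunk payloads are kept in total.
```

Storing manifests of hashes rather than full copies means an unchanged chunk costs only one manifest entry per version, which is the space-saving the abstract describes.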

  1. 29 CFR 15.4 - Administrative claim; where to file.

    Science.gov (United States)

    2010-07-01

    ... accompanied by a claim for money damages in a sum certain for injury to or loss of property or personal injury... hereunder to the Council for Claims and Compensation, Office of the Solicitor of Labor, U.S. Department of... 29 Labor 1 2010-07-01 2010-07-01 true Administrative claim; where to file. 15.4 Section 15.4 Labor...

  2. z ∼ 7 GALAXY CANDIDATES FROM NICMOS OBSERVATIONS OVER THE HDF-SOUTH AND THE CDF-SOUTH AND HDF-NORTH GOODS FIELDS

    International Nuclear Information System (INIS)

    Bouwens, Rychard J.; Illingworth, Garth D.; Gonzalez, Valentino; Holden, Brad; Magee, Dan; Labbe, Ivo; Franx, Marijn; Conselice, Christopher J.; Blakeslee, John; Van Dokkum, Pieter; Marchesini, Danilo; Zheng Wei

    2010-01-01

    We use ∼88 arcmin² of deep (≳26.5 mag at 5σ) NICMOS data over the two GOODS fields and the HDF-South to conduct a search for bright z ≳ 7 galaxy candidates. This search takes advantage of an efficient preselection over 58 arcmin² of NICMOS H160-band data, where only plausible z ≳ 7 candidates are followed up with NICMOS J110-band observations. ∼248 arcmin² of deep ground-based near-infrared data (≳25.5 mag, 5σ) are also considered in the search. In total, we report 15 z850-dropout candidates over this area, 7 of which are new to these search fields. Two possible z ∼ 9 J110-dropout candidates are also found, but seem unlikely to correspond to z ∼ 9 galaxies (given the estimated contamination levels). The present z ∼ 9 search is used to set upper limits on the prevalence of such sources. Rigorous testing is undertaken to establish the level of contamination of our selections by photometric scatter, low-mass stars, supernovae, and spurious sources. The estimated contamination rate of our z ∼ 7 selection is ∼24%. Through careful simulations, the effective volume available to our z ≳ 7 selections is estimated and used to establish constraints on the volume density of luminous (L*(z=3), or ∼ −21 mag) galaxies from these searches. We find that the volume density of luminous star-forming galaxies at z ∼ 7 is 13 (+8/−5) times lower than at z ∼ 4 and >25 times lower (1σ) at z ∼ 9 than at z ∼ 4. This is the most stringent constraint yet available on the volume density of ≳L*(z=3) galaxies at z ∼ 9. The present wide-area, multi-field search limits cosmic variance to ≲20%. The evolution we find at the bright end of the UV LF is similar to that found from recent Subaru Suprime-Cam, HAWK-I, or ERS WFC3/IR searches. The present paper also includes a complete summary of our final z ∼ 7 z850-dropout sample (18 candidates) identified from all NICMOS observations to date (over the two GOODS fields, the HUDF, galaxy clusters).

  3. Expediting Scientific Data Analysis with Reorganization of Data

    Energy Technology Data Exchange (ETDEWEB)

    Byna, Surendra; Wu, Kesheng

    2013-08-19

    Data producers typically optimize the layout of data files to minimize the write time. In most cases, data analysis tasks read these files in access patterns different from the write patterns, causing poor read performance. In this paper, we introduce Scientific Data Services (SDS), a framework for bridging the performance gap between writing and reading scientific data. SDS reorganizes data to match the read patterns of analysis tasks and enables transparent data reads from the reorganized data. We implemented an HDF5 Virtual Object Layer (VOL) plugin to redirect the HDF5 dataset read calls to the reorganized data. To demonstrate the effectiveness of SDS, we applied two parallel data organization techniques: a sort-based organization on plasma physics data and a transpose-based organization on mass spectrometry imaging data. We also extended the HDF5 data access API to allow selection of data based on their values through a query interface, called SDS Query. We evaluated the execution time in accessing various subsets of data through the existing HDF5 read API and SDS Query. We showed that reading the reorganized data using SDS is up to 55X faster than reading the original data.
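
    The speedup SDS reports comes from matching the storage order to the query pattern. A minimal stdlib Python sketch of the sort-based idea (illustrative only, not the SDS or HDF5 VOL API): a value-range query on sorted data can use binary search to touch only the matching range, while the same query on write-ordered data must scan everything.

```python
import bisect

def query_scan(data, lo, hi):
    """Full scan of write-ordered data: every element must be examined."""
    return [x for x in data if lo <= x <= hi]

def query_sorted(sorted_data, lo, hi):
    """On reorganized (sorted) data, binary search jumps straight to the matching range."""
    i = bisect.bisect_left(sorted_data, lo)
    j = bisect.bisect_right(sorted_data, hi)
    return sorted_data[i:j]

data = [17, 3, 91, 42, 8, 66, 25, 54]   # write-optimized (arrival) order
reorganized = sorted(data)               # sort-based reorganization
hits = query_sorted(reorganized, 20, 60)
# Both queries return the same values; only the amount of data touched differs.
```

In a real system the scan/search difference translates into how many file blocks must be read, which is where the reported I/O savings come from.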

  4. MODIS/Aqua Temperature and Water Vapor Profiles 5-Min L2 Swath 5km V006

    Data.gov (United States)

    National Aeronautics and Space Administration — MODIS/Aqua Temperature and Water Vapor Profiles 5-Min L2 Swath 5km (MYD07_L2). MODIS was launched aboard the Aqua satellite on May 04, 2002 (1:30 pm equator crossing...

  5. OMPS/NPP PCA SO2 Total Column 1-Orbit L2 Swath 50x50km NRT

    Data.gov (United States)

    National Aeronautics and Space Administration — The OMPS-NPP L2 NM Sulfur Dioxide (SO2) Total and Tropospheric Column swath orbital collection 2 version 2.0 product contains the retrieved sulfur dioxide (SO2)...

  6. 12 CFR 11.4 - Filing fees.

    Science.gov (United States)

    2010-01-01

    ... Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY SECURITIES EXCHANGE ACT DISCLOSURE RULES... part before it will accept the filing. The OCC provides an applicable fee schedule for such filings in the “Notice of Comptroller of the Currency Fees” described in 12 CFR 8.8. (b) Fees must be paid by...

  7. GEOMETRIC QUALITY ASSESSMENT OF LIDAR DATA BASED ON SWATH OVERLAP

    Directory of Open Access Journals (Sweden)

    A. Sampath

    2016-06-01

    Full Text Available This paper provides guidelines on quantifying the relative horizontal and vertical errors observed between conjugate features in the overlapping regions of lidar data. The quantification of these errors is important because their presence quantifies the geometric quality of the data. A data set can be said to have good geometric quality if measurements of identical features, regardless of their position or orientation, yield identical results. Good geometric quality indicates that the data are produced using sensor models that are working as they are mathematically designed, and data acquisition processes are not introducing any unforeseen distortion in the data. High geometric quality also leads to high geolocation accuracy of the data when the data acquisition process includes coupling the sensor with geopositioning systems. Current specifications (e.g., Heidemann 2014) do not provide adequate means to quantitatively measure these errors, even though they are required to be reported. Current accuracy measurement and reporting practices followed in the industry and as recommended by data specification documents also potentially underestimate the inter-swath errors, including the presence of systematic errors in lidar data. Hence they pose a risk to the user in terms of data acceptance (i.e., a higher potential for Type II error indicating risk of accepting potentially unsuitable data). For example, if the overlap area is too small, or if the sampled locations are close to the center of overlap, or if the errors are sampled in flat regions when there are residual pitch errors in the data, the resultant Root Mean Square Differences (RMSD) can still be small. To avoid this, the following are suggested as criteria for defining the inter-swath quality of data: (a) Median Discrepancy Angle, (b) mean and RMSD of horizontal errors using DQM measured on sloping surfaces, and (c) RMSD for sampled locations from flat areas (defined as areas with less than 5
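
    The RMSD summary statistic discussed above is straightforward to compute from paired elevation samples. The following is a minimal illustrative sketch (not the DQM implementation) of RMSD between conjugate points in the overlap of two swaths; the sample values are made up.

```python
import math

def rmsd(swath_a, swath_b):
    """Root-mean-square difference between elevations measured at the same
    (conjugate) locations in two overlapping lidar swaths."""
    if len(swath_a) != len(swath_b):
        raise ValueError("need paired conjugate-point samples")
    sq = [(a - b) ** 2 for a, b in zip(swath_a, swath_b)]
    return math.sqrt(sum(sq) / len(sq))

# Illustrative paired elevation samples (meters) from a swath-overlap region
a = [101.2, 99.8, 100.5, 102.1]
b = [101.0, 100.1, 100.2, 102.3]
# rmsd(a, b) is about 0.25 m
```

As the abstract cautions, a small RMSD computed only over flat terrain can hide systematic (e.g., residual pitch) errors, which is why the suggested criteria also sample sloping surfaces.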

  8. PH5: HDF5 Based Format for Integrating and Archiving Seismic Data

    Science.gov (United States)

    Hess, D.; Azevedo, S.; Falco, N.; Beaudoin, B. C.

    2017-12-01

    PH5 is a seismic data format created by IRIS PASSCAL using HDF5. Building PH5 on HDF5 allows for portability and extensibility on a scale that is unavailable in older seismic data formats. PH5 is designed to evolve to accept new data types as they become available in the future and to operate on a variety of platforms (i.e. Mac, Linux, Windows). Exemplifying PH5's flexibility is the evolution from just handling active-source seismic data to now including passive-source, onshore-offshore, OBS, and mixed-source seismic data sets. In PH5, metadata is separated from the time-series data and stored in a size- and performance-efficient manner that also allows for easy user interaction and output of the metadata in a format appropriate for the data set. PH5's full-fledged "Kitchen Software Suite" comprises tools for data ingestion (e.g. RefTek, SEG-Y, SEG-D, SEG-2, MSEED), metadata management, QC, waveform viewing, and data output. This software suite not only includes command-line and GUI tools for interacting with PH5, it is also a comprehensive Python package to support the creation of software tools by the community to further enhance PH5. The PH5 software suite is currently being used in multiple capacities, including in-field creation of archive-ready data sets as well as by the IRIS Data Management Center (DMC) to offer an FDSN-compliant set of web services for serving PH5 data to the community in a variety of standard data and metadata formats (i.e. StationXML, QuakeML, EventXML, SAC + Poles and Zeroes, MiniSEED, and SEG-Y) as well as StationTXT and ShotText formats. These web services can be accessed via standard FDSN clients such as ObsPy, irisFetch.m, FetchData, and FetchMetadata. This presentation will highlight and demonstrate the benefits of PH5 as a next-generation adaptable and extensible data format for use in both archiving and working with seismic data.
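
    The separation of metadata from time-series data described above can be sketched conceptually with the standard library alone. This is NOT the PH5 schema or API; the field names and layout are invented for illustration: samples go into a compact binary payload, while station metadata lives in a separate, human-readable structure.

```python
import json
import struct

def pack_trace(samples):
    """Pack int32 samples into a compact little-endian binary blob."""
    return struct.pack("<%di" % len(samples), *samples)

def unpack_trace(blob):
    """Recover the int32 samples from the binary blob."""
    return list(struct.unpack("<%di" % (len(blob) // 4), blob))

# Hypothetical station metadata, kept apart from the waveform payload so it
# can be queried and exported without touching the (much larger) time series.
metadata = {"station": "R1001", "sample_rate_hz": 250, "units": "counts"}

archive = {
    "meta.json": json.dumps(metadata),                # small, structured, exportable
    "trace.bin": pack_trace([12, -7, 3040, -2981]),   # size-efficient time series
}
```

Keeping the two concerns in separate objects is what lets a format like PH5 emit metadata in whatever form a consumer wants (e.g., StationXML) without rewriting the waveform storage.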

  9. MODIS/Aqua Clear Radiance Statistics Indexed to Global Grid 5-Min L2 Swath 10km V006

    Data.gov (United States)

    National Aeronautics and Space Administration — The MODIS/Aqua Clear Radiance Statistics Indexed to Global Grid 5-Min L2 Swath 10km (MYDCSR_G) provides a variety of statistical measures that characterize observed...

  10. Small ships don't shine: classification of ocean vessels from low resolution, large swath area SAR acquisitions

    CSIR Research Space (South Africa)

    Meyer, Rory GV

    2016-07-01

    Full Text Available the Understanding of Our Living Planet, 10-15 July 2016, Beijing, China Small ships don't shine: Classification of ocean vessels from low resolution, large swath area SAR acquisitions R. G. V. Meyer ; W. Kleynhans ; C. P. Schwegmann Abstract: Monitoring...

  11. CERES BiDirectional Scans (BDS) data in HDF (CER_BDS_Aqua-FM4_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    Each BiDirectional Scans (BDS) data product contains twenty-four hours of Level-1b data for each CERES scanner instrument mounted on each spacecraft. The BDS includes samples taken in normal and short Earth scan elevation profiles in both fixed and rotating azimuth scan modes (including space, internal calibration, and solar calibration views). The BDS contains Level-0 raw (unconverted) science and instrument data as well as the geolocated converted science and instrument data. The BDS contains additional data not found in the Level-0 input file, including converted satellite position and velocity data, celestial data, converted digital status data, and parameters used in the radiance count conversion equations. The following CERES BDS data sets are currently available: CER_BDS_TRMM-PFM_Edition1 CER_BDS_Terra-FM1_Edition1 CER_BDS_Terra-FM2_Edition1 CER_BDS_Terra-FM1_Edition2 CER_BDS_Terra-FM2_Edition2 CER_BDS_Aqua-FM3_Edition1 CER_BDS_Aqua-FM4_Edition1 CER_BDS_Aqua-FM3_Edition2 CER_BDS_Aqua-FM4_Edition2 CER_BDS_Aqua-FM3_Edition1-CV CER_BDS_Aqua-FM4_Edition1-CV CER_BDS_Terra-FM1_Edition1-CV CER_BDS_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1997-12-27; Stop_Date=2005-04-02] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 day; Temporal_Resolution_Range=Daily - < Weekly].

  12. CERES BiDirectional Scans (BDS) data in HDF (CER_BDS_Aqua-FM4_Edition2)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    Each BiDirectional Scans (BDS) data product contains twenty-four hours of Level-1b data for each CERES scanner instrument mounted on each spacecraft. The BDS includes samples taken in normal and short Earth scan elevation profiles in both fixed and rotating azimuth scan modes (including space, internal calibration, and solar calibration views). The BDS contains Level-0 raw (unconverted) science and instrument data as well as the geolocated converted science and instrument data. The BDS contains additional data not found in the Level-0 input file, including converted satellite position and velocity data, celestial data, converted digital status data, and parameters used in the radiance count conversion equations. The following CERES BDS data sets are currently available: CER_BDS_TRMM-PFM_Edition1 CER_BDS_Terra-FM1_Edition1 CER_BDS_Terra-FM2_Edition1 CER_BDS_Terra-FM1_Edition2 CER_BDS_Terra-FM2_Edition2 CER_BDS_Aqua-FM3_Edition1 CER_BDS_Aqua-FM4_Edition1 CER_BDS_Aqua-FM3_Edition2 CER_BDS_Aqua-FM4_Edition2 CER_BDS_Aqua-FM3_Edition1-CV CER_BDS_Aqua-FM4_Edition1-CV CER_BDS_Terra-FM1_Edition1-CV CER_BDS_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1997-12-27; Stop_Date=2005-03-29] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 day; Temporal_Resolution_Range=Daily - < Weekly].

  13. The Surge, Wave, and Tide Hydrodynamics (SWaTH) network of the U.S. Geological Survey—Past and future implementation of storm-response monitoring, data collection, and data delivery

    Science.gov (United States)

    Verdi, Richard J.; Lotspeich, R. Russell; Robbins, Jeanne C.; Busciolano, Ronald J.; Mullaney, John R.; Massey, Andrew J.; Banks, William S.; Roland, Mark A.; Jenter, Harry L.; Peppler, Marie C.; Suro, Thomas P.; Schubert, Christopher E.; Nardi, Mark R.

    2017-06-20

    After Hurricane Sandy made landfall along the northeastern Atlantic coast of the United States on October 29, 2012, the U.S. Geological Survey (USGS) carried out scientific investigations to assist with protecting coastal communities and resources from future flooding. The work included development and implementation of the Surge, Wave, and Tide Hydrodynamics (SWaTH) network consisting of more than 900 monitoring stations. The SWaTH network was designed to greatly improve the collection and timely dissemination of information related to storm surge and coastal flooding. The network provides a significant enhancement to USGS data-collection capabilities in the region impacted by Hurricane Sandy and represents a new strategy for observing and monitoring coastal storms, which should result in improved understanding, prediction, and warning of storm-surge impacts and lead to more resilient coastal communities. As innovative as it is, SWaTH evolved from previous USGS efforts to collect storm-surge data needed by others to improve storm-surge modeling, warning, and mitigation. This report discusses the development and implementation of the SWaTH network, and some of the regional stories associated with the landfall of Hurricane Sandy, as well as some previous events that informed the SWaTH development effort. Additional discussions on the mechanics of inundation and how the USGS is working with partners to help protect coastal communities from future storm impacts are also included.

  14. 77 FR 56208 - Filing Dates for the Kentucky Special Election in the 4th Congressional District

    Science.gov (United States)

    2012-09-12

    ... 4th Congressional District AGENCY: Federal Election Commission. ACTION: Notice of filing dates for special election. SUMMARY: Kentucky has scheduled a general election on November 6, 2012, to fill the U.S... required to file reports in connection with the Special General Election on November 6, 2012, shall file a...

  15. 49 CFR 1312.4 - Filing of tariffs.

    Science.gov (United States)

    2010-10-01

    ... identifying each publication filed, and by the appropriate filing fee (see 49 CFR part 1002). Acknowledgment... OF TRANSPORTATION (CONTINUED) CARRIER RATES AND SERVICE TERMS REGULATIONS FOR THE PUBLICATION... English with rates explicitly stated in U.S. dollars and cents. Two copies of each tariff publication...

  16. 76 FR 14651 - BOST4 Hydroelectric Company, LLC; Notice of Application Tendered for Filing With the Commission...

    Science.gov (United States)

    2011-03-17

    ... Hydroelectric Company, LLC; Notice of Application Tendered for Filing With the Commission and Soliciting Additional Study Requests Take notice that the following hydroelectric application has been filed with the... No.: P-12757-003. c. Date filed: February 24, 2011. d. Applicant: BOST4 Hydroelectric Company, LLC...

  17. A price and performance comparison of three different storage architectures for data in cloud-based systems

    Science.gov (United States)

    Gallagher, J. H. R.; Jelenak, A.; Potter, N.; Fulker, D. W.; Habermann, T.

    2017-12-01

    Providing data services based on cloud computing technology that are equivalent to those developed for traditional computing and storage systems is critical for successful migration to cloud-based architectures for data production, scientific analysis and storage. OPeNDAP Web-service capabilities (comprising the Data Access Protocol (DAP) specification plus open-source software for realizing DAP in servers and clients) are among the most widely deployed means for achieving data-as-service functionality in the Earth sciences. OPeNDAP services are especially common in traditional data center environments where servers offer access to datasets stored in (very large) file systems, and a preponderance of the source data for these services is stored in the Hierarchical Data Format Version 5 (HDF5). Three candidate architectures for serving NASA satellite Earth science HDF5 data via Hyrax running on Amazon Web Services (AWS) were developed and their performance examined for a set of representative use cases. Performance was assessed in terms of both runtime and incurred cost. The three architectures differ in how HDF5 files are stored in the Amazon Simple Storage Service (S3) and how the Hyrax server (as an EC2 instance) retrieves their data. Results for both serial and parallel access to HDF5 data in S3 will be presented. While the study focused on HDF5 data, OPeNDAP, and the Hyrax data server, the architectures are generic and the analysis can be extrapolated to many different data formats, web APIs, and data servers.
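
    A key difference between such architectures is whether the server must fetch a whole HDF5 object from S3 or can request only the byte range a subset read needs. The sketch below simulates that distinction against a local file using only the standard library; it is a conceptual illustration, not the Hyrax implementation (a real S3 client would use an HTTP `Range` header or an SDK's ranged-get parameter).

```python
import os
import tempfile

def get_object(path):
    """Simulated whole-object GET: transfers the entire file."""
    with open(path, "rb") as f:
        return f.read()

def get_object_range(path, offset, length):
    """Simulated ranged GET: transfers only `length` bytes starting at `offset`."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)

# A stand-in "object" holding a 1 MiB payload; a subset read touches 64 bytes of it.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(bytes(range(256)) * 4096)

whole = get_object(path)                      # 1,048,576 bytes transferred
subset = get_object_range(path, 128, 64)      # 64 bytes transferred
os.remove(path)
```

Since cloud object stores bill per request and per byte transferred, the ratio between these two transfer sizes is exactly the runtime/cost trade-off the study measures.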

  18. Evaluating watershed protection programs in New York City's Cannonsville Reservoir source watershed using SWAT-HS

    Science.gov (United States)

    Hoang, L.; Mukundan, R.; Moore, K. E.; Owens, E. M.; Steenhuis, T. S.

    2017-12-01

    New York City (NYC)'s reservoirs supply over one billion gallons of drinking water each day to over nine million consumers in NYC and upstate communities. The City has invested more than $1.5 billion in watershed protection programs to maintain a waiver from filtration for the Catskill and Delaware Systems. In the last 25 years, the NYC Department of Environmental Protection (NYCDEP) has implemented programs in cooperation with upstate communities that include nutrient management, crop rotations, improvement of barnyards and manure storage, implementing tertiary treatment for Phosphorus (P) in wastewater treatment plants, and replacing failed septic systems in an effort to reduce P loads to water supply reservoirs. There have been several modeling studies evaluating the effect of agricultural Best Management Practices (BMPs) on P control in the Cannonsville watershed in the Delaware System. Although these studies showed that BMPs would reduce dissolved P losses, they were limited to farm-scale or watershed-scale estimates of reduction factors without consideration of the dynamic nature of overland flow and P losses from variable source areas. Recently, we developed the process-based SWAT-Hillslope (SWAT-HS) model, a modified version of the Soil and Water Assessment Tool (SWAT) that can realistically predict variable source runoff processes. The objective of this study is to use the SWAT-HS model to evaluate watershed protection programs addressing both point and non-point sources of P. SWAT-HS predicts streamflow very well for the Cannonsville watershed with a daily Nash Sutcliffe Efficiency (NSE) of 0.85 at the watershed outlet and NSE values ranging from 0.56 - 0.82 at five other locations within the watershed. Based on good hydrological prediction, we applied the model to predict P loads using detailed P inputs that change over time due to the implementation of watershed protection programs. Results from P model predictions provide improved projections of P
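
    The model skill quoted above is the Nash-Sutcliffe Efficiency (NSE): one minus the ratio of residual variance to the variance of the observations, where 1 is a perfect fit and values at or below 0 mean the model is no better than predicting the observed mean. A small Python sketch of the standard formula, with made-up flow values (not Cannonsville data):

```python
def nse(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    residual_var = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    obs_var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - residual_var / obs_var

# Illustrative daily streamflow values (m^3/s)
obs = [10.0, 12.0, 30.0, 22.0, 15.0]
sim = [11.0, 13.0, 27.0, 21.0, 14.0]
# nse(obs, sim) is about 0.95, in the range the study reports for streamflow
```

Because the denominator is the variance of the observations, NSE rewards capturing the high-flow events that dominate that variance, which is one reason it is the customary skill score for watershed models like SWAT-HS.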

  19. In-Swath Spray Deposition Characteristics of a Low Drift Nozzle for Low Volume Aerial Application - Preliminary Results.

    Science.gov (United States)

    CP flat-fan nozzles with selectable tips were evaluated for droplet spectra and coverage using water-sensitive papers placed in the spray swath. This study used low application volumes (1, 2, and 3 GPA) at a certain spray application height as measured precisely by a laser mounted in the aircraft. No...

  20. OMI/Aura and MODIS/Aqua Merged Cloud Product 1-Orbit L2 Swath 13x24 km V003

    Data.gov (United States)

    National Aeronautics and Space Administration — The OMI/Aura and MODIS/Aqua Merged Cloud Product 1-Orbit L2 Swath 13x24 km (OMMYDCLD) is a Level-2 orbital product that combines cloud parameters retrieved by the...

  1. Bioactive Constituents of Zanthoxylum rhetsa Bark and Its Cytotoxic Potential against B16-F10 Melanoma Cancer and Normal Human Dermal Fibroblast (HDF) Cell Lines

    Directory of Open Access Journals (Sweden)

    Ramesh Kumar Santhanam

    2016-05-01

    Full Text Available Zanthoxylum rhetsa is an aromatic tree, known vernacularly as “Indian Prickly Ash”. It has been predominantly used by Indian tribes for the treatment of many infirmities like diabetes, inflammation, rheumatism, toothache and diarrhea. In this study, we identified major volatile constituents present in different solvent fractions of Z. rhetsa bark using GC-MS analysis and isolated two tetrahydrofuran lignans (yangambin and kobusin), a berberine alkaloid (columbamine), and a triterpenoid (lupeol) from the bioactive chloroform fraction. The solvent fractions and purified compounds were tested for their cytotoxic potential against human dermal fibroblasts (HDF) and mouse melanoma (B16-F10) cells, using the MTT assay. All the solvent fractions and purified compounds were found to be non-cytotoxic to HDF cells. However, the chloroform fraction and kobusin exhibited cytotoxic effect against B16-F10 melanoma cells. The presence of bioactive lignans and alkaloids were suggested to be responsible for the cytotoxic property of Z. rhetsa bark against B16-F10 cells.

  2. Multichannel High Resolution Wide Swath SAR Imaging for Hypersonic Air Vehicle with Curved Trajectory

    Directory of Open Access Journals (Sweden)

    Rui Zhou

    2018-01-01

    Full Text Available Synthetic aperture radar (SAR) equipped on the hypersonic air vehicle in near space has many advantages over the conventional airborne SAR. However, its high-speed maneuvering characteristics with curved trajectory result in serious range migration, and exacerbate the contradiction between the high resolution and wide swath. To solve this problem, this paper establishes the imaging geometrical model matched with the flight trajectory of the hypersonic platform and the multichannel azimuth sampling model based on the displaced phase center antenna (DPCA) technology. Furthermore, based on the multichannel signal reconstruction theory, a more efficient spectrum reconstruction model using discrete Fourier transform is proposed to obtain the azimuth uniform sampling data. Due to the high complexity of the slant range model, it is difficult to deduce the processing algorithm for SAR imaging. Thus, an approximate range model is derived based on the minimax criterion, and the optimal second-order approximate coefficients of cosine function are obtained using the two-population coevolutionary algorithm. On this basis, aiming at the problem that the traditional Omega-K algorithm cannot compensate the residual phase with the difficulty of Stolt mapping along the range frequency axis, this paper proposes an Exact Transfer Function (ETF) algorithm for SAR imaging, and presents a method of range division to achieve wide swath imaging. Simulation results verify the effectiveness of the ETF imaging algorithm.

  3. Multichannel High Resolution Wide Swath SAR Imaging for Hypersonic Air Vehicle with Curved Trajectory.

    Science.gov (United States)

    Zhou, Rui; Sun, Jinping; Hu, Yuxin; Qi, Yaolong

    2018-01-31

    Synthetic aperture radar (SAR) equipped on the hypersonic air vehicle in near space has many advantages over the conventional airborne SAR. However, its high-speed maneuvering characteristics with curved trajectory result in serious range migration, and exacerbate the contradiction between the high resolution and wide swath. To solve this problem, this paper establishes the imaging geometrical model matched with the flight trajectory of the hypersonic platform and the multichannel azimuth sampling model based on the displaced phase center antenna (DPCA) technology. Furthermore, based on the multichannel signal reconstruction theory, a more efficient spectrum reconstruction model using discrete Fourier transform is proposed to obtain the azimuth uniform sampling data. Due to the high complexity of the slant range model, it is difficult to deduce the processing algorithm for SAR imaging. Thus, an approximate range model is derived based on the minimax criterion, and the optimal second-order approximate coefficients of cosine function are obtained using the two-population coevolutionary algorithm. On this basis, aiming at the problem that the traditional Omega-K algorithm cannot compensate the residual phase with the difficulty of Stolt mapping along the range frequency axis, this paper proposes an Exact Transfer Function (ETF) algorithm for SAR imaging, and presents a method of range division to achieve wide swath imaging. Simulation results verify the effectiveness of the ETF imaging algorithm.

  4. Shaping ability of 4 different single-file systems in simulated S-shaped canals.

    Science.gov (United States)

    Saleh, Abdulrahman Mohammed; Vakili Gilani, Pouyan; Tavanafar, Saeid; Schäfer, Edgar

    2015-04-01

    The aim of this study was to compare the shaping ability of 4 different single-file systems in simulated S-shaped canals. Sixty-four S-shaped canals in resin blocks were prepared to an apical size of 25 using Reciproc (VDW, Munich, Germany), WaveOne (Dentsply Maillefer, Ballaigues, Switzerland), OneShape (Micro Méga, Besançon, France), and F360 (Komet Brasseler, Lemgo, Germany) (n = 16 canals/group) systems. Composite images were made from the superimposition of pre- and postinstrumentation images. The amount of resin removed by each system was measured by using a digital template and image analysis software. Canal aberrations and the preparation time were also recorded. The data were statistically analyzed by using analysis of variance, Tukey, and chi-square tests. Canals prepared with the F360 and OneShape systems were better centered compared with the Reciproc and WaveOne systems. Reciproc and WaveOne files removed significantly greater amounts of resin from the inner side of both curvatures (P < .05). Preparation with Reciproc and OneShape files was significantly faster compared with WaveOne and F360 files (P < .05). All single-file instruments were safe to use and were able to prepare the canals efficiently. However, single-file systems that are less tapered seem to be more favorable when preparing S-shaped canals. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  5. z ~ 7 Galaxy Candidates from NICMOS Observations Over the HDF-South and the CDF-South and HDF-North Goods Fields

    Science.gov (United States)

    Bouwens, Rychard J.; Illingworth, Garth D.; González, Valentino; Labbé, Ivo; Franx, Marijn; Conselice, Christopher J.; Blakeslee, John; van Dokkum, Pieter; Holden, Brad; Magee, Dan; Marchesini, Danilo; Zheng, Wei

    2010-12-01

    We use ~88 arcmin² of deep (≳26.5 mag at 5σ) NICMOS data over the two GOODS fields and the HDF-South to conduct a search for bright z ≳ 7 galaxy candidates. This search takes advantage of an efficient preselection over 58 arcmin² of NICMOS H160-band data where only plausible z ≳ 7 candidates are followed up with NICMOS J110-band observations. ~248 arcmin² of deep ground-based near-infrared data (≳25.5 mag, 5σ) are also considered in the search. In total, we report 15 z850-dropout candidates over this area, 7 of which are new to these search fields. Two possible z ~ 9 J110-dropout candidates are also found, but seem unlikely to correspond to z ~ 9 galaxies (given the estimated contamination levels). The present z ~ 9 search is used to set upper limits on the prevalence of such sources. Rigorous testing is undertaken to establish the level of contamination of our selections by photometric scatter, low-mass stars, supernovae, and spurious sources. The estimated contamination rate of our z ~ 7 selection is ~24%. Through careful simulations, the effective volume available to our z ≳ 7 selections is estimated and used to establish constraints on the volume density of luminous (L*(z = 3), or ~ −21 mag) galaxies from these searches. We find that the volume density of luminous star-forming galaxies at z ~ 7 is 13 (+8/−5) times lower than at z ~ 4 and >25 times lower (1σ) at z ~ 9 than at z ~ 4. This is the most stringent constraint yet available on the volume density of ≳L*(z = 3) galaxies at z ~ 9. The present wide-area, multi-field search limits cosmic variance to ≲20%. The evolution we find at the bright end of the UV LF is similar to that found from recent Subaru Suprime-Cam, HAWK-I or ERS WFC3/IR searches. The present paper also includes a complete summary of our final z ~ 7 z850-dropout sample (18 candidates) identified from all NICMOS observations to date (over the two GOODS fields, the HUDF, galaxy clusters). Based on observations made with the

  6. Using GDAL to Convert NetCDF 4 CF 1.6 to GeoTIFF: Interoperability Problems and Solutions for Data Providers and Distributors

    Science.gov (United States)

    Haran, T. M.; Brodzik, M. J.; Nordgren, B.; Estilow, T.; Scott, D. J.

    2015-12-01

    An increasing number of new Earth science datasets are being produced by data providers in self-describing, machine-independent file formats including Hierarchical Data Format version 5 (HDF5) and Network Common Data Form version 4 (netCDF-4). Furthermore, data providers may be producing netCDF-4 files that follow the conventions for Climate and Forecast metadata version 1.6 (CF 1.6) which, for datasets mapped to a projected raster grid covering all or a portion of the earth, includes the Coordinate Reference System (CRS) used to define how latitude and longitude are mapped to grid coordinates, i.e. columns and rows, and vice versa. One problem that users may encounter is that their preferred visualization and analysis tool may not yet include support for one of these newer formats. Moreover, data distributors such as NASA's NSIDC DAAC may not yet include support for on-the-fly conversion of data files for all data sets produced in a new format to a preferred older distributed format. There do exist open source solutions to this dilemma in the form of software packages that can translate files in one of the new formats to one of the preferred formats. However, these software packages require that the file to be translated conform to the specifications of its respective format. Although an online CF-Convention compliance checker is available from cfconventions.org, a recent NSIDC user services incident described here in detail involved an NSIDC-supported data set that passed the (then current) CF Checker Version 2.0.6, but was in fact lacking two variables necessary for conformance. This problem was not detected until GDAL, a software package which relied on the missing variables, was employed by a user in an attempt to translate the data into a different file format, namely GeoTIFF. In light of this incident, testing a candidate data product with one or more software packages written to accept the advertised conventions is proposed as a practice which improves interoperability
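The failure mode described above, a file that satisfies a generic CF checker yet lacks the variables GDAL needs for georeferencing, can be caught with a few lines of validation logic. A minimal sketch, assuming the file's variables and their attributes have already been read into plain dicts; the function and variable names are hypothetical, not part of NSIDC's or GDAL's tooling:

```python
def missing_cf_georeferencing(variables, data_vars):
    """Report CF 1.6 georeferencing problems for the given data variables.

    variables: dict mapping every variable name in the file to its attribute dict.
    data_vars: names of the gridded data variables to check.
    Returns a list of human-readable problem strings (empty if none found).
    """
    problems = []
    for name in data_vars:
        attrs = variables.get(name, {})
        gm = attrs.get("grid_mapping")
        if gm is None:
            problems.append(f"{name}: no 'grid_mapping' attribute")
        elif gm not in variables:
            problems.append(f"{name}: grid_mapping variable '{gm}' missing")
        elif "grid_mapping_name" not in variables[gm]:
            problems.append(f"{gm}: no 'grid_mapping_name' attribute")
    return problems

# A file that would pass a generic checker but break GDAL-style tools:
# the data variable refers to a 'crs' variable that was never written.
vars_in_file = {
    "sea_ice": {"units": "1", "grid_mapping": "crs"},
}
print(missing_cf_georeferencing(vars_in_file, ["sea_ice"]))
# -> ["sea_ice: grid_mapping variable 'crs' missing"]
```

In CF 1.6 each gridded data variable points at its CRS via a `grid_mapping` attribute naming a companion variable that carries `grid_mapping_name` and the projection parameters; it is exactly such companion variables that were missing in the incident above.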

  7. The NeXus data format

    OpenAIRE

    Könnecke, Mark; Akeroyd, Frederick A.; Osborn, Raymond; Peterson, Peter F.; Richter, Tobias; Suzuki, Jiro; Watts, Benjamin; Wintersberger, Eugen; Wuttke, Joachim; Bernstein, Herbert J.; Brewster, Aaron S.; Campbell, Stuart I.; Clausen, Björn; Cottrell, Stephen; Hoffmann, Jens Uwe

    2015-01-01

    NeXus is an effort by an international group of scientists to define a common data exchange and archival format for neutron, X-ray and muon experiments. NeXus is built on top of the scientific data format HDF5 and adds domain-specific rules for organizing data within HDF5 files, in addition to a dictionary of well defined domain-specific field names. The NeXus data format has two purposes. First, it defines a format that can serve as a container for all relevant data associated with a beamlin...

  8. An Enhanced GINGER Simulation Code with Harmonic Emission and HDF5 IO Capabilities

    International Nuclear Information System (INIS)

    Fawley, William M.

    2006-01-01

    GINGER [1] is an axisymmetric, polychromatic (r-z-t) FEL simulation code originally developed in the mid-1980s to model the performance of single-pass amplifiers. Over the past 15 years GINGER's capabilities have been extended to include more complicated configurations such as undulators with drift spaces, dispersive sections, and vacuum chamber wakefield effects; multi-pass oscillators; and multi-stage harmonic cascades. Its coding base has been tuned to permit running effectively on platforms ranging from desktop PCs to massively parallel processors such as the IBM-SP. Recently, we have made significant changes to GINGER by replacing the original predictor-corrector field solver with a new direct implicit algorithm, adding harmonic emission capability, and switching to the HDF5 IO library [2] for output diagnostics. In this paper, we discuss some details regarding these changes and also present simulation results for LCLS SASE emission at λ = 0.15 nm and higher harmonics

  9. The beauty of being (label)-free: sample preparation methods for SWATH-MS and next-generation targeted proteomics

    Science.gov (United States)

    Campbell, Kate; Deery, Michael J.; Lilley, Kathryn S.; Ralser, Markus

    2014-01-01

    The combination of qualitative analysis with label-free quantification has greatly facilitated the throughput and flexibility of novel proteomic techniques. However, such methods rely heavily on robust and reproducible sample preparation procedures. Here, we benchmark a selection of in-gel, on-filter, and in-solution digestion workflows for their application in label-free proteomics. Each procedure was associated with differing advantages and disadvantages. The in-gel methods interrogated were cost-effective, but were limited in throughput and digest efficiency. Filter-aided sample preparations facilitated reasonable processing times and yielded a balanced representation of membrane proteins, but led to a high signal variation in quantification experiments. Two in-solution digest protocols, however, gave optimal performance for label-free proteomics. A protocol based on the detergent RapiGest led to the highest number of detected proteins at second-best signal stability, while a protocol based on acetonitrile digestion, RapidACN, scored best in throughput and signal stability but came second in protein identification. In addition, we compared label-free data-dependent (DDA) and data-independent (SWATH) acquisition on a TripleTOF 5600 instrument. While largely similar in protein detection, SWATH outperformed DDA in quantification, reducing signal variation and markedly increasing the number of precisely quantified peptides. PMID:24741437
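The "signal variation" used above to rank the workflows is conventionally summarized as the coefficient of variation (CV) of each peptide's intensity across replicate injections. A minimal stdlib sketch of that metric; the peptide names and intensities are invented for illustration:

```python
from statistics import mean, stdev

def peptide_cvs(intensity_table):
    """Return {peptide: CV in percent} given {peptide: [replicate intensities]}."""
    return {pep: 100.0 * stdev(vals) / mean(vals)
            for pep, vals in intensity_table.items() if len(vals) > 1}

# Three replicate injections per peptide (arbitrary intensity units):
replicates = {
    "LVNEVTEFAK": [1.00e6, 1.10e6, 0.95e6],   # stable peptide, low CV
    "YLYEIARR":   [2.0e5, 4.0e5, 1.0e5],      # variable peptide, high CV
}
cvs = peptide_cvs(replicates)
```

A workflow whose median CV across all peptides is lower is the one with the better "signal stability" in the sense used by the abstract.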

  10. 48 CFR 4.802 - Contract files.

    Science.gov (United States)

    2010-10-01

    ... performed by the same office. (c) Files must be maintained at organizational levels that ensure— (1... are decentralized (e.g., by type or function) to various organizational elements or to other outside offices, responsibility for their maintenance must be assigned. A central control and, if needed, a...

  11. Kadının ve Kaderin “Tırpan”ı Swath Of Woman And Fate

    Directory of Open Access Journals (Sweden)

    Salim DURUKOĞLU

    2013-07-01

    Full Text Available The novel Tırpan (Swath) may be the most prominent outcome of a historical and literary reflection on World Women's Day in particular and on the unfortunate fate of women in general. Tırpan, which first merges and then separates the concepts of woman and fate, is perhaps the first, and possibly the only, thesis novel written with the naïve belief that fate shows itself not before but after it is lived, and that all people, women above all, can write their own fate. Within the boundaries of the novel genre it questions fate, or rather the unfortunate fate of people and especially of women, expanding to cover class conflict, the currents of Marxist ideology, and feminist viewpoints. Inherited from the image of an old tool in our minds, and accompanied by the motif of Azrael claiming lives with a scythe in his hands, the swath gains a function beyond that of an agricultural implement in this novel. The swath changes hands from Azrael to the woman, and the woman punishes the man who disrespects her field of existence and freedom, thus reclaiming and rewriting her fate with her own hands. The swath preserves its function as both goal and instrument in the writer's hands; in terms of its consequences, however, it becomes a symbol of women's revolution, much as the hammer and sickle became symbols of the Bolshevik revolution. Instead of an understanding that treats suicide by hanging as the only escape from a forced marriage, the author wants to create a resisting, active, and activist female spirit and mentality; through this novel he tries to persuade fatalist Turkish society, and Turkish women in particular, that they can reclaim their fates and direct their own lives, offering and inspiring a shift of awareness. In particular on the axis of World Women's Day, and in general on that of the unfortunate fate and history of women, and on the plane of literature

  12. OMI/Aura and MODIS/Aqua Merged Cloud Product 1-Orbit L2 Swath 13x24 km V003 (OMMYDCLD) at GES DISC

    Data.gov (United States)

    National Aeronautics and Space Administration — The OMI/Aura and MODIS/Aqua Merged Cloud Product 1-Orbit L2 Swath 13x24 km (OMMYDCLD) is a Level-2 orbital product that combines cloud parameters retrieved by the...

  13. SciSpark's SRDD : A Scientific Resilient Distributed Dataset for Multidimensional Data

    Science.gov (United States)

    Palamuttam, R. S.; Wilson, B. D.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; McGibbney, L. J.; Ramirez, P.

    2015-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF) making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We have developed SciSpark, a robust Big Data framework, that extends Apache™ Spark for scaling scientific computations. Apache Spark improves the map-reduce implementation in Apache™ Hadoop for parallel computing on a cluster, by emphasizing in-memory computation, "spilling" to disk only as needed, and relying on lazy evaluation. Central to Spark is the Resilient Distributed Dataset (RDD), an in-memory distributed data structure that extends the functional paradigm provided by the Scala programming language. However, RDDs are ideal for tabular or unstructured data, and not for highly dimensional data. The SciSpark project introduces the Scientific Resilient Distributed Dataset (sRDD), a distributed-computing array structure which supports iterative scientific algorithms for multidimensional data. SciSpark processes data stored in NetCDF and HDF files by partitioning them across time or space and distributing the partitions among a cluster of compute nodes. We show usability and extensibility of SciSpark by implementing distributed algorithms for geospatial operations on large collections of multi-dimensional grids. In particular we address the problem of scaling an automated method for finding Mesoscale Convective Complexes. SciSpark provides a tensor interface to support the pluggability of different matrix libraries. We evaluate performance of the various matrix libraries in distributed pipelines, such as Nd4j™ and Breeze™. We detail the architecture and design of SciSpark, our efforts to integrate climate science algorithms, parallel ingest and partitioning (sharding) of A-Train satellite observations from model grids. These
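The sRDD's central operation, slicing a time × lat × lon cube into contiguous time slabs that become the distributed partitions, can be sketched without Spark itself. A pure-Python illustration of the sharding step only (the function name is hypothetical, not SciSpark's API):

```python
def partition_by_time(cube, n_partitions):
    """Split a [time][lat][lon] nested list into contiguous time slabs.

    Mirrors, in miniature, how a dataset is sharded across a cluster:
    each slab would become one partition on one compute node.
    """
    t = len(cube)
    base, extra = divmod(t, n_partitions)  # spread the remainder over the first slabs
    slabs, start = [], 0
    for i in range(n_partitions):
        size = base + (1 if i < extra else 0)
        slabs.append(cube[start:start + size])
        start += size
    return slabs

# 10 hourly 2x2 grids split across 3 "nodes" -> slab sizes 4, 3, 3
cube = [[[t, t], [t, t]] for t in range(10)]
slabs = partition_by_time(cube, 3)
```

In SciSpark proper each slab would be read from its NetCDF/HDF source on a different node; only the split logic is shown here.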

  14. Parameter-Invariant Hierarchical Exclusive Alphabet Design for 2-WRC with HDF Strategy

    Directory of Open Access Journals (Sweden)

    T. Uřičář

    2010-01-01

    Full Text Available Hierarchical eXclusive Code (HXC) for the Hierarchical Decode and Forward (HDF) strategy in the Wireless 2-Way Relay Channel (2-WRC) has an achievable rate region extended beyond the classical MAC region. Although direct HXC design is in general highly complex, a layered approach to HXC design is a feasible solution. While the outer-layer code of the layered HXC can be any state-of-the-art capacity-approaching code, the inner layer must be designed in such a way that the exclusive property of hierarchical symbols (received at the relay) is provided. The simplest case of the inner HXC layer is a simple signal-space channel-symbol memoryless mapper called a Hierarchical eXclusive Alphabet (HXA). The proper design of the HXA is important, especially in the case of parametric channels, where channel parametrization (e.g. phase rotation) can violate the exclusive property of hierarchical symbols (as seen by the relay), resulting in significant capacity degradation. In this paper we introduce an example of a geometrical approach to Parameter-Invariant HXA design, and we show that the corresponding hierarchical MAC capacity region extends beyond the classical MAC region, irrespective of the channel parametrization.
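The "exclusive property" referred to above means that, given the hierarchical symbol formed at the relay together with either one of the two source symbols, the other source symbol is uniquely determined. A small brute-force check of that property over a finite alphabet, using bitwise XOR as the canonical exclusive mapper (a sketch of the concept, not of the paper's parameter-invariant geometric design):

```python
from itertools import product

def is_exclusive(mapper, alphabet):
    """True if knowing mapper(a, b) plus either a or b pins down the other symbol."""
    alphabet = list(alphabet)
    for a, b1, b2 in product(alphabet, repeat=3):
        if b1 != b2 and mapper(a, b1) == mapper(a, b2):
            return False  # relay symbol would be ambiguous in b
    for b, a1, a2 in product(alphabet, repeat=3):
        if a1 != a2 and mapper(a1, b) == mapper(a2, b):
            return False  # relay symbol would be ambiguous in a
    return True

alphabet = range(4)  # 2-bit source symbols
xor_ok = is_exclusive(lambda a, b: a ^ b, alphabet)          # XOR is exclusive
lossy_ok = is_exclusive(lambda a, b: (a + b) // 2, alphabet)  # averaging is not
```

The parametric-channel problem the paper addresses is precisely that a mapper which is exclusive for one channel phase can lose this property (as seen by the relay) for another.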

  15. A simulator for airborne laser swath mapping via photon counting

    Science.gov (United States)

    Slatton, K. C.; Carter, W. E.; Shrestha, R.

    2005-06-01

    Commercially marketed airborne laser swath mapping (ALSM) instruments currently use laser rangers with sufficient energy per pulse to work with return signals of thousands of photons per shot. The resulting high signal to noise level virtually eliminates spurious range values caused by noise, such as background solar radiation and sensor thermal noise. However, the high signal level approach requires laser repetition rates of hundreds of thousands of pulses per second to obtain contiguous coverage of the terrain at sub-meter spatial resolution, and with currently available technology, affords little scalability for significantly downsizing the hardware, or reducing the costs. A photon-counting ALSM sensor has been designed by the University of Florida and Sigma Space, Inc. for improved topographic mapping with lower power requirements and weight than traditional ALSM sensors. Major elements of the sensor design are presented along with preliminary simulation results. The simulator is being developed so that data phenomenology and target detection potential can be investigated before the system is completed. Early simulations suggest that precise estimates of terrain elevation and target detection will be possible with the sensor design.
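The data phenomenology such a simulator must reproduce is, at its core, Poisson counting: each range gate accumulates photon counts from the surface return plus solar-background and detector noise. A toy per-shot sketch using Knuth's Poisson sampler; all rates and gate numbers are illustrative, not the UF/Sigma Space design values:

```python
import math
import random

def poisson(lam, rng):
    """Sample a Poisson(lam) count via Knuth's multiplicative algorithm."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= L:
            return k - 1

def simulate_shot(n_gates, signal_gate, signal_rate, noise_rate, rng):
    """Photon counts per range gate for one laser shot.

    Background noise (solar radiation, dark counts) lands in every gate;
    the surface return adds extra counts only in the gate at the true range.
    """
    counts = [poisson(noise_rate, rng) for _ in range(n_gates)]
    counts[signal_gate] += poisson(signal_rate, rng)
    return counts

rng = random.Random(42)
# 100 gates, true surface in gate 37, weak signal over a sparse background
shot = simulate_shot(100, 37, signal_rate=2.5, noise_rate=0.05, rng=rng)
```

Averaging many such shots is what lets a photon-counting design recover the surface despite far fewer photons per pulse than a conventional ALSM ranger.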

  16. Improvements to the swath-level near-surface atmospheric state parameter retrievals within the NRL Ocean Surface Flux System (NFLUX)

    Science.gov (United States)

    May, J. C.; Rowley, C. D.; Meyer, H.

    2017-12-01

    The Naval Research Laboratory (NRL) Ocean Surface Flux System (NFLUX) is an end-to-end data processing and assimilation system used to provide near-real-time satellite-based surface heat flux fields over the global ocean. The first component of NFLUX produces near-real-time swath-level estimates of surface state parameters and downwelling radiative fluxes. The focus here will be on the satellite swath-level state parameter retrievals, namely surface air temperature, surface specific humidity, and surface scalar wind speed over the ocean. Swath-level state parameter retrievals are produced from satellite sensor data records (SDRs) from four passive microwave sensors onboard 10 platforms: the Special Sensor Microwave Imager/Sounder (SSMIS) sensor onboard the DMSP F16, F17, and F18 platforms; the Advanced Microwave Sounding Unit-A (AMSU-A) sensor onboard the NOAA-15, NOAA-18, NOAA-19, Metop-A, and Metop-B platforms; the Advanced Technology Microwave Sounder (ATMS) sensor onboard the S-NPP platform; and the Advanced Microwave Scanning Radiometer 2 (AMSR2) sensor onboard the GCOM-W1 platform. The satellite SDRs are translated into state parameter estimates using multiple polynomial regression algorithms. The coefficients to the algorithms are obtained using a bootstrapping technique with all available brightness temperature channels for a given sensor, in addition to an SST field. For each retrieved parameter for each sensor-platform combination, unique algorithms are developed for ascending and descending orbits, as well as clear vs cloudy conditions. Each of the sensors produces surface air temperature and surface specific humidity retrievals. The SSMIS and AMSR2 sensors also produce surface scalar wind speed retrievals. Improvement is seen in the SSMIS retrievals when separate algorithms are used for the even and odd scans, with the odd scans performing better than the even scans. Currently, NFLUX treats all SSMIS scans as even scans.
Additional improvement in all of
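A first-order instance of the regression retrievals described above can be sketched in a few lines: a least-squares fit mapping brightness-temperature channels to a surface state parameter via the normal equations. The operational NFLUX algorithms use many channels, higher-order polynomial terms, and bootstrapped coefficients; the synthetic two-channel example below (with made-up coefficients) only illustrates the fitting machinery:

```python
def fit_linear(X, y):
    """Least-squares fit y ~ X.beta via normal equations and Gaussian elimination."""
    n, p = len(X), len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)] for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    for i in range(p):  # forward elimination with partial pivoting
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    beta = [0.0] * p
    for i in reversed(range(p)):  # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, p))) / A[i][i]
    return beta

# Synthetic "retrieval": air temperature from two brightness-temperature
# channels plus an intercept; the coefficients are invented for illustration.
true_beta = [5.0, 0.08, 0.04]  # intercept, Tb1, Tb2
samples = [(1.0, tb1, tb2) for tb1 in (230.0, 250.0, 270.0)
                           for tb2 in (240.0, 260.0)]
y = [sum(c * x for c, x in zip(true_beta, row)) for row in samples]
beta = fit_linear(samples, y)
```

On noise-free synthetic data the fit recovers the generating coefficients; the operational skill questions (even vs odd scans, clear vs cloudy) are about which training subsets get their own `beta`.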

  17. 77 FR 13587 - Combined Notice of Filings

    Science.gov (United States)

    2012-03-07

    .... Applicants: Transcontinental Gas Pipe Line Company. Description: Annual Electric Power Tracker Filing... Company. Description: 2012 Annual Fuel and Electric Power Reimbursement to be effective 4/1/2012. Filed... submits tariff filing per 154.403: Storm Surcharge 2012 to be effective 4/1/2012. Filed Date: 3/1/12...

  18. MOPITT Level 1 Radiances HDF file V006

    Data.gov (United States)

    National Aeronautics and Space Administration — The MOPITT Level 1 data product consists of the geolocated, calibrated earth scene radiances, associated instrument engineering data summaries, and inflight...

  19. eddy4R 0.2.0: a DevOps model for community-extensible processing and analysis of eddy-covariance data based on R, Git, Docker, and HDF5

    Science.gov (United States)

    Metzger, Stefan; Durden, David; Sturtevant, Cove; Luo, Hongyan; Pingintha-Durden, Natchaya; Sachs, Torsten; Serafimovich, Andrei; Hartmann, Jörg; Li, Jiahong; Xu, Ke; Desai, Ankur R.

    2017-08-01

    Large differences in instrumentation, site setup, data format, and operating system stymie the adoption of a universal computational environment for processing and analyzing eddy-covariance (EC) data. This results in limited software applicability and extensibility in addition to often substantial inconsistencies in flux estimates. Addressing these concerns, this paper presents the systematic development of portable, reproducible, and extensible EC software achieved by adopting a development and systems operation (DevOps) approach. This software development model is used for the creation of the eddy4R family of EC code packages in the open-source R language for statistical computing. These packages are community developed, iterated via the Git distributed version control system, and wrapped into a portable and reproducible Docker filesystem that is independent of the underlying host operating system. The HDF5 hierarchical data format then provides a streamlined mechanism for highly compressed and fully self-documented data ingest and output. The usefulness of the DevOps approach was evaluated for three test applications. First, the resultant EC processing software was used to analyze standard flux tower data from the first EC instruments installed at a National Ecological Observatory (NEON) field site. Second, through an aircraft test application, we demonstrate the modular extensibility of eddy4R to analyze EC data from other platforms. Third, an intercomparison with commercial-grade software showed excellent agreement (R2 = 1.0 for CO2 flux). In conjunction with this study, a Docker image containing the first two eddy4R packages and an executable example workflow, as well as first NEON EC data products are released publicly. We conclude by describing the work remaining to arrive at the automated generation of science-grade EC fluxes and benefits to the science community at large. This software development model is applicable beyond EC and more generally builds

  20. eddy4R 0.2.0: a DevOps model for community-extensible processing and analysis of eddy-covariance data based on R, Git, Docker, and HDF5

    Directory of Open Access Journals (Sweden)

    S. Metzger

    2017-08-01

    Full Text Available Large differences in instrumentation, site setup, data format, and operating system stymie the adoption of a universal computational environment for processing and analyzing eddy-covariance (EC) data. This results in limited software applicability and extensibility in addition to often substantial inconsistencies in flux estimates. Addressing these concerns, this paper presents the systematic development of portable, reproducible, and extensible EC software achieved by adopting a development and systems operation (DevOps) approach. This software development model is used for the creation of the eddy4R family of EC code packages in the open-source R language for statistical computing. These packages are community developed, iterated via the Git distributed version control system, and wrapped into a portable and reproducible Docker filesystem that is independent of the underlying host operating system. The HDF5 hierarchical data format then provides a streamlined mechanism for highly compressed and fully self-documented data ingest and output. The usefulness of the DevOps approach was evaluated for three test applications. First, the resultant EC processing software was used to analyze standard flux tower data from the first EC instruments installed at a National Ecological Observatory (NEON) field site. Second, through an aircraft test application, we demonstrate the modular extensibility of eddy4R to analyze EC data from other platforms. Third, an intercomparison with commercial-grade software showed excellent agreement (R2 = 1.0 for CO2 flux). In conjunction with this study, a Docker image containing the first two eddy4R packages and an executable example workflow, as well as first NEON EC data products are released publicly. We conclude by describing the work remaining to arrive at the automated generation of science-grade EC fluxes and benefits to the science community at large. This software development model is applicable beyond EC

  1. Global Precipitation Measurement (GPM) Mission: Precipitation Processing System (PPS) GPM Mission Gridded Text Products Provide Surface Precipitation Retrievals

    Science.gov (United States)

    Stocker, Erich Franz; Kelley, O.; Kummerow, C.; Huffman, G.; Olson, W.; Kwiatkowski, J.

    2015-01-01

    In February 2015, the Global Precipitation Measurement (GPM) mission core satellite will complete its first year in space. The core satellite carries a conically scanning microwave imager called the GPM Microwave Imager (GMI), which also has 166 GHz and 183 GHz frequency channels. The GPM core satellite also carries a dual-frequency radar (DPR) which operates at Ku frequency, similar to the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar, and a new Ka frequency. The precipitation processing system (PPS) is producing swath-based instantaneous precipitation retrievals from GMI, both radars including a dual-frequency product, and a combined GMI/DPR precipitation retrieval. These level 2 products are written in the HDF5 format and have many additional parameters beyond surface precipitation that are organized into appropriate groups. While these retrieval algorithms were developed prior to launch and are not optimal, they are producing very creditable retrievals. It is appropriate for a wide group of users to have access to the GPM retrievals. However, for researchers requiring only surface precipitation, these L2 swath products can appear to be very intimidating and they certainly do contain many more variables than the average researcher needs. Some researchers desire only surface retrievals stored in a simple easily accessible format. In response, PPS has begun to produce gridded text-based products that contain just the most widely used variables for each instrument (surface rainfall rate, fraction liquid, fraction convective) in a single line for each grid box that contains one or more observations. This paper will describe the gridded data products that are being produced and provide an overview of their content. 
Currently two types of gridded products are being produced: (1) surface precipitation retrievals from the core satellite instruments GMI, DPR, and combined GMI/DPR; (2) surface precipitation retrievals for the partner constellation
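Reading such a gridded text product reduces to splitting each line into a handful of columns. The column layout below is hypothetical (the actual PPS column order is defined in the product documentation), but the parsing pattern is representative:

```python
from dataclasses import dataclass

@dataclass
class GridCell:
    lat: float
    lon: float
    rain_rate_mm_h: float   # surface precipitation rate
    frac_liquid: float      # fraction of precipitation that is liquid
    frac_convective: float  # fraction flagged convective

def parse_gridded_line(line):
    """Parse one whitespace-separated grid-box record (hypothetical layout)."""
    lat, lon, rate, fliq, fconv = map(float, line.split())
    return GridCell(lat, lon, rate, fliq, fconv)

cell = parse_gridded_line("-12.75 130.25 4.6 0.98 0.40")
```

This is the convenience the gridded products trade for: a researcher who needs only surface rain rate never has to open the full HDF5 group hierarchy.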

  2. Mixed Fibronectin-Derived Peptides Conjugated to a Chitosan Matrix Effectively Promotes Biological Activities through Integrins, α4β1, α5β1, αvβ3, and Syndecan

    Directory of Open Access Journals (Sweden)

    Hozumi Kentaro

    2016-11-01

    Full Text Available Mimicking the biological function of the extracellular matrix is an approach to developing cell-adhesive biomaterials. The RGD peptide, derived from fibronectin (Fn), mainly binds to integrin αvβ3 and has been widely used as a cell-adhesive peptide on various biomaterials. However, cell adhesion to Fn is thought to be mediated by several integrin subtypes and syndecans. In this study, we synthesized an RGD-containing peptide (FIB1) and four integrin α4β1-binding-related motif-containing peptides (LDV, IDAPS, KLDAPT, and PRARI) and constructed peptide-chitosan matrices. The FIB1-chitosan matrix promoted human dermal fibroblast (HDF) attachment, and the C-terminally elongated PRARI (ePRARI-C)-conjugated chitosan matrix significantly promoted HDF attachment through integrin α4β1 and syndecan binding. Next, we constructed a mixed ePRARI-C- and FIB1-chitosan matrix to develop an Fn-mimetic biomaterial. The mixed ePRARI-C/FIB1-chitosan matrix promoted significantly better cell attachment and neurite outgrowth compared to those of either ePRARI-C- or FIB1-chitosan matrices. HDF adhesion to the ePRARI-C/FIB1-chitosan matrix was mediated by integrins α4β1, α5β1, and αvβ3, similar to HDF adhesion to Fn. These data suggest that an ePRARI-C/FIB1-chitosan matrix can be used as a tool to analyze the multiple functions of Fn and can serve as an Fn-mimetic biomaterial.

  3. SWATHtoMRM: Development of High-Coverage Targeted Metabolomics Method Using SWATH Technology for Biomarker Discovery.

    Science.gov (United States)

    Zha, Haihong; Cai, Yuping; Yin, Yandong; Wang, Zhuozhong; Li, Kang; Zhu, Zheng-Jiang

    2018-03-20

    The complexity of the metabolome presents a great analytical challenge for quantitative metabolite profiling, and restricts the application of metabolomics in biomarker discovery. Targeted metabolomics using the multiple-reaction monitoring (MRM) technique has excellent capability for quantitative analysis, but suffers from limited metabolite coverage. To address this challenge, we developed a new strategy, namely, SWATHtoMRM, which utilizes the broad coverage of SWATH-MS technology to develop a high-coverage targeted metabolomics method. Specifically, the SWATH-MS technique was first utilized to profile one pooled biological sample in an untargeted manner and to acquire the MS2 spectra for all metabolites. Then, SWATHtoMRM was used to extract the large-scale MRM transitions for targeted analysis, with coverage as high as 1000-2000 metabolites. We then demonstrated the advantages of the SWATHtoMRM method in quantitative analysis, such as coverage, reproducibility, sensitivity, and dynamic range. Finally, we applied our SWATHtoMRM approach to discover potential metabolite biomarkers for colorectal cancer (CRC) diagnosis. A high-coverage targeted metabolomics method with 1303 metabolites in one injection was developed to profile colorectal cancer tissues from CRC patients. A total of 20 potential metabolite biomarkers were discovered and validated for CRC diagnosis. In plasma samples from CRC patients, 17 out of 20 potential biomarkers were further validated to be associated with tumor resection, which may have a great potential in assessing the prognosis of CRC patients after tumor resection. Together, the SWATHtoMRM strategy provides a new way to develop high-coverage targeted metabolomics methods, and facilitates the application of targeted metabolomics in disease biomarker discovery. The SWATHtoMRM program is freely available on the Internet ( http://www.zhulab.cn/software.php ).
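The extraction step at the heart of SWATHtoMRM, picking each metabolite's most intense MS2 fragments as its MRM transition pairs, can be sketched as follows. This is a simplification of the published workflow, and all m/z and intensity values below are invented:

```python
def extract_transitions(precursor_mz, ms2_spectrum, n_transitions=3):
    """Pick the n most intense fragments as (Q1, Q3) transition pairs.

    ms2_spectrum: list of (fragment_mz, intensity) pairs from a SWATH MS2 scan.
    Returns [(precursor_mz, fragment_mz), ...] ordered by descending intensity.
    """
    top = sorted(ms2_spectrum, key=lambda peak: peak[1], reverse=True)[:n_transitions]
    return [(precursor_mz, frag_mz) for frag_mz, _ in top]

# Invented MS2 spectrum for one metabolite precursor at m/z 175.12:
spectrum = [(85.03, 1200.0), (130.05, 8500.0), (156.08, 300.0), (174.09, 6100.0)]
transitions = extract_transitions(175.12, spectrum, n_transitions=2)
# -> [(175.12, 130.05), (175.12, 174.09)]
```

Repeating this over every precursor detected in the untargeted SWATH run is what yields a targeted method covering on the order of a thousand metabolites in one injection.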

  4. Quantification of Lysine Acetylation and Succinylation Stoichiometry in Proteins Using Mass Spectrometric Data-Independent Acquisitions (SWATH)

    Science.gov (United States)

    Meyer, Jesse G.; D'Souza, Alexandria K.; Sorensen, Dylan J.; Rardin, Matthew J.; Wolfe, Alan J.; Gibson, Bradford W.; Schilling, Birgit

    2016-11-01

    Post-translational modification of lysine residues by Nε-acylation is an important regulator of protein function. Many large-scale protein acylation studies have assessed relative changes of lysine acylation sites after antibody enrichment using mass spectrometry-based proteomics. Although relative acylation fold-changes are important, this does not reveal site occupancy, or stoichiometry, of individual modification sites, which is critical to understand functional consequences. Recently, methods for determining lysine acetylation stoichiometry have been proposed based on ratiometric analysis of endogenous levels to those introduced after quantitative per-acetylation of proteins using stable isotope-labeled acetic anhydride. However, in our hands, we find that these methods can overestimate acetylation stoichiometries because of signal interferences when endogenous levels of acylation are very low, which is especially problematic when using MS1 scans for quantification. In this study, we sought to improve the accuracy of determining acylation stoichiometry using data-independent acquisition (DIA). Specifically, we use SWATH acquisition to comprehensively collect both precursor and fragment ion intensity data. The use of fragment ions for stoichiometry quantification not only reduces interferences but also allows for determination of site-level stoichiometry from peptides with multiple lysine residues. We also demonstrate the novel extension of this method to measurements of succinylation stoichiometry using deuterium-labeled succinic anhydride. Proof of principle SWATH acquisition studies were first performed using bovine serum albumin for both acetylation and succinylation occupancy measurements, followed by the analysis of more complex samples of E. coli cell lysates. Although overall site occupancy was low (<1%), some proteins contained lysines with relatively high acetylation occupancy.
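The ratiometric principle behind these measurements: after quantitative chemical acylation with an isotope-labeled anhydride, every copy of a site carries either the endogenous ("light") or the chemical ("heavy") label, so occupancy is light/(light + heavy). Computing that ratio from summed fragment-ion intensities, as SWATH permits, is what suppresses the interferences that inflate MS1-based estimates. A minimal sketch with invented intensities:

```python
def site_occupancy(light_fragment_intensities, heavy_fragment_intensities):
    """Acylation stoichiometry = endogenous / (endogenous + chemically labeled)."""
    light = sum(light_fragment_intensities)
    heavy = sum(heavy_fragment_intensities)
    if light + heavy == 0:
        raise ValueError("no signal for this site")
    return light / (light + heavy)

# Fragment ions of the endogenously acetylated (light) peptide are barely
# detectable next to the chemically per-acetylated (heavy) form:
occ = site_occupancy([120.0, 80.0], [11_000.0, 8_800.0])  # about 1% occupancy
```

Because the ratio is formed per fragment-ion group, a peptide with several lysines can yield a distinct occupancy for each site, which MS1-level areas cannot resolve.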

  5. 8 CFR 208.4 - Filing the application.

    Science.gov (United States)

    2010-01-01

    ... Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY IMMIGRATION REGULATIONS PROCEDURES FOR ASYLUM AND... interview, or an immigration judge, in a hearing, shall review the application and give the applicant the opportunity to present any relevant and useful information bearing on any prohibitions on filing to determine...

  6. Observations of the Hubble Deep Field with the Infrared Space Observatory .4. Association of sources with Hubble Deep Field galaxies

    DEFF Research Database (Denmark)

    Mann, R.G.; Oliver, S.J.; Serjeant, S.B.G.

    1997-01-01

    We discuss the identification of sources detected by the Infrared Space Observatory (ISO) at 6.7 and 15 μm in the Hubble Deep Field (HDF) region. We conservatively associate ISO sources with objects in existing optical and near-infrared HDF catalogues using the likelihood ratio method, confirming...... these results (and, in one case, clarifying them) with independent visual searches, We find 15 ISO sources to be reliably associated with bright [I-814(AB) HDF, and one with an I-814(AB)=19.9 star, while a further 11 are associated with objects in the Hubble Flanking Fields (10 galaxies...... and one star), Amongst optically bright HDF galaxies, ISO tends to detect luminous, star-forming galaxies at fairly high redshift and with disturbed morphologies, in preference to nearby ellipticals....

  7. 77 FR 27221 - Combined Notice of Filings

    Science.gov (United States)

    2012-05-09

    ... Generator Status of Minonk Wind, LLC. Filed Date: 4/19/12. Accession Number: 20120419-5196. Comments Due: 5... Self-Certification of Exempt Wholesale Generator Status of Senate Wind, LLC. Filed Date: 4/19/12... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings Take notice...

  8. Optimized convective transport with automated pressure control in on-line postdilution hemodiafiltration.

    Science.gov (United States)

    Joyeux, V; Sijpkens, Y; Haddj-Elmrabet, A; Bijvoet, A J; Nilsson, L-G

    2008-11-01

    In a stable patient population we evaluated on-line postdilution hemodiafiltration (HDF) on the incremental improvement in blood purification versus high-flux HD, using the same dialyzer and blood flow rate. For HDF we used a new way of controlling HDF treatments based on the concept of constant pressure control where the trans-membrane pressure is automatically set by the machine using a feedback loop on the achieved filtration (HDF UC). We enrolled 20 patients on on-line HDF treatment and during a 4-week study period recorded key treatment parameters in HDF UC. For one mid-week study treatment performed in HD and one midweek HDF UC treatment we sampled blood and spent dialysate to evaluate the removal of small- and middle-sized solutes. We achieved 18+/-3 liters of ultrafiltration in four-hour HDF UC treatments, corresponding to 27+/-3% of the treated blood volume. That percentage varied by patient hematocrit level. The ultrafiltration amounted to 49+/-4% of the estimated plasma water volume treated. We noted few machine alarms. For beta2m and factor D the effective reduction in plasma level by HDF (76+/-6% and 43+/-9%, respectively) was significantly greater than in HD, and a similar relation was seen in mass recovered in spent dialysate. Small solute removal was similar in HDF and HD. Albumin loss was low. The additional convective transport provided by on-line HDF significantly improved the removal of middle molecules when all other treatment settings were equal. Using the automated pressure control mode in HDF, the convective volume depended on the blood volume processed and the patient hematocrit level.
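The relation reported above between the achieved convective volume (18 liters) and the fraction of treated blood volume (27%) is simple arithmetic; the sketch below illustrates it with an assumed blood flow rate and session length chosen only so the numbers come out near the study's figures, not taken from the paper:

```python
# Toy arithmetic sketch relating the convective (ultrafiltration) volume of a
# post-dilution HDF session to the blood volume processed through the dialyzer.
# The blood flow rate and duration below are assumed, illustrative values.

def processed_blood_volume_l(blood_flow_ml_min: float, minutes: float) -> float:
    """Total blood volume passed through the dialyzer, in litres."""
    return blood_flow_ml_min * minutes / 1000.0

def convection_fraction(ultrafiltration_l: float, processed_l: float) -> float:
    """Convective volume as a fraction of the processed blood volume."""
    return ultrafiltration_l / processed_l

processed = processed_blood_volume_l(blood_flow_ml_min=280, minutes=240)  # 4-hour session
frac = convection_fraction(ultrafiltration_l=18.0, processed_l=processed)
# With these assumed settings, 18 L corresponds to roughly 27% of the
# processed blood volume, in line with the 27+/-3% reported above.
```

As the study notes, at a fixed filtration target this fraction shifts with the patient's hematocrit, since only the plasma water fraction of the processed blood is available for filtration.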

  9. Multi-omic network-based interrogation of rat liver metabolism following gastric bypass surgery featuring SWATH proteomics.

    Science.gov (United States)

    Sridharan, Gautham Vivek; D'Alessandro, Matthew; Bale, Shyam Sundhar; Bhagat, Vicky; Gagnon, Hugo; Asara, John M; Uygun, Korkut; Yarmush, Martin L; Saeidi, Nima

    2017-09-01

    Morbidly obese patients often elect for Roux-en-Y gastric bypass (RYGB), a form of bariatric surgery that triggers a remarkable 30% reduction in excess body weight and reversal of insulin resistance for those who are type II diabetic. A more complete understanding of the underlying molecular mechanisms that drive the complex metabolic reprogramming post-RYGB could lead to innovative non-invasive therapeutics that mimic the beneficial effects of the surgery, namely weight loss, achievement of glycemic control, or reversal of non-alcoholic steatohepatitis (NASH). To facilitate these discoveries, we hereby demonstrate the first multi-omic interrogation of a rodent RYGB model to reveal tissue-specific pathway modules implicated in the control of body weight regulation and energy homeostasis. In this study, we focus on and evaluate liver metabolism three months following RYGB in rats using both SWATH proteomics, a burgeoning label free approach using high resolution mass spectrometry to quantify protein levels in biological samples, as well as MRM metabolomics. The SWATH analysis enabled the quantification of 1378 proteins in liver tissue extracts, of which we report the significant down-regulation of Thrsp and Acot13 in RYGB as putative targets of lipid metabolism for weight loss. Furthermore, we develop a computational graph-based metabolic network module detection algorithm for the discovery of non-canonical pathways, or sub-networks, enriched with significantly elevated or depleted metabolites and proteins in RYGB-treated rat livers. The analysis revealed a network connection between the depleted protein Baat and the depleted metabolite taurine, corroborating the clinical observation that taurine-conjugated bile acid levels are perturbed post-RYGB.

  10. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Aqua-FM4_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-29] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  11. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Aqua-FM4_Edition2)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-29] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  12. 78 FR 20901 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-04-08

    ... Marketing Inc. submits Category Seller Clarification to be effective 3/29/2013. Filed Date: 3/29/13... submits Cost- Based Tariffs Compliance Filing to be effective 4/1/2013. Filed Date: 3/29/13. Accession... to be effective N/A. Filed Date: 3/29/13. Accession Number: 20130329-5223. Comments Due: 5 p.m. ET 4...

  13. METH-33 - Performance assessment for the high resolution and wide swath (HRWS) post-Sentinel-1 SAR system

    DEFF Research Database (Denmark)

    Zonno, Mariantonietta; Sanjuan-Ferrer, Maria J.; Lopez-Dekker, Paco

    The next-generation, post-Sentinel-1, ESA C-band synthetic aperture radar (SAR) system is conceived to provide simultaneously high azimuth resolution and wide swath width (HRWS). There are different ways in which the imaging capabilities of the HRWS SAR system can be exploited, which translate... or numerical models and, if these are not available, real SAR images as well as numerical algorithms and some explicit simulations of the data and of the inversion process are employed. The tool uses as input the HRWS SAR instrument performance for the different applicable modes and produces as output results...

  14. Biocompatibility comparative study between online hemodiafiltration and hemodiafiltration with endogenous reinfusion

    Directory of Open Access Journals (Sweden)

    José Luis Cobo Sánchez

    2012-12-01

    Aim: To compare biocompatibility between online hemodiafiltration (HDF) and hemodiafiltration with endogenous reinfusion (HFR). Methods: Observational comparative study in a population of 15 chronic hemodialysis patients randomly selected among the patients in our unit. We compared changes in hematological profile, CRP level and vital signs, pre and post hemodialysis, after undergoing both hemodiafiltration techniques, and compared the differences between the parameters studied before and after each hemodialysis technique. Results: Platelet levels decreased more in the HDF (HDF -1.33 vs HFR -19.73 x10³/mm³, p=0.005). Leukocyte levels decreased in the HDF and increased with HFR (HDF -0.46 vs HFR +0.8 x10³/mm³; p=0.006). Regarding the leukocyte formula there were mixed results: segmented HDF -1.7 vs HFR +5.4%, p<0.001; lymphocytes HDF +1.96 vs HFR -3.62%, p<0.001. With the HFR, CRP levels decreased less (HDF -0.05 vs HFR -0.001 mg/dl; p=NS). Regarding vital signs, systolic blood pressure decreased more in the HFR than HDF (HDF -9.93 vs HFR -10.33 mmHg; p<0.001), conversely that the

  15. Condensing installation for exhausted gas from the door-skin (HDF) pressing line, with heat recovery and waste water collection

    Directory of Open Access Journals (Sweden)

    Nicolae BĂDIN

    2016-12-01

    During door-skin (HDF) pressing, the production facility emits a significant quantity of gases carrying a substantial load of dust (small particles) and other pollutants (formaldehyde, etc.). An ordinary production line releases all of these gases into the atmosphere, but the newest European regulations (BAT) aim to reduce these emissions to a minimum (eliminating them where possible). With this in mind, I present one solution for capturing the gases and using their energy for other purposes. At the same time, waste water containing pollutants is collected separately and stored, used, or disposed of by a suitably licensed company.

  16. 17 CFR 249.819 - Form 19b-4, for electronic filing with respect to proposed rule changes by all self-regulatory...

    Science.gov (United States)

    2010-04-01

    ... filing with respect to proposed rule changes by all self-regulatory organizations. 249.819 Section 249..., SECURITIES EXCHANGE ACT OF 1934 Forms for Self-Regulatory Organization Rule Changes and Forms for....819 Form 19b-4, for electronic filing with respect to proposed rule changes by all self-regulatory...

  17. Evaluated neutronic file for indium

    International Nuclear Information System (INIS)

    Smith, A.B.; Chiba, S.; Smith, D.L.; Meadows, J.W.; Guenther, P.T.; Lawson, R.D.; Howerton, R.J.

    1990-01-01

    A comprehensive evaluated neutronic data file for elemental indium is documented. This file, extending from 10⁻⁵ eV to 20 MeV, is presented in the ENDF/B-VI format, and contains all neutron-induced processes necessary for the vast majority of neutronic applications. In addition, an evaluation of the ¹¹⁵In(n,n')¹¹⁵ᵐIn dosimetry reaction is presented as a separate file. Attention is given to quantitative values, with corresponding uncertainty information. These files have been submitted for consideration as part of the ENDF/B-VI national evaluated-file system. 144 refs., 10 figs., 4 tabs

  18. 77 FR 23708 - Combined Notice of Filings #2

    Science.gov (United States)

    2012-04-20

    ... submits tariff filing per 35.13(a)(2)(iii: 11--20120413 I&M Oper Co MBR Conc to be effective 1/1/2012... filing per 35.13(a)(2)(iii: 12--20120413 KPCo Oper Co MBR Conc to be effective 1/ 1/2012. Filed Date: 4... OPCo Oper Co MBR Conc to be effective 1/ 1/2012. Filed Date: 4/13/12. Accession Number: 20120413-5179...

  19. Evaluation of iTRAQ and SWATH-MS for the Quantification of Proteins Associated with Insulin Resistance in Human Duodenal Biopsy Samples.

    Directory of Open Access Journals (Sweden)

    Sylvie Bourassa

    Insulin resistance (IR) is associated with increased production of triglyceride-rich lipoproteins of intestinal origin. In order to assess whether insulin resistance affects the proteins involved in lipid metabolism, we used two mass spectrometry based quantitative proteomics techniques to compare the intestinal proteome of 14 IR patients to that of 15 insulin sensitive (IS) control patients matched for age and waist circumference. A total of 3886 proteins were identified by the iTRAQ (Isobaric Tags for Relative and Absolute Quantitation) mass spectrometry approach and 2290 by the SWATH-MS strategy (Serial Window Acquisition of Theoretical Spectra). Using these two methods, 208 common proteins were identified with a confidence corresponding to FDR < 1%, and quantified with p-value < 0.05. The quantification of those 208 proteins has a Pearson correlation coefficient (r²) of 0.728 across the two techniques. Gene Ontology analyses of the differentially expressed proteins revealed that annotations related to lipid metabolic process and oxidation reduction process are over-represented in the set of under-expressed proteins in IR subjects. Furthermore, both methods quantified proteins of relevance to IR. These data also show that SWATH-MS is a promising and compelling alternative to iTRAQ for protein quantitation of complex mixtures.
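The cross-technique agreement metric used above, the squared Pearson correlation between two sets of protein quantifications, can be computed as in this minimal sketch (the abundance values below are invented toy data, not the study's measurements):

```python
import math

def pearson_r2(xs, ys):
    """Squared Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return (cov / (sx * sy)) ** 2

# Hypothetical iTRAQ vs SWATH-MS abundance ratios for the same five proteins.
itraq = [1.2, 0.8, 1.5, 0.6, 1.1]
swath = [1.1, 0.9, 1.4, 0.7, 1.0]
r2 = pearson_r2(itraq, swath)
```

An r² near 1 would indicate that the two techniques rank and scale the common proteins consistently; the study's reported value of 0.728 across 208 proteins reflects good but imperfect agreement.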

  20. ENDF/B-4 General Purpose File 1974

    International Nuclear Information System (INIS)

    Schwerer, O.

    1980-04-01

    This document summarizes contents and documentation of the 1974 version of the General Purpose File of the ENDF/B Library maintained by the National Nuclear Data Center (NNDC) at the Brookhaven National Laboratory, USA. The Library contains numerical neutron reaction data for 90 isotopes or elements. The entire Library or selective retrievals from it can be obtained on magnetic tape from the IAEA Nuclear Data Section. (author)

  1. SDS: A Framework for Scientific Data Services

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Bin; Byna, Surendra; Wu, Kesheng

    2013-10-31

    Large-scale scientific applications typically write their data to parallel file systems with organizations designed to achieve fast write speeds. Analysis tasks frequently read the data in a pattern that is different from the write pattern, and therefore experience poor I/O performance. In this paper, we introduce a prototype framework for bridging the performance gap between write and read stages of data access from parallel file systems. We call this framework Scientific Data Services, or SDS for short. This initial implementation of SDS focuses on reorganizing previously written files into data layouts that benefit read patterns, and transparently directs read calls to the reorganized data. SDS follows a client-server architecture. The SDS Server manages partial or full replicas of reorganized datasets and serves SDS Clients' requests for data. The current version of the SDS client library supports HDF5 programming interface for reading data. The client library intercepts HDF5 calls and transparently redirects them to the reorganized data. The SDS client library also provides a querying interface for reading part of the data based on user-specified selective criteria. We describe the design and implementation of the SDS client-server architecture, and evaluate the response time of the SDS Server and the performance benefits of SDS.
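The client-side interception idea described above can be illustrated with a small sketch. This is a toy model of the redirection concept only, with made-up class and method names; it is not the actual SDS client library, nor the HDF5 API it wraps:

```python
# Toy sketch of SDS-style read redirection: a client-side layer checks whether
# a reorganized replica of a dataset exists and, if so, transparently serves
# reads from it instead of the original write-optimized layout.
# All names here are illustrative, not part of the real SDS interface.

class ReorganizedStore:
    """Holds replicas of datasets reorganized into a read-friendly layout."""
    def __init__(self):
        self._replicas = {}

    def publish(self, path, data):
        self._replicas[path] = data

    def lookup(self, path):
        return self._replicas.get(path)

class RedirectingReader:
    """Intercepts read calls and redirects them to a replica when available."""
    def __init__(self, original, store):
        self._original = original  # fallback: the data in its as-written layout
        self._store = store

    def read(self, path):
        replica = self._store.lookup(path)
        return replica if replica is not None else self._original[path]

store = ReorganizedStore()
store.publish("/sim/temperature", [293.1, 293.4, 293.9])  # reorganized copy
reader = RedirectingReader(original={"/sim/pressure": [101.3]}, store=store)
temps = reader.read("/sim/temperature")   # served from the replica
pressures = reader.read("/sim/pressure")  # falls back to the original layout
```

In the real system, the interception happens underneath the HDF5 read calls, so analysis codes gain the read-optimized layout without any source changes.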

  2. Precise Temporal Profiling of Signaling Complexes in Primary Cells Using SWATH Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Etienne Caron

    2017-03-01

    Spatiotemporal organization of protein interactions in cell signaling is a fundamental process that drives cellular functions. Given differential protein expression across tissues and developmental stages, the architecture and dynamics of signaling interaction proteomes are likely highly context dependent. However, current interaction information has been almost exclusively obtained from transformed cells. In this study, we applied an advanced and robust workflow combining mouse genetics and affinity purification coupled to SWATH mass spectrometry (AP-SWATH) to profile the dynamics of 53 high-confidence protein interactions in primary T cells, using the scaffold protein GRB2 as a model. The workflow also provided a sufficient level of robustness to pinpoint differential interaction dynamics between two similar, but functionally distinct, primary T cell populations. Altogether, we demonstrated that precise and reproducible quantitative measurements of protein interaction dynamics can be achieved in primary cells isolated from mammalian tissues, allowing resolution of the tissue-specific context of cell-signaling events.

  3. Important comments on KERMA factors and DPA cross-section data in ACE files of JENDL-4.0, JEFF-3.2 and ENDF/B-VII.1

    Science.gov (United States)

    Konno, Chikara; Tada, Kenichi; Kwon, Saerom; Ohta, Masayuki; Sato, Satoshi

    2017-09-01

    We have studied the reasons for differences in KERMA factors and DPA cross-section data among nuclear data libraries. Here the KERMA factors and DPA cross-section data included in the official ACE files of JENDL-4.0, ENDF/B-VII.1 and JEFF-3.2 are examined in more detail. As a result, we find that the KERMA factors and DPA cross-section data of many nuclei differ among JENDL-4.0, ENDF/B-VII.1 and JEFF-3.2, for the following reasons: 1) large secondary-particle production yields, 2) no secondary gamma data, 3) secondary gamma data in Files 12-15 with MT=3, 4) MT=103-107 data without the corresponding MT=600s-800s data in File 6. Issue 1) is considered to be due to the nuclear data themselves, while issues 2)-4) appear to be due to NJOY. The ACE files of JENDL-4.0, ENDF/B-VII.1 and JEFF-3.2 with these problems should be revised after correcting the erroneous nuclear data and the NJOY problems.

  4. Development and implementation of an HDF converter for the atmosphere observation and measurement station CEILAP-RG

    Directory of Open Access Journals (Sweden)

    Karim Omar Hallar

    2014-06-01

    The thinning of the ozone layer over our planet's south pole is a seasonal phenomenon that, since the 1980s, has developed every year during spring, its edges usually reaching the southern part of our country. To monitor its variation, several measurement campaigns have been carried out since 2005 with the equipment of the "Observatorio Atmosférico de la Patagonia Austral", located in the city of Río Gallegos. In these campaigns, a variety of instruments produce a very significant volume of data, whose study was limited by the need to convert them to the Hierarchical Data Format (HDF), the data-handling standard for scientific information at atmospheric research institutes. This paper presents the development and application of an interface that converts data measured by the DIAL (Differential Absorption LIDAR) instrument to the HDF format.

  5. 12 CFR 308.10 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Filing of papers. 308.10 Section 308.10 Banks... AND PROCEDURE Uniform Rules of Practice and Procedure § 308.10 Filing of papers. (a) Filing. Any papers required to be filed, excluding documents produced in response to a discovery request pursuant to...

  6. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_FM1+FM4_Edition2)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  7. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Aqua-FM4_Edition1-CV)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-03-29] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  8. 78 FR 21927 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-04-12

    ... Creek Limited. Description: First Revised MBR Tariff to be effective 4/4/2013. Filed Date: 4/3/13...: Double ``C'' Limited. Description: First Revised MBR Tariff to be effective 4/4/2013. Filed Date: 4/3/13...-000. Applicants: High Sierra Limited. Description: First Revised MBR Tariff to be effective 4/4/2013...

  9. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    Science.gov (United States)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.

  10. 76 FR 21720 - Combined Notice of Filings #2

    Science.gov (United States)

    2011-04-18

    ... submits tariff filing per 35.13(a)(2)(iii): Revision to Sempra Generation FERC Electric MBR Tariff to be... LLC FERC Electric MBR Tariff to be effective 4/5/2011. Filed Date: 04/05/2011. Accession Number...): Revision to Mesquite Power LLC FERC Electric MBR Tariff to be effective 4/5/2011. Filed Date: 04/05/2011...

  11. The Surface Water and Ocean Topography Satellite Mission - An Assessment of Swath Altimetry Measurements of River Hydrodynamics

    Science.gov (United States)

    Wilson, Matthew D.; Durand, Michael; Alsdorf, Douglas; Chul-Jung, Hahn; Andreadis, Konstantinos M.; Lee, Hyongki

    2012-01-01

    The Surface Water and Ocean Topography (SWOT) satellite mission, scheduled for launch in 2020 with development commencing in 2015, will provide a step-change improvement in the measurement of terrestrial surface water storage and dynamics. In particular, it will provide the first routine two-dimensional measurements of water surface elevations, which will allow for the estimation of river and floodplain flows via the water surface slope. In this paper, we characterize the measurements which may be obtained from SWOT and illustrate how they may be used to derive estimates of river discharge. In particular, we show (i) the spatio-temporal sampling scheme of SWOT, (ii) the errors which may be expected in swath altimetry measurements of terrestrial surface water, and (iii) the impacts such errors may have on estimates of water surface slope and river discharge. We illustrate this through a "virtual mission" study for an approximately 300 km reach of the central Amazon river, using a hydraulic model to provide water surface elevations according to the SWOT spatio-temporal sampling scheme (orbit with 78-degree inclination, 22-day repeat and 140 km swath width), to which errors were added based on a two-dimensional height error spectrum derived from the SWOT design requirements. Water surface elevation measurements for the Amazon mainstem, as may be observed by SWOT, were thereby obtained. Using these measurements, estimates of river slope and discharge were derived and compared to those which may be obtained without error, and to those obtained directly from the hydraulic model. It was found that discharge can be reproduced highly accurately from the water height, without knowledge of the detailed channel bathymetry, using a modified Manning's equation, if friction, depth, width and slope are known. Increasing reach length was found to be an effective method to reduce systematic height error in SWOT measurements.
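The discharge estimation step above rests on Manning's equation, Q = (1/n) A R^(2/3) S^(1/2), with the slope S taken from the altimetry-derived water surface heights. A minimal sketch, using assumed illustrative channel values rather than the paper's Amazon data or its modified formulation, looks like:

```python
import math

def manning_discharge(n, width_m, depth_m, slope):
    """Discharge Q = (1/n) * A * R^(2/3) * S^(1/2) for a rectangular channel.

    n       : Manning friction coefficient
    width_m : channel width (m)
    depth_m : flow depth (m)
    slope   : water surface slope (dimensionless, e.g. from swath altimetry)
    """
    area = width_m * depth_m                           # cross-sectional area A
    hydraulic_radius = area / (width_m + 2 * depth_m)  # R = A / wetted perimeter
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope)

# Illustrative values for a large lowland river reach (assumed, not from SWOT).
q = manning_discharge(n=0.03, width_m=2000.0, depth_m=10.0, slope=2e-5)
```

Because Q scales with the square root of the slope, random height errors in the altimetry propagate directly into the discharge estimate, which is why averaging the slope over longer reaches reduces the discharge error.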

  12. The SMAP Level 4 Surface and Root-zone Soil Moisture (L4_SM) Product

    Science.gov (United States)

    Reichle, Rolf; Crow, Wade; Koster, Randal; Kimball, John

    2010-01-01

    The Soil Moisture Active and Passive (SMAP) mission is being developed by NASA for launch in 2013 as one of four first-tier missions recommended by the U.S. National Research Council Committee on Earth Science and Applications from Space in 2007. The primary science objectives of SMAP are to enhance understanding of land surface controls on the water, energy and carbon cycles, and to determine their linkages. Moreover, the high resolution soil moisture mapping provided by SMAP has practical applications in weather and seasonal climate prediction, agriculture, human health, drought and flood decision support. In this paper we describe the assimilation of SMAP observations for the generation of the planned SMAP Level 4 Surface and Root-zone Soil Moisture (L4_SM) product. The SMAP mission makes simultaneous active (radar) and passive (radiometer) measurements in the 1.26-1.43 GHz range (L-band) from a sun-synchronous low-earth orbit. Measurements will be obtained across a 1000 km wide swath using conical scanning at a constant incidence angle (40 deg). The radar resolution varies from 1-3 km over the outer 70% of the swath to about 30 km near the center of the swath. The radiometer resolution is 40 km across the entire swath. The radiometer measurements will allow high-accuracy but coarse resolution (40 km) measurements. The radar measurements will add significantly higher resolution information. The radar is however very sensitive to surface roughness and vegetation structure. The combination of the two measurements allows optimal blending of the advantages of each instrument. SMAP directly observes only surface soil moisture (in the top 5 cm of the soil column). Several of the key applications targeted by SMAP, however, require knowledge of root zone soil moisture (approximately top 1 m of the soil column), which is not directly measured by SMAP. The foremost objective of the SMAP L4_SM product is to fill this gap and provide estimates of root zone soil moisture

  13. Unleashing Geophysics Data with Modern Formats and Services

    Science.gov (United States)

    Ip, Alex; Brodie, Ross C.; Druken, Kelsey; Bastrakova, Irina; Evans, Ben; Kemp, Carina; Richardson, Murray; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2016-04-01

    Geoscience Australia (GA) is the national steward of large volumes of geophysical data extending over the entire Australasian region and spanning many decades. The volume and variety of data which must be managed, coupled with the increasing need to support machine-to-machine data access, mean that the old "click-and-ship" model of delivering data as downloadable files for local analysis is rapidly becoming unviable - a "big data" problem not unique to geophysics. The Australian Government, through the Research Data Services (RDS) Project, recently funded the Australian National Computational Infrastructure (NCI) to organize a wide range of Earth systems data from diverse collections including geoscience, geophysics, environment, climate, weather, and water resources onto a single High Performance Data (HPD) Node. This platform, which now contains over 10 petabytes of data, is called the National Environmental Research Data Interoperability Platform (NERDIP), and is designed to facilitate broad user access, maximise reuse, and enable integration. GA has contributed several hundred terabytes of geophysical data to the NERDIP. Historically, geophysical datasets have been stored in a range of formats, with metadata of varying quality and accessibility, and without standardised vocabularies. This has made it extremely difficult to aggregate original data from multiple surveys (particularly un-gridded geophysics point/line data) into standard formats suited to High Performance Computing (HPC) environments. To address this, it was decided to use the NERDIP-preferred Hierarchical Data Format (HDF) 5, which is a proven, standard, open, self-describing and high-performance format supported by extensive software tools, libraries and data services. The Network Common Data Form (NetCDF) 4 API facilitates the use of data in HDF5, whilst the NetCDF Climate and Forecast conventions (NetCDF-CF) further constrain NetCDF4/HDF5 data so as to provide greater inherent interoperability.

  14. Comparison of the effectiveness of sterilizing endodontic files by 4 different methods: An in vitro study

    Directory of Open Access Journals (Sweden)

    Venkatasubramanian R

    2010-03-01

    Sterilization is the best method to counter the threat of microorganisms. The purpose of sterilization in the field of health care is to prevent the spread of infectious diseases; in dentistry, it primarily relates to processing reusable instruments to prevent cross-infection. The aim of this study was to investigate the efficacy of 4 methods of sterilizing endodontic instruments: autoclaving, carbon dioxide laser sterilization, chemical sterilization (with glutaraldehyde), and glass-bead sterilization. Endodontic files were sterilized by the 4 different methods after contamination with Bacillus stearothermophilus and then checked for sterility by incubation in test tubes containing thioglycollate medium. The study showed that the files sterilized by autoclave and laser were completely sterile, those sterilized by glass bead were 90% sterile, and those sterilized with glutaraldehyde were 80% sterile. The study concluded that the autoclave or laser could be used as a method of sterilization in clinical practice; in advanced clinics, the laser can also be used as a chair-side method of sterilization.

  15. Decay data file based on the ENSDF file

    Energy Technology Data Exchange (ETDEWEB)

    Katakura, J. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    A decay data file in the JENDL (Japanese Evaluated Nuclear Data Library) format, based on the ENSDF (Evaluated Nuclear Structure Data File) file, was produced as a tentative member of the JENDL special-purpose files. Problems with using the ENSDF file as the primary source data for the JENDL decay data file are presented. (author)

  16. Handling high data rate detectors at Diamond Light Source

    Science.gov (United States)

    Pedersen, U. K.; Rees, N.; Basham, M.; Ferner, F. J. K.

    2013-03-01

    An increasing number of area detectors in use at Diamond Light Source produce high rates of data. In order to capture, store and process this data, High Performance Computing (HPC) systems have been implemented. This paper presents the architecture and usage for handling high-rate data: detector data capture, large-volume storage and parallel processing. The EPICS areaDetector framework has been adopted to abstract the detectors for common tasks including live processing, file formats and storage. The chosen data format is HDF5, which provides multidimensional data storage and NeXus compatibility. The storage system and related computing infrastructure include a centralised Lustre-based parallel file system, a dedicated network and an HPC cluster. A well-defined roadmap is in place for the evolution of this to meet demand as requirements and technology advance. For processing the science data, the HPC cluster allows efficient parallel computing on a mixture of x86 and GPU processing units. The nature of the Lustre storage system, in combination with the parallel HDF5 library, allows efficient disk I/O during computation jobs. Software that utilises optimised parallel file reading for a variety of post-processing techniques is being developed in collaboration as part of the Pan-Data EU Project (www.pan-data.eu). These techniques are particularly applicable to tomographic reconstruction and the processing of non-crystalline diffraction data.
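    The capture-side arrangement described above - detector frames streamed into HDF5 in a layout that later parallel jobs can read in independent pieces - can be sketched with h5py (assumed available). The dataset path, chunk shape and tiny frame size are illustrative, not Diamond's actual layout:

```python
# Hedged sketch: append detector frames to a chunked, extensible HDF5
# dataset. One chunk per frame means processing jobs can read disjoint
# frame ranges without touching each other's chunks.
import h5py
import numpy as np

with h5py.File("frames.h5", "w") as f:
    frames = f.create_dataset(
        "entry/data",                 # intermediate group created implicitly
        shape=(0, 4, 4),              # start empty along the frame axis
        maxshape=(None, 4, 4),        # unlimited number of frames
        chunks=(1, 4, 4),             # one chunk per frame
        dtype="u2",
        compression="gzip")
    for i in range(3):                # simulate three incoming frames
        frames.resize(i + 1, axis=0)  # grow the frame axis
        frames[i] = np.full((4, 4), i, dtype="u2")
```

    Chunking is also what the parallel HDF5 library exploits for concurrent reads on a Lustre-style file system.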

  17. Common data model access; a unified layer to access data from data analysis point of view

    International Nuclear Information System (INIS)

    Poirier, S.; Buteau, A.; Ounsy, M.; Rodriguez, C.; Hauser, N.; Lam, T.; Xiong, N.

    2012-01-01

    For almost 20 years, the scientific communities of neutron and synchrotron institutes have been dreaming of a common data format for exchanging experimental results and of common applications for reducing and analyzing the data. Using HDF5 as a data container has become the standard in many facilities. The big issue is the standardization of the data organization (schema) within the HDF5 container. By introducing a new level of indirection for data access, the Common Data Model Access (CDMA) framework proposes a solution that allows a separation of responsibilities between data reduction developers and the institute: data reduction developers are responsible for data reduction code; the institute provides a plug-in to access the data. The CDMA is a core API that accesses data through a data-format plug-in mechanism and scientific application definitions (sets of keywords) coming from a consensus between scientists and institutes. Using an innovative 'mapping' system between application definitions and physical data organizations, the CDMA allows data reduction applications to be developed independently of both the data file container and the schema. Each institute develops a data access plug-in for its own data file formats, along with the mapping between application definitions and its data files. Thus data reduction applications can be developed from a strictly scientific point of view and are immediately able to process data acquired from several institutes. (authors)
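    The indirection described above can be sketched in plain Python. Everything below - the keyword set, class, paths and values - is hypothetical, intended only to show how a per-institute mapping plug-in separates application definitions from physical data organization:

```python
# Hypothetical sketch of CDMA-style indirection: an application definition
# is a set of agreed keywords; each institute's plug-in maps keywords to
# paths in its own file schema.
APP_DEF = {"detector_counts", "wavelength"}  # consensus keywords

class InstitutePlugin:
    """Resolves application-definition keywords to physical data paths."""
    def __init__(self, mapping, reader):
        self.mapping = mapping   # keyword -> path in this institute's files
        self.reader = reader     # callable(path) -> data

    def get(self, keyword):
        if keyword not in APP_DEF:
            raise KeyError(f"{keyword!r} is not in the application definition")
        return self.reader(self.mapping[keyword])

# Two institutes store the same physics under different schemas
# (dicts stand in for HDF5 files here):
file_a = {"/entry/data/counts": [1, 2, 3], "/entry/beam/lambda": 4.5}
file_b = {"/exp0/det0": [9, 8, 7], "/exp0/mono/wl": 2.36}

plugin_a = InstitutePlugin({"detector_counts": "/entry/data/counts",
                            "wavelength": "/entry/beam/lambda"}, file_a.get)
plugin_b = InstitutePlugin({"detector_counts": "/exp0/det0",
                            "wavelength": "/exp0/mono/wl"}, file_b.get)

# Reduction code speaks only keywords, never file paths:
for plugin in (plugin_a, plugin_b):
    counts = plugin.get("detector_counts")
```

    The reduction loop at the end is the point of the design: it runs unchanged against either institute's data.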

  18. Apically extruded dentin debris by reciprocating single-file and multi-file rotary system.

    Science.gov (United States)

    De-Deus, Gustavo; Neves, Aline; Silva, Emmanuel João; Mendonça, Thais Accorsi; Lourenço, Caroline; Calixto, Camila; Lima, Edson Jorge Moreira

    2015-03-01

    This study aims to evaluate the apical extrusion of debris by two reciprocating single-file systems: WaveOne and Reciproc. A conventional multi-file rotary system was used as a reference for comparison. The hypotheses tested were (i) the reciprocating single-file systems extrude more debris than the conventional multi-file rotary system and (ii) the reciprocating single-file systems extrude similar amounts of dentin debris. After solid selection criteria, 80 mesial roots of lower molars were included in the present study. The use of four different instrumentation techniques resulted in four groups (n = 20): G1 (hand-file technique), G2 (ProTaper), G3 (WaveOne), and G4 (Reciproc). The apparatus used to collect apically extruded debris was a typical double-chamber collector. Statistical analysis was performed for multiple comparisons. No significant difference was found in the amount of debris extruded between the two reciprocating systems. In contrast, the conventional multi-file rotary system group extruded significantly more debris than both reciprocating groups, and the hand instrumentation group extruded significantly more debris than all other groups. The present results yielded favorable input for both reciprocating single-file systems, inasmuch as they showed improved control of apically extruded debris. Apical extrusion of debris has been studied extensively because of its clinical relevance, particularly since it may cause flare-ups originating from the introduction of bacteria, pulpal tissue, and irrigating solutions into the periapical tissues.

  19. PH5 for integrating and archiving different data types

    Science.gov (United States)

    Azevedo, Steve; Hess, Derick; Beaudoin, Bruce

    2016-04-01

    PH5 is IRIS PASSCAL's file organization of HDF5 used for seismic data. The extensibility and portability of HDF5 allow the PH5 format to evolve and operate on a variety of platforms and interfaces. To make PH5 even more flexible, the seismic metadata is separated from the time-series data in order to achieve gains in performance and ease of use and to simplify user interaction. This separation affords easy updates to metadata after the data are archived, without having to access waveform data. To date, PH5 is used for integrating and archiving active-source, passive-source, and onshore-offshore seismic data sets with the IRIS Data Management Center (DMC). Active development to make PH5 fully compatible with FDSN web services and to deliver StationXML is near completion. We are also exploring the feasibility of utilizing QuakeML for active seismic source representation. The PH5 software suite, PIC KITCHEN, comprises in-field tools that include data ingestion (e.g. RefTek format, SEG-Y, and SEG-D), metadata management tools including QC, and a waveform review tool. These tools enable building archive-ready data in the field during active-source experiments, greatly decreasing the time to produce research-ready data sets. Once archived, our online request page generates a unique web form and pre-populates much of it from the metadata in the PH5 file. The requester can then intuitively select the extraction parameters as well as the data subsets they wish to receive (current output formats include SEG-Y, SAC, mseed). The web interface passes this on to the PH5 processing tools, which generate the requested seismic data and automatically e-mail the requester a link to the data set as soon as it is ready. PH5 file organization was originally designed to hold seismic time-series data and metadata from controlled-source experiments using RefTek data loggers. The flexibility of HDF5 has enabled us to extend the use of PH5 in several

  20. The Hierarchical Data Format as a Foundation for Community Data Sharing

    Science.gov (United States)

    Habermann, T.

    2017-12-01

    Hierarchical Data Format (HDF) formats and libraries have been used by individual researchers and major science programs across many Earth and Space Science disciplines and sectors to provide high-performance information storage and access for several decades. Generic group, dataset, and attribute objects in HDF have been combined in many ways to form domain objects that scientists understand and use. Well-known applications of HDF in the Earth Sciences include thousands of global satellite observations and products produced by NASA's Earth Observing System using the HDF-EOS conventions, navigation-quality bathymetry produced as Bathymetric Attributed Grids (BAGs) by the OpenNavigationSurface project and others, seismic wave collections written into the Adaptable Seismic Data Format (ASDF), and many oceanographic and atmospheric products produced using the Climate and Forecast (CF) conventions with the netCDF4 data model and API on top of HDF5. This is the modus operandi of these communities: 1) develop a model of scientific data objects and associated metadata used in a domain, 2) implement that model using HDF, 3) develop software libraries that connect that model to tools, and 4) encourage adoption of those tools in the community. Understanding these domain object implementations and facilitating communication across communities is an important goal of The HDF Group. We will discuss these examples and approaches to community outreach during this session.
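    The pattern of combining generic HDF groups, datasets and attributes into a domain object can be sketched with h5py (assumed available); the "bathymetry grid" layout below is invented for illustration and is not the actual BAG convention:

```python
# Hedged sketch: three generic HDF5 object types composed into a
# hypothetical domain object. Names and attribute values are illustrative.
import h5py
import numpy as np

with h5py.File("domain_object.h5", "w") as f:
    grid = f.create_group("bathymetry_grid")        # generic group
    grid.attrs["convention"] = "example-grid-1.0"   # generic attribute
    elev = grid.create_dataset(                     # generic dataset
        "elevation", data=np.zeros((2, 3), dtype="f4"))
    elev.attrs["units"] = "m"
```

    A domain library would hide this layout behind an API such as "open grid, read elevations", which is step 3 of the modus operandi described above.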

  1. Comparative study of biocompatibility between online haemodiafiltration and haemodiafiltration with endogenous reinfusion

    Directory of Open Access Journals (Sweden)

    José Luis Cobo Sánchez

    Objective: To compare the biocompatibility of online haemodiafiltration (HDF) and haemodiafiltration with endogenous reinfusion (HFR). Material and method: Observational comparative study in a population of 15 chronic haemodialysis patients chosen at random from among the patients of our unit. Changes in the haematological profile, CRP level and vital signs, pre- and post-haemodialysis, were compared after the patients underwent both haemodiafiltration techniques. The differences between the parameters studied pre- and post-haemodialysis were compared for each technique. Results: Platelet levels decreased more with HDF (HDF -1.33 vs HFR -19.73 x10³/mm³, p=0.005). The leukocyte count decreased with HDF and increased with HFR (HDF -0.46 vs HFR +0.8 x10³/mm³; p=0.006). The differential leukocyte count gave mixed results: segmented neutrophils HDF -1.7 vs HFR +5.4%, p<0.001; lymphocytes HDF +1.96 vs HFR -3.62%, p<0.001. CRP levels decreased less with HFR (HDF -0.05 vs HFR -0.001 mg/dl; p=NS). Regarding vital signs, systolic blood pressure fell more with HFR than with HDF (HDF -9.93 vs HFR -10.33 mmHg; p<0.001), the reverse of diastolic pressure (HDF -5.2 vs HFR -3 mmHg; p=0.007) and heart rate (HDF -1.46 vs HFR +1.73 bpm; p=NS). Body temperature rose more with HDF than with HFR (HDF +0.35 vs HFR +0.06 ºC; p=NS). Conclusions: According to our results, HFR appears more biocompatible than HDF, probably owing to the exogenous reinfusion of HDF.

  2. The NeXus data format.

    Science.gov (United States)

    Könnecke, Mark; Akeroyd, Frederick A; Bernstein, Herbert J; Brewster, Aaron S; Campbell, Stuart I; Clausen, Björn; Cottrell, Stephen; Hoffmann, Jens Uwe; Jemian, Pete R; Männicke, David; Osborn, Raymond; Peterson, Peter F; Richter, Tobias; Suzuki, Jiro; Watts, Benjamin; Wintersberger, Eugen; Wuttke, Joachim

    2015-02-01

    NeXus is an effort by an international group of scientists to define a common data exchange and archival format for neutron, X-ray and muon experiments. NeXus is built on top of the scientific data format HDF5 and adds domain-specific rules for organizing data within HDF5 files, in addition to a dictionary of well-defined domain-specific field names. The NeXus data format has two purposes. First, it defines a format that can serve as a container for all relevant data associated with a beamline; this is a very important use case. Second, it defines standards, in the form of application definitions, for the exchange of data between applications. NeXus provides structures for raw experimental data as well as for processed data.
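    The layering NeXus adds over HDF5 can be sketched with h5py (assumed available). The NX_class and signal attributes follow published NeXus conventions, but the file contents below are invented for illustration:

```python
# Hedged sketch of a NeXus-style HDF5 file: ordinary HDF5 groups are
# given roles via NX_class attributes, and NXdata names its plottable
# field with a "signal" attribute. Counts are illustrative.
import h5py
import numpy as np

with h5py.File("scan.nxs", "w") as f:
    entry = f.create_group("entry")
    entry.attrs["NX_class"] = "NXentry"   # marks the group's NeXus role
    data = entry.create_group("data")
    data.attrs["NX_class"] = "NXdata"
    data.attrs["signal"] = "counts"       # names the default plottable field
    data.create_dataset("counts", data=np.array([5, 7, 6], dtype="i4"))
```

    A generic NeXus-aware viewer can plot this file without prior knowledge of it, by following NX_class and signal rather than hard-coded paths.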

  3. 49 CFR 1104.1 - Address, identification, and electronic filing option.

    Science.gov (United States)

    2010-10-01

    ... 2 of 4” and so forth). (e) Persons filing pleadings and documents with the Board have the option of electronically filing (e-filing) certain types of pleadings and documents instead of filing paper copies. Details regarding the types of pleadings and documents eligible for e-filing, the procedures to be followed, and...

  4. GPM Mission Gridded Text Products Providing Surface Precipitation Retrievals

    Science.gov (United States)

    Stocker, Erich Franz; Kelley, Owen; Huffman, George; Kummerow, Christian

    2015-04-01

    In February 2015, the Global Precipitation Measurement (GPM) mission core satellite will complete its first year in space. The core satellite carries a conically scanning microwave imager called the GPM Microwave Imager (GMI), which also has 166 GHz and 183 GHz frequency channels. The GPM core satellite also carries a dual-frequency radar (DPR), which operates at a Ku-band frequency, similar to the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar, and at a new Ka-band frequency. The Precipitation Processing System (PPS) is producing swath-based instantaneous precipitation retrievals from GMI, from both radars (including a dual-frequency product), and as a combined GMI/DPR precipitation retrieval. These level 2 products are written in the HDF5 format and have many additional parameters beyond surface precipitation that are organized into appropriate groups. While these retrieval algorithms were developed prior to launch and are not optimal, they are producing very creditable retrievals. It is appropriate for a wide group of users to have access to the GPM retrievals. However, for researchers requiring only surface precipitation, these L2 swath products can appear intimidating, and they certainly contain many more variables than the average researcher needs. Some researchers desire only surface retrievals stored in a simple, easily accessible format. In response, PPS has begun to produce gridded text-based products that contain just the most widely used variables for each instrument (surface rainfall rate, fraction liquid, fraction convective) in a single line for each grid box that contains one or more observations. This paper will describe the gridded data products that are being produced and provide an overview of their content. Currently two types of gridded products are being produced: (1) surface precipitation retrievals from the core satellite instruments - GMI, DPR, and combined GMI/DPR (2) surface precipitation retrievals for the partner

  5. 42 CFR 430.63 - Filing and service of papers.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Filing and service of papers. 430.63 Section 430.63... Conformity of State Medicaid Plans and Practice to Federal Requirements § 430.63 Filing and service of papers. (a) Filing. All papers in the proceedings are filed with the CMS Docket Clerk, in an original and two...

  6. 45 CFR 1386.85 - Filing and service of papers.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Filing and service of papers. 1386.85 Section 1386... Requirements General § 1386.85 Filing and service of papers. (a) All papers in the proceedings must be filed... transcripts of testimony need be filed. (b) Copies of papers in the proceedings must be served on all parties...

  7. 76 FR 10890 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-02-28

    ... Paris Amended and Restated Service Agreement to be effective 4/19/2011. Filed Date: 02/18/2011...: PacifiCorp submits tariff filing per 35.13(a)(2)(iii: PacifiCorp Energy Facilities Maintenance Agreement... Interconnection Agreement of Southwest Power Pool, Inc. Filed Date: 02/18/2011. Accession Number: 20110218-5086...

  8. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Terra-FM1_Edition2)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  9. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_PFM+FM2_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2000-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  10. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Aqua-FM3_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  11. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_PFM+FM1_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2000-03-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  12. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Terra-FM1_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  13. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Terra-FM2_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  14. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Aqua-FM3_Edition2)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  15. 78 FR 52171 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-08-22

    ...: ISO New England Inc. Description: Attachment A-1 to be effective 6/15/2013. Filed Date: 8/15/13... following PURPA 210(m)(3) filings: Docket Numbers: QM13-4-000. Applicants: City of Burlington, Vermont... of the City of Burlington, Vermont. Filed Date: 8/15/13. Accession Number: 20130815-5117. Comments...

  16. A tool for NDVI time series extraction from wide-swath remotely sensed images

    Science.gov (United States)

    Li, Zhishan; Shi, Runhe; Zhou, Cong

    2015-09-01

    Normalized Difference Vegetation Index (NDVI) is one of the most widely used indicators for monitoring vegetation coverage on the land surface. The time series features of NDVI are capable of reflecting dynamic changes of various ecosystems. Calculating NDVI via Moderate Resolution Imaging Spectroradiometer (MODIS) and other wide-swath remotely sensed images provides an important way to monitor the spatial and temporal characteristics of large-scale NDVI. However, difficulties still exist for ecologists who want to extract such information correctly and efficiently, because the original remote sensing images require several professional processing steps, including radiometric calibration, geometric correction, multi-date compositing, and curve smoothing. In this study, we developed an efficient and convenient online toolbox with a friendly graphical user interface for non-remote-sensing professionals who want to extract NDVI time series. Technically, it is based on Java Web and Web GIS, and the Struts, Spring, and Hibernate (SSH) frameworks are integrated in the system for easy maintenance and expansion. Latitude, longitude, and time period are the key inputs that users need to provide; the NDVI time series are then calculated automatically.
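    The per-pixel index itself is simple to compute from the red and near-infrared bands. A minimal sketch of the calculation and one common curve-smoothing step (the band values, window size, and function names are illustrative assumptions, not the toolbox's actual parameters):

```python
def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red), bounded in [-1, 1]."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

def smooth(series, window=3):
    """Centered moving average, one simple curve-smoothing choice."""
    half = window // 2
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        chunk = series[lo:hi]
        out.append(sum(chunk) / len(chunk))
    return out

# Monthly reflectances for one pixel (illustrative values only).
nir_vals = [0.45, 0.50, 0.55, 0.52]
red_vals = [0.10, 0.08, 0.07, 0.09]
series = [ndvi(n, r) for n, r in zip(nir_vals, red_vals)]
smoothed = smooth(series)
```

    A real tool must also mask clouds and composite multiple overpasses before this step; the sketch covers only the index and smoothing arithmetic.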

  17. SchemaOnRead Manual

    Energy Technology Data Exchange (ETDEWEB)

    North, Michael J. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-30

    SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.
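    The dispatch-on-extension idea generalizes readily. A minimal Python sketch of the same pattern (the reader table and the `schema_on_read` name are illustrative analogues, not part of the R package):

```python
import csv
import json
from pathlib import Path

# Map file extensions to reader functions (a small subset, for illustration).
READERS = {
    ".txt": lambda p: p.read_text(),
    ".csv": lambda p: list(csv.reader(p.open(newline=""))),
    ".json": lambda p: json.loads(p.read_text()),
}

def schema_on_read(path):
    """Read a file according to its extension, or a folder recursively,
    returning a nested dict of the contained elements."""
    p = Path(path)
    if p.is_dir():
        return {child.name: schema_on_read(child) for child in sorted(p.iterdir())}
    reader = READERS.get(p.suffix.lower())
    return reader(p) if reader else None  # unknown formats yield None
```

    The single entry point hides per-format details, which is what makes the schema-on-read style convenient for mixed-format folders.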

  18. Resistance to erythropoiesis stimulating agents in patients treated with online hemodiafiltration and ultrapure low-flux hemodialysis: results from a randomized controlled trial (CONTRAST).

    Directory of Open Access Journals (Sweden)

    Neelke C van der Weerd

    Full Text Available Resistance to erythropoiesis stimulating agents (ESA) is common in patients undergoing chronic hemodialysis (HD) treatment. ESA responsiveness might be improved by enhanced clearance of uremic toxins of middle molecular weight, as can be obtained by hemodiafiltration (HDF). In this analysis of the randomized controlled CONvective TRAnsport STudy (CONTRAST; NCT00205556), the effect of online HDF on ESA resistance and iron parameters was studied. This was a pre-specified secondary endpoint of the main trial. A 12-month analysis of 714 patients randomized to either treatment with online post-dilution HDF or continuation of low-flux HD was performed. Both groups were treated with ultrapure dialysis fluids. ESA resistance, measured every three months, was expressed as the ESA index (weight-adjusted weekly ESA dose in daily defined doses [DDD]/hematocrit). The mean ESA index during 12 months was not different between patients treated with HDF or HD (mean difference HDF versus HD over time 0.029 DDD/kg/Hct/week [-0.024 to 0.081]; P = 0.29). Mean transferrin saturation ratio and ferritin levels during the study tended to be lower in patients treated with HDF (-2.52% [-4.72 to -0.31]; P = 0.02 and -49 ng/mL [-103 to 4]; P = 0.06, respectively), although there was a trend for those patients to receive slightly more iron supplementation (7.1 mg/week [-0.4 to 14.5]; P = 0.06). In conclusion, compared to low-flux HD with ultrapure dialysis fluid, treatment with online HDF did not result in a decrease in ESA resistance. ClinicalTrials.gov NCT00205556.

  19. Accessing files in an Internet: The Jade file system

    Science.gov (United States)

    Peterson, Larry L.; Rao, Herman C.

    1991-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
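    The private name space idea (multiple file systems mounted under one directory, resolved per user) can be sketched as a longest-prefix mount table. Names and behavior here are illustrative assumptions, not Jade's actual interface:

```python
class NameSpace:
    """A per-user logical name space mapping path prefixes to backends."""

    def __init__(self):
        self.mounts = {}  # logical prefix -> backend label (e.g. "nfs", "afs", "ftp")

    def mount(self, prefix, backend):
        self.mounts[prefix.rstrip("/")] = backend

    def resolve(self, path):
        """Return (backend, remainder) for the longest matching mounted prefix."""
        best = None
        for prefix in self.mounts:
            if path == prefix or path.startswith(prefix + "/"):
                if best is None or len(prefix) > len(best):
                    best = prefix
        if best is None:
            raise KeyError(f"no mount covers {path}")
        return self.mounts[best], path[len(best):] or "/"

ns = NameSpace()
ns.mount("/home", "nfs")          # two different file systems...
ns.mount("/home/archive", "ftp")  # ...mounted under the same directory
```

    Longest-prefix matching is what lets a second file system appear inside a directory already served by another, one of the two novel features the abstract describes.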

  20. Accessing files in an internet - The Jade file system

    Science.gov (United States)

    Rao, Herman C.; Peterson, Larry L.

    1993-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  1. 78 FR 52765 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-08-26

    .... Docket Numbers: ER13-2173-000. Applicants: Southwest Power Pool, Inc. Description: 1997R2 City of Mulvane.... Description: Order 719 Compliance Filing--Attachment AE, Section 4.1.2 to be effective 3/1/2014. Filed Date: 8... Facilities Charge Agreement [RS No. 156] for the City and County of San Francisco. Filed Date: 8/16/13...

  2. 77 FR 20016 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-04-03

    ... Resource Management, LLC, Enserco Energy LLC. Description: Change in Status Filing of Twin Eagle Resource Management, LLC, et al. Filed Date: 3/26/12. Accession Number: 20120326-5132. Comments Due: 5 p.m. ET 4/16/12.... Description: PJM Interconnection, L.L.C. submits tariff filing per 35.13(a)(2)(iii: Queue Position O50...

  3. 76 FR 12725 - Combined Notice of Filings #2

    Science.gov (United States)

    2011-03-08

    ... MBR Tariff to be effective 3/2/2011. Filed Date: 03/01/2011 Accession Number: 20110301-5090 Comment... Substitute First Revised MBR Tariff to be effective 3/2/ 2011. Filed Date: 03/01/2011 Accession Number... 35.17(b): Revised Application for MBR and MBR Tariffs to be effective 4/1/2011. Filed Date: 02/28...

  4. 21 CFR 225.102 - Master record file and production records.

    Science.gov (United States)

    2010-04-01

    ... or production run of medicated feed to which it pertains. The Master Record File or card shall... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Master record file and production records. 225.102....102 Master record file and production records. (a) The Master Record File provides the complete...

  5. 14 CFR 221.195 - Requirement for filing printed material.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Requirement for filing printed material... filing printed material. (a) Any tariff, or revision thereto, filed in paper format which accompanies... supporting paper tariff, except as authorized by the Department. (b) Any printed justifications, or other...

  6. 78 FR 78352 - Orlando Utilities Commission; Notice of Filing

    Science.gov (United States)

    2013-12-26

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. NJ14-4-000] Orlando Utilities Commission; Notice of Filing Take notice that on December 18, 2013, Orlando Utilities Commission submitted its tariff filing per 35.28(e): Order No. 1000 Further Regional Compliance Filing to be effective...

  7. 76 FR 35209 - Orlando Utilities Commission; Notice of Filing

    Science.gov (United States)

    2011-06-16

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. NJ11-12-001] Orlando Utilities Commission; Notice of Filing Take notice that on May 26, 2011, Orlando Utilities Commission submitted its tariff filing per 35.17(b): Amendment to Compliance Filing to be effective 4/15/2011. Any...

  8. Use of Schema on Read in Earth Science Data Archives

    Science.gov (United States)

    Hegde, Mahabaleshwara; Smit, Christine; Pilone, Paul; Petrenko, Maksym; Pham, Long

    2017-01-01

    Traditionally, NASA Earth Science data archives have file-based storage using proprietary data file formats, such as HDF and HDF-EOS, which are optimized to support fast and efficient storage of spaceborne and model data as they are generated. The use of file-based storage essentially imposes an indexing strategy based on data dimensions. In most cases, NASA Earth Science data uses time as the primary index, leading to poor performance in accessing data in spatial dimensions. For example, producing a time series for a single spatial grid cell involves accessing a large number of data files. With exponential growth in data volume due to the ever-increasing spatial and temporal resolution of the data, using file-based archives poses significant performance and cost barriers to data discovery and access. Storing and disseminating data in proprietary data formats imposes an additional access barrier for users outside the mainstream research community. At the NASA Goddard Earth Sciences Data Information Services Center (GES DISC), we have evaluated applying the schema-on-read principle to data access and distribution. We used Apache Parquet to store geospatial data, and have exposed data through Amazon Web Services (AWS) Athena, AWS Simple Storage Service (S3), and Apache Spark. Using the schema-on-read approach allows customization of indexing spatially or temporally to suit the data access pattern. The storage of data in open formats such as Apache Parquet has widespread support in popular programming languages. A wide range of solutions for handling big data lowers the access barrier for all users. This presentation will discuss formats used for data storage, frameworks with support for schema-on-read used for data access, and common use cases covering data usage patterns seen in a geospatial data archive.

  9. Use of Schema on Read in Earth Science Data Archives

    Science.gov (United States)

    Petrenko, M.; Hegde, M.; Smit, C.; Pilone, P.; Pham, L.

    2017-12-01

    Traditionally, NASA Earth Science data archives have file-based storage using proprietary data file formats, such as HDF and HDF-EOS, which are optimized to support fast and efficient storage of spaceborne and model data as they are generated. The use of file-based storage essentially imposes an indexing strategy based on data dimensions. In most cases, NASA Earth Science data uses time as the primary index, leading to poor performance in accessing data in spatial dimensions. For example, producing a time series for a single spatial grid cell involves accessing a large number of data files. With exponential growth in data volume due to the ever-increasing spatial and temporal resolution of the data, using file-based archives poses significant performance and cost barriers to data discovery and access. Storing and disseminating data in proprietary data formats imposes an additional access barrier for users outside the mainstream research community. At the NASA Goddard Earth Sciences Data Information Services Center (GES DISC), we have evaluated applying the "schema-on-read" principle to data access and distribution. We used Apache Parquet to store geospatial data, and have exposed data through Amazon Web Services (AWS) Athena, AWS Simple Storage Service (S3), and Apache Spark. Using the "schema-on-read" approach allows customization of indexing—spatial or temporal—to suit the data access pattern. The storage of data in open formats such as Apache Parquet has widespread support in popular programming languages. A wide range of solutions for handling big data lowers the access barrier for all users. This presentation will discuss formats used for data storage, frameworks with support for "schema-on-read" used for data access, and common use cases covering data usage patterns seen in a geospatial data archive.
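    The performance argument above, choosing the partition key to match the access pattern, can be illustrated with a toy spatial index in which a time series for one location touches exactly one partition. The 2.5-degree cell size and all function names are illustrative assumptions, not the GES DISC implementation:

```python
from collections import defaultdict

def cell_key(lat, lon, size=2.5):
    """Map a coordinate to its grid-cell index (the spatial partition key)."""
    return (int((lat + 90) // size), int((lon + 180) // size))

def build_spatial_partitions(records):
    """Group (lat, lon, time, value) records by spatial cell, so that
    reading a time series for one grid cell touches one partition only."""
    parts = defaultdict(list)
    for lat, lon, t, v in records:
        parts[cell_key(lat, lon)].append((t, v))
    return parts

def time_series(parts, lat, lon):
    """All (time, value) pairs for the cell containing (lat, lon), in order."""
    return sorted(parts.get(cell_key(lat, lon), []))

records = [
    (10.0, 20.0, "2001-01", 1.0),
    (10.0, 20.0, "2001-02", 2.0),
    (50.0, -120.0, "2001-01", 7.0),
]
parts = build_spatial_partitions(records)
```

    A time-primary layout would instead group the same records by month, forcing a per-cell time series to read every partition; that contrast is the access-pattern point the abstract makes.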

  10. 78 FR 14530 - Combined Notice of Filings #2

    Science.gov (United States)

    2013-03-06

    ...: California Independent System Operator Corporation submits tariff filing per 35.13(a)(2)(iii): 2013-02-27 Pay... per 35.17(b): 2013-02-28--OASIS Att J Errata to be effective 4/15/2013. Filed Date: 2/27/13. Accession... Due: 5 p.m. ET 3/20/13. The filings are accessible in the Commission's eLibrary system by clicking on...

  11. An Extensible Processing Framework for Eddy-covariance Data

    Science.gov (United States)

    Durden, D.; Fox, A. M.; Metzger, S.; Sturtevant, C.; Durden, N. P.; Luo, H.

    2016-12-01

    The evolution of large data-collecting networks has led not only to an increase in available information, but also to greater complexity in analyzing the observations. Timely dissemination of readily usable data products necessitates a streaming processing framework that is both automatable and flexible. Tower networks, such as ICOS, AmeriFlux, and NEON, exemplify this issue by requiring large amounts of data to be processed from dispersed measurement sites. Eddy-covariance data from across the NEON network are expected to amount to 100 gigabytes per day. The complexity of the algorithmic processing necessary to produce high-quality data products, together with the continued development of new analysis techniques, led to the development of a modular R package, eddy4R. This allows algorithms provided by NEON and the larger community to be deployed in streaming processing, and to be used by community members alike. To control the processing environment, provide a proficient parallel processing structure, and ensure that dependencies are available during processing, we chose Docker as our "Development and Operations" (DevOps) platform. The Docker framework allows our processing algorithms to be developed, maintained, and deployed at scale. Additionally, the eddy4R-Docker framework fosters community use and extensibility via pre-built Docker images and the GitHub distributed version control system. The capability to process large data sets relies upon efficient input and output of data, data compressibility to reduce compute resource loads, and the ability to easily package metadata. The Hierarchical Data Format (HDF5) is a file format that can meet these needs. A NEON standard HDF5 file structure and metadata attributes allow users to explore larger data sets in an intuitive "directory-like" structure adopting the NEON data product naming conventions.
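    The "directory-like" HDF5 layout can be pictured as nested groups addressed by slash-separated paths. A small sketch using nested dicts in place of real HDF5 groups (the site and product names are illustrative, not actual NEON paths):

```python
def walk(group, prefix=""):
    """Yield (path, dataset) pairs from a nested mapping, depth first,
    mimicking how HDF5 groups form slash-separated dataset paths."""
    for name in sorted(group):
        node = group[name]
        path = f"{prefix}/{name}"
        if isinstance(node, dict):
            yield from walk(node, path)   # descend into a sub-group
        else:
            yield path, node              # a leaf dataset

# Illustrative tower-site layout (names are assumptions, not NEON's).
site = {
    "HARV": {
        "dp01": {
            "data": {"co2Turb": [400.1, 401.3], "h2oTurb": [12.0, 12.4]},
        },
    },
}
paths = dict(walk(site))
```

    In a real file the same traversal would be done with an HDF5 library's group iteration; the point here is only that a consistent naming convention makes the archive browsable like a directory tree.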

  12. 5 CFR 831.1205 - Agency-filed disability retirement applications.

    Science.gov (United States)

    2010-01-01

    ... incapable of making a decision to file an application for disability retirement; (4) The employee has no... must inform the employee in writing at the same time it informs the employee of its removal decision... disability retirement applications. (a) Basis for filing an application for an employee. An agency must file...

  13. 12 CFR 308.158 - Filing papers and effective date.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Filing papers and effective date. 308.158 Section 308.158 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION PROCEDURE AND RULES OF PRACTICE... Section 19 of the FDIA § 308.158 Filing papers and effective date. (a) Filing with the regional office...

  14. Long term file migration. Part I: file reference patterns

    International Nuclear Information System (INIS)

    Smith, A.J.

    1978-08-01

    In most large computer installations, files are moved between on-line disk and mass storage (tape, integrated mass storage device), either automatically by the system or specifically at the direction of the user. This is the first of two papers studying the selection of algorithms for the automatic migration of files between mass storage and disk. The use of the text editor data sets at the Stanford Linear Accelerator Center (SLAC) computer installation is examined through the analysis of thirteen months of file reference data. Most files are used very few times. Of those used frequently enough that their reference patterns may be examined, about a third show declining rates of reference during their lifetime; of the remainder, very few (about 5%) show correlated interreference intervals, and interreference intervals (in days) appear to be more skewed than would occur with a Bernoulli process. Thus, about two-thirds of all sufficiently active files appear to be referenced as a renewal process with a skewed interreference distribution. A large number of other file reference statistics (file lifetimes, interreference distributions, moments, means, number of uses per file, file sizes, file rates of reference, etc.) are computed and presented. The results are applied in the following paper to the development and comparative evaluation of file migration algorithms. 17 figures, 13 tables
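    Reference patterns like these motivate simple age-based migration policies. The sketch below moves files whose time since last reference exceeds a threshold, largest first; the threshold, the size ordering, and all names are illustrative assumptions, not the paper's algorithms:

```python
def select_for_migration(files, now, age_limit):
    """files: {name: (size_kb, last_ref_time)}. Return the names whose
    time since last reference exceeds age_limit, largest file first,
    so migrating them frees the most disk space soonest."""
    stale = [(size, name)
             for name, (size, last) in files.items()
             if now - last > age_limit]
    return [name for size, name in sorted(stale, reverse=True)]

files = {
    "edit1.dat": (120, 90),   # size in KB, day of last reference
    "edit2.dat": (40, 10),
    "edit3.dat": (500, 5),
}
to_tape = select_for_migration(files, now=100, age_limit=30)
```

    The skewed interreference distributions reported above are exactly why such a policy works better than a fixed schedule: a file unused for a long stretch is unlikely to be referenced again soon.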

  15. 76 FR 5572 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-02-01

    ... Interconnection, L.L.C. submits tariff filing per 35.13(a)(2)(iii: WMPA No. 2720, Queue V4-001, Flemington Solar... Rapids Transmission Project Construction Management Agreement to be effective 12/1/2010. Filed Date: 01...

  16. 76 FR 12724 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-03-08

    ... ITO--RC to be effective 4/ 26/2011. Filed Date: 02/25/2011 Accession Number: 20110225-5145 Comment... Description: Pacific Gas and Electric Company submits tariff filing per 35.13(a)(2)(iii: Corrections to PG&E's...

  17. Endodontic complications of root canal therapy performed by dental students with stainless-steel K-files and nickel-titanium hand files.

    Science.gov (United States)

    Pettiette, M T; Metzger, Z; Phillips, C; Trope, M

    1999-04-01

    Straightening of curved canals is one of the most common procedural errors in endodontic instrumentation. This problem is commonly encountered when dental students perform molar endodontics. The purpose of this study was to compare the effect of the type of instrument used by these students on the extent of straightening and on the incidence of other endodontic procedural errors. Nickel-titanium 0.02-taper hand files were compared with traditional stainless-steel 0.02-taper K-files. Sixty molar teeth, comprising maxillary and mandibular first and second molars, were treated by senior dental students. Instrumentation was performed with either nickel-titanium hand files or stainless-steel K-files. Preoperative and postoperative radiographs of each tooth were taken using an XCP precision instrument with a customized bite block to ensure accurate reproduction of radiographic angulation. The radiographs were scanned and the images stored as TIFF files. By superimposing tracings of the preoperative radiographs over the postoperative radiographs, the degree of deviation of the apical third of the root canal filling from the original canal was measured. The presence of other errors, such as strip perforation and instrument breakage, was established by examining the radiographs. In curved canals instrumented with stainless-steel K-files, the average deviation of the apical third of the canals was 14.44 degrees (+/- 10.33 degrees). The deviation was significantly reduced, to an average of 4.39 degrees (+/- 4.53 degrees), when nickel-titanium hand files were used. The incidence of other procedural errors was also significantly reduced by the use of nickel-titanium hand files.

  18. 18 CFR 154.4 - Electronic filing of tariffs and related materials.

    Science.gov (United States)

    2010-04-01

    ..., governmental authority, agency, or instrumentality on behalf of which the filing is made; or, (iii) A... form. These formats are available on the Internet at http://www.ferc.gov and can be obtained at the...

  19. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_FM1+FM2_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2003-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  20. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_FM1+FM3_Edition2)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2005-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  1. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_Terra-FM2_Edition1-CV)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2006-10-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  2. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_FM1+FM2_Edition2)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=2002-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  3. Liquid chromatography, in combination with a quadrupole time-of-flight instrument (LC QTOF), with sequential window acquisition of all theoretical fragment-ion spectra (SWATH) acquisition: systematic studies on its use for screenings in clinical and forensic toxicology and comparison with information-dependent acquisition (IDA).

    Science.gov (United States)

    Roemmelt, Andreas T; Steuer, Andrea E; Poetzsch, Michael; Kraemer, Thomas

    2014-12-02

Forensic and clinical toxicological screening procedures increasingly employ liquid chromatography-tandem mass spectrometry (LC-MS/MS) techniques with information-dependent acquisition (IDA) approaches. It is known that the complexity of a sample and the IDA settings might prevent important compounds from being triggered. Therefore, data-independent acquisition (DIA) methods should be more suitable for systematic toxicological analysis (STA). The DIA method sequential window acquisition of all theoretical fragment-ion spectra (SWATH), which uses Q1 windows of 20-35 Da for data-independent fragmentation, was systematically investigated for its suitability for STA. The quality of SWATH-generated mass spectra was evaluated with regard to mass error, relative abundance of the fragments, and library hits. With the Q1 window set to 20-25 Da, several precursors pass Q1 at the same time and are fragmented, impairing the library search algorithms to differing extents: forward fit was less affected than reverse fit and purity fit. Mass error was not affected. The relative abundance of the fragments was concentration dependent for some analytes and was influenced by cofragmentation, especially of deuterated analogues. The detection rate of IDA compared with SWATH was also investigated in a forced coelution experiment (up to 20 analytes coeluting). Even with several different IDA settings, IDA failed to trigger relevant compounds. Screening results of 382 authentic forensic cases revealed that SWATH's detection rate was superior to that of IDA, which failed to trigger ∼10% of the analytes.

  4. 47 CFR 73.3527 - Local public inspection file of noncommercial educational stations.

    Science.gov (United States)

    2010-10-01

    ... main studio and public file outside its community of license shall: (i) Make available to persons... 47 Telecommunication 4 2010-10-01 2010-10-01 false Local public inspection file of noncommercial... public inspection file of noncommercial educational stations. (a) Responsibility to maintain a file. The...

  5. 77 FR 22567 - Combined Notice of Filings #2

    Science.gov (United States)

    2012-04-16

    ... that the Commission received the following electric rate filings: Docket Numbers: ER12-677-001. Applicants: ITC Midwest LLC. Description: Compliance Filing--ITC Midwest, Storm Lake, and IPL Joint Use....m. ET 4/30/12. Docket Numbers: ER12-1417-001. [[Page 22568

  6. JENDL special purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1995-01-01

In JENDL-3.2, data on all reactions having significant cross sections over the neutron energy range from 0.01 meV to 20 MeV are given for 340 nuclides. Its range of application extends widely, covering neutron engineering, shielding and other aspects of fast reactors, thermal neutron reactors and nuclear fusion reactors; it is a general purpose data file. In contrast, a file in which only the data required for a specific application field are collected is called a special purpose file. The file for dosimetry is a typical special purpose file. The Nuclear Data Center, Japan Atomic Energy Research Institute, is preparing ten kinds of JENDL special purpose files. The files, of which the working groups of the Sigma Committee are in charge, are listed. As to format, the ENDF format is used, as in JENDL-3.2. The dosimetry file, activation cross section file, (α,n) reaction data file, fusion file, actinoid file, high energy data file, photonuclear data file, PKA/KERMA file, gas production cross section file and decay data file are described with respect to their contents, course of development and verification. The dosimetry file and gas production cross section file have already been completed. For the others, the expected time of completion is shown. When these files are completed, they will be opened to the public. (K.I.)

  7. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Science.gov (United States)

    2010-01-01

    ... a facsimile machine or by electronic mail if the reporting entity is not required to file..., including electronic reporting entities, may use the Commission's website's on-line program to file 48-hour... the reporting entity is not required to file electronically in accordance with 11 CFR 104.18. [67 FR...

  8. Storage of sparse files using parallel log-structured file system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-11-07

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
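The scheme above can be sketched compactly. The following toy Python sketch (class and method names are invented for illustration, not taken from the patent) packs data portions into a hole-free log and rebuilds the sparse file from (logical offset, physical offset, length) index entries, with holes restored as zero bytes on read:

```python
import io

class SparseLogStore:
    """Toy sketch: store sparse-file data hole-free in a log; an index maps it back."""

    def __init__(self):
        self.log = io.BytesIO()   # packed log: data portions only, no holes
        self.index = []           # entries: (logical_offset, physical_offset, length)

    def write(self, logical_offset, data):
        physical_offset = self.log.tell()   # each portion lands at the log's end
        self.log.write(data)
        self.index.append((logical_offset, physical_offset, len(data)))

    def read(self, size):
        """Restore the sparse file: holes come back as zero bytes."""
        buf = bytearray(size)
        for logical, physical, length in self.index:
            self.log.seek(physical)
            buf[logical:logical + length] = self.log.read(length)
        return bytes(buf)

store = SparseLogStore()
store.write(0, b"head")
store.write(4096, b"tail")      # leaves a hole between bytes 4 and 4096
restored = store.read(4100)
```

Note that the log holds only 8 bytes for a 4100-byte logical file; the index alone carries the hole structure.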

  9. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

Computer forensic analysts are in charge of investigation and evidence tracking. In certain cases, a file needed as digital evidence has been deleted. It is difficult to reconstruct such a file because it has often lost its header and cannot be identified during restoration. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence (LCS) approach consisting of three steps, namely training, testing and validation, to identify the file type from file fragments. From all testing results we conclude that the proposed method works well, achieving 92.91% accuracy in identifying the file type of file fragments for three data types.
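As a rough illustration of the idea (not the authors' implementation), an LCS-based classifier compares a headerless fragment against per-type reference signatures using the classic dynamic program:

```python
def lcs_length(a, b):
    """Classic O(len(a)*len(b)) dynamic program for the longest common subsequence."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            # extend a match diagonally, otherwise carry the best prefix result
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]

def classify(fragment, references):
    """Pick the file type whose reference signature shares the longest subsequence."""
    return max(references, key=lambda t: lcs_length(fragment, references[t]))
```

In a real pipeline the reference signatures would be learned in the training step from labeled fragments; here they are placeholder byte strings.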

  10. 49 CFR 564.5 - Information filing; agency processing of filings.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 6 2010-10-01 2010-10-01 false Information filing; agency processing of filings... HIGHWAY TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION REPLACEABLE LIGHT SOURCE INFORMATION (Eff. until 12-01-12) § 564.5 Information filing; agency processing of filings. (a) Each manufacturer...

  11. 78 FR 28210 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-05-14

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice... Company of New Mexico. Description: City of Gallup Network Integration Transmission Service Agreement to..., Section III--Distribution of Revenues to be effective 7/1/2013. Filed Date: 4/30/13. Accession Number...

  12. 78 FR 25261 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-04-30

    ...- 004; ER12-2496-004. Applicants: Bangor Hydro Electric Company, Emera Energy Services, Inc., Emera.... Description: Notice of Change in Status of Bangor Hydro Electric Company, et al. Filed Date: 4/22/13... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice...

  13. 77 FR 47831 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-08-10

    ... Corporation for Approval of an Interpretation to Reliability Standard CIP-004-4--Personnel and Training. Filed... Due: 5 p.m. ET 8/23/12. Take notice that the Commission received the following electric reliability filings. Docket Numbers: RD12-5-000. Applicants: North American Electric Reliability Corporation...

  14. Cut-and-Paste file-systems: integrating simulators and file systems

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1995-01-01

We have implemented an integrated and configurable file system called the Pegasus file system (PFS) and a trace-driven file-system simulator called Patsy. Patsy is used for off-line analysis of file-system algorithms; PFS is used for on-line file-system data storage. Algorithms are first analyzed in

  15. Conflict Detection Algorithm to Minimize Locking for MPI-IO Atomicity

    Science.gov (United States)

    Sehrish, Saba; Wang, Jun; Thakur, Rajeev

    Many scientific applications require high-performance concurrent I/O accesses to a file by multiple processes. Those applications rely indirectly on atomic I/O capabilities in order to perform updates to structured datasets, such as those stored in HDF5 format files. Current support for atomicity in MPI-IO is provided by locking around the operations, imposing lock overhead in all situations, even though in many cases these operations are non-overlapping in the file. We propose to isolate non-overlapping accesses from overlapping ones in independent I/O cases, allowing the non-overlapping ones to proceed without imposing lock overhead. To enable this, we have implemented an efficient conflict detection algorithm in MPI-IO using MPI file views and datatypes. We show that our conflict detection scheme incurs minimal overhead on I/O operations, making it an effective mechanism for avoiding locks when they are not needed.
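Once each process's MPI file view has been flattened into sorted (offset, length) byte ranges, the core of such a conflict check reduces to a linear merge scan over two sorted range lists. This simplified Python sketch (the actual MPI-IO implementation works on MPI datatypes, not Python lists) returns True only when the accesses overlap and locking is actually required:

```python
def conflict(a, b):
    """Detect whether any byte range (offset, length) in access a
    intersects a range in access b. Both lists are sorted by offset,
    so one merge pass suffices."""
    a, b = sorted(a), sorted(b)
    i = j = 0
    while i < len(a) and j < len(b):
        s1, l1 = a[i]
        s2, l2 = b[j]
        if s1 < s2 + l2 and s2 < s1 + l1:
            return True                 # byte ranges intersect: lock needed
        # advance whichever range ends first
        if s1 + l1 <= s2 + l2:
            i += 1
        else:
            j += 1
    return False
```

For interleaved but non-overlapping strided accesses, common in independent I/O, the scan reports no conflict and the lock can be skipped entirely.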

  16. Radiology Teaching Files on the Internet

    International Nuclear Information System (INIS)

    Lim, Eun Chung; Kim, Eun Kyung

    1996-01-01

There is increasing attention to radiology teaching files on the Internet in the field of diagnostic radiology. The purpose of this study was to aid the creation of new radiology teaching files by analysing the present radiology teaching file sites on the Internet in many aspects and evaluating the images on those sites, using a Macintosh IIci computer, a 28.8 kbps TelePort Fax/Modem, and Netscape Navigator 2.0 software. The results were as follows: 1. Analysis of radiology teaching file sites: (1) Country distribution was highest for the USA (57.5%). (2) The average number of cases was 186, and 9 sites (22.5%) had a search engine. (3) As to the method of case arrangement, anatomic-area type and diagnosis type were each found at 10 sites (25%), and question-and-answer type was found at 9 sites (22.5%). (4) Radiology teaching file sites covering oro-maxillofacial disorders numbered 9 (22.5%). (5) As to image format, GIF was found at 14 sites (35%) and JPEG at 14 sites (35%). (6) The creation year was most frequently 1995 (43.7%). (7) Continuing case upload was found at 35 sites (87.5%). 2. Evaluation of images on the radiology teaching files: (1) The average file size of GIF format (71 KByte) was greater than that of JPEG format (24 KByte) (P<0.001). (2) Image quality of GIF format was better than that of JPEG format (P<0.001).

  17. Hemodiafiltration history, technology, and clinical results.

    Science.gov (United States)

    Ronco, Claudio; Cruz, Dinna

    2007-07-01

    Hemodiafiltration (HDF) is an extracorporeal renal-replacement technique using a highly permeable membrane, in which diffusion and convection are conveniently combined to enhance solute removal in a wide spectrum of molecular weights. In this modality, ultrafiltration exceeds the desired fluid loss in the patient, and replacement fluid must be administered to achieve the target fluid balance. Over the years, various HDF variants have emerged, including acetate-free biofiltration, high-volume HDF, internal HDF, paired-filtration dialysis, middilution HDF, double high-flux HDF, push-pull HDF, and online HDF. Recent technology has allowed online production of large volumes of microbiologically ultrapure fluid for reinfusion, greatly simplifying the practice of HDF. Several advantages of HDF over purely diffusive hemodialysis techniques have been described in the literature, including a greater clearance of urea, phosphate, beta(2)-microglobulin and other larger solutes, reduction in dialysis hypotension, and improved anemia management. Although randomized controlled trials have failed to show a survival benefit of HDF, recent data from large observational studies suggest a positive effect of HDF on survival. This article provides a brief review of the history of HDF, the various HDF techniques, and summary of their clinical effects.

  18. Incidence of Apical Crack Initiation during Canal Preparation using Hand Stainless Steel (K-File) and Hand NiTi (Protaper) Files.

    Science.gov (United States)

    Soni, Dileep; Raisingani, Deepak; Mathur, Rachit; Madan, Nidha; Visnoi, Suchita

    2016-01-01

To evaluate the incidence of apical crack initiation during canal preparation with stainless steel K-files and hand ProTaper files (in vitro study). Sixty extracted mandibular premolar teeth were randomly selected and embedded in acrylic tubes filled with autopolymerizing resin. A baseline image of the apical surface of each specimen was recorded under a digital microscope (80×). The cervical and middle thirds of all samples were flared with #2 and #1 Gates-Glidden (GG) drills, and a second image was recorded. The teeth were randomly divided into four groups of 15 teeth each according to file type (hand K-file and hand ProTaper) and working length (WL) (instrumented at WL and 1 mm less than WL). A final image after dye penetration and a photomicrograph of the apical root surface were digitally recorded. The maximum number of cracks was observed with hand ProTaper files compared with hand K-files at the WL and 1 mm short of WL. Chi-square testing revealed a highly significant effect of WL on crack formation at WL and 1 mm short of WL (p = 0.000). The minimum number of cracks at WL and 1 mm short of WL was observed with hand K-files and the maximum with hand ProTaper files. Soni D, Raisingani D, Mathur R, Madan N, Visnoi S. Incidence of Apical Crack Initiation during Canal Preparation using Hand Stainless Steel (K-File) and Hand NiTi (Protaper) Files. Int J Clin Pediatr Dent 2016;9(4):303-307.

  19. 12 CFR 308.525 - Form, filing, and service of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Form, filing, and service of papers. 308.525... service of papers. (a) Form. (1) Documents filed with the ALJ must include an original and two copies. (2) Every pleading and paper filed in the proceeding must contain a caption setting forth the title of the...

  20. 12 CFR Appendix F to Part 360 - Customer File Structure

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Customer File Structure F Appendix F to Part... POLICY RESOLUTION AND RECEIVERSHIP RULES Pt. 360, App. F Appendix F to Part 360—Customer File Structure This is the structure of the data file to provide to the FDIC information related to each customer who...

  1. A comparison of self-reported quality of life for an Australian haemodialysis and haemodiafiltration cohort.

    Science.gov (United States)

    Hill, Kathleen E; Kim, Susan; Crail, Susan; Elias, Tony J; Whittington, Tiffany

    2017-08-01

Haemodiafiltration (HDF) has been widely studied for evidence of superior outcomes in comparison with conventional haemodialysis (HD), and there is increasing interest in determining whether HDF confers any benefit in relation to quality of life. Studies have been conducted with randomized incident patients; however, little is known regarding HDF and quality of life for prevalent patients. This study examined and compared self-reported quality of life at two time points, 12 months apart, in a cohort of satellite HD and HDF patients, using a disease-specific questionnaire to determine whether HDF conferred an advantage. A longitudinal study with a linear mixed-effect model measuring quality of life in a cohort of 171 patients (HD, n = 85; HDF, n = 86) in seven South Australian satellite dialysis centres. Factors associated with significant reduction across the Kidney Disease Quality Of Life™ domains measured were younger age (−20 to −29) and comorbid diabetes (−4.8 to −11.1). HDF was not associated with moderation of this reduction at either time point (P > 0.05). Baseline physical functioning was reported as very low (median 33.9) and was further reduced at time point two. In addition, dialysing for more than 12 h per week in a satellite dialysis unit was associated with reduced quality of life in relation to the burden of kidney disease (−13.69). This study has demonstrated that younger age and comorbid diabetes were responsible for a statistically significant reduction in quality of life, and HDF did not confer any advantage. © 2016 Asian Pacific Society of Nephrology.

  2. Cut-and-Paste file-systems : integrating simulators and file systems

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    We have implemented an integrated and configurable file system called the PFS and a trace-driven file-system simulator called Patsy. Patsy is used for off-line analysis of file-system algorithms, PFS is used for on-line file-system data storage. Algorithms are first analyzed in Patsy and when we are

  3. NOAA TIFF Image - 4m Multibeam Bathymetry , W00216 USVI 2011 , Seafloor Characterization of the US Caribbean - Nancy Foster - NF-11-1 (2011), UTM 20N NAD83

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains a GeoTIFF with 4x4 meter cell size representing the bathymetry of a sharply sloping swath of the St. John Shelf, a selected portion of seafloor...

  4. File sharing

    NARCIS (Netherlands)

    van Eijk, N.

    2011-01-01

'File sharing' has become generally accepted on the Internet. Users share files for downloading music, films, games, software, etc. In this note, we take a closer look at the definition of file sharing, the legal and policy-based context, as well as enforcement issues. The economic and cultural

  5. 77 FR 20813 - Combined Notice of Filings #2

    Science.gov (United States)

    2012-04-06

    ..., Inc. submits tariff filing per 35: 03-30-12 ATXI Attachment O and GG Compliance to be effective 3/1... City of Pella and MEC to be effective 4/1/2012. Filed Date: 3/30/12. Accession Number: 20120330-5080...., PPL Electric Utilities Corporation. Description: PPL Electric submits revisions to OATT Attachment H...

  6. Tabulation of Fundamental Assembly Heat and Radiation Source Files

    International Nuclear Information System (INIS)

    T. deBues; J.C. Ryman

    2006-01-01

The purpose of this calculation is to tabulate a set of computer files for use as input to the WPLOAD thermal loading software. These files contain details regarding heat and radiation from pressurized water reactor (PWR) assemblies and boiling water reactor (BWR) assemblies. The scope of this calculation is limited to rearranging and reducing the existing file information into a more streamlined set of tables for use as input to WPLOAD. The electronic source term files used as input to this calculation were generated from the output files of the SAS2H/ORIGEN-S sequence of the SCALE Version 4.3 modular code system, as documented in References 2.1.1 and 2.1.2, and are included in Attachment II

  7. 76 FR 1418 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-01-10

    ... Northeast MBR Sellers submit their Triennial Market Power Analysis. Filed Date: 12/30/2010. Accession Number... MBR Tariff--Seller Category Changes to be effective 3/4/2011. Filed Date: 01/03/2011. Accession Number... 35.13(a)(2)(iii: CalPeak El Cajon--Amendment to MBR Tariff--Seller Category Changes to be effective 3...

  8. 14 CFR 248.2 - Filing of audit reports.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Filing of audit reports. 248.2 Section 248...) ECONOMIC REGULATIONS SUBMISSION OF AUDIT REPORTS § 248.2 Filing of audit reports. (a) Whenever any air... Information, in duplicate, a special report consisting of a true and complete copy of the audit report...

  9. 78 FR 56223 - Combined Notice of Filings #2

    Science.gov (United States)

    2013-09-12

    ...-5229. Comments Due: 5 p.m. ET 9/26/13. Docket Numbers: ER13-1346-000. Applicants: Mesa Wind Power Corporation. Description: Mesa Wind Refund Report to be effective 9/4/2013. Filed Date: 9/5/13. Accession.... Applicants: Duke Energy Progress, Inc. Description: MBR Name Change to be effective 10/25/2013. Filed Date: 9...

  10. 78 FR 57146 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-09-17

    ... Management, LLC, GenOn Mid-Atlantic, LLC, Green Mountain Energy Company, High Plains Ranch II, LLC, Huntley... Revised Service Agreement No. 3452; Queue No. Y1-020 to be effective 8/8/2013. Filed Date: 9/9/13... Agreement No. 3639--Queue Position W4-038 to be effective 8/8/2013. Filed Date: 9/9/13. Accession Number...

  11. Renewal-anomalous-heterogeneous files

    International Nuclear Information System (INIS)

    Flomenbom, Ophir

    2010-01-01

Renewal-anomalous-heterogeneous files are solved. A simple file is made of Brownian hard spheres that diffuse stochastically in an effective 1D channel. Generally, Brownian files are heterogeneous: the spheres' diffusion coefficients are distributed and the initial spheres' density is non-uniform. In renewal-anomalous files, the distribution of waiting times for individual jumps is not exponential as in Brownian files, yet obeys ψ_α(t) ∼ t^(−1−α), 0 < α < 1. The file's mean square displacement (MSD), ⟨r²⟩, obeys ⟨r²⟩ ∼ (⟨r²⟩_nrml)^α, where ⟨r²⟩_nrml is the MSD in the corresponding Brownian file. This scaling is an outcome of an exact relation (derived here) connecting probability density functions of Brownian files and renewal-anomalous files. It is also shown that non-renewal-anomalous files are slower than the corresponding renewal ones.

  12. PC Graphic file programing

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

This book provides basic graphics knowledge and an understanding and implementation of graphic file formats. The first part deals with graphic data, storage of graphic data and data compression, and programming topics such as assembly, the stack, compiling and linking of programs, and practice and debugging. The next part covers graphic file formats such as the MacPaint file, GEM/IMG file, PCX file, GIF file, and TIFF file, hardware considerations such as mono screen drivers and high-speed color screen drivers, the basic concept of dithering, and format conversion.

  13. Experiences on File Systems: Which is the best file system for you?

    CERN Document Server

    Blomer, J

    2015-01-01

    The distributed file system landscape is scattered. Besides a plethora of research file systems, there is also a large number of production grade file systems with various strengths and weaknesses. The file system, as an abstraction of permanent storage, is appealing because it provides application portability and integration with legacy and third-party applications, including UNIX utilities. On the other hand, the general and simple file system interface makes it notoriously difficult for a distributed file system to perform well under a variety of different workloads. This contribution provides a taxonomy of commonly used distributed file systems and points out areas of research and development that are particularly important for high-energy physics.

  14. 76 FR 25685 - Orlando Utilities Commission; Notice of Filing

    Science.gov (United States)

    2011-05-05

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. NJ11-12-000] Orlando Utilities Commission; Notice of Filing Take notice that on April 15, 2011, Orlando Utilities Commission submitted its tariff filing per 35.25(e): Order 890 compliance to be effective 4/15/2011. Any person...

  15. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
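A minimal sketch of such a list-based index entry, with invented names and CRC-32 standing in for whatever checksum the real system uses: the entry carries a list of replica locations plus a checksum, and a read walks the list until it finds a copy that validates.

```python
import zlib

def make_index_entry(locations, data):
    """Index entry: a list of replica locations plus a checksum of the file data."""
    return {"locations": list(locations), "checksum": zlib.crc32(data)}

def read_valid(entry, fetch):
    """Try each listed location in order; return the first copy whose
    checksum matches the index, falling back to replicas on corruption."""
    for loc in entry["locations"]:
        data = fetch(loc)
        if data is not None and zlib.crc32(data) == entry["checksum"]:
            return data
    raise IOError("no valid replica found")
```

Here `fetch` abstracts reading from a storage node; a dictionary's `get` is enough to exercise the fallback logic.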

  16. Interaction of 2,4,6-trichlorophenol with high carbon iron filings: Reaction and sorption mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Sinha, Alok [Environmental Engineering and Management Programme, Department of Civil Engineering, Indian Institute of Technology Kanpur, Kanpur 208016 (India); Bose, Purnendu [Environmental Engineering and Management Programme, Department of Civil Engineering, Indian Institute of Technology Kanpur, Kanpur 208016 (India)], E-mail: pbose@iitk.ac.in

    2009-05-15

Reductive dehalogenation of 2,4,6-trichlorophenol (2,4,6-TCP) by two types of high carbon iron filings (HCIF), HCIF-1 and HCIF-2, was studied in batch reactors. While the iron, copper, manganese and carbon content of the two types of HCIF was similar, the specific surface areas of HCIF-1 and HCIF-2 were 1.944 and 3.418 m² g⁻¹, respectively. During interaction with HCIF-1, 2,4,6-TCP adsorbed on the HCIF-1 surface, resulting in rapid reduction of the aqueous phase 2,4,6-TCP concentration; however, reductive dehalogenation of 2,4,6-TCP was negligible. During interaction between 2,4,6-TCP and HCIF-2, both adsorption of 2,4,6-TCP on HCIF-2 and 2,4,6-TCP dechlorination were observed. 2,4,6-TCP partitioning between the solid and aqueous phases could be described by a Freundlich isotherm, while 2,4,6-TCP dechlorination could be described by an appropriate rate expression. A mathematical model was developed to describe the overall interaction of 2,4,6-TCP with HCIF-2, incorporating simultaneous adsorption/desorption and dechlorination reactions of 2,4,6-TCP with the HCIF surface. 2,4-Dichlorophenol (2,4-DCP), 2-chlorophenol (2-CP) and minor amounts of 4-chlorophenol (4-CP) evolved as 2,4,6-TCP dechlorination by-products. The evolved 2,4-DCP partitioned strongly to the HCIF surface; 4-CP and 2-CP accumulated in the aqueous phase. No transformation of 2-CP or 4-CP to phenol was observed.

  17. Effect of Taurine on Hemodiafiltration in Patients With Chronic Heart Failure.

    Science.gov (United States)

    Shiohira, Shunji; Komatsu, Mizuki; Okazaki, Masayuki; Naganuma, Toshiaki; Kawaguchi, Hiroshi; Nitta, Kosaku; Tsuchiya, Ken

    2016-02-01

Taurine, an important factor in the living body, is essential for cardiovascular function and for the development and function of skeletal muscle, the retina and the central nervous system. In the present study, its effect on cardiovascular function was specifically considered. In hemodiafiltration (HDF) patients, the effect of taurine on patients with chronic heart failure (CHF), in whom dry weight was difficult to control, was evaluated. All patients who underwent regular HDF for 4 h three times per week at Joban hospital were included in this study. Patients with chronic heart failure, in whom dry weight was difficult to control (N = 4), were included in the evaluation of clinical status. X-ray and echocardiography examinations were performed before and after taurine treatment. Almost all patients were taking nitric acid, warfarin, anti-platelet agents and vasopressors. Because vital signs were unstable in chronic heart failure, all cases withheld antihypertensive drugs during HDF. In patients with unstable vital signs during HDF, chronic pulmonary congestion was recognized. After taurine was started, vital signs stabilized and lowering of dry weight became possible. In addition, X-ray findings and cardiac diastolic function on echocardiography improved. Taurine was effective for CHF patients on HDF in whom dry weight was difficult to control in spite of various medications. © 2015 International Society for Apheresis, Japanese Society for Apheresis, and Japanese Society for Dialysis Therapy.

  18. Distributing File-Based Data to Remote Sites Within the BABAR Collaboration

    International Nuclear Information System (INIS)

    Gowdy, Stephen J.

    2002-01-01

BABAR [1] uses two formats for its data: Objectivity databases and root [2] files. This poster concerns the distribution of the latter; for Objectivity data see [3]. The BABAR analysis data is stored in root files, one per physics run and analysis selection channel, maintained in a large directory tree. Currently BABAR has more than 4.5 TBytes in 200,000 root files. This data is (mostly) produced at SLAC, but is required for analysis at universities and research centers throughout the US and Europe. Two basic problems confront us when we seek to import bulk data from SLAC to an institute's local storage via the network. We must determine which files must be imported (depending on the local site requirements and which files have already been imported), and we must make optimum use of the network when transferring the data. Basic ftp-like tools (ftp, scp, etc.) do not attempt to solve the first problem. More sophisticated tools like rsync [4], the widely-used mirror/synchronization program, compare local and remote file systems, checking for changes (based on file date, size and, if desired, an elaborate checksum) in order to copy only new or modified files. However, rsync allows for only limited file selection. Also, when, as in BABAR, an extremely large directory structure must be scanned, rsync can take several hours just to determine which files need to be copied. Although rsync (and scp) provides on-the-fly compression, it does not allow us to optimize the network transfer by using multiple streams, adjusting the TCP window size, or separating encrypted authentication from unencrypted data channels
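The first problem, deciding which files a site must import, amounts to comparing file manifests under a site-selection predicate. A hedged sketch (function and path names are invented for illustration, not BABAR's actual tooling):

```python
def files_to_import(remote, local, wanted):
    """Given manifests mapping path -> (size, mtime), return the paths a site
    should copy: those it wants that are missing locally or have changed."""
    return [path for path, meta in remote.items()
            if wanted(path) and local.get(path) != meta]

# Example: a site that only mirrors a hypothetical 'muon' selection channel
remote = {"run1/muon.root": (100, 5),
          "run1/tau.root": (80, 5),
          "run2/muon.root": (120, 6)}
local = {"run1/muon.root": (100, 5)}
todo = files_to_import(remote, local, wanted=lambda p: "muon" in p)
```

Working from precomputed manifests rather than walking the directory tree on every sync is what avoids the multi-hour scans mentioned above.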

  19. PCF File Format.

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the Gamma Detector Response and Analysis Software (GADRAS) package [1]. A PCF file can contain multiple spectra and information about each spectrum, such as energy calibration. This document outlines the format of the file, allowing one to write a computer program to parse and write such files.

  20. Cleanliness of endodontic files after use and before sterilization

    Directory of Open Access Journals (Sweden)

    Maria de Lourdes Portella

    2008-01-01

    Full Text Available Objectives: To evaluate the efficacy of two endodontic file cleaning methods: manual cleaning and cleaning in an ultrasonic vat. Method: Sixty-six endodontic files were used for root canal preparations and afterwards divided into three groups: 1) manual cleaning; 2) ultrasonic cleaning; 3) files used in patients but not cleaned (positive control). Results: Statistical analysis showed that for manually cleaned files, the percentage of cleanliness was 0.4% while 99.6% remained dirty. For ultrasonic cleaning, the cleanliness percentage was 49.21% while the percentage of dirt was 50.79%. Conclusion: The most satisfactory result was obtained with the use of ultrasound, and it is suggested that after ultrasound, brushing, washing with liquid soap and water, and drying should be performed for adequate cleaning of endodontic files.

  1. 75 FR 33299 - Guardian Pipeline, L.L.C.; Notice of Revised Filing

    Science.gov (United States)

    2010-06-11

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RP10-690-002] Guardian Pipeline, L.L.C.; Notice of Revised Filing June 4, 2010. Take notice that on May 14, 2010, Guardian Pipeline, L.L.C. submitted a revised filing to its filing made on May 14, 2010, in the above referenced...

  2. Screw-in forces during instrumentation by various file systems.

    Science.gov (United States)

    Ha, Jung-Hong; Kwak, Sang Won; Kim, Sung-Kyo; Kim, Hyeon-Cheol

    2016-11-01

    The purpose of this study was to compare the maximum screw-in forces generated during the movement of various nickel-titanium (NiTi) file systems. Forty simulated canals in resin blocks were randomly divided into 4 groups for the following instruments: Mtwo size 25/0.07 (MTW, VDW GmbH), Reciproc R25 (RPR, VDW GmbH), ProTaper Universal F2 (PTU, Dentsply Maillefer), and ProTaper Next X2 (PTN, Dentsply Maillefer; n = 10). All the artificial canals were prepared to obtain a standardized lumen by using ProTaper Universal F1. Screw-in forces were measured using a custom-made experimental device (AEndoS-k, DMJ system) during instrumentation with each NiTi file system using the designated movement. The rotation speed was set at 350 rpm with an automatic 4 mm pecking motion at a speed of 1 mm/sec. The pecking depth was increased by 1 mm for each pecking motion until the file reached the working length. Forces were recorded during file movement, and the maximum force was extracted from the data. Maximum screw-in forces were analyzed by one-way ANOVA and Tukey's post hoc comparison at a significance level of 95%. Reciproc and ProTaper Universal files generated the highest maximum screw-in forces among all the instruments, while Mtwo and ProTaper Next showed the lowest (p < 0.05). The use of files with a smaller cross-sectional area, for higher flexibility, is recommended.
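The analysis described, extracting the maximum force per recorded trace and comparing groups with a one-way ANOVA, can be sketched in a few lines. The numbers below are made up for illustration and are not the study's data; the F statistic is computed directly from the between- and within-group sums of squares.

```python
# Toy version of the force analysis: per-trace maxima, then a
# one-way ANOVA F statistic over the group maxima (pure Python).

def one_way_anova_F(groups):
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

traces = {"MTW": [[0.1, 0.9, 0.4], [0.2, 1.1, 0.3]],   # invented traces (N)
          "RPR": [[0.5, 2.0, 1.2], [0.6, 2.2, 1.0]]}
maxima = {k: [max(t) for t in ts] for k, ts in traces.items()}
F = one_way_anova_F(list(maxima.values()))
print(maxima, round(F, 2))
```

A large F relative to the F distribution's critical value (here with 1 and 2 degrees of freedom) is what justifies the post hoc group comparisons.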

  3. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  4. CAREN-4, ENDF/B Utility, Discontinuity Check at Resonance Region Boundary. CHECK-4, ENDF/B Utility, Structure Consistency Check and Format Check. CRECT, ENDF/B Utility, Data Correlation and Data Update. DICT-4, ENDF/B Utility, Section Table of Contents Generator. LISTF-4, ENDF/B Utility, Data Listing. RIGEL-4, ENDF/B Utility, Data Retrieval, BCD to BIN Conversion. SUMUP-4, ENDF/B Utility, Partial Cross-Sections Sum Check Against Tot Cross-Sections

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1975-08-01

    Description of problem or function: These programs are updated versions of ENDF/B checking, retrieval, and display codes capable of processing the Version-IV data files. RIGEL4 retrieves ENDF/B data and changes mode (BCD-BIN) or arrangement. Updated version processes decay data (mf=1, mt=457) and error files. CHECK4 checks the structure, consistency, and formats of ENDF data files. Updated version recognizes newly defined mt and mf numbers, checks mt=457 and the formats of the error files, contains RSIC photon file changes. SUMUP4 checks whether partial cross sections add up to the total. Updated version flags grid points present in the partial cross section, absent in the total. LISTF4 produces interpreted listings of ENDF/B data. Updated version lists mt=457, skips over error files, contains minor corrections. PLOTF4 produces plots of ENDF/B data. Updated version plots mt=457, skips over error files, contains minor corrections. RESEND processes ENDF materials with resonance parameters into a pointwise form. Capability of processing Adler-Adler parameters added. CRECT corrects ENDF/B data files. DICT4 generates a section table of contents (dictionary) for ENDF/B materials. CAREN4 tests for discontinuities across the limits of resonance ranges of an ENDF/B material. Updated version contains minor corrections.
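The SUMUP4 consistency check described above is simple to illustrate: sum the partial cross sections on a common energy grid, compare against the total, and flag grid points present in a partial but absent from the total. The data and function name below are illustrative, not ENDF/B content.

```python
# Sketch of a SUMUP4-style check: partials must add up to the total,
# and every partial's energy grid point should appear in the total.

def sumup_check(total, partials, tol=1e-6):
    """total/partials: dicts mapping energy (eV) -> cross section (barns)."""
    missing = sorted({e for p in partials for e in p} - set(total))
    bad = [e for e in total
           if abs(sum(p.get(e, 0.0) for p in partials) - total[e]) > tol]
    return missing, bad

total = {1.0: 3.0, 2.0: 4.0}
elastic = {1.0: 2.0, 2.0: 2.5}
capture = {1.0: 1.0, 2.0: 1.6, 3.0: 0.9}
print(sumup_check(total, [elastic, capture]))  # ([3.0], [2.0])
```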

  5. Thermal lattice benchmarks for testing basic evaluated data files, developed with MCNP4B

    International Nuclear Information System (INIS)

    Maucec, M.; Glumac, B.

    1996-01-01

    The development of unit cell and full reactor core models of the DIMPLE S01A, TRX-1 and TRX-2 benchmark experiments, using the Monte Carlo computer code MCNP4B, is presented. Nuclear data from the ENDF/B-V and -VI versions of the cross-section library were used in the calculations. In addition, a comparison is presented to results obtained with similar models and cross-section data from the EJ2-MCNPlib library (which is based upon the JEF-2.2 evaluation) developed at IRC Petten, Netherlands. The results of the criticality calculation with the ENDF/B-VI data library, and a comparison to results obtained using the JEF-2.2 evaluation, confirm the MCNP4B full core model of the DIMPLE reactor as a good benchmark for testing basic evaluated data files. On the other hand, the criticality calculation results obtained using the TRX full core models show less agreement with experiment. It is obvious that without additional data about the TRX geometry, our TRX models are not suitable as Monte Carlo benchmarks. (author)

  6. NASA ARCH- A FILE ARCHIVAL SYSTEM FOR THE DEC VAX

    Science.gov (United States)

    Scott, P. J.

    1994-01-01

    The function of the NASA ARCH system is to provide a permanent storage area for files that are infrequently accessed. The NASA ARCH routines were designed to provide a simple mechanism by which users can easily store and retrieve files. The user treats NASA ARCH as the interface to a black box where files are stored. There are only five NASA ARCH user commands, even though NASA ARCH employs standard VMS directives and the VAX BACKUP utility. Special care is taken to provide the security needed to ensure file integrity over a period of years. The archived files may exist in any of three storage areas: a temporary buffer, the main buffer, and a magnetic tape library. When the main buffer fills up, it is transferred to permanent magnetic tape storage and deleted from disk. Files may be restored from any of the three storage areas. A single file, multiple files, or entire directories can be stored and retrieved. Archived entities retain the same name, extension, version number, and VMS file protection scheme as they had in the user's account prior to archival. NASA ARCH is capable of handling up to 7 directory levels. Wildcards are supported. User commands include TEMPCOPY, DISKCOPY, DELETE, RESTORE, and DIRECTORY. The DIRECTORY command searches a directory of savesets covering all three archival areas, listing matches according to area, date, filename, or other criteria supplied by the user. The system manager commands include 1) ARCHIVE- to transfer the main buffer to duplicate magnetic tapes, 2) REPORT- to determine when the main buffer is full enough to archive, 3) INCREMENT- to back up the partially filled main buffer, and 4) FULLBACKUP- to back up the entire main buffer. On-line help files are provided for all NASA ARCH commands. NASA ARCH is written in DEC VAX DCL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.X. This program was developed in 1985.

  7. Text File Comparator

    Science.gov (United States)

    Kotler, R. S.

    1983-01-01

    The file comparator program IFCOMP is a text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts as input two text files and produces a listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.

  8. File System Virtual Appliances

    Science.gov (United States)

    2010-05-01

    When 4 KB of data is read or written, data is copied back and forth using trampoline buffers, pages that are shared during proxy initialization.

  9. Status and evaluation methods of JENDL fusion file and JENDL PKA/KERMA file

    International Nuclear Information System (INIS)

    Chiba, S.; Fukahori, T.; Shibata, K.; Yu Baosheng; Kosako, K.

    1997-01-01

    The status of evaluated nuclear data in the JENDL fusion file and PKA/KERMA file is presented. The JENDL fusion file was prepared in order to improve the quality of the JENDL-3.1 data, especially on the double-differential cross sections (DDXs) of secondary neutrons and gamma-ray production cross sections, and to provide DDXs of secondary charged particles (p, d, t, ³He and α-particles) for the calculation of PKA and KERMA factors. The JENDL fusion file contains evaluated data for 26 elements ranging from Li to Bi. The data in the JENDL fusion file reproduce the measured data on neutron and charged-particle DDXs and also on gamma-ray production cross sections. Recoil spectra in the PKA/KERMA file were calculated from the secondary neutron and charged-particle DDXs contained in the fusion file with two-body reaction kinematics. The data in the JENDL fusion file and PKA/KERMA file were compiled in ENDF-6 format with an MF=6 option to store the DDX data. (orig.)

  10. NOAA TIFF Image - 4m Backscatter Mosaic , W00216 USVI 2011 , Seafloor Characterization of the US Caribbean - Nancy Foster - NF-11-1 (2011), UTM 20N NAD83

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This image represents a 4 meter resolution backscatter mosaic of a sharply sloping swath of the St. John Shelf, south of St. John, US Virgin Islands. NOAA's...

  11. Eight-Year Experience with Nocturnal, Every-Other-Day, Online Haemodiafiltration.

    Science.gov (United States)

    Maduell, Francisco; Ojeda, Raquel; Arias-Guillen, Marta; Rossi, Florencia; Fontseré, Néstor; Vera, Manel; Rico, Nayra; Gonzalez, Leonardo Nicolás; Piñeiro, Gastón; Jiménez-Hernández, Mario; Rodas, Lida; Bedini, José Luis

    2016-01-01

    New haemodialysis therapeutic regimens are required to improve patient survival. Longer and more frequent dialysis sessions have produced excellent survival and clinical advantages, while online haemodiafiltration (OL-HDF) provides the most efficient form of dialysis treatment. In this single-centre observational study, 57 patients on 4-5-hour thrice-weekly OL-HDF were switched to nocturnal every-other-day OL-HDF. Inclusion criteria consisted of stable patients with good prospects for improved occupational, psychological and social rehabilitation. The aim of this study was to report our 8-year experience with this schedule and to evaluate analytical and clinical outcomes. Nocturnal, every-other-day OL-HDF was well tolerated and 56% of patients were working. The convective volume increased from 26.7 ± 2 litres at baseline to 46.6 ± 6.5 litres at 24 months. Nocturnal, every-other-day OL-HDF could be an excellent therapeutic alternative, since it is well tolerated and leads to clinical and social-occupational rehabilitation with satisfactory morbidity and mortality. These encouraging results strengthen our commitment to continue, and we invite other clinicians to join this initiative. © 2016 S. Karger AG, Basel.

  12. 29 CFR 4902.11 - Specific exemptions: Office of Inspector General Investigative File System.

    Science.gov (United States)

    2010-07-01

    ... Investigative File System. 4902.11 Section 4902.11 Labor Regulations Relating to Labor (Continued) PENSION... General Investigative File System. (a) Criminal Law Enforcement. (1) Exemption. Under the authority... Inspector General Investigative File System—PBGC” from the provisions of 5 U.S.C. 552a (c)(3), (c)(4), (d)(1...

  13. 75 FR 33799 - EasTrans, LLC; Notice of Baseline Filing

    Science.gov (United States)

    2010-06-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-30-000] EasTrans, LLC; Notice of Baseline Filing June 8, 2010. Take notice that on June 4, 2010, EasTrans, LLC submitted a baseline filing of its Statement of Operating Conditions for services provided under section 311 of the...

  14. Multi-level, automatic file management system using magnetic disk, mass storage system and magnetic tape

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1979-12-01

    A simple, effective file management system using magnetic disk, mass storage system (MSS) and magnetic tape is described. The following are the concepts and techniques introduced in this file management system. (1) The distribution of files and the continuity character of file references are closely approximated by a memory retention function; a density function based on this memory retention function is thus defined. (2) A method of computing the cost/benefit lines for magnetic disk, MSS and magnetic tape is presented. (3) A decision process for an optimal organization of file facilities, incorporating the distribution of file demands over the respective file devices, is presented. (4) A method of simple, practical, effective, automatic file management, incorporating multi-level file management, space management and file migration control, is proposed. (author)
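Points (1)-(3) above can be sketched as a tiering policy: score each file with a memory retention function of the time since its last reference, then place it on disk, MSS, or tape according to thresholds that stand in for the cost/benefit lines. The exponential retention form and the cutoff values here are assumptions for illustration, not the paper's exact functions.

```python
# Hedged sketch of retention-driven file migration across three tiers.
import math

def retention(days_since_ref, half_life=30.0):
    """Assumed memory retention function: halves every `half_life` days."""
    return math.exp(-math.log(2) * days_since_ref / half_life)

def assign_tier(days_since_ref, disk_cut=0.5, mss_cut=0.1):
    r = retention(days_since_ref)
    if r >= disk_cut:
        return "disk"   # hot: keep on magnetic disk
    if r >= mss_cut:
        return "mss"    # warm: mass storage system
    return "tape"       # cold: migrate to magnetic tape

for days in (1, 45, 200):
    print(days, assign_tier(days))
```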

  15. TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA

    Energy Technology Data Exchange (ETDEWEB)

    Stanhope, C [Beaumont Health System, Royal Oak MI and Wayne State University, Detroit, MI (United States); Liang, J; Drake, D; Yan, D [Beaumont Health System, Royal Oak, MI (United States)

    2016-06-15

    Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta’s Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40ms. Five VMAT plans [4 H&N, 1 Pulsed Brain] comprised of 2 arcs each were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). Comparing standard deviations of the three plan-pair distributions, relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05%±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans – significantly less noisy. Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2mm dose resolution is recommended, however, less modulated arcs may allow less

  16. Swath-altimetry measurements of the main stem Amazon River: measurement errors and hydraulic implications

    Science.gov (United States)

    Wilson, M. D.; Durand, M.; Jung, H. C.; Alsdorf, D.

    2015-04-01

    The Surface Water and Ocean Topography (SWOT) mission, scheduled for launch in 2020, will provide a step-change improvement in the measurement of terrestrial surface-water storage and dynamics. In particular, it will provide the first, routine two-dimensional measurements of water-surface elevations. In this paper, we aimed to (i) characterise and illustrate in two dimensions the errors which may be found in SWOT swath measurements of terrestrial surface water, (ii) simulate the spatio-temporal sampling scheme of SWOT for the Amazon, and (iii) assess the impact of each of these on estimates of water-surface slope and river discharge which may be obtained from SWOT imagery. We based our analysis on a virtual mission for a ~260 km reach of the central Amazon (Solimões) River, using a hydraulic model to provide water-surface elevations according to SWOT spatio-temporal sampling to which errors were added based on a two-dimensional height error spectrum derived from the SWOT design requirements. We thereby obtained water-surface elevation measurements for the Amazon main stem as may be observed by SWOT. Using these measurements, we derived estimates of river slope and discharge and compared them to those obtained directly from the hydraulic model. We found that cross-channel and along-reach averaging of SWOT measurements using reach lengths greater than 4 km for the Solimões and 7.5 km for Purus reduced the effect of systematic height errors, enabling discharge to be reproduced accurately from the water height, assuming known bathymetry and friction. Using cross-sectional averaging and 20 km reach lengths, results show Nash-Sutcliffe model efficiency values of 0.99 for the Solimões and 0.88 for the Purus, with 2.6 and 19.1 % average overall error in discharge, respectively. We extend the results to other rivers worldwide and infer that SWOT-derived discharge estimates may be more accurate for rivers with larger channel widths (permitting a greater level of cross
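The along-reach averaging central to this abstract can be illustrated numerically: averaging water-surface heights over a reach suppresses height error before slope or discharge estimation. A toy example with synthetic numbers, not SWOT data; the alternating "error" is contrived so the cancellation is visible.

```python
# Sketch: average noisy heights in fixed-length reaches; correlated-free
# noise averages down, recovering the underlying water-surface profile.

def reach_average(heights, reach_len):
    return [sum(heights[i:i + reach_len]) / len(heights[i:i + reach_len])
            for i in range(0, len(heights), reach_len)]

# True surface falls 1 cm per node; add a fixed alternating error.
true = [100.0 - 0.01 * i for i in range(8)]
noisy = [h + (0.05 if i % 2 == 0 else -0.05) for i, h in enumerate(true)]
avg = reach_average(noisy, 2)
print([round(a, 3) for a in avg])  # errors cancel within each reach
```

In the real mission, errors are spatially correlated in two dimensions, which is why the paper needs multi-kilometre reach lengths rather than the two-sample windows used here.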

  17. 76 FR 52650 - Federal Energy Regulatory Commission Combined Notice of Filings #1

    Science.gov (United States)

    2011-08-23

    ... Depreciation Rate Update) to be effective 1/1/2012. Filed Date: 08/11/2011. Accession Number: 20110811-5114...)(iii: JEA Scherer Unit 4 TSA Amendment Filing (SEGCO Depreciation Rate Update) to be effective 1/1/2012...

  18. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    International Nuclear Information System (INIS)

    Schreiner, Steffen; Banerjee, Subho Sankar; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Bagnasco, Stefano; Zhu Jianlin

    2011-01-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.
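The "signed status message" idea above, where the storage system attests a file's size and checksum so the catalogue no longer depends on client trust, can be sketched with a keyed MAC. This uses HMAC purely for illustration; AliEn's actual envelope and signing scheme differ in detail, and the key, LFN and checksum below are invented.

```python
# Sketch: storage element signs (lfn, size, checksum); the catalogue
# verifies the tag before trusting the reported metadata.
import hmac
import hashlib

KEY = b"storage-element-key"  # assumed shared between SE and catalogue

def sign_status(lfn, size, checksum):
    msg = f"{lfn}|{size}|{checksum}".encode()
    return msg, hmac.new(KEY, msg, hashlib.sha256).hexdigest()

def verify_status(msg, tag):
    expected = hmac.new(KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

msg, tag = sign_status("/alice/data/run1.root", 1024, "9f3a")
print(verify_status(msg, tag), verify_status(msg + b"x", tag))  # True False
```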

  19. A Metadata-Rich File System

    Energy Technology Data Exchange (ETDEWEB)

    Ames, S; Gokhale, M B; Maltzahn, C

    2009-01-07

    Despite continual improvements in the performance and reliability of large scale file systems, the management of file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, metadata, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS includes Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
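The graph data model described, files with attribute metadata plus typed relationships between files, can be sketched in a few lines. The class and query below are a plain in-memory traversal for illustration, not QFS internals or Quasar/XPath syntax.

```python
# Minimal graph-of-files model: attribute query plus typed-link lookup.

class MetadataRichFS:
    def __init__(self):
        self.attrs = {}   # file name -> {key: value} metadata
        self.links = []   # (src, relation, dst) typed relationships

    def add_file(self, name, **attrs):
        self.attrs[name] = attrs

    def link(self, src, relation, dst):
        self.links.append((src, relation, dst))

    def related(self, src, relation):
        return [d for s, r, d in self.links if s == src and r == relation]

    def query(self, **match):
        return [f for f, a in self.attrs.items()
                if all(a.get(k) == v for k, v in match.items())]

fs = MetadataRichFS()
fs.add_file("sim.h5", kind="raw", run=42)
fs.add_file("plot.png", kind="derived")
fs.link("plot.png", "derivedFrom", "sim.h5")
print(fs.query(kind="raw"), fs.related("plot.png", "derivedFrom"))
```

The point of making relationships first class is visible even here: "what was this plot derived from?" is one lookup, not a join against an external database.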

  20. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    International Nuclear Information System (INIS)

    DeMarco, J; McCloskey, S; Low, D; Moran, J

    2014-01-01

    Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and provides a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record-and-verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site and a total of 1267 treatment fields were evaluated, including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067±0.001mm and 0.066±0.002mm for leaf bank A and B respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course including MLC leaf positions and table positions at time of image acquisition and during treatment
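The core comparison in this abstract, expected versus actual MLC leaf positions reduced to a maximum per-leaf RMS error, is straightforward to sketch. The numbers below are illustrative, not TrueBeam trajectory data, and the original routine is MATLAB rather than Python.

```python
# Sketch: per-leaf RMS position error over all control points, then
# the worst leaf, matching the "maximum RMS MLC error" metric above.
import math

def max_rms_leaf_error(expected, actual):
    """expected/actual: lists of control points, each a list of leaf positions (mm)."""
    n_leaves = len(expected[0])
    worst = 0.0
    for leaf in range(n_leaves):
        sq = [(e[leaf] - a[leaf]) ** 2 for e, a in zip(expected, actual)]
        worst = max(worst, math.sqrt(sum(sq) / len(sq)))
    return worst

expected = [[10.0, 20.0], [11.0, 21.0]]   # 2 control points, 2 leaves
actual = [[10.1, 20.0], [11.0, 20.9]]
print(round(max_rms_leaf_error(expected, actual), 4))
```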

  1. HUD GIS Boundary Files

    Data.gov (United States)

    Department of Housing and Urban Development — The HUD GIS Boundary Files are intended to supplement boundary files available from the U.S. Census Bureau. The files are for community planners interested in...

  2. 10 CFR 708.14 - How much time does an employee have to file a complaint?

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false How much time does an employee have to file a complaint... Complaint Resolution Process § 708.14 How much time does an employee have to file a complaint? (a) You must... filing stops running on the day the internal grievance is filed and begins to run again on the earlier of...

  3. NOAA TIFF Image - 4m Backscatter Mosaic , W00216 USVI 2011 , Seafloor Characterization of the US Caribbean - Nancy Foster - NF-11-1 (2011), UTM 20N NAD83 (NCEI Accession 0131858)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This image represents a 4 meter resolution backscatter mosaic of a sharply sloping swath of the St. John Shelf, south of St. John, US Virgin Islands. NOAA's...

  4. 33 CFR 148.246 - When is a document considered filed and where should I file it?

    Science.gov (United States)

    2010-07-01

    ... filed and where should I file it? 148.246 Section 148.246 Navigation and Navigable Waters COAST GUARD... Formal Hearings § 148.246 When is a document considered filed and where should I file it? (a) If a document to be filed is submitted by mail, it is considered filed on the date it is postmarked. If a...

  5. SIDS-to-ADF File Mapping Manual

    Science.gov (United States)

    McCarthy, Douglas; Smith, Matthew; Poirier, Diane; Smith, Charles A. (Technical Monitor)

    2002-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The SIDS-to-ADF File Mapping Manual specifies the exact manner in which, under CGNS conventions, CFD data structures (the SIDS) are to be stored in (i.e., mapped onto) the file structure provided by the database manager (ADF). The result is a conforming CGNS database. Adherence to the mapping conventions guarantees uniform meaning and location of CFD data within ADF files, and thereby allows the construction of

  6. 12 CFR Appendix C to Part 360 - Deposit File Structure

    Science.gov (United States)

    2010-01-01

    .... • STATE = State government. • COMM = Commercial. • CORP = Corporate. • BANK = Bank Owned. • DUE TO = Other... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Deposit File Structure C Appendix C to Part 360... RESOLUTION AND RECEIVERSHIP RULES Pt. 360, App. C Appendix C to Part 360—Deposit File Structure This is the...

  7. Protecting your files on the DFS file system

    CERN Multimedia

    Computer Security Team

    2011-01-01

    The Windows Distributed File System (DFS) hosts user directories for all NICE users, plus much more data.    Files can be accessed from anywhere via a dedicated web portal (http://cern.ch/dfs). Due to the ease of access to DFS within CERN, it is of utmost importance to properly protect access to sensitive data. As the use of DFS access control mechanisms is not obvious to all users, passwords, certificates or sensitive files might get exposed. This has happened in the past with the Andrew File System (AFS, the Linux equivalent of DFS), and led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed recently to apply more stringent protections to all DFS user folders. The goal of this data protection policy is to assist users in pro...

  8. Protecting your files on the AFS file system

    CERN Multimedia

    2011-01-01

    The Andrew File System is a world-wide distributed file system linking hundreds of universities and organizations, including CERN. Files can be accessed from anywhere, via dedicated AFS client programs or via web interfaces that export the file contents on the web. Due to the ease of access to AFS it is of utmost importance to properly protect access to sensitive data in AFS. As the use of AFS access control mechanisms is not obvious to all users, passwords, private SSH keys or certificates have been exposed in the past. In one specific instance, this also led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed in April 2010 to apply more stringent folder protections to all AFS user folders. The goal of this data protection policy is to assist users in...

  9. Zebra: A striped network file system

    Science.gov (United States)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails, its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity update.
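The parity mechanism Zebra borrows from RAID can be shown concretely: XOR the stripe fragments held by the data servers to form a parity fragment, and any single lost fragment can be rebuilt from the survivors. A minimal sketch (fragment contents invented):

```python
# Sketch of RAID-style parity over stripe fragments: parity is the
# byte-wise XOR of all fragments; XOR-ing the survivors with the
# parity recovers a single missing fragment.

def xor_fragments(frags):
    out = bytearray(len(frags[0]))
    for frag in frags:
        for i, b in enumerate(frag):
            out[i] ^= b
    return bytes(out)

fragments = [b"aaaa", b"bbbb", b"cccc"]   # one fragment per data server
parity = xor_fragments(fragments)

# Server 1 fails; rebuild its fragment from the rest plus parity.
rebuilt = xor_fragments([fragments[0], fragments[2], parity])
print(rebuilt == fragments[1])  # True
```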

  10. CERES ERBE-like Monthly Geographical Averages (ES-4) in HDF (CER_ES4_TRMM-PFM_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    The ERBE-like Monthly Geographical Averages (ES-4) product contains a month of space and time averaged Clouds and the Earth's Radiant Energy System (CERES) data for a single scanner instrument. The ES-4 is also produced for combinations of scanner instruments. For each observed 2.5-degree spatial region, the daily average, the hourly average over the month, and the overall monthly average of shortwave and longwave fluxes at the Top-of-the-Atmosphere (TOA) from the CERES ES-9 product are spatially nested up from 2.5-degree regions to 5- and 10-degree regions, to 2.5-, 5-, and 10-degree zonal averages, and to global monthly averages. For each nested area, the albedo and net flux are given. For each region, the daily average flux is estimated from an algorithm that uses the available hourly data, scene identification data, and diurnal models. This algorithm is 'like' the algorithm used for the Earth Radiation Budget Experiment (ERBE). The following CERES ES4 data sets are currently available: CER_ES4_FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition1 CER_ES4_PFM+FM1+FM2_Edition2 CER_ES4_PFM+FM1_Edition1 CER_ES4_PFM+FM2_Edition1 CER_ES4_TRMM-PFM_Edition1 CER_ES4_TRMM-PFM_Edition2 CER_ES4_Terra-FM1_Edition1 CER_ES4_Terra-FM2_Edition1 CER_ES4_FM1+FM2_Edition2 CER_ES4_Terra-FM1_Edition2 CER_ES4_Terra-FM2_Edition2 CER_ES4_Aqua-FM3_Edition1 CER_ES4_Aqua-FM4_Edition1 CER_ES4_FM1+FM2+FM3+FM4_Edition1 CER_ES4_Aqua-FM3_Edition2 CER_ES4_Aqua-FM4_Edition2 CER_ES4_FM1+FM3_Edition2 CER_ES4_FM1+FM4_Edition2 CER_ES4_PFM+FM1_Edition2 CER_ES4_PFM+FM2_Edition2 CER_ES4_Aqua-FM3_Edition1-CV CER_ES4_Aqua-FM4_Edition1-CV CER_ES4_Terra-FM1_Edition1-CV CER_ES4_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1998-01-01; Stop_Date=1998-08-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Latitude_Resolution=2.5 degree; Longitude_Resolution=2.5 degree; Horizontal

  11. Advanced graphical user interface for multi-physics simulations using AMST

    Science.gov (United States)

    Hoffmann, Florian; Vogel, Frank

    2017-07-01

    Numerical modelling of particulate matter has gained much popularity in recent decades. Advanced Multi-physics Simulation Technology (AMST) is a state-of-the-art three-dimensional numerical modelling technique combining the eXtended Discrete Element Method (XDEM) with Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) [1]. One major limitation of this code is the lack of a graphical user interface (GUI), meaning that all pre-processing has to be done directly in an HDF5 file. This contribution presents the first graphical pre-processor developed for AMST.
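As an illustration of what such a pre-processor must produce, the sketch below assembles a case setup as a hierarchy mirroring an HDF5 group layout. The group and dataset names are hypothetical, not the real AMST schema, and the optional writer assumes the third-party h5py package:

```python
# Sketch: assembling an XDEM/CFD case setup as a hierarchy that mirrors
# the HDF5 layout a pre-processor might write for AMST.
# Group/dataset names are hypothetical, not the actual AMST schema.

def build_case(n_particles: int, dt: float) -> dict:
    """Return a nested dict mirroring an HDF5 file: groups -> datasets."""
    return {
        "simulation": {"timestep": dt, "n_steps": 1000},
        "particles": {
            "count": n_particles,
            "positions": [[0.0, 0.0, 0.0] for _ in range(n_particles)],
        },
    }

def write_hdf5(case: dict, path: str) -> None:
    """Write the hierarchy with h5py, if available (assumed dependency)."""
    import h5py
    with h5py.File(path, "w") as f:
        for group, members in case.items():
            g = f.create_group(group)
            for name, value in members.items():
                g[name] = value  # h5py creates a dataset per assignment
```

A GUI front end would then only have to populate such a structure from forms instead of requiring users to hand-edit the HDF5 file.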

  12. CERES BiDirectional Scans (BDS) data in HDF (CER_BDS_TRMM-PFM_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    Each BiDirectional Scans (BDS) data product contains twenty-four hours of Level-1b data for each CERES scanner instrument mounted on each spacecraft. The BDS includes samples taken in normal and short Earth scan elevation profiles in both fixed and rotating azimuth scan modes (including space, internal calibration, and solar calibration views). The BDS contains Level-0 raw (unconverted) science and instrument data as well as the geolocated converted science and instrument data. The BDS contains additional data not found in the Level-0 input file, including converted satellite position and velocity data, celestial data, converted digital status data, and parameters used in the radiance count conversion equations. The following CERES BDS data sets are currently available: CER_BDS_TRMM-PFM_Edition1 CER_BDS_Terra-FM1_Edition1 CER_BDS_Terra-FM2_Edition1 CER_BDS_Terra-FM1_Edition2 CER_BDS_Terra-FM2_Edition2 CER_BDS_Aqua-FM3_Edition1 CER_BDS_Aqua-FM4_Edition1 CER_BDS_Aqua-FM3_Edition2 CER_BDS_Aqua-FM4_Edition2 CER_BDS_Aqua-FM3_Edition1-CV CER_BDS_Aqua-FM4_Edition1-CV CER_BDS_Terra-FM1_Edition1-CV CER_BDS_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1997-12-27; Stop_Date=2000-12-31] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 day; Temporal_Resolution_Range=Daily - < Weekly].

  13. JENDL Dosimetry File

    International Nuclear Information System (INIS)

    Nakazawa, Masaharu; Iguchi, Tetsuo; Kobayashi, Katsuhei; Iwasaki, Shin; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo.

    1992-03-01

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV, in both point-wise and group-wise files in the ENDF-5 format. In order to confirm the reliability of the data, several integral tests were carried out: comparison with the data in IRDF-85 and with average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d,n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85, but there are some problems to be improved in the future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in graphical form. (author) 76 refs

  14. JENDL Dosimetry File

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Masaharu; Iguchi, Tetsuo [Tokyo Univ. (Japan). Faculty of Engineering; Kobayashi, Katsuhei [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Iwasaki, Shin [Tohoku Univ., Sendai (Japan). Faculty of Engineering; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1992-03-15

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV, in both point-wise and group-wise files in the ENDF-5 format. In order to confirm the reliability of the data, several integral tests were carried out: comparison with the data in IRDF-85 and with average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d,n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85, but there are some problems to be improved in the future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in graphical form.

  15. A feeder-free culture using autogeneic conditioned medium for undifferentiated growth of human embryonic stem cells: Comparative expression profiles of mRNAs, microRNAs and proteins among different feeders and conditioned media

    Directory of Open Access Journals (Sweden)

    Chou Chi-Hsien

    2010-10-01

    Full Text Available Abstract Background: Human embryonic stem (hES) cell lines were derived from the inner cell mass of human blastocysts, and were cultured on mouse embryonic fibroblast (MEF) feeder to maintain undifferentiated growth, extensive renewal capacity, and pluripotency. The hES-T3 cell line with normal female karyotype was previously used to differentiate into autogeneic fibroblast-like cells (T3HDF) as feeder to support the undifferentiated growth of hES-T3 cells (T3/HDF) for 14 passages. Results: A feeder-free culture on Matrigel in hES medium conditioned by the autogeneic feeder cells (T3HDF) was established to maintain the undifferentiated growth of hES-T3 cells (T3/CMHDF) for 8 passages in this investigation. The gene expression profiles of mRNAs, microRNAs and proteins between the undifferentiated T3/HDF and T3/CMHDF cells were shown to be very similar, and their expression profiles were also found to be similar to those of T3/MEF and T3/CMMEF cells grown on MEF feeder and feeder-free Matrigel in MEF-conditioned medium, respectively. The undifferentiated state of T3/HDF and T3/CMHDF as well as T3/MEF and T3/CMMEF cells was evidenced by the very high expression levels of "stemness" genes and low expression levels of differentiation markers of ectoderm, mesoderm and endoderm, in addition to the strong staining of OCT4 and NANOG. Conclusion: The T3HDF feeder and T3HDF-conditioned medium were able to support the undifferentiated growth of hES cells, and they would be useful for drug development and toxicity testing, in addition to the reduced risks of xenogeneic pathogens when used for medical applications such as cell therapies.

  16. Vasoactive Peptide Levels after Change of Dialysis Mode

    Directory of Open Access Journals (Sweden)

    Fredrik Uhlin

    2015-10-01

    Full Text Available Background/Aims: Plasma concentrations of the N-terminal fragment of pro-brain natriuretic peptide (NT-proBNP) are increased in end-stage renal disease. Improvement in hemodynamic stability has been reported when switching from hemodialysis (HD) to on-line hemodiafiltration (ol-HDF). The aim of this study was to investigate plasma concentrations of NT-proBNP, BNP and neuropeptide Y (NPY) during a 1-year follow-up, after a change from high-flux HD to postdilution ol-HDF. Additional variables were also studied, e.g. pulse wave velocity and ordinary clinical parameters. Method: We conducted a prospective, single-center study including 35 patients who were switched from HD to HDF. Plasma concentrations of NT-proBNP, BNP and NPY before and after dialysis were measured at baseline (i.e. HD) and at 1, 2, 4, 6 and 12 months on HDF. Results: All three peptide levels decreased significantly during HD and HDF when comparing concentrations before and after dialysis. The mean absolute values (before/after) and relative decrease (%) before versus after dialysis were 13.697/9.497 ng/l (31%) for NT-proBNP, 62/40 ng/ml (35%) for BNP and 664/364 pg/l (45%) for NPY. No significant differences were observed when comparing predialysis values over time. However, postdialysis NT-proBNP concentration showed a significant decrease of 48% over time after the switch to HDF. Conclusion: The postdialysis plasma levels of NT-proBNP, BNP and NPY decreased significantly during both dialysis modes when compared to before dialysis. The postdialysis lowering of NT-proBNP increased further over time after the switch to ol-HDF; the predialysis levels were unchanged, suggesting no effect on its production in the ventricles of the heart.

  17. A File Archival System

    Science.gov (United States)

    Fanselow, J. L.; Vavrus, J. L.

    1984-01-01

    ARCH, a file archival system for the DEC VAX, provides for easy offline storage and retrieval of arbitrary files on a DEC VAX system. The system is designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of the same programs and associated files.

  18. 29 CFR 1602.43 - Commission's remedy for school systems' or districts' failure to file report.

    Science.gov (United States)

    2010-07-01

    ...' failure to file report. Any school system or district failing or refusing to file report EEO-5 when... 29 Labor 4 2010-07-01 2010-07-01 false Commission's remedy for school systems' or districts' failure to file report. 1602.43 Section 1602.43 Labor Regulations Relating to Labor (Continued) EQUAL...

  19. Comparative evaluation of debris extruded apically by using, Protaper retreatment file, K3 file and H-file with solvent in endodontic retreatment

    Directory of Open Access Journals (Sweden)

    Chetna Arora

    2012-01-01

    Full Text Available Aim: The aim of this study was to evaluate the apical extrusion of debris, comparing two engine-driven systems and a hand instrumentation technique during root canal retreatment. Materials and Methods: Forty-five human permanent mandibular premolars were prepared using the step-back technique, obturated with gutta-percha/zinc oxide eugenol sealer and the cold lateral condensation technique. The teeth were divided into three groups: Group A: Protaper retreatment file, Group B: K3 file, Group C: H-file with tetrachloroethylene. All the canals were irrigated with 20 ml distilled water during instrumentation. Debris extruded along with the irrigating solution during the retreatment procedure was carefully collected in preweighed Eppendorf tubes. The tubes were stored in an incubator for 5 days, placed in a desiccator and then re-weighed. The weight of dry debris was calculated by subtracting the weight of the tube before instrumentation from the weight of the tube after instrumentation. Data was analyzed using two-way ANOVA and a post hoc test. Results: There was a statistically significant difference in the apical extrusion of debris between hand instrumentation and the Protaper retreatment file and K3 file. The difference in the amount of debris extruded by the Protaper retreatment file and K3 file instrumentation techniques was not statistically significant. All three instrumentation techniques produced apically extruded debris and irrigant. Conclusion: The best way to minimize the extrusion of debris is by adopting a crown-down technique; therefore the use of a rotary technique (Protaper retreatment file, K3 file) is recommended.

  20. Reduced protein bound uraemic toxins in vegetarian kidney failure patients treated by haemodiafiltration.

    Science.gov (United States)

    Kandouz, Sakina; Mohamed, Ali Shendi; Zheng, Yishan; Sandeman, Susan; Davenport, Andrew

    2016-10-01

    Introduction: Indoxyl sulfate (IS) and p-cresyl sulfate (PCS) are protein-bound toxins which accumulate with chronic kidney disease. Haemodiafiltration (HDF) increases middle molecule clearances and has been suggested to increase IS and PCS clearance. We therefore wished to establish whether higher convective clearances with HDF would reduce IS and PCS concentrations. Methods: We measured total plasma IS and PCS in a cohort of 138 CKD5d patients treated by On-line HDF (Ol-HDF), by high pressure liquid chromatography. Findings: Mean patient age was 64.6 ± 16.5 years, 60.1% male, 57.3% diabetic, median dialysis vintage 25.9 months (12.4-62.0). The mean IS concentration was 79.8 ± 56.4 µmol/L and PCS 140.3 ± 101.8 µmol/L. On multivariate analysis, IS was associated with serum albumin (β 4.31, P vegetarian diet (β -28.3, P = 0.048) and PCS negatively with log C reactive protein (β -75.8, P vegetarian diet (β -109, P = 0.001). Vegetarian patients had lower IS and PCS levels (median 41.5 (24.2-71.9) vs. 78.1 (49.5-107.5) and PCS (41.6 (14.2-178.3) vs. 127.3 (77.4-205.6) µmol/L, respectively, P Vegetarian patients had lower preOl-HDF serum urea and phosphate (13.8 ± 3.8 vs. 18.4 ± 5.2 mmol/L, and 1.33 ± 0.21 vs. 1.58 ± 0.45 mmol/L), and estimated urea nitrogen intake (1.25 ± 0.28 vs. 1.62 ± 0.5 g/kg/day), respectively, all P vegetarian diet had reduced IS and PCS concentrations. Although this could be due to differences in dietary protein intake, a vegetarian diet may also potentially reduce IS and PCS production by the intestinal microbiome. © 2016 International Society for Hemodialysis.

  1. Computer Forensics Method in Analysis of Files Timestamps in Microsoft Windows Operating System and NTFS File System

    Directory of Open Access Journals (Sweden)

    Vesta Sergeevna Matveeva

    2013-02-01

    Full Text Available All existing file browsers display three timestamps for every file in the NTFS file system. Nowadays there are many utilities that can manipulate these temporal attributes to conceal traces of file use. However, every file in NTFS has eight timestamps that are stored in its file record and can be used to detect the substitution of attributes. The authors suggest a method of revealing the original timestamps after replacement, and an automated variant of it for sets of files.
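The detection idea can be sketched as follows, assuming the eight timestamps have already been parsed from the MFT file record into the four $STANDARD_INFORMATION and four $FILE_NAME values (the function and key names are illustrative, not the authors' actual tool):

```python
# Sketch: flagging timestamp manipulation in NTFS by comparing the four
# $STANDARD_INFORMATION timestamps (what file browsers show, and what
# "timestomping" tools rewrite) against the four $FILE_NAME timestamps
# also kept in the MFT file record and rarely touched by such tools.

from datetime import datetime

def suspicious(si: dict, fn: dict, slack_seconds: int = 0) -> list:
    """Return the names of timestamps where $SI predates $FN.

    si / fn map 'created', 'modified', 'mft_modified', 'accessed'
    to datetime objects parsed from a file record (parsing not shown).
    """
    flagged = []
    for key in ("created", "modified", "mft_modified", "accessed"):
        if (fn[key] - si[key]).total_seconds() > slack_seconds:
            # $FN newer than $SI: the $SI value was likely back-dated
            flagged.append(key)
    return flagged
```

In normal operation $FILE_NAME timestamps are set at creation or rename, so a $STANDARD_INFORMATION value older than its $FILE_NAME counterpart is a classic indicator that the visible timestamps were replaced.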

  2. 78 FR 25070 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-04-29

    ...-1311-000 Applicants: Duke Energy Carolinas, LLC Description: Joint OATT Progress Depreciation to be... for Revised Depreciation Rates for Power Supply and Coordination Agreement. Filed Date: 4/19/13...

  3. Processing of evaluated neutron data files in ENDF format on personal computers

    International Nuclear Information System (INIS)

    Vertes, P.

    1991-11-01

    A computer code package - FDMXPC - has been developed for processing evaluated data files in ENDF format. The earlier version of this package is supplemented with modules performing calculations using Reich-Moore and Adler-Adler resonance parameters. The processing of evaluated neutron data files by personal computers requires special programming considerations outlined in this report. The scope of the FDMXPC program system is demonstrated by means of numerical examples. (author). 5 refs, 4 figs, 4 tabs
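Processing ENDF files hinges on their fixed-width, 80-column record layout; the sketch below shows a minimal reader for one record, including ENDF's implicit-exponent number format. This is a hedged illustration of the file format, not code from FDMXPC:

```python
# Sketch: reading one fixed-width ENDF record (80-column card image).
# Each record carries six 11-character data fields followed by the
# MAT (4 cols), MF (2 cols) and MT (3 cols) identifiers; ENDF floats
# omit the 'E' exponent marker (e.g. '1.234560+6' means 1.23456e6).

def endf_float(field: str) -> float:
    """Convert an ENDF-style number, restoring the implicit exponent."""
    s = field.strip()
    if not s:
        return 0.0
    # insert 'E' before the sign that starts the exponent part
    for i in range(1, len(s)):
        if s[i] in "+-" and s[i - 1] not in "eE":
            return float(s[:i] + "E" + s[i:])
    return float(s)

def parse_record(line: str):
    """Split one record into its six data fields and MAT/MF/MT."""
    fields = [endf_float(line[i * 11:(i + 1) * 11]) for i in range(6)]
    mat = int(line[66:70])
    mf = int(line[70:72])
    mt = int(line[72:75])
    return fields, mat, mf, mt
```

The special-purpose float conversion is exactly the kind of "special programming consideration" the report alludes to: standard formatted-read routines reject ENDF's exponent notation.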

  4. 76 FR 43679 - Filing via the Internet; Notice of Additional File Formats for efiling

    Science.gov (United States)

    2011-07-21

    ... list of acceptable file formats the four-character file extensions for Microsoft Office 2007/2010... files from Office 2007 or Office 2010 in an Office 2003 format prior to submission. Dated: July 15, 2011...

  5. UPIN Group File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Group Unique Physician Identifier Number (UPIN) File is the business entity file that contains the group practice UPIN and descriptive information. It does NOT...

  6. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-01-06

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
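A minimal sketch of the mapping step described above, with a hypothetical Model View Definition entry and a toy unit transformation; this illustrates the idea of dispatching each extracted attribute to a registered translation function, not the actual patented procedure:

```python
# Sketch: mapping a Model View Definition entry to a translation and
# transformation function (entry and field names are hypothetical).

def to_metres(value: float, unit: str) -> float:
    """Toy geometric transformation: normalise lengths to metres."""
    return value * {"mm": 0.001, "m": 1.0}[unit]

# The MVD template: extracted IFC attribute -> target simulation field.
MVD_MAP = {
    "IfcWall.Height": lambda v, u: ("wall_height_m", to_metres(v, u)),
}

def translate(entity_attr: str, value: float, unit: str):
    """Apply the function registered for an extracted attribute,
    producing the field/value pair for the target simulation file."""
    return MVD_MAP[entity_attr](value, unit)
```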

  7. A new insight into the oscillation characteristics of endosonic files used in dentistry

    International Nuclear Information System (INIS)

    Lea, S C; Walmsley, A D; Lumley, P J; Landini, G

    2004-01-01

    The aim of this study was to assess the oscillation characteristics of unconstrained endosonic files using a scanning laser vibrometer (SLV). Factors investigated included file vibration frequency and node/antinode location as well as the variation in file displacement amplitude due to increasing generator power setting. A 30 kHz Mini Piezon generator (Electro-Medical Systems, Switzerland) was used in conjunction with a no. 15 and no. 35 K-file. Each file was fixed in position with the long axis of the file perpendicular to the SLV camera head. The laser from the SLV was scanned over the length of the oscillating file for generator power settings 1 to 5 (minimum to half power). Measurements were repeated ten times. The fundamental vibration frequency for both files was 27.50 kHz. Scans of each file showed the positions of nodes/anti-nodes along the file length. The no. 15 file demonstrated no significant variation in its mean maximum displacement amplitude with increasing generator power, except at power setting 5, where a decrease in displacement amplitude was observed. The no. 35 file showed a general increase in mean maximum displacement amplitude with increasing power setting, except at power setting 4 where a 65% decrease in displacement amplitude occurred. In conclusion, scanning laser vibrometry is an effective method for assessing endosonic file vibration characteristics. The SLV was able to demonstrate that (unloaded) file vibration displacement amplitude does not increase linearly with increasing generator power. Further work is being performed on a greater variety of files and generators. Vibration characteristics of files under various loads and varying degrees of constraint should also be investigated

  8. 12 CFR 303.5 - Effect of Community Reinvestment Act performance on filings.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Effect of Community Reinvestment Act performance on filings. 303.5 Section 303.5 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION PROCEDURE... Reinvestment Act performance on filings. Among other factors, the FDIC takes into account the record of...

  9. Huygens file service and storage architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  11. Building a parallel file system simulator

    International Nuclear Information System (INIS)

    Molina-Estolano, E; Maltzahn, C; Brandt, S A; Bent, J

    2009-01-01

    Parallel file systems are gaining in popularity in high-end computing centers as well as commercial data centers. High-end computing systems are expected to scale exponentially and to pose new challenges to their storage scalability in terms of cost and power. To address these challenges, scientists and file system designers will need a thorough understanding of the design space of parallel file systems. Yet there exist few systematic studies of parallel file system behavior at petabyte and exabyte scale. An important reason is the significant cost of getting access to large-scale hardware to test parallel file systems. To contribute to this understanding we are building a parallel file system simulator that can simulate parallel file systems at very large scale. Our goal is to simulate petabyte-scale parallel file systems on a small cluster or even a single machine in reasonable time and fidelity. With this simulator, file system experts will be able to tune existing file systems for specific workloads, scientists and file system deployment engineers will be able to better communicate workload requirements, file system designers and researchers will be able to try out design alternatives and innovations at scale, and instructors will be able to study very large-scale parallel file system behavior in the classroom. In this paper we describe our approach and provide preliminary results that are encouraging both in terms of fidelity and simulation scalability.
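A toy model hints at the kind of question such a simulator answers, for example how a single slow server limits a striped write. This is purely illustrative; the real simulator models queues, caches and workloads in far more detail:

```python
# Sketch: completion time of a write striped evenly over N servers,
# each with a fixed bandwidth in bytes/second (toy model only).

def write_time(file_bytes: int, server_bw: list) -> float:
    """The write finishes when the slowest server finishes its share."""
    share = file_bytes / len(server_bw)
    return max(share / bw for bw in server_bw)
```

Even this crude model shows the straggler effect: halving one server's bandwidth doubles the completion time of the whole striped write, which is why fidelity at scale matters when exploring the design space.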

  12. 76 FR 52323 - Combined Notice of Filings; Filings Instituting Proceedings

    Science.gov (United States)

    2011-08-22

    .... Applicants: Young Gas Storage Company, Ltd. Description: Young Gas Storage Company, Ltd. submits tariff..., but intervention is necessary to become a party to the proceeding. The filings are accessible in the.... More detailed information relating to filing requirements, interventions, protests, and service can be...

  13. Cost-effectiveness analysis of online hemodiafiltration versus high-flux hemodialysis

    Directory of Open Access Journals (Sweden)

    Ramponi F

    2016-09-01

    Full Text Available Francesco Ramponi,1,2 Claudio Ronco,1,3 Giacomo Mason,1 Enrico Rettore,4 Daniele Marcelli,5,6 Francesca Martino,1,3 Mauro Neri,1,7 Alejandro Martin-Malo,8 Bernard Canaud,5,9 Francesco Locatelli10 1International Renal Research Institute (IRRIV), San Bortolo Hospital, Vicenza, 2Department of Economics and Management, University of Padova, Padova, 3Department of Nephrology, San Bortolo Hospital, Vicenza, 4Department of Sociology and Social Research, University of Trento, FBK-IRVAPP & IZA, Trento, Italy; 5Europe, Middle East, Africa and Latin America Medical Board, Fresenius Medical Care, Bad Homburg, Germany; 6Danube University, Krems, Austria; 7Department of Management and Engineering, University of Padova, Vicenza, Italy; 8Nephrology Unit, Reina Sofia University Hospital, Córdoba, Spain; 9School of Medicine, Montpellier University, Montpellier, France; 10Department of Nephrology, Manzoni Hospital, Lecco, Italy Background: Clinical studies suggest that hemodiafiltration (HDF) may lead to better clinical outcomes than high-flux hemodialysis (HF-HD), but concerns have been raised about the cost-effectiveness of HDF versus HF-HD. The aim of this study was to investigate whether clinical benefits, in terms of longer survival and better health-related quality of life, are worth the possibly higher costs of HDF compared to HF-HD. Methods: The analysis comprised a simulation based on the combined results of previously published studies, with the following steps: (1) estimation of the survival function of HF-HD patients from a clinical trial and of HDF patients using the risk reduction estimated in a meta-analysis; (2) simulation of the survival of the same sample of patients as if allocated to HF-HD or HDF using three-state Markov models; and (3) application of state-specific health-related quality of life coefficients and differential costs derived from the literature. Several Monte Carlo simulations were performed, including simulations for patients with different
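Step (2) of such an analysis can be sketched as a Monte Carlo run over a three-state Markov model. The state names, monthly cycle, utilities and transition probabilities below are illustrative placeholders, not the study's actual parameters:

```python
# Sketch: a three-state Markov cohort simulation of the kind described.
# States, cycle length, utilities and probabilities are invented for
# illustration; a real analysis would calibrate them to trial data.

import random

def simulate_patient(p_die: float, p_transplant: float, cycles: int, rng) -> float:
    """Return QALYs accrued over monthly cycles (toy utility weights)."""
    utility = {"alive": 0.70, "transplanted": 0.85, "dead": 0.0}
    state, qalys = "alive", 0.0
    for _ in range(cycles):
        if state == "alive":
            r = rng.random()
            if r < p_die:
                state = "dead"
            elif r < p_die + p_transplant:
                state = "transplanted"
        elif state == "transplanted" and rng.random() < p_die / 2:
            state = "dead"
        qalys += utility[state] / 12.0  # one cycle = one month
    return qalys

def mean_qalys(p_die, p_transplant, cycles=120, n=2000, seed=1):
    """Average QALYs over a simulated cohort (fixed seed for repeatability)."""
    rng = random.Random(seed)
    return sum(simulate_patient(p_die, p_transplant, cycles, rng)
               for _ in range(n)) / n
```

Running the same cohort with the HF-HD mortality and with the meta-analysis risk-reduced HDF mortality, then attaching state-specific costs, yields the incremental cost per QALY that the study reports on.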

  14. Fleet Battle Experiment Juliet Final Reconstruction and Analysis Report

    Science.gov (United States)

    2003-04-01

    Selected Vessel Statistics: Joint Venture, Sea SLICE. Ship particulars: Wave Piercing Catamaran (CAT), Small Waterplane Area Twin Hull (SWATH...and broadcasting capabilities. NSWC Corona used a built-in function within GCCS-M to broadcast all OTH Gold Contact (CTC) messages to a file. These...example would be the prohibition of e-mail from electronically accessing the e-mail address book; thus denying many self-propagating viruses

  15. Evolution of the Data Access Protocol in Response to Community Needs

    Science.gov (United States)

    Gallagher, J.; Caron, J. L.; Davis, E.; Fulker, D.; Heimbigner, D.; Holloway, D.; Howe, B.; Moe, S.; Potter, N.

    2012-12-01

    Under the aegis of the OPULS (OPeNDAP-Unidata Linked Servers) Project, funded by NOAA, version 2 of OPeNDAP's Data Access Protocol (DAP2) is being updated to version 4. DAP4 is the first major upgrade in almost two decades and will embody three main areas of advancement. First, the data-model extensions developed by the OPULS team focus on three areas: better support for coverages, access to HDF5 files, and access to relational databases. DAP2 support for coverages (defined as sampled functions) was limited to simple rectangular coverages that work well for (some) model outputs and processed satellite data but that cannot represent trajectories or satellite swath data, for example. We have extended the coverage concept in DAP4 to remove these limitations. These changes are informed by work at Unidata on the Common Data Model and also by the OGC's abstract coverages specification. In a similar vein, we have extended DAP2's support for relations by including the concept of foreign keys, so that tables can be explicitly related to one another. Second, the web interfaces (web services) that provide access to data via DAP will be more clearly defined and will use other, orthogonal standards where they are appropriate. An important case is the XML interface, which provides a cleaner way to build other response media types such as JSON and RDF (for metadata) and to build support for Atom, thus simplifying the integration of DAP servers with tools that support OpenSearch. Input from the ESIP federation and work performed with IOOS have informed our choices here. Last, DAP4-compliant servers will support richer data-processing capabilities than DAP2, enabling a wider array of server functions that manipulate data before returning values. Two projects are currently exploring what can be done even with DAP2's server-function model: the MIIC project at LARC and OPULS itself (with work performed at the University of Washington).
    Both projects have demonstrated that
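For context, server-side subsetting in DAP2 is requested through a constraint expression appended to the dataset URL, using hyperslab projections of the form var[start:stride:stop]. The helper below composes such a request; the host and variable names are hypothetical:

```python
# Sketch: building a DAP2-style constraint expression so the server
# subsets the data before sending it (illustrative host/variable names).

def dap_url(base: str, var: str, *slices) -> str:
    """Compose a .dods request with hyperslab projections per dimension."""
    proj = var + "".join(f"[{a}:{s}:{b}]" for a, s, b in slices)
    return f"{base}.dods?{proj}"
```

Server functions go one step further than projections like this, letting the server transform values, not just slice them, before the response is returned.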

  16. 78 FR 20904 - Combined Notice of Filings #2

    Science.gov (United States)

    2013-04-08

    ... Market Power Analysis of Southwestern Public Service Company. Filed Date: 4/1/13. Accession Number....m. ET 4/22/13. Docket Numbers: ER13-1211-000. Applicants: Public Service Company of New Mexico. Description: Public Service Company of New Mexico submits Modification of Real Power Loss Factor SA to be...

  17. FHEO Filed Cases

    Data.gov (United States)

    Department of Housing and Urban Development — The dataset is a list of all the Title VIII fair housing cases filed by FHEO from 1/1/2007 - 12/31/2012 including the case number, case name, filing date, state and...

  18. Nocturnal, every-other-day, online haemodiafiltration: an effective therapeutic alternative.

    Science.gov (United States)

    Maduell, Francisco; Arias, Marta; Durán, Carlos E; Vera, Manel; Fontseré, Néstor; Azqueta, Manel; Rico, Nayra; Pérez, Nuria; Sentis, Alexis; Elena, Montserrat; Rodriguez, Néstor; Arcal, Carola; Bergadá, Eduardo; Cases, Aleix; Bedini, Jose Luis; Campistol, Josep M

    2012-04-01

    Longer and more frequent dialysis sessions have demonstrated excellent survival and clinical advantages, while online haemodiafiltration (OL-HDF) provides the most efficient form of dialysis treatment. The aim of this study was to evaluate the beneficial effects of a longer (nocturnal) and more frequent (every-other-day) dialysis schedule with OL-HDF at the same or the highest convective volume. This prospective, in-centre crossover study was carried out in 26 patients, 18 males and 8 females, 49.2±14 years old, on 4-5 h thrice-weekly post-dilution OL-HDF, switched to nocturnal every-other-day OL-HDF. Patient inclusion criteria consisted of stable patients with good vascular access and with good prospects for improved occupational, psychological and social rehabilitation. Patients were randomly assigned into two groups: Group A received the same convective volume as previously for 6 months followed by a higher convective volume for a further 6 months, while Group B received the same schedule in reverse order. Nocturnal every-other-day OL-HDF was well tolerated and 56% of patients who were working during the baseline period continued to work throughout the study with practically no absenteeism. The convective volume was 26.7±2 L at baseline, 27.5±2 with the unchanged volume and 42.9±4 L with the higher volume. eKt/V increased from 1.75±0.4 to 3.37±0.9. Bicarbonate, blood urea nitrogen (BUN) and creatinine values decreased, while phosphate levels fell markedly with a 90% reduction in phosphate binders. Blood pressure and left ventricular hypertrophy (LVH) improved and the use of anti-hypertensive drugs decreased. In both groups, BUN, creatinine and β2-microglobulin reduction ratios improved. Different removal patterns were observed for myoglobin, prolactin and α1-acid glycoprotein. Nocturnal every-other-day OL-HDF could be an excellent therapeutic alternative since good tolerance and occupational rehabilitation, marked improvement in dialysis dose

  19. Distributing file-based data to remote sites within the BABAR collaboration

    International Nuclear Information System (INIS)

    Adye, T.; Dorigo, A.; Forti, A.; Leonardi, E.

    2001-01-01

    BABAR uses two formats for its data: Objectivity databases and ROOT files. This poster concerns the distribution of the latter--for Objectivity data see. The BABAR analysis data is stored in ROOT files--one per physics run and analysis selection channel--maintained in a large directory tree. Currently BABAR has more than 4.5 TBytes in 200,000 ROOT files. This data is (mostly) produced at SLAC, but is required for analysis at universities and research centres throughout the US and Europe. Two basic problems confront us when we seek to import bulk data from SLAC to an institute's local storage via the network. We must determine which files must be imported (depending on the local site requirements and which files have already been imported), and we must make the optimum use of the network when transferring the data. Basic ftp-like tools (ftp, scp, etc.) do not attempt to solve the first problem. More sophisticated tools like rsync, the widely used mirror/synchronisation program, compare local and remote file systems, checking for changes (based on file date, size and, if desired, an elaborate checksum) in order to copy only new or modified files. However, rsync allows for only limited file selection. Also, when, as in BABAR, an extremely large directory structure must be scanned, rsync can take several hours just to determine which files need to be copied. Although rsync (and scp) provides on-the-fly compression, it does not allow us to optimise the network transfer by using multiple streams, adjusting the TCP window size, or separating encrypted authentication from unencrypted data channels.
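
    The first problem, file selection, amounts to filtering a remote catalogue by the site's desired channels and diffing it against local holdings. A minimal stdlib Python sketch of that selection step (the dict-based catalogues and the substring channel match are illustrative assumptions, not BABAR's actual tooling):

```python
def files_to_import(remote, local, wanted_channels):
    """Return remote paths that match this site's analysis channels and
    are absent or changed locally. Catalogues map path -> (size, mtime),
    mirroring the date/size checks rsync performs."""
    to_copy = []
    for path, stamp in remote.items():
        if not any(ch in path for ch in wanted_channels):
            continue                     # channel not wanted at this site
        if local.get(path) != stamp:     # new file, or size/date changed
            to_copy.append(path)
    return sorted(to_copy)

# Hypothetical catalogues: one tau-channel file is already held locally.
remote = {"run100/tau.root": (100, 1),
          "run101/tau.root": (120, 2),
          "run100/mu.root":  (90, 1)}
local = {"run100/tau.root": (100, 1)}
```

    Keeping the catalogues as flat listings avoids the full directory-tree walk that makes rsync slow at this scale; the transfer itself can then be handed to a multi-stream copier.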

  20. 78 FR 9916 - Filing Dates for the Missouri Special Election in the 8th Congressional District

    Science.gov (United States)

    2013-02-12

    ... 8th Congressional District AGENCY: Federal Election Commission. ACTION: Notice of filing dates for special election. SUMMARY: Missouri has scheduled a Special General Election on June 4, 2013, to fill the.... Committees required to file reports in connection with the Special General Election on June 4, 2013, shall...

  1. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian

    2008-01-01

    Glahn, C. (2008). Cross-system log file analysis for hypothesis testing. Presented at Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technological issues. 4th TENCompetence Open Workshop. April, 10, 2008, Madrid, Spain.

  2. 76 FR 62092 - Filing Procedures

    Science.gov (United States)

    2011-10-06

    ... INTERNATIONAL TRADE COMMISSION Filing Procedures AGENCY: International Trade Commission. ACTION: Notice of issuance of Handbook on Filing Procedures. SUMMARY: The United States International Trade Commission (``Commission'') is issuing a Handbook on Filing Procedures to replace its Handbook on Electronic...

  3. CERES BiDirectional Scans (BDS) data in HDF (CER_BDS_Terra-FM1_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    Each BiDirectional Scans (BDS) data product contains twenty-four hours of Level-1b data for each CERES scanner instrument mounted on each spacecraft. The BDS includes samples taken in normal and short Earth scan elevation profiles in both fixed and rotating azimuth scan modes (including space, internal calibration, and solar calibration views). The BDS contains Level-0 raw (unconverted) science and instrument data as well as the geolocated converted science and instrument data. The BDS contains additional data not found in the Level-0 input file, including converted satellite position and velocity data, celestial data, converted digital status data, and parameters used in the radiance count conversion equations. The following CERES BDS data sets are currently available: CER_BDS_TRMM-PFM_Edition1 CER_BDS_Terra-FM1_Edition1 CER_BDS_Terra-FM2_Edition1 CER_BDS_Terra-FM1_Edition2 CER_BDS_Terra-FM2_Edition2 CER_BDS_Aqua-FM3_Edition1 CER_BDS_Aqua-FM4_Edition1 CER_BDS_Aqua-FM3_Edition2 CER_BDS_Aqua-FM4_Edition2 CER_BDS_Aqua-FM3_Edition1-CV CER_BDS_Aqua-FM4_Edition1-CV CER_BDS_Terra-FM1_Edition1-CV CER_BDS_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1997-12-27; Stop_Date=2005-11-02] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 day; Temporal_Resolution_Range=Daily - < Weekly].

  4. CERES BiDirectional Scans (BDS) data in HDF (CER_BDS_Aqua-FM3_Edition1)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    Each BiDirectional Scans (BDS) data product contains twenty-four hours of Level-1b data for each CERES scanner instrument mounted on each spacecraft. The BDS includes samples taken in normal and short Earth scan elevation profiles in both fixed and rotating azimuth scan modes (including space, internal calibration, and solar calibration views). The BDS contains Level-0 raw (unconverted) science and instrument data as well as the geolocated converted science and instrument data. The BDS contains additional data not found in the Level-0 input file, including converted satellite position and velocity data, celestial data, converted digital status data, and parameters used in the radiance count conversion equations. The following CERES BDS data sets are currently available: CER_BDS_TRMM-PFM_Edition1 CER_BDS_Terra-FM1_Edition1 CER_BDS_Terra-FM2_Edition1 CER_BDS_Terra-FM1_Edition2 CER_BDS_Terra-FM2_Edition2 CER_BDS_Aqua-FM3_Edition1 CER_BDS_Aqua-FM4_Edition1 CER_BDS_Aqua-FM3_Edition2 CER_BDS_Aqua-FM4_Edition2 CER_BDS_Aqua-FM3_Edition1-CV CER_BDS_Aqua-FM4_Edition1-CV CER_BDS_Terra-FM1_Edition1-CV CER_BDS_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1997-12-27; Stop_Date=2005-11-02] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 day; Temporal_Resolution_Range=Daily - < Weekly].

  5. CERES BiDirectional Scans (BDS) data in HDF (CER_BDS_Aqua-FM3_Edition2)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    Each BiDirectional Scans (BDS) data product contains twenty-four hours of Level-1b data for each CERES scanner instrument mounted on each spacecraft. The BDS includes samples taken in normal and short Earth scan elevation profiles in both fixed and rotating azimuth scan modes (including space, internal calibration, and solar calibration views). The BDS contains Level-0 raw (unconverted) science and instrument data as well as the geolocated converted science and instrument data. The BDS contains additional data not found in the Level-0 input file, including converted satellite position and velocity data, celestial data, converted digital status data, and parameters used in the radiance count conversion equations. The following CERES BDS data sets are currently available: CER_BDS_TRMM-PFM_Edition1 CER_BDS_Terra-FM1_Edition1 CER_BDS_Terra-FM2_Edition1 CER_BDS_Terra-FM1_Edition2 CER_BDS_Terra-FM2_Edition2 CER_BDS_Aqua-FM3_Edition1 CER_BDS_Aqua-FM4_Edition1 CER_BDS_Aqua-FM3_Edition2 CER_BDS_Aqua-FM4_Edition2 CER_BDS_Aqua-FM3_Edition1-CV CER_BDS_Aqua-FM4_Edition1-CV CER_BDS_Terra-FM1_Edition1-CV CER_BDS_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1997-12-27; Stop_Date=2006-01-01] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 day; Temporal_Resolution_Range=Daily - < Weekly].

  6. CERES BiDirectional Scans (BDS) data in HDF (CER_BDS_Terra-FM2_Edition2)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    Each BiDirectional Scans (BDS) data product contains twenty-four hours of Level-1b data for each CERES scanner instrument mounted on each spacecraft. The BDS includes samples taken in normal and short Earth scan elevation profiles in both fixed and rotating azimuth scan modes (including space, internal calibration, and solar calibration views). The BDS contains Level-0 raw (unconverted) science and instrument data as well as the geolocated converted science and instrument data. The BDS contains additional data not found in the Level-0 input file, including converted satellite position and velocity data, celestial data, converted digital status data, and parameters used in the radiance count conversion equations. The following CERES BDS data sets are currently available: CER_BDS_TRMM-PFM_Edition1 CER_BDS_Terra-FM1_Edition1 CER_BDS_Terra-FM2_Edition1 CER_BDS_Terra-FM1_Edition2 CER_BDS_Terra-FM2_Edition2 CER_BDS_Aqua-FM3_Edition1 CER_BDS_Aqua-FM4_Edition1 CER_BDS_Aqua-FM3_Edition2 CER_BDS_Aqua-FM4_Edition2 CER_BDS_Aqua-FM3_Edition1-CV CER_BDS_Aqua-FM4_Edition1-CV CER_BDS_Terra-FM1_Edition1-CV CER_BDS_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1997-12-27; Stop_Date=2006-01-01] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 day; Temporal_Resolution_Range=Daily - < Weekly].

  7. 12 CFR 1780.9 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Filing of papers. 1780.9 Section 1780.9 Banks... papers. (a) Filing. Any papers required to be filed shall be addressed to the presiding officer and filed... Director or the presiding officer. All papers filed by electronic media shall also concurrently be filed in...

  8. 77 FR 22566 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-04-16

    .... 3265; Queue No. X1-042 to be effective 3/2/2012. Filed Date: 4/6/12. Accession Number: 20120406-5071..., ATC Management Inc. Description: Application under Section 204 of The Federal Power Act for...

  9. An Adaptable Seismic Data Format

    Science.gov (United States)

    Krischer, Lion; Smith, James; Lei, Wenjie; Lefebvre, Matthieu; Ruan, Youyi; de Andrade, Elliott Sales; Podhorszki, Norbert; Bozdağ, Ebru; Tromp, Jeroen

    2016-11-01

    We present ASDF, the Adaptable Seismic Data Format, a modern and practical data format for all branches of seismology and beyond. The growing volume of freely available data coupled with ever-expanding computational power opens avenues to tackle larger and more complex problems. Current bottlenecks include inefficient resource usage and insufficient data organization. Properly scaling a problem requires the resolution of both these challenges, and existing data formats are no longer up to the task. ASDF stores any number of synthetic, processed or unaltered waveforms in a single file. A key improvement compared to existing formats is the inclusion of comprehensive meta information, such as event or station information, in the same file. Additionally, it is also usable for any non-waveform data, for example, cross-correlations, adjoint sources or receiver functions. Last but not least, full provenance information can be stored alongside each item of data, thereby enhancing reproducibility and accountability. Any data set in our proposed format is self-describing and can be readily exchanged with others, facilitating collaboration. The utilization of the HDF5 container format grants efficient and parallel I/O operations, integrated compression algorithms and checksums to guard against data corruption. To not reinvent the wheel and to build upon past developments, we use existing standards like QuakeML, StationXML, W3C PROV and HDF5 wherever feasible. Usability and tool support are crucial for any new format to gain acceptance. We developed mature C/Fortran- and Python-based APIs coupling ASDF to the widely used SPECFEM3D_GLOBE and ObsPy toolkits.
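
    The core idea — waveforms, event/station metadata and provenance travelling together in one self-describing container — can be sketched with stdlib Python. This is a conceptual stand-in only: real ASDF files use HDF5 with QuakeML/StationXML payloads (read via pyasdf/ObsPy), not a zip of JSON, and all names below are invented:

```python
import io
import json
import zipfile

def write_bundle(waveforms, events, provenance):
    """Bundle waveforms, event metadata and provenance into one
    self-describing file-like container (conceptual ASDF stand-in)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        z.writestr("events.json", json.dumps(events))          # QuakeML role
        z.writestr("provenance.json", json.dumps(provenance))  # W3C PROV role
        for name, samples in waveforms.items():
            z.writestr(f"waveforms/{name}.json", json.dumps(samples))
    return buf.getvalue()

def read_waveform(bundle, name):
    """Pull one waveform back out of the container by trace name."""
    with zipfile.ZipFile(io.BytesIO(bundle)) as z:
        return json.loads(z.read(f"waveforms/{name}.json"))
```

    Because everything a consumer needs rides inside the one container, a data set exchanged this way needs no side-channel documentation — the property the abstract calls "self-describing".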

  10. The Jade File System. Ph.D. Thesis

    Science.gov (United States)

    Rao, Herman Chung-Hwa

    1991-01-01

    File systems have long been the most important and most widely used form of shared permanent storage. File systems in traditional time-sharing systems, such as Unix, support a coherent sharing model for multiple users. Distributed file systems implement this sharing model in local area networks. However, most distributed file systems fail to scale from local area networks to an internet. Four characteristics of scalability were recognized: size, wide area, autonomy, and heterogeneity. Owing to size and wide area, techniques such as broadcasting, central control, and central resources, which are widely adopted by local area network file systems, are not adequate for an internet file system. An internet file system must also support the notion of autonomy because an internet is made up of a collection of independent organizations. Finally, heterogeneity is the nature of an internet file system, not only because of its size, but also because of the autonomy of the organizations in an internet. The Jade File System, which provides a uniform way to name and access files in the internet environment, is presented. Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Because of autonomy, Jade is designed under the restriction that the underlying file systems may not be modified. In order to avoid the complexity of maintaining an internet-wide, global name space, Jade permits each user to define a private name space. In Jade's design, we pay careful attention to avoiding unnecessary network messages between clients and file servers in order to achieve acceptable performance. Jade's name space supports two novel features: (1) it allows multiple file systems to be mounted under one directory; and (2) it permits one logical name space to mount other logical name spaces. A prototype of Jade was implemented to examine and validate its

  11. Bit Grooming: statistically accurate precision-preserving quantization with compression, evaluated in the netCDF Operators (NCO, v4.4.8+)

    Science.gov (United States)

    Zender, Charles S.

    2016-09-01

    Geoscientific models and measurements generate false precision (scientifically meaningless data bits) that wastes storage space. False precision can mislead (by implying noise is signal) and be scientifically pointless, especially for measurements. By contrast, lossy compression can be both economical (save space) and heuristic (clarify data limitations) without compromising the scientific integrity of data. Data quantization can thus be appropriate regardless of whether space limitations are a concern. We introduce, implement, and characterize a new lossy compression scheme suitable for IEEE floating-point data. Our new Bit Grooming algorithm alternately shaves (to zero) and sets (to one) the least significant bits of consecutive values to preserve a desired precision. This is a symmetric, two-sided variant of an algorithm sometimes called Bit Shaving that quantizes values solely by zeroing bits. Our variation eliminates the artificial low bias produced by always zeroing bits, and makes Bit Grooming more suitable for arrays and multi-dimensional fields whose mean statistics are important. Bit Grooming relies on standard lossless compression to achieve the actual reduction in storage space, so we tested Bit Grooming by applying the DEFLATE compression algorithm to bit-groomed and full-precision climate data stored in netCDF3, netCDF4, HDF4, and HDF5 formats. Bit Grooming reduces the storage space required by initially uncompressed and compressed climate data by 25-80% and 5-65%, respectively, for single-precision values (the most common case for climate data) quantized to retain 1-5 decimal digits of precision. The potential reduction is greater for double-precision datasets. When used aggressively (i.e., preserving only 1-2 digits), Bit Grooming produces storage reductions comparable to other quantization techniques such as Linear Packing. 
Unlike Linear Packing, whose guaranteed precision rapidly degrades within the relatively narrow dynamic range of values that
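
    The shave/set alternation can be illustrated directly on IEEE 754 float32 bit patterns. A minimal stdlib Python sketch (the function name and the `keep_bits` parameterisation are ours; the production NCO implementation operates on whole arrays and derives the bit mask from the requested number of significant digits):

```python
import struct

def bit_groom(values, keep_bits):
    """Bit Grooming sketch: alternately shave (zero) and set (one) the
    trailing mantissa bits of consecutive float32 values.
    keep_bits = explicit mantissa bits retained (float32 has 23)."""
    drop = 23 - keep_bits                  # low-order bits to quantize away
    mask_set = (1 << drop) - 1             # OR mask: trailing bits -> 1
    mask_shave = 0xFFFFFFFF ^ mask_set     # AND mask: trailing bits -> 0
    out = []
    for i, v in enumerate(values):
        bits = struct.unpack("<I", struct.pack("<f", v))[0]
        # Alternate shave/set so quantization errors cancel on average.
        bits = bits & mask_shave if i % 2 == 0 else bits | mask_set
        out.append(struct.unpack("<f", struct.pack("<I", bits))[0])
    return out
```

    The runs of identical trailing bits this produces are exactly what DEFLATE then compresses away, while the shave/set alternation keeps array means close to their full-precision values (plain Bit Shaving biases them low).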

  12. 75 FR 1362 - WM Renewable Energy, L.L.C.; Notice of Filing

    Science.gov (United States)

    2010-01-11

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket Nos. EL10-32-000, QF08-622-002] WM Renewable Energy, L.L.C.; Notice of Filing January 4, 2010. Take notice that on December 31, 2009, WM Renewable Energy, L.L.C. filed a petition for a declaratory order, pursuant to Rule 207(a)(2) of...

  13. The Galley Parallel File System

    Science.gov (United States)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.

  14. 77 FR 23705 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-04-20

    ... Agreement SCE--RE Victor Phelan Solar One LLC to be effective 4/12/2012. Filed Date: 4/11/12. Accession...-000. Applicants: EGP Stillwater Solar, LLC. Description: EGP Stillwater Solar, LLC Notice of Change in.... ET 5/2/12. Docket Numbers: ER12-1517-000. Applicants: Energia Sierra Juarez U.S., LLC, San Diego Gas...

  15. A new generation of cellulose triacetate suitable for online haemodiafiltration

    Directory of Open Access Journals (Sweden)

    Francisco Maduell

    2018-03-01

    Background: Online haemodiafiltration (OL-HDF) is currently the most effective dialysis technique and also improves survival. To date, high-permeability membranes with low albumin loss, such as polysulfone, polyamide and polyacrylonitrile membranes, have been the most widely used. However, the initially restricted use of cellulose triacetate (CTA) membranes in OL-HDF has expanded. The aim of the study was to ascertain whether the latest-generation asymmetric CTA membranes are more effective in obtaining high convective transport. Patients and methods: A total of 16 patients (10 males and 6 females) undergoing OL-HDF were studied. Each patient underwent 4 different sessions, with haemodialysis or OL-HDF, and with conventional CTA or asymmetric CTA 1.9 m2 membranes. Each session was assigned in a randomised order. Serum levels of urea, creatinine, β2-microglobulin, myoglobin, prolactin, α1-microglobulin, α1-acid glycoprotein and albumin were measured at the beginning and end of each session to obtain the reduction rate. The loss of solutes and albumin was quantified from the dialysate. Results: A significantly greater replacement volume in OL-HDF (32.1 ± 3.1 vs. 19.7 ± 4.5 L, p < 0.001) was obtained by using asymmetric CTA membranes compared to conventional CTA membranes. Regarding uraemic toxin removal, both membranes obtained similar results for small molecules, whereas asymmetric CTA membranes achieved better results for large molecules, increasing the reduction ratio by 29% for β2-microglobulin, 27.7% for myoglobin, 19.5% for prolactin, 49% for α1-microglobulin and twofold for α1-acid glycoprotein (p < 0.01 in all cases). The loss of albumin was less than 2 g for all treatment sessions. Conclusion: Latest-generation asymmetric CTA membranes have proven to be effective in attaining OL-HDF objectives without increased albumin loss.

  16. 77 FR 20816 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-04-06

    ...: Westar Energy, Inc. Description: KEPCo, Revisions to Attachment A--Delivery Points (4/ 1/12) to be...: The City of Wamego, Kansas Wholesale Power Sales Service to be effective 6/1/2012. Filed Date: 3/28/12...

  17. 78 FR 23243 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-04-18

    ...-000 Applicants: BayWa r.e. Mozart, LLC Description: Notice of Self-Certification of Exempt Wholesale Generator Status of BayWa r.e. Mozart, LLC. Filed Date: 4/10/13 Accession Number: 20130410-5093 Comments Due...

  18. 37 CFR 360.4 - Compliance with statutory dates.

    Science.gov (United States)

    2010-07-01

    ... dates. 360.4 Section 360.4 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS SUBMISSION OF ROYALTY CLAIMS FILING OF CLAIMS TO ROYALTY FEES COLLECTED UNDER COMPULSORY LICENSE Cable Claims § 360.4 Compliance with statutory dates. (a) Claims filed with the Copyright Royalty Board...

  19. A Study of NetCDF as an Approach for High Performance Medical Image Storage

    International Nuclear Information System (INIS)

    Magnus, Marcone; Prado, Thiago Coelho; Von Wangenhein, Aldo; De Macedo, Douglas D J; Dantas, M A R

    2012-01-01

    The spread of telemedicine systems increases every day, and systems and PACS based on DICOM images have become common. This rise reflects the need to develop new storage systems that are more efficient and have lower computational costs. With this in mind, this article discusses a study of the NetCDF data format as a basic platform for the storage of DICOM images. The case study compares an ordinary database, HDF5 and NetCDF for storing the medical images. Empirical results, using a real set of images, indicate that retrieving large images from NetCDF has a higher latency compared to the other two methods. In addition, the latency is proportional to the file size, which represents a drawback for a telemedicine system that is characterized by a large number of large image files.

  20. 77 FR 67722 - Self-Regulatory Organizations; BOX Options Exchange LLC; Notice of Filing of Proposed Minor Rule...

    Science.gov (United States)

    2012-11-13

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-68170; File No. 4-655] Self-Regulatory Organizations; BOX Options Exchange LLC; Notice of Filing of Proposed Minor Rule Violation Plan November 6, 2012... Rule 19d-1(c)(1) of the Act \\3\\ requiring that a self- regulatory organization (``SRO'') promptly file...

  1. Effect of repetitive pecking at working length for glide path preparation using G-file

    Directory of Open Access Journals (Sweden)

    Jung-Hong Ha

    2015-05-01

    Objectives: Glide path preparation is recommended to reduce torsional failure of nickel-titanium (NiTi) rotary instruments and to prevent root canal transportation. This study evaluated whether repetitive insertions of G-files to the working length maintain the apical size as well as provide sufficient lumen as a glide path for subsequent instrumentation. Materials and Methods: The G-file system (Micro-Mega), composed of G1 and G2 files for glide path preparation, was used with J-shaped, simulated resin canals. After inserting a G1 file twice, a G2 file was inserted to the working length 1, 4, 7, or 10 times for the four experimental groups, respectively (n = 10). The canals were then cleaned by copious irrigation and lubricated with a separating gel medium. Canal replicas were made using silicone impression material, and the diameter of the replicas was measured at the working length (D0) and at the 1 mm level (D1) under a scanning electron microscope. Data were analysed by one-way ANOVA and post-hoc tests (p = 0.05). Results: The diameter at the D0 level did not show any significant difference between 1, 4, 7, and 10 repetitive pecking insertions of G2 files at the working length. However, 10 pecking motions with the G2 file resulted in a significantly larger canal diameter at D1 (p < 0.05). Conclusions: Under the limitations of this study, the repetitive insertion of a G2 file up to 10 times at the working length created an adequate lumen for subsequent apical shaping with rotary files bigger than International Organization for Standardization (ISO) size 20, without apical transportation at the D0 level.

  2. 78 FR 21930 - Aquenergy Systems, Inc.; Notice of Intent To File License Application, Filing of Pre-Application...

    Science.gov (United States)

    2013-04-12

    ... Systems, Inc.; Notice of Intent To File License Application, Filing of Pre-Application Document, and Approving Use of the Traditional Licensing Process a. Type of Filing: Notice of Intent to File License...: November 11, 2012. d. Submitted by: Aquenergy Systems, Inc., a fully owned subsidiary of Enel Green Power...

  3. CFD Modeling of a Multiphase Gravity Separator Vessel

    KAUST Repository

    Narayan, Gautham

    2017-05-23

    The poster highlights a CFD study that incorporates a combined Eulerian multi-fluid multiphase model and a Population Balance Model (PBM) to study the flow inside a typical multiphase gravity separator vessel (GSV) found in the oil and gas industry. The simulations were performed using the Ansys Fluent CFD package running on the KAUST supercomputer, Shaheen. A scalability study is also highlighted, covering the effect of I/O bottlenecks and the use of the Hierarchical Data Format (HDF5) for collective and independent parallel reading of the case file. This work is an outcome of a research collaboration on an Aramco project on Shaheen.

  4. CFD Modeling of a Multiphase Gravity Separator Vessel

    KAUST Repository

    Narayan, Gautham; Khurram, Rooh Ul Amin; Elsaadawy, Ehab

    2017-01-01

    The poster highlights a CFD study that incorporates a combined Eulerian multi-fluid multiphase model and a Population Balance Model (PBM) to study the flow inside a typical multiphase gravity separator vessel (GSV) found in the oil and gas industry. The simulations were performed using the Ansys Fluent CFD package running on the KAUST supercomputer, Shaheen. A scalability study is also highlighted, covering the effect of I/O bottlenecks and the use of the Hierarchical Data Format (HDF5) for collective and independent parallel reading of the case file. This work is an outcome of a research collaboration on an Aramco project on Shaheen.

  5. 12 CFR 16.33 - Filing fees.

    Science.gov (United States)

    2010-01-01

    ... Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY SECURITIES OFFERING DISCLOSURE RULES § 16.33 Filing fees. (a) Filing fees must accompany certain filings made under the provisions of this part... Comptroller of the Currency Fees published pursuant to § 8.8 of this chapter. (b) Filing fees must be paid by...

  6. 75 FR 4689 - Electronic Tariff Filings

    Science.gov (United States)

    2010-01-29

    ... elements ``are required to properly identify the nature of the tariff filing, organize the tariff database... (or other pleading) and the Type of Filing code chosen will be resolved in favor of the Type of Filing...'s wish expressed in its transmittal letter or in other pleadings, the Commission may not review a...

  7. A History of the Andrew File System

    CERN Multimedia

    CERN. Geneva; Altman, Jeffrey

    2011-01-01

    Derrick Brashear and Jeffrey Altman will present a technical history of the evolution of Andrew File System starting with the early days of the Andrew Project at Carnegie Mellon through the commercialization by Transarc Corporation and IBM and a decade of OpenAFS. The talk will be technical with a focus on the various decisions and implementation trade-offs that were made over the course of AFS versions 1 through 4, the development of the Distributed Computing Environment Distributed File System (DCE DFS), and the course of the OpenAFS development community. The speakers will also discuss the various AFS branches developed at the University of Michigan, Massachusetts Institute of Technology and Carnegie Mellon University.

  8. 78 FR 15359 - Combined Notice of Filings

    Science.gov (United States)

    2013-03-11

    ...: WBI Energy Transmission, Inc. Description: 2013 Annual Fuel and Electric Power Reimbursement to be.... Description: Storm Surcharge 2013 to be effective 4/1/2013. Filed Date: 3/1/13. Accession Number: 20130301... Numbers: RP13-668-000. Applicants: CF Industries Enterprises, Inc., CF Industries Nitrogen, LLC...

  9. 78 FR 75554 - Combined Notice of Filings

    Science.gov (United States)

    2013-12-12

    ...-000. Applicants: Young Gas Storage Company, Ltd. Description: Young Fuel Reimbursement Filing to be.... Protests may be considered, but intervention is necessary to become a party to the proceeding. eFiling is... qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf . For...

  10. Small file aggregation in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
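
    The offset/length metadata scheme described in the abstract can be sketched in a few lines of stdlib Python — concatenate the small files into one blob and record where each one landed so it can be unpacked later (the names and the in-memory representation are illustrative; the patented system operates inside a parallel file system's I/O middleware):

```python
import io

def aggregate(files):
    """Pack an ordered {name: bytes} mapping into one blob; return the
    blob plus {name: (offset, length)} metadata for later unpacking."""
    blob, meta = io.BytesIO(), {}
    for name, data in files.items():
        meta[name] = (blob.tell(), len(data))  # where this file starts/ends
        blob.write(data)
    return blob.getvalue(), meta

def unpack(blob, meta, name):
    """Recover one original file from the aggregated blob."""
    offset, length = meta[name]
    return blob[offset:offset + length]
```

    Writing one large object instead of thousands of tiny ones is what makes this attractive on a parallel file system, where per-file metadata operations are the bottleneck.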

  11. 5 CFR 1203.13 - Filing pleadings.

    Science.gov (United States)

    2010-01-01

    ... delivery, by facsimile, or by e-filing in accordance with § 1201.14 of this chapter. If the document was... submitted by e-filing, it is considered to have been filed on the date of electronic submission. (e... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Filing pleadings. 1203.13 Section 1203.13...

  12. PFS: a distributed and customizable file system

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    In this paper we present our ongoing work on the Pegasus File System (PFS), a distributed and customizable file system that can be used for off-line file system experiments and on-line file system storage. PFS is best described as an object-oriented component library from which either a true file

  13. OFFSCALE: PC input processor for SCALE-4 criticality sequences

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1991-01-01

    OFFSCALE is a personal computer program that serves as a user-friendly interface for the Criticality Safety Analysis Sequences (CSAS) available in SCALE-4. It is designed to assist a SCALE-4 user in preparing an input file for execution of criticality safety problems. Output from OFFSCALE is a card-image input file that may be uploaded to a mainframe computer to execute the CSAS4 control module in SCALE-4. OFFSCALE features a pulldown menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up a CSAS4 input file and perform data checking

  14. 76 FR 61351 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-10-04

    ... MBR Baseline Tariff Filing to be effective 9/22/2011. Filed Date: 09/22/2011. Accession Number... submits tariff filing per 35.1: ECNY MBR Re-File to be effective 9/22/2011. Filed Date: 09/22/2011... Industrial Energy Buyers, LLC submits tariff filing per 35.1: NYIEB MBR Re-File to be effective 9/22/2011...

  15. Deceit: A flexible distributed file system

    Science.gov (United States)

    Siegel, Alex; Birman, Kenneth; Marzullo, Keith

    1989-01-01

    Deceit, a distributed file system (DFS) being developed at Cornell, focuses on flexible file semantics in relation to efficiency, scalability, and reliability. Deceit servers are interchangeable and collectively provide the illusion of a single, large server machine to any clients of the Deceit service. Non-volatile replicas of each file are stored on a subset of the file servers. The user is able to set parameters on a file to achieve different levels of availability, performance, and one-copy serializability. Deceit also supports a file version control mechanism. In contrast with many recent DFS efforts, Deceit can behave like a plain Sun Network File System (NFS) server and can be used by any NFS client without modifying any client software. The current Deceit prototype uses the ISIS Distributed Programming Environment for all communication and process group management, an approach that reduces system complexity and increases system robustness.

  16. 10 CFR 110.89 - Filing and service.

    Science.gov (United States)

    2010-01-01

    ...: Rulemakings and Adjudications Staff or via the E-Filing system, following the procedure set forth in 10 CFR 2.302. Filing by mail is complete upon deposit in the mail. Filing via the E-Filing system is completed... residence with some occupant of suitable age and discretion; (2) Following the requirements for E-Filing in...

  17. 49 CFR 1104.6 - Timely filing required.

    Science.gov (United States)

    2010-10-01

    ... offers next day delivery to Washington, DC. If the e-filing option is chosen (for those pleadings and documents that are appropriate for e-filing, as determined by reference to the information on the Board's Web site), then the e-filed pleading or document is timely filed if the e-filing process is completed...

  18. DICOM supported sofware configuration by XML files

    International Nuclear Information System (INIS)

    LucenaG, Bioing Fabian M; Valdez D, Andres E; Gomez, Maria E; Nasisi, Oscar H

    2007-01-01

    A method is proposed for configuring informatics systems that support the DICOM standard using XML files. The difference from other proposals is that this system does not encode the information of a DICOM object file, but encodes the standard itself in an XML file. The core of the development is the format of these XML files, designed so that they can support what DICOM normalizes for multiple languages. In this way, the same configuration file (or files) can be used in different systems. Together with the generated XML configuration file, we also wrote a set of CSS and XSL files, so that the same file can be viewed in a standard browser as a query system for the DICOM standard; this emerging use was not a main objective but brings great utility and versatility. We also present some usage examples of the configuration file, mainly in relation to loading DICOM information objects. Finally, in the conclusions we show the utility the system has already provided when the DICOM standard edition changed from 2006 to 2007.

  19. 78 FR 54502 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing of...

    Science.gov (United States)

    2013-09-04

    ...-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing of a Proposed Rule... Authority, Inc. (``FINRA'') filed with the Securities and Exchange Commission (``SEC'' or ``Commission... or manipulative motivation for the trading activity at issue.\\4\\ Specifically, proposed Supplementary...

  20. 12 CFR 908.25 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Filing of papers. 908.25 Section 908.25 Banks... RULES OF PRACTICE AND PROCEDURE IN HEARINGS ON THE RECORD General Rules § 908.25 Filing of papers. (a) Filing. Any papers required to be filed shall be addressed to the presiding officer and filed with the...

  1. Online Hemodiafiltration Reduces Bisphenol A Levels.

    Science.gov (United States)

    Quiroga, Borja; Bosch, Ricardo J; Fiallos, Ruth A; Sánchez-Heras, Marta; Olea-Herrero, Nuria; López-Aparicio, Pilar; Muñóz-Moreno, Carmen; Pérez-Alvarsan, Miguel Angel; De Arriba, Gabriel

    2017-02-01

    Several uremic toxins have been identified and related to higher rates of morbidity and mortality in dialysis patients. Bisphenol A (BPA) accumulates in patients with chronic kidney disease. The aim of this study is to demonstrate the usefulness of online hemodiafiltration (OL-HDF) in reducing BPA levels. Thirty stable hemodialysis patients were selected to participate in this paired study. During three periods of 3 weeks each, patients were switched from high-flux hemodialysis (HF-HD) to OL-HDF, and back to HF-HD. BPA levels were measured in the last session of each period (pre- and post-dialysis) using ELISA and HPLC. Twenty-two patients (mean age 73 ± 14 years; 86.4% males) were included. Measurements of BPA levels by HPLC and ELISA assays showed a weak but significant correlation (r = 0.218, P = 0.012). BPA levels decreased in the OL-HDF period of hemodialysis, in contrast to the HF-HD period when they remained stable (P = 0.002). In conclusion, OL-HDF reduced BPA levels in dialysis patients. © 2016 International Society for Apheresis, Japanese Society for Apheresis, and Japanese Society for Dialysis Therapy.

  2. Influence of cervical preflaring on apical file size determination.

    Science.gov (United States)

    Pecora, J D; Capelli, A; Guerisoli, D M Z; Spanó, J C E; Estrela, C

    2005-07-01

    To investigate the influence of cervical preflaring with different instruments (Gates-Glidden drills, Quantec Flare series instruments and LA Axxess burs) on the first file that binds at working length (WL) in maxillary central incisors. Forty human maxillary central incisors with complete root formation were used. After standard access cavities, a size 06 K-file was inserted into each canal until the apical foramen was reached. The WL was set 1 mm short of the apical foramen. Group 1 received the initial apical instrument without previous preflaring of the cervical and middle thirds of the root canal. Group 2 had the cervical and middle portion of the root canals enlarged with Gates-Glidden drills sizes 90, 110 and 130. Group 3 had the cervical and middle thirds of the root canals enlarged with nickel-titanium Quantec Flare series instruments. Titanium-nitride treated, stainless steel LA Axxess burs were used for preflaring the cervical and middle portions of root canals from group 4. Each canal was sized using manual K-files, starting with size 08 files with passive movements until the WL was reached. File sizes were increased until a binding sensation was felt at the WL, and the instrument size was recorded for each tooth. The apical region was then observed under a stereoscopic magnifier, images were recorded digitally and the differences between root canal and maximum file diameters were evaluated for each sample. Significant differences were found between experimental groups regarding anatomical diameter at the WL and the first file to bind in the canal (P Flare instruments were ranked in an intermediary position, with no statistically significant differences between them (0.093 mm average). The instrument binding technique for determining anatomical diameter at WL is not precise. Preflaring of the cervical and middle thirds of the root canal improved anatomical diameter determination; the instrument used for preflaring played a major role in determining the

  3. PFS: a distributed and customizable file system

    OpenAIRE

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    In this paper we present our ongoing work on the Pegasus File System (PFS), a distributed and customizable file system that can be used for off-line file system experiments and on-line file system storage. PFS is best described as an object-oriented component library from which either a true file system or a file-system simulator can be constructed. Each of the components in the library is easily replaced by another implementation to accommodate a wide range of applications.

  4. Detecting Malicious Code by Binary File Checking

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2014-01-01

    Full Text Available Object, library and executable code is stored in binary files. The functionality of a binary file is altered when its content or program source code is changed, causing undesired effects. A direct content change is possible when the intruder knows the structural information of the binary file. The paper describes the structural properties of binary object files, how their content can be controlled by a possible intruder, and the ways to identify malicious code in such files. Because object files are inputs to linking processes, early detection of malicious content is crucial to avoid infection of the binary executable files.
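As a concrete illustration of such structural checking, here is a minimal Python sketch that combines magic-byte identification of a binary format with a digest comparison against a known-good hash. This is a generic example of the idea, not the paper's actual method; the magic-number table and function names are assumptions:

```python
import hashlib

# A few well-known magic numbers for binary object/executable formats.
MAGIC = {
    b"\x7fELF": "ELF object/executable",
    b"MZ": "Windows PE executable",
}

def identify(data: bytes) -> str:
    """Identify the container format from its leading magic bytes."""
    for magic, name in MAGIC.items():
        if data.startswith(magic):
            return name
    return "unknown"

def integrity_ok(data: bytes, expected_sha256: str) -> bool:
    """Flag tampering by comparing against a known-good digest: any
    change to the file's content changes its SHA-256 hash."""
    return hashlib.sha256(data).hexdigest() == expected_sha256
```

Real detectors go further (parsing section headers, import tables, and entry points), but a format check plus a trusted digest already catches any direct content change.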

  5. Processing and validation of intermediate energy evaluated data files

    International Nuclear Information System (INIS)

    2000-01-01

    Current accelerator-driven and other intermediate energy technologies require accurate nuclear data to model the performance of the target/blanket assembly, neutron production, activation, heating and damage. In a previous WPEC subgroup, SG13 on intermediate energy nuclear data, various aspects of intermediate energy data, such as nuclear data needs, experiments, model calculations and file formatting issues were investigated and categorized to come to a joint evaluation effort. The successor of SG13, SG14 on the processing and validation of intermediate energy evaluated data files, goes one step further. The nuclear data files that have been created with the aforementioned information need to be processed and validated in order to be applicable in realistic intermediate energy simulations. We emphasize that the work of SG14 excludes the 0-20 MeV data part of the neutron evaluations, which is supposed to be covered elsewhere. This final report contains the following sections: section 2: a survey of the data files above 20 MeV that have been considered for validation in SG14; section 3: a summary of the review of the 150 MeV intermediate energy data files for ENDF/B-VI and, more briefly, the other libraries; section 4: validation of the data library against an integral experiment with MCNPX; section 5: conclusions. (author)

  6. Formalizing a hierarchical file system

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, Muhammad Ikram

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal,

  7. Evaluation of the incidence of microcracks caused by Mtwo and ProTaper Next rotary file systems versus the self-adjusting file: A scanning electron microscopic study.

    Science.gov (United States)

    Saha, Suparna Ganguly; Vijaywargiya, Neelam; Saxena, Divya; Saha, Mainak Kanti; Bharadwaj, Anuj; Dubey, Sandeep

    2017-01-01

    To evaluate the incidence of microcrack formation after canal preparation with two rotary nickel-titanium systems, Mtwo and ProTaper Next, along with the self-adjusting file system. One hundred and twenty mandibular premolar teeth were selected. Standardized access cavities were prepared and the canals were manually prepared up to size 20 after coronal preflaring. The teeth were divided into three experimental groups and one control group ( n = 30). Group 1: The canals were prepared using Mtwo rotary files. Group 2: The canals were prepared with ProTaper Next files. Group 3: The canals were prepared with self-adjusting files. Group 4: The canals were unprepared and used as a control. The roots were sectioned horizontally 3, 6, and 9 mm from the apex and examined under a scanning electron microscope to check for the presence of microcracks. The Pearson's Chi-square test was applied. The highest incidence of microcracks was associated with the ProTaper Next group, 80% ( P = 0.00), followed by the Mtwo group, 70% ( P = 0.000), and the least number of microcracks was noted in the self-adjusting file group, 10% ( P = 0.068). No significant difference was found between the ProTaper Next and Mtwo groups ( P = 0.368), while a significant difference was observed between the ProTaper Next and self-adjusting file groups ( P = 0.000) as well as the Mtwo and self-adjusting file groups ( P = 0.000). All nickel-titanium rotary instrument systems were associated with microcracks. However, the self-adjusting file system had significantly fewer microcracks when compared with the Mtwo and ProTaper Next.

  8. Bulk Extractor 1.4 User’s Manual

    Science.gov (United States)

    2013-08-01

    optimistically decompresses data in ZIP, GZIP, RAR, and Microsoft’s Hibernation files. This has proven useful, for example, in recovering email...command line. Java 7 or above must be installed on the machine for the Bulk Extractor Viewer to run. Instructions on running bulk_extractor from the... Hibernation File Fragments (decompressed and processed, not carved) Subsection 4.6 winprefetch Windows Prefetch files, file fragments (processed

  9. 76 FR 70438 - Enterprise Intrastate L.P., Enterprise Texas Pipeline LLC; Notice of Filing

    Science.gov (United States)

    2011-11-14

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR12-4-000; Docket No. PR12-5-000; Not Consolidated] Enterprise Intrastate L.P., Enterprise Texas Pipeline LLC; Notice of Filing Take notice that on November 1, 2011, the applicants listed above filed a revised Statement of...

  10. Simple Automatic File Exchange (SAFE) to Support Low-Cost Spacecraft Operation via the Internet

    Science.gov (United States)

    Baker, Paul; Repaci, Max; Sames, David

    1998-01-01

    Various issues associated with Simple Automatic File Exchange (SAFE) are presented in viewgraph form. Specific topics include: 1) Packet telemetry, Internet IP networks and cost reduction; 2) Basic functions and technical features of SAFE; 3) Project goals, including low-cost satellite transmission to data centers to be distributed via an Internet; 4) Operations with a replicated file protocol; 5) File exchange operation; 6) Ground stations as gateways; 7) Lessons learned from demonstrations and tests with SAFE; and 8) Feedback and future initiatives.

  11. 77 FR 74839 - Combined Notice of Filings

    Science.gov (United States)

    2012-12-18

    ..., LP. Description: National Grid LNG, LP submits tariff filing per 154.203: Adoption of NAESB Version 2... with Order to Amend NAESB Version 2.0 Filing to be effective 12/1/2012. Filed Date: 12/11/12. Accession...: Refile to comply with Order on NAESB Version 2.0 Filing to be effective 12/1/2012. Filed Date: 12/11/12...

  12. Formalizing a Hierarchical File System

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, M.I.

    2009-01-01

    In this note, we define an abstract file system as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for removal

  13. 76 FR 6459 - Mahoning Hydropower, LLC; Notice of Preliminary Permit Application Accepted for Filing and...

    Science.gov (United States)

    2011-02-04

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 13954-000] Mahoning Hydropower, LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments... Hydropower, LLC filed an application for a preliminary permit, pursuant to section 4(f) of the Federal Power...

  14. 76 FR 7838 - Mahoning Hydropower, LLC; Notice of Preliminary Permit Application Accepted for Filing and...

    Science.gov (United States)

    2011-02-11

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 13953-000] Mahoning Hydropower, LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments... Hydropower, LLC filed an application for a preliminary permit, pursuant to section 4(f) of the Federal Power...

  15. Hemodiafiltration Versus Hemodialysis and Survival in Patients With ESRD: The French Renal Epidemiology and Information Network (REIN) Registry.

    Science.gov (United States)

    Mercadal, Lucile; Franck, Jeanna-Eve; Metzger, Marie; Urena Torres, Pablo; de Cornelissen, François; Edet, Stéphane; Béchade, Clémence; Vigneau, Cécile; Drüeke, Tilman; Jacquelinet, Christian; Stengel, Bénédicte

    2016-08-01

    Recent randomized trials report that mortality is lower with high-convection-volume hemodiafiltration (HDF) than with hemodialysis (HD). We used data from the French national Renal Epidemiology and Information Network (REIN) registry to investigate trends in HDF use and its relationship with mortality in the total population of incident dialysis patients. The study included those who initiated HD therapy from January 1, 2008, through December 31, 2011, and were dialyzed for more than 3 months; follow-up extended to the end of 2012. HDF use at the patient and facility level. All-cause and cardiovascular mortality, using Cox models to estimate HRs of HDF as time-dependent covariate at the patient level, with age as time scale and fully adjusted for comorbid conditions and laboratory data at baseline, catheter use, and facility type as time-dependent covariates. Analyses completed by Cox models for HRs of the facility-level exposure to HDF updated yearly. Of 28,407 HD patients, 5,526 used HDF for a median of 1.2 (IQR, 0.9-1.9) years; 2,254 of them used HDF exclusively. HRs for all-cause and cardiovascular mortality associated with HDF use were 0.84 (95% CI, 0.77-0.91) and 0.73 (95% CI, 0.61-0.88), respectively. In patients treated exclusively with HDF, these HRs were 0.77 (95% CI, 0.67-0.87) and 0.66 (95% CI, 0.50-0.86). At the facility level, increasing the percentage of patients using HDF from 0% to 100% was associated with HRs for all-cause and cardiovascular mortality of 0.87 (95% CI, 0.77-0.99) and 0.72 (95% CI, 0.54-0.96), respectively. Observational study. Whether analyzed as a patient- or facility-level predictor, HDF treatment was associated with better survival. Copyright © 2015 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  16. 77 FR 35371 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-06-13

    .... Applicants: Duke Energy Miami Fort, LLC. Description: MBR Filing to be effective 10/1/2012. Filed Date: 6/5...-000. Applicants: Duke Energy Piketon, LLC. Description: MBR Filing to be effective 10/1/2012. Filed...-1959-000. Applicants: Duke Energy Stuart, LLC. Description: MBR Filing to be effective 10/1/2012. Filed...

  17. Schnek: A C++ library for the development of parallel simulation codes on regular grids

    Science.gov (United States)

    Schmitz, Holger

    2018-05-01

    A large number of algorithms across the field of computational physics are formulated on grids with a regular topology. We present Schnek, a library that enables fast development of parallel simulations on regular grids. Schnek contains a number of easy-to-use modules that greatly reduce the amount of administrative code for large-scale simulation codes. The library provides an interface for reading simulation setup files with a hierarchical structure. The structure of the setup file is translated into a hierarchy of simulation modules that the developer can specify. The reader parses and evaluates mathematical expressions and initialises variables or grid data. This enables developers to write modular and flexible simulation codes with minimal effort. Regular grids of arbitrary dimension are defined as well as mechanisms for defining physical domain sizes, grid staggering, and ghost cells on these grids. Ghost cells can be exchanged between neighbouring processes using MPI with a simple interface. The grid data can easily be written into HDF5 files using serial or parallel I/O.
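The ghost-cell exchange that the library automates can be illustrated conceptually. The following Python sketch mimics the data movement for two neighbouring 1-D subdomains; this is not Schnek's API (which is C++ with MPI), and the function name and layout are illustrative only:

```python
def exchange_ghosts(left, right, ghost=1):
    """Exchange ghost cells between two adjacent 1-D subdomains.

    Each subdomain is a list laid out as [ghost | interior | ghost].
    After the exchange, each side's ghost region holds a copy of the
    neighbour's adjacent interior cells, so stencil operations near
    the subdomain boundary see valid data.
    (Conceptual sketch of what an MPI halo exchange accomplishes.)
    """
    # left's right-hand ghost cells receive right's first interior cells
    left[-ghost:] = right[ghost:2 * ghost]
    # right's left-hand ghost cells receive left's last interior cells
    right[:ghost] = left[-2 * ghost:-ghost]
    return left, right
```

In a distributed run the two assignments become a pair of MPI send/receive operations between neighbouring ranks; the bookkeeping (which slices to pack and where to unpack them) is exactly what a library like Schnek hides behind its interface.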

  18. Virtual file system for PSDS

    Science.gov (United States)

    Runnels, Tyson D.

    1993-01-01

    This is a case study. It deals with the use of a 'virtual file system' (VFS) for Boeing's UNIX-based Product Standards Data System (PSDS). One of the objectives of PSDS is to store digital standards documents. The file-storage requirements are that the files must be rapidly accessible, stored for long periods of time - as though they were paper, protected from disaster, and accumulating to about 80 billion characters (80 gigabytes). This volume of data will be approached in the first two years of the project's operation. The approach chosen is to install a hierarchical file migration system using optical disk cartridges. Files are migrated from high-performance media to lower performance optical media based on a least-frequently-used algorithm. The optical media are less expensive per character stored and are removable. Vital statistics about the removable optical disk cartridges are maintained in a database. The assembly of hardware and software acts as a single virtual file system transparent to the PSDS user. The files are copied to 'backup-and-recover' media whose vital statistics are also stored in the database. Seventeen months into operation, PSDS is storing 49 gigabytes. A number of operational and performance problems were overcome. Costs are under control. New and/or alternative uses for the VFS are being considered.
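The least-frequently-used migration policy described above can be sketched as a two-tier store. This is a minimal Python illustration of the policy only; the class, capacity model, and method names are hypothetical, not PSDS's implementation:

```python
class MigratingStore:
    """Two-tier store that migrates the least-frequently-used files
    from fast storage to a slow (optical-like) tier whenever the fast
    tier exceeds its capacity, while reads stay transparent."""

    def __init__(self, fast_capacity):
        self.fast_capacity = fast_capacity  # max number of files kept fast
        self.fast = {}   # name -> data (high-performance media)
        self.slow = {}   # name -> data (optical-like media)
        self.freq = {}   # name -> access count (the LFU statistic)

    def put(self, name, data):
        self.fast[name] = data
        self.freq[name] = 0
        self._migrate()

    def get(self, name):
        # Every access bumps the frequency counter used by the policy.
        self.freq[name] = self.freq.get(name, 0) + 1
        if name in self.fast:
            return self.fast[name]
        # Transparent recall: the caller never sees which tier served it.
        return self.slow[name]

    def _migrate(self):
        # Demote least-frequently-used files until the fast tier fits.
        while len(self.fast) > self.fast_capacity:
            victim = min(self.fast, key=lambda n: self.freq[n])
            self.slow[victim] = self.fast.pop(victim)
```

The "single virtual file system" effect comes from `get` hiding the tier: callers address files by name, and the database of per-cartridge statistics in PSDS plays the role of the `freq` and `slow` bookkeeping here.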

  19. SU-F-T-465: Two Years of Radiotherapy Treatments Analyzed Through MLC Log Files

    Energy Technology Data Exchange (ETDEWEB)

    Defoor, D [University of Texas HSC SA, New Braunfles, TX (United States); Kabat, C; Papanikolaou, N [University of Texas HSC SA, San Antonio, TX (United States); Stathakis, S [Cancer Therapy and Research Center, San Antonio, TX (United States)

    2016-06-15

    Purpose: To present treatment statistics of a Varian Novalis Tx using more than 90,000 Varian Dynalog files collected over the past 2 years. Methods: Varian Dynalog files are recorded for every patient treated on our Varian Novalis Tx. The files are collected and analyzed daily to check interfraction agreement of treatment deliveries. This is accomplished by creating fluence maps from the data contained in the Dynalog files. From the Dynalog files we have also compiled statistics for treatment delivery times, MLC errors, gantry errors and collimator errors. Results: The mean treatment time for VMAT patients was 153 ± 86 seconds while the mean treatment time for step & shoot was 256 ± 149 seconds. Patients' treatment times showed a variation of 0.4% over their treatment course for VMAT and 0.5% for step & shoot. The average field sizes were 40 cm2 and 26 cm2 for VMAT and step & shoot respectively. VMAT beams contained an average overall leaf travel of 34.17 meters and step & shoot beams averaged less than half of that at 15.93 meters. When comparing planned and delivered fluence maps generated using the Dynalog files, VMAT plans showed an average gamma passing percentage of 99.85 ± 0.47. Step & shoot plans showed an average gamma passing percentage of 97.04 ± 0.04. 5.3% of beams contained an MLC error greater than 1 mm and 2.4% had an error greater than 2 mm. The mean gantry speed for VMAT plans was 1.01 degrees/s with a maximum of 6.5 degrees/s. Conclusion: Varian Dynalog files are useful for monitoring machine performance and treatment parameters. The Dynalog files have shown that the performance of the Novalis Tx is consistent over the course of a patient's treatment, with only slight variations in patient treatment times and a low rate of MLC errors.

  20. Evidences of intraplate deformation in the West Madeira Abyssal Plain (eastern North Atlantic) from seismic reflection and multibeam swath bathymetry data

    Science.gov (United States)

    Roque, C.; Simões, M.; Lourenço, N.; Pinto de Abreu, M.

    2009-04-01

    The West Madeira Abyssal Plain is located in the eastern North Atlantic off the Madeira Islands, forming part of the Canary Basin and reaching a mean water depth of 5300 m. This region is also located within the African plate, about 500 km south of the Açores-Gibraltar plate boundary, and for that reason lacks seismic activity. Although this region is located in an intraplate setting, the presence of faulted sediments was reported in several works published during the eighties of the last century, following a study conducted in the late 1970s to evaluate the feasibility of disposal of high-level radioactive wastes in the ocean. According to these works, the Madeira Abyssal Plain sediments are cut by many normal growth faults, and this deformation is a result of compaction and dewatering of the sediments. Evidence of tectonic deformation of oceanic sediments in intraplate settings is uncommon, but folded sediments and reverse faults extending into the basement were recognized in the equatorial Indian Ocean and in the West African continental margin. Recently, during 2006, multi-channel seismic reflection and multibeam swath bathymetry surveys were carried out in the West Madeira Abyssal Plain by EMEPC in order to prepare the Portuguese proposal for the extension of the continental shelf. The seismic lines were acquired onboard R/V Akademik Shatskiy using a 5720 cu in bolt gun array source, a cable length of 7950 m and a shot interval of 50.00 m. The multibeam swath bathymetry was acquired onboard NRP Gago Coutinho, and allowed high-resolution mapping of the main geomorphological features. The multichannel seismic lines, oriented WNW-ESE, image the Madeira island lower slope located at about 4000 m water depth and the almost flat abyssal plain at about 5300 m water depth. These seismic lines show a thick sedimentary succession that reaches a maximum thickness of about 1.5 sec twt in the deepest parts of the West Madeira Abyssal Plain, overlying an irregular diffractive

  1. 76 FR 63291 - Combined Notice Of Filings #1

    Science.gov (United States)

    2011-10-12

    ... filing per 35: MBR Tariff to be effective 9/23/2011. Filed Date: 09/23/2011. Accession Number: 20110923.... submits tariff filing per 35: MBR Tariff to be effective 9/23/2011. Filed Date: 09/23/2011. Accession.... submits tariff filing per 35: MBR Tariff to be effective 9/23/2011. Filed Date: 09/23/2011. Accession...

  2. Titanium-II: an evaluated nuclear data file

    International Nuclear Information System (INIS)

    Philis, C.; Howerton, R.; Smith, A.B.

    1977-06-01

    A comprehensive evaluated nuclear data file for elemental titanium is outlined including definition of the data base, the evaluation procedures and judgments, and the final evaluated results. The file describes all significant neutron-induced reactions with elemental titanium and the associated photon-production processes to incident neutron energies of 20.0 MeV. In addition, isotopic-reaction files, consistent with the elemental file, are separately defined for those processes which are important to applied considerations of material-damage and neutron-dosimetry. The file is formulated in the ENDF format. This report formally documents the evaluation and, together with the numerical file, is submitted for consideration as a part of the ENDF/B-V evaluated file system. 20 figures, 9 tables

  3. CERES BiDirectional Scans (BDS) data in HDF (CER_BDS_Terra-FM1_Edition1-CV)

    Science.gov (United States)

    Wielicki, Bruce A. (Principal Investigator)

    Each BiDirectional Scans (BDS) data product contains twenty-four hours of Level-1b data for each CERES scanner instrument mounted on each spacecraft. The BDS includes samples taken in normal and short Earth scan elevation profiles in both fixed and rotating azimuth scan modes (including space, internal calibration, and solar calibration views). The BDS contains Level-0 raw (unconverted) science and instrument data as well as the geolocated converted science and instrument data. The BDS contains additional data not found in the Level-0 input file, including converted satellite position and velocity data, celestial data, converted digital status data, and parameters used in the radiance count conversion equations. The following CERES BDS data sets are currently available: CER_BDS_TRMM-PFM_Edition1 CER_BDS_Terra-FM1_Edition1 CER_BDS_Terra-FM2_Edition1 CER_BDS_Terra-FM1_Edition2 CER_BDS_Terra-FM2_Edition2 CER_BDS_Aqua-FM3_Edition1 CER_BDS_Aqua-FM4_Edition1 CER_BDS_Aqua-FM3_Edition2 CER_BDS_Aqua-FM4_Edition2 CER_BDS_Aqua-FM3_Edition1-CV CER_BDS_Aqua-FM4_Edition1-CV CER_BDS_Terra-FM1_Edition1-CV CER_BDS_Terra-FM2_Edition1-CV. [Location=GLOBAL] [Temporal_Coverage: Start_Date=1997-12-27; Stop_Date=2006-11-02] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180] [Data_Resolution: Temporal_Resolution=1 day; Temporal_Resolution_Range=Daily - < Weekly].

  4. SPOTS4. Group data library and computer code, preparing ENDF/B-4 data for input to LEOPARD

    International Nuclear Information System (INIS)

    Kim, J.D.; Lee, J.T.

    1981-09-01

    The magnetic tape SPOTS4 contains in file 1 a data library to be used as input to the SPOTS4 program which is contained in file 2. The data library is based on ENDF/B-4 and consists of two parts in TEMPEST format (246 groups) and MUFT format (54 groups) respectively. From this library the SPOTS4 program produces a 172 + 54 group library for LEOPARD input. A copy of the magnetic tape is available from the IAEA Nuclear Data Section. (author)

  5. Preservation of root canal anatomy using self-adjusting file instrumentation with glide path prepared by 20/0.02 hand files versus 20/0.04 rotary files

    Science.gov (United States)

    Jain, Niharika; Pawar, Ajinkya M.; Ukey, Piyush D.; Jain, Prashant K.; Thakur, Bhagyashree; Gupta, Abhishek

    2017-01-01

    Objectives: To compare the relative axis modification and canal concentricity after glide path preparation with 20/0.02 hand K-file (NITIFLEX®) and 20/0.04 rotary file (HyFlex™ CM) with subsequent instrumentation with 1.5 mm self-adjusting file (SAF). Materials and Methods: One hundred and twenty ISO 15, 0.02 taper, Endo Training Blocks (Dentsply Maillefer, Ballaigues, Switzerland) were acquired and randomly divided into the following two groups (n = 60): group 1, establishing glide path till 20/0.02 hand K-file (NITIFLEX®) followed by instrumentation with 1.5 mm SAF; and Group 2, establishing glide path till 20/0.04 rotary file (HyFlex™ CM) followed by instrumentation with 1.5 mm SAF. Pre- and post-instrumentation digital images were processed with MATLAB R 2013 software to identify the central axis, and then superimposed using digital imaging software (Picasa 3.0 software, Google Inc., California, USA) taking five landmarks as reference points. Student's t-test for pairwise comparisons was applied with the level of significance set at 0.05. Results: Training blocks instrumented with the 20/0.04 rotary file and SAF were associated with less deviation in the canal axis (at all five marked points), representing better canal concentricity compared to those in which the glide path was established by 20/0.02 hand K-files followed by SAF instrumentation. Conclusion: Canal geometry is better maintained after SAF instrumentation with a prior glide path established with a 20/0.04 rotary file. PMID:28855752

  6. 76 FR 28018 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-05-13

    ... tariff filing per 35.13(a)(2)(iii): Information Policy Revisions to be effective 6/20/2011. Filed Date... Interconnection, L.L.C. Description: PJM Interconnection, L.L.C. submits tariff filing per 35.13(a)(2)(iii): Queue... New Mexico submits tariff filing per 35.13(a)(2)(iii): PNM LGIP Filing to be effective 7/5/2011. Filed...

  7. 75 FR 62381 - Combined Notice of Filings #2

    Science.gov (United States)

    2010-10-08

    ... filing per 35.12: MeadWestvaco Virginia MBR Filing to be effective 9/28/2010. Filed Date: 09/29/2010... submits tariff filing per 35.12: City Power MBR Tariff to be effective 9/30/2010. Filed Date: 09/29/2010... Baseline MBR Tariff to be effective 9/29/2010. Filed Date: 09/29/2010. Accession Number: 20100929...

  8. Hemodiafiltration Improves Plasma 25-Hepcidin Levels: A Prospective, Randomized, Blinded, Cross-Over Study Comparing Hemodialysis and Hemodiafiltration

    Directory of Open Access Journals (Sweden)

    Bergur V. Stefánsson

    2012-03-01

    Full Text Available Background/Aims: Data from studies comparing the effect of hemodiafiltration (HDF) and conventional hemodialysis (HD) on clinically important outcomes are insufficient to support the superiority of HDF. None of these studies has been participant-blinded. Methods: We performed a prospective, randomized, patient-blinded cross-over study. Twenty patients on chronic HD received either HD for 2 months followed by post-dilution HDF for 2 months, or the two treatments in the opposite order. A range of clinical parameters, as well as markers of inflammation, oxidative stress and iron metabolism, were measured. Results: The two treatments were similar with respect to dialysis-related complications, quality of life, and the biomarkers of oxidative stress and inflammation. Compared to HD, 25-hepcidin and β2-microglobulin were 38 and 32% lower, respectively, after 60 days of HDF (p < 0.05). Conclusion: In the short term, HDF is not superior to HD regarding dialysis-related complications. The higher ESA consumption observed with HDF can be explained by blood clotting in tubing and dialyzers, as more anticoagulation was needed with post-dilution HDF. In a longer perspective, lowering serum hepcidin levels may improve pathological iron homeostasis.

  9. 77 FR 31237 - Electronic Filing in the Copyright Office of Notices of Intention To Obtain a Section 115...

    Science.gov (United States)

    2012-05-25

    ... law, such notices may be filed in the Office only when the public records of the Copyright Office do... filed in the Copyright Office is sufficient as a matter of law under this section, that issue shall be... LIBRARY OF CONGRESS Copyright Office 37 CFR Part 201 [Docket No. RM 2012-4] Electronic Filing in...

  10. Evaluation of canal transportation after preparation with Reciproc single-file systems with or without glide path files.

    Science.gov (United States)

    Aydin, Ugur; Karataslioglu, Emrah

    2017-01-01

    Canal transportation is a common sequela of rotary instruments. The purpose of the present study was to evaluate the degree of transportation after the use of Reciproc single-file instruments with or without glide path files. Thirty resin blocks with L-shaped canals were divided into three groups (n = 10). Group 1 - canals were prepared with the Reciproc-25 file. Group 2 - glide path file G1 was used before Reciproc. Group 3 - glide path files G1 and G2 were used before Reciproc. Pre- and post-instrumentation images were superimposed under a microscope, and the resin removed from the inner and outer surfaces of the root canal was calculated at 10 points. Statistical analysis was performed with the Kruskal-Wallis test and post hoc Dunn test. For the coronal and middle one-thirds, there was no significant difference among groups (P > 0.05). For the apical section, transportation in Group 1 was significantly higher than in the other groups (P < 0.05). Using glide path files before the Reciproc single-file system reduced the degree of apical canal transportation.

  11. 43 CFR 4.117 - Service of papers.

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Service of papers. 4.117 Section 4.117... Service of papers. A copy of all pleadings, briefs, motions, letters, or other papers filed with the Board, shall be served upon the other party at the time of filing. Service of papers may be made personally or...

  12. 10 CFR 2.302 - Filing of documents.

    Science.gov (United States)

    2010-01-01

    ... this part shall be electronically transmitted through the E-Filing system, unless the Commission or... all methods of filing have been completed. (e) For filings by electronic transmission, the filer must... digital ID certificates, the NRC permits participants in the proceeding to access the E-Filing system to...

  13. 5 CFR 1201.14 - Electronic filing procedures.

    Science.gov (United States)

    2010-01-01

    ... form. (b) Matters subject to electronic filing. Subject to the registration requirement of paragraph (e) of this section, parties and representatives may use electronic filing (e-filing) to do any of the...). (d) Internet is sole venue for electronic filing. Following the instructions at e-Appeal Online, the...

  14. Coseismic slip in the 2010 Yushu earthquake (China), constrained by wide-swath and strip-map InSAR

    Directory of Open Access Journals (Sweden)

    Y. Wen

    2013-01-01

    Full Text Available On 14 April 2010, an Mw = 6.9 earthquake occurred in Yushu county, China, causing ~3000 deaths. Integrated with information from the observed surface ruptures and aftershock locations, the faulting pattern of this earthquake is derived from descending wide-swath and ascending strip-mode PALSAR data collected by the ALOS satellite. We used a layered crustal model and a stress-drop smoothing constraint to infer the coseismic slip distribution. Our model suggests that the earthquake fault can be divided into four segments and that slip mainly occurs within the upper 12 km, with a maximum slip of 2.0 m at a depth of 3 km on the Jiegu segment. The rupture of the upper 12 km is dominated by left-lateral strike-slip motion. The relatively small slip along the SE part of the Yushu segment suggests a slip deficit there. The inverted geodetic moment corresponds to approximately Mw = 6.9, consistent with the seismological results. The average stress drop caused by the earthquake is about 2 MPa, with a maximum stress drop of 8.3 MPa. Furthermore, the calculated static Coulomb stress changes in surrounding regions show increased Coulomb stress in the SE region along the Yushu segment but few aftershocks there, indicating an increased seismic hazard in this region after the earthquake.

  15. The File System Interface is an Anachronism

    OpenAIRE

    Ellard, Daniel

    2003-01-01

    Contemporary file systems implement a set of abstractions and semantics that are suboptimal for many (if not most) purposes. The philosophy of using the simple mechanisms of the file system as the basis for a vast array of higher-level mechanisms leads to inefficient and incorrect implementations. We propose several extensions to the canonical file system model, including explicit support for lock files, indexed files, and resource forks, and the benefit of session semantics for write updates...

  16. DMPD: LPS, TLR4 and infectious disease diversity. [Dynamic Macrophage Pathway CSML Database

    Lifescience Database Archive (English)

    Full Text Available Nat Rev Microbiol. 2005 Jan;3(1):36-46. LPS, TLR4 and infectious disease diversity. Available formats: PNG (.png), SVG (.svg), HTML (.html) and CSML (.csml); the .csml file can be opened with CIOPlayer.

  17. High School and Beyond: Twins and Siblings' File Users' Manual, User's Manual for Teacher Comment File, Friends File Users' Manual.

    Science.gov (United States)

    National Center for Education Statistics (ED), Washington, DC.

    These three users' manuals are for specific files of the High School and Beyond Study, a national longitudinal study of high school sophomores and seniors in 1980. The three files are computerized databases that are available on magnetic tape. As one component of base year data collection, information identifying twins, triplets, and some non-twin…

  18. Merging Several Files in SPSS/PC

    Directory of Open Access Journals (Sweden)

    Syahrudji Naseh

    2012-09-01

    Full Text Available Computer software can basically be divided into five broad groups: word processors, spreadsheets, databases, statistics packages, and animation/desktop publishing. Each has its own strengths and weaknesses. dBase III+, the most popular database package, can hold only 128 variables per file. Consequently, the data from a large questionnaire such as Susenas (the National Socioeconomic Survey) or SKRT (the Household Health Survey) cannot be stored in a single file; it is usually split into many files, for example file1.dbf, file2.dbf, and so on. The problem is how to merge variables in file1.dbf with variables in file5.dbf. This paper discusses that problem.
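
    The merge problem posed here, joining variables held in separate files that describe the same respondents, comes down to a key-based join. A minimal sketch in Python (the field names resp_id, age, and income are hypothetical stand-ins, and plain dicts stand in for .dbf records):

```python
# Join two sets of survey records on a shared respondent ID.
# Field names below are illustrative, not from Susenas/SKRT.

def merge_on_key(rows_a, rows_b, key):
    """Inner-join two lists of record dicts on a common key field."""
    index_b = {row[key]: row for row in rows_b}  # key -> record lookup
    merged = []
    for row in rows_a:
        match = index_b.get(row[key])
        if match is not None:
            combined = dict(row)    # variables from the first file
            combined.update(match)  # add variables from the second file
            merged.append(combined)
    return merged

file1 = [{"resp_id": 1, "age": 34}, {"resp_id": 2, "age": 51}]
file5 = [{"resp_id": 2, "income": 700}, {"resp_id": 3, "income": 420}]

print(merge_on_key(file1, file5, "resp_id"))
# → [{'resp_id': 2, 'age': 51, 'income': 700}]
```

    Only respondent 2 appears in both files, so only that record survives the inner join; a statistical package performs the same matching internally once the files share a case identifier.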

  19. 76 FR 70651 - Fee for Filing a Patent Application Other Than by the Electronic Filing System

    Science.gov (United States)

    2011-11-15

    ... government; or (3) preempt tribal law. Therefore, a tribal summary impact statement is not required under... 0651-AC64 Fee for Filing a Patent Application Other Than by the Electronic Filing System AGENCY: United..., that is not filed by electronic means as prescribed by the Director of the United States Patent and...

  20. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian; Specht, Marcus; Schoonenboom, Judith; Sligte, Henk; Moghnieh, Ayman; Hernández-Leo, Davinia; Stefanov, Krassen; Lemmers, Ruud; Koper, Rob

    2008-01-01

    Glahn, C., Specht, M., Schoonenboom, J., Sligte, H., Moghnieh, A., Hernández-Leo, D. Stefanov, K., Lemmers, R., & Koper, R. (2008). Cross-system log file analysis for hypothesis testing. In H. Sligte & R. Koper (Eds.), Proceedings of the 4th TENCompetence Open Workshop. Empowering Learners for

  1. Earnings Public-Use File, 2006

    Data.gov (United States)

    Social Security Administration — Social Security Administration released Earnings Public-Use File (EPUF) for 2006. File contains earnings information for individuals drawn from a systematic random...

  2. 76 FR 70192 - Self-Regulatory Organizations; BATS Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Science.gov (United States)

    2011-11-10

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-65694; File No. SR-BATS-2011-046] Self-Regulatory Organizations; BATS Exchange, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule Change Related to Fees for Use of BATS Exchange, Inc. November 4, 2011. Pursuant to Section 19(b)(1) of...

  3. 12 CFR 509.10 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Filing of papers. 509.10 Section 509.10 Banks... IN ADJUDICATORY PROCEEDINGS Uniform Rules of Practice and Procedure § 509.10 Filing of papers. (a) Filing. Any papers required to be filed, excluding documents produced in response to a discovery request...

  4. File Detection On Network Traffic Using Approximate Matching

    Directory of Open Access Journals (Sweden)

    Frank Breitinger

    2014-09-01

    Full Text Available In recent years, Internet technologies have changed enormously and allow faster Internet connections, higher data rates and mobile usage. Hence, it is possible to send huge amounts of data/files easily, which is often used by insiders or attackers to steal intellectual property. As a consequence, data leakage prevention systems (DLPS) have been developed, which analyze network traffic and alert in case of a data leak. Although the overall concepts of the detection techniques are known, the systems are mostly closed and commercial. Within this paper we present a new technique for network traffic analysis based on approximate matching (a.k.a. fuzzy hashing), which is very common in digital forensics to correlate similar files. This paper demonstrates how to optimize and apply it on single network packets. Our contribution is a straightforward concept which does not need a comprehensive configuration: hash the file and store the digest in the database. Within our experiments we obtained false positive rates between 10^-4 and 10^-5 and an algorithm throughput of over 650 Mbit/s.
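
    The "hash the file, store the digest" idea can be illustrated with a toy n-gram digest. This is only a sketch of the general approximate-matching concept, not the ssdeep/sdhash-style algorithms such work builds on, and the window size is an arbitrary choice:

```python
import hashlib

NGRAM = 8  # sliding-window width; an illustrative choice

def digest(data: bytes) -> set:
    """Toy similarity digest: the set of hashed n-grams of the input.
    Real approximate-matching schemes produce far more compact digests."""
    return {hashlib.blake2b(data[i:i + NGRAM], digest_size=4).digest()
            for i in range(len(data) - NGRAM + 1)}

def score(packet: bytes, file_digest: set) -> float:
    """Fraction of a packet's n-grams found in a stored file digest."""
    grams = digest(packet)
    return len(grams & file_digest) / len(grams) if grams else 0.0

protected = bytes(range(256)) * 4        # stand-in for a sensitive file
stored = digest(protected)               # hash the file, store the digest

leaking_packet = protected[100:300]      # payload copied from the file
benign_packet = b"unrelated payload " * 20

print(score(leaking_packet, stored))     # → 1.0 (every n-gram is in the digest)
print(score(benign_packet, stored))      # near zero: essentially no n-grams match
```

    Because the leaking packet is a contiguous slice of the protected file, all of its n-grams appear in the stored digest; a DLPS would flag the packet once the score crosses a threshold.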

  5. 47 CFR 61.14 - Method of filing publications.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Method of filing publications. 61.14 Section 61...) TARIFFS Rules for Electronic Filing § 61.14 Method of filing publications. (a) Publications filed... date of a publication received by the Electronic Tariff Filing System will be determined by the date...

  6. 12 CFR 263.10 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Filing of papers. 263.10 Section 263.10 Banks... OF PRACTICE FOR HEARINGS Uniform Rules of Practice and Procedure § 263.10 Filing of papers. (a) Filing. Any papers required to be filed, excluding documents produced in response to a discovery request...

  7. 29 CFR 1981.103 - Filing of discrimination complaint.

    Science.gov (United States)

    2010-07-01

    ... constitute the violations. (c) Place of filing. The complaint should be filed with the OSHA Area Director... or she has been discriminated against by an employer in violation of the Act may file, or have filed..., but may be filed with any OSHA officer or employee. Addresses and telephone numbers for these...

  8. 77 FR 66458 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-11-05

    ... Service Company of Colorado. Description: 2012--10--26 PSCo MBR Filing to be effective 12/26/ 2012. Filed...--SPS MBR Filing to be effective 12/26/2012. Filed Date: 10/26/12. Accession Number: 20121026-5123...: Revised Application for MBR Authorization to be effective 10/16/2012. Filed Date: 10/25/12. Accession...

  9. 75 FR 66075 - Combined Notice of Filings #1

    Science.gov (United States)

    2010-10-27

    ....12: Baseline MBR Concurrence to be effective 10/8/2010. Filed Date: 10/19/2010. Accession Number... Company submits tariff filing per 35.12: Baseline MBR Concurrence to be effective 10/8/2010. Filed Date... Power Company submits tariff filing per 35.12: Baseline MBR Concurrence to be effective 10/8/2010. Filed...

  10. The Global File System

    Science.gov (United States)

    Soltis, Steven R.; Ruwart, Thomas M.; O'Keefe, Matthew T.

    1996-01-01

    The global file system (GFS) is a prototype design for a distributed file system in which cluster nodes physically share storage devices connected via a network-like Fibre Channel. Networks and network-attached storage devices have advanced to a level of performance and extensibility such that the previous disadvantages of shared-disk architectures are no longer valid. This shared storage architecture attempts to exploit the sophistication of storage device technologies, whereas a server architecture diminishes a device's role to that of a simple component. GFS distributes the file system responsibilities across processing nodes, storage across the devices, and file system resources across the entire storage pool. GFS caches data on the storage devices instead of the main memories of the machines. Consistency is established by using a locking mechanism maintained by the storage devices to facilitate atomic read-modify-write operations. The locking mechanism is being prototyped in the Silicon Graphics IRIX operating system and is accessed using standard Unix commands and modules.
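
    The read-modify-write discipline described above can be sketched abstractly. Here a mutex stands in for the device-resident lock; this models the idea only, not the SCSI-level mechanism the GFS prototype uses:

```python
import threading

class SharedDevice:
    """Toy model of a network-attached storage device: the lock lives
    with the device, and every node's update cycles through it."""

    def __init__(self):
        self._device_lock = threading.Lock()  # stands in for a device-resident lock
        self.block = 0                        # one shared "disk block"

    def read_modify_write(self, update):
        # Holding the device lock makes the whole cycle atomic even
        # when many cluster nodes share the storage.
        with self._device_lock:
            value = self.block            # read
            self.block = update(value)    # modify + write back

def node(dev, n):
    """One cluster node applying n increments to the shared block."""
    for _ in range(n):
        dev.read_modify_write(lambda v: v + 1)

device = SharedDevice()
nodes = [threading.Thread(target=node, args=(device, 1000)) for _ in range(4)]
for t in nodes:
    t.start()
for t in nodes:
    t.join()
print(device.block)  # → 4000: no update is lost
```

    Without the lock, concurrent read-modify-write cycles could interleave and silently drop updates; anchoring the lock at the device is what lets GFS keep caches on the storage side rather than on each node.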

  11. 12 CFR 19.10 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Filing of papers. 19.10 Section 19.10 Banks and... Rules of Practice and Procedure § 19.10 Filing of papers. (a) Filing. Any papers required to be filed...) Delivering the papers to a reliable commercial courier service, overnight delivery service, or to the U.S...

  12. 12 CFR 747.10 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Filing of papers. 747.10 Section 747.10 Banks... Practice and Procedure § 747.10 Filing of papers. (a) Filing. Any papers required to be filed, excluding...) Delivering the papers to a reliable commercial courier service, overnight delivery service, or to the U.S...

  13. Impact of Nitrogen Fertilization on Forest Carbon Sequestration and Water Loss in a Chronosequence of Three Douglas-Fir Stands in the Pacific Northwest

    Directory of Open Access Journals (Sweden)

    Xianming Dou

    2015-05-01

    Full Text Available To examine the effect of nitrogen (N) fertilization on forest carbon (C) sequestration and water loss, we used an artificial neural network model to estimate C fluxes and evapotranspiration (ET) in response to N fertilization during four post-fertilization years in a Pacific Northwest chronosequence of three Douglas-fir stands aged 61, 22 and 10 years in 2010 (DF49, HDF88 and HDF00, respectively). Results showed that N fertilization increased gross primary productivity (GPP) for all three sites in all four years, with the largest absolute increase at HDF00 followed by HDF88. Ecosystem respiration increased in all four years at HDF00, but decreased over the last three years at HDF88 and over all four years at DF49. As a result, fertilization increased the net ecosystem productivity of all three stands, with the largest increase at HDF88, followed by DF49. Fertilization had no discernible effect on ET in any of the stands. Consequently, fertilization increased water use efficiency (WUE) in all four post-fertilization years at all three sites and also increased light use efficiency (LUE) of all the stands, especially HDF00. Our results suggest that the effects of fertilization on forest C sequestration and water loss may be associated with stand age and fertilization; the two younger stands appeared to be more efficient than the older stand with respect to GPP, WUE and LUE.

  14. Identifying Urinary and Serum Exosome Biomarkers for Radiation Exposure Using a Data Dependent Acquisition and SWATH-MS Combined Workflow

    International Nuclear Information System (INIS)

    Kulkarni, Shilpa; Koller, Antonius; Mani, Kartik M.; Wen, Ruofeng; Alfieri, Alan; Saha, Subhrajit; Wang, Jian; Patel, Purvi; Bandeira, Nuno; Guha, Chandan

    2016-01-01

    Purpose: Early and accurate assessment of radiation injury by radiation-responsive biomarkers is critical for triage and early intervention. Biofluids such as urine and serum are convenient for such analysis. Recent research has also suggested that exosomes are a reliable source of biomarkers in disease progression. In the present study, we analyzed total urine proteome and exosomes isolated from urine or serum for potential biomarkers of acute and persistent radiation injury in mice exposed to lethal whole body irradiation (WBI). Methods and Materials: For feasibility studies, the mice were irradiated at 10.4 Gy WBI, and urine and serum samples were collected 24 and 72 hours after irradiation. Exosomes were isolated and analyzed using liquid chromatography mass spectrometry/mass spectrometry-based workflow for radiation exposure signatures. A data dependent acquisition and SWATH-MS combined workflow approach was used to identify significant exosome biomarkers indicative of acute or persistent radiation-induced responses. For the validation studies, mice were exposed to 3, 6, 8, or 10 Gy WBI, and samples were analyzed for comparison. Results: A comparison between total urine proteomics and urine exosome proteomics demonstrated that exosome proteomic analysis was superior in identifying radiation signatures. Feasibility studies identified 23 biomarkers from urine and 24 biomarkers from serum exosomes after WBI. Urinary exosome signatures identified different physiological parameters than the ones obtained in serum exosomes. Exosome signatures from urine indicated injury to the liver, gastrointestinal, and genitourinary tracts. In contrast, serum showed vascular injuries and acute inflammation in response to radiation. Selected urinary exosomal biomarkers also showed changes at lower radiation doses in validation studies. Conclusions: Exosome proteomics revealed radiation- and time-dependent protein signatures after WBI. A total of 47 differentially secreted

  15. Identifying Urinary and Serum Exosome Biomarkers for Radiation Exposure Using a Data Dependent Acquisition and SWATH-MS Combined Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Kulkarni, Shilpa [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); Koller, Antonius [Proteomics Center, Stony Brook University School of Medicine, Stony Brook, New York (United States); Proteomics Shared Resource, Herbert Irving Comprehensive Cancer Center, New York, New York (United States); Mani, Kartik M. [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); Wen, Ruofeng [Department of Applied Mathematics and Statistics, Stony Brook University, Stony Brook, New York (United States); Alfieri, Alan; Saha, Subhrajit [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); Wang, Jian [Center for Computational Mass Spectrometry, University of California, San Diego, California (United States); Department of Computer Science and Engineering, University of California, San Diego, California (United States); Patel, Purvi [Proteomics Shared Resource, Herbert Irving Comprehensive Cancer Center, New York, New York (United States); Department of Pharmacological Sciences, Stony Brook University, Stony Brook, New York (United States); Bandeira, Nuno [Center for Computational Mass Spectrometry, University of California, San Diego, California (United States); Department of Computer Science and Engineering, University of California, San Diego, California (United States); Skaggs School of Pharmacy and Pharmaceutical Sciences, University of California, San Diego, California (United States); Guha, Chandan, E-mail: cguha@montefiore.org [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); and others

    2016-11-01

    Purpose: Early and accurate assessment of radiation injury by radiation-responsive biomarkers is critical for triage and early intervention. Biofluids such as urine and serum are convenient for such analysis. Recent research has also suggested that exosomes are a reliable source of biomarkers in disease progression. In the present study, we analyzed total urine proteome and exosomes isolated from urine or serum for potential biomarkers of acute and persistent radiation injury in mice exposed to lethal whole body irradiation (WBI). Methods and Materials: For feasibility studies, the mice were irradiated at 10.4 Gy WBI, and urine and serum samples were collected 24 and 72 hours after irradiation. Exosomes were isolated and analyzed using liquid chromatography mass spectrometry/mass spectrometry-based workflow for radiation exposure signatures. A data dependent acquisition and SWATH-MS combined workflow approach was used to identify significant exosome biomarkers indicative of acute or persistent radiation-induced responses. For the validation studies, mice were exposed to 3, 6, 8, or 10 Gy WBI, and samples were analyzed for comparison. Results: A comparison between total urine proteomics and urine exosome proteomics demonstrated that exosome proteomic analysis was superior in identifying radiation signatures. Feasibility studies identified 23 biomarkers from urine and 24 biomarkers from serum exosomes after WBI. Urinary exosome signatures identified different physiological parameters than the ones obtained in serum exosomes. Exosome signatures from urine indicated injury to the liver, gastrointestinal, and genitourinary tracts. In contrast, serum showed vascular injuries and acute inflammation in response to radiation. Selected urinary exosomal biomarkers also showed changes at lower radiation doses in validation studies. Conclusions: Exosome proteomics revealed radiation- and time-dependent protein signatures after WBI. A total of 47 differentially secreted

  16. The effects of various exposure times in the detectability on the tips of the endodontic files in Digora

    International Nuclear Information System (INIS)

    Ko, Jee Young; Park, Chang Seo

    1997-01-01

    Digora, an intraoral digital radiography system utilizing an image plate (IP), has a dynamic range of exposure times which allows it to decrease the patient's exposure time and to increase diagnostic ability through image processing, transmission and storage. The purpose of this study was to evaluate the Digora system by assessing the effects of various exposure times on the detectability of the tip of the endodontic file. Examining the root canals of 45 extracted sound premolars, K-files No. 10, 15, and 20 were placed at slightly varying distances from the apex. The teeth were glued onto resin-plaster blocks. Five exposure times varying between 0.01 seconds and 0.25 seconds were used. Four observers were asked to measure the distance between the tip of the file and a reduction of crown portion, and obtained mean errors (subtracting true file length from the measured file length), comparing Digora monitors with E-plus films, which were both obtained under the same geometrical positions. The results were as follows: 1. Comparing E-plus film with Digora at 0.01 seconds, the mean errors in E-plus film were -4.453 mm, -4.497 mm, and -3.857 mm, while the mean errors in Digora were 0.065 mm, 0.607 mm, and 0.719 mm according to the file groups. Therefore there was a significant difference between E-plus film and Digora (P<0.05). 2. Comparing mean errors across the various exposure times in the Digora system, the mean error at 0.01 seconds was significantly lower than that at 0.12 and 0.25 seconds in the No. 10 and 20 file groups (P<0.05), and the standard deviation was the highest at 0.01 seconds. 3. Comparing E-plus film at 0.25 seconds with the Digora system, the mean errors showed a significant difference between E-plus film at 0.25 seconds and the Digora system at 0.25 seconds in the No. 10 and 20 file groups (P<0.05). 4. Comparing E-plus film at 0.25 seconds and E-plus film at 0.01 and 0.03 seconds in the No. 10 file group (P<0.05).

  17. 78 FR 12050 - S. Martinez Livestock, Inc.; Notice of Preliminary Permit Application Accepted for Filing and...

    Science.gov (United States)

    2013-02-21

    ... Livestock, Inc.; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments.... Martinez Livestock, Inc. filed an application for a preliminary permit, pursuant to section 4(f) of the... megawatt hours. Applicant Contact: Mr. Daniel T. Martinez, S. Martinez Livestock, Inc., 13395 Hwy. 24...

  18. 76 FR 9340 - Mill Town Power Project; Notice of Preliminary Permit Application Accepted for Filing and...

    Science.gov (United States)

    2011-02-17

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 13995-000] Mill Town Power Project; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments, Motions To Intervene, and Competing Applications On January 4, 2011, Mill Town Power Project filed an application for a...

  19. Mixed-Media File Systems

    NARCIS (Netherlands)

    Bosch, H.G.P.

    1999-01-01

    This thesis addresses the problem of implementing mixed-media storage systems. In this work a mixed-media file system is defined to be a system that stores both conventional (best-effort) file data and real-time continuous-media data. Continuous-media data is usually bulky, and servers storing and

  20. SU-E-T-184: Clinical VMAT QA Practice Using LINAC Delivery Log Files

    International Nuclear Information System (INIS)

    Johnston, H; Jacobson, T; Gu, X; Jiang, S; Stojadinovic, S

    2015-01-01

    Purpose: To evaluate the accuracy of volumetric modulated arc therapy (VMAT) treatment delivery dose clouds by comparing linac log data to doses measured using an ionization chamber and film. Methods: A commercial IMRT quality assurance (QA) process utilizing a DICOM-RT framework was tested for clinical practice using 30 prostate and 30 head and neck VMAT plans. Delivered 3D VMAT dose distributions were independently checked using a PinPoint ionization chamber and radiographic film in a solid water phantom. DICOM-RT coordinates were used to extract the corresponding point and planar doses from 3D log file dose distributions. Point doses were evaluated by computing the percent error between log file and chamber-measured values. A planar dose evaluation was performed for each plan using a 2D gamma analysis with 3% global dose difference and 3 mm isodose point distance criteria. The same analysis was performed to compare treatment planning system (TPS) doses to measured values to establish a baseline assessment of agreement. Results: The mean percent error between log file and ionization chamber dose was 1.0%±2.1% for prostate VMAT plans and −0.2%±1.4% for head and neck plans. The corresponding TPS-calculated and measured ionization chamber values agree within 1.7%±1.6%. The average 2D gamma passing rates for the log file comparison to film are 98.8%±1.0% and 96.2%±4.2% for the prostate and head and neck plans, respectively. The corresponding passing rates for the TPS comparison to film are 99.4%±0.5% and 93.9%±5.1%. Overall, the point dose and film data indicate that log file determined doses are in excellent agreement with measured values. Conclusion: Clinical VMAT QA practice using LINAC treatment log files is a fast and reliable method for patient-specific plan evaluation.
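
    The 2D gamma evaluation used in studies like this can be sketched as a brute-force implementation of the global 3%/3 mm criterion. This is a simplified illustration only (clinical tools add low-dose thresholds, interpolation, and a finer search grid), and the sample dose planes are invented:

```python
import math

def gamma_pass_rate(ref, meas, spacing_mm, dose_pct=3.0, dta_mm=3.0):
    """Brute-force global 2D gamma analysis on two dose grids.
    A reference point passes when some evaluated point lies within the
    combined dose-difference / distance-to-agreement ellipsoid (gamma <= 1)."""
    dose_crit = dose_pct / 100.0 * max(max(row) for row in ref)  # global normalization
    passed = total = 0
    for i, row in enumerate(ref):
        for j, d_ref in enumerate(row):
            best = math.inf
            for k, mrow in enumerate(meas):
                for l, d_eval in enumerate(mrow):
                    dist = spacing_mm * math.hypot(i - k, j - l)
                    g_sq = (dist / dta_mm) ** 2 + ((d_eval - d_ref) / dose_crit) ** 2
                    best = min(best, g_sq)
            passed += best <= 1.0
            total += 1
    return 100.0 * passed / total

# A flat reference plane vs. a measurement with one 10% hot spot:
ref  = [[1.0, 1.0], [1.0, 1.0]]
meas = [[1.0, 1.0], [1.0, 1.1]]
print(gamma_pass_rate(ref, meas, spacing_mm=5.0))  # → 75.0 (the hot-spot point fails)
```

    With a 5 mm grid, the hot-spot point is too far in dose (10% vs. the 3% criterion) and too far in distance (5 mm vs. 3 mm) from any agreeing point, so 3 of 4 points pass.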

  1. 77 FR 14512 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-03-12

    ... Company. Description: Schedule 4 & 10--Energy & Generator Imbalance Changes to be effective 3/5/2012... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 Take notice...: 5 p.m. ET 3/23/12. Docket Numbers: ER12-458-004. Applicants: Quantum Choctaw Power, LLC. Description...

  2. 78 FR 23760 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-04-22

    ...-1266-000. Applicants: CalEnergy, LLC. Description: CalEnergy FERC MBR Tariff Application to be.... Docket Numbers: ER13-1267-000. Applicants: CE Leathers Company. Description: CE Leathers FERC MBR Tariff... Company MBR Tariff Application to be effective 6/3/2013. Filed Date: 4/12/13. Accession Number: 20130412...

  3. 76 FR 58257 - Combined Notice of Filings #2

    Science.gov (United States)

    2011-09-20

    ... Hills Wind Farm, LLC MBR Tariff to be effective 10/31/2007. Filed Date: 09/12/2011. Accession Number... filing per 35.1: Smoky Hills Wind Project II, LLC MBR Tariff to be effective 10/20/2008. Filed Date: 09..., LLC submits tariff filing per 35.1: Enel Stillwater, LLC MBR Tariff to be effective 12/5/2008. Filed...

  4. 77 FR 28592 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-05-15

    ...: Middletown MBR Application to be effective 5/8/2012. Filed Date: 5/7/12. Accession Number: 20120507-5128..., LLC. Description: Southern Energy Initial MBR Filing to be effective 5/ 7/2012. Filed Date: 5/8/12... Company submits tariff filing per 35.37: MBR Triennial Filing--1st Rev MBR to be effective 9/30/2010...

  5. 76 FR 59676 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-09-27

    ... MBR Tariff to be effective 10/1/2011. Filed Date: 09/16/2011. Accession Number: 20110916-5146. Comment... 35.1: ONEOK Energy Services Company Baseline MBR Filing to be effective 9/16/2011. Filed Date: 09/16... Services Order No. 697 Compliance Filing of MBR Tariff to be effective 9/16/2011. Filed Date: 09/16/2011...

  6. K-file vs ProFiles in cleaning capacity and instrumentation time in primary molar root canals: An in vitro study

    Directory of Open Access Journals (Sweden)

    N Madan

    2011-01-01

    Full Text Available Objectives: This study compares the efficiency of manual K-files and rotary ProFiles in terms of cleaning capacity and instrumentation time in primary molar root canals. Materials and Methods: Seventy-five maxillary and mandibular primary molar root canals were instrumented with ProFiles and K-files in the step-back manner from size #10 to #40. The teeth were decalcified, dehydrated and cleared, and analyzed for the presence of dye remaining on the root canal walls, which served as evidence of the cleaning capacity of both techniques. Results: The results showed a significant difference between ProFiles and K-files in the cleaning capacity in the apical and coronal thirds of the root canal. ProFiles were found to be more efficient in cleaning the coronal thirds and K-files in cleaning the apical thirds of the root canals. Both techniques were almost equally effective in cleaning the middle thirds of the canals. The time taken during cleaning of the root canals was statistically shorter with K-files than with ProFiles.

  7. Curved canals: Ancestral files revisited

    Directory of Open Access Journals (Sweden)

    Jain Nidhi

    2008-01-01

    Full Text Available The aim of this article is to provide an insight into different techniques for cleaning and shaping curved root canals with hand instruments. Although a plethora of root canal instruments such as ProFile, ProTaper and LightSpeed® dominate the current scenario, inexpensive conventional root canal hand files such as K-files and flexible files can be used to obtain optimum results when handled meticulously. Special emphasis has been placed on the modifications in biomechanical canal preparation in a variety of curved canal cases. This article compiles a series of clinical cases of root canals with curvatures in the middle and apical third and with S-shaped curvatures that were successfully completed by employing only conventional root canal hand instruments.

  8. 76 FR 40905 - Turnagain Arm Tidal Electric Energy Project; Notice of Intent To File License Application, Filing...

    Science.gov (United States)

    2011-07-12

    ... Tidal Electric Energy Project; Notice of Intent To File License Application, Filing of Pre-Application....: 13509-001. c. Dated Filed: May 11, 2011. d. Submitted By: Turnagain Arm Tidal Energy Corporation. e. Name of Project: Turnagain Arm Tidal Electric Energy Project. f. Location: Of the Upper Cook Inlet off...

  9. 77 FR 58370 - Pennamaquan Tidal Power LLC; Notice of Intent To File License Application, Filing of Pre...

    Science.gov (United States)

    2012-09-20

    ... Tidal Power LLC; Notice of Intent To File License Application, Filing of Pre-Application Document (PAD... Filed: July 19, 2012. d. Submitted By: Pennamaquan Tidal Power LLC (Pennamaquan Power). e. Name of Project: Pennamaquan Tidal Power Plant Project. f. Location: On the Pennamaquan River at the entrance to...

  10. 75 FR 61474 - Juneau Hydropower, Inc.; Notice of Intent To File License Application, Filing of Pre-Application...

    Science.gov (United States)

    2010-10-05

    ... Hydropower, Inc.; Notice of Intent To File License Application, Filing of Pre-Application Document, and....: 13563-001. c. Dated Filed: July 28, 2010. d. Submitted By: Juneau Hydropower, Inc. e. Name of Project... Commission's regulations. h. Potential Applicant Contact: Duff W. Mitchell, Juneau Hydropower, Inc., P.O. Box...

  11. Evaluated nuclear-data file for niobium

    International Nuclear Information System (INIS)

    Smith, A.B.; Smith, D.L.; Howerton, R.J.

    1985-03-01

    A comprehensive evaluated nuclear-data file for elemental niobium is provided in the ENDF/B format. This file, extending over the energy range 10^-11 to 20 MeV, is suitable for comprehensive neutronic calculations, particularly those dealing with fusion-energy systems. It also provides dosimetry information. Attention is given to the internal consistency of the file, energy balance, and the quantitative specification of uncertainties. Comparisons are made with experimental data and previous evaluated files. The results of integral tests are described and remaining outstanding problem areas are cited. 107 refs

  12. Data_files_Reyes_EHP_phthalates

    Data.gov (United States)

    U.S. Environmental Protection Agency — The dataset contains three files in comma-separated values (.csv) format. “Reyes_EHP_Phthalates_US_metabolites.csv” contains information about the National Health and...

  13. Intermittent hemodialysis is superior to continuous veno-venous hemodialysis/hemodiafiltration to eliminate methanol and formate during treatment for methanol poisoning

    Science.gov (United States)

    Zakharov, Sergey; Pelclova, Daniela; Navratil, Tomas; Belacek, Jaromir; Kurcova, Ivana; Komzak, Ondrej; Salek, Tomas; Latta, Jiri; Turek, Radovan; Bocek, Robert; Kucera, Cyril; Hubacek, Jaroslav A; Fenclova, Zdenka; Petrik, Vit; Cermak, Martin; Hovda, Knut Erik

    2014-01-01

    During an outbreak of methanol poisonings in the Czech Republic in 2012, we were able to study methanol and formate elimination half-lives during intermittent hemodialysis (IHD) and continuous veno-venous hemodialysis/hemodiafiltration (CVVHD/HDF) and the relative impact of dialysate and blood flow rates on elimination. Data were obtained from 11 IHD and 13 CVVHD/HDF patients. Serum methanol and formate concentrations were measured by gas chromatography and an enzymatic method. The groups were relatively comparable, but the CVVHD/HDF group was significantly more acidotic (mean pH 6.9 vs. 7.1 IHD). The mean elimination half-life of methanol was 3.7 and formate 1.6 h with IHD, versus 8.1 and 3.6 h, respectively, with CVVHD/HDF (both significant). The 54% greater reduction in methanol and 56% reduction in formate elimination half-life during IHD resulted from the higher blood and dialysate flow rates. Increased blood and dialysate flow on the CVVHD/HDF also increased elimination significantly. Thus, IHD is superior to CVVHD/HDF for more rapid methanol and formate elimination, and if CVVHD/HDF is the only treatment available then elimination is greater with greater blood and dialysate flow rates. PMID:24621917
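    The elimination half-lives above follow from first-order kinetics: if serum concentration decays as C(t) = C0·e^(−kt), then t1/2 = ln 2 / k, and k can be estimated from two timed concentration measurements. A minimal sketch with hypothetical values (not the study's data pipeline):

```python
import math

def elimination_half_life(c1, c2, hours_between):
    """First-order elimination half-life (hours) from two serum
    concentrations drawn `hours_between` hours apart (c1 earlier,
    c2 later, same units)."""
    k = math.log(c1 / c2) / hours_between  # elimination rate constant, 1/h
    return math.log(2) / k
```

    For example, a fall from 4.0 to 1.0 mmol/L over 7.4 h gives a half-life of exactly 3.7 h, the same order as the IHD methanol half-life reported above.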

  14. HYDRA Hyperspectral Data Research Application Tom Rink and Tom Whittaker

    Science.gov (United States)

    Rink, T.; Whittaker, T.

    2005-12-01

    HYDRA is a freely available, easy to install tool for visualization and analysis of large local or remote hyper/multi-spectral datasets. HYDRA is implemented on top of the open-source VisAD Java library via Jython, the Java implementation of the user-friendly Python programming language. VisAD provides data integration through its generalized data model, user-display interaction and display rendering. Jython has an easy-to-read, concise, scripting-like syntax which eases software development. HYDRA allows sharing of large datasets through its support of the OpenDAP and OpenADDE server-client protocols. Users can explore and interrogate data, and subset in physical and/or spectral space to isolate key areas of interest for further analysis without having to download an entire dataset. It also has an extensible data input architecture to recognize new instruments and understand different local file formats; currently NetCDF and HDF4 are supported.

  15. SPICODYN: A Toolbox for the Analysis of Neuronal Network Dynamics and Connectivity from Multi-Site Spike Signal Recordings.

    Science.gov (United States)

    Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo

    2018-01-01

    We implemented an automated and efficient open-source software package for the analysis of multi-site neuronal spike signals. The package, named SPICODYN, has been developed as a standalone Windows GUI application, using the C# programming language with Microsoft Visual Studio based on the .NET Framework 4.5 development environment. Accepted input data formats are HDF5, level 5 MAT and text files containing recorded or generated time-series spike data. SPICODYN processes such electrophysiological signals focusing on spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented, dealing with multiple time delays (temporal extension) and with multiple binary patterns (high-order extension). SPICODYN is specifically tailored to process data coming from different Multi-Electrode Array setups, guaranteeing automated processing in those specific cases. The optimized implementation of the Delayed Transfer Entropy and High-Order Transfer Entropy algorithms allows performing accurate and rapid analysis on multiple spike trains from thousands of electrodes.
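    Transfer entropy on binary spike trains, of the kind SPICODYN uses for connectivity inference, can be sketched in a few lines. This toy estimator (plain counts, one past bin of each train, a configurable delay on the source train) is an illustration of the idea only, not SPICODYN's optimized implementation:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y, delay=1):
    """TE(X -> Y) in bits for binary sequences: how much the value of X,
    `delay` bins back, reduces uncertainty about the next value of Y
    beyond Y's own immediate past."""
    triples = [(y[t + 1], y[t], x[t + 1 - delay])
               for t in range(delay - 1, len(y) - 1)]
    total = len(triples)
    c_xyz = Counter(triples)                            # (y_next, y, x)
    c_yz = Counter((yp, yc) for yp, yc, _ in triples)   # (y_next, y)
    c_zx = Counter((yc, xc) for _, yc, xc in triples)   # (y, x)
    c_z = Counter(yc for _, yc, _ in triples)           # (y,)
    te = 0.0
    for (yp, yc, xc), n in c_xyz.items():
        # p(y'|y,x) / p(y'|y) expressed with raw counts
        te += (n / total) * log2(n * c_z[yc] / (c_zx[(yc, xc)] * c_yz[(yp, yc)]))
    return te
```

    A train that deterministically drives another with a one-bin lag yields a clearly positive TE, while a constant target gives exactly zero.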

  16. A comparative evaluation of gutta percha removal and extrusion of apical debris by rotary and hand files.

    Science.gov (United States)

    Chandrasekar; Ebenezar, A V Rajesh; Kumar, Mohan; Sivakumar, A

    2014-11-01

    The aim of this study was to evaluate the efficacy of ProTaper retreatment files in comparison with RaCe, K3 and H-files for removal of gutta-percha and apically extruded debris using volumetric analysis. Forty extracted single-rooted maxillary incisor teeth with straight canals and mature apices were selected for the study. After access cavity preparation, apical patency was confirmed with a size 10 K-file extending 1 mm beyond the point at which it was first visible at the apical end. Working lengths were determined with the use of a size 15 K-file. The canals were prepared in a step-back technique and the master apical file was size 30 for all teeth. 3% sodium hypochlorite was used as an irrigant after each instrumentation. Before the final rinse, a size 20 K-file was passed 1 mm beyond the apex to remove any dentinal shaving plugs and maintain apical patency. The canals were then dried with paper points. The root canal was filled using standard gutta-percha points and zinc oxide eugenol sealer under the lateral condensation technique. The teeth were then randomly divided into four groups of ten teeth each based on the instrument used for gutta-percha removal. All the rotary instruments used in this study were rotated at 300 rpm. The instruments used were: Group 1 - RaCe files, Group 2 - ProTaper retreatment files, Group 3 - K3 files and Group 4 - H-files. The volume of the obturating material was calculated before and after removal using volumetric analysis with spiral CT. The removal efficacy of each instrument was calculated and statistically analysed. The results show that the ProTaper retreatment files (Group 2) (97.4%) had the highest efficiency in the removal of obturating material, followed by RaCe (95.74%), K3 (92.86%) and H-files (90.14%) in decreasing order of efficiency. Similarly, the mean apical extrusion with H-files (0.000 ± 0.002) was significantly lower than with all the rotary instruments.
However, the difference among the

  17. OpenMC: a state-of-the-art Monte Carlo code for research and development

    International Nuclear Information System (INIS)

    Romano, P.K.; Horelik, N.E.; Herman, B.R.; Forget, B.; Smith, K.; Nelson, A.G.

    2013-01-01

    This paper gives an overview of OpenMC, an open source Monte Carlo particle transport code recently developed at the Massachusetts Institute of Technology. OpenMC uses continuous-energy cross sections and a constructive solid geometry representation, enabling high-fidelity modeling of nuclear reactors and other systems. Modern, portable input/output file formats are used in OpenMC: XML for input, and HDF5 for output. High performance parallel algorithms in OpenMC have demonstrated near-linear scaling to over 100,000 processors on modern supercomputers. Other topics discussed in this paper include plotting, CMFD acceleration, variance reduction, eigenvalue calculations, and software development processes. (authors)
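    OpenMC's XML input is split across several files (e.g. geometry, materials, settings). As a rough illustration of the settings file for an eigenvalue run — element names follow the OpenMC documentation, but the schema should be checked against the current release:

```xml
<settings>
  <!-- criticality (k-eigenvalue) calculation -->
  <run_mode>eigenvalue</run_mode>
  <particles>10000</particles>   <!-- histories per batch -->
  <batches>100</batches>         <!-- total batches -->
  <inactive>10</inactive>        <!-- batches discarded before tallying -->
</settings>
```

    Results, including the estimated eigenvalue and tallies, are written to HDF5 state-point files.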

  18. 75 FR 2128 - Basin Farm Renewable, LLC; Notice of Preliminary Permit Application Accepted for Filing and...

    Science.gov (United States)

    2010-01-14

    ... Renewable, LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments... Renewable, LLC filed an application, pursuant to section 4(f) of the Federal Power Act, proposing to study the feasibility of the Basin Farm Renewable Energy Project, to be located on Saxtons River, in...

  19. Evaluation of Contact Friction in Fracture of Rotationally Bent Nitinol Endodontic Files

    Science.gov (United States)

    Haimed, Tariq Abu

    2011-12-01

    The high flexibility of rotary Nitinol (Ni-Ti) files has helped clinicians perform root canal treatments with fewer technical errors than seen with stainless steel files. However, intracanal file fracture can occur, compromising the outcome of the treatment. Ni-Ti file fracture incidence is roughly around 4% amongst specialists and higher amongst general practitioners. Therefore, eliminating or reducing this problem should improve patient care. The aim of this project was to isolate and examine the role of friction between files and the canal walls of the glass tube model, and of bending-related maximum strain amplitudes, on Ni-Ti file lifetimes-to-fracture in the presence of different irrigant solutions and file coatings. A specifically designed device was used to test over 300 electropolished EndoSequence® Ni-Ti files for number of cycles to failure (NCF) in smooth, bent glass tube models at 45 and 60 degrees during dry, coated and liquid-lubricated rotation at 600 rpm. Fractured files were examined under scanning electron microscopy (SEM) afterwards. Four different file sizes, 25.04, 25.06, 35.04 and 35.06 (diameter in mm/taper %), and six surface modification conditions were used independently. These conditions included three solutions: (1) a surfactant-based Surface-Active-Displacement-Solution (SADS), (2) Delmopinol 1% (DEL), a mouthwash proven to remove biofilms, and (3) 6% (vol.%) bleach, the most common antibacterial endodontic irrigant solution. The conditions also included two low-friction silane-based coating groups, 3-Hepta-fluoroisopropyl-propoxymethyl-dichlorosilane (3-HEPT) and Octadecyltrichlorosilane (ODS), in addition to an as-received file control group (Dry). The coefficient of friction (CF) between the file and the canal walls was measured for each condition, as were the surface tension of the irrigant solutions and the critical surface tension of the coated and uncoated files by contact angle measurements. The radius of curvature and

  20. 75 FR 51994 - Combined Notice of Filings

    Science.gov (United States)

    2010-08-24

    ...: Panther Interstate Pipeline Energy, LLC. Description: Panther Interstate Pipeline Energy, LLC submits tariff filing per 154.203: Panther Baseline eTariff Filing to be effective 8/ 12/2010. Filed Date: 08/13...

  1. Review of uncertainty files and improved multigroup cross section files for FENDL

    International Nuclear Information System (INIS)

    Ganesan, S.

    1994-03-01

    The IAEA Nuclear Data Section, in co-operation with several national nuclear data centers and research groups, is creating an internationally available Fusion Evaluated Nuclear Data Library (FENDL), which will serve as a comprehensive source of processed and tested nuclear data tailored to the requirements of the Engineering and Development Activities (EDA) of the International Thermonuclear Experimental Reactor (ITER) Project and other fusion-related development projects. The FENDL project of the International Atomic Energy Agency has the task of coordination, with the goal of assembling, processing and testing a comprehensive, fusion-relevant evaluated nuclear data library with unrestricted international distribution. The present report contains the summary of the IAEA Advisory Group Meeting on "Review of Uncertainty Files and Improved Multigroup Cross Section Files for FENDL", held during 8-12 November 1993 at the Tokai Research Establishment, JAERI, Japan, organized in cooperation with the Japan Atomic Energy Research Institute. The report presents the current status of the FENDL activity and the future work plans in the form of conclusions and recommendations of the four Working Groups of the Advisory Group Meeting on (1) experimental and calculational benchmarks, (2) preparation of processed libraries for FENDL/ITER, (3) specifying procedures for improving FENDL and (4) selection of activation libraries for FENDL. (author). 1 tab

  2. 75 FR 49923 - Combined Notice of Filings #1

    Science.gov (United States)

    2010-08-16

    ... filing per 35.12: KCP&L-GMO Baseline Filing (Market-Based Volume 28) to be effective 8/2/2010. Filed Date...: KCP&L Greater Missouri Operations Company submits tariff filing per 35.12: GMO Volume 33 (Cost-Based...

  3. 78 FR 21353 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-04-10

    ... Partners, LLC, Delaware City Refining Company, LLC, PBF Power Marketing LLC. Description: Supplement to Lea.... Description: Duquesne submits new OATT Attachment H-17C to be effective 6/1/2013. Filed Date: 4/1/13.... Docket Numbers: ER13-1222-000. Applicants: NV Energy, Inc. Description: OATT Revisions to Attachment N...

  4. 78 FR 68431 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-11-14

    ... Company's Petition for Limited Waiver. Filed Date: 11/4/13. Accession Number: 20131104-5152. Comments Due.... ET 11/26/13. Docket Numbers: ER14-313-000. Applicants: Public Service Company of New Hampshire. Description: Public Service Company of New Hampshire submits Cancellation of LCRA with CMEEC to be effective 1...

  5. Comparison of survival between dialysis patients with incident high-flux hemodialysis versus on-line hemodiafiltration: A single center experience in Saudi Arabia

    Directory of Open Access Journals (Sweden)

    Mohamed Said Abdelsalam

    2018-01-01

    Full Text Available Conventional hemodialysis (HD) is the most common treatment modality used for renal replacement therapy. The concept of HD is based on the diffusion of solutes across a semipermeable membrane. Hemofiltration (HF) is based on convective transport of solutes; hemodiafiltration (HDF) is based on combined convective and diffusive therapies. Data about the survival benefit of on-line HDF (OL-HDF) over high-flux HD (HF-HD) are conflicting. We conducted this study to investigate whether there is a survival difference between the two treatment modalities. This is a retrospective, single-center study in which 78 patients were screened; 18 were excluded and 60 patients were analyzed. The study patients were aged 47.5 ± 20.7 years; 33 patients (55%) were on HF-HD and 27 patients (45%) were on OL-HDF. A total of 24 patients (40%) of both groups were diabetic, and the mean duration on dialysis was 43.5 ± 21.3 months in the HF-HD group and 41.2 ± 22.0 months in the OL-HDF group. The mean substitution volume for OL-HDF was 22.3 ± 2.5 L. Survival was 73% (95% confidence interval [CI] 60–84) in the HF-HD group and 65% (95% CI 54–75) in the OL-HDF group by the end of the study period. The unadjusted hazard ratio (HR) with 95% CI comparing HF-HD to high-volume postdilution OL-HDF was 0.78 (0.10–5.6; P = 0.810). Kaplan–Meier analysis for patient survival over five years showed no significant difference between the two modalities. Prospective controlled trials with a larger number of patients will be needed to assess the long-term clinical outcome of postdilution OL-HDF over HF-HD.

  6. High-Flux Hemodialysis and High-Volume Hemodiafiltration Improve Serum Calcification Propensity.

    Directory of Open Access Journals (Sweden)

    Marijke Dekker

    Full Text Available Calciprotein particles (CPPs) may play an important role in the calcification process. The calcification propensity of serum (T50) is highly predictive of all-cause mortality in chronic kidney disease patients. Whether T50 is therapeutically improvable by high-flux hemodialysis (HD) or hemodiafiltration (HDF) has not been studied yet. We designed a cross-sectional single-center study and included stable prevalent in-center dialysis patients on HD or HDF. Patients were divided into two groups based on dialysis modality, were on a thrice-weekly schedule, had a dialysis vintage of > 3 months and vascular access providing a blood flow rate > 300 ml/min. Calcification propensity of serum was measured as the time of transformation from primary to secondary CPP (T50 test) by time-resolved nephelometry. We included 64 patients; the mean convective volume was 21.7 L (SD 3.3 L). In the pooled analysis, T50 levels increased from pre- to post-dialysis in both the HD and HDF groups, with mean (SD) values of 244 (64) to 301 (57) and 253 (55) to 304 (61) min, respectively (P = 0.43, HD vs. HDF). The mean increase in T50 was 26.29% for HD and 21.97% for HDF patients (P = 0.61, HD vs. HDF). The delta values (Δ) of calcium, phosphate and serum albumin were equal in both groups. Baseline T50 was negatively correlated with phosphate, and positively correlated with serum magnesium and fetuin-A. The ΔT50 was mostly influenced by Δ phosphate (r = -0.342, P = 0.002 for HD and r = -0.396, P < 0.001 for HDF). HD and HDF patients present with the same baseline pre-dialysis T50 calcification propensity values. Calcification propensity is significantly improved during both HD and HDF sessions, without significant differences between the two modalities.

  7. The Effect of Online Hemodiafiltration on Infections: Results from the CONvective TRAnsport STudy.

    Directory of Open Access Journals (Sweden)

    Claire H den Hoedt

    Full Text Available Hemodialysis (HD) patients have a high risk of infections. The uremic milieu has a negative impact on several immune responses. Online hemodiafiltration (HDF) may reduce the risk of infections by ameliorating the uremic milieu through enhanced clearance of middle molecules. Since there are few data on infectious outcomes in HDF, we compared the effects of HDF with low-flux HD on the incidence and type of infections. We used data from the 714 HD patients (age 64 ± 14, 62% men, 25% diabetes mellitus, 7% catheters) participating in the CONvective TRAnsport STudy (CONTRAST), a randomized controlled trial evaluating the effect of HDF as compared to low-flux HD. The events were adjudicated by an independent event committee. The risk of infectious events was compared with Cox regression for repeated events and Cox proportional hazard models. The distributions of types of infection were compared between the groups. Thirty-one percent of the patients suffered from one or more infections leading to hospitalization during the study (median follow-up 1.96 years). The risk of infections during the entire follow-up did not differ significantly between treatment arms (HDF 198 and HD 169 infections in 800 and 798 person-years, respectively; hazard ratio HDF vs. HD 1.09 (0.88–1.34), P = 0.42). No difference was found in the occurrence of the first infectious event (either fatal, non-fatal or type-specific). Of all infections, respiratory infections (25% in HDF, 28% in HD) were most common, followed by skin/musculoskeletal infections (21% in HDF, 13% in HD). HDF as compared to HD did not result in a reduced risk of infections; larger studies are needed to confirm our findings. ClinicalTrials.gov NCT00205556.

  8. SV40-transformed human fibroblasts: evidence for cellular aging in pre-crisis cells.

    Science.gov (United States)

    Stein, G H

    1985-10-01

    Pre-crisis SV40-transformed human diploid fibroblast (HDF) cultures have a finite proliferative lifespan, but they do not enter a viable senescent state at the end of that lifespan. Little is known about either the mechanism for this finite lifespan in SV40-transformed HDF or its relationship to finite lifespan in normal HDF. Recently, we proposed that in normal HDF the phenomena of finite lifespan and arrest in a viable senescent state depend on two separate processes: 1) an age-related decrease in the ability of the cells to recognize or respond to serum and/or other mitogens, such that the cells become functionally mitogen-deprived at the end of lifespan; and 2) the ability of the cells to enter a viable, G1-arrested state whenever they experience mitogen deprivation. In this paper, data are presented that suggest that pre-crisis SV40-transformed HDF retain the first process described above, but lack the second process. It is shown that SV40-transformed HDF have a progressively decreasing ability to respond to serum as they age, but they continue to traverse the cell cycle at the end of lifespan. Concomitantly, the rate of cell death increases steadily toward the end of lifespan, thereby causing the total population to cease growing and ultimately to decline. Previous studies have shown that when SV40-transformed HDF are environmentally serum deprived, they likewise exhibit continued cell cycle traverse coupled with increased cell death. Thus, these results support the hypothesis that pre-crisis SV40-transformed HDF still undergo the same aging process as normal HDF, but end their lifespan in crisis rather than in the normal G1-arrested senescent state because they have lost the ability to enter a viable, G1-arrested state in response to mitogen deprivation.

  9. Detection Of Alterations In Audio Files Using Spectrograph Analysis

    Directory of Open Access Journals (Sweden)

    Anandha Krishnan G

    2015-08-01

    Full Text Available This study was carried out to detect changes in audio files using a spectrograph. An audio file format is a file format for storing digital audio data on a computer system. A sound spectrograph is a laboratory instrument that displays a graphical representation of the strengths of the various component frequencies of a sound as time passes. The objectives of the study were to find the changes in the spectrographs of audio files after altering them, to compare the alterations with the spectrographs of the original files, and to check for similarities and differences between MP3 and WAV formats. Five different alterations were carried out on each audio file to analyze the differences between the original and the altered file. For altering an MP3 or WAV audio file by cut-copy, the file was opened in Audacity and a different audio segment was pasted into it; the new file was then analyzed to view the differences. By adjusting the necessary parameters, the noise was reduced, and the differences between the new file and the original file were analyzed. By adjusting the parameters from the dialog box, the necessary changes were made. The edited audio file was opened in the software named Spek, which after analysis produces a graph of that particular file; the graph was saved for further analysis. The graph of the original audio was combined with the graph of the edited audio file to identify the alterations.
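    The spectrograph comparison described can be sketched numerically: compute a magnitude spectrum per frame for both signals and flag the frames whose spectra differ. This toy version (plain DFT, non-overlapping frames, synthetic sample lists rather than decoded MP3/WAV audio) is an illustration of the idea only:

```python
import cmath
import math

def frame_spectra(samples, frame_len=64):
    """Magnitude spectra of consecutive non-overlapping frames,
    computed with a plain O(n^2) DFT (fine for small frames)."""
    spectra = []
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        mags = []
        for k in range(frame_len // 2):        # keep positive frequencies
            s = sum(x * cmath.exp(-2j * math.pi * k * n / frame_len)
                    for n, x in enumerate(frame))
            mags.append(abs(s))
        spectra.append(mags)
    return spectra

def altered_frames(orig, edited, frame_len=64, tol=1e-6):
    """Indices of frames whose spectra differ between two equal-length
    signals -- i.e. where an alteration shows up on the spectrograph."""
    diffs = []
    pairs = zip(frame_spectra(orig, frame_len), frame_spectra(edited, frame_len))
    for i, (a, b) in enumerate(pairs):
        if max(abs(x - y) for x, y in zip(a, b)) > tol:
            diffs.append(i)
    return diffs
```

    Silencing one 64-sample frame of a sine tone, for instance, is flagged at exactly that frame index while the untouched frames match.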

  10. 77 FR 1542 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Science.gov (United States)

    2012-01-10

    ...-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate Effectiveness of Proposed Rule Change To Modify Fees for Members Using the NASDAQ Market Center January 4, 2012. Pursuant to... is hereby given that on December 28, 2011, The NASDAQ Stock Market LLC (``NASDAQ'') filed with the...

  11. 75 FR 2127 - Basin Farm Renewable, LLC; Notice of Preliminary Permit Application Accepted for Filing and...

    Science.gov (United States)

    2010-01-14

    ... Renewable, LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments... Renewable, LLC filed an application, pursuant to section 4(f) of the Federal Power Act, proposing to study the feasibility of the Basin Farm Renewable Energy Project, to be located on Saxtons River, in...

  12. 78 FR 71689 - Self-Regulatory Organizations; Topaz Exchange, LLC; Notice of Filing of Proposed Minor Rule...

    Science.gov (United States)

    2013-11-29

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-70927; File No. 4-669] Self-Regulatory Organizations; Topaz Exchange, LLC; Notice of Filing of Proposed Minor Rule Violation Plan November 22, 2013... of Rule 19d-1(c)(1) of the Act \\3\\ requiring that a self- regulatory organization (``SRO'') promptly...

  13. The crystallographic information file (CIF): A new standard archive file for crystallography

    International Nuclear Information System (INIS)

    Hall, S.R.; Allen, F.H.; Brown, I.D.

    1991-01-01

    The specification of a new standard Crystallographic Information File (CIF) is described. Its development is based on the Self-Defining Text Archive and Retrieval (STAR) procedure. The CIF is a general, flexible and easily extensible free-format archive file; it is human and machine readable and can be edited by a simple editor. The CIF is designed for the electronic transmission of crystallographic data between individual laboratories, journals and databases: it has been adopted by the International Union of Crystallography as the recommended medium for this purpose. The file consists of data names and data items, together with a loop facility for repeated items. The data names, constructed hierarchically so as to form data categories, are self-descriptive within a 32-character limit. The sorted list of data names, together with their precise definitions, constitutes the CIF dictionary (core version 1991). The CIF core dictionary is presented in full and covers the fundamental and most commonly used data items relevant to crystal structure analysis. The dictionary is also available as an electronic file suitable for CIF computer applications. Future extensions to the dictionary will include data items used in more specialized areas of crystallography. (orig.)
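    The data-name/data-item and loop facility structure described above is easy to see in miniature. The following toy STAR/CIF reader is an illustration only (it ignores quoted strings, multi-line text fields and multiple data blocks, all of which the real format supports); it pulls single items and looped columns into a dict:

```python
def parse_cif(text):
    """Tiny STAR/CIF reader: `_name value` pairs become strings, and a
    `loop_` of column names followed by rows becomes lists per name."""
    lines = [l.strip() for l in text.splitlines()
             if l.strip() and not l.startswith('#')]   # drop blanks/comments
    items, i = {}, 0
    while i < len(lines):
        line = lines[i]
        if line.lower() == 'loop_':
            i += 1
            names = []
            while i < len(lines) and lines[i].startswith('_'):
                names.append(lines[i]); i += 1          # looped data names
            rows = []
            while (i < len(lines) and not lines[i].startswith('_')
                   and lines[i].lower() != 'loop_'):
                rows.append(lines[i].split()); i += 1   # looped data rows
            for col, name in enumerate(names):
                items[name] = [row[col] for row in rows]
        elif line.startswith('_'):
            name, _, value = line.partition(' ')        # single data item
            items[name] = value.strip()
            i += 1
        else:
            i += 1                                      # data_ headers etc.
    return items
```

    Feeding it a fragment with one cell parameter and a two-column atom-site loop returns the scalar and the aligned columns.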

  14. 78 FR 8504 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-02-06

    ... Wholesale Generator Status of Delano Energy Center, LLC. Filed Date: 1/25/13. Accession Number: 20130125...: Docket Numbers: LA12-4-000. Applicants: BP Energy Company, BP West Coast Products LLC, Cedar Creek Wind Energy, LLC, Cedar Creek II, LLC, Flat Ridge 2 Wind Energy LLC, Flat Ridge Wind Energy, LLC, Fowler Ridge...

  15. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
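    The sampling idea behind PFFF can be sketched as follows. This is a simplified illustration of the approach only, not the pfff tool's actual algorithm: hash the file size plus a fixed number of chunks read at seeded-random offsets, so the cost is flat regardless of file size:

```python
import hashlib
import os
import random

def probabilistic_fingerprint(path, n_samples=64, chunk=1024, seed=0):
    """Fingerprint a file by hashing its size and `n_samples` chunks
    drawn at seeded-random offsets, instead of reading it in full."""
    size = os.path.getsize(path)
    h = hashlib.sha256(str(size).encode())      # size is part of the print
    if size <= chunk:
        offsets = [0]                           # small file: read it whole
    else:
        rng = random.Random(seed)               # same seed -> comparable prints
        offsets = sorted(rng.randrange(size - chunk + 1)
                         for _ in range(n_samples))
    with open(path, 'rb') as f:
        for off in offsets:
            f.seek(off)
            h.update(f.read(chunk))
    return h.hexdigest()
```

    Two files only compare equal if every sampled chunk matches; a change confined to unsampled bytes can be missed, which is exactly the probabilistic trade-off the paper quantifies.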

  16. A World Wide Web Human Dimensions Framework and Database for Wildlife and Forest Planning

    Science.gov (United States)

    Michael A. Tarrant; Alan D. Bright; H. Ken Cordell

    1999-01-01

    The paper describes a human dimensions framework (HDF) for application in wildlife and forest planning. The HDF is delivered via the world wide web and retrieves data on-line from the Social, Economic, Environmental, Leisure, and Attitudes (SEELA) database. The proposed HDF is guided by ten fundamental HD principles, and is applied to wildlife and forest planning using...

  17. Pengaruh Penerapan E-filing, Tingkat Pemahaman Perpajakan dan Kesadaran Wajib Pajak terhadap Kepatuhan Wajib Pajak di Kpp Pratama YOGYAKARTA

    OpenAIRE

    Agustiningsih, Wulandari; Isroah, Isroah

    2016-01-01

    This study aimed to determine (1) the effect of e-filing implementation on taxpayer compliance; (2) the effect of the level of tax understanding on taxpayer compliance; (3) the effect of taxpayer awareness on taxpayer compliance; and (4) the joint effect of e-filing implementation, level of tax understanding, and taxpayer awareness on taxpayer compliance. The population of this study was taxpayers using e-filing at KPP Pratama Yogyakarta with...

  18. 77 FR 34030 - BOST1 Hydroelectric LLC; Notice of Intent To File License Application, Filing of Pre-Application...

    Science.gov (United States)

    2012-06-08

    ... Hydroelectric LLC; Notice of Intent To File License Application, Filing of Pre-Application Document, and.... Date Filed: March 21, 2012. d. Submitted By: BOST1 Hydroelectric LLC (BOST1). e. Name of Project: Coon Rapids Dam Hydroelectric Project. f. Location: Mississippi River in Hennepin and Anoka counties...

  19. High-Performance, Multi-Node File Copies and Checksums for Clustered File Systems

    Science.gov (United States)

    Kolano, Paul Z.; Ciotti, Robert B.

    2012-01-01

    Modern parallel file systems achieve high performance using a variety of techniques, such as striping files across multiple disks to increase aggregate I/O bandwidth and spreading disks across multiple servers to increase aggregate interconnect bandwidth. To achieve peak performance from such systems, it is typically necessary to utilize multiple concurrent readers/writers from multiple systems to overcome various single-system limitations, such as number of processors and network bandwidth. The standard cp and md5sum tools of GNU coreutils found on every modern Unix/Linux system, however, utilize a single execution thread on a single CPU core of a single system, and hence cannot take full advantage of the increased performance of clustered file systems. Mcp and msum are drop-in replacements for the standard cp and md5sum programs that utilize multiple types of parallelism and other optimizations to achieve maximum copy and checksum performance on clustered file systems. Multi-threading is used to ensure that nodes are kept as busy as possible. Read/write parallelism allows individual operations of a single copy to be overlapped using asynchronous I/O. Multi-node cooperation allows different nodes to take part in the same copy/checksum. Split-file processing allows multiple threads to operate concurrently on the same file. Finally, hash trees allow inherently serial checksums to be performed in parallel. Mcp and msum provide significant performance improvements over standard cp and md5sum using multiple types of parallelism and other optimizations. The total speed-ups from all improvements are significant. Mcp improves cp performance over 27x, msum improves md5sum performance almost 19x, and the combination of mcp and msum improves verified copies via cp and md5sum by almost 22x. These improvements come in the form of drop-in replacements for cp and md5sum, so they are easily used and are available for download as open source software at http://mutil.sourceforge.net.
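
    The split-file, hash-tree idea behind msum can be sketched in a few lines. This is a simplified single-node illustration under assumed parameters, not the msum implementation; the function name and chunk size are invented:

```python
import hashlib
import os
from concurrent.futures import ThreadPoolExecutor

def tree_checksum(path, chunk_size=1 << 20, workers=4):
    """Hash-tree checksum: hash fixed-size chunks independently (so the
    work parallelizes across threads or nodes), then hash the ordered
    chunk digests into a single root digest."""
    def hash_chunk(index):
        with open(path, "rb") as f:          # each worker uses its own handle
            f.seek(index * chunk_size)
            return hashlib.sha256(f.read(chunk_size)).digest()

    size = os.path.getsize(path)
    n_chunks = max((size + chunk_size - 1) // chunk_size, 1)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        leaves = list(pool.map(hash_chunk, range(n_chunks)))
    return hashlib.sha256(b"".join(leaves)).hexdigest()
```

    The root digest is stable for a given chunk size because the leaves are combined in file order, which is what lets different threads, or different nodes, hash disjoint parts of the same file concurrently.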

  20. 75 FR 68340 - Combined Notice of Filings # 1

    Science.gov (United States)

    2010-11-05

    ... Project, LP; Southern Company--Florida LLC. Description: Report of Non-Material Change in Estimated Coal... Energy Ohio, Inc. submits tariff filing per 35: Compliance Filing to be effective 7/29/2010. Filed Date..., Inc. Description: Midwest Independent Transmission System Operator, Inc. submits tariff filing per 35...

  1. 77 FR 60424 - FFP Project 73 LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting...

    Science.gov (United States)

    2012-10-03

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 13484-001] FFP Project 73 LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments, Motions To Intervene, and Competing Applications On September 4, 2012, FFP Project 73, LLC filed an application for a...

  2. Improving the Accessibility and Use of NASA Earth Science Data

    Science.gov (United States)

    Tisdale, Matthew; Tisdale, Brian

    2015-01-01

    Many of the NASA Langley Atmospheric Science Data Center (ASDC) Distributed Active Archive Center (DAAC) multidimensional tropospheric and atmospheric chemistry data products are stored in HDF4, HDF5 or NetCDF format, which traditionally have been difficult to analyze and visualize with geospatial tools. With the rising demand from the diverse end-user communities for geospatial tools to handle multidimensional products, several applications, such as ArcGIS, have refined their software. Many geospatial applications now have new functionalities that enable the end user to: store, serve, and perform analysis on each individual variable, its time dimension, and vertical dimension; use NetCDF, GRIB, and HDF raster data formats across applications directly; and publish output within REST image services or WMS for time- and space-enabled web application development. During this webinar, participants will learn how to leverage geospatial applications such as ArcGIS, OPeNDAP and ncWMS in the production of Earth science information, and in increasing data accessibility and usability.

  3. 75 FR 15475 - Self-Regulatory Organizations; EDGX Exchange, Inc.; Notice of Filing of Proposed Minor Rule...

    Science.gov (United States)

    2010-03-29

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-61752; File No. 4-594] Self-Regulatory... Rule 19d-1(c)(1) of the Act \\3\\ requiring that a self- regulatory organization promptly file notice... Commission adopted amendments to paragraph (c) of Rule 19d-1 to allow self-regulatory organizations (``SROs...

  4. 75 FR 15471 - Self-Regulatory Organizations; EDGA Exchange, Inc.; Notice of Filing of Proposed Minor Rule...

    Science.gov (United States)

    2010-03-29

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-61753; File No. 4-595] Self-Regulatory... Rule 19d-1(c)(1) of the Act \\3\\ requiring that a self- regulatory organization promptly file notice... Commission adopted amendments to paragraph (c) of Rule 19d-1 to allow self-regulatory organizations (``SROs...

  5. An ML-Based Radial Velocity Estimation Algorithm for Moving Targets in Spaceborne High-Resolution and Wide-Swath SAR Systems

    Directory of Open Access Journals (Sweden)

    Tingting Jin

    2017-04-01

    Full Text Available Multichannel synthetic aperture radar (SAR) is a significant breakthrough against the inherent trade-off between high resolution and wide swath (HRWS) in conventional SAR. Moving target indication (MTI) is an important application of spaceborne HRWS SAR systems. In contrast to previous studies of SAR MTI, HRWS SAR mainly faces the problem of under-sampled data in each channel, which makes single-channel imaging and processing infeasible. In this study, the estimation of radial velocity is made equivalent to the estimation of the cone angle according to their relationship. A maximum likelihood (ML) based algorithm is proposed to estimate the radial velocity in the presence of Doppler ambiguities. After that, the signal is reconstructed and the phase offset caused by radial velocity is compensated for the moving target. Finally, a traditional imaging algorithm is applied to obtain a focused moving-target image. Experiments are conducted to evaluate the accuracy and effectiveness of the estimator under different signal-to-noise ratios (SNRs). Furthermore, the performance is analyzed for a moving ship subject to interference from different distributions of sea clutter. The results verify that the proposed algorithm is accurate and efficient, with low computational complexity. This paper aims at providing a solution to the velocity estimation problem in future HRWS SAR systems with multiple receive channels.
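
    The maximum-likelihood step can be illustrated generically: under white Gaussian noise, the ML estimate of a phase-rate parameter is the candidate whose matched-filter output is largest. The sketch below is a deliberately simplified scalar model, not the paper's multichannel SAR estimator; the signal model and names are invented for illustration:

```python
import cmath

def ml_estimate(samples, times, candidates):
    """Grid-search ML estimate for the model
    s[n] = A * exp(j*2*pi*v*t[n]) + noise: return the candidate v that
    maximizes the matched-filter magnitude
    |sum_n samples[n] * exp(-j*2*pi*v*t[n])|, which is the ML choice
    under white Gaussian noise."""
    def score(v):
        return abs(sum(s * cmath.exp(-2j * cmath.pi * v * t)
                       for s, t in zip(samples, times)))
    return max(candidates, key=score)
```

    In the HRWS setting the same principle applies, except the likelihood is evaluated over the multichannel, Doppler-ambiguous signal model rather than a single complex exponential.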

  6. Association of intradialytic hypotension and convective volume in hemodiafiltration: results from a retrospective cohort study

    Directory of Open Access Journals (Sweden)

    Mora-Bravo Franklin G

    2012-09-01

    Full Text Available Abstract Background Hemodiafiltration (HDF), as a convective blood purification technique, has been associated with favorable outcomes, such as improved phosphate control, removal of middle molecules such as beta2-microglobulin, and the occurrence of intradialytic hypotension (IDH), as compared to diffusive techniques. The aim of this retrospective cohort study in dialysis patients receiving HDF in one urban dialysis facility in Mexico City was to investigate the occurrence of IDH during HDF treatments with varying convective volume prescriptions. Methods Subjects were stratified into equal groups by percentiles of convective volume prescription: group 1, 0 to 7.53 liters; group 2, 7.54 to 14.8 liters; group 3, 14.9 to 16.96 liters; group 4, 16.97 to 18.9 liters; group 5, 19.9 to 21 liters; and group 6, 21.1 to 30 liters. Logistic regression, with and without adjustment for confounding factors, was used to evaluate factors associated with the occurrence of IDH. Results 2276 treatments of 154 patients were analyzed. IDH occurred during 239 HDF treatments (10.5% of all treatments). Group 1 showed 31 treatments (8.2%) with IDH, whereas group 6 showed IDH in only 15 sessions (4% of all treatments). The odds ratio of IDH for group 6 was 0.47 (95% CI 0.25 to 0.88) as compared to group 1 after adjustment. Conclusions In summary, the data of this retrospective cohort study show an inverse correlation between the occurrence of IDH and the convective volume prescription. Further research in prospective settings is needed to confirm these findings.

  7. 77 FR 105 - Combined Notice of Filings #2

    Science.gov (United States)

    2012-01-03

    ... tariff filing per 35.17(b): Effective Date Change for ``Fully Integrated'' DR Rules to be effective 6/1... Numbers: ER12-10-001. Applicants: Energy International Power Marketing. Description: Energy International Power Marketing submits tariff filing per 35: EIP Compliance Filing to be effective 10/3/2011. Filed...

  8. Design and application of remote file management system

    International Nuclear Information System (INIS)

    Zhu Haijun; Liu Dekang; Shen liren

    2006-01-01

    The file transfer protocol (FTP) helps users transfer files between computers on the internet. FTP cannot fulfill users' needs on special occasions, so programmers need to define a file transfer protocol themselves based on user requirements. The method of realization and application of a user-defined file transfer protocol is introduced. (authors)
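
    A user-defined file transfer protocol of the kind described can be as small as a length-prefixed framing layer over a socket. This is an illustrative sketch, not the authors' protocol; the 8-byte header and function names are assumptions:

```python
import socket
import struct

HEADER = struct.Struct("!Q")  # 8-byte big-endian payload length

def send_file(sock: socket.socket, payload: bytes) -> None:
    """Send one file as a single length-prefixed message."""
    sock.sendall(HEADER.pack(len(payload)) + payload)

def recv_file(sock: socket.socket) -> bytes:
    """Receive one length-prefixed message, looping because recv()
    may return fewer bytes than requested."""
    def recv_exact(n: int) -> bytes:
        buf = b""
        while len(buf) < n:
            part = sock.recv(n - len(buf))
            if not part:
                raise ConnectionError("peer closed mid-message")
            buf += part
        return buf
    (length,) = HEADER.unpack(recv_exact(HEADER.size))
    return recv_exact(length)
```

    Extending the header with a file name, checksum, or resume offset is exactly the kind of user-specific need the authors argue plain FTP cannot cover.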

  9. 37 CFR 360.21 - Time of filing.

    Science.gov (United States)

    2010-07-01

    ... OF ROYALTY CLAIMS FILING OF CLAIMS TO ROYALTY FEES COLLECTED UNDER COMPULSORY LICENSE Digital Audio... purposes of royalties filing and fee distribution. Such written authorization, however, will not be... members or affiliates before the Copyright Royalty Board in royalty filing and fee distribution...

  10. 76 FR 77222 - Combined Notice of Filings #2

    Science.gov (United States)

    2011-12-12

    ..., FirstLight Hydro Generating Corporation, FirstLight Power Resources Management, L,GDF SUEZ Energy..., L.L.C. Description: Original Service Agreement Nos. 3156 and 3157-PJM Queue X2-082 to be effective...: Original Service Agreement No. 3153; Queue No. W1-029 to be effective 11/4/2011. Filed Date: 12/5/11...

  11. X-Files opjat v stroju

    Index Scriptorium Estoniae

    2008-01-01

    The second sequel film based on the US series "The X-Files", "Salatoimikud: Ma tahan uskuda" ("The X-Files: I Want to Believe") : director Chris Carter : starring David Duchovny and Gillian Anderson : United States - Canada 2008

  12. Both IL-1β and TNF-α Regulate NGAL Expression in Polymorphonuclear Granulocytes of Chronic Hemodialysis Patients

    Directory of Open Access Journals (Sweden)

    Adriana Arena

    2010-01-01

    Full Text Available Background. NGAL is involved in modulation of the inflammatory response and is found in the sera of uremic patients. We investigated whether hemodiafiltration (HDF) could influence the ability of polymorphonuclear granulocytes (PMGs) to release NGAL. The involvement of interleukin-1β (IL-1β) and tumor necrosis factor-α (TNF-α) in NGAL release was evaluated. Methods. We studied end-stage renal disease (ESRD) patients at the start of dialysis (pre-HDF) and at the end of treatment (post-HDF), and 18 healthy subjects (HSs). Peripheral venous blood was taken from HDF patients at the start of dialysis and at the end of treatment. Results. PMGs obtained from ESRD patients were hyporesponsive to LPS treatment with respect to PMGs from HSs. IL-1β and TNF-α produced by PMGs from post-HDF patients were higher than those produced by PMGs from pre-HDF patients. Neutralization of IL-1β, but not of TNF-α, determined a clear-cut production of NGAL in PMGs from healthy donors. On the contrary, specific induction of NGAL in PMGs from uremic patients was dependent on the presence in the supernatants of IL-1β and TNF-α. Conclusion. Our data demonstrate that in PMGs from healthy subjects, NGAL production was supported solely by IL-1β, whereas in PMGs from HDF patients, NGAL production was supported by both IL-1β and TNF-α.

  13. 78 FR 37217 - Ted P. Sorenson; Notice of Preliminary Permit Application Accepted for Filing and Soliciting...

    Science.gov (United States)

    2013-06-20

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 14519-000] Ted P. Sorenson; Notice of Preliminary Permit Application Accepted for Filing and Soliciting Comments, Motions To Intervene, and Competing Applications On May 1, 2013, Ted P. Sorenson filed an application for a preliminary permit, pursuant to section 4(f) of the...

  14. Smear layer removal evaluation of different protocol of Bio Race file and XP-endo Finisher file in corporation with EDTA 17% and NaOCl.

    Science.gov (United States)

    Zand, Vahid; Mokhtari, Hadi; Reyhani, Mohammad-Frough; Nahavandizadeh, Neda; Azimi, Shahram

    2017-11-01

    The aim of the present study was to compare the amount of the smear layer remaining in prepared root canals with different protocols of Bio RaCe files and the XP-endo Finisher file (XPF) in association with 17% EDTA and sodium hypochlorite solution. A total of 68 extracted single-rooted teeth were randomly divided into 4 experimental groups (n=14) and two control groups (n=6). The root canals were prepared with Bio RaCe files (FKG Dentaire, Switzerland) using the crown-down technique based on the manufacturer's instructions and irrigated according to the following irrigation techniques: Group 1: XPF with 2 mL of 2.5% NaOCl for 1 minute. Group 2: XPF with 1 mL of 17% EDTA for 1 minute. Group 3: XPF for 1 minute in association with normal saline solution. Group 4: XPF for 30 seconds in association with 2.5% NaOCl and 17% EDTA for 30 seconds. Negative control group: 2.5% NaOCl was used during root canal preparation, followed by irrigation with 17% EDTA at the end of root canal preparation. Positive control group: normal saline solution was used for irrigation during root canal preparation. In all the groups, during preparation of the root canals with Bio RaCe files, 20 mL of 2.5% NaOCl was used for root canal irrigation, and at the end of the procedural steps 20 mL of normal saline solution was used as a final irrigant. The samples were analyzed under SEM at ×1000‒2000 magnification and evaluated using the Torabinejad scoring system. Data were analyzed with the non-parametric Kruskal-Wallis test and post hoc Mann-Whitney U test, using SPSS. Statistical significance was defined at P < 0.05. The results of the study showed the least amount of the smear layer at the coronal, middle and apical thirds of the root canals in group 2, which was not significantly different from the negative control group (P < 0.5). Under the limitations of the present study, use of a combination of NaOCl and EDTA in association with XPF exhibited the best efficacy for the

  15. 29 CFR 1979.103 - Filing of discrimination complaint.

    Science.gov (United States)

    2010-07-01

    ... subcontractor of an air carrier in violation of the Act may file, or have filed by any person on the employee's... acts and omissions, with pertinent dates, which are believed to constitute the violations. (c) Place of filing. The complaint should be filed with the OSHA Area Director responsible for enforcement activities...

  16. 76 FR 6125 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-02-03

    ... filing per 35: Arthur Kill--Amendment to MBR Tariff 01262011 to be effective 10/8/ 2010. Filed Date: 01... Turbine Power LLC submits tariff filing per 35: Astoria--Amendment to MBR Tariff 01/26/2011 to be... Description: Conemaugh Power LLC submits tariff filing per 35: Conemaugh--Amendment to MBR Tariff 01262011 to...

  17. 77 FR 43820 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-07-26

    .... Docket Numbers: ER12-1946-001. Applicants: Duke Energy Beckjord, LLC. Description: Amendment to MBR...: Amendment to MBR Tariff Filing to be effective 10/1/ 2012. Filed Date: 7/18/12. Accession Number: 20120718... Creek, LLC. Description: Amendment to MBR Tariff Filing to be effective 10/1/ 2012. Filed Date: 7/18/12...

  18. 77 FR 60418 - Combined Notice of Filings

    Science.gov (United States)

    2012-10-03

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings Take notice that the Commission has received the following Natural Gas Pipeline Rate and Refund Report filings...-1064-000. Applicants: Venice Gathering System, L.L.C. Description: NAESB 2.0 Compliance Filing to be...

  19. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business file compilation for an enterprise is a distillation and re-creation of its intellectual wealth, from which applicable information can be made available quickly, extensively, and precisely to those who need it. Proceeding from the effects of business file compilation on scientific research, productive construction, and development, this paper discusses in five points how to define topics, analyze historical materials, search and select data, and process them into an enterprise archives collection. First, it expounds the importance and necessity of business file compilation in the production, operation, and development of a company. Second, it presents processing methods from topic definition, material searching, and data selection to final examination and correction. Third, it defines principles and classifications so that different categories and levels of processing methods are available for business file compilation. Fourth, it discusses the specific method of implementing a file compilation through a documentation collection, on the principle that topic definition should gear with demand. Fifth, it addresses the application of information technology to business file compilation, in view of the wide need for business files, so as to raise the level of enterprise archives management. The present discussion focuses on the examination and correction principles of enterprise historical material compilation, the basic classifications, and the major forms of business file compilation achievements. (author)

  20. Serum sclerostin: relation with mortality and impact of hemodiafiltration.

    Science.gov (United States)

    Lips, Lotte; de Roij van Zuijdewijn, Camiel L M; Ter Wee, Piet M; Bots, Michiel L; Blankestijn, Peter J; van den Dorpel, Marinus A; Fouque, Denis; de Jongh, Renate; Pelletier, Solenne; Vervloet, Marc G; Nubé, Menso J; Grooteman, Muriel P C

    2017-07-01

    The glycoprotein sclerostin (Scl; 22 kDa), which is involved in bone metabolism, may play a role in vascular calcification in haemodialysis (HD) patients. In the present study, we investigated the relation between serum Scl (sScl) and mortality. The effects of dialysis modality and the magnitude of the convection volume in haemodiafiltration (HDF) on sScl were also investigated. In a subset of patients from the CONTRAST study, a randomized controlled trial comparing HDF with HD, sScl was measured at baseline and at intervals of 6, 12, 24 and 36 months. Patients were divided into quartiles, according to their baseline sScl. The relation between time-varying sScl and mortality with a 4-year follow-up period was investigated using crude and adjusted Cox regression models. Linear mixed models were used for longitudinal measurements of sScl. The mean (±standard deviation) age of 396 test subjects was 63.6 (±13.9 years), 61.6% were male and the median follow-up was 2.9 years. Subjects with the highest sScl had a lower mortality risk than those with the lowest concentrations [adjusted hazard ratio 0.51 (95% confidence interval, CI, 0.31-0.86, P = 0.01)]. Stratified models showed a stable sScl in patients treated with HD (Δ +2.9 pmol/L/year, 95% CI -0.5 to +6.3, P = 0.09) and a decreasing concentration in those treated with HDF (Δ -4.5 pmol/L/year, 95% CI -8.0 to -0.9, P = 0.02). The relative change in the latter group was related to the magnitude of the convection volume. (i) A high sScl is associated with a lower mortality risk in patients with end-stage kidney disease; (ii) treatment with HDF causes sScl to fall; and (iii) the relative decline in patients treated with HDF is dependent on the magnitude of the convection volume. © The Author 2016. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  1. Wear of the Primary WaveOne single file when shaping vestibular root canals of first maxillary molar.

    Science.gov (United States)

    Aracena, Daniel; Borie, Eduardo; Betancourt, Pablo; Aracena, Angella; Guzmán, Mario

    2017-03-01

    It is very important for a clinician to know the extent of wear of mechanized files when establishing endodontic therapy. The aim of this study was to check the wear of the Primary WaveOne file upon shaping two, four and six maxillary molar vestibular canals. The deterioration of 40 files, divided into four groups, was evaluated microscopically: group 1, control (unused); group 2, two canals; group 3, four canals; and group 4, six canals. After instrumentation, the files were embedded in resin and sectioned at their apical third into three equal parts. To analyze the wear of the edges in the different sections, AutoCAD software was used, and analysis of variance (ANOVA) was then performed to compare the mean rake angles. The files with two and four uses showed slight wear, whereas those with six applications showed significant wear (p < 0.05). Primary WaveOne files can be used in up to four root canals without their edges losing effectiveness. Key words: File wear, reciprocating motion, shaping capacity, WaveOne.

  2. 75 FR 62371 - Combined Notice of Filings #2

    Science.gov (United States)

    2010-10-08

    .... Description: Weyerhaeuser NR Company submits tariff filing per 35.12: Weyerhaeuser NR Company MBR Tariff to be..., Inc. MBR eTariff Filing to be effective 9/30/2010. Filed Date: 09/30/2010. Accession Number: 20100930... 35.12: Chandler Wind Partners, LLC MBR Tariff to be effective 9/30/ 2010. Filed Date: 09/30/2010...

  3. Collective operations in a file system based execution model

    Science.gov (United States)

    Shinde, Pravin; Van Hensbergen, Eric

    2013-02-19

    A mechanism is provided for group communications using a MULTI-PIPE synthetic file system. A master application creates a multi-pipe synthetic file in the MULTI-PIPE synthetic file system, the master application indicating a multi-pipe operation to be performed. The master application then writes a header-control block of the multi-pipe synthetic file specifying at least one of a multi-pipe synthetic file system name, a message type, a message size, a specific destination, or a specification of the multi-pipe operation. Any other application participating in the group communications then opens the same multi-pipe synthetic file. A MULTI-PIPE file system module then implements the multi-pipe operation as identified by the master application. The master application and the other applications then either read or write operation messages to the multi-pipe synthetic file and the MULTI-PIPE synthetic file system module performs appropriate actions.
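
    The header-control-block pattern is easy to sketch with an ordinary pipe. This is a loose in-process analogy, not the MULTI-PIPE synthetic file system; the field layout (message type, body size, destination id) is invented for illustration:

```python
import os
import struct

# Hypothetical header-control block: message type, body size, destination id.
HCB = struct.Struct("!HII")

def _read_exact(fd: int, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = os.read(fd, n - len(buf))
        if not chunk:
            raise EOFError("pipe closed mid-message")
        buf += chunk
    return buf

def write_message(fd: int, msg_type: int, dest: int, payload: bytes) -> None:
    """Master side: write the header-control block, then the body."""
    os.write(fd, HCB.pack(msg_type, len(payload), dest) + payload)

def read_message(fd: int):
    """Participant side: parse one header-control block, then read
    exactly the advertised number of body bytes."""
    msg_type, size, dest = HCB.unpack(_read_exact(fd, HCB.size))
    return msg_type, dest, _read_exact(fd, size)
```

    In the described mechanism, the file system module behind the synthetic file, rather than the applications themselves, would interpret such a header and fan the body out to the other participants.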

  4. Perancangan dan Penerapan Algoritme 4DES (Studi Kasus Pada Keamanan Berkas Rekam Medis)

    Directory of Open Access Journals (Sweden)

    Yeni Yanti

    2017-01-01

    Full Text Available Information is necessary for life, because almost nothing can be done properly in its absence. Security is one of the most crucial aspects of a file containing sensitive information, for example, medical record files. Often, the file owners and the designers and managers of information systems pay too little attention to security issues. One way to address this is a cryptographic method, which is the science and art of keeping messages secure. This study aimed to evaluate performance and to build a security-system prototype for medical record files using the 4DES algorithm. The 4DES algorithm is a variant of the 3DES algorithm that is more robust and capable of protecting information properly. The 4DES security system has four keys; each key is 64 bits long, so the total length of the four keys is 256 bits, with K1≠K2≠K3≠K4. Files (Word, Excel, and image) are encrypted/decrypted using an external key of at least eight characters (64 bits). During encryption, padding bytes are added to each data block, using a CBC mode of operation, to minimize attacks from an attacker. Results showed that the processing speed for encrypting files with 4DES was 1 second faster than with the 3DES algorithm. The 4DES algorithm is also superior in terms of file safety, enduring 3.45 × 10^56 years longer against a brute-force attack attempting to discover the text file and the secret key.
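
    The four-pass structure described (four 64-bit keys, padding bytes, CBC chaining) can be sketched with a stand-in block primitive. The XOR "cipher" below is NOT DES and offers no security; it only demonstrates the keying, padding, and chaining layout, and all names are invented for the example:

```python
import struct

BLOCK = 8  # 64-bit blocks, as in DES

def _toy_block(block: bytes, key: bytes) -> bytes:
    # Stand-in for the DES primitive: XOR with the 64-bit key.
    # NOT a real cipher; it is its own inverse, which keeps the sketch short.
    (b,) = struct.unpack("!Q", block)
    (k,) = struct.unpack("!Q", key)
    return struct.pack("!Q", b ^ k)

def _pad(data: bytes) -> bytes:
    # PKCS#5-style padding so the length is a whole number of blocks
    n = BLOCK - len(data) % BLOCK
    return data + bytes([n]) * n

def encrypt_4des_cbc(data: bytes, keys, iv: bytes) -> bytes:
    assert len(keys) == 4 and all(len(k) == BLOCK for k in keys)
    padded, prev, out = _pad(data), iv, []
    for i in range(0, len(padded), BLOCK):
        block = bytes(a ^ b for a, b in zip(padded[i:i + BLOCK], prev))
        for k in keys:                      # four passes: K1, K2, K3, K4
            block = _toy_block(block, k)
        out.append(block)
        prev = block                        # CBC chaining
    return b"".join(out)

def decrypt_4des_cbc(data: bytes, keys, iv: bytes) -> bytes:
    prev, out = iv, []
    for i in range(0, len(data), BLOCK):
        block = data[i:i + BLOCK]
        plain = block
        for k in reversed(keys):            # undo passes: K4 .. K1
            plain = _toy_block(plain, k)
        out.append(bytes(a ^ b for a, b in zip(plain, prev)))
        prev = block
    res = b"".join(out)
    return res[:-res[-1]]                   # strip the padding
```

    A real 4DES would replace _toy_block with the DES encrypt/decrypt primitive while keeping the per-block four-pass, CBC, and padding layout exactly as above.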

  5. Search the SEC website for the latest EDGAR filings

    Data.gov (United States)

    Securities and Exchange Commission — This listing contains the most recent filings for the current official filing date (including filings made after the 5:30pm deadline on the previous filing day)....

  6. 21 CFR 720.2 - Times for filing.

    Science.gov (United States)

    2010-04-01

    ... Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS VOLUNTARY FILING OF COSMETIC PRODUCT INGREDIENT COMPOSITION STATEMENTS § 720.2 Times for filing. Within 180 days after forms are made available to the industry, Form FDA 2512 should be filed for each cosmetic...

  7. 75 FR 51451 - Erie Boulevard Hydropower, L.P.; Notice of Intent To File License Application, Filing of Pre...

    Science.gov (United States)

    2010-08-20

    ... Hydropower, L.P.; Notice of Intent To File License Application, Filing of Pre-Application Document, and.... Project No.: 7320-040. c. Dated Filed: June 29, 2010. d. Submitted By: Erie Boulevard Hydropower, L.P. e...: John Mudre at (202) 502-8902; or e-mail at [email protected] . j. Erie Boulevard Hydropower, L.P...

  8. Flexibility and Performance of Parallel File Systems

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1996-01-01

    As we gain experience with parallel file systems, it becomes increasingly clear that a single solution does not suit all applications. For example, it appears to be impossible to find a single appropriate interface, caching policy, file structure, or disk-management strategy. Furthermore, the proliferation of file-system interfaces and abstractions makes applications difficult to port. We propose that the traditional functionality of parallel file systems be separated into two components: a fixed core that is standard on all platforms, encapsulating only primitive abstractions and interfaces, and a set of high-level libraries to provide a variety of abstractions and application-programmer interfaces (APIs). We present our current and next-generation file systems as examples of this structure. Their features, such as a three-dimensional file structure, strided read and write interfaces, and I/O-node programs, are specifically designed with the flexibility and performance necessary to support a wide range of applications.
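
    A strided read interface of the kind mentioned can be sketched as below. This is an illustrative user-level loop with invented names; the point of exposing stride in the API is that a parallel file system core could service the same request as one batched operation instead of many small seeks:

```python
def strided_read(path, offset, record_size, stride, count):
    """Read `count` fixed-size records, starting at `offset` and
    advancing the file position by `stride` bytes between records --
    the access pattern of one column in a row-major record file."""
    records = []
    with open(path, "rb") as f:
        for i in range(count):
            f.seek(offset + i * stride)
            records.append(f.read(record_size))
    return records
```

    Varying `offset` per process lets each member of a parallel job pick up its own interleaved share of the same file.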

  9. Evaluated nuclear data file of Th-232

    International Nuclear Information System (INIS)

    Meadows, J.; Poenitz, W.; Smith, A.; Smith, D.; Whalen, J.; Howerton, R.

    1977-09-01

    An evaluated nuclear data file for thorium is described. The file extends over the energy range 0.049 (i.e., the inelastic-scattering threshold) to 20.0 MeV and is formulated within the framework of the ENDF system. The input data base, the evaluation procedures and judgments, and ancillary experiments carried out in conjunction with the evaluation are outlined. The file includes: neutron total cross sections, neutron scattering processes, neutron radiative capture cross sections, fission cross sections, (n;2n) and (n;3n) processes, fission properties (e.g., nu-bar and delayed neutron emission) and photon production processes. Regions of uncertainty are pointed out particularly where new measured results would be of value. The file is extended to thermal energies using previously reported resonance evaluations thereby providing a complete file for neutronic calculations. Integral data tests indicated that the file was suitable for neutronic calculations in the MeV range

  10. 29 CFR 24.103 - Filing of retaliation complaint.

    Science.gov (United States)

    2010-07-01

    ... violations. (c) Place of Filing. The complaint should be filed with the OSHA Area Director responsible for... by an employer in violation of any of the statutes listed in § 24.100(a) may file, or have filed by... with any OSHA officer or employee. Addresses and telephone numbers for these officials are set forth in...

  11. 76 FR 16404 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-03-23

    ...: Amendment to MBR Tariff to be effective 5/13/2011. Filed Date: 03/14/2011. Accession Number: 20110314-5272... tariff filing per 35: Amendment to MBR Tariff to be effective 5/13/ 2011. Filed Date: 03/14/2011.... Subsidiary No. 2, Inc. submits tariff filing per 35: Amendment to MBR Tariff to be effective 5/13/ 2011...

  12. 77 FR 33206 - Combined Notice of Filings #2

    Science.gov (United States)

    2012-06-05

    ... tariff filing per 35: High Trail Wind Farm First Revised MBR to be effective 5/26/2012. Filed Date: 5/25... per 35: Old Trail Wind Farm First Revised MBR to be effective 5/26/2012. Filed Date: 5/25/12... First Revised MBR to be effective 6/1/2012. Filed Date: 5/25/12. Accession Number: 20120525-5088...

  13. 75 FR 5780 - Claverack Creek, LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting...

    Science.gov (United States)

    2010-02-04

    ... sea level; (3) an existing turbine with a new generator and a new turbine- generator with a total capacity of 450 kilowatts; (4) an existing 10- foot-wide, 8-foot-deep intake canal; (5) new trash racks... Commission's Web site under the ``eFiling'' link. If unable to be filed electronically, documents may be...

  14. The effect of on-line hemodiafiltration on improving the cardiovascular function parameters in children on regular dialysis

    Directory of Open Access Journals (Sweden)

    Fatina I Fadel

    2015-01-01

    Full Text Available Cardiovascular disease is an important cause of morbidity and accounts for almost 50% of deaths in patients undergoing maintenance dialysis. Many harmful molecules of the uremic milieu, such as the middle molecules, are difficult to remove by conventional hemodialysis (HD). On-line hemodiafiltration (OL-HDF) can achieve a considerable clearance of middle molecules and, together with its sterile ultrapure infusate, may have favorable effects on inflammation and cardiovascular complications. We aimed in this study to assess the effect of OL-HDF on improving the chronic inflammatory state associated with chronic kidney disease and the possible impact of these changes on myocardial function in children on chronic HD. Thirty pediatric patients [12 (40%) males and 18 (60%) females with a mean age of 11.3 ± 3.2 years] on conventional HD for at least six months were switched to OL-HDF for six months. Variables for comparison at the end of each period included the levels of serum C-reactive protein and Kt/V as well as electrocardiography and echocardiographic measurements, including the left ventricular mass index (LVMI). On changing from HD to OL-HDF, there was a significant decrease in hs-CRP [from 7.9 ± 8.9 to 3.4 ± 3 μg/mL (P = 0.01)] and in the frequency of diastolic dysfunction (P = 0.04), while systolic function (FS and EF) improved significantly (P = 0.007 and 0.05, respectively); LVMI did not change. We conclude that OL-HDF was well tolerated in children, with improvement of the systolic function of the myocardium and of the overall frequency of diastolic dysfunction.

  15. Lessons Learned in Deploying the World s Largest Scale Lustre File System

    Energy Technology Data Exchange (ETDEWEB)

    Dillow, David A [ORNL; Fuller, Douglas [ORNL; Wang, Feiyi [ORNL; Oral, H Sarp [ORNL; Zhang, Zhe [ORNL; Hill, Jason J [ORNL; Shipman, Galen M [ORNL

    2010-01-01

    The Spider system at the Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) is the world's largest scale Lustre parallel file system. Envisioned as a shared parallel file system capable of delivering both the bandwidth and capacity requirements of the OLCF's diverse computational environment, the project had a number of ambitious goals. To support the workloads of the OLCF's diverse computational platforms, the aggregate performance and storage capacity of Spider exceed those of our previously deployed systems by factors of 6x (240 GB/sec) and 17x (10 petabytes), respectively. Furthermore, Spider supports over 26,000 clients concurrently accessing the file system, which exceeds our previously deployed systems by nearly 4x. In addition to these scalability challenges, moving to a center-wide shared file system required dramatically improved resiliency and fault-tolerance mechanisms. This paper details our efforts in designing, deploying, and operating Spider. Through a phased approach of research and development, prototyping, deployment, and transition to operations, this work has resulted in a number of insights into large-scale parallel file system architectures, from both the design and the operational perspectives. We present in this paper our solutions to issues such as network congestion, performance baselining and evaluation, file system journaling overheads, and high availability in a system with tens of thousands of components. We also discuss areas of continued challenges, such as stressed metadata performance and the need for file system quality of service, along with our efforts to address them. Finally, operational aspects of managing a system of this scale are discussed along with real-world data and observations.

  16. Index files for Belle II - very small skim containers

    Science.gov (United States)

    Sevior, Martin; Bloomfield, Tristan; Kuhr, Thomas; Ueda, I.; Miyake, H.; Hara, T.

    2017-10-01

    The Belle II experiment[1] employs the root file format[2] for recording data and is investigating the use of “index-files” to reduce the size of data skims. These files contain pointers to the location of interesting events within the total Belle II data set and reduce the size of data skims by 2 orders of magnitude. We implement this scheme on the Belle II grid by recording the parent file metadata and the event location within the parent file. While the scheme works, it is substantially slower than a normal sequential read of standard skim files using default root file parameters. We investigate the performance of the scheme by adjusting the “splitLevel” and “autoflushsize” parameters of the root files in the parent data files.
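    The core idea of such an index-file skim can be sketched without ROOT at all: instead of copying the selected events, the skim records only (parent file, event position) pairs, which are later resolved by random access into the parent files. The following minimal Python sketch (all names and the toy event model are illustrative, not Belle II software) shows why such a skim is two orders of magnitude smaller, and also hints at why reading it is slower than a sequential scan: each entry forces a seek into a parent file.

    ```python
    # Hypothetical sketch of the index-file scheme: a skim stores pointers
    # (parent file name, entry number), not the event data itself.

    def build_index(parent_name, events, predicate):
        """Return tiny index entries for the events passing a selection."""
        return [{"parent": parent_name, "entry": i}
                for i, ev in enumerate(events) if predicate(ev)]

    def read_skim(index, open_parent):
        """Resolve index entries back to events via random access into parents."""
        for item in index:
            events = open_parent(item["parent"])  # seek into the parent file
            yield events[item["entry"]]

    # Toy "parent file" with four events; the skim selects high-energy ones.
    parent_files = {"run1.root": [{"e": 0.3}, {"e": 2.1}, {"e": 0.7}, {"e": 5.0}]}
    idx = build_index("run1.root", parent_files["run1.root"],
                      lambda ev: ev["e"] > 1.0)
    skim = list(read_skim(idx, parent_files.get))
    ```

    In the real system the per-entry seek cost is what the authors tune with the ROOT "splitLevel" and "autoflushsize" parameters of the parent files.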

  17. 77 FR 23474 - Combined Notice of Filings

    Science.gov (United States)

    2012-04-19

    ...: Young Gas Storage Company, Ltd. Description: EBB Notice Categories to be effective 5/15/2012. Filed Date... intervention is necessary to become a party to the proceeding. Filings in Existing Proceedings Docket Numbers... requirements, interventions, protests, and service can be found at: http://www.ferc.gov/docs-filing/efiling...

  18. DMFS: A Data Migration File System for NetBSD

    Science.gov (United States)

    Studenmund, William

    2000-01-01

    I have recently developed DMFS, a Data Migration File System, for NetBSD. This file system provides kernel support for the data migration system being developed by my research group at NASA/Ames. The file system utilizes an underlying file store to provide the file backing, and coordinates user and system access to the files. It stores its internal metadata in a flat file, which resides on a separate file system. This paper will first describe our data migration system to provide a context for DMFS, then it will describe DMFS. It also will describe the changes to NetBSD needed to make DMFS work. Then it will give an overview of the file archival and restoration procedures, and describe how some typical user actions are modified by DMFS. Lastly, the paper will present simple performance measurements which indicate that there is little performance loss due to the use of the DMFS layer.

  19. Design and Implementation of a Metadata-rich File System

    Energy Technology Data Exchange (ETDEWEB)

    Ames, S; Gokhale, M B; Maltzahn, C

    2010-01-19

    Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
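    The graph data model described above can be illustrated with a minimal sketch (class and attribute names are hypothetical, not the QFS API): files and relationships are both first-class objects carrying user-defined attributes, and queries match on those attributes rather than on a directory hierarchy.

    ```python
    # Minimal sketch of a metadata-rich file model: files are nodes with
    # user-defined attributes; relationships are attributed edges.

    class MetadataFS:
        def __init__(self):
            self.files = {}   # file name -> attribute dict
            self.links = []   # (source, target, attribute dict)

        def add_file(self, name, **attrs):
            self.files[name] = attrs

        def add_link(self, src, dst, **attrs):
            self.links.append((src, dst, attrs))

        def query_files(self, **attrs):
            """Files whose attributes match all given key/value pairs."""
            return [n for n, a in self.files.items()
                    if all(a.get(k) == v for k, v in attrs.items())]

        def related(self, name, rel):
            """Targets linked from `name` by a relationship of type `rel`."""
            return [d for s, d, a in self.links
                    if s == name and a.get("type") == rel]

    fs = MetadataFS()
    fs.add_file("run42.dat", experiment="E1", kind="raw")
    fs.add_file("run42.cal", experiment="E1", kind="calibrated")
    fs.add_link("run42.cal", "run42.dat", type="derived-from")
    ```

    A provenance question ("which raw file was this derived from?") becomes a one-edge graph traversal instead of a join against an external database, which is the consistency benefit the paper argues for.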

  20. Beginning Python from novice to professional

    CERN Document Server

    Hetland, Magnus Lie

    2005-01-01

    ""Beginning Python: From Novice to Professional"" is the most comprehensive book on the Python ever written. Based on ""Practical Python,"" this newly revised book is both an introduction and practical reference for a swath of Python-related programming topics, including addressing language internals, database integration, network programming, and web services. Advanced topics, such as extending Python and packaging/distributing Python applications, are also covered. Ten different projects illustrate the concepts introduced in the book. You will learn how to create a P2P file-sharing applicati

  1. 831 Files

    Data.gov (United States)

    Social Security Administration — SSA-831 file is a collection of initial and reconsideration adjudicative level DDS disability determinations. (A few hearing level cases are also present, but the...

  2. 47 CFR 1.1704 - Station files.

    Science.gov (United States)

    2010-10-01

    ... System (COALS) § 1.1704 Station files. Applications, notifications, correspondence, electronic filings... Television Relay Service (CARS) are maintained by the Commission in COALS and the Public Reference Room...

  3. Post-Dilution on Line Haemodiafiltration with Citrate Dialysate: First Clinical Experience in Chronic Dialysis Patients

    Directory of Open Access Journals (Sweden)

    Vincenzo Panichi

    2013-01-01

    Full Text Available Background. Citrate has anticoagulative properties and favorable effects on inflammation, but it carries the potential hazard of inducing hypocalcemia. Bicarbonate dialysate (BHD) in which citrate replaces acetate is now used in chronic haemodialysis but has never been tested in postdilution online haemodiafiltration (OL-HDF). Methods. Thirteen chronic stable dialysis patients were enrolled in a pilot, short-term study. Patients underwent one week (3 dialysis sessions) of BHD with 0.8 mmol/L citrate dialysate, followed by one week of postdilution high-volume OL-HDF with standard bicarbonate dialysate, and one week of high-volume OL-HDF with 0.8 mmol/L citrate dialysate. Results. In citrate OL-HDF, pretreatment plasma levels of C-reactive protein and β2-microglobulin were significantly reduced; intra-treatment plasma acetate levels increased in the former technique and decreased in the latter. During both citrate techniques (OL-HDF and HD), ionized calcium levels remained stable within the normal range. Conclusions. Should our promising results be confirmed in a long-term study on a wider population, OL-HDF with citrate dialysate may represent a further step in improving dialysis biocompatibility.

  4. Spatial Region Estimation for Autonomous CoT Clustering Using Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    Joon‐young Jung

    2018-02-01

    Full Text Available This paper proposes a hierarchical dual filtering (HDF) algorithm to estimate the spatial region between a Cloud of Things (CoT) gateway and an Internet of Things (IoT) device. The accuracy of the spatial region estimation is important for autonomous CoT clustering. We conduct spatial region estimation using a hidden Markov model (HMM) with a raw Bluetooth received signal strength indicator (RSSI). However, the accuracy of the region estimation using the validation data is only 53.8%. To increase the accuracy of the spatial region estimation, the HDF algorithm removes the high-frequency signals hierarchically and alters its parameters according to whether the IoT device moves. The accuracies of spatial region estimation using a raw RSSI, a Kalman filter, and HDF are compared to evaluate the effectiveness of the HDF algorithm. The success rates and root mean square errors (RMSE) over all regions are 0.538, 0.622, and 0.75, and 0.997, 0.812, and 0.5, respectively, when raw RSSI, a Kalman filter, and HDF are used. The HDF algorithm attains the best results in terms of the success rate and RMSE of spatial region estimation using the HMM.
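    The Kalman-filter baseline the study compares against can be sketched as a scalar filter over the RSSI series: a constant-value state model whose estimate is nudged toward each new reading by a gain that balances process noise against measurement noise. The parameters below are illustrative choices, not values from the paper.

    ```python
    # Minimal 1-D Kalman filter over a noisy RSSI series (illustrative
    # noise parameters): smooths out high-frequency spikes like the ones
    # the HDF algorithm removes hierarchically.

    def kalman_smooth(rssi, q=0.01, r=4.0):
        """Return the filtered estimate for each RSSI measurement."""
        x, p = rssi[0], 1.0       # initial state estimate and its variance
        out = []
        for z in rssi:
            p += q                # predict: variance grows by process noise q
            k = p / (p + r)       # Kalman gain vs. measurement noise r
            x += k * (z - x)      # update the estimate toward measurement z
            p *= (1 - k)          # shrink the variance after the update
            out.append(x)
        return out

    readings = [-60, -62, -75, -61, -59, -63, -58]
    smoothed = kalman_smooth(readings)
    ```

    With these settings the isolated -75 dBm spike is damped to roughly -62 dBm, which is exactly the kind of pre-filtering that raises region-classification accuracy over using raw RSSI directly.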

  5. 78 FR 70029 - Combined Notice of Filings #1

    Science.gov (United States)

    2013-11-22

    ..., Ameren Energy Marketing Company, AmerenEnergy Resources Generating Company, AmerenEnergy Medina Valley... filing per 35: Integrated Marketplace Second Compliance Filing to be effective 3/ 1/2014. Filed Date: 11...

  6. Design and creation of a direct access nuclear data file

    International Nuclear Information System (INIS)

    Charpentier, P.

    1981-06-01

    General considerations on the structure of instructions and files are reviewed. Design, organization and mode of use of the different files: instruction file, index files, inverted files, automatic analysis and inquiry programs are examined [fr

  7. 37 CFR 11.41 - Filing of papers.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Filing of papers. 11.41... Disciplinary Proceedings; Jurisdiction, Sanctions, Investigations, and Proceedings § 11.41 Filing of papers. (a... papers filed after the complaint and prior to entry of an initial decision by the hearing officer shall...

  8. 78 FR 13050 - Combined Notice of Filings

    Science.gov (United States)

    2013-02-26

    ... be considered, but intervention is necessary to become a party to the proceeding. Filings in Existing Proceedings Docket Numbers: RP13-106-002. Applicants: Young Gas Storage Company, Ltd. Description: Young NAESB..., protests, and service can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf . For other...

  9. Software for Managing Personal Files.

    Science.gov (United States)

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  10. 75 FR 62522 - Combined Notice of Filings No. 3

    Science.gov (United States)

    2010-10-12

    ... filing per 154.203: NAESB EDI Form Filing to be effective 11/1/ 2010. Filed Date: 09/30/2010. Accession....9 EDI Form to be effective 11/1/2010. Filed Date: 09/30/2010. Accession Number: 20100930-5348...

  11. MR-AFS: a global hierarchical file-system

    International Nuclear Information System (INIS)

    Reuter, H.

    2000-01-01

    The next generation of fusion experiments will use object-oriented technology, creating the need for world-wide sharing of an underlying hierarchical file-system. The Andrew File System (AFS) is a well-known and widely deployed global distributed file-system. Multiple-Resident AFS (MR-AFS) combines the features of AFS with hierarchical storage management systems. Files in MR-AFS may therefore be migrated to secondary storage, such as robotic tape libraries. MR-AFS is in use at IPP for the current experiments and for data originating from super-computer applications. Experiences and scalability issues are discussed

  12. 29 CFR 801.71 - Filing and service.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Filing and service. 801.71 Section 801.71 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR OTHER LAWS APPLICATION OF THE... and Order of Administrative Law Judge § 801.71 Filing and service. (a) Filing. All documents submitted...

  13. An analysis of file system and installation of the file management system for NOS operating system

    International Nuclear Information System (INIS)

    Lee, Young Jai; Park, Sun Hee; Hwang, In Ah; Kim, Hee Kyung

    1992-06-01

    In this technical report, we analyze the NOS file structure for the Cyber 170-875 and Cyber 960-31 computer systems. We also describe the functions, procedures, operation, and use of VDS. VDS is used to manage large files effectively on Cyber computer systems. The purpose of the VDS installation is to increase the virtual disk storage by utilizing magnetic tape, to assist the users of the computer system in managing their files, and to enhance the performance of the KAERI Cyber computer system. (Author)

  14. DIGITAL ONCOLOGY PATIENT RECORD - HETEROGENEOUS FILE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    Nikolay Sapundzhiev

    2010-12-01

    Full Text Available Introduction: Oncology patients need extensive follow-up and meticulous documentation. The aim of this study was to introduce a simple, platform-independent, file-based system for documentation of diagnostic and therapeutic procedures in oncology patients and to test its function. Material and methods: A file-name based system of the type M1M2M3.F2 was introduced, where M1 is a unique identifier for the patient, M2 is the date of the clinical intervention/event, M3 is an identifier for the author of the medical record, and F2 is the software-generated file-name extension. Results: This system is in use at 5 institutions, where a total of 11 persons on 14 different workstations inputted 16591 entries (files) for 2370 patients. The merge process was tested on 2 operating systems: when copied together, all files sort as expected by patient and, for each patient, in chronological order, providing a digital cumulative patient record that contains heterogeneous file formats. Conclusion: The file-based approach to storing heterogeneous digital patient-related information is a reliable system that can handle open-source, proprietary, general, and custom file formats and seems to be easily scalable. Further development of software for automatic integrity checks and for searching and indexing of the files is expected to produce a more user-friendly environment.
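    The M1M2M3.F2 scheme works because a zero-padded patient ID followed by an ISO-style date makes plain lexicographic file-name sorting coincide with "group by patient, then chronological order". A small sketch (field widths and IDs here are illustrative assumptions, not the study's exact convention):

    ```python
    from datetime import date

    # Sketch of M1M2M3.F2 naming: patient ID (M1), event date (M2),
    # author ID (M3), software-generated extension (F2).

    def record_name(patient_id, event_date, author_id, ext):
        # Zero-pad the patient ID so lexicographic order matches numeric
        # order; YYYYMMDD dates sort chronologically as strings.
        return f"{patient_id.zfill(6)}{event_date:%Y%m%d}{author_id}.{ext}"

    names = [
        record_name("124", date(2010, 3, 5), "DR2", "doc"),
        record_name("124", date(2009, 11, 1), "DR1", "jpg"),
        record_name("87", date(2010, 1, 9), "DR1", "pdf"),
    ]
    # The "merge" step from the paper: copy files together and sort names.
    merged = sorted(names)
    ```

    After the merge, patient 000087's record precedes both of patient 000124's, and within patient 000124 the 2009 image precedes the 2010 document, regardless of which workstation or file format produced each entry.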

  15. ENDF/B-5. Fission Product Yields File

    International Nuclear Information System (INIS)

    Schwerer, O.

    1985-10-01

    The ENDF/B-5 Fission Product Yields File contains a complete set of independent and cumulative fission product yields, representing the final data from ENDF/B-5 as received at the IAEA Nuclear Data Section in June 1985. Yields for 11 fissioning nuclides at one or more incident neutron energies are included. The data are available cost-free on magnetic tape from the IAEA Nuclear Data Section. (author). 4 refs

  16. Grammar-Based Specification and Parsing of Binary File Formats

    Directory of Open Access Journals (Sweden)

    William Underwood

    2012-03-01

    Full Text Available The capability to validate and view or play binary file formats, as well as to convert binary file formats to standard or current file formats, is critically important to the preservation of digital data and records. This paper describes the extension of context-free grammars from strings to binary files. Binary files are arrays of data types, such as long and short integers, floating-point numbers, and pointers, as well as characters. The concept of an attribute grammar is extended to these context-free array grammars. This attribute grammar has been used to define a number of chunk-based and directory-based binary file formats. A parser generator has been used with some of these grammars to generate syntax checkers (recognizers) for validating binary file formats. Among the potential benefits of an attribute-grammar-based approach to the specification and parsing of binary file formats is that attribute grammars not only support format validation, but also generation of error messages during validation of format, validation of semantic constraints, attribute-value extraction (characterization), generation of viewers or players for file formats, and conversion to current or standard file formats. The significance of these results is that, with these extensions to core computer science concepts, traditional parser/compiler technologies can potentially be used as part of a general, cost-effective curation strategy for binary file formats.
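    A chunk-based binary format of the kind such grammars describe can be validated by a hand-written recognizer in a few lines; the grammar-based approach generates this kind of checker automatically. The sketch below parses an invented tag/length/payload format (the format itself is illustrative, not one from the paper) and rejects malformed input, mirroring the "syntax checker" role described above.

    ```python
    import struct

    # Minimal recognizer for an invented chunk-based binary format:
    # each chunk = 4-byte ASCII tag, little-endian 4-byte length, payload.

    def parse_chunks(data):
        """Yield (tag, payload) pairs; raise ValueError on malformed input."""
        pos = 0
        while pos < len(data):
            if pos + 8 > len(data):
                raise ValueError("truncated chunk header")
            tag, length = struct.unpack_from("<4sI", data, pos)
            pos += 8
            if pos + length > len(data):
                raise ValueError("chunk payload overruns file")
            yield tag.decode("ascii"), data[pos:pos + length]
            pos += length

    # A well-formed two-chunk blob, then parse it.
    blob = (b"HDR\x00" + struct.pack("<I", 2) + b"v1" +
            b"DATA" + struct.pack("<I", 3) + b"\x01\x02\x03")
    chunks = list(parse_chunks(blob))
    ```

    An attribute grammar adds to this recognizer the semantic checks (e.g. that a declared count matches the payload) and the extraction of attribute values that the paper lists as further benefits.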

  17. SWATH label-free proteomics analyses revealed the roles of oxidative stress and antioxidant defensing system in sclerotia formation of Polyporus umbellatus

    Science.gov (United States)

    Li, Bing; Tian, Xiaofang; Wang, Chunlan; Zeng, Xu; Xing, Yongmei; Ling, Hong; Yin, Wanqiang; Tian, Lixia; Meng, Zhixia; Zhang, Jihui; Guo, Shunxing

    2017-01-01

    Understanding the initiation and maturation mechanisms is important for rationally manipulating sclerotia differentiation and growth from hyphae of Polyporus umbellatus. Proteomes in P. umbellatus sclerotia and hyphae at the initial, developmental, and mature phases were studied. 1391 proteins were identified by nano-liquid chromatography-mass spectrometry (LC-MS) in Data-Dependent Acquisition mode, and 1234 proteins were quantified successfully by Sequential Window Acquisition of all THeoretical fragment ion spectra-MS (SWATH-MS) technology. There were 347 differentially expressed proteins (DEPs) in sclerotia at the initial phase compared with those in hyphae, and the DEP profiles changed dynamically with sclerotia growth. Oxidative stress (OS) in sclerotia at the initial phase was indicated by the repression of respiratory-chain and tricarboxylic acid cycle proteins and the activation of glycolysis/gluconeogenesis pathways, as determined from the DEPs. The impact of glycolysis/gluconeogenesis on sclerotium induction was further verified by glycerol addition assays, in which 5% glycerol significantly increased the sclerotial differentiation rate and biomass. It can be speculated that OS plays an essential role in triggering sclerotia differentiation from hyphae of P. umbellatus, whereas antioxidant activity associated with glycolysis is critical for sclerotia growth. These findings reveal a mechanism for sclerotial differentiation in P. umbellatus, which may also be applicable to other fungi.

  18. 75 FR 49918 - Combined Notice of Filings # 1

    Science.gov (United States)

    2010-08-16

    ... Missouri Operations Company submits tariff filing per 35: KCP&L-GMO Baseline Compliance Filing to be.... Description: KCP&L Greater Missouri Operations Company submits tariff filing per 35.12: KCP&L-GMO OATT Volume...

  19. Dynamic Non-Hierarchical File Systems for Exascale Storage

    Energy Technology Data Exchange (ETDEWEB)

    Long, Darrell E. [Univ. of California, Santa Cruz, CA (United States); Miller, Ethan L [Univ. of California, Santa Cruz, CA (United States)

    2015-02-24

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications. To achieve this goal we proposed: to develop the first HEC-targeted file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community: while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40-year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search

  20. 77 FR 23241 - Lock+ Hydro Friends Fund XXX, LLC; Notice of Intent To File License Application, Filing of Pre...

    Science.gov (United States)

    2012-04-18

    ... Friends Fund XXX, LLC; Notice of Intent To File License Application, Filing of Pre-Application Document.... Date Filed: February 20, 2012. d. Submitted By: Lock+ Hydro Friends Fund XXX, LLC. e. Name of Project... Applicant Contact: Mr. Mark R. Stover, Lock+\\TM\\ Hydro Friends Fund XXX, c/o Hydro Green Energy, LLC, 900...