WorldWideScience

Sample records for data files

  1. Decay data file based on the ENSDF file

    Energy Technology Data Exchange (ETDEWEB)

    Katakura, J. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]

    1997-03-01

    A decay data file in the JENDL (Japanese Evaluated Nuclear Data Library) format, based on the ENSDF (Evaluated Nuclear Structure Data File) file, was produced as a tentative special-purpose file of JENDL. Problems encountered in using the ENSDF file as the primary data source for the JENDL decay data file are presented. (author)

  2. JNDC FP decay data file

    International Nuclear Information System (INIS)

    Yamamoto, Tohru; Akiyama, Masatsugu

    1981-02-01

    The decay data file for fission product nuclides (FP DECAY DATA FILE) has been prepared for summation calculations of the decay heat of fission products. The average energies released in β- and γ-transitions have been calculated with the computer code PROFP. The calculated results and necessary information have been arranged in tabular form, together with estimated results for 470 nuclides for which experimental decay data are not available. (author)

  3. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally, data analysis happens via batch processing and interactive work at the terminal. This project aims to provide another way of analyzing data files: a cloud-based approach. It aims to offer a productive and interactive environment through the combination of FCC and SWAN software.

  4. Contents of GPS Data Files

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, John P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Carver, Matthew Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Norman, Benjamin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2016-12-09

    There are no very detailed descriptions of most of these instruments in the literature; we will attempt to fix that problem in the future. The BDD instruments are described in [1]. One of the dosimeter instruments on CXD boxes is described in [2]. These documents (or web links to them) and a few others are in this directory tree. The cross calibration of the CXD electron data with RBSP is described in [3]. Each row in the data file contains the data from one time bin from a CXD or BDD instrument, along with a variety of parameters derived from the data. Time steps are commandable, but 4 minutes is a typical setting. These instruments are on many (but not all) GPS satellites currently in operation. The data come from BDD instruments on GPS Block IIR satellites (SVN41 and 48), CXD-IIR instruments on GPS Block IIR and IIR-M satellites (SVN53-61), or CXD-IIF instruments on GPS Block IIF satellites (SVN62-73). The CXD-IIR instruments on Block IIR and IIR-M satellites use the same design.

  5. Evaluated nuclear data file of Th-232

    International Nuclear Information System (INIS)

    Meadows, J.; Poenitz, W.; Smith, A.; Smith, D.; Whalen, J.; Howerton, R.

    1977-09-01

    An evaluated nuclear data file for thorium is described. The file extends over the energy range 0.049 MeV (i.e., the inelastic-scattering threshold) to 20.0 MeV and is formulated within the framework of the ENDF system. The input data base, the evaluation procedures and judgments, and ancillary experiments carried out in conjunction with the evaluation are outlined. The file includes: neutron total cross sections, neutron scattering processes, neutron radiative capture cross sections, fission cross sections, (n,2n) and (n,3n) processes, fission properties (e.g., nu-bar and delayed neutron emission), and photon production processes. Regions of uncertainty are pointed out, particularly where new measured results would be of value. The file is extended to thermal energies using previously reported resonance evaluations, thereby providing a complete file for neutronic calculations. Integral data tests indicated that the file is suitable for neutronic calculations in the MeV range.

  6. Evaluated nuclear-data file for niobium

    International Nuclear Information System (INIS)

    Smith, A.B.; Smith, D.L.; Howerton, R.J.

    1985-03-01

    A comprehensive evaluated nuclear-data file for elemental niobium is provided in the ENDF/B format. This file, extending over the energy range 10⁻¹¹ to 20 MeV, is suitable for comprehensive neutronic calculations, particularly those dealing with fusion-energy systems. It also provides dosimetry information. Attention is given to the internal consistency of the file, energy balance, and the quantitative specification of uncertainties. Comparisons are made with experimental data and previous evaluated files. The results of integral tests are described, and remaining outstanding problem areas are cited. 107 refs

  7. Benchmark comparisons of evaluated nuclear data files

    International Nuclear Information System (INIS)

    Resler, D.A.; Howerton, R.J.; White, R.M.

    1994-05-01

    With the availability and maturity of several evaluated nuclear data files, it is timely to compare the results of integral tests with calculations using these different files. We discuss here our progress in making integral benchmark tests of the following nuclear data files: ENDL-94, ENDF/B-V and -VI, JENDL-3, JEF-2, and BROND-2. The methods used to process these evaluated libraries in a consistent way into applications files for use in Monte Carlo calculations are presented. Using these libraries, we are calculating and comparing to experiment k_eff for 68 fast critical assemblies of ²³³,²³⁵U and ²³⁹Pu with reflectors of various materials and thicknesses.

  8. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
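
    As a rough illustration of the sampling idea (not the published pfff implementation; the block size, sample count, and hash choice below are assumptions for the sketch), a fingerprint can be computed by hashing a fixed number of pseudo-randomly placed blocks, so the cost stays flat no matter how large the file grows:

      import hashlib
      import os
      import random

      def sampled_fingerprint(path, samples=64, block=1024, seed=42):
          """Hash `samples` blocks drawn from deterministic pseudo-random
          offsets instead of reading the whole file (PFFF-style sketch)."""
          size = os.path.getsize(path)
          digest = hashlib.sha1(str(size).encode())   # mix the length into the digest
          rng = random.Random(seed)                   # fixed seed -> comparable fingerprints
          if size > block:
              offsets = sorted(rng.randrange(size - block) for _ in range(samples))
          else:
              offsets = [0]
          with open(path, "rb") as f:
              for off in offsets:
                  f.seek(off)
                  digest.update(f.read(block))
          return digest.hexdigest()

    Two files are then compared by their digests alone; as the abstract stresses, the collision risk depends on the variation present in the data, so this is a screening technique rather than a cryptographic guarantee.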

  9. ENSDF: The evaluated nuclear structure data file

    International Nuclear Information System (INIS)

    Martin, M.J.

    1986-01-01

    The structure, organization, and contents of the Evaluated Nuclear Structure Data File, ENSDF, will be discussed. This file summarizes the state of experimental nuclear structure data for all nuclei as determined from consideration of measurements reported worldwide. Special emphasis will be given to the data evaluation procedures and consistency checks utilized at the input stage and to the retrieval capabilities of the system at the output stage.

  10. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Information systems (i.e., servers, applications and communication devices) create a large amount of monitoring data that is saved as log files. A data-mining approach is helpful for analyzing them. This article presents the steps necessary for creating an 'analyzing instrument' based on the open source software Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.
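
    As a hedged sketch of the preprocessing step such an 'analyzing instrument' needs, the snippet below turns log lines into Weka's ARFF input format; the tab-separated timestamp/source/message layout is a hypothetical example, not the format of any specific Windows log:

      def log_to_arff(log_path, arff_path):
          """Convert 'timestamp<TAB>source<TAB>message' lines into an ARFF
          file that Weka can load (hypothetical field layout)."""
          with open(log_path) as src, open(arff_path, "w") as dst:
              dst.write("@relation syslog\n")
              for name in ("timestamp", "source", "message"):
                  dst.write("@attribute %s string\n" % name)
              dst.write("@data\n")
              for line in src:
                  parts = line.rstrip("\n").split("\t", 2)
                  if len(parts) == 3:    # skip malformed lines
                      dst.write(",".join("'%s'" % p.replace("'", "\\'") for p in parts) + "\n")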

  11. Identifiable Data Files - Health Outcomes Survey (HOS)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Health Outcomes Survey (HOS) identifiable data files are comprised of the entire national sample for a given 2-year cohort (including both respondents...

  12. Identifiable Data Files - Medicare Provider Analysis and ...

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Provider Analysis and Review (MEDPAR) File contains data from claims for services provided to beneficiaries admitted to Medicare certified inpatient...

  13. A basic evaluated neutronic data file for elemental scandium

    International Nuclear Information System (INIS)

    Smith, A.B.; Meadows, J.W.; Howerton, R.J.

    1992-01-01

    This report documents an evaluated neutronic data file for elemental scandium, presented in the ENDF/B-VI format. This file should provide basic nuclear data essential for neutronic calculations involving elemental scandium. No equivalent file was previously available

  14. Fast probabilistic file fingerprinting for big data

    NARCIS (Netherlands)

    Tretjakov, K.; Laur, S.; Smant, G.; Vilo, J.; Prins, J.C.P.

    2013-01-01

    Background: Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily

  15. DATA Act File C Award Financial - Social Security

    Data.gov (United States)

    Social Security Administration — The DATA Act Information Model Schema Reporting Submission Specification File C. File C includes the agency award information from the financial accounting system at...

  16. Library of files of evaluated neutron data

    International Nuclear Information System (INIS)

    Blokhin, A.I.; Ignatyuk, A.V.; Koshcheev, V.N.; Kuz'minov, B.D.; Manokhin, V.N.; Manturov, G.N.; Nikolaev, M.N.

    1988-01-01

    The development of a library of evaluated neutron data files, recommended by the GKAE Nuclear Data Commission as the basis for improving constants systems in neutron engineering calculations, is reported. A short description of the library's contents is given and the status of the library is noted.

  17. An evaluated neutronic data file for bismuth

    International Nuclear Information System (INIS)

    Guenther, P.T.; Lawson, R.D.; Meadows, J.W.; Smith, A.B.; Smith, D.L.; Sugimoto, M.; Howerton, R.J.

    1989-11-01

    A comprehensive evaluated neutronic data file for bismuth, extending from 10⁻⁵ eV to 20.0 MeV, is described. The experimental database, the application of the theoretical models, and the evaluation rationale are outlined. Attention is given to uncertainty specification, and comparisons are made with the prior ENDF/B-V evaluation. The corresponding numerical file, in ENDF/B-VI format, has been transmitted to the National Nuclear Data Center, Brookhaven National Laboratory. 106 refs., 10 figs., 6 tabs

  18. Nuclear plant fire incident data file

    International Nuclear Information System (INIS)

    Sideris, A.G.; Hockenbury, R.W.; Yeater, M.L.; Vesely, W.E.

    1979-01-01

    A computerized nuclear plant fire incident data file was developed by American Nuclear Insurers and was further analyzed by Rensselaer Polytechnic Institute with technical and monetary support provided by the Nuclear Regulatory Commission. Data on 214 fires that occurred at nuclear facilities have been entered in the file. A computer program has been developed to sort the fire incidents according to various parameters. The parametric sorts that are presented in this article are significant since they are the most comprehensive statistics presently available on fires that have occurred at nuclear facilities

  19. Reactor fuel performance data file, 1985 edition

    International Nuclear Information System (INIS)

    Harayama, Yasuo; Fujita, Misao; Watanabe, Kohji.

    1986-07-01

    In safety evaluation and integrity studies of reactor fuel, data on fuel performance are the most basic material. The Fuel Reliability Laboratory No. 1 has obtained fuel performance data by joining international programs to study the safety and integrity of fuel. Those data have only been used for studies in the above two fields. However, if the data are rearranged and compiled in an easily usable form, they can be utilized in other fields of study. Therefore, a 'data file' on fuel performance is being compiled by adding data from the open literature to those obtained in international programs. The present report is prepared on the basis of the data file compiled as of March 1986. (author)

  20. Sandia Data Archive (SDA) file specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ao, Tommy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    The Sandia Data Archive (SDA) format is a specific implementation of the HDF5 (Hierarchical Data Format version 5) standard. The format was developed for storing data in a universally accessible manner. SDA files may contain one or more data records, each associated with a distinct text label. Primitive records provide basic data storage, while compound records support more elaborate grouping. External records allow text/binary files to be carried inside an archive and later recovered. This report documents version 1.0 of the SDA standard. The information provided here is sufficient for reading from and writing to an archive. Although the format was originally designed for use in MATLAB, broader use is encouraged.
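
    Because SDA is an HDF5 profile, a standard HDF5 reader can open an archive. Below is a minimal sketch with h5py, assuming only that each record appears as a top-level entry keyed by its text label; the 'RecordType' attribute name is an illustrative guess, not quoted from the specification:

      import h5py

      def list_sda_records(path):
          """Print the labeled records stored in an SDA archive (an HDF5 file)."""
          with h5py.File(path, "r") as archive:
              for label in archive:                   # one top-level entry per record
                  attrs = dict(archive[label].attrs)  # record metadata lives in attributes
                  print(label, "->", attrs.get("RecordType", "unknown"))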

  1. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management, and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  2. Skyshine analysis using various nuclear data files

    Energy Technology Data Exchange (ETDEWEB)

    Zharkov, V.P.; Dikareva, O.F.; Kartashev, I.A.; Kiselev, A.N. [Research and Development Inst. of Power Engineering, Moscow (Russian Federation); Nomura, Y.; Tsubosaka, A. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan)

    2000-03-01

    The spatial distributions of neutron and secondary-photon dose rates, thermal neutron fluxes, and space-energy distributions of neutrons and photons near the air-ground interface were calculated with the MCNP and DORT codes. Different nuclear data files were used (ENDF/B-IV, ENDF/B-VI, FENDL-2, JENDL-3.2). Either the standard pointwise libraries (MCNP) or special libraries prepared with the NJOY code from ENDF/B and other files were used. The multigroup coupled neutron-photon cross-section libraries prepared for the DORT code had the CASK 40-group energy structure. The libraries contain pointwise or multigroup cross-section data for all elements included in the atmosphere and ground compositions. The calculated results were validated using the experimental data obtained from a series of measurements at the RA reactor. (author)

  3. Skyshine analysis using various nuclear data files

    International Nuclear Information System (INIS)

    Zharkov, V.P.; Dikareva, O.F.; Kartashev, I.A.; Kiselev, A.N.; Nomura, Y.; Tsubosaka, A.

    2000-01-01

    The spatial distributions of neutron and secondary-photon dose rates, thermal neutron fluxes, and space-energy distributions of neutrons and photons near the air-ground interface were calculated with the MCNP and DORT codes. Different nuclear data files were used (ENDF/B-IV, ENDF/B-VI, FENDL-2, JENDL-3.2). Either the standard pointwise libraries (MCNP) or special libraries prepared with the NJOY code from ENDF/B and other files were used. The multigroup coupled neutron-photon cross-section libraries prepared for the DORT code had the CASK 40-group energy structure. The libraries contain pointwise or multigroup cross-section data for all elements included in the atmosphere and ground compositions. The calculated results were validated using the experimental data obtained from a series of measurements at the RA reactor. (author)

  4. Central Personnel Data File (CPDF) Status Data

    Data.gov (United States)

    Office of Personnel Management — Precursor to the Enterprise Human Resources Integration-Statistical Data Mart (EHRI-SDM). It contains data about the employee and their position, along with various...

  5. Activation cross section data file, (1)

    International Nuclear Information System (INIS)

    Yamamuro, Nobuhiro; Iijima, Shungo.

    1989-09-01

    To evaluate radioisotope production due to neutron irradiation in fission or fusion reactors, data on activation cross sections must be provided. It is planned to file more than 2000 activation cross sections in the final version. In the current year, the neutron cross sections for 14 elements from Ni to W have been calculated and evaluated in the energy range 10⁻⁵ eV to 20 MeV. The calculations with the simplified-input nuclear cross section calculation system SINCROS are described, and another method of evaluation, consistent with JENDL-3, is also mentioned. The results of the cross section calculations are in good agreement with experimental data, and they were stored in files 8, 9 and 10 of the ENDF/B format. (author)

  6. DMFS: A Data Migration File System for NetBSD

    Science.gov (United States)

    Studenmund, William

    2000-01-01

    I have recently developed DMFS, a Data Migration File System, for NetBSD. This file system provides kernel support for the data migration system being developed by my research group at NASA/Ames. The file system utilizes an underlying file store to provide the file backing, and coordinates user and system access to the files. It stores its internal metadata in a flat file, which resides on a separate file system. This paper will first describe our data migration system to provide a context for DMFS, then it will describe DMFS. It also will describe the changes to NetBSD needed to make DMFS work. Then it will give an overview of the file archival and restoration procedures, and describe how some typical user actions are modified by DMFS. Lastly, the paper will present simple performance measurements which indicate that there is little performance loss due to the use of the DMFS layer.

  7. Silvabase: A flexible data file management system

    Science.gov (United States)

    Lambing, Steven J.; Reynolds, Sandra J.

    1991-01-01

    The need for a more flexible and efficient data file management system for mission planning in the Mission Operations Laboratory (EO) at MSFC has spawned the development of Silvabase. Silvabase is a new data file structure based on a B+ tree data structure. This data organization allows efficient forward and backward sequential reads, random searches, and appends to existing data. It also provides random insertions and deletions with reasonable efficiency, uses storage space well without sacrificing speed, and performs these functions on large volumes of data. Mission planners required that some data be keyed and manipulated in ways not found in commercial products. Mission planning software is currently being converted to use Silvabase in the Spacelab and Space Station Mission Planning Systems. Silvabase runs on Digital Equipment Corporation's popular VAX/VMS computers and is written in VAX Fortran. Silvabase has unique features involving time histories and intervals, such as in operations research. Because of its flexibility and unique capabilities, Silvabase could be used in almost any government or commercial application that requires efficient reads, searches, and appends on medium to large amounts of almost any kind of data.

  8. Supplemental Security Income Public-Use Microdata File, 2001 Data

    Data.gov (United States)

    Social Security Administration — The SSI Public-Use Microdata File contains an extract of data fields from SSA's Supplemental Security Record file and consists of a 5 percent random, representative...

  9. JENDL special purpose data files and related nuclear data

    International Nuclear Information System (INIS)

    Iijima, Shungo

    1989-01-01

    The objective of the JENDL Special Purpose Data Files under development is the application of nuclear data to the evaluation of the fuel cycle, nuclear activation, and radiation damage. The planned files consist of nine types of data: actinide cross sections, decay data, activation cross sections, (α,n) cross sections, photo-reaction cross sections, dosimetry cross sections, gas production cross sections, primary knock-on atom spectra and KERMA factors, and data for standards. The status of the compilation and evaluation of these data is briefly reviewed. In particular, the features of the data required for the evaluation of the activation cross sections, (α,n) cross sections, photo-reaction cross sections, and PKA data are discussed in some detail. The need for a realistic definition of the scope of the work is emphasized. (author)

  10. Evaluated Nuclear Structure Data File (ENSDF)

    International Nuclear Information System (INIS)

    Bhat, M.R.

    1991-01-01

    The Evaluated Nuclear Structure Data File (ENSDF) is maintained by the National Nuclear Data Center (NNDC) on behalf of the international Nuclear Structure and Decay Data (NSDD) network organized under the auspices of the International Atomic Energy Agency. ENSDF provides evaluated experimental nuclear structure and decay data for basic and applied research. The activities of the NSDD network, the publication of the evaluations, and their use in different applications are described. Since 1986, the ENSDF and related numeric and bibliographic data bases have been made available for on-line access. The current status of these data bases and future plans to improve on-line access to their contents are discussed. 8 refs., 4 tabs

  11. An evaluated neutronic data file for elemental zirconium

    International Nuclear Information System (INIS)

    Smith, A.B.; Chiba, S.

    1994-09-01

    A comprehensive evaluated neutronic data file for elemental zirconium is derived and presented in the ENDF/B-VI formats. The derivation is based upon measured microscopic nuclear data, augmented by model calculations as necessary. The primary objective is a quality contemporary file suitable for fission-reactor development extending from conventional thermal to fast and innovative systems. This new file is a significant improvement over previously available evaluated zirconium files, in part, as a consequence of extensive new experimental measurements reported elsewhere

  12. Data_files_Reyes_EHP_phthalates

    Data.gov (United States)

    U.S. Environmental Protection Agency — The dataset contains three files in comma-separated values (.csv) format. “Reyes_EHP_Phthalates_US_metabolites.csv” contains information about the National Health and...

  13. Titanium-II: an evaluated nuclear data file

    International Nuclear Information System (INIS)

    Philis, C.; Howerton, R.; Smith, A.B.

    1977-06-01

    A comprehensive evaluated nuclear data file for elemental titanium is outlined including definition of the data base, the evaluation procedures and judgments, and the final evaluated results. The file describes all significant neutron-induced reactions with elemental titanium and the associated photon-production processes to incident neutron energies of 20.0 MeV. In addition, isotopic-reaction files, consistent with the elemental file, are separately defined for those processes which are important to applied considerations of material-damage and neutron-dosimetry. The file is formulated in the ENDF format. This report formally documents the evaluation and, together with the numerical file, is submitted for consideration as a part of the ENDF/B-V evaluated file system. 20 figures, 9 tables

  14. Adding Data Management Services to Parallel File Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Scott [Univ. of California, Santa Cruz, CA (United States)

    2015-03-04

    The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstraction and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit, and the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file

  15. Status of the evaluated nuclear structure data file

    International Nuclear Information System (INIS)

    Martin, M.J.

    1991-01-01

    The structure, organization, and contents of the Evaluated Nuclear Structure Data File (ENSDF) are discussed in this paper. This file contains a summary of the state of experimental nuclear structure data for all nuclides as determined from consideration of measurements reported worldwide in the literature. Special emphasis is given to the data evaluation procedures, the consistency checks, and the quality control utilized at the input stage and to the retrieval capabilities of the system at the output stage. Recent enhancements of the on-line interaction with the file contents are addressed, as well as procedural changes that will improve the currency of the file.

  16. Development of EDFSRS: evaluated data files storage and retrieval system

    International Nuclear Information System (INIS)

    Hasegawa, Akira

    1985-07-01

    EDFSRS, the Evaluated Data Files Storage and Retrieval System, has been developed as a complete service system for the evaluated nuclear data files compiled in the three major formats: ENDF/B, UKNDL and KEDAK. The system is intended to give database administrators efficient loading and maintenance of evaluated nuclear data files, and to give users efficient retrievals with both ease and confidence. It can give users all of the information available in these three major formats. The system consists of more than fifteen independent programs and some 150 megabytes of data files and index files (the database) of the loaded data. In addition, it is designed to be operated in on-line TSS (Time Sharing System) mode, so that users can get any information from their desktop terminals. This report is prepared as a reference manual for EDFSRS. (author)

  17. Data and code files for co-occurrence modeling project

    Data.gov (United States)

    U.S. Environmental Protection Agency — Files included are original data inputs on stream fishes (fish_data_OEPA_2012.csv), water chemistry (OEPA_WATER_2012.csv), geographic data (NHD_Plus_StreamCat);...

  18. Data vaults: a database welcome to scientific file repositories

    NARCIS (Netherlands)

    Ivanova, M.; Kargın, Y.; Kersten, M.; Manegold, S.; Zhang, Y.; Datcu, M.; Espinoza Molina, D.

    2013-01-01

    Efficient management and exploration of high-volume scientific file repositories have become pivotal for advancement in science. We propose to demonstrate the Data Vault, an extension of the database system architecture that transparently opens scientific file repositories for efficient in-database

  19. Development of data file system for cardiovascular nuclear medicine

    International Nuclear Information System (INIS)

    Hayashida, Kohei; Nishimura, Tsunehiko; Uehara, Toshiisa; Nisawa, Yoshifumi.

    1985-01-01

    A computer-assisted filing system for storing and processing data from cardiac pool scintigraphy and myocardial scintigraphy has been developed. Individual patient data are stored with the patient's identification number (ID) on floppy discs, successively in the order in which scintigraphy was performed. Data for 900 patients can be stored per floppy disc. Scintigraphic findings can be output in a uniform file format that can also serve as a reporting format. Output or retrieval of filed individual patient data is possible by examination, disease code or ID. This system appears suitable for prospective studies in patients with cardiovascular diseases. (Namekawa, K.)

  20. Air and Soil Data Files from Sumas Study

    Data.gov (United States)

    U.S. Environmental Protection Agency — The data are summarized in the manuscript, but users may wish to apply them from these files. This dataset is associated with the following publication: Wroble, J.,...

  1. An evaluated neutronic data file for elemental cobalt

    Energy Technology Data Exchange (ETDEWEB)

    Guenther, P.; Lawson, R.; Meadows, J.; Sugimoto, M.; Smith, A.; Smith, D.; Howerton, R.

    1988-08-01

    A comprehensive evaluated neutronic data file for elemental cobalt is described. The experimental data base, the calculational methods, the evaluation techniques and judgments, and the physical content are outlined. The file contains: neutron total and scattering cross sections and associated properties, (n,2n) and (n,3n) processes, neutron radiative capture processes, charged-particle-emission processes, and photon-production processes. The file extends from 10⁻⁵ eV to 20 MeV, and is presented in the ENDF/B-VI format. Detailed attention is given to the uncertainties and correlations associated with the prominent neutron-induced processes. The numerical contents of the file have been transmitted to the National Nuclear Data Center, Brookhaven National Laboratory. 143 refs., 16 figs., 5 tabs.

  2. An attempt for revision of JNDC FP decay data file

    International Nuclear Information System (INIS)

    Katakura, Jun-ichi; Matsumoto, Zyun-itiro; Akiyama, Masatsugu; Yoshida, Tadashi; Nakasima, Ryozo.

    1984-06-01

    Some improvements of the JNDC FP Decay Data File are attempted by reexamining the decay schemes of several nuclides, since slight discrepancies are seen in detailed comparisons of decay powers. As a result, it is found that the average beta and gamma energies should be modified for ⁸⁸Rb and ¹⁴³La among the nuclides reexamined in the present study. The JNDC file modified for ⁸⁸Rb and ¹⁴³La gives better agreement with experiments in most cases than the original JNDC file for cooling times longer than a few thousand seconds. However, the discrepancy for cooling times from a few hundred to about 1500 seconds still remains. (author)

  3. Processing and validation of intermediate energy evaluated data files

    International Nuclear Information System (INIS)

    2000-01-01

    Current accelerator-driven and other intermediate energy technologies require accurate nuclear data to model the performance of the target/blanket assembly, neutron production, activation, heating and damage. In a previous WPEC subgroup, SG13 on intermediate energy nuclear data, various aspects of intermediate energy data, such as nuclear data needs, experiments, model calculations and file formatting issues were investigated and categorized to come to a joint evaluation effort. The successor of SG13, SG14 on the processing and validation of intermediate energy evaluated data files, goes one step further. The nuclear data files that have been created with the aforementioned information need to be processed and validated in order to be applicable in realistic intermediate energy simulations. We emphasize that the work of SG14 excludes the 0-20 MeV data part of the neutron evaluations, which is supposed to be covered elsewhere. This final report contains the following sections: section 2: a survey of the data files above 20 MeV that have been considered for validation in SG14; section 3: a summary of the review of the 150 MeV intermediate energy data files for ENDF/B-VI and, more briefly, the other libraries; section 4: validation of the data library against an integral experiment with MCNPX; section 5: conclusions. (author)

  4. DataNet: A flexible metadata overlay over file resources

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Managing and sharing data stored in files is a challenge due to the amounts of data produced by various scientific experiments [1]. While solutions such as Globus Online [2] focus on file transfer and synchronization, in this work we propose an additional layer of metadata over file resources which helps to categorize and structure the data, as well as to make it efficient in integration with web-based research gateways. A basic concept of the proposed solution [3] is a data model consisting of entities built from primitive types such as numbers and texts, and also from files and relationships among different entities. This allows for building complex data structure definitions and mixing metadata and file data into a single model tailored for a given scientific field. A data model becomes actionable after being deployed as a data repository, which is done automatically by the proposed framework by using one of the available PaaS (platform-as-a-service) platforms, and is exposed to the world as a REST service, which...

  5. The evaluated nuclear structure data file: Philosophy, content, and uses

    International Nuclear Information System (INIS)

    Burrows, T.W.

    1990-01-01

    The Evaluated Nuclear Structure Data File (ENSDF) is maintained by the National Nuclear Data Center (NNDC) on behalf of the international Nuclear Structure and Decay Data Network sponsored by the International Atomic Energy Agency, Vienna. Data for A=5 to 44 are extracted from the evaluations published in Nuclear Physics; for A≥45 the file is used to produce the Nuclear Data Sheets. The philosophy and methodology of ENSDF evaluations are outlined, along with the file contents of relevance to radionuclide metrologists; the services available at various nuclear data centers and the NNDC on-line capabilities are also discussed. Application codes have been developed for use with ENSDF, and the program RADLST is used as an example. The interaction of ENSDF evaluation with other evaluations is also discussed. (orig.)

  6. Design and creation of a direct access nuclear data file

    International Nuclear Information System (INIS)

    Charpentier, P.

    1981-06-01

    General considerations on the structure of instructions and files are reviewed. The design, organization and mode of use of the different files (instruction file, index files, inverted files) and the automatic analysis and inquiry programs are examined.

  7. Evaluated nuclear data file ENDF/B-VI

    International Nuclear Information System (INIS)

    Dunford, C.L.

    1991-01-01

    For the past 25 years, the United States Department of Energy has sponsored a cooperative program among its laboratories, contractors and university research programs to produce an evaluated nuclear data library which would be application independent and universally accepted. The product of this cooperative activity is the ENDF/B evaluated nuclear data file. After approximately eight years of development, a new version of the data file, ENDF/B-VI has been released. The essential features of this evaluated data library are described in this paper. 7 refs

  8. DATA Act File B Object Class and Program Activity - Social Security

    Data.gov (United States)

    Social Security Administration — The DATA Act Information Model Schema Reporting Submission Specification File B. File B includes the agency object class and program activity detail obligation and...

  9. The version control service for ATLAS data acquisition configuration files

    CERN Document Server

    Soloviev, Igor; The ATLAS collaboration

    2012-01-01

    To configure a data-taking session, the ATLAS systems and detectors store more than 160 MBytes of data acquisition related configuration information in OKS XML files [1]. The total number of files exceeds 1300, and they are updated by many system experts. In the past, from time to time after such updates, we experienced problems caused by XML syntax errors or by files left in a state inconsistent with the overall ATLAS configuration. It was not always possible to know who made the modification causing problems, or how to go back to a previous version of the modified file. A few years ago a special service addressing these issues was implemented and deployed on ATLAS Point-1. It excludes direct write access to XML files stored in a central database repository. Instead, for an update the files are copied into a user repository, validated after modification and committed using a version control system. The system's callback updates the central repository. Also, it keeps track of all modifications pro...

  10. Fast processing the film data file

    International Nuclear Information System (INIS)

    Abramov, B.M.; Avdeev, N.F.; Artemov, A.V.

    1978-01-01

    The problems of processing images obtained from the three-meter magnetic spectrometer on a new PSP-2 automatic device are considered. A detailed description is given of the filtration program, which checks the correct operation of the connection line as well as the scanning parameters and the technical quality of the information. The filtration process can be subdivided into the following main stages: search for fiducial marks; binding of tracks to fiducial marks; construction of track fragments in the chambers from sparks. The BESM-6 computer has been chosen for filtration. The complex of filtration programs is shaped as a RAM file; the required version of the program is collected by the PATCHY program. The subprograms performing the greater part of the calculations are written in the autocode MADLEN, the rest in FORTRAN and ALGOL. The filtration time for one image is 1.2-2 s of calculation. The BESM-6 computer processes up to 12 thousand images a day.

  11. SLIB77, Source Library Data Compression and File Maintenance System

    International Nuclear Information System (INIS)

    Lunsford, A.

    1989-01-01

    Description of program or function: SLIB77 is a source librarian program designed to maintain FORTRAN source code in a compressed form on magnetic disk. The program was prepared to meet program maintenance requirements for ongoing program development and continual improvement of very large programs involving many programmers from a number of different organizations. SLIB77 automatically maintains in one file the source of the current program as well as all previous modifications. Although written originally for FORTRAN programs, SLIB77 is suitable for use with data files, text files, operating systems, and other programming languages, such as Ada, C and COBOL. It can handle libraries with records of up to 160 characters. Records are grouped into DECKS and assigned deck names by the user. SLIB77 assigns a number to each record in each DECK. Records can be deleted or restored singly or as a group within each deck. Modification records are grouped and assigned modification identification names by the user. The program assigns numbers to each new record within the deck. The program has two modes of execution, BATCH and EDIT. The BATCH mode is controlled by an input file and is used to make changes permanent and create new library files. The EDIT mode is controlled by interactive terminal input, and a built-in line editor is used for modification of single decks. Transferring a library from one computer system to another is accomplished using a Portable Library File created by SLIB77 in a BATCH run.
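
    A toy sketch of the librarian idea (one store holding the base source together with every named modification set, so any revision can be reproduced or diffed); it uses Python's difflib for display rather than SLIB77's own compressed format, which this summary does not describe:

      import difflib

      class Deck:
          """A tiny stand-in for an SLIB77 deck: a base revision plus
          named modification sets, all kept together in one store."""
          def __init__(self, base_lines):
              self.revisions = [("BASE", list(base_lines))]

          def add_mod(self, mod_id, new_lines):
              self.revisions.append((mod_id, list(new_lines)))

          def show_mod(self, mod_id):
              """Unified diff of what a modification changed."""
              ids = [m for m, _ in self.revisions]
              i = ids.index(mod_id)
              return "\n".join(difflib.unified_diff(
                  self.revisions[i - 1][1], self.revisions[i][1],
                  ids[i - 1], ids[i], lineterm=""))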

  12. An information retrieval system for research file data

    Science.gov (United States)

    Joan E. Lengel; John W. Koning

    1978-01-01

    Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS, as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed.

  13. Nuclear decay data files of the Dosimetry Research Group

    International Nuclear Information System (INIS)

    Eckerman, K.F.; Westfall, R.J.; Ryman, J.C.; Cristy, M.

    1993-12-01

    This report documents the nuclear decay data files used by the Dosimetry Research Group at Oak Ridge National Laboratory and the utility DEXRAX which provides access to the files. The files are accessed, by nuclide, to extract information on the intensities and energies of the radiations associated with spontaneous nuclear transformation of the radionuclides. In addition, beta spectral data are available for all beta-emitting nuclides. Two collections of nuclear decay data are discussed. The larger collection contains data for 838 radionuclides, which includes the 825 radionuclides assembled during the preparation of Publications 30 and 38 of the International Commission on Radiological Protection (ICRP) and 13 additional nuclides evaluated in preparing a monograph for the Medical Internal Radiation Dose (MIRD) Committee of the Society of Nuclear Medicine. The second collection is composed of data from the MIRD monograph and contains information for 242 radionuclides. Abridged tabulations of these data have been published by the ICRP in Publication 38 and by the Society of Nuclear Medicine in a monograph entitled "MIRD: Radionuclide Data and Decay Schemes." The beta spectral data reported here have not been published by either organization. Electronic copies of the files and the utility, along with this report, are available from the Radiation Shielding Information Center at Oak Ridge National Laboratory.

  14. The file of evaluated decay data in ENDF/B

    International Nuclear Information System (INIS)

    Reich, C.W.

    1991-01-01

    One important application of nuclear decay data is the Evaluated Nuclear Data File/B (ENDF/B), the base of evaluated nuclear data used in reactor research and technology activities within the United States. The decay data in the Activation File (158 nuclides) and the Actinide File (108 nuclides) represent the current status of this information well. In particular, the half-lives and the gamma and alpha emission probabilities of the actinide nuclides, quantities that are so important for many applications, represent a significant improvement over those in ENDF/B-V because of the inclusion of data produced by an International Atomic Energy Agency Coordinated Research Program. The Fission Product File contains experimental decay data on ∼510 nuclides, essentially all nuclides for which a meaningful amount of data is available. For the first time, delayed-neutron spectra for the precursor nuclides are included. Some hint of problems in the fission product data base is provided by the gamma decay heat following a burst irradiation of ²³⁹Pu.

  15. Method for data compression by associating complex numbers with files of data values

    Science.gov (United States)

    Feo, John Thomas; Hanks, David Carlton; Kraay, Thomas Arthur

    1998-02-10

    A method for compressing data for storage or transmission. Given a complex polynomial and a value assigned to each root, a root generated data file (RGDF) is created, one entry at a time. Each entry is mapped to a point in a complex plane. An iterative root finding technique is used to map the coordinates of the point to the coordinates of one of the roots of the polynomial. The value associated with that root is assigned to the entry. An equational data compression (EDC) method reverses this procedure. Given a target data file, the EDC method uses a search algorithm to calculate a set of m complex numbers and a value map that will generate the target data file. The error between a simple target data file and generated data file is typically less than 10%. Data files can be transmitted or stored without loss by transmitting the m complex numbers, their associated values, and an error file whose size is at most one-tenth of the size of the input data file.
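
    A minimal sketch of the decompression direction described above, using Newton's method on an example cubic (the polynomial, the value map, and the grid of starting points are illustrative assumptions; the patented search that derives the m complex numbers for a given target file is not shown):

      import numpy as np

      roots = np.roots([1, 0, 0, -1])        # z**3 - 1: the three cube roots of unity
      value_map = {0: 65, 1: 66, 2: 67}      # hypothetical byte value per root

      def entry_value(z, iters=50):
          """Map a complex starting point to a root by Newton iteration,
          then emit the byte assigned to that root (one RGDF entry)."""
          for _ in range(iters):
              z = z - (z**3 - 1) / (3 * z**2)
          nearest = int(np.argmin(np.abs(roots - z)))
          return value_map[nearest]

      # Each entry index maps to a point in the complex plane (assumed layout).
      data = bytes(entry_value(complex(0.5 + k / 7, 0.3 + k / 11)) for k in range(16))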

  16. Part B Carrier Summary Data File

    Data.gov (United States)

    U.S. Department of Health & Human Services — Data sets are summarized at the carrier level by meaningful Healthcare Common Procedure Coding-Current Procedural Terminology (HCPC-CPT) code ranges. The data set...

  17. [Filing and reuse of research data].

    Science.gov (United States)

    Osler, Merete; Bredahl, Lone; Ousager, Steen

    2008-02-25

    Currently several scientific journals only publish data from randomised clinical trials that are registered in a public database. Similar requirements on data sharing now accompany grants from agencies such as the National Institutes of Health. In Denmark the Health unit at the Danish Data Archive (DDA/Health) offers Danish researchers free storage of their data under conditions that fulfil the above requirements. DDA/Health also passes on research data for reuse, and at present more than 300 studies are available in a database at sundhed.dda.dk.

  18. Data Management in Practice Supplementary Files

    DEFF Research Database (Denmark)

    Hansen, Karsten Kryger; Madsen, Christina Guldfeldt; Kjeldgaard, Anne Sofie Fink

    This report presents the results of the Data Management i Praksis (DMiP) project (in English: Data Management in Practice). The project was funded by Denmark’s Electronic Research Library (DEFF), the National Danish Archives and the participating main Danish libraries. The following partners...... level. The project should also demonstrate that research libraries have a role to play regarding research data. Furthermore, the project should ensure development of competences at the libraries, which can then be used in the future process of managing research data....

  19. COMBINATION OF THE VERNAM CIPHER AND END OF FILE ALGORITHMS FOR DATA SECURITY

    Directory of Open Access Journals (Sweden)

    Christy Atika Sari

    2014-10-01

    Since cryptography and steganography serve the same purpose of securing data, this paper combines the Vernam Cipher, one of the popular algorithms in cryptography, with the End Of File (EOF) method from steganography. The Vernam Cipher can hide data because the encryption and decryption processes use the same key, derived from an XOR computation between the plaintext bits and the key bits. EOF is known as a development of the Least Significant Bit (LSB) method; it can be used to insert data of whatever size is needed. In this study an original file in .mp3 format and a spoofing file in .pdf format were used. The stego file was successfully extracted back into the original file and the spoofing file. The size of a file after insertion equals the size of the file before insertion plus the size of the data inserted into it. Keywords: Vernam Cipher, End Of File, Cryptography, Steganography.
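
    A minimal sketch of the two pieces in the paper's framing: a Vernam (XOR) pass over the secret bytes, then an End Of File embed that appends the ciphertext after the carrier's data. The 8-byte length tag is an added assumption so that extraction knows where the payload begins:

      def vernam(data: bytes, key: bytes) -> bytes:
          """XOR every byte with the (repeated) key; the same call decrypts."""
          return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

      def eof_embed(carrier: bytes, secret: bytes, key: bytes) -> bytes:
          """Append the encrypted payload, then its length, after the carrier."""
          payload = vernam(secret, key)
          return carrier + payload + len(payload).to_bytes(8, "big")

      def eof_extract(stego: bytes, key: bytes) -> bytes:
          n = int.from_bytes(stego[-8:], "big")
          return vernam(stego[-8 - n:-8], key)

    Because the payload sits past the original end of file, players that honor the .mp3 data simply ignore it, which is what makes EOF insertion size-flexible, as the abstract notes.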

  20. An UI Layout Files Analyzer for Test Data Generation

    Directory of Open Access Journals (Sweden)

    Paul POCATILU

    2014-01-01

    Prevention actions (trainings, audits) and inspections (tests, validations, code reviews) are the crucial factors in achieving a high quality level for any software application, simply because low investment in this area leads to significant expenses in terms of the corrective actions needed for defect fixing. Mobile application testing involves the use of various tools and scenarios. An important process is represented by test data generation. This paper proposes a test data generator (TDG) system for mobile applications using several sources of test data, and it focuses on the UI layout files analyzer module. The proposed architecture aims to reduce time-to-market for mobile applications. The focus is on test data generators based on the source code, user interface layout files (using markup languages like XML or XAML) and application specifications. In order to assure a common interface for test data generators, an XML- or JSON-based language called Data Specification Language (DSL) is proposed.

  1. EVALUATED NUCLEAR STRUCTURE DATA FILE. A MANUAL FOR PREPARATION OF DATA SETS

    International Nuclear Information System (INIS)

    TULI, J.K.

    2001-01-01

    This manual describes the organization and structure of the Evaluated Nuclear Structure Data File (ENSDF). This computer-based file is maintained by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory for the international Nuclear Structure and Decay Data Network. For every mass number (presently, A ≤ 293), the Evaluated Nuclear Structure Data File (ENSDF) contains evaluated structure information. For masses A ≥ 44, this information is published in the Nuclear Data Sheets; for A < 44, ENSDF is based on compilations published in the journal Nuclear Physics. The information in ENSDF is updated by mass chain or by nuclide with a varying cycle time dependent on the availability of new information

  2. Part B National Summary Data File

    Data.gov (United States)

    U.S. Department of Health & Human Services — Previously known as BESS. The data sets are summarized by meaningful Healthcare Common Procedure Coding Current Procedural Terminology, HCPC CPT, code ranges. Brief...

  3. Health Care Information System (HCIS) Data File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The data was derived from the Health Care Information System (HCIS), which contains Medicare Part A (Inpatient, Skilled Nursing Facility, Home Health Agency (Part A...

  4. Guidebook for the ENDF/B-V nuclear data files

    International Nuclear Information System (INIS)

    Magurno, B.A.; Kinsey, R.R.; Scheffel, F.M.

    1982-07-01

    The National Nuclear Data Center (NNDC) has provided the Electric Power Research Institute (EPRI) with a convenient reference/guidebook to nuclear data derived from the Evaluated Nuclear Data File, Version V (ENDF/B-V). The main part of the edition consists of plots of the major cross sections for each of the General Purpose Nuclides. These plots are reconstructed from the resonance parameters and background cross sections given in the library. The resolution and display format have been selected to show general trends in the data. Following the section for individual nuclides, an intercomparison of cross section ratios (plots of eta and α values) is provided for the major fissile nuclei. The final section contains a table of nuclide properties derived from the data files. Included are thermal (2200 m/sec and Maxwellian-averaged) cross sections, g factors, infinitely dilute resonance integrals, and fission spectrum averages.

  5. Data Conversion Tool For Tobii Pro Glasses 2 Live Data Files

    DEFF Research Database (Denmark)

    Wulff-Jensen, Andreas

    2017-01-01

    The data gathered through the Tobii Pro Glasses 2 are saved in a .json file called livedata.json. This format is convenient for the Tobii analysis software, but it can be rather troublesome for other analysis software packages, as they do not know how to interpret the .json file...
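
    One workable reading, offered as an assumption rather than as Tobii documentation, is that livedata.json holds one JSON object per line, so a converter only has to parse line by line and flatten the records; the 'ts' and 'gp' field names below are guesses to be checked against an actual recording:

      import csv
      import json

      def livedata_to_csv(src_path, dst_path):
          """Flatten newline-delimited JSON gaze records into a CSV table."""
          with open(src_path) as src, open(dst_path, "w", newline="") as dst:
              writer = csv.writer(dst)
              writer.writerow(["timestamp", "gaze_x", "gaze_y"])
              for line in src:
                  rec = json.loads(line)
                  if "gp" in rec:                       # keep gaze-point records only
                      writer.writerow([rec.get("ts"), rec["gp"][0], rec["gp"][1]])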

  6. Hiding Data in Video Files Using the LSB and DCT Methods

    Directory of Open Access Journals (Sweden)

    Mahmuddin Yunus

    2014-01-01

    Hiding data in video files is known as video steganography. Well-known steganography methods include the Least Significant Bit (LSB) and Discrete Cosine Transform (DCT) methods. In this research, data were hidden in video files using the LSB method, the DCT method, and a combined LSB-DCT method, and the quality of the video file after insertion was measured with the Mean Square Error (MSE) and the Peak Signal to Noise Ratio (PSNR). The experiments varied the video file size, the size of the inserted secret file, and the video resolution. The results show success rates of 38% for LSB, 90% for DCT, and 64% for the combined LSB-DCT method. The MSE of the DCT method is the lowest of the three, and the combined LSB-DCT method has a lower MSE than LSB alone. The PSNR of the DCT method is higher than those of LSB and the combined LSB-DCT method, and the PSNR of the combined LSB-DCT method is higher than that of LSB. Keywords: Steganography, Video, Least Significant Bit (LSB), Discrete Cosine Transform (DCT), Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR)
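
    A minimal sketch of the LSB half of the comparison, operating on an already-decoded frame buffer (codec and container handling, and the DCT variant, are beyond this record): each secret bit replaces the lowest bit of one carrier byte, which is why the visual change is small.

      def lsb_embed(frame: bytearray, secret: bytes) -> bytearray:
          """Write each bit of `secret` into the least significant bit
          of successive frame bytes."""
          bits = [(byte >> i) & 1 for byte in secret for i in range(7, -1, -1)]
          assert len(bits) <= len(frame), "secret too large for this carrier"
          for pos, bit in enumerate(bits):
              frame[pos] = (frame[pos] & 0xFE) | bit
          return frame

      def lsb_extract(frame: bytes, n_bytes: int) -> bytes:
          """Read n_bytes back out of the frame's least significant bits."""
          out = bytearray()
          for i in range(n_bytes):
              byte = 0
              for b in frame[i * 8:(i + 1) * 8]:
                  byte = (byte << 1) | (b & 1)
              out.append(byte)
          return bytes(out)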

  7. Alaska Resource Data File, Nabesna quadrangle, Alaska

    Science.gov (United States)

    Hudson, Travis L.

    2003-01-01

    Descriptions of the mineral occurrences shown on the accompanying figure follow. See U.S. Geological Survey (1996) for a description of the information content of each field in the records. The data presented here are maintained as part of a statewide database on mines, prospects and mineral occurrences throughout Alaska.

  8. Kepler Data Validation Time Series File: Description of File Format and Content

    Science.gov (United States)

    Mullally, Susan E.

    2016-01-01

    The Kepler space mission searches its time series data for periodic, transit-like signatures. The ephemerides of these events, called Threshold Crossing Events (TCEs), are reported in the TCE tables at the NASA Exoplanet Archive (NExScI). Those TCEs are then further evaluated to create planet candidates and populate the Kepler Objects of Interest (KOI) table, also hosted at the Exoplanet Archive. The search, evaluation and export of TCEs is performed by two pipeline modules, TPS (Transit Planet Search) and DV (Data Validation). TPS searches for the strongest, believable signal and then sends that information to DV to fit a transit model, compute various statistics, and remove the transit events so that the light curve can be searched for other TCEs. More on how this search is done and on the creation of the TCE table can be found in Tenenbaum et al. (2012), Seader et al. (2015), Jenkins (2002). For each star with at least one TCE, the pipeline exports a file that contains the light curves used by TPS and DV to find and evaluate the TCE(s). This document describes the content of these DV time series files, and this introduction provides a bit of context for how the data in these files are used by the pipeline.

  9. Software Library for Bruker TopSpin NMR Data Files

    Energy Technology Data Exchange (ETDEWEB)

    2016-10-14

    A software library for parsing and manipulating frequency-domain data files that have been processed using the Bruker TopSpin NMR software package. In the context of NMR, the term "processed" indicates that the end-user of the Bruker TopSpin NMR software package has (a) Fourier transformed the raw, time-domain data (the Free Induction Decay) into the frequency-domain and (b) has extracted the list of NMR peaks.
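
    As a rough illustration of what such a library must handle, the sketch below reads a processed 1D spectrum; it assumes the processed data file ("1r") holds little-endian 32-bit integers, while a robust reader must take the byte order and scaling exponent from the accompanying "procs" parameter file.

        import numpy as np

        def read_topspin_1r(path):
            # Read a TopSpin processed 1D spectrum as raw 32-bit integers.
            # Little-endian order is an assumption; BYTORDP and NC_proc in the
            # "procs" file give the true byte order and intensity scaling.
            return np.fromfile(path, dtype="<i4")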

  10. Transmission of the environmental radiation data files on the internet

    International Nuclear Information System (INIS)

    Yamaguchi, Yoshiaki; Saito, Tadashi; Yamamoto, Takayoshi; Matsumoto, Atsushi; Kyoh, Bunkei

    1999-01-01

    Recently, any text or data file has become transportable through the Internet with a personal computer. For continuous-type environmental monitors, however, the choice of monitoring points is restricted by the need to lay cable, because a private circuit is generally used. This is why we have developed an environmental monitoring system that can transmit radiation data files over the Internet. A 3''φ x 3'' NaI(Tl) detector and a Thermo-Hydrometer are installed in the monitoring post of this system, and the data files from these detectors are transmitted from a personal computer at the monitoring point to the Radioisotope Research Center of Osaka University. Environmental monitoring data from remote places are thus easily obtained through data transmission over the Internet. Moreover, the system achieves a higher precision of the environmental monitoring data because it includes the energy information of γ-rays. If the monitors at remote places can be maintained, this system could execute continuous environmental monitoring over a wide area. (author)

  11. Transmission of the environmental radiation data files on the internet

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, Yoshiaki; Saito, Tadashi; Yamamoto, Takayoshi [Osaka Univ., Suita (Japan). Radioisotope Research Center; Matsumoto, Atsushi; Kyoh, Bunkei

    1999-01-01

    Recently, any text or data file has become transportable through the Internet with a personal computer. For continuous-type environmental monitors, however, the choice of monitoring points is restricted by the need to lay cable, because a private circuit is generally used. This is why we have developed an environmental monitoring system that can transmit radiation data files over the Internet. A 3''φ x 3'' NaI(Tl) detector and a Thermo-Hydrometer are installed in the monitoring post of this system, and the data files from these detectors are transmitted from a personal computer at the monitoring point to the Radioisotope Research Center of Osaka University. Environmental monitoring data from remote places are thus easily obtained through data transmission over the Internet. Moreover, the system achieves a higher precision of the environmental monitoring data because it includes the energy information of γ-rays. If the monitors at remote places can be maintained, this system could execute continuous environmental monitoring over a wide area. (author)

  12. Procedures manual for the Evaluated Nuclear Structure Data File

    International Nuclear Information System (INIS)

    Bhat, M.R.

    1987-10-01

    This manual is a collection of various notes, memoranda and instructions on procedures for the evaluation of data in the Evaluated Nuclear Structure Data File (ENSDF). They were distributed at different times over the past few years to the evaluators of nuclear structure data, and some of them were not readily available. Hence, they have been collected in this manual for ease of reference by the evaluators of the international Nuclear Structure and Decay Data (NSDD) network who contribute mass-chains to the ENSDF. Some new articles were written specifically for this manual and others are revisions of earlier versions

  13. Data Analysis & Statistical Methods for Command File Errors

    Science.gov (United States)

    Meshkat, Leila; Waggoner, Bruce; Bryant, Larry

    2014-01-01

    This paper explains current work on modeling for managing the risk of command file errors. It is focused on analyzing actual data from a JPL spaceflight mission to build models for evaluating and predicting error rates as a function of several key variables. We constructed a rich dataset by considering the number of errors and the number of files radiated, including the number of commands and blocks in each file, as well as subjective estimates of workload and operational novelty. We have assessed these data using different curve-fitting and distribution-fitting techniques, such as multiple regression analysis and maximum likelihood estimation, to see how much of the variability in the error rates can be explained by them. We have also used goodness-of-fit testing strategies and principal component analysis to further assess our data. Finally, we constructed a model of expected error rates based on what these statistics bore out as critical drivers of the error rate. This model allows project management to evaluate the error rate against a theoretically expected rate as well as anticipate future error rates.
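
    A minimal sketch of the regression step described above, with purely hypothetical numbers standing in for the mission data (which are not reproduced here):

        import numpy as np

        # Columns: files radiated, workload estimate, operational novelty flag.
        # All values are illustrative, not mission data.
        X = np.array([[120, 3.2, 1.0],
                      [ 80, 2.1, 0.0],
                      [150, 4.0, 1.0],
                      [ 60, 1.5, 0.0]])
        y = np.array([4.0, 1.0, 6.0, 1.0])  # command file errors per period

        # Ordinary least squares with an intercept term.
        X1 = np.column_stack([np.ones(len(X)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        expected = X1 @ beta  # model-expected error rates for comparison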

  14. National RCRA Hazardous Waste Biennial Report Data Files

    Science.gov (United States)

    The United States Environmental Protection Agency (EPA), in cooperation with the States, biennially collects information regarding the generation, management, and final disposition of hazardous wastes regulated under the Resource Conservation and Recovery Act of 1976 (RCRA), as amended. Collection, validation and verification of the Biennial Report (BR) data is the responsibility of RCRA authorized states and EPA regions. EPA does not modify the data reported by the states or regions. Any questions regarding the information reported for a RCRA handler should be directed to the state agency or region responsible for the BR data collection. BR data are collected every other year (odd-numbered years) and submitted in the following year. The BR data are used to support regulatory activities and provide basic statistics and trends of hazardous waste generation and management. BR data are available to the public through three mechanisms. 1. The RCRAInfo website includes data collected from 2001 to present-day (https://rcrainfo.epa.gov/rcrainfoweb/action/main-menu/view). Users of the RCRAInfo website can run queries and output reports for different data collection years at this site. All BR data collected from 2001 to present-day is stored in RCRAInfo, and is accessible through this website. 2. An FTP site allows users to access BR data files collected from 1999 - present day (ftp://ftp.epa.gov/rcrainfodata/). Zip files are available for download directly from this

  15. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Science.gov (United States)

    2010-05-05

    ...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the... on Documenting Statistical Analysis Programs and Data Files; Availability'' giving interested persons...

  16. Data Visualization: Conversion of Data to Animation Files

    National Research Council Canada - National Science Library

    Kimbler, Nate

    2004-01-01

    .... Because visualization tools are vital to understanding complex physical phenomena, these visualization tools attempt to facilitate converting data into animations that can be saved and used in data...

  17. Status of transactinium nuclear data in the evaluated nuclear structure data file

    International Nuclear Information System (INIS)

    Ewbank, W.B.

    1980-01-01

    The structure and organization of the Evaluated Nuclear Structure Data File (ENSDF) which serves as the source data base for the production of drawings and tables for the ''Nuclear Data Sheets'' journal is described. The updating and output features of ENSDF are described with emphasis on nuclear structure and decay data of the transactinium isotopes. (author)

  18. Old Age, Survivors, and Disability Insurance (OASDI) Public-Use Microdata File, 2001 Data

    Data.gov (United States)

    Social Security Administration — The OASDI Public-Use Microdata File contains an extract of data fields from SSA's Master Beneficiary Record file and consists of a 1 percent random, representative...

  19. Aerothermodynamic data base. Data file contents report, phase C

    Science.gov (United States)

    Lutz, G. R.

    1983-01-01

    Space shuttle aerothermodynamic data, collected from a continuing series of wind tunnel tests, are permanently stored with the Data Management Services (DMS) system. Information pertaining to current baseline configuration definition is also stored. Documentation of DMS processed data arranged sequentially and by space shuttle configuration is listed to provide an up-to-date record of all applicable aerothermodynamic data collected, processed, or summarized during the space shuttle program. Tables provide survey information to the various space shuttle managerial and technical levels.

  20. XAFS Data Interchange: A single spectrum XAFS data file format

    International Nuclear Information System (INIS)

    Ravel, B.; Newville, M.

    2016-01-01

    We propose a standard data format for the interchange of XAFS data. The XAFS Data Interchange (XDI) standard is meant to encapsulate a single spectrum of XAFS along with relevant metadata. XDI is a text-based format with a simple syntax which clearly delineates metadata from the data table in a way that is easily interpreted both by a computer and by a human. The metadata header is inspired by the format of an electronic mail header, representing metadata names and values as an associative array. The data table is represented as columns of numbers. This format can be imported as is into most existing XAFS data analysis, spreadsheet, or data visualization programs. Along with a specification and a dictionary of metadata types, we provide an application programming interface written in C and bindings for dynamic programming languages. (paper)
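
    Because the syntax is deliberately simple, a reader fits in a few lines. The sketch below follows the description above ('#'-prefixed header lines holding name: value pairs, then a whitespace-separated numeric table); it is an illustration, not the reference API distributed with the standard.

        def read_xdi(path):
            # Split an XDI file into a metadata dictionary and a data table.
            meta, rows = {}, []
            with open(path) as f:
                for line in f:
                    line = line.strip()
                    if line.startswith("#"):
                        body = line.lstrip("#").strip()
                        if ":" in body:
                            name, value = body.split(":", 1)
                            meta[name.strip()] = value.strip()
                    elif line:
                        rows.append([float(x) for x in line.split()])
            return meta, rows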

  1. The self-describing data sets file protocol and Toolkit

    International Nuclear Information System (INIS)

    Borland, M.; Emery, L.

    1995-01-01

    The Self-Describing Data Sets (SDDS) file protocol continues to be used extensively in commissioning the Advanced Photon Source (APS) accelerator complex. SDDS protocol has proved useful primarily due to the existence of the SDDS Toolkit, a growing set of about 60 generic commandline programs that read and/or write SDDS files. The SDDS Toolkit is also used extensively for simulation postprocessing, giving physicists a single environment for experiment and simulation. With the Toolkit, new SDDS data is displayed and subjected to complex processing without developing new programs. Data from EPICS, lab instruments, simulation, and other sources are easily integrated. Because the SDDS tools are commandline-based, data processing scripts are readily written using the user's preferred shell language. Since users work within a UNIX shell rather than an application-specific shell or GUI, they may add SDDS-compliant programs and scripts to their personal toolkits without restriction or complication. The SDDS Toolkit has been run under UNIX on SUN OS4, HP-UX, and LINUX. Application of SDDS to accelerator operation is being pursued using Tcl/Tk to provide a GUI

  2. Use of error files in uncertainty analysis and data adjustment

    International Nuclear Information System (INIS)

    Chestnutt, M.M.; McCracken, A.K.

    1979-01-01

    Some results are given from uncertainty analyses on Pressurized Water Reactor (PWR) and Fast Reactor Theoretical Benchmarks. Upper limit estimates of calculated quantities are shown to be significantly reduced by the use of ENDF/B data covariance files and recently published few-group covariance matrices. Some problems in the analysis of single-material benchmark experiments are discussed with reference to the Winfrith iron benchmark experiment. Particular attention is given to the difficulty of making use of very extensive measurements which are likely to be a feature of this type of experiment. Preliminary results of an adjustment in iron are shown

  3. Xbox one file system data storage: A forensic analysis

    OpenAIRE

    Gravel, Caitlin Elizabeth

    2015-01-01

    The purpose of this research was to answer the question: how does the file system of the Xbox One store data on its hard disk? This question is the main focus of the exploratory research and the results sought. The research is focused on digital forensic investigators and experts. An out-of-the-box Xbox One gaming console was used in the research. Three test cases were created as viable scenarios an investigator could come across in a search and seizure of evidence. The three test cases were then...

  4. Distributed PACS using distributed file system with hierarchical meta data servers.

    Science.gov (United States)

    Hiroyasu, Tomoyuki; Minamitani, Yoshiyuki; Miki, Mitsunori; Yokouchi, Hisatake; Yoshimi, Masato

    2012-01-01

    In this research, we propose a new distributed PACS (Picture Archiving and Communication Systems) that can integrate the several PACSs that exist in individual medical institutions. A conventional PACS stores each DICOM file in a single database. In the proposed system, by contrast, a DICOM file is separated into metadata and image data, which are stored individually. With this mechanism, since the entire file need not always be accessed, operations such as finding files and changing titles can be performed at high speed. At the same time, because a distributed file system is utilized, access to image files also achieves high speed and high fault tolerance. The proposed system has a further significant advantage: the simplicity of integrating several PACSs. In the proposed system, only the metadata servers need to be integrated to construct an integrated system. The system also scales file access with the number and size of files. On the other hand, because the metadata server is centralized, it is the weak point of the system. To remedy this defect, hierarchical metadata servers are introduced. This mechanism not only increases fault tolerance but also increases the scalability of file access. To evaluate the proposed system, a prototype using Gfarm was implemented, and the file search times of Gfarm and NFS were compared.
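
    The core idea, separating a DICOM file into lightweight metadata and bulk image data, can be sketched with the pydicom package as follows; this illustrates the paper's design rather than reproducing its implementation.

        import pydicom

        def split_dicom(path):
            # Separate a DICOM file into a metadata dictionary (suitable for a
            # metadata server) and the bulk pixel bytes (suitable for a
            # distributed file system). Error handling is omitted.
            ds = pydicom.dcmread(path)
            pixels = ds.PixelData      # bulk image bytes
            del ds.PixelData           # a metadata-only dataset remains
            meta = {elem.keyword: str(elem.value)
                    for elem in ds if elem.keyword}
            return meta, pixels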

  5. PHOBINS: an index file of photon production cross section data and its utility code system

    International Nuclear Information System (INIS)

    Hasegawa, Akira; Koyama, Kinji; Ido, Masaru; Hotta, Masakazu; Miyasaka, Shun-ichi

    1978-08-01

    The code system PHOBINS, developed for reference to photon production cross sections, is described in detail. The system is intended to capture the present status of photon production data and to present the information on available data. It consists of four utility routines, CREA, UP-DT, REF and BACK, and data files. These utility routines are used for making an index file of the photon production cross sections, updating the index file, searching the index file and producing a back-up file of the index file. For the index file of the photon production cross sections, a data base system is employed for efficient data management: economical storage, ease of updating and efficient reference. The present report is a reference manual of PHOBINS. (author)

  6. NASIS data base management system: IBM 360 TSS implementation. Volume 6: NASIS message file

    Science.gov (United States)

    1973-01-01

    The message file for the NASA Aerospace Safety Information System (NASIS) is discussed. The message file contains all the message and term explanations for the system. The data contained in the file can be broken down into three separate sections: (1) global terms, (2) local terms, and (3) system messages. The various terms are defined and their use within the system is explained.

  7. NASIS data base management system - IBM 360/370 OS MVT implementation. 6: NASIS message file

    Science.gov (United States)

    1973-01-01

    The message file for the NASA Aerospace Safety Information System (NASIS) is discussed. The message file contains all the message and term explanations for the system. The data contained in the file can be broken down into three separate sections: (1) global terms, (2) local terms, and (3) system messages. The various terms are defined and their use within the system is explained.

  8. GEODOC: the GRID document file, record structure and data element description

    Energy Technology Data Exchange (ETDEWEB)

    Trippe, T.; White, V.; Henderson, F.; Phillips, S.

    1975-11-06

    The purpose of this report is to describe the information structure of the GEODOC file. GEODOC is a computer based file which contains the descriptive cataloging and indexing information for all documents processed by the National Geothermal Information Resource Group. This file (along with other GRID files) is managed by DBMS, the Berkeley Data Base Management System. Input for the system is prepared using the IRATE Text Editing System with its extended (12 bit) character set, or punched cards.

  9. ARM Data File Standards Version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Palanisamy, Giri [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-05-01

    The U.S. Department of Energy (DOE)’s Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth's atmosphere in diverse climate regimes. The result is a huge archive of diverse data sets containing observational and derived data, currently accumulating at a rate of 30 terabytes (TB) of data and 150,000 different files per month (http://www.archive.arm.gov/stats/). Continuing the current processing while scaling this to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever-growing data volumes. They will enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and allow future capabilities of delivering data on demand that can be tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy of required and recommended standards.

  10. ARM Data File Standards Version: 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kehoe, Kenneth [University of Oklahoma; Beus, Sherman [Pacific Northwest National Laboratory; Cialella, Alice [Brookhaven National Laboratory; Collis, Scott [Argonne National Laboratory; Ermold, Brian [Pacific Northwest National Laboratory; Perez, Robin [State University of New York, Albany; Shamblin, Stefanie [Oak Ridge National Laboratory; Sivaraman, Chitra [Pacific Northwest National Laboratory; Jensen, Mike [Brookhaven National Laboratory; McCord, Raymond [Oak Ridge National Laboratory; McCoy, Renata [Sandia National Laboratories; Moore, Sean [Alliant Techsystems, Inc.; Monroe, Justin [University of Oklahoma; Perkins, Brad [Los Alamos National Laboratory; Shippert, Tim [Pacific Northwest National Laboratory

    2014-04-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth's atmosphere in diverse climate regimes. The result is a diverse collection of data sets containing observational and derived data, currently accumulating at a rate of 30 TB of data and 150,000 different files per month (http://www.archive.arm.gov/stats/storage2.html). Continuing the current processing while scaling this to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever-growing volumes of data. They also will enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and facilitate development of future capabilities for delivering data on demand that can be tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy that includes required and recommended standards.

  11. Search across Different Media: Numeric Data Sets and Text Files

    Directory of Open Access Journals (Sweden)

    Michael Buckland

    2006-12-01

    Full Text Available Digital technology encourages the hope of searching across and between different media forms (text, sound, image, numeric data). Topic searches are described in two different media, text files and socioeconomic numeric databases, and also for transverse searching, whereby retrieved text is used to find topically related numeric data and vice versa. Direct transverse searching across different media is impossible. Descriptive metadata provide enabling infrastructure, but usually require mappings between different vocabularies and a search-term recommender system. Statistical association techniques and natural-language processing can help. Searches in socioeconomic numeric databases ordinarily require that place and time be specified.

  12. Manual on usage of the Nuclear Reaction Data File (NRDF)

    International Nuclear Information System (INIS)

    1984-10-01

    In the computer at the Institute for Nuclear Study, University of Tokyo, a Nuclear Reaction Data File (NRDF), built at Hokkaido University, has been set up. While the data base grows year after year, it is offered on a trial basis for joint utilization by educational institutions. In section 1, examples of retrieval are presented to familiarize the user with NRDF. In section 2, the terms used in retrieval are tabulated. In section 3, as a summary of the examples, the structure of the retrieval commands is explained. In section 4, cautions in reading the retrieval results on a CRT are given. Finally, in section 5, general cautions in the usage of NRDF are given. (Mori, K.)

  13. Coexistence of graph-oriented and relational data file organisations in a data bank system

    International Nuclear Information System (INIS)

    Engel, K.D.

    1980-01-01

    It is shown that the coexistence of hierarchical and relational data bank structures in computer networks within a common data bank system is possible. This coexistence model, first established by NIJSSEN, regards the graph-theoretic CODASYL approach and CODD's relational model as graph-oriented and table-oriented data file organisations, respectively, presented to the user through a common logical structure of the data bank. (WB)

  14. Data formats and procedures for the Evaluated Nuclear Data File, ENDF

    International Nuclear Information System (INIS)

    Garber, D.; Dunford, C.; Pearlstein, S.

    1975-10-01

    This report describes the philosophy of the Evaluated Nuclear Data File (ENDF) and the data formats and procedures that have been developed for it. The ENDF system was designed for the storage and retrieval of the evaluated nuclear data that are required for neutronics, photonics and decay heat calculations. This system is composed of several parts that include a series of data processing codes and neutron and photon cross section nuclear structure libraries

  15. Data formats and procedures for the Evaluated Nuclear Data File, ENDF

    Energy Technology Data Exchange (ETDEWEB)

    Garber, D.; Dunford, C.; Pearlstein, S.

    1975-10-01

    This report describes the philosophy of the Evaluated Nuclear Data File (ENDF) and the data formats and procedures that have been developed for it. The ENDF system was designed for the storage and retrieval of the evaluated nuclear data that are required for neutronics, photonics and decay heat calculations. This system is composed of several parts that include a series of data processing codes and neutron and photon cross section nuclear structure libraries.

  16. The version control service for the ATLAS data acquisition configuration files

    International Nuclear Information System (INIS)

    Soloviev, Igor

    2012-01-01

    The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error-free and consistent set of XML files proved a major challenge. Therefore a special service was implemented: to validate any modifications; to check the authorization of anyone trying to modify a file; to record who had made changes, plus when and why; and to provide tools to compare different versions of files and to go back to earlier versions if required. This paper provides details of the implementation and exploitation experience that may be interesting for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.

  17. Status of transactinium nuclear data in the Evaluated Nuclear Structure Data File

    International Nuclear Information System (INIS)

    Ewbank, W.B.

    1979-01-01

    The organization and program of the Nuclear Data Project are described. An Evaluated Nuclear Structure Data File (ENSDF) was designed to contain most of the data of nuclear structure physics. ENSDF includes adopted level information for all 1950 known nuclei, and detailed data for approximately 1500 decay schemes. File organization, management, and retrieval are reviewed. An international network of data evaluation centers has been organized to provide for a four-year cycle of ENSDF revisions. Standard retrieval and display programs can prepare various tables of specific data, which can serve as a good first approximation to a complete up-to-date compilation. Appendixes list, for A > 206, nuclear levels with lifetimes ≥ 1 s, strong γ rays from radioisotopes (ordered by nuclide and energy), and strong α particle emissions (similarly ordered). 8 figures

  18. Nuclear structure data file. A manual for preparation of data sets

    International Nuclear Information System (INIS)

    Ewbank, W.B.; Schmorak, M.R.; Bertrand, F.E.; Feliciano, M.; Horen, D.J.

    1975-06-01

    The Nuclear Data Project at ORNL is building a computer-based file of nuclear structure data, which is intended for use by both basic and applied users. For every nucleus, the Nuclear Structure Data File contains evaluated nuclear structure information. This manual describes a standard input format for nuclear structure data. The format is sufficiently structured that bulk data can be entered efficiently. At the same time, the structure is open-ended and can accommodate most measured or deduced quantities that yield nuclear structure information. Computer programs have been developed at the Data Project to perform consistency checking and routine calculations. Programs are also used for preparing level scheme drawings. (U.S.)

  19. Operations Data Files, driving force behind International Space Station operations

    Science.gov (United States)

    Hoppenbrouwers, Tom; Ferra, Lionel; Markus, Michael; Wolff, Mikael

    2017-09-01

    Almost all tasks performed by the astronauts on board the International Space Station (ISS) and by ground controllers in the Mission Control Centre, from operation and maintenance of station systems to the execution of scientific experiments or high-risk visiting-vehicle docking manoeuvres, would not be possible without Operations Data Files (ODF). ODFs are the User Manuals of the Space Station and have multiple faces, ranging from traditional step-by-step procedures, scripts and cue cards, through displays, to software which guides the crew through the execution of certain tasks. These key operational documents are standardized, as they are used on board the Space Station by an international crew that changes every three months. Furthermore, this harmonization effort is paramount for consistency, as the crew moves from one element to another in a matter of seconds, and from one activity to another. On the ground, a large group of experts from all International Partners drafts, prepares, reviews and approves all Operations Data Files on a daily basis, ensuring their timely availability on board the ISS for all activities. Unavailability of these operational documents would halt the conduct of experiments or cancel milestone events. This paper gives an insight into the ground preparation work for the ODFs (with a focus on ESA ODF processes), presents an overview of ODF formats and their usage within the ISS environment today, and shows how vital they are. Furthermore, the focus will be on recently implemented ODF features, which significantly ease the use of this documentation and improve the efficiency of the astronauts performing the tasks. Examples are short video demonstrations, interactive 3D animations, Execute Tailored Procedures (XTP-versions), tablet products, etc.

  20. File-based data flow in the CMS Filter Farm

    Science.gov (United States)

    Andre, J.-M.; Andronidis, A.; Bawej, T.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.

  1. File-Based Data Flow in the CMS Filter Farm

    Energy Technology Data Exchange (ETDEWEB)

    Andre, J.M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.

  2. Extracting the Data From the LCM vk4 Formatted Output File

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-29

    These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: vk4 file produced by Keyence VK Software, custom analysis, no off-the-shelf way to read the file, reading the binary data in a vk4 file, various offsets in decimal lines, finding the height image data directly in MATLAB, binary output beginning of height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read vk4 file (choose a file, read the file, compute offsets, read optical image, laser optical image, read and compute laser intensity image, read height image, timing, display height image, display laser intensity image, display RGB laser optical images, display RGB optical images, display beginning data and save images to workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, observations.
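
    The offset-chasing pattern the slides describe translates directly to any language with binary unpacking. The Python sketch below mirrors the MATLAB approach; the header position and record layout used here are placeholders, since the real vk4 offsets must be determined by inspecting actual files.

        import struct

        def read_vk4_height(path, offset_table_pos=12):
            # Read a (hypothetical) little-endian uint32 offset from the header,
            # then width, height, bit depth, and the raw height image that
            # follows. Real offsets differ and must be found empirically.
            with open(path, "rb") as f:
                data = f.read()
            (img_offset,) = struct.unpack_from("<I", data, offset_table_pos)
            width, height, bitdepth = struct.unpack_from("<III", data, img_offset)
            pixels = struct.unpack_from("<%dI" % (width * height), data,
                                        img_offset + 12)
            return width, height, bitdepth, pixels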

  3. Parallel file system performances in fusion data storage

    International Nuclear Information System (INIS)

    Iannone, F.; Podda, S.; Bracco, G.; Manduchi, G.; Maslennikov, A.; Migliori, S.; Wolkersdorfer, K.

    2012-01-01

    High I/O flow rates, up to 10 GB/s, are required in large fusion Tokamak experiments like ITER where hundreds of nodes store simultaneously large amounts of data acquired during the plasma discharges. Typical network topologies such as linear arrays (systolic), rings, meshes (2-D arrays), tori (3-D arrays), trees, butterfly, hypercube in combination with high speed data transports like Infiniband or 10G-Ethernet, are the main areas in which the effort to overcome the so-called parallel I/O bottlenecks is most focused. The high I/O flow rates were modelled in an emulated testbed based on the parallel file systems such as Lustre and GPFS, commonly used in High Performance Computing. The test runs on High Performance Computing–For Fusion (8640 cores) and ENEA CRESCO (3392 cores) supercomputers. Message Passing Interface based applications were developed to emulate parallel I/O on Lustre and GPFS using data archival and access solutions like MDSPLUS and Universal Access Layer. These methods of data storage organization are widely diffused in nuclear fusion experiments and are being developed within the EFDA Integrated Tokamak Modelling – Task Force; the authors tried to evaluate their behaviour in a realistic emulation setup.

  4. Parallel file system performances in fusion data storage

    Energy Technology Data Exchange (ETDEWEB)

    Iannone, F., E-mail: francesco.iannone@enea.it [Associazione EURATOM-ENEA sulla Fusione, C.R.ENEA Frascati, via E.Fermi, 45 - 00044 Frascati, Rome (Italy); Podda, S.; Bracco, G. [ENEA Information Communication Tecnologies, Lungotevere Thaon di Revel, 76 - 00196 Rome (Italy); Manduchi, G. [Associazione EURATOM-ENEA sulla Fusione, Consorzio RFX, Corso Stati Uniti, 4 - 35127 Padua (Italy); Maslennikov, A. [CASPUR Inter-University Consortium for the Application of Super-Computing for Research, via dei Tizii, 6b - 00185 Rome (Italy); Migliori, S. [ENEA Information Communication Tecnologies, Lungotevere Thaon di Revel, 76 - 00196 Rome (Italy); Wolkersdorfer, K. [Juelich Supercomputing Centre-FZJ, D-52425 Juelich (Germany)

    2012-12-15

    High I/O flow rates, up to 10 GB/s, are required in large fusion Tokamak experiments like ITER where hundreds of nodes store simultaneously large amounts of data acquired during the plasma discharges. Typical network topologies such as linear arrays (systolic), rings, meshes (2-D arrays), tori (3-D arrays), trees, butterfly, hypercube in combination with high speed data transports like Infiniband or 10G-Ethernet, are the main areas in which the effort to overcome the so-called parallel I/O bottlenecks is most focused. The high I/O flow rates were modelled in an emulated testbed based on the parallel file systems such as Lustre and GPFS, commonly used in High Performance Computing. The test runs on High Performance Computing-For Fusion (8640 cores) and ENEA CRESCO (3392 cores) supercomputers. Message Passing Interface based applications were developed to emulate parallel I/O on Lustre and GPFS using data archival and access solutions like MDSPLUS and Universal Access Layer. These methods of data storage organization are widely diffused in nuclear fusion experiments and are being developed within the EFDA Integrated Tokamak Modelling - Task Force; the authors tried to evaluate their behaviour in a realistic emulation setup.

  5. A data compression algorithm for nuclear spectrum files

    International Nuclear Information System (INIS)

    Mika, J.F.; Martin, L.J.; Johnston, P.N.

    1990-01-01

    The total space occupied by computer files of spectra generated in nuclear spectroscopy systems can lead to problems of storage and transmission time. An algorithm is presented which significantly reduces the space required to store nuclear spectra, without loss of any information content. Testing indicates that spectrum files can be routinely compressed by a factor of 5. (orig.)
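
    The paper's algorithm is not reproduced here, but a common way to exploit the smoothness of spectra losslessly is delta encoding followed by variable-length integers, sketched below for illustration.

        def compress_spectrum(counts):
            # Neighbouring channels hold similar counts, so channel-to-channel
            # differences are small and pack into few bytes. Zig-zag encoding
            # keeps negative deltas small; decompression reverses each step.
            out = bytearray()
            prev = 0
            for c in counts:
                delta = c - prev
                prev = c
                n = (delta << 1) ^ (delta >> 31)  # zig-zag (32-bit counts assumed)
                while n >= 0x80:
                    out.append((n & 0x7F) | 0x80)
                    n >>= 7
                out.append(n)
            return bytes(out)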

  6. Managing Variant Calling Files the Big Data Way: Using HDFS and Apache Parquet

    NARCIS (Netherlands)

    Boufea, Aikaterini; Finkers, H.J.; Kaauwen, van M.P.W.; Kramer, M.R.; Athanasiadis, I.N.

    2017-01-01

    Big Data has been seen as a remedy for the efficient management of the ever-increasing genomic data. In this paper, we investigate the use of Apache Spark to store and process Variant Calling Files (VCF) on a Hadoop cluster. We demonstrate Tomatula, a software tool for converting VCF files to Apache Parquet
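
    A minimal sketch of the conversion idea, assuming a plain-text VCF on HDFS and illustrative paths (this is not the Tomatula code itself):

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("vcf2parquet").getOrCreate()

        # Drop the '##' meta-information lines; the '#CHROM' line carries the
        # column names for the tab-separated body.
        lines = spark.read.text("hdfs:///data/sample.vcf").filter(
            "value NOT LIKE '##%'")
        rows = lines.rdd.map(lambda r: r.value.lstrip("#").split("\t"))
        header = rows.first()
        body = rows.filter(lambda cols: cols != header)

        df = spark.createDataFrame(body, schema=header)
        df.write.parquet("hdfs:///data/sample.parquet")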

  7. Photon and decay data libraries for ORIGEN2 code based on JENDL FP decay data file 2000

    CERN Document Server

    Katakura, J I

    2002-01-01

    Photon and decay data libraries for the ORIGEN2 code have been updated by using the JENDL FP Decay Data File 2000 (JENDL/FPD-00). As for the decay data, half-lives, branching ratios and recoverable energy values have been replaced with those of the JENDL/FPD-00 file. The data of the photon library have also been replaced with those of the JENDL/FPD-00 file, in which the photon data of nuclides without measured data are calculated with a theoretical method. Using the updated photon library, the photon spectrum at short times after a fission event can be calculated.

  8. Data File Standard for Flow Cytometry, version FCS 3.1.

    Science.gov (United States)

    Spidlen, Josef; Moore, Wayne; Parks, David; Goldberg, Michael; Bray, Chris; Bierre, Pierre; Gorombey, Peter; Hyun, Bill; Hubbard, Mark; Lange, Simon; Lefebvre, Ray; Leif, Robert; Novo, David; Ostruszka, Leo; Treister, Adam; Wood, James; Murphy, Robert F; Roederer, Mario; Sudar, Damir; Zigon, Robert; Brinkman, Ryan R

    2010-01-01

    The flow cytometry data file standard provides the specifications needed to completely describe flow cytometry data sets within the confines of the file containing the experimental data. In 1984, the first Flow Cytometry Standard format for data files was adopted as FCS 1.0. This standard was modified in 1990 as FCS 2.0 and again in 1997 as FCS 3.0. We report here on the next generation flow cytometry standard data file format. FCS 3.1 is a minor revision based on suggested improvements from the community. The unchanged goal of the standard is to provide a uniform file format that allows files created by one type of acquisition hardware and software to be analyzed by any other type. The FCS 3.1 standard retains the basic FCS file structure and most features of previous versions of the standard. Changes included in FCS 3.1 address potential ambiguities in the previous versions and provide a more robust standard. The major changes include simplified support for international characters and improved support for storing compensation. The major additions are support for preferred display scale, a standardized way of capturing the sample volume, information about originality of the data file, and support for plate and well identification in high throughput, plate based experiments. Please see the normative version of the FCS 3.1 specification in Supporting Information for this manuscript (or at http://www.isac-net.org/ in the Current standards section) for a complete list of changes.
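
    The fixed-width header that anchors every FCS file can be read in a few lines; the sketch below follows the segment layout of the standard (a version string, then six right-justified ASCII offsets locating the TEXT, DATA, and ANALYSIS segments).

        def read_fcs_header(path):
            # Bytes 0-5: version string (e.g. "FCS3.1"); bytes 10-57: six
            # 8-byte right-justified ASCII integers giving the start and end
            # offsets of the TEXT, DATA, and ANALYSIS segments (blank if unused).
            with open(path, "rb") as f:
                header = f.read(58)
            version = header[:6].decode("ascii")
            names = ["text_start", "text_end", "data_start", "data_end",
                     "analysis_start", "analysis_end"]
            offsets = {}
            for i, name in enumerate(names):
                raw = header[10 + 8 * i : 18 + 8 * i].strip()
                offsets[name] = int(raw) if raw else 0
            return version, offsets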

  9. Access to DIII-D data located in multiple files and multiple locations

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1993-10-01

    The General Atomics DIII-D tokamak fusion experiment is now collecting over 80 MB of data per discharge once every 10 min, and that quantity is expected to double within the next year. The size of the data files, even in compressed format, is becoming increasingly difficult to handle. Data is also being acquired now on a variety of UNIX systems as well as MicroVAX and MODCOMP computer systems. The existing computers collect all the data into a single shot file, and this data collection is taking an ever-increasing amount of time as the total quantity of data increases. Data is not available to experimenters until it has been collected into the shot file, which is in conflict with the substantial need for timely data examination between shots. The experimenters are also spread over many different types of computer systems (possibly located at other sites). To improve data availability and handling, software has been developed to allow individual computer systems to create their own shot files locally. The data interface routine PTDATA that is used to access DIII-D data has been modified so that a user's code on any computer can access data from any computer where that data might be located. This data access is transparent to the user. Breaking up the shot file into separate files in multiple locations also impacts software used for data archiving, data management, and data restoration

  10. Benchmark test of evaluated nuclear data files for fast reactor neutronics application

    International Nuclear Information System (INIS)

    Chiba, Go; Hazama, Taira; Iwai, Takehiko; Numata, Kazuyuki

    2007-07-01

    A benchmark test of the latest evaluated nuclear data files, JENDL-3.3, JEFF-3.1 and ENDF/B-VII.0, has been carried out for fast reactor neutronics application. For this benchmark test, experimental data obtained at fast critical assemblies and fast power reactors are utilized. In addition to comparing numerical solutions with the experimental data, we have identified, through sensitivity analyses, several cross sections for which differences between the three nuclear data files significantly affect the numerical solutions. This benchmark test concludes that ENDF/B-VII.0 predicts the neutronics characteristics of fast neutron systems better than the other nuclear data files. (author)

  11. Views of CMS Event Data Objects, Files, Collections, Virtual Data Products

    CERN Document Server

    Holtman, Koen

    2001-01-01

    The CMS data grid system will store many types of data maintained by the CMS collaboration. An important type of data is the event data, which is defined in this note as all data that directly represents simulated, raw, or reconstructed CMS physics events. Many views on this data will exist simultaneously. To a CMS physics code implementer this data will appear as C++ objects, to a tape robot operator the data will appear as files. This note identifies different views that can exist, describes each of them, and interrelates them by placing them into a vertical stack. This particular stack integrates several existing architectural structures, and is therefore a plausible basis for further prototyping and architectural work. This document is intended as a contribution to, and as common (terminological) reference material for, the CMS architectural efforts and for the Grid projects PPDG, GriPhyN, and the EU DataGrid.

  12. Distributing File-Based Data to Remote Sites Within the BABAR Collaboration

    International Nuclear Information System (INIS)

    Gowdy, Stephen J.

    2002-01-01

    BABAR [1] uses two formats for its data: Objectivity database and ROOT [2] files. This poster concerns the distribution of the latter--for Objectivity data see [3]. The BABAR analysis data is stored in ROOT files--one per physics run and analysis selection channel--maintained in a large directory tree. Currently BABAR has more than 4.5 TBytes in 200,000 ROOT files. This data is (mostly) produced at SLAC, but is required for analysis at universities and research centers throughout the US and Europe. Two basic problems confront us when we seek to import bulk data from SLAC to an institute's local storage via the network. We must determine which files must be imported (depending on the local site requirements and which files have already been imported), and we must make the optimum use of the network when transferring the data. Basic ftp-like tools (ftp, scp, etc.) do not attempt to solve the first problem. More sophisticated tools like rsync [4], the widely-used mirror/synchronization program, compare local and remote file systems, checking for changes (based on file date, size and, if desired, an elaborate checksum) in order to only copy new or modified files. However rsync allows for only limited file selection. Also when, as in BABAR, an extremely large directory structure must be scanned, rsync can take several hours just to determine which files need to be copied. Although rsync (and scp) provides on-the-fly compression, it does not allow us to optimize the network transfer by using multiple streams, adjusting the TCP window size, or separating encrypted authentication from unencrypted data channels
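
    The first problem, deciding which files to import, reduces to comparing file indices. The sketch below illustrates the idea with hypothetical {path: (size, mtime)} dictionaries and a per-site channel filter; it is a schematic of the selection logic, not the BABAR tooling.

        def files_to_import(remote_index, local_index, wanted_channels):
            # Select files that match the site's analysis channels and are
            # either absent locally or differ in size/modification time.
            picks = []
            for path, stamp in remote_index.items():
                if not any(channel in path for channel in wanted_channels):
                    continue
                if local_index.get(path) != stamp:
                    picks.append(path)
            return picks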

  13. Distributing file-based data to remote sites within the BABAR collaboration

    International Nuclear Information System (INIS)

    Adye, T.; Dorigo, A.; Forti, A.; Leonardi, E.

    2001-01-01

    BABAR uses two formats for its data: Objectivity database and ROOT files. This poster concerns the distribution of the latter--for Objectivity data see. The BABAR analysis data is stored in ROOT files--one per physics run and analysis selection channel--maintained in a large directory tree. Currently BABAR has more than 4.5 TBytes in 200,000 ROOT files. This data is (mostly) produced at SLAC, but is required for analysis at universities and research centres throughout the US and Europe. Two basic problems confront us when we seek to import bulk data from SLAC to an institute's local storage via the network. We must determine which files must be imported (depending on the local site requirements and which files have already been imported), and the authors must make the optimum use of the network when transferring the data. Basic ftp-like tools (ftp, scp, etc.) do not attempt to solve the first problem. More sophisticated tools like rsync, the widely-used mirror/synchronisation program, compare local and remote file systems, checking for changes (based on file date, size and, if desired, an elaborate checksum) in order to only copy new or modified files. However rsync allows for only limited file selection. Also when, as in BABAR, an extremely large directory structure must be scanned, rsync can take several hours just to determine which files need to be copied. Although rsync (and scp) provides on-the-fly compression, it does not allow us to optimise the network transfer by using multiple streams, adjusting the TCP window size, or separating encrypted authentication from unencrypted data channels

  14. Data Vaults: a Database Welcome to Scientific File Repositories

    NARCIS (Netherlands)

    M.G. Ivanova (Milena); Y. Kargin (Yagiz); M.L. Kersten (Martin); S. Manegold (Stefan); Y. Zhang (Ying); M. Datcu (Mihai); D. Espinoza Molina

    2013-01-01

    Efficient management and exploration of high-volume scientific file repositories have become pivotal for advancement in science. We propose to demonstrate the Data Vault, an extension of the database system architecture that transparently opens scientific file repositories for efficient

  15. High School and Beyond Transcripts Survey (1982). Data File User's Manual. Contractor Report.

    Science.gov (United States)

    Jones, Calvin; And Others

    This data file user's manual documents the procedures used to collect and process high school transcripts for a large sample of the younger cohort (1980 sophomores) in the High School and Beyond survey. The manual provides the user with the technical assistance needed to use the computer file and also discusses the following: (1) sample design for…

  16. Recalling ISX shot data files from the off-line archive

    International Nuclear Information System (INIS)

    Stanton, J.S.

    1981-02-01

    This document describes a set of computer programs designed to allow access to ISX shot data files stored on off-line disk packs. The programs accept user requests for data files and build a queue of pending requests. When an operator is available to mount the necessary disk packs, the system copies the requested files to an on-line disk area. The program runs on the Fusion Energy Division's DECsystem-10 computer. The request queue is implemented under the System 1022 data base management system. The support programs are coded in MACRO-10 and FORTRAN-10

  17. FORTRAN data files transference from VAX/VMS to ALPHA/UNIX

    International Nuclear Information System (INIS)

    Sanchez, E.; Milligen, B.Ph. van

    1997-01-01

    Several tools have been developed to access the TJ-I and TJ-IU databases, which currently reside on VAX/VMS servers, from the TJ-II Data Acquisition System DEC ALPHA 8400 server. The TJ-I/TJ-IU databases are not homogeneous and contain several types of data files, namely SADE, CAMAC and FORTRAN unformatted files. The tools presented in this report allow one to transfer CAMAC files and those FORTRAN unformatted files defined herein from a VAX/VMS server, for data manipulation on the ALPHA/Digital UNIX server. (Author) 5 refs

  18. NJOY99, Data Processing System of Evaluated Nuclear Data Files ENDF Format

    International Nuclear Information System (INIS)

    2000-01-01

    1 - Description of program or function: The NJOY nuclear data processing system is a modular computer code used for converting evaluated nuclear data in the ENDF format into libraries useful for applications calculations. Because the Evaluated Nuclear Data File (ENDF) format is used all around the world (e.g., ENDF/B-VI in the US, JEF-2.2 in Europe, JENDL-3.2 in Japan, BROND-2.2 in Russia), NJOY gives its users access to a wide variety of the most up-to-date nuclear data. NJOY provides comprehensive capabilities for processing evaluated data, and it can serve applications ranging from continuous-energy Monte Carlo (MCNP), through deterministic transport codes (DANT, ANISN, DORT), to reactor lattice codes (WIMS, EPRI). NJOY handles a wide variety of nuclear effects, including resonances, Doppler broadening, heating (KERMA), radiation damage, thermal scattering (even cold moderators), gas production, neutrons and charged particles, photo-atomic interactions, self shielding, probability tables, photon production, and high-energy interactions (to 150 MeV). Output can include printed listings, special library files for applications, and Postscript graphics (plus color). More information on NJOY is available from the developer's home page at http://t2.lanl.gov/tour/tourbus.html. Follow the Tourbus section of the Tour area to find notes from the ICTP lectures held at Trieste in March 2000 on the ENDF format and on the NJOY code. NJOY contains the following modules: NJOY directs the flow of data through the other modules and contains a library of common functions and subroutines used by the other modules. RECONR reconstructs pointwise (energy-dependent) cross sections from ENDF resonance parameters and interpolation schemes. BROADR Doppler broadens and thins pointwise cross sections. UNRESR computes effective self-shielded pointwise cross sections in the unresolved energy range. HEATR generates pointwise heat production cross sections (KERMA coefficients) and radiation

  19. Toolsets for Airborne Data (TAD): Improving Machine Readability for ICARTT Data Files

    Science.gov (United States)

    Northup, E. A.; Early, A. B.; Beach, A. L., III; Kusterer, J.; Quam, B.; Wang, D.; Chen, G.

    2015-12-01

    NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gases and aerosol properties. The ASDC Toolsets for Airborne Data (TAD) is designed to meet the user community's needs for manipulating aircraft data for scientific research on climate change and air quality relevant issues. TAD makes use of aircraft data stored in the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT) file format. ICARTT has been the NASA standard since 2010, and is widely used by NOAA, NSF, and international partners (DLR, FAAM). Its level of acceptance is due in part to it being generally self-describing for researchers, i.e., it provides the data descriptions necessary for proper research use. Despite this, there are a number of issues with the current ICARTT format, especially concerning machine readability. In order to overcome these issues, the TAD team has developed an "idealized" file format. This format is ASCII and is sufficiently machine readable to sustain the TAD system; however, it is not fully compatible with the current ICARTT format. The process of mapping ICARTT metadata to the idealized format, the format specifics, and the actual conversion process will be discussed. The goal of this presentation is to demonstrate an example of how to improve the machine readability of ASCII data format protocols.
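
    ICARTT files are self-describing in a simple way: the first header line states how many header lines there are (plus the file format index), and the data table follows immediately after the header. A minimal reader sketch, assuming the common FFI 1001 layout with comma- or space-delimited records:

        def read_icartt(path):
            # The first line is "NLHEAD, FFI"; skip the remaining NLHEAD-1
            # header lines, then parse each record as a row of floats.
            with open(path) as f:
                first = f.readline()
                nlhead = int(first.replace(",", " ").split()[0])
                header = [f.readline() for _ in range(nlhead - 1)]
                data = [[float(x) for x in line.replace(",", " ").split()]
                        for line in f if line.strip()]
            return header, data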

  20. Data Qualification Report For: Thermodynamic Data File, DATA0.YMP.R0 For Geochemical Code, EQ3/6?

    International Nuclear Information System (INIS)

    P.L. Cloke

    2000-09-01

    The objective of this work is to evaluate the adequacy of chemical thermodynamic data provided by Lawrence Livermore National Laboratory (LLNL) as Data0.ymp.R0A in response to an input request submitted under AP-3.14Q. This request specified that the chemical thermodynamic data available in the file Data0.com.R2 be updated, improved, and augmented for use in the geochemical modeling used in Process Model Reports (PMRs) for Engineered Barrier Systems, Waste Form, Waste Package, Unsaturated Zone, and Near Field Environment, as well as for Performance Assessment. The data are qualified in the temperature range 0 to 100 C. Several Data Tracking Numbers (DTNs) are associated with Analysis/Model Reports (AMRs) addressing various aspects of the post-closure chemical behavior of the waste package and the Engineered Barrier System; these AMRs rely on EQ3/6 outputs that use these data as input and are classified as Principal Factor affecting. This qualification activity was accomplished in accordance with AP-SIII.2Q using the Technical Assessment method. A development plan, TDP-EBS-MD-000044, was prepared in accordance with AP-2.13Q and approved by the Responsible Manager. In addition, a Process Control Evaluation was performed in accordance with AP-SV.1Q. The qualification method, selected in accordance with AP-SIII.2Q, was Technical Assessment. The rationale for this approach is that the data in file Data0.com.R2 are considered Handbook data and therefore do not themselves require qualification; only the changes to Data0.com.R2 required qualification. A new file has been produced containing the database Data0.ymp.R0, which is recommended for qualification as a result of this action. Data0.ymp.R0 will supersede Data0.com.R2 for all Yucca Mountain Project (YMP) activities.

  1. Ground-Based Global Navigation Satellite System Combined Broadcast Ephemeris Data (daily files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) Combined Broadcast Ephemeris Data (daily files of all distinct navigation messages...

  2. Processing of evaluated neutron data files in ENDF format on personal computers

    International Nuclear Information System (INIS)

    Vertes, P.

    1991-11-01

    A computer code package - FDMXPC - has been developed for processing evaluated data files in ENDF format. The earlier version of this package is supplemented with modules performing calculations using Reich-Moore and Adler-Adler resonance parameters. The processing of evaluated neutron data files by personal computers requires special programming considerations outlined in this report. The scope of the FDMXPC program system is demonstrated by means of numerical examples. (author). 5 refs, 4 figs, 4 tabs

  3. Simulation of Thermal Neutron Transport Processes Directly from the Evaluated Nuclear Data Files

    Science.gov (United States)

    Androsenko, P. A.; Malkov, M. R.

    The main idea of the method proposed in this paper is to extract the required information for Monte-Carlo calculations directly from nuclear data files. The method being developed allows direct use of the data obtained from the libraries and seems to be the most accurate technique. Direct simulation of neutron scattering in the thermal energy range, using File 7 of the ENDF-6 format, has been achieved in the BRAND code system. The simulation algorithms have been verified using the χ² criterion.

  4. Sandia equation of state data base: seslan File

    Energy Technology Data Exchange (ETDEWEB)

    Kerley, G.I. [Sandia National Labs., Albuquerque, NM (US)]; Christian-Frear, T.L. [RE/SPEC Inc., Albuquerque, NM (US)]

    1993-06-24

    Sandia National Laboratories maintains several libraries of equation of state tables, in a modified Sesame format, for use in hydrocode calculations and other applications. This report discusses one of those libraries, the seslan file, which contains 78 tables from the Los Alamos equation of state library. Minor changes have been made to these tables, making them more convenient for code users and reducing numerical difficulties that occasionally arise in hydrocode calculations.

  5. Solving data-at-rest for the storage and retrieval of files in ad hoc networks

    Science.gov (United States)

    Knobler, Ron; Scheffel, Peter; Williams, Jonathan; Gaj, Kris; Kaps, Jens-Peter

    2013-05-01

    Based on current trends for both military and commercial applications, the use of mobile devices (e.g. smartphones and tablets) is greatly increasing. Several military applications consist of secure peer-to-peer file sharing without a centralized authority. For these military applications, if one or more of these mobile devices are lost or compromised, sensitive files can be compromised by adversaries, since COTS devices and operating systems are used. Complete system files cannot be stored on a single device, since after compromising the device an adversary can attack the data at rest and eventually obtain the original file. Also, after a device is compromised, the remaining peer-to-peer system devices must still be able to access all system files. McQ has teamed with the Cryptographic Engineering Research Group at George Mason University to develop a custom distributed file sharing system that provides a complete solution to the data-at-rest problem for resource-constrained embedded systems and mobile devices. This innovative approach scales very well to a large number of network devices without a single point of failure. We have implemented the approach on representative mobile devices and developed an extensive system simulator to benchmark expected system performance based on detailed modeling of the network/radio characteristics, CONOPS, and secure distributed file system functionality. The simulator is highly customizable for the purpose of determining expected system performance for other network topologies and CONOPS.

  6. Use of DBMS-10 for storage and retrieval of evaluated nuclear data files

    International Nuclear Information System (INIS)

    Dunford, C.L.

    1977-01-01

    The use of a data base management system (DBMS) for storage of, and retrieval from, the many scientific data bases maintained by the National Nuclear Data Center is currently being investigated. It would appear that a commercially available DBMS package would save the Center considerable money and manpower when adding new data files to the library and in the long-term maintenance of current data files. Current DBMS technology and experience with an internal DBMS system suggests an inherent inefficiency in processing large data networks where significant portions are accessed in a sequential manner. Such a file is the Evaluated Nuclear Data File (ENDF/B), which contains many large data tables, each one normally accessed in a sequential manner. After gaining some experience and success in small applications of the commercially available DBMS package, DBMS-10, on the Center's DECsystem-10 computer, it was decided to select a large data base as a test case before making a final decision on the implementation of DBMS-10 for all data bases. The obvious approach is to utilize the DBMS to index a random-access file. In this way one is able to increase the storage and retrieval efficiency at the one-time cost of additional programming effort. 2 figures

  7. Use of DBMS-10 for storage and retrieval of evaluated nuclear data files

    International Nuclear Information System (INIS)

    Dunford, C.L.

    1978-01-01

    The use of a data base management system (DBMS) for storage of, and retrieval from, the many scientific data bases maintained by the National Nuclear Data Center is currently being investigated. It would appear that a commercially available DBMS package would save the Center considerable money and manpower when adding new data files to our library and in the long-term maintenance of our current data files. Current DBMS technology and experience with our internal DBMS system suggests an inherent inefficiency in processing large data networks where significant portions are accessed in a sequential manner. Such a file is the Evaluated Nuclear Data File (ENDF/B) which contains many large data tables, each one normally accessed in a sequential manner. After gaining some experience and success in small applications of the commercially available DBMS package, DBMS-10, on the Center's DECsystem-10 computer, it was decided to select one of our large data bases as a test case before making a final decision on the implementation of DBMS-10 for all our data bases. The obvious approach is to utilize the DBMS to index a random access file. In this way one is able to increase the storage and retrieval efficiency at the one-time cost of additional programming effort

  8. ENDF-102 DATA FORMATS AND PROCEDURES FOR THE EVALUATED NUCLEAR DATA FILE ENDF-6

    International Nuclear Information System (INIS)

    MCLANE, V.

    2001-01-01

    The Evaluated Nuclear Data File (ENDF) formats and libraries are decided by the Cross Section Evaluation Working Group (CSEWG), a cooperative effort of national laboratories, industry, and universities in the U.S. and Canada, and are maintained by the National Nuclear Data Center (NNDC). Earlier versions of the ENDF format provided representations for neutron cross sections and distributions, photon production from neutron reactions, a limited amount of charged-particle production from neutron reactions, photo-atomic interaction data, thermal neutron scattering data, and radionuclide production and decay data (including fission products). Version 6 (ENDF-6) allows higher incident energies, adds more complete descriptions of the distributions of emitted particles, and provides for incident charged particles and photonuclear data by partitioning the ENDF library into sub-libraries. Decay data, fission product yield data, thermal scattering data, and photo-atomic data have also been formally placed in sub-libraries. In addition, this rewrite represents an extensive update to the Version V manual

  9. ENDF-102 DATA FORMATS AND PROCEDURES FOR THE EVALUATED NUCLEAR DATA FILE ENDF-6.

    Energy Technology Data Exchange (ETDEWEB)

    MCLANE,V.

    2001-05-15

    The Evaluated Nuclear Data File (ENDF) formats and libraries are decided by the Cross Section Evaluation Working Group (CSEWG), a cooperative effort of national laboratories, industry, and universities in the U.S. and Canada, and are maintained by the National Nuclear Data Center (NNDC). Earlier versions of the ENDF format provided representations for neutron cross sections and distributions, photon production from neutron reactions, a limited amount of charged-particle production from neutron reactions, photo-atomic interaction data, thermal neutron scattering data, and radionuclide production and decay data (including fission products). Version 6 (ENDF-6) allows higher incident energies, adds more complete descriptions of the distributions of emitted particles, and provides for incident charged particles and photonuclear data by partitioning the ENDF library into sub-libraries. Decay data, fission product yield data, thermal scattering data, and photo-atomic data have also been formally placed in sub-libraries. In addition, this rewrite represents an extensive update to the Version V manual.

  10. Overview of the contents of ENDF/B-VI [Evaluated Nuclear Data File

    International Nuclear Information System (INIS)

    Dunford, C.L.; Pearlstein, S.

    1989-01-01

    The sixth release of the Evaluated Nuclear Data File (ENDF/B-VI) is now being prepared for general distribution. This data file serves as the primary source of nuclear data for nuclear applications in the United States and Canada and in many other countries of the world. The data library is maintained and distributed by the National Nuclear Data Center at Brookhaven National Laboratory from evaluations provided by members of the Cross Section Evaluation Working Group (CSEWG). Unlike its predecessor, ENDF/B-V, this file will be available to all requesters without restrictions. Compared to ENDF/B-V, released more than 11 yr ago, the ENDF/B-VI data library contains significant improvements for both fission and fusion reactor design. Future work will continue with limited staffing and foreign cooperation to provide the data needed for future nuclear applications

  11. Visual system of recovering and combination of information for ENDF (Evaluated Nuclear Data File) format libraries

    International Nuclear Information System (INIS)

    Ferreira, Claudia A.S. Velloso; Corcuera, Raquel A. Paviotti

    1997-01-01

    This report presents a data information retrieval and merging system for ENDF (Evaluated Nuclear Data File) format libraries, which can be run on personal computers under the Windows™ environment. The input is the name of an ENDF/B library, which can be chosen in a selection window. The system has a display function which allows the user to visualize the reaction data of a specific nuclide and to produce a printed copy of these data. The system allows the user to retrieve and/or combine evaluated data to create a single file of data in ENDF format from a number of different files, each of which is in the ENDF format. The user can also create a mini-library from an ENDF/B library. This interactive and easy-to-handle system is a useful tool for nuclear data centers and is also of interest to nuclear and reactor physics researchers. (author)

  12. ChemEngine: harvesting 3D chemical structures of supplementary data from PDF files.

    Science.gov (United States)

    Karthikeyan, Muthukumarasamy; Vyas, Renu

    2016-01-01

    Digital access to chemical journals has resulted in a vast array of molecular information that is now available in supplementary material files in PDF format. However, extracting this molecular information, generally from a PDF document format, is a daunting task. Here we present an approach to harvest 3D molecular data from the supporting information of scientific research articles that are normally available from publishers' resources. In order to demonstrate the feasibility of extracting truly computable molecules from PDF file formats in a fast and efficient manner, we have developed a Java-based application, namely ChemEngine. This program recognizes textual patterns from the supplementary data and generates standard molecular structure data (bond matrix, atomic coordinates) that can be subjected to a multitude of computational processes automatically. The methodology has been demonstrated via several case studies on different formats of coordinate data stored in supplementary information files, wherein ChemEngine selectively harvested the atomic coordinates and interpreted them as molecules with high accuracy. The reusability of the extracted molecular coordinate data was demonstrated by computing Single Point Energies that were in close agreement with the original computed data provided with the articles. It is envisaged that the methodology will enable large-scale conversion of molecular information from supplementary files available in the PDF format into a collection of ready-to-compute molecular data to create an automated workflow for advanced computational processes. Software along with source codes and instructions is available at https://sourceforge.net/projects/chemengine/files/?source=navbar
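
    ChemEngine's actual pattern recognition is not specified in this abstract; as a rough illustration of the general idea only, the sketch below pulls Cartesian coordinate lines (an element symbol followed by three floats, a layout common in computational-chemistry supplementary data) out of text already extracted from a PDF and emits an XYZ block. The regex and function names are assumptions, not ChemEngine's implementation:

    ```python
    import re

    # Matches lines like "C   -1.2345   0.6789   2.3456" (element + x y z).
    COORD = re.compile(
        r"^\s*([A-Z][a-z]?)\s+(-?\d+\.\d+)\s+(-?\d+\.\d+)\s+(-?\d+\.\d+)\s*$",
        re.MULTILINE,
    )

    def harvest_xyz(text):
        """Collect (element, x, y, z) atoms from raw text and emit an XYZ block."""
        atoms = [(m.group(1), *map(float, m.group(2, 3, 4)))
                 for m in COORD.finditer(text)]
        lines = [str(len(atoms)), "harvested from supplementary data"]
        lines += [f"{el} {x:9.4f} {y:9.4f} {z:9.4f}" for el, x, y, z in atoms]
        return "\n".join(lines)
    ```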

  13. JENDL FP decay data file 2000 and the beta-decay theory

    International Nuclear Information System (INIS)

    Yoshida, Tadashi; Katakura, Jun Ichi; Tachibana, Takahiro

    2002-01-01

    JENDL FP Decay Data File 2000 has been developed as one of the special purpose files of the Japanese Evaluated Nuclear Data Library (JENDL), which constitutes a versatile nuclear data basis for science and technology. In the ENDF-6 format, this file includes the decay data for 1087 unstable fission product (FP) nuclides and 142 stable nuclides as their daughters. The primary purpose of this file is for use in the summation calculation of FP decay heat, which plays a critical role in nuclear safety analysis, for example in the loss-of-coolant accident analysis of reactors. The data for a given nuclide are its decay modes, the Q value, the branching ratios, the average energies released in the form of beta- and gamma-rays per decay, and their spectral data. The primary source of the decay data adopted here is the ENSDF (Evaluated Nuclear Structure Data File). The data in ENSDF, however, cover only measured values. The data for the short-lived nuclides, which are essential for decay heat calculations at short cooling times, are often entirely lacking, or incomplete where they do exist, mainly because of their short half-lives. For such nuclides a theoretical model calculation is applied in order to fill the gaps between the true and the experimentally known decay schemes. In practice we have to predict the average decay energies and the spectral data for many short-lived FPs by use of beta-decay theories. Thus the beta-decay theory plays a very important role in generating the FP decay data file

  14. Establishment of data base files of thermodynamic data developed by OECD/NEA. Pt. 1. Thermodynamic data of Np and Pu

    International Nuclear Information System (INIS)

    Yoshida, Yasushi; Sasamoto, Hiroshi

    2004-01-01

    Thermodynamic data bases for compounds and complexes of actinides and fission products, specialized to the modeling requirements of safety assessments of radioactive waste disposal systems, are being developed by the NEA TDB project of OECD/NEA. In this project, the relevant data bases for compounds and complexes of Np and Pu were published in 2001. JNC established data base files, usable by geochemical calculation codes, from these published Np and Pu data. The procedure for establishing the files and their contents are described in this report. The data base files were prepared in the formats of the major geochemical codes PHREEQE, PHREEQC, EQ3/6 and Geochemist's Workbench. In addition, data in the thermodynamic data base files already published by JNC were modified; this procedure and the revised data bases are given in the appendix of this report. (author)

  15. Establishment of data base files of thermodynamic data developed by OECD/NEA. Pt. 2. Thermodynamic data of Tc, U, Np, Pu and Am with auxiliary species

    International Nuclear Information System (INIS)

    Yoshida, Yasushi; Shibata, Masahiro

    2005-03-01

    Thermodynamic data bases for compounds and complexes of actinides and fission products with auxiliary species, specialized to the modeling requirements of safety assessments of radioactive waste disposal systems, are being developed by the NEA TDB project of OECD/NEA. In this project, the relevant data bases for compounds and complexes of U, Am, Tc, Np and Pu with auxiliary species were updated and published in 2003. JNC established data base files, usable by geochemical calculation codes, from these updated data. The procedure for establishing the files and their contents are described in this report. The data base files were prepared in the formats of the major geochemical codes PHREEQE, PHREEQC, EQ3/6 and Geochemist's Workbench. In addition, data in the thermodynamic data base files already published by JNC were modified; this procedure and the revised data bases are given in the appendix of this report. (author)

  16. The structure and extent of data files for research management and planning

    International Nuclear Information System (INIS)

    Jankowski, L.

    1981-01-01

    The paper is concerned with the structure and extent of the data files which are necessary for the efficient planning and management of a research institute. An analysis is made of the interrelations between decision-making and the amount of information, its content and structure, including the consequences to be drawn for planning an in-house data bank for an institute. Special emphasis is placed on the type and structure of the data files, the interrelations of the individual data items, the frequency of access, and the necessity of involving the individual agencies and services that provide research guidance. (author)

  17. Data Analysis of Minima Total Cross-sections of Nitrogen-14 on JENDL-3.2 Nuclear Data File

    International Nuclear Information System (INIS)

    Suwoto; Pandiangan, Tumpal; Ferhat-Aziz

    2000-01-01

    The integral tests of neutron cross-sections for shielding materials such as nitrogen-14 contained in the JENDL-3.2 file have been performed. The calculation analysis for nitrogen-14 was based on MAERKER's ORNL Broomstick Experiment at ORNL, USA. For comparison, calculation analyses with the JENDL-3.1, ENDF/B-IV, ENDF/B-VI and JEF-2.2 files have also been carried out. The overall calculation results using the JENDL-3.2 evaluation showed good agreement with the experimental data, as did those with the ENDF/B-VI evaluation. In particular, the JENDL-3.2 evaluation gave better results than the JENDL-3.1 and ENDF/B-IV evaluations. It was concluded that the total cross-sections of nitrogen-14 contained in the JENDL-3.2 file are in very good agreement with the experimental results, although the total cross-section in the energy range between 0.5 MeV and 0.9 MeV in the JENDL-3.2 file was small (about 4% lower), and the minima of the total cross-sections were deeper. (author)

  18. The version control service for ATLAS data acquisition configuration files (keywords: DAQ, configuration, OKS, XML)

    CERN Document Server

    Soloviev, Igor; The ATLAS collaboration

    2012-01-01

    To configure a data-taking session, the ATLAS systems and detectors store more than 160 MBytes of data acquisition related configuration information in OKS XML files. The total number of files exceeds 1300, and they are updated by many system experts. In the past, we occasionally experienced problems after such updates, caused by XML syntax errors or by files left in a state inconsistent with the overall ATLAS configuration. It was not always possible to know who made the modification causing a problem, or how to go back to a previous version of the modified file. A few years ago a special service addressing these issues was implemented and deployed on ATLAS Point-1. It excludes direct write access to the XML files stored in a central database repository. Instead, for an update the files are copied into a user repository, validated after modification and committed using a version control system. The system's callback updates the central repository. Also, it keeps track of all modifications providi...

  19. pTSC: Data file editing for the Tokamak Simulation Code

    International Nuclear Information System (INIS)

    Meiss, J.D.

    1987-09-01

    The code pTSC is an editor for the data files needed to run the Princeton Tokamak Simulation Code (TSC). pTSC utilizes the Macintosh interface to create a graphical environment for entering the data. As most of the data to run TSC consists of conductor positions, the graphical interface is especially appropriate

  20. Developing a File System Structure to Solve Healthy Big Data Storage and Archiving Problems Using a Distributed File System

    Directory of Open Access Journals (Sweden)

    Atilla Ergüzen

    2018-06-01

    Full Text Available Recently, internet use has become widespread, increasing the use of mobile phones, tablets, computers, Internet of Things (IoT) devices and other digital sources. In the health sector, with the help of new-generation digital medical equipment, this digital world has tended to grow in an unpredictable way, so that it now holds nearly 10% of global data and continues to grow beyond what the other sectors produce. This progress has greatly enlarged the amount of produced data, which cannot be handled with conventional methods. In this work, an efficient model for the storage of medical images using a distributed file system structure has been developed. With this work, a robust, available, scalable, and serverless solution structure has been produced, especially for storing large amounts of data in the medical field. Furthermore, the system achieves a high security level through static Internet Protocol (IP) addresses, user credentials, and synchronously encrypted file contents. One of the most important key features of the system is high performance and easy scalability. In this way, the system can work with fewer hardware elements and be more robust than others that use name-node architecture. According to the test results, the performance of the designed system is 97% better than a Not Only Structured Query Language (NoSQL) system, 80% better than a relational database management system (RDBMS), and 74% better than an operating system (OS).

  1. Comparison of data file and storage configurations for efficient temporal access of satellite image data

    CSIR Research Space (South Africa)

    Bachoo, A

    2009-01-01

    Full Text Available Traditional storage formats store such a series of images as a sequence of individual files, with each file internally storing the pixels in their spatial order. Consequently, the construction of a time series profile of a single pixel requires reading from...
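
    The access-pattern mismatch described here is easy to demonstrate. With one image file per date, a per-pixel time series touches every file; with a time-major layout, the same series is one contiguous read. A minimal sketch with NumPy memory maps, where file names, dtype and shapes are hypothetical:

    ```python
    import numpy as np

    T, H, W = 365, 4000, 4000  # hypothetical: one year of daily 4000x4000 images

    # Traditional layout: one file per date, pixels stored in spatial order.
    # A time series for pixel (r, c) must open and touch all T files.
    def pixel_series_spatial(r, c):
        return np.array([
            np.memmap(f"day_{t:03d}.dat", dtype=np.int16, mode="r",
                      shape=(H, W))[r, c]
            for t in range(T)
        ])

    # Time-major layout: a single (row, col, time) cube, so one pixel's full
    # history is contiguous on disk and needs a single small read.
    cube = np.memmap("cube_hwt.dat", dtype=np.int16, mode="r", shape=(H, W, T))
    def pixel_series_temporal(r, c):
        return np.asarray(cube[r, c, :])
    ```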

  2. National Household Education Surveys of 2003. Data File User's Manual, Volume II: Parent and Family Involvement in Education Survey. NCES 2004-102

    Science.gov (United States)

    Hagedorn, Mary; Montaquila, Jill; Vaden-Kiernan, Nancy; Kim, Kwang; Roth, Shelley Brock; Chapman, Christopher

    2004-01-01

    This manual provides documentation and guidance for users of the public-use data file for PFI-NHES: 2003. This volume contains a description of the content and organization of the data file, including useful information regarding questionnaire items and the various derived variables found on the file. Appended are the public-use data file layout,…

  3. Student Achievement Study, 1970-1974. The IEA Six-Subject Data Bank [machine-readable data file].

    Science.gov (United States)

    International Association for the Evaluation of Educational Achievement, Stockholm (Sweden).

    The "Student Achievement Study" machine-readable data files (MRDF) (also referred to as the "IEA Six-Subject Survey") are the result of an international data collection effort during 1970-1974 by 21 designated National Centers, which had agreed to cooperate. The countries involved were: Australia, Belgium, Chile, England-Wales,…

  4. Transfer of numeric ASCII data files between Apple and IBM personal computers.

    Science.gov (United States)

    Allan, R W; Bermejo, R; Houben, D

    1986-01-01

    Listings for programs designed to transfer numeric ASCII data files between Apple and IBM personal computers are provided with accompanying descriptions of how the software operates. Details of the hardware used are also given. The programs may be easily adapted for transferring data between other microcomputers.

  5. INDXENDF: A PC code for indexing nuclear data files in ENDF-6 format

    International Nuclear Information System (INIS)

    Silva, O.O. de; Corcuera, R.P.; Ferreira, P.A.; Moraes Cunha, M. de.

    1992-01-01

    The PC code INDXENDF, which creates visual or printed indexes of nuclear data files in ENDF-6 format, is available from the IAEA Nuclear Data Section on a PC diskette, free of charge upon request. The present document describes the features of this code. (author). 11 refs, 9 figs

  6. Application of the Levenshtein Distance Metric for the Construction of Longitudinal Data Files

    Science.gov (United States)

    Doran, Harold C.; van Wamelen, Paul B.

    2010-01-01

    The analysis of longitudinal data in education is becoming more prevalent given the nature of testing systems constructed for the No Child Left Behind Act (NCLB). However, constructing the longitudinal data files remains a significant challenge. Students move into new schools, but in many cases the unique identifiers (ID) that should remain constant…
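
    For reference, the Levenshtein (edit) distance named in the title is the minimum number of single-character insertions, deletions and substitutions turning one string into another; a standard dynamic-programming implementation is sketched below, with invented example records:

    ```python
    def levenshtein(a: str, b: str) -> int:
        """Minimum number of single-character edits turning a into b."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            curr = [i]
            for j, cb in enumerate(b, 1):
                curr.append(min(
                    prev[j] + 1,               # deletion
                    curr[j - 1] + 1,           # insertion
                    prev[j - 1] + (ca != cb),  # substitution
                ))
            prev = curr
        return prev[-1]

    # Hypothetical use: link student records whose names differ by a small edit.
    print(levenshtein("JOHNSON, MARIA", "JOHNSTON, MARIA"))  # -> 1
    ```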

  7. Using NJOY to Create MCNP ACE Files and Visualize Nuclear Data

    Energy Technology Data Exchange (ETDEWEB)

    Kahler, Albert Comstock [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-14

    We provide lecture materials that describe the input requirements to create various MCNP ACE files (Fast, Thermal, Dosimetry, Photo-nuclear and Photo-atomic) with the NJOY Nuclear Data Processing code system. Input instructions to visualize nuclear data with NJOY are also provided.

  8. Migrant Student Record Transfer System (MSRTS) [machine-readable data file].

    Science.gov (United States)

    Arkansas State Dept. of Education, Little Rock. General Education Div.

    The Migrant Student Record Transfer System (MSRTS) machine-readable data file (MRDF) is a collection of education and health data on more than 750,000 migrant children in grades K-12 in the United States (except Hawaii), the District of Columbia, and the outlying territories of Puerto Rico and the Mariana and Marshall Islands. The active file…

  9. NoDB: efficient query execution on raw data files

    NARCIS (Netherlands)

    I. Alagiannis; R Borovica; M. Branco; S. Idreos (Stratos); A. Ailamaki

    2012-01-01

    As data collections become larger and larger, data loading evolves to a major bottleneck. Many applications already avoid using database systems, e.g., scientific data analysis and social networks, due to the complexity and the increased data-to-query time. For such applications data

  10. Data Vaults: Database Technology for Scientific File Repositories

    NARCIS (Netherlands)

    M.G. Ivanova (Milena); M.L. Kersten (Martin); S. Manegold (Stefan); Y. Kargin (Yagiz)

    2013-01-01

    Current data-management systems and analysis tools fail to meet scientists' data-intensive needs. A "data vault" approach lets researchers effectively and efficiently explore and analyze information.

  11. FEDGROUP - A program system for producing group constants from evaluated nuclear data of files disseminated by IAEA

    International Nuclear Information System (INIS)

    Vertes, P.

    1976-06-01

    A program system for calculating group constants from several evaluated nuclear data files has been developed. These files are distributed by the Nuclear Data Section of the IAEA. Our program system, FEDGROUP, has certain advantages over well-known similar codes: 1. it requires only a medium-sized computer (approximately 20,000 words of memory); 2. it is easily adaptable to any type of computer; 3. it is flexible with respect to the input evaluated nuclear data file and the output group-constant file. At present, FEDGROUP calculates practically all types of group constants needed for reactor physics calculations, using the most frequent representations of evaluated data. (author)
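
    As a reminder of what a group constant is, the sketch below performs a flux-weighted collapse of a pointwise cross section onto a coarse group structure, sigma_g = ∫sigma(E)phi(E)dE / ∫phi(E)dE per group. The 1/E weighting flux, energy grid and toy cross section are illustrative assumptions, not FEDGROUP's actual scheme:

    ```python
    import numpy as np

    def group_constants(E, sigma, boundaries, phi=lambda E: 1.0 / E):
        """Collapse pointwise sigma(E) to flux-weighted group averages."""
        groups = []
        for lo, hi in zip(boundaries[:-1], boundaries[1:]):
            m = (E >= lo) & (E <= hi)
            w = phi(E[m])
            groups.append(np.trapz(sigma[m] * w, E[m]) / np.trapz(w, E[m]))
        return np.array(groups)

    E = np.logspace(0, 6, 10000)          # energy grid in eV (illustrative)
    sigma = 1.0 + 100.0 / np.sqrt(E)      # toy 1/v-like cross section
    print(group_constants(E, sigma, np.array([1.0, 1e2, 1e4, 1e6])))
    ```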

  12. Ground-Based Global Navigation Satellite System (GNSS) GLONASS Broadcast Ephemeris Data (hourly files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) GLObal NAvigation Satellite System (GLONASS) Broadcast Ephemeris Data (hourly files)...

  13. Odysseus/DFS: Integration of DBMS and Distributed File System for Transaction Processing of Big Data

    OpenAIRE

    Kim, Jun-Sung; Whang, Kyu-Young; Kwon, Hyuk-Yoon; Song, Il-Yeol

    2014-01-01

    The relational DBMS (RDBMS) has been widely used since it supports various high-level functionalities such as SQL, schemas, indexes, and transactions that do not exist in the O/S file system. But, a recent advent of big data technology facilitates development of new systems that sacrifice the DBMS functionality in order to efficiently manage large-scale data. Those so-called NoSQL systems use a distributed file system, which support scalability and reliability. They support scalability of the...

  14. ENDF/B-IV fission-product files: summary of major nuclide data

    International Nuclear Information System (INIS)

    England, T.R.; Schenter, R.E.

    1975-09-01

    The major fission-product parameters [σth, RI, τ1/2, Ēβ, Ēγ, Ēα, decay and (n,γ) branching, Q, and AWR] abstracted from ENDF/B-IV files for 824 nuclides are summarized. These data are most often requested by users concerned with reactor design, reactor safety, dose, and other sundry studies. The few known file errors are corrected to date. Tabular data are listed by increasing mass number

  15. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    Science.gov (United States)

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets using binary large object implementations (BLOBs) in database systems versus implementation in Hadoop files using the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as bandwidth and response time performance. This requires partitioning larger files into a set of smaller files, and is accompanied by the concomitant requirement for managing large numbers of files. Storing these sub-files as blobs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these blobs and would run in parallel across the set of nodes in the cluster. On the other hand, there are both storage overheads and constraints, and software licensing dependencies created by such an implementation. Another approach is to store the files in an external filesystem with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available
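
    As a toy illustration of the two strategies being compared (not the GEON or OpenTopography implementation), the sketch below stores raster tiles either as BLOBs inside SQLite or as external files with a pointer table in the database; all names are invented:

    ```python
    import os
    import sqlite3

    db = sqlite3.connect("tiles.db")
    db.execute("CREATE TABLE IF NOT EXISTS blob_tiles (tile_id TEXT PRIMARY KEY, data BLOB)")
    db.execute("CREATE TABLE IF NOT EXISTS file_tiles (tile_id TEXT PRIMARY KEY, path TEXT)")

    def store_as_blob(tile_id: str, payload: bytes):
        # Strategy 1: the DBMS holds the bytes; subsetting could run as a UDF.
        db.execute("INSERT OR REPLACE INTO blob_tiles VALUES (?, ?)", (tile_id, payload))
        db.commit()

    def store_as_file(tile_id: str, payload: bytes, root: str = "tiles"):
        # Strategy 2: a filesystem (or HDFS) holds the bytes; the DB holds a pointer.
        os.makedirs(root, exist_ok=True)
        path = os.path.join(root, tile_id + ".bin")
        with open(path, "wb") as f:
            f.write(payload)
        db.execute("INSERT OR REPLACE INTO file_tiles VALUES (?, ?)", (tile_id, path))
        db.commit()
    ```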

  16. Pengembangan Algoritma Fast Inversion dalam Membentuk Inverted File untuk Text Retrieval dengan Data Skala Besar

    Directory of Open Access Journals (Sweden)

    Derwin Suhartono

    2012-06-01

    Full Text Available The rapid development of information systems generates new needs for the indexing and retrieval of various kinds of media. The need for documents in multimedia form is currently increasing, so storing and retrieving them has become a primary problem. The most commonly used multimedia type is text, widely seen as the main option in search engines such as Yahoo and Google. Essentially, search should not only return results but also do so through an efficient process. For the purposes of indexing and retrieval, an inverted file is used to provide faster results. However, building an inverted file becomes a problem when a large amount of data is involved. This study describes an algorithm called Fast Inversion, a development of the basic inverted-file construction method, to address the needs related to the amount of data.
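
    For reference, the baseline structure being built is sketched below: a mapping from each term to a postings list of documents and positions. This shows the basic inverted-file idea only, not the Fast Inversion algorithm itself:

    ```python
    from collections import defaultdict

    def build_inverted_file(docs):
        """Map each term to {doc_id: [positions]} postings."""
        index = defaultdict(dict)
        for doc_id, text in docs.items():
            for pos, term in enumerate(text.lower().split()):
                index[term].setdefault(doc_id, []).append(pos)
        return index

    docs = {1: "data file indexing", 2: "inverted file for text retrieval"}
    index = build_inverted_file(docs)
    print(index["file"])  # -> {1: [1], 2: [1]}
    ```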

  17. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    Science.gov (United States)

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
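
    The collection step of such a workflow is straightforward to automate. A minimal sketch, assuming Excel files as the primary format and a fixed column schema; the file pattern, column names and output path are invented, and this is not the ACuteTox implementation:

    ```python
    import glob

    import pandas as pd

    REQUIRED = ["compound", "concentration", "response"]  # hypothetical schema

    def collect(pattern="data/*.xlsx", out="all_experiments.csv"):
        """Validate each researcher file and merge them into one analysis table."""
        frames = []
        for path in glob.glob(pattern):
            df = pd.read_excel(path)
            missing = [c for c in REQUIRED if c not in df.columns]
            if missing:
                raise ValueError(f"{path}: missing columns {missing}")
            frames.append(df[REQUIRED].assign(source=path))
        pd.concat(frames, ignore_index=True).to_csv(out, index=False)
    ```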

  18. Parallel checksumming of data chunks of a shared data object using a log-structured file system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-09-06

    Checksum values are generated and used to verify data integrity. A client executing in a parallel computing system stores a data chunk to a shared data object on a storage node in the parallel computing system. The client determines a checksum value for the data chunk and provides the checksum value with the data chunk to the storage node that stores the shared object. The data chunk can be stored on the storage node with the corresponding checksum value as part of the shared object. The storage node may be part of a Parallel Log-Structured File System (PLFS), and the client may comprise, for example, a Log-Structured File System client on a compute node or burst buffer. The checksum value can be evaluated when the data chunk is read from the storage node to verify the integrity of the data that is read.
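
    A minimal sketch of the write-then-verify pattern described above, with an in-memory dictionary standing in for the storage node; the names and the choice of SHA-256 are illustrative, not the patented PLFS mechanism:

    ```python
    import hashlib

    def store_chunk(storage: dict, chunk_id: str, data: bytes):
        """Client side: compute a checksum and provide it with the chunk."""
        storage[chunk_id] = (data, hashlib.sha256(data).hexdigest())

    def read_chunk(storage: dict, chunk_id: str) -> bytes:
        """Read side: recompute and compare before trusting the data."""
        data, stored = storage[chunk_id]
        if hashlib.sha256(data).hexdigest() != stored:
            raise IOError(f"checksum mismatch for chunk {chunk_id}")
        return data

    shared_object = {}
    store_chunk(shared_object, "obj0.chunk0", b"simulation output bytes")
    assert read_chunk(shared_object, "obj0.chunk0") == b"simulation output bytes"
    ```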

  19. Requirements for an evaluated nuclear data file for accelerator-based transmutation

    International Nuclear Information System (INIS)

    Koning, A.J.

    1993-06-01

    The importance of intermediate-energy nuclear data files as part of a global calculation scheme for accelerator-based transmutation of radioactive waste systems (for instance with an accelerator-driven subcritical reactor) is discussed. A proposal for three intermediate-energy data libraries for incident neutrons and protons is presented: - a data library from 0 to about 100 MeV (first priority), - a reference data library from 20 to 1500 MeV, - an activation/transmutation library from 0 to about 100 MeV. Furthermore, the proposed ENDF-6 structure of each library is given. The data needs for accelerator-based transmutation are translated in terms of the aforementioned intermediate-energy data libraries. This could be a starting point for an ''International Evaluated Nuclear Data File for Transmutation''. This library could also be of interest for other applications in science and technology. Finally, some conclusions and recommendations concerning future evaluation work are given. (orig.)

  20. First Use of LHC Run 3 Conditions Database Infrastructure for Auxiliary Data Files in ATLAS

    CERN Document Server

    Aperio Bella, Ludovica; The ATLAS collaboration

    2016-01-01

    Processing of the large amount of data produced by the ATLAS experiment requires fast and reliable access to what we call Auxiliary Data Files (ADF). These files, produced by Combined Performance, Trigger and Physics groups, contain conditions, calibrations, and other derived data used by the ATLAS software. In ATLAS these data have, thus far for historical reasons, been collected and accessed outside the ATLAS Conditions Database infrastructure and related software. This, along with the fact that ADF data are effectively read by the software as binary objects, makes this class of data ideal for testing the proposed Run 3 Conditions data infrastructure now in development. This paper will describe this implementation as well as the lessons learned in exploring and refining the new infrastructure, with the potential for deployment during Run 2.

  1. Crash Injury Research and Engineering Network (CIREN) - CIREN data files

    Data.gov (United States)

    Department of Transportation — The CIREN process combines prospective data collection with professional multidisciplinary analysis of medical and engineering evidence to determine injury causation...

  2. A file of reference data for multiple-element neutron activation analysis

    International Nuclear Information System (INIS)

    Kabina, L.P.; Kondurov, I.A.; Shesterneva, I.M.

    1983-12-01

    Data needed for planning neutron activation analysis experiments and for processing their results are given. The decay schemes of the radioactive nuclei formed by thermal-neutron irradiation via the (n,γ) reaction, taken from the international ENSDF file, are used to calculate the activities of nuclei and to draw up an optimum table for identifying gamma lines in the measured spectra. (author)

  3. What Happens When Persons Leave Welfare: Data from the SIPP Panel File.

    Science.gov (United States)

    Lamas, Enrique; McNeil, John

    This document reports on a study of the likelihood of individuals participating in the Federal food stamp program and the Medicaid program and the likelihood of exiting those programs. Data were analyzed from the first panel file of the Survey of Income and Program Participation (SIPP). Special problems with representativeness and measurement…

  4. The International Evaluated Nuclear Structure Data File (ENSDF) in fundamental and applied photonuclear research

    International Nuclear Information System (INIS)

    Boboshin, I.N.; Varlamov, V.V.

    1989-04-01

    In order to provide the necessary nuclear physics data from the ENSDF file to those carrying out fundamental or applied photonuclear research a specialized software system was set up on an ES computer. A brief description of the block diagram of this software package and of one of the programs in this package (SUPER) is given. 4 refs, 6 figs

  5. First use of LHC Run 3 Conditions Database infrastructure for auxiliary data files in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00081940; The ATLAS collaboration; Barberis, Dario; Gallas, Elizabeth; Rybkin, Grigori; Rinaldi, Lorenzo; Aperio Bella, Ludovica; Buttinger, William

    2017-01-01

    Processing of the large amount of data produced by the ATLAS experiment requires fast and reliable access to what we call Auxiliary Data Files (ADF). These files, produced by Combined Performance, Trigger and Physics groups, contain conditions, calibrations, and other derived data used by the ATLAS software. In ATLAS this data has, thus far for historical reasons, been collected and accessed outside the ATLAS Conditions Database infrastructure and related software. For this reason, along with the fact that ADF are effectively read by the software as binary objects, this class of data appears ideal for testing the proposed Run 3 conditions data infrastructure now in development. This paper describes this implementation as well as the lessons learned in exploring and refining the new infrastructure with the potential for deployment during Run 2.

  6. The Evaluated Nuclear Structure Data File (ENSDF). Its philosophy, content and uses

    International Nuclear Information System (INIS)

    Burrows, T.W.

    1989-04-01

    The Evaluated Nuclear Structure Data File (ENSDF) is maintained by the National Nuclear Data Center (NNDC) on behalf of the international Nuclear Structure and Decay Data Network sponsored by the International Atomic Energy Agency, Vienna. For A≥44 the file is used to produce the Nuclear Data Sheets. Data for A=5 to 44 are extracted from the evaluations published in Nuclear Physics. The contents of ENSDF are briefly described, as are the philosophy and methodology of ENSDF evaluations. Also discussed are the services available at various nuclear data centers and the on-line services of the NNDC. Application codes developed for use with ENSDF are described, with the program RADLST used as an example. The interaction of ENSDF evaluations with other evaluations is also discussed. (author). 23 refs, 3 tabs

  7. Report on the achievements in the Sunshine Project in fiscal 1986. Surveys on coal type selection and surveys on coal types (Data file); 1986 nendo tanshu sentei chosa tanshu chosa seika hokokusho. Data file

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1987-03-01

    This data file concerns the coal types for liquefaction treated in the report on the achievements of the surveys on coal type selection and on coal types (JN0040843). The filed items of information include the occurrence and production of coals, the various kinds of analyses, and the test values from the liquefaction test data collected and submitted to date. The file consists of two parts: a test sample information file on the occurrence and production of coals and on coal mines, and an analysis and test file accommodating the results of the various analyses and tests. However, the test sample information files (1) through (6) have not yet been put in order with respect to such items as test samples and sample collection, geography, geology, ground beds, coal beds, coal mines, development and transportation. The analysis and test file contains (7) industrial analyses, (8) elemental analyses, (9) ash composition, (10) solubility of ash, (11) structure analyses, (12) liquefaction characteristics (standard version), (13) analyses of liquefaction-produced gas, (14) distillation characteristics of liquefaction-produced oil, (15) liquefaction characteristics (simplified version), (16) analyses of liquefaction-produced gas (simplified version), and (17) distillation characteristics of liquefaction-produced oil (simplified version). However, the information related to the liquefaction tests using a tubing reactor, items (15) through (17), has not yet been put in order. (NEDO)

  8. Hierarchical remote data possession checking method based on massive cloud files

    Directory of Open Access Journals (Sweden)

    Ma Haifeng

    2017-06-01

    Full Text Available Cloud storage services enable users to migrate their data and applications to the cloud, which saves local data maintenance and brings great convenience to users. But in cloud storage, the storage servers may not be fully trustworthy. How to verify the integrity of cloud data with low overhead for users has become an increasingly important problem. Many remote data integrity protection methods have been proposed, but these methods authenticate cloud files one by one when verifying multiple files; therefore, the computation and communication overheads remain high. Aiming at this problem, a hierarchical remote data possession checking (H-RDPC) method is proposed, which can provide efficient and secure remote data integrity protection and can support dynamic data operations. This paper gives the algorithm descriptions, security, and false-negative-rate analysis of H-RDPC. The security analysis and experimental performance evaluation show that the proposed H-RDPC is efficient and reliable in verifying massive cloud files, and it offers a 32–81% improvement in performance compared with RDPC.
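
    The abstract gives no algorithmic detail, so the sketch below illustrates only the generic remote-data-possession-checking idea, a keyed challenge-response over randomly sampled chunks; it is not the H-RDPC scheme:

    ```python
    import hashlib
    import hmac
    import os
    import random

    def respond(chunks, key, indices):
        """Prover side: keyed digest over the challenged chunks."""
        mac = hmac.new(key, digestmod=hashlib.sha256)
        for i in indices:
            mac.update(chunks[i])
        return mac.hexdigest()

    chunks = [os.urandom(4096) for _ in range(100)]  # outsourced file chunks
    # The owner precomputes answers for random challenges before outsourcing
    # (real schemes keep compact verification tags rather than the data).
    key, indices = os.urandom(16), random.sample(range(100), 5)
    expected = respond(chunks, key, indices)
    # Later, the storage server proves possession by answering the challenge.
    assert respond(chunks, key, indices) == expected
    ```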

  9. Alaska Resource Data File, McCarthy quadrangle, Alaska

    Science.gov (United States)

    Hudson, Travis L.

    2003-01-01

    Descriptions of the mineral occurrences shown on the accompanying figure follow. See U.S. Geological Survey (1996) for a description of the information content of each field in the records. The data presented here are maintained as part of a statewide database on mines, prospects and mineral occurrences throughout Alaska.

  10. Regional Files of GEOS3/SEASAT/GEOSAT Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Gravity anomalies and sea surface heights have been computed on a 0.125 degree grid in the ocean areas from a combined GEOS3/SEASAT/GEOSAT altimeter data set. This...

  11. Nuclide identifier and grat data reader application for ORIGEN output file

    International Nuclear Information System (INIS)

    Arif Isnaeni

    2011-01-01

    ORIGEN is a one-group depletion and radioactive decay computer code developed at the Oak Ridge National Laboratory (ORNL). ORIGEN performs one-group neutronics calculations providing various nuclear material characteristics (the buildup, decay and processing of radioactive materials). ORIGEN output is a text-based file containing only numbers, in the form of per-nuclide groups of data: nuclide identifiers and grat values. This application was created to facilitate the collection of nuclide identifier and grat data; it also acquires the mass number and calculates the mass (grams) of each nuclide. Output from this application can be used as input to computer codes for neutronic calculations such as MCNP. (author)
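
    A rough illustration of the kind of post-processing described; the output layout assumed here (a ZZAAA nuclide identifier followed by a grat value on each line) is hypothetical and serves only to show the extraction idea:

    ```python
    import re

    # Hypothetical line layout: "  92235   1.23E-02" (identifier, grat).
    LINE = re.compile(r"^\s*(\d{4,6})\s+([0-9.Ee+-]+)\s*$")

    def read_origen(path):
        """Collect (identifier, mass number, grat, mass) rows from a text file."""
        rows = []
        with open(path) as f:
            for line in f:
                m = LINE.match(line)
                if not m:
                    continue
                ident, grat = int(m.group(1)), float(m.group(2))
                a = ident % 1000  # mass number, assuming a ZZAAA identifier
                # If grat is interpreted as gram-atoms, mass (g) is roughly A * grat.
                rows.append((ident, a, grat, a * grat))
        return rows
    ```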

  12. Efficient analysis and extraction of MS/MS result data from Mascot™ result files

    Directory of Open Access Journals (Sweden)

    Sickmann Albert

    2005-12-01

    Full Text Available Abstract Background Mascot™ is a commonly used protein identification program for MS as well as for tandem MS data. When analyzing huge shotgun proteomics datasets with Mascot™'s native tools, limits of computing resources are easily reached. Up to now no application has been available as open source that is capable of converting the full content of Mascot™ result files from the original MIME format into a database-compatible tabular format, allowing direct import into database management systems and efficient handling of huge datasets analyzed by Mascot™. Results A program called mres2x is presented, which reads Mascot™ result files, analyzes them and extracts either selected or all information in order to store it in a single file or multiple files in formats which are easier to handle downstream of Mascot™. It generates different output formats. The output of mres2x in tab format is especially designed for direct high-performance import into relational database management systems using native tools of these systems. Having the data available in database management systems allows complex queries and extensive analysis. In addition, the original peak lists can be extracted in DTA format suitable for protein identification using the Sequest™ program, and the Mascot™ files can be split, preserving the original data format. During conversion, several consistency checks are performed. mres2x is designed to provide high-throughput processing combined with the possibility to be driven by other computer programs. The source code including supplementary material and precompiled binaries is available via http://www.protein-ms.de and http://sourceforge.net/projects/protms/. Conclusion The database upload allows regrouping of the MS/MS results using a database management system and complex analysis queries using SQL without the need to run new Mascot™ searches when changing grouping parameters.

  13. KENO2MCNP, Version 5L, Conversion of Input Data between KENO V.a and MCNP File Formats

    International Nuclear Information System (INIS)

    2008-01-01

    1 - Description of program or function: The KENO2MCNP program was written to convert KENO V.a input files to MCNP format. This program currently only works with KENO V.a geometries and will not work with geometries that contain more than a single array. A C++ graphical user interface was created that was linked to Fortran routines from KENO V.a that read the material library and Fortran routines from the MCNP Visual Editor that generate the MCNP input file. Either SCALE 5.0 or SCALE 5.1 cross section files will work with this release. 2 - Methods: The C++ binary executable reads the KENO V.a input file, the KENO V.a material library and SCALE data libraries. When an input file is read in, the input is stored in memory. The converter goes through and loads different sections of the input file into memory including parameters, composition, geometry information, array information and starting information. Many of the KENO V.a materials represent compositions that must be read from the KENO V.a material library. KENO2MCNP includes the KENO V.a FORTRAN routines used to read this material file for creating the MCNP materials. Once the file has been read in, the user must select 'Convert' to convert the file from KENO V.a to MCNP. This will generate the MCNP input file along with an output window that lists the KENO V.a composition information for the materials contained in the KENO V.a input file. The program can be run interactively by clicking on the executable or in batch mode from the command prompt. 3 - Restrictions on the complexity of the problem: Not all KENO V.a input files are supported. Only one array is allowed in the input file. Some of the more complex material descriptions also may not be converted.

  14. CRYSTMET—The NRCC Metals Crystallographic Data File

    Science.gov (United States)

    Wood, Gordon H.; Rodgers, John R.; Gough, S. Roger; Villars, Pierre

    1996-01-01

    CRYSTMET is a computer-readable database of critically evaluated crystallographic data for metals (including alloys, intermetallics and minerals) accompanied by pertinent chemical, physical and bibliographic information. It currently contains about 60 000 entries and covers the literature exhaustively from 1913. Scientific editing of the abstracted entries, consisting of numerous automated and manual checks, is done to ensure consistency with related, previously published studies, to assign structure types where necessary and to help guarantee the accuracy of the data and related information. Analyses of the entries and their distribution across key journals as a function of time show interesting trends in the complexity of the compounds studied as well as in the elements they contain. Two applications of CRYSTMET are the identification of unknowns and the prediction of properties of materials. CRYSTMET is available either online or via license of a private copy from the Canadian Scientific Numeric Database Service (CAN/SND). The indexed online search and analysis system is easy and economical to use yet fast and powerful. Development of a new system is under way combining the capabilities of ORACLE with the flexibility of a modern interface based on the Netscape browsing tool. PMID:27805157

  15. JENDL special purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1995-01-01

    In JENDL-3.2, data on all reactions having significant cross sections over the neutron energy range from 0.01 meV to 20 MeV are given for 340 nuclides. The intended range of application is wide, covering neutron engineering, shielding and other aspects of fast reactors, thermal neutron reactors and nuclear fusion reactors. This is a general purpose data file. In contrast, a file in which only the data required for a specific application field are collected is called a special purpose file. The dosimetry file is a typical special purpose file. The Nuclear Data Center of the Japan Atomic Energy Research Institute is preparing ten kinds of JENDL special purpose files; the files, for which the working groups of the Sigma Committee are responsible, are listed. As to the format of the files, the ENDF format is used, as in JENDL-3.2. The dosimetry file, activation cross section file, (α,n) reaction data file, fusion file, actinoid file, high energy data file, photonuclear data file, PKA/KERMA file, gas production cross section file and decay data file are described in terms of their contents, the course of their development and their verification. The dosimetry file and the gas production cross section file have already been completed. For the others, the expected time of completion is shown. When these files are completed, they will be made available to the public. (K.I.)

  16. Creating Customized Data Files in E-Prime: A Practical Tutorial

    Directory of Open Access Journals (Sweden)

    İyilikci, Osman

    2018-02-01

    There are packages that simplify experiment generation by taking advantage of a graphical user interface. Typically, such packages create output files that are not in an appropriate format to be analyzed directly with a statistical package. For this reason, researchers must complete several time-consuming steps and use additional software to prepare data for statistical analysis. The present paper suggests a particular E-Basic technique which saves time in the data analysis process and is applicable to a wide range of experiments that measure reaction time and response accuracy. The technique demonstrated here makes it possible to create a customized, ready-to-analyze data file automatically while running an experiment designed in the E-Prime environment.
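
    A rough Python analog of the idea (the tutorial itself uses E-Basic inside E-Prime; the column names and simulated response below are invented): write one ready-to-analyze row per trial while the experiment runs, so no post-hoc restructuring is needed.

    ```python
    import csv
    import random
    import time

    # One row per trial, written as the experiment runs.
    with open("results.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["subject", "trial", "condition", "rt_ms", "accuracy"])
        for trial, condition in enumerate(["congruent", "incongruent"] * 3, start=1):
            t0 = time.perf_counter()
            response_correct = random.choice([0, 1])       # stand-in for a key press
            rt_ms = (time.perf_counter() - t0) * 1000.0    # stand-in reaction time
            writer.writerow([1, trial, condition, round(rt_ms, 2), response_correct])
    ```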

  17. Evaluated nuclear data file libraries use in nuclear-physical calculations

    International Nuclear Information System (INIS)

    Gritsaj, O.O.; Kalach, N.Yi.; Kal'chenko, O.Yi.; Kolotij, V.V.; Vlasov, M.F.

    1994-01-01

    The necessity of using updated nuclear data is established for modeling neutron experiments, for preparing suitable data for reactor calculations, and for other applications where account of the detailed energy structure of cross sections is required. The scheme of a system to coordinate the work of collecting and preparing evaluated nuclear data on an international scale is presented. The main updated and recommended nuclear data libraries and associated computer programs are reviewed. Total neutron cross sections for 28 energy groups, calculated on the basis of the evaluated nuclear data file for the natural mixture of iron isotopes (BROND-2, 1991), have been compared with BNAB-78 data. (author). 7 refs., 1 tab., 4 figs

  18. Ground-Based Global Navigation Satellite System (GNSS) GPS Broadcast Ephemeris Data (daily files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) GPS Broadcast Ephemeris Data (daily files) from the NASA Crustal Dynamics Data...

  19. Ground-Based Global Navigation Satellite System Mixed Broadcast Ephemeris Data (sub-hourly files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) Mixed Broadcast Ephemeris Data (sub-hourly files) from the NASA Crustal Dynamics Data...

  20. Parallel compression of data chunks of a shared data object using a log-structured file system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-10-25

    Techniques are provided for parallel compression of data chunks being written to a shared object. A client executing on a compute node or a burst buffer node in a parallel computing system stores a data chunk generated by the parallel computing system to a shared data object on a storage node by compressing the data chunk and providing the compressed data chunk to the storage node that stores the shared object. The client and storage node may employ log-structured file techniques. The compressed data chunk can be decompressed by the client when the data chunk is read. A storage node stores a data chunk as part of a shared object by receiving a compressed version of the data chunk from a compute node and storing the compressed version to the shared data object on the storage node.
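
    A minimal single-process sketch of the pattern (an assumed file layout, not the patented implementation): each chunk is compressed, appended to the shared object, and located again through a small index, in the log-structured spirit described above.

    ```python
    import zlib

    index = []  # (chunk_id, offset, compressed_length)

    def write_chunk(path, chunk_id, data):
        comp = zlib.compress(data)
        with open(path, "ab") as f:
            offset = f.seek(0, 2)   # append position = current end of the shared object
            f.write(comp)
        index.append((chunk_id, offset, len(comp)))

    def read_chunk(path, chunk_id):
        _, offset, length = next(e for e in index if e[0] == chunk_id)
        with open(path, "rb") as f:
            f.seek(offset)
            return zlib.decompress(f.read(length))

    write_chunk("shared.obj", 0, b"x" * 4096)
    assert read_chunk("shared.obj", 0) == b"x" * 4096
    ```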

  1. New remarks on KERMA factors and DPA cross section data in ACE files

    International Nuclear Information System (INIS)

    Konno, Chikara; Sato, Satoshi; Ohta, Masayuki; Kwon, Saerom; Ochiai, Kentaro

    2016-01-01

    KERMA factors and DPA cross section data are essential for nuclear heating and material damage estimation in fusion reactor designs. Recently we compared the KERMA factors and DPA cross section data in the latest official ACE files of JENDL-4.0, ENDF/B-VII.1, JEFF-3.2 and FENDL-3.0 and found that the KERMA factors and DPA cross section data of many nuclides did not always agree among the nuclear data libraries. We investigated the nuclear data libraries and the nuclear data processing code NJOY and identified new reasons for the discrepancies: (1) incorrect nuclear data and NJOY bugs, (2) huge helium production cross section data, (3) the gamma production data format in the nuclear data, (4) missing detailed secondary particle data (energy-angular distribution data). These problems should be resolved based on this study.

  2. New remarks on KERMA factors and DPA cross section data in ACE files

    Energy Technology Data Exchange (ETDEWEB)

    Konno, Chikara, E-mail: konno.chikara@jaea.go.jp; Sato, Satoshi; Ohta, Masayuki; Kwon, Saerom; Ochiai, Kentaro

    2016-11-01

    KERMA factors and DPA cross section data are essential for nuclear heating and material damage estimation in fusion reactor designs. Recently we compared the KERMA factors and DPA cross section data in the latest official ACE files of JENDL-4.0, ENDF/B-VII.1, JEFF-3.2 and FENDL-3.0 and found that the KERMA factors and DPA cross section data of many nuclides did not always agree among the nuclear data libraries. We investigated the nuclear data libraries and the nuclear data processing code NJOY and identified new reasons for the discrepancies: (1) incorrect nuclear data and NJOY bugs, (2) huge helium production cross section data, (3) the gamma production data format in the nuclear data, (4) missing detailed secondary particle data (energy-angular distribution data). These problems should be resolved based on this study.

  3. Reconstruction of point cross-section from ENDF data file for Monte Carlo applications

    International Nuclear Information System (INIS)

    Kumawat, H.; Saxena, A.; Carminati, F.

    2016-12-01

    Monte Carlo neutron transport codes are among the best tools to simulate complex systems like fission and fusion reactors, accelerator-driven sub-critical systems, radioactivity management of spent fuel and waste, optimization and characterization of neutron detectors, optimization of boron neutron capture therapy, imaging, etc. Neutron cross sections and secondary particle emission properties are the main input parameters of such codes. The fission, capture and elastic scattering cross sections have complex resonating structures. The Evaluated Nuclear Data File (ENDF) contains these cross sections and secondary parameters. We report the development of a reconstruction procedure to generate point cross sections and probabilities from an ENDF data file. The cross sections are compared with the values obtained from PREPRO and, in some cases, NJOY codes. The results are in good agreement. (author)

  4. Development of a utility system for nuclear reaction data file: WinNRDF

    International Nuclear Information System (INIS)

    Aoyama, Shigeyoshi; Ohbayasi, Yosihide; Masui, Hiroshi; Chiba, Masaki; Kato, Kiyoshi; Ohnishi, Akira

    2000-01-01

    A utility system, WinNRDF, has been developed for the charged-particle nuclear reaction data of NRDF (Nuclear Reaction Data File) with a Windows interface. With this system, the experimental data of a charged-particle nuclear reaction can be searched in NRDF more easily than with the old retrieval systems on the mainframe, and the experimental data can be displayed graphically through a GUI (Graphical User Interface). We adopted a mechanism for building a new index of keywords to make practical use of the time-dependent properties of the NRDF database. (author)

  5. Status of data testing of ENDF/B-V reactor dosimetry file

    International Nuclear Information System (INIS)

    Magurno, B.A.

    1979-01-01

    The ENDF/B-V Reactor Dosimetry File was released in August 1979, and Phase II data testing started. The results presented here are from Brookhaven National Laboratory only, and are considered preliminary. The tests include calculated spectrum-averaged cross sections using the U-235 fission spectrum (Watt), the Cf-252 spontaneous fission spectrum (Watt and Maxwellian), and the Coupled Fast Reactor Measurement Facility (CFRMF) spectrum. 6 tables

  6. Data File.

    Science.gov (United States)

    Vocational Education Journal, 1991

    1991-01-01

    Eleven graphs present statistics on the number of adult, secondary, and college vocational teachers; demographics of high school vocational and nonvocational teachers; beginning teachers' race, job satisfaction, stress levels, income, and age; and subjects taught by high school, junior high, and two-year college vocational faculty. (SK)

  7. Total cross-sections assessment of neutron reaction with stainless steel SUS-310 contained in various nuclear data files

    International Nuclear Information System (INIS)

    Suwoto

    2002-01-01

    Integral testing of the neutron cross sections for stainless steel SUS-310 contained in various nuclear data files has been performed. Shielding benchmark calculations for SUS-310 were analysed against the ORNL broomstick experiment performed by R.E. Maerker at ORNL, USA. Assessments with the JENDL-3.1, JENDL-3.2, ENDF/B-IV and ENDF/B-VI nuclear data files, and with data from GEEL, have also been carried out. The overall calculated results for SUS-310 show good agreement with the experimental data, although underestimates appear below 3 MeV for all nuclear data files. This tendency toward underestimation is clearly caused by iron, which makes up more than half of the stainless steel composition: the total neutron cross sections of iron contained in the various nuclear data files are relatively low in that energy range

  8. The method to set up file-6 in neutron data library of light nuclei below 20 MeV

    International Nuclear Information System (INIS)

    Zhang Jingshang; Han Yinlu

    2001-01-01

    So far there is no file-6 (double differential cross section data, DDX) for light nuclei in the main evaluated neutron nuclear data libraries in the world. A proper description of the double differential cross sections of all kinds of outgoing particles from neutron-induced reactions on light nuclei below 20 MeV is therefore needed. The motivation for this work is to introduce a way to set up file-6 in the neutron data library

  9. ENDF-UTILITY-CODES, codes to check and standardize data in the Evaluated Nuclear Data File (ENDF)

    International Nuclear Information System (INIS)

    Dunford, Charles L.

    2007-01-01

    1 - Description of program or function: The ENDF Utility Codes include 9 codes to check and standardize data in the Evaluated Nuclear Data File (ENDF). Four programs of this release, GETMAT, LISTEF, PLOTEF and SETMDC, have not been maintained since release 6.13. The suite of ENDF utility codes includes: - CHECKR (version 7.01), a program for checking that an evaluated data file conforms to the ENDF format. - FIZCON (version 7.02), a program for checking that an evaluated data file has valid data and conforms to recommended procedures. - GETMAT (version 6.13), designed to retrieve one or more materials from an ENDF formatted data file; the output will contain only the selected materials. - INTER (version 7.01), which calculates thermal cross sections, g-factors, resonance integrals, fission spectrum averaged cross sections and 14.0 MeV (or other energy) cross sections for major reactions in an ENDF-6 or ENDF-5 format data file. - LISTEF (version 6.13), designed to produce summary and annotated listings of a data file in either ENDF-6 or ENDF-5 format. - PLOTEF (version 6.13), designed to produce graphical displays of a data file in either ENDF-5 or ENDF-6 format; the form of graphical output depends on the graphical devices available at the installation where the code is used. - PSYCHE (version 7.02), a program for checking the physics content of an evaluated data file; it can recognize the difference between the ENDF-5 and ENDF-6 formats and performs its tests accordingly. - SETMDC (version 6.13), a utility program that converts the source decks of programs for different computers and operating systems (DOS, UNIX, LINUX, VMS, Windows). - STANEF (version 7.01), which performs bookkeeping operations on a data file containing one or more material evaluations in ENDF format. Version 7.02 of the ENDF Utility Codes corrects all bugs reported to the NNDC as of April 1, 2005 and supersedes all previous releases. Three codes, CHECKR, STANEF and INTER, were carried over from the 7.01 release

  10. Formatting data files for repeated-measures analyses in SPSS: Using the Aggregate and Restructure procedures

    Directory of Open Access Journals (Sweden)

    Gyslain Giguère

    2006-03-01

    In this tutorial, we demonstrate how to use the Aggregate and Restructure procedures available in SPSS (versions 11 and up) to prepare data files for repeated-measures analyses. In the first two sections of the tutorial, we briefly describe the Aggregate and Restructure procedures. In the final section, we present an example in which the data from a fictional lexical decision task are prepared for analysis using a mixed-design ANOVA. The tutorial demonstrates that the presented method is the most efficient way to prepare data for repeated-measures analyses in SPSS.
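
    For readers who work outside SPSS, a pandas analog of the two steps (column names invented): Aggregate corresponds to a groupby-mean, and Restructure to a pivot from long format to the wide, one-row-per-subject layout that repeated-measures procedures expect.

    ```python
    import pandas as pd

    trials = pd.DataFrame({
        "subject":   [1, 1, 1, 1, 2, 2, 2, 2],
        "condition": ["word", "word", "nonword", "nonword"] * 2,
        "rt_ms":     [512, 498, 640, 655, 530, 541, 610, 598],
    })

    # "Aggregate": mean RT per subject x condition
    agg = trials.groupby(["subject", "condition"], as_index=False)["rt_ms"].mean()

    # "Restructure": long -> wide, one row per subject
    wide = agg.pivot(index="subject", columns="condition", values="rt_ms")
    print(wide)
    ```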

  11. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    Science.gov (United States)

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…
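
    As a flavor of what such a structured log record might look like (the element and attribute names below are invented for illustration, not the ETS schema):

    ```python
    import xml.etree.ElementTree as ET

    # Build one hypothetical G/SBA log event as XML.
    log = ET.Element("gsbaLog", student="S001", task="circuit-sim")
    event = ET.SubElement(log, "event", type="action", timestamp="2016-01-15T10:32:05Z")
    ET.SubElement(event, "verb").text = "connect"
    ET.SubElement(event, "object").text = "resistor_3"

    print(ET.tostring(log, encoding="unicode"))
    ```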

  12. Comparison of WIMS results using libraries based on new evaluated data files

    International Nuclear Information System (INIS)

    Trkov, A.; Ganesan, S.; Zidi, T.

    1996-01-01

    A number of selected benchmark experiments have been modelled with the WIMS-D/4 lattice code. Calculations were performed using multigroup libraries generated from a number of newly released evaluated data files. Data processing was done with the NJOY91.38 code. Since the data processing methods were the same in all cases, the results may serve to determine the impact on integral parameters due to differences in the basic data. The calculated integral parameters were also compared to the measured values. Observed differences were small, which means that there are no essential differences between the evaluated data libraries. The results of the analysis cannot serve to discriminate in terms of quality of the data between the evaluated data libraries considered. For the test cases considered the results with the new, unadjusted libraries are at least as good as those obtained with the old, adjusted WIMS library which is supplied with the code. (author). 16 refs, 3 tabs

  13. ENDF/B-V 7 Standards Data File (EN5-ST Library)

    International Nuclear Information System (INIS)

    DayDay, N.; Lemmel, H.D.

    1980-10-01

    This document summarizes the contents and documentation of the ENDF/B-V 7 Standards Data File (EN5-ST Library) released in September 1979. The library contains complete evaluations for all significant neutron reactions in the energy range 10^-5 eV to 20 MeV for the H-1, He-3, Li-6, B-10, C-12, Au-197 and U-235 isotopes. The entire library, or selective retrievals from it, can be obtained free of charge from the IAEA Nuclear Data Section. (author)

  14. Archive Inventory Management System (AIMS) — A Fast, Metrics Gathering Framework for Validating and Gaining Insight from Large File-Based Data Archives

    Science.gov (United States)

    Verma, R. V.

    2018-04-01

    The Archive Inventory Management System (AIMS) is a software package for understanding the distribution, characteristics, integrity, and nuances of files and directories in large file-based data archives on a continuous basis.

  15. Development of Indian cross section data files for Th-232 and U-233 and integral validation studies

    International Nuclear Information System (INIS)

    Ganesan, S.

    1988-01-01

    This paper presents an overview of the tasks performed towards the development of Indian cross section data files for Th-232 and U-233. Discrepancies among the neutron induced reaction cross sections in the various available evaluated data files have been identified by processing the basic data into multigroup form and intercomparing the results. Interesting results of integral validation studies for the capture, fission and (n,2n) cross sections of Th-232, from analyses of selected integral measurements, are presented. In the resonance range, energy regions where significant differences in the calculated self-shielding factors for Th-232 occur have been identified by comparing self-shielded multigroup cross sections derived from two recent evaluated data files, viz., ENDF/B-V (Rev. 2) and JENDL-2, for several dilutions and temperatures. For U-233, the three different basic data files ENDF/B-IV, JENDL-2 and ENDL-84 were intercompared. Interesting observations on the predictive capability of these files for the criticality of the spherical metal U-233 system are given. The current status of the Indian data file is presented. (author) 62 ref

  16. EQPT, a data file preprocessor for the EQ3/6 software package: User's guide and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Daveler, S.A.; Wolery, T.J.

    1992-01-01

    EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets, containing both standard-state and activity-coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer's (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25 degrees C only to 0-300 degrees C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file directly, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fit to data which are input on temperature grids; the coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer's equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive-form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive-form parameters makes it easier to evaluate Pitzer's equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers.

  17. EQPT, a data file preprocessor for the EQ3/6 software package: User's guide and related documentation (Version 7.0); Part 2

    Energy Technology Data Exchange (ETDEWEB)

    Daveler, S.A.; Wolery, T.J.

    1992-12-17

    EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets, containing both standard-state and activity-coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer's (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25 °C only to 0-300 °C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file directly, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fit to data which are input on temperature grids; the coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer's equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive-form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive-form parameters makes it easier to evaluate Pitzer's equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers.
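
    The data0-to-data1 polynomial transformation described above can be pictured with a short sketch (the grid values below are illustrative, loosely resembling the ion product of water, and are not real data-file contents):

    ```python
    import numpy as np

    # log K values given on a 0-300 C temperature grid in a data0 file ...
    temps_c = np.array([0.0, 25.0, 60.0, 100.0, 150.0, 200.0, 250.0, 300.0])
    log_k   = np.array([14.94, 14.00, 13.02, 12.26, 11.64, 11.28, 11.17, 11.30])

    # ... are replaced on the data1 file by interpolating-polynomial coefficients
    coeffs = np.polyfit(temps_c, log_k, deg=4)

    # which the modeling codes can then evaluate cheaply at any temperature
    log_k_75 = np.polyval(coeffs, 75.0)
    print(coeffs, log_k_75)
    ```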

  18. Establishment of data base files of thermodynamic data developed by OECD/NEA. Part 4. Addition of thermodynamic data for iron, tin and thorium

    International Nuclear Information System (INIS)

    Yoshida, Yasushi; Kitamura, Akira

    2014-12-01

    Thermodynamic data for compounds and complexes of elements with auxiliary species, specialized to the modeling requirements of safety assessments for radioactive waste disposal systems, have been developed by the Thermochemical Data Base (TDB) project of the Nuclear Energy Agency in the Organisation for Economic Co-operation and Development (OECD/NEA). Recently, thermodynamic data for aqueous complexes, solids and gases of thorium, tin and iron (Part 1) were published in 2008, 2012 and 2013, respectively. These thermodynamic data have been selected on the basis of the NEA's guidelines, which describe peer review and data selection, extrapolation to zero ionic strength, assignment of uncertainty, and temperature correction; the selected data are therefore considered reliable. The reliability of the selected thermodynamic data of the TDB developed by the Japan Atomic Energy Agency (JAEA-TDB) has been confirmed by comparison with the data selected by the NEA. For this comparison, text files of the selected data for several geochemical calculation programs are required. In the present report, database files for the NEA's TDB, with the addition of the selected data for iron, tin and thorium to the previous files, have been established for use with PHREEQC, Geochemist's Workbench and EQ3/6. In addition, as an example of quality confirmation, dominant species in the iron TDB were compared in an Eh-pH diagram, and differences between JAEA-TDB and NEA-TDB were shown. The database files established in the present study will be available at the website of the thermodynamic, sorption and diffusion database in JAEA (http://migrationdb.jaea.go.jp/). A CD-ROM is attached as an appendix. (J.P.N.)

  19. CINDA 83 (1977-1983). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1983-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data exchanged between four regional neutron data centres. The present issue, CINDA 83, is an index to the literature on neutron data published after 1976. The basic volume, CINDA-A, together with the present issue, contains the full CINDA file as of 1 April 1983. A supplement to CINDA 83 is foreseen for fall 1983. Next year's issue, which is envisaged to be published in June 1984, will again cover all relevant literature that has appeared after 1976

  20. Review of ENDF/B-VI Fission-Product Cross Sections [Evaluated Nuclear Data File]

    Energy Technology Data Exchange (ETDEWEB)

    Wright, R.Q.; MacFarlane, R.E.

    2000-04-01

    In response to concerns raised in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 93-2, the US Department of Energy (DOE) developed a comprehensive program to help ensure that the DOE maintains and enhances its capability to predict the criticality of systems throughout the complex. Tasks developed to implement the response to DNFSB Recommendation 93-2 included Critical Experiments, Criticality Benchmarks, Training, Analytical Methods, and Nuclear Data. The Nuclear Data Task consists of a program of differential measurements at the Oak Ridge Electron Linear Accelerator (ORELA); precise fitting of the differential data with the generalized least-squares fitting code SAMMY to represent the data with resonance parameters using the Reich-Moore formalism, along with covariance (uncertainty) information; and the development of complete evaluations for selected nuclides for inclusion in the Evaluated Nuclear Data File (ENDF/B).

  1. ENDF-6 File 30: Data covariances obtained from parameter covariances and sensitivities

    International Nuclear Information System (INIS)

    Muir, D.W.

    1989-01-01

    File 30 is provided as a means of describing the covariances of tabulated cross sections, multiplicities, and energy-angle distributions that result from propagating the covariances of a set of underlying parameters (for example, the input parameters of a nuclear-model code), using an evaluator-supplied set of parameter covariances and sensitivities. Whenever nuclear data are evaluated primarily through the application of nuclear models, the covariances of the resulting data can be described very adequately, and compactly, by specifying the covariance matrix for the underlying nuclear parameters, along with a set of sensitivity coefficients giving the rate of change of each nuclear datum of interest with respect to each of the model parameters. Although motivated primarily by these applications of nuclear theory, use of File 30 is not restricted to any one particular evaluation methodology. It can be used to describe data covariances of any origin, so long as they can be formally separated into a set of parameters with specified covariances and a set of data sensitivities
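
    In matrix notation, the propagation described here is the familiar first-order "sandwich rule"; a sketch (the symbols below are chosen for illustration):

    ```latex
    \[
      V_d \;=\; S \, V_p \, S^{\mathsf{T}},
      \qquad
      S_{ik} \;=\; \frac{\partial d_i}{\partial p_k},
    \]
    % V_p : evaluator-supplied covariance matrix of the model parameters p
    % S   : sensitivities of each tabulated datum d_i to each parameter p_k
    % V_d : resulting covariance matrix of the tabulated data
    ```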

  2. Data Qualification Report For: Thermodynamic Data File, DATA0.YMP.R0 For Geochemical Code, EQ3/6 

    Energy Technology Data Exchange (ETDEWEB)

    P.L. Cloke

    2001-10-16

    The objective of this work is to evaluate the adequacy of the chemical thermodynamic data provided by Lawrence Livermore National Laboratory (LLNL) as Data0.ymp.R0A in response to an input request submitted under AP-3.14Q. This request specified that the chemical thermodynamic data available in the file Data0.com.R2 be updated, improved, and augmented for use in the geochemical modeling used in Process Model Reports (PMRs) for Engineered Barrier Systems, Waste Form, Waste Package, Unsaturated Zone, and Near Field Environment, as well as for Performance Assessment. The data are qualified in the temperature range 0 to 100 °C. These data are a Principal Factor affecting several Data Tracking Numbers (DTNs) associated with Analysis/Model Reports (AMRs) that address various aspects of the post-closure chemical behavior of the waste package and the Engineered Barrier System and rely on EQ3/6 outputs for which these data are used as input. This qualification activity was accomplished in accordance with AP-SIII.2Q using the Technical Assessment method. A development plan, TDP-EBS-MD-000044, was prepared in accordance with AP-2.13Q and approved by the Responsible Manager. In addition, a Process Control Evaluation was performed in accordance with AP-SV.1Q. The rationale for the Technical Assessment approach is that the data in file Data0.com.R2 are considered handbook data and therefore do not themselves require qualification; only changes to Data0.com.R2 required qualification. A new file has been produced which contains the database Data0.ymp.R0, which is recommended for qualification as a result of this action. Data0.ymp.R0 will supersede Data0.com.R2 for all Yucca Mountain Project (YMP) activities.

  3. Nuclear Reaction Data File for Astrophysics (NRDF/A) in Hokkaido University Nuclear Reaction Data Center

    International Nuclear Information System (INIS)

    Kato, Kiyoshi; Kimura, Masaaki; Furutachi, Naoya; Makinaga, Ayano; Togashi, Tomoaki; Otuka, Naohiko

    2010-01-01

    The activities of the Japan Nuclear Reaction Data Centre are explained. The main task of the centre is the compilation of Japanese nuclear reaction data in collaboration with the International Network of Nuclear Reaction Data Centres. As recent activities, the preparation of a new database (NRDF/A) and the evaluation of astronuclear reaction data are reported. Collaboration in nuclear data activities among Asian countries is proposed.

  4. Data file on retention and excretion of inhaled radionuclides calculated using ICRP dosimetric models

    International Nuclear Information System (INIS)

    Ishigure, Nobuhito; Nakano, Takashi; Enomoto, Hiroko; Shimo, Michikuni; Inaba, Jiro

    2000-01-01

    The authors have computed the whole-body or specific-organ content and the daily urinary and faecal excretion rates of selected radionuclides following acute intake by inhalation and ingestion, applying the new ICRP respiratory tract model (ICRP Publication 66) and the latest ICRP biokinetic models. The results were compiled in an MS Excel file, tentatively called MONDAI for reference. MONDAI contains the data for all radionuclides in ICRP Publications 54 and 78 and, in addition, some other radionuclides which are important from the viewpoint of occupational exposure in nuclear industry, research and medicine. They are H-3, P-32, Cr-51, Mn-54, Fe-59, Co-57, Co-58, Co-60, Zn-65, Rb-86, Sr-85, Sr-89, Sr-90, Zr-95, Ru-106, Ag-110m, Sb-124, Sb-125, I-125, I-129, I-131, Cs-134, Cs-137, Ba-140, Ce-141, Ce-144, Hg-203, Ra-226, Ra-228, Th-228, Th-232, U-234, U-235, U-238, Np-237, Pu-238, Pu-239, Pu-240, Am-241, Cm-242, Cm-244 and Cf-252. Day-by-day data up to 1000 days and data at every 10 days up to 10000 days are presented. The following ICRP default values for the physical characteristics of the radioactive aerosols were used: AMAD = 5 micron, geometric SD = 2.5, particle density = 3 g/cm³, particle shape factor = 1.5. The subject exposed to the aerosols is the ICRP reference worker doing light work: light exercise with a ventilation rate of 1.5 m³/h for 5.5 h plus sitting with a ventilation rate of 0.54 m³/h for 2.5 h. MONDAI was originally made with Version 7.0 of MS Excel for Windows 95, but the file was saved in the form of Ver. 4.0 as well as Ver. 7.0; a user with Ver. 4.0 or a later version can therefore open and operate the file. With the graph wizard of MS Excel the user can easily make a diagram of the retention or daily excretion of a radionuclide of interest. The dose coefficient (Sv/Bq intake) of each radionuclide for each absorption type given in ICRP Publication 68 was also written in each sheet. Therefore

  5. School Survey on Crime and Safety (SSOCS) 2000 Public-Use Data Files, User's Manual, and Detailed Data Documentation. [CD-ROM].

    Science.gov (United States)

    National Center for Education Statistics (ED), Washington, DC.

    This CD-ROM contains the raw, public-use data from the 2000 School Survey on Crime and Safety (SSOCS) along with a User's Manual and Detailed Data Documentation. The data are provided in SAS, SPSS, STATA, and ASCII formats. The User's Manual and the Detailed Data Documentation are provided as .pdf files. (Author)

  6. Performance of the engineering analysis and data system 2 common file system

    Science.gov (United States)

    Debrunner, Linda S.

    1993-01-01

    The Engineering Analysis and Data System (EADS) was used from April 1986 to July 1993 to support large scale scientific and engineering computation (e.g. computational fluid dynamics) at Marshall Space Flight Center. The need for an updated system resulted in an RFP in June 1991, after which a contract was awarded to Cray Grumman. EADS II was installed in February 1993, and by July 1993 most users had been migrated. EADS II is a network of heterogeneous computer systems supporting scientific and engineering applications. The Common File System (CFS) is a key component of this system. The CFS provides a seamless, integrated environment to the users of EADS II, including both disk and tape storage. UniTree software is used to implement this hierarchical storage management system. The performance of the CFS suffered during the early months of the production system. Several of the performance problems were traced to software bugs, which have been corrected. Other problems were associated with hardware. However, the use of NFS in the UniTree UCFM software limits the performance of the system. The performance issues related to the CFS have led to a need to develop a greater understanding of the CFS organization. This paper first describes EADS II with emphasis on the CFS. Then, a discussion of mass storage systems is presented, and methods of measuring the performance of the Common File System are outlined. Finally, areas for further study are identified and conclusions are drawn.

  7. Generation of SCALE 6 Input Data File for Cross Section Library of PWR Spent Fuel

    International Nuclear Information System (INIS)

    Jeong, Chang Joon; Cho, Dong Keun

    2010-11-01

    In order to obtain cross section libraries for Korean pressurized water reactor (PWR) spent fuel (SF), SCALE 6 code input files have been generated. The PWR fuel data were obtained from the nuclear design reports (NDRs) of the currently operating PWRs. Input files were prepared for 16 fuel types, such as 4 types of Westinghouse 14x14, 3 types of OPR-1000 16x16, 4 types of Westinghouse 16x16, and 6 types of Westinghouse 17x17. For each fuel type, five fuel enrichments were considered: 1.5, 2.0, 3.0, 4.0 and 5.0 wt%. In the SCALE 6 calculations, an ENDF/B-V 44-group library was used, with 25 burnup steps up to 72000 MWD/T. A 1/4-symmetry model was used for the 16x16 and 17x17 fuel assemblies, and a 1/2-symmetry model for the 14x14 fuel assemblies. The generated cross section libraries will be used for the source-term analysis of the PWR SF
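
    A hypothetical helper (not part of SCALE; the file names and comment line below are invented) showing how the fuel-type by enrichment case matrix described above could be enumerated, one input file per case:

    ```python
    fuel_types = ["W14x14_STD", "OPR1000_16x16", "W16x16", "W17x17_V5"]  # abridged list
    enrichments = [1.5, 2.0, 3.0, 4.0, 5.0]                              # wt% U-235

    for fuel in fuel_types:
        for enr in enrichments:
            case = f"{fuel}_e{enr:.1f}"
            with open(case + ".inp", "w") as f:
                # SCALE comment line; the real input deck would follow
                f.write(f"' spent-fuel depletion case: {fuel}, {enr} wt% U-235\n")
            print("wrote", case + ".inp")
    ```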

  8. The ASCO Oncology Composite Provider Utilization File: New Data, New Insights.

    Science.gov (United States)

    Barr, Thomas R; Towle, Elaine L

    2016-01-01

    As we seek to understand the changing practice environment in oncology, the need for accurate information about demand for services, distribution of the delivery system in this sector of the health economy, and other practice trends is apparent. In this article, we present an analysis of the sector using one of the public use files from the Centers for Medicare & Medicaid Services in combination with other publicly available data. Medicare data are particularly useful for this analysis because cancer is associated with aging and Medicare is the primary payer in the United States for patients older than age 65. As a result, nearly all oncologists who serve adult populations are represented in these data. By combining publicly available datasets into what we call the ASCO Provider Utilization File, we can investigate a wide range of supply, demand, and practice issues. We calculate the average work performed per physician, observe regional differences in work production, and quantify the downside risk and upside potential associated with the provision of chemotherapy drugs. Comparing the supply of oncologists by state with physician work relative value units and with estimates of cancer incidence by state reveals intriguing differences in the distribution of physicians and the demand for oncology services. In addition, our analysis demonstrates significant downside practice risk associated with the provision of drug therapy to Medicare beneficiaries. The economic risk associated with the purchase and delivery of chemotherapy is of particular concern as pressure for value increases. This article provides a description of a new dataset and interesting observations from these data.

  9. PC Graphic file programing

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

    This book describes the basics of graphics programming and the structure of graphic file formats. The first part deals with graphic data, the storage and compression of graphic data, and programming topics such as assembly, the stack, compiling and linking of programs, and practice and debugging. The second part covers graphic file formats such as MacPaint, GEM/IMG, PCX, GIF, and TIFF files; hardware considerations such as fast monochrome and color screen drivers; the basic concept of dithering; and conversion between formats.

  10. Visualizing NetCDF Files by Using the EverVIEW Data Viewer

    Science.gov (United States)

    Conzelmann, Craig; Romañach, Stephanie S.

    2010-01-01

    Over the past few years, modelers in South Florida have started using Network Common Data Form (NetCDF) as the standard data container format for storing hydrologic and ecologic modeling inputs and outputs. With its origins in the meteorological discipline, NetCDF was created by the Unidata Program Center at the University Corporation for Atmospheric Research, in conjunction with the National Aeronautics and Space Administration and other organizations. NetCDF is a portable, scalable, self-describing, binary file format optimized for storing array-based scientific data. Despite attributes which make NetCDF desirable to the modeling community, many natural resource managers have few desktop software packages which can consume NetCDF and unlock the valuable data contained within. The U.S. Geological Survey and the Joint Ecosystem Modeling group, an ecological modeling community of practice, are working to address this need with the EverVIEW Data Viewer. Available for several operating systems, this desktop software currently supports graphical displays of NetCDF data as spatial overlays on a three-dimensional globe and views of grid-cell values in tabular form. An included Open Geospatial Consortium compliant, Web-mapping service client and charting interface allows the user to view Web-available spatial data as additional map overlays and provides simple charting visualizations of NetCDF grid values.
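
    For users who want to consume such files programmatically rather than through a viewer, a minimal read with the netCDF4 Python package looks like this (the file and variable names below are placeholders):

    ```python
    from netCDF4 import Dataset

    with Dataset("hydro_output.nc") as ds:
        print(ds.dimensions.keys())          # e.g. time, y, x
        print(ds.variables.keys())           # NetCDF files are self-describing
        stage = ds.variables["water_stage"]  # placeholder variable name
        print(stage.shape)                   # array-based layout
        first_step = stage[0, :, :]          # grid-cell values at the first time step
    ```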

  11. National Household Education Surveys Program of 2012: Data File User's Manual. Parent and Family Involvement in Education Survey. Early Childhood Program Participation Survey. NCES 2015-030

    Science.gov (United States)

    McPhee, C.; Bielick, S.; Masterton, M.; Flores, L.; Parmer, R.; Amchin, S.; Stern, S.; McGowan, H.

    2015-01-01

    The 2012 National Household Education Surveys Program (NHES:2012) Data File User's Manual provides documentation and guidance for users of the NHES:2012 data files. The manual provides information about the purpose of the study, the sample design, data collection procedures, data processing procedures, response rates, imputation, weighting and…

  12. The U.S. Geological Survey Peak-Flow File Data Verification Project, 2008–16

    Science.gov (United States)

    Ryberg, Karen R.; Goree, Burl B.; Williams-Sether, Tara; Mason, Robert R.

    2017-11-21

    Annual peak streamflow (peak flow) at a streamgage is defined as the maximum instantaneous flow in a water year. A water year begins on October 1 and continues through September 30 of the following year; for example, water year 2015 extends from October 1, 2014, through September 30, 2015. The accuracy, characterization, and completeness of the peak streamflow data are critical in determining flood-frequency estimates that are used daily to design water and transportation infrastructure, delineate flood-plain boundaries, and regulate development and utilization of lands throughout the United States and are essential to understanding the implications of climate and land-use change on flooding and high-flow conditions.As of November 14, 2016, peak-flow data existed for 27,240 unique streamgages in the United States and its territories. The data, collectively referred to as the “peak-flow file,” are available as part of the U.S. Geological Survey (USGS) public web interface, the National Water Information System, at https://nwis.waterdata.usgs.gov/usa/nwis/peak. Although the data have been routinely subjected to periodic review by the USGS Office of Surface Water and screening at the USGS Water Science Center level, these data were not reviewed in a national, systematic manner until 2008 when automated scripts were developed and applied to detect potential errors in peak-flow values and their associated dates, gage heights, and peak-flow qualification codes, as well as qualification codes associated with the gage heights. USGS scientists and hydrographers studied the resulting output, accessed basic records and field notes, and corrected observed errors or, more commonly, confirmed existing data as correct.This report summarizes the changes in peak-flow file data at a national level, illustrates their nature and causation, and identifies the streamgages affected by these changes. Specifically, the peak-flow data were compared for streamgages with peak flow

  13. Toolsets for Airborne Data (TAD): Improving Machine Readability for ICARTT Data Files

    Science.gov (United States)

    Early, Amanda Benson; Beach, Aubrey; Northup, Emily; Wang, Dali; Kusterer, John; Quam, Brandi; Chen, Gao

    2015-01-01

    The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center is responsible for the ingest, archive, and distribution of NASA Earth Science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The ASDC specializes in atmospheric data that is important to understanding the causes and processes of global climate change and the consequences of human activities on the climate. The ASDC currently supports more than 44 projects and has over 1,700 archived data sets, which increase daily. ASDC customers include scientists, researchers, federal, state, and local governments, academia, industry, and application users, the remote sensing community, and the general public.

  14. Evaluated Nuclear Structure data file: a manual for preparation of data sets

    International Nuclear Information System (INIS)

    Ewbank, W.B.; Schmorak, M.R.

    1978-02-01

    A standard input format for nuclear structure data is described. The format is sufficiently structured that bulk data can be entered efficiently. At the same time, the structure is open-ended and can accommodate most measured or deduced quantities that yield nuclear structure information

  15. A computer program for creating keyword indexes to textual data files

    Science.gov (United States)

    Moody, David W.

    1972-01-01

    A keyword-in-context (KWIC) or keyword-out-of-context (KWOC) index is a convenient means of organizing information. This keyword index program can be used to create either KWIC or KWOC indexes of bibliographic references or other types of information punched on cards, typed on optical scanner sheets, or retrieved from various Department of the Interior data bases using the Generalized Information Processing System (GIPSY). The index consists of a 'bibliographic' section and a keyword section based on the permutation of document titles, project titles, environmental impact statement titles, maps, etc., or lists of descriptors. The program can also create a back-of-the-book index to documents from a list of descriptors. By providing the user with a wide range of input and output options, the program gives the researcher, manager, or librarian a means of maintaining a list of, and index to, documents in a small library, reprint collection, or office file.
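
    The core permutation step is simple enough to sketch (a toy version; the real program adds many input and output options):

    ```python
    # Toy keyword-in-context (KWIC) index: every non-stopword rotation of a
    # title becomes an index line, keyword first, sorted alphabetically.
    STOP = {"a", "an", "the", "of", "to", "for", "in", "and"}

    def kwic(titles):
        lines = []
        for title in titles:
            words = title.split()
            for i, w in enumerate(words):
                if w.lower() not in STOP:
                    context = " ".join(words[i:] + ["/"] + words[:i])
                    lines.append((w.lower(), context))
        return [ctx for _, ctx in sorted(lines)]

    for line in kwic(["Water Resources of the Upper Basin", "Basin Sediment Survey"]):
        print(line)
    ```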

  16. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)]

    1996-12-01

    Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verification impossible in some cases. Verification calculations were therefore focused on the catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. Verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudoreaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  17. Some aspects of the file organization and retrieval strategy in large data-bases

    International Nuclear Information System (INIS)

    Arnaudov, D.D.; Govorun, N.N.

    1977-01-01

    Methods of organizing a large information retrieval system are described, with special attention paid to file organization. An adaptive file structure is described in more detail. The discussed method makes it possible to organize large files in such a way that the response time of the system is minimized as the file grows. In connection with the retrieval strategy, a method is proposed which uses the frequencies of descriptors and of descriptor pairs to forecast the expected number of relevant documents. Programs based on these methods have been written and are used in the information retrieval systems of JINR

  18. XML Files

    Science.gov (United States)

    MedlinePlus produces XML data sets that you are welcome to download from https://medlineplus.gov/xml.html

  19. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  20. Thermal lattice benchmarks for testing basic evaluated data files, developed with MCNP4B

    International Nuclear Information System (INIS)

    Maucec, M.; Glumac, B.

    1996-01-01

    The development of unit-cell and full-reactor-core models of the DIMPLE S01A, TRX-1 and TRX-2 benchmark experiments, using the Monte Carlo computer code MCNP4B, is presented. Nuclear data from the ENDF/B-V and ENDF/B-VI cross-section libraries were used in the calculations. In addition, a comparison is made with results obtained using similar models and cross-section data from the EJ2-MCNPlib library (based on the JEF-2.2 evaluation) developed at IRC Petten, Netherlands. The results of the criticality calculation with the ENDF/B-VI data library, and the comparison with results obtained using the JEF-2.2 evaluation, confirm the MCNP4B full-core model of the DIMPLE reactor as a good benchmark for testing basic evaluated data files. On the other hand, the criticality calculation results obtained using the TRX full-core models show less agreement with experiment. It is obvious that without additional data about the TRX geometry, our TRX models are not suitable as Monte Carlo benchmarks. (author)

  1. Release of the ENDF/B-VII.1 Evaluated Nuclear Data File

    Energy Technology Data Exchange (ETDEWEB)

    Brown, David

    2012-06-30

    The Cross Section Evaluation Working Group (CSEWG) released the ENDF/B-VII.1 library on December 22, 2011. The ENDF/B-VII.1 library is CSEWG's latest recommended evaluated nuclear data file for use in nuclear science and technology applications, and incorporates advances made in the five years since the release of ENDF/B-VII.0, including many new evaluations in the neutron sublibrary (423 in all, over 190 of which contain covariances), new fission product yields and a greatly improved decay data sublibrary. This summary barely touches on the five years' worth of advances present in the ENDF/B-VII.1 library. We expect that these changes will lead to improved integral performance in reactors and other applications. Furthermore, the expansion of covariance data in this release will allow for better uncertainty quantification, reducing design margins and costs. The ENDF library is an ongoing and evolving effort. Currently, the ENDF data community is embarking on several parallel efforts to improve library management: (1) the adoption of a continuous integration system to provide evaluators 'instant' feedback on the quality of their evaluations and to provide data users with working 'beta' quality libraries in between major releases; (2) the transition to a new hierarchical data format, the Generalized Nuclear Data (GND) format, which is expected to enable new kinds of evaluated data that cannot be accommodated in the legacy ENDF format; (3) the development of data assimilation and uncertainty propagation techniques to enable the consistent use of integral experimental data in the evaluation process.

  2. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    Science.gov (United States)

    2005-01-01

    The purpose of grant NCC3-966 was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of an analysis and design task. Previously this has been done by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, the data of interest are described using the XML markup language, which allows the data to be stored as a text string. Software to transform the output data of a task into an XML string, and software to read an XML string and extract all or a portion of the data needed by another application, is used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications (a spreadsheet program, a relational database program, and a conventional dialog and display program) to demonstrate the successful sharing of data among independent programs. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and with turbine-blade data produced by an independent blade design program (UD0300).
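
    The interchange idea reduces to a serialize/parse round trip; a minimal sketch with Python's standard library (the element names below are invented, not the grant's actual schema):

    ```python
    import xml.etree.ElementTree as ET

    # Task A: wrap its output data in an XML string
    root = ET.Element("bladeDesign")
    ET.SubElement(root, "chord", units="m").text = "0.042"
    ET.SubElement(root, "stagger", units="deg").text = "35.0"
    xml_string = ET.tostring(root, encoding="unicode")

    # Task B: parse the string and extract only the fields it needs
    parsed = ET.fromstring(xml_string)
    chord = float(parsed.find("chord").text)
    print(chord)
    ```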

  3. High School and Beyond. 1980 Senior Cohort. Third Follow-Up (1986). Data File User's Manual. Volume II: Survey Instruments. Contractor Report.

    Science.gov (United States)

    Sebring, Penny; And Others

    Survey instruments used in the collection of data for the High School and Beyond base year (1980) through the third follow-up surveys (1986) are provided as Volume II of a user's manual for the senior cohort data file. The complete user's manual is designed to provide the extensive documentation necessary for using the cohort data files. Copies of…

  4. Data formats and procedures for the Evaluated Nuclear Data File, ENDF

    International Nuclear Information System (INIS)

    Kinsey, R.

    1979-10-01

    These revisions to Data Formats and Procedures for the ENDF Neutron Cross Section Library, ENDF-102, pertain to the latest version of ENDF/B-V. The descriptions of the formats are brought up to date, and important procedural matters are explained

  5. Building analytical platform with Big Data solutions for log files of PanDA infrastructure

    Science.gov (United States)

    Alekseev, A. A.; Barreiro Megino, F. G.; Klimentov, A. A.; Korchuganova, T. A.; Maendo, T.; Padolski, S. V.

    2018-05-01

    The paper describes the implementation of a high-performance system for the processing and analysis of log files for the PanDA infrastructure of the ATLAS experiment at the Large Hadron Collider (LHC), which is responsible for the workload management of on the order of 2M daily jobs across the Worldwide LHC Computing Grid. The solution is based on the ELK technology stack, which includes several components: Filebeat, Logstash, Elasticsearch (ES), and Kibana. Filebeat is used to collect data from logs. Logstash processes the data and exports it to Elasticsearch, which is responsible for centralized data storage. The data accumulated in ES can be viewed using Kibana. These components were integrated with the PanDA infrastructure and replaced the previous log processing systems for increased scalability and usability. The authors describe all the components and their configuration tuning for the current tasks and the scale of the actual system, and give several real-life examples of how this centralized log processing and storage service is used, to showcase the advantages for daily operations.
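
    A hedged sketch of the ingestion step (not the production PanDA pipeline; the index and field names below are invented), using the official Elasticsearch Python client in place of Filebeat and Logstash:

    ```python
    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    # Parse one log line into a structured document ...
    line = "2018-05-04 12:00:01 INFO pilot: job 4242 finished status=done"
    date, clock, level, message = line.split(" ", 3)
    doc = {"@timestamp": f"{date}T{clock}Z", "level": level, "message": message}

    # ... and index it so Kibana can search and visualize it.
    es.index(index="panda-logs-2018.05.04", document=doc)
    ```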

  6. NURE [National Uranium Resource Evaluation] HSSR [Hydrogeochemical and Stream Sediment Reconnaissance] Introduction to Data Files, United States: Volume 1

    International Nuclear Information System (INIS)

    1985-01-01

    One product of the Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program, a component of the National Uranium Resource Evaluation (NURE), is a database of interest to scientists and professionals in the academic, business, industrial, and governmental communities. This database contains individual records for water and sediment samples taken during the reconnaissance survey of the entire United States, excluding Hawaii. The purpose of this report is to describe the NURE HSSR data by highlighting its key characteristics and providing user guides to the data. A companion report, "A Technical History of the NURE HSSR Program," summarizes those aspects of the HSSR Program which are likely to be important in helping users understand the database. Each record on the database contains varying information on general field or site characteristics and analytical results for elemental concentrations in the sample; the database is potentially valuable for describing the geochemistry of specified locations and addressing issues or questions in other areas such as water quality, geoexploration, and hydrologic studies. This report is organized in twelve volumes. This first volume presents a brief history of the NURE HSSR program, a description of the data files produced by ISP, a Users' Dictionary for the Analysis File, and graphs showing the distribution of elemental concentrations for sediments at the US level. Volumes 2 through 12 comprise Data Summary Tables displaying the percentile distribution of the elemental concentrations on the file. Volume 2 contains data for the individual states. Volumes 3 through 12 contain data for the 1° x 2° quadrangles, organized into eleven regional files; the data for the two regional files for Alaska (North and South) are bound together as Volume 12

  7. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    International Nuclear Information System (INIS)

    Schwarz, Randolph A.; Carter, Leland L.; Schwarz Alysia L.

    2006-01-01

    KENO is a Monte Carlo criticality code that is maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but can also be used for criticality calculations. MCNP is becoming increasingly more popular for performing production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross section database. This manual conversion of input files between the two codes is labor intensive. The industry needs the capability of converting geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package will aid the user in converting between the codes. It is not intended to be used as a "black box". The resulting input file will need to be carefully inspected by criticality safety personnel to verify the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards

  8. The design and analysis of salmonid tagging studies in the Columbia Basin. Volume 10: Instructional guide to using program CaptHist to create SURPH files for survival analysis using PTAGIS data files

    International Nuclear Information System (INIS)

    Westhagen, P.; Skalski, J.

    1997-12-01

    The SURPH program is a valuable tool for estimating survival and capture probabilities of fish outmigrations on the Snake and Columbia Rivers. Using special data files, SURPH computes reach-to-reach statistics for any release group passing a system of detection sites. Because the data must be recorded for individual fish, PIT tag data is best suited for use as input. However, PIT tag data as available from PTAGIS comes in a form that is not ready for use as SURPH input. SURPH requires a capture history for each fish. A capture history consists of a series of fields, one for each detection site, each holding a code for whether the fish was detected and returned to the river, detected and removed, or not detected. For the PTAGIS data to be usable by SURPH, it must be pre-processed: the data must be condensed down to one line per fish, with the relevant detection information from the PTAGIS file represented compactly on each line. In addition, the PTAGIS data file coil information must be passed through a series of logic algorithms to determine whether or not a fish was returned to the river after detection. Program CaptHist was developed to properly pre-process the PTAGIS data files for input to program SURPH. This utility takes PTAGIS data files as input and creates a SURPH data file as well as other output, including travel time records, detection date records, and a data error file. CaptHist allows a user to download PTAGIS files and easily process the data for use with SURPH.
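
    The condensation step CaptHist performs can be pictured with a small sketch. The input layout and the single-character codes below (1 = detected and returned, 2 = detected and removed, 0 = not detected) are assumptions made for illustration, not the exact SURPH encoding:

        # Sketch of condensing per-detection records into one capture history per
        # fish. The record layout (tag_id, site, returned_to_river) and the codes
        # 1 / 2 / 0 are illustrative assumptions, not the exact SURPH encoding.
        from collections import defaultdict

        SITES = ["LGR", "LGS", "LMN", "MCN"]       # ordered detection sites (example)

        detections = [                             # (tag_id, site, returned_to_river)
            ("3D9.1", "LGR", True),
            ("3D9.1", "LMN", False),               # detected and removed
            ("3D9.2", "LGS", True),
        ]

        histories = defaultdict(lambda: {s: "0" for s in SITES})
        for tag, site, returned in detections:
            histories[tag][site] = "1" if returned else "2"

        for tag, h in sorted(histories.items()):
            print(tag, " ".join(h[s] for s in SITES))
        # 3D9.1 1 0 2 0
        # 3D9.2 0 1 0 0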

  9. Combination of Rivest-Shamir-Adleman Algorithm and End of File Method for Data Security

    Science.gov (United States)

    Rachmawati, Dian; Amalia, Amalia; Elviwani

    2018-03-01

    Data security is one of the crucial issues in the delivery of information. One of the ways used to secure data is to encode it into something that is not comprehensible by human beings, using cryptographic techniques. The Rivest-Shamir-Adleman (RSA) cryptographic algorithm has been proven robust for securing messages. Since this algorithm uses two different keys (i.e., a public key and a private key) at the time of encryption and decryption, it is classified as an asymmetric cryptography algorithm. Steganography is a method used to secure a message by inserting the bits of the message into a larger medium such as an image. One of the known steganography methods is End of File (EoF). In this research, the cipher text resulting from the RSA algorithm is compiled into an array form and appended to the end of the image. The result of the EoF step is an image with a line of black gradations under it; this line contains the secret message. This combination of cryptography and steganography is expected to increase the security of the message, since the message encryption technique (RSA) is combined with the data hiding technique (EoF).
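
    The mechanics are easy to prototype. The sketch below uses deliberately tiny textbook RSA parameters (insecure, illustration only) and appends the ciphertext after the bytes of an existing image; a viewer still renders the image normally because it stops at the end of the image data, while the appended bytes remain available for extraction. The marker constant is an invented convention, not part of the published method:

        # Toy sketch of RSA + End-of-File steganography. The RSA parameters are
        # textbook-small and insecure; they only illustrate the mechanics.
        p, q, e = 61, 53, 17
        n, phi = p * q, (p - 1) * (q - 1)
        d = pow(e, -1, phi)                      # private exponent (Python 3.8+)

        def rsa_encrypt(msg: bytes):
            return [pow(b, e, n) for b in msg]   # one block per byte (toy scheme)

        def rsa_decrypt(blocks):
            return bytes(pow(c, d, n) for c in blocks)

        MARKER = b"EOFSTEGO"                     # hypothetical marker, not a standard
        cipher = rsa_encrypt(b"secret")
        payload = MARKER + b"".join(c.to_bytes(2, "big") for c in cipher)

        with open("cover.png", "rb") as f:       # any existing image file
            image = f.read()
        with open("stego.png", "wb") as f:
            f.write(image + payload)             # EoF method: append after image data

        # Extraction: locate the marker, decode 2-byte blocks, decrypt.
        with open("stego.png", "rb") as f:
            data = f.read()
        tail = data[data.rindex(MARKER) + len(MARKER):]
        blocks = [int.from_bytes(tail[i:i + 2], "big") for i in range(0, len(tail), 2)]
        print(rsa_decrypt(blocks))               # b'secret'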

  10. PDB Editor: a user-friendly Java-based Protein Data Bank file editor with a GUI.

    Science.gov (United States)

    Lee, Jonas; Kim, Sung Hou

    2009-04-01

    The Protein Data Bank file format is the format most widely used by protein crystallographers and biologists to disseminate and manipulate protein structures. Despite this, there are few user-friendly software packages available to efficiently edit and extract raw information from PDB files. This limitation often leads to many protein crystallographers wasting significant time manually editing PDB files. PDB Editor, written in Java with a Swing GUI, allows the user to selectively search, select, extract and edit information in parallel. Furthermore, the program is a stand-alone application written in Java, which frees users from the hassles associated with platform/operating-system-dependent installation and usage. PDB Editor can be downloaded from http://sourceforge.net/projects/pdbeditorjl/.
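
    PDB is a fixed-column format, which is much of the reason ad hoc editing is error-prone. As a minimal illustration of the kind of selective extraction such an editor automates (this is not PDB Editor's own code), the following pulls the ATOM records of one chain using the standard wwPDB column positions; the file name is a placeholder:

        # Minimal sketch of selective extraction from a PDB file: yield ATOM
        # records for one chain. Columns follow the wwPDB fixed-column format
        # (chain ID in column 22, coordinates in columns 31-54, 1-based).
        def chain_atoms(path: str, chain: str):
            with open(path) as f:
                for line in f:
                    if line.startswith("ATOM") and line[21] == chain:
                        yield line.rstrip("\n")

        for record in chain_atoms("1abc.pdb", "A"):   # "1abc.pdb" is a placeholder
            name = record[12:16].strip()              # atom name, columns 13-16
            x, y, z = (float(record[i:i + 8]) for i in (30, 38, 46))
            print(name, x, y, z)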

  11. Evaluation of sodium-23 neutron capture cross section data for the ENDF/B-III file

    International Nuclear Information System (INIS)

    Paik, N.C.; Pitterle, T.A.

    1975-01-01

    The evaluation of neutron cross sections of ²³Na, material number 1156, for the ENDF/B file is described. Cross sections were evaluated between 10⁻⁵ eV and 15 MeV. Experimental data available up to March 1971 were included in the evaluation.

  12. Instructions for preparation of data entry sheets for Licensee Event Report (LER) file. Revision 1. Instruction manual

    International Nuclear Information System (INIS)

    1977-07-01

    The manual provides instructions for the preparation of data entry sheets for the licensee event report (LER) file. It is a revision to an interim manual published in October 1974 in 00E-SS-001. The LER file is a computer-based data bank of information using the data entry sheets as input. These data entry sheets contain pertinent information in regard to those occurrences required to be reported to the NRC. The computer-based data bank provides a centralized source of data that may be used for qualitative assessment of the nature and extent of off-normal events in the nuclear industry and as an index of source information to which users may refer for more detail

  13. Summary remarks and recommended reactions for an international data file for dosimetry applications for LWR, FBR, and MFR reactor research, development and testing programs

    International Nuclear Information System (INIS)

    McElroy, W.N.; Lippincott, E.P.; Grundl, J.A.; Fabry, A.; Dierckx, R.; Farinelli, U.

    1979-01-01

    The need for the use of an internationally accepted data file for dosimetry applications for light water reactor (LWR), fast breeder reactor (FBR), and magnetic fusion reactor (MFR) research, development, and testing programs continues to exist for the nuclear industry. The work of this IAEA meeting, therefore, will be another important step in achieving consensus agreement on an internationally recommended file and its purpose, content, structure, selected reactions, and associated uncertainty files. Summary remarks and a listing of recommended reactions for consideration in the formulation of an ''International Data File for Dosimetry Applications'' are presented in subsequent sections of this report.

  14. Formalizing structured file services for the data storage and retrieval subsystem of the data management system for Space Station Freedom

    Science.gov (United States)

    Jamsek, Damir A.

    1993-01-01

    A brief example of the use of formal methods techniques in the specification of a software system is presented. The report is part of a larger effort targeted at defining a formal methods pilot project for NASA. One possible application domain that may be used to demonstrate the effective use of formal methods techniques within the NASA environment is presented. It is not intended to provide a tutorial on either formal methods techniques or the application being addressed. It should, however, provide an indication that the application being considered is suitable for a formal methods treatment by showing how such a task may be started. The particular system being addressed is the Structured File Services (SFS), which is a part of the Data Storage and Retrieval Subsystem (DSAR), which in turn is part of the Data Management System (DMS) onboard Space Station Freedom. This is a software system that is currently under development for NASA. An informal mathematical development is presented. Section 3 contains the same development using Penelope (23), an Ada specification and verification system. The complete text of the English version Software Requirements Specification (SRS) is reproduced in Appendix A.

  15. Renewal and maintenance of a nuclear structure data file used for the calculations of dose conversion factors

    International Nuclear Information System (INIS)

    Togawa, Orihiko; Yamaguchi, Yukichi

    1996-02-01

    The ENSDF decay data are used as fundamental data to compute radiation data in the DOSDAC code system, which was developed at JAERI for the calculation of dose conversion factors. The ENSDF decay data are periodically revised under an international network by reviewing new experimental data in the literature. The use of this data file enables us to calculate radiation data from information that is up to date and internationally recognized. In spite of this advantage, the decay data file is seldom used in applied fields. This is due to some problems that must be solved from the viewpoint of calculating radiation data, as well as to its complicated structure. This report describes methods for the renewal and maintenance of the ENSDF decay data used for the calculation of dose conversion factors. When the decay data are used directly, attention must sometimes be paid to problems such as defects in the data. In renewing and using the ENSDF decay data, the DOSDAC code system avoids erroneous calculations of radiation data by checking for and correcting defects in the data through four supporting computer codes. (author)

  16. Direct utilization of information from nuclear data files in Monte Carlo simulation of neutron and photon transport

    International Nuclear Information System (INIS)

    Androsenko, P.; Joloudov, D.; Kompaniyets, A.

    2001-01-01

    This work discusses questions related to the Monte Carlo method for solving the neutron and photon transport equation, in particular the direct use of information from evaluated nuclear data files in run-time calculations. Libraries in ENDF-6 format have been used for the calculations. The approaches prescribed by the rules of ENDF-6 files 2, 3-6, 12-15, 23 and 27, together with algorithms for reconstructing resolved and unresolved resonance-region cross sections at a preset energy, are described. Comparisons of the computed cross-section data with calculations made by the NJOY and GRUCON programs are presented, as are test calculations of neutron leakage spectra for spherical benchmark experiments. (authors)
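
    The essential step in evaluating a pointwise cross section at a preset energy is interpolation between tabulated grid points. A minimal sketch assuming the simplest ENDF-6 interpolation law, linear-linear (the grid values are made up, and the log and other interpolation laws defined by the format are ignored here):

        # Sketch of evaluating a pointwise cross section at an arbitrary energy by
        # linear-linear interpolation, the simplest ENDF-6 interpolation law.
        # Grid values are invented for illustration.
        import bisect

        energies = [1.0e-5, 1.0, 1.0e3, 1.0e6, 2.0e7]   # eV
        sigmas   = [90.0,  12.0, 7.5,   4.2,   1.1]     # barns

        def sigma_at(e: float) -> float:
            if not energies[0] <= e <= energies[-1]:
                raise ValueError("energy outside tabulated range")
            i = bisect.bisect_right(energies, e)
            if i == len(energies):                      # exactly the last grid point
                return sigmas[-1]
            e0, e1 = energies[i - 1], energies[i]
            s0, s1 = sigmas[i - 1], sigmas[i]
            return s0 + (s1 - s0) * (e - e0) / (e1 - e0)

        print(sigma_at(5.0e2))   # interpolated between the 1.0 eV and 1.0e3 eV points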

  17. The Polls-Review: Inaccurate Age and Sex Data in the Census PUMS Files: Evidence and Implications.

    Science.gov (United States)

    Alexander, J Trent; Davern, Michael; Stevenson, Betsey

    2010-01-01

    We discover and document errors in public-use microdata samples ("PUMS files") of the 2000 Census, the 2003-2006 American Community Survey, and the 2004-2009 Current Population Survey. For women and men age 65 and older, age- and sex-specific population estimates generated from the PUMS files differ by as much as 15 percent from counts in published data tables. Moreover, an analysis of labor-force participation and marriage rates suggests the PUMS samples are not representative of the population at individual ages for those age 65 and over. PUMS files substantially underestimate labor-force participation of those near retirement age and overestimate labor-force participation rates of those at older ages. These problems were an unintentional byproduct of the misapplication of a newer generation of disclosure-avoidance procedures carried out on the data. The resulting errors in the public-use data could significantly impact studies of people age 65 and older, particularly analyses of variables that are expected to change by age.

  18. FORTRAN data file transfer from VAX/VMS to ALPHA/UNIX; Traspaso de ficheros FORTRAN de datos de VAX/VMS a ALPHA/UNIX

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, E.; Milligen, B. Ph van [CIEMAT (Spain)

    1997-09-01

    Several tools have been developed to access the TJ-I/TJ-IU databases, which currently reside on VAX/VMS servers, from the TJ-II Data Acquisition System DEC ALPHA 8400 server. The TJ-I/TJ-IU databases are not homogeneous and contain several types of data files, namely SADE, CAMAC and FORTRAN unformatted files. The tools presented in this report allow one to transfer CAMAC files and those FORTRAN unformatted files defined herein from a VAX/VMS server for data manipulation on the ALPHA/Digital UNIX server. (Author)

  19. Consistency between data from the ENDF/B-V dosimetry file and corresponding experimental data for some fast neutron reference spectra

    International Nuclear Information System (INIS)

    Nolthenius, H.J.; Zijp, W.L.

    1981-11-01

    Results are given of a study on the consistency between 'integral' and 'differential' cross-section data for four benchmark neutron spectra and 36 neutron reactions of importance for reactor neutron metrology. The energy-dependent cross-section data and their uncertainty data are obtained from the ENDF/B-V dosimetry file. The reactions have been considered with respect to the following quantities: 1. the precision of the averaged cross sections for a specified spectrum; 2. the discrepancy between the measured and the calculated average cross-section values; 3. the consistency between the measured and calculated average cross-section values, described by the χ²-parameter. It was possible to take into account the available cross-section covariance information present in the ENDF/B-V dosimetry file. Covariance information on the benchmark flux density spectra was not taken into account in this study.

  20. Surfing the Web for Science: Early Data on the Users and Uses of The Why Files.

    Science.gov (United States)

    Eveland, William P., Jr.; Dunwoody, Sharon

    1998-01-01

    This brief offers an initial look at one science site on the World Wide Web (The Why Files: http://whyfiles.news.wise.edu) in order to consider the educational potential of this technology. The long-term goal of the studies of this site is to understand how the World Wide Web can be used to enhance science, mathematics, engineering, and technology…

  1. Ground-Based Global Navigation Satellite System GLONASS (GLObal NAvigation Satellite System) Combined Broadcast Ephemeris Data (daily files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) GLONASS Combined Broadcast Ephemeris Data (daily files of all distinct navigation...

  2. Ground-Based Global Navigation Satellite System (GNSS) Compact Observation Data (1-second sampling, sub-hourly files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) Observation Data (1-second sampling, sub-hourly files) from the NASA Crustal Dynamics...

  3. ZZ-FSXLIB-JD99, MCNP nuclear data library based on JENDL Dosimetry File 99

    International Nuclear Information System (INIS)

    Shibata, Keiichi

    2007-01-01

    Description: JENDL Dosimetry File 99 processed into ACE for Monte Carlo calculations. JENDL/D-99 based MCNP library. Format: ACE. Number of groups: Continuous energy cross section library. Nuclides: 47 Nuclides and 67 reactions: Li-6 (n, triton) alpha; Li-6 alpha-production; Li-7 triton- production; B-10 (n, alpha) Li-7; B-10 alpha-production; F-19 (n, 2n) F-18; Na-23 (n, 2n) Na-22; Na-23 (n, gamma) Na-24; Mg-24 (n, p) Na-24; Al-27 (n, p) Mg-27; Al-27 (n, alpha) Na-24; P-31 (n, p) Si-31; S-32 (n, p) P-32; Sc-45 (n, gamma) Sc-46; Ti-nat (n, x) Sc-46; Ti-nat (n, x) Sc-47; Ti-nat (n, x) Sc-48; Ti-46 (n, 2n) Ti-45; Ti-46 (n, p) Sc-46; Ti-47 (n, np) Sc-46; Ti-47 (n, p) Sc-47; Ti-48 (n, np) Sc-47; Ti-48 (n, p) Sc-48; Ti-49 (n, np) Sc-48; Cr-50 (n, gamma) Cr-51; Cr-52 (n, 2n) Cr-51; Mn-55 (n, 2n) Mn-54; Mn-55 (n, gamma) Mn-56; Fe-54 (n, p) Mn-54; Fe-56 (n, p) Mn-56; Fe-57 (n, np) Mn-56; Fe-58 (n, gamma) Fe-59; Co-59 (n, 2n) Co-58; Co-59 (n, gamma) Co-60; Co-59 (n, alpha) Mn-56; Ni-58 (n, 2n) Ni-57; Ni-58 (n, p) Co-58; Ni-60 (n, p) Co-60; Cu-63 (n, 2n) Cu-62; Cu-63 (n, gamma) Cu-64; Cu-63 (n, alpha) Co-60; Cu-65 (n, 2n) Cu-64; Zn-64 (n, p) Cu-64; Y-89 (n, 2n) Y-88; Zr-90 (n, 2n) Zr-89; Nb-93 (n, n') Nb-93m; Nb-93 (n, 2n) Nb-92m; Rh-103 (n, n') Rh-103m; Ag-109 (n, gamma) Ag-110m; In-115 (n, n') In-115m; In-115 (n, gamma) In-116m; I-127 (n,2n) I-126; Eu-151 (n, gamma) Eu-152; Tm-169 (n,2n) Tm-168; Ta-181 (n, gamma) Ta-182; W-186 (n, gamma) W-187; Au-197 (n, 2n) Au-196; Au-197 (n, gamma) Au-198; Hg-199 (n, n') Hg-199m; Th-232 - fission; Th-232 (n, gamma) Th-233; U-235 - fission; U-238 - fission; U-238 (n, gamma) U-239; Np-237 - fission; Pu-239 - fission; Am-241 - fission. The data were produced on the 31 of March, 2006

  4. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    International Nuclear Information System (INIS)

    Toor, S; Eerola, P; Kraemer, O; Lindén, T; Osmani, L; Tarkoma, S; White, J

    2014-01-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  5. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    Science.gov (United States)

    Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.

    2014-06-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  6. Comparisons of experimental beta-ray spectra important to decay heat predictions with ENSDF [Evaluated Nuclear Structure Data File] evaluations

    International Nuclear Information System (INIS)

    Dickens, J.K.

    1990-03-01

    Graphical comparisons of recently obtained experimental beta-ray spectra with predicted beta-ray spectra based on the Evaluated Nuclear Structure Data File are exhibited for 77 fission products having masses 79-99 and 130-146 and lifetimes between 0.17 and 23650 sec. The comparisons range from very poor to excellent. For beta decay of 47 nuclides, estimates are made of ground-state transition intensities. For 14 cases the value in ENSDF gives results in very good agreement with the experimental data. 12 refs., 77 figs., 1 tab

  7. User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

    CERN Document Server

    Wiley, R A

    1977-01-01

    User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

  8. Evaluation of clinical data in childhood asthma. Application of a computer file system

    International Nuclear Information System (INIS)

    Fife, D.; Twarog, F.J.; Geha, R.S.

    1983-01-01

    A computer file system was used in our pediatric allergy clinic to assess the value of chest roentgenograms and hemoglobin determinations used in the examination of patients and to correlate exposure to pets and forced hot air with the severity of asthma. Among 889 children with asthma, 20.7% had abnormal chest roentgenographic findings, excluding hyperinflation and peribronchial thickening, and 0.7% had abnormal hemoglobin values. Environmental exposure to pets or forced hot air was not associated with increased severity of asthma, as assessed by five measures of outcome: number of medications administered, requirement for corticosteroids, frequency of clinic visits, frequency of emergency room visits, and frequency of hospitalizations

  9. Summary report of the 3. research co-ordination meeting on development of reference input parameter library for nuclear model calculations of nuclear data (Phase 1: Starter File)

    International Nuclear Information System (INIS)

    Oblozinsky, P.

    1997-09-01

    The report contains the summary of the third and final Research Co-ordination Meeting on ''Development of Reference Input Parameter Library for Nuclear Model Calculations of Nuclear Data (Phase I: Starter File)'', held at the ICTP, Trieste, Italy, from 26 to 29 May 1997. Details are given on the status of the Handbook and the Starter File - the two major results of the project. (author)

  10. Formulation of detailed consumables management models for the development (preoperational) period of advanced space transportation system. Volume 4: Flight data file contents

    Science.gov (United States)

    Zamora, M. A.

    1976-01-01

    The contents of the Flight Data File which constitute the data required by and the data generated by the Mission Planning Processor are presented for the construction of the timeline and the determination of the consumables requirements of a given mission.

  11. Huygens file service and storage architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  12. Huygens File Service and Storage Architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  13. HLYWD: a program for post-processing data files to generate selected plots or time-lapse graphics

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1980-05-01

    The program HLYWD is a post-processor of output files generated by large plasma simulation computations or of data files containing a time sequence of plasma diagnostics. It is intended to be used in a production mode for either type of application; i.e., it allows one to generate, along with the graphics sequence, segments containing a title, credits to those who performed the work, text describing the graphics, and an acknowledgement of the funding agency. The current version is designed to generate 3D plots and allows one to select the type of display (linear or semi-log scales), the normalization of function values for display purposes, the viewing perspective, and an option allowing continuous rotation of surfaces. This program was developed with the intention of being relatively easy to use, reasonably flexible, and requiring a minimum investment of the user's time. It uses the TV80 library of graphics software and ORDERLIB system software on the CDC 7600 at the National Magnetic Fusion Energy Computing Center at Lawrence Livermore Laboratory in California.

  14. ORACL program file for acquisition, storage and analysis of data in radiation measurement and nondestructive measurement of nuclear material, vol. 2

    International Nuclear Information System (INIS)

    Yagi, Hideyuki; Takeuchi, Norio; Gotoh, Hiroshi

    1976-09-01

    The file contains 79 programs for radiation measurement and nondestructive measurement of nuclear material, written in the conversational language ORACL associated with the GAMMA-III system of ORTEC Incorporated. It deals with data transfers between disk/core/MCA/magnetic tape, editing of data on disks, calculation of peak areas, calculation of means and standard deviations, reference to gamma-ray data files, accounting, calendar functions, etc. It also has a support system for micro-computer development. Usage of the built-in functions of ORACL is presented. (auth.)

  15. Refreshing File Aggregate of Distributed Data Warehouse in Sets of Electric Apparatus

    Institute of Scientific and Technical Information of China (English)

    YU Baoqin; WANG Taiyong; ZHANG Jun; ZHOU Ming; HE Gaiyun; LI Guoqin

    2006-01-01

    Integrating heterogeneous data sources is a precondition for sharing data across an enterprise. Highly efficient data updating can both save system expense and offer real-time data. Rapid modification of data in the pre-processing area of the data warehouse is one of the hot issues in the field. An extract-transform-load (ETL) design is proposed based on a new data algorithm called Diff-Match, which is developed by utilizing mode matching and data-filtering technology. It can accelerate data renewal, filter the heterogeneous data, and seek out differing sets of data. Its efficiency has been proved by its successful application in an enterprise of electric apparatus groups.
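
    The abstract is terse about the algorithm itself, but the incremental-refresh pattern it alludes to can be sketched generically: match keys between the fresh extract and the warehouse copy, then load only the differences. The keys and record layout below are invented for illustration and are not the Diff-Match algorithm itself:

        # Generic sketch of a diff-based refresh: compare a fresh extract against
        # the warehouse copy and derive inserts / updates / deletes, so only
        # changed rows are loaded. Key and record layout are illustrative.
        warehouse = {"A-100": ("motor", 42), "A-101": ("relay", 7)}
        extract   = {"A-100": ("motor", 45), "A-102": ("switch", 3)}

        inserts = {k: v for k, v in extract.items() if k not in warehouse}
        deletes = [k for k in warehouse if k not in extract]
        updates = {k: v for k, v in extract.items()
                   if k in warehouse and warehouse[k] != v}

        print("insert:", inserts)   # {'A-102': ('switch', 3)}
        print("update:", updates)   # {'A-100': ('motor', 45)}
        print("delete:", deletes)   # ['A-101']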

  16. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, May 2015

    International Nuclear Information System (INIS)

    Wang, Wenming; Yokoyama, Kenji; Kim, Do Heon; Kodeli, Ivan-Alexander; Hursin, Mathieu; Pelloni, Sandro; Palmiotti, Giuseppe; Salvatores, Massimo; Touran, Nicholas; Cabellos De Francisco, Oscar

    2015-05-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the fourth Subgroup meeting, held at the NEA, Issy-les-Moulineaux, France, on 19-20 May 2015. It comprises a Summary Record of the meeting, two papers on deliverables and all the available presentations (slides) given by the participants: 1 - Status of Deliverables: '1. Methodology' (K. Yokoyama); 2 - Status of Deliverables: '2. Comments on covariance data' (K. Yokoyama); 3 - PROTEUS HCLWR Experiments (M. Hursin); 4 - Preliminary UQ Efforts for TWR Design (N. Touran); 5 - Potential use of beta-eff and other benchmarks for adjustment (I. Kodeli); 6 - k_eff uncertainties for a simple case of ²⁴¹Am using different codes and evaluated files (I. Kodeli); 7 - k_eff uncertainties for a simple case of ²⁴¹Am using TSUNAMI (O. Cabellos); 8 - REWIND: Ranking Experiments by Weighting to Improve Nuclear Data (G. Palmiotti); 9 - Recent analysis on NUDUNA/MOCABA applications to reactor physics parameters (E. Castro); 10 - INL exploratory study for SEG (A. Hummel); 11 - The Development of Nuclear Data Adjustment Code at CNDC (H. Wu); 12 - SG39 Perspectives (M. Salvatores). A list of issues and actions concludes the document.

  17. Log File Analytics for Gaining Insight into Actual Use of Open Data

    NARCIS (Netherlands)

    van Loenen, B.; Ubacht, J.; Zuiderwijk-van Eijk, AMG; Borges, Vieira; Dias Rouco, José Carlos

    2017-01-01

    Following open data policies worldwide, an increasing number of public organisations have now published open data that is free to be used by anyone. However, despite the significant increase in the use of this open data, the open data providers are mostly not aware of their users and the way in which

  18. ENDF-102 data formats and procedures for the evaluated nuclear data file ENDF-6. Revision November 1995

    International Nuclear Information System (INIS)

    McLane, V.; Dunford, C.L.; Rose, P.F.

    1995-11-01

    The ENDF formats and libraries are decided by the Cross Section Evaluation Working Group (CSEWG), a cooperative effort of national laboratories, industry, and universities in the US and Canada, and are maintained by the National Nuclear Data Center (NNDC). Earlier versions of the ENDF format provided representations for neutron cross sections and distributions, photon production from neutron reactions, a limited amount of charged-particle production from neutron reactions, photo-atomic interaction data, thermal neutron scattering data, and radionuclide production and decay data (including fission products). Version 6 (ENDF-6) allows higher incident energies, adds more complete descriptions of the distributions of emitted particles, and provides for incident charged particles and photo-nuclear data by partitioning the ENDF library into sublibraries. Decay data, fission product yield data, thermal scattering data, and photo-atomic data have also been formally placed in sublibraries. In addition, this rewrite represents an extensive update to the Version V manual

  19. Hudson River Sub_Bottom Profile Data - Raw SEG-Y Files (*.sgy)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Hudson River Estuary Shallow Water Surveys. Subbottom data was collected November 5 to December 15, 2009, in the estuary north from Saugerties to Troy. Data...

  20. A Javascript library that uses Windows Script Host (WSH) to analyze prostate motion data fragmented across a multitude of Excel files by the Calypso 4D Localization System.

    Science.gov (United States)

    Vali, Faisal S; Hsi, Alex; Cho, Paul; Parsai, Homayon; Garver, Elizabeth; Garza, Richard

    2008-11-06

    The Calypso 4D Localization System records prostate motion continuously during radiation treatment. It stores the data across thousands of Excel files. We developed Javascript (JScript) libraries for Windows Script Host (WSH) that use ActiveX Data Objects, OLE Automation and SQL to statistically analyze the data and display the results as a comprehensible Excel table. We then leveraged these libraries in other research to perform vector math on data spread across multiple Access databases.
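
    The same aggregation step can be sketched in a modern idiom. Here pandas stands in for the original JScript/WSH/ActiveX machinery (a deliberate substitution), and the directory layout and column names are hypothetical:

        # Sketch of the aggregation task: read prostate-motion fragments spread
        # across many Excel files and summarize 3D displacement. pandas replaces
        # the original JScript/WSH/ActiveX approach; the column names
        # ("dx_mm", "dy_mm", "dz_mm") and the directory are hypothetical.
        import glob
        import pandas as pd

        frames = [pd.read_excel(path) for path in sorted(glob.glob("calypso/*.xls*"))]
        motion = pd.concat(frames, ignore_index=True)

        motion["r_mm"] = (motion[["dx_mm", "dy_mm", "dz_mm"]] ** 2).sum(axis=1) ** 0.5
        summary = motion["r_mm"].describe(percentiles=[0.5, 0.95])
        print(summary)    # count, mean, 50%, 95%, max of 3D displacement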

  1. LOD-A-lot: A single-file enabler for data science

    NARCIS (Netherlands)

    Beek, Wouter; Fernández, Javier D.; Verborgh, Ruben

    2017-01-01

    Many data scientists make use of Linked Open Data (LOD) as a huge interconnected knowledge base represented in RDF. However, the distributed nature of the information and the lack of a scalable approach to manage and consume such Big Semantic Data makes it difficult and expensive to conduct

  2. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, December 2015

    International Nuclear Information System (INIS)

    Cabellos, Oscar; De Saint Jean, Cyrille; Hursin, Mathieu; Pelloni, Sandro; Ivanov, Evgeny; Kodeli, Ivan; Leconte, Pierre; Palmiotti, Giuseppe; Salvatores, Massimo; Sobes, Vladimir; Yokoyama, Kenji

    2015-12-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the fifth formal Subgroup 39 meeting held at the Institute Curie, Paris, France, on 4 December 2015. It comprises a Summary Record of the meeting, and all the available presentations (slides) given by the participants: A - Sensitivity methods: - 1: Short update on deliverables (K. Yokoyama); - 2: Is one-shot Bayesian updating equivalent to successive updates? Bayesian inference: some matrix linear algebra (C. De Saint Jean); - 3: Progress in Methodology (G. Palmiotti); - SG39-3: Use of PIA approach. Possible application to neutron propagation experiments (S. Pelloni); - 4: Update on sensitivity coefficient methods (E. Ivanov); - 5: Stress test for U-235 fission (H. Wu); - 6: Methods and approaches development at ORNL for providing feedback from integral benchmark experiments for improvement of nuclear data files (V. Sobes); B - Integral experiments: - 7a: Update on SEG analysis (G. Palmiotti); - 7b: Status of MANTRA (G. Palmiotti); - 7c: Possible new experiments at NRAD (G. Palmiotti); - 8: B-eff experiments (I. Kodeli); - 9: Ongoing CEA activities related to dedicated integral experiments for nuclear data validation in the fast energy range (P. Leconte); - 10: PROTEUS Experiments: an update (M. Hursin); - 11: Short updates on neutron propagation experiments, STEK, CIELO status (O. Cabellos)

  3. Program for shaping neutron microconstants for calculations by means of the Monte Carlo method on the basis of evaluated data files (NEDAM)

    International Nuclear Information System (INIS)

    Zakharov, L.N.; Markovskij, D.V.; Frank-Kamenetskij, A.D.; Shatalov, G.E.

    1978-01-01

    A program for preparing neutron microconstants for Monte Carlo calculations, oriented toward a detailed treatment of processes in the fast region, is described. The initial information consists of evaluated data files in the UKNDL format. The method combines a group approach to representing the process probabilities and the anisotropy of elastic scattering with an individual description of the secondary neutron spectra of non-elastic processes. The NEDAM program is written in FORTRAN for the BESM-6 computer and has the following characteristics: the initial evaluated-data file length is 20000 words, the multigroup constant file length is 8000 words, and the MARK array length is 1000 words. The calculation time for a single variant is 1-2 min.

  4. PeakML/mzMatch : A File Format, Java Library, R Library, and Tool-Chain for Mass Spectrometry Data Analysis

    NARCIS (Netherlands)

    Scheltema, Richard A.; Jankevics, Andris; Jansen, Ritsert C.; Swertz, Morris A.; Breitling, Rainer

    2011-01-01

    The recent proliferation of high-resolution mass spectrometers has generated a wealth of new data analysis methods. However, flexible integration of these methods into configurations best suited to the research question is hampered by heterogeneous file formats and monolithic software development.

  5. Indicators of the Legal Security of Indigenous and Community Lands. Data file from LandMark: The Global Platform of Indigenous and Community Lands.

    NARCIS (Netherlands)

    Tagliarino, Nicholas Korte

    2016-01-01

    L. Alden Wily, N. Tagliarino, Harvard Law and International Development Society (LIDS), A. Vidal, C. Salcedo-La Vina, S. Ibrahim, and B. Almeida. 2016. Indicators of the Legal Security of Indigenous and Community Lands. Data file from LandMark: The Global Platform of Indigenous and Community Lands.

  6. From Log Files to Assessment Metrics: Measuring Students' Science Inquiry Skills Using Educational Data Mining

    Science.gov (United States)

    Gobert, Janice D.; Sao Pedro, Michael; Raziuddin, Juelaila; Baker, Ryan S.

    2013-01-01

    We present a method for assessing science inquiry performance, specifically for the inquiry skill of designing and conducting experiments, using educational data mining on students' log data from online microworlds in the Inq-ITS system (Inquiry Intelligent Tutoring System; www.inq-its.org). In our approach, we use a 2-step process: First we use…

  7. The processed neutron activation cross-section data files of the FENDL project. Summary documentation

    International Nuclear Information System (INIS)

    Ganesan, S.; Pashchenko, A.B.; Lemmel, H.D.; Mann, F.M.

    1994-01-01

    This document summarises a neutron activation cross-section database which has been processed in two formats for input to MCNP Monte Carlo codes and to REAC transmutation codes. The data are available from the IAEA Nuclear Data Section online via INTERNET by FTP command. (author)

  8. Validation of CENDL and JEFF evaluated nuclear data files for TRIGA calculations through the analysis of integral parameters of TRX and BAPL benchmark lattices of thermal reactors

    International Nuclear Information System (INIS)

    Uddin, M.N.; Sarker, M.M.; Khan, M.J.H.; Islam, S.M.A.

    2009-01-01

    The aim of this paper is to present the validation of the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through the analysis of the integral parameters of TRX and BAPL benchmark lattices of thermal reactors, for neutronics analysis of the TRIGA Mark-II Research Reactor at AERE, Bangladesh. In this process, the 69-group cross-section library for the lattice code WIMS was generated using the basic evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 with the help of the nuclear data processing code NJOY99.0. Integral measurements on the thermal reactor lattices TRX-1, TRX-2, BAPL-UO₂-1, BAPL-UO₂-2 and BAPL-UO₂-3 served as standard benchmarks for testing nuclear data files and were selected for this analysis. The integral parameters of the said lattices were calculated using the lattice transport code WIMSD-5B based on the generated 69-group cross-section library. The calculated integral parameters were compared to the measured values as well as to the results of the Monte Carlo code MCNP. It was found that in most cases the integral parameters show good agreement with the experimental and MCNP results. In addition, the group constants in WIMS format for the isotopes U-235 and U-238 in the two data files were compared using the WIMS library utility code WILLIE, and they were found to be essentially identical, with insignificant differences. Therefore, this analysis validates the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through benchmarking of the integral parameters of the TRX and BAPL lattices, and provides a basis for further neutronics analysis of the TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh.
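
    Validation of this kind ultimately reduces to calculated-to-experiment (C/E) ratios for the benchmark integral parameters. A trivial sketch of that comparison step, with placeholder numbers rather than the actual TRX/BAPL results:

        # Sketch of the comparison step: calculated-to-experiment (C/E) ratios
        # for benchmark integral parameters. Values are placeholders, not the
        # actual TRX/BAPL results.
        measured   = {"k_eff": 1.0000, "rho28": 1.320, "delta25": 0.0987}
        calculated = {"k_eff": 0.9984, "rho28": 1.311, "delta25": 0.0992}

        for name, e in measured.items():
            c = calculated[name]
            print(f"{name:8s} C/E = {c / e:.4f}  ({100 * (c - e) / e:+.2f} %)")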

  9. Hall et al., 2016 Artificial Turf Surrogate Surface Methods Paper Data File

    Data.gov (United States)

    U.S. Environmental Protection Agency — Mercury dry deposition data quantified via static water surrogate surface (SWSS) and artificial turf surrogate surface (ATSS) collectors. This dataset is associated...

  10. NASA Shuttle Radar Topography Mission Water Body Data Shapefiles & Raster Files V003

    Data.gov (United States)

    National Aeronautics and Space Administration — The NASA SRTM data sets result from a collaborative effort by the National Aeronautics and Space Administration (NASA) and the National Geospatial-Intelligence...

  11. Saving HEBBLE Data from Oblivion: From Faded Paper Copy to Digital Files

    Science.gov (United States)

    Mishonov, A. V.; Richardson, M. J.; Gardner, W. D.

    2017-12-01

    The high-energy benthic boundary-layer experiment (HEBBLE) was designed to test the hypothesis that bed modifications can result from contemporary local erosion and deposition. We observed several 'benthic storms' that resuspended record-high concentrations of particulate matter - filtered samples up to 12,700 µg/l. High kinetic energy and near-bed flow were associated with these record-high concentrations of particulate matter at 4,960 m off the Nova Scotian Rise in the north-west Atlantic, showing that large episodic events resuspend bottom sediments in deep ocean areas. As part of HEBBLE, CTD/Transmissometer data were collected in the late 1970s and early 1980s, including more than 40 stations on cruise KN74. Although many papers were published based on HEBBLE data, no electronic copies of the KN74 CTD/Transmissometer data were preserved. Because of the uniqueness of the record-high particulate matter concentrations, with ambient current velocities of >70 cm/sec near the seafloor, it was important to rescue these data. We had a paper printout of all of the digital CTD data. Attempts to scan and apply OCR to the data proved futile with standard copying/scanning machines. The Texas A&M University Library Digital Service Center scanned our copies with a SupraScanQuartzA00-CamQuartzHD scanner and used ABBYY Fine Reader for OCR and PDF output, tools more frequently used in the humanities for digital preservation and conservation. Their scans were markedly better, but still contained many errors because of the poor-quality originals. Two students were hired to QA/QC the hundreds of pages of data. While tedious, they successfully corrected the data, thus making it possible to make the maps and sections shown here and to submit the data to publicly accessible archives for future generations to use. These data reside in the OAKTrust Digital Repository at Texas A&M University. After final QA/QC these data will be submitted to NCEI and will be merged with the World Ocean Database (WOD).

  12. Fission product yield evaluation for the USA evaluated nuclear data files

    International Nuclear Information System (INIS)

    Rider, B.F.; England, T.R.

    1994-01-01

    An evaluated set of fission product yields for use in calculation of decay heat curves with improved accuracy has been prepared. These evaluated yields are based on all known experimental data through 1992. Unmeasured fission product yields are calculated from charge distribution, pairing effects, and isomeric state models developed at Los Alamos National Laboratory. The current evaluation has been distributed as the ENDF/B-VI fission product yield data set

  13. Formalizing a hierarchical file system

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, Muhammad Ikram

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal,

  14. Formalizing a Hierarchical File System

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, M.I.

    2009-01-01

    In this note, we define an abstract file system as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for removal
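
    The definition is easy to make concrete: a partial function from paths to data is, operationally, a finite map keyed by paths. A minimal executable toy model of the idea (an illustration, not the authors' formalization):

        # Executable toy model of the abstract file system: a partial function
        # from absolute paths to data, realized as a dict keyed by path tuples.
        # This illustrates the definition; it is not the authors' formalization.
        class AbstractFS:
            def __init__(self):
                self._map = {}                        # path tuple -> bytes

            def is_valid(self, path):
                return tuple(path) in self._map       # the domain of the partial function

            def read(self, path):
                return self._map[tuple(path)]         # defined only at valid paths

            def write(self, path, data: bytes):
                if not self.is_valid(path):
                    raise KeyError("write requires an existing (valid) path")
                self._map[tuple(path)] = data

            def create(self, path, data: bytes = b""):
                self._map[tuple(path)] = data         # extends the domain

            def remove(self, path):
                del self._map[tuple(path)]            # shrinks the domain

        fs = AbstractFS()
        fs.create(("home", "a.txt"), b"hello")
        fs.write(("home", "a.txt"), b"hi")
        print(fs.read(("home", "a.txt")))             # b'hi'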

  15. Toxic Substances Control Act test submissions database (TSCATS) - comprehensive update. Data file

    International Nuclear Information System (INIS)

    1993-01-01

    The Toxic Substances Control Act Test Submissions Database (TSCATS) was developed to make unpublished test data available to the public. The test data is submitted to the U.S. Environmental Protection Agency by industry under the Toxic Substances Control Act. Test is broadly defined to include case reports, episodic incidents, such as spills, and formal test study presentations. The database allows searching of test submissions according to specific chemical identity or type of study when used with an appropriate search retrieval software program. Studies are indexed under three broad subject areas: health effects, environmental effects and environmental fate. Additional controlled vocabulary terms are assigned which describe the experimental protocol and test observations. Records identify reference information needed to locate the source document, as well as the submitting organization and reason for submission of the test data

  16. Development of web-based user interface for evaluated covariance data files

    International Nuclear Information System (INIS)

    Togashi, Tomoaki; Kato, Kiyoshi; Suzuki, Ryusuke; Otuka, Naohiko

    2010-01-01

    We have developed a web-based interface which visualizes cross sections together with their covariances compiled in the ENDF format, in order to support users of evaluated covariance data who have no experience with NJOY calculations. A package of programs has been constructed without the aid of any existing program libraries. (author)

  17. Restoration and Reexamination of Apollo Lunar Dust Detector Data from Original Telemetry Files

    Science.gov (United States)

    McBride, M. J.; Williams, David R.; Hills, H. Kent

    2012-01-01

    We are recovering the original telemetry from the Apollo Dust, Thermal, Radiation Environment Monitor (DTREM) experiment, more commonly known as the Dust Detector, and producing full time resolution (54 second) data sets for release through the Planetary Data System (PDS). The primary objective of the experiment was to evaluate the effect of dust deposition, temperature, and radiation damage on solar cells on the lunar surface. The monitor was a small box consisting of three solar cells and thermistors mounted on the ALSEP (Apollo Lunar Surface Experiments Package) central station. The Dust Detector was carried on Apollo 11, 12, 14 and 15. The Apollo 11 DTREM was powered by solar cells and only operated for a few months, as planned. The Apollo 12, 14, and 15 detectors operated for 5 to 7 years, returning data every 54 seconds consisting of voltage outputs from the three solar cells and temperatures measured by the three thermistors. The telemetry was received at ground stations and held on the Apollo Housekeeping (known as "Word 33") tapes, made available to the National Space Science Data Center (NSSDC) by Yosio Nakamura (University of Texas Institute for Geophysics). We have converted selected parts of the telemetry into uncalibrated and calibrated output voltages and temperatures.
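
    Producing calibrated outputs from raw telemetry counts is typically a matter of applying per-channel calibration curves. A generic sketch of that step; the channel names and coefficients are invented placeholders, not the actual DTREM calibration constants:

        # Sketch of converting raw telemetry counts to engineering units with
        # per-channel linear calibrations. Channel names and coefficients are
        # invented placeholders, not the actual DTREM calibration constants.
        CAL = {
            "cell_1_volts": (0.0, 0.000305),          # (offset, gain) - hypothetical
            "thermistor_1_degC": (-50.0, 0.0366),
        }

        def calibrate(channel: str, counts: int) -> float:
            offset, gain = CAL[channel]
            return offset + gain * counts

        frame = {"cell_1_volts": 412, "thermistor_1_degC": 2048}
        for ch, raw in frame.items():
            print(f"{ch}: raw={raw} -> {calibrate(ch, raw):.3f}")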

  18. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, November 2013

    International Nuclear Information System (INIS)

    De Saint Jean, C.; Dupont, E.; Dyrda, J.; Hursin, M.; Pelloni, S.; Ishikawa, M.; Ivanov, E.; Ivanova, T.; Kim, D.H.; Ee, Y.O.; Kodeli, I.; Leal, L.; Leichtle, D.; Palmiotti, G.; Salvatores, M.; Pronyaev, V.; Simakov, S.

    2013-11-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the first formal Subgroup 39 meeting held at the NEA, Issy-les-Moulineaux, France, on 28-29 November 2013. It comprises a Summary Record of the meeting and all the available presentations (slides) given by the participants: A - Recent data adjustment performances and trends: 1 - Recommendations from ADJ2010 adjustment (M. Ishikawa); 2 - Feedback on CIELO isotopes from ENDF/B-VII.0 adjustment (G. Palmiotti); 3 - Sensitivity and uncertainty results on FLATTOP-Pu (I. Kodeli); 4 - SG33 benchmark: Comparative adjustment results (S. Pelloni); 5 - Integral benchmarks for data assimilation: selection of a consistent set and establishment of integral correlations (E. Ivanov); 6 - PROTEUS experimental data (M. Hursin); 7 - Additional information on High Conversion Light Water Reactor (HCLWR aka FDWR-II) experiments (14 January 2014); 8 - Data assimilation of benchmark experiments for homogeneous thermal/epithermal uranium systems (J. Dyrda); B - Methodology issues: 1 - Adjustment methodology issues (G. Palmiotti); 2 - Marginalisation, methodology issues and nuclear data parameter adjustment (C. De Saint Jean); 3 - Nuclear data parameter adjustment (G. Palmiotti). A list of issues and actions concludes the document.

  19. Comparison of burnup calculation results using several evaluated nuclear data files

    International Nuclear Information System (INIS)

    Suyama, Kenya; Katakura, Jun-ichi; Nomura, Yasushi

    2002-01-01

    Burn-up calculations and comparison of the results were carried out to clarify the differences among the latest evaluated nuclear data libraries: JENDL-3.2, ENDF/B-VI and JEF-2.2. The analyses showed that the differences among the current evaluated nuclear data libraries are small for evaluation of the amounts of many uranium and plutonium isotopes. However, several nuclides important for evaluation of the nuclear fuel cycle, such as ²³⁸Pu, ²⁴⁴Cm, ¹⁴⁹Sm and ¹³⁴Cs, showed large differences among the libraries used. Chain analyses for these isotopes were conducted and the reasons for the differences were discussed. Based on the discussion, information on the important cross sections needed to obtain better agreement with the experimental results for ²³⁸Pu, ²⁴⁴Cm, ¹⁴⁹Sm and ¹³⁴Cs is given. (author)

  20. The NBER Patent Citation Data File: Lessons, Insights and Methodological Tools

    OpenAIRE

    Bronwyn H. Hall; Adam B. Jaffe; Manuel Trajtenberg

    2001-01-01

    This paper describes the database on U.S. patents that we have developed over the past decade, with the goal of making it widely accessible for research. We present main trends in U. S. patenting over the last 30 years, including a variety of original measures constructed with citation data, such as backward and forward citation lags, indices of 'originality' and 'generality', self-citations, etc. Many of these measures exhibit interesting differences across the six main technological categor...

  1. Improvement of Evaluated Nuclear Data Files with Emphasis on Activation and Dosimetry Reactions

    International Nuclear Information System (INIS)

    Trkov, A.

    2013-01-01

    Researchers from the Jozef Stefan Institute are actively involved in the development of the fusion technology through the Slovenian Fusion Association. Neutronics calculations are one of the key areas of expertise and FENDL library is the reference library for the ITER device. The quality of the nuclear data library is therefore of great concern, which drives the motivation for participation in the CRP. Specific contributions to the CRP are briefly described

  2. File sharing

    NARCIS (Netherlands)

    van Eijk, N.

    2011-01-01

    ‘File sharing’ has become generally accepted on the Internet. Users share files for downloading music, films, games, software, etc. In this note, we take a closer look at the definition of file sharing, the legal and policy-based context, as well as enforcement issues. The economic and cultural

  3. JENDL Dosimetry File

    International Nuclear Information System (INIS)

    Nakazawa, Masaharu; Iguchi, Tetsuo; Kobayashi, Katsuhei; Iwasaki, Shin; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo.

    1992-03-01

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d, n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form. (author) 76 refs
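
    The integral tests quoted here compare spectrum-averaged cross sections, ⟨σ⟩ = ∫σ(E)φ(E)dE / ∫φ(E)dE, computed from the file against values measured in the benchmark fields. A numerical sketch in a group-wise representation, with made-up group values:

        # Sketch of a spectrum-averaged cross section, the quantity compared in
        # the integral tests: sigma_avg = sum(sigma_g * phi_g) / sum(phi_g) in a
        # group-wise representation. Group values below are made up.
        sigma = [0.002, 0.015, 0.110, 0.480]    # group cross sections (barns)
        phi   = [0.10,  0.35,  0.40,  0.15]     # group fluxes (relative)

        sigma_avg = sum(s * p for s, p in zip(sigma, phi)) / sum(phi)
        print(f"spectrum-averaged cross section: {sigma_avg:.4f} b")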

  4. JENDL Dosimetry File

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Masaharu; Iguchi, Tetsuo [Tokyo Univ. (Japan). Faculty of Engineering; Kobayashi, Katsuhei [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Iwasaki, Shin [Tohoku Univ., Sendai (Japan). Faculty of Engineering; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1992-03-15

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d,n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form.

  5. GUFDIPP - the GUERAP user-friendly data-input programme package for generating and modifying GUERAP III data-files

    International Nuclear Information System (INIS)

    Richards, A.G.

    1989-11-01

    This document describes the GUERAP User-Friendly Data-Input Program Package, called GUFDIPP for short. GUFDIPP is a large suite of programs, developed at RAL (Rutherford Appleton Laboratory), with the specific purpose of providing a user-friendly interface to the GUERAP III straylight analysis program. GUERAP III is a powerful, Monte-Carlo based program (supplied under licence from ESTEC) for simulating the transfer of electromagnetic radiation between the surfaces of a physical structure, and it requires a rather detailed data-set to describe the structure to be modelled. GUFDIPP was developed in order to permit the GUERAP III dataset to be easily, efficiently and accurately built up and modified, so that access to the GUERAP III program would be much easier, thereby encouraging its use. This document acts as a user manual for GUFDIPP. Perhaps the most powerful of GUFDIPP's capabilities are those which permit the extraction of subsets of surfaces from a model's dataset to create a new 'sub-model' and the ability to 'add' two models' datasets to create a new 'merged' model. These permit considerable time saving when entering constraint surface information for a sensor model. (author)

  6. NOAA chlorofluorocarbon tracer program air and seawater measurements: 1986-1989. Data file

    International Nuclear Information System (INIS)

    Wisegarver, D.P.; Bullister, J.L.; Gammon, R.H.; Menzia, F.A.; Kelly, K.C.

    1993-04-01

    The NOAA Chlorofluorocarbon (CFC) Tracer Program at PMEL has been measuring the growing burden of these anthropogenic gases in the thermocline waters of the Pacific Ocean since 1980. The central goals of the NOAA CFC Tracer Program are to document the transient invasion of the CFC tracers into the Pacific Ocean, by means of repeat occupations of key hydrographic sections at 5-year intervals, and to interpret these changing distributions in terms of coupled ocean-atmosphere models. Studies are underway to use the CFC observations in model-validation studies, and to help develop predictive capabilities on the decade-to-century timescale. The report includes measurements of trichlorofluoromethane (CFC-11) and dichlorodifluoromethane (CFC-12) dissolved in seawater samples collected in the Pacific Ocean by the NOAA CFC Tracer Program on six cruises during the period of 1986-1989. Measurements of depth, pressure, salinity, temperature, and dissolved oxygen are included with the CFC data. Measurements of CFC-11 and CFC-12 in air samples collected along the cruise tracks are also included in the report. Data from the report are also available from the authors in digital format

  7. Analysis of Log File Data to Understand Mobile Service Context and Usage Patterns

    Directory of Open Access Journals (Sweden)

    Bernhard Klein

    2013-09-01

    Several mobile acceptance models exist today that focus on user interface handling and usage frequency evaluation. Since mobile applications reach much deeper into everyday life, it is however important to better consider user behaviour in the service evaluation. In this paper we introduce the Behaviour Assessment Model (BAM), which is designed to gain insights about how well services enable, enhance and replace human activities. More specifically, the basic columns of the evaluation framework concentrate on (1) service actuation in relation to the current user context, (2) the balance between service usage effort and benefit, and (3) the degree to which community knowledge can be exploited. The evaluation is guided by a process model that specifies individual steps of data capturing, aggregation, and final assessment. The BAM helps to gain stronger insights regarding characteristic usage hotspots, frequent usage patterns, and the leveraging of networking effects, showing more realistically the strengths and weaknesses of mobile services.

  8. Yeast-2-Hybrid data file showing progranulin interactions in human fetal brain and bone marrow libraries

    Directory of Open Access Journals (Sweden)

    Irmgard Tegeder

    2016-12-01

    Progranulin deficiency in humans is associated with neurodegeneration. Its mechanisms are not yet fully understood. We performed a Yeast-2-Hybrid screen using human full-length progranulin as bait to assess the interactions of progranulin. Progranulin was screened against human fetal brain and human bone marrow libraries using the standard Matchmaker technology (Clontech). This article contains the full Y2H data table, including blast results and sequences, a sorted table according to selection criteria for likely positive, putatively positive, likely false and false preys, and tables showing the gene ontology terms associated with the likely and putative preys of the brain and bone marrow libraries. The interactions with autophagy proteins were confirmed and functionally analyzed in "Progranulin overexpression in sensory neurons attenuates neuropathic pain in mice: Role of autophagy" (C. Altmann, S. Hardt, C. Fischer, J. Heidler, H.Y. Lim, A. Haussler, B. Albuquerque, B. Zimmer, C. Moser, C. Behrends, F. Koentgen, I. Wittig, M.H. Schmidt, A.M. Clement, T. Deller, I. Tegeder, 2016) [1].

  9. Yeast-2-Hybrid data file showing progranulin interactions in human fetal brain and bone marrow libraries.

    Science.gov (United States)

    Tegeder, Irmgard

    2016-12-01

    Progranulin deficiency in humans is associated with neurodegeneration. Its mechanisms are not yet fully understood. We performed a Yeast-2-Hybrid screen using human full-length progranulin as bait to assess the interactions of progranulin. Progranulin was screened against human fetal brain and human bone marrow libraries using the standard Matchmaker technology (Clontech). This article contains the full Y2H data table, including blast results and sequences, a sorted table according to selection criteria for likely positive, putatively positive, likely false and false preys, and tables showing the gene ontology terms associated with the likely and putative preys of the brain and bone marrow libraries. The interactions with autophagy proteins were confirmed and functionally analyzed in "Progranulin overexpression in sensory neurons attenuates neuropathic pain in mice: Role of autophagy" (C. Altmann, S. Hardt, C. Fischer, J. Heidler, H.Y. Lim, A. Haussler, B. Albuquerque, B. Zimmer, C. Moser, C. Behrends, F. Koentgen, I. Wittig, M.H. Schmidt, A.M. Clement, T. Deller, I. Tegeder, 2016) [1].

  10. MMLEADS Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare-Medicaid Linked Enrollee Analytic Data Source (MMLEADS) Public Use File (PUF) contains demographic, enrollment, condition prevalence, utilization, and...

  11. Hospital Service Area File

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file is derived from the calendar year inpatient claims data. The records contain number of discharges, length of stay, and total charges summarized by provider...

  12. USEEIO Satellite Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — These files contain the environmental data as particular emissions or resources associated with a BEA sectors that are used in the USEEIO model. They are organized...

  13. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file contains data on characteristics of hospitals and other types of healthcare facilities, including the name and address of the facility and the type of...

  14. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, May 2016

    International Nuclear Information System (INIS)

    Herman, Michal Wladyslaw; Cabellos De Francisco, Oscar; Beck, Bret; Ignatyuk, Anatoly V.; Palmiotti, Giuseppe; Grudzevich, Oleg T.; Salvatores, Massimo; Chadwick, Mark; Pelloni, Sandro; Diez De La Obra, Carlos Javier; Wu, Haicheng; Sobes, Vladimir; Rearden, Bradley T.; Yokoyama, Kenji; Hursin, Mathieu; Penttila, Heikki; Kodeli, Ivan-Alexander; Plevnik, Lucijan; Plompen, Arjan; Gabrielli, Fabrizio; Leal, Luiz Carlos; Aufiero, Manuele; Fiorito, Luca; Hummel, Andrew; Siefman, Daniel; Leconte, Pierre

    2016-05-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. WPEC subgroup 40-CIELO (Collaborative International Evaluated Library Organization) provides a new working paradigm to facilitate evaluated nuclear reaction data advances. It brings together experts from across the international nuclear reaction data community to identify and document discrepancies among existing evaluated data libraries, measured data, and model calculation interpretations, and aims to make progress in reconciling these discrepancies to create more accurate ENDF-formatted files. SG40-CIELO focusses on 6 important isotopes: ¹H, ¹⁶O, ⁵⁶Fe, ²³⁵,²³⁸U, ²³⁹Pu. This document is the proceedings of the seventh formal Subgroup 39 meeting and of the Joint SG39+SG40 Session held at the NEA, OECD Conference Center, Paris, France on 10-11 May 2016. It comprises a Summary Record of the meeting, and all the available presentations (slides) given by the participants: A - Welcome and actions review (Oscar CABELLOS); B - Methods: - XGPT: uncertainty propagation and data assimilation from continuous energy covariance matrix and resonance parameters covariances (Manuele AUFIERO); - Optimal experiment utilization (REWINDing PIA) (G. Palmiotti); C - Experiment analysis, sensitivity calculations and benchmarks: - Tripoli-4 analysis of SEG experiments (Andrew HUMMEL); - Tripoli-4 analysis of BERENICE experiments (P. DUFAY, Cyrille DE SAINT JEAN); - Preparation of sensitivities of k-eff, beta-eff and shielding benchmarks for adjustment exercise (Ivo KODELI); - SA and

  15. mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data.

    Science.gov (United States)

    Larralde, Martin; Lawson, Thomas N; Weber, Ralf J M; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R; Steinbeck, Christoph; Salek, Reza M

    2017-08-15

    Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. Contact: reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
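
    The conversion workflow the abstract describes can be driven from a few lines of Python. The sketch below is a minimal example; the convert() helper, its argument names, and all paths and the study identifier are assumptions based on the package's documented usage, not taken from this record.

        # Minimal sketch: generate ISA-Tab stubs from a folder of mzML files.
        # The convert() helper and its signature are assumed from the package
        # documentation; paths and accession are hypothetical placeholders.
        from mzml2isa.parsing import convert

        convert(
            in_path="raw/MTBLS000/",      # directory of .mzML / .imzML files
            out_path="isa/MTBLS000/",     # where i_*.txt, s_*.txt, a_*.txt stubs go
            study_identifier="MTBLS000",  # MetaboLights-style accession (hypothetical)
        )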

  16. Flat Files - JSNP | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Data file name: jsnp_flat_files. File URL: ftp://ftp.biosciencedbc.jp/archiv...

  17. ZZ ENDF/B-V, Evaluated Nuclear Data File Version 5

    International Nuclear Information System (INIS)

    Kinsey, R.; Magurno, B.A.; Young, P.G.

    2006-01-01

    -152, 57-La-138-155, 58-Ce-140-157, 59-Pr-141-142, 59-Pr-142m, 59-Pr-143-144, 59-Pr-144m, 59-Pr-145-159, 60-Nd-142-161, 61-Pm-147-148, 61-Pm-148m, 61-Pm-149-152, 61-Pm-152m, 61-Pm-152N, 61-Pm-153-154, 61-Pm-154m, 61-Pm-155-162, 62-Sm-144-165, 63-Eu-151-152, 63-Eu-152m, 63-Eu-152N, 63-Eu-153-154, 63-Eu-154m, 63-Eu-155-166, 64-Gd-152-165, 65-Tb-159-165, 66-Dy-160-165, 66-Dy-165m, 66-Dy-166, 67-Ho-165-166, 67-Ho-166m, 68-Er-166-167, 68-Er-167m, 71-Lu-175-176, 72-Hf-174, 72-Hf-176-181, 73-Ta-181-182, 74-W-182-184, 74-W-186-187, 75-Re-185, 75-Re-187, 79-Au-197, 81-Tl-208, 82-Pb-212-213, 83-Bi-209, 83-Bi-212, 84-Po-216, 86-Rn-220, 88-Ra-224, 90-Th-228, 90-Th-230-233, 91-Pa-231-233, 92-U-232-239, 93-Np-236, 93-Np-236m, 93-Np-237-239, 94-Pu-236-244, 95-Am-240-242, 95-Am-242m, 95-Am-243-244, 95-Am-244m, 96-Cm-241-249, 97-Bk-249-250, 98-Cf-249-253, 99-Es-253 Origin: BNL; Weighting Spectrum: none. Contains the evaluated neutron cross-section data for 90 nuclides, issued by the National Nuclear Data Center at the Brookhaven National Laboratory (NNDC-BNL) in 1979

  18. A study of existing experimental data and validation process for evaluated high energy nuclear data. Report of task force on integral test for JENDL High Energy File in Japanese Nuclear Data Committee

    International Nuclear Information System (INIS)

    Oyama, Yukio; Baba, Mamoru; Watanabe, Yukinobu

    1998-11-01

    JENDL High Energy File (JENDL-HE) is being produced by the Japanese Nuclear Data Committee (JNDC) to provide common fundamental nuclear data in the intermediate energy region for many applications, including basic research, accelerator-driven nuclear waste transmutation, fusion material studies, and medical applications such as radiation therapy. The first version of JENDL-HE, which contains the evaluated nuclear data up to 50 MeV, is planned for release in 1998. However, a method of integral testing with which the high-energy nuclear data file can be validated has not been established. The validation of evaluated nuclear data through integral tests is necessary to promote the utilization of JENDL-HE. JNDC set up a task force in 1997 to discuss the problems concerning the integral tests of JENDL-HE. The task force members have surveyed and studied the current status of the problems for a year to obtain a guideline for development of the high-energy nuclear database. This report summarizes the results of the survey and study done by the task force for JNDC. (author)

  19. Evaluated neutronic file for indium

    International Nuclear Information System (INIS)

    Smith, A.B.; Chiba, S.; Smith, D.L.; Meadows, J.W.; Guenther, P.T.; Lawson, R.D.; Howerton, R.J.

    1990-01-01

    A comprehensive evaluated neutronic data file for elemental indium is documented. This file, extending from 10⁻⁵ eV to 20 MeV, is presented in the ENDF/B-VI format, and contains all neutron-induced processes necessary for the vast majority of neutronic applications. In addition, an evaluation of the ¹¹⁵In(n,n')¹¹⁶ᵐIn dosimetry reaction is presented as a separate file. Attention is given to quantitative values, with corresponding uncertainty information. These files have been submitted for consideration as a part of the ENDF/B-VI national evaluated-file system. 144 refs., 10 figs., 4 tabs

  20. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, December 2016

    International Nuclear Information System (INIS)

    Cabellos, Oscar; ); PELLONI, Sandro; Ivanov, Evgeny; Sobes, Vladimir; Fukushima, M.; Yokoyama, Kenji; Palmiotti, Giuseppe; Kodeli, Ivo

    2016-12-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the eighth Subgroup 39 meeting, held at the OECD NEA, Boulogne-Billancourt, France, on 1-2 December 2016. It comprises all the available presentations (slides) given by the participants: A - Presentations: Welcome and actions review (Oscar CABELLOS); B - Methods: - Detailed comparison of Progressive Incremental Adjustment (PIA) sequence results involving adjustments of spectral indices and coolant density effects on the basis of the SG33 benchmark (Sandro PELLONI); - ND assessment alternatives: Validation matrix vs XS adjustment (Evgeny IVANOV); - Implementation of Resonance Parameter Sensitivity Coefficients Calculation in CE TSUNAMI-3D (Vladimir SOBES); C - Experiment analysis, sensitivity calculations and benchmarks: - Benchmark tests of ENDF/B-VIII.0 beta 1 using sodium void reactivity worth of FCA-XXVII-1 assembly (M. FUKUSHIMA, Kenji YOKOYAMA); D - Adjustments: - Cross-section adjustment based on JENDL-4.0 using new experiments on the basis of the SG33 benchmark (Kenji YOKOYAMA); - Comparison of adjustment trends with the Cielo evaluation (Sandro PELLONI); - Expanded adjustment in support of CIELO initiative (Giuseppe PALMIOTTI); - First preliminary results of the adjustment exercise using ASPIS Fe88 and SNEAK-7A/7B k-eff and beta-eff benchmarks (Ivo KODELI); E - Future actions, deliverables: - Discussion on future of SG39 and possible new subgroup (Giuseppe PALMIOTTI); - WPEC sub-group proposal: Investigation of Covariance Data in

  1. Long term file migration. Part I: file reference patterns

    International Nuclear Information System (INIS)

    Smith, A.J.

    1978-08-01

    In most large computer installations, files are moved between on-line disk and mass storage (tape, integrated mass storage device) either automatically by the system or specifically at the direction of the user. This is the first of two papers which study the selection of algorithms for the automatic migration of files between mass storage and disk. The use of the text editor data sets at the Stanford Linear Accelerator Center (SLAC) computer installation is examined through the analysis of thirteen months of file reference data. Most files are used very few times. Of those that are used sufficiently frequently that their reference patterns may be examined, about a third show declining rates of reference during their lifetime; of the remainder, very few (about 5%) show correlated interreference intervals, and interreference intervals (in days) appear to be more skewed than would occur with the Bernoulli process. Thus, about two-thirds of all sufficiently active files appear to be referenced as a renewal process with a skewed interreference distribution. A large number of other file reference statistics (file lifetimes, interreference distributions, moments, means, number of uses/file, file sizes, file rates of reference, etc.) are computed and presented. The results are applied in the following paper to the development and comparative evaluation of file migration algorithms. 17 figures, 13 tables
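
    The interreference statistics described above are straightforward to compute from a reference trace. The sketch below is a minimal illustration: the (file_id, day) trace layout is an assumed toy format, not the SLAC data analyzed in the paper.

        # Minimal sketch: per-file interreference intervals (in days) and a
        # simple skewness measure from a reference trace. The trace format
        # (file_id, day) is an illustrative assumption, not the SLAC layout.
        from collections import defaultdict
        from statistics import mean, pstdev

        trace = [("a", 1), ("b", 1), ("a", 4), ("a", 12), ("b", 30)]  # toy data

        refs = defaultdict(list)
        for file_id, day in trace:
            refs[file_id].append(day)

        for file_id, days in refs.items():
            ivals = [b - a for a, b in zip(days, days[1:])]
            if len(ivals) < 2:
                continue  # too few references to characterize the pattern
            m, s = mean(ivals), pstdev(ivals)
            skew = mean((x - m) ** 3 for x in ivals) / (s ** 3) if s else 0.0
            print(file_id, ivals, round(skew, 2))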

  2. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, November 2014

    International Nuclear Information System (INIS)

    Aufiero, Manuele; Ivanov, Evgeny; Hoefer, Axel; Yokoyama, Kenji; Da Cruz, Dirceu Ferreira; KODELI, Ivan-Alexander; Hursin, Mathieu; Pelloni, Sandro; Palmiotti, Giuseppe; Salvatores, Massimo; Barnes, Andrew; Cabellos De Francisco, Oscar; ); Ivanova, Tatiana; )

    2014-11-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the third formal Subgroup meeting held at the NEA, Issy-les-Moulineaux, France, on 27-28 November 2014. It comprises a Summary Record of the meeting and all the available presentations (slides) given by the participants: A - Sensitivity methods: 1 - Perturbation/sensitivity calculations with Serpent (M. Aufiero); 2 - Comparison of deterministic and Monte Carlo sensitivity analysis of SNEAK-7A and FLATTOP-Pu Benchmarks (I. Kodeli); B - Integral experiments: 1 - PROTEUS experiments: selected experiments sensitivity profiles and availability (M. Hursin, M. Salvatores - PROTEUS Experiments, HCLWR configurations); 2 - SINBAD Benchmark Database and FNS/JAEA Liquid Oxygen TOF Experiment Analysis (I. Kodeli); 3 - STEK experiment: Opportunity for Validation of Fission Products Nuclear Data (D. Da Cruz); 4 - SEG (tailored adjoint flux shapes) (M. Salvatores - comments); 5 - IPPE transmission experiments (Fe, ²³⁸U) (T. Ivanova); 6 - RPI semi-integral (Fe, ²³⁸U) (G. Palmiotti - comments); 7 - New experiments, e.g. in connection with the new NSC Expert Group on 'Improvement of Integral Experiments Data for Minor Actinide Management' (G. Palmiotti - Some comments from the Expert Group); 8 - Additional PSI adjustment studies accounting for nonlinearity (S. Pelloni); 9 - Adjustment methodology issues (G. Palmiotti); C - Am-241 and fission product issues: 1 - Am-241 validation for criticality-safety calculations (A. Barnes - Visio

  3. ENDF-6 Formats Manual. Data Formats and Procedures for the Evaluated Nuclear Data File ENDF/B-VI and ENDF/B-VII

    International Nuclear Information System (INIS)

    Herman, M.

    2009-01-01

    In December 2006, the Cross Section Evaluation Working Group (CSEWG) of the United States released the new ENDF/B-VII.0 library. This represented a considerable achievement, as it was the first major release since 1990, when ENDF/B-VI was made publicly available. The two libraries have been released in the same format, ENDF-6, which was originally developed for the ENDF/B-VI library. In the early stage of work on the VII-th generation of the library, CSEWG made the important decision to use the same formats. This decision was adopted even though it was argued that it would be timely to modernize the formats, and several interesting ideas were proposed. After careful deliberation, CSEWG concluded that actual implementation would require considerable resources to modify processing codes and to guarantee the high quality of the files processed by these codes. In view of this, the idea of format modernization was postponed and the ENDF-6 format was adopted for the new ENDF/B-VII library. In several other areas related to ENDF we did our best to move beyond established tradition and achieve maximum modernization. Thus, the 'Big Paper' on ENDF/B-VII.0 was published, also in December 2006, as the Special Issue of Nuclear Data Sheets 107 (2006) 2931-3060. The new web retrieval and plotting system for ENDF-6 formatted data, Sigma, was developed by the NNDC and released in 2007. An extensive paper on the advanced tool for nuclear reaction data evaluation, EMPIRE, was published in 2007. This effort was complemented by the release of an updated set of ENDF checking codes in 2009. As the final item on this list, a major revision of the ENDF-6 Formats Manual was made. This work started in 2006 and came to fruition in 2009, as documented in the present report.

  4. ENDF-6 Formats Manual Data Formats and Procedures for the Evaluated Nuclear Data File ENDF/B-VI and ENDF/B-VII

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Members of the Cross Sections Evaluation Working Group

    2009-06-01

    In December 2006, the Cross Section Evaluation Working Group (CSEWG) of the United States released the new ENDF/B-VII.0 library. This represented a considerable achievement, as it was the first major release since 1990, when ENDF/B-VI was made publicly available. The two libraries have been released in the same format, ENDF-6, which was originally developed for the ENDF/B-VI library. In the early stage of work on the VII-th generation of the library, CSEWG made the important decision to use the same formats. This decision was adopted even though it was argued that it would be timely to modernize the formats, and several interesting ideas were proposed. After careful deliberation, CSEWG concluded that actual implementation would require considerable resources to modify processing codes and to guarantee the high quality of the files processed by these codes. In view of this, the idea of format modernization was postponed and the ENDF-6 format was adopted for the new ENDF/B-VII library. In several other areas related to ENDF we did our best to move beyond established tradition and achieve maximum modernization. Thus, the 'Big Paper' on ENDF/B-VII.0 was published, also in December 2006, as the Special Issue of Nuclear Data Sheets 107 (2006) 2931-3060. The new web retrieval and plotting system for ENDF-6 formatted data, Sigma, was developed by the NNDC and released in 2007. An extensive paper on the advanced tool for nuclear reaction data evaluation, EMPIRE, was published in 2007. This effort was complemented by the release of an updated set of ENDF checking codes in 2009. As the final item on this list, a major revision of the ENDF-6 Formats Manual was made. This work started in 2006 and came to fruition in 2009, as documented in the present report.

  5. Impact of up-to-date evaluated nuclear data files on the Monte-Carlo analysis results of metallic fueled BFS critical assemblies

    International Nuclear Information System (INIS)

    Yoo, Jaewoon; Kim, Do-Heon; Kim, Sang-Ji; Kim, Yeong-Il

    2009-01-01

    Three metallic fueled BFS critical assemblies, BFS-73-1, BFS-75-1, and BFS-55-1, were analyzed by using the Monte-Carlo analysis code MCNP4C with five different evaluated data files: ENDF/B-VII.0, JEFF-3.1, JENDL-3.3, JENDL-AC and ENDF/B-VI.6. The impacts of the microscopic cross sections in the up-to-date evaluated nuclear data files were clarified by the analyses. The update of the Zr cross section leads to calculated k-effectives lower than those of ENDF/B-VI.6. The revision of the U-238 inelastic scattering cross section makes a large difference in the predicted k-effectives between the libraries, which depends on the magnitude of the inelastic cross-section change and the compensation by other reaction types. The results for the spectral indices and reaction rate ratios show the improvement of the up-to-date evaluated nuclear data files for the U-238, Np-237 and Pu-240 fission reactions; however, further improvement is still needed for other minor actinide cross sections. The heterogeneity effects on the k-effective and the relative fission rate distribution were evaluated in this study, and can be used as correction factors for constructing a homogeneous benchmark configuration while keeping consistency with the actual critical experiment. (author)

  6. Land Boundary Conditions for the Goddard Earth Observing System Model Version 5 (GEOS-5) Climate Modeling System: Recent Updates and Data File Descriptions

    Science.gov (United States)

    Mahanama, Sarith P.; Koster, Randal D.; Walker, Gregory K.; Takacs, Lawrence L.; Reichle, Rolf H.; De Lannoy, Gabrielle; Liu, Qing; Zhao, Bin; Suarez, Max J.

    2015-01-01

    The Earth's land surface boundary conditions in the Goddard Earth Observing System version 5 (GEOS-5) modeling system were updated using recent high spatial and temporal resolution global data products. The updates include: (i) construction of a global 10-arcsec land-ocean-lakes-ice mask; (ii) incorporation of a 10-arcsec Globcover 2009 land cover dataset; (iii) implementation of Level 12 Pfafstetter hydrologic catchments; (iv) use of hybridized SRTM global topography data; (v) construction of the HWSDv1.21-STATSGO2 merged global 30-arcsec soil mineral and carbon data in conjunction with a highly-refined soil classification system; (vi) production of diffuse visible and near-infrared 8-day MODIS albedo climatologies at 30-arcsec from the period 2001-2011; and (vii) production of the GEOLAND2 and MODIS merged 8-day LAI climatology at 30-arcsec for GEOS-5. The global data sets were preprocessed and used to construct global raster data files for the software (mkCatchParam) that computes parameters on catchment-tiles for various atmospheric grids. The updates also include a few bug fixes in mkCatchParam, as well as changes (improvements in algorithms, etc.) to mkCatchParam that allow it to produce tile-space parameters efficiently for high resolution AGCM grids. The update process also includes the construction of data files describing the vegetation type fractions, soil background albedo, nitrogen deposition and mean annual 2m air temperature to be used with the future Catchment CN model and the global stream channel network to be used with the future global runoff routing model. This report provides detailed descriptions of the data production process and data file format of each updated data set.

  7. Mixed-Media File Systems

    NARCIS (Netherlands)

    Bosch, H.G.P.

    1999-01-01

    This thesis addresses the problem of implementing mixed-media storage systems. In this work a mixed-media file system is defined to be a system that stores both conventional (best-effort) file data and real-time continuous-media data. Continuous-media data is usually bulky, and servers storing and

  8. 21 CFR 514.11 - Confidentiality of data and information in a new animal drug application file.

    Science.gov (United States)

    2010-04-01

    ..., reports under §§ 514.80 and 510.301 of this chapter, master files, and other related submissions. The... report, such as a physician, hospital, or other institution. (5) A list of all active ingredients and any... summaries of oral discussions relating to the NADA, in accordance with the provisions of part 20 of this...

  9. Building Parts Inventory Files Using the AppleWorks Data Base Subprogram and Apple IIe or GS Computers.

    Science.gov (United States)

    Schlenker, Richard M.

    This manual is a "how to" training device for building database files using the AppleWorks program with an Apple IIe or Apple IIGS Computer with Duodisk or two disk drives and an 80-column card. The manual provides step-by-step directions, and includes 25 figures depicting the computer screen at the various stages of the database file…

  10. Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files. SG39 meeting, May 2014

    International Nuclear Information System (INIS)

    Aliberti, G.; Archier, P.; Dunn, M.; Dupont, E.; Hill, I.; ); Garcia, A.; Hursin, M.; Pelloni, S.; Ivanova, T.; Kodeli, I.; Palmiotti, G.; Salvatores, M.; Touran, N.; Wenming, Wang; Yokoyama, K.

    2014-05-01

    The aim of WPEC subgroup 39 'Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files' is to provide criteria and practical approaches to use effectively the results of sensitivity analyses and cross section adjustments for feedback to evaluators and differential measurement experimentalists in order to improve the knowledge of neutron cross sections, uncertainties, and correlations to be used in a wide range of applications. This document is the proceedings of the second Subgroup meeting, held at the NEA, Issy-les-Moulineaux, France, on 13 May 2014. It comprises a Summary Record of the meeting and all the available presentations (slides) given by the participants: A - Welcome: Review of actions (M. Salvatores); B - Inter-comparison of sensitivity coefficients: 1 - Sensitivity Computation with Monte Carlo Methods (T. Ivanova); 2 - Sensitivity analysis of FLATTOP-Pu (I. Kodeli); 3 - Sensitivity coefficients by means of SERPENT-2 (S. Pelloni); 4 - Demonstration - Database for ICSBEP (DICE) and Database and Analysis Tool for IRPhE (IDAT) (I. Hill); C - Specific new experiments: 1 - PROTEUS FDWR-II (HCLWR) program summary (M. Hursin); 2 - STEK and SEG Experiments (M. Salvatores); 3 - Experiments related to ²³⁵U, ²³⁸U, ⁵⁶Fe and ²³Na (G. Palmiotti); 4 - Validation of Iron Cross Sections against ASPIS Experiments (JEF/DOC-420) (I. Kodeli); 5 - Benchmark analysis of Iron Cross-sections (EFFDOC-1221) (I. Kodeli); 6 - Integral Beta-effective Measurements (K. Yokoyama on behalf of M. Ishikawa); D - Adjustment results: 1 - Impacts of Covariance Data and Interpretation of Adjustment Trends of ADJ2010 (K. Yokoyama); 2 - Revised Recommendations from ADJ2010 Adjustment (K. Yokoyama); 3 - Comparisons and Discussions on Adjustment trends from JEFF (CEA) (P. Archier); 4 - Feedback on CIELO Isotopes from ENDF/B-VII.0 Adjustment (G. Palmiotti); 5 - Demonstration - Plot comparisons of participants' results (E

  11. Digital Elevation Model (DEM) file of topographic elevations for the Death Valley region of southern Nevada and southeastern California processed from US Geological Survey 1-degree Digital Elevation Model data files

    International Nuclear Information System (INIS)

    Turner, A.K.; D'Agnese, F.A.; Faunt, C.C.

    1996-01-01

    Elevation data have been compiled into a digital data base for an ∼100,000-km² area of the southern Great Basin, the Death Valley region of southern Nevada, and SE Calif., located between lat 35°N, long 115°W, and lat 38°N, long 118°W. This region includes the Nevada Test Site, Yucca Mountain, and adjacent parts of southern Nevada and eastern California and encompasses the Death Valley regional ground-water system. Because digital maps are often useful for applications other than that for which they were originally intended, and because the area corresponds to a region under continuing investigation by several groups, these digital files are being released by USGS.

  12. Cut-and-Paste file-systems : integrating simulators and file systems

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    We have implemented an integrated and configurable file system called the PFS and a trace-driven file-system simulator called Patsy. Patsy is used for off-line analysis of file-system algorithms; PFS is used for on-line file-system data storage. Algorithms are first analyzed in Patsy and when we are

  13. PeakML/mzMatch: a file format, Java library, R library, and tool-chain for mass spectrometry data analysis.

    Science.gov (United States)

    Scheltema, Richard A; Jankevics, Andris; Jansen, Ritsert C; Swertz, Morris A; Breitling, Rainer

    2011-04-01

    The recent proliferation of high-resolution mass spectrometers has generated a wealth of new data analysis methods. However, flexible integration of these methods into configurations best suited to the research question is hampered by heterogeneous file formats and monolithic software development. The mzXML, mzData, and mzML file formats have enabled uniform access to unprocessed raw data. In this paper we present our efforts to produce an equally simple and powerful format, PeakML, to uniformly exchange processed intermediary and result data. To demonstrate the versatility of PeakML, we have developed an open source Java toolkit for processing, filtering, and annotating mass spectra in a customizable pipeline (mzMatch), as well as a user-friendly data visualization environment (PeakML Viewer). The PeakML format in particular enables the flexible exchange of processed data between software created by different groups or companies, as we illustrate by providing a PeakML-based integration of the widely used XCMS package with mzMatch data processing tools. As an added advantage, downstream analysis can benefit from direct access to the full mass trace information underlying summarized mass spectrometry results, providing the user with the means to rapidly verify results. The PeakML/mzMatch software is freely available at http://mzmatch.sourceforge.net, with documentation, tutorials, and a community forum.

  14. Portable File Format (PFF) specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Created at Sandia National Laboratories, the Portable File Format (PFF) allows binary data transfer across computer platforms. Although this capability is supported by many other formats, PFF files are still in use at Sandia, particularly in pulsed power research. This report provides detailed PFF specifications for accessing data without relying on legacy code.

  15. source files for manuscript in tex format

    Data.gov (United States)

    U.S. Environmental Protection Agency — Source tex files used to create the manuscript including original figure files and raw data used in tables and inline text. This dataset is associated with the...

  16. Zebra: A striped network file system

    Science.gov (United States)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity update.
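
    The striping scheme described above can be made concrete with a small sketch: one client's byte stream is cut into fixed-size stripe fragments bound for different servers, and each stripe carries an XOR parity fragment from which any single lost fragment can be rebuilt. Fragment size and server count below are toy assumptions, not Zebra's actual parameters.

        # Minimal sketch of Zebra-style striping: a client's write stream (not
        # individual files) is cut into fixed-size stripe fragments across data
        # servers, plus one XOR parity fragment per stripe for reconstruction.
        # Fragment size and server count are toy values, not Zebra's parameters.
        from functools import reduce

        FRAG_SIZE = 4      # bytes per stripe fragment (toy value)
        DATA_SERVERS = 3   # data fragments per stripe, excluding parity

        def xor(fragments):
            """Byte-wise XOR of equal-length fragments."""
            return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*fragments))

        def stripes(stream: bytes):
            """Yield [data fragments..., parity fragment] for each stripe."""
            step = FRAG_SIZE * DATA_SERVERS
            for base in range(0, len(stream), step):
                chunk = stream[base:base + step].ljust(step, b"\0")
                frags = [chunk[i:i + FRAG_SIZE] for i in range(0, step, FRAG_SIZE)]
                yield frags + [xor(frags)]

        # A failed server's fragment is rebuilt by XOR-ing the survivors:
        for frags in stripes(b"one client's write stream, spanning files"):
            survivors = [f for i, f in enumerate(frags) if i != 1]  # server 1 fails
            assert xor(survivors) == frags[1]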

  17. ACONC Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — ACONC files containing simulated ozone and PM2.5 fields that were used to create the model difference plots shown in the journal article. This dataset is associated...

  18. 831 Files

    Data.gov (United States)

    Social Security Administration — SSA-831 file is a collection of initial and reconsideration adjudicative level DDS disability determinations. (A few hearing level cases are also present, but the...

  19. File System Virtual Appliances

    Science.gov (United States)

    2010-05-01

    4 KB of data is read or written, data is copied back and forth using trampoline buffers — pages that are shared during proxy initialization.

  20. TIGER/Line Shapefile, 2010, Series Information File for the 2010 Census Block State-based Shapefile with Housing and Population Data

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The TIGER/Line Files are shapefiles and related database files (.dbf) that are an extract of selected geographic and cartographic information from the U.S. Census...

  1. Storage of sparse files using parallel log-structured file system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-11-07

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
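
    The index scheme in this record reduces to a compact rule: append each written extent at the log's end and record (logical offset, physical offset, length), so holes are never stored and reappear as zeros on read. The sketch below is an illustrative assumption using an in-memory log, not the patented parallel log-structured implementation.

        # Minimal sketch: store a sparse file without holes by appending data
        # portions to a log and indexing (logical_offset, physical_offset, length).
        # The in-memory "log" stands in for the parallel log-structured file system.
        log = bytearray()
        index = []  # entries: (logical_offset, physical_offset, length)

        def write(logical_offset: int, data: bytes):
            index.append((logical_offset, len(log), len(data)))
            log.extend(data)  # data lands at the log's end; holes are never stored

        def read(size: int) -> bytes:
            buf = bytearray(size)  # zero-filled: holes reappear on read
            for lof, pof, length in index:
                buf[lof:lof + length] = log[pof:pof + length]
            return bytes(buf)

        write(0, b"head")
        write(4096, b"tail")          # leaves a 4 KB hole in the logical file
        assert len(log) == 8          # only 8 bytes actually stored
        assert read(4100)[4096:] == b"tail"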

  2. panMetaDocs, eSciDoc, and DOIDB - an infrastructure for the curation and publication of file-based datasets for 'GFZ Data Services'

    Science.gov (United States)

    Ulbricht, Damian; Elger, Kirsten; Bertelmann, Roland; Klump, Jens

    2016-04-01

    With the foundation of DataCite in 2009 and the technical infrastructure installed in the last six years it has become very easy to create citable dataset DOIs. Nowadays, dataset DOIs are increasingly accepted and required by journals in reference lists of manuscripts. In addition, DataCite provides usage statistics [1] of assigned DOIs and offers a public search API to make research data count. By linking related information to the data, they become more useful for future generations of scientists. For this purpose, several identifier systems, as ISBN for books, ISSN for journals, DOI for articles or related data, Orcid for authors, and IGSN for physical samples can be attached to DOIs using the DataCite metadata schema [2]. While these are good preconditions to publish data, free and open solutions that help with the curation of data, the publication of research data, and the assignment of DOIs in one software seem to be rare. At GFZ Potsdam we built a modular software stack that is made of several free and open software solutions and we established 'GFZ Data Services'. 'GFZ Data Services' provides storage, a metadata editor for publication and a facility to moderate minted DOIs. All software solutions are connected through web APIs, which makes it possible to reuse and integrate established software. Core component of 'GFZ Data Services' is an eSciDoc [3] middleware that is used as central storage, and has been designed along the OAIS reference model for digital preservation. Thus, data are stored in self-contained packages that are made of binary file-based data and XML-based metadata. The eSciDoc infrastructure provides access control to data and it is able to handle half-open datasets, which is useful in embargo situations when a subset of the research data are released after an adequate period. The data exchange platform panMetaDocs [4] makes use of eSciDoc's REST API to upload file-based data into eSciDoc and uses a metadata editor [5] to annotate the files
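
    The identifier linking mentioned above can be pictured as a single metadata record that ties an ORCID to a creator and related identifiers to the dataset DOI. The sketch below assembles such a record as a Python dictionary; the field names follow the DataCite metadata schema in spirit, but the exact layout, the DOI, and the IGSN are illustrative assumptions.

        # Minimal sketch of a DataCite-style record: one dataset DOI linked to
        # an author ORCID, a related article DOI, and a physical-sample IGSN.
        # Field names follow the DataCite metadata schema in spirit; the exact
        # serialization, DOI, and IGSN values are illustrative assumptions.
        record = {
            "identifier": {"identifier": "10.5880/GFZ.EXAMPLE.2016.001",
                           "identifierType": "DOI"},
            "creators": [{
                "creatorName": "Doe, Jane",
                "nameIdentifier": {"nameIdentifier": "0000-0002-1825-0097",
                                   "nameIdentifierScheme": "ORCID"},
            }],
            "titles": [{"title": "Example file-based dataset"}],
            "publisher": "GFZ Data Services",
            "publicationYear": "2016",
            "relatedIdentifiers": [
                {"relatedIdentifier": "10.1000/example-article",
                 "relatedIdentifierType": "DOI", "relationType": "IsSupplementTo"},
                {"relatedIdentifier": "IEXAMPLE0001",
                 "relatedIdentifierType": "IGSN", "relationType": "References"},
            ],
        }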

  3. Data file of a deep proteome analysis of the prefrontal cortex in aged mice with progranulin deficiency or neuronal overexpression of progranulin.

    Science.gov (United States)

    Heidler, Juliana; Hardt, Stefanie; Wittig, Ilka; Tegeder, Irmgard

    2016-12-01

    Progranulin deficiency is associated with neurodegeneration in humans and in mice. The mechanisms likely involve progranulin-promoted removal of protein waste via autophagy. We performed a deep proteomic screen of the pre-frontal cortex in aged (13-15 months) female progranulin-deficient mice (GRN -/- ) and mice with inducible neuron-specific overexpression of progranulin (SLICK-GRN-OE) versus the respective control mice. Proteins were extracted and analyzed by liquid chromatography/mass spectrometry (LC/MS) on a Thermo Scientific™ Q Exactive Plus equipped with an ultra-high performance liquid chromatography unit and a Nanospray Flex Ion-Source. Full Scan MS-data were acquired using Xcalibur and raw files were analyzed using the proteomics software MaxQuant. The mouse reference proteome set from UniProt (June 2015) was used to identify peptides and proteins. The DiB data file is a reduced MaxQuant output and includes peptide and protein identification, accession numbers, protein and gene names, sequence coverage and label free quantification (LFQ) values of each sample. Differences in protein expression in genotypes are presented in "Progranulin overexpression in sensory neurons attenuates neuropathic pain in mice: Role of autophagy" (C. Altmann, S. Hardt, C. Fischer, J. Heidler, H.Y. Lim, A. Haussler, B. Albuquerque, B. Zimmer, C. Moser, C. Behrends, F. Koentgen, I. Wittig, M.H. Schmidt, A.M. Clement, T. Deller, I. Tegeder, 2016) [1].

  4. Data file of a deep proteome analysis of the prefrontal cortex in aged mice with progranulin deficiency or neuronal overexpression of progranulin

    Directory of Open Access Journals (Sweden)

    Juliana Heidler

    2016-12-01

    Progranulin deficiency is associated with neurodegeneration in humans and in mice. The mechanisms likely involve progranulin-promoted removal of protein waste via autophagy. We performed a deep proteomic screen of the pre-frontal cortex in aged (13–15 months) female progranulin-deficient mice (GRN−/−) and mice with inducible neuron-specific overexpression of progranulin (SLICK-GRN-OE) versus the respective control mice. Proteins were extracted and analyzed by liquid chromatography/mass spectrometry (LC/MS) on a Thermo Scientific™ Q Exactive Plus equipped with an ultra-high performance liquid chromatography unit and a Nanospray Flex Ion-Source. Full Scan MS-data were acquired using Xcalibur and raw files were analyzed using the proteomics software MaxQuant. The mouse reference proteome set from UniProt (June 2015) was used to identify peptides and proteins. The DiB data file is a reduced MaxQuant output and includes peptide and protein identification, accession numbers, protein and gene names, sequence coverage and label free quantification (LFQ) values of each sample. Differences in protein expression in genotypes are presented in "Progranulin overexpression in sensory neurons attenuates neuropathic pain in mice: Role of autophagy" (C. Altmann, S. Hardt, C. Fischer, J. Heidler, H.Y. Lim, A. Haussler, B. Albuquerque, B. Zimmer, C. Moser, C. Behrends, F. Koentgen, I. Wittig, M.H. Schmidt, A.M. Clement, T. Deller, I. Tegeder, 2016) [1].

  5. Detection of myocardial 123I-BMIPP distribution abnormality in patients with ischemic heart disease based on normal data file in Bull's-eye polar map

    International Nuclear Information System (INIS)

    Takahashi, Nobukazu; Ishida, Yoshio; Hirose, Yoshiaki; Kawano, Shigeo; Fukuoka, Syuji; Hayashida, Kohei; Takamiya, Makoto; Nonogi, Hiroshi

    1995-01-01

    Visual interpretation of ¹²³I-BMIPP (BMIPP) myocardial images has difficulties in detecting mild reductions in tracer uptake. We studied the significance of the objective assessment of myocardial BMIPP maldistributions at rest by using a Bull's-eye map and its normal data file for detecting ischemic heart disease. Twenty-nine patients, 15 with prior myocardial infarction and 14 with effort angina, were studied. The initial 15-min BMIPP image was evaluated by visual analysis and by generating the extent Bull's-eye map, which exhibits regions with % uptake reduced below the mean-2SD of 10 normal controls. The sensitivity for determining coronary lesions in non-infarcted myocardial regions with the extent map was superior to that with visual analysis (67% vs. 33%). In the regions supplied by the stenotic coronary artery, those which were visually negative but positive in the map and those which were positive in both had a higher incidence of wall motion abnormalities and severe coronary stenosis than those with normal findings in both. These results suggest that the objective assessment based on the normal data file in a Bull's-eye polar map is clinically important for overcoming the limitations of visual interpretation in ¹²³I-BMIPP imaging. (author)
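
    The extent-map criterion used in this study reduces to a per-sector threshold: a sector is flagged when its % uptake falls below the normal-file mean minus 2 SD. The sketch below illustrates that rule; the sector layout and all values are toy assumptions, not the study's polar-map geometry.

        # Minimal sketch of the mean - 2SD extent-map criterion: a sector of the
        # Bull's-eye polar map is flagged when its % uptake falls below the normal
        # database mean minus two standard deviations. Values are toy assumptions.
        from statistics import mean, stdev

        # % uptake per sector from a normal database (rows = normal subjects).
        normals = [
            [72, 80, 77, 75],
            [70, 78, 74, 76],
            [74, 82, 76, 74],
        ]
        patient = [69, 79, 60, 73]  # one patient's % uptake per sector

        thresholds = [mean(col) - 2 * stdev(col) for col in zip(*normals)]
        extent = [p < t for p, t in zip(patient, thresholds)]
        print(extent)  # sectors flagged as reduced uptake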

  6. Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011). User's Manual for the ECLS-K:2011 Kindergarten-First Grade Data File and Electronic Codebook, Public Version. NCES 2015-078

    Science.gov (United States)

    Tourangeau, Karen; Nord, Christine; Lê, Thanh; Wallner-Allen, Kathleen; Hagedorn, Mary C.; Leggitt, John; Najarian, Michelle

    2015-01-01

    This manual provides guidance and documentation for users of the longitudinal kindergarten-first grade (K-1) data file of the Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011). It mainly provides information specific to the first-grade rounds of data collection. Data for the ECLS-K:2011 are released in both a…

  7. Group cross-section processing method and common nuclear group cross-section library based on JENDL-3 nuclear data file

    International Nuclear Information System (INIS)

    Hasegawa, Akira

    1991-01-01

    A common group cross-section library has been developed at JAERI. This system is called the 'JSSTDL-295n-104γ (neutron:295 gamma:104) group constants library system', and is composed of a common 295n-104γ group cross-section library based on the JENDL-3 nuclear data file and its utility codes. The system is applicable to fast and fusion reactors. In this paper, the outline of the group cross-section processing adopted in the PROF-GROUCH-G/B system, a step common to all group cross-section library generation, is first described in detail. Next, the available group cross-section libraries developed in Japan based on JENDL-3 are briefly reviewed. Lastly, the newly developed JSSTDL library system is presented, with special attention to the JENDL-3 data. (author)

  8. Image Steganography In Securing Sound File Using Arithmetic Coding Algorithm, Triple Data Encryption Standard (3DES) and Modified Least Significant Bit (MLSB)

    Science.gov (United States)

    Nasution, A. B.; Efendi, S.; Suwilo, S.

    2018-04-01

    Inserting audio samples at 8 bits per sample with the LSB algorithm affects the PSNR value, degrading the image quality of the cover after insertion (fidelity). In this research, audio samples are therefore embedded using 5 bits with the MLSB algorithm to reduce the amount of inserted data; beforehand, the audio samples are compressed with the Arithmetic Coding algorithm to reduce file size. The audio samples are also encrypted with the Triple DES algorithm for better security. The result of this research is a PSNR value above 50 dB, so it can be concluded that the image quality is still good, since the PSNR exceeds the 40 dB threshold for acceptable fidelity.
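
    The embedding step can be illustrated with plain LSB substitution, where each payload bit overwrites the least significant bit of one cover pixel. The sketch below shows that generic idea only; it is not the paper's modified (MLSB) variant, and the cover data and payload are toy assumptions.

        # Minimal sketch of LSB substitution: hide each payload bit in the least
        # significant bit of one cover pixel. This shows the generic LSB idea, not
        # the paper's modified (MLSB) variant; shapes and values are toy data.
        def embed(cover: list[int], payload: bytes) -> list[int]:
            bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
            assert len(bits) <= len(cover), "payload too large for cover"
            stego = cover[:]
            for i, bit in enumerate(bits):
                stego[i] = (stego[i] & ~1) | bit  # overwrite the pixel's LSB
            return stego

        def extract(stego: list[int], n_bytes: int) -> bytes:
            bits = [p & 1 for p in stego[:8 * n_bytes]]
            return bytes(
                sum(b << (7 - i) for i, b in enumerate(bits[k:k + 8]))
                for k in range(0, len(bits), 8)
            )

        cover = list(range(64, 80)) * 4          # 64 "pixels" of toy grayscale data
        stego = embed(cover, b"hi")              # 2 bytes = 16 LSBs changed at most
        assert extract(stego, 2) == b"hi"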

  9. Important comments on KERMA factors and DPA cross-section data in ACE files of JENDL-4.0, JEFF-3.2 and ENDF/B-VII.1

    Science.gov (United States)

    Konno, Chikara; Tada, Kenichi; Kwon, Saerom; Ohta, Masayuki; Sato, Satoshi

    2017-09-01

    We have studied the reasons for the differences in KERMA factors and DPA cross-section data among nuclear data libraries. Here the KERMA factors and DPA cross-section data included in the official ACE files of JENDL-4.0, ENDF/B-VII.1 and JEFF-3.2 are examined in more detail. As a result, it is newly found that the KERMA factors and DPA cross-section data of many nuclei differ among JENDL-4.0, ENDF/B-VII.1 and JEFF-3.2, for the following reasons: 1) large secondary-particle production yields, 2) no secondary gamma data, 3) secondary gamma data in Files 12-15 with MT=3, 4) MT=103-107 data without MT=600s-800s data in File 6. Issue 1) is considered to be due to the nuclear data, while issues 2)-4) seem to be due to NJOY. The ACE files of JENDL-4.0, ENDF/B-VII.1 and JEFF-3.2 with these problems should be revised after the erroneous nuclear data and the NJOY problems are corrected.

  10. Building a parallel file system simulator

    International Nuclear Information System (INIS)

    Molina-Estolano, E; Maltzahn, C; Brandt, S A; Bent, J

    2009-01-01

    Parallel file systems are gaining in popularity in high-end computing centers as well as commercial data centers. High-end computing systems are expected to scale exponentially and to pose new challenges to their storage scalability in terms of cost and power. To address these challenges scientists and file system designers will need a thorough understanding of the design space of parallel file systems. Yet there exist few systematic studies of parallel file system behavior at petabyte and exabyte scales. An important reason is the significant cost of getting access to large-scale hardware to test parallel file systems. To contribute to this understanding we are building a parallel file system simulator that can simulate parallel file systems at very large scale. Our goal is to simulate petabyte-scale parallel file systems on a small cluster or even a single machine in reasonable time and fidelity. With this simulator, file system experts will be able to tune existing file systems for specific workloads, scientists and file system deployment engineers will be able to better communicate workload requirements, file system designers and researchers will be able to try out design alternatives and innovations at scale, and instructors will be able to study very large-scale parallel file system behavior in the classroom. In this paper we describe our approach and provide preliminary results that are encouraging both in terms of fidelity and simulation scalability.

  11. Linking road accident data to other files : an integrated road accident recordkeeping system. Contribution in Proceedings of Seminar P 'Road Safety' held at the 14th PTHC Summer Annual Meeting, University of Sussex, England, from 14-17 July 1986. Volume P 284, p. 55-86.

    OpenAIRE

    Harris, S.

    1986-01-01

    The road accident data which the police collect is of great value to road safety research and is used extensively. This data increases greatly in value if it can be linked to other files which contain more detailed information on exposure. Linking road accident data to other files results in what we call an Integrated Road Accident Recordkeeping System, in which the combined value of the linked files is greater than the sum of their individual values.

  12. Documentation for the NCES Common Core of Data National Public Education Financial Survey (NPEFS), School Year 2008-09 (Fiscal Year 2009). Revised File Version 1b. NCES 2011-330rev

    Science.gov (United States)

    Cornman, Stephen Q.; Zhou, Lei; Nakamoto, Nanae

    2012-01-01

    This documentation is for the revised file (Version 1b) of the National Center for Education Statistics' (NCES) Common Core of Data (CCD) National Public Education Financial Survey (NPEFS) for school year 2008-2009, fiscal year 2009 (FY 09). It contains a brief description of the data collection along with information required to understand and…

  13. A Metadata-Rich File System

    Energy Technology Data Exchange (ETDEWEB)

    Ames, S; Gokhale, M B; Maltzahn, C

    2009-01-07

    Despite continual improvements in the performance and reliability of large scale file systems, the management of file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, metadata, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS includes Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
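
    The graph data model described above can be sketched in a few lines: files carry attributes and typed relationships to other files, and queries traverse those relationships. The structures and the traversal below are illustrative assumptions, not QFS internals or Quasar syntax.

        # Minimal sketch of a metadata-rich file model: files carry attributes and
        # typed links to other files, and queries traverse the graph. This mirrors
        # the QFS idea of first-class relationships; it is not Quasar syntax.
        files = {
            "raw.dat":    {"experiment": "run42", "links": []},
            "calib.cfg":  {"experiment": "run42", "links": []},
            "result.out": {"experiment": "run42",
                           "links": [("derived-from", "raw.dat"),
                                     ("configured-by", "calib.cfg")]},
        }

        def provenance(name: str, relation: str = "derived-from"):
            """Follow `relation` edges back to the sources of a file."""
            for rel, target in files[name]["links"]:
                if rel == relation:
                    yield target
                    yield from provenance(target, relation)

        print(list(provenance("result.out")))  # ['raw.dat']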

  14. Parcels and Land Ownership, This data set consists of digital map files containing parcel-level cadastral information obtained from property descriptions. Cadastral features contained in the data set include real property boundary lines, rights-of-way boundaries, property dimensions, Published in Not Provided, 1:2400 (1in=200ft) scale, Racine County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Parcels and Land Ownership dataset current as of unknown. This data set consists of digital map files containing parcel-level cadastral information obtained from...

  15. Water Pumping Stations, File name = UTILITIES - PARTIAL Data is incomplete. Contains electric trans lines, electric substations, sewer plants, sewer pumpstations, water plants, water tanks http://www.harfordcountymd.gov/gis/Index.cfm, Published in 2011, 1:1200 (1in=100ft) scale, Harford County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Water Pumping Stations dataset current as of 2011. File name = UTILITIES - PARTIAL Data is incomplete. Contains electric trans lines, electric substations, sewer...

  16. Sewerage Pumping Stations, File name = UTILITIES - PARTIAL Data is incomplete. Contains electric trans lines, electric substations, sewer plants, sewer pumpstations, water plants, water tanks http://www.harfordcountymd.gov/gis/Index.cfm, Published in 2011, 1:600 (1in=50ft) scale, Harford County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Sewerage Pumping Stations dataset current as of 2011. File name = UTILITIES - PARTIAL Data is incomplete. Contains electric trans lines, electric substations, sewer...

  17. Parallel log structured file system collective buffering to achieve a compact representation of scientific and/or dimensional data

    Science.gov (United States)

    Grider, Gary A.; Poole, Stephen W.

    2015-09-01

    Collective buffering and data pattern solutions are provided for storage, retrieval, and/or analysis of data in a collective parallel processing environment. For example, a method can be provided for data storage in a collective parallel processing environment. The method comprises receiving data to be written for a plurality of collective processes within a collective parallel processing environment, extracting a data pattern for the data to be written for the plurality of collective processes, generating a representation describing the data pattern, and saving the data and the representation.
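
    The pattern-extraction step described above can be illustrated with a strided-write case: when every process writes a fixed-size record at a regular offset, the whole set of writes collapses into a single (start, stride, count, length) descriptor. The sketch below is an illustrative assumption, not the patented implementation.

        # Minimal sketch of write-pattern extraction: a list of (offset, length)
        # writes from collective processes collapses into a single strided
        # descriptor (start, stride, count, length) when the pattern is regular.
        def extract_pattern(writes):
            writes = sorted(writes)
            start, length = writes[0]
            if any(l != length for _, l in writes):
                return None  # irregular lengths: no compact pattern
            gaps = {b - a for (a, _), (b, _) in zip(writes, writes[1:])}
            if len(gaps) != 1:
                return None  # irregular spacing: no compact pattern
            return {"start": start, "stride": gaps.pop(),
                    "count": len(writes), "length": length}

        # Four processes each write a 1 MB record at rank * 4 MB:
        writes = [(rank * 4 * 2**20, 2**20) for rank in range(4)]
        print(extract_pattern(writes))
        # {'start': 0, 'stride': 4194304, 'count': 4, 'length': 1048576}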

  18. Ground-Based Global Navigation Satellite System Data (30-second sampling, 1 hour files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Navigation Satellite System (GNSS) daily 30-second sampled data available from the Crustal Dynamics Data Information System (CDDIS). Global Navigation...

  19. CINDA 99, supplement 2 to CINDA 97 (1988-1999). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1999-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it also includes index references to computer libraries of numerical neutron data available from four regional neutron data centres. The present issue, CINDA 99, is the second supplement to CINDA 97, the index to the literature on neutron data published after 1987. It supersedes the first supplement, CINDA 98. The complete CINDA file as of 1 June 1999 is contained in: the archival issue CINDA-A (5 volumes, 1990), CINDA 97 and the current issue CINDA 99. The compilation and publication of CINDA are the result of worldwide co-operation involving the following four data centres, each responsible for compiling the CINDA entries from the literature published in a defined geographical area (given in brackets below): the USA National Nuclear Data Center at the Brookhaven National Laboratory, USA (United States of America and Canada); the Russian Nuclear Data Centre at the Fiziko-Energeticheskij Institut, Obninsk, Russian Federation (former USSR countries); the NEA Data Bank in Paris, France (European OECD member countries in Western Europe and Japan); and the IAEA Nuclear Data Section in Vienna, Austria (all other countries in Eastern Europe, Asia, Australia, Africa, Central and South America; also IAEA publications and translation journals). Besides the published CINDA books, up-to-date computer retrievals of specified CINDA information are available on request from the responsible CINDA centres, or via direct access to the on-line services as described in this publication.

  20. Australian comments on data catalogues

    Energy Technology Data Exchange (ETDEWEB)

    Symonds, J L [A.A.E.C. Research Establishment, Lucas Heights (Australia)

    1968-05-01

    Between the initial need for neutron data and the production of a final evaluated data set, the need for an action file, a bibliographic and reference catalogue file, and a data storage and retrieval file is discussed.

  1. Linking road accident data to other files : an integrated road accident recordkeeping system. Contribution in Proceedings of Seminar P 'Road Safety' held at the 14th PTHC Summer Annual Meeting, University of Sussex, England, from 14-17 July 1986. Volume P 284, p. 55-86.

    NARCIS (Netherlands)

    Harris, S.

    1986-01-01

    The road accident data which the police collect is of great value to road safety research and is used extensively. This data increases greatly in value if it can be linked to other files which contain more detailed information on exposure. Linking road accident data to other files results in what we

  2. WTSETUP: Software for Creating and Editing Configuration Files in the Low Speed Wind Tunnel Data Acquisition System

    National Research Council Canada - National Science Library

    Edwards, Craig

    1999-01-01

    The Data Acquisition System in the Low Speed Wind Tunnel at the Aeronautical and Maritime Research Laboratory is responsible for the measurement, recording, processing and displaying of wind tunnel test data...

  3. Ground-Based Global Navigation Satellite System Data (30-second sampling, 24 hour files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — GNSS provide autonomous geo-spatial positioning with global coverage. GNSS data sets from ground receivers at the CDDIS consist primarily of the data from the U.S....

  4. Virtual file system for PSDS

    Science.gov (United States)

    Runnels, Tyson D.

    1993-01-01

    This is a case study. It deals with the use of a 'virtual file system' (VFS) for Boeing's UNIX-based Product Standards Data System (PSDS). One of the objectives of PSDS is to store digital standards documents. The file-storage requirements are that the files must be rapidly accessible, stored for long periods of time - as though they were paper, protected from disaster, and accumulative to about 80 billion characters (80 gigabytes). This volume of data will be approached in the first two years of the project's operation. The approach chosen is to install a hierarchical file migration system using optical disk cartridges. Files are migrated from high-performance media to lower-performance optical media based on a least-frequently-used algorithm. The optical media are less expensive per character stored and are removable. Vital statistics about the removable optical disk cartridges are maintained in a database. The assembly of hardware and software acts as a single virtual file system transparent to the PSDS user. The files are copied to 'backup-and-recover' media whose vital statistics are also stored in the database. Seventeen months into operation, PSDS is storing 49 gigabytes. A number of operational and performance problems were overcome. Costs are under control. New and/or alternative uses for the VFS are being considered.
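    The least-frequently-used selection step can be sketched briefly; the code below is illustrative only, with hypothetical field names, and is not Boeing's implementation:

      # Sketch: pick the coldest files to migrate to optical media until
      # enough high-performance disk space is reclaimed (illustrative).
      def pick_migration_candidates(files, bytes_needed):
          """files: list of dicts with 'name', 'size', 'access_count'."""
          freed, candidates = 0, []
          for f in sorted(files, key=lambda f: f["access_count"]):
              if freed >= bytes_needed:
                  break
              candidates.append(f["name"])
              freed += f["size"]
          return candidates

      print(pick_migration_candidates(
          [{"name": "a.doc", "size": 5, "access_count": 9},
           {"name": "b.doc", "size": 7, "access_count": 1}], 6))  # -> ['b.doc']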

  5. 40 CFR 152.95 - Citation of all studies in the Agency's files pertinent to a specific data requirement.

    Science.gov (United States)

    2010-07-01

    ... PROCEDURES Procedures To Ensure Protection of Data Submitters' Rights § 152.95 Citation of all studies in the... requirement. The applicant who selects this cite-all option must submit to the Agency: (a) A general offer to... may be limited to apply only to data pertinent to the specific data requirement(s) for which the cite...

  6. The Global File System

    Science.gov (United States)

    Soltis, Steven R.; Ruwart, Thomas M.; OKeefe, Matthew T.

    1996-01-01

    The global file system (GFS) is a prototype design for a distributed file system in which cluster nodes physically share storage devices connected via a network such as Fibre Channel. Networks and network-attached storage devices have advanced to a level of performance and extensibility so that the previous disadvantages of shared disk architectures are no longer valid. This shared storage architecture attempts to exploit the sophistication of storage device technologies, whereas a server architecture diminishes a device's role to that of a simple component. GFS distributes the file system responsibilities across processing nodes, storage across the devices, and file system resources across the entire storage pool. GFS caches data on the storage devices instead of the main memories of the machines. Consistency is established by using a locking mechanism maintained by the storage devices to facilitate atomic read-modify-write operations. The locking mechanism is being prototyped in the Silicon Graphics IRIX operating system and is accessed using standard Unix commands and modules.
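    The locking idea admits a compact sketch. In GFS the lock is maintained by the storage device itself; the in-process lock below merely stands in for that device-side primitive, and all names are illustrative:

      # Sketch: device-mediated locking for atomic read-modify-write.
      import threading

      class SharedDevice:
          def __init__(self):
              self.lock = threading.Lock()  # stand-in for the device lock
              self.blocks = {}              # block_id -> bytes

          def read_modify_write(self, block_id, update):
              with self.lock:               # acquire the "device" lock
                  data = self.blocks.get(block_id, b"")
                  self.blocks[block_id] = update(data)  # modify, write back

      dev = SharedDevice()
      dev.read_modify_write(0, lambda old: old + b"entry\n")
      print(dev.blocks[0])  # -> b'entry\n'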

  7. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-01-06

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
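    The mapping step can be pictured with a toy sketch; the IFC class names, field names, and transformation below are hypothetical, since the patent publishes no code:

      # Sketch: map a Model View Definition entry to a translation and
      # transformation function (all names hypothetical).
      def translate_wall(ifc_obj):
          # convert to the geometric values a target simulator might expect
          return {"type": "wall", "area_m2": ifc_obj["w"] * ifc_obj["h"]}

      MVD_MAP = {"IfcWall": translate_wall}  # one entry per mapped IFC class

      def translate(ifc_objects):
          return [MVD_MAP[o["class"]](o)
                  for o in ifc_objects if o["class"] in MVD_MAP]

      print(translate([{"class": "IfcWall", "w": 4.0, "h": 2.5}]))
      # -> [{'type': 'wall', 'area_m2': 10.0}]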

  8. Census Data

    Data.gov (United States)

    Department of Housing and Urban Development — The Bureau of the Census has released Census 2000 Summary File 1 (SF1) 100-Percent data. The file includes the following population items: sex, age, race, Hispanic...

  9. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    A computer forensic analyst is a person in charge of investigation and evidence tracking. In certain cases, a file needed to be presented as digital evidence was deleted. It is difficult to reconstruct such a file, because it has often lost its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence (LCS) method that consists of three steps, namely training, testing and validation, to identify the file type from file fragments. From all testing results we can conclude that our proposed method works well, achieving 92.91% accuracy in identifying the file type of file fragments for three data types.
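    As a rough illustration of the approach (not the authors' implementation), a fragment can be scored against training fragments of known type and assigned the best-scoring type; Python's SequenceMatcher is used here to approximate a common-subsequence length:

      # Sketch: LCS-style scoring of a file fragment against training data.
      from difflib import SequenceMatcher

      def lcs_score(a: bytes, b: bytes) -> int:
          # the sum of matching-block sizes approximates the length of a
          # common subsequence of the two byte strings
          return sum(m.size
                     for m in SequenceMatcher(None, a, b).get_matching_blocks())

      def classify(fragment, training):
          """training: dict mapping file type -> list of byte fragments."""
          return max(training,
                     key=lambda t: max(lcs_score(fragment, s)
                                       for s in training[t]))

      training = {"jpg": [b"\xff\xd8\xff\xe0abc"], "pdf": [b"%PDF-1.4xyz"]}
      print(classify(b"\xff\xd8\xffz", training))  # -> 'jpg'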

  10. Algorithms and file structures for computational geometry

    International Nuclear Information System (INIS)

    Hinrichs, K.; Nievergelt, J.

    1983-01-01

    Algorithms for solving geometric problems and file structures for storing large amounts of geometric data are of increasing importance in computer graphics and computer-aided design. As examples of recent progress in computational geometry, we explain plane-sweep algorithms, which solve various topological and geometric problems efficiently; and we present the grid file, an adaptable, symmetric multi-key file structure that provides efficient access to multi-dimensional data along any space dimension. (orig.)
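    The sweep paradigm can be shown in one dimension: events are processed in sorted order while a set of "active" objects is maintained. The sketch below reports overlapping intervals and is illustrative only, not taken from the paper:

      # Sketch: a plane sweep that reports pairs of overlapping intervals.
      def overlapping_pairs(intervals):
          events = []                    # (coordinate, kind, interval id)
          for i, (lo, hi) in enumerate(intervals):
              events.append((lo, 0, i))  # kind 0: interval opens
              events.append((hi, 1, i))  # kind 1: interval closes
          active, pairs = set(), []
          for _, kind, i in sorted(events):
              if kind == 0:
                  pairs.extend((j, i) for j in active)  # overlaps all open ones
                  active.add(i)
              else:
                  active.discard(i)
          return pairs

      print(overlapping_pairs([(0, 3), (2, 5), (6, 7)]))  # -> [(0, 1)]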

  11. HUD GIS Boundary Files

    Data.gov (United States)

    Department of Housing and Urban Development — The HUD GIS Boundary Files are intended to supplement boundary files available from the U.S. Census Bureau. The files are for community planners interested in...

  12. 77 FR 12367 - Agency Information Collection and Reporting Activities; Electronic Filing of Bank Secrecy Act...

    Science.gov (United States)

    2012-02-29

    ... capability of electronically filing BSA reports through its system called BSA E-Filing. Effective August 2011... Accounts (FBAR) report. BSA E-Filing is a secure, web-based electronic filing system. It is a flexible... filing institutions or individuals, thereby providing a significant improvement in data quality. BSA E...

  13. Streamflow and water-quality data for Meadow Run Basin, Fayette County, Pennsylvania, December 1987-November 1988. Open file report

    International Nuclear Information System (INIS)

    Kostelnik, K.M.; Witt, E.C.

    1989-01-01

    Streamflow and water-quality data were collected throughout the Meadow Run basin, Fayette County, Pennsylvania, from December 7, 1987 through November 15, 1988, to determine the prevailing quality of surface water over a range of hydrologic conditions. The data will assist the Pennsylvania Department of Environmental Resources during its review of coal-mine permit applications. A water-quality station near the mouth of Meadow Run provided a continuous record of stream stage, pH, specific conductance, and water temperature. Monthly water-quality samples collected at the station were analyzed for total and dissolved metals, nutrients, major cations and anions, and suspended-sediment concentrations.

  14. GrabBlur--a framework to facilitate the secure exchange of whole-exome and -genome SNV data using VCF files.

    Science.gov (United States)

    Stade, Björn; Seelow, Dominik; Thomsen, Ingo; Krawczak, Michael; Franke, Andre

    2014-01-01

    Next Generation Sequencing (NGS) of whole exomes or genomes is increasingly being used in human genetic research and diagnostics. Sharing NGS data with third parties can help physicians and researchers to identify causative or predisposing mutations for a specific sample of interest more efficiently. In many cases, however, the exchange of such data may collide with data privacy regulations. GrabBlur is a newly developed tool to aggregate and share NGS-derived single nucleotide variant (SNV) data in a public database, keeping individual samples unidentifiable. In contrast to other currently existing SNV databases, GrabBlur includes phenotypic information and contact details of the submitter of a given database entry. By means of GrabBlur human geneticists can securely and easily share SNV data from resequencing projects. GrabBlur can ease the interpretation of SNV data by offering basic annotations, genotype frequencies and in particular phenotypic information - given that this information was shared - for the SNV of interest. GrabBlur facilitates the combination of phenotypic and NGS data (VCF files) via a local interface or command line operations. Data submissions may include HPO (Human Phenotype Ontology) terms, other trait descriptions, NGS technology information and the identity of the submitter. Most of this information is optional and its provision at the discretion of the submitter. Upon initial intake, GrabBlur merges and aggregates all sample-specific data. If a certain SNV is rare, the sample-specific information is replaced with the submitter identity. Generally, all data in GrabBlur are highly aggregated so that they can be shared with others while ensuring maximum privacy. Thus, it is impossible to reconstruct complete exomes or genomes from the database or to re-identify single individuals. After the individual information has been sufficiently "blurred", the data can be uploaded into a publicly accessible domain where aggregated genotypes are

  15. Direct utilization of information from nuclear data files in Monte Carlo simulation of neutron and photon transport

    International Nuclear Information System (INIS)

    Androseno, P.; Zholudov, D.; Kompaniyets, A.; Smirnova, O.

    2000-01-01

    In order to improve both the economics and the safety of Nuclear Power Plants (NPPs), data and computer codes that perform benchmark calculations while simulating NPP parameters must be utilized. This work is mainly concerned with the application of computer codes using the Monte Carlo method, which provides improved accuracy in the calculated quantities. (authors)

  16. Index files for Belle II - very small skim containers

    Science.gov (United States)

    Sevior, Martin; Bloomfield, Tristan; Kuhr, Thomas; Ueda, I.; Miyake, H.; Hara, T.

    2017-10-01

    The Belle II experiment[1] employs the root file format[2] for recording data and is investigating the use of “index-files” to reduce the size of data skims. These files contain pointers to the location of interesting events within the total Belle II data set and reduce the size of data skims by 2 orders of magnitude. We implement this scheme on the Belle II grid by recording the parent file metadata and the event location within the parent file. While the scheme works, it is substantially slower than a normal sequential read of standard skim files using default root file parameters. We investigate the performance of the scheme by adjusting the “splitLevel” and “autoflushsize” parameters of the root files in the parent data files.
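    The scheme can be sketched with hypothetical field names: an index record stores where an event lives rather than the event itself, trading skim size for random-access reads into the parent files:

      # Sketch: resolving index-file pointers into parent data files.
      index = [
          {"parent_lfn": "/belle2/data/run0042.root", "entry": 1057},
          {"parent_lfn": "/belle2/data/run0042.root", "entry": 20311},
      ]

      def read_skim(index, open_file, read_entry):
          """Open each parent file and fetch the pointed-to entry."""
          for rec in index:
              f = open_file(rec["parent_lfn"])   # random access into parent
              yield read_entry(f, rec["entry"])  # slower than sequential reads

      # toy stand-ins for a real I/O layer, for demonstration only:
      store = {"/belle2/data/run0042.root": {1057: "event-a", 20311: "event-b"}}
      print(list(read_skim(index, store.__getitem__, lambda f, n: f[n])))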

  17. The FOND-2.2 evaluated neutron data library (Russian library of evaluated neutron data files for generating sets of constants in the ABBN constants system)

    International Nuclear Information System (INIS)

    Koshcheev, V.N.; Nikolaev, M.N.; Korchagina, Zh.A.; Savoskina, G.V.

    2001-01-01

    A short description is given of the Russian evaluated neutron data library FOND-2.2. The main purpose of FOND-2.2 is to provide sets of constants for the ABBN constants system. A history of its compilation and the sources of the neutron data are given. The contents of FOND-2.2 are presented with brief comments. (author)

  18. Detection Of Alterations In Audio Files Using Spectrograph Analysis

    Directory of Open Access Journals (Sweden)

    Anandha Krishnan G

    2015-08-01

    Full Text Available The study was carried out to detect changes in audio files using spectrograph analysis. An audio file format is a file format for storing digital audio data on a computer system. A sound spectrograph is a laboratory instrument that displays a graphical representation of the strengths of the various component frequencies of a sound as time passes. The objectives of the study were to find the changes in the spectrograph of audio files after altering them, to compare those changes with the spectrographs of the original files, and to check for similarities and differences between MP3 and WAV formats. Five different alterations were carried out on each audio file to analyze the differences between the original and the altered file. To alter an MP3 or WAV audio file by cut/copy, the file was opened in Audacity and a different audio segment was pasted into it. This new file was analyzed to view the differences. The noise was reduced by adjusting the necessary parameters, and the differences between the new file and the original file were analyzed. After the necessary changes were made through the dialog box, the edited audio file was opened in the software named Spek, which produces a graph of that particular file; the graph was saved for further analysis. The graph of the original audio was combined with the graph of the edited audio file to see the alterations.

  19. Use of the PIXEL method to investigate gas adsorption in metal–organic frameworks

    Science.gov (United States)

    Maloney, Andrew G. P.; Wood, Peter A.

    2016-01-01

    PIXEL has been used to perform calculations of adsorbate-adsorbent interaction energies between a range of metal–organic frameworks (MOFs) and simple guest molecules. Interactions have been calculated for adsorption between MOF-5 and Ar, H2, and N2; Zn2(BDC)2(TED) (BDC = 1,4-benzenedicarboxylic acid, TED = triethylenediamine) and H2; and HKUST-1 and CO2. The locations of the adsorption sites and the calculated energies, which show differences in the Coulombic or dispersion characteristic of the interaction, compare favourably to experimental data and literature energy values calculated using density functional theory. PMID:28496380

  20. Status and evaluation methods of JENDL fusion file and JENDL PKA/KERMA file

    International Nuclear Information System (INIS)

    Chiba, S.; Fukahori, T.; Shibata, K.; Yu Baosheng; Kosako, K.

    1997-01-01

    The status of evaluated nuclear data in the JENDL fusion file and PKA/KERMA file is presented. The JENDL fusion file was prepared in order to improve the quality of the JENDL-3.1 data especially on the double-differential cross sections (DDXs) of secondary neutrons and gamma-ray production cross sections, and to provide DDXs of secondary charged particles (p, d, t, 3He and α-particle) for the calculation of PKA and KERMA factors. The JENDL fusion file contains evaluated data of 26 elements ranging from Li to Bi. The data in JENDL fusion file reproduce the measured data on neutron and charged-particle DDXs and also on gamma-ray production cross sections. Recoil spectra in PKA/KERMA file were calculated from secondary neutron and charged-particle DDXs contained in the fusion file with two-body reaction kinematics. The data in the JENDL fusion file and PKA/KERMA file were compiled in ENDF-6 format with an MF=6 option to store the DDX data. (orig.)

  1. Program LINEAR (version 79-1): linearize data in the evaluated nuclear data file/version B (ENDF/B) format

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1979-01-01

    Program LINEAR converts evaluated cross sections in the ENDF/B format into a tabular form that is subject to linear-linear interpolation in energy and cross section. The code also thins tables of cross sections already in that form (i.e., removes points not needed for linear interpolability). The main advantage of the code is that it allows subsequent codes to consider only linear-linear data. A listing of the source deck is available on request
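    The thinning idea can be illustrated with a greedy sketch (an illustration of the concept, not the LINEAR algorithm itself): a tabulated point is dropped when linear interpolation between its retained neighbours reproduces it within tolerance:

      # Sketch: thin a linearly interpolable (energy, cross section) table.
      def thin(energies, xs, tol=1e-3):
          keep = [0]
          for i in range(1, len(energies) - 1):
              e0, y0 = energies[keep[-1]], xs[keep[-1]]
              e1, y1 = energies[i + 1], xs[i + 1]
              interp = y0 + (y1 - y0) * (energies[i] - e0) / (e1 - e0)
              if abs(interp - xs[i]) > tol * abs(xs[i]):
                  keep.append(i)         # point carries real information
          keep.append(len(energies) - 1)
          return [energies[i] for i in keep], [xs[i] for i in keep]

      # the middle point lies on the line between its neighbours, so it goes:
      print(thin([1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 10.0]))
      # -> ([1.0, 3.0, 4.0], [1.0, 3.0, 10.0])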

  2. High School and Beyond: Twins and Siblings' File Users' Manual, User's Manual for Teacher Comment File, Friends File Users' Manual.

    Science.gov (United States)

    National Center for Education Statistics (ED), Washington, DC.

    These three users' manuals are for specific files of the High School and Beyond Study, a national longitudinal study of high school sophomores and seniors in 1980. The three files are computerized databases that are available on magnetic tape. As one component of base year data collection, information identifying twins, triplets, and some non-twin…

  3. The rice growth image files - The Rice Growth Monitoring for The Phenotypic Functional Analysis | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us The Rice Growth Monitoring for The Phenotypic Functional Analysis The rice growth image files Data detail Data name: The rice growth image files DOI: 10.18908/lsdba.nbdc00945-004 Description of data contents: The rice growth image files categorized based on file size. Data file File name: image files (directory) File URL: ftp://ftp.biosciencedbc.jp/archive/agritogo-rice-phenome/LATEST/image... Site Policy | Contact Us The rice growth image files - The Rice Growth Monitoring for The Phenotypic Functional Analysis | LSDB Archive ...

  4. Malpractice by physical therapists: descriptive analysis of reports in the National Practitioner Data Bank public use data file, 1991-2004.

    Science.gov (United States)

    Sandstrom, Robert

    2007-01-01

    As physical therapists increase autonomous practice, medical error becomes more important to public safety and public perceptions of the profession. The purpose of this study was to describe malpractice by physical therapists in the United States based on physical therapist malpractice reports in the National Practitioner Data Bank between January 1, 1991, and December 31, 2004. A frequency analysis of data related to physical therapist malpractice reports was performed. The relationship between size of malpractice payment and public policy related to access to physical therapist services and malpractice experience was explored. A total of 664 malpractice reports were found in the study period (mean, 47.73 events annually). California had 114 malpractice events, while Maine and Wyoming had none. The median payment amount for physical therapist malpractice was $10,000 to $15,000. "Treatment-related" events and events related to "improper technique" were the most common reasons for a malpractice report. The incidence of malpractice by physical therapists is low (estimated at 2.5 events/10,000 working therapists/year) and the average malpractice payment is small; the findings are discussed in relation to public policy on direct patient access to physical therapy services.

  5. Clockwise: A Mixed-Media File System

    NARCIS (Netherlands)

    Bosch, H.G.P.; Jansen, P.G.; Mullender, Sape J.

    This (short) paper presents the Clockwise, a mixed-media file system. The primary goal of the Clockwise is to provide a storage architecture that supports the storage and retrieval of best-effort and real-time file system data. Clockwise provides an abstraction called a dynamic partition that groups

  6. The DNA Files

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-09

    The DNA Files is a radio documentary which disseminates genetics information over public radio. The documentaries explore subjects which include the following: How genetics affects society. How human life began and how it evolved. Could new prenatal genetic tests hold the key to disease prevention later in life? Would a national genetic data base sacrifice individual privacy? and Should genes that may lead to the cure for cancer be privately owned? This report serves as a project update for the second quarter of 1998. It includes the spring/summer 1998 newsletter, the winter 1998 newsletter, the program clock, and the latest flyer.

  7. 12 CFR Appendix F to Part 360 - Customer File Structure

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Customer File Structure F Appendix F to Part... POLICY RESOLUTION AND RECEIVERSHIP RULES Pt. 360, App. F Appendix F to Part 360—Customer File Structure This is the structure of the data file to provide to the FDIC information related to each customer who...

  8. UPIN Group File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Group Unique Physician Identifier Number (UPIN) File is the business entity file that contains the group practice UPIN and descriptive information. It does NOT...

  9. Status of thorium cycle nuclear data evaluations: Comparison of cross-section line shapes of JENDL-3 and ENDF-B-VI files for 230Th, 232Th, 231Pa, 233Pa, 232U, 233U and 234U

    International Nuclear Information System (INIS)

    Ganesan, S.; McLaughlin, P.K.

    1992-02-01

    Since 1990, one of the most interesting developments in the field of nuclear data for nuclear technology applications is that several new evaluated data files have been finalized and made available to the International Atomic Energy Agency (IAEA) for distribution to its Member States. Improved evaluated nuclear data libraries such as ENDF/B-VI from the United States and JENDL-3 from Japan were developed over a period of 10-15 years. This report is not an evaluation of the evaluations. The report as presented here gives a first look at the cross section line shapes of the isotopes that are important to the thorium fuel cycle derived from the two recently evaluated data files: JENDL-3 and ENDF/B-VI. The basic evaluated data files JENDL-3 and ENDF/B-VI were point-processed successfully using the codes LINEAR and RECENT. The point data were multigrouped in three different group structures using the GROUPIE code. Graphs of intercomparisons of cross section line shapes of JENDL-3 and ENDF/B-VI are presented in this paper for the following isotopes of major interest to studies of the thorium fuel cycle: 230 Th, 232 Th, 231 Pa, 233 Pa, 232 U, 233 U and 234 U. Comparisons between JENDL-3 and ENDF/B-VI which were performed at the point and group levels show large discrepancies in various cross sections. We conclude this report with a general remark that it is necessary to perform sensitivity studies to assess the impacts of the discrepancies between the two different sets of data on calculated reactor design and safety parameters of specific reactor systems and, based on the results of such sensitivity studies, to undertake new tasks of evaluations. (author). 2 refs, 245 figs, 8 tabs

  10. PCF File Format.

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. A PCF file can contain multiple spectra and information about each spectrum, such as energy calibration. This document outlines the file format in enough detail to allow one to write a computer program that parses and writes such files.

  11. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    The compilation of business files is, for an enterprise, a distillation and re-creation of its intellectual wealth, from which applicable information can be made available quickly, extensively and precisely to those who need it. Proceeding from the effects of business file compilation on scientific research, production and development, this paper discusses in five points how to define topics, analyze historical materials, search and select data, and process them into an enterprise archives collection. Firstly, it expounds the importance and necessity of business file compilation in the production, operation and development of a company; secondly, it presents processing methods from topic definition, material searching and data selection to final examination and correction; thirdly, it defines principles and classifications so that different categories and levels of processing methods are available for business file compilation; fourthly, it discusses the specific method of implementing a file compilation through a documentation collection, upon the principle of gearing topic definition to demand; fifthly, it addresses the application of information technology to business file compilation, in view of the wide need for business files, so as to raise the level of enterprise archives management. The present discussion focuses on the examination and correction principles of enterprise historical material compilation and on the basic classifications as well as the major forms of business file compilation achievements. (author)

  12. File Level Provenance Tracking in CMS

    CERN Document Server

    Jones, C D; Paterno, M; Sexton-Kennedy, L; Tanenbaum, W; Riley, D S

    2009-01-01

    The CMS off-line framework stores provenance information within CMS's standard ROOT event data files. The provenance information is used to track how each data product was constructed, including what other data products were read to do the construction. We will present how the framework gathers the provenance information, the efforts necessary to minimise the space used to store the provenance in the file and the tools that will be available to use the provenance.

  13. Beyond a Terabyte File System

    Science.gov (United States)

    Powers, Alan K.

    1994-01-01

    The Numerical Aerodynamics Simulation Facility's (NAS) CRAY C916/1024 accesses a "virtual" on-line file system, which is expanding beyond a terabyte of information. This paper presents some options for fine-tuning the Data Migration Facility (DMF) to stretch the on-line disk capacity and explores the transition to newer devices (STK 4490, ER90, RAID).

  14. Remote file inquiry (RFI) system

    Science.gov (United States)

    1975-01-01

    System interrogates and maintains user-definable data files from remote terminals, using an English-like, free-form query language easily learned by persons not proficient in computer programming. System operates in asynchronous mode, allowing any number of inquiries, within the limitation of available core, to be active concurrently.

  15. Nuclear Structure References (NSR) file

    International Nuclear Information System (INIS)

    Ewbank, W.B.

    1978-08-01

    The use of the Nuclear Structure References file by the Nuclear Data Project at ORNL is described. Much of the report concerns format information of interest only to those preparing input to the system or otherwise needing detailed knowledge of its internal structure. 17 figures

  16. A File Archival System

    Science.gov (United States)

    Fanselow, J. L.; Vavrus, J. L.

    1984-01-01

    ARCH, file archival system for DEC VAX, provides for easy offline storage and retrieval of arbitrary files on DEC VAX system. System designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of the same programs and associated files.

  17. Text File Comparator

    Science.gov (United States)

    Kotler, R. S.

    1983-01-01

    File Comparator program IFCOMP is a text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts as input two text files and produces a listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.

  18. Protecting your files on the DFS file system

    CERN Multimedia

    Computer Security Team

    2011-01-01

    The Windows Distributed File System (DFS) hosts user directories for all NICE users plus much more data. Files can be accessed from anywhere, via a dedicated web portal (http://cern.ch/dfs). Due to the ease of access to DFS within CERN it is of utmost importance to properly protect access to sensitive data. As the use of DFS access control mechanisms is not obvious to all users, passwords, certificates or sensitive files might get exposed. At least this happened in the past to the Andrew File System (AFS) - the Linux equivalent to DFS - and led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed recently to apply more stringent protections to all DFS user folders. The goal of this data protection policy is to assist users in pro...

  19. Studies of acute and chronic radiation injury at the Biological and Medical Research Division, Argonne National Laboratory, 1953-1970: Description of individual studies, data files, codes, and summaries of significant findings

    Energy Technology Data Exchange (ETDEWEB)

    Grahn, D.; Fox, C.; Wright, B.J.; Carnes, B.A.

    1994-05-01

    Between 1953 and 1970, studies on the long-term effects of external x-ray and γ irradiation on inbred and hybrid mouse stocks were carried out at the Biological and Medical Research Division, Argonne National Laboratory. The results of these studies, plus the mating, litter, and pre-experimental stock records, were routinely coded on IBM cards for statistical analysis and record maintenance. Also retained were the survival data from studies performed in the period 1943-1953 at the National Cancer Institute, National Institutes of Health, Bethesda, Maryland. The card-image data files have been corrected where necessary and refiled on hard disks for long-term storage and ease of accessibility. In this report, the individual studies and data files are described, and pertinent factors regarding caging, husbandry, radiation procedures, choice of animals, and other logistical details are summarized. Some of the findings are also presented. Descriptions of the different mouse stocks and hybrids are included in an appendix; more than three dozen stocks were involved in these studies. Two other appendices detail the data files in their original card-image format and the numerical codes used to describe the animal's exit from an experiment and, for some studies, any associated pathologic findings. Tabular summaries of sample sizes, dose levels, and other variables are also given to assist investigators in their selection of data for analysis. The archive is open to any investigator with legitimate interests and a willingness to collaborate and acknowledge the source of the data and to recognize appropriate conditions or caveats.

  20. Studies of acute and chronic radiation injury at the Biological and Medical Research Division, Argonne National Laboratory, 1953-1970: Description of individual studies, data files, codes, and summaries of significant findings

    International Nuclear Information System (INIS)

    Grahn, D.; Fox, C.; Wright, B.J.; Carnes, B.A.

    1994-05-01

    Between 1953 and 1970, studies on the long-term effects of external x-ray and γ irradiation on inbred and hybrid mouse stocks were carried out at the Biological and Medical Research Division, Argonne National Laboratory. The results of these studies, plus the mating, litter, and pre-experimental stock records, were routinely coded on IBM cards for statistical analysis and record maintenance. Also retained were the survival data from studies performed in the period 1943-1953 at the National Cancer Institute, National Institutes of Health, Bethesda, Maryland. The card-image data files have been corrected where necessary and refiled on hard disks for long-term storage and ease of accessibility. In this report, the individual studies and data files are described, and pertinent factors regarding caging, husbandry, radiation procedures, choice of animals, and other logistical details are summarized. Some of the findings are also presented. Descriptions of the different mouse stocks and hybrids are included in an appendix; more than three dozen stocks were involved in these studies. Two other appendices detail the data files in their original card-image format and the numerical codes used to describe the animal's exit from an experiment and, for some studies, any associated pathologic findings. Tabular summaries of sample sizes, dose levels, and other variables are also given to assist investigators in their selection of data for analysis. The archive is open to any investigator with legitimate interests and a willingness to collaborate and acknowledge the source of the data and to recognize appropriate conditions or caveats

  1. PKA spectrum file

    Energy Technology Data Exchange (ETDEWEB)

    Kawai, M. [Toshiba Corp., Kawasaki, Kanagawa (Japan). Nuclear Engineering Lab.

    1997-03-01

    In the Japanese Nuclear Data Committee, the PKA/KERMA file containing PKA spectra, KERMA factors and DPA cross sections in the energy range between 10⁻⁵ eV and 50 MeV is being prepared from the evaluated nuclear data. The processing code ESPERANT was developed to calculate quantities of PKA, KERMA and DPA from evaluated nuclear data for medium and heavy elements by using the effective single particle emission approximation (ESPEA). For light elements, the PKA spectra are evaluated by the SCINFUL/DDX and EXIFON codes, simultaneously with other neutron cross sections. The DPA cross sections due to charged particles emitted from light elements are evaluated for high neutron energies above 20 MeV. (author)

  2. Program SIGMA1 (version 79-1): Doppler broaden evaluated cross sections in the evaluated nuclear data file/version B (ENDF/B) format

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1979-01-01

    Program SIGMA1 Doppler-broadens evaluated cross sections in the ENDF/B format. The program requires that input cross sections be tabulated as linearly interpolable functions of energy in ENDF/B File 3; broadened cross sections, in this same form, replace the original values in the output tape. This report describes the methods used in the code and serves as a user's guide. A listing of the source deck is available on request
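    SIGMA1 itself evaluates the exact Doppler-broadening integral; as a rough illustration of temperature smearing only, the sketch below smears a tabulated cross section with a fixed-width Gaussian kernel. The kernel, the fixed width, and the names are simplifications, not the SIGMA1 method:

      # Sketch: Gaussian-kernel smearing of a tabulated cross section.
      import math

      def smear(energies, xs, width):
          out = []
          for e in energies:
              w = [math.exp(-((e - ei) / width) ** 2) for ei in energies]
              norm = sum(w)
              out.append(sum(wi * xi for wi, xi in zip(w, xs)) / norm)
          return out

      # a sharp peak is lowered and spread out, as physical broadening does:
      print(smear([1.0, 2.0, 3.0], [0.0, 1.0, 0.0], width=1.0))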

  3. Design and Implementation of a Metadata-rich File System

    Energy Technology Data Exchange (ETDEWEB)

    Ames, S; Gokhale, M B; Maltzahn, C

    2010-01-19

    Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance, comparable query performance on user metadata-intensive operations, and superior performance on normal file metadata operations.
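    The graph data model can be pictured with a toy sketch: files carry attributes, relationships are typed edges, and a query primitive follows edges. The real Quasar language is XPATH-based, so everything below is purely illustrative:

      # Sketch: files, attributes, and typed relationship edges.
      files = {
          "raw001.dat":  {"experiment": "run42", "kind": "raw"},
          "plot001.png": {"experiment": "run42", "kind": "derived"},
      }
      edges = [("plot001.png", "derivedFrom", "raw001.dat")]

      def related(name, rel):
          """Follow 'rel' edges out of file 'name' - one query primitive."""
          return [dst for src, r, dst in edges if src == name and r == rel]

      print(related("plot001.png", "derivedFrom"))  # -> ['raw001.dat']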

  4. Protecting your files on the AFS file system

    CERN Multimedia

    2011-01-01

    The Andrew File System is a world-wide distributed file system linking hundreds of universities and organizations, including CERN. Files can be accessed from anywhere, via dedicated AFS client programs or via web interfaces that export the file contents on the web. Due to the ease of access to AFS it is of utmost importance to properly protect access to sensitive data in AFS. As the use of AFS access control mechanisms is not obvious to all users, passwords, private SSH keys or certificates have been exposed in the past. In one specific instance, this also led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed in April 2010 to apply more stringent folder protections to all AFS user folders. The goal of this data protection policy is to assist users in...

  5. FDIC Summary of Deposits (SOD) Download File

    Data.gov (United States)

    Federal Deposit Insurance Corporation — The FDIC's Summary of Deposits (SOD) download file contains deposit data for branches and offices of all FDIC-insured institutions. The Federal Deposit Insurance...

  6. The International Reactor Dosimetry File (IRDF-85)

    International Nuclear Information System (INIS)

    Cullen, D.E.; McLaughlin, P.K.

    1985-04-01

    This document describes the contents of the second version of the International Reactor Dosimetry File (IRDF-85), distributed by the Nuclear Data Section of the International Atomic Energy Agency. This library superseded IRDF-82. (author)

  7. On-Board File Management and Its Application in Flight Operations

    Science.gov (United States)

    Kuo, N.

    1998-01-01

    In this paper, we present the minimum functions required for an on-board file management system, explore file manipulation processes, and demonstrate how file transfer, along with the file management system, is utilized to support flight operations and data delivery.

  8. Files synchronization from a large number of insertions and deletions

    Science.gov (United States)

    Ellappan, Vijayan; Kumari, Savera

    2017-11-01

    Synchronization between different versions of files is becoming a major issue for most applications. To make applications more efficient, an economical algorithm is developed from the previously used File Loading Algorithm. The algorithm is extended in three ways: first, it deals with non-binary files; second, a backup is generated for uploaded files; and lastly, files are synchronized across insertions and deletions. A user can reconstruct a file from a former version with minimal error. The drawback of the previous system is overcome by synchronization, in which multiple copies of each file or record are created and stored in a backup database, to be efficiently restored in case of unwanted deletion or loss of data. That is, the aim is to introduce a protocol that user B may use to reconstruct file X from file Y with suitably low probability of error. Synchronization algorithms find numerous areas of use, including data storage, file sharing, source code control systems, and cloud applications. For example, cloud storage services such as Dropbox synchronize between local copies and cloud backups each time users make changes to local versions. Similarly, synchronization tools are necessary on mobile devices, and specialized synchronization algorithms are used for video and sound editing. Synchronization tools are also capable of performing data duplication.
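    The reconstruct-one-version-from-another idea can be illustrated with a standard edit script; this sketch uses Python's difflib rather than the paper's protocol:

      # Sketch: compute and apply an insertion/deletion edit script.
      from difflib import SequenceMatcher

      def edit_script(old, new):
          ops = []
          for tag, i1, i2, j1, j2 in SequenceMatcher(None, old, new).get_opcodes():
              if tag != "equal":
                  ops.append((tag, i1, i2, new[j1:j2]))
          return ops

      def apply_script(old, ops):
          out, pos = [], 0
          for tag, i1, i2, repl in ops:
              out.append(old[pos:i1])  # copy the unchanged run
              out.append(repl)         # insert/replace new material
              pos = i2                 # skip deleted material
          out.append(old[pos:])
          return "".join(out)

      old, new = "the quick fox", "the quick brown fox"
      assert apply_script(old, edit_script(old, new)) == new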

  9. The design and development of GRASS file reservation system

    International Nuclear Information System (INIS)

    Huang Qiulan; Zhu Suijiang; Cheng Yaodong; Chen Gang

    2010-01-01

    GFRS (GRASS File Reservation System) is designed to improve the file access performance of GRASS (Grid-enabled Advanced Storage System), a Hierarchical Storage Management (HSM) system developed at the Computing Center, Institute of High Energy Physics. GRASS provides massive storage management and data migration, but its data migration policy is based simply on factors such as pool water level, the intervals between migrations and so on, so it lacks precise control over files. To address this, we designed GFRS to implement user-based file reservation, which reserves and keeps required files on disk for high energy physicists. GFRS can improve file access speed for users by avoiding the migration of frequently accessed files to tape. In this paper we first give a brief introduction to the GRASS system and then the detailed architecture and implementation of GFRS. Experimental results from GFRS have shown good performance, and a simple analysis is made based on them. (authors)

  10. Strategy on review method for JENDL High Energy File

    Energy Technology Data Exchange (ETDEWEB)

    Yamano, Naoki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)

    1998-11-01

    The status of, and problems with, the review method for the High Energy File of the Japanese Evaluated Nuclear Data Library (JENDL-HE File) are described. Measurements of differential and integral data relevant to the review work for the JENDL-HE File have been examined from the viewpoint of data quality and applicability. In order to carry out the work effectively, a strategy for developing a standard review method is discussed, as well as the necessity of tools to be used in the review scheme. (author)

  11. COMPOZ data guide

    International Nuclear Information System (INIS)

    Knight, J.R.

    1984-01-01

    The COMPOZ Data Guide used to create the Standard Composition Library is described. Of particular importance is documentation of the COMPOZ input data file structure. Knowledge of the file structure allows users to edit the data file and subsequently create their own site-specific composition library

  12. RRB / SSI Interface Checkwriting Integrated Computer Operation Extract File (CHICO)

    Data.gov (United States)

    Social Security Administration — This monthly file provides SSA with information about benefit payments made to railroad retirement beneficiaries. SSA uses this data to verify Supplemental Security...

  13. Clinical usefulness of myocardial iodine-123-15-(p-iodophenyl)-3(R,S)-methyl-pentadecanoic acid distribution abnormality in patients with mitochondrial encephalomyopathy based on normal data file in bull's-eye polar map

    International Nuclear Information System (INIS)

    Takahashi, Nobukazu; Mitani, Isao; Sumita, Shinichi

    1998-01-01

    Visual interpretation of iodine-123-beta-15-(p-iodophenyl)-3(R,S)-methyl-pentadecanoic acid (123I-BMIPP) myocardial images cannot easily detect mild reduction in tracer uptake. Objective assessment of myocardial 123I-BMIPP maldistributions at rest was attempted using a bull's-eye map and its normal data file for detecting myocardial damage in patients with mitochondrial encephalomyopathy. Six patients, two with Kearns-Sayre syndrome and four with mitochondrial myopathy, encephalopathy, lactic acidosis, and strokelike episodes (MELAS), and 10 normal subjects were studied. Fractional myocardial uptake of 123I-BMIPP was also measured by dynamic static imaging to assess the global myocardial free fatty acid. These data were compared with the cardiothoracic ratio measured by chest radiography and left ventricular ejection fraction assessed by echocardiography. Abnormal cardiothoracic ratio and lower ejection fraction were detected in only one patient with Kearns-Sayre syndrome. Abnormal fractional myocardial uptake was detected in two patients (1.61%, 1.91%), whereas abnormal regional 123I-BMIPP uptake assessed by the bull's-eye map was detected in five patients (83%). All patients showed abnormal uptake in the anterior portion, and one showed progressive atrioventricular conduction abnormality and systolic dysfunction with extended 123I-BMIPP abnormal uptake. The results suggest that assessment based on the normal data file in a bull's-eye polar map is clinically useful for detection of myocardial damage in patients with mitochondrial encephalomyopathy. (author)

  14. Grammar-Based Specification and Parsing of Binary File Formats

    Directory of Open Access Journals (Sweden)

    William Underwood

    2012-03-01

    Full Text Available The capability to validate and view or play binary file formats, as well as to convert binary file formats to standard or current file formats, is critically important to the preservation of digital data and records. This paper describes the extension of context-free grammars from strings to binary files. Binary files are arrays of data types, such as long and short integers, floating-point numbers and pointers, as well as characters. The concept of an attribute grammar is extended to these context-free array grammars. This attribute grammar has been used to define a number of chunk-based and directory-based binary file formats. A parser generator has been used with some of these grammars to generate syntax checkers (recognizers) for validating binary file formats. Among the potential benefits of an attribute grammar-based approach to specification and parsing of binary file formats is that attribute grammars not only support format validation, but also support generation of error messages during validation of format, validation of semantic constraints, attribute value extraction (characterization), generation of viewers or players for file formats, and conversion to current or standard file formats. The significance of these results is that, with these extensions to core computer science concepts, traditional parser/compiler technologies can potentially be used as part of a general, cost-effective curation strategy for binary file formats.
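    As an illustration of the chunk-based family of formats the paper targets, the sketch below parses a hypothetical layout of (4-byte tag, 4-byte little-endian length, payload) records and validates chunk lengths; it is hand-written, not generated from an attribute grammar:

      # Sketch: parse and validate a hypothetical chunk-based binary format.
      import struct

      def parse_chunks(data: bytes):
          chunks, pos = [], 0
          while pos + 8 <= len(data):
              tag, length = struct.unpack_from("<4sI", data, pos)
              payload = data[pos + 8: pos + 8 + length]
              if len(payload) != length:
                  raise ValueError("truncated chunk")  # format validation
              chunks.append((tag, payload))
              pos += 8 + length
          return chunks

      blob = b"HDR\x00" + struct.pack("<I", 2) + b"\x01\x02"
      print(parse_chunks(blob))  # -> [(b'HDR\x00', b'\x01\x02')]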

  15. Research of Performance Linux Kernel File Systems

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Ostroukh

    2015-10-01

    Full Text Available The article describes the most common Linux kernel file systems. The research was carried out on a personal computer whose characteristics are given in the article; the study was performed on a typical workstation running GNU/Linux. The software necessary for measuring file system performance was installed on this machine. Based on the results, conclusions are drawn and recommendations are proposed for the use of the file systems, identifying the best ways to store data.

  16. The AEP Barnbook DATLIB. Nuclear Reaction Cross Sections and Reactivity Parameter Library and Files

    International Nuclear Information System (INIS)

    Feldbacher, R.

    1987-10-01

    Nuclear reaction data for light isotope charged particle reactions (Z<6) have been compiled. This hardcopy contains file headers, plots and an extended bibliography. Numerical data files and processing routines are available on tape at IAEA-NDS. (author). Refs

  17. Source Reference File

    Data.gov (United States)

    Social Security Administration — This file contains a national set of names and contact information for doctors, hospitals, clinics, and other facilities (known collectively as sources) from which...

  18. Patient Assessment File (PAF)

    Data.gov (United States)

    Department of Veterans Affairs — The Patient Assessment File (PAF) database compiles the results of the Patient Assessment Instrument (PAI) questionnaire filled out for intermediate care Veterans...

  19. RRB Earnings File (RRBERN)

    Data.gov (United States)

    Social Security Administration — RRBERN contains records for all beneficiaries on the RRB's PSSVES file who's SSNs are validated through the SVES processing. Validated output is processed through...

  20. ACTIV87 Fast neutron activation cross section file 1987

    International Nuclear Information System (INIS)

    Manokhin, V.N.; Pashchenko, A.B.; Plyaskin, V.I.; Bychkov, V.M.; Pronyaev, V.G.; Schwerer, O.

    1989-10-01

    This document summarizes the content of the Fast Neutron Activation Cross Section File based on data from different evaluated data libraries and individual evaluations in ENDF/B-5 format. The entire file or selective retrievals from it are available on magnetic tape, free of charge, from the IAEA Nuclear Data Section. (author)

  1. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.

  2. 76 FR 62092 - Filing Procedures

    Science.gov (United States)

    2011-10-06

    ... INTERNATIONAL TRADE COMMISSION Filing Procedures AGENCY: International Trade Commission. ACTION: Notice of issuance of Handbook on Filing Procedures. SUMMARY: The United States International Trade Commission (``Commission'') is issuing a Handbook on Filing Procedures to replace its Handbook on Electronic...

  3. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.

  4. NASA work unit system file maintenance manual

    Science.gov (United States)

    1972-01-01

    The NASA Work Unit System is a management information system for research tasks (i.e., work units) performed under NASA grants and contracts. It supplies profiles on research efforts and statistics on fund distribution. The file maintenance operator can add, delete and change records at a remote terminal or can submit punched cards to the computer room for batch update. The system is designed for file maintenance by a person with little or no knowledge of data processing techniques.

  5. Plug Load Data

    Data.gov (United States)

    National Aeronautics and Space Administration — We provide MATLAB binary files (.mat) and comma separated values files of data collected from a pilot study of a plug load management system that allows for the...

  6. LASIP-III, a generalized processor for standard interface files

    International Nuclear Information System (INIS)

    Bosler, G.E.; O'Dell, R.D.; Resnik, W.M.

    1976-03-01

    The LASIP-III code was developed for processing Version III standard interface data files which have been specified by the Committee on Computer Code Coordination. This processor performs two distinct tasks, namely, transforming free-field-format BCD data into well-defined binary files, and providing for printing and punching of data in the binary files. While LASIP-III is exported as a complete free-standing code package, techniques are described for easily separating the processor into two modules, viz., one for creating the binary files and one for printing the files. The two modules can be separated into free-standing codes or they can be incorporated into other codes. Also, the LASIP-III code can be easily expanded for processing additional files, and procedures are described for such an expansion. 2 figures, 8 tables
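
    For illustration only, here is a Python sketch of the two tasks such a processor performs; the actual Version III record layouts are not reproduced here, so the header and field layout below are invented:

        import struct

        def text_to_binary(txt_path, bin_path):
            """Transform free-field (whitespace-delimited) data into a binary file."""
            with open(txt_path) as src, open(bin_path, "wb") as dst:
                values = [float(tok) for tok in src.read().split()]
                dst.write(struct.pack("<i", len(values)))            # record count
                dst.write(struct.pack(f"<{len(values)}d", *values))  # payload

        def print_binary(bin_path):
            """Read the binary file back for printing."""
            with open(bin_path, "rb") as f:
                (n,) = struct.unpack("<i", f.read(4))
                print(struct.unpack(f"<{n}d", f.read(8 * n)))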

  7. Image File - TP Atlas | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Data name: Image File. DOI: 10.18908/lsdba.nbdc01161-004. Description of data contents: network diagrams (in PNG format) for each project; one project has one pathway file.

  8. FHEO Filed Cases

    Data.gov (United States)

    Department of Housing and Urban Development — The dataset is a list of all the Title VIII fair housing cases filed by FHEO from 1/1/2007 - 12/31/2012 including the case number, case name, filing date, state and...

  9. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    International Nuclear Information System (INIS)

    Schreiner, Steffen; Banerjee, Subho Sankar; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Bagnasco, Stefano; Zhu Jianlin

    2011-01-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.
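
    The core idea of the signed status messages can be sketched as follows; the key handling, message layout, and use of HMAC here are assumptions for illustration, not the AliEn protocol itself:

        import hashlib, hmac, json, os

        SHARED_KEY = b"storage-element-key"          # stand-in for real credentials

        def storage_status(path):
            """Issued by the storage system after an upload completes."""
            md5 = hashlib.md5(open(path, "rb").read()).hexdigest()
            msg = json.dumps({"lfn": path, "size": os.path.getsize(path), "md5": md5})
            sig = hmac.new(SHARED_KEY, msg.encode(), hashlib.sha256).hexdigest()
            return msg, sig

        def catalogue_verify(msg, sig):
            """The File Catalogue trusts size/checksum only if the signature holds."""
            expect = hmac.new(SHARED_KEY, msg.encode(), hashlib.sha256).hexdigest()
            return hmac.compare_digest(sig, expect)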

  10. A novel platform for in vitro analysis of torque, forces, and three-dimensional file displacements during root canal preparations: application to ProTaper rotary files.

    Science.gov (United States)

    Diop, Amadou; Maurel, Nathalie; Oiknine, Michel; Patoor, Etienne; Machtou, Pierre

    2009-04-01

    We proposed a new testing setup and in vitro experimental procedure allowing the analysis of the forces, torque, and file displacements during the preparation of root canals using nickel-titanium rotary endodontic files. We applied it to the preparation of 20 fresh frozen cadaveric teeth using ProTaper files (Dentsply Maillefer, Ballaigues, Switzerland), according to a clinically used sequence. During the preparations, a clinical hand motion was performed by an endodontist, and we measured the applied torque around the file axis and also the involved three-dimensional forces and 3-dimensional file displacements. Such a biomechanical procedure is useful to better understand the working conditions of the files in terms of loads and displacements. It could be used to analyze the effects of various mechanical and geometric parameters on the files' behavior and to get data for modelling purposes. Finally, it could contribute to studies aiming to improve files design in order to reduce the risks of file fractures.

  11. TFTR data management system

    International Nuclear Information System (INIS)

    Randerson, L.; Chu, J.; Ludescher, C.; Malsbury, J.; Stark, W.

    1986-01-01

    Developments in the tokamak fusion test reactor (TFTR) data management system supporting data acquisition and off-line physics data reduction are described. Data from monitor points, timing channels, transient recorder channels, and other devices are acquired and stored for use by on-line tasks. Files are transferred off-line automatically. A configuration utility determines the data acquired and the files transferred. An event system driven by file arrival activates off-line reduction processes. A post-run process transfers files not shipped during runs. Files are archived to tape and are retrievable by digraph and shot number. Automatic skimming based on most recent access, file type, shot numbers, and user-set protection maintains the files required for post-run data reduction

  12. Processing and benchmarking of the Evaluated Nuclear Data File/B-VIII.0β4 cross-section library by analysis of a series of critical experimental benchmarks using the Monte Carlo code MCNP(X) and NJOY2016

    Directory of Open Access Journals (Sweden)

    Kabach Ouadie

    2017-12-01

    Full Text Available To validate the new Evaluated Nuclear Data File (ENDF/B-VIII.0β4) library, 31 different critical cores were selected and used for a benchmark test of the important parameter keff. The four libraries used were processed with the Nuclear Data Processing Code NJOY2016. The results obtained with the ENDF/B-VIII.0β4 library were compared against those calculated with the ENDF/B-VI.8, ENDF/B-VII.0, and ENDF/B-VII.1 libraries using the Monte Carlo N-Particle code MCNP(X). All MCNP(X) calculations of keff with these four libraries were compared with the experimentally measured results, which are available in the International Criticality Safety Benchmark Evaluation Project. The obtained results are discussed and analyzed in this paper.
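
    The comparison step reduces to computing calculated-over-experimental (C/E) ratios of keff per library; a minimal sketch, with made-up numbers standing in for the benchmark values:

        k_exp, sigma = 1.0000, 0.0010                # illustrative benchmark keff
        results = {"ENDF/B-VIII.0b4": 1.0012,        # made-up calculated values
                   "ENDF/B-VII.1":    0.9995}

        for lib, k_calc in results.items():
            ce = k_calc / k_exp
            print(f"{lib:16s} C/E = {ce:.4f} ({(k_calc - k_exp) / sigma:+.1f} sigma)")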

  13. Status of the JENDL activation file

    International Nuclear Information System (INIS)

    Nakajima, Yutaka

    1996-01-01

    The preliminary JENDL activation file was completed in February 1995 and has been used within the Japanese Nuclear Data Committee and as one of the data sources for the Fusion Evaluated Nuclear Data Library of the IAEA. Since big activation libraries already exist in Western Europe and the United States, we are aiming at more accurate evaluation of the reactions important for nuclear energy applications rather than at covering as many reactions as those libraries do. In the preliminary file, 1,158 reaction cross sections have been compiled for 225 nuclides up to 20 MeV. (author)

  14. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

    Most software related to file systems is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. The "pcircle" software builds on top of ubiquitous MPI in cluster computing environments and the "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, and integrity checking.
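
    A toy stand-in for the parallel checksumming idea (pcircle itself distributes chunks over MPI ranks with work stealing; this sketch uses a local process pool): the file is split into chunks, each chunk is hashed independently, and the chunk digests are combined into one file-level signature.

        import hashlib, os
        from multiprocessing import Pool

        CHUNK = 4 * 1024 * 1024          # 4 MiB chunks

        def hash_chunk(args):
            path, offset = args
            with open(path, "rb") as f:
                f.seek(offset)
                return offset, hashlib.sha1(f.read(CHUNK)).hexdigest()

        def parallel_checksum(path):
            tasks = [(path, off) for off in range(0, os.path.getsize(path), CHUNK)]
            with Pool() as pool:
                digests = [d for _, d in sorted(pool.map(hash_chunk, tasks))]
            return hashlib.sha1("".join(digests).encode()).hexdigest()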

  15. Conversion software for ANSYS APDL 2 FLUENT MHD magnetic file

    International Nuclear Information System (INIS)

    Ghita, G.; Ionescu, S.; Prisecaru, I.

    2016-01-01

    The present paper describes the improvements made to the ANSYS APDL 2 FLUENT MHD Magnetic File conversion software, which extracts data from an ANSYS APDL file and writes a file containing the magnetic field data in the FLUENT magnetohydrodynamics (MHD) format. The MHD module has features for uniform and non-uniform magnetic fields, but for time-varying fields it is limited to sinusoidal or pulsed square waves with a fixed duty cycle of 50%. The present software has undergone major modifications in comparison with the previous version. The most important improvement is a new graphical interface, which provides 3D visualization of both the input file and the output file. Processing time has also been improved: the new version is two times faster than the old one. (authors)

  16. Parallel file system with metadata distributed across partitioned key-value store

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
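
    Stripped of MPI and MDHIM, the partitioning idea reduces to routing each metadata key to a partition by hashing; a toy sketch with invented field names (the real partitions live on compute nodes and communicate via message passing):

        NUM_PARTITIONS = 4
        partitions = [dict() for _ in range(NUM_PARTITIONS)]   # one per compute node

        def partition_for(offset):
            return partitions[offset % NUM_PARTITIONS]         # hash-style routing

        def put_meta(offset, length, node, obj_id):
            partition_for(offset)[offset] = {"len": length, "node": node, "obj": obj_id}

        def get_meta(offset):
            return partition_for(offset).get(offset)

        put_meta(0, 65536, "cn001", "obj-17")    # sub-file at byte 0 of the shared file
        print(get_meta(0))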

  17. Image files - RPD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Data name: Image files. Download URL: ftp://ftp.biosciencedbc.jp/archive/rpd/LATEST/rpd_gel_image.zip (file size: 38.5 MB).

  18. TFTR data management system

    International Nuclear Information System (INIS)

    Randerson, L.; Chu, J.; Ludescher, C.; Malsbury, J.; Stark, W.

    1986-01-01

    Developments in the tokamak fusion test reactor (TFTR) data-management system supporting data acquisition and off-line physics data reduction are described. Data from monitor points, timing channels, transient recorder channels, and other devices are acquired and stored for use by on-line tasks. Files are transferred off line automatically. A configuration utility determines data acquired and files transferred. An event system driven by file arrival activates off-line reduction processes. A post-run process transfers files not shipped during runs. Files are archived to tape and are retrievable by digraph and shot number. Automatic skimming based on most recent access, file type, shot numbers, and user-set protections maintains the files required for post-run data reduction

  19. Exemple de données statistiques à la disposition de l'exploration pétrolière : le fichier STATSID Example of Statistical Data Available for Petroleum Exploration : the Statsid File

    Directory of Open Access Journals (Sweden)

    Coustau H.

    2006-11-01

    Full Text Available The STATSID file is a data bank containing a country-by-country, year-by-year census of the leading petroleum statistics: the evolution of permits, exploration effort (geophysics and drilling), reserves, and production. These basic data can be used to compile statistics such as the rate of success, the yield of exploration efforts, and geopetroleum properties (richness in hydrocarbons, type of habitat, reservoir-field parameters), as well as statistical references which are of great help in decision making, such as the exploration effort required for making an initial discovery. This article also describes the main processing programs by which these data can be retrieved from the file.

  20. CGB - Consumer Complaints Data

    Data.gov (United States)

    Federal Communications Commission — Individual informal consumer complaint data detailing complaints filed with the Consumer Help Center beginning October 31, 2014. This data represents information...

  1. Comparative evaluation of debris extruded apically by using, Protaper retreatment file, K3 file and H-file with solvent in endodontic retreatment

    Directory of Open Access Journals (Sweden)

    Chetna Arora

    2012-01-01

    Full Text Available Aim: The aim of this study was to evaluate the apical extrusion of debris, comparing two engine-driven systems and a hand instrumentation technique during root canal retreatment. Materials and Methods: Forty-five human permanent mandibular premolars were prepared using the step-back technique and obturated with gutta-percha/zinc oxide eugenol sealer and the cold lateral condensation technique. The teeth were divided into three groups: Group A: ProTaper retreatment file, Group B: K3 file, Group C: H-file with tetrachloroethylene. All the canals were irrigated with 20 ml distilled water during instrumentation. Debris extruded along with the irrigating solution during the retreatment procedure was carefully collected in preweighed Eppendorf tubes. The tubes were stored in an incubator for 5 days, placed in a desiccator and then re-weighed. The weight of dry debris was calculated by subtracting the weight of the tube before instrumentation from the weight of the tube after instrumentation. Data were analyzed using two-way ANOVA and post hoc tests. Results: There was a statistically significant difference in the apical extrusion of debris between hand instrumentation and both the ProTaper retreatment file and the K3 file. The difference in extruded debris between the ProTaper retreatment file and the K3 file techniques was not statistically significant. All three instrumentation techniques produced apically extruded debris and irrigant. Conclusion: The best way to minimize the extrusion of debris is to adopt a crown-down technique; therefore, the use of a rotary technique (ProTaper retreatment file, K3 file) is recommended.

  2. Download this PDF file

    African Journals Online (AJOL)

  3. Patient Treatment File (PTF)

    Data.gov (United States)

    Department of Veterans Affairs — This database is part of the National Medical Information System (NMIS). The Patient Treatment File (PTF) contains a record for each inpatient care episode provided...

  4. Download this PDF file

    African Journals Online (AJOL)

    countries quite a number of distance education institutions and programmes are more likely to be ... The Open University of Tanzania (OUT), (Ministry of Higher Education, Science and ..... (1991) Comic Relief Funding file. BAI, London, 1st ...

  5. Stochastic Petri net analysis of a replicated file system

    Science.gov (United States)

    Bechta Dugan, Joanne; Ciardo, Gianfranco

    1989-01-01

    A stochastic Petri-net model of a replicated file system is presented for a distributed environment where replicated files reside on different hosts and a voting algorithm is used to maintain consistency. Witnesses, which simply record the status of the file but contain no data, can be used in addition to or in place of files to reduce overhead. A model sufficiently detailed to include file status (current or out-of-date), as well as failure and repair of hosts where copies or witnesses reside, is presented. The number of copies and witnesses is a parameter of the model. Two different majority protocols are examined, one where a majority of all copies and witnesses is necessary to form a quorum, and the other where only a majority of the copies and witnesses on operational hosts is needed. The latter, known as adaptive voting, is shown to increase file availability in most cases.
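
    The two quorum rules can be contrasted with a small Monte Carlo stand-in (the paper itself uses stochastic Petri nets; the per-host up-probability p and the success rule below are illustrative only):

        import random

        def available(up, n_copies, adaptive):
            """up: one boolean per copy/witness host, copies listed first."""
            reachable = sum(up)
            denom = reachable if adaptive else len(up)   # adaptive counts only live hosts
            return any(up[:n_copies]) and reachable > denom / 2

        def mc_availability(n_copies, n_witnesses, p, adaptive, trials=100_000):
            hits = sum(
                available([random.random() < p for _ in range(n_copies + n_witnesses)],
                          n_copies, adaptive)
                for _ in range(trials))
            return hits / trials

        print(mc_availability(2, 1, 0.9, adaptive=False))   # static majority
        print(mc_availability(2, 1, 0.9, adaptive=True))    # adaptive voting, higher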

  6. Controlling P2P File-Sharing Networks Traffic

    OpenAIRE

    García Pineda, Miguel; HAMMOUMI, MOHAMMED; Canovas Solbes, Alejandro; Lloret, Jaime

    2011-01-01

    Since the appearance of Peer-To-Peer (P2P) file-sharing networks some time ago, many Internet users have chosen this technology to share and search programs, videos, music, documents, etc. The total number of P2P file-sharing users has been increasing and decreasing in the last decade depending on the creation or end of some well known P2P file-sharing systems. P2P file-sharing networks traffic is currently overloading some data networks and it is a major headache for netw...

  7. RAMA: A file system for massively parallel computers

    Science.gov (United States)

    Miller, Ethan L.; Katz, Randy H.

    1993-01-01

    This paper describes a file system design for massively parallel computers which makes very efficient use of a few disks per processor. This overcomes the traditional I/O bottleneck of massively parallel machines by storing the data on disks within the high-speed interconnection network. In addition, the file system, called RAMA, requires little inter-node synchronization, removing another common bottleneck in parallel processor file systems. Support for a large tertiary storage system can easily be integrated into the file system; in fact, RAMA runs most efficiently when tertiary storage is used.
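
    The abstract does not spell out RAMA's placement policy, but one standard way to need little inter-node synchronization is deterministic placement: any node can compute a block's location from the file identifier and block number alone. A hedged sketch of that idea, not RAMA's documented scheme:

        import hashlib

        def block_location(file_id, block_no, n_nodes, disks_per_node):
            h = hashlib.md5(f"{file_id}:{block_no}".encode()).digest()
            x = int.from_bytes(h[:8], "big")
            return x % n_nodes, (x // n_nodes) % disks_per_node   # (node, disk)

        print(block_location("physics.dat", 42, n_nodes=128, disks_per_node=2))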

  8. MR-AFS: a global hierarchical file-system

    International Nuclear Information System (INIS)

    Reuter, H.

    2000-01-01

    The next generation of fusion experiments will use object-oriented technology creating the need for world wide sharing of an underlying hierarchical file-system. The Andrew file system (AFS) is a well known and widely spread global distributed file-system. Multiple-resident-AFS (MR-AFS) combines the features of AFS with hierarchical storage management systems. Files in MR-AFS therefore may be migrated on secondary storage, such as roboted tape libraries. MR-AFS is in use at IPP for the current experiments and data originating from super-computer applications. Experiences and scalability issues are discussed

  9. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate a prototype study showing that the HDF4 file content map can be used to organize data efficiently in a cloud object storage system and thereby facilitate cloud computing. This approach can be extended to any binary data format and to any existing big-data analytics solution powered by cloud computing, because the HDF4 file content map originated in the long-term preservation of NASA data and does not require the HDF4 APIs to access the data.
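
    The mechanism can be pictured as follows: the content map records where each dataset's bytes live inside the HDF4 file, so a reader can issue ranged reads against the object store without linking the HDF4 library. The map layout and dataset names below are invented for illustration:

        content_map = {                       # invented entries: offset/length per dataset
            "Temperature": {"offset": 2948,   "length": 409600},
            "Latitude":    {"offset": 412548, "length": 2880},
        }

        def read_dataset(blob, name):
            e = content_map[name]             # a ranged GET in a real object store
            return blob[e["offset"]: e["offset"] + e["length"]]

        blob = bytes(500_000)                 # stands in for the stored HDF4 object
        print(len(read_dataset(blob, "Latitude")), "bytes")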

  10. JENDL gas-production cross section file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo; Narita, Tsutomu

    1992-05-01

    The JENDL gas-production cross section file was compiled by taking cross-section data from JENDL-3 and using the ENDF-5 format. Data are given for 23 nuclei or elements among the light nuclei and structural materials. Graphs of the cross sections and a brief description of their evaluation methods are given in this report. (author)

  11. AliEnFS - a Linux File System for the AliEn Grid Services

    OpenAIRE

    Peters, Andreas J.; Saiz, P.; Buncic, P.

    2003-01-01

    Among the services offered by the AliEn (ALICE Environment http://alien.cern.ch) Grid framework there is a virtual file catalogue to allow transparent access to distributed data-sets using various file transfer protocols. alienfs (AliEn File System) integrates the AliEn file catalogue as a new file system type into the Linux kernel using LUFS, a hybrid user space file system framework (Open Source http://lufs.sourceforge.net). LUFS uses a special kernel interface level called VFS (Virtual F...

  12. Java facilities in processing XML files - JAXB and generating PDF reports

    Directory of Open Access Journals (Sweden)

    Danut-Octavian SIMION

    2008-01-01

    Full Text Available The paper presents the Java programming language facilities for working with XML files using JAXB (the Java Architecture for XML Binding) technology and for generating PDF reports from XML files using Java objects. The XML file can be an existing one containing data about an entity (Clients, for example), or it might be the result of a SELECT SQL statement. JAXB generates Java classes from XML Schema (XSD) rules via a marshalling/unmarshalling compiler. The PDF file is built from an XML file and uses an XSL-FO formatting file and a Java ResultSet object.

  13. SIDS-toADF File Mapping Manual

    Science.gov (United States)

    McCarthy, Douglas; Smith, Matthew; Poirier, Diane; Smith, Charles A. (Technical Monitor)

    2002-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The SIDS-toADF File Mapping Manual specifies the exact manner in which, under CGNS conventions, CFD data structures (the SIDS) are to be stored in (i.e., mapped onto) the file structure provided by the database manager (ADF). The result is a conforming CGNS database. Adherence to the mapping conventions guarantees uniform meaning and location of CFD data within ADF files, and thereby allows the construction of

  14. panMetaDocs, eSciDoc, and DOIDB—An Infrastructure for the Curation and Publication of File-Based Datasets for GFZ Data Services

    Directory of Open Access Journals (Sweden)

    Damian Ulbricht

    2016-03-01

    Full Text Available The GFZ German Research Centre for Geosciences is the national laboratory for Geosciences in Germany. As part of the Helmholtz Association, providing and maintaining large-scale scientific infrastructures are an essential part of GFZ activities. This includes the generation of significant volumes and numbers of research data, which subsequently become source materials for data publications. The development and maintenance of data systems is a key component of GFZ Data Services to support state-of-the-art research. A challenge lies not only in the diversity of scientific subjects and communities, but also in different types and manifestations of how data are managed by research groups and individual scientists. The data repository of GFZ Data Services provides a flexible IT infrastructure for data storage and publication, including minting of digital object identifiers (DOI. It was built as a modular system of several independent software components linked together through Application Programming Interfaces (APIs provided by the eSciDoc framework. Principal application software are panMetaDocs for data management and DOIDB for logging and moderating data publications activities. Wherever possible, existing software solutions were integrated or adapted. A summary of our experiences made in operating this service is given. Data are described through comprehensive landing pages and supplementary documents, like journal articles or data reports, thus augmenting the scientific usability of the service.

  15. Renewal-anomalous-heterogeneous files

    International Nuclear Information System (INIS)

    Flomenbom, Ophir

    2010-01-01

    Renewal-anomalous-heterogeneous files are solved. A simple file is made of Brownian hard spheres that diffuse stochastically in an effective 1D channel. Generally, Brownian files are heterogeneous: the spheres' diffusion coefficients are distributed and the initial spheres' density is non-uniform. In renewal-anomalous files, the distribution of waiting times for individual jumps is not exponential as in Brownian files, yet obeys ψ_α(t) ∼ t^(−1−α), 0 < α < 1. The mean square displacement (MSD) of a particle in the file, ⟨r²⟩, obeys ⟨r²⟩ ∼ (⟨r²⟩_nrml)^α, where ⟨r²⟩_nrml is the MSD in the corresponding Brownian file. This scaling is an outcome of an exact relation (derived here) connecting the probability density functions of Brownian files and renewal-anomalous files. It is also shown that non-renewal-anomalous files are slower than the corresponding renewal ones.

  16. Basic Stand Alone Medicare Claims Public Use Files

    Data.gov (United States)

    U.S. Department of Health & Human Services — CMS is committed to increasing access to its Medicare claims data through the release of de-identified data files available for public use. They contain...

  17. 40 CFR 716.25 - Adequate file search.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Adequate file search. 716.25 Section... ACT HEALTH AND SAFETY DATA REPORTING General Provisions § 716.25 Adequate file search. The scope of a person's responsibility to search records is limited to records in the location(s) where the required...

  18. Next generation WLCG File Transfer Service (FTS)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    LHC experiments at CERN and worldwide utilize WLCG resources and middleware components to perform distributed computing tasks. One of the most important tasks is reliable file replication. It is a complex problem, suffering from transfer failures, disconnections, transfer duplication, server and network overload, differences in storage systems, etc. To address these problems, EMI and gLite have provided the independent File Transfer Service (FTS) and Grid File Access Library (GFAL) tools. Their development started almost a decade ago; in the meantime, requirements in data management have changed, and the old architecture of FTS and GFAL cannot easily support these changes. Technology has also been progressing: FTS and GFAL do not fit into the new paradigms (cloud and messaging, for example). To be able to serve the next stage of LHC data collecting (from 2013), we need a new generation of these tools: FTS 3 and GFAL 2. We envision a service requiring minimal configuration, which can dynamically adapt to the...

  19. A secure file manager for UNIX

    Energy Technology Data Exchange (ETDEWEB)

    DeVries, R.G.

    1990-12-31

    The development of a secure file management system for a UNIX-based computer facility with supercomputers and workstations is described. Specifically, UNIX in its usual form does not address: (1) Operation which would satisfy rigorous security requirements. (2) Online space management in an environment where total data demands would be many times the actual online capacity. (3) Making the file management system part of a computer network in which users of any computer in the local network could retrieve data generated on any other computer in the network. The characteristics of UNIX can be exploited to develop a portable, secure file manager which would operate on computer systems ranging from workstations to supercomputers. Implementation considerations making unusual use of UNIX features, rather than requiring extensive internal system changes, are described, and implementation using the Cray Research Inc. UNICOS operating system is outlined.

  20. Building Data

    Data.gov (United States)

    Town of Cary, North Carolina — Explore real estate information about buildings in the Town of Cary.This file is created by the Town of Cary GIS Group. It contains data from both the Wake, Chatham...

  1. High School Longitudinal Study of 2009 (HSLS:09) Base Year to First Follow-Up Data File Documentation. NCES 2014-361

    Science.gov (United States)

    Ingels, Steven J.; Pratt, Daniel J.; Herget, Deborah R.; Dever, Jill A.; Fritch, Laura Burns; Ottem, Randolph; Rogers, James E.; Kitmitto, Sami; Leinwand, Steve

    2013-01-01

    This manual has been produced to familiarize data users with the design, and the procedures followed for data collection and processing, in the base year and first follow-up of the High School Longitudinal Study of 2009 (HSLS:09), with emphasis on the first follow-up. It also provides the necessary documentation for use of the public-use data…

  2. High School Longitudinal Study of 2009 (HSLS:09) Base Year to First Follow-Up Data File Documentation. Appendixes. NCES 2014-361

    Science.gov (United States)

    Ingels, Steven J.; Pratt, Daniel J.; Herget, Deborah R.; Dever, Jill A.; Fritch, Laura Burns; Ottem, Randolph; Rogers, James E.; Kitmitto, Sami; Leinwand, Steve

    2013-01-01

    The manual that accompanies these appendices was produced to familiarize data users with the design, and the procedures followed for data collection and processing, in the base year and first follow-up of the High School Longitudinal Study of 2009 (HSLS:09), with emphasis on the first follow-up. It also provides the necessary documentation for use…

  3. Experiments in high energy elementary particle physics and processing of photographically filed data with the aid of a measuring and evaluating system

    Energy Technology Data Exchange (ETDEWEB)

    Kirst, H [Akademie der Wissenschaften der DDR, Berlin-Zeuthen. Inst. fuer Hochenergiephysik

    1977-01-01

    The measuring and evaluating system includes pattern recognition and measuring instruments as well as a processor for data evaluation and checking procedures. The program chart and the application to evaluating photographs of particle tracks from high energy physics experiments are mentioned. The time-sharing effect of such systems in data evaluation is emphasized.

  4. Exploring Online Students' Self-Regulated Learning with Self-Reported Surveys and Log Files: A Data Mining Approach

    Science.gov (United States)

    Cho, Moon-Heum; Yoo, Jin Soung

    2017-01-01

    Many researchers who are interested in studying students' online self-regulated learning (SRL) have heavily relied on self-reported surveys. Data mining is an alternative technique that can be used to discover students' SRL patterns from large data logs saved on a course management system. The purpose of this study was to identify students' online…

  5. School Survey on Crime and Safety (SSOCS): 2015-16. Public-Use Data File User's Manual. NCES 2018-107

    Science.gov (United States)

    Jackson, Michael; Diliberti, Melissa; Kemp, Jana; Hummel, Steven; Cox, Christina; Gbondo-Tugbawa, Komba; Simon, Dillon

    2018-01-01

    The School Survey on Crime and Safety (SSOCS) is managed by the National Center for Education Statistics (NCES) within the Institute of Education Sciences of the U.S. Department of Education. SSOCS collects extensive crime and safety data from principals and administrators of public schools in the United States. Data from this collection can be…

  6. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    Science.gov (United States)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access file. A description of the matrices written on these files is contained herein.

  7. Log files can and should be prepared for a functionalistic approach

    DEFF Research Database (Denmark)

    Bergenholtz, Henning; Johnsen, Mia

    2007-01-01

    -ups. However, log file analyses have also been characterised by a juggling of numbers based on data calculations of limited direct relevance to practical and theoretical lexicography. This article proposes the development of lexicographically relevant log files for use in log file analyses in order

  8. Structural origins of broadband emission from layered Pb–Br hybrid perovskites

    OpenAIRE

    Smith, Matthew D.; Jaffe, Adam; Dohner, Emma R.; Lindenberg, Aaron M.; Karunadasa, Hemamala I.

    2017-01-01

    Through structural and optical studies of a series of two-dimensional hybrid perovskites, we show that broadband emission upon near-ultraviolet excitation is common to (001) lead-bromide perovskites. Importantly, we find that the relative intensity of the broad emission correlates with increasing out-of-plane distortion of the Pb–(μ-Br)–Pb angle in the inorganic sheets. Temperature- and power-dependent photoluminescence data obtained on a representative (001) perovskite support an intrinsic o...

  9. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    Science.gov (United States)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  10. APLIKASI KEAMANAN FILE AUDIO WAV (WAVEFORM) DENGAN TERAPAN ALGORITMA RSA [Security Application for WAV (Waveform) Audio Files Using the RSA Algorithm]

    Directory of Open Access Journals (Sweden)

    Raja Nasrul Fuad

    2017-03-01

    Full Text Available The WAV file format is widely used across various kinds of multimedia and gaming platforms. Ease of access and technological development, with a variety of media, facilitate the exchange of information between places. Important data need to be kept confidential, since a wide range of security threats means data can be intercepted and read by third parties during transmission. These problems led to the idea of creating an application whose security functions can protect data using the RSA algorithm. The programming language is C# with the Visual Studio software; the processed data are the individual sample bytes of the WAV file, and the header is left unchanged so that the WAV file can still be played even though its information has been hidden. The RSA algorithm can be implemented in a programming language so that WAV files can be processed and their data secured.
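
    A hedged sketch of the scheme (in Python with the pyca/cryptography package rather than C#): only the sample bytes are RSA-encrypted, chunked to fit the OAEP limit, while the 44-byte canonical PCM header is kept so players still recognise the file. A complete tool would also have to update the header's size fields, since RSA ciphertext is larger than the plaintext; real designs would normally wrap a symmetric key with RSA instead.

        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)
        MAX_CHUNK = 190                       # OAEP/SHA-256 limit for a 2048-bit key

        def encrypt_samples(samples):
            pub = key.public_key()
            return b"".join(pub.encrypt(samples[i:i + MAX_CHUNK], oaep)
                            for i in range(0, len(samples), MAX_CHUNK))

        wav = bytes(44) + bytes(1000)         # stand-in for a real WAV file's bytes
        secured = wav[:44] + encrypt_samples(wav[44:])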

  11. Versioning Complex Data

    Energy Technology Data Exchange (ETDEWEB)

    Macduff, Matt C.; Lee, Benno; Beus, Sherman J.

    2014-06-29

    Using the history of ARM data files, we designed and demonstrated a feasible data versioning paradigm. Assigning versions to sets of files that are modified, under some special assumptions and domain-specific rules, was effective in the case of ARM data, which has more than 5000 datastreams and 500 TB of data.
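
    A minimal sketch of the paradigm under a simplified rule (ARM's actual domain rules are richer, and the datastream name below is hypothetical): a datastream's version is bumped whenever the fingerprint of its file set changes.

        import hashlib

        history = {}                                 # datastream -> (fingerprint, version)

        def observe(datastream, files):
            """files: {filename: content_bytes} for the current state of the set."""
            parts = sorted(f"{n}:{hashlib.sha1(b).hexdigest()}" for n, b in files.items())
            fp = hashlib.sha1("|".join(parts).encode()).hexdigest()
            old_fp, ver = history.get(datastream, (None, 0))
            if fp != old_fp:
                ver += 1
                history[datastream] = (fp, ver)
            return ver

        print(observe("sgpmetE13", {"a.nc": b"one"}))     # -> 1
        print(observe("sgpmetE13", {"a.nc": b"two"}))     # -> 2 (content changed)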

  12. Performance of the Galley Parallel File System

    Science.gov (United States)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    As the input/output (I/O) needs of parallel scientific applications increase, file systems for multiprocessors are being designed to provide applications with parallel access to multiple disks. Many parallel file systems present applications with a conventional Unix-like interface that allows the application to access multiple disks transparently. This interface conceals the parallelism within the file system, which increases the ease of programmability, but makes it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. Furthermore, most current parallel file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic parallel workloads. Initial experiments, reported in this paper, indicate that Galley is capable of providing high-performance I/O to applications that access data in patterns that have been observed to be common.

  13. Dynamic Non-Hierarchical File Systems for Exascale Storage

    Energy Technology Data Exchange (ETDEWEB)

    Long, Darrell E. [Univ. of California, Santa Cruz, CA (United States); Miller, Ethan L [Univ. of California, Santa Cruz, CA (United States)

    2015-02-24

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search

  14. Structural origins of broadband emission from layered Pb–Br hybrid perovskites

    Science.gov (United States)

    Smith, Matthew D.; Jaffe, Adam; Dohner, Emma R.; Lindenberg, Aaron M.

    2017-01-01

    Through structural and optical studies of a series of two-dimensional hybrid perovskites, we show that broadband emission upon near-ultraviolet excitation is common to (001) lead-bromide perovskites. Importantly, we find that the relative intensity of the broad emission correlates with increasing out-of-plane distortion of the Pb–(μ-Br)–Pb angle in the inorganic sheets. Temperature- and power-dependent photoluminescence data obtained on a representative (001) perovskite support an intrinsic origin to the broad emission from the bulk material, where photogenerated carriers cause excited-state lattice distortions mediated through electron–lattice coupling. In contrast, most inorganic phosphors contain extrinsic emissive dopants or emissive surface sites. The design rules established here could allow us to systematically optimize white-light emission from layered hybrid perovskites by fine-tuning the bulk crystal structure. PMID:28970879

  15. Streamflow and water-quality data for Little Scrubgrass Creek basin, Venango and Butler Counties, Pennsylvania, December 1987-November 1988. Open File Report

    International Nuclear Information System (INIS)

    Kostelnik, K.M.; Durlin, R.R.

    1989-01-01

    Streamflow and water-quality data were collected throughout the Little Scrubgrass Creek basin, Venango and Butler Counties, Pennsylvania, from December 1987 to November 1988, to determine the prevailing quality of surface water throughout the basin. The data will assist the Pennsylvania Department of Environmental Resources during its review of coal mine permit applications. A water-quality station on Little Scrubgrass Creek near Lisbon provided a continuous record of stream stage, pH, specific conductance, and water temperature. Monthly water-quality samples collected at the station were analyzed for total and dissolved metals, nutrients, major cations and anions, and suspended sediment concentrations. Fourteen partial-record sites, located throughout the basin, were similarly sampled four times during the period of study. Streamflow and water-quality data obtained at these sites during various base flow periods are also presented

  16. Streamflow and water-quality data for Little Clearfield Creek basin, Clearfield County, Pennsylvania, December 1987-November 1988. Open File Report

    International Nuclear Information System (INIS)

    Kostelnik, K.M.; Durlin, R.R.

    1989-01-01

    Streamflow and water-quality data were collected throughout the Little Clearfield Creek basin, Clearfield County, Pennsylvania, from December 1987 through November 1988, to determine the existing quality of surface water over a range of hydrologic conditions. The data will assist the Pennsylvania Department of Environmental Resources during its review of coal-mine permit applications. A water-quality station near the mouth of Little Clearfield Creek provided a continuous record of stream stage, pH, specific conductance, and water temperature. Monthly water-quality samples collected at the station were analyzed for total and dissolved metals, nutrients, major cations, and suspended-sediment concentrations. Seventeen partial-record sites, located throughout the basin, were similarly sampled four times during the study. Streamflow and water-quality data obtained at these sites during a winter base flow, a spring storm event, a low summer base flow, and a more moderate summer base flow also are presented

  17. Auto Draw from Excel Input Files

    Science.gov (United States)

    Strauss, Karl F.; Goullioud, Renaud; Cox, Brian; Grimes, James M.

    2011-01-01

    The design process often involves the use of Excel files during project development. To facilitate communication of the information in the Excel files, drawings are often generated. During the design process, the Excel files are updated often to reflect new input. The problem is that the drawings often lag the updates, leading to confusion about the current state of the design. The use of this program allows visualization of complex data in a format that is more easily understandable than pages of numbers. Because the graphical output can be updated automatically, the manual labor of diagram drawing can be eliminated. The more frequent update of system diagrams can reduce confusion and errors and is likely to uncover system problems earlier in the design cycle, thus reducing rework and redesign.

  18. 18 CFR 11.16 - Filing requirements.

    Science.gov (United States)

    2010-04-01

    ... ACT Charges for Headwater Benefits § 11.16 Filing requirements. (a) Applicability. (1) Any party subject to a headwater benefits determination under this subpart must supply project-specific data, in... are attributable to the annual costs of interest, maintenance, and depreciation, identifying the...

  19. Data Files for Ground-Motion Simulations of the 1906 San Francisco Earthquake and Scenario Earthquakes on the Northern San Andreas Fault

    Science.gov (United States)

    Aagaard, Brad T.; Barall, Michael; Brocher, Thomas M.; Dolenc, David; Dreger, Douglas; Graves, Robert W.; Harmsen, Stephen; Hartzell, Stephen; Larsen, Shawn; McCandless, Kathleen; Nilsson, Stefan; Petersson, N. Anders; Rodgers, Arthur; Sjogreen, Bjorn; Zoback, Mary Lou

    2009-01-01

    This data set contains results from ground-motion simulations of the 1906 San Francisco earthquake, seven hypothetical earthquakes on the northern San Andreas Fault, and the 1989 Loma Prieta earthquake. The bulk of the data consists of synthetic velocity time-histories. Peak ground velocity on a 1/60th degree grid and geodetic displacements from the simulations are also included. Details of the ground-motion simulations and analysis of the results are discussed in Aagaard and others (2008a,b).

  20. Study and development of a document file system with selective access

    International Nuclear Information System (INIS)

    Mathieu, Jean-Claude

    1974-01-01

    The objective of this research thesis was to design and develop a set of software aimed at the efficient management of a document file system using methods of selective access to information. Thus, the three main aspects of file processing (creation, modification, reorganisation) have been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of the INIS bibliographic records provided by the IAEA which have been used to create the files. In the second part, he briefly describes the general organisation of the file system. This system is based on the use of two main files: an inverse file, which contains for each descriptor the list of numbers of the records indexed by this descriptor, and a dictionary of descriptors (the input file), which gives access to the inverse file. The organisation of both files is then described in detail. Other related or associated files are created, and the overall architecture and mechanisms integrated into the file data input software are described, as well as the various processes applied to these different files. Performance and possible developments are finally discussed.
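
    The two-file organisation described above maps naturally onto an inverted index; a toy in-memory sketch (the thesis implements these as disk files):

        from collections import defaultdict

        inverse_file = defaultdict(list)       # descriptor -> record numbers

        def index(record_no, descriptors):
            for d in descriptors:
                inverse_file[d].append(record_no)

        def search(*descriptors):               # conjunctive selective access
            hits = [set(inverse_file[d]) for d in descriptors]
            return sorted(set.intersection(*hits)) if hits else []

        index(1, ["reactor", "dosimetry"])
        index(2, ["reactor", "shielding"])
        print(search("reactor", "dosimetry"))   # -> [1]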

  1. RRDF-98. Russian reactor dosimetry file. Summary documentation

    Energy Technology Data Exchange (ETDEWEB)

    Pashchenko, A B

    1999-03-01

    This document summarizes the contents and documentation of the new version of the Russian Reactor Dosimetry File (RRDF-98) released in December 1998 by the Russian Center on Nuclear Data (CJD) at the Institute of Physics and Power Engineering, Russian Federation. This file contains the original evaluations of cross section data and covariance matrices for 22 reactions which are used for neutron flux dosimetry by foil activation. The majority of the evaluations included in previous versions of the Russian Reactor Dosimetry Files (BOSPOR-80, RRDF-94 and RRDF-96) have been superseded by new evaluations. The evaluated cross sections of RRDF-98, averaged over the 252-Cf and 235-U fission spectra, are compared with relevant integral data. The data file is available from the IAEA Nuclear Data Section on diskette, cost free. (author) 9 refs, 22 figs, 2 tabs

  2. RRDF-98. Russian reactor dosimetry file. Summary documentation

    International Nuclear Information System (INIS)

    Pashchenko, A.B.

    1999-01-01

    This document summarizes the contents and documentation of the new version of the Russian Reactor Dosimetry File (RRDF-98) released in December 1998 by the Russian Center on Nuclear Data (CJD) at the Institute of Physics and Power Engineering, Russian Federation. This file contains the original evaluations of cross section data and covariance matrices for 22 reactions which are used for neutron flux dosimetry by foil activation. The majority of the evaluations included in previous versions of the Russian Reactor Dosimetry Files (BOSPOR-80, RRDF-94 and RRDF-96) have been superseded by new evaluations. The evaluated cross sections of RRDF-98, averaged over the 252-Cf and 235-U fission spectra, are compared with relevant integral data. The data file is available from the IAEA Nuclear Data Section on diskette, cost free. (author)

  3. Accessing files in an Internet: The Jade file system

    Science.gov (United States)

    Peterson, Larry L.; Rao, Herman C.

    1991-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  4. Accessing files in an internet - The Jade file system

    Science.gov (United States)

    Rao, Herman C.; Peterson, Larry L.

    1993-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  5. Download this PDF file

    African Journals Online (AJOL)

  6. Challenging Ubiquitous Inverted Files

    NARCIS (Netherlands)

    de Vries, A.P.

    2000-01-01

    Stand-alone ranking systems based on highly optimized inverted file structures are generally considered ‘the’ solution for building search engines. Observing various developments in software and hardware, we argue however that IR research faces a complex engineering problem in the quest for more

  7. Download this PDF file

    African Journals Online (AJOL)

    AJNS WEBMASTERS

    Incidence is higher in the elderly, about 58 per 100,000 per year. Diagnosis of CSDH is still .... in the other two patients was not stated in the case file. Evacuation of the Subdural .... Personal experience in 39 patients. Br J of Neurosurg. 2003 ...

  8. Effects of existing evaluated nuclear data files on neutronics characteristics of the BFS-62-3A critical assembly benchmark model

    International Nuclear Information System (INIS)

    Semenov, Mikhail

    2002-11-01

    This report is a continuation of the study of the experiments performed on the BFS-62-3A critical assembly in Russia. The objective of the work is to determine the effect of cross-section uncertainties on reactor neutronics parameters as applied to the hybrid core of the BN-600 reactor of Beloyarskaya NPP. A two-dimensional benchmark model of BFS-62-3A was created specially for these purposes, and the experimental values were reduced to it. The benchmark characteristics for this assembly are 1) criticality; 2) central fission rate ratios (spectral indices); and 3) fission rate distributions in the stainless steel reflector. The effects of nuclear data libraries have been studied by comparing the results calculated using the available modern data libraries - ENDF/B-V, ENDF/B-VI, ENDF/B-VI-PT, JENDL-3.2 and ABBN-93. All results were computed by the Monte Carlo method with continuous-energy cross sections. The cross sections of the major isotopes were checked against a wide collection of criticality benchmarks. It was shown that the ENDF/B-V data underestimate the criticality of fast reactor systems by up to 2% Δk. As for the other libraries, the spread in the calculated criticality of BFS-62-3A is around 0.6% Δk. However, taking into account the results obtained for other fast reactor benchmarks (including steel-reflected ones), it may be concluded that the difference in criticality calculation results can reach 1% Δk. This value is in good agreement with the cross-section uncertainty evaluated for the BN-600 hybrid core (±0.6% Δk). This work is related to the JNC-IPPE Collaboration on Experimental Investigation of Excess Weapons Grade Pu Disposition in BN-600 Reactor Using BFS-2 Facility. (author)

  9. Assessment of beryllium and molybdenum nuclear data files with the RPI neutron scattering system in the energy region from 0.5 to 20 MeV

    Science.gov (United States)

    Daskalakis, Adam; Blain, Ezekiel; Leinweber, Gregory; Rapp, Michael; Barry, Devin; Block, Robert; Danon, Yaron

    2017-09-01

    A series of neutron scattering benchmark measurements were performed on beryllium and molybdenum with the Rensselaer Polytechnic Institute's Neutron Scattering System. The pulsed neutron source was produced by the Rensselaer Polytechnic Institute's Linear Accelerator, and a well collimated neutron beam was incident on the samples located at a distance of 30.07 m. Neutrons that scattered from the sample were measured by time-of-flight with eight EJ-301 liquid scintillator detectors positioned 0.5 m from the sample of interest. A total of eight experiments were performed, with two sample thicknesses each, measured by detectors placed at two sets of angles. All data were processed using pulse shape analysis that separated the neutron and gamma ray events and included a gamma misclassification correction to account for erroneously identified gamma rays. A detailed model of the neutron scattering system was used to simulate each experiment with several current evaluated nuclear data libraries and their predecessors. The results for each evaluation were compared to the experimental data using a figure-of-merit. The neutron scattering system has thus been used as a means to quantify a library's performance.
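
    The abstract does not reproduce the figure-of-merit definition, so the sketch below assumes a common chi-square-per-point form purely to illustrate how simulations driven by different evaluated libraries can be ranked against one measured spectrum.

        # Assumed chi-square-per-point figure-of-merit (illustrative only;
        # not necessarily the definition used in the paper).
        import numpy as np

        def figure_of_merit(measured, simulated, sigma):
            measured, simulated, sigma = map(np.asarray, (measured, simulated, sigma))
            return float(np.sum(((measured - simulated) / sigma) ** 2) / measured.size)

        # Lower is better; evaluate one value per nuclear data library.
        fom = figure_of_merit([105.0, 98.0, 120.0],   # measured counts
                              [100.0, 102.0, 118.0],  # simulated counts
                              [10.0, 10.0, 11.0])     # measurement uncertainties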

  10. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Science.gov (United States)

    2010-01-01

    ... a facsimile machine or by electronic mail if the reporting entity is not required to file..., including electronic reporting entities, may use the Commission's website's on-line program to file 48-hour... the reporting entity is not required to file electronically in accordance with 11 CFR 104.18. [67 FR...

  11. The crystallographic information file (CIF): A new standard archive file for crystallography

    International Nuclear Information System (INIS)

    Hall, S.R.; Allen, F.H.; Brown, I.D.

    1991-01-01

    The specification of a new standard Crystallographic Information File (CIF) is described. Its development is based on the Self-Defining Text Archive and Retrieval (STAR) procedure. The CIF is a general, flexible and easily extensible free-format archive file; it is human- and machine-readable and can be edited by a simple editor. The CIF is designed for the electronic transmission of crystallographic data between individual laboratories, journals and databases; it has been adopted by the International Union of Crystallography as the recommended medium for this purpose. The file consists of data names and data items, together with a loop facility for repeated items. The data names, constructed hierarchically so as to form data categories, are self-descriptive within a 32-character limit. The sorted list of data names, together with their precise definitions, constitutes the CIF dictionary (core version 1991). The CIF core dictionary is presented in full and covers the fundamental and most commonly used data items relevant to crystal structure analysis. The dictionary is also available as an electronic file suitable for CIF computer applications. Future extensions to the dictionary will include data items used in more specialized areas of crystallography. (orig.)
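
    The two constructs described above, single data items and the loop facility for repeated items, can be pictured with a toy reader (a sketch only, not a conforming CIF parser):

        # Toy reader for "_name value" items and the loop_ facility.
        def read_cif(lines):
            items, i = {}, 0
            while i < len(lines):
                line = lines[i].strip(); i += 1
                if line.startswith("loop_"):
                    names = []                       # the looped data names
                    while i < len(lines) and lines[i].strip().startswith("_"):
                        names.append(lines[i].strip()); i += 1
                    rows = []                        # one dict per repeated item
                    while i < len(lines) and lines[i].strip():
                        rows.append(dict(zip(names, lines[i].split()))); i += 1
                    items[tuple(names)] = rows
                elif line.startswith("_"):
                    name, value = line.split(None, 1)
                    items[name] = value
            return items

        cif = ["_cell_length_a 5.431",
               "loop_",
               "_atom_site_label",
               "_atom_site_occupancy",
               "Si1 1.0"]
        print(read_cif(cif)["_cell_length_a"])       # -> 5.431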

  12. Summary of JENDL-2 general purpose file

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [ed.]

    1984-06-15

    The general purpose file of the second version of Japanese Evaluated Nuclear Data Library (JENDL-2) was released in December 1982. Recently, descriptive data were added to JENDL-2 and at the same time the first revision of numerical data was performed. JENDL-2 (Rev.1) consists of the data for 89 nuclides and about 211,000 records in the ENDF/B-IV format. In this report, full listings of presently added descriptive data are given to summarize the JENDL-2 general purpose file. The 2200-m/sec and 14-MeV cross sections, resonance integrals, Maxwellian and fission spectrum averaged cross sections are given in a table. Average cross sections were also calculated in suitable energy intervals.
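
    For readers unfamiliar with the quantities tabulated here, a spectrum-averaged cross section is the flux-weighted mean, sigma_avg = integral(sigma(E) phi(E) dE) / integral(phi(E) dE). The sketch below evaluates it numerically for an assumed Maxwellian flux shape and a toy 1/v cross section; for that combination the exact answer is sqrt(pi)/2 times the 2200-m/sec value.

        # Numerical spectrum average with an assumed Maxwellian flux shape.
        import numpy as np

        def spectrum_averaged_xs(energy_eV, sigma_b, kT_eV=0.0253):
            phi = energy_eV * np.exp(-energy_eV / kT_eV)   # Maxwellian flux
            return np.trapz(sigma_b * phi, energy_eV) / np.trapz(phi, energy_eV)

        E = np.linspace(1e-4, 1.0, 20000)                  # energy grid, eV
        sigma = 10.0 * np.sqrt(0.0253 / E)                 # 1/v: 10 b at 2200 m/sec
        print(spectrum_averaged_xs(E, sigma))              # ~ 8.86 b = 10 b * sqrt(pi)/2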

  13. Summary of JENDL-2 general purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1984-06-01

    The general purpose file of the second version of Japanese Evaluated Nuclear Data Library (JENDL-2) was released in December 1982. Recently, descriptive data were added to JENDL-2 and at the same time the first revision of numerical data was performed. JENDL-2 (Rev.1) consists of the data for 89 nuclides and about 211,000 records in the ENDF/B-IV format. In this report, full listings of presently added descriptive data are given to summarize the JENDL-2 general purpose file. The 2200-m/sec and 14-MeV cross sections, resonance integrals, Maxwellian and fission spectrum averaged cross sections are given in a table. Average cross sections were also calculated in suitable energy intervals. (author)

  14. Sorting protein lists with nwCompare: a simple and fast algorithm for n-way comparison of proteomic data files.

    Science.gov (United States)

    Pont, Frédéric; Fournié, Jean Jacques

    2010-03-01

    MS, the reference technology for proteomics, routinely produces large numbers of protein lists whose fast comparison would prove very useful. Unfortunately, most software packages only allow comparisons of two to three lists at once. We introduce here nwCompare, a simple tool for n-way comparison of several protein lists without any query language, and exemplify its use with differential and shared cancer cell proteomes. As the software compares character strings, it can be applied to any type of data mining, such as genomic or metabolomic data lists.
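
    Because the entries are compared as plain character strings, an n-way comparison of this kind reduces to set algebra, as the sketch below illustrates with toy protein names (nwCompare's own implementation is not reproduced here).

        # n-way comparison of named lists via set algebra.
        from functools import reduce

        def n_way_compare(lists):
            sets = {name: set(items) for name, items in lists.items()}
            common = reduce(set.intersection, sets.values())
            unique = {name: s - set.union(*(t for n, t in sets.items() if n != name))
                      for name, s in sets.items()}
            return common, unique

        common, unique = n_way_compare({
            "cellA": ["P53", "AKT1", "EGFR"],
            "cellB": ["P53", "EGFR", "MYC"],
            "cellC": ["P53", "BRAF"],
        })
        print(common)           # shared by all lists -> {'P53'}
        print(unique["cellB"])  # seen only in cellB  -> {'MYC'}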

  15. HCUP State Emergency Department Databases (SEDD) - Restricted Access File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The State Emergency Department Databases (SEDD) contain the universe of emergency department visits in participating States. Restricted access data files are...

  16. Archive of Census Related Products (ACRP): 1992 Boundary Files

    Data.gov (United States)

    National Aeronautics and Space Administration — The 1992 Boundary Files portion of the Archive of Census Related Products (ACRP) consists of 1992 boundary data from the U.S. Census Bureau's Topologically...

  17. Archive of Census Related Products (ACRP): 1990 Standard Extract Files

    Data.gov (United States)

    National Aeronautics and Space Administration — The 1990 Standard Extract Files portion of the Archive of Census Related Products (ACRP) contains population and housing data derived from the U.S. Census Bureau's...

  18. 75 FR 57759 - Great River Energy; Notice of Filing

    Science.gov (United States)

    2010-09-22

    ... (GRE) filed its proposed updated Reactive Power revenue requirement and supporting cost data for GRE's... Reference Room in Washington, DC. There is an "eSubscription" link on the Web site that enables...

  19. 12 CFR Appendix E to Part 360 - Hold File Structure

    Science.gov (United States)

    2010-01-01

    ... ReasonReason for the hold. Possible values are: Character (2). • LN = Loan Collateral Hold • LG = Court... structure of the data file to provide information to the FDIC for each legal or collateral hold placed on a...

  20. Grid collector: An event catalog with automated file management

    International Nuclear Information System (INIS)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select events based on tags, such as "production date between March 10 and 20, and the number of charged tracks > 100". The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users
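
    The tag-based selection quoted above can be pictured with the sketch below (hypothetical tag names and catalog layout; the production system indexes real STAR event tags and moves files over the Grid).

        # Toy event catalog: select events by tags, then stage only the
        # files that actually contain matching events.
        import datetime

        catalog = [  # (file name, event tags)
            ("run42.daq", {"date": datetime.date(2003, 3, 12), "n_charged": 140}),
            ("run42.daq", {"date": datetime.date(2003, 3, 12), "n_charged": 80}),
            ("run57.daq", {"date": datetime.date(2003, 3, 25), "n_charged": 210}),
        ]

        def select(catalog, start, end, min_tracks):
            return [(f, tags) for f, tags in catalog
                    if start <= tags["date"] <= end and tags["n_charged"] > min_tracks]

        hits = select(catalog, datetime.date(2003, 3, 10),
                      datetime.date(2003, 3, 20), 100)
        files_to_stage = {f for f, _ in hits}   # only these leave mass storage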

  1. Grid collector: An event catalog with automated file management

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-10-17

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select events based on tags, such as "production date between March 10 and 20, and the number of charged tracks > 100". The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users.

  2. Embedded system file transfer USB

    International Nuclear Information System (INIS)

    Jaoua, Mehdi

    2008-01-01

    The development of serial communication has raised new aspects of data exchange. Data transfer, the subject of my final-year project, consists in transferring files from one mass-storage device to another via the USB port. In the first phase, my task was to build an embedded system allowing communication between a USB key and a communication terminal such as a PC. To this end, I had to understand the operation of the USB protocol, and I programmed a PIC microcontroller to manage this communication. The second phase consists in extending this project to data transmission between two USB keys without the intervention of a powerful machine equipped with an operating system to manage the transaction. (Author)

  3. Wadeable Streams Assessment Data

    Science.gov (United States)

    The Wadeable Streams Assessment (WSA) is a first-ever statistically-valid survey of the biological condition of small streams throughout the U.S. The U.S. Environmental Protection Agency (EPA) worked with the states to conduct the assessment in 2004-2005. Data for each parameter sampled in the Wadeable Streams Assessment (WSA) are available for downloading in a series of files as comma separated values (*.csv). Each *.csv data file has a companion text file (*.txt) that lists a dataset label and individual descriptions for each variable. Users should view the *.txt files first to help guide their understanding and use of the data.
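
    In practice the suggested access pattern, companion *.txt first and then the *.csv, is a few lines of code (the file names below are placeholders, not actual WSA file names):

        # Read a WSA-style data file and its companion description file.
        import csv

        with open("wsa_waterchem.txt") as f:    # dataset label + variable descriptions
            print(f.read())

        with open("wsa_waterchem.csv", newline="") as f:
            rows = list(csv.DictReader(f))      # one record per sampled site
        print(rows[0].keys())                   # the variables described in the *.txt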

  4. Nuclear data online

    International Nuclear Information System (INIS)

    McLane, V.

    1997-01-01

    The National Nuclear Data Center (NNDC) Online Data Service, available since 1986, is continually being upgraded and expanded. Most files are now available for access through the World Wide Web. Bibliographic, experimental, and evaluated data files are available containing information on neutron, charged-particle, and photon-induced nuclear reaction data, as well as nuclear decay and nuclear structure information. An effort is being made through the world-wide Nuclear Reaction Data Centers collaboration to make the charged-particle reaction data libraries as complete as possible. The data may be downloaded, or viewed as plots or as tabulated data. A variety of output formats are available for most files

  5. A brief overview of the European Fusion File (EFF) Project

    International Nuclear Information System (INIS)

    Kellett, M.A.; Forrest, R.A.; Batistoni, P.

    2004-01-01

    The European Fusion File (EFF) Project is a collaborative project with work funded by the European Fusion Development Agreement (EFDA). The emphasis is on the pooling of resources and removal of duplication of effort, leading to the efficient development of two types of nuclear data libraries for use in fusion power plant design and operation studies. The two branches consist of, on the one hand, a general purpose file for modelling and design capabilities and, on the other, an activation file for the calculation and simulation of dose rates and energy release during operation of a future power plant. Efforts are directed towards a continued improvement of the quality of the nuclear data needed for these analyses. The OECD Nuclear Energy Agency's Data Bank acts as the central repository for the files and all information discussed during twice-yearly meetings. It offers its services at no charge to the Project. (author)

  6. File: International bilateral relations

    International Nuclear Information System (INIS)

    Feltin, Ch.; Rabouhams, J.; Bravo, X.; Rousseau, M.; Le Breton, S.; Saint Raymond, Ph.; Brigaud, O.; Pertuis, V.; McNair, J.; Sayers, M.R.; Bye, R.; Scherrer, J.

    1998-01-01

    Since its creation in 1973, the Safety Authority has been assigned missions in the international field with the following objectives: to develop information exchanges with its foreign counterparts; to make known and explain the French approach and practice; and to give the countries concerned useful information on French nuclear facilities situated near the border. This file shows, through some examples, how bilateral relations fulfil these objectives and how the French Authority benefits from foreign experience. (N.C.)

  7. Non-POSIX File System for LHCb Online Event Handling

    CERN Document Server

    Garnier, J C; Cherukuwada, S S

    2011-01-01

    LHCb aims to use its O(20000) CPU cores in the high level trigger (HLT) and its 120 TB Online storage system for data reprocessing during LHC shutdown periods. These periods can last a few days for technical maintenance or only a few hours during beam interfill gaps. These jobs run on files which are staged in from tape storage to the local storage buffer. The result is again one or more files. Efficient file writing and reading is essential for the performance of the system. Rather than using a traditional shared file system such as NFS or CIFS, we have implemented a custom, light-weight, non-POSIX network file system for the handling of these files. Streaming this file system for data access achieves high performance while keeping resource consumption low, and adds features not found in NFS such as high availability and transparent fail-over of the read and write service. The writing part of this streaming service is in successful use for the Online, real-time writing of the d...

  8. PFS: a distributed and customizable file system

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    In this paper we present our ongoing work on the Pegasus File System (PFS), a distributed and customizable file system that can be used for off-line file system experiments and on-line file system storage. PFS is best described as an object-oriented component library from which either a true file

  9. 78 FR 75554 - Combined Notice of Filings

    Science.gov (United States)

    2013-12-12

    ...-000. Applicants: Young Gas Storage Company, Ltd. Description: Young Fuel Reimbursement Filing to be.... Protests may be considered, but intervention is necessary to become a party to the proceeding. eFiling is... qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf . For...

  10. 12 CFR 5.4 - Filing required.

    Science.gov (United States)

    2010-01-01

    ... CORPORATE ACTIVITIES Rules of General Applicability § 5.4 Filing required. (a) Filing. A depository institution shall file an application or notice with the OCC to engage in corporate activities and... advise an applicant through a pre-filing communication to send the filing or submission directly to the...

  11. Storage Manager and File Transfer Web Services

    International Nuclear Information System (INIS)

    William A Watson III; Ying Chen; Jie Chen; Walt Akers

    2002-01-01

    Web services are emerging as an interesting mechanism for a wide range of grid services, particularly those focused upon information services and control. When coupled with efficient data transfer services, they provide a powerful mechanism for building a flexible, open, extensible data grid for science applications. In this paper we present our prototype work on a Java Storage Resource Manager (JSRM) web service and a Java Reliable File Transfer (JRFT) web service. A Java client (Grid File Manager) built on top of JSRM is developed to demonstrate the capabilities of these web services. The purpose of this work is to show the extent to which SOAP-based web services are an appropriate direction for building a grid-wide data management system, and eventually grid-based portals

  12. The Galley Parallel File System

    Science.gov (United States)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.

  13. Nuclear data and related services

    International Nuclear Information System (INIS)

    Tuli, J.K.

    1985-01-01

    National Nuclear Data Center (NNDC) maintains a number of data bases containing bibliographic information and evaluated as well as experimental nuclear properties. An evaluated computer file maintained by the NNDC, called the Evaluated Nuclear Structure Data File (ENSDF), contains nuclear structure information for all known nuclides. The ENSDF is the source for the journal Nuclear Data Sheets which is produced and edited by NNDC. The Evaluated Nuclear Data File (ENDF), on the other hand is designed for storage and retrieval of such evaluated nuclear data as are used in neutronic, photonic, and decay heat calculations in a large variety of applications. The NNDC maintains three bibliographic files: NSR - for nuclear structure and decay data related references, CINDA - a bibliographic file for neutron induced reactions, and CPBIB - for charged particle reactions. Selected retrievals from evaluated data and bibliographic files are possible on-line or on request from NNDC

  14. Duplicate Record Elimination in Large Data Files.

    Science.gov (United States)

    1981-08-01

    COMPUTER SCIENCES DEPARTMENT, University of Wisconsin ... we propose a combinatorial model for use in the analysis of algorithms for duplicate elimination. We contend that this model can serve as a ... duplicates in a multiset of records, knowing the size of the multiset and the number of distinct records in it. 3. Algorithms for Duplicate Elimination
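
    A minimal hash-based duplicate-elimination routine, one of the strategies such analyses cover alongside sort-based methods, looks like this:

        # Keep the first occurrence of each record in a multiset.
        def eliminate_duplicates(records):
            seen, distinct = set(), []
            for record in records:
                if record not in seen:
                    seen.add(record)
                    distinct.append(record)
            return distinct

        multiset = ["a", "b", "a", "c", "b", "a"]       # size 6, 3 distinct
        print(eliminate_duplicates(multiset))           # -> ['a', 'b', 'c']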

  15. Geographic data: Zip Codes (Shape File)

    Data.gov (United States)

    Montgomery County of Maryland — This dataset contains all zip codes in Montgomery County. Zip codes are the postal delivery areas defined by USPS. Zip codes with mailboxes only are not included. As...

  16. Biomass Data | Geospatial Data Science | NREL

    Science.gov (United States)

    These datasets detail the biomass resources available in the United States. Coverage files (format, size, last updated, metadata): Biomethane: Zip, 72.2 MB, 10/30/2014, Biomethane.xml; Solid Biomass: Zip, 69.5

  17. A brief overview of the European Fusion File (EFF) project

    International Nuclear Information System (INIS)

    Kellett, M.A.; Forrest, R.A.; Batistoni, P.

    2003-01-01

    The European Fusion File (EFF) Project is a collaborative project with work funded by the European Fusion Development Agreement (EFDA). The emphasis is on the pooling of resources and removal of duplication of effort, leading to the efficient development of two types of nuclear data libraries for use in fusion power plant design and operation studies. The two branches consist of, on the one hand, a transport file for modelling and design capabilities and, on the other, an activation file for the calculation and simulation of dose rates and energy release during operation of a future power plant. The OECD Nuclear Energy Agency's Data Bank acts as the central repository for the files and all information discussed during twice-yearly meetings. It offers its services at no charge to the Project. (author)

  18. [PVFS 2000: An operational parallel file system for Beowulf

    Science.gov (United States)

    Ligon, Walt

    2004-01-01

    The approach has been to develop the Parallel Virtual File System version 2 (PVFS2), retaining the basic philosophy of the original file system but completely rewriting the code. The architecture consists of server and client components. BMI is the network abstraction layer; it is designed with a common driver and modules for each protocol supported, its interface is non-blocking, and it provides mechanisms for optimizations including pinning user buffers. Currently TCP/IP and GM (Myrinet) modules have been implemented. Trove is the storage abstraction layer; it provides for storing both data spaces and name/value pairs. Trove can be implemented using different underlying storage mechanisms, including native files, raw disk partitions, SQL and other databases. The current implementation uses native files for data spaces and Berkeley DB for name/value pairs.
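
    The two Trove notions named above, byte-stream data spaces and name/value pairs, can be pictured with the toy model below (plain Python objects standing in for native files and Berkeley DB).

        # Toy data space: a byte stream plus a name/value keyval store.
        class DataSpace:
            def __init__(self):
                self.stream = bytearray()   # file contents
                self.keyval = {}            # metadata, directory entries, ...

            def write(self, offset, data):
                end = offset + len(data)
                if end > len(self.stream):                   # grow the stream
                    self.stream.extend(b"\0" * (end - len(self.stream)))
                self.stream[offset:end] = data

        ds = DataSpace()
        ds.write(0, b"hello")
        ds.keyval["owner"] = "pvfs"         # metadata travels beside the stream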

  19. A brief overview of the European Fusion File (EFF) project

    International Nuclear Information System (INIS)

    Kellett, M.A.

    2002-01-01

    The European Fusion File (EFF) Project is a collaborative project with work funded by the European Fusion Development Agreement (EFDA). The emphasis is on the pooling of resources and removal of duplication of effort, leading to the efficient development of two types of nuclear data libraries for use in fusion reactor design and operation work. The two branches consist of, on the one hand, a transport file for modelling and design capabilities and, on the other, an activation file for the calculation and simulation of dose rates and energy release during operation of a future reactor. The OECD Nuclear Energy Agency's Data Bank acts as the central repository for the files and all information discussed during the twice-yearly meetings, which it hosts, offering its services at no charge to the Project. (author)

  20. A model for optimizing file access patterns using spatio-temporal parallelism

    Energy Technology Data Exchange (ETDEWEB)

    Boonthanome, Nouanesengsy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Patchett, John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Geveci, Berk [Kitware Inc., Clifton Park, NY (United States); Ahrens, James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bauer, Andy [Kitware Inc., Clifton Park, NY (United States); Chaudhary, Aashish [Kitware Inc., Clifton Park, NY (United States); Miller, Ross G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shipman, Galen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-01-01

    For many years now, I/O read time has been recognized as the primary bottleneck for parallel visualization and analysis of large-scale data. In this paper, we introduce a model that can estimate the read time for a file stored in a parallel filesystem when given the file access pattern. Read times ultimately depend on how the file is stored and the access pattern used to read the file. The file access pattern will be dictated by the type of parallel decomposition used. We employ spatio-temporal parallelism, which combines both spatial and temporal parallelism, to provide greater flexibility to possible file access patterns. Using our model, we were able to configure the spatio-temporal parallelism to design optimized read access patterns that resulted in a speedup factor of approximately 400 over traditional file access patterns.
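
    The kind of estimate described above can be pictured with the sketch below; the latency and bandwidth coefficients and the take-the-slowest-process assumption are illustrative stand-ins, not the authors' calibrated model.

        # Toy read-time estimate: cost each contiguous request with a simple
        # latency + size/bandwidth model; concurrent processes overlap, so the
        # total is the slowest process rather than the sum.
        def estimated_read_time(pattern, latency_s=0.01, bandwidth_Bps=500e6):
            per_process = [sum(latency_s + size / bandwidth_Bps for size in reqs)
                           for reqs in pattern.values()]
            return max(per_process)

        # Few large contiguous reads beat many small ones for the same bytes.
        contiguous = {0: [256 * 2**20]}      # one 256 MiB request
        scattered = {0: [2**20] * 256}       # 256 requests of 1 MiB
        print(estimated_read_time(contiguous), estimated_read_time(scattered))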