WorldWideScience

Sample records for sample input files

  1. Documentation of CATHENA input files for the APOLLO computer

    International Nuclear Information System (INIS)

    1988-06-01

    Input files created for the VAX version of the CATHENA two-fluid code have been modified and documented for simulation on the AECB's APOLLO computer system. The input files describe the RD-14 thermalhydraulic loop, the RD-14 steam generator, the RD-12 steam generator blowdown test facility, the Stern Laboratories Cold Water Injection Facility (CWIT), and a CANDU 600 reactor. Sample CATHENA predictions are given and compared with experimental results where applicable. 24 refs

  2. Incorporating uncertainty in RADTRAN 6.0 input files.

    Energy Technology Data Exchange (ETDEWEB)

    Dennis, Matthew L.; Weiner, Ruth F.; Heames, Terence John (Alion Science and Technology)

    2010-02-01

Uncertainty may be introduced into RADTRAN analyses by assigning distributions to input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the shape, minimum, and maximum of each parameter's distribution, to sample from the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file with a text editor before using the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required to incorporate uncertainty into RADTRAN. Gauntt and Erickson (2004) provide installation instructions as well as a description and user guide for the uncertainty engine.
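The workflow the abstract describes (choose a distribution per parameter, sample it, emit a batch file) can be sketched in Python. The parameter names and the TRIAL line layout below are invented for illustration and are not RADTRAN syntax; parameters are sampled independently, matching the note that coupling is not supported.

```python
import random

# Hypothetical parameters as (name, minimum, maximum), sampled uniformly.
# The real uncertainty module also supports other distribution shapes.
PARAMS = [("WIND_SPEED", 1.0, 10.0), ("RELEASE_FRACTION", 0.001, 0.1)]

def write_batch(n_trials, path, seed=0):
    """Sample each parameter independently and write one batch line per trial."""
    rng = random.Random(seed)
    lines = []
    for trial in range(n_trials):
        values = {name: rng.uniform(lo, hi) for name, lo, hi in PARAMS}
        fields = " ".join(f"{k}={v:.4g}" for k, v in values.items())
        lines.append(f"TRIAL {trial + 1} {fields}")
    with open(path, "w") as fh:
        fh.write("\n".join(lines) + "\n")
    return lines
```

Fixing the seed keeps a study reproducible, so a batch can be regenerated exactly when an analysis is revisited.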

  3. Pre-processing of input files for the AZTRAN code

    International Nuclear Information System (INIS)

    Vargas E, S.; Ibarra, G.

    2017-09-01

The AZTRAN code began to be developed in the Nuclear Engineering Department of the Escuela Superior de Fisica y Matematicas (ESFM) of the Instituto Politecnico Nacional (IPN) with the purpose of numerically solving various models arising from the physics and engineering of nuclear reactors. The code is still under development and is part of the AZTLAN platform: Development of a Mexican platform for the analysis and design of nuclear reactors. Because generating an input file for the code is complex, a script written in the D language was developed to make preparation easier. It is based on a new input file format whose specific cards are divided into two blocks, mandatory cards and optional cards. The script also pre-processes the input file to identify possible errors within it, and includes an image generator for the specific problem, based on the Python interpreter. (Author)
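The mandatory/optional card check described above can be sketched briefly; the card names here are invented placeholders, since the abstract does not list the real AZTRAN cards.

```python
# Hypothetical card names; the real AZTRAN format defines its own set.
MANDATORY = {"GEOMETRY", "MATERIALS", "QUADRATURE"}
OPTIONAL = {"OUTPUT", "PLOT"}

def check_cards(text):
    """Return error messages for missing mandatory or unrecognized cards."""
    seen = {line.split()[0].upper() for line in text.splitlines()
            if line.strip() and not line.lstrip().startswith("#")}
    errors = [f"missing mandatory card: {c}" for c in sorted(MANDATORY - seen)]
    errors += [f"unknown card: {c}" for c in sorted(seen - MANDATORY - OPTIONAL)]
    return errors
```

Reporting all problems in one pass, rather than stopping at the first, lets a user fix an input file in a single edit cycle.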

  4. Development and validation of gui based input file generation code for relap

    International Nuclear Information System (INIS)

    Anwar, M.M.; Khan, A.A.; Chughati, I.R.; Chaudri, K.S.; Inyat, M.H.; Hayat, T.

    2009-01-01

Reactor Excursion and Leak Analysis Program (RELAP) is a widely accepted computer code for thermal-hydraulics modeling of nuclear power plants. It calculates thermal-hydraulic transients in water-cooled nuclear reactors by solving approximations to the one-dimensional, two-phase equations of hydraulics in an arbitrarily connected system of nodes. However, preparing the input file and analyzing the results of this code is a tedious task. A Graphical User Interface (GUI) for preparing the RELAP-5 input file was therefore developed, and the input files it generates were validated. The GUI is developed in Microsoft Visual Studio using Visual C Sharp (C#) as the programming language. The nodalization diagram is drawn graphically, and the program contains various component forms, along with a starting data form, which are launched to assign properties and generate the input file cards. The GUI provides an Open/Save function to store and recall the nodalization diagram along with the components' properties. The GUI-generated input file was validated for several case studies, and individual component cards were compared with the originally required format. The generated input file was found consistent with the requirements of RELAP. The GUI provides a useful platform for efficiently simulating complex hydrodynamic problems with RELAP. (author)

  5. Optimizing Input/Output Using Adaptive File System Policies

    Science.gov (United States)

    Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.

    1996-01-01

    Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.

  6. Auto Draw from Excel Input Files

    Science.gov (United States)

    Strauss, Karl F.; Goullioud, Renaud; Cox, Brian; Grimes, James M.

    2011-01-01

    The design process often involves the use of Excel files during project development. To facilitate communications of the information in the Excel files, drawings are often generated. During the design process, the Excel files are updated often to reflect new input. The problem is that the drawings often lag the updates, often leading to confusion of the current state of the design. The use of this program allows visualization of complex data in a format that is more easily understandable than pages of numbers. Because the graphical output can be updated automatically, the manual labor of diagram drawing can be eliminated. The more frequent update of system diagrams can reduce confusion and reduce errors and is likely to uncover symmetric problems earlier in the design cycle, thus reducing rework and redesign.

  7. KENO2MCNP, Version 5L, Conversion of Input Data between KENOV.a and MCNP File Formats

    International Nuclear Information System (INIS)

    2008-01-01

1 - Description of program or function: The KENO2MCNP program was written to convert KENO V.a input files to MCNP format. It currently works only with KENO V.a geometries and will not work with geometries that contain more than a single array. A C++ graphical user interface was created and linked to Fortran routines from KENO V.a that read the material library, and to Fortran routines from the MCNP Visual Editor that generate the MCNP input file. Either SCALE 5.0 or SCALE 5.1 cross-section files will work with this release. 2 - Methods: The C++ binary executable reads the KENO V.a input file, the KENO V.a material library, and the SCALE data libraries. When an input file is read in, the input is stored in memory. The converter loads the different sections of the input file into memory, including parameters, composition, geometry information, array information, and starting information. Many of the KENO V.a materials represent compositions that must be read from the KENO V.a material library; KENO2MCNP includes the KENO V.a Fortran routines used to read this material file when creating the MCNP materials. Once the file has been read in, the user must select 'Convert' to convert the file from KENO V.a to MCNP. This generates the MCNP input file along with an output window that lists the KENO V.a composition information for the materials contained in the KENO V.a input file. The program can be run interactively by clicking on the executable, or in batch mode from the command prompt. 3 - Restrictions on the complexity of the problem: Not all KENO V.a input files are supported. Only one array is allowed in the input file. Some of the more complex material descriptions also may not be converted.
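The first conversion step the abstract describes, loading the deck's sections into memory, can be sketched with a block splitter. KENO V.a decks delimit sections with READ/END keyword pairs; the regular-expression parsing below is a simplification for illustration, not the actual KENO2MCNP reader.

```python
import re

def split_keno_blocks(text):
    """Group a KENO-style deck into named blocks delimited by
    'READ <name> ... END <name>' pairs, keyed by lowercase block name."""
    blocks = {}
    # \1 backreference requires the END keyword to repeat the block name.
    for match in re.finditer(r"READ\s+(\w+)(.*?)END\s+\1",
                             text, re.DOTALL | re.IGNORECASE):
        blocks[match.group(1).lower()] = match.group(2).strip()
    return blocks
```

With sections isolated this way, a converter can translate each block (parameters, compositions, geometry, arrays) into the target format independently.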

  8. FragIt: a tool to prepare input files for fragment based quantum chemical calculations.

    Directory of Open Access Journals (Sweden)

    Casper Steinmann

Near-linear-scaling fragment-based quantum chemical calculations are becoming increasingly popular for treating large systems with high accuracy and are an active field of research. However, it remains difficult to set up these calculations without expert knowledge. To facilitate the use of such methods, software tools are needed that support them and help set up reasonable input files, lowering the barrier of entry for non-experts. Previous tools rely on specific annotations in structure files, such as residues in PDB files, for automatic and successful fragmentation. We present a general fragmentation methodology and an accompanying tool, FragIt, to help set up these calculations. FragIt uses the SMARTS language to locate chemically appropriate fragments in large structures and is applicable to the fragmentation of any molecular system given suitable SMARTS patterns. We present SMARTS fragmentation patterns for proteins, DNA, and polysaccharides, specifically for D-galactopyranose for use in cyclodextrins. FragIt is used to prepare input files for the Fragment Molecular Orbital method in the GAMESS program package, but can easily be extended to other computational methods.

  9. Pre-processing of input files for the AZTRAN code; Pre procesamiento de archivos de entrada para el codigo AZTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Vargas E, S. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Ibarra, G., E-mail: samuel.vargas@inin.gob.mx [IPN, Av. Instituto Politecnico Nacional s/n, 07738 Ciudad de Mexico (Mexico)

    2017-09-15

    The AZTRAN code began to be developed in the Nuclear Engineering Department of the Escuela Superior de Fisica y Matematicas (ESFM) of the Instituto Politecnico Nacional (IPN) with the purpose of numerically solving various models arising from the physics and engineering of nuclear reactors. The code is still under development and is part of the AZTLAN platform: Development of a Mexican platform for the analysis and design of nuclear reactors. Due to the complexity to generate an input file for the code, a script based on D language is developed, with the purpose of making its elaboration easier, based on a new input file format which includes specific cards, which have been divided into two blocks, mandatory cards and optional cards, including a pre-processing of the input file to identify possible errors within it, as well as an image generator for the specific problem based on the python interpreter. (Author)

  10. PLEXOS Input Data Generator

    Energy Technology Data Exchange (ETDEWEB)

    2017-02-01

The PLEXOS Input Data Generator (PIDG) is a tool that enables PLEXOS users to better version their data, automate data processing, collaborate in developing inputs, and transfer data between different production cost modeling and other power systems analysis software. PIDG can process data in a generalized format from multiple input sources, including CSV files, PostgreSQL databases, and PSS/E .raw files, and write it to an Excel file that can be imported into PLEXOS with only limited manual intervention.

  11. Input Files and Procedures for Analysis of SMA Hybrid Composite Beams in MSC.Nastran and ABAQUS

    Science.gov (United States)

    Turner, Travis L.; Patel, Hemant D.

    2005-01-01

    A thermoelastic constitutive model for shape memory alloys (SMAs) and SMA hybrid composites (SMAHCs) was recently implemented in the commercial codes MSC.Nastran and ABAQUS. The model is implemented and supported within the core of the commercial codes, so no user subroutines or external calculations are necessary. The model and resulting structural analysis has been previously demonstrated and experimentally verified for thermoelastic, vibration and acoustic, and structural shape control applications. The commercial implementations are described in related documents cited in the references, where various results are also shown that validate the commercial implementations relative to a research code. This paper is a companion to those documents in that it provides additional detail on the actual input files and solution procedures and serves as a repository for ASCII text versions of the input files necessary for duplication of the available results.

  12. ColloInputGenerator

    DEFF Research Database (Denmark)

    2013-01-01

    This is a very simple program to help you put together input files for use in Gries' (2007) R-based collostruction analysis program. It basically puts together a text file with a frequency list of lexemes in the construction and inserts a column where you can add the corpus frequencies. It requires...... it as input for basic collexeme collostructional analysis (Stefanowitsch & Gries 2003) in Gries' (2007) program. ColloInputGenerator is, in its current state, based on programming commands introduced in Gries (2009). Projected updates: Generation of complete work-ready frequency lists....
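The core of such a generator, a frequency list of lexemes with an empty column for corpus frequencies to be filled in later, can be sketched with collections.Counter. The tab-separated layout and column names are illustrative assumptions, not the program's documented format.

```python
from collections import Counter

def collo_input(lexemes):
    """Build a tab-separated frequency list with an empty third column
    where the analyst later inserts corpus frequencies."""
    counts = Counter(lexemes)
    header = "LEXEME\tFREQ_IN_CONSTRUCTION\tFREQ_IN_CORPUS"
    rows = [f"{lex}\t{n}\t" for lex, n in counts.most_common()]
    return "\n".join([header] + rows)
```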

  13. DOE-2 sample run book: Version 2.1E

    Energy Technology Data Exchange (ETDEWEB)

    Winkelmann, F.C.; Birdsall, B.E.; Buhl, W.F.; Ellington, K.L.; Erdem, A.E. [Lawrence Berkeley Lab., CA (United States); Hirsch, J.J.; Gates, S. [Hirsch (James J.) and Associates, Camarillo, CA (United States)

    1993-11-01

The DOE-2 Sample Run Book shows inputs and outputs for a variety of building and system types. The samples start with a simple structure and continue to a high-rise office building, a medical building, three small office buildings, a bar/lounge, a single-family residence, a small office building with daylighting, a single-family residence with an attached sunspace, a "parameterized" building using input macros, and a metric input/output example. All of the samples use Chicago TRY weather. The main purpose of the Sample Run Book is instructional. It shows the relationship of LOADS-SYSTEMS-PLANT-ECONOMICS inputs, displays various input styles, and illustrates many of the basic and advanced features of the program. Many of the sample runs are preceded by a sketch of the building showing its general appearance and the zoning used in the input. In some cases we also show a 3-D rendering of the building as produced by the program DrawBDL. Descriptive material has been added as comments in the input itself. We find that a number of users have loaded these samples onto their editing systems and use them as "templates" for creating new inputs. Another way of using them would be to store various portions as files that can be read into the input using the ##include command, which is part of the Input Macro feature introduced in version DOE-2.1D. Note that the energy rate structures here are the same as in the DOE-2.1D samples, but have been rewritten using the new DOE-2.1E commands and keywords for ECONOMICS. The samples contained in this report are the same as those found on the DOE-2 release files. However, the output numbers that appear here may differ slightly from those obtained from the release files. The output on the release files can be used as a check set to compare results on your computer.
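The include mechanism mentioned above, pulling stored file portions into an input with an ##include line, can be illustrated with a minimal expander. This is a simplification for illustration only; DOE-2's Input Macro feature is richer than a literal textual include.

```python
def expand_includes(name, files):
    """Recursively expand lines of the form '##include NAME' using the
    mapping `files` (name -> file text). Simplified relative to DOE-2 macros."""
    out = []
    for line in files[name].splitlines():
        stripped = line.strip()
        if stripped.startswith("##include"):
            included = stripped.split(None, 1)[1]
            out.append(expand_includes(included, files))  # nested includes allowed
        else:
            out.append(line)
    return "\n".join(out)
```

Factoring shared portions (zones, schedules, rate structures) into included files is exactly what makes the samples reusable as templates.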

  14. Generation of SCALE 6 Input Data File for Cross Section Library of PWR Spent Fuel

    International Nuclear Information System (INIS)

    Jeong, Chang Joon; Cho, Dong Keun

    2010-11-01

In order to obtain cross-section libraries for Korean pressurized water reactor (PWR) spent fuel (SF), SCALE 6 code input files have been generated. The PWR fuel data were obtained from the nuclear design reports (NDR) of the currently operating PWRs. Input files were prepared for 16 fuel types: 4 types of Westinghouse 14x14, 3 types of OPR-1000 16x16, 4 types of Westinghouse 16x16, and 6 types of Westinghouse 17x17. For each fuel type, 5 fuel enrichments were considered: 1.5, 2.0, 3.0, 4.0, and 5.0 wt%. In the SCALE 6 calculations, an ENDF-V 44-group library was used, with 25 burnup steps up to 72,000 MWD/T. A 1/4 symmetry model was used for the 16x16 and 17x17 fuel assemblies, and a 1/2 symmetry model for the 14x14 fuel assembly. The generated cross-section libraries will be used for the source-term analysis of the PWR SF.
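Generating one input file per fuel-type/enrichment combination is a simple Cartesian product, sketched below with a representative subset of type labels (the labels and file-name scheme are invented for illustration).

```python
from itertools import product

FUEL_TYPES = ["W14x14", "OPR1000-16x16", "W16x16", "W17x17"]  # illustrative labels
ENRICHMENTS = [1.5, 2.0, 3.0, 4.0, 5.0]  # wt% U-235

def case_names():
    """Enumerate one input-file name per fuel-type/enrichment combination."""
    return [f"{ft}_e{e:.1f}.inp" for ft, e in product(FUEL_TYPES, ENRICHMENTS)]
```

Enumerating the matrix programmatically guarantees no combination is skipped and keeps the naming uniform across the whole library.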

  15. Generation of Gaussian 09 Input Files for the Computation of 1H and 13C NMR Chemical Shifts of Structures from a Spartan’14 Conformational Search

    OpenAIRE

    sprotocols

    2014-01-01

Authors: Spencer Reisbick & Patrick Willoughby. Abstract: This protocol describes an approach to preparing a series of Gaussian 09 computational input files for an ensemble of conformers generated in Spartan’14. The resulting input files are necessary for computing optimum geometries, relative conformer energies, and NMR shielding tensors using Gaussian. Using the conformational search feature within Spartan’14, an ensemble of conformational isomers was obtained. To convert the str...

  16. Input data for inferring species distributions in Kyphosidae world-wide

    Directory of Open Access Journals (Sweden)

    Steen Wilhelm Knudsen

    2016-09-01

Input data files for inferring the relationships within the family Kyphosidae, as presented in Knudsen and Clements (2016) [1], are provided here together with the resulting topologies, to allow the reader to explore the topologies in detail. The input data files comprise seven nexus files with sequence alignments of mtDNA and nDNA markers for performing Bayesian analysis. A matrix of recoded character states inferred from the morphology of museum specimens representing Dichistiidae, Girellidae, Kyphosidae, Microcanthidae and Scorpididae is also provided, and can be used in a parsimony analysis to infer the relationships among these perciform families. The nucleotide input data files include both multiple and single representatives of the various species to allow inference of the relationships among the species in Kyphosidae and between the families closely related to Kyphosidae. The ‘.xml’ files with various constrained relationships among the families potentially closely related to Kyphosidae are also provided to allow the reader to rerun and explore the results of the stepping-stone analysis. The resulting topologies are supplied in newick file format together with the input data files for Bayesian analysis and the ‘.xml’ files. Re-running the input data files in the appropriate software will enable the reader to examine the log files and tree files themselves. Keywords: Sea chub, Drummer, Kyphosus, Scorpis, Girella

  17. ORIGNATE: PC input processor for ORIGEN-S

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1992-01-01

ORIGNATE is a personal computer program that serves as a user-friendly interface for the ORIGEN-S isotopic generation and depletion code. It is designed to assist an ORIGEN-S user in preparing an input file for execution of light-water-reactor fuel depletion and decay cases. Output from ORIGNATE is a card-image input file that may be uploaded to a mainframe computer to execute ORIGEN-S in SCALE-4. ORIGNATE features a pulldown menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up an ORIGEN-S input file and perform error checking

  18. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    International Nuclear Information System (INIS)

Schwarz, Randolph A.; Carter, Leland L.; Schwarz, Alysia L.

    2006-01-01

KENO is a Monte Carlo criticality code that is maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but can also be used for criticality calculations. MCNP is becoming increasingly more popular for performing production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross section database. This manual conversion of input files between the two codes is labor intensive. The industry needs the capability of converting geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package will aid the user in converting between the codes. It is not intended to be used as a "black box". The resulting input file will need to be carefully inspected by criticality safety personnel to verify the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards.

  19. Identification of continuous-time systems from samples of input ...

    Indian Academy of Sciences (India)

Abstract. This paper presents an introductory survey of the methods that have been developed for identification of continuous-time systems from samples of input–output data. The two basic approaches may be described as (i) the indirect method, where first a discrete-time model is estimated from the sampled data and then ...

ORIGINATE: PC input processor for ORIGEN-S

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1994-01-01

ORIGINATE is a personal computer program developed at Oak Ridge National Laboratory to serve as a user-friendly interface for the ORIGEN-S isotopic generation and depletion code. It is designed to assist an ORIGEN-S user in preparing an input file for execution of light-water-reactor fuel depletion and decay cases. Output from ORIGINATE is a card-image input file that may be uploaded to a mainframe computer to execute ORIGEN-S in SCALE-4. ORIGINATE features a pull-down menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up an ORIGEN-S input file and perform error checking. This capability increases productivity and decreases the chance of user error. (authors). 6 refs., 3 tabs

  1. Input and execution

    International Nuclear Information System (INIS)

    Carr, S.; Lane, G.; Rowling, G.

    1986-11-01

    This document describes the input procedures, input data files and operating instructions for the SYVAC A/C 1.03 computer program. SYVAC A/C 1.03 simulates the groundwater mediated movement of radionuclides from underground facilities for the disposal of low and intermediate level wastes to the accessible environment, and provides an estimate of the subsequent radiological risk to man. (author)

  2. OFFSCALE: PC input processor for SCALE-4 criticality sequences

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1991-01-01

    OFFSCALE is a personal computer program that serves as a user-friendly interface for the Criticality Safety Analysis Sequences (CSAS) available in SCALE-4. It is designed to assist a SCALE-4 user in preparing an input file for execution of criticality safety problems. Output from OFFSCALE is a card-image input file that may be uploaded to a mainframe computer to execute the CSAS4 control module in SCALE-4. OFFSCALE features a pulldown menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up a CSAS4 input file and perform data checking

  3. Text File Comparator

    Science.gov (United States)

    Kotler, R. S.

    1983-01-01

The file comparator program IFCOMP is a text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts two text files as input and produces a listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.
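The same kind of difference listing can be produced today with Python's standard difflib module; the unified format below is a modern stand-in for IFCOMP's pseudo-update form, not a reproduction of it.

```python
import difflib

def compare_files(old_lines, new_lines):
    """Return a unified-diff listing of the differences between two texts,
    similar in spirit to IFCOMP's pseudo-update listing."""
    return list(difflib.unified_diff(old_lines, new_lines,
                                     fromfile="old", tofile="new",
                                     lineterm=""))
```

Lines prefixed with '-' were removed and lines prefixed with '+' were added, which is enough to monitor source-level changes between two revisions.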

  4. Development of NUPREP PC Version and Input Structures for NUCIRC Single Channel Analyses

    International Nuclear Information System (INIS)

    Yoon, Churl; Jun, Ji Su; Park, Joo Hwan

    2007-12-01

The input file for the steady-state thermal-hydraulic code NUCIRC consists of common channel input data and specific channel input data in the case of a single channel analysis. Even when all the data are ready for single channel analyses of the 380 channels, composing an input file by hand-editing takes a long time and requires enormous effort. The automatic pre-processor for this tedious job is the NUPREP code. In this study, a NUPREP PC version has been developed from the source list in the program manual of NUCIRC-MOD2.000, which is distributed as an executable file. In the process, some errors found in PC execution, as well as lost statements, were fixed. It is confirmed that the developed NUPREP code correctly produces input files for the CANDU-6 single channel analysis. Additionally, the NUCIRC input structure and data format are summarized for a single channel analysis, and the input CARDs required for the creep information of aged channels are listed

  5. Development of NUPREP PC Version and Input Structures for NUCIRC Single Channel Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Churl; Jun, Ji Su; Park, Joo Hwan

    2007-12-15

The input file for the steady-state thermal-hydraulic code NUCIRC consists of common channel input data and specific channel input data in the case of a single channel analysis. Even when all the data are ready for single channel analyses of the 380 channels, composing an input file by hand-editing takes a long time and requires enormous effort. The automatic pre-processor for this tedious job is the NUPREP code. In this study, a NUPREP PC version has been developed from the source list in the program manual of NUCIRC-MOD2.000, which is distributed as an executable file. In the process, some errors found in PC execution, as well as lost statements, were fixed. It is confirmed that the developed NUPREP code correctly produces input files for the CANDU-6 single channel analysis. Additionally, the NUCIRC input structure and data format are summarized for a single channel analysis, and the input CARDs required for the creep information of aged channels are listed.

  6. WORM: A general-purpose input deck specification language

    International Nuclear Information System (INIS)

    Jones, T.

    1999-01-01

Using computer codes to perform criticality safety calculations has become common practice in the industry. The vast majority of these codes use simple text-based input decks to represent the geometry, materials, and other parameters that describe the problem. However, the data specified in input files are usually processed results themselves. For example, input decks tend to require the geometry specification in linear dimensions and materials in atom or weight fractions, while the parameter of interest might be mass or concentration. The calculations needed to convert from the item of interest to the required parameter in the input deck are usually performed separately and then incorporated into the input deck. This process of calculating, editing, and renaming files to perform a simple parameter study is tedious at best. In addition, most computer codes require dimensions to be specified in centimeters, while drawings or other materials used to create the input decks might be in other units. This also requires additional calculation or conversion prior to composition of the input deck. These additional calculations, while extremely simple, introduce a source for error in both the calculations and transcriptions. To overcome these difficulties, WORM (Write One, Run Many) was created. It is an easy-to-use programming language to describe input decks and can be used with any computer code that uses standard text files for input. WORM is available, via the Internet, at worm.lanl.gov. A user's guide, tutorials, example models, and other WORM-related materials are also available at this Web site. Questions regarding WORM should be directed to worm@lanl.gov
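The write-one-run-many idea, a single deck template plus derived quantities (unit conversions, mass from dimensions) expanded over a parameter sweep, can be sketched with Python's string.Template. The deck layout below is invented for illustration and is not WORM's actual syntax.

```python
from string import Template

# Toy deck template: $radius_cm and $mass_g are computed per case,
# mirroring WORM's approach of deriving deck values from the quantity
# of interest (here, a radius given in inches).
DECK = Template("sphere 1 $radius_cm\nmass $mass_g\n")

def write_decks(radii_in_inches, density_g_cc):
    """Expand the template once per radius, converting units and
    computing the sphere mass so the deck gets ready-to-use values."""
    decks = []
    for r_in in radii_in_inches:
        r_cm = r_in * 2.54  # codes typically expect centimeters
        mass = density_g_cc * 4.0 / 3.0 * 3.14159265 * r_cm ** 3
        decks.append(DECK.substitute(radius_cm=f"{r_cm:.3f}",
                                     mass_g=f"{mass:.1f}"))
    return decks
```

Putting the conversions in one place removes the transcription errors the abstract warns about: every generated deck uses the same, tested arithmetic.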

  7. Summary report of the 3. research co-ordination meeting on development of reference input parameter library for nuclear model calculations of nuclear data (Phase 1: Starter File)

    International Nuclear Information System (INIS)

    Oblozinsky, P.

    1997-09-01

The report contains the summary of the third and last Research Co-ordination Meeting on "Development of Reference Input Parameter Library for Nuclear Model Calculations of Nuclear Data (Phase I: Starter File)", held at the ICTP, Trieste, Italy, from 26 to 29 May 1997. Details are given on the status of the Handbook and the Starter File, the two major results of the project. (author)

  8. Tabulation of Fundamental Assembly Heat and Radiation Source Files

    International Nuclear Information System (INIS)

    T. deBues; J.C. Ryman

    2006-01-01

The purpose of this calculation is to tabulate a set of computer files for use as input to the WPLOAD thermal loading software. These files contain details regarding heat and radiation from pressurized water reactor (PWR) assemblies and boiling water reactor (BWR) assemblies. The scope of this calculation is limited to rearranging and reducing the existing file information into a more streamlined set of tables for use as input to WPLOAD. The electronic source-term files used as input to this calculation were generated from the output files of the SAS2H/ORIGEN-S sequence of the SCALE Version 4.3 modular code system, as documented in References 2.1.1 and 2.1.2, and are included in Attachment II

  9. Development of an Input Model to MELCOR 1.8.5 for the Ringhals 3 PWR

    International Nuclear Information System (INIS)

    Nilsson, Lars

    2004-12-01

An input file for the severe accident code MELCOR 1.8.5 has been developed for the Swedish pressurized water reactor Ringhals 3. The aim was to produce a file that can be used for calculations of various postulated severe accident scenarios, although the first application is specifically to cases involving large hydrogen production. The input file is rather detailed, with individual modelling of all three cooling loops. The report describes the basis for the Ringhals 3 model and the input preparation step by step, illustrated by nodalization schemes of the different plant systems. The present version of the report is restricted to the fundamental MELCOR input preparation, and therefore most of the figures for Ringhals 3 measurements and operating parameters are excluded here. These are given in another, complete version of the report, for limited distribution, which includes tables of pertinent data for all components. That version contains appendices with a complete listing of the input files, as well as tables of data compiled from a RELAP5 file that was a major basis for the MELCOR input for the cooling loops. The input was tested in steady-state calculations in order to simulate the initial conditions at the current nominal operating conditions in Ringhals 3 at 2775 MW thermal power. The results of the steady-state calculations are presented in the report. Calculations of certain accident sequences will then be carried out with the MELCOR model for comparison with results from earlier MAAP4 calculations. That work will be reported separately

  10. Environmental sample accounting at the Savannah River Plant

    International Nuclear Information System (INIS)

    Zeigler, C.C.; Wood, M.B.

    1978-01-01

    At the Savannah River Plant Environmental Monitoring Laboratories, a computer-based systematic accounting method was developed to ensure that all scheduled samples are collected, processed through the laboratory, and counted without delay. The system employs an IBM 360/195 computer with a magnetic tape master file, an online disk file, and cathode ray tube (CRT) terminals. Scheduling and accounting are accomplished using computer-generated schedules, bottle labels, and output/input cards. A printed card is issued for the collecting, analyzing, and counting of each scheduled sample. The card also contains information for the personnel who are to perform the work, e.g., sample location, aliquot to be processed, and procedure to be used. Manual entries are made on the card when each step in the process is completed. Additional pertinent data such as the reason a sample is not collected, the need for a nonstandard aliquot, and field measurement results are keypunched and then read into the computer files as required. The computer files are audited daily, and summaries showing samples not processed in pre-established normal schedules are provided. The progress of sample analyses is readily determined at any time using the CRT terminal. Historic data are maintained on magnetic tape, and workload summaries showing the number of samples and number of determinations per month are issued. (author)

  11. Environmental sampling accounting at the Savannah River Plant

    International Nuclear Information System (INIS)

    Zeigler, C.C.; Wood, M.B.

    1978-06-01

    At the Savannah River Plant Environmental Monitoring Laboratories, a computer-based systematic accounting method was developed to ensure that all scheduled samples are collected, processed through the laboratory, and counted without delay. The system employs an IBM 360/195 computer with a magnetic tape master file, an on-line disk file, and cathode ray tube (CRT) terminals. Scheduling and accounting are accomplished by using computer-generated schedules, collection labels, and output/input cards. For each scheduled sample and analysis, a printed card is issued for collection, laboratory analysis, and counting. The cards also contain information needed by personnel performing the jobs, such as sample location, aliquot to be processed, or procedure number. Manual entries are made on the cards when each step in the process is completed. Additional pertinent data are also manually entered on the cards; e.g., entries are made explaining why a sample is not collected, the sample aliquot in the event a nonstandard aliquot is processed, field measurement results, and analytical results. These manually entered data are keypunched and read into the computer files. The computer files are audited daily, and summaries of samples not processed in pre-established normal time intervals are issued. The progress of sample analyses can also be readily determined at any time using the CRT terminal. Historic data are also maintained on magnetic tape and workload summaries are issued showing the number of samples and number of determinations per month
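    The daily audit the two reports describe can be sketched in a few lines. This is a minimal illustrative sketch in modern Python, with invented field names and a hypothetical overdue threshold, not the plant's actual IBM 360/195 implementation:

```python
# Hypothetical sketch of the daily audit step: flag scheduled samples that
# were not collected, or whose analysis has exceeded a pre-established
# normal time interval. Field names and the 7-day limit are illustrative.
from datetime import date, timedelta

def overdue(samples, today, limit_days=7):
    """samples: list of dicts with 'id', 'collected' (date or None), 'counted' (bool)."""
    flagged = []
    for s in samples:
        if s["collected"] is None:
            flagged.append((s["id"], "not collected"))
        elif not s["counted"] and today - s["collected"] > timedelta(days=limit_days):
            flagged.append((s["id"], "analysis overdue"))
    return flagged

today = date(1978, 6, 15)
samples = [
    {"id": "W-101", "collected": date(1978, 6, 14), "counted": False},
    {"id": "W-102", "collected": None, "counted": False},
    {"id": "W-103", "collected": date(1978, 6, 1), "counted": False},
]
report = overdue(samples, today)
```

A daily run over the master file would then print `report` as the summary of samples not processed on schedule.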

  12. Hybrid image and blood sampling input function for quantification of small animal dynamic PET data

    International Nuclear Information System (INIS)

    Shoghi, Kooresh I.; Welch, Michael J.

    2007-01-01

    We describe and validate a hybrid image and blood sampling (HIBS) method to derive the input function for quantification of microPET mice data. The HIBS algorithm derives the peak of the input function from the image, which is corrected for recovery, while the tail is derived from 5 to 6 optimally placed blood sampling points. A Bézier interpolation algorithm is used to link the rightmost image peak data point to the leftmost blood sampling point. To assess the performance of HIBS, 4 mice underwent 60-min microPET imaging sessions following a 0.40-0.50 mCi bolus administration of 18F-FDG. In total, 21 blood samples (blood-sampled plasma time-activity curve, bsPTAC) were obtained throughout the imaging session to compare against the proposed HIBS method. MicroPET images were reconstructed using filtered back projection with a zoom of 2.75 on the heart. Volumetric regions of interest (ROIs) were composed by drawing circular ROIs 3 pixels in diameter on 3-4 transverse planes of the left ventricle. Performance was characterized by kinetic simulations in terms of bias in parameter estimates when bsPTAC and HIBS are used as input functions. The peak of the bsPTAC curve was distorted in comparison to the HIBS-derived curve due to temporal limitations and delay in blood sampling, which affected the rates of bidirectional exchange between plasma and tissue. The results highlight limitations in using bsPTAC. The HIBS method, however, yields consistent results, and is thus a suitable substitute for bsPTAC
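    The joining step the abstract outlines can be sketched as follows. This is an illustrative toy in Python, assuming invented sample values and a single quadratic Bézier bridge; the paper's actual recovery correction and interpolation details are not reproduced here:

```python
# Hypothetical sketch of the HIBS idea: concatenate an image-derived
# input-function peak, a Bezier bridge, and late blood-sampled tail points.
# All time/activity values below are invented for illustration.

def bezier_quadratic(p0, p1, p2, n=50):
    """Sample a quadratic Bezier curve defined by control points p0, p1, p2."""
    pts = []
    for i in range(n + 1):
        t = i / n
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        pts.append((x, y))
    return pts

def hybrid_input_function(image_peak, blood_tail):
    """image_peak, blood_tail: time-sorted lists of (time_min, activity) tuples."""
    last_img, first_blood = image_peak[-1], blood_tail[0]
    # Control point at the time midpoint, held at the blood-tail level, so the
    # bridge decays smoothly from the image peak toward the sampled tail.
    ctrl = ((last_img[0] + first_blood[0]) / 2.0, first_blood[1])
    bridge = bezier_quadratic(last_img, ctrl, first_blood)[1:-1]
    return image_peak + bridge + blood_tail

peak = [(0.0, 0.0), (0.25, 95.0), (0.5, 60.0), (1.0, 30.0)]
tail = [(5.0, 8.0), (10.0, 5.0), (20.0, 3.0), (40.0, 2.0), (60.0, 1.5)]
curve = hybrid_input_function(peak, tail)
```

The resulting `curve` preserves the image-derived peak shape while anchoring the tail to the measured blood samples.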

  13. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-01-06

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
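    The mapping the patent abstract describes, from extracted model objects through a Model View Definition lookup to a translation and transformation function, can be sketched abstractly. Everything below (object layout, MVD map, unit conversion) is invented for illustration, not the patented implementation:

```python
# Hedged sketch of the pipeline: extract typed data objects from an
# architecture model, look up the transform registered for a (type, target)
# pair, and emit converted values for the target simulation format.

def translate(model, mvd_map, target):
    """model: list of dicts with 'type' and 'value'; mvd_map: (type, target) -> transform."""
    out = []
    for obj in model:
        transform = mvd_map.get((obj["type"], target))
        if transform is not None:  # objects the target tool does not use are dropped
            out.append({"type": obj["type"], "value": transform(obj["value"])})
    return out

# Example: a wall dimension stored in millimetres, converted to metres for a
# hypothetical energy-simulation format.
mvd_map = {("IfcWall", "energy_sim"): lambda mm: mm / 1000.0}
model = [{"type": "IfcWall", "value": 4200}, {"type": "IfcDoor", "value": 900}]
result = translate(model, mvd_map, "energy_sim")
```

The real system performs full geometric conversion; the sketch only shows the dispatch structure.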

  14. OFFSCALE: A PC input processor for the SCALE code system. The ORIGNATE processor for ORIGEN-S

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1994-11-01

    OFFSCALE is a suite of personal computer input processor programs developed at Oak Ridge National Laboratory to provide an easy-to-use interface for modules in the SCALE-4 code system. ORIGNATE is a program in the OFFSCALE suite that serves as a user-friendly interface for the ORIGEN-S isotopic generation and depletion code. It is designed to assist an ORIGEN-S user in preparing an input file for execution of light-water-reactor (LWR) fuel depletion and decay cases. ORIGNATE generates an input file that may be used to execute ORIGEN-S in SCALE-4. ORIGNATE features a pulldown menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up an ORIGEN-S input file and perform error checking. This capability increases productivity and decreases the chance of user error

  15. Detecting Malicious Code by Binary File Checking

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2014-01-01

    Full Text Available The object, library, and executable code is stored in binary files. The functionality of a binary file is altered when its content or program source code is changed, causing undesired effects. A direct content change is possible when the intruder knows the structural information of the binary file. The paper describes the structural properties of binary object files, how their content can be controlled by a possible intruder, and what ways exist to identify malicious code in such files. Because object files are inputs to the linking process, early detection of malicious content is crucial to avoid infection of the binary executable files.
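    Two of the simplest checks in this spirit, validating a file's magic bytes and comparing its cryptographic digest against a recorded baseline, can be sketched briefly. The fake ELF bytes and baseline below are invented for illustration:

```python
# Illustrative binary-file check: verify structural magic bytes and detect
# content tampering via a SHA-256 digest recorded when the file was trusted.
import hashlib

ELF_MAGIC = b"\x7fELF"  # first four bytes of an ELF object/executable file

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def check_binary(data: bytes, expected_digest: str) -> list:
    """Return a list of findings; an empty list means both checks passed."""
    findings = []
    if not data.startswith(ELF_MAGIC):
        findings.append("unexpected magic bytes (not an ELF file)")
    if sha256_of(data) != expected_digest:
        findings.append("content digest differs from the recorded baseline")
    return findings

good = ELF_MAGIC + b"\x02\x01\x01" + b"\x00" * 9   # minimal fake ELF header
baseline = sha256_of(good)
tampered = good[:8] + b"\xff" + good[9:]           # one altered byte
```

A digest mismatch flags any direct content change, while the magic-byte check catches grossly malformed substitutes; neither, of course, identifies *which* code is malicious.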

  16. File list: InP.Kid.10.AllAg.Nephrectomy_sample [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.10.AllAg.Nephrectomy_sample hg19 Input control Kidney Nephrectomy sample SR...90,SRX1037589,SRX1037588,SRX1037582 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Kid.10.AllAg.Nephrectomy_sample.bed ...

  17. File list: InP.Kid.50.AllAg.Nephrectomy_sample [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.50.AllAg.Nephrectomy_sample hg19 Input control Kidney Nephrectomy sample SR...84,SRX1037589,SRX1037590,SRX1037583 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Kid.50.AllAg.Nephrectomy_sample.bed ...

  18. File list: InP.Kid.05.AllAg.Nephrectomy_sample [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.05.AllAg.Nephrectomy_sample hg19 Input control Kidney Nephrectomy sample SR...84,SRX1037590,SRX1037588,SRX1037589 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Kid.05.AllAg.Nephrectomy_sample.bed ...

  19. File list: InP.Kid.20.AllAg.Nephrectomy_sample [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.20.AllAg.Nephrectomy_sample hg19 Input control Kidney Nephrectomy sample SR...83,SRX1037590,SRX1037588,SRX1037582 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Kid.20.AllAg.Nephrectomy_sample.bed ...

  20. Development of an integrated filing system for endoscopic images.

    Science.gov (United States)

    Fujino, M A; Ikeda, M; Yamamoto, Y; Kinose, T; Tachikawa, H; Morozumi, A; Sano, S; Kojima, Y; Nakamura, T; Kawai, T

    1991-01-01

    A new integrated filing system for endoscopic images has been developed, comprising a main image filing system and subsystems located at different stations. A hybrid filing system made up of both digital and analog filing devices was introduced to construct this system, combining the merits of the two filing methods. Each subsystem, provided with a video processor, is equipped with a digital filing device, and routine images are recorded in the analog image filing device of the main system. The use of a multi-input adapter enabled simultaneous input of analog images from up to 8 video processors. Recorded magneto-optical disks make it possible to recall the digital images at any station in the hospital; the disks are copied without image degradation and also utilised for image processing. This system promises reliable storage and integrated, efficient management of endoscopic information. It also costs less to install than the so-called PACS (picture archiving and communication system), which connects all the stations of the hospital using optical fiber cables.

  1. OFFSCALE: A PC input processor for the SCALE code system. The CSASIN processor for the criticality sequences

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1994-11-01

    OFFSCALE is a suite of personal computer input processor programs developed at Oak Ridge National Laboratory to provide an easy-to-use interface for modules in the SCALE-4 code system. CSASIN (formerly known as OFFSCALE) is a program in the OFFSCALE suite that serves as a user-friendly interface for the Criticality Safety Analysis Sequences (CSAS) available in SCALE-4. It is designed to assist a SCALE-4 user in preparing an input file for execution of criticality safety problems. Output from CSASIN generates an input file that may be used to execute the CSAS control module in SCALE-4. CSASIN features a pulldown menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up a CSAS input file and perform data checking. This capability increases productivity and decreases the chance of user error

  2. FRAMES User Defined Body Burden Concentration File Module Documentation

    International Nuclear Information System (INIS)

    Pelton, Mitchell A.; Rutz, Frederick C.; Eslinger, Melany A.; Gelston, Gariann M.

    2001-01-01

    The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) Body Burden Concentration File (BBF) contains time-varying, instantaneous, constituent concentrations for body burden by contaminant. This report contains the requirements for this file and will be used by software engineers and testers to ensure that the file is input properly.

  3. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps which are necessary for creating an ‘analyzing instrument’, based on open source software called Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.
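    The pre-processing step such an approach implies, turning raw log records into Weka's ARFF input format, can be sketched briefly. The field layout below is hypothetical; Weka's `@relation`/`@attribute`/`@data` ARFF syntax is standard:

```python
# Minimal sketch: serialize parsed Windows event-log records into Weka's
# ARFF format so they can be loaded by its data-mining algorithms.
# The three-field record layout (source, event_id, level) is illustrative.

def to_arff(relation, rows):
    """rows: list of (source, event_id, level) tuples."""
    lines = [f"@relation {relation}", "",
             "@attribute source string",
             "@attribute event_id numeric",
             "@attribute level {INFO,WARNING,ERROR}", "",
             "@data"]
    for source, event_id, level in rows:
        lines.append(f"'{source}',{event_id},{level}")
    return "\n".join(lines)

events = [("Service Control Manager", 7036, "INFO"),
          ("Disk", 51, "WARNING"),
          ("Application Error", 1000, "ERROR")]
arff = to_arff("system_log", events)
```

The resulting text, written to a `.arff` file, can be opened directly in the Weka Explorer.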

  4. Study and development of a document file system with selective access

    International Nuclear Information System (INIS)

    Mathieu, Jean-Claude

    1974-01-01

    The objective of this research thesis was to design and to develop a set of software aimed at the efficient management of a document file system using methods of selective access to information. Thus, the three main aspects of file processing (creation, modification, reorganisation) have been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of INIS bibliographic records provided by the IAEA, which have been used to create the files. In the second part, he briefly describes the general organisation of the file system. This system is based on the use of two main files: an inverse file, which contains for each descriptor a list of the numbers of the records indexed by that descriptor, and a dictionary of descriptors, or input file, which gives access to the inverse file. The organisation of both files is then described in detail. Other related or associated files are created, and the overall architecture and mechanisms integrated into the file data input software are described, as well as various processing operations applied to these different files. Performance and possible developments are finally discussed
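    The two-file scheme the thesis describes is essentially an inverted index. A toy in-memory sketch (Python dicts standing in for the on-disk dictionary and inverse file, with invented descriptors) shows the selective-access idea:

```python
# Toy sketch of the dictionary + inverse-file scheme: each descriptor maps
# to a posting list of record numbers, and selective access intersects the
# posting lists of the requested descriptors.

def build_inverse_file(records):
    """records: {record_number: set of descriptors} -> {descriptor: sorted record numbers}."""
    inverse = {}
    for number, descriptors in records.items():
        for d in descriptors:
            inverse.setdefault(d, set()).add(number)
    return {d: sorted(nums) for d, nums in inverse.items()}

def search(inverse, *descriptors):
    """Return record numbers indexed by ALL the given descriptors."""
    postings = [set(inverse.get(d, ())) for d in descriptors]
    return sorted(set.intersection(*postings)) if postings else []

records = {1: {"reactor", "safety"}, 2: {"reactor", "fuel"}, 3: {"safety"}}
inverse = build_inverse_file(records)
```

On disk, the dictionary would hold each descriptor together with the offset of its posting list in the inverse file, which is what makes the access selective rather than a full scan.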

  5. Flexible input, dazzling output with IBM i

    CERN Document Server

    Victória-Pereira, Rafael

    2014-01-01

    Link your IBM i system to the modern business server world! This book presents easier and more flexible ways to get data into your IBM i system, along with rather surprising methods to export and present the vital business data it contains. You'll learn how to automate file transfers, seamlessly connect PC applications with your RPG programs, and much more. Input operations will become more flexible and user-proof, with self-correcting import processes and direct file transfers that require a minimum of user intervention. Also learn novel ways to present information: your DB2 data will look gr

  6. File list: InP.Adp.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Adp.10.Input_control.AllCell hg19 Input control Input control Adipocyte SRX0194...p://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Adp.10.Input_control.AllCell.bed ...

  7. File list: InP.Unc.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Unc.20.Input_control.AllCell sacCer3 Input control Input control Unclassified E.../dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.Unc.20.Input_control.AllCell.bed ...

  8. File list: InP.Epd.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Epd.50.Input_control.AllCell mm9 Input control Input control Epidermis SRX14260...tp://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Epd.50.Input_control.AllCell.bed ...

  9. File list: InP.PSC.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.PSC.05.Input_control.AllCell mm9 Input control Input control Pluripotent stem c... http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.PSC.05.Input_control.AllCell.bed ...

  10. File list: InP.Epd.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Epd.20.Input_control.AllCell mm9 Input control Input control Epidermis SRX70095...tp://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Epd.20.Input_control.AllCell.bed ...

  11. File list: InP.ALL.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.ALL.05.Input_control.AllCell dm3 Input control Input control All cell types SRX...ttp://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.ALL.05.Input_control.AllCell.bed ...

  12. File list: InP.Emb.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Emb.50.Input_control.AllCell dm3 Input control Input control Embryo SRX681824,S...http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.Emb.50.Input_control.AllCell.bed ...

  13. File list: InP.ALL.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.ALL.50.Input_control.AllCell sacCer3 Input control Input control All cell types.../dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.ALL.50.Input_control.AllCell.bed ...

  14. File list: InP.Adp.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Adp.05.Input_control.AllCell mm9 Input control Input control Adipocyte SRX99775...27370 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Adp.05.Input_control.AllCell.bed ...

  15. File list: InP.Neu.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Neu.20.Input_control.AllCell hg19 Input control Input control Neural SRX643470,...SRX026881 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Neu.20.Input_control.AllCell.bed ...

  16. File list: InP.PSC.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.PSC.05.Input_control.AllCell hg19 Input control Input control Pluripotent stem ...RX342849,ERX342851 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.PSC.05.Input_control.AllCell.bed ...

  17. File list: InP.Plc.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Plc.50.Input_control.AllCell hg19 Input control Input control Placenta SRX19004...9,SRX080366 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Plc.50.Input_control.AllCell.bed ...

  18. File list: InP.Neu.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Neu.10.Input_control.AllCell hg19 Input control Input control Neural SRX643470,...RX1035591 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Neu.10.Input_control.AllCell.bed ...

  19. File list: InP.PSC.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.PSC.20.Input_control.AllCell hg19 Input control Input control Pluripotent stem ...RX342850,ERX342851 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.PSC.20.Input_control.AllCell.bed ...

  20. File list: InP.Bon.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bon.05.Input_control.AllCell hg19 Input control Input control Bone SRX188789,SR...08 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Bon.05.Input_control.AllCell.bed ...

  1. File list: InP.Lar.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Lar.05.Input_control.AllCell ce10 Input control Input control Larvae SRX331089,...,SRX015099 http://dbarchive.biosciencedbc.jp/kyushu-u/ce10/assembled/InP.Lar.05.Input_control.AllCell.bed ...

  2. File list: InP.YSt.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.YSt.50.Input_control.AllCell sacCer3 Input control Input control Yeast strain S... http://dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.YSt.50.Input_control.AllCell.bed ...

  3. File list: InP.Myo.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Myo.05.Input_control.AllCell hg19 Input control Input control Muscle SRX018653,...656,SRX038601 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Myo.05.Input_control.AllCell.bed ...

  4. File list: InP.Plc.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Plc.05.Input_control.AllCell hg19 Input control Input control Placenta SRX19004...9,SRX080366 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Plc.05.Input_control.AllCell.bed ...

  5. File list: InP.Bon.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bon.10.Input_control.AllCell hg19 Input control Input control Bone SRX188789,SR...64 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Bon.10.Input_control.AllCell.bed ...

  6. File list: InP.PSC.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.PSC.10.Input_control.AllCell hg19 Input control Input control Pluripotent stem ...RX342849,ERX342851 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.PSC.10.Input_control.AllCell.bed ...

  7. File list: InP.Myo.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Myo.05.Input_control.AllCell mm9 Input control Input control Muscle SRX1482291,...62124,SRX022849 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Myo.05.Input_control.AllCell.bed ...

  8. File list: InP.Myo.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Myo.10.Input_control.AllCell hg19 Input control Input control Muscle SRX018653,...652,SRX038601 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Myo.10.Input_control.AllCell.bed ...

  9. File list: InP.Myo.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Myo.10.Input_control.AllCell mm9 Input control Input control Muscle SRX1482291,...66227,SRX695944 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Myo.10.Input_control.AllCell.bed ...

  10. File list: InP.ALL.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.ALL.50.Input_control.AllCell dm3 Input control Input control All cell types SRX...948 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.ALL.50.Input_control.AllCell.bed ...

  11. File list: InP.Bld.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bld.10.Input_control.AllCell mm9 Input control Input control Blood SRX181865,SR...,SRX832474,SRX832475 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Bld.10.Input_control.AllCell.bed ...

  12. File list: InP.Myo.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Myo.50.Input_control.AllCell mm9 Input control Input control Muscle SRX262224,S...62124,SRX022849 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Myo.50.Input_control.AllCell.bed ...

  13. File list: InP.ALL.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.ALL.50.Input_control.AllCell mm9 Input control Input control All cell types SRX...7314,ERX807291 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.ALL.50.Input_control.AllCell.bed ...

  14. File list: InP.Utr.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Utr.05.Input_control.AllCell mm9 Input control Input control Uterus SRX129065,S...RX114726,SRX336292 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Utr.05.Input_control.AllCell.bed ...

  15. File list: InP.Bld.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bld.05.Input_control.AllCell mm9 Input control Input control Blood SRX181866,SR...,SRX836257,SRX832475 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Bld.05.Input_control.AllCell.bed ...

  16. File list: InP.YSt.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.YSt.10.Input_control.AllCell sacCer3 Input control Input control Yeast strain S...,SRX211436 http://dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.YSt.10.Input_control.AllCell.bed ...

  17. File list: InP.YSt.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.YSt.05.Input_control.AllCell sacCer3 Input control Input control Yeast strain S...,SRX211435 http://dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.YSt.05.Input_control.AllCell.bed ...

  18. File list: InP.Bld.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bld.20.Input_control.AllCell mm9 Input control Input control Blood SRX181865,SR...,SRX209471,SRX021428 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Bld.20.Input_control.AllCell.bed ...

  19. File list: InP.Utr.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Utr.20.Input_control.AllCell mm9 Input control Input control Uterus SRX114726,S...RX129065,SRX336292 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Utr.20.Input_control.AllCell.bed ...

  20. File list: InP.Brs.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Brs.10.Input_control.AllCell hg19 Input control Input control Breast SRX386756,...2,ERX371677,SRX1272812 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Brs.10.Input_control.AllCell.bed ...

  1. File list: InP.YSt.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.YSt.20.Input_control.AllCell sacCer3 Input control Input control Yeast strain S...,SRX211436 http://dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.YSt.20.Input_control.AllCell.bed ...

  2. File list: InP.Bld.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bld.50.Input_control.AllCell mm9 Input control Input control Blood SRX1287947,S...,SRX120230,SRX021428 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Bld.50.Input_control.AllCell.bed ...

  3. The design and analysis of salmonid tagging studies in the Columbia Basin. Volume 10: Instructional guide to using program CaptHist to create SURPH files for survival analysis using PTAGIS data files

    International Nuclear Information System (INIS)

    Westhagen, P.; Skalski, J.

    1997-12-01

    The SURPH program is a valuable tool for estimating survival and capture probabilities of outmigrating fish on the Snake and Columbia Rivers. Using special data files, SURPH computes reach-to-reach statistics for any release group passing a system of detection sites. Because the data must be recorded for individual fish, PIT tag data is best suited for use as input. However, PIT tag data as available from PTAGIS comes in a form that is not ready for use as SURPH input. SURPH requires a capture history for each fish. A capture history consists of a series of fields, one for each detection site, each holding a code for whether the fish was detected and returned to the river, detected and removed, or not detected. For the PTAGIS data to be usable by SURPH, it must be pre-processed: the data must be condensed down to one line per fish, with the relevant detection information from the PTAGIS file represented compactly on each line. In addition, the PTAGIS data file coil information must be passed through a series of logic algorithms to determine whether or not a fish was returned to the river after detection. Program CaptHist was developed to properly pre-process the PTAGIS data files for input to program SURPH. This utility takes PTAGIS data files as input and creates a SURPH data file as well as other output, including travel time records, detection date records, and a data error file. CaptHist allows a user to download PTAGIS files and easily process the data for use with SURPH
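    The condensation step described above, many detection records per fish reduced to one capture-history line with one coded field per site, can be sketched simply. The site abbreviations and numeric codes below are hypothetical stand-ins, not CaptHist's actual encoding:

```python
# Simplified sketch of the CaptHist condensation: detection records for one
# PIT-tagged fish become a single capture-history line, one field per site.
SITES = ["LGR", "LGS", "LMN"]          # example detection sites, in river order
CODES = {"returned": 1, "removed": 2}  # 0 means not detected at that site

def capture_history(detections):
    """detections: list of (site, disposition) records for one fish."""
    history = {site: 0 for site in SITES}
    for site, disposition in detections:
        history[site] = CODES[disposition]
    return " ".join(str(history[s]) for s in SITES)

fish = [("LGR", "returned"), ("LMN", "removed")]
line = capture_history(fish)
```

Here a fish detected and returned at the first site, missed at the second, and detected and removed at the third condenses to the line `1 0 2`; the real program additionally applies coil-logic algorithms to infer each disposition.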

  4. TART input manual

    International Nuclear Information System (INIS)

    Kimlinger, J.R.; Plechaty, E.F.

    1982-01-01

    The TART code is a Monte Carlo neutron/photon transport code that runs only on the CRAY computer. All input cards for the TART code are listed, and definitions of all input parameters are given. The execution and limitations of the code are described, and input files for two sample problems are given

  5. File list: InP.Utr.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Utr.05.Input_control.AllCell hg19 Input control Input control Uterus SRX092571,...44717,SRX811390,SRX811387,SRX811388 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Utr.05.Input_control.AllCell.bed ...

  6. File list: InP.ALL.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.ALL.10.Input_control.AllCell mm9 Input control Input control All cell types SRX...2625,SRX564564,SRX115361 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.ALL.10.Input_control.AllCell.bed ...

  7. File list: InP.Lng.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Lng.05.Input_control.AllCell mm9 Input control Input control Lung SRX062977,SRX...SRX213837,SRX213846,SRX213842 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Lng.05.Input_control.AllCell.bed ...

  8. File list: InP.Emb.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Emb.50.Input_control.AllCell ce10 Input control Input control Embryo SRX027097,...331141,SRX982091,SRX331153,SRX331231 http://dbarchive.biosciencedbc.jp/kyushu-u/ce10/assembled/InP.Emb.50.Input_control.AllCell.bed ...

  9. File list: InP.Gon.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Gon.50.Input_control.AllCell mm9 Input control Input control Gonad SRX332361,SR...89,SRX099892,SRX099895 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Gon.50.Input_control.AllCell.bed ...

  10. File list: InP.Utr.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Utr.50.Input_control.AllCell hg19 Input control Input control Uterus SRX117811,...50615,SRX150616,SRX863779,SRX389283 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Utr.50.Input_control.AllCell.bed ...

  11. File list: InP.Gon.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Gon.05.Input_control.AllCell mm9 Input control Input control Gonad SRX332361,SR...03,SRX591716,SRX555507 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Gon.05.Input_control.AllCell.bed ...

  12. File list: InP.Lng.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Lng.10.Input_control.AllCell mm9 Input control Input control Lung SRX213845,SRX...RX213842,SRX213846,SRX1528654 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Lng.10.Input_control.AllCell.bed ...

  13. File list: InP.Dig.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Dig.05.Input_control.AllCell mm9 Input control Input control Digestive tract SR...X885797,SRX885796,SRX885801 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Dig.05.Input_control.AllCell.bed ...

  14. File list: InP.ALL.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.ALL.05.Input_control.AllCell sacCer3 Input control Input control All cell types...ERX462363,ERX433715 http://dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.ALL.05.Input_control.AllCell.bed ...

  15. File list: InP.Gon.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Gon.10.Input_control.AllCell mm9 Input control Input control Gonad SRX332361,SR...89,SRX076065,SRX555507 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Gon.10.Input_control.AllCell.bed ...

  16. File list: InP.Dig.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Dig.50.Input_control.AllCell mm9 Input control Input control Digestive tract SR...X193725,SRX885789,SRX376973 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Dig.50.Input_control.AllCell.bed ...

  17. File list: InP.Unc.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Unc.10.Input_control.AllCell ce10 Input control Input control Unclassified SRX0...03829,SRX003828,SRX003826,SRX003827,SRX560679 http://dbarchive.biosciencedbc.jp/kyushu-u/ce10/assembled/InP.Unc.10.Input_control.AllCell.bed ...

  18. File list: InP.Adl.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Adl.50.Input_control.AllCell dm3 Input control Input control Adult SRX181432,SR...15625,SRX507386,SRX390503,SRX016141 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.Adl.50.Input_control.AllCell.bed ...

  19. File list: InP.Neu.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Neu.50.Input_control.AllCell mm9 Input control Input control Neural SRX109476,S...X685922,SRX145797,SRX150263,SRX150261 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Neu.50.Input_control.AllCell.bed ...

  20. File list: InP.Adl.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Adl.20.Input_control.AllCell dm3 Input control Input control Adult SRX264598,SR...07394,SRX970862,SRX215623,SRX215627 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.Adl.20.Input_control.AllCell.bed ...

  1. File list: InP.Oth.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Oth.50.Input_control.AllCell hg19 Input control Input control Others SRX253236,...RX188949,SRX189956,SRX080334,SRX080429 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Oth.50.Input_control.AllCell.bed ...

  2. File list: InP.Oth.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Oth.10.Input_control.AllCell mm9 Input control Input control Others SRX022481,S...,SRX1365335,SRX228661,SRX388797,SRX957693 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Oth.10.Input_control.AllCell.bed ...

  3. File list: InP.Epd.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Epd.50.Input_control.AllCell hg19 Input control Input control Epidermis SRX7005...65,SRX189959,SRX080381,SRX189950,SRX099054 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Epd.50.Input_control.AllCell.bed ...

  4. File list: InP.Unc.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Unc.20.Input_control.AllCell hg19 Input control Input control Unclassified SRX1...3292,ERX026985,SRX1094495,SRX545955,SRX212467 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Unc.20.Input_control.AllCell.bed ...

  5. File list: InP.Unc.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Unc.05.Input_control.AllCell hg19 Input control Input control Unclassified SRX5...5953,SRX063292,SRX1094492,SRX012413,ERX026985 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Unc.05.Input_control.AllCell.bed ...

  6. File list: InP.Unc.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Unc.10.Input_control.AllCell hg19 Input control Input control Unclassified SRX1...94492,SRX063292,SRX212465,SRX012413,ERX026985 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Unc.10.Input_control.AllCell.bed ...

  7. File list: InP.Lar.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Lar.20.Input_control.AllCell dm3 Input control Input control Larvae SRX040614,S...457599,SRX1426944,SRX1426946,SRX1426948 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.Lar.20.Input_control.AllCell.bed ...

  8. File list: InP.Epd.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Epd.10.Input_control.AllCell hg19 Input control Input control Epidermis SRX2000...54,SRX189959,SRX080365,SRX573182,SRX573176 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Epd.10.Input_control.AllCell.bed ...

  9. File list: InP.Neu.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Neu.05.Input_control.AllCell mm9 Input control Input control Neural SRX236086,S...X668239,ERX513118,SRX150263,ERX513117 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Neu.05.Input_control.AllCell.bed ...

  10. File list: InP.ALL.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.ALL.20.Input_control.AllCell sacCer3 Input control Input control All cell types...2392,ERX433647,ERX433677,ERX433717 http://dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.ALL.20.Input_control.AllCell.bed ...

  11. File list: InP.Oth.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Oth.20.Input_control.AllCell hg19 Input control Input control Others SRX253236,...X512178,SRX512179,SRX1074575,SRX371462 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Oth.20.Input_control.AllCell.bed ...

  12. File list: InP.Neu.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Neu.20.Input_control.AllCell mm9 Input control Input control Neural SRX109476,S...X513118,SRX150261,ERX513117,SRX150263 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Neu.20.Input_control.AllCell.bed ...

  13. File list: InP.Unc.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Unc.10.Input_control.AllCell sacCer3 Input control Input control Unclassified E...2438,ERX462370,ERX433651,ERX433709 http://dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.Unc.10.Input_control.AllCell.bed ...

  14. File list: InP.Unc.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Unc.05.Input_control.AllCell ce10 Input control Input control Unclassified SRX0...03829,SRX003827,SRX003828,SRX003826,SRX560679 http://dbarchive.biosciencedbc.jp/kyushu-u/ce10/assembled/InP.Unc.05.Input_control.AllCell.bed ...

  15. File list: InP.Unc.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Unc.20.Input_control.AllCell ce10 Input control Input control Unclassified SRX0...03827,SRX003826,SRX560679,SRX003828,SRX003829 http://dbarchive.biosciencedbc.jp/kyushu-u/ce10/assembled/InP.Unc.20.Input_control.AllCell.bed ...

  16. File list: InP.Oth.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Oth.10.Input_control.AllCell hg19 Input control Input control Others SRX668215,...RX371462,SRX376724,SRX512177,SRX512179 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Oth.10.Input_control.AllCell.bed ...

  17. File list: InP.Adl.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Adl.05.Input_control.AllCell dm3 Input control Input control Adult SRX220301,SR...81442,SRX215625,SRX215627,SRX970862 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.Adl.05.Input_control.AllCell.bed ...

  18. File list: InP.Brs.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Brs.05.Input_control.AllCell mm9 Input control Input control Breast SRX483143,S...ERX200438,SRX1078980,ERX200398,ERX200402,SRX396750 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Brs.05.Input_control.AllCell.bed ...

  19. File list: InP.Lar.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Lar.05.Input_control.AllCell dm3 Input control Input control Larvae SRX645428,S...287678,SRX1426944,SRX1426946,SRX1426948,SRX1426950 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.Lar.05.Input_control.AllCell.bed ...

  20. File list: InP.ALL.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.ALL.05.Input_control.AllCell ce10 Input control Input control All cell types SR...6,SRX188620,SRX005633,SRX560679,SRX015099 http://dbarchive.biosciencedbc.jp/kyushu-u/ce10/assembled/InP.ALL.05.Input_control.AllCell.bed ...

  1. File list: InP.Pan.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Pan.05.Input_control.AllCell hg19 Input control Input control Pancreas SRX19030...0794,SRX188948,SRX190029,SRX199860,SRX026707,SRX825393 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Pan.05.Input_control.AllCell.bed ...

  2. File list: InP.CDV.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.CDV.05.Input_control.AllCell hg19 Input control Input control Cardiovascular SR...SRX190068,SRX220067,SRX190094,SRX080409,SRX190090,SRX699736 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.CDV.05.Input_control.AllCell.bed ...

  3. File list: InP.Brs.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Brs.20.Input_control.AllCell mm9 Input control Input control Breast SRX483143,S...SRX396745,SRX403484,SRX1078982,SRX031066,SRX031214 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Brs.20.Input_control.AllCell.bed ...

  4. File list: InP.Pup.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Pup.50.Input_control.AllCell dm3 Input control Input control Pupae SRX016163,SR...X016167,SRX016169,SRX016164,SRX016165,SRX016168 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.Pup.50.Input_control.AllCell.bed ...

  5. File list: InP.CDV.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.CDV.10.Input_control.AllCell hg19 Input control Input control Cardiovascular SR...SRX190094,SRX190090,SRX189947,SRX080435,DRX021454,SRX699736 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.CDV.10.Input_control.AllCell.bed ...

  6. File list: InP.Pan.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Pan.10.Input_control.AllCell hg19 Input control Input control Pancreas SRX19030...8948,SRX825393,SRX199860,SRX190029,SRX026707,SRX375320 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Pan.10.Input_control.AllCell.bed ...

  7. File list: InP.Oth.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Oth.05.Input_control.AllCell mm9 Input control Input control Others SRX022481,S...1,SRX957691,SRX388797,SRX957693,SRX024355 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Oth.05.Input_control.AllCell.bed ...

  8. File list: InP.Pan.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Pan.20.Input_control.AllCell hg19 Input control Input control Pancreas SRX34080...0803,SRX340794,SRX199860,SRX190029,SRX026707,SRX375320 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Pan.20.Input_control.AllCell.bed ...

  9. File list: InP.Brs.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Brs.10.Input_control.AllCell mm9 Input control Input control Breast SRX213411,S...,ERX200429,ERX200402,SRX396747,SRX396750,SRX403487 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Brs.10.Input_control.AllCell.bed ...

  10. File list: InP.Pup.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Pup.10.Input_control.AllCell dm3 Input control Input control Pupae SRX016165,SR...X016163,SRX016164,SRX016169,SRX016168,SRX016167 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.Pup.10.Input_control.AllCell.bed ...

  11. File list: InP.Brs.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Brs.50.Input_control.AllCell mm9 Input control Input control Breast SRX483143,S...SRX403484,SRX1078982,SRX031066,SRX031214,SRX697687 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Brs.50.Input_control.AllCell.bed ...

  12. File list: InP.Oth.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Oth.20.Input_control.AllCell mm9 Input control Input control Others SRX022481,S...1,SRX957691,SRX378545,SRX378544,SRX957693 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Oth.20.Input_control.AllCell.bed ...

  13. File list: InP.Lng.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Lng.20.Input_control.AllCell hg19 Input control Input control Lung SRX502813,SR...1772,SRX734236,SRX188957,SRX1004561,SRX497257,SRX497258 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Lng.20.Input_control.AllCell.bed ...

  14. File list: InP.Dig.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Dig.10.Input_control.AllCell hg19 Input control Input control Digestive tract S...RX155777,SRX077858,SRX863785,SRX543682,SRX286206,SRX543691 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Dig.10.Input_control.AllCell.bed ...

  15. File list: InP.Dig.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Dig.50.Input_control.AllCell hg19 Input control Input control Digestive tract S...RX155744,SRX612781,SRX543681,SRX101310,SRX648244,SRX863785 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Dig.50.Input_control.AllCell.bed ...

  16. File list: InP.Dig.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Dig.20.Input_control.AllCell hg19 Input control Input control Digestive tract S...X077858,SRX286206,SRX124697,SRX1183967,SRX124698,SRX543691 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Dig.20.Input_control.AllCell.bed ...

  17. File list: InP.Lng.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Lng.10.Input_control.AllCell hg19 Input control Input control Lung SRX502813,SR...74812,SRX016558,SRX734236,SRX038682,SRX497257,SRX497258 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Lng.10.Input_control.AllCell.bed ...

  18. File list: InP.Lng.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Lng.05.Input_control.AllCell hg19 Input control Input control Lung SRX502813,SR...88957,SRX093320,SRX038682,SRX502807,SRX497258,SRX497257 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Lng.05.Input_control.AllCell.bed ...

  19. File list: InP.ALL.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.ALL.50.Input_control.AllCell hg19 Input control Input control All cell types SR...038682,SRX389283,SRX863785,SRX1029468,SRX155724,SRX038907 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.ALL.50.Input_control.AllCell.bed ...

  20. File list: InP.Lng.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Lng.50.Input_control.AllCell hg19 Input control Input control Lung SRX502813,SR...89952,SRX080360,SRX734236,SRX038683,SRX093320,SRX038682 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Lng.50.Input_control.AllCell.bed ...

  1. File list: InP.Dig.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Dig.05.Input_control.AllCell hg19 Input control Input control Digestive tract S...RX124694,SRX543691,SRX543683,SRX367635,SRX286206,SRX543682 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Dig.05.Input_control.AllCell.bed ...

  2. File list: InP.Adl.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Adl.05.Input_control.AllCell ce10 Input control Input control Adult SRX005630,S...RX080064,SRX323687,SRX1388754,SRX146417,SRX012296,SRX012299,SRX005633 http://dbarchive.biosciencedbc.jp/kyushu-u/ce10/assembled/InP.Adl.05.Input_control.AllCell.bed ...

  3. File list: InP.Plc.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Plc.20.Input_control.AllCell mm9 Input control Input control Placenta SRX112971...02,SRX404310,SRX1072160,SRX871507,SRX192097,SRX192098,SRX192099,SRX204148 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Plc.20.Input_control.AllCell.bed ...

  4. File list: InP.Gon.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Gon.10.Input_control.AllCell hg19 Input control Input control Gonad SRX1002699,...SRX663452,SRX1002710,SRX1002709,SRX1091823,SRX663439,SRX663445,SRX663442 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Gon.10.Input_control.AllCell.bed ...

  5. File list: InP.Spl.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Spl.10.Input_control.AllCell mm9 Input control Input control Spleen ERX662629,E...242,ERX662616,SRX701169,SRX701170,SRX701168,ERX662615,ERX662625 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Spl.10.Input_control.AllCell.bed ...

  6. File list: InP.Gon.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Gon.50.Input_control.AllCell hg19 Input control Input control Gonad SRX1002699,...SRX663452,SRX1002710,SRX1002709,SRX1091823,SRX663439,SRX663445,SRX663442 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Gon.50.Input_control.AllCell.bed ...

  7. File list: InP.Gon.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Gon.20.Input_control.AllCell hg19 Input control Input control Gonad SRX1002699,...SRX663452,SRX1002710,SRX1002709,SRX663439,SRX663445,SRX1091823,SRX663442 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Gon.20.Input_control.AllCell.bed ...

  8. File list: InP.Plc.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Plc.05.Input_control.AllCell mm9 Input control Input control Placenta SRX344647...160,SRX204147,SRX404310,SRX871506,SRX871507,SRX192098,SRX192097,SRX192099 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Plc.05.Input_control.AllCell.bed ...

  9. File list: InP.Spl.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Spl.20.Input_control.AllCell mm9 Input control Input control Spleen ERX662629,E...626,ERX662620,SRX701168,SRX701169,SRX701170,ERX662615,ERX662625 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Spl.20.Input_control.AllCell.bed ...

  10. File list: InP.Plc.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Plc.10.Input_control.AllCell mm9 Input control Input control Placenta SRX344648...310,SRX871506,SRX142911,SRX871507,SRX204148,SRX192098,SRX192097,SRX192099 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Plc.10.Input_control.AllCell.bed ...

  11. File list: InP.Pan.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Pan.20.Input_control.AllCell mm9 Input control Input control Pancreas SRX188611...035142,SRX751762,SRX751759,SRX751765,SRX751768,SRX327162,SRX672452,SRX327163 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Pan.20.Input_control.AllCell.bed ...

  12. File list: InP.Prs.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Prs.05.Input_control.AllCell hg19 Input control Input control Prostate SRX08452...RX882951,SRX334229,SRX047085,SRX248897,SRX315154,SRX315157,SRX120297,SRX540851 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Prs.05.Input_control.AllCell.bed ...

  13. File list: InP.CDV.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.CDV.10.Input_control.AllCell mm9 Input control Input control Cardiovascular SRX...4,SRX373590,SRX1304811,SRX1304812,SRX377394,SRX377691,SRX275462,SRX275461,SRX066549 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.CDV.10.Input_control.AllCell.bed ...

  14. File list: InP.Bon.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bon.10.Input_control.AllCell mm9 Input control Input control Bone SRX425483,ERX...RX211434,ERX211444,ERX211435,ERX211437,ERX211438,ERX211433,ERX211436,SRX963262 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Bon.10.Input_control.AllCell.bed ...

  15. File list: InP.Kid.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.20.Input_control.AllCell hg19 Input control Input control Kidney SRX973437,...X968416,SRX1094507,SRX1094511,SRX1353404,SRX1094515,ERX513120,SRX114492,SRX170378 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Kid.20.Input_control.AllCell.bed ...

  16. File list: InP.Kid.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.10.Input_control.AllCell mm9 Input control Input control Kidney SRX804277,S...RX085456,SRX143808,SRX804276,SRX286402,SRX286403,SRX062965,SRX804278,SRX1050552 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Kid.10.Input_control.AllCell.bed ...

  17. File list: InP.Prs.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Prs.20.Input_control.AllCell hg19 Input control Input control Prostate SRX53966...X306505,SRX475374,SRX540851,SRX475375,SRX1098753,SRX334229,SRX176071,SRX315154 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Prs.20.Input_control.AllCell.bed ...

  18. File list: InP.Kid.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.05.Input_control.AllCell hg19 Input control Input control Kidney SRX359412,...X1037589,SRX985312,SRX974385,SRX1094515,SRX003879,SRX1094507,SRX1094511,ERX513120 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Kid.05.Input_control.AllCell.bed ...

  19. File list: InP.Kid.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.05.Input_control.AllCell mm9 Input control Input control Kidney SRX062965,S...RX085456,SRX143808,SRX1050552,SRX286402,SRX804277,SRX804276,SRX286403,SRX804278 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Kid.05.Input_control.AllCell.bed ...

  20. File list: InP.Kid.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.20.Input_control.AllCell mm9 Input control Input control Kidney SRX804277,S...RX286402,SRX143808,SRX804276,SRX286403,SRX062965,SRX085456,SRX804278,SRX1050552 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Kid.20.Input_control.AllCell.bed ...

  1. File list: InP.ALL.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.ALL.10.Input_control.AllCell dm3 Input control Input control All cell types SRX...,SRX215627,SRX390503,SRX1426952,SRX1426954,SRX1426944,SRX1426946,SRX681798,SRX1426948 http://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.ALL.10.Input_control.AllCell.bed ...

  2. File list: InP.Kid.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.50.Input_control.AllCell mm9 Input control Input control Kidney SRX804277,S...RX286402,SRX143808,SRX804276,SRX286403,SRX062965,SRX085456,SRX804278,SRX1050552 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Kid.50.Input_control.AllCell.bed ...

  3. File list: InP.CDV.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.CDV.50.Input_control.AllCell mm9 Input control Input control Cardiovascular SRX...74,SRX517514,SRX373580,SRX320036,SRX320037,SRX1121694,SRX275462,SRX275461,SRX066549 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.CDV.50.Input_control.AllCell.bed ...

  4. File list: InP.Bon.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Bon.20.Input_control.AllCell mm9 Input control Input control Bone SRX425483,SRX...RX211444,ERX211434,ERX211433,ERX211443,ERX211437,ERX211435,SRX963262,ERX211436 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.Bon.20.Input_control.AllCell.bed ...

  5. File list: InP.CDV.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.CDV.20.Input_control.AllCell mm9 Input control Input control Cardiovascular SRX...75,SRX373609,SRX373610,SRX373590,SRX320036,SRX275461,SRX275462,SRX066549,SRX1121694 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.CDV.20.Input_control.AllCell.bed ...

  6. File list: InP.Kid.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.Kid.50.Input_control.AllCell hg19 Input control Input control Kidney SRX973437,...,SRX114492,SRX170378,SRX691805,SRX080441,SRX1293074,SRX326417,SRX684265,SRX684263 http://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.Kid.50.Input_control.AllCell.bed ...

  7. Input data requirements for special processors in the computation system containing the VENTURE neutronics code

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.

    1976-11-01

    This report presents user input data requirements for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user-oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files that are generated.
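    The conversion these processors perform, reading a formatted text card and emitting a fixed-layout binary record that downstream modules consume, can be sketched as follows. This is a minimal illustration only: the field names and record layout are assumptions for the example, not the actual VENTURE interface-file format.

    ```python
    import struct

    # Hypothetical record layout: integer id, integer group count,
    # double-precision value, packed little-endian ("<iid").
    RECORD_FORMAT = "<iid"

    def pack_record(card: str) -> bytes:
        """Convert one whitespace-delimited formatted card to binary."""
        fields = card.split()
        rec_id = int(fields[0])
        ngroups = int(fields[1])
        value = float(fields[2])
        return struct.pack(RECORD_FORMAT, rec_id, ngroups, value)

    def unpack_record(blob: bytes) -> tuple:
        """Read the binary record back, as a consuming module would."""
        return struct.unpack(RECORD_FORMAT, blob)

    blob = pack_record("7 4 1.25")
    print(unpack_record(blob))  # (7, 4, 1.25)
    ```

    The point of the binary intermediate is that each downstream module reads a compact, already-validated layout instead of re-parsing user-oriented text.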

  8. Input data requirements for special processors in the computation system containing the VENTURE neutronics code

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.

    1979-07-01

    User input data requirements are presented for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user-oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files that are generated.

  9. RRB's SVES Input File - Post Entitlement State Verification and Exchange System (PSSVES)

    Data.gov (United States)

    Social Security Administration — Several PSSVES request files are transmitted to SSA each year for processing in the State Verification and Exchange System (SVES). This is a first step in obtaining...

  10. PREPAR: a user-friendly preprocessor to create AIRDOS-EPA input data sets

    International Nuclear Information System (INIS)

    Sjoreen, A.L.; Miller, C.W.; Nelson, C.B.

    1984-01-01

    PREPAR is a FORTRAN program designed to simplify the preparation of input for the AIRDOS-EPA computer code. PREPAR was designed to provide a method for data entry that is both logical and flexible. It also provides default values for all variables, so the user needs to enter only those data for which the defaults should be changed. Data are entered either unformatted or via a user-selected format. A separate file of the nuclide-specific data needed by AIRDOS-EPA is read by PREPAR. Two utility programs, EXTRAC and RADLST, were written to create and list this file. PREPAR writes the file needed to run AIRDOS-EPA and writes a listing of that file.
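    The default-handling scheme described above, where every variable has a default and the user supplies only overrides, can be sketched in a few lines. The variable names below are purely illustrative assumptions, not actual AIRDOS-EPA input parameters.

    ```python
    # Hypothetical defaults table: one entry per input variable,
    # mirroring PREPAR's "defaults for all variables" approach.
    DEFAULTS = {
        "stack_height_m": 10.0,
        "release_rate_ci_per_yr": 1.0,
        "wind_speed_m_per_s": 2.5,
    }

    def build_input(user_values: dict) -> dict:
        """Merge user overrides onto the defaults, rejecting unknown names."""
        unknown = set(user_values) - set(DEFAULTS)
        if unknown:
            raise KeyError(f"unrecognized variables: {sorted(unknown)}")
        return {**DEFAULTS, **user_values}

    # Only the changed variable is entered; the rest come from defaults.
    print(build_input({"stack_height_m": 30.0}))
    ```

    Rejecting unrecognized names is the design choice that keeps this style of preprocessor safe: a misspelled variable fails loudly instead of silently falling back to its default.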

  11. Preparation and documentation of a CATHENA input file for Darlington NGS

    International Nuclear Information System (INIS)

    1989-03-01

    A CATHENA input model has been developed and documented for the heat transport system of the Darlington Nuclear Generating Station. CATHENA, an advanced two-fluid thermalhydraulic computer code, has been designed for analysis of postulated loss-of-coolant accidents (LOCA) and upset conditions in the CANDU system. This report describes the Darlington input model (or idealization) and gives representative results for a simulation of a small break at an inlet header.

  12. Input-output supervisor

    International Nuclear Information System (INIS)

    Dupuy, R.

    1970-01-01

    The input-output supervisor is the program which monitors the flow of information between core storage and the peripheral equipment of a computer. This work is composed of three parts: 1 - Study of a generalized input-output supervisor. With simple modifications it resembles most of the input-output supervisors now running on computers. 2 - Application of this theory to a magnetic drum. 3 - Hardware requirements for time-sharing. (author)

  13. File list: InP.NoD.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.NoD.05.Input_control.AllCell dm3 Input control Input control No description htt...p://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.NoD.05.Input_control.AllCell.bed ...

  14. File list: InP.NoD.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.NoD.20.Input_control.AllCell dm3 Input control Input control No description htt...p://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.NoD.20.Input_control.AllCell.bed ...

  15. File list: InP.NoD.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.NoD.05.Input_control.AllCell sacCer3 Input control Input control No description... http://dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.NoD.05.Input_control.AllCell.bed ...

  16. File list: InP.NoD.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.NoD.50.Input_control.AllCell hg19 Input control Input control No description ht...tp://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.NoD.50.Input_control.AllCell.bed ...

  17. File list: InP.NoD.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.NoD.50.Input_control.AllCell sacCer3 Input control Input control No description... http://dbarchive.biosciencedbc.jp/kyushu-u/sacCer3/assembled/InP.NoD.50.Input_control.AllCell.bed ...

  18. File list: InP.NoD.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.NoD.50.Input_control.AllCell dm3 Input control Input control No description htt...p://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.NoD.50.Input_control.AllCell.bed ...

  19. File list: InP.NoD.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.NoD.05.Input_control.AllCell hg19 Input control Input control No description ht...tp://dbarchive.biosciencedbc.jp/kyushu-u/hg19/assembled/InP.NoD.05.Input_control.AllCell.bed ...

  20. File list: InP.NoD.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.NoD.20.Input_control.AllCell mm9 Input control Input control No description htt...p://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.NoD.20.Input_control.AllCell.bed ...

  1. File list: InP.NoD.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.NoD.05.Input_control.AllCell mm9 Input control Input control No description htt...p://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.NoD.05.Input_control.AllCell.bed ...

  2. File list: InP.NoD.10.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.NoD.10.Input_control.AllCell dm3 Input control Input control No description htt...p://dbarchive.biosciencedbc.jp/kyushu-u/dm3/assembled/InP.NoD.10.Input_control.AllCell.bed ...

  3. File list: InP.EmF.50.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.EmF.50.Input_control.AllCell mm9 Input control Input control Embryonic fibrobla...363,SRX115361 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.EmF.50.Input_control.AllCell.bed ...

  4. File list: InP.EmF.05.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.EmF.05.Input_control.AllCell mm9 Input control Input control Embryonic fibrobla...367,SRX115361 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.EmF.05.Input_control.AllCell.bed ...

  5. File list: InP.EmF.20.Input_control.AllCell [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available InP.EmF.20.Input_control.AllCell mm9 Input control Input control Embryonic fibrobla...367,SRX115361 http://dbarchive.biosciencedbc.jp/kyushu-u/mm9/assembled/InP.EmF.20.Input_control.AllCell.bed ...

  6. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corporation, Tokyo (Japan)

    2013-10-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version of the former FEMAXI-6, for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published as a separate JAEA-Data/Code report. The present manual, the counterpart of that description document, gives detailed explanations of the files and operation method of the FEMAXI-7 code and its related codes, methods of input/output, sample input/output, methods of source code modification, subroutine structure, and internal variables, in order to help users perform fuel analysis with FEMAXI-7. (author)

  7. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2013-10-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version of the former FEMAXI-6, for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published as a separate JAEA-Data/Code report. The present manual, the counterpart of that description document, gives detailed explanations of the files and operation method of the FEMAXI-7 code and its related codes, methods of input/output, sample input/output, methods of source code modification, subroutine structure, and internal variables, in order to help users perform fuel analysis with FEMAXI-7. (author)

  8. TVF-NMCRC-A powerful program for writing and executing simulation inputs for the FLUKA Monte Carlo Code system

    International Nuclear Information System (INIS)

    Mark, S.; Khomchenko, S.; Shifrin, M.; Haviv, Y.; Schwartz, J.R.; Orion, I.

    2007-01-01

    We at the Negev Monte Carlo Research Center (NMCRC) have developed a powerful new interface for writing and executing FLUKA input files: TVF-NMCRC. With the TVF tool a FLUKA user can easily write an input file without any previous experience. The TVF-NMCRC tool is a LINUX program that has been verified on the most common LINUX-based operating systems and is suitable for the latest version of FLUKA (FLUKA 2006.3).

  9. ENSDF: The evaluated nuclear structure data file

    International Nuclear Information System (INIS)

    Martin, M.J.

    1986-01-01

    The structure, organization, and contents of the Evaluated Nuclear Structure Data File, ENSDF, will be discussed. This file summarizes the state of experimental nuclear structure data for all nuclei as determined from consideration of measurements reported world wide. Special emphasis will be given to the data evaluation procedures and consistency checks utilized at the input stage and to the retrieval capabilities of the system at the output stage

  10. SPOTS4. Group data library and computer code, preparing ENDF/B-4 data for input to LEOPARD

    International Nuclear Information System (INIS)

    Kim, J.D.; Lee, J.T.

    1981-09-01

    The magnetic tape SPOTS4 contains in file 1 a data library to be used as input to the SPOTS4 program which is contained in file 2. The data library is based on ENDF/B-4 and consists of two parts in TEMPEST format (246 groups) and MUFT format (54 groups) respectively. From this library the SPOTS4 program produces a 172 + 54 group library for LEOPARD input. A copy of the magnetic tape is available from the IAEA Nuclear Data Section. (author)

  11. GEODOC: the GRID document file, record structure and data element description

    Energy Technology Data Exchange (ETDEWEB)

    Trippe, T.; White, V.; Henderson, F.; Phillips, S.

    1975-11-06

    The purpose of this report is to describe the information structure of the GEODOC file. GEODOC is a computer based file which contains the descriptive cataloging and indexing information for all documents processed by the National Geothermal Information Resource Group. This file (along with other GRID files) is managed by DBMS, the Berkeley Data Base Management System. Input for the system is prepared using the IRATE Text Editing System with its extended (12 bit) character set, or punched cards.

  12. Status of the evaluated nuclear structure data file

    International Nuclear Information System (INIS)

    Martin, M.J.

    1991-01-01

    The structure, organization, and contents of the Evaluated Nuclear Structure Data File (ENSDF) are discussed in this paper. This file contains a summary of the state of experimental nuclear structure data for all nuclides as determined from consideration of measurements reported worldwide in the literature. Special emphasis is given to the data evaluation procedures, the consistency checks, and the quality control utilized at the input stage and to the retrieval capabilities of the system at the output stage. Recent enhancements of the on-line interaction with the file contents are addressed, as well as procedural changes that will improve the currency of the file

  13. Nuclear Structure References (NSR) file

    International Nuclear Information System (INIS)

    Ewbank, W.B.

    1978-08-01

    The use of the Nuclear Structure References file by the Nuclear Data Project at ORNL is described. Much of the report concerns format information of interest only to those preparing input to the system or otherwise needing detailed knowledge of its internal structure. 17 figures

  14. Apically extruded dentin debris by reciprocating single-file and multi-file rotary system.

    Science.gov (United States)

    De-Deus, Gustavo; Neves, Aline; Silva, Emmanuel João; Mendonça, Thais Accorsi; Lourenço, Caroline; Calixto, Camila; Lima, Edson Jorge Moreira

    2015-03-01

    This study aims to evaluate the apical extrusion of debris by two reciprocating single-file systems: WaveOne and Reciproc. A conventional multi-file rotary system was used as a reference for comparison. The hypotheses tested were (i) the reciprocating single-file systems extrude more than the conventional multi-file rotary system and (ii) the reciprocating single-file systems extrude similar amounts of dentin debris. After strict selection criteria, 80 mesial roots of lower molars were included in the present study. The use of four different instrumentation techniques resulted in four groups (n = 20): G1 (hand-file technique), G2 (ProTaper), G3 (WaveOne), and G4 (Reciproc). The apparatus used to evaluate the collection of apically extruded debris was a typical double-chamber collector. Statistical analysis was performed for multiple comparisons. No significant difference was found in the amount of debris extruded between the two reciprocating systems. In contrast, the conventional multi-file rotary system group extruded significantly more debris than both reciprocating groups. The hand instrumentation group extruded significantly more debris than all other groups. The present results yielded favorable input for both reciprocating single-file systems, inasmuch as they showed improved control of apically extruded debris. Apical extrusion of debris has been studied extensively because of its clinical relevance, particularly since it may cause flare-ups originating from the introduction of bacteria, pulpal tissue, and irrigating solutions into the periapical tissues.

  15. Evaluated nuclear data file of Th-232

    International Nuclear Information System (INIS)

    Meadows, J.; Poenitz, W.; Smith, A.; Smith, D.; Whalen, J.; Howerton, R.

    1977-09-01

    An evaluated nuclear data file for thorium is described. The file extends over the energy range 0.049 (i.e., the inelastic-scattering threshold) to 20.0 MeV and is formulated within the framework of the ENDF system. The input data base, the evaluation procedures and judgments, and ancillary experiments carried out in conjunction with the evaluation are outlined. The file includes: neutron total cross sections, neutron scattering processes, neutron radiative capture cross sections, fission cross sections, (n;2n) and (n;3n) processes, fission properties (e.g., nu-bar and delayed neutron emission) and photon production processes. Regions of uncertainty are pointed out particularly where new measured results would be of value. The file is extended to thermal energies using previously reported resonance evaluations thereby providing a complete file for neutronic calculations. Integral data tests indicated that the file was suitable for neutronic calculations in the MeV range

  16. Development of the GUI environments of MIDAS code for convenient input and output processing

    International Nuclear Information System (INIS)

    Kim, K. L.; Kim, D. H.

    2003-01-01

    MIDAS is being developed at KAERI as an integrated Severe Accident Analysis Code with easy model modification and addition by restructuring the data transfer scheme. In this paper, the input file management system IEDIT and the graphic simulation system SATS are presented as the MIDAS input and output GUI systems. These two systems would form the basis of the MIDAS GUI system for input and output processing, and they are expected to be useful tools for severe accident analysis and simulation

  17. Selection method and device for reactor core performance calculation input indication

    International Nuclear Information System (INIS)

    Yuto, Yoshihiro.

    1994-01-01

    The position of a reactor core component on a reactor core map, which is previously designated and optionally changeable, is displayed in different colors on a CRT screen by using data from a file incorporating the results of a reactor core performance calculation, such as in-core thermal limit values. That is, an operator specifies on a menu screen the kind of in-core component to be sampled, and the positions of the in-core components which satisfy a predetermined condition are displayed in different colors on the CRT screen in the form of a reactor core map. A displayed reactor core component is then selected and designated on the screen by a touch panel, a mouse, or a light pen, which automatically outputs detailed reactor core performance evaluation data for the component at the indicated position. Retrieval of the coordinates of fuel assemblies to be sampled, input of the coordinates, and the demand for data sampling can all be conducted from one menu screen. (N.H.)

  18. Using NJOY to Create MCNP ACE Files and Visualize Nuclear Data

    Energy Technology Data Exchange (ETDEWEB)

    Kahler, Albert Comstock [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-14

    We provide lecture materials that describe the input requirements to create various MCNP ACE files (Fast, Thermal, Dosimetry, Photo-nuclear and Photo-atomic) with the NJOY Nuclear Data Processing code system. Input instructions to visualize nuclear data with NJOY are also provided.

  19. Ground-Based Global Navigation Satellite System (GNSS) Compact Observation Data (1-second sampling, sub-hourly files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) Observation Data (1-second sampling, sub-hourly files) from the NASA Crustal Dynamics...

  20. Computer code ANISN multiplying media and shielding calculation 2. Code description (input/output)

    International Nuclear Information System (INIS)

    Maiorino, J.R.

    1991-01-01

    The new code CCC-0514-ANISN/PC is described, as well as a ''GENERAL DESCRIPTION OF ANISN/PC code''. In addition to the ANISN/PC code, the transmittal package includes an interactive input generation programme called APE (ANISN Processor and Evaluator), which facilitates the work of the user in giving input. Also, a 21 group photon cross section master library FLUNGP.LIB in ISOTX format, which can be edited by an executable file LMOD.EXE, is included in the package. The input and output subroutines are reviewed. 6 refs, 1 fig., 1 tab

  1. Performance of the Galley Parallel File System

    Science.gov (United States)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    As the input/output (I/O) needs of parallel scientific applications increase, file systems for multiprocessors are being designed to provide applications with parallel access to multiple disks. Many parallel file systems present applications with a conventional Unix-like interface that allows the application to access multiple disks transparently. This interface conceals the parallelism within the file system, which increases the ease of programmability, but makes it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. Furthermore, most current parallel file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic parallel workloads. Initial experiments, reported in this paper, indicate that Galley is capable of providing high-performance I/O to applications that access data in patterns that have been observed to be common.

  2. Interactive development of RADTRAN

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, F.L.; Weiner, R.F.; Yoshimura, H.R.; Joy, H.W.

    1995-01-01

    The RADTRAN computer code for transportation risk analysis, which has been under continuous development at Sandia National Laboratories since 1977, has evolved from a purely research tool into a publicly available code with a variety of applications. This expansion of the user community has substantially increased the need to make the system easier to use without decreasing its capabilities or the quality of output. A large set of modifiable RADTRAN input files has been available via TRANSNET for several years. One approach to assisting the user involves adding annotations and information to each of these files. A second approach is providing additional help in building new or modifying old input files. Keeping the proposed information/annotation files separate from but closely coupled to the modifiable input files within the TRANSNET shell system allows the modifiable input files to remain regular input files while providing rapid, automatic access to useful information about the analysis. In this way, the sample input files remain intact as regular RADTRAN input files, and any files generated using the associated on-line menus or editors may be readily converted into new input files. A single sample file is selected and used as an example to illustrate the prototype help features

  3. Keemei: cloud-based validation of tabular bioinformatics file formats in Google Sheets.

    Science.gov (United States)

    Rideout, Jai Ram; Chase, John H; Bolyen, Evan; Ackermann, Gail; González, Antonio; Knight, Rob; Caporaso, J Gregory

    2016-06-13

    Bioinformatics software often requires human-generated tabular text files as input and has specific requirements for how those data are formatted. Users frequently manage these data in spreadsheet programs, which is convenient for researchers who are compiling the requisite information because the spreadsheet programs can easily be used on different platforms including laptops and tablets, and because they provide a familiar interface. It is increasingly common for many different researchers to be involved in compiling these data, including study coordinators, clinicians, lab technicians and bioinformaticians. As a result, many research groups are shifting toward using cloud-based spreadsheet programs, such as Google Sheets, which support the concurrent editing of a single spreadsheet by different users working on different platforms. Most of the researchers who enter data are not familiar with the formatting requirements of the bioinformatics programs that will be used, so validating and correcting file formats is often a bottleneck prior to beginning bioinformatics analysis. We present Keemei, a Google Sheets Add-on, for validating tabular files used in bioinformatics analyses. Keemei is available free of charge from Google's Chrome Web Store. Keemei can be installed and run on any web browser supported by Google Sheets. Keemei currently supports the validation of two widely used tabular bioinformatics formats, the Quantitative Insights into Microbial Ecology (QIIME) sample metadata mapping file format and the Spatially Referenced Genetic Data (SRGD) format, but is designed to easily support the addition of others. Keemei will save researchers time and frustration by providing a convenient interface for tabular bioinformatics file format validation. By allowing everyone involved with data entry for a project to easily validate their data, it will reduce the validation and formatting bottlenecks that are commonly encountered when human-generated data files are
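    The per-cell validation described above can be sketched as a small rule table applied to parsed rows. The column names and rules below are illustrative stand-ins, not the actual QIIME mapping-file specification that Keemei implements.

```python
import re

# Hypothetical per-column rules; real formats define many more constraints.
RULES = {
    "SampleID": re.compile(r"^[A-Za-z0-9.]+$"),   # assumed: alphanumerics and dots
    "BarcodeSequence": re.compile(r"^[ACGT]*$"),  # assumed: DNA characters only
}

def validate(rows):
    """Return (row_index, column, value) for every cell that fails its rule."""
    errors = []
    for i, row in enumerate(rows):
        for col, pattern in RULES.items():
            value = row.get(col, "")
            if not pattern.match(value):
                errors.append((i, col, value))
    return errors

good = {"SampleID": "S1.a", "BarcodeSequence": "ACGT"}
bad = {"SampleID": "S 1", "BarcodeSequence": "ACGX"}
errs = validate([good, bad])   # only the second row produces errors
```

    Reporting (row, column, value) triples rather than a single pass/fail is what makes this kind of tool useful to non-bioinformatician data-entry staff: each error points at one editable cell.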

  4. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    Science.gov (United States)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  5. Ensemble standard deviation of wind speed and direction of the FDDA input to WRF

    Data.gov (United States)

    U.S. Environmental Protection Agency — NetCDF file of the SREF standard deviation of wind speed and direction that was used to inject variability in the FDDA input. Variable U_NDG_OLD contains standard...

  6. Data and code files for co-occurrence modeling project

    Data.gov (United States)

    U.S. Environmental Protection Agency — Files included are original data inputs on stream fishes (fish_data_OEPA_2012.csv), water chemistry (OEPA_WATER_2012.csv), geographic data (NHD_Plus_StreamCat);...

  7. Input/Output of ab-initio nuclear structure calculations for improved performance and portability

    International Nuclear Information System (INIS)

    Laghave, Nikhil

    2010-01-01

    Many modern scientific applications rely on highly computation-intensive calculations. However, most applications do not concentrate as much on the role that input/output operations can play in improved performance and portability. Parallelizing input/output operations on large files can significantly improve the performance of parallel applications where sequential I/O is a bottleneck. A proper choice of I/O library also offers scope for making input/output operations portable across different architectures. Thus, the use of parallel I/O libraries for organizing I/O of large data files offers great scope for improving the performance and portability of applications. In particular, sequential I/O has been identified as a bottleneck for the highly scalable MFDn (Many Fermion Dynamics for nuclear structure) code performing ab-initio nuclear structure calculations. We develop interfaces and parallel I/O procedures to use a well-known parallel I/O library in MFDn. As a result, we gain efficient I/O of large datasets along with their portability and ease of use in the down-stream processing. Even in situations where the amount of data to be written is not huge, proper use of input/output operations can boost the performance of scientific applications. Application checkpointing offers enormous performance improvement and flexibility by doing a negligible amount of I/O to disk. Checkpointing saves and resumes application state in such a manner that in most cases the application is unaware that there has been an interruption to its execution. This helps save a large amount of previously completed work and allows application execution to continue. This small amount of I/O provides substantial time saving by offering restart/resume capability to applications. The need for checkpointing has been identified in the optimization code NEWUOA, and checkpoint/restart capability has been implemented in NEWUOA by using simple file I/O.
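    The checkpoint/restart idea described above can be sketched as a loop that persists its state to disk at intervals. The state layout and checkpoint interval below are invented for illustration; they are not NEWUOA's actual implementation.

```python
import json
import os

def run(n_steps, ckpt):
    """Iterate to n_steps, checkpointing state to ckpt every 10 steps."""
    if os.path.exists(ckpt):
        with open(ckpt) as f:
            state = json.load(f)              # resume where the last run stopped
    else:
        state = {"step": 0, "total": 0.0}     # fresh start
    while state["step"] < n_steps:
        state["total"] += state["step"]       # stand-in for the real computation
        state["step"] += 1
        if state["step"] % 10 == 0:           # keep checkpoint I/O infrequent
            with open(ckpt, "w") as f:
                json.dump(state, f)
    return state
```

    The application is "unaware" of the interruption in the sense the abstract describes: on restart, the loop simply continues from the loaded step counter instead of step 0, and the only extra cost is one small write every 10 iterations.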

  8. A fortran code CVTRAN to provide cross-section file for TWODANT by using macroscopic file written by SRAC

    International Nuclear Information System (INIS)

    Yamane, Tsuyoshi; Tsuchihashi, Keichiro

    1999-03-01

    The code CVTRAN provides macroscopic cross-sections in the format of the XSLIB file, one of the standard interface files for the two-dimensional Sn transport code TWODANT, by reading a macroscopic cross-section file in the PDS format prepared by SRAC execution. While the two-dimensional Sn transport code TWOTRAN published by LANL is installed as a module in the SRAC code system, several functions such as alpha search, concentration search, zone thickness search, and various edits are suppressed. Since the TWODANT code was released by LANL, its short running time, stable convergence, and plentiful edits have attracted many users. CVTRAN makes TWODANT available to SRAC users by providing the macroscopic cross-sections in the card-image file XSLIB. CVTRAN also writes material-dependent fission spectra to a card-image file CVLIB, together with group velocities, group boundary energies, and material names. The user can feed them into the TWODANT input, if necessary, by a cut-and-paste command. (author)

  9. Federating LHCb datasets using the DIRAC File catalog

    CERN Document Server

    Haen, Christophe; Frank, Markus; Tsaregorodtsev, Andrei

    2015-01-01

    In the distributed computing model of LHCb the File Catalog (FC) is a central component that keeps track of each file and replica stored on the Grid. It federates the LHCb data files into a logical namespace used by all LHCb applications. As a replica catalog, it is used for brokering jobs to sites where their input data are meant to be present, but also by jobs for finding alternative replicas if necessary. The LCG File Catalog (LFC) used originally by LHCb and other experiments is now being retired and needs to be replaced. The DIRAC File Catalog (DFC) was developed within the framework of the DIRAC Project and presented during CHEP 2012. From the technical point of view, the code powering the DFC follows aspect-oriented programming (AOP): each type of entity that is manipulated by the DFC (Users, Files, Replicas, etc.) is treated as a separate 'concern' in the AOP terminology. Hence, the database schema can also be adapted to the needs of a Virtual Organization. LHCb opted for a highly tuned MySQL datab...

  10. Conversion software for ANSYS APDL 2 FLUENT MHD magnetic file

    International Nuclear Information System (INIS)

    Ghita, G.; Ionescu, S.; Prisecaru, I.

    2016-01-01

    The present paper describes the improvements made to the conversion software ANSYS APDL 2 FLUENT MHD Magnetic File, which extracts data from an ANSYS APDL file and writes a file containing the magnetic field data in the FLUENT magnetohydrodynamics (MHD) format. The MHD module has some features for uniform and non-uniform magnetic fields, but it is limited to sinusoidal or pulsed (square wave) fields with a fixed duty cycle of 50%. The present software has undergone major modifications compared with the previous version. The most important improvement is a new graphical interface, which provides a 3D graphical view of both the input file and the output file. Processing time has also been improved: the new version is twice as fast as the old one. (authors)
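    Stripped of the GUI, the core of such a conversion tool is a column-format rewrite. The sketch below assumes comma-separated (x, y, z, Bx, By, Bz) rows and a space-separated target layout with a header line; neither is the real ANSYS APDL or FLUENT MHD file format.

```python
def convert(csv_lines):
    """Rewrite comma-separated (x, y, z, Bx, By, Bz) rows as space-separated
    columns under an assumed header line."""
    out = ["x y z Bx By Bz"]                  # hypothetical target header
    for line in csv_lines:
        vals = [float(v) for v in line.split(",")]
        if len(vals) != 6:
            raise ValueError(f"expected 6 values, got {len(vals)}: {line!r}")
        out.append(" ".join(f"{v:.6e}" for v in vals))
    return "\n".join(out)
```

    Validating the column count on every row is the part that matters in practice: field exports frequently contain trailing summary lines that would otherwise corrupt the target file silently.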

  11. SLIB77, Source Library Data Compression and File Maintenance System

    International Nuclear Information System (INIS)

    Lunsford, A.

    1989-01-01

    Description of program or function: SLIB77 is a source librarian program designed to maintain FORTRAN source code in a compressed form on magnetic disk. The program was prepared to meet program maintenance requirements for ongoing program development and continual improvement of very large programs involving many programmers from a number of different organizations. SLIB77 automatically maintains in one file the source of the current program as well as all previous modifications. Although written originally for FORTRAN programs, SLIB77 is suitable for use with data files, text files, operating systems, and other programming languages, such as Ada, C and COBOL. It can handle libraries with records of up to 160 characters. Records are grouped into DECKS and assigned deck names by the user. SLIB77 assigns a number to each record in each DECK. Records can be deleted or restored singly or as a group within each deck. Modification records are grouped and assigned modification identification names by the user. The program assigns numbers to each new record within the deck. The program has two modes of execution, BATCH and EDIT. The BATCH mode is controlled by an input file and is used to make changes permanent and create new library files. The EDIT mode is controlled by interactive terminal input, and a built-in line editor is used for modification of single decks. Transfer of a library from one computer system to another is accomplished using a Portable Library File created by SLIB77 in a BATCH run

  12. An approach for coupled-code multiphysics core simulations from a common input

    International Nuclear Information System (INIS)

    Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; Pawlowski, Roger; Clarno, Kevin; Simunovic, Srdjan; Slattery, Stuart; Turner, John; Palmtag, Scott

    2015-01-01

    Highlights: • We describe an approach for coupled-code multiphysics reactor core simulations. • The approach can enable tight coupling of distinct physics codes with a common input. • Multi-code multiphysics coupling and parallel data transfer issues are explained. • The common input approach and how the information is processed is described. • Capabilities are demonstrated on an eigenvalue and power distribution calculation. - Abstract: This paper describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and setup the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak
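    The common-input approach described above can be sketched as a parse-once, emit-many preprocessor: a single problem description is parsed into one data structure, from which a consistent input file is generated for each physics code. The section names and the two output formats below are invented for illustration and do not reproduce the real VERAIn syntax.

```python
# Hypothetical common input: one description of the coupled problem.
COMMON = """
[core]
assembly = 17x17
power_mw = 3400
[thermal_hydraulics]
inlet_temp_k = 565
"""

def parse_common(text):
    """Parse [section] / key = value lines into nested dictionaries."""
    sections, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]
            sections[current] = {}
        else:
            key, value = (s.strip() for s in line.split("=", 1))
            sections[current][key] = value
    return sections

# Each emitter renders the shared data in its own code's (invented) format,
# so the two physics inputs cannot drift out of sync.
def emit_neutronics(sections):
    core = sections["core"]
    return f"ASSEMBLY {core['assembly']}\nPOWER {core['power_mw']} MW"

def emit_th(sections):
    th = sections["thermal_hydraulics"]
    return f"TINLET {th['inlet_temp_k']} K"
```

    The consistency guarantee the abstract emphasizes comes from structure, not discipline: both emitters read the same parsed dictionary, so a change to the common file propagates to every generated input.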

  13. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Full Text Available Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next generation, high throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard qPCR and high throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA, cDNA), permits the normalization of results between different batches and between different instruments, regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
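
    The per-cell, efficiency-corrected quantification the abstract describes can be illustrated with a minimal sketch; the function and parameter names are invented for illustration and this is not the authors' published algorithm:

```python
# Hedged sketch of input-quantity-normalized qPCR analysis
# (illustrative names; not the paper's actual implementation).

def relative_expression(cq_sample, cq_reference, efficiency, n_cells):
    """Efficiency-corrected expression relative to a universal
    reference sample, normalized per input cell."""
    # (1 + E) ** (Cq_ref - Cq_sample) is the standard
    # efficiency-corrected expression ratio.
    ratio = (1.0 + efficiency) ** (cq_reference - cq_sample)
    # Dividing by the cell count makes results comparable between
    # samples with different input quantities.
    return ratio / n_cells

# Example: the sample crosses threshold 3 cycles earlier than the
# reference, assay efficiency 95%, 10,000 cells of input material.
x = relative_expression(cq_sample=22.0, cq_reference=25.0,
                        efficiency=0.95, n_cells=10_000)
```

    Because both the reference sample and the cell count are external to the assay, the result does not depend on any control gene, matching the abstract's design goal.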

  14. DIGITAL ONCOLOGY PATIENT RECORD - HETEROGENEOUS FILE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    Nikolay Sapundzhiev

    2010-12-01

    Full Text Available Introduction: Oncology patients need extensive follow-up and meticulous documentation. The aim of this study was to introduce a simple, platform-independent, file-based system for documentation of diagnostic and therapeutic procedures in oncology patients and to test its function. Material and methods: A file-name based system of the type M1M2M3.F2 was introduced, where M1 is a unique identifier for the patient, M2 is the date of the clinical intervention/event, M3 is an identifier for the author of the medical record and F2 is the specific software-generated file-name extension. Results: This system is in use at 5 institutions, where a total of 11 persons on 14 different workstations inputted 16591 entries (files) for 2370 patients. The merge process was tested on 2 operating systems: when copied together, all files sort as expected by patient, and for each patient in chronological order, providing a digital cumulative patient record which contains heterogeneous file formats. Conclusion: The file-based approach for storing heterogeneous digital patient-related information is a reliable system which can handle open-source, proprietary, general and custom file formats and seems to be easily scalable. Further development of software for automatic checks of the integrity and for searching and indexing of the files is expected to produce a more user-friendly environment.
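
    The M1M2M3.F2 scheme can be sketched as follows; the field widths and the ISO date layout are assumptions made for illustration, chosen so that plain lexicographic sorting reproduces the per-patient chronological ordering the study relies on:

```python
# Sketch of the M1M2M3.F2 file-naming scheme described above
# (field widths are assumptions; the paper does not fix them here).
from datetime import date

def record_filename(patient_id: str, event_date: date,
                    author_id: str, ext: str) -> str:
    # M1 = patient identifier, M2 = date (YYYYMMDD sorts chronologically),
    # M3 = author identifier, F2 = software-specific extension.
    return f"{patient_id}{event_date:%Y%m%d}{author_id}.{ext}"

files = [
    record_filename("P0001", date(2010, 3, 14), "NS", "doc"),
    record_filename("P0001", date(2009, 12, 1), "NS", "jpg"),
    record_filename("P0002", date(2010, 1, 5), "AB", "pdf"),
]
# Plain lexicographic sorting groups files by patient, then by date,
# yielding the cumulative chronological record without any database.
print(sorted(files))
```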

  15. Analysis of reactor capital costs and correlated sampling of economic input variables - 15342

    International Nuclear Information System (INIS)

    Ganda, F.; Kim, T.K.; Taiwo, T.A.; Wigeland, R.

    2015-01-01

    In this paper we present work aimed at enhancing the capability to perform nuclear fuel cycle cost estimates and evaluation of financial risk. Reactor capital costs are of particular relevance, since they typically comprise about 60% to 70% of the calculated Levelized Cost of Electricity at Equilibrium (LCAE). The work starts with the collection of historical construction cost and construction duration of nuclear plants in the U.S. and France, as well as forecasted costs of nuclear plants currently under construction in the U.S. This data has the primary goal of supporting the introduction of an appropriate framework, supported in this paper by two case studies with historical data, which allows the development of solid and defensible assumptions on nuclear reactor capital costs. Work is also presented on the enhancement of the capability to model interdependence of cost estimates between facilities and uncertainties. The correlated sampling capabilities in the nuclear economic code NECOST have been expanded to include partial correlations between input variables, according to a given correlation matrix. Accounting for partial correlations correctly allows a narrowing, where appropriate, of the probability density function of the difference in the LCAE between alternative, but correlated, fuel cycles. It also allows the correct calculation of the standard deviation of the LCAE of multistage systems, which appears smaller than the correct value if correlated input costs are treated as uncorrelated. (authors)
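
    The partial-correlation sampling described above can be sketched with a multivariate normal draw built from a correlation matrix; the numbers and the choice of a normal distribution are illustrative assumptions, not NECOST's actual inputs:

```python
# Minimal sketch of sampling partially correlated economic inputs
# from a given correlation matrix (illustrative of the capability
# described above; values and distributions are invented).
import numpy as np

rng = np.random.default_rng(42)

corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])          # requested correlation between inputs
means = np.array([4000.0, 2000.0])     # e.g. capital cost inputs, $/kWe
stdevs = np.array([800.0, 300.0])

# Covariance matrix from the correlation matrix and standard deviations.
cov = corr * np.outer(stdevs, stdevs)
samples = rng.multivariate_normal(means, cov, size=100_000)

# The sampled correlation should closely reproduce the requested one.
print(np.corrcoef(samples.T)[0, 1])
```

    Treating such inputs as uncorrelated would overstate the spread of derived quantities, which is exactly the LCAE standard-deviation effect the abstract describes.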

  16. Development of an Input Model to MELCOR 1.8.5 for the Oskarshamn 3 BWR

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, Lars [Lentek, Nykoeping (Sweden)

    2006-05-15

    An input model has been prepared for the code MELCOR 1.8.5 for the Swedish Oskarshamn 3 Boiling Water Reactor (O3). This report describes the modelling work and the various files which comprise the input deck. Input data are mainly based on original drawings and system descriptions made available by courtesy of OKG AB. Comparison and checking of some primary system data were made against an O3 input file for the SCDAP/RELAP5 code that was used in the SARA project. Useful information was also obtained from the FSAR (Final Safety Analysis Report) for O3 and the SKI report '2003 Stoerningshandboken BWR'. The input models the O3 reactor in its current state with an operating power of 3300 MW{sub th}. One aim of this work is that the MELCOR input could also be used for power upgrading studies. All fuel assemblies are thus assumed to consist of the new Westinghouse-Atom SVEA-96 Optima2 fuel. MELCOR is a severe accident code developed by Sandia National Laboratories under contract from the U.S. Nuclear Regulatory Commission (NRC). MELCOR is a successor to STCP (Source Term Code Package) and thus has a long evolutionary history. The input described here is adapted to version 1.8.5, the latest available when the work began. It was released in the year 2000, but a new version, 1.8.6, was distributed recently. Conversion to the new version is recommended. (During the writing of this report yet another code version, MELCOR 2.0, was announced for release shortly.) In version 1.8.5 there is an option to describe the accident progression in the lower plenum and the melt-through of the reactor vessel bottom in more detail by use of the Bottom Head (BH) package developed by Oak Ridge National Laboratory especially for BWRs. This is in addition to the ordinary MELCOR COR package. Since problems arose when running with the BH input, two versions of the O3 input deck were produced, a NONBH and a BH deck. The BH package is no longer a separate package in the new 1

  17. 76 FR 12155 - Self-Regulatory Organizations; BATS Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Science.gov (United States)

    2011-03-04

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-63969; File No. SR-BATS-2011-007] Self-Regulatory Organizations; BATS Exchange, Inc.; Notice of Filing and Immediate Effectiveness of Proposed Rule Change by BATS Exchange, Inc. to Adopt BATS Rule 11.21, entitled ``Input of Accurate Information...

  18. A generic method for automatic translation between input models for different versions of simulation codes

    International Nuclear Information System (INIS)

    Serfontein, Dawid E.; Mulder, Eben J.; Reitsma, Frederik

    2014-01-01

    A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as those for the VSOP codes, often are very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. Therefore the task of, for instance, nuclear regulators to verify the accuracy of such translated files can be very difficult and cumbersome. This may cause translation errors not to be picked up, which may have disastrous consequences later on when a reactor with such a faulty design is built. Therefore a generic algorithm for producing such automatic translation codes may ease the translation and verification process to a great extent. It will also remove human error from the process, which may significantly enhance the accuracy and reliability of the process. The developed algorithm also automatically creates a verification log file which permanently records the names and values of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications.
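
    The core of such a translator, renaming each variable while writing a permanent verification log, might look like the following sketch; the variable names and mapping are invented for illustration and are not the actual VSOP formats:

```python
# Hedged sketch of the translate-and-log idea: every variable written
# to the new input format is also recorded in a verification log
# (field names here are invented; real VSOP decks are far larger).

def translate(old_model: dict, name_map: dict, log_path: str) -> dict:
    new_model = {}
    with open(log_path, "w") as log:
        for old_name, value in old_model.items():
            new_name = name_map.get(old_name, old_name)
            new_model[new_name] = value
            # Permanent record of each variable's old name, new name
            # and value, so a reviewer can verify line by line.
            log.write(f"{old_name} -> {new_name} = {value!r}\n")
    return new_model

new = translate({"NCORE": 12, "POWER": 400.0},
                {"NCORE": "n_core_layers", "POWER": "thermal_power_mw"},
                "translation.log")
```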

  19. A generic method for automatic translation between input models for different versions of simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Serfontein, Dawid E., E-mail: Dawid.Serfontein@nwu.ac.za [School of Mechanical and Nuclear Engineering, North West University (PUK-Campus), PRIVATE BAG X6001 (Internal Post Box 360), Potchefstroom 2520 (South Africa); Mulder, Eben J. [School of Mechanical and Nuclear Engineering, North West University (South Africa); Reitsma, Frederik [Calvera Consultants (South Africa)

    2014-05-01

    A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as those for the VSOP codes, often are very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. Therefore the task of, for instance, nuclear regulators to verify the accuracy of such translated files can be very difficult and cumbersome. This may cause translation errors not to be picked up, which may have disastrous consequences later on when a reactor with such a faulty design is built. Therefore a generic algorithm for producing such automatic translation codes may ease the translation and verification process to a great extent. It will also remove human error from the process, which may significantly enhance the accuracy and reliability of the process. The developed algorithm also automatically creates a verification log file which permanently records the names and values of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications.

  20. READKO - a subroutine package for centralized input- and output operations in KAPROS (Version 1.8)

    International Nuclear Information System (INIS)

    Kuefner, K.

    1982-06-01

    Subroutines are presented for the FORTRAN statements related to unformatted (binary) data transfer. The calls are identical for all file realizations supported by KAPROS (data blocks, external units). Only at run-time of a module are the actual realizations of the files fixed. In this way, the input and output statements of a program are centralized, thereby greatly facilitating the KAPROS implementation of a program as well as the adoption of a KAPROS module for a 'stand-alone' environment. (orig.) [de

  1. PERTV: A standard file version of the PERT-V code

    International Nuclear Information System (INIS)

    George, D.C.; LaBauve, R.J.

    1988-02-01

    The PERT-V code, used for two-dimensional perturbation-theory fast reactor analysis, has been modified to accept input data from the standard files ISOTXS, GEODST, ZNATDN, NDXSRF, DLAYXS, RTFLUX, and ATFLUX. This modification has greatly reduced the additional input that must be supplied by the user. The new version of PERT-V, PERTV, has all the options of the original code including a plotting capability. 10 refs., 3 figs., 12 tabs

  2. Data entry system for INIS input using a personal computer

    International Nuclear Information System (INIS)

    Ishikawa, Masashi

    1990-01-01

    Input preparation for the INIS (International Nuclear Information System) has been performed by the Japan Atomic Energy Research Institute since 1970. Instead of input data preparation on worksheets made out with typewriters, a new method is introduced with which data can be input directly onto a diskette using personal computers. With the popularization of personal computers and word processors, this system is easily applied to other systems, so its outline and future development are described. A shortcoming of this system is that spell-checking and data entry using authority files are hardly possible because of the limitations of the hardware resources, and that data code conversion is needed because the code systems used by the personal computer and the mainframe computer are quite different from each other. On the other hand, improved timeliness of data entry is expected, without duplication of keying. (author)

  3. C2x: A tool for visualisation and input preparation for CASTEP and other electronic structure codes

    Science.gov (United States)

    Rutter, M. J.

    2018-04-01

    The c2x code fills two distinct roles. Its first role is in acting as a converter between the binary format .check files from the widely-used CASTEP [1] electronic structure code and various visualisation programs. Its second role is to manipulate and analyse the input and output files from a variety of electronic structure codes, including CASTEP, ONETEP and VASP, as well as the widely-used 'Gaussian cube' file format. Analysis includes symmetry analysis, and manipulation includes arbitrary cell transformations. It continues to be under development, with growing functionality, and is written in a form which would make it easy to extend it to work directly with files from other electronic structure codes. Data which c2x is capable of extracting from CASTEP's binary checkpoint files include charge densities, spin densities, wavefunctions, relaxed atomic positions, forces, the Fermi level, the total energy, and symmetry operations. It can recreate .cell input files from checkpoint files. Volumetric data can be output in formats useable by many common visualisation programs, and c2x will itself calculate integrals, expand data into supercells, and interpolate data via combinations of Fourier and trilinear interpolation. It can extract data along arbitrary lines (such as lines between atoms) as 1D output. C2x is able to convert between several common formats for describing molecules and crystals, including the .cell format of CASTEP. It can construct supercells, reduce cells to their primitive form, and add specified k-point meshes. It uses the spglib library [2] to report symmetry information, which it can add to .cell files. C2x is a command-line utility, so is readily included in scripts. It is available under the GPL and can be obtained from http://www.c2x.org.uk. It is believed to be the only open-source code which can read CASTEP's .check files, so it will have utility in other projects.

  4. Systematic pseudopotentials from reference eigenvalue sets for DFT calculations: Pseudopotential files

    Directory of Open Access Journals (Sweden)

    Pablo Rivero

    2015-06-01

    Full Text Available We present in this article a pseudopotential (PP) database for DFT calculations in the context of the SIESTA code [1–3]. Comprehensive optimized PPs in two formats (psf files and input files for the ATM program) are provided for 20 chemical elements for LDA and GGA exchange-correlation potentials. Our data represents a validated database of PPs for SIESTA DFT calculations. Extensive transferability tests guarantee the usefulness of these PPs.

  5. Computerized index for teaching files

    International Nuclear Information System (INIS)

    Bramble, J.M.

    1989-01-01

    A computerized index can be used to retrieve cases from a teaching file that have radiographic findings similar to an unknown case. The probability that a user will review cases with a correct diagnosis was estimated using the radiographic findings of arthritis in hand radiographs of 110 cases from a teaching file. The nearest-neighbor classification algorithm was used as a computer index to the 110 cases of arthritis. Each case was treated as an unknown and input to the computer index. The accuracy of the computer index in retrieving cases with the same diagnosis (including rheumatoid arthritis, gout, psoriatic arthritis, inflammatory osteoarthritis, and pyrophosphate arthropathy) was measured. A Bayes classifier algorithm was also tested on the same database. The estimated accuracy of the nearest-neighbor algorithm was 83%; by comparison, the estimated accuracy of the Bayes classifier algorithm was 78%. Conclusions: A computerized index to a teaching file based on the nearest-neighbor algorithm should allow the user to review cases with the correct diagnosis of an unknown case by entering the findings of the unknown case.
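
    A nearest-neighbour index over binary findings, as described above, can be sketched in a few lines; the feature columns, cases and diagnoses below are invented examples, not the study's data:

```python
# Sketch of a nearest-neighbour index over binary radiographic
# findings (cases and findings are invented for illustration).
import numpy as np

# Each row: presence/absence of four findings for one stored case.
features = np.array([
    [1, 0, 1, 0],   # case 0
    [1, 1, 0, 0],   # case 1
    [0, 0, 1, 1],   # case 2
])
diagnoses = ["rheumatoid arthritis", "gout", "psoriatic arthritis"]

def nearest_case(query):
    # Hamming distance from the query to every stored case;
    # return the diagnosis of the closest one.
    dists = np.abs(features - np.asarray(query)).sum(axis=1)
    return diagnoses[int(np.argmin(dists))]

print(nearest_case([1, 1, 0, 1]))  # -> gout (closest to case 1)
```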

  6. Neutron activation analysis using Excel files and Canberra Genie-2000

    International Nuclear Information System (INIS)

    Landsberger, S.; Jackman, K.; Welch, L.

    2005-01-01

    A method for analyzing neutron activated sample data by using Microsoft Excel as the analysis engine has been developed. A simple technique for inputting data is based on report files generated by Canberra's Genie-2000 spectroscopy system but could be easily modified to support other vendors having report formats with consistent text placement. A batch program handles operating an automatic sample changer, acquiring the data, and analyzing the spectrum to create a report of the peak locations and net area. The entire report is then transferred to within an Excel spreadsheet as the source data for neutron activation analysis. Unique Excel templates have been designed, for example, to accommodate short-lived and long-lived isotopes. This process provides a largely integrated solution to NAA while providing the results in an industry standard spreadsheet format. This software is ideally suited for teaching and training purposes. (author)

  7. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to read in Computer Aided Design (CAD) files

    International Nuclear Information System (INIS)

    Schwarz, Randy A.; Carter, Leeland L.

    2004-01-01

    Monte Carlo N-Particle Transport Code (MCNP) (Reference 1) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle (References 2 to 11) is recognized internationally as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant enhanced the capabilities of the MCNP Visual Editor to allow it to read in a 2D Computer Aided Design (CAD) file, allowing the user to modify and view the 2D CAD file and then electronically generate a valid MCNP input geometry with a user specified axial extent

  8. User's and reference guide to the INEL RML/analytical radiochemistry sample tracking database version 1.00

    International Nuclear Information System (INIS)

    Femec, D.A.

    1995-09-01

    This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.

  9. Neutron metrology file NMF-90. An integrated database for performing neutron spectrum adjustment calculations

    International Nuclear Information System (INIS)

    Kocherov, N.P.

    1996-01-01

    The Neutron Metrology File NMF-90 is an integrated database for performing neutron spectrum adjustment (unfolding) calculations. It contains 4 different adjustment codes, the dosimetry reaction cross-section library IRDF-90/NMF-G with covariance files, 6 input data sets for reactor benchmark neutron fields, and a number of utility codes for processing and plotting the input and output data. The package consists of 9 PC HD diskettes and manuals for the codes. It is distributed by the Nuclear Data Section of the IAEA on request, free of charge. About 10 MB of disk space is needed to install and run a typical reactor neutron dosimetry unfolding problem. (author). 8 refs

  10. Study and development of a document file system with selective access; Etude et realisation d'un systeme de fichiers documentaires a acces selectif

    Energy Technology Data Exchange (ETDEWEB)

    Mathieu, Jean-Claude

    1974-06-21

    The objective of this research thesis was to design and develop a set of software aimed at the efficient management of a document file system using methods of selective access to information. Thus, the three main aspects of file processing (creation, modification, reorganisation) have been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of the INIS bibliographic records provided by the IAEA which were used to create the files. In the second part, he briefly describes the general organisation of the file system. This system is based on the use of two main files: an inverse file, which contains for each descriptor the list of numbers of the records indexed by that descriptor, and a dictionary of descriptors, or input file, which gives access to the inverse file. The organisation of both these files is then described in detail. Other related or associated files are created, and the overall architecture and mechanisms integrated into the file data input software are described, as well as the various processing operations applied to these different files. Performance and possible developments are finally discussed.
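
    The two-file organisation described above, a dictionary of descriptors pointing into an inverse file of record numbers, is the classic inverted index; a minimal sketch with invented records:

```python
# Sketch of the inverse-file organisation described above:
# descriptor -> list of the numbers of the records it indexes
# (records and descriptors are invented for illustration).
from collections import defaultdict

records = {
    1: {"reactor", "safety"},
    2: {"reactor", "fuel"},
    3: {"safety"},
}

# Build the inverse file by scanning records in numeric order.
inverse = defaultdict(list)
for number, descriptors in sorted(records.items()):
    for d in descriptors:
        inverse[d].append(number)

def search(descriptor):
    # Selective access: one lookup returns all matching record numbers.
    return inverse.get(descriptor, [])

print(search("reactor"))  # -> [1, 2]
```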

  11. The use of the hybrid K-edge densitometer for routine analysis of safeguards verification samples of reprocessing input liquor

    International Nuclear Information System (INIS)

    Ottmar, H.; Eberle, H.

    1991-01-01

    Following successful tests of a hybrid K-edge instrument at TUI Karlsruhe and the routine use of a K-edge densitometer for safeguards verification at the same laboratory, the Euratom Safeguards Directorate of the Commission of the European Communities decided to install the first such instrument into a large industrial reprocessing plant for the routine verification of samples taken from the input accountancy tanks. This paper reports on the installation, calibration, sample handling procedure and the performance of this instrument after one year of routine operation

  12. mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data.

    Science.gov (United States)

    Larralde, Martin; Lawson, Thomas N; Weber, Ralf J M; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R; Steinbeck, Christoph; Salek, Reza M

    2017-08-15

    Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  13. ProMC: Input-output data format for HEP applications using varint encoding

    Science.gov (United States)

    Chekanov, S. V.; May, E.; Strand, K.; Van Gemmeren, P.

    2014-10-01

    A new data format for Monte Carlo (MC) events, or any structural data, including experimental data, is discussed. The format is designed to store data in a compact binary form using variable-size integer encoding as implemented in Google's Protocol Buffers package. This approach is implemented in the PROMC library which produces smaller file sizes for MC records compared to the existing input-output libraries used in high-energy physics (HEP). Other important features of the proposed format are a separation of abstract data layouts from concrete programming implementations, self-description and random access. Data stored in PROMC files can be written, read and manipulated in a number of programming languages, such as C++, Java, Fortran and Python.
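
    The variable-size integer ("varint") encoding underlying the format is the base-128 scheme used by Protocol Buffers: small integers occupy fewer bytes, which is what shrinks typical MC event records. A minimal sketch:

```python
# Base-128 varint encoding as used by Protocol Buffers: each byte
# carries 7 payload bits; a set high bit means more bytes follow.

def encode_varint(n: int) -> bytes:
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)   # continuation bit set
        else:
            out.append(byte)          # last byte: high bit clear
            return bytes(out)

def decode_varint(data: bytes) -> int:
    result = shift = 0
    for byte in data:
        result |= (byte & 0x7F) << shift
        shift += 7
    return result

assert decode_varint(encode_varint(300)) == 300
print(encode_varint(300))  # two bytes instead of a fixed-width four
```

    A particle momentum stored as a small scaled integer thus costs one or two bytes rather than a fixed four or eight, at the price of variable-length decoding.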

  14. Activation cross section data file, (1)

    International Nuclear Information System (INIS)

    Yamamuro, Nobuhiro; Iijima, Shungo.

    1989-09-01

    To evaluate radioisotope production due to neutron irradiation in fission or fusion reactors, data for the activation cross sections ought to be provided. It is planned to file more than 2000 activation cross sections in the final version. In the current year, the neutron cross sections for 14 elements from Ni to W have been calculated and evaluated in the energy range 10^-5 to 20 MeV. The calculations with SINCROS, a simplified-input nuclear cross section calculation system, are described, and another method of evaluation, which is consistent with JENDL-3, is also mentioned. The results of the cross section calculations are in good agreement with experimental data and they were stored in files 8, 9 and 10 of the ENDF/B format. (author)

  15. An information retrieval system for research file data

    Science.gov (United States)

    Joan E. Lengel; John W. Koning

    1978-01-01

    Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed.

  16. ITOUGH2 sample problems

    International Nuclear Information System (INIS)

    Finsterle, S.

    1997-11-01

    This report contains a collection of ITOUGH2 sample problems. It complements the ITOUGH2 User's Guide [Finsterle, 1997a] and the ITOUGH2 Command Reference [Finsterle, 1997b]. ITOUGH2 is a program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis. It is based on the TOUGH2 simulator for non-isothermal multiphase flow in fractured and porous media [Pruess, 1987, 1991a]. The report ITOUGH2 User's Guide [Finsterle, 1997a] describes the inverse modeling framework and provides the theoretical background. The report ITOUGH2 Command Reference [Finsterle, 1997b] contains the syntax of all ITOUGH2 commands. This report describes a variety of sample problems solved by ITOUGH2. Table 1.1 contains a short description of the seven sample problems discussed in this report. The TOUGH2 equation-of-state (EOS) module that needs to be linked to ITOUGH2 is also indicated. Each sample problem focuses on a few selected issues shown in Table 1.2. ITOUGH2 input features and the usage of program options are described. Furthermore, interpretations of selected inverse modeling results are given. Problem 1 is a multipart tutorial, describing basic ITOUGH2 input files for the main ITOUGH2 application modes; no interpretation of results is given. Problem 2 focuses on non-uniqueness, residual analysis, and correlation structure. Problem 3 illustrates a variety of parameter and observation types, and describes parameter selection strategies. Problem 4 compares the performance of minimization algorithms and discusses model identification. Problem 5 explains how to set up a combined inversion of steady-state and transient data. Problem 6 provides a detailed residual and error analysis. Finally, Problem 7 illustrates how the estimation of model-related parameters may help compensate for errors in that model.

  17. Program for the Generation of MCNP Inputs from State Files of CAREM

    International Nuclear Information System (INIS)

    Leszczynski, Francisco; Lopasso, Edmundo; Villarino, E

    2000-01-01

    The objective of this work is the development and testing of a detailed input model for the Monte Carlo program MCNP, able to represent the core of the CAREM reactor with the detail included in the updated models, in order to have available a calculation system that allows the production of reliable results to be compared with results obtained with the system used today for designing the CAREM reactor core (CONDOR-CITVAP). The model includes the possibility of varying coolant temperature and density, and fuel temperature and number densities. The detail consists of 21 different fuel elements (symmetry 3) and 14 axial zones. Results of comparisons of reactivity and power peaking factors between MCNP and CONDOR-CITVAP are presented. On average, these results show an acceptable agreement for all the compared parameters. The CONDOR-CITVAP-MCNP interface program, developed for generating material inputs for MCNP from CONDOR and CITVAP outputs for different reactor states, is also described.

  18. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generating unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.

  19. Method for data compression by associating complex numbers with files of data values

    Science.gov (United States)

    Feo, John Thomas; Hanks, David Carlton; Kraay, Thomas Arthur

    1998-02-10

    A method for compressing data for storage or transmission. Given a complex polynomial and a value assigned to each root, a root generated data file (RGDF) is created, one entry at a time. Each entry is mapped to a point in a complex plane. An iterative root finding technique is used to map the coordinates of the point to the coordinates of one of the roots of the polynomial. The value associated with that root is assigned to the entry. An equational data compression (EDC) method reverses this procedure. Given a target data file, the EDC method uses a search algorithm to calculate a set of m complex numbers and a value map that will generate the target data file. The error between a simple target data file and generated data file is typically less than 10%. Data files can be transmitted or stored without loss by transmitting the m complex numbers, their associated values, and an error file whose size is at most one-tenth of the size of the input data file.
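The core step of the RGDF construction, mapping a point in the complex plane to one of the polynomial's roots by an iterative root finder, can be sketched with Newton's method. The polynomial below (z^3 - 1) and the starting point are illustrative, not from the patent.

```python
def newton_root(coeffs, z, steps=50):
    """Drive a complex point z to a root of the polynomial given by coeffs
    (highest degree first) using Newton's method."""
    # Coefficients of the derivative polynomial
    deriv = [c * (len(coeffs) - 1 - i) for i, c in enumerate(coeffs[:-1])]
    for _ in range(steps):
        p = 0j
        for c in coeffs:          # Horner evaluation of p(z)
            p = p * z + c
        dp = 0j
        for c in deriv:           # Horner evaluation of p'(z)
            dp = dp * z + c
        if abs(dp) < 1e-14:
            break
        z = z - p / dp
    return z

# z^3 - 1 has roots at the three cube roots of unity
root = newton_root([1, 0, 0, -1], 0.5 + 0.1j)
print(root)  # converges to the root at 1
```

In the compression scheme, the value map then assigns each entry the data value associated with the root its point converged to.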

  20. Improved Detection of Extended Spectrum Beta-Lactamase (ESBL)-Producing Escherichia coli in Input and Output Samples of German Biogas Plants by a Selective Pre-Enrichment Procedure

    Science.gov (United States)

    Schauss, Thorsten; Glaeser, Stefanie P.; Gütschow, Alexandra; Dott, Wolfgang; Kämpfer, Peter

    2015-01-01

    The presence of extended-spectrum beta-lactamase (ESBL)-producing Escherichia coli was investigated in input (manure from livestock husbandry) and output samples of six German biogas plants in 2012 (one sampling per biogas plant) and two German biogas plants investigated in an annual cycle four times in 2013/2014. ESBL-producing Escherichia coli were cultured by direct plating on CHROMagar ESBL from input samples in the range of 10^0 to 10^4 colony forming units (CFU) per g dry weight but not from output samples. This initially indicated a complete elimination of ESBL-producing E. coli by the biogas plant process. Detected non-target bacteria were assigned to the genera Acinetobacter, Pseudomonas, Bordetella, Achromobacter, Castellaniella, and Ochrobactrum. A selective pre-enrichment procedure increased the detection efficiency of ESBL-producing E. coli in input samples and enabled the detection in five of eight analyzed output samples. In total, 119 ESBL-producing E. coli were isolated from input and 46 from output samples. Most of the E. coli isolates carried CTX-M-type and/or TEM-type beta-lactamases (94%), a few SHV-type beta-lactamases (6%). Sixty-four blaCTX-M genes were characterized in more detail and assigned mainly to CTX-M groups 1 (85%) and 9 (13%), and one to group 2. Phylogenetic grouping of 80 E. coli isolates showed that most were assigned to group A (71%) and B1 (27%), only one to group D (2%). Genomic fingerprinting and multilocus sequence typing (MLST) showed a high clonal diversity with 41 BOX-types and 19 ST-types. The two most common ST-types were ST410 and ST1210. Antimicrobial susceptibility testing of 46 selected ESBL-producing E. coli revealed that several isolates were additionally resistant to other veterinary relevant antibiotics and some grew on CHROMagar STEC, but Shiga-like toxin (SLT) genes were not detected. Resistance to carbapenems was not detected. In summary, the study showed for the first time the presence of ESBL-producing E. coli in

  1. Improved detection of extended spectrum beta-lactamase (ESBL)-producing Escherichia coli in input and output samples of German biogas plants by a selective pre-enrichment procedure.

    Directory of Open Access Journals (Sweden)

    Thorsten Schauss

    Full Text Available The presence of extended-spectrum beta-lactamase (ESBL)-producing Escherichia coli was investigated in input (manure from livestock husbandry) and output samples of six German biogas plants in 2012 (one sampling per biogas plant) and two German biogas plants investigated in an annual cycle four times in 2013/2014. ESBL-producing Escherichia coli were cultured by direct plating on CHROMagar ESBL from input samples in the range of 10^0 to 10^4 colony forming units (CFU) per g dry weight but not from output samples. This initially indicated a complete elimination of ESBL-producing E. coli by the biogas plant process. Detected non-target bacteria were assigned to the genera Acinetobacter, Pseudomonas, Bordetella, Achromobacter, Castellaniella, and Ochrobactrum. A selective pre-enrichment procedure increased the detection efficiency of ESBL-producing E. coli in input samples and enabled the detection in five of eight analyzed output samples. In total, 119 ESBL-producing E. coli were isolated from input and 46 from output samples. Most of the E. coli isolates carried CTX-M-type and/or TEM-type beta-lactamases (94%), a few SHV-type beta-lactamases (6%). Sixty-four blaCTX-M genes were characterized in more detail and assigned mainly to CTX-M groups 1 (85%) and 9 (13%), and one to group 2. Phylogenetic grouping of 80 E. coli isolates showed that most were assigned to group A (71%) and B1 (27%), only one to group D (2%). Genomic fingerprinting and multilocus sequence typing (MLST) showed a high clonal diversity with 41 BOX-types and 19 ST-types. The two most common ST-types were ST410 and ST1210. Antimicrobial susceptibility testing of 46 selected ESBL-producing E. coli revealed that several isolates were additionally resistant to other veterinary relevant antibiotics and some grew on CHROMagar STEC, but Shiga-like toxin (SLT) genes were not detected. Resistance to carbapenems was not detected. In summary, the study showed for the first time the presence of ESBL-producing E

  2. Methodology for converting CT medical images to MCNP input using the Scan2MCNP system

    International Nuclear Information System (INIS)

    Boia, L.S.; Silva, A.X.; Cardoso, S.C.; Castro, R.C.

    2009-01-01

    This paper presents a methodology for the Scan2MCNP application software, which converts DICOM (Digital Imaging and Communications in Medicine) medical images into an MCNP input file. Scan2MCNP handles and processes the medical images generated by CT equipment, allowing the user to select and parameterize the study region of interest (tissues and organs). The image information worked up in the software is then translated into the geometry and material language of the MCNP radiation transport code through the generation of a code input file. With this file, it is possible to simulate the type and level of radiation of the treatment proposed by the medical staff responsible for the patient. Within a computationally oriented process, Scan2MCNP can thereby contribute, along with other software recently adopted in medical physics, to improving the quality and precision of radiotherapy treatments. In this work, DICOM medical images of the anthropomorphic Rando phantom were used in the analysis and development of the Scan2MCNP software. It is emphasized, however, that the success of the software in a given situation depends on a number of auxiliary procedures and programs that support the solution of particular problems in the radiation treatment, or speed up the work of the medical physics team. (author)
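The central conversion step, binning CT voxel values into MCNP material numbers for a lattice geometry, can be sketched as follows. The Hounsfield thresholds and material ids below are illustrative assumptions, not those used by Scan2MCNP.

```python
def voxel_to_material(hu):
    """Map a Hounsfield unit value to an illustrative MCNP material number."""
    if hu < -400:
        return 1   # air / lung
    if hu < 100:
        return 2   # soft tissue
    return 3       # bone

def slice_to_lattice(hu_rows):
    """Convert a 2-D list of HU values to rows of lattice fill entries."""
    return [[voxel_to_material(hu) for hu in row] for row in hu_rows]

lattice = slice_to_lattice([[-1000, 40], [300, -500]])
print(lattice)  # [[1, 2], [3, 1]]
```

A full converter would also assign a mass density per bin and write the lattice as a FILL array in the MCNP input deck.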

  3. Improvement of sample preparation for input plutonium accountability measurement by isotope dilution gamma-ray spectroscopy

    International Nuclear Information System (INIS)

    Nishida, K.; Kuno, Y.; Sato, S.; Masui, J.; Li, T.K.; Parker, J.L.; Hakkila, E.A.

    1992-01-01

    The sample preparation method for the isotope dilution gamma-ray spectrometry (IDGS) technique has been further improved for simultaneously determining the plutonium concentration and isotopic composition of highly irradiated spent-fuel dissolver solutions. The improvement includes using ion-exchange filter papers (instead of resin beads, as in two previous experiments) for better separation and recovery of plutonium from fission products. The results of IDGS measurements for five dissolver solutions agree with those obtained by mass spectrometry to within ∼0.4% for plutonium concentration and ∼0.1% for 239Pu isotopic composition. The precision of the plutonium concentration is ∼1% with a 1-h count time. The technique could be implemented as an alternative method for input accountability and verification measurements in reprocessing plants
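The mass balance behind isotope dilution can be sketched in its simplest form: when a known amount of a spike isotope that is absent from the sample (e.g. 242Pu) is added, the measured isotope ratio in the blend gives the sample content directly. The numbers below are illustrative, not from the study.

```python
def plutonium_from_spike(ratio_239_242, spike_242_mol):
    """Moles of 239Pu in the sample, given the measured 239/242 atom ratio
    in the spiked blend and the known moles of a 242Pu spike.

    Assumes 242Pu is absent from the original sample, so all 242Pu in the
    blend comes from the spike.
    """
    return ratio_239_242 * spike_242_mol

# Illustrative: measured ratio 125 with a 4 micromole spike
n239 = plutonium_from_spike(ratio_239_242=125.0, spike_242_mol=4.0e-6)
print(n239)  # 5e-4 mol of 239Pu
```

Real accountability measurements use the full isotope dilution equation with all isotopes and measured isotopic compositions; this sketch shows only the governing proportionality.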

  4. Characterization of Input Current Interharmonics in Adjustable Speed Drives

    DEFF Research Database (Denmark)

    Soltani, Hamid; Davari, Pooya; Zare, Firuz

    2017-01-01

    This paper investigates the interharmonic generation process in the input current of double-stage Adjustable Speed Drives (ASDs) based on voltage source inverters and front-end diode rectifiers. The effects of the inverter output-side low order harmonics, caused by implementing the double-edge symmetrical regularly sampled Space Vector Modulation (SVM) technique, on the input current interharmonic components are presented and discussed. Particular attention is also given to the influence of the asymmetrical regularly sampled modulation technique on the drive input current interharmonics. The developed theoretical analysis predicts the drive interharmonic frequency locations with respect to the selected sampling strategies. Simulation and experimental results on a 2.5 kW ASD system verify the effectiveness of the theoretical analysis.

  5. EQPT, a data file preprocessor for the EQ3/6 software package: User's guide and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Daveler, S.A.; Wolery, T.J.

    1992-01-01

    EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets. These data files contain both standard state and activity coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer's (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25 degrees C only to 0-300 degrees C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fit to data which are input on temperature grids. The coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer's equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive form parameters makes it easier to evaluate Pitzer's equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers
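The first transformation EQPT performs, replacing a temperature grid of values with the coefficients of an interpolating polynomial, can be sketched with a least-squares fit. The grid and log K values below are illustrative, not taken from an actual data0 file.

```python
import numpy as np

# Illustrative log K values on a 0-300 degrees C temperature grid
temps = np.array([0.0, 25.0, 60.0, 100.0, 150.0, 200.0, 250.0, 300.0])
logk = np.array([14.94, 14.00, 13.02, 12.26, 11.64, 11.28, 11.17, 11.30])

# Fit a low-order polynomial to stand in for the grid, as EQPT does
coeffs = np.polyfit(temps, logk, deg=3)

# The modeling code can now evaluate the polynomial at any temperature
approx_at_25 = np.polyval(coeffs, 25.0)
print(approx_at_25)
```

Storing four coefficients instead of eight grid values is what lets EQ3NR and EQ6 interpolate in temperature without re-reading the original grid.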

  6. Image-derived and arterial blood sampled input functions for quantitative PET imaging of the angiotensin II subtype 1 receptor in the kidney

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Tao; Tsui, Benjamin M. W.; Li, Xin; Vranesic, Melin; Lodge, Martin A.; Gulaldi, Nedim C. M.; Szabo, Zsolt, E-mail: zszabo@jhmi.edu [Russell H. Morgan Department of Radiology and Radiological Science, The Johns Hopkins School of Medicine, Baltimore, Maryland 21287 (United States)

    2015-11-15

    Purpose: The radioligand {sup 11}C-KR31173 has been introduced for positron emission tomography (PET) imaging of the angiotensin II subtype 1 receptor in the kidney in vivo. To study the biokinetics of {sup 11}C-KR31173 with a compartmental model, the input function is needed. Collection and analysis of arterial blood samples are the established approach to obtain the input function but they are not feasible in patients with renal diseases. The goal of this study was to develop a quantitative technique that can provide an accurate image-derived input function (ID-IF) to replace the conventional invasive arterial sampling and test the method in pigs with the goal of translation into human studies. Methods: The experimental animals were injected with [{sup 11}C]KR31173 and scanned up to 90 min with dynamic PET. Arterial blood samples were collected for the artery derived input function (AD-IF) and used as a gold standard for ID-IF. Before PET, magnetic resonance angiography of the kidneys was obtained to provide the anatomical information required for derivation of the recovery coefficients in the abdominal aorta, a requirement for partial volume correction of the ID-IF. Different image reconstruction methods, filtered back projection (FBP) and ordered subset expectation maximization (OS-EM), were investigated for the best trade-off between bias and variance of the ID-IF. The effects of kidney uptakes on the quantitative accuracy of ID-IF were also studied. Biological variables such as red blood cell binding and radioligand metabolism were also taken into consideration. A single blood sample was used for calibration in the later phase of the input function. Results: In the first 2 min after injection, the OS-EM based ID-IF was found to be biased, and the bias was found to be induced by the kidney uptake. No such bias was found with the FBP based image reconstruction method. 
However, the OS-EM based image reconstruction was found to reduce variance in the subsequent
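The two corrections central to the ID-IF method, dividing the aortic time-activity curve by a recovery coefficient for partial volume correction and rescaling it with one late blood sample, can be sketched as follows. All numbers are illustrative.

```python
def image_derived_input(roi_curve, recovery_coeff, t_sample_idx, blood_sample):
    """Partial-volume-correct an aortic ROI curve and calibrate it so that it
    matches one measured blood sample at the given frame index."""
    corrected = [c / recovery_coeff for c in roi_curve]
    scale = blood_sample / corrected[t_sample_idx]
    return [c * scale for c in corrected]

# Illustrative 4-frame curve, recovery coefficient from MR-derived aorta size
idif = image_derived_input([0.0, 80.0, 40.0, 20.0],
                           recovery_coeff=0.8,
                           t_sample_idx=3,
                           blood_sample=30.0)
print(idif)  # final point matches the measured blood sample
```

A full implementation would additionally correct for red blood cell binding and radioligand metabolism before the curve enters the compartmental model, as the abstract notes.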

  7. Prefetching in file systems for MIMD multiprocessors

    Science.gov (United States)

    Kotz, David F.; Ellis, Carla Schlatter

    1990-01-01

    The question of whether prefetching blocks on the file into the block cache can effectively reduce overall execution time of a parallel computation, even under favorable assumptions, is considered. Experiments have been conducted with an interleaved file system testbed on the Butterfly Plus multiprocessor. Results of these experiments suggest that (1) the hit ratio, the accepted measure in traditional caching studies, may not be an adequate measure of performance when the workload consists of parallel computations and parallel file access patterns, (2) caching with prefetching can significantly improve the hit ratio and the average time to perform an I/O (input/output) operation, and (3) an improvement in overall execution time has been observed in most cases. In spite of these gains, prefetching sometimes results in increased execution times (a negative result, given the optimistic nature of the study). The authors explore why it is not trivial to translate savings on individual I/O requests into consistently better overall performance and identify the key problems that need to be addressed in order to improve the potential of prefetching techniques in the environment.
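The trade-off the study measures can be reproduced with a toy block cache: prefetching the next block raises the hit ratio on sequential access, at the cost of extra reads that may or may not pay off in overall execution time. The cache size and access pattern below are illustrative.

```python
from collections import OrderedDict

def run_cache(accesses, capacity, prefetch):
    """Simulate an LRU block cache; optionally prefetch block b+1 on a miss
    on block b. Returns the hit ratio."""
    cache = OrderedDict()
    hits = 0
    for b in accesses:
        if b in cache:
            hits += 1
            cache.move_to_end(b)          # refresh LRU position
        else:
            cache[b] = True
            if prefetch:
                cache[b + 1] = True       # one-block-ahead prefetch
            while len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(accesses)

sequential = list(range(100))
print(run_cache(sequential, capacity=8, prefetch=False))  # 0.0
print(run_cache(sequential, capacity=8, prefetch=True))   # 0.5
```

The hit ratio doubles, yet as the abstract cautions, a higher hit ratio does not automatically translate into shorter overall execution time when prefetch reads compete with demand reads.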

  8. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to Read in Computer Aided Design (CAD) Files

    International Nuclear Information System (INIS)

    Randolph Schwarz; Leland L. Carter; Alysia Schwarz

    2005-01-01

    Monte Carlo N-Particle Transport Code (MCNP) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle is internationally recognized as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant was used to enhance the capabilities of the MCNP Visual Editor to allow it to read in both 2D and 3D Computer Aided Design (CAD) files, allowing the user to electronically generate a valid MCNP input geometry

  9. Multi-Input Convolutional Neural Network for Flower Grading

    Directory of Open Access Journals (Sweden)

    Yu Sun

    2017-01-01

    Full Text Available Flower grading is a significant task because it is extremely convenient for managing flowers in greenhouses and markets. With the development of computer vision, flower grading has become an interdisciplinary focus in both botany and computer vision. A new dataset named BjfuGloxinia contains three quality grades; each grade consists of 107 samples and 321 images. A multi-input convolutional neural network is designed for large scale flower grading. The multi-input CNN achieves a satisfactory accuracy of 89.6% on BjfuGloxinia after data augmentation. Compared with a single-input CNN, the accuracy of the multi-input CNN is increased by 5% on average, demonstrating that a multi-input convolutional neural network is a promising model for flower grading. Although data augmentation contributes to the model, the accuracy is still limited by a lack of sample diversity. The majority of misclassifications derive from the medium grade. Image-processing-based bud detection is useful for reducing misclassification, increasing the accuracy of flower grading to approximately 93.9%.
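The fusion idea behind a multi-input network, separate branches whose features are concatenated before a shared classifier head, can be sketched as a tiny two-branch forward pass in NumPy. The shapes, weights, and inputs below are illustrative and bear no relation to the actual BjfuGloxinia model.

```python
import numpy as np

rng = np.random.default_rng(0)

def branch(x, w):
    """One branch: a linear layer with ReLU, standing in for a conv stack."""
    return np.maximum(x @ w, 0.0)

# Two views of the same flower (e.g. two images), flattened to vectors
x_view1, x_view2 = rng.normal(size=(1, 16)), rng.normal(size=(1, 16))
w1, w2 = rng.normal(size=(16, 8)), rng.normal(size=(16, 8))
w_head = rng.normal(size=(16, 3))  # 3 quality grades

# Concatenate branch features, then classify jointly
features = np.concatenate([branch(x_view1, w1), branch(x_view2, w2)], axis=1)
logits = features @ w_head
grade = int(np.argmax(logits))
print(grade)
```

The design choice this illustrates is that each input view gets its own feature extractor while the grading decision sees all views at once.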

  10. PREP-45, Input Preparation for CITATION-2

    International Nuclear Information System (INIS)

    Ramalho Carlos, C.A.

    1995-01-01

    1 - Description of program or function: A Fortran program has been created which saves much effort in preparing sections 004 (intervals in the coordinates) and 005 (zone numbers) of the input data file for the multigroup theory code CITATION (version CITATION-2, NESC0387/09), particularly when a fine, complicated mesh is used. 2 - Method of solution: A domain is defined for CITATION calculations by specifying its sub-domains (e.g. graphite, lead, beryllium, water and fuel sub-domains) in a compact and simple way. An independent, prior geometrical specification is made of the various types of elements envisaged to constitute the contents of the reactor core grid positions. Then the load table for the configuration is input and scanned, enabling the geometric mesh description to be produced (section 004). The zone placement (section 005) is likewise achieved by means of element description subroutines for the different types of element (which may require appropriate but simple changes in actual cases). The output of PREP45 is directly obtained in a format compatible with CITATION-2 input. 3 - Restrictions on the complexity of the problem: Only rectangular two-dimensional Cartesian coordinates are considered. A maximum of 12 sub-domains in the x direction (18 in the y direction) and up to 8 distinct element types are allowed in this version. Other limitations exist which can nevertheless be overcome with simple changes in the source program
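The mesh bookkeeping such a preprocessor automates, expanding per-sub-domain widths and interval counts into the flat list of interval widths a CITATION-style section 004 needs, can be sketched as follows. The sub-domain data are illustrative.

```python
def mesh_intervals(subdomains):
    """Expand (width_cm, n_intervals) sub-domain pairs into one flat list of
    interval widths, as needed for a CITATION-style geometry section."""
    widths = []
    for width, n in subdomains:
        widths.extend([width / n] * n)
    return widths

# e.g. 10 cm of fuel in 4 intervals, then 5 cm of reflector in 2
mesh = mesh_intervals([(10.0, 4), (5.0, 2)])
print(mesh)        # [2.5, 2.5, 2.5, 2.5, 2.5, 2.5]
print(sum(mesh))   # 15.0 (total domain width recovered)
```

Doing this once per coordinate direction, and tagging each interval with the zone number of its sub-domain, yields the data for sections 004 and 005 respectively.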

  11. FEDGROUP - A program system for producing group constants from evaluated nuclear data of files disseminated by IAEA

    International Nuclear Information System (INIS)

    Vertes, P.

    1976-06-01

    A program system for calculating group constants from several evaluated nuclear data files has been developed. These files are distributed by the Nuclear Data Section of the IAEA. Our program system - FEDGROUP - has certain advantages over well-known similar codes: 1. it requires only a medium-sized computer (approximately 20000 words of memory), 2. it is easily adaptable to any type of computer, 3. it is flexible with respect to the input evaluated nuclear data file and to the output group constant file. At present, FEDGROUP calculates practically all types of group constants needed for reactor physics calculations by using the most frequent representations of evaluated data. (author)

  12. Quality Assurance Procedures for ModCat Database Code Files

    Energy Technology Data Exchange (ETDEWEB)

    Siciliano, Edward R.; Devanathan, Ram; Guillen, Zoe C.; Kouzes, Richard T.; Schweppe, John E.

    2014-04-01

    The Quality Assurance procedures used for the initial phase of the Model Catalog Project were developed to attain two objectives, referred to as “basic functionality” and “visualization.” To ensure the Monte Carlo N-Particle model input files posted into the ModCat database meet those goals, all models considered as candidates for the database are tested, revised, and re-tested.

  13. Input modelling of ASSERT-PV V2R8M1 for RUFIC fuel bundle

    Energy Technology Data Exchange (ETDEWEB)

    Park, Joo Hwan; Suk, Ho Chun

    2001-02-01

    This report describes the input modelling for subchannel analysis of the CANFLEX-RU (RUFIC) fuel bundle, which has been developed as an advanced fuel bundle for the CANDU-6 reactor, using the ASSERT-PV V2R8M1 code. The execution file of the ASSERT-PV V2R8M1 code was recently transferred from AECL under the JRDC agreement between KAERI and AECL. ASSERT-PV V2R8M1, which is quite different from the COBRA-IV-i code, has been developed for thermalhydraulic analysis of CANDU-6 fuel channels by the subchannel analysis method and updated so that the 43-element CANDU fuel geometry can be applied. Hence, the ASSERT code can be applied to the subchannel analysis of the RUFIC fuel bundle. The present report was prepared for the ASSERT input modelling of the RUFIC fuel bundle. Since ASSERT results depend strongly on the user's input modelling, the calculation results may differ considerably among user input models. The objective of the present report is to give a detailed description of the background information for the input data, lending credibility to the calculation results.

  14. Input modelling of ASSERT-PV V2R8M1 for RUFIC fuel bundle

    International Nuclear Information System (INIS)

    Park, Joo Hwan; Suk, Ho Chun

    2001-02-01

    This report describes the input modelling for subchannel analysis of the CANFLEX-RU (RUFIC) fuel bundle, which has been developed as an advanced fuel bundle for the CANDU-6 reactor, using the ASSERT-PV V2R8M1 code. The execution file of the ASSERT-PV V2R8M1 code was recently transferred from AECL under the JRDC agreement between KAERI and AECL. ASSERT-PV V2R8M1, which is quite different from the COBRA-IV-i code, has been developed for thermalhydraulic analysis of CANDU-6 fuel channels by the subchannel analysis method and updated so that the 43-element CANDU fuel geometry can be applied. Hence, the ASSERT code can be applied to the subchannel analysis of the RUFIC fuel bundle. The present report was prepared for the ASSERT input modelling of the RUFIC fuel bundle. Since ASSERT results depend strongly on the user's input modelling, the calculation results may differ considerably among user input models. The objective of the present report is to give a detailed description of the background information for the input data, lending credibility to the calculation results

  15. Iowa Geologic Sampling Points

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Point locations of geologic samples/files in the IGS repository. Types of samples include well cuttings, outcrop samples, cores, drillers logs, measured sections,...

  16. Analysis of the Retained Gas Sample (RGS) Extruder Assembly

    International Nuclear Information System (INIS)

    Coverdell, B.L.

    1995-09-01

    In order for the Retained Gas Sample (RGS) Extruder Assembly to be used safely, the cognizant engineer determined that an analysis was necessary. The use of the finite-element analysis (FEA) program COSMOS/M version 1.71 permitted a quick, easy, and detailed stress analysis of the RGS Extruder Assembly. The FEA model is a three-dimensional model using the SHELL4T element type. From the results of the FEA, the cognizant engineer determined that the RGS extruder would be rated at 10,000 lbf and load tested to 12,000 lbf. The respective input and output files for the model are EXTR02.GFM and EXTR02.OUT and can be found on the attached tape

  17. A simple method for measurement of cerebral blood flow using 123I-IMP SPECT with calibrated standard input function by one point blood sampling. Validation of calibration by one point venous blood sampling as a substitute for arterial blood sampling

    International Nuclear Information System (INIS)

    Ito, Hiroshi; Akaizawa, Takashi; Goto, Ryoui

    1994-01-01

    In a simplified method for measuring cerebral blood flow using one 123I-IMP SPECT scan and one-point arterial blood sampling (the autoradiography method), the input function is obtained by calibrating a standard input function with one arterial blood sample. The purpose of this study is validation of calibration by one-point venous blood sampling as a substitute for one-point arterial blood sampling. After intravenous infusion of 123I-IMP, frequent arterial and venous blood samples were taken simultaneously from 12 patients with CNS disease but without any heart or lung disease and 5 normal volunteers. The ratios of the radioactivity of venous whole blood obtained from the cutaneous cubital vein to that of arterial whole blood were 0.76±0.08, 0.80±0.05, 0.81±0.06 and 0.83±0.11 at 10, 20, 30 and 50 min after 123I-IMP infusion, respectively; the venous values were consistently about 20% lower than the arterial values throughout the 50 min. However, the corresponding ratios for the cutaneous dorsal hand vein were 0.93±0.02, 0.94±0.05, 0.98±0.04 and 0.98±0.03, i.e. the venous radioactivity was consistent with that of the artery. This indicates that the arterio-venous difference in radioactivity in a peripheral cutaneous vein such as a dorsal hand vein is minimal, owing to arteriovenous shunting in the palm. Therefore, blood sampling from the cutaneous dorsal hand vein can substitute for arterial sampling. The optimal time for venous blood sampling, evaluated by error analysis, was 20 min after 123I-IMP infusion, 10 min later than that for arterial blood sampling. (author)
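The comparison in the abstract can be tabulated directly. The sketch below ranks sampling sites and times by how close the reported mean venous/arterial ratio is to 1 (a perfect arterial substitute); note this is only a closeness ranking, not the full error analysis the study used to choose its 20 min optimum.

```python
# Mean venous/arterial whole-blood activity ratios transcribed from the abstract
cubital = {10: 0.76, 20: 0.80, 30: 0.81, 50: 0.83}
dorsal_hand = {10: 0.93, 20: 0.94, 30: 0.98, 50: 0.98}

def best_substitute(site_ratios):
    """Sampling time (min) whose venous/arterial ratio is closest to 1."""
    return min(site_ratios, key=lambda t: abs(1.0 - site_ratios[t]))

print(best_substitute(cubital))      # 50 min, but still 17% low
print(best_substitute(dorsal_hand))  # 30 min, within 2% of arterial
```

The numbers make the paper's point visible at a glance: the cubital vein never gets close to arterial activity, while the dorsal hand vein does from 30 min onward.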

  18. EQPT, a data file preprocessor for the EQ3/6 software package: User`s guide and related documentation (Version 7.0); Part 2

    Energy Technology Data Exchange (ETDEWEB)

    Daveler, S.A.; Wolery, T.J.

    1992-12-17

    EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets. These data files contain both standard state and activity coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer`s (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25{degrees}C only to 0-300{degrees}C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fit to data which are input on temperature grids. The coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer`s equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive form parameters makes it easier to evaluate Pitzer`s equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers.

  19. Nuclide identifier and grat data reader application for ORIGEN output file

    International Nuclear Information System (INIS)

    Arif Isnaeni

    2011-01-01

    ORIGEN is a one-group depletion and radioactive decay computer code developed at the Oak Ridge National Laboratory (ORNL). ORIGEN performs one-group neutronics calculations providing various nuclear material characteristics (the buildup, decay and processing of radioactive materials). ORIGEN output is a text-based file containing only numbers, in the form of grouped nuclide data: nuclide identifiers and grat values. This application was created to facilitate the collection of nuclide identifier and grat data; it also has functions to acquire mass number data and to calculate the mass (grams) of each nuclide. Output from the application can be used as input data for computer codes for neutronic calculations such as MCNP. (author)
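The extraction step such a reader performs, pulling nuclide identifiers and mass values out of a text block while skipping headers, can be sketched as follows. The two-column layout below is illustrative, not the exact ORIGEN output format.

```python
def parse_nuclide_table(text):
    """Parse 'nuclide mass(g)' lines into a dict, skipping non-data lines."""
    inventory = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 2:
            nuclide, grams = parts
            try:
                inventory[nuclide] = float(grams)
            except ValueError:
                continue  # header or other non-numeric line
    return inventory

# Illustrative text block in a simplified two-column layout
sample = """nuclide  grams
u235   1.234e+03
u238   9.876e+04
pu239  5.678e+01
"""
inv = parse_nuclide_table(sample)
print(inv["pu239"])  # 56.78
```

The resulting dictionary can then be reformatted into material cards for a downstream code such as MCNP.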

  20. Report on the achievements in the Sunshine Project in fiscal 1986. Surveys on coal type selection and surveys on coal types (Data file); 1986 nendo tanshu sentei chosa tanshu chosa seika hokokusho. Data file

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1987-03-01

    This data file concerns coal types for liquefaction and accompanies the report on the achievements in the surveys on coal type selection and on coal types (JN0040843). The items of information filed include the occurrence and production of coals, various kinds of analyses, and test values relating to the liquefaction test data collected and submitted to date. The file consists of two parts: a test sample information file covering the occurrence and production of coals and coal mines, and an analysis and test file accommodating the results of the different analyses and tests. However, the test sample information files (1) through (6), covering test samples and sample collection, geography, geology, ground beds, coal beds, coal mines, development and transportation, have not yet been put in order. The analysis and test file contains (7) industrial analyses, (8) element analysis, (9) ash composition, (10) solubility of ash, (11) structure analysis, (12) liquefaction characteristics (standard version), (13) analysis of liquefaction produced gas, (14) distillation characteristics of liquefaction produced oil, (15) liquefaction characteristics (simplified version), (16) analysis of liquefaction produced gas (simplified version), and (17) distillation characteristics of liquefaction produced oil (simplified version). However, the information related to the liquefaction tests using a tubing reactor in (15) through (17) has not yet been put in order. (NEDO)

  1. ASSIST - a package of Fortran routines for handling input under specified syntax rules and for management of data structures

    International Nuclear Information System (INIS)

    Sinclair, J.E.

    1991-02-01

    The ASSIST package (A Structured Storage and Input Syntax Tool) provides for Fortran programs a means for handling data structures more general than those provided by the Fortran language, and for obtaining input to the program from a file or terminal according to specified syntax rules. The syntax-controlled input can be interactive, with automatic generation of prompts, and dialogue to correct any input errors. The range of syntax rules possible is sufficient to handle lists of numbers and character strings, keywords, commands with optional clauses, and many kinds of variable-format constructions, such as algebraic expressions. ASSIST was developed for use in two large programs for the analysis of safety of radioactive waste disposal facilities, but it should prove useful for a wide variety of applications. (author)

  2. The global unified parallel file system (GUPFS) project: FY 2002 activities and results

    Energy Technology Data Exchange (ETDEWEB)

    Butler, Gregory F.; Lee, Rei Chi; Welcome, Michael L.

    2003-04-07

    The Global Unified Parallel File System (GUPFS) project is a multiple-phase, five-year project at the National Energy Research Scientific Computing (NERSC) Center to provide a scalable, high performance, high bandwidth, shared file system for all the NERSC production computing and support systems. The primary purpose of the GUPFS project is to make it easier to conduct advanced scientific research using the NERSC systems. This is to be accomplished through the use of a shared file system providing a unified file namespace, operating on consolidated shared storage that is directly accessed by all the NERSC production computing and support systems. During its first year, FY 2002, the GUPFS project focused on identifying, testing, and evaluating existing and emerging shared/cluster file system, SAN fabric, and storage technologies; identifying NERSC user input/output (I/O) requirements, methods, and mechanisms; and developing appropriate benchmarking methodologies and benchmark codes for a parallel environment. This report presents the activities and progress of the GUPFS project during its first year, the results of the evaluations conducted, and plans for near-term and longer-term investigations.

  3. Decay data file based on the ENSDF file

    Energy Technology Data Exchange (ETDEWEB)

    Katakura, J. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    A decay data file in the JENDL (Japanese Evaluated Nuclear Data Library) format, based on the ENSDF (Evaluated Nuclear Structure Data File) file, was produced as a tentative version of one of the JENDL special purpose files. The problem of using the ENSDF file as the primary source data for the JENDL decay data file is presented. (author)

  4. Input modelling of ASSERT-PV V2R8M1 for RUFIC fuel bundle

    Energy Technology Data Exchange (ETDEWEB)

    Park, Joo Hwan; Suk, Ho Chun

    2001-02-01

    This report describes the input modelling for subchannel analysis of the CANFLEX-RU (RUFIC) fuel bundle, which has been developed as an advanced fuel bundle for the CANDU-6 reactor, using the ASSERT-PV V2R8M1 code. The execution file of the ASSERT-PV V2R8M1 code was recently transferred from AECL under the JRDC agreement between KAERI and AECL. ASSERT-PV V2R8M1, which is quite different from the COBRA-IV-i code, has been developed for thermalhydraulic analysis of a CANDU-6 fuel channel by the subchannel analysis method and has been updated so that the 43-element CANDU fuel geometry can be applied. Hence, the ASSERT code can be applied to the subchannel analysis of the RUFIC fuel bundle. The present report was prepared for the ASSERT input modelling of the RUFIC fuel bundle. Since ASSERT results depend strongly on the user's input modelling, the calculation results may differ considerably among users' input models. The objective of the present report is to give a detailed description of the background information for the input data and thereby lend credibility to the calculation results.

  5. FLUTAN input specifications

    International Nuclear Information System (INIS)

    Borgwaldt, H.; Baumann, W.; Willerding, G.

    1991-05-01

    FLUTAN is a highly vectorized computer code for 3-D fluid-dynamic and thermal-hydraulic analyses in Cartesian and cylinder coordinates. It is related to the family of COMMIX codes originally developed at Argonne National Laboratory, USA. To a large extent, FLUTAN relies on basic concepts and structures imported from COMMIX-1B and COMMIX-2, which were made available to KfK in the frame of cooperation contracts in the fast reactor safety field. While on the one hand not all features of the original COMMIX versions have been implemented in FLUTAN, the code on the other hand includes some essential innovative options like the CRESOR solution algorithm, a general 3-dimensional rebalancing scheme for solving the pressure equation, and LECUSSO-QUICK-FRAM techniques suitable for reducing 'numerical diffusion' in both the enthalpy and momentum equations. This report provides users with detailed input instructions, presents formulations of the various model options, and explains by means of comprehensive sample input how to use the code. (orig.) [de

  6. Dose calculation for 40K ingestion in samples of beans using spectrometry and MCNP

    International Nuclear Information System (INIS)

    Garcez, R.W.D.; Lopes, J.M.; Silva, A.X.; Domingues, A.M.; Lima, M.A.F.

    2014-01-01

    A method based on gamma spectroscopy and on the use of voxel phantoms to calculate the dose due to ingestion of 40K contained in bean samples is presented in this work. To quantify the activity of the radionuclide, an HPGe detector was used and the data were entered in the input file of the MCNP code. The highest value of equivalent dose was 7.83 μSv·y⁻¹, in the stomach, for white beans, whose activity of 452.4 Bq·kg⁻¹ was the highest of the five samples analyzed. The tool proved to be appropriate for calculating organ doses due to the ingestion of food. (author)
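    The basic arithmetic behind an ingestion-dose estimate can be sketched as: annual dose ≈ activity concentration × annual intake × ingestion dose coefficient. The coefficient below (6.2e-9 Sv/Bq, the commonly tabulated ICRP 72 adult value for 40K) and the intake figure are illustrative assumptions; the abstract's organ-level equivalent doses come from MCNP voxel-phantom transport, not from this shortcut.

```python
# Back-of-envelope ingestion-dose sketch (assumed coefficient and intake;
# not the MCNP voxel-phantom calculation described in the abstract).

def annual_ingestion_dose_sv(activity_bq_per_kg, intake_kg_per_year,
                             dose_coeff_sv_per_bq=6.2e-9):
    """Committed effective dose (Sv) from one year of ingestion."""
    return activity_bq_per_kg * intake_kg_per_year * dose_coeff_sv_per_bq

# White beans at 452.4 Bq/kg (the highest activity reported),
# assuming a hypothetical intake of 10 kg per year:
dose = annual_ingestion_dose_sv(452.4, 10.0)
```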

  7. Liver kinetics of glucose analogs measured in pigs by PET: importance of dual-input blood sampling

    DEFF Research Database (Denmark)

    Munk, O L; Bass, L; Roelsgaard, K

    2001-01-01

    Metabolic processes studied by PET are quantified traditionally using compartmental models, which relate the time course of the tracer concentration in tissue to that in arterial blood. For liver studies, the use of arterial input may, however, cause systematic errors to the estimated kinetic parameters, because of ignorance of the dual blood supply from the hepatic artery and the portal vein to the liver. METHODS: Six pigs underwent PET after [15O]carbon monoxide inhalation, 3-O-[11C]methylglucose (MG) injection, and [18F]FDG injection. For the glucose scans, PET data were acquired for 90 min... -input functions were very similar. CONCLUSION: Compartmental analysis of MG and FDG kinetics using dynamic PET data requires measurements of dual-input activity concentrations. Using the dual-input function, physiologically reasonable parameter estimates of K1, k2, and Vp were obtained, whereas the use...
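    The dual-input idea itself is simple to express: the liver's input curve is a flow-weighted mix of the arterial and portal-vein tracer concentrations. The sketch below is a minimal illustration, and the arterial flow fraction is an assumed example value, not a figure from the study.

```python
# Minimal sketch of a dual-input function: a flow-weighted combination of
# hepatic-artery and portal-vein tracer concentration curves.
# The default arterial_fraction is an illustrative assumption.

def dual_input(c_arterial, c_portal, arterial_fraction=0.25):
    """Flow-weighted dual-input concentration, element-wise over time samples."""
    if not 0.0 <= arterial_fraction <= 1.0:
        raise ValueError("arterial_fraction must lie in [0, 1]")
    return [arterial_fraction * a + (1.0 - arterial_fraction) * p
            for a, p in zip(c_arterial, c_portal)]
```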

  8. Nuclear structure data file. A manual for preparation of data sets

    International Nuclear Information System (INIS)

    Ewbank, W.B.; Schmorak, M.R.; Bertrand, F.E.; Feliciano, M.; Horen, D.J.

    1975-06-01

    The Nuclear Data Project at ORNL is building a computer-based file of nuclear structure data, which is intended for use by both basic and applied users. For every nucleus, the Nuclear Structure Data File contains evaluated nuclear structure information. This manual describes a standard input format for nuclear structure data. The format is sufficiently structured that bulk data can be entered efficiently. At the same time, the structure is open-ended and can accommodate most measured or deduced quantities that yield nuclear structure information. Computer programs have been developed at the Data Project to perform consistency checking and routine calculations. Programs are also used for preparing level scheme drawings. (U.S.)

  9. Sensitivity analysis of complex models: Coping with dynamic and static inputs

    International Nuclear Information System (INIS)

    Anstett-Collin, F.; Goffart, J.; Mara, T.; Denis-Vidal, L.

    2015-01-01

    In this paper, we address the issue of conducting a sensitivity analysis of complex models with both static and dynamic uncertain inputs. While several approaches have been proposed to compute the sensitivity indices of the static inputs (i.e. parameters), those of the dynamic inputs (i.e. stochastic fields) have rarely been addressed. For this purpose, we first treat each dynamic input as a Gaussian process. Then, the truncated Karhunen–Loève expansion of each dynamic input is performed. Such an expansion makes it possible to generate independent Gaussian processes from a finite number of independent random variables. Given that a dynamic input is represented by a finite number of random variables, its variance-based sensitivity index is defined as the sensitivity index of this group of variables. Besides, an efficient sampling-based strategy is described to estimate the first-order indices of all the input factors using only two input samples. The approach is applied to a building energy model, in order to assess the impact of the uncertainties of the material properties (static inputs) and the weather data (dynamic inputs) on the energy performance of a real low-energy-consumption house. - Highlights: • Sensitivity analysis of models with uncertain static and dynamic inputs is performed. • Karhunen–Loève (KL) decomposition of the spatio-temporal inputs is performed. • The influence of the dynamic inputs is studied through the modes of the KL expansion. • The proposed approach is applied to a building energy model. • The impact of weather data and material properties on the performance of a real house is given
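    The reduction the abstract relies on, representing a stochastic process by finitely many independent random variables, can be sketched with a process whose Karhunen–Loève basis is known in closed form. For Brownian motion on [0, 1], a truncated KL realization is built from a finite vector of standard normals Z_k; this toy case (not the paper's building-energy inputs) shows how dynamic inputs become a "group of variables" for variance-based sensitivity analysis.

```python
import math, random

# Truncated Karhunen-Loeve sketch for Brownian motion on [0, 1]:
# W(t) ~ sum_k sqrt(2) * Z_k * sin((k - 1/2) * pi * t) / ((k - 1/2) * pi),
# so a path is generated from finitely many independent standard normals.

def kl_brownian_path(z, times):
    """Truncated KL realization from normals z, evaluated at the given times."""
    path = []
    for t in times:
        s = 0.0
        for k, zk in enumerate(z, start=1):
            w = (k - 0.5) * math.pi
            s += math.sqrt(2.0) * zk * math.sin(w * t) / w
        path.append(s)
    return path

rng = random.Random(0)
z = [rng.gauss(0.0, 1.0) for _ in range(50)]           # 50 KL modes
path = kl_brownian_path(z, [i / 100 for i in range(101)])
```

    Sensitivity indices for the dynamic input are then attributed to the group of variables z rather than to any single parameter.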

  10. Molecular, morphological and fossil input data for inferring relationship among viviparous brotulas (Bythitidae) - Resulting in a family status change for Dinematichthyidae

    DEFF Research Database (Denmark)

    Knudsen, Steen Wilhelm; Møller, Peter Rask; Schwarzhans, Werner

    2016-01-01

    This article comprises the data related to the research article (Møller et al., 2016) [1], and makes it possible to explore and reproduce the topologies that allowed [1] to infer the relationship between the families Bythitidae and Dinematichthyidae. The supplementary data holds nexus-input files ...

  11. Source model for the Copahue volcano magma plumbing system constrained by InSAR surface deformation observations

    OpenAIRE

    Paul Lundgren; M. Nikkhoo; Sergey V. Samsonov; Pietro Milillo; Fernando Gil-Cruz; Jonathan Lazo

    2017-01-01

    Tar files for each of the InSAR time series (interferograms used in the GIAnT time series computation, as well as the input files and outputs from using GIAnT). GIAnT is an open-source InSAR time series code developed at Caltech. The UAVSAR_*.tgz files contain the interferograms from the UAVSAR airborne system that were used in the analysis. The actual model input files require some additional down-sampling using resamptool.m, a Matlab code developed by Prof. R. Lohman, Cornell Univ.

  12. Modeling recognition memory using the similarity structure of natural input

    NARCIS (Netherlands)

    Lacroix, J.P.W.; Murre, J.M.J.; Postma, E.O.; van den Herik, H.J.

    2006-01-01

    The natural input memory (NIM) model is a new model for recognition memory that operates on natural visual input. A biologically informed perceptual preprocessing method takes local samples (eye fixations) from a natural image and translates these into a feature-vector representation. During recognition, the model compares incoming preprocessed…

  13. Dynamic Non-Hierarchical File Systems for Exascale Storage

    Energy Technology Data Exchange (ETDEWEB)

    Long, Darrell E. [Univ. of California, Santa Cruz, CA (United States); Miller, Ethan L [Univ. of California, Santa Cruz, CA (United States)

    2015-02-24

    appliances. These search applications are often optimized for a single file system, making it difficult to move files and their metadata between file systems. Users have tried to solve this problem in several ways, including the use of separate databases to index file properties, the encoding of file properties into file names, and separately gathering and managing provenance data, but none of these approaches has worked well, either due to limited usefulness or scalability, or both. Our research addressed several key issues: High-performance, real-time metadata harvesting: extracting important attributes from files dynamically and immediately updating indexes used to improve search; Transparent, automatic, and secure provenance capture: recording the data inputs and processing steps used in the production of each file in the system; Scalable indexing: indexes that are optimized for integration with the file system; Dynamic file system structure: our approach provides dynamic directories similar to those in semantic file systems, but these are the native organization rather than a feature grafted onto a conventional system. In addition to these goals, our research effort will include evaluating the impact of new storage technologies on the file system design and performance. In particular, the indexing and metadata harvesting functions can potentially benefit from the performance improvements promised by new storage class memories.
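    The metadata-harvesting step described above, extracting file attributes and keeping a search index current, can be sketched minimally. The two functions below are an illustrative toy (an extension-keyed in-memory index), not the project's actual indexing design.

```python
import os

# Toy sketch of metadata harvesting: collect (path, size, mtime) records
# and group them into an attribute-keyed index that search could consult.

def build_index(records):
    """Group (path, size_bytes, mtime) records into an extension-keyed index."""
    index = {}
    for path, size, mtime in records:
        ext = os.path.splitext(path)[1].lower() or "<none>"
        index.setdefault(ext, []).append((path, size, mtime))
    return index

def harvest(root):
    """Walk root and index every file's basic attributes by extension."""
    recs = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            p = os.path.join(dirpath, name)
            st = os.stat(p)
            recs.append((p, st.st_size, st.st_mtime))
    return build_index(recs)
```

    A real system would harvest attributes at write time rather than by periodic walks, so the index stays consistent with the file system.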

  14. Incidence of Apical Crack Initiation during Canal Preparation using Hand Stainless Steel (K-File) and Hand NiTi (Protaper) Files.

    Science.gov (United States)

    Soni, Dileep; Raisingani, Deepak; Mathur, Rachit; Madan, Nidha; Visnoi, Suchita

    2016-01-01

    To evaluate the incidence of apical crack initiation during canal preparation with stainless steel K-files and hand protaper files (in vitro study). Sixty extracted mandibular premolar teeth were randomly selected and embedded in acrylic tubes filled with autopolymerizing resin. A baseline image of the apical surface of each specimen was recorded under a digital microscope (80×). The cervical and middle thirds of all samples were flared with #2 and #1 Gates-Glidden (GG) drills, and a second image was recorded. The teeth were randomly divided into four groups of 15 teeth each according to file type (hand K-file and hand protaper) and working length (WL) (instrumented at WL and 1 mm less than WL). A final image after dye penetration and a photomicrograph of the apical root surface were digitally recorded. The maximum number of cracks was observed with hand protaper files compared with hand K-files at the WL and 1 mm short of WL. Chi-square testing revealed a highly significant effect of WL on crack formation at WL and 1 mm short of WL (p = 0.000). The minimum number of cracks at WL and 1 mm short of WL was observed with hand K-files and the maximum with hand protaper files. Soni D, Raisingani D, Mathur R, Madan N, Visnoi S. Incidence of Apical Crack Initiation during Canal Preparation using Hand Stainless Steel (K-File) and Hand NiTi (Protaper) Files. Int J Clin Pediatr Dent 2016;9(4):303-307.

  15. Accessing files in an Internet: The Jade file system

    Science.gov (United States)

    Peterson, Larry L.; Rao, Herman C.

    1991-01-01

    Jade is a new distribution file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
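    The private name space idea can be made concrete with a small sketch. This is an illustration of the mechanism (a mount table with longest-prefix resolution), not Jade's actual implementation: several underlying file systems can be mounted under one directory, and a logical name resolves to whichever mount has the longest matching prefix.

```python
# Illustrative mount-table resolver for a per-user logical name space.
# Longest-prefix matching lets multiple backends share one directory.

def resolve(mounts, logical_path):
    """Return (backend, backend_path) for the longest matching mount prefix."""
    best = None
    for prefix in mounts:
        if logical_path == prefix or logical_path.startswith(prefix.rstrip("/") + "/"):
            if best is None or len(prefix) > len(best):
                best = prefix
    if best is None:
        raise KeyError(f"no mount covers {logical_path}")
    rel = logical_path[len(best):].lstrip("/")
    return mounts[best], rel

# A hypothetical private name space mixing backends under /home:
mounts = {
    "/home": "nfs",         # Sun Network File System
    "/home/papers": "ftp",  # an FTP-backed mount under the same directory
    "/afs": "andrew",       # Andrew File System
}
```

    Because each user owns their own mount table, two users can give the same logical name to files on entirely different underlying systems.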

  16. Accessing files in an internet - The Jade file system

    Science.gov (United States)

    Rao, Herman C.; Peterson, Larry L.

    1993-01-01

    Jade is a new distribution file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  17. Public Use Microdata Samples (PUMS)

    Data.gov (United States)

    National Aeronautics and Space Administration — Public Use Microdata Samples (PUMS) are computer-accessible files containing records for a sample of housing units, with information on the characteristics of each...

  18. An improved algorithm to convert CAD model to MCNP geometry model based on STEP file

    International Nuclear Information System (INIS)

    Zhou, Qingguo; Yang, Jiaming; Wu, Jiong; Tian, Yanshan; Wang, Junqiong; Jiang, Hai; Li, Kuan-Ching

    2015-01-01

    Highlights: • Fully exploits common features of cells, making the processing efficient. • Accurately provides the cell position. • Flexible to add new parameters in the structure. • Application of a novel structure in INP file processing to conveniently evaluate cell location. - Abstract: MCNP (Monte Carlo N-Particle Transport Code) is a general-purpose Monte Carlo N-Particle code that can be used for neutron, photon, electron, or coupled neutron/photon/electron transport. Its input file, the INP file, has a complicated form and is error-prone when describing geometric models. Because of this, a conversion algorithm that converts a general geometric model into an MCNP model during MCNP-aided modeling is highly needed. In this paper, we revised and incorporated a number of improvements over our previous work (Yang et al., 2013), which was proposed and targeted after the STEP file and INP file were analyzed. Results of experiments show that the revised algorithm is more applicable and efficient than the previous work, with optimized extraction of the geometry and topology information of the STEP file, as well as improved production efficiency of the output INP file. This proposed research is promising, and serves as a valuable reference for the majority of researchers involved with MCNP-related researches

  19. BLT-MS (Breach, Leach, and Transport -- Multiple Species) data input guide. A computer model for simulating release of contaminants from a subsurface low-level waste disposal facility

    International Nuclear Information System (INIS)

    Sullivan, T.M.; Kinsey, R.R.; Aronson, A.; Divadeenam, M.; MacKinnon, R.J.

    1996-11-01

    The BLT-MS computer code has been developed, implemented, and tested. BLT-MS is a two-dimensional finite element computer code capable of simulating the time evolution of concentration resulting from the time-dependent release and transport of aqueous phase species in a subsurface soil system. BLT-MS contains models to simulate the processes (water flow, container degradation, waste form performance, transport, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is simulated through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, or solubility. Radioactive production and decay in the waste form are simulated. Transport considers the processes of advection, dispersion, diffusion, radioactive production and decay, reversible linear sorption, and sources (waste forms releases). To improve the usefulness of BLT-MS a preprocessor, BLTMSIN, which assists in the creation of input files, and a post-processor, BLTPLOT, which provides a visual display of the data have been developed. This document reviews the models implemented in BLT-MS and serves as a guide to creating input files for BLT-MS

  20. BLT-MS (Breach, Leach, and Transport -- Multiple Species) data input guide. A computer model for simulating release of contaminants from a subsurface low-level waste disposal facility

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.M.; Kinsey, R.R.; Aronson, A.; Divadeenam, M. [Brookhaven National Lab., Upton, NY (United States); MacKinnon, R.J. [Brookhaven National Lab., Upton, NY (United States)]|[Ecodynamics Research Associates, Inc., Albuquerque, NM (United States)

    1996-11-01

    The BLT-MS computer code has been developed, implemented, and tested. BLT-MS is a two-dimensional finite element computer code capable of simulating the time evolution of concentration resulting from the time-dependent release and transport of aqueous phase species in a subsurface soil system. BLT-MS contains models to simulate the processes (water flow, container degradation, waste form performance, transport, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is simulated through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, or solubility. Radioactive production and decay in the waste form are simulated. Transport considers the processes of advection, dispersion, diffusion, radioactive production and decay, reversible linear sorption, and sources (waste forms releases). To improve the usefulness of BLT-MS a preprocessor, BLTMSIN, which assists in the creation of input files, and a post-processor, BLTPLOT, which provides a visual display of the data have been developed. This document reviews the models implemented in BLT-MS and serves as a guide to creating input files for BLT-MS.

  1. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files, that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
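    The sampling idea behind probabilistic fingerprinting can be sketched compactly. This is a simplified illustration, not the pfff tool's actual algorithm: hash a fixed number of chunks at pseudo-random offsets derived deterministically from the file size and a seed, so the cost is flat in file size and two runs over identical data always agree.

```python
import hashlib, random

# Simplified probabilistic-fingerprint sketch: hash a bounded number of
# deterministically chosen chunks instead of reading the whole input.

def sampled_fingerprint(data, n_samples=64, chunk=16, seed=0):
    """Fingerprint bytes by hashing chunks at deterministic random offsets."""
    if len(data) <= n_samples * chunk:
        return hashlib.sha256(data).hexdigest()   # small input: hash it all
    rng = random.Random(f"{seed}:{len(data)}")    # offsets fixed by size + seed
    h = hashlib.sha256(len(data).to_bytes(8, "big"))
    for _ in range(n_samples):
        off = rng.randrange(len(data) - chunk)
        h.update(data[off:off + chunk])
    return h.hexdigest()
```

    The trade-off is exactly the one the abstract names: reliability rests on the data varying between files, so the collision risk is probabilistic rather than cryptographic over the full contents.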

  2. Modeling Recognition Memory Using the Similarity Structure of Natural Input

    Science.gov (United States)

    Lacroix, Joyca P. W.; Murre, Jaap M. J.; Postma, Eric O.; van den Herik, H. Jaap

    2006-01-01

    The natural input memory (NIM) model is a new model for recognition memory that operates on natural visual input. A biologically informed perceptual preprocessing method takes local samples (eye fixations) from a natural image and translates these into a feature-vector representation. During recognition, the model compares incoming preprocessed…

  3. Data input guide for SWIFT II. The Sandia waste-isolation flow and transport model for fractured media, Release 4.84

    International Nuclear Information System (INIS)

    Reeves, M.; Ward, D.S.; Johns, N.D.; Cranwell, R.M.

    1986-04-01

    This report is one of three which describes the SWIFT II computer code. The code simulates flow and transport processes in geologic media which may be fractured. SWIFT II was developed for use in the analysis of deep geologic facilities for nuclear-waste disposal. This user's manual should permit the analyst to use the code effectively by facilitating the preparation of input data. A second companion document discusses the theory and implementation of the models employed by the SWIFT II code. A third document provides illustrative problems for instructional purposes. This report contains detailed descriptions of the input data along with an appendix of the input diagnostics. The use of auxiliary files, unit conversions, and program variable descriptors also are included in this document

  4. 77 FR 21754 - Endangered Species; File Nos. 16526, 16323, 16436, 16422, 16438, 16431, 16507, 16547, 16375...

    Science.gov (United States)

    2012-04-11

    ... telemetry tag. File No. 16323 The Connecticut Department of Environmental Protection Marine Fisheries, Tom... sampled, and gonad tissue sampled. File No. 16547 The U.S. Fish and Wildlife Service, Albert Spells (RP...

  5. Policy Analysis Screening System (PASS) demonstration: sample queries and terminal instructions

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-10-16

    This document contains the input and output for the Policy Analysis Screening System (PASS) demonstration. This demonstration is stored on a portable disk at the Environmental Impacts Division. Sample queries presented here include: (1) how to use PASS; (2) estimated 1995 energy consumption from Mid-Range Energy-Forecasting System (MEFS) data base; (3) pollution projections from Strategic Environmental Assessment System (SEAS) data base; (4) diesel auto regulations; (5) diesel auto health effects; (6) oil shale health and safety measures; (7) water pollution effects of SRC; (8) acid rainfall from Energy Environmental Statistics (EES) data base; 1990 EIA electric generation by fuel type; sulfate concentrations by Federal region; forecast of 1995 SO2 emissions in Region III; and estimated electrical generating capacity in California to 1990. The file name for each query is included.

  6. Long term file migration. Part I: file reference patterns

    International Nuclear Information System (INIS)

    Smith, A.J.

    1978-08-01

    In most large computer installations, files are moved between on-line disk and mass storage (tape, integrated mass storage device) either automatically by the system or specifically at the direction of the user. This is the first of two papers which study the selection of algorithms for the automatic migration of files between mass storage and disk. The use of the text editor data sets at the Stanford Linear Accelerator Center (SLAC) computer installation is examined through the analysis of thirteen months of file reference data. Most files are used very few times. Of those that are used sufficiently frequently that their reference patterns may be examined, about a third show declining rates of reference during their lifetime; of the remainder, very few (about 5%) show correlated interreference intervals, and interreference intervals (in days) appear to be more skewed than would occur with the Bernoulli process. Thus, about two-thirds of all sufficiently active files appear to be referenced as a renewal process with a skewed interreference distribution. A large number of other file reference statistics (file lifetimes, interreference distributions, moments, means, number of uses/file, file sizes, file rates of reference, etc.) are computed and presented. The results are applied in the following paper to the development and comparative evaluation of file migration algorithms. 17 figures, 13 tables
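    The raw material for statistics like these is a per-file list of interreference intervals extracted from a reference trace. A small sketch of that extraction (illustrative, not the paper's analysis code):

```python
# Compute each file's interreference intervals (gaps in days between
# successive references) from a (day, file_name) reference log.

def interreference_intervals(trace):
    """Map file name -> list of gaps (days) between successive references."""
    last_seen, gaps = {}, {}
    for day, name in sorted(trace):
        if name in last_seen:
            gaps.setdefault(name, []).append(day - last_seen[name])
        last_seen[name] = day
    return gaps

# Hypothetical trace: file "a" referenced on days 1, 4, 9; "b" on days 2, 30.
trace = [(1, "a"), (2, "b"), (4, "a"), (9, "a"), (30, "b")]
```

    Skewness and lifetime statistics, and ultimately the migration policies of the companion paper, follow from distributions built over these gap lists.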

  7. High School and Beyond Transcripts Survey (1982). Data File User's Manual. Contractor Report.

    Science.gov (United States)

    Jones, Calvin; And Others

    This data file user's manual documents the procedures used to collect and process high school transcripts for a large sample of the younger cohort (1980 sophomores) in the High School and Beyond survey. The manual provides the user with the technical assistance needed to use the computer file and also discusses the following: (1) sample design for…

  8. Conceptual Design of GRIG (GUI Based RETRAN Input Generator)

    International Nuclear Information System (INIS)

    Lee, Gyung Jin; Hwang, Su Hyun; Hong, Soon Joon; Lee, Byung Chul; Jang, Chan Su; Um, Kil Sup

    2007-01-01

    and archival of results. But it has no capability to interconnect with the database of NPP design material. RETRANUI (RETRAN User Interface), developed by Computer Simulation and Analysis, Inc., is a PC-based graphical user interface designed to assist the RETRAN analyst with execution of the RETRAN computer programs and to provide convenient automated editing and plotting features. The RETRAN calculation is monitored and controlled by RETRANUI. Once the analysis is complete, the results can be conveniently plotted or the output file viewed by selecting the appropriate RETRANUI toolbar button. But its function is limited to post-processing. Therefore, GRIG (Graphical User Interface based RETRAN Input Generator) is being developed to generate the basic input of the transient analysis code from the database of NPP design manuals, to minimize the faults induced in the process of input generation, and to enhance user convenience. The methodology of GRIG, interconnecting the input generator with the database and calculation note, is a new approach that has not been tried before

  9. Two-Stage Variable Sample-Rate Conversion System

    Science.gov (United States)

    Tkacenko, Andre

    2009-01-01

    A two-stage variable sample-rate conversion (SRC) system has been proposed as part of a digital signal-processing system in a digital communication radio receiver that utilizes a variety of data rates. The proposed system would be used as an interface between (1) an analog-to-digital converter used in the front end of the receiver to sample an intermediate-frequency signal at a fixed input rate and (2) digitally implemented tracking loops in subsequent stages that operate at various sample rates that are generally lower than the input sample rate. This Two-Stage System would be capable of converting from an input sample rate to a desired lower output sample rate that could be variable and not necessarily a rational fraction of the input rate.
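    The two-stage split can be illustrated with a deliberately crude sketch, not the proposed system's algorithm: stage 1 decimates by a fixed integer factor (here a boxcar average standing in for a proper anti-alias filter plus downsampler), and stage 2 resamples to an arbitrary, not necessarily rational, output rate by linear interpolation.

```python
# Toy two-stage sample-rate conversion:
#   stage 1: fixed integer decimation (boxcar average, no real filtering)
#   stage 2: variable-ratio resampling by linear interpolation

def stage1_decimate(x, m):
    """Average non-overlapping blocks of m samples (fixed integer factor)."""
    return [sum(x[i:i + m]) / m for i in range(0, len(x) - m + 1, m)]

def stage2_resample(x, ratio):
    """Linearly interpolate to int(len(x) * ratio) output samples."""
    n_out = int(len(x) * ratio)
    out = []
    for k in range(n_out):
        t = k / ratio                # position of output sample k in input time
        i = min(int(t), len(x) - 2)  # clamp so x[i + 1] stays in range
        frac = t - i
        out.append((1.0 - frac) * x[i] + frac * x[i + 1])
    return out
```

    A production design would replace both crude kernels with proper polyphase anti-alias filtering; the structural point is only that the fixed and variable factors live in separate stages.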

  10. JENDL special purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1995-01-01

    In JENDL-3.2, data on all reactions having significant cross sections over the neutron energy range from 0.01 meV to 20 MeV are given for 340 nuclides. Its range of application extends widely, covering neutron engineering, shielding and other aspects of fast reactors, thermal neutron reactors and nuclear fusion reactors. This is a general purpose data file. In contrast, a file in which only the data required for a specific application field are collected is called a special purpose file. The file for dosimetry is a typical special purpose file. The Nuclear Data Center, Japan Atomic Energy Research Institute, is producing ten kinds of JENDL special purpose files. The files, for which the working groups of the Sigma Committee are responsible, are listed. As to the format of the files, the ENDF format is used, as in JENDL-3.2. The dosimetry file, activation cross section file, (α, n) reaction data file, fusion file, actinoid file, high energy data file, photonuclear data file, PKA/KERMA file, gas production cross section file and decay data file are described in terms of their contents, the course of their development and their verification. The dosimetry file and the gas production cross section file have already been completed. For the others, the expected time of completion is shown. When these files are completed, they will be opened to the public. (K.I.)

  11. Identifiable Data Files - Health Outcomes Survey (HOS)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Health Outcomes Survey (HOS) identifiable data files are comprised of the entire national sample for a given 2-year cohort (including both respondents...

  12. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Science.gov (United States)

    2010-01-01

    ... a facsimile machine or by electronic mail if the reporting entity is not required to file..., including electronic reporting entities, may use the Commission's website's on-line program to file 48-hour... the reporting entity is not required to file electronically in accordance with 11 CFR 104.18. [67 FR...

  13. Storage of sparse files using parallel log-structured file system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-11-07

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
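A toy model of the scheme (names and API are illustrative, not the actual PLFS interface) shows how index entries let holes be dropped on write and restored on read:

```python
from dataclasses import dataclass

@dataclass
class IndexEntry:
    logical_offset: int   # position in the sparse file's logical address space
    physical_offset: int  # position in the packed, hole-free log
    length: int

class SparseStore:
    """Toy sparse-file store: data portions are appended at the log's end
    (no holes stored); holes reappear as zero-fill on read."""
    def __init__(self):
        self.log = bytearray()
        self.index = []

    def write(self, logical_offset, data):
        # Record where the portion lives logically and physically.
        self.index.append(IndexEntry(logical_offset, len(self.log), len(data)))
        self.log.extend(data)          # stored at the logical end of the log

    def read(self, logical_size):
        out = bytearray(logical_size)  # zero-filled: holes restored implicitly
        for e in self.index:
            out[e.logical_offset:e.logical_offset + e.length] = \
                self.log[e.physical_offset:e.physical_offset + e.length]
        return bytes(out)
```

The patterned-index and shared-directory optimizations described in the abstract would sit on top of this basic entry layout.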

  14. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    Science.gov (United States)

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
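The worklist-to-CSV transformation and the three-range dilution scheme can be sketched roughly as follows (column names and the step-splitting rule are assumptions, not the validated RSPP logic):

```python
import csv
import io

def dilution_steps(factor):
    """Split a dilution factor (1- to 1000-fold) into serial steps of at
    most 10x each, mirroring the three ranges described (1-10, 11-100,
    101-1000)."""
    if not 1 <= factor <= 1000:
        raise ValueError("factor outside the supported 1- to 1000-fold range")
    steps = []
    while factor > 10:
        steps.append(10.0)
        factor /= 10.0
    steps.append(round(factor, 3))
    return steps

def worklist_to_csv(samples):
    """samples: (sample_id, sequence, dilution_factor) tuples, the fields
    named in the abstract; the CSV layout here is hypothetical."""
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["sample_id", "sequence", "step1", "step2", "step3"])
    for sid, seq, factor in samples:
        steps = dilution_steps(factor) + [1.0] * 3   # pad unused steps with 1x
        w.writerow([sid, seq] + steps[:3])
    return buf.getvalue()
```

For example, a 250-fold dilution becomes 10 × 10 × 2.5 serial steps, each within a pipettable range.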

  15. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    Computer forensic analysts are in charge of investigation and evidence tracking. In certain cases, a file needed as digital evidence has been deleted. It is difficult to reconstruct such a file, because it has often lost its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence approach consisting of three steps, namely training, testing and validation, to identify the file type from file fragments. From all testing results we conclude that our proposed method works well, achieving 92.91% accuracy in identifying the file type of file fragments.
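A minimal sketch of LCS-based fragment classification (the paper's training/validation machinery is omitted; the signatures and the length normalisation are assumptions):

```python
def lcs_length(a: bytes, b: bytes) -> int:
    """Classic dynamic-programming longest-common-subsequence length,
    keeping only the previous DP row for O(len(b)) memory."""
    prev = [0] * (len(b) + 1)
    for x in a:
        cur = [0]
        for j, y in enumerate(b, 1):
            cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[-1]))
        prev = cur
    return prev[-1]

def classify_fragment(fragment: bytes, signatures: dict) -> str:
    """Assign the type whose training signature shares the longest common
    subsequence with the fragment, normalised by signature length."""
    return max(signatures,
               key=lambda t: lcs_length(fragment, signatures[t]) / len(signatures[t]))
```

Usage: with byte signatures learned per file type, a headerless fragment is labelled by its best-matching signature.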

  16. 49 CFR 564.5 - Information filing; agency processing of filings.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 6 2010-10-01 2010-10-01 false Information filing; agency processing of filings... HIGHWAY TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION REPLACEABLE LIGHT SOURCE INFORMATION (Eff. until 12-01-12) § 564.5 Information filing; agency processing of filings. (a) Each manufacturer...

  17. Sampling of Stochastic Input Parameters for Rockfall Calculations and for Structural Response Calculations Under Vibratory Ground Motion

    International Nuclear Information System (INIS)

    M. Gross

    2004-01-01

    The purpose of this scientific analysis is to define the sampled values of stochastic (random) input parameters for (1) rockfall calculations in the lithophysal and nonlithophysal zones under vibratory ground motions, and (2) structural response calculations for the drip shield and waste package under vibratory ground motions. This analysis supplies: (1) Sampled values of ground motion time history and synthetic fracture pattern for analysis of rockfall in emplacement drifts in nonlithophysal rock (Section 6.3 of ''Drift Degradation Analysis'', BSC 2004 [DIRS 166107]); (2) Sampled values of ground motion time history and rock mechanical properties category for analysis of rockfall in emplacement drifts in lithophysal rock (Section 6.4 of ''Drift Degradation Analysis'', BSC 2004 [DIRS 166107]); (3) Sampled values of ground motion time history and metal to metal and metal to rock friction coefficient for analysis of waste package and drip shield damage due to vibratory motion in ''Structural Calculations of Waste Package Exposed to Vibratory Ground Motion'' (BSC 2004 [DIRS 167083]) and in ''Structural Calculations of Drip Shield Exposed to Vibratory Ground Motion'' (BSC 2003 [DIRS 163425]). The sampled values are indices representing the number of ground motion time histories, number of fracture patterns and rock mass properties categories. These indices are translated into actual values within the respective analysis and model reports or calculations. This report identifies the uncertain parameters and documents the sampled values for these parameters. The sampled values are determined by GoldSim V6.04.007 [DIRS 151202] calculations using appropriate distribution types and parameter ranges. No software development or model development was required for these calculations. The calculation of the sampled values allows parameter uncertainty to be incorporated into the rockfall and structural response calculations that support development of the seismic scenario for the
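The sampling of realization indices can be mimicked with a seeded random generator (a sketch only; the actual GoldSim distributions, counts and category probabilities are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(42)  # seeded so the sampled values are reproducible

def sample_realizations(n_real, n_time_histories, n_fracture_patterns, category_probs):
    """For each realization, draw the index of a ground-motion time history,
    a synthetic fracture pattern, and a rock-properties category (from a
    discrete distribution); the indices are translated into actual inputs
    downstream, as the report describes."""
    return [
        {
            "time_history": int(rng.integers(1, n_time_histories + 1)),
            "fracture_pattern": int(rng.integers(1, n_fracture_patterns + 1)),
            "rock_category": int(rng.choice(len(category_probs), p=category_probs)) + 1,
        }
        for _ in range(n_real)
    ]
```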

  18. BLT-EC (Breach, Leach and Transport-Equilibrium Chemistry) data input guide. A computer model for simulating release and coupled geochemical transport of contaminants from a subsurface disposal facility

    International Nuclear Information System (INIS)

    MacKinnon, R.J.; Sullivan, T.M.; Kinsey, R.R.

    1997-05-01

    The BLT-EC computer code has been developed, implemented, and tested. BLT-EC is a two-dimensional finite element computer code capable of simulating the time-dependent release and reactive transport of aqueous phase species in a subsurface soil system. BLT-EC contains models to simulate the processes (container degradation, waste-form performance, transport, chemical reactions, and radioactive production and decay) most relevant to estimating the release and transport of contaminants from a subsurface disposal system. Water flow is provided through tabular input or auxiliary files. Container degradation considers localized failure due to pitting corrosion and general failure due to uniform surface degradation processes. Waste-form performance considers release to be limited by one of four mechanisms: rinse with partitioning, diffusion, uniform surface degradation, and solubility. Transport considers the processes of advection, dispersion, diffusion, chemical reaction, radioactive production and decay, and sources (waste form releases). Chemical reactions accounted for include complexation, sorption, dissolution-precipitation, oxidation-reduction, and ion exchange. Radioactive production and decay in the waste form is simulated. To improve the usefulness of BLT-EC, a pre-processor, ECIN, which assists in the creation of chemistry input files, and a post-processor, BLTPLOT, which provides a visual display of the data, have been developed. BLT-EC also includes an extensive database of thermodynamic data that is also accessible to ECIN. This document reviews the models implemented in BLT-EC and serves as a guide to creating input files and applying BLT-EC.

  19. Cut-and-Paste file-systems: integrating simulators and file systems

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1995-01-01

    We have implemented an integrated and configurable file system called the Pegasus filesystem (PFS) and a trace-driven file-system simulator called Patsy. Patsy is used for off-line analysis of file-system algorithms; PFS is used for on-line file-system data storage. Algorithms are first analyzed in

  20. Federal Logistics Information System (FLIS) Procedures Manual. Volume 8. Document Identifier Code Input/Output Formats (Fixed Length)

    Science.gov (United States)

    1994-07-01

    Excerpts from the fixed-length format specifications: in Segment R of an interrogation transaction (LTI), only the Data Record Number (DRN 0950) is required; the Continuation Indicator Code (DRN 8555) is inserted in position 80 of the record for certain input DICs; DoD 4100.39-M, Volume 8, Chapter 5 contains an alphabetic index of DICs.

  1. Cyclic fatigue resistance of R-Pilot, HyFlex EDM and PathFile nickel-titanium glide path files in artificial canals with double (S-shaped) curvature.

    Science.gov (United States)

    Uslu, G; Özyürek, T; Yılmaz, K; Gündoğar, M

    2018-05-01

    To examine the cyclic fatigue resistances of R-Pilot, HyFlex EDM and PathFile NiTi glide path files in S-shaped artificial canals. Twenty R-Pilot (12.5/.04), 20 HyFlex EDM (10/.05) and 20 PathFile (19/.02) single-file glide path files were included. Sixty files (n: 20/each) were subjected to static cyclic fatigue testing using double-curved canals until fracture occurred (TF). The number of cycles to fracture (NCF) was calculated by multiplying the rpm value by the TF. The length of the fractured fragment (FL) was determined by a digital microcaliper. Six samples of fractured files (n: 2/each) were examined by SEM to determine the fracture mode. The NCF and the FL data were analysed using one-way ANOVA, post hoc Tamhane and Kruskal-Wallis tests using SPSS 21 software. The significance level was set at 5%. In the double-curved canal, all the files fractured first in the apical curvature and then in the coronal curvature. The NCF values revealed that the R-Pilot had the greatest cyclic fatigue resistance, followed by the HyFlex EDM and PathFile in both the apical and coronal curvatures (P < 0.05). R-Pilot NiTi glide path files, used in a reciprocating motion, had the greatest cyclic fatigue resistance amongst the tested NiTi glide path files in an artificial S-shaped canal. © 2017 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  2. GRUKON - A package of applied computer programs system input and operating procedures of functional modules

    International Nuclear Information System (INIS)

    Sinitsa, V.V.; Rineiskij, A.A.

    1993-04-01

    This manual describes a software package for the production of multigroup neutron cross-sections from evaluated nuclear data files. It presents the information necessary to run the program's functional modules, including the operating procedures, the data input, the macrocommand language, and the assignment of system procedures. It also presents the methodology used in coding the individual modules: the rules, the syntax and the procedural conventions. An example of the application of the data processing module is also presented. (author)

  3. Cut-and-Paste file-systems : integrating simulators and file systems

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    We have implemented an integrated and configurable file system called the PFS and a trace-driven file-system simulator called Patsy. Patsy is used for off-line analysis of file-system algorithms, PFS is used for on-line file-system data storage. Algorithms are first analyzed in Patsy and when we are

  4. Effect of different input management on weed composition, diversity and density of corn field

    Directory of Open Access Journals (Sweden)

    Surur Khoramdel

    2016-04-01

    In order to investigate the effects of input intensity on species diversity, composition and density of weeds in corn (Zea mays L.), an experiment was conducted based on a randomized complete block design with three replications at the Agricultural Research Station, Ferdowsi University of Mashhad, Iran, during 2009. Treatments included low input, medium input and high input systems. Low input received 30 ton ha-1 manure or 30 ton ha-1 compost, zero tillage and hand weeding (twice). Medium input was based on 15 ton ha-1 manure, 150 kg ha-1 urea as chemical fertilizer, two tillage operations, 2,4-D (1.5 L ha-1, at five-leaf emergence) as an herbicide, and hand weeding (once). High input received 300 kg ha-1 urea, four tillage operations, Paraquat (2 L ha-1, after planting) and 2,4-D (1.5 L ha-1, at five-leaf emergence). Manure and compost were applied at planting time. Weed samplings were done in three stages (early, mid and late growing season). Results indicated that the highest and the lowest weed species diversity and density were observed in the low input system based on manure and the high input system, respectively. The highest range of weed relative density was obtained for black nightshade (Solanum nigrum), with 9.09-75.00%. The highest number of species was observed in the low input system based on manure. Management practices also affected weed dry matter and diversity indices. The highest and the lowest amounts of weed dry matter were observed in the low input system based on manure and the high input system, respectively. In the first, second and third stages of sampling, the maximum and the minimum values of the Margalef index were observed in the low input system based on manure (5.3, 5.4 and 3.3, respectively) and the high input system (0.8, 2.3 and 2.6, respectively).
In the first, second and third stages of sampling, the highest and the lowest values of the Shannon index were observed in the low input system based on manure (0.6, 0.7 and 0.5, respectively) and the high input (with 0
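The two diversity indices reported, Margalef and Shannon, are standard and can be computed from per-species counts as follows (natural logarithm assumed):

```python
import math

def margalef(counts):
    """Margalef richness index: (S - 1) / ln(N), where S is the number of
    species observed and N the total number of individuals."""
    s = sum(1 for c in counts if c > 0)
    n = sum(counts)
    return (s - 1) / math.log(n)

def shannon(counts):
    """Shannon diversity index: -sum(p_i * ln(p_i)) over observed species."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)
```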

  5. Multi-input wide dynamic range ADC system for use with nuclear detectors

    Energy Technology Data Exchange (ETDEWEB)

    Austin, R W [National Aeronautics and Space Administration, Huntsville, Ala. (USA). George C. Marshall Space Flight Center

    1976-04-15

    A wide dynamic range, eight input analog-to-digital converter system has been developed for use in nuclear experiments. The system consists of eight dual-range sample and hold modules, an eight input multiplexer, a ten-bit analog-to-digital converter, and the associated control logic.

  6. Calibration of uncertain inputs to computer models using experimentally measured quantities and the BMARS emulator

    International Nuclear Information System (INIS)

    Stripling, H.F.; McClarren, R.G.; Kuranz, C.C.; Grosskopf, M.J.; Rutter, E.; Torralva, B.R.

    2011-01-01

    We present a method for calibrating the uncertain inputs to a computer model using available experimental data. The goal of the procedure is to produce posterior distributions of the uncertain inputs such that when samples from the posteriors are used as inputs to future model runs, the model is more likely to replicate (or predict) the experimental response. The calibration is performed by sampling the space of the uncertain inputs, using the computer model (or, more likely, an emulator for the computer model) to assign weights to the samples, and applying the weights to produce the posterior distributions and generate predictions of new experiments within confidence bounds. The method is similar to the Markov chain Monte Carlo (MCMC) calibration methods with independent sampling with the exception that we generate samples beforehand and replace the candidate acceptance routine with a weighting scheme. We apply our method to the calibration of a Hyades 2D model of laser energy deposition in beryllium. We employ a Bayesian Multivariate Adaptive Regression Splines (BMARS) emulator as a surrogate for Hyades 2D. We treat a range of uncertainties in our system, including uncertainties in the experimental inputs, experimental measurement error, and systematic experimental timing errors. The results of the calibration are posterior distributions that both agree with intuition and improve the accuracy and decrease the uncertainty in experimental predictions. (author)
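The weighting scheme described resembles importance sampling; a compact sketch with a one-parameter toy emulator (the prior range, Gaussian likelihood form and emulator are assumptions, not the paper's BMARS/Hyades setup):

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

def calibrate(emulator, y_obs, sigma, n=20000):
    """Draw candidate inputs from an (assumed) uniform prior, weight each
    draw by a Gaussian likelihood of matching the measured response, and
    treat the weights as the posterior, replacing an MCMC acceptance step
    with a weighting scheme, as the abstract describes."""
    theta = rng.uniform(0.0, 2.0, size=n)       # assumed prior range
    resid = emulator(theta) - y_obs             # emulator stands in for the model
    logw = -0.5 * (resid / sigma) ** 2          # Gaussian measurement error
    w = np.exp(logw - logw.max())
    w /= w.sum()
    post_mean = float(np.sum(w * theta))
    draws = rng.choice(theta, size=n, p=w)      # weighted resample -> posterior draws
    return post_mean, draws
```

The posterior draws can then feed forward into new model runs to produce predictions with confidence bounds.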

  7. File sharing

    NARCIS (Netherlands)

    van Eijk, N.

    2011-01-01

    ‘File sharing’ has become generally accepted on the Internet. Users share files for downloading music, films, games, software, etc. In this note, we take a closer look at the definition of file sharing, the legal and policy-based context, as well as enforcement issues. The economic and cultural

  8. User's guide to input for WRAP: a water reactor analysis package

    International Nuclear Information System (INIS)

    Gregory, M.V.

    1977-06-01

    The document describes the input records required to execute the Water Reactor Analysis Package (WRAP) for the analysis of thermal-hydraulic transients in primarily light water reactors. The card input required by RELAP4 has been significantly modified to broaden the code's input processing capabilities: (1) All input is in the form of templated, named records. (2) All components (volumes, junctions, etc.) are named rather than numbered, and system relationships are formed by defining associations between the names. (3) A hierarchical part structure is used which allows collections of components to be described as discrete parts (these parts may then be catalogued for use in a wide range of cases). A sample problem, the small break analysis of the Westinghouse Trojan Plant, is discussed and detailed, step-by-step instructions in setting up an input data base are presented. A master list of all input templates for WRAP is compiled

  9. FEMSYN - a code system to solve multigroup diffusion theory equations using a variety of solution techniques. Part 1 : Description of code system - input and sample problems

    International Nuclear Information System (INIS)

    Jagannathan, V.

    1985-01-01

    A modular computer code system called FEMSYN has been developed to solve the multigroup diffusion theory equations. The various methods that are incorporated in FEMSYN are (i) finite difference method (FDM) (ii) finite element method (FEM) and (iii) single channel flux synthesis method (SCFS). These methods are described in detail in parts II, III and IV of the present report. In this report, a comparison of the accuracy and the speed of different methods of solution for some benchmark problems are reported. The input preparation and listing of sample input and output are included in the Appendices. The code FEMSYN has been used to solve a wide variety of reactor core problems. It can be used for both LWR and PHWR applications. (author)

  10. Characterization, impact and fate of atmospheric inputs in the water column

    International Nuclear Information System (INIS)

    Sandroni, V.; Migon, C.

    1999-01-01

    The present results, obtained from continuous sampling (wet, dry and total inputs) at the Cap Ferrat sampling station (Ligurian Sea), enable quantification of the dissolved and particulate fractions for various types of metals (dust-derived, anthropogenic, medium)

  11. Mini MAX - Medicaid Sample

    Data.gov (United States)

    U.S. Department of Health & Human Services — To facilitate wider use of MAX, CMS contracted with Mathematica to convene a technical expert panel (TEP) and determine the feasibility of creating a sample file for...

  12. Fishing input requirements of artisanal fishers in coastal ...

    African Journals Online (AJOL)

    Efforts towards increase in fish production through artisanal fishery can be achieved by making needed inputs available. Fishing requirements of artisanal fishers in coastal communities of Ondo State, Nigeria were studied. Data were obtained from two hundred and sixteen artisans using multistage random sampling ...

  13. Effect of Autoclave Cycles on Surface Characteristics of S-File Evaluated by Scanning Electron Microscopy.

    Science.gov (United States)

    Razavian, Hamid; Iranmanesh, Pedram; Mojtahedi, Hamid; Nazeri, Rahman

    2016-01-01

    Presence of surface defects in endodontic instruments can lead to unwanted complications such as instrument fracture and incomplete preparation of the canal. The current study was conducted to evaluate the effect of autoclave cycles on the surface characteristics of the S-File by scanning electron microscopy (SEM). In this experimental study, 17 brand new S-Files (#30) were used. The surface characteristics of the files were examined at four steps (without autoclaving, and after 1, 5 and 10 autoclave cycles) by SEM under 200× and 1000× magnifications. Data were analyzed using the SPSS software and the paired sample t-test, independent sample t-test and multifactorial repeated measures ANOVA. The level of significance was set at 0.05. New files had debris and pitting on their surfaces. As the number of autoclave cycles increased, the mean surface roughness also increased at both magnifications (P < 0.05). Autoclaving increased the surface roughness of the files, and this increase was directly related to the number of autoclave cycles.

  14. Renewal-anomalous-heterogeneous files

    International Nuclear Information System (INIS)

    Flomenbom, Ophir

    2010-01-01

    Renewal-anomalous-heterogeneous files are solved. A simple file is made of Brownian hard spheres that diffuse stochastically in an effective 1D channel. Generally, Brownian files are heterogeneous: the spheres' diffusion coefficients are distributed and the initial spheres' density is non-uniform. In renewal-anomalous files, the distribution of waiting times for individual jumps is not exponential as in Brownian files, yet obeys ψ_α(t) ∼ t^(−1−α), 0 < α < 1. The mean square displacement (MSD), ⟨r²⟩, obeys ⟨r²⟩ ∼ (⟨r²⟩_nrml)^α, where ⟨r²⟩_nrml is the MSD in the corresponding Brownian file. This scaling is an outcome of an exact relation (derived here) connecting probability density functions of Brownian files and renewal-anomalous files. It is also shown that non-renewal-anomalous files are slower than the corresponding renewal ones.

  15. Persepsi Wajib Pajak Mengenai E-Filing dan Pengaruhnya terhadap Tingkat Kepatuhan Wajib Pajak Orang Pribadi dalam Melaporkan Pajak

    OpenAIRE

    Gunawan, Teddy; Suprapti, Eny; Kurniawati, Eris Tri

    2017-01-01

    This research aims to examine the effect of taxpayers' perception of the e-Filing system on individual taxpayers' compliance in tax reporting. This is associative research. Taxpayer perception comprises the variables of e-Filing usefulness, e-Filing ease of use, e-Filing complexity, e-Filing security and privacy, and e-Filing readiness. The population of this research is individual taxpayers listed in Tax Office Pratama Batu. The sampling used in this research is convenience samplin...

  16. Assessment of apically extruded debris produced by the single-file ProTaper F2 technique under reciprocating movement.

    Science.gov (United States)

    De-Deus, Gustavo; Brandão, Maria Claudia; Barino, Bianca; Di Giorgi, Karina; Fidel, Rivail Antonio Sergio; Luna, Aderval Severino

    2010-09-01

    This study was designed to quantitatively evaluate the amount of dentin debris extruded from the apical foramen by comparing the conventional sequence of the ProTaper Universal nickel-titanium (NiTi) files with the single-file ProTaper F2 technique. Thirty mesial roots of lower molars were selected, and the use of different instrumentation techniques resulted in 3 groups (n=10 each). In G1, a crown-down hand-file technique was used, and in G2 conventional ProTaper Universal technique was used. In G3, ProTaper F2 file was used in a reciprocating motion. The apical finish preparation was equivalent to ISO size 25. An apparatus was used to evaluate the apically extruded debris. Statistical analysis was performed using 1-way analysis of variance and Tukey multiple comparisons. No significant difference was found in the amount of the debris extruded between the conventional sequence of the ProTaper Universal NiTi files and the single-file ProTaper F2 technique (P>.05). In contrast, the hand instrumentation group extruded significantly more debris than both NiTi groups (P<.05). The present results yielded favorable input for the F2 single-file technique in terms of apically extruded debris, inasmuch as it is the most simple and cost-effective instrumentation approach. Copyright (c) 2010 Mosby, Inc. All rights reserved.

  17. Reconstruction of point cross-section from ENDF data file for Monte Carlo applications

    International Nuclear Information System (INIS)

    Kumawat, H.; Saxena, A.; Carminati, F.; )

    2016-12-01

    Monte Carlo neutron transport codes are among the best tools to simulate complex systems like fission and fusion reactors, Accelerator Driven Sub-critical systems, radioactivity management of spent fuel and waste, optimization and characterization of neutron detectors, optimization of Boron Neutron Capture Therapy, imaging, etc. The neutron cross-sections and secondary particle emission properties are the main input parameters of such codes. The fission, capture and elastic scattering cross-sections have complex resonance structures. The Evaluated Nuclear Data File (ENDF) contains these cross-sections and secondary parameters. We report the development of a reconstruction procedure to generate point cross-sections and probabilities from an ENDF data file. The cross-sections are compared with values obtained from PREPRO and, in some cases, the NJOY code. The results are in good agreement. (author)
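Once pointwise cross-sections have been reconstructed onto an energy grid, lookups between grid points use one of the ENDF interpolation laws; a sketch of log-log interpolation (law 5), assuming an already-reconstructed grid with resonances resolved:

```python
import numpy as np

def xs_at(energy, e_grid, xs_grid):
    """Pointwise cross-section lookup by log-log interpolation (ENDF
    interpolation law 5): ln(sigma) varies linearly in ln(E) between
    tabulated grid points."""
    e_grid = np.asarray(e_grid, dtype=float)
    xs_grid = np.asarray(xs_grid, dtype=float)
    return float(np.exp(np.interp(np.log(energy),
                                  np.log(e_grid), np.log(xs_grid))))
```

On a 1/v-law grid, log-log interpolation reproduces the law exactly, which is why it is the usual choice for smooth cross-section regions.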

  18. PC Graphic file programing

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

    This book describes basic graphics knowledge and the understanding and implementation of graphic file formats. The first part deals with graphic data, storage of graphic data and data compression, and programming topics such as assembly, the stack, compiling and linking of programs, and practice and debugging. The next part covers graphic file formats such as MacPaint, GEM/IMG, PCX, GIF and TIFF files, hardware considerations such as driving mono and color screens at high speed, the basic concept of dithering, and format conversion.

  19. Experiences on File Systems: Which is the best file system for you?

    CERN Document Server

    Blomer, J

    2015-01-01

    The distributed file system landscape is scattered. Besides a plethora of research file systems, there is also a large number of production grade file systems with various strengths and weaknesses. The file system, as an abstraction of permanent storage, is appealing because it provides application portability and integration with legacy and third-party applications, including UNIX utilities. On the other hand, the general and simple file system interface makes it notoriously difficult for a distributed file system to perform well under a variety of different workloads. This contribution provides a taxonomy of commonly used distributed file systems and points out areas of research and development that are particularly important for high-energy physics.

  20. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
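A toy version of such a list-based index with checksum validation (illustrative names and layout, not the patented implementation):

```python
import hashlib

class ReplicaIndex:
    """Toy list-based index: each file name maps to a list of (node, path)
    pointers, primary copy first and replicas after, plus a checksum that
    can be evaluated to validate any retrieved copy."""
    def __init__(self):
        self.entries = {}

    def add(self, name, data, locations):
        self.entries[name] = {
            "locations": list(locations),  # pointer list: primary, then replicas
            "checksum": hashlib.sha256(data).hexdigest(),
        }

    def locations(self, name):
        return self.entries[name]["locations"]

    def validate(self, name, data):
        """Re-hash a retrieved copy and compare with the stored checksum."""
        return hashlib.sha256(data).hexdigest() == self.entries[name]["checksum"]
```

A query for a file walks the pointer list until a copy validates, falling back to the replicas if the primary is corrupt.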

  1. Characterising Event-Based DOM Inputs to an Urban Watershed

    Science.gov (United States)

    Croghan, D.; Bradley, C.; Hannah, D. M.; Van Loon, A.; Sadler, J. P.

    2017-12-01

    Dissolved Organic Matter (DOM) composition in urban streams is dominated by terrestrial inputs after rainfall events. Urban streams have particularly strong terrestrial-riverine connections due to direct input from terrestrial drainage systems. Event driven DOM inputs can have substantial adverse effects on water quality. Despite this, DOM from important catchment sources such as road drains and Combined Sewage Overflows (CSO's) remains poorly characterised within urban watersheds. We studied DOM sources within an urbanised, headwater watershed in Birmingham, UK. Samples from terrestrial sources (roads, roofs and a CSO), were collected manually after the onset of rainfall events of varying magnitude, and again within 24-hrs of the event ending. Terrestrial samples were analysed for fluorescence, absorbance and Dissolved Organic Carbon (DOC) concentration. Fluorescence and absorbance indices were calculated, and Parallel Factor Analysis (PARAFAC) was undertaken to aid sample characterization. Substantial differences in fluorescence, absorbance, and DOC were observed between source types. PARAFAC-derived components linked to organic pollutants were generally highest within road derived samples, whilst humic-like components tended to be highest within roof samples. Samples taken from the CSO generally contained low fluorescence, however this likely represents a dilution effect. Variation within source groups was particularly high, and local land use seemed to be the driving factor for road and roof drain DOM character and DOC quantity. Furthermore, high variation in fluorescence, absorbance and DOC was apparent between all sources depending on event type. Drier antecedent conditions in particular were linked to greater presence of terrestrially-derived components and higher DOC content. Our study indicates that high variations in DOM character occur between source types, and over small spatial scales. Road drains located on main roads appear to contain the poorest

  2. Principals' Perception of Educational Inputs and Students' Academic ...

    African Journals Online (AJOL)

    This study investigated principals' perception of the relationship between educational inputs and academic performance of students in public junior secondary schools (JSS) in the Central Senatorial District of Delta State, Nigeria. The population was all the 173 public JSS and their principals from which a sample of twenty ...

  3. Input energy measurement toward warm dense matter generation using intense pulsed power generator

    Science.gov (United States)

    Hayashi, R.; Ito, T.; Ishitani, T.; Tamura, F.; Kudo, T.; Takakura, N.; Kashine, K.; Takahashi, K.; Sasaki, T.; Kikuchi, T.; Harada, Nob.; Jiang, W.; Tokuchi, A.

    2016-05-01

    In order to investigate the properties of warm dense matter (WDM) in inertial confinement fusion (ICF), an evaluation method for WDM produced by isochoric heating on the implosion time-scale, using the intense pulsed power generator ETIGO-II (∼1 TW, ∼50 ns), has been considered. In this study, the history of the input energy into the sample is measured from the voltage and current waveforms. To achieve isochoric heating, foamed aluminum with a pore size of 600 μm and 90% porosity was packed into a hollow glass capillary (ø 5 mm × 10 mm). The temperature of the sample is calculated numerically from the measured input power. From these measurements, the input energy into the sample and the achievable temperature are estimated to be 300 J and 6000 K, respectively, indicating that the WDM state can be generated with the proposed method on the ICF implosion time-scale.
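    The energy history described here is the time integral of instantaneous power from the measured waveforms, E = ∫ V(t)·I(t) dt. A minimal numerical sketch of that integration (the waveform shapes, time step, and amplitudes below are illustrative inventions, not the experiment's data):

```python
# Estimate pulsed-power input energy E = integral of V(t) * I(t) dt from
# sampled voltage/current waveforms, via trapezoidal integration.

def input_energy(voltage, current, dt):
    """Trapezoidal integral of instantaneous power P(t) = V(t) * I(t)."""
    power = [v * i for v, i in zip(voltage, current)]
    return sum((power[k] + power[k + 1]) * dt / 2 for k in range(len(power) - 1))

# Toy waveforms: a 50 ns triangular pulse sampled every 1 ns (illustrative only).
dt = 1e-9
volts = [2e5 * min(k, 50 - k) / 25 for k in range(51)]   # peaks at 200 kV
amps  = [5e3 * min(k, 50 - k) / 25 for k in range(51)]   # peaks at 5 kA

print(f"Input energy ~ {input_energy(volts, amps, dt):.2f} J")
```

The same trapezoidal sum applies directly to digitizer traces once the probe calibration factors have been folded into the voltage and current arrays.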

  4. Root Canal Cleaning Efficacy of Rotary and Hand Files Instrumentation in Primary Molars

    Science.gov (United States)

    Nazari Moghaddam, Kiumars; Mehran, Majid; Farajian Zadeh, Hamideh

    2009-01-01

    INTRODUCTION: Pulpectomy of primary teeth is commonly carried out with hand files and broaches; a tricky and time-consuming procedure. The purpose of this in vitro study was to compare the cleaning efficacy and instrumentation time for deciduous molars using hand K-files and the Flex Master rotary system. MATERIALS AND METHODS: In this study, 68 canals of 23 extracted primary molars with at least two-thirds intact roots and 7-12 mm length were selected. After preparing an access cavity, a size #15 K-file was introduced into the root canal and India ink was injected with an insulin syringe. Sixty samples were randomly divided into experimental groups: in group I (n=30), root canals were prepared with hand K-files; in group II (n=30), rotary Flex Master files were used for instrumentation; and in group III, the 8 remaining samples were considered as negative controls. After clearing and root sectioning, the removal of India ink from the cervical, middle, and apical thirds was scored. Data were analyzed using Student's t-test and the Mann-Whitney U test. RESULTS: There was no significant difference between the experimental groups' cleaning efficacy at the cervical, middle and apical root canal thirds. Only the coronal third scored higher in the hand-instrumented group. Instrumentation with Flex Master rotary files was significantly less time consuming than with hand files, favouring the rotary technique. PMID:23940486

  5. PCF File Format.

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. It can contain multiple spectra and information about each spectrum such as energy calibration. This document outlines the format of the file that would allow one to write a computer program to parse and write such files.
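    The abstract outlines the purpose of the format but not its byte layout; the general pattern for parsing such a binary spectrum container with fixed-size records can be sketched as follows. The record layout here (title, live time, real time, channel count, then channel data) is invented purely for illustration and is not the real PCF format:

```python
# Sketch of reading/writing a fixed-layout binary spectrum record with the
# struct module. The field layout below is hypothetical, NOT the PCF spec.
import io
import struct

RECORD_HEADER = "<64s d d i"  # title, live time (s), real time (s), channel count

def write_record(buf, title, live, real, counts):
    buf.write(struct.pack(RECORD_HEADER, title.encode().ljust(64), live, real, len(counts)))
    buf.write(struct.pack(f"<{len(counts)}f", *counts))

def read_record(buf):
    hdr = buf.read(struct.calcsize(RECORD_HEADER))
    title, live, real, n = struct.unpack(RECORD_HEADER, hdr)
    counts = struct.unpack(f"<{n}f", buf.read(4 * n))
    return title.rstrip(b"\x00 ").decode(), live, real, list(counts)

buf = io.BytesIO()
write_record(buf, "Cs-137 check source", 300.0, 305.2, [0.0, 12.0, 660.5])
buf.seek(0)
print(read_record(buf))
```

A real parser built from the PCF document would replace `RECORD_HEADER` with the documented field list and handle multiple spectra per file.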

  6. Adaptive Rate Sampling and Filtering Based on Level Crossing Sampling

    Directory of Open Access Journals (Sweden)

    Saeed Mian Qaisar

    2009-01-01

    Full Text Available The recent sophistication of mobile systems and sensor networks demands more and more processing resources. In order to maintain system autonomy, energy saving is becoming one of the most difficult industrial challenges in mobile computing. Most efforts to achieve this goal focus on improving embedded systems design and battery technology, but very few studies target the time-varying nature of the input signal. This paper aims to achieve power efficiency by intelligently adapting the processing activity to the local characteristics of the input signal. This is done by completely rethinking the processing chain, adopting a non-conventional sampling scheme and adaptive rate filtering. The proposed approach, based on the LCSS (Level Crossing Sampling Scheme), presents two filtering techniques able to adapt their sampling rate and filter order by analyzing the input signal variations online. Indeed, the principle is to intelligently exploit the signal's local characteristics (which are usually never considered) to filter only the relevant signal parts, employing filters of the relevant order. This idea leads to a drastic gain in computational efficiency, and hence in processing power, compared to classical techniques.
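    The core LCSS idea can be sketched in a few lines: a sample is recorded only when the signal crosses one of a set of uniformly spaced levels, so the sampling activity follows the signal's local variations instead of a fixed clock. The level spacing and test signal below are arbitrary choices for illustration:

```python
# Level-crossing sampling: emit a (time, level) pair whenever the signal
# moves into a different quantization bin of width delta.
import math

def level_cross_sample(signal, dt, delta):
    """Return (t, level) pairs at level crossings; uniform sampling step dt."""
    samples = []
    prev_bin = math.floor(signal[0] / delta)
    for k in range(1, len(signal)):
        cur_bin = math.floor(signal[k] / delta)
        if cur_bin != prev_bin:
            # The crossed level sits between the two bins; multi-level jumps
            # are collapsed to one sample in this simplified sketch.
            samples.append((k * dt, max(cur_bin, prev_bin) * delta))
            prev_bin = cur_bin
    return samples

dt = 1e-3
sig = [math.sin(2 * math.pi * 5 * k * dt) for k in range(200)]  # 5 Hz sine, 0.2 s
out = level_cross_sample(sig, dt, delta=0.25)
print(f"{len(out)} level-crossing samples vs {len(sig)} uniform samples")
```

A slowly varying signal produces few samples and a busy one produces many, which is exactly the property the paper exploits to scale filter activity with signal behaviour.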

  7. FY92 Progress Report for the Gyrotron Backward-Wave-Oscillator Experiment

    Science.gov (United States)

    1993-07-01

    C. SAMPLE CABLE CALIBRATION 23 D. ASYST CHANNEL SETUPS 26 E. SAMPLE MAGNET INPUT DATA DECK FOR THE GYRO-BWO 32 F. SAMPLE EGUN INPUT DATA DECK FOR THE...of the first coil of the Helmholtz pair; zero also corresponds to the diode end of the experiment). Another computer code used was the EGUN code (Ref...a short computer program was written to superimpose the two magnetic fields; DC and Helmholtz). An example of an EGUN input data file is included in

  8. OK, Computer: File Sharing, the Music Industry, and Why We Need the Pirate Party

    Directory of Open Access Journals (Sweden)

    Adrian Cosstick

    2009-03-01

    Full Text Available The Pirate Party believes the state and big business are in the process of protecting stale and inefficient models of business for their own monetary benefit by limiting our right to share information. The Pirate Party suggests that they are achieving this goal through the amendment of intellectual property legislation. In the dawn of the digital era, the Pirate Party advocates that governments and multinational corporations are using intellectual property to: crack down on file sharing, which limits the ability to share knowledge and information; increase the terms and length of copyright to raise profits; and build code into music files which limits their ability to be shared (Pirate Party, 2009). There are a number of 'copyright industries' that are affected by these issues, none more so than the music industry. Its relationship with file sharing is topical and makes an excellent case study to address the impact big business has had on intellectual property and the need for the Pirate Party's legislative input. The essay will then examine the central issues raised by illegal file sharing; in particular, the future for record companies in an environment that increasingly demands flexibility, and whether the Pirate Party's proposal is a viable solution to the music industry's problems.

  9. Minimally invasive input function for 2-{sup 18}F-fluoro-A-85380 brain PET studies

    Energy Technology Data Exchange (ETDEWEB)

    Zanotti-Fregonara, Paolo [National Institute of Mental Health, NIH, Molecular Imaging Branch, Bethesda, MD (United States); Maroy, Renaud; Peyronneau, Marie-Anne; Trebossen, Regine [CEA, DSV, I2BM, Service Hospitalier Frederic Joliot, Orsay (France); Bottlaender, Michel [CEA, DSV, I2BM, NeuroSpin, Gif-sur-Yvette (France)

    2012-04-15

    Quantitative neuroreceptor positron emission tomography (PET) studies often require arterial cannulation to measure input function. While population-based input function (PBIF) would be a less invasive alternative, it has only rarely been used in conjunction with neuroreceptor PET tracers. The aims of this study were (1) to validate the use of PBIF for 2-{sup 18}F-fluoro-A-85380, a tracer for nicotinic receptors; (2) to compare the accuracy of measures obtained via PBIF to those obtained via blood-scaled image-derived input function (IDIF) from carotid arteries; and (3) to explore the possibility of using venous instead of arterial samples for both PBIF and IDIF. Ten healthy volunteers underwent a dynamic 2-{sup 18}F-fluoro-A-85380 brain PET scan with arterial and, in seven subjects, concurrent venous serial blood sampling. PBIF was obtained by averaging the normalized metabolite-corrected arterial input function and subsequently scaling each curve with individual blood samples. IDIF was obtained from the carotid arteries using a blood-scaling method. Estimated Logan distribution volume (V{sub T}) values were compared to the reference values obtained from arterial cannulation. For all subjects, PBIF curves scaled with arterial samples were similar in shape and magnitude to the reference arterial input function. The Logan V{sub T} ratio was 1.00 {+-} 0.05; all subjects had an estimation error <10%. IDIF gave slightly less accurate results (V{sub T} ratio 1.03 {+-} 0.07; eight of ten subjects had an error <10%). PBIF scaled with venous samples yielded inaccurate results (V{sub T} ratio 1.13 {+-} 0.13; only three of seven subjects had an error <10%). Due to arteriovenous differences at early time points, IDIF could not be calculated using venous samples. PBIF scaled with arterial samples accurately estimates Logan V{sub T} for 2-{sup 18}F-fluoro-A-85380. Results obtained with PBIF were slightly better than those obtained with IDIF. 

  10. Data File Standard for Flow Cytometry, version FCS 3.1.

    Science.gov (United States)

    Spidlen, Josef; Moore, Wayne; Parks, David; Goldberg, Michael; Bray, Chris; Bierre, Pierre; Gorombey, Peter; Hyun, Bill; Hubbard, Mark; Lange, Simon; Lefebvre, Ray; Leif, Robert; Novo, David; Ostruszka, Leo; Treister, Adam; Wood, James; Murphy, Robert F; Roederer, Mario; Sudar, Damir; Zigon, Robert; Brinkman, Ryan R

    2010-01-01

    The flow cytometry data file standard provides the specifications needed to completely describe flow cytometry data sets within the confines of the file containing the experimental data. In 1984, the first Flow Cytometry Standard format for data files was adopted as FCS 1.0. This standard was modified in 1990 as FCS 2.0 and again in 1997 as FCS 3.0. We report here on the next generation flow cytometry standard data file format. FCS 3.1 is a minor revision based on suggested improvements from the community. The unchanged goal of the standard is to provide a uniform file format that allows files created by one type of acquisition hardware and software to be analyzed by any other type. The FCS 3.1 standard retains the basic FCS file structure and most features of previous versions of the standard. Changes included in FCS 3.1 address potential ambiguities in the previous versions and provide a more robust standard. The major changes include simplified support for international characters and improved support for storing compensation. The major additions are support for preferred display scale, a standardized way of capturing the sample volume, information about originality of the data file, and support for plate and well identification in high throughput, plate based experiments. Please see the normative version of the FCS 3.1 specification in Supporting Information for this manuscript (or at http://www.isac-net.org/ in the Current standards section) for a complete list of changes.

  11. Visual system of recovering and combination of information for ENDF (Evaluated Nuclear Data File) format libraries

    International Nuclear Information System (INIS)

    Ferreira, Claudia A.S. Velloso; Corcuera, Raquel A. Paviotti

    1997-01-01

    This report presents a data retrieval and merging system for ENDF (Evaluated Nuclear Data File) format libraries, which can be run on personal computers under the Windows™ environment. The input is the name of an ENDF/B library, which can be chosen in a proper window. The system has a display function that allows the user to visualize the reaction data of a specific nuclide and to produce a printed copy of these data. The system allows the user to retrieve and/or combine evaluated data from a number of different files, each in ENDF format, to create a single file of data in ENDF format. The user can also create a mini-library from an ENDF/B library. This interactive and easy-to-handle system is a useful tool for Nuclear Data Centers and is also of interest to nuclear and reactor physics researchers. (author)

  12. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  13. Status and evaluation methods of JENDL fusion file and JENDL PKA/KERMA file

    International Nuclear Information System (INIS)

    Chiba, S.; Fukahori, T.; Shibata, K.; Yu Baosheng; Kosako, K.

    1997-01-01

    The status of evaluated nuclear data in the JENDL fusion file and PKA/KERMA file is presented. The JENDL fusion file was prepared in order to improve the quality of the JENDL-3.1 data, especially on the double-differential cross sections (DDXs) of secondary neutrons and gamma-ray production cross sections, and to provide DDXs of secondary charged particles (p, d, t, ³He and α-particles) for the calculation of PKA and KERMA factors. The JENDL fusion file contains evaluated data for 26 elements ranging from Li to Bi. The data in the JENDL fusion file reproduce the measured data on neutron and charged-particle DDXs and also on gamma-ray production cross sections. Recoil spectra in the PKA/KERMA file were calculated from the secondary neutron and charged-particle DDXs contained in the fusion file using two-body reaction kinematics. The data in the JENDL fusion file and PKA/KERMA file were compiled in ENDF-6 format with the MF=6 option to store the DDX data. (orig.)

  14. Instructions for preparation of data entry sheets for Licensee Event Report (LER) file. Revision 1. Instruction manual

    International Nuclear Information System (INIS)

    1977-07-01

    The manual provides instructions for the preparation of data entry sheets for the licensee event report (LER) file. It is a revision to an interim manual published in October 1974 in 00E-SS-001. The LER file is a computer-based data bank of information using the data entry sheets as input. These data entry sheets contain pertinent information in regard to those occurrences required to be reported to the NRC. The computer-based data bank provides a centralized source of data that may be used for qualitative assessment of the nature and extent of off-normal events in the nuclear industry and as an index of source information to which users may refer for more detail

  15. Profitability, Inputs Elasticities And Resource-Use Efficiency In Small ...

    African Journals Online (AJOL)

    The study examined profitability, inputs elasticities and resource-use efficiency in small scale cowpea production in Niger State, Nigeria. The primary data for the study were obtained using structured questionnaire administered to one hundred randomly sampled farmers from two Local Government Areas. Descriptive ...

  16. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    International Nuclear Information System (INIS)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ₁-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy
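    The setting the abstract describes can be illustrated with a toy Hermite PC recovery. This sketch samples the natural Gaussian input density and recovers sparse coefficients by ordinary least squares; it stands in for, and does not reproduce, the paper's ℓ₁-minimization and coherence-optimal MCMC sampling, and all numbers are invented:

```python
# Toy polynomial chaos identification: a model with a standard-normal random
# input is expanded in probabilists' Hermite polynomials He_n, and the
# coefficients are recovered from Monte Carlo samples by least squares.
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(0)
true_c = np.array([1.0, 0.0, -0.5, 0.0, 0.25])   # sparse He-coefficients

x = rng.standard_normal(200)                      # natural sampling distribution
A = hermevander(x, deg=len(true_c) - 1)           # measurement matrix A[i, n] = He_n(x_i)
y = A @ true_c                                    # noiseless model evaluations

c_hat = np.linalg.lstsq(A, y, rcond=None)[0]      # recovered coefficients
print(np.round(c_hat, 6))
```

In the paper's regime the number of samples is small relative to the basis size, which is where sparse recovery and the choice of sampling distribution (natural vs. importance vs. coherence-optimal) start to matter.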

  17. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    Science.gov (United States)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  18. WAV (Waveform) Audio File Security Application Using the RSA Algorithm

    Directory of Open Access Journals (Sweden)

    Raja Nasrul Fuad

    2017-03-01

    Full Text Available The WAV file format is widely used across many kinds of multimedia and gaming platforms. Ease of access and technological development, with a variety of media, facilitate the exchange of information between places. Important data that must be kept confidential face a wide range of security threats, as data can be intercepted and read by third parties during transmission. These problems led to the idea of creating an application with data security functions that secures data using the RSA algorithm. The programming language is C# with the Visual Studio software. The processed data are the individual sample bytes of the WAV file, while the header is left as in the original so that the WAV file can still be played even though its information has been concealed. The RSA algorithm can thus be implemented in a programming language so that WAV files can be processed and their data secured.
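    A rough sketch of the scheme the abstract describes: leave the WAV header intact so the file still parses as audio, and transform only the sample bytes. Textbook RSA with a deliberately tiny key is used here purely for illustration (it is not secure, and the paper's C# implementation is not reproduced):

```python
# Header-preserving WAV "encryption" sketch: the wave module handles the
# header, and only the audio frames are run through toy textbook RSA.
import io
import wave

n, e, d = 3233, 17, 2753          # toy RSA key: n = 61 * 53 (insecure, demo only)

def rsa_bytes(data, exponent):
    """Apply modular exponentiation to each sample byte independently."""
    return [pow(b, exponent, n) for b in data]

# Build a minimal mono 8-bit WAV in memory as a stand-in for a real file.
raw = bytes(range(64))
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1); w.setsampwidth(1); w.setframerate(8000)
    w.writeframes(raw)

buf.seek(0)
with wave.open(buf, "rb") as w:
    frames = w.readframes(w.getnframes())   # header parsed, left untouched

cipher = rsa_bytes(frames, e)               # each ciphertext value is < n
plain = bytes(pow(c, d, n) for c in cipher)
assert plain == raw
print("recovered", len(plain), "sample bytes")
```

Note that per-byte RSA is a deterministic substitution (the same byte always maps to the same ciphertext value), one of the reasons a production design would wrap a symmetric cipher with RSA instead.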

  19. Access to Research Inputs

    DEFF Research Database (Denmark)

    Czarnitzki, Dirk; Grimpe, Christoph; Pellens, Maikel

    2015-01-01

    The viability of modern open science norms and practices depends on public disclosure of new knowledge, methods, and materials. However, increasing industry funding of research can restrict the dissemination of results and materials. We show, through a survey sample of 837 German scientists in life sciences, natural sciences, engineering, and social sciences, that scientists who receive industry funding are twice as likely to deny requests for research inputs as those who do not. Receiving external funding in general does not affect denying others access. Scientists who receive external funding of any kind are, however, 50% more likely to be denied access to research materials by others, but this is not affected by being funded specifically by industry.

  20. Access to Research Inputs

    DEFF Research Database (Denmark)

    Czarnitzki, Dirk; Grimpe, Christoph; Pellens, Maikel

    The viability of modern open science norms and practices depends on public disclosure of new knowledge, methods, and materials. However, increasing industry funding of research can restrict the dissemination of results and materials. We show, through a survey sample of 837 German scientists in life sciences, natural sciences, engineering, and social sciences, that scientists who receive industry funding are twice as likely to deny requests for research inputs as those who do not. Receiving external funding in general does not affect denying others access. Scientists who receive external funding of any kind are, however, 50% more likely to be denied access to research materials by others, but this is not affected by being funded specifically by industry.

  1. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    International Nuclear Information System (INIS)

    Schreiner, Steffen; Banerjee, Subho Sankar; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Bagnasco, Stefano; Zhu Jianlin

    2011-01-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.

  2. Parameters for calculation of nuclear reactions of relevance to non-energy nuclear applications (Reference Input Parameter Library: Phase III). Summary report of the first research coordination meeting

    International Nuclear Information System (INIS)

    Capote Noy, R.

    2004-08-01

    A summary is given of the First Research Coordination Meeting on Parameters for Calculation of Nuclear Reactions of Relevance to Non-Energy Nuclear Applications (Reference Input Parameter Library: Phase III), including a critical review of the RIPL-2 file. The new library should serve as input for theoretical calculations of nuclear reaction data at incident energies up to 200 MeV, as needed for energy and non-energy modern applications of nuclear data. Technical discussions and the resulting work plan of the Coordinated Research Programme are summarized, along with actions and deadlines. Participants' contributions to the RCM are also attached. (author)

  3. Fast Ordered Sampling of DNA Sequence Variants

    Directory of Open Access Journals (Sweden)

    Anthony J. Greenberg

    2018-05-01

    Full Text Available Explosive growth in the amount of genomic data is matched by increasing power of consumer-grade computers. Even applications that require powerful servers can be quickly tested on desktop or laptop machines if we can generate representative samples from large data sets. I describe a fast and memory-efficient implementation of an on-line sampling method developed for tape drives 30 years ago. Focusing on genotype files, I test the performance of this technique on modern solid-state and spinning hard drives, and show that it performs well compared to a simple sampling scheme. I illustrate its utility by developing a method to quickly estimate genome-wide patterns of linkage disequilibrium (LD) decay with distance. I provide open-source software that samples loci from several variant format files, a separate program that performs LD decay estimates, and a C++ library that lets developers incorporate these methods into their own projects.
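    The abstract does not name the tape-drive-era algorithm it implements; a classic one-pass, order-preserving scheme from that period is selection sampling (Knuth's Algorithm S), sketched here on hypothetical variant records to show the idea of drawing an ordered sample in a single streaming pass:

```python
# Selection sampling (Algorithm S): one pass over the records, selecting each
# with probability needed/remaining, yields exactly n items in original order.
import random

def ordered_sample(records, n, rng=None):
    rng = rng or random.Random()
    chosen, remaining, needed = [], len(records), n
    for rec in records:
        if rng.random() < needed / remaining:
            chosen.append(rec)
            needed -= 1
            if needed == 0:
                break
        remaining -= 1
    return chosen

loci = [f"variant_{i}" for i in range(10_000)]   # stand-in for genotype loci
sample = ordered_sample(loci, 100, random.Random(42))
print(len(sample), sample[:3])
```

Because it never buffers unselected records, the scheme runs in O(N) time with O(n) memory, which is what makes it attractive for sequentially read media, whether tape drives or large variant files.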

  4. Fast Ordered Sampling of DNA Sequence Variants.

    Science.gov (United States)

    Greenberg, Anthony J

    2018-05-04

    Explosive growth in the amount of genomic data is matched by increasing power of consumer-grade computers. Even applications that require powerful servers can be quickly tested on desktop or laptop machines if we can generate representative samples from large data sets. I describe a fast and memory-efficient implementation of an on-line sampling method developed for tape drives 30 years ago. Focusing on genotype files, I test the performance of this technique on modern solid-state and spinning hard drives, and show that it performs well compared to a simple sampling scheme. I illustrate its utility by developing a method to quickly estimate genome-wide patterns of linkage disequilibrium (LD) decay with distance. I provide open-source software that samples loci from several variant format files, a separate program that performs LD decay estimates, and a C++ library that lets developers incorporate these methods into their own projects. Copyright © 2018 Greenberg.

  5. Application of HDF5 in long-pulse quasi-steady state data acquisition at high sampling rate

    International Nuclear Information System (INIS)

    Chen, Y.; Wang, F.; Li, S.; Xiao, B.J.; Yang, F.

    2014-01-01

    Highlights: • The new data-acquisition system supports long-pulse EAST data acquisition. • The new data-acquisition system can handle most of the high-frequency signals of EAST experiments. • The system's total throughput is about 500 MB/s. • The system uses HDF5 to store data. - Abstract: A new high-sampling-rate, quasi-steady-state data-acquisition system has been designed for the microwave reflectometry diagnostic of EAST experiments. To meet the requirements of long-pulse discharges and high sampling rates, it is based on PXI Express technology. A high-performance National Instruments PXIe-5122 digitizer, with two synchronous analog input channels and a maximum sampling rate of 100 MHz, has been adopted. Two PXIe-5122 boards at 60 MSPS and one PXIe-6368 board at 2 MSPS are used in the system, giving a total throughput of about 500 MB/s. To guarantee that the large volume of data is saved continuously during a long-pulse discharge, an external NI HDD-8265 hard-disk data-stream enclosure with a sustained read/write speed of 700 MB/s is used; in RAID-5 mode its usable storage is 80% of the total capacity. The raw data first stream continuously into the NI HDD-8265 during the discharge, and are then transferred to the data server automatically and converted into the HDF5 file format. HDF5 is an open-source file format for data storage and management that has been widely used in various fields and is well suited to long-term use. The details of the system are described in the paper

  6. Long-term atmospheric nutrient inputs to the Eastern Mediterranean: sources, solubility and comparison with riverine inputs

    Science.gov (United States)

    Koçak, M.; Kubilay, N.; Tuǧrul, S.; Mihalopoulos, N.

    2010-07-01

    Aerosol and rain samples were collected at a rural site on the coastline of the Eastern Mediterranean (Erdemli, Turkey) between January 1999 and December 2007. Riverine sampling was carried out at five rivers (Ceyhan, Seyhan, Göksu, Berdan and Lamas) draining into the Northeastern Levantine Basin (NLB) between March 2002 and July 2007. Samples were analyzed for the macronutrients phosphate, silicate, nitrate and ammonium (PO₄³⁻, Si_diss, NO₃⁻ and NH₄⁺). Phosphate and silicate in aerosol and rainwater showed higher concentrations and larger variation during the transitional period (March-May, September), when air flows predominantly originate from North Africa and the Middle East/Arabian Peninsula. A deficiency of alkaline material was found to be the main cause of acidic rain events, whilst high pH values (>7) were associated with high Si_diss concentrations due to sporadic dust events. In general, the lowest nitrate and ammonium concentrations in aerosol and rainwater were associated with air flow from the Mediterranean Sea. Unlike NO₃⁻ and NH₄⁺ (Dissolved Inorganic Nitrogen, DIN), there were statistical differences between PO₄³⁻ and Si_diss solubilities in sea water and pure water. The solubilities of PO₄³⁻ and Si_diss were found to be related to air mass back trajectories and pH. Comparison of atmospheric with riverine fluxes demonstrated that DIN and PO₄³⁻ fluxes to the NLB were dominated by the atmosphere (~90% and ~60%, respectively), whereas the input of Si was mainly derived from riverine runoff (~90%). N/P ratios (atmosphere ~233; riverine ~28) revealed that the NLB receives excessive amounts of DIN, and these unbalanced P and N inputs may provoke even more phosphorus deficiency. Molar Si/N ratios (atmosphere + riverine) suggested Si limitation, which might cause a switch from diatom-dominated phytoplankton communities to non-siliceous populations in the NLB.

  7. A Metadata-Rich File System

    Energy Technology Data Exchange (ETDEWEB)

    Ames, S; Gokhale, M B; Maltzahn, C

    2009-01-07

    Despite continual improvements in the performance and reliability of large scale file systems, the management of file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, metadata, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS includes Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.

  8. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-21

    Before physically-meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely-trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides validation of user input. By giving rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document serves as a discussion of the current state of input checking and testing practices in NJOY21.

  9. FLUTAN 2.0. Input specifications

    International Nuclear Information System (INIS)

    Willerding, G.; Baumann, W.

    1996-05-01

    FLUTAN is a highly vectorized computer code for 3D fluid-dynamic and thermal-hydraulic analyses in Cartesian or cylindrical coordinates. It is related to the family of COMMIX codes originally developed at Argonne National Laboratory, USA, and particularly to COMMIX-1A and COMMIX-1B, which were made available to FZK in the frame of cooperation contracts within the fast reactor safety field. FLUTAN 2.0 is an improved version of the FLUTAN code released in 1992. It offers some additional innovations, e.g. the QUICK-LECUSSO-FRAM techniques for reducing numerical diffusion in the k-ε turbulence model equations; a more sophisticated wall model for specifying a mass flow outside the surface walls together with its flow path and its associated inlet and outlet flow temperatures; and a revised and upgraded pressure boundary condition to fully include the outlet cells in the solution process of the conservation equations. Last but not least, a so-called visualization option based on VISART standards has been provided. This report contains detailed input instructions, presents formulations of the various model options, and explains how to use the code by means of comprehensive sample input. (orig.) [de

  10. HUD GIS Boundary Files

    Data.gov (United States)

    Department of Housing and Urban Development — The HUD GIS Boundary Files are intended to supplement boundary files available from the U.S. Census Bureau. The files are for community planners interested in...

  11. Electronic Document Management Using Inverted Files System

    Science.gov (United States)

    Suhartono, Derwin; Setiawan, Erwin; Irwanto, Djon

    2014-03-01

    The number of documents is increasing rapidly, and these documents exist not only in paper-based but also in electronic form. This can be seen from a data sample taken from the SpringerLink publisher in 2010, which showed an increase in the number of digital document collections from 2003 to mid-2010. How to manage them well therefore becomes an important need. This paper describes a new method for managing documents called the inverted files system. Applied to electronic documents, the inverted files system allows them to be searched over the Internet using a search engine. It can improve both the document search mechanism and the document save mechanism.

  12. Electronic Document Management Using Inverted Files System

    Directory of Open Access Journals (Sweden)

    Suhartono Derwin

    2014-03-01

    Full Text Available The number of documents is increasing rapidly, and these documents exist not only in paper-based but also in electronic form. This can be seen from a data sample taken from the SpringerLink publisher in 2010, which showed an increase in the number of digital document collections from 2003 to mid-2010. How to manage them well therefore becomes an important need. This paper describes a new method for managing documents called the inverted files system. Applied to electronic documents, the inverted files system allows them to be searched over the Internet using a search engine. It can improve both the document search mechanism and the document save mechanism.

  13. 33 CFR 148.246 - When is a document considered filed and where should I file it?

    Science.gov (United States)

    2010-07-01

    ... filed and where should I file it? 148.246 Section 148.246 Navigation and Navigable Waters COAST GUARD... Formal Hearings § 148.246 When is a document considered filed and where should I file it? (a) If a document to be filed is submitted by mail, it is considered filed on the date it is postmarked. If a...

  14. Temperature increases on the external root surface during endodontic treatment using single file systems.

    Science.gov (United States)

    Özkocak, I; Taşkan, M M; Göktürk, H; Aytac, F; Karaarslan, E Şirin

    2015-01-01

    The aim of this study is to evaluate increases in temperature on the external root surface during endodontic treatment with different rotary systems. Fifty human mandibular incisors with a single root canal were selected. All root canals were instrumented using a size 20 Hedstrom file, and the canals were irrigated with 5% sodium hypochlorite solution. The samples were randomly divided into the following three groups of 15 teeth: Group 1: the OneShape Endodontic File no. 25; Group 2: the Reciproc Endodontic File no. 25; Group 3: the WaveOne Endodontic File no. 25. During the preparation, the temperature changes were measured in the middle third of the roots using a noncontact infrared thermometer. The temperature data were transferred from the thermometer to the computer and were observed graphically. Statistical analysis was performed using the Kruskal-Wallis analysis of variance at a significance level of 0.05. The increases in temperature caused by the OneShape file system were lower than those of the other files (P < 0.05); the reciprocating file systems showed the highest temperature increases. However, there were no significant differences between the Reciproc and WaveOne files. The single file rotary systems used in this study may be recommended for clinical use.

  15. Protecting your files on the DFS file system

    CERN Multimedia

    Computer Security Team

    2011-01-01

    The Windows Distributed File System (DFS) hosts user directories for all NICE users plus much more data. Files can be accessed from anywhere, via a dedicated web portal (http://cern.ch/dfs). Due to the ease of access to DFS within CERN, it is of utmost importance to properly protect access to sensitive data. As the use of DFS access control mechanisms is not obvious to all users, passwords, certificates or sensitive files might get exposed. At least this happened in the past with the Andrew File System (AFS, the Linux equivalent of DFS) and led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem not only affects the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed recently to apply more stringent protections to all DFS user folders. The goal of this data protection policy is to assist users in pro...

  16. Protecting your files on the AFS file system

    CERN Multimedia

    2011-01-01

    The Andrew File System is a world-wide distributed file system linking hundreds of universities and organizations, including CERN. Files can be accessed from anywhere, via dedicated AFS client programs or via web interfaces that export the file contents on the web. Due to the ease of access to AFS it is of utmost importance to properly protect access to sensitive data in AFS. As the use of AFS access control mechanisms is not obvious to all users, passwords, private SSH keys or certificates have been exposed in the past. In one specific instance, this also led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed in April 2010 to apply more stringent folder protections to all AFS user folders. The goal of this data protection policy is to assist users in...

  17. Zebra: A striped network file system

    Science.gov (United States)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails, its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong to. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity updates.
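
    The parity scheme described above can be sketched with byte-wise XOR: losing any one stripe fragment leaves it recoverable from the remaining fragments plus the parity fragment. Fragment contents and the server count below are arbitrary illustrative choices, not Zebra's actual on-disk format:

```python
# RAID-style parity as used by striped file systems such as Zebra (sketch).

def compute_parity(fragments):
    """XOR byte-wise across equal-length stripe fragments to get the parity."""
    parity = bytearray(len(fragments[0]))
    for frag in fragments:
        for i, b in enumerate(frag):
            parity[i] ^= b
    return bytes(parity)

def reconstruct(surviving, parity):
    """Rebuild a lost fragment from the surviving fragments plus parity."""
    return compute_parity(list(surviving) + [parity])

# A stripe written across three data servers:
stripe = [b"AAAA", b"BBBB", b"CCCC"]
p = compute_parity(stripe)

# Server 1 fails; its fragment is recoverable from the rest:
rebuilt = reconstruct([stripe[0], stripe[2]], p)
assert rebuilt == stripe[1]
```

Because Zebra stripes the client's write stream rather than individual files, one parity fragment per stripe covers blocks from many files at once.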

  18. Total dose induced increase in input offset voltage in JFET input operational amplifiers

    International Nuclear Information System (INIS)

    Pease, R.L.; Krieg, J.; Gehlhausen, M.; Black, J.

    1999-01-01

    Four different types of commercial JFET input operational amplifiers were irradiated with ionizing radiation under a variety of test conditions. All experienced significant increases in input offset voltage (Vos). Microprobe measurement of the electrical characteristics of the de-coupled input JFETs demonstrates that the increase in Vos is a result of the mismatch of the degraded JFETs. (authors)

  19. File compression and encryption based on LLS and arithmetic coding

    Science.gov (United States)

    Yu, Changzhi; Li, Hengjian; Wang, Xiyu

    2018-03-01

    We propose a file compression model based on arithmetic coding. First, the original symbols to be encoded are input to the encoder one by one; a set of chaotic sequences is produced using the logistic and sine chaos system (LLS), and the values of these chaotic sequences randomly modify the upper and lower limits of the current symbol's probability. In order to achieve encryption, we modify the upper and lower limits of all character probabilities when encoding each symbol. Experimental results show that the proposed model can achieve the purpose of data encryption while achieving almost the same compression efficiency as arithmetic coding.
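
    A rough sketch of the chaotic key-stream idea: a combined logistic-sine map generates a sequence whose values could perturb the symbol-probability bounds during coding, with the initial condition acting as the key. The exact map, parameters, and seeding below are assumptions for illustration; the paper's precise LLS construction may differ:

```python
import math

def lls_sequence(x0, r, n):
    """Iterate a combined logistic-sine map:
    x -> (r*x*(1-x) + (4-r)*sin(pi*x)/4) mod 1  (an assumed form of LLS)."""
    xs, x = [], x0
    for _ in range(n):
        x = (r * x * (1.0 - x) + (4.0 - r) * math.sin(math.pi * x) / 4.0) % 1.0
        xs.append(x)
    return xs

# Key stream for perturbing probability bounds during arithmetic coding:
ks = lls_sequence(0.347, 3.99, 5)

# Sensitivity to the initial condition (the key): a tiny change to x0
# yields a different stream, so decoding with the wrong key fails.
ks2 = lls_sequence(0.347 + 1e-10, 3.99, 5)
```

The values stay in [0, 1), so each one can be scaled into a small perturbation of the current symbol's interval without breaking decodability, provided the decoder regenerates the same stream from the shared key.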

  20. FlaME: Flash Molecular Editor - a 2D structure input tool for the web

    Directory of Open Access Journals (Sweden)

    Dallakian Pavel

    2011-02-01

    Full Text Available Abstract Background So far, there have been no Flash-based web tools available for chemical structure input. The authors herein present a feasibility study, aiming at the development of a compact and easy-to-use 2D structure editor, using Adobe's Flash technology and its programming language, ActionScript. As a reference model application from the Java world, we selected the Java Molecular Editor (JME). In this feasibility study, we made an attempt to realize a subset of JME's functionality in the Flash Molecular Editor (FlaME) utility. These basic capabilities are: structure input, editing and depiction of single molecules, and data import and export in molfile format. Implementation The result of molecular diagram sketching in FlaME is accessible in V2000 molfile format. By integrating the molecular editor into a web page, its communication with the HTML elements on this page is established using the two JavaScript functions, getMol() and setMol(). In addition, structures can be copied to the system clipboard. Conclusion A first attempt was made to create a compact single-file application for 2D molecular structure input/editing on the web, based on Flash technology. With the application examples presented in this article, it could be demonstrated that the Flash methods are principally well-suited to provide the requisite communication between the Flash object (application) and the HTML elements on a web page, using JavaScript functions.

  1. BioSampling Data from LHP Cruises

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set includes separate bioSampling logs from each LHP Bottomfishing cruise both within and outside of the Main Hawaiian Islands, as well as a master file...

  2. Screening important inputs in models with strong interaction properties

    International Nuclear Information System (INIS)

    Saltelli, Andrea; Campolongo, Francesca; Cariboni, Jessica

    2009-01-01

    We introduce a new method for screening inputs in mathematical or computational models with large numbers of inputs. The method proposed here represents an improvement over the best available practice for this setting when dealing with models having strong interaction effects. When the sample size is sufficiently high, the same design can also be used to obtain accurate quantitative estimates of the variance-based sensitivity measures: the same simulations can be used to obtain estimates of the variance-based measures according to the Sobol' and the Jansen formulas. Results demonstrate that Sobol' is more efficient for the computation of the first-order indices, while Jansen performs better for the computation of the total indices.
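
    The Sobol' and Jansen estimators mentioned above can be sketched on a toy additive model with independent uniform inputs; the test function f and the sample size are illustrative choices, not the paper's case studies:

```python
import random

def f(x):
    # Toy model: f(x) = x1 + 2*x2, with x1, x2 ~ U(0,1).
    # Analytically, S1 = 0.2 and S2 = 0.8 (additive, so ST_i = S_i).
    return x[0] + 2.0 * x[1]

random.seed(0)
N, k = 20000, 2
A = [[random.random() for _ in range(k)] for _ in range(N)]
B = [[random.random() for _ in range(k)] for _ in range(N)]
yA = [f(x) for x in A]
yB = [f(x) for x in B]
fbar = sum(yA) / N
var = sum((y - fbar) ** 2 for y in yA) / N

S, ST = [], []
for i in range(k):
    # AB_i: matrix A with column i taken from B (the radial design).
    yAB = [f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
    # Sobol'-type estimator for the first-order index S_i.
    S.append(sum(yb * (yab - ya)
                 for yb, yab, ya in zip(yB, yAB, yA)) / N / var)
    # Jansen estimator for the total-effect index ST_i.
    ST.append(sum((ya - yab) ** 2
                  for ya, yab in zip(yA, yAB)) / (2 * N) / var)
```

Both estimators reuse the same N*(k+2) model runs, which is the economy the abstract refers to when it says one design serves for screening and for quantitative indices.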

  3. Screening important inputs in models with strong interaction properties

    Energy Technology Data Exchange (ETDEWEB)

    Saltelli, Andrea [European Commission, Joint Research Centre, 21020 Ispra, Varese (Italy); Campolongo, Francesca [European Commission, Joint Research Centre, 21020 Ispra, Varese (Italy)], E-mail: francesca.campolongo@jrc.it; Cariboni, Jessica [European Commission, Joint Research Centre, 21020 Ispra, Varese (Italy)

    2009-07-15

    We introduce a new method for screening inputs in mathematical or computational models with large numbers of inputs. The method proposed here represents an improvement over the best available practice for this setting when dealing with models having strong interaction effects. When the sample size is sufficiently high the same design can also be used to obtain accurate quantitative estimates of the variance-based sensitivity measures: the same simulations can be used to obtain estimates of the variance-based measures according to the Sobol' and the Jansen formulas. Results demonstrate that Sobol' is more efficient for the computation of the first-order indices, while Jansen performs better for the computation of the total indices.

  4. JENDL Dosimetry File

    International Nuclear Information System (INIS)

    Nakazawa, Masaharu; Iguchi, Tetsuo; Kobayashi, Katsuhei; Iwasaki, Shin; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo.

    1992-03-01

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d, n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form. (author) 76 refs

  5. JENDL Dosimetry File

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Masaharu; Iguchi, Tetsuo [Tokyo Univ. (Japan). Faculty of Engineering; Kobayashi, Katsuhei [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Iwasaki, Shin [Tohoku Univ., Sendai (Japan). Faculty of Engineering; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1992-03-15

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d,n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form.

  6. Atmospheric nutrient inputs to the northern levantine basin from a long-term observation: sources and comparison with riverine inputs

    Science.gov (United States)

    Koçak, M.; Kubilay, N.; Tuğrul, S.; Mihalopoulos, N.

    2010-12-01

    Aerosol and rainwater samples have been collected at a rural site located on the coastline of the Eastern Mediterranean, Erdemli, Turkey between January 1999 and December 2007. Riverine sampling was carried out at five rivers (Ceyhan, Seyhan, Göksu, Berdan and Lamas) draining into the Northeastern Levantine Basin (NLB) between March 2002 and July 2007. Samples have been analyzed for the macronutrients phosphate, silicate, nitrate and ammonium (PO43-, Sidiss, NO3- and NH4+). Phosphate and silicate in aerosol and rainwater showed higher concentrations and larger variations during the transitional period when air flows predominantly originate from North Africa and the Middle East/Arabian Peninsula. Deficiency of alkaline material has been found to be the main reason for the acidic rain events, whilst high pH values (>7) have been associated with high Sidiss concentrations due to sporadic dust events. In general, the lowest nitrate and ammonium concentrations in aerosol and rainwater have been associated with air flow from the Mediterranean Sea. Comparison of atmospheric with riverine fluxes demonstrated that DIN and PO43- fluxes to the NLB have been dominated by the atmosphere (~90% and ~60%, respectively) whereas the input of Si was mainly derived from riverine runoff (~90%). N/P ratios in atmospheric deposition (233) and riverine discharge (28) revealed that the NLB receives excessive amounts of DIN, and these unbalanced P and N inputs may provoke even more phosphorus deficiency. The observed molar Si/N ratio suggested that Si limitation relative to nitrogen might cause a switch from diatom-dominated communities to non-siliceous populations, particularly in the coastal NLB.

  7. FPFP2: A code for following airborne fission products in generic nuclear plant flow paths

    International Nuclear Information System (INIS)

    Owcarski, P.C.; Burk, K.W.; Ramsdell, J.V.; Yasuda, D.D.

    1991-03-01

    In order to assure that a nuclear power plant control room remains habitable during certain types of postulated accidents, Pacific Northwest Laboratory (PNL) has undertaken a special study for the US Nuclear Regulatory Commission. The purpose of this study is to develop software that can aid in the analyses of control room habitability during accidents in which airborne fission products could challenge internal air pathways to the control room. PNL has completed an initial version (FPFP) and a final version (FPFP 2) of a software package that can estimate the unsteady-state invasion of quantities of fission products into the control room or any other destination within the nuclear plant via generic internal flow paths. This report consists of three parts: Section 2.0, Technical Bases, describes the flow path components and mechanisms of natural fission product deposition; Section 3.0, FPFP 2 Code Description, describes code organization and the functions of the subroutines; and Section 4.0, Code Operation, discusses details of input requirements, code output, and a sample case demonstration. The appendices consist of an FPFP 2 Fortran code listing, a listing of a code for building input files, forms for building input files, and the sample case input and output files. 7 refs., 3 figs

  8. Guidelines for determining inputs of inorganic contaminants into estuaries

    International Nuclear Information System (INIS)

    1987-01-01

    This publication describes sampling and sample preparation procedures suitable for obtaining unpolluted samples for the purpose of determining river inputs of inorganic pollutants into estuaries. Emphasis is placed on heavy metal pollutants, but the procedures are suitable, with appropriate modifications, for other inorganic pollutants. For example, the collection of samples for mercury may require modifications of handling procedures. River water samples are collected at the most down-river point where no estuarine influences affect results. Samples are collected using a peristaltic pump and separated into aqueous and particulate phases for pollutant analysis. As is the case for all trace pollutant analyses, meticulous care is required to prevent pollution of the sample; in addition to the precautions described in this method, great personal attention is required to minimize sample handling, pollution by smoke, hands, hair, dust, talc from gloves, etc., and to avoid all contact of the samples and reagents with skin and metallic objects. 1 ref., 3 figs, 1 tab

  9. Influence of cervical preflaring on apical file size determination.

    Science.gov (United States)

    Pecora, J D; Capelli, A; Guerisoli, D M Z; Spanó, J C E; Estrela, C

    2005-07-01

    To investigate the influence of cervical preflaring with different instruments (Gates-Glidden drills, Quantec Flare series instruments and LA Axxess burs) on the first file that binds at working length (WL) in maxillary central incisors. Forty human maxillary central incisors with complete root formation were used. After standard access cavities, a size 06 K-file was inserted into each canal until the apical foramen was reached. The WL was set 1 mm short of the apical foramen. Group 1 received the initial apical instrument without previous preflaring of the cervical and middle thirds of the root canal. Group 2 had the cervical and middle portion of the root canals enlarged with Gates-Glidden drills sizes 90, 110 and 130. Group 3 had the cervical and middle thirds of the root canals enlarged with nickel-titanium Quantec Flare series instruments. Titanium-nitride treated, stainless steel LA Axxess burs were used for preflaring the cervical and middle portions of root canals from group 4. Each canal was sized using manual K-files, starting with size 08 files with passive movements until the WL was reached. File sizes were increased until a binding sensation was felt at the WL, and the instrument size was recorded for each tooth. The apical region was then observed under a stereoscopic magnifier, images were recorded digitally and the differences between root canal and maximum file diameters were evaluated for each sample. Significant differences were found between experimental groups regarding anatomical diameter at the WL and the first file to bind in the canal (P < 0.05); the Gates-Glidden drills and Quantec Flare instruments were ranked in an intermediate position, with no statistically significant differences between them (0.093 mm average). The instrument binding technique for determining anatomical diameter at WL is not precise. 
Preflaring of the cervical and middle thirds of the root canal improved anatomical diameter determination; the instrument used for preflaring played a major role in determining the

  10. A Java-based tool for creating KML files from GPS waypoints

    Science.gov (United States)

    Kinnicutt, P. G.; Rivard, C.; Rimer, S.

    2008-12-01

    Google Earth provides a free tool with powerful capabilities for visualizing geoscience images and data. Commercial software tools exist for doing sophisticated digitizing and spatial modeling, but for the purposes of presentation, visualization and overlaying aerial images with data, Google Earth provides much of the functionality. Likewise, with current technologies in GPS (Global Positioning System) systems and with Google Earth Plus, it is possible to upload GPS waypoints, tracks and routes directly into Google Earth for visualization. However, older technology GPS units and even low-cost GPS units found today may lack the necessary communications interface to a computer (e.g. no Bluetooth, no WiFi, no USB, no Serial, etc.) or may have an incompatible interface, such as a Serial port but no USB adapter available. In such cases, any waypoints, tracks and routes saved in the GPS unit or recorded in a field notebook must be manually transferred to a computer for use in a GIS system or other program. This presentation describes a Java-based tool developed by the author which enables users to enter GPS coordinates in a user-friendly manner, then save these coordinates in the Keyhole Markup Language (KML) file format for visualization in Google Earth. This tool accepts either user-interactive input or input from a CSV (Comma Separated Value) file, which can be generated from any spreadsheet program, and accepts coordinates in lat/long or UTM (Universal Transverse Mercator) form. This presentation describes the system's applicability through several small case studies. This free and lightweight tool simplifies the task of manually inputting GPS data into Google Earth for people working in the field without an automated mechanism for uploading the data; for instance, the user may not have internet connectivity or may not have the proper hardware or software. 
    Since it is a Java application and not a web-based tool, it can be installed on one
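
    A minimal sketch of the CSV-to-KML conversion such a tool performs (the described tool is written in Java; this sketch assumes a hypothetical `name,latitude,longitude` CSV layout):

```python
import csv
import io

# Skeleton KML 2.2 document; placemarks are filled in below.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
<Document>
{placemarks}</Document>
</kml>
"""

def waypoints_to_kml(csv_text):
    """Turn 'name,latitude,longitude' rows into a KML document string."""
    marks = []
    for name, lat, lon in csv.reader(io.StringIO(csv_text)):
        # KML <coordinates> are longitude,latitude -- note the swapped order.
        marks.append("<Placemark><name>%s</name>"
                     "<Point><coordinates>%s,%s</coordinates></Point>"
                     "</Placemark>\n" % (name, lon.strip(), lat.strip()))
    return KML_TEMPLATE.format(placemarks="".join(marks))

kml = waypoints_to_kml("Camp,46.5,7.9\nSummit,46.55,7.98")
```

The resulting string can be saved as a `.kml` file and opened directly in Google Earth.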

  11. A File Archival System

    Science.gov (United States)

    Fanselow, J. L.; Vavrus, J. L.

    1984-01-01

    ARCH, a file archival system for the DEC VAX, provides easy offline storage and retrieval of arbitrary files on a DEC VAX system. The system is designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of the same programs and associated files.

  12. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

    Full Text Available A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.
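
    The evidential asymmetry between strong and weak sampling can be sketched with the "size principle": under strong sampling, each observation is drawn from the sentences a grammar licenses, so a smaller grammar makes the observed data more probable. The toy grammars and uniform-sampling assumption below are illustrative, not the paper's experimental materials:

```python
# Two hypothetical grammars consistent with the observed data; the broad
# one additionally licenses "s3", which never appears in the input.
narrow = {"s1", "s2"}
broad = {"s1", "s2", "s3", "s4"}

data = ["s1", "s2", "s1", "s1", "s2", "s1"]

def strong_likelihood(grammar, data):
    # Strong sampling: each sentence drawn uniformly from the grammar,
    # so P(data | grammar) = (1/|grammar|) ** len(data).
    return (1.0 / len(grammar)) ** len(data)

# With equal priors, posterior odds equal the likelihood ratio;
# here they favor the narrow grammar by (4/2)**6 = 64 to 1.
odds = strong_likelihood(narrow, data) / strong_likelihood(broad, data)

# Under weak sampling the likelihood is constant across consistent
# grammars, so the absence of "s3" carries no evidential weight.
```

This is how the absence of a construction becomes indirect negative evidence: the longer "s3" fails to appear, the more the strong-sampling learner prefers the grammar that forbids it.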

  13. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning

    Science.gov (United States)

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning. PMID:27310576

  14. Optimized distributed systems achieve significant performance improvement on sorted merging of massive VCF files.

    Science.gov (United States)

    Sun, Xiaobo; Gao, Jingjing; Jin, Peng; Eng, Celeste; Burchard, Esteban G; Beaty, Terri H; Ruczinski, Ingo; Mathias, Rasika A; Barnes, Kathleen; Wang, Fusheng; Qin, Zhaohui S

    2018-06-01

    Sorted merging of genomic data is a common data operation necessary in many sequencing-based studies. It involves sorting and merging genomic data from different subjects by their genomic locations. In particular, merging a large number of variant call format (VCF) files is frequently required in large-scale whole-genome sequencing or whole-exome sequencing projects. Traditional single-machine-based methods become increasingly inefficient when processing large numbers of files due to excessive computation time and input/output (I/O) bottlenecks. Distributed systems and more recent cloud-based systems offer an attractive solution. However, carefully designed and optimized workflow patterns and execution plans (schemas) are required to take full advantage of the increased computing power while overcoming bottlenecks to achieve high performance. In this study, we custom-design optimized schemas for three Apache big data platforms, Hadoop (MapReduce), HBase, and Spark, to perform sorted merging of a large number of VCF files. These schemas all adopt the divide-and-conquer strategy to split the merging job into sequential phases/stages consisting of subtasks that are conquered in an ordered, parallel, and bottleneck-free way. In two illustrative examples, we test the performance of our schemas on merging multiple VCF files into either a single TPED or a single VCF file, benchmarked against traditional single/parallel multiway-merge methods, a message passing interface (MPI)-based high-performance computing (HPC) implementation, and the popular VCFTools. Our experiments suggest all three schemas either deliver a significant improvement in efficiency or render much better strong and weak scalabilities over traditional methods. Our findings provide generalized scalable schemas for performing sorted merging on genetics and genomics data using these Apache distributed systems.
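The single-machine baseline that the distributed schemas are benchmarked against, a k-way sorted merge of already-sorted streams keyed by genomic location, can be sketched with Python's `heapq.merge`. The tuple records below are a hypothetical stand-in for parsed VCF lines, not the paper's actual schema.

```python
import heapq

# Hypothetical minimal records: (chromosome, position, sample_id, genotype).
# Each input list stands in for one VCF file already sorted by location.
file_a = [("chr1", 100, "S1", "0/1"), ("chr1", 250, "S1", "1/1")]
file_b = [("chr1", 100, "S2", "0/0"), ("chr2", 50, "S2", "0/1")]

def sorted_merge(*files):
    """Lazy k-way merge of per-file sorted streams by (chrom, pos)."""
    yield from heapq.merge(*files, key=lambda rec: (rec[0], rec[1]))

merged = list(sorted_merge(file_a, file_b))
```

`heapq.merge` never materializes the full inputs, so the same pattern scales to file handles that parse one VCF line at a time; the distributed schemas in the paper parallelize exactly this kind of ordered merge.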

  15. Development of Input/Output System for the Reactor Transient Analysis System (RETAS)

    International Nuclear Information System (INIS)

    Suh, Jae Seung; Kang, Doo Hyuk; Cho, Yeon Sik; Ahn, Seung Hoon; Cho, Yong Jin

    2009-01-01

    A Korea Institute of Nuclear Safety Reactor Transient Analysis System (KINS-RETAS) aims at providing a realistic prediction of core and RCS response to potential or actual event scenarios in Korean nuclear power plants (NPPs). The thermal hydraulic system code MARS is the pivot code of the RETAS, and is used to predict thermal hydraulic (TH) behaviors in the core and associated systems. MARS alone can be applied to many types of transients, but is sometimes coupled with other codes developed for different objectives. Many tools have been developed to aid users in preparing input and displaying the transient information and output data. Graphical User Interfaces (GUIs) that help prepare input decks include SNAP (Gitnick, 1998) and VISA (K.D. Kim, 2007), and display aids include eFAST (KINS, 2007). The tools listed above are graphical interfaces. The input deck builders allow the user to create a functional diagram of the plant, pictorially on the screen. The functional diagram, when annotated with control volume and junction numbers, is a nodalization diagram. Data required for an input deck are entered for volumes and junctions through a mouse-driven menu and pop-up dialogs; after the information is complete, an input deck is generated. Display GUIs show data from MARS calculations, either during or after the transient. The RETAS requires the user to first generate a set of 'input': two-dimensional pictures of the plant on which some of the data is displayed either numerically or with a color map. The RETAS can generate XY-plots of the data. Time histories of plant conditions can be seen via the plots or through the RETAS's replay mode. The user input was combined with design input from MARS developers and experts from both the GUI and ergonomics fields. A partial list of capabilities follows. - 3D display for neutronics. - Easier method (less user time and effort) to generate 'input' for the 3D displays. - Detailed view of data at volume or

  16. Development of Input/Output System for the Reactor Transient Analysis System (RETAS)

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Jae Seung; Kang, Doo Hyuk; Cho, Yeon Sik [ENESYS, Daejeon (Korea, Republic of); Ahn, Seung Hoon; Cho, Yong Jin [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2009-05-15

    A Korea Institute of Nuclear Safety Reactor Transient Analysis System (KINS-RETAS) aims at providing a realistic prediction of core and RCS response to potential or actual event scenarios in Korean nuclear power plants (NPPs). The thermal hydraulic system code MARS is the pivot code of the RETAS, and is used to predict thermal hydraulic (TH) behaviors in the core and associated systems. MARS alone can be applied to many types of transients, but is sometimes coupled with other codes developed for different objectives. Many tools have been developed to aid users in preparing input and displaying the transient information and output data. Graphical User Interfaces (GUIs) that help prepare input decks include SNAP (Gitnick, 1998) and VISA (K.D. Kim, 2007), and display aids include eFAST (KINS, 2007). The tools listed above are graphical interfaces. The input deck builders allow the user to create a functional diagram of the plant, pictorially on the screen. The functional diagram, when annotated with control volume and junction numbers, is a nodalization diagram. Data required for an input deck are entered for volumes and junctions through a mouse-driven menu and pop-up dialogs; after the information is complete, an input deck is generated. Display GUIs show data from MARS calculations, either during or after the transient. The RETAS requires the user to first generate a set of 'input': two-dimensional pictures of the plant on which some of the data is displayed either numerically or with a color map. The RETAS can generate XY-plots of the data. Time histories of plant conditions can be seen via the plots or through the RETAS's replay mode. The user input was combined with design input from MARS developers and experts from both the GUI and ergonomics fields. A partial list of capabilities follows. - 3D display for neutronics. - Easier method (less user time and effort) to generate 'input' for the 3D displays. - Detailed view

  17. Comparative evaluation of debris extruded apically by using, Protaper retreatment file, K3 file and H-file with solvent in endodontic retreatment

    Directory of Open Access Journals (Sweden)

    Chetna Arora

    2012-01-01

    Full Text Available Aim: The aim of this study was to evaluate the apical extrusion of debris comparing 2 engine-driven systems and a hand instrumentation technique during root canal retreatment. Materials and Methods: Forty-five human permanent mandibular premolars were prepared using the step-back technique, obturated with gutta-percha/zinc oxide eugenol sealer and the cold lateral condensation technique. The teeth were divided into three groups: Group A: Protaper retreatment file, Group B: K3 file, Group C: H-file with tetrachloroethylene. All the canals were irrigated with 20 ml distilled water during instrumentation. Debris extruded along with the irrigating solution during the retreatment procedure was carefully collected in preweighed Eppendorf tubes. The tubes were stored in an incubator for 5 days, placed in a desiccator and then re-weighed. The weight of dry debris was calculated by subtracting the weight of the tube before instrumentation from the weight of the tube after instrumentation. Data were analyzed using two-way ANOVA and post hoc tests. Results: There was a statistically significant difference in the apical extrusion of debris between hand instrumentation and the Protaper retreatment file and K3 file. The difference in the amount of extruded debris between the Protaper retreatment file and K3 file instrumentation techniques was not statistically significant. All three instrumentation techniques produced apically extruded debris and irrigant. Conclusion: The best way to minimize the extrusion of debris is by adopting a crown-down technique; therefore the use of a rotary technique (Protaper retreatment file, K3 file) is recommended.

  18. Computer Forensics Method in Analysis of Files Timestamps in Microsoft Windows Operating System and NTFS File System

    Directory of Open Access Journals (Sweden)

    Vesta Sergeevna Matveeva

    2013-02-01

    Full Text Available All existing file browsers display 3 timestamps for every file in the NTFS file system. Nowadays there are a lot of utilities that can manipulate these temporal attributes to conceal the traces of file use. However, every file in NTFS has 8 timestamps that are stored in the file record and can be used to detect the fact of attribute substitution. The authors suggest a method of revealing original timestamps after replacement, and an automated variant of it for the case of a set of files.
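A widely used heuristic behind such detection (not necessarily the authors' exact procedure) is that NTFS keeps four timestamps in the $STANDARD_INFORMATION attribute, the ones file browsers display, and four more in $FILE_NAME, which most timestamp-manipulation utilities leave untouched. A minimal sketch of the comparison, with hypothetical parsed values:

```python
from datetime import datetime

# Hypothetical timestamps parsed from one NTFS file record: the four
# $STANDARD_INFORMATION values (what file browsers show) and the four
# $FILE_NAME values (rarely altered by timestomping utilities).
si = {"created": datetime(2020, 1, 1), "modified": datetime(2020, 1, 1),
      "mft_modified": datetime(2023, 6, 1), "accessed": datetime(2020, 1, 1)}
fn = {"created": datetime(2023, 6, 1), "modified": datetime(2023, 6, 1),
      "mft_modified": datetime(2023, 6, 1), "accessed": datetime(2023, 6, 1)}

def looks_tampered(si_ts, fn_ts):
    """Flag the record if any $SI timestamp predates its $FN counterpart:
    a common (but not conclusive) indicator of timestamp substitution."""
    return any(si_ts[k] < fn_ts[k] for k in si_ts)
```

This is only an indicator: legitimate operations (e.g. file copies) can also produce mismatches, so a real forensic method weighs all eight values together.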

  19. 76 FR 43679 - Filing via the Internet; Notice of Additional File Formats for efiling

    Science.gov (United States)

    2011-07-21

    ... list of acceptable file formats the four-character file extensions for Microsoft Office 2007/2010... files from Office 2007 or Office 2010 in an Office 2003 format prior to submission. Dated: July 15, 2011...

  20. UPIN Group File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Group Unique Physician Identifier Number (UPIN) File is the business entity file that contains the group practice UPIN and descriptive information. It does NOT...

  1. 12 CFR 5.4 - Filing required.

    Science.gov (United States)

    2010-01-01

    ... CORPORATE ACTIVITIES Rules of General Applicability § 5.4 Filing required. (a) Filing. A depository institution shall file an application or notice with the OCC to engage in corporate activities and... advise an applicant through a pre-filing communication to send the filing or submission directly to the...

  2. ERX: a software for editing files containing X-ray spectra to be used in exposure computational models

    International Nuclear Information System (INIS)

    Cabral, Manuela O.M.; Vieira, Jose W.; Silva, Alysson G.; Leal Neto, Viriato; Oliveira, Alex C.H.; Lima, Fernando R.A.

    2011-01-01

    Exposure Computational Models (ECMs) are utilities that simulate situations in which irradiation occurs in a given environment. An ECM is composed primarily of an anthropomorphic model (phantom) and a Monte Carlo (MC) code. This paper presents a tutorial of the software Espectro de Raios-X (ERX). This software reads and performs numerical and graphical analysis of text files containing diagnostic X-ray spectra, for use in the radioactive-source algorithms of the ECMs of the Grupo de Dosimetria Numerica. The ERX allows the user to select one among several X-ray spectra in the energy range most commonly used in diagnostic radiology clinics. In the current version of the ERX there are two types of input files: those contained in the mspectra.dat file and those resulting from MC simulations in Geant4. The software allows the construction of charts of the Probability Density Function (PDF) and Cumulative Distribution Function (CDF) of a selected spectrum, as well as a table with the values of these functions and the spectrum. In addition, the ERX allows the user to make comparative analyses between the PDF graphics of the two catalogs of spectra available, and can perform dosimetric evaluations with the selected spectrum. A software of this kind is an important computational tool for researchers in numerical dosimetry because of the diversity of diagnostic X-ray machines, which implies a highly diverse mass of input data. Because of this, the ERX gives the group independence from the origin of the data contained in the catalogs created, making it unnecessary to resort to other sources. (author)
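Building the PDF and CDF of a tabulated spectrum is a simple normalization and running sum. The sketch below assumes a hypothetical two-column spectrum (energy bins and relative counts) rather than ERX's actual file layout:

```python
# Hypothetical tabulated spectrum: energy bin centers (keV) and relative
# counts, standing in for one entry of a spectrum catalog such as mspectra.dat.
energies = [20, 40, 60, 80, 100]
counts = [5.0, 20.0, 40.0, 25.0, 10.0]

total = sum(counts)
pdf = [c / total for c in counts]   # per-bin probability (sums to 1)

cdf = []                            # running sum of the PDF
running = 0.0
for p in pdf:
    running += p
    cdf.append(running)
```

The CDF is what a Monte Carlo source routine actually samples: draw a uniform random number in [0, 1) and pick the first bin whose CDF value exceeds it.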

  3. Huygens file service and storage architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  4. Huygens File Service and Storage Architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  5. File-based data flow in the CMS Filter Farm

    Science.gov (United States)

    Andre, J.-M.; Andronidis, A.; Bawej, T.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.

  6. File-Based Data Flow in the CMS Filter Farm

    Energy Technology Data Exchange (ETDEWEB)

    Andre, J.M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.
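The file-based bookkeeping idea can be sketched as follows: each process drops a small JSON "document" that downstream aggregators re-read independently, with no direct coupling between producer and consumer. The field names here are illustrative, not the actual CMS DAQ schema.

```python
import json
import os
import tempfile

# Hypothetical bookkeeping document for one luminosity section, mimicking
# the small JSON metadata files described above (illustrative field names).
doc = {
    "run": 123456,
    "lumisection": 42,
    "processed_events": 10000,
    "accepted_events": 350,
    "output_files": ["run123456_ls0042.dat"],
}

# The producer (an HLT service or watchdog process) writes the document...
path = os.path.join(tempfile.mkdtemp(), "run123456_ls0042.jsn")
with open(path, "w") as f:
    json.dump(doc, f)

# ...and an independent aggregator re-reads it later to merge accounting data.
with open(path) as f:
    loaded = json.load(f)
```

Because the document is an ordinary file, the aggregator needs no network protocol or shared state with the producer, which is the decoupling the redesign aims for.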

  7. Sensitivity Analysis of Input Parameters for a Dynamic Food Chain Model DYNACON

    International Nuclear Information System (INIS)

    Hwang, Won Tae; Lee, Geun Chang; Han, Moon Hee; Cho, Gyu Seong

    2000-01-01

    A sensitivity analysis of input parameters for the dynamic food chain model DYNACON was conducted as a function of deposition data for the long-lived radionuclides 137Cs and 90Sr. The influence of input parameters on the short- and long-term contamination of selected foodstuffs (cereals, leafy vegetables, milk) was also investigated. The input parameters were sampled using the LHS technique, and their sensitivity indices were represented as PRCCs (partial rank correlation coefficients). The sensitivity index was strongly dependent on the contamination period as well as on the deposition data. In the case of deposition during the growing stages of plants, the input parameters associated with contamination by foliar absorption were relatively important in long-term as well as short-term contamination. They were also important in short-term contamination in the case of deposition during the non-growing stages. In long-term contamination, the influence of input parameters associated with foliar absorption decreased, while the influence of input parameters associated with root uptake increased. These phenomena were more remarkable for deposition during non-growing stages than growing stages, and for 90Sr deposition than for 137Cs deposition. In the case of deposition during the growing stages of pasture, the input parameters associated with the characteristics of cattle, such as the feed-milk transfer factor and the daily intake rate of cattle, were relatively important for the contamination of milk
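The LHS/PRCC machinery used here is standard and can be sketched compactly: stratify each parameter's range into one interval per sample, permute the columns independently, run the model, then correlate the rank-transformed residuals. The three-parameter toy response below stands in for a DYNACON run and is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def lhs(n_samples, n_params):
    """Latin hypercube sample on [0, 1): one stratum per sample per parameter,
    with an independent permutation of strata in each column."""
    u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
    return np.column_stack([u[rng.permutation(n_samples), j] for j in range(n_params)])

def prcc(x, y):
    """Partial rank correlation coefficient of each column of x with y:
    correlate the residuals of rank(x_j) and rank(y) after regressing
    out the ranks of all other parameters."""
    rx = np.argsort(np.argsort(x, axis=0), axis=0).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    coeffs = []
    for j in range(x.shape[1]):
        a = np.column_stack([np.delete(rx, j, axis=1), np.ones(len(y))])
        res_j = rx[:, j] - a @ np.linalg.lstsq(a, rx[:, j], rcond=None)[0]
        res_y = ry - a @ np.linalg.lstsq(a, ry, rcond=None)[0]
        coeffs.append(float(np.corrcoef(res_j, res_y)[0, 1]))
    return coeffs

# Toy model response: strongly driven by parameter 0, weakly by parameter 1,
# independent of parameter 2 (a stand-in for a real DYNACON evaluation).
x = lhs(200, 3)
y = 10.0 * x[:, 0] + 1.0 * x[:, 1] + rng.normal(0.0, 0.1, 200)
indices = prcc(x, y)
```

On this toy response the PRCC for parameter 0 is close to 1, parameter 1 is clearly positive, and parameter 2 hovers near 0, which is exactly the kind of ranking the DYNACON study reports for its parameters.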

  8. Building a parallel file system simulator

    International Nuclear Information System (INIS)

    Molina-Estolano, E; Maltzahn, C; Brandt, S A; Bent, J

    2009-01-01

    Parallel file systems are gaining in popularity in high-end computing centers as well as commercial data centers. High-end computing systems are expected to scale exponentially and to pose new challenges to their storage scalability in terms of cost and power. To address these challenges, scientists and file system designers will need a thorough understanding of the design space of parallel file systems. Yet there exist few systematic studies of parallel file system behavior at petabyte and exabyte scale. An important reason is the significant cost of getting access to large-scale hardware to test parallel file systems. To contribute to this understanding we are building a parallel file system simulator that can simulate parallel file systems at very large scale. Our goal is to simulate petabyte-scale parallel file systems on a small cluster or even a single machine in reasonable time and with reasonable fidelity. With this simulator, file system experts will be able to tune existing file systems for specific workloads, scientists and file system deployment engineers will be able to better communicate workload requirements, file system designers and researchers will be able to try out design alternatives and innovations at scale, and instructors will be able to study very large-scale parallel file system behavior in the classroom. In this paper we describe our approach and provide preliminary results that are encouraging both in terms of fidelity and simulation scalability.

  9. 76 FR 52323 - Combined Notice of Filings; Filings Instituting Proceedings

    Science.gov (United States)

    2011-08-22

    .... Applicants: Young Gas Storage Company, Ltd. Description: Young Gas Storage Company, Ltd. submits tariff..., but intervention is necessary to become a party to the proceeding. The filings are accessible in the.... More detailed information relating to filing requirements, interventions, protests, and service can be...

  10. Reconstruction of an input function from a dynamic PET water image using multiple tissue curves

    Science.gov (United States)

    Kudomi, Nobuyuki; Maeda, Yukito; Yamamoto, Yuka; Nishiyama, Yoshihiro

    2016-08-01

    Quantification of cerebral blood flow (CBF) is important for the understanding of normal and pathologic brain physiology. When CBF is assessed using PET with H2(15)O or C(15)O2, its calculation requires an arterial input function, which is generally obtained by invasive arterial blood sampling. The aim of the present study was to develop a new technique to reconstruct an image-derived input function (IDIF) from a dynamic H2(15)O PET image as a completely non-invasive approach. Our technique consists of a formula that expresses the input function in terms of a tissue curve with a rate-constant parameter. For multiple tissue curves extracted from the dynamic image, the rate constants were estimated so as to minimize the sum of the differences between the reproduced inputs expressed by the extracted tissue curves. The estimated rates were used to express the inputs, and the mean of the estimated inputs was used as the IDIF. The method was tested in human subjects (n = 29) and compared to the blood sampling method. Simulation studies were performed to examine the magnitude of potential biases in CBF and to optimize the number of tissue curves used for the input reconstruction. In the PET study, the estimated IDIFs reproduced the measured input functions well, and the difference between the CBF values calculated using the two methods was small. This suggests the possibility of using a completely non-invasive technique to assess CBF in patho-physiological studies.

  11. Visual Input Enhancement and Grammar Learning: A Meta-Analytic Review

    Science.gov (United States)

    Lee, Sang-Ki; Huang, Hung-Tzu

    2008-01-01

    Effects of pedagogical interventions with visual input enhancement on grammar learning have been investigated by a number of researchers during the past decade and a half. The present review delineates this research domain via a systematic synthesis of 16 primary studies (comprising 20 unique study samples) retrieved through an exhaustive…

  12. ENDF-6 File 30: Data covariances obtained from parameter covariances and sensitivities

    International Nuclear Information System (INIS)

    Muir, D.W.

    1989-01-01

    File 30 is provided as a means of describing the covariances of tabulated cross sections, multiplicities, and energy-angle distributions that result from propagating the covariances of a set of underlying parameters (for example, the input parameters of a nuclear-model code), using an evaluator-supplied set of parameter covariances and sensitivities. Whenever nuclear data are evaluated primarily through the application of nuclear models, the covariances of the resulting data can be described very adequately, and compactly, by specifying the covariance matrix for the underlying nuclear parameters, along with a set of sensitivity coefficients giving the rate of change of each nuclear datum of interest with respect to each of the model parameters. Although motivated primarily by these applications of nuclear theory, use of File 30 is not restricted to any one particular evaluation methodology. It can be used to describe data covariances of any origin, so long as they can be formally separated into a set of parameters with specified covariances and a set of data sensitivities
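The propagation File 30 encodes is the standard sandwich rule, Cov(data) = S · Cov(params) · Sᵀ, where S holds the evaluator-supplied sensitivities ∂(datum i)/∂(parameter j). A processing code can apply it directly; all numbers below are illustrative.

```python
import numpy as np

# Sensitivity matrix: 3 tabulated cross-section points, 2 model parameters.
# S[i, j] = d(datum i) / d(parameter j), as supplied by the evaluator.
S = np.array([[1.0, 0.5],
              [0.2, 1.5],
              [0.0, 2.0]])

# Covariance matrix of the underlying nuclear-model parameters.
cov_params = np.array([[0.04, 0.01],
                       [0.01, 0.09]])

# Sandwich rule: propagate parameter covariances to the data.
cov_data = S @ cov_params @ S.T
```

The result is automatically symmetric and positive semi-definite, and its off-diagonal terms capture the cross-energy correlations that make model-based covariances so compact to store.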

  13. The Complex Relationship between Bilingual Home Language Input and Kindergarten Children's Spanish and English Oral Proficiencies

    Science.gov (United States)

    Cha, Kijoo; Goldenberg, Claude

    2015-01-01

    This study examined how emergent bilingual children's English and Spanish proficiencies moderated the relationships between Spanish and English input at home (bilingual home language input [BHLI]) and children's oral language skills in each language. The sample comprised over 1,400 Spanish-dominant kindergartners in California and Texas. BHLI was…

  14. Generation of artificial FASTQ files to evaluate the performance of next-generation sequencing pipelines.

    Directory of Open Access Journals (Sweden)

    Matthew Frampton

    Full Text Available Pipelines for the analysis of Next-Generation Sequencing (NGS) data are generally composed of a set of different publicly available software tools, configured together in order to map short reads to a genome and call variants. The fidelity of pipelines is variable. We have developed ArtificialFastqGenerator, which takes a reference genome sequence as input and outputs artificial paired-end FASTQ files containing Phred quality scores. Since these artificial FASTQs are derived from the reference genome, they provide a gold standard for read alignment and variant calling, thereby enabling the performance of any NGS pipeline to be evaluated. The user can customise DNA template/read length, the modelling of coverage based on GC content, whether to use real Phred base quality scores taken from existing FASTQ files, and whether to simulate sequencing errors. Detailed coverage and error summary statistics are outputted. Here we describe ArtificialFastqGenerator and illustrate its implementation in evaluating a typical bespoke NGS analysis pipeline under different experimental conditions. ArtificialFastqGenerator was released in January 2012. Source code, example files and binaries are freely available under the terms of the GNU General Public License v3.0 from https://sourceforge.net/projects/artfastqgen/.
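The core idea, sampling reads from a reference and attaching Phred quality scores, can be sketched as below. This is a deliberately simplified illustration (single-end reads, uniform Q30 qualities, no GC-coverage model, no error simulation), not ArtificialFastqGenerator's actual algorithm.

```python
import random

random.seed(0)

# Hypothetical reference sequence; a real run would read this from FASTA.
reference = "".join(random.choice("ACGT") for _ in range(1000))

READ_LEN = 100
N_READS = 5
PHRED33_Q30 = chr(30 + 33)  # Phred quality 30 in Sanger/Phred+33 encoding: '?'

def fastq_records(ref, n_reads, read_len):
    """Yield FASTQ records sampled uniformly from the reference.
    Reading the true start position back from the header is what makes
    such reads a gold standard for evaluating an aligner."""
    for i in range(n_reads):
        start = random.randrange(len(ref) - read_len + 1)
        seq = ref[start:start + read_len]
        yield f"@artificial_read_{i} pos={start}\n{seq}\n+\n{PHRED33_Q30 * read_len}\n"

fastq = "".join(fastq_records(reference, N_READS, READ_LEN))
```

Because every read's true origin is recorded in its header, any downstream alignment or variant call can be checked against ground truth, which is precisely the evaluation the paper describes.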

  15. Learning Structure of Sensory Inputs with Synaptic Plasticity Leads to Interference

    Directory of Open Access Journals (Sweden)

    Joseph eChrol-Cannon

    2015-08-01

    Full Text Available Synaptic plasticity is often explored as a form of unsupervised adaptation in cortical microcircuits to learn the structure of complex sensory inputs and thereby improve performance of classification and prediction. The question of whether the specific structure of the input patterns is encoded in the structure of neural networks has been largely neglected. Existing studies that have analyzed input-specific structural adaptation have used simplified, synthetic inputs, in contrast to the complex and noisy patterns found in real-world sensory data. In this work, input-specific structural changes are analyzed for three empirically derived models of plasticity applied to three temporal sensory classification tasks that include complex, real-world visual and auditory data. Two forms of spike-timing dependent plasticity (STDP) and the Bienenstock-Cooper-Munro (BCM) plasticity rule are used to adapt the recurrent network structure during the training process before performance is tested on the pattern recognition tasks. It is shown that synaptic adaptation is highly sensitive to specific classes of input pattern. However, plasticity does not improve the performance on sensory pattern recognition tasks, partly due to synaptic interference between consecutively presented input samples. The changes in synaptic strength produced by one stimulus are reversed by the presentation of another, thus largely preventing input-specific synaptic changes from being retained in the structure of the network. To solve the problem of interference, we suggest that models of plasticity be extended to restrict neural activity and synaptic modification to a subset of the neural circuit, which is increasingly found to be the case in experimental neuroscience.

  16. Evaluated neutronic file for indium

    International Nuclear Information System (INIS)

    Smith, A.B.; Chiba, S.; Smith, D.L.; Meadows, J.W.; Guenther, P.T.; Lawson, R.D.; Howerton, R.J.

    1990-01-01

    A comprehensive evaluated neutronic data file for elemental indium is documented. This file, extending from 10^-5 eV to 20 MeV, is presented in the ENDF/B-VI format, and contains all neutron-induced processes necessary for the vast majority of neutronic applications. In addition, an evaluation of the 115In(n,n')116mIn dosimetry reaction is presented as a separate file. Attention is given to quantitative values, with corresponding uncertainty information. These files have been submitted for consideration as a part of the ENDF/B-VI national evaluated-file system. 144 refs., 10 figs., 4 tabs

  17. Biocorrosion of Endodontic Files through the Action of Two Species of Sulfate-reducing Bacteria: Desulfovibrio desulfuricans and Desulfovibrio fairfieldensis.

    Science.gov (United States)

    Heggendorn, Fabiano Luiz; Gonçalves, Lucio Souza; Dias, Eliane Pedra; de Oliveira Freitas Lione, Viviane; Lutterbach, Márcia Teresa Soares

    2015-08-01

    This study assessed the biocorrosive capacity of two bacteria, Desulfovibrio desulfuricans and Desulfovibrio fairfieldensis, on endodontic files, as a preliminary step in the development of a biopharmaceutical to facilitate the removal of endodontic file fragments from root canals. In the first stage, the corrosive potential of the artificial saliva medium (ASM), modified Postgate E medium (MPEM), 2.5% sodium hypochlorite (NaOCl) solution and white medium (WM), without the inoculation of bacteria, was assessed by immersion assays. In the second stage, test samples were inoculated with the two species of sulfate-reducing bacteria (SRB) in ASM and modified artificial saliva medium (MASM). In the third stage, test samples were inoculated with the same species in MPEM, ASM and MASM. All test samples were viewed under an Alicona Infinite Focus microscope. No test sample became corroded when immersed only in media, without bacteria. With the exception of one test sample among those inoculated with bacteria in ASM and MASM, there was no evidence of corrosion. Fifty percent of the test samples demonstrated a greater intensity of biocorrosion when compared with the initial assays. Desulfovibrio desulfuricans and D. fairfieldensis are capable of promoting biocorrosion of the steel constituent of endodontic files. This study describes the initial development of a biopharmaceutical to facilitate the removal of endodontic file fragments from root canals, which could be successfully applied in endodontic therapy in order to avoid parendodontic surgery or even tooth loss in such events.

  18. SSYST-3. Input description

    International Nuclear Information System (INIS)

    Meyder, R.

    1983-12-01

    The code system SSYST-3 is designed to analyse the thermal and mechanical behaviour of a fuel rod during a LOCA. The report contains a complete input-list for all modules and several tested inputs for a LOCA analysis. (orig.)

  19. FHEO Filed Cases

    Data.gov (United States)

    Department of Housing and Urban Development — The dataset is a list of all the Title VIII fair housing cases filed by FHEO from 1/1/2007 - 12/31/2012 including the case number, case name, filing date, state and...

  20. Chemomechanical preparation by hand instrumentation and by Mtwo engine-driven rotary files, an ex vivo study.

    Science.gov (United States)

    Krajczár, Károly; Tigyi, Zoltán; Papp, Viktória; Marada, Gyula; Sára, Jeges; Tóth, Vilmos

    2012-07-01

    To compare the disinfecting efficacy of sodium hypochlorite irrigation during root canal preparation with stainless steel hand files of taper 0.02 and nickel-titanium Mtwo files of taper 0.04-0.06. Forty extracted human teeth were sterilized and then inoculated with Enterococcus faecalis (ATCC 29212). After a 6-day incubation time the root canals were prepared by hand with K-files (n=20) and by engine-driven Mtwo files (VDW, Munich, Germany) (n=20). Irrigation was carried out with 2.5% NaOCl in both cases. Samples were taken from the root canals before and after preparation with instruments #25 and #35, and colony forming units (CFU) were determined. A significant reduction in bacterial count was determined after filing in both groups. The number of bacteria kept decreasing with the extension of the apical preparation diameter. There was no significant difference in the bacterial counts between the preparation sizes after hand or engine-driven instrumentation at the same apical size. Statistical analysis was carried out with the Mann-Whitney test, paired t-test and independent-sample t-test. A significant reduction in CFU was achieved after root canal preparation completed with 2.5% NaOCl irrigation, both with stainless steel hand and nickel-titanium rotary files. The root canal remained slightly infected after chemomechanical preparation in both groups. Key words: Chemomechanical preparation, root canal disinfection, nickel-titanium, conicity, greater taper, apical size.

  1. Material input of nuclear fuel

    International Nuclear Information System (INIS)

    Rissanen, S.; Tarjanne, R.

    2001-01-01

    The Material Input (MI) of nuclear fuel, expressed as the total amount of natural material needed to manufacture a product, is examined. The suitability of the MI method for assessing the environmental impacts of fuels is also discussed. Material input is expressed as a Material Input Coefficient (MIC), equal to the total mass of natural material divided by the mass of the completed product. The material input coefficient is, however, only an intermediate result, which should not be used as such for comparing different fuels, because the energy content of nuclear fuel is about 100 000-fold that of fossil fuels. As a final result, the material input is expressed in proportion to the amount of generated electricity, which is called MIPS (Material Input Per Service unit). Material input is a simplified and commensurable indicator of the use of natural material, but because it takes into account neither the harmfulness of the materials nor the way the residual material is processed, it does not by itself express the amount of environmental impact. Examining the mere amount does not differentiate between, for example, coal, natural gas, or waste rock usually consisting of just sand; natural gas is, however, substantially more harmful to the ecosystem than sand. Therefore, other methods should also be used to assess the environmental load of a product. The material input coefficient of nuclear fuel is calculated using data from different types of mines, among others an open pit mine (Key Lake, Canada), an underground mine (McArthur River, Canada) and a by-product mine (Olympic Dam, Australia). Furthermore, the coefficient is calculated for the nuclear fuel corresponding to the nuclear fuel supply of the Teollisuuden Voima (TVO) company in 2001. Because there is some uncertainty in the initial data, the inaccuracy of the final results can be as much as 20-50 per cent.
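
    The two ratios described above (MIC per kilogram of product, MIPS per unit of service) are simple arithmetic; the sketch below uses round hypothetical figures, not the study's data, purely to show the method.

    ```python
    # Illustrative MIC/MIPS arithmetic. The numbers are hypothetical
    # placeholders, NOT values from the study.

    def material_input_coefficient(natural_material_kg, product_kg):
        """MIC = total natural material moved / mass of finished product."""
        return natural_material_kg / product_kg

    def mips(natural_material_kg, electricity_kwh):
        """MIPS = material input per unit of service (here: per kWh)."""
        return natural_material_kg / electricity_kwh

    # Hypothetical: 1 kg of finished fuel requires 5000 kg of natural
    # material and yields 360 000 kWh of electricity.
    mic = material_input_coefficient(5000.0, 1.0)     # kg of material per kg of fuel
    g_per_kwh = 1000.0 * mips(5000.0, 360_000.0)      # grams of material per kWh
    ```

    The point of the final division is exactly the one the abstract makes: the per-kilogram coefficient only becomes comparable across fuels once it is normalized by the energy service delivered.
    
    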

  2. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

    Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) for solving structural reliability problems. It is known to be an effective sampling method for approximating the distribution of a random variable because it uses a deterministic selection of sample values together with their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency
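
    The "deterministic selection plus random permutation" idea can be sketched with Python's standard library. The stratum midpoints (i - 0.5)/n, the normal distribution, and the toy failure threshold 1.96 are illustrative choices, not the paper's setup.

    ```python
    import random
    from statistics import NormalDist

    def descriptive_sample(n, dist=NormalDist(), rng=random):
        """Descriptive sampling: deterministic stratum midpoints mapped through
        the inverse CDF, followed by a random permutation of the values."""
        values = [dist.inv_cdf((i - 0.5) / n) for i in range(1, n + 1)]
        rng.shuffle(values)   # the permutation matters when several random
        return values         # inputs are combined in one simulation

    # Toy reliability problem: "failure" when X > 1.96 for X ~ N(0, 1),
    # so the true failure probability is about 0.025.
    n = 1000
    sample = descriptive_sample(n)
    pf = sum(x > 1.96 for x in sample) / n   # deterministic given n
    ```

    Because the sample values themselves are deterministic, the failure-probability estimate depends only on n, which is why choosing n well, the subject of this paper, matters more for DS than for CMCS.
    
    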

  3. Chemical sensors are hybrid-input memristors

    Science.gov (United States)

    Sysoev, V. I.; Arkhipov, V. E.; Okotrub, A. V.; Pershin, Y. V.

    2018-04-01

    Memristors are two-terminal electronic devices whose resistance depends on the history of the input signal (voltage or current). Here we demonstrate that chemical gas sensors can be considered as memristors with a generalized (hybrid) input, namely, an input consisting of the voltage, analyte concentration and applied temperature. The concept of hybrid-input memristors is demonstrated experimentally using a single-walled carbon nanotube chemical sensor. It is shown that, with respect to the hybrid input, the sensor exhibits some features common with memristors, such as hysteretic input-output characteristics. This different perspective on chemical gas sensors may open new possibilities for smart sensor applications.

  4. Sampling the fermi distribution for β-decay energy input to EGS4

    International Nuclear Information System (INIS)

    Nelson, W.R.; Liu, J.

    1992-06-01

    A general method is presented for sampling the kinetic energy of electrons emitted during the β-decay process. Two computer codes (BETACDF and BETASAM) have been created to demonstrate and check the sampling scheme. The main purpose of this exercise is to provide a convenient way to incorporate β-spectrum sampling as a front end to the EGS4 code. This should aid in the solution of a number of problems of current interest, ranging from the design of detectors for experiments in search of 17-keV neutrinos to establishing a better understanding of the role of β rays in the dosimetry around SLAC beam devices
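
    A minimal version of such β-spectrum sampling can be sketched with rejection sampling. The shape below is only the allowed-transition statistical factor p·E·(Q-T)²; the Fermi (Coulomb) correction that a code like BETACDF/BETASAM would include is deliberately omitted, and the 14C-like endpoint is an illustrative choice.

    ```python
    import math
    import random

    ME = 511.0   # electron rest mass, keV

    def beta_shape(T, Q):
        """Allowed beta-spectrum statistical factor p*E*(Q-T)^2 (c = 1 units).
        The Fermi Coulomb correction is omitted to keep the sketch short."""
        E = T + ME                                   # total electron energy
        p = math.sqrt(max(E * E - ME * ME, 0.0))     # electron momentum
        return p * E * (Q - T) ** 2

    def make_beta_sampler(Q, rng=random):
        """Rejection sampler for the electron kinetic energy T in (0, Q)."""
        # Approximate constant envelope from a coarse grid scan of the shape.
        peak = 1.05 * max(beta_shape(Q * k / 200.0, Q) for k in range(1, 200))
        def sample():
            while True:
                T = rng.uniform(0.0, Q)
                if rng.uniform(0.0, peak) < beta_shape(T, Q):
                    return T
        return sample

    random.seed(1)
    sample = make_beta_sampler(Q=156.0)   # 14C-like endpoint energy, keV
    energies = [sample() for _ in range(2000)]
    ```

    A front end like this hands each sampled kinetic energy to the transport code as the source-electron energy for one history.
    
    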

  5. Comparison of Geometric Design of a Brand of Stainless Steel K-Files: An In Vitro Study.

    Science.gov (United States)

    Saeedullah, Maryam; Husain, Syed Wilayat

    2018-04-01

    The purpose of this experimental study was to determine the diametric variations of a brand of handheld stainless-steel K-files, acquired from different countries, in accordance with the available standards. 20 Mani stainless-steel K-files of identical size (ISO #25) were acquired from Pakistan and designated as Group A, while 20 Mani K-files were purchased from London, UK and designated as Group B. Files were assessed using a Nikon B 24V profile projector. Data were statistically compared with ISO 3630:1 and ADA 101 by a one-sample t-test. A significant difference was found between Groups A and B. The average discrepancy of Group A fell within the tolerance limit, while that of Group B exceeded the limit. The findings in this study call attention to the need for adherence to the dimensional standards for stainless-steel endodontic files.
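
    The one-sample t statistic used for such tolerance checks is easy to reproduce with the standard library. The measurements below are hypothetical, and the ±0.02 mm diameter tolerance is stated here as an assumption about the standard, not a quotation from it.

    ```python
    import math
    from statistics import mean, stdev

    def one_sample_t(data, mu0):
        """t statistic for H0: the population mean equals mu0."""
        n = len(data)
        return (mean(data) - mu0) / (stdev(data) / math.sqrt(n))

    # Hypothetical tip (D0) diameters in mm for an ISO size 25 file;
    # nominal diameter 0.25 mm, assumed tolerance +/- 0.02 mm.
    measured = [0.251, 0.249, 0.252, 0.250, 0.248, 0.253, 0.251, 0.250]
    t = one_sample_t(measured, 0.25)
    within_tolerance = all(abs(d - 0.25) <= 0.02 for d in measured)
    ```

    The computed t would then be compared against the critical value for n - 1 degrees of freedom to decide whether the batch's mean diameter deviates significantly from nominal.
    
    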

  6. Noninvasive quantification of cerebral metabolic rate for glucose in rats using 18F-FDG PET and standard input function

    Science.gov (United States)

    Hori, Yuki; Ihara, Naoki; Teramoto, Noboru; Kunimi, Masako; Honda, Manabu; Kato, Koichi; Hanakawa, Takashi

    2015-01-01

    Measurement of the arterial input function (AIF) for quantitative positron emission tomography (PET) studies is technically challenging. The present study aimed to develop a method based on a standard arterial input function (SIF) to estimate the input function without blood sampling. We performed 18F-fluorodeoxyglucose studies accompanied by continuous blood sampling for measurement of the AIF in 11 rats. The standard arterial input function was calculated by averaging AIFs from eight anesthetized rats, after normalization with body mass (BM) and injected dose (ID). Then, the individual input function was estimated using two types of SIF: (1) SIF calibrated by the individual's BM and ID (estimated individual input function, EIFNS) and (2) SIF calibrated by a single blood sample as proposed previously (EIF1S). No significant differences in area under the curve (AUC) or cerebral metabolic rate for glucose (CMRGlc) were found across the AIF-, EIFNS-, and EIF1S-based methods using repeated-measures analysis of variance. In the correlation analysis, AUC and CMRGlc derived from EIFNS were highly correlated with those derived from AIF and EIF1S. A preliminary comparison between AIF and EIFNS in three awake rats supported the idea that the method might be applicable to behaving animals. The present study suggests that the EIFNS method might serve as a noninvasive substitute for individual AIF measurement. PMID:25966947
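
    The normalization at the heart of the SIF approach can be sketched as follows. The scaling by injected dose and body mass mirrors the abstract's description; the function names and the toy three-point curve are illustrative, not the authors' implementation.

    ```python
    def build_sif(aifs, body_masses, injected_doses):
        """Population SIF: average the individual AIFs after normalizing each
        by its injected dose (ID) and body mass (BM), time point by point."""
        norm = [[c * bm / dose for c in aif]
                for aif, bm, dose in zip(aifs, body_masses, injected_doses)]
        n = len(norm)
        return [sum(curve[t] for curve in norm) / n
                for t in range(len(norm[0]))]

    def estimate_input_function(sif, body_mass, injected_dose):
        """EIF_NS: rescale the population SIF with a new subject's BM and ID."""
        return [c * injected_dose / body_mass for c in sif]

    # Round-trip sanity check: building the SIF from a single animal and
    # rescaling with that animal's own BM and ID recovers its AIF.
    aif = [10.0, 5.0, 2.0]    # toy blood curve, kBq/mL at three time points
    sif = build_sif([aif], body_masses=[0.3], injected_doses=[20.0])
    eif = estimate_input_function(sif, body_mass=0.3, injected_dose=20.0)
    ```

    In the paper's EIF1S variant, the rescaling factor would instead come from one measured blood sample rather than from BM and ID alone.
    
    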

  7. 76 FR 62092 - Filing Procedures

    Science.gov (United States)

    2011-10-06

    ... INTERNATIONAL TRADE COMMISSION Filing Procedures AGENCY: International Trade Commission. ACTION: Notice of issuance of Handbook on Filing Procedures. SUMMARY: The United States International Trade Commission (``Commission'') is issuing a Handbook on Filing Procedures to replace its Handbook on Electronic...

  8. Cyclic fatigue resistance of RaCe and Mtwo rotary files in continuous rotation and reciprocating motion.

    Science.gov (United States)

    Vadhana, Sekar; SaravanaKarthikeyan, Balasubramanian; Nandini, Suresh; Velmurugan, Natanasabapathy

    2014-07-01

    The purpose of this study was to evaluate and compare the cyclic fatigue resistance of RaCe (FKG Dentaire, La Chaux-de-Fonds, Switzerland) and Mtwo (VDW, Munich, Germany) rotary files in continuous rotation and reciprocating motion. A total of 60 new rotary Mtwo and RaCe files (ISO size = 25, taper = 0.06, length = 25 mm) were selected and randomly divided into 4 groups (n = 15 each): Mtc (Mtwo NiTi files in continuous rotation), Rc (RaCe NiTi files in continuous rotation), Mtr (Mtwo NiTi files in reciprocating motion), and Rr (RaCe NiTi files in reciprocating motion). A cyclic fatigue testing device was fabricated with a 60° angle of curvature and a 5-mm radius. All instruments were rotated or reciprocated until fracture occurred. The time taken for each instrument to fracture and the length of the broken fragments were recorded. All the fractured files were analyzed under a scanning electron microscope to detect the mode of fracture. The Kolmogorov-Smirnov test was used to assess the normality of the sample distribution, and statistical analysis was performed using the independent-sample t test. The time taken for the instruments of the Mtr and Rr groups to fail under cyclic loading was significantly longer than that of the Mtc and Rc groups (P < .05), and the fractured surfaces showed a ductile mode of fracture. The length of the fractured segments was between 5 and 6 mm, which was not statistically significantly different among the experimental groups. Mtwo and RaCe rotary instruments showed significantly higher cyclic fatigue resistance in reciprocating motion compared with continuous rotation motion. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  9. Segmented correlation measurements on superconducting bandpass delta-sigma modulator with and without input tone

    International Nuclear Information System (INIS)

    Bulzacchelli, John F; Lee, Hae-Seung; Hong, Merit Y; Misewich, James A; Ketchen, Mark B

    2003-01-01

    Segmented correlation is a useful technique for testing a superconducting analogue-to-digital converter, as it allows the output spectrum to be estimated with fine frequency resolution even when data record lengths are limited by small on-chip acquisition memories. Previously, we presented segmented correlation measurements on a superconducting bandpass delta-sigma modulator sampling at 40.2 GHz under idle-channel (no input) conditions. This paper compares the modulator output spectra measured by segmented correlation with and without an input tone. Important practical considerations in calculating segmented correlations are discussed in detail. Resolution enhancement by segmented correlation does reduce the spectral width of the input tone in the desired manner, but the signal power due to the input increases the variance of the spectral estimate near the input frequency, hindering accurate calculation of the in-band noise. This increased variance, which is predicted by theory, must be considered carefully in the application of segmented correlation. Methods for obtaining more accurate estimates of the quantization noise spectrum, closer to those measured with no input, are described
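
    The variance-reduction idea behind such segment-based spectral estimation can be illustrated with the closely related Bartlett procedure (average the periodograms of short segments). This is a generic signal-processing sketch of the principle, not the authors' on-chip segmented-correlation implementation, and a naive DFT is used for self-containment.

    ```python
    import cmath
    import math

    def averaged_spectrum(x, seg_len):
        """Bartlett-style estimate: split the record into segments, take the
        periodogram of each segment, and average them. Averaging reduces the
        variance of the estimate while the segment length sets the frequency
        resolution."""
        nseg = len(x) // seg_len
        spec = [0.0] * seg_len
        for s in range(nseg):
            seg = x[s * seg_len:(s + 1) * seg_len]
            for k in range(seg_len):
                # Naive DFT bin k of this segment (O(L^2), fine for a sketch).
                X = sum(seg[n] * cmath.exp(-2j * math.pi * k * n / seg_len)
                        for n in range(seg_len))
                spec[k] += abs(X) ** 2 / seg_len
        return [v / nseg for v in spec]

    # A pure tone at bin 8 of a 64-point segment, observed over 8 segments.
    L, K = 64, 8
    x = [math.cos(2 * math.pi * 8 * n / L) for n in range(L * K)]
    spec = averaged_spectrum(x, L)
    peak_bin = max(range(L // 2), key=lambda k: spec[k])
    ```

    The paper's point about the input tone carries over directly: near the tone's bin, the averaged estimate's variance is dominated by the signal rather than the noise, so the in-band noise there is hard to read off.
    
    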

  10. 12 CFR 1780.9 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Filing of papers. 1780.9 Section 1780.9 Banks... papers. (a) Filing. Any papers required to be filed shall be addressed to the presiding officer and filed... Director or the presiding officer. All papers filed by electronic media shall also concurrently be filed in...

  11. Intraglomerular inhibition maintains mitral cell response contrast across input frequencies.

    Science.gov (United States)

    Shao, Zuoyi; Puche, Adam C; Shipley, Michael T

    2013-11-01

    Odor signals are transmitted to the olfactory bulb by olfactory nerve (ON) synapses onto mitral/tufted cells (MTCs) and external tufted cells (ETCs); ETCs provide additional feed-forward excitation to MTCs. Both are strongly regulated by intraglomerular inhibition that can last up to 1 s and, when blocked, dramatically increases ON-evoked MC spiking. Intraglomerular inhibition thus limits the magnitude and duration of MC spike responses to sensory input. In vivo, sensory input is repetitive, dictated by sniffing rates from 1 to 8 Hz, potentially summating intraglomerular inhibition. To investigate this, we recorded MTC responses to 1- to 8-Hz ON stimulation in slices. Inhibitory postsynaptic current area (charge) following each ON stimulation was unchanged from 1 to 5 Hz and modestly paired-pulse attenuated at 8 Hz, suggesting that there is no summation and only limited decrement at the highest input frequencies. Next, we investigated the frequency independence of intraglomerular inhibition on MC spiking. MCs respond to single ON shocks with an initial spike burst followed by reduced spiking decaying to baseline. Upon repetitive ON stimulation, peak spiking is identical across input frequencies, but the ratio of peak to the minimum rate before the stimulus (max-min) diminishes from 30:1 at 1 Hz to 15:1 at 8 Hz. When intraglomerular inhibition is selectively blocked, the peak spike rate is unchanged but trough spiking increases markedly, decreasing max-min firing ratios from 30:1 at 1 Hz to 2:1 at 8 Hz. Together, these results suggest that intraglomerular inhibition is relatively frequency independent and can "sharpen" MC responses to input across the range of frequencies. This suggests that glomerular circuits can maintain "contrast" in MC encoding during sniff-sampled inputs.

  12. The Jade File System. Ph.D. Thesis

    Science.gov (United States)

    Rao, Herman Chung-Hwa

    1991-01-01

    File systems have long been the most important and most widely used form of shared permanent storage. File systems in traditional time-sharing systems, such as Unix, support a coherent sharing model for multiple users. Distributed file systems implement this sharing model in local area networks. However, most distributed file systems fail to scale from local area networks to an internet. Four characteristics of scalability were recognized: size, wide area, autonomy, and heterogeneity. Owing to size and wide area, techniques such as broadcasting, central control, and central resources, which are widely adopted by local area network file systems, are not adequate for an internet file system. An internet file system must also support the notion of autonomy, because an internet is made up of a collection of independent organizations. Finally, heterogeneity is in the nature of an internet file system, not only because of its size, but also because of the autonomy of the organizations in an internet. The Jade File System, which provides a uniform way to name and access files in the internet environment, is presented. Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Because of autonomy, Jade is designed under the restriction that the underlying file systems may not be modified. In order to avoid the complexity of maintaining an internet-wide, global name space, Jade permits each user to define a private name space. In Jade's design, we pay careful attention to avoiding unnecessary network messages between clients and file servers in order to achieve acceptable performance. Jade's name space supports two novel features: (1) it allows multiple file systems to be mounted under one directory; and (2) it permits one logical name space to mount other logical name spaces. A prototype of Jade was implemented to examine and validate its
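
    The two name-space features called out above (mounting several file systems in one name space, and mounting one logical name space inside another) can be sketched with a toy mount table. Everything here, including the class name and the dict-backed "file systems", is illustrative and not Jade's actual interface.

    ```python
    class NameSpace:
        """Toy per-user logical name space in the spirit of Jade: a mount
        table maps a logical path prefix either to an underlying file system
        (here, a plain dict) or to another logical name space."""

        def __init__(self):
            self.mounts = {}   # prefix -> dict (a "file system") or NameSpace

        def mount(self, prefix, target):
            self.mounts[prefix] = target

        def resolve(self, path):
            # Longest-prefix match, then recurse into nested name spaces.
            for prefix in sorted(self.mounts, key=len, reverse=True):
                if path.startswith(prefix):
                    target, rest = self.mounts[prefix], path[len(prefix):]
                    if isinstance(target, NameSpace):
                        return target.resolve(rest)
                    return target.get(rest)
            return None

    nfs = {"/paper.txt": "draft"}       # stand-in for one file access protocol
    ftp = {"/data.csv": "1,2,3"}        # stand-in for another
    private = NameSpace()
    private.mount("/ftp", ftp)
    home = NameSpace()
    home.mount("/nfs", nfs)             # several file systems in one name space
    home.mount("/remote", private)      # one logical name space mounting another
    ```

    Resolution walks the mount table without touching the underlying "file systems" until the final step, which is the property that lets a system like Jade integrate servers it is not allowed to modify.
    
    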

  13. Investigation of RADTRAN Stop Model input parameters for truck stops

    International Nuclear Information System (INIS)

    Griego, N.R.; Smith, J.D.; Neuhauser, K.S.

    1996-01-01

    RADTRAN is a computer code for estimating the risks and consequences of transporting radioactive materials (RAM). RADTRAN was developed and is maintained by Sandia National Laboratories for the US Department of Energy (DOE). For incident-free transportation, the dose to persons exposed while the shipment is stopped is frequently a major percentage of the overall dose. This dose is referred to as the Stop Dose and is calculated by the Stop Model. Because stop dose is a significant portion of the overall dose associated with RAM transport, the values used as input to the Stop Model are important. Therefore, an investigation of typical values of the RADTRAN stop parameters for truck stops was performed. The resulting data were analyzed to provide mean values, standard deviations, and histograms. The mean values can be used when an analyst does not have a basis for selecting other input values for the Stop Model. In addition, the histograms and their characteristics can be used to guide statistical sampling techniques that measure the sensitivity of the RADTRAN-calculated Stop Dose to the uncertainties in the stop-model input parameters. This paper discusses the details and presents the results of the investigation of stop-model input parameters at truck stops
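
    The kind of sensitivity study described, sampling stop-model inputs from fitted distributions and propagating them to a dose, can be sketched as a toy Monte Carlo. The dose formula and every distribution parameter below are hypothetical stand-ins for illustration only; they are not RADTRAN's actual Stop Model or the paper's fitted data.

    ```python
    import random
    import statistics

    def stop_dose(rate_1m, shield, persons, hours, r):
        """Toy incident-free stop dose: point-source 1/r^2 scaling of the
        dose rate at 1 m, times a shielding factor, the number of exposed
        persons, and the stop duration. A deliberately simplified stand-in
        for RADTRAN's Stop Model, used only to demonstrate input sampling."""
        return rate_1m * shield * persons * hours / (r * r)

    random.seed(0)
    doses = [stop_dose(0.1,                             # mSv/h at 1 m (hypothetical)
                       0.5,                             # shielding factor (hypothetical)
                       random.triangular(2, 30, 10),    # exposed persons per stop
                       random.lognormvariate(-0.5, 0.5),# stop duration, hours
                       random.uniform(10, 40))          # mean exposure distance, m
             for _ in range(5000)]
    mean_dose = statistics.fmean(doses)
    ```

    Replacing any one sampled input with its mean and re-running isolates that parameter's contribution, which is exactly how histogram-guided sampling supports the sensitivity measurements the paper describes.
    
    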

  14. Estimation of an image derived input function with MR-defined carotid arteries in FDG-PET human studies using a novel partial volume correction method

    DEFF Research Database (Denmark)

    Sari, Hasan; Erlandsson, Kjell; Law, Ian

    2017-01-01

    Kinetic analysis of 18F-fluorodeoxyglucose positron emission tomography data requires accurate knowledge of the arterial input function. The gold-standard method to measure the arterial input function requires collection of arterial blood samples and is an invasive method. Measuring an image deriv...... input function (p > 0.12 for grey matter and white matter). Hence, the presented image-derived input function extraction method can be a practical alternative for noninvasively analyzing dynamic 18F-fluorodeoxyglucose data without the need for blood sampling....

  15. 75 FR 37389 - Marine Mammals; File Nos. 14628 and 15471

    Science.gov (United States)

    2010-06-29

    ... parts for scientific research purposes:National Museum of Natural History, Smithsonian Institution... (16 U.S.C. 1151 et seq.). File No.14628: The National Museum of Natural History (NMNH) is requesting... health assessment studies in Punta San Juan, Peru. Samples may be archived, transported, and analyzed by...

  16. The Galley Parallel File System

    Science.gov (United States)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than the one they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.

  17. COSIS User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Cho, J. Y.; Lee, K. B.; Koo, B. S.; Lee, W. K.; Lee, C. C.; Zee, S. Q

    2006-02-15

    COSIS (COre State Indication System), implemented in the SMART research reactor, supplies core state parameters and graphs that allow the operator to recognize the core state effectively. The main functions of COSIS are: validity checking of the process signals and determination of the COSIS inputs (SIGVAL), coolant flow rate calculation (FLOW), core thermal power calculation (COREPOW), in-core 3-dimensional power distribution calculation and peaking parameter generation (POWER3D), and azimuthal tilt calculation (AZITILT). This report describes the structures of the I/O files that are essential for users to run COSIS. COSIS handles the following 4 input files. DATABASE: the base input file; COSIS.INP: the signal input file; CCS.DAT: the file required for the in-core detector signal processing and the 3-D power distribution calculation; TPFH2O: the steam table for the water properties. The DATABASE file contains the base information for a nuclear power plant and is read at the first COSIS calculation. The COSIS.INP file contains the process input and detector signals, and is read by COSIS every second. The CCS.DAT file, which is produced by the COSISMAS code, contains the information for the in-core detector signal processing and the 3-D power distribution calculation. The TPFH2O file is a steam table written in binary format. COSIS produces the following 4 output files. DATABASE.OUT: the output file for the DATABASE input file; COSIS.OUT: the normal output file produced after the COSIS calculation; COSIS.SUM: the summary file for the operator to recognize the core state effectively; MAS_SIG: the file used to run the COSISMAS code. The DATABASE.OUT file is produced right after the DATABASE processing is finished. The COSIS.OUT file is produced after the input signal processing and the main COSIS calculation are finished. The COSIS.SUM file is the summary file of the COSIS results for the operator to recognize the core state effectively. The MAS_SIG file is the input for the COSISMAS code.


  19. A convertor and user interface to import CAD files into worldtoolkit virtual reality systems

    Science.gov (United States)

    Wang, Peter Hor-Ching

    1996-01-01

    Virtual Reality (VR) is a rapidly developing human-computer interface technology. VR can be considered a three-dimensional computer-generated Virtual World (VW) which can sense particular aspects of a user's behavior, allow the user to manipulate objects interactively, and render the VW in real time accordingly. The user is totally immersed in the virtual world and feels a sense of being transported into that VW. The NASA/MSFC Computer Application Virtual Environments (CAVE) lab has been developing space-related VR applications since 1990. The VR systems in the CAVE lab are based on the VPL RB2 system, which consists of a VPL RB2 control tower, an LX eyephone, an Isotrak polhemus sensor, two Fastrak polhemus sensors, a Flock of Birds sensor, and two VPL DG2 DataGloves. A dynamics animator called Body Electric from VPL is used as the control system to interface with all the input/output devices and to provide network communications as well as a VR programming environment. RB2 Swivel 3D is used as the modelling program to construct the VWs. A severe limitation of the VPL VR system is the use of RB2 Swivel 3D, which restricts files to a maximum of 1020 objects and lacks advanced graphics texture mapping. The other limitation is that the VPL VR system is a turn-key system which does not provide the flexibility for users to add new sensors or a C language interface. Recently, the NASA/MSFC CAVE lab has provided VR systems built on Sense8 WorldToolKit (WTK), a C library for creating VR development environments. WTK provides device drivers for most of the sensors and eyephones available on the VR market. WTK accepts several CAD file formats, such as the Sense8 Neutral File Format, AutoCAD DXF and 3D Studio file formats, the Wavefront OBJ file format, the VideoScape GEO file format, and Intergraph EMS stereolithography and CATIA stereolithography STL file formats. WTK functions are object-oriented in their naming convention, are grouped into classes, and provide easy C

  20. Enhanced Input in LCTL Pedagogy

    Directory of Open Access Journals (Sweden)

    Marilyn S. Manley

    2009-08-01

    Language materials for the more-commonly-taught languages (MCTLs) often include visual input enhancement (Sharwood Smith 1991, 1993), which makes use of typographical cues like bolding and underlining to enhance the saliency of targeted forms. For a variety of reasons, this paper argues that the use of enhanced input, both visual and oral, is especially important as a tool for the less-commonly-taught languages (LCTLs). As there continues to be a scarcity of teaching resources for the LCTLs, individual teachers must take it upon themselves to incorporate enhanced input into their own self-made materials. Specific examples of how to incorporate both visual and oral enhanced input into language teaching are drawn from the author's own experiences teaching Cuzco Quechua. Additionally, survey results are presented from the author's Fall 2010 semester Cuzco Quechua language students, supporting the use of both visual and oral enhanced input.


  2. Atmospheric nutrient inputs to the northern levantine basin from a long-term observation: sources and comparison with riverine inputs

    Directory of Open Access Journals (Sweden)

    M. Koçak

    2010-12-01

    Aerosol and rainwater samples were collected at a rural site on the coastline of the Eastern Mediterranean (Erdemli, Turkey) between January 1999 and December 2007. Riverine sampling was carried out at five rivers (Ceyhan, Seyhan, Göksu, Berdan and Lamas) draining into the Northeastern Levantine Basin (NLB) between March 2002 and July 2007. Samples were analyzed for the macronutrients phosphate, silicate, nitrate and ammonium (PO43−, Sidiss, NO3− and NH4+). Phosphate and silicate in aerosol and rainwater showed higher concentrations and larger variations during the transitional period, when air flows predominantly originate from North Africa and the Middle East/Arabian Peninsula. A deficiency of alkaline material was found to be the main cause of the acidic rain events, whilst high pH values (>7) were associated with high Sidiss concentrations due to sporadic dust events. In general, the lowest nitrate and ammonium concentrations in aerosol and rainwater were associated with air flow from the Mediterranean Sea. Comparison of atmospheric with riverine fluxes demonstrated that DIN and PO43− fluxes to the NLB were dominated by the atmosphere (~90% and ~60%, respectively), whereas the input of Si was mainly derived from riverine runoff (~90%). N/P ratios in the atmospheric deposition (233) and riverine discharge (28) revealed that the NLB receives excessive amounts of DIN, and these unbalanced P and N inputs may provoke even more phosphorus deficiency. The observed molar Si/N ratio suggested that Si limitation relative to nitrogen might cause a switch from diatom-dominated communities to non-siliceous populations, particularly in the coastal NLB.

  3. An efficient modularized sample-based method to estimate the first-order Sobol' index

    International Nuclear Information System (INIS)

    Li, Chenzhao; Mahadevan, Sankaran

    2016-01-01

    Sobol' index is a prominent methodology in global sensitivity analysis. This paper aims to estimate the Sobol' index directly from available input–output samples, even if the underlying model is unavailable. For this purpose, a new method to calculate the first-order Sobol' index is proposed. The innovation is that the conditional variance and mean in the formula of the first-order index are calculated at an unknown but existing location of the model inputs, instead of at an explicit user-defined location. The proposed method is modularized in two aspects: 1) index calculations for different model inputs are separate and use the same set of samples; and 2) model input sampling, model evaluation, and index calculation are separate. Due to this modularization, the proposed method can compute the first-order index when only input–output samples are available but the underlying model is not, and its computational cost is not proportional to the dimension of the model inputs. In addition, the proposed method can also estimate the first-order index with correlated model inputs. Considering that the first-order index is a desired metric to rank model inputs but current methods can only handle independent model inputs, the proposed method helps to fill this gap. - Highlights: • An efficient method to estimate the first-order Sobol' index. • Estimates the index from input–output samples directly. • Computational cost is not proportional to the number of model inputs. • Handles both uncorrelated and correlated model inputs.
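
    The "unknown but existing location" construction is specific to this paper, but the general given-data route to the first-order index (bin the input samples, compare the variance of the conditional output means with the total output variance) can be sketched as follows. The binning estimator and the test model y = x1 + 0.5·x2 are generic illustrations, not the authors' algorithm.

    ```python
    import random
    from statistics import fmean, pvariance

    def first_order_sobol(xs, ys, bins=50):
        """Given-data estimate of S1 for one input: sort samples by the input,
        form equal-size bins, take the conditional mean of the output in each
        bin, and divide the variance of those means by the total variance."""
        order = sorted(range(len(xs)), key=lambda i: xs[i])
        per_bin = len(xs) // bins
        cond_means, weights = [], []
        for b in range(bins):
            idx = order[b * per_bin:(b + 1) * per_bin]
            cond_means.append(fmean(ys[i] for i in idx))
            weights.append(len(idx))
        grand = fmean(ys)
        var_cond = (sum(w * (m - grand) ** 2
                        for w, m in zip(weights, cond_means)) / sum(weights))
        return var_cond / pvariance(ys)

    # Linear test model with independent uniform inputs:
    # y = x1 + 0.5*x2, so analytically S1(x1) = 1 / (1 + 0.25) = 0.8.
    random.seed(42)
    n = 20000
    x1 = [random.random() for _ in range(n)]
    x2 = [random.random() for _ in range(n)]
    y = [a + 0.5 * b for a, b in zip(x1, x2)]
    s1 = first_order_sobol(x1, y)
    ```

    Note the modularity the abstract emphasizes: the same (x, y) sample set is reused to estimate S1 for each input in turn, with no further model evaluations.
    
    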

  4. Description of input and examples for PHREEQC version 3: a computer program for speciation, batch-reaction, one-dimensional transport, and inverse geochemical calculations

    Science.gov (United States)

    Parkhurst, David L.; Appelo, C.A.J.

    2013-01-01

    . Surface complexation can be calculated with the CD-MUSIC (Charge Distribution MUltiSIte Complexation) triple-layer model in addition to the diffuse-layer model. The composition of the electrical double layer of a surface can be estimated by using the Donnan approach, which is more robust and faster than the alternative Borkovec-Westall integration. Multicomponent diffusion, diffusion in the electrostatic double layer on a surface, and transport of colloids with simultaneous surface complexation have been added to the transport module. A series of keyword data blocks has been added for isotope calculations—ISOTOPES, CALCULATE_VALUES, ISOTOPE_ALPHAS, ISOTOPE_RATIOS, and NAMED_EXPRESSIONS. Solution isotopic data can be input in conventional units (for example, permil, percent modern carbon, or tritium units) and the numbers are converted to moles of isotope by PHREEQC. The isotopes are treated as individual components (they must be defined as individual master species) so that each isotope has its own set of aqueous species, gases, and solids. The isotope-related keywords allow calculating equilibrium fractionation of isotopes among the species and phases of a system. The calculated isotopic compositions are printed in easily readable conventional units. New keywords and options facilitate the setup of input files and the interpretation of the results. Keyword data blocks can be copied (keyword COPY) and deleted (keyword DELETE). Keyword data items can be altered by using the keyword data blocks with the _MODIFY extension, and a simulation can be run with all reactants of a given index number (keyword RUN_CELLS). The definition of the complete chemical state of all reactants of PHREEQC can be saved in a file in a raw data format (DUMP and _RAW keywords). The file can be read as part of another input file with the INCLUDE$ keyword. These keywords facilitate the use of IPhreeqc, which is a module implementing all PHREEQC version 3 capabilities; the module is designed to be

  5. 78 FR 21930 - Aquenergy Systems, Inc.; Notice of Intent To File License Application, Filing of Pre-Application...

    Science.gov (United States)

    2013-04-12

    ... Systems, Inc.; Notice of Intent To File License Application, Filing of Pre-Application Document, and Approving Use of the Traditional Licensing Process a. Type of Filing: Notice of Intent to File License...: November 11, 2012. d. Submitted by: Aquenergy Systems, Inc., a fully owned subsidiary of Enel Green Power...

  6. 12 CFR 16.33 - Filing fees.

    Science.gov (United States)

    2010-01-01

    ... Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY SECURITIES OFFERING DISCLOSURE RULES § 16.33 Filing fees. (a) Filing fees must accompany certain filings made under the provisions of this part... Comptroller of the Currency Fees published pursuant to § 8.8 of this chapter. (b) Filing fees must be paid by...

  7. 75 FR 4689 - Electronic Tariff Filings

    Science.gov (United States)

    2010-01-29

    ... elements ``are required to properly identify the nature of the tariff filing, organize the tariff database... (or other pleading) and the Type of Filing code chosen will be resolved in favor of the Type of Filing...'s wish expressed in its transmittal letter or in other pleadings, the Commission may not review a...

  8. Nuclear model parameter testing for nuclear data evaluation (Reference Input Parameter Library: Phase II). Summary report of the third research co-ordination meeting

    International Nuclear Information System (INIS)

    Herman, M.

    2002-04-01

    This report summarises the results and recommendations of the third Research Co-ordination Meeting on improving and testing the Reference Input Parameter Library: Phase II. A primary aim of the meeting was to review the achievements of the CRP, to assess the testing of the library and to approve the final contents. Actions were approved that will result in completion of the file and a draft report by the end of February 2002. Full release of the library is scheduled for July 2002. (author)

  9. MDS MIC Catalog Inputs

    Science.gov (United States)

    Johnson-Throop, Kathy A.; Vowell, C. W.; Smith, Byron; Darcy, Jeannette

    2006-01-01

    This viewgraph presentation reviews the inputs to the MDS Medical Information Communique (MIC) catalog. The purpose of the group is to provide input for updating the MDS MIC Catalog and to request that MMOP assign Action Item to other working groups and FSs to support the MITWG Process for developing MIC-DDs.

  10. 78 FR 75554 - Combined Notice of Filings

    Science.gov (United States)

    2013-12-12

    ...-000. Applicants: Young Gas Storage Company, Ltd. Description: Young Fuel Reimbursement Filing to be.... Protests may be considered, but intervention is necessary to become a party to the proceeding. eFiling is... qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf . For...

  11. Inverse methods for estimating primary input signals from time-averaged isotope profiles

    Science.gov (United States)

    Passey, Benjamin H.; Cerling, Thure E.; Schuster, Gerard T.; Robinson, Todd F.; Roeder, Beverly L.; Krueger, Stephen K.

    2005-08-01

    Mammalian teeth are invaluable archives of ancient seasonality because they record along their growth axes an isotopic record of temporal change in environment, plant diet, and animal behavior. A major problem with the intra-tooth method is that intra-tooth isotope profiles can be extremely time-averaged compared to the actual pattern of isotopic variation experienced by the animal during tooth formation. This time-averaging is a result of the temporal and spatial characteristics of amelogenesis (tooth enamel formation), and also results from laboratory sampling. This paper develops and evaluates an inverse method for reconstructing original input signals from time-averaged intra-tooth isotope profiles. The method requires that the temporal and spatial patterns of amelogenesis are known for the specific tooth and uses a minimum length solution of the linear system Am = d, where d is the measured isotopic profile, A is a matrix describing temporal and spatial averaging during amelogenesis and sampling, and m is the input vector that is sought. Accuracy is dependent on several factors, including the total measurement error and the isotopic structure of the measured profile. The method is shown to accurately reconstruct known input signals for synthetic tooth enamel profiles and the known input signal for a rabbit that underwent controlled dietary changes. Application to carbon isotope profiles of modern hippopotamus canines reveals detailed dietary histories that are not apparent from the measured data alone. Inverse methods show promise as an effective means of dealing with the time-averaging problem in studies of intra-tooth isotopic variation.
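
    The minimum-length solution can be sketched numerically with a hypothetical averaging matrix (a simple moving-average kernel standing in for the amelogenesis-and-sampling matrix A; the window length and signal below are invented for illustration):

```python
import numpy as np

# Hypothetical forward model: each measured value d_j is a moving average
# of the true input signal m over w consecutive time steps.
n, w = 60, 9
A = np.zeros((n - w + 1, n))
for j in range(n - w + 1):
    A[j, j:j + w] = 1.0 / w

t = np.linspace(0.0, 4.0 * np.pi, n)
m_true = np.sin(t)          # assumed seasonal input signal
d = A @ m_true              # time-averaged "measured" profile

# Minimum-length (minimum-norm) solution of A m = d, i.e.
# m = A^T (A A^T)^{-1} d, computed via the Moore-Penrose pseudoinverse.
m_est = np.linalg.pinv(A) @ d

# The reconstruction restores amplitude lost to averaging: the damped
# profile d has a smaller peak than the recovered input m_est.
```

    In practice A would encode the measured temporal and spatial geometry of amelogenesis plus the laboratory sampling, and d would carry measurement error, so the accuracy caveats discussed in the abstract apply.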

  12. Dose calculation for {sup 40}K ingestion in samples of beans using spectrometry and MCNP; Calculo de dose devido a ingestao de {sup 40}K em amostras de feijao utilizando espectrometria e MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Garcez, R.W.D.; Lopes, J.M.; Silva, A.X., E-mail: marqueslopez@yahoo.com.br [Coordenacao dos Programas de Pos-Graduacao em Engenharia (COPPE/PEN/UFRJ), Rio de Janeiro, RJ (Brazil). Centro de Tecnologia; Domingues, A.M. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Instituto de Fisica; Lima, M.A.F. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Instituto de Biologia

    2014-07-01

    A method based on gamma spectroscopy and on the use of voxel phantoms to calculate the dose due to ingestion of {sup 40}K contained in bean samples is presented in this work. To quantify the activity of the radionuclide, an HPGe detector was used and the data were entered in the input file of the MCNP code. The highest equivalent dose value was 7.83 μSv.y{sup -1}, in the stomach, for white beans, whose activity of 452.4 Bq.kg{sup -1} was the highest of the five samples analyzed. The tool proved to be appropriate for calculating organ doses due to the ingestion of food. (author)

  13. PUFF-IV, Code System to Generate Multigroup Covariance Matrices from ENDF/B-VI Uncertainty Files

    International Nuclear Information System (INIS)

    2007-01-01

    1 - Description of program or function: The PUFF-IV code system processes ENDF/B-VI formatted nuclear cross section covariance data into multigroup covariance matrices. PUFF-IV is the newest release in this series of codes used to process ENDF uncertainty information and to generate the desired multi-group correlation matrix for the evaluation of interest. This version includes corrections and enhancements over previous versions. It is written in Fortran 90 and allows for a more modular design, thus facilitating future upgrades. PUFF-IV enhances support for resonance parameter covariance formats described in the ENDF standard and now handles almost all resonance parameter covariance information in the resolved region, with the exception of the long range covariance sub-subsections. PUFF-IV is normally used in conjunction with an AMPX master library containing group averaged cross section data. Two utility modules are included in this package to facilitate the data interface. The module SMILER allows one to use NJOY generated GENDF files containing group averaged cross section data in conjunction with PUFF-IV. The module COVCOMP allows one to compare two files written in COVERX format. 2 - Methods: Cross section and flux values on a 'super energy grid,' consisting of the union of the required energy group structure and the energy data points in the ENDF/B-V file, are interpolated from the input cross sections and fluxes. Covariance matrices are calculated for this grid and then collapsed to the required group structure. 3 - Restrictions on the complexity of the problem: PUFF-IV cannot process covariance information for energy and angular distributions of secondary particles. PUFF-IV does not process covariance information in Files 34 and 35; nor does it process covariance information in File 40. These new formats will be addressed in a future version of PUFF
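
    The collapse step can be sketched as flux-weighted linear propagation (a generic illustration of collapsing a covariance matrix from a fine grid to a coarse group structure, not PUFF-IV's actual implementation):

```python
import numpy as np

def collapse_covariance(cov_fine, flux, groups):
    """Collapse a fine-grid covariance matrix to a coarse group structure.

    The collapsed cross section in coarse group I is the flux-weighted
    average of the fine-grid values, so the collapsed covariance is
    Cov_IJ = sum_{i in I, j in J} w_i w_j Cov_ij with w_i = phi_i / Phi_I.
    `groups` lists, for each coarse group, the fine-grid indices it holds.
    """
    W = np.zeros((len(groups), cov_fine.shape[0]))
    for I, idx in enumerate(groups):
        W[I, idx] = flux[idx] / flux[idx].sum()
    return W @ cov_fine @ W.T

# Toy example: a 6-point "super grid" collapsed to 2 coarse groups.
rng = np.random.default_rng(1)
B = rng.normal(size=(6, 6))
cov_fine = B @ B.T                        # symmetric positive semi-definite
flux = np.array([1.0, 2.0, 3.0, 3.0, 2.0, 1.0])
groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]
cov_coarse = collapse_covariance(cov_fine, flux, groups)
```

    The linear propagation guarantees the collapsed matrix stays symmetric and positive semi-definite whenever the fine-grid matrix is.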

  14. Effects of modulation techniques on the input current interharmonics of Adjustable Speed Drives

    DEFF Research Database (Denmark)

    Soltani, Hamid; Davari, Pooya; Zare, Firuz

    2018-01-01

    operation of the grid. This paper presents the effect of the symmetrical regularly sampled Space Vector Modulation (SVM) and Discontinuous Pulse Width Modulation-30° lag (DPWM2) techniques, as the most popular modulation methods in ASD applications, on the drive's input current interharmonic magnitudes. ... Further investigations are also devoted to cases where the Random Modulation (RM) technique is applied to the selected modulation strategies. The comparative results show how different modulation techniques can influence the ASD's input current interharmonics and consequently may...

  15. Small file aggregation in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
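
    The offset/length bookkeeping described above can be sketched in a few lines (a toy in-memory illustration of the idea, not the patented parallel file system implementation):

```python
def aggregate(files):
    """Pack many small files into one blob plus offset/length metadata.

    `files` maps name -> bytes; the metadata records where each file's
    bytes start inside the single aggregated blob and how long they are.
    """
    blob = bytearray()
    meta = {}
    for name, data in files.items():
        meta[name] = {"offset": len(blob), "length": len(data)}
        blob.extend(data)
    return bytes(blob), meta

def unpack(blob, meta, name):
    """Recover one file from the aggregated blob using its metadata."""
    rec = meta[name]
    return blob[rec["offset"]: rec["offset"] + rec["length"]]

files = {"a.dat": b"hello", "b.dat": b"parallel", "c.dat": b"fs"}
blob, meta = aggregate(files)
assert unpack(blob, meta, "b.dat") == b"parallel"
```

    Storing one blob instead of many tiny files is what relieves the metadata pressure on a parallel file system; individual files remain addressable through the offset/length table alone.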

  16. 5 CFR 1203.13 - Filing pleadings.

    Science.gov (United States)

    2010-01-01

    ... delivery, by facsimile, or by e-filing in accordance with § 1201.14 of this chapter. If the document was... submitted by e-filing, it is considered to have been filed on the date of electronic submission. (e... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Filing pleadings. 1203.13 Section 1203.13...

  17. PFS: a distributed and customizable file system

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    In this paper we present our ongoing work on the Pegasus File System (PFS), a distributed and customizable file system that can be used for off-line file system experiments and on-line file system storage. PFS is best described as an object-oriented component library from which either a true file system or a file-system simulator can be constructed.

  18. 76 FR 61351 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-10-04

    ... MBR Baseline Tariff Filing to be effective 9/22/2011. Filed Date: 09/22/2011. Accession Number... submits tariff filing per 35.1: ECNY MBR Re-File to be effective 9/22/2011. Filed Date: 09/22/2011... Industrial Energy Buyers, LLC submits tariff filing per 35.1: NYIEB MBR Re-File to be effective 9/22/2011...

  19. Deceit: A flexible distributed file system

    Science.gov (United States)

    Siegel, Alex; Birman, Kenneth; Marzullo, Keith

    1989-01-01

    Deceit, a distributed file system (DFS) being developed at Cornell, focuses on flexible file semantics in relation to efficiency, scalability, and reliability. Deceit servers are interchangeable and collectively provide the illusion of a single, large server machine to any clients of the Deceit service. Non-volatile replicas of each file are stored on a subset of the file servers. The user is able to set parameters on a file to achieve different levels of availability, performance, and one-copy serializability. Deceit also supports a file version control mechanism. In contrast with many recent DFS efforts, Deceit can behave like a plain Sun Network File System (NFS) server and can be used by any NFS client without modifying any client software. The current Deceit prototype uses the ISIS Distributed Programming Environment for all communication and process group management, an approach that reduces system complexity and increases system robustness.

  20. Input-output supervisor; Le superviseur d'entree-sortie dans les ordinateurs

    Energy Technology Data Exchange (ETDEWEB)

    Dupuy, R [Commissariat a l' Energie Atomique, Vaujours (France). Centre d' Etudes Nucleaires

    1970-07-01

    The input-output supervisor is the program which monitors the flow of information between core storage and the peripheral equipment of a computer. This work is composed of three parts: 1 - Study of a generalized input-output supervisor which, by making a certain number of simplifying assumptions, covers most of the input-output supervisors currently running on computers. 2 - Application of this theory to a magnetic drum. 3 - Hardware requirements for time-sharing. (author) [French] The input-output supervisor is the program responsible for managing the exchange of information between fast memory and the peripheral devices of a computer. This work is composed of three parts: 1 - Study of a general and theoretical input-output system which, by making a certain number of simplifying assumptions, allows most current input-output supervisors to be recovered. 2 - Description of a concrete implementation: the management of a magnetic drum. 3 - Hardware suggestions to facilitate time-sharing. (author)

  1. 10 CFR 110.89 - Filing and service.

    Science.gov (United States)

    2010-01-01

    ...: Rulemakings and Adjudications Staff or via the E-Filing system, following the procedure set forth in 10 CFR 2.302. Filing by mail is complete upon deposit in the mail. Filing via the E-Filing system is completed... residence with some occupant of suitable age and discretion; (2) Following the requirements for E-Filing in...

  2. 49 CFR 1104.6 - Timely filing required.

    Science.gov (United States)

    2010-10-01

    ... offers next day delivery to Washington, DC. If the e-filing option is chosen (for those pleadings and documents that are appropriate for e-filing, as determined by reference to the information on the Board's Web site), then the e-filed pleading or document is timely filed if the e-filing process is completed...

  3. DICOM supported sofware configuration by XML files

    International Nuclear Information System (INIS)

    LucenaG, Bioing Fabian M; Valdez D, Andres E; Gomez, Maria E; Nasisi, Oscar H

    2007-01-01

    A method is proposed for configuring informatics systems that support the DICOM standard using XML files. The difference from other proposals is that this system does not encode the information of a DICOM object file, but encodes the standard itself in an XML file. The development itself is the format for the XML files mentioned, so that they can support what DICOM normalizes for multiple languages. In this way, the same configuration file (or files) can be used in different systems. Together with the generated XML configuration file, we also wrote a set of CSS and XSL files, so that the same file can be visualized in a standard browser as a query system for the DICOM standard, an emerging use that was not a main objective but brings great utility and versatility. We also present some usage examples of the configuration file, mainly in relation to the loading of DICOM information objects. Finally, in the conclusions we show the utility that the system has already provided during the change of the DICOM standard edition from 2006 to 2007.
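
    The idea of coding the standard itself, with per-language names, might look like the following sketch (the XML layout and the helper function are invented for illustration, though the two element tags shown are real DICOM attributes):

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment: the configuration file encodes DICOM data-element
# definitions, not the contents of any particular DICOM object file, with
# one <name> entry per supported language.
XML = """
<dicom-dictionary>
  <element tag="(0010,0010)" vr="PN">
    <name lang="en">Patient's Name</name>
    <name lang="es">Nombre del paciente</name>
  </element>
  <element tag="(0010,0020)" vr="LO">
    <name lang="en">Patient ID</name>
    <name lang="es">ID del paciente</name>
  </element>
</dicom-dictionary>
"""

def element_name(root, tag, lang="en"):
    """Look up the display name of a DICOM data element for one language."""
    for el in root.iter("element"):
        if el.get("tag") == tag:
            name = el.find(f"name[@lang='{lang}']")
            return name.text if name is not None else None
    return None

root = ET.fromstring(XML)
print(element_name(root, "(0010,0010)", "es"))  # Nombre del paciente
```

    Because the file describes the standard rather than any one study, the same dictionary can drive several systems, and an XSL stylesheet can render it directly in a browser as a query tool.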

  4. Substitution elasticities between GHG-polluting and nonpolluting inputs in agricultural production: A meta-regression

    International Nuclear Information System (INIS)

    Liu, Boying; Richard Shumway, C.

    2016-01-01

    This paper reports meta-regressions of substitution elasticities between greenhouse gas (GHG) polluting and nonpolluting inputs in agricultural production, which is the main feedstock source for biofuel in the U.S. We treat energy, fertilizer, and manure collectively as the “polluting input” and labor, land, and capital as nonpolluting inputs. We estimate meta-regressions for samples of Morishima substitution elasticities for labor, land, and capital vs. the polluting input. Much of the heterogeneity of Morishima elasticities can be explained by type of primal or dual function, functional form, type and observational level of data, input categories, number of outputs, type of output, time period, and country categories. Each estimated long-run elasticity for the reference case, which is most relevant for assessing GHG emissions through life-cycle analysis, is greater than 1.0 and significantly different from zero. Most predicted long-run elasticities remain significantly different from zero at the data means. These findings imply that life-cycle analysis based on fixed-proportion production functions could provide grossly inaccurate measures of the GHG emissions of biofuel. - Highlights: • This paper reports meta-regressions of substitution elasticities between greenhouse-gas (GHG) polluting and nonpolluting inputs in agricultural production, which is the main feedstock source for biofuel in the U.S. • We estimate meta-regressions for samples of Morishima substitution elasticities for labor, land, and capital vs. the polluting input based on 65 primary studies. • We found that each estimated long-run elasticity for the reference case, which is most relevant for assessing GHG emissions through life-cycle analysis, is greater than 1.0 and significantly different from zero. Most predicted long-run elasticities remain significantly different from zero at the data means. • These findings imply that life-cycle analysis based on fixed-proportion production functions could

  5. 12 CFR 908.25 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Filing of papers. 908.25 Section 908.25 Banks... RULES OF PRACTICE AND PROCEDURE IN HEARINGS ON THE RECORD General Rules § 908.25 Filing of papers. (a) Filing. Any papers required to be filed shall be addressed to the presiding officer and filed with the...

  6. PFS: a distributed and customizable file system

    OpenAIRE

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    In this paper we present our ongoing work on the Pegasus File System (PFS), a distributed and customizable file system that can be used for off-line file system experiments and on-line file system storage. PFS is best described as an object-oriented component library from which either a true file system or a file-system simulator can be constructed. Each of the components in the library is easily replaced by another implementation to accommodate a wide range of applications.

  7. Formalizing a hierarchical file system

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, Muhammad Ikram

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal,
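
    The abstraction can be rendered directly in code (a toy model assuming a dict keyed by path tuples; the operation names are ours, not the paper's formalization):

```python
# A file system as a partial function from absolute paths to data: here,
# a dict whose keys are path tuples and whose values are file contents.
def is_valid(fs, path):
    """A path is valid iff the partial function is defined at it."""
    return path in fs

def read(fs, path):
    return fs[path]        # defined only at valid paths

def write(fs, path, data):
    fs[path] = data        # writing at a new path models file creation

def remove(fs, path):
    del fs[path]           # the Unix removal operation at a valid path

fs = {}
write(fs, ("home", "alice", "notes.txt"), b"hi")
assert is_valid(fs, ("home", "alice", "notes.txt"))
assert read(fs, ("home", "alice", "notes.txt")) == b"hi"
remove(fs, ("home", "alice", "notes.txt"))
assert not is_valid(fs, ("home", "alice", "notes.txt"))
```

    The set of valid paths is simply the domain of the partial function, which is what the formalization exploits when stating the correctness of the Unix operations.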

  8. Data Qualification Report For: Thermodynamic Data File, DATA0.YMP.R0 For Geochemical Code, EQ3/6

    International Nuclear Information System (INIS)

    P.L. Cloke

    2000-09-01

    The objective of this work is to evaluate the adequacy of chemical thermodynamic data provided by Lawrence Livermore National Laboratory (LLNL) as Data0.ymp.R0A in response to an input request submitted under AP-3.14Q. This request specified that the chemical thermodynamic data available in the file Data0.com.R2 be updated, improved, and augmented for use in geochemical modeling used in Process Model Reports (PMRs) for Engineered Barrier Systems, Waste Form, Waste Package, Unsaturated Zone, and Near Field Environment, as well as for Performance Assessment. The data are qualified in the temperature range 0 to 100 °C. Several Data Tracking Numbers (DTNs) associated with Analysis/Model Reports (AMRs) addressing various aspects of the post-closure chemical behavior of the waste package and the Engineered Barrier System rely on EQ3/6 outputs to which these data are used as input, and are designated Principal Factor affecting post-closure performance. This qualification activity was accomplished in accordance with AP-SIII.2Q using the Technical Assessment method. A development plan, TDP-EBS-MD-000044, was prepared in accordance with AP-2.13Q and approved by the Responsible Manager. In addition, a Process Control Evaluation was performed in accordance with AP-SV.1Q. The rationale for selecting Technical Assessment is that the data in file Data0.com.R2 are considered handbook data and therefore do not themselves require qualification; only the changes relative to Data0.com.R2 required qualification. A new file has been produced containing the database Data0.ymp.R0, which is recommended for qualification as a result of this action. Data0.ymp.R0 will supersede Data0.com.R2 for all Yucca Mountain Project (YMP) activities

  9. Input filter compensation for switching regulators

    Science.gov (United States)

    Lee, F. C.; Kelkar, S. S.

    1982-01-01

    The problems caused by the interaction between the input filter, output filter, and the control loop are discussed. The input filter design is made more complicated because of the need to avoid performance degradation and also stay within the weight and loss limitations. Conventional input filter design techniques are then discussed. The concept of pole-zero cancellation is reviewed; this concept is the basis for an approach to control the peaking of the output impedance of the input filter and thus mitigate some of the problems caused by the input filter. The proposed approach for control of the peaking of the output impedance of the input filter is to use a feedforward loop working in conjunction with feedback loops, thus forming a total state control scheme. The design of the feedforward loop for a buck regulator is described. A possible implementation of the feedforward loop design is suggested.

  10. 77 FR 74839 - Combined Notice of Filings

    Science.gov (United States)

    2012-12-18

    ..., LP. Description: National Grid LNG, LP submits tariff filing per 154.203: Adoption of NAESB Version 2... with Order to Amend NAESB Version 2.0 Filing to be effective 12/1/2012. Filed Date: 12/11/12. Accession...: Refile to comply with Order on NAESB Version 2.0 Filing to be effective 12/1/2012. Filed Date: 12/11/12...

  11. Formalizing a Hierarchical File System

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, M.I.

    2009-01-01

    In this note, we define an abstract file system as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for removal

  12. NJOY99, Data Processing System of Evaluated Nuclear Data Files ENDF Format

    International Nuclear Information System (INIS)

    2000-01-01

    -damage cross sections. THERMR produces cross sections and energy-to-energy matrices for free or bound scatterers in the thermal energy range. GROUPR generates self-shielded multigroup cross sections, group-to-group scattering matrices, photon-production matrices, and charged-particle cross sections from pointwise input. GAMINR calculates multigroup photo-atomic cross sections, KERMA coefficients, and group-to-group photon scattering matrices. ERRORR computes multigroup covariance matrices from ENDF uncertainties. COVR reads the output of ERRORR and performs covariance plotting and output formatting operations. PURR generates unresolved-resonance probability tables for use in representing resonance self-shielding effects in the MCNP Monte Carlo code. LEAPR generates ENDF scattering-law files (File 7) for moderator materials in the thermal range. These scattering-law files can be used by THERMR to produce the corresponding cross sections. GASPR generates gas-production cross sections in pointwise format from basic reaction data in an ENDF evaluation. These results can be converted to multigroup form using GROUPR, passed to ACER, or displayed using PLOTR. MODER converts ENDF 'tapes' back and forth between ASCII format and the special NJOY blocked-binary format. DTFR formats multigroup data for transport codes that accept formats based on the DTF-IV code. CCCCR formats multigroup data for the CCCC standard interface files ISOTXS, BRKOXS, and DLAYXS. MATXSR formats multigroup data for the newer MATXS material cross-section interface file, which works with the TRANSX code to make libraries for many particle transport codes. RESXSR prepares pointwise cross sections in a CCCC-like form for thermal flux calculators. ACER prepares libraries in ACE format for the Los Alamos continuous-energy Monte Carlo code MCNP. POWR prepares libraries for the EPRI-CELL and EPRI-CPM codes. WIMSR prepares libraries for the thermal reactor assembly codes WIMS-D and WIMS-E. PLOTR reads ENDF-format files

  13. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    Science.gov (United States)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-06-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space Rn. An isometric mapping F from M to a low-dimensional, compact, connected set A⊂Rd(d≪n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F:M→A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology by constructing low
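
    The graph-theory construction of F described above can be sketched in the spirit of Isomap (our illustration, not the paper's code): build a neighborhood graph on samples of M, take shortest-path (geodesic) distances, and embed them isometrically with classical MDS to obtain the low-dimensional region A. Here M is a hypothetical one-dimensional curve embedded in R^3.

```python
import numpy as np

t = np.linspace(0, 3 * np.pi, 60)                    # intrinsic coordinate
M = np.column_stack([np.cos(t), np.sin(t), 0.2 * t]) # curve embedded in R^3

# 1. Pairwise Euclidean distances and a k-nearest-neighbour graph.
D = np.linalg.norm(M[:, None] - M[None, :], axis=-1)
k = 5
G = np.full_like(D, np.inf)
for i in range(len(M)):
    nbrs = np.argsort(D[i])[: k + 1]                 # includes i itself
    G[i, nbrs] = D[i, nbrs]
G = np.minimum(G, G.T)                               # symmetrize the graph

# 2. Geodesic distances along the graph (Floyd-Warshall shortest paths).
for m in range(len(M)):
    G = np.minimum(G, G[:, [m]] + G[[m], :])

# 3. Classical MDS on the geodesic distances gives a 1-D embedding A.
n = len(M)
J = np.eye(n) - np.ones((n, n)) / n                  # centering matrix
B = -0.5 * J @ (G ** 2) @ J
vals, vecs = np.linalg.eigh(B)
embedding = vecs[:, -1] * np.sqrt(vals[-1])          # dominant coordinate

# The 1-D embedding recovers the intrinsic coordinate t (up to sign).
corr = abs(np.corrcoef(embedding, t)[0, 1])
```

    For microstructure data the points of M would be high-dimensional property fields rather than points on a curve, but the geodesic-plus-isometric-embedding logic is the same.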

  14. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    International Nuclear Information System (INIS)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-01-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space Rn. An isometric mapping F from M to a low-dimensional, compact, connected set A⊂Rd (d≪n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F:M→A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology

  15. An in vitro study: Evaluation of intracanal calcium hydroxide removal with different file systems

    Directory of Open Access Journals (Sweden)

    Atul Jain

    2017-01-01

    Full Text Available Background: Calcium hydroxide (Ca(OH)2) is the most commonly used intracanal material; it needs to be removed in its entirety before obturation. Several techniques have been used for this purpose, including various hand and rotary files. Aim: This study was carried out to compare the efficacy of hand K-files and single and multiple rotary file systems in the removal of Ca(OH)2. Methodology: The distobuccal roots of 45 maxillary molars were selected on the basis of specified inclusion and exclusion criteria. They were divided into three groups - Group 1 (hand K-file), Group 2 (HERO Shaper), and Group 3 (One Shape). Biomechanical preparation (BMP) was carried out as per the manufacturer's instructions; 2.5% sodium hypochlorite was used as the irrigant and 17% ethylenediaminetetraacetic acid as the penultimate irrigant. Ca(OH)2 powder was mixed with normal saline to obtain a paste; canals were filled with this paste using a Lentulo spiral and were sealed. After 7 days, Ca(OH)2 was removed using the same file system as that used for BMP. Samples were sectioned longitudinally and evaluated under a stereomicroscope. Statistical Analysis: Statistical analysis of the obtained data was carried out using the one-way analysis of variance test. Results: HERO Shaper displayed better removal of Ca(OH)2 than One Shape and the hand K-file. Moreover, removal was better in the middle third of the canal than in the apical third. Conclusion: The multiple rotary file system (HERO Shaper) is more effective in removal of Ca(OH)2 than the single file system (One Shape).

  16. 77 FR 35371 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-06-13

    .... Applicants: Duke Energy Miami Fort, LLC. Description: MBR Filing to be effective 10/1/2012. Filed Date: 6/5...-000. Applicants: Duke Energy Piketon, LLC. Description: MBR Filing to be effective 10/1/2012. Filed...-1959-000. Applicants: Duke Energy Stuart, LLC. Description: MBR Filing to be effective 10/1/2012. Filed...

  17. 7 CFR 3430.607 - Stakeholder input.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Stakeholder input. 3430.607 Section 3430.607 Agriculture Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION... § 3430.607 Stakeholder input. CSREES shall seek and obtain stakeholder input through a variety of forums...

  18. The Polls-Review: Inaccurate Age and Sex Data in the Census PUMS Files: Evidence and Implications.

    Science.gov (United States)

    Alexander, J Trent; Davern, Michael; Stevenson, Betsey

    2010-01-01

    We discover and document errors in public-use microdata samples ("PUMS files") of the 2000 Census, the 2003-2006 American Community Survey, and the 2004-2009 Current Population Survey. For women and men age 65 and older, age- and sex-specific population estimates generated from the PUMS files differ by as much as 15 percent from counts in published data tables. Moreover, an analysis of labor-force participation and marriage rates suggests the PUMS samples are not representative of the population at individual ages for those age 65 and over. PUMS files substantially underestimate labor-force participation of those near retirement age and overestimate labor-force participation rates of those at older ages. These problems were an unintentional byproduct of the misapplication of a newer generation of disclosure-avoidance procedures carried out on the data. The resulting errors in the public-use data could significantly impact studies of people age 65 and older, particularly analyses of variables that are expected to change by age.

  19. Prediction of Chl-a concentrations in an eutrophic lake using ANN models with hybrid inputs

    Science.gov (United States)

    Aksoy, A.; Yuzugullu, O.

    2017-12-01

    Chlorophyll-a (Chl-a) concentrations in water bodies exhibit both spatial and temporal variations. As a result, frequent sampling with a large number of samples is required. This motivates the use of remote sensing as a monitoring tool. Yet, prediction performances of models that convert radiance values into Chl-a concentrations can be poor in shallow lakes. In this study, Chl-a concentrations in Lake Eymir, a shallow eutrophic lake in Ankara (Turkey), are determined using artificial neural network (ANN) models that use hybrid inputs composed of water quality and meteorological data as well as remotely sensed radiance values to improve prediction performance. Following a screening based on multi-collinearity and principal component analysis (PCA), dissolved-oxygen concentration (DO), pH, turbidity, and humidity were selected among several parameters as the constituents of the hybrid input dataset. Radiance values were obtained from the QuickBird-2 satellite. Conversion of the hybrid input into Chl-a concentrations was studied for two different periods in the lake. ANN models were successful in predicting Chl-a concentrations. Yet, prediction performance declined for low Chl-a concentrations in the lake. In general, models with hybrid inputs were superior to the ones that solely used remotely sensed data.
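The multi-collinearity/PCA screening step mentioned above can be illustrated with plain numpy: a variance-explained spectrum with a near-zero trailing component flags a redundant (collinear) input that can be dropped from the hybrid dataset. The data below are synthetic stand-ins, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
# columns: hypothetical DO, pH, turbidity, humidity, radiance
X = rng.standard_normal((200, 5))
X[:, 4] = 0.9 * X[:, 0] + 0.1 * rng.standard_normal(200)  # make radiance collinear with DO

Xc = X - X.mean(axis=0)                  # center before PCA
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance-explained ratio per component
```

A trailing ratio near zero means one input is nearly a linear combination of the others and adds little information to the hybrid input set.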

  20. Virtual file system for PSDS

    Science.gov (United States)

    Runnels, Tyson D.

    1993-01-01

    This is a case study. It deals with the use of a 'virtual file system' (VFS) for Boeing's UNIX-based Product Standards Data System (PSDS). One of the objectives of PSDS is to store digital standards documents. The file-storage requirements are that the files must be rapidly accessible, stored for long periods of time - as though they were paper - protected from disaster, and accumulating to about 80 billion characters (80 gigabytes). This volume of data will be approached in the first two years of the project's operation. The approach chosen is to install a hierarchical file migration system using optical disk cartridges. Files are migrated from high-performance media to lower-performance optical media based on a least-frequently-used algorithm. The optical media are less expensive per character stored and are removable. Vital statistics about the removable optical disk cartridges are maintained in a database. The assembly of hardware and software acts as a single virtual file system transparent to the PSDS user. The files are copied to 'backup-and-recover' media whose vital statistics are also stored in the database. Seventeen months into operation, PSDS is storing 49 gigabytes. A number of operational and performance problems were overcome. Costs are under control. New and/or alternative uses for the VFS are being considered.
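The least-frequently-used migration policy described above can be sketched in a few lines: rank files by access count, keep the hottest ones on fast media until its capacity is filled, and migrate the rest to optical storage. File names, sizes, and access counts below are invented for illustration:

```python
# hypothetical catalog: (name, size in GB, access count)
files = [("a.pdf", 5, 12), ("b.pdf", 3, 1), ("c.pdf", 8, 7), ("d.pdf", 2, 2)]

def select_for_migration(files, fast_capacity):
    """Keep the most frequently used files on fast media up to capacity;
    everything else is migrated to the optical tier."""
    keep, migrate, used = [], [], 0
    for name, size, hits in sorted(files, key=lambda f: -f[2]):
        if used + size <= fast_capacity:
            keep.append(name)
            used += size
        else:
            migrate.append(name)
    return keep, migrate

keep, migrate = select_for_migration(files, fast_capacity=13)
```

With 13 GB of fast storage, the two most-accessed files stay resident and the rest move to the optical tier.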

  1. World Input-Output Network.

    Directory of Open Access Journals (Sweden)

    Federica Cerina

    Full Text Available Production systems, traditionally analyzed as almost independent national systems, are increasingly connected on a global scale. Only recently becoming available, the World Input-Output Database (WIOD) is one of the first efforts to construct global multi-regional input-output (GMRIO) tables. By viewing the world input-output system as an interdependent network where the nodes are the individual industries in different economies and the edges are the monetary goods flows between industries, we analyze the global, regional, and local network properties of the so-called world input-output network (WION) and document its evolution over time. At the global level, we find that the industries are highly but asymmetrically connected, which implies that micro shocks can lead to macro fluctuations. At the regional level, we find that world production is still operated nationally or at most regionally, as the communities detected are either individual economies or geographically well-defined regions. Finally, at the local level, for each industry we compare the network-based measures with the traditional measures of backward linkages. We find that network-based measures such as PageRank centrality and the community coreness measure can give valuable insights into identifying the key industries.
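The PageRank centrality used above for the WION can be computed by simple power iteration on the normalized flow matrix. A toy three-industry example in numpy (the flow values are invented; the WIOD network itself is far larger):

```python
import numpy as np

def pagerank(W, d=0.85, tol=1e-10):
    """Power-iteration PageRank; W[i, j] is the monetary flow from industry i to j."""
    n = W.shape[0]
    out = W.sum(axis=1, keepdims=True)
    # row-normalize out-flows; industries with no out-flow spread mass uniformly
    P = np.divide(W, out, out=np.full_like(W, 1.0 / n), where=out > 0)
    r = np.full(n, 1.0 / n)
    while True:
        r_next = (1 - d) / n + d * (P.T @ r)
        if np.abs(r_next - r).sum() < tol:
            return r_next
        r = r_next

# toy monetary flows among three industries
W = np.array([[0.0, 5.0, 1.0],
              [2.0, 0.0, 4.0],
              [0.0, 3.0, 0.0]])
r = pagerank(W)
```

Industry 1, which receives the largest weighted in-flows, ends up with the highest centrality score.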

  2. 7 CFR 3430.15 - Stakeholder input.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Stakeholder input. 3430.15 Section 3430.15... Stakeholder input. Section 103(c)(2) of the Agricultural Research, Extension, and Education Reform Act of 1998... RFAs for competitive programs. CSREES will provide instructions for submission of stakeholder input in...

  3. 76 FR 63291 - Combined Notice Of Filings #1

    Science.gov (United States)

    2011-10-12

    ... filing per 35: MBR Tariff to be effective 9/23/2011. Filed Date: 09/23/2011. Accession Number: 20110923.... submits tariff filing per 35: MBR Tariff to be effective 9/23/2011. Filed Date: 09/23/2011. Accession.... submits tariff filing per 35: MBR Tariff to be effective 9/23/2011. Filed Date: 09/23/2011. Accession...

  4. Input description for BIOPATH

    International Nuclear Information System (INIS)

    Marklund, J.E.; Bergstroem, U.; Edlund, O.

    1980-01-01

    The computer program BIOPATH describes the flow of radioactivity within a given ecosystem after a postulated release of radioactive material and the resulting dose for specified population groups. The present report accounts for the input data necessary to run BIOPATH. The report also contains descriptions of possible control cards and an input example as well as a short summary of the basic theory. (author)

  5. PRIMUS: a computer code for the preparation of radionuclide ingrowth matrices from user-specified sources

    International Nuclear Information System (INIS)

    Hermann, O.W.; Baes, C.F. III; Miller, C.W.; Begovich, C.L.; Sjoreen, A.L.

    1984-10-01

    The computer program, PRIMUS, reads a library of radionuclide branching fractions and half-lives and constructs a decay-chain data library and a problem-specific decay-chain data file. PRIMUS reads the decay data compiled for 496 nuclides from the Evaluated Nuclear Structure Data File (ENSDF). The ease of adding radionuclides to the input library allows the CRRIS system to further expand its comprehensive data base. The decay-chain library produced is input to the ANEMOS code. Also, PRIMUS produces a data set reduced to only the decay chains required in a particular problem, for input to the SUMIT, TERRA, MLSOIL, and ANDROS codes, which use air concentrations and deposition rates together with the PRIMUS decay-chain data file. Source term data may be entered directly into PRIMUS to be read by MLSOIL, TERRA, and ANDROS. The decay-chain data prepared by PRIMUS are needed for a matrix-operator method that computes time-dependent decay products either from an initial concentration or from a constant input source. This document describes the input requirements and the output obtained. Also, sections are included on methods, applications, subroutines, and sample cases. A short appendix indicates a method of utilizing PRIMUS and the associated decay subroutines from TERRA or ANDROS for applications to other decay problems. 18 references
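The matrix-operator method mentioned above can be illustrated for a two-member chain: collect the decay constants in a matrix A and propagate concentrations with the matrix exponential, N(t) = exp(At) N(0). The chain and constants below are illustrative, not PRIMUS data:

```python
import numpy as np
from scipy.linalg import expm

lam_p, lam_d = 0.1, 0.05          # decay constants of parent and daughter (1/day)
A = np.array([[-lam_p, 0.0],      # parent decays away...
              [ lam_p, -lam_d]])  # ...feeding the daughter, which also decays
N0 = np.array([1000.0, 0.0])      # initial atom counts
t = 10.0
N_t = expm(A * t) @ N0            # concentrations after t days
```

The result matches the analytical Bateman solution for the daughter, N_d(t) = N_p(0) λ_p/(λ_d−λ_p) (e^{−λ_p t} − e^{−λ_d t}), which is the check performed below.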

  6. Hanford analytical sample projections FY 1998 - FY 2002

    International Nuclear Information System (INIS)

    Joyce, S.M.

    1997-01-01

    Sample projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Sample projections are categorized by radiation level, protocol, sample matrix, and program. Analysis requirements are also presented

  7. Titanium-II: an evaluated nuclear data file

    International Nuclear Information System (INIS)

    Philis, C.; Howerton, R.; Smith, A.B.

    1977-06-01

    A comprehensive evaluated nuclear data file for elemental titanium is outlined including definition of the data base, the evaluation procedures and judgments, and the final evaluated results. The file describes all significant neutron-induced reactions with elemental titanium and the associated photon-production processes to incident neutron energies of 20.0 MeV. In addition, isotopic-reaction files, consistent with the elemental file, are separately defined for those processes which are important to applied considerations of material-damage and neutron-dosimetry. The file is formulated in the ENDF format. This report formally documents the evaluation and, together with the numerical file, is submitted for consideration as a part of the ENDF/B-V evaluated file system. 20 figures, 9 tables

  8. A Consistent System for Coding Laboratory Samples

    Science.gov (United States)

    Sih, John C.

    1996-07-01

    A formal laboratory coding system is presented to keep track of laboratory samples. Preliminary useful information regarding the sample (origin and history) is gained without consulting a research notebook. Since this system uses and retains the same research notebook page number for each new experiment (reaction), finding and distinguishing products (samples) of the same or different reactions becomes an easy task. Using this system multiple products generated from a single reaction can be identified and classified in a uniform fashion. Samples can be stored and filed according to stage and degree of purification, e.g. crude reaction mixtures, recrystallized samples, chromatographed or distilled products.
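One way to realize such a notebook-anchored coding scheme (the exact format below is hypothetical, not the author's) is to combine the notebook page, a product letter, and a purification-stage tag:

```python
# hypothetical stage tags for the purification states mentioned above
STAGES = {"X": "crude mixture", "R": "recrystallized",
          "C": "chromatographed", "D": "distilled"}

def sample_code(page, product, stage):
    """Build a code like '42a-C': notebook page 42, product 'a', chromatographed."""
    if stage not in STAGES:
        raise ValueError(f"unknown stage {stage!r}")
    return f"{page}{product}-{stage}"

code = sample_code(42, "a", "C")
```

Because the page number is retained across experiments, two samples coded '42a-X' and '42a-C' are immediately recognizable as the crude and purified forms of the same product.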

  9. Preservation of root canal anatomy using self-adjusting file instrumentation with glide path prepared by 20/0.02 hand files versus 20/0.04 rotary files

    Science.gov (United States)

    Jain, Niharika; Pawar, Ajinkya M.; Ukey, Piyush D.; Jain, Prashant K.; Thakur, Bhagyashree; Gupta, Abhishek

    2017-01-01

    Objectives: To compare the relative axis modification and canal concentricity after glide path preparation with 20/0.02 hand K-file (NITIFLEX®) and 20/0.04 rotary file (HyFlex™ CM) with subsequent instrumentation with 1.5 mm self-adjusting file (SAF). Materials and Methods: One hundred and twenty ISO 15, 0.02 taper, Endo Training Blocks (Dentsply Maillefer, Ballaigues, Switzerland) were acquired and randomly divided into following two groups (n = 60): group 1, establishing glide path till 20/0.02 hand K-file (NITIFLEX®) followed by instrumentation with 1.5 mm SAF; and Group 2, establishing glide path till 20/0.04 rotary file (HyFlex™ CM) followed by instrumentation with 1.5 mm SAF. Pre- and post-instrumentation digital images were processed with MATLAB R 2013 software to identify the central axis, and then superimposed using digital imaging software (Picasa 3.0 software, Google Inc., California, USA) taking five landmarks as reference points. Student's t-test for pairwise comparisons was applied with the level of significance set at 0.05. Results: Training blocks instrumented with 20/0.04 rotary file and SAF were associated less deviation in canal axis (at all the five marked points), representing better canal concentricity compared to those, in which glide path was established by 20/0.02 hand K-files followed by SAF instrumentation. Conclusion: Canal geometry is better maintained after SAF instrumentation with a prior glide path established with 20/0.04 rotary file. PMID:28855752

  10. 76 FR 28018 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-05-13

    ... tariff filing per 35.13(a)(2)(iii: Information Policy Revisions to be effective 6/20/ 2011. Filed Date... Interconnection, L.L.C. Description: PJM Interconnection, L.L.C. submits tariff filing per 35.13(a)(2)(iii: Queue... New Mexico submits tariff filing per 35.13(a)(2)(iii: PNM LGIP Filing to be effective 7/5/2011. Filed...

  11. 75 FR 62381 - Combined Notice of Filings #2

    Science.gov (United States)

    2010-10-08

    ... filing per 35.12: MeadWestvaco Virginia MBR Filing to be effective 9/ 28/2010. Filed Date: 09/29/2010... submits tariff filing per 35.12: City Power MBR Tariff to be effective 9/30/2010. Filed Date: 09/29/2010... Baseline MBR Tariff to be effective 9[sol]29[sol]2010. Filed Date: 09/29/2010. Accession Number: 20100929...

  12. Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments.

    Science.gov (United States)

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-01-05

    We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.
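A minimal example of writing and reading such a file with h5py, sketching the Photon-HDF5 layout at a high level (timestamps stored as integer clock ticks plus a timestamps_unit; the group names follow the published spec in outline, but treat the details here as illustrative and use the phconvert reference library for real files):

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "photons.hdf5")
rng = np.random.default_rng(0)
timestamps = np.sort(rng.integers(0, 10**8, size=1000))  # photon arrivals, clock ticks

with h5py.File(path, "w") as f:
    f["description"] = "toy single-molecule photon stream"
    f["photon_data/timestamps"] = timestamps
    f["photon_data/timestamps_specs/timestamps_unit"] = 12.5e-9  # seconds per tick

with h5py.File(path, "r") as f:
    ts = f["photon_data/timestamps"][:]
    unit = f["photon_data/timestamps_specs/timestamps_unit"][()]
seconds = ts * unit   # arrival times in seconds
```

Storing integer ticks plus a unit, rather than floating-point seconds, preserves the full timing resolution of the acquisition hardware.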

  13. Modeling and generating input processes

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, M.E.

    1987-01-01

    This tutorial paper provides information relevant to the selection and generation of stochastic inputs to simulation studies. The primary focus is multivariate inputs, but much of the philosophy is relevant to univariate inputs as well. 14 refs.
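For the multivariate case discussed above, a standard way to generate correlated stochastic inputs is to sample independent normals and color them with a Cholesky factor of the target covariance (the means and covariance below are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(42)
mean = np.array([10.0, 5.0])
cov = np.array([[4.0, 2.4],
                [2.4, 9.0]])         # desired covariance of the simulation inputs

L = np.linalg.cholesky(cov)          # cov = L @ L.T
z = rng.standard_normal((100_000, 2))
samples = mean + z @ L.T             # correlated multivariate-normal inputs
```

The sample mean and covariance of the generated inputs reproduce the targets up to Monte Carlo error.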

  14. Evaluation of canal transportation after preparation with Reciproc single-file systems with or without glide path files.

    Science.gov (United States)

    Aydin, Ugur; Karataslioglu, Emrah

    2017-01-01

    Canal transportation is a common sequel caused by rotary instruments. The purpose of the present study is to evaluate the degree of transportation after the use of Reciproc single-file instruments with or without glide path files. Thirty resin blocks with L-shaped canals were divided into three groups (n = 10). Group 1 - canals were prepared with the Reciproc-25 file. Group 2 - glide path file G1 was used before Reciproc. Group 3 - glide path files G1 and G2 were used before Reciproc. Pre- and post-instrumentation images were superimposed under a microscope, and the resin removed from the inner and outer surfaces of the root canal was calculated at 10 points. Statistical analysis was performed with the Kruskal-Wallis test and post hoc Dunn test. For the coronal and middle one-thirds, there was no significant difference among groups (P > 0.05). For the apical section, transportation in Group 1 was significantly higher than in the other groups (P < 0.05). Use of glide path files before the Reciproc single-file system reduced the degree of apical canal transportation.
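The Kruskal-Wallis comparison used above is a one-liner in scipy; with made-up transportation measurements in which one group is clearly worse, the test rejects the null hypothesis of equal distributions:

```python
from scipy.stats import kruskal

# hypothetical apical transportation values (mm) per group
group1 = [0.42, 0.39, 0.45, 0.41, 0.44]   # Reciproc only
group2 = [0.21, 0.19, 0.23, 0.22, 0.20]   # glide path G1 first
group3 = [0.20, 0.22, 0.18, 0.21, 0.19]   # glide path G1 + G2 first
H, p = kruskal(group1, group2, group3)
```

A p-value below 0.05 would then be followed by pairwise post hoc Dunn tests, as in the study.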

  15. Data Qualification Report For: Thermodynamic Data File, DATA0.YMP.R0 For Geochemical Code, EQ3/6 

    Energy Technology Data Exchange (ETDEWEB)

    P.L. Cloke

    2001-10-16

    The objective of this work is to evaluate the adequacy of chemical thermodynamic data provided by Lawrence Livermore National Laboratory (LLNL) as Data0.ymp.R0A in response to an input request submitted under AP-3.14Q. This request specified that the chemical thermodynamic data available in the file Data0.com.R2 be updated, improved, and augmented for use in geochemical modeling in Process Model Reports (PMRs) for Engineered Barrier Systems, Waste Form, Waste Package, Unsaturated Zone, and Near Field Environment, as well as for Performance Assessment. The data are qualified in the temperature range 0 to 100 °C. Several Data Tracking Numbers (DTNs) associated with Analysis/Model Reports (AMRs) addressing various aspects of the post-closure chemical behavior of the waste package and the Engineered Barrier System rely on EQ3/6 outputs for which these data are used as input, and the data are therefore a Principal Factor for those analyses. This qualification activity was accomplished in accordance with AP-SIII.2Q using the Technical Assessment method. A development plan, TDP-EBS-MD-000044, was prepared in accordance with AP-2.13Q and approved by the Responsible Manager. In addition, a Process Control Evaluation was performed in accordance with AP-SV.1Q. The qualification method, selected in accordance with AP-SIII.2Q, was Technical Assessment. The rationale for this approach is that the data in file Data0.com.R2 are considered handbook data and therefore do not themselves require qualification; only changes to Data0.com.R2 required qualification. A new file has been produced which contains the database Data0.ymp.R0, which is recommended for qualification as a result of this action. Data0.ymp.R0 will supersede Data0.com.R2 for all Yucca Mountain Project (YMP) activities.

  16. 77 FR 13587 - Combined Notice of Filings

    Science.gov (United States)

    2012-03-07

    .... Applicants: Transcontinental Gas Pipe Line Company. Description: Annual Electric Power Tracker Filing... Company. Description: 2012 Annual Fuel and Electric Power Reimbursement to be effective 4/1/2012. Filed... submits tariff filing per 154.403: Storm Surcharge 2012 to be effective 4/1/2012. Filed Date: 3/1/12...

  17. 10 CFR 2.302 - Filing of documents.

    Science.gov (United States)

    2010-01-01

    ... this part shall be electronically transmitted through the E-Filing system, unless the Commission or... all methods of filing have been completed. (e) For filings by electronic transmission, the filer must... digital ID certificates, the NRC permits participants in the proceeding to access the E-Filing system to...

  18. 5 CFR 1201.14 - Electronic filing procedures.

    Science.gov (United States)

    2010-01-01

    ... form. (b) Matters subject to electronic filing. Subject to the registration requirement of paragraph (e) of this section, parties and representatives may use electronic filing (e-filing) to do any of the...). (d) Internet is sole venue for electronic filing. Following the instructions at e-Appeal Online, the...

  19. Wave energy input into the Ekman layer

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper is concerned with the wave energy input into the Ekman layer, motivated by observational evidence that surface waves can significantly affect the profile of the Ekman layer. Under the assumption of constant vertical diffusivity, the analytical form of the wave energy input into the Ekman layer is derived. Analysis of the energy balance shows that the energy input to the Ekman layer through the wind stress and through the interaction of the Stokes drift with planetary vorticity can be divided into two kinds. One is the wind energy input, and the other is the wave energy input, which depends on wind speed, wave characteristics, and the wind direction relative to the wave direction. Estimates show that the wave energy input can be up to 10% in high-latitude, high-wind-speed areas and higher than 20% in the Antarctic Circumpolar Current, compared with the wind energy input into the classical Ekman layer. The results of this paper are of significance to the study of wave-induced large-scale effects.

  20. The File System Interface is an Anachronism

    OpenAIRE

    Ellard, Daniel

    2003-01-01

    Contemporary file systems implement a set of abstractions and semantics that are suboptimal for many (if not most) purposes. The philosophy of using the simple mechanisms of the file system as the basis for a vast array of higher-level mechanisms leads to inefficient and incorrect implementations. We propose several extensions to the canonical file system model, including explicit support for lock files, indexed files, and resource forks, and the benefit of session semantics for write updates...

  1. Microstructural characterization of the HAZ of the AISI 439 with different heat input

    International Nuclear Information System (INIS)

    Silva, Lorena de Azevedo; Lima, Luciana Iglesias Lourenco; Campos, Wagner Reis da Costa

    2007-01-01

    Ferritic stainless steels have useful corrosion properties, such as resistance to chloride corrosion in oxidizing aqueous media, oxidation resistance at high temperatures, etc. They are suitable for aqueous chloride environments, heat transfer applications, condenser tubing for fresh-water power plants, and industrial buildings; recently, ferritic stainless steels have also received attention owing to their superior performance under irradiation. These applications sometimes require the use of welding processes. The objective of the present work was to investigate the relationship between microstructure and microhardness in the heat-affected zone (HAZ) of AISI 439 for two different heat inputs. The base metal shows a random distribution of precipitates. The HAZ size, grain size, and amount of precipitates all increased for the higher-heat-input weld. Precipitation occurred in greater amounts in the sample with the higher heat input and increased the microhardness. It was observed that the grain size is related to the heat input, and that the microhardness is more strongly related to other features, such as carbide and nitride precipitation. (author)

  2. High School and Beyond: Twins and Siblings' File Users' Manual, User's Manual for Teacher Comment File, Friends File Users' Manual.

    Science.gov (United States)

    National Center for Education Statistics (ED), Washington, DC.

    These three users' manuals are for specific files of the High School and Beyond Study, a national longitudinal study of high school sophomores and seniors in 1980. The three files are computerized databases that are available on magnetic tape. As one component of base year data collection, information identifying twins, triplets, and some non-twin…

  3. Merging Multiple Files in SPSS/PC

    Directory of Open Access Journals (Sweden)

    Syahrudji Naseh

    2012-09-01

    Full Text Available Computer software can basically be divided into five broad groups: word processing, spreadsheets, databases, statistics, and animation/desktop publishing. Each has its strengths and weaknesses. dBase III+, the most popular database software, can hold only 128 variables per file. Consequently, for a large questionnaire such as Susenas (the National Socioeconomic Survey) or SKRT (the Household Health Survey), the data cannot be stored as a single file. They are usually split into many files, for example file1.dbf, file2.dbf, and so on. The problem is how to merge some variables in file1.dbf with some variables in file5.dbf. This paper attempts to address that problem.
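The merge problem posed above - combining variables split across files that share a case identifier - is exactly what a relational join does. A pandas sketch with invented variables (SPSS itself does this with MATCH FILES):

```python
import pandas as pd

# hypothetical pieces of one questionnaire, split by the 128-variable limit
file1 = pd.DataFrame({"id": [1, 2, 3], "age": [34, 51, 29]})
file5 = pd.DataFrame({"id": [1, 2, 3], "income": [1200, 800, 950]})

merged = pd.merge(file1, file5, on="id", how="inner")  # join on the case ID
```

Each respondent's variables from both files now sit on a single row keyed by `id`.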

  4. 76 FR 70651 - Fee for Filing a Patent Application Other Than by the Electronic Filing System

    Science.gov (United States)

    2011-11-15

    ... government; or (3) preempt tribal law. Therefore, a tribal summary impact statement is not required under... 0651-AC64 Fee for Filing a Patent Application Other Than by the Electronic Filing System AGENCY: United..., that is not filed by electronic means as prescribed by the Director of the United States Patent and...

  5. Quantification of regional cerebral blood flow (rCBF) measurement with one point sampling by sup 123 I-IMP SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Munaka, Masahiro [University of Occupational and Environmental Health, Kitakyushu (Japan); Iida, Hidehiro; Murakami, Matsutaro

    1992-02-01

    A handy method of quantifying regional cerebral blood flow (rCBF) measurement by {sup 123}I-IMP SPECT was designed. A standard input function was constructed, and the sampling time used to calibrate this standard input function by one-point sampling was optimized. An average standard input function was obtained from continuous arterial sampling of 12 healthy adults. The optimal sampling time was the one that minimized the difference between the integral of the standard input function calibrated by one-point sampling and that of the input function obtained by continuous arterial sampling. This time was 8 minutes after an intravenous injection of {sup 123}I-IMP, and the error was estimated to be {+-}4.1%. The rCBF values obtained by this method were evaluated by comparing them with rCBF values based on the input function from continuous arterial sampling in 2 healthy adults and a patient with cerebral infarction. A significant correlation (r=0.764, p<0.001) was obtained between the two. (author).
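The one-point calibration described above amounts to rescaling a population-average input curve so that it passes through the single arterial sample. A numerical sketch (the curve shape and all numbers are invented; only the rescaling logic mirrors the method):

```python
import numpy as np

t = np.linspace(0.0, 20.0, 201)        # minutes after injection
standard = np.exp(-t / 5.0)            # hypothetical average input function (a.u.)

sample_t, sample_value = 8.0, 0.165    # single arterial sample at the optimal time
scale = sample_value / np.interp(sample_t, t, standard)
calibrated = standard * scale          # patient-specific input function

# trapezoidal integral of the calibrated input function, as used in rCBF computation
area = float(np.sum(0.5 * (calibrated[1:] + calibrated[:-1]) * np.diff(t)))
```

The calibrated curve agrees with the measured sample at 8 minutes by construction, and its integral then feeds the rCBF calculation.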

  6. Distribution Development for STORM Ingestion Input Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Fulton, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    The Sandia-developed Transport of Radioactive Materials (STORM) code suite is used as part of the Radioisotope Power System Launch Safety (RPSLS) program to perform statistical modeling of the consequences due to release of radioactive material given a launch accident. As part of this modeling, STORM samples input parameters from probability distributions, with some parameters treated as constants. This report describes the work done to convert four of these constant inputs (Consumption Rate, Average Crop Yield, Cropland to Landuse Database Ratio, and Crop Uptake Factor) to sampled values. Consumption Rate changed from a constant value of 557.68 kg/yr to a normal distribution with a mean of 102.96 kg/yr and a standard deviation of 2.65 kg/yr. Meanwhile, Average Crop Yield changed from a constant value of 3.783 kg edible/m^2 to a normal distribution with a mean of 3.23 kg edible/m^2 and a standard deviation of 0.442 kg edible/m^2. The Cropland to Landuse Database Ratio changed from a constant value of 0.0996 (9.96%) to a normal distribution with a mean value of 0.0312 (3.12%) and a standard deviation of 0.00292 (0.29%). Finally, the Crop Uptake Factor changed from a constant value of 6.37e-4 (Bq_crop/kg)/(Bq_soil/kg) to a lognormal distribution with a geometric mean value of 3.38e-4 (Bq_crop/kg)/(Bq_soil/kg) and a standard deviation value of 3.33 (Bq_crop/kg)/(Bq_soil/kg)
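Sampling the converted parameters is straightforward with numpy; note that the lognormal generator takes the mean and standard deviation of the underlying log, so a geometric mean enters through its logarithm. This is a sketch of the distributions reported above, not STORM code, and it assumes the reported 3.33 spread acts as a geometric (multiplicative) standard deviation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

consumption = rng.normal(102.96, 2.65, n)       # kg/yr
crop_yield = rng.normal(3.23, 0.442, n)         # kg edible/m^2
cropland_ratio = rng.normal(0.0312, 0.00292, n)

gm, gsd = 3.38e-4, 3.33                         # geometric mean and assumed geometric spread
uptake = rng.lognormal(np.log(gm), np.log(gsd), n)
```

The median of the lognormal draws recovers the geometric mean, which is one quick sanity check on the parameterization.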

  7. Earnings Public-Use File, 2006

    Data.gov (United States)

    Social Security Administration — Social Security Administration released Earnings Public-Use File (EPUF) for 2006. File contains earnings information for individuals drawn from a systematic random...

  8. 12 CFR 509.10 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Filing of papers. 509.10 Section 509.10 Banks... IN ADJUDICATORY PROCEEDINGS Uniform Rules of Practice and Procedure § 509.10 Filing of papers. (a) Filing. Any papers required to be filed, excluding documents produced in response to a discovery request...

  9. 47 CFR 61.14 - Method of filing publications.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Method of filing publications. 61.14 Section 61...) TARIFFS Rules for Electronic Filing § 61.14 Method of filing publications. (a) Publications filed... date of a publication received by the Electronic Tariff Filing System will be determined by the date...

  10. The relation between input-output transformation and gastrointestinal nematode infections on dairy farms.

    Science.gov (United States)

    van der Voort, M; Van Meensel, J; Lauwers, L; Van Huylenbroeck, G; Charlier, J

    2016-02-01

    Efficiency analysis is used for assessing links between the technical efficiency (TE) of livestock farms and animal diseases. However, previous studies often do not make the link with the allocation of inputs and mainly present average effects that ignore the often huge differences among farms. In this paper, we studied the relationship between exposure to gastrointestinal (GI) nematode infections, TE, and the input allocation on dairy farms. Although the traditional cost allocative efficiency (CAE) indicator adequately measures how a given input allocation differs from the cost-minimising input allocation, it does not represent the unique input allocation of farms: similar CAE scores may be obtained for farms with different input allocations. Therefore, we propose an adjusted allocative efficiency index (AAEI) to measure the unique input allocation of farms. Combining this AAEI with the TE score allows determining the unique input-output position of each farm. The method is illustrated by estimating efficiency scores using data envelopment analysis (DEA) on a sample of 152 dairy farms in Flanders for which both accountancy and parasitic monitoring data were available. Three groups of farms with a different input-output position can be distinguished based on cluster analysis: (1) technically inefficient farms, with a relatively low use of concentrates per 100 l milk and a high exposure to infection; (2) farms with an intermediate TE, relatively high use of concentrates per 100 l milk and a low exposure to infection; (3) farms with the highest TE, relatively low roughage use per 100 l milk and a relatively high exposure to infection. Correlation analysis indicates for each group how the level of exposure to GI nematodes is associated or not with improved economic performance. The results suggest that improving both the economic performance and exposure to infection seems only of interest for highly TE farms. The findings indicate that current farm recommendations
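The DEA efficiency scores used above come from solving a small linear program per farm. A minimal input-oriented, constant-returns (CCR) sketch with scipy.optimize.linprog on a toy three-farm, two-input, one-output example - not the paper's full model, which also handles allocative efficiency:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0: minimize theta such that a
    nonnegative combination of peers uses <= theta * j0's inputs and
    produces >= j0's output."""
    n, m = X.shape                                   # n units, m inputs
    c = np.r_[1.0, np.zeros(n)]                      # variables: [theta, lambda_1..n]
    A_in = np.c_[-X[j0], X.T]                        # sum_j lam_j x_ij <= theta * x_i,j0
    A_out = np.r_[0.0, -Y][None, :]                  # sum_j lam_j y_j  >= y_j0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[j0]])
    return res.fun

X = np.array([[2.0, 4.0],      # farm A: concentrates, roughage (arbitrary units)
              [4.0, 2.0],      # farm B
              [4.0, 4.0]])     # farm C uses more of everything
Y = np.array([1.0, 1.0, 1.0])  # identical milk output
theta_c = dea_efficiency(X, Y, 2)
```

Farm C could produce its output with 75% of its inputs (a 50/50 blend of A and B), so its TE score is 0.75, while farms A and B sit on the frontier with a score of 1.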

  11. Input-output supervisor; Le superviseur d'entree-sortie dans les ordinateurs

    Energy Technology Data Exchange (ETDEWEB)

    Dupuy, R. [Commissariat a l' Energie Atomique, Vaujours (France). Centre d' Etudes Nucleaires

    1970-07-01

    The input-output supervisor is the program that manages the flow of information between core storage and the peripheral devices of a computer. This work is composed of three parts: 1 - Study of a generalized, theoretical input-output supervisor which, under a number of simplifying assumptions, reduces to most of the input-output supervisors currently running on computers. 2 - Application of this theory to a concrete case, the management of a magnetic drum. 3 - Hardware suggestions to facilitate time-sharing. (author)

  12. 12 CFR 263.10 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Filing of papers. 263.10 Section 263.10 Banks... OF PRACTICE FOR HEARINGS Uniform Rules of Practice and Procedure § 263.10 Filing of papers. (a) Filing. Any papers required to be filed, excluding documents produced in response to a discovery request...

  13. 12 CFR 308.10 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Filing of papers. 308.10 Section 308.10 Banks... AND PROCEDURE Uniform Rules of Practice and Procedure § 308.10 Filing of papers. (a) Filing. Any papers required to be filed, excluding documents produced in response to a discovery request pursuant to...

  14. 29 CFR 1981.103 - Filing of discrimination complaint.

    Science.gov (United States)

    2010-07-01

    ... constitute the violations. (c) Place of filing. The complaint should be filed with the OSHA Area Director... or she has been discriminated against by an employer in violation of the Act may file, or have filed..., but may be filed with any OSHA officer or employee. Addresses and telephone numbers for these...

  15. 77 FR 66458 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-11-05

    ... Service Company of Colorado. Description: 2012--10--26 PSCo MBR Filing to be effective 12/26/ 2012. Filed...--SPS MBR Filing to be effective 12/26/2012. Filed Date: 10/26/12. Accession Number: 20121026-5123...: Revised Application for MBR Authorization to be effective 10/16/2012. Filed Date: 10/25/12. Accession...

  16. 75 FR 66075 - Combined Notice of Filings #1

    Science.gov (United States)

    2010-10-27

    ....12: Baseline MBR Concurrence to be effective 10/8/2010. Filed Date: 10/19/2010. Accession Number... Company submits tariff filing per 35.12: Baseline MBR Concurrence to be effective 10/8/2010. Filed Date... Power Company submits tariff filing per 35.12: Baseline MBR Concurrence to be effective 10/8/2010. Filed...

  17. The Global File System

    Science.gov (United States)

    Soltis, Steven R.; Ruwart, Thomas M.; OKeefe, Matthew T.

    1996-01-01

    The global file system (GFS) is a prototype design for a distributed file system in which cluster nodes physically share storage devices connected via a network such as Fibre Channel. Networks and network-attached storage devices have advanced to a level of performance and extensibility such that the previous disadvantages of shared-disk architectures are no longer valid. This shared storage architecture attempts to exploit the sophistication of storage device technologies, whereas a server architecture diminishes a device's role to that of a simple component. GFS distributes the file system responsibilities across processing nodes, storage across the devices, and file system resources across the entire storage pool. GFS caches data on the storage devices instead of in the main memories of the machines. Consistency is established by using a locking mechanism maintained by the storage devices to facilitate atomic read-modify-write operations. The locking mechanism is being prototyped in the Silicon Graphics IRIX operating system and is accessed using standard Unix commands and modules.
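
    The device-maintained locking idea can be illustrated with a toy in-process model: the "device" owns both the data blocks and the per-block locks, and client nodes must hold the device's lock to perform an atomic read-modify-write. `StorageDevice`, `atomic_increment`, and all names here are hypothetical stand-ins for the device-level lock primitive the abstract describes, not GFS code.

    ```python
    import threading

    class StorageDevice:
        """Toy model of a GFS-style storage device: the device, not the
        client nodes, holds the data blocks and the locks that serialise
        access to them."""

        def __init__(self):
            self.blocks = {}
            self._locks = {}
            self._meta = threading.Lock()   # protects lazy lock creation

        def _lock_for(self, block_id):
            with self._meta:
                return self._locks.setdefault(block_id, threading.Lock())

        def acquire(self, block_id):
            self._lock_for(block_id).acquire()

        def release(self, block_id):
            self._lock_for(block_id).release()

        def read(self, block_id, default=None):
            return self.blocks.get(block_id, default)

        def write(self, block_id, value):
            self.blocks[block_id] = value

    def atomic_increment(device, block_id):
        """A client node's atomic read-modify-write under the device lock."""
        device.acquire(block_id)
        try:
            device.write(block_id, device.read(block_id, 0) + 1)
        finally:
            device.release(block_id)
    ```

    With several threads standing in for cluster nodes, every increment survives because the read-modify-write happens entirely under the device's lock; without it, concurrent updates to the same block could be lost.
    
    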

  18. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs; (ii) generation of samples from uncertain analysis inputs; (iii) propagation of sampled inputs through an analysis; (iv) presentation of uncertainty analysis results; and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition.
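
    The sample-generation, propagation, and rank-transformation steps listed above can be sketched end-to-end with a toy model. The uniform input distributions, the model `y = 5*x1 + 0.1*x2`, and the hand-rolled Spearman correlation are illustrative assumptions, not taken from the survey.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Generate a sample of the uncertain inputs (simple random sampling from
    # uniform distributions standing in for the epistemic characterisation).
    n = 1000
    x1 = rng.uniform(0.0, 1.0, n)   # strongly influential input (assumed)
    x2 = rng.uniform(0.0, 1.0, n)   # weakly influential input (assumed)

    # Propagate the sampled inputs through the analysis (toy model).
    y = 5.0 * x1 + 0.1 * x2

    def rank_corr(a, b):
        """Spearman rank correlation: rank-transform, then Pearson correlation."""
        ra = np.argsort(np.argsort(a))
        rb = np.argsort(np.argsort(b))
        return np.corrcoef(ra, rb)[0, 1]

    # Sensitivity step: the rank correlation singles out x1 as dominant.
    print(rank_corr(x1, y), rank_corr(x2, y))
    ```

    On this toy model the rank correlation for `x1` is close to 1 while that for `x2` is near 0, which is exactly the kind of ranking the surveyed procedures formalise.
    
    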

  19. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition.

  20. Statistical identification of effective input variables

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1982-09-01

    A statistical sensitivity analysis procedure has been developed for ranking the input data of large computer codes in order of sensitivity-importance. The method is economical for large codes with many input variables, since it uses a relatively small number of computer runs. No prior judgemental elimination of input variables is needed. The screening method is based on stagewise correlation and extensive regression analysis of output values calculated with selected input value combinations. The regression process deals with multivariate nonlinear functions, and statistical tests are also available for identifying input variables that contribute to threshold effects, i.e., discontinuities in the output variables. A computer code SCREEN has been developed for implementing the screening techniques. The efficiency has been demonstrated by several examples and applied to a fast reactor safety analysis code (Venus-II). However, the methods and the coding are general and not limited to such applications.
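
    The stagewise-correlation idea behind this kind of screening can be sketched as a greedy residual-correlation ranking: pick the input most correlated with the output, regress its effect out, and repeat on the residual. This is a simplified linear illustration (no threshold tests), and the function and variable names are hypothetical, not from the SCREEN code.

    ```python
    import numpy as np

    def stagewise_screen(X, y, n_select):
        """Rank inputs by stagewise correlation: repeatedly pick the column
        most correlated with the current residual, regress it out, and
        continue with the new residual."""
        resid = y - y.mean()
        remaining = list(range(X.shape[1]))
        order = []
        for _ in range(n_select):
            corrs = [abs(np.corrcoef(X[:, j], resid)[0, 1]) for j in remaining]
            j = remaining.pop(int(np.argmax(corrs)))
            order.append(j)
            x = X[:, j] - X[:, j].mean()
            beta = (x @ resid) / (x @ x)   # one-variable least squares fit
            resid = resid - beta * x       # remove this input's contribution
        return order
    ```

    On synthetic data where the output depends strongly on one input and weakly on another, the procedure recovers that importance ordering without any prior elimination of variables, which is the economy the abstract emphasises.
    
    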