WorldWideScience

Sample records for irradiation computer file

  1. Small file aggregation in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
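
    The scheme in the abstract above amounts to concatenating the small files and keeping an (offset, length) entry per file. The sketch below is a minimal Python illustration; the function names and the JSON index format are assumptions for illustration, not the patented implementation.

        # Minimal sketch: pack small files into one aggregated file and keep
        # (offset, length) metadata so any original file can be unpacked later.
        import json
        import os

        def aggregate(paths, bundle_path, index_path):
            index, offset = {}, 0
            with open(bundle_path, "wb") as bundle:
                for path in paths:
                    data = open(path, "rb").read()
                    bundle.write(data)
                    index[os.path.basename(path)] = {"offset": offset, "length": len(data)}
                    offset += len(data)
            with open(index_path, "w") as f:
                json.dump(index, f)

        def unpack(name, bundle_path, index_path):
            entry = json.load(open(index_path))[name]
            with open(bundle_path, "rb") as bundle:
                bundle.seek(entry["offset"])
                return bundle.read(entry["length"])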

  2. Algorithms and file structures for computational geometry

    International Nuclear Information System (INIS)

    Hinrichs, K.; Nievergelt, J.

    1983-01-01

    Algorithms for solving geometric problems and file structures for storing large amounts of geometric data are of increasing importance in computer graphics and computer-aided design. As examples of recent progress in computational geometry, we explain plane-sweep algorithms, which solve various topological and geometric problems efficiently; and we present the grid file, an adaptable, symmetric multi-key file structure that provides efficient access to multi-dimensional data along any space dimension. (orig.)

  3. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
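
    As a rough illustration of the list-based index described above, the sketch below keeps, for each file, a list of replica locations plus a checksum used to validate whichever copy is read. The node/path layout and the choice of SHA-256 are assumptions, not details from the patent.

        # Hypothetical index entry: a list of storage locations for the file and
        # its replica, plus a checksum for validating either copy.
        import hashlib

        index = {
            "results.dat": {
                "locations": ["node03:/scratch/results.dat",   # primary copy
                              "node17:/scratch/results.dat"],  # replica
                "sha256": None,  # recorded when the file is first stored
            }
        }

        def record_checksum(entry, data: bytes) -> None:
            entry["sha256"] = hashlib.sha256(data).hexdigest()

        def validate(entry, data: bytes) -> bool:
            # True if the bytes read from any listed location match the index
            return hashlib.sha256(data).hexdigest() == entry["sha256"]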

  4. WinSCP for Windows File Transfers | High-Performance Computing | NREL

    Science.gov (United States)

    WinSCP can be used to securely transfer files between your local computer running Microsoft Windows and a remote computer running Linux.

  5. 22 CFR 1429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Computation of time for filing papers. 1429.21... MISCELLANEOUS AND GENERAL REQUIREMENTS General Requirements § 1429.21 Computation of time for filing papers. In... subchapter requires the filing of any paper, such document must be received by the Board or the officer or...

  6. RAMA: A file system for massively parallel computers

    Science.gov (United States)

    Miller, Ethan L.; Katz, Randy H.

    1993-01-01

    This paper describes a file system design for massively parallel computers which makes very efficient use of a few disks per processor. This overcomes the traditional I/O bottleneck of massively parallel machines by storing the data on disks within the high-speed interconnection network. In addition, the file system, called RAMA, requires little inter-node synchronization, removing another common bottleneck in parallel processor file systems. Support for a large tertiary storage system can easily be integrated into the file system; in fact, RAMA runs most efficiently when tertiary storage is used.

  7. Storing files in a parallel computing system based on user-specified parser function

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
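
    The following is a small sketch of the parser hook idea described above: the application supplies a function that decides whether each file is stored and what metadata is kept with it. The function signature and the storage dictionary are assumptions for illustration, not the patented interface.

        # The application-supplied parser returns (keep, metadata) for each file;
        # only files that satisfy the parser's semantic check are stored.
        def store_with_parser(files, parser, store):
            for name, data in files.items():
                keep, meta = parser(name, data)
                if keep:
                    store[name] = {"data": data, "meta": meta}

        # Example parser: keep only non-empty files and record their first line
        # as searchable metadata.
        def first_line_parser(name, data: bytes):
            if not data:
                return False, None
            return True, {"first_line": data.splitlines()[0]}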

  8. Arranging and finding folders and files on your Windows 7 computer

    CERN Document Server

    Steps, Studio Visual

    2014-01-01

    If you have lots of documents on your desk, it may prove to be impossible to find the document you are looking for. In order to easily find certain documents, they are often stored in a filing cabinet and arranged in a logical order. The folders on your computer serve the same purpose. They do not just contain files; they can also contain other folders. You can create an unlimited number of folders, and each folder can contain any number of subfolders and files. You can use Windows Explorer, also called the folder window, to work with the files and folders on your computer. You can copy, delete, move, find, and sort files, among other things. Or you can transfer files and folders to a USB stick, an external hard drive, a CD, DVD or Blu-Ray disk. In this practical guide we will show you how to use the folder window, and help you arrange your own files.

  9. A digital imaging teaching file by using the internet, HTML and personal computers

    International Nuclear Information System (INIS)

    Chun, Tong Jin; Jeon, Eun Ju; Baek, Ho Gil; Kang, Eun Joo; Baik, Seung Kug; Choi, Han Yong; Kim, Bong Ki

    1996-01-01

    A film-based teaching file takes up space, and the need to search through such a file limits the extent to which it is likely to be used. Furthermore, it is not easy for doctors in a medium-sized hospital to experience a variety of cases, so for these reasons we created an easy-to-use digital imaging teaching file with HTML (Hypertext Markup Language) and images downloaded via World Wide Web (WWW) services on the Internet. This was suitable for use by computer novices. We used WWW Internet services as a resource for various images and three different IBM-PC compatible computers (386DX, 486DX-II, and Pentium) for downloading the images and developing the digitized teaching file. These computers were connected to the Internet through a high-speed dial-up modem (28.8 Kbps); Twinsock and Netscape were used to navigate the Internet. A Korean word processing package (version 3.0) was used to create the HTML files, and the downloaded images were linked to them. In this way, a digital imaging teaching file program was created. Access to a Web service via the Internet required a high-speed computer (at least a 486DX-II with 8 MB RAM) for comfortable use; this also ensured that the quality of downloaded images was not degraded during downloading and that they were good enough to use as a teaching file. The time needed to retrieve the text and related images depends on the size of the file, the speed of the network, and the network traffic at the time of connection. For computer novices, a digital imaging teaching file using HTML is easy to use. Our method of creating a digital imaging teaching file using the Internet and HTML is straightforward, and radiologists with little computer experience who want to study a variety of digital radiologic imaging cases would find it easy to use.

  10. Mobile computing device configured to compute irradiance, glint, and glare of the sun

    Science.gov (United States)

    Gupta, Vipin P; Ho, Clifford K; Khalsa, Siri Sahib

    2014-03-11

    Described herein are technologies pertaining to computing the solar irradiance distribution on a surface of a receiver in a concentrating solar power system or glint/glare emitted from a reflective entity. A mobile computing device includes at least one camera that captures images of the Sun and the entity of interest, wherein the images have pluralities of pixels having respective pluralities of intensity values. Based upon the intensity values of the pixels in the respective images, the solar irradiance distribution on the surface of the entity or glint/glare corresponding to the entity is computed by the mobile computing device.

  11. NET: an inter-computer file transfer command

    International Nuclear Information System (INIS)

    Burris, R.D.

    1978-05-01

    The NET command was defined and supported in order to facilitate file transfer between computers. Among the goals of the implementation were greatest possible ease of use, maximum power (i.e., support of a diversity of equipment and operations), and protection of the operating system

  12. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
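
    One simple way to realize the "different sub-set of data elements" replicas mentioned above is to keep every k-th sample. The sketch below is only an illustration; how the semantic information chooses the resolutions is application-specific and assumed here.

        # Generate lower-resolution replicas of a sequence of samples by keeping
        # every k-th element; stride 1 is the full-resolution original.
        def make_replicas(samples, strides=(1, 4, 16)):
            return {f"1/{k}": samples[::k] for k in strides}

        replicas = make_replicas(list(range(1000)))
        # replicas["1/16"] keeps 63 of the original 1000 samples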

  13. Computer-controlled gamma-ray scanner for irradiated reactor fuel

    International Nuclear Information System (INIS)

    Mandler, J.W.; Coates, R.A.; Killian, E.W.

    1979-01-01

    Gamma-ray scanning of irradiated fuel is an important nondestructive technique used in the thermal fuels behavior program currently under way at the Idaho National Engineering Laboratory. This paper is concerned with the computer-controlled isotopic gamma-ray-scanning system developed for postirradiation examination of fuel and includes a brief discussion of some scan results obtained from fuel rods irradiated in the Power-Burst Facility to illustrate gamma-ray spectrometry for this application. Both burnup profiles and information concerning fission-product migration in irradiated fuel are routinely obtained with the computer-controlled system

  14. Computational Modeling of Ablation on an Irradiated Target

    Science.gov (United States)

    Mehmedagic, Igbal; Thangam, Siva

    2017-11-01

    Computational modeling of pulsed nanosecond laser interaction with an irradiated metallic target is presented. The model formulation involves ablation of the metallic target irradiated by a pulsed high-intensity laser at normal atmospheric conditions. Computational findings based on effective representation and prediction of the heat transfer, melting and vaporization of the target material, as well as plume formation and expansion, are presented along with their relevance for the development of protective shields. In this context, the available results for a representative irradiation from a 1064 nm laser pulse are used to analyze various ablation mechanisms, variable thermo-physical and optical properties, plume expansion and surface geometry. Funded in part by U. S. Army ARDEC, Picatinny Arsenal, NJ.

  15. Documentation of CATHENA input files for the APOLLO computer

    International Nuclear Information System (INIS)

    1988-06-01

    Input files created for the VAX version of the CATHENA two-fluid code have been modified and documented for simulation on the AECB's APOLLO computer system. The input files describe the RD-14 thermalhydraulic loop, the RD-14 steam generator, the RD-12 steam generator blowdown test facility, the Stern Laboratories Cold Water Injection Facility (CWIT), and a CANDU 600 reactor. Sample CATHENA predictions are given and compared with experimental results where applicable. 24 refs

  16. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.

  17. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

    For the systematic management and easy use of the teaching file in a radiology department, the authors set up a database management system for the teaching file using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) with an image capture card (Window Vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8 mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modalities, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 X 390 to 545 X 414, 256 gray levels) and displayed on a 17-inch flat monitor (1024 X 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply on the reading viewbox. The image quality on the computer monitor was lower than that of the original film on the viewbox, but in general the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for teaching file purposes. Without expensive equipment, we were able to complete an image database system for a teaching file using a personal computer by a relatively inexpensive method

  18. Processing of evaluated neutron data files in ENDF format on personal computers

    International Nuclear Information System (INIS)

    Vertes, P.

    1991-11-01

    A computer code package - FDMXPC - has been developed for processing evaluated data files in ENDF format. The earlier version of this package is supplemented with modules performing calculations using Reich-Moore and Adler-Adler resonance parameters. The processing of evaluated neutron data files by personal computers requires special programming considerations outlined in this report. The scope of the FDMXPC program system is demonstrated by means of numerical examples. (author). 5 refs, 4 figs, 4 tabs

  19. Survey on Security Issues in File Management in Cloud Computing Environment

    Science.gov (United States)

    Gupta, Udit

    2015-06-01

    Cloud computing has pervaded every aspect of information technology in the past decade. With the advent of cloud networks it has become easier to process, in real time, the plethora of data generated by various devices. The privacy of users' data is maintained by data centers around the world, and hence it has become feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer, either internally within a cloud or externally from one cloud network to another. File management is central to cloud computing, and it is paramount to address the security concerns that arise from it. This survey paper aims to elucidate the various protocols that can be used for secure file transfer and to analyze the ramifications of using each protocol.

  20. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    Science.gov (United States)

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004

  1. A technique for integrating remote minicomputers into a general computer's file system

    CERN Document Server

    Russell, R D

    1976-01-01

    This paper describes a simple technique for interfacing remote minicomputers used for real-time data acquisition into the file system of a central computer. Developed as part of the ORION system at CERN, this 'File Manager' subsystem enables a program in the minicomputer to access and manipulate files of any type as if they resided on a storage device attached to the minicomputer. Yet, completely transparent to the program, the files are accessed from disks on the central system via high-speed data links, with response times comparable to local storage devices. (6 refs).

  2. Computational model of gamma irradiation room at ININ

    Science.gov (United States)

    Rodríguez-Romo, Suemi; Patlan-Cardoso, Fernando; Ibáñez-Orozco, Oscar; Vergara Martínez, Francisco Javier

    2018-03-01

    In this paper, we present a model of the gamma irradiation room at the National Institute of Nuclear Research (ININ is its acronym in Spanish) in Mexico to improve the use of physics in dosimetry for human protection. We deal with air-filled ionization chambers and scientific computing made in house and framed in both the GEANT4 scheme and our analytical approach to characterize the irradiation room. This room is the only secondary dosimetry facility in Mexico. Our aim is to optimize its experimental designs, facilities, and industrial applications of physical radiation. The computational results provided by our model are supported by all the known experimental data regarding the performance of the ININ gamma irradiation room and allow us to predict the values of the main variables related to this fully enclosed space to within an acceptable margin of error.

  3. A software to report and file by personal computer

    International Nuclear Information System (INIS)

    Di Giandomenico, E.; Filippone, A.; Esposito, A.; Bonomo, L.

    1989-01-01

    During the past four years the authors have been gaining experience in reporting radiological examinations by personal computer. Here they describe the design of new software which allows the reporting and filing of roentgenograms. This program was written by a radiologist using a well-known database management system, dBASE III. The program was shaped to fit the radiologist's needs: it helps to report, and allows filing of, radiological data using the diagnostic codes of the American College of Radiology. In this paper the authors describe the database structure and the software functions that make its use possible. Thus, this paper is not aimed at advertising a new reporting program, but at demonstrating how radiologists can themselves manage some aspects of their work with the help of a personal computer

  4. ERX: a software for editing files containing X-ray spectra to be used in exposure computational models

    International Nuclear Information System (INIS)

    Cabral, Manuela O.M.; Vieira, Jose W.; Silva, Alysson G.; Leal Neto, Viriato; Oliveira, Alex C.H.; Lima, Fernando R.A.

    2011-01-01

    Exposure Computational Models (ECMs) are tools that simulate situations in which irradiation occurs in a given environment. An ECM is composed primarily of an anthropomorphic model (phantom) and a Monte Carlo (MC) code. This paper presents a tutorial for the software Espectro de Raios-X (ERX). This software reads, and performs numerical and graphical analysis of, text files containing diagnostic X-ray spectra for use in the radioactive-source algorithms of the ECMs of the Grupo de Dosimetria Numerica. The ERX allows the user to select among several X-ray spectra in the energy range most commonly used in diagnostic radiology clinics. In the current version of the ERX there are two types of input files: those contained in the mspectra.dat file and those resulting from MC simulations in Geant4. The software allows the construction of charts of the Probability Density Function (PDF) and Cumulative Distribution Function (CDF) of a selected spectrum, as well as a table with the values of these functions and the spectrum. In addition, the ERX allows the user to make comparative analyses between the PDF graphics of the two available catalogs of spectra, and to perform dosimetric evaluations with the selected spectrum. A software tool of this kind is important for researchers in numerical dosimetry because of the diversity of diagnostic radiology X-ray machines, which implies a highly diverse mass of input data. The ERX therefore gives the group independence from the origin of the data contained in the catalogs it creates, without the need to resort to other sources. (author)

  5. Comparison of canal transportation and centering ability of hand Protaper files and rotary Protaper files by using micro computed tomography

    OpenAIRE

    Amit Gandhi; Taru Gandhi

    2011-01-01

    Introduction and objective: The aim of the present study was to compare root canal preparation with rotary ProTaper files and hand ProTaper files, to find the better instrumentation technique for maintaining root canal geometry, with the aid of computed tomography. Material and methods: Twenty curved root canals with at least 10 degrees of curvature were divided into 2 groups of 10 teeth each. In group I the canals were prepared with hand ProTaper files and in group II the canals were prepared with rotary ProTaper files ...

  6. Food irradiation: fiction and reality

    International Nuclear Information System (INIS)

    1991-01-01

    The International Consultative Group on Food Irradiation (ICGFI), sponsored by the World Health Organization (WHO), the Food and Agriculture Organization (FAO) and the International Atomic Energy Agency (IAEA), with the intention of providing governments, especially those of developing countries, scientifically correct information about food irradiation, decided to prepare a file addressing questions of general public interest. The document is composed of descriptive fact sheets covering the current situation and future prospects, technical and scientific terms, food irradiation and radioactivity, chemical transformations in irradiated food, genetic studies, microbiological safety of irradiated food, irradiation and harmlessness, irradiation and additives, packaging, control of irradiation facilities, process control, irradiation costs and benefits, as well as consumer reactions

  7. Cooperative storage of shared files in a parallel computing system with dynamic block size

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
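
    The block-size rule quoted above (the total data volume divided by the number of parallel processes) is simple enough to state as code. The rounding-up behaviour below is an assumption, made so that the blocks always cover the whole object.

        # Dynamically determined block size: total data divided by process count,
        # rounded up so that n_procs blocks cover all of the data.
        def block_size(total_bytes: int, n_procs: int) -> int:
            return -(-total_bytes // n_procs)   # ceiling division

        assert block_size(10_000_000, 64) == 156_250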

  8. Long term file migration. Part I: file reference patterns

    International Nuclear Information System (INIS)

    Smith, A.J.

    1978-08-01

    In most large computer installations, files are moved between on-line disk and mass storage (tape, integrated mass storage device) either automatically by the system or specifically at the direction of the user. This is the first of two papers which study the selection of algorithms for the automatic migration of files between mass storage and disk. The use of the text editor data sets at the Stanford Linear Accelerator Center (SLAC) computer installation is examined through the analysis of thirteen months of file reference data. Most files are used very few times. Of those that are used sufficiently frequently that their reference patterns may be examined, about a third show declining rates of reference during their lifetime; of the remainder, very few (about 5%) show correlated interreference intervals, and interreference intervals (in days) appear to be more skewed than would occur with a Bernoulli process. Thus, about two-thirds of all sufficiently active files appear to be referenced as a renewal process with a skewed interreference distribution. A large number of other file reference statistics (file lifetimes, interreference distributions, moments, means, number of uses/file, file sizes, file rates of reference, etc.) are computed and presented. The results are applied in the following paper to the development and comparative evaluation of file migration algorithms. 17 figures, 13 tables

  9. Use of computed tomography for irradiation planning in practical radiotherapy

    International Nuclear Information System (INIS)

    Riessbeck, K.H.; Achtert, J.; Hegewald, H.

    1985-01-01

    Several years' experience of incorporating computed tomography into irradiation planning has resulted in substantive and organizational changes in practical radiotherapy. Precisely determining the individual topography of the patient, the target volume, and the organs at risk in the central radiation plane, as well as in other planes of interest, permits optimization of the irradiation area. In patients whose radiotherapy requires a complicated field arrangement (for instance head fields, bronchial or esophageal cancer) and in all patients who receive moving-field irradiation, the irradiation planning is done with the help of CT examination, without omitting the approved localization diagnostic procedure. The method of irradiation planning in one plane is presented, in which the spatial dimension of the target volume can still be considered after superprojection into the planning plane. However, the topometric gain alone cannot result in new irradiation methods. Approved irradiation methods should be modified only in connection with increased knowledge of the pathobiology of tumors and of the tolerance of healthy tissue, with regard to maintaining or improving the ratio of cure to complication rate. (author)

  10. More than 2 years' experience with computer-aided irradiation planning in clinical routine

    International Nuclear Information System (INIS)

    Heller, H.; Rathje, J.

    1976-01-01

    This is a report on an irradiation planning system which has been in use for about 2 years in the department of radiotherapy of the general hospital in Altona. The hardware and software, as well as the mathematical model for the description of the dose distribution, are described. The compromise between the required accuracy of the irradiation plan and the investment in computing effort and computer time is discussed. (orig./LN) [de]

  11. Analysis of irradiated biogenic amines by computational chemistry and spectroscopy

    International Nuclear Information System (INIS)

    Oliveira, Jorge L.S.P.; Borges Junior, Itamar; Cardozo, Monique; Souza, Stefania P.; Lima, Antonio L.S.; Lima, Keila S.C.

    2011-01-01

    Biogenic amines (BAs) are nitrogenous compounds able to cause food poisoning. In this work, we studied tyramine, one of the most common BAs present in foods, by combining experimentally measured IR (infrared) and GC/MS (gas chromatography/mass spectrometry) spectra with computational quantum chemistry. Density Functional Theory (DFT) and the Deformed Atoms in Molecules (DAM) method were used to partition the electronic densities in a chemically intuitive way and to compute electrostatic potentials of the molecule in order to identify the acid and basic sites. The commercial standard was irradiated using a Cs-137 irradiator, and each sample was characterized by IR and GC/MS. Calculated and experimental IR spectra were compared. We observed that ionizing gamma irradiation was very effective in decreasing the population of the standard amine, resulting in fragments that could be rationalized through the quantum chemistry calculations. In particular, we could locate the acid and basic sites of the molecules and identify possible sites of structural weakness, which allowed us to propose mechanistic schemes for the breaking of chemical bonds by the irradiation. Moreover, from this work we hope it will also be possible to properly choose the gamma irradiation dose that should be delivered to eliminate each type of contamination. (author)

  12. File and metadata management for BESIII distributed computing

    International Nuclear Information System (INIS)

    Nicholson, C; Zheng, Y H; Lin, L; Deng, Z Y; Li, W D; Zhang, X M

    2012-01-01

    The BESIII experiment at the Institute of High Energy Physics (IHEP), Beijing, uses the high-luminosity BEPCII e+e− collider to study physics in the τ-charm energy region around 3.7 GeV; BEPCII has produced the world's largest samples of J/ψ and ψ′ events to date. An order of magnitude increase in the data sample size over the 2011-2012 data-taking period demanded a move from a very centralized to a distributed computing environment, as well as the development of an efficient file and metadata management system. While BESIII is on a smaller scale than some other HEP experiments, this poses particular challenges for its distributed computing and data management system. These constraints include limited resources and manpower, and low quality of network connections to IHEP. Drawing on the rich experience of the HEP community, a system has been developed which meets these constraints. The design and development of the BESIII distributed data management system, including its integration with other BESIII distributed computing components, such as job management, are presented here.

  13. Computer simulation of plastic deformation in irradiated metals

    International Nuclear Information System (INIS)

    Colak, U.

    1989-01-01

    A computer-based model is developed for the localized plastic deformation in irradiated metals by dislocation channeling, and it is applied to irradiated single crystals of niobium. In the model, the concentrated plastic deformation in the dislocation channels is postulated to occur by virtue of the motion of dislocations in a series of pile-ups on closely spaced parallel slip planes. The dynamics of this dislocation motion is governed by an experimentally determined dependence of dislocation velocity on shear stress. This leads to a set of coupled differential equations for the positions of the individual dislocations in the pile-up as a function of time. Shear displacement in the channel region is calculated from the total distance traveled by the dislocations. The macroscopic shape change in single-crystal metal sheet samples is determined by the axial displacement produced by the shear displacements in the dislocation channels. Computer simulations are performed for plastic deformation up to 20% engineering strain at a constant strain rate. Results of the computer calculations are compared with experimental observations of the shear stress-engineering strain curve obtained in tensile tests described in the literature. Agreement between the calculated and experimental stress-strain curves is obtained for a shear displacement of 1.20-1.25 μm and 1000 active slip planes per channel, which is reasonable in view of the experimental observations

  14. Transfer of numeric ASCII data files between Apple and IBM personal computers.

    Science.gov (United States)

    Allan, R W; Bermejo, R; Houben, D

    1986-01-01

    Listings for programs designed to transfer numeric ASCII data files between Apple and IBM personal computers are provided with accompanying descriptions of how the software operates. Details of the hardware used are also given. The programs may be easily adapted for transferring data between other microcomputers.

  15. Dosimetry computer module of the gamma irradiator of ININ

    International Nuclear Information System (INIS)

    Ledezma F, L. E.; Baldomero J, R.; Agis E, K. A.

    2012-10-01

    This work presents the technical specifications for the upgrade of the dosimetry module of the computer system of the gamma irradiator of the Instituto Nacional de Investigaciones Nucleares (ININ), which allows the integration and consultation of industrial dosimetry information under a client-server scheme. (Author)

  16. Computer-based anthropometrical system for total body irradiation.

    Science.gov (United States)

    Sánchez-Nieto, B; Sánchez-Doblado, F; Terrón, J A; Arráns, R; Errazquin, L

    1997-05-01

    For total body irradiation (TBI) dose calculation requirements, anatomical information about the whole body is needed. Despite the fact that video image grabbing techniques are used by some treatment planning systems for standard radiotherapy, there are no such systems designed to generate anatomical parameters for TBI planning. The paper describes an anthropometrical computerised system based on video image grabbing which was purpose-built to provide anatomical data for a PC-based TBI planning system. Using software, the system controls the acquisition and digitalisation of the images (external images of the patient in treatment position) and the measurement procedure itself (on the external images or the digital CT information). An ASCII file, readable by the TBI planning system, is generated to store the required parameters of the dose calculation points, i.e. depth, backscatter tissue thickness, thickness of inhomogeneity, off-axis distance (OAD) and source to skin distance (SSD).

  17. Trust in social computing. The case of peer-to-peer file sharing networks

    Directory of Open Access Journals (Sweden)

    Heng Xu

    2011-09-01

    Social computing and online communities are changing the fundamental way people share information and communicate with each other. Social computing focuses on how users may have more autonomy to express their ideas and participate in social exchanges in various ways, one of which may be peer-to-peer (P2P) file sharing. Given the greater risk of opportunistic behavior by malicious or criminal communities in P2P networks, it is crucial to understand the factors that affect individuals' use of P2P file sharing software. In this paper, we develop and empirically test a research model that includes trust beliefs and perceived risks as two major antecedent beliefs to the usage intention. Six trust antecedents are assessed, including knowledge-based trust, cognitive trust, and both organizational and peer-network factors of institutional trust. Our preliminary results show general support for the model and offer some important implications for software vendors in the P2P sharing industry and for regulatory bodies.

  18. Programmed temperature control of capsule in irradiation test with personal computer at JMTR

    International Nuclear Information System (INIS)

    Saito, H.; Uramoto, T.; Fukushima, M.; Obata, M.; Suzuki, S.; Nakazaki, C.; Tanaka, I.

    1992-01-01

    The capsule irradiation facility is one of the various facilities employed at the Japan Materials Testing Reactor (JMTR). The capsule facility has been used in irradiation tests of both nuclear fuels and materials. The capsule to be irradiated consists of the specimen and the outer and inner tubes, with an annular space between them. The temperature of the specimen is controlled by varying the pressure (below atmospheric pressure) of He gas in the annular space (vacuum-controlled). Besides this, in another system the temperature of the specimen is controlled with electric heaters mounted around the specimen (heater-controlled). The use of a personal computer in the capsule facility has led to the development of a versatile temperature control system at the JMTR. The features of this newly developed temperature control system are as follows: the temperature control mode for an operation period can be preset prior to the operation, and the vacuum-controlled irradiation facility can be used in cooperation with the heater-controlled one. The introduction of the personal computer has enabled automatic heat-up and cool-down operations of the capsule, replacing the hand-operated jobs which had been conducted by the operators. As a result, the various requirements for higher accuracy and efficiency in the irradiation can be met by fully exploiting the capabilities incorporated into the facility, which allow cyclic or delicate changes in the temperature. This paper deals with a capsule temperature control system based on a personal computer. (author)

  19. An analysis of file system and installation of the file management system for NOS operating system

    International Nuclear Information System (INIS)

    Lee, Young Jai; Park, Sun Hee; Hwang, In Ah; Kim, Hee Kyung

    1992-06-01

    In this technical report, we analyze the NOS file structure of the Cyber 170-875 and Cyber 960-31 computer systems. We also describe the functions, procedures, operation and use of VDS. VDS is used to manage large files effectively on the Cyber computer system. The purpose of the VDS installation is to increase the virtual disk storage by utilizing magnetic tape, to assist the users of the computer system in managing their files, and to enhance the performance of the KAERI Cyber computer system. (Author)

  20. Uncertainty Estimate of Surface Irradiances Computed with MODIS-, CALIPSO-, and CloudSat-Derived Cloud and Aerosol Properties

    Science.gov (United States)

    Kato, Seiji; Loeb, Norman G.; Rutan, David A.; Rose, Fred G.; Sun-Mack, Sunny; Miller, Walter F.; Chen, Yan

    2012-07-01

    Differences of modeled surface upward and downward longwave and shortwave irradiances are calculated using modeled irradiance computed with active sensor-derived and passive sensor-derived cloud and aerosol properties. The irradiance differences are calculated for various temporal and spatial scales: monthly gridded, monthly zonal, monthly global, and annual global. Using the irradiance differences, the uncertainty of surface irradiances is estimated. The uncertainty (1σ) of the annual global surface downward longwave and shortwave is, respectively, 7 W m-2 (out of 345 W m-2) and 4 W m-2 (out of 192 W m-2), after known bias errors are removed. Similarly, the uncertainty of the annual global surface upward longwave and shortwave is, respectively, 3 W m-2 (out of 398 W m-2) and 3 W m-2 (out of 23 W m-2). The uncertainty is for modeled irradiances computed using cloud properties derived from imagers on a sun-synchronous orbit that covers the globe every day (e.g., the Moderate Resolution Imaging Spectroradiometer) or modeled irradiances computed for nadir-view-only active sensors on a sun-synchronous orbit, such as Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) and CloudSat. If we assume that longwave and shortwave uncertainties are independent of each other, but up- and downward components are correlated with each other, the uncertainty in the global annual mean net surface irradiance is 12 W m-2. One-sigma uncertainty bounds of the satellite-based net surface irradiance are 106 W m-2 and 130 W m-2.
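
    As a rough consistency check (not taken from the paper), combining the quoted component uncertainties with up- and downward parts added linearly (correlated) and longwave and shortwave added in quadrature (independent) reproduces the stated 12 W m-2:

        # Quoted 1-sigma uncertainties (W m^-2): downward/upward longwave 7 and 3,
        # downward/upward shortwave 4 and 3. Correlated components add linearly,
        # independent ones in quadrature.
        lw = 7 + 3
        sw = 4 + 3
        net = (lw**2 + sw**2) ** 0.5
        print(round(net, 1))   # 12.2, consistent with the quoted 12 W m^-2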

  1. Dimensional quality control of Ti-Ni dental file by optical coordinate metrology and computed tomography

    DEFF Research Database (Denmark)

    Yagüe-Fabra, J.A.; Tosello, Guido; Ontiveros, S.

    2014-01-01

    Endodontic dental files usually present complex 3D geometries, which make complete measurement of the component very challenging with conventional micro-metrology tools. Computed Tomography (CT) can represent a suitable alternative to micro-metrology tools based on optical and tactile techniques. However, the establishment of CT system traceability when measuring complex 3D geometries is still an open issue. In this work, to verify the quality of the CT dimensional measurements, the dental file has been measured both with a μCT system and an optical CMM (OCMM). The uncertainty ...

  2. PCF File Format.

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2017-08-01

    PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. It can contain multiple spectra and information about each spectrum such as energy calibration. This document outlines the format of the file that would allow one to write a computer program to parse and write such files.
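
    The actual record layout is defined in the GADRAS documentation cited in the abstract; purely as an illustration of what "writing a computer program to parse such files" involves, the sketch below reads a hypothetical binary spectrum layout. It is not the real PCF specification.

        # Illustrative only: parse a *hypothetical* binary spectrum layout with
        # struct. The real PCF field layout must be taken from the GADRAS docs.
        import struct

        def read_hypothetical_spectrum(path):
            with open(path, "rb") as f:
                n_channels, live_time, real_time = struct.unpack("<iff", f.read(12))
                counts = struct.unpack(f"<{n_channels}f", f.read(4 * n_channels))
            return {"live_time": live_time, "real_time": real_time, "counts": counts}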

  3. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    A computer forensic analyst is a person in charge of investigation and evidence tracking. In certain cases, the file needed to be presented as digital evidence has been deleted. It is difficult to reconstruct such a file, because it often loses its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence approach that consists of three steps, namely training, testing and validation, to identify the file type from file fragments. From all testing results we can conclude that our proposed method works well and achieves 92.91% accuracy in identifying the file type of a file fragment for three data types.
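
    The core of the method above is a longest-common-subsequence comparison between an unknown fragment and labelled training fragments. A minimal dynamic-programming version is sketched below; the classification-by-best-match step is a simplification of the paper's training/testing/validation pipeline.

        # O(m*n) dynamic-programming LCS length between two byte strings, and a
        # naive classifier that assigns the type of the best-matching exemplar.
        def lcs_length(a: bytes, b: bytes) -> int:
            prev = [0] * (len(b) + 1)
            for x in a:
                cur = [0]
                for j, y in enumerate(b, 1):
                    cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[j - 1]))
                prev = cur
            return prev[-1]

        def classify(fragment: bytes, exemplars: dict) -> str:
            # exemplars maps a file-type label to a representative fragment
            return max(exemplars, key=lambda t: lcs_length(fragment, exemplars[t]))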

  4. Simple computational modeling for human extracorporeal irradiation using the BNCT facility of the RA-3 Reactor

    International Nuclear Information System (INIS)

    Farias, Ruben; Gonzalez, S.J.; Bellino, A.; Sztenjberg, M.; Pinto, J.; Thorp, Silvia I.; Gadan, M.; Pozzi, Emiliano; Schwint, Amanda E.; Heber, Elisa M.; Trivillin, V.A.; Zarza, Leandro G.; Estryk, Guillermo; Miller, M.; Bortolussi, S.; Soto, M.S.; Nigg, D.W.

    2009-01-01

    We present a simple computational model of the RA-3 reactor developed using the Monte Carlo transport code MCNP. The model parameters are adjusted in order to reproduce experimentally measured points in air, and the source validation is performed in an acrylic phantom. Performance analysis is carried out using computational models of animal extracorporeal irradiation in liver and lung. Analysis is also performed inside a neutron-shielded receptacle used for the irradiation of rats with a model of hepatic metastases. The computational model reproduces the experimental behavior in all the analyzed cases with a maximum difference of 10 percent. (author)

  5. Comparison of canal transportation and centering ability of twisted files, Pathfile-ProTaper system, and stainless steel hand K-files by using computed tomography.

    Science.gov (United States)

    Gergi, Richard; Rjeily, Joe Abou; Sader, Joseph; Naaman, Alfred

    2010-05-01

    The purpose of this study was to compare canal transportation and centering ability of 2 rotary nickel-titanium (NiTi) systems (Twisted Files [TF] and Pathfile-ProTaper [PP]) with conventional stainless steel K-files. Ninety root canals with severe curvature and short radius were selected. Canals were divided randomly into 3 groups of 30 each. After preparation with TF, PP, and stainless steel files, the amount of transportation that occurred was assessed by using computed tomography. Three sections from apical, mid-root, and coronal levels of the canal were recorded. Amount of transportation and centering ability were assessed. The 3 groups were statistically compared with analysis of variance and Tukey honestly significant difference test. Less transportation and better centering ability occurred with TF rotary instruments (P < .0001). K-files showed the highest transportation followed by PP system. PP system showed significant transportation when compared with TF (P < .0001). The TF system was found to be the best for all variables measured in this study. Copyright (c) 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  6. Physico-technical irradiation planning for the therapy of oesophagus carcinomas by means of computed whole-body tomography

    International Nuclear Information System (INIS)

    Ammon, J.; Greiner, K.; Kaesberg, P.

    1980-01-01

    It is particularly difficult to establish a physico-technical irradiation plan for the thoracic part of an oesophagus carcinoma. This is due to the considerable variation of the thoracic cross-section along the longitudinal axis of the radiation field. Therefore, tomographic cross-sections were made of the upper, the middle and the lower plane of the radiation field. The percentage dose distributions could be determined with a process computer (system TPS, Philips) for different irradiation techniques and irradiation equipment. Examinations of 21 patients showed that the best dose distribution, i.e. a distribution which spares the lung and spinal cord regions adjacent to the target volume, is obtained by an eccentric moving-field therapy. Furthermore, the localisation and dimensions of inhomogeneities are indicated by computed tomography, which makes it possible to take these inhomogeneities into consideration when calculating the dose. It was found that the irradiation times can thus be reduced by more than 20%. We are therefore of the opinion that it is necessary to establish individual cross-sections of the body by computed tomography when elaborating a physico-technical irradiation plan for the treatment of an oesophagus carcinoma. (orig.) [de]

  7. The computer system for the express-analysis of the irradiation samples

    International Nuclear Information System (INIS)

    Vzorov, I.K.; Kalmykov, A.V.; Korenev, S.A.; Minashkin, V.F.; Sikolenko, V.V.

    1999-01-01

    A computer system for the express analysis (SEA) of irradiated samples is described. The system works together with a pulsed high-current electron and ion source, allowing the irradiation regime to be corrected in real time. The SEA system automatically measures volt-ampere and volt-farad characteristics, sample resistance by the four-probe method, and sample capacitor parameters. Its parameters are: in the volt-ampere measuring regime, U_max = 200 V, minimal voltage step U_sh = 0.05 V, voltage accuracy 0.25%; in the capacity measuring regime, capacity measurement range 0 - 1600 pF, working frequency range 1 - 150 kHz, capacity accuracy 0.5%, bias voltage range 1 - 200 V, minimal bias voltage step U_sh = 0.05 V. The SEA is managed by an IBM/AT computer. The control and measuring apparatus was realized in the CAMAC standard. The program set consists of procedures for initial display, control, processing and output of information. (author)

  8. Computer Forensics Method in Analysis of Files Timestamps in Microsoft Windows Operating System and NTFS File System

    Directory of Open Access Journals (Sweden)

    Vesta Sergeevna Matveeva

    2013-02-01

    All existing file browsers display 3 timestamps for every file in the NTFS file system. Nowadays there are many utilities that can manipulate temporal attributes to conceal the traces of file use. However, every file in NTFS has 8 timestamps that are stored in the file record and can be used to detect the substitution of attributes. The authors suggest a method for revealing the original timestamps after replacement, and an automated variant of it for a set of files.
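
    The paper works with all eight timestamps held in the NTFS file record; standard Python only exposes three of them, so the sketch below merely illustrates the flavour of such a consistency check and is not the authors' method.

        # Simplified illustration: flag a file whose recorded creation time is
        # later than its last-modification time (one common sign of tampering).
        # The paper's method instead compares the 8 MFT timestamps directly.
        import os

        def suspicious(path: str) -> bool:
            st = os.stat(path)
            created = getattr(st, "st_birthtime", st.st_ctime)  # platform dependent
            return created > st.st_mtime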

  9. Building Parts Inventory Files Using the AppleWorks Data Base Subprogram and Apple IIe or GS Computers.

    Science.gov (United States)

    Schlenker, Richard M.

    This manual is a "how to" training device for building database files using the AppleWorks program with an Apple IIe or Apple IIGS Computer with Duodisk or two disk drives and an 80-column card. The manual provides step-by-step directions, and includes 25 figures depicting the computer screen at the various stages of the database file…

  10. Computational dosimetry in the development of a category-III irradiator

    International Nuclear Information System (INIS)

    George, Jain R.; Parmar, Y.D.; Kohli, A.D.

    2008-01-01

    Full text: Preservation of food by radiation processing has been widely accepted and is yielding tremendous industrial and societal benefits. It is a challenge to meet the vast range of dose requirements for the radiation processing of food items in the most economical way. The Board of Radiation and Isotope Technology, a unit of the Department of Atomic Energy, Government of India, is developing a Category-III Co-60 irradiator for the radiation processing of food commodities for commercial application. In this irradiator water is the only shielding material; unlike other irradiators it has no concrete or lead shielding above ground and no electrical interlock systems. The radiation source never leaves the underwater shielded position. A computational method has been developed for the dose and throughput estimations, which is used for the optimization of various parameters of this irradiator. Co-60 source pencils of total strength 18.5 PBq arranged in two tiers in a rectangular source frame are considered ideal for this irradiator. Food items in the density range 0.15 × 10³ kg·m⁻³ to 1.0 × 10³ kg·m⁻³ were considered for processing. The product, enclosed in a box of dimensions 1.2 m x 0.6 m x 1.2 m, moves and occupies four positions around the source. The product is kept dry by proper leak tighteners. The double-sided irradiation provides a good dose distribution within the product. Parametric analysis has been carried out for various possible inactive gap lengths between the two tiers of the source and for various distances between the source and the product, in order to optimize them for an acceptable dose uniformity within the product as well as for the best source utilization efficiency. This has been done for different density products. (author)

  11. File management for experiment control parameters within a distributed function computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-10-01

    An attempt to design and implement a computer system for control of and data collection from a set of laboratory experiments reveals that many of the experiments in the set require an extensive collection of parameters for their control. The operation of the experiments can be greatly simplified if a means can be found for storing these parameters between experiments and automatically accessing them as they are required. A subsystem for managing files of such experiment control parameters is discussed. 3 figures

  12. Computer codes for simulating atomic-displacement cascades in solids subject to irradiation

    International Nuclear Information System (INIS)

    Asaoka, Takumi; Taji, Yukichi; Tsutsui, Tsuneo; Nakagawa, Masayuki; Nishida, Takahiko

    1979-03-01

    In order to study atomic displacement cascades originating from primary knock-on atoms in solids subject to incident radiation, the simulation code CASCADE/CLUSTER has been adapted for use on the FACOM/230-75 computer system. In addition, the code has been modified so as to plot the defect patterns in crystalline solids. MARLOWE, another simulation code for the cascade process, is also available for use on the FACOM system. To deal with the thermal annealing of point defects produced in the cascade process, the code DAIQUIRI, developed originally for body-centered cubic crystals, has been modified to be applicable also to face-centered cubic lattices. By combining CASCADE/CLUSTER and DAIQUIRI, we then prepared the computer code system CASCSRB to deal with heavy irradiation or the saturation damage state of solids at normal temperature. Furthermore, a code system for the simulation of heavy irradiation, CASCMARL, is available, in which the MARLOWE code is substituted for CASCADE in the CASCSRB system. (author)

  13. Study on irradiation effects of nucleus electromagnetic pulse on single chip computer system

    International Nuclear Information System (INIS)

    Hou Minsheng; Liu Shanghe; Wang Shuping

    2001-01-01

    Intense electromagnetic pulses, namely the nuclear electromagnetic pulse (NEMP), the lightning electromagnetic pulse (LEMP) and high power microwave (HPM), can disturb and destroy a single chip computer system. To study this issue, the authors carried out irradiation experiments with NEMPs generated by a gigahertz transverse electromagnetic (GTEM) cell. The experiments show that shutdown, restarting and communication errors of the single chip microcomputer system occur when it is irradiated by the NEMPs. Based on the experiments, the cause of these effects on the single chip microcomputer system is discussed

  14. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to read in Computer Aided Design (CAD) files

    International Nuclear Information System (INIS)

    Schwarz, Randy A.; Carter, Leeland L.

    2004-01-01

    Monte Carlo N-Particle Transport Code (MCNP) (Reference 1) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle (References 2 to 11) is recognized internationally as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant enhanced the capabilities of the MCNP Visual Editor to allow it to read in a 2D Computer Aided Design (CAD) file, allowing the user to modify and view the 2D CAD file and then electronically generate a valid MCNP input geometry with a user specified axial extent

  15. Shaping ability of the conventional nickel-titanium and reciprocating nickel-titanium file systems: a comparative study using micro-computed tomography.

    Science.gov (United States)

    Hwang, Young-Hye; Bae, Kwang-Shik; Baek, Seung-Ho; Kum, Kee-Yeon; Lee, WooCheol; Shon, Won-Jun; Chang, Seok Woo

    2014-08-01

    This study used micro-computed tomographic imaging to compare the shaping ability of Mtwo (VDW, Munich, Germany), a conventional nickel-titanium file system, and Reciproc (VDW), a reciprocating file system morphologically similar to Mtwo. Root canal shaping was performed on the mesiobuccal and distobuccal canals of extracted maxillary molars. In the RR group (n = 15), Reciproc was used in a reciprocating motion (150° counterclockwise/30° clockwise, 300 rpm); in the MR group, Mtwo was used in a reciprocating motion (150° clockwise/30° counterclockwise, 300 rpm); and in the MC group, Mtwo was used in a continuous rotating motion (300 rpm). Micro-computed tomographic images taken before and after canal shaping were used to analyze canal volume change and the degree of transportation at the cervical, middle, and apical levels. The time required for canal shaping was recorded. Afterward, each file was analyzed using scanning electron microscopy. No statistically significant differences were found among the 3 groups in the time for canal shaping or canal volume change (P > .05). Transportation values of the RR and MR groups were not significantly different at any level. However, the transportation value of the MC group was significantly higher than both the RR and MR groups at the cervical and apical levels (P < .05). File deformation was observed for 1 file in group RR (1/15), 3 files in group MR (3/15), and 5 files in group MC (5/15). In terms of shaping ability, Mtwo used in a reciprocating motion was not significantly different from the Reciproc system. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  16. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
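
The sampling idea behind PFFF can be illustrated with a short sketch; this is not the pfff tool itself, and the block count, block size, seeding of offsets by file size, and use of SHA-1 are all illustrative assumptions.

```python
import hashlib
import os
import random

def probabilistic_fingerprint(path, n_samples=64, block_size=4096):
    """Fingerprint a file by hashing a fixed number of sampled blocks.

    Illustrative sketch of sampling-based fingerprinting; the real pfff
    tool uses its own sampling and hashing scheme.
    """
    size = os.path.getsize(path)
    digest = hashlib.sha1()
    digest.update(str(size).encode())          # file size is part of the fingerprint
    rng = random.Random(size)                  # same size -> same sample offsets
    offsets = sorted(rng.randrange(0, max(size - block_size, 1))
                     for _ in range(n_samples)) if size > block_size else [0]
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            digest.update(f.read(block_size))  # cost is ~n_samples reads, not file size
    return digest.hexdigest()
```

Comparing two large files by such fingerprints then costs a few hundred kilobytes of I/O per file rather than a full read, which is the flat performance characteristic the abstract describes.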

  17. A secure file manager for UNIX

    Energy Technology Data Exchange (ETDEWEB)

    DeVries, R.G.

    1990-12-31

    The development of a secure file management system for a UNIX-based computer facility with supercomputers and workstations is described. Specifically, UNIX in its usual form does not address: (1) Operation which would satisfy rigorous security requirements. (2) Online space management in an environment where total data demands would be many times the actual online capacity. (3) Making the file management system part of a computer network in which users of any computer in the local network could retrieve data generated on any other computer in the network. The characteristics of UNIX can be exploited to develop a portable, secure file manager which would operate on computer systems ranging from workstations to supercomputers. Implementation considerations making unusual use of UNIX features, rather than requiring extensive internal system changes, are described, and implementation using the Cray Research Inc. UNICOS operating system is outlined.

  18. Parallel file system with metadata distributed across partitioned key-value store c

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
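
A toy sketch of the partitioning idea follows; the class names, hash-based routing, and metadata fields are illustrative assumptions and not the PLFS or MDHIM APIs.

```python
from dataclasses import dataclass

@dataclass
class SubFileMeta:
    """Metadata for one sub-file written into a shared logical file."""
    logical_offset: int   # where the data lands in the shared file
    length: int
    object_server: str    # object storage server holding the bytes

class PartitionedMetadataStore:
    """Toy stand-in for a partitioned key-value store such as MDHIM.

    Each rank owns one partition; keys are routed by hashing, the way a
    real partitioned store would route them over a message passing interface.
    """
    def __init__(self, n_partitions):
        self.partitions = [dict() for _ in range(n_partitions)]

    def _partition(self, key):
        return hash(key) % len(self.partitions)

    def put(self, key, meta: SubFileMeta):
        self.partitions[self._partition(key)][key] = meta

    def get(self, key):
        return self.partitions[self._partition(key)].get(key)

# Example: rank 3 records where its portion of "shared.out" lives.
store = PartitionedMetadataStore(n_partitions=8)
store.put(("shared.out", 3), SubFileMeta(logical_offset=3 * 2**20,
                                         length=2**20,
                                         object_server="oss07"))
print(store.get(("shared.out", 3)))
```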

  19. Neutron spectra calculation in material in order to compute irradiation damage

    International Nuclear Information System (INIS)

    Dupont, C.; Gonnord, J.; Le Dieu de Ville, A.; Nimal, J.C.; Totth, B.

    1982-01-01

    This short presentation covers neutron spectra calculation methods used to compute the damage formation rate in irradiated structures. Three computation schemes are used in the French C.E.A.: (1) 3-dimensional calculations using the line of sight attenuation method (MERCURE IV code), the removal cross section being obtained from an adjustment on a 1-dimensional transport calculation with the discrete ordinate code ANISN; (2) 2-dimensional calculations using the discrete ordinates method (DOT 3.5 code), with a 20 to 30 group library obtained by collapsing the 100 group library on fluxes computed by ANISN; (3) 3-dimensional calculations using the Monte Carlo method (TRIPOLI system). The cross sections, which originally came from UKNDL 73 and ENDF/B3, are now processed from ENDF/B-IV. (author)

  20. EVOLUT - a computer program for fast burnup evaluation

    International Nuclear Information System (INIS)

    Craciunescu, T.; Dobrin, R.; Stamatescu, L.; Alexa, A.

    1999-01-01

    EVOLUT is a computer program for burnup evaluation. The input data consist on the one hand of axial and radial gamma-scanning profiles (for the experimental evaluation of the number of nuclei of a fission product - the burnup monitor - at the end of irradiation) and on the other hand of the history of irradiation (the time length and values proportional to the neutron flux for each step of irradiation). Using the equation of evolution of the burnup monitor, the flux values are iteratively adjusted by a multiplier factor until the calculated number of nuclei equals the experimental one. The flux values are used in the equation of evolution of the fissile and fertile nuclei to determine the fission number and consequently the burnup. EVOLUT was successfully used in the analysis of several hundreds of CANDU and TRIGA-type fuel rods. We consider EVOLUT a useful tool for burnup evaluation based on gamma spectrometry measurements. EVOLUT can be used on an ordinary AT computer, in which case the results are obtained in a few minutes. It has an original and user-friendly graphical interface and also provides output as MATLAB script files for graphical representation and further numerical analysis. The computer program needs simple data and it is valuable especially when a large number of burnup analyses are required quickly. (authors)
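
The iterative adjustment described above can be sketched as follows; the monitor constants, irradiation history, and the use of geometric bisection are placeholder choices, not the actual EVOLUT algorithm.

```python
import math

def monitor_nuclei(flux_steps, scale, fission_rate_per_flux, yield_frac, lam):
    """Integrate dN/dt = y * F(t) - lam * N over piecewise-constant flux steps."""
    n = 0.0
    for phi, dt in flux_steps:                 # (relative flux, step length in s)
        prod = yield_frac * fission_rate_per_flux * phi * scale
        n = prod / lam + (n - prod / lam) * math.exp(-lam * dt)
    return n

def fit_flux_scale(flux_steps, n_measured, fission_rate_per_flux,
                   yield_frac, lam, lo=1e-6, hi=1e6, tol=1e-6):
    """Adjust the flux multiplier until the computed N matches the measurement."""
    for _ in range(200):
        mid = math.sqrt(lo * hi)               # geometric bisection over a wide range
        n_mid = monitor_nuclei(flux_steps, mid, fission_rate_per_flux, yield_frac, lam)
        if abs(n_mid - n_measured) <= tol * n_measured:
            return mid
        if n_mid < n_measured:
            lo = mid
        else:
            hi = mid
    return mid

# Placeholder irradiation history and Cs-137-like monitor constants (illustrative only).
history = [(1.0, 30 * 86400), (0.0, 10 * 86400), (0.8, 60 * 86400)]
scale = fit_flux_scale(history, n_measured=5.0e15,
                       fission_rate_per_flux=1.0e5, yield_frac=0.062,
                       lam=math.log(2) / (30.1 * 365.25 * 86400))
print(f"fitted flux multiplier: {scale:.3e}")
```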

  1. Quantitative study of late injury in the irradiated mouse lung using computer graphics

    International Nuclear Information System (INIS)

    Tanabe, Masahiro; Furuse, Takeshi; Rapachietta, D.R.; Kallman, R.F.

    1990-01-01

    It is reported that quantitative histological analysis using current imaging technology and computer graphics is useful in studying late injury in the irradiated lung (with and without added chemotherapy), and that it correlated closely with results of the functional breathing rate test. (author). 7 refs.; 1 fig

  2. Activity computer program for calculating ion irradiation activation

    Science.gov (United States)

    Palmer, Ben; Connolly, Brian; Read, Mark

    2017-07-01

    A computer program, Activity, was developed to predict the activity and gamma lines of materials irradiated with an ion beam. It uses the TENDL (Koning and Rochman, 2012) [1] proton reaction cross section database, the Stopping and Range of Ions in Matter (SRIM) (Biersack et al., 2010) code, a Nuclear Data Services (NDS) radioactive decay database (Sonzogni, 2006) [2] and an ENDF gamma decay database (Herman and Chadwick, 2006) [3]. An extended version of Bateman's equation is used to calculate the activity at time t, and this equation is solved analytically, with the option to also solve by numeric inverse Laplace Transform as a failsafe. The program outputs the expected activity and gamma lines of the activated material.
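
For the simple case of an inventory decaying through a chain after the irradiation ends, the analytic Bateman form that such a program evaluates looks like the sketch below; the decay constants are illustrative placeholders.

```python
import math

def bateman(n1_0, lambdas, t):
    """Number of nuclei of the last member of a decay chain at time t,
    given an initial inventory n1_0 of the first member (classic Bateman form).

    Assumes all decay constants are distinct.
    """
    n = len(lambdas)
    total = 0.0
    for i in range(n):
        denom = 1.0
        for j in range(n):
            if j != i:
                denom *= (lambdas[j] - lambdas[i])
        total += math.exp(-lambdas[i] * t) / denom
    prefactor = n1_0
    for lam in lambdas[:-1]:
        prefactor *= lam
    return prefactor * total

# Two-member chain with half-lives of 6 h and 3 d (illustrative values).
lam = [math.log(2) / (6 * 3600), math.log(2) / (3 * 86400)]
print(bateman(1.0e12, lam, t=24 * 3600))
```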

  3. Building a parallel file system simulator

    International Nuclear Information System (INIS)

    Molina-Estolano, E; Maltzahn, C; Brandt, S A; Bent, J

    2009-01-01

    Parallel file systems are gaining in popularity in high-end computing centers as well as commercial data centers. High-end computing systems are expected to scale exponentially and to pose new challenges to their storage scalability in terms of cost and power. To address these challenges scientists and file system designers will need a thorough understanding of the design space of parallel file systems. Yet there exist few systematic studies of parallel file system behavior at petabyte and exabyte scales. An important reason is the significant cost of getting access to large-scale hardware to test parallel file systems. To contribute to this understanding we are building a parallel file system simulator that can simulate parallel file systems at very large scale. Our goal is to simulate petabyte-scale parallel file systems on a small cluster or even a single machine in reasonable time and fidelity. With this simulator, file system experts will be able to tune existing file systems for specific workloads, scientists and file system deployment engineers will be able to better communicate workload requirements, file system designers and researchers will be able to try out design alternatives and innovations at scale, and instructors will be able to study very large-scale parallel file system behavior in the classroom. In this paper we describe our approach and provide preliminary results that are encouraging both in terms of fidelity and simulation scalability.

  4. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-file Systems.

    Science.gov (United States)

    Prabhakar, Attiguppe R; Yavagal, Chandrashekar; Dixit, Kratika; Naik, Saraswathi V

    2016-01-01

    Primary root canals are considered to be most challenging due to their complex anatomy. "Wave one" and "one shape" are single-file systems with reciprocating and rotary motion respectively. The aim of this study was to evaluate and compare dentin thickness, centering ability, canal transportation, and instrumentation time of wave one and one shape files in primary root canals using a cone beam computed tomographic (CBCT) analysis. This is an experimental, in vitro study comparing the two groups. A total of 24 extracted human primary teeth with a minimum 7 mm root length were included in the study. Cone beam computed tomographic images were taken before and after the instrumentation for each group. Dentin thickness, centering ability, canal transportation, and instrumentation times were evaluated for each group. A significant difference was found in instrumentation time and canal transportation measures between the two groups. Wave one showed less canal transportation as compared with one shape, and the mean instrumentation time of wave one was significantly less than that of one shape. The reciprocating single-file system was found to be faster, with far fewer procedural errors, and can hence be recommended for shaping the root canals of primary teeth. How to cite this article: Prabhakar AR, Yavagal C, Dixit K, Naik SV. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-File Systems. Int J Clin Pediatr Dent 2016;9(1):45-49.

  5. Generation of Gaussian 09 Input Files for the Computation of 1H and 13C NMR Chemical Shifts of Structures from a Spartan’14 Conformational Search

    OpenAIRE

    sprotocols

    2014-01-01

    Authors: Spencer Reisbick & Patrick Willoughby ### Abstract This protocol describes an approach to preparing a series of Gaussian 09 computational input files for an ensemble of conformers generated in Spartan’14. The resulting input files are necessary for computing optimum geometries, relative conformer energies, and NMR shielding tensors using Gaussian. Using the conformational search feature within Spartan’14, an ensemble of conformational isomers was obtained. To convert the str...
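
A hedged sketch of writing one Gaussian 09 input file per conformer from Cartesian coordinates is shown below; the route section, file naming, and coordinates are illustrative assumptions rather than the protocol's exact settings.

```python
def write_gaussian_nmr_input(name, atoms, charge=0, multiplicity=1,
                             route="#p B3LYP/6-31+G(d,p) NMR=GIAO"):
    """Write a minimal Gaussian 09 input file for a GIAO NMR calculation.

    `atoms` is a list of (symbol, x, y, z) tuples, e.g. from a conformer
    export. The route/basis choice here is illustrative.
    """
    lines = [f"%chk={name}.chk", route, "", f"{name} NMR shielding", "",
             f"{charge} {multiplicity}"]
    lines += [f"{sym:2s} {x:12.6f} {y:12.6f} {z:12.6f}" for sym, x, y, z in atoms]
    lines.append("")                              # Gaussian inputs end with a blank line
    with open(f"{name}.gjf", "w") as f:
        f.write("\n".join(lines) + "\n")

# Example: one conformer of methanol (coordinates are placeholders).
write_gaussian_nmr_input("conf_001", [
    ("C", 0.000, 0.000, 0.000),
    ("O", 1.410, 0.000, 0.000),
    ("H", -0.360, 1.030, 0.000),
    ("H", -0.360, -0.515, 0.890),
    ("H", -0.360, -0.515, -0.890),
    ("H", 1.740, -0.900, 0.000),
])
```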

  6. Research of Performance Linux Kernel File Systems

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Ostroukh

    2015-10-01

    The article describes the most common Linux kernel file systems. The research was carried out on a personal computer, a typical workstation running GNU/Linux, whose characteristics are given in the article. The software necessary for measuring file system performance was installed on this machine. Based on the results, conclusions are drawn and recommendations for the use of the file systems are proposed, and the best ways to store data are identified and recommended.

  7. A computer program (FUGI) for design and operation of a conveyor type irradiator with multi-tier and multi-layer

    International Nuclear Information System (INIS)

    Hoshi, Tatsuo; Aggarwal, K.S.

    1976-10-01

    A computer program (FUGI) was established to facilitate the determination of factors related to design and operation of a conveyor type irradiator with multi-tier and multi-layer. The factors determined by this program are as follows: (1) maximum dose, minimum dose and dose uniformity in irradiated material; (2) dose rate distribution on the path of irradiated material; (3) mass flow rate of irradiated material; (4) requisite activity of source; (5) requisite speed of conveyor; (6) utilization efficiency. This program partly uses the program FUDGE 4A by Galanter and Krishnamurthy for determination of the dose rate in irradiated material in the static state. (auth.)

  8. Extracting the Data From the LCM vk4 Formatted Output File

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-29

    These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: vk4 file produced by Keyence VK Software, custom analysis, no off the shelf way to read the file, reading the binary data in a vk4 file, various offsets in decimal lines, finding the height image data, directly in MATLAB, binary output beginning of height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read vk4 file (choose a file, read the file, compute offsets, read optical image, laser optical image, read and compute laser intensity image, read height image, timing, display height image, display laser intensity image, display RGB laser optical images, display RGB optical images, display beginning data and save images to workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, observations.

  9. Design and application of remote file management system

    International Nuclear Information System (INIS)

    Zhu Haijun; Liu Dekang; Shen liren

    2006-01-01

    The File Transfer Protocol (FTP) helps users transfer files between computers on the Internet. FTP cannot fulfill users' needs in some special situations, so programmers need to define their own file transfer protocols tailored to those users. The method of realization and the application of such a user-defined file transfer protocol are introduced. (authors)
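
The abstract does not give the protocol itself; as a purely illustrative sketch, a minimal user-defined transfer protocol over TCP might frame each file as a length-prefixed name followed by length-prefixed contents (the framing and helper names are assumptions).

```python
import socket
import struct

def send_file(host, port, path):
    """Send one file with a tiny length-prefixed framing: name, then contents."""
    with open(path, "rb") as f:
        payload = f.read()
    name = path.encode()
    with socket.create_connection((host, port)) as sock:
        sock.sendall(struct.pack("!I", len(name)) + name)
        sock.sendall(struct.pack("!Q", len(payload)) + payload)

def recv_file(conn):
    """Counterpart running on the server side for one connection."""
    def read_exact(n):
        buf = b""
        while len(buf) < n:
            chunk = conn.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("peer closed early")
            buf += chunk
        return buf
    name_len, = struct.unpack("!I", read_exact(4))
    name = read_exact(name_len).decode()
    size, = struct.unpack("!Q", read_exact(8))
    data = read_exact(size)
    return name, data
```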

  10. NADAC and MERGE: computer codes for processing neutron activation analysis data

    International Nuclear Information System (INIS)

    Heft, R.E.; Martin, W.E.

    1977-01-01

    Absolute disintegration rates of specific radioactive products induced by neutron irradiation of a sample are determined by spectrometric analysis of gamma-ray emissions. Nuclide identification and quantification are carried out by a complex computer code GAMANAL (described elsewhere). The output of GAMANAL is processed by NADAC, a computer code that converts the data on observed disintegration rates to data on the elemental composition of the original sample. Computations by NADAC are on an absolute basis in that stored nuclear parameters are used rather than the difference between the observed disintegration rate and the rate obtained by concurrent irradiation of elemental standards. The NADAC code provides for the computation of complex cases including those involving interrupted irradiations, parent and daughter decay situations where the daughter may also be produced independently, nuclides with very short half-lives compared to the counting interval, and those involving interference by competing neutron-induced reactions. The NADAC output consists of a printed report, which summarizes analytical results, and a card-image file, which can be used as input to another computer code MERGE. The purpose of MERGE is to combine the results of multiple analyses and produce a single final answer, based on all available information, for each element found
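
For the simplest case (a single reaction, no interruptions, no decay chains), the conversion such a code performs can be sketched with the standard activation relation A = N σ φ (1 − e^(−λ t_irr)); the nuclear data and numbers below are placeholders, not values taken from NADAC.

```python
import math

AVOGADRO = 6.02214076e23

def atoms_from_activity(activity_bq, sigma_cm2, flux, lam, t_irr):
    """Target atoms N implied by an end-of-irradiation activity (simple case:
    single reaction, constant flux, no decay chains)."""
    saturation = 1.0 - math.exp(-lam * t_irr)
    return activity_bq / (sigma_cm2 * flux * saturation)

def grams_from_atoms(n_atoms, atomic_mass, isotopic_abundance):
    """Convert target atoms to grams of the element in the sample."""
    return n_atoms * atomic_mass / (AVOGADRO * isotopic_abundance)

# Placeholder example with Na-23(n,gamma)Na-24-style numbers, all illustrative.
lam = math.log(2) / (14.96 * 3600)             # product half-life ~ 15 h
n = atoms_from_activity(activity_bq=2.0e4, sigma_cm2=0.53e-24,
                        flux=1.0e13, lam=lam, t_irr=3600)
print(f"{grams_from_atoms(n, atomic_mass=22.99, isotopic_abundance=1.0):.3e} g Na")
```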

  11. Improvements of top-of-atmosphere and surface irradiance computations with CALIPSO-, CloudSat-, and MODIS-derived cloud and aerosol properties

    Science.gov (United States)

    Kato, Seiji; Rose, Fred G.; Sun-Mack, Sunny; Miller, Walter F.; Chen, Yan; Rutan, David A.; Stephens, Graeme L.; Loeb, Norman G.; Minnis, Patrick; Wielicki, Bruce A.; Winker, David M.; Charlock, Thomas P.; Stackhouse, Paul W., Jr.; Xu, Kuan-Man; Collins, William D.

    2011-10-01

    One year of instantaneous top-of-atmosphere (TOA) and surface shortwave and longwave irradiances are computed using cloud and aerosol properties derived from instruments on the A-Train Constellation: the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) on the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite, the CloudSat Cloud Profiling Radar (CPR), and the Aqua Moderate Resolution Imaging Spectrometer (MODIS). When modeled irradiances are compared with those computed with cloud properties derived from MODIS radiances by a Clouds and the Earth's Radiant Energy System (CERES) cloud algorithm, the global and annual mean of modeled instantaneous TOA irradiances decreases by 12.5 W m-2 (5.0%) for reflected shortwave and 2.5 W m-2 (1.1%) for longwave irradiances. As a result, the global annual mean of instantaneous TOA irradiances agrees better with CERES-derived irradiances to within 0.5W m-2 (out of 237.8 W m-2) for reflected shortwave and 2.6W m-2 (out of 240.1 W m-2) for longwave irradiances. In addition, the global annual mean of instantaneous surface downward longwave irradiances increases by 3.6 W m-2 (1.0%) when CALIOP- and CPR-derived cloud properties are used. The global annual mean of instantaneous surface downward shortwave irradiances also increases by 8.6 W m-2 (1.6%), indicating that the net surface irradiance increases when CALIOP- and CPR-derived cloud properties are used. Increasing the surface downward longwave irradiance is caused by larger cloud fractions (the global annual mean by 0.11, 0.04 excluding clouds with optical thickness less than 0.3) and lower cloud base heights (the global annual mean by 1.6 km). The increase of the surface downward longwave irradiance in the Arctic exceeds 10 W m-2 (˜4%) in winter because CALIOP and CPR detect more clouds in comparison with the cloud detection by the CERES cloud algorithm during polar night. The global annual mean surface downward longwave irradiance of

  12. Comparative evaluation of effect of rotary and reciprocating single-file systems on pericervical dentin: A cone-beam computed tomography study.

    Science.gov (United States)

    Zinge, Priyanka Ramdas; Patil, Jayaprakash

    2017-01-01

    The aim of this study is to evaluate and compare the effect of one shape, Neolix rotary single-file systems and WaveOne, Reciproc reciprocating single-file systems on pericervical dentin (PCD) using cone-beam computed tomography (CBCT). A total of 40 freshly extracted mandibular premolars were collected and divided into two groups, namely, Group A - Rotary: A1 - Neolix and A2 - OneShape, and Group B - Reciprocating: B1 - WaveOne and B2 - Reciproc. Preoperative scans of each were taken followed by conventional access cavity preparation and working length determination with a 10-K file. Instrumentation of the canal was done according to the respective file system, and postinstrumentation CBCT scans of teeth were obtained. 90 μm thick slices were obtained 4 mm apical and coronal to the cementoenamel junction. The PCD thickness was calculated as the shortest distance from the canal outline to the closest adjacent root surface, which was measured in four surfaces, i.e., facial, lingual, mesial, and distal for all the groups in the two obtained scans. There was no significant difference found between rotary single-file systems and reciprocating single-file systems in their effect on PCD, but in Group B2, there was the most significant loss of tooth structure in the mesial, lingual, and distal surfaces (P < 0.05). The Reciproc file system removes more PCD as compared with the other experimental groups, whereas the Neolix single-file system had the least effect on PCD.

  13. A Centralized Control and Dynamic Dispatch Architecture for File Integrity Analysis

    Directory of Open Access Journals (Sweden)

    Ronald DeMara

    2006-02-01

    The ability to monitor computer file systems for unauthorized changes is a powerful administrative tool. Ideally this task could be performed remotely under the direction of the administrator to allow on-demand checking, and use of tailorable reporting and exception policies targeted to adjustable groups of network elements. This paper introduces M-FICA, a Mobile File Integrity and Consistency Analyzer as a prototype to achieve this capability using mobile agents. The M-FICA file tampering detection approach uses MD5 message digests to identify file changes. Two agent types, Initiator and Examiner, are used to perform file integrity tasks. An Initiator travels to client systems, computes a file digest, then stores those digests in a database file located on write-once media. An Examiner agent computes a new digest to compare with the original digests in the database file. Changes in digest values indicate that the file contents have been modified. The design and evaluation results for a prototype developed in the Concordia agent framework are described.
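
A minimal sketch of the Initiator/Examiner digest comparison, without the mobile-agent machinery, might look like the following; the file list and database format are assumptions.

```python
import hashlib
import json

def md5_of(path, chunk=1 << 20):
    """MD5 digest of a file, read in chunks to keep memory bounded."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

def initiator(paths, db_path):
    """Record baseline digests (in M-FICA this database sits on write-once media)."""
    with open(db_path, "w") as db:
        json.dump({p: md5_of(p) for p in paths}, db, indent=2)

def examiner(db_path):
    """Recompute digests and report files whose contents have changed."""
    with open(db_path) as db:
        baseline = json.load(db)
    return [p for p, digest in baseline.items() if md5_of(p) != digest]
```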

  14. CINDA 83 (1977-1983). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1983-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data exchanged between four regional neutron data centres. The present issue, CINDA 83, is an index to the literature on neutron data published after 1976. The basic volume, CINDA-A, together with the present issue, contains the full CINDA file as of 1 April 1983. A supplement to CINDA 83 is foreseen for fall 1983. Next year's issue, which is envisaged to be published in June 1984, will again cover all relevant literature that has appeared after 1976

  15. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to Read in Computer Aided Design (CAD) Files

    International Nuclear Information System (INIS)

    Randolph Schwarz; Leland L. Carter; Alysia Schwarz

    2005-01-01

    Monte Carlo N-Particle Transport Code (MCNP) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle is internationally recognized as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant was used to enhance the capabilities of the MCNP Visual Editor to allow it to read in both 2D and 3D Computer Aided Design (CAD) files, allowing the user to electronically generate a valid MCNP input geometry

  16. Dynamic Non-Hierarchical File Systems for Exascale Storage

    Energy Technology Data Exchange (ETDEWEB)

    Long, Darrell E. [Univ. of California, Santa Cruz, CA (United States); Miller, Ethan L [Univ. of California, Santa Cruz, CA (United States)

    2015-02-24

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search

  17. Portable File Format (PFF) specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Created at Sandia National Laboratories, the Portable File Format (PFF) allows binary data transfer across computer platforms. Although this capability is supported by many other formats, PFF files are still in use at Sandia, particularly in pulsed power research. This report provides detailed PFF specifications for accessing data without relying on legacy code.

  18. F2AC: A Lightweight, Fine-Grained, and Flexible Access Control Scheme for File Storage in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ren

    2016-01-01

    Current file storage service models for cloud servers assume that users either belong to a single layer with different privileges or cannot authorize privileges iteratively. Thus, the access control is not fine-grained and flexible. Besides, most access control methods at cloud servers mainly rely on computationally intensive cryptographic algorithms and, especially, may not be able to support highly dynamic ad hoc groups with addition and removal of group members. In this paper, we propose a scheme called F2AC, which is a lightweight, fine-grained, and flexible access control scheme for file storage in mobile cloud computing. F2AC can not only achieve iterative authorization, authentication with tailored policies, and access control for dynamically changing accessing groups, but also provide access privilege transition and revocation. A new access control model called directed tree with linked leaf model is proposed for further implementations in data structures and algorithms. The extensive analysis is given for justifying the soundness and completeness of F2AC.

  19. Computational science simulation of laser materials processing and provision of their irradiation conditions

    International Nuclear Information System (INIS)

    Muramatsu, Toshiharu

    2016-01-01

    In laser processing, achieving the intended performance and product requires understanding the complex physical courses, including the melting and solidification phenomena occurring during processing, and thus setting proper laser irradiation conditions. This condition optimization work requires an enormous amount of overhead due to repeated trials, and has become an obstacle to the introduction of laser processing technology into industrial fields oriented toward small-lot production of many products. JAEA has sought to make it possible to quantitatively handle the complex physical course from laser light irradiation of the material until the completion of processing, and is developing the computational science simulation code SPLICE, which connects micro behavior and macro behavior through a multi-level scale model. SPLICE is able to visualize the design space and to reduce the overhead associated with setting laser irradiation conditions, which gives it the prospect of being effective as a front-loading tool. This approach has been confirmed to be effective for the welding and fusing processes. (A.O.)

  20. Computational system to create an entry file for replicating I-125 seeds simulating brachytherapy case studies using the MCNPX code

    Directory of Open Access Journals (Sweden)

    Leonardo da Silva Boia

    2014-03-01

    Purpose: A computational system was developed for this paper in the C++ programming language to create a 125I radioactive seed entry file, based on the positioning of a virtual grid (template) in voxel geometries, with the purpose of performing prostate cancer treatment simulations using the MCNPX code. Methods: The system is fed with information from the planning system with regard to each seed’s location and its depth, and an entry file is automatically created with all the cards (instructions) for each seed regarding their cell blocks and surfaces spread out spatially in the 3D environment. The system provides with precision a reproduction of the clinical scenario for the MCNPX code’s simulation environment, thereby allowing the technique’s in-depth study. Results and Conclusion: In order to validate the computational system, an entry file was created with 88 125I seeds that were inserted in the phantom’s MAX06 prostate region with initial activity determined for the seeds at the 0.27 mCi value. Isodose curves were obtained in all the prostate slices in 5 mm steps in the 7 to 10 cm interval, totaling 7 slices. Variance reduction techniques were applied in order to optimize computational time and to reduce uncertainties, such as photon and electron energy cutoffs at 4 keV and forced collisions regarding cells of interest. Through the acquisition of isodose curves, the results obtained show that hot spots have values above 300 Gy, as anticipated in the literature, stressing the importance of the sources’ correct positioning, which the computational system developed provides, in order not to deliver excessive doses to adjacent organs at risk. The 144 Gy prescription curve showed in the validation process that it perfectly covers a large percentage of the volume, at the same time that it demonstrates a large
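
The authors' C++ system is not reproduced here; a heavily simplified sketch of the card-generation idea, one spherical surface and one cell card per seed position, is shown below, with the material number, density, radius, and card layout as placeholder assumptions.

```python
def seed_cards(positions, radius=0.04, material=5, density=-4.54, first_id=100):
    """Return (cell_cards, surface_cards) strings for a list of seed centres (cm).

    Each seed gets one spherical macrobody surface and one cell referencing it;
    in a real deck these would be merged into the phantom's cell/surface blocks.
    """
    cells, surfaces = [], []
    for i, (x, y, z) in enumerate(positions):
        sid = cid = first_id + i
        surfaces.append(f"{sid}  SPH  {x:.3f} {y:.3f} {z:.3f} {radius:.3f}")
        cells.append(f"{cid}  {material} {density}  -{sid}  imp:p=1")
    return "\n".join(cells), "\n".join(surfaces)

# Three seed positions (cm) taken from a planning grid -- placeholder values.
cells, surfaces = seed_cards([(0.0, 0.0, 7.5), (0.5, 0.0, 7.5), (1.0, 0.5, 8.0)])
print(cells, "\n", surfaces)
```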

  1. Adding Data Management Services to Parallel File Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Scott [Univ. of California, Santa Cruz, CA (United States)

    2015-03-04

    The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit; the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file

  2. Schools (Students) Exchanging CAD/CAM Files over the Internet.

    Science.gov (United States)

    Mahoney, Gary S.; Smallwood, James E.

    This document discusses how students and schools can benefit from exchanging computer-aided design/computer-aided manufacturing (CAD/CAM) files over the Internet, explains how files are exchanged, and examines the problem of selected hardware/software incompatibility. Key terms associated with information search services are defined, and several…

  3. Computerized index for teaching files

    International Nuclear Information System (INIS)

    Bramble, J.M.

    1989-01-01

    A computerized index can be used to retrieve cases from a teaching file that have radiographic findings similar to an unknown case. The probability that a user will review cases with a correct diagnosis was estimated using radiographic findings of arthritis in hand radiographs of 110 cases from a teaching file. The nearest-neighbor classification algorithm was used as a computer index to 110 cases of arthritis. Each case was treated as an unknown and input to the computer index. The accuracy of the computer index in retrieving cases with the same diagnosis (including rheumatoid arthritis, gout, psoriatic arthritis, inflammatory osteoarthritis, and pyrophosphate arthropathy) was measured. A Bayes classifier algorithm was also tested on the same database. Results are presented. The estimated accuracy of the nearest-neighbor algorithm was 83%. By comparison, the estimated accuracy of the Bayes classifier algorithm was 78%. Conclusions: A computerized index to a teaching file based on the nearest-neighbor algorithm should allow the user to review cases with the correct diagnosis of an unknown case, by entering the findings of the unknown case
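
A minimal sketch of nearest-neighbor retrieval over binary finding vectors conveys the idea; the findings, distance metric, and cases below are illustrative, not the study's actual feature set.

```python
def hamming(a, b):
    """Number of positions at which two finding vectors differ."""
    return sum(x != y for x, y in zip(a, b))

def nearest_cases(query, cases, k=3):
    """Return the k teaching-file cases whose finding vectors are closest
    to the unknown case (1 = finding present, 0 = absent)."""
    return sorted(cases, key=lambda c: hamming(query, c["findings"]))[:k]

# Illustrative teaching file: each vector encodes (erosions, osteophytes, ...).
teaching_file = [
    {"diagnosis": "rheumatoid arthritis",        "findings": (1, 0, 1, 1, 0)},
    {"diagnosis": "gout",                        "findings": (1, 0, 0, 0, 1)},
    {"diagnosis": "inflammatory osteoarthritis", "findings": (0, 1, 1, 0, 0)},
]
unknown = (1, 0, 1, 0, 0)
for case in nearest_cases(unknown, teaching_file, k=2):
    print(case["diagnosis"])
```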

  4. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    International Nuclear Information System (INIS)

    Schreiner, Steffen; Banerjee, Subho Sankar; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Bagnasco, Stefano; Zhu Jianlin

    2011-01-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.

  5. Zebra: A striped network file system

    Science.gov (United States)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity update.
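
The parity scheme can be illustrated by XOR-ing equal-sized stripe fragments, as in the sketch below; fragment sizes and layout are assumptions, not Zebra's actual on-disk format.

```python
def parity_fragment(fragments):
    """XOR parity over equal-sized stripe fragments (RAID-4/5 style)."""
    parity = bytearray(len(fragments[0]))
    for frag in fragments:
        for i, byte in enumerate(frag):
            parity[i] ^= byte
    return bytes(parity)

def reconstruct(surviving, parity):
    """Rebuild the one lost fragment from the survivors plus the parity."""
    return parity_fragment(surviving + [parity])

# A client's recent writes split into three fragments bound for three servers.
frags = [b"AAAA", b"BBBB", b"CCCC"]
p = parity_fragment(frags)
assert reconstruct([frags[0], frags[2]], p) == frags[1]   # recover the lost fragment
```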

  6. Cone-beam Computed Tomographic Assessment of Canal Centering Ability and Transportation after Preparation with Twisted File and Bio RaCe Instrumentation.

    Directory of Open Access Journals (Sweden)

    Kiamars Honardar

    2014-08-01

    Use of rotary Nickel-Titanium (NiTi) instruments for endodontic preparation has introduced a new era in endodontic practice, but this issue has undergone dramatic modifications in order to achieve improved shaping abilities. Cone-beam computed tomography (CBCT) has made it possible to accurately evaluate geometrical changes following canal preparation. This study was carried out to compare canal centering ability and transportation of Twisted File and BioRaCe rotary systems by means of cone-beam computed tomography. Thirty root canals from freshly extracted mandibular and maxillary teeth were selected. Teeth were mounted and scanned before and after preparation by CBCT at different apical levels. Specimens were divided into 2 groups of 15. In the first group Twisted File and in the second, BioRaCe was used for canal preparation. Canal transportation and centering ability after preparation were assessed by NNT Viewer and Photoshop CS4 software. Statistical analysis was performed using t-test and two-way ANOVA. All samples showed deviations from the original axes of the canals. No significant differences were detected between the two rotary NiTi instruments for canal centering ability in all sections. Regarding canal transportation however, a significant difference was seen in the BioRaCe group at 7.5 mm from the apex. Under the conditions of this in vitro study, Twisted File and BioRaCe rotary NiTi files retained original canal geometry.

  7. Evaluation of clinical data in childhood asthma. Application of a computer file system

    International Nuclear Information System (INIS)

    Fife, D.; Twarog, F.J.; Geha, R.S.

    1983-01-01

    A computer file system was used in our pediatric allergy clinic to assess the value of chest roentgenograms and hemoglobin determinations used in the examination of patients and to correlate exposure to pets and forced hot air with the severity of asthma. Among 889 children with asthma, 20.7% had abnormal chest roentgenographic findings, excluding hyperinflation and peribronchial thickening, and 0.7% had abnormal hemoglobin values. Environmental exposure to pets or forced hot air was not associated with increased severity of asthma, as assessed by five measures of outcome: number of medications administered, requirement for corticosteroids, frequency of clinic visits, frequency of emergency room visits, and frequency of hospitalizations

  8. Geothermal-energy files in computer storage: sites, cities, and industries

    Energy Technology Data Exchange (ETDEWEB)

    O'Dea, P.L.

    1981-12-01

    The site, city, and industrial files are described. The data presented are from the hydrothermal site file containing about three thousand records which describe some of the principal physical features of hydrothermal resources in the United States. Data elements include: latitude, longitude, township, range, section, surface temperature, subsurface temperature, the field potential, and well depth for commercialization. (MHR)

  9. Developing a fast simulator for irradiated silicon detectors

    CERN Document Server

    Diez Gonzalez-Pardo, Alvaro

    2015-01-01

    Simulation software for irradiated silicon detectors has been developed on the basis of already existing C++ simulation software called TRACS [1]. This software has already proven useful in understanding non-irradiated silicon diodes and microstrips. In addition, a wide variety of user-focused features has been implemented to improve TRACS's flexibility. Such features include an interface to allow any program to leverage TRACS functionalities, a configuration file, and improved documentation.

  10. MR-AFS: a global hierarchical file-system

    International Nuclear Information System (INIS)

    Reuter, H.

    2000-01-01

    The next generation of fusion experiments will use object-oriented technology, creating the need for worldwide sharing of an underlying hierarchical file system. The Andrew File System (AFS) is a well-known and widespread global distributed file system. Multiple-Resident-AFS (MR-AFS) combines the features of AFS with hierarchical storage management systems. Files in MR-AFS may therefore be migrated to secondary storage, such as robotic tape libraries. MR-AFS is in use at IPP for the current experiments and for data originating from supercomputer applications. Experiences and scalability issues are discussed

  11. Access to DIII-D data located in multiple files and multiple locations

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1993-10-01

    The General Atomics DIII-D tokamak fusion experiment is now collecting over 80 MB of data per discharge once every 10 min, and that quantity is expected to double within the next year. The size of the data files, even in compressed format, is becoming increasingly difficult to handle. Data is also being acquired now on a variety of UNIX systems as well as MicroVAX and MODCOMP computer systems. The existing computers collect all the data into a single shot file, and this data collection is taking an ever increasing amount of time as the total quantity of data increases. Data is not available to experimenters until it has been collected into the shot file, which is in conflict with the substantial need for data examination on a timely basis between shots. The experimenters are also spread over many different types of computer systems (possibly located at other sites). To improve data availability and handling, software has been developed to allow individual computer systems to create their own shot files locally. The data interface routine PTDATA that is used to access DIII-D data has been modified so that a user's code on any computer can access data from any computer where that data might be located. This data access is transparent to the user. Breaking up the shot file into separate files in multiple locations also impacts software used for data archiving, data management, and data restoration

  12. Joint evaluated file qualification for thermal neutron reactors

    International Nuclear Information System (INIS)

    Tellier, H.; Van der Gucht, C.; Vanuxeem, J.

    1986-09-01

    The neutron and nuclear data which are needed by reactor physicists to perform core calculations are brought together in the evaluated files. The files are processed to provide multigroup cross sections. The accuracy of the core calculations depends on the initial data, which is sometimes not accurate enough. Therefore the reactor physicists carry out integral experiments. We show, in this paper, how the use of these integral experiments and the application of a tendency research method can improve the accuracy of the neutron data. This technique was applied to the validation of the joint evaluated file. For this purpose, 56 buckling measurements and 42 isotopic analyses of irradiated fuel were used. Small modifications of the initial data are proposed. The final values are compared with recent recommended values or microscopic data. 8 refs.

  13. Joint evaluated file qualification for thermal neutron reactors

    International Nuclear Information System (INIS)

    Tellier, H.; van der Gucht, C.; Vanuxeem, J.

    1986-01-01

    The neutron and nuclear data which are needed by reactor physicists to perform core calculations are brought together in the evaluated files. The files are processes to provide multigroup cross sections. The accuracy of the core calculations depends on the initial data, which is sometimes not accurate enough. Therefore the reactor physicists carry out integral experiments. The authors show, in this paper, how the use of these integral experiments and the application of a tendency research method can improve the accuracy of the neutron data. This technique was applied to the validation of the Joint evaluated file. For this purpose, 56 buckling measurements and 42 isotopic analysis of irradiated fuel were used. Small modifications of the initial data are proposed. The final values are compared with recent recommended values or microscopic data

  14. JNDC FP decay data file

    International Nuclear Information System (INIS)

    Yamamoto, Tohru; Akiyama, Masatsugu

    1981-02-01

    The decay data file for fission product nuclides (FP DECAY DATA FILE) has been prepared for summation calculation of the decay heat of fission products. The average energies released in β- and γ-transitions have been calculated with computer code PROFP. The calculated results and necessary information have been arranged in tabular form together with the estimated results for 470 nuclides of which decay data are not available experimentally. (author)
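
The summation calculation such a file supports evaluates P(t) = Σ_i λ_i N_i(t) (Ē_β,i + Ē_γ,i); the sketch below uses placeholder nuclide data in place of the tabulated JNDC values.

```python
import math

def decay_heat(nuclides, t):
    """Summation decay heat (W) at cooling time t for a fixed initial inventory.

    Each nuclide entry: initial number of atoms N0, decay constant lam (1/s),
    and average beta and gamma energies per decay in MeV (as tabulated in the file).
    """
    mev_to_joule = 1.602176634e-13
    power = 0.0
    for n0, lam, e_beta, e_gamma in nuclides:
        power += lam * n0 * math.exp(-lam * t) * (e_beta + e_gamma) * mev_to_joule
    return power

# Two placeholder fission products; real calculations sum over hundreds of nuclides.
inventory = [
    (1.0e20, math.log(2) / 53.0,  1.31, 2.42),   # short-lived emitter
    (5.0e21, math.log(2) / 2.8e6, 0.60, 0.50),   # longer-lived emitter
]
print(f"{decay_heat(inventory, t=600):.1f} W after 10 minutes of cooling")
```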

  15. e-Learning Course on Food Irradiation

    International Nuclear Information System (INIS)

    Hénon, Yves

    2016-01-01

    Since May 2015, an online, interactive, multi-media and self-study course on Food Irradiation - Technology, Applications and Good Practices has been made available by the Food and Environmental Protection Section. This e-learning Course on Food Irradiation was initiated during a project (RAS/05/057) of the Regional Cooperative Agreement (RCA) Implementing Best Practices of Food Irradiation for Sanitary and Phytosanitary Purposes. Each module contains: • A lesson, largely based on the Manual of Good Practice in Food except for the first part (Food Irradiation) for which expanding the contents and addressing frequently asked questions seemed necessary. The latest chapters will help operators of irradiation facilities to appreciate and improve their practices. • A section called ‘Essentials’ that summarizes the key points. • A quiz to assess the knowledge acquired by the user from the course material. The quiz questions take a variety of forms: answer matching, multiple choice, true or false, picture selection, or simple calculation. Videos, Power Point presentations, pdf files and pictures enrich the contents. The course includes a glossary and approximately 80 downloadable references. These references cover safety of irradiated food, effects of irradiation on the nutritional quality of food, effects of irradiation on food microorganisms, insects and parasites, effects of irradiation on parasites, sanitary and phytosanitary applications of irradiation, packaging of irradiated food, food irradiation standards and regulations, history of food irradiation, and communication aspects.

  16. File-System Workload on a Scientific Multiprocessor

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1995-01-01

    Many scientific applications have intense computational and I/O requirements. Although multiprocessors have permitted astounding increases in computational performance, the formidable I/O needs of these applications cannot be met by current multiprocessors and their I/O subsystems. To prevent I/O subsystems from forever bottlenecking multiprocessors and limiting the range of feasible applications, new I/O subsystems must be designed. The successful design of computer systems (both hardware and software) depends on a thorough understanding of their intended use. A system designer optimizes the policies and mechanisms for the cases expected to be most common in the user's workload. In the case of multiprocessor file systems, however, designers have been forced to build file systems based only on speculation about how they would be used, extrapolating from file-system characterizations of general-purpose workloads on uniprocessor and distributed systems or scientific workloads on vector supercomputers (see sidebar on related work). To help these system designers, in June 1993 we began the Charisma Project, so named because the project sought to characterize I/O in scientific multiprocessor applications from a variety of production parallel computing platforms and sites. The Charisma project is unique in recording individual read and write requests in live, multiprogramming, parallel workloads (rather than from selected or nonparallel applications). In this article, we present the first results from the project: a characterization of the file-system workload on an iPSC/860 multiprocessor running production, parallel scientific applications at NASA's Ames Research Center.

  17. Simulation of gamma-ray irradiation of lettuce leaves in a 137Cs irradiator using MCNP

    International Nuclear Information System (INIS)

    Kim, Jongsoon; Moreira, Rosana G.; Braby, Leslie A.

    2010-01-01

    Ionizing radiation effectively reduces the number of common microbial pathogens in fresh produce. However, the efficacy of the process for pathogens internalized into produce tissue is unknown. The objective of this study was to understand gamma irradiation of lettuce leaf structure exposed in a 137Cs irradiator using MCNP. The simulated 137Cs irradiator is a self-shielded device, and its geometry and sources are described in the MCNP input file. When the irradiation chamber is filled with water, lower doses are found at the center of the irradiation volume and the dose uniformity ratio (maximum dose/minimum dose) is 1.76. For randomly oriented rectangular lettuce leaf segments in the irradiation chamber, the dose uniformity ratio is 1.25. This shows that dose uniformity in the Cs irradiator is strongly dependent on the density of the sample. To understand dose distribution inside the leaf, we divided a lettuce leaf into a low density (flat) region (0.72 g/cm3) and a high density (rib) region (0.86 g/cm3). Calculated doses to the rib are 61% higher than doses to the flat region of the leaf. This indicates that internalized microorganisms can be inactivated more easily than organisms on the surface. This study shows that irradiation can effectively reduce viable microorganisms internalized in lettuce. (author)

  18. Measurement and computation for sag of calandria tube due to irradiation creep in PHWR

    International Nuclear Information System (INIS)

    Son, S. M.; Lee, W. R.; Lee, S. K.; Lee, J. S.; Kim, T. R.; Na, B. K.; Namgung I.

    2003-01-01

    Calandria tubes and Liquid Injection Shutdown System (LISS) tubes in a Pressurized Heavy Water Reactor (PHWR) sag due to irradiation creep and growth during plant operation. As the sag of a calandria tube becomes larger, the tube may come into contact with the LISS tube crossing beneath it. Such contact may subsequently damage the calandria tube, resulting in an unplanned outage of the plant. It is therefore necessary to check the gap between the two tubes periodically during the plant life, with a proper measuring method, in order to confirm that there is no contact. An ultrasonic gap measuring probe assembly which can be inserted into two viewing ports of the calandria was developed in Korea and utilized to measure the sags of both tubes in the PHWR. It was found that the centerlines of calandria tubes and liquid injection shutdown system tubes can be precisely detected by ultrasonic waves. The gaps between the two tubes were easily obtained from the relative distance between the measured centerline elevations of the tubes. Based on the irradiation creep equation and the measurement data, a computer program to calculate the sags was also developed. With the computer program, the sag at the end of plant life was predicted

  19. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography -An In Vitro Study.

    Science.gov (United States)

    Dhingra, Annil; Ruhal, Nidhi; Miglani, Anjali

    2015-04-01

    Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is of the same importance; otherwise, canal aberrations such as transportation will be created. The purpose of this study is to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation on first mandibular molars using cone beam computed tomography. Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial. Teeth were prepared using one rotary and two reciprocating single-file systems. Teeth were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were done and evaluated for three parameters: canal transportation, cervical dentinal thickness and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. The change in cross-sectional area after filing showed significant difference at 0mm, 1mm, 2mm and 7mm (pfile system over a distance of 7 mm (starting from 0mm and then evaluation at 1mm, 2mm, 3mm, 5mm and 7mm), the results showed a significant difference among the file systems at various lengths (p= 0.014, 0.046, 0.004, 0.028, 0.005 & 0.029 respectively). The mean value of cervical dentin removal is maximum at all levels for OneShape and minimum for WaveOne, showing the better quality of WaveOne and Reciproc over the OneShape file system. A significant difference was found at 9mm, 11mm and 12mm between all three file systems (p<0.001, <0.001, <0.001). It was concluded that reciprocating motion is better than rotary motion in all three parameters: canal transportation, cross-sectional area and cervical dentinal thickness.

  20. Data formats design of laser irradiation experiments in view of data analysis

    International Nuclear Information System (INIS)

    Su Chunxiao; Yu Xiaoqi; Yang Cunbang; Guo Su; Chen Hongsu

    2002-01-01

    The design rules for new data file formats for laser irradiation experiments are introduced. Object-oriented programs were designed for studying experimental data from the laser facilities. The new-format data files combine the experimental data with the diagnostic configuration data and are used in data processing and analysis. The editing of diagnostic configuration data in the data acquisition program is also described

  1. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  2. Activation cross section data file, (1)

    International Nuclear Information System (INIS)

    Yamamuro, Nobuhiro; Iijima, Shungo.

    1989-09-01

    To evaluate radioisotope production due to neutron irradiation in fission or fusion reactors, data for the activation cross sections must be provided. The plan is to file more than 2000 activation cross sections in the final version. In the current year, the neutron cross sections for 14 elements from Ni to W have been calculated and evaluated in the energy range 10 -5 to 20 MeV. The calculations with the simplified-input nuclear cross section calculation system SINCROS are described, and another method of evaluation, consistent with JENDL-3, is also mentioned. The results of the cross section calculations are in good agreement with experimental data, and they were stored in Files 8, 9 and 10 of the ENDF/B format. (author)
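
    As an illustration of how such a file might be inspected, the sketch below scans an ENDF-formatted tape and counts sections belonging to Files (MF) 8, 9 and 10, assuming the standard ENDF-6 80-column layout in which columns 67-70 hold MAT, 71-72 hold MF and 73-75 hold MT. The filename is hypothetical, and the snippet is not part of the SINCROS system.

```python
# Minimal sketch of scanning an ENDF-formatted tape for File (MF) 8, 9 and 10
# sections, assuming the standard ENDF-6 80-column layout: columns 67-70 = MAT,
# 71-72 = MF, 73-75 = MT.  The filename is hypothetical.
from collections import Counter

sections = Counter()
with open("activation.endf") as tape:
    for line in tape:
        if len(line) < 75:
            continue
        mat = line[66:70].strip()
        mf = line[70:72].strip()
        mt = line[72:75].strip()
        if mf in ("8", "9", "10") and mt not in ("", "0"):
            sections[(mat, mf, mt)] += 1

print(f"{len(sections)} distinct (MAT, MF, MT) activation sections found")
```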

  3. X-ray luminescence computed tomography imaging via multiple intensity weighted narrow beam irradiation

    Science.gov (United States)

    Feng, Bo; Gao, Feng; Zhao, Huijuan; Zhang, Limin; Li, Jiao; Zhou, Zhongxing

    2018-02-01

    The purpose of this work is to introduce and study a novel x-ray beam irradiation pattern for X-ray Luminescence Computed Tomography (XLCT), termed multiple intensity-weighted narrow-beam irradiation. The proposed XLCT imaging method is studied through simulations of x-ray and diffuse light propagation. The emitted optical photons from x-ray-excitable nanophosphors were collected by optical fiber bundles from the right-side surface of the phantom. The image reconstruction is based on simulated measurements from 6 or 12 angular projections with 3- or 5-beam scanning modes. The proposed XLCT imaging method is compared against constant-intensity-weighted narrow-beam XLCT. From the reconstructed XLCT images, we found that the Dice similarity and the quantitative ratio of the targets show a certain degree of improvement. The results demonstrated that the proposed method can offer simultaneously high image quality and fast image acquisition.
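
    The Dice similarity quoted above compares a reconstructed target region with the true one. A minimal sketch of that metric follows, using illustrative masks rather than actual XLCT reconstructions.

```python
import numpy as np

def dice_similarity(recon_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Dice coefficient 2|A and B| / (|A| + |B|) between two boolean target masks."""
    intersection = np.logical_and(recon_mask, true_mask).sum()
    return 2.0 * intersection / (recon_mask.sum() + true_mask.sum())

# Illustrative masks standing in for a reconstructed and a ground-truth target.
truth = np.zeros((64, 64), dtype=bool); truth[20:30, 20:30] = True
recon = np.zeros((64, 64), dtype=bool); recon[22:32, 21:31] = True
print(f"Dice similarity: {dice_similarity(recon, truth):.3f}")
```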

  4. Methods and Algorithms for Detecting Objects in Video Files

    Directory of Open Access Journals (Sweden)

    Nguyen The Cuong

    2018-01-01

    Video files store motion pictures and sounds as they occur in real life. In today's world, the need for automated processing of information in video files is increasing. Automated processing of information has a wide range of applications including office/home surveillance cameras, traffic control, sports applications, remote object detection, and others. In particular, detection and tracking of object movement in video files play an important role. This article describes methods of detecting objects in video files. Today, this problem in the field of computer vision is being studied worldwide.
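
    As a concrete illustration, the sketch below shows one common baseline for detecting moving objects in a video file (background subtraction followed by contour extraction) using OpenCV; the input filename is hypothetical, and this is not necessarily one of the methods surveyed in the article.

```python
import cv2

# One common baseline for detecting moving objects in a video file:
# MOG2 background subtraction followed by contour extraction.
# "traffic.avi" is a hypothetical input file.
cap = cv2.VideoCapture("traffic.avi")
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove small noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
    # boxes now holds (x, y, w, h) rectangles around detected moving objects.

cap.release()
```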

  5. The file of evaluated decay data in ENDF/B

    International Nuclear Information System (INIS)

    Reich, C.W.

    1991-01-01

    One important application of nuclear decay data is the Evaluated Nuclear Data File/B (ENDF/B), the base of evaluated nuclear data used in reactor research and technology activities within the United States. The decay data in the Activation File (158 nuclides) and the Actinide File (108 nuclides) excellently represent the current status of this information. In particular, the half-lives and gamma and alpha emission probabilities, quantities that are so important for many applications, of the actinide nuclides represent a significant improvement over those in ENDF/B-V because of the inclusion of data produced by an International Atomic Energy Agency Coordinated Research Program. The Fission Product File contains experimental decay data on ∼510 nuclides, which is essentially all for which a meaningful number of data are available. For the first time, delayed-neutron spectra for the precursor nuclides are included. Some hint of problems in the fission product data base is provided by the gamma decay heat following a burst irradiation of 239 Pu

  6. Recalling ISX shot data files from the off-line archive

    International Nuclear Information System (INIS)

    Stanton, J.S.

    1981-02-01

    This document describes a set of computer programs designed to allow access to ISX shot data files stored on off-line disk packs. The programs accept user requests for data files and build a queue of pending requests. When an operator is available to mount the necessary disk packs, the system copies the requested files to an on-line disk area. The program runs on the Fusion Energy Division's DECsystem-10 computer. The request queue is implemented under the System 1022 data base management system. The support programs are coded in MACRO-10 and FORTRAN-10

  7. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate a prototype study showing that HDF4 file content maps can be used to efficiently organize data in a cloud object storage system to facilitate cloud computing. This approach can be extended to any binary data format and to any existing big data analytics solution powered by cloud computing, because the HDF4 file content map project started as a long-term preservation effort for NASA data that does not require HDF4 APIs to access the data.

  8. 29 CFR 4000.28 - What if I send a computer disk?

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false What if I send a computer disk? 4000.28 Section 4000.28... I send a computer disk? (a) In general. We determine your filing or issuance date for a computer... paragraph (b) of this section. (1) Filings. For computer-disk filings, we may treat your submission as...

  9. Gamma-Weighted Discrete Ordinate Two-Stream Approximation for Computation of Domain Averaged Solar Irradiance

    Science.gov (United States)

    Kato, S.; Smith, G. L.; Barker, H. W.

    2001-01-01

    An algorithm is developed for the gamma-weighted discrete ordinate two-stream approximation that computes profiles of domain-averaged shortwave irradiances for horizontally inhomogeneous cloudy atmospheres. The algorithm assumes that frequency distributions of cloud optical depth at unresolved scales can be represented by a gamma distribution though it neglects net horizontal transport of radiation. This algorithm is an alternative to the one used in earlier studies that adopted the adding method. At present, only overcast cloudy layers are permitted.
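
    A sketch of the gamma weighting described above, in notation of my own choosing: the domain-averaged irradiance profile is the two-stream solution averaged over a gamma distribution of unresolved cloud optical depth with domain-mean optical depth tau-bar and shape parameter nu,

```latex
\overline{F}(z) \;=\; \int_0^{\infty} F_{2s}(z;\tau)\, p(\tau)\, \mathrm{d}\tau,
\qquad
p(\tau) \;=\; \frac{1}{\Gamma(\nu)}
\left(\frac{\nu}{\overline{\tau}}\right)^{\nu}
\tau^{\nu-1}
\exp\!\left(-\frac{\nu\,\tau}{\overline{\tau}}\right),
```

    where F_2s(z; tau) is the two-stream irradiance profile for a column of optical depth tau. Net horizontal transport between columns is neglected, consistent with the assumption stated in the abstract.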

  10. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  11. Sonographic determination of the irradiated pulmonary volume in case of irradiation of the thoracic wall

    International Nuclear Information System (INIS)

    Wittich, G.; Hohenberg, G.; Seitz, W.; Vienna Univ.

    1983-01-01

    In order to determine the irradiated pulmonary volume, comparative examinations by sonography and computed tomography were made in ten patients submitted to postoperative radiotherapy for mammary carcinoma. The physical and anatomical conditions of sonographic volumetry are discussed. In all cases irradiated with tangential contralateral fields, the irradiated pulmonary volume was less than 200 ccm (118 ccm on average). The sonographic results did not differ essentially from those of computed tomography, so that the sonographic examination can be offered as a simple and sufficiently precise method of documentation within the frame of individual therapy planning. (orig.) [de

  12. File access prediction using neural networks.

    Science.gov (United States)

    Patra, Prashanta Kumar; Sahu, Muktikanta; Mohapatra, Subasish; Samantray, Ronak Kumar

    2010-06-01

    One of the most vexing issues in the design of a high-speed computer is the wide gap in access times between the memory and the disk. To solve this problem, static file access predictors have been used. In this paper, we propose dynamic file access predictors using neural networks to significantly improve upon the accuracy, success-per-reference, and effective-success-rate-per-reference by using a neural-network-based file access predictor with proper tuning. In particular, we verified that the incorrect prediction rate is reduced from 53.11% to 43.63% for the proposed neural network prediction method with a standard configuration, compared with the recent popularity (RP) method. With manual tuning for each trace, we are able to improve upon the misprediction rate and effective-success-rate-per-reference obtained with a standard configuration. Simulations on distributed file system (DFS) traces reveal that an exact-fit radial basis function (RBF) network gives better prediction in high-end systems, whereas a multilayer perceptron (MLP) trained with Levenberg-Marquardt (LM) backpropagation outperforms it in systems having good computational capability. Probabilistic and competitive predictors are the most suitable for workstations having limited resources, and the former predictor is more efficient than the latter for servers having the most system calls. Finally, we conclude that the MLP with the LM backpropagation algorithm has a better success rate of file prediction than simple perceptron, last successor, stable successor, and best k out of m predictors.
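
    The following is a minimal sketch of the general idea of a neural-network file access predictor: predict the next file from the last k accesses. It uses scikit-learn's MLPClassifier on a synthetic trace, so it is only illustrative; the paper's Levenberg-Marquardt-trained networks and real DFS traces are not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy access trace: file IDs in the order they were opened (a synthetic
# stand-in for a DFS trace).  Each training sample is the last k accesses;
# the label is the file accessed next.
trace = [0, 1, 2, 0, 1, 2, 3, 0, 1, 2, 0, 1, 2, 3] * 20
k = 3
X = np.array([trace[i:i + k] for i in range(len(trace) - k)])
y = np.array([trace[i + k] for i in range(len(trace) - k)])

# Note: scikit-learn's MLP trains with Adam/L-BFGS rather than the
# Levenberg-Marquardt algorithm used in the paper; this only illustrates
# the prediction setup.
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:-20], y[:-20])
print("held-out prediction accuracy:", model.score(X[-20:], y[-20:]))
```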

  13. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

    Most software related to file systems is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. The "pcircle" software builds on top of the ubiquitous MPI in cluster computing environments and a "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, as well as integrity checking.
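
    A much-simplified sketch of parallel file checksumming with MPI is shown below, using mpi4py and a static round-robin partition of the file list across ranks (pcircle itself uses dynamic work-stealing, which is not reproduced here); the directory path is hypothetical.

```python
# Simplified sketch of parallel file checksumming with mpi4py.  Files are
# divided statically by rank, unlike pcircle's dynamic work-stealing.
import hashlib
import os
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# "/scratch/dataset" is a hypothetical directory tree to checksum.
all_files = sorted(
    os.path.join(root, name)
    for root, _, names in os.walk("/scratch/dataset")
    for name in names
)
my_files = all_files[rank::size]          # round-robin static partition

local = {}
for path in my_files:
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    local[path] = h.hexdigest()

gathered = comm.gather(local, root=0)     # collect per-rank results on rank 0
if rank == 0:
    checksums = {k: v for d in gathered for k, v in d.items()}
    print(f"checksummed {len(checksums)} files")
```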

  14. Micro computed tomography evaluation of the Self-adjusting file and ProTaper Universal system on curved mandibular molars.

    Science.gov (United States)

    Serefoglu, Burcu; Piskin, Beyser

    2017-09-26

    The aim of this investigation was to compare the cleaning and shaping efficiency of the Self-adjusting File and ProTaper, and to assess the correlation between root canal curvature and working time in mandibular molars using micro-computed tomography. Twenty extracted mandibular molars were instrumented with ProTaper and the Self-adjusting File, and the total working time was measured in the mesial canals. The changes in canal volume, surface area and structure model index, transportation, uninstrumented area and the correlation between working time and curvature were analyzed. Although no statistically significant difference was observed between the two systems in the distal canals (p>0.05), a significantly higher amount of removed dentin volume and a lower uninstrumented area were provided by ProTaper in the mesial canals (p<0.0001). A correlation between working time and canal curvature was also observed in the mesial canals for both groups (SAF r 2 =0.792, p<0.0004; PTU r 2 =0.9098, p<0.0001).

  15. Selection of irradiator for potato preservation

    Energy Technology Data Exchange (ETDEWEB)

    Kinsara, A R; Melaibari, A G; Abulfaraj, W H; Mamoon, A M; Kamal, S E [Nuclear Engineering Department, Faculty of Engineering, King Abdulaziz University P.O.Box 9027, Jeddah-21413, (Saudi Arabia)

    1997-12-31

    A formal decision methodology is a sound approach for assisting in the decision making needed for the selection of irradiators for potato preservation. A formal analysis is preferred over an informal intuitive analysis, which has limitations. It focuses on the substantial issues and provides the basis for a compromise between conflicting objectives. All critical issues in the selection of irradiators for potato preservation can be addressed within the decision analysis framework. Of special significance is the treatment of the uncertainty associated with the consequences of a decision and the preferences of the experts. Decision theory is employed in providing a strategy for implementation of the irradiator selection for food preservation in Saudi Arabia. To select a suitable decision methodology for the present case, a detailed survey of available decision methods was conducted. These methods have been developed and applied with varying degrees of success to many diverse areas of interest. Based on the detailed survey, the Analytic Hierarchy Process (AHP) was selected to evaluate the various irradiators for potato irradiation: electron accelerators, X-ray irradiators, and gamma irradiators. The purpose was to determine the optimal irradiator. A set of factors impacting irradiator selection was developed and defined to provide comprehensive and realistic variables for judging the irradiator alternatives. The factors developed are economic considerations, technical considerations, safety aspects, and compatibility with the local environment. An AHP computer program was developed to automate the tedious computations involved in implementing the AHP systematic procedure. The program was developed using FOXPRO. Based upon the available data, and employing the AHP computer program, the results show the superiority of the {sup 60} Co gamma-ray irradiator over other irradiators for Saudi Arabia's present circumstances. 2 figs., 7 tabs.
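
    The core AHP step, deriving a priority vector from a pairwise comparison matrix via its principal eigenvector, can be sketched as follows. The Saaty-scale judgments in the matrix are made up for illustration and are not the study's data.

```python
import numpy as np

# Illustrative AHP pairwise comparison of three irradiator options
# (gamma, electron accelerator, X-ray) on a single criterion.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
weights = principal / principal.sum()          # priority vector

lambda_max = eigvals.real.max()
n = A.shape[0]
ci = (lambda_max - n) / (n - 1)                # consistency index
print("priorities:", np.round(weights, 3), " CI:", round(ci, 3))
```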

  16. Dosimetry computer module of the gamma irradiator of ININ; Modulo informatico de dosimetria del irradiador gamma del ININ

    Energy Technology Data Exchange (ETDEWEB)

    Ledezma F, L. E.; Baldomero J, R. [ININ, Gerencia de Sistemas Informaticos, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Agis E, K. A., E-mail: luis.ledezma@inin.gob.mx [Universidad Autonoma del Estado de Mexico, Facultad de Ingenieria, Cerro de Coatepec s/n, Ciudad Universitaria, 50100 Toluca, Estado de Mexico (Mexico)

    2012-10-15

    This work presents the technical specifications for the upgrade of the dosimetry module of the computer system of the gamma irradiator of the Instituto Nacional de Investigaciones Nucleares (ININ), which allows the integration and consultation of industrial dosimetry information under a client-server scheme. (Author)

  17. Globus File Transfer Services | High-Performance Computing | NREL

    Science.gov (United States)

    Globus software must be installed on the systems at both ends of the data transfer. The NREL endpoint is nrel#globus. Click Login on the Globus web site. On the login page select "Globus ID" as the login method and click Login. From the Manage Data drop-down menu, select Transfer Files. Then click Get

  18. Maternal irradiation and Down Syndrome

    International Nuclear Information System (INIS)

    Gibson, D.L.; Uh, S.H.; Miller, J.R.

    1978-04-01

    The role of preconception irradiation in the etiology of Down Syndrome was examined using the techniques of record linkage. Although 909 cases of Down Syndrome, born in B.C. between 1952-70, were ascertained through a system of linked vital and health registrations, interest was restricted to the 348 case/control pairs born in the greater Vancouver area. The maternal identifying information routinely recorded on birth and ill-health registrations was used to link 155 Down Syndrome mothers and 116 control mothers to patient files at the Vancouver General Hospital. Only 28 of the case and 25 of the control mothers were subjected to diagnostic irradiation at the Vancouver General Hospital. The difference was not significant at the 5% level

  19. 75 FR 30839 - Privacy Act of 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer...

    Science.gov (United States)

    2010-06-02

    ... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS... Services (CMS). ACTION: Notice of renewal of an existing computer matching program (CMP) that has an...'' section below for comment period. DATES: Effective Dates: CMS filed a report of the Computer Matching...

  20. Cloud object store for archive storage of high performance computing data using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  1. A data compression algorithm for nuclear spectrum files

    International Nuclear Information System (INIS)

    Mika, J.F.; Martin, L.J.; Johnston, P.N.

    1990-01-01

    The total space occupied by computer files of spectra generated in nuclear spectroscopy systems can lead to problems of storage space and transmission time. An algorithm is presented which significantly reduces the space required to store nuclear spectra, without loss of any information content. Testing indicates that spectrum files can be routinely compressed by a factor of 5. (orig.)
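
    The abstract does not describe the algorithm itself, so the sketch below shows only one simple lossless scheme of the same flavor (delta-encode the channel counts, then apply DEFLATE) as an illustration; it is not the paper's method, and the spectrum is synthetic.

```python
import struct
import zlib
import numpy as np

# One simple lossless approach (not necessarily the paper's algorithm):
# delta-encode the channel counts of a spectrum, then apply DEFLATE.
spectrum = np.random.poisson(lam=50, size=4096).astype(np.int64)   # synthetic spectrum

deltas = np.diff(spectrum, prepend=0)
packed = b"".join(struct.pack("<q", int(d)) for d in deltas)
compressed = zlib.compress(packed, level=9)

# Lossless round trip: decompress, undo the delta encoding, compare.
restored = np.cumsum(np.frombuffer(zlib.decompress(compressed), dtype="<i8"))
assert np.array_equal(restored, spectrum)
print(f"raw {spectrum.nbytes} bytes -> compressed {len(compressed)} bytes")
```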

  2. A computer program to calculate nuclide yields in complex decay chain for selection of optimum irradiation and cooling condition

    International Nuclear Information System (INIS)

    Takeda, Tsuneo

    1977-11-01

    This report is prepared as a user's input manual for the computer code CODAC-No.5 and provides a general description of the code and instructions for its use. The code is a modified version of the CODAC-No.4 code. The code is capable of calculating radioactive nuclide yields in any given complex decay and activation chain, independent of irradiation history. In this code, eighteen kinds of tables and graphs can be prepared for output. They are available for the selection of optimum irradiation and cooling conditions and for other purposes related to irradiation and cooling. For example, the ratio of a nuclide yield to the total nuclide yield as a function of irradiation and cooling times is obtained. Several kinds of complex equations are included in these outputs. This code has almost the same input forms as the CODAC-No.4 code, except for the input of irradiation history data. The input method and formats used for this code are very simple for all kinds of nuclear data. A list of FORTRAN statements, examples of input data and output results, and a list of input parameters and their definitions are given in this report. (auth.)
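
    The abstract does not reproduce the code's equations; as a standard reference point for this kind of calculation, the classical Bateman solution gives the population of the n-th member of a linear decay chain during cooling, assuming all decay constants lambda_i are distinct:

```latex
N_n(t) \;=\; N_1(0)\,
\left(\prod_{i=1}^{n-1}\lambda_i\right)
\sum_{i=1}^{n}
\frac{e^{-\lambda_i t}}
     {\prod_{\substack{j=1 \\ j\neq i}}^{n}\left(\lambda_j-\lambda_i\right)}
```

    For n = 1 this reduces to simple exponential decay, N_1(t) = N_1(0) e^{-lambda_1 t}; irradiation (production) terms, as handled by CODAC, are not shown here.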

  3. The crystallographic information file (CIF): A new standard archive file for crystallography

    International Nuclear Information System (INIS)

    Hall, S.R.; Allen, F.H.; Brown, I.D.

    1991-01-01

    The specification of a new standard Crystallographic Information File (CIF) is described. Its development is based on the Self-Defining Text Archive and Retrieval (STAR) procedure. The CIF is a general, flexible and easily extensible free-format archive file; it is human and machine readable and can be edited by a simple editor. The CIF is designed for the electronic transmission of crystallographic data between individual laboratories, journals and databases; it has been adopted by the International Union of Crystallography as the recommended medium for this purpose. The file consists of data names and data items, together with a loop facility for repeated items. The data names, constructed hierarchically so as to form data categories, are self-descriptive within a 32-character limit. The sorted list of data names, together with their precise definitions, constitutes the CIF dictionary (core version 1991). The CIF core dictionary is presented in full and covers the fundamental and most commonly used data items relevant to crystal structure analysis. The dictionary is also available as an electronic file suitable for CIF computer applications. Future extensions to the dictionary will include data items used in more specialized areas of crystallography. (orig.)

  4. An information retrieval system for research file data

    Science.gov (United States)

    Joan E. Lengel; John W. Koning

    1978-01-01

    Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed....

  5. The global unified parallel file system (GUPFS) project: FY 2002 activities and results

    Energy Technology Data Exchange (ETDEWEB)

    Butler, Gregory F.; Lee, Rei Chi; Welcome, Michael L.

    2003-04-07

    The Global Unified Parallel File System (GUPFS) project is a multiple-phase, five-year project at the National Energy Research Scientific Computing (NERSC) Center to provide a scalable, high performance, high bandwidth, shared file system for all the NERSC production computing and support systems. The primary purpose of the GUPFS project is to make it easier to conduct advanced scientific research using the NERSC systems. This is to be accomplished through the use of a shared file system providing a unified file namespace, operating on consolidated shared storage that is directly accessed by all the NERSC production computing and support systems. During its first year, FY 2002, the GUPFS project focused on identifying, testing, and evaluating existing and emerging shared/cluster file system, SAN fabric, and storage technologies; identifying NERSC user input/output (I/O) requirements, methods, and mechanisms; and developing appropriate benchmarking methodologies and benchmark codes for a parallel environment. This report presents the activities and progress of the GUPFS project during its first year, the results of the evaluations conducted, and plans for near-term and longer-term investigations.

  6. Alloys under irradiation

    International Nuclear Information System (INIS)

    Martin, G.; Bellon, P.; Soisson, F.

    1997-01-01

    During the last two decades, some effort has been devoted to establishing a phenomenology for alloys under irradiation. Theoretically, the effects of the defect supersaturation, sustained defect fluxes and ballistic mixing on solid solubility under irradiation can now be formulated in a unified manner, at least for the most simple cases: coherent phase transformations and nearest-neighbor ballistic jumps. Even under such restrictive conditions, several intriguing features documented experimentally can be rationalized, sometimes in a quantitative manner and simple qualitative rules for alloy stability as a function of irradiation conditions can be formulated. A quasi-thermodynamic formalism can be proposed for alloys under irradiation. However, this point of view has limits illustrated by recent computer simulations. (orig.)

  7. LASIP-III, a generalized processor for standard interface files

    International Nuclear Information System (INIS)

    Bosler, G.E.; O'Dell, R.D.; Resnik, W.M.

    1976-03-01

    The LASIP-III code was developed for processing Version III standard interface data files which have been specified by the Committee on Computer Code Coordination. This processor performs two distinct tasks, namely, transforming free-field format, BCD data into well-defined binary files and providing for printing and punching data in the binary files. While LASIP-III is exported as a complete free-standing code package, techniques are described for easily separating the processor into two modules, viz., one for creating the binary files and one for printing the files. The two modules can be separated into free-standing codes or they can be incorporated into other codes. Also, the LASIP-III code can be easily expanded for processing additional files, and procedures are described for such an expansion. 2 figures, 8 tables

  8. CryptoCache: A Secure Sharable File Cache for Roaming Users

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2000-01-01

    Small mobile computers are now sufficiently powerful to run many applications, but storage capacity remains limited so working files cannot be cached or stored locally. Even if files can be stored locally, the mobile device is not powerful enough to act as server in collaborations with other users. Conventional distributed file systems cache everything locally or not at all; there is no possibility to cache files on nearby nodes. In this paper we present the design of a secure cache system called CryptoCache that allows roaming users to cache files on untrusted file hosting servers. The system allows flexible sharing of cached files among unauthenticated users, i.e. unlike most distributed file systems CryptoCache does not require a global authentication framework. Files are encrypted when they are transferred over the network and while stored on untrusted servers. The system uses public key ...

  9. GEODOC: the GRID document file, record structure and data element description

    Energy Technology Data Exchange (ETDEWEB)

    Trippe, T.; White, V.; Henderson, F.; Phillips, S.

    1975-11-06

    The purpose of this report is to describe the information structure of the GEODOC file. GEODOC is a computer based file which contains the descriptive cataloging and indexing information for all documents processed by the National Geothermal Information Resource Group. This file (along with other GRID files) is managed by DBMS, the Berkeley Data Base Management System. Input for the system is prepared using the IRATE Text Editing System with its extended (12 bit) character set, or punched cards.

  10. Virus Alert: Ten Steps to Safe Computing.

    Science.gov (United States)

    Gunter, Glenda A.

    1997-01-01

    Discusses computer viruses and explains how to detect them; discusses virus protection and the need to update antivirus software; and offers 10 safe computing tips, including scanning floppy disks and commercial software, how to safely download files from the Internet, avoiding pirated software copies, and backing up files. (LRW)

  11. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study.

    Science.gov (United States)

    Agarwal, Rolly S; Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-05-01

    The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20(o) to 35(o) were divided into three groups of 20 samples each: ProTaper PT (group I) - full-sequence rotary control group, OneShape OS (group II) - single file continuous rotation, WaveOne WO - single file reciprocal motion (group III). Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3mm, 6mm and 9mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p >0.05) at both 3mm and 6mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. It was concluded that there was a minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to full sequence
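
    For illustration, the widely used formulae of Gambill et al. for canal transportation and centering ratio from pre- and post-instrumentation dentin thickness measurements can be coded as below; the variable names and sample measurements are mine, not the study's data.

```python
def transportation(x1: float, x2: float, y1: float, y2: float) -> float:
    """Canal transportation per Gambill et al.: |(x1 - x2) - (y1 - y2)|,
    where x1/x2 are the mesial dentin thickness before/after instrumentation
    and y1/y2 the distal thickness before/after (all in mm)."""
    return abs((x1 - x2) - (y1 - y2))

def centering_ratio(x1: float, x2: float, y1: float, y2: float) -> float:
    """Centering ratio: the smaller of (x1 - x2) and (y1 - y2) divided by the
    larger; 1.0 means the file stayed perfectly centered."""
    a, b = x1 - x2, y1 - y2
    if max(a, b) == 0:
        return 1.0
    return min(a, b) / max(a, b)

# Illustrative measurements (mm) at a single CBCT cross-section.
print(transportation(1.10, 0.95, 1.05, 0.98))                 # 0.08 mm
print(round(centering_ratio(1.10, 0.95, 1.05, 0.98), 2))      # about 0.47
```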

  12. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study

    Science.gov (United States)

    Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-01-01

    Background The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. Aim The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Materials and Methods Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20o to 35o were divided into three groups of 20 samples each: ProTaper PT (group I) - full-sequence rotary control group, OneShape OS (group II) - single file continuous rotation, WaveOne WO - single file reciprocal motion (group III). Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3mm, 6mm and 9mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. Results It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p >0.05) at both 3mm and 6mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18) while the differences between OS and WO were not statistically significant Conclusion It was concluded that there was a minor difference between the tested groups. Single file systems demonstrated average canal

  13. ChemEngine: harvesting 3D chemical structures of supplementary data from PDF files.

    Science.gov (United States)

    Karthikeyan, Muthukumarasamy; Vyas, Renu

    2016-01-01

    Digital access to chemical journals resulted in a vast array of molecular information that is now available in the supplementary material files in PDF format. However, extracting this molecular information, generally from a PDF document format, is a daunting task. Here we present an approach to harvest 3D molecular data from the supporting information of scientific research articles that are normally available from publisher's resources. In order to demonstrate the feasibility of extracting truly computable molecules from PDF file formats in a fast and efficient manner, we have developed a Java-based application, namely ChemEngine. This program recognizes textual patterns from the supplementary data and generates standard molecular structure data (bond matrix, atomic coordinates) that can be subjected to a multitude of computational processes automatically. The methodology has been demonstrated via several case studies on different formats of coordinate data stored in supplementary information files, wherein ChemEngine selectively harvested the atomic coordinates and interpreted them as molecules with high accuracy. The reusability of extracted molecular coordinate data was demonstrated by computing Single Point Energies that were in close agreement with the original computed data provided with the articles. It is envisaged that the methodology will enable large scale conversion of molecular information from supplementary files available in the PDF format into a collection of ready-to-compute molecular data to create an automated workflow for advanced computational processes. Software along with source codes and instructions is available at https://sourceforge.net/projects/chemengine/files/?source=navbar

  14. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography –An In Vitro Study

    Science.gov (United States)

    Dhingra, Annil; Miglani, Anjali

    2015-01-01

    Background Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is of the same importance; otherwise, canal aberrations such as transportation will be created. Aim The purpose of this study is to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation on first mandibular molars using cone beam computed tomography. Materials and Methods Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial. Teeth were prepared using one rotary and two reciprocating single-file systems. Teeth were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were done and evaluated for three parameters: canal transportation, cervical dentinal thickness and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. Results The change in cross-sectional area after filing showed significant difference at 0mm, 1mm, 2mm and 7mm (pfile system over a distance of 7 mm (starting from 0mm and then evaluation at 1mm, 2mm, 3mm, 5mm and 7mm), the results showed a significant difference among the file systems at various lengths (p= 0.014, 0.046, 0.004, 0.028, 0.005 & 0.029 respectively). The mean value of cervical dentin removal is maximum at all levels for OneShape and minimum for WaveOne, showing the better quality of WaveOne and Reciproc over the OneShape file system. A significant difference was found at 9mm, 11mm and 12mm between all three file systems (p<0.001, <0.001, <0.001). Conclusion It was concluded that reciprocating motion is better than rotary motion in all three parameters: canal transportation, cross-sectional area and cervical dentinal thickness. PMID:26023639

  15. Development of data base on food irradiation

    International Nuclear Information System (INIS)

    Ito, Hitoshi; Kume, Tamikazu; Hashimoto, Shoji; Izumi, Fumio.

    1995-12-01

    For the exact understanding of food irradiation in Japan, it is important to provide information on food irradiation to consumers, industries and government offices. However, much of the information on food irradiation is restricted to a few experts or institutes related to this field. For this reason, a data base on food irradiation has been completed together with the systems necessary for inputting the data using a computer. In this data base, about 630 data entries with full reports were input into the computer in the fields of wholesomeness studies, irradiation effects on food, radiation engineering, detection methods for irradiated food, and Q and A on food irradiation for easy understanding. Many of these data are input in the Japanese language. Some English reports on wholesomeness studies are also included, which were mainly obtained from international projects on food irradiation. Many of the data on food irradiation are relevant to the fields of food science, dietetics, microbiology, radiation biology, molecular biology, medical science, agricultural science, radiation chemistry, radiation engineering and so on. The data base on food irradiation contains many useful data which can be applied to many other fields of radiation processing, not only food irradiation but also sterilization of medical equipment, upgrading of agricultural wastes and others. (author)

  16. Development of data base on food irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Ito, Hitoshi; Kume, Tamikazu; Hashimoto, Shoji [Japan Atomic Energy Research Inst., Takasaki, Gunma (Japan). Takasaki Radiation Chemistry Research Establishment; Izumi, Fumio

    1995-12-01

    For the exact understanding of food irradiation in Japan, it is important to provide information on food irradiation to consumers, industries and government offices. However, much of the information on food irradiation is restricted to a few experts or institutes related to this field. For this reason, a data base on food irradiation has been completed together with the systems necessary for inputting the data using a computer. In this data base, about 630 data entries with full reports were input into the computer in the fields of wholesomeness studies, irradiation effects on food, radiation engineering, detection methods for irradiated food, and Q and A on food irradiation for easy understanding. Many of these data are input in the Japanese language. Some English reports on wholesomeness studies are also included, which were mainly obtained from international projects on food irradiation. Many of the data on food irradiation are relevant to the fields of food science, dietetics, microbiology, radiation biology, molecular biology, medical science, agricultural science, radiation chemistry, radiation engineering and so on. The data base on food irradiation contains many useful data which can be applied to many other fields of radiation processing, not only food irradiation but also sterilization of medical equipment, upgrading of agricultural wastes and others. (author).

  17. RRB / SSI Interface Checkwriting Integrated Computer Operation Extract File (CHICO)

    Data.gov (United States)

    Social Security Administration — This monthly file provides SSA with information about benefit payments made to railroad retirement beneficiaries. SSA uses this data to verify Supplemental Security...

  18. Analysis of the behavior under irradiation of high burnup nuclear fuels with the computer programs FRAPCON and FRAPTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Reis, Regis; Silva, Antonio Teixeira e, E-mail: teixeira@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2017-11-01

    The objective of this paper is to verify the validity and accuracy of the results provided by the computer programs FRAPCON-3.4a and FRAPTRAN-1.4, used to simulate the irradiation behavior of Pressurized Water Reactor (PWR) fuel rods under steady-state and transient operational conditions at high burnup. To achieve this goal, the results provided by these computer simulations are compared with experimental data available in the FUMEX III database. The results show that the computer programs have a good ability to predict the operational behavior of PWR fuel rods under high-burnup steady-state conditions and under the influence of a Reactivity Initiated Accident (RIA). (author)

  19. Visualization of biomedical image data and irradiation planning using a parallel computing system

    International Nuclear Information System (INIS)

    Lehrig, R.

    1991-01-01

    The contribution explains the development of a novel, low-cost workstation for the processing of biomedical tomographic data sequences. The workstation was to allow both graphical display of the data and implementation of modelling software for irradiation planning, especially for calculation of dose distributions on the basis of the measured tomogram data. The system developed according to these criteria is a parallel computing system which performs secondary, two-dimensional image reconstructions irrespective of the imaging direction of the original tomographic scans. Three-dimensional image reconstructions can be generated from any direction of view, with random selection of sections of the scanned object. (orig./MM) With 69 figs., 2 tabs [de

  20. The Improvement and Performance of Mobile Environment Using Both Cloud and Text Computing

    OpenAIRE

    S.Saravana Kumar; J.Lakshmi Priya; P.Hannah Jennifer; N.Jeff Monica; Fathima

    2013-01-01

    This research paper presents a design model for a file sharing system for ubiquitous mobile devices using both cloud and text computing. File sharing is one of the rationales for computer networks, with increasing demand for file sharing applications and technologies in small and large enterprise networks and on the Internet. File transfer is an important process in any form of computing, as we need to share data across. ...

  1. Dynamic file-access characteristics of a production parallel scientific workload

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1994-01-01

    Multiprocessors have permitted astounding increases in computational performance, but many cannot meet the intense I/O requirements of some scientific applications. An important component of any solution to this I/O bottleneck is a parallel file system that can provide high-bandwidth access to tremendous amounts of data in parallel to hundreds or thousands of processors. Most successful systems are based on a solid understanding of the expected workload, but thus far there have been no comprehensive workload characterizations of multiprocessor file systems. This paper presents the results of a three week tracing study in which all file-related activity on a massively parallel computer was recorded. Our instrumentation differs from previous efforts in that it collects information about every I/O request and about the mix of jobs running in a production environment. We also present the results of a trace-driven caching simulation and recommendations for designers of multiprocessor file systems.

  2. Tabulation of Fundamental Assembly Heat and Radiation Source Files

    International Nuclear Information System (INIS)

    T. deBues; J.C. Ryman

    2006-01-01

    The purpose of this calculation is to tabulate a set of computer files for use as input to the WPLOAD thermal loading software. These files contain details regarding heat and radiation from pressurized water reactor (PWR) assemblies and boiling water reactor (BWR) assemblies. The scope of this calculation is limited to rearranging and reducing the existing file information into a more streamlined set of tables for use as input to WPLOAD. The electronic source term files used as input to this calculation were generated from the output files of the SAS2H/ORIGEN-S sequence of the SCALE Version 4.3 modular code system, as documented in References 2.1.1 and 2.1.2, and are included in Attachment II

  3. Preparation of functions of computer code GENGTC and improvement for two-dimensional heat transfer calculations for irradiation capsules

    International Nuclear Information System (INIS)

    Nomura, Yasushi; Someya, Hiroyuki; Ito, Haruhiko.

    1992-11-01

    Capsules for irradiation tests in the JMTR (Japan Materials Testing Reactor) consist of irradiation specimens surrounded by a cladding tube, holders, an inner tube and a container tube (from 30mm to 65mm in diameter), and the annular gaps between these structural materials in the capsule are filled with liquids or gases. Cooling of the capsule is provided by the reactor primary coolant flowing down outside the capsule. Most of the heat generated by fission in fuel specimens and gamma absorption in structural materials is directed radially to the capsule container outer surface. In thermal performance calculations for capsule design, a one(r)-dimensional heat transfer computer code entitled GENGTC (Generalized Gap Temperature Calculation), originally developed at Oak Ridge National Laboratory, U.S.A., has been frequently used. In designing a capsule, many cases of parametric calculations are needed with respect to changes in materials and gap sizes, and in some cases two(r,z)-dimensional heat transfer calculations are needed for irradiation test capsules with short-length fuel rods. Recently the authors improved the original one-dimensional code GENGTC, (1) to simplify preparation of input data, (2) to perform automatic calculations for parametric surveys based on design temperatures, etc. Moreover, the computer code has been improved to perform r-z two-dimensional heat transfer calculations. This report describes the preparation of the one-dimensional code GENGTC and the improvement to the two-dimensional code GENGTC-2, together with their code manuals. (author)
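
    As a minimal example of the kind of one(r)-dimensional calculation such a code performs, steady radial conduction across one concentric gas gap gives the temperature drop shown below; the linear heat rate, radii and gas conductivity are illustrative, not values from an actual JMTR capsule design.

```python
import math

def annular_gap_dT(q_lin: float, r_in: float, r_out: float, k: float) -> float:
    """Temperature drop (K) across a concentric annular layer by steady radial
    conduction: dT = q' * ln(r_out / r_in) / (2 * pi * k), where q' is the
    linear heat rate in W/m and k the layer conductivity in W/(m K)."""
    return q_lin * math.log(r_out / r_in) / (2.0 * math.pi * k)

# Illustrative numbers: 20 kW/m linear heat rate across a 0.1 mm gas gap
# at about 10 mm radius, with a conductivity typical of hot helium.
print(round(annular_gap_dT(q_lin=20_000.0, r_in=0.0100, r_out=0.0101, k=0.3), 1), "K")
```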

  4. Reoxygenation of hypoxic cells by tumor shrinkage during irradiation. A computer simulation

    International Nuclear Information System (INIS)

    Kocher, M.; Treuer, H.

    1995-01-01

    A 3-dimensional computer simulation was developed in order to estimate the impact of tumor shrinkage on reoxygenation of chronic hypoxic tumor cells during a full course of fractionated irradiation. The growth of a small tumor situated in a vascularized stroma with 350 capillary cross-sections/mm 3 which were displaced by the growing tumor was simulated. Tumors contained 10 4 cells when irradiation started, intrinsic radiosensitivity was set to either low (α=0.3 Gy -1 , β=0.03 Gy -2 ) or high (α=0.4 Gy -1 , β=0.04 Gy -2 ) values. Oxygen enhancement ratio was 3.0, potential tumor doubling time T pot =1, 2 or 5 days. A simulated fractionated radiotherapy was carried out with daily fractions of 2.0 Gy, total dose 50 to 70 Gy. The presence or absence of factors preventing tumor cord shrinkage was also included. During the growth phase, all tumors developed a necrotic core with a hypoxic cell fraction of 25% under these conditions. During irradiation, the slower growing tumors (T pot =2 to 5 days) showed complete reoxygenation of the hypoxic cells after 30 to 40 Gy independent from radiosensitivity, undisturbed tumor shrinkage provided. If shrinkage was prevented, the hypoxic fraction rose to 100% after 30 to 50 Gy. Local tumor control, defined as the destruction of all clonogenic and hypoxic tumor cells increased by 20 to 100% due to reoxygenation and 50 Gy were enough in order to sterilize the tumors in these cases. In the fast growing tumors (T pot =1 day), reoxygenation was only observed in the case of high radiosensitivity and undisturbed tumor shrinkage. In these tumors reoxygenation increased the control rates by up to 60%. (orig./MG) [de
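
    The survival-model ingredients quoted above (a linear-quadratic response with the oxygen enhancement ratio acting as a dose-modifying factor for hypoxic cells) can be written down as a small sketch; this reproduces only the cell-survival piece, not the 3-dimensional shrinkage and reoxygenation simulation itself.

```python
import math

def surviving_fraction(dose_per_fx: float, n_fx: int, alpha: float, beta: float,
                       oer: float = 1.0) -> float:
    """Linear-quadratic survival after n_fx fractions, with the oxygen
    enhancement ratio applied as a dose-modifying factor for hypoxic cells."""
    d = dose_per_fx / oer
    return math.exp(-n_fx * (alpha * d + beta * d * d))

# Parameter values quoted in the abstract: alpha = 0.3 /Gy, beta = 0.03 /Gy^2,
# OER = 3.0, 2 Gy per fraction; 25 fractions correspond to 50 Gy.
oxic = surviving_fraction(2.0, 25, 0.3, 0.03)           # well-oxygenated cells
hypoxic = surviving_fraction(2.0, 25, 0.3, 0.03, 3.0)   # chronically hypoxic cells
print(f"oxic SF: {oxic:.2e}, hypoxic SF: {hypoxic:.2e}")
```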

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  6. The design and development of GRASS file reservation system

    International Nuclear Information System (INIS)

    Huang Qiulan; Zhu Suijiang; Cheng Yaodong; Chen Gang

    2010-01-01

    GFRS (GRASS File Reservation System) is designed to improve the file access performance of GRASS (Grid-enabled Advanced Storage System), a Hierarchical Storage Management (HSM) system developed at the Computing Center, Institute of High Energy Physics. GRASS provides massive storage management and data migration, but its data migration policy is simply based on factors such as pool water level, the intervals for migration and so on, so it lacks precise control over files. To address this, we designed GFRS to implement user-based file reservation, which reserves and keeps the required files on disk for high energy physicists. GFRS can improve file access speed for users by avoiding migrating frequently accessed files to tapes. In this paper we first give a brief introduction to the GRASS system and then the detailed architecture and implementation of GFRS. Experimental results from GFRS have shown good performance and a simple analysis is made based on them. (authors)

  7. User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

    CERN Document Server

    Wiley, R A

    1977-01-01

    User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

  8. Present status and prospects of food irradiation in France

    International Nuclear Information System (INIS)

    Laizier, J.

    1985-09-01

    Following the conclusions of the JEFCI (Joint FAO/IAEA/WHO Expert Committee on the wholesomeness of Irradiated Food), the CEA (French Atomic Energy Commission) was required by the regulatory committees, in 1981-82, to present a white book on the wholesomeness of irradiated foods. Following the approval of this white book it was decided not to modify hastily the current regulation of 1970, but to lighten the content of the file of request for authorization by removing the part related to toxicological evidence. This liberalization of procedure has encouraged industrial projects. A large effort of development has been initiated and taken in charge locally. The CEA has very important responsibilities in this national effort to develop food irradiation. Three new designs for gamma industrial irradiators were recently developed in France specifically with food irradiation in view, besides the other more conventional designs of carrier and pallet irradiators, already available and well known. The presently available accelerators are not well suited to food irradiation; the penetration of the electrons they produce is not high enough for food products. A French company, CGR-MeV, recently developed a linear accelerator of 10 MeV and 10 kW, which appears very attractive

  9. Variations of dose distribution in high energy electron beams as a function of geometrical parameters of irradiation. Application to computer calculation

    International Nuclear Information System (INIS)

    Villeret, O.

    1985-04-01

    An algorithm is developed for the purpose of computer treatment planning of electron therapy. The method uses experimental absorbed dose distribution data in the irradiated medium for electron beams in the 8-20 MeV range delivered by the Sagittaire linear accelerator (study of central axis depth dose and beam profiles) in various geometrical conditions. Experimental verification of the computer program showed agreement within 2% between dose measurement and computer calculation [fr

  10. New developments in file-based infrastructure for ATLAS event selection

    Energy Technology Data Exchange (ETDEWEB)

    Gemmeren, P van; Malon, D M [Argonne National Laboratory, Argonne, Illinois 60439 (United States); Nowak, M, E-mail: gemmeren@anl.go [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States)

    2010-04-01

    In ATLAS software, TAGs are event metadata records that can be stored in various technologies, including ROOT files and relational databases. TAGs are used to identify and extract events that satisfy certain selection predicates, which can be coded as SQL-style queries. TAG collection files support in-file metadata to store information describing all events in the collection. Event Selector functionality has been augmented to provide such collection-level metadata to subsequent algorithms. The ATLAS I/O framework has been extended to allow computational processing of TAG attributes to select or reject events without reading the event data. This capability enables physicists to use more detailed selection criteria than are feasible in an SQL query. For example, the TAGs contain enough information not only to check the number of electrons, but also to calculate their distance to the closest jet, a calculation that would be difficult to express in SQL. Another new development allows ATLAS to write TAGs directly into event data files. This feature can improve performance by supporting advanced event selection capabilities, including computational processing of TAG information, without the need for external TAG files or database access.
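
    A minimal sketch of the kind of TAG-level computational selection described above; the attribute names and record layout are illustrative assumptions, not the actual ATLAS TAG schema. Each TAG record carries per-event summary attributes, and a predicate can reject events before any event data is read.

        import math

        # Hypothetical TAG records: per-event summary attributes only.
        tags = [
            {"run": 167776, "event": 101, "n_electron": 2,
             "ele_eta": [0.4, -1.1], "ele_phi": [1.2, 2.9],
             "jet_eta": [0.5, -2.0], "jet_phi": [1.0, -0.3]},
            # ... one record per event ...
        ]

        def min_dr_to_jet(eta, phi, jet_eta, jet_phi):
            """Smallest delta-R between one electron and any jet in the event."""
            return min(math.hypot(eta - je, math.remainder(phi - jp, 2 * math.pi))
                       for je, jp in zip(jet_eta, jet_phi))

        def select(tag):
            """Selection hard to express in plain SQL: at least two electrons,
            each separated from the closest jet by delta-R > 0.4."""
            if tag["n_electron"] < 2:
                return False
            return all(min_dr_to_jet(e, p, tag["jet_eta"], tag["jet_phi"]) > 0.4
                       for e, p in zip(tag["ele_eta"], tag["ele_phi"]))

        accepted = [(t["run"], t["event"]) for t in tags if select(t)]
        print(accepted)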

  11. Prefetching in file systems for MIMD multiprocessors

    Science.gov (United States)

    Kotz, David F.; Ellis, Carla Schlatter

    1990-01-01

    The question of whether prefetching blocks of a file into the block cache can effectively reduce overall execution time of a parallel computation, even under favorable assumptions, is considered. Experiments have been conducted with an interleaved file system testbed on the Butterfly Plus multiprocessor. Results of these experiments suggest that (1) the hit ratio, the accepted measure in traditional caching studies, may not be an adequate measure of performance when the workload consists of parallel computations and parallel file access patterns, (2) caching with prefetching can significantly improve the hit ratio and the average time to perform an I/O (input/output) operation, and (3) an improvement in overall execution time has been observed in most cases. In spite of these gains, prefetching sometimes results in increased execution times (a negative result, given the optimistic nature of the study). The authors explore why it is not trivial to translate savings on individual I/O requests into consistently better overall performance and identify the key problems that need to be addressed in order to improve the potential of prefetching techniques in the environment.
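
    A toy illustration of the trade-off discussed above: a tiny block-cache simulation with one-block-ahead sequential prefetching. The cache size, policy and access stream are assumptions for illustration, not the testbed configuration used in the paper.

        from collections import OrderedDict

        def hit_ratio(access_stream, cache_blocks=8, prefetch=True):
            """Hit ratio of an LRU block cache, optionally prefetching block+1."""
            cache, hits = OrderedDict(), 0

            def touch(block):
                cache[block] = True
                cache.move_to_end(block)
                while len(cache) > cache_blocks:
                    cache.popitem(last=False)      # evict least-recently-used block

            for block in access_stream:
                if block in cache:
                    hits += 1
                touch(block)
                if prefetch:
                    touch(block + 1)               # fetch the next block early
            return hits / len(access_stream)

        stream = list(range(100))                  # a purely sequential read pattern
        print(hit_ratio(stream, prefetch=False), hit_ratio(stream, prefetch=True))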

  12. Utilization of irradiation on food preservation

    International Nuclear Information System (INIS)

    Cho, Han Ok; Kwon, Joong Ho; Byun, Myung Woo

    1985-04-01

    The number of total viable bacteria in chicken meat was reduced by over 90% with irradiation treatments of 5-10 kGy, and the counts of yeasts, molds, coliforms and especially Salmonella were also reduced during 2-4 weeks of storage. In the physicochemical properties of stored chicken, such as water holding capacity, TBA number, UBN, odor, color, overall appearance, cooking quality and organoleptic characteristics, the irradiated samples were superior to the nonirradiated samples, so the freshness of irradiated chicken was retained until 30 days after storage at 3-4degC. Commercial fried fish paste was contaminated with 2.2x10^3 counts of total viable bacteria, 2.8x10^2 counts of yeasts and molds, and 1.0x10^2 counts of coliforms per gram of sample, but an irradiation treatment of more than 3 kGy could reduce the microbial load by 80-90%. As the storage period increased, the chemical components of the irradiated samples were better preserved than those of the nonirradiated samples, and the shelf-life of the irradiated groups was extended by 3-4 times as compared with that of the nonirradiated groups at room (10-20degC) and low (3-4degC) temperatures, without apparent changes in organoleptic properties. Some packaged dried fishes, such as dried cod, dried squid, dried filefish and dried pollack, were preserved by irradiation under room conditions. After one year of storage, the samples irradiated with doses of 3-8 kGy were found to be marketable based on organoleptic observations, without showing any storage loss due to microbial or insect factors. (Author)

  13. [Comparison of effectiveness and safety between Twisted File technique and ProTaper Universal rotary full sequence based on micro-computed tomography].

    Science.gov (United States)

    Chen, Xiao-bo; Chen, Chen; Liang, Yu-hong

    2016-02-18

    To evaluate the efficacy and safety of two types of rotary nickel-titanium systems (Twisted File and ProTaper Universal) for root canal preparation based on micro-computed tomography (micro-CT). Twenty extracted molars (including 62 canals) were divided into two experimental groups and were respectively instrumented using the Twisted File rotary nickel-titanium system (TF) and the ProTaper Universal rotary nickel-titanium system (PU) to #25/0.08 following the recommended protocols. Time for root canal instrumentation (accumulation of time for every single file) was recorded. The 0-3 mm root surface from the apex was observed under an optical stereomicroscope at 25 × magnification. The presence of crack lines was noted. The root canals were scanned with micro-CT before and after root canal preparation. Three-dimensional shape images of canals were reconstructed, calculated and evaluated. The amount of canal central transportation of the two groups was calculated and compared. A shorter preparation time [(0.53 ± 0.14) min] was observed in the TF group, while the preparation time of the PU group was (2.06 ± 0.39) min (P<0.05). The canal central transportation of the TF group was also smaller than that of the PU group [vs. (0.097 ± 0.084) mm, P<0.05]. No instrument separation was observed in either group. Cracks were not found in either group, based on micro-CT images or observation under an optical stereomicroscope at 25 × magnification. Compared with ProTaper Universal, Twisted File took less time in root canal preparation and exhibited better shaping ability and less canal transportation.

  14. The global unified parallel file system (GUPFS) project: FY 2003 activities and results

    Energy Technology Data Exchange (ETDEWEB)

    Butler, Gregory F.; Baird, William P.; Lee, Rei C.; Tull, Craig E.; Welcome, Michael L.; Whitney, Cary L.

    2004-04-30

    The Global Unified Parallel File System (GUPFS) project is a multiple-phase project at the National Energy Research Scientific Computing (NERSC) Center whose goal is to provide a scalable, high-performance, high-bandwidth, shared file system for all of the NERSC production computing and support systems. The primary purpose of the GUPFS project is to make the scientific users more productive as they conduct advanced scientific research at NERSC by simplifying the scientists' data management tasks and maximizing storage and data availability. This is to be accomplished through the use of a shared file system providing a unified file namespace, operating on consolidated shared storage that is accessible by all the NERSC production computing and support systems. In order to successfully deploy a scalable high-performance shared file system with consolidated disk storage, three major emerging technologies must be brought together: (1) shared/cluster file systems software, (2) cost-effective, high-performance storage area network (SAN) fabrics, and (3) high-performance storage devices. Although they are evolving rapidly, these emerging technologies individually are not targeted towards the needs of scientific high-performance computing (HPC). The GUPFS project is in the process of assessing these emerging technologies to determine the best combination of solutions for a center-wide shared file system, to encourage the development of these technologies in directions needed for HPC, particularly at NERSC, and to then put them into service. With the development of an evaluation methodology and benchmark suites, and with the updating of the GUPFS testbed system, the project did a substantial number of investigations and evaluations during FY 2003. The investigations and evaluations involved many vendors and products. From our evaluation of these products, we have found that most vendors and many of the products are more focused on the commercial market. Most vendors

  15. Detection Of Alterations In Audio Files Using Spectrograph Analysis

    Directory of Open Access Journals (Sweden)

    Anandha Krishnan G

    2015-08-01

    Full Text Available The corresponding study was carried out to detect changes in audio files using a spectrograph. An audio file format is a file format for storing digital audio data on a computer system. A sound spectrograph is a laboratory instrument that displays a graphical representation of the strengths of the various component frequencies of a sound as time passes. The objectives of the study were to find the changes in the spectrograph of audio files after altering them, to compare the altered spectrographs with those of the original files, and to check for similarities and differences between MP3 and WAV formats. Five different alterations were carried out on each audio file to analyze the differences between the original and the altered file. To alter an MP3 or WAV audio file by cut-copy, the file was opened in Audacity and a different audio segment was pasted into the file. This new file was analyzed to view the differences. The noise was reduced by adjusting the necessary parameters, and the differences between the new file and the original file were analyzed. By adjusting the parameters in the dialog box, the necessary changes were made. The edited audio file was opened in the software named Spek, where, after analysis, a graph of that particular file is obtained and saved for further analysis. The graph of the original audio was combined with the graph of the edited audio file to see the alterations.
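
    A minimal sketch of a spectrogram-based comparison of an original and an edited recording, using the scipy and numpy libraries; the file names, window length and difference measure are illustrative assumptions, not the procedure used in the study.

        import numpy as np
        from scipy.io import wavfile
        from scipy.signal import spectrogram

        def spectro_db(path):
            """Return the spectrogram (in dB) of a WAV file, mixed down to mono."""
            rate, samples = wavfile.read(path)
            if samples.ndim > 1:
                samples = samples.mean(axis=1)
            freqs, times, sxx = spectrogram(samples, fs=rate, nperseg=1024)
            return freqs, times, 10.0 * np.log10(sxx + 1e-12)

        _, _, original = spectro_db("original.wav")   # placeholder file names
        _, _, edited = spectro_db("edited.wav")

        # Crude difference measure over the overlapping time range.
        n = min(original.shape[1], edited.shape[1])
        print("mean spectral difference (dB):",
              np.abs(original[:, :n] - edited[:, :n]).mean())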

  16. NASA work unit system file maintenance manual

    Science.gov (United States)

    1972-01-01

    The NASA Work Unit System is a management information system for research tasks (i.e., work units) performed under NASA grants and contracts. It supplies profiles on research efforts and statistics on fund distribution. The file maintenance operator can add, delete and change records at a remote terminal or can submit punched cards to the computer room for batch update. The system is designed for file maintenance by a person with little or no knowledge of data processing techniques.

  17. Neutronic Modelling in Support of the Irradiation Programmes

    International Nuclear Information System (INIS)

    Koonen, E.

    2005-01-01

    Irradiation experiments are generally conducted to determine some specific characteristics of the concerned fuels and structural materials under well defined irradiation conditions. For the determination of the latter the BR2 division has an autonomous reactor physics cell and has implemented the required computational tools. The major tool used is a three-dimensional full-scale Monte Carlo model of the BR2 reactor developed under MCNP-4C for the simulation of irradiation conditions. The objectives of work performed by SCK-CEN are to evaluate and adjust irradiation conditions by adjustments of the environment, differential rod positions, axial and azimuthal positioning of the samples, global power level, ...; to deliver reliable, well defined irradiation condition and fluence data during and after irradiation; to assist the designer of new irradiation devices by simulations and neutronic optimisations of design options; to provide computational support to related projects as a way to valorise the capabilities that the BR2 reactor can offer

  18. Standard interface files and procedures for reactor physics codes. Version IV

    International Nuclear Information System (INIS)

    O'Dell, R.D.

    1977-09-01

    Standards, procedures, and recommendations of the Committee on Computer Code Coordination for promoting the exchange of reactor physics codes are updated to Version IV status. Standards and procedures covering general programming, program structure, standard interface files, and file management and handling subroutines are included

  19. Remote file inquiry (RFI) system

    Science.gov (United States)

    1975-01-01

    System interrogates and maintains user-definable data files from remote terminals, using English-like, free-form query language easily learned by persons not proficient in computer programming. System operates in asynchronous mode, allowing any number of inquiries within limitation of available core to be active concurrently.

  20. Evaluation of the Self-Adjusting File system (SAF) for the instrumentation of primary molar root canals: a micro-computed tomographic study.

    Science.gov (United States)

    Kaya, E; Elbay, M; Yiğit, D

    2017-06-01

    The Self-Adjusting File (SAF) system has been recommended for use in permanent teeth since it offers more conservative and effective root-canal preparation when compared to traditional rotary systems. However, no study had evaluated the usage of SAF in primary teeth. The aim of this study was to evaluate and compare the use of SAF, K file (manual instrumentation) and Profile (traditional rotary instrumentation) systems for primary-tooth root-canal preparation in terms of instrumentation time and amounts of dentin removed using micro-computed tomography (μCT) technology. Study Design: The study was conducted with 60 human primary mandibular second molar teeth divided into 3 groups according to instrumentation technique: Group I: SAF (n=20); Group II: K file (n=20); Group III: Profile (n=20). Teeth were embedded in acrylic blocks and scanned with a μCT scanner prior to instrumentation. All distal root canals were prepared up to size 30 for K file, .04/30 for Profile and 2 mm thickness, size 25 for SAF; instrumentation time was recorded for each tooth, and a second μCT scan was performed after instrumentation was complete. Amounts of dentin removed were measured using the three-dimensional images by calculating the difference in root-canal volume before and after preparation. Data were statistically analysed using the Kolmogorov-Smirnov and Kruskal-Wallis tests. Manual instrumentation (K file) resulted in significantly more dentin removal when compared to rotary instrumentation (Profile and SAF), while the SAF system generated significantly less dentin removal than both manual instrumentation (K file) and traditional rotary instrumentation (Profile) (p<0.05). Within the experimental conditions of the present study, the SAF seems to be a useful system for root-canal instrumentation in primary molars because it removed less dentin than other systems, which is especially important for the relatively thin-walled canals of primary teeth, and because it involves less

  1. Nuclear plant fire incident data file

    International Nuclear Information System (INIS)

    Sideris, A.G.; Hockenbury, R.W.; Yeater, M.L.; Vesely, W.E.

    1979-01-01

    A computerized nuclear plant fire incident data file was developed by American Nuclear Insurers and was further analyzed by Rensselaer Polytechnic Institute with technical and monetary support provided by the Nuclear Regulatory Commission. Data on 214 fires that occurred at nuclear facilities have been entered in the file. A computer program has been developed to sort the fire incidents according to various parameters. The parametric sorts that are presented in this article are significant since they are the most comprehensive statistics presently available on fires that have occurred at nuclear facilities

  2. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    Science.gov (United States)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.

  3. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    Activity-based computing (ABC) is a computing paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  4. Nuclear irradiation parameters of beryllium under fusion, fission and IFMIF irradiation conditions

    International Nuclear Information System (INIS)

    Fischer, U.; Chen, Y.; Leichtle, D.; Simakov, S.; Moeslang, A.; Vladimirov, P.

    2004-01-01

    A computational analysis is presented of the nuclear irradiation parameters for Beryllium under irradiation in typical neutron environments of fission and fusion reactors, and of the presently designed intense fusion neutron source IFMIF. The analysis shows that dpa and Tritium production rates at fusion-relevant levels can be achieved with existing high flux fission reactors while the achievable Helium production is too low. The resulting He/Tritium and He/dpa ratios do not meet typical fusion irradiation conditions. Irradiation simulations in the medium flux test modules of the IFMIF neutron source facility were shown to be more suitable to match fusion-typical irradiation conditions. To achieve sufficiently high production rates it is suggested to remove the creep-fatigue testing machine together with the W spectra shifter plate and move the tritium release module upstream towards the high flux test module. (author)

  5. Radiology Teaching Files on the Internet

    International Nuclear Information System (INIS)

    Lim, Eun Chung; Kim, Eun Kyung

    1996-01-01

    There is increasing attention to radiology teaching files on the Internet in the field of diagnostic radiology. The purpose of this study was to aid in the creation of a new radiology teaching file by analysing the present radiology teaching file sites on the Internet from many aspects and evaluating the images on those sites, using a Macintosh IIci computer, a 28.8 kbps TelePort Fax/Modem, and Netscape Navigator 2.0 software. The results were as follows: 1. Analysis of radiology teaching file sites: (1) Country distribution was highest for the USA (57.5%). (2) The average number of cases was 186, and radiology teaching file sites with a search engine numbered 9 sites (22.5%). (3) Regarding the method of case arrangement, the anatomic area type and the diagnosis type were each found at 10 sites (25%), and the question-and-answer type was found at 9 sites (22.5%). (4) Radiology teaching file sites covering oro-maxillofacial disorders numbered 9 sites (22.5%). (5) Regarding image format, the GIF format was found at 14 sites (35%) and the JPEG format at 14 sites (35%). (6) The most common year of creation was 1995 (43.7%). (7) Continuing case upload was found at 35 sites (87.5%). 2. Evaluation of images on the radiology teaching files: (1) The average file size of the GIF format (71 Kbyte) was greater than that of the JPEG format (24 Kbyte) (P<0.001). (2) The image quality of the GIF format was better than that of the JPEG format (P<0.001).

  6. Lessons Learned in Deploying the World s Largest Scale Lustre File System

    Energy Technology Data Exchange (ETDEWEB)

    Dillow, David A [ORNL; Fuller, Douglas [ORNL; Wang, Feiyi [ORNL; Oral, H Sarp [ORNL; Zhang, Zhe [ORNL; Hill, Jason J [ORNL; Shipman, Galen M [ORNL

    2010-01-01

    The Spider system at the Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) is the world's largest scale Lustre parallel file system. Envisioned as a shared parallel file system capable of delivering both the bandwidth and capacity requirements of the OLCF's diverse computational environment, the project had a number of ambitious goals. To support the workloads of the OLCF's diverse computational platforms, the aggregate performance and storage capacity of Spider exceed that of our previously deployed systems by a factor of 6x - 240 GB/sec, and 17x - 10 Petabytes, respectively. Furthermore, Spider supports over 26,000 clients concurrently accessing the file system, which exceeds our previously deployed systems by nearly 4x. In addition to these scalability challenges, moving to a center-wide shared file system required dramatically improved resiliency and fault-tolerance mechanisms. This paper details our efforts in designing, deploying, and operating Spider. Through a phased approach of research and development, prototyping, deployment, and transition to operations, this work has resulted in a number of insights into large-scale parallel file system architectures, from both the design and the operational perspectives. We present in this paper our solutions to issues such as network congestion, performance baselining and evaluation, file system journaling overheads, and high availability in a system with tens of thousands of components. We also discuss areas of continued challenges, such as stressed metadata performance and the need for file system quality of service, along with our efforts to address them. Finally, operational aspects of managing a system of this scale are discussed along with real-world data and observations.

  7. Central axis dose verification in patients treated with total body irradiation of photons using a Computed Radiography system

    International Nuclear Information System (INIS)

    Rubio Rivero, A.; Caballero Pinelo, R.; Gonzalez Perez, Y.

    2015-01-01

    To propose and evaluate a method for central axis dose verification in patients treated with total body irradiation (TBI) with photons, using images obtained through a Computed Radiography (CR) system. Computed Radiography (Fuji) portal imaging cassette readings were used and correlated with the absorbed dose in water measured with an ionization chamber in 10 x 10 irradiation fields on the 60 Co unit. The analytical and graphical expressions were obtained with the software 'Origin8', and the TBI patient portal verification images were processed using the software ImageJ to obtain the patient dose. To validate the results, the absorbed dose was measured with an ionization chamber in RW3 phantoms of different thicknesses, simulating real TBI conditions. Finally, a retrospective study over the last 4 years was performed, obtaining the patients' absorbed dose from the image readings and comparing it with the planned dose. The analytical equation obtained permits estimating the absorbed dose from the image pixel value and the dose measured with the ionization chamber, correlated with patient clinical records. These results were compared with reported evidence, obtaining a difference of less than 02%; the 3 methods were compared and the results agree within 10%. (Author)

  8. Accurate computations of monthly average daily extraterrestrial irradiation and the maximum possible sunshine duration

    International Nuclear Information System (INIS)

    Jain, P.C.

    1985-12-01

    The monthly average daily values of the extraterrestrial irradiation on a horizontal plane and the maximum possible sunshine duration are two important parameters that are frequently needed in various solar energy applications. These are generally calculated by solar scientists and engineers each time they are needed, often by using approximate short-cut methods. Using the accurate analytical expressions developed by Spencer for the declination and the eccentricity correction factor, computations for these parameters have been made for all the latitude values from 90 deg. N to 90 deg. S at intervals of 1 deg. and are presented in a convenient tabular form. Monthly average daily values of the maximum possible sunshine duration as recorded on a Campbell-Stokes sunshine recorder are also computed and presented. These tables would avoid the need for repetitive and approximate calculations and serve as a useful ready reference for providing accurate values to solar energy scientists and engineers
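
    A minimal sketch of the underlying computation, using Spencer's series for the solar declination and the eccentricity correction factor in their standard published form (constants are the usual textbook values, not taken from the record):

        import math

        G_SC = 1367.0  # solar constant, W/m^2

        def spencer(day_of_year):
            """Spencer's series: eccentricity correction E0 and declination (rad)."""
            g = 2.0 * math.pi * (day_of_year - 1) / 365.0
            e0 = (1.000110 + 0.034221 * math.cos(g) + 0.001280 * math.sin(g)
                  + 0.000719 * math.cos(2 * g) + 0.000077 * math.sin(2 * g))
            dec = (0.006918 - 0.399912 * math.cos(g) + 0.070257 * math.sin(g)
                   - 0.006758 * math.cos(2 * g) + 0.000907 * math.sin(2 * g)
                   - 0.002697 * math.cos(3 * g) + 0.001480 * math.sin(3 * g))
            return e0, dec

        def h0_and_daylength(lat_deg, day_of_year):
            """Daily extraterrestrial irradiation H0 (MJ/m^2) and day length (h)."""
            e0, dec = spencer(day_of_year)
            phi = math.radians(lat_deg)
            cos_ws = max(-1.0, min(1.0, -math.tan(phi) * math.tan(dec)))
            ws = math.acos(cos_ws)                      # sunset hour angle (rad)
            h0 = (86400.0 / math.pi) * G_SC * e0 * (
                math.cos(phi) * math.cos(dec) * math.sin(ws)
                + ws * math.sin(phi) * math.sin(dec)) / 1.0e6
            return h0, 24.0 * ws / math.pi              # N = (2/15) * ws in degrees

        print(h0_and_daylength(35.0, 172))              # around June 21 at 35 deg N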

  9. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    Science.gov (United States)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  10. Federating LHCb datasets using the DIRAC File catalog

    CERN Document Server

    Haen, Christophe; Frank, Markus; Tsaregorodtsev, Andrei

    2015-01-01

    In the distributed computing model of LHCb the File Catalog (FC) is a central component that keeps track of each file and replica stored on the Grid. It is federating the LHCb data files in a logical namespace used by all LHCb applications. As a replica catalog, it is used for brokering jobs to sites where their input data is meant to be present, but also by jobs for finding alternative replicas if necessary. The LCG File Catalog (LFC) used originally by LHCb and other experiments is now being retired and needs to be replaced. The DIRAC File Catalog (DFC) was developed within the framework of the DIRAC Project and presented during CHEP 2012. From the technical point of view, the code powering the DFC follows an Aspect oriented programming (AOP): each type of entity that is manipulated by the DFC (Users, Files, Replicas, etc) is treated as a separate 'concern' in the AOP terminology. Hence, the database schema can also be adapted to the needs of a Virtual Organization. LHCb opted for a highly tuned MySQL datab...

  11. Agent-Mining of Grid Log-Files: A Case Study

    NARCIS (Netherlands)

    Stoter, A.; Dalmolen, Simon; Mulder, .W.

    2013-01-01

    Grid monitoring requires analysis of large amounts of log files across multiple domains. An approach is described for automated extraction of job-flow information from large computer grids, using software agents and genetic computation. A prototype was created as a first step towards communities of

  12. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  13. Grammar-Based Specification and Parsing of Binary File Formats

    Directory of Open Access Journals (Sweden)

    William Underwood

    2012-03-01

    Full Text Available The capability to validate and view or play binary file formats, as well as to convert binary file formats to standard or current file formats, is critically important to the preservation of digital data and records. This paper describes the extension of context-free grammars from strings to binary files. Binary files are arrays of data types, such as long and short integers, floating-point numbers and pointers, as well as characters. The concept of an attribute grammar is extended to these context-free array grammars. This attribute grammar has been used to define a number of chunk-based and directory-based binary file formats. A parser generator has been used with some of these grammars to generate syntax checkers (recognizers) for validating binary file formats. Among the potential benefits of an attribute grammar-based approach to specification and parsing of binary file formats is that attribute grammars not only support format validation, but support generation of error messages during validation of format, validation of semantic constraints, attribute value extraction (characterization), generation of viewers or players for file formats, and conversion to current or standard file formats. The significance of these results is that with these extensions to core computer science concepts, traditional parser/compiler technologies can potentially be used as a part of a general, cost effective curation strategy for binary file formats.
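
    As an illustration of the chunk-based layouts such grammars describe, a minimal hand-written recognizer for a RIFF/IFF-style chunk structure (4-byte tag, 4-byte little-endian length, payload, even-byte padding); the grammar-generated parsers discussed in the paper would produce equivalent checks automatically. The format details here are a generic assumption, not one of the paper's case studies.

        import struct

        def parse_chunks(data: bytes):
            """Walk a byte array laid out as [4-byte tag][uint32 length][payload]...
            and raise ValueError if the structure does not validate."""
            chunks, offset = [], 0
            while offset < len(data):
                if offset + 8 > len(data):
                    raise ValueError("truncated chunk header at offset %d" % offset)
                tag, length = struct.unpack_from("<4sI", data, offset)
                payload = data[offset + 8:offset + 8 + length]
                if len(payload) != length:
                    raise ValueError("chunk %r shorter than declared" % tag)
                chunks.append((tag.decode("ascii", "replace"), payload))
                offset += 8 + length + (length & 1)   # IFF-style pad to even size
            return chunks

        # Example: two small chunks, the second padded to an even length.
        blob = (b"FMT " + struct.pack("<I", 4) + b"\x01\x02\x03\x04"
                + b"DATA" + struct.pack("<I", 3) + b"abc\x00")
        print(parse_chunks(blob))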

  14. A micro-computed tomographic evaluation of dentinal microcrack alterations during root canal preparation using single-file Ni-Ti systems.

    Science.gov (United States)

    Li, Mei-Lin; Liao, Wei-Li; Cai, Hua-Xiong

    2018-01-01

    The aim of the present study was to evaluate the length of dentinal microcracks observed prior to and following root canal preparation with different single-file nickel-titanium (Ni-Ti) systems using micro-computed tomography (micro-CT) analysis. A total of 80 mesial roots of mandibular first molars presenting with type II Vertucci canal configurations were scanned at an isotropic resolution of 7.4 µm. The samples were randomly assigned into four groups (n=20 per group) according to the system used for root canal preparation, including the WaveOne (WO), OneShape (OS), Reciproc (RE) and control groups. A second micro-CT scan was conducted after the root canals were prepared with size 25 instruments. Pre- and postoperative cross-section images of the roots (n=237,760) were then screened to identify the lengths of the microcracks. The results indicated that the microcrack lengths were notably increased following root canal preparation (P<0.05). Among the single-file Ni-Ti systems, WO and RE were not observed to cause notable microcracks, while the OS system resulted in evident microcracks.

  15. Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    Peregrine has several classes of nodes that users access. Login Nodes: Peregrine has four login nodes, each of which has Intel E5 processors; besides the /scratch file systems, the /mss file system is mounted on all login nodes. Compute Nodes: Peregrine has 2592 compute nodes.

  16. BIBLIO: A Reprint File Management Algorithm

    Science.gov (United States)

    Zelnio, Robert N.; And Others

    1977-01-01

    The development of a simple computer algorithm designed for use by the individual educator or researcher in maintaining and searching reprint files is reported. Called BIBLIO, the system is inexpensive and easy to operate and maintain without sacrificing flexibility and utility. (LBH)

  17. The NEA computer program library: a possible GDMS application

    International Nuclear Information System (INIS)

    Schuler, W.

    1978-01-01

    The NEA Computer Program Library maintains a series of eleven sequential computer files, used for linked applications in managing its stock of computer codes for nuclear reactor calculations, storing index and program abstract information, and administering its service to requesters. The high data redundancy between the files suggests that a database approach would be valid, and this paper suggests a possible 'schema' for a CODASYL GDMS

  18. Development of a script for converting DICOM files to .TXT

    International Nuclear Information System (INIS)

    Abrantes, Marcos E.S.; Oliveira, A.H. de

    2014-01-01

    Background: with the increased use of computer simulation techniques for diagnosis or therapy in patients, the MCNP and SCMS software packages are being widely used. To use SCMS as a data entry interface for MCNP, it is necessary to transform DICOM images into text files. Objective: to produce a semi-automatic script in the IMAGEJ software for converting DICOM images generated by computed tomography or magnetic resonance to .txt. Methodology: this study was developed on the IMAGEJ software platform with an Intel Core 2 Duo computer, with a 2.00 GHz CPU and 2.00 GB of RAM, on a 32-bit system. Development of the script was done in a text editor using the Java language. The script was inserted into IMAGEJ using the plug-in tool of this software. After this, a window opens asking for the path of the files to be read, the first and last names of the DICOM files to be converted, and where the new files will be stored. Results: manual conversion to .txt of a cerebral computed tomography with 600 DICOM images requires about 8 hours. Use of the script reduces the conversion time to about 12 minutes. Conclusion: the script demonstrates the ability to convert DICOM to .txt and provides a significant saving in processing time
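
    A minimal sketch of the same DICOM-to-text conversion outside ImageJ, using the pydicom and numpy libraries; the file paths and output layout are illustrative assumptions, not the script described in the paper.

        import glob
        import numpy as np
        import pydicom

        def dicom_series_to_txt(input_pattern, output_dir):
            """Write each DICOM slice matched by the pattern as a text pixel matrix."""
            for path in sorted(glob.glob(input_pattern)):
                ds = pydicom.dcmread(path)                # read one DICOM slice
                pixels = ds.pixel_array.astype(np.int32)  # raw pixel matrix
                out = "%s/%04d.txt" % (output_dir, int(ds.InstanceNumber))
                np.savetxt(out, pixels, fmt="%d")         # one text file per slice

        # Example (hypothetical paths):
        # dicom_series_to_txt("ct_head/*.dcm", "ct_head_txt")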

  19. Computer simulation of void formation in residual gas atom free metals by dual beam irradiation experiments

    International Nuclear Information System (INIS)

    Shimomura, Y.; Nishiguchi, R.; La Rubia, T.D. de; Guinan, M.W.

    1992-01-01

    In our recent experiments (1), we found that voids nucleate at vacancy clusters which trap gas atoms such as hydrogen and helium in ion- and neutron-irradiated copper. A molecular dynamics computer simulation, which implements an empirical embedded atom method to calculate the forces that act on atoms in metals, suggests that void nucleation occurs in pure copper at six- and seven-vacancy clusters. The structure of six- and seven-vacancy clusters in copper fluctuates between a stacking fault tetrahedron and a void. When a hydrogen atom is trapped at a six- or seven-vacancy void, the void can keep its structure for an appreciably long time; that is, the void does not relax to a stacking fault tetrahedron and grows into a large void. In order to explore the detailed atomistics of void formation, it is emphasized that dual-beam irradiation experiments that utilize beams of gas atoms and self-ions should be carried out with residual-gas-atom-free metal specimens. (author)

  20. COXPRO-II: a computer program for calculating radiation and conduction heat transfer in irradiated fuel assemblies

    International Nuclear Information System (INIS)

    Rhodes, C.A.

    1984-12-01

    This report describes the computer program COXPRO-II, which was written for performing thermal analyses of irradiated fuel assemblies in a gaseous environment with no forced cooling. The heat transfer modes within the fuel pin bundle are radiation exchange among fuel pin surfaces and conduction by the stagnant gas. The array of parallel cylindrical fuel pins may be enclosed by a metal wrapper or shroud. Heat is dissipated from the outer surface of the fuel pin assembly by radiation and convection. Both equilateral triangle and square fuel pin arrays can be analyzed. Steady-state and unsteady-state conditions are included. Temperatures predicted by the COXPRO-II code have been validated by comparing them with experimental measurements. Temperature predictions compare favorably to temperature measurements in pressurized water reactor (PWR) and liquid-metal fast breeder reactor (LMFBR) simulated, electrically heated fuel assemblies. Also, temperature comparisons are made on an actual irradiated Fast-Flux Test Facility (FFTF) LMFBR fuel assembly

  1. Spectral and raw quasi in-situ energy dispersive X-ray data captured via a TEM analysis of an ODS austenitic stainless steel sample under 1 MeV Kr2+ high temperature irradiation.

    Science.gov (United States)

    Brooks, Adam J; Yao, Zhongwen

    2017-10-01

    The data presented in this article are related to the research experiment titled 'Quasi in-situ energy dispersive X-ray spectroscopy observation of matrix and solute interactions on Y-Ti-O oxide particles in an austenitic stainless steel under 1 MeV Kr2+ high temperature irradiation' (Brooks et al., 2017) [1]. Quasi in-situ analysis during 1 MeV Kr2+ irradiation at 520 °C allowed the same microstructural area to be observed using a transmission electron microscope (TEM) on an oxide dispersion strengthened (ODS) austenitic stainless steel sample. The data presented contain two sets of energy dispersive X-ray spectroscopy (EDX) data collected before and after irradiation to 1.5 displacements-per-atom (~1.25×10^-3 dpa/s with 7.5×10^14 ions cm^-2). The vendor software used to process and output the data is the Bruker Esprit v1.9 suite. The data include the spectra (counts vs. keV energy) of the quasi in-situ scanned region (512×512 pixels at 56k magnification), along with the EDX scanning parameters. The .raw files from the Bruker Esprit v1.9 output are additionally included along with the .rpl data information files. Also included are the two quasi in-situ HAADF images for visual comparison of the regions before and after irradiation. This in-situ experiment is deemed 'quasi' because the thin foil irradiation took place at an external TEM facility. We present these data for critical and/or extended analysis by the scientific community, with applications including experimental data correlation, confirmation of results, and use as computer-based modeling inputs.

  2. Network survivability performance (computer diskette)

    Science.gov (United States)

    1993-11-01

    File characteristics: Data file; 1 file. Physical description: 1 computer diskette; 3 1/2 in.; high density; 2.0MB. System requirements: Mac; Word. This technical report has been developed to address the survivability of telecommunications networks including services. It responds to the need for a common understanding of, and assessment techniques for network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunication networks to user expectations for network survivability.

  3. Development of data file system for cardiovascular nuclear medicine

    International Nuclear Information System (INIS)

    Hayashida, Kohei; Nishimura, Tsunehiko; Uehara, Toshiisa; Nisawa, Yoshifumi.

    1985-01-01

    A computer-assisted filing system for storing and processing data from cardiac pool scintigraphy and myocardial scintigraphy has been developed. Individual patient data are stored with the patient's identification number (ID) on floppy discs, successively in the order in which scintigraphy was performed. Data for 900 patients can be stored per floppy disc. Scintigraphic findings can be output in a uniform file format, which can also be used as a reporting format. Output or retrieval of filed individual patient data is possible according to examination, disease code or ID. This system appears suitable for prospective studies in patients with cardiovascular diseases. (Namekawa, K.)

  4. A file of reference data for multiple-element neutron activation analysis

    International Nuclear Information System (INIS)

    Kabina, L.P.; Kondurov, I.A.; Shesterneva, I.M.

    1983-12-01

    Data needed for planning neutron activation analysis experiments and processing their results are given. The decay schemes of radioactive nuclei formed by irradiation with thermal neutrons via the (n,γ) reaction, taken from the international ENSDF file, are used for calculating the activities of nuclei and for drawing up an optimum table for identifying gamma lines in the measured spectra. (author)

  5. Study and development of a document file system with selective access

    International Nuclear Information System (INIS)

    Mathieu, Jean-Claude

    1974-01-01

    The objective of this research thesis was to design and develop a set of software aimed at efficient management of a document file system by using methods of selective access to information. Thus, the three main aspects of file processing (creation, modification, reorganisation) have been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of the INIS bibliographic records provided by the IAEA which have been used to create the files. In the second part, he briefly describes the general organisation of the file system. This system is based on the use of two main files: an inverse file, which contains for each descriptor a list of the numbers of the files indexed by this descriptor, and a dictionary of descriptors, or input file, which gives access to the inverse file. The organisation of both files is then described in detail. Other related or associated files are created, and the overall architecture and mechanisms integrated into the file data input software are described, as well as the various processing operations applied to these different files. Performance and possible developments are finally discussed
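
    A minimal sketch of the two-file structure described above: an inverse (inverted) file mapping each descriptor to the numbers of the documents it indexes, plus a descriptor dictionary giving access to it. Names and data are illustrative, not taken from the thesis.

        from collections import defaultdict

        # Document number -> descriptors (illustrative records).
        documents = {
            1: ["reactor", "neutron", "dosimetry"],
            2: ["reactor", "fuel"],
            3: ["neutron", "spectroscopy"],
        }

        # Build the inverse file: descriptor -> list of document numbers.
        inverse_file = defaultdict(list)
        for doc_no, descriptors in documents.items():
            for d in descriptors:
                inverse_file[d].append(doc_no)

        dictionary = sorted(inverse_file)     # the descriptor dictionary (entry point)

        def search(*descriptors):
            """Selective access: documents indexed by every requested descriptor."""
            sets = [set(inverse_file.get(d, ())) for d in descriptors]
            return sorted(set.intersection(*sets)) if sets else []

        print(search("reactor", "neutron"))   # -> [1]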

  6. Computational analysis of modern HTGR fuel performance and fission product release during the HFR-EU1 irradiation experiment

    Energy Technology Data Exchange (ETDEWEB)

    Verfondern, Karl, E-mail: k.verfondern@fz-juelich.de [Research Center Jülich, Institute of Energy and Climate Research, 52425 Jülich (Germany); Xhonneux, André, E-mail: xhonneux@lrst.rwth-aachen.de [Research Center Jülich, Institute of Energy and Climate Research, 52425 Jülich (Germany); Nabielek, Heinz, E-mail: heinznabielek@me.com [Research Center Jülich, Monschauerstrasse 61, 52355 Düren (Germany); Allelein, Hans-Josef, E-mail: h.j.allelein@fz-juelich.de [Research Center Jülich, Institute of Energy and Climate Research, 52425 Jülich (Germany); RWTH Aachen, Chair for Reactor Safety and Reactor Technology, 52072 Aachen (Germany)

    2014-07-01

    Highlights: • HFR-EU1 irradiation test demonstrates high quality of HTGR spherical fuel elements. • Irradiation performance is in good agreement with German fuel performance modeling. • International benchmark exercise expected first particle to fail at ∼13–17% FIMA. • EOL silver release is predicted to be in the percentage range. • EOL cesium and strontium are expected to remain at a low level. - Abstract: Various countries engaged in the development and fabrication of modern HTGR fuel have initiated activities of modeling the fuel and fission product release behavior with the aim of predicting the fuel performance under HTGR operating and accident conditions. Verification and validation studies are conducted by code-to-code benchmarking and code-to-experiment comparisons as part of international exercises. The methodology developed in Germany since the 1980s represents valuable and efficient tools to describe fission product release from spherical fuel elements and TRISO fuel performance, respectively, under given conditions. Continued application to new results of irradiation and accident simulation testing demonstrates the appropriateness of the models in terms of a conservative estimation of the source term as part of interactions with HTGR licensing authorities. Within the European irradiation testing program for HTGR fuel and as part of the former EU RAPHAEL project, the HFR-EU1 irradiation experiment explores the potential for high performance of the presently existing German and newly produced Chinese fuel spheres under defined conditions up to high burnups. The fuel irradiation was completed in 2010. Test samples are prepared for further postirradiation examinations (PIE) including heatup simulation testing in the KÜFA-II furnace at the JRC-ITU, Karlsruhe, to be conducted within the on-going ARCHER Project of the European Commission. The paper will describe the application of the German computer models to the HFR-EU1 irradiation test and

  7. Multi-level, automatic file management system using magnetic disk, mass storage system and magnetic tape

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1979-12-01

    A simple, effective file management system using magnetic disk, mass storage system (MSS) and magnetic tape is described. The following concepts and techniques are introduced in this file management system. (1) The file distribution and the continuity character of file references are closely approximated by a memory retention function. A density function using the memory retention function is thus defined. (2) A method of computing the cost/benefit lines for magnetic disk, MSS and magnetic tape is presented. (3) A decision process for an optimal organization of file facilities, incorporating the distribution of file demands to the respective file devices, is presented. (4) A method of simple, practical, effective, automatic file management, incorporating multi-level file management, space management and file migration control, is proposed. (author)
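
    A minimal sketch of the multi-level placement idea (disk, MSS, tape) driven by how recently a file was referenced; the exponential retention function, its half-life and the thresholds are illustrative assumptions, not the model defined in the report.

        import math
        import time

        def retention(age_days, half_life_days=14.0):
            """Illustrative memory retention of a file reference: decays with age."""
            return math.exp(-math.log(2.0) * age_days / half_life_days)

        def choose_level(age_days):
            """Map the retention value to a storage level (thresholds are made up)."""
            r = retention(age_days)
            if r > 0.5:
                return "disk"     # frequently referenced: keep online
            if r > 0.05:
                return "MSS"      # lukewarm: mass storage system
            return "tape"         # cold: archive

        now = time.time()
        files = {"run_001.dat": now - 2 * 86400, "run_1999.dat": now - 400 * 86400}
        for name, last_access in files.items():
            print(name, "->", choose_level((now - last_access) / 86400.0))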

  8. FEDGROUP - A program system for producing group constants from evaluated nuclear data of files disseminated by IAEA

    International Nuclear Information System (INIS)

    Vertes, P.

    1976-06-01

    A program system for calculating group constants from several evaluated nuclear data files has been developed. These files are distributed by the Nuclear Data Section of the IAEA. Our program system - FEDGROUP - has certain advantages over well-known similar codes: 1. it requires only a medium-sized computer (approximately 20000 words of memory), 2. it is easily adaptable to any type of computer, 3. it is flexible with respect to the input evaluated nuclear data file and to the output group constant file. Nowadays, FEDGROUP calculates practically all types of group constants needed for reactor physics calculations by using the most frequent representations of evaluated data. (author)
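
    As an illustration of what a group constant is, a minimal sketch of flux-weighted group averaging of a pointwise cross section; the energy grid, cross-section shape and weighting spectrum are invented for the example, and FEDGROUP's actual processing of evaluated files is far more involved.

        import numpy as np

        # Pointwise data on a fine energy grid (illustrative values only).
        energy = np.logspace(-3, 7, 2000)           # eV
        sigma = 5.0 + 50.0 / np.sqrt(energy)        # barns, crude 1/v-like shape
        flux = 1.0 / energy                         # crude 1/E weighting spectrum

        boundaries = [1e-3, 1.0, 1e3, 1e7]          # group boundaries in eV

        def group_constants(e, sig, phi, bounds):
            """sigma_g = integral(sigma*phi dE) / integral(phi dE) in each group."""
            out = []
            for lo, hi in zip(bounds[:-1], bounds[1:]):
                m = (e >= lo) & (e < hi)
                out.append(np.trapz(sig[m] * phi[m], e[m]) / np.trapz(phi[m], e[m]))
            return out

        print(group_constants(energy, sigma, flux, boundaries))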

  9. Decay data file based on the ENSDF file

    Energy Technology Data Exchange (ETDEWEB)

    Katakura, J. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    A decay data file in the JENDL (Japanese Evaluated Nuclear Data Library) format, based on the ENSDF (Evaluated Nuclear Structure Data File) file, was produced as a tentative special purpose file of JENDL. The problems of using the ENSDF file as the primary source data for the JENDL decay data file are presented. (author)

  10. KaZaA and similar Peer-to-Peer (P2P) file-sharing applications

    CERN Multimedia

    2003-01-01

    Personal use of Peer-to-Peer (P2P) file sharing applications is NOT permitted at CERN. A non-exhaustive list of such applications, popular for exchanging music, videos, software etc, is: KaZaA, Napster, Gnutella, Edonkey2000, Napigator, Limewire, Bearshare, WinMX, Aimster, Morpheus, BitTorrent, ... You are reminded that use of CERN's Computing Facilities is governed by CERN's Computing Rules (Operational Circular No 5). They require that all users of CERN's Computing Facilities respect copyright, license and confidentiality agreements for data of any form (software, music, videos, etc). Sanctions are applicable in case of non-respect of the Computing Rules. Further details on restrictions for P2P applications are at: http://cern.ch/security/file-sharing CERN's Computing Rules are at: http://cern.ch/ComputingRules Denise Heagerty, CERN Computer Security Officer, Computer.Security@cern.ch

  11. Grid collector an event catalog with automated file management

    CERN Document Server

    Ke Sheng Wu; Sim, A; Jun Min Gu; Shoshani, A

    2004-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select ev...

  12. The version control service for the ATLAS data acquisition configuration files

    International Nuclear Information System (INIS)

    Soloviev, Igor

    2012-01-01

    The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error free and consistent set of XML files proved a major challenge. Therefore a special service was implemented; to validate any modifications; to check the authorization of anyone trying to modify a file; to record who had made changes, plus when and why; and to provide tools to compare different versions of files and to go back to earlier versions if required. This paper provides details of the implementation and exploitation experience, that may be interesting for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.

  13. Competitive Status Signaling in Peer-to-Peer File-Sharing Networks

    Directory of Open Access Journals (Sweden)

    Henry F. Lyle

    2007-04-01

    Full Text Available Internet peer-to-peer file sharing is a contemporary example of asymmetrical sharing in which “altruists” (file uploaders share unconditionally with non-reciprocating “free riders” (file downloaders. Those who upload digital media files over the Internet risk prosecution for copyright infringement, and are more vulnerable to computer hackers and viruses. In an analysis of file-sharing behavior among university undergraduates (N=331, we found that significantly more males than females engaged in risky file uploading. Contrary to expectations, uploaders were not concerned about their reputation online and file sharers were not interested in identifying or chatting with uploaders while online. Among uploaders, males were more likely than females to be identified as uploaders by friends, to discuss uploading and to upload in the presence of peers. We interpret these results using costly-signaling theory, and argue that uploading is a costly signal in which males engage in avoidable risk taking as a means to compete for status among peers in social contexts other than the Internet.

  14. Application of PLC in irradiation controlling system

    International Nuclear Information System (INIS)

    Qin Wenjuan; Lin Baoling; Wang Mingtao; Huang Daorong; Yao Qiuguo; Gao Weixiang; Yang Kun; Xue Changlin; Pu Jiangling

    2005-01-01

    To implement the multi-program control system and to have a computer gather and print the measurement data at the Irradiation Station, a Programmable Logic Controller (PLC) was adopted instead of a PCB to control the irradiation system. The PLC improved the anti-jamming ability and made debugging more convenient. (authors)

  15. Examination of irradiated fuel elements using gamma scanning technique

    International Nuclear Information System (INIS)

    Ichim, O.; Mincu, M.; Man, I.; Stanica, M.

    2016-01-01

    The purpose of this paper is to validate the gamma scanning technique used to calculate the activity of gamma fission products from CANDU/TRIGA irradiated fuel elements. After a short presentation of the equipment used and its characteristics, the paper describes the calibration technique for the devices and how computed tomography reconstruction is done. Following the previously mentioned steps, it is possible to obtain the axial and radial profiles and the computed tomography reconstruction for calibration sources and for the irradiated fuel elements. The results are used to validate the gamma scanning techniques as a non-destructive examination method. The gamma scanning techniques will be used to: identify the fission products in the irradiated CANDU/TRIGA fuel elements, construct the axial and radial distributions of fission products, get the distribution in cross section through computed tomography reconstruction, and determine the number of nuclei and the fission product activity of the irradiated CANDU/TRIGA fuel elements. (authors)

  16. Testing the Forensic Interestingness of Image Files Based on Size and Type

    Science.gov (United States)

    2017-09-01

    down to 0.18% (Rowe, 2015). III. IMAGE FILE FORMATS: When scanning a computer hard drive, many kinds of pictures are found. Digital images are not... [Acronyms listed in the report: ...Interchange Format; JPEG - Joint Photographic Experts Group; LSH - Locality Sensitive Hashing; NSRL - National Software Reference Library; PDF - Portable Document...]

  17. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    International Nuclear Information System (INIS)

    Schwarz, Randolph A.; Carter, Leland L.; Schwarz Alysia L.

    2006-01-01

    KENO is a Monte Carlo criticality code that is maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but can also be used for criticality calculations. MCNP is becoming increasingly more popular for performing production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross section database. This manual conversion of input files between the two codes is labor intensive. The industry needs the capability of converting geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package will aid the user in converting between the codes. It is not intended to be used as a ''black box''. The resulting input file will need to be carefully inspected by criticality safety personnel to verify the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards

  18. Evaluation of Functional Marrow Irradiation Based on Skeletal Marrow Composition Obtained Using Dual-Energy Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Magome, Taiki [Department of Radiological Sciences, Faculty of Health Sciences, Komazawa University, Tokyo (Japan); Masonic Cancer Center, University of Minnesota, Minneapolis, Minnesota (United States); Department of Radiology, The University of Tokyo Hospital, Tokyo (Japan); Froelich, Jerry [Department of Radiology, University of Minnesota, Minneapolis, Minnesota (United States); Takahashi, Yutaka [Masonic Cancer Center, University of Minnesota, Minneapolis, Minnesota (United States); Department of Radiation Oncology, Osaka University, Osaka (Japan); Arentsen, Luke [Department of Therapeutic Radiology, University of Minnesota, Minneapolis, Minnesota (United States); Holtan, Shernan; Verneris, Michael R. [Blood and Marrow Transplant Program, University of Minnesota, Minneapolis, Minnesota (United States); Brown, Keenan [Mindways Software Inc, Austin, Texas (United States); Haga, Akihiro; Nakagawa, Keiichi [Department of Radiology, The University of Tokyo Hospital, Tokyo (Japan); Holter Chakrabarty, Jennifer L. [College of Medicine, Oklahoma Health Sciences Center, Oklahoma City, Oklahoma (United States); Giebel, Sebastian [Department of Bone Marrow Transplantation, Comprehensive Cancer Center M. Curie-Sklodowska Memorial Institute, Gliwice (Poland); Wong, Jeffrey [Department of Radiation Oncology, Beckman Research Institute, City of Hope, Duarte, California (United States); Dusenbery, Kathryn [Department of Therapeutic Radiology, University of Minnesota, Minneapolis, Minnesota (United States); Storme, Guy [Department of Radiotherapy, Universitair Ziekenhuis Brussel, Brussels (Belgium); Hui, Susanta K., E-mail: shui@coh.org [Masonic Cancer Center, University of Minnesota, Minneapolis, Minnesota (United States); Department of Therapeutic Radiology, University of Minnesota, Minneapolis, Minnesota (United States); Department of Radiation Oncology, Beckman Research Institute, City of Hope, Duarte, California (United States)

    2016-11-01

    Purpose: To develop an imaging method to characterize and map marrow composition in the entire skeletal system, and to simulate differential targeted marrow irradiation based on marrow composition. Methods and Materials: Whole-body dual-energy computed tomography (DECT) images of cadavers and leukemia patients were acquired and segmented to separate bone and the marrow components, namely red marrow (RM) and yellow marrow (YM). The DECT-derived marrow fat fraction was validated using histology of lumbar vertebrae obtained from cadavers. The fractions of RM (RMF = RM/total marrow) and YM (YMF) were calculated in each skeletal region to assess the correlation of marrow composition with skeletal site and age. Treatment planning was simulated to target irradiation differentially at a higher dose (18 Gy) to either RM or YM and a lower dose (12 Gy) to the rest of the skeleton. Results: A significant correlation between fat fractions obtained from DECT and cadaver histology samples was observed (r=0.861, P<.0001, Pearson). The RMF in the head, neck, and chest was significantly inversely correlated with age but did not show any significant age-related changes in the abdomen and pelvis regions. Conformity of radiation to the targets (RM, YM) was significantly dependent on skeletal site. The radiation exposure to organs at risk (OARs) was significantly reduced (P<.05, t test) in RM and YM irradiation compared with standard total marrow irradiation (TMI). Conclusions: Whole-body DECT offers a new imaging technique to visualize and measure skeletal-wide marrow composition. DECT-based treatment planning offers volumetric and site-specific precise radiation dosimetry of RM and YM, which varies with aging. Our proposed method could be used as a functional compartment of TMI for further targeted radiation to a specific bone marrow environment, dose escalation, reduction of doses to OARs, or a combination of these factors.

  19. Void formation in irradiated binary nickel alloys

    International Nuclear Information System (INIS)

    Shaikh, M.A.; Ahmed, M.; Akhter, J.I.

    1994-01-01

    In this work a computer program has been used to compute the void radius, void density and swelling parameter for nickel and binary nickel-carbon alloys irradiated with 100 keV nickel ions. The aim is to compare the computed results with experimental results already reported.
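
    The swelling parameter mentioned above can be related to the void radius and void density by assuming spherical voids, so that the fractional swelling is (4/3)πr³N. A minimal sketch of that arithmetic, with illustrative values rather than the paper's data:

    ```python
    import math

    def void_swelling(radius_nm: float, density_per_m3: float) -> float:
        """Fractional swelling dV/V from mean void radius and void number density,
        assuming spherical voids: dV/V = (4/3) * pi * r^3 * N."""
        r_m = radius_nm * 1e-9
        return (4.0 / 3.0) * math.pi * r_m**3 * density_per_m3

    # Illustrative values only (not taken from the paper):
    swelling = void_swelling(radius_nm=5.0, density_per_m3=1e22)
    print(f"swelling = {swelling * 100:.3f} %")
    ```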

  20. Analysis of gamma irradiated pepper constituents, 5

    International Nuclear Information System (INIS)

    Takagi, Kazuko; Okuyama, Tsuneo; Ishikawa, Toshihiro.

    1988-01-01

    Gamma irradiated peppers (10 krad, 100 krad, 1 Mrad) were analyzed by HPLC. The extraction method and HPLC conditions were the same as in the first report: extraction from pepper was performed with an automatic air hammer, and the extracted samples were separated on a reversed-phase C8 column with a concave gradient from 0.1% trifluoroacetic acid (TFA) in water to 75% acetonitrile-0.1% TFA in water over 60 minutes and detected at 210 nm and 280 nm. It is difficult to compare irradiated and unirradiated pepper constituents by peak height or area alone, so a multivariate statistical method was introduced. The 'peak n area/peak n + 1 area' ratio was calculated by computer, with each peak area obtained from an integrator; these ratios were called 'parameters'. Each chromatogram has 741 parameters calculated from 39 chromatographic peaks, and these parameters were used in the multivariate statistical analysis. Constituents of irradiated and unirradiated pepper were compared using the 741 parameters. The correlation of parameters between irradiated and unirradiated samples was investigated by computer, and parameters of the irradiated case showing no correlation with the unirradiated case were selected; these parameters were taken to have been changed by gamma irradiation. By this method, coumarin was identified as a component changed by gamma irradiation. (author)
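
    The quoted 'peak n area/peak n + 1 area' description, together with the count of 741 parameters for 39 peaks (39·38/2 = 741), suggests that an area ratio was formed for every pair of peaks. A minimal sketch of that bookkeeping, with invented peak areas:

    ```python
    from itertools import combinations

    def pairwise_area_ratios(peak_areas):
        """Return the area ratio for every pair of chromatographic peaks.
        For 39 peaks this yields C(39, 2) = 741 parameters, matching the abstract's count."""
        return {(i, j): peak_areas[i] / peak_areas[j]
                for i, j in combinations(range(len(peak_areas)), 2)}

    # Illustrative peak areas (arbitrary units), not the paper's data:
    areas = [float(k + 1) for k in range(39)]
    params = pairwise_area_ratios(areas)
    print(len(params))  # 741
    ```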

  1. Overview and Status of the Ceph File System

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The Ceph file system (CephFS) is the POSIX-compatible distributed file system running on top of Ceph's powerful and stable object store. This presentation will give a general introduction of CephFS and detail the recent work the Ceph team has done to improve its stability and usability. In particular, we will cover directory fragmentation, multiple active metadata servers, and directory subtree pinning to metadata servers, features slated for stability in the imminent Luminous release. This talk will also give an overview of how we are measuring performance of multiple active metadata servers using large on-demand cloud deployments. The results will highlight how CephFS distributes metadata load across metadata servers to achieve scaling. About the speaker Patrick Donnelly is a software engineer at Red Hat, Inc. currently working on the Ceph distributed file system. In 2016 he completed his Ph.D. in computer science at the University of Notre Dame with a dissertation on the topic of file transfer management...

  2. Tuning HDF5 subfiling performance on parallel file systems

    Energy Technology Data Exchange (ETDEWEB)

    Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chaarawi, Mohamad [Intel Corp. (United States); Koziol, Quincey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mainzer, John [The HDF Group (United States); Willmore, Frank [The HDF Group (United States)

    2017-05-12

    Subfiling is a technique used on parallel file systems to reduce locking and contention issues when multiple compute nodes interact with the same storage target node. Subfiling provides a compromise between the single shared file approach, which instigates the lock contention problems on parallel file systems, and having one file per process, which results in a massive and unmanageable number of files. In this paper, we evaluate and tune the performance of the recently implemented subfiling feature in HDF5. Specifically, we explain the implementation strategy of the subfiling feature in HDF5, provide examples of using the feature, and evaluate and tune the parallel I/O performance of this feature on the parallel file systems of the Cray XC40 system at NERSC (Cori), which include a burst buffer storage and a Lustre disk-based storage. We also evaluate I/O performance on the Cray XC30 system, Edison, at NERSC. Our results show a 1.2X to 6X performance advantage with subfiling compared to writing a single shared HDF5 file. We present our exploration of configurations, such as the number of subfiles and the number of Lustre storage targets used for storing files, as optimization parameters to obtain superior I/O performance. Based on this exploration, we discuss recommendations for achieving good I/O performance as well as limitations of using the subfiling feature.
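
    A minimal sketch of the subfiling idea (not the HDF5 subfiling feature itself): groups of MPI ranks share one subfile instead of all ranks writing a single shared file. It assumes h5py built with parallel (MPI) support; the group size and file names are illustrative tuning choices.

    ```python
    from mpi4py import MPI
    import h5py
    import numpy as np

    comm = MPI.COMM_WORLD
    ranks_per_subfile = 4                         # tuning parameter, cf. the paper's exploration
    color = comm.rank // ranks_per_subfile
    subcomm = comm.Split(color=color, key=comm.rank)

    local = np.full(1024, comm.rank, dtype="f8")  # each rank's slice of the data
    with h5py.File(f"data.subfile.{color}.h5", "w", driver="mpio", comm=subcomm) as f:
        dset = f.create_dataset("x", shape=(subcomm.size * local.size,), dtype="f8")
        off = subcomm.rank * local.size
        dset[off:off + local.size] = local        # contiguous per-rank write into the subfile
    ```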

  3. Molecular dynamics for irradiation driven chemistry

    DEFF Research Database (Denmark)

    Sushko, Gennady B.; Solov'yov, Ilia A.; Solov'yov, Andrey V.

    2016-01-01

    A new molecular dynamics (MD) approach for computer simulations of irradiation driven chemical transformations of complex molecular systems is suggested. The approach is based on the fact that irradiation induced quantum transformations can often be treated as random, fast and local processes...... that describe the classical MD of complex molecular systems under irradiation. The proposed irradiation driven molecular dynamics (IDMD) methodology is designed for the molecular level description of the irradiation driven chemistry. The IDMD approach is implemented into the MBN Explorer software package...... involving small molecules or molecular fragments. We advocate that the quantum transformations, such as molecular bond breaks, creation and annihilation of dangling bonds, electronic charge redistributions, changes in molecular topologies, etc., could be incorporated locally into the molecular force fields...

  4. Computer simulation of displacement cascade structures in D-T neutron-irradiated Au, Ag, Cu, Ni and Al with the MARLOWE code

    International Nuclear Information System (INIS)

    Watanabe, N.; Nishiguchi, R.; Shimomura, Y.

    1991-01-01

    The spatial distribution of point defects in displacement damage cascades at the early stage of their formation was simulated with the MARLOWE code for primary knock-on atoms relevant to D-T neutron irradiation. Calculations were carried out for Au, Ag, Cu, Ni and Al. The computer-simulated results were analyzed in combination with TEM observations of D-T neutron-irradiated metals at low temperature. The spatial configuration of displacement cascades, the size of small vacancy aggregates and the size of the displacement damage cascade were examined. The results suggest that most of the vacancy clusters formed in damage cascades may be as small as fewer than 20 vacancies. The remarkable difference in defect yield of cascade damage between Ni and Cu is attributed to interstitial cluster formation, and the main contribution of the cascade energy overlapping observed in cryotransfer TEM of D-T neutron-irradiated Au is attributed to interstitials ejected from cascade cores. (orig.)

  5. Decryption-decompression of AES protected ZIP files on GPUs

    Science.gov (United States)

    Duong, Tan Nhat; Pham, Phong Hong; Nguyen, Duc Huu; Nguyen, Thuy Thanh; Le, Hung Duc

    2011-10-01

    AES is a strong encryption system, so decryption-decompression of AES-encrypted ZIP files requires very large computing power and techniques for reducing the password space. This makes implementations on common computing systems impractical. In [1], we reduced the original very large password search space to a much smaller one that surely contains the correct password. Based on the reduced set of passwords, in this paper we parallelize decryption, decompression and plain-text recognition for encrypted ZIP files using CUDA on NVIDIA GeForce GTX 295 graphics cards to find the correct password. The experimental results show that the speed of decrypting, decompressing, recognizing plain text and finding the original password increases by a factor of about 45 to 180 (depending on the number of GPUs) compared to sequential execution on an Intel Core 2 Quad Q8400 at 2.66 GHz. These results demonstrate the potential applicability of GPUs in this cryptanalysis field.
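
    A rough CPU-side sketch of the search loop over a reduced password set is given below; the expensive decrypt-inflate-recognize step that the paper runs as a CUDA kernel is replaced by a trivial placeholder, so only the scaffolding is illustrated.

    ```python
    from multiprocessing import Pool

    # Placeholder for the real check (AES-decrypt the ZIP entry, inflate it, and test
    # whether the output looks like plain text); here it just compares against a known
    # value so the scaffolding runs end to end.
    SECRET = "s3cret!42"   # illustrative only

    def check_password(candidate: str) -> bool:
        return candidate == SECRET

    def find_password(candidates, workers=4, chunk=256):
        """Scan a reduced candidate set in parallel and return the first hit."""
        with Pool(workers) as pool:
            for cand, ok in zip(candidates, pool.imap(check_password, candidates, chunksize=chunk)):
                if ok:
                    return cand
        return None

    if __name__ == "__main__":
        guesses = [f"s3cret!{n}" for n in range(100)]
        print(find_password(guesses))
    ```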

  6. A Multiscale Computational Model of the Response of Swine Epidermis After Acute Irradiation

    Science.gov (United States)

    Hu, Shaowen; Cucinotta, Francis A.

    2012-01-01

    Radiation exposure from Solar Particle Events can lead to very high skin dose for astronauts on exploration missions outside the protection of the Earth's magnetic field [1]. The detrimental effects to human skin under such adverse conditions could be predicted by conducting terrestrial experiments on animal models. In this study we apply a computational approach to simulate the experimental data of the radiation response of swine epidermis, which closely resembles human epidermis [2]. Incorporating experimentally measured histological and cell kinetic parameters into a multiscale tissue modeling framework, we obtain results of population kinetics and proliferation index comparable to unirradiated and acutely irradiated swine experiments [3]. It is noted that the basal cell doubling time is 10 to 16 days in the intact population, but drops to 13.6 hr in the regenerating populations surviving irradiation. This complex 30-fold variation is proposed to be attributable to the shortening of the G1 phase duration. We investigate this radiation-induced effect by considering at the sub-cellular level the expression and signaling of TGF-beta, as it is recognized as a key regulatory factor of tissue formation and wound healing [4]. This integrated model will allow us to test the validity of various basic biological rules at the cellular level and sub-cellular mechanisms by qualitatively comparing simulation results with published research, and should lead to a fuller understanding of the pathophysiological effects of ionizing radiation on the skin.

  7. An asynchronous writing method for restart files in the gysela code in prevision of exascale systems*

    Directory of Open Access Journals (Sweden)

    Thomine O.

    2013-12-01

    Full Text Available The present work deals with an optimization procedure developed in the full-f global GYrokinetic SEmi-LAgrangian code (GYSELA). Optimizing the writing of the restart files is necessary to reduce the computing impact of crashes. These files require a very large memory space, particularly so for very large mesh sizes. The limited bandwidth of the data pipe between the computing nodes and the storage system induces a non-scalable part in the GYSELA code, which increases with the mesh size. Indeed, the transfer time of the data from RAM depends linearly on the file size. A non-synchronized writing-in-file procedure is therefore crucial. A new GYSELA module has been developed. This asynchronous procedure allows the frequent writing of the restart files, whilst preventing a severe slowdown due to the limited writing bandwidth. This method has been improved to generate a checksum control of the restart files, and to automatically rerun the code in case of a crash of any cause.
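
    A minimal sketch of the idea, not GYSELA's actual module: hand the restart snapshot to a background writer so computation can continue, and store a checksum alongside the file for later validation.

    ```python
    import hashlib
    import threading

    def write_restart(path: str, snapshot: bytes) -> None:
        """Write the restart file and a SHA-256 checksum next to it."""
        digest = hashlib.sha256(snapshot).hexdigest()
        with open(path, "wb") as f:
            f.write(snapshot)
        with open(path + ".sha256", "w") as f:
            f.write(digest + "\n")

    def write_restart_async(path: str, snapshot: bytes) -> threading.Thread:
        """Copy the snapshot, then write it in the background while the solver resumes."""
        t = threading.Thread(target=write_restart, args=(path, bytes(snapshot)), daemon=True)
        t.start()
        return t          # join() before the next restart dump or at shutdown

    def restart_is_valid(path: str) -> bool:
        """Recompute the checksum on restart to detect truncated or corrupted files."""
        with open(path, "rb") as f:
            data = f.read()
        with open(path + ".sha256") as f:
            expected = f.read().strip()
        return hashlib.sha256(data).hexdigest() == expected
    ```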

  8. Kepler Data Validation Time Series File: Description of File Format and Content

    Science.gov (United States)

    Mullally, Susan E.

    2016-01-01

    The Kepler space mission searches its time series data for periodic, transit-like signatures. The ephemerides of these events, called Threshold Crossing Events (TCEs), are reported in the TCE tables at the NASA Exoplanet Archive (NExScI). Those TCEs are then further evaluated to create planet candidates and populate the Kepler Objects of Interest (KOI) table, also hosted at the Exoplanet Archive. The search, evaluation and export of TCEs is performed by two pipeline modules, TPS (Transit Planet Search) and DV (Data Validation). TPS searches for the strongest, believable signal and then sends that information to DV to fit a transit model, compute various statistics, and remove the transit events so that the light curve can be searched for other TCEs. More on how this search is done and on the creation of the TCE table can be found in Tenenbaum et al. (2012), Seader et al. (2015), Jenkins (2002). For each star with at least one TCE, the pipeline exports a file that contains the light curves used by TPS and DV to find and evaluate the TCE(s). This document describes the content of these DV time series files, and this introduction provides a bit of context for how the data in these files are used by the pipeline.

  9. Results of first catamnestical examinations of children of parents irradiated preconceptionally

    International Nuclear Information System (INIS)

    Herrmann, T.; Eberhardt, H.J.; Rupprecht, E.; Voigtmann, L.; Jochem, I.; Medizinische Akademie, Dresden

    1976-01-01

    Results of first preliminary examinations of 21 children whose parents were preconceptionally irradiated for tumor treatment are presented. Most striking variations have been found in connective tissue and skeleton. Carporadiograms proved to be a valuable means for examining minus variants in cases of normal clinical evidence. In the case of a child of an irradiated father malformations of the bones of hands and feet have been observed. There were two premature still-births out of three pregnancies in the case of a patient exposed to a high gonad dose. No obvious deviations could be observed in sense organs and the central nervous system. The sex ratio of children of irradiated women was significantly shifted in favour of girls. It is proposed to establish for the GDR a central file of data on descendants from persons irradiated preconceptionally. The present study may be considered as a model. Finally, the information to be given to radiotherapy patients in the generative age is dealt with. (author)

  10. Results of first catamnestical examinations of children of parents irradiated preconceptionally

    Energy Technology Data Exchange (ETDEWEB)

    Herrmann, T; Eberhardt, H J; Rupprecht, E; Voigtmann, L; Jochem, I [Medizinische Akademie, Dresden (German Democratic Republic). Kinderklinik; Medizinische Akademie, Dresden (German Democratic Republic). Radiologische Klinik]

    1976-12-01

    Results of first preliminary examinations of 21 children whose parents were preconceptionally irradiated for tumor treatment are presented. Most striking variations have been found in connective tissue and skeleton. Carporadiograms proved to be a valuable means for examining minus variants in cases of normal clinical evidence. In the case of a child of an irradiated father malformations of the bones of hands and feet have been observed. There were two premature still-births out of three pregnancies in the case of a patient exposed to a high gonad dose. No obvious deviations could be observed in sense organs and the central nervous system. The sex ratio of children of irradiated women was significantly shifted in favour of girls. It is proposed to establish for the GDR a central file of data on descendants from persons irradiated preconceptionally. The present study may be considered as a model. Finally, the information to be given to radiotherapy patients in the generative age is dealt with.

  11. Grid collector: An event catalog with automated file management

    International Nuclear Information System (INIS)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading the unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select events based on tags, such as "production date between March 10 and 20, and the number of charged tracks > 100." The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users.
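
    A minimal sketch of the selection step described above, with an invented in-memory tag index standing in for the Grid Collector's catalog and file-location services:

    ```python
    from datetime import date

    # Invented tag index: (event id, file, production date, number of charged tracks)
    events = [
        (1, "run100.root", date(2003, 3, 12), 154),
        (2, "run100.root", date(2003, 3, 25), 88),
        (3, "run101.root", date(2003, 3, 15), 212),
    ]

    def select(events, start, end, min_tracks):
        """Tag query: 'production date between start and end, charged tracks > min_tracks'."""
        return [(eid, f) for eid, f, d, ntrk in events
                if start <= d <= end and ntrk > min_tracks]

    hits = select(events, date(2003, 3, 10), date(2003, 3, 20), 100)
    files_to_stage = sorted({f for _, f in hits})   # only these files need transferring
    print(hits, files_to_stage)
    ```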

  12. Irradiation facilities in JRR-3M

    International Nuclear Information System (INIS)

    Ohtomo, Akitoshi; Sigemoto, Masamitsu; Takahashi, Hidetake

    1992-01-01

    Irradiation facilities have been installed in the upgraded JRR-3 (JRR-3M) at the Japan Atomic Energy Research Institute (JAERI). There are hydraulic rabbit facilities (HR), pneumatic rabbit facilities (PN), a neutron activation analysis facility (PN3), a uniform irradiation facility (SI), a rotating irradiation facility and capsule irradiation facilities for carrying out neutron irradiation in the JRR-3M. These facilities are operated using a process control computer system to centralize the process information. Some of the characteristics of the facilities were satisfactorily measured during the reactor performance test in 1990. During reactor operation, tests are continued to confirm the basic characteristics of the facilities; for example, PN3 was confirmed to have sufficient performance for activation analysis. Measurement of the neutron flux at all irradiation positions has been carried out for the equilibrium core. (author)

  13. Reliable file sharing in distributed operating system using web RTC

    Science.gov (United States)

    Dukiya, Rajesh

    2017-12-01

    Since the evolution of distributed operating systems, the distributed file system has become an important part of the operating system. P2P is a reliable way of file sharing in a distributed operating system. It was introduced in 1999 and later became a topic of high research interest. A peer-to-peer network is a type of network where peers share the network workload and other load-related tasks. A P2P network can also be a temporary connection, for example a group of computers connected by a USB (Universal Serial Bus) port to transfer files or enable disk sharing, i.e. file sharing. Currently, P2P requires a special network designed in a P2P way. Nowadays, browsers have a big influence on our lives. In this project we study the file sharing mechanism of distributed operating systems in web browsers, where we will try to find performance bottlenecks; our research aims to improve file sharing in terms of performance and scalability in distributed file systems. Additionally, we will discuss the scope of Web Torrent file sharing and free-riding in peer-to-peer networks.

  14. Workshop on materials irradiation effects and applications 2012

    International Nuclear Information System (INIS)

    Xu, Qiu; Sato, Koichi; Yoshiie, Toshimasa

    2013-01-01

    For the study of material irradiation effects, irradiation fields with improved control capabilities, advanced post-irradiation experiments and well developed data analyses are required. This workshop aims to discuss new results and to plan future irradiation research in the KUR. The general meeting was held from December 14 to 15, 2012, with 44 participants, and 28 papers were presented. In particular, recent experimental results obtained with irradiation facilities in the KUR, such as the Materials Controlled Irradiation Facility, the Low Temperature Loop and the LINAC, as well as results of computer simulation, were presented and fruitful discussions took place. This volume contains the summary and selected transparencies presented at the meeting. (author)

  15. Irradiance gradients

    International Nuclear Information System (INIS)

    Ward, G.J.; Heckbert, P.S.; Technische Hogeschool Delft

    1992-04-01

    A new method for improving the accuracy of a diffuse interreflection calculation is introduced in a ray tracing context. The information from a hemispherical sampling of the luminous environment is interpreted in a new way to predict the change in irradiance as a function of position and surface orientation. The additional computation involved is modest and the benefit is substantial. An improved interpolation of irradiance resulting from the gradient calculation produces smoother, more accurate renderings. This result is achieved through better utilization of ray samples rather than additional samples or alternate sampling strategies. Thus, the technique is applicable to a variety of global illumination algorithms that use hemicubes or Monte Carlo sampling techniques
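
    A minimal sketch of the hemispherical irradiance estimate that the gradient method builds on: with cosine-weighted sample directions, irradiance is approximated as E = (π/N)ΣL_i. The radiance lookup is a stand-in, and the paper's gradient terms are not reproduced here.

    ```python
    import math
    import random

    def cosine_weighted_direction():
        """Sample a direction on the unit hemisphere about +z with pdf = cos(theta)/pi."""
        u1, u2 = random.random(), random.random()
        r = math.sqrt(u1)
        phi = 2.0 * math.pi * u2
        return (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))

    def irradiance(sample_radiance, n_samples=256):
        """Monte Carlo irradiance estimate E = (pi / N) * sum of sampled radiances."""
        total = 0.0
        for _ in range(n_samples):
            total += sample_radiance(cosine_weighted_direction())
        return math.pi * total / n_samples

    # Sanity check: a constant radiance L gives E = pi * L.
    print(irradiance(lambda direction: 1.0))   # ~3.14159
    ```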

  16. Nuclide identifier and grat data reader application for ORIGEN output file

    International Nuclear Information System (INIS)

    Arif Isnaeni

    2011-01-01

    ORIGEN is a one-group depletion and radioactive decay computer code developed at the Oak Ridge National Laboratory (ORNL). ORIGEN uses a one-group neutronics calculation to provide various nuclear material characteristics (the buildup, decay and processing of radioactive materials). ORIGEN output is a text-based file; it contains only numbers in the form of grouped nuclide data, nuclide identifiers and grat. This application was created to facilitate the collection of nuclide identifier and grat data; it also has a function to acquire mass number data and calculate the mass (grams) of each nuclide. Output from this application can be used as input data for neutronic computer codes such as MCNP. (author)
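
    A minimal sketch of the post-processing described above, assuming ORIGEN-style ZZAAAI nuclide identifiers (e.g. 922350 for U-235) and reading 'grat' as gram-atoms; both are assumptions, and the inventory data are invented.

    ```python
    def mass_number(nuclide_id: int) -> int:
        """Extract the mass number A from a ZZAAAI identifier: 922350 -> 235."""
        return (nuclide_id // 10) % 1000

    def grams(nuclide_id: int, gram_atoms: float) -> float:
        """Approximate mass in grams, using the mass number in place of the atomic mass."""
        return gram_atoms * mass_number(nuclide_id)

    # Invented (identifier, gram-atom) pairs standing in for parsed ORIGEN output:
    inventory = [(922350, 2.0e-3), (922380, 1.5e-1), (942390, 4.0e-4)]
    for nid, grat in inventory:
        print(nid, mass_number(nid), f"{grams(nid, grat):.4f} g")
    ```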

  17. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  18. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  19. OK, Computer: File Sharing, the Music Industry, and Why We Need the Pirate Party

    Directory of Open Access Journals (Sweden)

    Adrian Cosstick

    2009-03-01

    Full Text Available The Pirate Party believes the state and big business are in the process of protecting stale and inefficient models of business for their own monetary benefit by limiting our right to share information. The Pirate Party suggests that they are achieving this goal through the amendment of intellectual property legislation. In the dawn of the digital era, the Pirate Party advocates that governments and multinational corporations are using intellectual property to: crack down on file sharing which limits the ability to share knowledge and information; increase the terms and length of copyright to raise profits; and build code into music files which limits their ability to be shared (Pirate Party, 2009). There are a number of 'copyright industries' that are affected by these issues, none more so than the music industry. Its relationship with file sharing is topical and makes an excellent case study to address the impact big business has had on intellectual property and the need for the Pirate Party's legislative input. The essay will then examine the central issues raised by illegal file sharing, in particular the future for record companies in an environment that increasingly demands flexibility, and whether the Pirate Party's proposal is a viable solution to the music industry's problems.

  20. Accessing files in an Internet: The Jade file system

    Science.gov (United States)

    Peterson, Larry L.; Rao, Herman C.

    1991-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
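
    A minimal sketch of a per-user logical name space in the spirit described above: several underlying file systems are mounted, resolution picks the longest matching mount prefix, and one logical name space can mount another. The accessor objects are hypothetical placeholders.

    ```python
    class NameSpace:
        def __init__(self):
            self.mounts = {}                      # mount point -> accessor or NameSpace

        def mount(self, point: str, target):
            self.mounts[point.rstrip("/") or "/"] = target

        def resolve(self, path: str):
            """Return (accessor, remainder) for the longest matching mount prefix."""
            best = None
            for point in self.mounts:
                if path == point or path.startswith(point.rstrip("/") + "/"):
                    if best is None or len(point) > len(best):
                        best = point
            if best is None:
                raise FileNotFoundError(path)
            target = self.mounts[best]
            rest = path[len(best):].lstrip("/")
            if isinstance(target, NameSpace):     # logical name spaces can mount each other
                return target.resolve("/" + rest)
            return target, rest

    shared = NameSpace()
    shared.mount("/papers", "ftp-accessor")       # hypothetical accessor objects
    private = NameSpace()
    private.mount("/nfs", "nfs-accessor")
    private.mount("/jade", shared)
    print(private.resolve("/jade/papers/1991/jade.ps"))
    ```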

  1. Accessing files in an internet - The Jade file system

    Science.gov (United States)

    Rao, Herman C.; Peterson, Larry L.

    1993-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  2. Enkripsi dan Dekripsi File dengan Algoritma Blowfish pada Perangkat Mobile Berbasis Android

    Directory of Open Access Journals (Sweden)

    Siswo Wardoyo

    2016-03-01

    Full Text Available Cryptography is one of the ways used to secure data in the form of files: files are encrypted so that others who are not entitled cannot read files that are private and confidential. One such method is the Blowfish algorithm, a symmetric-key algorithm used to perform encryption and decryption. The application that was built can encrypt files in the form of images, videos, and documents. It can run on a mobile phone with at least the Android 2.3 operating system. The software used to build the application is Eclipse. The results of this research indicate that the application built is capable of performing encryption and decryption. Encryption turns a file into another file whose meaning is unknown. Using a key of 72 bits (9 characters), it would take about 1.49x10^8 years to break it at a computation speed of 10^6 keys/sec.
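
    A hedged sketch of such file encryption using PyCryptodome's Blowfish in CBC mode (the mode is an assumption; the abstract does not state one), followed by the brute-force arithmetic behind the quoted figure for a 72-bit key:

    ```python
    from Crypto.Cipher import Blowfish
    from Crypto.Util.Padding import pad, unpad

    key = b"9charKey!"                              # 9 bytes = 72 bits
    data = b"stand-in for image, video, or document bytes"

    enc = Blowfish.new(key, Blowfish.MODE_CBC)
    ciphertext = enc.iv + enc.encrypt(pad(data, Blowfish.block_size))

    dec = Blowfish.new(key, Blowfish.MODE_CBC, iv=ciphertext[:8])
    plaintext = unpad(dec.decrypt(ciphertext[8:]), Blowfish.block_size)
    assert plaintext == data

    # Exhaustive search of a 72-bit key space at 1e6 keys/sec:
    years = 2**72 / 1e6 / (3600 * 24 * 365)
    print(f"{years:.2e} years")                     # ~1.5e8, matching the abstract's figure
    ```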

  3. Strategies for Sharing Seismic Data Among Multiple Computer Platforms

    Science.gov (United States)

    Baker, L. M.; Fletcher, J. B.

    2001-12-01

    Seismic waveform data is readily available from a variety of sources, but it often comes in a distinct, instrument-specific data format. For example, data may be from portable seismographs, such as those made by Refraction Technology or Kinemetrics, from permanent seismograph arrays, such as the USGS Parkfield Dense Array, from public data centers, such as the IRIS Data Center, or from personal communication with other researchers through e-mail or ftp. A computer must be selected to import the data - usually whichever is the most suitable for reading the originating format. However, the computer best suited for a specific analysis may not be the same. When copies of the data are then made for analysis, a proliferation of copies of the same data results, in possibly incompatible, computer-specific formats. In addition, if an error is detected and corrected in one copy, or some other change is made, all the other copies must be updated to preserve their validity. Keeping track of what data is available, where it is located, and which copy is authoritative requires an effort that is easy to neglect. We solve this problem by importing waveform data to a shared network file server that is accessible to all our computers on our campus LAN. We use a Network Appliance file server running Sun's Network File System (NFS) software. Using an NFS client software package on each analysis computer, waveform data can then be read by our MatLab or Fortran applications without first copying the data. Since there is a single copy of the waveform data in a single location, the NFS file system hierarchy provides an implicit complete waveform data catalog and the single copy is inherently authoritative. Another part of our solution is to convert the original data into a blocked-binary format (known historically as USGS DR100 or VFBB format) that is interpreted by MatLab or Fortran library routines available on each computer so that the idiosyncrasies of each machine are not visible to

  4. GIFT: an HEP project for file transfer

    International Nuclear Information System (INIS)

    Ferrer, M.L.; Mirabelli, G.; Valente, E.

    1986-01-01

    Started in autumn 1983, GIFT (General Internetwork File Transfer) is a collaboration among several HEP centers, including CERN, Frascati, Oslo, Oxford, RAL and Rome. The collaboration was initially set up with the aim of studying the feasibility of a software system to allow direct file exchange between computers which do not share a common Virtual File Protocol. After the completion of this first phase, an implementation phase started and, since March 1985, an experimental service based on this system has been running at CERN between DECnet, CERNET and the UK Coloured Book protocols. The authors present the motivations that, together with previous gateway experiences, led to the definition of GIFT specifications and to the implementation of the GIFT Kernel system. The position of GIFT in the overall development framework of the networking facilities needed by large international collaborations within the HEP community is explained. (Auth.)

  5. FROM CAD MODEL TO 3D PRINT VIA “STL” FILE FORMAT

    Directory of Open Access Journals (Sweden)

    Cătălin IANCU

    2010-06-01

    Full Text Available The paper presents the STL file format, which is now used for transferring information from CAD software to a 3D printer in order to obtain the solid model in rapid prototyping and computer-aided manufacturing. It also presents the STL format structure, its history, limitations and further development, as well as the new version to come and other similar file formats. In conclusion, the STL file used to transfer data from CAD packages to 3D printers has a series of limitations, and therefore new formats will soon replace it.
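
    For reference, the ASCII variant of the STL format is just a list of triangular facets, each with a normal and three vertices; a minimal sketch of a writer:

    ```python
    def write_ascii_stl(path, name, facets):
        """facets: iterable of (normal, (v1, v2, v3)), each a 3-tuple of floats."""
        with open(path, "w") as f:
            f.write(f"solid {name}\n")
            for (nx, ny, nz), verts in facets:
                f.write(f"  facet normal {nx:e} {ny:e} {nz:e}\n    outer loop\n")
                for (x, y, z) in verts:
                    f.write(f"      vertex {x:e} {y:e} {z:e}\n")
                f.write("    endloop\n  endfacet\n")
            f.write(f"endsolid {name}\n")

    # One facet: a unit right triangle in the z = 0 plane.
    write_ascii_stl("tri.stl", "demo",
                    [((0.0, 0.0, 1.0),
                      ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))])
    ```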

  6. Transmission of the environmental radiation data files on the internet

    International Nuclear Information System (INIS)

    Yamaguchi, Yoshiaki; Saito, Tadashi; Yamamoto, Takayoshi; Matsumoto, Atsushi; Kyoh, Bunkei

    1999-01-01

    Recently, any text or data file has come to be transportable through the Internet with a personal computer. The selection of monitoring points is, however, restricted by the need to lay cables, because a dedicated circuit is generally used for continuous-type environmental monitors. This is the reason why we have developed an environmental monitoring system that can transmit radiation data files over the Internet. A 3''φ x 3'' NaI(Tl) detector and a thermo-hygrometer are installed in the monitoring post of this system, and the data files of those detectors are transmitted from a personal computer at the monitoring point to the Radioisotope Research Center of Osaka University. Environmental monitoring data from remote places have easily been obtained thanks to data transmission through the Internet. Moreover, the system yields higher precision environmental monitoring data because it includes the energy information of γ-rays. If it is possible to maintain the monitors at remote places, this system could carry out continuous environmental monitoring over a wide area. (author)

  7. Transmission of the environmental radiation data files on the internet

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, Yoshiaki; Saito, Tadashi; Yamamoto, Takayoshi [Osaka Univ., Suita (Japan). Radioisotope Research Center; Matsumoto, Atsushi; Kyoh, Bunkei

    1999-01-01

    Recently, any text or data file has come to be transportable through the Internet with a personal computer. The selection of monitoring points is, however, restricted by the need to lay cables, because a dedicated circuit is generally used for continuous-type environmental monitors. This is the reason why we have developed an environmental monitoring system that can transmit radiation data files over the Internet. A 3''φ x 3'' NaI(Tl) detector and a thermo-hygrometer are installed in the monitoring post of this system, and the data files of those detectors are transmitted from a personal computer at the monitoring point to the Radioisotope Research Center of Osaka University. Environmental monitoring data from remote places have easily been obtained thanks to data transmission through the Internet. Moreover, the system yields higher precision environmental monitoring data because it includes the energy information of γ-rays. If it is possible to maintain the monitors at remote places, this system could carry out continuous environmental monitoring over a wide area. (author)

  8. Grid collector: An event catalog with automated file management

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-10-17

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading the unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select events based on tags, such as "production date between March 10 and 20, and the number of charged tracks > 100." The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users.

  9. Final Report for File System Support for Burst Buffers on HPC Systems

    Energy Technology Data Exchange (ETDEWEB)

    Yu, W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohror, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-11-27

    Distributed burst buffers are a promising storage architecture for handling I/O workloads for exascale computing. As they are being deployed on more supercomputers, a file system that efficiently manages these burst buffers for fast I/O operations is of great consequence. Over the past year, the FSU team has undertaken several efforts to design, prototype and evaluate distributed file systems for burst buffers on HPC systems. These include MetaKV: a Key-Value Store for Metadata Management of Distributed Burst Buffers, a user-level file system with multiple backends, and a specialized file system for large datasets of deep neural networks. Our progress on these respective efforts is elaborated further in this report.

  10. Embrittlement of irradiated ferritic/martensitic steels in the absence of irradiation hardening

    Energy Technology Data Exchange (ETDEWEB)

    Klueh, R.L. [Oak Ridge National Laboratory, TN (United States); Shiba, K. [Japan Atomic Energy Agency, Tokai-mura, Naka-gun, Ibaraki-ken (Japan); Sokolov, M. [Oak Ridge National Laboratory, Materials Science and Technology Div., TN (United States)

    2007-07-01

    regime. Indications were that this embrittlement was also caused by irradiation-accelerated or irradiation-induced precipitation. These observations of embrittlement in the absence of irradiation hardening have been examined and analyzed with computational thermodynamics modeling to illuminate and understand the effect. (authors)

  11. Characteristics of file sharing and peer to peer networking | Opara ...

    African Journals Online (AJOL)

    Characteristics of file sharing and peer to peer networking. ... distributing or providing access to digitally stored information, such as computer programs, ... including in multicast systems, anonymous communications systems, and web caches.

  12. Microstructure of irradiated materials

    International Nuclear Information System (INIS)

    Robertson, I.M.

    1995-01-01

    The focus of the symposium was on the changes produced in the microstructure of metals, ceramics, and semiconductors by irradiation with energetic particles. The symposium brought together those working on the different material systems, which revealed that there are a remarkable number of similarities in the irradiation-produced microstructures of the different classes of materials. Experimental, computational and theoretical contributions were intermixed in all of the sessions. This provided an opportunity for these groups, which should interact, to do so. Separate abstracts were prepared for 58 papers in this book.

  13. Adobe acrobat: an alternative electronic teaching file construction methodology independent of HTML restrictions.

    Science.gov (United States)

    Katzman, G L

    2001-03-01

    The goal of the project was to create a method by which an in-house digital teaching file could be constructed that was simple, inexpensive, independent of hypertext markup language (HTML) restrictions, and appears identical on multiple platforms. To accomplish this, Microsoft PowerPoint and Adobe Acrobat were used in succession to assemble digital teaching files in the Acrobat portable document file format. They were then verified to appear identically on computers running Windows, Macintosh Operating Systems (OS), and the Silicon Graphics Unix-based OS as either a free-standing file using Acrobat Reader software or from within a browser window using the Acrobat browser plug-in. This latter display method yields a file viewed through a browser window, yet remains independent of underlying HTML restrictions, which may confer an advantage over simple HTML teaching file construction. Thus, a hybrid of HTML-distributed Adobe Acrobat generated WWW documents may be a viable alternative for digital teaching file construction and distribution.

  14. Nuclear structure data file. A manual for preparation of data sets

    International Nuclear Information System (INIS)

    Ewbank, W.B.; Schmorak, M.R.; Bertrand, F.E.; Feliciano, M.; Horen, D.J.

    1975-06-01

    The Nuclear Data Project at ORNL is building a computer-based file of nuclear structure data, which is intended for use by both basic and applied users. For every nucleus, the Nuclear Structure Data File contains evaluated nuclear structure information. This manual describes a standard input format for nuclear structure data. The format is sufficiently structured that bulk data can be entered efficiently. At the same time, the structure is open-ended and can accommodate most measured or deduced quantities that yield nuclear structure information. Computer programs have been developed at the Data Project to perform consistency checking and routine calculations. Programs are also used for preparing level scheme drawings. (U.S.)

  15. High School and Beyond Transcripts Survey (1982). Data File User's Manual. Contractor Report.

    Science.gov (United States)

    Jones, Calvin; And Others

    This data file user's manual documents the procedures used to collect and process high school transcripts for a large sample of the younger cohort (1980 sophomores) in the High School and Beyond survey. The manual provides the user with the technical assistance needed to use the computer file and also discusses the following: (1) sample design for…

  16. File structure and organization in the automation system for operative account of equipment and materials in JINR

    International Nuclear Information System (INIS)

    Gulyaeva, N.D.; Markova, N.F.; Nikitina, V.I.; Tentyukova, G.N.

    1975-01-01

    The structure and organization of files in the information bank for the first variant of a JINR material and technical supply subsystem are described. An automated system for operative stock-taking of equipment based on the SDS-6200 computer has been developed. Information is stored on magnetic discs. The arrangement of each file depends on its purpose and the structure of its data. Access to the files can be random or sequential. The files are divided into groups: primary document files, long-term references, and information on items that may change as a result of administrative decisions [ru]

  17. Experimental Analysis of File Transfer Rates over Wide-Area Dedicated Connections

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S. [ORNL; Liu, Qiang [ORNL; Sen, Satyabrata [ORNL; Hinkel, Gregory Carl [ORNL; Imam, Neena [ORNL; Foster, Ian [University of Chicago; Kettimuthu, R. [Argonne National Laboratory (ANL); Settlemyer, Bradley [Los Alamos National Laboratory (LANL); Wu, Qishi [University of Memphis; Yun, Daqing [Harrisburg University

    2016-12-01

    File transfers over dedicated connections, supported by large parallel file systems, have become increasingly important in high-performance computing and big data workflows. It remains a challenge to achieve peak rates for such transfers due to the complexities of file I/O, host, and network transport subsystems, and equally importantly, their interactions. We present extensive measurements of disk-to-disk file transfers using Lustre and XFS file systems mounted on multi-core servers over a suite of 10 Gbps emulated connections with 0-366 ms round trip times. Our results indicate that large buffer sizes and many parallel flows do not always guarantee high transfer rates. Furthermore, large variations in the measured rates necessitate repeated measurements to ensure confidence in inferences based on them. We propose a new method to efficiently identify the optimal joint file I/O and network transport parameters using a small number of measurements. We show that for XFS and Lustre with direct I/O, this method identifies configurations achieving 97% of the peak transfer rate while probing only 12% of the parameter space.

  18. Image Steganography of Multiple File Types with Encryption and Compression Algorithms

    Directory of Open Access Journals (Sweden)

    Ernest Andreigh C. Centina

    2017-05-01

    Full Text Available The goals of this study were to develop a system intended for securing files through the technique of image steganography integrated with cryptography, utilizing the ZLIB algorithm for compressing and decompressing secret files, the DES algorithm for encryption and decryption, and the Least Significant Bit algorithm for file embedding and extraction, to prevent the compromise of highly confidential files by unauthorized persons. The system is in accordance with the ISO 9126 international quality standards. Every quality criterion of the system was evaluated by 10 Information Technology professionals, and the arithmetic mean and standard deviation of the survey were computed. The results show that most of them strongly agreed that the system is excellently effective based on Functionality, Reliability, Usability, Efficiency, Maintainability and Portability conformance to the ISO 9126 standards. The system was found to be a useful tool for both government agencies and private institutions, for it can keep secret not only the message but also the existence of that particular message or file, while maintaining the privacy of highly confidential and sensitive files from unauthorized access.
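
    A minimal sketch of the Least Significant Bit embedding and extraction step described above (the DES and ZLIB stages are omitted); it assumes Pillow and an RGB cover image saved in a lossless format:

    ```python
    from PIL import Image

    def embed_lsb(cover_path, payload: bytes, out_path):
        """Hide payload bytes in the least significant bits of the cover image's channels."""
        img = Image.open(cover_path).convert("RGB")
        channels = [c for px in img.getdata() for c in px]
        data = len(payload).to_bytes(4, "big") + payload          # 4-byte length header
        bits = [(byte >> (7 - i)) & 1 for byte in data for i in range(8)]
        if len(bits) > len(channels):
            raise ValueError("payload too large for cover image")
        for i, bit in enumerate(bits):
            channels[i] = (channels[i] & ~1) | bit                # overwrite the LSB
        img.putdata(list(zip(channels[0::3], channels[1::3], channels[2::3])))
        img.save(out_path)                                        # use a lossless format, e.g. PNG

    def extract_lsb(stego_path) -> bytes:
        """Recover the hidden bytes by reading the LSBs back, length header first."""
        channels = [c for px in Image.open(stego_path).convert("RGB").getdata() for c in px]
        def read(n_bytes, bit_offset=0):
            out = bytearray()
            for b in range(n_bytes):
                byte = 0
                for i in range(8):
                    byte = (byte << 1) | (channels[bit_offset + b * 8 + i] & 1)
                out.append(byte)
            return bytes(out)
        length = int.from_bytes(read(4), "big")
        return read(length, bit_offset=32)
    ```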

  19. Database organization for computer-aided characterization of laser diode

    International Nuclear Information System (INIS)

    Oyedokun, Z.O.

    1988-01-01

    Computer-aided data logging involves a huge amount of data which must be properly managed for optimized storage space, easy access, retrieval and utilization. An organization method is developed to enhance the advantages of computer-based data logging of semiconductor injection laser testing, which optimizes storage space, permits authorized users easy access and inhibits penetration. This method is based on a unique file identification protocol, a tree structure and command file-oriented access procedures.

  20. Next generation WLCG File Transfer Service (FTS)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    LHC experiments at CERN and worldwide utilize WLCG resources and middleware components to perform distributed computing tasks. One of the most important tasks is reliable file replication. It is a complex problem, suffering from transfer failures, disconnections, transfer duplication, server and network overload, differences in storage systems, etc. To address these problems, EMI and gLite have provided the independent File Transfer Service (FTS) and Grid File Access Library (GFAL) tools. Their development started almost a decade ago; in the meantime, requirements in data management have changed, and the old architecture of FTS and GFAL cannot easily support these changes. Technology has also been progressing: FTS and GFAL do not fit into the new paradigms (cloud, messaging, for example). To be able to serve the next stage of LHC data collecting (from 2013), we need a new generation of these tools: FTS 3 and GFAL 2. We envision a service requiring minimal configuration, which can dynamically adapt to the...

  1. Nitrogen compounds behavior under irradiation environment

    International Nuclear Information System (INIS)

    Ichikawa, Nagayoshi; Takagi, Junichi; Yotsuyanagi, Tadasu

    1991-01-01

    Laboratory experiments were performed to evaluate nitrogen compounds behavior in liquid phase under irradiation environments. Nitrogen compounds take a chemical form of ammonium ion under reducing condition by gamma irradiation, whereas ammonium ions are rather stable even under oxidizing conditions. Key reactions were pointed out and their reaction rate constants and activation energies were estimated through computer code simulation. A reaction scheme for nitrogen compounds including protonate reaction was proposed. (author)

  2. Slag recycling of irradiated vanadium

    International Nuclear Information System (INIS)

    Gorman, P.K.

    1995-01-01

    An experimental inductoslag apparatus to recycle irradiated vanadium was fabricated and tested. An experimental electroslag apparatus was also used to test possible slags. The testing was carried out with slag materials that were fabricated along with impurity bearing vanadium samples. Results obtained include computer simulated thermochemical calculations and experimentally determined removal efficiencies of the transmutation impurities. Analyses of the samples before and after testing were carried out to determine if the slag did indeed remove the transmutation impurities from the irradiated vanadium

  3. Students "Hacking" School Computer Systems

    Science.gov (United States)

    Stover, Del

    2005-01-01

    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  4. A History of the Andrew File System

    CERN Multimedia

    CERN. Geneva; Altman, Jeffrey

    2011-01-01

    Derrick Brashear and Jeffrey Altman will present a technical history of the evolution of Andrew File System starting with the early days of the Andrew Project at Carnegie Mellon through the commercialization by Transarc Corporation and IBM and a decade of OpenAFS. The talk will be technical with a focus on the various decisions and implementation trade-offs that were made over the course of AFS versions 1 through 4, the development of the Distributed Computing Environment Distributed File System (DCE DFS), and the course of the OpenAFS development community. The speakers will also discuss the various AFS branches developed at the University of Michigan, Massachusetts Institute of Technology and Carnegie Mellon University.

  5. SIDS-to-ADF File Mapping Manual

    Science.gov (United States)

    McCarthy, Douglas; Smith, Matthew; Poirier, Diane; Smith, Charles A. (Technical Monitor)

    2002-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The SIDS-toADF File Mapping Manual specifies the exact manner in which, under CGNS conventions, CFD data structures (the SIDS) are to be stored in (i.e., mapped onto) the file structure provided by the database manager (ADF). The result is a conforming CGNS database. Adherence to the mapping conventions guarantees uniform meaning and location of CFD data within ADF files, and thereby allows the construction of

  6. Competency Reference for Computer Assisted Drafting.

    Science.gov (United States)

    Oregon State Dept. of Education, Salem. Div. of Vocational Technical Education.

    This guide, developed in Oregon, lists competencies essential for students in computer-assisted drafting (CAD). Competencies are organized in eight categories: computer hardware, file usage and manipulation, basic drafting techniques, mechanical drafting, specialty disciplines, three dimensional drawing/design, plotting/printing, and advanced CAD.…

  7. Distributed computing for FTU data handling

    Energy Technology Data Exchange (ETDEWEB)

    Bertocchi, A. E-mail: bertocchi@frascati.enea.it; Bracco, G.; Buceti, G.; Centioli, C.; Giovannozzi, E.; Iannone, F.; Panella, M.; Vitale, V

    2002-06-01

    The growth of data warehouses in tokamak experiments is leading fusion laboratories to provide new IT solutions in data handling. In the last three years, the Frascati Tokamak Upgrade (FTU) experimental database was migrated from an IBM mainframe to a Unix distributed computing environment. The migration efforts have taken into account the following items: (1) a new data storage solution based on a storage area network over fibre channel; (2) the Andrew File System (AFS) for wide area network file sharing; (3) a 'one measure/one file' philosophy replacing 'one shot/one file' to provide faster read/write data access; (4) more powerful services, such as AFS, CORBA and MDSplus, to allow users to access the FTU database from different clients, regardless of their OS; (5) large availability of data analysis tools, from the locally developed utility SHOW to the multi-platform Matlab, Interactive Data Language and jScope (all these tools are now also able to access the Joint European Torus data, in the framework of the remote data access activity); (6) a batch-computing cluster of Alpha/Compaq Tru64 CPUs based on CODINE/GRD to optimize utilization of software and hardware resources.

  8. Dosimetry of the portable blood irradiator

    International Nuclear Information System (INIS)

    Roberson, P.L.; Hungate, F.P.; Reece, W.D.; Tanner, J.E.

    1985-08-01

    A portable blood irradiator was developed at the Pacific Northwest Laboratory to evaluate the effects of chronic irradiation of the blood in suppressing graft rejection. The irradiator, designed to be worn on the arm or leg and be surgically connected in an arterio-venous shunt, uses beta radiation from activated thulium embedded in a vitreous carbon matrix to reduce the number of lymphocytes circulating in the blood. The dose and energy spectra relative to the distance from and position around a prototype irradiator were measured by thermoluminescent dosimeters, ion chambers and photon spectroscopy. With computer simulations using those measurements, the shielding was redesigned to minimize the radiation dose to the patient and to the attending personnel and to minimize the weight of the irradiator. The new shielding design was incorporated into a new prototype, and the dose and spectral measurements were repeated, confirming the design improvements. 10 refs., 11 figs.

  9. Input data requirements for special processors in the computation system containing the VENTURE neutronics code

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.

    1976-11-01

    This report presents user input data requirements for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user-oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files which are generated

  10. Input data requirements for special processors in the computation system containing the VENTURE neutronics code

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.

    1979-07-01

    User input data requirements are presented for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files which are generated

  11. The Fifth Workshop on HPC Best Practices: File Systems and Archives

    Energy Technology Data Exchange (ETDEWEB)

    Hick, Jason; Hules, John; Uselton, Andrew

    2011-11-30

    The workshop on High Performance Computing (HPC) Best Practices on File Systems and Archives was the fifth in a series sponsored jointly by the Department Of Energy (DOE) Office of Science and DOE National Nuclear Security Administration. The workshop gathered technical and management experts for operations of HPC file systems and archives from around the world. Attendees identified and discussed best practices in use at their facilities, and documented findings for the DOE and HPC community in this report.

  12. Analyses of multi-irradiation film for system alignments in stereotactic radiotherapy (SRT) and radiosurgery (SRS)

    International Nuclear Information System (INIS)

    Jen-San Tsai

    1996-01-01

    In stereotactic radiosurgery, a seven-irradiation film was used to define any discrepancy between the beam and target centres. A mathematical model based on the linac alignment and target set-up was developed to diagnose the discrepancies of the seven-irradiation film between the beam and simulation target centres. From the measured data of the multi-irradiation film, this mathematical model leads to five parameters in seven equations. Twin computer codes were employed to solve the five parameters from the seven equations. By feeding the discrepancy data into the two computer codes, the sources of the target-to-beam discrepancy were revealed. From these decoded sources, the target coordinates were adjusted and then the seven-irradiation film procedure was repeated. This discrepancy thus obtained was found to be drastically reduced. Some decoded parameters were consistently verified by direct measurements. This demonstrates that the present mathematical model and computer code do reveal the causes of the target-to-beam misalignment and gantry sag. In a further effort to test the feasibility of the mathematical model and the computer codes, the target's lateral coordinate was deliberately offset by 1.5 mm and then another seven-irradiation film was taken. By inserting these discrepancies into the computer codes, it was found that the deviation was consistent with the intentional offset. In addition, the mathematical model and computer codes are applicable to any multi-irradiation technique. (author)
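
    The abstract does not reproduce the explicit form of the seven equations; purely as a hedged illustration of the general decoding step, the sketch below (hypothetical coefficient matrix and simulated measurements, not the paper's model) solves an overdetermined linear system of seven equations in five unknowns by least squares, which is the standard way such a parameter-recovery step can be implemented.

        import numpy as np

        # Hypothetical 7x5 matrix relating five unknown alignment parameters
        # (e.g. target offsets, gantry sag terms) to seven measured film
        # discrepancies; the real coefficients come from the paper's model.
        rng = np.random.default_rng(0)
        A = rng.normal(size=(7, 5))
        true_params = np.array([1.5, -0.3, 0.8, 0.0, 0.2])    # e.g. a 1.5 mm lateral offset
        b = A @ true_params + rng.normal(scale=0.05, size=7)   # simulated discrepancies

        # Least-squares solution of the overdetermined system A x = b
        params, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
        print("recovered parameters:", np.round(params, 3))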

  13. Dose Distribution Calculation Using MCNPX Code in the Gamma-ray Irradiation Cell

    International Nuclear Information System (INIS)

    Kim, Yong Ho

    1991-02-01

    60Co gamma irradiators have long been used for food sterilization, plant mutation and the development of radio-protective agents, radio-sensitizers and other purposes. The Applied Radiological Science Research Institute of Cheju National University has a multipurpose gamma irradiation facility loaded with an MDS Nordion standard 60Co source (C-188), of which the initial activity was 400 TBq (10,800 Ci) on February 19, 2004. This panoramic gamma irradiator is designed to irradiate in all directions various samples, such as plants, cultured cells and mice, to administer given radiation doses. In order to give accurate doses to irradiation samples, appropriate methods of evaluating, both by calculation and measurement, the radiation doses delivered to the samples should be set up. Computational models have been developed to evaluate the radiation dose distributions inside the irradiation chamber and the radiation doses delivered to typical biological samples which are frequently irradiated in the facility. The computational models are based on the MCNPX code. The horizontal and vertical dose distributions have been calculated inside the irradiation chamber, and the calculated results were compared with measured data obtained with radiation dosimeters to verify the computational models. The radiation dosimeters employed are a Farmer-type ion chamber and MOSFET dosimeters. Radiation doses delivered to cultured cell samples contained in test tubes and to a mouse fixed in an irradiation cage were calculated with the computational models, and the calculated results were compared with the measured data. The computational models are also tested to see if they can accurately simulate the case where a thick lead shield is placed between the source and detector. Three tally options of the MCNPX code, F4, F5 and F6, are alternately used to see which option produces optimum results. The computational models are also used to calculate gamma ray energy spectra of a BGO scintillator at

  14. JENDL special purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1995-01-01

    In JENDL-3.2, the data on all the reactions having significant cross sections over the neutron energy range from 0.01 meV to 20 MeV are given for 340 nuclides. The range of application extends widely, covering neutron engineering, shielding and other fields for fast reactors, thermal neutron reactors and nuclear fusion reactors. This is a general purpose data file. In contrast, a file in which only the data required for a specific application field are collected is called a special purpose file. The file for dosimetry is a typical special purpose file. The Nuclear Data Center, Japan Atomic Energy Research Institute, is making ten kinds of JENDL special purpose files. The files, for which the working groups of the Sigma Committee are responsible, are listed. As for the format of the files, the ENDF format is used, as in JENDL-3.2. The dosimetry file, activation cross section file, (α, n) reaction data file, fusion file, actinoid file, high energy data file, photonuclear data file, PKA/KERMA file, gas production cross section file and decay data file are described in terms of their contents, course of development and verification. The dosimetry file and gas production cross section file have already been completed. As for the others, the expected time of completion is given. When these files are completed, they will be opened to the public. (K.I.)

  15. Startup of the Whiteshell irradiation facility

    International Nuclear Information System (INIS)

    Barnard, J.W.; Stanley, F.W.

    1989-01-01

    Recently, a 10-MeV, 1-kW electron linear accelerator was installed in a specially designed irradiation facility at the Whiteshell Nuclear Research Establishment. The facility was designed for radiation applications research in the development of new radiation processes up to the pilot scale level. The accelerator is of advanced design. Automatic startup via computer control makes it compatible with industrial processing. It has been operated successfully as a fully integrated electron irradiator for a number of applications including curing of plastics and composites, sterilization of medical disposables and animal feed irradiation. We report here on our experience during the first six months of operation. (orig.)

  16. Startup of the whiteshell irradiation facility

    Science.gov (United States)

    Barnard, J. W.; Stanley, F. W.

    1989-04-01

    Recently, a 10-MeV, 1-kW electron linear accelerator was installed in a specially designed irradiation facility at the Whiteshell Nuclear Research Establishment. The facility was designed for radiation applications research in the development of new radiation processes up to the pilot scale level. The accelerator is of advanced design. Automatic startup via computer control makes it compatible with industrial processing. It has been operated successfully as a fully integrated electron irradiator for a number of applications including curing of plastics and composites, sterilization of medical disposables and animal feed irradiation. We report here on our experience during the first six months of operation.

  17. Estimation of irradiated control rod worth

    International Nuclear Information System (INIS)

    Varvayanni, M.; Catsaros, N.; Antonopoulos-Domis, M.

    2009-01-01

    When depleted control rods are planned to be used in new core configurations, their worth has to be accurately predicted in order to deduce key design and safety parameters such as the available shutdown margin. In this work a methodology is suggested for the derivation of the distributed absorbing capacity of a depleted rod, useful in the case that the level of detail that is known about the irradiation history of the control rod does not allow an accurate calculation of the absorber's burnup. The suggested methodology is based on measurements of the rod's worth carried out in the former core configuration and on corresponding calculations based on the original (before first irradiation) absorber concentration. The methodology is formulated for the general case of the multi-group theory; it is successfully tested for the one-group approximation, for a depleted control rod of the Greek Research Reactor, containing five neutron absorbers. The computations reproduce satisfactorily the irradiated rod worth measurements, practically eliminating the discrepancy of the total rod worth, compared to the computations based on the nominal absorber densities.

  18. Needs of food irradiation and its commercialization

    Energy Technology Data Exchange (ETDEWEB)

    Welt, M A

    1984-01-01

    On July 5, 1983, the United States Food and Drug Administration filed a notice in the Federal Register approving the use of cobalt 60 or cesium 137 gamma radiation to reduce or control microbial contamination in spices, onion powder and garlic powder. The approval was the first in nineteen years issued by the FDA and appears to set the stage for increased regulatory approvals in the area of radiation preservation of foods. On July 8, 1983, the Codex Alimentarius Commission approved an international standard for irradiated foods. The standard had previously been introduced by experts in the World Health Organization and the Food and Agriculture Organization. The ability of properly applied ionizing energy to inhibit sprouting, to eliminate the need for toxic chemical fumigants for insect disinfestation purposes, to extend refrigerated shelf life of many food products, to eliminate parasites and pathogens from our food chain and to preserve precooked packaged food products for indefinite storage without freezing or refrigeration, dictates the timeliness of food irradiation technology.

  19. Computer simulation of defect behavior under fusion irradiation environments

    International Nuclear Information System (INIS)

    Muroga, T.; Ishino, S.

    1983-01-01

    To simulate defect behavior under irradiation, three kinds of cascade-annealing calculations have been carried out in alpha-iron using the codes MARLOWE, DAIQUIRI and their modifications. They are (1) cascade-annealing calculations with different projectile masses, (2) defect drifting near dislocations after cascade production and (3) cascade-overlap calculations. The defect survival ratio is found to increase with decreasing projectile mass, both after athermal close-pair recombination and after thermal annealing. It is shown that at moderate temperatures vacancy clustering is enhanced near dislocations. Cascade overlap is found to decrease the defect survivability. In addition, the role of helium in vacancy clustering has been calculated in aluminium lattices, and its effect is found to depend strongly on temperature, interstitials and the mobility of small clusters. These results correspond well to the experimental data and will be helpful for correlating fusion and simulation irradiations. (orig.)

  20. Experimental and computer simulation study of radionuclide yields in the ADT materials irradiated with intermediate energy protons

    Energy Technology Data Exchange (ETDEWEB)

    Titarenko, Yu.E.; Shvedov, O.V.; Batyaev, V.F. [Inst. for Theoretical and Experimental Physics, B. Cheremushkinskaya, Moscow (Russian Federation)] [and others

    1998-11-01

    The results of measurements and computer simulations of the yields of residual product nuclei in {sup 209}Bi, {sup 208,207,206,nat}Pb, {sup 65,63}Cu, {sup 59}Co thin targets irradiated by 0.13, 1.2 and 1.5 GeV protons are presented. The yields were measured by direct high-precision {gamma}-spectrometry. The process was monitored by the {sup 27}Al(p,x){sup 24}Na reaction. 801 cross sections are presented and used in comparisons between the reaction yields obtained experimentally and simulated by the HETC, GNASH, LAHET, INUCL, CEM95, CASCADE, NUCLEUS, YIELDX, QMD and ALICE codes. (author)

  1. A lightweight high availability strategy for Atlas LCG File Catalogs

    International Nuclear Information System (INIS)

    Martelli, Barbara; Salvo, Alessandro de; Anzellotti, Daniela; Rinaldi, Lorenzo; Cavalli, Alessandro; Pra, Stefano dal; Dell'Agnello, Luca; Gregori, Daniele; Prosperini, Andrea; Ricci, Pier Paolo; Sapunenko, Vladimir

    2010-01-01

    The LCG File Catalog is a key component of the LHC Computing Grid middleware [1], as it contains the mapping between Logical File Names and Physical File Names on the Grid. The Atlas computing model foresees multiple local LFCs housed in each Tier-1 and Tier-0, containing all information about files stored in the regional cloud. As the local LFC contents are presently not replicated anywhere, this turns out to be a dangerous single point of failure for all of the Atlas regional clouds. In order to solve this problem we propose a novel solution for high availability (HA) of Oracle-based Grid services, obtained by composing an Oracle Data Guard deployment and a series of application-level scripts. This approach has the advantage of being very easy to deploy and maintain, and represents a good candidate solution for all Tier-2s, which are usually small centres with little manpower dedicated to service operations. We also present the results of a wide range of functionality and performance tests run on a test-bed having characteristics similar to the ones required for production. The test-bed consists of a failover deployment between the Italian LHC Tier-1 (INFN - CNAF) and an Atlas Tier-2 located at INFN - Roma1. Moreover, we explain how the proposed strategy can be deployed on the present Grid infrastructure, without requiring any change to the middleware and in a way that is totally transparent to end users and applications.

  2. Techno-economic studies on transportable moving-bed onion irradiator

    International Nuclear Information System (INIS)

    Krishnamurthy, K.; Sharma, K.S.S.; Deshmukh, V.P.; Bongirwar, D.R.; Nair, K.V.V.; Patil, K.B.

    1984-01-01

    The paper presents the optimisation studies and the design features of a transportable irradiator evolved to demonstrate the techno-economic advantage of the irradiation process at village level. A brief outline is also given of the computer programme generated and employed to optimise the source-target configuration based on a narrow plane source moving-bed irradiation concept that aimed at achieving a simplified product handling system and cost effective design of the biological shield and controls for the irradiator. The engineering features of the irradiator along with a summary of the analysis of the economics of the application of the process are also given. (author)

  3. Teaching, Learning, and Collaborating in the Cloud: Applications of Cloud Computing for Educators in Post-Secondary Institutions

    Science.gov (United States)

    Aaron, Lynn S.; Roche, Catherine M.

    2012-01-01

    "Cloud computing" refers to the use of computing resources on the Internet instead of on individual personal computers. The field is expanding and has significant potential value for educators. This is discussed with a focus on four main functions: file storage, file synchronization, document creation, and collaboration--each of which has…

  4. Publication and Retrieval of Computational Chemical-Physical Data Via the Semantic Web. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Ostlund, Neil [Chemical Semantics, Inc., Gainesville, FL (United States)

    2017-07-20

    This research showed the feasibility of applying the concepts of the Semantic Web to Computational Chemistry. We have created the first web portal (www.chemsem.com) that allows data created in quantum chemistry and other such chemistry calculations to be placed on the web in a way that makes the data accessible to scientists in a semantic form never before possible. The semantic web nature of the portal allows data to be searched, found, and used as an advance over the usual approach of a relational database. The semantic data on our portal has the nature of a Giant Global Graph (GGG) that can be easily merged with related data and searched globally via the SPARQL Protocol and RDF Query Language (SPARQL), which makes global searches for data easier than with traditional methods. Our Semantic Web Portal requires that the data be understood by a computer and hence defined by an ontology (vocabulary). This ontology is used by the computer in understanding the data. We have created such an ontology for computational chemistry (purl.org/gc) that encapsulates a broad knowledge of the field of computational chemistry. We refer to this ontology as the Gainesville Core. While it is perhaps the first ontology for computational chemistry and is used by our portal, it is only a start of what must be a long multi-partner effort to define computational chemistry. In conjunction with the above efforts we have defined a new potential file standard (Common Standard for eXchange, CSX, for computational chemistry data). This CSX file is the precursor of data in the Resource Description Framework (RDF) form that the semantic web requires. Our portal translates CSX files (as well as other computational chemistry data files) into RDF files that are part of the graph database that the semantic web employs. We propose a CSX file as a convenient way to encapsulate computational chemistry data.

  5. 11 CFR 100.19 - File, filed or filing (2 U.S.C. 434(a)).

    Science.gov (United States)

    2010-01-01

    ... a facsimile machine or by electronic mail if the reporting entity is not required to file..., including electronic reporting entities, may use the Commission's website's on-line program to file 48-hour... the reporting entity is not required to file electronically in accordance with 11 CFR 104.18. [67 FR...

  6. Storage of sparse files using parallel log-structured file system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-11-07

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
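
    As a minimal sketch of the indexing idea described above (an illustration, not the patented implementation), the following Python fragment packs the data portions of a sparse file into a contiguous log and records a (logical offset, physical offset, length) index entry for each portion; the holes are restored on read by zero-filling between the indexed portions.

        from dataclasses import dataclass
        from typing import List, Tuple

        @dataclass
        class IndexEntry:
            logical_offset: int   # where the data portion sits in the sparse file
            physical_offset: int  # where it sits in the packed log
            length: int

        def pack(portions: List[Tuple[int, bytes]]):
            """portions: list of (logical_offset, data). Returns (log, index)."""
            log, index = bytearray(), []
            for logical, data in sorted(portions):
                index.append(IndexEntry(logical, len(log), len(data)))
                log.extend(data)
            return bytes(log), index

        def restore(log: bytes, index: List[IndexEntry], total_size: int) -> bytes:
            """Rebuild the sparse file, zero-filling the holes between portions."""
            out = bytearray(total_size)
            for e in index:
                out[e.logical_offset:e.logical_offset + e.length] = \
                    log[e.physical_offset:e.physical_offset + e.length]
            return bytes(out)

        log, idx = pack([(0, b"header"), (4096, b"payload")])
        sparse = restore(log, idx, total_size=8192)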

  7. 49 CFR 564.5 - Information filing; agency processing of filings.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 6 2010-10-01 2010-10-01 false Information filing; agency processing of filings... HIGHWAY TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION REPLACEABLE LIGHT SOURCE INFORMATION (Eff. until 12-01-12) § 564.5 Information filing; agency processing of filings. (a) Each manufacturer...

  8. Benchmarking and monitoring framework for interconnected file synchronization and sharing services

    DEFF Research Database (Denmark)

    Mrówczyński, Piotr; Mościcki, Jakub T.; Lamanna, Massimo

    2018-01-01

    …computing and storage infrastructure in the research labs. In this work we present a benchmarking and monitoring framework for file synchronization and sharing services. It allows service providers to monitor the operational status of their services, understand the service behavior under different load types and with different network locations of the synchronization clients. The framework is designed as a monitoring and benchmarking tool to provide performance and robustness metrics for interconnected file synchronization and sharing services such as Open Cloud Mesh…

  9. Translator program converts computer printout into braille language

    Science.gov (United States)

    Powell, R. A.

    1967-01-01

    Computer program converts print image tape files into six dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.

  10. Optimized distributed systems achieve significant performance improvement on sorted merging of massive VCF files.

    Science.gov (United States)

    Sun, Xiaobo; Gao, Jingjing; Jin, Peng; Eng, Celeste; Burchard, Esteban G; Beaty, Terri H; Ruczinski, Ingo; Mathias, Rasika A; Barnes, Kathleen; Wang, Fusheng; Qin, Zhaohui S

    2018-06-01

    Sorted merging of genomic data is a common data operation necessary in many sequencing-based studies. It involves sorting and merging genomic data from different subjects by their genomic locations. In particular, merging a large number of variant call format (VCF) files is frequently required in large-scale whole-genome sequencing or whole-exome sequencing projects. Traditional single-machine based methods become increasingly inefficient when processing large numbers of files due to the excessive computation time and Input/Output bottleneck. Distributed systems and more recent cloud-based systems offer an attractive solution. However, carefully designed and optimized workflow patterns and execution plans (schemas) are required to take full advantage of the increased computing power while overcoming bottlenecks to achieve high performance. In this study, we custom-design optimized schemas for three Apache big data platforms, Hadoop (MapReduce), HBase, and Spark, to perform sorted merging of a large number of VCF files. These schemas all adopt the divide-and-conquer strategy to split the merging job into sequential phases/stages consisting of subtasks that are conquered in an ordered, parallel, and bottleneck-free way. In two illustrating examples, we test the performance of our schemas on merging multiple VCF files into either a single TPED or a single VCF file, which are benchmarked with the traditional single/parallel multiway-merge methods, message passing interface (MPI)-based high-performance computing (HPC) implementation, and the popular VCFTools. Our experiments suggest all three schemas either deliver a significant improvement in efficiency or render much better strong and weak scalabilities over traditional methods. Our findings provide generalized scalable schemas for performing sorted merging on genetics and genomics data using these Apache distributed systems.
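
    The distributed schemas themselves are platform-specific and are not reproduced here; as a point of reference, the single-machine multiway merge that they are benchmarked against can be sketched as below, merging position-sorted variant files with a heap (simplified tab-separated records and lexicographic chromosome ordering are assumed, not full VCF parsing).

        import heapq

        def merge_sorted_variant_files(paths):
            """k-way merge of files whose data lines start with 'chrom<TAB>pos<TAB>...',
            each file already sorted by (chrom, pos). Yields merged lines in order."""
            def keyed_lines(path):
                with open(path) as fh:
                    for line in fh:
                        if line.startswith("#"):        # skip header lines
                            continue
                        chrom, pos, _rest = line.split("\t", 2)
                        yield (chrom, int(pos)), line
            # heapq.merge streams the multiway merge without loading whole files into memory
            for _key, line in heapq.merge(*(keyed_lines(p) for p in paths),
                                          key=lambda kv: kv[0]):
                yield line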

  11. Cut-and-Paste file-systems: integrating simulators and file systems

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1995-01-01

    We have implemented an integrated and configurable file system called the Pegasus filesystem (PFS) and a trace-driven file-system simulator called Patsy. Patsy is used for off-line analysis of file-system algorithms; PFS is used for on-line file-system data storage. Algorithms are first analyzed in

  12. Cluster computing software for GATE simulations

    International Nuclear Information System (INIS)

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-01-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values
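
    The paper's macro generator is not reproduced here; purely as a hedged sketch of the splitting pattern it describes, the fragment below writes, for each cluster job, a fully resolved macro (a seed, an output name and an event count substituted into a template) plus one submit line. The template commands, file names and the qsub wrapper script are hypothetical placeholders, not actual GATE or CODINE syntax.

        from pathlib import Path

        MACRO_TEMPLATE = """# hypothetical, fully resolved GATE-style macro for job {job}
        /hypothetical/random/seed {seed}
        /hypothetical/output/filename result_{job}.root
        /hypothetical/run/beamOn {events}
        """

        def split_simulation(n_jobs: int, total_events: int, outdir: str = "jobs") -> None:
            out = Path(outdir)
            out.mkdir(exist_ok=True)
            per_job = total_events // n_jobs
            submit_lines = []
            for job in range(n_jobs):
                macro = out / f"job_{job}.mac"
                macro.write_text(MACRO_TEMPLATE.format(job=job, seed=1000 + job,
                                                       events=per_job))
                submit_lines.append(f"qsub run_gate.sh {macro}\n")   # cluster-specific wrapper
            (out / "submit_all.sh").write_text("".join(submit_lines))

        split_simulation(n_jobs=8, total_events=8_000_000)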

  13. The effect of irradiation on the subcutaneous fatty layer and the perirectal tissue by computed tomography

    International Nuclear Information System (INIS)

    Komatsu, Takashi

    1987-01-01

    Although it has been suggested that the subcutaneous fatty layer is affected by irradiation, the available reports have not yet confirmed this. Meanwhile, it has been reported that intrapelvic fat increases in volume after whole-pelvic irradiation. This paper reports a study of the effect of irradiation on the subcutaneous fatty layer and intrapelvic fat. The subjects studied were 20 cases treated by whole-pelvic irradiation. X-ray CT films were used to measure the subcutaneous fatty layer and the intrapelvic fat. Three slices, at the lower end of the sacro-iliac joint, the upper end of the femoral head and the upper rim of the pubic symphysis, were chosen as the cross-section levels, and the thickness of the subcutaneous fatty layer at 6 points of the body and the presacral space (PS) were measured. The irradiated group was followed by measuring the thickness of the fatty layer before irradiation and at 1 month, 3 or 4 months, 6 or 7 months and 12 months after irradiation. At three of the four points included within the irradiation area, the thickness of the subcutaneous fatty layer tended to increase after irradiation, though it showed increases or decreases at each period. This tendency was more prominent at the lower than the upper slice of the pelvis. The other points, which are outside the irradiation field, showed no significant change, and some of them even showed a tendency to decrease. The fatty layer of the presacral space tended to increase following irradiation, but there was no correlation with the irradiation dose. It is considered that the injury of subcutaneous tissue by irradiation results in disturbance of blood flow, which then accelerates deposition of fat in the irradiated area. (author)

  14. Cut-and-Paste file-systems : integrating simulators and file systems

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    We have implemented an integrated and configurable file system called the PFS and a trace-driven file-system simulator called Patsy. Patsy is used for off-line analysis of file-system algorithms, PFS is used for on-line file-system data storage. Algorithms are first analyzed in Patsy and when we are

  15. Computer stress study of bone with computed tomography

    International Nuclear Information System (INIS)

    Linden, M.J.; Marom, S.A.; Linden, C.N.

    1986-01-01

    A computer processing tool has been developed which, together with a finite element program, determines the stress-deformation pattern in a long bone, utilizing Computed Tomography (CT) data files for the geometry and radiographic density information. The geometry, together with mechanical properties and boundary conditions (loads and displacements), comprises the input of the finite element (FE) computer program. The output of the program is the stresses and deformations in the bone. The processor is capable of developing an accurate three-dimensional finite element model from a scanned human long bone owing to the high pixel resolution of CT and the local mechanical properties determined from the radiographic densities of the scanned bone. The processor, together with the finite element program, serves first as an analysis tool towards improved understanding of bone function and remodelling. In this first stage, actual long bones may be scanned and analyzed under applied loads and displacements determined from existing gait analyses. The stress-deformation patterns thus obtained may be used for studying the biomechanical behavior of particular long bones, such as bones with implants and with osteoporosis. As a second stage, this processor may serve as a diagnostic tool for analyzing the biomechanical response of a specific patient's long bone under applied loading by utilizing a CT data file of the specific bone as an input to the processor with the FE program.
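
    The mapping from CT numbers to local mechanical properties is not spelled out in the abstract; a common approach in the literature, shown here only as a hedged sketch with placeholder calibration constants, is to convert Hounsfield units to apparent density with a linear calibration and then to Young's modulus with a power law, assigning one modulus value per element of the finite element mesh.

        import numpy as np

        def modulus_from_hu(hu, a=0.0, b=0.001, c=6.85, d=1.49):
            """Placeholder calibration: density rho [g/cm^3] = a + b*HU, then
            Young's modulus E [GPa] = c * rho**d. The constants are illustrative
            only and must come from a phantom calibration for real use."""
            rho = np.clip(a + b * np.asarray(hu, dtype=float), 1e-3, None)
            return c * rho ** d

        element_mean_hu = np.array([250.0, 800.0, 1400.0])  # e.g. mean HU per FE element
        print(modulus_from_hu(element_mean_hu))             # one modulus (GPa) per element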

  16. Computed micro-tomographic evaluation of glide path with nickel-titanium rotary PathFile in maxillary first molars curved canals.

    Science.gov (United States)

    Pasqualini, Damiano; Bianchi, Caterina Chiara; Paolino, Davide Salvatore; Mancini, Lucia; Cemenasco, Andrea; Cantatore, Giuseppe; Castellucci, Arnaldo; Berutti, Elio

    2012-03-01

    X-ray computed micro-tomography scanning allows high-resolution 3-dimensional imaging of small objects. In this study, micro-CT scanning was used to compare the ability of manual and mechanical glide path to maintain the original root canal anatomy. Eight extracted upper first permanent molars were scanned at the TOMOLAB station at ELETTRA Synchrotron Light Laboratory in Trieste, Italy, with a microfocus cone-beam geometry system. A total of 2,400 projections on 360° have been acquired at 100 kV and 80 μA, with a focal spot size of 8 μm. Buccal root canals of each specimen (n = 16) were randomly assigned to PathFile (P) or stainless-steel K-file (K) to perform glide path at the full working length. Specimens were then microscanned at the apical level (A) and at the point of the maximum curvature level (C) for post-treatment analyses. Curvatures of root canals were classified as moderate (≤35°) or severe (≥40°). The ratio of diameter ratios (RDRs) and the ratio of cross-sectional areas (RAs) were assessed. For each level of analysis (A and C), 2 balanced 2-way factorial analyses of variance (P < .05) were performed to evaluate the significance of the instrument factor and of canal curvature factor as well as the interactions of the factors both with RDRs and RAs. Specimens in the K group had a mean curvature of 35.4° ± 11.5°; those in the P group had a curvature of 38° ± 9.9°. The instrument factor (P and K) was extremely significant (P < .001) for both the RDR and RA parameters, regardless of the point of analysis. Micro-CT scanning confirmed that NiTi rotary PathFile instruments preserve the original canal anatomy and cause less canal aberrations. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  17. Method for computed tomography

    International Nuclear Information System (INIS)

    Wagner, W.

    1980-01-01

    In transversal computed tomography apparatus, in which the positioning zone in which the patient can be positioned is larger than the scanning zone in which a body slice can be scanned, reconstruction errors are liable to occur. These errors are caused by incomplete irradiation of the body during the examination. They become manifest not only as an incorrect image of the area not irradiated, but also have an adverse effect on the image of the other, completely irradiated areas. The invention enables these errors to be reduced.

  18. The effects of different source arrangement on the irradiation efficacy

    International Nuclear Information System (INIS)

    Liu Hongyue; Shi Peixin; Lin Yin

    1999-01-01

    The effects of 8 different arrangements of 16 pencil sources on irradiation productivity were studied using a self-designed computer program. The results showed that a decentralized arrangement gave higher irradiation productivity than a centralized one in a static and uniform field.
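
    The program itself is not described in detail in the abstract; as a toy illustration only of how two arrangements can be compared numerically, the sketch below approximates each pencil source by a point source and sums inverse-square contributions over a product grid, ignoring attenuation, scatter and the extended source geometry that a real optimisation would include.

        import numpy as np

        def relative_dose_field(source_xy, grid_xy, height=0.3):
            """Relative dose rate on a horizontal grid from point-like sources placed
            'height' metres above it; inverse-square law only (no attenuation)."""
            sx, sy = np.asarray(source_xy, dtype=float).T
            gx, gy = np.asarray(grid_xy, dtype=float).T
            r2 = (gx[:, None] - sx) ** 2 + (gy[:, None] - sy) ** 2 + height ** 2
            return (1.0 / r2).sum(axis=1)

        # 16 sources: a centralized 4x4 block versus a spread-out 4x4 arrangement (metres)
        centralized = [(x, y) for x in np.linspace(-0.1, 0.1, 4) for y in np.linspace(-0.1, 0.1, 4)]
        spread = [(x, y) for x in np.linspace(-0.5, 0.5, 4) for y in np.linspace(-0.5, 0.5, 4)]
        grid = [(x, y) for x in np.linspace(-0.6, 0.6, 25) for y in np.linspace(-0.6, 0.6, 25)]

        for name, sources in (("centralized", centralized), ("spread", spread)):
            dose = relative_dose_field(sources, grid)
            print(name, "dose uniformity (min/max):", round(float(dose.min() / dose.max()), 3))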

  19. Cloud Computing: Architecture and Services

    OpenAIRE

    Ms. Ravneet Kaur

    2018-01-01

    Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like the electricity grid. It is a method for delivering information technology (IT) services where resources are retrieved from the Internet through web-based tools and applications, as opposed to a direct connection to a server. Rather than keeping files on a proprietary hard drive or local storage device, cloud-based storage makes it possib...

  20. EVALUATED NUCLEAR STRUCTURE DATA FILE. A MANUAL FOR PREPARATION OF DATA SETS

    International Nuclear Information System (INIS)

    TULI, J.K.

    2001-01-01

    This manual describes the organization and structure of the Evaluated Nuclear Structure Data File (ENSDF). This computer-based file is maintained by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory for the international Nuclear Structure and Decay Data Network. For every mass number (presently, A ≤ 293), the Evaluated Nuclear Structure Data File (ENSDF) contains evaluated structure information. For masses A ≥ 44, this information is published in the Nuclear Data Sheets; for A < 44, ENSDF is based on compilations published in the journal Nuclear Physics. The information in ENSDF is updated by mass chain or by nuclide with a varying cycle time dependent on the availability of new information

  1. File sharing

    NARCIS (Netherlands)

    van Eijk, N.

    2011-01-01

    'File sharing' has become generally accepted on the Internet. Users share files for downloading music, films, games, software etc. In this note, we have a closer look at the definition of file sharing, the legal and policy-based context as well as enforcement issues. The economic and cultural

  2. Modeling plastic deformation of post-irradiated copper micro-pillars

    Energy Technology Data Exchange (ETDEWEB)

    Crosby, Tamer, E-mail: tcrosby@ucla.edu; Po, Giacomo, E-mail: gpo@ucla.edu; Ghoniem, Nasr M., E-mail: ghoniem@ucla.edu

    2014-12-15

    We present here an application of a fundamentally new theoretical framework for description of the simultaneous evolution of radiation damage and plasticity that can describe both in situ and ex situ deformation of structural materials [1]. The theory is based on the variational principle of maximum entropy production rate; with constraints on dislocation climb motion that are imposed by point defect fluxes as a result of irradiation. The developed theory is implemented in a new computational code that facilitates the simulation of irradiated and unirradiated materials alike in a consistent fashion [2]. Discrete Dislocation Dynamics (DDD) computer simulations are presented here for irradiated fcc metals that address the phenomenon of dislocation channel formation in post-irradiated copper. The focus of the simulations is on the role of micro-pillar boundaries and the statistics of dislocation pinning by stacking-fault tetrahedra (SFTs) on the onset of dislocation channel and incipient surface crack formation. The simulations show that the spatial heterogeneity in the distribution of SFTs naturally leads to localized plastic deformation and incipient surface fracture of micro-pillars.

  3. A convertor and user interface to import CAD files into worldtoolkit virtual reality systems

    Science.gov (United States)

    Wang, Peter Hor-Ching

    1996-01-01

    Virtual Reality (VR) is a rapidly developing human-to-computer interface technology. VR can be considered as a three-dimensional computer-generated Virtual World (VW) which can sense particular aspects of a user's behavior, allow the user to manipulate the objects interactively, and render the VW in real time accordingly. The user is totally immersed in the virtual world and feels a sense of presence in that VW. NASA/MSFC Computer Application Virtual Environments (CAVE) has been developing space-related VR applications since 1990. The VR systems in the CAVE lab are based on the VPL RB2 system, which consists of a VPL RB2 control tower, an LX eyephone, an Isotrak Polhemus sensor, two Fastrak Polhemus sensors, a Flock of Birds sensor, and two VPL DG2 DataGloves. A dynamics animator called Body Electric from VPL is used as the control system to interface with all the input/output devices and to provide the network communications as well as the VR programming environment. RB2 Swivel 3D is used as the modelling program to construct the VWs. A severe limitation of the VPL VR system is the use of RB2 Swivel 3D, which restricts the files to a maximum of 1020 objects and does not have advanced graphics texture mapping. The other limitation is that the VPL VR system is a turn-key system which does not provide the flexibility for the user to add new sensors or a C language interface. Recently, the NASA/MSFC CAVE lab has provided VR systems built on Sense8 WorldToolKit (WTK), a C library for creating VR development environments. WTK provides device drivers for most of the sensors and eyephones available on the VR market. WTK accepts several CAD file formats, such as the Sense8 Neutral File Format, AutoCAD DXF and 3D Studio file formats, Wavefront OBJ file format, VideoScape GEO file format, Intergraph EMS stereolithographics and CATIA Stereolithographics STL file formats. WTK functions are object-oriented in their naming convention, are grouped into classes, and provide easy C

  4. Digital Stratigraphy: Contextual Analysis of File System Traces in Forensic Science.

    Science.gov (United States)

    Casey, Eoghan

    2017-12-28

    This work introduces novel methods for conducting forensic analysis of file allocation traces, collectively called digital stratigraphy. These in-depth forensic analysis methods can provide insight into the origin, composition, distribution, and time frame of strata within storage media. Using case examples and empirical studies, this paper illuminates the successes, challenges, and limitations of digital stratigraphy. This study also shows how understanding file allocation methods can provide insight into concealment activities and how real-world computer usage can complicate digital stratigraphy. Furthermore, this work explains how forensic analysts have misinterpreted traces of normal file system behavior as indications of concealment activities. This work raises awareness of the value of taking the overall context into account when analyzing file system traces. This work calls for further research in this area and for forensic tools to provide necessary information for such contextual analysis, such as highlighting mass deletion, mass copying, and potential backdating. © 2017 American Academy of Forensic Sciences.

  5. Dose controlled low energy electron irradiator for biomolecular films.

    Science.gov (United States)

    Kumar, S V K; Tare, Satej T; Upalekar, Yogesh V; Tsering, Thupten

    2016-03-01

    We have developed a multi target, Low Energy Electron (LEE), precise dose controlled irradiator for biomolecular films. Up to seven samples can be irradiated one after another at any preset electron energy and dose under UHV conditions without venting the chamber. In addition, one more sample goes through all the steps except irradiation, which can be used as control for comparison with the irradiated samples. All the samples are protected against stray electron irradiation by biasing them at -20 V during the entire period, except during irradiation. Ethernet based communication electronics hardware, LEE beam control electronics and computer interface were developed in house. The user Graphical User Interface to control the irradiation and dose measurement was developed using National Instruments Lab Windows CVI. The working and reliability of the dose controlled irradiator has been fully tested over the electron energy range of 0.5 to 500 eV by studying LEE induced single strand breaks to ΦX174 RF1 dsDNA.

  6. Dose controlled low energy electron irradiator for biomolecular films

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, S. V. K., E-mail: svkk@tifr.res.in; Tare, Satej T.; Upalekar, Yogesh V.; Tsering, Thupten [Tata Institute of Fundamental Research, Homi Bhabha Road, Colaba, Mumbai 400 005 (India)

    2016-03-15

    We have developed a multi target, Low Energy Electron (LEE), precise dose controlled irradiator for biomolecular films. Up to seven samples can be irradiated one after another at any preset electron energy and dose under UHV conditions without venting the chamber. In addition, one more sample goes through all the steps except irradiation, which can be used as control for comparison with the irradiated samples. All the samples are protected against stray electron irradiation by biasing them at −20 V during the entire period, except during irradiation. Ethernet based communication electronics hardware, LEE beam control electronics and computer interface were developed in house. The user Graphical User Interface to control the irradiation and dose measurement was developed using National Instruments Lab Windows CVI. The working and reliability of the dose controlled irradiator has been fully tested over the electron energy range of 0.5 to 500 eV by studying LEE induced single strand breaks to ΦX174 RF1 dsDNA.

  7. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    Science.gov (United States)

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
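
    As a minimal sketch (assuming the standard OMEX layout of a ZIP container carrying a manifest.xml that lists each entry with a format identifier), the following Python fragment packages a single model file into a COMBINE-archive-like container; the exact manifest schema and format URIs should be taken from the OMEX specification rather than from this example.

        import zipfile

        MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
        <omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
          <content location="." format="http://identifiers.org/combine.specifications/omex"/>
          <content location="./manifest.xml"
                   format="http://identifiers.org/combine.specifications/omex-manifest"/>
          <content location="./model.xml"
                   format="http://identifiers.org/combine.specifications/sbml"/>
        </omexManifest>
        """

        def write_archive(archive_path="experiment.omex", model_path="model.xml"):
            # An OMEX file is a ZIP container holding the manifest plus the model file(s).
            with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as z:
                z.write(model_path, "model.xml")        # model_path must exist on disk
                z.writestr("manifest.xml", MANIFEST)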

  8. Temperature increases on the external root surface during endodontic treatment using single file systems.

    Science.gov (United States)

    Özkocak, I; Taşkan, M M; Göktürk, H; Aytac, F; Karaarslan, E Şirin

    2015-01-01

    The aim of this study is to evaluate increases in temperature on the external root surface during endodontic treatment with different rotary systems. Fifty human mandibular incisors with a single root canal were selected. All root canals were instrumented using a size 20 Hedstrom file, and the canals were irrigated with 5% sodium hypochlorite solution. The samples were randomly divided into the following three groups of 15 teeth: Group 1: The OneShape Endodontic File no.: 25; Group 2: The Reciproc Endodontic File no.: 25; Group 3: The WaveOne Endodontic File no.: 25. During the preparation, the temperature changes were measured in the middle third of the roots using a noncontact infrared thermometer. The temperature data were transferred from the thermometer to the computer and were observed graphically. Statistical analysis was performed using the Kruskal-Wallis analysis of variance at a significance level of 0.05. The increases in temperature caused by the OneShape file system were lower than those of the other files (P < .05). The […] file showed the highest temperature increases. However, there were no significant differences between the Reciproc and WaveOne files. The single file rotary systems used in this study may be recommended for clinical use.

  9. Calculations on neutron irradiation damage in reactor materials

    International Nuclear Information System (INIS)

    Sone, Kazuho; Shiraishi, Kensuke

    1976-01-01

    Neutron irradiation damage calculations were made for Mo, Nb, V, Fe, Ni and Cr. Firstly, damage functions were calculated as a function of neutron energy using the neutron cross sections for elastic and inelastic scattering and for the (n,2n) and (n,γ) reactions filed in ENDF/B-III. Secondly, displacement damage expressed in displacements per atom (DPA) was estimated for neutron environments such as a fission spectrum, a thermal neutron reactor (JMTR), a fast breeder reactor (MONJU) and two fusion reactors (the Conceptual Design of Fusion Reactor in JAERI and the ORNL Benchmark). Then, the damage cross section in units of dpa·barn was defined as a factor to convert a given neutron fluence to the DPA value, and was calculated for the materials in the above neutron environments. Finally, production rates of helium and hydrogen atoms were calculated with the (n,α) and (n,p) cross sections in ENDF/B-III for the materials irradiated in the above reactors. (auth.)
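
    With the damage cross section expressed in dpa·barn as a fluence-to-DPA conversion factor, the displacement dose follows from a single multiplication once the barn is converted to cm²; the sketch below uses illustrative numbers, not values from the report.

        BARN_TO_CM2 = 1.0e-24   # cm^2 per barn

        def dpa_from_fluence(damage_xs_dpa_barn: float, fluence_n_per_cm2: float) -> float:
            """DPA = (spectrum-averaged damage cross section [dpa*barn])
                     x (neutron fluence [n/cm^2]) x (1e-24 cm^2/barn)."""
            return damage_xs_dpa_barn * BARN_TO_CM2 * fluence_n_per_cm2

        # Illustrative numbers only: ~1500 dpa*barn and a fluence of 1e22 n/cm^2
        print(dpa_from_fluence(1500.0, 1.0e22))   # -> 15.0 displacements per atom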

  10. Application of personal computer to development of entrance management system for radiating facilities

    International Nuclear Information System (INIS)

    Suzuki, Shogo; Hirai, Shouji

    1989-01-01

    The report describes a system for managing the entrance and exit of personnel to radiating facilities. A personal computer is applied to its development. Major features of the system are outlined first. The computer is connected to the gate and to two magnetic card readers provided at the gate. The gate, which is installed at the entrance to a room under control, opens only for those who have a valid card. The entrance-exit management program developed is described next. The following three files are used: an ID master file (a random file of the magnetic card number, name, qualification, etc., of each card carrier), an entrance-exit management file (a random file of the time of entrance/exit, etc., updated every day), and an entrance-exit record file (a sequential file of card number, name, date, etc.), which are stored on floppy disks. A display is provided to show various lists, including a list of workers currently in the room and a list of workers who left the room at earlier times of the day. This system is useful for entrance management of a relatively small facility. Despite its low cost, it enables effective personnel management with only a few operators. (N.K.)
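
    The original program's record layouts are not given in the abstract; purely as a hedged sketch of the three files it describes, the fragment below keeps an ID master lookup keyed by card number, an in-room table standing in for the entrance-exit management file, and an append-only entrance-exit record log in CSV form. All field names are hypothetical.

        import csv
        import datetime

        id_master = {"0001": {"name": "Worker A", "qualified": True}}   # card no. -> holder data
        in_room = {}                                                    # card no. -> time of entry

        def gate_event(card_no: str, log_path: str = "entrance_exit_record.csv") -> bool:
            """Open the gate only for valid cards; append every entry/exit to the record file."""
            holder = id_master.get(card_no)
            if holder is None or not holder["qualified"]:
                return False                      # invalid card: gate stays closed
            now = datetime.datetime.now().isoformat(timespec="seconds")
            if card_no in in_room:                # card already inside: this swipe is an exit
                event = "exit"
                in_room.pop(card_no)
            else:
                event = "entry"
                in_room[card_no] = now
            with open(log_path, "a", newline="") as fh:
                csv.writer(fh).writerow([card_no, holder["name"], event, now])
            return True                           # gate opens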

  11. Clear-sky irradiance simulation using GMAO products and its comparison to ground and CERES satellite observation

    Science.gov (United States)

    Ham, S. H.; Loeb, N. G.; Kato, S.; Rose, F. G.; Bosilovich, M. G.; Rutan, D. A.; Huang, X.; Collow, A.

    2017-12-01

    Global Modeling and Assimilation Office (GMAO) GEOS assimilated datasets are used to describe temperature and humidity profiles in the Clouds and the Earth's Radiant Energy System (CERES) data processing. Given that advanced versions of the assimilated datasets, known as Forward Processing (FP), FP Parallel (FPP), and Modern-Era Retrospective analysis for Research and Applications version 2 (MERRA-2), are available, we examine clear-sky irradiance calculations to see if accuracy is improved with these newer versions of the GMAO datasets when their temperature and humidity profiles are used in computing irradiances. Two older versions, GEOS-5.2.0 and GEOS-5.4.1, are used for producing, respectively, the Ed3 and Ed4 CERES data products. For the evaluation, CERES-derived TOA irradiances and observed ground-based surface irradiances are compared with the computed irradiances for clear skies identified by the Moderate Resolution Imaging Spectroradiometer (MODIS). Surface-type-dependent spectral emissivity is taken from an observationally based monthly gridded emissivity dataset. TOA longwave (LW) irradiances computed with GEOS-5.2.0 temperature and humidity profiles are biased low, up to -5 Wm-2, compared to CERES-derived TOA longwave irradiance over tropical oceans. In contrast, computed longwave irradiances agree well with CERES observations, with biases less than 2 W m-2, when GEOS-5.4.1, FP v5.13, or MERRA-2 temperature and humidity are used. The negative biases of the TOA LW irradiance computed with GEOS-5.2.0 appear to be related to a wet bias in the 500-850 hPa layer. This indicates that if the input of the CERES algorithm switches from GEOS-5.2.0 to FP v5.13 or MERRA-2, the bias in clear-sky longwave TOA fluxes over tropical oceans is expected to be smaller. At the surface, downward LW irradiances computed with FP v5.13 and MERRA-2 are biased low, up to -10 Wm-2, compared to ground observations over tropical oceans. The magnitude of the bias in the longwave surface irradiances

  12. Dose optimization in computed tomography: ICRP 87

    International Nuclear Information System (INIS)

    2007-01-01

    The doses delivered in computed tomography scans are studied, with the aim of calibrating irradiation limits for patients who need these examinations. Furthermore, physicians and radiologists should ensure the appropriate use of computed tomography so that people are not irradiated unnecessarily, reducing doses and avoiding unnecessary examinations. A critical evaluation by an ethics committee is suggested for cases where the examination is performed for medical research without justification.

  13. Recent Advancements in the Numerical Simulation of Surface Irradiance for Solar Energy Applications: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Yu; Sengupta, Manajit; Deline, Chris

    2017-06-27

    This paper briefly reviews the National Renewable Energy Laboratory's recent efforts on developing all-sky solar irradiance models for solar energy applications. The Fast All-sky Radiation Model for Solar applications (FARMS) utilizes the simulation of clear-sky transmittance and reflectance and a parameterization of cloud transmittance and reflectance to rapidly compute broadband irradiances on horizontal surfaces. FARMS delivers accuracy that is comparable to the two-stream approximation, but it is approximately 1,000 times faster. A FARMS-Narrowband Irradiance over Tilted surfaces (FARMS-NIT) has been developed to compute spectral irradiances on photovoltaic (PV) panels in 2002 wavelength bands. Further, FARMS-NIT has been extended for bifacial PV panels.

  14. How users organize electronic files on their workstations in the office environment: a preliminary study of personal information organization behaviour

    Directory of Open Access Journals (Sweden)

    Christopher S.G. Khoo

    2007-01-01

    Full Text Available This paper reports an ongoing study of how people organize their computer files and folders on the hard disks of their office workstations. A questionnaire was used to collect information on the subjects, their work responsibilities, and the characteristics of their workstations. Data on file and folder names and folder structure were extracted from the hard disks using the program STG FolderPrint Plus, DOS commands, and screen captures. A semi-structured interview collected information on the subjects' strategies for naming and organizing files and folders and for locating and retrieving files. The data were analysed mainly through qualitative analysis and content analysis. The subjects organized their folders in a variety of structures, from broad and shallow to narrow and deep hierarchies; one to three levels of folders was common. Labels for first-level folders tended to be task-based or project-based. Most subjects located files by browsing the folder structure, with searching used as a last resort. The most common types of folder names were document type, organizational function or structure, and miscellaneous or temporary. The frequency of folders of different types appears to be related to the type of occupation.
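
    Structural measures of the kind reported here (number of folders per hierarchy level, maximum depth) can be collected programmatically; a minimal sketch in Python, assuming read access to the user's home directory, might look like this.

        import os
        from collections import Counter

        def folder_depth_profile(root):
            """Count subfolders found at each depth below `root`."""
            root = os.path.abspath(root)
            depths = Counter()
            for dirpath, dirnames, _filenames in os.walk(root):
                rel = os.path.relpath(dirpath, root)
                depth = 0 if rel == "." else rel.count(os.sep) + 1
                depths[depth] += len(dirnames)   # children of the current folder
            return depths

        profile = folder_depth_profile(os.path.expanduser("~"))
        for depth in sorted(profile):
            print(f"level {depth + 1}: {profile[depth]} folders")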

  15. Defining nuclear medical file format based on DICOM standard

    International Nuclear Information System (INIS)

    He Bin; Jin Yongjie; Li Yulan

    2001-01-01

    With the wide application of computer technology in the medical field, DICOM is becoming the standard for digital imaging and communication. The authors discuss how to define a medical imaging file format based on the DICOM standard. The file format of the ANMIS system defined by the authors is also introduced, along with the validity and integrity of this format
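
    For readers unfamiliar with DICOM files, the hedged sketch below shows how a few header elements of an existing DICOM file can be inspected with the third-party pydicom package; the file name is hypothetical, and this is not the ANMIS format described by the authors.

        import pydicom  # third-party package: pip install pydicom

        # Hypothetical file name; any valid DICOM part-10 file will do.
        ds = pydicom.dcmread("image0001.dcm")

        # A DICOM data set is a collection of (group, element) tagged attributes.
        print("Modality:       ", ds.Modality)
        print("Patient ID:     ", ds.PatientID)
        print("Rows x Columns: ", ds.Rows, "x", ds.Columns)
        print("Transfer syntax:", ds.file_meta.TransferSyntaxUID)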

  16. PRO/Mapper: a plotting program for the DEC PRO/300 personal computers utilizing the MAPPER graphics language

    International Nuclear Information System (INIS)

    Wachter, J.W.

    1986-05-01

    PRO/Mapper is an application for the Digital Equipment Corporation PRO/300 series of personal computers that facilitates the preparation of visuals such as graphs, charts, and maps in color or black and white. The user prepares an input data file containing English-language commands, using a standard text editor. PRO/Mapper then reads these files and draws graphs, maps, boxes, and complex line segments on the computer screen. Axes, curves, and error bars may be plotted in graphical presentations. The commands of PRO/Mapper are a subset of the commands of the more sophisticated MAPPER program written for mainframe computers; the PRO/Mapper commands were chosen primarily for the production of linear graphs. Command files written for the PRO/300 are upward compatible with the Martin Marietta Energy Systems version of MAPPER and can be used to produce publication-quality slides, drawings, and maps on the various output devices of the Oak Ridge National Laboratory mainframe computers

  17. Renewal-anomalous-heterogeneous files

    International Nuclear Information System (INIS)

    Flomenbom, Ophir

    2010-01-01

    Renewal-anomalous-heterogeneous files are solved. A simple file is made of Brownian hard spheres that diffuse stochastically in an effective 1D channel. Generally, Brownian files are heterogeneous: the spheres' diffusion coefficients are distributed and the initial density of spheres is non-uniform. In renewal-anomalous files, the distribution of waiting times for individual jumps is not exponential as in Brownian files, but instead obeys ψ_α(t) ∼ t^(-1-α), 0 < α < 1. The mean square displacement (MSD) of a tagged particle, <r²>, obeys <r²> ∼ (<r²>_nrml)^α, where <r²>_nrml is the MSD in the corresponding Brownian file. This scaling is an outcome of an exact relation (derived here) connecting the probability density functions of Brownian files and renewal-anomalous files. It is also shown that non-renewal-anomalous files are slower than the corresponding renewal ones.
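
    A crude numerical illustration of the single-file constraint (spheres cannot pass one another) is sketched below for an ordinary Brownian file; the renewal-anomalous waiting-time statistics of the paper are not reproduced, and all parameters are arbitrary.

        import random

        def single_file_sweep(x, step=0.05):
            """One Monte Carlo sweep of a 1D file of point particles:
            a trial move is rejected if it would let a particle pass a neighbour."""
            n = len(x)
            for i in random.sample(range(n), n):
                trial = x[i] + random.gauss(0.0, step)
                left = x[i - 1] if i > 0 else float("-inf")
                right = x[i + 1] if i < n - 1 else float("inf")
                if left < trial < right:      # ordering preserved -> accept
                    x[i] = trial

        positions = [0.1 * i for i in range(50)]   # initial, ordered file
        tagged0 = positions[25]
        for _ in range(10000):
            single_file_sweep(positions)
        # Squared displacement of the tagged middle particle (one realization only)
        print("tagged-particle squared displacement:", (positions[25] - tagged0) ** 2)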

  18. Irradiation doses on thyroid gland during the postoperative irradiation for breast cancer.

    Science.gov (United States)

    Akın, Mustafa; Ergen, Arzu; Unal, Aysegul; Bese, Nuran

    2014-01-01

    The thyroid gland is one of the radiosensitive endocrine organs in the body. It has been shown that direct irradiation of the thyroid with total doses of 26 to 30 Gy can lead to functional abnormalities. In this study, the irradiation doses to the thyroid gland of patients who received postoperative chest-wall/breast and regional nodal irradiation were assessed. A retrospective analysis of treatment plans from 122 breast cancer patients treated with 3D conformal radiotherapy (3D CRT) planning was performed. All patients received irradiation to the supraclavicular/level III lymph nodes in addition to the chest wall/breast. A total dose of 46 Gy was delivered in 25 days to the supraclavicular/level III lymph node region, while a total dose of 50 Gy was delivered to the whole breast/chest wall. The thyroid gland was contoured on computed tomography scans of 2-5 mm slice thickness. Absolute thyroid volumes and mean thyroid doses were calculated. The mean thyroid volume of all patients was 16.7 cc (min: 1.9 cc, max: 41.6 cc). The mean thyroid dose was 22.5 Gy (0.32 Gy-46.5 Gy) and exceeded 26 Gy in 44% of the patients. In the majority of node-positive breast cancer patients treated with 3D CRT, the thyroid gland was exposed to considerable doses; the 44% of patients with mean doses above 26 Gy are at risk of developing thyroid function abnormalities, which should be considered during routine follow-up.
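
    Mean organ dose and the fraction of patients above a threshold can be computed directly from exported dose grids; the sketch below uses a hypothetical 3D dose array and a boolean thyroid mask, and is not the planning-system method used in the study.

        import numpy as np

        def mean_structure_dose(dose_gy, structure_mask):
            """Mean dose (Gy) over the voxels of a contoured structure."""
            return float(dose_gy[structure_mask].mean())

        # Hypothetical data: a 3D dose grid and a thyroid mask of the same shape.
        rng = np.random.default_rng(0)
        dose = rng.uniform(0.0, 50.0, size=(40, 64, 64))
        thyroid = np.zeros_like(dose, dtype=bool)
        thyroid[18:22, 30:34, 30:34] = True

        mean_dose = mean_structure_dose(dose, thyroid)
        print(f"mean thyroid dose: {mean_dose:.1f} Gy, above 26 Gy: {mean_dose > 26}")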

  19. Ultrasonic and computed tomography in radiotherapy planning - a comparison

    International Nuclear Information System (INIS)

    Schertel, L.

    1980-01-01

    The precondition of any radiotherapy is radiation planning. This must be done individually for every patient and must be applicable to any region of the body. Modern irradiation planning requires images of the body regions concerned; these can be produced by the ultrasonic method or by computed tomography. This comparative investigation leads to the result (see figs. 4 and 5) that computed tomographic images should be preferred to those obtained sonographically. The opinion of Huenig and co-workers [8] that ultrasonic tomography would soon lose some of its importance within irradiation planning once computed tomography was introduced could be confirmed by the latest developments. The authors can confirm this also from their own experience and agree with Winkel and Hermann [23] that irradiation planning can no longer be done without computed tomography. (orig.)

  20. Needs of food irradiation and its commercialization

    International Nuclear Information System (INIS)

    Welt, M.A.

    1984-01-01

    On July 5, 1983, the United States Food and Drug Administration filed a notice in the Federal Register approving the use of Cobalt 60 or Cesium 137 gamma radiation to reduce or control microbial contamination in spices, onion powder and garlic powder. The approval was the first in nineteen years issued by the FDA and appears to set the stage for increased regulatory approvals in the area of radiation preservation of foods. On July 8, 1983, the Codex Alimentarius Commission approved an international standard for irradiated foods. The standard had previously been introduced by experts in the World Health Organization and the Food and Agriculture Organization. The ability of properly applied ionizing energy to inhibit sprouting, to eliminate the need for toxic chemical fumigants for insect disinfestation purposes, to extend refrigerated shelf life of many food products, to eliminate parasites and pathogens from our food chain and to preserve precooked packaged food products for indefinite storage without freezing or refrigeration, dictates the timeliness of food irradiation technology. (author)

  1. NASA ARCH- A FILE ARCHIVAL SYSTEM FOR THE DEC VAX

    Science.gov (United States)

    Scott, P. J.

    1994-01-01

    The function of the NASA ARCH system is to provide a permanent storage area for files that are infrequently accessed. The NASA ARCH routines were designed to provide a simple mechanism by which users can easily store and retrieve files. The user treats NASA ARCH as the interface to a black box where files are stored. There are only five NASA ARCH user commands, even though NASA ARCH employs standard VMS directives and the VAX BACKUP utility. Special care is taken to provide the security needed to ensure file integrity over a period of years. The archived files may exist in any of three storage areas: a temporary buffer, the main buffer, and a magnetic tape library. When the main buffer fills up, it is transferred to permanent magnetic tape storage and deleted from disk. Files may be restored from any of the three storage areas. A single file, multiple files, or entire directories can be stored and retrieved. Archived entities retain the same name, extension, version number, and VMS file protection scheme as they had in the user's account prior to archival. NASA ARCH is capable of handling up to 7 directory levels. Wildcards are supported. User commands include TEMPCOPY, DISKCOPY, DELETE, RESTORE, and DIRECTORY. The DIRECTORY command searches a directory of savesets covering all three archival areas, listing matches according to area, date, filename, or other criteria supplied by the user. The system manager commands include 1) ARCHIVE - to transfer the main buffer to duplicate magnetic tapes, 2) REPORT - to determine when the main buffer is full enough to archive, 3) INCREMENT - to back up the partially filled main buffer, and 4) FULLBACKUP - to back up the entire main buffer. On-line help files are provided for all NASA ARCH commands. NASA ARCH is written in DEC VAX DCL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.X. This program was developed in 1985.

  2. PC Graphic file programming

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

    This book gives a description of basic graphics knowledge and of the understanding and implementation of graphic file formats. The first part deals with graphic data, the storage and compression of graphic data, and programming topics such as assembly language, the stack, compiling and linking of programs, and practice and debugging. The next part covers graphic file formats such as the MacPaint file, GEM/IMG file, PCX file, GIF file, and TIFF file, hardware considerations such as monochrome and high-speed color screen drivers, the basic concept of dithering, and conversion between formats.
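
    As one concrete example of the file formats listed, the sketch below parses the fixed 13-byte GIF header (signature, version, logical screen width and height) from an existing file; the file name is hypothetical.

        import struct

        def read_gif_header(path):
            """Return (version, width, height) from a GIF file's header."""
            with open(path, "rb") as fh:
                header = fh.read(13)          # signature + logical screen descriptor
            signature, version = header[:3], header[3:6]
            if signature != b"GIF":
                raise ValueError("not a GIF file")
            width, height = struct.unpack("<HH", header[6:10])  # little-endian 16-bit
            return version.decode("ascii"), width, height

        print(read_gif_header("example.gif"))   # e.g. ('89a', 640, 480)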

  3. Lymphocyte development in irradiated thymuses: dynamics of colonization by progenitor cells and regeneration of resident cells

    International Nuclear Information System (INIS)

    Mehr, R.; Fridkis-Hareli, M.; Abel, L.; Segel, L.; Globerson, A.

    1995-01-01

    Lymphocyte development in irradiated thymuses was analyzed using two complementary strategies: an in vitro experimental model and computer simulations. In the in vitro model, fetal thymus lobes were irradiated and the regeneration of cells that survived irradiation was examined, with the results compared to those of reconstitution of the thymus by donor bone marrow cells and their competition with the thymic resident cells. In vitro measurements of resident cell kinetics showed that cell proliferation is slowed down significantly after a relatively low (10 Gy) irradiation dose. Although the number of thymocytes that survived irradiation remained low for several days post-irradiation, further colonization by donor cells was not possible unless performed within 6 h after irradiation. These experimental results, coupled with the analysis by computer simulations, suggest that bone marrow cell engraftment in the irradiated thymus may be limited by the presence of radiation-surviving thymic resident cells and the reduced availability of seeding niches. (Author)

  4. Preoperative irradiation of an extracerebral cavernous hemangioma in the middle fossa. Follow-up study with computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, S; Kurihara, M; Mori, K [Nagasaki Univ. (Japan). School of Medicine; Amamoto, Y

    1981-02-01

    This is a report of a case of an extracerebral cavernous hemangioma in the middle fossa in which total removal was carried out after radiotherapy. A follow-up study with computed tomography during and after irradiation is presented. A 44-year-old housewife complained of decreased vision in both eyes and paresis of the left upper and lower limbs. CT scan revealed a slightly high-density area in the right middle cranial fossa which was markedly enhanced with contrast media. Right carotid angiography demonstrated a large avascular mass in the right middle fossa; no feeding artery or draining vein was visualized except a faint irregular stain in the venous phase. An attempt at total removal of the tumor failed because of extensive hemorrhage from the tumor. Histological examination revealed a cavernous hemangioma. Irradiation with a total dose of 5000 rads was delivered. After irradiation, CT scan revealed a marked decrease in the size and EMI number of the tumor. At this stage, a hypervascular mass lesion with feeding arteries was noted on conventional angiography; tumor stain was also visualized on prolonged-injection angiography. In the second operation, removal of the tumor was performed without any difficulty, and hemorrhage was controlled easily by electrocoagulation. Histology revealed a marked narrowing of vessels with an increase in connective tissue. In the central part of the specimen, findings such as coagulation necrosis and intraluminal thrombus formation were noted, which were attributed to the influence of radiation. It is concluded that in the case of an extracerebral cavernous hemangioma with massive hemorrhage, preoperative irradiation of up to 3000 - 5000 rads is a method of choice.

  5. Amorphous molecular junctions produced by ion irradiation on carbon nanotubes

    International Nuclear Information System (INIS)

    Wang Zhenxia; Yu Liping; Zhang Wei; Ding Yinfeng; Li Yulan; Han Jiaguang; Zhu Zhiyuan; Xu Hongjie; He Guowei; Chen Yi; Hu Gang

    2004-01-01

    Experiments and molecular dynamics simulations have demonstrated that electron irradiation can create molecular junctions between crossed single-wall carbon nanotubes. Recently, molecular dynamics computations predicted that ion irradiation could also join single-walled carbon nanotubes. Employing carbon-ion irradiation of multi-walled carbon nanotubes, we find that these nanotubes evolve into amorphous carbon nanowires and, more importantly, that during this process various molecular junctions between amorphous nanowires are formed by welding of crossed carbon nanotubes. This demonstrates that ion-beam irradiation could be an effective way not only to weld nanotubes but also to form nanowire junctions

  6. Experiences on File Systems: Which is the best file system for you?

    CERN Document Server

    Blomer, J

    2015-01-01

    The distributed file system landscape is scattered. Besides a plethora of research file systems, there is also a large number of production grade file systems with various strengths and weaknesses. The file system, as an abstraction of permanent storage, is appealing because it provides application portability and integration with legacy and third-party applications, including UNIX utilities. On the other hand, the general and simple file system interface makes it notoriously difficult for a distributed file system to perform well under a variety of different workloads. This contribution provides a taxonomy of commonly used distributed file systems and points out areas of research and development that are particularly important for high-energy physics.

  7. Tools for remote computing in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.; Frammery, V.; Wilcke, R.

    1990-01-01

    In modern accelerator control systems, the intelligence of the equipment is distributed in the geographical and the logical sense. Control processes for a large variety of tasks reside in both the equipment and the control computers. Hence successful operation hinges on the availability and reliability of the communication infrastructure. The computers are interconnected by a communication system and use remote procedure calls and message passing for information exchange. These communication mechanisms need a well-defined convention, i.e. a protocol. They also require flexibility in both the setup and changes to the protocol specification. The network compiler is a tool which provides the programmer with a means of establishing such a protocol for his application. Input to the network compiler is a single interface description file provided by the programmer. This file is written according to a grammar, and completely specifies the interprocess communication interfaces. Passed through the network compiler, the interface description file automatically produces the additional source code needed for the protocol. Hence the programmer does not have to be concerned about the details of the communication calls. Any further additions and modifications are made easy, because all the information about the interface is kept in a single file. (orig.)
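
    A toy analogue of such a code generator is sketched below: it reads a simplified, hypothetical interface description and emits Python stub functions for each remote procedure. The real network compiler, its grammar, and its output language are not described here in enough detail to reproduce, so names and syntax are assumptions for illustration only.

        def generate_stubs(interface_text):
            """Emit client stub source from lines of the form: NAME(arg1, arg2, ...)."""
            stubs = []
            for line in interface_text.splitlines():
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                name, args = line.rstrip(")").split("(", 1)
                arglist = ", ".join(a.strip() for a in args.split(",") if a.strip())
                stubs.append(
                    f"def {name}({arglist}):\n"
                    f"    # marshal arguments and perform the remote call\n"
                    f"    return rpc_call({name!r}, [{arglist}])\n"
                )
            return "\n".join(stubs)

        interface = """
        # hypothetical interface description
        set_magnet_current(magnet_id, amperes)
        read_beam_position(monitor_id)
        """
        print(generate_stubs(interface))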

  8. Software For Computer-Security Audits

    Science.gov (United States)

    Arndt, Kate; Lonsford, Emily

    1994-01-01

    Information relevant to potential breaches of security is gathered efficiently. The Automated Auditing Tools for VAX/VMS program includes the following automated software tools, performing the noted tasks: Privileged ID Identification, which identifies users whose privileges could circumvent existing computer security measures; Critical File Protection, which identifies critical files that are not properly protected; Inactive ID Identification, which finds user identifications no longer in use; Password Lifetime Review, which determines the maximum password lifetimes of all identifications; and Password Length Review, which determines the minimum allowed password length of all identifications. Written in DEC VAX DCL language.
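
    The listed checks are specific to VAX/VMS and DCL; a loosely analogous check on a POSIX system, sketched below, flags world-writable files under a directory tree (a stand-in for the Critical File Protection task). The audit target path is hypothetical.

        import os
        import stat

        def world_writable_files(root):
            """Yield paths under `root` whose permission bits allow writing by anyone."""
            for dirpath, _dirnames, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        mode = os.stat(path).st_mode
                    except OSError:
                        continue              # unreadable or vanished file
                    if mode & stat.S_IWOTH:
                        yield path

        for path in world_writable_files("/etc"):     # hypothetical audit target
            print("world-writable:", path)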

  9. Consumer acceptance of irradiated food products: an apple marketing study

    International Nuclear Information System (INIS)

    Terry, D.E.; Tabor, R.L.

    1990-01-01

    This study was exploratory in nature, with emphasis on initial purchases and not repeat purchases or long-term loyalties to either irradiated or non-irradiated produce. The investigation involved the actual sale of irradiated and non-irradiated apples to consumers. Limited information about the process was provided, and apples were sold at roadside stands. Prices for the irradiated apples were varied while the price for the non-irradiated apples was held constant. Of the 228 West-Central Missouri shoppers, 101 (44%) bought no irradiated apples, 86 (38%) bought only irradiated apples, and 41 (18%) bought some of both types. Results of probit regressions indicated three significant independent variables. There was an inverse relationship between the price of irradiated apples and the probability of purchasing irradiated apples. There was a positive relationship between the purchasers' educational level and the probability of purchasing irradiated apples. Predicted probabilities of belonging to the categories in the probit models were computed. Depending on the particular equation specification, approximately 70 percent of purchasers were correctly placed into one of the two categories: bought only non-irradiated apples, or bought some irradiated apples (either both types or only irradiated). This study suggests that consumers may be interested in food irradiation as a possible alternative or supplement to current preservation techniques
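
    A probit regression of this kind can be reproduced on synthetic data with the statsmodels package; the sketch below uses made-up price and education values, not the study's data, so the fitted coefficients are illustrative only.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 228
        price = rng.uniform(0.5, 2.0, n)       # price of irradiated apples, synthetic
        education = rng.integers(8, 21, n)     # years of schooling, synthetic
        # Synthetic purchase decision: more likely at low price and high education
        latent = -1.5 * price + 0.15 * education + rng.normal(0, 1, n)
        bought_irradiated = (latent > 0).astype(int)

        X = sm.add_constant(np.column_stack([price, education]))
        model = sm.Probit(bought_irradiated, X).fit(disp=False)
        print(model.summary())   # expect a negative price and positive education effect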

  10. Preparation of a data bank system for isotope correlation on spent nuclear fuels

    International Nuclear Information System (INIS)

    Nakahara, Yoshinori; Umezawa, Hirokazu

    1981-11-01

    For the purpose of studying isotope correlations in spent nuclear fuels and their applicability to safeguards technology for nuclear material, a data bank system has been prepared on a FACOM M200 computer at JAERI. Spent fuel data on fabrication, irradiation history, the reactor operator's burnup calculations, and reprocessing are stored in four kinds of data files: (1) a fuel assembly data file, (2) a reprocessing batch data file, (3) a plutonium product data file, and (4) a uranium product data file. Corrections for decay and for mixing from adjoining batches are made, and the corrected data are also kept in the files. A wide variety of variables may be derived from the isotopic and other stored data and subjected to optional statistical treatments such as regression analysis and paired comparison. The computer language used for the system was FORTRAN-IV. The system can be operated in a conversational mode with graphic output, so that such statistical analyses may be carried out immediately under various calculation conditions. (author)

  11. Relationship of microstructure and tensile properties for neutron-irradiated vanadium alloys

    International Nuclear Information System (INIS)

    Loomis, B.A.; Smith, D.L.

    1990-01-01

    The microstructures in V-15Cr-5Ti, V-10Cr-5Ti, V-3Ti-1Si, V-15Ti-7.5Cr, and V-20Ti alloys were examined by transmission electron microscopy after neutron irradiation at 600°C to 21-84 atom displacements per atom in the Materials Open Test Assembly of the Fast Flux Test Facility. The microstructures in these irradiated alloys were analyzed to determine the radiation-produced dislocation density, precipitate number density and size, and void number density and size. The results of these analyses were used to compute increases in the yield stress and the swelling of the irradiated alloys. The computed increase in yield stress was compared with the increase in yield stress determined from tensile tests on the irradiated alloys. This comparison made it possible to evaluate the influence of alloy composition on the evolution of radiation-damaged microstructures and the resulting tensile properties. 11 refs
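
    Yield-stress increases are commonly estimated from such microstructural data with a dispersed-barrier hardening relation of the form delta-sigma = M * alpha * mu * b * sqrt(N * d). The sketch below evaluates it for an illustrative, hypothetical defect population, not the measured values of this study; the material constants for vanadium are approximate assumptions.

        import math

        def delta_sigma_mpa(number_density_m3, diameter_m,
                            alpha=0.25, taylor_factor=3.06,
                            shear_modulus_pa=47e9, burgers_m=0.262e-9):
            """Dispersed-barrier hardening estimate (MPa) for one defect population.
            Default mu and b are approximate values for vanadium; alpha is a
            typical barrier strength for small loops or precipitates."""
            return (taylor_factor * alpha * shear_modulus_pa * burgers_m *
                    math.sqrt(number_density_m3 * diameter_m)) / 1e6

        # Hypothetical population: 1e22 m^-3 precipitates, 10 nm mean diameter
        print(f"delta sigma_y ~ {delta_sigma_mpa(1e22, 10e-9):.0f} MPa")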

  12. Open Computer Forensic Architecture a Way to Process Terabytes of Forensic Disk Images

    Science.gov (United States)

    Vermaas, Oscar; Simons, Joep; Meijer, Rob

    This chapter describes the Open Computer Forensics Architecture (OCFA), an automated system that dissects complex file types, extracts metadata from files and ultimately creates indexes on forensic images of seized computers. It consists of a set of collaborating processes, called modules. Each module is specialized in processing a certain file type. When it receives a so-called 'evidence' (the information that has been extracted so far about a file together with the actual data), it either adds new information about the file or uses the file to derive a new 'evidence'. All evidence, original and derived, is sent to a router after being processed by a particular module. The router decides which module should process the evidence next, based upon the metadata associated with the evidence. Thus the OCFA system can recursively process images until, from every compound file, the embedded files, if any, are extracted, all information that the system can derive has been derived, and all extracted text is indexed. Compound files include, but are not limited to, archive and zip files, disk images, text documents of various formats and, for example, mailboxes. The output of an OCFA run is a repository full of derived files, a database containing all extracted information about the files, and an index which can be used when searching. This is presented in a web interface. Moreover, processed data is easily fed to third-party software for further analysis or for use in data-mining or text-mining tools. The main advantages of the OCFA system are its scalability and its ability to process large amounts of data.
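
    The recursive dissect-and-derive idea can be illustrated with a short sketch that walks a directory, records basic metadata for every file, and recurses into zip archives; this is a simplification for illustration, not the OCFA module/router architecture itself, and the input directory name is hypothetical.

        import hashlib
        import os
        import zipfile

        def dissect(path, evidence):
            """Record metadata for `path`; if it is a zip archive, extract and recurse."""
            with open(path, "rb") as fh:
                data = fh.read()
            evidence.append({"path": path, "size": len(data),
                             "sha256": hashlib.sha256(data).hexdigest()})
            if zipfile.is_zipfile(path):
                target = path + "_extracted"
                with zipfile.ZipFile(path) as zf:
                    zf.extractall(target)
                for dirpath, _dirs, files in os.walk(target):
                    for name in files:
                        dissect(os.path.join(dirpath, name), evidence)

        evidence = []
        for dirpath, _dirs, files in os.walk("seized_image"):   # hypothetical directory
            for name in files:
                dissect(os.path.join(dirpath, name), evidence)
        print(f"{len(evidence)} evidence records")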

  13. Calculation simulation of equivalent irradiation swelling for dispersion nuclear fuel

    International Nuclear Information System (INIS)

    Cai Wei; Zhao Yunmei; Gong Xin; Ding Shurong; Huo Yongzhong

    2015-01-01

    The dispersion nuclear fuel was regarded as a kind of special particle composite. Assuming that the fuel particles are periodically distributed in the dispersion nuclear fuel meat, a finite element model to calculate its equivalent irradiation swelling was developed with the method of computational micromechanics. Considering irradiation swelling in the fuel particles and the irradiation hardening effect in the metal matrix, stress update algorithms were established for the fuel particles and the metal matrix, respectively. The corresponding user subroutines were programmed, and the finite element simulation of equivalent irradiation swelling for the fuel meat was performed in Abaqus. The effects of particle size and volume fraction on the equivalent irradiation swelling were investigated, and a fitting formula for the equivalent irradiation swelling was obtained. The results indicate that the main factors influencing the equivalent irradiation swelling of the fuel meat are the irradiation swelling and the volume fraction of the fuel particles. (authors)

  14. Systematic control of large computer programs

    International Nuclear Information System (INIS)

    Goedbloed, J.P.; Klieb, L.

    1986-07-01

    A package of CCL, UPDATE, and FORTRAN procedures is described which facilitates the systematic control and development of large scientific computer programs. The package provides a general tool box for this purpose which contains many conveniences for the systematic administration of files, editing, reformatting of line printer output files, etc. In addition, a small number of procedures is devoted to the problem of structured development of a large computer program which is used by a group of scientists. The essence of the method is contained in three procedures, N, R, and X, for the creation of a new UPDATE program library, its revision, and its execution, respectively, and a procedure REVISE which provides a joint editor-UPDATE session combining the advantages of the two systems, viz. speed and rigor. (Auth.)

  15. Computer system operation

    International Nuclear Information System (INIS)

    Lee, Young Jae; Lee, Hae Cho; Lee, Ho Yeun; Kim, Young Taek; Lee, Sung Kyu; Park, Jeong Suk; Nam, Ji Wha; Kim, Soon Kon; Yang, Sung Un; Sohn, Jae Min; Moon, Soon Sung; Park, Bong Sik; Lee, Byung Heon; Park, Sun Hee; Kim, Jin Hee; Hwang, Hyeoi Sun; Lee, Hee Ja; Hwang, In A.

    1993-12-01

    The report describes the operation and troubleshooting of the main computers and KAERINet. The results of the project are as follows: 1. The operation and troubleshooting of the main computer systems (Cyber 170-875, Cyber 960-31, VAX 6320, VAX 11/780). 2. The operation and troubleshooting of KAERINet (PC-to-host connection, host-to-host connection, file transfer, electronic mail, X.25, CATV, etc.). 3. The development of applications: the Electronic Document Approval and Delivery System, and installation of the ORACLE utility program. 22 tabs., 12 figs. (Author)

  16. Computer system operation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jae; Lee, Hae Cho; Lee, Ho Yeun; Kim, Young Taek; Lee, Sung Kyu; Park, Jeong Suk; Nam, Ji Wha; Kim, Soon Kon; Yang, Sung Un; Sohn, Jae Min; Moon, Soon Sung; Park, Bong Sik; Lee, Byung Heon; Park, Sun Hee; Kim, Jin Hee; Hwang, Hyeoi Sun; Lee, Hee Ja; Hwang, In A [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1993-12-01

    The report describes the operation and troubleshooting of the main computers and KAERINet. The results of the project are as follows: 1. The operation and troubleshooting of the main computer systems (Cyber 170-875, Cyber 960-31, VAX 6320, VAX 11/780). 2. The operation and troubleshooting of KAERINet (PC-to-host connection, host-to-host connection, file transfer, electronic mail, X.25, CATV, etc.). 3. The development of applications: the Electronic Document Approval and Delivery System, and installation of the ORACLE utility program. 22 tabs., 12 figs. (Author)

  17. Computer-communication networks

    CERN Document Server

    Meditch, James S

    1983-01-01

    Computer-Communication Networks presents a collection of articles focusing on modeling, analysis, design, and performance optimization. It discusses the problem of modeling the performance of local area networks under file transfer and addresses the design of multi-hop, mobile-user radio networks. Some of the topics covered in the book are distributed packet-switching queueing network design, investigations of communication switching techniques in computer networks, and minimum-hop flow assignment and routing subject to an average message delay constraint

  18. Helium production in mixed spectrum reactor-irradiated pure elements

    International Nuclear Information System (INIS)

    Kneff, D.W.; Oliver, B.M.; Skowronski, R.P.

    1986-01-01

    The objectives of this work are to apply helium accumulation neutron dosimetry to the measurement of neutron fluences and energy spectra in mixed-spectrum fission reactors utilized for fusion materials testing, and to measure the helium generation rates of materials in these irradiation environments. Helium generation measurements have been made for several Fe, Cu, Ti, Nb, Cr, and Pt samples irradiated in the mixed-spectrum High Flux Isotope Reactor (HFIR) and Oak Ridge Research Reactor (ORR) at the Oak Ridge National Laboratory. The results have been used to integrally test the ENDF/B-V Gas Production File, by comparing the measurements with helium generation predictions made by Argonne National Laboratory using ENDF/B-V cross sections and adjusted reactor spectra. The comparisons indicate consistency between the helium measurements and ENDF/B-V for iron, but cross-section discrepancies exist for helium production by fast neutrons in Cu, Ti, Nb, and Cr (the latter for ORR). The Fe, Cu, and Ti work updates and extends previous measurements

  19. Catching errors with patient-specific pretreatment machine log file analysis.

    Science.gov (United States)

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis, clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files, which are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file analysis QAs were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
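
    The core of such a check is comparing planned and recorded leaf positions; a minimal, format-agnostic sketch is given below, with hypothetical arrays standing in for the parsed Dynalog and plan data and an arbitrary tolerance value.

        import numpy as np

        def leaf_position_check(planned_mm, delivered_mm, tolerance_mm=1.0):
            """Return the maximum leaf-position deviation and whether it exceeds tolerance.
            Both inputs are (n_control_points, n_leaves) arrays in millimetres."""
            deviation = np.abs(np.asarray(delivered_mm) - np.asarray(planned_mm))
            worst = float(deviation.max())
            return worst, worst > tolerance_mm

        # Hypothetical data: 100 control points, 120 leaves, small random delivery error
        rng = np.random.default_rng(2)
        planned = rng.uniform(-50, 50, size=(100, 120))
        delivered = planned + rng.normal(0.0, 0.2, size=planned.shape)

        worst, failed = leaf_position_check(planned, delivered)
        print(f"max deviation {worst:.2f} mm, QA {'FAIL' if failed else 'PASS'}")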

  20. A study to compute integrated dpa for neutron and ion irradiation environments using SRIM-2013

    Science.gov (United States)

    Saha, Uttiyoarnab; Devan, K.; Ganesan, S.

    2018-05-01

    Displacements per atom (dpa), estimated based on the standard Norgett-Robinson-Torrens (NRT) model, is used for assessing radiation damage effects in fast reactor materials. A computer code CRaD has been indigenously developed towards establishing the infrastructure to perform improved radiation damage studies in Indian fast reactors. We propose a method for computing multigroup neutron NRT dpa cross sections based on SRIM-2013 simulations. In this method, for each neutron group, the recoil or primary knock-on atom (PKA) spectrum and its average energy are first estimated with CRaD code from ENDF/B-VII.1. This average PKA energy forms the input for SRIM simulation, wherein the recoil atom is taken as the incoming ion on the target. The NRT-dpa cross section of iron computed with "Quick" Kinchin-Pease (K-P) option of SRIM-2013 is found to agree within 10% with the standard NRT-dpa values, if damage energy from SRIM simulation is used. SRIM-2013 NRT-dpa cross sections applied to estimate the integrated dpa for Fe, Cr and Ni are in good agreement with established computer codes and data. A similar study carried out for polyatomic material, SiC, shows encouraging results. In this case, it is observed that the NRT approach with average lattice displacement energy of 25 eV coupled with the damage energies from the K-P option of SRIM-2013 gives reliable displacement cross sections and integrated dpa for various reactor spectra. The source term of neutron damage can be equivalently determined in the units of dpa by simulating self-ion bombardment. This shows that the information of primary recoils obtained from CRaD can be reliably applied to estimate the integrated dpa and damage assessment studies in accelerator-based self-ion irradiation experiments of structural materials. This study would help to advance the investigation of possible correlations between the damages induced by ions and reactor neutrons.
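
    For reference, the NRT displacement count behind these cross sections is nu_NRT = 0.8 * T_dam / (2 * E_d) for damage energies above 2.5 * E_d. The sketch below evaluates it and converts a fluence into dpa; the displacement energy default is the value commonly used for iron, while the cascade energy, fluence, and spectrum-averaged dpa cross section are illustrative assumptions, not values from the paper.

        def nrt_displacements(damage_energy_ev, e_d_ev=40.0):
            """Norgett-Robinson-Torrens displacement count for one PKA.
            `damage_energy_ev` is the damage (not recoil) energy; E_d defaults to
            40 eV, the value commonly used for iron."""
            if damage_energy_ev < e_d_ev:
                return 0.0
            if damage_energy_ev < 2.5 * e_d_ev:
                return 1.0
            return 0.8 * damage_energy_ev / (2.0 * e_d_ev)

        # Illustrative numbers: a 20 keV damage-energy cascade in iron ...
        print("displacements per cascade:", nrt_displacements(20e3))
        # ... and dpa from a fluence, given an assumed spectrum-averaged cross section
        sigma_dpa_barn = 300.0            # hypothetical spectrum-averaged value, barns
        fluence = 1.0e22                  # n/cm^2
        print("dpa:", sigma_dpa_barn * 1e-24 * fluence)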

  1. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  2. FRESCO-II: A computer program for analysis of fission product release from spherical HTGR-fuel elements in irradiation and annealing experiments

    International Nuclear Information System (INIS)

    Krohn, H.; Finken, R.

    1983-06-01

    The modular computer code FRESCO has been developed to describe the mechanisms of fission product release from an HTGR core under accident conditions. By changing some program modules, it has been extended to take into account the transport phenomena (i.e., recoil) that occur only under reactor operating conditions and during irradiation experiments. For this report, the release of cesium and strontium from three HTGR fuel elements has been evaluated and compared with the experimental data. The results show that the measured release can be described by the considered models. (orig.)

  3. Computer programs to make a Chart of the nuclides for WWW

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo; Katakura, Jun-ichi; Horiguchi, Takayoshi

    1999-06-01

    Computer programs to make a chart of the nuclides for the World Wide Web (WWW) have been developed. The programs make a data file for the WWW chart of the nuclides from a data file containing nuclide information in a format similar to ENSDF, filling in unknown half-lives with calculated ones. The WWW chart of the nuclides in gif format is then created from the data file. Programs to make html files and image map files, to select a chart of selected nuclides, and to show various information on nuclides are included in the system. All the programs are written in the C language. This report describes the formats of the files, the programs, and the 1998 issue of the Chart of the Nuclides produced by means of the present programs. (author)

  4. BALANCER: A Computer Program for Balancing Chemical Equations.

    Science.gov (United States)

    Jones, R. David; Schwab, A. Paul

    1989-01-01

    Describes the theory and operation of a computer program which was written to balance chemical equations. Software consists of a compiled file of 46K for use under MS-DOS 2.0 or later on IBM PC or compatible computers. Additional specifications of courseware and availability information are included. (Author/RT)
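
    Balancing can be reduced to finding an integer null-space vector of the element-by-species composition matrix; the short sketch below does this with SymPy for the combustion of methane. This is only an illustration of the underlying linear-algebra idea and is unrelated to the original BASIC implementation.

        from sympy import Matrix, lcm

        # Columns: CH4, O2, CO2, H2O; rows: C, H, O.
        # Reactant columns are positive, product columns negative, so A * coeffs = 0.
        A = Matrix([[1, 0, -1,  0],   # carbon
                    [4, 0,  0, -2],   # hydrogen
                    [0, 2, -2, -1]])  # oxygen

        null = A.nullspace()[0]                    # rational solution vector
        scale = lcm([term.q for term in null])     # clear denominators
        coeffs = [int(term * scale) for term in null]
        print(coeffs)                              # [1, 2, 1, 2] -> CH4 + 2 O2 -> CO2 + 2 H2O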

  5. Text File Comparator

    Science.gov (United States)

    Kotler, R. S.

    1983-01-01

    The file comparator program IFCOMP is a text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts two text files as input and produces a listing of their differences in pseudo-update form. IFCOMP is very useful for monitoring changes made to software at the source code level.
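
    On modern systems, a similar difference listing can be obtained with Python's standard difflib module; a minimal sketch with hypothetical file names follows.

        import difflib

        def compare_files(old_path, new_path):
            """Print a unified-diff listing of the differences between two text files."""
            with open(old_path) as old, open(new_path) as new:
                diff = difflib.unified_diff(old.readlines(), new.readlines(),
                                            fromfile=old_path, tofile=new_path)
            print("".join(diff))

        compare_files("module_v1.f", "module_v2.f")   # hypothetical source files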

  6. Computer automation of a health physics program record

    International Nuclear Information System (INIS)

    Bird, E.M.; Flook, B.A.; Jarrett, R.D.

    1984-01-01

    A multi-user computer data base management system (DBMS) has been developed to automate USDA's national radiological safety program. It maintains information on approved users of radioactive material and radiation-emanating equipment as a central file which is accessed whenever information on the user is required. Files of inventory, personnel dosimetry records, laboratory and equipment surveys, leak tests, bioassay reports, and all other information are linked to each approved user by an assigned code that identifies the user by state, agency, and facility. The DBMS is menu-driven with provisions for addition, modification, and report generation of information maintained in the system. This DBMS was designed as a single-entry system to reduce the redundancy of data entry. Prompts guide the user at decision points, and data validation routines check for proper data entry. The DBMS generates lists of current inventories, leak test forms, and inspection reports, scans for overdue reports from users, and generates follow-up letters. The DBMS operates on a Wang OIS computer and utilizes its compiled BASIC, List Processing, Word Processing, and indexed (ISAM) file features. This system is a very fast relational database supporting many users simultaneously while providing several methods of data protection. All data files are compatible with List Processing. Information in these files can be examined, sorted, modified, or output to word processing documents using software supplied by Wang. This has reduced the need for special one-time programs and provides alternative access to the data

  7. SLIB77, Source Library Data Compression and File Maintenance System

    International Nuclear Information System (INIS)

    Lunsford, A.

    1989-01-01

    Description of program or function: SLIB77 is a source librarian program designed to maintain FORTRAN source code in a compressed form on magnetic disk. The program was prepared to meet program maintenance requirements for ongoing program development and continual improvement of very large programs involving many programmers from a number of different organizations. SLIB77 automatically maintains in one file the source of the current program as well as all previous modifications. Although written originally for FORTRAN programs, SLIB77 is suitable for use with data files, text files, operating systems, and other programming languages, such as Ada, C and COBOL. It can handle libraries with records of up to 160-characters. Records are grouped into DECKS and assigned deck names by the user. SLIB77 assigns a number to each record in each DECK. Records can be deleted or restored singly or as a group within each deck. Modification records are grouped and assigned modification identification names by the user. The program assigns numbers to each new record within the deck. The program has two modes of execution, BATCH and EDIT. The BATCH mode is controlled by an input file and is used to make changes permanent and create new library files. The EDIT mode is controlled by interactive terminal input and a built-in line editor is used for modification of single decks. Transferring of a library from one computer system to another is accomplished using a Portable Library File created by SLIB77 in a BATCH run

  8. Updates and solution to the 21st century computer virus scourge ...

    African Journals Online (AJOL)

    The computer virus scourge continues to be a problem that the Information Technology (IT) industry must address. A computer virus is malicious program code which can replicate itself, spread infection to a large number of possible hosts, and cause damage to computer programs, files, databases and data in general.

  9. 76 FR 48811 - Computer Matching and Privacy Protection Act of 1988

    Science.gov (United States)

    2011-08-09

    ... CORPORATION FOR NATIONAL AND COMMUNITY SERVICE Computer Matching and Privacy Protection Act of... of the Computer Matching and Privacy Protection Act of 1988 (54 FR 25818, June 19, 1989), and OMB... Security Administration (``SSA''). DATES: CNCS will file a report on the computer matching agreement with...

  10. Delivering LHC software to HPC compute elements

    CERN Document Server

    Blomer, Jakob; Hardi, Nikola; Popescu, Radu

    2017-01-01

    In recent years, there was a growing interest in improving the utilization of supercomputers by running applications of experiments at the Large Hadron Collider (LHC) at CERN when idle cores cannot be assigned to traditional HPC jobs. At the same time, the upcoming LHC machine and detector upgrades will produce some 60 times higher data rates and challenge LHC experiments to use so far untapped compute resources. LHC experiment applications are tailored to run on high-throughput computing resources and they have a different anatomy than HPC applications. LHC applications comprise a core framework that allows hundreds of researchers to plug in their specific algorithms. The software stacks easily accumulate to many gigabytes for a single release. New releases are often produced on a daily basis. To facilitate the distribution of these software stacks to world-wide distributed computing resources, LHC experiments use a purpose-built, global, POSIX file system, the CernVM File System. CernVM-FS pre-processes dat...

  11. A computer case definition for sudden cardiac death.

    Science.gov (United States)

    Chung, Cecilia P; Murray, Katherine T; Stein, C Michael; Hall, Kathi; Ray, Wayne A

    2010-06-01

    To facilitate studies of medications and sudden cardiac death, we developed and validated a computer case definition for these deaths. The study of community-dwelling Tennessee Medicaid enrollees 30-74 years of age utilized a linked database with Medicaid inpatient/outpatient files, state death certificate files, and a state 'all-payers' hospital discharge file. The computerized case definition was developed from a retrospective cohort study of sudden cardiac deaths occurring between 1990 and 1993. Medical records for 926 potential cases had been adjudicated for this study to determine if they met the clinical definition for sudden cardiac death occurring in the community and were likely to be due to ventricular tachyarrhythmias. The computerized case definition included deaths with (1) no evidence of a terminal hospital admission/nursing home stay in any of the data sources; (2) an underlying cause of death code consistent with sudden cardiac death; and (3) no terminal procedures inconsistent with unresuscitated cardiac arrest. This definition was validated in an independent sample of 174 adjudicated deaths occurring between 1994 and 2005. The positive predictive value of the computer case definition was 86.0% in the development sample and 86.8% in the validation sample. The positive predictive value did not vary materially for deaths coded according to the ICD-9 (1994-1998, positive predictive value = 85.1%) or ICD-10 (1999-2005, 87.4%) systems. A computerized Medicaid database, linked with death certificate files and a state hospital discharge database, can be used for a computer case definition of sudden cardiac death. Copyright (c) 2009 John Wiley & Sons, Ltd.
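
    Applied to a linked, tabular extract, the three criteria translate into simple row filters; the pandas sketch below uses hypothetical column names and an illustrative subset of ICD codes, not the study's actual code lists.

        import pandas as pd

        # Hypothetical linked extract: one row per death, with flags from linked claims.
        deaths = pd.DataFrame({
            "underlying_cause": ["I46.1", "C34.9", "I21.9"],
            "terminal_inpatient_stay": [False, False, True],
            "terminal_nursing_home": [False, False, False],
            "resuscitation_inconsistent_procedure": [False, False, False],
        })

        cardiac_codes = {"I46.1", "I21.9", "I25.1"}   # illustrative subset only

        is_case = (
            ~deaths["terminal_inpatient_stay"]
            & ~deaths["terminal_nursing_home"]
            & deaths["underlying_cause"].isin(cardiac_codes)
            & ~deaths["resuscitation_inconsistent_procedure"]
        )
        print(deaths[is_case])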

  12. Performance of the ALIBAVA portable readout system with irradiated and non-irradiated microstrip silicon sensors

    International Nuclear Information System (INIS)

    Marco-Hernadez, R.

    2009-01-01

    A readout system for microstrip silicon sensors has been developed as a result of a collaboration among the University of Liverpool, the CNM of Barcelona and the IFIC of Valencia. The name of this collaboration is ALIBAVA and it is integrated in the RD50 Collaboration. This system is able to measure the collected charge in one or two microstrip silicon sensors by reading out all the channels of the sensor(s), up to 256, as analogue measurements. The system uses two Beetle chips to read out the detector(s). The Beetle chip is an analogue pipelined readout chip used in the LHCb experiment. The system can operate with non-irradiated and irradiated sensors as well as with n-type and p-type microstrip silicon sensors. Heavily irradiated sensors will be used at the SLHC, so this system is being used to research the performance of microstrip silicon sensors in conditions as similar as possible to the SLHC operating conditions. The system has two main parts: a hardware part and a software part. The hardware part acquires the sensor signals either from external trigger inputs, in case a radioactive source setup is used, or from a synchronised trigger output generated by the system, if a laser setup is used. The acquired data are sent by USB to be stored in a PC for further processing. The hardware is a dual-board system. The daughterboard is a small board intended to hold the two Beetle readout chips as well as the fan-ins and detector supports used to interface the sensors. The motherboard processes the data, controls the whole hardware, and communicates with the software by USB. The software controls the system and processes the data acquired from the sensors in order to store it in a suitable file format. The main characteristics of the system will be described. Results of measurements acquired with n-type and p-type, irradiated and non-irradiated detectors, using both the laser and the radioactive source setups, will also be presented and discussed

  13. Status and evaluation methods of JENDL fusion file and JENDL PKA/KERMA file

    International Nuclear Information System (INIS)

    Chiba, S.; Fukahori, T.; Shibata, K.; Yu Baosheng; Kosako, K.

    1997-01-01

    The status of evaluated nuclear data in the JENDL fusion file and PKA/KERMA file is presented. The JENDL fusion file was prepared in order to improve the quality of the JENDL-3.1 data, especially the double-differential cross sections (DDXs) of secondary neutrons and the gamma-ray production cross sections, and to provide DDXs of secondary charged particles (p, d, t, 3He and α-particle) for the calculation of PKA and KERMA factors. The JENDL fusion file contains evaluated data for 26 elements ranging from Li to Bi. The data in the JENDL fusion file reproduce the measured data on neutron and charged-particle DDXs and also on gamma-ray production cross sections. Recoil spectra in the PKA/KERMA file were calculated from secondary neutron and charged-particle DDXs contained in the fusion file with two-body reaction kinematics. The data in the JENDL fusion file and PKA/KERMA file were compiled in ENDF-6 format with an MF=6 option to store the DDX data. (orig.)

  14. Design computations and safety report of a cell for in pile irradiation tests

    International Nuclear Information System (INIS)

    Verri, A.

    1987-01-01

    The criteria adopted in positioning the irradiation cell within the 1 MW TRIGA reactor at the ENEA Casaccia centre are reported. The maximum heat which can be released by the cell is then evaluated. The final configuration of the cell as a whole, the heating system for the sample under irradiation, and the procedure used in the calculations are also reported. The selection and design of the safety system, including auxiliary equipment, are discussed

  15. Osteogenic Matrix Cell Sheets Facilitate Osteogenesis in Irradiated Rat Bone

    Directory of Open Access Journals (Sweden)

    Yoshinobu Uchihara

    2015-01-01

    Full Text Available Reconstruction of large bone defects after resection of malignant musculoskeletal tumors is a significant challenge in orthopedic surgery. Extracorporeal autogenous irradiated bone grafting is a treatment option for bone reconstruction. However, nonunion often occurs because the osteogenic capacity is lost by irradiation. In the present study, we established an autogenous irradiated bone graft model in the rat femur to assess whether osteogenic matrix cell sheets improve osteogenesis of the irradiated bone. Osteogenic matrix cell sheets were prepared from bone marrow-derived stromal cells and co-transplanted with irradiated bone. X-ray images at 4 weeks after transplantation showed bridging callus formation around the irradiated bone. Micro-computed tomography images at 12 weeks postoperatively showed abundant callus formation in the whole circumference of the irradiated bone. Histology showed bone union between the irradiated bone and host femur. Mechanical testing showed that the failure force at the irradiated bone site was significantly higher than in the control group. Our study indicates that osteogenic matrix cell sheet transplantation might be a powerful method to facilitate osteogenesis in irradiated bones, which may become a treatment option for reconstruction of bone defects after resection of malignant musculoskeletal tumors.

  16. Automatically controlled facilities for irradiation of silicon crystals at the Rossendorf Research Reactor

    International Nuclear Information System (INIS)

    Ross, R.

    1988-01-01

    This report describes the facilities for neutron transmutation doping of silicon in the GDR. The irradiation of silicon single crystals began at Rossendorf in 1978 with simple equipment in which only a small amount of silicon could be irradiated. The rapidly increasing demand for NTD silicon made it necessary to design and construct new and better facilities. The new facilities are capable of irradiating silicon from 2'' to 3'' in diameter. The irradiation process takes place automatically under computer control. The material produced has an axial homogeneity of ± 7%. Irradiation rigs, techniques, irradiation control and quality control are discussed. (author). 4 figs.

  17. Automated quality control in a file-based broadcasting workflow

    Science.gov (United States)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for the accurate detection of hidden faults in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.

  18. Development and validation of gui based input file generation code for relap

    International Nuclear Information System (INIS)

    Anwar, M.M.; Khan, A.A.; Chughati, I.R.; Chaudri, K.S.; Inyat, M.H.; Hayat, T.

    2009-01-01

    The Reactor Excursion and Leak Analysis Program (RELAP) is a widely accepted computer code for thermal-hydraulic modeling of nuclear power plants. It calculates thermal-hydraulic transients in water-cooled nuclear reactors by solving approximations to the one-dimensional, two-phase equations of hydraulics in an arbitrarily connected system of nodes. However, the preparation of the input file and the subsequent analysis of results with this code are tedious tasks. A Graphical User Interface (GUI) for the preparation of RELAP-5 input files has been developed, and the GUI-generated input files have been validated. The GUI is developed in Microsoft Visual Studio using Visual C# as the programming language. The nodalization diagram is drawn graphically, and the program contains various component forms, along with a starting data form, which are launched for property assignment and generate the input file cards, serving as a GUI for the user. The GUI provides an Open/Save function to store and recall the nodalization diagram along with the components' properties. The GUI-generated input file was validated for several case studies, and individual component cards were compared with the originally required format. The generated RELAP input files were found to be consistent with the requirements of RELAP. The GUI provides a useful platform for simulating complex hydrodynamic problems efficiently with RELAP. (author)

  19. Virtual file system on NoSQL for processing high volumes of HL7 messages.

    Science.gov (United States)

    Kimura, Eizen; Ishihara, Ken

    2015-01-01

    The Standardized Structured Medical Information Exchange (SS-MIX) is intended to be the standard repository for HL7 messages, but it depends on a local file system and its scalability is therefore limited. We implemented a virtual file system using NoSQL to incorporate modern computing technology into SS-MIX and to allow the system to integrate local patient IDs from different healthcare systems into a universal system. We discuss its implementation using the database MongoDB and describe its performance in a case study.
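
    A minimal sketch of storing and retrieving raw HL7 v2 messages in MongoDB with the pymongo driver is shown below; the database and collection names, the universal-ID scheme, and the running server are illustrative assumptions, not the virtual file system described in the paper.

        from pymongo import MongoClient

        client = MongoClient("mongodb://localhost:27017")   # hypothetical server
        messages = client["ssmix_demo"]["hl7_messages"]

        # Store one raw HL7 v2 message keyed by a universal patient ID and message type.
        messages.insert_one({
            "universal_patient_id": "FAC01-000123",          # mapped from a local ID
            "message_type": "ADT^A01",
            "received_at": "2015-01-01T09:30:00",
            "raw": "MSH|^~\\&|HIS|FAC01|...|ADT^A01|MSG00001|P|2.5\rPID|1||000123||DOE^JOHN",
        })

        # Retrieve all messages for that patient, most recent first.
        query = {"universal_patient_id": "FAC01-000123"}
        for doc in messages.find(query).sort("received_at", -1):
            print(doc["message_type"], doc["received_at"])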

  20. Neutronic, thermal-hydraulics and safety calculations of a Miniplate Irradiation Device (MID) of dispersion type fuel elements

    International Nuclear Information System (INIS)

    Domingos, Douglas Borges

    2010-01-01

    Neutronic, thermal-hydraulic, and accident analysis calculations were performed to assess the safety of a Miniplate Irradiation Device (MID) to be placed in the IEA-R1 reactor core. The irradiation device holds miniplates of U3O8-Al and U3Si2-Al dispersion fuels, LEU type (19.75% 235U), with uranium densities of 3.2 gU/cm3 and 4.8 gU/cm3, respectively. The fuel miniplates will be irradiated to nominal 235U burnup levels of 50% and 80%, in order to qualify the above high-density dispersion fuels for use in the Brazilian Multipurpose Reactor (RMB), now in the conception phase. For the neutronic calculations, the computer codes CITATION and 2DB were utilized. The computer code FLOW was used to calculate the coolant flow rate in the irradiation device, allowing the determination of the fuel miniplate temperatures with the computer model MTRCR-IEA-R1. A postulated Loss of Coolant Accident (LOCA) was analyzed with the computer codes LOSS and TEMPLOCA, allowing the calculation of the fuel miniplate temperatures after the reactor pool draining. The calculations showed that the irradiation should occur without adverse consequences in the IEA-R1 reactor. (author)

  1. Monte Carlo simulations and dosimetric studies of an irradiation facility

    Energy Technology Data Exchange (ETDEWEB)

    Belchior, A. [Instituto Tecnologico e Nuclear, Estrada nacional no. 10, Apartado 21, 2686-953 Sacavem (Portugal)], E-mail: anabelchior@itn.pt; Botelho, M.L; Vaz, P. [Instituto Tecnologico e Nuclear, Estrada nacional no. 10, Apartado 21, 2686-953 Sacavem (Portugal)

    2007-09-21

    There is an increasing utilization of ionizing radiation for industrial applications. Additionally, the radiation technology offers a variety of advantages in areas, such as sterilization and food preservation. For these applications, dosimetric tests are of crucial importance in order to assess the dose distribution throughout the sample being irradiated. The use of Monte Carlo methods and computational tools in support of the assessment of the dose distributions in irradiation facilities can prove to be economically effective, representing savings in the utilization of dosemeters, among other benefits. One of the purposes of this study is the development of a Monte Carlo simulation, using a state-of-the-art computational tool-MCNPX-in order to determine the dose distribution inside an irradiation facility of Cobalt 60. This irradiation facility is currently in operation at the ITN campus and will feature an automation and robotics component, which will allow its remote utilization by an external user, under REEQ/996/BIO/2005 project. The detailed geometrical description of the irradiation facility has been implemented in MCNPX, which features an accurate and full simulation of the electron-photon processes involved. The validation of the simulation results obtained was performed by chemical dosimetry methods, namely a Fricke solution. The Fricke dosimeter is a standard dosimeter and is widely used in radiation processing for calibration purposes.

  2. Computer control of fuel handling activities at FFTF

    International Nuclear Information System (INIS)

    Romrell, D.M.

    1985-03-01

    The Fast Flux Test Facility near Richland, Washington, utilizes computer control for reactor refueling and other related core component handling and processing tasks. The computer controlled tasks described in this paper include core component transfers within the reactor vessel, core component transfers into and out of the reactor vessel, remote duct measurements of irradiated core components, remote duct cutting, and finally, transferring irradiated components out of the reactor containment building for off-site shipments or to long term storage. 3 refs., 16 figs

  3. Surveillance of irradiation embrittlement of nuclear reactor pressure vessels

    International Nuclear Information System (INIS)

    Najzer, M.

    1982-01-01

    Surveillance of irradiation embrittlement of nuclear reactor pressure vessels is briefly discussed. The experimental techniques and computer programs available for this work at the J. Stefan Institute are described. (author)

  4. Implementing Journaling in a Linux Shared Disk File System

    Science.gov (United States)

    Preslan, Kenneth W.; Barry, Andrew; Brassow, Jonathan; Cattelan, Russell; Manthei, Adam; Nygaard, Erling; VanOort, Seth; Teigland, David; Tilstra, Mike; O'Keefe, Matthew

    2000-01-01

    In computer systems today, speed and responsiveness are often determined by network and storage subsystem performance. Faster, more scalable networking interfaces like Fibre Channel and Gigabit Ethernet provide the scaffolding from which higher performance computer systems implementations may be constructed, but new thinking is required about how machines interact with network-enabled storage devices. In this paper we describe how we implemented journaling in the Global File System (GFS), a shared-disk, cluster file system for Linux. Our previous three papers on GFS at the Mass Storage Symposium discussed our first three GFS implementations, their performance, and the lessons learned. Our fourth paper describes, appropriately enough, the evolution of GFS version 3 to version 4, which supports journaling and recovery from client failures. In addition, GFS scalability tests extending to eight machines accessing eight 4-disk enclosures were conducted: these tests showed good scaling. We describe the GFS cluster infrastructure, which is necessary for proper recovery from machine and disk failures in a collection of machines sharing disks using GFS. Finally, we discuss the suitability of Linux for handling the big data requirements of supercomputing centers.
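
    A minimal write-ahead journaling sketch, assuming a toy file store, is given below to illustrate the general idea (log the intent, force it to stable storage, apply the change, then mark it committed); it has nothing to do with the GFS on-disk format.

        # Minimal write-ahead journaling sketch; not the GFS on-disk format.
        import json, os

        JOURNAL = "journal.log"

        def journaled_write(path, data):
            record = {"op": "write", "path": path, "data": data}
            with open(JOURNAL, "a") as j:            # 1. append the intent record
                j.write(json.dumps(record) + "\n")
                j.flush()
                os.fsync(j.fileno())                 # force the log to stable storage
            with open(path, "w") as f:               # 2. apply the change
                f.write(data)
                f.flush()
                os.fsync(f.fileno())
            with open(JOURNAL, "a") as j:            # 3. mark the record committed
                j.write(json.dumps({"op": "commit", "path": path}) + "\n")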

  5. Volume dose of organs at risk in the irradiated volume

    International Nuclear Information System (INIS)

    Hishikawa, Yoshio; Tanaka, Shinichi; Miura, Takashi

    1984-01-01

    Absorbed dose of organs at risk in the 50% irradiated volume needs to be carefully monitored because there is a high risk of radiation injury. This paper reports on the histogram of three-dimensional volume dose of organs at risk, which is obtained by computer calculation of CT scans. In order to obtain this histogram, CT is first performed in the irradiation field. The dose in each pixel is then computed for each slice. After the pixels of all slices in the organ at risk within the irradiated field are classified according to dose, the number of pixels in each dose class is counted. The result is expressed as a histogram. The histogram can show the differences in the influence on organs at risk produced by various radiation treatment techniques. The total volume dose of organs at risk after radiotherapy can also be obtained by integration of the doses from the different treatment techniques. (author)
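
    The histogram construction described above can be sketched as follows: classify the dose in every pixel of the organ at risk and count the pixels per dose class. Array shapes and the bin width are illustrative assumptions.

        # Sketch of a dose-volume histogram: count organ-at-risk pixels per dose class.
        import numpy as np

        def dose_volume_histogram(dose_per_pixel, organ_mask, bin_width=1.0):
            """dose_per_pixel: 3-D array (slice, row, col) of dose in Gy;
               organ_mask: boolean array of the same shape marking the organ at risk."""
            organ_doses = dose_per_pixel[organ_mask]
            bins = np.arange(0.0, organ_doses.max() + bin_width, bin_width)
            counts, edges = np.histogram(organ_doses, bins=bins)
            return counts, edges   # pixels (volume) per dose class, and class edges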

  6. CINDA 99, supplement 2 to CINDA 97 (1988-1999). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1999-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data available from four regional neutron data centres. The present issue, CINDA 99, is the second supplement to CINDA 97, the index to the literature on neutron data published after 1987. It supersedes the first supplement, CINDA 98. The complete CINDA file as of 1 June 1999 is contained in: the archival issue CINDA-A (5 volumes, 1990), CINDA 97 and the current issue CINDA 99. The compilation and publication of CINDA are the result of worldwide co-operation involving the following four data centres. Each centre is responsible for compiling the CINDA entries from the literature published in a defined geographical area given in brackets below: the USA National Nuclear Data Center at the Brookhaven National Laboratory, USA (United States of America and Canada); the Russian Nuclear Data Centre at the Fiziko-Energeticheskij Institut, Obninsk, Russian Federation (former USSR countries); the NEA Data Bank in Paris, France (European OECD member countries in Western Europe and Japan); and the IAEA Nuclear Data Section in Vienna, Austria (all other countries in Eastern Europe, Asia, Australia, Africa, Central and South America; also IAEA publications and translation journals). Besides the published CINDA books, up-to-date computer retrievals for specified CINDA information are currently available on request from the responsible CINDA centres, or via direct access to the on-line services as described in this publication

  7. New FORTRAN computer programs to acquire and process isotopic mass-spectrometric data

    International Nuclear Information System (INIS)

    Smith, D.H.

    1982-08-01

    The computer programs described in New Computer Programs to Acquire and Process Isotopic Mass Spectrometric Data have been revised. This report describes in some detail the operation of these programs, which acquire and process isotopic mass spectrometric data. Both functional and overall design aspects are addressed. The three basic program units - file manipulation, data acquisition, and data processing - are discussed in turn. Step-by-step instructions are included where appropriate, and each subsection is described in enough detail to give a clear picture of its function. Organization of file structure, which is central to the entire concept, is extensively discussed with the help of numerous tables. Appendices contain flow charts and outline file structure to help a programmer unfamiliar with the programs to alter them with a minimum of lost time

  8. Efficient analysis and extraction of MS/MS result data from Mascot™ result files

    Directory of Open Access Journals (Sweden)

    Sickmann Albert

    2005-12-01

    Full Text Available Abstract Background Mascot™ is a commonly used protein identification program for MS as well as for tandem MS data. When analyzing huge shotgun proteomics datasets with Mascot™'s native tools, limits of computing resources are easily reached. Up to now no application has been available as open source that is capable of converting the full content of Mascot™ result files from the original MIME format into a database-compatible tabular format, allowing direct import into database management systems and efficient handling of huge datasets analyzed by Mascot™. Results A program called mres2x is presented, which reads Mascot™ result files, analyzes them and extracts either selected or all information in order to store it in a single file or multiple files in formats which are easier to handle downstream of Mascot™. It generates different output formats. The output of mres2x in tab format is especially designed for direct high-performance import into relational database management systems using native tools of these systems. Having the data available in database management systems allows complex queries and extensive analysis. In addition, the original peak lists can be extracted in DTA format suitable for protein identification using the Sequest™ program, and the Mascot™ files can be split, preserving the original data format. During conversion, several consistency checks are performed. mres2x is designed to provide high throughput processing combined with the possibility to be driven by other computer programs. The source code including supplement material and precompiled binaries is available via http://www.protein-ms.de and http://sourceforge.net/projects/protms/. Conclusion The database upload allows regrouping of the MS/MS results using a database management system and complex analyzing queries using SQL without the need to run new Mascot™ searches when changing grouping parameters.
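
    A hedged sketch of the general conversion idea (MIME-wrapped result sections turned into a tab-separated table for database import) is given below; the section and field names are assumptions, and this is not the mres2x source code.

        # Illustrative sketch: extract one named MIME section of a result file into TSV.
        # The section name "peptides" and the key=value layout are assumptions.
        import csv, email

        def results_to_tsv(result_path, tsv_path):
            with open(result_path) as f:
                msg = email.message_from_file(f)          # result file parsed as MIME
            rows = []
            for part in msg.walk():
                if part.get_param("name") == "peptides":  # assumed section of interest
                    for line in part.get_payload().splitlines():
                        if "=" in line:
                            key, value = line.split("=", 1)
                            rows.append([key] + value.split(","))
            with open(tsv_path, "w", newline="") as out:
                csv.writer(out, delimiter="\t").writerows(rows)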

  9. A Metadata-Rich File System

    Energy Technology Data Exchange (ETDEWEB)

    Ames, S; Gokhale, M B; Maltzahn, C

    2009-01-07

    Despite continual improvements in the performance and reliability of large scale file systems, the management of file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, metadata, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS includes Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the defacto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
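
    The following toy snippet illustrates the flavour of a model in which files and their relationships are both first-class records that can be queried; it only mimics the idea and is unrelated to the actual QFS/Quasar implementation.

        # Toy graph-like metadata model: files, attributes and relationships as records.
        files = {1: {"name": "scan_001.dat", "instrument": "beamline-7"},
                 2: {"name": "scan_001.meta", "format": "json"}}
        links = [(2, "describes", 1)]                 # (source, relationship, target)

        def related(file_id, relationship):
            """Return files reached from file_id via the given relationship."""
            return [files[dst] for src, rel, dst in links
                    if src == file_id and rel == relationship]

        print(related(2, "describes"))                # -> [{'name': 'scan_001.dat', ...}]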

  10. New thermal neutron scattering files for ENDF/B-VI release 2

    International Nuclear Information System (INIS)

    MacFarlane, R.E.

    1994-03-01

    At thermal neutron energies, the binding of the scattering nucleus in a solid, liquid, or gas affects the cross section and the distribution of secondary neutrons. These effects are described in the thermal sub-library of Version VI of the Evaluated Nuclear Data Files (ENDF/B-VI) using the File 7 format. In the original release of the ENDF/B-VI library, the data in File 7 were obtained by converting the thermal scattering evaluations of ENDF/B-III to the ENDF-6 format. These original evaluations were prepared at General Atomics (GA) in the late sixties, and they suffer from accuracy limitations imposed by the computers of the day. This report describes new evaluations for six of the thermal moderator materials and six new cold moderator materials. The calculations were made with the LEAPR module of NJOY, which uses methods based on the British code LEAP, together with the original GA physics models, to obtain new ENDF files that are accurate over a wider range of energy and momentum transfer than the existing files. The new materials are H in H₂O, Be metal, Be in BeO, C in graphite, H in ZrH, Zr in ZrH, liquid ortho-hydrogen, liquid para-hydrogen, liquid ortho-deuterium, liquid para-deuterium, liquid methane, and solid methane

  11. ArrayBridge: Interweaving declarative array processing with high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Xing, Haoyuan [The Ohio State Univ., Columbus, OH (United States); Floratos, Sofoklis [The Ohio State Univ., Columbus, OH (United States); Blanas, Spyros [The Ohio State Univ., Columbus, OH (United States); Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Prabhat, Prabhat [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Brown, Paul [Paradigm4, Inc., Waltham, MA (United States)

    2017-05-04

    Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats, that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation in NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits statistically indistinguishable performance and I/O scalability to the native SciDB storage engine.
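
    As an illustration of the kind of file-centric access ArrayBridge interoperates with, the sketch below writes and reads an array through the standard HDF5 API (here via h5py); the file and dataset names are arbitrary and this is not ArrayBridge code.

        # Writing and reading an array through the HDF5 API (names are arbitrary).
        import h5py
        import numpy as np

        with h5py.File("simulation.h5", "w") as f:
            f.create_dataset("temperature", data=np.random.rand(1024, 1024))

        with h5py.File("simulation.h5", "r") as f:
            tile = f["temperature"][0:256, 0:256]   # read a sub-array without loading it all
            print(tile.mean())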

  12. PHREEQCI; a graphical user interface for the geochemical computer program PHREEQC

    Science.gov (United States)

    Charlton, Scott R.; Macklin, Clifford L.; Parkhurst, David L.

    1997-01-01

    PhreeqcI is a Windows-based graphical user interface for the geochemical computer program PHREEQC. PhreeqcI provides the capability to generate and edit input data files, run simulations, and view text files containing simulation results, all within the framework of a single interface. PHREEQC is a multipurpose geochemical program that can perform speciation, inverse, reaction-path, and 1D advective reaction-transport modeling. Interactive access to all of the capabilities of PHREEQC is available with PhreeqcI. The interface is written in Visual Basic and will run on personal computers under the Windows(3.1), Windows95, and WindowsNT operating systems.

  13. Parallel checksumming of data chunks of a shared data object using a log-structured file system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-09-06

    Checksum values are generated and used to verify the data integrity. A client executing in a parallel computing system stores a data chunk to a shared data object on a storage node in the parallel computing system. The client determines a checksum value for the data chunk; and provides the checksum value with the data chunk to the storage node that stores the shared object. The data chunk can be stored on the storage node with the corresponding checksum value as part of the shared object. The storage node may be part of a Parallel Log-Structured File System (PLFS), and the client may comprise, for example, a Log-Structured File System client on a compute node or burst buffer. The checksum value can be evaluated when the data chunk is read from the storage node to verify the integrity of the data that is read.
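
    A minimal sketch of the write-side and read-side checksum idea follows (compute a checksum when a chunk is stored, verify it when the chunk is read back); it uses an in-memory dictionary and SHA-256 for illustration and does not reproduce the PLFS implementation.

        # Store a chunk with its checksum; verify the checksum on read-back.
        import hashlib

        def store_chunk(storage, key, chunk: bytes):
            storage[key] = (chunk, hashlib.sha256(chunk).hexdigest())

        def read_chunk(storage, key) -> bytes:
            chunk, stored_digest = storage[key]
            if hashlib.sha256(chunk).hexdigest() != stored_digest:
                raise IOError(f"checksum mismatch for chunk {key}")
            return chunk

        store = {}
        store_chunk(store, "obj.0", b"simulation output bytes")
        assert read_chunk(store, "obj.0") == b"simulation output bytes"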

  14. HUD GIS Boundary Files

    Data.gov (United States)

    Department of Housing and Urban Development — The HUD GIS Boundary Files are intended to supplement boundary files available from the U.S. Census Bureau. The files are for community planners interested in...

  15. Code 672 observational science branch computer networks

    Science.gov (United States)

    Hancock, D. W.; Shirk, H. G.

    1988-01-01

    In general, networking increases productivity due to the speed of transmission, easy access to remote computers, ability to share files, and increased availability of peripherals. Two different networks within the Observational Science Branch are described in detail.

  16. Computational analysis of the dose rates at JSI TRIGA reactor irradiation facilities.

    Science.gov (United States)

    Ambrožič, K; Žerovnik, G; Snoj, L

    2017-12-01

    The JSI TRIGA Mark II research reactor is equipped with numerous irradiation positions where samples can be irradiated by neutrons and γ-rays. Irradiation position selection is based on its properties, such as physical size and accessibility, as well as neutron and γ-ray spectra, flux and dose intensities. This paper presents an overview of the calculated neutron and γ-ray fluxes, spectra and dose intensities, obtained with the Monte Carlo MCNP software and ENDF/B-VII.0 nuclear data libraries. The dose rates are presented in terms of ambient dose equivalent, air kerma, and silicon dose equivalent. At full reactor power the neutron ambient dose equivalent ranges from 5.5×10³ Sv h⁻¹ to 6×10⁶ Sv h⁻¹, the silicon dose equivalent from 6×10² Gy(Si) h⁻¹ to 3×10⁵ Gy(Si) h⁻¹, and the neutron air kerma from 4.3×10³ Gy h⁻¹ to 2×10⁵ Gy h⁻¹. The ratio of fast (E > 1 MeV) to thermal neutron flux is also evaluated. The γ-ray ambient dose equivalent at full reactor power ranges from 3.4×10³ Sv h⁻¹ to 3.6×10⁵ Sv h⁻¹ and the γ air kerma from 3.1×10³ Gy h⁻¹ to 2.9×10⁵ Gy h⁻¹. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Cloud Computing Cryptography "State-of-the-Art"

    OpenAIRE

    Omer K. Jasim; Safia Abbas; El-Sayed M. El-Horbaty; Abdel-Badeeh M. Salem

    2013-01-01

    Cloud computing technology is very useful in present day to day life, it uses the internet and the central remote servers to provide and maintain data as well as applications. Such applications in turn can be used by the end users via the cloud communications without any installation. Moreover, the end users’ data files can be accessed and manipulated from any other computer using the internet services. Despite the flexibility of data and application accessing and usage that cloud computing e...

  18. Radiation retinopathy after orbital irradiation for Graves' ophthalmopathy

    International Nuclear Information System (INIS)

    Kinyoun, J.L.; Kalina, R.E.; Brower, S.A.; Mills, R.P.; Johnson, R.H.

    1984-01-01

    Recent reports indicate that orbital irradiation for Graves' ophthalmopathy is sometimes beneficial, particularly for dysthyroid optic neuropathy, and is not associated with serious complications. We are aware, however, of four patients who were found to have radiation retinopathy after orbital irradiation for Graves' ophthalmopathy. All four patients have decreased central acuity, and three of the four are legally blind in one or both eyes. Computer reconstruction of the dosimetry, based on computed tomography and beam profiles, shows that errors in dosage calculations and radiotherapy technique probably account for the radiation retinopathy in three of the four patients. Radiotherapy for Graves' ophthalmopathy should be administered only by competent radiotherapists who are experienced in the treatment of this disease. Similar errors in dosage calculations and treatment techniques may account for other reports of radiation retinopathy after reportedly safe dosages

  19. Migrating Educational Data and Services to Cloud Computing: Exploring Benefits and Challenges

    Science.gov (United States)

    Lahiri, Minakshi; Moseley, James L.

    2013-01-01

    "Cloud computing" is currently the "buzzword" in the Information Technology field. Cloud computing facilitates convenient access to information and software resources as well as easy storage and sharing of files and data, without the end users being aware of the details of the computing technology behind the process. This…

  20. 33 CFR 148.246 - When is a document considered filed and where should I file it?

    Science.gov (United States)

    2010-07-01

    ... filed and where should I file it? 148.246 Section 148.246 Navigation and Navigable Waters COAST GUARD... Formal Hearings § 148.246 When is a document considered filed and where should I file it? (a) If a document to be filed is submitted by mail, it is considered filed on the date it is postmarked. If a...

  1. Collection Of Software For Computer Graphics

    Science.gov (United States)

    Hibbard, Eric A.; Makatura, George

    1990-01-01

    Ames Research Graphics System (ARCGRAPH) collection of software libraries and software utilities assisting researchers in generating, manipulating, and visualizing graphical data. Defines metafile format containing device-independent graphical data. File format used with various computer-graphics-manipulation and -animation software packages at Ames, including SURF (COSMIC Program ARC-12381) and GAS (COSMIC Program ARC-12379). Consists of two-stage "pipeline" used to put out graphical primitives. ARCGRAPH libraries developed on VAX computer running VMS.

  2. Quantitative computed tomography bone mineral density measurements in irradiated and non-irradiated minipig alveolar bone: an experimental study.

    NARCIS (Netherlands)

    Verdonck, H.W.; Meijer, G.J.; Nieman, F.H.; Stoll, C.; Riediger, D.; Baat, C. de

    2008-01-01

    OBJECTIVE: The objective of this study was to analyse the effect of irradiation on bone mineral density (BMD). MATERIALS AND METHODS: All maxillary and mandibular pre-molars and molars of six minipigs were extracted. After a 3-month healing period, the maxilla and mandibles of three minipigs

  3. Microcomputer-based systems for automatic control of sample irradiation and chemical analysis of short-lived isotopes

    International Nuclear Information System (INIS)

    Bourret, S.C.

    1974-01-01

    Two systems resulted from the need for the study of the nuclear decay of short-lived radionuclides. Automation was required for better repeatability, speed of chemical separation after irradiation and for protection from the high radiation fields of the samples. A MCS-8 computer was used as the nucleus of the automatic sample irradiation system because the control system required an extensive multiple-sequential circuit. This approach reduced the sequential problem to a computer program. The automatic chemistry control system is a mixture of a fixed and a computer-based programmable control system. The fixed control receives the irradiated liquid sample from the reactor, extracts the liquid and disposes of the used sample container. The programmable control executes the chemistry program that the user has entered through the teletype. (U.S.)

  4. Approaches in highly parameterized inversion-PESTCommander, a graphical user interface for file and run management across networks

    Science.gov (United States)

    Karanovic, Marinko; Muffels, Christopher T.; Tonkin, Matthew J.; Hunt, Randall J.

    2012-01-01

    Models of environmental systems have become increasingly complex, incorporating increasingly large numbers of parameters in an effort to represent physical processes on a scale approaching that at which they occur in nature. Consequently, the inverse problem of parameter estimation (specifically, model calibration) and subsequent uncertainty analysis have become increasingly computation-intensive endeavors. Fortunately, advances in computing have made computational power equivalent to that of dozens to hundreds of desktop computers accessible through a variety of alternate means: modelers have various possibilities, ranging from traditional Local Area Networks (LANs) to cloud computing. Commonly used parameter estimation software is well suited to take advantage of the availability of such increased computing power. Unfortunately, logistical issues become increasingly important as an increasing number and variety of computers are brought to bear on the inverse problem. To facilitate efficient access to disparate computer resources, the PESTCommander program documented herein has been developed to provide a Graphical User Interface (GUI) that facilitates the management of model files ("file management") and remote launching and termination of "slave" computers across a distributed network of computers ("run management"). In version 1.0 described here, PESTCommander can access and ascertain resources across traditional Windows LANs: however, the architecture of PESTCommander has been developed with the intent that future releases will be able to access computing resources (1) via trusted domains established in Wide Area Networks (WANs) in multiple remote locations and (2) via heterogeneous networks of Windows- and Unix-based operating systems. The design of PESTCommander also makes it suitable for extension to other computational resources, such as those that are available via cloud computing. Version 1.0 of PESTCommander was developed primarily to work with the

  5. Protecting your files on the DFS file system

    CERN Multimedia

    Computer Security Team

    2011-01-01

    The Windows Distributed File System (DFS) hosts user directories for all NICE users plus much more data. Files can be accessed from anywhere, via a dedicated web portal (http://cern.ch/dfs). Due to the ease of access to DFS within CERN, it is of utmost importance to properly protect access to sensitive data. As the use of DFS access control mechanisms is not obvious to all users, passwords, certificates or sensitive files might get exposed. At least this happened in the past to the Andrew File System (AFS, the Linux equivalent of DFS) and led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed recently to apply more stringent protections to all DFS user folders. The goal of this data protection policy is to assist users in pro...

  6. Protecting your files on the AFS file system

    CERN Multimedia

    2011-01-01

    The Andrew File System is a world-wide distributed file system linking hundreds of universities and organizations, including CERN. Files can be accessed from anywhere, via dedicated AFS client programs or via web interfaces that export the file contents on the web. Due to the ease of access to AFS it is of utmost importance to properly protect access to sensitive data in AFS. As the use of AFS access control mechanisms is not obvious to all users, passwords, private SSH keys or certificates have been exposed in the past. In one specific instance, this also led to bad publicity due to a journalist accessing supposedly "private" AFS folders (SonntagsZeitung 2009/11/08). This problem does not only affect the individual user but also has a bad impact on CERN's reputation when it comes to IT security. Therefore, all departments and LHC experiments agreed in April 2010 to apply more stringent folder protections to all AFS user folders. The goal of this data protection policy is to assist users in...

  7. Instructions for preparation of data entry sheets for Licensee Event Report (LER) file. Revision 1. Instruction manual

    International Nuclear Information System (INIS)

    1977-07-01

    The manual provides instructions for the preparation of data entry sheets for the licensee event report (LER) file. It is a revision to an interim manual published in October 1974 in 00E-SS-001. The LER file is a computer-based data bank of information using the data entry sheets as input. These data entry sheets contain pertinent information in regard to those occurrences required to be reported to the NRC. The computer-based data bank provides a centralized source of data that may be used for qualitative assessment of the nature and extent of off-normal events in the nuclear industry and as an index of source information to which users may refer for more detail

  8. Direct utilization of information from nuclear data files in Monte Carlo simulation of neutron and photon transport

    International Nuclear Information System (INIS)

    Androsenko, P.; Joloudov, D.; Kompaniyets, A.

    2001-01-01

    Questions related to the Monte Carlo method for the solution of the neutron and photon transport equation are discussed in this work. Problems dealing with the direct utilization of information from evaluated nuclear data files in run-time calculations are considered. ENDF-6 format libraries have been used for the calculations. Approaches provided by the rules of ENDF-6 files 2, 3-6, 12-15, 23 and 27, and algorithms for the reconstruction of resolved and unresolved resonance region cross sections at a preset energy, are described. Comparisons with calculations made by the NJOY and GRUCON programs and with computed cross section data are presented. Computed neutron leakage spectra for spherical benchmark experiments are also presented. (authors)
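
    One small step of such run-time processing can be illustrated as below: linear-linear interpolation of a pointwise cross section at a preset energy. The grid and values are invented, and real ENDF-6 files carry several interpolation laws and resonance parameters that are not treated here.

        # Linear-linear interpolation of a pointwise cross section at a preset energy.
        import bisect

        energies = [1.0e-5, 1.0, 1.0e3, 1.0e6, 2.0e7]   # eV, assumed ascending grid
        sigmas   = [90.0, 12.0, 5.0, 2.5, 1.8]          # barns, illustrative values

        def sigma_at(e):
            i = bisect.bisect_right(energies, e)
            if i == 0 or i == len(energies):
                raise ValueError("energy outside tabulated range")
            e0, e1 = energies[i - 1], energies[i]
            s0, s1 = sigmas[i - 1], sigmas[i]
            return s0 + (s1 - s0) * (e - e0) / (e1 - e0)

        print(sigma_at(5.0e5))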

  9. The design and analysis of salmonid tagging studies in the Columbia Basin. Volume 10: Instructional guide to using program CaptHist to create SURPH files for survival analysis using PTAGIS data files

    International Nuclear Information System (INIS)

    Westhagen, P.; Skalski, J.

    1997-12-01

    The SURPH program is a valuable tool for estimating survivals and capture probabilities of fish outmigrations on the Snake and Columbia Rivers. Using special data files, SURPH computes reach to reach statistics for any release group passing a system of detection sites. Because the data must be recorded for individual fish, PIT tag data is best suited for use as input. However, PIT tag data as available from PTAGIS comes in a form that is not ready for use as SURPH input. SURPH requires a capture history for each fish. A capture history consists of a series of fields, one for each detection site, that has a code for whether the fish was detected and returned to the river, detected and removed, or not detected. For the PTAGIS data to be usable by SURPH it must be pre-processed. The data must be condensed down to one line per fish with the relevant detection information from the PTAGIS file represented compactly on each line. In addition, the PTAGIS data file coil information must be passed through a series of logic algorithms to determine whether or not a fish is returned to the river after detection. Program CaptHist was developed to properly pre-process the PTAGIS data files for input to program SURPH. This utility takes PTAGIS data files as input and creates a SURPH data file as well as other output including travel time records, detection date records, and a data error file. CaptHist allows a user to download PTAGIS files and easily process the data for use with SURPH
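
    The core condensation step can be sketched as follows: build one capture-history line per fish from its detection records. The site list and the detection codes (1 = detected and returned, 2 = detected and removed, 0 = not detected) are assumptions for illustration, not the actual PTAGIS/SURPH coding.

        # One capture-history line per fish from (tag, site, removed) detection records.
        from collections import defaultdict

        SITES = ["LGR", "LGS", "LMN", "MCN"]            # assumed detection sites in river order

        def capture_histories(detections):
            """detections: iterable of (tag_id, site, removed_flag) tuples."""
            history = defaultdict(lambda: {s: 0 for s in SITES})
            for tag, site, removed in detections:
                history[tag][site] = 2 if removed else 1
            return {tag: " ".join(str(h[s]) for s in SITES) for tag, h in history.items()}

        print(capture_histories([("3D9.1", "LGR", False), ("3D9.1", "MCN", True)]))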

  10. Some computer graphical user interfaces in radiation therapy.

    Science.gov (United States)

    Chow, James C L

    2016-03-28

    In this review, five graphical user interfaces (GUIs) used in radiation therapy practices and researches are introduced. They are: (1) the treatment time calculator, superficial X-ray treatment time calculator (SUPCALC) used in the superficial X-ray radiation therapy; (2) the monitor unit calculator, electron monitor unit calculator (EMUC) used in the electron radiation therapy; (3) the multileaf collimator machine file creator, sliding window intensity modulated radiotherapy (SWIMRT) used in generating fluence map for research and quality assurance in intensity modulated radiation therapy; (4) the treatment planning system, DOSCTP used in the calculation of 3D dose distribution using Monte Carlo simulation; and (5) the monitor unit calculator, photon beam monitor unit calculator (PMUC) used in photon beam radiation therapy. One common issue of these GUIs is that all user-friendly interfaces are linked to complex formulas and algorithms based on various theories, which do not have to be understood and noted by the user. In that case, user only needs to input the required information with help from graphical elements in order to produce desired results. SUPCALC is a superficial radiation treatment time calculator using the GUI technique to provide a convenient way for radiation therapist to calculate the treatment time, and keep a record for the skin cancer patient. EMUC is an electron monitor unit calculator for electron radiation therapy. Instead of doing hand calculation according to pre-determined dosimetric tables, clinical user needs only to input the required drawing of electron field in computer graphical file format, prescription dose, and beam parameters to EMUC to calculate the required monitor unit for the electron beam treatment. EMUC is based on a semi-experimental theory of sector-integration algorithm. SWIMRT is a multileaf collimator machine file creator to generate a fluence map produced by a medical linear accelerator. This machine file controls

  11. A performance analysis of advanced I/O architectures for PC-based network file servers

    Science.gov (United States)

    Huynh, K. D.; Khoshgoftaar, T. M.

    1994-12-01

    In the personal computing and workstation environments, more and more I/O adapters are becoming complete functional subsystems that are intelligent enough to handle I/O operations on their own without much intervention from the host processor. The IBM Subsystem Control Block (SCB) architecture has been defined to enhance the potential of these intelligent adapters by defining services and conventions that deliver command information and data to and from the adapters. In recent years, a new storage architecture, the Redundant Array of Independent Disks (RAID), has been quickly gaining acceptance in the world of computing. In this paper, we would like to discuss critical system design issues that are important to the performance of a network file server. We then present a performance analysis of the SCB architecture and disk array technology in typical network file server environments based on personal computers (PCs). One of the key issues investigated in this paper is whether a disk array can outperform a group of disks (of same type, same data capacity, and same cost) operating independently, not in parallel as in a disk array.

  12. Mechanical properties of irradiated materials

    International Nuclear Information System (INIS)

    Robertson, I.M.; Robach, J.; Wirth, B.

    2001-01-01

    The effect of irradiation on the mechanical properties of metals is considered with particular attention being paid to the development of defect-free channels following uniaxial tensile loading. The in situ transmission electron microscope deformation technique is coupled with dislocation dynamic computer simulations to reveal the fundamental processes governing the elimination of defects by glissile dislocations. The observations of preliminary experiments are reported.(author)

  13. In-service irradiated and aged material evaluations

    International Nuclear Information System (INIS)

    Haggag, F.M.; Nanstad, R.K.; Alexander, D.J.

    1995-01-01

    The objective of this task is to provide a direct assessment of actual material properties in irradiated components of nuclear reactors, including the effects of irradiation and aging. Four activities are currently in progress: (1) establishing a machining capability for contaminated or activated materials by completing procurement and installation of a computer-based milling machine in a hot cell; (2) machining and testing specimens from cladding materials removed from the Gundremmingen reactor to establish their fracture properties; (3) preparing an interpretive report on the effects of neutron irradiation on cladding; and (4) continuing the evaluation of long-term aging of austenitic structural stainless steel weld metal by metallurgically examining and testing specimens aged at 288 and 343 degrees C and reporting the results, as well as by continuing the aging of the stainless steel cladding toward a total time of 50,000 h

  14. Computer-aided preparation of specifications for radial fans at VEB Lufttechnische Anlagen Berlin

    Energy Technology Data Exchange (ETDEWEB)

    Kubis, R.; Kull, W.

    1987-01-01

    The specification details the scope of delivery for radial fans on a standard page and also serves as the basis for production preparation. In place of the previous manual preparation, a computer-aided technique for the office computer is presented that derives the technical parameters from data files using only a few input data that identify the fan type. The data files and evaluation programs are based on the software tool REDABAS and the SCP operating system. Using this technique, it has been possible to cut the preparation time for incoming orders considerably.

  15. Understanding the Irradiation Behavior of Zirconium Carbide

    International Nuclear Information System (INIS)

    Motta, Arthur; Sridharan, Kumar; Morgan, Dane; Szlufarska, Izabela

    2013-01-01

    Zirconium carbide (ZrC) is being considered for utilization in high-temperature gas-cooled reactor fuels in deep-burn TRISO fuel. Zirconium carbide possesses a cubic B1-type crystal structure with a high melting point, exceptional hardness, and good thermal and electrical conductivities. The use of ZrC as part of the TRISO fuel requires a thorough understanding of its irradiation response. However, the radiation effects on ZrC are still poorly understood. The majority of the existing research is focused on the radiation damage phenomena at higher temperatures (>450°C) where many fundamental aspects of defect production and kinetics cannot be easily distinguished. Little is known about basic defect formation, clustering, and evolution of ZrC under irradiation, although some atomistic simulation and phenomenological studies have been performed. Such detailed information is needed to construct a model describing the microstructural evolution in fast-neutron irradiated materials that will be of great technological importance for the development of ZrC-based fuel. The goal of the proposed project is to gain fundamental understanding of the radiation-induced defect formation in zirconium carbide and irradiation response by using a combination of state-of-the-art experimental methods and atomistic modeling. This project will combine (1) in situ ion irradiation at a specialized facility at a national laboratory, (2) controlled temperature proton irradiation on bulk samples, and (3) atomistic modeling to gain a fundamental understanding of defect formation in ZrC. The proposed project will cover irradiation temperatures from cryogenic temperature to as high as 800°C, and dose ranges from 0.1 to 100 dpa. The examination of this wide range of temperatures and doses allows us to obtain an experimental data set that can be effectively used to exercise and benchmark the computer calculations of defect properties. Combining the examination of radiation

  16. Performance Evaluation of Various STL File Mesh Refining Algorithms Applied for FDM-RP Process

    Science.gov (United States)

    Ledalla, Siva Rama Krishna; Tirupathi, Balaji; Sriram, Venkatesh

    2018-06-01

    Layered manufacturing machines use the stereolithography (STL) file to build parts. When a curved surface is converted from a computer-aided design (CAD) file to STL, the result is geometrical distortion and chordal error. Parts manufactured from this file might not satisfy geometric dimensioning and tolerancing requirements due to the approximated geometry. Current algorithms built into CAD packages have export options to globally reduce this distortion, which leads to an increase in file size and pre-processing time. In this work, different mesh subdivision algorithms are applied to the STL file of a part with complex geometric features using MeshLab software. The mesh subdivision algorithms considered in this work are the modified butterfly subdivision technique, the Loop subdivision technique and the general triangular midpoint subdivision technique. A comparative study is made with respect to volume and build time using the above techniques. It is found that the triangular midpoint subdivision algorithm is the most suitable for the geometry under consideration. Only the wheel cap part is then manufactured on a Stratasys MOJO FDM machine. The surface roughness of the part is measured on a Talysurf surface roughness tester.
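
    The general triangular midpoint subdivision idea can be sketched as below: each facet is split into four triangles through its edge midpoints (projection of the new vertices back onto the CAD surface, which actually reduces the chordal error, is omitted). This is an illustration only, not the MeshLab implementation.

        # Split each triangle into four through its edge midpoints.
        def midpoint(p, q):
            return tuple((a + b) / 2.0 for a, b in zip(p, q))

        def subdivide(triangles):
            """triangles: list of (v0, v1, v2) vertex tuples; returns four times as many."""
            out = []
            for v0, v1, v2 in triangles:
                m01, m12, m20 = midpoint(v0, v1), midpoint(v1, v2), midpoint(v2, v0)
                out += [(v0, m01, m20), (v1, m12, m01), (v2, m20, m12), (m01, m12, m20)]
            return out

        tri = [((0, 0, 0), (1, 0, 0), (0, 1, 0))]
        print(len(subdivide(tri)))   # -> 4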

  17. Business and computing : Bridging the gap

    International Nuclear Information System (INIS)

    Gordon, B.; Coles, F.C.

    1999-01-01

    Information systems, in the form of paper files or electronic files on computer systems and digital storage devices are employed in the oil industry to handle data from 3 basic infrastructures: accounting, geotechnical, and administration. The accounting function was the main driving force behind the development of the computer. The geotechnical data storage and manipulation infrastructure has its basis in signal recording and processing related to seismic acquisition and well logging. The administrative infrastructure deals with documents and not just data. Management in the oil industry needs two main kinds of useful information: reports about their organization and about the marketplace. Using an example of an oil and gas enterprise whose aim is to pursue low cost shallow gas to increase production levels, the basic business process is shown to relate to land and prospect inventory management, tightly controlled drilling methods, gathering system and production facility standardization, logistics and planning models, and strong transportation and marketing management. The role of the computer in this process is to yield information, that is, to provide coordinated, integrated, useful information that facilitates the processes essential to accomplish the business's objectives

  18. JENDL Dosimetry File

    International Nuclear Information System (INIS)

    Nakazawa, Masaharu; Iguchi, Tetsuo; Kobayashi, Katsuhei; Iwasaki, Shin; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo.

    1992-03-01

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d, n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form. (author) 76 refs

  19. JENDL Dosimetry File

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Masaharu; Iguchi, Tetsuo [Tokyo Univ. (Japan). Faculty of Engineering; Kobayashi, Katsuhei [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Iwasaki, Shin [Tohoku Univ., Sendai (Japan). Faculty of Engineering; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1992-03-15

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d,n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form.

  20. Methods for the development of large computer codes under LTSS

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1977-06-01

    TRAC is a large computer code being developed by Group Q-6 for the analysis of the transient thermal hydraulic behavior of light-water nuclear reactors. A system designed to assist the development of TRAC is described. The system consists of a central HYDRA dataset, R6LIB, containing files used in the development of TRAC, and a file maintenance program, HORSE, which facilitates the use of this dataset

  1. Installation and management of the SPS and LEP control system computers

    International Nuclear Information System (INIS)

    Bland, Alastair

    1994-01-01

    Control of the CERN SPS and LEP accelerators and service equipment on the two CERN main sites is performed via workstations, file servers, Process Control Assemblies (PCAs) and Device Stub Controllers (DSCs). This paper describes the methods and tools that have been developed to manage the file servers, PCAs and DSCs since the LEP startup in 1989. There are five operational DECstation 5000s used as file servers and boot servers for the PCAs and DSCs. The PCAs consist of 90 SCO Xenix 386 PCs, 40 LynxOS 486 PCs and more than 40 older NORD 100s. The DSCs consist of 90 OS-968030 VME crates and 10 LynxOS 68030 VME crates. In addition there are over 100 development systems. The controls group is responsible for installing the computers, starting all the user processes and ensuring that the computers and the processes run correctly. The operators in the SPS/LEP control room and the Services control room have a Motif-based X window program which gives them, in real time, the state of all the computers and allows them to solve problems or reboot them. ((orig.))

  2. Gamma-irradiation of homodeoxyoligonucleotides 32P-labelled at one end: computer simulation of the chain length distribution of the radioactive fragments

    International Nuclear Information System (INIS)

    Teoule, R.; Duplaa, A.M.

    1987-01-01

    Electrophoresis on polyacrylamide gels of the fragments resulting from γ-irradiation of single-stranded oligodeoxyribonucleotides labelled at their 5'- or 3'-end proved a potent tool for analysis of the radiation-induced chain breakage of DNA. Owing to the fact that the oligonucleotide may be ruptured at more than one site, counting of the electrophoresis bands must be corrected and it is necessary to assess the influence of the cleavage position on the band intensities. A complicating factor is the inhomogeneity of the system due to the presence of the four bases A, T, C and G. To circumvent this problem, the homooligodeoxyribonucleotides (dA)₁₅, (dC)₁₅ and (dT)₁₅ were used as experimental probes. They were γ-irradiated in solution, heated in alkali and the resulting fragments separated by gel electrophoresis. A computer simulation of band intensities was compiled based on the general assumption that the chain breakage is homogeneous. Experimental results obtained from the homooligodeoxyribonucleotides labelled at either the 5'- or the 3'-end are in excellent agreement with theoretical calculations. Abacus giving the gel band intensities (percentage) against the nucleotide positions and the remaining intensity of the original oligonucleotide have been obtained. (author)
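
    A simplified numerical sketch of the homogeneous-breakage assumption is given below: for an end-labelled strand of n residues with an equal cleavage probability p per internucleotide bond, the labelled band of length k appears when bond k is cut and all bonds closer to the label survive. The value of p is illustrative, and the published model may include further corrections.

        # Expected labelled-band intensities under uniform cleavage probability p per bond.
        def band_intensities(n=15, p=0.05):
            """Return expected fractional intensity for labelled fragments of length k."""
            dist = {}
            for k in range(1, n):
                dist[k] = (1.0 - p) ** (k - 1) * p       # first cleavage occurs at bond k
            dist[n] = (1.0 - p) ** (n - 1)               # intact oligonucleotide
            return dist

        for length, fraction in band_intensities().items():
            print(f"{length:2d}  {fraction:.4f}")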

  3. Use of DBMS-10 for storage and retrieval of evaluated nuclear data files

    International Nuclear Information System (INIS)

    Dunford, C.L.

    1977-01-01

    The use of a data base management system (DBMS) for storage of, and retrieval from, the many scientific data bases maintained by the National Nuclear Data Center is currently being investigated. It would appear that a commercially available DBMS package would save the Center considerable money and manpower when adding new data files to the library and in the long-term maintenance of current data files. Current DBMS technology and experience with an internal DBMS system suggests an inherent inefficiency in processing large data networks where significant portions are accessed in a sequential manner. Such a file is the Evaluated Nuclear Data File (ENDF/B), which contains many large data tables, each one normally accessed in a sequential manner. After gaining some experience and success in small applications of the commercially available DBMS package, DBMS-10, on the Center's DECsystem-10 computer, it was decided to select a large data base as a test case before making a final decision on the implementation of DBMS-10 for all data bases. The obvious approach is to utilize the DBMS to index a random-access file. In this way one is able to increase the storage and retrieval efficiency at the one-time cost of additional programing effort. 2 figures

  4. Use of DBMS-10 for storage and retrieval of evaluated nuclear data files

    International Nuclear Information System (INIS)

    Dunford, C.L.

    1978-01-01

    The use of a data base management system (DBMS) for storage of, and retrieval from, the many scientific data bases maintained by the National Nuclear Data Center is currently being investigated. It would appear that a commercially available DBMS package would save the Center considerable money and manpower when adding new data files to our library and in the long-term maintenance of our current data files. Current DBMS technology and experience with our internal DBMS system suggests an inherent inefficiency in processing large data networks where significant portions are accessed in a sequential manner. Such a file is the Evaluated Nuclear Data File (ENDF/B) which contains many large data tables, each one normally accessed in a sequential manner. After gaining some experience and success in small applications of the commercially available DBMS package, DBMS-10, on the Center's DECsystem-10 computer, it was decided to select one of our large data bases as a test case before making a final decision on the implementation of DBMS-10 for all our data bases. The obvious approach is to utilize the DBMS to index a random access file. In this way one is able to increase the storage and retrieval efficiency at the one-time cost of additional programming effort

  5. DJFS: Providing Highly Reliable and High‐Performance File System with Small‐Sized NVRAM

    Directory of Open Access Journals (Sweden)

    Junghoon Kim

    2017-11-01

    Full Text Available File systems and applications try to implement their own update protocols to guarantee data consistency, which is one of the most crucial aspects of computing systems. However, we found that the storage devices are substantially under‐utilized when preserving data consistency because they generate massive storage write traffic with many disk cache flush operations and force‐unit‐access (FUA) commands. In this paper, we present DJFS (Delta‐Journaling File System) that provides both a high level of performance and data consistency for different applications. We made three technical contributions to achieve our goal. First, to remove all storage accesses with disk cache flush operations and FUA commands, DJFS uses small‐sized NVRAM for a file system journal. Second, to reduce the access latency and space requirements of NVRAM, DJFS attempts to journal the compressed differences in the modified blocks. Finally, to relieve explicit checkpointing overhead, DJFS aggressively reflects the checkpoint transactions to the file system area in units of a specified region. Our evaluation on a TPC‐C SQLite benchmark shows that, using our novel optimization schemes, DJFS outperforms Ext4 by up to 64.2 times with only 128 MB of NVRAM.
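
    A simplified sketch of the delta-journaling idea follows: journal only a compressed difference between the old and new contents of a modified block rather than the whole block. Byte-wise XOR plus zlib is an illustrative choice, not the scheme used by DJFS.

        # Journal a compressed delta of a modified block; replay it losslessly.
        import zlib

        def delta_record(old_block: bytes, new_block: bytes) -> bytes:
            xor = bytes(a ^ b for a, b in zip(old_block, new_block))
            return zlib.compress(xor)                    # small if few bytes changed

        def apply_delta(old_block: bytes, record: bytes) -> bytes:
            xor = zlib.decompress(record)
            return bytes(a ^ b for a, b in zip(old_block, xor))

        old = b"A" * 4096
        new = bytearray(old); new[100:104] = b"data"; new = bytes(new)
        rec = delta_record(old, new)
        print(len(rec), apply_delta(old, rec) == new)    # compact record, lossless replay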

  6. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    Science.gov (United States)

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  7. A File Archival System

    Science.gov (United States)

    Fanselow, J. L.; Vavrus, J. L.

    1984-01-01

    ARCH, file archival system for DEC VAX, provides for easy offline storage and retrieval of arbitrary files on DEC VAX system. System designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of the same programs and associated files.

  8. An algorithm to evaluate solar irradiance and effective dose rates using spectral UV irradiance at four selected wavelengths

    International Nuclear Information System (INIS)

    Anav, A.; Rafanelli, C.; Di Menno, I.; Di Menno, M.

    2004-01-01

    The paper shows a semi-analytical method for environmental and dosimetric applications to evaluate, in clear sky conditions, the solar irradiance and the effective dose rates for some action spectra using only four spectral irradiance values at selected wavelengths in the UV-B and UV-A regions (305, 320, 340 and 380 nm). The method, named WL4UV, is based on the reconstruction of an approximated spectral irradiance that can be integrated, to obtain the solar irradiance, or convoluted with an action spectrum to obtain an effective dose rate. The parameters required in the algorithm are deduced from archived solar spectral irradiance data. This database contains measurements carried out by some Brewer spectrophotometers located in various geographical positions, at similar altitudes, with very different environmental characteristics: Rome (Italy), Ny Aalesund (Svalbard Islands (Norway)) and Ushuaia (Tierra del Fuego (Argentina)). To evaluate the precision of the method, a double test was performed with data not used in developing the model. Archived Brewer measurement data, in clear sky conditions, from Rome and from the National Science Foundation UV data set in San Diego (CA, USA) and Ushuaia, where SUV 100 spectro-radiometers operate, were drawn randomly. The comparison of measured and computed irradiance has a relative deviation of about ±2%. The effective dose rates for action spectra of Erythema, DNA and non-Melanoma skin cancer have a relative deviation of less than ∼20% for solar zenith angles <50 deg.. (authors)
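
    The reconstruction-and-convolution idea can be sketched as follows, assuming log-linear interpolation between the four measured wavelengths and the CIE erythema action spectrum as the weighting function; the input irradiances are invented numbers and this is not the WL4UV parameterization.

        # Reconstruct an approximate UV spectrum from four wavelengths, weight it with
        # the CIE erythema action spectrum, and integrate to an effective dose rate.
        import numpy as np

        meas_wl = np.array([305.0, 320.0, 340.0, 380.0])       # nm
        meas_irr = np.array([5e-3, 3e-2, 6e-2, 9e-2])          # assumed values, W m-2 nm-1

        wl = np.arange(305.0, 380.5, 0.5)
        spectrum = np.exp(np.interp(wl, meas_wl, np.log(meas_irr)))   # log-linear reconstruction

        def erythema_action(l):                                # CIE reference action spectrum
            return np.where(l <= 298, 1.0,
                   np.where(l <= 328, 10 ** (0.094 * (298 - l)),
                                      10 ** (0.015 * (140 - l))))

        dose_rate = np.trapz(spectrum * erythema_action(wl), wl)   # erythemally weighted, W m-2
        print(f"erythemal dose rate ~ {dose_rate:.3e} W m-2")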

  9. Detection of irradiated spice in blend of irradiated and un-irradiated spices using thermoluminescence method

    International Nuclear Information System (INIS)

    Goto, Michiko; Yamazaki, Masao; Sekiguchi, Masayuki; Todoriki, Setsuko; Miyahara, Makoto

    2007-01-01

    Five blended spice samples were prepared by mixing irradiated and un-irradiated black pepper and paprika at different ratios. Blended black pepper containing 2% (w/w) of 5.4 kGy-irradiated black pepper showed no maximum at glow 1. Irradiated black pepper samples, mixed at 5 or 10% (w/w), were identified as 'irradiated', 'partially irradiated' or 'un-irradiated'. All samples with un-irradiated pepper up to 20% (w/w) were identified as 'irradiated'. When 5.0 kGy-irradiated paprika was mixed with un-irradiated paprika up to 5% (w/w), all samples were identified as 'irradiated'. The glow 1 curves of samples including irradiated paprika at 0.2% (w/w) or higher exhibited a maximum between 150 and 250°C. The results suggest the existence of a different critical mixing ratio for the detection of irradiation for each spice. Temperature ranges for integration of the TL glow intensity were compared between 70-400°C and approximately 150-250°C, and this revealed that the latter temperature range was determined based on the measurement of TLD100. Although the TL glow ratio in the 150-250°C range was lower than that of the 70-400°C range, identification of irradiation was not affected. Treatment of un-irradiated black pepper and paprika with ultraviolet rays had no effect on the detection of irradiation. (author)

  10. Single event upset threshold estimation based on local laser irradiation

    International Nuclear Information System (INIS)

    Chumakov, A.I.; Egorov, A.N.; Mavritsky, O.B.; Yanenko, A.V.

    1999-01-01

    An approach for the estimation of the ion-induced SEU threshold based on local laser irradiation is presented. Comparative experiments and software simulations were performed at various pulse durations and spot sizes. A correlation between the single event threshold LET and the upset threshold laser energy under local irradiation was found. A computer analysis of local laser irradiation of IC structures was developed for SEU threshold LET estimation, and the correlation of the local laser threshold energy with the SEU threshold LET was shown. Two estimation techniques were suggested. The first one is based on the determination of the local laser threshold dose, taking into account the ratio of the sensitive area to the locally irradiated area. The second technique uses the photocurrent peak value instead of this ratio. The agreement between the predicted and experimental results demonstrates the applicability of this approach. (authors)
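
    A minimal sketch of the first estimation technique described above follows: the measured laser threshold energy is rescaled by the ratio of the sensitive area to the irradiated spot area and then mapped to a threshold LET through a calibration constant. The constant, the geometry and all numbers are hypothetical placeholders, not values from the paper.

```python
import math

# Hypothetical calibration constant linking effective absorbed laser energy
# to an equivalent ion LET (units chosen arbitrarily for the sketch).
K_CAL = 5.0e3

def threshold_let_from_laser(e_laser_threshold_j: float,
                             spot_diameter_um: float,
                             sensitive_area_um2: float) -> float:
    """Rescale the local-laser threshold energy by the sensitive-area to
    spot-area ratio and convert it to a threshold LET estimate."""
    spot_area_um2 = math.pi * (spot_diameter_um / 2.0) ** 2
    # Fraction of the deposited laser energy that falls on the sensitive volume.
    effective_energy_j = e_laser_threshold_j * (sensitive_area_um2 / spot_area_um2)
    return K_CAL * effective_energy_j

# Example with made-up numbers: 2 nJ threshold energy, 10 um spot, 4 um^2 node.
print(threshold_let_from_laser(2.0e-9, spot_diameter_um=10.0, sensitive_area_um2=4.0))
```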

  11. PETRI NET MODELING OF COMPUTER VIRUS LIFE CYCLE

    African Journals Online (AJOL)

    Dr Obe

    dynamic system analysis is applied to model the virus life cycle. Simulation of the derived model ... Keywords: Virus lifecycle, Petri nets, modeling. simulation. .... complex process. Figure 2 .... by creating Matlab files for five different computer ...

  12. Comparative evaluation of debris extruded apically by using, Protaper retreatment file, K3 file and H-file with solvent in endodontic retreatment

    Directory of Open Access Journals (Sweden)

    Chetna Arora

    2012-01-01

    Full Text Available Aim: The aim of this study was to evaluate the apical extrusion of debris, comparing two engine-driven systems and a hand instrumentation technique during root canal retreatment. Materials and Methods: Forty-five human permanent mandibular premolars were prepared using the step-back technique and obturated with gutta-percha/zinc oxide eugenol sealer and the cold lateral condensation technique. The teeth were divided into three groups: Group A: ProTaper retreatment file, Group B: K3 file, Group C: H-file with tetrachloroethylene. All the canals were irrigated with 20 ml of distilled water during instrumentation. Debris extruded along with the irrigating solution during the retreatment procedure was carefully collected in preweighed Eppendorf tubes. The tubes were stored in an incubator for 5 days, placed in a desiccator and then re-weighed. The weight of dry debris was calculated by subtracting the weight of the tube before instrumentation from the weight of the tube after instrumentation. Data were analyzed using two-way ANOVA and post hoc tests. Results: There was a statistically significant difference in the apical extrusion of debris between hand instrumentation and the ProTaper retreatment file and K3 file. The difference in the amount of debris extruded by the ProTaper retreatment file and K3 file instrumentation techniques was not statistically significant. All three instrumentation techniques produced apically extruded debris and irrigant. Conclusion: The best way to minimize the extrusion of debris is by adopting a crown-down technique; therefore, the use of a rotary technique (ProTaper retreatment file, K3 file) is recommended.

  13. Development of a script for converting DICOM files to .TXT; Desenvolvimento de um script para conversao de arquivos DICOM para .TXT

    Energy Technology Data Exchange (ETDEWEB)

    Abrantes, Marcos E.S.; Oliveira, A.H. de, E-mail: marcosabrantes2003@yahoo.com.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear; Abrantes, R.C., E-mail: abrantes.rafa1@gmail.com [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Eletrica; Magalhaes, M.J., E-mail: mjuliano100@yahoo.com.br [Ambienttal Protecao Radiologica, Belo Horizonte, MG (Brazil)

    2014-07-01

    Background: with the increased use of computer simulation techniques for diagnosis or therapy in patients, the MCNP and SCMS software packages are being widely used. To use SCMS as a data entry interface for MCNP, it is necessary to transform DICOM images into text files. Objective: to produce a semi-automatic script that converts DICOM images generated by computed tomography or magnetic resonance to .txt within the IMAGEJ software. Methodology: this study was developed on the IMAGEJ software platform with an Intel Core 2 Duo computer with a 2.00 GHz CPU and 2.00 GB of RAM, on a 32-bit system. The script was developed in a text editor using the JAVA language. The script was inserted into IMAGEJ using the plug-in tool of this software. After this, a window opens asking for the path of the files that will be read, the first and last names of the DICOM files to be converted, and where the new files will be stored. Results: manual conversion of a cerebral computed tomography study with 600 DICOM images to .txt requires about 8 hours. Use of the script reduces the conversion time to about 12 minutes. Conclusion: the script demonstrates the ability to convert DICOM to .txt with a significant improvement in processing time.
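
    The thesis implements the conversion as an ImageJ plug-in written in Java; purely to illustrate the batch workflow, the sketch below shows an equivalent conversion in Python. The pydicom and numpy calls are standard, but the directory layout, file naming and the choice to dump raw stored pixel values are assumptions made for this sketch.

```python
from pathlib import Path

import numpy as np
import pydicom

def dicom_series_to_txt(src_dir: str, dst_dir: str) -> None:
    """Convert every DICOM slice in src_dir to a plain-text matrix of
    pixel values, one .txt file per slice."""
    out = Path(dst_dir)
    out.mkdir(parents=True, exist_ok=True)
    for dcm_path in sorted(Path(src_dir).glob("*.dcm")):
        ds = pydicom.dcmread(dcm_path)
        pixels = ds.pixel_array  # 2-D array of stored pixel values
        np.savetxt(out / (dcm_path.stem + ".txt"), pixels, fmt="%d")

# Example: convert a CT study of several hundred slices in one call
# (paths are placeholders).
# dicom_series_to_txt("/data/ct_head", "/data/ct_head_txt")
```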

  14. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, J; Sartirana, A

    2001-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on thei...

  15. Improving CMS data transfers among its distributed computing facilities

    CERN Document Server

    Flix, Jose

    2010-01-01

    CMS computing needs reliable, stable and fast connections among multi-tiered computing infrastructures. For data distribution, the CMS experiment relies on a data placement and transfer system, PhEDEx, managing replication operations at each site in the distribution network. PhEDEx uses the File Transfer Service (FTS), a low level data movement service responsible for moving sets of files from one site to another, while allowing participating sites to control the network resource usage. FTS servers are provided by Tier-0 and Tier-1 centres and are used by all computing sites in CMS, according to the established policy. FTS needs to be set up according to the Grid site's policies, and properly configured to satisfy the requirements of all Virtual Organizations making use of the Grid resources at the site. Managing the service efficiently requires good knowledge of the CMS needs for all kinds of transfer workflows. This contribution deals with a revision of FTS servers used by CMS, collecting statistics on the...

  16. RECOLA2: REcursive Computation of One-Loop Amplitudes 2

    Science.gov (United States)

    Denner, Ansgar; Lang, Jean-Nicolas; Uccirati, Sandro

    2018-03-01

    We present the Fortran95 program RECOLA2 for the perturbative computation of next-to-leading-order transition amplitudes in the Standard Model of particle physics and extended Higgs sectors. New theories are implemented via model files in the 't Hooft-Feynman gauge in the conventional formulation of quantum field theory and in the Background-Field method. The present version includes model files for the Two-Higgs-Doublet Model and the Higgs-Singlet Extension of the Standard Model. We support standard renormalization schemes for the Standard Model as well as many commonly used renormalization schemes in extended Higgs sectors. Within these models the computation of next-to-leading-order polarized amplitudes and squared amplitudes, optionally summed over spin and colour, is fully automated for any process. RECOLA2 allows the computation of colour- and spin-correlated leading-order squared amplitudes that are needed in the dipole subtraction formalism. RECOLA2 is publicly available for download at http://recola.hepforge.org.

  17. Computerization of radiology departmental management using the personal computer: application in medium sized hospital

    International Nuclear Information System (INIS)

    Ahn, Woo Hyun

    1992-01-01

    The author developed a computer program for use in registration, monthly statistics, printing of reports, and data storage and retrieval in a radiology department. This program has been used in the Department of Radiology, MoonHwa Hospital, since November 1990. The program was written in the FoxBASE language and consisted of two independent subprograms, one installed on a registration computer without a printer and the other on a reporting computer with a printer. The subprograms were designed to link their data by floppy disk. Each computer's hard disk contained permanent files for retrieval and temporary files for data input. All permanent files were indexed on several keywords, including the patient's identification data. 1. Registration was performed easily and rapidly. 2. Monthly statistics were obtained simply. 3. Retrieval of the results of previous radiologic studies, printing of reports, and storage and indexing of data were achieved automatically. The program had the merits of simple operation, large storage capacity, rapid retrieval, relatively low price, and easy adaptation for other hospitals. Therefore, this program is considered to be an economical means of computerizing radiology departmental management in medium-sized hospitals

  18. 76 FR 43679 - Filing via the Internet; Notice of Additional File Formats for efiling

    Science.gov (United States)

    2011-07-21

    ... list of acceptable file formats the four-character file extensions for Microsoft Office 2007/2010... files from Office 2007 or Office 2010 in an Office 2003 format prior to submission. Dated: July 15, 2011...

  19. Bulk and interface defects in electron irradiated InP

    International Nuclear Information System (INIS)

    Peng Chen; Sun Heng-hui

    1989-01-01

    Systematic studies of the structure of defects in InP caused by electron irradiation are conducted based on experimental measurements and theoretical calculations. The introduction rates and annealing-out temperatures of In and P vacancies are estimated using appropriate theoretical models. These calculations reveal that after room-temperature irradiation only complexes may exist. This is also supported by our experimental data, in which the sum of the introduction rates of the three detected levels is less than the theoretical value calculated for single vacancies. According to our equation on the relation between interface states and the DLTS signal, and from the results of computer calculation, we believe that the broad peak appearing in the DLTS diagram before irradiation is related to interface states. Its disappearance after electron irradiation suggests a reduction of interface states; this is further confirmed by the reduction of the surface recombination rate derived from the results of surface photovoltage measurements

  20. UPIN Group File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Group Unique Physician Identifier Number (UPIN) File is the business entity file that contains the group practice UPIN and descriptive information. It does NOT...

  1. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-01-06

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
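
    A toy sketch of the translation flow described above follows: extract interoperability data objects from the architecture file, pick the translation and transformation function associated with the chosen Model View Definition, and emit values in the form the target simulation tool expects. Every class, field and mapping name in it is invented for illustration; it is not the patented implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class InteropObject:
    """Interoperability data object holding extracted data and metadata."""
    element_id: str
    properties: Dict[str, float]

def extract(building_file: str) -> List[InteropObject]:
    # Placeholder: a real extractor would parse IFC entities here.
    return [InteropObject("wall-1", {"area_m2": 12.0, "u_value": 0.35})]

def to_energy_model(obj: InteropObject) -> dict:
    # Translation/transformation function selected via the MVD mapping:
    # converts extracted values into the fields the target tool needs.
    return {"id": obj.element_id,
            "UA_W_per_K": obj.properties["area_m2"] * obj.properties["u_value"]}

# Mapping from a Model View Definition name to its transformation function.
MVD_MAPPING: Dict[str, Callable[[InteropObject], dict]] = {
    "energy-simulation": to_energy_model,
}

def translate(building_file: str, mvd: str) -> List[dict]:
    transform = MVD_MAPPING[mvd]
    return [transform(o) for o in extract(building_file)]

print(translate("office.ifc", "energy-simulation"))
```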

  2. Irradiation effects test series test IE-1 test results report

    International Nuclear Information System (INIS)

    Quapp, W.J.; Allison, C.M.; Farrar, L.C.; Mehner, A.S.

    1977-03-01

    The report describes the results of the first programmatic test in the Nuclear Regulatory Commission Irradiation Effects Test Series. This test (IE-1) used four 0.97 m long PWR-type fuel rods fabricated from previously irradiated Saxton fuel. The objectives of this test were to evaluate the effect of fuel pellet density on pellet-cladding interaction during a power ramp and to evaluate the influence of the irradiated state of the fuel and cladding on rod behavior during film boiling operation. Data are presented on the behavior of irradiated fuel rods during steady-state operation, a power ramp, and film boiling operation. The effects of as-fabricated gap size, as-fabricated fuel density, rod power, and power ramp rate on pellet-cladding interaction are discussed. Test data are compared with FRAP-T2 computer model predictions, and comments on the consequences of sustained film boiling operation on irradiated fuel rod behavior are provided

  3. Neutron irradiation experiments for fusion reactor materials through JUPITER program

    International Nuclear Information System (INIS)

    Abe, K.; Namba, C.; Wiffen, F.W.; Jones, R.H.

    1998-01-01

    A Japan-USA program of irradiation experiments for fusion research, "JUPITER", has been established as a 6-year program from 1995 to 2000. The goal is to study "the dynamic behavior of fusion reactor materials and their response to variable and complex irradiation environment". This is phase three of the collaborative program, which follows the RTNS-II program (phase 1: 1982-1986) and the FFTF/MOTA program (phase 2: 1987-1994). The program is to provide a scientific basis for the application of materials performance data, generated by fission reactor experiments, to anticipated fusion environments. Following the systematic study of cumulative irradiation effects done through the FFTF/MOTA program, JUPITER emphasizes the importance of dynamic irradiation effects on materials performance in fusion systems. The irradiation experiments in this program include low-activation structural materials, functional ceramics and other innovative materials. The experimental data are analyzed by theoretical modeling and computer simulation to integrate the above effects. (orig.)

  4. Heat and radiation analysis of NPP Krsko irradiated fuel

    International Nuclear Information System (INIS)

    Lalovic, M.

    1986-01-01

    The radioactive and heat potentials of irradiated fuel in region 2, with a burnup of 13400 MWd/tHM, and in region 4A, with a burnup of 9360 MWd/tHM, were calculated for NPP Krsko. The computer code KORIGEN (Karlsruhe Oak Ridge Isotope Generation and Depletion Code) was used. The aspects of radiation (mainly gamma and neutrons) and of heat production were considered with respect to their impact on fuel handling and waste management. Isotopic concentrations for irradiated fuel were calculated and compared with Westinghouse data. (author)

  5. Industrial irradiation

    International Nuclear Information System (INIS)

    Stirling, Andrew

    1995-01-01

    Production lines for rubber gloves would not appear to have much in common with particle physics laboratories, but they both use accelerators. Electron beam irradiation is often used in industry to improve the quality of manufactured goods or to reduce production cost. Products range from computer disks, shrink packaging, tyres, cables, and plastics to hot water pipes. Some products, such as medical goods, cosmetics and certain foodstuffs, are sterilized in this way. In electron beam irradiation, electrons penetrate materials, creating showers of low-energy electrons. After many collisions these electrons have the correct energy to create chemically active sites. They may either break molecular bonds or activate a site which promotes a new chemical linkage. This industrial irradiation can be exploited in three ways: breaking down a biological molecule usually renders it useless and kills the organism; breaking an organic molecule can change its toxicity or function; and crosslinking a polymer can strengthen it. In addition to traditional gamma irradiation using isotopes, industrial irradiation uses three accelerator configurations, each type defining an energy range and consequently the electron penetration depth. For energies up to 750 kV, the accelerator consists of a DC potential applied to a simple wire anode, with the electrons extracted through a slot in a coaxially mounted cylindrical cathode. In the 1-5 MeV range, the Cockcroft-Walton or Dynamitron(R) accelerators are normally used. To achieve the high potentials in these DC accelerators, insulating SF6 gas and large-dimension vessels separate the anode and cathode; proprietary techniques distinguish the various commercial models available. Above 5 MeV, the size of DC accelerators renders them impractical, and more compact radiofrequency-driven linear accelerators are used. Irradiation electron beams are actually 'sprayed' over the product using a magnetic deflection system. Lower energy beams of

  6. 12 CFR 5.4 - Filing required.

    Science.gov (United States)

    2010-01-01

    ... CORPORATE ACTIVITIES Rules of General Applicability § 5.4 Filing required. (a) Filing. A depository institution shall file an application or notice with the OCC to engage in corporate activities and... advise an applicant through a pre-filing communication to send the filing or submission directly to the...

  7. Calculation of displacement and helium production at the LAMPF irradiation facility

    International Nuclear Information System (INIS)

    Davidson, D.R.; Greenwood, L.R.; Sommer, W.F.; Wechsler, M.S.

    1984-01-01

    Differential and total displacement and helium production rates are calculated for copper irradiated by spallation neutrons and 760 MeV protons at LAMPF. The calculations are performed using the SPECTER and VNMTC computer codes, the latter being specially designed for spallation radiation damage calculations. For comparison, similar SPECTER calculations are also described for irradiation of copper in EBR-II and RTNS-II. The results indicate substantial contributions to the displacement and helium production rates due to neutrons in the high-energy tail (above 40 MeV) of the LAMPF spallation neutron spectrum. Still higher production rates are calculated for irradiations in the direct proton beam. These results will provide useful background information for research to be conducted at a new irradiation facility at LAMPF

  8. Post-radiation nephritis. Study of the renal consequences of splenic irradiation for lymphoma

    International Nuclear Information System (INIS)

    Le Bourgeois, J.P.; Godefroy, D.; Di Paolo, M.; Parmentier, C.; Tubiana, M.

    1975-01-01

    The consequences of splenic irradiation on the left kidney were studied in 40 patients with lymphomas. The renal work-up, performed before irradiation and every six months afterwards, included blood pressure, biological tests, IVP and 197Hg-neohydrine renal scans. Computer processing of the scan data showed a partial dysfunction of the left kidney in 16 patients with an 18-month follow-up. Renal dysfunction appeared within 8 to 10 months following spleen irradiation. During that period no clinical or radiological abnormalities were observed [fr]

  9. Evaluation of Root Canal Preparation Using Rotary System and Hand Instruments Assessed by Micro-Computed Tomography

    Science.gov (United States)

    Stavileci, Miranda; Hoxha, Veton; Görduysus, Ömer; Tatar, Ilkan; Laperre, Kjell; Hostens, Jeroen; Küçükkaya, Selen; Muhaxheri, Edmond

    2015-01-01

    Background Complete mechanical preparation of the root canal system is rarely achieved. Therefore, the purpose of this study was to evaluate and compare the root canal shaping efficacy of ProTaper rotary files and standard stainless steel K-files using micro-computed tomography. Material/Methods Sixty extracted upper second premolars were selected and divided into 2 groups of 30 teeth each. Before preparation, all samples were scanned by micro-computed tomography. Thirty teeth were prepared with the ProTaper system and the other 30 with stainless steel files. After preparation, the untouched surface and root canal straightening were evaluated with micro-computed tomography. The percentage of untouched root canal surface was calculated in the coronal, middle, and apical parts of the canal. We also calculated straightening of the canal after root canal preparation. Results from the 2 groups were statistically compared using the Minitab statistical package. Results ProTaper rotary files left less untouched root canal surface compared with manual preparation in the coronal, middle, and apical sectors (p<0.05). Neither technique completely prepared the root canal, and both techniques caused slight straightening of the root canal. PMID:26092929

  10. Huygens file service and storage architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  11. Huygens File Service and Storage Architecture

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Stabell-Kulo, Tage; Stabell-Kulo, Tage

    1993-01-01

    The Huygens file server is a high-performance file server which is able to deliver multi-media data in a timely manner while also providing clients with ordinary “Unix” like file I/O. The file server integrates client machines, file servers and tertiary storage servers in the same storage

  12. Analysis On Security Of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Muhammad Zunnurain Hussain

    2017-01-01

    Full Text Available In this paper the author discusses the security issues and challenges faced by the industry in securing cloud computing, and how these problems can be tackled. Cloud computing is a modern technique for sharing resources, such as data sharing and file sharing: essentially the sharing of resources without launching one's own infrastructure, using third-party resources to avoid huge investment. It is very challenging these days to secure the communication between two users, although people use different encryption techniques.

  13. Apically extruded dentin debris by reciprocating single-file and multi-file rotary system.

    Science.gov (United States)

    De-Deus, Gustavo; Neves, Aline; Silva, Emmanuel João; Mendonça, Thais Accorsi; Lourenço, Caroline; Calixto, Camila; Lima, Edson Jorge Moreira

    2015-03-01

    This study aims to evaluate the apical extrusion of debris by two reciprocating single-file systems: WaveOne and Reciproc. A conventional multi-file rotary system was used as a reference for comparison. The hypotheses tested were (i) that the reciprocating single-file systems extrude more debris than the conventional multi-file rotary system and (ii) that the reciprocating single-file systems extrude similar amounts of dentin debris. After solid selection criteria were applied, 80 mesial roots of lower molars were included in the present study. The use of four different instrumentation techniques resulted in four groups (n = 20): G1 (hand-file technique), G2 (ProTaper), G3 (WaveOne), and G4 (Reciproc). The apparatus used to collect apically extruded debris was a typical double-chamber collector. Statistical analysis was performed for multiple comparisons. No significant difference was found in the amount of debris extruded between the two reciprocating systems. In contrast, the conventional multi-file rotary system group extruded significantly more debris than both reciprocating groups. The hand instrumentation group extruded significantly more debris than all other groups. The present results yielded favorable input for both reciprocating single-file systems, inasmuch as they showed improved control of apically extruded debris. Apical extrusion of debris has been studied extensively because of its clinical relevance, particularly since it may cause flare-ups, originated by the introduction of bacteria, pulpal tissue, and irrigating solutions into the periapical tissues.

  14. elPrep: High-Performance Preparation of Sequence Alignment/Map Files for Variant Calling.

    Directory of Open Access Journals (Sweden)

    Charlotte Herzeel

    Full Text Available elPrep is a high-performance tool for preparing sequence alignment/map files for variant calling in sequencing pipelines. It can be used as a replacement for SAMtools and Picard for preparation steps such as filtering, sorting, marking duplicates, reordering contigs, and so on, while producing identical results. What sets elPrep apart is its software architecture, which allows executing preparation pipelines by making only a single pass through the data, no matter how many preparation steps are used in the pipeline. elPrep is designed as a multithreaded application that runs entirely in memory, avoids repeated file I/O, and merges the computation of several preparation steps to significantly speed up the execution time. For example, for a preparation pipeline of five steps on a whole-exome BAM file (NA12878), we reduce the execution time from about 1 hour 40 minutes, when using a combination of SAMtools and Picard, to about 15 minutes when using elPrep, while utilising the same server resources, here 48 threads and 23 GB of RAM. For the same pipeline on whole-genome data (NA12878), elPrep reduces the runtime from 24 hours to less than 5 hours. As a typical clinical study may contain sequencing data for hundreds of patients, elPrep can remove several hundreds of hours of computing time, and thus substantially reduce analysis time and cost.
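
    To show why merging steps into a single pass saves work, the toy sketch below composes several per-record preparation steps and applies them while streaming through the records once, instead of writing an intermediate file after each step. The record layout and the two example steps are simplified placeholders, not elPrep's actual implementation (which also handles stateful steps such as sorting and duplicate marking).

```python
from typing import Callable, Iterable, Iterator, Optional

Record = dict
Step = Callable[[Record], Optional[Record]]   # a step returns None to drop a record

def filter_unmapped(rec: Record) -> Optional[Record]:
    """Drop records whose SAM flag has the 'unmapped' bit (0x4) set."""
    return None if rec["flag"] & 0x4 else rec

def clean_name(rec: Record) -> Optional[Record]:
    """Example of a field-rewriting preparation step."""
    rec["qname"] = rec["qname"].strip()
    return rec

def single_pass(records: Iterable[Record], steps: list[Step]) -> Iterator[Record]:
    """Apply all steps to each record during one pass over the input."""
    for rec in records:
        for step in steps:
            rec = step(rec)
            if rec is None:
                break
        else:
            yield rec

reads = [{"qname": " r1 ", "flag": 0}, {"qname": "r2", "flag": 4}]
print(list(single_pass(reads, [filter_unmapped, clean_name])))
```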

  15. 76 FR 52323 - Combined Notice of Filings; Filings Instituting Proceedings

    Science.gov (United States)

    2011-08-22

    .... Applicants: Young Gas Storage Company, Ltd. Description: Young Gas Storage Company, Ltd. submits tariff..., but intervention is necessary to become a party to the proceeding. The filings are accessible in the.... More detailed information relating to filing requirements, interventions, protests, and service can be...

  16. Hierarchical remote data possession checking method based on massive cloud files

    Directory of Open Access Journals (Sweden)

    Ma Haifeng

    2017-06-01

    Full Text Available Cloud storage services enable users to migrate their data and applications to the cloud, which saves on local data maintenance and brings great convenience to the users. But in cloud storage, the storage servers may not be fully trustworthy. How to verify the integrity of cloud data with lower overhead for users has become an increasingly pressing problem. Many remote data integrity protection methods have been proposed, but these methods authenticate cloud files one by one when verifying multiple files; therefore, the computation and communication overhead are still high. Aiming at this problem, a hierarchical remote data possession checking (H-RDPC) method is proposed, which can provide efficient and secure remote data integrity protection and can support dynamic data operations. This paper gives the algorithm descriptions, security analysis, and false negative rate analysis of H-RDPC. The security analysis and experimental performance evaluation results show that the proposed H-RDPC is efficient and reliable in verifying massive cloud files, and it has a 32–81% improvement in performance compared with RDPC.
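
    The sketch below illustrates, in a generic way, why a hierarchy reduces verification cost: per-file digests are aggregated into a hash tree so that a whole batch of cloud files can first be checked against a single stored root, and only a mismatching branch needs per-file inspection. This is a plain hash-tree illustration under that assumption, not the specific H-RDPC protocol or its tag scheme.

```python
import hashlib

def digest(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def root_of(digests: list[bytes]) -> bytes:
    """Combine a list of digests pairwise until a single root digest remains."""
    level = digests
    while len(level) > 1:
        pairs = [level[i:i + 2] for i in range(0, len(level), 2)]
        level = [digest(b"".join(p)) for p in pairs]
    return level[0]

# Owner side: remember one root value for a whole batch of cloud files.
files = [b"file-A contents", b"file-B contents", b"file-C contents"]
stored_root = root_of([digest(f) for f in files])

# Verification: one comparison covers the batch; drill down only on failure.
reported = [digest(f) for f in files]          # digests reported by the server
print("batch intact:", root_of(reported) == stored_root)
```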

  17. Using TRIGA Mark II research reactor for irradiation with thermal neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Kolšek, Aljaž, E-mail: aljaz.kolsek@gmail.com; Radulović, Vladimir, E-mail: vladimir.radulovic@ijs.si; Trkov, Andrej, E-mail: andrej.trkov@ijs.si; Snoj, Luka, E-mail: luka.snoj@ijs.si

    2015-03-15

    Highlights: • Monte Carlo N-Particle Transport Code was used to design and perform calculations. • Characterization of the TRIGA Mark II ex-core irradiation facilities was performed. • The irradiation device was designed in the TRIGA irradiation channel. • The use of the device improves the fraction of thermal neutron flux by 390%. - Abstract: Recently a series of test irradiations was performed at the JSI TRIGA Mark II reactor for the Fission Track-Thermoionization Mass Spectrometry (FT-TIMS) method, which requires a well thermalized neutron spectrum for sample irradiation. For this purpose the Monte Carlo N-Particle Transport Code (MCNP5) was used to computationally support the design of an irradiation device inside the TRIGA model and to support the actual measurements by calculating the neutron fluxes inside the major ex-core irradiation facilities. The irradiation device, filled with heavy water, was designed and optimized inside the Thermal Column and the additional moderation was placed inside the Elevated Piercing Port. The use of the device improves the ratio of thermal neutron flux to the sum of epithermal and fast neutron flux inside the Thermal Column Port by 390% and achieves the desired thermal neutron fluence of 10^15 neutrons/cm^2 in an irradiation time of 20 h.

  18. MCNP Variance Reduction technique application for the Development Of the Citrusdal Irradiation Facility

    International Nuclear Information System (INIS)

    Makgae, R.

    2008-01-01

    A private company, Citrus Research International (CIR), is intending to construct an insect irradiation facility for the irradiation of insects for pest management in the south-western region of South Africa. The facility will employ a Co-60 cylindrical source in the chamber. An adequate thickness for the concrete shielding walls, and the ability of the labyrinth leading to the irradiation chamber to attenuate radiation to dose rates that are acceptably low, were determined. Two MCNP variance reduction techniques were applied to accommodate the two pathways: deep penetration, to evaluate the radiological impact outside the 150 cm concrete walls, and streaming of gamma photons through the labyrinth. The point-kernel based MicroShield software was used in the deep penetration calculations for the walls around the source room to test its accuracy, and the results obtained are in good agreement, with about a 15-20% difference. The dose rate mapping due to radiation streaming along the labyrinth to the facility entrance is also to be validated with the Attila code, which is a deterministic code that solves the discrete ordinates approximation. (authors)

  19. MCNP Variance Reduction technique application for the Development Of the Citrusdal Irradiation Facility

    Energy Technology Data Exchange (ETDEWEB)

    Makgae, R. [Pebble Bed Modular Reactor (PBMR), P.O. Box 9396, Centurion (South Africa)

    2008-07-01

    A private company, Citrus Research International (CIR), is intending to construct an insect irradiation facility for the irradiation of insects for pest management in the south-western region of South Africa. The facility will employ a Co-60 cylindrical source in the chamber. An adequate thickness for the concrete shielding walls, and the ability of the labyrinth leading to the irradiation chamber to attenuate radiation to dose rates that are acceptably low, were determined. Two MCNP variance reduction techniques were applied to accommodate the two pathways: deep penetration, to evaluate the radiological impact outside the 150 cm concrete walls, and streaming of gamma photons through the labyrinth. The point-kernel based MicroShield software was used in the deep penetration calculations for the walls around the source room to test its accuracy, and the results obtained are in good agreement, with about a 15-20% difference. The dose rate mapping due to radiation streaming along the labyrinth to the facility entrance is also to be validated with the Attila code, which is a deterministic code that solves the discrete ordinates approximation. (authors)

  20. Study and development of a document file system with selective access; Etude et realisation d'un systeme de fichiers documentaires a acces selectif

    Energy Technology Data Exchange (ETDEWEB)

    Mathieu, Jean-Claude

    1974-06-21

    The objective of this research thesis was to design and develop a set of software aimed at the efficient management of a document file system using methods of selective access to information. Thus, the three main aspects of file processing (creation, modification, reorganisation) have been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of the INIS bibliographic records provided by the IAEA which were used to create the files. In the second part, he briefly describes the general organisation of the file system. This system is based on the use of two main files: an inverse file, which contains for each descriptor the list of numbers of the files indexed by this descriptor, and a dictionary of descriptors, or input file, which gives access to the inverse file. The organisation of these two files is then described in detail. Other related or associated files are created, and the overall architecture and mechanisms integrated into the file data input software are described, as well as the various processing operations applied to these different files. Performance and possible developments are finally discussed.
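
    A compact sketch of the two-file organisation described above follows: a dictionary (input file) maps each descriptor to the position of its posting list in the inverse file, and a selective-access query intersects the posting lists of the requested descriptors. The in-memory Python structures stand in for what the thesis implements as disk files.

```python
from collections import defaultdict

def build_index(records: dict[int, set[str]]):
    """Build the inverse file (posting lists) and the descriptor dictionary."""
    postings = defaultdict(list)
    for rec_no, descriptors in records.items():
        for d in descriptors:
            postings[d].append(rec_no)
    # Inverse file: the posting lists themselves; dictionary (input file):
    # descriptor -> position of its list inside the inverse file.
    inverse_file = list(postings.values())
    dictionary = {d: i for i, d in enumerate(postings)}
    return dictionary, inverse_file

def search(dictionary, inverse_file, *descriptors):
    """Selective access: intersect the posting lists of the requested descriptors."""
    lists = [set(inverse_file[dictionary[d]]) for d in descriptors]
    return sorted(set.intersection(*lists))

records = {1: {"reactor", "fuel"}, 2: {"reactor", "shielding"}, 3: {"fuel"}}
dictionary, inverse_file = build_index(records)
print(search(dictionary, inverse_file, "reactor", "fuel"))   # -> [1]
```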

  1. Development of a photonuclear activation file and measurement of delayed neutron spectra; Creation d'une bibliotheque d'activation photonucleaire et mesures de spectres d'emission de neutrons retardes

    Energy Technology Data Exchange (ETDEWEB)

    Giacri-Mauborgne, M.L

    2005-11-15

    This thesis work consists of two parts. The first part describes the creation of a photonuclear activation file which will be used to calculate photonuclear activation. To build this file we used different data sources: evaluations, but also calculations performed with several cross-section codes (HMS-ALICE, GNASH, ABLA). This file contains photonuclear activation cross sections for more than 600 nuclides, fission fragment distributions for 30 actinides at three different Bremsstrahlung energies, and the associated delayed neutron spectra. These spectra are not in good agreement with experimental data. That is why we decided to launch measurements of delayed neutron spectra from photofission. The second part of this thesis consists in demonstrating the possibility of performing such measurements at the ELSA accelerator facility. To that purpose, we developed the detection system, the acquisition system and the analysis method for such spectra. These were tested by measuring the delayed neutron spectrum of uranium-238 after irradiation in a 2 MeV neutron flux. Finally, we measured the delayed neutron spectrum of uranium-238 after irradiation in a 15 MeV Bremsstrahlung flux. We compare our results with experimental data. The experiment allowed us to improve the value of ν̄_p with an absolute uncertainty below 7%; we propose ν̄_p = (3.03 ± 0.02) n/100 fissions, and to correct Nikotin's parameters for the six-group representation. In particular, we have improved the data concerning the sixth group by taking into account results from different irradiation times.

  2. Locking Editor A Utility For Protecting Software Exercises In The Computer Laboratory Of AMA University

    Directory of Open Access Journals (Sweden)

    Paul M. Grafilon

    2017-07-01

    Full Text Available The persistence of AMA University students in computing holds the key to providing the talent needed to fill the computing professions. A range of factors can affect a student's decision to remain in a computing major or change to another major if they find computing education difficult. This paper describes the activities in the computer laboratory, specifically exercises, machine problems and computing case studies carried out with different application programs, as the basis of students' skills and knowledge in programming. These activities are addressed by using an open-source IDE for all programming applications, together with specific interventions such as using the editor to create a source file into which code blocks, comments and program statements are entered and then saved. No corrective action is taken by the editor, as it does not know whether the file is supposed to be a source file as opposed to notes for class. If working in a position-dependent language like Java, the developer has to be very careful about indenting. The file has to be saved with the correct file extension and in a directory where the compiler can find it. Each source file has to be compiled separately; if the program has several source files, they all have to be named separately in the compiler. When invoking the compiler, it has to be directed to look in the correct directory for the source files and told where the output files should be stored. If there is an error in a source file, the compiler outputs messages and fails to complete. For any errors, the developer goes back and edits the source file, working from line numbers and compiler messages to fix the problems, and these steps continue until all the source files compile without errors. When linking, each object file is specified as being part of the build. Again, the locations of the object files and the executable are given. There may be errors at this point

  3. Efficacy of Twisted File Adaptive, Reciproc and ProTaper Universal Retreatment instruments for root-canal-filling removal: A cone-beam computed tomography study.

    Science.gov (United States)

    Akbulut, Makbule Bilge; Akman, Melek; Terlemez, Arslan; Magat, Guldane; Sener, Sevgi; Shetty, Heeresh

    2016-01-01

    The aim of this study was to evaluate the efficacy of Twisted File (TF) Adaptive, Reciproc, and ProTaper Universal Retreatment (UR) System instruments for removing root canal filling. Sixty single-rooted teeth were decoronated, instrumented and obturated. Preoperative CBCT scans were taken and the teeth were retreated with TF Adaptive, Reciproc, ProTaper UR, or hand files (n=15). Then the teeth were rescanned, and the percentage volume of the residual root-canal-filling material was established. The total time for retreatment was recorded, and the data were statistically analyzed. The statistical ranking of the residual filling material volume was as follows: hand file = TF Adaptive > ProTaper UR = Reciproc. The ProTaper UR and Reciproc systems required shorter periods of time for retreatment. Root canal filling was more efficiently removed by using Reciproc and ProTaper UR instruments than TF Adaptive instruments and hand files. The TF Adaptive system was advantageous over hand files with regard to operating time.

  4. NJOY99, Data Processing System of Evaluated Nuclear Data Files ENDF Format

    International Nuclear Information System (INIS)

    2000-01-01

    1 - Description of program or function: The NJOY nuclear data processing system is a modular computer code used for converting evaluated nuclear data in the ENDF format into libraries useful for applications calculations. Because the Evaluated Nuclear Data File (ENDF) format is used all around the world (e.g., ENDF/B-VI in the US, JEF-2.2 in Europe, JENDL-3.2 in Japan, BROND-2.2 in Russia), NJOY gives its users access to a wide variety of the most up-to-date nuclear data. NJOY provides comprehensive capabilities for processing evaluated data, and it can serve applications ranging from continuous-energy Monte Carlo (MCNP), through deterministic transport codes (DANT, ANISN, DORT), to reactor lattice codes (WIMS, EPRI). NJOY handles a wide variety of nuclear effects, including resonances, Doppler broadening, heating (KERMA), radiation damage, thermal scattering (even cold moderators), gas production, neutrons and charged particles, photo-atomic interactions, self shielding, probability tables, photon production, and high-energy interactions (to 150 MeV). Output can include printed listings, special library files for applications, and Postscript graphics (plus color). More information on NJOY is available from the developer's home page at http://t2.lanl.gov/tour/tourbus.html. Follow the Tourbus section of the Tour area to find notes from the ICTP lectures held at Trieste in March 2000 on the ENDF format and on the NJOY code. NJOY contains the following modules: NJOY directs the flow of data through the other modules and contains a library of common functions and subroutines used by the other modules. RECONR reconstructs pointwise (energy-dependent) cross sections from ENDF resonance parameters and interpolation schemes. BROADR Doppler broadens and thins pointwise cross sections. UNRESR computes effective self-shielded pointwise cross sections in the unresolved energy range. HEATR generates pointwise heat production cross sections (KERMA coefficients) and radiation

  5. Evaluated neutronic file for indium

    International Nuclear Information System (INIS)

    Smith, A.B.; Chiba, S.; Smith, D.L.; Meadows, J.W.; Guenther, P.T.; Lawson, R.D.; Howerton, R.J.

    1990-01-01

    A comprehensive evaluated neutronic data file for elemental indium is documented. This file, extending from 10^-5 eV to 20 MeV, is presented in the ENDF/B-VI format, and contains all neutron-induced processes necessary for the vast majority of neutronic applications. In addition, an evaluation of the 115In(n,n')116mIn dosimetry reaction is presented as a separate file. Attention is given to quantitative values, with corresponding uncertainty information. These files have been submitted for consideration as a part of the ENDF/B-VI national evaluated-file system. 144 refs., 10 figs., 4 tabs

  6. Distributed Computing for the Pierre Auger Observatory

    International Nuclear Information System (INIS)

    Chudoba, J.

    2015-01-01

    The Pierre Auger Observatory operates the largest system of detectors for ultra-high energy cosmic ray measurements. Comparison of theoretical models of interactions with recorded data requires thousands of computing cores for Monte Carlo simulations. Since 2007, distributed resources connected via the EGI grid have been successfully used. The first and second versions of the production system, based on bash scripts and a MySQL database, were able to submit jobs to all reliable sites supporting the Virtual Organization auger. For many years VO auger has belonged to the top ten EGI users based on total computing time used. Migration of the production system to the DIRAC interware started in 2014. Pilot jobs improve the efficiency of computing jobs and eliminate problems with small and less reliable sites used for the bulk production. The new system also has the possibility of using available resources in clouds. The DIRAC File Catalog replaced the LFC for new files, which are organized in datasets defined via metadata. CVMFS has been used for software distribution since 2014. In the presentation we give a comparison of the old and the new production systems and report on the experience of migrating to the new system. (paper)

  7. Distributed Computing for the Pierre Auger Observatory

    Science.gov (United States)

    Chudoba, J.

    2015-12-01

    The Pierre Auger Observatory operates the largest system of detectors for ultra-high energy cosmic ray measurements. Comparison of theoretical models of interactions with recorded data requires thousands of computing cores for Monte Carlo simulations. Since 2007, distributed resources connected via the EGI grid have been successfully used. The first and second versions of the production system, based on bash scripts and a MySQL database, were able to submit jobs to all reliable sites supporting the Virtual Organization auger. For many years VO auger has belonged to the top ten EGI users based on total computing time used. Migration of the production system to the DIRAC interware started in 2014. Pilot jobs improve the efficiency of computing jobs and eliminate problems with small and less reliable sites used for the bulk production. The new system also has the possibility of using available resources in clouds. The DIRAC File Catalog replaced the LFC for new files, which are organized in datasets defined via metadata. CVMFS has been used for software distribution since 2014. In the presentation we give a comparison of the old and the new production systems and report on the experience of migrating to the new system.

  8. FHEO Filed Cases

    Data.gov (United States)

    Department of Housing and Urban Development — The dataset is a list of all the Title VIII fair housing cases filed by FHEO from 1/1/2007 - 12/31/2012 including the case number, case name, filing date, state and...

  9. How Hedstrom files fail during clinical use? A retrieval study based on SEM, optical microscopy and micro-XCT analysis.

    Science.gov (United States)

    Zinelis, Spiros; Al Jabbari, Youssef S

    2018-05-01

    This study was conducted to evaluate the failure mechanism of clinically failed Hedstrom (H)-files. Discarded H-files (n=160) from #8 to #40 ISO sizes were collected from different dental clinics. The retrieved files were classified according to their macroscopic appearance and were investigated under scanning electron microscopy (SEM) and X-ray micro-computed tomography (micro-XCT). The files were then embedded in resin along their longitudinal axis and, after metallographic grinding and polishing, studied under an incident light microscope. The macroscopic evaluation showed that small ISO sizes (#08-#15) failed by extensive plastic deformation, while larger sizes (≥#20) tended to fracture. Light microscopy and micro-XCT results coincided, showing that unused and plastically deformed files were free of internal defects, while fractured files demonstrated intense cracking in the flute region. SEM analysis revealed the presence of striations attributed to a fatigue mechanism. Secondary cracks were also identified by optical microscopy, and their distribution was correlated to fatigue under bending loading. The experimental results demonstrated that while overloading of the cutting instruments is the predominant failure mechanism for small file sizes (#08-#15), fatigue should be considered the fracture mechanism for larger sizes (≥#20).

  10. ITP Adjuster 1.0: A New Utility Program to Adjust Charges in the Topology Files Generated by the PRODRG Server

    Directory of Open Access Journals (Sweden)

    Diogo de Jesus Medeiros

    2013-01-01

    Full Text Available The computation of accurate atomic charges for the GROMACS topology *.itp files of small molecules generated by the PRODRG server has been a tricky task, because the server does not calculate atomic charges using an ab initio method. Usually, additional steps of structure optimization and charge calculation are needed, followed by a tedious manual replacement of the atomic charges in the *.itp file. In order to assist with this task, we report here ITP Adjuster 1.0, a utility program developed to perform the replacement of the PRODRG charges in the *.itp files of small molecules by ab initio charges.
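
    A minimal sketch of the replacement step that such a utility automates is shown below: walk the [ atoms ] section of a PRODRG-generated .itp file and overwrite the charge column with ab initio charges supplied in atom order. It assumes the standard GROMACS [ atoms ] column layout (charge in the seventh field) and is not the ITP Adjuster source code itself.

```python
def replace_itp_charges(itp_in: str, itp_out: str, new_charges: list[float]) -> None:
    """Rewrite the charge column of the [ atoms ] section with new values."""
    out_lines, in_atoms, i = [], False, 0
    with open(itp_in) as fh:
        for line in fh:
            stripped = line.strip()
            if stripped.startswith("["):
                # Track whether we are inside the [ atoms ] section.
                in_atoms = stripped.replace(" ", "") == "[atoms]"
            elif in_atoms and stripped and not stripped.startswith(";"):
                fields = line.split()
                fields[6] = f"{new_charges[i]:8.4f}"   # 7th column: charge
                i += 1
                line = " ".join(fields) + "\n"
            out_lines.append(line)
    with open(itp_out, "w") as fh:
        fh.writelines(out_lines)

# Hypothetical usage: charges_from_qm is a list of ab initio charges in atom order.
# replace_itp_charges("ligand.itp", "ligand_fixed.itp", charges_from_qm)
```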

  11. Common Sense Planning for a Computer, or, What's It Worth to You?

    Science.gov (United States)

    Crawford, Walt

    1984-01-01

    Suggests factors to be considered in planning for the purchase of a microcomputer, including budgets, benefits, costs, and decisions. Major uses of a personal computer are described--word processing, financial analysis, file and database management, programming and computer literacy, education, entertainment, and thrill of high technology. (EJS)

  12. COMPUTING SERVICES DURING THE ANNUAL CERN SHUTDOWN

    CERN Multimedia

    2001-01-01

    As in previous years, computing services run by IT division will be left running unattended during the annual shutdown. The following points should be noted. No interruptions are scheduled for local and wide area networking and the ACB, e-mail and unix interactive services. Unix batch services will be available but without access to manually mounted tapes. Dedicated Engineering services, general purpose database services and the Helpdesk will be closed during this period. An operator service will be maintained and can be reached at extension 75011 or by Email to computer.operations@cern.ch. Users should be aware that, except where there are special arrangements, any major problems that develop during this period will most likely be resolved only after CERN has reopened. In particular, we cannot guarantee backups for Home Directory files (for Unix or Windows) or for email folders. Any changes that you make to your files during this period may be lost in the event of a disk failure. Please note that all t...

  13. Fast processing the film data file

    International Nuclear Information System (INIS)

    Abramov, B.M.; Avdeev, N.F.; Artemov, A.V.

    1978-01-01

    The problems of processing images obtained from the three-meter magnetic spectrometer on a new PSP-2 automatic device are considered. A detailed description is given of the filtration program, which controls the correctness of the connection line operation as well as the scanning parameters and the technical quality of the information. The filtration process can be subdivided into the following main stages: search for fiducial marks; binding of tracks to fiducial marks; plotting of track fragments in the chambers from sparks. The BESM-6 computer has been chosen for filtration purposes. The complex of filtration programs is shaped as a RAM-file, and the required version of the program is collected by the PATCHY program. The subprograms performing the greater part of the calculations are written in the MADLEN autocode, and the rest of the subprograms in FORTRAN and ALGOL. The filtration time for one image amounts to 1.2-2 s of calculation. The BESM-6 computer processes up to 12 thousand images a day

  14. Shielding of a neutron irradiator with {sup 241}Am-Be source

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, K.A.M. de; Crispim, V.R.; Silva, A.X., E-mail: koliveira@con.ufrj.b, E-mail: verginia@con.ufrj.b, E-mail: ademir@con.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Engenharia Nuclear; Fonseca, E.S., E-mail: evaldo@ird.gov.b [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2011-07-01

    The equivalent dose rates at 1.0 cm from the outer surface of the shielding of a neutron irradiation system that uses a 241Am-Be source with an activity of 185 GBq (5 Ci) were determined. A theoretical-experimental approach, including case studies through computer simulations with the MCNP code, was employed to calculate the best shielding thickness. Following the construction of the neutron irradiator, dose measurements were conducted in order to validate the data obtained from the simulation. The neutron irradiator shielding was designed in such a way as to allow transport of the neutron radiography system for in loco inspections, ensuring workers' radiological safety. (author)

  15. Determination of neutron flux distribution in an Am-Be irradiator using the MCNP.

    Science.gov (United States)

    Shtejer-Diaz, K; Zamboni, C B; Zahn, G S; Zevallos-Chávez, J Y

    2003-10-01

    A neutron irradiator has been assembled at IPEN facilities to perform qualitative-quantitative analysis of many materials using thermal and fast neutrons outside the nuclear reactor premises. To establish the prototype specifications, the neutron flux distribution and the absorbed dose rates were calculated using the MCNP computer code. These theoretical predictions then allow one to discuss the optimum irradiator design and its performance.

  16. 76 FR 62092 - Filing Procedures

    Science.gov (United States)

    2011-10-06

    ... INTERNATIONAL TRADE COMMISSION Filing Procedures AGENCY: International Trade Commission. ACTION: Notice of issuance of Handbook on Filing Procedures. SUMMARY: The United States International Trade Commission (``Commission'') is issuing a Handbook on Filing Procedures to replace its Handbook on Electronic...

  17. 12 CFR 1780.9 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Filing of papers. 1780.9 Section 1780.9 Banks... papers. (a) Filing. Any papers required to be filed shall be addressed to the presiding officer and filed... Director or the presiding officer. All papers filed by electronic media shall also concurrently be filed in...

  18. Computer Software for Life Cycle Cost.

    Science.gov (United States)

    1987-04-01

    Air Command and Staff College student report, "Computer Software for Life Cycle Cost" (scanned cover-page text partially illegible). "...obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually...

  19. Desk-top computer assisted processing of thermoluminescent dosimeters

    International Nuclear Information System (INIS)

    Archer, B.R.; Glaze, S.A.; North, L.B.; Bushong, S.C.

    1977-01-01

    An accurate dosimetric system utilizing a desk-top computer and high-sensitivity ribbon-type TLDs has been developed. The system incorporates an exposure history file and procedures designed for constant spatial orientation of each dosimeter. Processing of the information is performed by two computer programs. The first calculates relative response factors to ensure that the corrected response of each TLD is identical following a given dose of radiation. The second program computes a calibration factor and uses it, together with the relative response factor, to determine the actual dose registered by each TLD. (U.K.)
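
    A compact sketch of the two processing steps described above is given below: a batch of TLDs exposed to the same known dose yields a relative response factor per dosimeter, and a calibration factor then converts corrected readings into dose. The variable names, the use of the batch mean as the reference, and all numbers are illustrative assumptions, not the original programs.

```python
def relative_response_factors(readings_same_dose: list[float]) -> list[float]:
    """Factors that make each corrected reading equal to the batch mean
    when all TLDs received the same dose."""
    mean = sum(readings_same_dose) / len(readings_same_dose)
    return [mean / r for r in readings_same_dose]

def doses(readings: list[float], factors: list[float],
          cal_dose: float, cal_reading: float) -> list[float]:
    """Convert corrected readings to dose via a calibration factor."""
    cal_factor = cal_dose / cal_reading          # dose per unit corrected reading
    return [r * f * cal_factor for r, f in zip(readings, factors)]

# Hypothetical readings: first the calibration exposure, then a field batch.
factors = relative_response_factors([98.0, 102.0, 100.0, 95.0])
print(doses([51.0, 49.5, 50.2, 47.9], factors, cal_dose=1.0, cal_reading=50.0))
```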

  20. The Jade File System. Ph.D. Thesis

    Science.gov (United States)

    Rao, Herman Chung-Hwa

    1991-01-01

    File systems have long been the most important and most widely used form of shared permanent storage. File systems in traditional time-sharing systems, such as Unix, support a coherent sharing model for multiple users. Distributed file systems implement this sharing model in local area networks. However, most distributed file systems fail to scale from local area networks to an internet. Four characteristics of scalability were recognized: size, wide area, autonomy, and heterogeneity. Owing to size and wide area, techniques such as broadcasting, central control, and central resources, which are widely adopted by local area network file systems, are not adequate for an internet file system. An internet file system must also support the notion of autonomy because an internet is made up of a collection of independent organizations. Finally, heterogeneity is the nature of an internet file system, not only because of its size, but also because of the autonomy of the organizations in an internet. The Jade File System, which provides a uniform way to name and access files in the internet environment, is presented. Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Because of autonomy, Jade is designed under the restriction that the underlying file systems may not be modified. In order to avoid the complexity of maintaining an internet-wide, global name space, Jade permits each user to define a private name space. In Jade's design, we pay careful attention to avoiding unnecessary network messages between clients and file servers in order to achieve acceptable performance. Jade's name space supports two novel features: (1) it allows multiple file systems to be mounted under one directory; and (2) it permits one logical name space to mount other logical name spaces. A prototype of Jade was implemented to examine and validate its

  1. The Galley Parallel File System

    Science.gov (United States)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.

  2. Irradiation probe and laboratory for irradiated material evaluation

    International Nuclear Information System (INIS)

    Smutny, S.; Kupca, L.; Beno, P.; Stubna, M.; Mrva, V.; Chmelo, P.

    1975-09-01

    The survey and assessment are given of the tasks carried out in the years 1971 to 1975 within the development of methods for structural materials irradiation and of a probe for the irradiation thereof in the A-1 reactor. The programme and implementation of laboratory tests of the irradiation probe are described. In the actual reactor irradiation, the pulse tube length between the pressure governor and the irradiation probe is approximately 20 m, the diameter is 2.2 mm. Temperature reaches 800 degC while the pressure control system operates at 20 degC. The laboratory tests (carried out at 20 degC) showed that the response time of the pressure control system to a stepwise pressure change in the irradiation probe from 0 to 22 at. is 0.5 s. Pressure changes were also studied in the irradiation probe and in the entire system resulting from temperature changes in the irradiation probe. Temperature distribution in the body of the irradiation probe heating furnace was determined. (B.S.)

  3. Gas-liquid flow field in agitated vessels

    International Nuclear Information System (INIS)

    Hormazi, F.; Alaie, M.; Dabir, B.; Ashjaie, M.

    2001-01-01

    Agitated vessels in the form of stirred tank reactors and mixed fermentors are used in large numbers across industry. It is important to develop good, theoretically sound models for the scale-up and design of agitated vessels. In this article, two-phase (gas-liquid) flow in an agitated vessel has been investigated numerically. A two-dimensional computational fluid dynamics model is used to predict the gas-liquid flow. The effects of the gas phase, varying gas flow rates and variation of bubble shape on the flow field of the liquid phase are investigated. The numerical results are verified against the experimental data

  4. Effective preoperative irradiation of highly vascular cerebellopontine angle neurinomas

    International Nuclear Information System (INIS)

    Ikeda, K.; Ito, H.; Kashihara, K.; Fujisawa, H.; Yamamoto, S.

    1988-01-01

    Three cases of large cerebellopontine angle neurinoma with marked vascularity and tumor staining on the angiogram were treated with effective preoperative irradiation. The radiotherapy was given before the second operation in two cases and before the first operation in the other case. Irradiation doses administered with a linear accelerator were 2.34 to 3.0 Gy for 3 to 3.5 weeks, and radical operations were done 1.5 to 2 months after irradiation. After the irradiation, vertebral angiography showed moderate to marked decrease of the hypervascular capsular stain and disappearance of the early draining vein. Computed tomographic scan showed enlargement of the central necrotic area within the heterogeneously enhanced tumor, which was unchanged in size. Radical operations, which had been impossible because of uncontrollable massive bleeding, were successful without any intraoperative bleeding after radiotherapy. Postirradiation radiological findings corresponded well with those of histopathological examination, which showed decrease in cellularity and in vascularity and diffuse coagulation necrosis around the collapsed tumoral vessels as radiation effects. Preoperative irradiation of the hypervascular neurinoma was thought to facilitate radical surgery by abolishing or diminishing the risk of intraoperative bleeding

  5. Computer simulation of ion recombination in irradiated nonpolar liquids

    International Nuclear Information System (INIS)

    Bartczak, W.M.; Hummel, A.

    1986-01-01

    A review of the results of computer simulation of the diffusion-controlled recombination of ions is presented. The ions generated in clusters of two and three pairs of oppositely charged ions were considered. The recombination kinetics and the ion escape probability at infinite time, with and without an external electric field, were computed. These results are compared with the calculations based on the single-pair theory. (author)

  6. Chapter 2: Irradiators

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2018-04-01

    Chapter 2 presents the following subjects: 1) gamma irradiators, which includes: Category-I gamma irradiators (self-contained); Category-II gamma irradiators (panoramic and dry storage); Category-III gamma irradiators (self-contained in water); Category-IV gamma irradiators (panoramic and wet storage); source rack for Category-IV gamma irradiators; product transport system for Category-IV gamma irradiators; radiation shield for gamma irradiators; 2) accelerators, which includes: Category-I accelerators (shielded irradiator); Category-II accelerators (irradiator inside a shielded room); irradiation application examples.

  7. TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA

    Energy Technology Data Exchange (ETDEWEB)

    Stanhope, C [Beaumont Health System, Royal Oak MI and Wayne State University, Detroit, MI (United States); Liang, J; Drake, D; Yan, D [Beaumont Health System, Royal Oak, MI (United States)

    2016-06-15

    Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta’s Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40ms. Five VMAT plans [4 H&N, 1 Pulsed Brain] comprised of 2 arcs each were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). Comparing standard deviations of the three plan-pair distributions, relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05%±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans – significantly less noisy. Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2mm dose resolution is recommended, however, less modulated arcs may allow less
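
    To illustrate the plan-pair comparison described above, here is a hedged sketch with synthetic diode doses (the clinical data are not reproduced); it computes the percent dose error per diode for each plan pair and compares the spread of the three distributions.

```python
import numpy as np

# Hypothetical diode doses (cGy) standing in for the three data sources described above.
rng = np.random.default_rng(0)
measured = rng.uniform(50, 200, size=500)                      # ArcCHECK diode measurements
original = measured * (1 + rng.normal(0.036, 0.066, 500))      # original-plan reconstruction
logfile  = measured * (1 + rng.normal(0.043, 0.063, 500))      # log-file reconstruction

# Keep only diodes above 10% of the maximum measured dose, as in the abstract.
mask = measured > 0.10 * measured.max()

def percent_error(calc, ref):
    return 100.0 * (calc[mask] - ref[mask]) / ref[mask]

pairs = {
    "log-file vs ArcCHECK": percent_error(logfile, measured),
    "original vs ArcCHECK": percent_error(original, measured),
    "log-file vs original": percent_error(logfile, original),
}
for name, err in pairs.items():
    print(f"{name}: {err.mean():+.2f}% +/- {err.std():.2f}%")
```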

  8. Evaluation of Root Canal Preparation Using Rotary System and Hand Instruments Assessed by Micro-Computed Tomography.

    Science.gov (United States)

    Stavileci, Miranda; Hoxha, Veton; Görduysus, Ömer; Tatar, Ilkan; Laperre, Kjell; Hostens, Jeroen; Küçükkaya, Selen; Muhaxheri, Edmond

    2015-06-20

    Complete mechanical preparation of the root canal system is rarely achieved. Therefore, the purpose of this study was to evaluate and compare the root canal shaping efficacy of ProTaper rotary files and standard stainless steel K-files using micro-computed tomography. Sixty extracted upper second premolars were selected and divided into 2 groups of 30 teeth each. Before preparation, all samples were scanned by micro-computed tomography. Thirty teeth were prepared with the ProTaper system and the other 30 with stainless steel files. After preparation, the untouched surface and root canal straightening were evaluated with micro-computed tomography. The percentage of untouched root canal surface was calculated in the coronal, middle, and apical parts of the canal. We also calculated straightening of the canal after root canal preparation. Results from the 2 groups were statistically compared using the Minitab statistical package. ProTaper rotary files left less untouched root canal surface compared with manual preparation in coronal, middle, and apical sector (p<0.001). Similarly, there was a statistically significant difference in root canal straightening after preparation between the techniques (p<0.001). Neither manual nor rotary techniques completely prepared the root canal, and both techniques caused slight straightening of the root canal.

  9. 75 FR 32519 - Miracor Diagnostics, Inc., Monaco Finance, Inc., MPEL Holdings Corp. (f/k/a Computer Transceiver...

    Science.gov (United States)

    2010-06-08

    ... SECURITIES AND EXCHANGE COMMISSION [File No. 500-1] Miracor Diagnostics, Inc., Monaco Finance, Inc., MPEL Holdings Corp. (f/k/a Computer Transceiver Systems, Inc.), MR3 Systems, Inc., Mutual Risk... concerning the securities of Monaco Finance, Inc. because it has not filed any periodic reports since the...

  10. Accomplish the Application Area in Cloud Computing

    OpenAIRE

    Bansal, Nidhi; Awasthi, Amit

    2012-01-01

    In examining the application areas of cloud computing, we find that the fact that cloud computing covers a lot of areas is its main asset. At a top level, it is an approach to IT where many users, some even from different companies, get access to shared IT resources such as servers, routers and various file extensions, instead of each having their own dedicated servers. This offers many advantages like lower costs and higher efficiency. Unfortunately there have been some high profile incidents whe...

  11. Informational-computer system for the neutron spectra analysis

    International Nuclear Information System (INIS)

    Berzonis, M.A.; Bondars, H.Ya.; Lapenas, A.A.

    1979-01-01

    In this article basic principles of the build-up of the informational-computer system for neutron spectra analysis on the basis of measured reaction rates are given. The basic data files of the system, and the software and hardware needed for the system's operation, are described

  12. Computer-aided engineering system for design of sequence arrays and lithographic masks

    Science.gov (United States)

    Hubbell, Earl A.; Morris, MacDonald S.; Winkler, James L.

    1996-01-01

    An improved set of computer tools for forming arrays. According to one aspect of the invention, a computer system (100) is used to select probes and design the layout of an array of DNA or other polymers with certain beneficial characteristics. According to another aspect of the invention, a computer system uses chip design files (104) to design and/or generate lithographic masks (110).

  13. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
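
    The kind of output-file parsing and compilation the abstract describes can be sketched outside a spreadsheet as well; the following is a hypothetical example (the marker string, file layout and paths are assumptions, not ExcelAutomat's actual logic) that collects one value per output file into a CSV table.

```python
import csv
import glob
import re

# Hypothetical marker: the line reporting the converged energy in each output file.
PATTERN = re.compile(r"FINAL ENERGY\s*=\s*(-?\d+\.\d+)")

rows = []
for path in sorted(glob.glob("outputs/*.log")):       # assumed location of the outputs
    value = None
    with open(path) as fh:
        for line in fh:
            m = PATTERN.search(line)
            if m:
                value = float(m.group(1))             # keep the last occurrence
    rows.append((path, value))

with open("compiled_results.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["file", "final_energy"])
    writer.writerows(rows)
```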

  14. 78 FR 21930 - Aquenergy Systems, Inc.; Notice of Intent To File License Application, Filing of Pre-Application...

    Science.gov (United States)

    2013-04-12

    ... Systems, Inc.; Notice of Intent To File License Application, Filing of Pre-Application Document, and Approving Use of the Traditional Licensing Process a. Type of Filing: Notice of Intent to File License...: November 11, 2012. d. Submitted by: Aquenergy Systems, Inc., a fully owned subsidiaries of Enel Green Power...

  15. Comparative analysis of a hypothetical 0.1 $/SEC transient overpower accident in an irradiated LMFBR core using different computer models

    International Nuclear Information System (INIS)

    Cacciabue, P.C.; Fremont, R. de; Renard, A.

    1982-01-01

    The report gives the results of comparative calculations performed by the Whole Core Accident Codes Group, a subgroup of the Safety Working Group of the Fast Reactor Coordinating Committee, for a hypothetical transient overpower accident in an irradiated LMFBR core. Different computer codes from members of the European Community and the United States were used. The calculations are based on a benchmark problem, using commonly agreed input data for the most important phenomena, such as the fuel pin failure threshold, FCI parameters, etc. Besides this, results with alternative assumptions for theoretical modelling are presented with the aim of showing, in a parametric way, the influence of more advanced modelling capabilities and/or better (so-called best estimate) input data for the most important phenomena on the accident sequences

  16. 12 CFR 16.33 - Filing fees.

    Science.gov (United States)

    2010-01-01

    ... Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY SECURITIES OFFERING DISCLOSURE RULES § 16.33 Filing fees. (a) Filing fees must accompany certain filings made under the provisions of this part... Comptroller of the Currency Fees published pursuant to § 8.8 of this chapter. (b) Filing fees must be paid by...

  17. 75 FR 4689 - Electronic Tariff Filings

    Science.gov (United States)

    2010-01-29

    ... elements ``are required to properly identify the nature of the tariff filing, organize the tariff database... (or other pleading) and the Type of Filing code chosen will be resolved in favor of the Type of Filing...'s wish expressed in its transmittal letter or in other pleadings, the Commission may not review a...

  18. The SIMRAND 1 computer program: Simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The SIMRAND I Computer Program (Version 5.0 x 0.3), written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles, is described. The SIMRAND I Computer Program comprises eleven modules: a main routine and ten subroutines. Two additional files are used at compile time; one inserts the system or task equations into the source code, while the other inserts the dimension statements and common blocks. The SIMRAND I Computer Program can be run on most microcomputers or mainframe computers with only minor modifications to the computer code.

  19. Application of pulsed multi-ion irradiations in radiation damage research: A stochastic cluster dynamics simulation study

    Science.gov (United States)

    Hoang, Tuan L.; Nazarov, Roman; Kang, Changwoo; Fan, Jiangyuan

    2018-07-01

    Under the multi-ion irradiation conditions present in accelerated material-testing facilities or fission/fusion nuclear reactors, the combined effects of atomic displacements with radiation products may induce complex synergies in the structural materials. However, limited access to multi-ion irradiation facilities and the lack of computational models capable of simulating the evolution of complex defects and their synergies make it difficult to understand the actual physical processes taking place in the materials under these extreme conditions. In this paper, we propose the application of pulsed single/dual-beam irradiation as a replacement for the expensive steady triple-beam irradiation to study radiation damage in materials under multi-ion irradiation.

  20. FragIt: a tool to prepare input files for fragment based quantum chemical calculations.

    Directory of Open Access Journals (Sweden)

    Casper Steinmann

    Full Text Available Near linear scaling fragment based quantum chemical calculations are becoming increasingly popular for treating large systems with high accuracy and are an active field of research. However, it remains difficult to set up these calculations without expert knowledge. To facilitate the use of such methods, software tools need to be available to support these methods and help to set up reasonable input files which will lower the barrier of entry for usage by non-experts. Previous tools rely on specific annotations in structure files for automatic and successful fragmentation, such as residues in PDB files. We present a general fragmentation methodology and accompanying tools called FragIt to help set up these calculations. FragIt uses the SMARTS language to locate chemically appropriate fragments in large structures and is applicable to fragmentation of any molecular system given suitable SMARTS patterns. We present SMARTS patterns of fragmentation for proteins, DNA and polysaccharides, specifically for D-galactopyranose for use in cyclodextrins. FragIt is used to prepare input files for the Fragment Molecular Orbital method in the GAMESS program package, but can be extended to other computational methods easily.
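
    A hedged sketch of SMARTS-driven fragmentation in the spirit of FragIt, using RDKit as an assumed toolkit (FragIt itself may rely on different libraries and far more elaborate patterns); the SMILES string and the amide SMARTS are illustrative only.

```python
from rdkit import Chem

# Illustrative only: locate peptide-bond-like linkages in a toy peptide-like chain
# and cut the molecule there, roughly the kind of SMARTS-driven fragmentation
# described above (the real patterns for proteins/DNA are more elaborate).
mol = Chem.MolFromSmiles("CC(=O)NC(C)C(=O)NC(C)C(=O)O")   # toy acetylated dipeptide-like chain
pattern = Chem.MolFromSmarts("C(=O)N")                    # amide linkage

matches = mol.GetSubstructMatches(pattern)
print("amide linkages found:", matches)

# Break the C-N bond of each matched amide to obtain fragments.
bonds = [mol.GetBondBetweenAtoms(c, n).GetIdx() for (c, o, n) in matches]
fragmented = Chem.FragmentOnBonds(mol, bonds, addDummies=True)
for frag in Chem.GetMolFrags(fragmented, asMols=True):
    print(Chem.MolToSmiles(frag))
```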

  1. Up-to-date review on food irradiation

    International Nuclear Information System (INIS)

    Ehlermann, D.A.E.; Gruenewald, T.

    1984-01-01

    Public interest was focussed on food irradiation in the Federal Republic of Germany after petitions for the treatment of spices had been filed and after the Federal Government's attitude concerning radiation processing of food had been discussed in Parliament. The review discusses potential and limitations of the method and presents literature references as examples for the relevant applications rather than listing all available references. Electron-, gamma-, Roentgen- and bremsstrahlung-rays are used to obtain disinfestation, shelf-life extension, eradication of pathogenic microorganisms, and product improvement. The relation between dose and effected radiochemical changes on the one hand and estimation of the wholesomeness of radiation processed food on the other hand, is discussed. 'Codex Alimentarius', a world-wide body for the standardization of food regulations, has recommended the general use of food irradiation up to a maximum dose limit of 10 kGy which covers the most promising applications. Under the premises of the European Community harmonization of food law is indispensable which means that some clearances in several countries have to be accepted by all countries in the Community. There is no need, based on scientific considerations, to label radiation processed foods. However, with regard to the growing environmental concern of the consumer, labelling of radiation processed products is recommended. For practical reasons only 'first generation' products should be labeled. (orig.)

  2. Up-to-date review on food irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Ehlermann, D A.E.; Gruenewald, T

    1984-01-01

    Public interest was focussed on food irradiation in the Federal Republic of Germany after petitions for the treatment of spices had been filed and after the Federal Government's attitude concerning radiation processing of food had been discussed in Parliament. The review discusses potential and limitations of the method and presents literature references as examples for the relevant applications rather than listing all available references. Electron-, gamma-, Roentgen- and bremsstrahlung-rays are used to obtain disinfestation, shelf-life extension, eradication of pathogenic microorganisms, and product improvement. The relation between dose and effected radiochemical changes on the one hand and estimation of the wholesomeness of radiation processed food on the other hand, is discussed. 'Codex Alimentarius', a world-wide body for the standardization of food regulations, has recommended the general use of food irradiation up to a maximum dose limit of 10 kGy which covers the most promising applications. Under the premises of the European Community harmonization of food law is indispensable which means that some clearances in several countries have to be accepted by all countries in the Community. There is no need, based on scientific considerations, to label radiation processed foods. However, with regard to the growing environmental concern of the consumer, labelling of radiation processed products is recommended. For practical reasons only 'first generation' products should be labeled.

  3. Integral transport computation of gamma detector response with the CPM2 code

    International Nuclear Information System (INIS)

    Jones, D.B.

    1989-12-01

    CPM-2 Version 3 is an enhanced version of the CPM-2 lattice physics computer code which supports the capabilities to (1) perform a two-dimensional gamma flux calculation and (2) perform Restart/Data file maintenance operations. The Gamma Calculation Module implemented in CPM-2 was first developed for EPRI in the CASMO-1 computer code by Studsvik Energiteknik under EPRI Agreement RP2352-01. The gamma transport calculation uses the CPM-HET code module to calculate the transport of gamma rays in two dimensions in a mixed cylindrical-rectangular geometry, where the basic fuel assembly and component regions are maintained in a rectangular geometry, but the fuel pins are represented as cylinders within a square pin cell mesh. Such a capability is needed to represent gamma transport in an essentially transparent medium containing spatially distributed ''black'' cylindrical pins. Under a subcontract to RP2352-01, RPI developed the gamma production and gamma interaction library used for gamma calculation. The CPM-2 gamma calculation was verified against reference results generated by Studsvik using the CASMO-1 program. The CPM-2 Restart/Data file maintenance capabilities provide the user with options to copy files between Restart/Data tapes and to purge files from the Restart/Data tapes

  4. Food irradiation

    International Nuclear Information System (INIS)

    Sato, Tomotaro; Aoki, Shohei

    1976-01-01

    Definition and significance of food irradiation were described. The details of its development and present state were also described. The effect of the irradiation on Irish potatoes, onions, wiener sausages, kamaboko (boiled fish-paste), and mandarin oranges was evaluated; and healthiness of food irradiation was discussed. Studies of the irradiation equipment for Irish potatoes in a large-sized container, and the silo-typed irradiation equipment for rice and wheat were mentioned. Shihoro RI center in Hokkaido which was put to practical use for the irradiation of Irish potatoes was introduced. The state of permission of food irradiation in foreign countries in 1975 was introduced. As a view of the food irradiation in the future, its utilization for the prevention of epidemics due to imported foods was mentioned. (Serizawa, K.)

  5. Food irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Gruenewald, T

    1985-01-01

    Food irradiation has become a matter of topical interest also in the Federal Republic of Germany following applications for exemptions concerning irradiation tests of spices. After risks to human health by irradiation doses up to a level sufficient for product pasteurization were excluded, irradiation now offers a method suitable primarily for the disinfestation of fruit and decontamination of frozen and dried food. Codex Alimentarius standards which refer also to supervision and dosimetry have been established; they should be adopted as national law. However, in the majority of cases where individual countries including EC member-countries so far permitted food irradiation, these standards were not yet used. Approved irradiation technique for industrial use is available. Several industrial food irradiation plants, partly working also on a contractual basis, are already in operation in various countries. Consumer response still is largely unknown; since irradiated food is labelled, consumption of irradiated food will be decided upon by consumers.

  6. 78 FR 75554 - Combined Notice of Filings

    Science.gov (United States)

    2013-12-12

    ...-000. Applicants: Young Gas Storage Company, Ltd. Description: Young Fuel Reimbursement Filing to be.... Protests may be considered, but intervention is necessary to become a party to the proceeding. eFiling is... qualifying facilities filings can be found at: http://www.ferc.gov/docs-filing/efiling/filing-req.pdf . For...

  7. 5 CFR 1203.13 - Filing pleadings.

    Science.gov (United States)

    2010-01-01

    ... delivery, by facsimile, or by e-filing in accordance with § 1201.14 of this chapter. If the document was... submitted by e-filing, it is considered to have been filed on the date of electronic submission. (e... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Filing pleadings. 1203.13 Section 1203.13...

  8. PFS: a distributed and customizable file system

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    In this paper we present our ongoing work on the Pegasus File System (PFS), a distributed and customizable file system that can be used for off-line file system experiments and on-line file system storage. PFS is best described as an object-oriented component library from which either a true file system or a file-system simulator can be constructed. Each of the components in the library is easily replaced by another implementation to accommodate a wide range of applications.

  9. 76 FR 61351 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-10-04

    ... MBR Baseline Tariff Filing to be effective 9/22/2011. Filed Date: 09/22/2011. Accession Number... submits tariff filing per 35.1: ECNY MBR Re-File to be effective 9/22/2011. Filed Date: 09/22/2011... Industrial Energy Buyers, LLC submits tariff filing per 35.1: NYIEB MBR Re-File to be effective 9/22/2011...

  10. Deceit: A flexible distributed file system

    Science.gov (United States)

    Siegel, Alex; Birman, Kenneth; Marzullo, Keith

    1989-01-01

    Deceit, a distributed file system (DFS) being developed at Cornell, focuses on flexible file semantics in relation to efficiency, scalability, and reliability. Deceit servers are interchangeable and collectively provide the illusion of a single, large server machine to any clients of the Deceit service. Non-volatile replicas of each file are stored on a subset of the file servers. The user is able to set parameters on a file to achieve different levels of availability, performance, and one-copy serializability. Deceit also supports a file version control mechanism. In contrast with many recent DFS efforts, Deceit can behave like a plain Sun Network File System (NFS) server and can be used by any NFS client without modifying any client software. The current Deceit prototype uses the ISIS Distributed Programming Environment for all communication and process group management, an approach that reduces system complexity and increases system robustness.

  11. Silvabase: A flexible data file management system

    Science.gov (United States)

    Lambing, Steven J.; Reynolds, Sandra J.

    1991-01-01

    The need for a more flexible and efficient data file management system for mission planning in the Mission Operations Laboratory (EO) at MSFC has spawned the development of Silvabase. Silvabase is a new data file structure based on a B+ tree data structure. This data organization allows for efficient forward and backward sequential reads, random searches, and appends to existing data. It also provides random insertions and deletions with reasonable efficiency, uses storage space well without sacrificing speed, and performs these functions on large volumes of data. Mission planners required that some data be keyed and manipulated in ways not found in a commercial product. Mission planning software is currently being converted to use Silvabase in the Spacelab and Space Station Mission Planning Systems. Silvabase runs on Digital Equipment Corporation's popular VAX/VMS computers in VAX Fortran. Silvabase has unique features involving time histories and intervals such as in operations research. Because of its flexibility and unique capabilities, Silvabase could be used in almost any government or commercial application that requires efficient reads, searches, and appends in medium to large amounts of almost any kind of data.
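
    A toy sketch of the keyed access pattern described above (sorted keys, random search, range scans, appends over time-tagged records); this is illustrative only and is not the Silvabase B+ tree implementation.

```python
import bisect

# Illustrative only: a sorted-key index emulating the access patterns a B+ tree
# provides (sequential reads, random search, keyed appends); not the real structure.
class KeyedFile:
    def __init__(self):
        self.keys = []       # kept sorted, e.g. time tags
        self.records = []

    def append(self, key, record):
        i = bisect.bisect_right(self.keys, key)   # insert while preserving order
        self.keys.insert(i, key)
        self.records.insert(i, record)

    def search(self, key):
        i = bisect.bisect_left(self.keys, key)
        return self.records[i] if i < len(self.keys) and self.keys[i] == key else None

    def scan(self, lo, hi):
        i, j = bisect.bisect_left(self.keys, lo), bisect.bisect_right(self.keys, hi)
        return list(zip(self.keys[i:j], self.records[i:j]))

kf = KeyedFile()
for t, payload in [(10.0, "AOS"), (12.5, "LOS"), (11.2, "maneuver")]:
    kf.append(t, payload)
print(kf.scan(10.5, 13.0))
```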

  12. 10 CFR 110.89 - Filing and service.

    Science.gov (United States)

    2010-01-01

    ...: Rulemakings and Adjudications Staff or via the E-Filing system, following the procedure set forth in 10 CFR 2.302. Filing by mail is complete upon deposit in the mail. Filing via the E-Filing system is completed... residence with some occupant of suitable age and discretion; (2) Following the requirements for E-Filing in...

  13. 49 CFR 1104.6 - Timely filing required.

    Science.gov (United States)

    2010-10-01

    ... offers next day delivery to Washington, DC. If the e-filing option is chosen (for those pleadings and documents that are appropriate for e-filing, as determined by reference to the information on the Board's Web site), then the e-filed pleading or document is timely filed if the e-filing process is completed...

  14. Measurement of kernel swelling and buffer densification in irradiated UCO-TRISO particles

    Energy Technology Data Exchange (ETDEWEB)

    Bower, Gordon R., E-mail: bowegr@inl.gov [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID, 83415-6188 (United States); Ploger, Scott A.; Demkowicz, Paul A. [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID, 83415-6188 (United States); Hunn, John D. [Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, TN, 37830-6093 (United States)

    2017-04-01

    Radiation-induced volume changes in the fuel kernels and buffer layers of UCO-TRISO particles irradiated to an average burnup of 16.1% FIMA have been determined. Measurements of particle dimensions were made on polished cross-sections of 56 irradiated particles at several different polish planes. The data were then analyzed to compute the equivalent spherical diameters of the kernels and the various coating layers, and these were compared to the average as-fabricated values to determine changes due to irradiation. The kernel volume was found to have increased by an average of 26 ± 6%. Buffer volume decreased by an average of 39 ± 2% due to densification.
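
    A simplified sketch of the volume-change estimate described above, with invented numbers: an equivalent spherical diameter is derived from a measured cross-section and compared with the as-fabricated mean (the actual study combined measurements from several polish planes per particle).

```python
import numpy as np

# Hypothetical measurements: kernel cross-sectional areas (um^2) on polished sections
# do not all pass through the particle centre, so the largest observed section is
# taken here as a rough estimate of the equatorial section.
kernel_areas = np.array([105000.0, 98000.0, 112000.0, 91000.0])  # um^2, illustrative
area_eq = kernel_areas.max()
d_irradiated = 2.0 * np.sqrt(area_eq / np.pi)   # equivalent spherical diameter, um

d_fabricated = 350.0                            # assumed as-fabricated mean kernel diameter, um

swelling = (d_irradiated / d_fabricated) ** 3 - 1.0
print(f"kernel volume change: {100 * swelling:+.1f}%")
```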

  15. DICOM supported software configuration by XML files

    International Nuclear Information System (INIS)

    LucenaG, Bioing Fabian M; Valdez D, Andres E; Gomez, Maria E; Nasisi, Oscar H

    2007-01-01

    A method for the configuration of informatics systems that provide support for the DICOM standard using XML files is proposed. The difference from other proposals is that this system does not encode the information of a DICOM object file, but encodes the standard itself in an XML file. The development itself is the format for the XML files mentioned, so that they can support what DICOM normalizes for multiple languages. In this way, the same configuration file (or files) can be used in different systems. Together with the generated XML configuration file, we also wrote a set of CSS and XSL files, so that the same file can be visualized in a standard browser as a query system for the DICOM standard; this emerging use was not a main objective but brings great utility and versatility. We also present some usage examples of the configuration file, mainly in relation to the loading of DICOM information objects. Finally, in the conclusions we show the utility that the system has already provided when the edition of the DICOM standard changed from 2006 to 2007
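
    A small illustration of the idea of encoding standard attributes, with multilingual names, in an XML configuration read at load time; the schema below is hypothetical and not the format proposed in the paper.

```python
import xml.etree.ElementTree as ET

# Hypothetical schema: the configuration encodes DICOM attributes with names in
# several languages, which an application can query when it loads the file.
XML = """<dicom>
  <attribute tag="(0010,0010)" vr="PN">
    <name lang="en">Patient's Name</name>
    <name lang="es">Nombre del paciente</name>
  </attribute>
</dicom>"""

root = ET.fromstring(XML)
for attr in root.iter("attribute"):
    names = {n.get("lang"): n.text for n in attr.iter("name")}
    print(attr.get("tag"), attr.get("vr"), names.get("en"))
```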

  16. Effect of front and rear incident proton irradiation on silicon solar cells

    Science.gov (United States)

    Anspaugh, Bruce; Kachare, Ram

    1987-01-01

    Four solar cell types of current manufacture were irradiated through the front and rear surfaces with protons in the energy range between 1 and 10 MeV. The solar cell parameters varied for this study were cell thickness and back surface field (BSF) vs. no BSF. Some cells were irradiated at normal incidence and an equal number were irradiated with simulated isotropic fluences. The solar cell electrical characteristics were measured under simulated AM0 illumination after each fluence. Using the normal incidence data, proton damage coefficients were computed for all four types of cells for both normal and omnidirectional radiation fields. These were found to compare well with the omnidirectional damage coefficients derived directly from the rear-incidence radiation data. Similarly, the rear-incidence omnidirectional radiation data were used to compute appropriate damage coefficients. A method for calculating the effect of a spectrum of energies is derived from these calculations. It is suitable for calculating the degradation of cells in space when they have minimal rear-surface shielding.
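
    A worked sketch of how energy-dependent damage coefficients can be folded with a proton spectrum to obtain an equivalent normal-incidence fluence; all numbers are placeholders, not the coefficients derived in the paper.

```python
import numpy as np

# Illustrative numbers only: fold a binned proton spectrum with relative damage
# coefficients D(E) to obtain a 10 MeV-equivalent normal-incidence fluence.
energies     = np.array([1.0, 2.0, 5.0, 10.0])       # MeV, bin representative energies
fluence      = np.array([4e10, 2e10, 8e9, 3e9])      # protons/cm^2 in each bin
damage_coeff = np.array([6.0, 3.5, 1.8, 1.0])        # D(E)/D(10 MeV), hypothetical

equivalent_fluence = np.sum(damage_coeff * fluence)  # 10 MeV-equivalent protons/cm^2
print(f"10 MeV-equivalent fluence: {equivalent_fluence:.2e} p/cm^2")
```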

  17. Review of tomography technique for 3-dimensional fission product distribution determination in irradiated fuel

    Energy Technology Data Exchange (ETDEWEB)

    Pakr, D G; Hong, K P; Joo, Y S; Lee, H K

    2006-04-15

    The tomography algorithm is reviewed in order to determine the radial 2-dimensional fission product distribution in an irradiated fuel rod and to reconstruct its image using a computer. The main contents are the Radon transformation, the Fourier central slice theorem, the inverse Fourier transform, and the accompanying FBP (Filtered Back Projection) and BPF (Back Projection Filtering). In addition, another tomography reconstruction algorithm, namely ART (Algebraic Reconstruction Technique), is reviewed briefly. Based on the reviewed results, we devise equipment for determining the 2-dimensional distribution of irradiated nuclear fuel using the existing gamma scanning apparatus. Based on the results of the review, it is necessary to develop a computer program implementing the reconstruction algorithm for determining the object function and reconstructing the image.
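
    The filtered back projection step reviewed above can be demonstrated compactly with scikit-image (assumed available; this is not the program developed in the report): a toy activity map is forward projected with the Radon transform and reconstructed with FBP.

```python
import numpy as np
from skimage.transform import radon, iradon

# Toy object: a disc with a hot spot standing in for fission-product activity.
image = np.zeros((128, 128))
yy, xx = np.mgrid[:128, :128]
image[(xx - 64) ** 2 + (yy - 64) ** 2 < 50 ** 2] = 1.0
image[(xx - 80) ** 2 + (yy - 60) ** 2 < 8 ** 2] = 3.0

theta = np.linspace(0.0, 180.0, 90, endpoint=False)
sinogram = radon(image, theta=theta)             # forward projection (Radon transform)
reconstruction = iradon(sinogram, theta=theta)   # filtered back projection (ramp filter)

print("mean reconstruction error:", np.abs(reconstruction - image).mean())
```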

  18. Review of tomography technique for 3-dimensional fission product distribution determination in irradiated fuel

    International Nuclear Information System (INIS)

    Pakr, D. G.; Hong, K. P.; Joo, Y. S.; Lee, H. K.

    2006-04-01

    The tomography algorithm is reviewed in order to determine the radial 2-dimensional fission product distribution in an irradiated fuel rod and to reconstruct its image using a computer. The main contents are the Radon transformation, the Fourier central slice theorem, the inverse Fourier transform, and the accompanying FBP (Filtered Back Projection) and BPF (Back Projection Filtering). In addition, another tomography reconstruction algorithm, namely ART (Algebraic Reconstruction Technique), is reviewed briefly. Based on the reviewed results, we devise equipment for determining the 2-dimensional distribution of irradiated nuclear fuel using the existing gamma scanning apparatus. Based on the results of the review, it is necessary to develop a computer program implementing the reconstruction algorithm for determining the object function and reconstructing the image

  19. An analysis of cobalt irradiation in CANDU 6 reactor core

    International Nuclear Information System (INIS)

    Gugiu, E.D.; Dumitrache, I.

    2003-01-01

    In CANDU reactors, one has the ability to replace the stainless steel adjuster rods with neutronically equivalent Co assemblies with a minimum impact on the power plant safety and efficiency. The 60 Co produced by 59 Co irradiation is used extensively in medicine and industry. The paper mainly describes some of the reactor physics and safety requirements that must be carried into practice for the Co adjuster rods. The computations related to the neutronic equivalence of the stainless steel adjusters with the Co adjuster assemblies, as well as the estimations of the activity and the heating of the irradiated cobalt rods, are performed using the Monte Carlo codes MCNP5 and MONTEBURNS2.1. The 60 Co activity and heating evaluations are closely related to the neutronics computations and to the density evolution of cobalt isotopes during the assumed in-core irradiation period. Unfortunately, the activities of these isotopes could not be evaluated directly using the burn-up capabilities of the MONTEBURNS code because their neutron cross-sections are absent from the MCNP5 code library. Additional MCNP5 runs for all the cobalt assemblies have been done in order to compute the flux-spectrum, the 59 Co and the 60 Co radiative capture reaction rates in the adjusters. The 60m Co cross-section was estimated using the flux-spectrum and the ORIGEN2.1 code capabilities THERM and RES. These computational steps allowed the evaluation of the one-group cross-sections for the radiative capture reactions of the cobalt isotopes. The values obtained replaced the corresponding ones from the ORIGEN library, which had been estimated using the flux-spectrum specific to the fuel. The activity values are used to evaluate the dose at the surface of the device designed to transport the cobalt adjusters. (authors)
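
    A small sketch of the spectrum-collapse step mentioned above, in which a one-group cross-section is obtained as the flux-weighted average over the local spectrum; the group structure, fluxes and cross-sections below are placeholders, not evaluated cobalt data.

```python
import numpy as np

# Illustrative spectrum collapse: the one-group cross-section is the flux-weighted
# average of the group-wise cross-section over the local spectrum.
flux  = np.array([3.0e13, 1.5e13, 8.0e12, 2.0e13])   # n/cm^2/s per group, placeholder
sigma = np.array([37.0, 5.0, 1.2, 0.01])             # barns, capture cross-section per group

sigma_one_group = np.sum(sigma * flux) / np.sum(flux)
reaction_rate_per_atom = np.sum(sigma * 1e-24 * flux)   # reactions/s per atom
print(f"collapsed cross-section: {sigma_one_group:.2f} b")
print(f"capture rate per atom:   {reaction_rate_per_atom:.3e} 1/s")
```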

  20. Electronic circuit design with HEP computational tools

    International Nuclear Information System (INIS)

    Vaz, Mario

    1996-01-01

    CPSPICE is an electronic circuit statistical simulation program developed to run in a parallel environment under the UNIX operating system and the TCP/IP communications protocol, using CPS - Cooperative Processes Software, the SPICE program and the CERNLIB software package. It is part of a set of tools being developed, intended to help electronic engineers design, model and simulate complex systems and circuits for High Energy Physics detectors, based on statistical methods, using the same software and methodology used by HEP physicists for data analysis. CPSPICE simulates electronic circuits by the Monte Carlo method, through several different processes running SPICE simultaneously on UNIX parallel computers or workstation farms. Data transfer between CPS processes for a modified version of SPICE2G6 is done through RAM memory, but can also be done through hard disk files if no source files are available for the simulator, and for bigger simulation output files. Simulation results are written to an HBOOK file as an NTUPLE, to be examined by HBOOK in batch mode or graphically, and analyzed by the statistical procedures available. The HBOOK file can be stored on hard disk for small amounts of data, or on Exabyte tape files for large amounts of data. HEP tools also help circuit or component modeling, like the MINUIT program from CERNLIB, which implements the Nelder and Mead Simplex and Gradient (with or without derivatives) algorithms and can be used for design optimization. This paper presents the CPSPICE program implementation. The scheme adopted is suitable for parallelizing other electronic circuit simulators. (author)
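
    The statistical idea behind such a tool can be sketched in miniature: sample component values within their tolerances and collect the distribution of an output quantity. The example below solves a resistive divider directly instead of calling SPICE, and its component values are invented.

```python
import numpy as np

# Statistical circuit analysis in miniature: sample component values within their
# tolerances and collect the output distribution. A real CPSPICE run would launch
# SPICE for each sample; here a resistive divider is solved in closed form.
rng = np.random.default_rng(42)
n_runs = 10_000

r1 = rng.normal(10_000.0, 0.05 * 10_000.0 / 3, n_runs)   # 10 kOhm, ~5% tolerance
r2 = rng.normal(4_700.0, 0.05 * 4_700.0 / 3, n_runs)      # 4.7 kOhm, ~5% tolerance
vin = 5.0

vout = vin * r2 / (r1 + r2)
print(f"Vout = {vout.mean():.3f} V +/- {vout.std():.3f} V (1 sigma over {n_runs} runs)")
```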

  1. UPEML, Computer Independent Emulator of CDC Update Utility

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: UPEML is a machine-portable CDC UPDATE emulation program. It is capable of emulating a significant subset of the standard CDC UPDATE functions, including program library creation and subsequent modification. 2 - Method of solution: UPEML was originally written to facilitate the use of CDC-based scientific packages on alternate computers. In addition to supporting computers such as the VAX/VMS, IBM, and CRAY/COS, Version 3.0 now supports UNIX workstations and the CRAY/UNICOS operating system. Several program bugs have been corrected in Version 3.0. Version 3.0 has several new features including 1) improved error checking, 2) the ability to use *ADDFILE and READ from nested files, 3) creation of compile file on creation, 4) allows identifiers to begin with numbers, and 5) ability to control warning messages and program termination on error conditions. 3 - Restrictions on the complexity of the problem: None noted

  2. EQPT, a data file preprocessor for the EQ3/6 software package: User's guide and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Daveler, S.A.; Wolery, T.J.

    1992-01-01

    EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets. These data files contain both standard state and activity coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer's (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25 degrees C only to 0-300 degrees C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fit to data which are input on temperature grids. The coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer's equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive form parameters makes it easier to evaluate Pitzer's equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers
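
    A minimal sketch of the first transformation described above: fitting an interpolating polynomial to data tabulated on a temperature grid so that only the coefficients need to be stored; the grid and log K values below are placeholders, not data0 contents.

```python
import numpy as np

# The data0 files tabulate quantities such as log K on a temperature grid; the
# preprocessor replaces the grid with interpolating-polynomial coefficients.
# Values below are placeholders on a 0-300 C grid.
temps_C = np.array([0.0, 25.0, 60.0, 100.0, 150.0, 200.0, 250.0, 300.0])
log_k   = np.array([14.94, 14.00, 13.02, 12.26, 11.64, 11.28, 11.19, 11.45])

coeffs = np.polyfit(temps_C, log_k, deg=4)   # coefficients stored instead of the grid
log_k_37 = np.polyval(coeffs, 37.0)          # later evaluated at any temperature
print(f"log K at 37 C ~ {log_k_37:.2f}")
```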

  3. 12 CFR 908.25 - Filing of papers.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Filing of papers. 908.25 Section 908.25 Banks... RULES OF PRACTICE AND PROCEDURE IN HEARINGS ON THE RECORD General Rules § 908.25 Filing of papers. (a) Filing. Any papers required to be filed shall be addressed to the presiding officer and filed with the...

  4. PFS: a distributed and customizable file system

    OpenAIRE

    Bosch, H.G.P.; Mullender, Sape J.

    1996-01-01

    In this paper we present our ongoing work on the Pegasus File System (PFS), a distributed and customizable file system that can be used for off-line file system experiments and on-line file system storage. PFS is best described as an object-oriented component library from which either a true file system or a file-system simulator can be constructed. Each of the components in the library is easily replaced by another implementation to accommodate a wide range of applications.

  5. Detecting Malicious Code by Binary File Checking

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2014-01-01

    Full Text Available The object, library and executable code is stored in binary files. Functionality of a binary file is altered when its content or program source code is changed, causing undesired effects. A direct content change is possible when the intruder knows the structural information of the binary file. The paper describes the structural properties of binary object files, how the content can be controlled by a possible intruder, and what the ways are to identify malicious code in such kinds of files. Because object files are inputs to linking processes, early detection of the malicious content is crucial to avoid infection of the binary executable files.
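
    One simple, generic countermeasure consistent with the discussion above is to record a cryptographic digest of each binary at build time and re-check it before use, so that any direct content change is detected; the manifest below is hypothetical.

```python
import hashlib
from pathlib import Path

def sha256_of(path, chunk=1 << 16):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        while block := fh.read(chunk):
            h.update(block)
    return h.hexdigest()

# Hypothetical manifest recorded at build time; path and digest are placeholders.
KNOWN_GOOD = {"build/app.o": "0000...recorded-digest..."}

for path, digest in KNOWN_GOOD.items():
    if not Path(path).exists():
        print(f"{path}: missing")
    elif sha256_of(path) != digest:
        print(f"{path}: content differs from the recorded digest - possible tampering")
    else:
        print(f"{path}: OK")
```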

  6. Formalizing a hierarchical file system

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, Muhammad Ikram

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal,

  7. Food irradiation

    International Nuclear Information System (INIS)

    Lindqvist, H.

    1996-01-01

    This paper is a review of food irradiation and lists plants for food irradiation in the world. Possible applications for irradiation are discussed, and changes induced in food by radiation, nutritional as well as organoleptic, are reviewed. Possible toxicological risks with irradiated food and risks from alternative methods for treatment are also brought up. Ways to analyze whether food has been irradiated or not are presented. 8 refs

  8. Identification of. gamma. -irradiated spices by electron spin resonance (ESR) spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Uchiyama, Sadao; Kawamura, Yoko; Saito, Yukio (National Inst. of Hygienic Sciences, Tokyo (Japan))

    1990-12-01

    The electron spin resonance (ESR) spectrometry spectra of white (WP), black (BP) and red (Capsicum annuum L. var. frutescerns L., RP) peppers each had a principal signal with a g-value of 2.0043, and the intensities of the principal signals were increased not only by {gamma}-irradiation but also by heating. Irradiated RP also showed a minor signal -30G from the principal one, and the intensity of the minor signal increased linearly with increasing dose from 10 to 50 kGy. Since the minor signal was observed in RP irradiated at 10 kGy and stored for one year, but did not appear either after heating or after exposure to this signal is unique to {gamma}-irradiated RP and should therefore be useful for the identification of {gamma}-irradiated spices of Capsicum genus, such as paprika and chili pepper. The computer simulation of the ESR spectra suggested that the minor signal should be assigned to methyl radical and the principal signal mainly to a combination of phenoxyl and peroxyl radicals. Such minor signals were found in {gamma}-irradiated allspice and cinnamon among 10 kinds of other spices. (author).

  9. Identification of γ-irradiated spices by electron spin resonance (ESR) spectrometry

    International Nuclear Information System (INIS)

    Uchiyama, Sadao; Kawamura, Yoko; Saito, Yukio

    1990-01-01

    The electron spin resonance (ESR) spectrometry spectra of white (WP), black (BP) and red (Capsicum annuum L. var. frutescerns L., RP) peppers each had a principal signal with a g-value of 2.0043, and the intensities of the principal signals were increased not only by γ-irradiation but also by heating. Irradiated RP also showed a minor signal -30G from the principal one, and the intensity of the minor signal increased linearly with increasing dose from 10 to 50 kGy. Since the minor signal was observed in RP irradiated at 10 kGy and stored for one year, but did not appear either after heating or after exposure to this signal is unique to γ-irradiated RP and should therefore be useful for the identification of γ-irradiated spices of Capsicum genus, such as paprika and chili pepper. The computer simulation of the ESR spectra suggested that the minor signal should be assigned to methyl radical and the principal signal mainly to a combination of phenoxyl and peroxyl radicals. Such minor signals were found in γ-irradiated allspice and cinnamon among 10 kinds of other spices. (author)

  10. Relation between radiation-induced whole lung functional loss and regional structural changes in partial irradiated rat lung

    International Nuclear Information System (INIS)

    Luijk, Peter van; Novakova-Jiresova, Alena; Faber, Hette; Steneker, Marloes N.J.; Kampinga, Harm H.; Meertens, Haarm; Coppes, Robert P.

    2006-01-01

    Purpose: Radiation-induced pulmonary toxicity is characterized by dose, region, and time-dependent severe changes in lung morphology and function. This study sought to determine the relation between the structural and functional changes in the irradiated rat lung at three different phases after irradiation. Materials and Methods: Six groups of animals were irradiated to 16-22 Gy to six different lung regions, each containing 50% of the total lung volume. Before and every 2 weeks after irradiation, the breathing rate (BR) was measured, and at Weeks 8, 26, and 38 CT was performed. From the computed tomography scans, the irradiated lung tissue was delineated using a computerized algorithm. A single quantitative measure for structural change was derived from changes of the mean and standard deviation of the density within the delineated lung. Subsequently, this was correlated with the BR in the corresponding phase. Results: In the mediastinal and apex region, the BR and computed tomography density changes did not correlate in any phase. After lateral irradiation, the density changes always correlated with the BR; however, in all other regions, the density changes only correlated significantly (r 2 = 0.46-0.85, p < 0.05) with the BR in Week 26. Conclusion: Changes in pulmonary function correlated with the structural changes in the absence of confounding heart irradiation

  11. Computer codes for evaluation of control room habitability (HABIT)

    International Nuclear Information System (INIS)

    Stage, S.A.

    1996-06-01

    This report describes the Computer Codes for Evaluation of Control Room Habitability (HABIT). HABIT is a package of computer codes designed to be used for the evaluation of control room habitability in the event of an accidental release of toxic chemicals or radioactive materials. Given information about the design of a nuclear power plant, a scenario for the release of toxic chemicals or radionuclides, and information about the air flows and protection systems of the control room, HABIT can be used to estimate the chemical exposure or radiological dose to control room personnel. HABIT is an integrated package of several programs that previously needed to be run separately and required considerable user intervention. This report discusses the theoretical basis and physical assumptions made by each of the modules in HABIT and gives detailed information about the data entry windows. Sample runs are given for each of the modules. A brief section of programming notes is included. A set of computer disks will accompany this report if the report is ordered from the Energy Science and Technology Software Center. The disks contain the files needed to run HABIT on a personal computer running DOS. Source codes for the various HABIT routines are on the disks. Also included are input and output files for three demonstration runs

  12. 77 FR 74839 - Combined Notice of Filings

    Science.gov (United States)

    2012-12-18

    ..., LP. Description: National Grid LNG, LP submits tariff filing per 154.203: Adoption of NAESB Version 2... with Order to Amend NAESB Version 2.0 Filing to be effective 12/1/2012. Filed Date: 12/11/12. Accession...: Refile to comply with Order on NAESB Version 2.0 Filing to be effective 12/1/2012. Filed Date: 12/11/12...

  13. Continuous parameter determination of irradiated nuclear fuels in the test-reactor

    International Nuclear Information System (INIS)

    Bevilacqua, A.; Junod, E.; Mas, P.; Perdreau, R.

    1977-01-01

    During irradiation tests of nuclear fuels, the flux level may often be varied by shifting the loops in a high neutron gradient, so integral fluence measurements are no longer sufficient. Self-powered neutron detectors allow instantaneous fluxes to be scanned finely. More than 100 such SPN detectors are used on the experiments irradiated in the SILOE reactor. The treatment of the large amount of information is as follows. A first minicomputer scans all the measurement lines at a variable frequency (10 min to 1 hr) and writes raw voltage values to a magnetic disk. A second computer sorts these values for each set of SPNDs corresponding to an experiment. At present, the main treatment is performed in batch mode by FORTRAN codes to obtain time-evolving quantities such as effective flux, fission power, burn-up, fission product activities, etc. Future development of the system will allow some of these calculations to be performed directly on the second computer, so as to control the movements of the loops automatically according to a given irradiation program
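
    A hedged sketch of the kind of post-processing described above: periodic SPND currents are converted to flux through an assumed sensitivity and integrated in time to give fluence; the readings and sensitivity are invented.

```python
import numpy as np

# Hypothetical sensitivity and readings: an SPND current is converted to local flux
# through a calibration constant, and fluence follows by integrating over time.
sensitivity = 2.0e-21                                            # A per (n/cm^2/s), placeholder
times_s  = np.array([0.0, 600.0, 1200.0, 1800.0, 2400.0])        # one reading per 10 min
currents = np.array([1.9e-7, 2.1e-7, 2.0e-7, 1.6e-7, 1.8e-7])    # detector current, A

flux = currents / sensitivity                                    # n/cm^2/s at each scan
# Trapezoidal integration of flux over time gives the accumulated fluence.
fluence = np.sum(0.5 * (flux[1:] + flux[:-1]) * np.diff(times_s))
print(f"mean flux: {flux.mean():.3e} n/cm2/s, fluence: {fluence:.3e} n/cm2")
```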

  14. Perspectives on treatment with irradiation in Slovenia

    Directory of Open Access Journals (Sweden)

    Primož Strojan

    2007-12-01

    Full Text Available Background: Radiotherapy is one of the three main modalities of cancer treatment. However, effective treatment with radiotherapy can only be assured by highly advanced irradiation facilities, including systems for planning, performing and quality control of irradiation. The second requirement for effective treatment is sufficient capacity of treatment units and computer equipment to provide timely access to treatment for more than 50 % of all cancer patients, together with a proper structure and number of staff specialized in handling radiotherapy equipment. In Slovenia, only 38 % of cancer patients are treated with radiotherapy. In general, the waiting times of patients referred to radiotherapy are too long. Therefore, further development and upgrading of irradiation facilities will remain a priority in oncology in Slovenia in the future as well. At the same time, in our endeavors to meet the set goals, we have been facing unforeseen problems both with human resources and with inadequate financial appreciation of radiotherapy services, which, without significant national aid, do not yield sufficient funds for the renewal, upgrading and further expansion of equipment.

  15. Formalizing a Hierarchical File System

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, M.I.

    2009-01-01

    In this note, we define an abstract file system as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for removal
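
    The partial-function view described above can be made concrete in a few lines: a mapping from absolute paths to data whose domain is the set of valid paths, with read and write defined only at valid paths and removal shrinking the domain. The sketch below follows that description loosely; the class and method names are mine, not the paper's formalization.

```python
# Minimal sketch of a file system as a partial function from absolute paths
# to data, loosely following the description above. Names and error handling
# are illustrative assumptions, not the paper's formal model.

class AbstractFS:
    def __init__(self):
        self._map = {}                       # the partial function: path -> data

    def valid_paths(self):
        """The set of paths on which the partial function is defined."""
        return set(self._map)

    def read(self, path):
        if path not in self._map:
            raise FileNotFoundError(path)    # undefined outside its domain
        return self._map[path]

    def write(self, path, data):
        self._map[path] = data               # extend or update the function

    def remove(self, path):
        """Unix-style removal: the path leaves the function's domain."""
        self._map.pop(path, None)

fs = AbstractFS()
fs.write("/home/alice/notes.txt", b"hello")
assert fs.read("/home/alice/notes.txt") == b"hello"
fs.remove("/home/alice/notes.txt")
assert "/home/alice/notes.txt" not in fs.valid_paths()
```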

  16. Computer systems for nuclear installation data control

    International Nuclear Information System (INIS)

    1987-09-01

    The computer programs developed by the Divisao de Instalacoes Nucleares (DIN) of the Brazilian CNEN for data control on nuclear installations in Brazil are presented. The following computer programs are described: control of registered companies; control of industrial sources, irradiators and monitors; control of responsible persons; control of industry irregularities; preparation of credence tests; shielding analysis; and control of waste storage.

  17. The construction and commissioning of MINT's latex irradiator

    International Nuclear Information System (INIS)

    Razali Hamzah; Muhd Khairi Muhd Said; Muhd Ariff Hamzah; Wan Manshol Wan Zin; Taiman Kadni

    1996-01-01

    The construction and installation of MINT's automatic continuous latex irradiator is described. MINT cooperated with NUKEM to design the plant. Construction was carried out by local building consultants and a local contractor. The installation of the plant includes both locally fabricated and imported components. The plant is automatically controlled by a computer system. Features of the plant are described.

  18. Free radical formation in deoxyguanosine-5'-monophosphate γ-irradiated in frozen solution. A computer-assisted analysis of temperature-dependent ESR spectra

    International Nuclear Information System (INIS)

    Gregoli, S.; Olast, M.; Bertinchamps, A.

    1977-01-01

    Deoxyguanosine-5'-monophosphate (dGMP) was γ-irradiated at 77 K in frozen aqueous solution and then annealed in a stepwise fashion up to the melting point. During this process, the primary radicals formed in dGMP at 77 K are progressively converted into secondary radical species, which is observed as changes in the intensity and shape of the spectrum. Computer-assisted analysis of these temperature-dependent spectra permitted us to identify the transient radical species involved and to derive single-radical concentration kinetics versus temperature. The radiation-chemical behavior of dGMP was found to be quite similar to that of dAMP, investigated previously. In both these purine derivatives, radical anions are converted into H-addition radicals at C-8, and radical cations are converted into OH-addition radicals at the same position. In dGMP, however, the cationic channel is only induced under certain experimental conditions (alkaline pH, presence of electron scavengers). At neutral pH, G⁺ radicals are quite stable and finally become deactivated without being converted into secondary GOH radicals. Specific deuterium substitution at carbon C-8, and irradiation in H₂O or in D₂O, confirmed that both H⁺ and OH⁻ attachments occur at C-8, and that both the H⁺ and OH⁻ groups come from the aqueous medium.
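
    The essential idea of such a computer-assisted analysis is that each composite spectrum recorded at a given annealing temperature is fitted as a linear combination of single-radical reference spectra, the fitted coefficients giving the relative radical concentrations at that temperature. The sketch below illustrates this with synthetic Gaussian line shapes and a plain least-squares fit; the actual reference spectra and fitting procedure of the paper are not reproduced here.

```python
# Minimal sketch of spectrum decomposition into single-radical components.
# The reference line shapes below are synthetic Gaussians (assumptions for
# illustration), not the experimental ESR spectra of the paper.
import numpy as np

field = np.linspace(-5.0, 5.0, 400)                 # magnetic-field axis (a.u.)

def line(center, width):
    return np.exp(-((field - center) / width) ** 2)

references = np.column_stack([
    line(-1.5, 0.8),     # e.g. anion-derived H-addition radical (synthetic)
    line(+1.2, 0.6),     # e.g. cation-derived OH-addition radical (synthetic)
])

true_conc = np.array([0.7, 0.3])
composite = references @ true_conc + 0.01 * np.random.default_rng(0).normal(size=field.size)

# Least-squares estimate of the single-radical concentrations at this temperature.
conc, *_ = np.linalg.lstsq(references, composite, rcond=None)
print("estimated concentrations:", np.round(conc, 3))
```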

  19. 78 FR 70971 - Privacy Act of 1974, as Amended; Notice of Computer Matching Program (Railroad Retirement Board...

    Science.gov (United States)

    2013-11-27

    ... will file a report of this computer-matching program with the Committee on Homeland Security and... . SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988, (Pub. L. 100-503... RAILROAD RETIREMENT BOARD Privacy Act of 1974, as Amended; Notice of Computer Matching Program...

  20. Adjoint sensitivity analysis applied on a model of irradiation assisted degradation of metals in aqueous systems

    International Nuclear Information System (INIS)

    Simonson, S.A.; Ballinger, R.G.; Christensen, R.A.

    1990-01-01

    Irradiation of an aqueous environment results, in general, in a steady-state concentration of oxidizing chemical species in solution. Although the effect may be beneficial to the metal in contact with the solution in some cases, say by producing a more protective film, it is generally believed to be detrimental. Predicting the concentrations of the oxidizing species, and from this beginning to analyze the detrimental effects on the metals, requires computer codes that model the chemical reactions, production rates, and diffusion characteristics of the species produced by irradiation. The large number of parameters and the complexity of the interactions involved in predicting irradiation effects on metal degradation require a more sophisticated approach to determining the sensitivities of the final results. Monte Carlo techniques are too computationally intensive for practical use in determining sensitivities. The paper presents an approach, adjoint sensitivity analysis, that is more practical (three computer runs versus thousands) and also provides a more accurate measure of the sensitivities of the model.
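
    The computational advantage of the adjoint approach is that the sensitivities of one response with respect to many parameters are obtained from a small, fixed number of solves rather than one perturbed run per parameter. The sketch below shows the idea for a linear steady-state species balance A(p)c = s with scalar response J = g·c: one forward solve, one adjoint solve, and dJ/dp_k = λ·(∂s/∂p_k − (∂A/∂p_k)c). The matrix, source term, and response are hypothetical stand-ins, not the radiolysis model of the paper.

```python
# Minimal sketch of adjoint sensitivity analysis for a linear steady-state
# model A(p) c = s(p) with scalar response J = g.c. The matrix, source and
# response below are hypothetical stand-ins for a radiolysis species balance.
import numpy as np

def A(p):   # system matrix depending on parameters p (e.g. rate constants)
    k1, k2 = p
    return np.array([[ k1 + k2, -k2 ],
                     [ -k1,      k2 + 1.0 ]])

def s(p):   # radiolytic source term (taken independent of p here)
    return np.array([1.0, 0.5])

g = np.array([1.0, 0.0])          # response: concentration of species 1
p0 = np.array([2.0, 0.7])

c = np.linalg.solve(A(p0), s(p0))            # one forward solve
lam = np.linalg.solve(A(p0).T, g)            # one adjoint solve: A^T lam = g

# dJ/dp_k = lam . (ds/dp_k - dA/dp_k c); the derivatives of A and s are cheap
# finite differences -- no additional model solve per parameter is needed.
eps = 1e-6
grad = np.empty_like(p0)
for k in range(p0.size):
    dp = np.zeros_like(p0); dp[k] = eps
    dA = (A(p0 + dp) - A(p0)) / eps
    ds = (s(p0 + dp) - s(p0)) / eps
    grad[k] = lam @ (ds - dA @ c)

print("J =", g @ c, " dJ/dp =", grad)
```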