WorldWideScience

Sample records for production computer file

  1. Algorithms and file structures for computational geometry

    International Nuclear Information System (INIS)

    Hinrichs, K.; Nievergelt, J.

    1983-01-01

    Algorithms for solving geometric problems and file structures for storing large amounts of geometric data are of increasing importance in computer graphics and computer-aided design. As examples of recent progress in computational geometry, we explain plane-sweep algorithms, which solve various topological and geometric problems efficiently; and we present the grid file, an adaptable, symmetric multi-key file structure that provides efficient access to multi-dimensional data along any space dimension. (orig.)

  2. Small file aggregation in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
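
A minimal sketch of the aggregation idea described above, assuming ordinary local files stand in for the per-process output; all names and paths are illustrative, not the patented implementation:

```python
import json
import os

def aggregate(paths, aggregated_path, index_path):
    """Pack many small files into one, recording (offset, length) per file."""
    index, offset = {}, 0
    with open(aggregated_path, "wb") as out:
        for path in paths:
            with open(path, "rb") as f:
                data = f.read()
            out.write(data)
            index[os.path.basename(path)] = {"offset": offset, "length": len(data)}
            offset += len(data)
    with open(index_path, "w") as f:
        json.dump(index, f)  # the metadata: offset and length of each file

def unpack(aggregated_path, index_path, name):
    """Recover one original file from the aggregate using its metadata."""
    with open(index_path) as f:
        entry = json.load(f)[name]
    with open(aggregated_path, "rb") as f:
        f.seek(entry["offset"])
        return f.read(entry["length"])
```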

  3. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
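
The list-plus-checksum index could look like the following sketch; the dictionary layout and the SHA-256 choice are assumptions for illustration:

```python
import hashlib

def checksum(path):
    """SHA-256 of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_index(primary, replicas):
    """List of storage locations (primary first) plus a checksum."""
    return {"locations": [primary] + list(replicas),
            "sha256": checksum(primary)}

def validate(index):
    """True if the file and every replica still match the stored checksum."""
    return all(checksum(p) == index["sha256"] for p in index["locations"])
```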

  4. JENDL gas-production cross section file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo; Narita, Tsutomu

    1992-05-01

The JENDL gas-production cross section file was compiled by taking cross-section data from JENDL-3 and using the ENDF-5 format. Data are given for 23 nuclei or elements among the light nuclei and structural materials. Graphs of the cross sections and a brief description of the evaluation methods are given in this report. (author)

  5. Computer aided product design

    DEFF Research Database (Denmark)

    Constantinou, Leonidas; Bagherpour, Khosrow; Gani, Rafiqul

    1996-01-01

A general methodology for Computer Aided Product Design (CAPD) with specified property constraints which is capable of solving a large range of problems is presented. The methodology employs the group contribution approach and generates acyclic, cyclic and aromatic compounds of various degrees … liquid-liquid equilibria (LLE), solid-liquid equilibria (SLE) and gas solubility. Finally, a computer program based on the extended methodology has been developed, and the results from five case studies highlighting various features of the methodology are presented.

  6. RAMA: A file system for massively parallel computers

    Science.gov (United States)

    Miller, Ethan L.; Katz, Randy H.

    1993-01-01

This paper describes a file system design for massively parallel computers which makes very efficient use of a few disks per processor. This overcomes the traditional I/O bottleneck of massively parallel machines by storing the data on disks within the high-speed interconnection network. In addition, the file system, called RAMA, requires little inter-node synchronization, removing another common bottleneck in parallel processor file systems. Support for a large tertiary storage system can easily be integrated into the file system; in fact, RAMA runs most efficiently when tertiary storage is used.

  7. WinSCP for Windows File Transfers

    Science.gov (United States)

WinSCP can be used to securely transfer files between your local computer running Microsoft Windows and a remote computer running Linux.

  8. Archive of Census Related Products (ACRP): 1980 SAS Transport Files

    Data.gov (United States)

    National Aeronautics and Space Administration — The 1980 SAS Transport Files portion of the Archive of Census Related Products (ACRP) contains housing and population demographics from the 1980 Summary Tape File...

  9. Documentation of CATHENA input files for the APOLLO computer

    International Nuclear Information System (INIS)

    1988-06-01

    Input files created for the VAX version of the CATHENA two-fluid code have been modified and documented for simulation on the AECB's APOLLO computer system. The input files describe the RD-14 thermalhydraulic loop, the RD-14 steam generator, the RD-12 steam generator blowdown test facility, the Stern Laboratories Cold Water Injection Facility (CWIT), and a CANDU 600 reactor. Sample CATHENA predictions are given and compared with experimental results where applicable. 24 refs

  10. NET: an inter-computer file transfer command

    International Nuclear Information System (INIS)

    Burris, R.D.

    1978-05-01

    The NET command was defined and supported in order to facilitate file transfer between computers. Among the goals of the implementation were greatest possible ease of use, maximum power (i.e., support of a diversity of equipment and operations), and protection of the operating system

  11. Storing files in a parallel computing system based on user-specified parser function

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
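
As a hedged sketch of the mechanism, a "parser" can be any callable the application hands over; here it filters files by a semantic requirement and extracts metadata, with all names invented for illustration:

```python
import json
import os
import shutil

def example_parser(path):
    """Return (keep, metadata): keep only non-empty files, record their size."""
    size = os.path.getsize(path)
    return size > 0, {"size": size}

def store_with_parser(paths, storage_dir, parser=example_parser):
    os.makedirs(storage_dir, exist_ok=True)
    catalog = {}
    for path in paths:
        keep, meta = parser(path)
        if not keep:            # parser enforces its semantic requirements
            continue
        dest = os.path.join(storage_dir, os.path.basename(path))
        shutil.copy(path, dest)
        catalog[dest] = meta    # extracted metadata, kept for later searching
    with open(os.path.join(storage_dir, "catalog.json"), "w") as f:
        json.dump(catalog, f)
```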

  12. A software to report and file by personal computer

    International Nuclear Information System (INIS)

    Di Giandomenico, E.; Filippone, A.; Esposito, A.; Bonomo, L.

    1989-01-01

During the past four years the authors have been gaining experience in reporting radiological examinations by personal computer. Here they describe a new software package which allows the reporting and filing of roentgenograms. The program was written by a radiologist using a well-known database management system, dBASE III, and was shaped to fit the radiologist's needs: it helps with reporting and allows radiological data to be filed with the diagnostic codes used by the American College of Radiology. In this paper the authors describe the database structure and the software functions which make its use possible. Thus, this paper is not aimed at advertising a new reporting program, but at demonstrating how radiologists can themselves manage some aspects of their work with the help of a personal computer.

  13. Dynamic file-access characteristics of a production parallel scientific workload

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1994-01-01

Multiprocessors have permitted astounding increases in computational performance, but many cannot meet the intense I/O requirements of some scientific applications. An important component of any solution to this I/O bottleneck is a parallel file system that can provide high-bandwidth access to tremendous amounts of data in parallel to hundreds or thousands of processors. Most successful systems are based on a solid understanding of the expected workload, but thus far there have been no comprehensive workload characterizations of multiprocessor file systems. This paper presents the results of a three-week tracing study in which all file-related activity on a massively parallel computer was recorded. Our instrumentation differs from previous efforts in that it collects information about every I/O request and about the mix of jobs running in a production environment. We also present the results of a trace-driven caching simulation and recommendations for designers of multiprocessor file systems.

  14. 22 CFR 1429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-04-01

22 CFR 1429.21 (Foreign Relations, 2010-04-01), MISCELLANEOUS AND GENERAL REQUIREMENTS, General Requirements: Computation of time for filing papers. Where this subchapter requires the filing of any paper, such document must be received by the Board or the officer or …

  15. Archive of Census Related Products (ACRP): 1992 Boundary Files

    Data.gov (United States)

    National Aeronautics and Space Administration — The 1992 Boundary Files portion of the Archive of Census Related Products (ACRP) consists of 1992 boundary data from the U.S. Census Bureau's Topologically...

  16. Archive of Census Related Products (ACRP): 1990 Standard Extract Files

    Data.gov (United States)

    National Aeronautics and Space Administration — The 1990 Standard Extract Files portion of the Archive of Census Related Products (ACRP) contains population and housing data derived from the U.S. Census Bureau's...

  17. 21 CFR 225.102 - Master record file and production records.

    Science.gov (United States)

    2010-04-01

21 CFR 225.102 (Food and Drugs, 2010-04-01): Master record file and production records. (a) The Master Record File provides the complete … batch or production run of medicated feed to which it pertains. The Master Record File or card shall …

  18. Production Management System for AMS Computing Centres

    Science.gov (United States)

    Choutko, V.; Demakov, O.; Egorov, A.; Eline, A.; Shan, B. S.; Shi, R.

    2017-10-01

The Alpha Magnetic Spectrometer [1] (AMS) has collected over 95 billion cosmic ray events since it was installed on the International Space Station (ISS) on May 19, 2011. To cope with the enormous flux of events, AMS uses 12 computing centers in Europe, Asia and North America, which have different hardware and software configurations. The centers participate in data reconstruction and Monte Carlo (MC) simulation [2] (data and MC production) as well as in physics analysis. A data production management system has been developed to facilitate data and MC production tasks in the AMS computing centers, including job acquisition, submission, monitoring, transfer, and accounting. It was designed to be modular, lightweight, and easy to deploy. The system is based on a Deterministic Finite Automaton [3] model and is implemented in the scripting languages Python and Perl with the built-in sqlite3 database on Linux operating systems. Different batch management systems, file system storage, and transfer protocols are supported. The details of the integration with the Open Science Grid are presented as well.
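
A toy sketch of the deterministic-finite-automaton idea: each production job advances through a fixed transition table, so an illegal event in a given state is rejected. The states and events are assumptions drawn from the task list above, not the actual AMS implementation:

```python
TRANSITIONS = {
    ("acquired",     "submit"): "submitted",
    ("submitted",    "start"):  "running",
    ("running",      "finish"): "transferring",
    ("running",      "fail"):   "failed",
    ("transferring", "done"):   "accounted",
}

def advance(state, event):
    """Deterministic transition; unknown (state, event) pairs raise."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"illegal event {event!r} in state {state!r}")

state = "acquired"
for event in ("submit", "start", "finish", "done"):
    state = advance(state, event)
print(state)  # accounted
```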

  19. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
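
The two kinds of reduced resolution mentioned (fewer bits per value, a sub-set of data elements) can be illustrated in a few lines of NumPy; the factor-of-4 subsampling and the float16 precision are arbitrary choices, not the patented scheme:

```python
import numpy as np

def make_replicas(data):
    """Full-resolution array plus two lower-resolution replicas."""
    return {
        "full": data,                              # e.g. float64
        "low_precision": data.astype(np.float16),  # fewer bits per element
        "subsampled": data[::4],                   # every 4th data element
    }

replicas = make_replicas(np.linspace(0.0, 1.0, 16))
print({k: v.size for k, v in replicas.items()})
```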

  20. File and metadata management for BESIII distributed computing

    International Nuclear Information System (INIS)

    Nicholson, C; Zheng, Y H; Lin, L; Deng, Z Y; Li, W D; Zhang, X M

    2012-01-01

The BESIII experiment at the Institute of High Energy Physics (IHEP), Beijing, uses the high-luminosity BEPCII e+e− collider to study physics in the τ-charm energy region around 3.7 GeV; BEPCII has produced the world's largest samples of J/ψ and ψ′ events to date. An order-of-magnitude increase in the data sample size over the 2011-2012 data-taking period demanded a move from a very centralized to a distributed computing environment, as well as the development of an efficient file and metadata management system. While BESIII is on a smaller scale than some other HEP experiments, this poses particular challenges for its distributed computing and data management system. These constraints include limited resources and manpower, and low-quality network connections to IHEP. Drawing on the rich experience of the HEP community, a system has been developed which meets these constraints. The design and development of the BESIII distributed data management system, including its integration with other BESIII distributed computing components, such as job management, are presented here.

  1. Comparison of canal transportation and centering ability of hand Protaper files and rotary Protaper files by using micro computed tomography

    OpenAIRE

    Amit Gandhi; Taru Gandhi

    2011-01-01

Introduction and objective: The aim of the present study was to compare root canal preparation with rotary ProTaper files and hand ProTaper files to find the better instrumentation technique for maintaining root canal geometry, with the aid of computed tomography. Material and methods: Twenty curved root canals with at least 10 degrees of curvature were divided into 2 groups of 10 teeth each. In group I the canals were prepared with hand ProTaper files and in group II the canals were prepared wit...

  2. Arranging and finding folders and files on your Windows 7 computer

    CERN Document Server

    Steps, Studio Visual

    2014-01-01

    If you have lots of documents on your desk, it may prove to be impossible to find the document you are looking for. In order to easily find certain documents, they are often stored in a filing cabinet and arranged in a logical order. The folders on your computer serve the same purpose. They do not just contain files; they can also contain other folders. You can create an unlimited number of folders, and each folder can contain any number of subfolders and files. You can use Windows Explorer, also called the folder window, to work with the files and folders on your computer. You can copy, delete, move, find, and sort files, among other things. Or you can transfer files and folders to a USB stick, an external hard drive, a CD, DVD or Blu-Ray disk. In this practical guide we will show you how to use the folder window, and help you arrange your own files.

  3. ENDF/B-5. Fission Product Yields File

    International Nuclear Information System (INIS)

    Schwerer, O.

    1985-10-01

    The ENDF/B-5 Fission Product Yields File contains a complete set of independent and cumulative fission product yields, representing the final data from ENDF/B-5 as received at the IAEA Nuclear Data Section in June 1985. Yields for 11 fissioning nuclides at one or more neutron incident energies are included. The data are available costfree on magnetic tape from the IAEA Nuclear Data Section. (author). 4 refs

  4. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.
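
A sketch of the idea under stated assumptions: each semantically complete record (say, one simulation timestep) becomes its own sub-file, and the user-supplied data-structure description is stored next to it. The file naming and JSON sidecar are illustrative, not the patented format:

```python
import json
import numpy as np

def write_subfiles(records, prefix, description):
    """Write one sub-file per record, each with its semantic description."""
    for i, rec in enumerate(records):
        np.save(f"{prefix}.{i}.npy", rec)                 # the sub-file
        with open(f"{prefix}.{i}.meta.json", "w") as f:   # its semantics
            json.dump({"index": i, "schema": description}, f)

timesteps = [np.random.rand(8, 8) for _ in range(3)]
write_subfiles(timesteps, "run42", {"layout": "8x8 grid", "unit": "K"})
```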

  5. RRB / SSI Interface Checkwriting Integrated Computer Operation Extract File (CHICO)

    Data.gov (United States)

    Social Security Administration — This monthly file provides SSA with information about benefit payments made to railroad retirement beneficiaries. SSA uses this data to verify Supplemental Security...

  6. A technique for integrating remote minicomputers into a general computer's file system

    CERN Document Server

    Russell, R D

    1976-01-01

    This paper describes a simple technique for interfacing remote minicomputers used for real-time data acquisition into the file system of a central computer. Developed as part of the ORION system at CERN, this 'File Manager' subsystem enables a program in the minicomputer to access and manipulate files of any type as if they resided on a storage device attached to the minicomputer. Yet, completely transparent to the program, the files are accessed from disks on the central system via high-speed data links, with response times comparable to local storage devices. (6 refs).

  7. 77 FR 71750 - DSM Nutritional Products; Filing of Food Additive Petition (Animal Use)

    Science.gov (United States)

    2012-12-04

[Docket No. FDA-2012-F-1100] DSM Nutritional Products; Filing of Food Additive Petition (Animal Use). AGENCY: Food and Drug Administration (FDA) … FDA is announcing that DSM Nutritional Products has filed a petition proposing that the food additive … (21 U.S.C. 348(b)(5)); notice is given that a food additive petition (FAP 2273) has been filed by DSM …

  8. 78 FR 77384 - DSM Nutritional Products; Filing of Food Additive Petition (Animal Use)

    Science.gov (United States)

    2013-12-23

[Docket No. FDA-2013-F-1539] DSM Nutritional Products; Filing of Food Additive Petition (Animal Use). AGENCY: Food and Drug Administration (FDA) … FDA is announcing that DSM Nutritional Products has filed a petition proposing that the food additive … (21 U.S.C. 348(b)(5)); notice is given that a food additive petition (FAP 2276) has been filed by DSM …

  9. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    Science.gov (United States)

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004

  10. Processing of evaluated neutron data files in ENDF format on personal computers

    International Nuclear Information System (INIS)

    Vertes, P.

    1991-11-01

    A computer code package - FDMXPC - has been developed for processing evaluated data files in ENDF format. The earlier version of this package is supplemented with modules performing calculations using Reich-Moore and Adler-Adler resonance parameters. The processing of evaluated neutron data files by personal computers requires special programming considerations outlined in this report. The scope of the FDMXPC program system is demonstrated by means of numerical examples. (author). 5 refs, 4 figs, 4 tabs

  11. Computational Intelligence Techniques for New Product Design

    CERN Document Server

    Chan, Kit Yan; Dillon, Tharam S

    2012-01-01

Applying computational intelligence to product design is a fast-growing and promising research area in computer science and industrial engineering. However, there is currently a lack of books that discuss this research area. This book discusses a wide range of computational intelligence techniques for implementation in product design. It covers common issues in product design, from identification of customer requirements, determination of the importance of customer requirements, determination of optimal design attributes, and relating design attributes to customer satisfaction, to integration of marketing aspects into product design, affective product design, and quality control of new products. Approaches for refinement of computational intelligence are discussed in order to address different issues in product design. Case studies of product design, in terms of development of real-world new products, are included in order to illustrate the design procedures, as well as the effectiveness of the com...

  12. PHOBINS: an index file of photon production cross section data and its utility code system

    International Nuclear Information System (INIS)

    Hasegawa, Akira; Koyama, Kinji; Ido, Masaru; Hotta, Masakazu; Miyasaka, Shun-ichi

    1978-08-01

The code system PHOBINS, developed for referencing photon production cross sections, is described in detail. The system is intended to capture the present status of photon production data and to present information on the available data. It consists of four utility routines, CREA, UP-DT, REF and BACK, and data files. These utility routines are used for making an index file of the photon production cross sections, updating the index file, searching the index file and producing a back-up copy of the index file. For the index file of the photon production cross sections, a database system is employed for efficient data management: economical storage, ease of updating and efficient reference. The present report is a reference manual for PHOBINS. (author)
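
The four utilities map naturally onto four operations over an index file; this is a loose modern sketch with a JSON index, not the original file formats:

```python
import json
import shutil

def crea(index_path, records):
    """CREA: create the index file."""
    with open(index_path, "w") as f:
        json.dump(records, f)

def up_dt(index_path, new_records):
    """UP-DT: merge new entries into the index."""
    with open(index_path) as f:
        records = json.load(f)
    records.update(new_records)
    with open(index_path, "w") as f:
        json.dump(records, f)

def ref(index_path, key):
    """REF: look a record up in the index."""
    with open(index_path) as f:
        return json.load(f).get(key)

def back(index_path, backup_path):
    """BACK: produce a back-up copy of the index."""
    shutil.copy(index_path, backup_path)
```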

  13. A digital imaging teaching file by using the internet, HTML and personal computers

    International Nuclear Information System (INIS)

    Chun, Tong Jin; Jeon, Eun Ju; Baek, Ho Gil; Kang, Eun Joo; Baik, Seung Kug; Choi, Han Yong; Kim, Bong Ki

    1996-01-01

A film-based teaching file takes up space, and the need to search through such a file limits the extent to which it is likely to be used. Furthermore, it is not easy for doctors in a medium-sized hospital to experience a variety of cases, so we created an easy-to-use digital imaging teaching file with HTML (Hypertext Markup Language) and images downloaded via World Wide Web (WWW) services on the Internet, suitable for use by computer novices. We used WWW Internet services as a resource for various images and three different IBM-PC compatible computers (386DX, 486DX-II, and Pentium) to download the images and develop the digitized teaching file. These computers were connected to the Internet through a high-speed dial-up modem (28.8 Kbps); Twinsock and Netscape were used to navigate the Internet. A Korean word-processing package (version 3.0) was used to create the HTML files, and the downloaded images were linked to them. In this way, a digital imaging teaching file program was created. Access to a Web service via the Internet required a reasonably fast computer (at least a 486DX-II with 8 MB RAM) for comfortable use; this also ensured that the quality of downloaded images was not degraded during downloading and that they were good enough to use in a teaching file. The time needed to retrieve the text and related images depends on the size of the file, the speed of the network, and the network traffic at the time of connection. For computer novices, a digital image teaching file using HTML is easy to use. Our method of creating a digital imaging teaching file using the Internet and HTML is easy to reproduce, and radiologists with little computer experience who want to study various digital radiologic imaging cases should find it easy to use.
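
The same idea is easy to reproduce today; a minimal sketch that writes one teaching-file case page linking text to locally stored images (all names are invented):

```python
import html

def make_case_page(title, findings, image_paths, out_path):
    imgs = "\n".join(f'<img src="{html.escape(p)}" width="540">'
                     for p in image_paths)
    page = (f"<!DOCTYPE html>\n<html><head><title>{html.escape(title)}"
            f"</title></head>\n<body>\n<h1>{html.escape(title)}</h1>\n"
            f"<p>{html.escape(findings)}</p>\n{imgs}\n</body></html>\n")
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(page)

make_case_page("Case 1: chest CT", "Radiologic findings described here.",
               ["case1_a.png", "case1_b.png"], "case1.html")
```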

  14. Globus File Transfer Services

    Science.gov (United States)

Globus must be installed on the systems at both ends of the data transfer; the NREL endpoint is nrel#globus. Click Login on the Globus website. On the login page, select "Globus ID" as the login method and click Login to the Globus website. From the Manage Data drop-down menu, select Transfer Files. Then click Get …

  15. Cooperative storage of shared files in a parallel computing system with dynamic block size

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
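
A worked sketch of the stated block-size rule (total amount of data divided by the number of processes), with the last rank absorbing any remainder after the exchange step; the numbers are illustrative:

```python
def my_block_range(rank, total_bytes, n_procs):
    """Byte range that process `rank` writes after data exchange."""
    size = total_bytes // n_procs          # dynamically determined block size
    start = rank * size
    end = total_bytes if rank == n_procs - 1 else start + size
    return start, end

# e.g. 10 GiB written by 64 processes -> 160 MiB blocks
print(my_block_range(0, 10 * 2**30, 64))   # (0, 167772160)
print(my_block_range(63, 10 * 2**30, 64))  # last block, includes remainder
```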

  16. Product placement of computer games in cyberspace.

    Science.gov (United States)

    Yang, Heng-Li; Wang, Cheng-Shu

    2008-08-01

Computer games are considered an emerging medium and are even regarded as an advertising channel. Through a three-phase experiment, this study investigated the advertising effectiveness of computer games for different product placement forms, product types, and their combinations. As the statistical results revealed, computer games are appropriate for placement advertising. Additionally, different product types and placement forms produced different advertising effectiveness, and optimum combinations of product types and placement forms existed. An advertisement design model is proposed for use in game design environments. Some suggestions are given for advertisers and game companies respectively.

  17. Computer Forensics Method in Analysis of Files Timestamps in Microsoft Windows Operating System and NTFS File System

    Directory of Open Access Journals (Sweden)

    Vesta Sergeevna Matveeva

    2013-02-01

All existing file browsers display three timestamps for every file in the NTFS file system, and there are now many utilities that can manipulate these temporal attributes to conceal traces of file use. However, every file in NTFS has eight timestamps, stored in its file record, which can be used to detect the substitution of attributes. The authors suggest a method for revealing the original timestamps after replacement, and an automated variant of it for a set of files.
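
One common detection heuristic follows from the abstract: NTFS keeps four timestamps in the $STANDARD_INFORMATION attribute and four more in $FILE_NAME, and typical timestomping utilities rewrite only the first set. A sketch of the comparison, assuming the record has already been parsed out of the MFT (the dictionary layout is an assumption):

```python
def timestomp_suspect(record):
    """Flag files whose $FILE_NAME times postdate $STANDARD_INFORMATION times."""
    si = record["standard_information"]   # {"created": ..., "modified": ...}
    fn = record["file_name"]
    return any(fn[k] > si[k] for k in ("created", "modified"))

record = {"standard_information": {"created": 100, "modified": 200},
          "file_name":            {"created": 150, "modified": 250}}
print(timestomp_suspect(record))  # True -> attributes were likely manipulated
```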

  18. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    International Nuclear Information System (INIS)

Schwarz, Randolph A.; Carter, Leland L.; Schwarz, Alysia L.

    2006-01-01

KENO is a Monte Carlo criticality code that is maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but can also be used for criticality calculations. MCNP is becoming increasingly more popular for performing production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross section database. This manual conversion of input files between the two codes is labor intensive. The industry needs the capability of converting geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package will aid the user in converting between the codes. It is not intended to be used as a "black box". The resulting input file will need to be carefully inspected by criticality safety personnel to verify that the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards.

  19. Survey on Security Issues in File Management in Cloud Computing Environment

    Science.gov (United States)

    Gupta, Udit

    2015-06-01

Cloud computing has pervaded every aspect of information technology in the past decade. It has become easier to process the plethora of data generated by various devices in real time with the advent of cloud networks. The privacy of users' data is maintained by data centers around the world, and hence it has become feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer, either internally within a cloud or externally from one cloud network to another. File management is central to cloud computing, and it is paramount to address the security concerns which arise out of it. This survey paper aims to elucidate the various protocols which can be used for secure file transfer and to analyze the ramifications of using each protocol.

  20. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

For systematic management and easy use of the teaching file in a radiology department, the authors set up a database management system for teaching files using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) including an image capture card (Window Vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8 mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modalities, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 × 390 to 545 × 414, 256 gray levels) and displayed on a 17-inch flat monitor (1024 × 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply on the reading viewbox. The image quality on the computer monitor was lower than that of the original film on the viewbox, but in general the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for the teaching-file system. Without expensive appliances, we could thus implement an image database system for teaching files using a personal computer at relatively low cost.
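
The same record layout is easy to express with the sqlite3 module from the Python standard library; the schema below simply mirrors the fields listed in the abstract, and the sample row is invented:

```python
import sqlite3

con = sqlite3.connect("teaching_file.db")
con.execute("""CREATE TABLE IF NOT EXISTS cases (
    hospital_number TEXT, name TEXT, sex TEXT, age INTEGER,
    exam_date TEXT, keyword TEXT, modality TEXT,
    final_diagnosis TEXT, findings TEXT, refs TEXT, image_path TEXT)""")
con.execute("INSERT INTO cases VALUES (?,?,?,?,?,?,?,?,?,?,?)",
            ("H1234", "anonymized", "F", 54, "1995-03-01", "nodule", "CT",
             "hamartoma", "well-circumscribed mass", "-", "img001.bmp"))
con.commit()
for row in con.execute("SELECT * FROM cases WHERE keyword = ?", ("nodule",)):
    print(row)
```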

  1. Transfer of numeric ASCII data files between Apple and IBM personal computers.

    Science.gov (United States)

    Allan, R W; Bermejo, R; Houben, D

    1986-01-01

    Listings for programs designed to transfer numeric ASCII data files between Apple and IBM personal computers are provided with accompanying descriptions of how the software operates. Details of the hardware used are also given. The programs may be easily adapted for transferring data between other microcomputers.

  2. FISPRO: a simplified computer program for general fission product formation and decay calculations

    International Nuclear Information System (INIS)

    Jiacoletti, R.J.; Bailey, P.G.

    1979-08-01

    This report describes a computer program that solves a general form of the fission product formation and decay equations over given time steps for arbitrary decay chains composed of up to three nuclides. All fission product data and operational history data are input through user-defined input files. The program is very useful in the calculation of fission product activities of specific nuclides for various reactor operational histories and accident consequence calculations
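
For a chain of up to three nuclides with distinct decay constants, the equations the program solves have the closed-form Bateman solution, sketched here with illustrative numbers (only the parent is present at t = 0):

```python
import math

def bateman(t, lambdas, n0=1.0):
    """Atoms of each member of the chain A -> B -> C at time t."""
    out = []
    for i in range(len(lambdas)):
        coeff = n0 * math.prod(lambdas[:i])
        total = sum(
            math.exp(-lambdas[j] * t)
            / math.prod(lambdas[k] - lambdas[j] for k in range(i + 1) if k != j)
            for j in range(i + 1)
        )
        out.append(coeff * total)
    return out

# decay constants in 1/h, evaluated after 10 h
print(bateman(10.0, [0.10, 0.05, 0.01]))
```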

  3. Dimensional quality control of Ti-Ni dental file by optical coordinate metrology and computed tomography

    DEFF Research Database (Denmark)

    Yagüe-Fabra, J.A.; Tosello, Guido; Ontiveros, S.

    2014-01-01

Endodontic dental files usually present complex 3D geometries, which make the complete measurement of the component very challenging with conventional micro metrology tools. Computed Tomography (CT) can represent a suitable alternative to micro metrology tools based on optical and tactile techniques. However, the establishment of CT systems' traceability when measuring complex 3D geometries is still an open issue. In this work, to verify the quality of the CT dimensional measurements, the dental file has been measured both with a μCT system and an optical CMM (OCMM). The uncertainty...

  4. Neutron capture cross-section of fission products in the European activation file EAF-3

    International Nuclear Information System (INIS)

    Kopecky, J.; Delfini, M.G.; Kamp, H.A.J. van der; Gruppelaar, H.; Nierop, D.

    1992-05-01

This paper contains a description of the work performed to extend and revise the neutron capture data in the European Activation File (EAF-3), with emphasis on nuclides in the fission-product mass range. The starting point was the EAF-1 data file from 1989. The present version, EAF/NG-3, contains (n,γ) excitation functions for all nuclides (729 targets) with half-lives exceeding half a day, in the mass range from H-1 to Cm-248. The data file is equipped with a preliminary uncertainty file, which will be improved in the near future. (author). 19 refs.; 5 figs.; 3 tabs

  5. Design of a Control System for Quality Maintenance on Cutting Edges of Files Production

    Directory of Open Access Journals (Sweden)

    E. Seabra

    2000-01-01

The file cutting edges are the most important parameter influencing the performance of the filing operation. Practice shows that the most efficient way of generating these cutting edges is by penetration, by blow, of a cutting tool, which creates a plastic deformation in the file body. The penetration depth is probably the most important factor in the final quality of a file. In existing file-manufacturing machines, this depth is manually adjusted by the operator using a specific mechanism. This means that files are manufactured on an empirical basis, relying on subjective factors that do not allow a constant quality level of production to be kept. In a research work being developed at the University of Minho, it is intended to eliminate these subjectivity factors by evolving the present "all-mechanical" system into a "mechatronic" one. In this paper, which is related to that research work, we present a study of a round-file production machine, regarding the identification and categorisation of the operating parameters that affect cutting-edge production. Those factors that influence the final quality of a round file are also defined and quantified.

  6. Global Navigation Satellite System (GNSS) Final Clock Product (5 minute resolution, daily files, generated weekly) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This derived product set consists of Global Navigation Satellite System Final Satellite and Receiver Clock Product (5-minute granularity, daily files, generated...

  7. 78 FR 19182 - Electronic Filing of Import Inspection Applications for Meat, Poultry, and Egg Products...

    Science.gov (United States)

    2013-03-29

…] Electronic Filing of Import Inspection Applications for Meat, Poultry, and Egg Products: Availability of … and egg products through the Automated Commercial Environment (ACE). ACE is the Web-based portal for … products (21 U.S.C. 620, 466). The Egg Products Inspection Act (EPIA) (21 U.S.C. 1031 et seq.) prohibits …

  8. File management for experiment control parameters within a distributed function computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-10-01

    An attempt to design and implement a computer system for control of and data collection from a set of laboratory experiments reveals that many of the experiments in the set require an extensive collection of parameters for their control. The operation of the experiments can be greatly simplified if a means can be found for storing these parameters between experiments and automatically accessing them as they are required. A subsystem for managing files of such experiment control parameters is discussed. 3 figures

  9. Computer Applications in Production and Engineering

    DEFF Research Database (Denmark)

    Sørensen, Torben

    1997-01-01

This paper addresses how neutral product model interfaces can be identified, specified, and implemented to provide intelligent and flexible means for information management in the manufacturing of discrete mechanical products. The use of advanced computer-based systems, such as CAD, CAE, CNC, and robotics, offers a potential for significant cost savings and quality improvements in the manufacturing of discrete mechanical products. However, these systems are introduced into production as 'islands of automation' or 'islands of information', and to benefit from the said potential, the systems must be integrated … domains; the CA(X) systems are placed in two different domains for design and planning, respectively. A third domain within the CIME architecture comprises the automated equipment on the shop floor.

  10. Evaluation of clinical data in childhood asthma. Application of a computer file system

    International Nuclear Information System (INIS)

    Fife, D.; Twarog, F.J.; Geha, R.S.

    1983-01-01

    A computer file system was used in our pediatric allergy clinic to assess the value of chest roentgenograms and hemoglobin determinations used in the examination of patients and to correlate exposure to pets and forced hot air with the severity of asthma. Among 889 children with asthma, 20.7% had abnormal chest roentgenographic findings, excluding hyperinflation and peribronchial thickening, and 0.7% had abnormal hemoglobin values. Environmental exposure to pets or forced hot air was not associated with increased severity of asthma, as assessed by five measures of outcome: number of medications administered, requirement for corticosteroids, frequency of clinic visits, frequency of emergency room visits, and frequency of hospitalizations

  11. Building Parts Inventory Files Using the AppleWorks Data Base Subprogram and Apple IIe or GS Computers.

    Science.gov (United States)

    Schlenker, Richard M.

    This manual is a "how to" training device for building database files using the AppleWorks program with an Apple IIe or Apple IIGS Computer with Duodisk or two disk drives and an 80-column card. The manual provides step-by-step directions, and includes 25 figures depicting the computer screen at the various stages of the database file…

  12. CINDA 83 (1977-1983). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1983-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data exchanged between four regional neutron data centres. The present issue, CINDA 83, is an index to the literature on neutron data published after 1976. The basic volume, CINDA-A, together with the present issue, contains the full CINDA file as of 1 April 1983. A supplement to CINDA 83 is foreseen for fall 1983. Next year's issue, which is envisaged to be published in June 1984, will again cover all relevant literature that has appeared after 1976

  13. Trust in social computing. The case of peer-to-peer file sharing networks

    Directory of Open Access Journals (Sweden)

    Heng Xu

    2011-09-01

Social computing and online communities are changing the fundamental way people share information and communicate with each other. Social computing focuses on how users may have more autonomy to express their ideas and participate in social exchanges in various ways, one of which may be peer-to-peer (P2P) file sharing. Given the greater risk of opportunistic behavior by malicious or criminal communities in P2P networks, it is crucial to understand the factors that affect an individual's use of P2P file-sharing software. In this paper, we develop and empirically test a research model that includes trust beliefs and perceived risks as two major antecedent beliefs to the usage intention. Six trust antecedents are assessed, including knowledge-based trust, cognitive trust, and both organizational and peer-network factors of institutional trust. Our preliminary results show general support for the model and offer some important implications for software vendors in the P2P sharing industry and regulatory bodies.

  14. 77 FR 20047 - Certain Computer and Computer Peripheral Devices and Components Thereof and Products Containing...

    Science.gov (United States)

    2012-04-03

INTERNATIONAL TRADE COMMISSION [DN 2889] Certain Computer and Computer Peripheral Devices and Components Thereof and Products Containing the Same … importation, and the sale within the United States after importation of certain computer and computer …

  15. ENDF/B-IV fission-product files: summary of major nuclide data

    International Nuclear Information System (INIS)

    England, T.R.; Schenter, R.E.

    1975-09-01

The major fission-product parameters (σ_th, RI, t_1/2, Ē_β, Ē_γ, Ē_α, decay and (n,γ) branching, Q, and AWR) abstracted from ENDF/B-IV files for 824 nuclides are summarized. These data are most often requested by users concerned with reactor design, reactor safety, dose, and other sundry studies. The few known file errors are corrected to date. Tabular data are listed by increasing mass number.

  16. Comparison of canal transportation and centering ability of twisted files, Pathfile-ProTaper system, and stainless steel hand K-files by using computed tomography.

    Science.gov (United States)

    Gergi, Richard; Rjeily, Joe Abou; Sader, Joseph; Naaman, Alfred

    2010-05-01

    The purpose of this study was to compare canal transportation and centering ability of 2 rotary nickel-titanium (NiTi) systems (Twisted Files [TF] and Pathfile-ProTaper [PP]) with conventional stainless steel K-files. Ninety root canals with severe curvature and short radius were selected. Canals were divided randomly into 3 groups of 30 each. After preparation with TF, PP, and stainless steel files, the amount of transportation that occurred was assessed by using computed tomography. Three sections from apical, mid-root, and coronal levels of the canal were recorded. Amount of transportation and centering ability were assessed. The 3 groups were statistically compared with analysis of variance and Tukey honestly significant difference test. Less transportation and better centering ability occurred with TF rotary instruments (P < .0001). K-files showed the highest transportation followed by PP system. PP system showed significant transportation when compared with TF (P < .0001). The TF system was found to be the best for all variables measured in this study. Copyright (c) 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
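
For readers who want to reproduce the statistics named here, SciPy ships both tests; the measurements below are invented placeholders, not the study's data:

```python
from scipy import stats

tf = [0.10, 0.12, 0.09, 0.11, 0.10]   # Twisted Files, transportation (mm)
pp = [0.18, 0.20, 0.17, 0.19, 0.21]   # Pathfile-ProTaper
kf = [0.25, 0.27, 0.24, 0.26, 0.28]   # stainless steel K-files

f, p = stats.f_oneway(tf, pp, kf)            # one-way analysis of variance
print(f"ANOVA: F = {f:.1f}, p = {p:.2g}")
print(stats.tukey_hsd(tf, pp, kf))           # pairwise post-hoc comparisons
```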

  17. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)

    1996-12-01

Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verification impossible in some cases. Therefore verification calculations were focused on a catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. Verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudo-reaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  18. JNDC FP decay data file

    International Nuclear Information System (INIS)

    Yamamoto, Tohru; Akiyama, Masatsugu

    1981-02-01

The decay data file for fission product nuclides (FP DECAY DATA FILE) has been prepared for summation calculations of the decay heat of fission products. The average energies released in β- and γ-transitions have been calculated with the computer code PROFP. The calculated results and necessary information have been arranged in tabular form, together with estimated results for 470 nuclides for which experimental decay data are not available. (author)
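
The summation calculation such a file supports boils down to activity times mean energy per decay, summed over the inventory; a sketch with invented inventory numbers (the Cs-137 constants are approximate):

```python
MEV_TO_J = 1.602176634e-13

def decay_heat(inventory, decay_data):
    """Watts from {nuclide: atoms}, given {nuclide: (lambda, E_beta, E_gamma)}
    with lambda in 1/s and mean energies in MeV."""
    return sum(n * decay_data[nuc][0]
                 * (decay_data[nuc][1] + decay_data[nuc][2]) * MEV_TO_J
               for nuc, n in inventory.items())

decay_data = {"Cs-137": (7.3e-10, 0.187, 0.0)}  # approximate constants
print(decay_heat({"Cs-137": 1.0e20}, decay_data), "W")
```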

  19. Yankee links computing needs, increases productivity

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

Yankee Atomic Electric Company provides design and consultation services to electric utility companies that operate nuclear power plants. This means bringing together the skills and talents of more than 500 people in many disciplines, including computer-aided design, human resources, financial services, and nuclear engineering. The company was facing a problem familiar to many companies in the nuclear industry. Key corporate data and applications resided on UNIX or other types of computer systems, but most users at Yankee had personal computers on their desks. How could Yankee enable the PC users to share the data, applications, and resources of the larger computing environment such as UNIX, while ensuring they could still use their favorite PC applications? The solution was PC-NFS from SunSoft, of Chelmsford, Mass., which links PCs to UNIX and other systems. The Yankee computing story is an example of computer downsizing: the trend of moving away from mainframe computers in favor of lower-cost, more flexible client/server computing. Today, Yankee Atomic has more than 350 PCs on desktops throughout the company, using PC-NFS, which enables them to use the data, applications, disks, and printers of the UNIX server systems. This new client/server environment has reduced Yankee's computing costs while increasing its computing power and its ability to respond to customers.

  20. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to Read in Computer Aided Design (CAD) Files

    International Nuclear Information System (INIS)

    Randolph Schwarz; Leland L. Carter; Alysia Schwarz

    2005-01-01

    Monte Carlo N-Particle Transport Code (MCNP) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle is internationally recognized as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant was used to enhance the capabilities of the MCNP Visual Editor to allow it to read in both 2D and 3D Computer Aided Design (CAD) files, allowing the user to electronically generate a valid MCNP input geometry

  1. As-Built design specification for the CLASFYT program. [production of classification files - crop inventory]

    Science.gov (United States)

    Horton, C. L. (Principal Investigator)

    1981-01-01

The CLASFYT program is described in detail. The program produces a one-channel universal-formatted classification file. Trajectory coefficients and a composite set of tolerance values are calculated from five acquisitions of radiance values in each of the training fields corresponding to up to ten agricultural products. These coefficients and tolerance values are used to classify each pixel in the test field of the same segment as the same agricultural product as one of the training fields, as none of the products, or as a screened pixel.

  2. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to read in Computer Aided Design (CAD) files

    International Nuclear Information System (INIS)

    Schwarz, Randy A.; Carter, Leeland L.

    2004-01-01

    Monte Carlo N-Particle Transport Code (MCNP) (Reference 1) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle (References 2 to 11) is recognized internationally as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant enhanced the capabilities of the MCNP Visual Editor to allow it to read in a 2D Computer Aided Design (CAD) file, allowing the user to modify and view the 2D CAD file and then electronically generate a valid MCNP input geometry with a user specified axial extent

  3. CIM [computer-integrated manufacturing]: It all starts with product definition

    International Nuclear Information System (INIS)

    Stephens, A.E.

    1986-01-01

The logical starting place for computer-integrated manufacturing (CIM) is at the front end of the production process - product definition. It consists of the part/assembly drawings, material lists, specifications, and procedures. Product definition starts at the design agencies: two nuclear design laboratories (Los Alamos National Laboratory and Lawrence Livermore National Laboratory) and a non-nuclear design laboratory (Sandia National Laboratories with two site locations). These laboratories perform the basic part design, which is then transferred over a secure communications network to the Oak Ridge Y-12 Plant, where weapon components are produced by Martin Marietta Energy Systems, Inc., under contract with the Department of Energy (DOE). Initial Graphics Exchange Specification (IGES) and DOE Data Exchange Format (DOEDEF) translation software is used to transfer part designs between dissimilar graphics systems. Product-definition data flow is examined both externally and internally to the Y-12 Plant. Software developed specifically to computerize product definition is covered as follows: Electronic File Manager (EFM), Manage Design Documents, Distribute Product Definition, Manage Manufacturing Procedures and Product Specifications. Trident II is the first program to beneficially use CIM technologies plant-wide. Prototype software was written to add a layer of user friendliness through multilayer menu selects to enable access to a number of existing application software packages. Additional software was developed and purchased that enables a single personal computer to meet many needs. These product-definition needs include procedures generation, graphics viewing, and office automation. 3 figs

  4. Geothermal-energy files in computer storage: sites, cities, and industries

    Energy Technology Data Exchange (ETDEWEB)

O'Dea, P.L.

    1981-12-01

    The site, city, and industrial files are described. The data presented are from the hydrothermal site file containing about three thousand records which describe some of the principal physical features of hydrothermal resources in the United States. Data elements include: latitude, longitude, township, range, section, surface temperature, subsurface temperature, the field potential, and well depth for commercialization. (MHR)

  5. Evaluation of Single File Systems Reciproc, OneShape, and WaveOne using Cone Beam Computed Tomography - An In Vitro Study.

    Science.gov (United States)

    Dhingra, Annil; Ruhal, Nidhi; Miglani, Anjali

    2015-04-01

Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is equally important; otherwise, canal aberrations such as transportation will be created. The purpose of this study was to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation in first mandibular molars, using cone beam computed tomography. Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial Surgery. Teeth were prepared using one rotary and two reciprocating single-file systems and were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were done and evaluated for three parameters: canal transportation, cervical dentinal thickness and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. The change in cross-sectional area after filing showed a significant difference at 0 mm, 1 mm, 2 mm and 7 mm. Evaluating each file system over a distance of 7 mm (starting from 0 mm, with measurements at 1 mm, 2 mm, 3 mm, 5 mm and 7 mm), the results showed a significant difference among the file systems at various lengths (p = 0.014, 0.046, 0.004, 0.028, 0.005 and 0.029, respectively). The mean value of cervical dentin removal was highest at all levels for OneShape and lowest for WaveOne, indicating the better performance of WaveOne and Reciproc over the OneShape file system. A significant difference was found at 9 mm, 11 mm and 12 mm between all three file systems (p < 0.001). It was concluded that reciprocating motion is better than rotary motion for all three parameters: canal transportation, cross-sectional area and cervical dentinal thickness.

  6. Generation of Gaussian 09 Input Files for the Computation of 1H and 13C NMR Chemical Shifts of Structures from a Spartan’14 Conformational Search

    OpenAIRE

    sprotocols

    2014-01-01

    Authors: Spencer Reisbick & Patrick Willoughby. Abstract: This protocol describes an approach to preparing a series of Gaussian 09 computational input files for an ensemble of conformers generated in Spartan'14. The resulting input files are necessary for computing optimum geometries, relative conformer energies, and NMR shielding tensors using Gaussian. Using the conformational search feature within Spartan'14, an ensemble of conformational isomers was obtained. To convert the str...
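
    As a minimal sketch of the kind of generator the protocol describes, the following Python function writes a Gaussian 09 input (.gjf) file for one conformer exported as element/coordinate tuples. The link0 lines and the route section (B3LYP/6-31G(d), Opt, NMR=GIAO) are illustrative choices, not necessarily the protocol's exact keywords.

```python
def write_gaussian_nmr_input(path, title, atoms, charge=0, multiplicity=1):
    """Write a minimal Gaussian 09 input file (.gjf) for a GIAO NMR job.

    `atoms` is a list of (element, x, y, z) tuples, e.g. exported from a
    Spartan'14 conformer.  The method/basis shown is only an example."""
    with open(path, "w") as f:
        f.write("%nprocshared=4\n%mem=4GB\n")
        f.write("# B3LYP/6-31G(d) Opt NMR=GIAO\n\n")   # route section
        f.write(title + "\n\n")
        f.write(f"{charge} {multiplicity}\n")
        for el, x, y, z in atoms:
            f.write(f"{el:<2} {x:>12.6f} {y:>12.6f} {z:>12.6f}\n")
        f.write("\n")                                   # Gaussian needs a final blank line

# one input file per conformer in the ensemble
write_gaussian_nmr_input("conf01.gjf", "conformer 1",
                         [("O", 0.0, 0.0, 0.0), ("H", 0.96, 0.0, 0.0),
                          ("H", -0.24, 0.93, 0.0)])
```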

  7. JENDL special purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1995-01-01

    In JENDL-3.2, data on all reactions having significant cross sections over the neutron energy range from 0.01 meV to 20 MeV are given for 340 nuclides. The range of application extends widely, covering neutron engineering, shielding and other aspects of fast reactors, thermal neutron reactors and nuclear fusion reactors. This is a general purpose data file. In contrast, a file in which only the data required for a specific application field are collected is called a special purpose file; the file for dosimetry is a typical example. The Nuclear Data Center, Japan Atomic Energy Research Institute, is preparing ten kinds of JENDL special purpose files. The files, for which the working groups of the Sigma Committee are responsible, are listed. As to the format of the files, the ENDF format is used, as in JENDL-3.2. The dosimetry file, activation cross section file, (α,n) reaction data file, fusion file, actinoid file, high energy data file, photonuclear data file, PKA/KERMA file, gas production cross section file and decay data file are described in terms of their contents, course of development and verification. The dosimetry file and the gas production cross section file have already been completed; for the others, the expected time of completion is shown. When these files are completed, they will be opened to the public. (K.I.)
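
    Since the special purpose files share the ENDF format with JENDL-3.2, a brief illustration of that format may help: each 80-column record carries six 11-character data fields followed by MAT/MF/MT identifiers, and ENDF floating-point numbers omit the 'e' of the exponent. The helpers below sketch these two conventions only; they are not a full ENDF reader.

```python
import re

def endf_float(field: str) -> float:
    """Convert an ENDF-style number such as ' 2.605600+4' (no 'e') to a float."""
    s = field.strip()
    if not s:
        return 0.0
    # restore the omitted exponent marker: 2.605600+4 -> 2.605600e+4
    return float(re.sub(r'(?<=[0-9])([+-])(?=[0-9]+$)', r'e\1', s))

def parse_endf_line(line: str):
    """Split one 80-column ENDF record into six 11-character data fields
    plus the MAT/MF/MT identifiers."""
    fields = [line[i * 11:(i + 1) * 11] for i in range(6)]
    return fields, int(line[66:70]), int(line[70:72]), int(line[72:75])

demo = " 2.605600+4" + " 5.545400+1" + "0".rjust(11) * 4 + "2625" + " 3" + "  1" + "    1"
fields, mat, mf, mt = parse_endf_line(demo)
print(endf_float(fields[0]), mat, mf, mt)   # 26056.0 2625 3 1
```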

  8. Analytical calculation of heavy quarkonia production processes in computer

    International Nuclear Information System (INIS)

    Braguta, V V; Likhoded, A K; Luchinsky, A V; Poslavsky, S V

    2014-01-01

    This report is devoted to the analytical calculation, on a computer, of heavy quarkonia production processes in modern experiments such as the LHC, B-factories and super-B factories. The theoretical description of heavy quarkonia is based on the factorization theorem. This theorem leads to a special structure of the production amplitudes, which can be exploited to develop a computer algorithm that calculates these amplitudes automatically. The report describes this algorithm. As an example of its application, we present the results of the calculation of double charmonia production in bottomonia decays and inclusive χcJ meson production in pp collisions

  9. Computational system to create an entry file for replicating I-125 seeds simulating brachytherapy case studies using the MCNPX code

    Directory of Open Access Journals (Sweden)

    Leonardo da Silva Boia

    2014-03-01

    Purpose: A computational system was developed, in the C++ programming language, to create a 125I radioactive seed entry file based on the positioning of a virtual grid (template) in voxel geometries, with the purpose of performing prostate cancer treatment simulations using the MCNPX code. Methods: The system is fed with information from the planning system regarding each seed's location and depth, and an entry file is automatically created with all the cards (instructions) for each seed, with their cell blocks and surfaces spread out spatially in the 3D environment. The system precisely reproduces the clinical scenario in the MCNPX code's simulation environment, thereby allowing in-depth study of the technique. Results and Conclusion: In order to validate the computational system, an entry file was created with 88 125I seeds inserted in the prostate region of the MAX06 phantom, with the initial activity of the seeds set to 0.27 mCi. Isodose curves were obtained in all the prostate slices in 5 mm steps over the 7 to 10 cm interval, totaling 7 slices. Variance reduction techniques were applied in order to optimize computational time and reduce uncertainties, such as photon and electron energy cutoffs at 4 keV and forced collisions in cells of interest. The isodose curves obtained show that hot spots have values above 300 Gy, as anticipated in the literature, stressing the importance of correct positioning of the sources, which the developed computational system provides, in order not to deliver excessive doses to adjacent organs at risk. The 144 Gy prescription curve showed in the validation process that it covers a large percentage of the volume perfectly, at the same time that it demonstrates a large
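
    The paper's generator was written in C++; the Python fragment below sketches the same idea under strong simplifications. Each seed gets one surface card and one cell card; the single-sphere geometry, material number, and density are our placeholders, so the cards are illustrative rather than a reproduction of the actual MCNPX entry file.

```python
def seed_cards(positions, first_id=100, radius=0.04):
    """Emit one surface card and one cell card per I-125 seed placed at a
    template grid position (cm).  Real seeds are multi-region cylinders;
    a single sphere per seed (surface mnemonic S) with an assumed material
    1 at density 4.9 g/cm3 is used only to show the automatic generation idea."""
    surfaces, cells = [], []
    for n, (x, y, z) in enumerate(positions):
        sid = first_id + n
        surfaces.append(f"{sid} S {x:.3f} {y:.3f} {z:.3f} {radius:.3f}")
        cells.append(f"{sid} 1 -4.9 -{sid} imp:p=1")   # region inside the sphere
    return surfaces, cells

surfaces, cells = seed_cards([(0.5, 0.5, 7.0), (1.0, 0.5, 7.0)])
print("\n".join(cells + [""] + surfaces))
```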

  10. Productivity associated with visual status of computer users.

    Science.gov (United States)

    Daum, Kent M; Clore, Katherine A; Simms, Suzanne S; Vesely, Jon W; Wilczek, Dawn D; Spittle, Brian M; Good, Greg W

    2004-01-01

    The aim of this project was to examine the potential connection between the astigmatic refractive corrections of subjects using computers and their productivity and comfort. We hypothesized that improving the visual status of subjects using computers results in greater productivity, as well as improved visual comfort. Inclusion criteria required subjects 19 to 30 years of age with complete vision examinations before being enrolled. Using a double-masked, placebo-controlled, randomized design, subjects completed three experimental tasks designed to assess the effects of refractive error on productivity (time to completion and the number of errors) at a computer. The tasks resembled those commonly undertaken by computer users and involved visual search tasks of: (1) counties and populations; (2) a nonsense word search; and (3) a modified text-editing task. Estimates of the productivity effect on time to completion varied from a minimum of 2.5% up to 28.7% with 2 D of cylindrical miscorrection. Assuming a conservative estimate of an overall 2.5% increase in productivity with appropriate astigmatic refractive correction, our data suggest a favorable cost-benefit ratio of at least 2.3 for the visual correction of an employee (total cost 268 dollars) with a salary of 25,000 dollars per year. We conclude that astigmatic refractive error affected both productivity and visual comfort under the conditions of this experiment. These data also suggest a favorable cost-benefit ratio for employers who provide computer-specific eyewear to their employees.
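
    The quoted cost-benefit ratio of 2.3 follows directly from the abstract's own figures, a 2.5% productivity gain on a 25,000-dollar salary against a 268-dollar correction cost:

```latex
0.025 \times \$25{,}000 = \$625 \text{ per year}, \qquad
\frac{\$625}{\$268} \approx 2.3
```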

  11. Method and computer program product for maintenance and modernization backlogging

    Science.gov (United States)

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
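
    The claimed calculation is a plain sum of the three period-specific terms, which the following sketch restates; the function and argument names are ours, and the example figures are invented for illustration.

```python
def future_facility_conditions(maintenance_cost: float,
                               modernization_factor: float,
                               backlog_factor: float) -> float:
    """Future facility conditions for one time period, as defined in the
    abstract: the sum of the three period-specific terms."""
    return maintenance_cost + modernization_factor + backlog_factor

# illustrative figures only: $1.2M maintenance, $0.4M modernization, $0.3M backlog
print(future_facility_conditions(1.2e6, 0.4e6, 0.3e6))   # 1900000.0
```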

  12. SHIPBUILDING PRODUCTION PROCESS DESIGN METHODOLOGY USING COMPUTER SIMULATION

    OpenAIRE

    Marko Hadjina; Nikša Fafandjel; Tin Matulja

    2015-01-01

    In this research, a shipbuilding production process design methodology using computer simulation is suggested. The suggested methodology is expected to provide a better and more efficient tool for the design of complex shipbuilding production processes. In the first part of this research, existing practice for production process design in shipbuilding is discussed, and its shortcomings and problems are emphasized. In continuation, discrete event simulation modelling method, as basis of sugge...

  13. [Computer-assisted prescription of labile blood products: What are we expecting?].

    Science.gov (United States)

    Daurat, G

    2016-11-01

    Computer-assisted prescription of labile blood products is just beginning. Current programs already allow automatic embedding of data such as patient and prescriber identification or ward details to produce readable prescriptions that also comply with part of the Good Practice guidelines. Prescriptions can now also be sent electronically to the Etablissement Francais du Sang, the French blood products service. These are usually computer programs specialised in transfusion and interfaced with the main patient-file software; the main software is rarely able to manage transfusion itself. The next step would consist in performing checks and calculations, or displaying warning or help messages based on academic or local medical recommendations, or even tailored to predefined individual requirements. But these call for direct access to patient data such as diagnoses or test results, which must be accurately classified and coded before use. The main software could provide such functionality, but in practice that would be infrequent and difficult to transpose from one hospital to another, given the diversity of main software packages and their settings. Another solution would be to enhance the very few transfusion-specialised programs in order to assist prescribers. Data could be prepared and sent by the main software according to a standardised format each time a prescription is to be entered. This standardised format should be independent of the software in order to ensure interoperability, whatever the main and specialised programs. The content and format of this data exchange remain to be defined, but this would allow hundreds of hospitals to provide a comprehensive tool for the prescription of labile blood products, regardless of their main patient-file software. Copyright © 2016. Published by Elsevier SAS.
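
    The standardised, software-independent exchange format the article calls for is not yet defined; purely as a thought experiment, the context message sent by the main patient-file software might look like the following, where every field name is hypothetical.

```python
import json

# Every field name below is hypothetical; no published standard is implied.
prescription_context = {
    "patient": {"id": "123456", "ward": "HEM-2"},
    "prescriber": {"id": "DR-042"},
    "diagnosis_codes": ["D61.9"],                      # coded, as the article requires
    "lab_results": {"hemoglobin_g_dl": 7.2, "platelets_g_l": 14},
    "requested_product": {"type": "RBC", "units": 2},
}
print(json.dumps(prescription_context, indent=2))
```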

  14. OK, Computer: File Sharing, the Music Industry, and Why We Need the Pirate Party

    Directory of Open Access Journals (Sweden)

    Adrian Cosstick

    2009-03-01

    The Pirate Party believes the state and big business are in the process of protecting stale and inefficient business models for their own monetary benefit by limiting our right to share information. The Pirate Party suggests that they are achieving this goal through the amendment of intellectual property legislation. At the dawn of the digital era, the Pirate Party contends that governments and multinational corporations are using intellectual property to: crack down on file sharing, which limits the ability to share knowledge and information; increase the terms and length of copyright to raise profits; and build code into music files which limits their ability to be shared (Pirate Party, 2009). There are a number of ‘copyright industries’ affected by these issues, none more so than the music industry. Its relationship with file sharing is topical and makes an excellent case study for addressing the impact big business has had on intellectual property and the need for the Pirate Party’s legislative input. The essay then examines the central issues raised by illegal file sharing: in particular, the future for record companies in an environment that increasingly demands flexibility, and whether the Pirate Party’s proposal is a viable solution to the music industry’s problems.

  15. Computational chemical product design problems under property uncertainties

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Cignitti, Stefano; Abildskov, Jens

    2017-01-01

    Three different strategies for combining computational chemical product design with Monte Carlo based methods for uncertainty analysis of chemical properties are outlined. One method consists of a computer-aided molecular design (CAMD) solution and a post-processing property uncertainty...... fluid design. While the higher end of the uncertainty range of the process-model output is similar for the best performing fluids, the lower end of the uncertainty range differs largely.
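
    A minimal sketch of the post-processing strategy described above: the CAMD solution is held fixed while the uncertain property is sampled, and the spread of the process-model output is read off the sampled distribution. The stand-in process model and all numbers below are assumptions for illustration.

```python
import random

def process_output(property_value: float) -> float:
    """Stand-in for the process model; any smooth response would do here."""
    return 100.0 / property_value

# Sample the uncertain property, push each draw through the model,
# and read off a 95% range of the output.
samples = [random.gauss(2.0, 0.1) for _ in range(10_000)]
outputs = sorted(process_output(p) for p in samples)
lo, hi = outputs[int(0.025 * len(outputs))], outputs[int(0.975 * len(outputs))]
print(f"95% range of process-model output: [{lo:.2f}, {hi:.2f}]")
```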

  16. Comparative evaluation of effect of rotary and reciprocating single-file systems on pericervical dentin: A cone-beam computed tomography study.

    Science.gov (United States)

    Zinge, Priyanka Ramdas; Patil, Jayaprakash

    2017-01-01

    The aim of this study was to evaluate and compare the effect of the OneShape and Neolix rotary single-file systems and the WaveOne and Reciproc reciprocating single-file systems on pericervical dentin (PCD) using cone-beam computed tomography (CBCT). A total of 40 freshly extracted mandibular premolars were collected and divided into two groups, namely, Group A - Rotary: A1 - Neolix and A2 - OneShape, and Group B - Reciprocating: B1 - WaveOne and B2 - Reciproc. Preoperative scans of each tooth were taken, followed by conventional access cavity preparation and working length determination with a #10 K-file. Instrumentation of the canal was done according to the respective file system, and postinstrumentation CBCT scans of the teeth were obtained. 90 μm thick slices were obtained 4 mm apical and coronal to the cementoenamel junction. The PCD thickness was calculated as the shortest distance from the canal outline to the closest adjacent root surface, measured on four surfaces (facial, lingual, mesial, and distal) for all groups in the two scans. There was no significant difference between the rotary and reciprocating single-file systems in their effect on PCD, but in Group B2 there was the most significant loss of tooth structure on the mesial, lingual, and distal surfaces (P < 0.05). The Reciproc file system thus removes more PCD than the other experimental groups, whereas the Neolix single-file system had the least effect on PCD.

  17. CMS Monte Carlo production in the WLCG computing grid

    International Nuclear Information System (INIS)

    Hernandez, J M; Kreuzer, P; Hof, C; Khomitch, A; Mohapatra, A; Filippis, N D; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Weirdt, S D; Maes, J; Mulders, P v; Villella, I; Wakefield, S; Guan, W; Fanfani, A; Evans, D; Flossdorf, A

    2008-01-01

    Monte Carlo production in CMS has received a major boost in performance and scale since the CHEP06 conference. The production system has been re-engineered in order to incorporate the experience gained in running the previous system and to integrate production with the new CMS event data model, data management system and data processing framework. The system is interfaced to the two major computing Grids used by CMS, the LHC Computing Grid (LCG) and the Open Science Grid (OSG). Operational experience and integration aspects of the new CMS Monte Carlo production system are presented, together with an analysis of production statistics. The new system automatically handles job submission, resource monitoring, job queuing, job distribution according to the available resources, data merging, registration of data into the data bookkeeping, data location, data transfer and placement systems. Compared to the previous production system, automation, reliability and performance have been considerably improved. A more efficient use of computing resources and a better handling of the inherent Grid unreliability have resulted in an increase of production scale by about an order of magnitude, capable of running on the order of ten thousand jobs in parallel and yielding more than two million events per day

  18. ERX: a software for editing files containing X-ray spectra to be used in exposure computational models

    International Nuclear Information System (INIS)

    Cabral, Manuela O.M.; Vieira, Jose W.; Silva, Alysson G.; Leal Neto, Viriato; Oliveira, Alex C.H.; Lima, Fernando R.A.

    2011-01-01

    Exposure Computational Models (ECMs) are utilities that simulate situations in which irradiation occurs in a given environment. An ECM is composed primarily of an anthropomorphic model (phantom) and a Monte Carlo code (MC). This paper presents a tutorial of the software Espectro de Raios-X (ERX). This software performs reading and numerical and graphical analysis of text files containing diagnostic X-ray spectra, for use in the radioactive-source algorithms of the ECMs of the Grupo de Dosimetria Numerica. The ERX allows the user to select one among several X-ray spectra in the energy range most commonly used in diagnostic radiology clinics. In the current version of the ERX there are two types of input files: those contained in the mspectra.dat file and those resulting from MC simulations in Geant4. The software allows the construction of charts of the Probability Density Function (PDF) and Cumulative Distribution Function (CDF) of a selected spectrum, as well as a table with the values of these functions and the spectrum. In addition, the ERX allows the user to make comparative analyses between the PDF graphics of the two catalogs of spectra available, and it can also perform dosimetric evaluations with the selected spectrum. A software of this kind is an important computational tool for researchers in numerical dosimetry because of the diversity of diagnostic radiology X-ray machines, which implies a highly diverse mass of input data. Because of this, the ERX gives the group independence regarding the origin of the data contained in the catalogs created, without the need to resort to others. (author)
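
    The PDF/CDF tabulation that ERX performs for a selected spectrum amounts to normalising the counts and accumulating them; a small re-implementation of that step (ours, not ERX's code) is shown below.

```python
def pdf_cdf(counts):
    """Normalise a discrete spectrum to a PDF and accumulate its CDF."""
    total = float(sum(counts))
    pdf = [c / total for c in counts]
    cdf, running = [], 0.0
    for p in pdf:
        running += p
        cdf.append(running)
    return pdf, cdf

pdf, cdf = pdf_cdf([5, 40, 35, 20])   # counts per energy bin
print(pdf, cdf[-1])                   # cdf[-1] is 1.0 by construction
```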

  19. F2AC: A Lightweight, Fine-Grained, and Flexible Access Control Scheme for File Storage in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ren

    2016-01-01

    Current file storage service models for cloud servers assume that users either belong to a single layer with different privileges or cannot authorize privileges iteratively. Thus, the access control is neither fine-grained nor flexible. Besides, most access control methods at cloud servers rely mainly on computationally intensive cryptographic algorithms and, especially, may not be able to support highly dynamic ad hoc groups with addition and removal of group members. In this paper, we propose a scheme called F2AC, which is a lightweight, fine-grained, and flexible access control scheme for file storage in mobile cloud computing. F2AC can not only achieve iterative authorization, authentication with tailored policies, and access control for dynamically changing accessing groups, but also provide access privilege transition and revocation. A new access control model, called the directed tree with linked leaf model, is proposed for further implementation in data structures and algorithms. An extensive analysis is given justifying the soundness and completeness of F2AC.
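
    The paper does not give code for the directed tree with linked leaf model; the sketch below only loosely mirrors the iterative-authorization idea (a grantee may delegate, but never beyond its own privileges) and should not be read as the actual F2AC construction.

```python
class Node:
    """One grantee in a directed authorization tree: a user authorized by its
    parent may itself authorize children (iterative authorization).  Leaves
    could additionally be linked so a whole accessing group can be walked and
    revoked quickly; that linkage is omitted here for brevity."""
    def __init__(self, name, privileges):
        self.name, self.privileges = name, set(privileges)
        self.children = []

    def delegate(self, name, privileges):
        # a child can never hold more rights than its parent
        child = Node(name, self.privileges & set(privileges))
        self.children.append(child)
        return child

owner = Node("owner", {"read", "write", "delegate"})
friend = owner.delegate("friend", {"read", "delegate"})
guest = friend.delegate("guest", {"read", "write"})   # 'write' silently dropped
print(guest.privileges)                               # {'read'}
```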

  20. Empowering file-based radio production through media asset management systems

    Science.gov (United States)

    Muylaert, Bjorn; Beckers, Tom

    2006-10-01

    In recent years, IT-based production and archiving of media has matured to a level which enables broadcasters to switch from tape- or CD-based to file-based workflows for the production of their radio and television programs. This technology is essential for the future of broadcasters, as it provides the flexibility and speed of execution the customer demands by enabling, among others, concurrent access and production, faster than real-time ingest, edit during ingest, centrally managed annotation, and quality preservation of media. In terms of automation of program production, the radio department is the most advanced within the VRT, the Flemish broadcaster. For several years now, the radio department has been working with digital equipment and producing its programs mainly on standard IT equipment. Historically, the shift from analogue to digital production was a step-by-step process initiated and coordinated by each radio station separately, resulting in a multitude of tools and metadata collections, some of them developed in-house and lacking integration. To make matters worse, each of those stations adopted a slightly different production methodology. The planned introduction of a company-wide Media Asset Management System allows a coordinated overhaul towards a unified production architecture. Benefits include the centralized ingest and annotation of audio material and the uniform, integrated (in terms of IT infrastructure) workflow model. Needless to say, the ingest strategy, metadata management and integration with radio production systems play a major role in the level of success of any improvement effort. This paper presents a data model for audio-specific concepts relevant to radio production. It includes an investigation of ingest techniques and strategies. Cooperation with external, professional production tools is demonstrated through a use-case scenario: the integration of an existing, multi-track editing tool with a commercially available

  1. Parallel distributed computing in modeling of the nanomaterials production technologies

    NARCIS (Netherlands)

    Krzhizhanovskaya, V.V.; Korkhov, V.V.; Zatevakhin, M.A.; Gorbachev, Y.E.

    2008-01-01

    Simulation of the physical and chemical processes occurring in nanomaterial production technologies is a computationally challenging problem, due to the great number of coupled processes and the time and length scales to be taken into account. To solve such complex problems with a good level of detail in a

  2. A Computer Program to Evaluate Timber Production Investments Under Uncertainty

    Science.gov (United States)

    Dennis L. Schweitzer

    1968-01-01

    A computer program has been written in Fortran IV to calculate probability distributions of present worths of investments in timber production. Inputs can include both point and probabilistic estimates of future costs, prices, and yields. Distributions of rates of return can also be constructed.
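
    The original program was Fortran IV; the following Python sketch shows the same idea of building a probability distribution of present worths from probabilistic cost and yield inputs. The discounting formula is standard; the input distributions are invented for illustration.

```python
import random

def present_worth(cost0, annual_net, years, rate):
    """Discounted present worth of a simple timber investment."""
    return -cost0 + sum(annual_net / (1 + rate) ** t for t in range(1, years + 1))

# Draw probabilistic cost/return estimates and accumulate the distribution.
draws = sorted(present_worth(cost0=random.uniform(900, 1100),
                             annual_net=random.gauss(120, 20),
                             years=30, rate=0.05)
               for _ in range(10_000))
print("median present worth:", round(draws[len(draws) // 2], 2))
```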

  3. The Hofmethode: Computing Semantic Similarities between E-Learning Products

    Directory of Open Access Journals (Sweden)

    Oliver Michel

    2009-11-01

    The key task in building useful e-learning repositories is to develop a system with an algorithm allowing users to retrieve information that corresponds to their specific requirements. To achieve this, products (or their verbal descriptions, i.e. as presented in metadata) need to be compared and structured according to the results of this comparison. Such structuring is crucial insofar as there are many search results that correspond to the entered keyword. The Hofmethode is an algorithm (based on psychological considerations) to compute semantic similarities between texts, and therefore offers a way to compare e-learning products. The computed similarity values are used to build semantic maps in which the products are visually arranged according to their similarities. The paper describes how the Hofmethode is implemented in the online database edulap, and how it helps the user to explore the data in which he is interested.
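
    The published Hofmethode is psychologically motivated and its details are not reproduced here; the generic bag-of-words cosine similarity below only illustrates the underlying step of turning two metadata texts into one similarity value for the semantic map.

```python
from collections import Counter
from math import sqrt

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Generic bag-of-words cosine similarity between two product
    descriptions -- an illustration of 'compare metadata texts, get a
    similarity value', not the Hofmethode itself."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

print(cosine_similarity("interactive statistics course", "statistics e-learning course"))
```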

  4. Micro computed tomography evaluation of the Self-adjusting file and ProTaper Universal system on curved mandibular molars.

    Science.gov (United States)

    Serefoglu, Burcu; Piskin, Beyser

    2017-09-26

    The aim of this investigation was to compare the cleaning and shaping efficiency of the Self-adjusting File and ProTaper, and to assess the correlation between root canal curvature and working time in mandibular molars, using micro-computed tomography. Twenty extracted mandibular molars were instrumented with ProTaper and the Self-adjusting File, and the total working time was measured in the mesial canals. The changes in canal volume, surface area and structure model index, transportation, uninstrumented area, and the correlation between working time and curvature were analyzed. Although no statistically significant difference was observed between the two systems in the distal canals (p > 0.05), a significantly larger removed dentin volume and a lower uninstrumented area were produced by ProTaper in the mesial canals (p < 0.0001). A correlation between working time and canal curvature was also observed in the mesial canals for both groups (SAF: r² = 0.792, p < 0.0004; PTU: r² = 0.9098, p < 0.0001).

  5. Towards a Tool for Computer Supported Structuring of Products

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp

    1997-01-01

    However, a product possesses not only a component structure but also various organ structures which are superimposed on the component structure. The organ structures carry behaviour and make the product suited for its life phases. Our long-term research goal is to develop a computer-based system...... that is capable of supporting synthesis activities in engineering design, and thereby also support handling of various organ structures. Such a system must contain a product model, in which it is possible to describe and manipulate both various organ structures and the component structure. In this paper we focus...... on the relationships between organ structures and the component structure. By an analysis of an existing product it is shown that a component may contribute to more than one organ. A set of organ structures is identified and their influence on the component structure is illustrated....

  6. Computer integration of engineering design and production: A national opportunity

    Science.gov (United States)

    1984-01-01

    The National Aeronautics and Space Administration (NASA), as a purchaser of a variety of manufactured products, including complex space vehicles and systems, clearly has a stake in the advantages of computer-integrated manufacturing (CIM). Two major NASA objectives are to launch a Manned Space Station by 1992 with a budget of $8 billion, and to be a leader in the development and application of productivity-enhancing technology. At the request of NASA, a National Research Council committee visited five companies that have been leaders in using CIM. Based on these case studies, technical, organizational, and financial issues that influence computer integration are described, guidelines for its implementation in industry are offered, and the use of CIM to manage the space station program is recommended.

  7. The computer-aided design of rubber-metal products

    Directory of Open Access Journals (Sweden)

    Pavlo S. Shvets

    2015-12-01

    The important problem in the design of rubber-metal products is the optimization of their mass while keeping the proportionality factor within the limits of the standard. Aim: The aim of this work is to improve computer-aided design systems by developing and implementing an improved optimization method, based on reverse optimization, in rubber-metal CAD systems for designers. Materials and Methods: The paper studies computer-aided structural design of technical composite products composed of anisotropic materials with essentially different properties. Results: The structure of a CAD system for designers solving such design problems is proposed, and the working principles of its subsystems are described. It is shown that the optimization of complicated systems in CAD must consider as constraints the essential connections between separate elements of these systems within the range of the optimization arguments. Conclusions: The problem of "reverse" optimization, in which the objective functions are the connectivity-area parameters, is considered. In many cases this allows obtaining more effective solutions during the computer-aided design process. The developed CAD system for designers was used during the production of rubber-metal shock absorbers at the Odessa Rubber Technical Articles Plant. A positive technical and economic effect was obtained.

  8. Request queues for interactive clients in a shared file system of a parallel computing system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin

    2015-08-18

    Interactive requests are processed from users of log-in nodes. A metadata server node is provided for use in a file system shared by one or more interactive nodes and one or more batch nodes. The interactive nodes comprise interactive clients to execute interactive tasks and the batch nodes execute batch jobs for one or more batch clients. The metadata server node comprises a virtual machine monitor; an interactive client proxy to store metadata requests from the interactive clients in an interactive client queue; a batch client proxy to store metadata requests from the batch clients in a batch client queue; and a metadata server to store the metadata requests from the interactive client queue and the batch client queue in a metadata queue based on an allocation of resources by the virtual machine monitor. The metadata requests can be prioritized, for example, based on one or more of a predefined policy and predefined rules.
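
    As a rough sketch of the queuing idea (not the patented mechanism itself), the following merges interactive and batch metadata requests into one queue, with interactive requests given strictly higher priority; the real invention allocates resources through a virtual machine monitor and configurable policies.

```python
import heapq, itertools

class MetadataQueue:
    """Single metadata queue fed from an interactive and a batch queue.
    Interactive requests get a smaller priority number, so they are
    served first -- a simple stand-in for the policy-based allocation."""
    def __init__(self):
        self._heap, self._tie = [], itertools.count()

    def put(self, request, interactive: bool):
        priority = 0 if interactive else 1
        heapq.heappush(self._heap, (priority, next(self._tie), request))

    def get(self):
        return heapq.heappop(self._heap)[2]

q = MetadataQueue()
q.put("batch: stat /scratch/job42", interactive=False)
q.put("interactive: ls /home/user", interactive=True)
print(q.get())   # the interactive request is served first
```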

  9. A computer program for creating keyword indexes to textual data files

    Science.gov (United States)

    Moody, David W.

    1972-01-01

    A keyword-in-context (KWIC) or out-of-context (KWOC) index is a convenient means of organizing information. This keyword index program can be used to create either KWIC or KWOC indexes of bibliographic references or other types of information punched on cards, typed on optical scanner sheets, or retrieved from various Department of Interior data bases using the Generalized Information Processing System (GIPSY). The index consists of a 'bibliographic' section and a keyword section based on the permutation of document titles, project titles, environmental impact statement titles, maps, etc., or lists of descriptors. The program can also create a back-of-the-book index to documents from a list of descriptors. By providing the user with a wide range of input and output options, the program provides the researcher, manager, or librarian with a means of maintaining a list and index to documents in a small library, reprint collection, or office file.
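
    A keyword-in-context index can be built by rotating each title around every indexable word; the toy generator below (our sketch, with an assumed stopword list) shows the principle on one title.

```python
def kwic(titles, stopwords=("a", "an", "the", "of", "to", "for", "and", "in")):
    """Build a keyword-in-context index: every non-stopword becomes an
    index entry showing the title rotated around that keyword."""
    entries = []
    for title in titles:
        words = title.split()
        for i, word in enumerate(words):
            if word.lower() not in stopwords:
                rotated = " ".join(words[i:] + ["/"] + words[:i])
                entries.append((word.upper(), rotated))
    return sorted(entries)

for key, context in kwic(["An information retrieval system for research file data"]):
    print(f"{key:12} {context}")
```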

  10. An information retrieval system for research file data

    Science.gov (United States)

    Joan E. Lengel; John W. Koning

    1978-01-01

    Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS, as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed.

  11. Shaping ability of the conventional nickel-titanium and reciprocating nickel-titanium file systems: a comparative study using micro-computed tomography.

    Science.gov (United States)

    Hwang, Young-Hye; Bae, Kwang-Shik; Baek, Seung-Ho; Kum, Kee-Yeon; Lee, WooCheol; Shon, Won-Jun; Chang, Seok Woo

    2014-08-01

    This study used micro-computed tomographic imaging to compare the shaping ability of Mtwo (VDW, Munich, Germany), a conventional nickel-titanium file system, and Reciproc (VDW), a reciprocating file system morphologically similar to Mtwo. Root canal shaping was performed on the mesiobuccal and distobuccal canals of extracted maxillary molars. In the RR group (n = 15), Reciproc was used in a reciprocating motion (150° counterclockwise/30° clockwise, 300 rpm); in the MR group, Mtwo was used in a reciprocating motion (150° clockwise/30° counterclockwise, 300 rpm); and in the MC group, Mtwo was used in a continuous rotating motion (300 rpm). Micro-computed tomographic images taken before and after canal shaping were used to analyze the canal volume change and the degree of transportation at the cervical, middle, and apical levels. The time required for canal shaping was recorded. Afterward, each file was analyzed using scanning electron microscopy. No statistically significant differences were found among the 3 groups in the time for canal shaping or in canal volume change (P > .05). Transportation values of the RR and MR groups were not significantly different at any level. However, the transportation value of the MC group was significantly higher than those of both the RR and MR groups at the cervical and apical levels (P < .05). File deformation was observed for 1 file in group RR (1/15), 3 files in group MR (3/15), and 5 files in group MC (5/15). In terms of shaping ability, Mtwo used in a reciprocating motion was not significantly different from the Reciproc system. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  12. PCF File Format.

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. It can contain multiple spectra and information about each spectrum such as energy calibration. This document outlines the format of the file that would allow one to write a computer program to parse and write such files.
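
    The exact PCF record structure is specified in the GADRAS documentation referenced above; the reader below therefore uses a deliberately hypothetical layout (a channel count followed by float32 counts) only to illustrate parsing a binary spectrum file of this general kind.

```python
import struct

def read_spectrum(path):
    """Read one spectrum from a little-endian binary file laid out as a
    4-byte channel count followed by that many 4-byte float counts.
    This layout is hypothetical -- the real PCF record (energy calibration,
    multiple spectra, neutron count rates) is considerably richer."""
    with open(path, "rb") as f:
        (n_channels,) = struct.unpack("<i", f.read(4))
        return struct.unpack(f"<{n_channels}f", f.read(4 * n_channels))
```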

  13. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-file Systems.

    Science.gov (United States)

    Prabhakar, Attiguppe R; Yavagal, Chandrashekar; Dixit, Kratika; Naik, Saraswathi V

    2016-01-01

    Primary root canals are considered to be most challenging due to their complex anatomy. WaveOne and OneShape are single-file systems with reciprocating and rotary motion, respectively. The aim of this study was to evaluate and compare dentin thickness, centering ability, canal transportation, and instrumentation time of WaveOne and OneShape files in primary root canals using cone beam computed tomographic (CBCT) analysis. This was an experimental, in vitro study comparing the two groups. A total of 24 extracted human primary teeth with a minimum 7 mm root length were included in the study. Cone beam computed tomographic images were taken before and after instrumentation for each group. Dentin thickness, centering ability, canal transportation, and instrumentation time were evaluated for each group. A significant difference was found in instrumentation time and canal transportation measures between the two groups. WaveOne showed less canal transportation as compared with OneShape, and the mean instrumentation time of WaveOne was significantly less than that of OneShape. The reciprocating single-file system was found to be faster, with far fewer procedural errors, and can hence be recommended for shaping the root canals of primary teeth. How to cite this article: Prabhakar AR, Yavagal C, Dixit K, Naik SV. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-File Systems. Int J Clin Pediatr Dent 2016;9(1):45-49.

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and writing to tape at high speed.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1s: response time, data transfer rate and success rate for tape-to-buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations laid the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  15. Trend Analysis of the Brazilian Scientific Production in Computer Science

    Directory of Open Access Journals (Sweden)

    TRUCOLO, C. C.

    2014-12-01

    The growth in the volume and diversity of scientific information brings new challenges in understanding the reasons, the process, and the real essence that propel this growth. This information can be used as the basis for the development of strategies and public policies to improve education and innovation services. Trend analysis is one of the steps in this direction. In this work, a trend analysis of the Brazilian scientific production of graduate programs in the computer science area is performed to identify the main subjects being studied by these programs, both in general and individually.

  16. Cone-beam Computed Tomographic Assessment of Canal Centering Ability and Transportation after Preparation with Twisted File and Bio RaCe Instrumentation.

    Directory of Open Access Journals (Sweden)

    Kiamars Honardar

    2014-08-01

    Use of rotary nickel-titanium (NiTi) instruments for endodontic preparation has introduced a new era in endodontic practice, but this issue has undergone dramatic modifications in order to achieve improved shaping abilities. Cone-beam computed tomography (CBCT) has made it possible to accurately evaluate geometrical changes following canal preparation. This study was carried out to compare the canal centering ability and transportation of the Twisted File and BioRaCe rotary systems by means of cone-beam computed tomography. Thirty root canals from freshly extracted mandibular and maxillary teeth were selected. Teeth were mounted and scanned before and after preparation by CBCT at different apical levels. Specimens were divided into 2 groups of 15. In the first group Twisted File was used, and in the second BioRaCe, for canal preparation. Canal transportation and centering ability after preparation were assessed with NNT Viewer and Photoshop CS4 software. Statistical analysis was performed using the t-test and two-way ANOVA. All samples showed deviations from the original axes of the canals. No significant differences were detected between the two rotary NiTi instruments for canal centering ability in any section. Regarding canal transportation, however, a significant difference was seen in the BioRaCe group at 7.5 mm from the apex. Under the conditions of this in vitro study, Twisted File and BioRaCe rotary NiTi files retained the original canal geometry.
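
    Several records in this collection quantify transportation and centering from pre- and post-instrumentation scans. For reference, the definitions most commonly used in this literature (after Gambill et al.) are shown below, where m1, m2 and d1, d2 are the shortest mesial and distal distances from the canal to the root surface before and after preparation; individual studies may differ in detail.

```latex
\text{transportation} = (m_1 - m_2) - (d_1 - d_2), \qquad
\text{centering ratio} = \frac{\min\,(m_1 - m_2,\ d_1 - d_2)}{\max\,(m_1 - m_2,\ d_1 - d_2)}
```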

  17. Computer-aided process planning in prismatic shape die components based on Standard for the Exchange of Product model data

    Directory of Open Access Journals (Sweden)

    Awais Ahmad Khan

    2015-11-01

    Insufficient technologies made good integration of the die components in design, process planning, and manufacturing impossible in the past few years. Nowadays, advanced technologies based on the Standard for the Exchange of Product model data (STEP) are making it possible. This article discusses the three main steps for achieving complete process planning for prismatic parts of die components: data extraction, feature recognition, and process planning. The proposed computer-aided process planning system works as part of an integrated system to cover the process planning of any prismatic die component. The system is built using Visual Basic with the EWDraw system for visualizing the Standard for the Exchange of Product model data file. The system works successfully and can cover any type of sheet metal die component. The case study discussed in this article is taken from a large design of a progressive die.

  18. Using resources for scientific-driven pharmacovigilance: from many product safety documents to one product safety master file.

    Science.gov (United States)

    Furlan, Giovanni

    2012-08-01

    Current regulations require a description of the overall safety profile or the specific risks of a drug in multiple documents such as the Periodic and Development Safety Update Reports, Risk Management Plans (RMPs) and Signal Detection Reports. In a resource-constrained world, the need for preparing multiple documents reporting the same information results in shifting the focus from a thorough scientific and medical evaluation of the available data to maintaining compliance with regulatory timelines. Since the aim of drug safety is to understand and characterize product issues to take adequate risk minimization measures rather than to comply with bureaucratic requirements, there is the need to avoid redundancy. In order to identify core drug safety activities that need to be undertaken to protect patient safety and reduce the number of documents reporting the results of these activities, the author has reviewed the main topics included in the drug safety guidelines and templates. The topics and sources that need to be taken into account in the main regulatory documents have been found to greatly overlap and, in the future, as a result of the new Periodic Safety Update Report structure and requirements, in the author's opinion this overlap is likely to further increase. Many of the identified inter-document differences seemed to be substantially formal. The Development Safety Update Report, for example, requires separate presentation of the safety issues emerging from different sources followed by an overall evaluation of each safety issue. The RMP, instead, requires a detailed description of the safety issues without separate presentation of the evidence derived from each source. To some extent, however, the individual documents require an in-depth analysis of different aspects; the RMP, for example, requires an epidemiological description of the indication for which the drug is used and its risks. At the time of writing this article, this is not specifically

  19. Code 672 observational science branch computer networks

    Science.gov (United States)

    Hancock, D. W.; Shirk, H. G.

    1988-01-01

    In general, networking increases productivity due to the speed of transmission, easy access to remote computers, ability to share files, and increased availability of peripherals. Two different networks within the Observational Science Branch are described in detail.

  20. Artificial intelligence in pharmaceutical product formulation: neural computing

    Directory of Open Access Journals (Sweden)

    Svetlana Ibrić

    2009-10-01

    The properties of a formulation are determined not only by the ratios in which the ingredients are combined but also by the processing conditions. Although the relationships between the ingredient levels, processing conditions, and product performance may be known anecdotally, they can rarely be quantified. In the past, formulators tended to use statistical techniques to model their formulations, relying on response surfaces to provide a mechanism for optimization. However, optimization by such a method can be misleading, especially if the formulation is complex. More recently, advances in mathematics and computer science have led to the development of alternative modeling and data-mining techniques which work with a wider range of data sources: neural networks (an attempt to mimic the processing of the human brain); genetic algorithms (an attempt to mimic the evolutionary process by which biological systems self-organize and adapt); and fuzzy logic (an attempt to mimic the ability of the human brain to draw conclusions and generate responses based on incomplete or imprecise information). In this review the current technology is examined, as well as its application in pharmaceutical formulation and processing. The challenges, benefits and future possibilities of neural computing are discussed.

  1. Application of large computers for predicting the oil field production

    Energy Technology Data Exchange (ETDEWEB)

    Philipp, W; Gunkel, W; Marsal, D

    1971-10-01

    The flank injection drive plays a dominant role in the exploitation of the BEB oil fields. Therefore, two-phase flow computer models were built, adapted to the predominance of a single flow direction and combining high prediction accuracy with low job time. Any case study starts with partitioning the reservoir into blocks. Then the statistics of the time-independent reservoir properties are analyzed by means of an IBM 360/25 unit. Using these results and the past production of oil, water and gas, a Fortran program running on a CDC-3300 computer yields oil recoveries and the ratios of the relative permeabilities as a function of the local oil saturation for all blocks penetrated by mobile water. In order to assign k_w/k_o functions to blocks not yet reached by the advancing water front, correlation analysis is used to relate reservoir properties to k_w/k_o functions. All these results are used as input to a CDC-6600 Fortran program, allowing short-, medium- and long-term forecasts as well as the handling of special problems.

  2. Computer-assisted training in the thermal production department

    International Nuclear Information System (INIS)

    Felgines, R.

    1985-01-01

    For many years now, in the United States and Canada, computer-assisted training (CAT) experiments have been carried out in various fields: general or professional education, and student testing in universities. This method seems very promising, particularly for continuing education and for keeping industrial process operating and maintenance personnel abreast of their specialities. Thanks to progress in data processing and remote processing with central computers, this technique is being developed in France for professional training applications. Faced with many training problems, the Thermal Production Department of EDF (Electricite de France) first conducted, in 1979, a test involving a limited subset of the nuclear power station operating personnel; this course amounted to some ten hours with very limited content. It seemed promising enough that in 1981 a full-scale test was launched at 4 PWR plants: DAMPIERRE, FESSENHEIM, GRAVELINES, TRICASTIN. This test, which involves about 700 employees, has been fruitful, and we have decided to generalise the system to all the French nuclear power plants (40 units of 900 and 1300 MW). (author)

  3. A restructuring of the MELCOR fission product packages for the MIDAS computer code

    International Nuclear Information System (INIS)

    Park, S.H.; Kim, K.R.; Kim, D.H.

    2004-01-01

    The RN1/RN2 packages, the fission-product-related packages in MELCOR, have been restructured for the MIDAS computer code. MIDAS is being developed as an integrated severe accident analysis code with a user-friendly graphical user interface and a modernized data structure. To do this, the data-transfer methods of the current MELCOR code were modified and adopted into the RN1/RN2 packages. The data structure of the current MELCOR code, written in FORTRAN77, makes it difficult to grasp the meaning of variables and wastes memory. New features of FORTRAN90 make it possible to allocate storage dynamically and to define user-defined data types, which leads to efficient memory use and an easier understanding of the code. The restructuring of the RN1/RN2 packages addressed in this paper includes module development, subroutine modification, and the treatment of MELGEN, which generates the data file, as well as MELCOR, which performs the calculation. Verification has been done by comparing the results of the modified code with those of the existing code. As the trends are similar to each other, the same approach could be extended to the entire code package. It is expected that the code restructuring will accelerate the code domestication thanks to a direct understanding of each variable and an easy implementation of modified or newly developed models. (author)

  4. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography –An In Vitro Study

    Science.gov (United States)

    Dhingra, Annil; Miglani, Anjali

    2015-01-01

    Background Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is of equal importance; otherwise, canal aberrations such as transportation will be created. Aim The purpose of this study was to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation in first mandibular molars using cone beam computed tomography. Materials and Methods Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial Surgery. Teeth were prepared using one rotary and two reciprocating single-file systems and were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were done and evaluated for three parameters: canal transportation, cervical dentinal thickness, and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. Results The change in cross-sectional area after filing showed a significant difference at 0 mm, 1 mm, 2 mm and 7 mm (p < 0.05). Comparing the file systems over a distance of 7 mm (starting from 0 mm, with evaluation at 1 mm, 2 mm, 3 mm, 5 mm and 7 mm), the results showed a significant difference among the file systems at various lengths (p = 0.014, 0.046, 0.004, 0.028, 0.005 and 0.029, respectively). The mean cervical dentin removal was maximum at all levels for OneShape and minimum for WaveOne, showing the better quality of WaveOne and Reciproc over the OneShape file system. A significant difference was found at 9 mm, 11 mm and 12 mm between all three file systems (p < 0.001 in each case). Conclusion It was concluded that reciprocating motion is better than rotary motion for all three parameters: canal transportation, cross-sectional area and cervical dentinal thickness. PMID:26023639

  5. User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

    CERN Document Server

    Wiley, R A

    1977-01-01

    User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

  6. Simulating quantum systems on classical computers with matrix product states

    International Nuclear Information System (INIS)

    Kleine, Adrian

    2010-01-01

    In this thesis, the numerical simulation of strongly-interacting many-body quantum-mechanical systems using matrix product states (MPS) is considered. Matrix-Product-States are a novel representation of arbitrary quantum many-body states. Using quantum information theory, it is possible to show that Matrix-Product-States provide a polynomial-sized representation of one-dimensional quantum systems, thus allowing an efficient simulation of one-dimensional quantum system on classical computers. Matrix-Product-States form the conceptual framework of the density-matrix renormalization group (DMRG). After a general introduction in the first chapter of this thesis, the second chapter deals with Matrix-Product-States, focusing on the development of fast and stable algorithms. To obtain algorithms to efficiently calculate ground states, the density-matrix renormalization group is reformulated using the Matrix-Product-States framework. Further, time-dependent problems are considered. Two different algorithms are presented, one based on a Trotter decomposition of the time-evolution operator, the other one on Krylov subspaces. Finally, the evaluation of dynamical spectral functions is discussed, and a correction vector-based method is presented. In the following chapters, the methods presented in the second chapter, are applied to a number of different physical problems. The third chapter deals with the existence of chiral phases in isotropic one-dimensional quantum spin systems. A preceding analytical study based on a mean-field approach indicated the possible existence of those phases in an isotropic Heisenberg model with a frustrating zig-zag interaction and a magnetic field. In this thesis, the existence of the chiral phases is shown numerically by using Matrix-Product-States-based algorithms. In the fourth chapter, we propose an experiment using ultracold atomic gases in optical lattices, which allows a well controlled observation of the spin-charge separation (of

  7. Simulating quantum systems on classical computers with matrix product states

    Energy Technology Data Exchange (ETDEWEB)

    Kleine, Adrian

    2010-11-08

    In this thesis, the numerical simulation of strongly-interacting many-body quantum-mechanical systems using matrix product states (MPS) is considered. Matrix-Product-States are a novel representation of arbitrary quantum many-body states. Using quantum information theory, it is possible to show that Matrix-Product-States provide a polynomial-sized representation of one-dimensional quantum systems, thus allowing an efficient simulation of one-dimensional quantum system on classical computers. Matrix-Product-States form the conceptual framework of the density-matrix renormalization group (DMRG). After a general introduction in the first chapter of this thesis, the second chapter deals with Matrix-Product-States, focusing on the development of fast and stable algorithms. To obtain algorithms to efficiently calculate ground states, the density-matrix renormalization group is reformulated using the Matrix-Product-States framework. Further, time-dependent problems are considered. Two different algorithms are presented, one based on a Trotter decomposition of the time-evolution operator, the other one on Krylov subspaces. Finally, the evaluation of dynamical spectral functions is discussed, and a correction vector-based method is presented. In the following chapters, the methods presented in the second chapter, are applied to a number of different physical problems. The third chapter deals with the existence of chiral phases in isotropic one-dimensional quantum spin systems. A preceding analytical study based on a mean-field approach indicated the possible existence of those phases in an isotropic Heisenberg model with a frustrating zig-zag interaction and a magnetic field. In this thesis, the existence of the chiral phases is shown numerically by using Matrix-Product-States-based algorithms. In the fourth chapter, we propose an experiment using ultracold atomic gases in optical lattices, which allows a well controlled observation of the spin-charge separation (of
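
    To make the MPS idea concrete, the toy below writes a product state as a bond-dimension-1 matrix product state and recovers basis-state amplitudes by multiplying one matrix per site; this is only the data structure, with none of the thesis's DMRG or time-evolution machinery.

```python
import numpy as np

# A product state |0101...> as a bond-dimension-1 MPS:
# mps[site][physical_index] is a chi x chi matrix.
L, chi = 6, 1
mps = []
for site in range(L):
    tensors = [np.zeros((chi, chi)), np.zeros((chi, chi))]
    tensors[site % 2][0, 0] = 1.0          # occupation pattern 0,1,0,1,...
    mps.append(tensors)

def amplitude(mps, basis_state):
    """<basis_state|psi> for an open-boundary MPS with 1x1 edge matrices."""
    m = np.eye(1)
    for site, s in enumerate(basis_state):
        m = m @ mps[site][s]               # pick one matrix per site and multiply
    return m[0, 0]

print(amplitude(mps, [0, 1, 0, 1, 0, 1]))  # 1.0
print(amplitude(mps, [1, 1, 0, 1, 0, 1]))  # 0.0
```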

  8. Digital Radiography and Computed Tomography (DRCT) Product Improvement Plan (PIP)

    Energy Technology Data Exchange (ETDEWEB)

    Tim Roney; Bob Pink; Karen Wendt; Robert Seifert; Mike Smith

    2010-12-01

    The Idaho National Laboratory (INL) has been developing and deploying x-ray inspection systems for chemical weapons containers for the past 12 years under the direction of the Project Manager for Non-Stockpile Chemical Materiel (PMNSCM). In FY-10, funding was provided to advance the capabilities of these systems through the DRCT (Digital Radiography and Computed Tomography) Product Improvement Plan (PIP), funded by the PMNSCM. The DRCT PIP identified three research tasks: end-user study, detector evaluation, and DRCT/PINS integration. Work commenced in February 2010. Due to the late start and the schedule for field inspection of munitions at various sites, it was not possible to spend sufficient field time with operators to develop a complete end-user study. We were able to interact with several operators, principally Mr. Mike Rowan, who provided substantial useful input through several discussions and the development of a set of field notes from the Pueblo, CO field mission. We will pursue ongoing interactions with field personnel as opportunities arise in FY-11.

  9. 77 FR 26041 - Certain Computers and Computer Peripheral Devices and Components Thereof and Products Containing...

    Science.gov (United States)

    2012-05-02

    ... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-841] Certain Computers and Computer Peripheral... after importation of certain computers and computer peripheral devices and components thereof and... industry in the United States exists as required by subsection (a)(2) of section 337. The complainant...

  10. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping up to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months, activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  12. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study

    Science.gov (United States)

    Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-01-01

    Background The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. Aim The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Materials and Methods Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I) - full-sequence rotary control group, OneShape OS (group II) - single file continuous rotation, WaveOne WO (group III) - single file reciprocating motion. Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. Results It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p >0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16) compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. Conclusion It was concluded that there was only a minor difference between the tested groups. Single file systems demonstrated average canal

  13. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study.

    Science.gov (United States)

    Agarwal, Rolly S; Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-05-01

    The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I) - full-sequence rotary control group, OneShape OS (group II) - single file continuous rotation, WaveOne WO (group III) - single file reciprocating motion. Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p >0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16) compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. It was concluded that there was only a minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to full sequence
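
    Both versions of this record quantify canal transportation and centering ability from pre- and post-instrumentation dentin thicknesses measured on CBCT cross-sections. A common convention for these measures (after Gambill et al.; that this study used exactly this convention is an assumption) can be sketched as follows, with hypothetical measurements in millimetres:

```python
# m1/d1: mesial/distal dentin thickness before preparation; m2/d2: after.

def transportation(m1, m2, d1, d2):
    """(m1 - m2) - (d1 - d2); zero means the canal was not transported."""
    return (m1 - m2) - (d1 - d2)

def centering_ratio(m1, m2, d1, d2):
    """Smaller over larger dentin removal; 1.0 means perfectly centered."""
    a, b = m1 - m2, d1 - d2
    if max(a, b) == 0:
        return 1.0  # no dentin removed on either side
    return min(a, b) / max(a, b)

# Hypothetical cross-section 3 mm from the apex:
print(transportation(1.10, 0.95, 1.05, 1.00))             # 0.10 mm
print(round(centering_ratio(1.10, 0.95, 1.05, 1.00), 2))  # 0.33
```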

  14. Computational manufacturing as a bridge between design and production.

    Science.gov (United States)

    Tikhonravov, Alexander V; Trubetskov, Michael K

    2005-11-10

    Computational manufacturing of optical coatings is a research area that can be placed between theoretical designing and practical manufacturing in the same way that computational physics can be placed between theoretical and experimental physics. Investigations in this area have been performed for more than 30 years under the name of computer simulation of manufacturing and monitoring processes. Our goal is to attract attention to the increasing importance of computational manufacturing at the current state of the art in the design and manufacture of optical coatings and to demonstrate possible applications of this research tool.

  15. Computer model for economic study of unbleached kraft paperboard production

    Science.gov (United States)

    Peter J. Ince

    1984-01-01

    Unbleached kraft paperboard is produced from wood fiber in an industrial papermaking process. A highly specific and detailed model of the process is presented. The model is also presented as a working computer program. A user of the computer program will provide data on physical parameters of the process and on prices of material inputs and outputs. The program is then...

  16. Views of CMS Event Data Objects, Files, Collections, Virtual Data Products

    CERN Document Server

    Holtman, Koen

    2001-01-01

    The CMS data grid system will store many types of data maintained by the CMS collaboration. An important type of data is the event data, which is defined in this note as all data that directly represents simulated, raw, or reconstructed CMS physics events. Many views on this data will exist simultaneously. To a CMS physics code implementer this data will appear as C++ objects, to a tape robot operator the data will appear as files. This note identifies different views that can exist, describes each of them, and interrelates them by placing them into a vertical stack. This particular stack integrates several existing architectural structures, and is therefore a plausible basis for further prototyping and architectural work. This document is intended as a contribution to, and as common (terminological) reference material for, the CMS architectural efforts and for the Grid projects PPDG, GriPhyN, and the EU DataGrid.

  17. Review of ENDF/B-VI Fission-Product Cross Sections [Evaluated Nuclear Data File]

    Energy Technology Data Exchange (ETDEWEB)

    Wright, R.Q.; MacFarlane, R.E.

    2000-04-01

    In response to concerns raised in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 93-2, the US Department of Energy (DOE) developed a comprehensive program to help assure that the DOE maintains and enhances its capability to predict the criticality of systems throughout the complex. Tasks developed to implement the response to DNFSB Recommendation 93-2 included Critical Experiments, Criticality Benchmarks, Training, Analytical Methods, and Nuclear Data. The Nuclear Data Task consists of a program of differential measurements at the Oak Ridge Electron Linear Accelerator (ORELA), precise fitting of the differential data with the generalized least-squares fitting code SAMMY to represent the data with resonance parameters using the Reich-Moore formalism along with covariance (uncertainty) information, and the development of complete evaluations for selected nuclides for inclusion in the Evaluated Nuclear Data File (ENDF/B).

  18. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  19. Evaluation of the Self-Adjusting File system (SAF) for the instrumentation of primary molar root canals: a micro-computed tomographic study.

    Science.gov (United States)

    Kaya, E; Elbay, M; Yiğit, D

    2017-06-01

    The Self-Adjusting File (SAF) system has been recommended for use in permanent teeth since it offers more conservative and effective root-canal preparation when compared to traditional rotary systems. However, no study had evaluated the usage of SAF in primary teeth. The aim of this study was to evaluate and compare the use of the SAF, K file (manual instrumentation) and Profile (traditional rotary instrumentation) systems for primary-tooth root-canal preparation in terms of instrumentation time and amounts of dentin removed, using micro-computed tomography (μCT) technology. Study Design: The study was conducted with 60 human primary mandibular second molar teeth divided into 3 groups according to instrumentation technique: Group I: SAF (n=20); Group II: K file (n=20); Group III: Profile (n=20). Teeth were embedded in acrylic blocks and scanned with a μCT scanner prior to instrumentation. All distal root canals were prepared up to size 30 for K file, .04/30 for Profile, and 2 mm thickness, size 25 for SAF; instrumentation time was recorded for each tooth, and a second μCT scan was performed after instrumentation was complete. Amounts of dentin removed were measured using the three-dimensional images by calculating the difference in root-canal volume before and after preparation. Data was statistically analysed using the Kolmogorov-Smirnov and Kruskal-Wallis tests. Manual instrumentation (K file) resulted in significantly more dentin removal when compared to rotary instrumentation (Profile and SAF), while the SAF system generated significantly less dentin removal than both manual instrumentation (K file) and traditional rotary instrumentation (Profile) (P<0.05) ... systems. Within the experimental conditions of the present study, the SAF seems to be a useful system for root-canal instrumentation in primary molars because it removed less dentin than the other systems, which is especially important for the relatively thin-walled canals of primary teeth, and because it involves less
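
    The dentin-removal measure described above (the difference in root-canal volume before and after preparation) reduces to a voxel count once the canal has been segmented in the μCT images. A minimal sketch under that assumption; the random masks stand in for real segmentations and the voxel size is illustrative:

```python
import numpy as np

VOXEL_MM = 0.0074  # isotropic voxel edge length in mm (illustrative)

def canal_volume_mm3(mask):
    """Volume of a boolean 3-D canal mask in cubic millimetres."""
    return mask.sum() * VOXEL_MM ** 3

rng = np.random.default_rng(1)
pre = rng.random((50, 50, 50)) < 0.02            # canal before preparation
post = pre | (rng.random((50, 50, 50)) < 0.01)   # enlarged canal after

removed = canal_volume_mm3(post) - canal_volume_mm3(pre)
print(f"dentin removed: {removed:.4f} mm^3")
```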

  20. Investigation and analytical results of bituminized products in drums at the filling room

    International Nuclear Information System (INIS)

    Shibata, Atsuhiro; Kato, Yoshiyuki; Sano, Yuichi; Kitajima, Takafumi; Fujita, Hideto

    1999-09-01

    This report describes the results of an investigation of the bituminized products in drums, the liquid waste in the receiving tank V21, and the bituminized mixture in the extruder. The investigation of the products in drums showed that most of the unburned products filled after 28B had abnormalities, such as hardened surfaces, cavities and porous brittle products. The particle sizes of the salt fixed in the bituminized products depended neither on batch number nor on feed rate. This indicates that fining of the salt particles caused by the decreased feed rate did not occur. The measured concentrations of metals and anions in the bituminized products showed no abnormality. No catalytic content was detected in the products. The infrared absorption spectra obtained with the bituminized products show that the oxidation at the incident occurred without oxygen. There was no organic phase on the surface of the liquid waste in V21. Chemical analysis and thermal analysis of the precipitate in V21 showed no abnormality. The concentration of sodium nitrate/nitrite in the mixture collected from the extruder was lower than in normal products. These results show no chemical activation of the bituminized products. It can be concluded that the chemical characteristics of the products had little abnormality even around the time of the incident. (author)

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  2. Behold the Trojan Horse: Instructional vs. Productivity Computing in the Classroom.

    Science.gov (United States)

    Loop, Liza

    This background paper for a symposium on the school of the future reviews the current instructional applications of computers in the classroom (the computer as a means or the subject of instruction), and suggests strategies that administrators might use to move toward viewing the computer as a productivity tool for students, i.e., its use for word…

  3. [Comparison of effectiveness and safety between Twisted File technique and ProTaper Universal rotary full sequence based on micro-computed tomography].

    Science.gov (United States)

    Chen, Xiao-bo; Chen, Chen; Liang, Yu-hong

    2016-02-18

    To evaluate the efficacy and safety of two types of rotary nickel-titanium systems (Twisted File and ProTaper Universal) for root canal preparation based on micro-computed tomography (micro-CT). Twenty extracted molars (including 62 canals) were divided into two experimental groups and were respectively instrumented using the Twisted File rotary nickel-titanium system (TF) and the ProTaper Universal rotary nickel-titanium system (PU) to #25/0.08 following the recommended protocol. Time for root canal instrumentation (accumulation of time for every single file) was recorded. The 0-3 mm root surface from the apex was observed under an optical stereomicroscope at 25× magnification. The presence of crack lines was noted. The root canals were scanned with micro-CT before and after root canal preparation. Three-dimensional shape images of canals were reconstructed, calculated and evaluated. The amount of canal central transportation of the two groups was calculated and compared. The shorter preparation time [(0.53 ± 0.14) min] was observed in the TF group, while the preparation time of the PU group was (2.06 ± 0.39) min (P<0.05). The TF group also showed less canal transportation [... vs. (0.097 ± 0.084) mm, P<0.05]. No instrument separation was observed in either group. Cracks were not found in either group, whether based on micro-CT images or on observation under an optical stereomicroscope at 25× magnification. Compared with ProTaper Universal, Twisted File took less time in root canal preparation, exhibited better shaping ability, and caused less canal transportation.

  4. The computation of properties of injection-moulded products

    NARCIS (Netherlands)

    Douven, L.F.A.; Baaijens, F.P.T.; Meijer, H.E.H.

    1995-01-01

    Injection moulding is a flexible production technique for the manufacture of complex shaped, thin walled polymer products that require minimal finishing. During processing, the polymer experiences a complex deformation and temperature history that affects the final properties of the product. In a

  5. Building the Teraflops/Petabytes Production Computing Center

    International Nuclear Information System (INIS)

    Kramer, William T.C.; Lucas, Don; Simon, Horst D.

    1999-01-01

    In just one decade, the 1990s, supercomputer centers have undergone two fundamental transitions which require rethinking their operation and their role in high performance computing. The first transition in the early to mid-1990s resulted from a technology change in high performance computing architecture. Highly parallel distributed memory machines built from commodity parts increased the operational complexity of the supercomputer center, and required the introduction of intellectual services as equally important components of the center. The second transition is happening in the late 1990s as centers are introducing loosely coupled clusters of SMPs as their premier high performance computing platforms, while dealing with an ever-increasing volume of data. In addition, increasing network bandwidth enables new modes of use of a supercomputer center, in particular, computational grid applications. In this paper we describe what steps NERSC is taking to address these issues and stay at the leading edge of supercomputing centers.

  6. Classification of methods of production of computer forensics using a graph theory approach

    Directory of Open Access Journals (Sweden)

    Anna Ravilyevna Smolina

    2016-06-01

    Full Text Available A classification of methods of production of computer forensics using a graph theory approach is proposed. Using this classification, the search for such methods can be accelerated and simplified, and the process can be automated.

  7. Classification of methods of production of computer forensics using a graph theory approach

    OpenAIRE

    Anna Ravilyevna Smolina; Alexander Alexandrovich Shelupanov

    2016-01-01

    A classification of methods of production of computer forensics using a graph theory approach is proposed. Using this classification, the search for such methods can be accelerated and simplified, and the process can be automated.

  8. Fission product yield evaluation for the USA evaluated nuclear data files

    International Nuclear Information System (INIS)

    Rider, B.F.; England, T.R.

    1994-01-01

    An evaluated set of fission product yields for use in calculation of decay heat curves with improved accuracy has been prepared. These evaluated yields are based on all known experimental data through 1992. Unmeasured fission product yields are calculated from charge distribution, pairing effects, and isomeric state models developed at Los Alamos National Laboratory. The current evaluation has been distributed as the ENDF/B-VI fission product yield data set

  9. Computer work and self-reported variables on anthropometrics, computer usage, work ability, productivity, pain, and physical activity.

    Science.gov (United States)

    Madeleine, Pascal; Vangsgaard, Steffen; Hviid Andersen, Johan; Ge, Hong-You; Arendt-Nielsen, Lars

    2013-08-01

    Computer users often report musculoskeletal complaints and pain in the upper extremities and the neck-shoulder region. However, recent epidemiological studies do not report a relationship between the extent of computer use and work-related musculoskeletal disorders (WMSD). The aim of this study was to conduct an explorative analysis of short- and long-term pain complaints and work-related variables in a cohort of Danish computer users. A structured web-based questionnaire was designed, including questions related to musculoskeletal pain, anthropometrics, work-related variables, work ability, productivity, health-related parameters, lifestyle variables as well as physical activity during leisure time. Six hundred and ninety office workers completed the questionnaire in response to an announcement posted in a union magazine. The questionnaire outcomes, i.e., pain intensity, duration and locations as well as anthropometrics, work-related variables, work ability, productivity, and level of physical activity, were stratified by gender and correlations were obtained. Women reported higher pain intensity, longer pain duration as well as more locations with pain than men (P < 0.05), and women scored poorer work ability and ability to fulfil the requirements on productivity than men (P < 0.05). Pain was associated with reduced work ability/productivity (P < 0.05). The poorer work ability reported by women workers may relate to their higher risk of contracting WMSD. Overall, this investigation confirmed the complex interplay between anthropometrics, work ability, productivity, and pain perception among computer users.

  10. Trends in scientific computing applied to petroleum exploration and production

    International Nuclear Information System (INIS)

    Guevara, Saul E; Piedrahita, Carlos E; Arroyo, Elkin R; Soto Rodolfo

    2002-01-01

    Current trends in computational tools for the upstream petroleum industry are presented herein. Several results and images obtained through commercial programs and through in-house software developments illustrate the topics discussed. They include several types of problems and programming paradigms. Emphasis is placed on the future of parallel processing through the use of affordable, open systems such as the Linux system. This kind of technology will likely make possible new research and industry applications, since quite advanced computational resources will become available to many people working in the area

  11. Effect of the STereoLithography file structure on the ear shell production for hearing aids according to DICOM images

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeong Gyun [Dept. of Radiological Science, Far East University, Eumseong (Korea, Republic of)]

    2017-03-15

    A technique for producing the ear shell for a hearing aid using DICOM (Digital Imaging and Communication in Medicine) images and 3D printing was studied. It is a new application technique that can improve safety and reduce infection risk for hearing aid users, and it can reduce the production time and the number of process stages. In this study, the effects on the shape surface were examined before and after printing the ear shell with a 3D printer, based on values obtained from the raw data of the DICOM images at volumes of 0.5 mm, 1.0 mm, and 2.0 mm, respectively. Before the printing, the relative relationships were compared with respect to the STL (STereoLithography) file structure; after the printing, the intervals of the layered structure of the ear shell shape surface were compared by magnifying them under a microscope. For the STL file structure, the numbers of triangular vertices, of points with more than five intersections, and of maximum intersecting points were large in the order of 0.5 mm, 1.0 mm, and 2.0 mm, and the triangular structure was densely distributed in the order of the bending, angle, and crest regions, depending on the sinuosity of the external auditory meatus shape. As for the ear shell shape surface examined with the digital microscope, the interval of the layered structure was thick in the order of 2.0 mm, 1.0 mm, and 0.5 mm. For the STL surface structure mentioned above, the intersecting STL triangular structure was denser as the sinuosity of the 3D ear shell shape became more irregular and the volume of the raw data decreased.

  12. Effect of the STereoLithography file structure on the ear shell production for hearing aids according to DICOM images

    International Nuclear Information System (INIS)

    Kim, Hyeong Gyun

    2017-01-01

    A technique for producing the ear shell for a hearing aid using DICOM (Digital Imaging and Communication in Medicine) images and 3D printing was studied. It is a new application technique that can improve safety and reduce infection risk for hearing aid users, and it can reduce the production time and the number of process stages. In this study, the effects on the shape surface were examined before and after printing the ear shell with a 3D printer, based on values obtained from the raw data of the DICOM images at volumes of 0.5 mm, 1.0 mm, and 2.0 mm, respectively. Before the printing, the relative relationships were compared with respect to the STL (STereoLithography) file structure; after the printing, the intervals of the layered structure of the ear shell shape surface were compared by magnifying them under a microscope. For the STL file structure, the numbers of triangular vertices, of points with more than five intersections, and of maximum intersecting points were large in the order of 0.5 mm, 1.0 mm, and 2.0 mm, and the triangular structure was densely distributed in the order of the bending, angle, and crest regions, depending on the sinuosity of the external auditory meatus shape. As for the ear shell shape surface examined with the digital microscope, the interval of the layered structure was thick in the order of 2.0 mm, 1.0 mm, and 0.5 mm. For the STL surface structure mentioned above, the intersecting STL triangular structure was denser as the sinuosity of the 3D ear shell shape became more irregular and the volume of the raw data decreased
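
    Since both copies of this record hinge on counts of STL triangles, a small reader makes the "triangular structure" concrete. The sketch below parses the standard binary STL layout (80-byte header, a little-endian uint32 facet count, then 50 bytes per facet); the file name is hypothetical and this is illustrative code, not the study's analysis pipeline.

```python
import struct

def stl_triangle_count(path):
    """Number of triangular facets in a binary STL file."""
    with open(path, "rb") as f:
        f.seek(80)                        # skip the fixed-size header
        (count,) = struct.unpack("<I", f.read(4))
    return count

def stl_vertices(path):
    """Yield the three (x, y, z) vertices of each facet."""
    with open(path, "rb") as f:
        f.seek(80)
        (count,) = struct.unpack("<I", f.read(4))
        for _ in range(count):
            rec = struct.unpack("<12fH", f.read(50))  # normal, 3 vertices, attribute
            yield rec[3:6], rec[6:9], rec[9:12]

print(stl_triangle_count("ear_shell.stl"))  # hypothetical file
```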

  13. 11 CFR 9003.6 - Production of computer information.

    Science.gov (United States)

    2010-01-01

    ... disbursements; (2) Receipts by and disbursements from a legal and accounting compliance fund under 11 CFR 9003.3... legal and accounting services, including the allocation of payroll and overhead expenditures; (4... explaining the computer system's software capabilities, such as user guides, technical manuals, formats...

  14. Long term file migration. Part I: file reference patterns

    International Nuclear Information System (INIS)

    Smith, A.J.

    1978-08-01

    In most large computer installations, files are moved between on-line disk and mass storage (tape, integrated mass storage device) either automatically by the system or specifically at the direction of the user. This is the first of two papers which study the selection of algorithms for the automatic migration of files between mass storage and disk. The use of the text editor data sets at the Stanford Linear Accelerator Center (SLAC) computer installation is examined through the analysis of thirteen months of file reference data. Most files are used very few times. Of those that are used sufficiently frequently that their reference patterns may be examined, about a third show declining rates of reference during their lifetime; of the remainder, very few (about 5%) show correlated interreference intervals, and interreference intervals (in days) appear to be more skewed than would occur with the Bernoulli process. Thus, about two-thirds of all sufficiently active files appear to be referenced as a renewal process with a skewed interreference distribution. A large number of other file reference statistics (file lifetimes, interreference distributions, moments, means, number of uses/file, file sizes, file rates of reference, etc.) are computed and presented. The results are applied in the following paper to the development and comparative evaluation of file migration algorithms. 17 figures, 13 tables
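
    The interreference-interval statistic at the heart of this analysis is straightforward to extract from a reference trace. A minimal sketch, where the list of (file_id, day) pairs is a hypothetical stand-in for the SLAC reference data:

```python
from collections import defaultdict

trace = [("a", 1), ("b", 1), ("a", 4), ("a", 5), ("b", 9), ("a", 12)]

last_seen = {}
intervals = defaultdict(list)
for file_id, day in trace:
    if file_id in last_seen:
        intervals[file_id].append(day - last_seen[file_id])
    last_seen[file_id] = day

for file_id, gaps in sorted(intervals.items()):
    mean = sum(gaps) / len(gaps)
    print(file_id, gaps, f"mean interval {mean:.1f} days")
# a [3, 1, 7] mean interval 3.7 days
# b [8] mean interval 8.0 days
```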

  15. Computer Aided Synthesis of Innovative Processes: Renewable Adipic Acid Production

    DEFF Research Database (Denmark)

    Rosengarta, Alessandro; Bertran, Maria-Ona; Manenti, Flavio

    2017-01-01

    A promising biotechnological route for the production of adipic acid from renewables has been evaluated, applying a systematic methodology for process network synthesis and optimization. The method allows organizing in a structured database the available knowledge from different sources (prelimin...

  16. Computer-aided production in the chemical industry

    International Nuclear Information System (INIS)

    Perez Castellanos, J.L.

    1993-01-01

    In these centres, chlorine is produced by means of electrochemical reactions which also yield other products such as soda and potash. Both the chlorine and the soda and potash are sold on demand in markets associated with the production centres, at prices which vary depending on the period of sale and the centre. Production surpluses of any one of the centres may be transported to any other so as to optimize the overall supply-demand combination of all the plants. The relevant transport and storage costs may also vary depending on the centre and on the time of year. The main problem lies in controlling the multiple combinations which permit a determined overall annual production of chlorine at the lowest possible cost. What is important is not only the quantity manufactured per month (for sale and self-consumption, or storage), but also how much is manufactured at each production centre. The monthly production of a plant could be obtained in different ways (modulations), giving rise to different production power costs (due to the electrolysis process itself, or because of the structure of electricity rates). In the first step towards solving the problem, a range of chlorine productions was selected, per plant and per month, with their corresponding electricity bills for the entire plant (once again, the rating structure makes it difficult to distinguish which part of the bill refers to electrolysis and which does not). These electricity bills can be considered optimal in that they are minimal for a given production of chlorine. In other words, given the targeted monthly production of chlorine, the current in the electrolysis is modulated so that the electricity bill is as low as possible, while minimum technical conditions are respected and the rest of the plant remains constant. In the assumptions described, the essence of the problem consists in deciding how much to produce every month and where to
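
    The decision problem sketched in this record (how much chlorine to produce at each centre, and how much to ship between centres, at minimum cost) is naturally cast as a linear program. Below is a deliberately tiny toy instance with two plants and one period, using scipy.optimize.linprog; every number is hypothetical, and the real model would add months, storage and rate structures.

```python
from scipy.optimize import linprog

# x = [p1, p2, t12, t21]: production at plants 1 and 2, shipments 1->2, 2->1.
cost = [30.0, 38.0, 5.0, 5.0]            # unit production and transport costs

# Demand balance at each plant: production - shipped out + shipped in = demand.
A_eq = [[1, 0, -1,  1],                  # plant 1 demand: 100
        [0, 1,  1, -1]]                  # plant 2 demand: 140
b_eq = [100, 140]

bounds = [(0, 200), (0, 120), (0, None), (0, None)]  # plant capacities

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x, res.fun)  # the cheap plant runs at capacity and ships the surplus
```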

  17. Computer program FPIP-REV calculates fission product inventory for U-235 fission

    Science.gov (United States)

    Brown, W. S.; Call, D. W.

    1967-01-01

    Computer program calculates fission product inventories and source strengths associated with the operation of a U-235 fueled nuclear power reactor. It utilizes a fission-product nuclide library of 254 nuclides, and calculates the time-dependent behavior of the fission product nuclides formed by fissioning of U-235.
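
    For a single nuclide produced directly by fission at constant power, the time-dependent behavior such a code tracks reduces to dN/dt = yF - λN, which has a closed-form solution. The sketch below is an illustrative one-nuclide toy with hypothetical numbers, not FPIP-REV itself, which additionally chains precursor decays across its 254-nuclide library.

```python
import numpy as np

y = 0.03                               # cumulative fission yield (atoms/fission)
lam = np.log(2) / (8.0 * 86400)        # decay constant for an ~8-day half-life
F = 3.1e19                             # fissions per second at constant power

def inventory(t_seconds):
    """N(t) = (y*F/lam) * (1 - exp(-lam*t)), saturating at y*F/lam."""
    return y * F / lam * (1.0 - np.exp(-lam * t_seconds))

for days in (1, 8, 30, 90):
    print(f"day {days:3d}: {inventory(days * 86400):.3e} atoms")
```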

  18. Decomposition and Cross-Product-Based Method for Computing the Dynamic Equation of Robots

    Directory of Open Access Journals (Sweden)

    Ching-Long Shih

    2012-08-01

    Full Text Available This paper aims to demonstrate a clear relationship between Lagrange equations and Newton-Euler equations regarding computational methods for robot dynamics, from which we derive a systematic method for using either symbolic or on-line numerical computations. Based on the decomposition approach and the cross-product operation, a computing method for robot dynamics can be easily developed. The advantages of this computing framework are that it can be used for both symbolic and on-line numeric computation purposes, and that it can also be applied to biped systems as well as some simple closed-chain robot systems.
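
    The role of the cross product in such Newton-Euler computations is already visible for a single rigid link, where Euler's equation tau = I*alpha + omega x (I*omega) supplies the gyroscopic term. A minimal numerical sketch with illustrative values (this is not the paper's decomposition method, only the building block it reuses):

```python
import numpy as np

I = np.diag([0.4, 0.5, 0.3])        # link inertia tensor (kg m^2), body frame
omega = np.array([0.2, -0.1, 0.5])  # angular velocity (rad/s)
alpha = np.array([0.0, 1.0, 0.0])   # angular acceleration (rad/s^2)

def link_torque(I, omega, alpha):
    """Euler's equation with the gyroscopic cross-product term."""
    return I @ alpha + np.cross(omega, I @ omega)

print(link_torque(I, omega, alpha))
```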

  19. Systematic Computer-Aided Framework for Sustainable Chemical Product Design

    DEFF Research Database (Denmark)

    Cignitti, Stefano; Zhang, Lei; Kalakul, Sawitree

    ... physical property needs and the process/application needs. Process/application and property needs are connected through an analysis of the property influence on the process/application models and thermodynamic relations. The sustainability is considered through product and process/application performance, economics ... designing demand increased sustainability and minimal trade-off with system performance. In the CAPD formulation, the product properties are related to the needs of the heat pump cycle and its components through sensitivity analysis of the thermodynamic models and energy balances of the system. Furthermore, simple models are included for efficient assessment of the sustainability and design criteria of both the cycle and its components. It will be demonstrated that the working fluid product designed is optimal with respect to the sustainability and the heat pump cycle performance....

  20. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  1. Computing the Net Primary Productivity for a Savanna- Dominated ...

    African Journals Online (AJOL)

    komla

    2003-05-19

    May 19, 2003 ... productivity of CO2 (between 1–2% per year) continues, a doubling of the CO2 ... The work ... Numerous isotope mass balance equations are proposed to ... Terrestrial ecoregions of the world: a new map of life on earth.

  2. A micro-computed tomographic evaluation of dentinal microcrack alterations during root canal preparation using single-file Ni-Ti systems.

    Science.gov (United States)

    Li, Mei-Lin; Liao, Wei-Li; Cai, Hua-Xiong

    2018-01-01

    The aim of the present study was to evaluate the length of dentinal microcracks observed prior to and following root canal preparation with different single-file nickel-titanium (Ni-Ti) systems using micro-computed tomography (micro-CT) analysis. A total of 80 mesial roots of mandibular first molars presenting with type II Vertucci canal configurations were scanned at an isotropic resolution of 7.4 µm. The samples were randomly assigned into four groups (n=20 per group) according to the system used for root canal preparation, including the WaveOne (WO), OneShape (OS), Reciproc (RE) and control groups. A second micro-CT scan was conducted after the root canals were prepared with size 25 instruments. Pre- and postoperative cross-section images of the roots (n=237,760) were then screened to identify the lengths of the microcracks. The results indicated that the microcrack lengths were notably increased following root canal preparation (P<0.05) ... files. Among the single-file Ni-Ti systems, WO and RE were not observed to cause notable microcracks, while the OS system resulted in evident microcracks.

  3. Remote sensing of oceanic primary production: Computations using a spectral model

    Digital Repository Service at National Institute of Oceanography (India)

    Sathyendranath, S.; Platt, T.; Caverhill, C.M.; Warnock, R.E.; Lewis, M.R.

    A spectral model of underwater irradiance is coupled with a spectral version of the photosynthesis-light relationship to compute oceanic primary production. The results are shown to be significantly different from those obtained using...
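
    The coupling this record describes, a spectral irradiance model feeding a spectral photosynthesis-light relationship, can be caricatured in a few lines. The sketch below uses generic exponential attenuation E(lambda, z) = E0(lambda) exp(-K(lambda) z) and a saturating P-E curve; all spectra and parameters are invented for illustration and are not the authors' model.

```python
import numpy as np

wl = np.linspace(400, 700, 61)                    # wavelength grid (nm)
E0 = 1.5 * np.exp(-((wl - 490) / 120) ** 2)       # surface irradiance spectrum
K = 0.04 + 0.3 * np.exp(-((wl - 440) / 60) ** 2)  # diffuse attenuation (1/m)
alpha, P_max = 0.01, 2.0                          # P-E slope, assimilation number

def production(z):
    """Primary production at depth z (arbitrary units) from spectral light."""
    E = E0 * np.exp(-K * z)               # spectral irradiance at depth z
    absorbed = np.trapz(alpha * E, wl)    # integrate over wavelength
    return P_max * (1.0 - np.exp(-absorbed / P_max))

for z in (0, 10, 25, 50):
    print(f"z = {z:2d} m: P = {production(z):.3f}")
```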

  4. The Fermilab central computing facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-01-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)

  5. The Fermilab Central Computing Facility architectural model

    International Nuclear Information System (INIS)

    Nicholls, J.

    1989-05-01

    The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs

  6. Time-ordered product expansions for computational stochastic system biology

    International Nuclear Information System (INIS)

    Mjolsness, Eric

    2013-01-01

    The time-ordered product framework of quantum field theory can also be used to understand salient phenomena in stochastic biochemical networks. It is used here to derive Gillespie’s stochastic simulation algorithm (SSA) for chemical reaction networks; consequently, the SSA can be interpreted in terms of Feynman diagrams. It is also used here to derive other, more general simulation and parameter-learning algorithms including simulation algorithms for networks of stochastic reaction-like processes operating on parameterized objects, and also hybrid stochastic reaction/differential equation models in which systems of ordinary differential equations evolve the parameters of objects that can also undergo stochastic reactions. Thus, the time-ordered product expansion can be used systematically to derive simulation and parameter-fitting algorithms for stochastic systems. (paper)
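
    Gillespie's SSA, which the paper re-derives from the time-ordered product expansion, is compact enough to sketch directly. Below is a generic birth-death instance; the rates and the system itself are illustrative, not taken from the paper.

```python
import random

# Birth-death system: 0 -> X with propensity k1; X -> 0 with propensity k2*x.
def ssa(x0, k1, k2, t_end, seed=0):
    rng = random.Random(seed)
    t, x = 0.0, x0
    path = [(t, x)]
    while t < t_end:
        a1, a2 = k1, k2 * x
        a0 = a1 + a2                     # total propensity
        if a0 == 0.0:
            break
        t += rng.expovariate(a0)         # waiting time ~ Exp(a0)
        if rng.random() * a0 < a1:       # choose the reaction proportionally
            x += 1
        else:
            x -= 1
        path.append((t, x))
    return path

print(ssa(0, 2.0, 0.1, 100.0)[-1])  # count relaxes toward k1/k2 = 20
```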

  7. Artificial intelligence in pharmaceutical product formulation: neural computing

    OpenAIRE

    Svetlana Ibrić; Jelena Petrović; Jelena Parojčić; Zorica Djurić

    2009-01-01

    The properties of a formulation are determined not only by the ratios in which the ingredients are combined but also by the processing conditions. Although the relationships between the ingredient levels, processing conditions, and product performance may be known anecdotally, they can rarely be quantified. In the past, formulators tended to use statistical techniques to model their formulations, relying on response surfaces to provide a mechanism for optimization. However, the optimization b...

  8. Environmental Life Cycle Inventory of Crystalline Silicon Photovoltaic System Production. Status 2005-2006 (Excel File)

    International Nuclear Information System (INIS)

    De Wild - Scholten, M.J.; Alsema, E.A.

    2007-03-01

    The authors have assembled this LCI data set to the best of their knowledge, and in their opinion it gives a reliable representation of crystalline silicon module production technology in Western Europe in the year 2005/2006 and of Balance-of-System components of the year 2006. However, most of the data were provided by the companies that cooperated with them. Although they have cross-checked the data from different sources, they cannot guarantee that the data set does not contain any errors. Therefore they cannot accept any responsibility for the use of these data

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  11. Computer-aided Framework for Design of Pure, Mixed and Blended Products

    DEFF Research Database (Denmark)

    Cignitti, Stefano; Zhang, Lei; Gani, Rafiqul

    2015-01-01

    This paper presents a framework for computer-aided design of pure, mixed and blended chemical based products. The framework is a systematic approach to convert a Computer-aided Molecular, Mixture and Blend Design (CAMbD) formulation, based on needs and target properties, into a mixed integer non...

  12. Abstracts of computer programs and data libraries pertaining to photon production data

    International Nuclear Information System (INIS)

    White, J.E.; Manneschmidt, J.B.; Finch, S.Y.; Dickens, J.K.

    1998-01-01

    Abstracts, or descriptions, of computer programs and data libraries pertaining to Photon Production Data (Measurements, Evaluations and Calculations) maintained in the collections of the Radiation Safety Information Computational Center, Oak Ridge, Tennessee USA and at the OECD/NEA Data Bank, Paris, are collected in this document

  13. Abstracts of computer programs and data libraries pertaining to photon production data

    Energy Technology Data Exchange (ETDEWEB)

    White, J.E.; Manneschmidt, J.B.; Finch, S.Y.; Dickens, J.K.

    1998-06-01

    Abstracts, or descriptions, of computer programs and data libraries pertaining to Photon Production Data (Measurements, Evaluations and Calculations) maintained in the collections of the Radiation Safety Information Computational Center, Oak Ridge, Tennessee USA and at the OECD/NEA Data Bank, Paris, are collected in this document.

  14. Application of computer aided tolerance analysis in product design

    International Nuclear Information System (INIS)

    Du Hua

    2009-01-01

    This paper introduces the shortcomings of the traditional tolerance design method and the strengths of the computer-aided tolerancing (CAT) method, compares the strengths and weaknesses of the three tolerance analysis methods, which are Worst Case Analysis, Statistical Analysis and Monte-Carlo Simulation Analysis, and offers the basic procedure and relevant details for CAT. As the study objects, the reactor pressure vessel, the core barrel, the hold-down barrel and the support plate are used to build the tolerance simulation model, based on their 3D design models. The tolerance simulation analysis is then conducted and the scheme of the tolerance distribution is optimized based on the analysis results. (authors)
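
    The three analysis methods named in this record differ mainly in how per-part tolerances combine into an assembly tolerance. A minimal one-dimensional stack-up comparison, with all dimensions hypothetical and normal distributions (sigma = tolerance/3) assumed for the Monte-Carlo run:

```python
import random
import statistics

nominals = [50.0, 20.0, 20.0, 10.0]   # part lengths in a linear stack (mm)
tols = [0.10, 0.05, 0.05, 0.02]       # symmetric tolerances (mm)

worst_case = sum(tols)                           # tolerances add linearly
rss = sum(t ** 2 for t in tols) ** 0.5           # add in quadrature

rng = random.Random(0)
samples = [sum(rng.gauss(n, t / 3) for n, t in zip(nominals, tols))
           for _ in range(100_000)]
mc_3sigma = 3 * statistics.stdev(samples)        # spread of the assembly

print(f"worst case ±{worst_case:.3f}  RSS ±{rss:.3f}  MC 3-sigma ±{mc_3sigma:.3f}")
```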

  15. Kinetic computer modeling of microwave surface-wave plasma production

    International Nuclear Information System (INIS)

    Ganachev, Ivan P.

    2004-01-01

    Kinetic computer plasma modeling occupies an intermediate position between time-consuming rigorous particle dynamics simulation and the fast but rather rough cold- or warm-plasma fluid models. The present paper reviews the kinetic modeling of microwave surface-wave discharges, with an emphasis on recent kinetic self-consistent models, where the external input parameters are reduced to the necessary minimum (frequency and intensity of the applied microwave field, and pressure and geometry of the discharge vessel). The presentation is limited to low pressures, so that the Boltzmann equation is solved in the non-local approximation and collisional electron heating is neglected. The numerical results correctly reproduce the bi-Maxwellian electron energy distribution functions observed experimentally. (author)

  16. Fission Product Experimental Program: Validation and Computational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Leclaire, N.; Ivanova, T.; Letang, E. [Inst Radioprotect and Surete Nucl, F-92262 Fontenay Aux Roses (France); Girault, E. [CEA Valduc, Serv Rech Neutron and Critcite, 21 - Is-sur-Tille (France); Thro, J. F. [AREVA NC, F-78000 Versailles (France)

    2009-02-15

    From 1998 to 2004, a series of critical experiments referred to as the fission product (FP) experimental program was performed at the Commissariat a l'Energie Atomique Valduc research facility. The experiments were designed by the Institut de Radioprotection et de Surete Nucleaire (IRSN) and funded by AREVA NC and IRSN within the French program supporting development of a technical basis for burnup credit validation. The experiments were performed with the following six key fission products encountered in solution, either individually or as mixtures: {sup 103}Rh, {sup 133}Cs, {sup nat}Nd, {sup 149}Sm, {sup 152}Sm, and {sup 155}Gd. The program aimed at compensating for the lack of information on critical experiments involving FPs and at establishing a basis for FP credit validation. One hundred forty-five critical experiments were performed, evaluated, and analyzed with the French CRISTAL criticality safety package and the American SCALE 5.1 code system employing different cross-section libraries. The aim of the paper is to show the potential of the experimental data to improve the ability to perform validation of full burnup credit calculations. The paper describes the three phases of the experimental program, the results of the preliminary evaluation, the calculations, and the sensitivity/uncertainty study of the FP experiments used to validate the APOLLO2-MORET 4 route in the CRISTAL criticality package for burnup credit applications. (authors)

  17. Scientific computing and algorithms in industrial simulations projects and products of Fraunhofer SCAI

    CERN Document Server

    Schüller, Anton; Schweitzer, Marc

    2017-01-01

    The contributions gathered here provide an overview of current research projects and selected software products of the Fraunhofer Institute for Algorithms and Scientific Computing SCAI. They show the wide range of challenges that scientific computing currently faces, the solutions it offers, and its important role in developing applications for industry. Given the exciting field of applied collaborative research and development it discusses, the book will appeal to scientists, practitioners, and students alike. The Fraunhofer Institute for Algorithms and Scientific Computing SCAI combines excellent research and application-oriented development to provide added value for our partners. SCAI develops numerical techniques, parallel algorithms and specialized software tools to support and optimize industrial simulations. Moreover, it implements custom software solutions for production and logistics, and offers calculations on high-performance computers. Its services and products are based on state-of-the-art metho...

  18. A computer-controlled ultrasonic measurement and test equipment for rotation-symmetrical products

    International Nuclear Information System (INIS)

    Abend, K.; Lang, R.; Schmidt, U.; Schuett, U.; Sterberg, W.

    1976-01-01

    During the production of rotation-symmetrical thin-walled precision tubes, dimensions must be measured and the tubes have to be inspected for surface defects. Several measurement points for different applications are located at different places within the production area of the tubes. The paper describes their on-line connection to a process-computer system

  19. Computer Graphics Orientation and Training in a Corporate/Production Environment.

    Science.gov (United States)

    McDevitt, Marsha Jean

    This master's thesis provides an overview of a computer graphics production environment and proposes a realistic approach to orientation and on-going training for employees working within a fast-paced production schedule. Problems involved in meeting the training needs of employees are briefly discussed in the first chapter, while the second…

  20. Educational Impact of Digital Visualization Tools on Digital Character Production Computer Science Courses

    Science.gov (United States)

    van Langeveld, Mark Christensen

    2009-01-01

    Digital character production courses have traditionally been taught in art departments. The digital character production course at the University of Utah is centered, drawing uniformly from art and engineering disciplines. Its design has evolved to include a synergy of computer science, functional art and human anatomy. It gives students an…

  1. The impact of computers on productivity in the trade sector : Explorations with Dutch microdata

    NARCIS (Netherlands)

    Broersma, L.; McGuckin, R.H.; Timmer, M.P.

    The impact of computers on productivity in the Dutch trade sector during the period 1988-1994 is examined. The analysis is based on a panel data set derived from the Production Survey of Statistics Netherlands, which includes data on output, employment, wages, and various types of investment. A new

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  5. Issues on the Development and Application of Computer Tools to Support Product Structuring and Configuring

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Riitahuhta, A.

    2001-01-01

    The aim of this article is to take stock of the results and challenges in the efforts to develop computer tools to support product structuring and configuring in product development projects. The balance will be made in two dimensions, a design science and an industrial dimension. The design ...... that there are large positive effects to be gained for industrial companies by consciously implementing computer tools based on the results of design science. The positive effects will be measured by e.g. predictable product quality, reduced lead time, and reuse of design solutions....

  6. Dynamic Non-Hierarchical File Systems for Exascale Storage

    Energy Technology Data Exchange (ETDEWEB)

    Long, Darrell E. [Univ. of California, Santa Cruz, CA (United States); Miller, Ethan L [Univ. of California, Santa Cruz, CA (United States)

    2015-02-24

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications. To achieve this goal we proposed: to develop the first HEC-targeted file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community: while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40-year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search
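
    The non-hierarchical interface argued for above can be pictured as a flat index queried by attribute rather than walked by path. The following Python sketch is purely illustrative: the record layout, field names and query helper are invented here, not taken from the project.

        # Minimal sketch of metadata-based (non-hierarchical) file lookup.
        # All names and fields are hypothetical.
        from dataclasses import dataclass, field

        @dataclass
        class FileRecord:
            name: str
            metadata: dict = field(default_factory=dict)  # rich metadata + provenance

        index = [
            FileRecord("run_0042.h5", {"experiment": "climate", "year": 2012,
                                       "derived_from": "raw_0042.dat"}),
            FileRecord("run_0043.h5", {"experiment": "climate", "year": 2013,
                                       "derived_from": "raw_0043.dat"}),
        ]

        def find(idx, **predicates):
            """Locate files by what they are, not by where they live."""
            return [r for r in idx
                    if all(r.metadata.get(k) == v for k, v in predicates.items())]

        print(find(index, experiment="climate", year=2013))  # -> run_0043.h5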

  7. Computational Model of D-Region Ion Production Caused by Energetic Electron Precipitations Based on General Monte Carlo Transport Calculations

    Science.gov (United States)

    Kouznetsov, A.; Cully, C. M.

    2017-12-01

    During enhanced magnetic activities, large ejections of energetic electrons from the radiation belts are deposited in the upper polar atmosphere, where they play important roles in its physical and chemical processes, including the subionospheric propagation of VLF signals. Electron deposition can affect D-region ionization, which is estimated from ionization rates derived from energy depositions. We present a model of D-region ion production caused by an arbitrary (in energy and pitch angle) distribution of fast (10 keV - 1 MeV) electrons. The model relies on a set of pre-calculated results obtained using a general Monte Carlo approach with the latest version of the MCNP6 (Monte Carlo N-Particle) code for explicit electron tracking in magnetic fields. By expressing those results as ionization yield functions, the pre-calculated results are extended to cover arbitrary magnetic field inclinations and atmospheric density profiles, allowing computation of ionization rate altitude profiles between 20 and 200 km at any geographic point of interest and date/time by adopting results from an external atmospheric density model (e.g. NRLMSISE-00). The pre-calculated MCNP6 results are stored in a CDF (Common Data Format) file, and an IDL routine library is written to provide an end-user interface to the model.
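
    In outline, such a model reduces to weighting pre-computed yield functions by the incident electron spectrum. The Python sketch below shows that reduction only; the altitude and energy grids, the yield table and the spectrum are invented placeholders (the actual model reads MCNP6 results from a CDF file and takes densities from an external model such as NRLMSISE-00).

        # Schematic ionization-rate profile from pre-computed yield functions.
        # All numbers here are placeholders, not the model's data.
        import numpy as np

        alt_km = np.linspace(20, 200, 91)        # altitude grid [km]
        e_keV = np.logspace(1, 3, 40)            # electron energies, 10 keV - 1 MeV
        # yield_fn[i, j]: ion-pair production around alt_km[i] per incident
        # electron of energy e_keV[j] (placeholder random table)
        yield_fn = np.random.rand(alt_km.size, e_keV.size)

        def ionization_rate(flux):
            """flux: differential electron flux on e_keV [el / (cm^2 s keV)]."""
            dE = np.gradient(e_keV)              # energy bin widths
            return yield_fn @ (flux * dE)        # rate profile on alt_km

        spectrum = 1e5 * (e_keV / 100.0) ** -3   # example power-law spectrum
        q = ionization_rate(spectrum)            # schematic ion production rate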

  8. Computer-aided approach for design of tailor-made blended products

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza; Gernaey, Krist; Woodley, John

    2012-01-01

    A computer-aided methodology has been developed for the design of blended (mixture) products. Through this methodology, it is possible to identify the most suitable chemicals for blending, and “tailor” the blend according to specified product needs (usually product attributes, e.g. performance...... as well as regulatory). The product design methodology has four tasks. First, the design problem is defined: the product needs are identified, translated into target properties and the constraints for each target property are defined. Secondly, target property models are retrieved from a property model...

  9. Grid collector: An event catalog with automated file management

    International Nuclear Information System (INIS)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading the unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. Users can select events based on tags, such as "production date between March 10 and 20, and the number of charged tracks > 100". The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users
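
    The tag-based selection described above can be sketched as a lazy query over an event catalog that stages files only when they contain matching events. The catalog layout and the staging stub below are invented for illustration and are not the Grid Collector's actual API.

        # Toy tag-based event selection in the spirit of the Grid Collector.
        from datetime import date

        catalog = [  # (file name, per-event tag dictionaries)
            ("run100.root", [{"prod_date": date(2003, 3, 12), "n_charged": 140},
                             {"prod_date": date(2003, 3, 12), "n_charged": 80}]),
            ("run101.root", [{"prod_date": date(2003, 4, 2), "n_charged": 250}]),
        ]

        def stage(path):
            # stand-in for a mass-storage-to-disk (or cross-Grid) transfer
            print(f"staging {path} ...")
            return path

        def select_events(predicate):
            """Yield matching events, staging only the files that hold them."""
            for path, events in catalog:
                hits = [tags for tags in events if predicate(tags)]
                if hits:
                    local = stage(path)
                    yield from ((local, tags) for tags in hits)

        # "production date between March 10 and 20, and charged tracks > 100"
        query = lambda t: (date(2003, 3, 10) <= t["prod_date"] <= date(2003, 3, 20)
                           and t["n_charged"] > 100)
        for f, tags in select_events(query):
            print(f, tags)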

  10. Grid collector: An event catalog with automated file management

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-10-17

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading the unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. Users can select events based on tags, such as "production date between March 10 and 20, and the number of charged tracks > 100". The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users.

  11. File sharing

    NARCIS (Netherlands)

    van Eijk, N.

    2011-01-01

    'File sharing' has become generally accepted on the Internet. Users share files for downloading music, films, games, software etc. In this note, we take a closer look at the definition of file sharing, the legal and policy-based context as well as enforcement issues. The economic and cultural

  12. Computation for LHC experiments: a worldwide computing grid

    Energy Technology Data Exchange (ETDEWEB)

    Fairouz, Malek [Universite Joseph-Fourier, LPSC, CNRS-IN2P3, Grenoble I, 38 (France)

    2010-08-15

    In normal operating conditions the LHC detectors are expected to record about 10¹⁰ collisions each year. The processing of all the consequent experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10⁹ bytes per second and a recording capacity of a few tens of 10¹⁵ bytes each year. In order to meet this challenge, a computing network involving the distribution and sharing of tasks has been set up: the W-LCG grid (Worldwide LHC Computing Grid), made up of four tiers. Tier 0 is the computing centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and for dispatching it to the 11 Tier 1 centres. A Tier 1 is typically a national centre; it is responsible for making a copy of the raw data and for processing it in order to recover relevant data with a physical meaning and to transfer the results to the 150 Tier 2 centres. A Tier 2 is at the level of an institute or laboratory; it is in charge of the final analysis of the data and of the production of simulations. Tier 3 centres, at the level of individual laboratories, provide a complementary and local resource to Tier 2 in terms of data analysis. (A.C.)

  13. Computed micro-tomographic evaluation of glide path with nickel-titanium rotary PathFile in maxillary first molars curved canals.

    Science.gov (United States)

    Pasqualini, Damiano; Bianchi, Caterina Chiara; Paolino, Davide Salvatore; Mancini, Lucia; Cemenasco, Andrea; Cantatore, Giuseppe; Castellucci, Arnaldo; Berutti, Elio

    2012-03-01

    X-ray computed micro-tomography scanning allows high-resolution 3-dimensional imaging of small objects. In this study, micro-CT scanning was used to compare the ability of manual and mechanical glide path preparation to maintain the original root canal anatomy. Eight extracted upper first permanent molars were scanned at the TOMOLAB station at the ELETTRA Synchrotron Light Laboratory in Trieste, Italy, with a microfocus cone-beam geometry system. A total of 2,400 projections over 360° were acquired at 100 kV and 80 μA, with a focal spot size of 8 μm. Buccal root canals of each specimen (n = 16) were randomly assigned to PathFile (P) or stainless-steel K-file (K) to perform the glide path at the full working length. Specimens were then microscanned at the apical level (A) and at the point of maximum curvature (C) for post-treatment analyses. Curvatures of root canals were classified as moderate (≤35°) or severe (≥40°). The ratio of diameter ratios (RDR) and the ratio of cross-sectional areas (RA) were assessed. For each level of analysis (A and C), 2 balanced 2-way factorial analyses of variance (P < .05) were performed to evaluate the significance of the instrument factor and the canal curvature factor, as well as the interactions of the factors with both RDR and RA. Specimens in the K group had a mean curvature of 35.4° ± 11.5°; those in the P group had a curvature of 38° ± 9.9°. The instrument factor (P and K) was extremely significant (P < .001) for both the RDR and RA parameters, regardless of the point of analysis. Micro-CT scanning confirmed that NiTi rotary PathFile instruments preserve the original canal anatomy and cause fewer canal aberrations.

  14. Enterprise logic vs product logic: the development of GE’s computer product line

    OpenAIRE

    Gandy, Anthony; Edwards, Roy

    2017-01-01

    The following article focuses on corporate strategies at General Electric (GE) and how corporate-level interventions impacted the market performance of the firm’s general purpose commercial mainframe product set in the period 1960–1968. We show that in periods of both divisional independent planning and corporate-level planning strategic governance, central decisions interfered in the execution of GE’s product strategy. GE’s institutional ‘enterprise logic’ negatively impacted the ‘product lo...

  15. Portable File Format (PFF) specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Created at Sandia National Laboratories, the Portable File Format (PFF) allows binary data transfer across computer platforms. Although this capability is supported by many other formats, PFF files are still in use at Sandia, particularly in pulsed power research. This report provides detailed PFF specifications for accessing data without relying on legacy code.

  16. Computer-Aided Chemical Product Design Framework: Design of High Performance and Environmentally Friendly Refrigerants

    DEFF Research Database (Denmark)

    Cignitti, Stefano; Zhang, Lei; Gani, Rafiqul

    Which properties and needs should carefully be selected for a given heat pump cycle to ensure that an optimum refrigerant is found? How can cycle performance and environmental criteria be integrated at the product design stage and not in post-design analysis? Computer-aided product design methods enable...... the possibility of designing novel molecules, mixtures and blends, such as refrigerants, through a systematic framework (Cignitti et al., 2015; Yunus et al., 2014). In this presentation a computer-aided framework is presented for chemical product design through mathematical optimization. Here, molecules, mixtures...... and blends are systematically designed through a decomposition-based solution method. Given a problem definition, a computer-aided molecular design (CAMD) problem is defined, which is formulated as a mixed-integer nonlinear program (MINLP). The decomposed solution method then sequentially divides the MINLP...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also deployed at CERN, adding to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  18. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  20. CINDA 99, supplement 2 to CINDA 97 (1988-1999). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1999-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data available from four regional neutron data centres. The present issue, CINDA 99, is the second supplement to CINDA 97, the index to the literature on neutron data published after 1987. It supersedes the first supplement, CINDA 98. The complete CINDA file as of 1 June 1999 is contained in: the archival issue CINDA-A (5 volumes, 1990), CINDA 97 and the current issue CINDA 99. The compilation and publication of CINDA are the result of worldwide co-operation involving the following four data centres. Each centre is responsible for compiling the CINDA entries from the literature published in a defined geographical area given in brackets below: the USA National Nuclear Data Center at the Brookhaven National Laboratory, USA (United States of America and Canada); the Russian Nuclear Data Centre at the Fiziko-Energeticheskij Institut, Obninsk, Russian Federation (former USSR countries); the NEA Data Bank in Paris, France (European OECD member countries in Western Europe and Japan); and the IAEA Nuclear Data Section in Vienna, Austria (all other countries in Eastern Europe, Asia, Australia, Africa, Central and South America; also IAEA publications and translation journals). Besides the published CINDA books, up-to-date computer retrievals for specified CINDA information are currently available on request from the responsible CINDA centres, or via direct access to the on-line services as described in this publication

  1. File-System Workload on a Scientific Multiprocessor

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1995-01-01

    Many scientific applications have intense computational and I/O requirements. Although multiprocessors have permitted astounding increases in computational performance, the formidable I/O needs of these applications cannot be met by current multiprocessors and their I/O subsystems. To prevent I/O subsystems from forever bottlenecking multiprocessors and limiting the range of feasible applications, new I/O subsystems must be designed. The successful design of computer systems (both hardware and software) depends on a thorough understanding of their intended use. A system designer optimizes the policies and mechanisms for the cases expected to be most common in the user's workload. In the case of multiprocessor file systems, however, designers have been forced to build file systems based only on speculation about how they would be used, extrapolating from file-system characterizations of general-purpose workloads on uniprocessor and distributed systems or scientific workloads on vector supercomputers (see sidebar on related work). To help these system designers, in June 1993 we began the Charisma Project, so named because the project sought to characterize I/O in scientific multiprocessor applications from a variety of production parallel computing platforms and sites. The Charisma project is unique in recording individual read and write requests in live, multiprogramming, parallel workloads (rather than from selected or nonparallel applications). In this article, we present the first results from the project: a characterization of the file-system workload of an iPSC/860 multiprocessor running production, parallel scientific applications at NASA's Ames Research Center.

  2. Spice Products Available to The Planetary Science Community

    Science.gov (United States)

    Acton, Charles

    1999-01-01

    This paper presents the availability of SPICE products to the planetary science community. The topics include: 1) What Are SPICE Data; 2) SPICE File Types; 3) SPICE Software; 4) Examples of What Can Be Computed Using SPICE Data and Software; and 5) SPICE File Availability.

  3. Computations of concentration of radon and its decay products against time. Computer program

    Energy Technology Data Exchange (ETDEWEB)

    Machaj, B. [Institute of Nuclear Chemistry and Technology, Warsaw (Poland)

    1996-12-31

    This research aims to develop a device for continuous monitoring of radon in air by measuring the alpha activity of radon and its short-lived decay products. The variation of the alpha activity of radon and its daughters with time influences the measured results, so a measurement based on the alpha radiation of radon and its short-lived decay products requires knowledge of how the concentrations of radon and its decay products vary with time. A computer program in the Turbo Pascal language was therefore developed to perform these computations using the known relations involved, the program being adapted for IBM PC computers. The presented program enables computation of the activity of ²²²Rn and its daughter products ²¹⁸Po, ²¹⁴Pb, ²¹⁴Bi and ²¹⁴Po every 1 min within the period of 0-255 min for any state of radiation equilibrium between the radon and its daughter products. The program also permits computation of the alpha activity of ²²²Rn + ²¹⁸Po + ²¹⁴Po against time and the total alpha activity over a selected interval of time. The results of the computations are stored on the computer hard disk in ASCII format and can be used by a graphics program, e.g. DrawPerfect, to make diagrams. The equations employed for computation of the alpha activity of radon and its decay products, as well as a description of the program functions, are given. (author). 2 refs, 4 figs.
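
    The underlying computation is the solution of the Bateman equations for the ²²²Rn decay chain. The Python sketch below integrates them numerically under stated assumptions: standard half-lives, an initial state of pure radon with unit activity, and ²¹⁴Po taken to be in secular equilibrium with ²¹⁴Bi (its half-life is only ~164 μs). It illustrates the method; it is not a port of the Pascal program.

        # Bateman equations for 222Rn -> 218Po -> 214Pb -> 214Bi (-> 214Po).
        import numpy as np
        from scipy.integrate import solve_ivp

        T_HALF_MIN = {"Rn222": 5504.0, "Po218": 3.05,
                      "Pb214": 26.8, "Bi214": 19.7}          # half-lives [min]
        lam = {k: np.log(2) / v for k, v in T_HALF_MIN.items()}

        def chain(t, n):                                     # n: numbers of atoms
            rn, po, pb, bi = n
            return [-lam["Rn222"] * rn,
                    lam["Rn222"] * rn - lam["Po218"] * po,
                    lam["Po218"] * po - lam["Pb214"] * pb,
                    lam["Pb214"] * pb - lam["Bi214"] * bi]

        n0 = [1.0 / lam["Rn222"], 0.0, 0.0, 0.0]             # pure radon, unit activity
        t = np.arange(256)                                   # every 1 min, 0-255 min
        sol = solve_ivp(chain, (0, 255), n0, t_eval=t, rtol=1e-8)

        act = {iso: lam[iso] * sol.y[i] for i, iso in enumerate(T_HALF_MIN)}
        # 214Po follows 214Bi almost instantly, so the total alpha activity is
        # approximately A(222Rn) + A(218Po) + A(214Bi).
        alpha_total = act["Rn222"] + act["Po218"] + act["Bi214"]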

  4. Productization and Commercialization of IT-Enabled Higher Education in Computer Science: A Systematic Literature Review

    Science.gov (United States)

    Kankaanpää, Irja; Isomäki, Hannakaisa

    2013-01-01

    This paper reviews research literature on the production and commercialization of IT-enabled higher education in computer science. Systematic literature review (SLR) was carried out in order to find out to what extent this area has been studied, more specifically how much it has been studied and to what detail. The results of this paper make a…

  5. An Alternative Method for Computing Unit Costs and Productivity Ratios. AIR 1984 Annual Forum Paper.

    Science.gov (United States)

    Winstead, Wayland H.; And Others

    An alternative measure for evaluating the performance of academic departments was studied. A comparison was made with the traditional manner of computing unit costs and productivity ratios: prorating the salary and effort of each faculty member to each course level based on the personal mix of courses taught. The alternative method used averaging…

  6. Computational issues in a stochastic finite horizon one product recovery inventory model

    NARCIS (Netherlands)

    Kiesmüller, G.P.; Scherer, C.W.

    2003-01-01

    Inderfurth [OR Spektrum 19 (1997) 111] and Simpson [Operations Research 26 (1978) 270] have shown how the optimal decision rules in a stochastic one product recovery system with equal leadtimes can be characterized. Using these results we provide in this paper a method for the exact computation of

  7. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  8. Computer Aided Methods & Tools for Separation & Purification of Fine Chemical & Pharmaceutical Products

    DEFF Research Database (Denmark)

    Afonso, Maria B.C.; Soni, Vipasha; Mitkowski, Piotr Tomasz

    2006-01-01

    An integrated approach that is particularly suitable for solving problems related to product-process design from the fine chemicals, agrochemicals, food and pharmaceutical industries is presented together with the corresponding methods and tools, which forms the basis for an integrated computer...

  9. The importance of ergonomic design in product innovation. Lessons from the development of the portable computer

    NARCIS (Netherlands)

    Windrum, P.; Frenken, K.; Green, Lawrence

    2017-01-01

    The article addresses the role of ergonomic design in product innovation. Designers meet users’ needs by developing solutions to complex trade-offs—reverse salients—between a product’s characteristics. The fundamental ergonomic design challenge in portable computers concerns the reverse salient

  10. 77 FR 66866 - Certain Computer Forensic Devices and Products Containing the Same Notice of Request for...

    Science.gov (United States)

    2012-11-07

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-799] Certain Computer Forensic Devices and Products Containing the Same Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that the presiding...

  11. Experience of BESIII data production with local cluster and distributed computing model

    International Nuclear Information System (INIS)

    Deng, Z Y; Li, W D; Liu, H M; Sun, Y Z; Zhang, X M; Lin, L; Nicholson, C; Zhemchugov, A

    2012-01-01

    The BES III detector is a new spectrometer which works on the upgraded high-luminosity collider, BEPCII. The BES III experiment studies physics in the tau-charm energy region from 2 GeV to 4.6 GeV. From 2009 to 2011, BEPCII has produced 106M ψ(2S) events, 225M J/ψ events, 2.8 fb⁻¹ of ψ(3770) data, and 500 pb⁻¹ of data at 4.01 GeV. All the data samples were processed successfully and many important physics results have been achieved based on these samples. Doing data production correctly and efficiently with limited CPU and storage resources is a big challenge. This paper describes the implementation of the experiment-specific data production for BESIII in detail, including data calibration with an event-level parallel computing model, data reconstruction, inclusive Monte Carlo generation, random trigger background mixing and multi-stream data skimming. Now, with the data sample increasing rapidly, there is a growing demand to move from solely using a local cluster to a more distributed computing model. A distributed computing environment is being set up and is expected to go into production use in 2012. The experience of BESIII data production, both with a local cluster and with a distributed computing model, is presented here.

  12. Analyses of Receptive and Productive Korean EFL Vocabulary: Computer-Based Vocabulary Learning Program

    Science.gov (United States)

    Kim, Scott Sungki

    2013-01-01

    The present research study investigated the effects of 8 versions of a computer-based vocabulary learning program on receptive and productive knowledge levels of college students. The participants were 106 male and 103 female Korean EFL students from Kyungsung University and Kwandong University in Korea. Students who participated in versions of…

  13. In-vitro Assessing the Shaping Ability of Three Nickel-Titanium Rotary Single File Systems by Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Ali Imad Al-Asadi

    2018-02-01

    The aim of the study was to evaluate the canal transportation and centering ability of three nickel-titanium single-file rotary systems by cone-beam computed tomography (CBCT). Materials and methods: Thirty permanent maxillary first molars with mesiobuccal canal curvatures ranging from 20 to 30 degrees were selected and assigned to three groups (n = 10), according to the biomechanical preparation system used: HyFlex EDM (HF), Reciproc Blue (RB) and OneShape (OS). The samples were scanned by CBCT after being mounted on a customized acrylic base and then rescanned after instrumentation. Slices from the axial section were taken from both exposures at 3 mm, 6 mm and 9 mm from the root apex, corresponding to the apical, middle and coronal thirds respectively. Data were statistically analyzed using Kruskal-Wallis and Mann-Whitney U tests at the 5% confidence level. Results: There were no significant differences at the apical and coronal thirds and a significant difference at the middle third regarding canal transportation. However, there was a significant difference at the apical third and no significant difference at the middle and coronal thirds regarding centering ratio. Conclusion: All three single-file rotary systems produced a degree of canal transportation and deviation in centering ratio, but HyFlex EDM produced the least.

  14. Canal transportation and centering ability of protaper and self-adjusting file system in long oval canals: An ex-vivo cone-beam computed tomography analysis.

    Science.gov (United States)

    Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak

    2017-01-01

    The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups (n = 31) according to the systems used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and t-test. The SAF showed better centering ability and less canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had statistically significantly better centering and less canal transportation in the buccolingual as compared to the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of both systems was comparable.

  15. Building a parallel file system simulator

    International Nuclear Information System (INIS)

    Molina-Estolano, E; Maltzahn, C; Brandt, S A; Bent, J

    2009-01-01

    Parallel file systems are gaining in popularity in high-end computing centers as well as commercial data centers. High-end computing systems are expected to scale exponentially and to pose new challenges to their storage scalability in terms of cost and power. To address these challenges, scientists and file system designers will need a thorough understanding of the design space of parallel file systems. Yet there exist few systematic studies of parallel file system behavior at petabyte and exabyte scale. An important reason is the significant cost of getting access to large-scale hardware to test parallel file systems. To contribute to this understanding we are building a parallel file system simulator that can simulate parallel file systems at very large scale. Our goal is to simulate petabyte-scale parallel file systems on a small cluster or even a single machine in reasonable time and with reasonable fidelity. With this simulator, file system experts will be able to tune existing file systems for specific workloads, scientists and file system deployment engineers will be able to better communicate workload requirements, file system designers and researchers will be able to try out design alternatives and innovations at scale, and instructors will be able to study very large-scale parallel file system behavior in the classroom. In this paper we describe our approach and provide preliminary results that are encouraging both in terms of fidelity and simulation scalability.

  16. Neural computation of visual imaging based on Kronecker product in the primary visual cortex

    Directory of Open Access Journals (Sweden)

    Guozheng Yao

    2010-03-01

    Background: What kind of neural computation is actually performed by the primary visual cortex, and how is this represented mathematically at the system level? This is an important problem in visual information processing, but it has not been well answered. In this paper, according to our understanding of retinal organization and the parallel multi-channel topographical mapping between the retina and primary visual cortex V1, we divide an image into an orthogonal and orderly array of image primitives (or patches), in which each patch will evoke activities of simple cells in V1. From the viewpoint of information processing, this activation process essentially involves optimal detection and optimal matching of the receptive fields of simple cells with features contained in the image patches. For reconstruction of the visual image in visual cortex V1 based on the principle of minimum mean squared error, it is natural to use the inner product expression in neural computation, which is then transformed into matrix form. Results: The inner product is carried out using the Kronecker product between the patches and the functional architecture (or functional columns) in localized and oriented neural computing. Compared with the Fourier transform, the mathematical description of the Kronecker product is simple and intuitive, so the algorithm is more suitable for the neural computation of visual cortex V1. Results of computer simulations based on two-dimensional Gabor pyramid wavelets show that the theoretical analysis and the proposed model are reasonable. Conclusions: Our results are: 1. The neural computation of the retinal image in cortex V1 can be expressed as a Kronecker product operation and its matrix form; this algorithm is implemented by the inner operation between retinal image primitives and the primary visual cortex's columns. It has simple, efficient and robust features and is therefore a neural algorithm that can be completed by biological vision. 2. It is more suitable
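
    The matrix form referred to in the conclusions is the standard vec/Kronecker identity vec(F·P) = (Pᵀ ⊗ I)·vec(F). The numpy sketch below checks that identity for a toy Gabor filter bank and random patches; the sizes and parameters are invented for illustration and are not the paper's.

        # Inner products of image patches with Gabor receptive fields,
        # written both directly and in Kronecker-product matrix form.
        import numpy as np

        def gabor(size, theta, freq=0.25, sigma=2.0):
            """2-D Gabor receptive field, the usual simple-cell model."""
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)
            env = np.exp(-(x**2 + y**2) / (2 * sigma**2))
            return env * np.cos(2 * np.pi * freq * xr)

        thetas = np.linspace(0, np.pi, 4, endpoint=False)       # oriented columns
        F = np.vstack([gabor(5, th).ravel() for th in thetas])  # (4, 25) filter bank
        P = np.random.rand(25, 6)                               # six flattened patches

        R = F @ P                      # all filter/patch inner products at once
        # Kronecker form: vec(R) = (P^T kron I) vec(F), with column-major vec
        vecR = np.kron(P.T, np.eye(F.shape[0])) @ F.ravel(order="F")
        assert np.allclose(vecR, R.ravel(order="F"))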

  17. Design and application of remote file management system

    International Nuclear Information System (INIS)

    Zhu Haijun; Liu Dekang; Shen liren

    2006-01-01

    The File Transfer Protocol helps users transfer files between computers on the Internet. FTP cannot fulfill users' needs on special occasions, so the programmer needs to define a file transfer protocol of his own, tailored to the users. The method of realization and the application of such a user-defined file transfer protocol are introduced. (authors)
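
    As a flavour of what such a user-defined protocol might look like, here is a minimal length-prefixed transfer over TCP in Python. The frame layout (2-byte name length, name, 8-byte payload length, payload) is invented for this sketch and is not the protocol from the paper.

        # Toy user-defined file transfer protocol over a TCP socket.
        import pathlib
        import socket
        import struct

        def send_file(host, port, path):
            data = pathlib.Path(path).read_bytes()
            name = pathlib.Path(path).name.encode()
            with socket.create_connection((host, port)) as s:
                s.sendall(struct.pack("!H", len(name)) + name)  # name frame
                s.sendall(struct.pack("!Q", len(data)) + data)  # payload frame

        def recv_file(port, out_dir="."):
            with socket.create_server(("", port)) as srv:
                conn, _ = srv.accept()
                with conn:
                    def read_exact(n):
                        buf = b""
                        while len(buf) < n:
                            chunk = conn.recv(n - len(buf))
                            if not chunk:
                                raise ConnectionError("peer closed early")
                            buf += chunk
                        return buf
                    (nlen,) = struct.unpack("!H", read_exact(2))
                    name = read_exact(nlen).decode()
                    (dlen,) = struct.unpack("!Q", read_exact(8))
                    out = pathlib.Path(out_dir, name)
                    out.write_bytes(read_exact(dlen))
            return str(out)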

  18. Quantum computation via local control theory: Direct sum vs. direct product Hilbert spaces

    International Nuclear Information System (INIS)

    Sklarz, Shlomo E.; Tannor, David J.

    2006-01-01

    The central objective in any quantum computation is the creation of a desired unitary transformation; the mapping that this unitary transformation produces between the input and output states is identified with the computation. In [S.E. Sklarz, D.J. Tannor, arXiv:quant-ph/0404081 (submitted to PRA) (2004)] it was shown that local control theory can be used to calculate fields that will produce such a desired unitary transformation. In contrast with previous strategies for quantum computing based on optimal control theory, the local control scheme maintains the system within the computational subspace at intermediate times, thereby avoiding unwanted decay processes. In [S.E. Sklarz et al.], the structure of the Hilbert space had a direct sum structure with respect to the computational register and the mediating states. In this paper, we extend the formalism to the important case of a direct product Hilbert space. The final equations for the control algorithm for the two cases are remarkably similar in structure, despite the fact that the derivations are completely different and that in one case the dynamics is in a Hilbert space and in the other case the dynamics is in a Liouville space. As shown in [S.E. Sklarz et al.], the direct sum implementation leads to a computational mechanism based on virtual transitions, and can be viewed as an extension of the principles of Stimulated Raman Adiabatic Passage from state manipulation to evolution operator manipulation. The direct product implementation developed here leads to the intriguing concept of virtual entanglement - computation that exploits second-order transitions that pass through entangled states but that leaves the subsystems nearly separable at all intermediate times. Finally, we speculate on a connection between the algorithm developed here and the concept of decoherence free subspaces

  19. Computer-guided total synthesis of natural products: Recent examples and future perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Della-Felice, Franco; Pilli, Ronaldo A. [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Instituto de Química; Sarotti, Ariel M., E-mail: pilli@iqm.unicamp.br, E-mail: sarotti@iquir-conicet.gov.ar [Instituto de Química, Universidad Nacional de Rosario-CONICET (Argentina)

    2018-05-01

    Quantum chemical calculations of nuclear magnetic resonance (NMR) shifts and coupling constants have been extensively employed in recent years mainly to facilitate structural elucidation of organic molecules. When the results of such calculations are used to determine the most likely structure of a natural product in advance, guiding the subsequent synthetic work, the term 'computer-guided synthesis' could be coined. This review article describes the most relevant examples from recent literature, highlighting the scope and limitations of this merged computational/experimental approach as well. (author)

  20. The Optimal Pricing of Computer Software and Other Products with High Switching Costs

    OpenAIRE

    Pekka Ahtiala

    2004-01-01

    The paper studies the determinants of the optimum prices of computer programs and their upgrades. It is based on the notion that because of the human capital invested in the use of a computer program by its user, this product has high switching costs, and on the finding that pirates are responsible for generating over 80 per cent of new software sales. A model to maximize the present value of the program to the program house is constructed to determine the optimal prices of initial programs a...

  1. Computer-guided total synthesis of natural products: Recent examples and future perspectives

    International Nuclear Information System (INIS)

    Della-Felice, Franco; Pilli, Ronaldo A.

    2018-01-01

    Quantum chemical calculations of nuclear magnetic resonance (NMR) shifts and coupling constants have been extensively employed in recent years mainly to facilitate structural elucidation of organic molecules. When the results of such calculations are used to determine the most likely structure of a natural product in advance, guiding the subsequent synthetic work, the term 'computer-guided synthesis' could be coined. This review article describes the most relevant examples from recent literature, highlighting the scope and limitations of this merged computational/experimental approach as well. (author)

  2. Zebra: A striped network file system

    Science.gov (United States)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails, its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong to. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity updates.
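
    The parity scheme is the usual RAID-style bytewise XOR across the fragments of a stripe: if any single fragment is lost, it can be rebuilt from the survivors plus the parity fragment. A minimal Python sketch follows (fragment contents invented for illustration).

        # XOR parity over stripe fragments, as in RAID-style striping.
        from functools import reduce

        def xor_fragments(frags):
            assert len({len(f) for f in frags}) == 1, "equal-size fragments"
            return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*frags))

        data = [b"stripe-frag-1", b"stripe-frag-2", b"stripe-frag-3"]
        parity = xor_fragments(data)          # stored on the parity server

        # a server holding data[1] fails; rebuild its fragment
        rebuilt = xor_fragments([data[0], data[2], parity])
        assert rebuilt == data[1]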

  3. Development of computer-aided design and production system for nuclear power plant

    International Nuclear Information System (INIS)

    Ishii, Masanori

    1983-01-01

    The technical requirements related to the design and production of nuclear power stations have tended to increase from the viewpoint of safety and reliability, and it is indispensable to cope with these requirements skillfully in order to rationalize design and production and to construct highly reliable plants. Ishikawajima-Harima Heavy Industries Co., Ltd., has developed a computer-aided design data information and engineering system which performs dialogue-type design and drawing; as a result, a consistent design-production system was developed to carry out stress analysis, production design, production management and the output of data for numerically controlled machine tools. In this paper, mainly the consistent system in the field of plant design centering around piping, and also the computer system for the design of vessels and other equipment, are outlined. The features of the design work for nuclear power plants, the rationalization of the design and production management of piping and vessels, and the application of the CAD system to other general equipment and improvement works are reported. This system is a powerful means of meeting the requirements of higher quality and lower cost. (Kako, I.)

  4. Computational Modelling of Large Scale Phage Production Using a Two-Stage Batch Process

    Directory of Open Access Journals (Sweden)

    Konrad Krysiak-Baltyn

    2018-04-01

    Cost-effective and scalable methods for phage production are required to meet an increasing demand for phage as an alternative to antibiotics. Computational models can assist the optimization of such production processes. A model is developed here that can simulate the dynamics of phage population growth and production in a two-stage, self-cycling process. The model incorporates variable infection parameters as a function of bacterial growth rate and employs ordinary differential equations, allowing application to a setup with multiple reactors. The model provides simple cost estimates as a function of key operational parameters including substrate concentration, feed volume and cycling times. For the phage and bacteria pairing examined, costs and productivity varied by three orders of magnitude, with the lowest cost found to be most sensitive to the influent substrate concentration and the low-level setting in the first vessel. An example case study of phage production is also presented, showing how parameter values affect production costs and estimating production times. The approach presented is flexible and can be used to optimize phage production at laboratory or factory scale by minimizing costs or maximizing productivity.
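
    A minimal version of such an ODE model couples substrate, uninfected hosts, infected hosts and free phage. The sketch below is illustrative only: it uses Monod growth and a constant mass-action infection term with invented parameter values, whereas the paper's model varies the infection parameters with bacterial growth rate.

        # Toy batch phage-production ODE model (illustrative parameters).
        from scipy.integrate import solve_ivp

        mu_max, Ks = 0.8, 0.5        # max growth rate [1/h], Monod constant [g/L]
        Y = 5e8                      # cell yield [cells/mL per g/L of substrate]
        k_ads = 1e-9                 # adsorption rate constant [mL/(phage h)]
        tau, burst = 0.5, 100        # latent period [h], burst size [phage/cell]

        def rhs(t, y):
            S, X, I, P = y
            mu = mu_max * S / (Ks + S)        # Monod growth of uninfected hosts
            infection = k_ads * X * P         # mass-action adsorption/infection
            return [-mu * X / Y,              # substrate consumption
                    mu * X - infection,       # uninfected hosts
                    infection - I / tau,      # infected hosts lyse after ~tau
                    burst * I / tau - infection]  # phage release minus adsorption

        y0 = [5.0, 1e8, 0.0, 1e6]    # S [g/L], X, I [cells/mL], P [PFU/mL]
        sol = solve_ivp(rhs, (0, 12), y0, rtol=1e-8)
        print(f"final titre ~ {sol.y[3, -1]:.2e} PFU/mL")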

  5. Data protection and medical product law with respect to medical ubiquitous computing applications

    Directory of Open Access Journals (Sweden)

    Skistims, Hendrik

    2011-01-01

    With respect to ubiquitous computing there is great potential for application, particularly in medicine and health care. This work deals with the legal problems which ubiquitous computing faces in these areas. At the beginning, issues with respect to data protection and professional secrecy are treated. Afterwards, the problem of the applicability of medical product law to medical ubiquitous computing applications, as well as the resulting requirements for manufacturers, operators and users, is discussed.

  6. Computer-aided modeling for efficient and innovative product-process engineering

    DEFF Research Database (Denmark)

    Heitzig, Martina

    Model-based computer-aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy and water. This trend is set to continue due to the substantial benefits computer-aided methods provide. The key prerequisite of computer-aided product-process engineering is, however, the availability of models of different types, forms and application modes. The development of the models required for the systems under investigation tends to be challenging, time-consuming and therefore cost...... Case studies in chemical and biochemical engineering have been solved to illustrate the application of the generic modelling methodology, the computer-aided modelling framework and the developed software tool.

  7. Computer Aided Process Planning for Non-Axisymmetric Deep Drawing Products

    Science.gov (United States)

    Park, Dong Hwan; Yarlagadda, Prasad K. D. V.

    2004-06-01

    In general, deep drawing products have various cross-section shapes such as cylindrical, rectangular and non-axisymmetric shapes. The application of surface area calculation to the non-axisymmetric deep drawing process has not been published yet. In this research, a surface area calculation for non-axisymmetric deep drawing products with elliptical shape was constructed for the design of the blank shape of deep drawing products by using an AutoLISP function of the AutoCAD software. A computer-aided process planning (CAPP) system for rotationally symmetric deep drawing products has been developed previously; however, the application of that system to non-axisymmetric components has not been reported. Thus, a CAPP system for non-axisymmetric deep drawing products with elliptical shape was constructed by using process sequence design. The system developed in this work consists of four modules. The first is a shape recognition module to recognize non-axisymmetric products. The second is a three-dimensional (3-D) modeling module to calculate the surface area of non-axisymmetric products. The third is a blank design module to create an oval-shaped blank with an identical surface area. The fourth is a process planning module based on production rules, which play the most important role in an expert system for manufacturing. The production rules are generated and upgraded by interviewing field engineers. In particular, the drawing coefficient and the punch and die radii for elliptical shape products are considered as the main design parameters. The suitability of this system was verified by applying it to a real deep drawing product. The CAPP system constructed here should be very useful for reducing manufacturing lead time and improving product accuracy.

  8. User's manual for computer code RIBD-II, a fission product inventory code

    International Nuclear Information System (INIS)

    Marr, D.R.

    1975-01-01

    The computer code RIBD-II is used to calculate inventories, activities, decay powers, and energy releases for the fission products generated in a fuel irradiation. Changes from the earlier RIBD code are: the expansion to include up to 850 fission product isotopes, input in the user-oriented NAMELIST format, and run-time choice of fuels from an extensively enlarged library of nuclear data. The library that is included in the code package contains yield data for 818 fission product isotopes for each of fourteen different fissionable isotopes, together with fission product transmutation cross sections for fast and thermal systems. Calculational algorithms are little changed from those in RIBD. (U.S.)

  9. Impact of office productivity cloud computing on energy consumption and greenhouse gas emissions.

    Science.gov (United States)

    Williams, Daniel R; Tang, Yinshan

    2013-05-07

    Cloud computing is usually regarded as being energy efficient and thus emitting less greenhouse gases (GHG) than traditional forms of computing. When the energy consumption of Microsoft's cloud-based Office 365 (O365) and traditional Office 2010 (O2010) software suites was tested and modeled, some cloud services were found to consume more energy than the traditional form. The model developed in this research took into consideration the energy consumption at the three main stages of data transmission: data center, network, and end-user device. Comparable products from each suite were selected and activities were defined for each product to represent a different computing type. Microsoft provided highly confidential data for the data center stage, while the networking and user device stages were measured directly. A new measurement and software apportionment approach was defined and utilized, allowing the power consumption of cloud services to be directly measured at the user device stage. Results indicated that cloud computing is more energy efficient for Excel and Outlook, which consumed less energy and emitted less GHG than their standalone counterparts. The power consumption of the cloud-based Outlook (8%) and Excel (17%) was lower than that of their traditional counterparts. However, the power consumption of the cloud version of Word was 17% higher than its traditional equivalent. A third, mixed access method was also measured for Word, which emitted 5% more GHG than the traditional version. It is evident that cloud computing may not provide a unified way forward to reduce energy consumption and GHG. Direct conversion from the standalone package to the cloud provision platform can now consider energy and GHG emissions at the software development and cloud service design stage using the methods described in this research.

  10. Final Report on XStack: Software Synthesis for High Productivity ExaScale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Solar-Lezama, Armando [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States). Computer Science and Artificial Intelligence Lab.

    2016-07-12

    The goal of the project was to develop a programming model that would significantly improve productivity in the high-performance computing domain by bringing together three components: a) Automated equivalence checking, b) Sketch-based program synthesis, and c) Autotuning. The report provides an executive summary of the research accomplished through this project. At the end of the report is appended a paper that describes in more detail the key technical accomplishments from this project, and which was published in SC 2014.

  11. Computational Methods to Assess the Production Potential of Bio-Based Chemicals.

    Science.gov (United States)

    Campodonico, Miguel A; Sukumara, Sumesh; Feist, Adam M; Herrgård, Markus J

    2018-01-01

    Elevated costs and long implementation times of bio-based processes for producing chemicals represent a bottleneck for moving to a bio-based economy. A prospective analysis able to elucidate economically and technically feasible product targets at early research phases is mandatory. Computational tools can be implemented to explore the biological and technical spectrum of feasibility, while constraining the operational space for desired chemicals. In this chapter, two different computational tools for assessing the potential for bio-based production of chemicals from different perspectives are described in detail. The first tool is GEM-Path: an algorithm to compute all structurally possible pathways from one target molecule to the host metabolome. The second tool is a framework for Modeling Sustainable Industrial Chemicals production (MuSIC), which integrates modeling approaches for cellular metabolism, bioreactor design, upstream/downstream processes, and economic impact assessment. Integrating GEM-Path and MuSIC will play a vital role in supporting early phases of research efforts and in guiding policy decisions as we progress toward planning a sustainable chemical industry.

  12. Comparing ProFile Vortex to ProTaper Next for the efficacy of removal of root filling material: An ex vivo micro-computed tomography study

    Directory of Open Access Journals (Sweden)

    Emad AlShwaimi

    2018-01-01

    Conclusion: Our findings suggest that PV is as effective as PTN for removal of root canal filling material. Therefore, PV can be considered for use in endodontic retreatment, although more effective files or techniques are still required.

  13. Simulation of Rn-222 decay products concentration deposited on a filter. Description of radon1.pas computer program

    International Nuclear Information System (INIS)

    Machaj, B.

    1996-01-01

    A computer program is presented that simulates the activity distribution over time of Rn-222 short-lived decay products deposited on a filter, for any degree of radioactive equilibrium of the decay products. Deposition of the decay products is simulated by summing discrete samples every 1/10 min over sampling times from 1 to 10 min. The concentration (activity) of the decay products is computed at one-minute intervals in the range 1-100 min. The alpha concentration and the total activity of the Po-218 + Po-214 produced are likewise computed in the range 1-100 min. (author). 10 refs, 4 figs
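
    The buildup the program simulates can be sketched numerically: integrate the Po-218 -> Pb-214 -> Bi-214 chain on the filter with the same 1/10-min step, with deposition active only during sampling. The deposition rates below are invented placeholders; the original program derives them from the assumed equilibrium degree.

    import numpy as np

    # Decay constants from the progeny half-lives (minutes).
    HALF_LIFE_MIN = {"Po218": 3.05, "Pb214": 26.8, "Bi214": 19.9}
    LAM = {k: np.log(2) / t for k, t in HALF_LIFE_MIN.items()}

    def simulate(sampling_min=10.0, total_min=100.0, dt=0.1,
                 deposition=(1.0, 1.0, 1.0)):  # atoms/min per nuclide (assumed)
        """Euler integration of progeny atoms on the filter; dt = 1/10 min."""
        n = {"Po218": 0.0, "Pb214": 0.0, "Bi214": 0.0}
        history = []
        for step in range(int(total_min / dt)):
            t = step * dt
            dep = deposition if t < sampling_min else (0.0, 0.0, 0.0)
            d_po = dep[0] - LAM["Po218"] * n["Po218"]
            d_pb = dep[1] + LAM["Po218"] * n["Po218"] - LAM["Pb214"] * n["Pb214"]
            d_bi = dep[2] + LAM["Pb214"] * n["Pb214"] - LAM["Bi214"] * n["Bi214"]
            n["Po218"] += d_po * dt
            n["Pb214"] += d_pb * dt
            n["Bi214"] += d_bi * dt
            history.append((t, sum(LAM[k] * n[k] for k in n)))  # total activity
        return history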

  14. Methods of Computational Intelligence in the Context of Quality Assurance in Foundry Products

    Directory of Open Access Journals (Sweden)

    Rojek G.

    2016-06-01

    One way to ensure the required technical characteristics of castings is strict control of the production parameters affecting the quality of the finished products. If the production process is improperly configured, the resulting defects in castings lead to huge losses. From the point of view of economics, it is therefore advisable to use methods of computational intelligence for quality assurance and for adjusting the parameters of future production. At the same time, the development of knowledge in the field of metallurgy, aimed at raising the technical level and efficiency of the manufacture of foundry products, should be accompanied by the development of information systems that support production processes, in order to improve their effectiveness and their compliance with the increasingly stringent requirements of ergonomics, occupational safety, environmental protection and quality. This article presents artificial intelligence methods used in practical applications related to quality assurance. Control of the production process involves the use of tools such as the induction of decision trees, fuzzy logic, rough set theory, artificial neural networks and case-based reasoning.

  15. Do ergonomics improvements increase computer workers' productivity?: an intervention study in a call centre.

    Science.gov (United States)

    Smith, Michael J; Bayehi, Antoinette Derjani

    2003-01-15

    This paper examines whether improving physical ergonomic working conditions affects worker productivity in a call centre with computer-intensive work. A field study was conducted at a catalogue retail service organization to explore the impact of ergonomics improvements on worker productivity. There were three levels of ergonomics intervention, each adding incrementally to the previous one. The first level was ergonomics training for all computer users, accompanied by workstation ergonomics analysis leading to specific customized adjustments to better fit each worker (Group C). The second level added specific workstation accessories to improve the worker's fit if the ergonomics analysis indicated a need for them (Group B). The third level met Group B requirements plus an improved chair (Group A). Productivity data were gathered from 72 volunteer participants who received ergonomics improvements to their workstations and 370 control subjects working in the same departments. Daily company records of production outputs for each worker were taken before the ergonomics intervention (baseline) and 12 months after it. Productivity improvement from baseline to 12 months post-intervention was examined across all ergonomics conditions combined, and also compared to the control group. The findings showed that worker performance increased for 50% of the ergonomics improvement participants and decreased for 50%. Overall, there was a 4.87% output increase for the ergonomics improvement group, compared with a 3.46% output decrease for the control group. The level of productivity increase varied with the type of ergonomics improvement, with Group C showing the best improvement (9.43%). Even though average production improved, caution must be used in interpreting the findings, since the ergonomics interventions were not successful for one half of the participants.

  16. ACONC Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — ACONC files containing simulated ozone and PM2.5 fields that were used to create the model difference plots shown in the journal article. This dataset is associated...

  17. XML Files

    Science.gov (United States)

    MedlinePlus (https://medlineplus.gov/xml.html) produces XML data sets that you are welcome to download ...

  18. 831 Files

    Data.gov (United States)

    Social Security Administration — SSA-831 file is a collection of initial and reconsideration adjudicative level DDS disability determinations. (A few hearing level cases are also present, but the...

  19. A computer code PACTOLE to predict activation and transport of corrosion products in a PWR

    International Nuclear Information System (INIS)

    Beslu, P.; Frejaville, G.; Lalet, A.

    1978-01-01

    Theoretical studies on activation and transport of corrosion products in a PWR primary circuit have been concentrated, at CEA, on the development of a computer code: PACTOLE. This code takes into account the major phenomena which govern corrosion product transport: 1. Ion solubility is obtained from the usual thermodynamic laws as a function of water chemistry; pH at operating temperature is calculated by the code. 2. Release rates of base metals, dissolution rates of deposits, and precipitation rates of soluble products are derived from solubility variations. 3. Deposition of solid particles is treated by a model taking into account particle size, Brownian and turbulent diffusion, and inertial effects. Erosion of deposits is accounted for by a semi-empirical model. After a review of the calculational models, an application of PACTOLE is presented with a view to analyzing the distribution of corrosion products in the core. (author)

  20. A vector-product information retrieval system adapted to heterogeneous, distributed computing environments

    Science.gov (United States)

    Rorvig, Mark E.

    1991-01-01

    Vector-product information retrieval (IR) systems produce retrieval results superior to all other searching methods but presently have no commercial implementations beyond the personal computer environment. The NASA Electronic Library System (NELS) provides a ranked list of the most likely relevant objects in collections in response to a natural language query. Additionally, the system is constructed using standards and tools (Unix, X-Windows, Motif, and TCP/IP) that permit its operation in organizations that possess many different hosts, workstations, and platforms. There are no known commercial equivalents to this product at this time. The product has applications in all corporate management environments, particularly those that are information intensive, such as finance, manufacturing, biotechnology, and research and development.
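
    As an illustration of the underlying ranking principle (not NASA's code), the sketch below turns documents and a query into term-frequency vectors and ranks documents by their normalized inner product (cosine) with the query.

    import math
    from collections import Counter

    def vectorize(text):
        """Bag-of-words term-frequency vector."""
        return Counter(text.lower().split())

    def cosine(a, b):
        """Normalized inner product of two sparse term vectors."""
        dot = sum(a[t] * b[t] for t in set(a) & set(b))
        norm = math.sqrt(sum(v * v for v in a.values())) * \
               math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    def rank(query, docs):
        """Return documents ordered from most to least likely relevant."""
        q = vectorize(query)
        return sorted(((cosine(q, vectorize(d)), d) for d in docs),
                      key=lambda pair: pair[0], reverse=True)

    print(rank("parallel file system", ["a parallel file system for clusters",
                                        "colour calibration of berries"]))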

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and events that are harder to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functions. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference, where a large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  2. Using Computer Simulation Method to Improve Throughput of Production Systems by Buffers and Workers Allocation

    Directory of Open Access Journals (Sweden)

    Kłos Sławomir

    2015-12-01

    This paper proposes the application of computer simulation methods to support decision making regarding intermediate buffer allocation in a series-parallel production line. The simulation model of the production system is based on a real example of a manufacturing company working in the automotive industry. Simulation experiments were conducted for different allocations of buffer capacities and different numbers of employees. The production system consists of three technological operations with intermediate buffers between each operation. The technological operations are carried out using machines, and every machine can be operated by one worker. Multi-machine work is possible (one operator can operate several machines). On the basis of the simulation experiments, the relationship between system throughput, buffer allocation and the number of employees is analyzed. Increasing the buffer capacity results in an increase in the average product life span. Therefore, a new index is proposed in the article that combines the throughput of the manufacturing system with product life span. Simulation experiments were performed for different configurations of technological operations.
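
    As an illustration of the kind of experiment described, the sketch below runs a crude discrete-time simulation of a three-operation line with two intermediate buffers; the completion probabilities and buffer capacities are invented, and the study itself used a commercial simulation model.

    import random

    def simulate_line(buffer_caps=(5, 5), steps=10_000, p_done=(0.9, 0.8, 0.85)):
        """Each time step, operation i finishes a part with probability
        p_done[i], provided it can take from upstream and push downstream."""
        buffers = [0, 0]   # contents of the two intermediate buffers
        finished = 0
        for _ in range(steps):
            # Work from the last operation backwards so space frees up first.
            if random.random() < p_done[2] and buffers[1] > 0:
                buffers[1] -= 1
                finished += 1
            if random.random() < p_done[1] and buffers[0] > 0 and buffers[1] < buffer_caps[1]:
                buffers[0] -= 1
                buffers[1] += 1
            if random.random() < p_done[0] and buffers[0] < buffer_caps[0]:
                buffers[0] += 1
        return finished / steps   # throughput in parts per time step

    # Larger buffers decouple the operations and raise throughput.
    print(simulate_line((1, 1)), simulate_line((10, 10)))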

  3. Efficacy of Twisted File Adaptive, Reciproc and ProTaper Universal Retreatment instruments for root-canal-filling removal: A cone-beam computed tomography study.

    Science.gov (United States)

    Akbulut, Makbule Bilge; Akman, Melek; Terlemez, Arslan; Magat, Guldane; Sener, Sevgi; Shetty, Heeresh

    2016-01-01

    The aim of this study was to evaluate the efficacy of Twisted File (TF) Adaptive, Reciproc, and ProTaper Universal Retreatment (UR) instruments for removing root canal filling. Sixty single-rooted teeth were decoronated, instrumented and obturated. Preoperative CBCT scans were taken and the teeth were retreated with TF Adaptive, Reciproc, ProTaper UR, or hand files (n=15 each). The teeth were then rescanned, and the percentage volume of residual root-canal-filling material was established. The total time for retreatment was recorded, and the data were statistically analyzed. The statistical ranking of residual filling material volume was as follows: hand file = TF Adaptive > ProTaper UR = Reciproc. The ProTaper UR and Reciproc systems required less time for retreatment. Root canal filling was removed more efficiently by Reciproc and ProTaper UR instruments than by TF Adaptive instruments and hand files. The TF Adaptive system was advantageous over hand files with regard to operating time.

  4. Le resume de texte: une activite de production en langue etrangere assistee par ordinateur (Abstracting: A Computer-Assisted Foreign Language Production Activity).

    Science.gov (United States)

    Janitza, Jean

    1985-01-01

    This article describes a productive language exercise that uses the computer and a variation on the cloze procedure, deleting every third word and requiring the student to insert a grammatically and thematically acceptable term. (MSE)

  5. Monitoring dose-length product in computed tomography of the chest considering sex and body weight

    International Nuclear Information System (INIS)

    Inoue, Yusuke; Nagahara, Kazunori; Hayakawa, Naomichi; Hanawa, Hironori; Hata, Hirofumi

    2016-01-01

    Dose-length product (DLP) is widely used as an indicator of radiation dose in computed tomography. The aim of this study was to investigate the significance of sex and body weight in DLP-based monitoring of the radiation dose. Eight hundred computed tomography examinations of the chest, performed using four different scanners, were analysed. The DLP was compared with body weight by linear regression in men and women separately. The DLP was positively correlated with body weight, and its dependence on sex and weight differed among scanners. Standard DLP values adjusted for sex and weight facilitated inter-scanner comparison of the radiation dose and of its dependence on sex and weight. Adjusting the DLP for sex and weight made it possible to identify examinations with possibly excessive doses independently of weight. Monitoring the DLP in relation to sex and body weight appears to aid detailed comparison of the radiation dose among imaging protocols and scanners, as well as daily observation to find unexpected variance. (authors)
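
    The per-sex regression the study performs can be sketched in a few lines; the weights and DLP values below are fabricated placeholders, not study data.

    import numpy as np

    def fit_dlp_vs_weight(weights_kg, dlp_mgy_cm):
        """Least-squares fit DLP = a * weight + b; returns (a, b)."""
        a, b = np.polyfit(weights_kg, dlp_mgy_cm, deg=1)
        return a, b

    # Separate fits for men and women, as in the study design.
    men_w, men_dlp = [60, 70, 80, 90], [300, 340, 390, 430]
    women_w, women_dlp = [50, 60, 70, 80], [260, 300, 350, 380]
    print(fit_dlp_vs_weight(men_w, men_dlp))      # slope, intercept for men
    print(fit_dlp_vs_weight(women_w, women_dlp))  # slope, intercept for women

    # A sex- and weight-adjusted "standard DLP" then flags examinations whose
    # measured DLP exceeds the predicted value by a chosen tolerance.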

  6. Computer aided production of manufacturing CAMAC-wired boards by the multiwire-technique

    Energy Technology Data Exchange (ETDEWEB)

    Martini, M; Brehmer, W

    1975-10-01

    The multiwire technique is a computer-controlled wiring method for manufacturing circuit boards with insulated conductors. The technical data for production are dimensional drawings of the board and a list of all points to be connected, given in absolute co-ordinates, together with a list of all soldering points for component parts and a reproducible print pattern for inscription. For this wiring method, a CAMAC standard board, a layout plan with alphanumeric symbols, and a computer program which produces the essential technical data were developed. A description of the alphanumeric symbols, the program, the recognition and checking of these symbols, and the technical data produced is presented. (auth)

  7. Initial Flight Test of the Production Support Flight Control Computers at NASA Dryden Flight Research Center

    Science.gov (United States)

    Carter, John; Stephenson, Mark

    1999-01-01

    The NASA Dryden Flight Research Center has completed the initial flight test of a modified set of F/A-18 flight control computers that gives the aircraft a research control law capability. The production support flight control computers (PSFCC) provide an increased capability for flight research in the control law, handling qualities, and flight systems areas. The PSFCC feature a research flight control processor that is "piggybacked" onto the baseline F/A-18 flight control system. This research processor allows for pilot selection of research control law operation in flight. To validate flight operation, a replication of a standard F/A-18 control law was programmed into the research processor and flight-tested over a limited envelope. This paper provides a brief description of the system, summarizes the initial flight test of the PSFCC, and describes future experiments for the PSFCC.

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team has successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  9. Implementation of a Thermodynamic Solver within a Computer Program for Calculating Fission-Product Release Fractions

    Science.gov (United States)

    Barber, Duncan Henry

    During some postulated accidents at nuclear power stations, fuel cooling may be impaired. In such cases, the fuel heats up, and the subsequent increased fission-gas release from the fuel to the gap may result in fuel sheath failure. After fuel sheath failure, the barrier between the coolant and the fuel pellets is lost or impaired: gases and vapours from the fuel-to-sheath gap and other open voids in the fuel pellets can be vented, and gases and steam from the coolant can enter the broken fuel sheath and interact with the fuel pellet surfaces and the fission-product inclusions on the fuel surface (including material at the surface of the fuel matrix). The chemistry of this interaction is an important mechanism to model in order to assess fission-product releases from fuel. Starting in 1995, the computer program SOURCE 2.0 was developed by the Canadian nuclear industry to model fission-product release from fuel during such accidents. SOURCE 2.0 employed an early thermochemical model of irradiated uranium dioxide fuel developed at the Royal Military College of Canada. To overcome the limitations of computers of that time, the implementation of the RMC model employed lookup tables of pre-calculated equilibrium conditions. In the intervening years, the RMC model has been improved, the power of computers has increased significantly, and thermodynamic subroutine libraries have become available. This thesis is the result of extensive work based on these three factors. A prototype computer program (referred to as SC11) has been developed that uses a thermodynamic subroutine library to calculate thermodynamic equilibria by Gibbs energy minimization. The Gibbs energy minimization requires the system temperature (T) and pressure (P), and the inventory of chemical elements (n) in the system. In order to calculate the inventory of chemical elements in the fuel, the list of nuclides and nuclear isomers modelled in SC11 had to be expanded from the list used by SOURCE 2.0. A
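
    The core calculation such a program delegates to a thermodynamic library can be illustrated on a toy system: minimize the total Gibbs energy at fixed T, P and element inventory n, subject to element balance. The species set, standard potentials and ideal-gas assumption below are illustrative, not the thesis's fuel chemistry.

    import numpy as np
    from scipy.optimize import minimize

    R = 8.314  # J/(mol K)

    def gibbs(x, mu0, T, P):
        """Total Gibbs energy (J) of an ideal-gas mixture with mole numbers x."""
        x = np.maximum(x, 1e-12)           # keep the logarithms finite
        return np.sum(x * (mu0 + R * T * np.log(P * x / x.sum())))

    # Toy system: H2 + 0.5 O2 <-> H2O, with element balance on H and O.
    mu0 = np.array([0.0, 0.0, -228_600.0])  # standard mu (approx., J/mol; assumed)
    A = np.array([[2, 0, 2],                # H atoms per species (H2, O2, H2O)
                  [0, 2, 1]])               # O atoms per species
    b = np.array([2.0, 1.0])                # element inventory n (mol H, mol O)

    res = minimize(gibbs, x0=np.array([0.5, 0.25, 0.5]), args=(mu0, 1000.0, 1.0),
                   constraints={"type": "eq", "fun": lambda x: A @ x - b},
                   bounds=[(0, None)] * 3)
    print(res.x)  # equilibrium mole numbers of H2, O2, H2O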

  10. 77 FR 77093 - Certain Computer Forensic Devices and Products Containing Same; Commission Determination Not To...

    Science.gov (United States)

    2012-12-31

    ... ALJ. 19 CFR 210.42(d); see also Certain Video Game Systems and Wireless Controllers and Components... the economic prong of the domestic industry requirement. No petitions for review of the ID were filed...

  11. Computer analyses for the design, operation and safety of new isotope production reactors: A technology status review

    International Nuclear Information System (INIS)

    Wulff, W.

    1990-01-01

    A review is presented of the currently available technologies for nuclear reactor analyses by computer. An important distinction is made between traditional computer calculation and advanced computer simulation. Simulation needs are defined to support the design, operation, maintenance and safety of isotope production reactors. Existing methods of computer analysis are categorized according to the type of computer involved in their execution: micro, mini, mainframe and supercomputers. Both general- and special-purpose computers are discussed. Major computer codes are described, with regard to their use in analyzing isotope production reactors. This review determined that conventional systems codes (TRAC, RELAP5, RETRAN, etc.) cannot meet four essential conditions for viable reactor simulation: simulation fidelity, on-line interactive operation with convenient graphics, high simulation speed, and low cost. These conditions can be met by special-purpose computers (such as the AD100 of ADI), which are specifically designed for high-speed simulation of complex systems. The greatest shortcoming of the existing systems codes (TRAC, RELAP5) is the mismatch between their very high computational effort and their low simulation fidelity. The drift-flux formulation (HIPA) is the viable alternative to the complicated two-fluid model. No existing computer code can accommodate all the important processes in the core geometry of isotope production reactors. Experiments (heat transfer measurements) are needed to provide the necessary correlations. It is important for the nuclear community, in government, industry and universities alike, to begin to take advantage of modern simulation technologies and equipment. 41 refs

  12. Computational Methods to Assess the Production Potential of Bio-Based Chemicals

    DEFF Research Database (Denmark)

    Campodonico, Miguel A; Sukumara, Sumesh; Feist, Adam M.

    2018-01-01

    are described in detail. The first tool is GEM-Path: an algorithm to compute all structurally possible pathways from one target molecule to the host metabolome. The second tool is a framework for Modeling Sustainable Industrial Chemicals production (MuSIC), which integrates modeling approaches for cellular...... metabolism, bioreactor design, upstream/downstream processes, and economic impact assessment. Integrating GEM-Path and MuSIC will play a vital role in supporting early phases of research efforts and guide the policy makers with decisions, as we progress toward planning a sustainable chemical industry....

  13. Impacts of mobile tablet computing on provider productivity, communications, and the process of care.

    Science.gov (United States)

    Schooley, Benjamin; Walczak, Steven; Hikmet, Neset; Patel, Nitin

    2016-04-01

    Health information technology investments continue to increase while the value derived from their implementation and use is mixed. Mobile device adoption into practice is a recent trend that has increased dramatically, and formal studies are needed to investigate the consequent benefits and challenges. The objective of this study is to evaluate practitioner perceptions of improvements in productivity, provider-patient communications, care provision, technology usability and other outcomes following the adoption and use of a tablet computer connected to electronic health information resources. A pilot program was initiated in June 2013 to evaluate the effect of mobile tablet computers at one health provider organization in the southeast United States. Providers were asked to volunteer for the evaluation and were each given a mobile tablet computer. A total of 42 inpatient and outpatient providers were interviewed in 2015 using a survey-style questionnaire with yes/no, Likert-style, and open-ended questions. Each had previously used an electronic health record (EHR) system for a minimum of one year outside of residency, and all were regular users of personal mobile devices. Each used a mobile tablet computer in the context of their practice, connected to the health system EHR. The survey results indicate that more than half of providers perceive the use of the tablet device as having a positive effect on patient communications, patient education, the patient's perception of the provider, time spent interacting with patients, provider productivity, process of care, satisfaction with the EHR when used together with the device, and care provision. Providers also reported feeling comfortable using the device (82.9%), would recommend the device to colleagues (69.2%), did not experience increased information security and privacy concerns (95%), and noted significant reductions in EHR login times (64.1%). Less than 25% of participants reported negative impacts on any of these areas as

  14. The global unified parallel file system (GUPFS) project: FY 2002 activities and results

    Energy Technology Data Exchange (ETDEWEB)

    Butler, Gregory F.; Lee, Rei Chi; Welcome, Michael L.

    2003-04-07

    The Global Unified Parallel File System (GUPFS) project is a multiple-phase, five-year project at the National Energy Research Scientific Computing (NERSC) Center to provide a scalable, high performance, high bandwidth, shared file system for all the NERSC production computing and support systems. The primary purpose of the GUPFS project is to make it easier to conduct advanced scientific research using the NERSC systems. This is to be accomplished through the use of a shared file system providing a unified file namespace, operating on consolidated shared storage that is directly accessed by all the NERSC production computing and support systems. During its first year, FY 2002, the GUPFS project focused on identifying, testing, and evaluating existing and emerging shared/cluster file system, SAN fabric, and storage technologies; identifying NERSC user input/output (I/O) requirements, methods, and mechanisms; and developing appropriate benchmarking methodologies and benchmark codes for a parallel environment. This report presents the activities and progress of the GUPFS project during its first year, the results of the evaluations conducted, and plans for near-term and longer-term investigations.

  15. Bridging computational approaches to speech production: The semantic–lexical–auditory–motor model (SLAM)

    Science.gov (United States)

    Hickok, Gregory

    2017-01-01

    Speech production is studied from both psycholinguistic and motor-control perspectives, with little interaction between the approaches. We assessed the explanatory value of integrating psycholinguistic and motor-control concepts for theories of speech production. By augmenting a popular psycholinguistic model of lexical retrieval with a motor-control-inspired architecture, we created a new computational model to explain speech errors in the context of aphasia. Comparing the model fits to picture-naming data from 255 aphasic patients, we found that our new model improves fits for a theoretically predictable subtype of aphasia: conduction. We discovered that the improved fits for this group were a result of strong auditory-lexical feedback activation, combined with weaker auditory-motor feedforward activation, leading to increased competition from phonologically related neighbors during lexical selection. We discuss the implications of our findings with respect to other extant models of lexical retrieval. PMID:26223468

  16. Application of Computer Vision for quality control in frozen mixed berries production: colour calibration issues

    Directory of Open Access Journals (Sweden)

    D. Ricauda Aimonino

    2013-09-01

    Computer vision is becoming increasingly important in the quality control of many food processes. The appearance properties of food products (colour, texture, shape and size) are, in fact, correlated with organoleptic characteristics and/or the presence of defects. Quality control based on image processing eliminates the subjectivity of human visual inspection, allowing rapid and non-destructive analysis. However, most food matrices show a wide variability in appearance features, so robust, customized image elaboration algorithms have to be implemented for each specific product. For this reason, quality control by visual inspection is still rather widespread in several food processes. The case study inspiring this paper concerns the production of frozen mixed berries. Once frozen, different kinds of berries are mixed together, in different amounts, according to a recipe. The correct quantity of each kind of fruit, within a certain tolerance, has to be ensured by producers. Quality control relies on taking a few samples from each production lot (samples of the same weight) and manually counting the amount of each species. This operation is tedious, error-prone and time-consuming, whereas a computer vision system (CVS) could determine the amount of each kind of berry in a few seconds. This paper discusses the problem of colour calibration of the CVS used for evaluating the frozen berry mixture. Images are acquired by a digital camera coupled with a dome lighting system, which gives homogeneous illumination over the entire visible surface of the berries, and by a flat-bed scanner. RGB device-dependent data are then mapped onto the CIELab colorimetric colour space using different transformation operators. The results show that the proposed calibration procedure leads to colour discrepancies comparable to or even below the sensitivity of the human eye.
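
    For reference, one device-RGB to CIELab mapping of the kind the calibration procedure evaluates can be sketched with the standard sRGB/D65 transform standing in for a camera-specific calibrated operator; a real calibration would replace the fixed matrix with one fitted to a colour chart.

    import numpy as np

    M = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])   # sRGB -> XYZ (D65)
    WHITE = np.array([0.95047, 1.0, 1.08883])  # D65 reference white

    def srgb_to_lab(rgb):
        """Map an 8-bit sRGB triple to CIELab (L*, a*, b*)."""
        rgb = np.asarray(rgb, dtype=float) / 255.0
        # Undo the sRGB gamma to get linear RGB.
        lin = np.where(rgb > 0.04045, ((rgb + 0.055) / 1.055) ** 2.4, rgb / 12.92)
        xyz = (M @ lin) / WHITE
        f = np.where(xyz > 0.008856, np.cbrt(xyz), 7.787 * xyz + 16.0 / 116.0)
        L = 116.0 * f[1] - 16.0
        a = 500.0 * (f[0] - f[1])
        b = 200.0 * (f[1] - f[2])
        return L, a, b

    print(srgb_to_lab([120, 30, 60]))  # Lab of a berry-like red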

  17. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    A computer forensic analyst is a person in charge of investigation and evidence tracking. In certain cases, a file that needs to be presented as digital evidence has been deleted. Such a file is difficult to reconstruct, because it has often lost its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose a Longest Common Subsequence (LCS) method consisting of three steps, namely training, testing and validation, to identify the file type of file fragments. From all testing results we can conclude that our proposed method works well, achieving 92.91% accuracy in identifying the file type of file fragments for three data types.
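
    The LCS similarity at the heart of such a method is easy to sketch: a fragment is assigned the file type whose training signature shares the longest common subsequence with it. The byte signatures below are invented stand-ins for the trained models in the paper.

    def lcs_length(a: bytes, b: bytes) -> int:
        """Classic O(len(a)*len(b)) dynamic-programming LCS length."""
        prev = [0] * (len(b) + 1)
        for x in a:
            cur = [0]
            for j, y in enumerate(b, start=1):
                cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[j - 1]))
            prev = cur
        return prev[-1]

    def classify(fragment: bytes, signatures: dict) -> str:
        """Return the file type whose signature best matches the fragment."""
        return max(signatures, key=lambda t: lcs_length(fragment, signatures[t]))

    # Hypothetical per-type signatures (not the paper's training data).
    signatures = {"jpg": b"\xff\xd8\xff\xe0JFIF", "pdf": b"%PDF-1.4", "png": b"\x89PNG\r\n"}
    print(classify(b"%PDF-1.7 stream", signatures))  # -> "pdf"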

  18. Production of proteinase A by Saccharomyces cerevisiae in a cell-recycling fermentation system: Experiments and computer simulations

    DEFF Research Database (Denmark)

    Grøn, S.; Biedermann, K.; Emborg, Claus

    1996-01-01

    experimentally and by computer simulations. Experiments and simulations showed that cell mass and product concentration were enhanced by high ratios of recycling. Additional simulations showed that the proteinase A concentration decreased drastically at high dilution rates and the optimal volumetric...... productivities were at high dilution rates just below washout and at high ratios of recycling. Cell-recycling fermentation gave much higher volumetric productivities and stable product concentrations in contrast to simple continuous fermentation....
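
    The kind of model behind such simulations can be sketched as a chemostat with cell recycle: a separator returns a concentrated cell stream, so the effective cell washout rate falls below the dilution rate. The Monod kinetics and every parameter value below are assumptions for illustration, not the paper's fitted model.

    # Kinetic constants: 1/h, g/L, gX/gS, product per biomass (all assumed).
    MU_MAX, KS, YXS, ALPHA = 0.3, 1.0, 0.5, 2.0

    def simulate(D=0.2, R=0.5, C=2.0, S_in=20.0, hours=200.0, dt=0.01):
        """Chemostat with recycle ratio R and separator concentration factor C.
        Cell balance: dX/dt = mu*X - D*(1 + R - R*C)*X (recycle retains cells)."""
        X, S, P = 0.1, S_in, 0.0   # biomass, substrate, product concentrations
        for _ in range(int(hours / dt)):
            mu = MU_MAX * S / (KS + S)
            dX = mu * X - D * (1 + R - R * C) * X
            dS = D * (S_in - S) - mu * X / YXS
            dP = ALPHA * mu * X - D * P
            X, S, P = X + dX * dt, max(S + dS * dt, 0.0), P + dP * dt
        return X, S, P

    print(simulate(R=0.0))   # simple continuous fermentation
    print(simulate(R=0.5))   # recycling raises biomass and product levels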

  19. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations, and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs

  20. An Instructional Design Model for Developing a Computer Curriculum To Increase Employee Productivity in a Pharmaceutical Company.

    Science.gov (United States)

    Stumpf, Mark R.

    This report presents an instructional design model that was developed for use by the End-Users Computing department of a large pharmaceutical company in developing effective--but not lengthy--microcomputer training seminars to train office workers and executives in the proper use of computers and thus increase their productivity. The 14 steps of…

  1. 75 FR 8400 - In the Matter of Certain Notebook Computer Products and Components Thereof; Notice of Investigation

    Science.gov (United States)

    2010-02-24

    ... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-705] In the Matter of Certain Notebook Computer... United States after importation of certain notebook computer products and components thereof by reason of... an industry in the United States exists as required by subsection (a)(2) of section 337. The...

  2. Computational model for a high temperature electrolyzer coupled to a HTTR for efficient nuclear hydrogen production

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Daniel; Rojas, Leorlen; Rosales, Jesus; Castro, Landy; Gamez, Abel; Brayner, Carlos, E-mail: danielgonro@gmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Garcia, Lazaro; Garcia, Carlos; Torre, Raciel de la, E-mail: lgarcia@instec.cu [Instituto Superior de Tecnologias y Ciencias Aplicadas (InSTEC), La Habana (Cuba); Sanchez, Danny [Universidade Estadual de Santa Cruz (UESC), Ilheus, BA (Brazil)

    2015-07-01

    The high temperature electrolysis process coupled to a very high temperature reactor (VHTR) is one of the most promising methods for hydrogen production using a nuclear reactor as the primary heat source. However, the scientific literature contains no reference to a test facility that would allow evaluation of the efficiency of the process and of the other physical parameters that have to be taken into consideration for its accurate application in the hydrogen economy as a mass production method. Given this lack of experimental facilities, mathematical models are among the most used tools to study this process and its flowsheets, in which the electrolyzer is the most important component because of its complexity and importance in the process. A computational fluid dynamics (CFD) model for the evaluation and optimization of the electrolyzer of a high temperature electrolysis hydrogen production flowsheet was developed using ANSYS FLUENT®. The electrolyzer's operational and design parameters will be optimized in order to obtain the maximum hydrogen production and the highest efficiency of the module. This optimized model of the electrolyzer will be incorporated into a chemical process simulation (CPS) code to study the overall high temperature flowsheet coupled to a high temperature accelerator driven system (ADS), which offers advantages in the transmutation of spent fuel. (author)

  3. The secondary metabolite bioinformatics portal: Computational tools to facilitate synthetic biology of secondary metabolite production

    Directory of Open Access Journals (Sweden)

    Tilmann Weber

    2016-06-01

    Natural products are among the most important sources of lead molecules for drug discovery. With the development of affordable whole-genome sequencing technologies and other 'omics tools, the field of natural products research is currently undergoing a shift in paradigms. While, for decades, mainly analytical and chemical methods gave access to this group of compounds, nowadays genomics-based methods offer complementary approaches to find, identify and characterize such molecules. This paradigm shift has also resulted in a high demand for computational tools to assist researchers in their daily work. In this context, this review gives a summary of the tools and databases that are currently available to mine, identify and characterize natural product biosynthesis pathways and their producers based on 'omics data. A web portal called the Secondary Metabolite Bioinformatics Portal (SMBP, at http://www.secondarymetabolites.org) is introduced to provide a one-stop catalog of, and links to, these bioinformatics resources. In addition, an outlook is presented on how the existing tools, and those yet to be developed, will influence synthetic biology approaches in the natural products field.

  4. Computational model for a high temperature electrolyzer coupled to a HTTR for efficient nuclear hydrogen production

    International Nuclear Information System (INIS)

    Gonzalez, Daniel; Rojas, Leorlen; Rosales, Jesus; Castro, Landy; Gamez, Abel; Brayner, Carlos; Garcia, Lazaro; Garcia, Carlos; Torre, Raciel de la; Sanchez, Danny

    2015-01-01

    The high temperature electrolysis process coupled to a very high temperature reactor (VHTR) is one of the most promising methods for hydrogen production using a nuclear reactor as the primary heat source. However, the scientific literature contains no reference to a test facility that would allow evaluation of the efficiency of the process and of the other physical parameters that have to be taken into consideration for its accurate application in the hydrogen economy as a mass production method. Given this lack of experimental facilities, mathematical models are among the most used tools to study this process and its flowsheets, in which the electrolyzer is the most important component because of its complexity and importance in the process. A computational fluid dynamics (CFD) model for the evaluation and optimization of the electrolyzer of a high temperature electrolysis hydrogen production flowsheet was developed using ANSYS FLUENT®. The electrolyzer's operational and design parameters will be optimized in order to obtain the maximum hydrogen production and the highest efficiency of the module. This optimized model of the electrolyzer will be incorporated into a chemical process simulation (CPS) code to study the overall high temperature flowsheet coupled to a high temperature accelerator driven system (ADS), which offers advantages in the transmutation of spent fuel. (author)

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  7. A lightweight high availability strategy for Atlas LCG File Catalogs

    International Nuclear Information System (INIS)

    Martelli, Barbara; Salvo, Alessandro de; Anzellotti, Daniela; Rinaldi, Lorenzo; Cavalli, Alessandro; Pra, Stefano dal; Dell'Agnello, Luca; Gregori, Daniele; Prosperini, Andrea; Ricci, Pier Paolo; Sapunenko, Vladimir

    2010-01-01

    The LCG File Catalog (LFC) is a key component of the LHC Computing Grid middleware [1], as it contains the mapping between Logical File Names and Physical File Names on the Grid. The Atlas computing model foresees multiple local LFCs, housed at each Tier-1 and at the Tier-0, containing all information about the files stored in the regional cloud. As the local LFC contents are presently not replicated anywhere, this amounts to a dangerous single point of failure for each of the Atlas regional clouds. In order to solve this problem we propose a novel solution for high availability (HA) of Oracle-based Grid services, obtained by combining an Oracle Data Guard deployment with a series of application-level scripts. This approach has the advantage of being very easy to deploy and maintain, and represents a good candidate solution for Tier-2s, which are usually small centres with little manpower dedicated to service operations. We also present the results of a wide range of functionality and performance tests run on a test-bed with characteristics similar to those required for production. The test-bed consists of a failover deployment between the Italian LHC Tier-1 (INFN - CNAF) and an Atlas Tier-2 located at INFN - Roma1. Moreover, we explain how the proposed strategy can be deployed on the present Grid infrastructure without requiring any change to the middleware, and in a way that is totally transparent to end users and applications.

  8. Review of Well Operator Files for Hydraulically Fractured Oil and Gas Production Wells: Well Design and Construction Fact Sheet

    Science.gov (United States)

    EPA reviewed a statistically representative sample of oil and gas production wells reported by nine service companies to help understand the role of well design and construction practices in preventing pathways for subsurface fluid movement.

  9. Biodegradation of Cosmetics Products: A Computational Study of Cytochrome P450 Metabolism of Phthalates

    Directory of Open Access Journals (Sweden)

    Fabián G. Cantú Reinhard

    2017-11-01

    Cytochrome P450s are a broad class of enzymes in the human body with important functions for human health, which include the metabolism and detoxification of compounds in the liver. In their catalytic cycle, the P450s form a high-valent iron(IV)-oxo heme cation radical as the active species (called Compound I), which reacts with substrates through oxygen atom transfer. This work discusses the possible degradation mechanisms of phthalates by cytochrome P450s in the liver through computational modelling, using 2-ethylhexyl phthalate as a model substrate. Phthalates are a type of compound commonly found in the environment as a result of cosmetics usage, but their biodegradation in the liver may lead to toxic metabolites. Experimental studies have revealed a multitude of products and varying product distributions among P450 isozymes. To understand the regio- and chemoselectivity of phthalate activation by P450 isozymes, we focus here on the mechanisms of phthalate activation by Compound I leading to O-dealkylation, aliphatic hydroxylation and aromatic hydroxylation processes. We set up model complexes of Compound I with the substrate, investigated the reaction mechanisms for the products using density functional theory on the models, and performed a molecular mechanics study on enzymatic structures. The work shows that several reaction barriers in the gas phase are close in energy, leading to a mixture of products. However, when we tried to dock the substrate into a P450 isozyme, some of the channels were inaccessible due to unfavorable substrate positions. Product distributions are discussed under various reaction conditions and rationalized with valence bond and thermodynamic models.

  10. An analysis of file system and installation of the file management system for NOS operating system

    International Nuclear Information System (INIS)

    Lee, Young Jai; Park, Sun Hee; Hwang, In Ah; Kim, Hee Kyung

    1992-06-01

    In this technical report, we analyze the NOS file structure for the Cyber 170-875 and Cyber 960-31 computer systems. We also describe the functions, procedures, operation and use of VDS, which is used to manage large files effectively on the Cyber computer systems. The purpose of the VDS installation is to increase virtual disk storage by utilizing magnetic tape, to assist users of the computer system in managing their files, and to enhance the performance of the KAERI Cyber computer system. (Author)

  11. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  12. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  15. Bayesian Action–Perception Computational Model: Interaction of Production and Recognition of Cursive Letters

    Science.gov (United States)

    Gilet, Estelle; Diard, Julien; Bessière, Pierre

    2011-01-01

    In this paper, we study the collaboration of perception and action representations involved in cursive letter recognition and production. We propose a mathematical formulation for the whole perception–action loop, based on probabilistic modeling and Bayesian inference, which we call the Bayesian Action–Perception (BAP) model. Being a model of both perception and action processes, the purpose of this model is to study the interaction of these processes. More precisely, the model includes a feedback loop from motor production, which implements an internal simulation of movement. Motor knowledge can therefore be involved during perception tasks. In this paper, we formally define the BAP model and show how it solves the following six varied cognitive tasks using Bayesian inference: i) letter recognition (purely sensory), ii) writer recognition, iii) letter production (with different effectors), iv) copying of trajectories, v) copying of letters, and vi) letter recognition (with internal simulation of movements). We present computer simulations of each of these cognitive tasks, and discuss experimental predictions and theoretical developments. PMID:21674043
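
    The flavour of the model's inference can be shown with a toy posterior over letters: recognition multiplies a prior by a sensory likelihood, and task (vi) fuses in a second likelihood from internally simulated movement. All letters and probabilities below are invented for illustration; this is not the BAP model itself.

    import numpy as np

    letters = ["a", "d", "o"]
    prior = np.array([0.4, 0.2, 0.4])        # P(letter), assumed
    sensory = np.array([0.6, 0.3, 0.1])      # P(visual trace | letter), assumed
    motor_sim = np.array([0.5, 0.4, 0.1])    # P(simulated movement | letter), assumed

    # Task (i): purely sensory letter recognition.
    post = sensory * prior
    post /= post.sum()

    # Task (vi): recognition with internal simulation of movements,
    # fusing the motor likelihood into the same inference.
    post_vi = sensory * motor_sim * prior
    post_vi /= post_vi.sum()

    print(dict(zip(letters, post.round(3))))
    print(dict(zip(letters, post_vi.round(3))))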

  16. Computer simulation for improving radio frequency (RF) heating uniformity of food products: A review.

    Science.gov (United States)

    Huang, Zhi; Marra, Francesco; Subbiah, Jeyamkondan; Wang, Shaojin

    2018-04-13

    Radio frequency (RF) heating has great potential for achieving rapid and volumetric heating in foods, providing safe and high-quality food products thanks to its deep penetration depth and moisture self-balancing effects, and because it leaves no chemical residues. However, the problem of nonuniform heating (usually resulting in hot and cold spots in the heated product) needs to be resolved. An inhomogeneous temperature distribution not only affects the quality of the food but also raises food safety issues, since microorganisms or insects may not be controlled in the cold spots. Mathematical modeling of RF heating processes has recently been studied extensively for a wide variety of agricultural products. This paper presents a comprehensive review of recent progress in computer simulation for improving RF heating uniformity and of the solutions offered to reduce heating nonuniformity. It provides a brief introduction to the basic principles of RF heating technology, analyzes the applications of numerical simulation, and discusses the factors influencing RF heating uniformity and possible methods to improve it. Mathematical modeling improves the understanding of RF heating of food and is essential for optimizing RF treatment protocols for pasteurization and disinfestation applications. Recommendations for future research are proposed to further improve the accuracy of numerical models: covering both heat and mass transfer in the model, validating the models with sample movement and mixing, and identifying the important model parameters by sensitivity analysis.

  17. Enhanced active extracellular polysaccharide production from Ganoderma formosanum using computational modeling.

    Science.gov (United States)

    Hsu, Kai-Di; Wu, Shu-Pei; Lin, Shin-Ping; Lum, Chi-Chin; Cheng, Kuan-Chen

    2017-10-01

    Extracellular polysaccharide (EPS) is one of the major bioactive ingredients contributing to the health benefits of Ganoderma spp. In this study, response surface methodology was applied to determine the optimal culture conditions for EPS production by Ganoderma formosanum. By implementing a three-factor, three-level Box-Behnken design, the optimum medium composition was found to be an initial pH of 5.3, 49.2 g/L glucose, and 4.9 g/L yeast extract. Under these conditions, the predicted EPS yield was up to 830.2 mg/L, 1.4-fold higher than that from the basic medium (604.5 mg/L). Furthermore, the experimentally validated EPS production correlated closely (100.4%) with the computational prediction of the response model. In addition, the percentage of β-glucan, a well-recognized bioactive polysaccharide, in the EPS was 53±5.5%, higher than that from Ganoderma lucidum in a previous study. Moreover, monosaccharide composition analysis indicated that glucose was the major component of G. formosanum EPS, consistent with the high β-glucan percentage in the EPS. Taken together, this is the first study to investigate the influence of medium composition on G. formosanum EPS production, as well as its β-glucan composition. Copyright © 2017. Published by Elsevier B.V.
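
    The response-surface step can be sketched as fitting a second-order polynomial to coded Box-Behnken design points. The design matrix below is the standard three-factor layout, while the yields are placeholders, not the paper's measurements.

    import numpy as np

    def quadratic_design(X):
        """Columns: 1, x1..x3, x1^2..x3^2, pairwise interactions."""
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1**2, x2**2, x3**2,
                                x1*x2, x1*x3, x2*x3])

    # Coded three-factor Box-Behnken design: 12 edge points + 3 center points.
    X = np.array([[-1,-1,0],[1,-1,0],[-1,1,0],[1,1,0],
                  [-1,0,-1],[1,0,-1],[-1,0,1],[1,0,1],
                  [0,-1,-1],[0,1,-1],[0,-1,1],[0,1,1],
                  [0,0,0],[0,0,0],[0,0,0]], dtype=float)
    y = np.array([600,700,650,720,640,710,660,730,
                  620,690,640,700,800,810,790], dtype=float)  # EPS yields (fake)

    beta, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
    print(beta)  # coefficients of the fitted response surface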

  18. COMPUTER SYSTEM FOR DETERMINING THE DAILY COST OF SUGAR PRODUCTION AND SUPPORTING DECISIONS FOR SUGAR COMPANIES (SACODI)

    Directory of Open Access Journals (Sweden)

    Alejandro Álvarez-Navarro

    2016-01-01

    The process of sugar production is complex; anything that affects this chain has direct repercussions on the cost of sugar production, a synthetic and decisive indicator for decision making. Currently, Cuban sugar factories determine this cost weekly, which hampers their decision-making process. Looking for solutions to this problem, the present work, part of a territorial project approved by CITMA, set out to calculate the cost of production daily, weekly, monthly, and cumulatively up to an indicated date, following an adaptation of the methodology used by the National Cost System for sugarcane created by MINAZ. It is supported by a computer system denominated SACODI. This adaptation registers the physical and economic indicators of all direct and indirect expenses of the sugarcane and, from this information, generates an economic-mathematical model of goal programming whose solution indicates the best short-term balance, in amount of sugar, among the entities of the sugar factory. The implementation of the system in the sugar factory «Julio A. Mella» in Santiago de Cuba during the 08-09 sugarcane season produced an estimated cost decrease of up to 3.5% through better decision making.
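
    The optimization step can be illustrated with a plain linear program standing in for the goal-programming model: choose how much cane each entity mills to meet a sugar target at minimum cost, under capacity limits. All coefficients are invented for the example.

    from scipy.optimize import linprog

    cost = [12.0, 14.0, 11.0]       # cost per tonne of cane at each entity (assumed)
    yield_t = [0.11, 0.12, 0.10]    # tonnes of sugar per tonne of cane (assumed)
    capacity = [5000, 4000, 6000]   # tonnes of cane each entity can mill (assumed)

    res = linprog(c=cost,
                  A_ub=[[-y for y in yield_t]], b_ub=[-1500],  # >= 1500 t sugar
                  bounds=list(zip([0, 0, 0], capacity)),
                  method="highs")
    print(res.x)  # cane allocation per entity that meets the target at least cost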

  19. Computation of fission product distribution in core and primary circuit of a high temperature reactor during normal operation

    International Nuclear Information System (INIS)

    Mattke, U.H.

    1991-08-01

    The fission product release from the core of a high temperature reactor during normal operation is well known to be very low. An HTR-Modul reactor with a reduced thermal power of 170 MW is examined with respect to whether the contamination with Cs-137, the most important nuclide, will be low enough that a helium turbine in the primary circuit is possible. The program SPTRAN is the tool for the computations and simulations of fission product transport in HTRs. The program, initially developed for computations of accident events, has been extended to compute fission product transport under the conditions of normal operation. The theoretical basis, the programs used and the data basis are presented, followed by the results of the computations. These results are explained and discussed; moreover, the consequences and future possibilities for development are shown. (orig./HP) [de

  20. EXTENDCHAIN: a package of computer programs for calculating the buildup of heavy metals, fission products, and activation products in reactor fuel elements

    International Nuclear Information System (INIS)

    Robertson, M.W.

    1977-01-01

    Design of HTGR recycle and refabrication facilities requires a detailed knowledge of the concentrations of around 400 nuclides which are segregated into four different fuel particle types. The EXTENDCHAIN package of computer programs and the supporting input data files were created to provide an efficient method for calculating the 1600 different concentrations required. The EXTENDCHAIN code performs zero-dimensional nuclide burnup, decay, and activation calculations in nine energy groups for up to 108 nuclides per run. Preparation and handling of the input and output for the sixteen EXTENDCHAIN runs required to produce the desired data are the most time consuming tasks in the computation of the spent fuel element composition. The EXTENDCHAIN package of computer programs contains four codes to aid in the preparation and handling of these data. Most of the input data such as cross sections, decay constants, and the nuclide interconnection scheme will not change when calculating new cases. These data were developed for the life cycle of a typical HTGR and stored on archive tapes for future use. The fuel element composition for this typical HTGR life has been calculated and the results for an equilibrium recycle reload are presented. 12 figures, 7 tables
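
Zero-dimensional burnup and decay calculations of the kind EXTENDCHAIN performs reduce to a linear system dN/dt = A·N over the nuclide vector, where A collects decay constants and flux-weighted transmutation rates. A minimal sketch for a hypothetical three-nuclide chain, solved with a matrix exponential; all rates are illustrative (the real code tracks up to 108 nuclides in nine energy groups).

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical chain N1 -> N2 -> N3 (stable), rates in 1/s.
lam1, lam2 = 3e-9, 7e-8          # decay constants of N1, N2
phi_sigma1 = 1e-9                # capture removal rate of N1 (flux * xs)

A = np.array([
    [-(lam1 + phi_sigma1), 0.0,   0.0],
    [ lam1,               -lam2,  0.0],
    [ 0.0,                 lam2,  0.0],
])

N0 = np.array([1.0e24, 0.0, 0.0])      # initial atom inventory
t = 3.0 * 365 * 24 * 3600              # a three-year cycle, in seconds
N = expm(A * t) @ N0                   # N(t) = exp(A t) N0
print("end-of-cycle inventory:", N)
```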

  1. Unification of behavioural, computational and neural accounts of word production errors in post-stroke aphasia

    Directory of Open Access Journals (Sweden)

    Marija Tochadse

    Full Text Available Neuropsychological assessment, brain imaging and computational modelling have augmented our understanding of the multifaceted functional deficits in people with language disorders after stroke. Despite the volume of research using each technique, no studies have attempted to assimilate all three approaches in order to generate a unified behavioural-computational-neural model of post-stroke aphasia. The present study included data from 53 participants with chronic post-stroke aphasia and merged: aphasiological profiles based on a detailed neuropsychological assessment battery, analysed with principal component and correlational analyses; measures of the impairment taken from Dell's computational model of word production; and the neural correlates of both behavioural and computational accounts, analysed by voxel-based correlational methodology. All three strands converged on the separation of semantic and phonological stages of aphasic naming, revealing the prominence of these dimensions for the explanation of aphasic performance. Over and above three previously described principal components (phonological ability, semantic ability, executive-demand), we observed auditory working memory as a novel factor. While the phonological Dell parameter was uniquely related to phonological errors/factor, the semantic parameter was less clear-cut, being related to both semantic errors and omissions, and loading heavily with the semantic ability and auditory working memory factors. The close relationship between the semantic Dell parameter and omission errors recurred in their high lesion-correlate overlap in the anterior middle temporal gyrus. In addition, the simultaneous overlap of the lesion correlate of omission errors with more dorsal temporal regions, associated with the phonological parameter, highlights the multiple drivers that underpin this error type. The novel auditory working memory factor was located along left superior

  2. 78 FR 24199 - Streak Products, Inc. v. UTi, United States, Inc.; Notice of Filing of Complaint and Assignment

    Science.gov (United States)

    2013-04-24

    ... FEDERAL MARITIME COMMISSION [Docket No. 13--04] Streak Products, Inc. v. UTi, United States, Inc...,'' against UTi, United States, Inc. (``UTi''), hereinafter ``Respondent.'' Complainant states that it is a... therefore, has violated 46 U.S.C. 41104(2). Complainant also alleges that ``UTi engaged in an unfair or...

  3. PLATO: a computer code for the analysis of fission product plateout in HTGRs

    International Nuclear Information System (INIS)

    Suzuki, Katsuo; Morimoto, Toshio.

    1981-01-01

    The computer code PLATO for estimating plateout activities on the surfaces of the primary cooling system of HTGRs has been developed, and in this report the analytical model and numerical calculation method incorporated in the code are described. The code utilizes a mass transfer model analogous to heat transfer, coupled with an expression for the adsorption-desorption phenomenon, and is able to analyze plateout behaviour both in a closed circuit, like a reactor cooling system, constructed from various kinds of components, and in an open-ended tube. With the code, the fission product concentration in the coolant and the plateout amount on the surfaces are calculated along the coolant stream, and the total removal rate by the plateout process is also obtained. Comparisons of the analytical results with experimental results, including checks of the effects of some calculation conditions on the results, and a preliminary analysis of the VHTR plant have been made. (author)

  4. Experimental and computational fluid dynamics studies of mixing of complex oral health products

    Science.gov (United States)

    Cortada-Garcia, Marti; Migliozzi, Simona; Weheliye, Weheliye Hashi; Dore, Valentina; Mazzei, Luca; Angeli, Panagiota; ThAMes Multiphase Team

    2017-11-01

    Highly viscous non-Newtonian fluids are largely used in the manufacturing of specialized oral care products. Mixing often takes place in mechanically stirred vessels where the flow fields and mixing times depend on the geometric configuration and the fluid physical properties. In this research, we study the mixing performance of complex non-Newtonian fluids using Computational Fluid Dynamics models and validate them against experimental laser-based optical techniques. To this aim, we developed a scaled-down version of an industrial mixer. As test fluids, we used mixtures of glycerol and a Carbomer gel. The viscosities of the mixtures against shear rate at different temperatures and phase ratios were measured and found to be well described by the Carreau model. The numerical results were compared against experimental measurements of velocity fields from Particle Image Velocimetry (PIV) and concentration profiles from Planar Laser Induced Fluorescence (PLIF).

  5. Computer-aided design of microvasculature systems for use in vascular scaffold production

    Energy Technology Data Exchange (ETDEWEB)

    Mondy, William Lafayette [Department of Chemical and Biomedical Engineering, University of South Florida, FL (United States); Cameron, Don [Department of Pathology and Cell Biology, College of Medicine, University of South Florida, FL (United States); Timmermans, Jean-Pierre [Department of Veterinary Sciences, University of Antwerp (Belgium); De Clerck, Nora [Department of Biomedical Sciences University of Antwerp (Belgium); Sasov, Alexander [Skyscan (Belgium); Casteleyn, Christophe [College of Veterinary Medicine, Ghent University (Belgium); Piegl, Les A [Department of Computer Science and Engineering, University of South Florida, FL (United States)

    2009-09-15

    In vitro biomedical engineering of intact, functional vascular networks, which include capillary structures, is a prerequisite for adequate vascular scaffold production. Capillary structures are necessary since they provide the elements and compounds for the growth, function and maintenance of 3D tissue structures. Computer-aided modeling of stereolithographic (STL) micro-computer tomographic (micro-CT) 3D models is a technique that enables us to mimic the design of vascular tree systems containing capillary beds, found in tissues. In our first paper (Mondy et al 2009 Tissue Eng. at press), using micro-CT, we studied the possibility of using vascular tissues to produce data capable of aiding the design of vascular tree scaffolding, which would help in the reverse engineering of a complete vascular tree system including capillary bed structures. In this paper, we used STL models of large datasets of computer-aided design (CAD) data of vascular structures which contained capillary structures that mimic those in the dermal layers of rabbit skin. Using CAD software we created from 3D STL models a bio-CAD design for the development of capillary-containing vascular tree scaffolding for skin. This method is designed to enhance a variety of therapeutic protocols including, but not limited to, organ and tissue repair, systemic disease mediation and cell/tissue transplantation therapy. Our successful approach to in vitro vasculogenesis will allow the bioengineering of various other types of 3D tissue structures, and as such greatly expands the potential applications of biomedical engineering technology into the fields of biomedical research and medicine.

  6. Application of computational chemistry methods to obtain thermodynamic data for hydrogen production from liquefied petroleum gas

    Directory of Open Access Journals (Sweden)

    J. A. Sousa

    2013-03-01

    Full Text Available The objective of this study was to estimate thermodynamic data, such as standard enthalpy, entropy and Gibbs free energy changes of reaction and, consequently, chemical equilibrium constants, for a reaction system describing hydrogen production from Liquefied Petroleum Gas (LPG). These properties were obtained using computational chemistry methods and the results were compared with experimental data reported in the literature. The reaction system of steam reforming of LPG was represented as a set of seven independent reactions involving the chemical species n-C4H10, C3H8, C2H6, C2H4, CH4, CO2, CO, H2O, H2 and solid carbon. Six computational approaches were used: Density Functional Theory (DFT) employing Becke's three-parameter hybrid exchange functional and the Lee-Yang-Parr correlation functional (B3LYP) with the 6-31++G(d,p) basis set, and the composite methods CBS-QB3, Gaussian-1 (G1), Gaussian-2 (G2), Gaussian-3 (G3) and Gaussian-4 (G4). Mole fractions of the system components were also determined between 873.15 and 1173.15 K, at 1 atm and a feed with a stoichiometric amount of water. Results showed that the hybrid functional B3LYP/6-31++G(d,p) and the G3 and G4 theories were the most appropriate methods to predict the properties of interest. Gaussian-3 and Gaussian-4 theories are expected to be good thermodynamic data predictors, and the known efficient prediction of vibrational frequencies by B3LYP is probably the source of the good agreement found in this study. This last methodology is of special interest since it presents low computational cost, which is important when more complex molecular systems are considered.
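
The link between a computed standard Gibbs free energy change and the equilibrium constant is the standard relation dG° = -RT ln K. A minimal sketch over the temperature range the study covers; the linear dG(T) used for illustration is invented, not the paper's data.

```python
import numpy as np

R = 8.314462618e-3   # gas constant, kJ/(mol*K)

def equilibrium_constant(dG_kJ_per_mol, T):
    """K = exp(-dG / (R T)), from dG = -R T ln K."""
    return np.exp(-dG_kJ_per_mol / (R * T))

# Illustrative only: a made-up linear dG(T) for a reforming-type reaction,
# just to show how K rises steeply with temperature over 873.15-1173.15 K.
for T in (873.15, 973.15, 1073.15, 1173.15):
    dG = 225.0 - 0.253 * T          # hypothetical fit, kJ/mol
    K = equilibrium_constant(dG, T)
    print(f"T = {T:7.2f} K   dG = {dG:7.2f} kJ/mol   K = {K:9.3e}")
```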

  7. Computer-aided design of microvasculature systems for use in vascular scaffold production

    International Nuclear Information System (INIS)

    Mondy, William Lafayette; Cameron, Don; Timmermans, Jean-Pierre; De Clerck, Nora; Sasov, Alexander; Casteleyn, Christophe; Piegl, Les A

    2009-01-01

    In vitro biomedical engineering of intact, functional vascular networks, which include capillary structures, is a prerequisite for adequate vascular scaffold production. Capillary structures are necessary since they provide the elements and compounds for the growth, function and maintenance of 3D tissue structures. Computer-aided modeling of stereolithographic (STL) micro-computer tomographic (micro-CT) 3D models is a technique that enables us to mimic the design of vascular tree systems containing capillary beds, found in tissues. In our first paper (Mondy et al 2009 Tissue Eng. at press), using micro-CT, we studied the possibility of using vascular tissues to produce data capable of aiding the design of vascular tree scaffolding, which would help in the reverse engineering of a complete vascular tree system including capillary bed structures. In this paper, we used STL models of large datasets of computer-aided design (CAD) data of vascular structures which contained capillary structures that mimic those in the dermal layers of rabbit skin. Using CAD software we created from 3D STL models a bio-CAD design for the development of capillary-containing vascular tree scaffolding for skin. This method is designed to enhance a variety of therapeutic protocols including, but not limited to, organ and tissue repair, systemic disease mediation and cell/tissue transplantation therapy. Our successful approach to in vitro vasculogenesis will allow the bioengineering of various other types of 3D tissue structures, and as such greatly expands the potential applications of biomedical engineering technology into the fields of biomedical research and medicine.

  8. Computer-aided design of microvasculature systems for use in vascular scaffold production.

    Science.gov (United States)

    Mondy, William Lafayette; Cameron, Don; Timmermans, Jean-Pierre; De Clerck, Nora; Sasov, Alexander; Casteleyn, Christophe; Piegl, Les A

    2009-09-01

    In vitro biomedical engineering of intact, functional vascular networks, which include capillary structures, is a prerequisite for adequate vascular scaffold production. Capillary structures are necessary since they provide the elements and compounds for the growth, function and maintenance of 3D tissue structures. Computer-aided modeling of stereolithographic (STL) micro-computer tomographic (micro-CT) 3D models is a technique that enables us to mimic the design of vascular tree systems containing capillary beds, found in tissues. In our first paper (Mondy et al 2009 Tissue Eng. at press), using micro-CT, we studied the possibility of using vascular tissues to produce data capable of aiding the design of vascular tree scaffolding, which would help in the reverse engineering of a complete vascular tree system including capillary bed structures. In this paper, we used STL models of large datasets of computer-aided design (CAD) data of vascular structures which contained capillary structures that mimic those in the dermal layers of rabbit skin. Using CAD software we created from 3D STL models a bio-CAD design for the development of capillary-containing vascular tree scaffolding for skin. This method is designed to enhance a variety of therapeutic protocols including, but not limited to, organ and tissue repair, systemic disease mediation and cell/tissue transplantation therapy. Our successful approach to in vitro vasculogenesis will allow the bioengineering of various other types of 3D tissue structures, and as such greatly expands the potential applications of biomedical engineering technology into the fields of biomedical research and medicine.

  9. Exploring Natural Products from the Biodiversity of Pakistan for Computational Drug Discovery Studies: Collection, Optimization, Design and Development of A Chemical Database (ChemDP).

    Science.gov (United States)

    Mirza, Shaher Bano; Bokhari, Habib; Fatmi, Muhammad Qaiser

    2015-01-01

    Pakistan possesses a rich and vast source of natural products (NPs). Some of these secondary metabolites have been identified as potent therapeutic agents. However, the medicinal usage of most of these compounds has not yet been fully explored. The discovery of new NP scaffolds as inhibitors of certain enzymes or receptors using advanced computational drug discovery approaches is also limited by the unavailability of accurate 3D structures of NPs. An organized database incorporating all relevant information can therefore facilitate exploration of the medicinal importance of metabolites from Pakistani biodiversity. The Chemical Database of Pakistan (ChemDP; release 01) is a fully referenced, evolving, web-based, virtual database which has been designed and developed to introduce natural products (NPs) and their derivatives from the biodiversity of Pakistan to the global scientific community. The prime aim is to provide quality structures of compounds with relevant information for computer-aided drug discovery studies. For this purpose, over 1000 NPs have been identified from more than 400 published articles, for which 2D and 3D molecular structures have been generated with a special focus on their stereochemistry, where applicable. The PM7 semiempirical quantum chemistry method has been used to energy-optimize the 3D structures of NPs. The 2D and 3D structures can be downloaded as .sdf, .mol, .sybyl, .mol2, and .pdb files - formats readable by many chemoinformatics/bioinformatics software packages. Each entry in ChemDP contains over 100 data fields representing various molecular, biological, physico-chemical and pharmacological properties, which have been properly documented in the database for end users. These pieces of information have been either manually extracted from the literature or computationally calculated using various computational tools. Cross-referencing to a major data repository, i.e., ChemSpider, has been made available for overlapping

  10. Advanced display object selection methods for enhancing user-computer productivity

    Science.gov (United States)

    Osga, Glenn A.

    1993-01-01

    The User-Interface Technology Branch at NCCOSC RDT&E Division has been conducting a series of studies to address the suitability of commercial off-the-shelf (COTS) graphic user-interface (GUI) methods for efficiency and performance in critical naval combat systems. This paper presents an advanced selection algorithm and method developed to increase user performance when making selections on tactical displays. The method has also been applied with considerable success to a variety of cursor and pointing tasks. Typical GUIs allow user selection by (1) moving a cursor with a pointing device such as a mouse, trackball, joystick, or touchscreen, and (2) placing the cursor on the object. Examples of GUI objects are the buttons, icons, folders, and scroll bars used in many personal computer and workstation applications. This paper presents an improved method of selection and the theoretical basis for the significant performance gains achieved with the various input devices tested. The method is applicable to all GUI styles and display sizes, and is particularly useful for selections on small screens such as notebook computers. Considering the amount of work-hours spent pointing and clicking across all styles of available graphic user-interfaces, the cost/benefit of applying this method is substantial, with the potential for increasing productivity across thousands of users and applications.

  11. The global unified parallel file system (GUPFS) project: FY 2003 activities and results

    Energy Technology Data Exchange (ETDEWEB)

    Butler, Gregory F.; Baird, William P.; Lee, Rei C.; Tull, Craig E.; Welcome, Michael L.; Whitney, Cary L.

    2004-04-30

    The Global Unified Parallel File System (GUPFS) project is a multiple-phase project at the National Energy Research Scientific Computing (NERSC) Center whose goal is to provide a scalable, high-performance, high-bandwidth, shared file system for all of the NERSC production computing and support systems. The primary purpose of the GUPFS project is to make the scientific users more productive as they conduct advanced scientific research at NERSC by simplifying the scientists' data management tasks and maximizing storage and data availability. This is to be accomplished through the use of a shared file system providing a unified file namespace, operating on consolidated shared storage that is accessible by all the NERSC production computing and support systems. In order to successfully deploy a scalable high-performance shared file system with consolidated disk storage, three major emerging technologies must be brought together: (1) shared/cluster file systems software, (2) cost-effective, high-performance storage area network (SAN) fabrics, and (3) high-performance storage devices. Although they are evolving rapidly, these emerging technologies individually are not targeted towards the needs of scientific high-performance computing (HPC). The GUPFS project is in the process of assessing these emerging technologies to determine the best combination of solutions for a center-wide shared file system, to encourage the development of these technologies in directions needed for HPC, particularly at NERSC, and to then put them into service. With the development of an evaluation methodology and benchmark suites, and with the updating of the GUPFS testbed system, the project did a substantial number of investigations and evaluations during FY 2003. The investigations and evaluations involved many vendors and products. From our evaluation of these products, we have found that most vendors and many of the products are more focused on the commercial market. Most vendors

  12. High-Bandwidth Tactical-Network Data Analysis in a High-Performance-Computing (HPC) Environment: Packet-Level Analysis

    Science.gov (United States)

    2015-09-01

    Individual fragments are reassembled using a hash-based method; in general, fragments appear in order and relatively close to each other in the file. A data product derived from the data model is a Google Earth Keyhole Markup Language (KML) file, which includes aggregate statistics.

  13. Virtual file system for PSDS

    Science.gov (United States)

    Runnels, Tyson D.

    1993-01-01

    This is a case study. It deals with the use of a 'virtual file system' (VFS) for Boeing's UNIX-based Product Standards Data System (PSDS). One of the objectives of PSDS is to store digital standards documents. The file-storage requirements are that the files must be rapidly accessible, stored for long periods of time - as though they were paper, protected from disaster, and accumulative to about 80 billion characters (80 gigabytes). This volume of data will be approached in the first two years of the project's operation. The approach chosen is to install a hierarchical file migration system using optical disk cartridges. Files are migrated from high-performance media to lower performance optical media based on a least-frequency-used algorithm. The optical media are less expensive per character stored and are removable. Vital statistics about the removable optical disk cartridges are maintained in a database. The assembly of hardware and software acts as a single virtual file system transparent to the PSDS user. The files are copied to 'backup-and-recover' media whose vital statistics are also stored in the database. Seventeen months into operation, PSDS is storing 49 gigabytes. A number of operational and performance problems were overcome. Costs are under control. New and/or alternative uses for the VFS are being considered.
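
The migration policy described, least-frequently-used files moving from high-performance media to optical cartridges and recalled on access, can be sketched in a few lines. The class, file names, sizes, and capacity below are invented, not PSDS internals.

```python
class MigratingStore:
    """Toy two-tier store: least-frequently-used files migrate from the
    fast (magnetic) tier to the slow (optical) tier when space runs out."""

    def __init__(self, fast_capacity):
        self.fast_capacity = fast_capacity
        self.fast = {}      # name -> size, high-performance media
        self.slow = {}      # name -> size, optical cartridges
        self.freq = {}      # name -> access count

    def write(self, name, size):
        self.fast[name] = size
        self.freq.setdefault(name, 0)
        self._migrate_if_needed()

    def read(self, name):
        self.freq[name] += 1
        if name in self.slow:               # recall from optical media
            self.fast[name] = self.slow.pop(name)
            self._migrate_if_needed()

    def _migrate_if_needed(self):
        # Evict least-frequently-used files until the fast tier fits.
        while sum(self.fast.values()) > self.fast_capacity and len(self.fast) > 1:
            victim = min(self.fast, key=lambda n: self.freq[n])
            self.slow[victim] = self.fast.pop(victim)

store = MigratingStore(fast_capacity=100)
store.write("std-001.pdf", 60)
store.write("std-002.pdf", 60)   # overflows: one file migrates to optical
store.read("std-001.pdf")        # recall bumps its frequency
print("fast:", store.fast, "slow:", store.slow)
```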

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4-times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  15. Time and temperature dependence of cascade induced defect production in in situ experiments and computer simulation

    International Nuclear Information System (INIS)

    Ishino, Shiori

    1993-01-01

    Understanding of the defect production and annihilation processes in a cascade is important in modelling of radiation damage for establishing irradiation correlation. In situ observation of heavy ion radiation damage has great prospects in this respect. The time and temperature dependence of the formation and annihilation of vacancy clusters in a cascade has been studied, with a time resolution of 30 ms, using a facility which comprises a heavy ion accelerator and an electron microscope. Formation and annihilation rates of defect clusters have been separately measured by this technique. The observed processes have been analysed by simple kinetic equations, taking into account the sink effect of the surface and of the defect clusters themselves, together with the annihilation process due to thermal emission of vacancies from the defect clusters. Another tool to study the time and temperature dependence of defect production in a cascade is computer simulation. Recent results of molecular dynamics calculations on the temperature dependence of cascade evolution are presented, including the directional and temperature dependence of the lengths of replacement collision sequences, the temperature dependence of the process of reaching thermal equilibrium, and so on. These results are discussed within the general time frame of radiation damage evolution, covering 10^-15 to 10^9 s, and several important issues for general understanding have been identified. (orig.)

  16. The Computation of Nash Equilibrium in Fashion Games via Semi-Tensor Product Method

    Institute of Scientific and Technical Information of China (English)

    GUO Peilian; WANG Yuzhen

    2016-01-01

    Using the semi-tensor product of matrices, this paper investigates the computation of pure-strategy Nash equilibrium (PNE) for fashion games, and presents several new results. First, a formal fashion game model on a social network is given. Second, the utility function of each player is converted into an algebraic form via the semi-tensor product of matrices, based on which the case of the two-strategy fashion game is studied and two methods are obtained for verifying the existence of PNE. Third, the multi-strategy fashion game model is investigated and an algorithm is established to find all the PNEs for the general case. Finally, two kinds of optimization problems, that is, the so-called social welfare and normalized satisfaction degree optimization problems, are investigated and two useful results are given. The study of several illustrative examples shows that the new results obtained in this paper are effective.
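
The record recasts utilities algebraically via the semi-tensor product; what a PNE check ultimately computes can be seen with plain brute-force enumeration over strategy profiles. The three-player line network and the conformist/rebel type assignment below are illustrative assumptions, not the paper's example.

```python
from itertools import product

# Fashion game on a line network 0-1-2: every player picks style 0 or 1.
# A conformist gains 1 per neighbor with the same style, a rebel gains 1
# per neighbor with a different style.
neighbors = {0: [1], 1: [0, 2], 2: [1]}
conformist = {0: True, 1: False, 2: True}   # illustrative types

def utility(player, profile):
    same = sum(profile[player] == profile[n] for n in neighbors[player])
    return same if conformist[player] else len(neighbors[player]) - same

def is_pne(profile):
    # No player can gain by unilaterally switching style.
    for p in range(3):
        flipped = tuple(1 - s if i == p else s for i, s in enumerate(profile))
        if utility(p, flipped) > utility(p, profile):
            return False
    return True

pnes = [prof for prof in product((0, 1), repeat=3) if is_pne(prof)]
print("pure-strategy Nash equilibria:", pnes)
```

For this particular type assignment the list comes back empty, illustrating that fashion games need not admit a PNE at all, which is exactly what the paper's verification methods are designed to test.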

  17. A deterministic computer simulation model of life-cycle lamb and wool production.

    Science.gov (United States)

    Wang, C T; Dickerson, G E

    1991-11-01

    A deterministic mathematical computer model was developed to simulate effects on life-cycle efficiency of lamb and wool production from genetic improvement of performance traits under alternative management systems. Genetic input parameters can be varied for age at puberty, length of anestrus, fertility, precocity of fertility, number born, milk yield, mortality, growth rate, body fat, and wool growth. Management options include mating systems, lambing intervals, feeding levels, creep feeding, weaning age, marketing age or weight, and culling policy. Simulated growth of animals is linear from birth to inflection point, then slows asymptotically to specified mature empty BW and fat content when nutrition is not limiting. The ME intake requirement to maintain normal condition is calculated daily or weekly for maintenance, protein and fat deposition, wool growth, gestation, and lactation. Simulated feed intake is the minimum of availability, DM physical limit, or ME physiological limit. Tissue catabolism occurs when intake is below the requirement for essential functions. Mortality increases when BW is depressed. Equations developed for calculations of biological functions were validated with published and unpublished experimental data. Lifetime totals are accumulated for TDN, DM, and protein intake and for market lamb equivalent output values of empty body or carcass lean and wool from both lambs and ewes. These measures of efficiency for combinations of genetic, management, and marketing variables can provide the relative economic weighting of traits needed to derive optimal criteria for genetic selection among and within breeds under defined industry production systems.
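
The growth rule the model describes, linear from birth to the inflection point and then asymptotic toward mature empty body weight, can be written as a small function. All parameter values below are illustrative stand-ins, not the model's calibrated inputs.

```python
import numpy as np

def body_weight(t, w_birth=4.0, w_inflect=35.0, w_mature=70.0, rate=0.22):
    """Empty body weight (kg) at age t (days): linear from birth to the
    inflection point, then an exponential approach to mature weight that
    is continuous in both value and slope at the inflection point."""
    t = np.asarray(t, dtype=float)
    t_inflect = (w_inflect - w_birth) / rate      # end of the linear phase
    linear = w_birth + rate * t
    k = rate / (w_mature - w_inflect)             # matches slope at t_inflect
    asymptotic = w_mature - (w_mature - w_inflect) * np.exp(-k * (t - t_inflect))
    return np.where(t < t_inflect, linear, asymptotic)

print(body_weight([0, 50, 141, 300, 900]).round(1))
```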

  18. A computer analysis code of radioactive corrosion product behaviour in primary circuits of LMFBRs (PSYCHE)

    International Nuclear Information System (INIS)

    Iizawa, Katsuyuki; Seki, Seiichi; Kawasaki, Yuji; Kano, Shigeki; Nihei, Isao

    1986-01-01

    Recently it has become an important subject to reduce exposure to radiation from radioactive corrosion products (CPs) during maintenance and repair work in reactor plants. Metallic sodium is used as the cooling material in fast reactor plants, leading to CP behaviours different from those in light water reactors. In the present study, a computer code for analyzing the behaviour of CPs in fast reactor plants is developed. The analysis code, called PSYCHE, makes it possible to perform consistent analysis of the production, migration and deposition of CPs in primary circuits, together with the dose rate around the piping of apparatus in cooling systems. An analysis model is developed based on test results on CP behaviour in out-pile sodium. The model, called the "dissolution-deposition model", can reproduce the atom-selective behaviour, transient phenomena and downstream effect of CPs, which represent mass transfer phenomena in sodium. Verification of this code is carried out on the basis of CP measurements made in "Joyo". The calculation vs. measurement ratio is found to be 0.5 - 2 for CP deposition density in the piping of cooling systems and 0.7 - 1.3 for dose rate, demonstrating that this code can give reasonable results. Analysis is also made to predict future changes in the total amount of deposited CP in "Joyo". (Nogami, K.)

  19. Detection Of Alterations In Audio Files Using Spectrograph Analysis

    Directory of Open Access Journals (Sweden)

    Anandha Krishnan G

    2015-08-01

    Full Text Available This study was carried out to detect changes in audio files using spectrograph analysis. An audio file format is a file format for storing digital audio data on a computer system. A sound spectrograph is a laboratory instrument that displays a graphical representation of the strengths of the various component frequencies of a sound as time passes. The objectives of the study were to find the changes in the spectrograph of an audio file after altering it, to compare those changes with the spectrograph of the original file, and to check for similarities and differences between MP3 and WAV formats. Five different alterations were carried out on each audio file to analyze the differences between the original and the altered file. To alter an MP3 or WAV file by cut and copy, the file was opened in Audacity and a different audio segment was pasted into it; this new file was then analyzed to view the differences. Noise was reduced by adjusting the necessary parameters, and the differences between the new file and the original file were analyzed. After the required changes were made through the dialog-box parameters, the edited audio file was opened in the software Spek, which produces a graph of that particular file; the graph was saved for further analysis. The graph of the original audio was then combined with the graph of the edited audio file to see the alterations.
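
A minimal sketch of the comparison workflow using SciPy in place of the Spek GUI: compute dB-scaled spectrograms of the original and edited files and flag time bins with large differences. The file paths and the 20 dB threshold are placeholders.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def spec_db(path):
    """Return frequencies, times and a dB-scaled spectrogram of a WAV file."""
    fs, x = wavfile.read(path)
    if x.ndim > 1:                      # stereo: keep one channel
        x = x[:, 0]
    f, t, sxx = spectrogram(x.astype(float), fs=fs, nperseg=1024)
    return f, t, 10 * np.log10(sxx + 1e-12)

f, t_orig, s_orig = spec_db("original.wav")      # placeholder paths
_, t_edit, s_edit = spec_db("edited.wav")

# Compare the overlapping time span: large dB differences flag alterations.
n = min(s_orig.shape[1], s_edit.shape[1])
diff = np.abs(s_orig[:, :n] - s_edit[:, :n])
print("max dB difference:", diff.max())
print("suspect time bins:", np.where(diff.max(axis=0) > 20.0)[0])
```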

  20. Remote file inquiry (RFI) system

    Science.gov (United States)

    1975-01-01

    System interrogates and maintains user-definable data files from remote terminals, using English-like, free-form query language easily learned by persons not proficient in computer programming. System operates in asynchronous mode, allowing any number of inquiries within limitation of available core to be active concurrently.

  1. Research of Performance Linux Kernel File Systems

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Ostroukh

    2015-10-01

    Full Text Available The article describes the most common Linux kernel file systems. The research was carried out on a personal computer, a typical workstation running GNU/Linux, whose characteristics are given in the article. The software necessary for measuring file system performance was installed on this machine. Based on the results, conclusions were drawn and recommendations proposed for the use of each file system, identifying the best ways to store data.

  2. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files, that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
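
The core idea, hashing a fixed number of pseudo-randomly sampled chunks instead of the whole file so that cost is flat in file size, can be sketched as follows. This is a simplified illustration, not the pfff tool's actual sampling scheme or output format.

```python
import hashlib
import os
import random

def sampled_fingerprint(path, n_samples=512, chunk=16, seed=42):
    """Hash `n_samples` small chunks drawn from seeded pseudo-random offsets.
    Cost is flat in file size; equal seeds make fingerprints comparable."""
    size = os.path.getsize(path)
    h = hashlib.sha256(size.to_bytes(8, "little"))  # size itself is a feature
    rng = random.Random(seed)
    with open(path, "rb") as fh:
        for _ in range(n_samples):
            offset = rng.randrange(max(size - chunk, 1))
            fh.seek(offset)
            h.update(fh.read(chunk))
    return h.hexdigest()

# Two files compare equal only if their fingerprints (computed with the same
# seed and parameters) match; variation in real data makes collisions rare.
# print(sampled_fingerprint("big_data.fastq"))    # placeholder path
```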

  3. Former food products safety: microbiological quality and computer vision evaluation of packaging remnants contamination.

    Science.gov (United States)

    Tretola, M; Di Rosa, A R; Tirloni, E; Ottoboni, M; Giromini, C; Leone, F; Bernardi, C E M; Dell'Orto, V; Chiofalo, V; Pinotti, L

    2017-08-01

    The use of alternative feed ingredients in farm animals' diets can be an interesting choice from several standpoints, including safety. In this respect, this study investigated the safety features of selected former food products (FFPs) intended for animal nutrition, produced in the framework of the IZS PLV 06/14 RC project by an FFP processing plant. Six FFP samples, both mash and pelleted, were analysed for the enumeration of total viable count (TVC) (ISO 4833), Enterobacteriaceae (ISO 21528-1), Escherichia coli (ISO 16649-1), coagulase-positive Staphylococci (CPS) (ISO 6888), presumptive Bacillus cereus and its spores (ISO 7932), sulphite-reducing Clostridia (ISO 7937), yeasts and moulds (ISO 21527-1), and the presence in 25 g of Salmonella spp. (ISO 6579). On the same samples, the presence of undesired ingredients, identifiable as remnants of packaging materials, was evaluated by two different methods: stereomicroscopy according to published methods, and stereomicroscopy coupled with a computer vision system (IRIS Visual Analyzer VA400). All FFPs analysed were safe from a microbiological point of view: TVC was limited and Salmonella was always absent. When remnants of packaging materials were considered, the contamination level was below 0.08% (w/w). Of note, packaging remnants were found mainly in the fractions from the 1-mm sieve mesh. Finally, the innovative computer vision system, combined with a stereomicroscope, demonstrated the possibility of rapidly detecting packaging remnants in FFPs. In conclusion, the FFPs analysed in the present study can be considered safe, even though some improvements in FFP processing in the feeding plant could be useful in further reducing their microbial loads and impurities.

  4. HUD GIS Boundary Files

    Data.gov (United States)

    Department of Housing and Urban Development — The HUD GIS Boundary Files are intended to supplement boundary files available from the U.S. Census Bureau. The files are for community planners interested in...

  5. Evolution of product lifespan and implications for environmental assessment and management: a case study of personal computers in higher education.

    Science.gov (United States)

    Babbitt, Callie W; Kahhat, Ramzy; Williams, Eric; Babbitt, Gregory A

    2009-07-01

    Product lifespan is a fundamental variable in understanding the environmental impacts associated with the life cycle of products. Existing life cycle and materials flow studies of products, almost without exception, consider lifespan to be constant over time. To determine the validity of this assumption, this study provides an empirical documentation of the long-term evolution of personal computer lifespan, using a major U.S. university as a case study. Results indicate that over the period 1985-2000, computer lifespan (purchase to "disposal") decreased steadily from a mean of 10.7 years in 1985 to 5.5 years in 2000. The distribution of lifespan also evolved, becoming narrower over time. Overall, however, the lifespan distribution was broader than normally considered in life cycle assessments or materials flow forecasts of electronic waste management for policy. We argue that these results suggest that, at least for computers, the assumption of constant lifespan is problematic and that it is important to work toward understanding the dynamics of use patterns. We modify an age-structured model of population dynamics from biology as a modeling approach to describe product life cycles. Lastly, the purchase share and generation of obsolete computers from the higher education sector are estimated using different scenarios for the dynamics of product lifespan.

  6. The Impact of Computer and Communications Technology on Recruiter Productivity and Quality of Life

    National Research Council Canada - National Science Library

    Blackstone, Tanja

    2003-01-01

    .... A test group of recruiters was given a set of tools, which included state of the art laptop computers, computer projection equipment, communications and database software, Internet and Intranet...

  7. Volume Measurement Algorithm for Food Product with Irregular Shape using Computer Vision based on Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    Joko Siswantoro

    2014-11-01

    Full Text Available Volume is one of the important issues in the production and processing of food products. Traditionally, volume measurement can be performed using the water displacement method based on Archimedes' principle. The water displacement method is inaccurate and considered destructive. Computer vision offers an accurate and nondestructive method for measuring the volume of food products. This paper proposes an algorithm for volume measurement of irregularly shaped food products using computer vision based on the Monte Carlo method. Five images of the object were acquired from five different views and then processed to obtain the silhouettes of the object. From the silhouettes, the Monte Carlo method was applied to approximate the volume of the object. The simulation result shows that the algorithm produced high accuracy and precision for volume measurement.
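
The estimation step can be sketched directly: sample points uniformly in a bounding box and count those whose projections fall inside every silhouette. Here three synthetic disc silhouettes of a unit sphere stand in for the five segmented camera views; the resulting visual hull (the three-cylinder intersection, volume 8(2 - sqrt(2))R^3, about 4.69) slightly overestimates the sphere (about 4.19), which is why more views improve accuracy.

```python
import numpy as np

rng = np.random.default_rng(1)
R = 1.0

def inside_all_silhouettes(p):
    """A point survives if its projection lies inside the disc silhouette
    seen along each of the x, y and z viewing directions."""
    return (np.hypot(p[:, 1], p[:, 2]) <= R) & \
           (np.hypot(p[:, 0], p[:, 2]) <= R) & \
           (np.hypot(p[:, 0], p[:, 1]) <= R)

# Monte Carlo: volume ~ (hits / samples) * bounding-box volume.
n = 1_000_000
pts = rng.uniform(-R, R, size=(n, 3))
vol = inside_all_silhouettes(pts).mean() * (2 * R) ** 3

print(f"Monte Carlo volume estimate: {vol:.3f}")
print(f"analytic visual hull:        {8 * (2 - np.sqrt(2)) * R**3:.3f}")
print(f"true sphere volume:          {4 / 3 * np.pi * R**3:.3f}")
```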

  8. TRAFIC, a computer program for calculating the release of metallic fission products from an HTGR core

    International Nuclear Information System (INIS)

    Smith, P.D.

    1978-02-01

    A special purpose computer program, TRAFIC, is presented for calculating the release of metallic fission products from an HTGR core. The program is based upon Fick's law of diffusion for radioactive species. One-dimensional transient diffusion calculations are performed for the coated fuel particles and for the structural graphite web. A quasi steady-state calculation is performed for the fuel rod matrix material. The model accounts for nonlinear adsorption behavior in the fuel rod gap and on the coolant hole boundary. The TRAFIC program is designed to operate in a core survey mode; that is, it performs many repetitive calculations for a large number of spatial locations in the core. This is necessary in order to obtain an accurate volume integrated release. For this reason the program has been designed with calculational efficiency as one of its main objectives. A highly efficient numerical method is used in the solution. The method makes use of the Duhamel superposition principle to eliminate interior spatial solutions from consideration. Linear response functions relating the concentrations and mass fluxes on the boundaries of a homogeneous region are derived. Multiple regions are numerically coupled through interface conditions. Algebraic elimination is used to reduce the equations as far as possible. The problem reduces to two nonlinear equations in two unknowns, which are solved using a Newton Raphson technique
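
TRAFIC's building block, Fick's-law transient diffusion, reduces in a single region to dc/dt = D d2c/dx2. A minimal explicit finite-difference sketch with illustrative numbers; the real code couples particle, matrix, and graphite-web regions with nonlinear adsorption boundary conditions.

```python
import numpy as np

# Explicit FTCS scheme for dc/dt = D d2c/dx2 in a slab (e.g. a coating
# layer), with c = 1 at the inner face and c = 0 at the outer face.
D  = 1.0e-13                   # diffusion coefficient, m^2/s (illustrative)
L  = 40e-6                     # layer thickness, m
nx = 81
dx = L / (nx - 1)
dt = 0.4 * dx * dx / D         # stable: dt <= dx^2 / (2 D)

c = np.zeros(nx)
c[0] = 1.0                     # normalized source-side concentration

t_end = 24 * 3600.0            # one day
for _ in range(int(t_end / dt)):
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0], c[-1] = 1.0, 0.0     # boundary conditions

# Release rate is proportional to the outward flux at the outer face.
flux = -D * (c[-1] - c[-2]) / dx
print(f"outer-face flux after one day: {flux:.3e} (normalized units)")
```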

  9. Importance of the Computed Tomography Dose Index (CTDI) and Dose Length Product (DLP)

    International Nuclear Information System (INIS)

    Rasolomboahanginjatovo, L.M.

    2014-01-01

    This work falls under the auspices of the International Atomic Energy Agency (IAEA) project RAF/9/053, entitled "Strengthening of the technical capacity for the protection of patients and workers". The goal of this work is to highlight the importance of the Computed Tomography Dose Index (CTDI) and the Dose Length Product (DLP). Measurements were performed at the Polyclinic of Ilafy and at CRDT Anosivavaka, Antananarivo, Madagascar. Doses were evaluated using a pencil ionization chamber, model 6000-10, connected to a RAD-CHECK model 06-256 electrometer. Knowledge of dose indicators and the Diagnostic Reference Level (DRL) allows a scanner to be monitored against the appropriate average dose. It also enables the progressive determination of the most suitable dose requirements through the choice of parameters available on the scanner. The measurements confirmed that doses from the scanners of the two centers were under the DRL requirements proposed by the IAEA, the European Commission (EC) and the National Radiological Protection Board (NRPB). The present results confirm that the patient doses delivered at the two centers are optimized. [fr

  10. Experiences on File Systems: Which is the best file system for you?

    CERN Document Server

    Blomer, J

    2015-01-01

    The distributed file system landscape is scattered. Besides a plethora of research file systems, there is also a large number of production grade file systems with various strengths and weaknesses. The file system, as an abstraction of permanent storage, is appealing because it provides application portability and integration with legacy and third-party applications, including UNIX utilities. On the other hand, the general and simple file system interface makes it notoriously difficult for a distributed file system to perform well under a variety of different workloads. This contribution provides a taxonomy of commonly used distributed file systems and points out areas of research and development that are particularly important for high-energy physics.

  11. Evaluated nuclear data file of Th-232

    International Nuclear Information System (INIS)

    Meadows, J.; Poenitz, W.; Smith, A.; Smith, D.; Whalen, J.; Howerton, R.

    1977-09-01

    An evaluated nuclear data file for thorium is described. The file extends over the energy range 0.049 (i.e., the inelastic-scattering threshold) to 20.0 MeV and is formulated within the framework of the ENDF system. The input data base, the evaluation procedures and judgments, and ancillary experiments carried out in conjunction with the evaluation are outlined. The file includes: neutron total cross sections, neutron scattering processes, neutron radiative capture cross sections, fission cross sections, (n;2n) and (n;3n) processes, fission properties (e.g., nu-bar and delayed neutron emission) and photon production processes. Regions of uncertainty are pointed out particularly where new measured results would be of value. The file is extended to thermal energies using previously reported resonance evaluations thereby providing a complete file for neutronic calculations. Integral data tests indicated that the file was suitable for neutronic calculations in the MeV range

  12. A secure file manager for UNIX

    Energy Technology Data Exchange (ETDEWEB)

    DeVries, R.G.

    1990-12-31

    The development of a secure file management system for a UNIX-based computer facility with supercomputers and workstations is described. Specifically, UNIX in its usual form does not address: (1) Operation which would satisfy rigorous security requirements. (2) Online space management in an environment where total data demands would be many times the actual online capacity. (3) Making the file management system part of a computer network in which users of any computer in the local network could retrieve data generated on any other computer in the network. The characteristics of UNIX can be exploited to develop a portable, secure file manager which would operate on computer systems ranging from workstations to supercomputers. Implementation considerations making unusual use of UNIX features, rather than requiring extensive internal system changes, are described, and implementation using the Cray Research Inc. UNICOS operating system is outlined.

  13. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false IEEE 1680 Standard for the... CONTRACT CLAUSES Text of Provisions and Clauses 52.223-16 IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products. As prescribed in 23.706(b)(1), insert the following clause: IEEE...

  14. Reassigning the Structures of Natural Products Using NMR Chemical Shifts Computed with Quantum Mechanics: A Laboratory Exercise

    Science.gov (United States)

    Palazzo, Teresa A.; Truong, Tiana T.; Wong, Shirley M. T.; Mack, Emma T.; Lodewyk, Michael W.; Harrison, Jason G.; Gamage, R. Alan; Siegel, Justin B.; Kurth, Mark J.; Tantillo, Dean J.

    2015-01-01

    An applied computational chemistry laboratory exercise is described in which students use modern quantum chemical calculations of chemical shifts to assign the structure of a recently isolated natural product. A pre/post assessment was used to measure student learning gains and verify that students demonstrated proficiency of key learning…

  15. Schools (Students) Exchanging CAD/CAM Files over the Internet.

    Science.gov (United States)

    Mahoney, Gary S.; Smallwood, James E.

    This document discusses how students and schools can benefit from exchanging computer-aided design/computer-aided manufacturing (CAD/CAM) files over the Internet, explains how files are exchanged, and examines the problem of selected hardware/software incompatibility. Key terms associated with information search services are defined, and several…

  16. Computational prediction of dust production in graphite moderated pebble bed reactors

    Science.gov (United States)

    Rostamian, Maziar

    The scope of the work reported here, which is the computational study of graphite wear behavior, supports the Nuclear Engineering University Programs project "Experimental Study and Computational Simulations of Key Pebble Bed Thermomechanics Issues for Design and Safety" funded by the US Department of Energy. In this work, modeling and simulating the contact mechanics, as anticipated in a PBR configuration, is carried out for the purpose of assessing the amount of dust generated during a full power operation year of a PBR. A methodology that encompasses finite element analysis (FEA) and micromechanics of wear is developed to address the issue of dust production and its quantification. Particularly, the phenomenon of wear and change of its rate with sliding length is the main focus of this dissertation. This work studies the wear properties of graphite by simulating pebble motion and interactions of a specific type of nuclear grade graphite, IG-11. This study consists of two perspectives: macroscale stress analysis and microscale analysis of wear mechanisms. The first is a set of FEA simulations considering pebble-pebble frictional contact. In these simulations, the mass of generated graphite particulates due to frictional contact is calculated by incorporating FEA results into Archard's equation, which is a linear correlation between wear mass and wear length. However, the experimental data by Johnson, University of Idaho, revealed that the wear rate of graphite decreases with sliding length. This is because the surfaces of the graphite pebbles become smoother over time, which results in a gradual decrease in wear rate. In order to address the change in wear rate, a more detailed analysis of wear mechanisms at room temperature is presented. In this microscale study, the wear behavior of graphite at the asperity level is studied by simulating the contact between asperities of facing surfaces. By introducing the effect of asperity removal on wear rate, a nonlinear
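
Archard's equation, the study's starting point, and a sliding-length-dependent wear rate of the kind the experiments motivate can both be written in a few lines. All coefficients are illustrative, not fitted IG-11 values.

```python
import numpy as np

def archard_wear_mass(load_N, sliding_m, k=1e-4, hardness_Pa=40e6,
                      density=1750.0):
    """Classical Archard wear: volume V = k * F * s / H, linear in s.
    Returns worn mass in kg; k, H and graphite density are illustrative."""
    volume = k * load_N * sliding_m / hardness_Pa
    return volume * density

def smoothing_wear_mass(load_N, sliding_m, k0=1e-4, beta=5e-4,
                        k_inf_frac=0.2, hardness_Pa=40e6, density=1750.0):
    """Wear with a coefficient that decays as surfaces smooth:
    k(s) = k_inf + (k0 - k_inf) * exp(-beta s), so mass grows sublinearly.
    Uses the closed-form integral of k(s) over the sliding length."""
    k_inf = k_inf_frac * k0
    k_int = k_inf * sliding_m + (k0 - k_inf) * (1 - np.exp(-beta * sliding_m)) / beta
    return k_int * load_N / hardness_Pa * density

s = np.array([1e2, 1e3, 1e4])        # sliding lengths, m
print("linear Archard  :", archard_wear_mass(10.0, s))
print("smoothing model :", smoothing_wear_mass(10.0, s))
```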

  17. Parallel file system with metadata distributed across partitioned key-value store c

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).

  18. 21 CFR 720.2 - Times for filing.

    Science.gov (United States)

    2010-04-01

    ... Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) COSMETICS VOLUNTARY FILING OF COSMETIC PRODUCT INGREDIENT COMPOSITION STATEMENTS § 720.2 Times for filing. Within 180 days after forms are made available to the industry, Form FDA 2512 should be filed for each cosmetic...

  19. Towards a Cloud Computing Environment: Near Real-time Cloud Product Processing and Distribution for Next Generation Satellites

    Science.gov (United States)

    Nguyen, L.; Chee, T.; Minnis, P.; Palikonda, R.; Smith, W. L., Jr.; Spangenberg, D.

    2016-12-01

    The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) processes and derives near real-time (NRT) global cloud products from operational geostationary satellite imager datasets. These products are being used in NRT to improve forecast models, aircraft icing warnings, and support for aircraft field campaigns. Next-generation satellites, such as the Japanese Himawari-8 and the upcoming NOAA GOES-R, present challenges for NRT data processing and product dissemination due to the increase in temporal and spatial resolution: the volume of data is expected to increase approximately tenfold. This increase in data volume will require additional IT resources to keep up with the processing demands and satisfy NRT requirements, and such resources are not readily available due to cost and other technical limitations. To anticipate and meet these computing resource requirements, we have employed a hybrid cloud computing environment to augment the generation of SatCORPS products. This paper will describe the workflow to ingest, process, and distribute SatCORPS products and the technologies used. Lessons learned from working on both AWS Cloud and GovCloud will be discussed: benefits, similarities, and differences that could impact the decision to use cloud computing and storage. A detailed cost analysis will be presented. In addition, future cloud utilization, parallelization, and architecture layout will be discussed for GOES-R.

  20. Download this PDF file

    African Journals Online (AJOL)

    Nigerian agriculture still maintained peasant oriented economy that was prominent in the pre- ... demand (Baba, 2010). The Food and .... For the Linear functional form, the elasticity with respect to the production inputs was computed using the ...

  1. The early days of computer-aided newspaper production systems

    OpenAIRE

    Enlund, Nils; Andersin, Hans

    2007-01-01

    During the years 1970-1973, an ambitious research project, the Computer Graphics Project (CGP), was carried out at the laboratory of Information Processing Science at the Helsinki University of Technology. The initial objective of the project was to develop application oriented system solutions for the emerging computer graphics technology, but the activities were soon focused on the problems of producing newspaper text, advertisement, and complete pages using interactive computer graphics. T...

  2. Influence of core design, production technique, and material selection on fracture behavior of yttria-stabilized tetragonal zirconia polycrystal fixed dental prostheses produced using different multilayer techniques: split-file, over-pressing, and manually built-up veneers.

    Science.gov (United States)

    Mahmood, Deyar Jallal Hadi; Linderoth, Ewa H; Wennerberg, Ann; Vult Von Steyern, Per

    2016-01-01

    To investigate and compare the fracture strength and fracture mode in eleven groups of the currently most commonly used multilayer three-unit all-ceramic yttria-stabilized tetragonal zirconia polycrystal (Y-TZP) fixed dental prostheses (FDPs), with respect to the choice of core material, veneering material area, manufacturing technique, design of connectors, and radii of curvature of FDP cores. A total of 110 three-unit Y-TZP FDP cores with one intermediate pontic were made. The FDP cores in groups 1-7 were made with a split-file design and veneered with manually built-up porcelain, computer-aided design-on veneers, or over-pressed veneers. Groups 8-11 consisted of FDPs with a state-of-the-art design, veneered with manually built-up porcelain. All the FDP cores were subjected to simulated aging and finally loaded to fracture. There was a significant difference between the different designs, but not between the different types of Y-TZP materials. The split-file designs with VITABLOCS® (1,806±165 N) and e.max® ZirPress (1,854±115 N) and the state-of-the-art design with VITA VM® 9 (1,849±150 N) demonstrated the highest mean fracture values. The shape of a split-file designed all-ceramic reconstruction calls for a different dimensioning protocol compared to traditionally shaped ones, as the split-file design leads to sharp approximal indentations acting as fracture initiation sites, thus decreasing the overall strength. The design of a framework is a crucial factor for the load-bearing capacity of an all-ceramic FDP. The state-of-the-art design is preferable, since the split-file designed cores call for a cross-sectional connector area at least 42% larger to have the same load-bearing capacity as the state-of-the-art designed cores. All veneering materials and techniques tested in the study (split-file, over-press, and manually built-up porcelains and glass-ceramics) are, with a great safety margin, sufficient for clinical use both anteriorly and posteriorly. Analysis of the fracture pattern shows

  3. Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    There are several classes of nodes that users access. Login nodes: Peregrine has four login nodes, each of which has Intel E5 processors; in addition to the /scratch file systems, the /mss file system is mounted on all login nodes. Compute nodes: Peregrine has 2592

  4. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  5. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally, analyzing data happens via batch processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: a cloud-based approach. It aims to make data analysis a productive and interactive environment through the combination of FCC and SWAN software.

  6. PC Graphic file programming

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

    This book describes the basics of graphics knowledge and the structure of graphic file formats. The first part deals with graphic data, storage and compression of graphic data, and programming topics such as assembly, the stack, compiling and linking programs, and practice and debugging. The next part covers graphic file formats such as MacPaint, GEM/IMG, PCX, GIF, and TIFF files, hardware considerations such as high-speed monochrome and color screen drivers, and the basic concepts of dithering and format conversion.
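
    The formats listed are all header- or chunk-based, so a viewer starts by parsing a fixed header. Below is a minimal sketch for GIF, whose public layout begins with a 6-byte signature followed by a little-endian 16-bit width and height; the file name is a placeholder.

        # Read the GIF header: 6-byte signature, then little-endian width/height.
        import struct

        def read_gif_header(path):
            with open(path, "rb") as f:
                signature = f.read(6)                 # b"GIF87a" or b"GIF89a"
                if signature not in (b"GIF87a", b"GIF89a"):
                    raise ValueError("not a GIF file")
                width, height = struct.unpack("<HH", f.read(4))
            return signature.decode("ascii"), width, height

        print(read_gif_header("example.gif"))  # e.g. ('GIF89a', 640, 480)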

  7. Methods and Algorithms for Detecting Objects in Video Files

    Directory of Open Access Journals (Sweden)

    Nguyen The Cuong

    2018-01-01

    Full Text Available Video files are files that store motion pictures and sounds as in real life. In today's world, the need for automated processing of information in video files is increasing. Automated processing of information has a wide range of applications, including office/home surveillance cameras, traffic control, sports applications, remote object detection, and others. In particular, detection and tracking of object movement in video files play an important role. This article describes methods of detecting objects in video files. Today, this problem in the field of computer vision is being studied worldwide.
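
    As one concrete example of the class of methods surveyed, the sketch below detects moving objects by background subtraction with OpenCV; the input file name and area threshold are illustrative, and this is not necessarily the article's own algorithm.

        # Moving-object detection by background subtraction (one common method).
        import cv2

        cap = cv2.VideoCapture("traffic.mp4")        # hypothetical input file
        subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = subtractor.apply(frame)           # foreground mask
            mask = cv2.medianBlur(mask, 5)           # suppress speckle noise
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            moving = [c for c in contours if cv2.contourArea(c) > 200]
            print(f"{len(moving)} moving object(s) in this frame")
        cap.release()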

  8. Canada files WTO complaint against EC.

    Science.gov (United States)

    2000-01-01

    In December 1998, Canada filed a complaint alleging that the European Communities (EC) had adopted regulations that amounted to a scheme to extend patent terms, limited to pharmaceutical and agricultural chemical products.

  9. Silvabase: A flexible data file management system

    Science.gov (United States)

    Lambing, Steven J.; Reynolds, Sandra J.

    1991-01-01

    The need for a more flexible and efficient data file management system for mission planning in the Mission Operations Laboratory (EO) at MSFC has spawned the development of Silvabase. Silvabase is a new data file structure based on a B+ tree. This data organization allows for efficient forward and backward sequential reads, random searches, and appends to existing data. It also provides random insertions and deletions with reasonable efficiency, makes good use of storage space without sacrificing speed, and performs these functions on large volumes of data. Mission planners required that some data be keyed and manipulated in ways not found in commercial products. Mission planning software is currently being converted to use Silvabase in the Spacelab and Space Station Mission Planning Systems. Silvabase runs on Digital Equipment Corporation's popular VAX/VMS computers and is written in VAX Fortran. Silvabase has unique features involving time histories and intervals, such as in operations research. Because of its flexibility and unique capabilities, Silvabase could be used in almost any government or commercial application that requires efficient reads, searches, and appends in medium to large amounts of almost any kind of data.

  10. Automated quality control in a file-based broadcasting workflow

    Science.gov (United States)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden faults in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient and help the station achieve a competitive advantage in the media market.
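
    A minimal sketch of the parallelization idea follows: independent per-file checks distributed over a process pool. The checks shown (checksum, non-empty size) are illustrative stand-ins for the paper's QC criterion, and the media file names are hypothetical.

        # Run per-file QC checks in parallel; the checks are illustrative stand-ins.
        import hashlib
        from multiprocessing import Pool

        def qc_check(path):
            report = {"file": path}
            h = hashlib.md5()
            size = 0
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
                    size += len(chunk)
            report["md5"] = h.hexdigest()          # integrity
            report["nonzero_size"] = size > 0      # trivial sanity check
            return report

        if __name__ == "__main__":
            files = ["clip_001.mxf", "clip_002.mxf"]   # hypothetical media files
            with Pool(processes=4) as pool:
                for report in pool.imap_unordered(qc_check, files):
                    print(report)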

  11. File Level Provenance Tracking in CMS

    CERN Document Server

    Jones, C D; Paterno, M; Sexton-Kennedy, L; Tanenbaum, W; Riley, D S

    2009-01-01

    The CMS off-line framework stores provenance information within CMS's standard ROOT event data files. The provenance information is used to track how each data product was constructed, including what other data products were read to do the construction. We will present how the framework gathers the provenance information, the efforts necessary to minimise the space used to store the provenance in the file and the tools that will be available to use the provenance.

  12. EXPERIENCE OF USING CLOUD COMPUTING IN NETWORK PRODUCTS FOR SCHOOL EDUCATION

    Directory of Open Access Journals (Sweden)

    L. Sokolova

    2011-05-01

    Full Text Available We study data on the use of sites in the middle grades of secondary school and their influence on the formation of students' information culture and their level of training. The sites use Google's "cloud computing" technology, are accessible from any internet-connected computer, and do not require the use of the computer's own resources. The sites are devoid of any advertising, do not require periodic backup, and need no protection or general operation by a system administrator. This simplifies their use in the educational process for schools of different levels. A statistical analysis of site use was done, and the main trends of their use were identified.

  13. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business files compilation for an enterprise is a distillation and recreation of its spiritual wealth, from which applicable information can be made available to those who want to use it in a fast, extensive, and precise way. Proceeding from the effects of business files compilation on scientific research, productive construction, and development, this paper discusses in five points how to define topics, analyze historical materials, search or select data, and process it into an enterprise archives collection. Firstly, it expounds the importance and necessity of business files compilation in the production, operation, and development of a company; secondly, it presents processing methods from topic definition, material searching, and data selection to final examination and correction; thirdly, it defines principles and classifications in order to make different categories and levels of processing methods available to business files compilation; fourthly, it discusses the specific method of implementing a file compilation through a documentation collection, upon the principle of gearing topic definition to demand; fifthly, it addresses the application of information technology to business files compilation, in view of the wide need for business files, so as to raise the level of enterprise archives management. The present discussion focuses on the examination and correction principles of enterprise historical material compilation, the basic classifications, and the major forms of business files compilation achievements. (author)

  14. Formal computer-aided product family architecture design for mass customization

    DEFF Research Database (Denmark)

    Bonev, Martin; Hvam, Lars; Clarkson, John

    2015-01-01

    With product customization companies aim at creating higher customer value and stronger economic benefits. The profitability of the offered variety relies on the quality of the developed product family architectures and their consistent implementation in configuration systems. Yet existing method...

  15. NASA work unit system file maintenance manual

    Science.gov (United States)

    1972-01-01

    The NASA Work Unit System is a management information system for research tasks (i.e., work units) performed under NASA grants and contracts. It supplies profiles on research efforts and statistics on fund distribution. The file maintenance operator can add, delete and change records at a remote terminal or can submit punched cards to the computer room for batch update. The system is designed for file maintenance by a person with little or no knowledge of data processing techniques.

  16. Integrated Computer-aided Framework for Sustainable Chemical Product Design and Evaluation

    DEFF Research Database (Denmark)

    Kalakul, Sawitree; Cignitti, Stefano; Zhang, Lei

    2016-01-01

    This work proposes an integrated model-based framework for chemical product design and evaluation based on which the software, VPPD-Lab (The Virtual Product-Process Design Laboratory) has been developed. The framework allows the following options: (1) design a product using design templates...

  17. Computer Aided Product Service Systems Design : Service CAD and Its integration with Life Cycle Simulation

    NARCIS (Netherlands)

    Komoto, H.

    2009-01-01

    Integration of product design into service design, or vice versa, is considered to bring more efficient and effective value addition. Besides EcoDesign tools and methods, a methodology to design such an integration of products and services from a systemic perspective, or product-service systems

  18. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD sponsored multi-agency workshop on Computational Science and Engineering Software...

  19. Computer-assisted training experiment used in the field of thermal energy production (EDF)

    International Nuclear Information System (INIS)

    Felgines, R.

    1982-01-01

    In 1981, the EDF carried out an experiment with computer-assisted training (EAO). This new approach, which continued until June 1982, involved about 700 employees, all of whom operated nuclear power stations. The different stages of this experiment and the lessons which can be drawn from it are given; the lessons were positive and make it possible to envisage complete coverage of all nuclear power stations by computer-assisted training within a very short space of time [fr

  20. Computational simulation of reactive species production by methane-air DBD at high pressure and high temperature

    Science.gov (United States)

    Takana, H.; Tanaka, Y.; Nishiyama, H.

    2012-01-01

    Computational simulations of a single streamer in DBD in a lean methane-air mixture at pressures of 1 and 3 atm and temperatures of 300 and 500 K were conducted for plasma-enhanced chemical reactions in a closed system. The effects of surrounding pressure and temperature on reactive species production by a DBD discharge are characterized. The results show that the production characteristics of reactive species are strongly influenced by the total gas number density, and that higher concentrations of reactive species are produced at higher pressure and lower gas temperature for a given initial reduced electric field.

  1. Models and methods for design and implementation of computer based control and monitoring systems for production cells

    DEFF Research Database (Denmark)

    Lynggaard, Hans Jørgen Birk

    This dissertation is concerned with the engineering, i.e. the designing and making, of industrial cell control systems. The focus is on automated robot welding cells in the shipbuilding industry. The industrial research project defines models and methods for design and implementation of computer...... through the implementation of two cell control systems for robot welding cells in production at Odense Steel Shipyard.It is concluded that cell control technology provides for increased performance in production systems, and that the Cell Control Engineering concept reduces the effort for providing high...... quality and high functionality cell control solutions for the industry....

  2. Computationally-generated nuclear forensic characteristics of early production reactors with an emphasis on sensitivity and uncertainty

    International Nuclear Information System (INIS)

    Redd, Evan M.; Sjoden, Glenn; Erickson, Anna

    2017-01-01

    Highlights: •The X-10 reactor is used as a case study for nuclear forensic signatures. •S/U analysis is conducted to derive statistically relevant markers. •Computationally-generated signatures aid with proliferation pathway identification. •The highest uncertainty in total plutonium production originates from 238Pu and 242Pu. -- Abstract: With nuclear technology and analysis advancements, site access restrictions, and the ban on nuclear testing, computationally-generated nuclear forensic signatures are becoming more important in gaining knowledge of a reclusive country's weapon material production capabilities. In particular, graphite-moderated reactors provide an appropriate case study for isotopics relevant to Pu production in a clandestine nuclear program due to their ease of design and low thermal output. We study the production characteristics of the X-10 reactor with the goal of developing statistically relevant nuclear forensic signatures from early Pu production. In the X-10 reactor, a flat flux gradient and low burnup produce exceptionally pure Pu, as evident from the 240Pu/239Pu ratio. However, these design aspects also make determining reactor zone attribution, done with the 242Pu/240Pu ratio, uncertain. Alternatively, the same ratios produce statistically differentiable results between Manhattan Project and post-Manhattan Project reactor configurations, allowing for attribution conclusions.

  3. Evaluation of Cross-Section Sensitivities in Computing Burnup Credit Fission Product Concentrations

    International Nuclear Information System (INIS)

    Gauld, I.C.

    2005-01-01

    U.S. Nuclear Regulatory Commission Interim Staff Guidance 8 (ISG-8) for burnup credit covers actinides only, a position based primarily on the lack of definitive critical experiments and adequate radiochemical assay data that can be used to quantify the uncertainty associated with fission product credit. The accuracy of fission product neutron cross sections is paramount to the accuracy of criticality analyses that credit fission products in two respects: (1) the microscopic cross sections determine the reactivity worth of the fission products in spent fuel and (2) the cross sections determine the reaction rates during irradiation and thus influence the accuracy of predicted final concentrations of the fission products in the spent fuel. This report evaluates and quantifies the importance of the fission product cross sections in predicting concentrations of fission products proposed for use in burnup credit. The study includes an assessment of the major fission products in burnup credit and their production precursors. Finally, the cross-section importances, or sensitivities, are combined with the importance of each major fission product to the system eigenvalue (keff) to determine the net importance of cross sections to keff. The importances established the following fission products, listed in descending order of priority, that are most likely to benefit burnup credit when their cross-section uncertainties are reduced: 151Sm, 103Rh, 155Eu, 150Sm, 152Sm, 153Eu, 154Eu, and 143Nd

  4. Computer model for refinery operations with emphasis on jet fuel production. Volume 3: Detailed systems and programming documentation

    Science.gov (United States)

    Dunbar, D. N.; Tunnah, B. G.

    1978-01-01

    The FORTRAN computing program predicts flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on production of aviation turbine fuels of varying end point and hydrogen content specifications. The program has a provision for shale oil and coal oil in addition to petroleum crudes. A case study feature permits dependent cases to be run for parametric or optimization studies by input of only the variables which are changed from the base case.

  5. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  6. Computerized index for teaching files

    International Nuclear Information System (INIS)

    Bramble, J.M.

    1989-01-01

    A computerized index can be used to retrieve cases from a teaching file that have radiographic findings similar to an unknown case. The probability that a user will review cases with a correct diagnosis was estimated using the radiographic findings of arthritis in hand radiographs of 110 cases from a teaching file. The nearest-neighbor classification algorithm was used as a computer index to the 110 cases of arthritis. Each case was treated as an unknown and input to the computer index. The accuracy of the computer index in retrieving cases with the same diagnosis (including rheumatoid arthritis, gout, psoriatic arthritis, inflammatory osteoarthritis, and pyrophosphate arthropathy) was measured. A Bayes classifier algorithm was also tested on the same database. The estimated accuracy of the nearest-neighbor algorithm was 83%; by comparison, the estimated accuracy of the Bayes classifier algorithm was 78%. Conclusions: a computerized index to a teaching file based on the nearest-neighbor algorithm should allow the user to review cases with the correct diagnosis of an unknown case by entering the findings of the unknown case
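
    A minimal sketch of nearest-neighbor retrieval over binary finding vectors follows; the findings encoding and the diagnoses attached to each row are invented for illustration, not taken from the study's database.

        # 1-nearest-neighbor retrieval over binary radiographic-finding vectors.
        import numpy as np

        # Rows: teaching-file cases; columns: presence/absence of findings (hypothetical).
        cases = np.array([[1, 0, 1, 0],
                          [0, 1, 1, 1],
                          [1, 1, 0, 0]])
        diagnoses = ["rheumatoid arthritis", "gout", "psoriatic arthritis"]

        def nearest_case(unknown):
            distances = np.abs(cases - unknown).sum(axis=1)   # Hamming distance
            best = int(np.argmin(distances))
            return diagnoses[best], int(distances[best])

        print(nearest_case(np.array([1, 0, 1, 1])))  # -> ('rheumatoid arthritis', 1)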

  7. Status and evaluation methods of JENDL fusion file and JENDL PKA/KERMA file

    International Nuclear Information System (INIS)

    Chiba, S.; Fukahori, T.; Shibata, K.; Yu Baosheng; Kosako, K.

    1997-01-01

    The status of evaluated nuclear data in the JENDL fusion file and PKA/KERMA file is presented. The JENDL fusion file was prepared in order to improve the quality of the JENDL-3.1 data, especially on the double-differential cross sections (DDXs) of secondary neutrons and gamma-ray production cross sections, and to provide DDXs of secondary charged particles (p, d, t, 3He and α-particles) for the calculation of PKA and KERMA factors. The JENDL fusion file contains evaluated data for 26 elements ranging from Li to Bi. The data in the JENDL fusion file reproduce the measured data on neutron and charged-particle DDXs and also on gamma-ray production cross sections. Recoil spectra in the PKA/KERMA file were calculated from the secondary neutron and charged-particle DDXs contained in the fusion file with two-body reaction kinematics. The data in the JENDL fusion file and PKA/KERMA file were compiled in ENDF-6 format with an MF=6 option to store the DDX data. (orig.)

  8. Development changes of geometric layout product, developed by means of computer aided design

    Directory of Open Access Journals (Sweden)

    С.Г. Кєворков

    2007-01-01

    Full Text Available  Contains results of the development of a methodology for forming modifications in a product geometrical mockup made by means of a CAD system. The process of changing CAD data (assembly structures, details) and its influence on a product structure is considered. The analysis of the assembly version creation algorithm, which creates a product structure with a certain serial number, is carried out. Algorithms for creating the CAD user environment, restricting CAD objects, and cancelling CAD objects are presented.

  9. Computer Aided Methodology for Simultaneous Synthesis, Design & Analysis of Chemical Products-Processes

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2006-01-01

    A new combined methodology for computer aided molecular design and process flowsheet design is presented. The methodology is based on the group contribution approach for prediction of molecular properties and design of molecules. Using the same principles, process groups have been developed...... a wide range of problems. In this paper, only the computer aided flowsheet design related features are presented....... together with their corresponding flowsheet property models. To represent the process flowsheets in the same way as molecules, a unique but simple notation system has been developed. The methodology has been converted into a prototype software, which has been tested with several case studies covering...

  10. The design and development of GRASS file reservation system

    International Nuclear Information System (INIS)

    Huang Qiulan; Zhu Suijiang; Cheng Yaodong; Chen Gang

    2010-01-01

    GFRS (GRASS File Reservation System) is designed to improve the file access performance of GRASS (Grid-enabled Advanced Storage System), a Hierarchical Storage Management (HSM) system developed at the Computing Center, Institute of High Energy Physics. GRASS provides massive storage management and data migration, but the data migration policy is based simply on factors such as pool water level and the intervals between migrations, so it lacks precise control over files. To address this, we designed GFRS to implement user-based file reservation, which reserves and keeps the required files on disk for high energy physicists. GFRS can improve file access speed for users by avoiding the migration of frequently accessed files to tapes. In this paper we first give a brief introduction to the GRASS system and then the detailed architecture and implementation of GFRS. Experimental results from GFRS have shown good performance, and a simple analysis is made based on them. (authors)
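
    A minimal sketch of the user-based reservation idea follows: reserved files are excluded from migration candidates when the pool water level is high. The threshold and eviction order are illustrative assumptions, not the actual GFRS policy.

        # User-based file reservation: pinned files are skipped by migration.
        # Paths and the water-level policy are illustrative, not the GFRS internals.
        reservations = set()

        def reserve(path):
            reservations.add(path)

        def release(path):
            reservations.discard(path)

        def select_for_migration(disk_files, water_level, capacity):
            """Pick unreserved files to migrate when the pool is too full."""
            if water_level <= 0.8 * capacity:       # assumed threshold
                return []
            candidates = [f for f in disk_files if f not in reservations]
            return sorted(candidates)               # e.g. oldest-first in a real system

        reserve("/grass/user1/run42.root")
        print(select_for_migration(["/grass/user1/run42.root", "/grass/u2/a.dat"],
                                   water_level=95, capacity=100))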

  11. Decay data file based on the ENSDF file

    Energy Technology Data Exchange (ETDEWEB)

    Katakura, J. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    A decay data file with the JENDL (Japanese Evaluated Nuclear Data Library) format based on the ENSDF (Evaluated Nuclear Structure Data File) file was produced as a tentative one of special purpose files of JENDL. The problem using the ENSDF file as primary source data of the JENDL decay data file is presented. (author)

  12. Computational metabolic engineering strategies for growth-coupled biofuel production by Synechocystis

    Directory of Open Access Journals (Sweden)

    Kiyan Shabestary

    2016-12-01

    Full Text Available Chemical and fuel production by photosynthetic cyanobacteria is a promising technology but to date has not reached competitive rates and titers. Genome-scale metabolic modeling can reveal limitations in cyanobacteria metabolism and guide genetic engineering strategies to increase chemical production. Here, we used constraint-based modeling and optimization algorithms on a genome-scale model of Synechocystis PCC6803 to find ways to improve productivity of fermentative, fatty-acid, and terpene-derived fuels. OptGene and MOMA were used to find heuristics for knockout strategies that could increase biofuel productivity. OptKnock was used to find a set of knockouts that led to coupling between biofuel and growth. Our results show that high productivity of fermentation or reversed beta-oxidation derived alcohols such as 1-butanol requires elimination of NADH sinks, while terpenes and fatty-acid based fuels require creating imbalances in intracellular ATP and NADPH production and consumption. The FBA-predicted productivities of these fuels are at least 10-fold higher than those reported so far in the literature. We also discuss the physiological and practical feasibility of implementing these knockouts. This work gives insight into how cyanobacteria could be engineered to reach competitive biofuel productivities. Keywords: Cyanobacteria, Modeling, Flux balance analysis, Biofuel, MOMA, OptFlux, OptKnock
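
    Flux balance analysis (FBA), used throughout the study, maximizes an objective flux subject to the steady-state constraint S·v = 0 and flux bounds. Below is a minimal sketch on a toy three-reaction network using scipy's linprog; the stoichiometry and bounds are invented for illustration and are unrelated to the Synechocystis genome-scale model.

        # Flux balance analysis as a linear program: maximize product flux v3
        # subject to steady state S v = 0 and bounds. Toy 2-metabolite network.
        import numpy as np
        from scipy.optimize import linprog

        S = np.array([[1, -1,  0],    # metabolite A: produced by R1, consumed by R2
                      [0,  1, -1]])   # metabolite B: produced by R2, consumed by R3
        c = np.array([0, 0, -1])      # linprog minimizes, so negate the objective
        bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 flux units

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print(res.x)   # optimal fluxes, e.g. [10. 10. 10.]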

  13. Estimating and validating ground-based timber harvesting production through computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2003-01-01

    Estimating ground-based timber harvesting systems production with an object oriented methodology was investigated. The estimation model developed generates stands of trees, simulates chain saw, drive-to-tree feller-buncher, swing-to-tree single-grip harvester felling, and grapple skidder and forwarder extraction activities, and analyzes costs and productivity. It also...

  14. Computer software to estimate timber harvesting system production, cost, and revenue

    Science.gov (United States)

    Dr. John E. Baumgras; Dr. Chris B. LeDoux

    1992-01-01

    Large variations in timber harvesting cost and revenue can result from the differences between harvesting systems, the variable attributes of harvesting sites and timber stands, or changing product markets. Consequently, system and site specific estimates of production rates and costs are required to improve estimates of harvesting revenue. This paper describes...

  15. A hybrid approach to the computational aeroacoustics of human voice production

    Czech Academy of Sciences Publication Activity Database

    Šidlof, Petr; Zörner, S.; Huppe, A.

    2015-01-01

    Vol. 14, No. 3 (2015), pp. 473-488 ISSN 1617-7959 R&D Projects: GA ČR(CZ) GAP101/11/0207 Institutional support: RVO:61388998 Keywords: computational aeroacoustics * parallel CFD * human voice * vocal folds * ventricular folds Subject RIV: BI - Acoustics Impact factor: 3.032, year: 2015

  16. Memristor-Based Analog Computation and Neural Network Classification with a Dot Product Engine.

    Science.gov (United States)

    Hu, Miao; Graves, Catherine E; Li, Can; Li, Yunning; Ge, Ning; Montgomery, Eric; Davila, Noraica; Jiang, Hao; Williams, R Stanley; Yang, J Joshua; Xia, Qiangfei; Strachan, John Paul

    2018-03-01

    Using memristor crossbar arrays to accelerate computations is a promising approach to efficiently implement algorithms in deep neural networks. Early demonstrations, however, were limited to simulations or small-scale problems, primarily due to materials and device challenges that limit the size of the memristor crossbar arrays that can be reliably programmed to stable analog values, which is the focus of the current work. High-precision analog tuning and control of memristor cells across a 128 × 64 array is demonstrated, and the resulting vector-matrix multiplication (VMM) computing precision is evaluated. Single-layer neural network inference is performed in these arrays, and the performance is compared to a digital approach. The memristor computing system used here reaches a VMM accuracy equivalent to 6 bits, and an 89.9% recognition accuracy is achieved on the 10k MNIST handwritten-digit test set. Forecasts show that with integrated (on-chip) and scaled memristors, a computational efficiency greater than 100 trillion operations per second per watt is possible. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
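
    The crossbar computes a vector-matrix product in one analog step, with weights stored as programmable conductances. The numpy sketch below models the digital equivalent, quantizing weights to the 6-bit precision reported above; the array dimensions and value ranges are illustrative, not the paper's device parameters.

        # Analog vector-matrix multiplication with weights quantized to 6-bit
        # conductance levels, compared against the ideal digital result.
        import numpy as np

        rng = np.random.default_rng(0)
        W = rng.uniform(-1, 1, size=(64, 128))    # target weights
        x = rng.uniform(-1, 1, size=64)           # input vector (applied as voltages)

        levels = 2 ** 6                            # 6-bit programming precision
        W_q = np.round((W + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1

        y_ideal = x @ W
        y_analog = x @ W_q
        print("max abs error:", np.max(np.abs(y_analog - y_ideal)))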

  17. Factors Influencing F/OSS Cloud Computing Software Product Success: A Quantitative Study

    Science.gov (United States)

    Letort, D. Brian

    2012-01-01

    Cloud Computing introduces a new business operational model that allows an organization to shift information technology consumption from traditional capital expenditure to operational expenditure. This shift introduces challenges from both the adoption and creation vantage. This study evaluates factors that influence Free/Open Source Software…

  18. Model-driven product line engineering for mapping parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir

    2016-01-01

    Mapping parallel algorithms to parallel computing platforms requires several activities such as the analysis of the parallel algorithm, the definition of the logical configuration of the platform, the mapping of the algorithm to the logical configuration platform and the implementation of the

  19. Evaluating the Acceptance of Cloud-Based Productivity Computer Solutions in Small and Medium Enterprises

    Science.gov (United States)

    Dominguez, Alfredo

    2013-01-01

    Cloud computing has emerged as a new paradigm for on-demand delivery and consumption of shared IT resources over the Internet. Research has predicted that small and medium organizations (SMEs) would be among the earliest adopters of cloud solutions; however, this projection has not materialized. This study set out to investigate if behavior…

  20. Artificial intelligence applied to natural products; computer study of pimarane diterpene

    International Nuclear Information System (INIS)

    Lopes, M.N.; Borges, J.H.G.; Furlan, M.; Gastmans, J.P.; Emerenciano, V. de

    1989-01-01

    This paper describes the study of the 13C NMR characteristic signals of naturally occurring pimaranes. The analysis is performed by computer, starting from a data base which encloses about 400 diterpenes and using the PICKUPS program. In this way it is possible to analyse substructures of one to five atoms as well as the effects of substituents on them. (author)

  1. Map Design for Computer Processing: Literature Review and DMA Product Critique.

    Science.gov (United States)

    1985-01-01

    outcome. Using a program called "Seurat," gridded elevation data is processed... Use only a narrow border of layer tint on each side of the contour line... References (fragments): Massachusetts, unpublished; ...sity Cartographers 6, pp. 40-45; Dutton, Geoffrey (1981b), The Seurat Program. Computer...; French, Robert J. (1954). Pattern

  2. Computer-aided preparation of specifications for radial fans at VEB Lufttechnische Anlagen Berlin

    Energy Technology Data Exchange (ETDEWEB)

    Kubis, R.; Kull, W.

    1987-01-01

    The specification details the scope of delivery for radial fans on a standard page and also serves as preparation for production. In place of the previous manual preparation, a computer-aided technique for the office computer is presented that derives the technical parameters from data files using only the few input data needed to identify the fan type. The data files and evaluation programs are based on the REDABAS software tool and the SCP operating system. Using this technique it has been possible to cut considerably the preparation time for incoming orders.

  3. UPIN Group File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Group Unique Physician Identifier Number (UPIN) File is the business entity file that contains the group practice UPIN and descriptive information. It does NOT...

  4. Multi-level, automatic file management system using magnetic disk, mass storage system and magnetic tape

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1979-12-01

    A simple, effective file management system using magnetic disk, mass storage system (MSS) and magnetic tape is described. The following concepts and techniques are introduced in this file management system. (1) The file distribution and the continuity character of file references are closely approximated by a memory retention function. A density function using the memory retention function is thus defined. (2) A method of computing the cost/benefit lines for magnetic disk, MSS, and magnetic tape is presented. (3) A decision process for an optimal organization of file facilities, incorporating the distribution of file demands to the respective file devices, is presented. (4) A method of simple, practical, effective, automatic file management, incorporating multi-level file management, space management, and file migration control, is proposed. (author)
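
    A sketch of the flavor of such a policy follows: an exponential retention function (an assumed form, not necessarily the report's) estimates a file's expected access rate from its idle time, and the file is placed on disk, MSS, or tape accordingly. The half-life and thresholds are illustrative.

        # Assign each file to disk, MSS, or tape by expected access rate,
        # using an exponential retention function (assumed form).
        import math

        def expected_accesses(days_since_last_use, base_rate, half_life=30.0):
            return base_rate * math.exp(-math.log(2) * days_since_last_use / half_life)

        def place(file):
            rate = expected_accesses(file["idle_days"], file["base_rate"])
            if rate > 1.0:        # thresholds are illustrative
                return "disk"
            elif rate > 0.1:
                return "mss"
            return "tape"

        print(place({"idle_days": 3,   "base_rate": 5.0}))   # -> disk
        print(place({"idle_days": 300, "base_rate": 5.0}))   # -> tape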

  5. Biomimicry in Product Design through Materials Selection and Computer Aided Engineering

    Science.gov (United States)

    Alexandridis, G.; Tzetzis, D.; Kyratsis, P.

    2016-11-01

    The aim of this study is to demonstrate a 7-step methodology that describes the way nature can act as a source of inspiration for the design and development of a product. Furthermore, it suggests specialized computerized tools and methods for product optimization regarding environmental impact, i.e. material selection and production methods. For validation purposes, a garden chaise lounge that imitates the form of a scorpion was developed as the case study result and a demonstration of the current methodology.

  6. Computer modelling of the influences of a subsystems’ interaction on energetic efficiency of biofuel production systems

    Directory of Open Access Journals (Sweden)

    Wasiak Andrzej

    2017-01-01

    Full Text Available Energetic efficiency of biofuel production systems, as well as that of other fuel production systems, can be evaluated on the basis of a modified EROEI indicator. In earlier papers, a new definition of the EROEI indicator was introduced. This approach enables the determination of this indicator separately for individual subsystems of a chosen production system, and therefore enables studies of the influence of every subsystem on the energetic efficiency of the system as a whole. The method has been applied to the analysis of interactions between the agricultural and internal transport subsystems, as well as preliminary studies of the effect of the industrial subsystem.

  7. A File Archival System

    Science.gov (United States)

    Fanselow, J. L.; Vavrus, J. L.

    1984-01-01

    ARCH, a file archival system for the DEC VAX, provides for easy offline storage and retrieval of arbitrary files on a DEC VAX system. The system is designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of the same programs and associated files.

  8. Text File Comparator

    Science.gov (United States)

    Kotler, R. S.

    1983-01-01

    The file comparator program IFCOMP is a text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts as input two text files and produces a listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.
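
    A comparable comparison can be expressed in a few lines with Python's standard difflib; the unified format shown differs from IFCOMP's pseudo-update listing but serves the same change-monitoring purpose. The file names are placeholders.

        # Compare two text files and list differences (unified form, similar in
        # spirit to IFCOMP's pseudo-update listing).
        import difflib

        with open("old.src") as f:
            old = f.readlines()
        with open("new.src") as f:
            new = f.readlines()

        for line in difflib.unified_diff(old, new, fromfile="old.src", tofile="new.src"):
            print(line, end="")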

  9. Study on Production Management in Programming of Computer Numerical Control Machines

    Directory of Open Access Journals (Sweden)

    Gheorghe Popovici

    2014-12-01

    Full Text Available The paper presents the results of a study regarding the need for technological knowledge in programming for machine tools with computer-aided command. Engineering is the science of making skilled things. That is why, in the "factory of the future", programming engineering will have to realise part processing on MU-CNCs (Computer Numerical Control Machines) in the optimal economic variant. There is no "recipe" when it comes to technologies. In order to select the correct variant from among several technical variants, 10 technological requirements are put forward for the engineer to take into account in MU-CNC programming. It is the first argued synthesis of the need for technological knowledge in MU-CNC programming.

  10. Characterization and optimization of single-use bioreactors and biopharmaceutical production processes using computational fluid dynamics

    OpenAIRE

    Kaiser, Stephan Christian

    2015-01-01

    Through spatial and temporal modeling of the occurring flows, computational fluid dynamics (CFD) offers the potential to carry out detailed investigations of the hydrodynamics in bioreactors. However, only a few studies have so far been published in connection with single-use bioreactors, which differ from their classical counterparts made of glass and/or stainless steel in their design features. The present work therefore aims to provide suitable...

  11. Image Analysis via Soft Computing: Prototype Applications at NASA KSC and Product Commercialization

    Science.gov (United States)

    Dominguez, Jesus A.; Klinko, Steve

    2011-01-01

    This slide presentation reviews the use of "soft computing", which differs from "hard computing" in that it is more tolerant of imprecision, partial truth, uncertainty, and approximation, and its use in image analysis. Soft computing provides flexible information processing to handle real-life ambiguous situations and achieve tractability, robustness, low solution cost, and a closer resemblance to human decision making. Several systems are or have been developed: Fuzzy Reasoning Edge Detection (FRED), Fuzzy Reasoning Adaptive Thresholding (FRAT), image enhancement techniques, and visual/pattern recognition. These systems are compared with examples that show the effectiveness of each. NASA applications reviewed are: real-time (RT) anomaly detection, real-time (RT) moving debris detection, and the Columbia investigation. The RT anomaly detection reviewed the case of a damaged cable for the emergency egress system. The use of these techniques is further illustrated in the Columbia investigation with the location and detection of foam debris. There are several applications in commercial usage: image enhancement, human screening and privacy protection, visual inspection, 3D heart visualization, tumor detection, and X-ray image enhancement.

  12. Density functional computational studies on the glucose and glycine Maillard reaction: Formation of the Amadori rearrangement products

    Science.gov (United States)

    Jalbout, Abraham F.; Roy, Amlan K.; Shipar, Abul Haider; Ahmed, M. Samsuddin

    Theoretical energy changes of various intermediates leading to the formation of the Amadori rearrangement products (ARPs) under different mechanistic assumptions have been calculated, using open chain glucose (O-Glu)/closed chain glucose (A-Glu and B-Glu) and glycine (Gly) as a model for the Maillard reaction. Density functional theory (DFT) computations have been applied to the proposed mechanisms under different pH conditions. Thus, the possibility of the formation of different compounds and the electronic energy changes for different steps in the proposed mechanisms have been evaluated. B-Glu has been found to be more efficient than A-Glu, and A-Glu more efficient than O-Glu in the reaction. The reaction under basic conditions is the most favorable for the formation of ARPs. Other reaction pathways have been computed and discussed in this work.

  13. Computer simulation of charged fusion-product trajectories and detection efficiency expected for future experiments within the COMPASS tokamak

    International Nuclear Information System (INIS)

    Kwiatkowski, Roch; Malinowski, Karol; Sadowski, Marek J

    2014-01-01

    This paper presents results of computer simulations of charged particle motions and detection efficiencies for an ion-pinhole camera of a new diagnostic system to be used in future COMPASS tokamak experiments. A probe equipped with a nuclear track detector can deliver information about charged products of fusion reactions. The calculations were performed with a so-called Gourdon code, based on a single-particle model and toroidal symmetry. Trajectories of fast ions (>500 keV) in medium-dense plasma (n_e < 10^14 cm^-3) were computed, along with the expected detection efficiency (the ratio of the number of detected particles to the number emitted from plasma). The simulations showed that charged fusion products can reach the new diagnostic probe, and the expected detection efficiency can reach 2 × 10^-8. Based on such calculations, one can determine the optimal position and orientation of the probe. The obtained results are of importance for the interpretation of fusion-product images to be recorded in future COMPASS experiments. (paper)

  14. METHODS OF ASSESSING THE DEGREE OF DESTRUCTION OF RUBBER PRODUCTS USING COMPUTER VISION ALGORITHMS

    Directory of Open Access Journals (Sweden)

    A. A. Khvostov

    2015-01-01

    Full Text Available For the technical inspection of rubber products, methods of improving video scopes that analyze the degree of destruction and aging of rubber in an aggressive environment are essential. The main factor determining the degree of destruction of a rubber product is its degree of crack coverage, which can be described by quantities such as the total area and perimeter of cracks, their geometric shapes, and other parameters. In creating a methodology for assessing the degree of destruction of rubber products, the problem arises of developing a machine vision algorithm for estimating the degree of crack coverage of a sample and characterizing the fractures. To develop the image processing algorithm, experimental studies were performed on the artificial aging of several samples of products made from different rubbers. In the course of the experiments, several series of images of the vulcanizates were obtained in real time. To achieve the goals, light stabilization of the image array was first performed using a Gaussian filter. Thereafter, a binarization operation was applied to each image. The Canny algorithm was used to highlight the contours of the surface damage of the sample. The detected contours were converted into arrays of pixels. However, a single crack may be split across several contours, so an algorithm was developed for merging contours by the criterion of minimum distance between them. At the end, the morphological features of each contour (area, perimeter, length, width, angle of inclination, Minkowski dimension) are calculated. Graphs of the destruction parameters obtained by the method are shown for samples of rubber products. The developed method makes it possible to automate assessment of the degree of aging of rubber products in telemetry systems and to study the dynamics of the aging process of polymers to
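
    The described pipeline maps directly onto standard OpenCV calls, as the sketch below shows under assumed thresholds; the contour-merging step and the Minkowski dimension are omitted for brevity, and the input file name is hypothetical.

        # Crack-coverage pipeline as described: smooth, binarize, detect edges,
        # extract contours, and compute per-contour morphology. Thresholds are
        # illustrative.
        import cv2

        img = cv2.imread("rubber_sample.png", cv2.IMREAD_GRAYSCALE)
        smoothed = cv2.GaussianBlur(img, (5, 5), 0)                 # light stabilization
        _, binary = cv2.threshold(smoothed, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        edges = cv2.Canny(binary, 50, 150)                          # crack outlines
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)

        total_area = sum(cv2.contourArea(c) for c in contours)
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            print({"area": cv2.contourArea(c), "perimeter": cv2.arcLength(c, True),
                   "length": max(w, h), "width": min(w, h)})
        # Approximate coverage fraction of the imaged surface.
        print("crack coverage fraction:", total_area / (img.shape[0] * img.shape[1]))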

  15. Analysis of reaction cross-section production in neutron induced fission reactions on uranium isotope using computer code COMPLET.

    Science.gov (United States)

    Asres, Yihunie Hibstie; Mathuthu, Manny; Birhane, Marelgn Derso

    2018-04-22

    This study provides current evidence about cross-section production processes in the theoretical and experimental results of neutron-induced reactions on uranium isotopes in the projectile energy range of 1-100 MeV, in order to improve the reliability of nuclear simulation. In fission reactions of 235U within nuclear reactors, a large amount of energy is released, enough to satisfy worldwide energy needs without polluting processes, as compared to other sources. The main objective of this work is to convey the related knowledge of neutron-induced fission reactions on 235U by describing, analyzing, and interpreting the theoretical cross sections obtained from the computer code COMPLET and comparing them with the experimental data obtained from EXFOR. The cross-section values of 235U(n,2n)234U, 235U(n,3n)233U, 235U(n,γ)236U, and 235U(n,f) were obtained using the computer code COMPLET, and the corresponding experimental values were retrieved from EXFOR, IAEA. The theoretical results are compared with the experimental data taken from the EXFOR Data Bank. The computer code COMPLET was used for the analysis with the same set of input parameters, and the graphs were plotted with the help of spreadsheet and Origin-8 software. The quantification of uncertainties stemming from both the experimental data and the computer code calculation plays a significant role in the final evaluated results. The calculated total cross sections were compared with the experimental data taken from EXFOR in the literature, and good agreement was found between the experimental and theoretical data. This comparison of the calculated data is analyzed and interpreted with tabular and graphical descriptions, and the results are briefly discussed within the text of this research work. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Internet resources for dentistry: computer, Internet, reference, and sites for enhancing personal productivity of the dental professional.

    Science.gov (United States)

    Guest, G F

    2000-08-15

    At the onset of the new millennium, the Internet has become the new standard means of distributing information. In the last two to three years, there has been an explosion of e-commerce, with hundreds of new web sites being created every minute. For most corporate entities, a web site is as essential as the phone book listing used to be. Twenty years ago, technologists directed how computer-based systems were utilized. Now it is the end users of personal computers who have gained expertise and drive the functionality of software applications. The computer, initially invented for mathematical functions, has transitioned from this role to an integrated communications device that provides the portal to the digital world. The Web needs to be used by healthcare professionals, not only for professional activities, but also for instant access to information and services "just when they need it." This will facilitate the longitudinal use of information as society continues to gain better information access skills. With the demand for current "just in time" information and the standards established by Internet protocols, reference sources of information may be maintained in dynamic fashion. News services have been available through the Internet for several years, but now reference materials such as online journals and digital textbooks have become available and have the potential to change the traditional publishing industry. The pace of change should make us consider Will Rogers' advice: "It isn't good enough to be moving in the right direction. If you are not moving fast enough, you can still get run over!" The intent of this article is to complement previous articles on Internet resources published in this journal by presenting information about web sites on computer and Internet technologies, reference materials, news information, and sites that improve personal productivity. Neither the author, nor the Journal endorses any of the

  17. Grammar-Based Specification and Parsing of Binary File Formats

    Directory of Open Access Journals (Sweden)

    William Underwood

    2012-03-01

    Full Text Available The capability to validate and view or play binary file formats, as well as to convert binary file formats to standard or current file formats, is critically important to the preservation of digital data and records. This paper describes the extension of context-free grammars from strings to binary files. Binary files are arrays of data types, such as long and short integers, floating-point numbers and pointers, as well as characters. The concept of an attribute grammar is extended to these context-free array grammars. This attribute grammar has been used to define a number of chunk-based and directory-based binary file formats. A parser generator has been used with some of these grammars to generate syntax checkers (recognizers for validating binary file formats. Among the potential benefits of an attribute grammar-based approach to specification and parsing of binary file formats is that attribute grammars not only support format validation, but support generation of error messages during validation of format, validation of semantic constraints, attribute value extraction (characterization, generation of viewers or players for file formats, and conversion to current or standard file formats. The significance of these results is that with these extensions to core computer science concepts, traditional parser/compiler technologies can potentially be used as a part of a general, cost effective curation strategy for binary file formats.
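
    Many of the chunk-based formats such a grammar describes share the layout identifier-length-payload. Below is a minimal hand-written recognizer for a RIFF-style chunk stream, illustrating the kind of structure the generated parsers validate; the input file name is hypothetical, and a grammar-generated parser would additionally check semantic constraints.

        # Walk a chunk-based binary file: each chunk is a 4-byte ASCII identifier,
        # a 4-byte little-endian payload length, then the payload (RIFF-style layout).
        import struct

        def walk_chunks(path):
            with open(path, "rb") as f:
                while True:
                    header = f.read(8)
                    if len(header) < 8:
                        break                       # end of file
                    chunk_id, size = struct.unpack("<4sI", header)
                    payload = f.read(size)          # a sketch: reads payload into memory
                    if size % 2:                    # RIFF pads odd-sized chunks
                        f.read(1)
                    yield chunk_id.decode("ascii", "replace"), len(payload)

        for chunk_id, size in walk_chunks("sample.wav"):
            print(chunk_id, size)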

  18. 29 CFR 4000.28 - What if I send a computer disk?

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false What if I send a computer disk? 4000.28 Section 4000.28... I send a computer disk? (a) In general. We determine your filing or issuance date for a computer... paragraph (b) of this section. (1) Filings. For computer-disk filings, we may treat your submission as...

  19. Integrating Molecular Computation and Material Production in an Artificial Subcellular Matrix

    DEFF Research Database (Denmark)

    Fellermann, Harold; Hadorn, Maik; Bönzli, Eva

    Living systems are unique in that they integrate molecular recognition and information processing with material production on the molecular scale. The predominant locus of this integration is the cellular matrix, where a multitude of biochemical reactions proceed simultaneously in highly compartmentalized reaction compartments that interact and get delivered through vesicle trafficking. The European Commission funded project MatchIT (Matrix for Chemical IT) aims at creating an artificial cellular matrix that seamlessly integrates information processing and material production in much the same...

  20. Computer study of isotope production for medical and industrial applications in high power accelerators

    Science.gov (United States)

    Mashnik, S. G.; Wilson, W. B.; Van Riper, K. A.

    2001-07-01

    Methods for radionuclide production calculation in a high power proton accelerator have been developed and applied to study production of 22 isotopes. These methods are readily applicable both to accelerator and reactor environments and to the production of other radioactive and stable isotopes. We have also developed methods for evaluating cross sections from a wide variety of sources into a single cross section set and have produced an evaluated library covering about a third of all natural elements that may be expanded to other reactions. A 684 page detailed report on this study, with 37 tables and 264 color figures, is available on the Web at http://t2.lanl.gov/publications/.

  1. A computer study of radionuclide production in high power accelerators for medical and industrial applications

    Science.gov (United States)

    Van Riper, K. A.; Mashnik, S. G.; Wilson, W. B.

    2001-05-01

    Methods for radionuclide production calculation in a high power proton accelerator have been developed and applied to study the production of 22 isotopes by high-energy protons and neutrons. These methods are readily applicable to accelerator and reactor environments other than the particular model we considered, and to the production of other radioactive and stable isotopes. We have also developed methods for evaluating cross sections from a wide variety of sources into a single cross-section set and have produced an evaluated library covering about a third of all natural elements. These methods are also applicable to an expanded set of reactions. A 684-page detailed report on this study, with 37 tables and 264 color figures, is available on the Web at http://t2.lanl.gov/publications/publications.html, or, if not accessible, in hard copy from the authors.

  2. A data compression algorithm for nuclear spectrum files

    International Nuclear Information System (INIS)

    Mika, J.F.; Martin, L.J.; Johnston, P.N.

    1990-01-01

    The total space occupied by computer files of spectra generated in nuclear spectroscopy systems can lead to problems of storage space and transmission time. An algorithm is presented which significantly reduces the space required to store nuclear spectra, without loss of any information content. Testing indicates that spectrum files can routinely be compressed by a factor of 5. (orig.)
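
    Spectra are long arrays of counts whose neighboring channels often differ little, which is why they compress well losslessly. The sketch below shows one common scheme of this kind, delta encoding followed by a zigzag varint code; it is illustrative and not the paper's algorithm.

        # Lossless spectrum compression sketch: delta-encode channel counts, then
        # store each delta as a zigzag varint. Illustrative scheme only.
        def zigzag(n):                       # map signed delta -> unsigned int
            return 2 * n if n >= 0 else -2 * n - 1

        def encode_varint(n, out):           # 7 bits per byte, high bit = "more"
            while True:
                byte = n & 0x7F
                n >>= 7
                if n:
                    out.append(byte | 0x80)
                else:
                    out.append(byte)
                    return

        def compress_spectrum(counts):
            out = bytearray()
            prev = 0
            for c in counts:
                encode_varint(zigzag(c - prev), out)
                prev = c
            return bytes(out)

        spectrum = [100, 102, 101, 250, 248, 0, 0, 0]   # toy channel counts
        blob = compress_spectrum(spectrum)
        print(len(blob), "bytes for", len(spectrum), "channels")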

  3. A Computational Method for Determining the Equilibrium Composition and Product Temperature in a LH2/LOX Combustor

    Science.gov (United States)

    Sozen, Mehmet

    2003-01-01

    In what follows, the model used for combustion of liquid hydrogen (LH2) with liquid oxygen (LOX) under the chemical equilibrium assumption is described, together with the novel computational method developed for determining the equilibrium composition and temperature of the combustion products by application of the first and second laws of thermodynamics. The modular FORTRAN code, developed as a subroutine that can be incorporated into any flow network code with little effort, has been successfully implemented in GFSSP, as preliminary runs indicate. The code provides the capability of modeling the heat transfer rate to the coolants for parametric analysis in system design.

  4. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    Science.gov (United States)

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the current gap

  5. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    International Nuclear Information System (INIS)

    Schreiner, Steffen; Banerjee, Subho Sankar; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Bagnasco, Stefano; Zhu Jianlin

    2011-01-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part of, and beyond, the development of AliEn version 2.19.
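
    As a hedged illustration of the signed-status-message idea (the actual AliEn protocol is built around access envelopes and differs in detail), a storage element can authenticate a file's size and checksum to the catalogue roughly as follows; the key handling and field names are invented.

      # Illustrative signed status message from a storage element.
      import hashlib, hmac, json

      SHARED_KEY = b"se-catalogue-shared-secret"    # placeholder key

      def sign_status(lfn, size, md5):
          msg = {"lfn": lfn, "size": size, "md5": md5}
          payload = json.dumps(msg, sort_keys=True).encode()
          msg["sig"] = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
          return msg

      def verify_status(msg):
          msg = dict(msg)                           # do not mutate the caller's copy
          sig = msg.pop("sig")
          payload = json.dumps(msg, sort_keys=True).encode()
          good = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
          return hmac.compare_digest(sig, good)

      status = sign_status("/alice/sim/run123/file.root", 104857600, "9e107d9d...")
      assert verify_status(status)

    Because the catalogue verifies the storage element's signature instead of the client's claim, a client can no longer register a forged size or checksum - the sense in which the revised protocol is "no longer dependent on client trust".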

  6. Seeing red? : The agency of computer software in the production and management of students’ school absences

    OpenAIRE

    Bodén, Linnea

    2013-01-01

    An increasing number of Swedish municipalities use digital software to manage the registration of students’ school absences. The software is regarded as a problem-solving tool to make registration more efficient, but its effects on the educational setting have been largely neglected. Focusing on an event with two students from a class of 11-year-olds, the aim of the paper is to explore schools’ common uses of computer software for registering absence in order to understand how materialities –...

  7. Application of computer virtual simulation technology in 3D animation production

    Science.gov (United States)

    Mo, Can

    2017-11-01

    With the continuous development of computer technology, the application of virtual simulation technology has been further optimized and improved. It is now widely used in many fields of social development, such as city construction, interior design, industrial simulation, and tourism teaching. This paper mainly introduces the virtual simulation technology used in 3D animation. Based on an analysis of the characteristics of virtual simulation technology, the ways and means of applying this technology in 3D animation are examined. The purpose is to provide a reference for the future promotion of 3D effects.

  8. Automating the segmentation of medical images for the production of voxel tomographic computational models

    International Nuclear Information System (INIS)

    Caon, M.

    2001-01-01

    Radiation dosimetry for the diagnostic medical imaging procedures performed on humans requires anatomically accurate, computational models. These may be constructed from medical images as voxel-based tomographic models. However, they are time-consuming to produce and, as a consequence, few are available. This paper discusses the emergence of semi-automatic segmentation techniques and describes an application (iRAD) written in Microsoft Visual Basic that allows the bitmap of a medical image to be segmented interactively and semi-automatically while displayed in Microsoft Excel. iRAD will decrease the time required to construct voxel models. Copyright (2001) Australasian College of Physical Scientists and Engineers in Medicine

  9. Organizational diagnosis of computer and information learning needs: the process and product.

    Science.gov (United States)

    Nelson, R; Anton, B

    1997-01-01

    Organizational diagnosis views the organization as a single entity with problems and challenges that are unique to the organization as a whole. This paper describes the process of establishing organizational diagnoses related to computer and information learning needs within a clinical or academic health care institution. The assessment of a college within a state-owned university in the U.S.A. is used to demonstrate the process of organizational diagnosis. The diagnoses identified include the need to improve information seeking skills and the information presentation skills of faculty.

  10. Influence of core design, production technique, and material selection on fracture behavior of yttria-stabilized tetragonal zirconia polycrystal fixed dental prostheses produced using different multilayer techniques: split-file, over-pressing, and manually built-up veneers

    Directory of Open Access Journals (Sweden)

    Mahmood DJH

    2016-02-01

    Deyar Jallal Hadi Mahmood, Ewa H Linderoth, Ann Wennerberg, Per Vult Von Steyern (Department of Prosthetic Dentistry, Faculty of Odontology, Malmö University, Malmö, Sweden). Aim: To investigate and compare the fracture strength and fracture mode in eleven groups of the currently most commonly used multilayer three-unit all-ceramic yttria-stabilized tetragonal zirconia polycrystal (Y-TZP) fixed dental prostheses (FDPs) with respect to the choice of core material, veneering material area, manufacturing technique, design of connectors, and radii of curvature of FDP cores. Materials and methods: A total of 110 three-unit Y-TZP FDP cores with one intermediate pontic were made. The FDP cores in groups 1–7 were made with a split-file design, veneered with manually built-up porcelain, computer-aided design-on veneers, and over-pressed veneers. Groups 8–11 consisted of FDPs with a state-of-the-art design, veneered with manually built-up porcelain. All the FDP cores were subjected to simulated aging and finally loaded to fracture. Results: There was a significant difference (P<0.05) between the core designs, but not between the different types of Y-TZP materials. The split-file designs with VITABLOCS® (1,806±165 N) and e.max® ZirPress (1,854±115 N) and the state-of-the-art design with VITA VM® 9 (1,849±150 N) demonstrated the highest mean fracture values. Conclusion: The shape of a split-file designed all-ceramic reconstruction calls for a different dimension protocol, compared to traditionally shaped ones, as the split-file design leads to sharp approximal indentations acting as fracture initiation points, thus decreasing the overall strength. The design of a framework is a crucial factor for the load-bearing capacity of an all-ceramic FDP. The state-of-the-art design is preferable, since the split-file designed cores call for a cross-sectional connector area at least 42% larger to have the same load-bearing capacity as the state-of-the-art designed cores.

  11. [Evaluation of production and clinical working time of computer-aided design/computer-aided manufacturing (CAD/CAM) custom trays for complete denture].

    Science.gov (United States)

    Wei, L; Chen, H; Zhou, Y S; Sun, Y C; Pan, S X

    2017-02-18

    To compare the technician fabrication time and clinical working time of custom trays fabricated using two different methods, three-dimensional printing and conventional manual fabrication, and to prove the feasibility of computer-aided design/computer-aided manufacturing (CAD/CAM) custom trays in clinical use from the perspective of clinical time cost. Twenty edentulous patients were recruited into this study, a prospective, single-blind, randomized self-controlled clinical trial. Two custom trays were fabricated for each participant. One custom tray was fabricated using the functional suitable denture (FSD) system through a CAD/CAM process, and the other was manually fabricated using conventional methods. The final impressions were then taken using both custom trays, and the final impressions were used to fabricate complete dentures respectively. The technician production time of the custom trays and the clinical working time of taking the final impression were recorded. The average times spent on fabricating the three-dimensional printing custom trays using the FSD system and fabricating the conventional custom trays manually were (28.6±2.9) min and (31.1±5.7) min, respectively. The average times spent on making the final impression with the three-dimensional printing custom trays using the FSD system and with the conventional custom trays fabricated manually were (23.4±11.5) min and (25.4±13.0) min, respectively. There was a significant difference in the technician fabrication time and the clinical working time between the three-dimensional printing custom trays using the FSD system and the conventional custom trays fabricated manually (P<0.05). To manufacture custom trays by the three-dimensional printing method, there is no need to pour a preliminary cast after taking the primary impression; therefore, it saves impression material and model material. As for complete denture restoration, manufacturing custom trays using the FSD system is worth promoting.

  12. Distributed Computing for the Pierre Auger Observatory

    International Nuclear Information System (INIS)

    Chudoba, J.

    2015-01-01

    The Pierre Auger Observatory operates the largest system of detectors for ultra-high-energy cosmic ray measurements. Comparison of theoretical models of interactions with recorded data requires thousands of computing cores for Monte Carlo simulations. Since 2007, distributed resources connected via the EGI grid have been successfully used. The first and second versions of the production system, based on bash scripts and a MySQL database, were able to submit jobs to all reliable sites supporting the Virtual Organization auger. For many years VO auger has been among the top ten EGI users by total computing time. Migration of the production system to the DIRAC interware started in 2014. Pilot jobs improve the efficiency of computing jobs and eliminate problems with the small and less reliable sites used for bulk production. The new system can also use available cloud resources. The Dirac File Catalog replaced LFC for new files, which are organized in datasets defined via metadata. CVMFS has been used for software distribution since 2014. In the presentation we compare the old and the new production systems and report on the experience of migrating to the new system. (paper)

  13. Distributed Computing for the Pierre Auger Observatory

    Science.gov (United States)

    Chudoba, J.

    2015-12-01

    The Pierre Auger Observatory operates the largest system of detectors for ultra-high-energy cosmic ray measurements. Comparison of theoretical models of interactions with recorded data requires thousands of computing cores for Monte Carlo simulations. Since 2007, distributed resources connected via the EGI grid have been successfully used. The first and second versions of the production system, based on bash scripts and a MySQL database, were able to submit jobs to all reliable sites supporting the Virtual Organization auger. For many years VO auger has been among the top ten EGI users by total computing time. Migration of the production system to the DIRAC interware started in 2014. Pilot jobs improve the efficiency of computing jobs and eliminate problems with the small and less reliable sites used for bulk production. The new system can also use available cloud resources. The Dirac File Catalog replaced LFC for new files, which are organized in datasets defined via metadata. CVMFS has been used for software distribution since 2014. In the presentation we compare the old and the new production systems and report on the experience of migrating to the new system.

  14. Covariant computation of e+e- production in nucleon-nucleon collisions

    International Nuclear Information System (INIS)

    Haglin, K.; Kapusta, J.; Gale, C.

    1989-01-01

    Electron-positron production differential cross sections in nucleon-nucleon collisions are calculated analytically via meson exchange with a realistic pseudovector coupling including strong interaction form factors. These results are compared with newly obtained data from the DLS at the BEVALAC for protons on beryllium. A comparison with the soft photon approximation is also made. (orig.)

  15. Increased productivity in power plants by the computer-based information system PRAUT

    International Nuclear Information System (INIS)

    Hanbaba, P.

    1978-01-01

    Decreased commissioning times, reduced shut-down periods, avoidance of power reductions, and fast adaptation to load requirement variations all act to increase the productivity of a power plant. An essential contribution to this is provided by harmonized control, monitoring, and communications concepts as realized, e.g., in the PRO-CONTROL system by Brown Boveri. (orig.) [de]

  16. Development of a Computer Vision Technology for the Forest Products Manufacturing Industry

    Science.gov (United States)

    D. Earl Kline; Richard Conners; Philip A. Araman

    1992-01-01

    The goal of this research is to create an automated processing/grading system for hardwood lumber that will be of use to the forest products industry. The objective of creating a full scale machine vision prototype for inspecting hardwood lumber will become a reality in calendar year 1992. Space for the full scale prototype has been created at the Brooks Forest...

  17. Novel Computational Methods that Facilitate Development of Cyanofactories for Free Fatty Acid Production

    KAUST Repository

    Motwalli, Olaa Amin

    2017-05-28

    Finding a source from which high-energy-density biofuels can be derived at an industrial scale has become an urgent challenge for renewable energy production. Some microorganisms can produce free fatty acids (FFA) as precursors of such high-energy-density biofuels. In particular, photosynthetic cyanobacteria are capable of directly converting carbon dioxide into FFA. However, current engineered strains need several rounds of engineering to reach a level of FFA production that is commercially viable. Thus, new chassis strains that require less engineering are needed. Although more than 140 cyanobacterial genomes have been sequenced, the natural potential of these strains for FFA production and excretion has not been systematically estimated. In relation to the above-mentioned problems, we developed the first in silico screening method (FFASC) that evaluates a cyanobacterial strain's potential for FFA production based on the strain's proteome, which for the first time allows non-experimental selection of the most promising chassis for cyanofactories. The solution is based on the original problem formulation, optimization, and ranking. To provide developers and researchers with easy means of evaluating and assessing the potential of cyanobacterial strains for FFA production, we developed the BioPS platform. In addition to comparing the FFA production capacity of any novel strain against 140 pre-evaluated strains, BioPS can be used to explore the characteristics and assessment rules in play for an individual strain. This is the first tool of this type developed. Finally, we developed a novel generic in silico method (PathDES) for ranking and selecting the most suitable pathways/sets of metabolic reactions, which suggests genetic modifications for improved metabolic productivity. The method relies heavily on optimization and integration of disparate information in a novel manner. It has been successfully used in connection with FFASC for the design of cyanofactories.

  18. Radcalc: A computer program to calculate the radiolytic production of hydrogen gas from radioactive wastes in packages

    International Nuclear Information System (INIS)

    Green, J.R.; Schwarz, R.A.; Hillesland, K.E.; Roetman, V.E.; Field, J.G.

    1995-11-01

    Radcalc for Windows is a menu-driven, Microsoft Windows-compatible computer code that calculates the radiolytic production of hydrogen gas in high- and low-level radioactive waste. In addition, the code determines US Department of Transportation (DOT) transportation classifications, calculates the activities of parent and daughter isotopes for a specified period of time, calculates decay heat, and calculates pressure buildup from the production of hydrogen gas in a given package geometry. Radcalc for Windows was developed by Packaging Engineering, Transportation and Packaging, Westinghouse Hanford Company, Richland, Washington, for the US Department of Energy (DOE). It is available from Packaging Engineering and is issued with a user's manual and a technical manual. The code has been verified and validated.
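
    The core of a radiolytic hydrogen estimate of the kind Radcalc automates is a G-value calculation: the number of H2 molecules produced per 100 eV of decay energy absorbed in hydrogenous material, followed by an ideal-gas pressure buildup. A back-of-envelope version, with every input a placeholder, looks like this:

      # Rough radiolytic H2 generation and pressure buildup estimate.
      AVOGADRO = 6.022e23      # molecules/mol
      EV_PER_J = 6.242e18      # eV per joule
      R_GAS = 8.314            # J/(mol K)

      g_h2 = 0.45              # molecules H2 per 100 eV absorbed (assumed G-value)
      decay_heat_w = 2.0       # package decay heat, W (placeholder)
      f_absorbed = 0.10        # fraction absorbed in hydrogenous waste (placeholder)
      void_m3, temp_k = 0.2, 300.0

      ev_per_s = decay_heat_w * f_absorbed * EV_PER_J
      h2_per_s = g_h2 * ev_per_s / 100.0                    # molecules/s
      mol_per_yr = h2_per_s * 3600 * 24 * 365 / AVOGADRO
      dp_pa_per_yr = mol_per_yr * R_GAS * temp_k / void_m3  # ideal-gas pressure rise

      print(f"{mol_per_yr:.2f} mol H2/yr, +{dp_pa_per_yr / 1000:.1f} kPa/yr")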

  19. Titanium-II: an evaluated nuclear data file

    International Nuclear Information System (INIS)

    Philis, C.; Howerton, R.; Smith, A.B.

    1977-06-01

    A comprehensive evaluated nuclear data file for elemental titanium is outlined including definition of the data base, the evaluation procedures and judgments, and the final evaluated results. The file describes all significant neutron-induced reactions with elemental titanium and the associated photon-production processes to incident neutron energies of 20.0 MeV. In addition, isotopic-reaction files, consistent with the elemental file, are separately defined for those processes which are important to applied considerations of material-damage and neutron-dosimetry. The file is formulated in the ENDF format. This report formally documents the evaluation and, together with the numerical file, is submitted for consideration as a part of the ENDF/B-V evaluated file system. 20 figures, 9 tables

  20. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    Science.gov (United States)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.

  1. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

    Most file system software is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. The "pcircle" software builds on ubiquitous MPI in cluster computing environments and the "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, and integrity checking.
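
    A minimal sketch of the parallel-checksumming idea (mpi4py with a static round-robin split, rather than pcircle's dynamic work stealing) could look like the following; it would be launched as, e.g., mpiexec -n 8 python checksum.py /path/*.dat.

      # Parallel file checksumming across MPI ranks. Simplified: real
      # pcircle balances load dynamically via work stealing.
      import hashlib, sys
      from mpi4py import MPI

      def sha1_of(path, bufsize=1 << 20):
          h = hashlib.sha1()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(bufsize), b""):
                  h.update(chunk)
          return h.hexdigest()

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      files = sys.argv[1:]
      mine = files[rank::size]                  # static round-robin assignment
      results = [(path, sha1_of(path)) for path in mine]

      gathered = comm.gather(results, root=0)   # collect all results on rank 0
      if rank == 0:
          for part in gathered:
              for path, digest in part:
                  print(digest, path)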

  2. MR-AFS: a global hierarchical file-system

    International Nuclear Information System (INIS)

    Reuter, H.

    2000-01-01

    The next generation of fusion experiments will use object-oriented technology, creating the need for worldwide sharing of an underlying hierarchical file-system. The Andrew File System (AFS) is a well-known and widely deployed global distributed file-system. Multiple-Resident-AFS (MR-AFS) combines the features of AFS with hierarchical storage management systems. Files in MR-AFS may therefore be migrated to secondary storage, such as robotic tape libraries. MR-AFS is in use at IPP for the current experiments and for data originating from supercomputer applications. Experiences and scalability issues are discussed.

  3. Network survivability performance (computer diskette)

    Science.gov (United States)

    1993-11-01

    File characteristics: Data file; 1 file. Physical description: 1 computer diskette; 3 1/2 in.; high density; 2.0MB. System requirements: Mac; Word. This technical report has been developed to address the survivability of telecommunications networks including services. It responds to the need for a common understanding of, and assessment techniques for network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunication networks to user expectations for network survivability.

  4. Computer experiment studies on mechanisms for irradiation induced defect production and annealing processes. Final report

    International Nuclear Information System (INIS)

    Beeler, J.R. Jr.; Beeler, M.F.

    1979-06-01

    This research is based on pair potentials used in the Brookhaven work. It extends their use in defect production simulations to the 5 MeV range and characterizes the short term annealing of the primary defect states. Defect properties and interactions are studied. Defect interactions include carbon, helium, and misfit metallic substitutional impurity interactions with vacancy and interstitial defects as well as vacancy-vacancy, interstitial-interstitial and vacancy-interstitial interactions

  5. Computer experiment studies on mechanisms for irradiation induced defect production and annealing processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Beeler, J.R. Jr.; Beeler, M.F.

    1979-06-01

    This research is based on pair potentials used in the Brookhaven work. It extends their use in defect production simulations to the 5 MeV range and characterizes the short term annealing of the primary defect states. Defect properties and interactions are studied. Defect interactions include carbon, helium, and misfit metallic substitutional impurity interactions with vacancy and interstitial defects as well as vacancy-vacancy, interstitial-interstitial and vacancy-interstitial interactions. (FS)

  6. Computational Fluid Dynamics Modeling to Improve Natural Flow Rate and Sweet Pepper Productivity in Greenhouse

    OpenAIRE

    W. Limtrakarn; P. Boonmongkol; A. Chompupoung; K. Rungprateepthaworn; J. Kruenate; P. Dechaumphai

    2012-01-01

    Improving the natural flow rate and sweet pepper productivity in a tropical greenhouse by CFD simulation is the main objective of this research work. Most greenhouses today are of the arch shape. To develop an improved greenhouse structure for the region, the arch type was built and used as the control model. The Mae Sar Mai agriculture research station under the royal project foundation was selected as the field test site. Temperature sensors with a data logger were installed to monitor ...

  7. Collaborative Computer Graphics Product Development between Academia and Government: A Dynamic Model

    Science.gov (United States)

    Fowler, Deborah R.; Kostis, Helen-Nicole

    2016-01-01

    Collaborations and partnerships between academia and government agencies are common, especially when it comes to research and development in the fields of science, engineering, and technology. However, collaboration between a government agency and an art school is rather atypical. This paper presents the Collaborative Student Project, which aims to explore the following challenge: the ideation, development, and realization of education and public outreach products for NASA's upcoming ICESat-2 mission in collaboration with art students.

  8. A new way to compute charges fusion products trajectories. Application to the detection of 3 MeV protons

    International Nuclear Information System (INIS)

    Doloc, C.M.; Martin, G.

    1995-01-01

    We report here recent results concerning the 3 MeV fusion proton trajectories in the Tore-Supra tokamak. The orbit computations were made in a new and unusual manner based on a topological equation which governs these trajectories. This method avoids both the problem of computational precision and the need to follow a large number of particles along their orbits: it allows one to draw a topological map of trajectories, i.e. to find all possible trajectory classes, without any numerical computation. It also gives the transitions occurring between the various classes. The confinement of the proton orbits and the optimisation of the detector location were studied under the same topological rules. The need to develop this subject comes from the necessity to explain a large quantity of experimental data recorded by a silicon detector system on Tore-Supra. Experimental analysis of the Charged Fusion Products (CFPs) is ensured by this unique detection system, which allows energy and pitch-angle resolution to be obtained simultaneously. (authors). 9 refs., 11 figs

  9. Improving Global Gross Primary Productivity Estimates by Computing Optimum Light Use Efficiencies Using Flux Tower Data

    Science.gov (United States)

    Madani, Nima; Kimball, John S.; Running, Steven W.

    2017-11-01

    In the light use efficiency (LUE) approach of estimating the gross primary productivity (GPP), plant productivity is linearly related to absorbed photosynthetically active radiation assuming that plants absorb and convert solar energy into biomass within a maximum LUE (LUEmax) rate, which is assumed to vary conservatively within a given biome type. However, it has been shown that photosynthetic efficiency can vary within biomes. In this study, we used 149 global CO2 flux towers to derive the optimum LUE (LUEopt) under prevailing climate conditions for each tower location, stratified according to model training and test sites. Unlike LUEmax, LUEopt varies according to heterogeneous landscape characteristics and species traits. The LUEopt data showed large spatial variability within and between biome types, so that a simple biome classification explained only 29% of LUEopt variability over 95 global tower training sites. The use of explanatory variables in a mixed effect regression model explained 62.2% of the spatial variability in tower LUEopt data. The resulting regression model was used for global extrapolation of the LUEopt data and GPP estimation. The GPP estimated using the new LUEopt map showed significant improvement relative to global tower data, including a 15% R2 increase and 34% root-mean-square error reduction relative to baseline GPP calculations derived from biome-specific LUEmax constants. The new global LUEopt map is expected to improve the performance of LUE-based GPP algorithms for better assessment and monitoring of global terrestrial productivity and carbon dynamics.
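
    The LUE model at the heart of the study is compact enough to state directly: GPP = LUE_opt × fPAR × PAR, usually down-regulated by temperature and vapour-pressure-deficit scalars. A sketch with placeholder inputs (the paper's mixed-effects fit of LUE_opt per site is not reproduced here):

      # Minimal LUE-model GPP calculation; all input values are placeholders.
      def gpp(lue_opt, fpar, par, t_scalar=1.0, vpd_scalar=1.0):
          """GPP in g C m-2 d-1.

          lue_opt -- optimum light use efficiency, g C per MJ APAR
          fpar    -- fraction of PAR absorbed by the canopy (0-1)
          par     -- incident PAR, MJ m-2 d-1
          t_scalar, vpd_scalar -- 0-1 down-regulation for cold / dry air
          """
          apar = fpar * par                     # absorbed PAR
          return lue_opt * apar * t_scalar * vpd_scalar

      print(gpp(lue_opt=1.8, fpar=0.6, par=10.0, t_scalar=0.9, vpd_scalar=0.8))

    Replacing a single biome-wide LUEmax constant with a spatially varying LUEopt surface leaves this multiplication unchanged; only the efficiency term becomes a map, which is where the reported accuracy gain comes from.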

  10. Personal computers pollute indoor air: effects on perceived air quality, SBS symptoms and productivity in offices

    DEFF Research Database (Denmark)

    Bako-Biro, Zsolt; Wargocki, Pawel; Weschler, Charles J.

    2002-01-01

    Perceived air quality and Sick Building Syndrome (SBS) symptoms were studied in a low-polluting office space ventilated at an air change rate of 2 h-1 (10 L/s per person with 6 people present) with and without personal computers (PCs). Other environmental parameters were kept constant. Thirty female subjects were exposed for 4.8 h to each of the two conditions in the office and performed simulated office work. They remained thermally neutral by adjusting their clothing and were blind to the interventions. In the absence of PCs in the office the perceived air quality improved, odour intensity was reduced and air freshness increased; all effects were significant. In the presence of PCs the performance of text typing significantly decreased. The sensory pollution load of the PCs was found to be 3 olf per PC, i.e. three times the load of the occupants. Present results indicate negative effects of PCs...

  11. Production of custom beam profiles in computer-controlled radiation therapy

    International Nuclear Information System (INIS)

    Lane, R.G.; Loyd, M.D.; Chow, C.H.; Ekwelundu, E.; Rosen, I.I.

    1989-01-01

    This paper presents a study to produce custom beam profiles in patients that compensate for variations in patient anatomy and achieve uniform dose distributions in the treatment volume. A conventional treatment field is supplemented by a number of centered and/or offset smaller coincident fields of various sizes. The sizes and positions of these supplemental fields and the doses delivered by them are designed to compensate for variations in external patient contour, internal heterogeneities, and variation in tumor volume shape. A computer-controlled linear accelerator with four independent collimator jaws, a built-in motorized wedge filter, an automatic setup capability, and a patient prescription database is used to deliver these complex treatments automatically by means of multiple overlapping beams. Calculations, measurements, and dose distributions demonstrate the efficacy of this technique

  12. Effects of two eye drop products on computer users with subjective ocular discomfort.

    Science.gov (United States)

    Skilling, Francis C; Weaver, Tony A; Kato, Kenneth P; Ford, Jerry G; Dussia, Elyse M

    2005-01-01

    An increasing number of people seek medical attention for symptoms of visual discomfort due to computer vision syndrome (CVS). We compared the efficacy and adverse event rates of a new eye lubricant, OptiZen (InnoZen, Inc., polysorbate 80 0.5%) and Visine Original (Pfizer Consumer Healthcare, tetrahydrozoline HCl 0.05%). In this double-blind parallel arm trial, 50 healthy men and women, ages 18 to 65 years, with symptoms of CVS who use a video display terminal for a minimum of 4 hours per day were randomized to OptiZen (n = 25) or Visine Original (n= 25), 1 to 2 drops b.i.d. for 5 days. The primary end-points were ocular discomfort and adverse events. OptiZen and Visine Original had similar efficacy in alleviating symptoms of ocular discomfort (odds ratio of 1.23 [95% confidence interval, 0.63 to 2.42], P= 0.55). OptiZen and Visine Original were very similar with respect to odds ratios and 95% confidence interval (CI) for each of the measurement times (P= 0.72). Visine Original users reported a significantly higher incidence of temporary ocular stinging/burning immediately after drug instillation (28%, 7/25) than did OptiZen users (4%, 1/24) (P= 0.05). Patients using OptiZen were 89% less likely to have stinging/burning effects than those patients using Visine Original (95% CI: 0.01 to 0.95). OptiZen and Visine Original are effective at alleviating ocular discomfort associated with prolonged computer use. Adverse event findings suggest that OptiZen causes less ocular discomfort on instillation, potentially attributable to its milder ingredient profile.

  13. Investigation of the Feasibility of Utilizing Gamma Emission Computed Tomography in Evaluating Fission Product Migration in Irradiated TRISO Fuel Experiments

    International Nuclear Information System (INIS)

    Harp, Jason M.; Demkowicz, Paul A.

    2014-01-01

    In the High Temperature Gas-Cooled Reactor (HTGR), the TRISO particle fuel serves as the primary fission product containment. However, the large number of TRISO particles present in proposed HTGRs dictates that there will be a small fraction (~10^-4 to 10^-5) of as-manufactured defects and in-pile particle failures that will lead to some fission product release. The matrix material surrounding the TRISO particles in fuel compacts and the structural graphite holding the TRISO particles in place can also serve as sinks for containing any released fission products. However, data on the migration of solid fission products through these materials are lacking. One of the primary goals of the AGR-3/4 experiment is to study fission product migration from intentionally failed TRISO particles in prototypic HTGR components such as structural graphite and compact matrix material. In this work, the potential for a Gamma Emission Computed Tomography (GECT) technique to non-destructively examine the fission product distribution in AGR-3/4 components and other irradiation experiments is explored. Specifically, the feasibility of using the Idaho National Laboratory (INL) Hot Fuels Examination Facility (HFEF) Precision Gamma Scanner (PGS) system for this GECT application was considered. Previous experience utilizing similar techniques, the expected activities in AGR-3/4 rings, and the analysis in this work indicate that using GECT to evaluate AGR-3/4 will be feasible. The GECT technique was also applied to other irradiated nuclear fuel systems currently available in the HFEF hot cell, including oxide fuel pins, metallic fuel pins, and monolithic plate fuel. Results indicate GECT with the HFEF PGS is effective. (author)

  14. Characteristics of file sharing and peer to peer networking | Opara ...

    African Journals Online (AJOL)

    Characteristics of file sharing and peer to peer networking. ... distributing or providing access to digitally stored information, such as computer programs, ... including in multicast systems, anonymous communications systems, and web caches.

  15. Computational Benchmark for Estimation of Reactivity Margin from Fission Products and Minor Actinides in PWR Burnup Credit

    International Nuclear Information System (INIS)

    Wagner, J.C.

    2001-01-01

    This report proposes and documents a computational benchmark problem for the estimation of the additional reactivity margin available in spent nuclear fuel (SNF) from fission products and minor actinides in a burnup-credit storage/transport environment, relative to SNF compositions containing only the major actinides. The benchmark problem/configuration is a generic burnup credit cask designed to hold 32 pressurized water reactor (PWR) assemblies. The purpose of this computational benchmark is to provide a reference configuration for the estimation of the additional reactivity margin, which is encouraged in the U.S. Nuclear Regulatory Commission (NRC) guidance for partial burnup credit (ISG8), and document reference estimations of the additional reactivity margin as a function of initial enrichment, burnup, and cooling time. Consequently, the geometry and material specifications are provided in sufficient detail to enable independent evaluations. Estimates of additional reactivity margin for this reference configuration may be compared to those of similar burnup-credit casks to provide an indication of the validity of design-specific estimates of fission-product margin. The reference solutions were generated with the SAS2H-depletion and CSAS25-criticality sequences of the SCALE 4.4a package. Although the SAS2H and CSAS25 sequences have been extensively validated elsewhere, the reference solutions are not directly or indirectly based on experimental results. Consequently, this computational benchmark cannot be used to satisfy the ANS 8.1 requirements for validation of calculational methods and is not intended to be used to establish biases for burnup credit analyses

  16. BIPAL - a data library for computing the burnup of fissionable isotopes and products of their decay

    International Nuclear Information System (INIS)

    Kralovcova, E.; Hep, J.; Valenta, V.

    1978-01-01

    The BIPAL databank contains data on 100 heavy metal isotopes, starting with Tl-206 and finishing with Es-253. Four are stable; the others are unstable. The following data are currently stored in the databank: the serial number and name of each isotope, decay modes and, for stable isotopes, the isotopic abundance (%), numbers of P decays and Q captures, numbers of the corresponding final products, branching ratios, half-lives and their units, decay constants, thermal neutron capture and fission cross sections, and other data (mainly alpha, beta and gamma intensities). The description of the data and a printout of the BIPAL library are presented. (J.B.)
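
    Half-lives and decay constants of the kind BIPAL tabulates plug directly into decay arithmetic, for instance λ = ln 2 / T½ and the two-member Bateman solution for a daughter nuclide; the nuclide values below are placeholders.

      # Decay constant and two-member Bateman equation.
      import math

      def decay_constant(half_life_s):
          return math.log(2.0) / half_life_s

      def bateman_daughter(n1_0, lam1, lam2, t):
          """Daughter atoms at time t, starting from a pure parent sample."""
          return n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t)
                                                - math.exp(-lam2 * t))

      lam_p = decay_constant(5.0 * 86400)   # parent half-life 5 d (placeholder)
      lam_d = decay_constant(8.0 * 3600)    # daughter half-life 8 h (placeholder)
      print(f"{bateman_daughter(1e20, lam_p, lam_d, t=86400):.3e} atoms")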

  17. Computational study of hydrogen shifts and ring-opening mechanisms in α-pinene ozonolysis products

    DEFF Research Database (Denmark)

    Kurtén, Theo; Rissanen, Matti P.; Mackeprang, Kasper

    2015-01-01

    The (sterically unhindered) H-shifts of all four peroxy radicals formed in the ozonolysis of α-pinene are calculated using density functional (ωB97XD) and coupled cluster [CCSD(T)-F12] theory. In contrast to the related but chemically simpler cyclohexene ozonolysis system, none of the calculated H-shifts have rate constants fast enough to account for the observed highly oxidized products in the α-pinene ozonolysis system; additional ring-opening reaction mechanisms breaking the cyclobutyl ring are therefore needed. We further investigate possible uni- and bimolecular pathways for opening the cyclobutyl ring in the α-pinene ozonolysis system.

  18. LASIP-III, a generalized processor for standard interface files

    International Nuclear Information System (INIS)

    Bosler, G.E.; O'Dell, R.D.; Resnik, W.M.

    1976-03-01

    The LASIP-III code was developed for processing Version III standard interface data files, which have been specified by the Committee on Computer Code Coordination. This processor performs two distinct tasks: transforming free-field-format BCD data into well-defined binary files, and providing for printing and punching the data in the binary files. While LASIP-III is exported as a complete free-standing code package, techniques are described for easily separating the processor into two modules, viz., one for creating the binary files and one for printing the files. The two modules can be separated into free-standing codes or they can be incorporated into other codes. The LASIP-III code can also be easily expanded to process additional files, and procedures are described for such an expansion. 2 figures, 8 tables
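
    The two tasks are easy to picture with a toy record layout (invented here: an integer count followed by float64 values); the real Version III interface files define their own record structures.

      # Free-field text -> well-defined binary record, and back.
      import struct

      def text_to_binary(line):
          values = [float(tok) for tok in line.split()]   # free-field: any spacing
          return struct.pack(f"<i{len(values)}d", len(values), *values)

      def binary_to_text(blob):
          (count,) = struct.unpack_from("<i", blob, 0)
          values = struct.unpack_from(f"<{count}d", blob, 4)
          return " ".join(f"{v:g}" for v in values)

      record = text_to_binary("1.0   2.5 -3e4")
      assert binary_to_text(record) == "1 2.5 -30000"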

  19. Vacancy production in molybdenum by low energy light ion bombardment: computer simulation

    International Nuclear Information System (INIS)

    Hou, M.; Veen, A. van; Caspers, L.M.; Ypma, M.R.

    1983-01-01

    A comparison is made of the room temperature vacancy production measured with THDS (thermal helium desorption spectrometry) and the Frenkel pair production calculated in the binary collision approximation with MARLOWE for 0.5 to 3 keV He+ ions and 1.5 keV protons injected into a Mo(110) crystal. Using the distributions of Frenkel pair separation distances calculated with MARLOWE for various values of the displacement threshold E_d, the experimental data are matched by selecting a cut-off radius R_c so that Frenkel pairs with separations larger than R_c survive recombination. It became apparent that all experimental data could be reasonably described by the parameter pair E_d = 33 eV and R_c = 3.7 a_0 (where a_0 is the lattice cell edge). The value of E_d found is close to the experimentally determined threshold energy for permanent displacements in Mo. A detailed analysis of the recombination process using the MARLOWE results shows that the found cut-off radius corresponds to an effective recombination radius R_0 = 2.8 a_0. In the literature, lower (theoretical) values of R_0 = 1.4 - 2.1 a_0 are quoted for correlated recombination of single Frenkel pairs in molybdenum. (orig.)

  20. Innovations in the production of ceramic luminous environments: where craftsman meets computer

    Directory of Open Access Journals (Sweden)

    R. Urbano Gutiérrez

    2016-12-01

    Ceramics offer exceptional properties as an energy-efficient building material, but have rarely been investigated alongside active environmental performance. Responding to light-control criteria, we work with advanced digital modelling, fabrication, and performance simulation tools to craft experimental full-scale ceramic prototypes of architectural daylighting components. Our research has three main goals: to investigate alternative daylighting technology solutions made of a low-impact material such as clay; to explore design methodologies that look into how current architectural ceramics manufacturing can be enhanced by emergent design and fabrication technologies; and to engage with the materiality of the clay through collaborative working with recognised artists and ceramicists. A critical aspect of our research is to test the compatibility and interoperability of different software and design techniques, as phases of the production process (optimisation of form finding in real time). This paper presents the development, construction, and analytical data of three of the experimental production methods developed during the first three years of this project.

  1. Computational Fluid Dynamics Modeling to Improve Natural Flow Rate and Sweet Pepper Productivity in Greenhouse

    Directory of Open Access Journals (Sweden)

    W. Limtrakarn

    2012-01-01

    Improving the natural flow rate and sweet pepper productivity in a tropical greenhouse by CFD simulation is the main objective of this research work. Most greenhouses today are of the arch shape. To develop an improved greenhouse structure for the region, the arch type was built and used as the control model. The Mae Sar Mai agriculture research station under the royal project foundation was selected as the field test site. Temperature sensors with a data logger were installed to monitor the variation of temperature inside the greenhouse. The measured temperature data were used as the boundary conditions for the CFD analysis. A new greenhouse model with a two-step roof shape was designed and its air flow behavior was simulated using CFD. According to the CFD results, the air flow rate of the new model is about 39% higher than that of the old model, and the maximum temperature of the new model is lower than that of the old one. The growth of sweet peppers in both greenhouse models was measured and compared. Results show that the new model achieves a maximum daytime temperature 4°C lower than the old one, and first-grade pepper productivity higher by 97% in number and 90% in weight.

  2. CryptoCache: A Secure Sharable File Cache for Roaming Users

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2000-01-01

    Small mobile computers are now sufficiently powerful to run many applications, but storage capacity remains limited so working files cannot be cached or stored locally. Even if files can be stored locally, the mobile device is not powerful enough to act as server in collaborations with other users. Conventional distributed file systems cache everything locally or not at all; there is no possibility to cache files on nearby nodes. In this paper we present the design of a secure cache system called CryptoCache that allows roaming users to cache files on untrusted file hosting servers. The system allows flexible sharing of cached files among unauthenticated users, i.e. unlike most distributed file systems CryptoCache does not require a global authentication framework. Files are encrypted when they are transferred over the network and while stored on untrusted servers. The system uses public key...
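
    The abstract is cut off just as it reaches the key mechanism, but the standard hybrid pattern it points toward (symmetric encryption of the file plus public-key wrapping of the file key) can be sketched as follows; this uses the third-party cryptography package and is not CryptoCache's actual code.

      # Hybrid encryption sketch: the cache host never sees plaintext or keys.
      from cryptography.fernet import Fernet
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import rsa, padding

      recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)
      oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                          algorithm=hashes.SHA256(), label=None)

      # Owner: encrypt the file symmetrically, wrap the key for one recipient.
      file_key = Fernet.generate_key()
      ciphertext = Fernet(file_key).encrypt(b"working file contents")
      wrapped_key = recipient.public_key().encrypt(file_key, oaep)

      # Recipient: unwrap the file key, then decrypt the cached ciphertext.
      plaintext = Fernet(recipient.decrypt(wrapped_key, oaep)).decrypt(ciphertext)
      assert plaintext == b"working file contents"

    Under this pattern the untrusted cache server stores only ciphertext and wrapped keys, so files can be shared with chosen recipients without any global authentication framework.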

  3. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management, and how these capabilities may continue to evolve into the future. I address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  4. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate in a prototype study that an HDF4 file content map can be used to organize data efficiently in a cloud object storage system to facilitate cloud computing. This approach can be extended to any binary data format and to any existing big-data analytics solution powered by cloud computing, because the HDF4 file content map project started as a long-term preservation effort for NASA data and does not require the HDF4 APIs to access the data.
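
    The practical payoff of a content map is that a client can fetch a single dataset with an HTTP range request instead of downloading the whole file; the sketch below assumes an invented map entry and URL.

      # Byte-range read of one dataset, guided by a file content map.
      import requests

      content_map = {                 # offsets would come from the HDF4 map file
          "Temperature": {"offset": 2048, "length": 40000},
      }

      def fetch_dataset(url, name):
          entry = content_map[name]
          start = entry["offset"]
          end = start + entry["length"] - 1          # HTTP ranges are inclusive
          r = requests.get(url, headers={"Range": f"bytes={start}-{end}"})
          r.raise_for_status()                       # expect 206 Partial Content
          return r.content

      raw = fetch_dataset("https://example.com/bucket/granule.hdf", "Temperature")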

  5. BIBLIO: A Reprint File Management Algorithm

    Science.gov (United States)

    Zelnio, Robert N.; And Others

    1977-01-01

    The development of a simple computer algorithm designed for use by the individual educator or researcher in maintaining and searching reprint files is reported. Called BIBLIO, the system is inexpensive and easy to operate and maintain without sacrificing flexibility and utility. (LBH)

  6. Effects of pollution from personal computers on perceived air quality, SBS symptoms and productivity in offices

    DEFF Research Database (Denmark)

    Bako-Biro, Zsolt; Wargocki, Pawel; Weschler, Charles J.

    2004-01-01

    In groups of six, 30 female subjects were exposed for 4.8 h in a low-polluting office to each of two conditions: the presence or absence of 3-month-old personal computers (PCs). These PCs were placed behind a screen so that they were not visible to the subjects. Throughout the exposure the outdoor air supply was maintained at 10 l/s per person. Under each of the two conditions the subjects performed simulated office work using old low-polluting PCs. They also evaluated the air quality and reported Sick Building Syndrome (SBS) symptoms. The PCs were found to be strong indoor pollution sources, even after they had been in service for 3 months. The sensory pollution load of each PC was 3.4 olf, more than three times the pollution of a standard person. The presence of PCs increased the percentage of people dissatisfied with the perceived air quality from 13 to 41% and increased by 9% the time...

  7. GEODOC: the GRID document file, record structure and data element description

    Energy Technology Data Exchange (ETDEWEB)

    Trippe, T.; White, V.; Henderson, F.; Phillips, S.

    1975-11-06

    The purpose of this report is to describe the information structure of the GEODOC file. GEODOC is a computer based file which contains the descriptive cataloging and indexing information for all documents processed by the National Geothermal Information Resource Group. This file (along with other GRID files) is managed by DBMS, the Berkeley Data Base Management System. Input for the system is prepared using the IRATE Text Editing System with its extended (12 bit) character set, or punched cards.

  8. Three-Dimensional Printing of X-Ray Computed Tomography Datasets with Multiple Materials Using Open-Source Data Processing

    Science.gov (United States)

    Sander, Ian M.; McGoldrick, Matthew T.; Helms, My N.; Betts, Aislinn; van Avermaete, Anthony; Owers, Elizabeth; Doney, Evan; Liepert, Taimi; Niebur, Glen; Liepert, Douglas; Leevy, W. Matthew

    2017-01-01

    Advances in three-dimensional (3D) printing allow for digital files to be turned into a "printed" physical product. For example, complex anatomical models derived from clinical or pre-clinical X-ray computed tomography (CT) data of patients or research specimens can be constructed using various printable materials. Although 3D printing…

  9. Development of computer code on sodium-water reaction products transport

    International Nuclear Information System (INIS)

    Arikawa, H.; Yoshioka, N.; Suemori, M.; Nishida, K.

    1988-01-01

    The LMFBR concept eliminating the secondary sodium system has been considered one of the most promising concepts for offering cost reductions. In this reactor concept, the evaluation of the effects on the reactor core of the sodium-water reaction products (SWRPs) formed during a sodium-water reaction at the primary steam generator becomes one of the major safety issues. In this study, a calculation code was developed as the first step in establishing the evaluation method for SWRP effects. The calculation code, called SPROUT, simulates SWRP transport and distribution in the primary sodium system using the system geometry, thermal hydraulic data, and sodium-water reaction conditions as input. This code principally models SWRP behavior. The paper contains the models for SWRP behaviors, including dissolution, precipitation, deposition, and so on, and the results and discussion of the demonstration calculation for a typical FBR plant eliminating the secondary sodium system.

  10. Computational and experimental prediction of dust production in pebble bed reactors, Part II

    Energy Technology Data Exchange (ETDEWEB)

    Hiruta, Mie; Johnson, Gannon [Department of Mechanical Engineering, University of Idaho, 1776 Science Center Drive, Idaho Falls, ID 83401 (United States); Rostamian, Maziar, E-mail: mrostamian@asme.org [Department of Mechanical Engineering, University of Idaho, 1776 Science Center Drive, Idaho Falls, ID 83401 (United States); Potirniche, Gabriel P. [Department of Mechanical Engineering, University of Idaho, 1776 Science Center Drive, Idaho Falls, ID 83401 (United States); Ougouag, Abderrafi M. [Idaho National Laboratory, 2525 N Fremont Avenue, Idaho Falls, ID 83401 (United States); Bertino, Massimo; Franzel, Louis [Department of Physics, Virginia Commonwealth University, Richmond, VA 23284 (United States); Tokuhiro, Akira [Department of Mechanical Engineering, University of Idaho, 1776 Science Center Drive, Idaho Falls, ID 83401 (United States)

    2013-10-15

    Highlights: • Custom-built high temperature, high pressure tribometer is designed. • Two different wear phenomena at high temperatures are observed. • Experimental wear results for graphite are presented. • The graphite wear dust production in a typical Pebble Bed Reactor is predicted. -- Abstract: This paper is the continuation of Part I, which describes the high-temperature, high-pressure helium-environment wear tests of graphite on graphite in frictional contact. The present work simulates a Pebble Bed Reactor core environment more closely than Part I. The experimental apparatus, a custom-designed tribometer, is capable of performing wear tests at PBR-relevant temperatures and pressures under a helium environment. This environment facilitates prediction of the wear mass lost by graphite as dust particulates from the pebble bed. The experimental results obtained in a high-temperature helium environment are used to predict the amount of wear mass produced in a pebble bed nuclear reactor.

  11. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00066086; The ATLAS collaboration; Caballero, Jose; Ernst, Michael; Guan, Wen; Hover, John; Lesny, David; Maeno, Tadashi; Nilsson, Paul; Tsulaia, Vakhtang; van Gemmeren, Peter; Vaniachine, Alexandre; Wang, Fuquan; Wenaus, Torre

    2016-01-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, Edison Cray XC30 supercomputer, backfill at Tier 2 and Tier 3 sites, opportunistic resources at the Open Science Grid (OSG), and ATLAS High Level Trigger farm between the data taking periods. Because of specific aspects of opportunistic resources such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  12. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    CERN Document Server

    Benjamin, Douglas; The ATLAS collaboration; Ernst, Michael; Guan, Wen; Hover, John; Lesny, David; Maeno, Tadashi; Nilsson, Paul; Tsulaia, Vakhtang; van Gemmeren, Peter; Vaniachine, Alexandre; Wang, Fuquan; Wenaus, Torre

    2016-01-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, the Edison Cray XC30 supercomputer, backfill at Tier-2 and Tier-3 sites, opportunistic resources at the Open Science Grid, and the ATLAS High Level Trigger farm between data-taking periods. Because of specific aspects of opportunistic resources such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  13. Adding Data Management Services to Parallel File Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Scott [Univ. of California, Santa Cruz, CA (United States)

    2015-03-04

    The objective of this project, called DAMASC for "Data Management in Scientific Computing", is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit, and the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file

  14. File access prediction using neural networks.

    Science.gov (United States)

    Patra, Prashanta Kumar; Sahu, Muktikanta; Mohapatra, Subasish; Samantray, Ronak Kumar

    2010-06-01

    One of the most vexing issues in the design of a high-speed computer is the wide gap in access times between the memory and the disk. To address this problem, static file access predictors have been used. In this paper, we propose dynamic file access predictors using neural networks that significantly improve the accuracy, success-per-reference, and effective success rate per reference of file access prediction with proper tuning. In particular, we verified that incorrect predictions were reduced from 53.11% to 43.63% for the proposed neural network prediction method in a standard configuration, compared with the recent popularity (RP) method. With manual tuning for each trace, we were able to further improve the misprediction rate and the effective success rate per reference over the standard configuration. Simulations on distributed file system (DFS) traces reveal that an exact-fit radial basis function (RBF) network gives better predictions in high-end systems, whereas a multilayer perceptron (MLP) trained with Levenberg-Marquardt (LM) backpropagation performs best on systems with good computational capability. Probabilistic and competitive predictors are the most suitable for workstations with limited resources, and the former is more efficient than the latter for servers handling the largest numbers of system calls. Finally, we conclude that the MLP with the LM backpropagation algorithm has a better file prediction success rate than the simple perceptron, last successor, stable successor, and best-k-out-of-m predictors.
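
    The setup reduces to sequence prediction: the last k file IDs are the features and the next access is the label. The sketch below uses scikit-learn's MLP, which trains with Adam or L-BFGS rather than the paper's Levenberg-Marquardt algorithm, so it only illustrates the structure of such a predictor.

      # Toy neural-network file access predictor.
      import numpy as np
      from sklearn.neural_network import MLPClassifier

      trace = [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2]    # file IDs (toy access trace)
      k = 2                                           # history window

      X = np.array([trace[i:i + k] for i in range(len(trace) - k)])
      y = np.array([trace[i + k] for i in range(len(trace) - k)])

      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
      clf.fit(X, y)
      print(clf.predict([[1, 2]]))    # after files 1, 2 the trace visits file 0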

  15. Links among available integral benchmarks and differential data evaluations, computational biases and uncertainties, and nuclear criticality safety biases on potential MOX production throughput

    International Nuclear Information System (INIS)

    Goluoglu, S.; Hopper, C.M.

    2004-01-01

    Through the use of Oak Ridge National Laboratory's recently developed and applied sensitivity and uncertainty computational analysis techniques, this paper presents the relevance and importance of available and needed integral benchmarks and differential data evaluations impacting potential MOX production throughput determinations relative to low-moderated MOX fuel blending operations. The relevance and importance in the availability of or need for critical experiment benchmarks and data evaluations are presented in terms of computational biases as influenced by computational and experimental sensitivities and uncertainties relative to selected MOX production powder blending processes. Recent developments for estimating the safe margins of subcriticality for assuring nuclear criticality safety for process approval are presented. In addition, the impact of the safe margins (due to computational biases and uncertainties) on potential MOX production throughput will also be presented. (author)

  16. Source Reference File

    Data.gov (United States)

    Social Security Administration — This file contains a national set of names and contact information for doctors, hospitals, clinics, and other facilities (known collectively as sources) from which...

  17. Patient Assessment File (PAF)

    Data.gov (United States)

    Department of Veterans Affairs — The Patient Assessment File (PAF) database compiles the results of the Patient Assessment Instrument (PAI) questionnaire filled out for intermediate care Veterans...

  18. RRB Earnings File (RRBERN)

    Data.gov (United States)

    Social Security Administration — RRBERN contains records for all beneficiaries on the RRB's PSSVES file whose SSNs are validated through the SVES processing. Validated output is processed through...

  19. Radiology Teaching Files on the Internet

    International Nuclear Information System (INIS)

    Lim, Eun Chung; Kim, Eun Kyung

    1996-01-01

    There is increasing attention to radiology teaching files on the Internet in the field of diagnostic radiology. The purpose of this study was to aid the creation of new radiology teaching files by analysing the present radiology teaching file sites on the Internet in many respects and evaluating the images on those sites, using a Macintosh IIci computer, a 28.8 kbps TelePort Fax/Modem, and Netscape Navigator 2.0 software. The results were as follows: 1. Analysis of radiology teaching file sites. (1) Country distribution was highest for the USA (57.5%). (2) The average number of cases was 186, and 9 sites (22.5%) provided a search engine. (3) Regarding the method of case arrangement, anatomic-area type and diagnosis type were each found at 10 sites (25%), and question-and-answer type at 9 sites (22.5%). (4) Radiology teaching file sites covering oro-maxillofacial disorders numbered 9 (22.5%). (5) Regarding image format, GIF was found at 14 sites (35%) and JPEG at 14 sites (35%). (6) The most frequent year of creation was 1995 (43.7%). (7) Continuing case upload was found at 35 sites (87.5%). 2. Evaluation of images on the radiology teaching files. (1) The average file size of GIF format (71 Kbyte) was greater than that of JPEG format (24 Kbyte) (P<0.001). (2) The image quality of GIF format was better than that of JPEG format. (P<0.001)
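    The file-size comparison above is a two-sample test of means; a hedged illustration with invented samples centred near the reported 71 Kbyte and 24 Kbyte averages (not the study's data):

```python
# Two-sample t-test comparing GIF and JPEG teaching-file sizes (toy data).
from scipy import stats

gif_kb = [68, 75, 71, 70, 73, 69]    # invented samples around the 71 Kbyte mean
jpeg_kb = [22, 25, 24, 26, 23, 24]   # invented samples around the 24 Kbyte mean

t, p = stats.ttest_ind(gif_kb, jpeg_kb)
print(f"t = {t:.2f}, p = {p:.4g}")   # a small p echoes the reported P < 0.001
```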

  20. Experimental and computational fluid dynamic studies of mixing for complex oral health products

    Science.gov (United States)

    Garcia, Marti Cortada; Mazzei, Luca; Angeli, Panagiota

    2015-11-01

    Mixing highly viscous non-Newtonian fluids is common in the consumer health industry. This process is sometimes empirical and involves many product-specific pilot plant trials. The first step in studying the mixing process is to build knowledge of the rheology of the fluids involved. In this research a systematic approach is used to validate the rheology of two liquids: glycerol and a gel formed by polyethylene glycol and carbopol. Initially, the constitutive equation is determined, which relates the viscosity of the fluids to temperature, shear rate, and concentration. The key variable for the validation is the power required for mixing, which can be obtained both from CFD and experimentally, using a stirred tank and impeller of well-defined geometries at different impeller speeds. A good agreement between the two values indicates a successful validation of the rheology and allows the CFD model to be used for the study of mixing in the complex vessel geometries and increased sizes encountered during scale-up.
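    The validation quantity described above can be made concrete with the standard mixing relations P = 2πNT and Np = P/(ρN³D⁵); all numerical values in this sketch are illustrative, not taken from the study:

```python
# Mixing power from measured shaft torque, and the dimensionless power number
# used to compare experiment with CFD. All values are invented for illustration.
import math

N = 5.0        # impeller speed, rev/s
T = 0.8        # measured shaft torque, N*m
rho = 1260.0   # glycerol-like density, kg/m^3
D = 0.10       # impeller diameter, m

P = 2.0 * math.pi * N * T        # mixing power, W
Np = P / (rho * N**3 * D**5)     # power number for CFD comparison
print(f"P = {P:.1f} W, Np = {Np:.2f}")
```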

  1. Single-Step Fabrication of Computationally Designed Microneedles by Continuous Liquid Interface Production.

    Directory of Open Access Journals (Sweden)

    Ashley R Johnson

    Full Text Available Microneedles, arrays of micron-sized needles that painlessly puncture the skin, enable transdermal delivery of medications that are difficult to deliver using more traditional routes. Many important design parameters, such as microneedle size, shape, spacing, and composition, are known to influence efficacy, but are notoriously difficult to alter due to the complex nature of microfabrication techniques. Herein, we utilize a novel additive manufacturing ("3D printing" technique called Continuous Liquid Interface Production (CLIP to rapidly prototype sharp microneedles with tuneable geometries (size, shape, aspect ratio, spacing. This technology allows for mold-independent, one-step manufacturing of microneedle arrays of virtually any design in less than 10 minutes per patch. Square pyramidal CLIP microneedles composed of trimethylolpropane triacrylate, polyacrylic acid and photopolymerizable derivatives of polyethylene glycol and polycaprolactone were fabricated to demonstrate the range of materials that can be utilized within this platform for encapsulating and controlling the release of therapeutics. These CLIP microneedles effectively pierced murine skin ex vivo and released the fluorescent drug surrogate rhodamine.

  2. An effective dose assessment technique with NORM added consumer products using skin-point source on computational human phantom

    International Nuclear Information System (INIS)

    Yoo, Do Hyeon; Shin, Wook-Geun; Lee, Hyun Cheol; Choi, Hyun Joon; Testa, Mauro; Lee, Jae Kook; Yeom, Yeon Soo; Kim, Chan Hyeong; Min, Chul Hee

    2016-01-01

    The aim of this study is to develop a technique for assessing the effective dose by calculating the organ equivalent dose with a Monte Carlo (MC) simulation and a computational human phantom for naturally occurring radioactive material (NORM) added consumer products. We suggest a method for determining the MC source term based on a skin-point source, enabling convenient and conservative modeling of various types of products. To validate the skin-point source method, the organ equivalent doses were compared with those from a product-modeling source of realistic shape for a pillow, waist supporter, sleeping mattress, etc. Our results show that, according to the source location, the organ equivalent doses followed a similar tendency for both source-determination methods; however, the annual effective dose with the skin-point source was more conservative than that with the modeling source, by up to a factor of 3.3. Assuming a gamma energy of 1 MeV and a product activity of 1 Bq g⁻¹, the annual effective doses of the pillow, waist supporter and sleeping mattress with the skin-point source were 3.09E-16 Sv Bq⁻¹ year⁻¹, 1.45E-15 Sv Bq⁻¹ year⁻¹, and 2.82E-16 Sv Bq⁻¹ year⁻¹, respectively, while the product-modeling source gave 9.22E-17 Sv Bq⁻¹ year⁻¹, 9.29E-16 Sv Bq⁻¹ year⁻¹, and 8.83E-17 Sv Bq⁻¹ year⁻¹, respectively. In conclusion, this study demonstrated that the skin-point source method can be employed to efficiently evaluate the annual effective dose due to the use of NORM-added consumer products. - Highlights: • We evaluate the exposure dose from the use of NORM-added consumer products. • We suggest a method for determining the MC source term based on a skin-point source. • To validate the skin-point source, the organ equivalent doses were compared with those from the modeling source. • The skin-point source could
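    As a back-of-the-envelope use of the reported coefficients, the sketch below scales the pillow's skin-point-source coefficient by an assumed specific activity (the activity value is invented for illustration):

```python
# Annual effective dose = per-unit-activity coefficient * specific activity.
coeff_pillow = 3.09e-16    # Sv year^-1 per (Bq g^-1), pillow, skin-point source
specific_activity = 100.0  # Bq g^-1 of NORM in the product (assumed value)

annual_dose_sv = coeff_pillow * specific_activity
print(f"annual effective dose ~ {annual_dose_sv:.2e} Sv/year")
```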

  3. Initial draft of CSE-UCLA evaluation model based on weighted product in order to optimize digital library services in computer college in Bali

    Science.gov (United States)

    Divayana, D. G. H.; Adiarta, A.; Abadi, I. B. G. S.

    2018-01-01

    The aim of this research was to create an initial design of the CSE-UCLA evaluation model modified with the Weighted Product method for evaluating digital library services at Computer Colleges in Bali. The method used was the developmental research method, following the Borg and Gall design model. The result obtained from the research conducted earlier this month was a rough sketch of the Weighted Product based CSE-UCLA evaluation model; this design provides a general overview of the stages of the Weighted Product based CSE-UCLA evaluation model used to optimize the digital library services at the Computer Colleges in Bali.
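    The Weighted Product step of the modified model can be sketched in a few lines; the criteria, ratings and weights below are hypothetical, since the abstract does not list them:

```python
# Weighted Product scoring: S_i = prod_j (x_ij ** w_j).
# Alternatives, ratings and weights are invented for illustration.
services = {
    "catalogue search": [4, 5, 3],
    "e-book access":    [5, 3, 4],
}
weights = [0.5, 0.3, 0.2]  # criterion weights, summing to 1

def wp_score(ratings, weights):
    score = 1.0
    for x, w in zip(ratings, weights):
        score *= x ** w
    return score

for name, ratings in services.items():
    print(name, round(wp_score(ratings, weights), 3))
```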

  4. Experimental research on the contrast production of the chemical elements with the atomic numbers 1-83 in a computer-totalbody-tomogram

    International Nuclear Information System (INIS)

    Kirschner, H.; Burmester, U.; Stringaris, K.

    1979-01-01

    The contrast production for the chemical elements with atomic numbers Z=1-83 was determined by computer tomography. Using the formula relating the Δ-number to the atomic number, one can compute the contrast production of any chosen chemical compound. Iodine-free and inorganic iodine-containing contrast media are examined for their contrast production and compared with presently used organic iodine-containing contrast media. The contrast enhancement of organic contrast media in tissue is discussed. (orig.) [de

  5. Applying knowledge engineering tools for the personal computer to the operation and maintenance of radiopharmaceutical production systems

    International Nuclear Information System (INIS)

    Alexoff, D.L.

    1990-01-01

    A practical consequence of over three decades of Artificial Intelligence (AI) research has been the emergence of Personal Computer-based AI programming tools. A special class of this microcomputer-based software, called expert system shells, is now applied routinely outside the realm of classical AI to solve many types of problems, particularly in analytical chemistry. These AI tools offer not only some of the advantages inherent to symbolic programming languages but, just as significantly, they bring with them advanced program development environments which can facilitate software development and maintenance. Exploitation of this enhanced programming environment was a major motivation for using an AI tool. The goal of this work is to evaluate the use of an example-based expert system shell (1st Class FUSION, 1st Class Expert Systems, Inc.) as a programming tool for developing software useful for automated radiopharmaceutical production

  6. 76 FR 62092 - Filing Procedures

    Science.gov (United States)

    2011-10-06

    ... INTERNATIONAL TRADE COMMISSION Filing Procedures AGENCY: International Trade Commission. ACTION: Notice of issuance of Handbook on Filing Procedures. SUMMARY: The United States International Trade Commission (``Commission'') is issuing a Handbook on Filing Procedures to replace its Handbook on Electronic...

  7. Massive stereo-based DTM production for Mars on cloud computers

    Science.gov (United States)

    Tao, Y.; Muller, J.-P.; Sidiropoulos, P.; Xiong, Si-Ting; Putri, A. R. D.; Walter, S. H. G.; Veitch-Michaelis, J.; Yershov, V.

    2018-05-01

    Digital Terrain Model (DTM) creation is essential to improving our understanding of the formation processes of the Martian surface. Although there have been previous demonstrations of open-source or commercial planetary 3D reconstruction software, planetary scientists are still struggling to create good quality DTMs that meet their science needs, especially when there is a requirement to produce a large number of high quality DTMs using "free" software. In this paper, we describe a new open source system that overcomes many of these obstacles by demonstrating results in the context of issues found from experience with several planetary DTM pipelines. We introduce a new fully automated multi-resolution DTM processing chain for NASA Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) and High Resolution Imaging Science Experiment (HiRISE) stereo processing, called the Co-registration Ames Stereo Pipeline (ASP) Gotcha Optimised (CASP-GO), based on the open source NASA ASP. CASP-GO employs tie-point based multi-resolution image co-registration, and Gotcha sub-pixel refinement and densification. The CASP-GO pipeline is used to produce planet-wide CTX and HiRISE DTMs that guarantee global geo-referencing compliance with respect to High Resolution Stereo Camera (HRSC) imaging, and thence to the Mars Orbiter Laser Altimeter (MOLA), providing refined stereo-matching completeness and accuracy. All software and good quality products introduced in this paper are being made open-source to the planetary science community through collaboration with NASA Ames, the United States Geological Survey (USGS) and the Jet Propulsion Laboratory (JPL) Advanced Multi-Mission Operations System (AMMOS) Planetary Data System (PDS) Pipeline Service (APPS-PDS4), as well as being browseable and visualisable through the iMars web-based Geographic Information System (webGIS).

  8. Production Support Flight Control Computers: Research Capability for F/A-18 Aircraft at Dryden Flight Research Center

    Science.gov (United States)

    Carter, John F.

    1997-01-01

    NASA Dryden Flight Research Center (DFRC) is working with the United States Navy to complete ground testing and initiate flight testing of a modified set of F/A-18 flight control computers. The Production Support Flight Control Computers (PSFCC) can give any fleet F/A-18 airplane an in-flight, pilot-selectable research control law capability. NASA DFRC can efficiently flight test the PSFCC for the following four reasons: (1) Six F/A-18 chase aircraft are available which could be used with the PSFCC; (2) An F/A-18 processor-in-the-loop simulation exists for validation testing; (3) The expertise has been developed in programming the research processor in the PSFCC; and (4) A well-defined process has been established for clearing flight control research projects for flight. This report presents a functional description of the PSFCC. Descriptions of the NASA DFRC facilities, PSFCC verification and validation process, and planned PSFCC projects are also provided.

  9. Virus Alert: Ten Steps to Safe Computing.

    Science.gov (United States)

    Gunter, Glenda A.

    1997-01-01

    Discusses computer viruses and explains how to detect them; discusses virus protection and the need to update antivirus software; and offers 10 safe computing tips, including scanning floppy disks and commercial software, how to safely download files from the Internet, avoiding pirated software copies, and backing up files. (LRW)

  10. Nuclear plant fire incident data file

    International Nuclear Information System (INIS)

    Sideris, A.G.; Hockenbury, R.W.; Yeater, M.L.; Vesely, W.E.

    1979-01-01

    A computerized nuclear plant fire incident data file was developed by American Nuclear Insurers and was further analyzed by Rensselaer Polytechnic Institute with technical and monetary support provided by the Nuclear Regulatory Commission. Data on 214 fires that occurred at nuclear facilities have been entered in the file. A computer program has been developed to sort the fire incidents according to various parameters. The parametric sorts that are presented in this article are significant since they are the most comprehensive statistics presently available on fires that have occurred at nuclear facilities

  11. 76 FR 24467 - Combined Notice of Filings #1

    Science.gov (United States)

    2011-05-02

    ... Company Depreciation Study and Change in Depreciation Rates for Wholesale Production Service. Filed Date..., 2011. Docket Numbers: ER11-3431-000. Applicants: New Mexico Green Initiatives, LLC. Description: New Mexico Green Initiatives, LLC submits tariff filing per 35.12: NM Green Initiatives MBR Application to be...

  12. 76 FR 23320 - Combined Notice of Filings #2

    Science.gov (United States)

    2011-04-26

    ... Tuesday, May 10, 2011. Docket Numbers: ER10-3096-002. Applicants: Public Service Company of New Mexico. Description: Public Service Company of New Mexico submits tariff filing per 35: WestConnect Experimental... tariff filing per 35.13(a)(2)(iii): IPL Changes in Depreciation Rates for Wholesale Production Service to...

  13. FHEO Filed Cases

    Data.gov (United States)

    Department of Housing and Urban Development — The dataset is a list of all the Title VIII fair housing cases filed by FHEO from 1/1/2007 - 12/31/2012 including the case number, case name, filing date, state and...

  14. An Open Computing Infrastructure that Facilitates Integrated Product and Process Development from a Decision-Based Perspective

    Science.gov (United States)

    Hale, Mark A.

    1996-01-01

    Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure: with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. An architecture for exploring design called DREAMS (Developing Robust

  15. Efficacy of D-RaCe and ProTaper Universal Retreatment NiTi instruments and hand files in removing gutta-percha from curved root canals - a micro-computed tomography study.

    Science.gov (United States)

    Rödig, T; Hausdörfer, T; Konietschke, F; Dullin, C; Hahn, W; Hülsmann, M

    2012-06-01

    To compare the efficacy of two rotary NiTi retreatment systems and Hedström files in removing filling material from curved root canals. Curved root canals of 57 extracted teeth were prepared using FlexMaster instruments and filled with gutta-percha and AH Plus. After determination of root canal curvatures and radii in two directions, the teeth were assigned to three identical groups (n = 19). The root fillings were removed with D-RaCe instruments, ProTaper Universal Retreatment instruments or Hedström files. Pre- and postoperative micro-CT imaging was used to assess the percentage of residual filling material as well as the amount of dentine removal. Working time and procedural errors were recorded. Data were analysed using analysis of covariance and analysis of variance procedures. D-RaCe instruments were significantly more effective than ProTaper Universal Retreatment instruments and Hedström files. In the ProTaper group, four instrument fractures and one lateral perforation were observed. Five instrument fractures were recorded for D-RaCe. D-RaCe instruments were associated with significantly less residual filling material than ProTaper Universal Retreatment instruments and hand files. Hedström files removed significantly less dentine than both rotary NiTi systems. Retreatment with rotary NiTi systems resulted in a high incidence of procedural errors. © 2012 International Endodontic Journal.
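    The micro-CT outcome measure above reduces to a voxel count; a minimal sketch with synthetic segmentation masks (the masks and proportions are invented, not the study's data):

```python
# Residual filling material as a percentage of the pre-operative filling volume,
# from boolean voxel masks of the segmented micro-CT scans (toy data).
import numpy as np

rng = np.random.default_rng(0)
pre = rng.random((40, 40, 40)) < 0.20           # voxels labelled as filling, pre-op
post = pre & (rng.random((40, 40, 40)) < 0.1)   # subset remaining after retreatment

residual_pct = 100.0 * post.sum() / pre.sum()
print(f"residual filling material: {residual_pct:.1f}%")
```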

  16. Next generation WLCG File Transfer Service (FTS)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    LHC experiments at CERN and worldwide utilize WLCG resources and middleware components to perform distributed computing tasks. One of the most important tasks is reliable file replication. It is a complex problem, suffering from transfer failures, disconnections, transfer duplication, server and network overload, differences in storage systems, etc. To address these problems, EMI and gLite have provided the independent File Transfer Service (FTS) and Grid File Access Library (GFAL) tools. Their development started almost a decade ago; in the meantime, requirements in data management have changed, and the old architecture of FTS and GFAL cannot easily support these changes. Technology has also been progressing: FTS and GFAL do not fit into the new paradigms (cloud and messaging, for example). To be able to serve the next stage of LHC data collecting (from 2013), we need a new generation of these tools: FTS 3 and GFAL 2. We envision a service requiring minimal configuration, which can dynamically adapt to the...

  17. GIFT: an HEP project for file transfer

    International Nuclear Information System (INIS)

    Ferrer, M.L.; Mirabelli, G.; Valente, E.

    1986-01-01

    Started in autumn 1983, GIFT (General Internetwork File Transfer) is a collaboration among several HEP centers, including CERN, Frascati, Oslo, Oxford, RAL and Rome. The collaboration was initially set up with the aim of studying the feasibility of a software system to allow direct file exchange between computers which do not share a common Virtual File Protocol. After the completion of this first phase, an implementation phase started and, since March 1985, an experimental service based on this system has been running at CERN between DECnet, CERNET and the UK Coloured Book protocols. The authors present the motivations that, together with previous gateway experiences, led to the definition of GIFT specifications and to the implementation of the GIFT Kernel system. The position of GIFT in the overall development framework of the networking facilities needed by large international collaborations within the HEP community is explained. (Auth.)

  18. Business and computing : Bridging the gap

    International Nuclear Information System (INIS)

    Gordon, B.; Coles, F.C.

    1999-01-01

    Information systems, in the form of paper files or electronic files on computer systems and digital storage devices, are employed in the oil industry to handle data from 3 basic infrastructures: accounting, geotechnical, and administration. The accounting function was the main driving force behind the development of the computer. The geotechnical data storage and manipulation infrastructure has its basis in signal recording and processing related to seismic acquisition and well logging. The administrative infrastructure deals with documents and not just data. Management in the oil industry needs two main kinds of useful information: reports about its organization and about the marketplace. Using an example of an oil and gas enterprise whose aim is to pursue low cost shallow gas to increase production levels, the basic business process is shown to relate to land and prospect inventory management, tightly controlled drilling methods, gathering system and production facility standardization, logistics and planning models, and strong transportation and marketing management. The role of the computer in this process is to yield information, that is, to provide coordinated, integrated, useful information that facilitates the processes essential to accomplish the business's objectives

  19. Scientific production of brazilian researchers who filed patents in the area of biotechnology from 2001 to 2005: institutional and interpersonal collaboration

    Directory of Open Access Journals (Sweden)

    Ana Maria Mielniczuk de Moura

    2010-05-01

    Full Text Available This study analyzes the scientific production of researchers who filed patents in the field of biotechnology in the period from 2001 to 2005. From a scientometric approach, it aims to reveal the existing inter-institutional and interpersonal collaboration. The corpus is based on 2584 items collected in Web of Science. We used the methodology of Social Network Analysis and MDS to observe the formation of clusters of authors and institutions. The results indicate that most of the articles involve up to three institutions in the C1 field (88.7% of cases). It was observed that the scientific production is concentrated in a few institutions, led by public universities (federal and state) and reputable research institutions. Among the universities, the most productive are USP, UNICAMP, UNESP and UFRJ; among research institutions, FIOCRUZ, the Instituto Butantan and EMBRAPA stand out. Some institutions have a regional pattern of collaboration, since they interact only with geographically closer institutions, forming regional clusters. The most productive authors are not in the top positions in the ranking by outdegree, meaning that centrality is not directly related to productivity. It was observed that interpersonal collaboration is strengthened by bonds created in graduate school, as many partnerships were formed from this type of relationship, with significant production among the agents involved.

  20. Uranium and thorium occurrences in New Mexico: distribution, geology, production, and resources, with selected bibliography. Open-file report OF-183

    International Nuclear Information System (INIS)

    McLemore, V.T.

    1983-09-01

    Over 1300 uranium and thorium occurrences are found in over 100 formational units in all but two counties, in all 1- by 2-degree topographic quadrangles, and in all four geographic provinces in New Mexico. Uranium production in New Mexico has surpassed yearly production from all other states since 1956. Over 200 mines in 18 counties in New Mexico have produced 163,010 tons (147,880 metric tons) of U3O8 from 1948 to 1982, 40% of the total uranium production in the United States. More than 99% of this production has come from sedimentary rocks in the San Juan Basin area in northwestern New Mexico; 96% has come from the Morrison Formation alone. All of the uranium reserves and the majority of the potential uranium resources in New Mexico are in the Grants uranium district. About 112,500 tons (102,058 metric tons) of $30 per pound U3O8 reserves are in the San Juan Basin, about 55% of the total $30 reserves in the United States. Thorium reserves and resources in New Mexico have not been adequately evaluated and are unknown. Over 1300 uranium and thorium occurrences are described in this report; about 400 of these have been examined in the field by the author. The occurrence descriptions include information on location, commodities, production, development, geology, and classification. Over 1000 citations are included in the bibliography and referenced in the occurrence descriptions. Production statistics for uranium mines that operated from 1948 to 1970 are also included. Mines that operated after 1970 are classified into production categories. 43 figures, 9 tables

  1. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    Science.gov (United States)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  2. Tabulation of Fundamental Assembly Heat and Radiation Source Files

    International Nuclear Information System (INIS)

    T. deBues; J.C. Ryman

    2006-01-01

    The purpose of this calculation is to tabulate a set of computer files for use as input to the WPLOAD thermal loading software. These files contain details regarding heat and radiation from pressurized water reactor (PWR) assemblies and boiling water reactor (BWR) assemblies. The scope of this calculation is limited to rearranging and reducing the existing file information into a more streamlined set of tables for use as input to WPLOAD. The electronic source term files used as input to this calculation were generated from the output files of the SAS2H/ORIGEN-S sequence of the SCALE Version 4.3 modular code system, as documented in References 2.1.1 and 2.1.2, and are included in Attachment II

  3. Study and development of a document file system with selective access

    International Nuclear Information System (INIS)

    Mathieu, Jean-Claude

    1974-01-01

    The objective of this research thesis was to design and develop a set of software aimed at efficient management of a document file system using methods of selective access to information. Thus, the three main aspects of file processing (creation, modification, reorganisation) have been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also discussed. He presents the characteristics of the INIS bibliographic records provided by the IAEA which have been used to create the files. In the second part, he briefly describes the general organisation of the file system. This system is based on the use of two main files: an inverse file, which contains for each descriptor the list of numbers of the records indexed by this descriptor, and a dictionary of descriptors, or input file, which gives access to the inverse file. The organisation of both files is then described in detail. Other related or associated files are created, and the overall architecture and mechanisms integrated into the file data input software are described, as well as the various processing operations applied to these different files. Performance and possible developments are finally discussed
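    The two-file organisation described above is essentially an inverted index; a minimal sketch, with invented record numbers and descriptors:

```python
# "Inverse file": descriptor -> record numbers; the dict itself plays the role
# of the dictionary of descriptors giving access to the inverse file.
from collections import defaultdict

records = {
    1: ["reactor", "safety"],
    2: ["reactor", "fuel"],
    3: ["safety", "dosimetry"],
}

inverse_file = defaultdict(list)
for number, descriptors in records.items():
    for d in descriptors:
        inverse_file[d].append(number)

# Selective access: records indexed by both "reactor" and "safety".
hits = set(inverse_file["reactor"]) & set(inverse_file["safety"])
print(sorted(hits))  # -> [1]
```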

  4. Economic missions. Synthetic file: the petroleum sector in Brazil (exploration and production); the refining activity in Brazil; natural gas in Brazil: a fragile market, inferior to forecasts

    International Nuclear Information System (INIS)

    Anon.

    2002-01-01

    This dossier prepared by the economic mission of the French embassy in Brazil makes a synthesis of the exploration-production and refining activities of the petroleum industry, and of the natural gas distribution market in Brazil: oil reserves and production, Petrobras company, partnership agreements with Petrobras, legal aspects, concessions, projects financing, refining capacity, refinery projects in progress or under study, para-petroleum market perspectives and opportunities, natural gas market development, pipelines network, gas utilities, privatization and foreign participation, lack of expertise and of gas infrastructures and equipments. (J.S.)

  5. Evaluation of resonance parameters of Mo, Tc, Te, Ba, La, Ce, Pr, Nd, Pm, Sm and Eu isotopes for JENDL-2 fission product file

    International Nuclear Information System (INIS)

    Kikuchi, Yasuyuki; Togawa, Orihiko; Nakagawa, Tsuneo

    1986-03-01

    The resonance parameters of 39 fission product nuclides have been evaluated. The present work is a part of the evaluation of 100 fission product nuclides for JENDL-2 by the Japanese Nuclear Data Committee. All the available experimental data were collected, stored in the REPSTOR system and compared with one another. The evaluation was made on the basis of the experimental data. A precise description of the evaluation is given in this report. The presently evaluated resonance parameters are tabulated in the Appendix together with the experimental data. (author)

  6. Design and development of a computer-based continuous monitor for the determination of the short-lived decay products of radon and thoron

    Energy Technology Data Exchange (ETDEWEB)

    Bigu, J [Department of Energy, Mines and Resources, Elliot Lake, Ontario (Canada). Elliot Lake Lab.]; Raz, R; Golden, K; Dominguez, P [Alpha-NUCLEAR, Toronto, Ontario (Canada)]

    1984-08-15

    A portable, rugged monitor has been designed and built for measuring the short-lived decay products of radon and thoron. The monitor is computer-based and employs a continuous filter strip which can be advanced at programmable time intervals to allow unattended continuous operation with automatic sampling, analysis and recording of radiation levels. Radionuclide analysis is carried out by two silicon diffused-junction alpha-detectors and electronic circuitry with multichannel spectral analysis capabilities. Standard gross α-count methods and α-spectroscopy methods can easily be implemented. The built-in computer performs a variety of operations via a specially designed interface module, including control and data recording functions, and computations, program storage and display functions. Programs and data are stored in the built-in cassette tape drive, and the computer-integrated CRT display and keyboard allow simple, prompted menu-type operation of standard software. Graphical presentation of α-spectra can be shown on the computer CRT and printed when required on the computer's built-in thermal printer. In addition to implementing the specially developed radionuclide analysis software, the operator can interact with and modify existing software, and program new ones, through BASIC language programming, or employ the computer in a totally unrelated, general purpose mode. Although the monitor is ideally suited for environmental radon (thoron) daughter monitoring, it could also be used in the determination of other airborne radionuclides provided adequate analytical procedures are developed or included in the already existing computer software.

  7. Design and development of a computer-based continuous monitor for the determination of the short-lived decay products of radon and thoron

    International Nuclear Information System (INIS)

    Bigu, J.

    1984-01-01

    A portable, rugged monitor has been designed and built for measuring the short-lived decay products of radon and thoron. The monitor is computer-based and employs a continuous filter strip which can be advanced at programmable time intervals to allow unattended continuous operation with automatic sampling, analysis and recording of radiation levels. Radionuclide analysis is carried out by two silicon diffused-junction alpha-detectors and electronic circuitry with multichannel spectral analysis capabilities. Standard gross α-count methods and α-spectroscopy methods can easily be implemented. The built-in computer performs a variety of operations via a specially designed interface module, including control and data recording functions, and computations, program storage and display functions. Programs and data are stored in the built-in cassette tape drive, and the computer-integrated CRT display and keyboard allow simple, prompted menu-type operation of standard software. Graphical presentation of α-spectra can be shown on the computer CRT and printed when required on the computer's built-in thermal printer. In addition to implementing the specially developed radionuclide analysis software, the operator can interact with and modify existing software, and program new ones, through BASIC language programming, or employ the computer in a totally unrelated, general purpose mode. Although the monitor is ideally suited for environmental radon (thoron) daughter monitoring, it could also be used in the determination of other airborne radionuclides provided adequate analytical procedures are developed or included in the already existing computer software. (orig.)

  8. Improving the mixing performances of rice straw anaerobic digestion for higher biogas production by computational fluid dynamics (CFD) simulation.

    Science.gov (United States)

    Shen, Fei; Tian, Libin; Yuan, Hairong; Pang, Yunzhi; Chen, Shulin; Zou, Dexun; Zhu, Baoning; Liu, Yanping; Li, Xiujin

    2013-10-01

    As a lignocellulose-based substrate for anaerobic digestion, rice straw is characterized by low density, high water absorbability, and poor fluidity. Its mixing performances in digestion are completely different from traditional substrates such as animal manures. Computational fluid dynamics (CFD) simulation was employed to investigate mixing performances and determine suitable stirring parameters for efficient biogas production from rice straw. The results from CFD simulation were applied in the anaerobic digestion tests to further investigate their reliability. The results indicated that the mixing performances could be improved by triple impellers with pitched blades, and complete mixing was easily achieved at a stirring rate of 80 rpm, as compared to 20-60 rpm. However, mixing could not be significantly improved when the stirring rate was further increased from 80 to 160 rpm. The simulation results agreed well with the experimental results. The determined mixing parameters could achieve the highest biogas yield of 370 mL (g TS)⁻¹ (729 mL (g TSdigested)⁻¹) and 431 mL (g TS)⁻¹ (632 mL (g TSdigested)⁻¹) with the shortest technical digestion time (T80) of 46 days. The results obtained in this work could provide useful guides for the design and operation of biogas plants using rice straw as substrates.
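    The T80 metric quoted above is the first day on which the cumulative biogas yield reaches 80% of the final yield; a minimal sketch over an invented daily-yield series:

```python
# Find T80 from a (toy) daily biogas yield curve, mL (g TS)^-1 per day.
daily_yield = [2, 5, 9, 12, 10, 8, 6, 4, 2, 1]  # invented data

total = sum(daily_yield)
cumulative = 0.0
for day, y in enumerate(daily_yield, start=1):
    cumulative += y
    if cumulative >= 0.8 * total:
        print(f"T80 = day {day}")
        break
```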

  9. Computing the sparse matrix vector product using block-based kernels without zero padding on processors with AVX-512 instructions

    Directory of Open Access Journals (Sweden)

    Bérenger Bramas

    2018-04-01

    Full Text Available The sparse matrix-vector product (SpMV is a fundamental operation in many scientific applications from various fields. The High Performance Computing (HPC community has therefore continuously invested a lot of effort to provide an efficient SpMV kernel on modern CPU architectures. Although it has been shown that block-based kernels help to achieve high performance, they are difficult to use in practice because of the zero padding they require. In the current paper, we propose new kernels using the AVX-512 instruction set, which makes it possible to use a blocking scheme without any zero padding in the matrix memory storage. We describe mask-based sparse matrix formats and their corresponding SpMV kernels highly optimized in assembly language. Considering that the optimal blocking size depends on the matrix, we also provide a method to predict the best kernel to be used utilizing a simple interpolation of results from previous executions. We compare the performance of our approach to that of the Intel MKL CSR kernel and the CSR5 open-source package on a set of standard benchmark matrices. We show that we can achieve significant improvements in many cases, both for sequential and for parallel executions. Finally, we provide the corresponding code in an open source library, called SPC5.
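    For reference, the scalar CSR sparse matrix-vector product that such kernels accelerate looks as follows; this plain-Python sketch shows the data layout only and does not attempt the paper's AVX-512 masked, block-based vectorisation:

```python
# y = A @ x with A stored in Compressed Sparse Row (CSR) format.
def spmv_csr(values, col_idx, row_ptr, x):
    y = [0.0] * (len(row_ptr) - 1)
    for row in range(len(y)):
        for k in range(row_ptr[row], row_ptr[row + 1]):
            y[row] += values[k] * x[col_idx[k]]
    return y

# 3x3 example: [[4, 0, 1], [0, 2, 0], [3, 0, 5]]
values = [4.0, 1.0, 2.0, 3.0, 5.0]
col_idx = [0, 2, 1, 0, 2]
row_ptr = [0, 2, 3, 5]
print(spmv_csr(values, col_idx, row_ptr, [1.0, 1.0, 1.0]))  # -> [5.0, 2.0, 8.0]
```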

  10. A Well-Mixed Computational Model for Estimating Room Air Levels of Selected Constituents from E-Vapor Product Use

    Directory of Open Access Journals (Sweden)

    Ali A. Rostami

    2016-08-01

    Full Text Available Concerns have been raised in the literature about the potential for secondhand exposure from e-vapor product (EVP) use. It would be difficult to experimentally determine the impact of various factors on secondhand exposure, including, but not limited to, room characteristics (indoor space size, ventilation rate), device specifications (aerosol mass delivery, e-liquid composition), and use behavior (number of users and usage frequency). Therefore, a well-mixed computational model was developed to estimate the indoor levels of constituents from EVPs under a variety of conditions. The model is based on physical and thermodynamic interactions between aerosol, vapor, and air, similar to indoor air models referred to by the Environmental Protection Agency. The model results agree well with measured indoor air levels of nicotine from two sources: smoking machine-generated aerosol and aerosol exhaled from EVP use. Sensitivity analysis indicated that increasing the air exchange rate reduces the room air level of constituents, as more material is carried away. The effect of the amount of aerosol released into the space due to variability in exhalation was also evaluated. The model can estimate the room air level of constituents as a function of time, which may be used to assess the level of non-user exposure over time.
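    A well-mixed balance of the class described above can be written as dC/dt = E/V − aC for emission rate E, room volume V and air exchange rate a; the sketch below uses the analytic solution with illustrative parameter values (not those of the study):

```python
# Well-mixed room concentration over time; all parameter values are invented.
import math

E = 0.05         # constituent emission into room air, mg/min (assumed)
V = 50.0         # room volume, m^3
a = 0.5 / 60.0   # air exchange rate: 0.5 per hour, expressed per minute

def concentration(t_min, c0=0.0):
    """Analytic solution of dC/dt = E/V - a*C, in mg/m^3."""
    steady = E / (V * a)
    return steady + (c0 - steady) * math.exp(-a * t_min)

for t in (10, 60, 360):
    print(f"t = {t:4d} min: C = {concentration(t):.4f} mg/m^3")
```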

  11. Computational fluid dynamics tracking of UF6 reaction products release into a gaseous diffusion plant cell housing

    International Nuclear Information System (INIS)

    Wendel, M.W.; Chen, N.C.J.; Kim, S.H.; Taleyarkhan, R.P.; Keith, K.D.; Schmidt, R.W.

    1996-01-01

    A three-dimensional (3-D) computational fluid dynamics (CFD) model has been developed using CFDS-FLOW3D Version 3.3 to model the transport of aerosol products formed during a release of uranium hexafluoride (UF 6 ) into a gaseous diffusion plant (GDP) process building. As part of a facility-wide safety evaluation, a one-dimensional (1-D) analysis of aerosol/vapor transport following such an hypothesized severe accident is being performed. The objective of this study is to supplement the 1-D analysis with more detailed 3-D results. Specifically, the goal is to quantify the distribution of aerosol passing out of the process building during the hypothetical accident. This work demonstrates a useful role for CFD in large 3-D problems, where some experimental data are available for calibrating key parameters and the desired results are global (total time-integrated aerosol flow rates across a few boundary surfaces) as opposed to local velocities, temperatures, or heat transfer coefficients

  12. Is importing second-hand products a good thing? The cases of computers and tires in Cambodia

    International Nuclear Information System (INIS)

    Chanthy, Lay; Nitivattananon, Vilas

    2011-01-01

    Is importing second-hand products (SHPs) good for Cambodia? To answer this question, one must seriously consider environmental and social effects. The main objective of this study is to identify and assess the economic, social, and environmental impacts of imported SHPs to determine whether or not Cambodia benefits. Imported second-hand computers (SHPCs) and second-hand tires (SHTs) were selected as cases for the study. The study used a scaling checklist to identify significant impacts of these two imported items. Significant impacts were ranked and rated into a single value (score) for integration. Integrated impact assessment showed that imported SHPCs create a very small positive impact (+ 0.1 of + 5) and imported SHTs generate a large negative impact (- 2.83 of - 5). These scores are mainly the result of environmental impact, predominantly waste issues. Thus, current imports of SHPCs and SHTs do not really benefit Cambodia, but instead cause serious environmental problems from their waste issues. The import serves as a channel to transfer waste into developing countries.

  13. 77 FR 4558 - Combined Notice of Filings #1

    Science.gov (United States)

    2012-01-30

    ... Numbers: EC12-62-000. Applicants: La Paloma Generating Company, LLC, Merrill Lynch Credit Products, LLC..., LLC and La Paloma Generating Company, LLC. Filed Date: 1/20/12. Accession Number: 20120120-5257...

  14. The Improvement and Performance of Mobile Environment Using Both Cloud and Text Computing

    OpenAIRE

    S.Saravana Kumar; J.Lakshmi Priya; P.Hannah Jennifer; N.Jeff Monica; Fathima

    2013-01-01

    This research paper presents a design model for a file sharing system for ubiquitous mobile devices using both cloud and text computing. File sharing is one of the rationales for computer networks, with increasing demand for file sharing applications and technologies in small and large enterprise networks and on the Internet. File transfer is an important process in any form of computing, as we need to share data across...

  15. Fuel behaviour and fission product release under realistic hydrogen conditions comparisons between HEVA 06 test results and Vulcain computations

    International Nuclear Information System (INIS)

    Dumas, J.M.; Lhiaubet, G.

    1989-07-01

    The HEVA 06 test was designed to simulate the conditions existing at the time when fission products are released from irradiated fuel under hydrogen conditions occurring in a PWR core at low pressure. The test conditions were defined from results provided by the core degradation module of the ESCADRE system (1): VULCAIN. This computer code has recently been used to analyse the early core degradation of a 900 MWe PWR in the AF accident sequence (as defined in WASH-1400, USNRC - 1975). In this scenario, the core would begin to uncover about one day after scram with the system pressure at about 0.4 MPa. The fission product release starts 70 minutes after core dewatering. The F.P. are transferred to the core outlet in an increasingly hydrogen-rich steam atmosphere. The carrier gas is nearly pure hydrogen in the time period 100-130 minutes after core uncovering. A large release of F.P. is predicted in the upper part of the core when the steam starvation occurs. At that time, two thirds of the cladding have been oxidised on average. Before each HEVA test a fuel sample with a burn-up of 36 GWd/tU is reirradiated in order to observe the release of short-lived fission products. A pre-oxidation was first conducted in the HEVA 06 test at a temperature of 1300 °C and controlled to reach a 2/3 cladding oxidation state. Then the steam was progressively replaced by hydrogen and a heat-up rate of 1.5 °C/s was applied to reach a temperature of 2100 °C. The fuel was maintained at this temperature for half an hour in hydrogen. The volatile F.P. release kinetics were observed by on-line gamma spectrometry. Pre-test calculations of F.P. release kinetics performed with the EMIS module, based on the CORSOR models (3), are compared with the test results. Measured releases of cesium and iodine are markedly lower than those predicted. Axial and radial F.P. distributions in the fuel pellets are available from gamma tomography measurements performed after the test. Tellurium seems

  16. Download this PDF file

    African Journals Online (AJOL)

    5, May 1923, p. 287. (South African Military Schools) p 287. CGS Box 231, File 31/0/2. ... One gains the impression that the sphere ... tions, Anthropology, Sociology and Man Management. ... of the word, possesses personality and initiative, ...

  17. MMLEADS Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare-Medicaid Linked Enrollee Analytic Data Source (MMLEADS) Public Use File (PUF) contains demographic, enrollment, condition prevalence, utilization, and...

  18. Hospital Service Area File

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file is derived from the calendar year inpatient claims data. The records contain number of discharges, length of stay, and total charges summarized by provider...

  19. Patient Treatment File (PTF)

    Data.gov (United States)

    Department of Veterans Affairs — This database is part of the National Medical Information System (NMIS). The Patient Treatment File (PTF) contains a record for each inpatient care episode provided...

  20. USEEIO Satellite Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — These files contain the environmental data as particular emissions or resources associated with a BEA sectors that are used in the USEEIO model. They are organized...

  1. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file contains data on characteristics of hospitals and other types of healthcare facilities, including the name and address of the facility and the type of...

  2. Download this PDF file

    African Journals Online (AJOL)

    countries quite a number of distance education institutions and programmes are more likely to be ... The Open University of Tanzania (OUT), (Ministry of Higher Education, Science and ..... (1991) Comic Relief Funding file. BAI, London, 1st ...

  3. Determining the explosion effects on the Gasbuggy reservoir from computer simulation of the postshot gas production history

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, Leo A [El Paso Natural Gas Company (United States)

    1970-05-01

    Analysis of the gas production data from Gasbuggy to deduce reservoir properties outside the chimney is complicated by the large gas storage volume in the chimney because the gas flow from the surrounding reservoir into the chimney cannot be directly measured. This problem was overcome by developing a chimney volume factor F (M²CF/PSI) based upon analysis of rapid drawdowns during the production tests. The chimney volume factor was in turn used to construct the time history of the required influx of gas into the chimney from the surrounding reservoir. The most probable value of F to describe the chimney is found to be 0.150 M²CF/PSI. Postulated models of the reservoir properties outside the chimney are examined by calculating the pressure distribution and flow of gas through the reservoir with the experimentally observed chimney pressure history applied to the cavity wall. The calculated influx from the reservoir into the chimney is then compared to the required influx and the calculated pressure at a radius of 300 feet is compared to the observed pressures in a shut-in satellite well (GB-2RS) which intersects the gas-bearing formation 300 feet from the center of the chimney. A description of the mathematics in the computer program used to perform the calculations is given. Gas flow for a radial model wherein permeability and porosity are uniform through the gas producing sand outside the chimney was calculated for several values of permeability. These calculations indicated that for the first drawdown test (July 1968) the permeability-producing height product (kh) was in the region of 15 to 30 millidarcy-feet (md-ft) and that after several months of testing, the effective kh had dropped to less than 8 md-ft. Calculations wherein (1) the permeability decreases from the chimney out to the 'fracture' radius, and (2) an increased production height is used near the chimney, match the data better than the simple radial model. Reasonable fits to the data for

  4. Determining the explosion effects on the Gasbuggy reservoir from computer simulation of the postshot gas production history

    International Nuclear Information System (INIS)

    Rogers, Leo A.

    1970-01-01

    Analysis of the gas production data from Gasbuggy to deduce reservoir properties outside the chimney is complicated by the large gas storage volume in the chimney because the gas flow from the surrounding reservoir into the chimney cannot be directly measured. This problem was overcome by developing a chimney volume factor F (M²CF/PSI) based upon analysis of rapid drawdowns during the production tests. The chimney volume factor was in turn used to construct the time history of the required influx of gas into the chimney from the surrounding reservoir. The most probable value of F to describe the chimney is found to be 0.150 M²CF/PSI. Postulated models of the reservoir properties outside the chimney are examined by calculating the pressure distribution and flow of gas through the reservoir with the experimentally observed chimney pressure history applied to the cavity wall. The calculated influx from the reservoir into the chimney is then compared to the required influx and the calculated pressure at a radius of 300 feet is compared to the observed pressures in a shut-in satellite well (GB-2RS) which intersects the gas-bearing formation 300 feet from the center of the chimney. A description of the mathematics in the computer program used to perform the calculations is given. Gas flow for a radial model wherein permeability and porosity are uniform through the gas producing sand outside the chimney was calculated for several values of permeability. These calculations indicated that for the first drawdown test (July 1968) the permeability-producing height product (kh) was in the region of 15 to 30 millidarcy-feet (md-ft) and that after several months of testing, the effective kh had dropped to less than 8 md-ft. Calculations wherein (1) the permeability decreases from the chimney out to the 'fracture' radius, and (2) an increased production height is used near the chimney, match the data better than the simple radial model. Reasonable fits to the data for the

  5. JENDL Dosimetry File

    International Nuclear Information System (INIS)

    Nakazawa, Masaharu; Iguchi, Tetsuo; Kobayashi, Katsuhei; Iwasaki, Shin; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo.

    1992-03-01

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d, n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form. (author) 76 refs
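    The integral tests above compare spectrum-averaged cross sections; a minimal sketch of that quantity, σ̄ = Σ_g φ_g σ_g / Σ_g φ_g, over invented three-group data:

```python
# Spectrum-averaged cross section from group-wise data (toy values).
phi = [1.0e12, 5.0e11, 1.0e11]   # group fluxes, n cm^-2 s^-1 (invented)
sigma = [0.10, 0.45, 1.20]       # group cross sections, barns (invented)

sigma_avg = sum(p * s for p, s in zip(phi, sigma)) / sum(phi)
print(f"spectrum-averaged cross section = {sigma_avg:.3f} b")
```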

  6. JENDL Dosimetry File

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Masaharu; Iguchi, Tetsuo [Tokyo Univ. (Japan). Faculty of Engineering]; Kobayashi, Katsuhei [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.]; Iwasaki, Shin [Tohoku Univ., Sendai (Japan). Faculty of Engineering]; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]

    1992-03-15

    The JENDL Dosimetry File based on JENDL-3 was compiled, and integral tests of its cross section data were performed, by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV, in both point-wise and group-wise files in the ENDF-5 format. In order to confirm the reliability of the data, several integral tests were carried out: comparison with the data in IRDF-85 and with average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d,n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85, but there are some problems to be improved in the future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form.

  7. The effect of a feedback signal in a computer mouse on hovering behaviour, productivity, comfort and usability in a field study

    NARCIS (Netherlands)

    Kraker, H. de; Korte, E. de; Mil, F. van; Rijs, B.; Bongers, P.

    2008-01-01

    The aim of this study was to determine the effect of a tactile feedback signal on hovering behaviour, productivity, usability and comfort after 1 week of using an experimental mouse. In a randomized controlled trial, a regular computer mouse was compared to a newly developed mouse with a tactile,

  8. The combinatorics computation for Casimir operators of the symplectic Lie algebra and the application for determining the center of the enveloping algebra of a semidirect product

    International Nuclear Information System (INIS)

    Le Van Hop.

    1989-12-01

    The combinatorics computation is used to describe the Casimir operators of the symplectic Lie algebra. This result is applied to determine the center of the enveloping algebra of the semidirect product of the Heisenberg Lie algebra and the symplectic Lie algebra. (author). 10 refs
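
    For orientation, here is a minimal LaTeX sketch of the generic quadratic Casimir element of a semisimple Lie algebra; this is the standard construction, not the report's combinatorial formula for the symplectic case.

```latex
% Generic quadratic Casimir element (standard construction): pick a basis
% {X_i} of a semisimple Lie algebra g and the dual basis {X^i} with respect
% to the Killing form B, so that B(X_i, X^j) = \delta_i^j. Then
\[
  C_2 \;=\; \sum_i X_i \, X^i \;\in\; Z\bigl(U(\mathfrak{g})\bigr),
\]
% i.e. C_2 commutes with every element of g and hence lies in the center
% of the universal enveloping algebra, as the Casimir operators above do.
```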

  9. Methods, Devices and Computer Program Products Providing for Establishing a Model for Emulating a Physical Quantity Which Depends on at Least One Input Parameter, and Use Thereof

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention proposes methods, devices and computer program products. To this extent, there is defined a set X including N distinct parameter values x_i for at least one input parameter x, N being an integer greater than or equal to 1, first measured the physical quantity Pm1 for each...

  10. Development of data file system for cardiovascular nuclear medicine

    International Nuclear Information System (INIS)

    Hayashida, Kohei; Nishimura, Tsunehiko; Uehara, Toshiisa; Nisawa, Yoshifumi.

    1985-01-01

    A computer-assisted filing system for storing and processing data from cardiac pool scintigraphy and myocardial scintigraphy has been developed. Individual patient data are stored, together with the patient's identification number (ID), on floppy discs in the order in which the scintigraphic examinations are received. Data for 900 patients can be stored per floppy disc. Scintigraphic findings can be output in a uniform file format, which also serves as a reporting format. Filed patient data can be output or retrieved by examination, disease code, or ID. The system appears well suited to prospective studies of patients with cardiovascular diseases. (Namekawa, K.)
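
    A minimal in-memory sketch of the filing scheme the abstract describes (store by ID in examination order; retrieve by examination, disease code, or ID); all class and field names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ScintigraphyRecord:
    patient_id: str          # identification number (ID)
    exam: str                # e.g. "cardiac pool" or "myocardial"
    disease_code: str
    findings: str

class CardiacFile:
    """Toy model of the filing scheme: records are appended in order of
    examination and can be retrieved by exam type, disease code, or ID."""
    def __init__(self):
        self._records: list[ScintigraphyRecord] = []

    def store(self, record: ScintigraphyRecord) -> None:
        self._records.append(record)  # successive storage, as on the floppy discs

    def by_id(self, patient_id: str):
        return [r for r in self._records if r.patient_id == patient_id]

    def by_exam(self, exam: str):
        return [r for r in self._records if r.exam == exam]

    def by_disease(self, code: str):
        return [r for r in self._records if r.disease_code == code]

f = CardiacFile()
f.store(ScintigraphyRecord("0001", "myocardial", "I25", "perfusion defect, inferior wall"))
print(f.by_disease("I25")[0].findings)
```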

  11. Dose field simulation for products irradiated by electron beams: formulation of the problem and its step by step solution with EGS4 computer code

    International Nuclear Information System (INIS)

    Rakhno, I.L.; Roginets, L.P.

    1999-01-01

    Radiation treatment of products with an electron beam normally requires numerous, costly measurements to choose the optimal treatment mode. Direct simulation of the radiation treatment with the EGS4 computer code fails to describe such measurement results correctly. In this paper a multi-step radiation treatment planning procedure is suggested, which consists of fitting the EGS4 simulation results to reference measurement results and then using the fitted electron-beam parameters (among others) in subsequent computer simulations. It is shown that the fitting procedure must be performed separately for each material or product type. The suggested procedure makes it possible to replace measurements with computer simulations and therefore significantly reduces the time and money they require. (author)
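
    The multi-step procedure amounts to a parameter fit of simulated to measured dose followed by reuse of the fitted parameters. A schematic sketch, with a crude analytic stand-in for the actual EGS4 run and all names and numbers hypothetical:

```python
import numpy as np
from scipy.optimize import least_squares

def simulate_depth_dose(beam_params, depths_cm):
    """Stand-in for an EGS4 run: in practice this would launch Monte Carlo
    transport with the given electron-beam parameters and return the
    simulated dose at the reference depths. A crude analytic depth-dose
    shape keeps the sketch self-contained."""
    energy_mev, fluence = beam_params
    r_p = 0.5 * energy_mev                      # rough practical range, cm
    return fluence * np.clip(1.0 - depths_cm / r_p, 0.0, None)

def fit_beam_parameters(depths_cm, measured_dose, initial_guess):
    """Step 1 of the multi-step procedure: adjust the effective beam
    parameters until the simulation reproduces the reference dosimetry."""
    resid = lambda p: simulate_depth_dose(p, depths_cm) - measured_dose
    return least_squares(resid, initial_guess).x

depths = np.array([0.5, 1.0, 2.0, 3.0, 4.0])     # cm, hypothetical
measured = np.array([9.0, 8.0, 6.1, 4.0, 2.1])   # relative dose, hypothetical
fitted = fit_beam_parameters(depths, measured, initial_guess=[10.0, 10.0])
print("fitted (energy MeV, fluence):", fitted)    # reused in later simulations
```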

  12. Tuning HDF5 subfiling performance on parallel file systems

    Energy Technology Data Exchange (ETDEWEB)

    Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chaarawi, Mohamad [Intel Corp. (United States); Koziol, Quincey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mainzer, John [The HDF Group (United States); Willmore, Frank [The HDF Group (United States)

    2017-05-12

    Subfiling is a technique used on parallel file systems to reduce locking and contention issues when multiple compute nodes interact with the same storage target node. Subfiling provides a compromise between the single-shared-file approach, which instigates lock contention problems on parallel file systems, and having one file per process, which generates a massive and unmanageable number of files. In this paper, we evaluate and tune the performance of the recently implemented subfiling feature in HDF5. Specifically, we explain the implementation strategy of the subfiling feature in HDF5, provide examples of using the feature, and evaluate and tune parallel I/O performance of this feature on parallel file systems of the Cray XC40 system at NERSC (Cori), which include a burst buffer storage and a Lustre disk-based storage. We also evaluate I/O performance on the Cray XC30 system, Edison, at NERSC. Our results show performance benefits of 1.2X to 6X with subfiling compared to writing a single shared HDF5 file. We present our exploration of configurations, such as the number of subfiles and the number of Lustre storage targets used to store files, as optimization parameters for obtaining superior I/O performance. Based on this exploration, we discuss recommendations for achieving good I/O performance as well as limitations of the subfiling feature.
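
    The subfiling idea itself (independent of the HDF5 API used in the paper) can be sketched with mpi4py and parallel h5py: ranks are split into groups and each group writes its own shared file, so contention is confined to a group. File names and the group size below are illustrative, and this is not the HDF5 subfiling feature's interface; it requires h5py built against parallel HDF5.

```python
from mpi4py import MPI
import h5py
import numpy as np

comm = MPI.COMM_WORLD
ranks_per_subfile = 4                           # illustrative group size
color = comm.rank // ranks_per_subfile          # which subfile this rank uses
subcomm = comm.Split(color=color, key=comm.rank)

data = np.full(1024, comm.rank, dtype='f8')     # this rank's slab
with h5py.File(f"output.subfile_{color}.h5", "w",
               driver="mpio", comm=subcomm) as f:
    dset = f.create_dataset("slab", (subcomm.size, data.size), dtype='f8')
    dset[subcomm.rank, :] = data                # contention only within the group
```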

  13. Geospatial Method for Computing Supplemental Multi-Decadal U.S. Coastal Land-Use and Land-Cover Classification Products, Using Landsat Data and C-CAP Products

    Science.gov (United States)

    Spruce, J. P.; Smoot, James; Ellis, Jean; Hilbert, Kent; Swann, Roberta

    2012-01-01

    This paper discusses the development and implementation of a geospatial data processing method and multi-decadal Landsat time series for computing general coastal U.S. land-use and land-cover (LULC) classifications and change products, consisting of seven classes (water, barren, upland herbaceous, non-woody wetland, woody upland, woody wetland, and urban). Use of this approach extends the observational period of the NOAA-generated Coastal Change and Analysis Program (C-CAP) products by almost two decades, assuming the availability of one cloud-free Landsat scene from any season for each targeted year. The Mobile Bay region in Alabama was used as a study area to develop, demonstrate, and validate the method, which was applied to derive LULC products for nine dates at approximately five-year intervals across a 34-year time span, using a single date of data for each classification, with forests in leaf-on, leaf-off, or mixed senescent condition. Classifications were computed and refined using decision rules in conjunction with unsupervised classification of Landsat data and C-CAP value-added products. Each classification's overall accuracy was assessed by comparing stratified random locations to available reference data, including higher-spatial-resolution satellite and aerial imagery, field survey data, and raw Landsat RGBs. Overall classification accuracies ranged from 83 to 91%, with overall Kappa statistics ranging from 0.78 to 0.89. The accuracies are comparable to those of similar, generalized LULC products derived from C-CAP data. The Landsat MSS-based LULC product accuracies are similar to those from Landsat TM or ETM+ data. Accurate classifications were computed for all nine dates, yielding effective results regardless of season. This classification method yielded products that were used to compute LULC change products via additive GIS overlay techniques.
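
    The reported accuracy figures rest on standard confusion-matrix arithmetic. A short sketch of overall accuracy and Cohen's kappa, computed from a hypothetical matrix:

```python
import numpy as np

def overall_accuracy_and_kappa(confusion):
    """The confusion matrix is built from stratified random reference
    locations (rows = reference, cols = mapped); overall accuracy and
    Cohen's kappa follow directly from it."""
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    p_o = np.trace(c) / n                                # observed agreement
    p_e = (c.sum(axis=0) * c.sum(axis=1)).sum() / n**2   # chance agreement
    return p_o, (p_o - p_e) / (1.0 - p_e)

# Hypothetical 3-class confusion matrix, for illustration only
cm = [[50, 3, 2],
      [4, 45, 6],
      [1, 5, 44]]
acc, kappa = overall_accuracy_and_kappa(cm)
print(f"overall accuracy {acc:.2f}, kappa {kappa:.2f}")
```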

  14. Renewal-anomalous-heterogeneous files

    International Nuclear Information System (INIS)

    Flomenbom, Ophir

    2010-01-01

    Renewal-anomalous-heterogeneous files are solved. A simple file is made of Brownian hard spheres that diffuse stochastically in an effective 1D channel. Generally, Brownian files are heterogeneous: the spheres' diffusion coefficients are distributed and the initial spheres' density is non-uniform. In renewal-anomalous files, the distribution of waiting times for individual jumps is not exponential as in Brownian files, yet obeys ψ_α(t) ∼ t^(−1−α), 0 < α < 1. The mean square displacement (MSD) of a tagged sphere, ⟨r²⟩, obeys ⟨r²⟩ ∼ (⟨r²⟩_nrml)^α, where ⟨r²⟩_nrml is the MSD in the corresponding Brownian file. This scaling is an outcome of an exact relation (derived here) connecting probability density functions of Brownian files and renewal-anomalous files. It is also shown that non-renewal-anomalous files are slower than the corresponding renewal ones.

  15. Cardiac-Specific Conversion Factors to Estimate Radiation Effective Dose From Dose-Length Product in Computed Tomography.

    Science.gov (United States)

    Trattner, Sigal; Halliburton, Sandra; Thompson, Carla M; Xu, Yanping; Chelliah, Anjali; Jambawalikar, Sachin R; Peng, Boyu; Peters, M Robert; Jacobs, Jill E; Ghesani, Munir; Jang, James J; Al-Khalidi, Hussein; Einstein, Andrew J

    2018-01-01

    This study sought to determine updated conversion factors (k-factors) that would enable accurate estimation of radiation effective dose (ED) for coronary computed tomography angiography (CTA) and calcium scoring performed on 12 contemporary scanner models and current clinical cardiac protocols, and to compare these methods to the standard chest k-factor of 0.014 mSv·mGy⁻¹·cm⁻¹. Accurate estimation of ED from cardiac CT scans is essential to meaningfully compare the benefits and risks of different cardiac imaging strategies and optimize test and protocol selection. Presently, ED from cardiac CT is generally estimated by multiplying a scanner-reported parameter, the dose-length product, by a k-factor which was determined for noncardiac chest CT, using single-slice scanners and a superseded definition of ED. Metal-oxide-semiconductor field-effect transistor radiation detectors were positioned in organs of anthropomorphic phantoms, which were scanned using all cardiac protocols, 120 clinical protocols in total, on 12 CT scanners representing the spectrum of scanners from 5 manufacturers (GE, Hitachi, Philips, Siemens, Toshiba). Organ doses were determined for each protocol, and ED was calculated as defined in International Commission on Radiological Protection Publication 103. Effective doses and scanner-reported dose-length products were used to determine k-factors for each scanner model and protocol. k-Factors averaged 0.026 mSv·mGy⁻¹·cm⁻¹ (95% confidence interval: 0.0258 to 0.0266) and ranged between 0.020 and 0.035 mSv·mGy⁻¹·cm⁻¹. The standard chest k-factor underestimates ED by an average of 46%, ranging from 30% to 60%, depending on scanner, mode, and tube potential. Factors were higher for prospective axial versus retrospective helical scan modes, calcium scoring versus coronary CTA, and higher (100 to 120 kV) versus lower (80 kV) tube potential, and varied among scanner models (range of average k-factors: 0.0229 to 0.0277 mSv·mGy⁻¹·cm⁻¹). Cardiac k
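
    A worked example of the dose-length product method under discussion, using a hypothetical DLP and the two k-factors quoted above:

```python
# Effective dose = scanner-reported DLP x conversion (k-)factor.
dlp_mgy_cm = 800.0                # hypothetical dose-length product, mGy.cm

ed_cardiac = dlp_mgy_cm * 0.026   # cardiac-specific average k-factor
ed_chest   = dlp_mgy_cm * 0.014   # standard chest k-factor

print(f"cardiac k: {ed_cardiac:.1f} mSv, chest k: {ed_chest:.1f} mSv")
# cardiac k: 20.8 mSv, chest k: 11.2 mSv -> the chest factor underestimates
# ED by (20.8 - 11.2) / 20.8, about 46%, matching the average reported above.
```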

  16. Computational analysis of modern HTGR fuel performance and fission product release during the HFR-EU1 irradiation experiment

    Energy Technology Data Exchange (ETDEWEB)

    Verfondern, Karl, E-mail: k.verfondern@fz-juelich.de [Research Center Jülich, Institute of Energy and Climate Research, 52425 Jülich (Germany); Xhonneux, André, E-mail: xhonneux@lrst.rwth-aachen.de [Research Center Jülich, Institute of Energy and Climate Research, 52425 Jülich (Germany); Nabielek, Heinz, E-mail: heinznabielek@me.com [Research Center Jülich, Monschauerstrasse 61, 52355 Düren (Germany); Allelein, Hans-Josef, E-mail: h.j.allelein@fz-juelich.de [Research Center Jülich, Institute of Energy and Climate Research, 52425 Jülich (Germany); RWTH Aachen, Chair for Reactor Safety and Reactor Technology, 52072 Aachen (Germany)

    2014-07-01

    Highlights: • HFR-EU1 irradiation test demonstrates high quality of HTGR spherical fuel elements. • Irradiation performance is in good agreement with German fuel performance modeling. • International benchmark exercise expected first particle to fail at ∼13–17% FIMA. • EOL silver release is predicted to be in the percentage range. • EOL cesium and strontium are expected to remain at a low level.

    Abstract: Various countries engaged in the development and fabrication of modern HTGR fuel have initiated activities of modeling the fuel and fission product release behavior, with the aim of predicting fuel performance under HTGR operating and accident conditions. Verification and validation studies are conducted by code-to-code benchmarking and code-to-experiment comparisons as part of international exercises. The methodology developed in Germany since the 1980s provides valuable and efficient tools to describe fission product release from spherical fuel elements and TRISO fuel performance under given conditions. Continued application to new results of irradiation and accident simulation testing demonstrates the appropriateness of the models in terms of a conservative estimation of the source term, as part of interactions with HTGR licensing authorities. Within the European irradiation testing program for HTGR fuel, and as part of the former EU RAPHAEL project, the HFR-EU1 irradiation experiment explores the potential for high performance of the presently existing German and newly produced Chinese fuel spheres under defined conditions up to high burnups. The fuel irradiation was completed in 2010. Test samples are prepared for further postirradiation examinations (PIE), including heatup simulation testing in the KÜFA-II furnace at the JRC-ITU, Karlsruhe, to be conducted within the on-going ARCHER Project of the European Commission. The paper will describe the application of the German computer models to the HFR-EU1 irradiation test and

  17. Challenges and considerations for the design and production of a purpose-optimized body-worn wrist-watch computer

    Science.gov (United States)

    Narayanaswami, Chandra; Raghunath, Mandayam T.

    2004-09-01

    We outline a collection of technological challenges in the design of wearable computers with a focus on one of the most desirable form-factors, the wrist watch. We describe our experience with building three generations of wrist watch computers. We built these research prototypes as platforms to investigate the fundamental limitations of wearable computing. Results of our investigations are presented in the form of challenges that have been overcome and those that still remain.

  18. 75 FR 65533 - New Postal Product

    Science.gov (United States)

    2010-10-25

    ... consideration of the Request pertaining to the proposed Express Mail Contract 9 product and the related contract...-filed Postal Service filing to add Express Mail Contract 9 to the competitive product list. The Postal Service has also filed a related contract. This notice addresses procedural steps associated with the...

  19. INTEGRATION OF COMPUTER TECHNOLOGIES SMK: AUTOMATION OF THE PRODUCTION CERTIFICA-TION PROCEDURE AND FORMING OF SHIPPING DOCUMENTS

    Directory of Open Access Journals (Sweden)

    S. A. Pavlenko

    2009-01-01

    Integration of computer information technologies made it possible to reorganize and optimize several processes, owing to reduced document circulation, unified documentation forms, and other measures.

  20. Computer simulation to predict energy use, greenhouse gas emissions and costs for production of fluid milk using alternative processing methods

    Science.gov (United States)

    Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...

  1. The Borg–eye and the We–I. The production of a collective living body through wearable computers

    NARCIS (Netherlands)

    Liberati, Nicola

    2018-01-01

    The aim of this work is to analyze the constitution of a new collective subject thanks to wearable computers. Wearable computers are emerging technologies which are supposed to become pervasively used in the near future. They are devices designed to be on us every single moment of our life and to

  2. 76 FR 53695 - In the Matter of Certain Computer Forensic Devices and Products Containing the Same; Notice of...

    Science.gov (United States)

    2011-08-29

    ... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-799] In the Matter of Certain Computer Forensic... importation, and the sale within the United States after importation of certain computer forensic devices and.... 7,228,379 (``the `379 patent''). The complaint further alleges that an industry in the United States...

  3. The version control service for the ATLAS data acquisition configuration files

    International Nuclear Information System (INIS)

    Soloviev, Igor

    2012-01-01

    The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error-free and consistent set of XML files proved a major challenge. Therefore a special service was implemented: to validate any modifications; to check the authorization of anyone trying to modify a file; to record who had made changes, plus when and why; and to provide tools to compare different versions of files and to go back to earlier versions if required. This paper provides details of the implementation and exploitation experience, which may be interesting for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.
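
    A toy sketch of the commit workflow such a service implements (validate the XML, check authorization, record who changed what, when, and why). Every name here is invented, and the real ATLAS service is far more elaborate:

```python
import hashlib
import time
from xml.etree import ElementTree

def commit_config_file(path, user, reason, authorized_users, changelog):
    """Hypothetical gatekeeper: reject unauthorized users, reject invalid
    XML, and append a traceability record for the accepted change."""
    if user not in authorized_users:
        raise PermissionError(f"{user} may not modify {path}")
    ElementTree.parse(path)                  # raises ParseError on invalid XML
    with open(path, "rb") as f:
        digest = hashlib.sha1(f.read()).hexdigest()
    changelog.append({"file": path, "user": user, "reason": reason,
                      "when": time.time(), "sha1": digest})

log = []
# commit_config_file("trigger.xml", "expert1", "new prescale set", {"expert1"}, log)
```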

  4. AIP1OGREN: Aerosol Observing Station Intensive Properties Value-Added Product

    Energy Technology Data Exchange (ETDEWEB)

    Koontz, Annette [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Flynn, Connor [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-09-15

    The aip1ogren value-added product (VAP) computes several aerosol intensive properties. It requires as input calibrated, corrected aerosol extensive properties (primarily scattering and absorption coefficients) from the Aerosol Observing Station (AOS). Aerosol extensive properties depend on both the nature of the aerosol and the amount of the aerosol. We compute several intensive properties as relationships between the various extensive properties; these are independent of the aerosol amount and instead relate to intrinsic properties of the aerosol itself. Along with the original extensive properties, we report aerosol single-scattering albedo, hemispheric backscatter fraction, asymmetry parameter, and Ångström exponent for scattering and absorption, with one-minute averaging. An hourly averaged file is produced from the 1-minute files that includes all extensive and intensive properties, as well as submicron scattering and submicron absorption fractions. Finally, in both the minutely and hourly files the aerosol radiative forcing efficiency is provided.
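
    Illustrative forms of some of the named intensive properties as ratios of extensive ones (single-scattering albedo, hemispheric backscatter fraction, and an Ångström exponent from a wavelength pair); the variable names and values below are ours, not the VAP's:

```python
import numpy as np

def intensive_properties(scat, absorb, scat_bkwd, wavelengths_nm):
    """Compute intensive properties from extensive coefficients (e.g. 1/Mm)
    given at two wavelengths; each is a ratio, so the aerosol amount cancels."""
    ssa = scat / (scat + absorb)               # single-scattering albedo
    backscatter_fraction = scat_bkwd / scat    # hemispheric backscatter fraction
    # Angstrom exponent for scattering between the wavelength pair:
    ang = -np.log(scat[0] / scat[1]) / np.log(wavelengths_nm[0] / wavelengths_nm[1])
    return ssa, backscatter_fraction, ang

scat = np.array([30.0, 20.0])      # hypothetical, e.g. at 450 and 700 nm
absorb = np.array([3.0, 2.5])
bkwd = np.array([3.6, 2.6])
print(intensive_properties(scat, absorb, bkwd, np.array([450.0, 700.0])))
```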

  5. Students "Hacking" School Computer Systems

    Science.gov (United States)

    Stover, Del

    2005-01-01

    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  6. The effect of micro and macro stressors in the work environment on computer professionals' subjective health status and productive behavior in Japan.

    Science.gov (United States)

    Tominaga, Maki; Asakura, Takashi; Akiyama, Tsuyoshi

    2007-06-01

    To investigate the effect of micro and macro stressors in the work environment on the subjective health status and productive behavior of computer professionals, we conducted a web-based investigation with Japanese IT-related company employees in 53 company unions. The questionnaire consisted of individual attributes, employment characteristics, working hour characteristics, company size and profitability, personal characteristics (i.e., Growth Need Strength), micro and macro stressors scale, and four outcome scales concerning the subjective health status and productive behavior. We obtained 1,049 Japanese IT-related company employees' data (response rate: 66%), and analyzed the data of computer engineers (80%; n=871). The results of hierarchical multiple regressions showed that each full model explained 23% in psychological distress, 20% in cumulative fatigue, 44% in job dissatisfaction, and 35% in intentions to leave, respectively. In micro stressors, "quantitative and qualitative work overload" had the strongest influence on both the subjective health status and intentions to leave. Furthermore, in macro stressors, "career and future ambiguity" was the most important predictor of the subjective health status, and "insufficient evaluation systems" and "poor supervisor's support" were important predictors of productive behavior as well. These findings suggest that improving not only micro stressors but also macro stressors will enhance the subjective health status and increase the productive behavior of computer professionals in Japan.

  7. NASA ARCH- A FILE ARCHIVAL SYSTEM FOR THE DEC VAX

    Science.gov (United States)

    Scott, P. J.

    1994-01-01

    The function of the NASA ARCH system is to provide a permanent storage area for files that are infrequently accessed. The NASA ARCH routines were designed to provide a simple mechanism by which users can easily store and retrieve files. The user treats NASA ARCH as the interface to a black box where files are stored. There are only five NASA ARCH user commands, even though NASA ARCH employs standard VMS directives and the VAX BACKUP utility. Special care is taken to provide the security needed to ensure file integrity over a period of years. The archived files may exist in any of three storage areas: a temporary buffer, the main buffer, and a magnetic tape library. When the main buffer fills up, it is transferred to permanent magnetic tape storage and deleted from disk. Files may be restored from any of the three storage areas. A single file, multiple files, or entire directories can be stored and retrieved. Archived entities hold the same name, extension, version number, and VMS file protection scheme as they had in the user's account prior to archival. NASA ARCH is capable of handling up to 7 directory levels. Wildcards are supported. User commands include TEMPCOPY, DISKCOPY, DELETE, RESTORE, and DIRECTORY. The DIRECTORY command searches a directory of savesets covering all three archival areas, listing matches according to area, date, filename, or other criteria supplied by the user. The system manager commands include 1) ARCHIVE- to transfer the main buffer to duplicate magnetic tapes, 2) REPORT- to determine when the main buffer is full enough to archive, 3) INCREMENT- to back up the partially filled main buffer, and 4) FULLBACKUP- to back up the entire main buffer. On-line help files are provided for all NASA ARCH commands. NASA ARCH is written in DEC VAX DCL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.X. This program was developed in 1985.

  8. Performance of the engineering analysis and data system 2 common file system

    Science.gov (United States)

    Debrunner, Linda S.

    1993-01-01

    The Engineering Analysis and Data System (EADS) was used from April 1986 to July 1993 to support large scale scientific and engineering computation (e.g. computational fluid dynamics) at Marshall Space Flight Center. The need for an updated system resulted in a RFP in June 1991, after which a contract was awarded to Cray Grumman. EADS II was installed in February 1993, and by July 1993 most users were migrated. EADS II is a network of heterogeneous computer systems supporting scientific and engineering applications. The Common File System (CFS) is a key component of this system. The CFS provides a seamless, integrated environment to the users of EADS II including both disk and tape storage. UniTree software is used to implement this hierarchical storage management system. The performance of the CFS suffered during the early months of the production system. Several of the performance problems were traced to software bugs which have been corrected. Other problems were associated with hardware. However, the use of NFS in UniTree UCFM software limits the performance of the system. The performance issues related to the CFS have led to a need to develop a greater understanding of the CFS organization. This paper will first describe the EADS II with emphasis on the CFS. Then, a discussion of mass storage systems will be presented, and methods of measuring the performance of the Common File System will be outlined. Finally, areas for further study will be identified and conclusions will be drawn.

  9. Productivity

    DEFF Research Database (Denmark)

    Spring, Martin; Johnes, Geraint; Hald, Kim Sundtoft

    Productivity is increasingly critical for developed economies. It has always been important: as Paul Krugman puts it, “Productivity isn’t everything, but in the long run it is almost everything. A country’s ability to improve its standard of living over time depends almost entirely on its ability to raise its output per worker” (Krugman, 1994). Analyses of productivity have, by and large, been the preserve of economists. Operations Management (OM) is rooted in a similar concern for the efficient use of scarce resources; Management Accounting (MA) is concerned with the institutionalised measurement and management of productivity. Yet the three perspectives are rarely connected. This paper is a sketch of a literature review seeking to identify, contrast and reconcile these three perspectives. In so doing, it aims to strengthen the connections between policy and managerial analyses of productivity.

  10. Cloud object store for archive storage of high performance computing data using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  11. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
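
    The decoupling idea shared by this record and the previous one can be sketched as a middleware step that gathers per-process files from the parallel file system and re-exposes each as a named object for the cloud tier. The client class, paths, and naming scheme below are hypothetical, and PLFS's actual log-structured machinery is not modeled:

```python
import pathlib

class CloudStore:
    """Stand-in for a cloud object client; a real system would use an
    object-storage API such as S3."""
    def put(self, key: str, body: bytes) -> None:
        print(f"PUT {key} ({len(body)} bytes)")

def archive_checkpoints(checkpoint_dir, store, run_id):
    """Middleware step: obtain the per-process checkpoint files from the
    parallel file system and provide each as an object for cloud storage."""
    for f in sorted(pathlib.Path(checkpoint_dir).glob("*.ckpt")):
        store.put(f"{run_id}/{f.name}", f.read_bytes())

# archive_checkpoints("/scratch/app/ckpts", CloudStore(), run_id="job-42")
```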

  12. Three Aspects of PLATO Use at Chanute AFB: CBE Production Techniques, Computer-Aided Management, Formative Development of CBE Lessons.

    Science.gov (United States)

    Klecka, Joseph A.

    This report describes various aspects of lesson production and use of the PLATO system at Chanute Air Force Base. The first chapter considers four major factors influencing lesson production: (1) implementation of the "lean approach," (2) the Instructional Systems Development (ISD) role in lesson production, (3) the transfer of…

  13. The virtual maintenance system: a computer-based support tool for robust design, product monitoring, fault diagnosis and maintenance planning

    NARCIS (Netherlands)

    van Houten, Frederikus J.A.M.; Kimura, F.

    2000-01-01

    Digital (geometric) product models can be used for maintainability analysis and maintenance planning. It is not feasible to build digital product models for maintenance purposes only, but if a digital product model is available, it may be used to support many maintenance-related engineering tasks.

  14. Total system for manufacture of nuclear vessels by computer: VECTRON

    International Nuclear Information System (INIS)

    Inagawa, Jin; Ueno, Osamu; Hanai, Yoshiharu; Ohkawa, Isao; Washizu, Hideyuki

    1980-01-01

    VECTRON (Vessel Engineering by Computer Tool and Rapid Operating for the N/C System) is a CAM (Computer Aided Manufacturing) system that has been developed to produce high quality and highly accurate vessels for nuclear power plants and other industrial plants. Outputs of this system are design drawings, manufacturing information and magnetic tapes of the N/C marking machine for vessel shell plates including their attachments. And it can also output information at each stage of designing, marking, cutting, forming and assembling by treating the vessels in three dimensions and by using data filing systems and plotting program for general use. The data filing systems consist of functional and manufacturing data of each part of vessels. This system not only realizes a change from manual work to computer work, but also leads us to improve production engineering and production jigs for safety and high quality. At present, VECTRON is being applied to the manufacture of the shell plates of primary containment vessels in the Kashiwazaki-Kariwa Nuclear Power Station Unit 1 (K-1) and the Fukushima Daini Nuclear Power Station Unit 3 (2F-3), to realize increased productivity. (author)

  15. Formalizing a hierarchical file system

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, Muhammad Ikram

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal,

  16. Formalizing a Hierarchical File System

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, M.I.

    2009-01-01

    In this note, we define an abstract file system as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for removal
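
    The abstract definition in these two records translates almost directly into code: a file system as a partial function from absolute paths to data, here modeled as a dict keyed by path tuples. A minimal executable rendering, with operation names invented for illustration:

```python
# A file system as a partial function from absolute paths to data.
FileSystem = dict[tuple[str, ...], bytes]

def valid_paths(fs: FileSystem) -> set[tuple[str, ...]]:
    return set(fs)                       # the domain of the partial function

def read(fs: FileSystem, path: tuple[str, ...]) -> bytes:
    return fs[path]                      # defined only at valid paths

def write(fs: FileSystem, path: tuple[str, ...], data: bytes) -> None:
    if path not in fs:
        raise KeyError("write is only allowed at a valid path")
    fs[path] = data

def create(fs: FileSystem, path: tuple[str, ...], data: bytes = b"") -> None:
    fs.setdefault(path, data)            # Unix-style creation

def remove(fs: FileSystem, path: tuple[str, ...]) -> None:
    fs.pop(path, None)                   # Unix-style removal

fs: FileSystem = {}
create(fs, ("home", "user", "note.txt"), b"hello")
write(fs, ("home", "user", "note.txt"), b"hello world")
print(valid_paths(fs), read(fs, ("home", "user", "note.txt")))
```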

  17. Extracting the Data From the LCM vk4 Formatted Output File

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-29

    These are slides about extracting the data from the LCM vk4 formatted output file. The following topics are covered: the vk4 file produced by the Keyence VK software; custom analysis (there is no off-the-shelf way to read the file); reading the binary data in a vk4 file; the various offsets, in decimal, and finding the height image data directly in MATLAB; the binary layout at the beginning of the height image data; color image information and its decimal and binary data; MATLAB code to read a vk4 file (choose a file, read the file, compute offsets, read the optical image and laser optical image, read and compute the laser intensity image, read the height image, timing, display the height image, display the laser intensity image, display the RGB laser optical and optical images, display beginning data and save images to the workspace, and a gamma-correction subroutine); reading intensity from the vk4 file (linear in the low range, linear in the high range); gamma correction for vk4 files; computing the gamma intensity correction; and observations.
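
    A generic sketch of the offset-chasing these slides describe, using struct to pull little-endian fields out of the binary. The table offset and block layout shown are placeholders, not the real vk4 values, which must be taken from the slides or the format itself:

```python
import struct

def read_u32(buf: bytes, offset: int) -> int:
    """vk4 data are binary; fields are pulled out with struct.unpack
    at known byte offsets (little-endian assumed here)."""
    return struct.unpack_from("<I", buf, offset)[0]

def read_height_image(path: str, table_offset: int):
    """Follow an offset from a header table to an image block, then read
    its dimensions and pixel data. Offsets and layout are hypothetical."""
    with open(path, "rb") as f:
        buf = f.read()
    img_offset = read_u32(buf, table_offset)       # where the block starts
    width = read_u32(buf, img_offset)
    height = read_u32(buf, img_offset + 4)
    pixels = struct.unpack_from(f"<{width * height}I", buf, img_offset + 8)
    return width, height, pixels

# width, height, pixels = read_height_image("scan.vk4", table_offset=0x28)
```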

  18. A Centralized Control and Dynamic Dispatch Architecture for File Integrity Analysis

    Directory of Open Access Journals (Sweden)

    Ronald DeMara

    2006-02-01

    The ability to monitor computer file systems for unauthorized changes is a powerful administrative tool. Ideally this task could be performed remotely under the direction of the administrator, to allow on-demand checking and the use of tailorable reporting and exception policies targeted at adjustable groups of network elements. This paper introduces M-FICA, a Mobile File Integrity and Consistency Analyzer, as a prototype to achieve this capability using mobile agents. The M-FICA file tampering detection approach uses MD5 message digests to identify file changes. Two agent types, Initiator and Examiner, are used to perform file integrity tasks. An Initiator travels to client systems, computes file digests, and stores those digests in a database file located on write-once media. An Examiner agent later computes new digests to compare with the original digests in the database file. Changes in digest values indicate that file contents have been modified. The design and evaluation results for a prototype developed in the Concordia agent framework are described.
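
    The digest workflow (an Initiator computes and stores MD5 digests; an Examiner recomputes and compares them) in a minimal sketch; file and database names are illustrative, and the mobile-agent transport is not modeled:

```python
import hashlib
import json
import pathlib

def md5_digest(path: str) -> str:
    """Initiator role: compute the MD5 message digest of a file's contents."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def baseline(paths, db_file):
    """Store the original digests (on write-once media, in M-FICA's design)."""
    pathlib.Path(db_file).write_text(json.dumps({p: md5_digest(p) for p in paths}))

def examine(db_file):
    """Examiner role: recompute digests and report files whose contents changed."""
    original = json.loads(pathlib.Path(db_file).read_text())
    return [p for p, d in original.items() if md5_digest(p) != d]

# baseline(["/etc/hosts"], "digests.json"); print(examine("digests.json"))
```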

  19. Reliable file sharing in distributed operating system using web RTC

    Science.gov (United States)

    Dukiya, Rajesh

    2017-12-01

    Since the evolution of distributed operating systems, the distributed file system has become an important part of the operating system. P2P is a reliable approach to file sharing in a distributed operating system. Introduced in 1999, it later became a topic of high research interest. A peer-to-peer network is a type of network in which peers share the network workload and related tasks. A P2P network can even be a temporary connection in which a group of computers is linked through USB (Universal Serial Bus) ports to enable disk or file sharing. Currently, P2P requires a special network designed in a P2P way. Nowadays, browsers have a large influence on our lives. In this project we study the file-sharing mechanism of distributed operating systems in web browsers, where we try to find performance bottlenecks; our research aims to improve the performance and scalability of file sharing in distributed file systems. Additionally, we discuss the scope of WebTorrent file sharing and free-riding in peer-to-peer networks.

  20. PRO/Mapper: a plotting program for the DEC PRO/300 personal computers utilizing the MAPPER graphics language

    International Nuclear Information System (INIS)

    Wachter, J.W.

    1986-05-01

    PRO/Mapper is an application for the Digital Equipment Corporation PRO/300 series of personal computers that facilitates the preparation of visuals such as graphs, charts, and maps in color or black and white. The user prepares an input data file containing English-language commands, writing it with a standard editor. PRO/Mapper then reads these files and draws graphs, maps, boxes, and complex line segments on the computer screen. Axes, curves, and error bars may be plotted in graphical presentations. The commands of PRO/Mapper are a subset of the commands of the more sophisticated MAPPER program written for mainframe computers. The PRO/Mapper commands were chosen primarily for the production of linear graphs. Command files written for the PRO/300 are upward compatible with the Martin Marietta Energy Systems version of MAPPER and can be used to produce publication-quality slides, drawings, and maps on the various output devices of the Oak Ridge National Laboratory mainframe computers.