WorldWideScience

Sample records for selection computer file

  1. Algorithms and file structures for computational geometry

    International Nuclear Information System (INIS)

    Hinrichs, K.; Nievergelt, J.

    1983-01-01

    Algorithms for solving geometric problems and file structures for storing large amounts of geometric data are of increasing importance in computer graphics and computer-aided design. As examples of recent progress in computational geometry, we explain plane-sweep algorithms, which solve various topological and geometric problems efficiently; and we present the grid file, an adaptable, symmetric multi-key file structure that provides efficient access to multi-dimensional data along any space dimension. (orig.)
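
    The grid file itself is only sketched in this abstract; as a rough illustration of symmetric multi-key access to spatial data, the toy index below buckets 2D points on a fixed uniform grid (an assumption for brevity; the real grid file adapts its grid lines to the data and pages buckets to disk).

```python
from collections import defaultdict

class UniformGridIndex:
    """Toy multi-key index: buckets 2D points on a uniform grid.
    A real grid file adapts its grid to the data; this sketch uses a fixed
    cell size purely to illustrate symmetric access along either dimension."""

    def __init__(self, cell_size=1.0):
        self.cell = cell_size
        self.buckets = defaultdict(list)

    def insert(self, x, y, payload=None):
        key = (int(x // self.cell), int(y // self.cell))
        self.buckets[key].append((x, y, payload))

    def range_query(self, xmin, xmax, ymin, ymax):
        """Return all points inside the axis-aligned query rectangle."""
        results = []
        for gx in range(int(xmin // self.cell), int(xmax // self.cell) + 1):
            for gy in range(int(ymin // self.cell), int(ymax // self.cell) + 1):
                for (x, y, p) in self.buckets.get((gx, gy), []):
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        results.append((x, y, p))
        return results

idx = UniformGridIndex(cell_size=2.0)
idx.insert(1.5, 3.2, "a")
idx.insert(4.8, 0.7, "b")
print(idx.range_query(0, 5, 0, 4))  # both points fall inside this rectangle
```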

  2. Selecting Personal Computers.

    Science.gov (United States)

    Djang, Philipp A.

    1993-01-01

    Describes a Multiple Criteria Decision Analysis approach for the selection of personal computers that combines the capabilities of the Analytic Hierarchy Process and Integer Goal Programming. An example of how decision makers can use this approach to determine what kind of personal computers, and how many of each type, to purchase is given. (nine…

  3. Small file aggregation in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
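
    As a minimal sketch of the offset/length bookkeeping this abstract describes (not the patented implementation; the file paths and the JSON metadata format are illustrative assumptions):

```python
import json

def aggregate(file_paths, out_path, meta_path):
    """Pack many small files into one aggregated file and record, for each
    input file, its offset and length inside the aggregate. Returns the
    metadata dictionary."""
    metadata = {}
    offset = 0
    with open(out_path, "wb") as out:
        for path in file_paths:
            with open(path, "rb") as f:
                data = f.read()
            out.write(data)
            metadata[path] = {"offset": offset, "length": len(data)}
            offset += len(data)
    with open(meta_path, "w") as m:
        json.dump(metadata, m)
    return metadata

def unpack(out_path, metadata, name):
    """Use the stored offset/length to extract one original file."""
    entry = metadata[name]
    with open(out_path, "rb") as f:
        f.seek(entry["offset"])
        return f.read(entry["length"])
```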

  4. Storing files in a parallel computing system using list-based index to identify replica files

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Zhang, Zhenhua; Grider, Gary

    2015-07-21

    Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
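
    A minimal sketch of such a list-based index entry, assuming SHA-256 as the checksum and local paths as stand-ins for storage-node locations (both assumptions; the patent does not specify them):

```python
import hashlib
import json

def build_index(primary_path, replica_paths):
    """Index entry for one file: a list of storage locations (primary first,
    then replicas) plus a checksum that can later validate any copy."""
    with open(primary_path, "rb") as f:
        checksum = hashlib.sha256(f.read()).hexdigest()
    return {"locations": [primary_path] + list(replica_paths),
            "checksum": checksum}

def validate(index_entry, path):
    """Re-hash one stored copy and compare with the indexed checksum."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() == index_entry["checksum"]

def find_valid_copy(index_entry):
    """Answer a query from the location list: return the first copy that
    passes checksum validation, or None if all copies are corrupt."""
    for path in index_entry["locations"]:
        if validate(index_entry, path):
            return path
    return None
```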

  5. Study and development of a document file system with selective access

    International Nuclear Information System (INIS)

    Mathieu, Jean-Claude

    1974-01-01

    The objective of this research thesis was to design and develop a set of software aimed at the efficient management of a document file system using methods of selective access to information. Thus, the three main aspects of file processing (creation, modification, reorganisation) have been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of the INIS bibliographic records provided by the IAEA which have been used to create the files. In the second part, he briefly describes the general organisation of the file system. This system is based on two main files: an inverse file, which contains for each descriptor the list of numbers of the files indexed by that descriptor, and a dictionary of descriptors, or input file, which gives access to the inverse file. The organisation of both of these files is then described in detail. Other related or associated files are created, and the overall architecture and mechanisms integrated into the file data input software are described, as well as the various processing applied to these different files. Performance and possible developments are finally discussed.
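
    The two-file organisation described above is essentially an inverted index. A toy in-memory analogue, with hypothetical descriptors and document numbers, might look like this:

```python
from collections import defaultdict

# Toy analogue of the two-file organisation described above: the "inverse
# file" maps each descriptor to the list of document numbers indexed by it,
# and the dictionary of descriptors simply gives access to those lists.
inverse_file = defaultdict(list)

def index_document(doc_number, descriptors):
    for d in descriptors:
        inverse_file[d].append(doc_number)

def select(*descriptors):
    """Selective access: documents indexed by every given descriptor."""
    sets = [set(inverse_file.get(d, [])) for d in descriptors]
    return sorted(set.intersection(*sets)) if sets else []

index_document(1, ["reactor", "neutron"])
index_document(2, ["neutron", "dosimetry"])
print(select("neutron"))             # [1, 2]
print(select("neutron", "reactor"))  # [1]
```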

  6. RAMA: A file system for massively parallel computers

    Science.gov (United States)

    Miller, Ethan L.; Katz, Randy H.

    1993-01-01

    This paper describes a file system design for massively parallel computers which makes very efficient use of a few disks per processor. This overcomes the traditional I/O bottleneck of massively parallel machines by storing the data on disks within the high-speed interconnection network. In addition, the file system, called RAMA, requires little inter-node synchronization, removing another common bottleneck in parallel processor file systems. Support for a large tertiary storage system can easily be integrated into the file system; in fact, RAMA runs most efficiently when tertiary storage is used.

  7. WinSCP for Windows File Transfers | High-Performance Computing | NREL

    Science.gov (United States)

    WinSCP can be used to securely transfer files between your local computer running Microsoft Windows and a remote computer running Linux.

  8. New developments in file-based infrastructure for ATLAS event selection

    Energy Technology Data Exchange (ETDEWEB)

    Gemmeren, P van; Malon, D M [Argonne National Laboratory, Argonne, Illinois 60439 (United States); Nowak, M, E-mail: gemmeren@anl.go [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States)

    2010-04-01

    In ATLAS software, TAGs are event metadata records that can be stored in various technologies, including ROOT files and relational databases. TAGs are used to identify and extract events that satisfy certain selection predicates, which can be coded as SQL-style queries. TAG collection files support in-file metadata to store information describing all events in the collection. Event Selector functionality has been augmented to provide such collection-level metadata to subsequent algorithms. The ATLAS I/O framework has been extended to allow computational processing of TAG attributes to select or reject events without reading the event data. This capability enables physicists to use more detailed selection criteria than are feasible in an SQL query. For example, the TAGs contain enough information not only to check the number of electrons, but also to calculate their distance to the closest jet, a calculation that would be difficult to express in SQL. Another new development allows ATLAS to write TAGs directly into event data files. This feature can improve performance by supporting advanced event selection capabilities, including computational processing of TAG information, without the need for external TAG file or database access.
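
    A rough sketch of what such TAG-level selection could look like, assuming the TAG attributes have already been read into plain records; the field names and the ΔR > 0.4 cut are illustrative assumptions, not the actual ATLAS TAG schema or selection:

```python
import math

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance commonly used to measure how close two objects are."""
    dphi = abs(phi1 - phi2)
    if dphi > math.pi:
        dphi = 2 * math.pi - dphi
    return math.hypot(eta1 - eta2, dphi)

def passes(tag):
    """Selection predicate evaluated on the TAG record alone: at least two
    electrons, and the leading electron well separated from every jet.
    Field names are illustrative, not the actual ATLAS TAG schema."""
    if tag["n_electrons"] < 2:
        return False
    e_eta, e_phi = tag["ele_eta"][0], tag["ele_phi"][0]
    return all(delta_r(e_eta, e_phi, j_eta, j_phi) > 0.4
               for j_eta, j_phi in zip(tag["jet_eta"], tag["jet_phi"]))

def select_events(tags):
    """Yield only the event identifiers whose TAGs pass, so the full event
    data never needs to be read for rejected events."""
    return [t["event_id"] for t in tags if passes(t)]
```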

  9. Documentation of CATHENA input files for the APOLLO computer

    International Nuclear Information System (INIS)

    1988-06-01

    Input files created for the VAX version of the CATHENA two-fluid code have been modified and documented for simulation on the AECB's APOLLO computer system. The input files describe the RD-14 thermalhydraulic loop, the RD-14 steam generator, the RD-12 steam generator blowdown test facility, the Stern Laboratories Cold Water Injection Facility (CWIT), and a CANDU 600 reactor. Sample CATHENA predictions are given and compared with experimental results where applicable. 24 refs

  10. Globus File Transfer Services | High-Performance Computing | NREL

    Science.gov (United States)

    installed on the systems at both ends of the data transfer. The NREL endpoint is nrel#globus. Click Login on the Globus web site. On the login page select "Globus ID" as the login method and click Login to the Globus website. From the Manage Data drop down menu, select Transfer Files. Then click Get

  11. NET: an inter-computer file transfer command

    International Nuclear Information System (INIS)

    Burris, R.D.

    1978-05-01

    The NET command was defined and supported in order to facilitate file transfer between computers. Among the goals of the implementation were greatest possible ease of use, maximum power (i.e., support of a diversity of equipment and operations), and protection of the operating system

  12. Storing files in a parallel computing system based on user-specified parser function

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
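
    A minimal sketch of the parser idea, assuming a hypothetical JSON-based application format and a caller-supplied store callback (both assumptions; the patent does not prescribe a format):

```python
import json, os

def example_parser(path):
    """User-supplied parser (illustrative): keep only JSON files that contain
    a 'timestep' key, and extract that key as searchable metadata.
    Returns (keep, metadata)."""
    try:
        with open(path) as f:
            record = json.load(f)
    except (OSError, json.JSONDecodeError):
        return False, {}
    if "timestep" not in record:
        return False, {}
    return True, {"timestep": record["timestep"], "size": os.path.getsize(path)}

def store_files(paths, parser, store):
    """Apply the application-provided parser before storage: files that fail
    its semantic requirements are skipped, and extracted metadata is kept
    alongside the stored file for later searching."""
    catalog = {}
    for path in paths:
        keep, meta = parser(path)
        if keep:
            store(path)          # e.g. copy the file to a storage node
            catalog[path] = meta
    return catalog
```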

  13. A software to report and file by personal computer

    International Nuclear Information System (INIS)

    Di Giandomenico, E.; Filippone, A.; Esposito, A.; Bonomo, L.

    1989-01-01

    During the past four years the authors have been gaining experience in reporting radiological examinations by personal computer. Today they describe the project of a new software package which allows the reporting and filing of roentgenograms. This program was realized by a radiologist using a well-known database management system: dBASE III. The program was shaped to fit the radiologist's needs: it helps in reporting and allows the filing of radiological data with the diagnostic codes used by the American College of Radiology. In this paper the authors describe the database structure and indicate the software functions which make its use possible. Thus, this paper is not aimed at advertising a new reporting program, but at demonstrating how the radiologist can himself manage some aspects of his work with the help of a personal computer

  14. Software For Computing Selected Functions

    Science.gov (United States)

    Grant, David C.

    1992-01-01

    Technical memorandum presents collection of software packages in Ada implementing mathematical functions used in science and engineering. Provides programmer with function support in Pascal and FORTRAN, plus support for extended-precision arithmetic and complex arithmetic. Valuable for testing new computers, writing computer code, or developing new computer integrated circuits.

  15. 22 CFR 1429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Computation of time for filing papers. 1429.21... MISCELLANEOUS AND GENERAL REQUIREMENTS General Requirements § 1429.21 Computation of time for filing papers. In... subchapter requires the filing of any paper, such document must be received by the Board or the officer or...

  16. Study and development of a document file system with selective access; Etude et realisation d'un systeme de fichiers documentaires a acces selectif

    Energy Technology Data Exchange (ETDEWEB)

    Mathieu, Jean-Claude

    1974-06-21

    The objective of this research thesis was to design and to develop a set of software aimed at an efficient management of a document file system by using methods of selective access to information. Thus, the three main aspects of file processing (creation, modification, reorganisation) have been addressed. The author first presents the main problems related to the development of a comprehensive automatic documentation system, and their conventional solutions. Some future aspects, notably dealing with the development of peripheral computer technology, are also evoked. He presents the characteristics of INIS bibliographic records provided by the IAEA which have been used to create the files. In the second part, he briefly describes the file system general organisation. This system is based on the use of two main files: an inverse file which contains for each descriptor a list of of numbers of files indexed by this descriptor, and a dictionary of descriptor or input file which gives access to the inverse file. The organisation of these both files is then describes in a detailed way. Other related or associated files are created, and the overall architecture and mechanisms integrated into the file data input software are described, as well as various processing applied to these different files. Performance and possible development are finally discussed.

  17. A computational model of selection by consequences.

    OpenAIRE

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...

  18. A Computational Model of Selection by Consequences

    Science.gov (United States)

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…

  19. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    Science.gov (United States)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
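
    A minimal sketch of resolution-reduced replicas, assuming the data is a flat list of numeric samples and that subsampling strides stand in for the semantically chosen resolutions (both assumptions for illustration):

```python
def make_replicas(samples, resolutions=(1, 4, 16)):
    """Produce replicas of a numeric dataset at decreasing resolution by
    keeping every n-th data element; which strides make sense would come
    from semantic information about the data (this choice is illustrative)."""
    return {stride: samples[::stride] for stride in resolutions}

def merge(sub_files):
    """Reassemble a full-resolution file from an ordered list of sub-files."""
    merged = []
    for part in sub_files:
        merged.extend(part)
    return merged

full = list(range(1000))
replicas = make_replicas(full)
print({stride: len(data) for stride, data in replicas.items()})  # 1000, 250, 63
```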

  20. File and metadata management for BESIII distributed computing

    International Nuclear Information System (INIS)

    Nicholson, C; Zheng, Y H; Lin, L; Deng, Z Y; Li, W D; Zhang, X M

    2012-01-01

    The BESIII experiment at the Institute of High Energy Physics (IHEP), Beijing, uses the high-luminosity BEPCII e+e− collider to study physics in the τ-charm energy region around 3.7 GeV; BEPCII has produced the world's largest samples of J/ψ and ψ′ events to date. An order of magnitude increase in the data sample size over the 2011-2012 data-taking period demanded a move from a very centralized to a distributed computing environment, as well as the development of an efficient file and metadata management system. While BESIII is on a smaller scale than some other HEP experiments, this poses particular challenges for its distributed computing and data management system. These constraints include limited resources and manpower, and low quality of network connections to IHEP. Drawing on the rich experience of the HEP community, a system has been developed which meets these constraints. The design and development of the BESIII distributed data management system, including its integration with other BESIII distributed computing components, such as job management, are presented here.

  1. Comparison of canal transportation and centering ability of hand Protaper files and rotary Protaper files by using micro computed tomography

    OpenAIRE

    Amit Gandhi; Taru Gandhi

    2011-01-01

    Introduction and objective: The aim of the present study was to compare root canal preparation with rotary ProTaper files and hand ProTaper files to find a better instrumentation technique for maintaining root canal geometry, with the aid of computed tomography. Material and methods: Twenty curved root canals with at least 10 degrees of curvature were divided into 2 groups of 10 teeth each. In group I the canals were prepared with hand ProTaper files and in group II the canals were prepared wit...

  2. Arranging and finding folders and files on your Windows 7 computer

    CERN Document Server

    Steps, Studio Visual

    2014-01-01

    If you have lots of documents on your desk, it may prove to be impossible to find the document you are looking for. In order to easily find certain documents, they are often stored in a filing cabinet and arranged in a logical order. The folders on your computer serve the same purpose. They do not just contain files; they can also contain other folders. You can create an unlimited number of folders, and each folder can contain any number of subfolders and files. You can use Windows Explorer, also called the folder window, to work with the files and folders on your computer. You can copy, delete, move, find, and sort files, among other things. Or you can transfer files and folders to a USB stick, an external hard drive, a CD, DVD or Blu-Ray disk. In this practical guide we will show you how to use the folder window, and help you arrange your own files.

  3. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.
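
    A minimal sketch of storing user-specified semantic information next to sub-files split at record boundaries; the sidecar JSON description and the one-record-per-line assumption are illustrative, not the patented data formatting library interface:

```python
import json

def split_at_record_boundaries(lines, records_per_subfile=1000):
    """Split a sequence of records into sub-files whose boundaries never cut
    through a record; yields (sub_file_lines, semantic_description) pairs.
    The description is the user-specified semantic information stored with
    each sub-file (field names here are illustrative)."""
    for start in range(0, len(lines), records_per_subfile):
        chunk = lines[start:start + records_per_subfile]
        description = {"first_record": start,
                       "n_records": len(chunk),
                       "schema": "one whitespace-separated record per line"}
        yield chunk, description

def write_subfiles(lines, prefix):
    """Write each sub-file together with a small sidecar describing its data."""
    for i, (chunk, desc) in enumerate(split_at_record_boundaries(lines)):
        with open(f"{prefix}.{i}.dat", "w") as data, \
             open(f"{prefix}.{i}.meta.json", "w") as meta:
            data.write("\n".join(chunk))
            json.dump(desc, meta)
```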

  4. Optical encryption with selective computational ghost imaging

    International Nuclear Information System (INIS)

    Zafari, Mohammad; Kheradmand, Reza; Ahmadi-Kandjani, Sohrab

    2014-01-01

    Selective computational ghost imaging (SCGI) is a technique which enables the reconstruction of an N-pixel image from N measurements or less. In this paper we propose an optical encryption method based on SCGI and experimentally demonstrate that this method has much higher security under eavesdropping and unauthorized accesses compared with previous reported methods. (paper)

  5. Computer Aided Solvent Selection and Design Framework

    DEFF Research Database (Denmark)

    Mitrofanov, Igor; Conte, Elisa; Abildskov, Jens

    and computer-aided tools and methods for property prediction and computer-aided molecular design (CAMD) principles. This framework is applicable for solvent selection and design in product design as well as process design. The first module of the framework is dedicated to the solvent selection and design...... in terms of: physical and chemical properties (solvent-pure properties); Environment, Health and Safety (EHS) characteristic (solvent-EHS properties); operational properties (solvent–solute properties). 3. Performing the search. The search step consists of two stages. The first is a generation and property...... identification of solvent candidates using special software ProCAMD and ProPred, which are the implementations of computer-aided molecular techniques. The second consists of assigning the RS-indices following the reaction–solvent and then consulting the known solvent database and identifying the set of solvents...

  6. Computational intelligence, medicine and biology selected links

    CERN Document Server

    Zaitseva, Elena

    2015-01-01

    This book contains an interesting and state-of-the-art collection of chapters presenting several examples of attempts to develop modern tools utilizing computational intelligence in different real-life problems encountered by humans. Reasoning, prediction, modeling, optimization, decision making, etc. need modern, soft and intelligent algorithms, methods and methodologies to solve, in efficient ways, the problems appearing in human activity. The contents of the book are divided into two parts. Part I, consisting of four chapters, is devoted to selected links between computational intelligence, medicine, health care and biomechanics. Several problems are considered: estimation of healthcare system reliability, classification of ultrasound thyroid images, application of fuzzy logic to measure weight status and central fatness, and deriving kinematics directly from video records. Part II, also consisting of four chapters, is devoted to selected links between computational intelligence and biology. The common denominato...

  7. Computer aided selection of plant layout | Kitaw | Zede Journal

    African Journals Online (AJOL)


  8. RRB / SSI Interface Checkwriting Integrated Computer Operation Extract File (CHICO)

    Data.gov (United States)

    Social Security Administration — This monthly file provides SSA with information about benefit payments made to railroad retirement beneficiaries. SSA uses this data to verify Supplemental Security...

  9. A technique for integrating remote minicomputers into a general computer's file system

    CERN Document Server

    Russell, R D

    1976-01-01

    This paper describes a simple technique for interfacing remote minicomputers used for real-time data acquisition into the file system of a central computer. Developed as part of the ORION system at CERN, this 'File Manager' subsystem enables a program in the minicomputer to access and manipulate files of any type as if they resided on a storage device attached to the minicomputer. Yet, completely transparent to the program, the files are accessed from disks on the central system via high-speed data links, with response times comparable to local storage devices. (6 refs).

  10. Comparison of canal transportation and centering ability of twisted files, Pathfile-ProTaper system, and stainless steel hand K-files by using computed tomography.

    Science.gov (United States)

    Gergi, Richard; Rjeily, Joe Abou; Sader, Joseph; Naaman, Alfred

    2010-05-01

    The purpose of this study was to compare canal transportation and centering ability of 2 rotary nickel-titanium (NiTi) systems (Twisted Files [TF] and Pathfile-ProTaper [PP]) with conventional stainless steel K-files. Ninety root canals with severe curvature and short radius were selected. Canals were divided randomly into 3 groups of 30 each. After preparation with TF, PP, and stainless steel files, the amount of transportation that occurred was assessed by using computed tomography. Three sections from apical, mid-root, and coronal levels of the canal were recorded. Amount of transportation and centering ability were assessed. The 3 groups were statistically compared with analysis of variance and Tukey honestly significant difference test. Less transportation and better centering ability occurred with TF rotary instruments (P < .0001). K-files showed the highest transportation followed by PP system. PP system showed significant transportation when compared with TF (P < .0001). The TF system was found to be the best for all variables measured in this study. Copyright (c) 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  11. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    Science.gov (United States)

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004

  12. Computational analysis of sequence selection mechanisms.

    Science.gov (United States)

    Meyerguz, Leonid; Grasso, Catherine; Kleinberg, Jon; Elber, Ron

    2004-04-01

    Mechanisms leading to gene variations are responsible for the diversity of species and are important components of the theory of evolution. One constraint on gene evolution is that of protein foldability; the three-dimensional shapes of proteins must be thermodynamically stable. We explore the impact of this constraint and calculate properties of foldable sequences using 3660 structures from the Protein Data Bank. We seek a selection function that receives sequences as input, and outputs survival probability based on sequence fitness to structure. We compute the number of sequences that match a particular protein structure with energy lower than the native sequence, the density of the number of sequences, the entropy, and the "selection" temperature. The mechanism of structure selection for sequences longer than 200 amino acids is approximately universal. For shorter sequences, it is not. We speculate on concrete evolutionary mechanisms that show this behavior.

  13. Processing of evaluated neutron data files in ENDF format on personal computers

    International Nuclear Information System (INIS)

    Vertes, P.

    1991-11-01

    A computer code package - FDMXPC - has been developed for processing evaluated data files in ENDF format. The earlier version of this package is supplemented with modules performing calculations using Reich-Moore and Adler-Adler resonance parameters. The processing of evaluated neutron data files by personal computers requires special programming considerations outlined in this report. The scope of the FDMXPC program system is demonstrated by means of numerical examples. (author). 5 refs, 4 figs, 4 tabs

  14. A digital imaging teaching file by using the internet, HTML and personal computers

    International Nuclear Information System (INIS)

    Chun, Tong Jin; Jeon, Eun Ju; Baek, Ho Gil; Kang, Eun Joo; Baik, Seung Kug; Choi, Han Yong; Kim, Bong Ki

    1996-01-01

    A film-based teaching file takes up space, and the need to search through such a file places limits on the extent to which it is likely to be used. Furthermore, it is not easy for doctors in a medium-sized hospital to experience a variety of cases, and so for these reasons we created an easy-to-use digital imaging teaching file with HTML (Hypertext Markup Language) and images downloaded via World Wide Web (WWW) services on the Internet. This was suitable for use by computer novices. We used WWW Internet services as a resource for various images and three different IBM PC compatible computers (386DX, 486DX-II, and Pentium) for downloading the images and developing a digitalized teaching file. These computers were connected to the Internet through a high-speed dial-up modem (28.8 Kbps), and Twinsock and Netscape were used to navigate the Internet. A Korean word-processing program (version 3.0) was used to create the HTML files, and the downloaded images were linked to these HTML files. In this way, a digital imaging teaching file program was created. Access to a Web service via the Internet required a high-speed computer (at least a 486DX II with 8 MB RAM) for comfortable use; this also ensured that the quality of downloaded images was not degraded during downloading and that these were good enough to use as a teaching file. The time needed to retrieve the text and related images depends on the size of the file, the speed of the network, and the network traffic at the time of connection. For computer novices, a digital image teaching file using HTML is easy to use. A digital imaging teaching file created in this way, using the Internet and HTML, is easy to build, and radiologists with little computer experience who want to study various digital radiologic imaging cases would find it easy to use

  15. Cooperative storage of shared files in a parallel computing system with dynamic block size

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
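
    The block-size rule quoted in the abstract (total data divided by the number of writers) can be sketched as follows; the planning step shown here is an illustration only, and the actual data exchange (e.g. via MPI) and the PLFS write are omitted:

```python
def block_size(total_bytes, n_processes):
    """Block size suggested by the abstract: total data divided by the number
    of writers (rounded up so the last byte is not lost)."""
    return -(-total_bytes // n_processes)   # ceiling division

def plan_exchange(per_process_bytes):
    """Given how many bytes each process currently holds, compute how many
    bytes each one should end up with so that every writer emits one block
    of the dynamically determined size. A real implementation would then
    exchange data between neighbouring processes to reach this layout."""
    total = sum(per_process_bytes)
    size = block_size(total, len(per_process_bytes))
    targets = [size] * len(per_process_bytes)
    targets[-1] = total - size * (len(per_process_bytes) - 1)  # remainder
    return size, targets

print(plan_exchange([900, 1100, 1000]))  # block size 1000, targets [1000, 1000, 1000]
```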

  16. Computer Forensics Method in Analysis of Files Timestamps in Microsoft Windows Operating System and NTFS File System

    Directory of Open Access Journals (Sweden)

    Vesta Sergeevna Matveeva

    2013-02-01

    All existing file browsers display 3 timestamps for every file in the NTFS file system. Nowadays there are many utilities that can manipulate these temporal attributes to conceal the traces of file use. However, every file in NTFS has 8 timestamps, which are stored in its file record and can be used to detect the fact of attribute substitution. The authors suggest a method of revealing the original timestamps after replacement, and an automated variant of it for the case of a set of files.
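
    One commonly used consistency check of this kind compares the $STANDARD_INFORMATION timestamps with the $FILE_NAME timestamps held in the same MFT record. The sketch below assumes the eight timestamps have already been extracted by a forensic tool, and it is an illustrative heuristic, not the authors' algorithm:

```python
from datetime import datetime

def suspicious_timestamps(std_info, file_name):
    """Compare the four $STANDARD_INFORMATION timestamps (the ones file
    browsers show and that tampering tools usually rewrite) with the four
    $FILE_NAME timestamps also kept in the MFT record, and flag common
    inconsistencies. Timestamp extraction itself is assumed to have been
    done elsewhere; this check is illustrative only."""
    findings = []
    for key in ("created", "modified", "mft_modified", "accessed"):
        if std_info[key] < file_name[key]:
            findings.append(f"{key}: $STANDARD_INFORMATION earlier than $FILE_NAME")
    if any(ts.second == 0 and ts.microsecond == 0 for ts in std_info.values()):
        findings.append("zeroed sub-second precision (typical of timestamp stompers)")
    return findings

std = {"created": datetime(2013, 1, 1), "modified": datetime(2013, 1, 1),
       "mft_modified": datetime(2013, 1, 1), "accessed": datetime(2013, 1, 1)}
fn = {k: datetime(2013, 2, 1) for k in std}
print(suspicious_timestamps(std, fn))
```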

  17. Survey on Security Issues in File Management in Cloud Computing Environment

    Science.gov (United States)

    Gupta, Udit

    2015-06-01

    Cloud computing has pervaded every aspect of information technology in the past decade. With the advent of cloud networks it has become easier to process, in real time, the plethora of data generated by various devices. The privacy of users' data is maintained by data centers around the world, and hence it has become feasible to operate on that data from lightweight portable devices. But with ease of processing comes the security aspect of the data. One such security aspect is secure file transfer, either internally within a cloud or externally from one cloud network to another. File management is central to cloud computing, and it is paramount to address the security concerns which arise out of it. This survey paper aims to elucidate the various protocols which can be used for secure file transfer and to analyze the ramifications of using each protocol.

  18. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

    For the systematic management and easy use of teaching files in a radiology department, the authors set up a database management system for teaching files using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) including an image capture card (Window vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modalities, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 X 390 ∼ 545 X 414, 256 gray scale) and displayed on a 17-inch flat monitor (1024 X 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply on the reading viewbox. The image quality on the computer's monitor was lower than that of the original film on the viewbox, but in general the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for the purposes of a teaching file system. Without high-cost appliances, we could implement an image database system for teaching files using a personal computer by a relatively inexpensive method

  19. Transfer of numeric ASCII data files between Apple and IBM personal computers.

    Science.gov (United States)

    Allan, R W; Bermejo, R; Houben, D

    1986-01-01

    Listings for programs designed to transfer numeric ASCII data files between Apple and IBM personal computers are provided with accompanying descriptions of how the software operates. Details of the hardware used are also given. The programs may be easily adapted for transferring data between other microcomputers.

  20. ERX: a software for editing files containing X-ray spectra to be used in exposure computational models

    International Nuclear Information System (INIS)

    Cabral, Manuela O.M.; Vieira, Jose W.; Silva, Alysson G.; Leal Neto, Viriato; Oliveira, Alex C.H.; Lima, Fernando R.A.

    2011-01-01

    Exposure Computational Models (ECMs) are utilities that simulate situations in which irradiation occurs in a given environment. An ECM is composed primarily of an anthropomorphic model (phantom) and a Monte Carlo code (MC). This paper presents a tutorial of the software Espectro de Raios-X (ERX). This software reads and performs numerical and graphical analysis of text files containing diagnostic X-ray spectra for use in the radioactive source algorithms of the ECMs of the Grupo de Dosimetria Numerica. The ERX allows the user to select one among several X-ray spectra in the diagnostic radiology energy range most commonly used in radiology clinics. In the current version of the ERX there are two types of input files: those contained in the mspectra.dat file and those resulting from MC simulations in Geant4. The software allows the construction of charts of the Probability Density Function (PDF) and Cumulative Distribution Function (CDF) of a selected spectrum, as well as a table with the values of these functions and the spectrum. In addition, the ERX allows the user to make comparative analyses between the PDF graphics of the two catalogs of spectra available, and it can also perform dosimetric evaluations with the selected spectrum. A software tool of this kind is important for researchers in numerical dosimetry because of the diversity of diagnostic radiology X-ray machines, which implies a highly diverse mass of input data. Because of this, the ERX gives the group independence with respect to the origin of the data contained in the catalogs it creates, without the need to resort to others. (author)
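
    The PDF and CDF construction mentioned above amounts to normalizing the tabulated spectrum and accumulating it. A minimal sketch, assuming the spectrum is available as a simple list of per-bin counts (the actual mspectra.dat layout is not specified here):

```python
def pdf_cdf(counts):
    """Normalize a tabulated spectrum into a probability density function and
    its cumulative distribution function, the two curves the ERX tutorial
    says it plots. The input format (a list of per-bin counts) is an
    assumption for illustration."""
    total = float(sum(counts))
    pdf = [c / total for c in counts]
    cdf, running = [], 0.0
    for p in pdf:
        running += p
        cdf.append(running)
    return pdf, cdf

energies = [20, 40, 60, 80, 100]        # keV bin centres (illustrative)
counts = [5, 40, 30, 20, 5]
pdf, cdf = pdf_cdf(counts)
print(pdf)   # sums to ~1.0
print(cdf)   # last value is ~1.0
```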

  1. Dimensional quality control of Ti-Ni dental file by optical coordinate metrology and computed tomography

    DEFF Research Database (Denmark)

    Yagüe-Fabra, J.A.; Tosello, Guido; Ontiveros, S.

    2014-01-01

    Endodontic dental files usually present complex 3D geometries, which make the complete measurement of the component very challenging with conventional micro metrology tools. Computed Tomography (CT) can represent a suitable alternative solution to micro metrology tools based on optical and tactile...... techniques. However, the establishment of CT systems traceability when measuring 3D complex geometries is still an open issue. In this work, to verify the quality of the CT dimensional measurements, the dental file has been measured both with a μCT system and an optical CMM (OCMM). The uncertainty...

  2. Adaptive security protocol selection for mobile computing

    NARCIS (Netherlands)

    Pontes Soares Rocha, B.; Costa, D.N.O.; Moreira, R.A.; Rezende, C.G.; Loureiro, A.A.F.; Boukerche, A.

    2010-01-01

    The mobile computing paradigm has introduced new problems for application developers. Challenges include heterogeneity of hardware, software, and communication protocols, variability of resource limitations and varying wireless channel quality. In this scenario, security becomes a major concern for

  3. COMPUTER AIDED SELECTION OF PLANT LAYOUT

    African Journals Online (AJOL)

    Special focus is directed at improving the preparation of the input data to enhance computer assistance to plant layout. ... INTRODUCTION. Plant layout problems have ... 1960's with the development by industrial engineers and operational ...

  4. Attentional selection of levels within hierarchically organized figures is mediated by object-files

    Directory of Open Access Journals (Sweden)

    Mitchell Joseph Valdes-Sosa

    2014-12-01

    Objects frequently have a hierarchical organization (tree-branch-leaf). How do we select the level to be attended? This has been explored with compound letters: a global letter built from local letters. One explanation, backed by much empirical support, is that attentional competition is biased towards certain spatial frequency (SF) bands across all locations and objects (an SF filter). This view assumes that the global and local letters are carried respectively by low and high SF bands, and that the bias can persist over time. Here we advocate a complementary view in which perception of hierarchical level is determined by how we represent each object-file. Although many properties bound to an object-file (i.e. position, color, even shape) can mutate without affecting its persistence over time, we posit that the same object-file cannot be used to store information from different hierarchical levels. Thus selection of level would be independent of locations but not of the way objects are represented at each moment. These views were contrasted via an attentional blink paradigm that presented letters within compound figures, but only one level at a time. Attending to two letters in rapid succession was easier if they were at the same level compared to different levels, as predicted by both accounts. However, only the object-file account was able to explain why it was easier to report two targets on the same moving object compared to the same targets on distinct objects. The interference of different masks on target recognition was also easier to predict with the object-file account than with the SF filter. The methods introduced here allowed us to investigate attention to hierarchical levels and to objects within the same empirical framework. The data suggest that SF information is used to structure the internal organization of object representations, a process understood best by integrating object-file theory with previous models of hierarchical perception.

  5. HLYWD: a program for post-processing data files to generate selected plots or time-lapse graphics

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1980-05-01

    The program HLYWD is a post-processor of output files generated by large plasma simulation computations, or of data files containing a time sequence of plasma diagnostics. It is intended to be used in a production mode for either type of application; i.e., it allows one to generate, along with the graphics sequence, segments containing a title, credits to those who performed the work, text to describe the graphics, and an acknowledgement of the funding agency. The current version is designed to generate 3D plots and allows one to select the type of display (linear or semi-log scales), the normalization of function values for display purposes, the viewing perspective, and an option to allow continuous rotation of surfaces. This program was developed with the intention of being relatively easy to use, reasonably flexible, and requiring a minimum investment of the user's time. It uses the TV80 library of graphics software and ORDERLIB system software on the CDC 7600 at the National Magnetic Fusion Energy Computing Center at Lawrence Livermore Laboratory in California

  6. File management for experiment control parameters within a distributed function computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-10-01

    An attempt to design and implement a computer system for control of and data collection from a set of laboratory experiments reveals that many of the experiments in the set require an extensive collection of parameters for their control. The operation of the experiments can be greatly simplified if a means can be found for storing these parameters between experiments and automatically accessing them as they are required. A subsystem for managing files of such experiment control parameters is discussed. 3 figures

  7. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin eWu

    2011-02-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection, in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of

  8. Computed tomography with selectable image resolution

    International Nuclear Information System (INIS)

    Dibianca, F.A.; Dallapiazza, D.G.

    1981-01-01

    A computed tomography system x-ray detector has a central group of half-width detector elements and groups of full-width elements on each side of the central group. To obtain x-ray attenuation data for whole body layers, the half-width elements are switched effectively into paralleled pairs so all elements act like full-width elements and an image of normal resolution is obtained. For narrower head layers, the elements in the central group are used as half-width elements so resolution which is twice as great as normal is obtained. The central group is also used in the half-width mode and the outside groups are used in the full-width mode to obtain a high resolution image of a body zone within a full body layer. In one embodiment data signals from the detector are switched by electronic multiplexing and in another embodiment a processor chooses the signals for the various kinds of images that are to be reconstructed. (author)

  9. Long term file migration. Part I: file reference patterns

    International Nuclear Information System (INIS)

    Smith, A.J.

    1978-08-01

    In most large computer installations, files are moved between on-line disk and mass storage (tape, integrated mass storage device) either automatically by the system or specifically at the direction of the user. This is the first of two papers which study the selection of algorithms for the automatic migration of files between mass storage and disk. The use of the text editor data sets at the Stanford Linear Accelerator Center (SLAC) computer installation is examined through the analysis of thirteen months of file reference data. Most files are used very few times. Of those that are used sufficiently frequently that their reference patterns may be examined, about a third show declining rates of reference during their lifetime; of the remainder, very few (about 5%) show correlated interreference intervals, and interreference intervals (in days) appear to be more skewed than would occur with the Bernoulli process. Thus, about two-thirds of all sufficiently active files appear to be referenced as a renewal process with a skewed interreference distribution. A large number of other file reference statistics (file lifetimes, interreference distributions, moments, means, number of uses/file, file sizes, file rates of reference, etc.) are computed and presented. The results are applied in the following paper to the development and comparative evaluation of file migration algorithms. 17 figures, 13 tables

  10. Computer Use of a Medical Dictionary to Select Search Words.

    Science.gov (United States)

    O'Connor, John

    1986-01-01

    Explains an experiment in text-searching retrieval for cancer questions which developed and used computer procedures (via human simulation) to select search words from medical dictionaries. This study is based on an earlier one in which search words were humanly selected, and the recall results of the two studies are compared. (Author/LRW)

  11. Cone-beam Computed Tomographic Assessment of Canal Centering Ability and Transportation after Preparation with Twisted File and Bio RaCe Instrumentation.

    Directory of Open Access Journals (Sweden)

    Kiamars Honardar

    2014-08-01

    Use of rotary Nickel-Titanium (NiTi) instruments for endodontic preparation has introduced a new era in endodontic practice, but this issue has undergone dramatic modifications in order to achieve improved shaping abilities. Cone-beam computed tomography (CBCT) has made it possible to accurately evaluate geometrical changes following canal preparation. This study was carried out to compare the canal centering ability and transportation of the Twisted File and BioRaCe rotary systems by means of cone-beam computed tomography. Thirty root canals from freshly extracted mandibular and maxillary teeth were selected. Teeth were mounted and scanned before and after preparation by CBCT at different apical levels. Specimens were divided into 2 groups of 15. In the first group Twisted File and in the second, BioRaCe was used for canal preparation. Canal transportation and centering ability after preparation were assessed by NNT Viewer and Photoshop CS4 software. Statistical analysis was performed using t-test and two-way ANOVA. All samples showed deviations from the original axes of the canals. No significant differences were detected between the two rotary NiTi instruments for canal centering ability in all sections. Regarding canal transportation, however, a significant difference was seen in the BioRaCe group at 7.5 mm from the apex. Under the conditions of this in vitro study, Twisted File and BioRaCe rotary NiTi files retained the original canal geometry.

  12. A primer on high-throughput computing for genomic selection.

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J M; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin-Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized

  13. Evaluation of clinical data in childhood asthma. Application of a computer file system

    International Nuclear Information System (INIS)

    Fife, D.; Twarog, F.J.; Geha, R.S.

    1983-01-01

    A computer file system was used in our pediatric allergy clinic to assess the value of chest roentgenograms and hemoglobin determinations used in the examination of patients and to correlate exposure to pets and forced hot air with the severity of asthma. Among 889 children with asthma, 20.7% had abnormal chest roentgenographic findings, excluding hyperinflation and peribronchial thickening, and 0.7% had abnormal hemoglobin values. Environmental exposure to pets or forced hot air was not associated with increased severity of asthma, as assessed by five measures of outcome: number of medications administered, requirement for corticosteroids, frequency of clinic visits, frequency of emergency room visits, and frequency of hospitalizations

  14. Building Parts Inventory Files Using the AppleWorks Data Base Subprogram and Apple IIe or GS Computers.

    Science.gov (United States)

    Schlenker, Richard M.

    This manual is a "how to" training device for building database files using the AppleWorks program with an Apple IIe or Apple IIGS Computer with Duodisk or two disk drives and an 80-column card. The manual provides step-by-step directions, and includes 25 figures depicting the computer screen at the various stages of the database file…

  15. CINDA 83 (1977-1983). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1983-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data exchanged between four regional neutron data centres. The present issue, CINDA 83, is an index to the literature on neutron data published after 1976. The basic volume, CINDA-A, together with the present issue, contains the full CINDA file as of 1 April 1983. A supplement to CINDA 83 is foreseen for fall 1983. Next year's issue, which is envisaged to be published in June 1984, will again cover all relevant literature that has appeared after 1976

  16. Trust in social computing. The case of peer-to-peer file sharing networks

    Directory of Open Access Journals (Sweden)

    Heng Xu

    2011-09-01

    Social computing and online communities are changing the fundamental way people share information and communicate with each other. Social computing focuses on how users may have more autonomy to express their ideas and participate in social exchanges in various ways, one of which may be peer-to-peer (P2P) file sharing. Given the greater risk of opportunistic behavior by malicious or criminal communities in P2P networks, it is crucial to understand the factors that affect an individual's use of P2P file sharing software. In this paper, we develop and empirically test a research model that includes trust beliefs and perceived risks as two major antecedent beliefs to the usage intention. Six trust antecedents are assessed, including knowledge-based trust, cognitive trust, and both organizational and peer-network factors of institutional trust. Our preliminary results show general support for the model and offer some important implications for software vendors in the P2P sharing industry and for regulatory bodies.

  17. Comparison and selection of client computer in nuclear instrument

    International Nuclear Information System (INIS)

    Ma Guizhen; Xie Yanhui; Peng Jing; Xu Feiyan

    2012-01-01

    Modern nuclear instruments offer a wide range of functions and require a high degree of informatization. Data processing is carried out through close matching of the host computer and the client computer. This article puts forward several options for the client computer of a general-purpose nuclear instrument. The functions and features of several common client computer platforms, such as FPGA, ARM and DSP, are analyzed and compared, and their scope of application is discussed. At the same time, using a practical design as an example, the ideas behind client computer selection are described. This article can be used as a reference for the hardware design of the data acquisition and processing unit in a nuclear instrument. (authors)

  18. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations

    Directory of Open Access Journals (Sweden)

    Andrea Stocco

    2018-04-01

    Full Text Available This article describes the data analyzed in the paper “Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model” (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  19. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations.

    Science.gov (United States)

    Stocco, Andrea; Yamasaki, Brianna L; Prat, Chantel S

    2018-04-01

    This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  20. Checklist/Guide to Selecting a Small Computer.

    Science.gov (United States)

    Bennett, Wilma E.

    This 322-point checklist was designed to help executives make an intelligent choice when selecting a small computer for a business. For ease of use the questions have been divided into ten categories: Display Features, Keyboard Features, Printer Features, Controller Features, Software, Word Processing, Service, Training, Miscellaneous, and Costs.…

  1. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    OpenAIRE

    Cirasella, Jill

    2009-01-01

    This article is an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news.

  2. Implementation of Computer Assisted Test Selection System in Local Governments

    Directory of Open Access Journals (Sweden)

    Abdul Azis Basri

    2016-05-01

    Full Text Available As an evaluative approach to civil-servant selection across all government areas, the Computer Assisted Test (CAT) selection system began to be applied in 2013. When it was first implemented in all areas in 2014, the system ran into trouble in several regions, for example with the registration procedure and the passing grade. The main objective of this essay is to describe the implementation of the new selection system for civil servants in local governments and to assess its effectiveness. The essay combines a literature study with a field survey in which data were collected through interviews, observations, and documentation from various sources; the collected data were analyzed by data reduction, data display, and verification to draw conclusions. The results show that, although a few parts of the system were problematic, such as the registration phase, almost all phases of the CAT selection system in local government areas worked well, including the preparation, implementation, and result processing phases. The system also fulfilled two of three effectiveness criteria for a selection system, namely accuracy and trustworthiness. Therefore, this selection system can be regarded as an effective way to select new civil servants. As a suggestion, local governments should prepare thoroughly for all phases of the test, establish good feedback as an evaluation mechanism, and work together with the central government to identify, fix, and improve supporting infrastructure and the competency of local residents.

  3. Reducing constraints on quantum computer design by encoded selective recoupling

    International Nuclear Information System (INIS)

    Lidar, D.A.; Wu, L.-A.

    2002-01-01

    The requirement of performing both single-qubit and two-qubit operations in the implementation of universal quantum logic often leads to very demanding constraints on quantum computer design. We show here how to eliminate the need for single-qubit operations in a large subset of quantum computer proposals: those governed by isotropic and XXZ- or XY-type anisotropic exchange interactions. Our method employs an encoding of one logical qubit into two physical qubits, while logic operations are performed using an analogue of the NMR selective recoupling method.
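
    As a concrete illustration of the kind of encoding referred to above (the specific codewords are an assumption based on standard two-qubit encodings for exchange-only logic, not details taken from this record), one common choice is

        \[ |0_L\rangle = |0\rangle_1 |1\rangle_2, \qquad |1_L\rangle = |1\rangle_1 |0\rangle_2 . \]

    Within this single-excitation subspace an XY-type exchange term proportional to \( X_1 X_2 + Y_1 Y_2 \) swaps the two codewords and therefore acts as a logical bit-flip, so encoded single-qubit logic can be generated from two-qubit exchange interactions alone.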

  4. Selection and implementation of a laboratory computer system.

    Science.gov (United States)

    Moritz, V A; McMaster, R; Dillon, T; Mayall, B

    1995-07-01

    The process of selection of a pathology computer system has become increasingly complex as there are an increasing number of facilities that must be provided and stringent performance requirements under heavy computing loads from both human users and machine inputs. Furthermore, the continuing advances in software and hardware technology provide more options and innovative new ways of tackling problems. These factors taken together pose a difficult and complex set of decisions and choices for the system analyst and designer. The selection process followed by the Microbiology Department at Heidelberg Repatriation Hospital included examination of existing systems, development of a functional specification followed by a formal tender process. The successful tenderer was then selected using predefined evaluation criteria. The successful tenderer was a software development company that developed and supplied a system based on a distributed network using a SUN computer as the main processor. The software was written using Informix running on the UNIX operating system. This represents one of the first microbiology systems developed using a commercial relational database and fourth generation language. The advantages of this approach are discussed.

  5. Quantitative analysis of task selection for brain-computer interfaces

    Science.gov (United States)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.

  6. Modification to the Monte Carlo N-Particle (MCNP) Visual Editor (MCNPVised) to Read in Computer Aided Design (CAD) Files

    International Nuclear Information System (INIS)

    Randolph Schwarz; Leland L. Carter; Alysia Schwarz

    2005-01-01

    Monte Carlo N-Particle Transport Code (MCNP) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle is internationally recognized as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant was used to enhance the capabilities of the MCNP Visual Editor to allow it to read in both 2D and 3D Computer Aided Design (CAD) files, allowing the user to electronically generate a valid MCNP input geometry

  7. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    International Nuclear Information System (INIS)

    Schwarz, Randolph A.; Carter, Leland L.; Schwarz Alysia L.

    2006-01-01

    KENO is a Monte Carlo criticality code that is maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but can also be used for criticality calculations. MCNP is becoming increasingly popular for performing production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross section database. This manual conversion of input files between the two codes is labor intensive. The industry needs the capability of converting geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package will aid the user in converting between the codes. It is not intended to be used as a "black box". The resulting input file will need to be carefully inspected by criticality safety personnel to verify the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards.

  8. Selectively Fortifying Reconfigurable Computing Device to Achieve Higher Error Resilience

    Directory of Open Access Journals (Sweden)

    Mingjie Lin

    2012-01-01

    Full Text Available With the advent of 10 nm CMOS devices and “exotic” nanodevices, the location and occurrence time of hardware defects and design faults become increasingly unpredictable, therefore posing severe challenges to existing techniques for error-resilient computing because most of them statically assign hardware redundancy and do not account for the error tolerance inherently existing in many mission-critical applications. This work proposes a novel approach to selectively fortifying a target reconfigurable computing device in order to achieve hardware-efficient error resilience for a specific target application. We intend to demonstrate that such error resilience can be significantly improved with effective hardware support. The major contributions of this work include (1) the development of a complete methodology to perform sensitivity and criticality analysis of hardware redundancy, (2) a novel problem formulation and an efficient heuristic methodology to selectively allocate hardware redundancy among a target design’s key components in order to maximize its overall error resilience, and (3) an academic prototype of an SFC computing device that illustrates a 4-times improvement in error resilience for an H.264 encoder implemented with an FPGA device.

  9. Modification to the Monte N-Particle (MCNP) Visual Editor (MCNPVised) to read in Computer Aided Design (CAD) files

    International Nuclear Information System (INIS)

    Schwarz, Randy A.; Carter, Leeland L.

    2004-01-01

    Monte Carlo N-Particle Transport Code (MCNP) (Reference 1) is the code of choice for doing complex neutron/photon/electron transport calculations for the nuclear industry and research institutions. The Visual Editor for Monte Carlo N-Particle (References 2 to 11) is recognized internationally as the best code for visually creating and graphically displaying input files for MCNP. The work performed in this grant enhanced the capabilities of the MCNP Visual Editor to allow it to read in a 2D Computer Aided Design (CAD) file, allowing the user to modify and view the 2D CAD file and then electronically generate a valid MCNP input geometry with a user specified axial extent

  10. Energy-efficient computing and networking. Revised selected papers

    Energy Technology Data Exchange (ETDEWEB)

    Hatziargyriou, Nikos; Dimeas, Aris [Ethnikon Metsovion Polytechneion, Athens (Greece); Weidlich, Anke (eds.) [SAP Research Center, Karlsruhe (Germany); Tomtsi, Thomai

    2011-07-01

    This book constitutes the postproceedings of the First International Conference on Energy-Efficient Computing and Networking, E-Energy, held in Passau, Germany in April 2010. The 23 revised papers presented were carefully reviewed and selected for inclusion in the post-proceedings. The papers are organized in topical sections on energy market and algorithms, ICT technology for the energy market, implementation of smart grid and smart home technology, microgrids and energy management, and energy efficiency through distributed energy management and buildings. (orig.)

  11. Schools (Students) Exchanging CAD/CAM Files over the Internet.

    Science.gov (United States)

    Mahoney, Gary S.; Smallwood, James E.

    This document discusses how students and schools can benefit from exchanging computer-aided design/computer-aided manufacturing (CAD/CAM) files over the Internet, explains how files are exchanged, and examines the problem of selected hardware/software incompatibility. Key terms associated with information search services are defined, and several…

  12. Model Selection in Historical Research Using Approximate Bayesian Computation

    Science.gov (United States)

    Rubio-Campillo, Xavier

    2016-01-01

    Formal Models and History: Computational models are increasingly being used to study historical dynamics. This new trend, which could be named Model-Based History, makes use of recently published datasets and innovative quantitative methods to improve our understanding of past societies based on their written sources. The extensive use of formal models allows historians to re-evaluate hypotheses formulated decades ago and still subject to debate due to the lack of an adequate quantitative framework. The initiative has the potential to transform the discipline if it solves the challenges posed by the study of historical dynamics. These difficulties are based on the complexities of modelling social interaction, and the methodological issues raised by the evaluation of formal models against data with low sample size, high variance and strong fragmentation. Case Study: This work examines an alternate approach to this evaluation based on a Bayesian-inspired model selection method. The validity of the classical Lanchester’s laws of combat is examined against a dataset comprising over a thousand battles spanning 300 years. Four variations of the basic equations are discussed, including the three most common formulations (linear, squared, and logarithmic) and a new variant introducing fatigue. Approximate Bayesian Computation is then used to infer both parameter values and model selection via Bayes Factors. Impact: Results indicate decisive evidence favouring the new fatigue model. The interpretation of both parameter estimations and model selection provides new insights into the factors guiding the evolution of warfare. At a methodological level, the case study shows how model selection methods can be used to guide historical research through the comparison between existing hypotheses and empirical evidence. PMID:26730953
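
    The abstract above sketches rejection-style Approximate Bayesian Computation used for model comparison. Purely as an illustration of that workflow (the toy simulators, scalar summary statistic and uniform priors are invented here; this is not the authors' code or data), a minimal Python sketch might look as follows, with the ratio of ABC acceptance rates under equal model priors serving as a crude Bayes-factor estimate:

        import numpy as np

        rng = np.random.default_rng(0)
        observed = 2.0  # toy summary statistic standing in for the battle data

        def simulate_m1(theta):            # toy model 1: linear response
            return theta + rng.normal(0.0, 0.1)

        def simulate_m2(theta):            # toy model 2: quadratic response
            return theta ** 2 + rng.normal(0.0, 0.1)

        def abc_accepts(simulate, prior_sampler, eps=0.05, n_draws=20000):
            """Count prior draws whose simulated summary lands within eps of the observation."""
            hits = 0
            for _ in range(n_draws):
                if abs(simulate(prior_sampler()) - observed) < eps:
                    hits += 1
            return hits, n_draws

        h1, n = abc_accepts(simulate_m1, lambda: rng.uniform(0.0, 4.0))
        h2, _ = abc_accepts(simulate_m2, lambda: rng.uniform(0.0, 4.0))

        # With equal model priors, the ratio of acceptance rates approximates the Bayes factor.
        bayes_factor = (h1 / n) / max(h2 / n, 1e-12)
        print(f"accepted: m1={h1}, m2={h2}, estimated Bayes factor (m1 vs m2) = {bayes_factor:.2f}")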

  13. A computational neural model of goal-directed utterance selection.

    Science.gov (United States)

    Klein, Michael; Kamp, Hans; Palm, Guenther; Doya, Kenji

    2010-06-01

    It is generally agreed that much of human communication is motivated by extra-linguistic goals: we often make utterances in order to get others to do something, or to make them support our cause, or adopt our point of view, etc. However, thus far a computational foundation for this view on language use has been lacking. In this paper we propose such a foundation using Markov Decision Processes. We borrow computational components from the field of action selection and motor control, where a neurobiological basis of these components has been established. In particular, we make use of internal models (i.e., next-state transition functions defined on current state action pairs). The internal model is coupled with reinforcement learning of a value function that is used to assess the desirability of any state that utterances (as well as certain non-verbal actions) can bring about. This cognitive architecture is tested in a number of multi-agent game simulations. In these computational experiments an agent learns to predict the context-dependent effects of utterances by interacting with other agents that are already competent speakers. We show that the cognitive architecture can account for acquiring the capability of deciding when to speak in order to achieve a certain goal (instead of performing a non-verbal action or simply doing nothing), whom to address and what to say. Copyright 2010 Elsevier Ltd. All rights reserved.
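
    The architecture described above couples an internal transition model with a learned value function and selects among verbal and non-verbal actions by lookahead. The following Python sketch is only a toy rendering of that idea under invented states, actions, transition probabilities and rewards (none of them come from the paper):

        # Toy goal-directed action selection: value iteration on an internal model,
        # then one-step lookahead to decide whether to speak, gesture, or do nothing.
        states  = ["far_from_goal", "listener_informed", "goal_reached"]
        actions = ["say_request", "point", "do_nothing"]

        P = {  # internal model: P(next_state | state, action), hard-coded for illustration
            ("far_from_goal", "say_request"): {"listener_informed": 0.8, "far_from_goal": 0.2},
            ("far_from_goal", "point"):       {"listener_informed": 0.5, "far_from_goal": 0.5},
            ("far_from_goal", "do_nothing"):  {"far_from_goal": 1.0},
            ("listener_informed", "say_request"): {"goal_reached": 0.6, "listener_informed": 0.4},
            ("listener_informed", "point"):       {"goal_reached": 0.4, "listener_informed": 0.6},
            ("listener_informed", "do_nothing"):  {"listener_informed": 1.0},
            ("goal_reached", "say_request"): {"goal_reached": 1.0},
            ("goal_reached", "point"):       {"goal_reached": 1.0},
            ("goal_reached", "do_nothing"):  {"goal_reached": 1.0},
        }
        reward = {"far_from_goal": 0.0, "listener_informed": 0.0, "goal_reached": 1.0}
        gamma = 0.9

        def q(s, a, V):
            return sum(p * (reward[s2] + gamma * V[s2]) for s2, p in P[(s, a)].items())

        V = {s: 0.0 for s in states}
        for _ in range(100):                          # value iteration over the internal model
            V = {s: max(q(s, a, V) for a in actions) for s in states}

        def choose_action(s):                         # one-step lookahead with the same model
            return max(actions, key=lambda a: q(s, a, V))

        print(choose_action("far_from_goal"))         # the agent decides to speak rather than stay silent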

  14. Geothermal-energy files in computer storage: sites, cities, and industries

    Energy Technology Data Exchange (ETDEWEB)

    O'Dea, P.L.

    1981-12-01

    The site, city, and industrial files are described. The data presented are from the hydrothermal site file containing about three thousand records which describe some of the principal physical features of hydrothermal resources in the United States. Data elements include: latitude, longitude, township, range, section, surface temperature, subsurface temperature, the field potential, and well depth for commercialization. (MHR)

  15. Report on the achievements in the Sunshine Project in fiscal 1986. Surveys on coal type selection and surveys on coal types (Data file); 1986 nendo tanshu sentei chosa tanshu chosa seika hokokusho. Data file

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1987-03-01

    This is the data file on coal types for liquefaction accompanying the report on the surveys on coal type selection and on coal types (JN0040843). It records information such as coal occurrence and production, various kinds of analyses, and test values from the liquefaction test data collected and submitted to date. The file consists of two parts: a test sample information file on coal occurrence, production and coal mines, and an analysis and test file holding the results of the different analyses and tests. However, the test sample information files (1) through (6), covering test samples and sample collection, geography, geology, ground beds, coal beds, coal mines, development and transportation, have not yet been organized. The analysis and test file contains (7) industrial analyses, (8) element analysis, (9) ash composition, (10) solubility of ash, (11) structure analysis, (12) liquefaction characteristics (standard version), (13) analysis of liquefaction produced gas, (14) distillation characteristics of liquefaction produced oil, (15) liquefaction characteristics (simplified version), (16) analysis of liquefaction produced gas (simplified version), and (17) distillation characteristics of liquefaction produced oil (simplified version). However, the information on liquefaction tests using a tubing reactor in (15) through (17) has not yet been organized. (NEDO)

  16. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography -An In Vitro Study.

    Science.gov (United States)

    Dhingra, Annil; Ruhal, Nidhi; Miglani, Anjali

    2015-04-01

    Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is of equal importance; otherwise, canal aberrations such as transportation will be created. The purpose of this study was to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation in mandibular first molars using cone beam computed tomography. Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial. Teeth were prepared using one rotary and two reciprocating single-file systems and were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were taken and evaluated for three parameters: canal transportation, cervical dentinal thickness and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. The change in cross-sectional area after filing showed a significant difference at 0 mm, 1 mm, 2 mm and 7 mm (p < 0.05). For canal transportation evaluated for each file system over a distance of 7 mm (starting from 0 mm and then evaluating at 1 mm, 2 mm, 3 mm, 5 mm and 7 mm), the results showed a significant difference among the file systems at various lengths (p = 0.014, 0.046, 0.004, 0.028, 0.005 and 0.029 respectively). The mean cervical dentin removal was greatest at all levels for OneShape and least for WaveOne, indicating better performance of WaveOne and Reciproc over the OneShape file system. A significant difference was found at 9 mm, 11 mm and 12 mm between all three file systems (p < 0.001, < 0.001, < 0.001). It was concluded that reciprocating motion is better than rotary motion for all three parameters: canal transportation, cross-sectional area and cervical dentinal thickness.

  17. Generation of Gaussian 09 Input Files for the Computation of 1H and 13C NMR Chemical Shifts of Structures from a Spartan’14 Conformational Search

    OpenAIRE

    sprotocols

    2014-01-01

    Authors: Spencer Reisbick & Patrick Willoughby ### Abstract This protocol describes an approach to preparing a series of Gaussian 09 computational input files for an ensemble of conformers generated in Spartan’14. The resulting input files are necessary for computing optimum geometries, relative conformer energies, and NMR shielding tensors using Gaussian. Using the conformational search feature within Spartan’14, an ensemble of conformational isomers was obtained. To convert the str...

  18. Selection of Norway spruce somatic embryos by computer vision

    Science.gov (United States)

    Hamalainen, Jari J.; Jokinen, Kari J.

    1993-05-01

    A computer vision system was developed for the classification of plant somatic embryos. The embryos are in a Petri dish that is transferred with constant speed and they are recognized as they pass a line scan camera. A classification algorithm needs to be installed for every plant species. This paper describes an algorithm for the recognition of Norway spruce (Picea abies) embryos. A short review of conifer micropropagation by somatic embryogenesis is also given. The recognition algorithm is based on features calculated from the boundary of the object. Only the part of the boundary corresponding to the developing cotyledons (2-15) and the straight sides of the embryo is used for recognition. An index of the length of the cotyledons describes the developmental stage of the embryo. The testing set for classifier performance consisted of 118 embryos and 478 nonembryos. With the classification tolerances chosen, 69% of the objects classified as embryos by a human classifier were selected and 31% rejected. Less than 1% of the nonembryos were classified as embryos. The basic features developed can probably be easily adapted for the recognition of other conifer somatic embryos.

  19. Computational Approach for Epitaxial Polymorph Stabilization through Substrate Selection

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Hong; Dwaraknath, Shyam S.; Garten, Lauren; Ndione, Paul; Ginley, David; Persson, Kristin A.

    2016-05-25

    With the ultimate goal of finding new polymorphs through targeted synthesis conditions and techniques, we outline a computational framework to select optimal substrates for epitaxial growth using first-principles calculations of formation energies, elastic strain energy, and topological information. To demonstrate the approach, we study the stabilization of metastable VO2 compounds, which provide a rich chemical and structural polymorph space. We find that common polymorph statistics, lattice matching, and energy-above-hull considerations recommend homostructural growth on TiO2 substrates, where the VO2 brookite phase would be preferentially grown on the a-c TiO2 brookite plane while the columbite and anatase structures favor the a-b plane on the respective TiO2 phases. Overall, we find that a model which incorporates a geometric unit cell area matching between the substrate and the target film as well as the resulting strain energy density of the film provides qualitative agreement with experimental observations for the heterostructural growth of known VO2 polymorphs: rutile, A and B phases. The minimal interfacial geometry matching and estimated strain energy criteria provide several suggestions for substrates and substrate-film orientations for the heterostructural growth of the hitherto hypothetical anatase, brookite, and columbite polymorphs. These criteria serve as preliminary guidance for the experimental efforts stabilizing new materials and/or polymorphs through epitaxy. The current screening algorithm is being integrated within the Materials Project online framework and database, and is hence publicly available.
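
    One step of the screening described above, the lattice-mismatch and elastic strain-energy estimate, can be illustrated with a short Python sketch. All lattice parameters, elastic constants and substrate names below are placeholders rather than values from the paper or the Materials Project, and a real screening would also include formation energies and interface area matching:

        # Rank candidate substrates by epitaxial misfit strain and the resulting
        # equibiaxial strain energy density of the film, u = [E/(1-nu)] * eps^2.
        film_a = 4.59                    # in-plane lattice parameter of the target film (angstrom), placeholder
        film_E, film_nu = 200e9, 0.30    # Young's modulus (Pa) and Poisson ratio of the film, placeholders

        substrates = {"substrate-A": 4.62, "substrate-B": 4.51, "substrate-C": 4.80}  # placeholder lattice parameters

        def misfit_and_energy(a_sub, a_film, E, nu):
            eps = (a_sub - a_film) / a_film          # misfit strain imposed on the film
            return eps, (E / (1.0 - nu)) * eps ** 2  # strain energy density (J/m^3)

        for name, a_sub in sorted(substrates.items(), key=lambda kv: abs(kv[1] - film_a)):
            eps, u = misfit_and_energy(a_sub, film_a, film_E, film_nu)
            print(f"{name:12s}  misfit = {eps:+.3%}  strain energy density ~ {u/1e6:.2f} MJ/m^3")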

  20. Computational system to create an entry file for replicating I-125 seeds simulating brachytherapy case studies using the MCNPX code

    Directory of Open Access Journals (Sweden)

    Leonardo da Silva Boia

    2014-03-01

    Full Text Available Purpose: A computational system was developed for this paper in the C++ programming language to create a 125I radioactive seed entry file, based on the positioning of a virtual grid (template) in voxel geometries, with the purpose of performing prostate cancer treatment simulations using the MCNPX code. Methods: The system is fed with information from the planning system regarding each seed's location and depth, and an entry file is automatically created with all the cards (instructions) for each seed, with their cell blocks and surfaces spread out spatially in the 3D environment. The system reproduces the clinical scenario with precision in the MCNPX code's simulation environment, thereby allowing the technique to be studied in depth. Results and Conclusion: In order to validate the computational system, an entry file was created with 88 125I seeds inserted in the MAX06 phantom's prostate region, with the initial activity of the seeds set at 0.27 mCi. Isodose curves were obtained in all the prostate slices in 5 mm steps over the 7 to 10 cm interval, totaling 7 slices. Variance reduction techniques were applied in order to optimize computational time and reduce uncertainties, such as photon and electron energy cutoffs at 4 keV and forced collisions in the cells of interest. The isodose curves obtained show that hot spots have values above 300 Gy, as anticipated in the literature, stressing the importance of correct source positioning, which the computational system developed here provides, so as not to deliver excessive doses to adjacent organs at risk. The 144 Gy prescription curve showed in the validation process that it covers a large percentage of the volume, at the same time that it demonstrates a large
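
    The record does not reproduce the authors' C++ code; purely to illustrate the idea of emitting one cell/surface pair per seed position taken from a planning grid, here is a small Python sketch. The card layout is deliberately simplified and the density, seed radius and positions are placeholders; a usable MCNPX deck would additionally need material, source and lattice definitions:

        # Write a simplified MCNP-style geometry fragment: one cell card and one
        # spherical surface card per seed position. Not a complete, valid input deck.
        seed_radius_cm = 0.04                                                   # placeholder seed radius
        seed_positions = [(1.0, 2.0, 7.5), (1.5, 2.0, 7.5), (2.0, 2.5, 8.0)]    # placeholder (x, y, z) in cm

        with open("seeds_fragment.txt", "w") as out:
            out.write("c ---- seed cells ----\n")
            for i, _ in enumerate(seed_positions, start=1):
                # cell number, material 1, placeholder density, bounded by sphere surface 100+i
                out.write(f"{i:<5d} 1 -4.93  -{100 + i}  imp:p=1\n")
            out.write("c ---- seed surfaces ----\n")
            for i, (x, y, z) in enumerate(seed_positions, start=1):
                # general sphere surface card: number  s  x  y  z  R
                out.write(f"{100 + i:<5d} s {x:.3f} {y:.3f} {z:.3f} {seed_radius_cm:.3f}\n")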

  1. Selections from 2017: Computers Help Us Map Our Home

    Science.gov (United States)

    Kohler, Susanna

    2017-12-01

    Editor's note: In these last two weeks of 2017, we'll be looking at a few selections that we haven't yet discussed on AAS Nova from among the most-downloaded papers published in AAS journals this year. The usual posting schedule will resume in January. Machine-Learned Identification of RR Lyrae Stars from Sparse, Multi-Band Data: The PS1 Sample (published April 2017). Main takeaway: A sample of RR Lyrae variable stars was built from the Pan-STARRS1 (PS1) survey by a team led by Branimir Sesar (Max Planck Institute for Astronomy, Germany). The sample of 45,000 stars represents the widest (three-fourths of the sky) and deepest (reaching 120 kpc) sample of RR Lyrae stars to date. Why it's interesting: It's challenging to understand the overall shape and behavior of our galaxy because we're stuck on the inside of it. RR Lyrae stars are a useful tool for this purpose: they can be used as tracers to map out the Milky Way's halo. The authors' large sample of RR Lyrae stars from PS1, combined with proper-motion measurements from Gaia and radial-velocity measurements from multi-object spectroscopic surveys, could become the premier source for studying the structure, kinematics, and gravitational potential of our galaxy's outskirts. How they were found: [Figure caption: The black dots show the distribution of the 45,000 probable RR Lyrae stars in the authors' sample. Sesar et al. 2017.] The 45,000 stars in this sample were selected not by humans, but by computer. The authors used machine-learning algorithms to examine the light curves in the Pan-STARRS1 sample and identify the characteristic brightness variations of RR Lyrae stars lying in the galactic halo. These techniques resulted in a very pure and complete sample, and the authors suggest that this approach may translate well to other sparse, multi-band data sets such as that from the upcoming Large Synoptic Survey Telescope (LSST) galactic plane sub-survey. Citation: Branimir Sesar et al 2017 AJ 153 204. doi:10.3847/1538-3881/aa661b

  2. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    Science.gov (United States)

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  3. Computationally efficient thermal-mechanical modelling of selective laser melting

    Science.gov (United States)

    Yang, Yabin; Ayas, Can

    2017-10-01

    Selective laser melting (SLM) is a powder based additive manufacturing (AM) method to produce high density metal parts with complex topology. However, part distortions and accompanying residual stresses deteriorate the mechanical reliability of SLM products. Modelling of the SLM process is anticipated to be instrumental for understanding and predicting the development of the residual stress field during the build process. However, SLM process modelling requires determination of the heat transients within the part being built, which is coupled to a mechanical boundary value problem to calculate displacement and residual stress fields. Thermal models associated with SLM are typically complex and computationally demanding. In this paper, we present a simple semi-analytical thermal-mechanical model, developed for SLM, that represents the effect of laser scanning vectors with line heat sources. The temperature field within the part being built is attained by superposition of the temperature field associated with line heat sources in a semi-infinite medium and a complementary temperature field which accounts for the actual boundary conditions. An analytical solution of a line heat source in a semi-infinite medium is first described, followed by the numerical procedure used for finding the complementary temperature field. This analytical description of the line heat sources is able to capture the steep temperature gradients in the vicinity of the laser spot, which is typically tens of micrometers. In turn, the semi-analytical thermal model allows a relatively coarse discretisation of the complementary temperature field. The temperature history determined is used to calculate the thermal strain induced on the SLM part. Finally, a mechanical model governed by an elastic-plastic constitutive rule with isotropic hardening is used to predict the residual stresses.
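
    For reference, the textbook instantaneous line-source solution on which such a superposition can be built (stated here as background; the exact form used in the paper, and its complementary boundary field, are not reproduced in this record) is

        \[ \Delta T(r,t) \;=\; \frac{q_l/(\rho c_p)}{4\pi \alpha t}\, \exp\!\left(-\frac{r^{2}}{4\alpha t}\right), \]

    where \(q_l\) is the absorbed energy per unit length of the scan vector, \(\rho c_p\) the volumetric heat capacity, \(\alpha\) the thermal diffusivity and \(r\) the distance from the line; the field of a full scan vector follows by superposing such contributions in space and time and adding the complementary field that enforces the actual boundary conditions.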

  4. OK, Computer: File Sharing, the Music Industry, and Why We Need the Pirate Party

    Directory of Open Access Journals (Sweden)

    Adrian Cosstick

    2009-03-01

    Full Text Available The Pirate Party believes the state and big business are in the process of protecting stale and inefficient models of business for their own monetary benefit by limiting our right to share information. The Pirate Party suggests that they are achieving this goal through the amendment of intellectual property legislation. In the dawn of the digital era, the Pirate Party advocates that governments and multinational corporations are using intellectual property to: crack down on file sharing which limits the ability to share knowledge and information; increase the terms and length of copyright to raise profits; and build code into music files which limits their ability to be shared (Pirate Party, 2009. There are a number of ‘copyright industries’ that are affected by these issues, none more so than the music industry. Its relationship with file sharing is topical and makes an excellent case study to address the impact big business has had on intellectual property and the need for the Pirate Party’s legislative input. The essay will then examine the central issues raised by illegal file sharing. In particular, the future for record companies in an environment that increasingly demands flexibility, and whether the Pirate Party’s proposal is a viable solution to the music industry’s problems

  5. Development of selective photoionization spectroscopy technology - Development of a computer program to calculate selective ionization of atoms with multistep processes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Soon; Nam, Baek Il [Myongji University, Seoul (Korea, Republic of)

    1995-08-01

    We have developed computer programs to calculate 2- and 3-step selective resonant multiphoton ionization of atoms. Autoionization resonances in the final continuum can be taken into account via the B-spline basis set method. 8 refs., 5 figs. (author)

  6. Comparative evaluation of effect of rotary and reciprocating single-file systems on pericervical dentin: A cone-beam computed tomography study.

    Science.gov (United States)

    Zinge, Priyanka Ramdas; Patil, Jayaprakash

    2017-01-01

    The aim of this study is to evaluate and compare the effect of the OneShape and Neolix rotary single-file systems and the WaveOne and Reciproc reciprocating single-file systems on pericervical dentin (PCD) using cone-beam computed tomography (CBCT). A total of 40 freshly extracted mandibular premolars were collected and divided into two groups, namely, Group A - Rotary: A1 - Neolix and A2 - OneShape, and Group B - Reciprocating: B1 - WaveOne and B2 - Reciproc. Preoperative scans of each were taken, followed by conventional access cavity preparation and working length determination with a 10-K file. Instrumentation of the canal was done according to the respective file system, and postinstrumentation CBCT scans of the teeth were obtained. 90 μm thick slices were obtained 4 mm apical and coronal to the cementoenamel junction. The PCD thickness was calculated as the shortest distance from the canal outline to the closest adjacent root surface, measured on four surfaces, i.e., facial, lingual, mesial, and distal, for all the groups in the two scans. There was no significant difference between the rotary and reciprocating single-file systems in their effect on PCD, but Group B2 showed the most significant loss of tooth structure on the mesial, lingual, and distal surfaces (P < 0.05). The Reciproc file system removes more PCD as compared to the other experimental groups, whereas the Neolix single-file system had the least effect on PCD.

  7. Selective Bibliography on the History of Computing and Information Processing.

    Science.gov (United States)

    Aspray, William

    1982-01-01

    Lists some of the better-known and more accessible books on the history of computing and information processing, covering: (1) popular general works; (2) more technical general works; (3) microelectronics and computing; (4) artificial intelligence and robotics; (5) works relating to Charles Babbage; (6) other biographical and personal accounts;…

  8. F2AC: A Lightweight, Fine-Grained, and Flexible Access Control Scheme for File Storage in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ren

    2016-01-01

    Full Text Available Current file storage service models for cloud servers assume that users either belong to single layer with different privileges or cannot authorize privileges iteratively. Thus, the access control is not fine-grained and flexible. Besides, most access control methods at cloud servers mainly rely on computationally intensive cryptographic algorithms and, especially, may not be able to support highly dynamic ad hoc groups with addition and removal of group members. In this paper, we propose a scheme called F2AC, which is a lightweight, fine-grained, and flexible access control scheme for file storage in mobile cloud computing. F2AC can not only achieve iterative authorization, authentication with tailored policies, and access control for dynamically changing accessing groups, but also provide access privilege transition and revocation. A new access control model called directed tree with linked leaf model is proposed for further implementations in data structures and algorithms. The extensive analysis is given for justifying the soundness and completeness of F2AC.

  9. 77 FR 27263 - Computer Matching Between the Selective Service System and the Department of Education

    Science.gov (United States)

    2012-05-09

    ... SELECTIVE SERVICE SYSTEM Computer Matching Between the Selective Service System and the Department of Education AGENCY: Selective Service System. Action: Notice. In accordance with the Privacy Act of... of Participating Agencies The Selective Service System (SSS) and the Department of Education (ED). 2...

  10. Computation of diatomic molecular spectra for selected transitions of aluminum monoxide, cyanide, diatomic carbon, and titanium monoxide

    Energy Technology Data Exchange (ETDEWEB)

    Parigger, Christian G., E-mail: cparigge@tennessee.edu [The University of Tennessee/University of Tennessee Space Institute, Center for Laser Applications, 411 B.H. Goethert Parkway, Tullahoma, TN 37388-9700 (United States); Woods, Alexander C.; Surmick, David M.; Gautam, Ghaneshwar; Witte, Michael J. [The University of Tennessee/University of Tennessee Space Institute, Center for Laser Applications, 411 B.H. Goethert Parkway, Tullahoma, TN 37388-9700 (United States); Hornkohl, James O. [Hornkohl Consulting, Tullahoma, TN 37388 (United States)

    2015-05-01

    Laser ablation studies with laser-induced breakdown spectroscopy (LIBS) typically emphasize atomic species, yet fingerprints from molecular species can occur subsequently or concurrently. In this work, selected molecular transitions of aluminum monoxide (AlO), diatomic carbon (C2), cyanide (CN), and titanium monoxide (TiO) are accurately computed. Line strength tables are used to describe the radiative transitions of diatomic molecules primarily in the visible, optical region. Details are elaborated of the computational procedure that allows one to utilize diatomic spectra as a predictive and as a diagnostic tool. In order to create a computed spectrum, the procedure requires information regarding the temperature of the diatomic transitions along with other input such as the spectral resolution. When combined with a fitting algorithm to optimize such parameters, this procedure is used to infer information from an experimentally obtained spectrum. Furthermore, the programs and data files are provided for LIBS investigations that also reveal AlO, C2, CN, and TiO diatomic spectra. - Highlights: • We present a program for fitting of molecular spectra. • This includes a database for AlO, C2, CN, and TiO. • We discuss the details of the program including fitting. • We show computed examples and reference current work.

  11. Dynamic angle selection in X-ray computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Dabravolski, Andrei, E-mail: andrei.dabravolski@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Batenburg, Kees Joost, E-mail: joost.batenburg@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Centrum Wiskunde and Informatica (CWI), Science Park 123, 1098 XG Amsterdam (Netherlands); Sijbers, Jan, E-mail: jan.sijbers@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium)

    2014-04-01

    Highlights: • We propose the dynamic angle selection algorithm for CT scanning. • The approach is based on the concept of information gain over a set of solutions. • Projection angles are selected based on the already available projection data. • The approach can lead to more accurate results from fewer projections. - Abstract: In X-ray tomography, a number of radiographs (projections) are recorded from which a tomogram is then reconstructed. Conventionally, these projections are acquired equiangularly, resulting in an unbiased sampling of the Radon space. However, especially in case when only a limited number of projections can be acquired, the selection of the angles has a large impact on the quality of the reconstructed image. In this paper, a dynamic algorithm is proposed, in which new projection angles are selected by maximizing the information gain about the object, given the set of possible new angles. Experiments show that this approach can select projection angles for which the accuracy of the reconstructed image is significantly higher compared to the standard angle selections schemes.
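
    A schematic rendering of the greedy idea described above is sketched below in Python. Here the information gain is approximated by the disagreement (variance) of forward projections over a bundle of candidate reconstructions, an assumed surrogate for illustration rather than the authors' exact criterion, and the projector is a toy rotate-and-sum operation:

        import numpy as np
        from scipy.ndimage import rotate

        def forward_project(image, angle_deg):
            """Toy parallel-beam projection: rotate the image and sum along columns."""
            return rotate(image, angle_deg, reshape=False, order=1).sum(axis=0)

        def next_angle(candidate_images, candidate_angles, used):
            """Greedy choice: the unused angle over which the current candidate solutions disagree most."""
            best, best_score = None, -np.inf
            for ang in candidate_angles:
                if ang in used:
                    continue
                projs = np.stack([forward_project(img, ang) for img in candidate_images])
                score = projs.var(axis=0).sum()      # disagreement between candidate solutions
                if score > best_score:
                    best, best_score = ang, score
            return best

        # usage sketch: `candidates` would be reconstructions consistent with the data acquired so far
        candidates = [np.random.rand(64, 64) for _ in range(5)]
        print(next_angle(candidates, range(0, 180, 5), used={0, 90}))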

  12. Dynamic angle selection in X-ray computed tomography

    International Nuclear Information System (INIS)

    Dabravolski, Andrei; Batenburg, Kees Joost; Sijbers, Jan

    2014-01-01

    Highlights: • We propose the dynamic angle selection algorithm for CT scanning. • The approach is based on the concept of information gain over a set of solutions. • Projection angles are selected based on the already available projection data. • The approach can lead to more accurate results from fewer projections. - Abstract: In X-ray tomography, a number of radiographs (projections) are recorded from which a tomogram is then reconstructed. Conventionally, these projections are acquired equiangularly, resulting in an unbiased sampling of the Radon space. However, especially in case when only a limited number of projections can be acquired, the selection of the angles has a large impact on the quality of the reconstructed image. In this paper, a dynamic algorithm is proposed, in which new projection angles are selected by maximizing the information gain about the object, given the set of possible new angles. Experiments show that this approach can select projection angles for which the accuracy of the reconstructed image is significantly higher compared to the standard angle selections schemes

  13. Computer aided selection of plant layout | Kitaw | Zede Journal

    African Journals Online (AJOL)

    This paper deals with the fundamental concepts of plant layout, in which the need for plant layout, the systematic and logical approaches to the problems, layout solutions and the objectives of plant layout are discussed. Further, the approaches and the scoring techniques of the two available computer routines are ...

  14. Micro computed tomography evaluation of the Self-adjusting file and ProTaper Universal system on curved mandibular molars.

    Science.gov (United States)

    Serefoglu, Burcu; Piskin, Beyser

    2017-09-26

    The aim of this investigation was to compare the cleaning and shaping efficiency of the Self-adjusting File and ProTaper, and to assess the correlation between root canal curvature and working time in mandibular molars using micro-computed tomography. Twenty extracted mandibular molars were instrumented with ProTaper and the Self-adjusting File, and the total working time was measured in the mesial canals. The changes in canal volume, surface area, structure model index, transportation and uninstrumented area, as well as the correlation between working time and curvature, were analyzed. Although no statistically significant difference was observed between the two systems in distal canals (p > 0.05), a significantly higher removed dentin volume and a lower uninstrumented area were obtained with ProTaper in mesial canals (p < 0.0001). A correlation between working time and canal curvature was also observed in mesial canals for both groups (SAF r² = 0.792, p < 0.0004; PTU r² = 0.9098, p < 0.0001).

  15. ADAM: A computer program to simulate selective-breeding schemes for animals

    DEFF Research Database (Denmark)

    Pedersen, L D; Sørensen, A C; Henryon, M

    2009-01-01

    ADAM is a computer program that models selective breeding schemes for animals using stochastic simulation. The program simulates a population of animals and traces the genetic changes in the population under different selective breeding scenarios. It caters to different population structures, genetic models, selection strategies, and mating designs. ADAM can be used to evaluate breeding schemes and generate genetic data to test statistical tools...

  16. Effect of Model Selection on Computed Water Balance Components

    NARCIS (Netherlands)

    Jhorar, R.K.; Smit, A.A.M.F.R.; Roest, C.W.J.

    2009-01-01

    Soil water flow modelling approaches as used in four selected on-farm water management models, namely CROPWAT, FAIDS, CERES and SWAP, are compared through numerical experiments. The soil water simulation approaches used in the first three models are reformulated to incorporate all evapotranspiration

  17. Computationally efficient thermal-mechanical modelling of selective laser melting

    NARCIS (Netherlands)

    Yang, Y.; Ayas, C.; Brabazon, Dermot; Naher, Sumsun; Ul Ahad, Inam

    2017-01-01

    The Selective laser melting (SLM) is a powder based additive manufacturing (AM) method to produce high density metal parts with complex topology. However, part distortions and accompanying residual stresses deteriorates the mechanical reliability of SLM products. Modelling of the SLM process is

  18. Request queues for interactive clients in a shared file system of a parallel computing system

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin

    2015-08-18

    Interactive requests are processed from users of log-in nodes. A metadata server node is provided for use in a file system shared by one or more interactive nodes and one or more batch nodes. The interactive nodes comprise interactive clients to execute interactive tasks and the batch nodes execute batch jobs for one or more batch clients. The metadata server node comprises a virtual machine monitor; an interactive client proxy to store metadata requests from the interactive clients in an interactive client queue; a batch client proxy to store metadata requests from the batch clients in a batch client queue; and a metadata server to store the metadata requests from the interactive client queue and the batch client queue in a metadata queue based on an allocation of resources by the virtual machine monitor. The metadata requests can be prioritized, for example, based on one or more of a predefined policy and predefined rules.
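
    As an illustration only, the queueing arrangement described above (separate interactive and batch proxy queues drained into a single metadata queue according to an allocation set by the monitor) could be modelled roughly as follows; the class name, weights and requests are invented for the sketch:

        from collections import deque

        class MetadataServerModel:
            """Toy model: two client-proxy queues drained into one metadata queue by weight."""
            def __init__(self, interactive_weight=3, batch_weight=1):
                self.interactive_q, self.batch_q, self.metadata_q = deque(), deque(), deque()
                self.weights = (interactive_weight, batch_weight)   # allocation chosen by the monitor

            def submit(self, request, interactive):
                (self.interactive_q if interactive else self.batch_q).append(request)

            def schedule(self):
                """Move requests into the metadata queue, favouring interactive clients."""
                w_int, w_batch = self.weights
                while self.interactive_q or self.batch_q:
                    for _ in range(w_int):
                        if self.interactive_q:
                            self.metadata_q.append(self.interactive_q.popleft())
                    for _ in range(w_batch):
                        if self.batch_q:
                            self.metadata_q.append(self.batch_q.popleft())

        srv = MetadataServerModel()
        for i in range(4):
            srv.submit(f"batch-stat-{i}", interactive=False)
        srv.submit("ls /scratch/run42", interactive=True)
        srv.schedule()
        print(list(srv.metadata_q))   # the interactive request is served ahead of most batch requests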

  19. A computer program for creating keyword indexes to textual data files

    Science.gov (United States)

    Moody, David W.

    1972-01-01

    A keyword-in-context (KWIC) or out-of-context (KWOC) index is a convenient means of organizing information. This keyword index program can be used to create either KWIC or KWOC indexes of bibliographic references or other types of information punched on cards, typed on optical scanner sheets, or retrieved from various Department of Interior data bases using the Generalized Information Processing System (GIPSY). The index consists of a 'bibliographic' section and a keyword section based on the permutation of document titles, project titles, environmental impact statement titles, maps, etc. or lists of descriptors. The program can also create a back-of-the-book index to documents from a list of descriptors. By providing the user with a wide range of input and output options, the program provides the researcher, manager, or librarian with a means of maintaining a list and index to documents in a small library, reprint collection, or office file.
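
    The permutation idea behind a KWIC index, each title listed once per significant keyword with the title rotated so that the keyword leads the entry, can be sketched compactly in modern terms; the stop-word list and layout below are placeholders, not those of the original program:

        # Minimal keyword-in-context (KWIC) index generator.
        STOP = {"a", "an", "the", "of", "for", "and", "to", "in", "on"}

        def kwic(titles):
            entries = []
            for ref, title in enumerate(titles, start=1):
                words = title.split()
                for i, w in enumerate(words):
                    if w.lower() in STOP:
                        continue
                    rotated = " ".join(words[i:] + ["/"] + words[:i])   # rotate title around the keyword
                    entries.append((w.lower(), rotated, ref))
            for _, rotated, ref in sorted(entries):
                print(f"{rotated:<60s} [{ref}]")

        kwic([
            "Geothermal energy files in computer storage",
            "A computer program for creating keyword indexes",
        ])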

  20. Verification of data files of TREF-computer program; TREF-ohjelmiston ohjaustiedostojen soveltuvuustutkimus

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)

    1996-12-01

    Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between the experimental and theoretical work made meaningful verifications impossible in some cases. Therefore the verification calculations were focused on the catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. The verification calculations concentrated mainly on physical phenomena such as vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by assigning the cracking pseudo-reaction to the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  1. Shaping ability of the conventional nickel-titanium and reciprocating nickel-titanium file systems: a comparative study using micro-computed tomography.

    Science.gov (United States)

    Hwang, Young-Hye; Bae, Kwang-Shik; Baek, Seung-Ho; Kum, Kee-Yeon; Lee, WooCheol; Shon, Won-Jun; Chang, Seok Woo

    2014-08-01

    This study used micro-computed tomographic imaging to compare the shaping ability of Mtwo (VDW, Munich, Germany), a conventional nickel-titanium file system, and Reciproc (VDW), a reciprocating file system morphologically similar to Mtwo. Root canal shaping was performed on the mesiobuccal and distobuccal canals of extracted maxillary molars. In the RR group (n = 15), Reciproc was used in a reciprocating motion (150° counterclockwise/30° clockwise, 300 rpm); in the MR group, Mtwo was used in a reciprocating motion (150° clockwise/30° counterclockwise, 300 rpm); and in the MC group, Mtwo was used in a continuous rotating motion (300 rpm). Micro-computed tomographic images taken before and after canal shaping were used to analyze canal volume change and the degree of transportation at the cervical, middle, and apical levels. The time required for canal shaping was recorded. Afterward, each file was analyzed using scanning electron microscopy. No statistically significant differences were found among the 3 groups in the time for canal shaping or canal volume change (P > .05). Transportation values of the RR and MR groups were not significantly different at any level. However, the transportation value of the MC group was significantly higher than both the RR and MR groups at the cervical and apical levels (P < .05). File deformation was observed for 1 file in group RR (1/15), 3 files in group MR (3/15), and 5 files in group MC (5/15). In terms of shaping ability, Mtwo used in a reciprocating motion was not significantly different from the Reciproc system. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  2. PCF File Format.

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-08-01

    PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. It can contain multiple spectra and information about each spectrum such as energy calibration. This document outlines the format of the file that would allow one to write a computer program to parse and write such files.

  3. Image selection as a service for cloud computing environments

    KAUST Repository

    Filepp, Robert

    2010-12-01

    Customers of Cloud Services are expected to choose specific machine images to instantiate in order to host their workloads. Unfortunately very little information is provided to the users to enable them to make intelligent choices. We believe that as the number of images proliferates it will become increasingly difficult for users to decide effectively. Cloud service providers often allow their customers to instantiate standard system images, to modify their instances, and to store images of these customized instances for public or private future use. Storing modified instances as images enables customers to avoid re-provisioning and re-configuration of required resources thereby reducing their future costs. However Cloud service providers generally do not expose details regarding the configurations of the images in a rigorous canonical fashion nor offer services that assist clients in the best target image selection to support client transformation objectives. Rather, they allow customers to enter a free-form description of an image based on the client's best effort. This means in order to find a "best fit" image to instantiate, a human user must review potentially thousands of image descriptions, reading each description to evaluate its suitability as a platform to host their source application. Furthermore, the actual content of the selected image may differ greatly from its description. Finally, even images that have been customized and retained for future use may need additional provisioning and customization to accommodate specific needs. In this paper we propose a service that accumulates image configuration details in a canonical fashion and a further service that employs an algorithm to order images per best fit/least cost in conformance to user-specified policies. These services collectively facilitate workload transformation into enterprise cloud environments.
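
    The "best fit / least cost" ordering service mentioned above can be illustrated with a toy scoring function; the catalogue fields, weights and policy below are invented for the sketch and are not part of the proposed service:

        # Rank candidate machine images by fit to a requested configuration and by cost.
        catalogue = [
            {"name": "img-web-small", "os": "linux",   "ram_gb": 4,  "packages": {"nginx"},          "cost": 0.05},
            {"name": "img-web-big",   "os": "linux",   "ram_gb": 16, "packages": {"nginx", "redis"}, "cost": 0.20},
            {"name": "img-win-db",    "os": "windows", "ram_gb": 16, "packages": {"mssql"},          "cost": 0.40},
        ]

        def score(image, want, cost_weight=1.0):
            """Lower is better: penalties for OS mismatch, missing RAM or packages, plus weighted cost."""
            s = 0.0 if image["os"] == want["os"] else 10.0
            s += max(0, want["ram_gb"] - image["ram_gb"])           # RAM still missing
            s += 2.0 * len(want["packages"] - image["packages"])    # packages still to install
            s += cost_weight * image["cost"]
            return s

        want = {"os": "linux", "ram_gb": 8, "packages": {"nginx", "redis"}}
        for img in sorted(catalogue, key=lambda im: score(im, want)):
            print(img["name"], round(score(img, want), 2))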

  4. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-file Systems.

    Science.gov (United States)

    Prabhakar, Attiguppe R; Yavagal, Chandrashekar; Dixit, Kratika; Naik, Saraswathi V

    2016-01-01

    Primary root canals are considered to be among the most challenging due to their complex anatomy. "Wave One" and "One Shape" are single-file systems with reciprocating and rotary motion, respectively. The aim of this study was to evaluate and compare dentin thickness, centering ability, canal transportation, and instrumentation time of Wave One and One Shape files in primary root canals using cone beam computed tomographic (CBCT) analysis. This is an experimental, in vitro study comparing the two groups. A total of 24 extracted human primary teeth with minimum 7 mm root length were included in the study. Cone beam computed tomographic images were taken before and after the instrumentation for each group. Dentin thickness, centering ability, canal transportation, and instrumentation time were evaluated for each group. A significant difference was found in instrumentation time and canal transportation measures between the two groups. Wave One showed less canal transportation than One Shape, and the mean instrumentation time of Wave One was significantly less than that of One Shape. The reciprocating single-file system was found to be faster, with fewer procedural errors, and can hence be recommended for shaping the root canals of primary teeth. How to cite this article: Prabhakar AR, Yavagal C, Dixit K, Naik SV. Reciprocating vs Rotary Instrumentation in Pediatric Endodontics: Cone Beam Computed Tomographic Analysis of Deciduous Root Canals using Two Single-File Systems. Int J Clin Pediatr Dent 2016;9(1):45-49.

  5. In-vitro Assessing the Shaping Ability of Three Nickel-Titanium Rotary Single File Systems by Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Ali Imad Al-Asadi

    2018-02-01

    Full Text Available The aim of the study was to evaluate the canal transportation and centering ability of three nickel-titanium single-file rotary systems by cone beam computed tomography (CBCT). Materials and methods: Thirty permanent maxillary first molars with mesiobuccal canal curvatures ranging from 20 to 30 degrees were selected and assigned into three groups (n=10), according to the biomechanical preparation system used: Hyflex EDM (HF), Reciproc Blue (RB) and OneShape (OS). The samples were scanned by CBCT after being mounted on a customized acrylic base and then rescanned after the instrumentation. Slices from the axial section were taken from both exposures at 3 mm, 6 mm and 9 mm from the root apex, corresponding to the apical, middle, and coronal thirds respectively. Data were statistically analyzed using Kruskal-Wallis and Mann-Whitney U tests at the 5% significance level. Results: The results showed that there were no significant differences at the apical and coronal thirds and a significant difference at the middle third regarding canal transportation. However, there was a significant difference at the apical third and no significant difference at the middle and coronal thirds regarding centering ratio. Conclusion: All three single-file rotary systems produced some degree of canal transportation and deviation from centering, with Hyflex EDM showing the least.

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08: During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  7. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    Science.gov (United States)

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that hoped to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT and by extension cloud computing has positive impacts on daily life and this informed the Nigerian government's policy to…

  8. Computational Linguistics in the Netherlands 2004 : Selected papers from the fifteenth CLIN meeting

    NARCIS (Netherlands)

    Wouden, Ton van der; Poß, Michaela; Reckman, Hilke; Cremers, Crit

    2005-01-01

    This volume contains a selection of the papers presented at the fifteenth installment of Computational Linguistics in the Netherlands, held at Leiden University on Friday, December 17th, 2004. Organized by the computational linguists of what was at that time called the Leiden Centre for Linguistics

  9. A User Assessment of Workspaces in Selected Music Education Computer Laboratories.

    Science.gov (United States)

    Badolato, Michael Jeremy

    A study of 120 students selected from the user populations of four music education computer laboratories was conducted to determine the applicability of current ergonomic and environmental design guidelines in satisfying the needs of users of educational computing workspaces. Eleven categories of workspace factors were organized into a…

  10. Thread selection according to predefined power characteristics during context switching on compute nodes

    Science.gov (United States)

    None, None

    2013-06-04

    Methods, apparatus, and products are disclosed for thread selection during context switching on a plurality of compute nodes that includes: executing, by a compute node, an application using a plurality of threads of execution, including executing one or more of the threads of execution; selecting, by the compute node from a plurality of available threads of execution for the application, a next thread of execution in dependence upon power characteristics for each of the available threads; determining, by the compute node, whether criteria for a thread context switch are satisfied; and performing, by the compute node, the thread context switch if the criteria for a thread context switch are satisfied, including executing the next thread of execution.
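
    As a rough illustration of the selection step described above, the sketch below (hypothetical, not the disclosed apparatus; names such as power_watts and power_budget_watts are invented for illustration) picks the next thread as the available one whose predefined power characteristic best fits a power budget, and switches context only when the switch criteria are satisfied.

        from dataclasses import dataclass, field

        @dataclass(order=True)
        class Thread:
            power_watts: float                 # predefined power characteristic of the thread
            name: str = field(compare=False)

        def select_next_thread(available, power_budget_watts):
            """Choose the available thread whose predefined power draw fits the budget;
            fall back to the lowest-power thread if none fits (illustrative policy)."""
            fitting = [t for t in available if t.power_watts <= power_budget_watts]
            pool = fitting if fitting else available
            return min(pool, key=lambda t: t.power_watts)

        def context_switch_if_needed(current, available, quantum_expired, power_budget_watts):
            # Criterion for a thread context switch (here: the time quantum expired).
            if not quantum_expired or not available:
                return current
            return select_next_thread(available, power_budget_watts)

        threads = [Thread(4.0, "fft"), Thread(1.5, "io-poll"), Thread(2.5, "reduce")]
        nxt = context_switch_if_needed(Thread(3.0, "halo-exchange"), threads,
                                       quantum_expired=True, power_budget_watts=2.0)
        print(nxt.name)   # -> io-poll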

  11. Selective population rate coding: a possible computational role of gamma oscillations in selective attention.

    Science.gov (United States)

    Masuda, Naoki

    2009-12-01

    Selective attention is often accompanied by gamma oscillations in local field potentials and spike field coherence in brain areas related to visual, motor, and cognitive information processing. Gamma oscillations are implicated to play an important role in, for example, visual tasks including object search, shape perception, and speed detection. However, the mechanism by which gamma oscillations enhance cognitive and behavioral performance of attentive subjects is still elusive. Using feedforward fan-in networks composed of spiking neurons, we examine a possible role for gamma oscillations in selective attention and population rate coding of external stimuli. We implement the concept proposed by Fries ( 2005 ) that under dynamic stimuli, neural populations effectively communicate with each other only when there is a good phase relationship among associated gamma oscillations. We show that the downstream neural population selects a specific dynamic stimulus received by an upstream population and represents it by population rate coding. The encoded stimulus is the one for which gamma rhythm in the corresponding upstream population is resonant with the downstream gamma rhythm. The proposed role for gamma oscillations in stimulus selection is to enable top-down control, a neural version of time division multiple access used in communication engineering.

  12. Evaluation of Single File Systems Reciproc, Oneshape, and WaveOne using Cone Beam Computed Tomography –An In Vitro Study

    Science.gov (United States)

    Dhingra, Annil; Miglani, Anjali

    2015-01-01

    Background: Successful endodontic therapy depends on many factors; one of the most important steps in any root canal treatment is root canal preparation. In addition, respecting the original shape of the canal is of equal importance; otherwise, canal aberrations such as transportation will be created. Aim: The purpose of this study was to compare and evaluate the reciprocating WaveOne and Reciproc and the rotary OneShape single-file instrumentation systems with respect to cervical dentin thickness, cross-sectional area and canal transportation in first mandibular molars using cone beam computed tomography. Materials and Methods: Sixty mandibular first molars extracted for periodontal reasons were collected from the Department of Oral and Maxillofacial Surgery. Teeth were prepared using one rotary and two reciprocating single-file systems. Teeth were divided into 3 groups of 20 teeth each. Pre-instrumentation and post-instrumentation scans were done and evaluated for three parameters: canal transportation, cervical dentinal thickness, and cross-sectional area. Results were analysed statistically using ANOVA and post-hoc Tukey analysis. Results: The change in cross-sectional area after filing showed a significant difference at 0 mm, 1 mm, 2 mm and 7 mm (p < 0.05). When evaluated for each file system over a distance of 7 mm (starting from 0 mm and then evaluation at 1 mm, 2 mm, 3 mm, 5 mm and 7 mm), the results showed a significant difference among the file systems at various lengths (p = 0.014, 0.046, 0.004, 0.028, 0.005 & 0.029 respectively). The mean value of cervical dentin removal was maximum at all levels for OneShape and minimum for WaveOne, showing the better quality of WaveOne and Reciproc over the OneShape file system. A significant difference was found at 9 mm, 11 mm and 12 mm between all three file systems (p<0.001, <0.001, <0.001). Conclusion: It was concluded that reciprocating motion is better than rotary motion in all three parameters: canal transportation, cross-sectional area, and cervical dentinal thickness. PMID:26023639

  13. Availability and Overlap of Quality Computer Science Journal Holdings in Selected University Libraries in Malaysia

    OpenAIRE

    Zainab, A.N.; Ng, S.L.

    2003-01-01

    The study reveals the availability status of quality journals in the field of computer science held in the libraries of the University of Malaya, (UM), University of Science Malaysia (USM), University of Technology Malaysia (UTM), National University of Malaysia (UKM) and University Putra Malaysia (UPM). These universities are selected since they offer degree programmes in computer science. The study also investigates the degree of overlaps and unique titles in the five libraries. The Univers...

  14. User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

    CERN Document Server

    Wiley, R A

    1977-01-01

    User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

  15. Statistical test data selection for reliability evaluation of process computer software

    International Nuclear Information System (INIS)

    Volkmann, K.P.; Hoermann, H.; Ehrenberger, W.

    1976-01-01

    The paper presents a concept for converting knowledge about the characteristics of process states into practicable procedures for the statistical selection of test cases in testing process computer software. Process states are defined as vectors whose components consist of values of input variables lying in discrete positions or within given limits. Two approaches for test data selection, based on knowledge about cases of demand, are outlined referring to a purely probabilistic method and to the mathematics of stratified sampling. (orig.) [de]
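
    For context, stratified sampling of test cases across strata of process states is usually allocated either proportionally to stratum size or by Neyman allocation (standard formulas shown for orientation; they are not quoted from the paper):

        \[
        n_h^{\mathrm{prop}} = n\,\frac{N_h}{N},
        \qquad
        n_h^{\mathrm{Neyman}} = n\,\frac{N_h \sigma_h}{\sum_k N_k \sigma_k},
        \]

    where n is the total number of test cases, N_h the size of stratum h, and σ_h its within-stratum standard deviation.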

  16. A Multidisciplinary Research Team Approach to Computer-Aided Drafting (CAD) System Selection. Final Report.

    Science.gov (United States)

    Franken, Ken; And Others

    A multidisciplinary research team was assembled to review existing computer-aided drafting (CAD) systems for the purpose of enabling staff in the Design Drafting Department at Linn Technical College (Missouri) to select the best system out of the many CAD systems in existence. During the initial stage of the evaluation project, researchers…

  17. Human vs. Computer Diagnosis of Students' Natural Selection Knowledge: Testing the Efficacy of Text Analytic Software

    Science.gov (United States)

    Nehm, Ross H.; Haertig, Hendrik

    2012-01-01

    Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with…

  18. Selecting, Evaluating and Creating Policies for Computer-Based Resources in the Behavioral Sciences and Education.

    Science.gov (United States)

    Richardson, Linda B., Comp.; And Others

    This collection includes four handouts: (1) "Selection Critria Considerations for Computer-Based Resources" (Linda B. Richardson); (2) "Software Collection Policies in Academic Libraries" (a 24-item bibliography, Jane W. Johnson); (3) "Circulation and Security of Software" (a 19-item bibliography, Sara Elizabeth Williams); and (4) "Bibliography of…

  19. Selection Finder (SelFi): A computational metabolic engineering tool to enable directed evolution of enzymes

    Directory of Open Access Journals (Sweden)

    Neda Hassanpour

    2017-06-01

    Full Text Available Directed evolution of enzymes consists of an iterative process of creating mutant libraries and choosing desired phenotypes through screening or selection until the enzymatic activity reaches a desired goal. The biggest challenge in directed enzyme evolution is identifying high-throughput screens or selections to isolate the variant(s) with the desired property. We present in this paper a computational metabolic engineering framework, Selection Finder (SelFi), to construct a selection pathway from a desired enzymatic product to a cellular host and to couple the pathway with cell survival. We applied SelFi to construct selection pathways for four enzymes and their desired enzymatic products xylitol, D-ribulose-1,5-bisphosphate, methanol, and aniline. Two of the selection pathways identified by SelFi were previously experimentally validated for engineering Xylose Reductase and RuBisCO. Importantly, SelFi advances directed evolution of enzymes as there are currently no known generalized strategies or computational techniques for identifying high-throughput selections for engineering enzymes.

  20. Evaluating biomechanics of user-selected sitting and standing computer workstation.

    Science.gov (United States)

    Lin, Michael Y; Barbir, Ana; Dennerlein, Jack T

    2017-11-01

    A standing computer workstation has now become a popular modern workplace intervention to reduce sedentary behavior at work. However, users' interaction with a standing computer workstation and its differences from a sitting workstation need to be understood to assist in developing recommendations for use and setup. The study compared the differences in upper extremity posture and muscle activity between user-selected sitting and standing workstation setups. Twenty participants (10 females, 10 males) volunteered for the study. 3-D posture, surface electromyography, and user-reported discomfort were measured while completing simulated tasks with each participant's self-selected workstation setups. The sitting computer workstation was associated with more non-neutral shoulder postures and greater shoulder muscle activity, while the standing computer workstation induced a greater wrist adduction angle and greater extensor carpi radialis muscle activity. The sitting computer workstation was also associated with greater shoulder abduction postural variation (90th-10th percentile), while the standing computer workstation was associated with greater variation for shoulder rotation and wrist extension. Users reported similar overall discomfort levels within the first 10 min of work but had more than twice as much discomfort while standing than sitting after 45 min, with most discomfort reported in the low back for standing and the shoulder for sitting. These measures provide understanding of users' different interactions with sitting and standing workstations, and alternating between the two configurations in short bouts may be a way of changing the loading pattern on the upper extremity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Max-AUC feature selection in computer-aided detection of polyps in CT colonography.

    Science.gov (United States)

    Xu, Jian-Wu; Suzuki, Kenji

    2014-03-01

    We propose a feature selection method based on a sequential forward floating selection (SFFS) procedure to improve the performance of a classifier in computerized detection of polyps in CT colonography (CTC). The feature selection method is coupled with a nonlinear support vector machine (SVM) classifier. Unlike the conventional linear method based on Wilks' lambda, the proposed method selected the most relevant features that would maximize the area under the receiver operating characteristic curve (AUC), which directly maximizes classification performance, evaluated based on AUC value, in the computer-aided detection (CADe) scheme. We presented two variants of the proposed method with different stopping criteria used in the SFFS procedure. The first variant searched all feature combinations allowed in the SFFS procedure and selected the subsets that maximize the AUC values. The second variant performed a statistical test at each step during the SFFS procedure, and it was terminated if the increase in the AUC value was not statistically significant. The advantage of the second variant is its lower computational cost. To test the performance of the proposed method, we compared it against the popular stepwise feature selection method based on Wilks' lambda for a colonic-polyp database (25 polyps and 2624 nonpolyps). We extracted 75 morphologic, gray-level-based, and texture features from the segmented lesion candidate regions. The two variants of the proposed feature selection method chose 29 and 7 features, respectively. Two SVM classifiers trained with these selected features yielded a 96% by-polyp sensitivity at false-positive (FP) rates of 4.1 and 6.5 per patient, respectively. Experiments showed a significant improvement in the performance of the classifier with the proposed feature selection method over that with the popular stepwise feature selection based on Wilks' lambda that yielded 18.0 FPs per patient at the same sensitivity level.
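
    A minimal sketch of the first variant described above (greedy forward selection of the feature subset that maximizes cross-validated AUC for an SVM classifier), assuming scikit-learn is available and omitting the floating/backtracking step and the statistical stopping test:

        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        def forward_select_max_auc(X, y, max_features=10, cv=5):
            """Greedily add the feature that most improves mean cross-validated AUC."""
            remaining = list(range(X.shape[1]))
            selected, best_auc = [], 0.0
            while remaining and len(selected) < max_features:
                scores = []
                for f in remaining:
                    cols = selected + [f]
                    clf = SVC(kernel="rbf", gamma="scale")
                    auc = cross_val_score(clf, X[:, cols], y, cv=cv, scoring="roc_auc").mean()
                    scores.append((auc, f))
                auc, f = max(scores)
                if auc <= best_auc:        # no improvement: stop
                    break
                best_auc, selected = auc, selected + [f]
                remaining.remove(f)
            return selected, best_auc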

  2. Computational design of selective peptides to discriminate between similar PDZ domains in an oncogenic pathway.

    Science.gov (United States)

    Zheng, Fan; Jewell, Heather; Fitzpatrick, Jeremy; Zhang, Jian; Mierke, Dale F; Grigoryan, Gevorg

    2015-01-30

    Reagents that target protein-protein interactions to rewire signaling are of great relevance in biological research. Computational protein design may offer a means of creating such reagents on demand, but methods for encoding targeting selectivity are sorely needed. This is especially challenging when targeting interactions with ubiquitous recognition modules--for example, PDZ domains, which bind C-terminal sequences of partner proteins. Here we consider the problem of designing selective PDZ inhibitor peptides in the context of an oncogenic signaling pathway, in which two PDZ domains (NHERF-2 PDZ2-N2P2 and MAGI-3 PDZ6-M3P6) compete for a receptor C-terminus to differentially modulate oncogenic activities. Because N2P2 has been shown to increase tumorigenicity and M3P6 to decrease it, we sought to design peptides that inhibit N2P2 without affecting M3P6. We developed a structure-based computational design framework that models peptide flexibility in binding yet is efficient enough to rapidly analyze tradeoffs between affinity and selectivity. Designed peptides showed low-micromolar inhibition constants for N2P2 and no detectable M3P6 binding. Peptides designed for reverse discrimination bound M3P6 tighter than N2P2, further testing our technology. Experimental and computational analysis of selectivity determinants revealed significant indirect energetic coupling in the binding site. Successful discrimination between N2P2 and M3P6, despite their overlapping binding preferences, is highly encouraging for computational approaches to selective PDZ targeting, especially because design relied on a homology model of M3P6. Still, we demonstrate specific deficiencies of structural modeling that must be addressed to enable truly robust design. The presented framework is general and can be applied in many scenarios to engineer selective targeting. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study

    Science.gov (United States)

    Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-01-01

    Background: The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. Aim: The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Materials and Methods: Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I) – full-sequence rotary control group, OneShape OS (group II) – single file continuous rotation, WaveOne WO – single file reciprocal motion (group III). Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey’s honestly significant difference test. Results: It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm as well as 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. Conclusion: It was concluded that there was minor difference between the tested groups. Single file systems demonstrated average canal

  4. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study.

    Science.gov (United States)

    Agarwal, Rolly S; Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-05-01

    The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement of full-sequence rotary file systems. The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I) - full-sequence rotary control group, OneShape OS (group II) - single file continuous rotation, WaveOne WO - single file reciprocal motion (group III). Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm as well as 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. It was concluded that there was minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to full sequence
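
    For reference, CBCT studies of this kind commonly quantify transportation and centering from the shortest dentin distances measured mesially (a1 before, a2 after instrumentation) and distally (b1, b2) at each level; a standard form of the definitions (an assumption here, since the abstract does not spell out the formulas) is:

        \[
        \text{transportation} = \bigl| (a_1 - a_2) - (b_1 - b_2) \bigr|,
        \qquad
        \text{centering ratio} = \frac{\min(a_1 - a_2,\; b_1 - b_2)}{\max(a_1 - a_2,\; b_1 - b_2)},
        \]

    where a centering ratio of 1 indicates a perfectly centered preparation and values near 0 indicate a preparation shifted toward one wall.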

  5. Goal selection versus process control in a brain-computer interface based on sensorimotor rhythms.

    Science.gov (United States)

    Royer, Audrey S; He, Bin

    2009-02-01

    In a brain-computer interface (BCI) utilizing a process control strategy, the signal from the cortex is used to control the fine motor details normally handled by other parts of the brain. In a BCI utilizing a goal selection strategy, the signal from the cortex is used to determine the overall end goal of the user, and the BCI controls the fine motor details. A BCI based on goal selection may be an easier and more natural system than one based on process control. Although goal selection in theory may surpass process control, the two have never been directly compared, as we are reporting here. Eight young healthy human subjects participated in the present study, three trained and five naïve in BCI usage. Scalp-recorded electroencephalograms (EEG) were used to control a computer cursor during five different paradigms. The paradigms were similar in their underlying signal processing and used the same control signal. However, three were based on goal selection, and two on process control. For both the trained and naïve populations, goal selection had more hits per run, was faster, more accurate (for seven out of eight subjects) and had a higher information transfer rate than process control. Goal selection outperformed process control in every measure studied in the present investigation.

  6. A Wearable Channel Selection-Based Brain-Computer Interface for Motor Imagery Detection.

    Science.gov (United States)

    Lo, Chi-Chun; Chien, Tsung-Yi; Chen, Yu-Chun; Tsai, Shang-Ho; Fang, Wai-Chi; Lin, Bor-Shyh

    2016-02-06

    Motor imagery-based brain-computer interface (BCI) is a communication interface between an external machine and the brain. Many kinds of spatial filters are used in BCIs to enhance the electroencephalography (EEG) features related to motor imagery. The approach of channel selection, developed to reserve meaningful EEG channels, is also an important technique for the development of BCIs. However, current BCI systems require a conventional EEG machine and EEG electrodes with conductive gel to acquire multi-channel EEG signals and then transmit these EEG signals to the back-end computer to perform the approach of channel selection. This reduces the convenience of use in daily life and increases the limitations of BCI applications. In order to improve the above issues, a novel wearable channel selection-based brain-computer interface is proposed. Here, retractable comb-shaped active dry electrodes are designed to measure the EEG signals on a hairy site, without conductive gel. By the design of analog CAR spatial filters and the firmware of EEG acquisition module, the function of spatial filters could be performed without any calculation, and channel selection could be performed in the front-end device to improve the practicability of detecting motor imagery in the wearable EEG device directly or in commercial mobile phones or tablets, which may have relatively low system specifications. Finally, the performance of the proposed BCI is investigated, and the experimental results show that the proposed system is a good wearable BCI system prototype.
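
    The analog CAR (common average reference) spatial filtering mentioned above amounts, per channel, to subtracting the instantaneous mean of all measured channels (standard definition, shown here for context):

        \[
        x_i^{\mathrm{CAR}}(t) = x_i(t) - \frac{1}{N}\sum_{j=1}^{N} x_j(t),
        \]

    which is why the referencing can be realized with analog summing circuitry ahead of digitization rather than being computed in software.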

  7. Grid collector: An event catalog with automated file management

    International Nuclear Information System (INIS)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides ''direct'' access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select events based on tags, such as, ''production date between March 10 and 20, and the number of charged tracks > 100.'' The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users
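
    The workflow the abstract describes - select events by tag predicates, locate the files that contain them, and stage only those files - can be pictured with a small illustrative sketch (this is not the actual Grid Collector API; the index layout and function names are invented):

        from datetime import date

        # Hypothetical tag index: one row per event, recording the file that holds it.
        EVENT_INDEX = [
            {"event": 1, "file": "/mss/run42/e1.root", "prod_date": date(2003, 3, 12), "n_charged": 140},
            {"event": 2, "file": "/mss/run42/e2.root", "prod_date": date(2003, 3, 25), "n_charged": 80},
            {"event": 3, "file": "/mss/run43/e3.root", "prod_date": date(2003, 3, 15), "n_charged": 210},
        ]

        def select_events(index, start, end, min_tracks):
            """'production date between start and end, and number of charged tracks > min_tracks'"""
            return [row for row in index
                    if start <= row["prod_date"] <= end and row["n_charged"] > min_tracks]

        def files_to_stage(selection):
            """Only these files need to be copied from mass storage before analysis."""
            return sorted({row["file"] for row in selection})

        hits = select_events(EVENT_INDEX, date(2003, 3, 10), date(2003, 3, 20), 100)
        print(files_to_stage(hits))   # ['/mss/run42/e1.root', '/mss/run43/e3.root']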

  8. Grid collector: An event catalog with automated file management

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-10-17

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides ''direct'' access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select events based on tags, such as, ''production date between March 10 and 20, and the number of charged tracks > 100.'' The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users.

  9. The selection of embedded computer using in the nuclear physics instruments

    International Nuclear Information System (INIS)

    Zhang Jianchuan; Nan Gangyang; Wang Yanyu; Su Hong

    2010-01-01

    It introduces the requirements for an embedded PC and the benefits of using one in the project for developing and improving experimental nuclear physics instruments. According to the specific requirements of the project for improving laboratory instruments, several kinds of embedded computers are compared and specifically tested. Thus, an x86-architecture embedded computer, which has ultra-low power consumption and a small size, is selected to be the main component of the controller used in the nuclear physics instruments, and it will be used in the high-speed data acquisition and electronic control system. (authors)

  10. Advanced display object selection methods for enhancing user-computer productivity

    Science.gov (United States)

    Osga, Glenn A.

    1993-01-01

    The User-Interface Technology Branch at NCCOSC RDT&E Division has been conducting a series of studies to address the suitability of commercial off-the-shelf (COTS) graphic user-interface (GUI) methods for efficiency and performance in critical naval combat systems. This paper presents an advanced selection algorithm and method developed to increase user performance when making selections on tactical displays. The method has also been applied with considerable success to a variety of cursor and pointing tasks. Typical GUIs allow user selection by: (1) moving a cursor with a pointing device such as a mouse, trackball, joystick, touchscreen; and (2) placing the cursor on the object. Examples of GUI objects are the buttons, icons, folders, scroll bars, etc. used in many personal computer and workstation applications. This paper presents an improved method of selection and the theoretical basis for the significant performance gains achieved with various input devices tested. The method is applicable to all GUI styles and display sizes, and is particularly useful for selections on small screens such as notebook computers. Considering the amount of work-hours spent pointing and clicking across all styles of available graphic user-interfaces, the cost/benefit in applying this method to graphic user-interfaces is substantial, with the potential for increasing productivity across thousands of users and applications.

  11. Computer simulation of the optical properties of high-temperature cermet solar selective coatings

    Energy Technology Data Exchange (ETDEWEB)

    Nejati, M. Reza [K.N. Toosi Univ. of Technology, Dept. of Mechanical Engineering, Tehran (Iran); Fathollahi, V.; Asadi, M. Khalaji [AEOI, Center for Renewable Energy Research and Applications (CRERA), Tehran (Iran)

    2005-02-01

    A computer simulation is developed to calculate the solar absorptance and thermal emittance of various configurations of cermet solar selective coatings. Special attention has been paid to those material combinations, which are commonly used in high-temperature solar thermal applications. Moreover, other material combinations such as two-, three- and four-cermet-layer structures as solar selective coatings have been theoretically analyzed by computer simulation using three distinct physical models of Ping Sheng, Maxwell-Garnett and Bruggeman. The novel case of two-cermet-layer structure with different cermet components has also been investigated. The results were optimized by allowing the program to manipulate the metal volume fraction and thickness of each layer and the results compared to choose the best possible configuration. The calculated results are within the range of 0.91-0.97 for solar absorptance and 0.02-0.07 for thermal emittance at room temperature. (Author)
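
    For orientation, the Maxwell-Garnett and Bruggeman models named above estimate the effective permittivity of a cermet layer from the metal volume fraction f and the permittivities of the metal (ε_m) and the dielectric host (ε_d); the standard forms are given below (not necessarily the exact implementation used in the simulation):

        \[
        \frac{\varepsilon_{\mathrm{eff}} - \varepsilon_d}{\varepsilon_{\mathrm{eff}} + 2\varepsilon_d}
        = f\,\frac{\varepsilon_m - \varepsilon_d}{\varepsilon_m + 2\varepsilon_d}
        \quad\text{(Maxwell-Garnett)},
        \qquad
        f\,\frac{\varepsilon_m - \varepsilon_{\mathrm{eff}}}{\varepsilon_m + 2\varepsilon_{\mathrm{eff}}}
        + (1 - f)\,\frac{\varepsilon_d - \varepsilon_{\mathrm{eff}}}{\varepsilon_d + 2\varepsilon_{\mathrm{eff}}} = 0
        \quad\text{(Bruggeman)},
        \]

    from which the layer's spectral reflectance, and hence solar absorptance and thermal emittance, can be computed.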

  12. IV international conference on computational methods in marine engineering : selected papers

    CERN Document Server

    Oñate, Eugenio; García-Espinosa, Julio; Kvamsdal, Trond; Bergan, Pål; MARINE 2011

    2013-01-01

    This book contains selected papers from the Fourth International Conference on Computational Methods in Marine Engineering, held at Instituto Superior Técnico, Technical University of Lisbon, Portugal in September 2011.  Nowadays, computational methods are an essential tool of engineering, which includes a major field of interest in marine applications, such as the maritime and offshore industries and engineering challenges related to the marine environment and renewable energies. The 2011 Conference included 8 invited plenary lectures and 86 presentations distributed through 10 thematic sessions that covered many of the most relevant topics of marine engineering today. This book contains 16 selected papers from the Conference that cover “CFD for Offshore Applications”, “Fluid-Structure Interaction”, “Isogeometric Methods for Marine Engineering”, “Marine/Offshore Renewable Energy”, “Maneuvering and Seakeeping”, “Propulsion and Cavitation” and “Ship Hydrodynamics”.  The papers we...

  13. On the Computation of the Efficient Frontier of the Portfolio Selection Problem

    Directory of Open Access Journals (Sweden)

    Clara Calvo

    2012-01-01

    Full Text Available An easy-to-use procedure is presented for improving the ε-constraint method for computing the efficient frontier of the portfolio selection problem endowed with additional cardinality and semicontinuous variable constraints. The proposed method provides not only a numerical plotting of the frontier but also an analytical description of it, including the explicit equations of the arcs of parabola it comprises and the change points between them. This information is useful for performing a sensitivity analysis as well as for providing additional criteria to the investor in order to select an efficient portfolio. Computational results are provided to test the efficiency of the algorithm and to illustrate its applications. The procedure has been implemented in Mathematica.
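
    A compact statement of the ε-constraint formulation underlying the procedure, in standard mean-variance notation with the cardinality and semicontinuous-variable constraints written in their usual form (the paper's exact notation may differ):

        \[
        \begin{aligned}
        \min_{x,\,z}\quad & x^{\top}\Sigma x\\
        \text{s.t.}\quad  & \mu^{\top}x \ge \varepsilon, \qquad \mathbf{1}^{\top}x = 1,\\
                          & \ell_i z_i \le x_i \le u_i z_i, \qquad z_i \in \{0,1\}, \qquad \textstyle\sum_i z_i \le K,
        \end{aligned}
        \]

    where Σ is the covariance matrix, μ the vector of expected returns, K the cardinality limit, and sweeping ε over the attainable expected returns traces out the efficient frontier arc by arc.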

  14. Goal selection versus process control while learning to use a brain-computer interface

    Science.gov (United States)

    Royer, Audrey S.; Rose, Minn L.; He, Bin

    2011-06-01

    A brain-computer interface (BCI) can be used to accomplish a task without requiring motor output. Two major control strategies used by BCIs during task completion are process control and goal selection. In process control, the user exerts continuous control and independently executes the given task. In goal selection, the user communicates their goal to the BCI and then receives assistance executing the task. A previous study has shown that goal selection is more accurate and faster in use. An unanswered question is, which control strategy is easier to learn? This study directly compares goal selection and process control while learning to use a sensorimotor rhythm-based BCI. Twenty young healthy human subjects were randomly assigned either to a goal selection or a process control-based paradigm for eight sessions. At the end of the study, the best user from each paradigm completed two additional sessions using all paradigms randomly mixed. The results of this study were that goal selection required a shorter training period for increased speed, accuracy, and information transfer over process control. These results held for the best subjects as well as in the general subject population. The demonstrated characteristics of goal selection make it a promising option to increase the utility of BCIs intended for both disabled and able-bodied users.

  15. Effects of Force Field Selection on the Computational Ranking of MOFs for CO2 Separations.

    Science.gov (United States)

    Dokur, Derya; Keskin, Seda

    2018-02-14

    Metal-organic frameworks (MOFs) have been considered as highly promising materials for adsorption-based CO2 separations. The number of synthesized MOFs has been increasing very rapidly. High-throughput molecular simulations are very useful to screen large numbers of MOFs in order to identify the most promising adsorbents prior to extensive experimental studies. Results of molecular simulations depend on the force field used to define the interactions between gas molecules and MOFs. Choosing the appropriate force field for MOFs is essential to make reliable predictions about the materials' performance. In this work, we performed two sets of molecular simulations using the two widely used generic force fields, Dreiding and UFF, and obtained adsorption data of CO2/H2, CO2/N2, and CO2/CH4 mixtures in 100 different MOF structures. Using this adsorption data, several adsorbent evaluation metrics including selectivity, working capacity, sorbent selection parameter, and percent regenerability were computed for each MOF. MOFs were then ranked based on these evaluation metrics, and top performing materials were identified. We then examined the sensitivity of the MOF rankings to the force field type. Our results showed that although there are significant quantitative differences between some adsorbent evaluation metrics computed using different force fields, rankings of the top MOF adsorbents for CO2 separations are generally similar: 8, 8, and 9 out of the top 10 most selective MOFs were found to be identical in the ranking for CO2/H2, CO2/N2, and CO2/CH4 separations using Dreiding and UFF. We finally suggested a force field factor depending on the energy parameters of atoms present in the MOFs to quantify the robustness of the simulation results to the force field selection. This easily computable factor will be highly useful to determine whether the results are sensitive to the force field type or not prior to performing computationally demanding
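
    The adsorbent evaluation metrics named above are commonly defined as follows for a CO2/X mixture (standard definitions given for orientation; the exact expressions used in the study are an assumption):

        \[
        S_{\mathrm{CO_2/X}} = \frac{x_{\mathrm{CO_2}}/x_{\mathrm{X}}}{y_{\mathrm{CO_2}}/y_{\mathrm{X}}},
        \qquad
        \Delta N_{\mathrm{CO_2}} = N^{\mathrm{ads}}_{\mathrm{CO_2}} - N^{\mathrm{des}}_{\mathrm{CO_2}},
        \qquad
        S_{\mathrm{sp}} = \frac{S^{2}_{\mathrm{ads}}}{S_{\mathrm{des}}}\,
                          \frac{\Delta N_{\mathrm{CO_2}}}{\Delta N_{\mathrm{X}}},
        \qquad
        R\% = \frac{\Delta N_{\mathrm{CO_2}}}{N^{\mathrm{ads}}_{\mathrm{CO_2}}} \times 100,
        \]

    where x and y are the adsorbed- and bulk-phase mole fractions and N denotes the loadings at the adsorption and desorption conditions.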

  16. Evaluation of the Self-Adjusting File system (SAF) for the instrumentation of primary molar root canals: a micro-computed tomographic study.

    Science.gov (United States)

    Kaya, E; Elbay, M; Yiğit, D

    2017-06-01

    The Self-Adjusting File (SAF) system has been recommended for use in permanent teeth since it offers more conservative and effective root-canal preparation when compared to traditional rotary systems. However, no study had evaluated the usage of SAF in primary teeth. The aim of this study was to evaluate and compare the use of SAF, K file (manual instrumentation) and Profile (traditional rotary instrumentation) systems for primary-tooth root-canal preparation in terms of instrumentation time and amounts of dentin removed using micro-computed tomography (μCT) technology. Study Design: The study was conducted with 60 human primary mandibular second molar teeth divided into 3 groups according to instrumentation technique: Group I: SAF (n=20); Group II: K file (n=20); Group III: Profile (n=20). Teeth were embedded in acrylic blocks and scanned with a μCT scanner prior to instrumentation. All distal root canals were prepared up to size 30 for K file, .04/30 for Profile and 2 mm thickness, size 25 for SAF; instrumentation time was recorded for each tooth, and a second μCT scan was performed after instrumentation was complete. Amounts of dentin removed were measured using the three-dimensional images by calculating the difference in root-canal volume before and after preparation. Data was statistically analysed using the Kolmogorov-Smirnov and Kruskal-Wallis tests. Manual instrumentation (K file) resulted in significantly more dentin removal when compared to rotary instrumentation (Profile and SAF), while the SAF system generated significantly less dentin removal than both manual instrumentation (K file) and traditional rotary instrumentation (Profile) (p < 0.05) compared with the other systems. Within the experimental conditions of the present study, the SAF seems to be a useful system for root-canal instrumentation in primary molars because it removed less dentin than the other systems, which is especially important for the relatively thin-walled canals of primary teeth, and because it involves less

  17. Computation-aware algorithm selection approach for interlaced-to-progressive conversion

    Science.gov (United States)

    Park, Sang-Jun; Jeon, Gwanggil; Jeong, Jechang

    2010-05-01

    We discuss deinterlacing results in a computationally constrained and varied environment. The proposed computation-aware algorithm selection approach (CASA) for fast interlaced-to-progressive conversion consists of three methods: the line-averaging (LA) method for plain regions, the modified edge-based line-averaging (MELA) method for medium regions, and the proposed covariance-based adaptive deinterlacing (CAD) method for complex regions. The proposed CASA uses two criteria, mean-squared error (MSE) and CPU time, for assigning the method. We proposed a CAD method whose principal idea is based on the correspondence between the high- and low-resolution covariances. We estimated the local covariance coefficients from an interlaced image using Wiener filtering theory and then used these optimal minimum-MSE interpolation coefficients to obtain a deinterlaced image. The CAD method, though more robust than most known methods, was not found to be very fast compared to the others. To alleviate this issue, we proposed an adaptive selection approach using a fast deinterlacing algorithm rather than using only one CAD algorithm. The proposed hybrid approach of switching between the conventional schemes (LA and MELA) and our CAD was proposed to reduce the overall computational load. A reliable condition to be used for switching the schemes was presented after a wide set of initial training processes. The results of computer simulations showed that the proposed methods outperformed a number of methods presented in the literature.
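
    The region-dependent switching idea can be sketched as follows (illustrative only; the MSE/CPU-time assignment criteria are reduced to a simple local-variance threshold, and the covariance-based CAD interpolator is not reproduced):

        import numpy as np

        def line_average(f, r, c):
            """LA: average of the pixels directly above and below the missing line."""
            return 0.5 * (f[r - 1, c] + f[r + 1, c])

        def edge_based_line_average(f, r, c):
            """MELA-like: interpolate along the diagonal or vertical direction with the
            smallest absolute difference between the lines above and below."""
            left, right = max(c - 1, 0), min(c + 1, f.shape[1] - 1)
            candidates = [
                (abs(f[r - 1, left]  - f[r + 1, right]), 0.5 * (f[r - 1, left]  + f[r + 1, right])),
                (abs(f[r - 1, c]     - f[r + 1, c]),     0.5 * (f[r - 1, c]     + f[r + 1, c])),
                (abs(f[r - 1, right] - f[r + 1, left]),  0.5 * (f[r - 1, right] + f[r + 1, left])),
            ]
            return min(candidates)[1]

        def deinterlace_pixel(f, r, c, t_plain=10.0, t_medium=100.0):
            """Computation-aware switching: cheap LA in plain regions, MELA in medium ones;
            a covariance-based (Wiener) interpolator would handle complex regions."""
            window = f[max(r - 2, 0):r + 3, max(c - 2, 0):c + 3]
            activity = float(np.var(window))
            if activity < t_plain:
                return line_average(f, r, c)
            if activity < t_medium:
                return edge_based_line_average(f, r, c)
            return edge_based_line_average(f, r, c)   # stand-in for the CAD method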

  18. Selection of stationary phase particle geometry using X-ray computed tomography and computational fluid dynamics simulations.

    Science.gov (United States)

    Schmidt, Irma; Minceva, Mirjana; Arlt, Wolfgang

    2012-02-17

    The X-ray computed tomography (CT) is used to determine local parameters related to the column packing homogeneity and hydrodynamics in columns packed with spherically and irregularly shaped particles of same size. The results showed that the variation of porosity and axial dispersion coefficient along the column axis is insignificant, compared to their radial distribution. The methodology of using the data attained by CT measurements to perform a CFD simulation of a batch separation of model binary mixtures, with different concentration and separation factors is demonstrated. The results of the CFD simulation study show that columns packed with spherically shaped particles provide higher yield in comparison to columns packed with irregularly shaped particles only below a certain value of the separation factor. The presented methodology can be used for selecting a suited packing material for a particular separation task. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. [Comparison of effectiveness and safety between Twisted File technique and ProTaper Universal rotary full sequence based on micro-computed tomography].

    Science.gov (United States)

    Chen, Xiao-bo; Chen, Chen; Liang, Yu-hong

    2016-02-18

    To evaluate the efficacy and safety of two types of rotary nickel-titanium systems (Twisted File and ProTaper Universal) for root canal preparation based on micro-computed tomography (micro-CT). Twenty extracted molars (including 62 canals) were divided into two experimental groups and were instrumented using the Twisted File rotary nickel-titanium system (TF) and the ProTaper Universal rotary nickel-titanium system (PU), respectively, to #25/0.08 following the recommended protocol. Time for root canal instrumentation (accumulation of time for every single file) was recorded. The 0-3 mm root surface from the apex was observed under an optical stereomicroscope at 25× magnification. The presence of crack lines was noted. The root canals were scanned with micro-CT before and after root canal preparation. Three-dimensional images of the canals were reconstructed, calculated and evaluated. The amount of canal central transportation in the two groups was calculated and compared. The shorter preparation time [(0.53 ± 0.14) min] was observed in the TF group, while the preparation time of the PU group was (2.06 ± 0.39) min (P < 0.05). Canal transportation was also smaller in the TF group than in the PU group [… vs. (0.097 ± 0.084) mm, P < 0.05]. No instrument separation was observed in either group. Cracks were not found in either group, whether on micro-CT images or under an optical stereomicroscope at 25× magnification. Compared with ProTaper Universal, Twisted File took less time in root canal preparation and exhibited better shaping ability with less canal transportation.

  20. Selection of Computer Codes for Shallow Land Waste Disposal in PPTA Serpong

    International Nuclear Information System (INIS)

    Syahrir

    1996-01-01

    Selection of Computer Codes for Shallow Land Waste Disposal in PPTA Serpong. Models and computer codes have been selected for the safety assessment of a near surface waste disposal facility. This paper provides a summary and overview of the methodology and codes selected. The methodology allows analyses of doses to individuals from offsite releases under normal conditions as well as on-site doses to inadvertent intruders. A demonstration for the case of shallow land waste disposal in the Nuclear Research Establishment area in Serpong has been given for the normal release scenario. The assessment includes infiltration of rainfall, source term, groundwater (well) and surface water transport, food chain and dosimetry. The results show the dose history of maximally exposed individuals. The codes used are VS2DT, PAGAN and GENII. The application of 1 m of silt loam as a moisture barrier cover decreases flow in the disposal unit by a factor of 27. The selected radionuclides show a variety of dose histories according to their chemical and physical characteristics and behavior in the environment

  1. The Development of Computer-Aided Design for Electrical Equipment Selection and Arrangement of 10 kV Switchgear

    Directory of Open Access Journals (Sweden)

    Chernaya Anastassiya

    2015-01-01

    Full Text Available The paper intends to give an overview of a computer-aided design program application. The research includes two main parts: the development of a computer-aided design for an appropriate switchgear selection and its arrangement in an indoor switchgear layout. Matlab program was used to develop a computer-aided design system. The use of this program considerably simplifies the selection and arrangement of 10 kV switchgear.

  2. Approaches to Addressing Service Selection Ties in Ad Hoc Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ayotuyi Tosin Akinola

    2018-01-01

    Full Text Available The ad hoc mobile cloud (AMC) allows mobile devices to connect together through a wireless connection or any other means and request web services from one another within the mobile cloud. However, one of the major challenges in the AMC is the occurrence of dissatisfaction experienced by users. This is because there are many services with similar functionalities but varying nonfunctional properties. Moreover, another cause of user dissatisfaction, coupled with runtime redundancy, is the attainment of similar quality computations during service selection, often referred to as "service selection ties." In an attempt to address this challenge, service selection mechanisms for the AMC were developed in this work. This includes the use of selected quality of service properties coupled with user feedback data to determine the most suitable service. These mechanisms were evaluated using the experimental method. The evaluation mainly focused on metrics that assess the satisfaction of users' interests via quantitative evaluation. The experiments affirmed that the use of the shortest distance can help to break selection ties between potential servicing nodes. Also, continuous use of an updated and unrestricted range of users' assessments enhances optimal service selection.
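
    A small sketch of the tie-breaking idea described above (hypothetical data model; the QoS properties are collapsed into a single score, ties are broken by the shortest hop distance to the requesting device and then by accumulated user feedback):

        from dataclasses import dataclass

        @dataclass
        class Candidate:
            node: str
            qos_score: float     # aggregated from the selected QoS properties (higher is better)
            hop_distance: int    # distance from the requesting device in the ad hoc mobile cloud
            feedback: float      # running average of user ratings

        def select_service(candidates, eps=1e-6):
            """Rank by QoS; resolve 'service selection ties' by distance, then by feedback."""
            best = max(c.qos_score for c in candidates)
            tied = [c for c in candidates if abs(c.qos_score - best) < eps]
            return min(tied, key=lambda c: (c.hop_distance, -c.feedback))

        services = [Candidate("nodeA", 0.92, 3, 4.1),
                    Candidate("nodeB", 0.92, 1, 3.8),
                    Candidate("nodeC", 0.88, 1, 4.9)]
        print(select_service(services).node)   # -> nodeB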

  3. Thermodynamics of natural selection III: Landauer's principle in computation and chemistry.

    Science.gov (United States)

    Smith, Eric

    2008-05-21

    This is the third in a series of three papers devoted to energy flow and entropy changes in chemical and biological processes, and their relations to the thermodynamics of computation. The previous two papers have developed reversible chemical transformations as idealizations for studying physiology and natural selection, and derived bounds from the second law of thermodynamics, between information gain in an ensemble and the chemical work required to produce it. This paper concerns the explicit mapping of chemistry to computation, and particularly the Landauer decomposition of irreversible computations, in which reversible logical operations generating no heat are separated from heat-generating erasure steps which are logically irreversible but thermodynamically reversible. The Landauer arrangement of computation is shown to produce the same entropy-flow diagram as that of the chemical Carnot cycles used in the second paper of the series to idealize physiological cycles. The specific application of computation to data compression and error-correcting encoding also makes possible a Landauer analysis of the somewhat different problem of optimal molecular recognition, which has been considered as an information theory problem. It is shown here that bounds on maximum sequence discrimination from the enthalpy of complex formation, although derived from the same logical model as the Shannon theorem for channel capacity, arise from exactly the opposite model for erasure.
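
    The Landauer bound invoked in the decomposition above, stated in its usual form: erasing one bit of information in an environment at temperature T dissipates at least

        \[
        Q_{\min} = k_B T \ln 2 \;\approx\; 2.9 \times 10^{-21}\ \mathrm{J}\quad (T = 300\ \mathrm{K}),
        \]

    while the logically reversible operations carry, in principle, no minimum heat cost; this is the separation the Landauer decomposition exploits.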

  4. Experimental design of membrane sensor for selective determination of phenazopyridine hydrochloride based on computational calculations

    International Nuclear Information System (INIS)

    Attia, Khalid A.M.; El-Abasawi, Nasr M.; Abdel-Azim, Ahmed H.

    2016-01-01

    Computational study has been done electronically and geometrically to select the most suitable ionophore to design a novel sensitive and selective electrochemical sensor for phenazopyridine hydrochloride (PAP). This study has revealed that sodium tetraphenylborate (NaTPB) fits better with PAP than potassium tetrakis (KTClPB). The sensor design is based on the ion pair of PAP with NaTPB using dioctyl phthalate as a plasticizer. Under optimum conditions, the proposed sensor shows a slope of 59.5 mV per concentration decade in the concentration range of 1.0 × 10⁻²–1.0 × 10⁻⁵ M with a detection limit of 8.5 × 10⁻⁶ M. The sensor exhibits a very good selectivity for PAP with respect to a large number of interfering species such as inorganic cations and sugars. The sensor enables the determination of PAP in the presence of its oxidative degradation product 2,3,6-triaminopyridine, which is also its toxic metabolite. The proposed sensor has been successfully applied for the selective determination of PAP in pharmaceutical formulation. Also, the obtained results have been statistically compared to a reported electrochemical method, indicating no significant difference between the investigated method and the reported one with respect to accuracy and precision. - Highlights: • Novel use of ISE for selective determination of phenazopyridine hydrochloride. • Investigating the degradation pathway of phenazopyridine with enough confirmation scans. • To avoid time-consuming experimental trials, computational studies have been applied. • The proposed sensor shows high selectivity, reasonable detection limit and fast response.
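
    The reported slope of 59.5 mV per decade is close to the ideal Nernstian response for a monovalent cation, which follows from the standard electrode-response equation (quoted here for context, not from the paper):

        \[
        E = E^{0} + \frac{2.303\,RT}{zF}\,\log a_{\mathrm{PAP}},
        \qquad
        \frac{2.303\,RT}{F} \approx 59.2\ \mathrm{mV\ per\ decade\ at\ 25\,^{\circ}C}\ (z = 1).
        \]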

  5. Experimental design of membrane sensor for selective determination of phenazopyridine hydrochloride based on computational calculations

    Energy Technology Data Exchange (ETDEWEB)

    Attia, Khalid A.M.; El-Abasawi, Nasr M.; Abdel-Azim, Ahmed H., E-mail: Ahmed.hussienabdelazim@hotmil.com

    2016-04-01

    A computational study has been performed, electronically and geometrically, to select the most suitable ionophore for designing a novel sensitive and selective electrochemical sensor for phenazopyridine hydrochloride (PAP). This study has revealed that sodium tetraphenylborate (NaTPB) fits better with PAP than potassium tetrakis(4-chlorophenyl)borate (KTClPB). The sensor design is based on the ion pair of PAP with NaTPB using dioctyl phthalate as a plasticizer. Under optimum conditions, the proposed sensor shows a slope of 59.5 mV per concentration decade in the concentration range of 1.0 × 10⁻² to 1.0 × 10⁻⁵ M with a detection limit of 8.5 × 10⁻⁶ M. The sensor exhibits very good selectivity for PAP with respect to a large number of interfering species such as inorganic cations and sugars. The sensor enables the determination of PAP in the presence of its oxidative degradation product 2,3,6-triaminopyridine, which is also its toxic metabolite. The proposed sensor has been successfully applied to the selective determination of PAP in pharmaceutical formulations. The obtained results have also been statistically compared to a reported electrochemical method, indicating no significant difference between the investigated method and the reported one with respect to accuracy and precision. - Highlights: • Novel use of ISE for selective determination of phenazopyridine hydrochloride. • Investigating the degradation pathway of phenazopyridine with sufficient confirmation scans. • To avoid time-consuming experimental trials, computational studies have been applied. • The proposed sensor shows high selectivity, a reasonable detection limit and fast response.

  6. Selective sensation based brain-computer interface via mechanical vibrotactile stimulation.

    Science.gov (United States)

    Yao, Lin; Meng, Jianjun; Zhang, Dingguo; Sheng, Xinjun; Zhu, Xiangyang

    2013-01-01

    In this work, mechanical vibrotactile stimulation was applied to the subjects' left and right wrist skin with equal intensity, and a selective sensation perception task was performed to achieve two types of selections, similar to a motor imagery Brain-Computer Interface. The proposed system was based on event-related desynchronization/synchronization (ERD/ERS), which correlates with the processing of afferent inflow in the human somatosensory system, and on the attentional effect that modulates ERD/ERS. The experiments were carried out on nine subjects (without experience in selective sensation); six of them showed a discrimination accuracy above 80%, and three of them above 95%. Comparative experiments with motor imagery (with and without the presence of stimulation) were also carried out, which further showed the feasibility of selective sensation as an alternative BCI task complementary to motor imagery. Specifically, there was a significant improvement ([Formula: see text]) from near 65% in motor imagery (with and without the presence of stimulation) to above 80% in selective sensation for some subjects. The proposed BCI modality might well cooperate with existing BCI modalities in the literature in enlarging the widespread usage of BCI systems.

  7. Selective Sensation Based Brain-Computer Interface via Mechanical Vibrotactile Stimulation

    Science.gov (United States)

    Yao, Lin; Meng, Jianjun; Zhang, Dingguo; Sheng, Xinjun; Zhu, Xiangyang

    2013-01-01

    In this work, mechanical vibrotactile stimulation was applied to the subjects' left and right wrist skin with equal intensity, and a selective sensation perception task was performed to achieve two types of selections, similar to a motor imagery Brain-Computer Interface. The proposed system was based on event-related desynchronization/synchronization (ERD/ERS), which correlates with the processing of afferent inflow in the human somatosensory system, and on the attentional effect that modulates ERD/ERS. The experiments were carried out on nine subjects (without experience in selective sensation); six of them showed a discrimination accuracy above 80%, and three of them above 95%. Comparative experiments with motor imagery (with and without the presence of stimulation) were also carried out, which further showed the feasibility of selective sensation as an alternative BCI task complementary to motor imagery. Specifically, there was a significant improvement from near 65% in motor imagery (with and without the presence of stimulation) to above 80% in selective sensation for some subjects. The proposed BCI modality might well cooperate with existing BCI modalities in the literature in enlarging the widespread usage of BCI systems. PMID:23762253

  8. Selection of suitable hand gestures for reliable myoelectric human computer interface.

    Science.gov (United States)

    Castro, Maria Claudia F; Arjunan, Sridhar P; Kumar, Dinesh K

    2015-04-09

    Myoelectric controlled prosthetic hand requires machine based identification of hand gestures using surface electromyogram (sEMG) recorded from the forearm muscles. This study has observed that a sub-set of the hand gestures have to be selected for an accurate automated hand gesture recognition, and reports a method to select these gestures to maximize the sensitivity and specificity. Experiments were conducted where sEMG was recorded from the muscles of the forearm while subjects performed hand gestures and then was classified off-line. The performances of ten gestures were ranked using the proposed Positive-Negative Performance Measurement Index (PNM), generated by a series of confusion matrices. When using all the ten gestures, the sensitivity and specificity was 80.0% and 97.8%. After ranking the gestures using the PNM, six gestures were selected and these gave sensitivity and specificity greater than 95% (96.5% and 99.3%); Hand open, Hand close, Little finger flexion, Ring finger flexion, Middle finger flexion and Thumb flexion. This work has shown that reliable myoelectric based human computer interface systems require careful selection of the gestures that have to be recognized and without such selection, the reliability is poor.

  9. Grid collector an event catalog with automated file management

    CERN Document Server

    Ke Sheng Wu; Sim, A; Jun Min Gu; Shoshani, A

    2004-01-01

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. The users can select ev...

  10. Computational Selection of RNA Aptamer against Angiopoietin-2 and Experimental Evaluation

    Directory of Open Access Journals (Sweden)

    Wen-Pin Hu

    2015-01-01

    Full Text Available Angiogenesis plays a decisive role in the growth and spread of cancer, and angiopoietin-2 (Ang2) is in the spotlight of studies for its unique role in modulating angiogenesis. The aim of this study was to introduce a computational simulation approach to screen aptamers with high binding ability for Ang2. We carried out computational simulations of aptamer-protein interactions by using the ZDOCK and ZRANK functions in Discovery Studio 3.5, starting from the available information on aptamers generated through the systematic evolution of ligands by exponential enrichment (SELEX) in the literature. From the best of the three aptamers on the basis of ZRANK scores, 189 sequences with two-point mutations were created and simulated with Ang2. Then, we used a surface plasmon resonance (SPR) biosensor to test 3 mutant sequences with high ZRANK scores along with a high- and a low-affinity binding sequence as reported in the literature. We found that a selected RNA aptamer has a higher binding affinity and SPR response than the reported sequence with the highest affinity. This is the first study of in silico selection of aptamers against Ang2 using the ZRANK scoring function, which should help to increase the efficiency of selecting aptamers with high target-binding ability.
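
    The 189 two-point mutants described above were presumably generated under a scheme specific to the selected aptamers; purely as an illustration of exhaustive two-point mutagenesis of an RNA sequence, a minimal Python sketch (sequence and function names are hypothetical) might look like this:

        from itertools import combinations

        BASES = "ACGU"

        def two_point_mutants(seq):
            """Yield every sequence that differs from `seq` at exactly two positions."""
            for i, j in combinations(range(len(seq)), 2):
                for bi in BASES:
                    if bi == seq[i]:
                        continue
                    for bj in BASES:
                        if bj == seq[j]:
                            continue
                        s = list(seq)
                        s[i], s[j] = bi, bj
                        yield "".join(s)

        # Short hypothetical fragment: C(6,2) * 3 * 3 = 135 variants
        print(sum(1 for _ in two_point_mutants("GGAUCC")))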

  11. Selection and benchmarking of computer codes for research reactor core conversions

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Emin [School of Aerospace, Mechanical and Nuclear Engineering, University of Oklahoma, Norman, OK (United States); Jones, Barclay G [Nuclear Engineering Program, University of IL at Urbana-Champaign, Urbana, IL (United States)

    1983-09-01

    A group of computer codes has been selected and obtained from the Nuclear Energy Agency (NEA) Data Bank in France for the core conversion study of highly enriched research reactors. ANISN, WIMSD-4, MC², COBRA-3M, FEVER, THERMOS, GAM-2, CINDER and EXTERMINATOR were selected for the study. For the final work, THERMOS, GAM-2, CINDER and EXTERMINATOR have been selected and used. A one-dimensional thermal hydraulics code has also been used to calculate temperature distributions in the core. THERMOS and CINDER have been modified to serve the purpose. Minor modifications have been made to GAM-2 and EXTERMINATOR to improve their utilization. All of the codes have been debugged on both CDC and IBM computers at the University of Illinois. The IAEA 10 MW benchmark problem has been solved. Results of this work have been compared with the IAEA contributors' results. Agreement is very good for highly enriched fuel (HEU). Deviations from the IAEA contributors' mean value for low enriched fuel (LEU) exist but they are small enough in general. The deviation of keff is about 0.5% for both enrichments at the beginning of life (BOL) and at the end of life (EOL). Flux ratios deviate only about 1.5% from the IAEA contributors' mean value. (author)

  12. Selection and benchmarking of computer codes for research reactor core conversions

    International Nuclear Information System (INIS)

    Yilmaz, Emin; Jones, Barclay G.

    1983-01-01

    A group of computer codes has been selected and obtained from the Nuclear Energy Agency (NEA) Data Bank in France for the core conversion study of highly enriched research reactors. ANISN, WIMSD-4, MC², COBRA-3M, FEVER, THERMOS, GAM-2, CINDER and EXTERMINATOR were selected for the study. For the final work, THERMOS, GAM-2, CINDER and EXTERMINATOR have been selected and used. A one-dimensional thermal hydraulics code has also been used to calculate temperature distributions in the core. THERMOS and CINDER have been modified to serve the purpose. Minor modifications have been made to GAM-2 and EXTERMINATOR to improve their utilization. All of the codes have been debugged on both CDC and IBM computers at the University of Illinois. The IAEA 10 MW benchmark problem has been solved. Results of this work have been compared with the IAEA contributors' results. Agreement is very good for highly enriched fuel (HEU). Deviations from the IAEA contributors' mean value for low enriched fuel (LEU) exist but they are small enough in general. The deviation of keff is about 0.5% for both enrichments at the beginning of life (BOL) and at the end of life (EOL). Flux ratios deviate only about 1.5% from the IAEA contributors' mean value. (author)

  13. Selection and benchmarking of computer codes for research reactor core conversions

    International Nuclear Information System (INIS)

    Yilmaz, E.; Jones, B.G.

    1983-01-01

    A group of computer codes has been selected and obtained from the Nuclear Energy Agency (NEA) Data Bank in France for the core conversion study of highly enriched research reactors. ANISN, WIMSD-4, MC², COBRA-3M, FEVER, THERMOS, GAM-2, CINDER and EXTERMINATOR were selected for the study. For the final work, THERMOS, GAM-2, CINDER and EXTERMINATOR have been selected and used. A one-dimensional thermal hydraulics code has also been used to calculate temperature distributions in the core. THERMOS and CINDER have been modified to serve the purpose. Minor modifications have been made to GAM-2 and EXTERMINATOR to improve their utilization. All of the codes have been debugged on both CDC and IBM computers at the University of Illinois. The IAEA 10 MW benchmark problem has been solved. Results of this work have been compared with the IAEA contributors' results. Agreement is very good for highly enriched fuel (HEU). Deviations from the IAEA contributors' mean value for low enriched fuel (LEU) exist but they are small enough in general

  14. Computational Simulation of Thermal and Spattering Phenomena and Microstructure in Selective Laser Melting of Inconel 625

    Science.gov (United States)

    Özel, Tuğrul; Arısoy, Yiğit M.; Criales, Luis E.

    Computational modelling of Laser Powder Bed Fusion (L-PBF) processes such as Selective Laser Melting (SLM) can reveal information that is hard to obtain or unobtainable by in-situ experimental measurements. A 3D thermal field that is not visible to the thermal camera can be obtained by solving the 3D heat transfer problem. Furthermore, microstructural modelling can be used to predict the quality and mechanical properties of the product. In this paper, a nonlinear 3D Finite Element Method based computational code is developed to simulate the SLM process with different process parameters such as laser power and scan velocity. The code is further improved by utilizing an in-situ thermal camera recording to predict spattering, which is in turn included as a stochastic heat loss. Then, thermal gradients extracted from the simulations are applied to predict growth directions in the resulting microstructure.

  15. Collaborative filtering for brain-computer interaction using transfer learning and active class selection.

    Directory of Open Access Journals (Sweden)

    Dongrui Wu

    Full Text Available Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.

  16. Collaborative filtering for brain-computer interaction using transfer learning and active class selection.

    Science.gov (United States)

    Wu, Dongrui; Lance, Brent J; Parsons, Thomas D

    2013-01-01

    Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.
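
    As a rough illustration of the mean-squared-difference user-similarity heuristic and the transfer-learning pooling step described above (the data layout, weighting and choice of k are assumptions, and the active class selection step is not shown), one might sketch:

        import numpy as np

        def msd_similarity(x_new, x_other):
            """Mean-squared-difference similarity between two subjects' mean
            responses to the same calibration stimuli (smaller MSD -> more similar)."""
            return 1.0 / (1.0 + np.mean((x_new - x_other) ** 2))

        def build_training_set(user_X, user_y, aux_subjects, k=3):
            """Augment a small user-specific set with data from the k most
            similar auxiliary subjects (each given as {"X": ..., "y": ...})."""
            ranked = sorted(aux_subjects,
                            key=lambda s: -msd_similarity(user_X.mean(0), s["X"].mean(0)))
            X = np.vstack([user_X] + [s["X"] for s in ranked[:k]])
            y = np.concatenate([user_y] + [s["y"] for s in ranked[:k]])
            return X, y  # feed to kNN or SVM, as in the study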

  17. Interactive FORTRAN IV computer programs for the thermodynamic and transport properties of selected cryogens (fluids pack)

    Science.gov (United States)

    Mccarty, R. D.

    1980-01-01

    The thermodynamic and transport properties of selected cryogens have been programmed into a series of computer routines. Input variables are any two of P, rho or T in the single-phase regions and either P or T for the saturated liquid or vapor state. The output is pressure, density, temperature, entropy, enthalpy for all of the fluids and, in most cases, specific heat capacity and speed of sound. Viscosity and thermal conductivity are also given for most of the fluids. The programs are designed for access by remote terminal; however, they have been written in a modular form to allow the user to select either specific fluids or specific properties for particular needs. The program includes properties for hydrogen, helium, neon, nitrogen, oxygen, argon, and methane. The programs include properties for gaseous and liquid states, usually from the triple point to some upper limit of pressure and temperature which varies from fluid to fluid.

  18. [Use of Cone Beam Computed Tomography in endodontics: rational case selection criteria].

    Science.gov (United States)

    Rosen, E; Tsesis, I

    2016-01-01

    To present rational case selection criteria for the use of CBCT (Cone Beam Computed Tomography) in endodontics. This article reviews the literature concerning the benefits of CBCT in endodontics, alongside its radiation risks, and presents case selection criteria for referral of endodontic patients to CBCT. To date, the expected ultimate benefit of CBCT to the endodontic patient is still uncertain, and the current literature is mainly restricted to its technical efficacy. In addition, the potential radiation risks of a CBCT scan are stochastic in nature and uncertain, and are worrying especially in pediatric patients. Both the efficacy of CBCT in supporting the endodontic practitioner's decision making and in affecting treatment outcomes, and its long-term potential radiation risks, remain uncertain. Therefore, cautious, rational decision making is essential when a CBCT scan is considered in endodontics. Risk-benefit considerations are presented.

  19. Some selection criteria for computers in real-time systems for high energy physics

    International Nuclear Information System (INIS)

    Kolpakov, I.F.

    1980-01-01

    The right choice of program source is of great importance for the organization of real-time systems, as cost and reliability are decisive factors. Some selection criteria for program sources for high energy physics multiwire chamber spectrometers (MWCS) are considered in this report. MWCSs accept bits of information from event patterns. Large and small computers, microcomputers and intelligent controllers in CAMAC crates are compared with respect to the following characteristics: data exchange speed, number of addresses for peripheral devices, cost of interfacing a peripheral device, sizes of buffer and mass memory, configuration costs, and the mean time between failures (MTBF). The results of the comparisons are shown by plots and histograms which allow the selection of program sources according to the above criteria. (Auth.)

  20. Computer-assisted selection of coplanar beam orientations in intensity-modulated radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Pugachev, A.; Xing, L. [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA (United States)]. E-mail: lei@reyes.stanford.edu

    2001-09-01

    In intensity-modulated radiation therapy (IMRT), the incident beam orientations are often determined by a trial and error search. The conventional beam's-eye view (BEV) tool becomes less helpful in IMRT because it is frequently required that beams go through organs at risk (OARs) in order to achieve a compromise between the dosimetric objectives of the planning target volume (PTV) and the OARs. In this paper, we report a beam's-eye view dosimetrics (BEVD) technique to assist in the selection of beam orientations in IMRT. In our method, each beam portal is divided into a grid of beamlets. A score function is introduced to measure the 'goodness' of each beamlet at a given gantry angle. The score is determined by the maximum PTV dose deliverable by the beamlet without exceeding the tolerance doses of the OARs and normal tissue located in the path of the beamlet. The overall score of the gantry angle is given by a sum of the scores of all beamlets. For a given patient, the score function is evaluated for each possible beam orientation. The directions with the highest scores are then selected as the candidates for beam placement. This procedure is similar to the BEV approach used in conventional radiation therapy, except that the evaluation by a human is replaced by a score function to take into account the intensity modulation. This technique allows one to select beam orientations without the excessive computing overhead of computer optimization of beam orientation. It also provides useful insight into the problem of selection of beam orientation and is especially valuable for complicated cases where the PTV is surrounded by several sensitive structures and where it is difficult to select a set of 'good' beam orientations. Several two-dimensional (2D) model cases were used to test the proposed technique. The plans obtained using the BEVD-selected beam orientations were compared with the plans obtained using equiangular spaced beams. For
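
    In symbols (our notation, paraphrasing the description above), the BEVD score of a candidate gantry angle \theta would read

        S(\theta) \;=\; \sum_{b \,\in\, \mathrm{beamlets}(\theta)} d_{b}^{\max},

    where d_b^{\max} is the largest PTV dose that beamlet b can deliver without any OAR or normal-tissue structure in its path exceeding its tolerance dose; candidate angles are then ranked by S(\theta).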

  1. Selection of examples in case-based computer-aided decision systems

    International Nuclear Information System (INIS)

    Mazurowski, Maciej A; Zurada, Jacek M; Tourassi, Georgia D

    2008-01-01

    Case-based computer-aided decision (CB-CAD) systems rely on a database of previously stored, known examples when classifying new, incoming queries. Such systems can be particularly useful since they do not need retraining every time a new example is deposited in the case base. The adaptive nature of case-based systems is well suited to the current trend of continuously expanding digital databases in the medical domain. To maintain efficiency, however, such systems need sophisticated strategies to effectively manage the available evidence database. In this paper, we discuss the general problem of building an evidence database by selecting the most useful examples to store while satisfying existing storage requirements. We evaluate three intelligent techniques for this purpose: genetic algorithm-based selection, greedy selection and random mutation hill climbing. These techniques are compared to a random selection strategy used as the baseline. The study is performed with a previously presented CB-CAD system applied for false positive reduction in screening mammograms. The experimental evaluation shows that when the development goal is to maximize the system's diagnostic performance, the intelligent techniques are able to reduce the size of the evidence database to 37% of the original database by eliminating superfluous and/or detrimental examples while at the same time significantly improving the CAD system's performance. Furthermore, if the case-base size is a main concern, the total number of examples stored in the system can be reduced to only 2-4% of the original database without a decrease in the diagnostic performance. Comparison of the techniques shows that random mutation hill climbing provides the best balance between the diagnostic performance and computational efficiency when building the evidence database of the CB-CAD system.
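
    A minimal sketch of the random mutation hill climbing variant for case-base selection (the fitness function, e.g. CAD performance on a validation set, the single-swap move and the acceptance rule are assumptions):

        import random

        def rmhc_select(n_cases, subset_size, fitness, iters=5000):
            """Random mutation hill climbing over fixed-size subsets of case indices.
            `fitness(subset)` should return a score to maximize (e.g. CAD ROC area)."""
            current = set(random.sample(range(n_cases), subset_size))
            best_fit = fitness(current)
            for _ in range(iters):
                candidate = set(current)
                out_idx = random.choice(tuple(candidate))       # drop one stored case
                in_idx = random.choice([i for i in range(n_cases) if i not in candidate])
                candidate.remove(out_idx)
                candidate.add(in_idx)                           # add a new one
                f = fitness(candidate)
                if f >= best_fit:                               # keep non-worse subsets
                    current, best_fit = candidate, f
            return current, best_fit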

  2. Evaluation of selected mechanical properties of NiTi rotary glide path files manufactured from controlled memory wires.

    Science.gov (United States)

    Nishijo, Miki; Ebihara, Arata; Tokita, Daisuke; Doi, Hisashi; Hanawa, Takao; Okiji, Takashi

    2018-03-28

    This study aimed to investigate mechanical properties related to the flexibility and fracture resistance of nickel-titanium rotary glide path files manufactured from controlled memory wire [HyFlex EDM Glide Path File (EDM) and HyFlex GPF (GPF)]. Scout RaCe (RaCe) served as the control. Bending loads, torsional/cyclic fatigue resistance, and screw-in forces were measured. EDM showed a significantly larger torque at fracture, a longer time to cyclic fracture in reciprocation and a larger screw-in force compared with GPF and RaCe. GPF showed significantly lower bending loads and higher angular deflection values than EDM and RaCe, and a significantly longer time to cyclic fracture than RaCe. The time to cyclic fracture was significantly longer in reciprocation compared with continuous rotation in EDM and GPF. It can be concluded that EDM and/or GPF showed higher flexibility and cyclic/torsional fatigue resistance compared with RaCe, and that reciprocation conferred better cyclic fatigue resistance to EDM and GPF.

  3. A Soft Computing Based Approach Using Modified Selection Strategy for Feature Reduction of Medical Systems

    Directory of Open Access Journals (Sweden)

    Kursat Zuhtuogullari

    2013-01-01

    Full Text Available Systems consisting of high-dimensional input spaces require high processing times and memory usage. Most of the attribute selection algorithms have the problems of input dimension limits and information storage problems. These problems are eliminated by means of the developed feature reduction software, which uses a new modified selection mechanism with middle-region solution candidate adding. The hybrid system software is constructed for reducing the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism. Linear order crossover is used as the recombination operator. In genetic algorithm based soft computing methods, locking into local solutions is also a problem, which is eliminated by using the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced input attributes) with seven, six, and five elements. It can be seen from the obtained results that the developed software with modified selection has advantages in the fields of memory allocation, execution time, classification accuracy, sensitivity, and specificity values when compared with the other reduction algorithms by using the urological test data.

  4. A soft computing based approach using modified selection strategy for feature reduction of medical systems.

    Science.gov (United States)

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems consisting of high-dimensional input spaces require high processing times and memory usage. Most of the attribute selection algorithms have the problems of input dimension limits and information storage problems. These problems are eliminated by means of the developed feature reduction software, which uses a new modified selection mechanism with middle-region solution candidate adding. The hybrid system software is constructed for reducing the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism. Linear order crossover is used as the recombination operator. In genetic algorithm based soft computing methods, locking into local solutions is also a problem, which is eliminated by using the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced input attributes) with seven, six, and five elements. It can be seen from the obtained results that the developed software with modified selection has advantages in the fields of memory allocation, execution time, classification accuracy, sensitivity, and specificity values when compared with the other reduction algorithms by using the urological test data.
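
    The roulette wheel selection mechanism mentioned above is the standard fitness-proportionate operator of genetic algorithms; a minimal sketch (assuming non-negative fitness values) is:

        import random

        def roulette_wheel_select(population, fitnesses):
            """Pick one individual with probability proportional to its fitness."""
            total = sum(fitnesses)
            pick = random.uniform(0.0, total)
            running = 0.0
            for individual, fit in zip(population, fitnesses):
                running += fit
                if running >= pick:
                    return individual
            return population[-1]   # guard against floating-point round-off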

  5. Evaluation of intradural stimulation efficiency and selectivity in a computational model of spinal cord stimulation.

    Directory of Open Access Journals (Sweden)

    Bryan Howell

    Full Text Available Spinal cord stimulation (SCS) is an alternative or adjunct therapy to treat chronic pain, a prevalent and clinically challenging condition. Although SCS has substantial clinical success, the therapy is still prone to failures, including lead breakage, lead migration, and poor pain relief. The goal of this study was to develop a computational model of SCS and use the model to compare activation of neural elements during intradural and extradural electrode placement. We constructed five patient-specific models of SCS. Stimulation thresholds predicted by the model were compared to stimulation thresholds measured intraoperatively, and we used these models to quantify the efficiency and selectivity of intradural and extradural SCS. Intradural placement dramatically increased stimulation efficiency and reduced the power required to stimulate the dorsal columns by more than 90%. Intradural placement also increased selectivity, allowing activation of a greater proportion of dorsal column fibers before spread of activation to dorsal root fibers, as well as more selective activation of individual dermatomes at different lateral deviations from the midline. Further, the results suggest that current electrode designs used for extradural SCS are not optimal for intradural SCS, and a novel azimuthal tripolar design increased stimulation selectivity, even beyond that achieved with an intradural paddle array. Increased stimulation efficiency is expected to increase the battery life of implantable pulse generators, increase the recharge interval of rechargeable implantable pulse generators, and potentially reduce stimulator volume. The greater selectivity of intradural stimulation may improve the success rate of SCS by mitigating the sensitivity of pain relief to malpositioning of the electrode. The outcome of this effort is a better quantitative understanding of how intradural electrode placement can potentially increase the selectivity and efficiency of SCS

  6. High Altitude Balloon Flight Path Prediction and Site Selection Based On Computer Simulations

    Science.gov (United States)

    Linford, Joel

    2010-10-01

    Interested in the upper atmosphere, the Weber State University Physics Department has developed a High Altitude Reconnaissance Balloon for Outreach and Research team, also known as HARBOR. HARBOR enables Weber State University to take a variety of measurements from ground level to altitudes as high as 100,000 feet. The flight paths of these balloons can extend as far as 100 miles from the launch zone, making the choice of where and when to fly critical. To ensure the ability to recover the packages in a reasonable amount of time, days and times are carefully selected using computer simulations that limit flight tracks to approximately 40 miles from the launch zone. The computer simulations take atmospheric data collected by the National Oceanic and Atmospheric Administration (NOAA) to plot what flights might have looked like in the past, and to predict future flights. Using these simulations, a launch zone has been selected in Duchesne, Utah, which has hosted eight successful flights over the course of the last three years, all of which have been recovered. Several secondary launch zones in western Wyoming, southern Idaho, and northern Utah are also being considered.
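
    A toy version of such a simulation integrates forecast winds along the ascent; the sketch below is a rough illustration only (wind_profile is a placeholder for interpolated NOAA sounding data, and the burst/descent phase is ignored):

        import math

        def predict_ascent_track(lat, lon, ascent_rate, wind_profile,
                                 burst_alt=30_000.0, dt=10.0):
            """Integrate a simplified balloon ascent track.
            wind_profile(alt_m) returns (u_east, v_north) wind in m/s."""
            m_per_deg = 111_320.0                     # metres per degree of latitude
            alt, track = 0.0, [(lat, lon, 0.0)]
            while alt < burst_alt:                    # roughly 100,000 ft burst altitude
                u, v = wind_profile(alt)
                lat += v * dt / m_per_deg
                lon += u * dt / (m_per_deg * math.cos(math.radians(lat)))
                alt += ascent_rate * dt
                track.append((lat, lon, alt))
            return track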

  7. The computational form of craving is a selective multiplication of economic value.

    Science.gov (United States)

    Konova, Anna B; Louie, Kenway; Glimcher, Paul W

    2018-04-17

    Craving is thought to be a specific desire state that biases choice toward the desired object, be it chocolate or drugs. A vast majority of people report having experienced craving of some kind. In its pathological form, craving contributes to health outcomes in addiction and obesity. Yet despite its ubiquity and clinical relevance, we still lack a basic neurocomputational understanding of craving. Here, using an instantaneous measure of subjective valuation and selective cue exposure, we identify a behavioral signature of a food craving-like state and advance a computational framework for understanding how this state might transform valuation to bias choice. We find that desire induced by exposure to a specific high-calorie, high-fat/sugar snack good is expressed in subjects' momentary willingness to pay for this good. This effect is selective but not exclusive to the exposed good; rather, we find that it generalizes to nonexposed goods in proportion to their subjective attribute similarity to the exposed ones. A second manipulation of reward size (number of snack units available for purchase) further suggested that a multiplicative gain mechanism supports the transformation of valuation during laboratory craving. These findings help explain how real-world food craving can result in behaviors inconsistent with preferences expressed in the absence of craving, and open a path for the computational modeling of craving-like phenomena using a simple and repeatable experimental tool for assessing subjective states in economic terms. Copyright © 2018 the Author(s). Published by PNAS.
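
    Schematically (our notation and functional form, not the authors'), the multiplicative-gain account amounts to something like

        WTP_i^{\mathrm{craved}} \;\approx\; g_i \cdot WTP_i^{\mathrm{baseline}}, \qquad g_i \ge 1,

    with the gain g_i increasing with the attribute similarity of good i to the cue-exposed good, so that craving rescales subjective value rather than shifting it uniformly.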

  8. Optimal Selection Method of Process Patents for Technology Transfer Using Fuzzy Linguistic Computing

    Directory of Open Access Journals (Sweden)

    Gangfeng Wang

    2014-01-01

    Full Text Available Under the open innovation paradigm, technology transfer of process patents is one of the most important mechanisms for manufacturing companies to implement process innovation and enhance the competitive edge. To achieve promising technology transfers, we need to evaluate the feasibility of process patents and optimally select the most appropriate patent according to the actual manufacturing situation. Hence, this paper proposes an optimal selection method of process patents using multiple criteria decision-making and 2-tuple fuzzy linguistic computing to avoid information loss during the processes of evaluation integration. An evaluation index system for technology transfer feasibility of process patents is designed initially. Then, fuzzy linguistic computing approach is applied to aggregate the evaluations of criteria weights for each criterion and corresponding subcriteria. Furthermore, performance ratings for subcriteria and fuzzy aggregated ratings of criteria are calculated. Thus, we obtain the overall technology transfer feasibility of patent alternatives. Finally, a case study of aeroengine turbine manufacturing is presented to demonstrate the applicability of the proposed method.
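
    The 2-tuple fuzzy linguistic computing used for aggregation here follows the general idea of pairing a linguistic label with a symbolic translation (Herrera-Martinez style); a minimal sketch, with a hypothetical label set and example ratings rather than the paper's actual index system:

        def to_two_tuple(beta, labels):
            """Map an aggregated value beta in [0, len(labels)-1] to the nearest
            label plus a symbolic translation alpha in [-0.5, 0.5)."""
            i = min(max(int(round(beta)), 0), len(labels) - 1)
            return labels[i], beta - i

        labels = ["very low", "low", "medium", "high", "very high"]
        beta = sum([2, 3, 4, 4]) / 4          # mean of four expert ratings = 3.25
        print(to_two_tuple(beta, labels))     # -> ('high', 0.25)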

  9. Clustering based gene expression feature selection method: A computational approach to enrich the classifier efficiency of differentially expressed genes

    KAUST Repository

    Abusamra, Heba; Bajic, Vladimir B.

    2016-01-01

    decrease the computational time and cost, but also improve the classification performance. Among the different feature selection approaches, however, most suffer from several problems such as lack of robustness, validation issues, etc. Here, we

  10. A micro-computed tomographic evaluation of dentinal microcrack alterations during root canal preparation using single-file Ni-Ti systems.

    Science.gov (United States)

    Li, Mei-Lin; Liao, Wei-Li; Cai, Hua-Xiong

    2018-01-01

    The aim of the present study was to evaluate the length of dentinal microcracks observed prior to and following root canal preparation with different single-file nickel-titanium (Ni-Ti) systems using micro-computed tomography (micro-CT) analysis. A total of 80 mesial roots of mandibular first molars presenting with type II Vertucci canal configurations were scanned at an isotropic resolution of 7.4 µm. The samples were randomly assigned into four groups (n=20 per group) according to the system used for root canal preparation, including the WaveOne (WO), OneShape (OS), Reciproc (RE) and control groups. A second micro-CT scan was conducted after the root canals were prepared with size 25 instruments. Pre- and postoperative cross-section images of the roots (n=237,760) were then screened to identify the lengths of the microcracks. The results indicated that the microcrack lengths were notably increased following root canal preparation (Pfiles. Among the single-file Ni-Ti systems, WO and RE were not observed to cause notable microcracks, while the OS system resulted in evident microcracks.

  11. A reliable computational workflow for the selection of optimal screening libraries.

    Science.gov (United States)

    Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch

    2015-01-01

    components, it can be easily adapted and reproduced by computational groups interested in rational selection of screening libraries. Furthermore, the workflow could be readily modified to include additional components. This workflow has been routinely used in our laboratory for the selection of libraries in multiple projects and consistently selects libraries which are well balanced across multiple parameters. Graphical abstract.

  12. A hybrid agent-based computational economics and optimization approach for supplier selection problem

    Directory of Open Access Journals (Sweden)

    Zahra Pourabdollahi

    2017-12-01

    Full Text Available Supplier evaluation and selection problem is among the most important of logistics decisions that have been addressed extensively in supply chain management. The same logistics decision is also important in freight transportation since it identifies trade relationships between business establishments and determines commodity flows between production and consumption points. The commodity flows are then used as input to freight transportation models to determine cargo movements and their characteristics including mode choice and shipment size. Various approaches have been proposed to explore this latter problem in previous studies. Traditionally, potential suppliers are evaluated and selected using only price/cost as the influential criteria and the state-of-practice methods. This paper introduces a hybrid agent-based computational economics and optimization approach for supplier selection. The proposed model combines an agent-based multi-criteria supplier evaluation approach with a multi-objective optimization model to capture both behavioral and economical aspects of the supplier selection process. The model uses a system of ordered response models to determine importance weights of the different criteria in supplier evaluation from a buyers’ point of view. The estimated weights are then used to calculate a utility for each potential supplier in the market and rank them. The calculated utilities are then entered into a mathematical programming model in which best suppliers are selected by maximizing the total accrued utility for all buyers and minimizing total shipping costs while balancing the capacity of potential suppliers to ensure market clearing mechanisms. The proposed model, herein, was implemented under an operational agent-based supply chain and freight transportation framework for the Chicago Metropolitan Area.

  13. Causal Inference for Cross-Modal Action Selection: A Computational Study in a Decision Making Framework.

    Science.gov (United States)

    Daemi, Mehdi; Harris, Laurence R; Crawford, J Douglas

    2016-01-01

    Animals try to make sense of sensory information from multiple modalities by categorizing them into perceptions of individual or multiple external objects or internal concepts. For example, the brain constructs sensory, spatial representations of the locations of visual and auditory stimuli in the visual and auditory cortices based on retinal and cochlear stimulations. Currently, it is not known how the brain compares the temporal and spatial features of these sensory representations to decide whether they originate from the same or separate sources in space. Here, we propose a computational model of how the brain might solve such a task. We reduce the visual and auditory information to time-varying, finite-dimensional signals. We introduce controlled, leaky integrators as working memory that retains the sensory information for the limited time-course of task implementation. We propose our model within an evidence-based, decision-making framework, where the alternative plan units are saliency maps of space. A spatiotemporal similarity measure, computed directly from the unimodal signals, is suggested as the criterion to infer common or separate causes. We provide simulations that (1) validate our model against behavioral, experimental results in tasks where the participants were asked to report common or separate causes for cross-modal stimuli presented with arbitrary spatial and temporal disparities. (2) Predict the behavior in novel experiments where stimuli have different combinations of spatial, temporal, and reliability features. (3) Illustrate the dynamics of the proposed internal system. These results confirm our spatiotemporal similarity measure as a viable criterion for causal inference, and our decision-making framework as a viable mechanism for target selection, which may be used by the brain in cross-modal situations. Further, we suggest that a similar approach can be extended to other cognitive problems where working memory is a limiting factor, such

  14. A ROC-based feature selection method for computer-aided detection and diagnosis

    Science.gov (United States)

    Wang, Songyuan; Zhang, Guopeng; Liao, Qimei; Zhang, Junying; Jiao, Chun; Lu, Hongbing

    2014-03-01

    Image-based computer-aided detection and diagnosis (CAD) has been a very active research topic aiming to assist physicians in detecting lesions and distinguishing benign from malignant ones. However, the datasets fed into a classifier usually suffer from a small number of samples, as well as significantly fewer samples available in one class (having a disease) than the other, resulting in the classifier's suboptimal performance. Identifying the most characterizing features of the observed data for lesion detection is critical to improving the sensitivity and minimizing false positives of a CAD system. In this study, we propose a novel feature selection method, mR-FAST, that combines the minimal-redundancy-maximal-relevance (mRMR) framework with a selection metric FAST (feature assessment by sliding thresholds) based on the area under a ROC curve (AUC) generated on optimal simple linear discriminants. With three feature datasets extracted from CAD systems for colon polyps and bladder cancer, we show that the space of candidate features selected by mR-FAST is more characterizing for lesion detection, with higher AUC, enabling a compact subset of superior features to be found at low cost.
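
    A simplified sketch of the mR-FAST idea follows (not the authors' implementation: here the per-feature AUC is computed on raw feature values rather than on optimal simple linear discriminants, and plain correlation stands in for the mRMR redundancy term):

        import numpy as np
        from sklearn.metrics import roc_auc_score

        def fast_scores(X, y):
            """Score each feature by the AUC of a sliding-threshold classifier on it."""
            scores = []
            for j in range(X.shape[1]):
                auc = roc_auc_score(y, X[:, j])
                scores.append(max(auc, 1.0 - auc))   # direction-insensitive
            return np.array(scores)

        def mr_fast_select(X, y, k):
            """Greedily pick k features: high single-feature AUC, low redundancy."""
            relevance, chosen = fast_scores(X, y), []
            while len(chosen) < k:
                best_j, best_val = None, -np.inf
                for j in range(X.shape[1]):
                    if j in chosen:
                        continue
                    red = (np.mean([abs(np.corrcoef(X[:, j], X[:, c])[0, 1])
                                    for c in chosen]) if chosen else 0.0)
                    if relevance[j] - red > best_val:
                        best_j, best_val = j, relevance[j] - red
                chosen.append(best_j)
            return chosen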

  15. Summary of computational support and general documentation for computer code (GENTREE) used in Office of Nuclear Waste Isolation Pilot Salt Site Selection Project

    International Nuclear Information System (INIS)

    Beatty, J.A.; Younker, J.L.; Rousseau, W.F.; Elayat, H.A.

    1983-01-01

    A Decision Tree Computer Model was adapted for the purposes of a Pilot Salt Site Selection Project conducted by the Office of Nuclear Waste Isolation (ONWI). A deterministic computer model was developed to structure the site selection problem with submodels reflecting the five major outcome categories (Cost, Safety, Delay, Environment, Community Impact) to be evaluated in the decision process. Time-saving modifications were made in the tree code as part of the effort. In addition, format changes allowed retention of information items which are valuable in directing future research and in isolation of key variabilities in the Site Selection Decision Model. The deterministic code was linked to the modified tree code and the entire program was transferred to the ONWI-VAX computer for future use by the ONWI project

  16. Quantum computer based on activated dielectric nanoparticles selectively interacting with short optical pulses

    International Nuclear Information System (INIS)

    Gadomskii, Oleg N; Kharitonov, Yu Ya

    2004-01-01

    The operation principle of a quantum computer is proposed based on a system of dielectric nanoparticles activated with two-level atoms (qubits), in which electric dipole transitions are excited by short intense optical pulses. It is proved that the logical operation (logical operator) CNOT (controlled NOT) is performed by means of time-dependent transfer of quantum information over 'long' (of the order of 10⁴ nm) distances between spherical nanoparticles owing to the delayed interaction between them in the optical radiation field. It is shown that one-qubit and two-qubit logical operators required for quantum calculations can be realised by selectively exciting dielectric particles with short optical pulses. (quantum calculations)

  17. Application of tripolar concentric electrodes and prefeature selection algorithm for brain-computer interface.

    Science.gov (United States)

    Besio, Walter G; Cao, Hongbao; Zhou, Peng

    2008-04-01

    For persons with severe disabilities, a brain-computer interface (BCI) may be a viable means of communication. Laplacian electroencephalogram (EEG) has been shown to improve classification in EEG recognition. In this work, the effectiveness of signals from tripolar concentric electrodes and disc electrodes was compared for use as a BCI. Two sets of left/right hand motor imagery EEG signals were acquired. An autoregressive (AR) model was developed for feature extraction, with a Mahalanobis distance based linear classifier for classification. An exhaustive selection algorithm was employed to analyze three factors before feature extraction. The factors analyzed were 1) the length of data in each trial to be used, 2) the start position of the data, and 3) the order of the AR model. The results showed that tripolar concentric electrodes generated significantly higher classification accuracy than disc electrodes.
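
    A bare-bones version of the AR-feature plus Mahalanobis-distance pipeline described above (the model order, the least-squares AR fit and the pooled, lightly regularised covariance are our assumptions):

        import numpy as np

        def ar_features(signal, order=6):
            """Least-squares AR(order) fit of a 1-D numpy signal; the coefficient
            vector serves as the feature vector for one trial/channel."""
            X = np.column_stack([signal[order - k: len(signal) - k]
                                 for k in range(1, order + 1)])
            y = signal[order:]
            coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coeffs

        class MahalanobisClassifier:
            """Assign each feature vector to the class whose mean is closest
            in Mahalanobis distance under a pooled covariance estimate."""
            def fit(self, X, y):
                self.classes_ = np.unique(y)
                self.means_ = {c: X[y == c].mean(axis=0) for c in self.classes_}
                cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
                self.inv_cov_ = np.linalg.inv(cov)
                return self

            def predict(self, X):
                def d2(x, c):
                    diff = x - self.means_[c]
                    return float(diff @ self.inv_cov_ @ diff)
                return np.array([min(self.classes_, key=lambda c: d2(x, c)) for x in X])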

  18. The effective use of virtualization for selection of data centers in a cloud computing environment

    Science.gov (United States)

    Kumar, B. Santhosh; Parthiban, Latha

    2018-04-01

    Data centers are places which consist of networks of remote servers to store, access and process data. Cloud computing is a technology where users worldwide submit their tasks and service providers direct the requests to the data centers that are responsible for executing the tasks. The servers in the data centers need to employ the virtualization concept so that multiple tasks can be executed simultaneously. In this paper we propose an algorithm for data center selection based on the energy of the virtual machines created on each server. The virtualization energy of each server is calculated, and the total energy of the data center is obtained by summing the individual server energies. Submitted tasks are routed to the data center with the least energy consumption, which minimizes the operational expenses of a service provider.
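
    The selection rule itself is simple; a minimal sketch follows (the dictionary layout and per-VM energy values are placeholders for whatever estimation model the provider uses):

        def total_energy(data_center):
            """Sum the energy of all virtual machines over all servers."""
            return sum(sum(vm["energy"] for vm in server["vms"])
                       for server in data_center["servers"])

        def select_data_center(data_centers):
            """Route submitted tasks to the data center with the lowest
            current virtualization energy."""
            return min(data_centers, key=total_energy)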

  19. Numerical validation of selected computer programs in nonlinear analysis of steel frame exposed to fire

    Science.gov (United States)

    Maślak, Mariusz; Pazdanowski, Michał; Woźniczka, Piotr

    2018-01-01

    Validation of fire resistance for the same steel frame bearing structure is performed here using three different numerical models: a bar model prepared in the SAFIR environment, and two 3D models, one developed within the framework of Autodesk Simulation Mechanical (ASM) and an alternative one developed in the environment of the Abaqus code. The results of the computer simulations performed are compared with the experimental results obtained previously, in a laboratory fire test, on a structure having the same characteristics and subjected to the same heating regimen. Comparison of the experimental and numerically determined displacement evolution paths for selected nodes of the considered frame during the simulated fire exposure constitutes the basic criterion applied to evaluate the validity of the numerical results obtained. The experimental and numerically determined estimates of the critical temperature specific to the considered frame, related to the limit state of bearing capacity in fire, have been verified as well.

  20. Selectivity in ligand binding to uranyl compounds: A synthetic, structural, thermodynamic and computational study

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, John [Univ. of California, Berkeley, CA (United States)

    2015-01-21

    The uranyl cation (UO₂²⁺) is the most abundant form of uranium on the planet. It is estimated that 4.5 billion tons of uranium in this form exist in sea water. The ability to bind and extract the uranyl cation from aqueous solution while separating it from other elements would provide a limitless source of nuclear fuel. A large body of research concerns the selective recognition and extraction of uranyl. A stable molecule, the cation has a linear O=U=O geometry. The short U-O bonds (1.78 Å) arise from the combination of uranium 5f/6d and oxygen 2p orbitals. Due to the oxygen moieties being multiply bonded, these sites were not thought to be basic enough for Lewis acidic coordination to be a viable approach to sequestration. The goal of this research is thus to broaden the coordination chemistry of the uranyl ion by studying new ligand systems via synthetic, structural, thermodynamic and computational methods. It is anticipated that this fundamental science will find use beyond actinide separation technologies in areas such as nuclear waste remediation and nuclear materials. The focus of this study is to synthesize uranyl complexes incorporating amidinate and guanidinate ligands. Both synthetic and computational methods are used to investigate novel equatorial ligand coordination and how this affects the basicity of the oxo ligands. Such an understanding will later apply to designing ligands incorporating functionalities that can bind uranyl both equatorially and axially for highly selective sequestration. Efficient and durable chromatography supports for lanthanide separation will be generated by (1) identifying robust peptoid-based ligands capable of binding different lanthanides with variable affinities, and (2) developing practical synthetic methods for the attachment of these ligands to Dowex ion exchange resins.

  1. Selectivity in ligand binding to uranyl compounds: A synthetic, structural, thermodynamic and computational study

    International Nuclear Information System (INIS)

    Arnold, John

    2015-01-01

    The uranyl cation (UO₂²⁺) is the most abundant form of uranium on the planet. It is estimated that 4.5 billion tons of uranium in this form exist in sea water. The ability to bind and extract the uranyl cation from aqueous solution while separating it from other elements would provide a limitless source of nuclear fuel. A large body of research concerns the selective recognition and extraction of uranyl. A stable molecule, the cation has a linear O=U=O geometry. The short U-O bonds (1.78 Å) arise from the combination of uranium 5f/6d and oxygen 2p orbitals. Due to the oxygen moieties being multiply bonded, these sites were not thought to be basic enough for Lewis acidic coordination to be a viable approach to sequestration. The goal of this research is thus to broaden the coordination chemistry of the uranyl ion by studying new ligand systems via synthetic, structural, thermodynamic and computational methods. It is anticipated that this fundamental science will find use beyond actinide separation technologies in areas such as nuclear waste remediation and nuclear materials. The focus of this study is to synthesize uranyl complexes incorporating amidinate and guanidinate ligands. Both synthetic and computational methods are used to investigate novel equatorial ligand coordination and how this affects the basicity of the oxo ligands. Such an understanding will later apply to designing ligands incorporating functionalities that can bind uranyl both equatorially and axially for highly selective sequestration. Efficient and durable chromatography supports for lanthanide separation will be generated by (1) identifying robust peptoid-based ligands capable of binding different lanthanides with variable affinities, and (2) developing practical synthetic methods for the attachment of these ligands to Dowex ion exchange resins.

  2. Automated selection of brain regions for real-time fMRI brain-computer interfaces

    Science.gov (United States)

    Lührs, Michael; Sorger, Bettina; Goebel, Rainer; Esposito, Fabrizio

    2017-02-01

    Objective. Brain-computer interfaces (BCIs) implemented with real-time functional magnetic resonance imaging (rt-fMRI) use fMRI time-courses from predefined regions of interest (ROIs). To reach best performances, localizer experiments and on-site expert supervision are required for ROI definition. To automate this step, we developed two unsupervised computational techniques based on the general linear model (GLM) and independent component analysis (ICA) of rt-fMRI data, and compared their performances on a communication BCI. Approach. 3 T fMRI data of six volunteers were re-analyzed in simulated real-time. During a localizer run, participants performed three mental tasks following visual cues. During two communication runs, a letter-spelling display guided the subjects to freely encode letters by performing one of the mental tasks with a specific timing. GLM- and ICA-based procedures were used to decode each letter, respectively using compact ROIs and whole-brain distributed spatio-temporal patterns of fMRI activity, automatically defined from subject-specific or group-level maps. Main results. Letter-decoding performances were comparable to supervised methods. In combination with a similarity-based criterion, GLM- and ICA-based approaches successfully decoded more than 80% (average) of the letters. Subject-specific maps yielded optimal performances. Significance. Automated solutions for ROI selection may help accelerating the translation of rt-fMRI BCIs from research to clinical applications.

  3. Student Perceived Importance and Correlations of Selected Computer Literacy Course Topics

    Science.gov (United States)

    Ciampa, Mark

    2013-01-01

    Traditional college-level courses designed to teach computer literacy are in a state of flux. Today's students have high rates of access to computing technology and computer ownership, leading many policy decision makers to conclude that students already are computer literate and thus computer literacy courses are dinosaurs in a modern digital…

  4. Selective laser melting of Al-12Si

    OpenAIRE

    Prashanth, Konda Gokuldoss

    2014-01-01

    Selective laser melting (SLM) is a powder-based additive manufacturing technique consisting of the exact reproduction of a three dimensional computer model (generally a computer-aided design (CAD) file or a computed tomography (CT) scan) through an additive layer-by-layer strategy. Because of the high degree of freedom offered by the additive manufacturing, parts having almost any possible geometry can be produced by SLM. More specifically, with this process it is possible to build parts with ext...

  5. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Science.gov (United States)

    De Raedt, Hans; Michielsen, Kristel; Hess, Karl

    2017-11-01

    Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other "post-selection" is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell's theorem which states that this is impossible. The failure of Bell's theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.

  6. Computational Selection of Inhibitors of A-beta Aggregation and Neuronal Toxicity

    Science.gov (United States)

    Chen, Deliang; Martin, Zane S.; Soto, Claudio; Schein, Catherine H.

    2009-01-01

    Alzheimer’s Disease (AD) is characterized by the cerebral accumulation of misfolded and aggregated amyloid-β protein (Aβ). Disease symptoms can be alleviated, in vitro and in vivo, by “β-sheet breaker” pentapeptides that reduce plaque volume. However, the peptide nature of these compounds makes them biologically unstable and unable to penetrate membranes with high efficiency. The main goal of this study was to use computational methods to identify small molecule mimetics with better drug-like properties. For this purpose, the docked conformations of the active peptides were used to identify compounds with similar activities. A series of related β-sheet breaker peptides were docked to solid state NMR structures of a fibrillar form of Aβ. The lowest energy conformations of the active peptides were used to design three dimensional (3D)-pharmacophores, suitable for screening the NCI database with Unity. Small molecular weight compounds with physicochemical features in a conformation similar to the active peptides were selected, ranked by docking solubility parameters. Of 16 diverse compounds selected for experimental screening, 2 prevented and reversed Aβ aggregation at 2–3 μM concentration, as measured by Thioflavin T (ThT) fluorescence and ELISA assays. They also prevented the toxic effects of aggregated Aβ on neuroblastoma cells. Their low molecular weight and aqueous solubility make them promising lead compounds for treating AD. PMID:19540126

  7. An independent brain-computer interface using covert non-spatial visual selective attention

    Science.gov (United States)

    Zhang, Dan; Maye, Alexander; Gao, Xiaorong; Hong, Bo; Engel, Andreas K.; Gao, Shangkai

    2010-02-01

    In this paper, a novel independent brain-computer interface (BCI) system based on covert non-spatial visual selective attention of two superimposed illusory surfaces is described. Perception of two superimposed surfaces was induced by two sets of dots with different colors rotating in opposite directions. The surfaces flickered at different frequencies and elicited distinguishable steady-state visual evoked potentials (SSVEPs) over parietal and occipital areas of the brain. By selectively attending to one of the two surfaces, the SSVEP amplitude at the corresponding frequency was enhanced. An online BCI system utilizing the attentional modulation of SSVEP was implemented and a 3-day online training program with healthy subjects was carried out. The study was conducted with Chinese subjects at Tsinghua University, and German subjects at University Medical Center Hamburg-Eppendorf (UKE) using identical stimulation software and equivalent technical setup. A general improvement of control accuracy with training was observed in 8 out of 18 subjects. An averaged online classification accuracy of 72.6 ± 16.1% was achieved on the last training day. The system renders SSVEP-based BCI paradigms possible for paralyzed patients with substantial head or ocular motor impairments by employing covert attention shifts instead of changing gaze direction.

  8. New evaluation methods for conceptual design selection using computational intelligence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Hong Zhong; Liu, Yu; Li, Yanfeng; Wang, Zhonglai [University of Electronic Science and Technology of China, Chengdu (China); Xue, Lihua [Higher Education Press, Beijing (China)

    2013-03-15

    The conceptual design selection, which aims at choosing the best or most desirable design scheme among several candidates for the subsequent detailed design stage, oftentimes requires a set of tools to conduct design evaluation. Using computational intelligence techniques, such as fuzzy logic, neural network, genetic algorithm, and physical programming, several design evaluation methods are put forth in this paper to realize the conceptual design selection under different scenarios. Depending on whether an evaluation criterion can be quantified or not, the linear physical programming (LPP) model and the RAOGA-based fuzzy neural network (FNN) model can be utilized to evaluate design alternatives in conceptual design stage. Furthermore, on the basis of Vanegas and Labib's work, a multi-level conceptual design evaluation model based on the new fuzzy weighted average (NFWA) and the fuzzy compromise decision-making method is developed to solve the design evaluation problem consisting of many hierarchical criteria. The effectiveness of the proposed methods is demonstrated via several illustrative examples.

  9. New evaluation methods for conceptual design selection using computational intelligence techniques

    International Nuclear Information System (INIS)

    Huang, Hong Zhong; Liu, Yu; Li, Yanfeng; Wang, Zhonglai; Xue, Lihua

    2013-01-01

    The conceptual design selection, which aims at choosing the best or most desirable design scheme among several candidates for the subsequent detailed design stage, oftentimes requires a set of tools to conduct design evaluation. Using computational intelligence techniques, such as fuzzy logic, neural network, genetic algorithm, and physical programming, several design evaluation methods are put forth in this paper to realize the conceptual design selection under different scenarios. Depending on whether an evaluation criterion can be quantified or not, the linear physical programming (LPP) model and the RAOGA-based fuzzy neural network (FNN) model can be utilized to evaluate design alternatives in conceptual design stage. Furthermore, on the basis of Vanegas and Labib's work, a multi-level conceptual design evaluation model based on the new fuzzy weighted average (NFWA) and the fuzzy compromise decision-making method is developed to solve the design evaluation problem consisting of many hierarchical criteria. The effectiveness of the proposed methods is demonstrated via several illustrative examples.

  10. The Application of Computer-Aided Discovery to Spacecraft Site Selection

    Science.gov (United States)

    Pankratius, V.; Blair, D. M.; Gowanlock, M.; Herring, T.

    2015-12-01

    The selection of landing and exploration sites for interplanetary robotic or human missions is a complex task. Historically it has been labor-intensive, with large groups of scientists manually interpreting a planetary surface across a variety of datasets to identify potential sites based on science and engineering constraints. This search process can be lengthy, and excellent sites may get overlooked when the aggregate value of site selection criteria is non-obvious or non-intuitive. As planetary data collection leads to Big Data repositories and a growing set of selection criteria, scientists will face a combinatorial search space explosion that requires scalable, automated assistance. We are currently exploring more general computer-aided discovery techniques in the context of planetary surface deformation phenomena that can lend themselves to application in the landing site search problem. In particular, we are developing a general software framework that addresses key difficulties: characterizing a given phenomenon or site based on data gathered from multiple instruments (e.g. radar interferometry, gravity, thermal maps, or GPS time series), and examining a variety of possible workflows whose individual configurations are optimized to isolate different features. The framework allows algorithmic pipelines and hypothesized models to be perturbed or permuted automatically within well-defined bounds established by the scientist. For example, even simple choices for outlier and noise handling or data interpolation can drastically affect the detectability of certain features. These techniques aim to automate repetitive tasks that scientists routinely perform in exploratory analysis, and make them more efficient and scalable by executing them in parallel in the cloud. We also explore ways in which machine learning can be combined with human feedback to prune the search space and converge to desirable results. Acknowledgements: We acknowledge support from NASA AIST

  11. File sharing

    NARCIS (Netherlands)

    van Eijk, N.

    2011-01-01

    ‘File sharing’ has become generally accepted on the Internet. Users share files for downloading music, films, games, software etc. In this note, we have a closer look at the definition of file sharing, the legal and policy-based context as well as enforcement issues. The economic and cultural

  12. Computed micro-tomographic evaluation of glide path with nickel-titanium rotary PathFile in maxillary first molars curved canals.

    Science.gov (United States)

    Pasqualini, Damiano; Bianchi, Caterina Chiara; Paolino, Davide Salvatore; Mancini, Lucia; Cemenasco, Andrea; Cantatore, Giuseppe; Castellucci, Arnaldo; Berutti, Elio

    2012-03-01

    X-ray computed micro-tomography scanning allows high-resolution 3-dimensional imaging of small objects. In this study, micro-CT scanning was used to compare the ability of manual and mechanical glide path to maintain the original root canal anatomy. Eight extracted upper first permanent molars were scanned at the TOMOLAB station at ELETTRA Synchrotron Light Laboratory in Trieste, Italy, with a microfocus cone-beam geometry system. A total of 2,400 projections on 360° have been acquired at 100 kV and 80 μA, with a focal spot size of 8 μm. Buccal root canals of each specimen (n = 16) were randomly assigned to PathFile (P) or stainless-steel K-file (K) to perform glide path at the full working length. Specimens were then microscanned at the apical level (A) and at the point of the maximum curvature level (C) for post-treatment analyses. Curvatures of root canals were classified as moderate (≤35°) or severe (≥40°). The ratio of diameter ratios (RDRs) and the ratio of cross-sectional areas (RAs) were assessed. For each level of analysis (A and C), 2 balanced 2-way factorial analyses of variance (P < .05) were performed to evaluate the significance of the instrument factor and of canal curvature factor as well as the interactions of the factors both with RDRs and RAs. Specimens in the K group had a mean curvature of 35.4° ± 11.5°; those in the P group had a curvature of 38° ± 9.9°. The instrument factor (P and K) was extremely significant (P < .001) for both the RDR and RA parameters, regardless of the point of analysis. Micro-CT scanning confirmed that NiTi rotary PathFile instruments preserve the original canal anatomy and cause less canal aberrations. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  13. Portable File Format (PFF) specifications

    Energy Technology Data Exchange (ETDEWEB)

    Dolan, Daniel H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Created at Sandia National Laboratories, the Portable File Format (PFF) allows binary data transfer across computer platforms. Although this capability is supported by many other formats, PFF files are still in use at Sandia, particularly in pulsed power research. This report provides detailed PFF specifications for accessing data without relying on legacy code.

  14. Selection of meteorological parameters affecting rainfall estimation using neuro-fuzzy computing methodology

    Science.gov (United States)

    Hashim, Roslan; Roy, Chandrabhushan; Motamedi, Shervin; Shamshirband, Shahaboddin; Petković, Dalibor; Gocic, Milan; Lee, Siew Cheng

    2016-05-01

    Rainfall is a complex atmospheric process that varies over time and space. Researchers have used various empirical and numerical methods to enhance estimation of rainfall intensity. We developed a novel prediction model in this study, with the emphasis on accuracy to identify the most significant meteorological parameters having effect on rainfall. For this, we used five input parameters: wet day frequency (dwet), vapor pressure (e̅a), and maximum and minimum air temperatures (Tmax and Tmin) as well as cloud cover (cc). The data were obtained from the Indian Meteorological Department for the Patna city, Bihar, India. Further, a type of soft-computing method, known as the adaptive-neuro-fuzzy inference system (ANFIS), was applied to the available data. In this respect, the observation data from 1901 to 2000 were employed for testing, validating, and estimating monthly rainfall via the simulated model. In addition, the ANFIS process for variable selection was implemented to detect the predominant variables affecting the rainfall prediction. Finally, the performance of the model was compared to other soft-computing approaches, including the artificial neural network (ANN), support vector machine (SVM), extreme learning machine (ELM), and genetic programming (GP). The results revealed that ANN, ELM, ANFIS, SVM, and GP had R2 of 0.9531, 0.9572, 0.9764, 0.9525, and 0.9526, respectively. Therefore, we conclude that the ANFIS is the best method among all to predict monthly rainfall. Moreover, dwet was found to be the most influential parameter for rainfall prediction, and the best predictor of accuracy. This study also identified sets of two and three meteorological parameters that show the best predictions.
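    As an illustration of the variable-selection step (using synthetic data and a plain linear regressor standing in for ANFIS, so all numbers below are purely hypothetical), one can score small subsets of the five predictors by the R2 each achieves on held-out data:

        from itertools import combinations
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import r2_score
        from sklearn.model_selection import train_test_split

        names = ["dwet", "ea", "Tmax", "Tmin", "cc"]
        rng = np.random.default_rng(0)
        X = rng.normal(size=(1200, 5))                      # synthetic monthly records
        y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=1200)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        scores = {}
        for k in (1, 2, 3):
            for subset in combinations(range(5), k):
                cols = list(subset)
                model = LinearRegression().fit(X_tr[:, cols], y_tr)
                scores[tuple(names[i] for i in cols)] = r2_score(
                    y_te, model.predict(X_te[:, cols]))

        # report the best-scoring predictor subsets
        for subset, r2 in sorted(scores.items(), key=lambda kv: -kv[1])[:3]:
            print(subset, round(r2, 3))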

  15. Efficient File Sharing by Multicast - P2P Protocol Using Network Coding and Rank Based Peer Selection

    Science.gov (United States)

    Stoenescu, Tudor M.; Woo, Simon S.

    2009-01-01

    In this work, we consider information dissemination and sharing in a distributed peer-to-peer (P2P) highly dynamic communication network. In particular, we explore a network coding technique for transmission and a rank based peer selection method for network formation. The combined approach has been shown to improve information sharing and delivery to all users when considering the challenges imposed by the space network environments.
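    The rank-based peer selection idea can be sketched in a few lines (the ranking metric below is a placeholder, not the one proposed in the paper): score each known peer and connect to the k best.

        import heapq

        def select_peers(peers, k=4):
            # peers: list of dicts such as {"id": "A", "bandwidth": 1.5, "uptime": 0.9}
            # placeholder rank: reward high upload bandwidth and high availability
            return heapq.nlargest(k, peers, key=lambda p: p["bandwidth"] * p["uptime"])

        swarm = [
            {"id": "A", "bandwidth": 1.5, "uptime": 0.90},
            {"id": "B", "bandwidth": 0.8, "uptime": 0.99},
            {"id": "C", "bandwidth": 2.4, "uptime": 0.60},
        ]
        print([p["id"] for p in select_peers(swarm, k=2)])   # -> ['C', 'A']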

  16. CINDA 99, supplement 2 to CINDA 97 (1988-1999). The index to literature and computer files on microscopic neutron data

    International Nuclear Information System (INIS)

    1999-01-01

    CINDA, the Computer Index of Neutron Data, contains bibliographical references to measurements, calculations, reviews and evaluations of neutron cross-sections and other microscopic neutron data; it includes also index references to computer libraries of numerical neutron data available from four regional neutron data centres. The present issue, CINDA 99, is the second supplement to CINDA 97, the index to the literature on neutron data published after 1987. It supersedes the first supplement, CINDA 98. The complete CINDA file as of 1 June 1999 is contained in: the archival issue CINDA-A (5 volumes, 1990), CINDA 97 and the current issue CINDA 99. The compilation and publication of CINDA are the result of worldwide co-operation involving the following four data centres. Each centre is responsible for compiling the CINDA entries from the literature published in a defined geographical area given in brackets below: the USA National Nuclear Data Center at the Brookhaven National Laboratory, USA (United States of America and Canada); the Russian Nuclear Data Centre at the Fiziko-Energeticheskij Institut, Obninsk, Russian Federation (former USSR countries); the NEA Data Bank in Paris, France (European OECD member countries in Western Europe and Japan); and the IAEA Nuclear Data Section in Vienna, Austria (all other countries in Eastern Europe, Asia, Australia, Africa, Central and South America; also IAEA publications and translation journals). Besides the published CINDA books, up-to-date computer retrievals for specified CINDA information are currently available on request from the responsible CINDA centres, or via direct access to the on-line services as described in this publication

  17. Temperature increases on the external root surface during endodontic treatment using single file systems.

    Science.gov (United States)

    Özkocak, I; Taşkan, M M; Göktürk, H; Aytac, F; Karaarslan, E Şirin

    2015-01-01

    The aim of this study is to evaluate increases in temperature on the external root surface during endodontic treatment with different rotary systems. Fifty human mandibular incisors with a single root canal were selected. All root canals were instrumented using a size 20 Hedstrom file, and the canals were irrigated with 5% sodium hypochlorite solution. The samples were randomly divided into the following three groups of 15 teeth: Group 1: The OneShape Endodontic File no.: 25; Group 2: The Reciproc Endodontic File no.: 25; Group 3: The WaveOne Endodontic File no.: 25. During the preparation, the temperature changes were measured in the middle third of the roots using a noncontact infrared thermometer. The temperature data were transferred from the thermometer to the computer and were observed graphically. Statistical analysis was performed using the Kruskal-Wallis analysis of variance at a significance level of 0.05. The increases in temperature caused by the OneShape file system were lower than those of the other files (P < 0.05), which produced the highest temperature increases; however, there were no significant differences between the Reciproc and WaveOne files. The single file rotary systems used in this study may be recommended for clinical use.

  18. The photon identification loophole in EPRB experiments: computer models with single-wing selection

    Directory of Open Access Journals (Sweden)

    De Raedt Hans

    2017-11-01

    Full Text Available Recent Einstein-Podolsky-Rosen-Bohm experiments [M. Giustina et al. Phys. Rev. Lett. 115, 250401 (2015); L. K. Shalm et al. Phys. Rev. Lett. 115, 250402 (2015)] that claim to be loophole free are scrutinized. The combination of a digital computer and discrete-event simulation is used to construct a minimal but faithful model of the most perfected realization of these laboratory experiments. In contrast to prior simulations, all photon selections are strictly made, as they are in the actual experiments, at the local station and no other “post-selection” is involved. The simulation results demonstrate that a manifestly non-quantum model that identifies photons in the same local manner as in these experiments can produce correlations that are in excellent agreement with those of the quantum theoretical description of the corresponding thought experiment, in conflict with Bell’s theorem which states that this is impossible. The failure of Bell’s theorem is possible because of our recognition of the photon identification loophole. Such identification measurement-procedures are necessarily included in all actual experiments but are not included in the theory of Bell and his followers.

  19. Achieving a hybrid brain-computer interface with tactile selective attention and motor imagery

    Science.gov (United States)

    Ahn, Sangtae; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan

    2014-12-01

    Objective. We propose a new hybrid brain-computer interface (BCI) system that integrates two different EEG tasks: tactile selective attention (TSA) using a vibro-tactile stimulator on the left/right finger and motor imagery (MI) of left/right hand movement. Event-related desynchronization (ERD) from the MI task and steady-state somatosensory evoked potential (SSSEP) from the TSA task are retrieved and combined into two hybrid senses. Approach. One hybrid approach is to measure two tasks simultaneously; the features of each task are combined for testing. Another hybrid approach is to measure two tasks consecutively (TSA first and MI next) using only MI features. For comparison with the hybrid approaches, the TSA and MI tasks are measured independently. Main results. Using a total of 16 subject datasets, we analyzed the BCI classification performance for MI, TSA and two hybrid approaches in a comparative manner; we found that the consecutive hybrid approach outperformed the others, yielding about a 10% improvement in classification accuracy relative to MI alone. It is understood that TSA may play a crucial role as a prestimulus in that it helps to generate earlier ERD prior to MI and thus sustains ERD longer and to a stronger degree; this ERD may give more discriminative information than ERD in MI alone. Significance. Overall, our proposed consecutive hybrid approach is very promising for the development of advanced BCI systems.

  20. Introduction of helical computed tomography affects patient selection for V/Q lung scan

    International Nuclear Information System (INIS)

    Zettinig, G.; Baudrexel, S.; Leitha, Th.

    2002-01-01

    Aim: Retrospective analysis for determination of the effect of helical computed tomography (HCT) on utilization of V/Q lung scanning to diagnose pulmonary embolism (PE) in a large general hospital. Methods: A total number of 2676 V/Q scans of in- and out-patients referred to our department between March 1992 and December 1998 and between April 1997 and December 1998 were analyzed by an identical group of nuclear physicians. Results: Neither the total number of annually performed V/Q scans (446 ± 135) nor the mean age of patients (56 years ± 17) changed significantly since the introduction of HCT. However, the referral pattern was different. The percentage of patients with high and intermediate probability for PE decreased significantly from 15.2% to 9.4% (p <0.01) and from 10.2% to 7.3% (p <0.05), respectively. Low probability scans significantly increased from 37.8% to 42.7% (p <0.05). The percentage of normal scans did not change significantly, however, there was a highly significant increase summarizing patients with normal and low probability scans (74.6% to 83.3%; p <0.01). Conclusion: The introduction of HCT affected the selection of patients referred for V/Q lung scanning since V/Q scanning was primarily used to exclude rather than to confirm PE. (orig.)

  1. Variable Selection for Nonparametric Gaussian Process Priors: Models and Computational Strategies.

    Science.gov (United States)

    Savitsky, Terrance; Vannucci, Marina; Sha, Naijun

    2011-02-01

    This paper presents a unified treatment of Gaussian process models that extends to data from the exponential dispersion family and to survival data. Our specific interest is in the analysis of data sets with predictors that have an a priori unknown form of possibly nonlinear associations to the response. The modeling approach we describe incorporates Gaussian processes in a generalized linear model framework to obtain a class of nonparametric regression models where the covariance matrix depends on the predictors. We consider, in particular, continuous, categorical and count responses. We also look into models that account for survival outcomes. We explore alternative covariance formulations for the Gaussian process prior and demonstrate the flexibility of the construction. Next, we focus on the important problem of selecting variables from the set of possible predictors and describe a general framework that employs mixture priors. We compare alternative MCMC strategies for posterior inference and achieve a computationally efficient and practical approach. We demonstrate performances on simulated and benchmark data sets.
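    The flavor of the variable-selection problem can be illustrated with a much simpler surrogate than the mixture-prior MCMC approach described above: fit a Gaussian process with one automatic-relevance-determination (ARD) length-scale per predictor and read off which predictors the fitted kernel treats as relevant (synthetic data; scikit-learn assumed available):

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)
        X = rng.uniform(-1, 1, size=(200, 4))
        y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=200)

        # one length-scale per predictor (ARD)
        kernel = RBF(length_scale=np.ones(4)) + WhiteKernel(noise_level=0.1)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
        # small fitted length-scales mark influential predictors (here x0 and x1)
        print(gp.kernel_.k1.length_scale)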

  2. Computer simulation of the relationship between selected properties of laser remelted tool steel surface layer

    Energy Technology Data Exchange (ETDEWEB)

    Bonek, Mirosław, E-mail: miroslaw.bonek@polsl.pl; Śliwa, Agata; Mikuła, Jarosław

    2016-12-01

    Highlights: • Prediction of the properties of the laser remelted surface layer with the use of FEM analysis. • The simulation was applied to determine the shape of the molten pool of the remelted surface. • Applying the numerical FEM model to simulate laser surface treatment meaningfully shortens the time needed to select optimum process parameters. • An FEM model was established for the purpose of building a computer simulation. - Abstract: Investigations include a Finite Element Method simulation model of remelting of the PMHSS6-5-3 high-speed steel surface layer using the high power diode laser (HPDL). The Finite Element Method computations were performed using ANSYS software. The scope of the FEM simulation was the determination of the temperature distribution during the laser alloying process at various process configurations regarding the laser beam power and the method of powder deposition, as a pre-coated paste or a surface with machined grooves. The Finite Element Method simulation was performed on five different 3-dimensional models. The model assumed nonlinear changes of thermal conductivity, specific heat and density that depended on temperature. The heating process was realized as a heat flux corresponding to laser beam powers of 1.4, 1.7 and 2.1 kW. Latent heat effects are considered during solidification. The molten pool is composed of the same material as the substrate and there is no chemical reaction. The absorptivity of laser energy was dependent on the simulated material's properties and its surface condition. The Finite Element Method simulation allows specifying the heat affected zone and the temperature distribution in the sample as a function of time and thus allows the estimation of the structural changes taking place during the laser remelting process. The simulation was applied to determine the shape of the molten pool and the
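    The thermal core of such a simulation can be caricatured with a one-dimensional explicit finite-difference model (a drastic simplification of the 3-D ANSYS FEM model in the paper; the material values and the absorbed flux below are rough assumptions):

        import numpy as np

        # 1-D stand-in: a constant surface heat flux (laser) heats a steel slab from the top
        k, rho, cp = 24.0, 7800.0, 460.0        # W/mK, kg/m3, J/kgK (rough steel values)
        alpha = k / (rho * cp)
        L, nx = 0.01, 101                       # 10 mm deep, 101 nodes
        dx = L / (nx - 1)
        dt = 0.4 * dx * dx / alpha              # below the explicit stability limit
        q = 2.0e7                               # absorbed flux, W/m2 (assumed)

        T = np.full(nx, 293.0)                  # initial temperature, K
        for _ in range(int(0.5 / dt)):          # 0.5 s of heating
            Tn = T.copy()
            T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
            T[0] = T[1] + q * dx / k            # surface node: -k dT/dx = q
            T[-1] = T[-2]                       # bottom: insulated
        print(T.max())                          # rough peak surface temperature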

  3. Canal transportation and centering ability of protaper and self-adjusting file system in long oval canals: An ex-vivo cone-beam computed tomography analysis.

    Science.gov (United States)

    Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak

    2017-01-01

    The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups (n = 31) according to the systems used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and t-test. The SAF showed better centering ability and less canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had statistically significantly better centering and less canal transportation in the buccolingual as compared to the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of the two systems was comparable.

  4. File-System Workload on a Scientific Multiprocessor

    Science.gov (United States)

    Kotz, David; Nieuwejaar, Nils

    1995-01-01

    Many scientific applications have intense computational and I/O requirements. Although multiprocessors have permitted astounding increases in computational performance, the formidable I/O needs of these applications cannot be met by current multiprocessors and their I/O subsystems. To prevent I/O subsystems from forever bottlenecking multiprocessors and limiting the range of feasible applications, new I/O subsystems must be designed. The successful design of computer systems (both hardware and software) depends on a thorough understanding of their intended use. A system designer optimizes the policies and mechanisms for the cases expected to be most common in the user's workload. In the case of multiprocessor file systems, however, designers have been forced to build file systems based only on speculation about how they would be used, extrapolating from file-system characterizations of general-purpose workloads on uniprocessor and distributed systems or scientific workloads on vector supercomputers (see sidebar on related work). To help these system designers, in June 1993 we began the Charisma Project, so named because the project sought to characterize I/O in scientific multiprocessor applications from a variety of production parallel computing platforms and sites. The Charisma project is unique in recording individual read and write requests in live, multiprogramming, parallel workloads (rather than from selected or nonparallel applications). In this article, we present the first results from the project: a characterization of the file-system workload on an iPSC/860 multiprocessor running production, parallel scientific applications at NASA's Ames Research Center.

  5. Applied Computational Intelligence in Engineering and Information Technology Revised and Selected Papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics SACI 2011

    CERN Document Server

    Precup, Radu-Emil; Preitl, Stefan

    2012-01-01

    This book highlights the potential of getting benefits from various applications of computational intelligence techniques. The present book is structured to include a set of selected and extended papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics SACI 2011, held in Timisoara, Romania, from 19 to 21 May 2011. After a serious paper review performed by the Technical Program Committee, only 116 submissions were accepted, leading to a paper acceptance ratio of 65 %. A further refinement was made after the symposium, based also on the assessment of the presentation quality. Concluding, this book includes the extended and revised versions of the very best papers of SACI 2011 and a few invited papers authored by prominent specialists. The readers will benefit from gaining knowledge of computational intelligence and of what problems can be solved in several areas; they will learn what kinds of approaches are advisable for solving these problems. A...

  6. ChemEngine: harvesting 3D chemical structures of supplementary data from PDF files.

    Science.gov (United States)

    Karthikeyan, Muthukumarasamy; Vyas, Renu

    2016-01-01

    Digital access to chemical journals resulted in a vast array of molecular information that is now available in the supplementary material files in PDF format. However, extracting this molecular information, generally from a PDF document format, is a daunting task. Here we present an approach to harvest 3D molecular data from the supporting information of scientific research articles that are normally available from publisher's resources. In order to demonstrate the feasibility of extracting truly computable molecules from PDF file formats in a fast and efficient manner, we have developed a Java based application, namely ChemEngine. This program recognizes textual patterns from the supplementary data and generates standard molecular structure data (bond matrix, atomic coordinates) that can be subjected to a multitude of computational processes automatically. The methodology has been demonstrated via several case studies on different formats of coordinates data stored in supplementary information files, wherein ChemEngine selectively harvested the atomic coordinates and interpreted them as molecules with high accuracy. The reusability of extracted molecular coordinate data was demonstrated by computing Single Point Energies that were in close agreement with the original computed data provided with the articles. It is envisaged that the methodology will enable large scale conversion of molecular information from supplementary files available in the PDF format into a collection of ready-to-compute molecular data to create an automated workflow for advanced computational processes. Software along with source codes and instructions available at https://sourceforge.net/projects/chemengine/files/?source=navbar.
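    ChemEngine itself is a Java application, but the core idea of recognizing textual coordinate patterns can be sketched in a few lines of Python (the regular expression and the sample text below are illustrative only, not the tool's actual parser):

        import re

        COORD = re.compile(
            r"^\s*([A-Z][a-z]?)\s+(-?\d+\.\d+)\s+(-?\d+\.\d+)\s+(-?\d+\.\d+)\s*$")

        def harvest_xyz(text):
            # pull element/x/y/z rows out of free text copied from a PDF page
            atoms = []
            for line in text.splitlines():
                m = COORD.match(line)
                if m:
                    el, x, y, z = m.groups()
                    atoms.append((el, float(x), float(y), float(z)))
            return atoms

        sample = ("Optimized geometry (Angstrom)\n"
                  "O 0.000000 0.000000 0.117300\n"
                  "H 0.000000 0.757200 -0.469200\n"
                  "H 0.000000 -0.757200 -0.469200")
        print(harvest_xyz(sample))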

  7. Selectivity in Ligand Binding to Uranyl Compounds: A Synthetic, Structural, Thermodynamic and Computational Study

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, John [Univ. of California, Berkeley, CA (United States)

    2017-12-06

    The uranyl cation (UO22+) is the most abundant form of uranium on the planet. It is estimated that 4.5 billion tons of uranium in this form exist in sea water. The ability to bind and extract the uranyl cation from aqueous solution while separating it from other elements would provide a limitless source of nuclear fuel. A large body of research concerns the selective recognition and extraction of uranyl. A stable molecule, the cation has a linear O=U=O geometry. The short U-O bonds (1.78 Å) arise from the combination of uranium 5f/6d and oxygen 2p orbitals. Due to the oxygen moieties being multiply bonded, these sites were not thought to be basic enough for Lewis acidic coordination to be a viable approach to sequestration. We believe that the goal of developing a practical system for uranium separation from seawater will not be attained without new insights into our existing fundamental knowledge of actinide chemistry. We posit that detailed studies of the kinetic and thermodynamic factors that influence interactions between f-elements and ligands with a range of donor atoms is essential to any major advance in this important area. The goal of this research is thus to broaden the coordination chemistry of the uranyl ion by studying new ligand systems via synthetic, structural, thermodynamic and computational methods. We anticipate that this fundamental science will find use beyond actinide separation technologies in areas such as nuclear waste remediation and nuclear materials.

  8. An observer study comparing spot imaging regions selected by radiologists and a computer for an automated stereo spot mammography technique

    International Nuclear Information System (INIS)

    Goodsitt, Mitchell M.; Chan, Heang-Ping; Lydick, Justin T.; Gandra, Chaitanya R.; Chen, Nelson G.; Helvie, Mark A.; Bailey, Janet E.; Roubidoux, Marilyn A.; Paramagul, Chintana; Blane, Caroline E.; Sahiner, Berkman; Petrick, Nicholas A.

    2004-01-01

    We are developing an automated stereo spot mammography technique for improved imaging of suspicious dense regions within digital mammograms. The technique entails the acquisition of a full-field digital mammogram, automated detection of a suspicious dense region within that mammogram by a computer aided detection (CAD) program, and acquisition of a stereo pair of images with automated collimation to the suspicious region. The latter stereo spot image is obtained within seconds of the original full-field mammogram, without releasing the compression paddle. The spot image is viewed on a stereo video display. A critical element of this technique is the automated detection of suspicious regions for spot imaging. We performed an observer study to compare the suspicious regions selected by radiologists with those selected by a CAD program developed at the University of Michigan. True regions of interest (TROIs) were separately determined by one of the radiologists who reviewed the original mammograms, biopsy images, and histology results. We compared the radiologist and computer-selected regions of interest (ROIs) to the TROIs. Both the radiologists and the computer were allowed to select up to 3 regions in each of 200 images (mixture of 100 CC and 100 MLO views). We computed overlap indices (the overlap index is defined as the ratio of the area of intersection to the area of interest) to quantify the agreement between the selected regions in each image. The averages of the largest overlap indices per image for the 5 radiologist-to-computer comparisons were directly related to the average number of regions per image traced by the radiologists (about 50% for 1 region/image, 84% for 2 regions/image and 96% for 3 regions/image). The average of the overlap indices with all of the TROIs was 73% for CAD and 76.8%+/-10.0% for the radiologists. This study indicates that the CAD determined ROIs could potentially be useful for a screening technique that includes stereo spot

  9. Computer programs to make a Chart of the nuclides for WWW

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo; Katakura, Jun-ichi; Horiguchi, Takayoshi

    1999-06-01

    Computer programs to make a chart of the nuclides for World Wide Web (WWW) have been developed. The programs make a data file for WWW chart of the nuclides from a data file containing nuclide information in the format similar to ENSDF, by filling unknown half-lives with calculated ones. Then, the WWW chart of the nuclides in the gif format is created from the data file. The programs to make html files and image map files, to select a chart of selected nuclides, and to show various information of nuclides are included in the system. All the programs are written in C language. This report describes the formats of files, the programs and 1998 issue of Chart of the Nuclides made by means of the present programs. (author)

  10. Building a parallel file system simulator

    International Nuclear Information System (INIS)

    Molina-Estolano, E; Maltzahn, C; Brandt, S A; Bent, J

    2009-01-01

    Parallel file systems are gaining in popularity in high-end computing centers as well as commercial data centers. High-end computing systems are expected to scale exponentially and to pose new challenges to their storage scalability in terms of cost and power. To address these challenges scientists and file system designers will need a thorough understanding of the design space of parallel file systems. Yet there exist few systematic studies of parallel file system behavior at petabyte and exabyte scale. An important reason is the significant cost of getting access to large-scale hardware to test parallel file systems. To contribute to this understanding we are building a parallel file system simulator that can simulate parallel file systems at very large scale. Our goal is to simulate petabyte-scale parallel file systems on a small cluster or even a single machine in reasonable time and fidelity. With this simulator, file system experts will be able to tune existing file systems for specific workloads, scientists and file system deployment engineers will be able to better communicate workload requirements, file system designers and researchers will be able to try out design alternatives and innovations at scale, and instructors will be able to study very large-scale parallel file system behavior in the classroom. In this paper we describe our approach and provide preliminary results that are encouraging both in terms of fidelity and simulation scalability.

  11. Design and application of remote file management system

    International Nuclear Information System (INIS)

    Zhu Haijun; Liu Dekang; Shen Liren

    2006-01-01

    The file transfer protocol (FTP) helps users transfer files between computers on the Internet. However, FTP cannot fulfill users' needs in some special situations, so programmers must define their own file transfer protocols according to user requirements. The method of realization and the application of a user-defined file transfer protocol are introduced. (authors)

  12. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Modern day continued demand for resource hungry services and applications in IT sector has led to development of Cloud computing. Cloud computing environment involves high cost infrastructure on one hand and need high scale computational resources on the other hand. These resources need to be provisioned (allocation and scheduling) to the end users in most efficient manner so that the tremendous capabilities of cloud are utilized effectively and efficiently. In this paper we discuss a selecti...

  13. Zebra: A striped network file system

    Science.gov (United States)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity update.
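    The parity idea that makes Zebra's reconstruction possible is ordinary XOR parity over equal-sized stripe fragments; a toy sketch (the fragment contents are made up) looks like this:

        def parity(fragments):
            # XOR parity of equal-length stripe fragments, as in RAID-style recovery
            out = bytearray(len(fragments[0]))
            for frag in fragments:
                for i, b in enumerate(frag):
                    out[i] ^= b
            return bytes(out)

        # a client's recent writes, packed into fixed-size stripe fragments
        fragments = [b"file-A-block", b"file-B-block", b"file-C-block"]
        p = parity(fragments)

        # if one server (fragment) is lost, XOR of the survivors and the parity
        # fragment reconstructs it
        recovered = parity([fragments[0], fragments[2], p])
        assert recovered == fragments[1]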

  14. Learning Natural Selection in 4th Grade with Multi-Agent-Based Computational Models

    Science.gov (United States)

    Dickes, Amanda Catherine; Sengupta, Pratim

    2013-01-01

    In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem, through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these…

  15. Task Selection, Task Switching and Multitasking during Computer-Based Independent Study

    Science.gov (United States)

    Judd, Terry

    2015-01-01

    Detailed logs of students' computer use, during independent study sessions, were captured in an open-access computer laboratory. Each log consisted of a chronological sequence of tasks representing either the application or the Internet domain displayed in the workstation's active window. Each task was classified using a three-tier schema…

  16. Rules for selection of computer system to support customer relationships management

    Directory of Open Access Journals (Sweden)

    Dorota Buchnowska

    2012-12-01

    Full Text Available The number of support systems for business management on the Polish market is increasing. Because of that, enterprises are facing a more and more difficult dilemma: which solution to choose? This paper will present stages of the selection process of applications for customer relationships management support, discuss selection criteria and present a decision making tool for the selection of management support system, allowing for multi-faceted and impartial comparison of business applications.

  17. Computer-controlled environmental test systems - Criteria for selection, installation, and maintenance.

    Science.gov (United States)

    Chapman, C. P.

    1972-01-01

    Applications for presently marketed, new computer-controlled environmental test systems are suggested. It is shown that capital costs of these systems follow an exponential cost function curve that levels out as additional applications are implemented. Some test laboratory organization changes are recommended in terms of new personnel requirements, and facility modifications are considered in support of a computer-controlled test system. Software for computer-controlled test systems is discussed, and control loop speed constraints are defined for real-time control functions. Suitable input and output devices and memory storage device tradeoffs are also considered.

  18. Computational Modelling of Materials for Wind Turbine Blades: Selected DTU Wind Energy Activities.

    Science.gov (United States)

    Mikkelsen, Lars Pilgaard; Mishnaevsky, Leon

    2017-11-08

    Computational and analytical studies of degradation of wind turbine blade materials at the macro-, micro-, and nanoscale carried out by the modelling team of the Section Composites and Materials Mechanics, Department of Wind Energy, DTU, are reviewed. Examples of the analysis of the microstructural effects on the strength and fatigue life of composites are shown. Computational studies of degradation mechanisms of wind blade composites under tensile and compressive loading are presented. The effect of hybrid and nanoengineered structures on the performance of the composite was studied in computational experiments as well.

  19. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
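    A toy version of the evaluation step (all thresholds, weights and candidate numbers below are invented for illustration, not taken from the report) filters out alternatives that violate the availability/integrity constraints and then scores the survivors with a weighted sum over power, weight and cost:

        def best_architecture(alternatives, weights, min_availability=0.99999):
            feasible = [a for a in alternatives
                        if a["availability"] >= min_availability and a["integrity_ok"]]
            worst = {k: max(a[k] for a in feasible) for k in weights}
            def value(a):
                # smaller power/weight/cost yields a larger score
                return sum(w * (1.0 - a[k] / worst[k]) for k, w in weights.items())
            return max(feasible, key=value)

        candidates = [
            {"name": "triplex",   "availability": 0.999995, "integrity_ok": True,
             "power": 120.0, "weight": 9.5, "cost": 1.0},
            {"name": "dual-dual", "availability": 0.999990, "integrity_ok": True,
             "power": 95.0, "weight": 11.0, "cost": 0.8},
        ]
        weights = {"power": 0.4, "weight": 0.3, "cost": 0.3}
        print(best_architecture(candidates, weights)["name"])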

  20. Happenstance and compromise: a gendered analysis of students' computing degree course selection

    Science.gov (United States)

    Lang, Catherine

    2010-12-01

    The number of students choosing to study computing at university continues to decline this century, with an even sharper decline in female students. This article presents the results of a series of interviews with university students studying computing courses in Australia that uncovered the influence of happenstance and compromise on course choice. This investigation provides insight into the factors contributing to the continued downturn of student diversity in computing bachelor degree courses. Many females interviewed made decisions based on happenstance, many males interviewed had chosen computing as a compromise course, and family helped in the decision-making to a large degree in both genders. The major implication from this investigation is the finding that students of both genders appear to be socialised away from this discipline, which is perceived as a support or insurance skill, not a career in itself, in all but the most technical-oriented (usually male) students.

  1. Selected Publications in Image Understanding and Computer Vision from 1974 to 1983

    Science.gov (United States)

    1985-04-18


  2. Comparing ProFile Vortex to ProTaper Next for the efficacy of removal of root filling material: An ex vivo micro-computed tomography study

    Directory of Open Access Journals (Sweden)

    Emad AlShwaimi

    2018-01-01

    Conclusion: Our findings suggest that PV is as effective as PTN for removal of root canal filling material. Therefore, PV can be considered for use in endodontic retreatment, although more effective files or techniques are still required.

  3. Selection of symptomatic patients with Crohn's disease for abdominopelvic computed tomography: role of serum C-reactive protein.

    LENUS (Irish Health Repository)

    Desmond, Alan N

    2012-11-01

    Results of previous studies have shown that repeated abdominopelvic computed tomography (CT) examinations can lead to substantial cumulative diagnostic radiation exposure in patients with Crohn's disease (CD). Improved selection of patients referred for CT will reduce unnecessary radiation exposure. This study examines if serum C-reactive protein (CRP) concentration predicts which symptomatic patients with CD are likely to have significant disease activity or disease complications (such as abscess) detected on abdominopelvic CT.

  4. Human versus Computer Controlled Selection of Ventilator Settings: An Evaluation of Adaptive Support Ventilation and Mid-Frequency Ventilation

    Directory of Open Access Journals (Sweden)

    Eduardo Mireles-Cabodevila

    2012-01-01

    Full Text Available Background. There are modes of mechanical ventilation that can select ventilator settings with computer controlled algorithms (targeting schemes. Two examples are adaptive support ventilation (ASV and mid-frequency ventilation (MFV. We studied how different clinician-chosen ventilator settings are from these computer algorithms under different scenarios. Methods. A survey of critical care clinicians provided reference ventilator settings for a 70 kg paralyzed patient in five clinical/physiological scenarios. The survey-derived values for minute ventilation and minute alveolar ventilation were used as goals for ASV and MFV, respectively. A lung simulator programmed with each scenario’s respiratory system characteristics was ventilated using the clinician, ASV, and MFV settings. Results. Tidal volumes ranged from 6.1 to 8.3 mL/kg for the clinician, 6.7 to 11.9 mL/kg for ASV, and 3.5 to 9.9 mL/kg for MFV. Inspiratory pressures were lower for ASV and MFV. Clinician-selected tidal volumes were similar to the ASV settings for all scenarios except for asthma, in which the tidal volumes were larger for ASV and MFV. MFV delivered the same alveolar minute ventilation with higher end expiratory and lower end inspiratory volumes. Conclusions. There are differences and similarities among initial ventilator settings selected by humans and computers for various clinical scenarios. The ventilation outcomes are the result of the lung physiological characteristics and their interaction with the targeting scheme.

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  6. Preface to special issue of selected papers from Theoretical, Experimental, and Computational Mechanics (TECM)

    DEFF Research Database (Denmark)

    Jabbari, Masoud; Sarlak Chivaee, Hamid; Hattel, Jesper Henri

    2017-01-01

    We are pleased to introduce this special issue of the Applied Mathematical Modelling journal with highlights from the Theoretical, Experimental, and Computational Mechanics Symposium (TECM-2015). This special issue consists of four rigorously selected papers originally presented at TECM-2015 as a part of the 13th International Conference of Numerical Analysis and Applied Mathematics 2015 (ICNAAM 2015), which was held on 23-29 September 2015 in Rhodes, Greece. The symposium attracted a broad range of international and local leaders in theoretical, experimental, and computational mechanics across various fields and applications. The symposium did an excellent job of outlining the current landscape of computational mechanics and its capabilities in solving complex industrial problems in the process industries, and we agree with the editor-in-chief of the journal that it is certainly worthwhile

  7. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    International Nuclear Information System (INIS)

    McGrail, B.P.; Mahoney, L.A.

    1995-10-01

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest ranked computer code was found to be the ARES-CT code developed at PNL for the US Department of Energy for the evaluation of land disposal sites

  8. Experimental and computational evaluation of area selectively immobilized horseradish peroxidase in a microfluidic device

    DEFF Research Database (Denmark)

    Hoffmann, Christian; Pereira Rosinha Grundtvig, Ines; Thrane, Joachim

    2017-01-01

    experimentally and by computational fluid dynamics (CFD) simulations. Ultimately, such a correlation would lead to faster development through computational pre-screening and optimized experimental design.In this proof-of-concept study, microreactors were prepared in a 2-step curing process of an off......-stoichiometric thiol-ene-epoxy (OSTE+) mixture employing both a thiol-ene (TEC) and a thiol-epoxy curing reaction. Subsequent surface functionalization of the remaining thiol groups on the reactor surface through stenciled photoinitiated TEC enabled the preparation of specific surface patterns in the reactor. Patterns...... as obtained from experimental determination. This good agreement between the obtained experimental and computational results confirmed the high potential of CFD models for predicting and optimizing the biocatalytic performance of such a reactor....

  9. ACONC Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — ACONC files containing simulated ozone and PM2.5 fields that were used to create the model difference plots shown in the journal article. This dataset is associated...

  10. XML Files

    Science.gov (United States)

    MedlinePlus XML Files (https://medlineplus.gov/xml.html): MedlinePlus produces XML data sets that you are welcome to download ...

  11. 831 Files

    Data.gov (United States)

    Social Security Administration — SSA-831 file is a collection of initial and reconsideration adjudicative level DDS disability determinations. (A few hearing level cases are also present, but the...

  12. Computational design for a wide-angle cermet-based solar selective absorber for high temperature applications

    International Nuclear Information System (INIS)

    Sakurai, Atsushi; Tanikawa, Hiroya; Yamada, Makoto

    2014-01-01

    The purpose of this study is to computationally design a wide-angle cermet-based solar selective absorber for high-temperature applications by using a characteristic matrix method and a genetic algorithm. The present study investigates a solar selective absorber with a tungsten–silica (W–SiO2) cermet. Multilayer structures of 1, 2, 3, and 4 layers and a wide range of metal volume fractions are optimized. The predicted radiative properties show good solar performance: thermal emittances, especially beyond 2 μm, are quite low, whereas solar absorptance remains high over a wide angular range, so that solar photons are effectively absorbed and infrared radiative heat loss is reduced. -- Highlights: • Electromagnetic simulation of radiative properties by the characteristic matrix method. • Optimization of a multilayered W–SiO2 cermet-based absorber by a genetic algorithm. • A solar selective absorber with high solar performance is proposed.
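
    The characteristic matrix method mentioned above is the standard thin-film transfer-matrix calculation; as a rough illustration of the forward model that such an optimization would drive, the sketch below computes normal-incidence reflectance of a small multilayer stack. The layer indices and thicknesses are placeholders, not the optimized W–SiO2 cermet design.

```python
import numpy as np

def stack_reflectance(wavelength_nm, layers, n_ambient=1.0, n_substrate=1.5):
    """Normal-incidence reflectance of a thin-film stack computed with the
    characteristic (transfer) matrix method.

    layers: list of (refractive_index, thickness_nm) pairs, ambient side first.
    Refractive indices may be complex to describe absorbing (cermet-like) layers."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2.0 * np.pi * n * d / wavelength_nm       # phase thickness of the layer
        eta = n                                           # optical admittance at normal incidence
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / eta],
                          [1j * eta * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_substrate])
    r = (n_ambient * B - C) / (n_ambient * B + C)         # amplitude reflection coefficient
    return float(np.abs(r) ** 2)

# Illustrative two-layer stack (placeholder indices, not the optimized design)
layers = [(2.1 + 0.5j, 60.0), (1.45, 90.0)]
for wl in (500.0, 1000.0, 2000.0):
    print(f"{wl:6.0f} nm  R = {stack_reflectance(wl, layers):.3f}")
```

    A genetic algorithm such as the one used in the paper would repeatedly call a forward model of this kind while varying layer thicknesses and metal volume fractions, scoring each candidate by its solar absorptance and thermal emittance.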

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  14. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  16. Computational Modelling of Materials for Wind Turbine Blades: Selected DTUWind Energy Activities

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Pilgaard; Mishnaevsky, Leon

    2017-01-01

    Computational and analytical studies of degradation of wind turbine blade materials at the macro-, micro-, and nanoscale carried out by the modelling team of the Section Composites and Materials Mechanics, Department of Wind Energy, DTU, are reviewed. Examples of the analysis of the microstructural...... effects on the strength and fatigue life of composites are shown. Computational studies of degradation mechanisms of wind blade composites under tensile and compressive loading are presented. The effect of hybrid and nanoengineered structures on the performance of the composite was studied...

  17. The impact of computer-based versus "traditional" textbook science instruction on selected student learning outcomes

    Science.gov (United States)

    Rothman, Alan H.

    This study reports the results of research designed to examine the impact of computer-based science instruction on elementary school level students' science content achievement, their attitude about science learning, their level of critical thinking-inquiry skills, and their level of cognitive and English language development. The study compared these learning outcomes resulting from a computer-based approach compared to the learning outcomes from a traditional, textbook-based approach to science instruction. The computer-based approach was inherent in a curriculum titled The Voyage of the Mimi , published by The Bank Street College Project in Science and Mathematics (1984). The study sample included 209 fifth-grade students enrolled in three schools in a suburban school district. This sample was divided into three groups, each receiving one of the following instructional treatments: (a) Mixed-instruction primarily based on the use of a hardcopy textbook in conjunction with computer-based instructional materials as one component of the science course; (b) Non-Traditional, Technology-Based -instruction fully utilizing computer-based material; and (c) Traditional, Textbook-Based-instruction utilizing only the textbook as the basis for instruction. Pre-test, or pre-treatment, data related to each of the student learning outcomes was collected at the beginning of the school year and post-test data was collected at the end of the school year. Statistical analyses of pre-test data were used as a covariate to account for possible pre-existing differences with regard to the variables examined among the three student groups. This study concluded that non-traditional, computer-based instruction in science significantly improved students' attitudes toward science learning and their level of English language development. Non-significant, positive trends were found for the following student learning outcomes: overall science achievement and development of critical thinking

  18. A high speed, selective multi-ADC to computer data transfer interface, for nuclear physics experiments

    International Nuclear Information System (INIS)

    Arctaedius, T.; Ekstroem, R.E.

    1986-08-01

    A link connecting up to fifteen Analog-to-Digital Converters with a computer, through a Direct Memory Access interface, is described. The interface decides which of the connected ADCs participated in an event and transfers the output data from these to the computer, accompanied by a 2-byte word identifying the participating ADCs. This data format can be recorded on tape without further transformations and is easy to unfold in the off-line analysis. Data transfer is accomplished in less than a few microseconds, which is made possible by the use of high-speed TTL circuits. (authors)
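
    The 2-byte identifier word described above lends itself to straightforward offline unfolding; the sketch below shows one plausible way to decode such a record stream. The 16-bit word width, big-endian byte order and bit-to-ADC mapping are assumptions made for the sketch, since the exact tape format is not given here.

```python
import struct

def unfold_events(raw: bytes):
    """Decode a stream of events of the assumed form
    [2-byte ADC-participation bitmask][one 2-byte value per set bit].
    Bit i set means ADC i participated in the event."""
    events, pos = [], 0
    while pos + 2 <= len(raw):
        (mask,) = struct.unpack_from(">H", raw, pos)
        pos += 2
        event = {}
        for adc in range(15):                              # up to fifteen ADCs
            if mask & (1 << adc):
                (value,) = struct.unpack_from(">H", raw, pos)
                pos += 2
                event[adc] = value
        events.append(event)
    return events

# Example: one event in which ADCs 0 and 3 fired with values 0x0123 and 0x0456
record = struct.pack(">HHH", 0b0000000000001001, 0x0123, 0x0456)
print(unfold_events(record))   # [{0: 291, 3: 1110}]
```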

  19. Selecting a Suitable Cloud Computing Technology Deployment Model for an Academic Institute : A Case Study

    Science.gov (United States)

    Ramachandran, N.; Sivaprakasam, P.; Thangamani, G.; Anand, G.

    2014-01-01

    Purpose: Cloud Computing (CC) technology is getting implemented rapidly in the educational sector to improve learning, research and other administrative process. As evident from the literature review, most of these implementations are happening in the western countries such as USA, UK, while the level of implementation of CC in developing…

  20. Performance of Brain-computer Interfacing based on tactile selective sensation and motor imagery

    DEFF Research Database (Denmark)

    Yao, Lin; Sheng, Xinjun; Mrachacz-Kersting, Natalie

    2018-01-01

    We proposed a multi-class tactile brain-computer interface that utilizes stimulus-induced oscillatory dynamics. It was hypothesized that somatosensory attention can modulate tactile induced oscillation changes, which can decode different sensation attention tasks. Subjects performed four tactile...

  1. Computer Simulation of the Relationship between Selected Properties of PVD Coatings

    Directory of Open Access Journals (Sweden)

    Śliwa A.

    2016-06-01

    The possibility of applying the Finite Element Method to calculate the internal stresses which occur in Ti+TiN, Ti+Ti(CxN1-x), and Ti+TiC coatings obtained in the magnetron PVD process on the sintered high-speed steel of the PM HS6-5-3-8 type was investigated. For the purpose of computer simulation of internal stresses in coatings with the use of FEM, a model of the analyzed specimens was worked out and then verified experimentally by comparing the computer simulation results with measured values. Analysis of the correlations indicated an especially strong dependence between internal stresses and microhardness, and between microhardness and erosion resistance, which created conditions for establishing the dependence between the internal stresses obtained from computer simulation and erosion resistance as a basic functional quality of the coating. This has essential practical meaning because it allows the expected erosion resistance of a coating to be estimated exclusively on the basis of computer simulation results for the parameters used in the coating manufacturing process.

  2. Criteria for the selective use of chest computed tomography in blunt trauma patients.

    NARCIS (Netherlands)

    Brink, M.; Deunk, J.; Dekker, H.M.; Edwards, M.J.R.; Kool, D.R.; Vugt, A.B. van; Kuijk, C. van; Blickman, J.G.

    2010-01-01

    PURPOSE: The purpose of this study was to derive parameters that predict which high-energy blunt trauma patients should undergo computed tomography (CT) for detection of chest injury. METHODS: This observational study prospectively included consecutive patients (≥16 years old) who underwent

  3. Student Computer Use in Selected Undergraduate Agriculture Courses: An Examination of Required Tasks.

    Science.gov (United States)

    Johnson, Donald M.; Ferguson, James A.; Vokins, Nancy W.; Lester, Melissa L.

    2000-01-01

    Over 50% of faculty teaching undergraduate agriculture courses (n=58) required use of word processing, Internet, and electronic mail; less than 50% required spreadsheets, databases, graphics, or specialized software. They planned to maintain or increase required computer tasks in their courses. (SK)

  4. JNDC FP decay data file

    International Nuclear Information System (INIS)

    Yamamoto, Tohru; Akiyama, Masatsugu

    1981-02-01

    The decay data file for fission product nuclides (FP DECAY DATA FILE) has been prepared for summation calculations of the decay heat of fission products. The average energies released in β- and γ-transitions have been calculated with the computer code PROFP. The calculated results and necessary information have been arranged in tabular form, together with estimated results for 470 nuclides for which decay data are not available experimentally. (author)
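
    As background to how such a file is used, the summation calculation referred to above combines the tabulated average decay energies with the time-dependent nuclide inventory. In standard notation (assumed here, not quoted from the JNDC documentation), the fission-product decay-heat power is

```latex
% Standard summation formula for fission-product decay heat (illustrative notation)
f(t) = \sum_{i} \left( \bar{E}_{\beta,i} + \bar{E}_{\gamma,i} \right) \lambda_i \, N_i(t)
```

    where the average β and γ energies per decay of nuclide i are the quantities tabulated in the file, λ_i is its decay constant, and N_i(t) is its atom inventory at cooling time t.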

  5. Efficacy of Twisted File Adaptive, Reciproc and ProTaper Universal Retreatment instruments for root-canal-filling removal: A cone-beam computed tomography study.

    Science.gov (United States)

    Akbulut, Makbule Bilge; Akman, Melek; Terlemez, Arslan; Magat, Guldane; Sener, Sevgi; Shetty, Heeresh

    2016-01-01

    The aim of this study was to evaluate the efficacy of Twisted File (TF) Adaptive, Reciproc, and ProTaper Universal Retreatment (UR) System instruments for removing root-canal-filling. Sixty single rooted teeth were decoronated, instrumented and obturated. Preoperative CBCT scans were taken and the teeth were retreated with TF Adaptive, Reciproc, ProTaper UR, or hand files (n=15). Then, the teeth were rescanned, and the percentage volume of the residual root-canal-filling material was established. The total time for retreatment was recorded, and the data was statistically analyzed. The statistical ranking of the residual filling material volume was as follows: hand file=TF Adaptive>ProTaper UR=Reciproc. The ProTaper UR and Reciproc systems required shorter periods of time for retreatment. Root canal filling was more efficiently removed by using Reciproc and ProTaper UR instruments than TF Adaptive instruments and hand files. The TF Adaptive system was advantageous over hand files with regard to operating time.

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not only by using opportunistic resources like the San Diego Supercomputer Center which was accessible, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  7. Computationally Efficient Chaotic Spreading Sequence Selection for Asynchronous DS-CDMA

    Directory of Open Access Journals (Sweden)

    Litviņenko Anna

    2017-12-01

    The choice of the spreading sequence for asynchronous direct-sequence code-division multiple-access (DS-CDMA) systems plays a crucial role for the mitigation of multiple-access interference. Considering the rich dynamics of chaotic sequences, their use for spreading allows overcoming the limitations of the classical spreading sequences. However, to ensure low cross-correlation between the sequences, careful selection must be performed. This paper presents a novel exhaustive search algorithm, which allows finding sets of chaotic spreading sequences of required length with a particularly low mutual cross-correlation. The efficiency of the search is verified by simulations, which show a significant advantage compared to non-selected chaotic sequences. Moreover, the impact of sequence length on the efficiency of the selection is studied.
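
    The selection criterion at the heart of such a search, keeping only sequences whose pairwise cross-correlation stays below a threshold, can be sketched as follows. The logistic-map generator, the greedy acceptance rule and all numeric parameters are illustrative assumptions, not the authors' exhaustive search algorithm.

```python
import numpy as np

def logistic_sequence(x0, length, r=3.99):
    """Binary (+1/-1) spreading sequence derived from a logistic-map orbit."""
    x, chips = x0, []
    for _ in range(length):
        x = r * x * (1.0 - x)
        chips.append(1.0 if x >= 0.5 else -1.0)
    return np.array(chips)

def max_crosscorr(a, b):
    """Maximum absolute periodic cross-correlation over all cyclic shifts."""
    n = len(a)
    return max(abs(np.dot(a, np.roll(b, k))) / n for k in range(n))

def select_set(seeds, length, threshold, set_size):
    """Greedily keep sequences whose cross-correlation with every already
    accepted sequence stays below the threshold."""
    chosen = []
    for s in seeds:
        cand = logistic_sequence(s, length)
        if all(max_crosscorr(cand, c) < threshold for c in chosen):
            chosen.append(cand)
        if len(chosen) == set_size:
            break
    return chosen

# Illustrative parameters only; the paper's sequence lengths and bounds differ
seqs = select_set(seeds=np.linspace(0.11, 0.89, 400), length=63,
                  threshold=0.4, set_size=4)
print(len(seqs), "sequences selected")
```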

  8. Computational Experiment Study on Selection Mechanism of Project Delivery Method Based on Complex Factors

    Directory of Open Access Journals (Sweden)

    Xiang Ding

    2014-01-01

    Project delivery planning is a key stage used by the project owner (or project investor) for organizing design, construction, and other operations in a construction project. The main task in this stage is to select an appropriate project delivery method. In order to analyze the different factors affecting PDM selection, this paper establishes a multiagent model mainly to show how project complexity, governance strength, and market environment affect the project owner’s decision on PDM. Experimental results show that project owners usually choose the Design-Build method when project complexity lies within a certain range. Besides, this paper points out that the Design-Build method will be the preferred choice when the potential contractors develop quickly. This paper provides owners with methods and suggestions by showing how these factors affect PDM selection, and it may improve project performance.

  9. A trial of selective imaging for the breast mass shadow by computed radiography

    International Nuclear Information System (INIS)

    Muramatsu, Yukio; Anan, Mitsuhiro; Tanaka, Takashi; Matsue, Hiroto; Yamada, Tatsuya

    1990-01-01

    CR has the ability to produce many kinds of images by means of several image processings. In particular, gradation processing is more important than frequency processing for producing images in CR mammography. We developed a new method to image breast masses selectively with a new gradation processing and tried it on 18 patients over sixty years old with breast cancer. All of the breast mass shadows were selectively separated from other parenchymal shadows. We therefore conclude that auto-recognition of breast mass shadows may become possible in the near future in the CR system. (author)

  10. Quantum perceptron over a field and neural network architecture selection in a quantum computer.

    Science.gov (United States)

    da Silva, Adenilton José; Ludermir, Teresa Bernarda; de Oliveira, Wilson Rosa

    2016-04-01

    In this work, we propose a quantum neural network named quantum perceptron over a field (QPF). Quantum computers are not yet a reality and the models and algorithms proposed in this work cannot be simulated in actual (or classical) computers. QPF is a direct generalization of a classical perceptron and solves some drawbacks found in previous models of quantum perceptrons. We also present a learning algorithm named Superposition based Architecture Learning algorithm (SAL) that optimizes the neural network weights and architectures. SAL searches for the best architecture in a finite set of neural network architectures with linear time over the number of patterns in the training set. SAL is the first learning algorithm to determine neural network architectures in polynomial time. This speedup is obtained by the use of quantum parallelism and a non-linear quantum operator. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Computational intelligence-based polymerase chain reaction primer selection based on a novel teaching-learning-based optimisation.

    Science.gov (United States)

    Cheng, Yu-Huei

    2014-12-01

    Specific primers play an important role in polymerase chain reaction (PCR) experiments, and therefore it is essential to find specific primers of outstanding quality. Unfortunately, many PCR constraints must be inspected simultaneously, which makes specific primer selection difficult and time-consuming. This paper introduces a novel computational intelligence-based method, Teaching-Learning-Based Optimisation, to select specific and feasible primers. Specified PCR product lengths of 150-300 bp and 500-800 bp were used with three melting temperature formulae: Wallace's formula, Bolton and McCarthy's formula and SantaLucia's formula. The authors calculate the optimal frequency to estimate the quality of primer selection based on a total of 500 runs for 50 random nucleotide sequences of 'Homo species' retrieved from the National Center for Biotechnology Information. The method was then fairly compared with the genetic algorithm (GA) and memetic algorithm (MA) for primer selection in the literature. The results show that the method easily found suitable primers satisfying the set primer constraints and performed better than the GA and the MA. Furthermore, the method was also compared with the common method Primer3 with respect to method type, primer presentation, parameter settings, speed and memory usage. In conclusion, it is an interesting primer selection method and a valuable tool for automatic high-throughput analysis. In the future, the usage of the primers in the wet lab needs to be validated carefully to increase the reliability of the method.
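
    Teaching-Learning-Based Optimisation itself is a generic population-based optimizer; a bare-bones sketch of its teacher and learner phases on a toy continuous objective is given below. The real primer-design objective (melting temperature, GC content, specificity and the other PCR constraints) is not modelled here.

```python
import numpy as np

def tlbo(objective, bounds, pop_size=20, iterations=100, seed=0):
    """Minimal Teaching-Learning-Based Optimisation for minimization."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.apply_along_axis(objective, 1, pop)
    for _ in range(iterations):
        # Teacher phase: pull the population toward the best solution
        teacher = pop[np.argmin(fit)]
        tf = rng.integers(1, 3)                                  # teaching factor in {1, 2}
        new = pop + rng.random((pop_size, dim)) * (teacher - tf * pop.mean(axis=0))
        new = np.clip(new, lo, hi)
        new_fit = np.apply_along_axis(objective, 1, new)
        better = new_fit < fit
        pop[better], fit[better] = new[better], new_fit[better]
        # Learner phase: each learner moves relative to a randomly chosen peer
        for i in range(pop_size):
            j = int(rng.integers(pop_size))
            if j == i:
                continue
            step = pop[i] - pop[j] if fit[i] < fit[j] else pop[j] - pop[i]
            cand = np.clip(pop[i] + rng.random(dim) * step, lo, hi)
            cand_fit = objective(cand)
            if cand_fit < fit[i]:
                pop[i], fit[i] = cand, cand_fit
    best = np.argmin(fit)
    return pop[best], fit[best]

# Toy objective: the sphere function stands in for a primer-quality score
best_x, best_f = tlbo(lambda x: float(np.sum(x ** 2)),
                      bounds=(np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
print(best_x, best_f)
```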

  12. A New Decision-Making Method for Stock Portfolio Selection Based on Computing with Linguistic Assessment

    Directory of Open Access Journals (Sweden)

    Chen-Tung Chen

    2009-01-01

    The purpose of stock portfolio selection is to allocate capital among a large number of stocks in order to bring the most profitable return for investors. In most of the past literature, experts considered the portfolio selection problem based only on past crisp or quantitative data. However, many qualitative and quantitative factors influence stock portfolio selection in real investment situations. It is very important for experts or decision-makers to use their experience or knowledge to predict the performance of each stock and build a stock portfolio. Because the knowledge, experience, and background of each expert are different and vague, different types of 2-tuple linguistic variables are suitable for expressing experts' opinions on the performance evaluation of each stock with respect to the criteria. According to the linguistic evaluations of the experts, the linguistic TOPSIS and linguistic ELECTRE methods are combined to present a new decision-making method for dealing with stock selection problems in this paper. Once the investment set has been determined, the risk preferences of the investor are considered to calculate the investment ratio of each stock in the investment set. Finally, an example is implemented to demonstrate the practicability of the proposed method.

  13. Computed ABC Analysis for Rational Selection of Most Informative Variables in Multivariate Data.

    Science.gov (United States)

    Ultsch, Alfred; Lötsch, Jörn

    2015-01-01

    Multivariate data sets often differ in several factors or derived statistical parameters, which have to be selected for a valid interpretation. Basing this selection on traditional statistical limits leads occasionally to the perception of losing information from a data set. This paper proposes a novel method for calculating precise limits for the selection of parameter sets. The algorithm is based on an ABC analysis and calculates these limits on the basis of the mathematical properties of the distribution of the analyzed items. The limits implement the aim of any ABC analysis, i.e., comparing the increase in yield to the required additional effort. In particular, the limit for set A, the "important few", is optimized in a way that both, the effort and the yield for the other sets (B and C), are minimized and the additional gain is optimized. As a typical example from biomedical research, the feasibility of the ABC analysis as an objective replacement for classical subjective limits to select highly relevant variance components of pain thresholds is presented. The proposed method improved the biological interpretation of the results and increased the fraction of valid information that was obtained from the experimental data. The method is applicable to many further biomedical problems including the creation of diagnostic complex biomarkers or short screening tests from comprehensive test batteries. Thus, the ABC analysis can be proposed as a mathematically valid replacement for traditional limits to maximize the information obtained from multivariate research data.
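
    The exact limit construction of the computed ABC analysis is not reproduced here; the sketch below only illustrates the underlying idea in a simplified form: rank the contributions, build the cumulative (effort, yield) curve, and end set A at the point closest to the ideal of maximal yield for minimal effort. The distance-to-ideal heuristic is an assumption made for illustration and not necessarily the paper's precise rule.

```python
import numpy as np

def abc_partition(values):
    """Simplified ABC split: items are ranked by contribution and set A ends at
    the point of the cumulative curve closest to the ideal (0 effort, 100 % yield)."""
    values = np.asarray(values, dtype=float)
    order = np.argsort(values)[::-1]                    # largest contributions first
    effort = np.arange(1, len(values) + 1) / len(values)
    yield_ = np.cumsum(values[order]) / values.sum()
    dist = np.sqrt(effort ** 2 + (1.0 - yield_) ** 2)   # distance to the ideal point (0, 1)
    a_end = int(np.argmin(dist)) + 1                    # number of items in set A
    return order[:a_end], order[a_end:]

# Example: variance components, of which only a few are the "important few"
variances = [4.1, 2.7, 1.9, 0.6, 0.5, 0.3, 0.2, 0.1, 0.1, 0.05]
set_a, set_bc = abc_partition(variances)
print("set A (important few):", set_a.tolist())
```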

  14. PSE For Solvent Applications: A Generic Computer-aided Solvent Selection and Design Framework

    DEFF Research Database (Denmark)

    Mitrofanov, Igor; Sin, Gürkan; Gani, Rafiqul

    system engineering view that emphasizes a systematic and generic solution framework to solvent selection problems is presented. The framework integrates different methods and tools to manage the complexity and solve a wide range of problems in efficient and flexible manner. Its software implementation...

  15. Endogenous Sensory Discrimination and Selection by a Fast Brain Switch for a High Transfer Rate Brain-Computer Interface.

    Science.gov (United States)

    Xu, Ren; Jiang, Ning; Dosen, Strahinja; Lin, Chuang; Mrachacz-Kersting, Natalie; Dremstrup, Kim; Farina, Dario

    2016-08-01

    In this study, we present a novel multi-class brain-computer interface (BCI) for communication and control. In this system, the information processing is shared by the algorithm (computer) and the user (human). Specifically, an electro-tactile cycle was presented to the user, providing the choice (class) by delivering timely sensory input. The user discriminated these choices by his/her endogenous sensory ability and selected the desired choice with an intuitive motor task. This selection was detected by a fast brain switch based on real-time detection of movement-related cortical potentials from scalp EEG. We demonstrated the feasibility of such a system with a four-class BCI, yielding a true positive rate of  ∼ 80% and  ∼ 70%, and an information transfer rate of  ∼ 7 bits/min and  ∼ 5 bits/min, for the movement and imagination selection command, respectively. Furthermore, when the system was extended to eight classes, the throughput of the system was improved, demonstrating the capability of accommodating a large number of classes. Combining the endogenous sensory discrimination with the fast brain switch, the proposed system could be an effective, multi-class, gaze-independent BCI system for communication and control applications.

  16. Influence of core design, production technique, and material selection on fracture behavior of yttria-stabilized tetragonal zirconia polycrystal fixed dental prostheses produced using different multilayer techniques: split-file, over-pressing, and manually built-up veneers.

    Science.gov (United States)

    Mahmood, Deyar Jallal Hadi; Linderoth, Ewa H; Wennerberg, Ann; Vult Von Steyern, Per

    2016-01-01

    To investigate and compare the fracture strength and fracture mode in eleven groups of the currently most commonly used multilayer three-unit all-ceramic yttria-stabilized tetragonal zirconia polycrystal (Y-TZP) fixed dental prostheses (FDPs) with respect to the choice of core material, veneering material area, manufacturing technique, design of connectors, and radii of curvature of FDP cores. A total of 110 three-unit Y-TZP FDP cores with one intermediate pontic were made. The FDP cores in groups 1-7 were made with a split-file design, veneered with manually built-up porcelain, computer-aided design-on veneers, and over-pressed veneers. Groups 8-11 consisted of FDPs with a state-of-the-art design, veneered with manually built-up porcelain. All the FDP cores were subjected to simulated aging and finally loaded to fracture. There was a significant difference between the different designs, but not between the different types of Y-TZP materials. The split-file designs with VITABLOCS(®) (1,806±165 N) and e.max(®) ZirPress (1,854±115 N) and the state-of-the-art design with VITA VM(®) 9 (1,849±150 N) demonstrated the highest mean fracture values. The shape of a split-file designed all-ceramic reconstruction calls for a different dimension protocol compared to traditionally shaped ones, as the split-file design leads to sharp approximal indentations acting as fractural impressions, thus decreasing the overall strength. The design of a framework is a crucial factor for the load-bearing capacity of an all-ceramic FDP. The state-of-the-art design is preferable, since the split-file designed cores call for a cross-sectional connector area at least 42% larger to have the same load-bearing capacity as the state-of-the-art designed cores. All veneering materials and techniques tested in the study (split-file, over-press, built-up porcelains, and glass-ceramics) are, with a great safety margin, sufficient for clinical use both anteriorly and posteriorly. Analysis of the fracture pattern shows

  17. Clustering based gene expression feature selection method: A computational approach to enrich the classifier efficiency of differentially expressed genes

    KAUST Repository

    Abusamra, Heba

    2016-07-20

    The high-dimension, low-sample-size nature of gene expression data makes the classification task more challenging. Therefore, feature (gene) selection becomes an apparent need. Selecting meaningful and relevant genes for the classifier not only decreases computational time and cost, but also improves classification performance. Among the different approaches to feature selection, however, most suffer from several problems such as lack of robustness, validation issues, etc. Here, we present a new feature selection technique that takes advantage of clustering both samples and genes. Materials and methods: We used a leukemia gene expression dataset [1]. The effectiveness of the selected features was evaluated by four different classification methods: support vector machines, k-nearest neighbor, random forest, and linear discriminant analysis. The method evaluates the importance and relevance of each gene cluster by summing the expression level of each gene belonging to that cluster. A gene cluster is considered important if it satisfies conditions depending on thresholds and percentages; otherwise it is eliminated. Results: Initial analysis identified 7120 differentially expressed genes in leukemia (Fig. 15a); after applying our feature selection methodology we end up with 1117 specific genes discriminating the two classes of leukemia (Fig. 15b). Further applying the same method with a more stringent higher positive and lower negative threshold condition, the number was reduced to 58 genes, which were tested to evaluate the effectiveness of the method (Fig. 15c). The results of the four classification methods are summarized in Table 11. Conclusions: The feature selection method gave good results with minimum classification error. Our heat-map result shows a distinct pattern of refined genes discriminating between the two classes of leukemia.
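
    A stripped-down illustration of the idea described above, clustering the genes, scoring each cluster by summed expression and keeping the clusters that pass a cutoff, might look like the following. The use of k-means and the particular cutoff are assumptions made for the sketch, not the authors' exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_gene_clusters(X, n_clusters=20, keep_fraction=0.2, seed=0):
    """X: expression matrix (samples x genes). Returns indices of genes in the
    highest-scoring clusters, where a cluster's score is the summed expression
    of its member genes across all samples."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(X.T)
    scores = np.array([X[:, labels == c].sum() for c in range(n_clusters)])
    cutoff = np.quantile(scores, 1.0 - keep_fraction)    # keep the top-scoring clusters
    kept = np.where(scores >= cutoff)[0]
    return np.where(np.isin(labels, kept))[0]

# Toy data standing in for leukemia expression values: 40 samples x 500 genes
rng = np.random.default_rng(1)
X = rng.lognormal(mean=1.0, sigma=0.5, size=(40, 500))
selected = select_gene_clusters(X)
print(len(selected), "genes retained for the classifier")
```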

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. File Type Identification of File Fragments using Longest Common Subsequence (LCS)

    Science.gov (United States)

    Rahmat, R. F.; Nicholas, F.; Purnamawati, S.; Sitompul, O. S.

    2017-01-01

    Computer forensic analysts are in charge of investigation and evidence tracking. In certain cases, a file that needs to be presented as digital evidence has been deleted. It is difficult to reconstruct such a file, because it has often lost its header and cannot be identified while being restored. Therefore, a method is required for identifying the file type of file fragments. In this research, we propose Longest Common Subsequence (LCS), which consists of three steps, namely training, testing and validation, to identify the file type from file fragments. From all testing results we can conclude that our proposed method works well and achieves 92.91% accuracy in identifying the file type of file fragments.
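
    The longest-common-subsequence measure at the core of the method can be sketched as follows. Classifying a fragment by its best-matching per-type reference signature is a simplification of the training/testing/validation pipeline described above, and the signatures shown are illustrative.

```python
def lcs_length(a: bytes, b: bytes) -> int:
    """Length of the longest common subsequence of two byte strings (classic DP)."""
    prev = [0] * (len(b) + 1)
    for x in a:
        curr = [0] * (len(b) + 1)
        for j, y in enumerate(b, start=1):
            curr[j] = prev[j - 1] + 1 if x == y else max(prev[j], curr[j - 1])
        prev = curr
    return prev[-1]

def classify_fragment(fragment: bytes, signatures: dict) -> str:
    """Assign the file type whose reference signature shares the longest common
    subsequence with the fragment, normalised by the signature length."""
    scores = {ftype: lcs_length(fragment, sig) / len(sig)
              for ftype, sig in signatures.items()}
    return max(scores, key=scores.get)

# Hypothetical per-type reference signatures (learned from training fragments)
signatures = {
    "jpg": bytes.fromhex("ffd8ffe000104a464946"),
    "png": bytes.fromhex("89504e470d0a1a0a0000"),
    "pdf": b"%PDF-1.4\n%",
}
fragment = bytes.fromhex("ffd8ffe000104a46494600010100")
print(classify_fragment(fragment, signatures))   # -> jpg
```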

  1. Computer-aided tool for solvent selection in pharmaceutical processes: Solvent swap

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; K. Tula, Anjan; Gernaey, Krist V.

    -liquid equilibria). The application of the developed model-based framework is highlighted through several cases studies published in the literature. In the current state, the framework is suitable for problems where the original solvent is exchanged by distillation. A solvent selection guide for fast of suitable......-aided framework with the objective to assist the pharmaceutical industry in gaining better process understanding. A software interface to improve the usability of the tool has been created also....

  2. An integrated computational and experimental approach to gaining selectivity for MMP-2 within the gelatinase subfamily.

    Science.gov (United States)

    Fabre, Benjamin; Filipiak, Kamila; Díaz, Natalia; Zapico, José María; Suárez, Dimas; Ramos, Ana; de Pascual-Teresa, Beatriz

    2014-02-10

    Looking for water-soluble inhibitors of matrix metalloproteinase-2 (MMP-2 or gelatinase A), we have previously reported compound 1, a potent MMP-2 inhibitor with a promising selectivity over the structurally homologous MMP-9 (gelatinase B). Here we report the results of Molecular Dynamics (MD) simulations for both gelatinases (MMP-2 and MMP-9), and for the corresponding MMP/1 complexes, in an attempt to shed light on the observed selectivity between the two enzymes. These studies indicated a higher plasticity of MMP-2 at the S1' pocket and suggested an induced-fit effect at the "back door" of this pocket. On the basis of these observations, we designed 11 a-d to aid further discrimination between MMP-2 and MMP-9. Those compounds displayed notably lower inhibitory activities against MMP-9; in particular, 11 b proved to be over 100 times more active against MMP-2 than against MMP-9. MD simulations of the MMP/11 b complexes and thermodynamic integration calculations provided structural insight and relative binding energies consistent with the experimentally observed activity data. These findings demonstrate that structural differences in the S1' pocket bottom permit an improvement in selectivity in the inhibition of MMP-2 over that of MMP-9; this is of great relevance for future structure-based drug design because MMP-2 is a validated target for cancer therapy, whereas MMP-9 plays both detrimental and protective roles in cancer. This study also supports the need to consider the dynamics of the S1' pocket in order to achieve selectivity in the inhibition of MMPs. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. The selective value of computed tomography of the brain in Cerebritis due to systemic lupus erythematosus

    International Nuclear Information System (INIS)

    Gaylis, N.B.; Altman, R.D.; Ostrov, S.; Quencer, R.

    1982-01-01

    Systemic lupus erythematosus (SLE) and steroid effects on the brain were measured by computed tomography (CT). Of 14 patients with SLE cerebritis, 10 (71%) had marked cortical atrophy and 4 (29%) minimal atrophy. None were normal by CT. Controls included 22 patients with SLE without cerebritis receiving corticosteroids; this group had normal CT scans in 16 (73%) and minimal cortical atrophy in the remaining 6 (27%). Follow-up CT on 5 patients with cerebritis was unchanged. CT of the brain is a minimally invasive technique for documenting SLE cerebritis. CT may also help differentiate cerebritis from the neuropsychiatric side effects of corticosteroids.

  4. A Biologically Inspired Computational Model of Basal Ganglia in Action Selection.

    Science.gov (United States)

    Baston, Chiara; Ursino, Mauro

    2015-01-01

    The basal ganglia (BG) are a subcortical structure implicated in action selection. The aim of this work is to present a new cognitive neuroscience model of the BG, which aspires to represent a parsimonious balance between simplicity and completeness. The model includes the 3 main pathways operating in the BG circuitry, that is, the direct (Go), indirect (NoGo), and hyperdirect pathways. The main original aspects, compared with previous models, are the use of a two-term Hebb rule to train synapses in the striatum, based exclusively on neuronal activity changes caused by dopamine peaks or dips, and the role of the cholinergic interneurons (affected by dopamine themselves) during learning. Some examples are displayed, concerning a few paradigmatic cases: action selection in basal conditions, action selection in the presence of a strong conflict (where the role of the hyperdirect pathway emerges), synapse changes induced by phasic dopamine, and learning new actions based on a previous history of rewards and punishments. Finally, some simulations show model working in conditions of altered dopamine levels, to illustrate pathological cases (dopamine depletion in parkinsonian subjects or dopamine hypermedication). Due to its parsimonious approach, the model may represent a straightforward tool to analyze BG functionality in behavioral experiments.

  5. A Biologically Inspired Computational Model of Basal Ganglia in Action Selection

    Directory of Open Access Journals (Sweden)

    Chiara Baston

    2015-01-01

    The basal ganglia (BG) are a subcortical structure implicated in action selection. The aim of this work is to present a new cognitive neuroscience model of the BG, which aspires to represent a parsimonious balance between simplicity and completeness. The model includes the 3 main pathways operating in the BG circuitry, that is, the direct (Go), indirect (NoGo), and hyperdirect pathways. The main original aspects, compared with previous models, are the use of a two-term Hebb rule to train synapses in the striatum, based exclusively on neuronal activity changes caused by dopamine peaks or dips, and the role of the cholinergic interneurons (affected by dopamine themselves) during learning. Some examples are displayed, concerning a few paradigmatic cases: action selection in basal conditions, action selection in the presence of a strong conflict (where the role of the hyperdirect pathway emerges), synapse changes induced by phasic dopamine, and learning new actions based on a previous history of rewards and punishments. Finally, some simulations show the model working in conditions of altered dopamine levels, to illustrate pathological cases (dopamine depletion in parkinsonian subjects or dopamine hypermedication). Due to its parsimonious approach, the model may represent a straightforward tool to analyze BG functionality in behavioral experiments.

  6. A guide for the selection of computer assisted mapping (CAM) and facilities informations systems

    Energy Technology Data Exchange (ETDEWEB)

    Haslin, S.; Baxter, P.; Jarvis, L.

    1980-12-01

    Many distribution engineers are now aware that computer assisted mapping (CAM) and facilities information systems are probably the most significant breakthrough to date in computer applications for distribution engineering. The Canadian Electrical Association (CEA) recognized this and requested that engineers of B.C. Hydro study the state of the art in Canadian utilities and the progress of CAM systems on an international basis. The purpose was to provide a guide to assist Canadian utility distribution engineers faced with the problem of evaluating the application of CAM systems as an alternative to present methods, with consideration given to the long-term and other benefits that are perhaps not apparent to those approaching this field for the first time. It soon became apparent that the technology was developing at a high rate and that competition in the market was very strong. Also, a number of publications produced by other sources adequately covered the scope of this study. This report is thus a collection of references to reports, manuals, and other documents, with a few considerations provided for those companies interested in exploring further the use of interactive graphics. 24 refs.

  7. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  13. Biomimicry in Product Design through Materials Selection and Computer Aided Engineering

    Science.gov (United States)

    Alexandridis, G.; Tzetzis, D.; Kyratsis, P.

    2016-11-01

    The aim of this study is to demonstrate a 7-step methodology that describes the way nature can act as a source of inspiration for the design and development of a product. Furthermore, it suggests dedicated computer-based tools and methods for optimizing the product with regard to its environmental impact, i.e. material selection and production methods. For validation purposes, a garden chaise lounge that imitates the form of a scorpion was developed as a case study and a demonstration of the current methodology.

  14. Computed tomography of the breast: a valuable adjunct to mammography in selected cases

    International Nuclear Information System (INIS)

    Van Gelderen, W.F.C.; Tygerberg Hospital, Cape Town

    1995-01-01

    Computed tomography (CT) is not often used for the further assessment of a mass in the breast. In the case presented, it proved invaluable in demonstrating a very posterior breast mass that had previously been suboptimally demonstrated at mammography, was not palpable clinically, and was not visualized on ultrasound examination. The predicament of a very posterior breast mass is highlighted, and it is suggested that present-day routine cranio-caudal and oblique views may not be adequate to show such a mass even if meticulous attention is given to radiographic technique. If the supine-oblique view with balloon compression cannot be obtained, CT in the prone position with the breasts dependent may be the best alternative. Fine needle aspiration biopsy can be performed under CT guidance in such cases. 4 refs., 3 figs

  15. Dopamine selectively remediates 'model-based' reward learning: a computational approach.

    Science.gov (United States)

    Sharp, Madeleine E; Foerde, Karin; Daw, Nathaniel D; Shohamy, Daphna

    2016-02-01

    Patients with loss of dopamine due to Parkinson's disease are impaired at learning from reward. However, it remains unknown precisely which aspect of learning is impaired. In particular, learning from reward, or reinforcement learning, can be driven by two distinct computational processes. One involves habitual stamping-in of stimulus-response associations, hypothesized to arise computationally from 'model-free' learning. The other, 'model-based' learning, involves learning a model of the world that is believed to support goal-directed behaviour. Much work has pointed to a role for dopamine in model-free learning. But recent work suggests model-based learning may also involve dopamine modulation, raising the possibility that model-based learning may contribute to the learning impairment in Parkinson's disease. To directly test this, we used a two-step reward-learning task which dissociates model-free versus model-based learning. We evaluated learning in patients with Parkinson's disease tested ON versus OFF their dopamine replacement medication and in healthy controls. Surprisingly, we found no effect of disease or medication on model-free learning. Instead, we found that patients tested OFF medication showed a marked impairment in model-based learning, and that this impairment was remediated by dopaminergic medication. Moreover, model-based learning was positively correlated with a separate measure of working memory performance, raising the possibility of common neural substrates. Our results suggest that some learning deficits in Parkinson's disease may be related to an inability to pursue reward based on complete representations of the environment. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Computing a ground appropriateness index for route selection in permafrost regions

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2017-10-01

    Reasonable calculation of a ground appropriateness index for permafrost regions is a precondition for highway route design in such regions. In this paper, knowledge-base theory and fuzzy mathematics are applied and the damaging effect of permafrost is considered. Based on the idea of protecting permafrost, a calculation method for the ground appropriateness index is put forward. Firstly, based on the actual environmental conditions, the paper determines the factors affecting road layout in permafrost areas through qualitative and quantitative analysis, including the annual slope, the average annual ground temperature of the permafrost, the ice content of the frozen soil, and interfering engineering works. Secondly, based on knowledge-base theory and using the Delphi method, the paper establishes the knowledge base, the rule base for the permafrost region, and the inference mechanism. The method for selecting the route in the permafrost region is completed and realized on the software platform. Thirdly, taking the Tuotuo River to Kaixin Mountain section of the permafrost region as an example, the application of the method is studied using an ArcGIS platform. Results show that the route plan determined by this method can avoid areas of high temperature and high ice content, conform to terrain changes, and avoid thermal disturbance from existing engineering works. A reasonable route plan can be achieved, and it can provide the basis for subsequent engineering construction.

  17. A Very Compact AES-SPIHT Selective Encryption Computer Architecture Design with Improved S-Box

    Directory of Open Access Journals (Sweden)

    Jia Hao Kong

    2013-01-01

    The “S-box” algorithm is a key component in the Advanced Encryption Standard (AES) due to its nonlinear property. Various implementation approaches have been researched and discussed to meet stringent application goals (such as low power, high throughput and low area), but the ultimate goal for many researchers is to find a compact and small hardware footprint for the S-box circuit. In this paper, we present our version of a minimized S-box with two separate proposals and improvements in the overall gate count. The compact S-box is adopted with a compact and optimum processor architecture specifically tailored for the AES, namely, the compact instruction set architecture (CISA). To further justify and strengthen the purpose of the compact crypto-processor’s application, we have also presented a selective encryption architecture (SEA) which incorporates the CISA as a part of the encryption core, accompanied by the set partitioning in hierarchical trees (SPIHT) algorithm as a complete selective encryption system.
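
    For reference, the nonlinearity that makes the S-box the natural target of such optimization comes from its definition: a multiplicative inverse in GF(2^8) followed by an affine transform. A plain software sketch of that definition (not the compact hardware circuit proposed in the paper) is:

```python
def gf_mul(a: int, b: int) -> int:
    """Multiplication in GF(2^8) with the AES reduction polynomial x^8+x^4+x^3+x+1."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B
        b >>= 1
    return p

def gf_inv(a: int) -> int:
    """Multiplicative inverse in GF(2^8); 0 maps to 0 by convention."""
    if a == 0:
        return 0
    return next(x for x in range(1, 256) if gf_mul(a, x) == 1)

def sbox(x: int) -> int:
    """AES SubBytes: inverse in GF(2^8) followed by the affine transform."""
    b = gf_inv(x)
    rotl = lambda v, r: ((v << r) | (v >> (8 - r))) & 0xFF
    return b ^ rotl(b, 1) ^ rotl(b, 2) ^ rotl(b, 3) ^ rotl(b, 4) ^ 0x63

print(hex(sbox(0x53)))   # 0xed, matching the SubBytes example in FIPS-197
```

    Compact hardware implementations typically compute the inverse with composite-field arithmetic rather than a lookup or brute force; the table-free software form above is shown only to make the underlying mathematics explicit.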

  18. Computer aided diagnosis system for Alzheimer disease using brain diffusion tensor imaging features selected by Pearson's correlation.

    Science.gov (United States)

    Graña, M; Termenon, M; Savio, A; Gonzalez-Pinto, A; Echeveste, J; Pérez, J M; Besga, A

    2011-09-20

    The aim of this paper is to obtain discriminant features from two scalar measures of Diffusion Tensor Imaging (DTI) data, Fractional Anisotropy (FA) and Mean Diffusivity (MD), and to train and test classifiers able to discriminate Alzheimer's Disease (AD) patients from controls on the basis of features extracted from the FA or MD volumes. In this study, a support vector machine (SVM) classifier was trained and tested on FA and MD data. Feature selection is done by computing Pearson's correlation between FA or MD values at each voxel site across subjects and the indicator variable specifying the subject class. Voxel sites with high absolute correlation are selected for feature extraction. Results are obtained over an ongoing study in Hospital de Santiago Apostol collecting anatomical T1-weighted MRI volumes and DTI data from healthy control subjects and AD patients. FA features and a linear SVM classifier achieve perfect accuracy, sensitivity and specificity in several cross-validation studies, supporting the usefulness of DTI-derived features as an image marker for AD and the feasibility of building Computer Aided Diagnosis systems for AD based on them. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
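
    The selection step described above amounts to correlating every voxel with the class label and keeping high-|r| voxels before training a linear SVM. A minimal sketch with synthetic data follows; the sizes and threshold are illustrative, and in practice the selection would be nested inside the cross-validation.

        # Sketch of Pearson-correlation voxel selection followed by a linear SVM.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 5000))          # subjects x voxels (e.g. FA values)
        y = rng.integers(0, 2, size=40)          # 0 = control, 1 = AD

        # Pearson correlation of every voxel with the class indicator.
        Xc = X - X.mean(axis=0)
        yc = y - y.mean()
        r = (Xc.T @ yc) / (np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum()))

        selected = np.abs(r) > 0.4               # threshold on |r|; illustrative value
        # (In a real study this selection should be nested inside the cross-validation.)
        clf = SVC(kernel="linear")
        scores = cross_val_score(clf, X[:, selected], y, cv=5)
        print(selected.sum(), "voxels selected, CV accuracy:", scores.mean())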

  19. COPD phenotypes on computed tomography and its correlation with selected lung function variables in severe patients

    Directory of Open Access Journals (Sweden)

    da Silva SMD

    2016-03-01

    Full Text Available Silvia Maria Doria da Silva, Ilma Aparecida Paschoal, Eduardo Mello De Capitani, Marcos Mello Moreira, Luciana Campanatti Palhares, Mônica Corso Pereira; Pneumology Service, Department of Internal Medicine, School of Medical Sciences, State University of Campinas (UNICAMP), Campinas, São Paulo, Brazil. Background: Computed tomography (CT) phenotypic characterization helps in understanding the clinical diversity of chronic obstructive pulmonary disease (COPD) patients, but its clinical relevance and its relationship with functional features are not clarified. Volumetric capnography (VC) uses the principle of gas washout and analyzes the pattern of CO2 elimination as a function of expired volume. The main variables analyzed were end-tidal concentration of carbon dioxide (ETCO2), Slope of phase 2 (Slp2), and Slope of phase 3 (Slp3) of the capnogram, the curve which represents the total amount of CO2 eliminated by the lungs during each breath. Objective: To investigate, in a group of patients with severe COPD, whether phenotypic analysis by CT could identify different subsets of patients, and whether there was an association between CT findings and functional variables. Subjects and methods: Sixty-five patients with COPD GOLD III–IV were admitted for clinical evaluation, high-resolution CT, and functional evaluation (spirometry, 6-minute walk test [6MWT], and VC). The presence and profusion of tomography findings were evaluated, and later, the patients were identified as having an emphysema (EMP) or airway disease (AWD) phenotype. EMP and AWD groups were compared; tomography finding scores were evaluated versus spirometric, 6MWT, and VC variables. Results: Bronchiectasis was found in 33.8% and peribronchial thickening in 69.2% of the 65 patients. Structural findings of the airways had no significant correlation with spirometric variables. Air trapping and EMP were strongly correlated with VC variables, but in opposite directions. There was some overlap between the EMP and AWD

  20. An analysis of file system and installation of the file management system for NOS operating system

    International Nuclear Information System (INIS)

    Lee, Young Jai; Park, Sun Hee; Hwang, In Ah; Kim, Hee Kyung

    1992-06-01

    In this technical report, we analyze the NOS file structure for the Cyber 170-875 and Cyber 960-31 computer systems. We also describe the functions, procedures, operation and use of VDS. VDS is used to manage large files effectively on the Cyber computer system. The purpose of the VDS installation is to increase the virtual disk storage by utilizing magnetic tape, to assist the users of the computer system in managing their files, and to enhance the performance of the KAERI Cyber computer system. (Author)

  1. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  2. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  4. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  5. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  8. Optogenetic stimulation in a computational model of the basal ganglia biases action selection and reward prediction error.

    Science.gov (United States)

    Berthet, Pierre; Lansner, Anders

    2014-01-01

    Optogenetic stimulation of specific types of medium spiny neurons (MSNs) in the striatum has been shown to bias action selection in mice in a two-choice task. This shift depends on the location and intensity of the stimulation, but also on the recent reward history. We have implemented a way to simulate, in our computational model of the basal ganglia (BG), the increased activity produced by the optical flash. This abstract model features the direct and indirect pathways commonly described in biology, and a reward prediction pathway (RP). The framework is similar to Actor-Critic methods and to the ventral/dorsal distinction in the striatum. We thus investigated the impact on selection of an added stimulation in each of the three pathways. We were able to reproduce in our model the bias in action selection observed in mice. Our results also showed that biasing the reward prediction is sufficient to create a modification in the action selection. However, we had to increase the percentage of trials with stimulation relative to that in experiments in order to impact the selection. We found that increasing only the reward prediction had a different effect if the stimulation in RP was action dependent (only for a specific action) or not. We further looked at the evolution of the change in the weights depending on the stage of learning within a block. A bias in RP impacts the plasticity differently depending on that stage but also on the outcome. It remains to experimentally test how the dopaminergic neurons are affected by specific stimulations of neurons in the striatum and to relate data to predictions of our model.
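
    A much-simplified stand-in for this kind of model is a tabular actor-critic on a two-choice task in which "stimulation" adds activity to one action channel. The sketch below only illustrates how such a bias shifts choice; it is not the authors' basal ganglia model, and all parameters are illustrative.

        # Highly simplified stand-in: tabular actor-critic on a two-choice task, with
        # "optogenetic stimulation" mimicked by adding a constant to one action's
        # propensity (a direct-pathway-like bias). Parameters are illustrative.
        import random, math

        random.seed(0)
        alpha, gamma_stim, p_reward = 0.1, 0.5, (0.8, 0.2)   # learning rate, stim strength, reward probs
        actor = [0.0, 0.0]                                    # action propensities
        critic = 0.0                                          # state value (single state)
        choices = []

        for trial in range(2000):
            stim_on = trial % 5 == 0                          # stimulate action 0 on 20% of trials
            act = [a + (gamma_stim if (stim_on and i == 0) else 0.0) for i, a in enumerate(actor)]
            p0 = 1.0 / (1.0 + math.exp(-(act[0] - act[1])))   # softmax over the two actions
            a = 0 if random.random() < p0 else 1
            r = 1.0 if random.random() < p_reward[a] else 0.0
            rpe = r - critic                                  # reward prediction error (RP pathway)
            critic += alpha * rpe
            actor[a] += alpha * rpe
            choices.append(a)

        print("P(choose action 0) over last 500 trials:", 1 - sum(choices[-500:]) / 500)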

  9. Selective tuberculosis incidence estimation by digital computer information technologies in the MS Excel system

    Directory of Open Access Journals (Sweden)

    G. I. Ilnitsky

    2014-01-01

    Full Text Available The incidence of tuberculosis was estimated in different age groups using digital computer information technologies for tracking. For this, the author used the annual reporting forms stipulated by the Ministry of Health of Ukraine, the results of his own observations, and data accumulated in an information bank in the MS Excel system. The baseline was formed from the epidemiological indicators of Ukraine and the Lvov Region over a 10-year period (2000-2009), which, owing to different initial characteristics, was divided into Step 1 (2000-2004), in which the tuberculosis epidemic situation progressively deteriorated, and Step 2 (2005-2009), in which morbidity was relatively stabilized. The results were processed using the parametric and nonparametric statistical and mathematical functions of MS Excel to establish correlations when estimating changes in the epidemic parameters. The findings in the general population lead to the conclusion that mean tuberculosis morbidity in Ukraine was much greater than that in the Lvov Region, irrespective of age. At the same time, the morbidity rate in foci of tuberculosis infection rose among children, adolescents, and adults alike, which provides a rationale for better implementation of therapeutic and preventive measures.

  10. Criteria for the selective use of chest computed tomography in blunt trauma patients

    Energy Technology Data Exchange (ETDEWEB)

    Brink, Monique; Dekker, Helena M.; Kool, Digna R.; Blickman, Johan G. [Radboud University Nijmegen, Medical Centre, Department of Radiology, Nijmegen (Netherlands); Deunk, Jaap; Edwards, Michael J.R. [Radboud University Nijmegen, Medical Centre, Department of Surgery, Nijmegen (Netherlands); Vugt, Arie B. van [Radboud University Nijmegen, Medical Centre Nijmegen, Department of Emergency Medicine, Nijmegen (Netherlands); Kuijk, Cornelis van [VU (Vrije Universiteit) University, Medical Center Amsterdam, Department of Radiology, Amsterdam (Netherlands)

    2010-04-15

    The purpose of this study was to derive parameters that predict which high-energy blunt trauma patients should undergo computed tomography (CT) for detection of chest injury. This observational study prospectively included consecutive patients (≥16 years old) who underwent multidetector CT of the chest after a high-energy mechanism of blunt trauma in one trauma centre. We included 1,047 patients (median age, 37; 70% male), of whom 508 had chest injuries identified by CT. Using logistic regression, we identified nine predictors of chest injury presence on CT (age ≥55 years, abnormal chest physical examination, altered sensorium, abnormal thoracic spine physical examination, abnormal chest conventional radiography (CR), abnormal thoracic spine CR, abnormal pelvic CR or abdominal ultrasound, base excess <-3 mmol/l and haemoglobin <6 mmol/l). Of 855 patients with ≥1 positive predictors, 484 had injury on CT (95% of all 508 patients with injury). Of all 192 patients with no positive predictor, 24 (13%) had chest injury, of whom 4 (2%) had injuries that were considered clinically relevant. Omission of CT in patients without any positive predictor could reduce imaging frequency by 18%, while most clinically relevant chest injuries remain adequately detected. (orig.)
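
    The resulting rule has a very simple structure: scan whenever at least one of the nine predictors is positive. A toy checklist version is shown below; the key names paraphrase the abstract, and this illustrates only the form of the rule, not a validated clinical tool.

        # Toy checklist version of the "any positive predictor -> chest CT" rule above.
        # Key names paraphrase the abstract; this is illustrative only.
        PREDICTORS = [
            "age_ge_55", "abnormal_chest_exam", "altered_sensorium",
            "abnormal_tspine_exam", "abnormal_chest_radiograph", "abnormal_tspine_radiograph",
            "abnormal_pelvic_xray_or_abdominal_us", "base_excess_lt_minus3", "hemoglobin_lt_6",
        ]

        def recommend_chest_ct(findings):
            """Recommend CT when at least one of the nine predictors is positive."""
            return any(findings.get(p, False) for p in PREDICTORS)

        print(recommend_chest_ct({"age_ge_55": False, "abnormal_chest_radiograph": True}))  # True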

  11. Criteria for the selective use of chest computed tomography in blunt trauma patients

    International Nuclear Information System (INIS)

    Brink, Monique; Dekker, Helena M.; Kool, Digna R.; Blickman, Johan G.; Deunk, Jaap; Edwards, Michael J.R.; Vugt, Arie B. van; Kuijk, Cornelis van

    2010-01-01

    The purpose of this study was to derive parameters that predict which high-energy blunt trauma patients should undergo computed tomography (CT) for detection of chest injury. This observational study prospectively included consecutive patients (≥16 years old) who underwent multidetector CT of the chest after a high-energy mechanism of blunt trauma in one trauma centre. We included 1,047 patients (median age, 37; 70% male), of whom 508 had chest injuries identified by CT. Using logistic regression, we identified nine predictors of chest injury presence on CT (age ≥55 years, abnormal chest physical examination, altered sensorium, abnormal thoracic spine physical examination, abnormal chest conventional radiography (CR), abnormal thoracic spine CR, abnormal pelvic CR or abdominal ultrasound, base excess <-3 mmol/l and haemoglobin <6 mmol/l). Of 855 patients with ≥1 positive predictors, 484 had injury on CT (95% of all 508 patients with injury). Of all 192 patients with no positive predictor, 24 (13%) had chest injury, of whom 4 (2%) had injuries that were considered clinically relevant. Omission of CT in patients without any positive predictor could reduce imaging frequency by 18%, while most clinically relevant chest injuries remain adequately detected. (orig.)

  12. Multivariate Feature Selection of Image Descriptors Data for Breast Cancer with Computer-Assisted Diagnosis

    Directory of Open Access Journals (Sweden)

    Carlos E. Galván-Tejada

    2017-02-01

    Full Text Available Breast cancer is an important global health problem, and the most common type of cancer among women. Late diagnosis significantly decreases the survival rate of the patient; however, using mammography for early detection has been demonstrated to be a very important tool increasing the survival rate. The purpose of this paper is to obtain a multivariate model to classify benign and malignant tumor lesions using a computer-assisted diagnosis with a genetic algorithm in training and test datasets from mammography image features. A multivariate search was conducted to obtain predictive models with different approaches, in order to compare and validate results. The multivariate models were constructed using: Random Forest, Nearest centroid, and K-Nearest Neighbor (K-NN) strategies as cost function in a genetic algorithm applied to the features in the BCDR public databases. Results suggest that the two texture descriptor features obtained in the multivariate model have a similar or better prediction capability to classify the data outcome compared with the multivariate model composed of all the features, according to their fitness value. This model can help to reduce the workload of radiologists and present a second opinion in the classification of tumor lesions.

  13. Multivariate Feature Selection of Image Descriptors Data for Breast Cancer with Computer-Assisted Diagnosis.

    Science.gov (United States)

    Galván-Tejada, Carlos E; Zanella-Calzada, Laura A; Galván-Tejada, Jorge I; Celaya-Padilla, José M; Gamboa-Rosales, Hamurabi; Garza-Veloz, Idalia; Martinez-Fierro, Margarita L

    2017-02-14

    Breast cancer is an important global health problem, and the most common type of cancer among women. Late diagnosis significantly decreases the survival rate of the patient; however, using mammography for early detection has been demonstrated to be a very important tool increasing the survival rate. The purpose of this paper is to obtain a multivariate model to classify benign and malignant tumor lesions using a computer-assisted diagnosis with a genetic algorithm in training and test datasets from mammography image features. A multivariate search was conducted to obtain predictive models with different approaches, in order to compare and validate results. The multivariate models were constructed using: Random Forest, Nearest centroid, and K-Nearest Neighbor (K-NN) strategies as cost function in a genetic algorithm applied to the features in the BCDR public databases. Results suggest that the two texture descriptor features obtained in the multivariate model have a similar or better prediction capability to classify the data outcome compared with the multivariate model composed of all the features, according to their fitness value. This model can help to reduce the workload of radiologists and present a second opinion in the classification of tumor lesions.
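
    The wrapper idea described in these two records can be sketched as a small genetic algorithm over boolean feature masks scored by K-NN cross-validation accuracy. The data, GA settings and size penalty below are illustrative assumptions, not the BCDR pipeline.

        # Compact sketch of GA-based wrapper feature selection with a K-NN fitness.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X, y = make_classification(n_samples=200, n_features=30, n_informative=5, random_state=1)

        def fitness(mask):
            if not mask.any():
                return 0.0
            acc = cross_val_score(KNeighborsClassifier(5), X[:, mask], y, cv=3).mean()
            return acc - 0.002 * mask.sum()          # small penalty favouring fewer features

        pop = rng.random((20, X.shape[1])) < 0.3     # initial population of feature masks
        for gen in range(15):
            scores = np.array([fitness(m) for m in pop])
            parents = pop[np.argsort(scores)[::-1][:10]]        # truncation selection
            children = []
            for _ in range(10):
                a, b = parents[rng.integers(10)], parents[rng.integers(10)]
                cut = rng.integers(1, X.shape[1])
                child = np.concatenate([a[:cut], b[cut:]])      # one-point crossover
                flip = rng.random(X.shape[1]) < 0.02            # mutation
                children.append(np.logical_xor(child, flip))
            pop = np.vstack([parents, children])

        best = pop[np.argmax([fitness(m) for m in pop])]
        print("selected features:", np.flatnonzero(best))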

  14. Binary particle swarm optimization for frequency band selection in motor imagery based brain-computer interfaces.

    Science.gov (United States)

    Wei, Qingguo; Wei, Zhonghai

    2015-01-01

    A brain-computer interface (BCI) enables people suffering from affective neurological diseases to communicate with the external world. Common spatial pattern (CSP) is an effective algorithm for feature extraction in motor imagery based BCI systems. However, many studies have proved that the performance of CSP depends heavily on the frequency band of EEG signals used for the construction of covariance matrices. The use of different frequency bands to extract signal features may lead to different classification performances, which are determined by the discriminative and complementary information they contain. In this study, the broad frequency band (8-30 Hz) is divided into 10 sub-bands, each 4 Hz wide and overlapping its neighbours by 2 Hz. Binary particle swarm optimization (BPSO) is used to find the best sub-band set to improve the performance of CSP and subsequent classification. Experimental results demonstrate that the proposed method achieved an average improvement of 6.91% in cross-validation accuracy when compared to broad band CSP.
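
    A sketch of the band layout and of the sigmoid position update used in binary PSO is given below. The fitness function is a placeholder standing in for CSP feature extraction plus classifier cross-validation on the selected bands; all settings are illustrative.

        # Sketch of the 10-sub-band layout and a binary-PSO search over band subsets.
        import numpy as np

        rng = np.random.default_rng(0)
        SUBBANDS = [(lo, lo + 4) for lo in range(8, 28, 2)]    # 8-12, 10-14, ..., 26-30 Hz

        def fitness(mask):
            # Placeholder: pretend mid-range bands carry the discriminative mu/beta information.
            hits = sum(1.0 for (lo, hi), m in zip(SUBBANDS, mask) if m and 10 <= lo <= 22)
            return hits - 0.1 * float(mask.sum())

        n, dims = 12, len(SUBBANDS)
        x = (rng.random((n, dims)) < 0.5).astype(float)        # particle positions: band subsets as 0/1
        v = rng.normal(0.0, 1.0, (n, dims))                    # velocities
        pbest = x.copy()
        pbest_f = np.array([fitness(m) for m in x])
        gbest = pbest[np.argmax(pbest_f)].copy()

        for it in range(30):
            r1, r2 = rng.random((n, dims)), rng.random((n, dims))
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
            x = (rng.random((n, dims)) < 1.0 / (1.0 + np.exp(-v))).astype(float)  # sigmoid rule of BPSO
            f = np.array([fitness(m) for m in x])
            improved = f > pbest_f
            pbest[improved] = x[improved]
            pbest_f[improved] = f[improved]
            gbest = pbest[np.argmax(pbest_f)].copy()

        print("selected bands:", [SUBBANDS[i] for i in np.flatnonzero(gbest)])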

  15. Recommendations for computer code selection of a flow and transport code to be used in undisturbed vadose zone calculations for TWRS immobilized wastes environmental analyses

    International Nuclear Information System (INIS)

    VOOGD, J.A.

    1999-01-01

    An analysis of three software proposals is performed to recommend a computer code for immobilized low activity waste flow and transport modeling. The document uses criteria established in HNF-1839, ''Computer Code Selection Criteria for Flow and Transport Codes to be Used in Undisturbed Vadose Zone Calculation for TWRS Environmental Analyses'', as the basis for this analysis.

  16. Accuracy of computer-calculated and manual QRS duration assessments: Clinical implications to select candidates for cardiac resynchronization therapy.

    Science.gov (United States)

    De Pooter, Jan; El Haddad, Milad; Stroobandt, Roland; De Buyzere, Marc; Timmermans, Frank

    2017-06-01

    QRS duration (QRSD) plays a key role in the field of cardiac resynchronization therapy (CRT). Computer-calculated QRSD assessments are widely used; however, inter-manufacturer differences have not been investigated in CRT candidates. QRSD was assessed in 377 digitally stored ECGs: 139 narrow QRS, 140 LBBB and 98 ventricular paced ECGs. Manual QRSD was measured as global QRSD, using digital calipers, by two independent observers. Computer-calculated QRSD was assessed by Marquette 12SL (GE Healthcare, Waukesha, WI, USA) and SEMA3 (Schiller, Baar, Switzerland). Inter-manufacturer differences of computer-calculated QRSD assessments vary among different QRS morphologies: narrow QRSD: 4 [2-9] ms (median [IQR]), p=0.010; LBBB QRSD: 7 [2-10] ms, p=0.003 and paced QRSD: 13 [6-18] ms, p=0.007. Interobserver differences of manual QRSD assessments measured: narrow QRSD: 4 [2-6] ms, p=non-significant; LBBB QRSD: 6 [3-12] ms, p=0.006; paced QRSD: 8 [4-18] ms, p=0.001. In LBBB ECGs, intraclass correlation coefficients (ICCs) were comparable for inter-manufacturer and interobserver agreement (ICC 0.830 versus 0.837). When assessing paced QRSD, manual measurements showed higher ICC compared to inter-manufacturer agreement (ICC 0.902 versus 0.776). Using guideline cutoffs of 130 ms, up to 15% of the LBBB ECGs would be misclassified as <130 ms or ≥130 ms by at least one method. Using a cutoff of 150 ms, this number increases to 33% of ECGs being misclassified. However, by combining LBBB-morphology and QRSD, the number of misclassified ECGs can be decreased by half. Inter-manufacturer differences in computer-calculated QRSD assessments are significant and may compromise adequate selection of individual CRT candidates when using QRSD as sole parameter. Paced QRSD should preferentially be assessed by manual QRSD measurements. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Computational and experimental investigation of molecular imprinted polymers for selective extraction of dimethoate and its metabolite omethoate from olive oil.

    Science.gov (United States)

    Bakas, Idriss; Oujji, Najwa Ben; Moczko, Ewa; Istamboulie, Georges; Piletsky, Sergey; Piletska, Elena; Ait-Addi, Elhabib; Ait-Ichou, Ihya; Noguer, Thierry; Rouillon, Régis

    2013-01-25

    This work presents the development of molecularly imprinted polymers (MIPs) for the selective extraction of dimethoate from olive oil. Computational simulations allowed the selection of itaconic acid (IA) as the monomer showing the highest affinity towards dimethoate. Experimental validation confirmed the modelling predictions and showed that the polymer based on IA as functional monomer and omethoate as template molecule displays the highest selectivity for the structurally similar pesticides dimethoate, omethoate and monocrotophos. A molecularly imprinted solid phase extraction (MISPE) method was developed and applied to the clean-up of olive oil extracts. It was found that the most suitable solvents for the loading, washing and elution steps were respectively hexane, hexane-dichloromethane (85:15%) and methanol. The developed MISPE was successfully applied to the extraction of dimethoate from olive oil, with recovery rates up to 94%. The limits of detection and quantification of the described method were respectively 0.012 and 0.05 μg g(-1). Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Selection of nonessential intravenous contrast enhanced-computed tomography for diagnosing acute appendicitis

    International Nuclear Information System (INIS)

    Kondo, Naoko; Kitagawa, Yoshimi; Satake, Tatsunari; Mayumi, Toshihiko; Kohno, Hiroshi

    2007-01-01

    Since computed tomography (CT) has made acute appendicitis increasingly easy to diagnose correctly, intravenous contrast-enhanced CT (IV-CT) is increasingly used for this diagnosis. The purpose of this study was to clarify the indications for IV-CT and to eliminate unnecessary IV-CT. We studied whether IV-CT is necessary in all patients suspected of acute appendicitis, given the interval between onset and clinical diagnosis. IV-CT was performed in patients who had right lower quadrant abdominal pain or who had no pain but physical findings in the right lower quadrant of the abdomen. We reviewed the detailed medical records of 171 consecutive patients who underwent IV-CT followed by appendectomy within 24 hr. We compared Blumberg's sign, muscle guarding, body temperature, white blood cell count, and C-reactive protein, dividing patients into 3 groups: half-a-day, in which the interval between onset and initial diagnosis was shorter than half a day; 1-day, in which the interval was longer than half a day but shorter than 1 day; and multiple-day, in which the interval exceeded 1 day. We also analyzed IV-CT findings for an abnormal appendix and the number of positive individual CT findings, including abnormal appendix, calcified appendicolith, ascites, cecal wall thickening, and dilated intestines. Muscle guarding was significantly more common in the patients who had appendicitis among 1-day and multiple-day patients. On IV-CT, an enlarged appendix was observed more frequently in those with appendicitis in all 3 groups. Positive individual CT findings were detected more often in multiple-day patients who had appendicitis. We found no significant difference among the other items. A patient diagnosed clinically later than half a day after onset and having muscle guarding should be strongly suspected of having acute appendicitis, indicating that IV-CT is not needed in such patients. (author)

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation is now deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  20. Feature Selection and Blind Source Separation in an EEG-Based Brain-Computer Interface

    Directory of Open Access Journals (Sweden)

    Michael H. Thaut

    2005-11-01

    Full Text Available Most EEG-based BCI systems make use of well-studied patterns of brain activity. However, those systems involve tasks that indirectly map to simple binary commands such as “yes” or “no” or require many weeks of biofeedback training. We hypothesized that signal processing and machine learning methods can be used to discriminate EEG in a direct “yes”/“no” BCI from a single session. Blind source separation (BSS) and spectral transformations of the EEG produced a 180-dimensional feature space. We used a modified genetic algorithm (GA) wrapped around a support vector machine (SVM) classifier to search the space of feature subsets. The GA-based search found feature subsets that outperform full feature sets and random feature subsets. Also, BSS transformations of the EEG outperformed the original time series, particularly in conjunction with a subset search of both spaces. The results suggest that BSS and feature selection can be used to improve the performance of even a “direct,” single-session BCI.

  1. Palladium-catalyzed meta-selective C-H bond activation with a nitrile-containing template: computational study on mechanism and origins of selectivity.

    Science.gov (United States)

    Yang, Yun-Fang; Cheng, Gui-Juan; Liu, Peng; Leow, Dasheng; Sun, Tian-Yu; Chen, Ping; Zhang, Xinhao; Yu, Jin-Quan; Wu, Yun-Dong; Houk, K N

    2014-01-08

    Density functional theory investigations have elucidated the mechanism and origins of meta-regioselectivity of Pd(II)-catalyzed C-H olefinations of toluene derivatives that employ a nitrile-containing template. The reaction proceeds through four major steps: C-H activation, alkene insertion, β-hydride elimination, and reductive elimination. The C-H activation step, which proceeds via a concerted metalation-deprotonation (CMD) pathway, is found to be the rate- and regioselectivity-determining step. For the crucial C-H activation, four possible active catalytic species-monomeric Pd(OAc)2, dimeric Pd2(OAc)4, heterodimeric PdAg(OAc)3, and trimeric Pd3(OAc)6-have been investigated. The computations indicated that the C-H activation with the nitrile-containing template occurs via a Pd-Ag heterodimeric transition state. The nitrile directing group coordinates with Ag while the Pd is placed adjacent to the meta-C-H bond in the transition state, leading to the observed high meta-selectivity. The Pd2(OAc)4 dimeric mechanism also leads to the meta-C-H activation product but with higher activation energies than the Pd-Ag heterodimeric mechanism. The Pd monomeric and trimeric mechanisms require much higher activation free energies and are predicted to give ortho products. Structural and distortion energy analysis of the transition states revealed significant effects of distortions of the template on mechanism and regioselectivity, which provided hints for further developments of new templates.

  2. JENDL special purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1995-01-01

    In JENDL-3.2, data on all reactions having significant cross sections over neutron energies from 0.01 meV to 20 MeV are given for 340 nuclides. The range of application is wide, covering neutron engineering and shielding for fast reactors, thermal neutron reactors and nuclear fusion reactors, among other fields. This is a general purpose data file. In contrast, a file that collects only the data required for a specific application field is called a special purpose file. The file for dosimetry is a typical special purpose file. The Nuclear Data Center, Japan Atomic Energy Research Institute, is preparing ten kinds of JENDL special purpose files. The files for which the working groups of the Sigma Committee are responsible are listed. As to the format of the files, the ENDF format is used, as in JENDL-3.2. The dosimetry file, activation cross section file, (α, n) reaction data file, fusion file, actinoid file, high energy data file, photonuclear data file, PKA/KERMA file, gas production cross section file and decay data file are described in terms of their contents, course of development and verification. The dosimetry file and the gas production cross section file have already been completed. For the others, the expected time of completion is given. When these files are completed, they will be made available to the public. (K.I.)

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  4. Correlated responses in tissue weights measured in vivo by computer tomography in Dorset Down sheep selected for lean tissue growth

    International Nuclear Information System (INIS)

    Nsoso, S.J.; Young, M.J.; Beatson, P.R.

    2003-01-01

    The aim of this study was to estimate correlated responses in lean, fat and bone weights in vivo in Dorset Down sheep selected for lean tissue growth. Over the period 1986-1992 inclusive, the lean tissue growth line had been selected using two economic indices for an increased aggregate breeding value incorporating predicted lean and fat weights with positive and negative economic weightings, respectively. The control line was selected for no change in lean tissue growth each year. Animals were born and run on pasture all year round. X-ray computer tomography was used to estimate the weights of lean, fat and bone in vivo in the 1994-born sheep, aged 265-274 days and selected randomly into 12 rams and 12 ewes from the selected line and 10 rams and 9 ewes from the control line. The lean tissue growth line had significantly greater responses in lean weight (+0.65 ± 0.10 kg) and lean percentage (+1.19 ± 0.17%) and significantly lesser fat weight (-0.36 ± 0.08 kg) and fat percentage (-1.88 ± 0.20%) compared to the control line. There was a significant increase in bone weight (+0.27 ± 0.03 kg) and bone percentage (+0.69 ± 0.09%) in the lean tissue growth line compared to the control line. Responses differed significantly between sexes of the lean tissue growth line, rams having a greater response in weight of lean (+1.22 ± 0.20 vs. +0.08 ± 0.22 kg) and bone (+0.45 ± 0.06 vs. +0.09 ± 0.07 kg), and a lesser response in weight of fat (-0.03 ± 0.15 vs. -0.70 ± 0.16 kg) than the ewes. Selection led to significant changes in lean (increase) and fat weights (decrease), and bone weight increased. Although responses in the lean tissue growth line differed significantly between sexes, there were confounding factors due to differences in management and lack of comparison at equal stage of development. Therefore, to assess real genetic differences further studies should be conducted taking these factors into consideration

  5. Detection Of Alterations In Audio Files Using Spectrograph Analysis

    Directory of Open Access Journals (Sweden)

    Anandha Krishnan G

    2015-08-01

    Full Text Available This study was carried out to detect alterations in audio files using spectrograph analysis. An audio file format is a file format for storing digital audio data on a computer system. A sound spectrograph is a laboratory instrument that displays a graphical representation of the strengths of the various component frequencies of a sound as time passes. The objectives of the study were to find the changes in the spectrograph of an audio file after it had been altered, to compare these changes with the spectrograph of the original file, and to check for similarities and differences between MP3 and WAV formats. Five different alterations were carried out on each audio file to analyze the differences between the original and the altered file. To alter an MP3 or WAV file by cut-and-copy, the file was opened in Audacity and a different audio segment was pasted into it; the new file was then analyzed to view the differences. Noise was reduced by adjusting the necessary parameters, and the differences between the new file and the original were analyzed. Further changes were made by adjusting parameters from the dialog box. Each edited audio file was opened in the software Spek, which after analysis produces a graph of that particular file; the graph was saved for further analysis. The graph of the original audio was combined with the graph of the edited audio file to see the alterations.
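
    The comparison itself can be automated. The sketch below computes spectrograms of an original and an edited WAV file with SciPy and flags frames with large spectral change; the file names and the dB threshold are placeholders.

        # Sketch of spectrogram-based comparison of an original and an edited WAV file.
        import numpy as np
        from scipy.io import wavfile
        from scipy.signal import spectrogram

        def spec(path):
            rate, data = wavfile.read(path)
            if data.ndim > 1:                       # mix stereo down to mono
                data = data.mean(axis=1)
            f, t, s = spectrogram(data, fs=rate, nperseg=1024)
            return f, t, 10 * np.log10(s + 1e-12)   # dB scale

        f, t1, s_orig = spec("original.wav")        # placeholder file names
        _, t2, s_edit = spec("edited.wav")

        n = min(s_orig.shape[1], s_edit.shape[1])   # compare the overlapping duration
        diff = np.abs(s_orig[:, :n] - s_edit[:, :n])
        changed_cols = np.where(diff.max(axis=0) > 20)[0]   # >20 dB change in any frequency bin
        print("frames with large spectral change:", t1[changed_cols])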

  6. Coronary artery analysis: Computer-assisted selection of best-quality segments in multiple-phase coronary CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chuan, E-mail: chuan@umich.edu; Chan, Heang-Ping; Hadjiyski, Lubomir M.; Chughtai, Aamer; Wei, Jun; Kazerooni, Ella A. [Department of Radiology, The University of Michigan, Ann Arbor, Michigan 48109-0904 (United States)

    2016-10-15

    Purpose: The authors are developing an automated method to identify the best-quality coronary arterial segment from multiple-phase coronary CT angiography (cCTA) acquisitions, which may be used by either interpreting physicians or computer-aided detection systems to optimally and efficiently utilize the diagnostic information available in multiple-phase cCTA for the detection of coronary artery disease. Methods: After initialization with a manually identified seed point, each coronary artery tree is automatically extracted from multiple cCTA phases using our multiscale coronary artery response enhancement and 3D rolling balloon region growing vessel segmentation and tracking method. The coronary artery trees from multiple phases are then aligned by a global registration using an affine transformation with quadratic terms and nonlinear simplex optimization, followed by a local registration using a cubic B-spline method with fast localized optimization. The corresponding coronary arteries among the available phases are identified using a recursive coronary segment matching method. Each of the identified vessel segments is transformed by the curved planar reformation (CPR) method. Four features are extracted from each corresponding segment as quality indicators in the original computed tomography volume and the straightened CPR volume, and each quality indicator is used as a voting classifier for the arterial segment. A weighted voting ensemble (WVE) classifier is designed to combine the votes of the four voting classifiers for each corresponding segment. The segment with the highest WVE vote is then selected as the best-quality segment. In this study, the training and test sets consisted of 6 and 20 cCTA cases, respectively, each with 6 phases, containing a total of 156 cCTA volumes and 312 coronary artery trees. An observer preference study was also conducted with one expert cardiothoracic radiologist and four nonradiologist readers to visually rank vessel segment
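
    The weighted-voting step can be illustrated in a few lines: each quality indicator votes for the phase in which the corresponding segment scores best, and the weighted vote selects the best-quality segment. The indicator values and weights below are made-up numbers, not the study's trained weights.

        # Sketch of a weighted voting ensemble over per-phase quality indicators.
        import numpy as np

        # rows = cCTA phases, columns = 4 quality indicators for one corresponding segment
        quality = np.array([
            [0.62, 0.55, 0.70, 0.58],
            [0.81, 0.77, 0.69, 0.80],
            [0.74, 0.76, 0.72, 0.66],
        ])
        weights = np.array([0.3, 0.3, 0.2, 0.2])     # one weight per voting classifier

        votes = np.zeros(quality.shape[0])
        for k in range(quality.shape[1]):
            votes[np.argmax(quality[:, k])] += weights[k]   # indicator k votes for its best phase

        best_phase = int(np.argmax(votes))
        print("best-quality segment comes from phase", best_phase, "votes:", votes)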

  7. Coronary artery analysis: Computer-assisted selection of best-quality segments in multiple-phase coronary CT angiography

    International Nuclear Information System (INIS)

    Zhou, Chuan; Chan, Heang-Ping; Hadjiyski, Lubomir M.; Chughtai, Aamer; Wei, Jun; Kazerooni, Ella A.

    2016-01-01

    Purpose: The authors are developing an automated method to identify the best-quality coronary arterial segment from multiple-phase coronary CT angiography (cCTA) acquisitions, which may be used by either interpreting physicians or computer-aided detection systems to optimally and efficiently utilize the diagnostic information available in multiple-phase cCTA for the detection of coronary artery disease. Methods: After initialization with a manually identified seed point, each coronary artery tree is automatically extracted from multiple cCTA phases using our multiscale coronary artery response enhancement and 3D rolling balloon region growing vessel segmentation and tracking method. The coronary artery trees from multiple phases are then aligned by a global registration using an affine transformation with quadratic terms and nonlinear simplex optimization, followed by a local registration using a cubic B-spline method with fast localized optimization. The corresponding coronary arteries among the available phases are identified using a recursive coronary segment matching method. Each of the identified vessel segments is transformed by the curved planar reformation (CPR) method. Four features are extracted from each corresponding segment as quality indicators in the original computed tomography volume and the straightened CPR volume, and each quality indicator is used as a voting classifier for the arterial segment. A weighted voting ensemble (WVE) classifier is designed to combine the votes of the four voting classifiers for each corresponding segment. The segment with the highest WVE vote is then selected as the best-quality segment. In this study, the training and test sets consisted of 6 and 20 cCTA cases, respectively, each with 6 phases, containing a total of 156 cCTA volumes and 312 coronary artery trees. An observer preference study was also conducted with one expert cardiothoracic radiologist and four nonradiologist readers to visually rank vessel segment

  8. Remote file inquiry (RFI) system

    Science.gov (United States)

    1975-01-01

    System interrogates and maintains user-definable data files from remote terminals, using English-like, free-form query language easily learned by persons not proficient in computer programming. System operates in asynchronous mode, allowing any number of inquiries within limitation of available core to be active concurrently.

  9. A Heuristic Placement Selection of Live Virtual Machine Migration for Energy-Saving in Cloud Computing Environment

    Science.gov (United States)

    Zhao, Jia; Hu, Liang; Ding, Yan; Xu, Gaochao; Hu, Ming

    2014-01-01

    Live VM (virtual machine) migration has been a hot topic in green cloud computing. The live VM migration problem divides into two research aspects: the live VM migration mechanism and the live VM migration policy. Meanwhile, with the development of energy-aware computing, we have focused on the VM placement selection of live migration, namely the live VM migration policy for energy saving. In this paper, a novel heuristic approach, PS-ES, is presented. Its main idea has two parts. One is that it combines the PSO (particle swarm optimization) idea with the SA (simulated annealing) idea to achieve an improved PSO-based approach with better global search ability. The other is that it applies probability theory and mathematical statistics, and once again uses the SA idea, to process the data obtained from the improved PSO-based stage and obtain the final solution. The whole approach thus achieves a long-term optimization for energy saving, as it considers not only the optimization of the current problem scenario but also that of future problems. The experimental results demonstrate that PS-ES evidently reduces the total incremental energy consumption and better preserves the performance of running and migrating VMs compared with random migration and optimal migration. As a result, the proposed PS-ES approach can make the outcome of live VM migration events more effective and valuable. PMID:25251339
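
    As a rough illustration of this kind of placement search, the sketch below keeps only the simulated-annealing ingredient: it moves single VMs between hosts and accepts worse placements with a temperature-dependent probability. The PSO component, the statistical post-processing and the real energy model of PS-ES are omitted; the loads and power model are toy numbers.

        # Much-simplified placement search: simulated annealing over VM-to-host assignments
        # with a toy linear power model. Not the PS-ES algorithm itself.
        import math, random

        random.seed(0)
        vm_load = [0.2, 0.35, 0.15, 0.4, 0.25, 0.3]        # CPU demand of VMs to (re)place
        host_idle, host_slope, n_hosts = 100.0, 150.0, 3   # toy per-host power model

        def energy(assign):
            util = [0.0] * n_hosts
            for vm, h in enumerate(assign):
                util[h] += vm_load[vm]
            if any(u > 1.0 for u in util):                 # overloaded host: infeasible
                return float("inf")
            return sum(host_idle + host_slope * u for u in util if u > 0)

        assign = [random.randrange(n_hosts) for _ in vm_load]
        best, best_e, temp = assign[:], energy(assign), 50.0

        for step in range(2000):
            cand = assign[:]
            cand[random.randrange(len(vm_load))] = random.randrange(n_hosts)   # move one VM
            d = energy(cand) - energy(assign)
            if d < 0 or random.random() < math.exp(-d / temp):                 # SA acceptance
                assign = cand
                if energy(assign) < best_e:
                    best, best_e = assign[:], energy(assign)
            temp *= 0.995

        print("best placement:", best, "energy:", round(best_e, 1))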

  10. Memory and selective attention in multiple sclerosis: cross-sectional computer-based assessment in a large outpatient sample.

    Science.gov (United States)

    Adler, Georg; Lembach, Yvonne

    2015-08-01

    Cognitive impairments may have a severe impact on everyday functioning and quality of life of patients with multiple sclerosis (MS). However, there are some methodological problems in the assessment and only a few studies allow a representative estimate of the prevalence and severity of cognitive impairments in MS patients. We applied a computer-based method, the memory and attention test (MAT), in 531 outpatients with MS, who were assessed at nine neurological practices or specialized outpatient clinics. The findings were compared with those obtained in an age-, sex- and education-matched control group of 84 healthy subjects. Episodic short-term memory was substantially decreased in the MS patients. About 20% of them scored more than two standard deviations below the mean of the control group. The episodic short-term memory score was negatively correlated with the EDSS score. Minor but also significant impairments in the MS patients were found for verbal short-term memory, episodic working memory and selective attention. The computer-based MAT was found to be useful for a routine assessment of cognition in MS outpatients.

  11. A heuristic placement selection of live virtual machine migration for energy-saving in cloud computing environment.

    Science.gov (United States)

    Zhao, Jia; Hu, Liang; Ding, Yan; Xu, Gaochao; Hu, Ming

    2014-01-01

    Live VM (virtual machine) migration has been a hot topic in green cloud computing. The live VM migration problem divides into two research aspects: the live VM migration mechanism and the live VM migration policy. Meanwhile, with the development of energy-aware computing, we have focused on the VM placement selection of live migration, namely the live VM migration policy for energy saving. In this paper, a novel heuristic approach, PS-ES, is presented. Its main idea has two parts. One is that it combines the PSO (particle swarm optimization) idea with the SA (simulated annealing) idea to achieve an improved PSO-based approach with better global search ability. The other is that it applies probability theory and mathematical statistics, and once again uses the SA idea, to process the data obtained from the improved PSO-based stage and obtain the final solution. The whole approach thus achieves a long-term optimization for energy saving, as it considers not only the optimization of the current problem scenario but also that of future problems. The experimental results demonstrate that PS-ES evidently reduces the total incremental energy consumption and better preserves the performance of running and migrating VMs compared with random migration and optimal migration. As a result, the proposed PS-ES approach can make the outcome of live VM migration events more effective and valuable.

  12. A heuristic placement selection of live virtual machine migration for energy-saving in cloud computing environment.

    Directory of Open Access Journals (Sweden)

    Jia Zhao

    Full Text Available Live VM (virtual machine) migration has been a hot topic in green cloud computing. The live VM migration problem divides into two research aspects: the live VM migration mechanism and the live VM migration policy. Meanwhile, with the development of energy-aware computing, we have focused on the VM placement selection of live migration, namely the live VM migration policy for energy saving. In this paper, a novel heuristic approach, PS-ES, is presented. Its main idea has two parts. One is that it combines the PSO (particle swarm optimization) idea with the SA (simulated annealing) idea to achieve an improved PSO-based approach with better global search ability. The other is that it applies probability theory and mathematical statistics, and once again uses the SA idea, to process the data obtained from the improved PSO-based stage and obtain the final solution. The whole approach thus achieves a long-term optimization for energy saving, as it considers not only the optimization of the current problem scenario but also that of future problems. The experimental results demonstrate that PS-ES evidently reduces the total incremental energy consumption and better preserves the performance of running and migrating VMs compared with random migration and optimal migration. As a result, the proposed PS-ES approach can make the outcome of live VM migration events more effective and valuable.

  13. Fast selection of miRNA candidates based on large-scale pre-computed MFE sets of randomized sequences.

    Science.gov (United States)

    Warris, Sven; Boymans, Sander; Muiser, Iwe; Noback, Michiel; Krijnen, Wim; Nap, Jan-Peter

    2014-01-13

    Small RNAs are important regulators of genome function, yet their prediction in genomes is still a major computational challenge. Statistical analyses of pre-miRNA sequences indicated that their 2D structure tends to have a minimal free energy (MFE) significantly lower than MFE values of equivalently randomized sequences with the same nucleotide composition, in contrast to other classes of non-coding RNA. The computation of many MFEs is, however, too intensive to allow for genome-wide screenings. Using a local grid infrastructure, MFE distributions of random sequences were pre-calculated on a large scale. These distributions follow a normal distribution and can be used to determine the MFE distribution for any given sequence composition by interpolation. This allows on-the-fly calculation of the normal distribution for any candidate sequence composition. The speedup achieved makes genome-wide screening with this characteristic of a pre-miRNA sequence practical. Although this particular property alone is not sufficiently discriminative to distinguish miRNAs from other sequences, the MFE-based P-value should be added to the set of parameters used in the selection of potential miRNA candidates for experimental verification.
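
    The scoring step can be sketched as a table lookup plus interpolation, followed by a normal-distribution P-value. The table entries and the candidate MFE below are made-up numbers, not the pre-computed distributions described in the paper.

        # Sketch: interpolate pre-computed (mean, std) of shuffled-sequence MFEs for the
        # candidate's composition, then convert the candidate MFE into a normal P-value.
        from scipy.stats import norm

        # (GC fraction, sequence length) -> (mean MFE, std MFE) of randomized sequences
        MFE_TABLE = {
            (0.40, 100): (-18.0, 4.5),
            (0.50, 100): (-22.0, 4.8),
            (0.60, 100): (-26.5, 5.1),
        }

        def interpolated_params(gc, length=100):
            keys = sorted(k[0] for k in MFE_TABLE if k[1] == length)
            lo = max(k for k in keys if k <= gc)
            hi = min(k for k in keys if k >= gc)
            if lo == hi:
                return MFE_TABLE[(lo, length)]
            w = (gc - lo) / (hi - lo)
            (m1, s1), (m2, s2) = MFE_TABLE[(lo, length)], MFE_TABLE[(hi, length)]
            return m1 + w * (m2 - m1), s1 + w * (s2 - s1)

        def mfe_p_value(mfe, gc, length=100):
            mean, std = interpolated_params(gc, length)
            return norm.cdf((mfe - mean) / std)   # P(random sequence folds at least this low)

        print(round(mfe_p_value(mfe=-35.0, gc=0.55, length=100), 4))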

  14. Students Enrolled in Selected Upper-Division Agriculture Courses: An Examination of Computer Experiences, Self-Efficacy and Knowledge.

    Science.gov (United States)

    Johnson, Donald M.; Ferguson, James A.; Lester, Melissa L.

    2000-01-01

    Of 169 agriculture students surveyed, 79% had computer training, 66% owned computers; they had slightly above average computer self-efficacy, especially in word processing, electronic mail, and Internet use. However, 72.7% scored 60% or less on a test of computer knowledge. There was little correlation between self-efficacy and computer knowledge.…

  15. Classification effects of real and imaginary movement selective attention tasks on a P300-based brain-computer interface

    Science.gov (United States)

    Salvaris, Mathew; Sepulveda, Francisco

    2010-10-01

    Brain-computer interfaces (BCIs) rely on various electroencephalography methodologies that allow the user to convey their desired control to the machine. Common approaches include the use of event-related potentials (ERPs) such as the P300 and modulation of the beta and mu rhythms. All of these methods have their benefits and drawbacks. In this paper, three different selective attention tasks were tested in conjunction with a P300-based protocol (i.e. the standard counting of target stimuli as well as the conduction of real and imaginary movements in sync with the target stimuli). The three tasks were performed by a total of 10 participants, with the majority (7 out of 10) of the participants having never before participated in imaginary movement BCI experiments. Channels and methods used were optimized for the P300 ERP and no sensory-motor rhythms were explicitly used. The classifier used was a simple Fisher's linear discriminant. Results were encouraging, showing that on average the imaginary movement achieved a P300 versus No-P300 classification accuracy of 84.53%. In comparison, mental counting, the standard selective attention task used in previous studies, achieved 78.9% and real movement 90.3%. Furthermore, multiple trial classification results were recorded and compared, with real movement reaching 99.5% accuracy after four trials (12.8 s), imaginary movement reaching 99.5% accuracy after five trials (16 s) and counting reaching 98.2% accuracy after ten trials (32 s).
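
    The classification scheme, a Fisher linear discriminant on P300 feature vectors with scores averaged over repeated trials, can be sketched as follows; the EEG features here are synthetic stand-ins and the effect size is illustrative.

        # Sketch of Fisher LDA on P300 vs. no-P300 feature vectors, plus multi-trial averaging.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        n, d = 400, 60                                    # epochs x features (channels x time samples)
        y = rng.integers(0, 2, n)                         # 1 = target (P300 expected), 0 = non-target
        X = rng.normal(size=(n, d)) + 0.15 * y[:, None]   # targets carry a small added deflection

        lda = LinearDiscriminantAnalysis().fit(X[:300], y[:300])

        # Single-trial decision vs. averaging target trials before scoring.
        test_X, test_y = X[300:], y[300:]
        single = (lda.decision_function(test_X) > 0).astype(int)
        print("single-trial accuracy:", (single == test_y).mean())

        k = 5                                             # average over k repeated trials
        n_t = (test_y == 1).sum() // k * k
        targets = test_X[test_y == 1][:n_t].reshape(-1, k, d)
        avg_scores = lda.decision_function(targets.mean(axis=1))
        print("fraction of target groups detected after", k, "trials:", (avg_scores > 0).mean())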

  16. Single photon emission computed tomography before and after treatment of anxiety using a selective serotonin reuptake inhibitor

    International Nuclear Information System (INIS)

    Warwick, J.M.; Heerden, B.B. van; Stein, D.J.; Niehaus, D.J.H.; Seedat, S.; Linden, G. van der; Harvey, B.A.

    2002-01-01

    Background: The selective serotonin reuptake inhibitors (SSRIs) are currently recommended as first line medications for a number of different anxiety disorders, including obsessive-compulsive disorder (OCD), posttraumatic stress disorder (PTSD), and social anxiety disorder (social phobia) (SAD). This raises the question of what effects these agents have on the functional neuroanatomy of anxiety disorders. Methods: Single photon emission computed tomography (SPECT) brain scanning was undertaken in patients with OCD, PTSD, and SAD before and after treatment with citalopram, the most selective of the SSRIs. Statistical parametric mapping (SPM) was used to compare scans (pre- vs post-medication, and responders vs nonresponders) in the combined group of subjects. Results: Citalopram pharmacotherapy resulted in significant deactivation within anterior and superior cingulate and left hippocampus. Deactivation within the anterior cingulate, left paracingular cortex, and right inferior frontal cortex was more marked in treatment responders. Baseline activation did not, however, predict response to pharmacotherapy. Conclusion: Although each of the anxiety disorders may be mediated by different neurocircuits, there are some overlaps in the functional neuroanatomy of their response to SSRI treatment. The current data is consistent with previous work demonstrating the importance of limbic circuits in this spectrum of disorders. These play a crucial role in cognitive-affective processing, and are innervated by serotonergic neurons

  17. Computational Analysis of Molecular Interaction Networks Underlying Change of HIV-1 Resistance to Selected Reverse Transcriptase Inhibitors.

    Science.gov (United States)

    Kierczak, Marcin; Dramiński, Michał; Koronacki, Jacek; Komorowski, Jan

    2010-12-12

    Despite more than two decades of research, HIV resistance to drugs remains a serious obstacle in developing efficient AIDS treatments. Several computational methods have been developed to predict resistance level from the sequence of viral proteins such as reverse transcriptase (RT) or protease. These methods, while powerful and accurate, give very little insight into the molecular interactions that underlie the acquisition of drug resistance/hypersusceptibility. Here, we attempt to fill this gap by using our Monte Carlo feature selection and interdependency discovery method (MCFS-ID) to elucidate molecular interaction networks that characterize viral strains with altered drug resistance levels. We analyzed a number of HIV-1 RT sequences annotated with drug resistance level using the MCFS-ID method. This allowed us to derive interdependency networks that characterize changes in drug resistance to six selected RT inhibitors: Abacavir, Lamivudine, Stavudine, Zidovudine, Tenofovir and Nevirapine. The networks consider interdependencies at the level of the physicochemical properties of mutating amino acids, e.g., polarity. We mapped each network onto the 3D structure of RT in an attempt to understand the molecular meaning of the interacting pairs. The discovered interactions describe several known drug resistance mechanisms and, importantly, some previously unidentified ones. Our approach can be easily applied to a whole range of problems from the domain of protein engineering. A portable Java implementation of our MCFS-ID method is freely available for academic users and can be obtained at: http://www.ipipan.eu/staff/m.draminski/software.htm.

  18. Research of Performance Linux Kernel File Systems

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Ostroukh

    2015-10-01

    Full Text Available The article describes the most common Linux kernel file systems. The research was carried out on a personal computer, a typical workstation running GNU/Linux, whose characteristics are given in the article. The software necessary for measuring file system performance was installed on this machine. Based on the results, conclusions are drawn and recommendations are proposed for the use of the file systems, and the best ways to store data are identified and recommended.

  19. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
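
    The sampling idea behind PFFF can be sketched in a few lines of Python: hash a fixed number of blocks drawn from seeded pseudo-random offsets instead of reading the whole file, so that two copies of the same file yield the same fingerprint. The function name, block size and sample count below are illustrative; this is a sketch of the idea, not the pfff tool itself.

        import hashlib
        import os
        import random

        def probabilistic_fingerprint(path, n_samples=1024, block_size=64, seed=0):
            """Hash n_samples blocks at pseudo-random offsets instead of the whole file.

            The fixed seed makes the sampled offsets reproducible, so identical files
            produce identical fingerprints.  Illustrative sketch only.
            """
            size = os.path.getsize(path)
            rng = random.Random(seed)
            digest = hashlib.sha1()
            digest.update(str(size).encode())       # include the length in the fingerprint
            with open(path, "rb") as f:
                for _ in range(n_samples):
                    f.seek(rng.randrange(max(size - block_size, 1)))
                    digest.update(f.read(block_size))
            return digest.hexdigest()

        # Usage: compare two large files by fingerprint instead of reading them in full.
        # print(probabilistic_fingerprint("a.fastq") == probabilistic_fingerprint("b.fastq"))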

  20. Influence of core design, production technique, and material selection on fracture behavior of yttria-stabilized tetragonal zirconia polycrystal fixed dental prostheses produced using different multilayer techniques: split-file, over-pressing, and manually built-up veneers

    Directory of Open Access Journals (Sweden)

    Mahmood DJH

    2016-02-01

    Full Text Available Deyar Jallal Hadi Mahmood, Ewa H Linderoth, Ann Wennerberg, Per Vult Von Steyern Department of Prosthetic Dentistry, Faculty of Odontology, Malmö University, Malmö, Sweden Aim: To investigate and compare the fracture strength and fracture mode in eleven groups of the currently most commonly used multilayer three-unit all-ceramic yttria-stabilized tetragonal zirconia polycrystal (Y-TZP) fixed dental prostheses (FDPs) with respect to the choice of core material, veneering material area, manufacturing technique, design of connectors, and radii of curvature of FDP cores. Materials and methods: A total of 110 three-unit Y-TZP FDP cores with one intermediate pontic were made. The FDP cores in groups 1–7 were made with a split-file design, veneered with manually built-up porcelain, computer-aided design-on veneers, and over-pressed veneers. Groups 8–11 consisted of FDPs with a state-of-the-art design, veneered with manually built-up porcelain. All the FDP cores were subjected to simulated aging and finally loaded to fracture. Results: There was a significant difference (P<0.05) between the core designs, but not between the different types of Y-TZP materials. The split-file designs with VITABLOCS® (1,806±165 N) and e.max® ZirPress (1,854±115 N) and the state-of-the-art design with VITA VM® 9 (1,849±150 N) demonstrated the highest mean fracture values. Conclusion: The shape of a split-file designed all-ceramic reconstruction calls for a different dimension protocol, compared to traditionally shaped ones, as the split-file design leads to sharp approximal indentations acting as fractural impressions, thus decreasing the overall strength. The design of a framework is a crucial factor for the load bearing capacity of an all-ceramic FDP. The state-of-the-art design is preferable since the split-file designed cores call for a cross-sectional connector area at least 42% larger, to have the same load bearing capacity as the state-of-the-art designed

  1. HUD GIS Boundary Files

    Data.gov (United States)

    Department of Housing and Urban Development — The HUD GIS Boundary Files are intended to supplement boundary files available from the U.S. Census Bureau. The files are for community planners interested in...

  2. Computer-aided engineering system for design of sequence arrays and lithographic masks

    Science.gov (United States)

    Hubbell, Earl A.; Morris, MacDonald S.; Winkler, James L.

    1996-01-01

    An improved set of computer tools for forming arrays. According to one aspect of the invention, a computer system (100) is used to select probes and design the layout of an array of DNA or other polymers with certain beneficial characteristics. According to another aspect of the invention, a computer system uses chip design files (104) to design and/or generate lithographic masks (110).

  3. Selection of an optimal neural network architecture for computer-aided detection of microcalcifications - Comparison of automated optimization techniques

    International Nuclear Information System (INIS)

    Gurcan, Metin N.; Sahiner, Berkman; Chan Heangping; Hadjiiski, Lubomir; Petrick, Nicholas

    2001-01-01

    Many computer-aided diagnosis (CAD) systems use neural networks (NNs) for either detection or classification of abnormalities. Currently, most NNs are 'optimized' by manual search in a very limited parameter space. In this work, we evaluated the use of automated optimization methods for selecting an optimal convolution neural network (CNN) architecture. Three automated methods, the steepest descent (SD), the simulated annealing (SA), and the genetic algorithm (GA), were compared. We used as an example the CNN that classifies true and false microcalcifications detected on digitized mammograms by a prescreening algorithm. Four parameters of the CNN architecture were considered for optimization, the numbers of node groups and the filter kernel sizes in the first and second hidden layers, resulting in a search space of 432 possible architectures. The area Az under the receiver operating characteristic (ROC) curve was used to design a cost function. The SA experiments were conducted with four different annealing schedules. Three different parent selection methods were compared for the GA experiments. An available data set was split into two groups with approximately equal number of samples. By using the two groups alternately for training and testing, two different cost surfaces were evaluated. For the first cost surface, the SD method was trapped in a local minimum 91% (392/432) of the time. The SA using the Boltzman schedule selected the best architecture after evaluating, on average, 167 architectures. The GA achieved its best performance with linearly scaled roulette-wheel parent selection; however, it evaluated 391 different architectures, on average, to find the best one. The second cost surface contained no local minimum. For this surface, a simple SD algorithm could quickly find the global minimum, but the SA with the very fast reannealing schedule was still the most efficient. The same SA scheme, however, was trapped in a local minimum on the first cost
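
    As a rough illustration of how simulated annealing explores such a discrete architecture space, the following Python sketch anneals over a small grid of node-group counts and kernel sizes. The cost function here is a toy stand-in; in the study the cost was derived from the ROC-curve area of a trained CNN, and the grids, schedule and step counts below are illustrative assumptions.

        import math
        import random

        # Illustrative discrete search space (not the paper's 432-architecture grid):
        GROUPS = [2, 4, 6, 8]        # numbers of node groups in the two hidden layers
        KERNELS = [3, 5, 7]          # filter kernel sizes in the two hidden layers

        def cost(arch):
            """Toy stand-in for 1 - Az; in practice this means training and scoring a CNN."""
            g1, k1, g2, k2 = arch
            return 0.05 * abs(g1 - 6) + 0.03 * abs(k1 - 5) + 0.05 * abs(g2 - 4) + 0.03 * abs(k2 - 7)

        def neighbour(arch):
            """Perturb one randomly chosen architecture parameter."""
            arch = list(arch)
            i = random.randrange(4)
            arch[i] = random.choice(GROUPS if i in (0, 2) else KERNELS)
            return tuple(arch)

        def anneal(t0=1.0, cooling=0.95, steps=200):
            current = (random.choice(GROUPS), random.choice(KERNELS),
                       random.choice(GROUPS), random.choice(KERNELS))
            best, t = current, t0
            for _ in range(steps):
                cand = neighbour(current)
                delta = cost(cand) - cost(current)
                if delta < 0 or random.random() < math.exp(-delta / t):   # Boltzmann acceptance
                    current = cand
                    if cost(current) < cost(best):
                        best = current
                t *= cooling
            return best, cost(best)

        print(anneal())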

  4. Transmission of the environmental radiation data files on the internet

    International Nuclear Information System (INIS)

    Yamaguchi, Yoshiaki; Saito, Tadashi; Yamamoto, Takayoshi; Matsumoto, Atsushi; Kyoh, Bunkei

    1999-01-01

    Recently, any text or data file has come to be transportable through the Internet with a personal computer. The selection of monitoring points is, however, restricted by the need to lay cable, because a dedicated circuit is generally used for continuous-type environmental monitors. For this reason, we have developed an environmental monitoring system that can transmit radiation data files over the Internet. Both a 3''φ x 3'' NaI(Tl) detector and a Thermo-Hydrometer are installed in the monitoring post of this system, and the data files from those detectors are transmitted from a personal computer at the monitoring point to the Radioisotope Research Center of Osaka University. Environmental monitoring data from remote places are thus easily obtained through data transmission over the Internet. Moreover, the system provides higher-precision environmental monitoring data because it includes the energy information of the γ-rays. If it is possible to maintain the monitors at remote places, this system could perform continuous environmental monitoring over a wide area. (author)

  5. Transmission of the environmental radiation data files on the internet

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, Yoshiaki; Saito, Tadashi; Yamamoto, Takayoshi [Osaka Univ., Suita (Japan). Radioisotope Research Center]; Matsumoto, Atsushi; Kyoh, Bunkei

    1999-01-01

    Recently, any text or data file has come to be transportable through the Internet with a personal computer. The selection of monitoring points is, however, restricted by the need to lay cable, because a dedicated circuit is generally used for continuous-type environmental monitors. For this reason, we have developed an environmental monitoring system that can transmit radiation data files over the Internet. Both a 3''φ x 3'' NaI(Tl) detector and a Thermo-Hydrometer are installed in the monitoring post of this system, and the data files from those detectors are transmitted from a personal computer at the monitoring point to the Radioisotope Research Center of Osaka University. Environmental monitoring data from remote places are thus easily obtained through data transmission over the Internet. Moreover, the system provides higher-precision environmental monitoring data because it includes the energy information of the γ-rays. If it is possible to maintain the monitors at remote places, this system could perform continuous environmental monitoring over a wide area. (author)

  6. Log files for testing usability

    NARCIS (Netherlands)

    Klein Teeselink, G.; Siepe, A.H.M.; Pijper, de J.R.

    1999-01-01

    The aim of this study is to gain insight in the usefulness of log file analysis as a method to evaluate the usability of individual interface components and their influence on the usability of the overall user interface. We selected a music player as application, with four different interfaces and

  7. A secure file manager for UNIX

    Energy Technology Data Exchange (ETDEWEB)

    DeVries, R.G.

    1990-12-31

    The development of a secure file management system for a UNIX-based computer facility with supercomputers and workstations is described. Specifically, UNIX in its usual form does not address: (1) Operation which would satisfy rigorous security requirements. (2) Online space management in an environment where total data demands would be many times the actual online capacity. (3) Making the file management system part of a computer network in which users of any computer in the local network could retrieve data generated on any other computer in the network. The characteristics of UNIX can be exploited to develop a portable, secure file manager which would operate on computer systems ranging from workstations to supercomputers. Implementation considerations making unusual use of UNIX features, rather than requiring extensive internal system changes, are described, and implementation using the Cray Research Inc. UNICOS operating system is outlined.

  8. Influence of gating phase selection on the image quality of coronary arteries in multidetector row computed tomography

    International Nuclear Information System (INIS)

    Laskowska, K.; Marzec, M.; Serafin, Z.; Nawrocka, E.; Lasek, W.; Wisniewska-Szmyt, J.; Kubica, J.

    2005-01-01

    Motion artifacts caused by cardiac movement disturb the imaging of coronary arteries with multidetector-row spiral computed tomography. The aim of this study was to determine the phase of the heart rate which provides the best quality of coronary artery imaging in retrospective ECG-gated CT. Although 75% is usually the best reconstruction phase, the optimal phase should be established individually for the patient, artery, segment, and type of tomograph for the best imaging quality. Forty-five cardiac CT angiograms of 26 patients were retrospectively evaluated. The examinations were performed with a 4-detector-row tomograph. ECG-gated retrospective reconstructions were relatively delayed at 0%, 12.5%, 25%, 37.5%, 50%, 62.5%, 75%, and 87.5% of the cardiac cycle. Selected coronary arteries of the highest diagnostic quality were estimated in the eight phases of the cardiac cycle. Only arteries of very high image quality were selected for analysis: left coronary artery trunks (44 cases, incl. 37 stented), anterior interventricular branches (36, incl. 3 stented), circumflex branches (16), right coronary artery branches (23), and posterior interventricular branches (4). The reconstruction phase had a statistically significant impact on the quality of imaging (p < 0.0003). Depending on the case, optimal imaging was noted in various phases, except in the 12.5% phase. The 75% phase appeared to be the best of all those examined (p < 0.05), both in the group of arteries without stents (p < 0.0006) and in those stented (p < 0.05). In some cases of repeated examinations the best phases differed within the same patient. (author)

  9. Parallel file system with metadata distributed across partitioned key-value store c

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).

  10. A computer program to calculate nuclide yields in complex decay chain for selection of optimum irradiation and cooling condition

    International Nuclear Information System (INIS)

    Takeda, Tsuneo

    1977-11-01

    This report is prepared as a user's input manual for the computer code CODAC-No.5 and provides a general description of the code and instructions for its use. The code is a modified version of the CODAC-No.4 code. It is capable of calculating radioactive nuclide yields in any given complex decay and activation chain, independent of irradiation history. Eighteen kinds of useful tables and graphs can be prepared for output. They are available for the selection of optimum irradiation and cooling conditions and for other purposes related to irradiation and cooling. For example, the ratio of a nuclide yield to the total nuclide yield as a function of irradiation and cooling times is obtained. Several kinds of complex equations are included in these outputs. This code has almost the same input forms as the CODAC-No.4 code, except for the input of irradiation history data. The input method and formats used for this code are very simple for any kind of nuclear data. A list of FORTRAN statements, examples of input data and output results, and a list of input parameters and their definitions are given in this report. (auth.)

  11. A novel computer-aided diagnosis system for breast MRI based on feature selection and ensemble learning.

    Science.gov (United States)

    Lu, Wei; Li, Zhe; Chu, Jinghui

    2017-04-01

    Breast cancer is a common cancer among women. With the development of modern medical science and information technology, medical imaging techniques have an increasingly important role in the early detection and diagnosis of breast cancer. In this paper, we propose an automated computer-aided diagnosis (CADx) framework for magnetic resonance imaging (MRI). The scheme consists of an ensemble of several machine learning-based techniques, including ensemble under-sampling (EUS) for imbalanced data processing, the Relief algorithm for feature selection, the subspace method for providing data diversity, and Adaboost for improving the performance of base classifiers. We extracted morphological, various texture, and Gabor features. To clarify the feature subsets' physical meaning, subspaces are built by combining morphological features with each kind of texture or Gabor feature. We tested our proposal using a manually segmented Region of Interest (ROI) data set, which contains 438 images of malignant tumors and 1898 images of normal tissues or benign tumors. Our proposal achieves an area under the ROC curve (AUC) value of 0.9617, which outperforms most other state-of-the-art breast MRI CADx systems. Compared with other methods, our proposal significantly reduces the false-positive classification rate. Copyright © 2017 Elsevier Ltd. All rights reserved.
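
    The overall shape of such an ensemble pipeline can be sketched in Python as follows: under-sample the majority class, score features (mutual information is used below as a stand-in for the Relief algorithm), train AdaBoost classifiers on random feature subspaces, and average their outputs. All sizes, names and the substitution of mutual information for Relief are assumptions made for illustration, not the authors' implementation.

        import numpy as np
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.feature_selection import SelectKBest, mutual_info_classif

        rng = np.random.default_rng(0)

        # Hypothetical imbalanced ROI features: 1898 normal/benign vs. 438 malignant.
        X = rng.normal(size=(2336, 60))
        y = np.r_[np.zeros(1898, dtype=int), np.ones(438, dtype=int)]

        def undersample(X, y, rng):
            """Keep all minority samples and an equally sized random majority subset."""
            maj, mino = np.where(y == 0)[0], np.where(y == 1)[0]
            keep = np.r_[rng.choice(maj, size=mino.size, replace=False), mino]
            return X[keep], y[keep]

        members = []
        for _ in range(5):                  # one member per balanced draw / feature subspace
            Xb, yb = undersample(X, y, rng)
            cols = rng.choice(X.shape[1], size=30, replace=False)      # random subspace
            sel = SelectKBest(mutual_info_classif, k=15).fit(Xb[:, cols], yb)
            clf = AdaBoostClassifier(n_estimators=50).fit(sel.transform(Xb[:, cols]), yb)
            members.append((cols, sel, clf))

        # Ensemble output: average the malignancy probabilities of the member classifiers.
        proba = np.mean([clf.predict_proba(sel.transform(X[:, cols]))[:, 1]
                         for cols, sel, clf in members], axis=0)
        print("mean predicted malignancy probability:", round(float(proba.mean()), 3))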

  12. Mathematical Model and Computational Analysis of Selected Transient States of Cylindrical Linear Induction Motor Fed via Frequency Converter

    Directory of Open Access Journals (Sweden)

    Andrzej Rusek

    2008-01-01

    Full Text Available The mathematical model of a cylindrical linear induction motor (C-LIM) fed via a frequency converter is presented in the paper. The model was developed in order to analyze the transient states numerically. Problems concerning the dynamics of AC machines, especially the linear induction motor, are presented in [1–7]. Development of the C-LIM mathematical model is based on the circuit method and on an analogy to the rotary induction motor. The analogy between (a) the stator and rotor windings of a rotary induction motor and (b) the winding of the primary part of the C-LIM (inductor) and the closed current circuits in the external secondary part of the C-LIM (race) is taken into consideration. The equations of the C-LIM mathematical model are presented in matrix form, together with equations expressing each vector separately. A computational analysis of selected transient states of the C-LIM fed via a frequency converter is presented in the paper. Two typical examples of C-LIM operation are considered for the analysis: (a) starting the motor at various static loads and various synchronous velocities and (b) reversing the motor under the same operating conditions. Results of the simulation are presented as transient responses, including transient electromagnetic force, transient linear velocity and transient phase current.

  13. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business files compilation for an enterprise is a distillation and re-creation of its spiritual wealth, from which applicable information can be made available quickly, extensively and precisely to those who want to use it. Proceeding from the effects of business files compilation on scientific research, construction and development, this paper discusses in five points how to define topics, analyze historical materials, search for and select data, and process it into an enterprise archives collection. Firstly, it expounds the importance and necessity of business files compilation in the production, operation and development of a company; secondly, it presents processing methods from topic definition, material searching and data selection to final examination and correction; thirdly, it defines principles and classifications so that different categories and levels of processing methods are available for business files compilation; fourthly, it discusses the specific method of implementing a file compilation through a documentation collection, on the principle that topic definition should be geared to demand; fifthly, it addresses the application of information technology to business files compilation, in view of the wide need for business files, so as to raise the level of enterprise archives management. The discussion focuses on the examination and correction principles of enterprise historical material compilation, the basic classifications, and the major forms of business files compilation achievements. (author)

  14. Aurally Handicapped -- Research; A Selective Bibliography. Exceptional Child Bibliography Series No. 625.

    Science.gov (United States)

    Council for Exceptional Children, Reston, VA. Information Center on Exceptional Children.

    The selected bibliography of research on aurally handicapped children contains approximately 95 abstracts with indexing information explained to be drawn from the computer file of abstracts representing the Council for Exceptional Children Information Center's complete holdings as of August, 1972. Abstracts are said to be chosen using the criteria…

  15. Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    Peregrine provides several classes of nodes that users access. Login Nodes: Peregrine has four login nodes, each of which has Intel E5 processors. In addition to the /scratch file systems, the /mss file system is mounted on all login nodes. Compute Nodes: Peregrine has 2592

  16. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file consists of two data files, one for CLIA labs and one for 18 other provider types. The file names are CLIA and OTHER. If downloading the file, note it...

  17. PC Graphic file programing

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

    This book gives a description of basic graphics knowledge and of understanding and implementing graphic file formats. The first part deals with graphic data, the storage of graphic data and data compression, and programming topics such as assembly, the stack, compiling and linking programs, and practice and debugging. The next part covers graphic file formats such as the Mac paint file, GEM/IMG file, PCX file, GIF file, and TIFF file, hardware considerations such as high-speed monochrome and color screen drivers, the basic concept of dithering, and conversion between formats.

  18. Women in computer science: An interpretative phenomenological analysis exploring common factors contributing to women's selection and persistence in computer science as an academic major

    Science.gov (United States)

    Thackeray, Lynn Roy

    The purpose of this study is to understand the meaning that women make of the social and cultural factors that influence their reasons for entering and remaining in study of computer science. The twenty-first century presents many new challenges in career development and workforce choices for both men and women. Information technology has become the driving force behind many areas of the economy. As this trend continues, it has become essential that U.S. citizens need to pursue a career in technologies, including the computing sciences. Although computer science is a very lucrative profession, many Americans, especially women, are not choosing it as a profession. Recent studies have shown no significant differences in math, technical and science competency between men and women. Therefore, other factors, such as social, cultural, and environmental influences seem to affect women's decisions in choosing an area of study and career choices. A phenomenological method of qualitative research was used in this study, based on interviews of seven female students who are currently enrolled in a post-secondary computer science program. Their narratives provided meaning into the social and cultural environments that contribute to their persistence in their technical studies, as well as identifying barriers and challenges that are faced by female students who choose to study computer science. It is hoped that the data collected from this study may provide recommendations for the recruiting, retention and support for women in computer science departments of U.S. colleges and universities, and thereby increase the numbers of women computer scientists in industry. Keywords: gender access, self-efficacy, culture, stereotypes, computer education, diversity.

  19. Methods and Algorithms for Detecting Objects in Video Files

    Directory of Open Access Journals (Sweden)

    Nguyen The Cuong

    2018-01-01

    Full Text Available Video files are files that store motion pictures and sound as they occur in real life. In today's world, the need for automated processing of the information in video files is increasing. Automated processing of this information has a wide range of applications, including office/home surveillance cameras, traffic control, sports applications, remote object detection, and others. In particular, the detection and tracking of object movement in video files plays an important role. This article describes methods for detecting objects in video files, a problem in the field of computer vision that is being studied worldwide.
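
    One common concrete realization of moving-object detection in a video file is background subtraction followed by contour extraction, sketched below with OpenCV. The video path, blur kernel and minimum blob area are illustrative choices, not values from the article.

        import cv2

        cap = cv2.VideoCapture("traffic.mp4")        # illustrative path; any readable video works
        subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = subtractor.apply(frame)           # foreground mask of moving pixels
            mask = cv2.medianBlur(mask, 5)           # suppress speckle noise
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            for c in contours:
                if cv2.contourArea(c) > 500:         # ignore tiny blobs
                    x, y, w, h = cv2.boundingRect(c)
                    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.imshow("detections", frame)
            if cv2.waitKey(1) == 27:                 # Esc quits
                break

        cap.release()
        cv2.destroyAllWindows()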

  20. The ECE Pre-Service Teachers' Perception on Factors Affecting the Integration of Educational Computer Games in Two Conditions: Selecting versus Redesigning

    Science.gov (United States)

    Sancar Tokmak, Hatice; Ozgelen, Sinan

    2013-01-01

    This case study aimed to examine early childhood education (ECE) pre-service teachers' perception on the factors affecting integration of educational computer games to their instruction in two areas: selecting and redesigning. Twenty-six ECE pre-service teachers participated in the study. The data was collected through open-ended questionnaires,…

  1. Infinity in Logic and Computation: International Conference, ILC 2007, Cape Town, South Africa, November 3-5, 2007: Revised selected papers

    NARCIS (Netherlands)

    Archibald, M.; Brattka, V.; Goranko, V.; Löwe, B.

    2009-01-01

    Edited in collaboration with FoLLI, the Association of Logic, Language and Information, this volume constitutes a selection of papers presented at the Internatonal Conference on Infinity in Logic and Computation, ILC 2007, held in Cape Town, South Africa, in November 2007. The 7 revised papers

  2. Metal artifact correction for x-ray computed tomography using kV and selective MV imaging

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Meng, E-mail: mengwu@stanford.edu [Department of Electrical Engineering, Stanford University, Stanford, California 94305 (United States); Keil, Andreas [microDimensions GmbH, Munich 81379 (Germany); Constantin, Dragos; Star-Lack, Josh [Varian Medical Systems, Inc., Palo Alto, California 94304 (United States); Zhu, Lei [Nuclear and Radiological Engineering and Medical Physics Programs, The George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332 (United States); Fahrig, Rebecca [Department of Radiology, Stanford University, Stanford, California 94305 (United States)

    2014-12-15

    Purpose: The overall goal of this work is to improve the computed tomography (CT) image quality for patients with metal implants or fillings by completing the missing kilovoltage (kV) projection data with selectively acquired megavoltage (MV) data that do not suffer from photon starvation. When both of these imaging systems, which are available on current radiotherapy devices, are used, metal streak artifacts are avoided, and the soft-tissue contrast is restored, even for regions in which the kV data cannot contribute any information. Methods: Three image-reconstruction methods, including two filtered back-projection (FBP)-based analytic methods and one iterative method, for combining kV and MV projection data from the two on-board imaging systems of a radiotherapy device are presented in this work. The analytic reconstruction methods modify the MV data based on the information in the projection or image domains and then patch the data onto the kV projections for a FBP reconstruction. In the iterative reconstruction, the authors used dual-energy (DE) penalized weighted least-squares (PWLS) methods to simultaneously combine the kV/MV data and perform the reconstruction. Results: The authors compared kV/MV reconstructions to kV-only reconstructions using a dental phantom with fillings and a hip-implant numerical phantom. Simulation results indicated that dual-energy sinogram patch FBP and the modified dual-energy PWLS method can successfully suppress metal streak artifacts and restore information lost due to photon starvation in the kV projections. The root-mean-square errors of soft-tissue patterns obtained using combined kV/MV data are 10–15 Hounsfield units smaller than those of the kV-only images, and the structural similarity index measure also indicates a 5%–10% improvement in the image quality. The added dose from the MV scan is much less than the dose from the kV scan if a high efficiency MV detector is assumed. Conclusions: The authors have shown that it

  3. Single reading with computer-aided detection performed by selected radiologists in a breast cancer screening program

    Energy Technology Data Exchange (ETDEWEB)

    Bargalló, Xavier, E-mail: xbarga@clinic.cat [Department of Radiology (CDIC), Hospital Clínic de Barcelona, C/ Villarroel, 170, 08036 Barcelona (Spain); Santamaría, Gorane; Amo, Montse del; Arguis, Pedro [Department of Radiology (CDIC), Hospital Clínic de Barcelona, C/ Villarroel, 170, 08036 Barcelona (Spain); Ríos, José [Biostatistics and Data Management Core Facility, IDIBAPS, (Hospital Clinic) C/ Mallorca, 183. Floor -1. Office #60. 08036 Barcelona (Spain); Grau, Jaume [Preventive Medicine and Epidemiology Unit, Hospital Clínic de Barcelona, C/ Villarroel, 170, 08036 Barcelona (Spain); Burrel, Marta; Cores, Enrique; Velasco, Martín [Department of Radiology (CDIC), Hospital Clínic de Barcelona, C/ Villarroel, 170, 08036 Barcelona (Spain)

    2014-11-15

    Highlights: • 1-The cancer detection rate of the screening program improved using a single reading protocol by experienced radiologists assisted by CAD. • 2-The cancer detection rate improved at the cost of increasing recall rate. • 3-CAD, used by breast radiologists, did not help to detect more cancers. - Abstract: Objectives: To assess the impact of shifting from a standard double reading plus arbitration protocol to a single reading by experienced radiologists assisted by computer-aided detection (CAD) in a breast cancer screening program. Methods: This was a prospective study approved by the ethics committee. Data from 21,321 consecutive screening mammograms in incident rounds (2010–2012) were read following a single reading plus CAD protocol and compared with data from 47,462 consecutive screening mammograms in incident rounds (2004–2010) that were interpreted following a double reading plus arbitration protocol. For the single reading, radiologists were selected on the basis of an appraisal of their previous performance. Results: Period 2010–2012 vs. period 2004–2010: Cancer detection rate (CDR): 6.1‰ (95% confidence interval: 5.1–7.2) vs. 5.25‰; Recall rate (RR): 7.02% (95% confidence interval: 6.7–7.4) vs. 7.24% (selected readers before arbitration) and vs. 3.94% (all readers after arbitration); Positive predictive value of recall: 8.69% vs. 13.32%. Average size of invasive cancers: 14.6 ± 9.5 mm vs. 14.3 ± 9.5 mm. Stage: 0 (22.3/26.1%); I (59.2/50.8%); II (19.2/17.1%); III (3.1/3.3%); IV (0/1.9%). Specialized breast radiologists performed better than general radiologists. Conclusions: The cancer detection rate of the screening program improved using a single reading protocol by experienced radiologists assisted by CAD, at the cost of a moderate increase of the recall rate, mainly related to the lack of arbitration.

  4. Metal artifact correction for x-ray computed tomography using kV and selective MV imaging

    International Nuclear Information System (INIS)

    Wu, Meng; Keil, Andreas; Constantin, Dragos; Star-Lack, Josh; Zhu, Lei; Fahrig, Rebecca

    2014-01-01

    Purpose: The overall goal of this work is to improve the computed tomography (CT) image quality for patients with metal implants or fillings by completing the missing kilovoltage (kV) projection data with selectively acquired megavoltage (MV) data that do not suffer from photon starvation. When both of these imaging systems, which are available on current radiotherapy devices, are used, metal streak artifacts are avoided, and the soft-tissue contrast is restored, even for regions in which the kV data cannot contribute any information. Methods: Three image-reconstruction methods, including two filtered back-projection (FBP)-based analytic methods and one iterative method, for combining kV and MV projection data from the two on-board imaging systems of a radiotherapy device are presented in this work. The analytic reconstruction methods modify the MV data based on the information in the projection or image domains and then patch the data onto the kV projections for a FBP reconstruction. In the iterative reconstruction, the authors used dual-energy (DE) penalized weighted least-squares (PWLS) methods to simultaneously combine the kV/MV data and perform the reconstruction. Results: The authors compared kV/MV reconstructions to kV-only reconstructions using a dental phantom with fillings and a hip-implant numerical phantom. Simulation results indicated that dual-energy sinogram patch FBP and the modified dual-energy PWLS method can successfully suppress metal streak artifacts and restore information lost due to photon starvation in the kV projections. The root-mean-square errors of soft-tissue patterns obtained using combined kV/MV data are 10–15 Hounsfield units smaller than those of the kV-only images, and the structural similarity index measure also indicates a 5%–10% improvement in the image quality. The added dose from the MV scan is much less than the dose from the kV scan if a high efficiency MV detector is assumed. Conclusions: The authors have shown that it

  5. Metal artifact correction for x-ray computed tomography using kV and selective MV imaging.

    Science.gov (United States)

    Wu, Meng; Keil, Andreas; Constantin, Dragos; Star-Lack, Josh; Zhu, Lei; Fahrig, Rebecca

    2014-12-01

    The overall goal of this work is to improve the computed tomography (CT) image quality for patients with metal implants or fillings by completing the missing kilovoltage (kV) projection data with selectively acquired megavoltage (MV) data that do not suffer from photon starvation. When both of these imaging systems, which are available on current radiotherapy devices, are used, metal streak artifacts are avoided, and the soft-tissue contrast is restored, even for regions in which the kV data cannot contribute any information. Three image-reconstruction methods, including two filtered back-projection (FBP)-based analytic methods and one iterative method, for combining kV and MV projection data from the two on-board imaging systems of a radiotherapy device are presented in this work. The analytic reconstruction methods modify the MV data based on the information in the projection or image domains and then patch the data onto the kV projections for a FBP reconstruction. In the iterative reconstruction, the authors used dual-energy (DE) penalized weighted least-squares (PWLS) methods to simultaneously combine the kV/MV data and perform the reconstruction. The authors compared kV/MV reconstructions to kV-only reconstructions using a dental phantom with fillings and a hip-implant numerical phantom. Simulation results indicated that dual-energy sinogram patch FBP and the modified dual-energy PWLS method can successfully suppress metal streak artifacts and restore information lost due to photon starvation in the kV projections. The root-mean-square errors of soft-tissue patterns obtained using combined kV/MV data are 10-15 Hounsfield units smaller than those of the kV-only images, and the structural similarity index measure also indicates a 5%-10% improvement in the image quality. The added dose from the MV scan is much less than the dose from the kV scan if a high efficiency MV detector is assumed. The authors have shown that it is possible to improve the image quality of
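
    A highly simplified numpy/scikit-image sketch of the sinogram-patching idea is given below: rays that are photon-starved in the kV sinogram are replaced by intensity-matched MV values before a standard filtered back-projection. The starvation threshold, the linear intensity matching and the use of skimage's iradon are assumptions made for illustration, not the authors' algorithm.

        import numpy as np
        from skimage.transform import iradon

        def patch_and_reconstruct(kv_sino, mv_sino, theta, starve_threshold=6.0):
            """kv_sino, mv_sino: line-integral sinograms of shape (n_detectors, n_angles).

            Rays whose kV line integral exceeds the threshold are treated as photon-starved
            (metal shadow) and replaced by linearly matched MV values, then FBP-reconstructed.
            """
            starved = kv_sino > starve_threshold
            valid = ~starved
            # Rough linear matching of MV to kV scale using rays that are valid in both.
            a, b = np.polyfit(mv_sino[valid], kv_sino[valid], deg=1)
            patched = np.where(starved, a * mv_sino + b, kv_sino)
            return iradon(patched, theta=theta, filter_name="ramp", circle=True)

        # Toy usage (in practice the sinograms come from the two on-board imagers):
        theta = np.linspace(0.0, 180.0, 180, endpoint=False)
        kv = np.random.rand(64, 180) * 4.0
        mv = kv / 2.0 + 0.1
        kv[30:34, :] = 9.0                    # simulate photon-starved rays behind metal
        print(patch_and_reconstruct(kv, mv, theta).shape)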

  6. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    Directory of Open Access Journals (Sweden)

    Ramos Hector

    2011-03-01

    Full Text Available Abstract Background Since its inception, proteomics has essentially operated in a discovery mode with the goal of identifying and quantifying the maximal number of proteins in a sample. Increasingly, proteomic measurements are also supporting hypothesis-driven studies, in which a predetermined set of proteins is consistently detected and quantified in multiple samples. Selected reaction monitoring (SRM) is a targeted mass spectrometric technique that supports the detection and quantification of specific proteins in complex samples at high sensitivity and reproducibility. Here, we describe ATAQS, an integrated software platform that supports all stages of targeted, SRM-based proteomics experiments including target selection, transition optimization and post acquisition data analysis. This software will significantly facilitate the use of targeted proteomic techniques and contribute to the generation of highly sensitive, reproducible and complete datasets that are particularly critical for the discovery and validation of targets in hypothesis-driven studies in systems biology. Result We introduce a new open source software pipeline, ATAQS (Automated and Targeted Analysis with Quantitative SRM), which consists of a number of modules that collectively support the SRM assay development workflow for targeted proteomic experiments (project management and generation of protein, peptide and transitions, and the validation of peptide detection by SRM). ATAQS provides a flexible pipeline for end-users by allowing the workflow to start or end at any point of the pipeline, and for computational biologists, by enabling the easy extension of java algorithm classes for their own algorithm plug-in or connection via an external web site. This integrated system supports all steps in a SRM-based experiment and provides a user-friendly GUI that can be run by any operating system that allows the installation of the Mozilla Firefox web browser. Conclusions Targeted

  7. NASA work unit system file maintenance manual

    Science.gov (United States)

    1972-01-01

    The NASA Work Unit System is a management information system for research tasks (i.e., work units) performed under NASA grants and contracts. It supplies profiles on research efforts and statistics on fund distribution. The file maintenance operator can add, delete and change records at a remote terminal or can submit punched cards to the computer room for batch update. The system is designed for file maintenance by a person with little or no knowledge of data processing techniques.

  8. Use of DBMS-10 for storage and retrieval of evaluated nuclear data files

    International Nuclear Information System (INIS)

    Dunford, C.L.

    1977-01-01

    The use of a data base management system (DBMS) for storage of, and retrieval from, the many scientific data bases maintained by the National Nuclear Data Center is currently being investigated. It would appear that a commercially available DBMS package would save the Center considerable money and manpower when adding new data files to the library and in the long-term maintenance of current data files. Current DBMS technology and experience with an internal DBMS system suggests an inherent inefficiency in processing large data networks where significant portions are accessed in a sequential manner. Such a file is the Evaluated Nuclear Data File (ENDF/B), which contains many large data tables, each one normally accessed in a sequential manner. After gaining some experience and success in small applications of the commercially available DBMS package, DBMS-10, on the Center's DECsystem-10 computer, it was decided to select a large data base as a test case before making a final decision on the implementation of DBMS-10 for all data bases. The obvious approach is to utilize the DBMS to index a random-access file. In this way one is able to increase the storage and retrieval efficiency at the one-time cost of additional programming effort. 2 figures

  9. Use of DBMS-10 for storage and retrieval of evaluated nuclear data files

    International Nuclear Information System (INIS)

    Dunford, C.L.

    1978-01-01

    The use of a data base management system (DBMS) for storage of, and retrieval from, the many scientific data bases maintained by the National Nuclear Data Center is currently being investigated. It would appear that a commercially available DBMS package would save the Center considerable money and manpower when adding new data files to our library and in the long-term maintenance of our current data files. Current DBMS technology and experience with our internal DBMS system suggests an inherent inefficiency in processing large data networks where significant portions are accessed in a sequential manner. Such a file is the Evaluated Nuclear Data File (ENDF/B) which contains many large data tables, each one normally accessed in a sequential manner. After gaining some experience and success in small applications of the commercially available DBMS package, DBMS-10, on the Center's DECsystem-10 computer, it was decided to select one of our large data bases as a test case before making a final decision on the implementation of DBMS-10 for all our data bases. The obvious approach is to utilize the DBMS to index a random access file. In this way one is able to increase the storage and retrieval efficiency at the one-time cost of additional programming effort
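
    A minimal modern sketch of the indexing approach described in these two records, with SQLite standing in for DBMS-10: the bulky, sequentially read tables stay in a flat random-access file, and the database stores only byte offsets and lengths keyed by material/file/section numbers. All table, column and file names are illustrative.

        import sqlite3

        DATA_PATH = "endf_tables.dat"     # flat random-access file holding the bulky tables

        con = sqlite3.connect("endf_index.db")
        con.execute("""CREATE TABLE IF NOT EXISTS endf_index (
                           mat INTEGER, mf INTEGER, mt INTEGER,
                           offset INTEGER, length INTEGER,
                           PRIMARY KEY (mat, mf, mt))""")

        def store(mat, mf, mt, payload):
            """Append a data table to the flat file and record its location in the index."""
            with open(DATA_PATH, "ab") as f:
                offset = f.tell()
                f.write(payload)
            con.execute("INSERT OR REPLACE INTO endf_index VALUES (?,?,?,?,?)",
                        (mat, mf, mt, offset, len(payload)))
            con.commit()

        def fetch(mat, mf, mt):
            """Look up the offset in the index, then read only that slice of the flat file."""
            row = con.execute("SELECT offset, length FROM endf_index "
                              "WHERE mat=? AND mf=? AND mt=?", (mat, mf, mt)).fetchone()
            if row is None:
                raise KeyError((mat, mf, mt))
            offset, length = row
            with open(DATA_PATH, "rb") as f:
                f.seek(offset)
                return f.read(length)

        store(125, 3, 102, b"cross-section table for H-1 radiative capture ...")
        print(fetch(125, 3, 102)[:20])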

  10. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.

  11. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  12. Computerized index for teaching files

    International Nuclear Information System (INIS)

    Bramble, J.M.

    1989-01-01

    A computerized index can be used to retrieve cases from a teaching file that have radiographic findings similar to an unknown case. The probability that a user will review cases with a correct diagnosis was estimated using the radiographic findings of arthritis in hand radiographs of 110 cases from a teaching file. The nearest-neighbor classification algorithm was used as a computer index to the 110 cases of arthritis. Each case was treated as an unknown and input to the computer index. The accuracy of the computer index in retrieving cases with the same diagnosis (including rheumatoid arthritis, gout, psoriatic arthritis, inflammatory osteoarthritis, and pyrophosphate arthropathy) was measured. A Bayes classifier algorithm was also tested on the same database. Results are presented. The estimated accuracy of the nearest-neighbor algorithm was 83%; by comparison, the estimated accuracy of the Bayes classifier algorithm was 78%. Conclusions: A computerized index to a teaching file based on the nearest-neighbor algorithm should allow the user to review cases with the correct diagnosis of an unknown case by entering the findings of the unknown case.
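
    A minimal sketch of such a nearest-neighbor index is shown below: stored teaching-file cases are represented as binary findings vectors, and the cases closest to an unknown case in Hamming distance are returned. The findings, diagnoses and vectors are invented for illustration, not data from the study.

        import numpy as np

        # Each stored case: a binary vector of radiographic findings plus its diagnosis.
        FINDINGS = ["erosions", "osteophytes", "joint_space_narrowing",
                    "soft_tissue_swelling", "chondrocalcinosis", "periostitis"]
        cases = np.array([[1, 0, 1, 1, 0, 0],      # rheumatoid arthritis
                          [1, 0, 0, 1, 0, 1],      # psoriatic arthritis
                          [0, 1, 1, 0, 1, 0],      # pyrophosphate arthropathy
                          [0, 1, 1, 0, 0, 0]])     # inflammatory osteoarthritis
        diagnoses = ["rheumatoid", "psoriatic", "pyrophosphate", "osteoarthritis"]

        def nearest_cases(unknown, k=2):
            """Rank stored cases by Hamming distance to the unknown findings vector."""
            distances = np.sum(cases != np.asarray(unknown), axis=1)
            order = np.argsort(distances)[:k]
            return [(diagnoses[i], int(distances[i])) for i in order]

        # Unknown case with erosions, swelling and periostitis retrieves the psoriatic pattern.
        print(nearest_cases([1, 0, 0, 1, 0, 1]))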

  13. Integrating medicinal chemistry, organic/combinatorial chemistry, and computational chemistry for the discovery of selective estrogen receptor modulators with Forecaster, a novel platform for drug discovery.

    Science.gov (United States)

    Therrien, Eric; Englebienne, Pablo; Arrowsmith, Andrew G; Mendoza-Sanchez, Rodrigo; Corbeil, Christopher R; Weill, Nathanael; Campagna-Slater, Valérie; Moitessier, Nicolas

    2012-01-23

    As part of a large medicinal chemistry program, we wish to develop novel selective estrogen receptor modulators (SERMs) as potential breast cancer treatments using a combination of experimental and computational approaches. However, one of the remaining difficulties nowadays is to fully integrate computational (i.e., virtual, theoretical) and medicinal (i.e., experimental, intuitive) chemistry to take advantage of the full potential of both. For this purpose, we have developed a Web-based platform, Forecaster, and a number of programs (e.g., Prepare, React, Select) with the aim of combining computational chemistry and medicinal chemistry expertise to facilitate drug discovery and development and more specifically to integrate synthesis into computer-aided drug design. In our quest for potent SERMs, this platform was used to build virtual combinatorial libraries, filter and extract a highly diverse library from the NCI database, and dock them to the estrogen receptor (ER), with all of these steps being fully automated by computational chemists for use by medicinal chemists. As a result, virtual screening of a diverse library seeded with active compounds followed by a search for analogs yielded an enrichment factor of 129, with 98% of the seeded active compounds recovered, while the screening of a designed virtual combinatorial library including known actives yielded an area under the receiver operating characteristic (AU-ROC) of 0.78. The lead optimization proved less successful, further demonstrating the challenge to simulate structure activity relationship studies.
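
    For reference, the enrichment factor quoted above is conventionally computed from a ranked screening list as the fraction of actives recovered in the selected top fraction divided by the fraction expected by chance; the small sketch below uses invented numbers purely to show the arithmetic.

        def enrichment_factor(ranked_is_active, top_fraction=0.01):
            """EF = (active rate in the top X% of the ranked list) / (active rate overall)."""
            n = len(ranked_is_active)
            n_top = max(1, int(round(n * top_fraction)))
            actives_total = sum(ranked_is_active)
            actives_top = sum(ranked_is_active[:n_top])
            return (actives_top / n_top) / (actives_total / n)

        # Illustrative ranked list: 10,000 compounds, 50 actives, 40 of them in the top 100.
        ranked = [1] * 40 + [0] * 60 + [1] * 10 + [0] * 9890
        print(enrichment_factor(ranked, top_fraction=0.01))   # -> 80.0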

  14. The design and development of GRASS file reservation system

    International Nuclear Information System (INIS)

    Huang Qiulan; Zhu Suijiang; Cheng Yaodong; Chen Gang

    2010-01-01

    GFRS (GRASS File Reservation System) is designed to improve the file access performance of GRASS (Grid-enabled Advanced Storage System), a Hierarchical Storage Management (HSM) system developed at the Computing Center, Institute of High Energy Physics. GRASS provides massive storage management and data migration, but its data migration policy is based simply on factors such as pool water level and migration intervals, so it lacks precise control over individual files. We therefore designed GFRS to implement user-based file reservation, which reserves and keeps the required files on disk for high-energy physicists. GFRS can improve file access speed for users by avoiding the migration of frequently accessed files to tape. In this paper we first give a brief introduction to the GRASS system and then describe the detailed architecture and implementation of GFRS. Experimental results from GFRS show good performance, and a simple analysis is made based on them. (authors)
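
    The reservation idea can be sketched as a small check in front of the migration pass: files that appear on any user's reservation list are excluded from the set of cold files that would otherwise be migrated from disk to tape. The data structures, paths and age threshold below are illustrative assumptions, not the GFRS implementation.

        import time

        # Illustrative in-memory state; GFRS would keep this in its own database.
        reservations = {                    # user -> files pinned on disk
            "alice": {"/grass/run42/hits.root"},
        }
        last_access = {
            "/grass/run42/hits.root": time.time() - 86400 * 30,   # cold but reserved
            "/grass/run41/hits.root": time.time() - 86400 * 30,   # cold and unreserved
            "/grass/run43/hits.root": time.time() - 60,           # recently used
        }

        def reserved_files():
            return set().union(*reservations.values()) if reservations else set()

        def migration_candidates(max_idle=86400 * 7):
            """Files idle longer than the threshold and not reserved by any user."""
            now = time.time()
            pinned = reserved_files()
            return [path for path, t in last_access.items()
                    if now - t > max_idle and path not in pinned]

        print(migration_candidates())       # only /grass/run41/hits.root is migrated to tape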

  15. Decay data file based on the ENSDF file

    Energy Technology Data Exchange (ETDEWEB)

    Katakura, J. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    A decay data file in the JENDL (Japanese Evaluated Nuclear Data Library) format, based on the ENSDF (Evaluated Nuclear Structure Data File) file, was produced as a tentative version of one of the JENDL special purpose files. The problems of using the ENSDF file as the primary source data for the JENDL decay data file are presented. (author)

  16. file 4

    African Journals Online (AJOL)

    students who were selected by a systematic sampling method. A semi-structured .... response rate of 96.9%. The mean age of 69.2% of respondents, while reduction in food .... natural disasters, and travel health world summit on global ...

  17. file 7

    African Journals Online (AJOL)

    Methodology. This study was carried out in antenatal clinics of Obafemi Awolowo University Teaching Hospital Complex. It ... adverse effects of malaria in pregnant women clinics of the selected hospitals within the. 7 in endemic areas of the .... like heat and difficulty in installation on NDHS 2008 reports . modern day beds ...

  18. Optimizing Input/Output Using Adaptive File System Policies

    Science.gov (United States)

    Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.

    1996-01-01

    Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.
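
    A minimal sketch of the classification step is shown below: the recent request offsets are labelled as sequential, strided or random, and a caching/prefetching policy is chosen accordingly; a performance sensor would then tune the parameters from measured hit rates. The categories, block size and policy table are illustrative, not the paper's classifier.

        def classify_access_pattern(offsets, block=4096):
            """Label a window of recent request offsets as sequential, strided or random."""
            gaps = [b - a for a, b in zip(offsets, offsets[1:])]
            if not gaps:
                return "unknown"
            if all(g == block for g in gaps):
                return "sequential"
            if len(set(gaps)) == 1:
                return "strided"
            return "random"

        def choose_policy(pattern):
            """Map the detected pattern to caching/prefetching behaviour."""
            return {"sequential": {"prefetch_blocks": 64, "cache": "LRU"},
                    "strided":    {"prefetch_blocks": 8,  "cache": "LRU"},
                    "random":     {"prefetch_blocks": 0,  "cache": "MRU"},
                    "unknown":    {"prefetch_blocks": 1,  "cache": "LRU"}}[pattern]

        recent = [0, 4096, 8192, 12288, 16384]
        pattern = classify_access_pattern(recent)
        print(pattern, choose_policy(pattern))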

  19. UPIN Group File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Group Unique Physician Identifier Number (UPIN) File is the business entity file that contains the group practice UPIN and descriptive information. It does NOT...

  20. Hyperactivity--Drug Therapy/Food Additives/Allergies. A Selective Bibliography. Exceptional Child Bibliography Series No. 602.

    Science.gov (United States)

    ERIC Clearinghouse on Handicapped and Gifted Children, Reston, VA.

    The annotated bibliography on Hyperactivity--Drug Therapy/Food Additives/Allergies contains approximately 65 abstracts and associated indexing information for documents or journal articles published from 1968 to 1975 and selected from the computer files of the Council for Exceptional Children's Information Services and the Education Resources…

  1. Multi-level, automatic file management system using magnetic disk, mass storage system and magnetic tape

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1979-12-01

    A simple, effective file management system using magnetic disk, mass storage system (MSS) and magnetic tape is described. The following concepts and techniques are introduced in this file management system. (1) The distribution of files and the continuity character of file references are closely approximated by a memory retention function; a density function based on the memory retention function is thus defined. (2) A method of computing the cost/benefit lines for magnetic disk, MSS and magnetic tape is presented. (3) A decision process for an optimal organization of the file facilities, incorporating the distribution of file demands to the respective file devices, is presented. (4) A method of simple, practical, effective, automatic file management, incorporating multi-level file management, space management and file migration control, is proposed. (author)
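
    The memory-retention idea can be sketched as an exponentially decaying per-file score that is refreshed on every reference and compared against thresholds to decide whether a file belongs on disk, on the MSS or on tape. The decay constant and thresholds below are illustrative assumptions, not values from the report.

        import math
        import time

        DECAY = 1.0 / (86400 * 7)        # retention falls to 1/e in about a week (illustrative)

        class FileRecord:
            def __init__(self, path):
                self.path = path
                self.retention = 1.0
                self.last_access = time.time()

            def current_retention(self):
                """Retention decays exponentially with the time since the last reference."""
                return self.retention * math.exp(-DECAY * (time.time() - self.last_access))

            def touch(self):
                """Refresh retention on every reference to the file."""
                self.retention = self.current_retention() + 1.0
                self.last_access = time.time()

        def placement(record, disk_threshold=0.5, mss_threshold=0.05):
            """Decide the storage level from the decayed retention value."""
            r = record.current_retention()
            if r >= disk_threshold:
                return "magnetic disk"
            if r >= mss_threshold:
                return "mass storage system"
            return "magnetic tape"

        rec = FileRecord("/jobs/output.dat")
        rec.touch()
        print(placement(rec))            # recently referenced, so it stays on disk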

  2. A File Archival System

    Science.gov (United States)

    Fanselow, J. L.; Vavrus, J. L.

    1984-01-01

    ARCH, a file archival system for the DEC VAX, provides for easy offline storage and retrieval of arbitrary files on a DEC VAX system. The system is designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of the same programs and associated files.

  3. Text File Comparator

    Science.gov (United States)

    Kotler, R. S.

    1983-01-01

    The file comparator program IFCOMP is a text file comparator for IBM OS/VS-compatible systems. IFCOMP accepts two text files as input and produces a listing of differences in pseudo-update form. IFCOMP is very useful in monitoring changes made to software at the source code level.

  4. Longitudinal effects of college type and selectivity on degrees conferred upon undergraduate females in physical science, life science, math and computer science, and social science

    Science.gov (United States)

    Stevens, Stacy Mckimm

There has been much research to suggest that a single-sex college experience for female undergraduate students can increase self-confidence and leadership ability during the college years and beyond. The results of previous studies also suggest that these students achieve in the workforce and enter graduate school at higher rates than their female peers graduating from coeducational institutions. However, some researchers have questioned these findings, suggesting that it is the selectivity level of the colleges rather than the gender composition of the student body that causes these differences. The purpose of this study was to justify the continuation of single-sex educational opportunities for females at the post-secondary level by examining the effects that college selectivity, college type, and time have on the rate of undergraduate females pursuing majors in non-traditional fields. The study examined the percentage of physical science, life science, math and computer science, and social science degrees conferred upon females graduating from women's colleges from 1985-2001, as compared to those at comparable coeducational colleges. Sampling for this study consisted of 42 liberal arts women's (n = 21) and coeducational (n = 21) colleges. Variables included the type of college, the selectivity level of the college, and the effect of time on the percentage of female graduates. Doubly multivariate repeated measures analysis of variance testing revealed significant main effects for college selectivity on social science graduates, and time on both life science and math and computer science graduates. Significant interaction was also found between the college type and time on social science graduates, as well as the college type, selectivity level, and time on math and computer science graduates. Implications of the results and suggestions for further research are discussed.

  5. Grammar-Based Specification and Parsing of Binary File Formats

    Directory of Open Access Journals (Sweden)

    William Underwood

    2012-03-01

The capability to validate and view or play binary file formats, as well as to convert binary file formats to standard or current file formats, is critically important to the preservation of digital data and records. This paper describes the extension of context-free grammars from strings to binary files. Binary files are arrays of data types, such as long and short integers, floating-point numbers and pointers, as well as characters. The concept of an attribute grammar is extended to these context-free array grammars. This attribute grammar has been used to define a number of chunk-based and directory-based binary file formats. A parser generator has been used with some of these grammars to generate syntax checkers (recognizers) for validating binary file formats. Among the potential benefits of an attribute grammar-based approach to specification and parsing of binary file formats is that attribute grammars not only support format validation, but also support generation of error messages during validation of format, validation of semantic constraints, attribute value extraction (characterization), generation of viewers or players for file formats, and conversion to current or standard file formats. The significance of these results is that with these extensions to core computer science concepts, traditional parser/compiler technologies can potentially be used as part of a general, cost-effective curation strategy for binary file formats.
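
    As a hedged illustration of validating a chunk-based binary layout (not the paper's attribute-grammar formalism), the sketch below hand-codes a recognizer for a hypothetical format made of 4-byte ASCII tags, 4-byte little-endian lengths, and payloads:

    # Recognizer for a generic chunk-based binary layout; the format is invented
    # for illustration and malformed input raises ValueError.
    import struct

    def parse_chunks(data: bytes):
        """Yield (tag, payload) pairs; raise ValueError on malformed input."""
        pos = 0
        while pos < len(data):
            if len(data) - pos < 8:
                raise ValueError("truncated chunk header at offset %d" % pos)
            tag, length = struct.unpack_from("<4sI", data, pos)
            pos += 8
            if pos + length > len(data):
                raise ValueError("chunk %r overruns file" % tag)
            yield tag.decode("ascii"), data[pos:pos + length]
            pos += length

    if __name__ == "__main__":
        blob = b"HDR " + struct.pack("<I", 3) + b"abc"
        for tag, payload in parse_chunks(blob):
            print(tag, payload)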

  6. Computer-assisted design and synthesis of a highly selective smart adsorbent for extraction of clonazepam from human serum.

    Science.gov (United States)

    Aqababa, Heydar; Tabandeh, Mehrdad; Tabatabaei, Meisam; Hasheminejad, Meisam; Emadi, Masoomeh

    2013-01-01

A computational approach was applied to screen functional monomers and polymerization solvents for the rational design of molecularly imprinted polymers (MIPs) as smart adsorbents for solid-phase extraction of clonazepam (CLO) from human serum. The computed binding energies of the complexes formed between the template and the functional monomers were compared. The primary computational results were corrected by taking into account both the basis set superposition error (BSSE) and the effect of the polymerization solvent, using the counterpoise (CP) correction and the polarizable continuum model, respectively. Based on the theoretical calculations, trifluoromethyl acrylic acid (TFMAA) and acrylonitrile (ACN) were found to be the best and the worst functional monomers, respectively. To test the accuracy of the computational results, three MIPs were synthesized with different functional monomers and their Langmuir-Freundlich (LF) isotherms were studied. The experimental results confirmed the computational results and indicated that the MIP synthesized using TFMAA had the highest affinity for CLO in human serum despite the presence of a vast spectrum of ions. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. 29 CFR 4000.28 - What if I send a computer disk?

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false What if I send a computer disk? 4000.28 Section 4000.28... I send a computer disk? (a) In general. We determine your filing or issuance date for a computer... paragraph (b) of this section. (1) Filings. For computer-disk filings, we may treat your submission as...

  8. A data compression algorithm for nuclear spectrum files

    International Nuclear Information System (INIS)

    Mika, J.F.; Martin, L.J.; Johnston, P.N.

    1990-01-01

The total space occupied by computer files of spectra generated in nuclear spectroscopy systems can lead to problems of storage and transmission time. An algorithm is presented which significantly reduces the space required to store nuclear spectra, without loss of any information content. Testing indicates that spectrum files can be routinely compressed by a factor of 5. (orig.)
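
    The paper's algorithm is not given here; the sketch below shows one common lossless approach to the same problem, assuming that neighbouring spectrum channels carry similar counts: delta-encode the channel counts and feed the result to a general-purpose compressor.

    # Lossless spectrum compression sketch: delta-encode channel counts, then zlib.
    import struct
    import zlib

    def compress_spectrum(counts):
        deltas = [counts[0]] + [b - a for a, b in zip(counts, counts[1:])]
        raw = struct.pack(f"<{len(deltas)}i", *deltas)
        return zlib.compress(raw, level=9)

    def decompress_spectrum(blob):
        raw = zlib.decompress(blob)
        deltas = struct.unpack(f"<{len(raw) // 4}i", raw)
        counts, total = [], 0
        for d in deltas:
            total += d
            counts.append(total)
        return counts

    if __name__ == "__main__":
        spectrum = [1000 + (i % 7) for i in range(4096)]   # synthetic smooth spectrum
        blob = compress_spectrum(spectrum)
        assert decompress_spectrum(blob) == spectrum        # round-trip is exact
        print(len(blob), "compressed bytes for", 4 * len(spectrum), "raw bytes")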

  9. Effect of reciprocating file motion on microcrack formation in root canals: an SEM study.

    Science.gov (United States)

    Ashwinkumar, V; Krithikadatta, J; Surendran, S; Velmurugan, N

    2014-07-01

To compare dentinal microcrack formation whilst using Ni-Ti hand K-files, ProTaper hand and rotary files and the WaveOne reciprocating file. One hundred and fifty mandibular first molars were selected. Thirty teeth were left unprepared and served as controls, and the remaining 120 teeth were divided into four groups. Ni-Ti hand K-files, ProTaper hand files, ProTaper rotary files and WaveOne Primary reciprocating files were used to prepare the mesial canals. Roots were then sectioned 3, 6 and 9 mm from the apex, and the cut surface was observed under a scanning electron microscope (SEM) and checked for the presence of dentinal microcracks. The control and Ni-Ti hand K-file groups were not associated with microcracks. In roots prepared with ProTaper hand files, ProTaper rotary files and WaveOne Primary reciprocating files, dentinal microcracks were present. There was a significant difference between the control/Ni-Ti hand K-file groups and the ProTaper hand file/ProTaper rotary file/WaveOne Primary reciprocating file groups, with ProTaper rotary files producing the most microcracks. No significant difference was observed between teeth prepared with ProTaper hand files and WaveOne Primary reciprocating files. ProTaper rotary files were associated with significantly more microcracks than ProTaper hand files and WaveOne Primary reciprocating files. Ni-Ti hand K-files did not produce microcracks at any levels inside the root canals. © 2013 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  10. Frequency-selective near-field radiative heat transfer between photonic crystal slabs: a computational approach for arbitrary geometries and materials.

    Science.gov (United States)

    Rodriguez, Alejandro W; Ilic, Ognjen; Bermel, Peter; Celanovic, Ivan; Joannopoulos, John D; Soljačić, Marin; Johnson, Steven G

    2011-09-09

    We demonstrate the possibility of achieving enhanced frequency-selective near-field radiative heat transfer between patterned (photonic-crystal) slabs at designable frequencies and separations, exploiting a general numerical approach for computing heat transfer in arbitrary geometries and materials based on the finite-difference time-domain method. Our simulations reveal a tradeoff between selectivity and near-field enhancement as the slab-slab separation decreases, with the patterned heat transfer eventually reducing to the unpatterned result multiplied by a fill factor (described by a standard proximity approximation). We also find that heat transfer can be further enhanced at selective frequencies when the slabs are brought into a glide-symmetric configuration, a consequence of the degeneracies associated with the nonsymmorphic symmetry group.

  11. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    Science.gov (United States)

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the current gap

  12. INTRA- AND INTER-OBSERVER RELIABILITY IN SELECTION OF THE HEART RATE DEFLECTION POINT DURING INCREMENTAL EXERCISE: COMPARISON TO A COMPUTER-GENERATED DEFLECTION POINT

    Directory of Open Access Journals (Sweden)

    Bridget A. Duoos

    2002-12-01

This study was designed to (1) determine the relative frequency of occurrence of a heart rate deflection point (HRDP), when compared to a linear relationship, during progressive exercise, (2) measure the reproducibility of a visual assessment of the HRDP, both within and between observers, and (3) compare visual and computer-assessed deflection points. Subjects consisted of 73 competitive male cyclists with mean age of 31.4 ± 6.3 years, mean height 178.3 ± 4.8 cm and weight 74.0 ± 4.4 kg. Tests were conducted on an electrically-braked cycle ergometer beginning at 25 watts and progressing 25 watts per minute to fatigue. Heart rates were recorded during the last 10 seconds of each stage and at fatigue. Scatter plots of heart rate versus watts were computer-generated and given to 3 observers on two different occasions. A computer program was developed to assess whether the data points were best represented by a single line or two lines; the HRDP represented the intersection of the two lines. Results of this study showed that (1) computer-assessed HRDP showed that 44 of 73 subjects (60.3%) had scatter plots best represented by a straight line with no HRDP; (2) in those subjects having an HRDP, all 3 observers showed significant differences (p = 0.048, p = 0.007, p = 0.001) in the reproducibility of their HRDP selection, and differences in HRDP selection were significant for two of the three comparisons between observers (p = 0.002, p = 0.305, p = 0.0003); (3) computer-generated HRDP was significantly different from visual HRDP for 2 of 3 observers (p = 0.0016, p = 0.513, p = 0.0001). It is concluded that (1) HRDP occurs in a minority of subjects, (2) significant differences exist, both within and between observers, in the selection of HRDP, and (3) differences in agreement between visual and computer-generated HRDP indicate that, when HRDP exists, it should be computer-assessed.
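
    The one-line-versus-two-lines decision can be sketched as follows; the synthetic heart-rate data, the minimum segment length, and the use of a plain residual-sum-of-squares comparison (rather than a formal statistical test) are illustrative assumptions:

    # Fit a single regression line and the best two-segment fit over candidate
    # breakpoints, then compare residual sums of squares; the breakpoint of the
    # better two-segment fit plays the role of the deflection point.
    import numpy as np

    def rss_line(x, y):
        coeffs = np.polyfit(x, y, 1)
        return float(np.sum((y - np.polyval(coeffs, x)) ** 2))

    def best_two_segments(x, y, min_pts=3):
        best = None
        for k in range(min_pts, len(x) - min_pts):
            rss = rss_line(x[:k], y[:k]) + rss_line(x[k:], y[k:])
            if best is None or rss < best[1]:
                best = (x[k], rss)
        return best  # (candidate deflection point in watts, combined RSS)

    if __name__ == "__main__":
        watts = np.arange(25, 401, 25, dtype=float)
        hr = np.where(watts < 300, 80 + 0.3 * watts, 170 + 0.02 * (watts - 300))
        print("single-line RSS:", rss_line(watts, hr))
        print("deflection candidate, two-line RSS:", best_two_segments(watts, hr))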

  13. Securing the AliEn File Catalogue - Enforcing authorization with accountable file operations

    International Nuclear Information System (INIS)

    Schreiner, Steffen; Banerjee, Subho Sankar; Betev, Latchezar; Carminati, Federico; Vladimirovna Datskova, Olga; Furano, Fabrizio; Grigoras, Alina; Grigoras, Costin; Mendez Lorenzo, Patricia; Peters, Andreas Joachim; Saiz, Pablo; Bagnasco, Stefano; Zhu Jianlin

    2011-01-01

    The AliEn Grid Services, as operated by the ALICE Collaboration in its global physics analysis grid framework, is based on a central File Catalogue together with a distributed set of storage systems and the possibility to register links to external data resources. This paper describes several identified vulnerabilities in the AliEn File Catalogue access protocol regarding fraud and unauthorized file alteration and presents a more secure and revised design: a new mechanism, called LFN Booking Table, is introduced in order to keep track of access authorization in the transient state of files entering or leaving the File Catalogue. Due to a simplification of the original Access Envelope mechanism for xrootd-protocol-based storage systems, fundamental computational improvements of the mechanism were achieved as well as an up to 50% reduction of the credential's size. By extending the access protocol with signed status messages from the underlying storage system, the File Catalogue receives trusted information about a file's size and checksum and the protocol is no longer dependent on client trust. Altogether, the revised design complies with atomic and consistent transactions and allows for accountable, authentic, and traceable file operations. This paper describes these changes as part and beyond the development of AliEn version 2.19.

  14. Winning the Popularity Contest: Researcher Preference When Selecting Resources for Civil Engineering, Computer Science, Mathematics and Physics Dissertations

    Science.gov (United States)

    Dotson, Daniel S.; Franks, Tina P.

    2015-01-01

    More than 53,000 citations from 609 dissertations published at The Ohio State University between 1998-2012 representing four science disciplines--civil engineering, computer science, mathematics and physics--were examined to determine what, if any, preferences or trends exist. This case study seeks to identify whether or not researcher preferences…

  15. Frequency-selectivity of a thalamocortical relay neuron during Parkinson's disease and deep brain stimulation: a computational study

    NARCIS (Netherlands)

Cagnan, Hayriye; Meijer, Hil Gaétan Ellart; van Gils, Stephanus A.; Krupa, M.; Heida, Tjitske; Rudolph, Michelle; Wadman, Wyse J.; Martens, Hubert C.F.

    2009-01-01

    In this computational study, we investigated (i) the functional importance of correlated basal ganglia (BG) activity associated with Parkinson's disease (PD) motor symptoms by analysing the effects of globus pallidus internum (GPi) bursting frequency and synchrony on a thalamocortical (TC) relay

  16. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    Science.gov (United States)

    Gray, F. P., Jr. (Editor)

    1979-01-01

A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.

  17. pcircle - A Suite of Scalable Parallel File System Tools

    Energy Technology Data Exchange (ETDEWEB)

    2015-10-01

Most software related to file systems is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. The "pcircle" software builds on top of ubiquitous MPI in a cluster computing environment and a "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, and integrity checking.
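
    A much-simplified, hedged analogue of the parallel checksumming idea: files are divided statically among MPI ranks (pcircle itself uses work-stealing, which is not shown) and each rank hashes its share with mpi4py; run with e.g. "mpiexec -n 4 python checksum.py file1 file2 ...".

    # Static round-robin division of files among MPI ranks; rank 0 gathers results.
    import hashlib
    import sys
    from mpi4py import MPI

    def sha1_of(path, bufsize=1 << 20):
        h = hashlib.sha1()
        with open(path, "rb") as f:
            while chunk := f.read(bufsize):
                h.update(chunk)
        return h.hexdigest()

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    my_files = sys.argv[1:][rank::size]            # this rank's share of the file list
    my_sums = {p: sha1_of(p) for p in my_files}

    all_sums = comm.gather(my_sums, root=0)
    if rank == 0:
        for part in all_sums:
            for path, digest in sorted(part.items()):
                print(digest, path)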

  18. MR-AFS: a global hierarchical file-system

    International Nuclear Information System (INIS)

    Reuter, H.

    2000-01-01

    The next generation of fusion experiments will use object-oriented technology creating the need for world wide sharing of an underlying hierarchical file-system. The Andrew file system (AFS) is a well known and widely spread global distributed file-system. Multiple-resident-AFS (MR-AFS) combines the features of AFS with hierarchical storage management systems. Files in MR-AFS therefore may be migrated on secondary storage, such as roboted tape libraries. MR-AFS is in use at IPP for the current experiments and data originating from super-computer applications. Experiences and scalability issues are discussed

  19. Network survivability performance (computer diskette)

    Science.gov (United States)

    1993-11-01

    File characteristics: Data file; 1 file. Physical description: 1 computer diskette; 3 1/2 in.; high density; 2.0MB. System requirements: Mac; Word. This technical report has been developed to address the survivability of telecommunications networks including services. It responds to the need for a common understanding of, and assessment techniques for network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunication networks to user expectations for network survivability.

  20. A structural, functional, and computational analysis suggests pore flexibility as the base for the poor selectivity of CNG channels.

    Science.gov (United States)

    Napolitano, Luisa Maria Rosaria; Bisha, Ina; De March, Matteo; Marchesi, Arin; Arcangeletti, Manuel; Demitri, Nicola; Mazzolini, Monica; Rodriguez, Alex; Magistrato, Alessandra; Onesti, Silvia; Laio, Alessandro; Torre, Vincent

    2015-07-07

    Cyclic nucleotide-gated (CNG) ion channels, despite a significant homology with the highly selective K(+) channels, do not discriminate among monovalent alkali cations and are permeable also to several organic cations. We combined electrophysiology, molecular dynamics (MD) simulations, and X-ray crystallography to demonstrate that the pore of CNG channels is highly flexible. When a CNG mimic is crystallized in the presence of a variety of monovalent cations, including Na(+), Cs(+), and dimethylammonium (DMA(+)), the side chain of Glu66 in the selectivity filter shows multiple conformations and the diameter of the pore changes significantly. MD simulations indicate that Glu66 and the prolines in the outer vestibule undergo large fluctuations, which are modulated by the ionic species and the voltage. This flexibility underlies the coupling between gating and permeation and the poor ionic selectivity of CNG channels.

  1. Apically extruded dentin debris by reciprocating single-file and multi-file rotary system.

    Science.gov (United States)

    De-Deus, Gustavo; Neves, Aline; Silva, Emmanuel João; Mendonça, Thais Accorsi; Lourenço, Caroline; Calixto, Camila; Lima, Edson Jorge Moreira

    2015-03-01

    This study aims to evaluate the apical extrusion of debris by the two reciprocating single-file systems: WaveOne and Reciproc. Conventional multi-file rotary system was used as a reference for comparison. The hypotheses tested were (i) the reciprocating single-file systems extrude more than conventional multi-file rotary system and (ii) the reciprocating single-file systems extrude similar amounts of dentin debris. After solid selection criteria, 80 mesial roots of lower molars were included in the present study. The use of four different instrumentation techniques resulted in four groups (n = 20): G1 (hand-file technique), G2 (ProTaper), G3 (WaveOne), and G4 (Reciproc). The apparatus used to evaluate the collection of apically extruded debris was typical double-chamber collector. Statistical analysis was performed for multiple comparisons. No significant difference was found in the amount of the debris extruded between the two reciprocating systems. In contrast, conventional multi-file rotary system group extruded significantly more debris than both reciprocating groups. Hand instrumentation group extruded significantly more debris than all other groups. The present results yielded favorable input for both reciprocation single-file systems, inasmuch as they showed an improved control of apically extruded debris. Apical extrusion of debris has been studied extensively because of its clinical relevance, particularly since it may cause flare-ups, originated by the introduction of bacteria, pulpal tissue, and irrigating solutions into the periapical tissues.

  2. Towards the Selection of an Optimal Global Geopotential Model for the Computation of the Long-Wavelength Contribution: A Case Study of Ghana

    Directory of Open Access Journals (Sweden)

    Caleb Iddissah Yakubu

    2017-11-01

The selection of a global geopotential model (GGM) for modeling the long-wavelength contribution in geoid computation is imperative, not only because of the plethora of GGMs available but, more importantly, because it influences the accuracy of a geoid model. In this study, we propose using the Gaussian averaging function for selecting an optimal GGM and degree and order (d/o) for the remove-compute-restore technique, as a replacement for the direct comparison of terrestrial gravity anomalies and GGM anomalies, because ground data and GGMs have different frequencies. Overall, EGM2008 performed better than all the tested GGMs, at an optimal d/o of 222. We verified the results by computing geoid models using Heck and Grüninger’s modification and validated them against GPS/trigonometric data. The results of the validation were consistent with those of the averaging process, with EGM2008 giving the smallest standard deviation of 0.457 m at d/o 222, resulting in an 8% improvement over the previous geoid model. In addition, this geoid model, the Ghanaian Gravimetric Geoid 2017 (GGG 2017), may be used to replace second-order class II leveling, with an expected error of 6.8 mm/km for baselines ranging from 20 to 225 km.

  3. Characteristics of file sharing and peer to peer networking | Opara ...

    African Journals Online (AJOL)

    Characteristics of file sharing and peer to peer networking. ... distributing or providing access to digitally stored information, such as computer programs, ... including in multicast systems, anonymous communications systems, and web caches.

  4. Selection of a computer code for Hanford low-level waste engineered-system performance assessment. Revision 1

    International Nuclear Information System (INIS)

    McGrail, B.P.; Bacon, D.H.

    1998-02-01

    Planned performance assessments for the proposed disposal of low-activity waste (LAW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. The available computer codes with suitable capabilities at the time Revision 0 of this document was prepared were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LAW glass corrosion and the mobility of radionuclides. This analysis was repeated in this report but updated to include additional processes that have been found to be important since Revision 0 was issued and to include additional codes that have been released. The highest ranked computer code was found to be the STORM code developed at PNNL for the US Department of Energy for evaluation of arid land disposal sites

  5. Analogous Mechanisms of Selection and Updating in Declarative and Procedural Working Memory: Experiments and a Computational Model

    Science.gov (United States)

    Oberauer, Klaus; Souza, Alessandra S.; Druey, Michel D.; Gade, Miriam

    2013-01-01

    The article investigates the mechanisms of selecting and updating representations in declarative and procedural working memory (WM). Declarative WM holds the objects of thought available, whereas procedural WM holds representations of what to do with these objects. Both systems consist of three embedded components: activated long-term memory, a…

  6. Sulfide perovskites for solar energy conversion applications: computational screening and synthesis of the selected compound LaYS3

    DEFF Research Database (Denmark)

    Kuhar, Korina; Crovetto, Andrea; Pandey, Mohnish

    2017-01-01

    of ternary sulfides followed by synthesis and confirmation of the properties of one of the most promising materials. The screening focusses on materials with ABS3 composition taking both perovskite and non-perovskite structures into consideration, and the material selection is based on descriptors...

  7. User Adapted Motor-Imaginary Brain-Computer Interface by means of EEG Channel Selection Based on Estimation of Distributed Algorithms

    Directory of Open Access Journals (Sweden)

    Aitzol Astigarraga

    2016-01-01

Brain-Computer Interfaces (BCIs) have become a research field with interesting applications, and it can be inferred from published papers that different persons activate different parts of the brain to perform the same action. This paper presents a personalized interface design method, for electroencephalogram (EEG)-based BCIs, based on channel selection. We describe a novel two-step method in which, firstly, a computationally inexpensive greedy algorithm finds an adequate search range; and, then, an Estimation of Distribution Algorithm (EDA) is applied in the reduced range to obtain the optimal channel subset. The use of the EDA allows us to select the most interacting channel subset, removing the irrelevant and noisy ones, thus selecting the most discriminative subset of channels for each user and improving accuracy. The method is tested on the IIIa dataset from the BCI competition III. Experimental results show that the resulting channel subset is consistent with motor-imagery-related neurophysiological principles and optimizes performance while reducing the number of channels.
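
    Only the first (greedy) stage of the two-step method lends itself to a short sketch; the classifier, the data layout, and the synthetic data below are assumptions for illustration, and the EDA refinement stage is omitted:

    # Greedy forward selection of EEG channels by cross-validated accuracy.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    def greedy_channel_selection(X, y, max_channels=8):
        """X: array of shape (trials, channels, features); y: class labels."""
        selected, remaining = [], list(range(X.shape[1]))
        best_score = 0.0
        while remaining and len(selected) < max_channels:
            scores = {}
            for ch in remaining:
                feats = X[:, selected + [ch], :].reshape(len(X), -1)
                scores[ch] = cross_val_score(
                    LinearDiscriminantAnalysis(), feats, y, cv=5).mean()
            ch = max(scores, key=scores.get)
            if scores[ch] <= best_score:
                break                              # no further improvement
            best_score = scores[ch]
            selected.append(ch)
            remaining.remove(ch)
        return selected, best_score

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(120, 22, 6))          # synthetic motor-imagery features
        y = rng.integers(0, 2, size=120)
        X[y == 1, 3, :] += 1.0                     # make channel 3 informative
        print(greedy_channel_selection(X, y))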

  8. Sundials in the shade: A study of women's persistence in the first year of a computer science program in a selective university

    Science.gov (United States)

    Powell, Rita Manco

Currently women are underrepresented in departments of computer science, making up approximately 18% of the undergraduate enrollment in selective universities. Most attrition in computer science occurs early in this major, in the freshman and sophomore years, and women drop out in disproportionately greater numbers than their male counterparts. Taking an ethnographic approach to investigating women's experiences and progress in the first year courses in the computer science major at the University of Pennsylvania, this study examined the pre-college influences that led these women to the major and the nature of their experiences in and outside of class with faculty, peers, and academic support services. This study sought an understanding of the challenges these women faced in the first year of the major with the goal of informing institutional practice about how to best support their persistence. The research reviewed for this study included patterns of leaving majors in science, math and engineering (Seymour & Hewitt, 1997), the high school preparation needed to pursue math and engineering majors in college (Strenta, Elliott, Adair, Matier, & Scott, 1994), and intervention programs that have positively impacted persistence of women in computer science (Margolis & Fisher, 2002). The research method of this study employed a series of personal interviews over the course of one calendar year with fourteen first year women who had either declared or intended to declare the computer science major in the School of Engineering and Applied Science at the University of Pennsylvania. Other data sources were focus groups and personal interviews with faculty, administrators, admissions and student life professionals, teaching assistants, female graduate students, and male first year students at the University of Pennsylvania. This study found that the women in this study group came to the University of Pennsylvania with a thorough grounding in mathematics, but many either had

  9. Online selection of short-lived particles on many-core computer architectures in the CBM experiment at FAIR

    Energy Technology Data Exchange (ETDEWEB)

    Zyzak, Maksym

    2016-07-07

Modern experiments in heavy ion collisions operate with huge data rates that cannot be fully stored on the currently available storage devices. Therefore the data flow should be reduced by selecting those collisions that potentially carry the information of the physics interest. The future CBM experiment will have no simple criteria for selecting such collisions and requires the full online reconstruction of the collision topology including reconstruction of short-lived particles. In this work the KF Particle Finder package for online reconstruction and selection of short-lived particles is proposed and developed. It reconstructs more than 70 decays, covering signals from all the physics cases of the CBM experiment: strange particles, strange resonances, hypernuclei, low mass vector mesons, charmonium, and open-charm particles. The package is based on the Kalman filter method providing a full set of the particle parameters together with their errors including position, momentum, mass, energy, lifetime, etc. It shows a high quality of the reconstructed particles, high efficiencies, and high signal to background ratios. The KF Particle Finder is extremely fast for achieving the reconstruction speed of 1.5 ms per minimum-bias AuAu collision at 25 AGeV beam energy on a single CPU core. It is fully vectorized and parallelized and shows a strong linear scalability on the many-core architectures of up to 80 cores. It also scales within the First Level Event Selection package on the many-core clusters up to 3200 cores. The developed KF Particle Finder package is a universal platform for short-lived particle reconstruction, physics analysis and online selection.

  10. Online selection of short-lived particles on many-core computer architectures in the CBM experiment at FAIR

    International Nuclear Information System (INIS)

    Zyzak, Maksym

    2016-01-01

Modern experiments in heavy ion collisions operate with huge data rates that cannot be fully stored on the currently available storage devices. Therefore the data flow should be reduced by selecting those collisions that potentially carry the information of the physics interest. The future CBM experiment will have no simple criteria for selecting such collisions and requires the full online reconstruction of the collision topology including reconstruction of short-lived particles. In this work the KF Particle Finder package for online reconstruction and selection of short-lived particles is proposed and developed. It reconstructs more than 70 decays, covering signals from all the physics cases of the CBM experiment: strange particles, strange resonances, hypernuclei, low mass vector mesons, charmonium, and open-charm particles. The package is based on the Kalman filter method providing a full set of the particle parameters together with their errors including position, momentum, mass, energy, lifetime, etc. It shows a high quality of the reconstructed particles, high efficiencies, and high signal to background ratios. The KF Particle Finder is extremely fast for achieving the reconstruction speed of 1.5 ms per minimum-bias AuAu collision at 25 AGeV beam energy on a single CPU core. It is fully vectorized and parallelized and shows a strong linear scalability on the many-core architectures of up to 80 cores. It also scales within the First Level Event Selection package on the many-core clusters up to 3200 cores. The developed KF Particle Finder package is a universal platform for short-lived particle reconstruction, physics analysis and online selection.

  11. LASIP-III, a generalized processor for standard interface files

    International Nuclear Information System (INIS)

    Bosler, G.E.; O'Dell, R.D.; Resnik, W.M.

    1976-03-01

    The LASIP-III code was developed for processing Version III standard interface data files which have been specified by the Committee on Computer Code Coordination. This processor performs two distinct tasks, namely, transforming free-field format, BCD data into well-defined binary files and providing for printing and punching data in the binary files. While LASIP-III is exported as a complete free-standing code package, techniques are described for easily separating the processor into two modules, viz., one for creating the binary files and one for printing the files. The two modules can be separated into free-standing codes or they can be incorporated into other codes. Also, the LASIP-III code can be easily expanded for processing additional files, and procedures are described for such an expansion. 2 figures, 8 tables

  12. Efficient analysis and extraction of MS/MS result data from Mascot™ result files

    Directory of Open Access Journals (Sweden)

    Sickmann Albert

    2005-12-01

Abstract Background Mascot™ is a commonly used protein identification program for MS as well as for tandem MS data. When analyzing huge shotgun proteomics datasets with Mascot™'s native tools, limits of computing resources are easily reached. Up to now no application has been available as open source that is capable of converting the full content of Mascot™ result files from the original MIME format into a database-compatible tabular format, allowing direct import into database management systems and efficient handling of huge datasets analyzed by Mascot™. Results A program called mres2x is presented, which reads Mascot™ result files, analyzes them and extracts either selected or all information in order to store it in a single file or multiple files in formats which are easier to handle downstream of Mascot™. It generates different output formats. The output of mres2x in tab format is especially designed for direct high-performance import into relational database management systems using native tools of these systems. Having the data available in database management systems allows complex queries and extensive analysis. In addition, the original peak lists can be extracted in DTA format suitable for protein identification using the Sequest™ program, and the Mascot™ files can be split, preserving the original data format. During conversion, several consistency checks are performed. mres2x is designed to provide high throughput processing combined with the possibility to be driven by other computer programs. The source code including supplement material and precompiled binaries is available via http://www.protein-ms.de and http://sourceforge.net/projects/protms/. Conclusion The database upload allows regrouping of the MS/MS results using a database management system and complex analyzing queries using SQL without the need to run new Mascot™ searches when changing grouping parameters.
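
    The internal layout of Mascot™ result files is not reproduced here; as a hedged illustration of the container-level step such a converter performs, the sketch below splits a generic MIME multipart file into named sections and emits one tab-separated row per section:

    # Split a MIME multipart container into its parts and emit a TSV summary.
    import sys
    from email import policy
    from email.parser import BytesParser

    def mime_sections_to_tsv(path, out=sys.stdout):
        with open(path, "rb") as f:
            msg = BytesParser(policy=policy.default).parse(f)
        for part in msg.iter_parts():
            # Section name taken from the Content-Type "name" parameter, if present.
            name = part.get_param("name", header="content-type", failobj="unnamed")
            body = part.get_payload(decode=True) or b""
            out.write(f"{name}\t{len(body)}\n")

    if __name__ == "__main__":
        mime_sections_to_tsv(sys.argv[1])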

  13. CryptoCache: A Secure Sharable File Cache for Roaming Users

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2000-01-01

Conventional distributed file systems cache everything locally or not at all; there is no possibility to cache files on nearby nodes. In this paper we present the design of a secure cache system called CryptoCache that allows roaming users to cache files on untrusted file hosting servers. The system allows ... flexible sharing of cached files among unauthenticated users, i.e. unlike most distributed file systems CryptoCache does not require a global authentication framework. Files are encrypted when they are transferred over the network and while stored on untrusted servers. The system uses public key ... Small mobile computers are now sufficiently powerful to run many applications, but storage capacity remains limited, so working files cannot be cached or stored locally. Even if files can be stored locally, the mobile device is not powerful enough to act as a server in collaborations with other users...
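
    The CryptoCache protocol itself is only partially described above; as a minimal illustration of the underlying idea, the sketch below encrypts file contents on the client (using the third-party cryptography package) so that an untrusted cache host only ever stores ciphertext. Key distribution and sharing, which the paper handles with public-key techniques, are omitted.

    # Client-side encryption before handing a file to an untrusted cache server.
    from cryptography.fernet import Fernet

    def encrypt_for_cache(path, key):
        with open(path, "rb") as f:
            return Fernet(key).encrypt(f.read())   # store this blob on the cache host

    def decrypt_from_cache(blob, key):
        return Fernet(key).decrypt(blob)

    if __name__ == "__main__":
        key = Fernet.generate_key()                # key management not shown here
        ciphertext = encrypt_for_cache(__file__, key)
        assert decrypt_from_cache(ciphertext, key) == open(__file__, "rb").read()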

  14. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  15. Utilizing HDF4 File Content Maps for the Cloud

    Science.gov (United States)

    Lee, Hyokyung Joe

    2016-01-01

We demonstrate in a prototype study that HDF4 file content maps can be used to organize data efficiently in a cloud object storage system to facilitate cloud computing. This approach can be extended to any binary data format and to any existing big data analytics solution powered by cloud computing, because the HDF4 file content map project started as a long-term preservation effort for NASA data that does not require HDF4 APIs to access the data.

  16. BIBLIO: A Reprint File Management Algorithm

    Science.gov (United States)

    Zelnio, Robert N.; And Others

    1977-01-01

    The development of a simple computer algorithm designed for use by the individual educator or researcher in maintaining and searching reprint files is reported. Called BIBLIO, the system is inexpensive and easy to operate and maintain without sacrificing flexibility and utility. (LBH)

  17. ACTIV87 Fast neutron activation cross section file 1987

    International Nuclear Information System (INIS)

    Manokhin, V.N.; Pashchenko, A.B.; Plyaskin, V.I.; Bychkov, V.M.; Pronyaev, V.G.; Schwerer, O.

    1989-10-01

    This document summarizes the content of the Fast Neutron Activation Cross Section File based on data from different evaluated data libraries and individual evaluations in ENDF/B-5 format. The entire file or selective retrievals from it are available on magnetic tape, free of charge, from the IAEA Nuclear Data Section. (author)

  18. An information retrieval system for research file data

    Science.gov (United States)

    Joan E. Lengel; John W. Koning

    1978-01-01

    Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed....

  19. GEODOC: the GRID document file, record structure and data element description

    Energy Technology Data Exchange (ETDEWEB)

    Trippe, T.; White, V.; Henderson, F.; Phillips, S.

    1975-11-06

    The purpose of this report is to describe the information structure of the GEODOC file. GEODOC is a computer based file which contains the descriptive cataloging and indexing information for all documents processed by the National Geothermal Information Resource Group. This file (along with other GRID files) is managed by DBMS, the Berkeley Data Base Management System. Input for the system is prepared using the IRATE Text Editing System with its extended (12 bit) character set, or punched cards.

  20. Structural, Biochemical, and Computational Studies Reveal the Mechanism of Selective Aldehyde Dehydrogenase 1A1 Inhibition by Cytotoxic Duocarmycin Analogues.

    Science.gov (United States)

    Koch, Maximilian F; Harteis, Sabrina; Blank, Iris D; Pestel, Galina; Tietze, Lutz F; Ochsenfeld, Christian; Schneider, Sabine; Sieber, Stephan A

    2015-11-09

    Analogues of the natural product duocarmycin bearing an indole moiety were shown to bind aldehyde dehydrogenase 1A1 (ALDH1A1) in addition to DNA, while derivatives without the indole solely addressed the ALDH1A1 protein. The molecular mechanism of selective ALDH1A1 inhibition by duocarmycin analogues was unraveled through cocrystallization, mutational studies, and molecular dynamics simulations. The structure of the complex shows the compound embedded in a hydrophobic pocket, where it is stabilized by several crucial π-stacking and van der Waals interactions. This binding mode positions the cyclopropyl electrophile for nucleophilic attack by the noncatalytic residue Cys302, thereby resulting in covalent attachment, steric occlusion of the active site, and inhibition of catalysis. The selectivity of duocarmycin analogues for ALDH1A1 is unique, since only minor alterations in the sequence of closely related protein isoforms restrict compound accessibility. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. How simple is too simple? Computational perspective on importance of second-shell environment for metal-ion selectivity

    Czech Academy of Sciences Publication Activity Database

    Gutten, Ondrej; Rulíšek, Lubomír

    2015-01-01

Vol. 17, No. 22 (2015), pp. 14393-14404. ISSN 1463-9076. R&D Projects: GA ČR(CZ) GA14-31419S. Institutional support: RVO:61388963. Keywords: metal-ion selectivity * metallopeptide * stability constants * theoretical calculations. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 4.449, year: 2015. http://pubs.rsc.org/en/content/articlepdf/2015/cp/c4cp04876h

  2. Adding Data Management Services to Parallel File Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, Scott [Univ. of California, Santa Cruz, CA (United States)

    2015-03-04

The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to scientists to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit; the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file

  3. Reduced sintering of mass-selected Au clusters on SiO2 by alloying with Ti: an aberration-corrected STEM and computational study

    DEFF Research Database (Denmark)

    Niu, Yubiao; Schlexer, Philomena; Sebök, Béla

    2018-01-01

    Au nanoparticles represent the most remarkable example of a size effect in heterogeneous catalysis. However, a major issue hindering the use of Au nanoparticles in technological applications is their rapid sintering. We explore the potential of stabilizing Au nanoclusters on SiO2 by alloying them...... in the Au/Ti clusters, but in line with the model computational investigation, Au atoms were still present on the surface. Thus size-selected, deposited nanoalloy Au/Ti clusters appear to be promising candidates for sustainable gold-based nanocatalysis....

  4. Development of a computer code system for selecting off-site protective action in radiological accidents based on the multiobjective optimization method

    International Nuclear Information System (INIS)

    Ishigami, Tsutomu; Oyama, Kazuo

    1989-09-01

    This report presents a new method to support selection of off-site protective action in nuclear reactor accidents, and provides a user's manual of a computer code system, PRASMA, developed using the method. The PRASMA code system gives several candidates of protective action zones of evacuation, sheltering and no action based on the multiobjective optimization method, which requires objective functions and decision variables. We have assigned population risks of fatality, injury and cost as the objective functions, and distance from a nuclear power plant characterizing the above three protective action zones as the decision variables. (author)
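
    The multiobjective step can be illustrated with a small Pareto-dominance filter; the candidate zonings and their scores below are invented, and a real application would also need the risk and cost models that produce such scores:

    # Keep only the Pareto-optimal candidate zonings when each candidate is scored
    # on the three objectives named above (fatality risk, injury risk, cost).

    def dominates(a, b):
        """a dominates b if it is no worse in every objective and better in one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(candidates):
        """candidates: dict mapping zoning name -> (fatality, injury, cost)."""
        front = {}
        for name, score in candidates.items():
            if not any(dominates(other, score) for other in candidates.values()):
                front[name] = score
        return front

    if __name__ == "__main__":
        candidates = {
            "evacuate 5 km / shelter 10 km": (0.8, 2.0, 5.0),
            "evacuate 10 km":                (0.5, 1.5, 9.0),
            "shelter 10 km only":            (1.2, 2.5, 1.0),
            "evacuate 5 km only":            (0.9, 2.1, 5.5),   # dominated by the first
        }
        for name, score in pareto_front(candidates).items():
            print(name, score)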

  5. Implications of Nine Risk Prediction Models for Selecting Ever-Smokers for Computed Tomography Lung Cancer Screening.

    Science.gov (United States)

    Katki, Hormuzd A; Kovalchik, Stephanie A; Petito, Lucia C; Cheung, Li C; Jacobs, Eric; Jemal, Ahmedin; Berg, Christine D; Chaturvedi, Anil K

    2018-05-15

    Lung cancer screening guidelines recommend using individualized risk models to refer ever-smokers for screening. However, different models select different screening populations. The performance of each model in selecting ever-smokers for screening is unknown. To compare the U.S. screening populations selected by 9 lung cancer risk models (the Bach model; the Spitz model; the Liverpool Lung Project [LLP] model; the LLP Incidence Risk Model [LLPi]; the Hoggart model; the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial Model 2012 [PLCOM2012]; the Pittsburgh Predictor; the Lung Cancer Risk Assessment Tool [LCRAT]; and the Lung Cancer Death Risk Assessment Tool [LCDRAT]) and to examine their predictive performance in 2 cohorts. Population-based prospective studies. United States. Models selected U.S. screening populations by using data from the National Health Interview Survey from 2010 to 2012. Model performance was evaluated using data from 337 388 ever-smokers in the National Institutes of Health-AARP Diet and Health Study and 72 338 ever-smokers in the CPS-II (Cancer Prevention Study II) Nutrition Survey cohort. Model calibration (ratio of model-predicted to observed cases [expected-observed ratio]) and discrimination (area under the curve [AUC]). At a 5-year risk threshold of 2.0%, the models chose U.S. screening populations ranging from 7.6 million to 26 million ever-smokers. These disagreements occurred because, in both validation cohorts, 4 models (the Bach model, PLCOM2012, LCRAT, and LCDRAT) were well-calibrated (expected-observed ratio range, 0.92 to 1.12) and had higher AUCs (range, 0.75 to 0.79) than 5 models that generally overestimated risk (expected-observed ratio range, 0.83 to 3.69) and had lower AUCs (range, 0.62 to 0.75). The 4 best-performing models also had the highest sensitivity at a fixed specificity (and vice versa) and similar discrimination at a fixed risk threshold. These models showed better agreement on size of the

  6. File access prediction using neural networks.

    Science.gov (United States)

    Patra, Prashanta Kumar; Sahu, Muktikanta; Mohapatra, Subasish; Samantray, Ronak Kumar

    2010-06-01

One of the most vexing issues in the design of a high-speed computer is the wide gap in access times between memory and disk. To solve this problem, static file access predictors have been used. In this paper, we propose dynamic file access predictors that use neural networks to significantly improve the accuracy, success-per-reference, and effective-success-rate-per-reference of file access prediction with proper tuning. In particular, we verified that incorrect predictions were reduced from 53.11% to 43.63% for the proposed neural network prediction method with a standard configuration, compared with the recent popularity (RP) method. With manual tuning for each trace, we are able to improve upon the misprediction rate and effective-success-rate-per-reference using a standard configuration. Simulations on distributed file system (DFS) traces reveal that an exact-fit radial basis function (RBF) network gives better prediction in high-end systems, whereas a multilayer perceptron (MLP) trained with Levenberg-Marquardt (LM) backpropagation outperforms on systems having good computational capability. Probabilistic and competitive predictors are the most suitable for workstations having limited resources, and the former predictor is more efficient than the latter for servers handling the largest numbers of system calls. Finally, we conclude that the MLP with the LM backpropagation algorithm has a better file prediction success rate than simple perceptron, last successor, stable successor, and best-k-out-of-m predictors.
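
    A hedged sketch of the general idea (not the paper's predictor): train a small multilayer perceptron to predict the next file accessed from a sliding window of previous accesses, with files one-hot encoded; the window length, network size, and synthetic trace are illustrative choices.

    # Next-file-access prediction with a small MLP over a sliding window of accesses.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def make_dataset(trace, n_files, window=3):
        X, y = [], []
        for i in range(len(trace) - window):
            onehot = np.zeros((window, n_files))
            onehot[np.arange(window), trace[i:i + window]] = 1
            X.append(onehot.ravel())
            y.append(trace[i + window])
        return np.array(X), np.array(y)

    if __name__ == "__main__":
        n_files = 5
        trace = [0, 1, 2, 3, 4] * 200              # synthetic cyclic access pattern
        X, y = make_dataset(trace, n_files)
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
        clf.fit(X[:800], y[:800])
        print("held-out prediction accuracy:", clf.score(X[800:], y[800:]))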

  7. Computational and empirical simulations of selective memory impairments: Converging evidence for a single-system account of memory dissociations.

    Science.gov (United States)

    Curtis, Evan T; Jamieson, Randall K

    2018-04-01

    Current theory has divided memory into multiple systems, resulting in a fractionated account of human behaviour. By an alternative perspective, memory is a single system. However, debate over the details of different single-system theories has overshadowed the converging agreement among them, slowing the reunification of memory. Evidence in favour of dividing memory often takes the form of dissociations observed in amnesia, where amnesic patients are impaired on some memory tasks but not others. The dissociations are taken as evidence for separate explicit and implicit memory systems. We argue against this perspective. We simulate two key dissociations between classification and recognition in a computational model of memory, A Theory of Nonanalytic Association. We assume that amnesia reflects a quantitative difference in the quality of encoding. We also present empirical evidence that replicates the dissociations in healthy participants, simulating amnesic behaviour by reducing study time. In both analyses, we successfully reproduce the dissociations. We integrate our computational and empirical successes with the success of alternative models and manipulations and argue that our demonstrations, taken in concert with similar demonstrations with similar models, provide converging evidence for a more general set of single-system analyses that support the conclusion that a wide variety of memory phenomena can be explained by a unified and coherent set of principles.

  8. Selection of window sizes for optimizing occupational comfort and hygiene based on computational fluid dynamics and neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Stavrakakis, G.M.; Karadimou, D.P.; Zervas, P.L.; Markatos, N.C. [Computational Fluid Dynamics Unit, School of Chemical Engineering, National Technical University of Athens, Heroon Polytechniou 9, GR-15780 Athens (Greece); Sarimveis, H. [Unit of Process Control and Informatics, School of Chemical Engineering, National Technical University of Athens, Heroon Polytechniou 9, GR-15780 Athens (Greece)

    2011-02-15

This paper presents a novel computational method to optimize window sizes for thermal comfort and indoor air quality in naturally ventilated buildings. The methodology is demonstrated by means of a prototype case, which corresponds to a single-sided naturally ventilated apartment. Initially, the airflow in and around the building is simulated using a Computational Fluid Dynamics model. Local prevailing weather conditions are imposed in the CFD model as inlet boundary conditions. The produced airflow patterns are utilized to predict thermal comfort indices, i.e. the PMV and its modifications for non-air-conditioned buildings, as well as indoor air quality indices, such as ventilation effectiveness based on carbon dioxide and volatile organic compounds removal. Mean values of these indices (output/objective variables) within the occupied zone are calculated for different window sizes (input/design variables), to generate a database of input-output data pairs. The database is then used to train and validate Radial Basis Function Artificial Neural Network input-output "meta-models". The produced meta-models are used to formulate an optimization problem, which takes into account special constraints recommended by design guidelines. It is concluded that the proposed methodology determines appropriate window architectural designs for pleasant and healthy indoor environments. (author)

  9. Source Reference File

    Data.gov (United States)

    Social Security Administration — This file contains a national set of names and contact information for doctors, hospitals, clinics, and other facilities (known collectively as sources) from which...

  10. Patient Assessment File (PAF)

    Data.gov (United States)

    Department of Veterans Affairs — The Patient Assessment File (PAF) database compiles the results of the Patient Assessment Instrument (PAI) questionnaire filled out for intermediate care Veterans...

  11. RRB Earnings File (RRBERN)

    Data.gov (United States)

    Social Security Administration — RRBERN contains records for all beneficiaries on the RRB's PSSVES file who's SSNs are validated through the SVES processing. Validated output is processed through...

  12. Java facilities in processing XML files - JAXB and generating PDF reports

    Directory of Open Access Journals (Sweden)

    Danut-Octavian SIMION

    2008-01-01

    Full Text Available The paper presents the Java programming language facilities for working with XML files using JAXB (the Java Architecture for XML Binding) technology and for generating PDF reports from XML files using Java objects. The XML file can be an existing one and could contain the data about an entity (Clients, for example), or it might be the result of a SELECT-SQL statement. JAXB generates Java classes through xs rules and a marshalling/unmarshalling compiler. The PDF file is built from an XML file and uses an XSL-FO formatting file and a Java ResultSet object.

  13. Radiology Teaching Files on the Internet

    International Nuclear Information System (INIS)

    Lim, Eun Chung; Kim, Eun Kyung

    1996-01-01

    There is increasing attention to radiology teaching files on the Internet in the field of diagnostic radiology. The purpose of this study was to aid in the creation of new radiology teaching files by analysing the present radiology teaching file sites on the Internet in many aspects and evaluating the images on those sites, using a Macintosh IIci computer, a 28.8 kbps TelePort Fax/Modem, and Netscape Navigator 2.0 software. The results were as follows: 1. Analysis of radiology teaching file sites (1) Country distribution was the highest in the USA (57.5%). (2) The average number of cases was 186, and radiology teaching file sites with a search engine numbered 9 (22.5%). (3) Regarding the method of case arrangement, anatomic area type and diagnosis type were found at 10 sites (25%) each, and question and answer type was found at 9 sites (22.5%). (4) Radiology teaching file sites with oro-maxillofacial disorders numbered 9 (22.5%). (5) Regarding image format, GIF format was found at 14 sites (35%), and JPEG format at 14 sites (35%). (6) The year of creation was most commonly 1995 (43.7%). (7) Continuing case upload was found at 35 sites (87.5%). 2. Evaluation of images on the radiology teaching files (1) The average file size of GIF format (71 Kbyte) was greater than that of JPEG format (24 Kbyte). (P<0.001) (2) Image quality of GIF format was better than that of JPEG format. (P<0.001)

  14. 78 FR 64294 - Loan Guaranty: Mandatory Electronic Delivery of Loan Files for Review

    Science.gov (United States)

    2013-10-28

    ... DEPARTMENT OF VETERANS AFFAIRS Loan Guaranty: Mandatory Electronic Delivery of Loan Files for... Affairs (VA) Loan Guaranty Service (LGY) announces a new policy with regard to lender submission of VA- guaranteed closed loan files for review. Currently, lenders can submit loan files selected for review by LGY...

  15. 76 FR 62092 - Filing Procedures

    Science.gov (United States)

    2011-10-06

    ... INTERNATIONAL TRADE COMMISSION Filing Procedures AGENCY: International Trade Commission. ACTION: Notice of issuance of Handbook on Filing Procedures. SUMMARY: The United States International Trade Commission (``Commission'') is issuing a Handbook on Filing Procedures to replace its Handbook on Electronic...

  16. Virus Alert: Ten Steps to Safe Computing.

    Science.gov (United States)

    Gunter, Glenda A.

    1997-01-01

    Discusses computer viruses and explains how to detect them; discusses virus protection and the need to update antivirus software; and offers 10 safe computing tips, including scanning floppy disks and commercial software, how to safely download files from the Internet, avoiding pirated software copies, and backing up files. (LRW)

  17. Nuclear plant fire incident data file

    International Nuclear Information System (INIS)

    Sideris, A.G.; Hockenbury, R.W.; Yeater, M.L.; Vesely, W.E.

    1979-01-01

    A computerized nuclear plant fire incident data file was developed by American Nuclear Insurers and was further analyzed by Rensselaer Polytechnic Institute with technical and monetary support provided by the Nuclear Regulatory Commission. Data on 214 fires that occurred at nuclear facilities have been entered in the file. A computer program has been developed to sort the fire incidents according to various parameters. The parametric sorts that are presented in this article are significant since they are the most comprehensive statistics presently available on fires that have occurred at nuclear facilities
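
    A parametric sort of this kind amounts to grouping the incident records by a chosen field and counting them. The short Python sketch below is purely schematic; the field names and values are hypothetical and are not taken from the actual incident file.

        # Schematic parametric sort of fire incident records (hypothetical fields and values).
        from collections import Counter

        incidents = [
            {"plant_area": "turbine building", "cause": "electrical", "year": 1975},
            {"plant_area": "cable spreading room", "cause": "electrical", "year": 1976},
            {"plant_area": "turbine building", "cause": "oil fire", "year": 1977},
        ]

        def parametric_sort(records, parameter):
            """Count incidents grouped by the chosen parameter."""
            return Counter(record[parameter] for record in records)

        print(parametric_sort(incidents, "plant_area"))
        print(parametric_sort(incidents, "cause"))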

  18. FHEO Filed Cases

    Data.gov (United States)

    Department of Housing and Urban Development — The dataset is a list of all the Title VIII fair housing cases filed by FHEO from 1/1/2007 - 12/31/2012 including the case number, case name, filing date, state and...

  19. TIGER/Line Shapefile, 2011, Series Information File for the 2010 Census Traffic Analysis Zone (TAZ) State-based Shapefile

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The TIGER/Line Files are shapefiles and related database files (.dbf) that are an extract of selected geographic and cartographic information from the U.S. Census...

  20. Attenuation-based kV pair selection in dual source dual energy computed tomography angiography of the chest: impact on radiation dose and image quality

    Energy Technology Data Exchange (ETDEWEB)

    Renapurkar, Rahul D.; Azok, Joseph; Lempel, Jason; Karim, Wadih; Graham, Ruffin [Thoracic Imaging, L10, Imaging Institute, Cleveland Clinic, Cleveland, OH (United States); Primak, Andrew [Siemens Medical Solutions, Malvern, PA (United States); Tandon, Yasmeen [Case Western Reserve University-Metro Health Medical Center, Department of Radiology, Cleveland, OH (United States); Bullen, Jennifer [Quantitative Health Sciences, Cleveland Clinic, Cleveland, OH (United States); Dong, Frank [Section of Medical Physics, Cleveland Clinic, Cleveland, OH (United States)

    2017-08-15

    The purpose of this study was to evaluate the impact of attenuation-based kilovoltage (kV) pair selection in dual source dual energy (DSDE)-pulmonary embolism (PE) protocol examinations on radiation dose savings and image quality. A prospective study was carried out on 118 patients with suspected PE. In patients in whom attenuation-based kV pair selection selected the 80/140Sn kV pair, the pre-scan 100/140Sn CTDIvol (computed tomography dose index volume) values were compared with the pre-scan 80/140Sn CTDIvol values. Subjective and objective image quality parameters were assessed. Attenuation-based kV pair selection switched to the 80/140Sn kV pair (''switched'' cohort) in 63 out of 118 patients (53%). The mean 100/140Sn pre-scan CTDIvol was 8.8 mGy, while the mean 80/140Sn pre-scan CTDIvol was 7.5 mGy. The average estimated dose reduction for the ''switched'' cohort was 1.3 mGy (95% CI 1.2, 1.4; p < 0.001), representing a 15% reduction in dose. After adjusting for patient weight, mean attenuation was significantly higher in the ''switched'' vs. ''non-switched'' cohorts in all five pulmonary arteries and in all lobes on iodine maps. This study demonstrates that attenuation-based kV pair selection in DSDE examination is feasible and can offer radiation dose reduction without compromising image quality. (orig.)
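
    As a quick consistency check using only the figures quoted above, the relative dose reduction for the ''switched'' cohort follows directly from the two mean CTDIvol values:

        \frac{8.8\,\text{mGy} - 7.5\,\text{mGy}}{8.8\,\text{mGy}} \approx 0.15\ (15\%)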

  1. Efficacy of D-RaCe and ProTaper Universal Retreatment NiTi instruments and hand files in removing gutta-percha from curved root canals - a micro-computed tomography study.

    Science.gov (United States)

    Rödig, T; Hausdörfer, T; Konietschke, F; Dullin, C; Hahn, W; Hülsmann, M

    2012-06-01

    To compare the efficacy of two rotary NiTi retreatment systems and Hedström files in removing filling material from curved root canals. Curved root canals of 57 extracted teeth were prepared using FlexMaster instruments and filled with gutta-percha and AH Plus. After determination of root canal curvatures and radii in two directions, the teeth were assigned to three identical groups (n = 19). The root fillings were removed with D-RaCe instruments, ProTaper Universal Retreatment instruments or Hedström files. Pre- and postoperative micro-CT imaging was used to assess the percentage of residual filling material as well as the amount of dentine removal. Working time and procedural errors were recorded. Data were analysed using analysis of covariance and analysis of variance procedures. D-RaCe instruments were significantly more effective than ProTaper Universal Retreatment instruments and Hedström files. In the ProTaper group, four instrument fractures and one lateral perforation were observed. Five instrument fractures were recorded for D-RaCe. D-RaCe instruments were associated with significantly less residual filling material than ProTaper Universal Retreatment instruments and hand files. Hedström files removed significantly less dentine than both rotary NiTi systems. Retreatment with rotary NiTi systems resulted in a high incidence of procedural errors. © 2012 International Endodontic Journal.

  2. Is triple contrast computed tomographic scanning useful in the selective management of stab wounds to the back?

    Science.gov (United States)

    McAllister, E; Perez, M; Albrink, M H; Olsen, S M; Rosemurgy, A S

    1994-09-01

    We devised a protocol to prospectively manage stab wounds to the back with the hypothesis that the triple contrast computed tomographic (CT) scan is an effective means of detecting occult injury in these patients. All wounds to the back in hemodynamically stable adults were locally explored. All patients with muscular fascial penetration underwent triple contrast CT scanning utilizing oral, rectal, and IV contrast. Patients did not undergo surgical exploration if their CT scan was interpreted as negative or if the CT scan demonstrated injuries not requiring surgical intervention. Fifty-three patients were entered into the protocol. The time to complete the triple contrast CT scan ranged from 3 to 6 hours at a cost of $1050 for each scan. In 51 patients (96%), the CT scan either had negative findings (n = 31) or showed injuries not requiring exploration (n = 20). These patients did well with nonsurgical management. Two CT scans documented significant injury and led to surgical exploration and therapeutic celiotomies. Although triple contrast CT scanning was able to detect occult injury in patients with stab wounds to the back it did so at considerable cost and the results rarely altered clinical care. Therefore, its routine use in these patients is not recommended.

  3. Assessment of the structural shielding integrity of some selected computed tomography facilities in the Greater Accra Region of Ghana

    International Nuclear Information System (INIS)

    Nkansah, A.

    2010-01-01

    The structural shielding integrity was assessed for four of the CT facilities at Trust Hospital, Korle-Bu Teaching Hospital, the 37 Military Hospital and Medical Imaging Ghana Ltd. in the Greater Accra Region of Ghana. From the shielding calculations, the concrete wall thicknesses computed are 120, 145, 140 and 155 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital, respectively, using Default DLP values. The wall thicknesses using Derived DLP values are 110, 110, 120 and 168 mm for Medical Imaging Ghana Ltd., 37 Military Hospital, Trust Hospital and Korle-Bu Teaching Hospital, respectively. These values are within the accepted standard concrete thickness of 102-152 mm prescribed by the National Council of Radiological Protection and Measurement. The ultrasonic pulse testing indicated that all the sandcrete walls are of good quality and free of voids, since the estimated pulse velocities were approximately equal to 3.45 km/s. The average dose rate measured for supervised areas is 3.4 μSv/wk and for controlled areas is 18.0 μSv/wk. These dose rates were below the acceptable levels of 100 μSv per week for the occupationally exposed and 20 μSv per week for members of the public provided by the ICRU. The results mean that the structural shielding thicknesses are adequate to protect members of the public and occupationally exposed workers (au).

  4. A Well-Mixed Computational Model for Estimating Room Air Levels of Selected Constituents from E-Vapor Product Use

    Directory of Open Access Journals (Sweden)

    Ali A. Rostami

    2016-08-01

    Full Text Available Concerns have been raised in the literature about the potential of secondhand exposure from e-vapor product (EVP) use. It would be difficult to experimentally determine the impact of various factors on secondhand exposure including, but not limited to, room characteristics (indoor space size, ventilation rate), device specifications (aerosol mass delivery, e-liquid composition), and use behavior (number of users and usage frequency). Therefore, a well-mixed computational model was developed to estimate the indoor levels of constituents from EVPs under a variety of conditions. The model is based on physical and thermodynamic interactions between aerosol, vapor, and air, similar to indoor air models referred to by the Environmental Protection Agency. The model results agree well with measured indoor air levels of nicotine from two sources: smoking machine-generated aerosol and aerosol exhaled from EVP use. Sensitivity analysis indicated that increasing the air exchange rate reduces the room air level of constituents, as more material is carried away. The effect of the amount of aerosol released into the space due to variability in exhalation was also evaluated. The model can estimate the room air level of constituents as a function of time, which may be used to assess the level of non-user exposure over time.
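
    The core of such a well-mixed model is a single mass-balance equation for the room: the constituent concentration rises with the emission rate and is diluted by ventilation. The following sketch solves that balance for one constituent with entirely hypothetical parameter values; it is meant only to illustrate the structure of the model described above, not to reproduce the authors' implementation.

        # Minimal well-mixed room mass balance: dC/dt = S/V - (Q/V) * C (hypothetical values).
        import numpy as np
        from scipy.integrate import solve_ivp

        V = 40.0            # room volume, m^3 (assumed)
        ach = 1.0           # air changes per hour (assumed)
        lam = ach / 3600.0  # air exchange rate Q/V, 1/s
        S = 2.0e-6          # constituent emission rate into the room, mg/s (assumed)

        def dCdt(t, C):
            return S / V - lam * C

        sol = solve_ivp(dCdt, (0.0, 3600.0), [0.0])
        print("room concentration after 1 h (mg/m^3):", sol.y[0, -1])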

  5. Series Information File for the 2016 Cartographic Boundary File, Current Secondary School District, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary KMLs are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File / Topologically...

  6. Series Information File for the 2016 Cartographic Boundary File, State-Consolidated City, 1:500,000

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The 2016 cartographic boundary shapefiles are simplified representations of selected geographic areas from the U.S. Census Bureau's Master Address File /...

  7. Computer-aided design and synthesis of magnetic molecularly imprinted polymers with high selectivity for the removal of phenol from water.

    Science.gov (United States)

    Yang, Wenming; Liu, Lukuan; Ni, Xiaoni; Zhou, Wei; Huang, Weihong; Liu, Hong; Xu, Wanzhen

    2016-02-01

    A molecular simulation method was introduced to compute the phenol-monomer pre-assembled system of a molecularly imprinted polymer. The interaction type and intensity between phenol and monomer were evaluated by combining binding energy and charge transfer with complex conformation. The simulation results indicate that interaction energies are simultaneously affected by the type of monomer and the ratio between phenol and monomers. At the same time, we found that increasing the amount of functional monomer is not always better for preparing molecularly imprinted polymers. In this study, three kinds of novel magnetic phenol-imprinted polymers with favorable specific adsorption effects were prepared by the surface imprinting technique combined with atom transfer radical polymerization. Various measures were selected to characterize the structure and morphology to obtain the optimal polymer. The characterization results show that the optimal polymer has suitable features for the further adsorption process. A series of static adsorption experiments were conducted to analyze its adsorption performance, which follows the Elovich model from the kinetic analysis and the Sips equation from the isothermal analysis. To further verify the reliability and accuracy of the simulation results, the effects of different monomers on the adsorption selectivity were also determined. They display higher selectivity towards phenol than 4-nitrophenol. The results from the simulation of the pre-assembled complexes are in reasonable agreement with those from the experiment. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
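
    For reference, common textbook forms of the two models named above are given below; the parameters are generic (the Sips exponent is sometimes written as 1/n) and are not the fitted values of this study:

        q_e = \frac{q_m (K_s C_e)^{n}}{1 + (K_s C_e)^{n}} \quad\text{(Sips / Langmuir--Freundlich isotherm)}
        \qquad
        q_t = \frac{1}{\beta}\ln(1 + \alpha\beta t) \quad\text{(integrated Elovich kinetic model)}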

  8. Attenuation-based automatic kilovolt (kV)-selection in computed tomography of the chest: effects on radiation exposure and image quality.

    Science.gov (United States)

    Eller, Achim; Wuest, Wolfgang; Scharf, Michael; Brand, Michael; Achenbach, Stephan; Uder, Michael; Lell, Michael M

    2013-12-01

    To evaluate an automated attenuation-based kV-selection in computed tomography of the chest with respect to radiation dose and image quality, compared to a standard 120 kV protocol. 104 patients were examined using a 128-slice scanner. Fifty examinations (58 ± 15 years, study group) were performed using the automated adaptation of tube potential (100-140 kV), based on the attenuation profile of the scout scan, and 54 examinations (62 ± 14 years, control group) with fixed 120 kV. Estimated CT dose index (CTDI) of the software-proposed setting was compared with a 120 kV protocol. After the scan, CTDI volume (CTDIvol) and dose length product (DLP) were recorded. Image quality was assessed by region of interest (ROI) measurements, subjective image quality by two observers with a 4-point scale (3--excellent, 0--not diagnostic). The algorithm selected 100 kV in 78% and 120 kV in 22%. Overall CTDIvol reduction was 26.6% (34% in 100 kV), overall DLP reduction was 22.8% (32.1% in 100 kV) (all p < 0.001). Subjective image quality was excellent in both groups. The attenuation-based kV-selection algorithm enables relevant dose reduction (~27%) in chest CT while keeping image quality parameters at high levels. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  9. Attenuation-based automatic kilovolt (kV)-selection in computed tomography of the chest: Effects on radiation exposure and image quality

    Energy Technology Data Exchange (ETDEWEB)

    Eller, Achim; Wuest, Wolfgang; Scharf, Michael; Brand, Michael [Department of Radiology, University Erlangen (Germany); Achenbach, Stephan [Department of Cardiology, University Erlangen (Germany); Uder, Michael [Department of Radiology, University Erlangen (Germany); Imaging Science Institute, Erlangen (Germany); Lell, Michael M., E-mail: Michael.lell@uk-erlangen.de [Department of Radiology, University Erlangen (Germany); Imaging Science Institute, Erlangen (Germany)

    2013-12-01

    Objectives: To evaluate an automated attenuation-based kV-selection in computed tomography of the chest in respect to radiation dose and image quality, compared to a standard 120 kV protocol. Materials and methods: 104 patients were examined using a 128-slice scanner. Fifty examinations (58 ± 15 years, study group) were performed using the automated adaption of tube potential (100–140 kV), based on the attenuation profile of the scout scan, 54 examinations (62 ± 14 years, control group) with fixed 120 kV. Estimated CT dose index (CTDI) of the software-proposed setting was compared with a 120 kV protocol. After the scan CTDI volume (CTDIvol) and dose length product (DLP) were recorded. Image quality was assessed by region of interest (ROI) measurements, subjective image quality by two observers with a 4-point scale (3 – excellent, 0 – not diagnostic). Results: The algorithm selected 100 kV in 78% and 120 kV in 22%. Overall CTDIvol reduction was 26.6% (34% in 100 kV) overall DLP reduction was 22.8% (32.1% in 100 kV) (all p < 0.001). Subjective image quality was excellent in both groups. Conclusion: The attenuation based kV-selection algorithm enables relevant dose reduction (∼27%) in chest-CT while keeping image quality parameters at high levels.

  10. The Emotional Gatekeeper: A Computational Model of Attentional Selection and Suppression through the Pathway from the Amygdala to the Inhibitory Thalamic Reticular Nucleus

    Science.gov (United States)

    Bullock, Daniel; Barbas, Helen

    2016-01-01

    In a complex environment that contains both opportunities and threats, it is important for an organism to flexibly direct attention based on current events and prior plans. The amygdala, the hub of the brain's emotional system, is involved in forming and signaling affective associations between stimuli and their consequences. The inhibitory thalamic reticular nucleus (TRN) is a hub of the attentional system that gates thalamo-cortical signaling. In the primate brain, a recently discovered pathway from the amygdala sends robust projections to TRN. Here we used computational modeling to demonstrate how the amygdala-TRN pathway, embedded in a wider neural circuit, can mediate selective attention guided by emotions. Our Emotional Gatekeeper model demonstrates how this circuit enables focused top-down, and flexible bottom-up, allocation of attention. The model suggests that the amygdala-TRN projection can serve as a unique mechanism for emotion-guided selection of signals sent to cortex for further processing. This inhibitory selection mechanism can mediate a powerful affective ‘framing’ effect that may lead to biased decision-making in highly charged emotional situations. The model also supports the idea that the amygdala can serve as a relevance detection system. Further, the model demonstrates how abnormal top-down drive and dysregulated local inhibition in the amygdala and in the cortex can contribute to the attentional symptoms that accompany several neuropsychiatric disorders. PMID:26828203

  11. The Emotional Gatekeeper: A Computational Model of Attentional Selection and Suppression through the Pathway from the Amygdala to the Inhibitory Thalamic Reticular Nucleus.

    Directory of Open Access Journals (Sweden)

    Yohan J John

    2016-02-01

    Full Text Available In a complex environment that contains both opportunities and threats, it is important for an organism to flexibly direct attention based on current events and prior plans. The amygdala, the hub of the brain's emotional system, is involved in forming and signaling affective associations between stimuli and their consequences. The inhibitory thalamic reticular nucleus (TRN) is a hub of the attentional system that gates thalamo-cortical signaling. In the primate brain, a recently discovered pathway from the amygdala sends robust projections to TRN. Here we used computational modeling to demonstrate how the amygdala-TRN pathway, embedded in a wider neural circuit, can mediate selective attention guided by emotions. Our Emotional Gatekeeper model demonstrates how this circuit enables focused top-down, and flexible bottom-up, allocation of attention. The model suggests that the amygdala-TRN projection can serve as a unique mechanism for emotion-guided selection of signals sent to cortex for further processing. This inhibitory selection mechanism can mediate a powerful affective 'framing' effect that may lead to biased decision-making in highly charged emotional situations. The model also supports the idea that the amygdala can serve as a relevance detection system. Further, the model demonstrates how abnormal top-down drive and dysregulated local inhibition in the amygdala and in the cortex can contribute to the attentional symptoms that accompany several neuropsychiatric disorders.

  12. Attenuation-based automatic kilovolt (kV)-selection in computed tomography of the chest: Effects on radiation exposure and image quality

    International Nuclear Information System (INIS)

    Eller, Achim; Wuest, Wolfgang; Scharf, Michael; Brand, Michael; Achenbach, Stephan; Uder, Michael; Lell, Michael M.

    2013-01-01

    Objectives: To evaluate an automated attenuation-based kV-selection in computed tomography of the chest in respect to radiation dose and image quality, compared to a standard 120 kV protocol. Materials and methods: 104 patients were examined using a 128-slice scanner. Fifty examinations (58 ± 15 years, study group) were performed using the automated adaption of tube potential (100–140 kV), based on the attenuation profile of the scout scan, 54 examinations (62 ± 14 years, control group) with fixed 120 kV. Estimated CT dose index (CTDI) of the software-proposed setting was compared with a 120 kV protocol. After the scan CTDI volume (CTDIvol) and dose length product (DLP) were recorded. Image quality was assessed by region of interest (ROI) measurements, subjective image quality by two observers with a 4-point scale (3 – excellent, 0 – not diagnostic). Results: The algorithm selected 100 kV in 78% and 120 kV in 22%. Overall CTDIvol reduction was 26.6% (34% in 100 kV) overall DLP reduction was 22.8% (32.1% in 100 kV) (all p < 0.001). Subjective image quality was excellent in both groups. Conclusion: The attenuation based kV-selection algorithm enables relevant dose reduction (∼27%) in chest-CT while keeping image quality parameters at high levels

  13. Accuracy of multidetector row computed tomography for the diagnosis of acute bowel ischemia in a non-selected study population

    International Nuclear Information System (INIS)

    Wiesner, Walter; Hauser, Andreas; Steinbrich, Wolfgang

    2004-01-01

    The diagnostic accuracy of multidetector row computed tomography (MDCT) for the prospective diagnosis of acute bowel ischemia in the daily clinical routine was analyzed. Two hundred ninety-one consecutive patients with an acute or subacute abdomen, examined by MDCT over a time period of 5 months, were included in the study. All original CT diagnoses made during the daily routine by radiological generalists were compared to the final diagnoses made by using all available medical information from endoscopies, surgical interventions, autopsies and follow-up. Finally, all CT examinations of patients with an initial CT diagnosis or a final diagnosis of bowel ischemia were reread by a radiologist specialized in abdominal imaging in order to analyze the CT findings and the reasons for initially false negative or false positive CT readings. Twenty-four patients out of 291 (8.2%) had acute bowel ischemia. The age of affected patients ranged from 50 to 94 years (mean age: 75.7 years). Eleven patients were male, and 13 female. Reasons for acute bowel ischemia were: arterio-occlusive (n=11), non-occlusive (n=5), strangulation (n=2), over-distension (n=3) and radiation (n=3). The prospective sensitivity, specificity, PPV and NPV of MDCT for the diagnosis of acute bowel ischemia in the daily routine were 79.17, 98.51, 90.48 and 98.15%. MDCT reaches a similarly high sensitivity in diagnosing acute bowel ischemia as angiography. Furthermore, it has the advantage of being helpful in most of its clinical differential diagnoses and of being less invasive, with the consequent possibility of being used earlier in the diagnostic process and all the resulting positive effects on the patient's prognosis. Therefore, nowadays MDCT should probably be used as the first-step imaging modality of choice in patients with suspected acute bowel ischemia. (orig.)
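
    For readers less familiar with these screening statistics, the standard definitions behind the reported figures are:

        \text{sensitivity} = \frac{TP}{TP + FN},\qquad
        \text{specificity} = \frac{TN}{TN + FP},\qquad
        \text{PPV} = \frac{TP}{TP + FP},\qquad
        \text{NPV} = \frac{TN}{TN + FN}

    where TP, FP, TN and FN are the numbers of true positive, false positive, true negative and false negative CT readings, respectively.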

  14. A novel computational approach for development of highly selective fenitrothion imprinted polymer: theoretical predictions and experimental validations

    Energy Technology Data Exchange (ETDEWEB)

    Barros, Leonardo Augusto de; Pereira, Leandro Alves; Rath, Susanne [Universidade de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Dept. de Quimica Analitica; Custodio, Rogerio [Universidade de Campinas (UNICAMP), SP (Brazil). Inst. de Quimica. Dept. de Fisico-Quimica

    2014-04-15

    The quality of molecularly imprinted recognition sites depends on the mechanisms and the extent of the functional monomer-template interactions present in the prepolymerization mixture. Thus, an understanding of the physical parameters governing these interactions is key to producing a highly selective molecularly imprinted polymer (MIP). In this paper, novel molecular modeling studies were performed to optimize the conditions for the molecular imprinting of fenitrothion. Four possible functional monomers were evaluated. Five porogenic solvents were investigated employing the polarizable continuum method. The MIP based on methacrylic acid (MAA-MIP) synthesized in the presence of toluene was shown to form the most thermodynamically stable complex. In contrast, the MIP based on p-vinylbenzoic acid (PVB-MIP) had the lowest binding energy. According to the adsorption parameters fitted by the Langmuir-Freundlich isotherm, MAA-MIP presented twice the number of binding sites compared to PVB-MIP (103.35 and 53.77 μmol g{sup -1}, respectively) (author)

  15. Excited States and Photodebromination of Selected Polybrominated Diphenyl Ethers: Computational and Quantitative Structure—Property Relationship Studies

    Directory of Open Access Journals (Sweden)

    Jin Luo

    2015-01-01

    Full Text Available This paper presents a density functional theory (DFT)/time-dependent DFT (TD-DFT) study on the lowest lying singlet and triplet excited states of 20 selected polybrominated diphenyl ether (PBDE) congeners, with the solvation effect included in the calculations using the polarized continuum model (PCM). The results obtained showed that for most of the brominated diphenyl ether (BDE) congeners, the lowest singlet excited state was initiated by the electron transfer from HOMO to LUMO, involving a π–σ* excitation. In triplet excited states, the structure of the BDE congeners differed notably from that of the BDE ground states, with one of the specific C–Br bonds bending off the aromatic plane. In addition, the partial least squares regression (PLSR), principal component analysis-multiple linear regression analysis (PCA-MLR), and back propagation artificial neural network (BP-ANN) approaches were employed for a quantitative structure-property relationship (QSPR) study. Based on the previously reported kinetic data for the debromination by ultraviolet (UV) and sunlight, the obtained QSPR models exhibited a reasonable evaluation of the photodebromination reactivity even when the BDE congeners had the same degree of bromination, albeit different patterns of bromination.
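
    The QSPR step can be illustrated with a compact partial least squares fit. The sketch below uses randomly generated descriptor and reactivity values purely as placeholders (they are not the data of this study) and reports a leave-one-out Q² in the spirit of the model validation described above.

        # Sketch of a QSPR-style PLS regression with leave-one-out validation (synthetic data).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(0)
        X = rng.normal(size=(20, 6))   # 20 congeners x 6 hypothetical quantum-chemical descriptors
        y = X @ np.array([0.8, -0.3, 0.1, 0.0, 0.5, -0.2]) + rng.normal(scale=0.1, size=20)

        pls = PLSRegression(n_components=3)
        y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut())
        press = float(np.sum((y - y_cv.ravel()) ** 2))
        q2 = 1.0 - press / float(np.sum((y - y.mean()) ** 2))
        print("leave-one-out Q^2:", round(q2, 3))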

  16. Systems-level computational modeling demonstrates fuel selection switching in high capacity running and low capacity running rats

    Science.gov (United States)

    Qi, Nathan R.

    2018-01-01

    High capacity and low capacity running rats, HCR and LCR respectively, have been bred to represent two extremes of running endurance and have recently demonstrated disparities in fuel usage during transient aerobic exercise. HCR rats can maintain fatty acid (FA) utilization throughout the course of transient aerobic exercise whereas LCR rats rely predominantly on glucose utilization. We hypothesized that the difference between HCR and LCR fuel utilization could be explained by a difference in mitochondrial density. To test this hypothesis and to investigate mechanisms of fuel selection, we used a constraint-based kinetic analysis of whole-body metabolism to analyze transient exercise data from these rats. Our model analysis used a thermodynamically constrained kinetic framework that accounts for glycolysis, the TCA cycle, and mitochondrial FA transport and oxidation. The model can effectively match the observed relative rates of oxidation of glucose versus FA, as a function of ATP demand. In searching for the minimal differences required to explain metabolic function in HCR versus LCR rats, it was determined that the whole-body metabolic phenotype of LCR, compared to the HCR, could be explained by a ~50% reduction in total mitochondrial activity with an additional 5-fold reduction in mitochondrial FA transport activity. Finally, we postulate that, over sustained periods of exercise, LCR rats can partly overcome the initial deficit in FA catabolic activity by upregulating FA transport and/or oxidation processes. PMID:29474500

  17. Next generation WLCG File Transfer Service (FTS)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    LHC experiments at CERN and worldwide utilize WLCG resources and middleware components to perform distributed computing tasks. One of the most important tasks is reliable file replication. It is a complex problem, suffering from transfer failures, disconnections, transfer duplication, server and network overload, differences in storage systems, etc. To address these problems, EMI and gLite have provided the independent File Transfer Service (FTS) and Grid File Access Library (GFAL) tools. Their development started almost a decade ago; in the meantime, requirements in data management have changed, and the old architecture of FTS and GFAL cannot easily support these changes. Technology has also been progressing: FTS and GFAL do not fit into the new paradigms (cloud and messaging, for example). To be able to serve the next stage of LHC data collecting (from 2013), we need a new generation of these tools: FTS 3 and GFAL 2. We envision a service requiring minimal configuration, which can dynamically adapt to the...

  18. GIFT: an HEP project for file transfer

    International Nuclear Information System (INIS)

    Ferrer, M.L.; Mirabelli, G.; Valente, E.

    1986-01-01

    Started in autumn 1983, GIFT (General Internetwork File Transfer) is a collaboration among several HEP centers, including CERN, Frascati, Oslo, Oxford, RAL and Rome. The collaboration was initially set up with the aim of studying the feasibility of a software system to allow direct file exchange between computers which do not share a common Virtual File Protocol. After the completion of this first phase, an implementation phase started and, since March 1985, an experimental service based on this system has been running at CERN between DECnet, CERNET and the UK Coloured Book protocols. The authors present the motivations that, together with previous gateway experiences, led to the definition of GIFT specifications and to the implementation of the GIFT Kernel system. The position of GIFT in the overall development framework of the networking facilities needed by large international collaborations within the HEP community is explained. (Auth.)

  19. Mode selectivity in the intramolecular cyclization of ketenimines bearing N-acylimino units: a computational and experimental study.

    Science.gov (United States)

    Alajarín, Mateo; Sánchez-Andrada, Pilar; Vidal, Angel; Tovar, Fulgencio

    2005-02-18

    [reaction: see text] The mode selectivity in the intramolecular cyclization of a particular class of ketenimines bearing N-acylimino units has been studied by ab initio and DFT calculations. In the model compounds the carbonyl carbon atom and the keteniminic nitrogen atom are linked either by a vinylic or an o-phenylene tether. Two cyclization modes have been analyzed: the [2+2] cycloaddition furnishing compounds with an azeto[2,1-b]pyrimidinone moiety and a 6pi-electrocyclic ring closure leading to compounds enclosing a 1,3-oxazine ring. The [2+2] cycloaddition reaction takes place via a two-step process with formation of a zwitterionic intermediate, which has been characterized as a cross-conjugated mesomeric betaine. The 6pi-electrocyclic ring closure occurs via a transition state whose pseudopericyclic character has been established on the basis of its magnetic properties, geometry, and NBO analysis. The 6pi-electrocyclic ring closure is energetically favored over the [2+2] cycloaddition, although the [2+2] cycloadducts are the thermodynamically controlled products. A quantitative kinetic analysis predicts that 1,3-oxazines would be the kinetically controlled products, but they should transform rapidly and totally into the [2+2] cycloadducts at room temperature. In the experimental study, a number of N-acylimino-ketenimines, in which both reactive functions are supported on an o-phenylene scaffold, have been successfully synthesized in three steps starting from 2-azidobenzoyl chloride. These compounds rapidly convert into azeto[2,1-b]quinazolin-8-ones in moderate to good yields as a result of a formal [2+2] cycloaddition.

  20. Rejection Positivity Predicts Trial-to-Trial Reaction Times in an Auditory Selective Attention Task: A Computational Analysis of Inhibitory Control

    Directory of Open Access Journals (Sweden)

    Sufen eChen

    2014-08-01

    Full Text Available A series of computer simulations using variants of a formal model of attention (Melara & Algom, 2003) probed the role of rejection positivity (RP), a slow-wave electroencephalographic (EEG) component, in the inhibitory control of distraction. Behavioral and EEG data were recorded as participants performed auditory selective attention tasks. Simulations that modulated processes of distractor inhibition accounted well for reaction-time (RT) performance, whereas those that modulated target excitation did not. A model that incorporated RP from actual EEG recordings in estimating distractor inhibition was superior in predicting changes in RT as a function of distractor salience across conditions. A model that additionally incorporated momentary fluctuations in EEG as the source of trial-to-trial variation in performance precisely predicted individual RTs within each condition. The results lend support to the linking proposition that RP controls the speed of responding to targets through the inhibitory control of distractors.

  1. A naphthalene exciplex based Al3+ selective on-type fluorescent probe for living cells at the physiological pH range: experimental and computational studies.

    Science.gov (United States)

    Banerjee, Arnab; Sahana, Animesh; Das, Sudipta; Lohar, Sisir; Guha, Subarna; Sarkar, Bidisha; Mukhopadhyay, Subhra Kanti; Mukherjee, Asok K; Das, Debasis

    2012-05-07

    2-((Naphthalen-6-yl)methylthio)ethanol (HL) was prepared by one pot synthesis using 2-mercaptoethanol and 2-bromomethylnaphthalene. It was found to be a highly selective fluorescent sensor for Al(3+) in the physiological pH (pH 7.0-8.0). It could sense Al(3+) bound to cells through fluorescence microscopy. Metal ions like Mn(2+), Fe(3+), Co(2+), Ni(2+), Cu(2+), Zn(2+), Ag(+), Cd(2+), Hg(2+), Cr(3+) and Pb(2+) did not interfere. No interference was also observed with anions like Cl(-), Br(-), F(-), SO(4)(2-), NO(3)(-), CO(3)(2-), HPO(4)(2-) and SCN(-). Experimentally observed structural and spectroscopic features of HL and its Al(3+) complex have been substantiated by computational calculations using density functional theory (DFT) and time dependent density functional theory (TDDFT).

  2. Selection of low activation materials for fusion power plants using ACAB system: the effect of computational methods and cross section uncertainties on waste management assessment

    Energy Technology Data Exchange (ETDEWEB)

    Alonso, M.; Sanz, J.; Rodriguez, A.; Falquina, R. [Universidad Nacional de Educacion a Distancia (UNED), Dept. of Power Engineering, Madrid (Spain); Cabellos, O.; Sanz, J. [Universidad Politecnica de Madrid, Instituto de Fusion Nuclear (UPM) (Spain)

    2003-07-01

    The feasibility of nuclear fusion as a realistic option for energy generation depends on its radioactive waste management assessment. In this respect, the production of high level waste is to be avoided and the reduction of low level waste volumes is to be enhanced. Three different waste management options are commonly regarded in fusion plants: Hands-on Recycling, Remote Recycling and Shallow Land Burial (SLB). Therefore, important research work has been undertaken to find low activation structural materials. In performing this task, a major issue is to compute the concentration limits (CLs) for all natural elements, which will be used to select the intended constituent elements of a particular Low Activation Material (LAM) and assess how much the impurities can deteriorate the waste management properties. Nevertheless, the reliable computation of CLs depends on the accuracy of nuclear data (mainly activation cross-sections) and the suitability of the computational method both for inertial and magnetic fusion environments. In this paper the importance of nuclear data uncertainties and mathematical algorithms used in different activation calculations for waste management purposes will be studied. Our work is centred on the study of {sup 186}W activation under first structural wall conditions of Hylife-II inertial fusion reactor design. The importance of the dominant transmutation/decay sequence has been documented in several publications. From a practical point of view, W is used in low activation materials for fusion applications: Cr-W ferritic/martensitic steels, and the need to better compute its activation has been assessed, in particular in relation to the cross-section uncertainties for reactions leading to Ir isotopes. {sup 192n}Ir and {sup 192}Ir reach a secular equilibrium, and {sup 192n}Ir is the critical one for waste management, with a half life of 241 years. From a theoretical point of view, this is one of the most complex chains appearing in

  3. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    Science.gov (United States)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  4. Tabulation of Fundamental Assembly Heat and Radiation Source Files

    International Nuclear Information System (INIS)

    T. deBues; J.C. Ryman

    2006-01-01

    The purpose of this calculation is to tabulate a set of computer files for use as input to the WPLOAD thermal loading software. These files contain details regarding heat and radiation from pressurized water reactor (PWR) assemblies and boiling water reactor (BWR) assemblies. The scope of this calculation is limited to rearranging and reducing the existing file information into a more streamlined set of tables for use as input to WPLOAD. The electronic source term files used as input to this calculation were generated from the output files of the SAS2H/ORIGEN-S sequence of the SCALE Version 4.3 modular code system, as documented in References 2.1.1 and 2.1.2, and are included in Attachment II

  5. The Improvement and Performance of Mobile Environment Using Both Cloud and Text Computing

    OpenAIRE

    S.Saravana Kumar; J.Lakshmi Priya; P.Hannah Jennifer; N.Jeff Monica; Fathima

    2013-01-01

    This research paper presents a design model for a file sharing system for ubiquitous mobile devices using both cloud and text computing. File sharing is one of the rationales for computer networks, with increasing demand for file sharing applications and technologies in small and large enterprise networks and on the Internet. File transfer is an important process in any form of computing, as we need to share data across systems. ...

  6. Medical and administrative management by computing devices in the service of nuclear medicine of Nancy

    International Nuclear Information System (INIS)

    Legras, B.; Chau, N.; Lambert, J.-P.; Martin, J.; Bertrand, A.

    1977-01-01

    The results of the processing of the administrative and medical data collected in a Department of Nuclear Medicine are presented. For a moderate increase in the secretaries' work (limited by the use of carbon copies) and minor effort from the doctors, the resulting advantages are tremendous. A detailed balance of the Department's activity can be obtained monthly. From the medical files, the computer provides statistical data and listings in clear form (with or without sorting) of the selected records [fr]

  7. Quantitative assessment of selective in-plane shielding of tissues in computed tomography through evaluation of absorbed dose and image quality

    International Nuclear Information System (INIS)

    Geleijns, J.; Veldkamp, W.J.H.; Salvado Artells, M.; Lopez Tortosa, M.; Calzado Cantera, A.

    2006-01-01

    This study aimed to assess the efficacy of selective in-plane shielding in adults by quantitative evaluation of the achieved dose reduction and image quality. Commercially available accessories for in-plane shielding of the eye lens, thyroid and breast, and an anthropomorphic phantom were used for the evaluation of absorbed dose and image quality. Organ dose and total energy imparted were assessed by means of a Monte Carlo technique taking into account tube voltage, tube current, and scanner type. Image quality was quantified as noise in soft tissue. Application of the lens shield reduced dose to the lens by 27% and to the brain by 1%. The thyroid shield reduced thyroid dose by 26%; the breast shield reduced dose to the breasts by 30% and to the lungs by 15%. Total energy imparted (unshielded/shielded) was 88/86 mJ for computed tomography (CT) brain, 64/60 mJ for CT cervical spine, and 289/260 mJ for CT chest scanning. An increase in image noise could be observed in the ranges where bismuth shielding was applied. The observed reduction of organ dose and total energy imparted could be achieved more efficiently by a reduction of tube current. The application of in-plane selective shielding is therefore discouraged. (orig.)

  8. Concept of a selective tumour therapy and its evaluation by near-infrared fluorescence imaging and flat-panel volume computed tomography in mice.

    Science.gov (United States)

    Alves, Frauke; Dullin, Christian; Napp, Joanna; Missbach-Guentner, Jeannine; Jannasch, Katharina; Mathejczyk, Julia; Pardo, Luis A; Stühmer, Walter; Tietze, Lutz-F

    2009-05-01

    Conventional chemotherapy of cancer has its limitations, especially in advanced and disseminated disease and suffers from lack of specificity. This results in a poor therapeutic index and considerable toxicity to normal organs. Therefore, many efforts are made to develop novel therapeutic tools against cancer with the aim of selectively targeting the drug to the tumour site. Drug delivery strategies fundamentally rely on the identification of good-quality biomarkers, allowing unequivocal discrimination between cancer and healthy tissue. At present, antibodies or antibody fragments have clearly proven their value as carrier molecules specific for a tumour-associated molecular marker. This present review draws attention to the use of near-infrared fluorescence (NIRF) imaging to investigate binding specificity and kinetics of carrier molecules such as monoclonal antibodies. In addition, flat-panel volume computed tomography (fpVCT) will be presented to monitor anatomical structures in tumour mouse models over time in a non-invasive manner. Each imaging device sheds light on a different aspect; functional imaging is applied to optimise the dose schedule and the concept of selective tumour therapies, whereas anatomical imaging assesses preclinically the efficacy of novel tumour therapies. Both imaging techniques in combination allow the visualisation of functional information obtained by NIRF imaging within an adequate anatomic framework.

  9. Download this PDF file

    African Journals Online (AJOL)

  10. MMLEADS Public Use File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare-Medicaid Linked Enrollee Analytic Data Source (MMLEADS) Public Use File (PUF) contains demographic, enrollment, condition prevalence, utilization, and...

  11. Hospital Service Area File

    Data.gov (United States)

    U.S. Department of Health & Human Services — This file is derived from the calendar year inpatient claims data. The records contain number of discharges, length of stay, and total charges summarized by provider...

  12. Patient Treatment File (PTF)

    Data.gov (United States)

    Department of Veterans Affairs — This database is part of the National Medical Information System (NMIS). The Patient Treatment File (PTF) contains a record for each inpatient care episode provided...

  13. USEEIO Satellite Files

    Data.gov (United States)

    U.S. Environmental Protection Agency — These files contain the environmental data as particular emissions or resources associated with a BEA sectors that are used in the USEEIO model. They are organized...

  14. Provider of Services File

    Data.gov (United States)

    U.S. Department of Health & Human Services — The POS file contains data on characteristics of hospitals and other types of healthcare facilities, including the name and address of the facility and the type of...

  15. Download this PDF file

    African Journals Online (AJOL)

  16. Cyclic fatigue resistance tests of Nickel-Titanium rotary files using simulated canal and weight loading conditions

    Directory of Open Access Journals (Sweden)

    Ok-In Cho

    2013-02-01

    Full Text Available Objectives This study compared the cyclic fatigue resistance of nickel-titanium (NiTi) files obtained in a conventional test using a simulated canal with a newly developed method that allows the application of constant fatigue load conditions. Materials and Methods ProFile and K3 files of #25/.06, #30/.06, and #40/.04 were selected. Two types of testing devices were built to test their fatigue performance. The first (conventional) device prescribed curvature inside a simulated canal (C-test); the second, new device exerted a constant load (L-test) whilst allowing any resulting curvature. Ten new instruments of each size and brand were tested with each device. The files were rotated until fracture and the number of cycles to failure (NCF) was determined. The NCF were subjected to one-way ANOVA and Duncan's post-hoc test for each method. Spearman's rank correlation coefficient was computed to examine any association between methods. Results Spearman's rank correlation coefficient (ρ = -0.905) showed a significant negative correlation between methods. Groups with a significant difference after the L-test divided into 4 clusters, whilst the C-test gave just 2 clusters. From the L-test, considering the negative correlation of NCF, K3 gave a significantly lower fatigue resistance than ProFile, as in the C-test. K3 #30/.06 showed a lower fatigue resistance than K3 #25/.06, which was not found by the C-test. Variation in fatigue test methodology resulted in different cyclic fatigue resistance rankings for various NiTi files. Conclusions The new methodology standardized the load during fatigue testing, allowing determination of fatigue behavior under constant load conditions.
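
    The between-method comparison reported above reduces to a Spearman rank correlation between the cycles-to-failure obtained with the two devices. The fragment below shows that computation on made-up values; the numbers are illustrative only, and a negative coefficient simply mirrors the inverse ranking reported in the study.

        # Spearman rank correlation between two fatigue-test methods (made-up NCF values).
        import numpy as np
        from scipy.stats import spearmanr

        ncf_c_test = np.array([520, 610, 480, 700, 655, 590])  # simulated-canal test (hypothetical)
        ncf_l_test = np.array([410, 350, 455, 300, 320, 365])  # constant-load test (hypothetical)

        rho, p_value = spearmanr(ncf_c_test, ncf_l_test)
        print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")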

  17. JENDL Dosimetry File

    International Nuclear Information System (INIS)

    Nakazawa, Masaharu; Iguchi, Tetsuo; Kobayashi, Katsuhei; Iwasaki, Shin; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo.

    1992-03-01

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d, n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form. (author) 76 refs

  18. JENDL Dosimetry File

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Masaharu; Iguchi, Tetsuo [Tokyo Univ. (Japan). Faculty of Engineering; Kobayashi, Katsuhei [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Iwasaki, Shin [Tohoku Univ., Sendai (Japan). Faculty of Engineering; Sakurai, Kiyoshi; Ikeda, Yujiro; Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1992-03-15

    The JENDL Dosimetry File based on JENDL-3 was compiled and integral tests of cross section data were performed by the Dosimetry Integral Test Working Group of the Japanese Nuclear Data Committee. Data stored in the JENDL Dosimetry File are the cross sections and their covariance data for 61 reactions. The cross sections were mainly taken from JENDL-3 and the covariances from IRDF-85. For some reactions, data were adopted from other evaluated data files. The data are given in the neutron energy region below 20 MeV in both of point-wise and group-wise files in the ENDF-5 format. In order to confirm reliability of the data, several integral tests were carried out; comparison with the data in IRDF-85 and average cross sections measured in fission neutron fields, fast reactor spectra, DT neutron fields and Li(d,n) neutron fields. As a result, it has been found that the JENDL Dosimetry File gives better results than IRDF-85 but there are some problems to be improved in future. The contents of the JENDL Dosimetry File and the results of the integral tests are described in this report. All of the dosimetry cross sections are shown in a graphical form.

  19. Dynamic Non-Hierarchical File Systems for Exascale Storage

    Energy Technology Data Exchange (ETDEWEB)

    Long, Darrell E. [Univ. of California, Santa Cruz, CA (United States); Miller, Ethan L [Univ. of California, Santa Cruz, CA (United States)

    2015-02-24

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search

  20. Development of data file system for cardiovascular nuclear medicine

    International Nuclear Information System (INIS)

    Hayashida, Kohei; Nishimura, Tsunehiko; Uehara, Toshiisa; Nisawa, Yoshifumi.

    1985-01-01

    A computer-assisted filing system for storing and processing data from cardiac pool scintigraphy and myocardial scintigraphy has been developed. Individual patient data are stored with the patient's identification number (ID) on floppy discs, successively in the order in which scintigraphy was performed. Data for 900 patients can be stored per floppy disc. Scintigraphic findings can be output in a uniform file format and used as a reporting format. Output or retrieval of filed individual patient data is possible according to each examination, disease code or ID. This system appears suitable for prospective studies in patients with cardiovascular diseases. (Namekawa, K.)

  1. Tuning HDF5 subfiling performance on parallel file systems

    Energy Technology Data Exchange (ETDEWEB)

    Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chaarawi, Mohamad [Intel Corp. (United States); Koziol, Quincey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mainzer, John [The HDF Group (United States); Willmore, Frank [The HDF Group (United States)

    2017-05-12

    Subfiling is a technique used on parallel file systems to reduce locking and contention issues when multiple compute nodes interact with the same storage target node. Subfiling provides a compromise between the single-shared-file approach, which instigates lock contention problems on parallel file systems, and the one-file-per-process approach, which generates a massive and unmanageable number of files. In this paper, we evaluate and tune the performance of the recently implemented subfiling feature in HDF5. Specifically, we explain the implementation strategy of the subfiling feature in HDF5, provide examples of using the feature, and evaluate and tune the parallel I/O performance of this feature on parallel file systems of the Cray XC40 system at NERSC (Cori), which include a burst buffer storage and a Lustre disk-based storage. We also evaluate I/O performance on the Cray XC30 system, Edison, at NERSC. Our results show a performance advantage of 1.2X to 6X with subfiling compared to writing a single shared HDF5 file. We present our exploration of configurations, such as the number of subfiles and the number of Lustre storage targets used to store files, as optimization parameters to obtain superior I/O performance. Based on this exploration, we discuss recommendations for achieving good I/O performance as well as limitations of using the subfiling feature.
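
    The core idea (grouping processes so that each group shares one file) can be sketched without the HDF5 subfiling machinery itself. The following is a minimal illustration using mpi4py and a parallel (MPI-enabled) build of h5py, with a hypothetical group size and file naming scheme; it is not the HDF5 subfiling API evaluated in the paper.

```python
# Sketch of the subfiling idea: ranks are grouped, and each group writes
# collectively to its own HDF5 "subfile" instead of one shared file or one
# file per rank. Group size and file names are illustrative assumptions.
from mpi4py import MPI
import h5py
import numpy as np

comm = MPI.COMM_WORLD
rank, nprocs = comm.Get_rank(), comm.Get_size()

ranks_per_subfile = 4                      # tuning knob, analogous to the subfile count
color = rank // ranks_per_subfile          # which subfile this rank belongs to
subcomm = comm.Split(color, rank)          # one communicator per subfile group

local = np.full(1024, rank, dtype="f8")    # this rank's slice of the dataset

with h5py.File(f"output.subfile.{color}.h5", "w", driver="mpio", comm=subcomm) as f:
    dset = f.create_dataset("data", (subcomm.Get_size() * local.size,), dtype="f8")
    start = subcomm.Get_rank() * local.size
    dset[start:start + local.size] = local  # each rank writes its own region
```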

  2. Renewal-anomalous-heterogeneous files

    International Nuclear Information System (INIS)

    Flomenbom, Ophir

    2010-01-01

    Renewal-anomalous-heterogeneous files are solved. A simple file is made of Brownian hard spheres that diffuse stochastically in an effective 1D channel. Generally, Brownian files are heterogeneous: the spheres' diffusion coefficients are distributed and the initial spheres' density is non-uniform. In renewal-anomalous files, the distribution of waiting times for individual jumps is not exponential as in Brownian files, yet obeys ψ_α(t) ∼ t^(−1−α), 0 < α < 1. It is shown that the mean square displacement (MSD), ⟨r²⟩, obeys ⟨r²⟩ ∼ (⟨r²⟩_nrml)^α, where ⟨r²⟩_nrml is the MSD in the corresponding Brownian file. This scaling is an outcome of an exact relation (derived here) connecting probability density functions of Brownian files and renewal-anomalous files. It is also shown that non-renewal-anomalous files are slower than the corresponding renewal ones.

  3. Computer code selection criteria for flow and transport code(s) to be used in undisturbed vadose zone calculations for TWRS environmental analyses

    International Nuclear Information System (INIS)

    Mann, F.M.

    1998-01-01

    The Tank Waste Remediation System (TWRS) is responsible for the safe storage, retrieval, and disposal of waste currently being held in 177 underground tanks at the Hanford Site. In order to successfully carry out its mission, TWRS must perform environmental analyses describing the consequences of tank contents leaking from tanks and associated facilities during the storage, retrieval, or closure periods and immobilized low-activity tank waste contaminants leaving disposal facilities. Because of the large size of the facilities and the great depth of the dry zone (known as the vadose zone) underneath the facilities, sophisticated computer codes are needed to model the transport of the tank contents or contaminants. This document presents the code selection criteria for those vadose zone analyses (a subset of the above analyses) where the hydraulic properties of the vadose zone are constant in time, the geochemical behavior of the contaminant-soil interaction can be described by simple models, and the geologic or engineered structures are complicated enough to require a two- or three-dimensional model. Thus, simple analyses would not need to use the fairly sophisticated codes which would meet the selection criteria in this document. Similarly, those analyses which involve complex chemical modeling (such as those analyses involving large tank leaks or those analyses involving the modeling of contaminant release from glass waste forms) are excluded. The analyses covered here are those where the movement of contaminants can be relatively simply calculated from the moisture flow. These code selection criteria are based on the information from the low-level waste programs of the US Department of Energy (DOE) and of the US Nuclear Regulatory Commission as well as experience gained in the DOE Complex in applying these criteria. Appendix table A-1 provides a comparison between the criteria in these documents and those used here. This document does not define the models (that

  4. Five-Year-Olds’ Systematic Errors in Second-Order False Belief Tasks Are Due to First-Order Theory of Mind Strategy Selection: A Computational Modeling Study

    Science.gov (United States)

    Arslan, Burcu; Taatgen, Niels A.; Verbrugge, Rineke

    2017-01-01

    Studies on second-order false belief reasoning have generally focused on investigating the roles of executive functions and language with correlational studies. In contrast, we focus on the question of how 5-year-olds select and revise reasoning strategies in second-order false belief tasks by constructing two computational cognitive models of this process: an instance-based learning model and a reinforcement learning model. Unlike the reinforcement learning model, the instance-based learning model predicted that children who fail second-order false belief tasks would give answers based on first-order theory of mind (ToM) reasoning as opposed to zero-order reasoning. This prediction was confirmed with an empirical study that we conducted with 72 5- to 6-year-old children. The results showed that 17% of the answers were correct and 83% of the answers were wrong. In line with our prediction, 65% of the wrong answers were based on a first-order ToM strategy, while only 29% of them were based on a zero-order strategy (the remaining 6% of subjects did not provide any answer). Based on our instance-based learning model, we propose that when children get feedback “Wrong,” they explicitly revise their strategy to a higher level instead of implicitly selecting one of the available ToM strategies. Moreover, we predict that children’s failures are due to lack of experience and that with exposure to second-order false belief reasoning, children can revise their wrong first-order reasoning strategy to a correct second-order reasoning strategy. PMID:28293206

  6. The version control service for the ATLAS data acquisition configuration files

    International Nuclear Information System (INIS)

    Soloviev, Igor

    2012-01-01

    The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error-free and consistent set of XML files proved a major challenge. Therefore a special service was implemented: to validate any modifications; to check the authorization of anyone trying to modify a file; to record who had made changes, plus when and why; and to provide tools to compare different versions of files and to go back to earlier versions if required. This paper provides details of the implementation and exploitation experience, which may be interesting for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.
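
    As an illustration of the kind of checks such a service performs before accepting a change, the sketch below validates that a configuration file is well-formed XML, checks the user against an authorization list, and appends an audit record of who changed what, when, and why. The role list, file names, and audit-log format are hypothetical and are not taken from the ATLAS implementation.

```python
# Hypothetical sketch of a configuration-change gate: authorization check,
# XML well-formedness validation, and an append-only audit trail.
import xml.etree.ElementTree as ET
import datetime
import getpass
import json

AUTHORIZED_EDITORS = {"daq_expert", "trigger_expert"}   # illustrative role list

def commit_config_change(path: str, reason: str, audit_log: str = "audit.jsonl") -> None:
    user = getpass.getuser()
    if user not in AUTHORIZED_EDITORS:
        raise PermissionError(f"{user} is not authorized to modify {path}")
    ET.parse(path)                                       # raises ParseError if the XML is malformed
    record = {"file": path, "user": user, "reason": reason,
              "time": datetime.datetime.utcnow().isoformat() + "Z"}
    with open(audit_log, "a") as fh:                     # who, when, and why
        fh.write(json.dumps(record) + "\n")
```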

  7. Students "Hacking" School Computer Systems

    Science.gov (United States)

    Stover, Del

    2005-01-01

    This article deals with students hacking school computer systems. School districts are getting tough with students "hacking" into school computers to change grades, poke through files, or just pit their high-tech skills against district security. Dozens of students have been prosecuted recently under state laws on identity theft and unauthorized…

  8. NASA ARCH- A FILE ARCHIVAL SYSTEM FOR THE DEC VAX

    Science.gov (United States)

    Scott, P. J.

    1994-01-01

    The function of the NASA ARCH system is to provide a permanent storage area for files that are infrequently accessed. The NASA ARCH routines were designed to provide a simple mechanism by which users can easily store and retrieve files. The user treats NASA ARCH as the interface to a black box where files are stored. There are only five NASA ARCH user commands, even though NASA ARCH employs standard VMS directives and the VAX BACKUP utility. Special care is taken to provide the security needed to ensure file integrity over a period of years. The archived files may exist in any of three storage areas: a temporary buffer, the main buffer, and a magnetic tape library. When the main buffer fills up, it is transferred to permanent magnetic tape storage and deleted from disk. Files may be restored from any of the three storage areas. A single file, multiple files, or entire directories can be stored and retrieved. Archived entities retain the same name, extension, version number, and VMS file protection scheme as they had in the user's account prior to archival. NASA ARCH is capable of handling up to 7 directory levels. Wildcards are supported. User commands include TEMPCOPY, DISKCOPY, DELETE, RESTORE, and DIRECTORY. The DIRECTORY command searches a directory of savesets covering all three archival areas, listing matches according to area, date, filename, or other criteria supplied by the user. The system manager commands include 1) ARCHIVE, to transfer the main buffer to duplicate magnetic tapes; 2) REPORT, to determine when the main buffer is full enough to archive; 3) INCREMENT, to back up the partially filled main buffer; and 4) FULLBACKUP, to back up the entire main buffer. On-line help files are provided for all NASA ARCH commands. NASA ARCH is written in DEC VAX DCL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.X. This program was developed in 1985.

  9. Individual selection of X-ray tube settings in computed tomography coronary angiography: Reliability of an automated software algorithm to maintain constant image quality.

    Science.gov (United States)

    Durmus, Tahir; Luhur, Reny; Daqqaq, Tareef; Schwenke, Carsten; Knobloch, Gesine; Huppertz, Alexander; Hamm, Bernd; Lembcke, Alexander

    2016-05-01

    To evaluate a software tool that claims to maintain a constant contrast-to-noise ratio (CNR) in high-pitch dual-source computed tomography coronary angiography (CTCA) by automatically selecting both X-ray tube voltage and current. A total of 302 patients (171 males; age 61±12 years; body weight 82±17 kg; body mass index 27.3±4.6 kg/m(2)) underwent CTCA with a topogram-based, automatic selection of both tube voltage and current using dedicated software with quality reference values of 100 kV and 250 mAs/rotation (i.e., standard values for an average adult weighing 75 kg) and an injected iodine load of 222 mg/kg. The average radiation dose was estimated to be 1.02±0.64 mSv. All data sets had adequate contrast enhancement. The average CNR in the aortic root, left ventricle, and left and right coronary artery was 15.7±4.5, 8.3±2.9, 16.1±4.3 and 15.3±3.9, respectively. Individual CNR values were independent of patients' body size and radiation dose. However, individual CNR values may vary considerably between subjects, as reflected by interquartile ranges of 12.6-18.6, 6.2-9.9, 12.8-18.9 and 12.5-17.9, respectively. Moreover, average CNR values were significantly lower in males than in females (15.1±4.1 vs. 16.6±11.7, 7.9±2.7 vs. 8.9±3.0, 15.5±3.9 vs. 16.9±4.6 and 14.7±3.6 vs. 16.0±4.1, respectively). A topogram-based automatic selection of X-ray tube settings in CTCA provides diagnostic image quality independent of patients' body size. Nevertheless, considerable variation of individual CNR values between patients and significant differences of CNR values between males and females occur, which questions the reliability of this approach. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Escherichia coli promoter sequences predict in vitro RNA polymerase selectivity.

    Science.gov (United States)

    Mulligan, M E; Hawley, D K; Entriken, R; McClure, W R

    1984-01-11

    We describe a simple algorithm for computing a homology score for Escherichia coli promoters based on DNA sequence alone. The homology score was related to 31 values, measured in vitro, of RNA polymerase selectivity, which we define as the product KBk2, the apparent second-order rate constant for open complex formation. We found that promoter strength could be predicted to within a factor of +/-4.1 in KBk2 over a range of 10^4 in the same parameter. The quantitative evaluation was linked to an automated (Apple II) procedure for searching and evaluating possible promoters in DNA sequence files.
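
    A toy version of such a sequence-only scoring scheme is sketched below: a candidate promoter is scored by the similarity of its -35 and -10 hexamers to the E. coli consensus sequences, minus a spacer-length penalty. The weights and scoring function are illustrative assumptions, not the actual homology matrix of Mulligan et al. (1984).

```python
# Illustrative promoter scoring sketch: consensus hexamer matches plus a
# hypothetical spacer penalty. Not the published homology-score weighting.
CONSENSUS_35, CONSENSUS_10 = "TTGACA", "TATAAT"
OPTIMAL_SPACER = 17  # bp between the two hexamers

def hexamer_match(hexamer: str, consensus: str) -> float:
    # fraction of positions that match the consensus
    return sum(a == b for a, b in zip(hexamer.upper(), consensus)) / len(consensus)

def homology_score(seq: str, pos35: int, pos10: int) -> float:
    """Score a promoter given the start positions of its -35 and -10 hexamers."""
    spacer = pos10 - (pos35 + 6)
    score35 = hexamer_match(seq[pos35:pos35 + 6], CONSENSUS_35)
    score10 = hexamer_match(seq[pos10:pos10 + 6], CONSENSUS_10)
    spacer_penalty = 0.02 * abs(spacer - OPTIMAL_SPACER)   # hypothetical weight
    return score35 + score10 - spacer_penalty

# Example: a perfect consensus promoter with a 17 bp spacer scores 2.0
seq = "TTGACA" + "N" * 17 + "TATAAT"
print(homology_score(seq, 0, 23))
```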

  11. An integrated methodological approach to the computer-assisted gas chromatographic screening of basic drugs in biological fluids using nitrogen selective detection.

    Science.gov (United States)

    Dugal, R; Massé, R; Sanchez, G; Bertrand, M J

    1980-01-01

    This paper presents the methodological aspects of a computerized system for the gas-chromatographic screening and primary identification of central nervous system stimulants and narcotic analgesics (including some of their respective metabolites) extracted from urine. The operating conditions of a selective nitrogen detector for optimized analytical functions are discussed, particularly the effect of carrier and fuel gas on the detector's sensitivity to nitrogen-containing molecules and discriminating performance toward biological matrix interferences. Application of simple extraction techniques, combined with rapid derivatization procedures, computer data acquisition, and reduction of chromatographic data are presented. Results show that this system approach allows for the screening of several drugs and their metabolites in a short amount of time. The reliability and stability of the system have been tested by analyzing several thousand samples for doping control at major international sporting events and for monitoring drug intake in addicts participating in a rehabilitation program. Results indicate that these techniques can be used and adapted to many different analytical toxicology situations.

  12. A Genetic-Based Feature Selection Approach in the Identification of Left/Right Hand Motor Imagery for a Brain-Computer Interface.

    Science.gov (United States)

    Yaacoub, Charles; Mhanna, Georges; Rihana, Sandy

    2017-01-23

    Electroencephalography is a non-invasive measure of the brain electrical activity generated by millions of neurons. Feature extraction in electroencephalography analysis is a core issue that may lead to accurate brain mental state classification. This paper presents a new feature selection method that improves left/right hand movement identification of a motor imagery brain-computer interface, based on genetic algorithms and artificial neural networks used as classifiers. Raw electroencephalography signals are first preprocessed using appropriate filtering. Feature extraction is carried out afterwards, based on spectral and temporal signal components, and thus a feature vector is constructed. As various features might be inaccurate and mislead the classifier, thus degrading the overall system performance, the proposed approach identifies a subset of features from a large feature space, such that the classifier error rate is reduced. Experimental results show that the proposed method is able to reduce the number of features to as low as 0.5% (i.e., the number of ignored features can reach 99.5%) while improving the accuracy, sensitivity, specificity, and precision of the classifier.
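
    A minimal sketch of the general approach (a genetic algorithm that evolves binary feature masks, scored by the cross-validated accuracy of a small neural-network classifier) is shown below. The synthetic data, population size, and mutation rate are illustrative assumptions, not the authors' settings.

```python
# Sketch of GA-driven feature subset selection with a neural-network classifier.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))                       # stand-in for spectral/temporal EEG features
y = (X[:, 3] + 0.5 * X[:, 7] > 0).astype(int)        # only a few features are informative

def fitness(mask):
    # cross-validated accuracy of the classifier on the selected feature subset
    if mask.sum() == 0:
        return 0.0
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500, random_state=0)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1]))      # random initial feature masks
for generation in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]     # truncation selection
    children = []
    for _ in range(10):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, X.shape[1])            # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(X.shape[1]) < 0.02         # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.vstack([parents] + children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```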

  13. A Genetic-Based Feature Selection Approach in the Identification of Left/Right Hand Motor Imagery for a Brain-Computer Interface

    Directory of Open Access Journals (Sweden)

    Charles Yaacoub

    2017-01-01

    Full Text Available Electroencephalography is a non-invasive measure of the brain electrical activity generated by millions of neurons. Feature extraction in electroencephalography analysis is a core issue that may lead to accurate brain mental state classification. This paper presents a new feature selection method that improves left/right hand movement identification of a motor imagery brain-computer interface, based on genetic algorithms and artificial neural networks used as classifiers. Raw electroencephalography signals are first preprocessed using appropriate filtering. Feature extraction is carried out afterwards, based on spectral and temporal signal components, and thus a feature vector is constructed. As various features might be inaccurate and mislead the classifier, thus degrading the overall system performance, the proposed approach identifies a subset of features from a large feature space, such that the classifier error rate is reduced. Experimental results show that the proposed method is able to reduce the number of features to as low as 0.5% (i.e., the number of ignored features can reach 99.5%) while improving the accuracy, sensitivity, specificity, and precision of the classifier.

  14. Computational modeling of distinct neocortical oscillations driven by cell-type selective optogenetic drive: Separable resonant circuits controlled by low-threshold spiking and fast-spiking interneurons

    Directory of Open Access Journals (Sweden)

    Dorea Vierling-Claassen

    2010-11-01

    Full Text Available Selective optogenetic drive of fast spiking interneurons (FS) leads to enhanced local field potential (LFP) power across the traditional gamma frequency band (20-80 Hz; Cardin et al., 2009). In contrast, drive to regular-spiking pyramidal cells (RS) enhances power at lower frequencies, with a peak at 8 Hz. The first result is consistent with previous computational studies emphasizing the role of FS and the time constant of GABAA synaptic inhibition in gamma rhythmicity. However, the same theoretical models do not typically predict low-frequency LFP enhancement with RS drive. To develop hypotheses as to how the same network can support these contrasting behaviors, we constructed a biophysically principled network model of primary somatosensory neocortex containing FS, RS and low-threshold-spiking (LTS) interneurons. Cells were modeled with detailed cell anatomy and physiology, multiple dendritic compartments, and included active somatic and dendritic ionic currents. Consistent with prior studies, the model demonstrated gamma resonance during FS drive, dependent on the time-constant of GABAA inhibition induced by synchronous FS activity. Lower frequency enhancement during RS drive was replicated only on inclusion of an inhibitory LTS population, whose activation was critically dependent on RS synchrony and evoked longer-lasting inhibition. Our results predict that differential recruitment of FS and LTS inhibitory populations is essential to the observed cortical dynamics and may provide a means for amplifying the natural expression of distinct oscillations in normal cortical processing.

  15. Verification of SIGACE code for generating ACE format cross-section files with continuous energy at high temperature

    International Nuclear Information System (INIS)

    Li Zhifeng; Yu Tao; Xie Jinsen; Qin Mian

    2012-01-01

    Based on the recently released ENDF/B-VII.1 library, high-temperature neutron cross-section files are generated with the SIGACE code from low-temperature ACE format files. To verify the ACE files processed by SIGACE, benchmark calculations are performed in this paper. The calculated results for selected ICT, standard CANDU assembly, LWR Doppler coefficient and SEFOR benchmarks conform well with the reference values, which indicates that high-temperature ACE files processed by SIGACE can be used in related neutronics calculations. (authors)

  16. Strategies for Sharing Seismic Data Among Multiple Computer Platforms

    Science.gov (United States)

    Baker, L. M.; Fletcher, J. B.

    2001-12-01

    Seismic waveform data is readily available from a variety of sources, but it often comes in a distinct, instrument-specific data format. For example, data may be from portable seismographs, such as those made by Refraction Technology or Kinemetrics, from permanent seismograph arrays, such as the USGS Parkfield Dense Array, from public data centers, such as the IRIS Data Center, or from personal communication with other researchers through e-mail or ftp. A computer must be selected to import the data - usually whichever is the most suitable for reading the originating format. However, the computer best suited for a specific analysis may not be the same. When copies of the data are then made for analysis, a proliferation of copies of the same data results, in possibly incompatible, computer-specific formats. In addition, if an error is detected and corrected in one copy, or some other change is made, all the other copies must be updated to preserve their validity. Keeping track of what data is available, where it is located, and which copy is authoritative requires an effort that is easy to neglect. We solve this problem by importing waveform data to a shared network file server that is accessible to all our computers on our campus LAN. We use a Network Appliance file server running Sun's Network File System (NFS) software. Using an NFS client software package on each analysis computer, waveform data can then be read by our MatLab or Fortran applications without first copying the data. Since there is a single copy of the waveform data in a single location, the NFS file system hierarchy provides an implicit complete waveform data catalog and the single copy is inherently authoritative. Another part of our solution is to convert the original data into a blocked-binary format (known historically as USGS DR100 or VFBB format) that is interpreted by MatLab or Fortran library routines available on each computer so that the idiosyncrasies of each machine are not visible to

  17. TIGER/Line Shapefile, 2010, Series Information File for the 2010 Census Block State-based Shapefile with Housing and Population Data

    Data.gov (United States)

    US Census Bureau, Department of Commerce — The TIGER/Line Files are shapefiles and related database files (.dbf) that are an extract of selected geographic and cartographic information from the U.S. Census...

  18. Selective gossip

    NARCIS (Netherlands)

    Üstebay, D.; Castro, R.M.; Rabbat, M.

    2009-01-01

    Motivated by applications in compression and distributed transform coding, we propose a new gossip algorithm called Selective Gossip to efficiently compute sparse approximations of network data. We consider running parallel gossip algorithms on the elements of a vector of transform coefficients.
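
    The flavor of the approach can be sketched as randomized pairwise gossip on a coefficient vector in which only entries deemed significant are exchanged. The threshold rule and parameters below are illustrative assumptions, not the Selective Gossip algorithm's exact update rules.

```python
# Toy sketch of thresholded pairwise gossip averaging on transform coefficients.
import numpy as np

rng = np.random.default_rng(1)
n_nodes, n_coeffs, threshold = 8, 16, 0.2
x = rng.normal(size=(n_nodes, n_coeffs))      # each node's local coefficient vector

for _ in range(2000):
    i, j = rng.choice(n_nodes, size=2, replace=False)   # random communicating pair
    avg = 0.5 * (x[i] + x[j])
    active = np.abs(avg) > threshold          # only "significant" coefficients are exchanged
    x[i, active] = x[j, active] = avg[active]

significant = np.abs(x.mean(axis=0)) > threshold
print("max disagreement on significant coefficients:",
      np.abs(x[:, significant] - x[:, significant].mean(axis=0)).max())
```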

  19. Cloud object store for archive storage of high performance computing data using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
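
    A minimal sketch of the surrounding workflow (gathering archived files from a parallel file system and pushing each one as an object to a cloud object store) is shown below, assuming an S3-compatible store accessed through boto3. The PLFS-based log-structured conversion described above is not reproduced here; directory and bucket names are hypothetical.

```python
# Hypothetical sketch: upload archived files (e.g., checkpoints) as objects.
from pathlib import Path
import boto3

def archive_to_object_store(archive_dir: str, bucket: str, prefix: str = "checkpoints/"):
    s3 = boto3.client("s3")                      # credentials/endpoint taken from the environment
    for path in sorted(Path(archive_dir).rglob("*")):
        if path.is_file():
            key = prefix + path.relative_to(archive_dir).as_posix()
            with path.open("rb") as fh:
                s3.put_object(Bucket=bucket, Key=key, Body=fh)
            print("archived", path, "->", key)

# archive_to_object_store("/scratch/app/ckpt", bucket="hpc-archive")
```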

  20. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  1. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  3. Formalizing a hierarchical file system

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, Muhammad Ikram

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal,

  4. Formalizing a Hierarchical File System

    NARCIS (Netherlands)

    Hesselink, Wim H.; Lali, M.I.

    2009-01-01

    In this note, we define an abstract file system as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for removal

  5. Extracting the Data From the LCM vk4 Formatted Output File

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-29

    These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: vk4 file produced by Keyence VK Software, custom analysis, no off-the-shelf way to read the file, reading the binary data in a vk4 file, various offsets in decimal lines, finding the height image data, directly in MATLAB, binary output beginning of height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read vk4 file (choose a file, read the file, compute offsets, read optical image, laser optical image, read and compute laser intensity image, read height image, timing, display height image, display laser intensity image, display RGB laser optical images, display RGB optical images, display beginning data and save images to workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, observations.
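
    A generic sketch of the kind of binary parsing described in the slides is shown below: read an offset from a header, seek to it, and unpack an image block. Every offset, field, and size in the sketch is a hypothetical placeholder rather than the actual vk4 layout, which the slides derive from the Keyence files themselves.

```python
# Generic binary-parsing sketch; all offsets and field layouts are hypothetical.
import struct
import numpy as np

def read_uint32(fh):
    return struct.unpack("<I", fh.read(4))[0]   # little-endian 32-bit unsigned integer

def read_hypothetical_height_image(path):
    with open(path, "rb") as fh:
        fh.seek(12)                              # hypothetical position of an offset table entry
        height_offset = read_uint32(fh)
        fh.seek(height_offset)                   # jump to the image block
        width, height = read_uint32(fh), read_uint32(fh)
        data = np.frombuffer(fh.read(4 * width * height), dtype="<u4")
        return data.reshape(height, width)
```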

  6. A Centralized Control and Dynamic Dispatch Architecture for File Integrity Analysis

    Directory of Open Access Journals (Sweden)

    Ronald DeMara

    2006-02-01

    Full Text Available The ability to monitor computer file systems for unauthorized changes is a powerful administrative tool. Ideally this task could be performed remotely under the direction of the administrator to allow on-demand checking, and use of tailorable reporting and exception policies targeted to adjustable groups of network elements. This paper introduces M-FICA, a Mobile File Integrity and Consistency Analyzer as a prototype to achieve this capability using mobile agents. The M-FICA file tampering detection approach uses MD5 message digests to identify file changes. Two agent types, Initiator and Examiner, are used to perform file integrity tasks. An Initiator travels to client systems, computes a file digest, then stores those digests in a database file located on write-once media. An Examiner agent computes a new digest to compare with the original digests in the database file. Changes in digest values indicate that the file contents have been modified. The design and evaluation results for a prototype developed in the Concordia agent framework are described.
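
    The digest bookkeeping at the heart of this scheme can be sketched in a few lines: one pass records MD5 digests of the monitored files (the Initiator's role) and a later pass recomputes and compares them (the Examiner's role). The directory layout and JSON baseline file below are illustrative; the mobile-agent framework itself is not shown.

```python
# Sketch of MD5-based file integrity checking: record a baseline, then compare.
import hashlib
import json
from pathlib import Path

def md5_of(path: Path) -> str:
    h = hashlib.md5()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def record_baseline(root: str, baseline: str = "baseline.json") -> None:
    # "Initiator" pass: store a digest for every file under root
    digests = {str(p): md5_of(p) for p in Path(root).rglob("*") if p.is_file()}
    Path(baseline).write_text(json.dumps(digests, indent=2))

def check_against_baseline(baseline: str = "baseline.json") -> list[str]:
    # "Examiner" pass: report files whose digest changed or that disappeared
    digests = json.loads(Path(baseline).read_text())
    return [p for p, d in digests.items()
            if not Path(p).is_file() or md5_of(Path(p)) != d]

# record_baseline("/etc")
# print(check_against_baseline())
```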

  7. Reliable file sharing in distributed operating system using web RTC

    Science.gov (United States)

    Dukiya, Rajesh

    2017-12-01

    Since the evolution of distributed operating systems, the distributed file system has become an important part of the operating system. P2P is a reliable way of sharing files in a distributed operating system. It was introduced in 1999 and later became a topic of high research interest. A peer-to-peer network is a type of network where peers share the network workload and other related tasks. A P2P network can also be a short-lived connection, where a group of computers is connected by a USB (Universal Serial Bus) port to transfer files or enable disk sharing, i.e. file sharing. Currently, P2P requires a special network designed in a P2P way. Nowadays, browsers have a large influence on our lives. In this project we study file-sharing mechanisms of distributed operating systems in web browsers and try to find performance bottlenecks; our research aims to improve file sharing in distributed file systems in terms of performance and scalability. Additionally, we discuss the scope of WebTorrent file sharing and free-riding in peer-to-peer networks.

  8. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.

  9. 75 FR 30839 - Privacy Act of 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer...

    Science.gov (United States)

    2010-06-02

    ... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS... Services (CMS). ACTION: Notice of renewal of an existing computer matching program (CMP) that has an...'' section below for comment period. DATES: Effective Dates: CMS filed a report of the Computer Matching...

  10. Accessing files in an Internet: The Jade file system

    Science.gov (United States)

    Peterson, Larry L.; Rao, Herman C.

    1991-01-01

    Jade is a new distribution file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  11. Accessing files in an internet - The Jade file system

    Science.gov (United States)

    Rao, Herman C.; Peterson, Larry L.

    1993-01-01

    Jade is a new distribution file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  12. Prefetching in file systems for MIMD multiprocessors

    Science.gov (United States)

    Kotz, David F.; Ellis, Carla Schlatter

    1990-01-01

    The question of whether prefetching blocks of a file into the block cache can effectively reduce overall execution time of a parallel computation, even under favorable assumptions, is considered. Experiments have been conducted with an interleaved file system testbed on the Butterfly Plus multiprocessor. Results of these experiments suggest that (1) the hit ratio, the accepted measure in traditional caching studies, may not be an adequate measure of performance when the workload consists of parallel computations and parallel file access patterns, (2) caching with prefetching can significantly improve the hit ratio and the average time to perform an I/O (input/output) operation, and (3) an improvement in overall execution time has been observed in most cases. In spite of these gains, prefetching sometimes results in increased execution times (a negative result, given the optimistic nature of the study). The authors explore why it is not trivial to translate savings on individual I/O requests into consistently better overall performance and identify the key problems that need to be addressed in order to improve the potential of prefetching techniques in the environment.

  14. Challenging Ubiquitous Inverted Files

    NARCIS (Netherlands)

    de Vries, A.P.

    2000-01-01

    Stand-alone ranking systems based on highly optimized inverted file structures are generally considered ‘the’ solution for building search engines. Observing various developments in software and hardware, we argue however that IR research faces a complex engineering problem in the quest for more

  15. The Global File System

    Science.gov (United States)

    Soltis, Steven R.; Ruwart, Thomas M.; OKeefe, Matthew T.

    1996-01-01

    The global file system (GFS) is a prototype design for a distributed file system in which cluster nodes physically share storage devices connected via a network-like fiber channel. Networks and network-attached storage devices have advanced to a level of performance and extensibility so that the previous disadvantages of shared disk architectures are no longer valid. This shared storage architecture attempts to exploit the sophistication of storage device technologies whereas a server architecture diminishes a device's role to that of a simple component. GFS distributes the file system responsibilities across processing nodes, storage across the devices, and file system resources across the entire storage pool. GFS caches data on the storage devices instead of the main memories of the machines. Consistency is established by using a locking mechanism maintained by the storage devices to facilitate atomic read-modify-write operations. The locking mechanism is being prototyped in the Silicon Graphics IRIX operating system and is accessed using standard Unix commands and modules.

  16. Download this PDF file

    African Journals Online (AJOL)

    AJNS WEBMASTERS

    Incidence is higher in the elderly, about 58 per 100,000 per year. Diagnosis of CSDH is still .... in the other two patients was not stated in the case file. Evacuation of the Subdural .... Personal experience in 39 patients. Br J of Neurosurg. 2003 ...

  17. File System Virtual Appliances

    Science.gov (United States)

    2010-05-01

    When 4 KB of data is read or written, data is copied back and forth using trampoline buffers (pages that are shared during proxy initialization).

  18. DB90: A Fortran Callable Relational Database Routine for Scientific and Engineering Computer Programs

    Science.gov (United States)

    Wrenn, Gregory A.

    2005-01-01

    This report describes a database routine called DB90 which is intended for use with scientific and engineering computer programs. The software is written in the Fortran 90/95 programming language standard with file input and output routines written in the C programming language. These routines should be completely portable to any computing platform and operating system that has Fortran 90/95 and C compilers. DB90 allows a program to supply relation names and up to 5 integer key values to uniquely identify each record of each relation. This permits the user to select records or retrieve data in any desired order.

  19. Standard interface files and procedures for reactor physics codes. Version IV

    International Nuclear Information System (INIS)

    O'Dell, R.D.

    1977-09-01

    Standards, procedures, and recommendations of the Committee on Computer Code Coordination for promoting the exchange of reactor physics codes are updated to Version IV status. Standards and procedures covering general programming, program structure, standard interface files, and file management and handling subroutines are included

  20. Testing the Forensic Interestingness of Image Files Based on Size and Type

    Science.gov (United States)

    2017-09-01

    When scanning a computer hard drive, many kinds of pictures are found, stored in a variety of digital image file formats such as GIF (Graphics Interchange Format), JPEG (Joint Photographic Experts Group), and PDF (Portable Document Format).