WorldWideScience

Sample records for site computing standards

  1. Savannah River Site computing architecture

    Energy Technology Data Exchange (ETDEWEB)

    1991-03-29

    A computing architecture is a framework for making decisions about the implementation of computer technology and the supporting infrastructure. Because of the size, diversity, and amount of resources dedicated to computing at the Savannah River Site (SRS), there must be an overall strategic plan that can be followed by the thousands of site personnel who make decisions daily that directly affect the SRS computing environment and impact the site's production and business systems. This plan must address the following requirements: There must be SRS-wide standards for procurement or development of computing systems (hardware and software). The site computing organizations must develop systems that end users find easy to use. Systems must be put in place to support the primary function of site information workers. The developers of computer systems must be given tools that automate and speed up the development of information systems and applications based on computer technology. This document describes a proposal for a site-wide computing architecture that addresses the above requirements. In summary, this architecture is standards-based, data-driven, and workstation-oriented, with larger systems being utilized for the delivery of needed information to users in a client-server relationship.

  2. Savannah River Site computing architecture

    Energy Technology Data Exchange (ETDEWEB)

    1991-03-29

    A computing architecture is a framework for making decisions about the implementation of computer technology and the supporting infrastructure. Because of the size, diversity, and amount of resources dedicated to computing at the Savannah River Site (SRS), there must be an overall strategic plan that can be followed by the thousands of site personnel who make decisions daily that directly affect the SRS computing environment and impact the site's production and business systems. This plan must address the following requirements: There must be SRS-wide standards for procurement or development of computing systems (hardware and software). The site computing organizations must develop systems that end users find easy to use. Systems must be put in place to support the primary function of site information workers. The developers of computer systems must be given tools that automate and speed up the development of information systems and applications based on computer technology. This document describes a proposal for a site-wide computing architecture that addresses the above requirements. In summary, this architecture is standards-based, data-driven, and workstation-oriented, with larger systems being utilized for the delivery of needed information to users in a client-server relationship.

  3. Developing computer systems to support emergency operations: Standardization efforts by the Department of Energy and implementation at the DOE Savannah River Site

    International Nuclear Information System (INIS)

    DeBusk, R.E.; Fulton, G.J.; O'Dell, J.J.

    1990-01-01

    This paper describes the development of standards for emergency operations computer systems for the US Department of Energy (DOE). The proposed DOE computer standards prescribe the necessary power and simplicity to meet the expanding needs of emergency managers. Standards include networked UNIX workstations based on the client-server model and software that presents information graphically using icons and windowing technology. The DOE standards are based on those of the computer industry, although DOE is implementing the latest technology to ensure a solid base for future growth. A case study of how these proposed standards are being implemented is also presented. The Savannah River Site (SRS), a DOE facility near Aiken, South Carolina, is automating a manual information system proven over years of development. This system is generalized as a model that can apply to most, if not all, Emergency Operations Centers. This model can provide timely and validated information to emergency managers. By automating this proven system, the system is made easier to use. As experience in the case study demonstrates, computers are only an effective information tool when used as part of a proven process

  4. Standard guide for computed radiography

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This guide provides general tutorial information regarding the fundamental and physical principles of computed radiography (CR), and the definitions and terminology required to understand the basic CR process. An introduction to some of the limitations that are typically encountered during the establishment of techniques, as well as basic image processing methods, is also provided. This guide does not provide specific techniques or acceptance criteria for specific end-user inspection applications. Information presented within this guide may be useful in conjunction with those standards of 1.2. 1.2 CR techniques for general inspection applications may be found in Practice E2033. Technical qualification attributes for CR systems may be found in Practice E2445. Criteria for classification of CR system technical performance levels may be found in Practice E2446. Reference Images Standards E2422, E2660, and E2669 contain digital reference acceptance illustrations. 1.3 The values stated in SI units are to be regarded as the st...

  5. The ANS mathematics and computation software standards

    Energy Technology Data Exchange (ETDEWEB)

    Smetana, A. O. [Savannah River National Laboratory, Washington Savannah River Company, Aiken, SC 29808 (United States)]

    2006-07-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  6. The ANS mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A. O.

    2006-01-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  7. Contracting for Computer Software in Standardized Computer Languages

    OpenAIRE

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the co...

  8. Contracting for Computer Software in Standardized Computer Languages

    Science.gov (United States)

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  9. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record #833

    National Research Council Canada - National Science Library

    Fling, Rick; McClung, Christina; Burch, William; McDonnell, Patrick

    2007-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. This Scoring Record was coordinated by Dennis Teefy and the Standardized UXO Technology Demonstration Site Scoring Committee...

  10. Standardized UXO Technology Demonstration Site, Woods Scoring Record Number 486

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  11. Procedures for Computing Site Seismicity

    Science.gov (United States)

    1994-02-01

    Fourth World Conference on Earthquake Engineering, Santiago, Chile, 1969. Schnabel, P.B., J. Lysmer, and H.B. Seed (1972). SHAKE, a computer program for... This fault system is composed of the Elsinore and Whittier fault zones, the Agua Caliente fault, and the Earthquake Valley fault. Five recent earthquakes of...

  12. Standardized UXO Demonstration Site Blind Grid Scoring Record No. 690

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Archiable, Robert; McClung, Christina; Robitaille, George

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Scoring Committee...

  13. COMPUTER INTEGRATED MANUFACTURING: OVERVIEW OF MODERN STANDARDS

    Directory of Open Access Journals (Sweden)

    A. Pupena

    2016-09-01

    Full Text Available The article deals with the modern international standards ISA-95 and ISA-88 for the development of computer integrated manufacturing. The scope of the standards is shown in the context of a hierarchical model of the enterprise. The article is structured to describe the essence of the standards in terms of their basic descriptive models: product definition, resources, schedules, and the actual performance of production activity. The product definition is described through a hierarchical representation of products at the various levels of management. Much attention is given to the equipment resource model, which is the logical thread running through both standards; the batch process control standard, for example, shows the relationship between the product definition and the equipment on which the product is made. The article presents the ERP-MES/MOM-SCADA planning hierarchy (in terms of ISA-95), which traces the decomposition of enterprise-wide production plans into specific operations at the process control level. The role of actual production performance at the MES/MOM level is considered with respect to KPIs. A generalized picture of operational activity at the MES/MOM level is shown via diagrams of the relationships between activities and the information flows between functions. The article concludes with a substantiation of the need to distribute, adopt, and further develop the ISA-88 and ISA-95 standards in Ukraine. The article is an overview and can be useful to specialists in computer-integrated control and management systems for industrial enterprises, to system integrators, and to suppliers.
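
    As a hedged illustration of the hierarchical model mentioned above (not code from the article), the sketch below represents an ISA-95-style equipment hierarchy and walks it top-down; all names are hypothetical, and the level comments follow the ERP / MES-MOM / SCADA split described in the text.

      from dataclasses import dataclass, field
      from typing import List

      # ISA-95 style equipment hierarchy: enterprise > site > area > work center.
      # Names and attributes below are hypothetical, for illustration only.
      @dataclass
      class EquipmentNode:
          name: str
          level: str                      # e.g. "enterprise", "site", "area", "work_center"
          children: List["EquipmentNode"] = field(default_factory=list)

          def add(self, child: "EquipmentNode") -> "EquipmentNode":
              self.children.append(child)
              return child

          def walk(self, depth: int = 0):
              """Yield (depth, node) pairs, top-down, mirroring the planning decomposition."""
              yield depth, self
              for c in self.children:
                  yield from c.walk(depth + 1)

      enterprise = EquipmentNode("AcmeFoods", "enterprise")        # ERP level (level 4)
      site = enterprise.add(EquipmentNode("Kyiv plant", "site"))   # MES/MOM level (level 3)
      area = site.add(EquipmentNode("Batching area", "area"))
      area.add(EquipmentNode("Mixer unit 1", "work_center"))       # SCADA / batch control (levels 0-2)

      for depth, node in enterprise.walk():
          print("  " * depth + f"{node.level}: {node.name}")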

  14. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  15. Mini-computer in standard CAMAC

    International Nuclear Information System (INIS)

    Meyer, J.M.; Perrin, J.; Lecoq, J.; Tedjini, H.; Metzger, G.

    1975-01-01

    CAMAC is the designation of rules for the design and use of modular electronic data-handling equipment. The rules offer a standard scheme for interfacing computers to transducers and actuators in on-line systems. Where systems do not need a large memory capacity, or where computing power is provided by an associated computer, a processor implemented in a CAMAC structure is of great interest for such a standard. Such a processor was built around an INTEL 8008 CPU chip, using a CAMAC crate, a memory bus, an I/O bus or CAMAC horizontal Dataway, and a bus connecting the CPU to the operator's panel. The interrupt system has six levels. To allow multiprogramming, the 8008 instruction set was extended by creating a jump-and-mark instruction. A multi-task operating system was implemented, allowing the execution of real-time tasks, process control, and program debugging. Three units have been built to date, for process control, education, testing of CAMAC modules, and image processing [fr
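
    As a toy illustration only (not the actual 8008 extension described in the record), a jump-and-mark style subroutine call stores the caller's return address in the word at the subroutine entry and resumes execution just after it; the sketch below simulates that convention on a small memory array with made-up addresses.

      def jump_and_mark(memory, pc, target):
          """Toy 'jump and mark' call: store the return address in the word at the
          subroutine entry, then continue execution at the word after it."""
          memory[target] = pc + 1     # mark: caller's return address
          return target + 1           # new program counter

      def return_via_mark(memory, entry):
          """Return by jumping indirectly through the stored mark."""
          return memory[entry]

      memory = [0] * 16
      pc = 3
      pc = jump_and_mark(memory, pc, target=10)   # call subroutine at address 10
      # ... subroutine body would execute here ...
      pc = return_via_mark(memory, entry=10)      # resume at address 4
      print(pc)                                   # 4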

  16. Site preparation savings through better utilization standards

    Science.gov (United States)

    W.F. Watson; B.J. Stokes; I.W. Savelle

    1984-01-01

    This paper reports preliminary results of a study to determine the savings in the cost of site preparation that can be achieved by the intensive utilization of understory biomass. Mechanized systems can potentially be used for recovering this material.

  17. Interpreting Results from the Standardized UXO Test Sites

    National Research Council Canada - National Science Library

    May, Michael; Tuley, Michael

    2007-01-01

    ...) and the Environmental Security Technology Certification Program (ESTCP) to complete a detailed analysis of the results of testing carried out at the Standardized Unexploded Ordnance (UXO) Test Sites...

  18. Journal of Computer Science and Its Application: Site Map

    African Journals Online (AJOL)

    Journal of Computer Science and Its Application: Site Map.

  19. Computer aided site management. Site use management by digital mapping

    International Nuclear Information System (INIS)

    Chupin, J.C.

    1990-01-01

    The logistics program developed for assisting management of the La Hague site is presented. A digital site mapping representation and geographical data bases are used. The digital site map and its integration into a data base are described. The program can be applied to urban and rural land management aid. Technical, administrative, and economic evaluations of the program are summarized [fr

  20. Adapting standards to the site. Example of Seismic Base Isolation

    International Nuclear Information System (INIS)

    Viallet, Emmanuel

    2014-01-01

    Emmanuel Viallet, Civil Design Manager at EDF engineering center SEPTEN, concluded the morning's lectures with a presentation on how to adapt a standard design to site characteristics. He presented the example of the seismic isolation of the Cruas NPP, for which the standard 900 MW design was indeed built on 'anti-seismic pads' to withstand the local seismic loads

  1. American National Standard: guidelines for evaluating site-related geotechnical parameters at nuclear power sites

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    This standard presents guidelines for evaluating site-related geotechnical parameters for nuclear power sites. Aspects considered include geology, ground water, foundation engineering, and earthwork engineering. These guidelines identify the basic geotechnical parameters to be considered in site evaluation, and in the design, construction, and performance of foundations and earthwork aspects for nuclear power plants. Also included are tabulations of typical field and laboratory investigative methods useful in identifying geotechnical parameters. Those areas where interrelationships with other standards may exist are indicated

  2. DICOM standard in computer-aided medical technologies

    International Nuclear Information System (INIS)

    Plotnikov, A.V.; Prilutskij, D.A.; Selishchev, S.V.

    1997-01-01

    The paper outlines one of the promising standards for transmitting images in medicine, in radiology in particular. The essence of the DICOM standard is disclosed, along with the promise of its introduction into computer-aided medical technologies

  3. Cloud Computing for Standard ERP Systems

    DEFF Research Database (Denmark)

    Schubert, Petra; Adisa, Femi

    Cloud Computing is a topic that has gained momentum in the last years. Current studies show that an increasing number of companies is evaluating the promised advantages and considering making use of cloud services. In this paper we investigate the phenomenon of cloud computing and its importance for the operation of ERP systems. We argue that the phenomenon of cloud computing could lead to a decisive change in the way business software is deployed in companies. Our reference framework contains three levels (IaaS, PaaS, SaaS) and clarifies the meaning of public, private and hybrid clouds. The three levels of cloud computing and their impact on ERP systems operation are discussed. From the literature we identify areas for future research and propose a research agenda.

  4. Plan for implementing EPA standards for UMTRA sites

    International Nuclear Information System (INIS)

    1984-01-01

    The Uranium Mill Tailings Radiation Control Act of 1978 (UMTRCA) authorizes the Department of Energy (DOE) to undertake remedial actions at 24 DOE-designated processing sites. The term "processing site," by statutory definition, means the inactive uranium mill or processing site and any other real property or improvement which is in the vicinity of the mill or processing site and is determined to be contaminated with residual radioactive materials derived from the mill or processing site. For purposes of this document, the inactive mill or processing site is referred to as the "processing site" and the other real property or improvement in the vicinity of such a site is referred to as a "vicinity property." The purpose of the remedial actions is to stabilize and control the uranium mill tailings and other residual radioactive materials in a safe and environmentally sound manner. Remedial actions undertaken by DOE are to be accomplished: with the full participation of the affected states and Indian tribes, in accordance with standards issued by the Environmental Protection Agency (EPA), and with the concurrence of the Nuclear Regulatory Commission (NRC). This plan is designed to be a generic presentation of the methodology that will be followed in implementing the EPA standards. 5 refs., 1 tab

  5. Automating ATLAS Computing Operations using the Site Status Board

    CERN Document Server

    Andreeva, J; The ATLAS collaboration; Campana, S; Di Girolamo, A; Espinal Curull, X; Gayazov, S; Magradze, E; Nowotka, MM; Rinaldi, L; Saiz, P; Schovancova, J; Stewart, GA; Wright, M

    2012-01-01

    The automation of operations is essential to reduce manpower costs and improve the reliability of the system. The Site Status Board (SSB) is a framework which allows Virtual Organizations to monitor their computing activities at distributed sites and to evaluate site performance. The ATLAS experiment intensively uses SSB for the distributed computing shifts, for estimating data processing and data transfer efficiencies at a particular site, and for implementing automatic exclusion of sites from computing activities, in case of potential problems. ATLAS SSB provides a real-time aggregated monitoring view and keeps the history of the monitoring metrics. Based on this history, usability of a site from the perspective of ATLAS is calculated. The presentation will describe how SSB is integrated in the ATLAS operations and computing infrastructure and will cover implementation details of the ATLAS SSB sensors and alarm system, based on the information in SSB. It will demonstrate the positive impact of the use of SS...

  6. Standard operating procedure for computing pangenome trees

    DEFF Research Database (Denmark)

    Snipen, L.; Ussery, David

    2010-01-01

    We present the pan-genome tree as a tool for visualizing similarities and differences between closely related microbial genomes within a species or genus. Distance between genomes is computed as a weighted relative Manhattan distance based on gene family presence/absence. The weights can be chose...
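
    As a hedged illustration (not the authors' implementation), the distance described above might be computed as follows, assuming 0/1 gene-family presence/absence profiles, hypothetical weights, and normalization by the total weight so that the result lies in [0, 1].

      import numpy as np

      def weighted_relative_manhattan(a, b, w):
          """Weighted Manhattan distance between two 0/1 gene-family profiles,
          normalized by the total weight. This normalization is an assumption;
          the paper may define 'relative' differently."""
          a, b, w = np.asarray(a, float), np.asarray(b, float), np.asarray(w, float)
          return float(np.sum(w * np.abs(a - b)) / np.sum(w))

      # Hypothetical presence/absence profiles for three gene families in two genomes.
      genome_x = [1, 0, 1]
      genome_y = [1, 1, 0]
      weights  = [0.5, 1.0, 1.0]   # e.g. down-weighting widely shared families
      print(weighted_relative_manhattan(genome_x, genome_y, weights))  # 0.8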

  7. 24 CFR 983.57 - Site selection standards.

    Science.gov (United States)

    2010-04-01

    Title 24, Housing and Urban Development, Vol. 4 (revised as of 2010-04-01), Section 983.57, Site selection standards; Regulations Relating to Housing and Urban Development... Excerpt: "(vi) If the poverty rate in the area where the proposed PBV development will be located is greater..."

  8. Plan for implementing EPA standards for UMTRA sites

    International Nuclear Information System (INIS)

    1984-01-01

    This plan is designed to be a generic presentation on methodology that will be followed in implementing the EPA standards. Its applicability covers 24 inactive uranium mill tailings sites and approximately 8000 vicinity properties, no two of which are alike. This diversity dictates the more general approach of this plan. In practice, however, site-specific application will be implemented and will require extensive consultation with the appropriate state or tribe and the NRC. In addition, information concerning relevant Federal, state, or tribal standards and regulations will be considered along with any data that may assist in the evaluations. Throughout this process, DOE will encourage state and tribal participation to ensure that compliance with the EPA standards will be achieved

  9. Computer data exchanges spur need for worldwide well numbering standard

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that the American Association of Petroleum Geologists database standards subcommittee has voted to pursue development of a worldwide well numbering standard. Aim of such a standard would be to facilitate the exchange of well data between operators, service companies, and governments. The need for such a standard is heightened by the explosive growth of electronic data interchange (EDI), which uses industry standards to exchange data computer to computer. The subcommittee has reviewed various well numbering methods, identified advantages and disadvantages of each approach for publication to obtain industrywide comments

  10. Standard problems for structural computer codes

    International Nuclear Information System (INIS)

    Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.

    1985-01-01

    BNL is investigating the ranges of validity of the analytical methods used to predict the behavior of nuclear safety related structures under accidental and extreme environmental loadings. During FY 85, the investigations were concentrated on special problems that can significantly influence the outcome of the soil-structure interaction evaluation process. Specifically, the limitations and applicability of the standard interaction methods when dealing with lift-off, layering, and water table effects were investigated. This paper describes the work and the results obtained during FY 85 from the studies on lift-off, layering, and water-table effects in soil-structure interaction

  11. Computation of standard deviations in eigenvalue calculations

    International Nuclear Information System (INIS)

    Gelbard, E.M.; Prael, R.

    1990-01-01

    In Brissenden and Garlick (1985), the authors propose a modified Monte Carlo method for eigenvalue calculations, designed to decrease particle transport biases in the flux and eigenvalue estimates, and in corresponding estimates of standard deviations. Apparently a very similar method has been used by Soviet Monte Carlo specialists. The proposed method is based on the generation of "superhistories", chains of histories run in sequence without intervening renormalization of the fission source. This method appears to have some disadvantages, discussed elsewhere. Earlier numerical experiments suggest that biases in fluxes and eigenvalues are negligibly small, even for very small numbers of histories per generation. Now more recent experiments, run on the CRAY-XMP, tend to confirm these earlier conclusions. The new experiments, discussed in this paper, involve the solution of one-group 1D diffusion theory eigenvalue problems, in difference form, via Monte Carlo. Experiments covered a range of dominance ratios from ∼0.75 to ∼0.985. In all cases flux and eigenvalue biases were substantially smaller than one standard deviation. The conclusion that, in practice, the eigenvalue bias is negligible has strong theoretical support. (author)
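
    As a rough illustration only (not the codes studied in the paper), the sketch below shows why grouping consecutive generations into longer chains before estimating the standard deviation matters when generation-to-generation correlation is present, which is part of the motivation for superhistories. The AR(1) series is a stand-in for per-generation k-eff estimates; all parameters are made up.

      import numpy as np

      rng = np.random.default_rng(0)

      # Stand-in for per-generation k-eff estimates from a Monte Carlo eigenvalue run:
      # an AR(1) series mimics the correlation introduced by the evolving fission
      # source. The true mean is 1.0; rho and the noise level are illustrative only.
      n_gen, rho = 2000, 0.6
      noise = rng.normal(0.0, 0.01, n_gen)
      k_gen = np.empty(n_gen)
      k_gen[0] = 1.0 + noise[0]
      for i in range(1, n_gen):
          k_gen[i] = 1.0 + rho * (k_gen[i - 1] - 1.0) + noise[i]

      def std_error(samples):
          """Standard error of the mean, treating samples as independent."""
          s = np.asarray(samples)
          return s.std(ddof=1) / np.sqrt(len(s))

      # Naive estimate ignores correlation and tends to understate the uncertainty.
      print("per-generation SE :", std_error(k_gen))

      # Superhistory-style grouping: average runs of L consecutive generations first.
      L = 20
      batch_means = k_gen[: n_gen // L * L].reshape(-1, L).mean(axis=1)
      print("grouped SE        :", std_error(batch_means))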

  12. Standardized UXO Technology Demonstration Site, Blind Grid Scoring Record Number 842

    National Research Council Canada - National Science Library

    Karwatka, Michael; Fling, Rick; McClung, Christina; Banta, Matthew; Burch, William; McDonnell, Patrick

    2007-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. This Scoring Record was coordinated by Michael Karwatka and the Standardized UXO Technology Demonstration Site Scoring Committee...

  13. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 396

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Boutin, Matthew; Fling, Rick; McClung, Christina; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  14. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 805

    National Research Council Canada - National Science Library

    Karwatka, Michael; Fling, Rick; McClung, Christina

    2007-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Blind Grid. This Scoring Record was coordinated by Michael Karwatka and the Standardized UXO Technology Demonstration Site Scoring Committee...

  15. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 268

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Fling, Rick; McClung, Christina; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  16. Standardized UXO Technology Demonstration Site, Blind Grid Scoring Record No. 898

    National Research Council Canada - National Science Library

    Burch, William; Fling, Rick; McClung, Christina; Lombardo, Leonardo; McDonnell, Patrick

    2008-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid Field. This Scoring Record was coordinated by William Burch and the Standardized UXO Technology Demonstration Site Scoring Committee...

  17. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 769

    National Research Council Canada - National Science Library

    Archiable, Robert; Fling, Rick; McClung, Christina; Teefy, Dennis; Burch, William; Packer, Bonnie; Banta, Matthew

    2006-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Blind Grid. Scoring Records have been coordinated by Dennis Teefy and the Standardized UXO Technology Demonstration Site Scoring Committee...

  18. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 792

    National Research Council Canada - National Science Library

    Karwatka, Mike; Packer, Bonnie

    2006-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. Scoring Records have been coordinated by Mike Karwatka and the Standardized UXO Technology Demonstration Site Scoring Committee...

  19. Standardized UXO Technology Demonstration Site, Blind Grid Scoring Record No. 896

    National Research Council Canada - National Science Library

    Burch, William; Fling, Rick; McClung, Christina

    2008-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid Field. This Scoring Record was coordinated by William Burch and the Standardized UXO Technology Demonstration Site Scoring Committee...

  20. Standardized UXO Technology Demonstration Site, Blind Grid Scoring Record No. 257

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Robitaille, George; Boutin, Matthew; Fling, Rick; McClung, Christina

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  1. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 830

    National Research Council Canada - National Science Library

    Teefy, Dennis

    2007-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. This Scoring Record was coordinated by Dennis Teefy and the Standardized UXO Technology Demonstration Site Scoring Committee...

  2. Standardized UXO Technology Demonstration Site, Blind Grid Scoring Record Number 431

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  3. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 834

    National Research Council Canada - National Science Library

    Teefy, Dennis; Fling, Rick; McClung, Christina

    2007-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. This Scoring Record was coordinated by Dennis Teefy and the Standardized UXO Technology Demonstration Site Scoring Committee...

  4. Standardized UXO Technology Demonstration Site, Blind Grid Scoring Record No. 397

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Robitaille, George; Boutin, Matthew; Fling, Rick; McClung, Christina

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  5. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 252

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Boutin, Matthew; Fling, Rick; McClung, Christina

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  6. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record Number 691

    National Research Council Canada - National Science Library

    Overbay, Jr., Larry; Watts, Kimberly; Fling, Rick; McClung, Christina; Banta, Matthew

    2006-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site blind grid. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  7. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 832

    National Research Council Canada - National Science Library

    Teefy, Dennis

    2007-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. This Scoring Record was coordinated by Dennis Teefy and the Standardized UXO Technology Demonstration Site Scoring Committee...

  8. Standardized UXO Technology Demonstration Site, Blind Grid Scoring Record No. 237

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Robitaille, George; Boutin, Matthew; Fling, Rick; McClung, Christina

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  9. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 906 (Sky Research, Inc.)

    National Research Council Canada - National Science Library

    McClung, J. S; Burch, William; Fling, Rick; McClung, Christina; Lombardo, Leonardo; McDonnell, Patrick

    2008-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. This Scoring Record was coordinated by William Burch and the Standardized UXO Technology Demonstration Site Scoring Committee...

  10. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 764

    National Research Council Canada - National Science Library

    Overbay, Larry; Watts, Kimberly

    2006-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  11. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 831

    National Research Council Canada - National Science Library

    Teefy, Dennis

    2007-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. This Scoring Record was coordinated by Dennis Teefy and the Standardized UXO Technology Demonstration Site Scoring Committee...

  12. Standardized UXO Technology Demonstration Site. Open Field Scoring Record Number 154

    National Research Council Canada - National Science Library

    Overbay, Larry

    2004-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  13. Standardized UXO Technology Demonstration Site, Open Field Scoring Record Number 379

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ... (UXO) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  14. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 354

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Archiable, Robert; McClung, Christina

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  15. Standardized UXO Technology Demonstration Site Open Field Scoring Record No. 311

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Boutin, Matthew; Fling, Rick; McClung, Christina; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  16. Standardized UXO Technology Demonstration Site Open Field Scoring Recording Number 231 (Human Factors Applications, Inc.)

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  17. Standardized UXO Technology Demonstration Site, Open Field Scoring Record Number 426

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Boutin, Matthew; Archiable, Robert; Fling, Rick; McClung, Christina; Robitaille, George

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Open Field. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  18. Standardized UXO Technology Demonstration Site, Open Field Scoring Record Number 657

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  19. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 129

    National Research Council Canada - National Science Library

    Overbay, Larry

    2004-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  20. Standardized UXO Technology Demonstration Site, Open Field Scoring Record Number 229

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Boutin, Matthew; Fling, Rick; McClung, Christina; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  1. Standardized UXO Technology Demonstration Site, Open Field Scoring Record Number 411

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  2. Standardized UXO Technology Demonstration Site, Open Field Scoring Record No. 897

    National Research Council Canada - National Science Library

    Burch, William; Fling, Rick; McClung, Christina

    2008-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. This Scoring Record was coordinated by William Burch and the Standardized UXO Technology Demonstration Site Scoring Committee...

  3. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 673 (Naval Research Laboratories)

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  4. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 169

    National Research Council Canada - National Science Library

    Overbay, Larry; Archiable, Robert; McClung, Christina; Robitaille, George

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  5. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 492 (Shaw Environmental, Inc.)

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  6. Standardized UXO Technology Demonstration Site Open Field Scoring Record No. 442

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Boutin, Matthew; Archiable, Robert; Fling, Rick; McClung, Christina; Robitaille, George

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  7. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 201

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Fling, Rick; Robitaille, George

    2004-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  8. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 668 (NAEVA Geophysics, Inc.)

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Open Field. Scoring Records have been coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  9. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 165

    National Research Council Canada - National Science Library

    Overbay, Larry

    2004-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  10. Standardized UXO Technology Demonstration Site, Open Field Scoring Record Number 638

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Robitaille, George; Boutin, Matthew; Archiable, Robert; McClung, Christina

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and the Standardized UXO Technology Demonstration Site Scoring Committee...

  11. Standardized UXO Technology Demonstration Site Open Field Scoring Record No. 857

    National Research Council Canada - National Science Library

    Fling, Rick; McClung, Christina; Banta, Matthew; Burch, William; McDonnell, Patrick

    2007-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. This Scoring Record was coordinated by Dennis Teefy and the Standardized UXO Technology Demonstration Site Scoring Committee...

  12. Methodology to evaluate the site standard seismic motion to a nuclear facility

    International Nuclear Information System (INIS)

    Soares, W.A.

    1983-01-01

    For the seismic design of nuclear facilities, the input motion is normally defined by the predicted maximum ground horizontal acceleration and the free-field ground response spectrum. This spectrum is computed on the basis of records of strong-motion earthquakes. The pair of maximum acceleration and response spectrum is called the site standard seismic motion. An overall view of the subjects involved in determining the site standard seismic motion for a nuclear facility is presented. The main topics discussed are: basic principles of seismic instrumentation; dynamic and spectral concepts; design earthquake definitions; fundamentals of seismology; empirical curves developed from prior seismic data; and available methodologies and recommended procedures to evaluate the site standard seismic motion. (Author) [pt
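
    Purely as an illustration of the quantities mentioned above (not material from the cited report), the sketch below computes an acceleration response spectrum from a ground-motion record with a damped single-degree-of-freedom oscillator integrated by the Newmark average-acceleration method; the synthetic "record" and the 5% damping value are placeholders.

      import numpy as np

      def response_spectrum(acc, dt, periods, damping=0.05):
          """Peak absolute acceleration of a damped SDOF oscillator, per period,
          driven by ground acceleration `acc` (Newmark average-acceleration scheme)."""
          spectrum = []
          for T in periods:
              wn = 2.0 * np.pi / T
              m, c, k = 1.0, 2.0 * damping * wn, wn ** 2
              u = v = 0.0
              a = -acc[0]                        # initial relative acceleration
              peak = 0.0
              keff = k + 2.0 * c / dt + 4.0 * m / dt ** 2   # gamma = 1/2, beta = 1/4
              for ag in acc[1:]:
                  p = -m * ag + m * (4 * u / dt ** 2 + 4 * v / dt + a) + c * (2 * u / dt + v)
                  u_new = p / keff
                  v_new = 2.0 * (u_new - u) / dt - v
                  a_new = 4.0 * (u_new - u) / dt ** 2 - 4.0 * v / dt - a
                  u, v, a = u_new, v_new, a_new
                  peak = max(peak, abs(a + ag))  # absolute acceleration of the mass
              spectrum.append(peak)
          return np.array(spectrum)

      # Placeholder "record": 10 s of synthetic noise at 100 Hz, not a real earthquake.
      dt = 0.01
      acc = np.random.default_rng(1).normal(0.0, 0.5, 1000)
      periods = np.linspace(0.05, 2.0, 40)
      Sa = response_spectrum(acc, dt, periods)
      print(Sa.max())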

  13. Savannah River Site peer evaluator standards: Operator assessment for restart

    International Nuclear Information System (INIS)

    1990-01-01

    Savannah River Site has implemented a Peer Evaluator program for the assessment of certified Central Control Room Operators, Central Control Room Supervisors and Shift Technical Engineers prior to restart. This program is modeled after the Nuclear Regulatory Commission's (NRC's) Examiner Standard, ES-601, for the requalification of licensed operators in the commercial utility industry. It has been tailored to reflect the unique differences between Savannah River production reactors and commercial power reactors

  14. The comparison of high and standard definition computed ...

    African Journals Online (AJOL)

    The comparison of high and standard definition computed tomography techniques regarding coronary artery imaging. A Aykut, D Bumin, Y Omer, K Mustafa, C Meltem, C Orhan, U Nisa, O Hikmet, D Hakan, K Mert ...

  15. ANL statement of site strategy for computing workstations

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R. (ed.); Boxberger, L.M.; Amiot, L.W.; Bretscher, M.E.; Engert, D.E.; Moszur, F.M.; Mueller, C.J.; O'Brien, D.E.; Schlesselman, C.G.; Troyer, L.J.

    1991-11-01

    This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85) and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  16. QUEST Hanford Site Computer Users - What do they do?

    Energy Technology Data Exchange (ETDEWEB)

    WITHERSPOON, T.T.

    2000-03-02

    The Fluor Hanford Chief Information Office requested that a computer-user survey be conducted to determine users' dependence on their computers and the computers' importance to their ability to accomplish their work. Daily use trends and future needs of Hanford Site personal computer (PC) users were also to be defined. A primary objective was to use the data to determine how budgets should be focused toward providing those services that are truly needed by the users.

  17. Computer-related standards for the petroleum industry

    International Nuclear Information System (INIS)

    Winczewski, L.M.

    1992-01-01

    Rapid application of the computer to all areas of the petroleum industry is straining the capabilities of corporations and vendors to efficiently integrate computer tools into the work environment. Barriers to this integration arose from decades of competitive development of proprietary application formats, along with compilation of databases in isolation. Rapidly emerging industry-wide standards relating to computer applications and data management are poised to topple these barriers. This paper identifies the most active players within a rapidly evolving group of cooperative standardization activities sponsored by the petroleum industry. Summarized are their objectives, achievements, current activities, and relationships to each other. The trends of these activities are assessed and projected

  18. Computing tools for implementing standards for single-case designs.

    Science.gov (United States)

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook-the WWC standards. These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.
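
    For orientation only (not taken from the article), one example of the kind of analysis such tools automate for single-case data is the Nonoverlap of All Pairs (NAP) index between baseline and treatment phases; the data values below are invented, and NAP is just one of several effect sizes discussed in that literature.

      from itertools import product

      def nap(baseline, treatment):
          """Nonoverlap of All Pairs: share of (baseline, treatment) pairs where the
          treatment observation exceeds the baseline one, counting ties as half."""
          pairs = list(product(baseline, treatment))
          wins = sum(1.0 for b, t in pairs if t > b)
          ties = sum(0.5 for b, t in pairs if t == b)
          return (wins + ties) / len(pairs)

      # Hypothetical single-case data: higher scores indicate improvement.
      baseline  = [3, 4, 4, 5, 3]
      treatment = [6, 7, 5, 8, 7, 9]
      print(round(nap(baseline, treatment), 3))   # 0.983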

  19. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 213

    National Research Council Canada - National Science Library

    Overbay, Larry; Archiable, Robert; McClung, Christina; Robitaille, George

    2005-01-01

    ... (UXO) utilizing the YPG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  20. Performance of an extrapolation chamber in computed tomography standard beams

    International Nuclear Information System (INIS)

    Castro, Maysa C.; Silva, Natália F.; Caldas, Linda V.E.

    2017-01-01

    Among the medical uses of ionizing radiation, computed tomography (CT) diagnostic exams are responsible for the highest dose values to patients. The dosimetry procedure in CT scanner beams makes use of pencil ionization chambers with sensitive volume lengths of 10 cm. The aim of their calibration is to compare the values obtained with the instrument to be calibrated against a standard reference system. However, there is no primary standard system for this kind of radiation beam. Therefore, an extrapolation ionization chamber built at the Calibration Laboratory (LCI) was used to establish a CT primary standard. The objective of this work was to perform characterization tests (short- and medium-term stabilities, saturation curve, polarity effect, and ion collection efficiency) in the standard X-ray beams established for computed tomography at the LCI. (author)

  1. Performance of an extrapolation chamber in computed tomography standard beams

    Energy Technology Data Exchange (ETDEWEB)

    Castro, Maysa C.; Silva, Natália F.; Caldas, Linda V.E., E-mail: mcastro@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2017-07-01

    Among the medical uses of ionizing radiation, computed tomography (CT) diagnostic exams are responsible for the highest dose values to patients. The dosimetry procedure in CT scanner beams makes use of pencil ionization chambers with sensitive volume lengths of 10 cm. The aim of their calibration is to compare the values obtained with the instrument to be calibrated against a standard reference system. However, there is no primary standard system for this kind of radiation beam. Therefore, an extrapolation ionization chamber built at the Calibration Laboratory (LCI) was used to establish a CT primary standard. The objective of this work was to perform characterization tests (short- and medium-term stabilities, saturation curve, polarity effect, and ion collection efficiency) in the standard X-ray beams established for computed tomography at the LCI. (author)
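
    As an aside illustrating one of the characterization tests listed above (not taken from the paper), the ion collection efficiency of a chamber in a continuous beam is commonly estimated with a two-voltage method; the expression below follows the usual quadratic TRS-398-style form, and the readings and voltages are hypothetical.

      def saturation_correction(m1, m2, v1, v2):
          """Two-voltage recombination correction k_s for continuous beams
          (quadratic TRS-398-style expression); v1 should be the higher voltage."""
          r = v1 / v2
          return (r ** 2 - 1.0) / (r ** 2 - m1 / m2)

      # Hypothetical readings at the normal and the halved collecting voltage.
      m1, m2 = 20.15e-9, 20.05e-9      # collected charge in coulombs
      v1, v2 = 400.0, 200.0            # collecting voltages in volts
      ks = saturation_correction(m1, m2, v1, v2)
      print(f"k_s = {ks:.4f}, collection efficiency ~ {1.0 / ks:.4f}")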

  2. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE...
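
    A minimal sketch (not the SCORE EEG software itself) of the context-sensitive reporting idea described above: the terms offered at each step depend on earlier choices, and the selections feed both a textual report and a database-ready record. The term lists here are abbreviated placeholders, not the official SCORE vocabulary.

      # Abbreviated, made-up subsets of terms; the real standard defines the full lists.
      NEXT_CHOICES = {
          ("finding", "epileptiform"): ["spikes", "sharp waves", "spike-and-wave"],
          ("finding", "background"):   ["normal", "slowing"],
      }

      def build_report(selections):
          """Turn an ordered list of (field, term) choices into a report line and a
          flat record that could be stored in a database."""
          record = dict(selections)
          report = "; ".join(f"{field}: {term}" for field, term in selections)
          return report, record

      choices = [("finding", "epileptiform")]
      offered = NEXT_CHOICES[choices[-1]]          # context-sensitive next options
      choices.append(("morphology", offered[0]))   # user picks "spikes"
      choices.append(("significance", "abnormal")) # final standardized scoring step

      report, record = build_report(choices)
      print(report)   # finding: epileptiform; morphology: spikes; significance: abnormal
      print(record)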

  3. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record Number 312

    National Research Council Canada - National Science Library

    Overbay, Larry, Jr; Archiable, Robert; McClung, Christina; Robitaille, George

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Scoring Committee...

  4. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 690

    National Research Council Canada - National Science Library

    Overbay, Larry

    2005-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Scoring Committee...

  5. Standardized UXO Technology Demonstration Site, Blind Grid Scoring Record No. 671

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ... (UXO) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  6. Standardized UXO Technology Demonstration Site Open Field Scoring Record No. 901 (Sky Research, Inc.)

    National Research Council Canada - National Science Library

    McClung, J. S; Fling, Rick; McClung, Christina; Burch, William; Lombardo, Leonardo; McDonnell, Patrick

    2008-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. This Scoring Record was coordinated by Stephen McClung and the Standardized UXO Technology Demonstration Site Scoring Committee...

  7. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 245

    National Research Council Canada - National Science Library

    Overbay, Larry

    2005-01-01

    ... (UXO) utilizing the YPG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  8. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 675

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ... (UXO) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  9. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. Selection of the terms is context sensitive: initial choices determine the subsequently presented sets of additional choices. This process automatically generates a report and feeds these features into a database...
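    The last sentences describe a context-sensitive selection process that both generates a report and feeds a database. The toy sketch below illustrates that pattern only; the term lists are invented for illustration and this is not the SCORE EEG software or its terminology.

    ```python
    # Toy illustration of context-sensitive term selection: the initial choice
    # determines which additional terms are offered, and the selections become
    # a report line plus a structured database record.

    CONTEXT_TERMS = {
        "interictal finding": ["sharp wave", "spike-and-wave", "focal slowing"],
        "background activity": ["posterior dominant rhythm", "diffuse slowing"],
    }

    def next_choices(initial_choice: str) -> list[str]:
        """Return the set of terms offered after the initial choice."""
        return CONTEXT_TERMS.get(initial_choice, [])

    def build_report(selections: dict[str, str]) -> tuple[str, dict]:
        """Generate a free-text line and a structured record from the chosen terms."""
        text = "; ".join(f"{feature}: {term}" for feature, term in selections.items())
        return f"EEG report -- {text}.", selections

    if __name__ == "__main__":
        feature = "interictal finding"
        offered = next_choices(feature)           # context-sensitive menu
        report, record = build_report({feature: offered[0]})
        print(report)                             # human-readable report
        print(record)                             # record that feeds a database
    ```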

  10. Cloud computing for protein-ligand binding site comparison.

    Science.gov (United States)

    Hung, Che-Lun; Hua, Guan-Jie

    2013-01-01

    The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross reactivity and toxicity. The well-known and commonly used software, SMAP, has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high availability, high performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computer-intensive questions in biology and drug discovery.
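    Cloud-PLBS uses the MapReduce paradigm to manage and parallelize pairwise binding-site comparisons. The single-process sketch below only illustrates that map/reduce pattern; the SMAP comparison is replaced by a placeholder scoring function, and in the real service each map task would invoke the external SMAP tool on a Hadoop worker.

    ```python
    # Minimal map/reduce pattern for pairwise binding-site comparison.
    # smap_similarity() stands in for an external SMAP call.

    from itertools import combinations
    from collections import defaultdict

    def smap_similarity(site_a: str, site_b: str) -> float:
        """Placeholder score for two binding-site signatures (not SMAP itself)."""
        union = set(site_a) | set(site_b)
        return len(set(site_a) & set(site_b)) / len(union) if union else 0.0

    def map_phase(pairs):
        """Map: emit (site, (other_site, score)) for every pair."""
        for a, b in pairs:
            score = smap_similarity(a, b)
            yield a, (b, score)
            yield b, (a, score)

    def reduce_phase(mapped):
        """Reduce: keep the best-scoring match for each binding site."""
        best = defaultdict(lambda: ("", -1.0))
        for site, (other, score) in mapped:
            if score > best[site][1]:
                best[site] = (other, score)
        return dict(best)

    if __name__ == "__main__":
        sites = ["ATPGRK", "ATPGLK", "HEMFCY"]    # toy binding-site signatures
        print(reduce_phase(map_phase(combinations(sites, 2))))
    ```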

  11. Computer system validation: an overview of official requirements and standards.

    Science.gov (United States)

    Hoffmann, A; Kähny-Simonius, J; Plattner, M; Schmidli-Vckovski, V; Kronseder, C

    1998-02-01

    A brief overview is presented of the documents that companies in the pharmaceutical industry must take into consideration to fulfil computer system validation requirements. We concentrate on official requirements and valid standards in the USA, the European Community and Switzerland. There are basically three GMP guidelines, their interpretations by associations of interest such as APV and PDA, and the GAMP Suppliers Guide. The three GMP guidelines imply the same philosophy of computer system validation: they describe a what-to-do approach to validation, whereas the GAMP Suppliers Guide describes how to do validation. Nevertheless, they do not contain major discrepancies.

  12. Testing an extrapolation chamber in computed tomography standard beams

    Science.gov (United States)

    Castro, M. C.; Silva, N. F.; Caldas, L. V. E.

    2018-03-01

    Computed tomography (CT) is responsible for the highest dose values to patients. Therefore, the radiation doses delivered in this procedure must be accurately determined. However, there is no primary standard system for this kind of radiation beam yet. In the search for a CT primary standard, an extrapolation ionization chamber built at the Calibration Laboratory (LCI) of the Instituto de Pesquisas Energéticas e Nucleares (IPEN) was tested in this work. The results were within the internationally recommended limits.

  13. The feasibility of mobile computing for on-site inspection.

    Energy Technology Data Exchange (ETDEWEB)

    Horak, Karl Emanuel; DeLand, Sharon Marie; Blair, Dianna Sue

    2014-09-01

    With over 5 billion cellphones in a world of 7 billion inhabitants, mobile phones are the most quickly adopted consumer technology in history. Miniaturized, power-efficient sensors, especially video-capable cameras, are becoming extremely widespread, particularly when one factors in wearable technology like the Pebble smartwatch, GoPro video systems, Google Glass, and lifeloggers. Tablet computers are becoming more common, lighter, and more power-efficient. In this report the authors explore recent developments in mobile computing and their potential application to on-site inspection for arms control verification and treaty compliance determination. We examine how such technology can effectively be applied to current and potential future inspection regimes. Use cases are given for both host-escort and inspection teams. The results of field trials and their implications for on-site inspections are discussed.

  14. SITE SPECIFIC REFERENCE PERSON PARAMETERS AND DERIVED CONCENTRATION STANDARDS FOR THE SAVANNAH RIVER SITE

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, T.

    2013-03-14

    The purpose of this report is twofold. The first is to develop a set of behavioral parameters for a reference person specific for the Savannah River Site (SRS) such that the parameters can be used to determine dose to members of the public in compliance with Department of Energy (DOE) Order 458.1 “Radiation Protection of the Public and the Environment.” A reference person is a hypothetical, gender and age aggregation of human physical and physiological characteristics arrived at by international consensus for the purpose of standardizing radiation dose calculations. DOE O 458.1 states that compliance with the annual dose limit of 100 mrem (1 mSv) to a member of the public may be demonstrated by calculating the dose to the maximally exposed individual (MEI) or to a representative person. Historically, for dose compliance, SRS has used the MEI concept, which uses adult dose coefficients and adult male usage parameters. Beginning with the 2012 annual site environmental report, SRS will be using the representative person concept for dose compliance. The dose to a representative person will be based on 1) the SRS-specific reference person usage parameters at the 95th percentile of appropriate national or regional data, which are documented in this report, 2) the reference person (gender and age averaged) ingestion and inhalation dose coefficients provided in DOE Derived Concentration Technical Standard (DOE-STD-1196-2011), and 3) the external dose coefficients provided in the DC_PAK3 toolbox. The second purpose of this report is to develop SRS-specific derived concentration standards (DCSs) for all applicable food ingestion pathways, ground shine, and water submersion. The DCS is the concentration of a particular radionuclide in water, in air, or on the ground that results in a member of the public receiving 100 mrem (1 mSv) effective dose following continuous exposure for one year. In DOE-STD-1196-2011, DCSs were developed for the ingestion of water, inhalation of
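    The definition above fixes how a DCS is obtained: it is the concentration at which continuous exposure for one year delivers the 100 mrem (1 mSv) limit. The back-of-envelope sketch below shows that arithmetic for a water-ingestion pathway; the intake rate and dose coefficient are placeholder values, not the SRS-specific 95th-percentile parameters or the DOE-STD-1196-2011 coefficients.

    ```python
    # DCS (Bq/L) = annual dose limit / (annual water intake * ingestion dose coefficient)
    # All numeric inputs below are illustrative placeholders.

    ANNUAL_LIMIT_SV = 1.0e-3          # 100 mrem expressed in sievert

    def dcs_water_ingestion(intake_l_per_year: float, dose_coeff_sv_per_bq: float) -> float:
        """Concentration that yields the annual limit after one year of intake."""
        return ANNUAL_LIMIT_SV / (intake_l_per_year * dose_coeff_sv_per_bq)

    if __name__ == "__main__":
        # e.g. roughly 2 L/day of drinking water and a nominal 1e-8 Sv/Bq coefficient
        print(f"{dcs_water_ingestion(730.0, 1.0e-8):.1f} Bq/L")
    ```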

  15. 24 CFR 941.202 - Site and neighborhood standards.

    Science.gov (United States)

    2010-04-01

    ... utilities (e.g., water, sewer, gas and electricity) and streets must be available to service the site. (b...) The site must be free from adverse environmental conditions, natural or manmade, such as instability...; excessive noise vibration, vehicular traffic, rodent or vermin infestation; or fire hazards. The...

  16. 38 CFR 39.20 - Site planning standards.

    Science.gov (United States)

    2010-07-01

    ... depending on the State veteran population and national cemetery availability. (3) Accessibility. The site.... The curbs shall not be less than 4 inches high and 4 inches wide. A level platform in a ramp shall not.... Site furnishings include signage, trash receptacles, benches, and flower containers. These items should...

  17. Defense strategies for cloud computing multi-site server infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S. [ORNL]; Ma, Chris Y. T. [Hang Seng Management College, Hong Kong]; He, Fei [Texas A&M University, Kingsville, TX, USA]

    2018-01-01

    We consider cloud computing server infrastructures for big data applications, which consist of multiple server sites connected over a wide-area network. The sites house a number of servers, network elements and local-area connections, and the wide-area network plays a critical, asymmetric role of providing vital connectivity between them. We model this infrastructure as a system of systems, wherein the sites and wide-area network are represented by their cyber and physical components. These components can be disabled by cyber and physical attacks, and also can be protected against them using component reinforcements. The effects of attacks propagate within the systems, and also beyond them via the wide-area network. We characterize these effects using correlations at two levels: (a) an aggregate failure correlation function that specifies the infrastructure failure probability given the failure of an individual site or network, and (b) first-order differential conditions on system survival probabilities that characterize the component-level correlations within individual systems. We formulate a game between an attacker and a provider using utility functions composed of survival probability and cost terms. At Nash Equilibrium, we derive expressions for the expected capacity of the infrastructure, given by the number of operational servers connected to the network, for sum-form, product-form and composite utility functions.
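    For orientation only, a generic form of the attacker-provider game sketched above is written out below; the notation is illustrative and does not reproduce the paper's actual utility, correlation, or capacity functions.

    ```latex
    % x: provider reinforcement vector, y: attacker attack vector,
    % S(x,y): infrastructure survival probability, c_P, c_A: unit costs,
    % (x*, y*): a Nash equilibrium, N_i: servers at site i,
    % p_i, p_net: survival probabilities of site i and of the wide-area network.
    \begin{align*}
      U_P(x,y) &= S(x,y) - c_P \lVert x \rVert_1,
      \qquad
      U_A(x,y) = \bigl(1 - S(x,y)\bigr) - c_A \lVert y \rVert_1,\\
      \mathbb{E}[C] &= \sum_{i=1}^{n} N_i \, p_i(x^{*},y^{*}) \, p_{\mathrm{net}}(x^{*},y^{*}),
    \end{align*}
    ```

    Here the expected capacity counts the operational servers that remain connected through the wide-area network at equilibrium, mirroring the quantity derived in the abstract.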

  18. Work related perceived stress and muscle activity during standardized computer work among female computer users

    DEFF Research Database (Denmark)

    Larsman, P; Thorn, S; Søgaard, K

    2009-01-01

    The current study investigated the associations between work-related perceived stress and surface electromyographic (sEMG) parameters (muscle activity and muscle rest) during standardized simulated computer work (typing, editing, precision, and Stroop tasks). It was part of the European case-control study NEW (Neuromuscular assessment in the Elderly Worker). The present cross-sectional study was based on a questionnaire survey and sEMG measurements among Danish and Swedish female computer users aged 45 or older (n=49). The results show associations between work-related perceived stress and trapezius muscle activity and rest during standardized simulated computer work, and provide partial empirical support for the hypothesized pathway of stress-induced muscle activity in the association between an adverse psychosocial work environment and musculoskeletal symptoms in the neck and shoulder.

  19. Standardized UXO Technology Demonstration Site Blind Grid Scoring Record No. 810 (FEREX Fluxgate Gradient Magnetometer/Sling)

    National Research Council Canada - National Science Library

    Fling, Rick; McClung, Christina; Banta, Matthew; Burch, William; Karwatka, Michael; McDonnell, Patrick

    2007-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Blind Grid. This Scoring Record was coordinated by Michael Karwatka and the Standardized UXO Technology Demonstration Site Scoring Committee...

  20. Standardized UXO Technology Demonstration Site, Open Field Scoring Record No. 770. Magnetometer FEREX DLG GPS/Sling

    National Research Council Canada - National Science Library

    Karwatka, Mike; Packer, Bonnie

    2006-01-01

    ...) utilizing the YPG Standardized UXO Technology Demonstration Site open field. Scoring Records have been coordinated by Mike Karwatka and the Standardized UXO Technology Demonstration Site Scoring Committee...

  1. Remedial Action Plan and site design for stabilization of the inactive uranium mill tailings site at Durango, Colorado: Attachment 6, Supplemental standard for Durango processing site

    International Nuclear Information System (INIS)

    1991-12-01

    Excavation control to the 15 pCi/g radium-226 (Ra-226) standard at certain areas along the Animas River on the Durango Site would require extensive engineering and construction support. Elevated Ra-226 concentrations have been encountered immediately adjacent to the river at depths in excess of 7 feet below the present river stage. Decontamination to such depths to ensure compliance with the EPA standards will, in our opinion, become unreasonable. This work does not appear to be in keeping with the intent of the standards. Because the principal reason for radium removal is reduction of radon daughter concentrations (RDC) in homes to be built onsite, and because radon produced at depth will be attenuated in clean fill cover before entering such homes, it is appropriate to calculate the depth of excavation needed under a home to reduce RDC to acceptable levels. Potential impact was assessed through radon emanation estimation, using the RAECOM computer model. Elevated Ra-226 concentrations were encountered during final radium excavation of the flood plain below the large tailings pile, adjacent to the slag area. Data from 7 test pits excavated across the area were analyzed to provide an estimate of the Ra-226 concentration profile. Results are given in this report

  2. Standardized UXO Technology Demonstration Site Open Field Scoring Record Number 740

    National Research Council Canada - National Science Library

    Overbay, Jr., Larry; Fling, Rick; McClung, Christina; Watts, Kimberly; Banta, Matthew

    2006-01-01

    The objective in the Standardized UXO Technology Demonstration Site Program is to evaluate the detection and discrimination capabilities of a given technology under various field and soil conditions...

  3. Standard protocol for evaluation of environmental transfer factors around NPP sites

    International Nuclear Information System (INIS)

    Hegde, A.G.; Verma, P.C.; Rao, D.D.

    2009-01-01

    This document presents standard procedures for the evaluation of site-specific environmental transfer factors around NPP sites. Its scope is to provide a standard protocol to be followed for the evaluation of environmental transfer factors around NPP sites. Studies on transfer factors are being carried out at various NPP sites under DAE-BRNS projects for the evaluation of site-specific transfer factors for radionuclides released from power plants. This document contains a common methodology for sampling, processing, measurement and analysis of elements/radionuclides, while keeping site-specific requirements in place. (author)

  4. Beam standardization and dosimetric methodology in computed tomography

    International Nuclear Information System (INIS)

    Maia, Ana Figueiredo

    2005-01-01

    Special ionization chambers, named pencil ionization chambers, are used in dosimetric procedures in computed tomography (CT) beams. In this work, an extensive study of pencil ionization chambers was performed, as a contribution to the accuracy of dosimetric procedures in CT beams. The international scientific community has recently been discussing the need to establish a specific calibration procedure for CT ionization chambers, since these chambers present special characteristics that differentiate them from other ionization chambers used in diagnostic radiology beams. In this work, an adequate calibration procedure for pencil ionization chambers was established at the Calibration Laboratory of the Instituto de Pesquisas Energéticas e Nucleares, in accordance with the most recent international recommendations. Two calibration methodologies were tested and analyzed in comparative studies. Moreover, a new extended-length parallel plate ionization chamber, with a cross section very similar to that of pencil ionization chambers, was developed. The operational characteristics of this chamber were determined, and the results showed that its behaviour is adequate as a reference system in CT standard beams. Two other studies were performed during this work, both using CT ionization chambers. The first study concerned the performance of a pencil ionization chamber in standard radiation beams of several types and energies, and the results showed that this chamber presents satisfactory behaviour in other radiation qualities, such as those of diagnostic radiology, mammography and radiotherapy. In the second study, a tandem system for verification of half-value layer variations in CT equipment, using a pencil ionization chamber, was developed. Because of the X-ray tube rotation, the determination of half-value layers in computed tomography equipment is not an easy task, and it is usually not performed within quality control programs. (author)
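    The second study tracks half-value layer (HVL) variations. As a generic illustration of how an HVL is usually estimated from attenuation measurements (this is not the tandem method developed in the thesis), the sketch below interpolates the added filtration at which the transmitted signal falls to 50%; the readings are made up.

    ```python
    # Log-linear interpolation of the half-value layer between the two filter
    # thicknesses that bracket 50% transmission. Readings are placeholder values.

    import math

    def hvl_mm_al(thickness_mm, transmission):
        """Return the thickness (mm Al) at which transmission drops to 0.5."""
        points = list(zip(thickness_mm, transmission))
        for (t1, m1), (t2, m2) in zip(points, points[1:]):
            if m1 >= 0.5 >= m2:
                return t1 + (t2 - t1) * math.log(m1 / 0.5) / math.log(m1 / m2)
        raise ValueError("50% transmission not bracketed by the measurements")

    if __name__ == "__main__":
        thickness = [0.0, 2.0, 4.0, 6.0, 8.0]            # mm Al added filtration
        relative_reading = [1.00, 0.71, 0.52, 0.38, 0.28]
        print(f"HVL ~ {hvl_mm_al(thickness, relative_reading):.2f} mm Al")
    ```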

  5. Computer systems and software description for Standard-E+ Hydrogen Monitoring System (SHMS-E+)

    International Nuclear Information System (INIS)

    Tate, D.D.

    1997-01-01

    The primary function of the Standard-E+ Hydrogen Monitoring System (SHMS-E+) is to determine tank vapor space gas composition and gas release rate, and to detect gas release events. Characterization of the gas composition is needed for safety analyses. The lower flammability limit, as well as the peak burn temperature and pressure, are dependent upon the gas composition. If there is little or no knowledge about the gas composition, safety analyses utilize compositions that yield the worst case in a deflagration or detonation. Knowledge of the true composition could lead to reductions in the assumptions and therefore there may be a potential for a reduction in controls and work restrictions. Also, knowledge of the actual composition will be required information for the analysis that is needed to remove tanks from the Watch List. Similarly, the rate of generation and release of gases is required information for performing safety analyses, developing controls, designing equipment, and closing safety issues. This report outlines the computer system design layout description for the Standard-E+ Hydrogen Monitoring System

  6. Standardized UXO Technology Demonstration Site Scoring Record No. 946

    Science.gov (United States)

    2017-07-01

    Scoring record for the Standardized UXO Technology Demonstration Site program (push cart; areas covered: Small Munitions Test Site). ATEC Project No. 2011-DT-ATC-DODSP-F0292, Report No. ATC-12166, Survivability/Lethality Directorate, July 2017; report produced by the U.S. Army Aberdeen Test Center, Aberdeen Proving Ground, MD 21005-5059. The distribution list includes ...Munitions Management, ATTN: Mr. Herb Nelson.

  7. Beam standardization of X radiation in computed tomography

    International Nuclear Information System (INIS)

    Maia, Ana F.; Caldas, Linda V.E.

    2005-01-01

    The ionization chamber used in dosimetric procedures in computed tomography (CT) beams is an unsealed cylindrical chamber with a sensitive length between 10 and 15 cm, known as a pencil ionization chamber. Because the doses involved in CT procedures are higher than those in conventional radiology procedures, it is very important to ensure the appropriate calibration of pencil ionization chambers and thus the accuracy of the dosimetric procedures. Until recently, the Calibration Laboratory of the Instituto de Pesquisas Energéticas e Nucleares had standard fields only for conventional diagnostic radiology, which did not include the energy range used in CT. In this work, the results obtained in standard diagnostic radiology fields - all qualities of the RQR (direct beam) and RQA (attenuated beam) series described in the IEC 61267 standard - established on an industrial X-ray unit of the Calibration Laboratory are presented. The recommended qualities for the calibration of CT chambers are RQA9 and RQR9. The other qualities will be used for the calibration of other diagnostic radiology dosimeters and also for a broader study of the energy dependence of pencil ionization chambers.

  8. Gold-standard for computer-assisted morphological sperm analysis.

    Science.gov (United States)

    Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen

    2017-04-01

    Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, and thus no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments comparing common sperm head description and classification techniques. This classification baseline is intended to serve as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, obtained by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' kappa coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is best suited for tackling the problem of sperm head classification. We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm
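    The baseline pits 1-NN, naive Bayes, decision trees and an SVM against each other on shape descriptors. The sketch below shows how such a comparison is typically wired up with scikit-learn; the feature matrix and labels are random placeholders standing in for the Hu/Zernike/Fourier descriptors and expert labels of the SCIAN-MorphoSpermGS dataset, so the printed accuracies are meaningless.

    ```python
    # Cross-validated comparison of the four learners named in the abstract,
    # on placeholder features (e.g. 7 Hu-moment values per sperm head).

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 7))             # placeholder shape descriptors
    y = rng.integers(0, 5, size=200)          # 5 classes: normal ... amorphous

    classifiers = {
        "1-NN": KNeighborsClassifier(n_neighbors=1),
        "naive Bayes": GaussianNB(),
        "decision tree": DecisionTreeClassifier(random_state=0),
        "SVM": SVC(kernel="rbf"),
    }

    for name, clf in classifiers.items():
        acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
        print(f"{name:>13}: mean accuracy {acc:.2f}")
    ```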

  9. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from desktop clusters through institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.
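    The workflow above amounts to booting worker virtual machines on OpenStack-managed resources when demand rises. The sketch below shows a minimal way to do that with the openstacksdk client; the cloud name, image, flavor and network are placeholder assumptions, and the actual IEKP/NEMO integration adds scheduling, contextualisation and tear-down layers not shown here.

    ```python
    # Boot a single worker VM on an OpenStack cloud defined in clouds.yaml.
    # All resource names below are illustrative placeholders.

    import openstack

    def boot_worker(name: str = "hep-worker-001") -> None:
        conn = openstack.connect(cloud="example-cloud")       # entry in clouds.yaml
        image = conn.compute.find_image("hep-worker-image")   # pre-built worker image
        flavor = conn.compute.find_flavor("m1.large")
        network = conn.network.find_network("private")

        server = conn.compute.create_server(
            name=name,
            image_id=image.id,
            flavor_id=flavor.id,
            networks=[{"uuid": network.id}],
        )
        server = conn.compute.wait_for_server(server)          # block until ACTIVE
        print(f"{server.name} is {server.status}")

    if __name__ == "__main__":
        boot_worker()
    ```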

  10. Overview of the ANS [American Nuclear Society] mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A.O.

    1991-01-01

    The Mathematics and Computations Division of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains four ANSI/ANS software standards. These standards are: Recommended Programming Practices to Facilitate the Portability of Scientific Computer Programs, ANS-10.2; Guidelines for the Documentation of Computer Software, ANS-10.3; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Guidelines for Accommodating User Needs in Computer Program Development, ANS-10.5. 5 refs

  11. Remedial action standards for inactive uranium processing sites (40 cfr 192). Draft environmental impact statement

    International Nuclear Information System (INIS)

    1980-12-01

    The Environmental Protection Agency is proposing standards for disposing of uranium mill tailings from inactive processing sites and for cleaning up contaminated open land and buildings. These standards were developed pursuant to the Uranium Mill Tailings Radiation Control Act of 1978 (Public Law 95-604). This Act requires EPA to promulgate standards to protect the environment and public health and safety from radioactive and nonradioactive hazards posed by uranium mill tailings at designated inactive processing sites. The Draft Environmental Impact Statement examines health, technical, cost, and other factors relevant to determining standards. The proposed standards for disposal of the tailings piles cover radon emissions from the tailings to the air, protection of surface and ground water from radioactive and nonradioactive contaminants, and the length of time the disposal system should provide a reasonable expectation of meeting these standards. The proposed cleanup standards limit indoor radon decay product concentrations and gamma radiation levels and the residual radium concentration of contaminated land after cleanup

  12. Methodology to evaluate the site standard seismic motion for a nuclear facility

    International Nuclear Information System (INIS)

    Soares, W.A.

    1983-03-01

    An overall view of the subjects involved in the determination of the site standard seismic motion for a nuclear facility is presented. The main topics discussed are: basic principles of seismic instrumentation; dynamic and spectral concepts; design earthquake definitions; fundamentals of seismology; empirical curves developed from prior seismic data; and available methodologies and recommended procedures to evaluate the site standard seismic motion. (E.G.) [pt

  13. Administrative goals and safety standards for hazard control on forested recreation sites

    Science.gov (United States)

    Lee A. Paine

    1973-01-01

    For efficient control of tree hazard on recreation sites, a specific administrative goal must be selected. A safety standard designed to achieve the selected goal and a uniform hazard-rating procedure will then promote a consistent level of safety at an acceptable cost. Safety standards can be established with the aid of data for past years, and dollar evaluations are...

  14. The practical use of computer graphics techniques for site characterization

    International Nuclear Information System (INIS)

    Tencer, B.; Newell, J.C.

    1982-01-01

    In this paper the authors describe the approach utilized by Roy F. Weston, Inc. (WESTON) to analyze and characterize data relative to a specific site and the computerized graphical techniques developed to display site characterization data. These techniques reduce massive amounts of tabular data to a limited number of graphics easily understood by both the public and policy level decision makers. First, they describe the general design of the system; then the application of this system to a low level rad site followed by a description of an application to an uncontrolled hazardous waste site

  15. Controlling engineering project changes for multi-unit, multi-site standardized nuclear power plants

    International Nuclear Information System (INIS)

    Randall, E.; Boddeker, G.; McGugin, H.; Strother, E.; Waggoner, G.

    1978-01-01

    Multibillion-dollar, multiple nuclear power plant projects have numerous potential sources of engineering changes. The majority of these are internally generated changes, client-generated changes, and changes from construction, procurement, other engineering organizations, and regulatory organizations. For multiunit, multisite projects, the use of a standardized design is cost effective. Engineering changes can then be controlled for a single standardized design, and the unit- or site-unique changes can be treated as deviations. Once an effective change procedure is established for change control of the standardized design, the same procedures can be used for control of unit- or site-unique changes.

  16. The feasibility of using computer graphics in environmental evaluations : interim report, documenting historic site locations using computer graphics.

    Science.gov (United States)

    1981-01-01

    This report describes a method for locating historic site information using a computer graphics program. If adopted for use by the Virginia Department of Highways and Transportation, this method should significantly reduce the time now required to de...

  17. Fundamental challenging problems for developing new nuclear safety standard computer codes

    International Nuclear Information System (INIS)

    Wong, P.K.; Wong, A.E.; Wong, A.

    2005-01-01

    Based on the claims of US patents 5,084,232, 5,848,377 and 6,430,516 (retrievable by entering the patent numbers at http://164.195.100.11/netahtml/srchnum.htm), on the associated technical papers presented and published at international conferences in the last three years, and on material sent to the US-NRC by e-mail on March 26, 2003 at 2:46 PM, three fundamental challenging problems for developing new nuclear safety standard computer codes were presented at US-NRC RIC2003 Session W4 (2:15-3:15 PM, Presidential Ballroom, Capital Hilton Hotel, Washington D.C., April 16, 2003) in front of more than 800 nuclear professionals from many countries worldwide. The objective and scope of this paper is to invite all nuclear professionals to examine and evaluate the computer codes currently used in their own countries by comparing numerical data for these three openly challenging fundamental problems, in order to establish a global safety standard for all nuclear power plants in the world. (authors)

  18. Health risk assessment standards of cyanobacteria bloom occurrence in bathing sites

    Directory of Open Access Journals (Sweden)

    Agnieszka Stankiewicz

    2011-03-01

    A threat to human health arises during a massive cyanobacteria bloom in potable water used for human consumption or in basins used for recreational purposes. General health risk assessment standards and preventive measures to be taken by the sanitation service are presented with respect to: evaluation of cyanobacteria bloom occurrence at bathing sites and water bodies; procedures in case of a cyanobacteria bloom, including health risk assessment and the decision-making process to protect users’ health at bathing sites; and preventive measures to be taken in case of cyanobacteria bloom occurrence at bathing sites and in basins where bathing sites are located.

  19. Digitized molecular diagnostics: reading disk-based bioassays with standard computer drives.

    Science.gov (United States)

    Li, Yunchao; Ou, Lily M L; Yu, Hua-Zhong

    2008-11-01

    We report herein a digital signal readout protocol for screening disk-based bioassays with standard optical drives of ordinary desktop/notebook computers. Three different types of biochemical recognition reactions (biotin-streptavidin binding, DNA hybridization, and protein-protein interaction) were performed directly on a compact disk in a line array format with the help of microfluidic channel plates. Being well-correlated with the optical darkness of the binding sites (after signal enhancement by gold nanoparticle-promoted autometallography), the reading error levels of prerecorded audio files can serve as a quantitative measure of biochemical interaction. This novel readout protocol is about 1 order of magnitude more sensitive than fluorescence labeling/scanning and has the capability of examining multiplex microassays on the same disk. Because no modification to either hardware or software is needed, it promises a platform technology for rapid, low-cost, and high-throughput point-of-care biomedical diagnostics.

  20. Standardized computer-based organized reporting of EEG:SCORE

    DEFF Research Database (Denmark)

    Beniczky, Sandor; H, Aurlien,; JC, Brøgger,

    2013-01-01

    The interobserver agreement in EEG interpretation is only moderate; this is partly due to the method of reporting the findings in free-text format. The purpose of our endeavor was to create a computer-based system for EEG assessment and reporting, where the physicians would construct the reports by choosing from predefined elements for each relevant EEG feature, as well as the clinical phenomena (for video-EEG recordings). A consensus proposal went through a pan-European review process, organized by the European Chapter of the International Federation of Clinical Neurophysiology. The Standardised Computer-based Organised Reporting of EEG (SCORE) software was constructed based on the terms and features of the consensus statement and it was tested in the clinical practice. SCORE can potentially improve the quality of EEG assessment and reporting; it will help incorporate the results of computer-assisted analysis into the report, it will make possible the build-up of a multinational database, and it will help in training young neurophysiologists.

  1. Evaluation of four building energy analysis computer programs against ASHRAE standard 140-2007

    CSIR Research Space (South Africa)

    Szewczuk, S

    2014-08-01

    ...) standard or code of practice. Agrément requested the CSIR to evaluate a range of building energy simulation computer programs. The standard against which these computer programs were to be evaluated was developed by the American Society of Heating...

  2. Mental Computation or Standard Algorithm? Children's Strategy Choices on Multi-Digit Subtractions

    Science.gov (United States)

    Torbeyns, Joke; Verschaffel, Lieven

    2016-01-01

    This study analyzed children's use of mental computation strategies and the standard algorithm on multi-digit subtractions. Fifty-eight Flemish 4th graders of varying mathematical achievement level were individually offered subtractions that either stimulated the use of mental computation strategies or the standard algorithm in one choice and two…

  3. Launch Site Computer Simulation and its Application to Processes

    Science.gov (United States)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  4. Ergonomics standards and guidelines for computer workstation design and the impact on users' health - a review.

    Science.gov (United States)

    Woo, E H C; White, P; Lai, C W K

    2016-03-01

    This paper presents an overview of global ergonomics standards and guidelines for design of computer workstations, with particular focus on their inconsistency and associated health risk impact. Overall, considerable disagreements were found in the design specifications of computer workstations globally, particularly in relation to the results from previous ergonomics research and the outcomes from current ergonomics standards and guidelines. To cope with the rapid advancement in computer technology, this article provides justifications and suggestions for modifications in the current ergonomics standards and guidelines for the design of computer workstations. Practitioner Summary: A research gap exists in ergonomics standards and guidelines for computer workstations. We explore the validity and generalisability of ergonomics recommendations by comparing previous ergonomics research through to recommendations and outcomes from current ergonomics standards and guidelines.

  5. Standards guide for space and earth sciences computer software

    Science.gov (United States)

    Mason, G.; Chapman, R.; Klinglesmith, D.; Linnekin, J.; Putney, W.; Shaffer, F.; Dapice, R.

    1972-01-01

    Guidelines for the preparation of systems analysis and programming work statements are presented. The data is geared toward the efficient administration of available monetary and equipment resources. Language standards and the application of good management techniques to software development are emphasized.

  6. Evaluation of regulatory processes affecting nuclear power plant early site approval and standardization

    International Nuclear Information System (INIS)

    1983-12-01

    This report presents the results of a survey and evaluation of existing federal, state and local regulatory considerations affecting siting approval of power plants in the United States. Those factors that may impede early site approval of nuclear power plants are identified, and findings related to the removal of these impediments and the general improvement of the approval process are presented. A brief evaluation of standardization of nuclear plant design is also presented

  7. The Role of Standards in Cloud-Computing Interoperability

    Science.gov (United States)

    2012-10-01

    ...services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building private clouds. Fragments from an accompanying table: ...center requirements; developing usage models for cloud vendors; independent IT consortium. OpenStack (http://www.openstack.org): open-source software for running private clouds; currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift)...

  8. Development of a standard for computer program verification and control

    International Nuclear Information System (INIS)

    Dunn, T.E.; Ozer, O.

    1980-01-01

    It is expected that adherence to the guidelines of the ANS 10.4 will: 1. Provide confidence that the program conforms to its requirements specification; 2. Provide confidence that the computer program has been adequately evaluated and tested; 3. Provide confidence that program changes are adequately evaluated, tested, and controlled; and 4. Enhance assurance that reliable data will be produced for engineering, scientific, and safety analysis purposes

  9. Standardized Computer-based Organized Reporting of EEG: SCORE

    Science.gov (United States)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C; Fuglsang-Frederiksen, Anders; Martins-da-Silva, António; Trinka, Eugen; Visser, Gerhard; Rubboli, Guido; Hjalgrim, Helle; Stefan, Hermann; Rosén, Ingmar; Zarubova, Jana; Dobesberger, Judith; Alving, Jørgen; Andersen, Kjeld V; Fabricius, Martin; Atkins, Mary D; Neufeld, Miri; Plouin, Perrine; Marusic, Petr; Pressler, Ronit; Mameniskiene, Ruta; Hopfengärtner, Rüdiger; Emde Boas, Walter; Wolf, Peter

    2013-01-01

    The electroencephalography (EEG) signal has a high complexity, and the process of extracting clinically relevant features is achieved by visual analysis of the recordings. The interobserver agreement in EEG interpretation is only moderate. This is partly due to the method of reporting the findings in free-text format. The purpose of our endeavor was to create a computer-based system for EEG assessment and reporting, where the physicians would construct the reports by choosing from predefined elements for each relevant EEG feature, as well as the clinical phenomena (for video-EEG recordings). A working group of EEG experts took part in consensus workshops in Dianalund, Denmark, in 2010 and 2011. The faculty was approved by the Commission on European Affairs of the International League Against Epilepsy (ILAE). The working group produced a consensus proposal that went through a pan-European review process, organized by the European Chapter of the International Federation of Clinical Neurophysiology. The Standardised Computer-based Organised Reporting of EEG (SCORE) software was constructed based on the terms and features of the consensus statement and it was tested in the clinical practice. The main elements of SCORE are the following: personal data of the patient, referral data, recording conditions, modulators, background activity, drowsiness and sleep, interictal findings, “episodes” (clinical or subclinical events), physiologic patterns, patterns of uncertain significance, artifacts, polygraphic channels, and diagnostic significance. The following specific aspects of the neonatal EEGs are scored: alertness, temporal organization, and spatial organization. For each EEG finding, relevant features are scored using predefined terms. Definitions are provided for all EEG terms and features. SCORE can potentially improve the quality of EEG assessment and reporting; it will help incorporate the results of computer-assisted analysis into the report, it will make

  10. To the problem of reliability standardization in computer-aided manufacturing at NPP units

    International Nuclear Information System (INIS)

    Yastrebenetskij, M.A.; Shvyryaev, Yu.V.; Spektor, L.I.; Nikonenko, I.V.

    1989-01-01

    The problems of reliability standardization in computer-aided manufacturing of NPP units are analyzed, considering the following approaches: computer-aided manufacturing of NPP units as part of an automated technological complex, and computer-aided manufacturing of NPP units as a multi-functional system. The selection of the set of reliability indices for computer-aided manufacturing of NPP units is substantiated for each of the approaches considered.

  11. Work plan for defining a standard inventory estimate for wastes stored in Hanford Site underground tanks

    International Nuclear Information System (INIS)

    Hodgson, K.M.

    1996-01-01

    This work plan addresses the Standard Inventory task scope, deliverables, budget, and schedule for fiscal year 1997. The goal of the Standard Inventory task is to resolve differences among the many reported Hanford Site tank waste inventory values and to provide inventory estimates that will serve as Standard Inventory values for all waste management and disposal activities. These best-basis estimates of chemicals and radionuclides will be reported on both a global and tank-specific basis and will be published in the Tank Characterization Database

  12. New Standards for the Validation of EMC Test Sites particularly above 1 GHz

    Directory of Open Access Journals (Sweden)

    S. Battermann

    2005-01-01

    Standards for the validation of alternative test sites with a conducting ground plane have existed for the frequency range 30-1000 MHz since the end of the eighties. Recently the procedure for fully anechoic rooms (FAR) has been included in CISPR 16 after more than 10 years of intensive discussion in standards committees (CENELEC, 2002; CISPR, 2004). However, there are no standards available for the validation of alternative test sites above 1 GHz. The responsible working group (WG1) in CISPR/A has drawn up the 7th common draft (CD); a CDV will be published in spring 2005. The German standards committee VDE AK 767.4.1 participates in the drafting of the standard. All measurement procedures proposed in the recent CDs have been investigated by measurements and theoretical analysis. This contribution describes the basic ideas and problems of the test-site validation procedure. Furthermore, measurement results and numerical calculations are presented, especially for the use of omni-directional antennas.

  13. Standard problems to evaluate soil structure interaction computer codes

    International Nuclear Information System (INIS)

    Miller, C.A.; Costantino, C.J.; Philippacopoulos, A.J.

    1979-01-01

    The seismic response of nuclear power plant structures is often calculated using lumped parameter methods. A finite element model of the structure is coupled to the soil with a spring-dashpot system used to represent the interaction process. The parameters of the interaction model are based on analytic solutions to simple problems which are idealizations of the actual problems of interest. The objective of the work reported in this paper is to compare predicted responses using the standard lumped parameter models with experimental data. These comparisons are shown to be good for a fairly uniform soil system and for loadings which do not result in nonlinear interaction effects such as liftoff. 7 references, 7 figures
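    The lumped-parameter approach couples the structural model to the soil through springs and dashpots whose constants come from analytic solutions to idealized problems. The sketch below evaluates the textbook frequency-independent spring constants for a rigid circular footing on an elastic half-space; these are illustrative formulas only, not necessarily the parameters used in the paper's comparisons, and the numerical inputs are placeholders.

    ```python
    # Classic static half-space spring constants for a rigid circular footing
    # of radius r (shear modulus G, Poisson ratio nu). Illustrative values only.

    def halfspace_springs(G: float, r: float, nu: float) -> dict[str, float]:
        return {
            "vertical":   4.0 * G * r / (1.0 - nu),
            "horizontal": 8.0 * G * r / (2.0 - nu),
            "rocking":    8.0 * G * r**3 / (3.0 * (1.0 - nu)),
            "torsion":    16.0 * G * r**3 / 3.0,
        }

    if __name__ == "__main__":
        # e.g. G = 80 MPa, r = 10 m, nu = 0.33 (made-up soil/foundation values)
        for mode, k in halfspace_springs(80.0e6, 10.0, 0.33).items():
            print(f"{mode:>10}: {k:.3e}  (N/m or N*m/rad)")
    ```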

  14. Standard practice for classification of computed radiology systems

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 This practice describes the evaluation and classification of a computed radiography (CR) system, a particular phosphor imaging plate (IP), system scanner and software, in combination with specified metal screens for industrial radiography. It is intended to ensure that the evaluation of image quality, as far as this is influenced by the scanner/IP system, meets the needs of users. 1.2 The practice defines system tests to be used to classify the systems of different suppliers and make them comparable for users. 1.3 The CR system performance is described by signal and noise parameters. For film systems, the signal is represented by gradient and the noise by granularity. The signal-to-noise ratio is normalized by the basic spatial resolution of the system and is part of classification. The normalization is given by the scanning aperture of 100 µm diameter for the micro-photometer, which is defined in Test Method E1815 for film system classification. This practice describes how the parameters shall be meas...
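    One common way to write the normalization described in 1.3 (shown here as an illustration, not a quotation from the practice) scales the measured signal-to-noise ratio by the system's basic spatial resolution; the 88.6 µm figure is the side of a square with the same area as the 100 µm circular scanning aperture used for film-system classification.

    ```latex
    \[
      \mathrm{SNR}_{N} \;=\; \mathrm{SNR}_{\text{measured}}
          \times \frac{88.6\ \mu\mathrm{m}}{SR_{b}}
    \]
    ```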

  15. US Department of Energy DOE Nevada Operations Office, Nevada Test Site: Underground safety and health standards

    International Nuclear Information System (INIS)

    1993-05-01

    The Nevada Test Site Underground Safety and Health Standards Working Group was formed at the direction of John D. Stewart, Director, Nevada Test Site Office, in April 1990. The objective of the Working Group was to compile a safety and health standard from the California Tunnel Safety Orders and OSHA for the underground operations at the NTS (excluding Yucca Mountain). These standards are called the NTS U/G Safety and Health Standards. The Working Group submits these standards as a RECOMMENDATION to the Director, NTSO. Although the Working Group considers these standards to be the most integrated and comprehensive standards that could be developed for NTS Underground Operations, the intent is not to supersede or replace any relevant DOE orders. Rather, the intent is to collate the multiple safety and health references contained in DOE Order 5480.4 that have applicability to NTS Underground Operations into a single safety and health standard to be used in the underground operations at the NTS. Each portion of the standard was included only after careful consideration by the Working Group and is judged to be both effective and appropriate. The specific methods and rationale used by the Working Group are outlined as follows: the letter from DOE/HQ, dated September 28, 1990, cited OSHA and the CTSO as the safety and health codes applicable to underground operations at the NTS. These mandated codes were each originally developed to be comprehensive, i.e., all underground operations of a particular type (e.g., tunnels in the case of the CTSO) were intended to be adequately regulated by the appropriate code. However, this is not true; the Working Group found extensive and confusing overlap in the codes in numerous areas. Other subjects and activities were addressed by the various codes in cursory fashion or not at all.

  16. US Department of Energy DOE Nevada Operations Office, Nevada Test Site: Underground safety and health standards

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    The Nevada Test Site Underground Safety and Health Standards Working Group was formed at the direction of John D. Stewart, Director, Nevada Test Site Office, in April 1990. The objective of the Working Group was to compile a safety and health standard from the California Tunnel Safety Orders and OSHA for the underground operations at the NTS (excluding Yucca Mountain). These standards are called the NTS U/G Safety and Health Standards. The Working Group submits these standards as a RECOMMENDATION to the Director, NTSO. Although the Working Group considers these standards to be the most integrated and comprehensive standards that could be developed for NTS Underground Operations, the intent is not to supersede or replace any relevant DOE orders. Rather, the intent is to collate the multiple safety and health references contained in DOE Order 5480.4 that have applicability to NTS Underground Operations into a single safety and health standard to be used in the underground operations at the NTS. Each portion of the standard was included only after careful consideration by the Working Group and is judged to be both effective and appropriate. The specific methods and rationale used by the Working Group are outlined as follows: the letter from DOE/HQ, dated September 28, 1990, cited OSHA and the CTSO as the safety and health codes applicable to underground operations at the NTS. These mandated codes were each originally developed to be comprehensive, i.e., all underground operations of a particular type (e.g., tunnels in the case of the CTSO) were intended to be adequately regulated by the appropriate code. However, this is not true; the Working Group found extensive and confusing overlap in the codes in numerous areas. Other subjects and activities were addressed by the various codes in cursory fashion or not at all.

  17. States Move toward Computer Science Standards. Policy Update. Vol. 23, No. 17

    Science.gov (United States)

    Tilley-Coulson, Eve

    2016-01-01

    While educators and parents recognize computer science as a key skill for career readiness, only five states have adopted learning standards in this area. Tides are changing, however, as the Every Student Succeeds Act (ESSA) recognizes with its call on states to provide a "well-rounded education" for students, to include computer science…

  18. Value of computed tomography for evaluating the injection site in endosonography-guided celiac plexus neurolysis

    International Nuclear Information System (INIS)

    Sakamoto, Hiroki; Kitano, Masayuki; Nishio, Takeshi; Takeyama, Yoshifumi; Yasuda, Chikao; Kudo, Masatoshi

    2006-01-01

    Endosonography-guided celiac plexus neurolysis (EUS-CPN) safely and effectively relieves pain associated with intra-abdominal malignancies when the neurolytic is accurately injected. We applied contrast medium to evaluate the ethanol injection sites in patients who received EUS-CPN due to abdominal pain caused by malignancies. We injected, under the guidance of endoscopic ultrasonography (EUS), ethanol containing 10% contrast medium into the celiac plexus of patients with intra-abdominal pain due to malignancies. Immediately after the endoscopic therapy, patients underwent computed tomography (CT) to confirm the injection site. Images of the distribution of injected solutions were classified into three groups. Injected solution dispersed in the unilateral or bilateral anterocrural space was defined as "unilateral injection" or "bilateral injection", respectively. Injected solution located outside the anterocrural space was defined as "inappropriate injection". Pre- and postprocedure pain was assessed using a standard analog scale. Before and 2, 4, 8, 12, and 16 weeks after the procedure, pain scores were evaluated. From April 2003 to May 2005, 13 patients were enrolled in this study. Improvement of the pain score in the "bilateral injection" and "unilateral injection" groups was significantly superior to the change in the "inappropriate injection" group. Although EUS-CPN was effective in eight of 13 patients (61.5%), additional EUS-CPN in the "inappropriate injection" group increased the response rate to 84.6%. Injection of ethanol into the anterocrural space by EUS-CPN produced adequate pain relief. Immediate examination by CT to confirm the injection sites after EUS-CPN would increase the likelihood of achieving pain relief. (author)

  19. Internet resources for dentistry: computer, Internet, reference, and sites for enhancing personal productivity of the dental professional.

    Science.gov (United States)

    Guest, G F

    2000-08-15

    At the onset of the new millennium the Internet has become the new standard means of distributing information. In the last two to three years there has been an explosion of e-commerce, with hundreds of new web sites being created every minute. For most corporate entities, a web site is as essential as the phone book listing used to be. Twenty years ago technologists directed how computer-based systems were utilized. Now it is the end users of personal computers that have gained expertise and drive the functionality of software applications. The computer, initially invented for mathematical functions, has transitioned from this role to an integrated communications device that provides the portal to the digital world. The Web needs to be used by healthcare professionals, not only for professional activities, but also for instant access to information and services "just when they need it." This will facilitate the longitudinal use of information as society continues to gain better information access skills. With the demand for current "just in time" information and the standards established by Internet protocols, reference sources of information may be maintained in dynamic fashion. News services have been available through the Internet for several years, but now reference materials such as online journals and digital textbooks have become available and have the potential to change the traditional publishing industry. The pace of change should make us consider Will Rogers' advice, "It isn't good enough to be moving in the right direction. If you are not moving fast enough, you can still get run over!" The intent of this article is to complement previous articles on Internet Resources published in this journal, by presenting information about web sites that present information on computer and Internet technologies, reference materials, news information, and information that lets us improve personal productivity. Neither the author nor the Journal endorses any of the

  20. Effects Of Social Networking Sites (SNSs) On Hyper Media Computer Mediated Environments (HCMEs)

    OpenAIRE

    Yoon C. Cho

    2011-01-01

    Social Networking Sites (SNSs) are known as tools to interact and build relationships between users/customers in Hyper Media Computer Mediated Environments (HCMEs). This study explored how social networking sites play a significant role in communication between users. While numerous researchers have examined the effectiveness of social networking websites, few studies have investigated which factors affect customers' attitudes and behavior toward social networking sites. In this paper, the authors inv...

  1. Standardization of computer programs - basis of the Czechoslovak library of nuclear codes

    International Nuclear Information System (INIS)

    Gregor, M.

    1987-01-01

    A standardized form of computer code documentation has been established in the CSSR in the field of reactor safety. The structure and content of the documentation are described, and codes already subject to this process are mentioned. The standardization is aimed at the formation of a Czechoslovak nuclear code library and at facilitating discussion of safety reports containing results obtained with standardized codes.

  2. Data Management Standards in Computer-aided Acquisition and Logistic Support (CALS)

    Science.gov (United States)

    Jefferson, David K.

    1990-01-01

    Viewgraphs and discussion on data management standards in computer-aided acquisition and logistic support (CALS) are presented. CALS is intended to reduce cost, increase quality, and improve timeliness of weapon system acquisition and support by greatly improving the flow of technical information. The phase 2 standards, industrial environment, are discussed. The information resource dictionary system (IRDS) is described.

  3. Standard molar enthalpy of formation of 1-benzosuberone: An experimental and computational study

    International Nuclear Information System (INIS)

    Miranda, Margarida S.; Morais, Victor M.F.; Matos, M. Agostinha R.; Liebman, Joel F.

    2010-01-01

    The energetics of 1-benzosuberone was studied by a combination of calorimetric techniques and computational calculations. The standard (p° = 0.1 MPa) molar enthalpy of formation of 1-benzosuberone, in the liquid phase, was derived from the massic energy of combustion, in oxygen, at T = 298.15 K, measured by static bomb combustion calorimetry. The standard molar enthalpy of vaporization, at T = 298.15 K, was measured by Calvet microcalorimetry. From these two parameters the standard (p° = 0.1 MPa) molar enthalpy of formation, in the gaseous phase, at T = 298.15 K, was derived: −(96.1 ± 3.4) kJ·mol⁻¹. The G3(MP2)//B3LYP composite method and appropriate reactions were used to computationally calculate the standard molar enthalpy of formation of 1-benzosuberone, in the gaseous phase, at T = 298.15 K. The computational results are in very good agreement with the experimental value.
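
    The derivation described above follows the standard thermodynamic relation between the condensed-phase enthalpy of formation and the enthalpy of vaporization; a minimal statement of that relation (notation assumed here, not quoted from the paper) is:

        \Delta_{\mathrm{f}}H^{\circ}_{\mathrm{m}}(\mathrm{g},\,298.15\ \mathrm{K})
          = \Delta_{\mathrm{f}}H^{\circ}_{\mathrm{m}}(\mathrm{l},\,298.15\ \mathrm{K})
          + \Delta_{\mathrm{l}}^{\mathrm{g}}H^{\circ}_{\mathrm{m}}(298.15\ \mathrm{K})

    with the liquid-phase term obtained from combustion calorimetry and the vaporization term from Calvet microcalorimetry, which together give the gas-phase value quoted above.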

  4. Evaluation of measurement reproducibility using the standard-sites data, 1994 Fernald field characterization demonstration project

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1996-02-01

    The US Department of Energy conducted the 1994 Fernald (Ohio) field characterization demonstration project to evaluate the performance of a group of both industry-standard and proposed alternative technologies in describing the nature and extent of uranium contamination in surficial soils. Detector stability and measurement reproducibility under actual operating conditions encountered in the field are critical to establishing the credibility of the proposed alternative characterization methods. Comparability of measured uranium activities to those reported by conventional, US Environmental Protection Agency (EPA)-certified laboratory methods is also required. The eleven (11) technologies demonstrated included (1) EPA-standard soil sampling and laboratory mass-spectroscopy analyses, and currently-accepted field-screening techniques using (2) sodium-iodide scintillometers, (3) FIDLER low-energy scintillometers, and (4) a field-portable x-ray fluorescence spectrometer. Proposed advanced characterization techniques included (5) alpha-track detectors, (6) a high-energy beta scintillometer, (7) electret ionization chambers, (8) and (9) a high-resolution gamma-ray spectrometer in two different configurations, (10) a field-adapted laser ablation-inductively coupled plasma-atomic emission spectroscopy (ICP-AES) technique, and (11) a long-range alpha detector. Measurement reproducibility and the accuracy of each method were tested by acquiring numerous replicate measurements of total uranium activity at each of two ''standard sites'' located within the main field demonstration area. Meteorological variables including temperature, relative humidity, and 24-hour rainfall quantities were also recorded in conjunction with the standard-sites measurements

  5. POLYAR, a new computer program for prediction of poly(A) sites in human sequences

    Directory of Open Access Journals (Sweden)

    Qamar Raheel

    2010-11-01

    Full Text Available Abstract Background mRNA polyadenylation is an essential step of pre-mRNA processing in eukaryotes. Accurate prediction of the pre-mRNA 3'-end cleavage/polyadenylation sites is important for defining the gene boundaries and understanding gene expression mechanisms. Results 28761 human mapped poly(A) sites have been classified into three classes containing different known forms of polyadenylation signal (PAS) or none of them (PAS-strong, PAS-weak and PAS-less, respectively), and a new computer program POLYAR for the prediction of poly(A) sites of each class was developed. In comparison with polya_svm (to date the most accurate computer program for prediction of poly(A) sites) while searching for PAS-strong poly(A) sites in human sequences, POLYAR had a significantly higher prediction sensitivity (80.8% versus 65.7%) and specificity (66.4% versus 51.7%). However, when a similar search was conducted for PAS-weak and PAS-less poly(A) sites, both programs had a very low prediction accuracy, which indicates that our knowledge of the factors involved in the determination of poly(A) sites is not sufficient to identify such polyadenylation regions. Conclusions We present a new classification of polyadenylation sites into three classes and a novel computer program POLYAR for prediction of poly(A) sites/regions of each of these classes. In tests, POLYAR shows high accuracy of prediction of the PAS-strong poly(A) sites; its efficiency in searching for PAS-weak and PAS-less poly(A) sites is not very high but is comparable to that of other available programs. These findings suggest that additional characteristics of such poly(A) sites remain to be elucidated. The POLYAR program, with a stand-alone version for downloading, is available at http://cub.comsats.edu.pk/polyapredict.htm.
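
    The sensitivity and specificity figures quoted for POLYAR follow from the usual confusion-matrix definitions. The sketch below is a minimal illustration of those two metrics only; the counts are placeholders, not values from the study.

        # Minimal sketch of the sensitivity/specificity metrics used to compare
        # poly(A)-site predictors such as POLYAR and polya_svm.
        # The counts below are placeholders, not values from the study.

        def sensitivity(tp: int, fn: int) -> float:
            """Fraction of true poly(A) sites that the predictor recovers."""
            return tp / (tp + fn)

        def specificity(tn: int, fp: int) -> float:
            """Fraction of non-sites that the predictor correctly rejects."""
            return tn / (tn + fp)

        if __name__ == "__main__":
            tp, fn, tn, fp = 808, 192, 664, 336   # hypothetical counts
            print(f"sensitivity = {sensitivity(tp, fn):.1%}")
            print(f"specificity = {specificity(tn, fp):.1%}")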

  6. Subtraction radiography and computer assisted densitometric analyses of standardized radiographs. A comparison study with ¹²⁵I absorptiometry

    Energy Technology Data Exchange (ETDEWEB)

    Ortmann, L.F.; Dunford, R.; McHenry, K.; Hausmann, E.

    1985-01-01

    A standardized radiographic series of incrementally increasing alveolar crestal defects in skulls was subjected to analyses by subtraction radiography and computer assisted quantitative densitometric analysis. Subjects were able to detect change using subtraction radiography in alveolar bone defects with bone loss in the range of 1-5 percent as measured by ¹²⁵I absorptiometry. Quantitative densitometric analyses utilizing radiographic pairs adjusted for differences in contrast (gamma corrected) can be used to follow longitudinal changes at a particular alveolar bone site. Such measurements correlate with change observed by ¹²⁵I absorptiometry (r=0.82-0.94). (author).

  7. Can standard sequential extraction determinations effectively define heavy metal species in superfund site soils?

    Energy Technology Data Exchange (ETDEWEB)

    Dahlin, Cheryl L.; Williamson, Connie A.; Collins, Wesley K.; Dahlin, David C.

    2001-01-01

    Speciation and distribution of heavy metals in soils control the degree to which metals and their compounds are mobile, extractable, and plant-available. Consequently, speciation impacts the success of remediation efforts both by defining the relationship of the contaminants with their environment and by guiding development and evaluation of workable remediation strategies. The U.S. Department of Energy, Albany Research Center (Albany, OR), under a two-year interagency project with the U.S. Environmental Protection Agency (EPA), examined the suitability of sequential extraction as a definitive means to determine species of heavy metals in soil samples. Representative soil samples, contaminated with lead, arsenic, and/or chromium, were collected by EPA personnel from two Superfund sites, the National Lead Company site in Pedricktown, NJ, and the Roebling Steel, Inc., site in Florence, NJ. Data derived from Tessier's standard three-stage sequential-extraction procedure were compared to data from a comprehensive characterization study that combined optical- and scanning-electron microscopy (with energy-dispersive x-ray and wavelength-dispersive x-ray analyses), x-ray diffraction, and chemical analyses. The results show that standard sequential-extraction procedures that were developed for characterizing species of contaminants in river sediments may be unsuitable for sole evaluation of contaminant species in industrial-site materials (particularly those that contain larger particles of the contaminants, encapsulated contaminants, and/or man-made materials such as slags, metals, and plastics). However, each sequential extraction or comprehensive characterization procedure has its own strengths and weaknesses. Findings of this study indicate that the use of both approaches, during the early stages of site studies, would be a best practice. The investigation also highlights the fact that an effective speciation study does not simply identify metal contaminants as

  8. An accurate on-site calibration system for electronic voltage transformers using a standard capacitor

    Science.gov (United States)

    Hu, Chen; Chen, Mian-zhou; Li, Hong-bin; Zhang, Zhu; Jiao, Yang; Shao, Haiming

    2018-05-01

    Ordinarily, electronic voltage transformers (EVTs) are calibrated off-line, and the calibration procedure requires complex switching operations, which will influence the reliability of the power grid and induce large economic losses. To overcome this problem, this paper investigates a 110 kV on-site calibration system for EVTs, including a standard channel, a calibrated channel and a PC equipped with the LabView environment. The standard channel employs a standard capacitor and an analogue integrating circuit to reconstruct the primary voltage signal. Moreover, an adaptive full-phase discrete Fourier transform (DFT) algorithm is proposed to extract electrical parameters. The algorithm involves extracting the frequency of the grid, adjusting the operation points, and calculating the results using the DFT. In addition, an insulated automatic lifting device, driven by a wireless remote controller, is designed to realize the live connection of the standard capacitor. A performance test verifies the accuracy of the standard capacitor. A system calibration test shows that the system ratio error is less than 0.04% and the phase error is below 2′, which meets the requirement of the 0.2 accuracy class. Finally, the developed calibration system was used in a substation, and the field test data validate the availability of the system.
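
    The parameter-extraction step lends itself to a compact illustration. The sketch below is a generic single-frequency DFT evaluated under the assumption that an integer number of grid cycles has been captured (the condition the adaptive algorithm above works to satisfy); it is not the published implementation, and all numerical values are invented.

        import numpy as np

        def single_bin_dft(samples: np.ndarray, fs: float, f0: float):
            """Estimate amplitude and phase of the component at f0 from a record
            sampled at fs, assuming an integer number of cycles is captured."""
            n = np.arange(samples.size)
            phasor = np.sum(samples * np.exp(-2j * np.pi * f0 * n / fs))
            amplitude = 2.0 * np.abs(phasor) / samples.size
            phase = np.angle(phasor)
            return amplitude, phase

        # Hypothetical use: compare the standard-capacitor channel with the EVT
        # channel and express the ratio and phase errors.
        fs, f0, cycles = 10_000.0, 50.0, 10
        t = np.arange(int(fs * cycles / f0)) / fs
        ref = 1.00 * np.cos(2 * np.pi * f0 * t)            # standard channel
        dut = 0.9996 * np.cos(2 * np.pi * f0 * t - 4e-4)   # EVT channel (made up)
        a_ref, p_ref = single_bin_dft(ref, fs, f0)
        a_dut, p_dut = single_bin_dft(dut, fs, f0)
        ratio_error = (a_dut - a_ref) / a_ref              # spec above: below 0.04%
        phase_error = p_dut - p_ref                        # radians; spec is about 2'

    In practice the grid frequency drifts, which is why the published algorithm first estimates the frequency and adapts the analysis window before applying the DFT.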

  9. Standards and guidelines pertinent to the development of decommissioning criteria for sites contaminated with radioactive material

    International Nuclear Information System (INIS)

    Dickson, H.W.

    1978-08-01

    A review of existing health and safety standards and guidelines has been undertaken to assist in the development of criteria for the decontamination and decommissioning of property contaminated with radioactive material. During the early years of development of the nuclear program in the United States, a number of sites were used which became contaminated with radioactive material. Many of these sites are no longer useful for nuclear activities, and the U.S. DOE desires to develop criteria for the management of these sites for future uses. Radiation protection standards promulgated by ICRP, NCRP, and ANSI have been considered. Government regulations, from the Code of Federal Regulations and the legal codes of various states, as well as regulatory guidelines with specific application to decommissioning of nuclear facilities also have been reviewed. In addition, recommendations of other scientific organizations such as the National Academy of Sciences/National Research Council Advisory Committee on the Biological Effects of Ionizing Radiations and the United Nations Scientific Committee on the Effects of Atomic Radiation were considered. Finally, a few specific recommendations and discussions from current literature were included. 28 references

  10. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    Energy Technology Data Exchange (ETDEWEB)

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-07-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on models that must be used in these efforts. To identify what models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.

  11. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    International Nuclear Information System (INIS)

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-07-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on models that must be used in these efforts. To identify what models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.

  12. A climatological model for risk computations incorporating site- specific dry deposition influences

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.

    1991-07-01

    A gradient-flux dry deposition module was developed for use in a climatological atmospheric transport model, the Multimedia Environmental Pollutant Assessment System (MEPAS). The atmospheric pathway model computes long-term average contaminant air concentration and surface deposition patterns surrounding a potential release site, incorporating location-specific dry deposition influences. Gradient-flux formulations are used to incorporate site and regional data in the dry deposition module for this atmospheric sector-average climatological model. Application of these formulations provides an effective means of accounting for local surface roughness in deposition computations. Linkage to a risk computation module resulted in a need for separate regional and specific surface deposition computations. 13 refs., 4 figs., 2 tabs
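
    As an illustration of how local surface roughness can enter a gradient-flux deposition calculation, the sketch below uses a generic resistance-analogy formulation; it is not the MEPAS module itself, and the wind and surface-resistance values are placeholders.

        import math

        # Illustrative resistance-analogy sketch of a dry deposition velocity;
        # a generic gradient-flux style formulation, not the actual MEPAS module.
        VON_KARMAN = 0.4

        def aerodynamic_resistance(u_star: float, z_ref: float, z0: float) -> float:
            """Neutral-stability aerodynamic resistance (s/m); z0 is the local
            surface roughness length the module accounts for."""
            return math.log(z_ref / z0) / (VON_KARMAN * u_star)

        def deposition_velocity(u_star: float, z_ref: float, z0: float,
                                r_surface: float) -> float:
            """Deposition velocity (m/s) from resistances in series
            (quasi-laminar layer omitted for brevity)."""
            return 1.0 / (aerodynamic_resistance(u_star, z_ref, z0) + r_surface)

        # Rougher surfaces (larger z0) lower the aerodynamic resistance and
        # raise the computed deposition flux for the same wind conditions.
        print(deposition_velocity(u_star=0.3, z_ref=10.0, z0=0.03, r_surface=100.0))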

  13. USAGE OF STANDARD PERSONAL COMPUTER PORTS FOR DESIGNING OF THE DOUBLE REDUNDANT FAULT-TOLERANT COMPUTER CONTROL SYSTEMS

    Directory of Open Access Journals (Sweden)

    Rafig SAMEDOV

    2005-01-01

    Full Text Available In this study, the ports of standard personal computers were investigated for the design of fault-tolerant control systems, different structural versions were designed, and a method for choosing an optimal structure was suggested. Within this scope, the ÇİFTYAK system was first defined and its working principle determined. The data transmission ports of standard personal computers were then classified and analyzed. Next, the structural versions were designed and evaluated according to the data transmission methods used, the number of ports, and the criteria of reliability, performance, correctness, control and cost. Finally, a method for choosing the most suitable structural version was suggested.

  14. Computer simulations of rare earth sites in glass: experimental tests and applications to laser materials

    International Nuclear Information System (INIS)

    Weber, M.J.

    1984-11-01

    Computer simulations of the microscopic structure of BeF₂ glasses using molecular dynamics are reviewed and compared with x-ray and neutron diffraction, EXAFS, NMR, and optical measurements. Unique information about the site-to-site variations in the local environments of rare earth ions is obtained using optical selective excitation and laser-induced fluorescence line-narrowing techniques. Applications and limitations of computer simulations to the development of laser glasses and to predictions of other static and dynamic properties of glasses are discussed. 35 references, 2 figures, 2 tables

  15. Standard Error Computations for Uncertainty Quantification in Inverse Problems: Asymptotic Theory vs. Bootstrapping.

    Science.gov (United States)

    Banks, H T; Holm, Kathleen; Robbins, Danielle

    2010-11-01

    We computationally investigate two approaches for uncertainty quantification in inverse problems for nonlinear parameter dependent dynamical systems. We compare the bootstrapping and asymptotic theory approaches for problems involving data with several noise forms and levels. We consider both constant variance absolute error data and relative error which produces non-constant variance data in our parameter estimation formulations. We compare and contrast parameter estimates, standard errors, confidence intervals, and computational times for both bootstrapping and asymptotic theory methods.
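
    A compact way to see the two approaches side by side is to fit a simple nonlinear model and compute standard errors both ways. The sketch below uses an exponential-decay model with constant-variance noise purely as an illustration; the models, noise levels and replicate counts examined in the paper are not reproduced here.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(0)

        def model(t, a, b):
            return a * np.exp(-b * t)

        # Synthetic constant-variance (absolute error) data -- illustrative only.
        t = np.linspace(0.0, 5.0, 50)
        y = model(t, 2.0, 0.8) + rng.normal(0.0, 0.05, t.size)

        # Asymptotic theory: standard errors from the estimated covariance matrix.
        popt, pcov = curve_fit(model, t, y, p0=(1.0, 1.0))
        se_asymptotic = np.sqrt(np.diag(pcov))

        # Bootstrapping: refit to resampled residuals and take the spread.
        residuals = y - model(t, *popt)
        boot = []
        for _ in range(500):
            y_star = model(t, *popt) + rng.choice(residuals, residuals.size, replace=True)
            p_star, _ = curve_fit(model, t, y_star, p0=popt)
            boot.append(p_star)
        se_bootstrap = np.std(np.array(boot), axis=0, ddof=1)

        print("asymptotic SE:", se_asymptotic)
        print("bootstrap  SE:", se_bootstrap)

    For well-behaved problems the two sets of standard errors are close; the bootstrap trades a much larger computational cost for fewer distributional assumptions, which is the trade-off the study quantifies.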

  16. Internationally Standardized Reporting (Checklist) on the Sustainable Development Performance of Uranium Mining and Processing Sites

    International Nuclear Information System (INIS)

    Harris, Frank

    2014-01-01

    The Internationally Standardized Reporting Checklist on the Sustainable Development Performance of Uranium Mining and Processing Sites: • A mutual and beneficial effort between a core group of uranium miners and nuclear utilities; • An approach based on long-term experience, international policies and sustainable development principles; • A process to optimize the reporting mechanism, tools and efforts; • 11 sections focused on the main sustainable development subject matters known at the operational and headquarters level. The WNA will make the sustainable development checklist available to member utilities and uranium suppliers. Utilities and suppliers are encouraged to use the checklist for sustainable development verification.

  17. Computational prediction of muon stopping sites using ab initio random structure searching (AIRSS)

    Science.gov (United States)

    Liborio, Leandro; Sturniolo, Simone; Jochym, Dominik

    2018-04-01

    The stopping site of the muon in a muon-spin relaxation experiment is in general unknown. There are some techniques that can be used to guess the muon stopping site, but they often rely on approximations and are not generally applicable to all cases. In this work, we propose a purely theoretical method to predict muon stopping sites in crystalline materials from first principles. The method is based on a combination of ab initio calculations, random structure searching, and machine learning, and it has successfully predicted the MuT and MuBC stopping sites of muonium in Si, diamond, and Ge, as well as the muonium stopping site in LiF, without any recourse to experimental results. The method makes use of Soprano, a Python library developed to aid ab initio computational crystallography, that was publicly released and contains all the software tools necessary to reproduce our analysis.

  18. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the se...

  19. Citation analysis of Computer Standards & Interfaces: Technical or also non-technical focus?

    NARCIS (Netherlands)

    G. van de Kaa (Geerten); H.J. de Vries (Henk); B. Baskaran (Balakumaran)

    2015-01-01

    textabstractThis paper analyzes to which extent research published in Computer Standards & Interfaces (CSI) has a technical focus. We find that CSI has been following its scope very closely in the last three years and that the majority of its publications have a technical focus. Articles published

  20. Archaeology Through Computational Linguistics: Inscription Statistics Predict Excavation Sites of Indus Valley Artifacts.

    Science.gov (United States)

    Recchia, Gabriel L; Louwerse, Max M

    2016-11-01

    Computational techniques comparing co-occurrences of city names in texts allow the relative longitudes and latitudes of cities to be estimated algorithmically. However, these techniques have not been applied to estimate the provenance of artifacts with unknown origins. Here, we estimate the geographic origin of artifacts from the Indus Valley Civilization, applying methods commonly used in cognitive science to the Indus script. We show that these methods can accurately predict the relative locations of archeological sites on the basis of artifacts of known provenance, and we further apply these techniques to determine the most probable excavation sites of four sealings of unknown provenance. These findings suggest that inscription statistics reflect historical interactions among locations in the Indus Valley region, and they illustrate how computational methods can help localize inscribed archeological artifacts of unknown origin. The success of this method offers opportunities for the cognitive sciences in general and for computational anthropology specifically. Copyright © 2015 Cognitive Science Society, Inc.
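
    The core idea, estimating relative positions from how often names co-occur, can be sketched with classical multidimensional scaling applied to a co-occurrence-derived dissimilarity matrix. The toponym labels and counts below are invented for illustration, and the published method is considerably more sophisticated than this.

        import numpy as np

        # Toy co-occurrence matrix for four hypothetical toponyms/sign groups;
        # higher counts mean the names appear together in more inscriptions.
        names = ["A", "B", "C", "D"]
        cooc = np.array([[0, 9, 2, 1],
                         [9, 0, 3, 2],
                         [2, 3, 0, 8],
                         [1, 2, 8, 0]], dtype=float)

        # Turn similarities into dissimilarities and apply classical MDS to
        # recover relative 2-D positions (only relative geometry is meaningful).
        dissim = cooc.max() - cooc
        np.fill_diagonal(dissim, 0.0)
        d2 = dissim ** 2
        n = d2.shape[0]
        j = np.eye(n) - np.ones((n, n)) / n          # centering matrix
        b = -0.5 * j @ d2 @ j
        eigval, eigvec = np.linalg.eigh(b)
        order = np.argsort(eigval)[::-1][:2]
        coords = eigvec[:, order] * np.sqrt(np.maximum(eigval[order], 0.0))
        for name, (x, y) in zip(names, coords):
            print(f"{name}: ({x:.2f}, {y:.2f})")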

  1. Relative resilience to noise of standard and sequential approaches to measurement-based quantum computation

    Science.gov (United States)

    Gallagher, C. B.; Ferraro, A.

    2018-05-01

    A possible alternative to the standard model of measurement-based quantum computation (MBQC) is offered by the sequential model of MBQC—a particular class of quantum computation via ancillae. Although these two models are equivalent under ideal conditions, their relative resilience to noise in practical conditions is not yet known. We analyze this relationship for various noise models in the ancilla preparation and in the entangling-gate implementation. The comparison of the two models is performed utilizing both the gate infidelity and the diamond distance as figures of merit. Our results show that in the majority of instances the sequential model outperforms the standard one in regard to a universal set of operations for quantum computation. Further investigation is made into the performance of sequential MBQC in experimental scenarios, thus setting benchmarks for possible cavity-QED implementations.
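
    For reference, the two figures of merit named above have standard definitions in the quantum information literature (given here from general usage, not quoted from the paper): the average gate infidelity of a noisy channel E with respect to a target unitary U, and the diamond distance between two channels E and F,

        r(\mathcal{E},U) = 1 - \int d\psi\,
          \langle\psi|\, U^{\dagger}\,\mathcal{E}\bigl(|\psi\rangle\langle\psi|\bigr)\, U \,|\psi\rangle ,
        \qquad
        d_{\diamond}(\mathcal{E},\mathcal{F})
          = \tfrac{1}{2}\,\|\mathcal{E}-\mathcal{F}\|_{\diamond}
          = \tfrac{1}{2}\,\sup_{\rho}\,
            \bigl\|\bigl((\mathcal{E}-\mathcal{F})\otimes \mathrm{id}\bigr)(\rho)\bigr\|_{1},

    where the integral is over Haar-random pure states. The infidelity captures average-case performance while the diamond distance bounds worst-case distinguishability, which is why the study reports both.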

  2. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology.

  3. New reclamation standards for oil and gas well sites and pipelines in the agricultural land reserve

    International Nuclear Information System (INIS)

    Jones, P.

    1995-01-01

    Reclamation standards are a necessity because of increasing density of oil and gas developments, and the number of wells which may be abandoned over the next few years. All petroleum industry users of land are subject to the Agricultural Land Commission Act and require the approval of the Commission. The new General Order 293/95 was discussed, the purpose of which is to streamline existing regulations and to clarify reclamation standards. The new standards are similar to requirements currently in place in northwestern Alberta because landforms, soils, and land there are similar to those that exist in the Peace River region of B.C. Adopting similar requirements also has the added benefit of providing consistency for the industry between adjacent jurisdictions. In essence, the official view is that petroleum developments are temporary activities as long as the land is restored to its original or better condition, and the disruption to farm operations is minimal. Major provisions of General Order 293/95 were reviewed. It was noted that site contamination and the disposal of wastes were not addressed in the General Order. The reason for this is that these matters fall under the jurisdiction of other government agencies. 7 refs

  4. US Department of Energy response to standards for remedial actions at inactive uranium processing sites: Proposed rule

    International Nuclear Information System (INIS)

    1988-01-01

    The Title I groundwater standards for inactive uranium mill tailings sites, which were promulgated on January 5, 1983, by the US Environmental Protection Agency (EPA) for the Uranium Mill Tailings Remedial Action (UMTRA) Project, were remanded to the EPA on September 3, 1985, by the US Tenth Circuit Court of Appeals. The Court instructed the EPA to compile general groundwater standards for all Title I sites. On September 24, 1987, the EPA published proposed standards (52FR36000-36008) in response to the remand. This report includes an evaluation of the potential effects of the proposed EPA groundwater standards on the UMTRA Project, as well as a discussion of the DOE's position on the proposed standards. The report also contains an appendix that provides supporting information and cost analyses. In order to assess the impacts of the proposed EPA standards, this report summarizes the proposed EPA standards in Section 2.0. The next three sections assess the impacts of the three parts of the EPA standards: Subpart A considers disposal sites; Subpart B is concerned with restoration at processing sites; and Subpart C addresses supplemental standards. Section 6.0 integrates previous sections into a recommendations section. Section 7.0 contains the DOE response to questions posed by the EPA in the preamble to the proposed standards. 6 refs., 5 figs., 3 tabs

  5. Target localization on standard axial images in computed tomography (CT) stereotaxis for functional neurosurgery - a technical note

    International Nuclear Information System (INIS)

    Patil, A.-A.

    1986-01-01

    A simple technique for marking a functional neurosurgery target on a computed tomography (CT) axial image is described. This permits the use of standard axial images for computed tomography (CT) stereotaxis in functional neurosurgery. (Author)

  6. Computer Vision Photogrammetry for Underwater Archaeological Site Recording in a Low-Visibility Environment

    Science.gov (United States)

    Van Damme, T.

    2015-04-01

    Computer Vision Photogrammetry allows archaeologists to accurately record underwater sites in three dimensions using simple two-dimensional picture or video sequences, automatically processed in dedicated software. In this article, I share my experience in working with one such software package, namely PhotoScan, to record a Dutch shipwreck site. In order to demonstrate the method's reliability and flexibility, the site in question is reconstructed from simple GoPro footage, captured in low-visibility conditions. Based on the results of this case study, Computer Vision Photogrammetry compares very favourably to manual recording methods both in recording efficiency and in the quality of the final results. In a final section, the significance of Computer Vision Photogrammetry is then assessed from a historical perspective, by placing the current research in the wider context of about half a century of successful use of Analytical and later Digital photogrammetry in the field of underwater archaeology. I conclude that while photogrammetry has been used in our discipline for several decades now, for various reasons the method was only ever used by a relatively small percentage of projects. This is likely to change in the near future since, compared to the 'traditional' photogrammetry approaches employed in the past, today Computer Vision Photogrammetry is easier to use, more reliable and more affordable than ever before, while at the same time producing more accurate and more detailed three-dimensional results.

  7. A proposal for standardizing computed tomography reports on abdominal aortic aneurysms

    International Nuclear Information System (INIS)

    Torlai, Fabiola Goda; Meirelles, Gustavo S. Portes; Miranda Junior, Fausto; Fonseca, Jose Honorio A.P. da; Ajzen, Sergio; D'Ippolito, Giuseppe

    2006-01-01

    Objective: to propose a model to standardize computed tomography reports on abdominal aortic aneurysms. Materials and methods: interviews were carried out with members of the Vascular Surgery Division of our institution, in the period between April and October 2004, aiming at developing a standardized model of computed tomography reports on abdominal aortic aneurysms. Based on this model, a questionnaire was elaborated and sent to nine other surgeons, all of them experienced in the field of abdominal aortic surgery. The questionnaire response rate was 55.5% (5/9). Results: the most frequently mentioned parameters of interest for evaluation of abdominal aortic aneurysms were: maximum diameter of proximal aortic neck, proximal aortic neck length to lower renal arteries, shape of proximal aortic neck, maximum diameter of the aneurysm and diameter of the common iliac arteries. These data allowed the development of a proposal for a model to standardize computed tomography reports. Conclusion: a model for standardized tomographic analysis of abdominal aortic aneurysms has met vascular surgeons' needs for following up patients and planning their treatment. (author)
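
    The parameters that the surgeons ranked most relevant map naturally onto a structured report template. The sketch below is one possible encoding, with field names paraphrased from the list above and example values invented; it is not the model published by the authors.

        from dataclasses import dataclass, asdict
        from typing import Optional

        @dataclass
        class AbdominalAorticAneurysmCTReport:
            """Structured fields mirroring the parameters the surveyed surgeons
            rated most relevant; names and units are illustrative."""
            proximal_neck_max_diameter_mm: float
            proximal_neck_length_to_lower_renal_mm: float
            proximal_neck_shape: str                 # e.g. "straight", "conical"
            aneurysm_max_diameter_mm: float
            right_common_iliac_diameter_mm: float
            left_common_iliac_diameter_mm: float
            comments: Optional[str] = None

        report = AbdominalAorticAneurysmCTReport(
            proximal_neck_max_diameter_mm=24.0,
            proximal_neck_length_to_lower_renal_mm=18.0,
            proximal_neck_shape="conical",
            aneurysm_max_diameter_mm=58.0,
            right_common_iliac_diameter_mm=16.0,
            left_common_iliac_diameter_mm=14.0,
        )
        print(asdict(report))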

  8. The family of standard hydrogen monitoring system computer software design description: Revision 2

    International Nuclear Information System (INIS)

    Bender, R.M.

    1994-01-01

    In March 1990, 23 waste tanks at the Hanford Nuclear Reservation were identified as having the potential for the buildup of gas to a flammable or explosive level. As a result of the potential for hydrogen gas buildup, a project was initiated to design a standard hydrogen monitoring system (SHMS) for use at any waste tank to analyze gas samples for hydrogen content. Since it was originally deployed three years ago, two variations of the original system have been developed: the SHMS-B and SHMS-C. All three are currently in operation at the tank farms and will be discussed in this document. To avoid confusion in this document, when a feature is common to all three of the SHMS variants, it will be referred to as ''The family of SHMS.'' When it is specific to only one or two, they will be identified. The purpose of this computer software design document is to provide the following: the computer software requirements specification that documents the essential requirements of the computer software and its external interfaces; the computer software design description; the computer software user documentation for using and maintaining the computer software and any dedicated hardware; and the requirements for computer software design verification and validation

  9. Standard practice for digital imaging and communication nondestructive evaluation (DICONDE) for computed radiography (CR) test methods

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of computed radiography (CR) imaging and data acquisition equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This practice is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information objec...

  10. Standards for digital computers used in non-safety nuclear power plant applications: objectives and limitations

    International Nuclear Information System (INIS)

    Rorer, D.C.; Long, A.B.

    1977-01-01

    There are currently a number of efforts to develop standards which would apply to digital computers used in nuclear power plants for functions other than those directly involving plant protection (for example, ANS: 4.3.3 Subworking Group in the U.S., IEC 45A/WGA1 Subcommittee in Europe). The impetus for this activity is discussed and generally attributed to the realization that nonsafety systems computers may affect the assumptions used as the design bases for safety systems, the sizable economic loss which can result from the failure of a critical computer application, and the lingering concern about the misapplication of a still-new technology. At the same time, it is pointed out that these standards may create additional obstacles for the use of this new technology which are not present in the application of more conventional instrumentation and control equipment. Much of the U.S. effort has been directed toward the problem of validation of computer systems in which the potential exists for unplanned interactions between various functions in a multiprogram environment, using common hardware in a time-sharing mode. The goal is to develop procedures for the specification, development implementation, and documentation of testable, modular systems which, in the absence of proven quantitative techniques for assessing software reliability, are felt to provide reasonable assurance that the computer system will function as planned

  11. 40 CFR 61.151 - Standard for inactive waste disposal sites for asbestos mills and manufacturing and fabricating...

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Standard for inactive waste disposal sites for asbestos mills and manufacturing and fabricating operations. 61.151 Section 61.151 Protection... inactive waste disposal sites for asbestos mills and manufacturing and fabricating operations. Each owner...

  12. Computational design of trimeric influenza-neutralizing proteins targeting the hemagglutinin receptor binding site

    Energy Technology Data Exchange (ETDEWEB)

    Strauch, Eva-Maria; Bernard, Steffen M.; La, David; Bohn, Alan J.; Lee, Peter S.; Anderson, Caitlin E.; Nieusma, Travis; Holstein, Carly A.; Garcia, Natalie K.; Hooper, Kathryn A.; Ravichandran, Rashmi; Nelson, Jorgen W.; Sheffler, William; Bloom, Jesse D.; Lee, Kelly K.; Ward, Andrew B.; Yager, Paul; Fuller, Deborah H.; Wilson, Ian A.; Baker, David (UWASH); (Scripps); (FHCRC)

    2017-06-12

    Many viral surface glycoproteins and cell surface receptors are homo-oligomers [1-4], and thus can potentially be targeted by geometrically matched homo-oligomers that engage all subunits simultaneously to attain high avidity and/or lock subunits together. The adaptive immune system cannot generally employ this strategy since the individual antibody binding sites are not arranged with appropriate geometry to simultaneously engage multiple sites in a single target homo-oligomer. We describe a general strategy for the computational design of homo-oligomeric protein assemblies with binding functionality precisely matched to homo-oligomeric target sites [5-8]. In the first step, a small protein is designed that binds a single site on the target. In the second step, the designed protein is assembled into a homo-oligomer such that the designed binding sites are aligned with the target sites. We use this approach to design high-avidity trimeric proteins that bind influenza A hemagglutinin (HA) at its conserved receptor binding site. The designed trimers capture and detect HA in a paper-based diagnostic format, neutralize influenza in cell culture, and completely protect mice when given as a single dose 24 h before or after challenge with influenza.

  13. Computational Recognition of RNA Splice Sites by Exact Algorithms for the Quadratic Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Anja Fischer

    2015-06-01

    Full Text Available One fundamental problem of bioinformatics is the computational recognition of DNA and RNA binding sites. Given a set of short DNA or RNA sequences of equal length such as transcription factor binding sites or RNA splice sites, the task is to learn a pattern from this set that allows the recognition of similar sites in another set of DNA or RNA sequences. Permuted Markov (PM models and permuted variable length Markov (PVLM models are two powerful models for this task, but the problem of finding an optimal PM model or PVLM model is NP-hard. While the problem of finding an optimal PM model or PVLM model of order one is equivalent to the traveling salesman problem (TSP, the problem of finding an optimal PM model or PVLM model of order two is equivalent to the quadratic TSP (QTSP. Several exact algorithms exist for solving the QTSP, but it is unclear if these algorithms are capable of solving QTSP instances resulting from RNA splice sites of at least 150 base pairs in a reasonable time frame. Here, we investigate the performance of three exact algorithms for solving the QTSP for ten datasets of splice acceptor sites and splice donor sites of five different species and find that one of these algorithms is capable of solving QTSP instances of up to 200 base pairs with a running time of less than two days.
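
    As a toy illustration of how an order-one permuted Markov model reduces to a traveling salesman problem: estimate pairwise dependencies (here, mutual information) between alignment positions and search for the position ordering that maximizes the summed dependence along the chain. This is only a simplified sketch with invented sequences; the order-two case handled by the QTSP algorithms discussed above is not reproduced here.

        import itertools
        import math
        from collections import Counter

        # Toy aligned binding-site sequences (hypothetical, equal length).
        seqs = ["GTAAGT", "GTGAGT", "GTAAGC", "GCAAGT"]
        L = len(seqs[0])

        def mutual_information(i: int, j: int) -> float:
            """Empirical mutual information between alignment columns i and j."""
            pair = Counter((s[i], s[j]) for s in seqs)
            pi = Counter(s[i] for s in seqs)
            pj = Counter(s[j] for s in seqs)
            n = len(seqs)
            mi = 0.0
            for (a, b), c in pair.items():
                mi += (c / n) * math.log2((c / n) / ((pi[a] / n) * (pj[b] / n)))
            return mi

        # Order-one permuted Markov model: find the ordering of positions that
        # maximizes total adjacent dependence (brute force; exact TSP/QTSP
        # algorithms are needed for realistic site lengths).
        best = max(itertools.permutations(range(L)),
                   key=lambda p: sum(mutual_information(p[k], p[k + 1])
                                     for k in range(L - 1)))
        print("best position ordering:", best)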

  14. The identification of sites of biodiversity conservation significance: progress with the application of a global standard

    Directory of Open Access Journals (Sweden)

    M.N. Foster

    2012-08-01

    Full Text Available As a global community, we have a responsibility to ensure the long-term future of our natural heritage. As part of this, it is incumbent upon us to do all that we can to reverse the current trend of biodiversity loss, using all available tools at our disposal. One effective means is the safeguarding of those sites that are of highest global priority for the conservation of biodiversity, whether through formal protected areas, community-managed reserves, multiple-use areas, or other means. This special issue of the Journal of Threatened Taxa examines the application of the Key Biodiversity Area (KBA) approach to identifying such sites. Given the global mandate expressed through policy instruments such as the Convention on Biological Diversity (CBD), the KBA approach can help countries meet obligations in an efficient and transparent manner. KBA methodology follows the well-established general principles of vulnerability and irreplaceability, and while it aims to be a globally standardized approach, it recognizes the fundamental need for the process to be led at local and national levels. In this series of papers the application of the KBA approach is explored in seven countries or regions: the Caribbean, Indo-Burma, Japan, Macedonia, Mediterranean Algeria, the Philippines and the Upper Guinea region of West Africa. This introductory article synthesizes some of the common main findings and provides a comparison of key summary statistics.

  15. IC3 Internet and Computing Core Certification Global Standard 4 study guide

    CERN Document Server

    Rusen, Ciprian Adrian

    2015-01-01

    Hands-on IC3 prep, with expert instruction and loads of tools IC3: Internet and Computing Core Certification Global Standard 4 Study Guide is the ideal all-in-one resource for those preparing to take the exam for the internationally-recognized IT computing fundamentals credential. Designed to help candidates pinpoint weak areas while there's still time to brush up, this book provides one hundred percent coverage of the exam objectives for all three modules of the IC3-GS4 exam. Readers will find clear, concise information, hands-on examples, and self-paced exercises that demonstrate how to per

  16. BIPM direct on-site Josephson voltage standard comparisons: 20 years of results

    International Nuclear Information System (INIS)

    Solve, Stephane; Stock, Michael

    2012-01-01

    The discovery of the Josephson effect has for the first time given national metrology institutes (NMIs) the possibility of maintaining voltage references which are stable in time. In addition, the introduction in 1990 of a conventional value for the Josephson constant, K_J-90, has greatly improved world-wide consistency among representations of the volt. For 20 years, the Bureau International des Poids et Mesures (BIPM) has conducted an ongoing, direct, on-site key comparison of Josephson voltage standards among NMIs under the denominations BIPM.EM-K10.a (1 V) and BIPM.EM-K10.b (10 V) in the framework of the mutual recognition arrangement (CIPM MRA). The results of 41 comparisons illustrate the consistency among primary voltage standards and have demonstrated that a relative total uncertainty of a few parts in 10¹⁰ is achievable if a few precautions are taken with regard to the measurement set-up. Of particular importance are the grounding, efficient filters and high insulation resistance of the measurement leads, and clean microwave distribution along the propagation line to the Josephson array. This paper reviews the comparison scheme and technical issues that need to be taken into account to achieve a relative uncertainty at the level of a few parts in 10¹⁰ or even a few parts in 10¹¹ in the best cases. (paper)
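
    The consistency discussed above ultimately rests on the Josephson voltage-frequency relation. With the 1990 conventional value of the Josephson constant, a junction (or series array) biased on the n-th constant-voltage step under microwave irradiation at frequency f develops

        V = \frac{n\,f}{K_{\mathrm{J\text{-}90}}}, \qquad
        K_{\mathrm{J\text{-}90}} = 483\,597.9\ \mathrm{GHz/V},

    so the output voltage depends only on an integer and a frequency, which is what makes comparisons at the level of parts in 10¹⁰ meaningful.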

  17. Development of methods and criteria for a standardized evaluation of contaminated sites and abandoned waste disposal sites particularly concerning their ground water contamination potential. Pt. 1. Final Report

    International Nuclear Information System (INIS)

    Kerndorff, H.; Schleyer, R.; Arneth, J.D.; Struppe, T.; Milde, G.

    1994-01-01

    Contaminated sites should be evaluated to such an extent that nearly all risks for man and the environment can be safely estimated. An assessment approach for such sites is presented which combines a substance-specific and a site-specific evaluation. It is a standardized, path-specific concept in which - as an example - the contamination path ''waste - groundwater - drinking-water'' is investigated and evaluated in detail. Path-specific main contaminants are established on a statistical basis and ranked according to normalized evaluation numbers of 1-100. Their toxicity potential is calculated using a particular standardized method developed for this purpose. Main contaminants having a high toxicity potential are called priority contaminants. For the most important exposure/usage on this contamination path, the drinking-water catchment, hygienically and toxicologically based standards are presented. Together with site-specific conditions and the likewise path-specific and normalized transfer/persistency potential of the priority contaminants, it is possible to arrive at a site- and usage/exposure-specific evaluation of individual sites. (orig.) [de

  18. Evaluation of health risks associated with proposed ground water standards at selected inactive uranium mill-tailings sites

    International Nuclear Information System (INIS)

    Hamilton, L.D.; Medeiros, W.H.; Meinhold, A.; Morris, S.C.; Moskowitz, P.D.; Nagy, J.; Lackey, K.

    1989-04-01

    The US Environmental Protection Agency (EPA) has proposed ground water standards applicable to all inactive uranium mill-tailings sites. The proposed standards include maximum concentration limits (MCL) for currently regulated drinking water contaminants, as well as the addition of standards for molybdenum, uranium, nitrate, and radium-226 plus radium-228. The proposed standards define the point of compliance to be everywhere downgradient of the tailings pile, and require ground water remediation to drinking water standards if MCLs are exceeded. This document presents a preliminary description of the Phase 2 efforts. The potential risks and hazards at Gunnison, Colorado and Lakeview, Oregon were estimated to demonstrate the need for a risk assessment and the usefulness of a cost-benefit approach in setting supplemental standards and determining the need for and level of restoration at UMTRA sites. 8 refs., 12 tabs

  19. Evaluation of health risks associated with proposed ground water standards at selected inactive uranium mill-tailings sites

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, L.D.; Medeiros, W.H.; Meinhold, A.; Morris, S.C.; Moskowitz, P.D.; Nagy, J.; Lackey, K.

    1989-04-01

    The US Environmental Protection Agency (EPA) has proposed ground water standards applicable to all inactive uranium mill-tailings sites. The proposed standards include maximum concentration limits (MCL) for currently regulated drinking water contaminants, as well as the addition of standards for molybdenum, uranium, nitrate, and radium-226 plus radium-228. The proposed standards define the point of compliance to be everywhere downgradient of the tailings pile, and require ground water remediation to drinking water standards if MCLs are exceeded. This document presents a preliminary description of the Phase 2 efforts. The potential risks and hazards at Gunnison, Colorado and Lakeview, Oregon were estimated to demonstrate the need for a risk assessment and the usefulness of a cost-benefit approach in setting supplemental standards and determining the need for and level of restoration at UMTRA sites. 8 refs., 12 tabs.

  20. Verification of thermal-hydraulic computer codes against standard problems for WWER reflooding

    International Nuclear Information System (INIS)

    Alexander D Efanov; Vladimir N Vinogradov; Victor V Sergeev; Oleg A Sudnitsyn

    2005-01-01

    Full text of publication follows: The computational assessment of reactor core components behavior under accident conditions is impossible without knowledge of the thermal-hydraulic processes occurring in this case. The adequacy of the results obtained using the computer codes to the real processes is verified by carrying out a number of standard problems. In 2000-2003, the fulfillment of three Russian standard problems on WWER core reflooding was arranged using the experiments on full-height electrically heated WWER 37-rod bundle model cooldown in regimes of bottom (SP-1), top (SP-2) and combined (SP-3) reflooding. The representatives from the eight MINATOM's organizations took part in this work, in the course of which the 'blind' and posttest calculations were performed using various versions of the RELAP5, ATHLET, CATHARE, COBRA-TF, TRAP, KORSAR computer codes. The paper presents a brief description of the test facility, test section, test scenarios and conditions as well as the basic results of computational analysis of the experiments. The analysis of the test data revealed a significantly non-one-dimensional nature of cooldown and rewetting of heater rods heated up to a high temperature in a model bundle. This was most pronounced at top and combined reflooding. The verification of the model reflooding computer codes showed that most of computer codes fairly predict the peak rod temperature and the time of bundle cooldown. The exception is provided by the results of calculations with the ATHLET and CATHARE codes. The nature and rate of rewetting front advance in the lower half of the bundle are fairly predicted practically by all computer codes. The disagreement between the calculations and experimental results for the upper half of the bundle is caused by the difficulties of computational simulation of multidimensional effects by 1-D computer codes. In this regard, a quasi-two-dimensional computer code COBRA-TF offers certain advantages. Overall, the closest

  1. 2 December 2003: Registration of Computers Mandatory for the entire CERN Site

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation has been put into place in the IT buildings, building 40 and the Prévessin site, and will cover the whole of CERN by 2 December 2003. We therefore recommend strongly that you register all your computers in CERN's network database, including all network access cards (Ethernet AND wireless), as soon as possible without waiting for the access restriction to take force. This will allow you to access the network without interruption and help IT service providers to contact you in case of problems (security problems, viruses, etc.). - If you have a CERN NICE/mail computing account, register at: http://cern.ch/register/ (CERN Intranet page) - If you don't have a CERN NICE/mail computing account (e.g. short term visitors), register at: http://cern.ch/registerVisitorComputer/...

  2. The effect of switch control site on computer skills of infants and toddlers.

    Science.gov (United States)

    Glickman, L; Deitz, J; Anson, D; Stewart, K

    1996-01-01

    The purpose of this study was to determine whether switch control site (hand vs. head) affects the age at which children can successfully activate a computer to play a cause-and-effect game. The sample consisted of 72 participants randomly divided into two groups (head switch and hand switch), with stratification for gender and age (9-11 months, 12-14 months, 15-17 months). All participants were typically developing. After a maximum of 5 min of training, each participant was given five opportunities to activate a Jelly Bean switch to play a computer game. Competency was defined as four to five successful switch activations. Most participants in the 9-month to 11-month age group could successfully use a hand switch to activate a computer, and for the 15-month to 17-month age group, 100% of the participants met with success. By contrast, in the head switch condition, approximately one third of the participants in each of the three age ranges were successful in activating the computer to play a cause-and-effect game. The findings from this study provide developmental guidelines for using switches (head vs. hand) to activate computers to play cause-and-effect games and suggest that the clinician may consider introducing basic computer and switch skills to children as young as 9 months of age. However, the clinician is cautioned that the head switch may be more difficult to master than the hand switch and that additional research involving children with motor impairments is needed.

  3. Effects of electromyographic and mechanomyographic biofeedback on upper trapezius muscle activity during standardized computer work

    DEFF Research Database (Denmark)

    Madeleine, Pascal; Vedsted, Pernille; Blangsted, Anne Katrine

    2006-01-01

    The purpose of this laboratory study was to investigate the effects of surface electromyography (EMG)- and mechanomyography (MMG)-based audio and visual biofeedback during computer work. Standardized computer work was performed for 3 min with/without time constraint and biofeedback in a randomized order. Muscle activity, as well as the work performance in terms of number of completed graphs/mouse clicks/errors, the rating of perceived exertion (RPE) and the usefulness of the biofeedback, were assessed. The duration of muscle activity above the threshold was significantly lower with MMG compared with EMG as the biofeedback source, suggesting MMG as a potential alternative to EMG in ergonomics. A lowering of the trapezius muscle activity may contribute to diminishing the risk of work-related musculoskeletal disorders development.

  4. Managing Data, Provenance and Chaos through Standardization and Automation at the Georgia Coastal Ecosystems LTER Site

    Science.gov (United States)

    Sheldon, W.

    2013-12-01

    Managing data for a large, multidisciplinary research program such as a Long Term Ecological Research (LTER) site is a significant challenge, but also presents unique opportunities for data stewardship. LTER research is conducted within multiple organizational frameworks (i.e. a specific LTER site as well as the broader LTER network), and addresses both specific goals defined in an NSF proposal and broader goals of the network; therefore, every LTER data set can be linked to rich contextual information to guide interpretation and comparison. The challenge is how to link the data to this wealth of contextual metadata. At the Georgia Coastal Ecosystems LTER we developed an integrated information management system (GCE-IMS) to manage, archive and distribute data, metadata and other research products as well as manage project logistics, administration and governance (figure 1). This system allows us to store all project information in one place, and provide dynamic links through web applications and services to ensure content is always up to date on the web as well as in data set metadata. The database model supports tracking changes over time in personnel roles, projects and governance decisions, allowing these databases to serve as canonical sources of project history. Storing project information in a central database has also allowed us to standardize both the formatting and content of critical project information, including personnel names, roles, keywords, place names, attribute names, units, and instrumentation, providing consistency and improving data and metadata comparability. Lookup services for these standard terms also simplify data entry in web and database interfaces. We have also coupled the GCE-IMS to our MATLAB- and Python-based data processing tools (i.e. through database connections) to automate metadata generation and packaging of tabular and GIS data products for distribution. Data processing history is automatically tracked throughout the data

  5. Toward the rational use of standardized infection ratios to benchmark surgical site infections.

    Science.gov (United States)

    Fukuda, Haruhisa; Morikane, Keita; Kuroki, Manabu; Taniguchi, Shinichiro; Shinzato, Takashi; Sakamoto, Fumie; Okada, Kunihiko; Matsukawa, Hiroshi; Ieiri, Yuko; Hayashi, Kouji; Kawai, Shin

    2013-09-01

    The National Healthcare Safety Network transitioned from surgical site infection (SSI) rates to the standardized infection ratio (SIR), calculated by statistical models that included perioperative factors (surgical approach and surgery duration). Rationally, however, only patient-related variables should be included in the SIR model. Logistic regression was performed to predict the expected SSI rate in 2 models that included or excluded perioperative factors. Observed and expected SSI rates were used to calculate the SIR for each participating hospital. The difference of SIR in each model was then evaluated. Surveillance data were collected from a total of 1,530 colon surgery patients and 185 SSIs. The C-index in the model with perioperative factors was statistically greater than that in the model including patient-related factors only (0.701 vs 0.621, respectively). Because perioperative factors reflect the operative process or the competence of surgical teams, these factors should not be considered predictive variables. Copyright © 2013 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
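
    A minimal sketch of the standardized infection ratio (SIR) arithmetic described above: expected SSI probabilities come from a patient-factor model, and the SIR is observed cases divided by the sum of expected probabilities. The covariates, data, and hospital grouping below are invented, and the model is a generic logistic regression rather than the authors' exact specification.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical patient-level data: rows = operations, columns = risk factors (age, ASA >= 3).
    X = np.array([[72, 1], [55, 0], [64, 1], [48, 0], [81, 1], [59, 0]])
    y = np.array([1, 0, 0, 0, 1, 0])   # SSI observed (1) or not (0)

    # Fit a patient-factor-only model to get each operation's expected SSI probability.
    model = LogisticRegression().fit(X, y)
    expected = model.predict_proba(X)[:, 1]

    # SIR for a hospital = observed SSIs / sum of expected probabilities for its cases.
    hospital_cases = slice(0, 3)           # pretend the first 3 operations belong to one hospital
    observed = y[hospital_cases].sum()
    sir = observed / expected[hospital_cases].sum()
    print(f"SIR = {sir:.2f}")
    ```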

  6. Multislice computed tomography: angiographic emulation versus standard assessment for detection of coronary stenoses

    Energy Technology Data Exchange (ETDEWEB)

    Schnapauff, Dirk; Hamm, Bernd; Dewey, Marc [Humboldt-Universitaet zu Berlin, Department of Radiology, Charite - Universitaetsmedizin Berlin, Chariteplatz 1, P.O. Box 10098, Berlin (Germany); Duebel, Hans-Peter; Baumann, Gert [Charite - Universitaetsmedizin Berlin, Department of Cardiology, Berlin (Germany); Scholze, Juergen [Charite - Universitaetsmedizin Berlin, Charite Outpatient Centre, Berlin (Germany)

    2007-07-15

    The present study investigated angiographic emulation of multislice computed tomography (MSCT) (catheter-like visualization) as an alternative approach to analyzing and visualizing findings in comparison with standard assessment. Thirty patients (120 coronary arteries) were randomly selected from 90 prospectively investigated patients with suspected coronary artery disease who underwent MSCT (16-slice scanner, 0.5 mm collimation, 400 ms rotation time) prior to conventional coronary angiography for comparison of both approaches. Sensitivity and specificity of angiographic emulation [81% (26/32) and 93% (82/88)] were not significantly different from those of standard assessment [88% (28/32) and 99% (87/88)], while the per-case analysis time was significantly shorter for angiographic emulation than for standard assessment (3.4 ± 1.5 vs 7.0 ± 2.5 min, P < 0.001). Both interventional and referring cardiologists preferred angiographic emulation over standard curved multiplanar reformations of MSCT coronary angiography for illustration, mainly because of improved overall lucidity and depiction of side branches (P < 0.001). In conclusion, angiographic emulation of MSCT reduces analysis time, yields a diagnostic accuracy comparable to that of standard assessment, and is preferred by cardiologists for visualization of results. (orig.)
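
    For readers who want to reproduce the per-vessel accuracy figures quoted above from the raw counts, a short sketch (the counts come from the abstract; the helper function is generic):

    ```python
    def sensitivity_specificity(tp, fn, tn, fp):
        """Return (sensitivity, specificity) from 2x2 diagnostic counts."""
        return tp / (tp + fn), tn / (tn + fp)

    # Angiographic emulation: 26/32 stenoses detected, 82/88 normal vessels correctly called.
    sens, spec = sensitivity_specificity(tp=26, fn=6, tn=82, fp=6)
    print(f"Emulation: sensitivity {sens:.0%}, specificity {spec:.0%}")   # ~81%, ~93%

    # Standard assessment: 28/32 and 87/88.
    sens, spec = sensitivity_specificity(tp=28, fn=4, tn=87, fp=1)
    print(f"Standard:  sensitivity {sens:.0%}, specificity {spec:.0%}")   # ~88%, ~99%
    ```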

  7. Current state of standardization in the field of dimensional computed tomography

    International Nuclear Information System (INIS)

    Bartscher, Markus; Härtig, Frank; Neuschaefer-Rube, Ulrich; Sato, Osamu

    2014-01-01

    Industrial x-ray computed tomography (CT) is a well-established non-destructive testing (NDT) technology and has been in use for decades. Moreover, CT has also started to become an important technology for dimensional metrology. But the requirements on dimensional CT, i.e., on performing coordinate measurements with CT, are different from those for NDT. For dimensional measurements, the position of interfaces or surfaces is of importance, while this is often less critical in NDT. Standardization plays an important role here as it can create trust in new measurement technologies, as is the case for dimensional CT. At the international level, ISO TC 213 WG 10 is working on specifications for dimensional CT. This paper highlights the demands on international standards in the field of dimensional CT and describes the current developments from the viewpoint of representatives of national and international standardization committees. Key aspects of the discussion are the material influence on the length measurement error E and how E can best be measured. A corresponding study was performed on hole plates as new reference standards for error testing of length measurements incorporating the material influence. We performed the corresponding measurement data analysis and present a further elaborated hole plate design. The authors also comment on different approaches currently pursued and give an outlook on upcoming developments as far as they can be foreseen. (paper)

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months, activities focused on data operations, on testing and reinforcing shift and operational procedures for data production and transfer, on MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools that help distributed sites monitor job and data workflows, and Analysis Support, which collects user experience and feedback during analysis activities and develops tools to increase efficiency. The development plan for DMWM for 2009/2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  9. Computer-based tools for decision support at the Hanford Site

    International Nuclear Information System (INIS)

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form that can be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the "glue" or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  10. Computer-based tools for decision support at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form that can be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the "glue" or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  11. Computer-based tools for decision support at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form that can be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the "glue" or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  12. The Application of Computer-Aided Discovery to Spacecraft Site Selection

    Science.gov (United States)

    Pankratius, V.; Blair, D. M.; Gowanlock, M.; Herring, T.

    2015-12-01

    The selection of landing and exploration sites for interplanetary robotic or human missions is a complex task. Historically it has been labor-intensive, with large groups of scientists manually interpreting a planetary surface across a variety of datasets to identify potential sites based on science and engineering constraints. This search process can be lengthy, and excellent sites may get overlooked when the aggregate value of site selection criteria is non-obvious or non-intuitive. As planetary data collection leads to Big Data repositories and a growing set of selection criteria, scientists will face a combinatorial search space explosion that requires scalable, automated assistance. We are currently exploring more general computer-aided discovery techniques in the context of planetary surface deformation phenomena that can lend themselves to application in the landing site search problem. In particular, we are developing a general software framework that addresses key difficulties: characterizing a given phenomenon or site based on data gathered from multiple instruments (e.g. radar interferometry, gravity, thermal maps, or GPS time series), and examining a variety of possible workflows whose individual configurations are optimized to isolate different features. The framework allows algorithmic pipelines and hypothesized models to be perturbed or permuted automatically within well-defined bounds established by the scientist. For example, even simple choices for outlier and noise handling or data interpolation can drastically affect the detectability of certain features. These techniques aim to automate repetitive tasks that scientists routinely perform in exploratory analysis, and make them more efficient and scalable by executing them in parallel in the cloud. We also explore ways in which machine learning can be combined with human feedback to prune the search space and converge to desirable results. Acknowledgements: We acknowledge support from NASA AIST
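
    The framework idea described above, systematically perturbing workflow choices within scientist-defined bounds, can be sketched as an exhaustive sweep over configuration options. The option names and the scoring function below are placeholders invented for illustration, not part of the actual framework.

    ```python
    from itertools import product

    # Hypothetical, scientist-defined bounds on each processing choice.
    workflow_options = {
        "outlier_filter": ["none", "median", "sigma_clip"],
        "interpolation": ["nearest", "linear", "cubic"],
        "noise_model": ["white", "colored"],
    }

    def detectability_score(config):
        """Placeholder for running a pipeline and scoring how well a feature is isolated."""
        return hash(frozenset(config.items())) % 100   # arbitrary stand-in metric

    # Enumerate every permitted workflow configuration and keep the best-scoring one.
    keys = list(workflow_options)
    best = max(
        (dict(zip(keys, combo)) for combo in product(*workflow_options.values())),
        key=detectability_score,
    )
    print("Best configuration found:", best)
    ```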

  13. Performance of three pencil-type ionization chambers (10 cm) in computed tomography standard beams

    International Nuclear Information System (INIS)

    Castro, Maysa C. de; Xavier, Marcos; Caldas, Linda V.E.

    2015-01-01

    The use of computed tomography (CT) has increased over the years, thus generating concern about the doses received by patients undergoing this procedure. It is therefore necessary to routinely perform beam dosimetry with a pencil-type ionization chamber, the detector most commonly utilized in quality control tests on this kind of equipment. The objective of this work was to perform characterization tests in standard CT beams, such as the saturation curve, polarity effect, ion collection efficiency and linearity of response, using three ionization chambers, one commercial and two developed at IPEN. (author)

  14. Building a computer-aided design capability using a standard time share operating system

    Science.gov (United States)

    Sobieszczanski, J.

    1975-01-01

    The paper describes how an integrated system of engineering computer programs can be built using a standard commercially available operating system. The discussion opens with an outline of the auxiliary functions that an operating system can perform for a team of engineers involved in a large and complex task. An example of a specific integrated system is provided to explain how the standard operating system features can be used to organize the programs into a simple and inexpensive but effective system. Applications to an aircraft structural design study are discussed to illustrate the use of an integrated system as a flexible and efficient engineering tool. The discussion concludes with an engineer's assessment of an operating system's capabilities and desirable improvements.

  15. Multiparametric multidetector computed tomography scanning on suspicion of hyperacute ischemic stroke: validating a standardized protocol

    Directory of Open Access Journals (Sweden)

    Felipe Torres Pacheco

    2013-06-01

    Multidetector computed tomography (MDCT) scanning has enabled the early diagnosis of hyperacute brain ischemia. We aimed at validating a standardized protocol to read and report MDCT techniques in a series of adult patients. The inter-observer agreement among the trained examiners was tested, and their results were compared with a standard reading. No false positives were observed, and an almost perfect agreement (Kappa>0.81) was documented when the CT angiography (CTA) and cerebral perfusion CT (CPCT) map data were added to the noncontrast CT (NCCT) analysis. The inter-observer agreement was higher for highly trained readers, corroborating the need for specific training to interpret these modern techniques. The authors recommend adding CTA and CPCT to the NCCT analysis in order to clarify the global analysis of structural and hemodynamic brain abnormalities. Our structured report is suitable as a script for the reproducible analysis of the MDCT of patients on suspicion of ischemic stroke.
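
    The agreement statistic reported above (Cohen's kappa) is computed from a confusion matrix of two readers' calls; the counts in this sketch are invented for illustration.

    ```python
    import numpy as np

    def cohens_kappa(confusion):
        """Cohen's kappa from a square reader-vs-reader confusion matrix."""
        confusion = np.asarray(confusion, dtype=float)
        n = confusion.sum()
        p_observed = np.trace(confusion) / n
        p_expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2
        return (p_observed - p_expected) / (1.0 - p_expected)

    # Hypothetical 2x2 table: reader A vs reader B calling "ischemia" / "no ischemia".
    table = [[40, 2],
             [1, 57]]
    print(f"kappa = {cohens_kappa(table):.2f}")   # values above 0.81 count as almost perfect agreement
    ```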

  16. Computer aid in rescue organisation on site in case of catastrophic situation on nuclear plant

    International Nuclear Information System (INIS)

    Teissier, M.

    1992-01-01

    The rescue organisation in case of a catastrophic situation is based on known principles: creation of medical buffer structures between the hazard spot, where injured people are collected, and rear hospitals, and triage of victims as urgent casualties. We propose computer aid to estimate the time needed to prepare and evacuate all the victims from the site, given the inventory of available means, the waiting periods and lengths of intervention, and the types and number of victims. Thus, it is possible to optimize this organisation, qualitatively and quantitatively, to improve the efficiency of rescue operations. (author)

  17. Computation of the Likelihood of Joint Site Frequency Spectra Using Orthogonal Polynomials

    Directory of Open Access Journals (Sweden)

    Claus Vogl

    2016-02-01

    In population genetics, information about evolutionary forces, e.g., mutation, selection and genetic drift, is often inferred from DNA sequence information. Generally, DNA consists of two long strands of nucleotides or sites that pair via the complementary bases cytosine and guanine (C and G), on the one hand, and adenine and thymine (A and T), on the other. With whole genome sequencing, most genomic information stored in the DNA has become available for multiple individuals of one or more populations, at least in humans and model species, such as fruit flies of the genus Drosophila. In a genome-wide sample of L sites for M (haploid) individuals, the state of each site may be made binary, by binning the complementary bases, e.g., C with G to C/G, and contrasting C/G to A/T, to obtain a “site frequency spectrum” (SFS). Two such samples of either a single population from different time-points or two related populations from a single time-point are called joint site frequency spectra (joint SFS). While mathematical models describing the interplay of mutation, drift and selection have been available for more than 80 years, calculation of exact likelihoods from joint SFS is difficult. Sufficient statistics for inference of, e.g., mutation or selection parameters that would make use of all the information in the genomic data are rarely available. Hence, often suites of crude summary statistics are combined in simulation-based computational approaches. In this article, we use a bi-allelic boundary-mutation and drift population genetic model to compute the transition probabilities of joint SFS using orthogonal polynomials. This allows inference of population genetic parameters, such as the mutation rate (scaled by the population size) and the time separating the two samples. We apply this inference method to a population dataset of neutrally-evolving short intronic sites from six DNA sequences of the fruit fly Drosophila melanogaster and the reference
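
    As a concrete illustration of the site frequency spectrum defined above (only the binning step, not the orthogonal-polynomial likelihood developed in the article), a small sketch assuming a 0/1 allele matrix of M haploid individuals by L sites:

    ```python
    import numpy as np

    def site_frequency_spectrum(alleles):
        """Counts of sites carrying k copies of the focal allele, for k = 0..M.

        alleles : (M individuals x L sites) array of 0/1 states,
                  e.g. 1 = C/G and 0 = A/T after binning complementary bases.
        """
        m = alleles.shape[0]
        counts_per_site = alleles.sum(axis=0)
        return np.bincount(counts_per_site, minlength=m + 1)

    # Toy example: 4 haploid sequences, 6 binary sites.
    sample = np.array([[0, 1, 1, 0, 1, 0],
                       [0, 1, 0, 0, 1, 0],
                       [0, 0, 1, 0, 1, 1],
                       [0, 1, 1, 0, 0, 0]])
    print(site_frequency_spectrum(sample))   # SFS entries for allele counts 0..4
    ```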

  18. Discovering Hominins - Application of Medical Computed Tomography (CT) to Fossil-Bearing Rocks from the Site of Malapa, South Africa.

    Science.gov (United States)

    Smilg, Jacqueline S; Berger, Lee R

    2015-01-01

    In the South African context, computed tomography (CT) has been applied to individually prepared fossils and small rocks containing fossils, but has not been utilized on large breccia blocks as a means of discovering fossils, and particularly fossil hominins. Previous attempts at CT imaging of rocks from other South African sites for this purpose yielded disappointing results. For this study, 109 fossil-bearing rocks from the site of Malapa, South Africa were scanned with medical CT prior to manual preparation. The resultant images were assessed for accuracy of fossil identification and characterization against the standard of manual preparation. The accurate identification of fossils, including those of early hominins, that were not visible on the surface of individual blocks is shown to be possible. The discovery of unexpected fossils is reduced, thus lowering the potential that fossils could be damaged through accidental encounter during routine preparation, or even entirely missed. This study should significantly change the way fossil discovery, recovery and preparation is done in the South African context and has potential for application in other palaeontological situations. Medical CT imaging is shown to be reliable, readily available, cost effective and accurate in finding fossils within matrix conglomerates. Improvements in CT equipment and in CT image quality are such that medical CT is now a viable imaging modality for this palaeontological application.

  19. 48 CFR 311.7001 - Section 508 accessibility standards for HHS Web site content and communications materials.

    Science.gov (United States)

    2010-10-01

    ..., documents, charts, posters, presentations (such as Microsoft PowerPoint), or video material that is specifically intended for publication on, or delivery via, an HHS-owned or -funded Web site, the Project... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide...

  20. One-Tube-Only Standardized Site-Directed Mutagenesis: An Alternative Approach to Generate Amino Acid Substitution Collections

    NARCIS (Netherlands)

    Mingo, J.; Erramuzpe, A.; Luna, S.; Aurtenetxe, O.; Amo, L.; Diez, I.; Schepens, J.T.G.; Hendriks, W.J.A.J.; Cortes, J.M.; Pulido, R.

    2016-01-01

    Site-directed mutagenesis (SDM) is a powerful tool to create defined collections of protein variants for experimental and clinical purposes, but effectiveness is compromised when a large number of mutations is required. We present here a one-tube-only standardized SDM approach that generates

  1. American National Standard: criteria and guidelines for assessing capability for surface faulting at nuclear power plant sites

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    This standard provides applicants and consultants with criteria and guidelines for investigations directed toward the assessment of the capability for surface faulting at nuclear power plant sites. Assessment of vibratory ground motion resulting from faulting is not treated in these guidelines.

  2. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  3. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files.

  5. Nevada Test Site National Emission Standards for Hazardous Air Pollutants - Radionuclide Emissions Calendar Year 2008

    International Nuclear Information System (INIS)

    Warren, Ronald; Grossman, Robert F.

    2009-01-01

    The Nevada Test Site (NTS) is operated by the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office. From 1951 through 1992, the NTS was the continental testing location for U.S. nuclear weapons. The release of radionuclides from NTS activities has been monitored since the initiation of atmospheric testing. Limitation to underground detonations after 1962 greatly reduced radiation exposure to the public surrounding the NTS. After nuclear testing ended in 1992, NTS radiation monitoring focused on detecting airborne radionuclides from historically contaminated soils. These radionuclides are derived from re-suspension of soil (primarily by winds) and emission of tritium-contaminated soil moisture through evapotranspiration. Low amounts of tritium were also emitted to air at the North Las Vegas Facility (NLVF), an NTS support complex in the city of North Las Vegas. To protect the public from harmful levels of man-made radiation, the Clean Air Act, National Emission Standards for Hazardous Air Pollutants (NESHAP) (Title 40 Code of Federal Regulations (CFR) Part 61 Subpart H) (CFR, 2008a) limits the release of radioactivity from a U.S. Department of Energy facility (e.g., the NTS) to 10 millirem per year (mrem/yr) effective dose equivalent to any member of the public. This limit does not include radiation not related to NTS activities. Unrelated doses could come from naturally occurring radioactive elements or from other man-made sources such as medical treatments. The NTS demonstrates compliance with the NESHAP limit by using environmental measurements of radionuclide air concentrations at critical receptor locations. This method was approved by the U.S. Environmental Protection Agency for use on the NTS in 2001 and has been the sole method used since 2005. Six locations on the NTS have been established to act as critical receptor locations to demonstrate compliance with the NESHAP limit. These locations are actually pseudo

  6. Computer-aided detection of breast carcinoma in standard mammographic projections with digital mammography

    International Nuclear Information System (INIS)

    Destounis, S.; Hanson, S.

    2007-01-01

    This study was conducted to retrospectively evaluate a computer-aided detection system's ability to detect breast carcinoma in multiple standard mammographic projections. Forty-five lesions in 44 patients who were imaged with digital mammography (Selenia®, Hologic, Bedford, MA; Senographe®, GE, Milwaukee, WI) and had computer-aided detection (CAD, Image-checker® V 8.3.15, Hologic/R2, Santa Clara, CA) applied at the time of examination were identified for review; all were subsequently recommended for biopsy, which revealed cancer. These lesions were determined by the study radiologist to be visible in both standard mammographic images (mediolateral oblique, MLO; craniocaudal, CC). For each patient, case data included patient age, tissue density, lesion type, BIRADS® assessment, lesion size, lesion visibility (visible on MLO and/or CC view), ability of CAD to correctly mark the cancerous lesion, number of CAD marks per image, needle core biopsy results and surgical pathologic correlation. For this study cohort, a CAD lesion/case sensitivity of 87% (n = 39) was found, and image sensitivity was found to be 69% (n = 31) for the MLO view and 78% (n = 35) for the CC view. For the study cohort, cases presented with a median of four marks per case (range 0-13). Eighty-four percent (n = 38) of lesions proceeded to excision; initial needle biopsy pathology was upgraded at surgical excision from in situ disease to invasive for 24% (n = 9) of lesions. CAD has demonstrated the potential to detect mammographically visible cancers in multiple standard mammographic projections in all categories of lesions in this study cohort. (orig.)

  7. Computer-based planning of optimal donor sites for autologous osseous grafts

    Science.gov (United States)

    Krol, Zdzislaw; Chlebiej, Michal; Zerfass, Peter; Zeilhofer, Hans-Florian U.; Sader, Robert; Mikolajczak, Pawel; Keeve, Erwin

    2002-05-01

    Bone graft surgery is often necessary for reconstruction of craniofacial defects after trauma, tumor, infection or congenital malformation. In this operative technique the removed or missing bone segment is replaced with a bone graft. The mainstay of craniofacial reconstruction rests with the replacement of the defective bone by autogenous bone grafts. To achieve sufficient incorporation of the autograft into the host bone, precise planning and simulation of the surgical intervention is required. The major problem is to determine as accurately as possible the donor site from which the graft should be dissected and to define the shape of the desired transplant. A computer-aided method for semi-automatic selection of optimal donor sites for autografts in craniofacial reconstructive surgery has been developed. The non-automatic step of graft design and constraint setting is followed by a fully automatic procedure to find the best fitting position. In extension to preceding work, a new optimization approach based on the Levenberg-Marquardt method has been implemented and embedded into our computer-based surgical planning system. This new technique enables, once the pre-processing step has been performed, selection of the optimal donor site in less than one minute. The method has been applied during the surgical planning step in more than 20 cases. The postoperative observations have shown that functional results, such as speech and chewing ability as well as restoration of bony continuity, were clearly better compared to conventionally planned operations. Moreover, in most cases the duration of the surgical interventions has been distinctly reduced.
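
    A hedged toy version of the optimization step mentioned above: fitting a rigid 2-D placement of a graft outline to a defect outline with SciPy's Levenberg-Marquardt solver. The real system operates on 3-D image data; the points and cost function here are simplified stand-ins.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    defect = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 0.1], [3.0, 0.4]])   # target outline
    graft = np.array([[0.1, 0.5], [1.1, 0.7], [2.1, 0.6], [3.1, 0.9]])    # candidate outline

    def residuals(params):
        """Difference between the rigidly transformed graft points and the defect points."""
        tx, ty, theta = params
        c, s = np.cos(theta), np.sin(theta)
        rot = np.array([[c, -s], [s, c]])
        moved = graft @ rot.T + np.array([tx, ty])
        return (moved - defect).ravel()

    fit = least_squares(residuals, x0=[0.0, 0.0, 0.0], method="lm")
    print("translation + rotation:", fit.x, "residual norm:", np.linalg.norm(fit.fun))
    ```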

  8. Computed-tomography-guided anatomic standardization for quantitative assessment of dopamine transporter SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Yokoyama, Kota [National Center of Neurology and Psychiatry, Department of Radiology, Tokyo (Japan); National Center of Neurology and Psychiatry, Integrative Brain Imaging Center, Tokyo (Japan); Imabayashi, Etsuko; Matsuda, Hiroshi [National Center of Neurology and Psychiatry, Integrative Brain Imaging Center, Tokyo (Japan); Sumida, Kaoru; Sone, Daichi; Kimura, Yukio; Sato, Noriko [National Center of Neurology and Psychiatry, Department of Radiology, Tokyo (Japan); Mukai, Youhei; Murata, Miho [National Center of Neurology and Psychiatry, Department of Neurology, Tokyo (Japan)

    2017-03-15

    For the quantitative assessment of dopamine transporter (DAT) using [¹²³I]FP-CIT single-photon emission computed tomography (SPECT) (DaTscan), anatomic standardization is preferable for achieving objective and user-independent quantification of striatal binding using a volume-of-interest (VOI) template. However, low accumulation of DAT in Parkinson's disease (PD) would lead to a deformation error when using a DaTscan-specific template without any structural information. To avoid this deformation error, we applied computed tomography (CT) data obtained using SPECT/CT equipment to anatomic standardization. We retrospectively analyzed DaTscan images of 130 patients with parkinsonian syndromes (PS), including 80 PD and 50 non-PD patients. First we segmented gray matter from CT images using statistical parametric mapping 12 (SPM12). These gray-matter images were then anatomically standardized using the diffeomorphic anatomical registration using exponentiated Lie algebra (DARTEL) algorithm. Next, DaTscan images were warped with the same parameters used in the CT anatomic standardization. The target striatal VOIs for decreased DAT in PD were generated from the SPM12 group comparison of 20 DaTscan images from each group. We applied these VOIs to DaTscan images of the remaining patients in both groups and calculated the specific binding ratios (SBRs) using nonspecific counts in a reference area. In terms of the differential diagnosis of PD and non-PD groups using SBR, we compared the present method with two other methods, DaTQUANT and DaTView, which have already been released as software programs for the quantitative assessment of DaTscan images. The SPM12 group comparison showed a significant DAT decrease in PD patients in the bilateral whole striatum. Of the three methods assessed, the present CT-guided method showed the greatest power for discriminating PD and non-PD groups, as it completely separated the two groups. CT-guided anatomic standardization using
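
    A minimal sketch of the specific binding ratio (SBR) calculation referred to above, assuming mean counts have already been extracted from the striatal VOI and the nonspecific reference region (the numbers are illustrative):

    ```python
    def specific_binding_ratio(striatal_mean, reference_mean):
        """SBR = (striatal counts - nonspecific counts) / nonspecific counts."""
        return (striatal_mean - reference_mean) / reference_mean

    # Illustrative mean counts per voxel from a DaTscan study.
    print(specific_binding_ratio(striatal_mean=12.4, reference_mean=2.1))   # ~4.9
    ```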

  9. NACP Site: Terrestrial Biosphere Model and Aggregated Flux Data in Standard Format

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides standardized output variables for gross primary productivity (GPP), net ecosystem exchange (NEE), leaf area index (LAI), ecosystem respiration...

  10. NACP Site: Terrestrial Biosphere Model and Aggregated Flux Data in Standard Format

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides standardized output variables for gross primary productivity (GPP), net ecosystem exchange (NEE), leaf area index (LAI), ecosystem...

  11. Methods for computing water-quality loads at sites in the U.S. Geological Survey National Water Quality Network

    Science.gov (United States)

    Lee, Casey J.; Murphy, Jennifer C.; Crawford, Charles G.; Deacon, Jeffrey R.

    2017-10-24

    The U.S. Geological Survey publishes information on concentrations and loads of water-quality constituents at 111 sites across the United States as part of the U.S. Geological Survey National Water Quality Network (NWQN). This report details historical and updated methods for computing water-quality loads at NWQN sites. The primary updates to historical load estimation methods include (1) an adaptation to methods for computing loads to the Gulf of Mexico; (2) the inclusion of loads computed using the Weighted Regressions on Time, Discharge, and Season (WRTDS) method; and (3) the inclusion of loads computed using continuous water-quality data. Loads computed using WRTDS and continuous water-quality data are provided along with those computed using historical methods. Various aspects of method updates are evaluated in this report to help users of water-quality loading data determine which estimation methods best suit their particular application.
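
    The basic load calculation underlying all of these methods multiplies concentration by streamflow and integrates over time. The sketch below shows that arithmetic for daily values; it is not the WRTDS algorithm itself, and the conversion factor assumes concentrations in mg/L and discharge in cubic feet per second.

    ```python
    CFS_TO_LITERS_PER_DAY = 28.316846592 * 86400   # 1 ft^3/s expressed as liters per day

    def daily_load_kg(concentration_mg_per_l, discharge_cfs):
        """Constituent load in kg/day from a daily concentration and mean daily discharge."""
        mg_per_day = concentration_mg_per_l * discharge_cfs * CFS_TO_LITERS_PER_DAY
        return mg_per_day * 1e-6   # mg -> kg

    # Illustrative week of daily nitrate concentrations (mg/L) and streamflow (cfs).
    conc = [1.2, 1.4, 1.1, 0.9, 1.3, 1.5, 1.0]
    flow = [350, 420, 610, 580, 400, 390, 370]
    total = sum(daily_load_kg(c, q) for c, q in zip(conc, flow))
    print(f"Weekly load: {total:.0f} kg")
    ```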

  12. Unusual sites of metastatic recurrence of osteosarcoma detected on fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography

    International Nuclear Information System (INIS)

    Kabnurkar, Rasika; Agrawal, Archi; Rekhi, Bharat; Purandare, Nilendu; Shah, Sneha; Rangarajan, Venkatesh

    2015-01-01

    Osteosarcoma (OS) is the most common nonhematolymphoid primary bone malignancy, characterized by osteoid or new bone formation. Lungs and bones are the most common sites of metastases. We report a case in which unusual sites of soft tissue recurrence from OS were detected on a restaging fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography scan performed after a disease-free interval of 6 years.

  13. Effect of variable scanning protocols on the pre-implant site evaluation of the mandible in reformatted computed tomography

    International Nuclear Information System (INIS)

    Kim, Kee Deog; Park, Chang Seo

    1999-01-01

    To evaluate the effect of variable scanning protocols of computed tomography for evaluation of pre-implant site of the mandible through the comparison of the reformatted cross-sectional images of helical CT scans obtained with various imaging parameters versus those of conventional CT scans. A dry mandible was imaged using conventional nonoverlapped CT scans with 1 mm slice thickness and helical CT scans with 1 mm slice thickness and pitches of 1.0, 1.5, 2.0, 2.5 and 3.0. All helical images were reconstructed at reconstruction interval of 1 mm. DentaScan reformatted images were obtained to allow standardized visualization of cross-sectional images of the mandible. The reformatted images were reviewed and measured separately by 4 dental radiologists. The image qualities of continuity of cortical outline, trabecular bone structure and visibility of the mandibular canal were evaluated and the distance between anatomic structures were measured by 4 dental radiologists. On image qualities of continuity of cortical outline, trabecular bone structure and visibility of the mandibular canal and in horizontal measurement, there was no statistically significant difference among conventional and helical scans with pitches of 1.0, 1.5 and 2.0. In vertical measurement, there was no statistically significant difference among the conventional and all imaging parameters of helical CT scans with pitches of 1.0, 1.5, 2.0, 2.5 and 3.0. The images of helical CT scans with 1 mm slice thickness and pitches of 1.0, 1.5 and 2.0 are as good as those of conventional CT scans with 1 mm slice thickness for evaluation of pre-dental implant site of the mandible. Considering the radiation dose and patient comfort, helical CT scans with 1 mm slice thickness and pitch of 2.0 is recommended for evaluation of pre-implant site of the mandible.

  14. Standardization of computer-assisted semen analysis using an e-learning application.

    Science.gov (United States)

    Ehlers, J; Behr, M; Bollwein, H; Beyerbach, M; Waberski, D

    2011-08-01

    Computer-assisted semen analysis (CASA) is primarily used to obtain accurate and objective kinetic sperm measurements. Additionally, AI centers use computer-assessed sperm concentration in the sample as a basis for calculating the number of insemination doses available from a given ejaculate. The reliability of data is often limited and results can vary even when the same CASA systems with identical settings are used. The objective of the present study was to develop a computer-based training module for standardized measurements with a CASA system and to evaluate its training effect on the quality of the assessment of sperm motility and concentration. A digital versatile disc (DVD) has been produced showing the standardization of sample preparation and analysis with the CASA system SpermVision™ version 3.0 (Minitube, Verona, WI, USA) in words, pictures, and videos, as well as the most probable sources of error. Eight test persons educated in spermatology, but with different levels of experience with the CASA system, prepared and assessed 10 aliquots from one prediluted bull ejaculate using the same CASA system and laboratory equipment before and after electronic learning (e-learning). After using the e-learning application, the coefficient of variation was reduced on average for the sperm concentration from 26.1% to 11.3% (P ≤ 0.01), and for motility from 5.8% to 3.1% (P ≤ 0.05). For five test persons, the difference in the coefficient of variation before and after use of the e-learning application was significant (P ≤ 0.05). Individual deviations of means from the group mean before e-learning were reduced compared with individual deviations from the group mean after e-learning. According to a survey, the e-learning application was highly accepted by users. In conclusion, e-learning presents an effective, efficient, and accepted tool for improvement of the precision of CASA measurements. This study provides a model for the standardization of other
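
    The precision metric reported above, the coefficient of variation across repeated measurements of the same ejaculate, is straightforward to reproduce; the replicate values in this sketch are invented for illustration.

    ```python
    import statistics

    def coefficient_of_variation(values):
        """CV in percent: sample standard deviation relative to the mean."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    # Hypothetical sperm concentrations (million/mL) for 10 aliquots of one ejaculate,
    # measured by the same operator before and after the e-learning module.
    before = [61, 84, 55, 92, 70, 48, 77, 66, 95, 58]
    after = [68, 72, 65, 74, 70, 63, 71, 69, 75, 66]
    print(f"CV before: {coefficient_of_variation(before):.1f}%")
    print(f"CV after:  {coefficient_of_variation(after):.1f}%")
    ```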

  15. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  16. A simplified 4-site economical intradermal post-exposure rabies vaccine regimen: a randomised controlled comparison with standard methods.

    Directory of Open Access Journals (Sweden)

    Mary J Warrell

    2008-04-01

    The need for economical rabies post-exposure prophylaxis (PEP) is increasing in developing countries. Implementation of the two currently approved economical intradermal (ID) vaccine regimens is restricted due to confusion over different vaccines, regimens and dosages, lack of confidence in intradermal technique, and pharmaceutical regulations. We therefore compared a simplified 4-site economical PEP regimen with standard methods. Two hundred and fifty-four volunteers were randomly allocated to a single blind controlled trial. Each received purified vero cell rabies vaccine by one of four PEP regimens: the currently accepted 2-site ID; the 8-site regimen using 0.05 ml per ID site; a new 4-site ID regimen (on day 0, approximately 0.1 ml at 4 ID sites, using the whole 0.5 ml ampoule of vaccine; on day 7, 0.1 ml ID at 2 sites and at one site on days 28 and 90); or the standard 5-dose intramuscular regimen. All ID regimens required the same total amount of vaccine, 60% less than the intramuscular method. Neutralising antibody responses were measured five times over a year in 229 people, for whom complete data were available. All ID regimens showed similar immunogenicity. The intramuscular regimen gave the lowest geometric mean antibody titres. Using the rapid fluorescent focus inhibition test, some sera had unexpectedly high antibody levels that were not attributable to previous vaccination. The results were confirmed using the fluorescent antibody virus neutralisation method. This 4-site PEP regimen proved as immunogenic as current regimens, and has the advantages of requiring fewer clinic visits, being more practicable, and having a wider margin of safety, especially in inexperienced hands, than the 2-site regimen. It is more convenient than the 8-site method, and can be used economically with vaccines formulated in 1.0 or 0.5 ml ampoules. The 4-site regimen now meets all requirements of immunogenicity for PEP and can be introduced without further studies.

  17. A simplified 4-site economical intradermal post-exposure rabies vaccine regimen: a randomised controlled comparison with standard methods.

    Science.gov (United States)

    Warrell, Mary J; Riddell, Anna; Yu, Ly-Mee; Phipps, Judith; Diggle, Linda; Bourhy, Hervé; Deeks, Jonathan J; Fooks, Anthony R; Audry, Laurent; Brookes, Sharon M; Meslin, François-Xavier; Moxon, Richard; Pollard, Andrew J; Warrell, David A

    2008-04-23

    The need for economical rabies post-exposure prophylaxis (PEP) is increasing in developing countries. Implementation of the two currently approved economical intradermal (ID) vaccine regimens is restricted due to confusion over different vaccines, regimens and dosages, lack of confidence in intradermal technique, and pharmaceutical regulations. We therefore compared a simplified 4-site economical PEP regimen with standard methods. Two hundred and fifty-four volunteers were randomly allocated to a single blind controlled trial. Each received purified vero cell rabies vaccine by one of four PEP regimens: the currently accepted 2-site ID; the 8-site regimen using 0.05 ml per ID site; a new 4-site ID regimen (on day 0, approximately 0.1 ml at 4 ID sites, using the whole 0.5 ml ampoule of vaccine; on day 7, 0.1 ml ID at 2 sites and at one site on days 28 and 90); or the standard 5-dose intramuscular regimen. All ID regimens required the same total amount of vaccine, 60% less than the intramuscular method. Neutralising antibody responses were measured five times over a year in 229 people, for whom complete data were available. All ID regimens showed similar immunogenicity. The intramuscular regimen gave the lowest geometric mean antibody titres. Using the rapid fluorescent focus inhibition test, some sera had unexpectedly high antibody levels that were not attributable to previous vaccination. The results were confirmed using the fluorescent antibody virus neutralisation method. This 4-site PEP regimen proved as immunogenic as current regimens, and has the advantages of requiring fewer clinic visits, being more practicable, and having a wider margin of safety, especially in inexperienced hands, than the 2-site regimen. It is more convenient than the 8-site method, and can be used economically with vaccines formulated in 1.0 or 0.5 ml ampoules. The 4-site regimen now meets all requirements of immunogenicity for PEP and can be introduced without further studies. Controlled
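
    The immunogenicity endpoint used in both reports, the geometric mean antibody titre, is the antilog of the mean log titre; a short sketch with invented titres:

    ```python
    import math

    def geometric_mean_titre(titres):
        """Geometric mean of neutralising antibody titres (e.g., IU/mL)."""
        return math.exp(sum(math.log(t) for t in titres) / len(titres))

    # Hypothetical titres (IU/mL) from one regimen group at a single time-point.
    titres = [4.2, 8.5, 12.0, 3.1, 6.7, 9.9]
    print(f"GMT = {geometric_mean_titre(titres):.1f} IU/mL")
    ```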

  18. A descriptive study of pressure pain threshold at 2 standardized sites in people with acute or subacute neck pain.

    Science.gov (United States)

    Walton, David M; Macdermid, Joy C; Nielson, Warren; Teasell, Robert W; Nailer, Tamara; Maheu, Phillippe

    2011-09-01

    Cross-sectional convenience sample. To describe the distribution of scores for pressure pain threshold (PPT) at 2 standardized testing sites in people with neck pain of less than 90 days' duration: the angle of the upper trapezius and the belly of the tibialis anterior. A secondary objective was to identify important influences on PPT. PPT may be a valuable assessment and prognostic indicator for people with neck pain. However, to facilitate interpretation of scores, knowledge of means and variance for the target population, as well as factors that might influence scores, is needed. Participants were recruited from community-based physiotherapy clinics and underwent PPT testing using a digital algometer and standardized protocol. Descriptive statistics (mean, standard deviations, quartiles, skewness, and kurtosis) were calculated for the 2 sites. Simple bivariate tests of association were conducted to explore potential moderators. A positively skewed distribution was described for the 2 standardized sites. Significant moderators were sex (male higher than female), age (r = 0.22), and self-reported pain intensity (r = -0.24). Neither litigation status nor most symptomatic/least symptomatic side influenced PPT. This manuscript presents information regarding the expected scores for PPT testing in people with acute or subacute neck pain. Clinicians can compare the results of individual patients against these population values, and researchers can incorporate the significant confounders of age, sex, and self-reported pain intensity into future research designs.

  19. Standard guide for radioactive pathway methodology for release of sites following decommissioning

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    The purpose of this guide is to provide guidance in determining site-specific conversion factors for translating between dose limits and residual radioactive contamination levels on equipment, structures, and land areas. It is intended to serve as a guide to acceptable methodology for translating the yet-to-be-determined dose limits into allowable levels of residual radioactive materials that can be left at a site following decommissioning.

  20. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false IEEE 1680 Standard for the... CONTRACT CLAUSES Text of Provisions and Clauses 52.223-16 IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products. As prescribed in 23.706(b)(1), insert the following clause: IEEE...

  1. Summary of computational support and general documentation for computer code (GENTREE) used in Office of Nuclear Waste Isolation Pilot Salt Site Selection Project

    International Nuclear Information System (INIS)

    Beatty, J.A.; Younker, J.L.; Rousseau, W.F.; Elayat, H.A.

    1983-01-01

    A Decision Tree Computer Model was adapted for the purposes of a Pilot Salt Site Selection Project conducted by the Office of Nuclear Waste Isolation (ONWI). A deterministic computer model was developed to structure the site selection problem with submodels reflecting the five major outcome categories (Cost, Safety, Delay, Environment, Community Impact) to be evaluated in the decision process. Time-saving modifications were made in the tree code as part of the effort. In addition, format changes allowed retention of information items which are valuable in directing future research and in isolating key variabilities in the Site Selection Decision Model. The deterministic code was linked to the modified tree code and the entire program was transferred to the ONWI-VAX computer for future use by the ONWI project.

  2. Development and Use of Engineering Standards for Computational Fluid Dynamics for Complex Aerospace Systems

    Science.gov (United States)

    Lee, Hyung B.; Ghia, Urmila; Bayyuk, Sami; Oberkampf, William L.; Roy, Christopher J.; Benek, John A.; Rumsey, Christopher L.; Powers, Joseph M.; Bush, Robert H.; Mani, Mortaza

    2016-01-01

    Computational fluid dynamics (CFD) and other advanced modeling and simulation (M&S) methods are increasingly relied on for predictive performance, reliability and safety of engineering systems. Analysts, designers, decision makers, and project managers, who must depend on simulation, need practical techniques and methods for assessing simulation credibility. The AIAA Guide for Verification and Validation of Computational Fluid Dynamics Simulations (AIAA G-077-1998 (2002)), originally published in 1998, was the first engineering standards document available to the engineering community for verification and validation (V&V) of simulations. Much progress has been made in these areas since 1998. The AIAA Committee on Standards for CFD is currently updating this Guide to incorporate in it the important developments that have taken place in V&V concepts, methods, and practices, particularly with regard to the broader context of predictive capability and uncertainty quantification (UQ) methods and approaches. This paper will provide an overview of the changes and extensions currently underway to update the AIAA Guide. Specifically, a framework for predictive capability will be described for incorporating a wide range of error and uncertainty sources identified during the modeling, verification, and validation processes, with the goal of estimating the total prediction uncertainty of the simulation. The Guide's goal is to provide a foundation for understanding and addressing major issues and concepts in predictive CFD. However, this Guide will not recommend specific approaches in these areas as the field is rapidly evolving. It is hoped that the guidelines provided in this paper, and explained in more detail in the Guide, will aid in the research, development, and use of CFD in engineering decision-making.

  3. [Web-ring of sites for pathologists in the internet: a computer-mediated communication environment].

    Science.gov (United States)

    Khramtsov, A I; Isianov, N N; Khorzhevskiĭ, V A

    2009-01-01

    The recently developed Web-ring of pathology-related Web-sites has transformed computer-mediated communications for Russian-speaking pathologists. Though the pathologists may be geographically dispersed, the network provides a set of asynchronous and synchronous conferences for the purposes of diagnosis, consultation, education, communication, and collaboration in the field of pathology. This paper describes approaches to be used by participants of the pathology-related Web-ring. The approaches are analogous to the tools employed in telepathology and digital microscopy. One of the novel methodologies is the use of Web-based conferencing systems, in which whole-slide digital images of tissue microarrays were jointly reviewed online by pathologists at distant locations. By using ImageScope (Aperio Technologies) and WebEx connect desktop management technology, they shared presentations and images and communicated in real time. In this manner, Web-based forums and conferences will be a powerful addition to telepathology.

  4. Establishing the Antarctic Dome C community reference standard site towards consistent measurements from Earth observation satellites

    Science.gov (United States)

    Cao, C.; Uprety, S.; Xiong, J.; Wu, A.; Jing, P.; Smith, D.; Chander, G.; Fox, N.; Ungar, S.

    2010-01-01

    Establishing satellite measurement consistency by using common desert sites has become increasingly more important not only for climate change detection but also for quantitative retrievals of geophysical variables in satellite applications. Using the Antarctic Dome C site (75°06′S, 123°21′E, elevation 3.2 km) for satellite radiometric calibration and validation (Cal/Val) is of great interest owing to its unique location and characteristics. The site surface is covered with uniformly distributed permanent snow, and the atmospheric effect is small and relatively constant. In this study, the long-term stability and spectral characteristics of this site are evaluated using well-calibrated satellite instruments such as the Moderate Resolution Imaging Spectroradiometer (MODIS) and Sea-viewing Wide Field-of-view Sensor (SeaWiFS). Preliminary results show that despite a few limitations, the site in general is stable in the long term, the bidirectional reflectance distribution function (BRDF) model works well, and the site is most suitable for the Cal/Val of reflective solar bands in the 0.4–1.0 µm range. It was found that for the past decade, the reflectivity change of the site is within 1.35% at 0.64 µm, and interannual variability is within 2%. The site is able to resolve calibration biases between instruments at a level of ~1%. The usefulness of the site is demonstrated by comparing observations from seven satellite instruments involving four space agencies, including OrbView-2–SeaWiFS, Terra–Aqua MODIS, Earth Observing 1 (EO-1) – Hyperion, Meteorological Operational satellite programme (MetOp) – Advanced Very High Resolution Radiometer (AVHRR), Envisat Medium Resolution Imaging Spectrometer (MERIS) – Advanced Along-Track Scanning Radiometer (AATSR), and Landsat 7 Enhanced Thematic Mapper Plus (ETM+). Dome C is a promising candidate site for climate quality calibration of satellite radiometers towards more consistent satellite measurements, as part
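
    The stability figures quoted above (reflectivity change within 1.35% at 0.64 µm over the decade, interannual variability within 2%) are the kind of numbers obtained by trending well-calibrated observations over the site. The sketch below shows, on a synthetic time series standing in for real top-of-atmosphere reflectances, one simple way such a decadal drift and interannual variability can be computed; the generated values are illustrative only and are not the study's results.

```python
# Illustrative sketch only: estimating a decadal reflectance drift and the
# interannual variability for a stable calibration target such as Dome C.
# The synthetic monthly series below stands in for real 0.64 µm observations.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2000, 2010, 1 / 12)                      # 120 monthly samples
reflectance = 0.95 * (1 - 0.0005 * (years - 2000)) \
              + rng.normal(0, 0.004, years.size)           # hypothetical data

# Linear trend over the decade, expressed as a percent change
slope, intercept = np.polyfit(years, reflectance, 1)
fit_start, fit_end = np.polyval([slope, intercept], [years[0], years[-1]])
decadal_change_pct = 100 * (fit_end - fit_start) / fit_start

# Interannual variability: relative spread of the annual means
annual_means = reflectance.reshape(10, 12).mean(axis=1)
interannual_pct = 100 * annual_means.std(ddof=1) / annual_means.mean()

print(f"decadal reflectance change: {decadal_change_pct:+.2f}%")
print(f"interannual variability:    {interannual_pct:.2f}%")
```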

  5. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    Science.gov (United States)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    Twenty-first-century geoscience faces challenges of Big Data, spikes in computing requirements (e.g., when a natural disaster happens), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discoveries. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies as existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform as a service (PaaS) packages. Based on these developed common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. This

  6. Building highly available control system applications with Advanced Telecom Computing Architecture and open standards

    International Nuclear Information System (INIS)

    Kazakov, Artem; Furukawa, Kazuro

    2010-01-01

    Requirements for modern and future control systems for large projects like the International Linear Collider demand high availability of control system components. Recently, the telecom industry came up with an open hardware specification, the Advanced Telecom Computing Architecture (ATCA). This specification is aimed at better reliability, availability and serviceability. Since its first market appearance in 2004, the ATCA platform has shown tremendous growth and proved to be stable and well represented by a number of vendors. ATCA is an industry standard for highly available systems. On the other hand, the Service Availability Forum (SAF), a consortium of leading communications and computing companies, describes the interaction between hardware and software. SAF defines a set of specifications such as the Hardware Platform Interface and the Application Interface Specification. The SAF specifications provide an extensive description of highly available systems, services and their interfaces. Originally aimed at telecom applications, these specifications can be used for accelerator controls software as well. This study describes the benefits of using these specifications and their possible adaptation to accelerator control systems. It is demonstrated how the EPICS Redundant IOC was extended using the Hardware Platform Interface specification, which made it possible to utilize the benefits of the ATCA platform.

  7. Integration of lyoplate based flow cytometry and computational analysis for standardized immunological biomarker discovery.

    Directory of Open Access Journals (Sweden)

    Federica Villanova

    Full Text Available Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases.

  8. Integration of lyoplate based flow cytometry and computational analysis for standardized immunological biomarker discovery.

    Science.gov (United States)

    Villanova, Federica; Di Meglio, Paola; Inokuma, Margaret; Aghaeepour, Nima; Perucha, Esperanza; Mollon, Jennifer; Nomura, Laurel; Hernandez-Fuentes, Maria; Cope, Andrew; Prevost, A Toby; Heck, Susanne; Maino, Vernon; Lord, Graham; Brinkman, Ryan R; Nestle, Frank O

    2013-01-01

    Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases.

  9. A computer program for accident calculations of a standard pressurized water reactor

    International Nuclear Information System (INIS)

    Keutner, H.

    1979-01-01

    This computer program models the dynamics of a standard pressurized water reactor, representing both circulation loops with all important components. All phenomena that matter for the calculation of disturbances are taken into account, so that a realistic process can be described for some minutes after a disturbance or a desired change of condition. To optimize computing time, simplifications are introduced in formulating the differential-algebraic equation system in such a way that all important effects are still captured. The model analysis follows the heat production of the fuel rod via the cladding material to the cooling water and considers the transport delay from the core to the steam generator. Changes in coolant pressure, as well as the different temperatures in the primary loop, influence the pressurizing system - the pressurizer - which is modelled as a water zone and a steam zone containing saturated and superheated steam or saturated and subcooled water, respectively, together with injection, heating and blow-down devices. The balance of the steam generator towards the secondary loop represents the process-engineering equipment, whereby the control of the steam pressure and the reactor power is realized. (orig.) [de
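
    The full program solves a differential-algebraic system covering both loops, the pressurizer and the steam generator; none of that detail is given in the record above. The sketch below is therefore only a toy lumped-parameter illustration of the first link in that chain, heat flowing from fuel to cladding to coolant, integrated as a pair of ODEs; every thermal parameter is a hypothetical placeholder, not a value from the program.

```python
# Illustrative sketch only: a minimal lumped-parameter model of the fuel-rod to
# coolant heat transfer chain (fuel -> cladding -> coolant) mentioned above.
# All parameters are hypothetical and far simpler than the actual DAE system.
import numpy as np
from scipy.integrate import solve_ivp

P_FUEL = 30.0e3               # heat production per rod segment (W), hypothetical
C_F, C_C = 4.0e3, 1.5e3       # heat capacities of fuel and cladding (J/K)
H_FC, H_CW = 600.0, 900.0     # conductances fuel->clad, clad->coolant (W/K)
T_WATER = 300.0               # coolant temperature (deg C), held constant here

def rhs(t, y):
    t_fuel, t_clad = y
    q_fc = H_FC * (t_fuel - t_clad)      # heat flow fuel -> cladding
    q_cw = H_CW * (t_clad - T_WATER)     # heat flow cladding -> coolant
    return [(P_FUEL - q_fc) / C_F, (q_fc - q_cw) / C_C]

sol = solve_ivp(rhs, (0, 600), [320.0, 310.0], max_step=1.0)
print(f"steady fuel temperature ~ {sol.y[0, -1]:.1f} C, "
      f"cladding ~ {sol.y[1, -1]:.1f} C")
```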

  10. Federal environmental standards of potential importance to operations and activities at US Department of Energy sites. Draft

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, K.M.; Bilyard, G.R.; Davidson, S.A.; Jonas, R.J.; Joseph, J.

    1993-06-01

    The US Department of Energy (DOE) is now engaged in a program of environmental restoration nationwide across its 45 sites. It is also bringing its facilities into compliance with environmental regulations, decontaminating and decommissioning unwanted facilities, and constructing new waste management facilities. One of the most difficult questions that DOE must face in successfully remediating its inactive waste sites, decontaminating and decommissioning its inactive facilities, and operating its waste management facilities is: "What criteria and standards should be met?" Acceptable standards or procedures for determining standards will assist DOE in its conduct of ongoing waste management and pending cleanup activities by helping to ensure that those activities are conducted in compliance with applicable laws and regulations and are accepted by the regulatory community and the public. This document reports on the second of three baseline activities that are being conducted as prerequisites to either the development of quantitative standards that could be used by DOE, or consistent procedures for developing such standards. The first and third baseline activities are also briefly discussed in conjunction with the second of the three activities.

  11. New safety standards of nuclear power station with no requirements of site evaluation. No public dose limit published with possible inappropriateness of reactor site

    International Nuclear Information System (INIS)

    Takitani, Koichi

    2013-01-01

    The Nuclear Regulation Authority was preparing new safety standards with the aim of starting safety reviews of existing nuclear power stations in July 2013. This article commented on issues concerning major accidents, defined as events with a severely damaged core. The accumulated dose at the site boundary of the Fukushima Daiichi Nuclear Power Station totaled about 234 mSv in March just after the accident, with rare gas of 500 PBq, iodine-131 of 500 PBq, cesium-134 of 10 PBq and cesium-137 of 10 PBq released to the atmosphere, which was beyond 100 mSv. As a measure for preventing containment vessel failure after severe core damage, a filtered venting system was required to be installed to keep the radiological risk to the public low. However, the filter is not effective against rare gas. Accumulated doses at the site boundary of several nuclear power stations after filtered venting with 100% release of rare gas were estimated to be 2-37 Sv, depending mostly on the site conditions, which would surely be greater than 100 mSv. Omitting site evaluation for major accidents, which are beyond design basis accidents, is a matter of great concern. (T. Tanaka)

  12. Standardization and Optimization of Computed Tomography Protocols to Achieve Low-Dose

    Science.gov (United States)

    Chin, Cynthia; Cody, Dianna D.; Gupta, Rajiv; Hess, Christopher P.; Kalra, Mannudeep K.; Kofler, James M.; Krishnam, Mayil S.; Einstein, Andrew J.

    2014-01-01

    The increase in radiation exposure due to CT scans has been of growing concern in recent years. CT scanners differ in their capabilities and various indications require unique protocols, but there remains room for standardization and optimization. In this paper we summarize approaches to reduce dose, as discussed in lectures comprising the first session of the 2013 UCSF Virtual Symposium on Radiation Safety in Computed Tomography. The experience of scanning at low dose in different body regions, for both diagnostic and interventional CT procedures, is addressed. An essential primary step is justifying the medical need for each scan. General guiding principles for reducing dose include tailoring a scan to a patient, minimizing scan length, use of tube current modulation and minimizing tube current, minimizing tube potential, iterative reconstruction, and periodic review of CT studies. Organized efforts for standardization have been spearheaded by professional societies such as the American Association of Physicists in Medicine. Finally, all team members should demonstrate an awareness of the importance of minimizing dose. PMID:24589403

  13. Computer-aided detection of breast carcinoma in standard mammographic projections with digital mammography

    Energy Technology Data Exchange (ETDEWEB)

    Destounis, Stamatia [Elizabeth Wende Breast Care, LLC, Rochester, NY (United States); University of Rochester, School of Medicine and Dentistry, Rochester, NY (United States); Hanson, Sarah; Morgan, Renee; Murphy, Philip; Somerville, Patricia; Seifert, Posy; Andolina, Valerie; Arieno, Andrea; Skolny, Melissa; Logan-Young, Wende [Elizabeth Wende Breast Care, LLC, Rochester, NY (United States)

    2009-06-15

    A retrospective evaluation of the ability of computer-aided detection (CAD) to identify breast carcinoma in standard mammographic projections. Forty-five biopsy-proven lesions in 44 patients imaged digitally with CAD applied at examination were reviewed. Forty-four screening BIRADS® category 1 digital mammography examinations were randomly identified to serve as a comparative normal/control population. Data included patient age; BIRADS® breast density; lesion type, size, and visibility; number, type, and location of CAD marks per image; CAD ability to mark lesions; and needle core and surgical pathologic correlation. A CAD lesion/case sensitivity of 87% (n=39) was found, with image sensitivity of 69% (n=31) for the mediolateral oblique view and 78% (n=35) for the craniocaudal view. The average false-positive rate in the 44 normal screening cases was 2.0 (range 1-8); this figure is based on 88 reported false-positive CAD marks in the 44 normal screening exams. 98% (n=44) of lesions proceeded to excision; initial pathology was upgraded at surgical excision from in situ to invasive disease in 24% (n=9) of lesions. CAD demonstrated potential to detect mammographically visible cancers in standard projections for all lesion types. (orig.)

  14. Transportable GPU (General Processor Units) chip set technology for standard computer architectures

    Science.gov (United States)

    Fosdick, R. E.; Denison, H. C.

    1982-11-01

    The USAFR-developed GPU Chip Set has been utilized by Tracor to implement both USAF and Navy Standard 16-Bit Airborne Computer Architectures. Both configurations are currently being delivered into DOD full-scale development programs. Leadless Hermetic Chip Carrier packaging has facilitated implementation of both architectures on single 4 1/2 x 5 substrates. The CMOS and CMOS/SOS implementations of the GPU Chip Set have allowed both CPU implementations to use less than 3 watts of power each. Recent efforts by Tracor for USAF have included the definition of a next-generation GPU Chip Set that will retain the application-proven architecture of the current chip set while offering the added cost advantages of transportability across ISO-CMOS and CMOS/SOS processes and across numerous semiconductor manufacturers using a newly-defined set of common design rules. The Enhanced GPU Chip Set will increase speed by an approximate factor of 3 while significantly reducing chip counts and costs of standard CPU implementations.

  15. Why standard brain-computer interface (BCI) training protocols should be changed: an experimental study

    Science.gov (United States)

    Jeunet, Camille; Jahanpour, Emilie; Lotte, Fabien

    2016-06-01

    Objective. While promising, electroencephalography-based brain-computer interfaces (BCIs) are barely used due to their lack of reliability: 15% to 30% of users are unable to control a BCI. Standard training protocols may be partly responsible as they do not satisfy recommendations from psychology. Our main objective was to determine in practice to what extent standard training protocols impact users’ motor imagery-based BCI (MI-BCI) control performance. Approach. We performed two experiments. The first consisted in evaluating the efficiency of a standard BCI training protocol for the acquisition of non-BCI related skills in a BCI-free context, which enabled us to rule out the possible impact of BCIs on the training outcome. Thus, participants (N = 54) were asked to perform simple motor tasks. The second experiment was aimed at measuring the correlations between motor tasks and MI-BCI performance. The ten best and ten worst performers of the first study were recruited for an MI-BCI experiment during which they had to learn to perform two MI tasks. We also assessed users’ spatial ability and pre-training μ rhythm amplitude, as both have been related to MI-BCI performance in the literature. Main results. Around 17% of the participants were unable to learn to perform the motor tasks, which is close to the BCI illiteracy rate. This suggests that standard training protocols are suboptimal for skill teaching. No correlation was found between motor tasks and MI-BCI performance. However, spatial ability played an important role in MI-BCI performance. In addition, once the spatial ability covariable had been controlled for, using an ANCOVA, it appeared that participants who faced difficulty during the first experiment improved during the second while the others did not. Significance. These studies suggest that (1) standard MI-BCI training protocols are suboptimal for skill teaching, (2) spatial ability is confirmed as impacting on MI-BCI performance, and (3) when faced

  16. Computer analysis of sound recordings from two Anasazi sites in northwestern New Mexico

    Science.gov (United States)

    Loose, Richard

    2002-11-01

    Sound recordings were made at a natural outdoor amphitheater in Chaco Canyon and in a reconstructed great kiva at Aztec Ruins. Recordings included computer-generated tones and swept sine waves, classical concert flute, Native American flute, conch shell trumpet, and prerecorded music. Recording equipment included analog tape deck, digital minidisk recorder, and direct digital recording to a laptop computer disk. Microphones and geophones were used as transducers. The natural amphitheater lies between the ruins of Pueblo Bonito and Chetro Ketl. It is a semicircular arc in a sandstone cliff measuring 500 ft. wide and 75 ft. high. The radius of the arc was verified with aerial photography, and an acoustic ray trace was generated using CAD software. The arc is in an overhanging cliff face and brings distant sounds to a line focus. Along this line, there are unusual acoustic effects at conjugate foci. Time history analysis of recordings from both sites showed that a 60-dB reverb decay lasted from 1.8 to 2.0 s, nearly ideal for public performances of music. Echoes from the amphitheater were perceived to be upshifted in pitch, but this was not seen in FFT analysis. Geophones placed on the floor of the great kiva showed a resonance at 95 Hz.
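
    A 60-dB reverberation figure like the 1.8-2.0 s quoted above is usually obtained by backward-integrating an impulse-response recording (a Schroeder energy-decay curve) and extrapolating the decay slope to -60 dB. The sketch below reproduces that generic procedure on a synthetic impulse response; it is not the analysis applied to the original recordings, and the sample rate and decay constant are assumptions.

```python
# Illustrative sketch only: estimating T60 from an impulse response via
# Schroeder backward integration. The synthetic impulse response stands in for
# the field recordings described above.
import numpy as np

fs = 48_000                                   # sample rate (Hz), assumed
t = np.arange(0, 3.0, 1 / fs)
rng = np.random.default_rng(1)
ir = rng.normal(size=t.size) * np.exp(-6.91 * t / 1.9)   # ~1.9 s T60 by construction

# Schroeder integration: energy decay curve in dB
edc = np.cumsum(ir[::-1] ** 2)[::-1]
edc_db = 10 * np.log10(edc / edc.max())

# Fit the -5 dB to -35 dB portion and extrapolate to -60 dB (T30 method)
mask = (edc_db <= -5) & (edc_db >= -35)
slope, intercept = np.polyfit(t[mask], edc_db[mask], 1)
t60 = -60 / slope
print(f"estimated T60: {t60:.2f} s")
```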

  17. Accounting for both local aquatic community composition and bioavailability in setting site-specific quality standards for zinc.

    Science.gov (United States)

    Peters, Adam; Simpson, Peter; Moccia, Alessandra

    2014-01-01

    Recent years have seen considerable improvement in water quality standards (QS) for metals by taking account of the effect of local water chemistry conditions on their bioavailability. We describe preliminary efforts to further refine water quality standards, by taking account of the composition of the local ecological community (the ultimate protection objective) in addition to bioavailability. Relevance of QS to the local ecological community is critical as it is important to minimise instances where quality classification using QS does not reconcile with a quality classification based on an assessment of the composition of the local ecology (e.g. using benthic macroinvertebrate quality assessment metrics such as River InVertebrate Prediction and Classification System (RIVPACS)), particularly where ecology is assessed to be at good or better status, whilst chemical quality is determined to be failing relevant standards. The alternative approach outlined here describes a method to derive a site-specific species sensitivity distribution (SSD) based on the ecological community which is expected to be present at the site in the absence of anthropogenic pressures (reference conditions). The method combines a conventional laboratory ecotoxicity dataset normalised for bioavailability with field measurements of the response of benthic macroinvertebrate abundance to chemical exposure. Site-specific QSref are then derived from the 5%ile of this SSD. Using this method, site QSref have been derived for zinc in an area impacted by historic mining activities. Application of QSref can result in greater agreement between chemical and ecological metrics of environmental quality compared with the use of either conventional (QScon) or bioavailability-based QS (QSbio). In addition to zinc, the approach is likely to be applicable to other metals and possibly other types of chemical stressors (e.g. pesticides). However, the methodology for deriving site-specific targets requires
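
    The key quantity in the approach described above is the 5th percentile (HC5) of a site-specific species sensitivity distribution. The sketch below shows only the generic calculation, fitting a log-normal SSD to bioavailability-normalised toxicity values and reading off its 5th percentile; the zinc toxicity values are hypothetical, and the log-normal choice is an assumption rather than the paper's field-calibrated method.

```python
# Illustrative sketch only: deriving the 5th percentile (HC5) of a species
# sensitivity distribution, the quantity used as the site-specific QSref above.
# The toxicity values are hypothetical bioavailability-normalised zinc data (ug/L).
import numpy as np
from scipy import stats

toxicity_ug_per_l = np.array([42, 65, 90, 120, 160, 210, 300, 420, 560, 800])

# Fit a log-normal SSD and take its 5th percentile
log_vals = np.log10(toxicity_ug_per_l)
mu, sigma = log_vals.mean(), log_vals.std(ddof=1)
hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)

print(f"HC5 (candidate site-specific QS): {hc5:.1f} ug/L")
```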

  18. Passive Treatment And Monitoring At The Standard Mine Superfund Site, Crested Butte, CO

    Science.gov (United States)

    At the 2008 ASMR conference, data from the initial two months of operation of a U.S. EPA pilot biochemical reactor (BCR) was reported. The BCR was designed and constructed in August, 2007 to treat mining influenced water (MIW) emanating from an adit at a remote site in southern ...

  19. Passive Treatment And Monitoring At The Standard Mine Superfund Site, Crested Butte, CO (Presentation)

    Science.gov (United States)

    At the 2008 ASMR conference, data from the initial two months of operation of a U.S. EPA pilot biochemical reactor (BCR) was reported. The BCR was designed and constructed in August, 2007 to treat mining influenced water (MIW) emanating from an adit at a remote site in southern ...

  20. OFF-SITE SMARTPHONE VS. STANDARD WORKSTATION IN THE RADIOGRAPHIC DIAGNOSIS OF SMALL INTESTINAL MECHANICAL OBSTRUCTION IN DOGS AND CATS.

    Science.gov (United States)

    Noel, Peter G; Fischetti, Anthony J; Moore, George E; Le Roux, Alexandre B

    2016-09-01

    Off-site consultations by board-certified veterinary radiologists benefit residents and emergency clinicians by providing immediate feedback and potentially improving patient outcome. Smartphone devices and compressed images transmitted by email or text greatly facilitate availability of these off-site consultations. Criticism of a smartphone interface for off-site consultation is mostly directed at image degradation relative to the standard radiographic viewing room and monitors. The purpose of this retrospective, cross-sectional, methods comparison study was to compare the accuracy of abdominal radiographs in two imaging interfaces (Joint Photographic Experts Group, off-site, smartphone vs. Digital Imaging and Communications in Medicine, on-site, standard workstation) for the diagnosis of small intestinal mechanical obstruction in vomiting dogs and cats. Two board-certified radiologists graded randomized abdominal radiographs using a five-point Likert scale for the presence of mechanical obstruction in 100 dogs or cats presenting for vomiting. The area under the receiver operator characteristic curves for both imaging interfaces was high. The accuracy of the smartphone and traditional workstation was not statistically significantly different for either reviewer (P = 0.384 and P = 0.536). Correlation coefficients were 0.821 and 0.705 for each reviewer when the same radiographic study was viewed in different formats. Accuracy differences between radiologists were potentially related to years of experience. We conclude that off-site expert consultation with a smartphone provides an acceptable interface for accurate diagnosis of small intestinal mechanical obstruction in dogs and cats. © 2016 American College of Veterinary Radiology.

  1. Building Accessible Educational Web Sites: The Law, Standards, Guidelines, Tools, and Lessons Learned

    Science.gov (United States)

    Liu, Ye; Palmer, Bart; Recker, Mimi

    2004-01-01

    Professional education is increasingly facing accessibility challenges with the emergence of web-based learning. This paper summarizes related U.S. legislation, standards, guidelines, and validation tools to make web-based learning accessible for all potential learners. We also present lessons learned during the implementation of web accessibility…

  2. Problems and solutions in application of IEEE standards at Savannah River Site, Department of Energy (DOE) nuclear facilities

    International Nuclear Information System (INIS)

    Lee, Y.S.; Bowers, T.L.; Chopra, B.J.; Thompson, T.T.; Zimmerman, E.W.

    1993-01-01

    The Department of Energy (DOE) Nuclear Material Production Facilities at the Savannah River Site (SRS) were designed, constructed, and placed into operation in the early 1950's, based on then-existing industry codes/standards, design criteria, and analytical procedures. Since that time, DOE has developed Orders and Policies for the planning, design and construction of DOE Nuclear Reactor Facilities which invoke or reference commercial nuclear reactor codes and standards. The application of IEEE reactor design requirements, such as Equipment Qualification, Seismic Qualification, Single Failure Criteria, and Separation Requirements, to non-reactor facilities has been a problem, since the IEEE reactor criteria do not directly conform to the needs of non-reactor facilities. SRS Systems Engineering is developing a methodology for the application of IEEE Standards to non-reactor facilities at SRS.

  3. Accuracy of biopsy needle navigation using the Medarpa system - computed tomography reality superimposed on the site of intervention

    International Nuclear Information System (INIS)

    Khan, M. Fawad; Maataoui, Adel; Gurung, Jessen; Schiemann, Mirko; Vogl, Thomas J.; Dogan, Selami; Ackermann, Hanns; Wesarg, Stefan; Sakas, Georgios

    2005-01-01

    The aim of this work was to determine the accuracy of a new navigational system, Medarpa, with a transparent display superimposing computed tomography (CT) reality on the site of intervention. Medarpa uses an optical and an electromagnetic tracking system which allows tracking of instruments, the radiologist and the transparent display. The display superimposes a CT view of a phantom chest on a phantom chest model, in real time. In group A, needle positioning was performed using the Medarpa system. Three targets (diameter 1.5 mm) located inside the phantom were punctured. In group B, the same targets were used to perform standard CT-guided puncturing using the single-slice technique. The same needles were used in both groups (15 G, 15 cm). A total of 42 punctures were performed in each group. Post puncture, CT scans were made to verify needle tip positions. The mean deviation from the needle tip to the targets was 6.65±1.61 mm for group A (range 3.54-9.51 mm) and 7.05±1.33 mm for group B (range 4.10-9.45 mm). No significant difference was found between group A and group B for any target (p>0.05). No significant difference was found between the targets of the same group (p>0.05). The accuracy in needle puncturing using the augmented reality system, Medarpa, matches the accuracy achieved by CT-guided puncturing technique. (orig.)

  4. Improving the Efficiency of Psychotherapy for Depression: Computer-Assisted Versus Standard CBT.

    Science.gov (United States)

    Thase, Michael E; Wright, Jesse H; Eells, Tracy D; Barrett, Marna S; Wisniewski, Stephen R; Balasubramani, G K; McCrone, Paul; Brown, Gregory K

    2018-03-01

    The authors evaluated the efficacy and durability of a therapist-supported method for computer-assisted cognitive-behavioral therapy (CCBT) in comparison to standard cognitive-behavioral therapy (CBT). A total of 154 medication-free patients with major depressive disorder seeking treatment at two university clinics were randomly assigned to either 16 weeks of standard CBT (up to 20 sessions of 50 minutes each) or CCBT using the "Good Days Ahead" program. The amount of therapist time in CCBT was planned to be about one-third that in CBT. Outcomes were assessed by independent raters and self-report at baseline, at weeks 8 and 16, and at posttreatment months 3 and 6. The primary test of efficacy was noninferiority on the Hamilton Depression Rating Scale at week 16. Approximately 80% of the participants completed the 16-week protocol (79% in the CBT group and 82% in the CCBT group). CCBT met a priori criteria for noninferiority to conventional CBT at week 16. The groups did not differ significantly on any measure of psychopathology. Remission rates were similar for the two groups (intent-to-treat rates, 41.6% for the CBT group and 42.9% for the CCBT group). Both groups maintained improvements throughout the follow-up. The study findings indicate that a method of CCBT that blends Internet-delivered skill-building modules with about 5 hours of therapeutic contact was noninferior to a conventional course of CBT that provided over 8 additional hours of therapist contact. Future studies should focus on dissemination and optimizing therapist support methods to maximize the public health significance of CCBT.

  5. Effect of a Standardized Protocol of Antibiotic Therapy on Surgical Site Infection after Laparoscopic Surgery for Complicated Appendicitis.

    Science.gov (United States)

    Park, Hyoung-Chul; Kim, Min Jeong; Lee, Bong Hwa

    Although it is accepted that complicated appendicitis requires antibiotic therapy to prevent post-operative surgical infections, consensus protocols on the duration and regimens of treatment are not well established. This study aimed to compare the outcome of post-operative infectious complications in patients receiving old non-standardized and new standardized antibiotic protocols, involving 10 or 5 days of treatment, respectively. We enrolled 1,343 patients who underwent laparoscopic surgery for complicated appendicitis between January 2009 and December 2014. With the introduction of the new protocol, the patients were divided into two groups: 10 days of various antibiotic regimens (between January 2009 and June 2012, called the non-standardized protocol; n = 730) and five days of a cefuroxime and metronidazole regimen (between July 2012 and December 2014; standardized protocol; n = 613). We compared the clinical outcomes, including surgical site infection (SSI) (superficial and deep organ/space infections), in the two groups. The standardized protocol group had a slightly shorter operative time (67 vs. 69 min), a shorter hospital stay (5 vs. 5.4 d), and lower medical cost (US$1,564 vs. US$1,654). Otherwise, there was no difference between the groups. No differences were found between the non-standardized and standardized protocol groups with regard to the rate of superficial infection (10.3% vs. 12.7%; p = 0.488) or deep organ/space infection (2.3% vs. 2.1%; p = 0.797). In patients undergoing laparoscopic surgery for complicated appendicitis, five days of cefuroxime and metronidazole did not lead to more SSIs, and it decreased the medical costs compared with non-standardized antibiotic regimens.

  6. Evaluation of a novel trocar-site closure and comparison with a standard Carter-Thomason closure device.

    Science.gov (United States)

    del Junco, Michael; Okhunov, Zhamshid; Juncal, Samuel; Yoon, Renai; Landman, Jaime

    2014-07-01

    The aim of this study was to evaluate and compare a novel trocar-site closure device, the WECK EFx™ Endo Fascial Closure System (EFx), with the Carter-Thomason CloseSure System® (CT) for the closure of laparoscopic trocar-site defects created by a 12-mm dilating trocar. We created standardized laparoscopic trocar-site abdominal wall defects in cadaver models using a standard 12-mm laparoscopic dilating trocar. Trocar defects were closed in a randomized fashion using one of the two closure systems. We recorded the time and number of attempts needed for complete defect closure. In addition, we recorded the ability to maintain pneumoperitoneum, endoscopic visualization, safety, security, and facility based on the surgeon's subjective evaluations. We compared outcomes for the EFx and CT closure systems. We created 72 standardized laparoscopic trocar-site abdominal wall defects. The mean time needed for complete defect closure was 98.53 seconds (±28.9) for the EFx compared with 133.61 seconds (±54.61) for the CT (Psafety were 2.92 for EFx vs 2.19 for CT (Pvs 1.83 for EFx and CT, respectively (Pvs 2.33 for CT (P=0.022). No significant difference was observed between the EFx and the CT systems for endoscopic visualization (2.28 vs 2.50, P=0.080). In this in vitro cadaver trial, the EFx was superior in terms of time needed to complete defect closure, safety, and facility. CT was superior in terms of maintenance of pneumoperitoneum. Both systems were equal in the number of attempts needed to complete the defect closure and in endoscopic visualization.

  7. Computation Of The Residual Radionuclide Activity Within Three Natural Waterways At The Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    Hiergesell, R. A.; Phifer, M. A.

    2014-01-07

    In 2010 a Composite Analysis (CA) of the U.S. Department of Energy’s (DOE’s) Savannah River Site (SRS) was completed. This investigation evaluated the dose impact of the anticipated SRS End State residual sources of radionuclides to offsite members of the public. Doses were assessed at the locations where SRS site streams discharge into the Savannah River at the perimeter of the SRS. Although the model developed to perform this computation indicated that the dose constraint of 0.3 mSv/yr (30 mrem/yr), associated with CA, was not approached at the Points of Assessment (POAs), a significant contribution to the total computed dose was derived from the radionuclides (primarily Cs-137) bound-up in the soil and sediment of the drainage corridors of several SRS streams. DOE’s Low Level Waste Federal Review Group (LFRG) reviewed the 2010 CA and identified several items to be addressed in the SRS Maintenance Program. One of the items recognized Cs-137 in the Lower Three Runs (LTR) Integrator Operable Unit (IOU), as a significant CA dose driver. The item made the recommendation that SRS update the estimated radionuclide inventory, including Cs-137, in the LTR IOU. That initial work has been completed and its radionuclide inventory refined. There are five additional streams at SRS and the next phase of the response to the LFRG concern was to obtain a more accurate inventory and distribution of radionuclides in three of those streams, Fourmile Branch (FMB), Pen Branch (PB) and Steel Creek (SC). Each of these streams is designated as an IOU, which are defined for the purpose of this investigation as the surface water bodies and associated wetlands, including the channel sediment, floodplain sed/soil, and related biota. If present, radionuclides associated with IOUs are adsorbed to the streambed sediment and soils of the shallow floodplains that lie immediately adjacent to stream channels. The scope of this effort included the evaluation of any previous sampling and

  8. A Novel Computational Method for Detecting DNA Methylation Sites with DNA Sequence Information and Physicochemical Properties.

    Science.gov (United States)

    Pan, Gaofeng; Jiang, Limin; Tang, Jijun; Guo, Fei

    2018-02-08

    DNA methylation is an important biochemical process, and it has a close connection with many types of cancer. Research about DNA methylation can help us to understand the regulation mechanism and epigenetic reprogramming. Therefore, it becomes very important to recognize the methylation sites in the DNA sequence. In the past several decades, many computational methods-especially machine learning methods-have been developed since the high-throughput sequencing technology became widely used in research and industry. In order to accurately identify whether or not a nucleotide residue is methylated under the specific DNA sequence context, we propose a novel method that overcomes the shortcomings of previous methods for predicting methylation sites. We use k-gram, multivariate mutual information, discrete wavelet transform, and pseudo amino acid composition to extract features, and train a sparse Bayesian learning model to do DNA methylation prediction. Five criteria-area under the receiver operating characteristic curve (AUC), Matthew's correlation coefficient (MCC), accuracy (ACC), sensitivity (SN), and specificity-are used to evaluate the prediction results of our method. On the benchmark dataset, we could reach 0.8632 on AUC, 0.8017 on ACC, 0.5558 on MCC, and 0.7268 on SN. Additionally, the best results on two scBS-seq profiled mouse embryonic stem cells datasets were 0.8896 and 0.9511 by AUC, respectively. When compared with other outstanding methods, our method surpassed them on the accuracy of prediction. The improvement of AUC by our method compared to other methods was at least 0.0399. For the convenience of other researchers, our code has been uploaded to a file hosting service, and can be downloaded from: https://figshare.com/s/0697b692d802861282d3.

  9. A Novel Computational Method for Detecting DNA Methylation Sites with DNA Sequence Information and Physicochemical Properties

    Directory of Open Access Journals (Sweden)

    Gaofeng Pan

    2018-02-01

    Full Text Available DNA methylation is an important biochemical process, and it has a close connection with many types of cancer. Research about DNA methylation can help us to understand the regulation mechanism and epigenetic reprogramming. Therefore, it becomes very important to recognize the methylation sites in the DNA sequence. In the past several decades, many computational methods—especially machine learning methods—have been developed since the high-throughput sequencing technology became widely used in research and industry. In order to accurately identify whether or not a nucleotide residue is methylated under the specific DNA sequence context, we propose a novel method that overcomes the shortcomings of previous methods for predicting methylation sites. We use k-gram, multivariate mutual information, discrete wavelet transform, and pseudo amino acid composition to extract features, and train a sparse Bayesian learning model to do DNA methylation prediction. Five criteria—area under the receiver operating characteristic curve (AUC), Matthew’s correlation coefficient (MCC), accuracy (ACC), sensitivity (SN), and specificity—are used to evaluate the prediction results of our method. On the benchmark dataset, we could reach 0.8632 on AUC, 0.8017 on ACC, 0.5558 on MCC, and 0.7268 on SN. Additionally, the best results on two scBS-seq profiled mouse embryonic stem cells datasets were 0.8896 and 0.9511 by AUC, respectively. When compared with other outstanding methods, our method surpassed them on the accuracy of prediction. The improvement of AUC by our method compared to other methods was at least 0.0399. For the convenience of other researchers, our code has been uploaded to a file hosting service, and can be downloaded from: https://figshare.com/s/0697b692d802861282d3.
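
    The two records above report results using five standard binary-classification criteria (AUC, MCC, ACC, SN, and specificity). The sketch below simply shows how those criteria are computed for a hypothetical set of methylation-site predictions; it is not the authors' feature-extraction or sparse Bayesian learning pipeline.

```python
# Illustrative sketch only: computing AUC, ACC, MCC, sensitivity (SN) and
# specificity for a hypothetical binary methylation-site classifier.
import numpy as np
from sklearn.metrics import roc_auc_score, matthews_corrcoef, confusion_matrix

y_true  = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0])          # hypothetical labels
y_score = np.array([0.91, 0.20, 0.75, 0.60, 0.35, 0.10,
                    0.82, 0.45, 0.55, 0.30, 0.40, 0.15])          # hypothetical scores
y_pred  = (y_score >= 0.5).astype(int)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"AUC : {roc_auc_score(y_true, y_score):.4f}")
print(f"ACC : {(tp + tn) / (tp + tn + fp + fn):.4f}")
print(f"MCC : {matthews_corrcoef(y_true, y_pred):.4f}")
print(f"SN  : {tp / (tp + fn):.4f}")
print(f"SP  : {tn / (tn + fp):.4f}")
```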

  10. Experiment Dashboard - a generic, scalable solution for monitoring of the LHC computing activities, distributed sites and services

    International Nuclear Information System (INIS)

    Andreeva, J; Cinquilli, M; Dieguez, D; Dzhunov, I; Karavakis, E; Karhula, P; Kenyon, M; Kokoszkiewicz, L; Nowotka, M; Ro, G; Saiz, P; Tuckett, D; Sargsyan, L; Schovancova, J

    2012-01-01

    The Experiment Dashboard system provides common solutions for monitoring job processing, data transfers and site/service usability. Over the last seven years, it proved to play a crucial role in the monitoring of the LHC computing activities, distributed sites and services. It has been one of the key elements during the commissioning of the distributed computing systems of the LHC experiments. The first years of data taking represented a serious test for Experiment Dashboard in terms of functionality, scalability and performance. And given that the usage of the Experiment Dashboard applications has been steadily increasing over time, it can be asserted that all the objectives were fully accomplished.

  11. Multi-binding site model-based curve-fitting program for the computation of RIA data

    International Nuclear Information System (INIS)

    Malan, P.G.; Ekins, R.P.; Cox, M.G.; Long, E.M.R.

    1977-01-01

    In this paper, a comparison is made of model-based and empirical curve-fitting procedures. The implementation of a multiple binding-site curve-fitting model is described which successfully fits a wide range of assay data and which can be run on a mini-computer. This more sophisticated model also provides estimates of the binding-site concentrations and of the respective equilibrium constants; the latter have been used for refining assay conditions using computer optimisation techniques. (orig./AJ) [de
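
    As an illustration of the kind of model-based fit the abstract refers to, the sketch below fits a simple two-site saturation binding model and recovers binding-site capacities and equilibrium (dissociation) constants. The model form, concentrations, counts, and starting values are hypothetical stand-ins, not the program or data described above.

```python
# Illustrative sketch only: fitting a two-site saturation binding model as a
# stand-in for the multi-binding-site RIA model described above. All data and
# starting values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def two_site(conc, bmax1, kd1, bmax2, kd2):
    """Specific binding from two independent classes of binding sites."""
    return bmax1 * conc / (kd1 + conc) + bmax2 * conc / (kd2 + conc)

conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100, 300])        # analyte concentration (nM)
bound = np.array([14, 36, 80, 127, 178, 239, 315, 364])    # hypothetical bound counts

p0 = [150.0, 1.0, 250.0, 50.0]                              # rough starting values
params, _ = curve_fit(two_site, conc, bound, p0=p0, bounds=(0, np.inf))
bmax1, kd1, bmax2, kd2 = params
print(f"site 1: Bmax = {bmax1:.0f}, Kd = {kd1:.2f} nM")
print(f"site 2: Bmax = {bmax2:.0f}, Kd = {kd2:.2f} nM")
```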

  12. [Evaluation of a registration card for logging electrocardiographic records into standard personal computers].

    Science.gov (United States)

    Pizzuti, A; Baralis, G; Bassignana, A; Antonielli, E; Di Leo, M

    1997-01-01

    The MS200 Cardioscope, from MRT Micro AS, Norway, is a 12-channel ECG card to be inserted directly into a standard personal computer (PC). The standard ISA bus compatible half-length card comes with a set of 10 cables with electrodes and the software for recording, displaying and saving ECG signals. The system is supplied with DOS or Windows software. The goal of the present work was to evaluate the reliability and usability of the MS200 in a clinical setting. We tested the 1.5 DOS version of the software. In 30 patients with various cardiac diseases the ECG signal was recorded with the MS200 and with standard Hellige CardioSmart equipment. The saved ECGs were recalled and printed using an Epson Stylus 800 ink-jet printer. Two cardiologists reviewed the recordings, looking at output quality, amplitude and speed precision, artifacts, etc. 1) Installation: the card proved to be fully compatible with the hardware; no changes to default settings had to be made. 2) Usage: the screens are clear; the commands and menus are intuitive and easy to use. Due to the boot and software loading procedures and, most importantly, off-line printing, the time needed to obtain a complete ECG printout was longer than that of the reference machine. 3) Archiving and retrieval of ECGs: the ECG curves can be saved in original or compressed form; with the latter, noise and non-ECG information are filtered away and disk space consumption is reduced: on average, 20 Kb are needed for 10 seconds of signal. The MS200 can be run on a Local Area Network and is designed to integrate with an existing information system: we are currently testing the system in this scenario. 4) The MS200 includes options for on-line diagnosis, a technology we have not tested in the present work. 5) The only setting allowed for printing full pages is A4 paper size: the quality of printouts is good, with a resolution of 180 DPI. In conclusion, the MS200 system seems reliable and

  13. [Standard of care of carcinomas on cancer of unknown primary site in 2016].

    Science.gov (United States)

    Benderra, Marc-Antoine; Ilié, Marius; Hofman, Paul; Massard, Christophe

    2016-01-01

    Patients with cancer of unknown primary (CUP) account for 2-10% of cancer cases and have disseminated cancers for which the primary site cannot be found despite the clinical, pathological and radiological examinations at our disposal. Diagnosis is based on a thorough clinical and histopathologic examination as well as new imaging techniques. Several clinicopathologic entities requiring specific treatment can be identified. Genome sequencing and liquid biopsy (circulating tumor cells and circulating tumor DNA) could allow further advances in diagnosis. Therapeutically, in addition to surgery, radiotherapy and chemotherapy, precision medicine provides new therapeutic approaches. Copyright © 2016 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  14. Some statistical aspects of background based groundwater standards at an arid hazardous waste site

    International Nuclear Information System (INIS)

    Chou, C.J.; Hodges, F.N.; Johnson, V.G.

    1994-07-01

    Statistical goodness-of-fit tests and "Box and Whisker" plots of hydrochemical data from selected contaminant-free downgradient wells, and wells located upgradient in a non-contaminated or background area show that spatially distinct sample populations do not exhibit significant differences in groundwater chemical composition within the upper unconfined aquifer. Well location dominates natural constituent variability at this arid site. Spatial coverage should be emphasized in such cases rather than sampling frequency. 5 refs., 3 figs., 1 tab

  15. Predictability of bone density at posterior mandibular implant sites using cone-beam computed tomography intensity values

    OpenAIRE

    Alkhader, Mustafa; Hudieb, Malik; Khader, Yousef

    2017-01-01

    Objective: The aim of this study was to investigate the predictability of bone density at posterior mandibular implant sites using cone-beam computed tomography (CBCT) intensity values. Materials and Methods: CBCT cross-sectional images for 436 posterior mandibular implant sites were selected for the study. Using Invivo software (Anatomage, San Jose, California, USA), two observers classified the bone density into three categories: low, intermediate, and high, and CBCT intensity values were g...

  16. The use of computer-assisted interactive videodisc training in reactor operations at the Savannah River site

    International Nuclear Information System (INIS)

    Shiplett, D.W.

    1990-01-01

    This presentation discussed the use of computer-aided training in reactor operations at the Savannah River Site using a computer-assisted interactive videodisc system. This system was used in situations where a high frequency of training was required, where there were a large number of people to be trained, and where there was a rigid work schedule. The system was used to support classroom training to emphasize major points, display graphics of flowpaths, run simulations, and show video of actual equipment.

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office: Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. (Figure 3: Number of events per month, data.) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  18. A new formula for estimation of standard liver volume using computed tomography-measured body thickness.

    Science.gov (United States)

    Ma, Ka Wing; Chok, Kenneth S H; Chan, Albert C Y; Tam, Henry S C; Dai, Wing Chiu; Cheung, Tan To; Fung, James Y Y; Lo, Chung Mau

    2017-09-01

    The objective of this article is to derive a more accurate and easy-to-use formula for finding estimated standard liver volume (ESLV) using novel computed tomography (CT) measurement parameters. New formulas for ESLV have been emerging that aim to improve the accuracy of estimation. However, many of these formulas contain body surface area measurements and logarithms in the equations that lead to a more complicated calculation. In addition, substantial errors in ESLV using these old formulas have been shown. An improved version of the formula for ESLV is needed. This is a retrospective cohort of consecutive living donor liver transplantations from 2005 to 2016. Donors were randomly assigned to either the formula derivation or validation groups. Total liver volume (TLV) measured by CT was used as the reference for a linear regression analysis against various patient factors. The derived formula was compared with the existing formulas. There were 722 patients (197 from the derivation group, 164 from the validation group, and 361 from the recipient group) involved in the study. The donor's body weight (odds ratio [OR], 10.42; 95% confidence interval [CI], 7.25-13.60; P Liver Transplantation 23 1113-1122 2017 AASLD. © 2017 by the American Association for the Study of Liver Diseases.
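
    The record above is truncated before the regression results, so only the general approach can be illustrated: regressing CT-measured total liver volume against a simple donor measurement to obtain an ESLV formula. The sketch below uses donor body weight alone with hypothetical data; the published formula also involves CT-measured body thickness and has its own coefficients, which are not reproduced here.

```python
# Illustrative sketch only: deriving a toy ESLV formula by linear regression of
# CT-measured total liver volume (TLV) against donor body weight. All donor
# data are hypothetical and the resulting coefficients are not the paper's.
import numpy as np

body_weight_kg = np.array([52, 58, 63, 70, 75, 81, 88, 95])
tlv_ml         = np.array([980, 1050, 1120, 1230, 1300, 1390, 1480, 1590])

slope, intercept = np.polyfit(body_weight_kg, tlv_ml, 1)
print(f"ESLV (mL) ~= {slope:.1f} * body weight (kg) + {intercept:.1f}")
print(f"predicted ESLV for a hypothetical 68 kg donor: {slope * 68 + intercept:.0f} mL")
```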

  19. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of software for a nuclear simulator is presented in this paper. The feasibility prototype was established with the existing software in the Compact Nuclear Simulator (CNS). Through the real-time implementation in the feasibility prototype, we have identified that the approach can enable computer-based predictive simulation, owing both to the remarkable improvement in real-time performance and to the reduced effort required for real-time implementation under standard PC hardware and Real-Time Linux environments
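
    The essential requirement in such a prototype is that each simulation step completes within a fixed frame time. As a purely illustrative sketch of that scheduling pattern, the Python loop below advances a toy model once per 20 ms frame and reports deadline overruns; the actual prototype relies on Real-Time Linux scheduling facilities rather than this code, and the frame length and model are assumptions.

```python
# Illustrative sketch only: the shape of a fixed-period simulation loop with
# deadline checking. Plain Python timing is used purely to show the pattern;
# a real-time system would use RT-Linux timers and scheduling classes.
import time

PERIOD = 0.020                      # 20 ms frame, hypothetical
state = 0.0

def step_model(x, dt):
    """Stand-in for one integration step of the simulator software."""
    return x + dt * (1.0 - x)       # toy first-order lag

next_deadline = time.monotonic()
for frame in range(100):            # 2 s of simulated operation
    state = step_model(state, PERIOD)
    next_deadline += PERIOD
    delay = next_deadline - time.monotonic()
    if delay > 0:
        time.sleep(delay)           # on RT-Linux this would be a real-time timer
    else:
        print(f"frame {frame}: deadline overrun by {-delay * 1000:.2f} ms")
print(f"final state after 2 s: {state:.4f}")
```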

  20. American National Standard: for facilities and medical care for on-site nuclear-power-plant radiological emergencies

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    This standard provides guidance for first aid during an emergency and for initial medical care of those persons on-site who are overexposed to penetrating radiation (irradiated). It also provides guidance for medical care of persons contaminated with radioactive material or radionuclides who may also be irradiated or injured as a result of an accident at a nuclear power plant. It provides recommendations for facilities, supplies, equipment, and the extent of care both on-site where first aid and initial care may be provided and off-site at a local hospital where further medical and surgical care may be provided. This initial care continues until either the patient is released or admitted, or referred to another, possibly distant, medical center for definitive care. Recommendations are also provided for the transportation of patients and the training of personnel. Recommendations for specialized care are considered to be beyond the scope of this standard on emergency medical care; however, since emergency and specialized care are related, a brief discussion of specialized care is provided in the Appendix

  1. One-Tube-Only Standardized Site-Directed Mutagenesis: An Alternative Approach to Generate Amino Acid Substitution Collections.

    Directory of Open Access Journals (Sweden)

    Janire Mingo

    Full Text Available Site-directed mutagenesis (SDM) is a powerful tool to create defined collections of protein variants for experimental and clinical purposes, but effectiveness is compromised when a large number of mutations is required. We present here a one-tube-only standardized SDM approach that generates comprehensive collections of amino acid substitution variants, including scanning and single-site multiple mutations. The approach combines unified mutagenic primer design with the mixing of multiple distinct primer pairs and/or plasmid templates to increase the yield of a single inverse-PCR mutagenesis reaction. Also, a user-friendly program for automatic design of standardized primers for Ala-scanning mutagenesis is made available. Experimental results were compared with a modeling approach together with stochastic simulation data. For single-site multiple mutagenesis purposes and for simultaneous mutagenesis in different plasmid backgrounds, combination of primer sets and/or plasmid templates in a single reaction tube yielded the distinct mutations in a stochastic fashion. For scanning mutagenesis, we found that a combination of overlapping primer sets in a single PCR reaction allowed the yield of different individual mutations, although this yield did not necessarily follow a stochastic trend. Double mutants were generated when the overlap of primer pairs was below 60%. Our results illustrate that one-tube-only SDM effectively reduces the number of reactions required in large-scale mutagenesis strategies, facilitating the generation of comprehensive collections of protein variants suitable for functional analysis.
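
    To make the idea of standardized mutagenic primer design concrete, the sketch below generates one back-to-back inverse-PCR primer pair that replaces a chosen codon with an alanine codon (GCT). The fixed 15-nt flanks, the codon choice, and the example ORF are simplified assumptions for illustration, not the published design rules or software.

```python
# Illustrative sketch only: generating a standardized inverse-PCR primer pair
# for one Ala substitution. Design rules here (GCT codon, 15-nt flanks,
# back-to-back non-overlapping primers) are simplified assumptions.

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    return seq.translate(COMPLEMENT)[::-1]

def ala_scan_primers(cds: str, codon_index: int, flank: int = 15):
    """Return (forward, reverse) primers replacing codon `codon_index` with GCT."""
    start = codon_index * 3
    # Forward primer: new Ala codon followed by the downstream flank
    forward = "GCT" + cds[start + 3 : start + 3 + flank]
    # Reverse primer: reverse complement of the upstream flank (back-to-back design)
    reverse = reverse_complement(cds[start - flank : start])
    return forward, reverse

cds = "ATGGCTAGCAAAGGAGAAGAACTTTTCACTGGAGTTGTCCCAATTCTTGTT"  # hypothetical ORF
fwd, rev = ala_scan_primers(cds, codon_index=8)
print("forward:", fwd)
print("reverse:", rev)
```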

  2. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction: The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy Ions: Heavy Ions has been actively analysing data and preparing for conferences. Operations Office: (Figure 6: Transfers from all sites in the last 90 days.) For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files with a high writing speed to tapes. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successes prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  4. Hanford Site radionuclide national emission standards for hazardous air pollutants registered stack source assessment

    Energy Technology Data Exchange (ETDEWEB)

    Davis, W.E.; Barnett, J.M.

    1994-07-01

    On February 3, 1993, the US Department of Energy, Richland Operations Office received a Compliance Order and Information Request from the Director of the Air and Toxics Division of the US Environmental Protection Agency, Region 10. The Compliance Order requires the Richland Operations Office to evaluate all radionuclide emission points at the Hanford Site. The evaluation also determined if the effective dose equivalent from any of these stack emissions exceeded 0.1 mrem/yr, which would require the stack to have continuous monitoring. The result of this assessment identified a total of 16 stacks as having potential emissions that would cause an effective dose equivalent greater than 0.1 mrem/yr.

  5. Hanford Site radionuclide national emission standards for hazardous air pollutants registered stack source assessment

    International Nuclear Information System (INIS)

    Davis, W.E.; Barnett, J.M.

    1994-01-01

    On February 3, 1993, the US Department of Energy, Richland Operations Office received a Compliance Order and Information Request from the Director of the Air and Toxics Division of the US Environmental Protection Agency, Region 10. The Compliance Order requires the Richland Operations Office to evaluate all radionuclide emission points at the Hanford Site. The evaluation also determined if the effective dose equivalent from any of these stack emissions exceeded 0.1 mrem/yr, which would require the stack to have continuous monitoring. The result of this assessment identified a total of 16 stacks as having potential emissions that would cause an effective dose equivalent greater than 0.1 mrem/yr

  6. UK safety and standards for radioactive waste management and decommissioning on nuclear licensed sites

    International Nuclear Information System (INIS)

    Mason, D.J.

    2001-01-01

    This paper discusses the regulation of radioactive waste and decommissioning in the United Kingdom and identifies the factors considered by HM Nuclear Installations Inspectorate in examining the adequacy of arrangements for their management on nuclear licensed sites. The principal requirements are for decommissioning to be undertaken as soon as reasonably practicable and that radioactive wastes should be minimised, disposed of or contained and controlled by storage in a passively safe form. However, these requirements have to be considered in the context of major organisational changes in the UK nuclear industry and the non-availability of disposal routes for some decommissioning wastes. The legislative framework used to regulate decommissioning of nuclear facilities in the UK is described. Reference is made to radioactive waste and decommissioning strategies, quinquennial reviews, criteria for delicensing and the forthcoming Environmental Impact Assessment Regulations. (author)

  7. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    International Nuclear Information System (INIS)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-01-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the
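
    As a sketch of the step-markup idea described in this record, the fragment below shows how a step's attributes could drive the behaviour a procedure system generates for it; the element names, attributes, and step types are hypothetical and do not reproduce the Idaho National Laboratory schema.

      import xml.etree.ElementTree as ET

      # Hypothetical procedure step markup; element and attribute names are
      # illustrative only and do not reproduce the INL schema.
      PROCEDURE_XML = """
      <procedure id="OP-123" title="Example lineup">
        <step id="1" type="reference">Review the current pump lineup.</step>
        <step id="2" type="decision" prompt="Is discharge pressure above 50 psig?"/>
        <step id="3" type="input" prompt="Record discharge pressure" unit="psig"/>
      </procedure>
      """

      def render_step(step: ET.Element) -> str:
          """Decide what the (hypothetical) CBPS would present for a step,
          based on the step's 'type' attribute."""
          kind = step.get("type")
          if kind == "reference":
              return f"Step {step.get('id')}: show text -> {step.text.strip()}"
          if kind == "decision":
              return f"Step {step.get('id')}: ask yes/no -> {step.get('prompt')}"
          if kind == "input":
              return f"Step {step.get('id')}: collect value ({step.get('unit')}) -> {step.get('prompt')}"
          return f"Step {step.get('id')}: unsupported step type {kind!r}"

      if __name__ == "__main__":
          root = ET.fromstring(PROCEDURE_XML)
          for step in root.findall("step"):
              print(render_step(step))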

  8. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-02-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the

  9. SCALE-4 [Standardized Computer Analyses for Licensing Evaluation]: An improved computational system for spent-fuel cask analysis

    International Nuclear Information System (INIS)

    Parks, C.V.

    1989-01-01

    The purpose of this paper is to provide specific information regarding improvements available with Version 4.0 of the SCALE system and to discuss the future of SCALE within the current computing and regulatory environment. The emphasis is on the improvements in SCALE-4 over those available in SCALE-3. 10 refs., 1 fig., 1 tab

  10. World Nuclear Association (WNA) internationally standardized reporting (checklist) on the sustainable development performance of uranium mining and processing sites

    International Nuclear Information System (INIS)

    Harris, F.

    2014-01-01

    The World Nuclear Association (WNA) has developed internationally standardized reporting (‘Checklist’) for uranium mining and processing sites. This reporting aims to achieve widespread agreement between utilities and miners on a list of topics and indicators for common use in demonstrating miners’ adherence to strong sustainable development performance. Nuclear utilities are often required to evaluate the sustainable development performance of their suppliers as part of a utility operational management system. In the present case, nuclear utilities are buyers of uranium supplies from uranium miners and such purchases are often achieved through the utility uranium or fuel supply management function. This Checklist is an evaluation tool which has been created to collect information from uranium miners’ available annual reports, data series, and measurable indicators on a wide range of sustainable development topics to verify that best practices in this field are implemented throughout uranium mining and processing sites. The Checklist has been developed to align with the WNA’s policy document Sustaining Global Best Practices in Uranium Mining and Processing: Principles for Managing Radiation, Health and Safety, and Waste and the Environment, which encompasses all aspects of sustainable development applicable to uranium mining and processing. The eleven sections of the Checklist are: 1. Adherence to Sustainable Development; 2. Health, Safety and Environmental Protection; 3. Compliance; 4. Social Responsibility and Stakeholder Engagement; 5. Management of Hazardous Materials; 6. Quality Management Systems; 7. Accidents and Emergencies; 8. Transport of Hazardous Materials; 9. Systematic Approach to Training; 10. Security of Sealed Radioactive Sources and Nuclear Substances; 11. Decommissioning and Site Closure. The Checklist benefits from many years of nuclear utility experience in verifying the sustainable development performance of uranium mining and processing sites. This

  11. X-ray computed tomography reconstruction on non-standard trajectories for robotized inspection

    International Nuclear Information System (INIS)

    Banjak, Hussein

    2016-01-01

    The number of industrial applications of computed tomography (CT) is large and rapidly increasing, with typical areas of use in the aerospace, automotive and transport industry. To support this growth of CT in the industrial field, the identified requirements concern firstly software development to improve the reconstruction algorithms and secondly the automation of the inspection process. Indeed, the use of robots gives more flexibility in the acquisition trajectory and allows the control of large and complex objects, which cannot be inspected using classical CT systems. In the context of this new CT trend, a robotic platform has been installed at CEA LIST to better understand and solve specific challenges linked to the robotization of the CT process. The considered system integrates two robots that move the X-ray generator and detector. This thesis contributes to this new development. In particular, the objective is to develop and implement analytical and iterative reconstruction algorithms adapted to such robotized trajectories. The main focus of this thesis is helical-like scanning trajectories. We consider two main problems that can occur during the acquisition process: truncated and limited-angle data. We present in this work experimental results for reconstruction on such non-standard trajectories. CIVA software is used to simulate these complex inspections and our developed algorithms are integrated as reconstruction tools. This thesis contains three parts. In the first part, we introduce the basic principles of CT and we present an overview of existing analytical and iterative algorithms for non-standard trajectories. In the second part, we modify the approximate helical FDK algorithm to deal with transversely truncated data and we propose a modified FDK algorithm adapted to a reverse helical trajectory with a scan range of less than 360 degrees. For iterative reconstruction, we propose two algebraic methods named SART-FISTA-TV and DART

  12. The use of computer decision-making support systems to justify address rehabilitation of the Semipalatinsk test site area

    OpenAIRE

    Viktoria V. Zaets; Alexey V. Panov

    2011-01-01

    The paper describes the development of a range of optimal protective measures for remediation of the territory of the Semipalatinsk Test Site. The computer system for decision-making support, ReSCA, was employed for the estimations. Costs and radiological effectiveness of countermeasures were evaluated.

  13. The use of computer decision-making support systems to justify address rehabilitation of the Semipalatinsk test site area

    Directory of Open Access Journals (Sweden)

    Viktoria V. Zaets

    2011-05-01

    Full Text Available The paper describes the development of a range of optimal protective measures for remediation of the territory of the Semipalatinsk Test Site. The computer system for decision-making support, ReSCA, was employed for the estimations. Costs and radiological effectiveness of countermeasures were evaluated.

  14. Improving biofeedback for the treatment of fecal incontinence in women: implementation of a standardized multi-site manometric biofeedback protocol.

    Science.gov (United States)

    Markland, A D; Jelovsek, J E; Whitehead, W E; Newman, D K; Andy, U U; Dyer, K; Harm-Ernandes, I; Cichowski, S; McCormick, J; Rardin, C; Sutkin, G; Shaffer, A; Meikle, S

    2017-01-01

    Standardized training and clinical protocols using biofeedback for the treatment of fecal incontinence (FI) are important for clinical care. Our primary aims were to develop, implement, and evaluate adherence to a standardized protocol for manometric biofeedback to treat FI. In a Pelvic Floor Disorders Network (PFDN) trial, participants were enrolled from eight PFDN clinical centers across the United States. A team of clinical and equipment experts developed biofeedback software on a novel tablet computer platform for conducting standardized anorectal manometry with separate manometric biofeedback protocols for improving anorectal muscle strength, sensation, and urge resistance. The training protocol also included education on bowel function, anal sphincter exercises, and bowel diary monitoring. Study interventionists completed online training prior to attending a centralized, standardized certification course. For the certification, expert trainers assessed the ability of the interventionists to perform the protocol components for a paid volunteer who acted as a standardized patient. Postcertification, the trainers audited interventionists during trial implementation to improve protocol adherence. Twenty-four interventionists attended the in-person training and certification, including 46% advanced practice registered nurses (11/24), 50% (12/24) physical therapists, and 4% physician assistants (1/24). Trainers performed audio audits for 88% (21/24), representing 84 audited visits. All certified interventionists met or exceeded the prespecified 80% pass rate for the audit process, with an average passing rate of 93%. A biofeedback protocol can be successfully imparted to experienced pelvic floor health care providers from various disciplines. Our process promoted high adherence to a standard protocol and is applicable to many clinical settings. © 2016 John Wiley & Sons Ltd.

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  16. A standardized perioperative surgical site infection care process among children with stoma closure: a before-after study.

    Science.gov (United States)

    Porras-Hernandez, Juan; Bracho-Blanchet, Eduardo; Tovilla-Mercado, Jose; Vilar-Compte, Diana; Nieto-Zermeño, Jaime; Davila-Perez, Roberto; Teyssier-Morales, Gustavo; Lule-Dominguez, Martha

    2008-10-01

    We report on the effectiveness of a standardized perioperative care process for lowering surgical site infection (SSI) rates among children with stoma closure at a tertiary-care public pediatric teaching hospital in Mexico City. All consecutive children with stoma closure operated on between November 2003 and October 2005 were prospectively followed for 30 days postoperatively. We conducted a before-after study to evaluate the effect of a standardized perioperative bowel- and abdominal-wall care process on SSI rates. Seventy-one patients were operated on, and all completed follow-up. SSI rates declined from 42.8% (12/28) before to 13.9% (6/43) after the standardization procedure (relative risk (RR) = 3.1; 95% confidence interval (CI) = 1.3-7.2; p = 0.006). Risk factors independently associated with SSI comprised peristomal skin inflammation >3 mm (odds ratio (OR) = 9.6; 95% CI = 1.8-49.6; p = 0.007) and intraoperative complications (OR = 13.3; 95% CI = 1.4-127.2; p = 0.02). Being operated on during the after-study period was shown to be a protective factor against SSI (OR = 0.2; 95% CI = 0.4-0.97; p = 0.04). Standardization was able to reduce SSI rates threefold in children with stoma closure in a short period of time.

  17. Standard Guide for On-Site Inspection and Verification of Operation of Solar Domestic Hot Water Systems

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1987-01-01

    1.1 This guide covers procedures and test methods for conducting an on-site inspection and acceptance test of an installed domestic hot water system (DHW) using flat plate, concentrating-type collectors or tank absorber systems. 1.2 It is intended as a simple and economical acceptance test to be performed by the system installer or an independent tester to verify that critical components of the system are functioning and to acquire baseline data reflecting overall short term system heat output. 1.3 This guide is not intended to generate accurate measurements of system performance (see ASHRAE standard 95-1981 for a laboratory test) or thermal efficiency. 1.4 The values stated in SI units are to be regarded as the standard. The values given in parentheses are for information only. 1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine th...

  18. Hanford Site radionuclide national emission standards for hazardous air pollutants unregistered stack (power exhaust) source assessment

    International Nuclear Information System (INIS)

    Davis, W.E.

    1994-01-01

    On February 3, 1993, the US Department of Energy, Richland Operations Office received a Compliance Order and Information Request from the Director of the Air and Toxics Division of the US Environmental Protection Agency, Region 10. The Compliance Order requires the Richland Operations Office to evaluate all radionuclide emission points at the Hanford Site to determine which are subject to continuous emission measurement requirements in 40 Code of Federal Regulations (CFR) 61, Subpart H, and to continuously measure radionuclide emissions in accordance with 40 CFR 61.93. This evaluation provides an assessment of the 39 unregistered stacks, under Westinghouse Hanford Company's management, and their potential radionuclide emissions, i.e., emissions with no control devices in place. The evaluation also determined if the effective dose equivalent from any of these stack emissions exceeded 0.1 mrem/yr, which will require the stack to have continuous monitoring. The result of this assessment identified three stacks, 107-N, 296-P-26 and 296-P-28, as having potential emissions that would cause an effective dose equivalent greater than 0.1 mrem/yr. These stacks, as noted by 40 CFR 61.93, would require continuous monitoring

  19. Hanford Site radionuclide national emission standards for hazardous air pollutants unregistered stack (power exhaust) source assessment

    Energy Technology Data Exchange (ETDEWEB)

    Davis, W.E.

    1994-08-04

    On February 3, 1993, the US Department of Energy, Richland Operations Office received a Compliance Order and Information Request from the Director of the Air and Toxics Division of the US Environmental Protection Agency, Region 10. The Compliance Order requires the Richland Operations Office to evaluate all radionuclide emission points at the Hanford Site to determine which are subject to continuous emission measurement requirements in 40 Code of Federal Regulations (CFR) 61, Subpart H, and to continuously measure radionuclide emissions in accordance with 40 CFR 61.93. This evaluation provides an assessment of the 39 unregistered stacks, under Westinghouse Hanford Company`s management, and their potential radionuclide emissions, i.e., emissions with no control devices in place. The evaluation also determined if the effective dose equivalent from any of these stack emissions exceeded 0.1 mrem/yr, which will require the stack to have continuous monitoring. The result of this assessment identified three stacks, 107-N, 296-P-26 and 296-P-28, as having potential emissions that would cause an effective dose equivalent greater than 0.1 mrem/yr. These stacks, as noted by 40 CFR 61.93, would require continuous monitoring.

  20. ATLAS off-Grid sites (Tier 3) monitoring. From local fabric monitoring to global overview of the VO computing activities

    CERN Document Server

    PETROSYAN, A; The ATLAS collaboration; BELOV, S; ANDREEVA, J; KADOCHNIKOV, I

    2012-01-01

    The ATLAS Distributed Computing activities have so far concentrated on the "central" part of the experiment computing system, namely the first 3 tiers (the CERN Tier0, 10 Tier1 centers and over 60 Tier2 sites). Many ATLAS Institutes and National Communities have deployed (or intend to deploy) Tier-3 facilities. Tier-3 centers consist of non-pledged resources, which are usually dedicated to data analysis tasks by the geographically close or local scientific groups, and which usually comprise a range of architectures without Grid middleware. Therefore a substantial part of the ATLAS monitoring tools, which make use of Grid middleware, cannot be used for a large fraction of Tier3 sites. The presentation will describe the T3mon project, which aims to develop a software suite for monitoring the Tier3 sites, both from the perspective of the local site administrator and that of the ATLAS VO, thereby enabling the global view of the contribution from Tier3 sites to the ATLAS computing activities. Special attention in p...

  1. The Relationship between Computer and Internet Use and Performance on Standardized Tests by Secondary School Students with Visual Impairments

    Science.gov (United States)

    Zhou, Li; Griffin-Shirley, Nora; Kelley, Pat; Banda, Devender R.; Lan, William Y.; Parker, Amy T.; Smith, Derrick W.

    2012-01-01

    Introduction: The study presented here explored the relationship between computer and Internet use and the performance on standardized tests by secondary school students with visual impairments. Methods: With data retrieved from the first three waves (2001-05) of the National Longitudinal Transition Study-2, the correlational study focused on…

  2. Predictability of bone density at posterior mandibular implant sites using cone-beam computed tomography intensity values.

    Science.gov (United States)

    Alkhader, Mustafa; Hudieb, Malik; Khader, Yousef

    2017-01-01

    The aim of this study was to investigate the predictability of bone density at posterior mandibular implant sites using cone-beam computed tomography (CBCT) intensity values. CBCT cross-sectional images for 436 posterior mandibular implant sites were selected for the study. Using Invivo software (Anatomage, San Jose, California, USA), two observers classified the bone density into three categories: low, intermediate, and high, and CBCT intensity values were generated. Based on the consensus of the two observers, 15.6% of sites were of low bone density, 47.9% were of intermediate density, and 36.5% were of high density. Receiver-operating characteristic analysis showed that CBCT intensity values had a high predictive power for predicting high density sites (area under the curve [AUC] =0.94, P < 0.005) and intermediate density sites (AUC = 0.81, P < 0.005). The best cut-off value for intensity to predict intermediate density sites was 218 (sensitivity = 0.77 and specificity = 0.76) and the best cut-off value for intensity to predict high density sites was 403 (sensitivity = 0.93 and specificity = 0.77). CBCT intensity values are considered useful for predicting bone density at posterior mandibular implant sites.
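
    A minimal sketch of how the reported cut-off values could be applied as a simple lookup is shown below; the thresholds 218 and 403 are taken from the abstract, while the function itself is purely illustrative and not part of the study's software.

      def classify_bone_density(intensity: float) -> str:
          """Classify a CBCT intensity value using the cut-offs reported in the
          study above (218 for intermediate, 403 for high density). Illustrative
          only; not a substitute for the authors' full ROC-based analysis."""
          if intensity >= 403:
              return "high"
          if intensity >= 218:
              return "intermediate"
          return "low"

      if __name__ == "__main__":
          for value in (150, 250, 450):
              print(value, "->", classify_bone_density(value))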

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  4. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints.

    Science.gov (United States)

    Sako, Shunji; Sugiura, Hiromichi; Tanoue, Hironori; Kojima, Makoto; Kono, Mitsunobu; Inaba, Ryoichi

    2014-08-01

    This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale - VAS). Oxygen consumption (VO(2)), the ratio of inspiration time to respiration time (T(i)/T(total)), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/T(i)) were significantly lower when the participants were performing the task in the DP than those obtained in the PP. Tidal volume (VT), carbon dioxide output rates (VCO(2)/VE), and oxygen extraction fractions (VO(2)/VE) were significantly higher for the DP than they were for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than they were for the PP. Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when operating a computer.

  5. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints

    Directory of Open Access Journals (Sweden)

    Shunji Sako

    2014-08-01

    Full Text Available Objectives: This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Material and Methods: Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale – VAS). Results: Oxygen consumption (VO2), the ratio of inspiration time to respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants were performing the task in the DP than those obtained in the PP. Tidal volume (VT), carbon dioxide output rates (VCO2/VE), and oxygen extraction fractions (VO2/VE) were significantly higher for the DP than they were for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than they were for the PP. Conclusions: Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when

  6. Impact of the site specialty of a continuity practice on students' clinical skills: performance with standardized patients.

    Science.gov (United States)

    Pfeiffer, Carol A; Palley, Jane E; Harrington, Karen L

    2010-07-01

    The assessment of clinical competence and the impact of training in ambulatory settings are two issues of importance in the evaluation of medical student performance. This study compares the clinical skills performance of students placed in three types of community preceptors' offices (pediatrics, medicine, family medicine) on yearly clinical skills assessments with standardized patients. Our goal was to see if the site specialty impacted on clinical performance. The students in the study were completing a 3-year continuity preceptorship at a site representing one of the disciplines. Their performance on the four clinical skills assessments was compared. There was no significant difference in history taking, physical exam, communication, or clinical reasoning in any year (ANOVA, p ≤ .05). There was a small but significant difference in performance on a measure of interpersonal and interviewing skills during Years 1 and 2. The site specialty of an early clinical experience does not have a significant impact on performance of most of the skills measured by the assessments.

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  9. An Information Technology Framework for the Development of an Embedded Computer System for the Remote and Non-Destructive Study of Sensitive Archaeology Sites

    Directory of Open Access Journals (Sweden)

    Iliya Georgiev

    2017-04-01

    Full Text Available The paper proposes an information technology framework for the development of an embedded remote system for non-destructive observation and study of sensitive archaeological sites. The overall concept and motivation are described. The general hardware layout and software configuration are presented. The paper concentrates on the implementation of the following information technology components: (a) a geographically unique identification scheme supporting a global key space for a key-value store; (b) a common method for octree modeling for spatial geometrical models of the archaeological artifacts, and abstract object representation in the global key space; (c) a broadcast of the archaeological information as an Extensible Markup Language (XML) stream over the Web for worldwide availability; and (d) a set of testing methods increasing the fault tolerance of the system. This framework can serve as a foundation for the development of a complete system for remote archaeological exploration of enclosed archaeological sites like buried churches, tombs, and caves. An archaeological site is opened once upon discovery, the embedded computer system is installed inside on a robotic platform, equipped with sensors, cameras, and actuators, and the intact site is sealed again. Archaeological research is conducted on a multimedia data stream which is sent remotely from the system and conforms to necessary standards for digital archaeology.
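
    The sketch below illustrates, in simplified form, how octree cells of a site model might be addressed in a global key space backed by a key-value store; the key format, the 'SITE-000042' identifier and the node fields are assumptions made for illustration, not the framework's actual scheme.

      from dataclasses import dataclass

      @dataclass
      class OctreeNode:
          """Axis-aligned cubic cell of the site model; children subdivide it into octants."""
          origin: tuple[float, float, float]
          size: float
          occupied: bool = False

      def child_key(parent_key: str, octant: int) -> str:
          """Derive a globally unique child key by appending the octant index (0-7)
          to the parent key. The 'SITE-...' prefix stands in for the geographically
          unique site identifier described in the paper."""
          return f"{parent_key}/{octant}"

      def subdivide(store: dict[str, OctreeNode], key: str) -> None:
          """Insert the eight children of `key` into the key-value store."""
          parent = store[key]
          half = parent.size / 2.0
          for octant in range(8):
              dx, dy, dz = octant & 1, (octant >> 1) & 1, (octant >> 2) & 1
              origin = (parent.origin[0] + dx * half,
                        parent.origin[1] + dy * half,
                        parent.origin[2] + dz * half)
              store[child_key(key, octant)] = OctreeNode(origin, half)

      if __name__ == "__main__":
          store: dict[str, OctreeNode] = {"SITE-000042": OctreeNode((0.0, 0.0, 0.0), 8.0)}
          subdivide(store, "SITE-000042")
          print(len(store), "nodes;", "SITE-000042/7 ->", store["SITE-000042/7"])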

  10. An eLearning Standard Approach for Supporting PBL in Computer Engineering

    Science.gov (United States)

    Garcia-Robles, R.; Diaz-del-Rio, F.; Vicente-Diaz, S.; Linares-Barranco, A.

    2009-01-01

    Problem-based learning (PBL) has proved to be a highly successful pedagogical model in many fields, although it is not that common in computer engineering. PBL goes beyond the typical teaching methodology by promoting student interaction. This paper presents a PBL trial applied to a course in a computer engineering degree at the University of…

  11. Virtual photons in imaginary time: Computing exact Casimir forces via standard numerical electromagnetism techniques

    NARCIS (Netherlands)

    Rodriguez, A.; Ibanescu, M.; Iannuzzi, D.; Joannopoulos, J. D.; Johnson, S.T.

    2007-01-01

    We describe a numerical method to compute Casimir forces in arbitrary geometries, for arbitrary dielectric and metallic materials, with arbitrary accuracy (given sufficient computational resources). Our approach, based on well-established integration of the mean stress tensor evaluated via the

  12. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons

  13. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for X-ray computed tomography (CT) test methods

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of X-ray computed tomography (CT) imaging equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE test results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information object definitio...

  14. Computational Identification of Protein Pupylation Sites by Using Profile-Based Composition of k-Spaced Amino Acid Pairs.

    Directory of Open Access Journals (Sweden)

    Md Mehedi Hasan

    Full Text Available Prokaryotic proteins are regulated by pupylation, a type of post-translational modification that contributes to cellular function in bacterial organisms. In the pupylation process, prokaryotic ubiquitin-like protein (Pup) tagging is functionally analogous to ubiquitination in order to tag target proteins for proteasomal degradation. To date, several experimental methods have been developed to identify pupylated proteins and their pupylation sites, but these experimental methods are generally laborious and costly. Therefore, computational methods that can accurately predict potential pupylation sites based on protein sequence information are highly desirable. In this paper, a novel predictor termed pbPUP has been developed for accurate prediction of pupylation sites. In particular, a sophisticated sequence encoding scheme [i.e. the profile-based composition of k-spaced amino acid pairs (pbCKSAAP)] is used to represent the sequence patterns and evolutionary information of the sequence fragments surrounding pupylation sites. Then, a Support Vector Machine (SVM) classifier is trained using the pbCKSAAP encoding scheme. The final pbPUP predictor achieves an AUC value of 0.849 in 10-fold cross-validation tests and outperforms other existing predictors on a comprehensive independent test dataset. The proposed method is anticipated to be a helpful computational resource for the prediction of pupylation sites. The web server and curated datasets in this study are freely available at http://protein.cau.edu.cn/pbPUP/.
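
    For orientation, the sketch below computes a plain composition of k-spaced amino acid pairs (CKSAAP) for a sequence fragment; the profile-based weighting that gives pbCKSAAP its evolutionary information is not reproduced here, and the fragment and parameter values are arbitrary examples.

      from itertools import product

      AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

      def cksaap(fragment: str, k_max: int = 3) -> list[float]:
          """Composition of k-spaced amino acid pairs for a sequence fragment.
          For each gap k in 0..k_max, count every ordered pair of residues separated
          by k positions, normalised by the number of such windows. This is the
          plain CKSAAP; the profile-based pbCKSAAP additionally weights pairs by
          evolutionary profiles, which is not reproduced here."""
          pairs = ["".join(p) for p in product(AMINO_ACIDS, repeat=2)]
          features: list[float] = []
          for k in range(k_max + 1):
              counts = dict.fromkeys(pairs, 0)
              windows = len(fragment) - k - 1
              for i in range(windows):
                  pair = fragment[i] + fragment[i + k + 1]
                  if pair in counts:
                      counts[pair] += 1
              features.extend(counts[p] / windows for p in pairs)
          return features

      if __name__ == "__main__":
          vec = cksaap("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
          print(len(vec), "features; first five:", [round(v, 3) for v in vec[:5]])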

  15. Quantifying relative importance: Computing standardized effects in models with binary outcomes

    Science.gov (United States)

    Grace, James B.; Johnson, Darren; Lefcheck, Jonathan S.; Byrnes, Jarrett E.K.

    2018-01-01

    Scientists commonly ask questions about the relative importances of processes, and then turn to statistical models for answers. Standardized coefficients are typically used in such situations, with the goal being to compare effects on a common scale. Traditional approaches to obtaining standardized coefficients were developed with idealized Gaussian variables in mind. When responses are binary, complications arise that impact standardization methods. In this paper, we review, evaluate, and propose new methods for standardizing coefficients from models that contain binary outcomes. We first consider the interpretability of unstandardized coefficients and then examine two main approaches to standardization. One approach, which we refer to as the Latent-Theoretical or LT method, assumes that underlying binary observations there exists a latent, continuous propensity linearly related to the coefficients. A second approach, which we refer to as the Observed-Empirical or OE method, assumes responses are purely discrete and estimates error variance empirically via reference to a classical R2 estimator. We also evaluate the standard formula for calculating standardized coefficients based on standard deviations. Criticisms of this practice have been persistent, leading us to propose an alternative formula that is based on user-defined “relevant ranges”. Finally, we implement all of the above in an open-source package for the statistical software R.
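
    A minimal sketch of the latent-theoretical (LT) idea is given below, assuming a logistic link whose latent error variance is pi^2/3; it standardizes the slopes of a logistic regression by the predictor and latent-propensity standard deviations. The function is illustrative only and is not the authors' R implementation.

      import numpy as np
      import statsmodels.api as sm

      def lt_standardized_coefs(X: np.ndarray, y: np.ndarray) -> np.ndarray:
          """Latent-theoretical (LT) standardized coefficients for a logistic model:
          scale each slope by sd(x_j) and by the sd of the latent propensity,
          taking the latent error variance as pi^2/3 (logistic link). A minimal
          sketch of the LT idea discussed above, not the authors' package."""
          model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
          slopes = model.params[1:]                 # drop the intercept
          linear_pred = X @ slopes                  # latent propensity without error
          latent_sd = np.sqrt(np.var(linear_pred) + np.pi ** 2 / 3)
          return slopes * X.std(axis=0) / latent_sd

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          X = rng.normal(size=(500, 2))
          logits = 1.0 * X[:, 0] - 0.5 * X[:, 1]
          y = rng.binomial(1, 1 / (1 + np.exp(-logits)))
          print(np.round(lt_standardized_coefs(X, y), 3))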

  16. Geothermal-energy files in computer storage: sites, cities, and industries

    Energy Technology Data Exchange (ETDEWEB)

    O' Dea, P.L.

    1981-12-01

    The site, city, and industrial files are described. The data presented are from the hydrothermal site file containing about three thousand records which describe some of the principal physical features of hydrothermal resources in the United States. Data elements include: latitude, longitude, township, range, section, surface temperature, subsurface temperature, the field potential, and well depth for commercialization. (MHR)
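
    To make the record layout concrete, the sketch below defines an illustrative container for one hydrothermal site record using the data elements listed above; the field names and units are assumptions, not the file's actual format.

      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class HydrothermalSiteRecord:
          """Illustrative layout for one record of the hydrothermal site file,
          based on the data elements listed above; names and units are hypothetical."""
          latitude: float
          longitude: float
          township: str
          range_: str
          section: str
          surface_temperature_c: Optional[float] = None
          subsurface_temperature_c: Optional[float] = None
          field_potential_mw: Optional[float] = None
          well_depth_m: Optional[float] = None

      if __name__ == "__main__":
          record = HydrothermalSiteRecord(39.0, -118.7, "T20N", "R38E", "12",
                                          surface_temperature_c=85.0)
          print(record)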

  17. Features generated for computational splice-site prediction correspond to functional elements

    Directory of Open Access Journals (Sweden)

    Wilbur W John

    2007-10-01

    Full Text Available Abstract Background Accurate selection of splice sites during the splicing of precursors to messenger RNA requires both relatively well-characterized signals at the splice sites and auxiliary signals in the adjacent exons and introns. We previously described a feature generation algorithm (FGA) that is capable of achieving high classification accuracy on human 3' splice sites. In this paper, we extend the splice-site prediction to 5' splice sites and explore the generated features for biologically meaningful splicing signals. Results We present examples from the observed features that correspond to known signals, both core signals (including the branch site and pyrimidine tract) and auxiliary signals (including GGG triplets and exon splicing enhancers). We present evidence that features identified by FGA include splicing signals not found by other methods. Conclusion Our generated features capture known biological signals in the expected sequence interval flanking splice sites. The method can be easily applied to other species and to similar classification problems, such as tissue-specific regulatory elements, polyadenylation sites, promoters, etc.

  18. Nuclear event time histories and computed site transfer functions for locations in the Los Angeles region

    Science.gov (United States)

    Rogers, A.M.; Covington, P.A.; Park, R.B.; Borcherdt, R.D.; Perkins, D.M.

    1980-01-01

    This report presents a collection of Nevada Test Site (NTS) nuclear explosion recordings obtained at sites in the greater Los Angeles, Calif., region. The report includes ground velocity time histories, as well as derived site transfer functions. These data have been collected as part of a study to evaluate the validity of using low-level ground motions to predict the frequency-dependent response of a site during an earthquake. For this study 19 nuclear events were recorded at 98 separate locations. Some of these sites have recorded more than one of the nuclear explosions, and, consequently, there are a total of 159 three-component station records. The locations of all the recording sites are shown in figures 1–5, and the station coordinates and abbreviations are given in table 1. The station addresses are listed in table 2, and the nuclear explosions that were recorded are listed in table 3. The recording sites were chosen on the basis of three criteria: (1) that the underlying geological conditions were representative of conditions over significant areas of the region, (2) that the site was the location of a strong-motion recording of the 1971 San Fernando earthquake, or (3) that more complete geographical coverage was required in that location.

  19. Computer analysis of protein functional sites projection on exon structure of genes in Metazoa.

    Science.gov (United States)

    Medvedeva, Irina V; Demenkov, Pavel S; Ivanisenko, Vladimir A

    2015-01-01

    Study of the relationship between the structural and functional organization of proteins and their coding genes is necessary for an understanding of the evolution of molecular systems and can provide new knowledge for many applications, such as designing proteins with improved medical and biological properties. It is well known that the functional properties of proteins are determined by their functional sites. Functional sites are usually represented by a small number of amino acid residues that are distantly located from each other in the amino acid sequence. They are highly conserved within their functional group and vary significantly in structure between such groups. Given these facts, analysis of the general properties of the structural organization of functional sites at the protein level, and at the level of the exon-intron structure of the coding gene, remains an open problem. One approach to this analysis is the projection of amino acid residue positions of the functional sites along with the exon boundaries to the gene structure. In this paper, we examined the discontinuity of the functional sites in the exon-intron structure of genes and the distribution of lengths and phases of the functional site encoding exons in vertebrate genes. We have shown that the DNA fragments coding the functional sites were in the same exons, or in close exons. We observed a tendency for the exons that code functional sites to cluster, which could be considered a unit of protein evolution. We studied the characteristics of the structure of the exon boundaries that code, and do not code, functional sites in 11 Metazoa species. This is accompanied by a reduced frequency of intercodon gaps (phase 0) in exons encoding functional site amino acid residues, which may be evidence of the existence of evolutionary limitations on exon shuffling. These results characterize the features of the coding exon-intron structure that affect the functionality of the encoded protein and

  20. CAMAC - an introduction into a system of standardized highways between computers and their peripherals

    International Nuclear Information System (INIS)

    Stuckenberg, H.J.

    1975-10-01

    CAMAC, which stands for 'Computer Automated Measurement and Control', is a set of rules widely used in many countries for connecting processors and computers to on-line peripherals. There are rules for an interface that transfers information via a common highway, as well as for the modular mechanical units in which the peripheral devices are housed together with the multipole connectors linking the computer to the controlled process. All peripherals in a system send data and control information to the computer through parallel or serial highways, which are also defined by the CAMAC rules. The use of CAMAC makes it possible to combine compatible hardware from various suppliers in any system without mechanical or electrical difficulties, making hardware and software implementation much easier. Reconfiguration of a system for new or different activities is also relatively fast and simple. Compatible devices are offered by about 60 suppliers on all five continents. (orig.) [de

  1. Computing the temperature dependence of effective CP violation in the standard model

    Czech Academy of Sciences Publication Activity Database

    Brauner, Tomáš; Taanila, O.; Tranberg, A.; Vuorinen, A.

    2012-01-01

    Vol. 2012, No. 11 (2012), 076 ISSN 1126-6708 Institutional support: RVO:61389005 Keywords: CP violation * Thermal Field Theory * Standard Model Subject RIV: BE - Theoretical Physics Impact factor: 5.618, year: 2012

  2. A novel representation of inter-site tumour heterogeneity from pre-treatment computed tomography textures classifies ovarian cancers by clinical outcome

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, Hebert Alberto; Micco, Maura; Lakhman, Yulia; Meier, Andreas A.; Sosa, Ramon; Hricak, Hedvig; Sala, Evis [Memorial Sloan Kettering Cancer Center, Department of Radiology, New York, NY (United States); Veeraraghavan, Harini; Deasy, Joseph [Memorial Sloan Kettering Cancer Center, Department of Medical Physics, New York, NY (United States); Nougaret, Stephanie [Memorial Sloan Kettering Cancer Center, Department of Radiology, New York, NY (United States); Service de Radiologie, Institut Regional du Cancer de Montpellier, Montpellier (France); INSERM, U1194, Institut de Recherche en Cancerologie de Montpellier (IRCM), Montpellier (France); Soslow, Robert A.; Weigelt, Britta [Memorial Sloan Kettering Cancer Center, Department of Pathology, New York, NY (United States); Levine, Douglas A. [Memorial Sloan Kettering Cancer Center, Department of Surgery, New York, NY (United States); Aghajanian, Carol; Snyder, Alexandra [Memorial Sloan Kettering Cancer Center, Department of Medicine, New York, NY (United States)

    2017-09-15

    To evaluate the associations between clinical outcomes and radiomics-derived inter-site spatial heterogeneity metrics across multiple metastatic lesions on CT in patients with high-grade serous ovarian cancer (HGSOC). IRB-approved retrospective study of 38 HGSOC patients. All sites of suspected HGSOC involvement on preoperative CT were manually segmented. Gray-level correlation matrix-based textures were computed from each tumour site, and grouped into five clusters using a Gaussian Mixture Model. Pairwise inter-site similarities were computed, generating an inter-site similarity matrix (ISM). Inter-site texture heterogeneity metrics were computed from the ISM and compared to clinical outcomes. Of the 12 inter-site texture heterogeneity metrics evaluated, those capturing the differences in texture similarities across sites were associated with shorter overall survival (inter-site similarity entropy, similarity level cluster shade, and inter-site similarity level cluster prominence; p ≤ 0.05) and incomplete surgical resection (similarity level cluster shade, inter-site similarity level cluster prominence and inter-site cluster variance; p ≤ 0.05). Neither the total number of disease sites per patient nor the overall tumour volume per patient was associated with overall survival. Amplification of 19q12 involving cyclin E1 gene (CCNE1) predominantly occurred in patients with more heterogeneous inter-site textures. Quantitative metrics non-invasively capturing spatial inter-site heterogeneity may predict outcomes in patients with HGSOC. (orig.)
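
    The sketch below shows one simple way such an inter-site similarity matrix and a similarity-entropy summary could be computed from per-site texture vectors; the kernel choice and the entropy definition are illustrative and do not reproduce the exact metrics used in the study.

      import numpy as np

      def inter_site_similarity_matrix(site_textures: np.ndarray) -> np.ndarray:
          """Pairwise similarity between tumour sites from their texture vectors.
          An RBF kernel on Euclidean distance is used here for illustration; the
          paper's exact similarity definition is not reproduced."""
          diffs = site_textures[:, None, :] - site_textures[None, :, :]
          dist2 = (diffs ** 2).sum(axis=-1)
          scale = np.median(dist2[dist2 > 0]) if (dist2 > 0).any() else 1.0
          return np.exp(-dist2 / scale)

      def inter_site_similarity_entropy(similarity: np.ndarray) -> float:
          """Shannon entropy of the normalised upper-triangular similarities:
          one simple way to summarise how uneven the inter-site similarities are."""
          iu = np.triu_indices_from(similarity, k=1)
          values = similarity[iu]
          p = values / values.sum()
          return float(-(p * np.log(p + 1e-12)).sum())

      if __name__ == "__main__":
          rng = np.random.default_rng(42)
          textures = rng.normal(size=(6, 20))   # 6 metastatic sites x 20 texture features
          sim = inter_site_similarity_matrix(textures)
          print("entropy of inter-site similarities:", round(inter_site_similarity_entropy(sim), 3))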

  3. Formulation and practice of standards for radiation protection of γ-ray industrial computed tomography

    International Nuclear Information System (INIS)

    Zhou Rifeng; Wang Jue; Chen Weimin; Li Ping

    2009-01-01

    There are many differences between industrial CT and industrial radiography, such as the imaging principle, inspection time, radiation dose and the requirements for operators. To some extent, the current national standards for radiation protection in industrial inspection are not applicable to the protection and safety requirements of γ-ray industrial CT. In order to standardize the production and use of γ-ray industrial CT, protect the safety of operators and the public, and promote the popularization and application of γ-ray industrial CT, it is important to establish national standards for radiation protection of γ-ray industrial CT as soon as possible. The purpose of this paper is to introduce the contents of this standard and to clarify some important terms. There is then a brief discussion of the problems encountered in establishing such standards. Finally, the paper summarizes the practice of the standard over the year since it was passed, which provides practical experience for its further implementation. (authors)

  4. Computer Models Used to Support Cleanup Decision Making at Hazardous and Radioactive Waste Sites

    Science.gov (United States)

    This report, a product of the Interagency Environmental Pathway Modeling Workgroup, will help bring a uniform approach to solving environmental modeling problems common to site remediation and restoration efforts.

  5. Support for Maui Space Surveillance Site and Maui High Performance Computing Center

    National Research Council Canada - National Science Library

    1999-01-01

    ...) for the Maui Space Surveillance Site. GEMINI, not to be confused with the National Science Foundation's Gemini Telescopes Project, is a one-of-a-kind sensor package built for USAF Space Command operational use in conjunction...

  6. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently set up a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithm improvement via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk the ATLAS Computing Agora (ACA) web platform will be presented as well as some of the specific material developed for these projects.

  7. Computational approaches to standard-compliant biofilm data for reliable analysis and integration

    Directory of Open Access Journals (Sweden)

    Sousa Ana Margarida

    2012-12-01

    Full Text Available The study of microorganism consortia, also known as biofilms, is associated with a number of applications in biotechnology, ecotechnology and clinical domains. Nowadays, biofilm studies are heterogeneous and data-intensive, encompassing different levels of analysis. Computational modelling of biofilm studies has thus become a requirement to make sense of these vast and ever-expanding biofilm data volumes.

  8. Contributing to global computing platform: gliding, tunneling standard services and high energy physics application

    International Nuclear Information System (INIS)

    Lodygensky, O.

    2006-09-01

    Centralized computers have been replaced by 'client/server' distributed architectures, which are in turn in competition with new distributed systems known as 'peer to peer'. These new technologies are widespread, and commerce, industry and the research world have understood the goals involved and are investing massively in these new technologies, known as 'grids'. One of these fields is computing, which is the subject of the work presented here. At the Paris Orsay University, a synergy emerged between the Computing Science Laboratory (LRI) and the Linear Accelerator Laboratory (LAL) on grid infrastructure, opening new fields of investigation for the former and new high-performance computing perspectives for the latter. The work presented here is the result of this multi-disciplinary collaboration. It is based on XtremWeb, the LRI global computing platform. We first present the state of the art of large-scale distributed systems, their principles and their service-based architecture. We then introduce XtremWeb and detail the modifications and improvements we had to specify and implement to achieve our goals. We present two different studies: first, interconnecting grids in order to generalize resource sharing, and second, making it possible to use legacy services on such platforms. We finally explain how a research community such as the high energy cosmic radiation detection community can gain access to these services, and detail Monte Carlo and data analysis processes over the grids. (author)

  9. Virtual photons in imaginary time: Computing exact Casimir forces via standard numerical electromagnetism techniques

    International Nuclear Information System (INIS)

    Rodriguez, Alejandro; Ibanescu, Mihai; Joannopoulos, J. D.; Johnson, Steven G.; Iannuzzi, Davide

    2007-01-01

    We describe a numerical method to compute Casimir forces in arbitrary geometries, for arbitrary dielectric and metallic materials, with arbitrary accuracy (given sufficient computational resources). Our approach, based on well-established integration of the mean stress tensor evaluated via the fluctuation-dissipation theorem, is designed to directly exploit fast methods developed for classical computational electromagnetism, since it only involves repeated evaluation of the Green's function for imaginary frequencies (equivalently, real frequencies in imaginary time). We develop the approach by systematically examining various formulations of Casimir forces from the previous decades and evaluating them according to their suitability for numerical computation. We illustrate our approach with a simple finite-difference frequency-domain implementation, test it for known geometries such as a cylinder and a plate, and apply it to new geometries. In particular, we show that a pistonlike geometry of two squares sliding between metal walls, in both two and three dimensions with both perfect and realistic metallic materials, exhibits a surprising nonmonotonic ''lateral'' force from the walls
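
    The general structure of the stress-tensor approach can be written schematically as follows; this is a hedged sketch of the standard surface-integral form, with prefactors and material dispersion omitted, not a transcription of the authors' expressions.

    ```latex
    % Schematic Casimir force on a body enclosed by a surface S, from the
    % mean Maxwell stress tensor (prefactors omitted):
    F_i \;=\; \oint_S \sum_j \big\langle T_{ij}(\mathbf{r}) \big\rangle \, dS_j ,
    \qquad
    \big\langle T_{ij} \big\rangle \;=\; \int_0^{\infty} d\xi \,
    \Big[ \langle E_i E_j \rangle_\xi - \tfrac{1}{2}\,\delta_{ij}\langle \mathbf{E}^2 \rangle_\xi
        + \langle H_i H_j \rangle_\xi - \tfrac{1}{2}\,\delta_{ij}\langle \mathbf{H}^2 \rangle_\xi \Big]
    ```

    The field correlation functions at imaginary frequency ω = iξ follow from the classical Green's function via the fluctuation-dissipation theorem, which is what allows standard computational electromagnetism solvers to be reused for the evaluation.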

  10. Towards a "Golden Standard" for computing globin stability: Stability and structure sensitivity of myoglobin mutants

    DEFF Research Database (Denmark)

    Kepp, Kasper Planeta

    2015-01-01

    Fast and accurate computation of protein stability is increasingly important for e.g. protein engineering and protein misfolding diseases, but no consensus methods exist for important proteins such as globins, and performance may depend on the type of structural input given. This paper reports be...

  11. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    Science.gov (United States)

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  12. Use of Standardized Test Scores to Predict Success in a Computer Applications Course

    Science.gov (United States)

    Harris, Robert V.; King, Stephanie B.

    2016-01-01

    The purpose of this study was to see if a relationship existed between American College Testing (ACT) scores (i.e., English, reading, mathematics, science reasoning, and composite) and student success in a computer applications course at a Mississippi community college. The study showed that while the ACT scores were excellent predictors of…

  13. A concept to standardize raw biosignal transmission for brain-computer interfaces.

    Science.gov (United States)

    Breitwieser, Christian; Neuper, Christa; Müller-Putz, Gernot R

    2011-01-01

    With this concept we introduce an attempt at a standardized interface, called TiA, for transmitting raw biosignals. TiA is able to deal with multirate and block-oriented data transmission. Data are distinguished by signal type (e.g., EEG, EOG, NIRS, …), and signals of different types can be acquired at the same time from different acquisition devices. TiA is built on a client-server model. Multiple clients can connect to one server. Information is exchanged via a control connection and a separate data connection. Control commands and meta information are transmitted over the control connection. Raw biosignal data are delivered over the data connection in a unidirectional way. For this purpose a standardized handshaking protocol and raw data packet have been developed. Thus, an abstraction layer between hardware devices and data processing has been created, facilitating standardization.
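
    The split between a bidirectional control connection and a unidirectional data connection can be illustrated with a minimal TCP sketch; the ports, meta-information fields, and packet layout below are hypothetical placeholders, not the actual TiA protocol.

    ```python
    import json
    import socket
    import struct

    CONTROL_PORT, DATA_PORT = 9000, 9001   # hypothetical ports, not TiA's

    def serve_once(sample_blocks):
        """Accept one client: exchange meta information on the control socket,
        then stream raw sample blocks unidirectionally on the data socket."""
        ctrl = socket.create_server(("localhost", CONTROL_PORT))
        data = socket.create_server(("localhost", DATA_PORT))

        conn, _ = ctrl.accept()
        _request = conn.recv(1024)                       # client asks for meta info
        meta = {"signals": ["EEG", "EOG"], "rate_hz": 512, "block_size": 32}
        conn.sendall(json.dumps(meta).encode())          # control channel: meta info

        dconn, _ = data.accept()
        for block in sample_blocks:                      # data channel: raw packets
            payload = struct.pack(f"<{len(block)}f", *block)
            dconn.sendall(struct.pack("<I", len(payload)) + payload)
        dconn.close(); conn.close(); ctrl.close(); data.close()
    ```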

  14. Computational prediction of cAMP receptor protein (CRP binding sites in cyanobacterial genomes

    Directory of Open Access Journals (Sweden)

    Su Zhengchang

    2009-01-01

    Full Text Available Abstract Background Cyclic AMP receptor protein (CRP, also known as catabolite gene activator protein (CAP, is an important transcriptional regulator widely distributed in many bacteria. The biological processes under the regulation of CRP are highly diverse among different groups of bacterial species. Elucidation of CRP regulons in cyanobacteria will further our understanding of the physiology and ecology of this important group of microorganisms. Previously, CRP has been experimentally studied in only two cyanobacterial strains: Synechocystis sp. PCC 6803 and Anabaena sp. PCC 7120; therefore, a systematic genome-scale study of the potential CRP target genes and binding sites in cyanobacterial genomes is urgently needed. Results We have predicted and analyzed the CRP binding sites and regulons in 12 sequenced cyanobacterial genomes using a highly effective cis-regulatory binding site scanning algorithm. Our results show that cyanobacterial CRP binding sites are very similar to those in E. coli; however, the regulons are very different from that of E. coli. Furthermore, CRP regulons in different cyanobacterial species/ecotypes are also highly diversified, ranging from photosynthesis, carbon fixation and nitrogen assimilation, to chemotaxis and signal transduction. In addition, our prediction indicates that crp genes in modern cyanobacteria are likely inherited from a common ancestral gene in their last common ancestor, and have adapted various cellular functions in different environments, while some cyanobacteria lost their crp genes as well as CRP binding sites during the course of evolution. Conclusion The CRP regulons in cyanobacteria are highly diversified, probably as a result of divergent evolution to adapt to various ecological niches. Cyanobacterial CRPs may function as lineage-specific regulators participating in various cellular processes, and are important in some lineages. However, they are dispensable in some other lineages. The
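
    Genome-scale binding-site scans of this kind are commonly built on position weight matrix (PWM) scoring; the following generic log-odds scan is an assumed illustration with a toy motif, not the specific algorithm or CRP matrix used in the study.

    ```python
    import math

    def scan_pwm(sequence, pwm, background=0.25, threshold=6.0):
        """Slide a PWM over a DNA sequence; report positions scoring above threshold.

        pwm: list of dicts, one per motif position, mapping base -> probability.
        """
        width, hits = len(pwm), []
        for start in range(len(sequence) - width + 1):
            window = sequence[start:start + width]
            score = sum(
                math.log2((pwm[i].get(base, 1e-6) + 1e-6) / background)
                for i, base in enumerate(window)
            )
            if score >= threshold:
                hits.append((start, score))
        return hits

    # Toy 4-position motif, loosely resembling half of a palindromic operator site.
    pwm = [{"T": 0.9, "A": 0.05, "C": 0.03, "G": 0.02},
           {"G": 0.8, "A": 0.1, "T": 0.05, "C": 0.05},
           {"T": 0.7, "C": 0.2, "A": 0.05, "G": 0.05},
           {"G": 0.9, "A": 0.05, "T": 0.03, "C": 0.02}]
    print(scan_pwm("ACGTTGTGACGT", pwm))
    ```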

  15. SABER: a computational method for identifying active sites for new reactions.

    Science.gov (United States)

    Nosrati, Geoffrey R; Houk, K N

    2012-05-01

    A software suite, SABER (Selection of Active/Binding sites for Enzyme Redesign), has been developed for the analysis of atomic geometries in protein structures, using a geometric hashing algorithm (Barker and Thornton, Bioinformatics 2003;19:1644-1649). SABER is used to explore the Protein Data Bank (PDB) to locate proteins with a specific 3D arrangement of catalytic groups to identify active sites that might be redesigned to catalyze new reactions. As a proof-of-principle test, SABER was used to identify enzymes that have the same catalytic group arrangement present in o-succinyl benzoate synthase (OSBS). Among the highest-scoring scaffolds identified by the SABER search for enzymes with the same catalytic group arrangement as OSBS were L-Ala D/L-Glu epimerase (AEE) and muconate lactonizing enzyme II (MLE), both of which have been redesigned to become effective OSBS catalysts, demonstrated by experiments. Next, we used SABER to search for naturally existing active sites in the PDB with catalytic groups similar to those present in the designed Kemp elimination enzyme KE07. From over 2000 geometric matches to the KE07 active site, SABER identified 23 matches that corresponded to residues from known active sites. The best of these matches, with a 0.28 Å catalytic atom RMSD to KE07, was then redesigned to be compatible with the Kemp elimination using RosettaDesign. We also used SABER to search for potential Kemp eliminases using a theozyme predicted to provide a greater rate acceleration than the active site of KE07, and used Rosetta to create a design based on the proteins identified. Copyright © 2012 The Protein Society.
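
    The geometric comparison at the heart of such a search, superimposing a query's catalytic atoms onto a candidate arrangement and reporting the RMSD, can be sketched with the Kabsch algorithm; this is an illustrative stand-in, not SABER's geometric hashing implementation.

    ```python
    import numpy as np

    def catalytic_rmsd(query_xyz, candidate_xyz):
        """Optimal-superposition RMSD between two equally sized, equally ordered
        sets of catalytic-atom coordinates (Kabsch algorithm)."""
        P = query_xyz - query_xyz.mean(axis=0)       # centre both point sets
        Q = candidate_xyz - candidate_xyz.mean(axis=0)
        H = P.T @ Q                                  # covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
        D = np.diag([1.0, 1.0, d])
        R = Vt.T @ D @ U.T                           # optimal rotation
        diff = (R @ P.T).T - Q
        return float(np.sqrt((diff ** 2).sum() / len(P)))
    ```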

  16. Position paper on the applicability of supplemental standards to the uppermost aquifer at the Uranium Mill Tailings Vitro Processing Site, Salt Lake City, Utah

    International Nuclear Information System (INIS)

    1996-03-01

    This report documents the results of the evaluation of the potential applicability of supplemental standards to the uppermost aquifer underlying the Uranium Mill Tailings Remedial Action (UMTRA) Project, Vitro Processing Site, Salt Lake City, Utah. There are two goals for this evaluation: provide the landowner with information to make an early qualitative decision on the possible use of the Vitro property, and evaluate the proposed application of supplemental standards as the ground water compliance strategy at the site. Justification of supplemental standards is based on the contention that the uppermost aquifer is of limited use due to wide-spread ambient contamination not related to the previous site processing activities. In support of the above, this report discusses the site conceptual model for the uppermost aquifer and related hydrogeological systems and establishes regional and local background water quality. This information is used to determine the extent of site-related and ambient contamination. A risk-based evaluation of the contaminants' effects on current and projected land uses is also provided. Reports of regional and local studies and U.S. Department of Energy (DOE) site investigations provided the basis for the conceptual model and established background ground water quality. In addition, a limited field effort (4 through 28 March 1996) was conducted to supplement existing data, particularly addressing the extent of contamination in the northwestern portion of the Vitro site and site background ground water quality. Results of the field investigation were particularly useful in refining the conceptual site model. This was important in light of the varied ground water quality within the uppermost aquifer. Finally, this report provides a critical evaluation, along with the related uncertainties, of the applicability of supplemental standards to the uppermost aquifer at the Salt Lake City Vitro processing site

  17. Effects of standard and explicit cognitive bias modification and computer-administered cognitive-behaviour therapy on cognitive biases and social anxiety.

    Science.gov (United States)

    Mobini, Sirous; Mackintosh, Bundy; Illingworth, Jo; Gega, Lina; Langdon, Peter; Hoppitt, Laura

    2014-06-01

    This study examines the effects of a single session of Cognitive Bias Modification to induce positive Interpretative bias (CBM-I), using standard or explicit instructions, and an analogue of a computer-administered CBT (c-CBT) program on modifying cognitive biases and social anxiety. A sample of 76 volunteers with social anxiety attended a research site. At both pre- and post-test, participants completed two computer-administered tests of interpretative and attentional biases and a self-report measure of social anxiety. Participants in the training conditions completed a single session of either standard or explicit CBM-I positive training and a c-CBT program. Participants in the Control (no training) condition completed a neutral CBM-I task that matched the active CBM-I intervention in format and duration but did not encourage positive disambiguation of socially ambiguous or threatening scenarios. Participants in both CBM-I programs (either standard or explicit instructions) and the c-CBT condition exhibited more positive interpretations of ambiguous social scenarios at post-test and one-week follow-up as compared to the Control condition. Moreover, the results showed that CBM-I and c-CBT, to some extent, changed negative attention biases in a positive direction. Furthermore, the results showed that both CBM-I training conditions and c-CBT reduced social anxiety symptoms at one-week follow-up. This study used a single session of CBM-I training; however, a multi-session intervention might result in more durable positive CBM-I changes. A computerised single session of CBM-I and an analogue of a c-CBT program reduced negative interpretative biases and social anxiety. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Cone beam computed tomography in veterinary dentistry: description and standardization of the technique

    International Nuclear Information System (INIS)

    Roza, Marcello R.; Silva, Luiz A.F.; Fioravanti, Maria C. S.; Barriviera, Mauricio

    2009-01-01

    Eleven dogs and four cats with buccodental alterations, treated at the Centro Veterinario do Gama in Brasilia, DF, Brazil, were submitted to cone beam computed tomography. The exams were carried out on an i-CAT tomograph, using the following acquisition parameters: six centimeters of scan height, 40 seconds of scan time, 0.2 voxel, 120 kilovolts and 46.72 milliamperes per second. The ideal positioning of the animal for the exam was also determined in this study and proved fundamental for a successful examination, which required only a simple and safe anesthetic protocol owing to the relatively short time needed to obtain the images. Several alterations and diseases were identified with accurate imaging, demonstrating that cone beam computed tomography is a safe, accessible and feasible imaging method that could be included in routine small animal dentistry diagnosis. (author)

  19. Using Web Services and XML Harvesting to Achieve a Dynamic Web Site. Computers in Small Libraries

    Science.gov (United States)

    Roberts, Gary

    2005-01-01

    Exploiting and contextualizing free information is a natural part of library culture. In this column, Gary Roberts, the information systems and reference librarian at Herrick Library, Alfred University in Alfred, NY, describes how to use XML content on a Web site to link to hundreds of free and useful resources. He gives a general overview of the…

  20. Using Action Research To Create a Computer-Assisted Homework Site.

    Science.gov (United States)

    Packard, Abbot L.; Holmes, Glen A.

    This paper investigates a collaboration between faculty and students in a college statistics course to develop a method of quickly getting homework graded with feedback indicated and returned to the students. Using a World Wide Web site to deliver this support was a possible solution. A survey was developed to gain student input in the process of…

  1. A prospective evaluation of conventional cystography for detection of urine leakage at the vesicourethral anastomosis site after radical prostatectomy based on computed tomography.

    Science.gov (United States)

    Han, K S; Choi, H J; Jung, D C; Park, S; Cho, K S; Joung, J Y; Seo, H K; Chung, J; Lee, K H

    2011-03-01

    To evaluate the diagnostic accuracy of conventional cystography for the detection of urine leakage at the vesicourethral anastomosis (VUA) site after radical prostatectomy based on computed tomography (CT) cystography. Patients who underwent radical prostatectomies at a single tertiary cancer centre were prospectively enrolled. Conventional cystography was routinely performed on postoperative day 7. Non-enhanced pelvic CT images were obtained after retrograde instillation of the same contrast material for a reference standard of urine leakage at the VUA site. Urine leakage was classified as follows: none; a plication abnormality; mild; moderate; and excessive. One hundred and twenty consecutive patients were enrolled. Conventional cystography detected 14 urine leakages, but CT cystography detected 40 urine leakages, which consisted of 28 mild and 12 moderate urine leakages. When using CT cystography as the standard measurement, conventional cystography showed a diagnostic accuracy of 17.8% (5/28) for mild urine leakage and 75% (9/12) for moderate leakage. Of nine patients diagnosed with mild leakage on conventional cystography, four (44.4%) had complicated moderate urine leakages based on CT cystography, requiring prolonged catheterization. The sensitivity, specificity, positive and negative predictive values, and accuracy of conventional cystography were 35, 100, 100, 75.4, and 78.3%, respectively. Conventional cystography is less accurate than CT cystography for diagnosing urine leakage at the VUA site after a radical prostatectomy. The present results suggest that CT cystography is a good choice for diagnostic imaging of urine leakage after radical prostatectomy. Copyright © 2010 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
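
    The reported figures are internally consistent with a 2x2 table of 14 true positives, 26 false negatives, 80 true negatives and, as implied by the 100% positive predictive value, no false positives; the small check below uses only the numbers quoted in the abstract.

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Standard 2x2 diagnostic accuracy measures from confusion counts."""
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / (tp + fp + fn + tn),
        }

    # 120 patients, 40 leakages on CT cystography, 14 detected on conventional
    # cystography (assumed: no conventional false positives, per the 100% PPV).
    print(diagnostic_metrics(tp=14, fp=0, fn=26, tn=80))
    # -> sensitivity 0.35, specificity 1.0, ppv 1.0, npv ~0.754, accuracy ~0.783
    ```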

  2. Standard schematics for small heat pumps - Part 2: fundamentals and computer simulations; Standardschaltungen fuer Kleinwaermepumpenanlagen. Teil 2: Grundlagen und Computersimulationen

    Energy Technology Data Exchange (ETDEWEB)

    Afjei, Th.; Schonhardt, U.; Wemhoener, C. [Fachhochschule beider Basel, Muttenz (Switzerland); Erb, M. [Eicher und Pauli AG, Liestal (Switzerland); Gabathuler, H.R.; Mayer, H. [Gabathuler AG, Diessenhofen (Switzerland); Zweifel, G.; Achermann, M.; Euw, R. von; Stoeckli, U. [Hochschule fuer Technik und Architektur (HTA), Fachhochschule Zentralschweiz, Horw (Switzerland)

    2002-07-01

    This final report for the Swiss Federal Office of Energy (SFOE) presents the results of the second stage of the STASCH project (Standard Schemes for Small Heat Pump Systems up to 25 kW) that, with the aid of computer simulation, was to investigate certain issues in connection with heat pump configurations. The simulations were used to clarify questions resulting from a previous project (FAWA) that involved the analysis of heat pump systems in the field. The findings of the simulation have been incorporated into straightforward design tools for heat pump installers. The methodology behind the development of seven standard installation schematics is discussed. The testing, using computer simulation, of several variants for space heating and combined space-heating / hot water applications in both new and existing buildings is described. The factors taken into consideration when choosing the seven standard schematics, such as comfort, annual efficiency, power consumption, investment and operating costs and technical reliability, are discussed. The design tool, which helps choose correctly-sized heat pumps, piping, circulation pumps, thermal storage and the positioning of temperature sensors is introduced.

  3. A Standard Mutual Authentication Protocol for Cloud Computing Based Health Care System.

    Science.gov (United States)

    Mohit, Prerna; Amin, Ruhul; Karati, Arijit; Biswas, G P; Khan, Muhammad Khurram

    2017-04-01

    A Telecare Medical Information System (TMIS) provides a standard platform through which patients can obtain necessary medical treatment from doctor(s) via Internet communication. Security protection is important for patients' medical records (data) because of their very sensitive information. Besides, patient anonymity is another most important property that must be protected. Most recently, Chiou et al. suggested an authentication protocol for TMIS utilizing the concept of a cloud environment. They claimed that their protocol preserves patient anonymity and is well protected. We reviewed their protocol and found that it completely fails to preserve patient anonymity. Further, the same protocol is not protected against a stolen mobile device attack. In order to improve the security level and complexity, we design a lightweight authentication protocol for the same environment. Our security analysis ensures resilience against all the considered security attacks. The performance of our protocol is comparable to that of the related previous research.

  4. Computational evaluation of a pencil ionization chamber in a standard diagnostic radiology beam

    International Nuclear Information System (INIS)

    Mendonca, Dalila Souza Costa; Neves, Lucio Pereira; Perini, Ana Paula; Belinato, Walmir

    2016-01-01

    In this work a pencil ionization chamber was evaluated. The evaluation consisted of determining the influence of the ionization chamber components on its response. For this purpose, Monte Carlo simulations and the spectrum of the standard diagnostic radiology beam (RQR5) were utilized. The results showed that the ionization chamber components had no significant influence on the chamber response. Therefore, this ionization chamber is a good alternative for dosimetry in diagnostic radiology. (author)

  5. Some considerations in standard gas leak designs and their applications to computer control systems

    International Nuclear Information System (INIS)

    Winkelman, C.R.; Wedel, T.A.

    The primary difficulty with flow rate measurements below 10^-10 standard cubic centimeters per second (std. cc/sec) is that there are no commercially available standards. The requirements, however, dictated that the problem of designing and constructing a qualifiable standard in the ultra-sensitive range had to be solved. A number of leak types were considered: capillary leaks, orifice leaks, and pore-type leaks, among others. The capillary leak was not used because of the cracking, or sorting, effects that are common to this type of leak; for example, a gas blend flowing through a capillary leak will result in the lighter gases passing through the leak first. The difficulty of fabricating the proper hole size for the required flow rates ruled out the orifice-type leak. The choice was therefore the pore-type leak, which utilizes the basic concept of a stainless steel knife edge driven into a fixed section composed of stainless steel with a gold overlay and maintained under force.

  6. Analyzing Dental Implant Sites From Cone Beam Computed Tomography Scans on a Tablet Computer: A Comparative Study Between iPad and 3 Display Systems.

    Science.gov (United States)

    Carrasco, Alejandro; Jalali, Elnaz; Dhingra, Ajay; Lurie, Alan; Yadav, Sumit; Tadinada, Aditya

    2017-06-01

    The aim of this study was to compare a medical-grade PACS (picture archiving and communication system) monitor, a consumer-grade monitor, a laptop computer, and a tablet computer for linear measurements of height and width for specific implant sites in the posterior maxilla and mandible, along with visualization of the associated anatomical structures. Cone beam computed tomography (CBCT) scans were evaluated. The images were reviewed using PACS-LCD monitor, consumer-grade LCD monitor using CB-Works software, a 13″ MacBook Pro, and an iPad 4 using OsiriX DICOM reader software. The operators had to identify anatomical structures in each display using a 2-point scale. User experience between PACS and iPad was also evaluated by means of a questionnaire. The measurements were very similar for each device. P-values were all greater than 0.05, indicating no significant difference between the monitors for each measurement. The intraoperator reliability was very high. The user experience was similar in each category with the most significant difference regarding the portability where the PACS display received the lowest score and the iPad received the highest score. The iPad with retina display was comparable with the medical-grade monitor, producing similar measurements and image visualization, and thus providing an inexpensive, portable, and reliable screen to analyze CBCT images in the operating room during the implant surgery.

  7. Computer-aided mapping of stream channels beneath the Lawrence Livermore National Laboratory Super Fund Site

    Energy Technology Data Exchange (ETDEWEB)

    Sick, M. [Lawrence Livermore National Lab., CA (United States)

    1994-12-01

    The Lawrence Livermore National Laboratory (LLNL) site rests upon 300-400 feet of highly heterogeneous braided stream sediments which have been contaminated by a plume of Volatile Organic Compounds (VOCs). The stream channels are filled with highly permeable coarse-grained materials that provide quick avenues for contaminant transport. The plume of VOCs has migrated off site in the TFA area, making it the area of greatest concern. I mapped the paleo-stream channels in the TFA area using SLICE, an LLNL Auto-CADD routine. SLICE constructed 2D cross sections and sub-horizontal views of chemical, geophysical, and lithologic data sets. I interpreted these 2D views as a braided stream environment, delineating the edges of stream channels. The interpretations were extracted from Auto-CADD and placed into Earth Vision's 3D modeling and viewing routines. Several 3D correlations have been generated, but no model has yet been chosen as a best fit.

  8. A computational model of modern standard arabic verbal morphology based on generation

    OpenAIRE

    González Martínez, Alicia

    2013-01-01

    Unpublished doctoral thesis, defended at the Universidad Autónoma de Madrid, Facultad de Filosofía y Letras, Departamento de Lingüística, Lenguas Modernas, Lógica y Fª de la Ciencia y Tª de la Literatura y Literatura Comparada. Date of defence: 29-01-2013. The computational handling of non-concatenative morphologies is still a challenge in the field of natural language processing. Amongst the various areas of research, Arabic morphology stands out due to its highly complex structure. We propose a m...

  9. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Control modules C4, C6

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U. S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume is part of the manual related to the control modules for the newest updated version of this computational package.

  10. Design and implementation of a computer based site operations log for the ARM Program

    International Nuclear Information System (INIS)

    Tichler, J.L.; Bernstein, H.J.; Bobrowski, S.F.; Melton, R.B.; Campbell, A.P.; Edwards, D.M.; Kanciruk, P.; Singley, P.T.

    1992-01-01

    The Atmospheric Radiation Measurement (ARM) Program is a Department of Energy (DOE) research effort to reduce the uncertainties found in general circulation and other models due to the effects of clouds and solar radiation. ARM will provide an experimental testbed for the study of important atmospheric effects, particularly cloud and radiative processes, and for testing parameterizations of these processes for use in atmospheric models. The design of the testbed, known as the Clouds and Radiation Testbed (CART), calls for five long-term field data collection sites. The first site, located in the Southern Great Plains (SGP) in Lamont, OK, began operation in the spring of 1992. The CART Data Environment (CDE) is the element of the testbed which acquires the basic observations from the instruments and processes them to meet the ARM requirements. A formal design was used to develop a description of the logical requirements for the CDE. This paper discusses the design and prototype implementation of a part of the CDE known as the site operations log, which records metadata defining the environment within which the data produced by the instruments are collected.

  11. Designing Computer-Supported Complex Systems Curricula for the Next Generation Science Standards in High School Science Classrooms

    Directory of Open Access Journals (Sweden)

    Susan A. Yoon

    2016-12-01

    Full Text Available We present a curriculum and instruction framework for computer-supported teaching and learning about complex systems in high school science classrooms. This work responds to a need in K-12 science education research and practice for the articulation of design features for classroom instruction that can address the Next Generation Science Standards (NGSS recently launched in the USA. We outline the features of the framework, including curricular relevance, cognitively rich pedagogies, computational tools for teaching and learning, and the development of content expertise, and provide examples of how the framework is translated into practice. We follow this up with evidence from a preliminary study conducted with 10 teachers and 361 students, aimed at understanding the extent to which students learned from the activities. Results demonstrated gains in students’ complex systems understanding and biology content knowledge. In interviews, students identified influences of various aspects of the curriculum and instruction framework on their learning.

  12. A parallel simulated annealing algorithm for standard cell placement on a hypercube computer

    Science.gov (United States)

    Jones, Mark Howard

    1987-01-01

    A parallel version of a simulated annealing algorithm is presented which is targeted to run on a hypercube computer. A strategy for mapping the cells in a two dimensional area of a chip onto processors in an n-dimensional hypercube is proposed such that both small and large distance moves can be applied. Two types of moves are allowed: cell exchanges and cell displacements. The computation of the cost function in parallel among all the processors in the hypercube is described along with a distributed data structure that needs to be stored in the hypercube to support parallel cost evaluation. A novel tree broadcasting strategy is used extensively in the algorithm for updating cell locations in the parallel environment. Studies on the performance of the algorithm on example industrial circuits show that it is faster and gives better final placement results than the uniprocessor simulated annealing algorithms. An improved uniprocessor algorithm is proposed which is based on the improved results obtained from parallelization of the simulated annealing algorithm.
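
    The serial core of such a placement annealer, random cell exchanges and displacements accepted by the Metropolis criterion, can be sketched as follows; the hypercube partitioning, distributed cost evaluation and tree broadcast are omitted, so this is only an illustrative serial analogue under assumed data structures.

    ```python
    import math
    import random

    def anneal_placement(cells, grid_w, grid_h, cost, t0=10.0, alpha=0.95, iters=2000):
        """cells: dict cell_name -> (x, y) grid slot; cost: callable on a placement dict."""
        place = dict(cells)
        best, best_cost = dict(place), cost(place)
        t = t0
        for step in range(iters):
            trial = dict(place)
            a = random.choice(list(trial))
            if len(trial) > 1 and random.random() < 0.5:   # move 1: cell exchange
                b = random.choice([c for c in trial if c != a])
                trial[a], trial[b] = trial[b], trial[a]
            else:                                          # move 2: cell displacement
                trial[a] = (random.randrange(grid_w), random.randrange(grid_h))
            delta = cost(trial) - cost(place)
            if delta < 0 or random.random() < math.exp(-delta / t):
                place = trial                              # Metropolis acceptance
                if cost(place) < best_cost:
                    best, best_cost = dict(place), cost(place)
            if (step + 1) % 50 == 0:
                t *= alpha                                 # geometric cooling schedule
        return best, best_cost
    ```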

  13. Three computer codes to read, plot, and tabulate operational test-site recorded solar data. [TAPFIL, CHPLOT, and WRTCNL codes

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, S.D.; Sampson, R.J. Jr.; Stonemetz, R.E.; Rouse, S.L.

    1980-07-01

    A computer program, TAPFIL, has been developed by MSFC to read data from an IBM 360 tape for use on the PDP 11/70. The information (insolation, flowrates, temperatures, etc.) from 48 operational solar heating and cooling test sites is stored on the tapes. Two other programs, CHPLOT and WRTCNL, have been developed to plot and tabulate the data. These data will be used in the evaluation of collector efficiency and solar system performance. This report describes the methodology of the programs, their inputs, and their outputs.

  14. A comparison of standard radiological examinations, computed tomography, scintigraphy and angiography in the recidivistic diagnostic of bone tumors

    International Nuclear Information System (INIS)

    Schaeffer, G.

    1986-01-01

    In a retrospective study, the diagnostic efficiency of standard radiography, computed tomography (CT), bone scintigraphy and angiography in the diagnosis of tumor recidivism was studied in 54 patients with an operatively treated bone tumor. The highest diagnostic sensitivity (100%) was achieved with CT. For the determination or exclusion of a recidivistic bone tumor, the diagnostic strength of the individual procedures lies in their combinations, but these combinations should be chosen on the basis of the tumor type and disease. (MBC)

  15. Computer-aided process planning in prismatic shape die components based on Standard for the Exchange of Product model data

    Directory of Open Access Journals (Sweden)

    Awais Ahmad Khan

    2015-11-01

    Full Text Available Until recently, insufficient technologies made good integration between die components in design, process planning, and manufacturing impossible. Nowadays, advanced technologies based on the Standard for the Exchange of Product model data (STEP) are making it possible. This article discusses the three main steps for achieving complete process planning for prismatic parts of die components: data extraction, feature recognition, and process planning. The proposed computer-aided process planning system works as part of an integrated system to cover the process planning of any prismatic die component. The system is built using Visual Basic with the EWDraw system for visualizing the Standard for the Exchange of Product model data file. The system works successfully and can cover any type of sheet metal die component. The case study discussed in this article is taken from a large progressive die design.

  16. Computing the temperature dependence of effective CP violation in the standard model

    DEFF Research Database (Denmark)

    Brauner, Tomas; Taanila, Olli; Tranberg, Anders

    2012-01-01

    ... of the effective action to the leading nontrivial, sixth order in the covariant gradient expansion as a function of temperature. In the limit of zero temperature, our result addresses the discrepancy between two independent calculations existing in the literature [1, 2]. We find that CP violation in the standard model is strongly suppressed at high temperature, but that at T less than or similar to 1 GeV it may be relevant for certain scenarios of baryogenesis. We also identify a selected class of operators at the next, eighth order and discuss the convergence of the covariant gradient expansion.

  17. Fiscal 1997 report on the results of the international standardization R and D. International standards for computers/manikins; 1997 nendo seika hokokusho kokusai hyojun soseigata kenkyu kaihatsu. Computer manikin ni kansuru kokusai hyojun kikaku

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    Through the development of computer manikins (CM), which assess human adaptability to products and environments, a draft for international standardization was worked out for proposal to ISO. The draft was prepared through the development of a 'structure model' that changes according to human attributes, a study of a 'motion model' enabling changes in posture and movement, a study of an 'evaluation model' evaluating attainment ranges and ecodynamic loads, and the development of 'computer functions' realizing the above models. The CM developed have the following characteristics: a function to reproduce the 'structure model' based on the human body dimensional measurements regulated in ISO 7250, a function to change posture/movement based on joint movable range data, and a function to evaluate geometrical human adaptability such as attainment ranges. The above-mentioned functions were realized as a plug-in to Autodesk Mechanical Desktop 2.0, and a modular-structure platform was constructed that enables wide-ranging cross-industry options and functional expansion as CM technology advances. 7 refs., 41 figs., 18 tabs.

  18. Potential Bone to Implant Contact Area of Short Versus Standard Implants: An In Vitro Micro-Computed Tomography Analysis.

    Science.gov (United States)

    Quaranta, Alessandro; DʼIsidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo

    2016-02-01

    To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane that corresponds to the most coronal contact point between the crestal bone and implant. The available PBIC was calculated for each sample. Later on, the cross-sectional slices were processed by 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the microtopography and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm), followed by the standard implants (178.07 mm and 185.37 mm) and the remaining short implants (130.70 mm and 110.70 mm). Wide-diameter short implants show a surface area comparable with that of standard implants. Micro-CT analysis is a promising technique to evaluate surface area in dental implants with different macrodesigns, microdesigns, and surface features.

  19. Standardized evaluation of algorithms for computer-aided diagnosis of dementia based on structural MRI

    DEFF Research Database (Denmark)

    Bron, Esther E.; Smits, Marion; van der Flier, Wiesje M.

    2015-01-01

    Abstract Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare as different data sets and methodology were used for evaluation. In addition, it is unclear how the algorithms would perform ... algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease ... of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume ...

  20. A brain-computer interface as input channel for a standard assistive technology software.

    Science.gov (United States)

    Zickler, Claudia; Riccio, Angela; Leotta, Francesco; Hillian-Tress, Sandra; Halder, Sebastian; Holz, Elisa; Staiger-Sälzer, Pit; Hoogerwerf, Evert-Jan; Desideri, Lorenzo; Mattia, Donatella; Kübler, Andrea

    2011-10-01

    Recently brain-computer interface (BCI) control was integrated into the commercial assistive technology product QualiWORLD (QualiLife Inc., Paradiso-Lugano, CH). Usability of the first prototype was evaluated in terms of effectiveness (accuracy), efficiency (information transfer rate and subjective workload/NASA Task Load Index) and user satisfaction (Quebec User Evaluation of Satisfaction with assistive Technology, QUEST 2.0) by four end-users with severe disabilities. Three assistive technology experts evaluated the device from a third person perspective. The results revealed high performance levels in communication and internet tasks. Users and assistive technology experts were quite satisfied with the device. However, none could imagine using the device in daily life without improvements. Main obstacles were the EEG-cap and low speed.
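
    Efficiency in BCI evaluations of this kind is commonly quantified with the Wolpaw information transfer rate; whether this exact formula was used in the study is an assumption, but it shows how accuracy and selection speed combine into bits per minute.

    ```python
    import math

    def wolpaw_itr(n_choices, accuracy, selections_per_min):
        """Wolpaw information transfer rate (bits/min) for an N-choice interface."""
        if accuracy >= 1.0:
            return math.log2(n_choices) * selections_per_min
        if accuracy <= 1.0 / n_choices:
            return 0.0          # at or below chance level, no information transferred
        bits = (math.log2(n_choices)
                + accuracy * math.log2(accuracy)
                + (1 - accuracy) * math.log2((1 - accuracy) / (n_choices - 1)))
        return bits * selections_per_min

    # Illustrative numbers only: a 26-item speller at 90% accuracy, 5 selections/min.
    print(round(wolpaw_itr(26, 0.90, 5), 1), "bits/min")
    ```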

  1. FEATURES OF SYSTEM COMPUTER SUPPORT TRAINING ON THE SCHOOL’S SITE

    Directory of Open Access Journals (Sweden)

    Petro H. Shevchuk

    2010-08-01

    Full Text Available The article considers the problems of computerizing the teaching process at public educational establishments by deploying global network facilities. It analyses the advantages and disadvantages of electronic support of teaching through the Internet, using the dedicated system on the Miropylska gymnasium web-site as an example. It describes the interaction of a teacher and students with cyberspace to publish teaching material on the Internet. The article also includes general recommendations on deploying similar systems and describes the principal directions of their further development.

  2. EFFAIR: a computer program for estimating the dispersion of atmospheric emissions from a nuclear site

    International Nuclear Information System (INIS)

    Dormuth, K.W.; Lyon, R.B.

    1978-11-01

    Analysis of the transport of material through the turbulent atmospheric boundary layer is an important part of environmental impact assessments for nuclear plants. Although this is a complex phenomenon, practical estimates of ground level concentrations downwind of release are usually obtained using a simple Gaussian formula whose coefficients are obtained from empirical correlations. Based on this formula, the computer program EFFAIR has been written to provide a flexible tool for atmospheric dispersion calculations. It is considered appropriate for calculating dilution factors at distances of 10^2 to 10^4 metres from an effluent source if reflection from the inversion lid is negligible in that range. (author)
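
    The "simple Gaussian formula" referred to above is, in textbook form, the Gaussian plume equation for ground-level concentration; the sketch below shows that standard form, with no claim that EFFAIR uses exactly these conventions or dispersion coefficients.

    ```python
    import math

    def gaussian_plume_ground(Q, u, y, H, sigma_y, sigma_z):
        """Ground-level concentration downwind of a continuous point source,
        including total ground reflection.

        Q: emission rate (g/s), u: wind speed (m/s), y: crosswind distance (m),
        H: effective release height (m), sigma_y/sigma_z: dispersion coefficients (m)
        evaluated at the downwind distance of interest via empirical correlations.
        """
        return (Q / (math.pi * u * sigma_y * sigma_z)
                * math.exp(-y ** 2 / (2 * sigma_y ** 2))
                * math.exp(-H ** 2 / (2 * sigma_z ** 2)))

    # Illustrative numbers only: 1 g/s release, 5 m/s wind, plume centreline,
    # 30 m stack, sigmas taken from a stability-class correlation at ~1 km downwind.
    print(gaussian_plume_ground(Q=1.0, u=5.0, y=0.0, H=30.0, sigma_y=80.0, sigma_z=45.0))
    ```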

  3. A computer aided measurement method for unstable pelvic fractures based on standardized radiographs

    International Nuclear Information System (INIS)

    Zhao, Jing-xin; Zhao, Zhe; Zhang, Li-cheng; Su, Xiu-yun; Du, Hai-long; Zhang, Li-ning; Zhang, Li-hai; Tang, Pei-fu

    2015-01-01

    To set up a method for measuring radiographic displacement of unstable pelvic ring fractures based on standardized X-ray images and then test its reliability and validity using a software-based measurement technique. Twenty-five patients diagnosed with AO/OTA type B or C pelvic fractures, with a unilaterally fractured and dislocated pelvis, were identified as eligible for inclusion by a review of medical records at our clinical centre. Based on the preoperative pelvic CT data, the standardized X-ray images, including inlet, outlet, and anterior-posterior (AP) radiographs, were simulated using Armira software (Visage Imaging GmbH, Berlin, Germany). After representative anatomic landmarks were marked on the standardized X-ray images, the 2-dimensional (2D) coordinates of these points could be revealed in Digimizer software (Model: Mitutoyo Corp., Tokyo, Japan). Subsequently, we developed a formula that indicated the translational and rotational displacement patterns of the injured hemipelvis. Five separate observers calculated the displacement outcomes using the established formula and determined the rotational patterns using a 3D-CT model based on their overall impression. We performed 3D reconstruction of all the fractured pelvises using Mimics (Materialise, Haasrode, Belgium) and determined the translational and rotational displacement using the 3-matic suite. The interobserver reliability of the new method was assessed by comparing the continuous measure and categorical outcomes using the intraclass correlation coefficient (ICC) and the kappa statistic, respectively. The interobserver reliability of the new method for translational and rotational measurement was high, with both ICCs above 0.9. The rotational outcome assessed by the new method was the same as that concluded by the 3-matic software. The agreement for rotational outcome among orthopaedic surgeons based on overall impression was poor (kappa statistic, 0.250 to 0.426). Compared with the 3D reconstruction outcome, the

  4. Determination of a tissue-level failure evaluation standard for rat femoral cortical bone utilizing a hybrid computational-experimental method.

    Science.gov (United States)

    Fan, Ruoxun; Liu, Jie; Jia, Zhengbin; Deng, Ying; Liu, Jun

    2018-01-01

    Macro-level failure in bone structure could be diagnosed by pain or physical examination. However, diagnosing tissue-level failure in a timely manner is challenging due to the difficulty in observing the interior mechanical environment of bone tissue. Because most fractures begin with tissue-level failure in bone tissue caused by continually applied loading, people attempt to monitor the tissue-level failure of bone and provide corresponding measures to prevent fracture. Many tissue-level mechanical parameters of bone could be predicted or measured; however, the value of the parameter may vary among different specimens belonging to a kind of bone structure even at the same age and anatomical site. These variations cause difficulty in representing tissue-level bone failure. Therefore, determining an appropriate tissue-level failure evaluation standard is necessary to represent tissue-level bone failure. In this study, the yield and failure processes of rat femoral cortical bones were primarily simulated through a hybrid computational-experimental method. Subsequently, the tissue-level strains and the ratio between tissue-level failure and yield strains in cortical bones were predicted. The results indicated that certain differences existed in tissue-level strains; however, slight variations in the ratio were observed among different cortical bones. Therefore, the ratio between tissue-level failure and yield strains for a kind of bone structure could be determined. This ratio may then be regarded as an appropriate tissue-level failure evaluation standard to represent the mechanical status of bone tissue.

  5. High Altitude Balloon Flight Path Prediction and Site Selection Based On Computer Simulations

    Science.gov (United States)

    Linford, Joel

    2010-10-01

    The Weber State University Physics Department, interested in the upper atmosphere, has developed a High Altitude Reconnaissance Balloon for Outreach and Research (HARBOR) team. HARBOR enables Weber State University to take a variety of measurements from ground level to altitudes as high as 100,000 feet. The flight paths of these balloons can extend as far as 100 miles from the launch zone, making the choice of where and when to fly critical. To ensure the ability to recover the packages in a reasonable amount of time, days and times are carefully selected using computer simulations that limit flight tracks to approximately 40 miles from the launch zone. The computer simulations take atmospheric data collected by the National Oceanic and Atmospheric Administration (NOAA) to plot what flights might have looked like in the past, and to predict future flights. Using these simulations a launch zone has been selected in Duchesne, Utah, which has hosted eight successful flights over the course of the last three years, all of which have been recovered. Several secondary launch zones in western Wyoming, southern Idaho, and northern Utah are also being considered.
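
    A predictor of this kind reduces to integrating horizontal wind drift while the balloon climbs through stacked atmospheric layers; the toy sketch below assumes wind speed and direction have already been interpolated onto altitude levels, and its inputs are placeholders rather than NOAA data.

    ```python
    import math

    def predict_track(lat0, lon0, alt_levels_m, wind_speed_ms, wind_dir_deg,
                      ascent_rate_ms=5.0):
        """Integrate horizontal drift while the balloon climbs through each layer."""
        R_EARTH = 6.371e6                                   # mean Earth radius (m)
        lat, lon = lat0, lon0
        for i in range(len(alt_levels_m) - 1):
            dt = (alt_levels_m[i + 1] - alt_levels_m[i]) / ascent_rate_ms
            # Meteorological wind direction is "from"; drift goes the opposite way.
            bearing = math.radians((wind_dir_deg[i] + 180.0) % 360.0)
            dx = wind_speed_ms[i] * dt * math.sin(bearing)  # east displacement (m)
            dy = wind_speed_ms[i] * dt * math.cos(bearing)  # north displacement (m)
            lat += math.degrees(dy / R_EARTH)
            lon += math.degrees(dx / (R_EARTH * math.cos(math.radians(lat))))
        return lat, lon
    ```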

  6. Simulation of international standard problem no. 44 open tests using Melcor computer code

    International Nuclear Information System (INIS)

    Song, Y.M.; Cho, S.W.

    2001-01-01

    The MELCOR 1.8.4 code has been employed to simulate the KAEVER tests K123/K148/K186/K188, which were proposed as open experiments of International Standard Problem No. 44 by the OECD-CSNI. The main purpose of this study is to evaluate the accuracy of the MELCOR aerosol model, which calculates the aerosol distribution and settlement in a containment. For this, the thermal hydraulic conditions are simulated first for the whole test period, and then the behavior of hygroscopic CsOH/CsI and insoluble Ag aerosols, which are the predominant activity carriers in a release into the containment, is compared between the experimental results and the code predictions. The calculated vessel atmospheric concentrations show a good simulation for dry aerosol but a large difference for wet aerosol, due to a mismatch in the vessel humidity data and the hygroscopicity. (authors)

  7. A comparison between standard well test evaluation methods used in SKB's site investigations and the generalised radial flow concept

    International Nuclear Information System (INIS)

    Follin, Sven; Ludvigson, Jan-Erik; Leven, Jakob

    2011-09-01

    According to the strategy for hydrogeological characterisation within SKB's site investigation programme, two single-hole test methods are available for testing and parameterisation of groundwater flow models - constant-head injection testing with the Pipe String System (PSS method) and difference flow logging with the Posiva Flow Log (PFL method). This report presents the results of an investigation to assess discrepancies in the results of single-hole transmissivity measurements using these methods in the Forsmark site characterisation. The investigation explores the possibility that the source of the discrepancy observed lies in the assumptions of the flow geometry that are inherent to the methods used for standard constant-head injection well test analysis and difference flow logging analysis, respectively. In particular, the report looks at the generalised radial flow (GRF) concept by Barker (1988) as a means that might explain some of the differences. A confirmation of the actual flow geometries (dimensions) observed during hydraulic injection tests could help to identify admissible conceptual models for the tested system, and place the hydraulic testing with the PSS and PFL test methods in its full hydrogeological context. The investigation analyses 151 constant-head injection tests in three cored boreholes at Forsmark. The results suggest that the transmissivities derived with standard constant-head injection well test analysis methods and with the GRF concept, respectively, are similar provided that the dominating flow geometry during the testing is radial (cylindrical). Thus, having flow geometries with dimensions other than 2 affects the value of the interpreted transmissivity. For example, a flow system with a dimension of 1 may require an order of magnitude or more higher transmissivity to produce the same flow rates. The median of the GRF flow dimensions of all 151 constant-head injection tests is 2.06 with 33% of the tests in the range 1
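
    Barker's generalised radial flow (GRF) model replaces the fixed cylindrical geometry of standard well-test analysis with a flow dimension n that need not be an integer; the governing equation, in the usual notation, is quoted here from the general GRF literature rather than from the report itself.

    ```latex
    % Generalised radial flow (Barker, 1988): hydraulic head h(r,t) for flow dimension n.
    % n = 2 recovers cylindrical (radial) flow, n = 1 linear flow, n = 3 spherical flow;
    % non-integer n describes fracture networks of intermediate geometry.
    S_s \frac{\partial h}{\partial t}
      \;=\; \frac{K}{r^{\,n-1}} \frac{\partial}{\partial r}
            \left( r^{\,n-1} \frac{\partial h}{\partial r} \right)
    ```

    Because n and the hydraulic properties are fitted together, the same drawdown data can be matched by different combinations of flow dimension and transmissivity, which is consistent with the observation above that a dimension-1 system may require a much higher transmissivity to produce the same flow rates.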

  8. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Miscellaneous -- Volume 3, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Petrie, L.M.; Jordon, W.C. [Oak Ridge National Lab., TN (United States); Edwards, A.L. [Oak Ridge National Lab., TN (United States)]|[Lawrence Livermore National Lab., CA (United States)] [and others

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the data libraries and subroutine libraries.

  9. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Control modules -- Volume 1, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Landers, N.F.; Petrie, L.M.; Knight, J.R. [Oak Ridge National Lab., TN (United States)] [and others

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3 for the documentation of the data libraries and subroutine libraries.

  10. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Miscellaneous -- Volume 3, Revision 4

    International Nuclear Information System (INIS)

    Petrie, L.M.; Jordon, W.C.; Edwards, A.L.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the data libraries and subroutine libraries

  11. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Control modules -- Volume 1, Revision 4

    International Nuclear Information System (INIS)

    Landers, N.F.; Petrie, L.M.; Knight, J.R.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3 for the documentation of the data libraries and subroutine libraries

  12. Computational approaches to standard-compliant biofilm data for reliable analysis and integration.

    Science.gov (United States)

    Sousa, Ana Margarida; Ferreira, Andreia; Azevedo, Nuno F; Pereira, Maria Olivia; Lourenço, Anália

    2012-12-01

    The study of microorganism consortia, also known as biofilms, is associated with a number of applications in biotechnology, ecotechnology and clinical domains. Nowadays, biofilm studies are heterogeneous and data-intensive, encompassing different levels of analysis. Computational modelling of biofilm studies has thus become a requirement to make sense of these vast and ever-expanding biofilm data volumes. The rationale of the present work is a machine-readable format for representing biofilm studies and supporting biofilm data interchange and data integration. This format is supported by the Biofilm Science Ontology (BSO), the first ontology on biofilm information. The ontology is decomposed into a number of areas of interest, namely: the Experimental Procedure Ontology (EPO), which describes biofilm experimental procedures; the Colony Morphology Ontology (CMO), which morphologically characterises microorganism colonies; and other modules concerning biofilm phenotype, antimicrobial susceptibility and virulence traits. The overall objective behind BSO is to develop semantic resources to capture, represent and share data on biofilms and related experiments in a regularised manner. Furthermore, the present work also introduces a framework to assist biofilm data interchange and analysis - BiofOmics (http://biofomics.org) - and a public repository on colony morphology signatures - MorphoCol (http://stardust.deb.uminho.pt/morphocol).

  13. Computational Fluid Dynamic Analysis of the VHTR Lower Plenum Standard Problem

    International Nuclear Information System (INIS)

    Johnson, Richard W.; Schultz, Richard R.

    2009-01-01

    The United States Department of Energy is promoting the resurgence of nuclear power in the U.S. for both electrical power generation and production of process heat required for industrial processes such as the manufacture of hydrogen for use as a fuel in automobiles. The DOE project is called the next generation nuclear plant (NGNP) and is based on a Generation IV reactor concept called the very high temperature reactor (VHTR), which will use helium as the coolant at temperatures ranging from 450°C to perhaps 1000°C. While computational fluid dynamics (CFD) has not been used for past safety analysis for nuclear reactors in the U.S., it is being considered for safety analysis for existing and future reactors. It is fully recognized that CFD simulation codes will have to be validated for flow physics reasonably close to actual fluid dynamic conditions expected in normal and accident operational situations. To this end, experimental data have been obtained in a scaled model of a narrow slice of the lower plenum of a prismatic VHTR. This report presents results of CFD examinations of these data to explore potential issues with the geometry, the initial conditions, the flow dynamics and the data needed to fully specify the inlet and boundary conditions; results for several turbulence models are examined. Issues are addressed and recommendations about the data are made

  14. A benchmarking tool to evaluate computer tomography perfusion infarct core predictions against a DWI standard.

    Science.gov (United States)

    Cereda, Carlo W; Christensen, Søren; Campbell, Bruce Cv; Mishra, Nishant K; Mlynash, Michael; Levi, Christopher; Straka, Matus; Wintermark, Max; Bammer, Roland; Albers, Gregory W; Parsons, Mark W; Lansberg, Maarten G

    2016-10-01

    Differences in research methodology have hampered the optimization of Computer Tomography Perfusion (CTP) for identification of the ischemic core. We aim to optimize CTP core identification using a novel benchmarking tool. The benchmarking tool consists of an imaging library and a statistical analysis algorithm to evaluate the performance of CTP. The tool was used to optimize and evaluate an in-house developed CTP-software algorithm. Imaging data of 103 acute stroke patients were included in the benchmarking tool. Median time from stroke onset to CT was 185 min (IQR 180-238), and the median time between completion of CT and start of MRI was 36 min (IQR 25-79). Volumetric accuracy of the CTP-ROIs was optimal at an rCBF threshold of … The benchmarking tool can play an important role in optimizing CTP software as it provides investigators with a novel method to directly compare the performance of alternative CTP software packages. © The Author(s) 2015.

  15. Protonation Sites, Tandem Mass Spectrometry and Computational Calculations of o-Carbonyl Carbazolequinone Derivatives.

    Science.gov (United States)

    Martínez-Cifuentes, Maximiliano; Clavijo-Allancan, Graciela; Zuñiga-Hormazabal, Pamela; Aranda, Braulio; Barriga, Andrés; Weiss-López, Boris; Araya-Maturana, Ramiro

    2016-07-05

    A series of a new type of tetracyclic carbazolequinones incorporating a carbonyl group at the ortho position relative to the quinone moiety was synthesized and analyzed by tandem electrospray ionization mass spectrometry (ESI/MS-MS), using Collision-Induced Dissociation (CID) to dissociate the protonated species. Theoretical parameters such as molecular electrostatic potential (MEP), local Fukui functions and local Parr function for electrophilic attack, as well as proton affinity (PA) and gas phase basicity (GB), were used to explain the preferred protonation sites. Transition states of some main fragmentation routes were obtained, and the energies calculated at the density functional theory (DFT) B3LYP level were compared with those obtained by ab initio quadratic configuration interaction with single and double excitation (QCISD). The results are in accordance with the observed distribution of ions. The nature of the substituents in the aromatic ring has a notable impact on the fragmentation routes of the molecules.

  16. Simulation-based estimation of mean and standard deviation for meta-analysis via Approximate Bayesian Computation (ABC).

    Science.gov (United States)

    Kwon, Deukwoo; Reis, Isildinha M

    2015-08-12

    When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC performs well. However, the Wan et al. method is best for estimating standard deviation under normal distribution. In the estimation of the mean, our ABC method is best regardless of assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95 % credible interval when Bayesian analysis has been employed.
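
    As an illustration of the general idea (not the authors' implementation), a minimal ABC sketch in Python for recovering a mean and standard deviation from a reported median, minimum and maximum might look as follows; the normal data-generating model, the priors and the acceptance fraction are assumptions made for this example:

        import numpy as np

        rng = np.random.default_rng(0)

        def abc_mean_sd(median_obs, min_obs, max_obs, n, n_draws=50_000, keep=0.002):
            # Draw candidate (mu, sigma) pairs from broad priors based on the reported range.
            mu = rng.uniform(min_obs, max_obs, n_draws)
            sigma = rng.uniform(1e-3, max_obs - min_obs, n_draws)
            # Simulate a study of size n for every candidate and compute the same summaries.
            sims = rng.normal(mu[:, None], sigma[:, None], size=(n_draws, n))
            dist = ((np.median(sims, axis=1) - median_obs) ** 2
                    + (sims.min(axis=1) - min_obs) ** 2
                    + (sims.max(axis=1) - max_obs) ** 2)
            # Accept the candidates whose simulated summaries are closest to those reported.
            idx = np.argsort(dist)[: max(1, int(keep * n_draws))]
            return mu[idx].mean(), sigma[idx].mean()

        # Hypothetical study reporting median 12, minimum 4 and maximum 30 from n = 50 subjects.
        print(abc_mean_sd(12.0, 4.0, 30.0, n=50))

    The same pattern extends to the other summary sets discussed in the article (for example quartiles) by changing the simulated summary statistics and the distance measure.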

  17. A computational model of the LGI1 protein suggests a common binding site for ADAM proteins.

    Directory of Open Access Journals (Sweden)

    Emanuela Leonardi

    Mutations of the human leucine-rich glioma inactivated (LGI1) gene encoding the epitempin protein cause autosomal dominant temporal lateral epilepsy (ADTLE), a rare familial partial epileptic syndrome. The LGI1 gene seems to have a role in the transmission of neuronal messages, but the exact molecular mechanism remains unclear. In contrast to other genes involved in epileptic disorders, epitempin shows no homology with known ion channel genes but contains two domains, composed of repeated structural units, known to mediate protein-protein interactions. A three-dimensional in silico model of the two epitempin domains was built to predict the structure-function relationship and propose a functional model integrating previous experimental findings. Conserved and electrostatically charged regions of the model surface suggest a possible arrangement between the two domains and identify a possible ADAM protein binding site in the β-propeller domain and another protein binding site in the leucine-rich repeat domain. The functional model indicates that epitempin could mediate the interaction between proteins localized to different synaptic sides in a static way, by forming a dimer, or in a dynamic way, by binding proteins at different times. The model was also used to predict the effects of known disease-causing missense mutations. Most of the variants are predicted to alter protein folding, while several others map to functional surface regions. In agreement with experimental evidence, this suggests that non-secreted LGI1 mutants could be retained within the cell by quality control mechanisms or by altering interactions required for the secretion process.

  18. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  19. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes

  20. Systematic Standardized and Individualized Assessment of Masticatory Cycles Using Electromagnetic 3D Articulography and Computer Scripts

    Directory of Open Access Journals (Sweden)

    Ramón Fuentes

    2017-01-01

    Masticatory movements have been studied for decades in odontology; a better understanding of them could improve dental treatments. The aim of this study was to describe an innovative, accurate, and systematic method of analyzing masticatory cycles, generating comparable quantitative data. The masticatory cycles of 5 volunteers (Class I, 19 ± 1.7 years, without articular or dental occlusion problems) were evaluated using 3D electromagnetic articulography supported by MATLAB software. The method allows the trajectory morphology of the set of chewing cycles to be analyzed from different views and angles. It was also possible to individualize the trajectory of each cycle, providing accurate quantitative data such as the number of cycles, cycle areas in frontal view, and the ratio between each cycle area and the frontal mandibular border movement area. There was a moderate negative correlation (−0.61) between the area and the number of cycles: the greater the cycle area, the smaller the number of repetitions. Finally, it was possible to evaluate the area of the cycles through time, which did not reveal a standardized behavior. The proposed method provided reproducible, intelligible, and accurate quantitative and graphical data, suggesting that it is promising and may be applied in different clinical situations and treatments.
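
    A minimal sketch of the quantitative part of such an analysis is shown below (in Python rather than the MATLAB scripts used in the study, and with hypothetical cycle coordinates): each individualized cycle trajectory in the frontal plane is treated as a closed polygon, its area is obtained with the shoelace formula, and the ratio against a border-movement reference area is computed.

        import numpy as np

        def shoelace_area(x, y):
            # Area of one closed cycle trajectory (polygon) in the frontal plane.
            return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

        def cycle_metrics(cycles, border_area):
            # 'cycles' is a list of (x, y) coordinate arrays, one per individualized cycle.
            areas = np.array([shoelace_area(x, y) for x, y in cycles])
            return areas, areas / border_area  # absolute areas and ratios to the border area

        # Hypothetical example: three roughly elliptical cycles, border-movement area 400 mm^2.
        t = np.linspace(0.0, 2.0 * np.pi, 200)
        cycles = [(a * np.cos(t), 2.0 * a * np.sin(t)) for a in (5.0, 6.0, 7.0)]
        areas, ratios = cycle_metrics(cycles, border_area=400.0)
        print(areas.round(1), ratios.round(3))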

  1. Two-loop renormalization in the standard model, part II. Renormalization procedures and computational techniques

    Energy Technology Data Exchange (ETDEWEB)

    Actis, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Passarino, G. [Torino Univ. (Italy). Dipt. di Fisica Teorica; INFN, Sezione di Torino (Italy)

    2006-12-15

    In part I general aspects of the renormalization of a spontaneously broken gauge theory have been introduced. Here, in part II, two-loop renormalization is introduced and discussed within the context of the minimal Standard Model. Therefore, this paper deals with the transition from bare parameters and fields to renormalized ones. The full list of one- and two-loop counterterms is shown and it is proven that, by a suitable extension of the formalism already introduced at the one-loop level, two-point functions suffice in renormalizing the model. The problem of overlapping ultraviolet divergencies is analyzed and it is shown that all counterterms are local and of polynomial nature. The original program of 't Hooft and Veltman is at work. Finite parts are written in a way that allows for a fast and reliable numerical integration with all collinear logarithms extracted analytically. Finite renormalization, the transition between renormalized parameters and physical (pseudo-)observables, is discussed in part III, where numerical results, e.g. for the complex poles of the unstable gauge bosons, are shown. An attempt is made to define the running of the electromagnetic coupling constant at the two-loop level. (orig.)

  2. International standard problem (ISP) no. 41 follow up exercise: Containment iodine computer code exercise: parametric studies

    Energy Technology Data Exchange (ETDEWEB)

    Ball, J.; Glowa, G.; Wren, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Ewig, F. [GRS Koln (Germany); Dickenson, S. [AEAT, (United Kingdom); Billarand, Y.; Cantrel, L. [IPSN (France); Rydl, A. [NRIR (Czech Republic); Royen, J. [OECD/NEA (France)

    2001-11-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I{sup -} concentration. The codes used in this exercise were IODE(IPSN), IODE(NRIR), IMPAIR(GRS), INSPECT(AEAT), IMOD(AECL) and LIRIC(AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN (IPSN, France) facility experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (author)

  3. International standard problem (ISP) no. 41 follow up exercise: Containment iodine computer code exercise: parametric studies

    International Nuclear Information System (INIS)

    Ball, J.; Glowa, G.; Wren, J.; Ewig, F.; Dickenson, S.; Billarand, Y.; Cantrel, L.; Rydl, A.; Royen, J.

    2001-11-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I⁻ concentration. The codes used in this exercise were IODE(IPSN), IODE(NRIR), IMPAIR(GRS), INSPECT(AEAT), IMOD(AECL) and LIRIC(AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN (IPSN, France) facility experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (author)

  4. Towards a "Golden Standard" for computing globin stability: Stability and structure sensitivity of myoglobin mutants.

    Science.gov (United States)

    Kepp, Kasper P

    2015-10-01

    Fast and accurate computation of protein stability is increasingly important for e.g. protein engineering and protein misfolding diseases, but no consensus methods exist for important proteins such as globins, and performance may depend on the type of structural input given. This paper reports benchmarking of six protein stability calculators (POPMUSIC 2.1, I-Mutant 2.0, I-Mutant 3.0, CUPSAT, SDM, and mCSM) against 134 experimental stability changes for mutations of sperm-whale myoglobin. Six different high-resolution structures were used to test structure sensitivity that may impair protein calculations. The trend accuracy of the methods decreased as I-Mutant 2.0 (R=0.64-0.65), SDM (R=0.57-0.60), POPMUSIC 2.1 (R=0.54-0.57), I-Mutant 3.0 (R=0.53-0.55), mCSM (R=0.35-0.47), and CUPSAT (R=0.25-0.48). The mean signed errors increased as SDM … Mean absolute errors increased as I-Mutant 2.0
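
    The benchmarking statistics quoted above (trend accuracy as a Pearson correlation, together with mean signed and mean absolute errors) are straightforward to reproduce for any predictor; a small sketch with made-up stability values is shown below.

        import numpy as np

        def benchmark(predicted, experimental):
            p, e = np.asarray(predicted, float), np.asarray(experimental, float)
            r = np.corrcoef(p, e)[0, 1]        # trend accuracy (Pearson R)
            bias = np.mean(p - e)              # mean signed error
            mae = np.mean(np.abs(p - e))       # mean absolute error
            return r, bias, mae

        # Hypothetical stability changes (kcal/mol) for a handful of myoglobin mutants.
        predicted    = [-0.8, -1.5, 0.2, -2.1, -0.3]
        experimental = [-1.0, -1.2, 0.5, -2.5, -0.1]
        print(benchmark(predicted, experimental))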

  5. Computer simulations of large asteroid impacts into oceanic and continental sites--preliminary results on atmospheric, cratering and ejecta dynamics

    Science.gov (United States)

    Roddy, D.J.; Schuster, S.H.; Rosenblatt, M.; Grant, L.B.; Hassig, P.J.; Kreyenhagen, K.N.

    1987-01-01

    Computer simulations have been completed that describe passage of a 10-km-diameter asteroid through the Earth's atmosphere and the subsequent cratering and ejecta dynamics caused by impact of the asteroid into both oceanic and continental sites. The asteroid was modeled as a spherical body moving vertically at 20 km/s with a kinetic energy of 2.6 × 10³⁰ ergs (6.2 × 10⁷ Mt). Detailed material modeling of the asteroid, ocean, crustal units, sedimentary unit, and mantle included effects of strength and fracturing, generic asteroid and rock properties, porosity, saturation, lithostatic stresses, and geothermal contributions, each selected to simulate impact and geologic conditions that were as realistic as possible. Calculation of the passage of the asteroid through a U.S. Standard Atmosphere showed development of a strong bow shock wave followed by a highly shock compressed and heated air mass. Rapid expansion of this shocked air created a large low-density region that also expanded away from the impact area. Shock temperatures in air reached ~20,000 K near the surface of the uplifting crater rim and were as high as ~2000 K at more than 30 km range and 10 km altitude. Calculations to 30 s showed that the shock fronts in the air and in most of the expanding shocked air mass preceded the formation of the crater, ejecta, and rim uplift and did not interact with them. As cratering developed, uplifted rim and target material were ejected into the very low density, shock-heated air immediately above the forming crater, and complex interactions could be expected. Calculations of the impact events showed equally dramatic effects on the oceanic and continental targets through an interval of 120 s. Despite geologic differences in the targets, both cratering events developed comparable dynamic flow fields and by ~29 s had formed similar-sized transient craters ~39 km deep and ~62 km across. Transient-rim uplift of ocean and crust reached a maximum altitude of nearly
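
    The quoted impact energy can be checked with a back-of-envelope calculation; the bulk density used below is an assumption for illustration only (the report's material model is far more detailed).

        import math

        diameter_m = 10e3      # 10-km-diameter asteroid
        speed_m_s = 20e3       # vertical impact at 20 km/s
        density_kg_m3 = 2.5e3  # assumed bulk density

        mass_kg = density_kg_m3 * (4.0 / 3.0) * math.pi * (diameter_m / 2.0) ** 3
        ke_joule = 0.5 * mass_kg * speed_m_s ** 2
        print(f"{ke_joule:.1e} J = {ke_joule * 1e7:.1e} erg = {ke_joule / 4.184e15:.1e} Mt TNT")
        # -> roughly 2.6e+23 J = 2.6e+30 erg = 6e+07 Mt, consistent with the figures quoted above.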

  6. Morphologic features of puncture sites after exoseal vascular closure device implantation: Changes on follow-up computed tomography

    International Nuclear Information System (INIS)

    Ryu, Hwa Seong; Jang, Joo Yeon; Kim, Tae Un; Lee, Jun Woo; Park, Jung Hwan; Choo, Ki Seok; Cho, Mong; Yoon, Ki Tae; Hong, Young Ki; Jeon, Ung Bae

    2017-01-01

    The study aimed to evaluate the morphologic changes in transarterial chemoembolization (TACE) puncture sites implanted with an ExoSeal vascular closure device (VCD) using follow-up computed tomography (CT). Sixteen patients in whom an ExoSeal VCD was used after TACE were enrolled. Using CT images, the diameters and anterior wall thicknesses of the puncture sites in the common femoral artery (CFA) were compared with those of the contralateral CFA before TACE, at 1 month after every TACE session, and at the final follow-up period. The rates of complications were also evaluated. There were no puncture- or VCD-related complications. Follow-up CT images of the CFAs of patients who used ExoSeal VCDs showed eccentric vascular wall thickening with soft-tissue densities considered to be hemostatic plugs. Final follow-up CT images (mean, 616 days; range, 95–1106 days) revealed partial or complete resorption of the hemostatic plugs. The CFA puncture site diameters did not differ statistically from those of the contralateral CFA on the final follow-up CT (p > 0.05), regardless of the number of VCDs used. Follow-up CT images of patients who used ExoSeal VCDs showed no significant vascular stenosis or significant vessel wall thickening

  7. Morphologic features of puncture sites after exoseal vascular closure device implantation: Changes on follow-up computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Hwa Seong; Jang, Joo Yeon; Kim, Tae Un; Lee, Jun Woo; Park, Jung Hwan; Choo, Ki Seok; Cho, Mong; Yoon, Ki Tae; Hong, Young Ki; Jeon, Ung Bae [Pusan National University Yangsan Hospital, Yangsan (Korea, Republic of)

    2017-05-15

    The study aimed to evaluate the morphologic changes in transarterial chemoembolization (TACE) puncture sites implanted with an ExoSeal vascular closure device (VCD) using follow-up computed tomography (CT). Sixteen patients in whom an ExoSeal VCD was used after TACE were enrolled. Using CT images, the diameters and anterior wall thicknesses of the puncture sites in the common femoral artery (CFA) were compared with those of the contralateral CFA before TACE, at 1 month after every TACE session, and at the final follow-up period. The rates of complications were also evaluated. There were no puncture- or VCD-related complications. Follow-up CT images of the CFAs of patients who used ExoSeal VCDs showed eccentric vascular wall thickening with soft-tissue densities considered to be hemostatic plugs. Final follow-up CT images (mean, 616 days; range, 95–1106 days) revealed partial or complete resorption of the hemostatic plugs. The CFA puncture site diameters did not differ statistically from those of the contralateral CFA on the final follow-up CT (p > 0.05), regardless of the number of VCDs used. Follow-up CT images of patients who used ExoSeal VCDs showed no significant vascular stenosis or significant vessel wall thickening.

  8. Realization of the developing potential of training to computer science in conditions of adoption of the second generation state educational standards

    Directory of Open Access Journals (Sweden)

    Сергей Георгиевич Григорьев

    2010-03-01

    The article describes the requirements for teaching computer science and information technology, formulated from the standpoint of the planned learning outcomes presented in the second-generation standard.

  9. Standardized evaluation framework for evaluating coronary artery stenosis detection, stenosis quantification and lumen segmentation algorithms in computed tomography angiography.

    Science.gov (United States)

    Kirişli, H A; Schaap, M; Metz, C T; Dharampal, A S; Meijboom, W B; Papadopoulou, S L; Dedic, A; Nieman, K; de Graaf, M A; Meijs, M F L; Cramer, M J; Broersen, A; Cetin, S; Eslami, A; Flórez-Valencia, L; Lor, K L; Matuszewski, B; Melki, I; Mohr, B; Oksüz, I; Shahzad, R; Wang, C; Kitslaar, P H; Unal, G; Katouzian, A; Örkisz, M; Chen, C M; Precioso, F; Najman, L; Masood, S; Ünay, D; van Vliet, L; Moreno, R; Goldenberg, R; Vuçini, E; Krestin, G P; Niessen, W J; van Walsum, T

    2013-12-01

    Though conventional coronary angiography (CCA) has been the standard of reference for diagnosing coronary artery disease in the past decades, computed tomography angiography (CTA) has rapidly emerged, and is nowadays widely used in clinical practice. Here, we introduce a standardized evaluation framework to reliably evaluate and compare the performance of the algorithms devised to detect and quantify the coronary artery stenoses, and to segment the coronary artery lumen in CTA data. The objective of this evaluation framework is to demonstrate the feasibility of dedicated algorithms to: (1) (semi-)automatically detect and quantify stenosis on CTA, in comparison with quantitative coronary angiography (QCA) and CTA consensus reading, and (2) (semi-)automatically segment the coronary lumen on CTA, in comparison with experts' manual annotation. A database consisting of 48 multicenter multivendor cardiac CTA datasets with corresponding reference standards is described and made available. The algorithms from 11 research groups were quantitatively evaluated and compared. The results show that (1) some of the current stenosis detection/quantification algorithms may be used for triage or as a second-reader in clinical practice, and that (2) automatic lumen segmentation is possible with a precision similar to that obtained by experts. The framework is open for new submissions through the website, at http://coronary.bigr.nl/stenoses/. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Computational discovery of picomolar Q(o) site inhibitors of cytochrome bc1 complex.

    Science.gov (United States)

    Hao, Ge-Fei; Wang, Fu; Li, Hui; Zhu, Xiao-Lei; Yang, Wen-Chao; Huang, Li-Shar; Wu, Jia-Wei; Berry, Edward A; Yang, Guang-Fu

    2012-07-11

    A critical challenge to the fragment-based drug discovery (FBDD) is its low-throughput nature due to the necessity of biophysical method-based fragment screening. Herein, a method of pharmacophore-linked fragment virtual screening (PFVS) was successfully developed. Its application yielded the first picomolar-range Q(o) site inhibitors of the cytochrome bc(1) complex, an important membrane protein for drug and fungicide discovery. Compared with the original hit compound 4 (K(i) = 881.80 nM, porcine bc(1)), the most potent compound 4f displayed 20,507-fold improved binding affinity (K(i) = 43.00 pM). Compound 4f was proved to be a noncompetitive inhibitor with respect to the substrate cytochrome c, but a competitive inhibitor with respect to the substrate ubiquinol. Additionally, we determined the crystal structure of compound 4e (K(i) = 83.00 pM) bound to the chicken bc(1) at 2.70 Å resolution, providing a molecular basis for understanding its ultrapotency. To our knowledge, this study is the first application of the FBDD method in the discovery of picomolar inhibitors of a membrane protein. This work demonstrates that the novel PFVS approach is a high-throughput drug discovery method, independent of biophysical screening techniques.

  11. A computer hydrogeologic model of the Nevada Test Site and surrounding region

    International Nuclear Information System (INIS)

    Gillson, R.; Hand, J.; Adams, P.; Lawrence, S.

    1996-01-01

    A three-dimensional, hydrogeologic model of the Nevada Test Site and surrounding region was developed as an element for regional groundwater flow and radionuclide transport models. The hydrogeologic model shows the distribution, thickness, and structural relationships of major aquifers and confining units, as conceived by a team of experts organized by the U.S. Department of Energy Nevada Operations Office. The model was created using Intergraph Corporation's Geographical Information System based Environmental Resource Management Application software. The study area encompasses more than 28,000 square kilometers in southern Nevada and Inyo County, California. Fifty-three geologic cross sections were constructed throughout the study area to provide a framework for the model. The lithology was simplified to 16 hydrostratigraphic units, and the geologic structures with minimal effect on groundwater flow were removed. Digitized cross sections, surface geology, and surface elevation data were the primary sources for the hydrogeologic model and database. Elevation data for the hydrostratigraphic units were posted, contoured, and gridded. Intergraph Corporation's three-dimensional visualization software, VOXEL trademark, was used to view the results interactively. The hydrogeologic database will be used in future flow modeling activities

  12. Establishing a standard calibration methodology for MOSFET detectors in computed tomography dosimetry

    International Nuclear Information System (INIS)

    Brady, S. L.; Kaufman, R. A.

    2012-01-01

    Purpose: The use of metal-oxide-semiconductor field-effect transistor (MOSFET) detectors for patient dosimetry has increased by ∼25% since 2005. Despite this increase, no standard calibration methodology has been identified nor calibration uncertainty quantified for the use of MOSFET dosimetry in CT. This work compares three MOSFET calibration methodologies proposed in the literature, and additionally investigates questions relating to optimal time for signal equilibration and exposure levels for maximum calibration precision. Methods: The calibration methodologies tested were (1) free in-air (FIA) with radiographic x-ray tube, (2) FIA with stationary CT x-ray tube, and (3) within scatter phantom with rotational CT x-ray tube. Each calibration was performed at absorbed dose levels of 10, 23, and 35 mGy. Times of 0 min or 5 min were investigated for signal equilibration before or after signal read out. Results: Calibration precision was measured to be better than 5%–7%, 3%–5%, and 2%–4% for the 10, 23, and 35 mGy respective dose levels, and independent of calibration methodology. No correlation was demonstrated for precision and signal equilibration time when allowing 5 min before or after signal read out. Differences in average calibration coefficients were demonstrated between the FIA with CT calibration methodology 26.7 ± 1.1 mV cGy⁻¹ versus the CT scatter phantom 29.2 ± 1.0 mV cGy⁻¹ and FIA with x-ray 29.9 ± 1.1 mV cGy⁻¹ methodologies. A decrease in MOSFET sensitivity was seen at an average change in read out voltage of ∼3000 mV. Conclusions: The best measured calibration precision was obtained by exposing the MOSFET detectors to 23 mGy. No signal equilibration time is necessary to improve calibration precision. A significant difference between calibration outcomes was demonstrated for FIA with CT compared to the other two methodologies. If the FIA with a CT calibration methodology was used to create calibration coefficients for the
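
    As a sketch of the arithmetic only (the readings below are invented), a calibration coefficient in mV per cGy and its precision, expressed as a coefficient of variation, can be computed from repeated exposures at one dose level:

        import numpy as np

        def calibration_coefficient(delta_mV, dose_mGy):
            # Coefficient in mV/cGy from repeated exposures at the same absorbed dose.
            coeffs = np.asarray(delta_mV, float) / (float(dose_mGy) / 10.0)
            cv_percent = coeffs.std(ddof=1) / coeffs.mean() * 100.0
            return coeffs.mean(), cv_percent

        # Hypothetical MOSFET voltage changes (mV) from five repeated 23 mGy exposures.
        mean_coeff, cv = calibration_coefficient([62.0, 60.0, 63.0, 61.0, 62.0], 23.0)
        print(f"{mean_coeff:.1f} mV/cGy, precision {cv:.1f}%")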

  13. Establishing a standard calibration methodology for MOSFET detectors in computed tomography dosimetry.

    Science.gov (United States)

    Brady, S L; Kaufman, R A

    2012-06-01

    The use of metal-oxide-semiconductor field-effect transistor (MOSFET) detectors for patient dosimetry has increased by ~25% since 2005. Despite this increase, no standard calibration methodology has been identified nor calibration uncertainty quantified for the use of MOSFET dosimetry in CT. This work compares three MOSFET calibration methodologies proposed in the literature, and additionally investigates questions relating to optimal time for signal equilibration and exposure levels for maximum calibration precision. The calibration methodologies tested were (1) free in-air (FIA) with radiographic x-ray tube, (2) FIA with stationary CT x-ray tube, and (3) within scatter phantom with rotational CT x-ray tube. Each calibration was performed at absorbed dose levels of 10, 23, and 35 mGy. Times of 0 min or 5 min were investigated for signal equilibration before or after signal read out. Calibration precision was measured to be better than 5%-7%, 3%-5%, and 2%-4% for the 10, 23, and 35 mGy respective dose levels, and independent of calibration methodology. No correlation was demonstrated for precision and signal equilibration time when allowing 5 min before or after signal read out. Differences in average calibration coefficients were demonstrated between the FIA with CT calibration methodology 26.7 ± 1.1 mV cGy(-1) versus the CT scatter phantom 29.2 ± 1.0 mV cGy(-1) and FIA with x-ray 29.9 ± 1.1 mV cGy(-1) methodologies. A decrease in MOSFET sensitivity was seen at an average change in read out voltage of ~3000 mV. The best measured calibration precision was obtained by exposing the MOSFET detectors to 23 mGy. No signal equilibration time is necessary to improve calibration precision. A significant difference between calibration outcomes was demonstrated for FIA with CT compared to the other two methodologies. If the FIA with a CT calibration methodology was used to create calibration coefficients for the eventual use for phantom dosimetry, a measurement error ~12

  14. Catalytic surface radical in dye-decolorizing peroxidase: a computational, spectroscopic and site-directed mutagenesis study

    Science.gov (United States)

    Linde, Dolores; Pogni, Rebecca; Cañellas, Marina; Lucas, Fátima; Guallar, Victor; Baratto, Maria Camilla; Sinicropi, Adalgisa; Sáez-Jiménez, Verónica; Coscolín, Cristina; Romero, Antonio; Medrano, Francisco Javier; Ruiz-Dueñas, Francisco J.; Martínez, Angel T.

    2014-01-01

    Dye-decolorizing peroxidase (DyP) of Auricularia auricula-judae has been expressed in Escherichia coli as a representative of a new DyP family, and subjected to mutagenic, spectroscopic, crystallographic and computational studies. The crystal structure of DyP shows a buried haem cofactor, and surface tryptophan and tyrosine residues potentially involved in long-range electron transfer from bulky dyes. Simulations using PELE (Protein Energy Landscape Exploration) software provided several binding-energy optima for the anthraquinone-type RB19 (Reactive Blue 19) near the above aromatic residues and the haem access-channel. Subsequent QM/MM (quantum mechanics/molecular mechanics) calculations showed a higher tendency of Trp-377 than other exposed haem-neighbouring residues to harbour a catalytic protein radical, and identified the electron-transfer pathway. The existence of such a radical in H₂O₂-activated DyP was shown by low-temperature EPR, being identified as a mixed tryptophanyl/tyrosyl radical in multifrequency experiments. The signal was dominated by the Trp-377 neutral radical contribution, which disappeared in the W377S variant, and included a tyrosyl contribution assigned to Tyr-337 after analysing the W377S spectra. Kinetics of substrate oxidation by DyP suggests the existence of high- and low-turnover sites. The high-turnover site for oxidation of RB19 (kcat > 200 s⁻¹) and other DyP substrates was assigned to Trp-377 since it was absent from the W377S variant. The low-turnover site/s (RB19 kcat ~20 s⁻¹) could correspond to the haem access-channel, since activity was decreased when the haem channel was occluded by the G169L mutation. If a tyrosine residue is also involved, it will be different from Tyr-337 since all activities are largely unaffected in the Y337S variant. PMID:25495127

  15. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    Science.gov (United States)

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
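
    A toy calculation (not BioNetGen or Kappa syntax) illustrates the combinatorial point made above: a protein with n independent binary sites has 2^n modification states and on the order of n·2^(n-1) explicit modification reactions, whereas a rule-based description needs only n context-free rules.

        def network_size(n_sites):
            states = 2 ** n_sites                      # possible modification states
            reactions = n_sites * 2 ** (n_sites - 1)   # one explicit reaction per site per context
            rules = n_sites                            # local, context-free rules
            return states, reactions, rules

        for n in (2, 4, 8, 12):
            print(n, network_size(n))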

  16. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems.

    Science.gov (United States)

    Chylek, Lily A; Harris, Leonard A; Tung, Chang-Shung; Faeder, James R; Lopez, Carlos F; Hlavacek, William S

    2014-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and posttranslational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). © 2013 Wiley Periodicals, Inc.

  17. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    Science.gov (United States)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which is a major limitation in many ways, from limited processing power and storage to restricted accessibility and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. It presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaborative geospatial platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available at all times, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multi-user collaboration platform, uses interoperable programming-language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  18. Prescribed computer games in addition to occlusion versus standard occlusion treatment for childhood amblyopia: a pilot randomised controlled trial.

    Science.gov (United States)

    Tailor, Vijay K; Glaze, Selina; Khandelwal, Payal; Davis, Alison; Adams, Gillian G W; Xing, Wen; Bunce, Catey; Dahlmann-Noor, Annegret

    2015-01-01

    Amblyopia ("lazy eye") is the commonest vision deficit in children. If not fully corrected by glasses, amblyopia is treated by patching or blurring the better-seeing eye. Compliance with patching is often poor. Computer-based activities are increasingly topical, both as an adjunct to standard treatment and as a platform for novel treatments. Acceptability by families has not been explored, and feasibility of a randomised controlled trial (RCT) using computer games in terms of recruitment and treatment acceptability is uncertain. We carried out a pilot RCT to test whether computer-based activities are acceptable and accessible to families and to test trial methods such as recruitment and retention rates, randomisation, trial-specific data collection tools and analysis. The trial had three arms: standard near activity advice, Eye Five, a package developed for children with amblyopia, and an off-the-shelf handheld games console with pre-installed games. We enrolled 60 children age 3-8 years with moderate or severe amblyopia after completion of optical treatment. This trial was registered as UKCRN-ID 11074. Pre-screening of 3600 medical notes identified 189 potentially eligible children, of whom 60 remained eligible after optical treatment, and were enrolled between April 2012 and March 2013. One participant was randomised twice and withdrawn from the study. Of the 58 remaining, 37 were boys. The mean (SD) age was 4.6 (1.7) years. Thirty-seven had moderate and 21 severe amblyopia. Three participants were withdrawn at week 6, and in total, four were lost to follow-up at week 12. Most children and parents/carers found the study procedures, i.e. occlusion treatment, usage of the allocated near activity and completion of a study diary, easy. The prescribed cumulative dose of near activity was 84 h at 12 weeks. Reported near activity usage numbers were close to prescribed numbers in moderate amblyopes (94 % of prescribed) but markedly less in severe amblyopes (64

  19. Positron emission tomography-computed tomography standardized uptake values in clinical practice and assessing response to therapy.

    Science.gov (United States)

    Kinahan, Paul E; Fletcher, James W

    2010-12-01

    The use of standardized uptake values (SUVs) is now commonplace in clinical 2-deoxy-2-[(18)F] fluoro-D-glucose (FDG) positron emission tomography-computed tomography oncology imaging and has a specific role in assessing patient response to cancer therapy. Ideally, the use of SUVs removes variability introduced by differences in patient size and the amount of injected FDG. However, in practice there are several sources of bias and variance that are introduced in the measurement of FDG uptake in tumors and also in the conversion of the image count data to SUVs. In this article the overall imaging process is reviewed and estimates of the magnitude of errors, where known, are given. Recommendations are provided for best practices in improving SUV accuracy. Copyright © 2010 Elsevier Inc. All rights reserved.
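
    For orientation, the body-weight SUV itself is a simple ratio; the sketch below reduces the acquisition-chain corrections discussed in the article to a bare decay correction of the injected dose, and the numbers in the example are hypothetical.

        import math

        F18_HALF_LIFE_S = 109.77 * 60.0  # physical half-life of fluorine-18, in seconds

        def suv_bw(tissue_bq_per_ml, injected_bq, body_weight_kg, uptake_time_s):
            # Decay-correct the injected activity to the start of acquisition.
            decayed_bq = injected_bq * math.exp(-math.log(2.0) * uptake_time_s / F18_HALF_LIFE_S)
            # SUV_bw = tissue concentration / (activity / body mass), assuming 1 g/mL tissue density.
            return tissue_bq_per_ml / (decayed_bq / (body_weight_kg * 1000.0))

        # Example: 15 kBq/mL lesion uptake, 350 MBq injected, 75 kg patient, 60 min uptake time.
        print(round(suv_bw(15e3, 350e6, 75.0, 3600.0), 2))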

  20. Parallel Computation on Multicore Processors Using Explicit Form of the Finite Element Method and C++ Standard Libraries

    Directory of Open Access Journals (Sweden)

    Rek Václav

    2016-11-01

    In this paper, modifications of existing sequential code, written in the C or C++ programming language, for the calculation of various kinds of structures using the explicit form of the Finite Element Method (Dynamic Relaxation Method, Explicit Dynamics) in the NEXX system are introduced. The NEXX system is the core of the engineering software NEXIS, Scia Engineer, RFEM and RENEX. It supports multithreaded execution, which can now be implemented at the level of the native C++ programming language using its standard libraries. Thanks to the high degree of abstraction that contemporary C++ provides, a library created in this way can be generalized for other uses of parallelism in computational mechanics.

  1. The Fermilab Advanced Computer Program multi-array processor system (ACPMAPS): A site oriented supercomputer for theoretical physics

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.

    1988-08-01

    The ACP Multi-Array Processor System (ACPMAPS) is a highly cost effective, local memory parallel computer designed for floating point intensive grid based problems. The processing nodes of the system are single board array processors based on the FORTRAN and C programmable Weitek XL chip set. The nodes are connected by a network of very high bandwidth 16 port crossbar switches. The architecture is designed to achieve the highest possible cost effectiveness while maintaining a high level of programmability. The primary application of the machine at Fermilab will be lattice gauge theory. The hardware is supported by a transparent site oriented software system called CANOPY which shields theorist users from the underlying node structure. 4 refs., 2 figs

  2. Applying standardized uptake values in gallium-67-citrate single-photon emission computed tomography/computed tomography studies and their correlation with blood test results in representative organs.

    Science.gov (United States)

    Toriihara, Akira; Daisaki, Hiromitsu; Yamaguchi, Akihiro; Yoshida, Katsuya; Isogai, Jun; Tateishi, Ukihide

    2018-05-21

    Recently, semiquantitative analysis using standardized uptake value (SUV) has been introduced in bone single-photon emission computed tomography/computed tomography (SPECT/CT). Our purposes were to apply an SUV-based semiquantitative analytic method to gallium-67 (Ga)-citrate SPECT/CT and to evaluate the correlation between the SUV of physiological uptake and blood test results in representative organs. The accuracy of the semiquantitative method was validated using a National Electrical Manufacturers Association body phantom study (radioactivity ratio of sphere to background = 4:1). Thereafter, 59 patients (34 male and 25 female; mean age, 66.9 years) who had undergone Ga-citrate SPECT/CT were retrospectively enrolled in the study. A mean SUV of physiological uptake was calculated for the following organs: the lungs, right atrium, liver, kidneys, spleen, gluteal muscles, and bone marrow. The correlation between physiological uptakes and blood test results was evaluated using Pearson's correlation coefficient. The phantom study revealed only 1% error between theoretical and actual SUVs in the background, suggesting sufficient accuracy of scatter and attenuation corrections. However, a partial volume effect could not be overlooked, particularly in small spheres with a diameter of less than 28 mm. The highest mean SUV was observed in the liver (range: 0.44-4.64), followed by bone marrow (range: 0.33-3.60), spleen (range: 0.52-2.12), and kidneys (range: 0.42-1.45). There was no significant correlation between hepatic uptake and liver function, renal uptake and renal function, or bone marrow uptake and blood cell count (P>0.05). The physiological uptake in Ga-citrate SPECT/CT can be represented as SUVs, which are not significantly correlated with corresponding blood test results.

  3. Dose reduction in abdominal computed tomography: intraindividual comparison of image quality of full-dose standard and half-dose iterative reconstructions with dual-source computed tomography.

    Science.gov (United States)

    May, Matthias S; Wüst, Wolfgang; Brand, Michael; Stahl, Christian; Allmendinger, Thomas; Schmidt, Bernhard; Uder, Michael; Lell, Michael M

    2011-07-01

    We sought to evaluate the image quality of iterative reconstruction in image space (IRIS) in half-dose (HD) datasets compared with full-dose (FD) and HD filtered back projection (FBP) reconstruction in abdominal computed tomography (CT). To acquire data with FD and HD simultaneously, contrast-enhanced abdominal CT was performed with a dual-source CT system, both tubes operating at 120 kV, 100 ref.mAs, and pitch 0.8. Three different image datasets were reconstructed from the raw data: standard FD images applying FBP, which served as the reference, HD images applying FBP, and HD images applying IRIS. For the HD datasets, only data from one tube-detector system were used. Quantitative image quality analysis was performed by measuring image noise in tissue and air. Qualitative image quality was evaluated according to the European Guidelines on Quality criteria for CT. Additional assessment of artifacts, lesion conspicuity, and edge sharpness was performed. Image noise in soft tissue was substantially decreased in HD-IRIS (-3.4 HU, -22%) and increased in HD-FBP (+6.2 HU, +39%) images when compared with the reference (mean noise, 15.9 HU). No significant differences between the FD-FBP and HD-IRIS images were found for the visually sharp anatomic reproduction, overall diagnostic acceptability (P = 0.923), lesion conspicuity (P = 0.592), and edge sharpness (P = 0.589), while HD-FBP was rated inferior. Streak artifacts and beam hardening were significantly more prominent in HD-FBP, while HD-IRIS images exhibited a slightly different noise pattern. Direct intrapatient comparison of standard FD body protocols and HD-IRIS reconstruction suggests that the latest iterative reconstruction algorithms allow for approximately 50% dose reduction without deterioration of the high image quality necessary for confident diagnosis.
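
    The quantitative part of such a comparison is simple to sketch: image noise is taken as the standard deviation of Hounsfield units in a homogeneous region of interest, and reported as an absolute and relative change against the full-dose reference. The arrays below are synthetic stand-ins, not patient data.

        import numpy as np

        def roi_noise(hu_image, mask):
            # Noise = standard deviation of HU values inside the region-of-interest mask.
            return float(np.std(hu_image[mask], ddof=1))

        def noise_change(noise_test, noise_ref):
            return noise_test - noise_ref, 100.0 * (noise_test - noise_ref) / noise_ref

        rng = np.random.default_rng(1)
        mask = np.ones((64, 64), dtype=bool)
        fd_fbp = rng.normal(50.0, 15.9, (64, 64))    # stand-in for the full-dose FBP reference
        hd_iris = rng.normal(50.0, 12.5, (64, 64))   # stand-in for the half-dose IRIS image
        print(noise_change(roi_noise(hd_iris, mask), roi_noise(fd_fbp, mask)))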

  4. High-definition multidetector computed tomography for evaluation of coronary artery stents: comparison to standard-definition 64-detector row computed tomography.

    Science.gov (United States)

    Min, James K; Swaminathan, Rajesh V; Vass, Melissa; Gallagher, Scott; Weinsaft, Jonathan W

    2009-01-01

    The assessment of coronary stents with present-generation 64-detector row computed tomography scanners that use filtered backprojection and operate at standard definition of 0.5-0.75 mm (SDCT) is limited by imaging artifacts and noise. We evaluated the performance of a novel high-definition 64-slice CT scanner (HDCT) with improved spatial resolution (0.23 mm) and applied statistical iterative reconstruction (ASIR) for evaluation of coronary artery stents. HDCT and SDCT stent imaging was performed with the use of an ex vivo phantom. HDCT was compared with SDCT with both smooth and sharp kernels for stent intraluminal diameter, intraluminal area, and image noise. Intrastent visualization was assessed with an ASIR algorithm on HDCT scans, compared with the filtered backprojection algorithms of SDCT. Six coronary stents (2.5, 2.5, 2.75, 3.0, 3.5, and 4.0 mm) were analyzed by 2 independent readers. Interobserver correlation was high for both HDCT and SDCT. HDCT yielded substantially larger luminal area visualization compared with SDCT, for both smooth (29.4+/-14.5 versus 20.1+/-13.0; P<0.001) and sharp (32.0+/-15.2 versus 25.5+/-12.0; P<0.001) kernels. Stent diameter was higher with HDCT compared with SDCT, for both smooth (1.54+/-0.59 versus 1.00+/-0.50; P<0.0001) and detailed (1.47+/-0.65 versus 1.08+/-0.54; P<0.0001) kernels. With detailed kernels, HDCT scans using ASIR showed a trend toward decreased image noise compared with SDCT filtered backprojection. On the basis of this ex vivo study, HDCT provides superior detection of intrastent luminal area and diameter visualization compared with SDCT. ASIR image reconstruction techniques for HDCT scans enhance in-stent assessment while decreasing image noise.

  5. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE

  6. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE.

  7. Comparison of x ray computed tomography number to proton relative linear stopping power conversion functions using a standard phantom.

    Science.gov (United States)

    Moyers, M F

    2014-06-01

    Adequate evaluation of the results from multi-institutional trials involving light ion beam treatments requires consideration of the planning margins applied to both targets and organs at risk. A major uncertainty that affects the size of these margins is the conversion of x ray computed tomography numbers (XCTNs) to relative linear stopping powers (RLSPs). Various facilities engaged in multi-institutional clinical trials involving proton beams have been applying significantly different margins in their patient planning. This study was performed to determine the variance in the conversion functions used at proton facilities in the U.S.A. wishing to participate in National Cancer Institute sponsored clinical trials. A simplified method of determining the conversion function was developed using a standard phantom containing only water and aluminum. The new method was based on the premise that all scanners have their XCTNs for air and water calibrated daily to constant values but that the XCTNs for high density/high atomic number materials vary with different scanning conditions. The standard phantom was taken to 10 different proton facilities and scanned with the local protocols, resulting in 14 derived conversion functions which were compared to the conversion functions used at the local facilities. For tissues within ±300 XCTN of water, all facility functions produced converted RLSP values within ±6% of the values produced by the standard function and within 8% of the values from any other facility's function. For XCTNs corresponding to lung tissue, converted RLSP values differed by as much as ±8% from the standard and up to 16% from the values of other facilities. For XCTNs corresponding to low-density immobilization foam, the maximum to minimum values differed by as much as 40%. The new method greatly simplifies determination of the conversion function, reduces ambiguity, and in the future could promote standardization between facilities. Although it
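
    A conversion function of this kind maps x ray CT numbers to relative linear stopping powers, typically as a piecewise-linear curve anchored at a few calibration materials. The sketch below only illustrates that idea; the calibration points are hypothetical placeholders, not values from the standard phantom or from any facility's protocol.

        import numpy as np

        # Hypothetical calibration points (XCTN -> RLSP); real functions are
        # derived from each facility's phantom scans, not from these values.
        _xctn = np.array([-1000.0, 0.0, 1000.0, 3000.0])   # air, water, bone-like, metal-like
        _rlsp = np.array([0.001, 1.000, 1.55, 2.40])

        def xctn_to_rlsp(xctn):
            """Piecewise-linear conversion of CT number to relative linear stopping power."""
            return np.interp(xctn, _xctn, _rlsp)

        print(xctn_to_rlsp(-300), xctn_to_rlsp(40))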

  8. Simulation of unsaturated flow and solute transport at the Las Cruces trench site using the PORFLO-3 computer code

    International Nuclear Information System (INIS)

    Rockhold, M.L.; Wurstner, S.K.

    1991-03-01

    The objective of this work was to test the ability of the PORFLO-3 computer code to simulate water infiltration and solute transport in dry soils. Data from a field-scale unsaturated zone flow and transport experiment, conducted near Las Cruces, New Mexico, were used for model validation. A spatial moment analysis was used to provide a quantitative basis for comparing the mean simulated and observed flow behavior. The scope of this work was limited to two-dimensional simulations of the second experiment at the Las Cruces trench site. Three simulation cases are presented. The first case represents a uniform soil profile, with homogeneous, isotropic hydraulic and transport properties. The second and third cases represent single stochastic realizations of randomly heterogeneous hydraulic conductivity fields, generated from the cumulative probability distribution of the measured data. Two-dimensional simulations produced water content changes that matched the observed data reasonably well. Models that explicitly incorporated heterogeneous hydraulic conductivity fields reproduced the characteristics of the observed data somewhat better than a uniform, homogeneous model. Improved predictions of water content changes at specific spatial locations were obtained by adjusting the soil hydraulic properties. The results of this study should only be considered a qualitative validation of the PORFLO-3 code. However, the results of this study demonstrate the importance of site-specific data for model calibration. Applications of the code for waste management and remediation activities will require site-specific data for model calibration before defensible predictions of unsaturated flow and contaminant transport can be made. 23 refs., 16 figs., 3 tabs
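
    A spatial moment analysis of the kind used to compare simulated and observed plumes reduces a 2D water-content (or concentration) field to its total mass, centroid, and spread about the centroid. The sketch below shows the standard moment definitions; array and variable names are illustrative and are not tied to PORFLO-3 input or output formats.

        import numpy as np

        def spatial_moments(theta, x, z):
            """Zeroth, first, and second spatial moments of a 2D water-content change field.

            theta : 2D array of water-content change on a regular grid (rows = z, cols = x)
            x, z  : 1D coordinate arrays for the grid columns and rows
            """
            X, Z = np.meshgrid(x, z)
            m0 = theta.sum()                                          # zeroth moment (total mass)
            xc, zc = (theta * X).sum() / m0, (theta * Z).sum() / m0   # plume centroid
            sxx = (theta * (X - xc) ** 2).sum() / m0                  # spread about centroid
            szz = (theta * (Z - zc) ** 2).sum() / m0
            return m0, (xc, zc), (sxx, szz)

        # Toy synthetic plume on a 2 m x 1 m grid
        x = np.linspace(0.0, 2.0, 41)
        z = np.linspace(0.0, 1.0, 21)
        X, Z = np.meshgrid(x, z)
        theta = np.exp(-((X - 0.8) ** 2 + (Z - 0.4) ** 2) / 0.02)
        print(spatial_moments(theta, x, z))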

  9. An analysis of the intent of environmental standards in the U.S. that apply to waste disposed at the Nevada Test Site

    International Nuclear Information System (INIS)

    Hechanova, A.E.; Mattingly, B.T.; Gitnacht, D.

    2001-01-01

    This paper contains a discussion on the application of U.S. regulatory standards for transuranic waste disposed at the Nevada Test Site. Application of current compliance requirements and regulatory guidance defined for a generic disposal system, although satisfying the 'letter of the law,' is shown to be incompatible with the 'intent of the law' based on a thorough review of the preamble and background documents supporting the regulation. Specifically, the standards that apply to transuranic waste disposal were derived assuming deep geologic disposal and much larger and more hazardous waste forms: irradiated nuclear reactor fuel and high-level radioactive waste. Therefore, key assumptions that underpin the analyses used to justify the standards (e.g., the ground water pathway being considered the only major release mechanism) are inconsistent with the nature of the radionuclide inventory and the intermediate depth of waste emplacement in Greater Confinement Disposal boreholes at the Nevada Test Site. The authors recommend that site specific performance metrics be determined to foster an analysis which is transparent and consistent with U.S. Environmental Protection Agency intent in developing the standards for a generic disposal system. (authors)

  10. Out of Hours Emergency Computed Tomography Brain Studies: Comparison of Standard 3 Megapixel Diagnostic Workstation Monitors With the iPad 2.

    Science.gov (United States)

    Salati, Umer; Leong, Sum; Donnellan, John; Kok, Hong Kuan; Buckley, Orla; Torreggiani, William

    2015-11-01

    The purpose was to compare the performance of diagnostic workstation monitors and the Apple iPad 2 (Cupertino, CA) in the interpretation of emergency computed tomography (CT) brain studies. Two experienced radiologists interpreted 100 random emergency CT brain studies both on on-site diagnostic workstation monitors and on the iPad 2 via remote access. The radiologists were blinded to patient clinical details and to each other's interpretation, and the study list was randomized between interpretations on the different modalities. Interobserver agreement between radiologists and intraobserver agreement between modalities were determined, and Cohen kappa coefficients were calculated for each. Performance with regard to urgent and nonurgent abnormalities was assessed separately. There was substantial intraobserver agreement of both radiologists between the modalities, with overall calculated kappa values of 0.959 and 0.940 in detecting acute abnormalities and perfect agreement with regard to hemorrhage. Intraobserver agreement kappa values were 0.939 and 0.860 for nonurgent abnormalities. Interobserver agreement between the 2 radiologists for both the diagnostic monitors and the iPad 2 was also substantial, ranging from 0.821 to 0.860. The iPad 2 is a reliable modality for the interpretation of CT brain studies in the emergency setting and for the detection of acute and chronic abnormalities, with performance comparable to standard diagnostic workstation monitors.
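
    Cohen's kappa, used here for both the interobserver and intraobserver comparisons, corrects raw agreement for the agreement expected by chance. A minimal sketch follows; the toy reader data are invented for illustration only.

        def cohens_kappa(a, b):
            """Cohen's kappa for two raters' paired categorical readings."""
            assert len(a) == len(b)
            n = len(a)
            cats = sorted(set(a) | set(b))
            p_obs = sum(x == y for x, y in zip(a, b)) / n
            p_exp = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
            return (p_obs - p_exp) / (1 - p_exp)

        # Toy example: two readers agreeing on 9 of 10 studies -> kappa = 0.8
        reader1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
        reader2 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
        print(cohens_kappa(reader1, reader2))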

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  12. Standardizing Scale Height Computation of MAVEN NGIMS Neutral Data and Variations Between Exobase and Homopause Scale Heights

    Science.gov (United States)

    Elrod, M. K.; Slipski, M.; Curry, S.; Williamson, H. N.; Benna, M.; Mahaffy, P. R.

    2017-12-01

    The MAVEN NGIMS team produces a level 3 product which includes the computation of Ar scale heights and atmospheric temperatures at 200 km. In the latest version (v05_r01) this has been revised to include scale height fits for CO2, N2, O, and CO. Members of the MAVEN team have used various methods to compute scale heights, leading to significant variations in scale height values depending on fits and techniques, even within a few orbits or, occasionally, the same pass. Additionally, fitting scale heights in a very stable atmosphere, such as the day side versus the night side, can give different results depending on boundary conditions. Currently, most methods compute only Ar scale heights, as Ar is most stable and reacts least with the instrument. The NGIMS team has chosen to expand these fitting techniques to include fitted scale heights for CO2, N2, CO, and O. Having compared multiple techniques, the method found to be most reliable under most conditions was a simple fit method: the exobase altitude of the CO2 atmosphere is taken as the highest point for fitting, the periapsis as the lowest point, and log(density) is fit against altitude. The slope of log(density) versus altitude is -1/H, where H is the scale height of the atmosphere for each species. Since this region lies between the homopause and the exobase, each species has a different scale height at this point. This is being released as a new standardization for the level 3 product, with the understanding that scientists and team members will continue to compute more precise scale heights and temperatures as needed based on science and model demands. This is being released in the PDS NGIMS level 3 v05 files for August 2017. Additionally, we are examining these scale heights for variations seasonally, diurnally, and above and below the exobase. The atmosphere is significantly more stable on the dayside than on the nightside. We have also found
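
    The fitting method described reduces to a linear regression of log(density) against altitude between periapsis and the exobase, with the scale height recovered from the slope. A minimal sketch, using a synthetic isothermal profile rather than NGIMS data:

        import numpy as np

        def fit_scale_height(altitude_km, density):
            """Fit log(density) vs altitude; the slope is -1/H, so H = -1/slope (km)."""
            slope, _ = np.polyfit(altitude_km, np.log(density), 1)
            return -1.0 / slope

        # Synthetic isothermal atmosphere with H = 10 km between 150 and 200 km
        z = np.linspace(150.0, 200.0, 26)
        n = 1e10 * np.exp(-(z - 150.0) / 10.0)
        print(fit_scale_height(z, n))  # ~10.0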

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  14. The Construction And Instrumentation Of A Pilot Treatment System At The Standard Mine Superfund Site, Crested Butte, CO

    Science.gov (United States)

    A pilot biochemical reactor (BCR) was designed and constructed to treat mine-influenced water emanating from an adit at a remote site in southern Colorado which receives an average of 400 inches (10.2 meters) of snowfall each season. The objective of the study is to operate and ...

  15. The Construction And Instrumentation Of A Pilot Treatment System At The Standard Mine Superfund Site, Crested Butte, CO - (Presentation)

    Science.gov (United States)

    A pilot biochemical reactor (BCR) was designed and constructed to treat mine-influenced water emanating from an adit at a remote site in southern Colorado which receives an average of 400 inches (10.2 meters) of snowfall each season. The objective of the study is to operate and ...

  16. Standard anatomical and visual space for the mouse retina: computational reconstruction and transformation of flattened retinae with the Retistruct package.

    Directory of Open Access Journals (Sweden)

    David C Sterratt

    The concept of topographic mapping is central to the understanding of the visual system at many levels, from the developmental to the computational. It is important to be able to relate different coordinate systems, e.g. maps of the visual field and maps of the retina. Retinal maps are frequently based on flat-mount preparations. These use dissection and relaxing cuts to render the quasi-spherical retina into a 2D preparation. The variable nature of relaxing cuts and associated tears limits quantitative cross-animal comparisons. We present an algorithm, "Retistruct," that reconstructs retinal flat-mounts by mapping them into a standard, spherical retinal space. This is achieved by: stitching the marked-up cuts of the flat-mount outline; dividing the stitched outline into a mesh whose vertices then are mapped onto a curtailed sphere; and finally moving the vertices so as to minimise a physically-inspired deformation energy function. Our validation studies indicate that the algorithm can estimate the position of a point on the intact adult retina to within 8° of arc (3.6% of the nasotemporal axis). The coordinates in reconstructed retinae can be transformed to visuotopic coordinates. Retistruct is used to investigate the organisation of the adult mouse visual system. We orient the retina relative to the nictitating membrane and compare this to eye muscle insertions. To align the retinotopic and visuotopic coordinate systems in the mouse, we utilised the geometry of binocular vision. In standard retinal space, the composite decussation line for the uncrossed retinal projection is located 64° away from the retinal pole. Projecting anatomically defined uncrossed retinal projections into visual space gives binocular congruence if the optical axis of the mouse eye is oriented at 64° azimuth and 22° elevation, in concordance with previous results. Moreover, using these coordinates, the dorsoventral boundary for S-opsin expressing cones closely matches
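
    The final step described, moving mesh vertices to minimise a physically-inspired deformation energy, can be pictured with a much simpler toy problem. The sketch below relaxes a tiny 2D triangle mesh toward prescribed edge rest lengths; Retistruct's actual energy, spherical mapping, and stitching are considerably more involved, so this is only an analogy, not the package's algorithm.

        import numpy as np
        from scipy.optimize import minimize

        def spring_energy(flat_pts, edges, rest_lengths):
            """Toy deformation energy: sum of squared deviations of edge lengths
            from their rest lengths (a stand-in for Retistruct's energy term)."""
            p = flat_pts.reshape(-1, 2)
            e = 0.0
            for (i, j), L0 in zip(edges, rest_lengths):
                e += (np.linalg.norm(p[i] - p[j]) - L0) ** 2
            return e

        # Tiny triangle mesh relaxed toward unit edge lengths
        pts = np.array([[0.0, 0.0], [1.2, 0.0], [0.0, 0.8]])
        edges = [(0, 1), (1, 2), (2, 0)]
        rest = [1.0, 1.0, 1.0]
        res = minimize(spring_energy, pts.ravel(), args=(edges, rest))
        print(res.fun)  # residual energy near zero after relaxation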

  17. SU-F-R-30: Interscanner Variability of Radiomics Features in Computed Tomography (CT) Using a Standard ACR Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Shafiq ul Hassan, M; Zhang, G; Moros, E [H Lee Moffitt Cancer Center and Research Institute, Tampa, FL (United States); Department of Physics, University of South Florida, Tampa, FL (United States); Budzevich, M; Latifi, K; Hunt, D; Gillies, R [H Lee Moffitt Cancer Center and Research Institute, Tampa, FL (United States)

    2016-06-15

    Purpose: A simple approach to investigate interscanner variability of radiomics features in computed tomography (CT) using a standard ACR phantom. Methods: The standard ACR phantom was scanned on CT scanners from three different manufacturers. Scanning parameters of 120 kVp and 200 mA were used, with a slice thickness of 3.0 mm on two scanners and 3.27 mm on the third. Three spherical regions of interest (ROI) from the water, medium-density, and high-density inserts were contoured. Ninety-four radiomics features were extracted using an in-house program. These features include shape (11), intensity (22), GLCM (26), GLZSM (11), RLM (11), and NGTDM (5) features and 8 fractal dimension features. To evaluate the interscanner variability across the three scanners, a coefficient of variation (COV) was calculated for each feature group. Each group was further classified according to the COV by calculating the percentage of features in each of the following categories: COV less than 2%, between 2% and 10%, and greater than 10%. Results: For all feature groups, a similar trend was observed for the three different inserts. Shape features were the most robust for all scanners, as expected: 70% of the shape features had COV <2%. For the intensity feature group, the percentage of features with COV <2% varied from 9% to 32% across scanners. All features in the four groups GLCM, GLZSM, RLM, and NGTDM were found to have interscanner variability ≥2%. The fractal dimension behavior for the medium- and high-density inserts was similar, while it differed for the water insert. Conclusion: We conclude that even for similar scanning conditions, interscanner variability across different scanners was significant. The texture features based on GLCM, GLZSM, RLM, and NGTDM are highly scanner dependent. Since the inserts of the ACR phantom are not heterogeneous in HU values, this suggests that matrix-based second-order features are highly affected by variation in noise. Research partly funded by NIH/NCI R01CA190105-01.
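
    The interscanner comparison rests on the coefficient of variation of each feature across scanners, binned into the <2%, 2-10%, and >10% categories. A minimal sketch of that bookkeeping, using a synthetic features-by-scanners array rather than the phantom measurements:

        import numpy as np

        def cov_percent(values):
            """Coefficient of variation (%) of one feature across scanners."""
            values = np.asarray(values, dtype=float)
            return 100.0 * values.std(ddof=1) / values.mean()

        def categorize(feature_matrix):
            """feature_matrix: rows = features, columns = scanners.
            Returns the percentage of features with COV <2%, 2-10%, and >10%."""
            covs = np.array([cov_percent(row) for row in feature_matrix])
            n = len(covs)
            return {"<2%": 100 * np.sum(covs < 2) / n,
                    "2-10%": 100 * np.sum((covs >= 2) & (covs <= 10)) / n,
                    ">10%": 100 * np.sum(covs > 10) / n}

        rng = np.random.default_rng(0)
        fake = rng.normal(100, 3, size=(94, 3))   # 94 synthetic features x 3 scanners
        print(categorize(fake))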

  18. Identification and Validation of Novel Hedgehog-Responsive Enhancers Predicted by Computational Analysis of Ci/Gli Binding Site Density

    Science.gov (United States)

    Richards, Neil; Parker, David S.; Johnson, Lisa A.; Allen, Benjamin L.; Barolo, Scott; Gumucio, Deborah L.

    2015-01-01

    The Hedgehog (Hh) signaling pathway directs a multitude of cellular responses during embryogenesis and adult tissue homeostasis. Stimulation of the pathway results in activation of Hh target genes by the transcription factor Ci/Gli, which binds to specific motifs in genomic enhancers. In Drosophila, only a few enhancers (patched, decapentaplegic, wingless, stripe, knot, hairy, orthodenticle) have been shown by in vivo functional assays to depend on direct Ci/Gli regulation. All but one (orthodenticle) contain more than one Ci/Gli site, prompting us to directly test whether homotypic clustering of Ci/Gli binding sites is sufficient to define a Hh-regulated enhancer. We therefore developed a computational algorithm to identify Ci/Gli clusters that are enriched over random expectation, within a given region of the genome. Candidate genomic regions containing Ci/Gli clusters were functionally tested in chicken neural tube electroporation assays and in transgenic flies. Of the 22 Ci/Gli clusters tested, seven novel enhancers (and the previously known patched enhancer) were identified as Hh-responsive and Ci/Gli-dependent in one or both of these assays, including: Cuticular protein 100A (Cpr100A); invected (inv), which encodes an engrailed-related transcription factor expressed at the anterior/posterior wing disc boundary; roadkill (rdx), the fly homolog of vertebrate Spop; the segment polarity gene gooseberry (gsb); and two previously untested regions of the Hh receptor-encoding patched (ptc) gene. We conclude that homotypic Ci/Gli clustering is not sufficient information to ensure Hh-responsiveness; however, it can provide a clue for enhancer recognition within putative Hedgehog target gene loci. PMID:26710299
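
    Identifying regions enriched for Ci/Gli binding sites amounts to scanning the genome for windows whose motif-match count exceeds random expectation. The sketch below illustrates a sliding-window version of that idea only; the consensus pattern, window size, and enrichment threshold are placeholders, not the published algorithm's motif model or statistics.

        import re

        # Hypothetical Ci/Gli-like consensus as a regex; illustrative only.
        CI_GLI_MOTIF = re.compile(r"[TG]GGG[TG]GGTC")

        def window_counts(seq, window=1000, step=100):
            """Count motif matches in sliding windows along a genomic sequence."""
            hits = [m.start() for m in CI_GLI_MOTIF.finditer(seq.upper())]
            for start in range(0, max(1, len(seq) - window + 1), step):
                n = sum(start <= h < start + window for h in hits)
                yield start, n

        def enriched_windows(seq, expected_per_window, fold=3.0, **kw):
            """Flag windows whose match count exceeds `fold` times random expectation."""
            return [(s, n) for s, n in window_counts(seq, **kw)
                    if n >= fold * expected_per_window]

        seq = "ACGT" * 5000
        print(enriched_windows(seq, expected_per_window=0.05))  # no clusters in this toy sequence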

  19. Change of Maximum Standardized Uptake Value Slope in Dynamic Triphasic [18F]-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Distinguishes Malignancy From Postradiation Inflammation in Head-and-Neck Squamous Cell Carcinoma: A Prospective Trial

    International Nuclear Information System (INIS)

    Anderson, Carryn M.; Chang, Tangel; Graham, Michael M.; Marquardt, Michael D.; Button, Anna; Smith, Brian J.; Menda, Yusuf; Sun, Wenqing; Pagedar, Nitin A.; Buatti, John M.

    2015-01-01

    Purpose: To evaluate dynamic [18F]-fluorodeoxyglucose (FDG) uptake methodology as a post–radiation therapy (RT) response assessment tool, potentially enabling accurate tumor and therapy-related inflammation differentiation, improving the posttherapy value of FDG–positron emission tomography/computed tomography (FDG-PET/CT). Methods and Materials: We prospectively enrolled head-and-neck squamous cell carcinoma patients who completed RT, with scheduled 3-month post-RT FDG-PET/CT. Patients underwent our standard whole-body PET/CT scan at 90 minutes, with the addition of head-and-neck PET/CT scans at 60 and 120 minutes. Maximum standardized uptake values (SUVmax) of regions of interest were measured at 60, 90, and 120 minutes. The SUVmax slope between 60 and 120 minutes and change of SUVmax slope before and after 90 minutes were calculated. Data were analyzed by primary site and nodal site disease status using the Cox regression model and Wilcoxon rank sum test. Outcomes were based on pathologic and clinical follow-up. Results: A total of 84 patients were enrolled, with 79 primary and 43 nodal evaluable sites. Twenty-eight sites were interpreted as positive or equivocal (18 primary, 8 nodal, 2 distant) on 3-month 90-minute FDG-PET/CT. Median follow-up was 13.3 months. All measured SUV endpoints predicted recurrence. Change of SUVmax slope after 90 minutes more accurately identified nonrecurrence in positive or equivocal sites than our current standard of SUVmax ≥2.5 (P=.02). Conclusions: The positive predictive value of post-RT FDG-PET/CT may significantly improve using novel second derivative analysis of dynamic triphasic FDG-PET/CT SUVmax slope, accurately distinguishing tumor from inflammation on positive and equivocal scans
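
    With SUVmax measured at 60, 90, and 120 minutes, the slope and its change across the 90-minute point (a discrete second-derivative proxy) follow directly. A minimal sketch with invented uptake values, not patient data:

        def suv_slopes(suv60, suv90, suv120):
            """Overall SUVmax slope over 60-120 min (per hour) and the change of
            slope across the 90-minute point (late slope minus early slope)."""
            early = (suv90 - suv60) / 0.5      # 60 -> 90 min
            late = (suv120 - suv90) / 0.5      # 90 -> 120 min
            overall = (suv120 - suv60) / 1.0   # 60 -> 120 min
            return overall, late - early

        # Toy example: uptake that keeps accelerating (tumor-like pattern)
        print(suv_slopes(4.0, 5.0, 6.5))   # (2.5, 1.0)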

  20. Efficacy of plain radiography and computer tomography in localizing the site of pelvic arterial bleeding in trauma patients

    Energy Technology Data Exchange (ETDEWEB)

    Dormagen, Johann B. (Dept. of Radiology, Oslo Univ. Hospital, Ullevaal, Oslo (Norway)), e-mail: johannd@medisin.uio.no; Toetterman, Anna (Dept. of Orthopedic Surgery, Uppsala Univ. Hospital, Uppsala (Sweden)); Roeise, Olav (Div. of Neuroscience and Musculoskeletal Medicine, Oslo Univ. Hospital, Ullevaal, Oslo (Norway)); Sandvik, Leiv (Center for Clinical Research, Oslo Univ. Hospital, Ullevaal, Oslo (Norway)); Kloew, Nils-E. (Dept. of Cardiovascular Radiology, Oslo Univ. Hospital - Ullevaal, Oslo (Norway))

    2010-01-15

    Background: Immediate angiography is warranted in pelvic trauma patients with suspected arterial injury (AI) in order to stop ongoing bleeding. Prior to angiography, plain pelvic radiography (PPR) and abdominopelvic computer tomography (CT) are performed to identify fracture and hematoma sites. Purpose: To investigate if PPR and CT can identify the location of AI in trauma patients undergoing angiography. Material and Methods: 95 patients with pelvic fractures on PPR (29 women, 66 men), at a mean age of 44 (9-92) years, underwent pelvic angiography for suspected AI. Fifty-six of them underwent CT additionally. Right and left anterior and posterior fractures on PPR were registered, and fracture displacement was recorded for each quadrant. Arterial blush on CT was registered, and the size of the hematoma in each region was measured in cm2. AIs were registered for anterior and posterior segments of both internal iliac arteries. Presence of fractures, arterial blush, and hematomas were correlated with AI. Results: Presence of fracture in the corresponding skeletal segment on PPR showed sensitivity and specificity of 0.86 and 0.58 posteriorly, and 0.87 and 0.44 anteriorly. The area under the curve (AUC) was 0.77 and 0.69, respectively. Fracture displacement on PPR >0.9 cm posteriorly and >1.9 cm anteriorly revealed specificity of 0.84. Sensitivities of arterial blush and hematoma on CT were 0.38 and 0.82 posteriorly, and 0.24 and 0.82 anteriorly. The specificities were 0.96 and 0.58 posteriorly, and 0.79 and 0.53 anteriorly, respectively. For hematomas, the AUC was 0.79 posteriorly and 0.75 anteriorly. Size of hematoma >22 cm2 posteriorly and >29 cm2 anteriorly revealed specificity of 0.85 and 0.86, respectively. Conclusion: CT findings of arterial blush and hematoma predicted site of arterial bleeding on pelvic angiography. Also, PPR predicted the site of bleeding using location of fracture and size of displacement. In the hemodynamically unstable patient, PPR may

  1. Efficacy of plain radiography and computer tomography in localizing the site of pelvic arterial bleeding in trauma patients.

    Science.gov (United States)

    Dormagen, Johann B; Tötterman, Anna; Røise, Olav; Sandvik, Leiv; Kløw, Nils-E

    2010-02-01

    Immediate angiography is warranted in pelvic trauma patients with suspected arterial injury (AI) in order to stop ongoing bleeding. Prior to angiography, plain pelvic radiography (PPR) and abdominopelvic computer tomography (CT) are performed to identify fracture and hematoma sites. To investigate if PPR and CT can identify the location of AI in trauma patients undergoing angiography. 95 patients with pelvic fractures on PPR (29 women, 66 men), at a mean age of 44 (9-92) years, underwent pelvic angiography for suspected AI. Fifty-six of them underwent CT additionally. Right and left anterior and posterior fractures on PPR were registered, and fracture displacement was recorded for each quadrant. Arterial blush on CT was registered, and the size of the hematoma in each region was measured in cm2. AIs were registered for anterior and posterior segments of both internal iliac arteries. Presence of fractures, arterial blush, and hematomas were correlated with AI. Presence of fracture in the corresponding skeletal segment on PPR showed sensitivity and specificity of 0.86 and 0.58 posteriorly, and 0.87 and 0.44 anteriorly. The area under the curve (AUC) was 0.77 and 0.69, respectively. Fracture displacement on PPR >0.9 cm posteriorly and >1.9 cm anteriorly revealed specificity of 0.84. Sensitivities of arterial blush and hematoma on CT were 0.38 and 0.82 posteriorly, and 0.24 and 0.82 anteriorly. The specificities were 0.96 and 0.58 posteriorly, and 0.79 and 0.53 anteriorly, respectively. For hematomas, the AUC was 0.79 posteriorly and 0.75 anteriorly. Size of hematoma >22 cm2 posteriorly and >29 cm2 anteriorly revealed specificity of 0.85 and 0.86, respectively. CT findings of arterial blush and hematoma predicted site of arterial bleeding on pelvic angiography. Also, PPR predicted the site of bleeding using location of fracture and size of displacement. In the hemodynamically unstable patient, PPR may contribute equally to effective assessment of injured arteries.
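
    The sensitivity and specificity values reported here come from 2x2 counts of injuries detected versus missed. The sketch below shows the calculation; the counts are invented only to reproduce the reported posterior PPR values of 0.86 and 0.58 and are not the study's raw data.

        def sens_spec(tp, fn, tn, fp):
            """Sensitivity and specificity from a 2x2 table of counts."""
            return tp / (tp + fn), tn / (tn + fp)

        # Hypothetical counts: 43 true positives, 7 false negatives,
        # 29 true negatives, 21 false positives
        print(sens_spec(43, 7, 29, 21))  # (0.86, 0.58)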

  2. Beyond the standard two-film theory: Computational fluid dynamics simulations for carbon dioxide capture in a wetted wall column

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Chao; Xu, Zhijie; Lai, Canhai; Sun, Xin

    2018-07-01

    The standard two-film theory (STFT) is a diffusion-based mechanism that can be used to describe gas mass transfer across liquid film. Fundamental assumptions of the STFT impose serious limitations on its ability to predict mass transfer coefficients. To better understand gas absorption across liquid film in practical situations, a multiphase computational fluid dynamics (CFD) model fully equipped with mass transport and chemistry capabilities has been developed for solvent-based carbon dioxide (CO2) capture to predict the CO2 mass transfer coefficient in a wetted wall column. The hydrodynamics is modeled using a volume of fluid method, and the diffusive and reactive mass transfer between the two phases is modeled by adopting a one-fluid formulation. We demonstrate that the proposed CFD model can naturally account for the influence of many important factors on the overall mass transfer that cannot be quantitatively explained by the STFT, such as the local variation in fluid velocities and properties, flow instabilities, and complex geometries. The CFD model also can predict the local mass transfer coefficient variation along the column height, which the STFT typically does not consider.
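
    For contrast with the CFD approach, the standard two-film theory estimates the liquid-side mass transfer coefficient simply as diffusivity divided by an assumed stagnant-film thickness, which is exactly the idealization the simulations move beyond. A minimal sketch with representative, assumed values:

        def k_liquid_film(diffusivity_m2_s: float, film_thickness_m: float) -> float:
            """Standard two-film theory estimate of the liquid-side mass transfer
            coefficient, k_L = D / delta (m/s)."""
            return diffusivity_m2_s / film_thickness_m

        # Example: CO2 diffusivity ~1.9e-9 m^2/s in water, assumed 50-micron film
        print(k_liquid_film(1.9e-9, 50e-6))  # ~3.8e-5 m/s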

  3. Apophyseal Ossification of the Iliac Crest in Forensic Age Estimation: Computed Tomography Standards for Modern Australian Subadults.

    Science.gov (United States)

    Lottering, Nicolene; Alston-Knox, Clair L; MacGregor, Donna M; Izatt, Maree T; Grant, Caroline A; Adam, Clayton J; Gregory, Laura S

    2017-03-01

    This study contrasts the ontogeny of the iliac crest apophysis using conventional radiography and multislice computed tomography (MSCT), providing probabilistic information for age estimation of modern Australian subadults. Retrospective abdominopelvic MSCT data acquired from 524 Australian individuals aged 7-25 and surveillance radiographs of adolescent idiopathic scoliosis patients included in the Paediatric Spine Research Group Progression Study (n = 531) were assessed. Ossification scoring of pseudo-radiographs and three-dimensional (3D) volume-rendered reconstructions using Risser (1958) quantitative descriptors indicate discrepancies in age estimates, stage allocation, and conflicting morphological progression. To mitigate visualization limitations associated with two-dimensional radiographs, we provide and validate a modified 3D-MSCT scoring tier of ossification, demonstrating complete fusion between 17.3-19.2 and 17.1-20.1 years in males and females. Legal demarcation for doli incapax presumption and age of majority (18 years) can be achieved using probability estimates from a fitted cumulative probit model for apophyseal fusion using the recalibrated standards.
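
    A cumulative probit model of this kind expresses the probability of complete apophyseal fusion as a normal CDF of age. The sketch below shows the functional form only; the location and scale parameters are illustrative placeholders, not the study's fitted coefficients.

        from scipy.stats import norm

        def p_fused(age_years: float, mu: float = 18.2, sigma: float = 1.0) -> float:
            """Cumulative probit: probability that the iliac crest apophysis is fully
            fused by a given age (mu and sigma are illustrative, not fitted values)."""
            return norm.cdf((age_years - mu) / sigma)

        print(p_fused(16.0), p_fused(18.0), p_fused(20.0))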

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much large at roughly 11 MB per event of RAW. The central collisions are more complex and...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  6. Feasibility Study of Economics and Performance of Solar Photovoltaics at the Standard Chlorine of Delaware Superfund Site in Delaware City, Delaware. A Study Prepared in Partnership with the Environmental Protection Agency for the RE-Powering America's Land Initiative: Siting Renewable Energy on Potentially Contaminated Land and Mine Sites

    Energy Technology Data Exchange (ETDEWEB)

    Salasovich, J.; Geiger, J.; Mosey, G.; Healey, V.

    2013-06-01

    The U.S. Environmental Protection Agency (EPA), in accordance with the RE-Powering America's Land initiative, selected the Standard Chlorine of Delaware site in Delaware City, Delaware, for a feasibility study of renewable energy production. The National Renewable Energy Laboratory (NREL) provided technical assistance for this project. The purpose of this report is to assess the site for a possible photovoltaic (PV) system installation and estimate the cost, performance, and site impacts of different PV options. In addition, the report recommends financing options that could assist in the implementation of a PV system at the site.

  7. Scope of work-supplemental standards-related fieldwork - Salt Lake City UMTRA Project Site, Salt Lake City, Utah

    International Nuclear Information System (INIS)

    1996-01-01

    This scope of work governs the field effort to conduct transient in situ (hereafter referred to by the trademark name HydroPunch®) investigative subsurface logging and ground water sampling, and perform well point installation services at the U.S. Department of Energy's (DOE) Uranium Mill Tailings Remedial Action (UMTRA) Project site near Salt Lake City, Utah. The HydroPunch® and well point services subcontractor (the Subcontractor) shall provide services as stated herein to be used to investigate the subsurface, collect and analyze ground water samples, and install shallow well points

  8. Using Florida Keys Reference Sites As a Standard for Restoration of Forest Structure in Everglades Tree Islands

    International Nuclear Information System (INIS)

    Ross, M.S.; Sah, J.P.; Ruiz, P.L.; Ross, M.S.; Ogurcak, D.E.

    2010-01-01

    In south Florida, tropical hardwood forests (hammocks) occur in Everglades tree islands and as more extensive forests in coastal settings in the nearby Florida Keys. Keys hammocks have been less disturbed by humans, and many qualify as old-growth, while Everglades hammocks have received much heavier use. With improvement of tree island condition being an important element in Everglades restoration efforts, we examined stand structure in 23 Keys hammocks and 69 Everglades tree islands. Based on Stand Density Index and tree diameter distributions, many Everglades hammocks were characterized by low stocking and under-representation in the smaller size classes. In contrast, most Keys forests had the dense canopies and open understories usually associated with old-growth hardwood hammocks. Subject to the same caveats that apply to off-site references elsewhere, structural information from mature Keys hammocks can be helpful in planning and implementing forest restoration in Everglades tree islands. In many of these islands, such restoration might involve supplementing tree stocking by planting native trees to produce more complete site utilization and a more open understory.
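
    Stand Density Index is conventionally computed with Reineke's power-law form from stem density and quadratic mean diameter. The sketch below uses the classic metric formulation with the 1.605 exponent as an assumption, since the abstract does not state which variant was applied.

        def stand_density_index(trees_per_ha: float, quadratic_mean_dbh_cm: float) -> float:
            """Reineke-style Stand Density Index, metric form (assumed):
            SDI = N * (Dq / 25)**1.605, with Dq the quadratic mean diameter in cm."""
            return trees_per_ha * (quadratic_mean_dbh_cm / 25.0) ** 1.605

        # Hypothetical dense, small-diameter stand
        print(stand_density_index(1200.0, 18.0))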

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  12. Standardized uptake value on positron emission tomography/computed tomography predicts prognosis in patients with locally advanced pancreatic cancer.

    Science.gov (United States)

    Wang, Si-Liang; Cao, Shuo; Sun, Yu-Nan; Wu, Rong; Chi, Feng; Tang, Mei-Yue; Jin, Xue-Ying; Chen, Xiao-Dong

    2015-10-01

    The aim of the present study was to investigate the use and value of maximum standardized uptake value (SUVmax) on positron emission tomography/computed tomography (PET/CT) images as a prognostic marker for patients with locally advanced pancreatic cancer (LAPC). The medical records of all consecutive patients who underwent PET/CT examination in our institution were retrospectively reviewed. Inclusion criteria were histologically or cytologically proven LAPC. Patients with distant metastasis were excluded. For statistical analysis, the SUVmax of primary pancreatic cancer was measured. Survival rates were calculated using the Kaplan-Meier method, and multivariable analysis was performed to determine the association of SUVmax with overall survival (OS) and progression-free survival (PFS) using a Cox proportional hazards model. Between July 2006 and June 2013, 69 patients were enrolled in the present study. OS and PFS were 14.9 months [95% confidence interval (CI) 13.1-16.7] and 8.3 months (95% CI 7.1-9.5), respectively. A high SUVmax (>5.5) was observed in 35 patients, who had significantly worse OS and PFS than the remaining patients with a low SUVmax (P = 0.025 and P = 0.003). Univariate analysis showed that SUVmax and tumor size were prognostic factors for OS, with a hazard ratio of 1.90 and 1.81, respectively. A high SUVmax was an independent prognostic factor, with a hazard ratio of 1.89 (95% CI 1.015-3.519, P = 0.045). The present study suggests that increased SUVmax is a predictor of poor prognosis in patients with LAPC.
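
    Survival comparisons of this kind rest on Kaplan-Meier estimates stratified by the SUVmax cutoff (here >5.5), followed by Cox regression. A minimal hand-rolled Kaplan-Meier sketch follows (the Cox step is omitted, and no tied event times are assumed); the follow-up data are invented, not the study cohort.

        def kaplan_meier(times, events):
            """Kaplan-Meier survival estimate. times: follow-up in months,
            events: 1 if the event was observed, 0 if censored."""
            order = sorted(range(len(times)), key=lambda i: times[i])
            at_risk, surv, curve = len(times), 1.0, []
            for i in order:
                if events[i]:
                    surv *= (at_risk - 1) / at_risk
                    curve.append((times[i], surv))
                at_risk -= 1
            return curve

        # Toy data split by SUVmax > 5.5 (values are illustrative only)
        high = kaplan_meier([6, 9, 12, 15, 20], [1, 1, 1, 0, 1])
        low = kaplan_meier([10, 16, 22, 30, 36], [1, 0, 1, 0, 0])
        print(high, low, sep="\n")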

  13. Dynamic Site Characterization and Correlation of Shear Wave Velocity with Standard Penetration Test 'N' Values for the City of Agartala, Tripura State, India

    Science.gov (United States)

    Sil, Arjun; Sitharam, T. G.

    2014-08-01

    Seismic site characterization is the basic requirement for seismic microzonation and site response studies of an area. Site characterization helps to gauge the average dynamic properties of soil deposits and thus helps to evaluate the surface level response. This paper presents a seismic site characterization of Agartala city, the capital of Tripura state, in the northeast of India. Seismically, Agartala city is situated in the Bengal Basin zone which is classified as a highly active seismic zone, assigned by Indian seismic code BIS-1893, Indian Standard Criteria for Earthquake Resistant Design of Structures, Part-1 General Provisions and Buildings. According to the Bureau of Indian Standards, New Delhi (2002), it is the highest seismic level (zone-V) in the country. The city is very close to the Sylhet fault (Bangladesh) where two major earthquakes (Mw > 7) have occurred in the past and affected severely this city and the whole of northeast India. In order to perform site response evaluation, a series of geophysical tests at 27 locations were conducted using the multichannel analysis of surface waves (MASW) technique, which is an advanced method for obtaining shear wave velocity (Vs) profiles from in situ measurements. Similarly, standard penetration test (SPT-N) bore log data sets have been obtained from the Urban Development Department, Govt. of Tripura. In the collected data sets, out of 50 bore logs, 27 were selected which are close to the MASW test locations and used for further study. Both the data sets (Vs profiles with depth and SPT-N bore log profiles) have been used to calculate the average shear wave velocity (Vs30) and average SPT-N values for the upper 30 m depth of the subsurface soil profiles. These were used for site classification of the study area recommended by the National Earthquake Hazard Reduction Program (NEHRP) manual. The average Vs30 and SPT-N classified the study area as seismic site class D and E categories, indicating that
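
    Vs30 is the time-averaged shear wave velocity of the upper 30 m, computed from layer thicknesses and velocities. A minimal sketch with a hypothetical three-layer profile (not an Agartala measurement):

        def vs30(thicknesses_m, velocities_m_s):
            """Time-averaged shear wave velocity of the upper 30 m:
            Vs30 = 30 / sum(d_i / Vs_i), with layer thicknesses summing to 30 m."""
            assert abs(sum(thicknesses_m) - 30.0) < 1e-6
            travel_time = sum(d / v for d, v in zip(thicknesses_m, velocities_m_s))
            return 30.0 / travel_time

        # Hypothetical soft-soil profile -> ~194 m/s, site class D near the D/E boundary
        print(vs30([5.0, 10.0, 15.0], [120.0, 180.0, 260.0]))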

  14. Exit site and tunnel infections in children on chronic peritoneal dialysis: findings from the Standardizing Care to Improve Outcomes in Pediatric End Stage Renal Disease (SCOPE) Collaborative.

    Science.gov (United States)

    Swartz, Sarah J; Neu, Alicia; Skversky Mason, Amy; Richardson, Troy; Rodean, Jonathan; Lawlor, John; Warady, Bradley; Somers, Michael J G

    2018-06-01

    The Standardizing Care to Improve Outcomes in Pediatric End Stage Renal Disease (SCOPE) Collaborative is a quality improvement initiative to reduce dialysis-associated infections. The frequency of peritoneal dialysis (PD) catheter exit site infection (ESI) and variables influencing its development and end result are unclear. We sought to determine ESI rates, to elucidate the epidemiology, risk factors, and outcomes for ESI, and to assess for association between provider compliance with care bundles and ESI risk. We reviewed demographic, dialysis and ESI data, and care bundle adherence and outcomes for SCOPE enrollees from October 2011 to September 2014. ESI involved only the exit site, only the subcutaneous catheter tunnel, or both. A total of 857 catheter insertions occurred in 734 children over 10,110 cumulative months of PD provided to these children. During this period 207 ESIs arose in 124 children or 0.25 ESIs per dialysis year. Median time to ESI was 392 days, with 69% of ESIs involving exit site only, 23% involving the tunnel only, and 8% involving both sites. Peritonitis developed in 6%. ESI incidence was associated with age (p = 0.003), being the lowest in children aged  0 at prior month's visit (p treatment, 24% required hospitalization, and 9% required catheter removal, generally secondary to tunnel infection. Exit site infections occur at an annualized rate of 0.25, typically well into the dialysis course. Younger patient age and documented review of site care are associated with lower ESI rates. Although most ESIs resolve, hospitalization is frequent, and tunnel involvement/catheter loss complicate outcomes.
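
    The headline rate can be reproduced directly from the cumulative exposure reported above; a one-line check in Python:

        # Worked check of the reported rate: 207 ESIs over 10,110 patient-months of PD
        esis, patient_months = 207, 10_110
        rate_per_dialysis_year = esis / (patient_months / 12.0)
        print(round(rate_per_dialysis_year, 2))  # 0.25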

  15. COMPARATIVE ANALYSIS OF RULES IN FIVE LEADING STANDARDS FOR SMOKE DETECTORS SITING IN THE PRESENCE OF A CEILING IRREGULARITY

    Directory of Open Access Journals (Sweden)

    Milan BLAGOJEVIĆ

    2017-12-01

    Subdividing elements and other structures on the ceiling, such as beams, significantly affect the location of smoke detectors because they change the flow of combustion products. From the point of view of fire detection system designers, it is of particular interest how to arrange and distribute smoke detectors when beams form a "honeycomb"-like structure. The European norm EN 54-14 is mandatory, but in practice a key question arises: are the explanations detailed enough for all of the situations that can occur related to the length, width, and depth of honeycomb cells? The main goal of this paper is to show the differences between the rules and instructions in five standards: EN 54-14, VDE 0833-2, BS 5839-1, NPB 88, and NFPA 72, and to find the best solution for various situations in practice.

  16. An analysis of the intent of environmental standards in the united states that apply to waste disposed at the Nevada test site

    International Nuclear Information System (INIS)

    Hechanova, A.E.; Mattingly, B.T.

    2000-01-01

    This paper addresses the disposal of transuranic waste at the Nevada Test Site (NTS), the intention of the environmental standards under which the disposal is completed, and some lingering controversy surrounding the U.S. nuclear weapons complex remediation effort. A goal of this paper besides the informational value is to provide points of discussion regarding this very costly and large-scale program in the U.S. and provide a platform for the exchange of ideas regarding remediation activities in other countries. (authors)

  17. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  19. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  20. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  2. MO-F-CAMPUS-T-04: Implementation of a Standardized Monthly Quality Check for Linac Output Management in a Large Multi-Site Clinic

    Energy Technology Data Exchange (ETDEWEB)

    Xu, H; Yi, B; Prado, K [Univ. of Maryland School Of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: To investigate the feasibility of a standardized monthly quality check (QC) of LINAC output determination in a multi-site, multi-LINAC institution. The QC was developed to determine individual LINAC output using the same optimized measurement setup and a constant calibration factor for all machines across the institution. Methods: QA data acquired over 4 years from 7 Varian machines at four sites were analyzed. The monthly output constancy checks were performed using a fixed source-to-chamber distance (SCD), with no couch position adjustment throughout the measurement cycle, for all the photon energies (6 and 18 MV) and electron energies (6, 9, 12, 16 and 20 MeV). The constant monthly output calibration factor (Nconst) was determined by averaging the machines’ output data, acquired with the same monthly ion chamber. If a different monthly ion chamber was used, Nconst was re-normalized to account for its different NDW,Co-60 calibration factor. The possible changes of Nconst over 4 years have been tracked, and the precision of output results based on this standardized monthly QA program relative to the TG-51 calibration for each machine was calculated. Any outlier of the group was investigated. Results: The possible changes of Nconst varied between 0–0.9% over 4 years. The normalization of absorbed-dose-to-water calibration factors corrects for up to 3.3% variation among different monthly QA chambers. The LINAC output precision based on this standardized monthly QC relative to the TG-51 output calibration is within 1% for the 6 MV photon energy and 2% for 18 MV and all the electron energies. A human error in one TG-51 report was found through close scrutiny of outlier data. Conclusion: This standardized QC allows for a reasonably simple, precise and robust monthly LINAC output constancy check, with the increased sensitivity needed to detect possible human errors and machine problems.
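
    The arithmetic behind the constant calibration factor can be illustrated with a minimal Python sketch. All readings and chamber factors below are hypothetical, and the direction of the re-normalization ratio is an assumption about the procedure described above, not a reproduction of the report's exact formulae.

        # Sketch of the standardized monthly output QC arithmetic (hypothetical values).
        def constant_calibration_factor(machine_readings):
            """Average the monthly readings of all machines, taken with the same chamber, to obtain Nconst."""
            return sum(machine_readings) / len(machine_readings)

        def renormalize(n_const, ndw_reference, ndw_new_chamber):
            """Rescale Nconst when a different monthly chamber (different NDW,Co-60) is used.
            The ratio direction is an assumption for illustration."""
            return n_const * ndw_new_chamber / ndw_reference

        def output_deviation(reading, n_const):
            """Percent deviation of one machine's monthly reading from the group constant."""
            return 100.0 * (reading - n_const) / n_const

        if __name__ == "__main__":
            readings = [1.002, 0.998, 1.001, 0.997, 1.003, 0.999, 1.000]  # hypothetical outputs
            n_const = constant_calibration_factor(readings)
            n_const = renormalize(n_const, ndw_reference=5.38e-2, ndw_new_chamber=5.36e-2)
            print([round(output_deviation(r, n_const), 2) for r in readings])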

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and the success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideinWMS components are now also installed at CERN, in addition to the GlideinWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  4. The value of standard radiology, angiography and computed tomography for the diagnosis of primary and secondary bone tumors

    International Nuclear Information System (INIS)

    Wagner, R.

    1986-01-01

    The diagnostic value of X-ray images, angiography and computed tomography (CT) was compared for 45 benign, semi-malignant and malignant bone tumors. Overall, computed tomography proved to be more accurate than angiography. CT is therefore recommended as the first non-invasive examination method after X-ray images have been made. (MBC) [de

  5. International standard problem (ISP) No. 41. Containment iodine computer code exercise based on a radioiodine test facility (RTF) experiment

    International Nuclear Information System (INIS)

    2000-04-01

    International Standard Problem (ISP) exercises are comparative exercises in which predictions of different computer codes for a given physical problem are compared with each other or with the results of a carefully controlled experimental study. The main goal of ISP exercises is to increase confidence in the validity and accuracy of the tools used in assessing the safety of nuclear installations. Moreover, they enable code users to gain experience and demonstrate their competence. The ISP No. 41 exercise, a computer code exercise based on a Radioiodine Test Facility (RTF) experiment on iodine behaviour in containment under severe accident conditions, is one such exercise. The ISP No. 41 exercise arose from a recommendation of the Fourth Iodine Chemistry Workshop held at PSI, Switzerland, in June 1996: 'the performance of an International Standard Problem as the basis of an in-depth comparison of the models as well as contributing to the database for validation of iodine codes' [Proceedings NEA/CSNI/R(96)6, Summary and Conclusions NEA/CSNI/R(96)7]. COG (CANDU Owners Group), comprising AECL and the Canadian nuclear utilities, offered to make the results of a Radioiodine Test Facility (RTF) test available for such an exercise. The ISP No. 41 exercise was endorsed in turn by the FPC (PWG4's Task Group on Fission Product Phenomena in the Primary Circuit and the Containment), PWG4 (CSNI Principal Working Group on the Confinement of Accidental Radioactive Releases), and the CSNI. The OECD/NEA Committee on the Safety of Nuclear Installations (CSNI) has sponsored forty-five ISP exercises over the last twenty-four years, thirteen of them in the area of severe accidents. The criteria for the selection of the RTF test as a basis for the ISP-41 exercise were: (1) complementarity to other RTF tests available through the PHEBUS and ACE programmes, (2) simplicity for ease of modelling, and (3) good quality data. A simple RTF experiment performed under controlled

  6. Decommissioning standards

    International Nuclear Information System (INIS)

    Crofford, W.N.

    1980-01-01

    EPA has agreed to establish a series of environmental standards for the safe disposal of radioactive waste through participation in the Interagency Review Group on Nuclear Waste Management (IRG). One of the standards required under the IRG is the standard for decommissioning of radioactive contaminated sites, facilities, and materials. This standard is to be proposed by December 1980 and promulgated by December 1981. Several considerations are important in establishing these standards. This study includes discussions of some of these considerations and attempts to evaluate their relative importance. Items covered include: the form of the standards, timing for decommissioning, occupational radiation protection, costs and financial provisions. 4 refs

  7. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 1, Part 2: Control modules S1--H1; Revision 5

    International Nuclear Information System (INIS)

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  8. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 2, Part 3: Functional modules F16--F17; Revision 5

    International Nuclear Information System (INIS)

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  9. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 2, Part 3: Functional modules F16--F17; Revision 5

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  10. Effectiveness of Adaptive Statistical Iterative Reconstruction for 64-Slice Dual-Energy Computed Tomography Pulmonary Angiography in Patients With a Reduced Iodine Load: Comparison With Standard Computed Tomography Pulmonary Angiography.

    Science.gov (United States)

    Lee, Ji Won; Lee, Geewon; Lee, Nam Kyung; Moon, Jin Il; Ju, Yun Hye; Suh, Young Ju; Jeong, Yeon Joo

    2016-01-01

    The aim of the study was to assess the effectiveness of adaptive statistical iterative reconstruction (ASIR) for dual-energy computed tomography pulmonary angiography (DE-CTPA) with a reduced iodine load. One hundred forty patients referred for chest CT were randomly divided into a DE-CTPA group with a reduced iodine load or a standard CTPA group. Quantitative and qualitative image qualities of virtual monochromatic spectral (VMS) images with filtered back projection (VMS-FBP) and those with 50% ASIR (VMS-ASIR) in the DE-CTPA group were compared. Image qualities of VMS-ASIR images in the DE-CTPA group and of ASIR images in the standard CTPA group were also compared. All quantitative and qualitative indices, except the attenuation value of the pulmonary artery, were superior in the VMS-ASIR subgroup to those in the VMS-FBP subgroup. Image quality indices of VMS-ASIR images were also superior to those of ASIR images in the standard CTPA group, with a significant difference between VMS-ASIR images of the DE-CTPA group and ASIR images of the standard CTPA group (P = 0.001). The ASIR technique tends to improve the image quality of VMS imaging. Dual-energy computed tomography pulmonary angiography with ASIR can reduce contrast medium volume and produce images of comparable quality with those of standard CTPA.

  11. Development of an accident consequence assessment code for evaluating site suitability of light- and heavy-water reactors based on the Korean Technical standards

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Won Tae; Jeong, Hae Sung; Jeong, Hyo Joon; Kil, A Reum; Kim, Eun Han; Han, Moon Hee [Nuclear Environment Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-12-15

    Methodologies for a series of radiological consequence assessments show distinctive differences according to the design principles of the original nuclear suppliers and the technical standards to be imposed. This is due to the uncertainties of the accidental source term, radionuclide behaviour in the environment, and the subsequent radiological dose. Both PWR and PHWR reactor types are operated in Korea; however, the technical standards for evaluating atmospheric dispersion have been enacted based on the U.S. NRC's positions regardless of reactor type. For this reason, controversy might arise between the licensor and the licensee of a nuclear power plant. The new code was modelled within the framework of NRC Regulatory Guide 1.145 for light-water reactors, while reflecting the features of heavy-water reactors as specified in the Canadian National Standard and the modelling features of MACCS2, such as the atmospheric diffusion coefficient, ground deposition, surface roughness, radioactive plume depletion, and exposure from ground deposition. This integrated accident consequence assessment code, ACCESS (Accident Consequence Assessment Code for Evaluating Site Suitability), was developed by taking into account the unique regulatory positions for each reactor type within the framework of the current Korean technical standards. Field tracer experiments and hand calculations have been carried out for validation and verification of the models. The modelling approaches of ACCESS and its features are introduced, and its applicative results for a hypothetical accident scenario are comprehensively discussed. In the applicative study, the results predicted by the light-water reactor assessment model were higher than those of the other models in terms of total doses.
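
    To illustrate the kind of quantity such consequence codes evaluate, the sketch below computes a generic ground-level Gaussian plume dilution factor (chi/Q) in Python. This is explicitly not the NRC RG 1.145 or MACCS2 formulation, and all parameter values are hypothetical.

        # Generic ground-level Gaussian plume relative concentration chi/Q [s/m^3]
        # for a continuous elevated release with ground reflection. Illustration only;
        # NOT the RG 1.145 or MACCS2 models referenced above.
        import math

        def chi_over_q(y_m, stack_height_m, wind_speed_ms, sigma_y, sigma_z):
            """sigma_y, sigma_z are the dispersion parameters evaluated at the
            downwind distance of interest (hypothetical values below)."""
            lateral = math.exp(-y_m ** 2 / (2.0 * sigma_y ** 2))
            vertical = math.exp(-stack_height_m ** 2 / (2.0 * sigma_z ** 2))
            return lateral * vertical / (math.pi * sigma_y * sigma_z * wind_speed_ms)

        # Hypothetical case: plume centre line, 30 m release height, 2 m/s wind,
        # sigmas roughly representative of neutral stability at ~1 km downwind.
        print(f"{chi_over_q(0.0, 30.0, 2.0, sigma_y=75.0, sigma_z=32.0):.2e} s/m^3")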

  12. Using standardized video cases for assessment of medical communication skills: reliability of an objective structured video examination by computer

    NARCIS (Netherlands)

    Hulsman, R. L.; Mollema, E. D.; Oort, F. J.; Hoos, A. M.; de Haes, J. C. J. M.

    2006-01-01

    OBJECTIVE: Using standardized video cases in a computerized objective structured video examination (OSVE) aims to measure cognitive scripts underlying overt communication behavior by questions on knowledge, understanding and performance. In this study the reliability of the OSVE assessment is

  13. Organization of a library of standard relocatable programmes for a measurement data processing module based on computers of the TRA type

    International Nuclear Information System (INIS)

    Dadi, K.; Dadi, L.; Mateeva, A.; Salamatin, I.M.

    1976-01-01

    The paper describes the organization of a library of standard programs in binary code. The library was developed for a measurement module based on a TRA-1001-i computer (Elektronika-100, PDP-8). The library is placed on an external memory (magnetic disk) and has a modular structure. The external memory assigned to the library is divided into pages. When loaded into the computer's internal memory, several pages are taken as a whole to represent the loading module. The magnetic disk storage capacity being 1.25 million words, the library occupies a total of ca. 50·10³ words (eight cylinders). The work provides rules for compiling standard programs in SLANG. The library is characterized by the following main features: it can be used in memory dynamic-distribution mode; it can be used on computers with an internal memory capacity of 4K; no intermediate-language coding of relocated programs is needed; and standard programs can be relocated autonomously. The library is compared with a library comprising DES programs

  14. Reference interaction site model with hydrophobicity induced density inhomogeneity: An analytical theory to compute solvation properties of large hydrophobic solutes in the mixture of polyatomic solvent molecules

    International Nuclear Information System (INIS)

    Cao, Siqin; Sheong, Fu Kit; Huang, Xuhui

    2015-01-01

    Reference interaction site model (RISM) has recently become a popular approach in the study of thermodynamical and structural properties of the solvent around macromolecules. On the other hand, it has been widely suggested that there exists water density depletion around large hydrophobic solutes (>1 nm), and this may pose a great challenge to the RISM theory. In this paper, we develop a new analytical theory, the Reference Interaction Site Model with Hydrophobicity induced density Inhomogeneity (RISM-HI), to compute the solvent radial distribution function (RDF) around a large hydrophobic solute in water as well as in its mixture with other polyatomic organic solvents. To achieve this, we have explicitly considered the density inhomogeneity at the solute-solvent interface using the framework of the Yvon-Born-Green hierarchy, and the RISM theory is used to obtain the solute-solvent pair correlation. In order to efficiently solve the relevant equations while maintaining reasonable accuracy, we have also developed a new closure called the D2 closure. With this new theory, the solvent RDFs around a large hydrophobic particle in water and different water-acetonitrile mixtures could be computed, and they agree well with the results of molecular dynamics simulations. Furthermore, we show that our RISM-HI theory can also efficiently compute the solvation free energy of solutes with a wide range of hydrophobicity in various water-acetonitrile solvent mixtures with reasonable accuracy. We anticipate that our theory could be widely applied to compute the thermodynamic and structural properties for the solvation of hydrophobic solutes.

  15. Free surface profiles in river flows: Can standard energy-based gradually-varied flow computations be pursued?

    Science.gov (United States)

    Cantero, Francisco; Castro-Orgaz, Oscar; Garcia-Marín, Amanda; Ayuso, José Luis; Dey, Subhasish

    2015-10-01

    Is the energy equation for gradually-varied flow the best approximation for free surface profile computations in river flows? Determination of flood inundation in rivers and natural waterways is based on the hydraulic computation of flow profiles. This is usually done using energy-based gradually-varied flow models, like HEC-RAS, which adopt a vertical division method for discharge prediction in compound channel sections. However, this discharge prediction method is not very accurate in light of the advancements of the last three decades. This paper first presents a study of the impact of discharge prediction on gradually-varied flow computations by comparing thirteen different methods for compound channels, where both the energy and the momentum equations are applied. The discharge, velocity distribution coefficients, specific energy, momentum and flow profiles are determined. After the study of gradually-varied flow predictions, a new theory is developed to produce higher-order energy and momentum equations for rapidly-varied flow in compound channels. These generalized equations make it possible to describe the flow profiles with more generality than gradually-varied flow computations. As an outcome, the results for gradually-varied flow provide realistic conclusions for computations of flow in compound channels, showing that momentum-based models are in general more accurate, whereas the new theory developed for rapidly-varied flow opens a new research direction, so far not investigated in flows through compound channels.
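
    As a concrete reminder of what an energy-based gradually-varied flow computation does, the Python sketch below integrates the textbook single-channel GVF equation dy/dx = (S0 - Sf)/(1 - Fr^2) for a rectangular channel. It is not the compound-channel methodology compared in the paper, and all channel parameters are hypothetical.

        # Minimal energy-based gradually-varied flow (GVF) profile for a rectangular
        # channel, marching upstream from a known downstream depth (subcritical flow).
        import math

        g = 9.81          # gravity [m/s^2]
        Q, b = 10.0, 5.0  # discharge [m^3/s], channel width [m] (hypothetical)
        n_man = 0.03      # Manning roughness
        S0 = 0.001        # bed slope

        def friction_slope(y):
            A, P = b * y, b + 2.0 * y            # area and wetted perimeter
            R = A / P                            # hydraulic radius
            V = Q / A
            return (n_man * V) ** 2 / R ** (4.0 / 3.0)   # Manning equation

        def froude(y):
            return Q / (b * y) / math.sqrt(g * y)

        def gvf_profile(y_start, dx=-10.0, steps=500):
            """Integrate dy/dx = (S0 - Sf) / (1 - Fr^2) upstream (negative dx)."""
            y, profile = y_start, [y_start]
            for _ in range(steps):
                dydx = (S0 - friction_slope(y)) / (1.0 - froude(y) ** 2)
                y += dydx * dx
                profile.append(y)
            return profile

        print(gvf_profile(2.0)[:5])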

  16. Advanced mathematical on-line analysis in nuclear experiments. Usage of parallel computing CUDA routines in standard root analysis

    Science.gov (United States)

    Grzeszczuk, A.; Kowalski, S.

    2015-04-01

    Compute Unified Device Architecture (CUDA) is a parallel computing platform developed by Nvidia to increase the speed of graphics processing through parallel calculation. The success of this solution has opened General-Purpose Graphics Processing Unit (GPGPU) technology to applications not coupled with graphics. GPGPU systems can be applied as an effective tool for reducing the huge volume of data in pulse-shape analysis measurements, either by on-line recalculation or by very fast compression. The simplified structure of the CUDA system and the programming model, based on the example of an Nvidia GeForce GTX 580 card, are presented in our poster contribution, both in a stand-alone version and as a ROOT application.

  17. The validity of cone-beam computed tomography in measuring root canal length using a gold standard

    NARCIS (Netherlands)

    Liang, Y.H.; Jiang, L.; Chen, C.; Gao, X.J.; Wesselink, P.R.; Wu, M.K.; Shemesh, H.

    2013-01-01

    Introduction The distance between a coronal reference point and the major apical foramen is important for working length determination. The aim of this in vitro study was to determine the accuracy of root canal length measurements performed with cone-beam computed tomographic (CBCT) scans using a

  18. Site-specific standard request for Underground Storage Tanks 1219-U, 1222-U, 2082-U, and 2068-U at the Rust Garage Facility Buildings 9754-1 and 9720-15

    International Nuclear Information System (INIS)

    1994-08-01

    This document is a site-specific standard request for underground storage tanks located at the Rust Garage Facility. These standards are justified based on conclusions derived from the exposure assessment, which indicates that there is no current or foreseeable future human health risk associated with petroleum contaminants on the site, that current and future ecological risks would generally be limited to subsurface species and plant life with roots extending into the area, and that most of the impacted area at the site is covered by asphalt or concrete. The vertical and horizontal extent of soil and groundwater contamination is limited to the immediate area of the Rust Garage Facility.

  19. Advanced mathematical on-line analysis in nuclear experiments. Usage of parallel computing CUDA routines in standard root analysis

    Directory of Open Access Journals (Sweden)

    Grzeszczuk A.

    2015-01-01

    Full Text Available Compute Unified Device Architecture (CUDA) is a parallel computing platform developed by Nvidia to increase the speed of graphics processing through parallel calculation. The success of this solution has opened General-Purpose Graphics Processing Unit (GPGPU) technology to applications not coupled with graphics. GPGPU systems can be applied as an effective tool for reducing the huge volume of data in pulse-shape analysis measurements, either by on-line recalculation or by very fast compression. The simplified structure of the CUDA system and the programming model, based on the example of an Nvidia GeForce GTX 580 card, are presented in our poster contribution, both in a stand-alone version and as a ROOT application.

  20. Exercising CMS dataflows and workflows in computing challenges at the SpanishTier-1 and Tier-2 sites

    Energy Technology Data Exchange (ETDEWEB)

    Caballero, J; Colino, N; Peris, A D; G-Abia, P; Hernandez, J M; R-Calonge, F J [CIEMAT, Madrid (Spain); Cabrillo, I; Caballero, I G; Marco, R; Matorras, F [IFCA, Santander (Spain); Flix, J; Merino, G [PIC, Barcelona (Spain)], E-mail: jose.hernandez@ciemat.es

    2008-07-15

    An overview of the data transfer, processing and analysis operations conducted at the Spanish Tier-1 (PIC, Barcelona) and Tier-2 (CIEMAT-Madrid and IFCA-Santander federation) centres during the past CMS CSA06 Computing, Software and Analysis challenge and in preparation for CSA07 is presented.

  1. Exercising CMS dataflows and workflows in computing challenges at the SpanishTier-1 and Tier-2 sites

    International Nuclear Information System (INIS)

    Caballero, J; Colino, N; Peris, A D; G-Abia, P; Hernandez, J M; R-Calonge, F J; Cabrillo, I; Caballero, I G; Marco, R; Matorras, F; Flix, J; Merino, G

    2008-01-01

    An overview of the data transfer, processing and analysis operations conducted at the Spanish Tier-1 (PIC, Barcelona) and Tier-2 (CIEMAT-Madrid and IFCA-Santander federation) centres during the past CMS CSA06 Computing, Software and Analysis challenge and in preparation for CSA07 is presented

  2. Deriving causes of child mortality by re–analyzing national verbal autopsy data applying a standardized computer algorithm in Uganda, Rwanda and Ghana

    Directory of Open Access Journals (Sweden)

    Li Liu

    2015-06-01

    Full Text Available Background To accelerate progress toward the Millennium Development Goal 4, reliable information on causes of child mortality is critical. With more national verbal autopsy (VA) studies becoming available, the consistency of nationally VA-derived causes of child death should be improved for the purpose of global comparison. We aimed to adapt a standardized computer algorithm to re–analyze national child VA studies conducted recently in Uganda, Rwanda and Ghana, and to compare our results with those derived from physician review, in order to explore issues surrounding the application of the standardized algorithm in place of physician review. Methods and Findings We adapted the standardized computer algorithm considering the disease profiles in Uganda, Rwanda and Ghana. We then derived cause–specific mortality fractions applying the adapted algorithm and compared the results with those ascertained by physician review by examining the individual– and population–level agreement. Our results showed that the leading causes of child mortality in Uganda, Rwanda and Ghana were pneumonia (16.5–21.1%) and malaria (16.8–25.6%) among children below five years, and intrapartum–related complications (6.4–10.7%) and preterm birth complications (4.5–6.3%) among neonates. The individual-level agreement was poor to substantial across causes (kappa statistics: –0.03 to 0.83), with moderate to substantial agreement observed for injury, congenital malformation, preterm birth complications, malaria and measles. At the population level, despite fairly different cause–specific mortality fractions, the ranking of the leading causes was largely similar. Conclusions The standardized computer algorithm produced an internally consistent distribution of causes of child mortality. The results were also qualitatively comparable to those based on physician review from the perspective of public health policy. The standardized computer algorithm has the advantage of
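
    The two summary statistics quoted above, cause-specific mortality fractions (population level) and the kappa statistic (individual-level agreement), can be sketched in a few lines of Python. The cause labels and assignments below are hypothetical and the actual VA coding algorithm is far more involved; only the comparison metrics are shown.

        # Cause-specific mortality fractions and Cohen's kappa between algorithm- and
        # physician-assigned causes (hypothetical labels for illustration).
        from collections import Counter

        def csmf(assigned_causes):
            """Cause-specific mortality fractions from a list of assigned causes."""
            counts = Counter(assigned_causes)
            total = len(assigned_causes)
            return {cause: n / total for cause, n in counts.items()}

        def cohens_kappa(causes_a, causes_b):
            """Chance-corrected agreement between two cause assignments per death."""
            n = len(causes_a)
            p_obs = sum(a == b for a, b in zip(causes_a, causes_b)) / n
            fa, fb = Counter(causes_a), Counter(causes_b)
            p_exp = sum(fa[c] / n * fb[c] / n for c in set(fa) | set(fb))
            return (p_obs - p_exp) / (1.0 - p_exp)

        algorithm = ["pneumonia", "malaria", "injury", "malaria", "pneumonia"]
        physician = ["pneumonia", "malaria", "malaria", "malaria", "preterm"]
        print(csmf(algorithm))
        print(round(cohens_kappa(algorithm, physician), 2))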

  3. Cloud Computing Security Model with Combination of Data Encryption Standard Algorithm (DES) and Least Significant Bit (LSB)

    Science.gov (United States)

    Basri, M.; Mawengkang, H.; Zamzami, E. M.

    2018-03-01

    Limited storage capacity is one reason to switch to cloud storage. The confidentiality and security of data stored in the cloud are very important. One way to maintain the confidentiality and security of such data is to use cryptographic techniques. Data Encryption Standard (DES) is one of the block cipher algorithms used as a standard symmetric encryption algorithm. DES produces 8 cipher blocks that are combined into one ciphertext, but this ciphertext is weak against brute-force attacks. Therefore, the 8 cipher blocks are hidden in 8 random images using the Least Significant Bit (LSB) algorithm, which embeds the DES cipher output so that it can later be extracted and merged back into one ciphertext.
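
    The LSB embedding step can be sketched generically in Python. The DES encryption itself is omitted here (any 8-byte cipher block could be substituted), the "image" is just a raw pixel byte buffer, and all values are hypothetical; this is an illustration of the bit-hiding technique, not the paper's exact scheme.

        # Generic least-significant-bit (LSB) embedding/extraction on a raw pixel buffer.
        def embed_lsb(pixels: bytearray, payload: bytes) -> bytearray:
            """Hide payload bits in the least significant bit of successive pixel bytes."""
            bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
            if len(bits) > len(pixels):
                raise ValueError("cover image too small for payload")
            stego = bytearray(pixels)
            for idx, bit in enumerate(bits):
                stego[idx] = (stego[idx] & 0xFE) | bit   # overwrite the LSB only
            return stego

        def extract_lsb(pixels: bytes, n_bytes: int) -> bytes:
            """Recover n_bytes previously embedded with embed_lsb."""
            out = bytearray()
            for i in range(n_bytes):
                byte = 0
                for bit_pos in range(8):
                    byte = (byte << 1) | (pixels[i * 8 + bit_pos] & 1)
                out.append(byte)
            return bytes(out)

        cover = bytearray(range(256)) * 4                      # hypothetical 1024-byte "image"
        cipher_block = b"\x3a\x91\x00\xf7\x42\x18\xaa\x05"     # stand-in for one DES block
        stego = embed_lsb(cover, cipher_block)
        assert extract_lsb(stego, len(cipher_block)) == cipher_block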

  4. New Computational Approaches for NMR-based Drug Design: A Protocol for Ligand Docking to Flexible Target Sites

    International Nuclear Information System (INIS)

    Gracia, Luis; Speidel, Joshua A.; Weinstein, Harel

    2006-01-01

    NMR-based drug design has met with some success in the last decade, as illustrated in numerous instances by Fesik's 'ligand screening by NMR' approach. Ongoing efforts to generalize this success have led us to the development of a new paradigm in which quantitative computational approaches are integrated with NMR-derived data and biological assays. The key component of this work is the inclusion of the intrinsic dynamic quality of NMR structures in theoretical models and its use in docking. A new computational protocol is introduced here, designed to dock small-molecule ligands to flexible proteins derived from NMR structures. The algorithm makes use of a combination of simulated annealing Monte Carlo (SA/MC) simulations and a mean-field potential informed by the NMR data. The new protocol is illustrated in the context of an ongoing project aimed at developing new selective inhibitors for the PCAF bromodomains that interact with HIV Tat
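
    The SA/MC machinery referred to above follows the generic simulated-annealing Monte Carlo pattern sketched below in Python: random pose perturbation plus Metropolis acceptance under a cooling schedule. The scoring function here is a toy stand-in, not the NMR-informed mean-field potential of the protocol, and all parameters are hypothetical.

        # Generic simulated-annealing Monte Carlo (SA/MC) skeleton with Metropolis acceptance.
        import math
        import random

        def toy_score(pose):
            """Hypothetical scoring function: lower is better (minimum at the origin)."""
            return sum(x * x for x in pose)

        def perturb(pose, step=0.3):
            return [x + random.uniform(-step, step) for x in pose]

        def anneal(pose, t_start=5.0, t_end=0.01, n_steps=5000):
            temp_decay = (t_end / t_start) ** (1.0 / n_steps)   # geometric cooling
            temp, current, best = t_start, list(pose), list(pose)
            for _ in range(n_steps):
                candidate = perturb(current)
                delta = toy_score(candidate) - toy_score(current)
                # Metropolis criterion: accept improvements always, uphill moves
                # with probability exp(-delta / T).
                if delta <= 0 or random.random() < math.exp(-delta / temp):
                    current = candidate
                    if toy_score(current) < toy_score(best):
                        best = list(current)
                temp *= temp_decay
            return best

        print([round(x, 2) for x in anneal([3.0, -2.0, 1.5])])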

  5. CERN readies world's biggest science grid The computing network now encompasses more than 100 sites in 31 countries

    CERN Multimedia

    Niccolai, James

    2005-01-01

    If the Large Hadron Collider (LHC) at CERN is to yield miraculous discoveries in particle physics, it may also require a small miracle in grid computing. For lack of suitable tools from commercial vendors, engineers at the famed Geneva laboratory are hard at work building a giant grid to store and process the vast amount of data the collider is expected to produce when it begins operations in mid-2007 (2 pages)

  6. Contribution to global computation infrastructure: inter-platform delegation, integration of standard services and application to high-energy physics

    International Nuclear Information System (INIS)

    Lodygensky, Oleg

    2006-01-01

    The generalization and implementation of modern information resources, particularly large storage capacities and networks, make it possible to conceive new methods of work and new forms of entertainment. Centralized, stand-alone, monolithic computing stations have been gradually replaced by distributed client-tailored architectures, which are in turn challenged by the new distributed systems known as 'peer-to-peer' systems. This migration is no longer confined to specialists; users with more modest skills are becoming accustomed to these new techniques for exchanging e-mail, commercial information and various sorts of files on a peer-to-peer basis. Trade, industry and research alike profit greatly from the new technique called the 'grid', a new way of handling information on a global scale. The present work concerns the use of the grid for computation. A synergy was created at Paris-Sud University in Orsay between the Information Research Laboratory (LRI) and the Linear Accelerator Laboratory (LAL) in order to foster work on a grid infrastructure of high research interest for LRI while offering new working methods to LAL. The results of the work developed within this interdisciplinary collaboration are based on XtremWeb, the research and production platform for global computation elaborated at LRI. The current status of large-scale distributed systems is first presented, together with their basic principles and user-oriented architecture. XtremWeb is then described, focusing on the modifications made to both its architecture and its implementation in order to optimally fulfil the requirements imposed on such a platform. Studies with the platform are then presented, allowing a generalization of inter-grid resources and the development of a user-oriented grid adapted to specialized services. Finally, the operating modes, the problems to be solved and the advantages of this new platform are presented for the high-energy physics research community, the most demanding

  7. Institutional Computing: Final Report Quantum Effects on Cosmology: Probing Physics Beyond the Standard Model with Big Bang Nucleosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Paris, Mark W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-13

    The current one-year project allocation (w17 burst) supports the continuation of research performed in the two-year Institutional Computing allocation (w14 bigbangnucleosynthesis). The project has supported development and production runs resulting in several publications [1, 2, 3, 4] in peer-reviewed journals and talks. Most significantly, we have recently achieved a significant improvement in code performance. This improvement was essential to the prospect of making further progress on this heretofore unsolved multiphysics problem, which lies at the intersection of nuclear and particle theory and the kinetic theory of energy transport in a system with internal (quantum) degrees of freedom.

  8. Evaluation of linear measurements of implant sites based on head orientation during acquisition: An ex vivo study using cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Sabban, Hanadi; Mahdian, Mina; Dhingra, Ajay; Lurie, Alan G.; Tadinada, Aditya [University of Connecticut School of Dental Medicine, Farmington (United States)

    2015-06-15

    This study evaluated the effect of various head orientations during cone-beam computed tomography (CBCT) image acquisition on linear measurements of potential implant sites. Six dry human skulls with a total of 28 implant sites were evaluated for seven different head orientations. The scans were acquired using a Hitachi CB-MercuRay CBCT machine. The scanned volumes were reconstructed. Horizontal and vertical measurements were made and were compared to measurements made after simulating the head position to corrected head angulations. Data was analyzed using a two-way ANOVA test. Statistical analysis revealed a significant interaction between the mean errors in vertical measurements with a marked difference observed at the extension head position (P<0.05). Statistical analysis failed to yield any significant interaction between the mean errors in horizontal measurements at various head positions. Head orientation could significantly affect the vertical measurements in CBCT scans. The main head position influencing the measurements is extension.

  9. Site-Mutation of Hydrophobic Core Residues Synchronically Poise Super Interleukin 2 for Signaling: Identifying Distant Structural Effects through Affordable Computations

    Directory of Open Access Journals (Sweden)

    Longcan Mei

    2018-03-01

    Full Text Available A superkine variant of interleukin-2 with six site mutations away from the binding interface, developed with the yeast display technique, has previously been characterized as undergoing a distal structural alteration that is responsible for its super-potency; it provides an elegant case study with which to gain insight into how allosteric effects can be utilized to achieve desirable protein functions. By examining the dynamic network and the allosteric pathways related to those mutated residues using various computational approaches, we found that nanosecond-timescale all-atom molecular dynamics simulations can identify the dynamic network as efficiently as an ensemble algorithm. The differentiated pathways for the six core residues form a dynamic network that outlines the area of structural alteration. The results suggest the potential of using affordable computing power to predict the allosteric structure of mutants in knowledge-based mutagenesis.

  10. Development of a computer code system for selecting off-site protective action in radiological accidents based on the multiobjective optimization method

    International Nuclear Information System (INIS)

    Ishigami, Tsutomu; Oyama, Kazuo

    1989-09-01

    This report presents a new method to support the selection of off-site protective actions in nuclear reactor accidents, and provides a user's manual for a computer code system, PRASMA, developed using the method. The PRASMA code system gives several candidate sets of protective action zones for evacuation, sheltering and no action based on the multiobjective optimization method, which requires objective functions and decision variables. We have assigned the population risks of fatality and injury and the cost as the objective functions, and the distances from a nuclear power plant characterizing the above three protective action zones as the decision variables. (author)
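
    The multiobjective selection step can be illustrated with a minimal Pareto (non-dominated) filter in Python: each candidate zoning carries a tuple of objectives to be minimised (fatality risk, injury risk, cost), and only non-dominated candidates are kept. The candidate values are hypothetical and PRASMA's actual risk models are not reproduced here.

        # Minimal Pareto filter over candidate protective-action zonings.
        def dominates(a, b):
            """True if a is at least as good as b in every objective and strictly
            better in at least one (all objectives minimised)."""
            return all(x <= y for x, y in zip(a["obj"], b["obj"])) and \
                   any(x < y for x, y in zip(a["obj"], b["obj"]))

        def pareto_front(candidates):
            return [c for c in candidates
                    if not any(dominates(other, c) for other in candidates if other is not c)]

        candidates = [  # (fatality risk, injury risk, relative cost) -- hypothetical
            {"zones": "evacuate<5km, shelter<10km", "obj": (0.002, 0.010, 9.0)},
            {"zones": "evacuate<3km, shelter<8km",  "obj": (0.004, 0.015, 5.0)},
            {"zones": "evacuate<8km, shelter<15km", "obj": (0.002, 0.011, 14.0)},  # dominated
            {"zones": "no action",                  "obj": (0.020, 0.060, 0.5)},
        ]
        for c in pareto_front(candidates):
            print(c["zones"], c["obj"])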

  11. Transcription factor HIF1A: downstream targets, associated pathways, polymorphic hypoxia response element (HRE) sites, and initiative for standardization of reporting in scientific literature.

    Science.gov (United States)

    Slemc, Lucija; Kunej, Tanja

    2016-11-01

    Hypoxia-inducible factor-1α (HIF-1α) has a crucial role in adapting cells to hypoxia through expression regulation of many genes. Identification of HIF-1α target genes (HIF-1α-TGs) is important for understanding the adaptation mechanism. The aim of the present study was to collect known HIF-1α-TGs and identify their associated pathways. Targets and associated genomics data were retrieved using PubMed, WoS ( http://apps.webofknowledge.com/ ), HGNC ( http://www.genenames.org/ ), NCBI ( http://www.ncbi.nlm.nih.gov/ ), Ensemblv.84 ( http://www.ensembl.org/index.html ), DAVID Bioinformatics Resources ( https://david.ncifcrf.gov /), and the Disease Ontology database ( http://disease-ontology.org/ ). From 51 papers, we collected 98 HIF-1α TGs found to be associated with 20 pathways, including metabolism of carbohydrates and pathways in cancer. Reanalysis of genomic coordinates of published HREs (hypoxia response elements) revealed six polymorphisms within HRE sites (HRE-SNPs): ABCG2, ACE, CA9, and CP. Due to the large heterogeneity of results presentation in the scientific literature, we also propose a first step towards reporting standardization of HIF-1α-target interactions consisting of ten relevant data types. The suggested minimal checklist for reporting will enable faster development of a complete catalog of HIF-1α-TGs, data sharing, bioinformatics analyses, and setting novel, more targeted hypotheses. The proposed format for data standardization is not yet complete but presents a baseline for further optimization of the protocol with additional details, for example, regarding the experimental validation.

  12. Communications standards

    CERN Document Server

    Stokes, A V

    1986-01-01

    Communications Standards deals with the standardization of computer communication networks. This book examines the types of local area networks (LANs) that have been developed and looks at some of the relevant protocols in more detail. The work of Project 802 is briefly discussed, along with a protocol which has developed from one of the LAN standards and is now a de facto standard in one particular area, namely the Manufacturing Automation Protocol (MAP). Factors that affect the usage of networks, such as network management and security, are also considered. This book is divided into three se

  13. Site of cochlear stimulation and its effect on electrically evoked compound action potentials using the MED-EL standard electrode array

    Directory of Open Access Journals (Sweden)

    Helbig Silke

    2009-12-01

    Full Text Available Abstract Background The standard electrode array for the MED-EL MAESTRO cochlear implant system is 31 mm in length, which allows an insertion angle of approximately 720°. When fully inserted, this long electrode array is capable of stimulating the most apical region of the cochlea. No investigation has explored Electrically Evoked Compound Action Potential (ECAP) recordings in this region with a large number of subjects using a commercially available cochlear implant system. The aim of this study is to determine whether certain properties of ECAP recordings vary depending on the stimulation site in the cochlea. Methods Recordings of auditory nerve responses were conducted in 67 subjects to demonstrate the feasibility of ECAP recordings using the Auditory Nerve Response Telemetry (ART™) feature of the MED-EL MAESTRO system software. These recordings were then analyzed based on the site of cochlear stimulation, defined as basal, middle and apical, to determine whether the amplitude, the threshold and slope of the amplitude growth function, and the refractory time differ depending on the region of stimulation. Results Findings show significant differences in the ECAP recordings depending on the stimulation site. Comparing the apical with the basal region, on average higher amplitudes, lower thresholds and steeper slopes of the amplitude growth function have been observed. The refractory time shows an overall dependence on cochlear region; however, post-hoc tests showed no significant effect between individual regions. Conclusions Obtaining ECAP recordings is also possible in the most apical region of the cochlea. However, differences can be observed depending on the region of the cochlea stimulated. Specifically, significantly higher ECAP amplitudes, lower thresholds and steeper amplitude growth function slopes have been observed in the apical region. These differences could be explained by the location of the stimulating electrode with respect to the neural tissue

  14. Determining the spill flow discharge of combined sewer overflows using rating curves based on computational fluid dynamics instead of the standard weir equation.

    Science.gov (United States)

    Fach, S; Sitzenfrei, R; Rauch, W

    2009-01-01

    It is state of the art to evaluate and optimise sewer systems with urban drainage models. Since spill flow data are essential in the calibration process of conceptual models, it is important to enhance the quality of such data. A widespread approach is to calculate the spill flow volume by using standard weir equations together with measured water levels. However, these equations are only applicable to combined sewer overflow (CSO) structures whose weir constructions correspond to the standard weir layout. The objective of this work is to outline an alternative approach to obtaining spill flow discharge data based on measurements with a sonic depth finder. The idea is to determine the relation between water level and rate of spill flow by running a detailed 3D computational fluid dynamics (CFD) model. Two real-world CSO structures were chosen because of their complex structure, especially with respect to the weir construction. In a first step, the simulation results were analysed to identify flow conditions for discrete steady states. It will be shown that the flow conditions in the CSO structure change once the spill flow pipe acts as a controlled outflow, and therefore the spill flow discharge cannot be described with a standard weir equation. In a second step, the CFD results are used to derive rating curves that can easily be applied in everyday practice. The rating curves are therefore developed on the basis of the standard weir equation and the equation for orifice-type outlets. Because the intersection of both equations is not known, the coefficients of discharge are regressed from the CFD simulation results. Furthermore, the regression of the CFD simulation results is compared with that of the standard weir equation by using historic water levels and hydrographs generated with a hydrodynamic model. The uncertainties resulting from the widespread use of the standard weir equation are demonstrated.
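
    The general idea of replacing a fixed-coefficient weir formula with a rating curve regressed from CFD results can be sketched in Python. The rectangular sharp-crested weir equation Q = Cd·(2/3)·sqrt(2g)·b·h^1.5 is the standard form; the (h, Q) pairs below are synthetic stand-ins for CFD output, and the power-law fit is one simple choice of rating-curve shape, not the study's exact procedure.

        # Fit a rating curve Q = a * h**b to (water level, spill discharge) pairs
        # obtained from CFD, instead of assuming the ideal weir formula.
        import math
        import numpy as np

        def weir_discharge(h, cd=0.62, b_w=2.0, g=9.81):
            """Standard sharp-crested rectangular weir equation (free overflow)."""
            return cd * (2.0 / 3.0) * math.sqrt(2.0 * g) * b_w * h ** 1.5

        # Synthetic "CFD" points that fall below the ideal weir once the spill pipe
        # starts to throttle the outflow at higher heads.
        h_cfd = np.array([0.05, 0.10, 0.15, 0.20, 0.30, 0.40])
        q_cfd = np.array([0.020, 0.058, 0.100, 0.150, 0.230, 0.300])

        # Log-log least squares gives the rating-curve coefficients a and b.
        b_exp, log_a = np.polyfit(np.log(h_cfd), np.log(q_cfd), 1)
        a_coef = math.exp(log_a)
        print(f"rating curve: Q = {a_coef:.3f} * h^{b_exp:.2f}")
        print("ideal weir at h=0.20 m:", round(weir_discharge(0.20), 3), "m^3/s")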

  15. Reviews of computing technology: Software overview

    Energy Technology Data Exchange (ETDEWEB)

    Hartshorn, W.R.; Johnson, A.L.

    1994-01-05

    The Savannah River Site Computing Architecture states that the site computing environment will be standards-based, data-driven, and workstation-oriented. Larger server systems deliver needed information to users in a client-server relationship. Goals of the Architecture include utilizing computing resources effectively, maintaining a high level of data integrity, developing a robust infrastructure, and storing data in such a way as to promote accessibility and usability. This document describes the current storage environment at Savannah River Site (SRS) and presents some of the problems that will be faced and strategies that are planned over the next few years.

  16. Ion binding by humic and fulvic acids: A computational procedure based on functional site heterogeneity and the physical chemistry of polyelectrolyte solutions

    International Nuclear Information System (INIS)

    Marinsky, J.A.; Reddy, M.M.; Ephraim, J.; Mathuthu, A.

    1988-04-01

    Ion binding equilibria for humic and fulvic acids are examined from the point of view of functional site heterogeneity and the physical chemistry of polyelectrolyte solutions. A detailed explanation of the potentiometric properties of synthetic polyelectrolytes and ion-exchange gels is presented first to provide the basis for a parallel consideration of the potentiometric properties exhibited by humic and fulvic acids. The treatment is then extended to account for functional site heterogeneity. Sample results are presented for analysis of the ion-binding reactions of a standard soil fulvic acid (Armadale Horizons Bh) with this approach to test its capability for anticipation of metal ion removal from solution. The ultimate refined model is shown to be adaptable, after appropriate consideration of the heterogeneity and polyelectrolyte factors, to programming already available for the consideration of ion binding by inorganics in natural waters. (orig.)

  17. The reliability of cone-beam computed tomography to assess bone density at dental implant recipient sites: a histomorphometric analysis by micro-CT.

    Science.gov (United States)

    González-García, Raúl; Monje, Florencio

    2013-08-01

    The aim of this study was to objectively assess the reliability of cone-beam computed tomography (CBCT) as a tool to pre-operatively determine radiographic bone density (RBD) from the density values provided by the system, analyzing its relationship with histomorphometric bone density, expressed as the bone volumetric fraction (BV/TV) assessed by micro-CT of bone biopsies at the sites of insertion of dental implants in the maxillary bones. Thirty-nine bone biopsies of the maxillary bones at the sites of 39 dental implants from 31 edentulous healthy patients were analyzed. The NobelGuide™ software was used for implant planning, which also allowed fabrication of individual stereolithographic surgical guides. The analysis of CBCT images allowed pre-operative determination of mean density values of the implant recipient sites along the major axis of the planned implants (axial RBD). Stereolithographic surgical guides were used to guide implant insertion and also to extract cylindrical bone biopsies from the core of the exact implant site. Further analysis of several osseous micro-structural variables, including BV/TV, was performed by micro-CT of the extracted bone biopsies. Mean axial RBD was 478 ± 212 (range: 144-953). A statistically significant difference (P = 0.02) was observed between the density values of the cortical bone of the upper maxilla and of the mandible. A high positive Pearson's correlation coefficient (r = 0.858) was found between axial RBD and BV/TV assessed by micro-CT at the sites of dental implants in the maxillary bones. Pre-operative estimation of density values by CBCT is a reliable tool to objectively determine bone density. © 2012 John Wiley & Sons A/S.

  18. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    International Nuclear Information System (INIS)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.; Bucholz, J.A.; Hermann, O.W.; Fraley, S.K.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries

  19. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F9--F16 -- Volume 2, Part 2, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    West, J.T.; Hoffman, T.J.; Emmett, M.B.; Childs, K.W.; Petrie, L.M.; Landers, N.F.; Bryan, C.B.; Giles, G.E. [Oak Ridge National Lab., TN (United States)

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries. This volume discusses the following functional modules: MORSE-SGC; HEATING 7.2; KENO V.a; JUNEBUG-II; HEATPLOT-S; REGPLOT 6; PLORIGEN; and OCULAR.

  20. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.; Bucholz, J.A.; Hermann, O.W.; Fraley, S.K. [Oak Ridge National Lab., TN (United States)

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.

  1. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F9--F16 -- Volume 2, Part 2, Revision 4

    International Nuclear Information System (INIS)

    West, J.T.; Hoffman, T.J.; Emmett, M.B.; Childs, K.W.; Petrie, L.M.; Landers, N.F.; Bryan, C.B.; Giles, G.E.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries. This volume discusses the following functional modules: MORSE-SGC; HEATING 7.2; KENO V.a; JUNEBUG-II; HEATPLOT-S; REGPLOT 6; PLORIGEN; and OCULAR

  2. Comparison of standard reading and computer aided detection (CAD) on a national proficiency test of screening mammography

    International Nuclear Information System (INIS)

    Ciatto, Stefano; Del Turco, Marco Rosselli; Risso, Gabriella; Catarzi, Sandra; Bonardi, Rita; Viterbo, Valeria; Gnutti, Pierangela; Guglielmoni, Barbara; Pinelli, Lelio; Pandiscia, Anna; Navarra, Francesco; Lauria, Adele; Palmiero, Rosa; Indovina, Pietro Luigi

    2003-01-01

    Objective: To evaluate the role of computer-aided detection (CAD) in improving the interpretation of screening mammograms. Material and methods: Ten radiologists underwent a proficiency test of screening mammography, first by conventional reading and then with the help of CAD. Radiologists were blinded to test results for the whole study duration. Results of conventional and CAD reading were compared in terms of sensitivity and recall rate. Double reading was simulated by combining the conventional readings of four expert radiologists and compared with CAD reading. Results: Considering all ten readings, cancer was identified in 146 or 153 of 170 cases (85.8 vs. 90.0%; χ² = 0.99, df = 1, P = 0.31) and recalls were 106 or 152 of 1330 cases (7.9 vs. 11.4%; χ² = 8.69, df = 1, P = 0.003) at conventional or CAD reading, respectively. CAD reading was essentially the same as simulated conventional double reading (sensitivity 97.0 vs. 96.0%; χ² = 7.1, df = 1, P = 0.93; recall rate 10.7 vs. 10.6%; χ² = 1.5, df = 1, P = 0.96). Conclusion: CAD reading seems to improve the sensitivity of conventional reading while reducing specificity, both effects being of limited size. CAD reading had almost the same performance as simulated conventional double reading, suggesting a possible use of CAD which needs to be confirmed by further studies inclusive of cost-effectiveness analysis
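
    The first two χ² figures quoted above can be reproduced from the published counts with a standard 2x2 test, assuming they were computed with Yates' continuity correction (scipy applies it by default to 2x2 tables). The sketch below is only an illustration of that test, not of the study's full methodology.

        # Reproducing the 2x2 chi-square comparisons quoted in the abstract.
        from scipy.stats import chi2_contingency

        # Cancers detected vs missed out of 170: conventional reading vs CAD reading.
        detection = [[146, 170 - 146],
                     [153, 170 - 153]]
        chi2, p, dof, _ = chi2_contingency(detection)
        print(f"detection: chi2={chi2:.2f}, df={dof}, P={p:.2f}")   # ~0.99, df=1, P~0.31

        # Recalled vs not recalled out of 1330 cases.
        recalls = [[106, 1330 - 106],
                   [152, 1330 - 152]]
        chi2, p, dof, _ = chi2_contingency(recalls)
        print(f"recalls:   chi2={chi2:.2f}, df={dof}, P={p:.3f}")   # ~8.69, df=1, P~0.003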

  3. Sensitivity of Non-Contrast Computed Tomography for Small Renal Calculi with Endoscopy as the Gold Standard.

    Science.gov (United States)

    Bhojani, Naeem; Paonessa, Jessica E; El Tayeb, Marawan M; Williams, James C; Hameed, Tariq A; Lingeman, James E

    2018-04-03

    To compare the sensitivity of non-contrast CT to endoscopy for detection of renal calculi. Imaging modalities for detection of nephrolithiasis have centered on abdominal x-ray (KUB), ultrasound (US), and non-contrast computed tomography (CT). Sensitivities of 58-62% (KUB), 45% (US), and 95-100% (CT) have been previously reported. However, these results have never been correlated with endoscopic findings. Idiopathic calcium oxalate stone formers with symptomatic calculi requiring ureteroscopy (URS) were studied. At the time of surgery, the number and location of all calculi within the kidney were recorded followed by basket retrieval. Each calculus was measured and sent for micro CT and infrared spectrophotometry. All CT scans were reviewed by the same genitourinary radiologist who was blinded to the endoscopic findings. The radiologist reported on the number, location, and size of each calculus. 18 renal units were studied in 11 patients. Average time from CT scan to URS was 28.6 days. The mean number of calculi identified per kidney was 9.2±6.1 for endoscopy and 5.9±4.1 for CT (p<0.004). The mean size of total renal calculi (sum of longest stone diameters) per kidney was 22.4±17.1 mm and 18.2±13.2 mm for endoscopy and CT, respectively (p=0.06). CT scan underreports the number of renal calculi, probably missing some small stones and unable to distinguish those lying in close proximity to one another. However, the total stone burden seen by CT is, on average, accurate when compared to that found on endoscopic examination. Copyright © 2018. Published by Elsevier Inc.

  4. Low-Dose and Standard-Dose Unenhanced Helical Computed Tomography for the Assessment of Acute Renal Colic: Prospective Comparative Study

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bong Soo; Hwang, Im Kyung; Choi, Yo Won; Namkung, Sook; Kim, Heung Cheol; Hwang, Woo Cheol; Choi, Kuk Myung; Park, Ji Kang; Han, Tae Il; Kang, Weechang [Cheju National Univ. College of Medicine, Jeju (Korea, Republic of). Dept. of Diagnostic Radiology

    2005-11-01

    Purpose: To compare the efficacy of low-dose and standard-dose computed tomography (CT) for the diagnosis of ureteral stones. Material and Methods: Unenhanced helical CT was performed with both a standard dose (260 mAs, pitch 1.5) and a low dose (50 mAs, pitch 1.5) in 121 patients suspected of having acute renal colic. The two studies were prospectively and independently interpreted for the presence and location of ureteral stones, abnormalities unrelated to stone disease, identification of secondary signs, i.e. hydronephrosis and perinephric stranding, and tissue rim sign. The standard-dose CT images were interpreted by one reviewer and the low-dose CT images independently by two reviewers unaware of the standard-dose CT findings. The findings of the standard- and low-dose CT scans were compared with the exact McNemar test. Interobserver agreement was assessed with kappa analysis. The effective radiation doses resulting from the two protocols were calculated by means of commercially available software based on a Monte Carlo phantom model. Results: The sensitivity, specificity, and accuracy of standard-dose CT for detecting ureteral stones were 99%, 93%, and 98%, respectively, whereas for the two reviewers the sensitivity of low-dose CT was 93% and 95%, specificity 86%, and accuracy 92% and 94%. We found no significant differences between standard-dose and low-dose CT in the sensitivity and specificity for diagnosing ureteral stones (P>0.05 for both). However, the sensitivity of low-dose CT for detection of 19 stones less than or equal to 2 mm in diameter was 79% and 68%, respectively, for the two reviewers. Low-dose CT was comparable to standard-dose CT in visualizing hydronephrosis and the tissue rim sign. Perinephric stranding was far less clear on low-dose CT. Low-dose CT had the same diagnostic performance as standard-dose CT in diagnosing alternative diseases. Interobserver agreement between the two low-dose CT reviewers in the diagnosis of

  5. User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

    CERN Document Server

    Wiley, R A

    1977-01-01

    User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

  6. A comparison between standard well test evaluation methods used in SKB's site investigations and the generalised radial flow concept

    Energy Technology Data Exchange (ETDEWEB)

    Follin, Sven (SF GeoLogic AB (Sweden)); Ludvigson, Jan-Erik; Leven, Jakob (Geosigma AB (Sweden))

    2011-09-15

    According to the strategy for hydrogeological characterisation within SKB's site investigation programme, two single-hole test methods are available for testing and parameterisation of groundwater flow models - constant-head injection testing with the Pipe String System (PSS method) and difference flow logging with the Posiva Flow Log (PFL method). This report presents the results of an investigation to assess discrepancies in the results of single-hole transmissivity measurements using these methods in the Forsmark site characterisation. The investigation explores the possibility that the source of the discrepancy observed lies in the assumptions of the flow geometry that are inherent to the methods used for standard constant-head injection well test analysis and difference flow logging analysis, respectively. In particular, the report looks at the generalised radial flow (GRF) concept by Barker (1988) as a means that might explain some of the differences. A confirmation of the actual flow geometries (dimensions) observed during hydraulic injection tests could help to identify admissible conceptual models for the tested system, and place the hydraulic testing with the PSS and PFL test methods in its full hydrogeological context. The investigation analyses 151 constant-head injection tests in three cored boreholes at Forsmark. The results suggest that the transmissivities derived with standard constant-head injection well test analysis methods and with the GRF concept, respectively, are similar provided that the dominating flow geometry during the testing is radial (cylindrical). Thus, flow geometries with dimensions other than 2 affect the value of the interpreted transmissivity. For example, a flow system with a dimension of 1 may require a transmissivity an order of magnitude or more higher to produce the same flow rates. The median of the GRF flow dimensions of all 151 constant-head injection tests is 2.06 with 33% of the tests in the range 1

  7. Computational study of hydration at the TD damaged site of DNA in complex with repair enzyme T4 endonuclease V

    International Nuclear Information System (INIS)

    Pinak, Miroslav

    2000-02-01

    An analysis of the distribution of water around the DNA surface, focusing on the role of water molecules in the proper recognition of the damaged site by the repair enzyme T4 Endonuclease V, was performed. The native DNA dodecamer, the dodecamer with the thymine dimer (TD), and the complex of DNA and part of the repair enzyme T4 Endonuclease V were examined throughout 500 ps of molecular dynamics simulation. During the simulation, the number of water molecules close to the DNA atoms and their residence times were calculated. There is an increase in the number of water molecules lying in close vicinity to TD compared with those lying close to two native thymines (TT). The densely populated area of water molecules around TD is one of the factors detected by the enzyme during the scanning process. The residence time was found to be higher for molecules of the complex, and six water molecules were found occupying stable positions between the TD and the catalytic center, close to atoms P, C3' and N3. These molecules form a water-mediated hydrogen bond network that contributes to the stability of the complex required for the onset of the repair process. (author)

  8. Computational study of hydration at the TD damaged site of DNA in complex with repair enzyme T4 endonuclease V

    Energy Technology Data Exchange (ETDEWEB)

    Pinak, Miroslav [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2000-02-01

    An analysis of the distribution of water around the DNA surface, focusing on the role of water molecules in the proper recognition of the damaged site by the repair enzyme T4 Endonuclease V, was performed. The native DNA dodecamer, the dodecamer with the thymine dimer (TD), and the complex of DNA and part of the repair enzyme T4 Endonuclease V were examined throughout 500 ps of molecular dynamics simulation. During the simulation, the number of water molecules close to the DNA atoms and their residence times were calculated. There is an increase in the number of water molecules lying in close vicinity to TD compared with those lying close to two native thymines (TT). The densely populated area of water molecules around TD is one of the factors detected by the enzyme during the scanning process. The residence time was found to be higher for molecules of the complex, and six water molecules were found occupying stable positions between the TD and the catalytic center, close to atoms P, C3' and N3. These molecules form a water-mediated hydrogen bond network that contributes to the stability of the complex required for the onset of the repair process. (author)

  9. Arthroscopic Latarjet Techniques: Graft and Fixation Positioning Assessed With 2-Dimensional Computed Tomography Is Not Equivalent With Standard Open Technique.

    Science.gov (United States)

    Neyton, Lionel; Barth, Johannes; Nourissat, Geoffroy; Métais, Pierre; Boileau, Pascal; Walch, Gilles; Lafosse, Laurent

    2018-05-19

    To analyze graft and fixation (screw and EndoButton) positioning after the arthroscopic Latarjet technique with 2-dimensional computed tomography (CT) and to compare it with the open technique. We performed a retrospective multicenter study (March 2013 to June 2014). The inclusion criteria included patients with recurrent anterior instability treated with the Latarjet procedure. The exclusion criterion was the absence of a postoperative CT scan. The positions of the hardware, the positions of the grafts in the axial and sagittal planes, and the dispersion of values (variability) were compared. The study included 208 patients (79 treated with open technique, 87 treated with arthroscopic Latarjet technique with screw fixation [arthro-screw], and 42 treated with arthroscopic Latarjet technique with EndoButton fixation [arthro-EndoButton]). The angulation of the screws was different in the open group versus the arthro-screw group (superior, 10.3° ± 0.7° vs 16.9° ± 1.0° [P open inferior screws (P = .003). In the axial plane (level of equator), the arthroscopic techniques resulted in lateral positions (arthro-screw, 1.5 ± 0.3 mm lateral [P open technique (0.9 ± 0.2 mm medial). At the level of 25% of the glenoid height, the arthroscopic techniques resulted in lateral positions (arthro-screw, 0.3 ± 0.3 mm lateral [P open technique (1.0 ± 0.2 mm medial). Higher variability was observed in the arthro-screw group. In the sagittal plane, the arthro-screw technique resulted in higher positions (55% ± 3% of graft below equator) and the arthro-EndoButton technique resulted in lower positions (82% ± 3%, P open technique (71% ± 2%). Variability was not different. This study shows that the position of the fixation devices and position of the bone graft with the arthroscopic techniques are statistically significantly different from those with the open technique with 2-dimensional CT assessment. In the sagittal plane, the arthro-screw technique provides the highest

  10. - LAA Occluder View for post-implantation Evaluation (LOVE) - standardized imaging proposal evaluating implanted left atrial appendage occlusion devices by cardiac computed tomography

    International Nuclear Information System (INIS)

    Behnes, Michael; Akin, Ibrahim; Sartorius, Benjamin; Fastner, Christian; El-Battrawy, Ibrahim; Borggrefe, Martin; Haubenreisser, Holger; Meyer, Mathias; Schoenberg, Stefan O.; Henzler, Thomas

    2016-01-01

    A standardized imaging proposal evaluating implanted left atrial appendage (LAA) occlusion devices by cardiac computed tomography angiography (cCTA) has never been investigated. cCTA datasets were acquired on a 3rd-generation dual-source CT system and reconstructed with a slice thickness of 0.5 mm. An interdisciplinary evaluation was performed by two interventional cardiologists and one radiologist on a 3D multi-planar workstation. A standardized multi-planar reconstruction algorithm was developed in order to assess relevant clinical aspects of implanted LAA occlusion devices being outlined within a pictorial essay. The following clinical aspects of implanted LAA occlusion devices were evaluated within the most appropriate cCTA multi-planar reconstruction: (1) topography to neighboring structures, (2) peri-device leaks, (3) coverage of LAA lobes, (4) indirect signs of neo-endothelialization. These are illustrated within concise CT imaging examples emphasizing the potential value of the proposed cCTA imaging algorithm: Starting from anatomical cCTA planes and stepwise angulation planes perpendicular to the base of the LAA devices generates an optimal LAA Occluder View for post-implantation Evaluation (LOVE). Aligned true axial, sagittal and coronal LOVE planes offer a standardized and detailed evaluation of LAA occlusion devices after percutaneous implantation. This pictorial essay presents a standardized imaging proposal by cCTA using multi-planar reconstructions that enables systematic follow-up and comparison of patients after LAA occlusion device implantation. The online version of this article (doi:10.1186/s12880-016-0127-y) contains supplementary material, which is available to authorized users

  11. Algorithms for Computation of Fundamental Properties of Seawater. Endorsed by Unesco/SCOR/ICES/IAPSO Joint Panel on Oceanographic Tables and Standards and SCOR Working Group 51. Unesco Technical Papers in Marine Science, No. 44.

    Science.gov (United States)

    Fofonoff, N. P.; Millard, R. C., Jr.

    Algorithms for computation of fundamental properties of seawater, based on the practical salinity scale (PSS-78) and the international equation of state for seawater (EOS-80), are compiled in the present report for implementing and standardizing computer programs for oceanographic data processing. Sample FORTRAN subprograms and tables are given…

  12. Computed Tomography (CT) -- Sinuses

    Medline Plus


  13. Data Base Directions; the Next Steps. Proceedings of the Workshop of the National Bureau of Standards and the Association for Computing Machinery (Fort Lauderdale, Florida, October 29-31, 1975).

    Science.gov (United States)

    Berg, John L., Ed.

    To investigate the information needs of managers making decisions regarding the use of data base technology, the National Bureau of Standards and the Association for Computing Machinery held a workshop with approximately 80 experts in five major areas: auditing, evolving technology, government regulation, standards, and user experience. Results of…

  14. Predicting the Metabolic Sites by Flavin-Containing Monooxygenase on Drug Molecules Using SVM Classification on Computed Quantum Mechanics and Circular Fingerprints Molecular Descriptors.

    Directory of Open Access Journals (Sweden)

    Chien-Wei Fu

    Full Text Available As an important enzyme in Phase I drug metabolism, the flavin-containing monooxygenase (FMO) also metabolizes some xenobiotics with soft nucleophiles. The site of metabolism (SOM) on a molecule is the site where the metabolic reaction is exerted by an enzyme. Accurate prediction of SOMs on drug molecules will assist the search for drug leads during the optimization process. Here, some quantum mechanics features such as the condensed Fukui function and attributes from circular fingerprints (called Molprint2D) are computed and classified using the support vector machine (SVM) for predicting some potential SOMs on a series of drugs that can be metabolized by FMO enzymes. The condensed Fukui function fA− represents the nucleophilicity of the central atom A, and the attributes from circular fingerprints account for the influence of neighbors on the central atom. The total number of FMO substrates and non-substrates collected in the study is 85, and they are equally divided into the training and test sets, with each carrying roughly the same number of potential SOMs. However, only N-oxidation and S-oxidation features were considered in the prediction since the available C-oxidation data were scarce. In the training process, the LibSVM implementation in the WEKA package and the option of 10-fold cross-validation are employed. The prediction performance on the test set, evaluated by accuracy, Matthews correlation coefficient, and area under the ROC curve, is 0.829, 0.659, and 0.877, respectively. This work reveals that the SVM model built can accurately predict the potential SOMs for drug molecules that are metabolizable by the FMO enzymes.
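
    As a rough illustration of the classification protocol described above (SVM, 10-fold cross-validation, accuracy/MCC/AUC), the sketch below uses scikit-learn in place of WEKA/LibSVM and random placeholder descriptors instead of the Fukui-function and Molprint2D features; all names and dimensions are assumptions.

```python
# Sketch: SVM classification of candidate sites of metabolism with 10-fold
# cross-validation, reporting accuracy, Matthews correlation coefficient, and ROC AUC.
# scikit-learn replaces WEKA/LibSVM, and the features are random placeholders (assumptions).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import accuracy_score, matthews_corrcoef, roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(170, 32))       # placeholder descriptors per candidate atom
y = rng.integers(0, 2, size=170)     # 1 = observed site of metabolism, 0 = not

clf = SVC(kernel="rbf", C=1.0, gamma="scale", probability=True)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

labels = cross_val_predict(clf, X, y, cv=cv)                              # hard predictions
scores = cross_val_predict(clf, X, y, cv=cv, method="predict_proba")[:, 1]  # class-1 probabilities

print("accuracy:", accuracy_score(y, labels))
print("MCC:     ", matthews_corrcoef(y, labels))
print("ROC AUC: ", roc_auc_score(y, scores))
```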

  15. SedCT: MATLAB™ tools for standardized and quantitative processing of sediment core computed tomography (CT) data collected using a medical CT scanner

    Science.gov (United States)

    Reilly, B. T.; Stoner, J. S.; Wiest, J.

    2017-08-01

    Computed tomography (CT) of sediment cores allows for high-resolution images, three-dimensional volumes, and down core profiles. These quantitative data are generated through the attenuation of X-rays, which are sensitive to sediment density and atomic number, and are stored in pixels as relative gray scale values or Hounsfield units (HU). We present a suite of MATLAB™ tools specifically designed for routine sediment core analysis as a means to standardize and better quantify the products of CT data collected on medical CT scanners. SedCT uses a graphical interface to process Digital Imaging and Communications in Medicine (DICOM) files, stitch overlapping scanned intervals, and create down core HU profiles in a manner robust to normal coring imperfections. Utilizing a random sampling technique, SedCT reduces data size and allows for quick processing on typical laptop computers. SedCTimage uses a graphical interface to create quality tiff files of CT slices that are scaled to a user-defined HU range, preserving the quantitative nature of CT images and easily allowing for comparison between sediment cores with different HU means and variance. These tools are presented along with examples from lacustrine and marine sediment cores to highlight the robustness and quantitative nature of this method.
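
    SedCT itself is MATLAB-based; the following Python sketch only illustrates the underlying idea of a down-core Hounsfield-unit profile from a stack of medical-CT DICOM slices. The directory layout, the central region of interest, and the presence of rescale tags in each file are assumptions.

```python
# Sketch: build a down-core mean-HU profile from a directory of CT DICOM slices.
# Not the SedCT implementation; file naming, ROI size, and tag availability are assumed.
from pathlib import Path
import numpy as np
import pydicom

def hu_profile(dicom_dir, roi_half_width=50):
    slices = [pydicom.dcmread(p) for p in Path(dicom_dir).glob("*.dcm")]
    # Order slices along the scan axis using the DICOM slice position.
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    profile = []
    for s in slices:
        # Convert stored pixel values to Hounsfield units.
        hu = s.pixel_array.astype(np.float32) * float(s.RescaleSlope) + float(s.RescaleIntercept)
        cy, cx = np.array(hu.shape) // 2
        roi = hu[cy - roi_half_width:cy + roi_half_width,
                 cx - roi_half_width:cx + roi_half_width]   # central ROI avoids the core liner
        profile.append(roi.mean())
    return np.array(profile)

# print(hu_profile("core_scan/"))  # hypothetical directory of DICOM files
```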

  16. Design of an EEG-based brain-computer interface (BCI) from standard components running in real-time under Windows.

    Science.gov (United States)

    Guger, C; Schlögl, A; Walterspacher, D; Pfurtscheller, G

    1999-01-01

    An EEG-based brain-computer interface (BCI) is a direct connection between the human brain and the computer. Such a communication system is needed by patients with severe motor impairments (e.g. late stage of Amyotrophic Lateral Sclerosis) and has to operate in real-time. This paper describes the selection of the appropriate components to construct such a BCI and focuses also on the selection of a suitable programming language and operating system. The multichannel system runs under Windows 95, equipped with a real-time Kernel expansion to obtain reasonable real-time operations on a standard PC. Matlab controls the data acquisition and the presentation of the experimental paradigm, while Simulink is used to calculate the recursive least square (RLS) algorithm that describes the current state of the EEG in real-time. First results of the new low-cost BCI show that the accuracy of differentiating imagination of left and right hand movement is around 95%.
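
    A minimal sketch of a recursive least squares update of the kind referred to above, written in Python/NumPy rather than the Matlab/Simulink setup of the original system; the model order, forgetting factor, and synthetic data are illustrative assumptions.

```python
# Sketch: one-sample-at-a-time recursive least squares (RLS) parameter tracking.
import numpy as np

def rls_step(w, P, x, d, lam=0.99):
    """One RLS update: weights w, inverse-correlation matrix P,
    input vector x, desired output d, forgetting factor lam."""
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = d - w @ x                    # a-priori prediction error
    w = w + k * e
    P = (P - np.outer(k, Px)) / lam
    return w, P

n = 8                                # illustrative model order
w = np.zeros(n)
P = np.eye(n) * 1000.0               # large initial inverse correlation
true_w = np.array([0.5, -0.2, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0])
rng = np.random.default_rng(1)
for _ in range(500):                 # simulated sample stream
    x = rng.normal(size=n)
    d = x @ true_w + 0.01 * rng.normal()
    w, P = rls_step(w, P, x, d)
print(np.round(w, 2))                # converges toward true_w
```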

  17. INTRAOPERATIVE IMAGE NAVIGATION: EXPERIMENTAL STUDY OF THE FEASIBILITY AND SURGEON PREFERENCE BETWEEN A STERILE ENCASED NINTENDO WII™ REMOTE AND STANDARD WIRELESS COMPUTER MOUSE.

    Science.gov (United States)

    Appleby, Ryan; Zur Linden, Alex; Sears, William

    2017-05-01

    Diagnostic imaging plays an important role in the operating room, providing surgeons with a reference and surgical plan. Surgeon autonomy in the operating room has been suggested to decrease errors that stem from communication mistakes. A standard computer mouse was compared to a wireless remote-control style controller for computer game consoles (Wiimote) for the navigation of diagnostic imaging studies by sterile personnel in this prospective survey study. Participants were recruited from a cohort of residents and faculty that use the surgical suites at our institution. Outcome assessments were based on survey data completed by study participants following each use of either the mouse or Wiimote, and compared using an analysis of variance. The mouse was significantly preferred by the study participants in the categories of handling, accuracy and efficiency, and overall satisfaction (P <0.05). The mouse was preferred to both the Wiimote and to no device, when participants were asked to rank options for image navigation. This indicates the need for the implementation of intraoperative image navigation devices, to increase surgeon autonomy in the operating room. © 2017 American College of Veterinary Radiology.

  18. TECHNICAL BASIS FOR DOE STANDARD 3013 EQUIVALENCY SUPPORTING REDUCED TEMPERATURE STABILIZATION OF OXALATE-DERIVED PLUTONIUM OXIDE PRODUCED BY THE HB-LINE FACILITY AT SAVANNAH RIVER SITE

    Energy Technology Data Exchange (ETDEWEB)

    Duffey, J.; Livingston, R.; Berg, J.; Veirs, D.

    2012-07-02

    The HB-Line (HBL) facility at the Savannah River Site (SRS) is designed to produce high-purity plutonium dioxide (PuO₂) which is suitable for future use in production of Mixed Oxide (MOX) fuel. The MOX Fuel Fabrication Facility (MFFF) requires PuO₂ feed to be packaged per the U.S. Department of Energy (DOE) Standard 3013 (DOE-STD-3013) to comply with the facility's safety basis. The stabilization conditions imposed by DOE-STD-3013 for PuO₂ (i.e., 950 °C for 2 hours) preclude use of the HBL PuO₂ in direct fuel fabrication and reduce the value of the HBL product as MFFF feedstock. Consequently, HBL initiated a technical evaluation to define acceptable operating conditions for production of high-purity PuO₂ that fulfills the DOE-STD-3013 criteria for safe storage. The purpose of this document is to demonstrate that within the defined operating conditions, the HBL process will be equivalent for meeting the requirements of the DOE-STD-3013 stabilization process for plutonium-bearing materials from the DOE complex. The proposed 3013 equivalency reduces the prescribed stabilization temperature for high-purity PuO₂ from oxalate precipitation processes from 950 °C to 640 °C and places a limit of 60% on the relative humidity (RH) at the lowest material temperature. The equivalency is limited to material produced using the HBL established flow sheet, for example, nitric acid anion exchange and Pu(IV) direct strike oxalate precipitation with stabilization at a minimum temperature of 640 °C for four hours (h). The product purity must meet the MFFF acceptance criteria of 23,600 µg/g Pu (i.e., 2.1 wt %) total impurities and chloride content less than 250 µg/g of Pu. All other stabilization and packaging criteria identified by DOE-STD-3013-2012 or earlier revisions of the standard apply. Based on the evaluation of test data discussed in this document, the expert judgment of the authors supports packaging the HBL product under a 3013

  19. Computational Identification of Antigenicity-Associated Sites in the Hemagglutinin Protein of A/H1N1 Seasonal Influenza Virus.

    Directory of Open Access Journals (Sweden)

    Xiaowei Ren

    Full Text Available The antigenic variability of influenza viruses has always made influenza vaccine development challenging. The punctuated nature of antigenic drift of influenza virus suggests that a relatively small number of genetic changes or combinations of genetic changes may drive changes in antigenic phenotype. The present study aimed to identify antigenicity-associated sites in the hemagglutinin protein of A/H1N1 seasonal influenza virus using computational approaches. Random Forest Regression (RFR) and Support Vector Regression based on Recursive Feature Elimination (SVR-RFE) were applied to H1N1 seasonal influenza viruses and used to analyze the associations between amino acid changes in the HA1 polypeptide and antigenic variation based on hemagglutination-inhibition (HI) assay data. Twenty-three and twenty antigenicity-associated sites were identified by RFR and SVR-RFE, respectively, by considering the joint effects of amino acid residues on antigenic drift. Our proposed approaches were further validated with the H3N2 dataset. The prediction models developed in this study can quantitatively predict antigenic differences with high prediction accuracy based only on HA1 sequences. Application of the study results can increase understanding of H1N1 seasonal influenza virus antigenic evolution and accelerate the selection of vaccine strains.
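
    To illustrate the SVR-RFE idea referred to above, the sketch below runs recursive feature elimination around a linear-kernel support vector regressor in scikit-learn; the synthetic feature matrix stands in for amino-acid differences between HA1 sequence pairs and the target for HI-derived antigenic distances, so all names and sizes are assumptions.

```python
# Sketch: SVR-driven recursive feature elimination (SVR-RFE) on synthetic data,
# as a stand-in for the paper's pipeline; not the authors' implementation.
import numpy as np
from sklearn.svm import SVR
from sklearn.feature_selection import RFE

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 120)).astype(float)   # 120 candidate HA1 sites (binary differences)
y = X[:, :5] @ np.array([1.0, 0.8, 0.6, 0.5, 0.4]) + 0.1 * rng.normal(size=200)  # antigenic distance

# Linear-kernel SVR exposes coefficients, which RFE uses to rank and prune features.
selector = RFE(SVR(kernel="linear"), n_features_to_select=20, step=1)
selector.fit(X, y)
selected_sites = np.flatnonzero(selector.support_)
print("candidate antigenicity-associated sites:", selected_sites)
```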

  20. Interaction of water, alkyl hydroperoxide, and allylic alcohol with a single-site homogeneous Ti-Si epoxidation catalyst: A spectroscopic and computational study.

    Science.gov (United States)

    Urakawa, Atsushi; Bürgi, Thomas; Skrabal, Peter; Bangerter, Felix; Baiker, Alfons

    2005-02-17

    Tetrakis(trimethylsiloxy)titanium (TTMST, Ti(OSiMe3)4) possesses an isolated Ti center and is a highly active homogeneous catalyst in epoxidation of various olefins. The structure of TTMST resembles that of the active sites in some heterogeneous Ti-Si epoxidation catalysts, especially silylated titania-silica mixed oxides. Water cleaves the Ti-O-Si bond and deactivates the catalyst. An alkyl hydroperoxide, TBHP (tert-butyl hydroperoxide), does not cleave the Ti-O-Si bond, but interacts via weak hydrogen-bonding as supported by NMR, DOSY, IR, and computational studies. ATR-IR spectroscopy combined with computational investigations shows that more than one, that is, up to four, TBHP can undergo hydrogen-bonding with TTMST, leading to the activation of the O-O bond of TBHP. The greater the number of TBHP molecules that form hydrogen bonds to TTMST, the more electrophilic the O-O bond becomes, and the more active the complex is for epoxidation. An allylic alcohol, 2-cyclohexen-1-ol, does not interact strongly with TTMST, but the interaction is prominent when it interacts with the TTMST-TBHP complex. On the basis of the experimental and theoretical findings, a hydrogen-bond-assisted epoxidation mechanism of TTMST is suggested.

  1. Analysis of Vector Models in Quantification of Artifacts Produced by Standard Prosthetic Inlays in Cone-Beam Computed Tomography (CBCT) – a Preliminary Study

    Directory of Open Access Journals (Sweden)

    Ingrid Różyło-Kalinowska

    2014-11-01

    Full Text Available Cone-beam computed tomography (CBCT) is a relatively new, but highly efficient imaging method applied first in dentistry in 1998. However, the quality of the obtained slices depends among other things on artifacts generated by dental restorations as well as orthodontic and prosthetic appliances. The aim of the study was to quantify the artifacts produced by standard prosthetic inlays in CBCT images. The material consisted of 17 standard prosthetic inlays mounted in dental roots embedded in resin. The samples were examined by means of a large field of view CBCT unit, Galileos (Sirona, Germany), at 85 kV and 14 mAs. The analysis was performed using Able 3DDoctor software for data in the CT raster space as well as by means of Materialise Magics software for generated vector models (STL). The masks generated in the raster space included the area of the inlays together with image artifacts. The region of interest (ROI) of the raster space is a set of voxels from a selected range of Hounsfield units (109-3071). The ceramic inlay with zirconium dioxide (Cera Post) as well as the epoxy resin inlay including silica fibers enriched with zirconium (Easy Post) produced the most intense artifacts. The smallest image distortions were created by titanium inlays, both passive (Harald Nordin) and active (Flexi Flange). Inlays containing zirconium generated the strongest artifacts, thus leading to the greatest distortions in the CBCT images. Carbon fiber inlay did not considerably affect the image quality.
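
    The raster-space quantification described above amounts to counting voxels within a chosen gray-value range and converting the count to a volume. A minimal NumPy sketch follows, with an invented volume and voxel size; it is not the Able 3DDoctor/Magics workflow used in the study.

```python
# Sketch: quantify artifact extent by counting voxels whose values fall in a
# selected range (the study used 109-3071) and converting the count to a volume.
import numpy as np

def artifact_volume(volume, voxel_size_mm, lo=109, hi=3071):
    """Return the volume (mm^3) of voxels within [lo, hi]."""
    mask = (volume >= lo) & (volume <= hi)
    voxel_mm3 = float(np.prod(voxel_size_mm))
    return mask.sum() * voxel_mm3

rng = np.random.default_rng(0)
vol = rng.normal(0, 500, size=(64, 64, 64))        # stand-in for a CBCT sub-volume
print(artifact_volume(vol, voxel_size_mm=(0.3, 0.3, 0.3)))
```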

  2. Comparison of high-resolution and standard zoom imaging modes in cone beam computed tomography for detection of longitudinal root fracture: An in vitro study

    International Nuclear Information System (INIS)

    Taramsari, Mehran; Kajan, Zahra Dalili; Bashizadeh, Parinaz; Salamat, Fatemeh

    2013-01-01

    The purpose of this study was to compare the efficacy of two imaging modes in a cone beam computed tomography (CBCT) system in detecting root fracture in endodontically-treated teeth with fiber posts or screw posts, using two fields of view. In this study, 78 endodontically-treated single-canal premolars were included. A post space was created in all of them. Then the teeth were randomly set in one of 6 artificial dental arches. In 39 of the 78 teeth set in the 6 dental arches, a root fracture was intentionally created. Next, fiber posts and screw posts were each cemented into 26 teeth, with root fractures distributed equally between the two groups. High resolution (HiRes) and standard zoom images were acquired with a CBCT device. After reviewing the reconstructed images, two observers confirmed in consensus the presence or absence of root fracture. A McNemar test was used for comparing the results of the two modes. The frequency of a correct diagnosis using the HiRes zoom imaging mode was 71.8%, and with the standard zoom it was 59%. The overall sensitivity and specificity in diagnosing root fracture in the HiRes mode were 71.79% and 46.15%, and in the standard zoom mode they were 58.97% and 33.33%, respectively. There were no significant differences between the diagnostic values of the two imaging modes used in the diagnosis of root fracture or in the presence of root canal restorations. In both modes, most true-positive results were reported in the post space group.
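
    The paired comparison of the two imaging modes uses a McNemar test; a minimal sketch with statsmodels and invented 2x2 counts (not the study's data) is shown below.

```python
# Sketch: exact McNemar test on paired diagnostic outcomes (HiRes vs. standard zoom
# on the same teeth). The 2x2 counts are invented for illustration only.
from statsmodels.stats.contingency_tables import mcnemar

# rows: HiRes correct / incorrect; columns: standard zoom correct / incorrect
table = [[40, 16],
         [6, 16]]
result = mcnemar(table, exact=True)
print("statistic:", result.statistic, "p-value:", result.pvalue)
```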

  3. Computer Programs for Obtaining and Analyzing Daily Mean Steamflow Data from the U.S. Geological Survey National Water Information System Web Site

    Science.gov (United States)

    Granato, Gregory E.

    2009-01-01

    Research Council, 2004). The USGS maintains the National Water Information System (NWIS), a distributed network of computers and file servers used to store and retrieve hydrologic data (Mathey, 1998; U.S. Geological Survey, 2008). NWISWeb is an online version of this database that includes water data from more than 24,000 streamflow-gaging stations throughout the United States (U.S. Geological Survey, 2002, 2008). Information from NWISWeb is commonly used to characterize streamflows at gaged sites and to help predict streamflows at ungaged sites. Five computer programs were developed for obtaining and analyzing streamflow from the National Water Information System (NWISWeb). The programs were developed as part of a study by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, to develop a stochastic empirical loading and dilution model. The programs were developed because reliable, efficient, and repeatable methods are needed to access and process streamflow information and data. The first program is designed to facilitate the downloading and reformatting of NWISWeb streamflow data. The second program is designed to facilitate graphical analysis of streamflow data. The third program is designed to facilitate streamflow-record extension and augmentation to help develop long-term statistical estimates for sites with limited data. The fourth program is designed to facilitate statistical analysis of streamflow data. The fifth program is a preprocessor to create batch input files for the U.S. Environmental Protection Agency DFLOW3 program for calculating low-flow statistics. These computer programs were developed to facilitate the analysis of daily mean streamflow data for planning-level water-quality analyses but also are useful for many other applications pertaining to streamflow data and statistics. These programs and the associated documentation are included on the CD-ROM accompanying this report. This report and the appendixes on the
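
    As an illustration of retrieving daily mean streamflow from NWISWeb, the sketch below queries the public USGS daily-values REST service; the endpoint, parameter names, station number, and date range are assumptions for the example and are not taken from the programs described in the report.

```python
# Sketch: fetch daily mean discharge (parameter 00060) for one gaging station from
# the NWISWeb daily-values service and compute a simple summary statistic.
import requests

URL = "https://waterservices.usgs.gov/nwis/dv/"        # assumed public endpoint
params = {
    "format": "json",
    "sites": "01646500",          # example station number (assumption)
    "parameterCd": "00060",       # discharge, cubic feet per second
    "startDT": "2020-01-01",
    "endDT": "2020-12-31",
}
resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
values = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
flows = [float(v["value"]) for v in values if v["value"] not in ("", "-999999")]
print("n =", len(flows), "mean daily flow =", sum(flows) / len(flows))
```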

  4. Hydrogeologic characterization and assessment of bioremediation of chlorinated benzenes and benzene in wetland areas, Standard Chlorine of Delaware, Inc. Superfund Site, New Castle County, Delaware, 2009-12

    Science.gov (United States)

    Lorah, Michelle M.; Walker, Charles W.; Baker, Anna C.; Teunis, Jessica A.; Emily Majcher,; Brayton, Michael J.; Raffensperger, Jeff P.; Cozzarelli, Isabelle M.

    2015-01-01

    Wetlands at the Standard Chlorine of Delaware, Inc. Superfund Site (SCD) in New Castle County, Delaware, are affected by contamination with chlorobenzenes and benzene from past waste storage and disposal, spills, leaks, and contaminated groundwater discharge. In cooperation with the U.S. Environmental Protection Agency, the U.S. Geological Survey began an investigation in June 2009 to characterize the hydrogeology and geochemistry in the wetlands and assess the feasibility of monitored natural attenuation and enhanced bioremediation as remedial strategies. Groundwater flow in the wetland study area is predominantly vertically upward in the wetland sediments and the underlying aquifer, and groundwater discharge accounts for a minimum of 47 percent of the total discharge for the subwatershed of tidal Red Lion Creek. Thus, groundwater transport of contaminants to surface water could be significant. The major contaminants detected in groundwater in the wetland study area included benzene, monochlorobenzene, and tri- and di-chlorobenzenes. Shallow wetland groundwater in the northwest part of the wetland study area was characterized by high concentrations of total chlorinated benzenes and benzene (maximum about 75,000 micrograms per liter [μg/L]), low pH, and high chloride. In the northeast part of the wetland study area, wetland groundwater had low to moderate concentrations of total chlorinated benzenes and benzene (generally not greater than 10,000 μg/L), moderate pH, and high sulfate concentrations. Concentrations in the groundwater in excess of 1 percent of the solubility of the individual chlorinated benzenes indicate that a contaminant source is present in the wetland sediments as dense nonaqueous phase liquids (DNAPLs). Consistently higher contaminant concentrations in the shallow wetland groundwater than deeper in the wetland sediments or the aquifer also indicate a continued source in the wetland sediments, which could include dissolution of DNAPLs and

  5. The DYD-RCT protocol: an on-line randomised controlled trial of an interactive computer-based intervention compared with a standard information website to reduce alcohol consumption among hazardous drinkers

    Directory of Open Access Journals (Sweden)

    Godfrey Christine

    2007-10-01

    Full Text Available Background: Excessive alcohol consumption is a significant public health problem throughout the world. Although there is a range of effective interventions to help heavy drinkers reduce their alcohol consumption, these have little proven population-level impact. Researchers internationally are looking at the potential of Internet interventions in this area. Methods/Design: In a two-arm randomised controlled trial, an on-line psychologically enhanced interactive computer-based intervention is compared with a flat, text-based information web-site. Recruitment, consent, randomisation and data collection are all on-line. The primary outcome is total past-week alcohol consumption; secondary outcomes include hazardous or harmful drinking, dependence, harm caused by alcohol, and mental health. A health economic analysis is included. Discussion: This trial will provide information on the effectiveness and cost-effectiveness of an on-line intervention to help heavy drinkers drink less. Trial registration: International Standard Randomised Controlled Trial Number Register ISRCTN31070347

  6. Does the intensity of diffuse thyroid gland uptake on F-18 fluorodeoxyglucose positron emission tomography/computed tomography scan predict the severity of hypothyroidism? Correlation between maximal standardized uptake value and serum thyroid stimulating hormone levels

    International Nuclear Information System (INIS)

    Pruthi, Ankur; Choudhury, Partha Sarathi; Gupta, Manoj; Taywade, Sameer

    2015-01-01

    F-18 fluorodeoxyglucose (F-18 FDG) positron emission tomography/computed tomography (PET/CT) scan and hypothyroidism. The aim was to determine whether the intensity of diffuse thyroid gland uptake on F-18 FDG PET/CT scans predicts the severity of hypothyroidism. A retrospective analysis was done of 3868 patients who underwent F-18 FDG PET/CT scans between October 2012 and June 2013 in our institution for various oncological indications. Out of them, 106 (2.7%) patients (79 females, 27 males) presented with bilateral diffuse thyroid gland uptake as an incidental finding. These patients were investigated retrospectively, and various parameters such as age, sex, primary cancer site, maximal standardized uptake value (SUVmax), results of thyroid function tests (TFTs) and fine-needle aspiration cytology results were noted. The SUVmax values were correlated with serum thyroid stimulating hormone (S. TSH) levels using Pearson's correlation analysis. Clinical information and TFT (serum FT3, FT4 and TSH levels) results were available for 31 of the 106 patients (27 females, 4 males; mean age 51.5 years). Twenty-six out of 31 patients (84%) had abnormal TFTs, with abnormal TSH levels in 24/31 patients (mean S. TSH: 22.35 μIU/ml, median: 7.37 μIU/ml, range: 0.074-211 μIU/ml). Among 7 patients with normal TSH levels, 2 patients demonstrated low FT3 and FT4 levels. No significant correlation was found between maximum standardized uptake value and TSH levels (r = 0.115, P > 0.05). Incidentally detected diffuse thyroid gland uptake on F-18 FDG PET/CT scan was usually associated with hypothyroidism, probably caused by autoimmune thyroiditis. Patients should be investigated promptly with TFTs, irrespective of the intensity of FDG uptake, to initiate replacement therapy, and with an ultrasonography (USG) examination to look for any suspicious nodules

  7. Cost-effectiveness of computer-assisted training in cognitive-behavioral therapy as an adjunct to standard care for addiction.

    Science.gov (United States)

    Olmstead, Todd A; Ostrow, Cary D; Carroll, Kathleen M

    2010-08-01

    To determine the cost-effectiveness, from clinic and patient perspectives, of a computer-based version of cognitive-behavioral therapy (CBT4CBT) as an addition to regular clinical practice for substance dependence. PARTICIPANTS, DESIGN AND MEASUREMENTS: This cost-effectiveness study is based on a randomized clinical trial in which 77 individuals seeking treatment for substance dependence at an outpatient community setting were randomly assigned to treatment as usual (TAU) or TAU plus biweekly access to computer-based training in CBT (TAU plus CBT4CBT). The primary patient outcome measure was the total number of drug-free specimens provided during treatment. Incremental cost-effectiveness ratios (ICERs) and cost-effectiveness acceptability curves (CEACs) were used to determine the cost-effectiveness of TAU plus CBT4CBT relative to TAU alone. Results are presented from both the clinic and patient perspectives and are shown to be robust to (i) sensitivity analyses and (ii) a secondary objective patient outcome measure. The per patient cost of adding CBT4CBT to standard care was $39 ($27) from the clinic (patient) perspective. From the clinic (patient) perspective, TAU plus CBT4CBT is likely to be cost-effective when the threshold value to decision makers of an additional drug-free specimen is greater than approximately $21 ($15), and TAU alone is likely to be cost-effective when the threshold value is less than approximately $21 ($15). The ICERs for TAU plus CBT4CBT also compare favorably to ICERs reported elsewhere for other empirically validated therapies, including contingency management. TAU plus CBT4CBT appears to be a good value from both the clinic and patient perspectives. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
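
    The ICER and CEAC calculations described above can be sketched in a few lines; the per-patient costs and drug-free-specimen counts below are invented stand-ins, not the trial's data.

```python
# Sketch: incremental cost-effectiveness ratio (ICER) and a simple bootstrap
# cost-effectiveness acceptability curve (CEAC) on invented per-patient data.
import numpy as np

rng = np.random.default_rng(0)
cost_tau, eff_tau = rng.normal(100, 10, 38), rng.normal(5.0, 2.0, 38)   # TAU alone
cost_cbt, eff_cbt = rng.normal(139, 10, 39), rng.normal(6.5, 2.0, 39)   # TAU + computer-based CBT

icer = (cost_cbt.mean() - cost_tau.mean()) / (eff_cbt.mean() - eff_tau.mean())
print(f"ICER: ${icer:.2f} per additional drug-free specimen")

# CEAC: bootstrap probability of positive net monetary benefit at each
# willingness-to-pay (WTP) threshold per extra drug-free specimen.
B = 2000
for wtp in (5, 10, 15, 20, 25, 30):
    wins = 0
    for _ in range(B):
        i1 = rng.integers(0, len(cost_cbt), len(cost_cbt))
        i0 = rng.integers(0, len(cost_tau), len(cost_tau))
        nmb = wtp * (eff_cbt[i1].mean() - eff_tau[i0].mean()) - (cost_cbt[i1].mean() - cost_tau[i0].mean())
        wins += nmb > 0
    print(f"WTP ${wtp:>2}: P(cost-effective) = {wins / B:.2f}")
```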

  8. Is Whole-Body Computed Tomography the Standard Work-up for Severely-Injured Children? Results of a Survey among German Trauma Centers.

    Science.gov (United States)

    Bayer, J; Reising, K; Kuminack, K; Südkamp, N P; Strohm, P C

    2015-01-01

    Whole-body computed tomography is accepted as the standard procedure in the primary diagnostic work-up of polytraumatised adults in the emergency room. Up to now, there has been controversial discussion about applying the same algorithm in the primary diagnostic work-up of children. The aim of this study was to survey the participation of German trauma centres in the care of polytraumatised children and the hospital-dependent use of whole-body computed tomography for initial patient work-up. A questionnaire was mailed to every Department of Traumatology registered in the DGU (German Trauma Society) databank. We received 60.32% of the questionnaires sent, and after applying exclusion criteria 269 (53.91%) were applicable to statistical analysis. In the three-tiered German hospital system, no statistical difference was seen in the general participation in children's polytrauma care between hospitals of different tiers (p = 0.315). Even at the lowest hospital level, 69.47% of hospitals stated that they participate in polytrauma care for children; at intermediate and highest level hospitals, 91.89% and 95.24%, respectively, stated that they are involved in children's polytrauma care. Children suspected of having multiple injuries or polytrauma received significantly fewer primary whole-body CTs in lowest level compared to intermediate level hospitals (36.07% vs. 56.57%; p = 0.015) and in lowest level compared to highest level hospitals (36.07% vs. 68.42%; p = 0.001). Comparing the use of whole-body CT in intermediate versus highest level hospitals, a non-significant increase in its use could be seen in highest level hospitals (56.57% vs. 68.42%; p = 0.174). According to our survey, taking care of polytraumatised children in Germany is not limited to specialised hospitals or a defined hospital level of care. Additionally, there is no established radiologic standard in the work-up of the polytraumatised child. However, at higher hospital care levels a higher percentage of hospitals employs whole-body CTs for primary

  9. Added value of cardiac computed tomography for evaluation of mechanical aortic valve: Emphasis on evaluation of pannus with surgical findings as standard reference.

    Science.gov (United States)

    Suh, Young Joo; Lee, Sak; Im, Dong Jin; Chang, Suyon; Hong, Yoo Jin; Lee, Hye-Jeong; Hur, Jin; Choi, Byoung Wook; Chang, Byung-Chul; Shim, Chi Young; Hong, Geu-Ru; Kim, Young Jin

    2016-07-01

    The added value of cardiac computed tomography (CT) with transesophageal echocardiography (TEE) for evaluating mechanical aortic valve (AV) dysfunction has not yet been investigated. The purposes of this study were to investigate the added value of cardiac CT for evaluation of mechanical AVs and diagnoses of pannus compared to TEE, with surgical findings of redo-aortic valve replacement (AVR) used as a standard reference. 25 patients who underwent redo-AVR due to mechanical AV dysfunction and cardiac CT before redo-AVR were included. The presence of pannus, encroachment ratio by pannus, and limitation of motion (LOM) were evaluated on CT. The diagnostic performance of pannus detection was compared using TEE, CT, and CT+TEE, with surgical findings as a standard reference. The added value of CT for diagnosing the cause of mechanical AV dysfunction was assessed compared to TTE+TEE. In two patients, CT analysis was not feasible due to severe metallic artifacts. On CT, pannus and LOM were found in 100% (23/23) and 60.9% (14/23). TEE identified pannus in 48.0% of patients (12/25). CT, TEE, and CT+TEE correctly identified pannus with sensitivity of 92.0%, 48.0%, and 92.0%, respectively (P=0.002 for CT vs. TEE). In 11 of 13 cases (84.6%) with inconclusive or negative TEE results for pannus, CT detected the pannus. Among 13 inconclusive cases of TTE+TEE for the cause of mechanical AV dysfunction, CT suggested 6 prosthetic valve obstruction (PVO) by pannus, 4 low-flow low-gradient PVO, and one LOM without significant PVO. Cardiac CT showed added diagnostic value with TEE in the detection of pannus as the cause of mechanical AV dysfunction. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    Science.gov (United States)

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. It makes it possible to fill the current gap
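
    As a purely hypothetical illustration of what an XML-based implant record of the kind described above might contain, the sketch below builds one with Python's standard library; the element and attribute names are not the authors' schema.

```python
# Sketch: a hypothetical XML implant record with geometry and calibration fields.
# All element/attribute names are illustrative assumptions, not the published format.
import xml.etree.ElementTree as ET

implant = ET.Element("implant", id="plate-001", manufacturer="ExampleCorp")
ET.SubElement(implant, "name").text = "Distal radius plate, 5-hole"
ET.SubElement(implant, "revision").text = "2"
ET.SubElement(implant, "geometry", format="STL", file="plate-001.stl")
calibration = ET.SubElement(implant, "calibration")
ET.SubElement(calibration, "referencePoint", x="0.0", y="0.0", z="0.0")
ET.SubElement(calibration, "axis", x="0.0", y="0.0", z="1.0")

print(ET.tostring(implant, encoding="unicode"))
```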

  11. Change of Maximum Standardized Uptake Value Slope in Dynamic Triphasic [18F]-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Distinguishes Malignancy From Postradiation Inflammation in Head-and-Neck Squamous Cell Carcinoma: A Prospective Trial

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Carryn M., E-mail: carryn-anderson@uiowa.edu [Department of Radiation Oncology, University of Iowa, Iowa City, Iowa (United States); Chang, Tangel [Department of Radiation Oncology, University of Iowa, Iowa City, Iowa (United States); Graham, Michael M. [Department of Nuclear Medicine, University of Iowa, Iowa City, Iowa (United States); Marquardt, Michael D. [Department of Radiation Oncology, University of Iowa, Iowa City, Iowa (United States); Button, Anna; Smith, Brian J. [Department of Biostatistics, University of Iowa, Iowa City, Iowa (United States); Menda, Yusuf [Department of Nuclear Medicine, University of Iowa, Iowa City, Iowa (United States); Sun, Wenqing [Department of Radiation Oncology, University of Iowa, Iowa City, Iowa (United States); Pagedar, Nitin A. [Department of Otolaryngology—Head and Neck Surgery, University of Iowa, Iowa City, Iowa (United States); Buatti, John M. [Department of Radiation Oncology, University of Iowa, Iowa City, Iowa (United States)

    2015-03-01

    Purpose: To evaluate dynamic [18F]-fluorodeoxyglucose (FDG) uptake methodology as a post–radiation therapy (RT) response assessment tool, potentially enabling accurate tumor and therapy-related inflammation differentiation, improving the posttherapy value of FDG–positron emission tomography/computed tomography (FDG-PET/CT). Methods and Materials: We prospectively enrolled head-and-neck squamous cell carcinoma patients who completed RT, with scheduled 3-month post-RT FDG-PET/CT. Patients underwent our standard whole-body PET/CT scan at 90 minutes, with the addition of head-and-neck PET/CT scans at 60 and 120 minutes. Maximum standardized uptake values (SUVmax) of regions of interest were measured at 60, 90, and 120 minutes. The SUVmax slope between 60 and 120 minutes and change of SUVmax slope before and after 90 minutes were calculated. Data were analyzed by primary site and nodal site disease status using the Cox regression model and Wilcoxon rank sum test. Outcomes were based on pathologic and clinical follow-up. Results: A total of 84 patients were enrolled, with 79 primary and 43 nodal evaluable sites. Twenty-eight sites were interpreted as positive or equivocal (18 primary, 8 nodal, 2 distant) on 3-month 90-minute FDG-PET/CT. Median follow-up was 13.3 months. All measured SUV endpoints predicted recurrence. Change of SUVmax slope after 90 minutes more accurately identified nonrecurrence in positive or equivocal sites than our current standard of SUVmax ≥2.5 (P=.02). Conclusions: The positive predictive value of post-RT FDG-PET/CT may significantly improve using novel second derivative analysis of dynamic triphasic FDG-PET/CT SUVmax slope, accurately distinguishing tumor from inflammation on positive and equivocal scans.
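
    The two endpoints described above, the 60-120 minute SUVmax slope and the change of slope around the 90 minute scan, are simple differences; a small sketch with invented SUVmax values follows.

```python
# Sketch: triphasic SUVmax slope metrics for one region of interest.
# The SUVmax values are invented; the time points (60, 90, 120 min) follow the abstract.
def suv_slope_metrics(suv60, suv90, suv120):
    slope_60_120 = (suv120 - suv60) / 60.0   # SUV units per minute over the full window
    slope_early = (suv90 - suv60) / 30.0     # 60 -> 90 min
    slope_late = (suv120 - suv90) / 30.0     # 90 -> 120 min
    delta_slope = slope_late - slope_early   # "change of SUVmax slope" around 90 min
    return slope_60_120, delta_slope

# Tumor-like uptake tends to keep rising; inflammation tends to plateau or fall.
print(suv_slope_metrics(4.0, 4.8, 5.5))   # rising example
print(suv_slope_metrics(4.0, 4.3, 4.2))   # plateauing example
```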

  12. Automatic Substitute Computed Tomography Generation and Contouring for Magnetic Resonance Imaging (MRI)-Alone External Beam Radiation Therapy From Standard MRI Sequences

    Energy Technology Data Exchange (ETDEWEB)

    Dowling, Jason A., E-mail: jason.dowling@csiro.au [CSIRO Australian e-Health Research Centre, Herston, Queensland (Australia); University of Newcastle, Callaghan, New South Wales (Australia); Sun, Jidi [University of Newcastle, Callaghan, New South Wales (Australia); Pichler, Peter [Calvary Mater Newcastle Hospital, Waratah, New South Wales (Australia); Rivest-Hénault, David; Ghose, Soumya [CSIRO Australian e-Health Research Centre, Herston, Queensland (Australia); Richardson, Haylea [Calvary Mater Newcastle Hospital, Waratah, New South Wales (Australia); Wratten, Chris; Martin, Jarad [University of Newcastle, Callaghan, New South Wales (Australia); Calvary Mater Newcastle Hospital, Waratah, New South Wales (Australia); Arm, Jameen [Calvary Mater Newcastle Hospital, Waratah, New South Wales (Australia); Best, Leah [Department of Radiology, Hunter New England Health, New Lambton, New South Wales (Australia); Chandra, Shekhar S. [School of Information Technology and Electrical Engineering, University of Queensland, Brisbane, Queensland (Australia); Fripp, Jurgen [CSIRO Australian e-Health Research Centre, Herston, Queensland (Australia); Menk, Frederick W. [University of Newcastle, Callaghan, New South Wales (Australia); Greer, Peter B. [University of Newcastle, Callaghan, New South Wales (Australia); Calvary Mater Newcastle Hospital, Waratah, New South Wales (Australia)

    2015-12-01

    Purpose: To validate automatic substitute computed tomography CT (sCT) scans generated from standard T2-weighted (T2w) magnetic resonance (MR) pelvic scans for MR-Sim prostate treatment planning. Patients and Methods: A Siemens Skyra 3T MR imaging (MRI) scanner with laser bridge, flat couch, and pelvic coil mounts was used to scan 39 patients scheduled for external beam radiation therapy for localized prostate cancer. For sCT generation a whole-pelvis MRI scan (1.6 mm 3-dimensional isotropic T2w SPACE [Sampling Perfection with Application optimized Contrasts using different flip angle Evolution] sequence) was acquired. Three additional small field of view scans were acquired: T2w, T2*w, and T1w flip angle 80° for gold fiducials. Patients received a routine planning CT scan. Manual contouring of the prostate, rectum, bladder, and bones was performed independently on the CT and MR scans. Three experienced observers contoured each organ on MRI, allowing interobserver quantification. To generate a training database, each patient CT scan was coregistered to their whole-pelvis T2w using symmetric rigid registration and structure-guided deformable registration. A new multi-atlas local weighted voting method was used to generate automatic contours and sCT results. Results: The mean error in Hounsfield units between the sCT and corresponding patient CT (within the body contour) was 0.6 ± 14.7 (mean ± 1 SD), with a mean absolute error of 40.5 ± 8.2 Hounsfield units. Automatic contouring results were very close to the expert interobserver level (Dice similarity coefficient): prostate 0.80 ± 0.08, bladder 0.86 ± 0.12, rectum 0.84 ± 0.06, bones 0.91 ± 0.03, and body 1.00 ± 0.003. The change in monitor units between the sCT-based plans relative to the gold standard CT plan for the same dose prescription was found to be 0.3% ± 0.8%. The 3-dimensional γ pass rate was 1.00 ± 0.00 (2 mm/2%). Conclusions: The MR-Sim setup and automatic s

  13. Automatic Substitute Computed Tomography Generation and Contouring for Magnetic Resonance Imaging (MRI)-Alone External Beam Radiation Therapy From Standard MRI Sequences

    International Nuclear Information System (INIS)

    Dowling, Jason A.; Sun, Jidi; Pichler, Peter; Rivest-Hénault, David; Ghose, Soumya; Richardson, Haylea; Wratten, Chris; Martin, Jarad; Arm, Jameen; Best, Leah; Chandra, Shekhar S.; Fripp, Jurgen; Menk, Frederick W.; Greer, Peter B.

    2015-01-01

    Purpose: To validate automatic substitute computed tomography CT (sCT) scans generated from standard T2-weighted (T2w) magnetic resonance (MR) pelvic scans for MR-Sim prostate treatment planning. Patients and Methods: A Siemens Skyra 3T MR imaging (MRI) scanner with laser bridge, flat couch, and pelvic coil mounts was used to scan 39 patients scheduled for external beam radiation therapy for localized prostate cancer. For sCT generation a whole-pelvis MRI scan (1.6 mm 3-dimensional isotropic T2w SPACE [Sampling Perfection with Application optimized Contrasts using different flip angle Evolution] sequence) was acquired. Three additional small field of view scans were acquired: T2w, T2*w, and T1w flip angle 80° for gold fiducials. Patients received a routine planning CT scan. Manual contouring of the prostate, rectum, bladder, and bones was performed independently on the CT and MR scans. Three experienced observers contoured each organ on MRI, allowing interobserver quantification. To generate a training database, each patient CT scan was coregistered to their whole-pelvis T2w using symmetric rigid registration and structure-guided deformable registration. A new multi-atlas local weighted voting method was used to generate automatic contours and sCT results. Results: The mean error in Hounsfield units between the sCT and corresponding patient CT (within the body contour) was 0.6 ± 14.7 (mean ± 1 SD), with a mean absolute error of 40.5 ± 8.2 Hounsfield units. Automatic contouring results were very close to the expert interobserver level (Dice similarity coefficient): prostate 0.80 ± 0.08, bladder 0.86 ± 0.12, rectum 0.84 ± 0.06, bones 0.91 ± 0.03, and body 1.00 ± 0.003. The change in monitor units between the sCT-based plans relative to the gold standard CT plan for the same dose prescription was found to be 0.3% ± 0.8%. The 3-dimensional γ pass rate was 1.00 ± 0.00 (2 mm/2%). Conclusions: The MR-Sim setup and automatic s
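
    The contour comparison in the two records above is reported as a Dice similarity coefficient; a minimal sketch of that metric on binary masks (tiny invented arrays, not the study's contours) is shown below.

```python
# Sketch: Dice similarity coefficient between two binary segmentation masks.
import numpy as np

def dice(mask_a, mask_b):
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0

auto = np.zeros((32, 32), dtype=bool);   auto[8:24, 8:24] = True     # automatic contour
expert = np.zeros((32, 32), dtype=bool); expert[10:26, 9:25] = True  # expert contour
print(round(dice(auto, expert), 3))
```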

  14. Lower maximum standardized uptake value of fluorine-18 fluorodeoxyglucose positron emission tomography coupled with computed tomography imaging in pancreatic ductal adenocarcinoma patients with diabetes.

    Science.gov (United States)

    Chung, Kwang Hyun; Park, Joo Kyung; Lee, Sang Hyub; Hwang, Dae Wook; Cho, Jai Young; Yoon, Yoo-Seok; Han, Ho-Seong; Hwang, Jin-Hyeok

    2015-04-01

    The effects of diabetes mellitus (DM) on the sensitivity of fluorine-18 fluorodeoxyglucose positron emission tomography coupled with computed tomography ((18)F-FDG PET/CT) for diagnosing pancreatic ductal adenocarcinomas (PDACs) are not well known. This study aimed to evaluate the effects of DM on the validity of (18)F-FDG PET/CT in PDAC. A total of 173 patients with PDACs who underwent (18)F-FDG PET/CT were enrolled (75 in the DM group and 98 in the non-DM group). The maximum standardized uptake values (SUVmax) were compared. The mean SUVmax was significantly lower in the DM group than in the non-DM group (4.403 vs 5.998, P = .001). The sensitivity of SUVmax (cut-off value 4.0) was significantly lower in the DM group than in the non-DM group (49.3% vs 75.5%, P < .001) and also lower in normoglycemic DM patients (n = 24) than in non-DM patients (54.2% vs 75.5%, P = .038). DM contributes to a lower SUVmax of (18)F-FDG PET/CT in patients with PDACs. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Correctness of multi-detector-row computed tomography for diagnosing mechanical prosthetic heart valve disorders using operative findings as a gold standard

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, I.Chen [Taichung Veterans General Hospital, Department of Radiology, Taichung (China); Institute of Clinical Medicine and Faculty of Medicine, National Yang-Ming University, Taipei (China); Lin, Yung-Kai; Chang, Yen; Wang, Chung-Chi; Hsieh, Shih-Rong; Wei, Hao-Ji; Tsai, Hung-Wen [Taichung Veterans General Hospital, Section of Cardiovascular Surgery, Cardiovascular Center, Taichung (China); Fu, Yun-Ching; Jan, Sheng-Ling [Institute of Clinical Medicine and Faculty of Medicine, National Yang-Ming University, Taipei (China); Taichung Veterans General Hospital, Section of Pediatric Cardiology, Department of Pediatrics, Taichung (China); Wang, Kuo-Yang [Taichung Veterans General Hospital, Section of General Cardiology, Cardiovascular Center, Taichung (China); Chung-Shan Medical University, Department of Medicine, Taichung (China); Chen, Min-Chi; Chen, Clayton Chi-Chang [Taichung Veterans General Hospital, Department of Radiology, Taichung (China); Central Taiwan University of Science and Technology, Department of Radiological Technology, Taichung (China)

    2009-04-15

    The purpose was to compare the findings of multi-detector computed tomography (MDCT) in prosthetic valve disorders against operative findings as the gold standard. Over a 3-year period, we prospectively enrolled 25 patients with 31 prosthetic heart valves. MDCT and transthoracic echocardiography (TTE) were performed to evaluate pannus formation, prosthetic valve dysfunction, suture loosening (paravalvular leak) and pseudoaneurysm formation. Patients indicated for surgery received an operation within 1 week. The MDCT findings were compared with the operative findings. One patient with a Bjoerk-Shiley valve could not be evaluated by MDCT due to a severe beam-hardening artifact; thus, the exclusion rate for MDCT was 3.2% (1/31). Prosthetic valve disorders were suspected in 12 patients by either MDCT or TTE. Six patients received an operation, comprising three redo aortic valve replacements, two redo mitral valve replacements and one Amplatzer ductal occluder occlusion of a mitral paravalvular leak. The concordance between the MDCT findings and the surgical findings in diagnosing and localizing prosthetic valve disorders was 100%. Except for images impaired by severe beam-hardening artifacts, MDCT provides excellent delineation of prosthetic valve disorders. (orig.)

  16. R2SM: a package for the analytic computation of the R2 Rational terms in the Standard Model of the Electroweak interactions

    International Nuclear Information System (INIS)

    Garzelli, M.V.

    2011-01-01

    The analytical package written in FORM presented in this paper allows the computation of the complete set of Feynman rules producing the Rational terms of kind R2 contributing to the virtual part of NLO corrections in the Standard Model of the electroweak interactions. Building-block topologies filled by means of generic scalars, vectors and fermions, which allow these Feynman rules to be built in terms of specific elementary particles, are explicitly given in the Rξ gauge class, together with the automatic dressing procedure to obtain the Feynman rules from them. The results in more specific gauges, like the 't Hooft-Feynman one, follow as particular cases, in both the HV and the FDH dimensional regularization schemes. As a check on our formulas, the gauge independence of the total Rational contribution (R1 + R2) to renormalized S-matrix elements is verified by considering the specific example of the H → γγ decay process at 1-loop. This package can be of interest for people aiming at a better understanding of the nature of the Rational terms. It is organized in a modular way, allowing further use of some of its files even in different contexts. Furthermore, it can be considered a first seed in the effort towards a complete automation of the analytical calculation of the R2 effective vertices, given the Lagrangian of a generic gauge theory of particle interactions. (orig.)

  17. Radiological Risk Assessments for Occupational Exposure at Fuel Fabrication Facility in AlTuwaitha Site Baghdad – Iraq by using RESRAD Computer Code

    Science.gov (United States)

    Ibrahim, Ziadoon H.; Ibrahim, S. A.; Mohammed, M. K.; Shaban, A. H.

    2018-05-01

    The purpose of this study is to evaluate the radiological risks to workers from one year of their activities at the Fuel Fabrication Facility (FFF), so that the necessary protection can be provided to prevent or minimize the risks resulting from these activities; this site is now under the Iraqi decommissioning program (40). Surface and subsurface soil samples were collected from different positions in the facility and analyzed by gamma-ray spectroscopy using a high-purity germanium (HPGe) detector. A mixture of radioactive isotopes (232Th, 40K, 238U, 235U and 137Cs) was found. According to the laboratory results, the highest values were 975758 for 238U, 21203 for 235U, 218 for 232Th, 4046 for 40K and 129 for 137Cs, in Bq/kg. The annual total radiation dose and the associated risks were estimated using the RESRAD (onsite) 7.0 computer code. The highest total radiation dose was 5617 μSv/year, in the area represented by soil sample S7, and the corresponding radiological risks (morbidity and mortality) in that area were 1.18E-02 and 8.661E-03, respectively.
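
    RESRAD itself models multiple exposure pathways and cannot be reproduced in a few lines, but as a rough order-of-magnitude cross-check of the external-exposure component, a simplified UNSCEAR-style estimate from soil activity concentrations can be sketched as follows. The dose-rate coefficients are the standard UNSCEAR 2000 values; the soil concentrations and occupancy factor in the example are hypothetical placeholders, not the results reported above.

        # Simplified outdoor external gamma dose from soil radionuclides (UNSCEAR-style),
        # shown only as an order-of-magnitude sketch, not a substitute for RESRAD.
        COEFF_NGY_H_PER_BQ_KG = {"U238_series": 0.462, "Th232_series": 0.604, "K40": 0.0417}

        def annual_external_dose_usv(activity_bq_per_kg, occupancy=0.2, sv_per_gy=0.7):
            """Annual effective dose (µSv) from outdoor external exposure to soil activity."""
            dose_rate_ngy_h = sum(COEFF_NGY_H_PER_BQ_KG[k] * activity_bq_per_kg.get(k, 0.0)
                                  for k in COEFF_NGY_H_PER_BQ_KG)
            hours_per_year = 8760
            # nGy/h -> µSv/year: hours * occupancy * (Sv/Gy) * 1e-3
            return dose_rate_ngy_h * hours_per_year * occupancy * sv_per_gy * 1e-3

        sample = {"U238_series": 1000.0, "Th232_series": 20.0, "K40": 400.0}  # hypothetical Bq/kg
        print(f"Estimated annual external dose: {annual_external_dose_usv(sample):.0f} µSv")  # ~602 µSv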

  18. Comparison of C-arm computed tomography and on-site quick cortisol assay for adrenal venous sampling: A retrospective study of 178 patients

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Chin-Chen; Lee, Bo-Ching; Chang, Yeun-Chung; Liu, Kao-Lang [National Taiwan University Hospital and National Taiwan University College of Medicine, Department of Medical Imaging, Taipei (China); Wu, Vin-Cent [National Taiwan University Hospital and National Taiwan University College of Medicine, Department of Internal Medicine, Taipei (China); Huang, Kuo-How [National Taiwan University Hospital and National Taiwan University College of Medicine, Department of Urology, Taipei (China); Collaboration: on behalf of the TAIPAI Study Group

    2017-12-15

    To compare the performance of on-site quick cortisol assay (QCA) and C-arm computed tomography (CT) assistance on adrenal venous sampling (AVS) without adrenocorticotropic hormone stimulation. The institutional review board at our hospital approved this retrospective study, which included 178 consecutive patients with primary aldosteronism. During AVS, we used C-arm CT to confirm right adrenal cannulation between May 2012 and June 2015 (n = 100) and QCA for bilateral adrenal cannulation between July 2015 and September 2016 (n = 78). Successful AVS required a selectivity index (adrenal vein cortisol / peripheral cortisol) of ≥ 2.0 bilaterally. The overall success rate of C-arm CT-assisted AVS was 87%, which increased to 97.4% under QCA (P = .013). The procedure time (C-arm CT, 49.5 ± 21.3 min; QCA, 37.5 ± 15.6 min; P < .001) and radiation dose (C-arm CT, 673.9 ± 613.8 mGy; QCA, 346.4 ± 387.8 mGy; P < .001) were also improved. The resampling rate was 16% and 21.8% for C-arm CT and QCA, respectively. The initial success rate of the performing radiologist remained stable during the study period (C-arm CT, 75%; QCA, 82.1%; P = .259). QCA might be superior to C-arm CT for improving the performance of AVS. (orig.)
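
    As a small illustration of the cannulation criterion used above (selectivity index = adrenal vein cortisol divided by peripheral cortisol, required to be ≥ 2.0 on both sides), a minimal sketch follows; the cortisol values are hypothetical, not patient data.

        # Selectivity index check for adrenal venous sampling (AVS) without ACTH stimulation.
        SELECTIVITY_THRESHOLD = 2.0

        def selectivity_index(adrenal_vein_cortisol, peripheral_cortisol):
            return adrenal_vein_cortisol / peripheral_cortisol

        def sampling_successful(right_av, left_av, peripheral):
            """Successful AVS requires a selectivity index >= 2.0 bilaterally."""
            return (selectivity_index(right_av, peripheral) >= SELECTIVITY_THRESHOLD and
                    selectivity_index(left_av, peripheral) >= SELECTIVITY_THRESHOLD)

        # Hypothetical cortisol values (same units throughout, e.g. µg/dL):
        print(sampling_successful(right_av=38.0, left_av=25.0, peripheral=11.0))  # True  (SI ≈ 3.5 and 2.3)
        print(sampling_successful(right_av=18.0, left_av=40.0, peripheral=11.0))  # False (right SI ≈ 1.6)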

  19. Comparison of C-arm computed tomography and on-site quick cortisol assay for adrenal venous sampling: A retrospective study of 178 patients.

    Science.gov (United States)

    Chang, Chin-Chen; Lee, Bo-Ching; Chang, Yeun-Chung; Wu, Vin-Cent; Huang, Kuo-How; Liu, Kao-Lang

    2017-12-01

    To compare the performance of on-site quick cortisol assay (QCA) and C-arm computed tomography (CT) assistance on adrenal venous sampling (AVS) without adrenocorticotropic hormone stimulation. The institutional review board at our hospital approved this retrospective study, which included 178 consecutive patients with primary aldosteronism. During AVS, we used C-arm CT to confirm right adrenal cannulation between May 2012 and June 2015 (n = 100) and QCA for bilateral adrenal cannulation between July 2015 and September 2016 (n = 78). Successful AVS required a selectivity index (adrenal vein cortisol / peripheral cortisol) of ≥ 2.0 bilaterally. The overall success rate of C-arm CT-assisted AVS was 87%, which increased to 97.4% under QCA (P = .013). The procedure time (C-arm CT, 49.5 ± 21.3 min; QCA, 37.5 ± 15.6 min; P < .001) and radiation dose (C-arm CT, 673.9 ± 613.8 mGy; QCA, 346.4 ± 387.8 mGy; P < .001) were also improved. The resampling rate was 16% and 21.8% for C-arm CT and QCA, respectively. The initial success rate of the performing radiologist remained stable during the study period (C-arm CT, 75%; QCA, 82.1%; P = .259). QCA might be superior to C-arm CT for improving the performance of AVS. • Adrenal venous sampling (AVS) is a technically challenging procedure. • C-arm CT and quick cortisol assay (QCA) are efficient for assisting AVS. • QCA might outperform C-arm CT in enhancing AVS performance.

  20. Evaluation of a Method for Nitrotyrosine Site Identification and Relative Quantitation Using a Stable Isotope-Labeled Nitrated Spike-In Standard and High Resolution Fourier Transform MS and MS/MS Analysis

    Directory of Open Access Journals (Sweden)

    Kent W. Seeley

    2014-04-01

    The overproduction of reactive oxygen and nitrogen species (ROS and RNS) can have deleterious effects in the cell, including structural and possible activity-altering modifications to proteins. Peroxynitrite is one such RNS that can result in a specific protein modification, nitration of tyrosine residues to form nitrotyrosine, and to date, the identification of nitrotyrosine sites in proteins continues to be a major analytical challenge. We have developed a method by which 15N-labeled nitrotyrosine groups are generated on peptide or protein standards using stable isotope-labeled peroxynitrite (O15NOO−), and the resulting standard is mixed with representative samples in which nitrotyrosine formation is to be measured by mass spectrometry (MS). Nitropeptide MS/MS spectra are filtered using high-mass-accuracy Fourier transform MS (FTMS) detection of the nitrotyrosine immonium ion. Given that the nitropeptide pair is co-isolated for MS/MS fragmentation, the nitrotyrosine immonium ions (at m/z = 181 or 182) can be used for relative quantitation with negligible isotopic interference at a mass resolution of greater than 50,000 (FWHM, full width at half-maximum). Furthermore, the standard potentially allows for increased signal of nitrotyrosine-containing peptides, thus facilitating their selection for MS/MS in a data-dependent mode of acquisition. We have evaluated the methodology in terms of nitrotyrosine site identification and relative quantitation using nitrated peptide and protein standards.
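
    A short worked check of the resolution claim above: the interference at nominal m/z 182 is the 13C isotope peak of the light (14N) immonium ion, which is offset from the 15N-labeled immonium ion only by the difference between the 13C-12C and 15N-14N mass increments. The sketch below estimates the FWHM resolving power needed to separate the two; the nominal m/z of 181 is a simplification, while the isotope mass increments are standard values.

        # Resolving the 15N nitrotyrosine immonium ion from the 13C isotopologue of the
        # unlabeled immonium ion, both falling at nominal m/z 182.
        C13_MINUS_C12 = 1.003355   # Da, mass increment for one 13C substitution
        N15_MINUS_N14 = 0.997035   # Da, mass increment for one 15N substitution

        mz_heavy = 181.0 + N15_MINUS_N14          # 15N-labeled immonium ion
        mz_interference = 181.0 + C13_MINUS_C12   # 13C isotope peak of the unlabeled ion
        delta = mz_interference - mz_heavy        # ≈ 0.0063 Da

        required_resolution = mz_heavy / delta    # FWHM resolving power needed
        print(f"Mass difference: {delta:.4f} Da")
        print(f"Required resolution (FWHM): ~{required_resolution:,.0f}")  # ~29,000, so >50,000 suffices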