WorldWideScience

Sample records for previous version comput

  1. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE... In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE...

  2. External cephalic version among women with a previous cesarean delivery: report on 36 cases and review of the literature.

    Science.gov (United States)

    Abenhaim, Haim A; Varin, Jocelyne; Boucher, Marc

    2009-01-01

Whether or not women with a previous cesarean section should be considered for an external cephalic version remains unclear. In our study, we sought to examine the relationship between a history of previous cesarean section and outcomes of external cephalic version for pregnancies at 36 completed weeks of gestation or more. Data on obstetrical history and on external cephalic version outcomes were obtained from the C.H.U. Sainte-Justine External Cephalic Version Database. Baseline clinical characteristics were compared among women with and without a history of previous cesarean section. We used logistic regression analysis to evaluate the effect of previous cesarean section on success of external cephalic version while adjusting for parity, maternal body mass index, gestational age, estimated fetal weight, and amniotic fluid index. Over a 15-year period, 1425 external cephalic versions were attempted, of which 36 (2.5%) were performed on women with a previous cesarean section. Although women with a history of previous cesarean section were more likely to be older and para >2 (38.93% vs. 15.0%), there was no difference in gestational age, estimated fetal weight, or amniotic fluid index. Women with a prior cesarean section had a success rate similar to that of women without [50.0% vs. 51.6%, adjusted OR: 1.31 (0.48-3.59)]. Women with a previous cesarean section who undergo an external cephalic version have success rates similar to those of women without. Concern about procedural success in women with a previous cesarean section is unwarranted and should not deter attempting an external cephalic version.
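
    The adjusted odds ratio quoted above comes from a logistic regression of version success on prior-cesarean status with parity, body mass index, gestational age, estimated fetal weight, and amniotic fluid index as covariates. A minimal sketch of that kind of adjustment, assuming a hypothetical data table with those column names (not the authors' actual dataset or analysis code):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data layout (column names assumed): one row per attempted
# version, with success and prior_cs coded 0/1 plus the covariates named
# in the abstract.
df = pd.read_csv("ecv_attempts.csv")  # success, prior_cs, parity, bmi,
                                      # ga_weeks, efw_g, afi_cm

# Logistic regression of version success on prior cesarean, adjusted for
# parity, BMI, gestational age, estimated fetal weight, and AFI.
model = smf.logit(
    "success ~ prior_cs + parity + bmi + ga_weeks + efw_g + afi_cm",
    data=df,
).fit()

# Adjusted odds ratio and 95% CI for prior cesarean section.
or_prior = np.exp(model.params["prior_cs"])
ci_low, ci_high = np.exp(model.conf_int().loc["prior_cs"])
print(f"adjusted OR {or_prior:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```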

  3. Trace contaminant control simulation computer program, version 8.1

    Science.gov (United States)

    Perry, J. L.

    1994-01-01

    The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various process technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. Included in the simulation are chemical and physical adsorption by activated charcoal, chemical adsorption by lithium hydroxide, absorption by humidity condensate, and low- and high-temperature catalytic oxidation. Means are provided for simulating regenerable as well as nonregenerable systems. The program provides an overall mass balance of chemical contaminants in a spacecraft cabin given specified generation rates. Removal rates are based on device flow rates specified by the user and calculated removal efficiencies based on cabin concentration and removal technology experimental data. Versions 1.0 through 8.0 are documented in NASA TM-108409. TM-108409 also contains a source file listing for version 8.0. Changes to version 8.0 are documented in this technical memorandum and a source file listing for the modified version, version 8.1, is provided. Detailed descriptions for the computer program subprograms are extracted from TM-108409 and modified as necessary to reflect version 8.1. Version 8.1 supersedes version 8.0. Information on a separate user's guide is available from the author.
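
    The cabin-level bookkeeping described above is, at its core, a mass balance: contaminant enters at a specified generation rate and is removed by each device at a rate set by its flow rate and removal efficiency. A minimal single-contaminant, single-device sketch of that balance (the variable names, values, and simple Euler integration are illustrative assumptions, not the TCCS program itself):

```python
# Lumped cabin mass balance: dC/dt = G/V - (Q * eta / V) * C
# C: cabin concentration (mg/m^3), G: generation rate (mg/h),
# V: cabin free volume (m^3), Q: device flow rate (m^3/h),
# eta: removal efficiency (0..1).

def cabin_concentration(G, V, Q, eta, C0=0.0, dt=0.1, hours=100.0):
    """Integrate the lumped cabin mass balance with forward Euler."""
    C, t, history = C0, 0.0, []
    while t <= hours:
        history.append((t, C))
        dCdt = G / V - (Q * eta / V) * C
        C += dCdt * dt
        t += dt
    return history

# Example: 1 mg/h source in a 100 m^3 cabin, 10 m^3/h device at 80% efficiency.
trace = cabin_concentration(G=1.0, V=100.0, Q=10.0, eta=0.8)
print(f"steady-state estimate: {trace[-1][1]:.3f} mg/m^3")  # ~ G/(Q*eta) = 0.125
```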

  4. Montage Version 3.0

    Science.gov (United States)

    Jacob, Joseph; Katz, Daniel; Prince, Thomas; Berriman, Graham; Good, John; Laity, Anastasia

    2006-01-01

The final version (3.0) of the Montage software has been released. To recapitulate from previous NASA Tech Briefs articles about Montage: This software generates custom, science-grade mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. This software can be executed on single-processor computers, multi-processor computers, and such networks of geographically dispersed computers as the National Science Foundation's TeraGrid or NASA's Information Power Grid. The primary advantage of running Montage in a grid environment is that computations can be done on a remote supercomputer for efficiency. Multiple computers at different sites can be used for different parts of a computation, a significant advantage in cases of computations for large mosaics that demand more processor time than is available at any one site. Version 3.0 incorporates several improvements over prior versions. The most significant improvement is that this version is accessible to scientists located anywhere, through operational Web services that provide access to data from several large astronomical surveys and construct mosaics on either local workstations or remote computational grids as needed.

  5. The NASA Ames PAH IR Spectroscopic Database: Computational Version 3.00 with Updated Content and the Introduction of Multiple Scaling Factors

    Science.gov (United States)

    Bauschlicher, Charles W., Jr.; Ricca, A.; Boersma, C.; Allamandola, L. J.

    2018-02-01

    Version 3.00 of the library of computed spectra in the NASA Ames PAH IR Spectroscopic Database (PAHdb) is described. Version 3.00 introduces the use of multiple scale factors, instead of the single scaling factor used previously, to align the theoretical harmonic frequencies with the experimental fundamentals. The use of multiple scale factors permits the use of a variety of basis sets; this allows new PAH species to be included in the database, such as those containing oxygen, and yields an improved treatment of strained species and those containing nitrogen. In addition, the computed spectra of 2439 new PAH species have been added. The impact of these changes on the analysis of an astronomical spectrum through database-fitting is considered and compared with a fit using Version 2.00 of the library of computed spectra. Finally, astronomical constraints are defined for the PAH spectral libraries in PAHdb.
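
    The key change in Version 3.00 is that a single uniform scale factor is replaced by several, applied to different parts of the computed harmonic spectrum before comparison with experiment. A minimal sketch of how such frequency-dependent scaling might be applied (the ranges and factor values below are invented placeholders, not the PAHdb values):

```python
# Hypothetical piecewise scaling: each (low, high, factor) triple scales
# computed harmonic frequencies falling in [low, high) cm^-1. The numbers
# here are placeholders, not the scale factors adopted in PAHdb v3.00.
SCALE_FACTORS = [
    (0.0, 1000.0, 0.98),     # far-/mid-IR skeletal modes (assumed)
    (1000.0, 2000.0, 0.965), # fingerprint region (assumed)
    (2000.0, 4000.0, 0.955), # C-H stretch region (assumed)
]

def scale_frequencies(harmonic_cm1):
    """Apply piecewise scale factors to a list of harmonic frequencies."""
    scaled = []
    for nu in harmonic_cm1:
        for low, high, factor in SCALE_FACTORS:
            if low <= nu < high:
                scaled.append(nu * factor)
                break
        else:
            scaled.append(nu)  # outside all ranges: leave unscaled
    return scaled

print(scale_frequencies([886.0, 1602.0, 3065.0]))
```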

  6. The HARWELL version of the computer code E-DEP-1

    International Nuclear Information System (INIS)

    Matthews, M.D.

    1983-03-01

    This document describes the modified HARWELL version of the computer program EDEP-1 which has been in use on the IBM Central Computer for some years. The program can be used to calculate heavy ion ranges and/or profiles of energy deposited into nuclear processes for a wide variety of ion/target combinations. The initial setting up of this program on the IBM Central Computer has been described in an earlier report. A second report was later issued to bring the first report up to date following changes to this code required to suit the needs of workers at HARWELL. This later report described in particular the provision of new electronic stopping powers and an alternative method for calculating the energy straggle of beam ions with depth in a target. This new report describes further extensions to the electronic stopping powers available in the HARWELL version of this program and, for the first time, gives details of alternative nuclear stopping powers now available. This new document is intended as a reference manual for the use of the HARWELL version of EDEP-1. In this respect this document should be the final report on the status of this program. (author)

  7. Space shuttle general purpose computers (GPCs) (current and future versions)

    Science.gov (United States)

    1988-01-01

    Current and future versions of general purpose computers (GPCs) for space shuttle orbiters are represented in this frame. The two boxes on the left (AP101B) represent the current GPC configuration, with the input-output processor at far left and the central processing unit (CPU) at its side. The upgraded version combines both elements in a single unit (far right, AP101S).

  8. Matched cohort study of external cephalic version in women with previous cesarean delivery.

    Science.gov (United States)

    Keepanasseril, Anish; Anand, Keerthana; Soundara Raghavan, Subrahmanian

    2017-07-01

    To evaluate the efficacy and safety of external cephalic version (ECV) among women with previous cesarean delivery. A retrospective study was conducted using data for women with previous cesarean delivery and breech presentation who underwent ECV at or after 36 weeks of pregnancy during 2011-2016. For every case, two multiparous women without previous cesarean delivery who underwent ECV and were matched for age and pregnancy duration were included. Characteristics and outcomes were compared between groups. ECV was successful for 32 (84.2%) of 38 women with previous cesarean delivery and 62 (81.6%) in the control group (P=0.728). Multivariate regression analysis confirmed that previous cesarean was not associated with ECV success (odds ratio 1.89, 95% confidence interval 0.19-18.47; P=0.244). Successful vaginal delivery after successful ECV was reported for 19 (59.4%) women in the previous cesarean delivery group and 52 (83.9%) in the control group (P<0.001). No ECV-associated complications occurred in women with previous cesarean delivery. To avoid a repeat cesarean delivery, ECV can be offered to women with breech presentation and previous cesarean delivery who are otherwise eligible for a trial of labor. © 2017 International Federation of Gynecology and Obstetrics.

  9. A PC [personal computer]-based version of KENO V.a

    International Nuclear Information System (INIS)

    Nigg, D.A.; Atkinson, C.A.; Briggs, J.B.; Taylor, J.T.

    1990-01-01

The use of personal computers (PCs) and engineering workstations for complex scientific computations has expanded rapidly in the last few years. This trend is expected to continue in the future with the introduction of increasingly sophisticated microprocessors and microcomputer systems. For a number of reasons, including security, economy, user convenience, and productivity, an integrated system of neutronics and radiation transport software suitable for operation in an IBM PC-class environment has been under development at the Idaho National Engineering Laboratory (INEL) for the past 3 yr. Nuclear cross-section data and resonance parameters are preprocessed from the Evaluated Nuclear Data Files Version 5 (ENDF/B-V) and supplied in a form suitable for use in a PC-based spectrum calculation and multigroup cross-section generation module. This module produces application-specific data libraries that can then be used in various neutron transport and diffusion theory code modules. This paper discusses several details of the Monte Carlo criticality module, which is based on the well-known, highly sophisticated KENO V.a package developed at Oak Ridge National Laboratory and previously released in mainframe form by the Radiation Shielding Information Center (RSIC). The conversion process and a variety of benchmarking results are described.

  10. COSY INFINITY Version 9

    International Nuclear Information System (INIS)

    Makino, Kyoko; Berz, Martin

    2006-01-01

In this paper, we review the features in the newly released version of COSY INFINITY, which currently has a base of more than 1000 registered users, focusing on the topics which are new and some topics which became available after the first release of the previous versions 8 and 8.1. The recent main enhancements of the code are devoted to reliability and efficiency of the computation, to verified integration, and to rigorous global optimization. There are various data types available in COSY INFINITY to support these goals, and the paper also reviews the features and usage of those data types.

  11. Validation of the Online version of the Previous Day Food Questionnaire for schoolchildren

    Directory of Open Access Journals (Sweden)

    Raquel ENGEL

ABSTRACT Objective: To evaluate the validity of the web-based version of the Previous Day Food Questionnaire Online for schoolchildren from the 2nd to 5th grades of elementary school. Methods: Participants were 312 schoolchildren aged 7 to 12 years of a public school from the city of Florianópolis, Santa Catarina, Brazil. Validity was assessed by sensitivity and specificity, as well as by agreement rates (match, omission, and intrusion rates) of food items reported by children on the Previous Day Food Questionnaire Online, using direct observation of foods/beverages eaten during school meals (mid-morning snack or afternoon snack) on the previous day as the reference. Multivariate multinomial logistic regression analysis was used to evaluate the influence of participants' characteristics on omission and intrusion rates. Results: The results showed adequate sensitivity (67.7%) and specificity (95.2%). There were low omission and intrusion rates of 22.8% and 29.5%, respectively, when all food items were analyzed. Pizza/hamburger showed the highest omission rate, whereas milk and milk products showed the highest intrusion rate. The participants who attended school in the afternoon shift presented a higher probability of intrusion compared to their peers who attended school in the morning. Conclusion: The Previous Day Food Questionnaire Online possessed satisfactory validity for the assessment of food intake at the group level in schoolchildren from the 2nd to 5th grades of public school.
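
    The validity figures quoted (sensitivity, specificity, and the match/omission/intrusion rates) follow directly from cross-tabulating the items each child reported against the items observed during the school meal. A small sketch of those computations for one child, with item sets as an assumed data representation and the usual definitions of each rate (an assumption about how the study computed them):

```python
def report_metrics(reported, observed, all_items):
    """Sensitivity, specificity, omission and intrusion rates for one child.

    reported/observed are sets of food items; all_items is the full
    questionnaire item list. Definitions here are the standard ones and
    are assumed rather than taken from the paper.
    """
    reported, observed = set(reported), set(observed)
    matches = reported & observed           # eaten and reported
    omissions = observed - reported         # eaten but not reported
    intrusions = reported - observed        # reported but not eaten
    true_neg = set(all_items) - reported - observed

    sensitivity = len(matches) / len(observed) if observed else float("nan")
    specificity = len(true_neg) / (len(true_neg) + len(intrusions))
    omission_rate = len(omissions) / len(observed) if observed else float("nan")
    intrusion_rate = len(intrusions) / len(reported) if reported else float("nan")
    return sensitivity, specificity, omission_rate, intrusion_rate

print(report_metrics({"rice", "milk"}, {"rice", "beans"},
                     ["rice", "beans", "milk", "pizza", "fruit"]))
```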

  12. Test-retest reliability and comparability of paper and computer questionnaires for the Finnish version of the Tampa Scale of Kinesiophobia.

    Science.gov (United States)

    Koho, P; Aho, S; Kautiainen, H; Pohjolainen, T; Hurri, H

    2014-12-01

    To estimate the internal consistency, test-retest reliability and comparability of paper and computer versions of the Finnish version of the Tampa Scale of Kinesiophobia (TSK-FIN) among patients with chronic pain. In addition, patients' personal experiences of completing both versions of the TSK-FIN and preferences between these two methods of data collection were studied. Test-retest reliability study. Paper and computer versions of the TSK-FIN were completed twice on two consecutive days. The sample comprised 94 consecutive patients with chronic musculoskeletal pain participating in a pain management or individual rehabilitation programme. The group rehabilitation design consisted of physical and functional exercises, evaluation of the social situation, psychological assessment of pain-related stress factors, and personal pain management training in order to regain overall function and mitigate the inconvenience of pain and fear-avoidance behaviour. The mean TSK-FIN score was 37.1 [standard deviation (SD) 8.1] for the computer version and 35.3 (SD 7.9) for the paper version. The mean difference between the two versions was 1.9 (95% confidence interval 0.8 to 2.9). Test-retest reliability was 0.89 for the paper version and 0.88 for the computer version. Internal consistency was considered to be good for both versions. The intraclass correlation coefficient for comparability was 0.77 (95% confidence interval 0.66 to 0.85), indicating substantial reliability between the two methods. Both versions of the TSK-FIN demonstrated substantial intertest reliability, good test-retest reliability, good internal consistency and acceptable limits of agreement, suggesting their suitability for clinical use. However, subjects tended to score higher when using the computer version. As such, in an ideal situation, data should be collected in a similar manner throughout the course of rehabilitation or clinical research. Copyright © 2014 Chartered Society of Physiotherapy. Published
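
    The comparability figure reported above is an intraclass correlation between paired paper and computer administrations. A minimal sketch of a two-way, absolute-agreement ICC for paired scores, using the generic Shrout-Fleiss ICC(2,1) formula (assumed as an illustration, not taken from the paper's analysis code; the score pairs are made up):

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rating.

    scores: array of shape (n_subjects, k_raters); here k=2 for the
    paper and computer versions of the TSK-FIN.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    ms_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    resid = (scores - scores.mean(axis=1, keepdims=True)
             - scores.mean(axis=0, keepdims=True) + grand)
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Paired (paper, computer) TSK-FIN totals for a few subjects (made-up numbers).
pairs = [(35, 37), (28, 30), (41, 44), (33, 32), (39, 41)]
print(f"ICC(2,1) = {icc_2_1(pairs):.2f}")
```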

  13. U.S. Army weapon systems human-computer interface style guide. Version 2

    Energy Technology Data Exchange (ETDEWEB)

Avery, L.W.; O'Mara, P.A.; Shepard, A.P.; Donohoo, D.T.

    1997-12-31

A stated goal of the US Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of HCI design guidance documents. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real time and near-real time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA), now termed the Joint Technical Architecture-Army (JTA-A). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an Army weapon systems unique HCI style guide, which resulted in the US Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide Version 1. Based on feedback from the user community, DISC4 further tasked PNNL to revise Version 1 and publish Version 2. The intent was to update some of the research and incorporate some enhancements. This document provides that revision. The purpose of this document is to provide HCI design guidance for the RT/NRT Army system domain across the weapon systems subdomains of ground, aviation, missile, and soldier systems. Each subdomain should customize and extend this guidance by developing its own domain-specific style guide, which will be used to guide the development of future systems within that subdomain.

  14. BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.

    1981-06-01

    This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given

  15. BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.

    1981-06-01

    This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given.

  16. New version: GRASP2K relativistic atomic structure package

    Science.gov (United States)

    Jönsson, P.; Gaigalas, G.; Bieroń, J.; Fischer, C. Froese; Grant, I. P.

    2013-09-01

Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 730252. No. of bytes in distributed program, including test data, etc.: 14808872. Distribution format: tar.gz. Programming language: Fortran. Computer: Intel Xeon, 2.66 GHz. Operating system: Suse, Ubuntu, and Debian Linux 64-bit. RAM: 500 MB or more. Classification: 2.1. Catalogue identifier of previous version: ADZL_v1_0. Journal reference of previous version: Comput. Phys. Comm. 177 (2007) 597. Does the new version supersede the previous version?: Yes. Nature of problem: Prediction of atomic properties — atomic energy levels, oscillator strengths, radiative decay rates, hyperfine structure parameters, Landé gJ-factors, and specific mass shift parameters — using a multiconfiguration Dirac-Hartree-Fock approach. Solution method: The computational method is the same as in the previous GRASP2K [1] version except that for v3 codes the njgraf library module [2] for recoupling has been replaced by librang [3,4]. Reasons for new version: New angular libraries with improved performance are available. Also, methodology for transforming from jj- to LSJ-coupling has been developed. Summary of revisions: New angular libraries where the coefficients of fractional parentage have been extended to j=9/2, making calculations feasible for the lanthanides and actinides. Inclusion of a new program jj2lsj, which reports the percentage composition of the wave function in LSJ. Transition programs have been modified to produce a file of transition data with one record for each transition in the same format as Atsp2K [C. Froese Fischer, G. Tachiev, G. Gaigalas and M.R. Godefroid, Comput. Phys. Commun. 176 (2007) 559], which identifies each atomic state by the total energy and a label for the CSF with the largest expansion coefficient in LSJ intermediate coupling. Updated to 64-bit architecture.

  17. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: Reason(s) why a posteriori verification is to be performed; Scope and objectives for the level of verification selected; Development products to be used for the review; Availability and use of user experience; and Actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. The development products that were used

  18. Independent validation testing of the FLAME computer code, Version 1.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-07-01

Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests (i.e., tests which compare field data to the computer-generated solutions) were used to determine the operational status of the FLAME computer code and were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability to simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. Three tests were performed, ranging in complexity from simple one-dimensional steady-state flow field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions.

  19. ClustalXeed: a GUI-based grid computation version for high performance and terabyte size multiple sequence alignment

    Directory of Open Access Journals (Sweden)

    Kim Taeho

    2010-09-01

Abstract Background: There is an increasing demand to assemble and align large-scale biological sequence data sets. The commonly used multiple sequence alignment programs are still limited in their ability to handle very large amounts of sequences because the system lacks a scalable high-performance computing (HPC) environment with a greatly extended data storage capacity. Results: We designed ClustalXeed, a software system for multiple sequence alignment with incremental improvements over previous versions of the ClustalX and ClustalW-MPI software. The primary advantage of ClustalXeed over other multiple sequence alignment software is its ability to align a large family of protein or nucleic acid sequences. To solve the conventional memory-dependency problem, ClustalXeed uses both physical random access memory (RAM) and a distributed file-allocation system for distance matrix construction and pair-align computation. The computation efficiency of the disk-storage system was markedly improved by implementing an efficient load-balancing algorithm, called the "idle node-seeking task algorithm" (INSTA). The new editing option and the graphical user interface (GUI) provide ready access to a parallel-computing environment for users who seek fast and easy alignment of large DNA and protein sequence sets. Conclusions: ClustalXeed can now compute a large volume of biological sequence data sets, which were not tractable in any other parallel or single MSA program. The main developments include: (1) the ability to tackle larger sequence alignment problems than possible with previous systems through markedly improved storage-handling capabilities; (2) implementation of an efficient task load-balancing algorithm, INSTA, which improves overall processing times for multiple sequence alignment with input sequences of non-uniform length; and (3) support for both single PC and distributed cluster systems.
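
    The abstract credits much of the speed-up to INSTA, which hands pending pair-alignment tasks to whichever node is idle. The exact algorithm is not given here; the sketch below is a generic greedy idle-node scheduler under that interpretation, with task costs approximated by sequence-length products (an assumption, not the published algorithm):

```python
import heapq

def assign_tasks(pair_tasks, n_nodes):
    """Greedy idle-node scheduling: give the next (largest) task to the
    node with the least accumulated work. pair_tasks is a list of
    (task_id, cost) tuples, e.g. cost ~ len(seq_a) * len(seq_b)."""
    # Min-heap of (accumulated_cost, node_id); the least-loaded node is "idle".
    nodes = [(0.0, node_id) for node_id in range(n_nodes)]
    heapq.heapify(nodes)
    assignment = {node_id: [] for node_id in range(n_nodes)}

    # Longest-processing-time-first ordering reduces the final imbalance.
    for task_id, cost in sorted(pair_tasks, key=lambda t: -t[1]):
        load, node_id = heapq.heappop(nodes)
        assignment[node_id].append(task_id)
        heapq.heappush(nodes, (load + cost, node_id))
    return assignment

tasks = [("a-b", 1200), ("a-c", 400), ("b-c", 900), ("a-d", 300), ("c-d", 800)]
print(assign_tasks(tasks, n_nodes=2))
```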

  20. Computational Ecology and Software (http://www.iaees.org/publications/journals/ces/online-version.asp)

    Directory of Open Access Journals (Sweden)

    ces@iaees.org

Computational Ecology and Software. ISSN 2220-721X. URL: http://www.iaees.org/publications/journals/ces/online-version.asp. RSS: http://www.iaees.org/publications/journals/ces/rss.xml. E-mail: ces@iaees.org. Editor-in-Chief: WenJun Zhang. Aims and Scope: COMPUTATIONAL ECOLOGY AND SOFTWARE (ISSN 2220-721X) is an open access, peer-reviewed online journal that considers scientific articles in all different areas of computational ecology. It is the transactions of the International Society of Computational Ecology. The journal is concerned with the ecological research, constructions and applications of theories and methods of computational sciences including computational mathematics, computational statistics and computer science. It features the simulation, approximation, prediction, recognition, and classification of ecological issues. Intensive computation is one of the major stresses of the journal. The journal welcomes research articles, short communications, review articles, perspectives, and book reviews. The journal also supports the activities of the International Society of Computational Ecology. The topics to be covered by CES include, but are not limited to: •Computation intensive methods, numerical and optimization methods, differential and difference equation modeling and simulation, prediction, recognition, classification, statistical computation (Bayesian computing, randomization, bootstrapping, Monte Carlo techniques, stochastic process, etc.), agent-based modeling, individual-based modeling, artificial neural networks, knowledge based systems, machine learning, genetic algorithms, data exploration, network analysis and computation, databases, ecological modeling and computation using Geographical Information Systems, satellite imagery, and other computation intensive theories and methods. •Artificial ecosystems, artificial life, complexity of ecosystems and virtual reality. •The development, evaluation and validation of software and

  1. DNAStat, version 2.1--a computer program for processing genetic profile databases and biostatistical calculations.

    Science.gov (United States)

    Berent, Jarosław

    2010-01-01

    This paper presents the new DNAStat version 2.1 for processing genetic profile databases and biostatistical calculations. The popularization of DNA studies employed in the judicial system has led to the necessity of developing appropriate computer programs. Such programs must, above all, address two critical problems, i.e. the broadly understood data processing and data storage, and biostatistical calculations. Moreover, in case of terrorist attacks and mass natural disasters, the ability to identify victims by searching related individuals is very important. DNAStat version 2.1 is an adequate program for such purposes. The DNAStat version 1.0 was launched in 2005. In 2006, the program was updated to 1.1 and 1.2 versions. There were, however, slight differences between those versions and the original one. The DNAStat version 2.0 was launched in 2007 and the major program improvement was an introduction of the group calculation options with the potential application to personal identification of mass disasters and terrorism victims. The last 2.1 version has the option of language selection--Polish or English, which will enhance the usage and application of the program also in other countries.

  2. User's Manual for LEWICE Version 3.2

    Science.gov (United States)

    Wright, William

    2008-01-01

    A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 3.2 of this software, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air applications, the addition of automated Navier-Stokes analysis, an empirical model for supercooled large droplets (SLD) and a pneumatic boot option. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT) as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this software.

  3. Global Precipitation Climatology Project (GPCP) - Monthly, Version 2.2 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Version 2.2 of the dataset has been superseded by a newer version. Users should not use version 2.2 except in rare cases (e.g., when reproducing previous studies...

  4. Global Historical Climatology Network (GHCN), Version 1 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  5. User Manual for the NASA Glenn Ice Accretion Code LEWICE: Version 2.0

    Science.gov (United States)

    Wright, William B.

    1999-01-01

A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 2.0 of this code, which is called LEWICE. This version differs from previous releases due to its robustness and its ability to reproduce results accurately for different spacing and time step criteria across computing platforms. It also differs in the extensive effort undertaken to compare the results against the database of ice shapes which have been generated in the NASA Glenn Icing Research Tunnel (IRT) 1. This report will only describe the features of the code related to the use of the program. The report will not describe the inner workings of the code or the physical models used. This information is available in the form of several unpublished documents which will be collectively referred to as a Programmers Manual for LEWICE 2 in this report. These reports are intended as an update/replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.

  6. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2009-03-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.
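
    The "decay and ingrowth during transport" step described above amounts to solving the radioactive-decay chain equations over the transit time. A minimal two-member (parent/daughter) Bateman sketch, purely as an illustration of that step rather than RSAC-7's own implementation (the example nuclides and rounded half-lives are assumptions for demonstration):

```python
import math

def parent_daughter(n1_0, lam1, lam2, t, n2_0=0.0):
    """Two-member Bateman solution: parent decays into daughter.

    n1_0, n2_0: initial atoms of parent/daughter; lam1, lam2: decay
    constants (1/s); t: elapsed transport time (s).
    """
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = (n2_0 * math.exp(-lam2 * t)
          + n1_0 * lam1 / (lam2 - lam1)
          * (math.exp(-lam1 * t) - math.exp(-lam2 * t)))
    return n1, n2

# Example: I-132 growing in from Te-132 over a 2-hour transit
# (half-lives of roughly 78 h and 2.3 h, rounded for illustration).
lam_te = math.log(2) / (78.0 * 3600)
lam_i = math.log(2) / (2.3 * 3600)
print(parent_daughter(1.0e20, lam_te, lam_i, t=2 * 3600))
```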

  7. Radiological Safety Analysis Computer (RSAC) Program Version 7.2 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2010-10-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.2 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.

  8. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users Manual

    International Nuclear Information System (INIS)

    Schrader, Bradley J.

    2009-01-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods

  9. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    Directory of Open Access Journals (Sweden)

    Mohammad Mohammadi

    2010-11-01

Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET, and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g. modes of test delivery, familiarity with computers, etc.), the question may be whether the two modes of computer- and paper-based tests comparably measure the same construct, and hence, whether the scores obtained from the two modes can be used interchangeably. Accordingly, the present study aimed to investigate the comparability of the paper- and computer-based versions of a writing test. The data for this study were collected by administering the writing section of a Cambridge Preliminary English Test (PET) to eighty Iranian intermediate EFL learners through the two modes of computer- and paper-based testing. Besides, a computer familiarity questionnaire was used to divide participants into two groups with high and low computer familiarity. The results of the independent samples t-test revealed that there was no statistically significant difference between the learners' computer- and paper-based writing scores. The results of the paired samples t-test showed no statistically significant difference between high- and low-computer-familiar groups on computer-based writing. The researchers concluded that the two modes comparably measured the same construct.
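
    Both t-tests mentioned in the abstract are one-line calls in standard statistics libraries. The sketch below simply shows the two SciPy calls on made-up scores; which comparison each test is applied to here is illustrative only and is not taken from the authors' analysis scripts:

```python
from scipy import stats

# Per-learner writing scores under the two delivery modes (made-up numbers).
paper =    [14, 12, 15, 11, 13, 16, 12, 14, 15, 13]
computer = [15, 12, 14, 12, 13, 15, 13, 14, 16, 13]

# Paired comparison: same learners measured under both modes.
t_paired, p_paired = stats.ttest_rel(paper, computer)

# Independent comparison: two different groups (e.g. high vs. low computer
# familiarity) compared on their computer-based scores (made-up numbers).
high_familiarity = [15, 14, 16, 13, 15]
low_familiarity =  [12, 13, 14, 12, 13]
t_ind, p_ind = stats.ttest_ind(high_familiarity, low_familiarity)

print(f"paired: t={t_paired:.2f}, p={p_paired:.3f}")
print(f"independent: t={t_ind:.2f}, p={p_ind:.3f}")
```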

  10. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    OpenAIRE

    Mohammad Mohammadi; Masoud Barzgaran

    2010-01-01

Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET, and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g. modes of test delivery, familiarity with computers, etc.), the question may be whether the two modes of computer- and paper-based te...

  11. DEVELOPMENT OF QUARRY SOLUTION VERSION 1.0 FOR QUICK COMPUTATION OF DRILLING AND BLASTING PARAMETERS

    Directory of Open Access Journals (Sweden)

    B. ADEBAYO

    2014-10-01

Computation of drilling cost, quantity of explosives and blasting cost are routine procedures in quarries, and all these parameters are estimated manually in most of the quarries in Nigeria. This paper deals with the development of the application package QUARRY SOLUTION Version 1.0 for quarries using Visual Basic 6.0. In order to achieve this, data on drilling and blasting activities were obtained from the quarry. Also, empirical formulae developed by different researchers were used for computation of the required parameters, viz: practical burden, spacing, length of hole, cost of drilling consumables, drilling cost, powder factor, quantity of column charge, total quantity of explosives, volume of blast and blasting cost. The output obtained from the software QUARRY SOLUTION Version 1.0 for length of drilling, drilling cost, total quantity of explosives, volume of blast and blasting cost was compared with the results manually computed for these routine parameters estimated during drilling and blasting operations in the quarry; it was then discovered that they followed the same trend. The computation from the application package revealed that 611 blast-holes require 3326.71 kg of high explosives (166 cartons of explosives) and 20147.2 kg of low explosives (806 bags of explosives). The total cost was computed to be N5,133,999.50 ($32,087.49). Moreover, the output showed that these routine parameters estimated during drilling and blasting could be computed within a short time frame using QUARRY SOLUTION, therefore improving productivity and efficiency. This application package is recommended for use in open pits and quarries when all necessary inputs are supplied.
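
    The routine quantities the package automates (burden, spacing, charge per hole, total explosive, blast volume, cost) chain together from a handful of inputs. A compact sketch of that chain using commonly quoted rules of thumb; the specific relations and coefficients below are assumptions for illustration, not the empirical formulae implemented in QUARRY SOLUTION:

```python
def blast_design(hole_dia_m, bench_height_m, n_holes, rock_density=2600.0,
                 charge_per_hole_kg=5.4, cost_per_kg=350.0):
    """Rule-of-thumb drilling/blasting quantities (illustrative only).

    Burden is taken as ~25 hole diameters and spacing as 1.25 x burden,
    which are common first estimates, not the program's formulae.
    """
    burden = 25.0 * hole_dia_m
    spacing = 1.25 * burden
    volume = burden * spacing * bench_height_m * n_holes      # m^3 of rock
    tonnes = volume * rock_density / 1000.0
    explosive = charge_per_hole_kg * n_holes                  # kg
    powder_factor = explosive / volume                        # kg/m^3
    blasting_cost = explosive * cost_per_kg                   # currency units
    return {"burden_m": burden, "spacing_m": spacing, "volume_m3": volume,
            "tonnes": tonnes, "explosive_kg": explosive,
            "powder_factor": powder_factor, "blasting_cost": blasting_cost}

print(blast_design(hole_dia_m=0.089, bench_height_m=10.0, n_holes=611))
```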

  12. Intruder dose pathway analysis for the onsite disposal of radioactive wastes: The ONSITE/MAXI1 computer program

    International Nuclear Information System (INIS)

    Kennedy, W.E. Jr.; Peloquin, R.A.; Napier, B.A.; Neuder, S.M.

    1987-02-01

    This document summarizes initial efforts to develop human-intrusion scenarios and a modified version of the MAXI computer program for potential use by the NRC in reviewing applications for onsite radioactive waste disposal. Supplement 1 of NUREG/CR-3620 (1986) summarized modifications and improvements to the ONSITE/MAXI1 software package. This document summarizes a modified version of the ONSITE/MAXI1 computer program. This modified version of the computer program operates on a personal computer and permits the user to optionally select radiation dose conversion factors published by the International Commission on Radiological Protection (ICRP) in their Publication No. 30 (ICRP 1979-1982) in place of those published by the ICRP in their Publication No. 2 (ICRP 1959) (as implemented in the previous versions of the ONSITE/MAXI1 computer program). The pathway-to-human models used in the computer program have not been changed from those described previously. Computer listings of the ONSITE/MAXI1 computer program and supporting data bases are included in the appendices of this document

  13. Dynamic Computation of Change Operations in Version Management of Business Process Models

    Science.gov (United States)

    Küster, Jochen Malte; Gerth, Christian; Engels, Gregor

    Version management of business process models requires that changes can be resolved by applying change operations. In order to give a user maximal freedom concerning the application order of change operations, position parameters of change operations must be computed dynamically during change resolution. In such an approach, change operations with computed position parameters must be applicable on the model and dependencies and conflicts of change operations must be taken into account because otherwise invalid models can be constructed. In this paper, we study the concept of partially specified change operations where parameters are computed dynamically. We provide a formalization for partially specified change operations using graph transformation and provide a concept for their applicability. Based on this, we study potential dependencies and conflicts of change operations and show how these can be taken into account within change resolution. Using our approach, a user can resolve changes of business process models without being unnecessarily restricted to a certain order.

  14. Backfilling the Grid with Containerized BOINC in the ATLAS computing

    CERN Document Server

    Wu, Wenjing; The ATLAS collaboration

    2018-01-01

Virtualization is a commonly used solution for utilizing opportunistic computing resources in the HEP field, as it provides a unified software and OS layer that the HEP computing tasks require over the heterogeneous opportunistic computing resources. However, there is always a performance penalty with virtualization; especially for the short jobs that are typical of volunteer computing tasks, the virtualization overhead becomes a large portion of the wall time and leads to low CPU efficiency of the jobs. With the wide usage of containers in HEP computing, we explore the possibility of adopting container technology in the ATLAS BOINC project; hence we implemented a Native version in BOINC, which uses the Singularity container or direct usage of the target OS to replace VirtualBox. In this paper, we will discuss 1) the implementation and workflow of the Native version in the ATLAS BOINC; 2) the performance measurement of the Native version compared to the previous virtualization version; 3)...

  15. The efficacy and safety of external cephalic version after a previous caesarean delivery.

    Science.gov (United States)

    Weill, Yishay; Pollack, Raphael N

    2017-06-01

    External cephalic version (ECV) in the presence of a uterine scar is still considered a relative contraindication despite encouraging studies of the efficacy and safety of this procedure. We present our experience with this patient population, which is the largest cohort published to date. To evaluate the efficacy and safety of ECV in the setting of a prior caesarean delivery. A total of 158 patients with a fetus presenting as breech, who had an unscarred uterus, had an ECV performed. Similarly, 158 patients with a fetus presenting as breech, and who had undergone a prior caesarean delivery also underwent an ECV. Outcomes were compared. ECV was successfully performed in 136/158 (86.1%) patients in the control group. Of these patients, 6/136 (4.4%) delivered by caesarean delivery. In the study group, 117/158 (74.1%) patients had a successful ECV performed. Of these patients, 12/117 (10.3%) delivered by caesarean delivery. There were no significant complications in either of the groups. ECV may be successfully performed in patients with a previous caesarean delivery. It is associated with a high success rate, and is not associated with an increase in complications. © 2016 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.

  16. Global Historical Climatology Network - Daily (GHCN-Daily), Version 2 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  17. Automated evaluation of matrix elements between contracted wavefunctions: A Mathematica version of the FRODO program

    Science.gov (United States)

    Angeli, C.; Cimiraglia, R.

    2013-02-01

A symbolic program performing the Formal Reduction of Density Operators (FRODO), formerly developed in the MuPAD computer algebra system with the purpose of evaluating the matrix elements of the electronic Hamiltonian between internally contracted functions in a complete active space (CAS) scheme, has been rewritten in Mathematica. New version program summary. Program title: FRODO. Catalogue identifier: ADVY_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVY_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 3878. No. of bytes in distributed program, including test data, etc.: 170729. Distribution format: tar.gz. Programming language: Mathematica. Computer: Any computer on which the Mathematica computer algebra system can be installed. Operating system: Linux. Classification: 5. Catalogue identifier of previous version: ADVY_v1_0. Journal reference of previous version: Comput. Phys. Comm. 171 (2005) 63. Does the new version supersede the previous version?: No. Nature of problem: In order to improve on the CAS-SCF wavefunction one can resort to multireference perturbation theory or configuration interaction based on internally contracted functions (ICFs), which are obtained by application of the excitation operators to the reference CAS-SCF wavefunction. The previous formulation of such matrix elements, in the MuPAD computer algebra system, has been rewritten using Mathematica. Solution method: The method adopted consists in successively eliminating all occurrences of inactive orbital indices (core and virtual) from the products of excitation operators which appear in the definition of the ICFs and in the electronic Hamiltonian expressed in the second quantization formalism. Reasons for new version: Some years ago we published in this journal a couple of papers [1, 2

  18. Simion 3D Version 6.0 User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dahl, D.A.

    1995-11-01

The original SIMION was an electrostatic lens analysis and design program developed by D.C. McGilvery at Latrobe University, Bundoora, Victoria, Australia, 1977. SIMION for the PC, developed at the Idaho National Engineering Laboratory, shares little more than its name with the original McGilvery version. INEL's fifth major SIMION release, version 6.0, represents a quantum improvement over previous versions. This C-based program can model complex problems using an ion optics workbench that can hold up to 200 2D and/or 3D electrostatic/magnetic potential arrays. Arrays can have up to 10,000,000 points. SIMION 3D's 32-bit virtual Graphics User Interface provides a highly interactive advanced user environment. All potential arrays are visualized as 3D objects that the user can cut away to inspect ion trajectories and potential energy surfaces. User programs have been greatly extended in versatility and power. A new geometry file option supports the definition of highly complex array geometry. Extensive algorithm modifications have dramatically improved this version's computational speed and accuracy.

  19. The Importance of Business Model Factors for Cloud Computing Adoption: Role of Previous Experiences

    Directory of Open Access Journals (Sweden)

    Bogataj Habjan Kristina

    2017-08-01

Background and Purpose: Bringing several opportunities for more effective and efficient IT governance and service exploitation, cloud computing is expected to impact the European and global economies significantly. Market data show that, despite many advantages and promised benefits, the adoption of cloud computing is not as fast and widespread as foreseen. This situation shows the need for further exploration of the potentials of cloud computing and its implementation on the market. The purpose of this research was to identify individual business model factors with the highest impact on cloud computing adoption. In addition, the aim was to identify the differences in opinion regarding the importance of business model factors on cloud computing adoption according to companies' previous experiences with cloud computing services.

  20. THEAP-I: A computer program for thermal hydraulic analysis of a thermally interacting channel bundle of complex geometry. The micro computer version user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Megaritou, A; Bartzis, J G

    1987-09-01

In the present report the micro-computer version of the code is described. More emphasis is given to the new features of the code (i.e. input data structure). A set of instructions for running on an IBM-AT2 computer with Microsoft FORTRAN V.4.0 is also included, together with a sample problem referring to the Greek Research Reactor.

  1. USER'S GUIDE TO THE PERSONAL COMPUTER VERSION OF THE BIOGENIC EMISSIONS INVENTORY SYSTEM (PC-BEIS2)

    Science.gov (United States)

    The document is a user's guide for an updated Personal Computer version of the Biogenic Emissions Inventory System (PC-BEIS2), allowing users to estimate hourly emissions of biogenic volatile organic compounds (BVOCs) and soil nitrogen oxide emissions for any county in the contig...

  2. A one-dimensional material transfer model for HECTR version 1.5

    International Nuclear Information System (INIS)

    Geller, A.S.; Wong, C.C.

    1991-08-01

HECTR (Hydrogen Event Containment Transient Response) is a lumped-parameter computer code developed for calculating the pressure-temperature response to combustion in a nuclear power plant containment building. The code uses a control-volume approach and subscale models to simulate the mass, momentum, and energy transfer occurring in the containment during a loss-of-coolant accident (LOCA). This document describes one-dimensional subscale models for mass and momentum transfer, and the modifications to the code required to implement them. Two problems were analyzed: the first corresponding to a standard problem studied with previous HECTR versions, the second to experiments. The performance of the revised code relative to previous HECTR versions is discussed, as is the ability of the code to model the experiments. 8 refs., 5 figs., 3 tabs

  3. Program package for multicanonical simulations of U(1) lattice gauge theory-Second version

    Science.gov (United States)

    Bazavov, Alexei; Berg, Bernd A.

    2013-03-01

    A new version STMCMUCA_V1_1 of our program package is available. It eliminates compatibility problems of our Fortran 77 code, originally developed for the g77 compiler, with Fortran 90 and 95 compilers. New version program summaryProgram title: STMC_U1MUCA_v1_1 Catalogue identifier: AEET_v1_1 Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html Programming language: Fortran 77 compatible with Fortran 90 and 95 Computers: Any capable of compiling and executing Fortran code Operating systems: Any capable of compiling and executing Fortran code RAM: 10 MB and up depending on lattice size used No. of lines in distributed program, including test data, etc.: 15059 No. of bytes in distributed program, including test data, etc.: 215733 Keywords: Markov chain Monte Carlo, multicanonical, Wang-Landau recursion, Fortran, lattice gauge theory, U(1) gauge group, phase transitions of continuous systems Classification: 11.5 Catalogue identifier of previous version: AEET_v1_0 Journal Reference of previous version: Computer Physics Communications 180 (2009) 2339-2347 Does the new version supersede the previous version?: Yes Nature of problem: Efficient Markov chain Monte Carlo simulation of U(1) lattice gauge theory (or other continuous systems) close to its phase transition. Measurements and analysis of the action per plaquette, the specific heat, Polyakov loops and their structure factors. Solution method: Multicanonical simulations with an initial Wang-Landau recursion to determine suitable weight factors. Reweighting to physical values using logarithmic coding and calculating jackknife error bars. Reasons for the new version: The previous version was developed for the g77 compiler Fortran 77 version. Compiler errors were encountered with Fortran 90 and Fortran 95 compilers (specified below). Summary of revisions: epsilon=one/10**10 is replaced by epsilon/10.0D10 in the parameter statements of the subroutines u1_bmha.f, u1_mucabmha.f, u1wl
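
    Of the analysis steps listed (multicanonical weights, reweighting to physical values, jackknife error bars), the jackknife is the easiest to show in a few lines. Below is a generic delete-one jackknife sketch for the error of a mean, the standard construction rather than the package's Fortran routines; the sample data are made up:

```python
import numpy as np

def jackknife_error(samples, estimator=np.mean):
    """Delete-one jackknife estimate of the error of `estimator(samples)`."""
    samples = np.asarray(samples, dtype=float)
    n = len(samples)
    # Estimator evaluated on each leave-one-out subsample.
    loo = np.array([estimator(np.delete(samples, i)) for i in range(n)])
    center = estimator(samples)
    variance = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)
    return center, np.sqrt(variance)

# Example: action-per-plaquette-like measurements (made-up numbers).
rng = np.random.default_rng(0)
data = 0.65 + 0.01 * rng.standard_normal(200)
value, err = jackknife_error(data)
print(f"{value:.4f} +/- {err:.4f}")
```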

  4. Multi keno-VAX a modified version of the reactor computer code Multi keno-2

    Energy Technology Data Exchange (ETDEWEB)

    Imam, M [National center for nuclear safety and radiation control, atomic energy authority, Cairo, (Egypt)

    1995-10-01

    The reactor computer code Multi keno-2 was developed in Japan from the original Monte Carlo code KENO-IV. In applying this code to some real problems, fatal errors were detected. These errors are related to the restart option in the code. The restart option is essential for solving time-consuming problems on minicomputers such as the VAX-6320. These errors were corrected and other modifications were carried out in the code. Because of these modifications, a new input data description was written for the code. Thus a new VAX/VMS version of the program was developed, which is also adaptable to mini-mainframes. This newly developed program, called Multi keno-VAX, has been accepted into the NEA-IAEA data bank and added to its international computer codes library. 1 fig.

  5. Multi keno-VAX a modified version of the reactor computer code Multi keno-2

    International Nuclear Information System (INIS)

    Imam, M.

    1995-01-01

    The reactor computer code Multi keno-2 was developed in Japan from the original Monte Carlo code KENO-IV. In applying this code to some real problems, fatal errors were detected. These errors are related to the restart option in the code. The restart option is essential for solving time-consuming problems on minicomputers such as the VAX-6320. These errors were corrected and other modifications were carried out in the code. Because of these modifications, a new input data description was written for the code. Thus a new VAX/VMS version of the program was developed, which is also adaptable to mini-mainframes. This newly developed program, called Multi keno-VAX, has been accepted into the NEA-IAEA data bank and added to its international computer codes library. 1 fig.

  6. Planetary Mission Entry Vehicles Quick Reference Guide. Version 3.0

    Science.gov (United States)

    Davies, Carol; Arcadi, Marla

    2006-01-01

    This is Version 3.0 of the planetary mission entry vehicle document. Three new missions, Re-entry F, Hayabusa, and ARD, have been added to the previously published edition (Version 2.1). In addition, the Huygens mission has been significantly updated and some Apollo data corrected. Due to the changing nature of planetary vehicles during the design, manufacture and mission phases, and to the variables involved in measurement and computation, please be aware that the data provided herein cannot be guaranteed. Contact Carol Davies at cdavies@mail.arc.nasa.gov to correct or update the current data, or to suggest other missions.

  7. NCDC International Best Track Archive for Climate Stewardship (IBTrACS) Project, Version 2 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Version 2 of the dataset has been superseded by a newer version. Users should not use version 2 except in rare cases (e.g., when reproducing previous studies that...

  8. NCDC International Best Track Archive for Climate Stewardship (IBTrACS) Project, Version 1 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Version 1 of the dataset has been superseded by a newer version. Users should not use version 1 except in rare cases (e.g., when reproducing previous studies that...

  9. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the se...

  10. The comparison of CAP88-PC version 2.0 versus CAP88-PC version 1.0

    International Nuclear Information System (INIS)

    Yakubovich, B.A.; Klee, K.O.; Palmer, C.R.; Spotts, P.B.

    1997-12-01

    40 CFR Part 61 (Subpart H of the NESHAP) requires DOE facilities to use approved sampling procedures, computer models, or other approved procedures when calculating Effective Dose Equivalent (EDE) values to members of the public. Currently, version 1.0 of the approved computer model CAP88-PC is used to calculate EDE values. The DOE has upgraded the CAP88-PC software to version 2.0. This version provides simplified data entry, better printing characteristics, the use of a mouse, and other features. The DOE has developed and released version 2.0 for testing and comment. This new software is a Windows-based application that offers a new graphical user interface with new utilities for preparing and managing population and weather data, and several new decay chains. The program also allows the user to view results before printing. This document describes a test that confirmed CAP88-PC version 2.0 generates results comparable to the original version of the CAP88-PC program.

  11. Verification and validation of RADMODL Version 1.0

    International Nuclear Information System (INIS)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  12. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  13. QDENSITY—A Mathematica quantum computer simulation

    Science.gov (United States)

    Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank

    2009-03-01

    This Mathematica 6.0 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc. New version program summaryProgram title: QDENSITY 2.0 Catalogue identifier: ADXH_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 26 055 No. of bytes in distributed program, including test data, etc.: 227 540 Distribution format: tar.gz Programming language: Mathematica 6.0 Operating system: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4 Catalogue identifier of previous version: ADXH_v1_0 Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914 Classification: 4.15 Does the new version supersede the previous version?: Offers an alternative, more up to date, implementation Nature of problem: Analysis and design of quantum circuits, quantum algorithms and quantum clusters. Solution method: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples: Teleportation, Shor's Algorithm and Grover's search are explained in detail. A tutorial, Tutorial.nb is also enclosed. Reasons for new version: The package has been updated to make it fully compatible with Mathematica 6.0 Summary of revisions: The package has been updated to make it fully compatible with Mathematica 6.0 Running time: Most examples
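
    The density-matrix formulation that the package emphasizes is easy to mirror outside Mathematica. The following minimal NumPy sketch prepares a Bell state and evaluates a measurement probability with a projector, in the same spirit as the package's kets, gates, and projectors; the names and the plain-matrix representation are ours for illustration, not QDENSITY commands.

```python
import numpy as np

# Minimal density-matrix treatment of a two-qubit circuit (Bell-state preparation).
ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

psi0 = np.kron(ket0, ket0)                            # |00>
rho = np.outer(psi0, psi0)                            # density matrix |00><00|

U = CNOT @ np.kron(H, np.eye(2))                      # H on qubit 1, then CNOT
rho = U @ rho @ U.conj().T

# probability of measuring |11>: trace of the projector times rho
P11 = np.outer(np.kron([0, 1], [0, 1]), np.kron([0, 1], [0, 1]))
print("P(11) =", np.trace(P11 @ rho))                 # 0.5 for a Bell state
```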

  14. New version of PLNoise: a package for exact numerical simulation of power-law noises

    Science.gov (United States)

    Milotti, Edoardo

    2007-08-01

    installed on the target machine. No. of lines in distributed program, including test data, etc.: 2975 No. of bytes in distributed program, including test data, etc.: 194 588 Distribution format: tar.gz Catalogue identifier of previous version: ADXV_v1_0 Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 212 Does the new version supersede the previous version?: Yes Nature of problem: Exact generation of different types of colored noise. Solution method: Random superposition of relaxation processes [E. Milotti, Phys. Rev. E 72 (2005) 056701], possibly followed by an integration step to produce noise with spectral index >2. Reasons for the new version: Extension to 1/f noises with spectral index 2<α⩽4: the new version generates both noises with spectral index 0<α⩽2 and noises with spectral index 2<α⩽4. Summary of revisions: Although the overall structure remains the same, one routine has been added and several changes have been made throughout the code to include the new integration step. Unusual features: The algorithm is theoretically guaranteed to be exact, and unlike all other existing generators it can generate samples with uneven spacing. Additional comments: The program requires an initialization step; for some parameter sets this may become rather heavy. Running time: Running time varies widely with different input parameters; however, in a test run like the one in Section 3 of the long write-up, the generation routine took on average about 75 μs for each sample.
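
    The stated solution method, a random superposition of relaxation processes, can be illustrated with a deliberately simplified, evenly sampled approximation: summing first-order (AR(1)) relaxation processes whose rates are spread logarithmically yields an approximately 1/f spectrum between the smallest and largest rate. The Python sketch below is such a toy, not Milotti's exact, unevenly spaced generator, and all constants in it are illustrative.

```python
import numpy as np

# Evenly sampled toy version of the "superposition of relaxation processes" idea.
rng = np.random.default_rng(1)
n_samples, dt = 2 ** 16, 1.0
rates = np.logspace(-4, 0, 20)                 # relaxation rates, uniform in log
phis = np.exp(-rates * dt)                     # AR(1) coefficients
weights = np.sqrt(rates)                       # this weighting makes the summed spectrum ~ 1/f

x = rng.normal(size=rates.size) / np.sqrt(1.0 - phis ** 2)   # start near stationarity
noise = np.empty(n_samples)
for t in range(n_samples):
    x = phis * x + rng.normal(size=rates.size)
    noise[t] = np.dot(weights, x)

# crude check of the spectral slope in the 1/f range
freqs = np.fft.rfftfreq(n_samples, dt)[1:]
psd = np.abs(np.fft.rfft(noise - noise.mean()))[1:] ** 2
band = (freqs > 1e-3) & (freqs < 3e-2)
slope = np.polyfit(np.log(freqs[band]), np.log(psd[band]), 1)[0]
print(f"fitted spectral index ~ {slope:.2f} (roughly -1 expected)")
```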

  15. A Computer Adaptive Testing Version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT

    Science.gov (United States)

    Butler, Stephen F.; Black, Ryan A.; McCaffrey, Stacey A.; Ainscough, Jessica; Doucette, Ann M.

    2017-01-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV®), the Addiction Severity CAT. This goal was accomplished in four steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large non-clinical (n =4419) and substance abuse treatment sample (n =845). Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent/discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT’s time of administration was found to be significantly less than the average time of administration for the ASI-MV composite scores. This study represents the initial validation of an IRT-based Addiction Severity CAT, and further exploration of the Addiction Severity CAT is needed. PMID:28230387

  16. A computer adaptive testing version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT.

    Science.gov (United States)

    Butler, Stephen F; Black, Ryan A; McCaffrey, Stacey A; Ainscough, Jessica; Doucette, Ann M

    2017-05-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV), the Addiction Severity CAT. This goal was accomplished in 4 steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large nonclinical (n = 4,419) and substance abuse treatment (n = 845) sample. Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted, and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent and discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT's time of completion was found to be significantly less than the average time of completion for the ASI-MV composite scores. This study represents the initial validation of an Addiction Severity CAT based on item response theory, and further exploration of the Addiction Severity CAT is needed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
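
    The adaptive loop such a CAT runs, namely re-estimate the latent severity after each response, administer the most informative remaining item, and stop once the standard error is small enough, can be sketched under a two-parameter logistic IRT model. The item parameters, the simulated respondent, and the termination threshold below are invented for illustration; they are not the calibrated Addiction Severity CAT item bank or its published stopping rules.

```python
import numpy as np

# Skeleton of a computer adaptive testing (CAT) loop under a 2PL IRT model.
rng = np.random.default_rng(42)
n_items = 200
a = rng.uniform(0.8, 2.0, n_items)        # discrimination (hypothetical item bank)
b = rng.normal(0.0, 1.0, n_items)         # difficulty
theta_true = 0.7                          # simulated respondent severity

def p_endorse(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def estimate_theta(admins, resp, grid=np.linspace(-4, 4, 401)):
    # maximum-likelihood estimate of theta on a grid
    p = p_endorse(grid[:, None], a[admins], b[admins])
    loglik = np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p), axis=1)
    return grid[np.argmax(loglik)]

administered, responses = [], []
theta_hat = 0.0
while len(administered) < 30:
    p_all = p_endorse(theta_hat, a, b)
    info = a ** 2 * p_all * (1 - p_all)
    info[administered] = -np.inf                 # do not repeat items
    nxt = int(np.argmax(info))                   # maximum-information item selection
    administered.append(nxt)
    responses.append(int(rng.random() < p_endorse(theta_true, a[nxt], b[nxt])))
    theta_hat = estimate_theta(administered, np.array(responses))
    p_adm = p_endorse(theta_hat, a[administered], b[administered])
    se = 1.0 / np.sqrt(np.sum(a[administered] ** 2 * p_adm * (1 - p_adm)))
    if se < 0.3:                                 # illustrative termination criterion
        break

print(f"items used: {len(administered)}, estimated severity: {theta_hat:.2f}")
```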

  17. A new version of Scilab software package for the study of dynamical systems

    Science.gov (United States)

    Bordeianu, C. C.; Felea, D.; Beşliu, C.; Jipa, Al.; Grossu, I. V.

    2009-11-01

    This work presents a new version of a software package for the study of chaotic flows, maps and fractals [1]. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. It was found that Scilab provides various functions for ordinary differential equation solving, Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behaviors of the nonlinear dynamics systems were analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropy. Various well-known examples are implemented, with the capability of the users inserting their own ODE or iterative equations. New version program summaryProgram title: Chaos v2.0 Catalogue identifier: AEAP_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEAP_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1275 No. of bytes in distributed program, including test data, etc.: 7135 Distribution format: tar.gz Programming language: Scilab 5.1.1. Scilab 5.1.1 should be installed before running the program. Information about the installation can be found at scilab.org/howto/install/windows" xlink:type="simple">http://wiki.scilab.org/howto/install/windows. Computer: PC-compatible running Scilab on MS Windows or Linux Operating system: Windows XP, Linux RAM: below 150 Megabytes Classification: 6.2 Catalogue identifier of previous version: AEAP_v1_0 Journal reference of previous version: Comput. Phys. Comm. 178 (2008) 788 Does the new version supersede the previous version?: Yes Nature of problem: Any physical model containing linear or nonlinear ordinary differential equations (ODE). Solution method: Numerical solving of
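
    As one concrete example of the diagnostics listed above, the largest Lyapunov exponent of a one-dimensional map can be estimated by averaging the logarithm of the map's derivative along an orbit. The sketch below does this for the logistic map in Python; it is a generic illustration of the analysis, not code from the Scilab package.

```python
import numpy as np

# Largest Lyapunov exponent of the logistic map x -> r*x*(1-x), computed by
# averaging log|f'(x)| along an orbit after discarding a transient.
def lyapunov_logistic(r, n_transient=1000, n_iter=100000, x0=0.2):
    x = x0
    for _ in range(n_transient):
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        acc += np.log(abs(r * (1.0 - 2.0 * x)))   # |f'(x)| = |r(1-2x)|
    return acc / n_iter

for r in (3.2, 3.5, 4.0):
    print(f"r = {r}:  lambda = {lyapunov_logistic(r):+.3f}")
# periodic windows give lambda < 0; fully developed chaos at r = 4 gives ln 2 ~ 0.693
```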

  18. Value of computed tomography pelvimetry in patients with a previous cesarean section

    International Nuclear Information System (INIS)

    Yamani, Tarik Y.; Rouzi, Abdulrahim A.

    1998-01-01

    A case-control study was conducted at the Department of Obstetrics and Gynaecology, King Abdulaziz University Hospital, Jeddah, Saudi Arabia, to determine the value of computed tomography pelvimetry in patients with a previous cesarean section. Between January 1993 and December 1995, 219 pregnant women with one previous cesarean had antenatal CT pelvimetry for assessment of the pelvis. One hundred and nineteen women did not have CT pelvimetry and served as controls. Fifty-one women (51%) in the CT pelvimetry group were delivered by cesarean section. Twenty-three women (23%) underwent elective cesarean section for contracted pelvis based upon the findings of CT pelvimetry, and 28 women (28%) underwent emergency cesarean section after trial of labor. In the group who did not have CT pelvimetry, 26 women (21.8%) underwent emergency cesarean section. This was a statistically significant difference (P=0.02). There were no statistically significant differences in birthweight and Apgar scores in either group. There was no prenatal or maternal mortality in this study. Computed tomography pelvimetry increased the rate of cesarean delivery without any benefit in the immediate delivery outcomes. Therefore, the practice of documenting the adequacy of the pelvis by CT pelvimetry before vaginal birth after cesarean should be abandoned. (author)

  19. User's Guide for TOUGH2-MP - A Massively Parallel Version of the TOUGH2 Code

    International Nuclear Information System (INIS)

    Earth Sciences Division; Zhang, Keni; Zhang, Keni; Wu, Yu-Shu; Pruess, Karsten

    2008-01-01

    TOUGH2-MP is a massively parallel (MP) version of the TOUGH2 code, designed for computationally efficient parallel simulation of isothermal and nonisothermal flows of multicomponent, multiphase fluids in one, two, and three-dimensional porous and fractured media. In recent years, computational requirements have become increasingly intensive in large or highly nonlinear problems for applications in areas such as radioactive waste disposal, CO2 geological sequestration, environmental assessment and remediation, reservoir engineering, and groundwater hydrology. The primary objective of developing the parallel-simulation capability is to significantly improve the computational performance of the TOUGH2 family of codes. The particular goal for the parallel simulator is to achieve orders-of-magnitude improvement in computational time for models with ever-increasing complexity. TOUGH2-MP is designed to perform parallel simulation on multi-CPU computational platforms. An earlier version of TOUGH2-MP (V1.0) was based on the TOUGH2 Version 1.4 with EOS3, EOS9, and T2R3D modules, a software previously qualified for applications in the Yucca Mountain project, and was designed for execution on CRAY T3E and IBM SP supercomputers. The current version of TOUGH2-MP (V2.0) includes all fluid property modules of the standard version TOUGH2 V2.0. It provides computationally efficient capabilities using supercomputers, Linux clusters, or multi-core PCs, and also offers many user-friendly features. The parallel simulator inherits all process capabilities from V2.0 together with additional capabilities for handling fractured media from V1.4. This report provides a quick starting guide on how to set up and run the TOUGH2-MP program for users with a basic knowledge of running the (standard) version TOUGH2 code. The report also gives a brief technical description of the code, including a discussion of parallel methodology, code structure, as well as mathematical and numerical methods used

  20. NOAA Climate Data Record (CDR) of Ocean Heat Fluxes, Version 1.0 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  1. An improved computational version of the LTSN method to solve transport problems in a slab

    International Nuclear Information System (INIS)

    Cardona, Augusto V.; Oliveira, Jose Vanderlei P. de; Vilhena, Marco Tullio de; Segatto, Cynthia F.

    2008-01-01

    In this work, we present an improved computational version of the LTSN method to solve transport problems in a slab. The key feature relies on the reordering of the set of SN equations. This procedure reduces by a factor of two the task of evaluating the eigenvalues of the matrix associated with the SN approximations. We present numerical simulations and comparisons with those of the classical LTSN approach. (author)

  2. NOAA Climate Data Record (CDR) of Sea Surface Temperature - WHOI, Version 1.0 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  3. NOAA Climate Data Record (CDR) of Ocean Near Surface Atmospheric Properties, Version 1 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  4. User Manual for the NASA Glenn Ice Accretion Code LEWICE. Version 2.2.2

    Science.gov (United States)

    Wright, William B.

    2002-01-01

    A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 2.2.2 of this code, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air applications. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT) as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A of this report has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this code.

  5. US Army Weapon Systems Human-Computer Interface (WSHCI) style guide, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    Avery, L.W.; O'Mara, P.A.; Shepard, A.P.

    1996-09-30

    A stated goal of the U.S. Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of style guides. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the unique requirements of the Army's real-time and near-real-time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an Army weapon-systems-specific HCI style guide. This document, the U.S. Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide, represents the first version of that style guide. The purpose of this document is to provide HCI design guidance for RT/NRT Army systems across the weapon systems domains of ground, aviation, missile, and soldier systems. Each domain should customize and extend this guidance by developing its own domain-specific style guide, which will be used to guide the development of future systems within that domain.

  6. JaxoDraw: A graphical user interface for drawing Feynman diagrams. Version 2.0 release notes

    Science.gov (United States)

    Binosi, D.; Collins, J.; Kaufhold, C.; Theussl, L.

    2009-09-01

    A new version of the Feynman graph plotting tool JaxoDraw is presented. Version 2.0 is a fundamental re-write of most of the JaxoDraw core and some functionalities, in particular importing graphs, are not backward-compatible with the 1.x branch. The most prominent new features include: drawing of Bézier curves for all particle modes, on-the-fly update of edited objects, multiple undo/redo functionality, the addition of a plugin infrastructure, and a general improved memory performance. A new LaTeX style file is presented that has been written specifically on top of the original axodraw.sty to meet the needs of this new version. New version program summaryProgram title: JaxoDraw Catalogue identifier: ADUA_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADUA_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL No. of lines in distributed program, including test data, etc.: 103 544 No. of bytes in distributed program, including test data, etc.: 3 745 814 Distribution format: tar.gz Programming language: Java Computer: Any Java-enabled platform Operating system: Any Java-enabled platform, tested on Linux, Windows XP, Mac OS X Classification: 14 Catalogue identifier of previous version: ADUA_v1_0 Journal reference of previous version: Comput. Phys. Comm. 161 (2004) 76 Does the new version supersede the previous version?: Yes Nature of problem: Existing methods for drawing Feynman diagrams usually require some hard-coding in one or the other programming or scripting language. It is not very convenient and often time consuming, to generate relatively simple diagrams. Solution method: A program is provided that allows for the interactive drawing of Feynman diagrams with a graphical user interface. The program is easy to learn and use, produces high quality output in several formats and runs on any operating system where a Java Runtime Environment is available. Reasons for new version: A

  7. Certification of version 1.2 of the PORFLO-3 code for the WHC scientific and engineering computational center

    International Nuclear Information System (INIS)

    Kline, N.W.

    1994-01-01

    Version 1.2 of the PORFLO-3 Code has migrated from the Hanford Cray computer to workstations in the WHC Scientific and Engineering Computational Center. The workstation-based configuration and acceptance testing are inherited from the CRAY-based configuration. The purpose of this report is to document differences in the new configuration as compared to the parent Cray configuration, and summarize some of the acceptance test results which have shown that the migrated code is functioning correctly in the new environment

  8. DEVELOPMENT OF QUARRY SOLUTION VERSION 1.0 FOR QUICK COMPUTATION OF DRILLING AND BLASTING PARAMETERS

    OpenAIRE

    B. ADEBAYO; A. W. BELLO

    2014-01-01

    Computation of drilling cost, quantity of explosives, and blasting cost are routine procedures in a quarry, and all these parameters are estimated manually in most of the quarries in Nigeria. This paper deals with the development of the application package QUARRY SOLUTION Version 1.0 for quarries using Visual Basic 6.0. In order to achieve this, data such as drilling and blasting activities were obtained from the quarry. Also, empirical formulae developed by different researchers were used for computat...

  9. Multithreaded transactions in scientific computing. The Growth06_v2 program

    Science.gov (United States)

    Daniluk, Andrzej

    2009-07-01

    Writing a concurrent program can be more difficult than writing a sequential program. Programmer needs to think about synchronization, race conditions and shared variables. Transactions help reduce the inconvenience of using threads. A transaction is an abstraction, which allows programmers to group a sequence of actions on the program into a logical, higher-level computation unit. This paper presents a new version of the GROWTHGr and GROWTH06 programs. New version program summaryProgram title: GROWTH06_v2 Catalogue identifier: ADVL_v2_1 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADVL_v2_1.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 65 255 No. of bytes in distributed program, including test data, etc.: 865 985 Distribution format: tar.gz Programming language: Object Pascal Computer: Pentium-based PC Operating system: Windows 9x, XP, NT, Vista RAM: more than 1 MB Classification: 4.3, 7.2, 6.2, 8, 14 Catalogue identifier of previous version: ADVL_v2_0 Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 678 Does the new version supersede the previous version?: Yes Nature of problem: The programs compute the RHEED intensities during the growth of thin epitaxial structures prepared using the molecular beam epitaxy (MBE). The computations are based on the use of kinematical diffraction theory. Solution method: Epitaxial growth of thin films is modelled by a set of non-linear differential equations [1]. The Runge-Kutta method with adaptive stepsize control was used for solving initial value problem for non-linear differential equations [2]. Reasons for new version: According to the users' suggestions functionality of the program has been improved. Moreover, new use cases have been added which make the handling of the program easier and more

  10. CASKETSS-2: a computer code system for thermal and structural analysis of nuclear fuel shipping casks (version 2)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1991-08-01

    The computer program CASKETSS-2 has been developed for the thermal and structural analysis of nuclear fuel shipping casks. CASKETSS-2 denotes a modular CASK Evaluation code system for Thermal and Structural Safety (Version 2). The main features of CASKETSS-2 are as follows: (1) thermal and structural analysis computer programs for one-, two-, and three-dimensional geometries are contained in the code system; (2) both simplified computer programs and a detailed one are included in the structural analysis part of the code system; (3) an input data generator is provided in the code system; (4) a graphic computer program is provided in the code system. In the paper, a brief illustration of the calculation method, input data, and sample calculations is presented. (author)

  11. COMODI: an ontology to characterise differences in versions of computational models in biology.

    Science.gov (United States)

    Scharm, Martin; Waltemath, Dagmar; Mendes, Pedro; Wolkenhauer, Olaf

    2016-07-11

    Open model repositories provide ready-to-reuse computational models of biological systems. Models within those repositories evolve over time, leading to different model versions. Taken together, the underlying changes reflect a model's provenance and thus can give valuable insights into the studied biology. Currently, however, changes cannot be semantically interpreted. To improve this situation, we developed an ontology of terms describing changes in models. The ontology can be used by scientists and within software to characterise model updates at the level of single changes. When studying or reusing a model, these annotations help with determining the relevance of a change in a given context. We manually studied changes in selected models from BioModels and the Physiome Model Repository. Using the BiVeS tool for difference detection, we then performed an automatic analysis of changes in all models published in these repositories. The resulting set of concepts led us to define candidate terms for the ontology. In a final step, we aggregated and classified these terms and built the first version of the ontology. We present COMODI, an ontology needed because COmputational MOdels DIffer. It empowers users and software to describe changes in a model on the semantic level. COMODI also enables software to implement user-specific filter options for the display of model changes. Finally, COMODI is a step towards predicting how a change in a model influences the simulation results. COMODI, coupled with our algorithm for difference detection, ensures the transparency of a model's evolution, and it enhances the traceability of updates and error corrections. COMODI is encoded in OWL. It is openly available at http://comodi.sems.uni-rostock.de/ .
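
    Because COMODI is distributed as an OWL document, it can be inspected with any RDF toolkit. The sketch below uses Python's rdflib to list the ontology's classes and labels; the local file name is an assumption (download the OWL file from the project page first), and the snippet is ours, not part of the COMODI or BiVeS tooling.

```python
from rdflib import Graph, RDF, RDFS, OWL

# Load a locally saved copy of the COMODI OWL file and list its classes with labels.
# The file name "comodi.owl" is hypothetical; obtain the ontology from
# http://comodi.sems.uni-rostock.de/ and save it locally before running this.
path = "comodi.owl"

g = Graph()
g.parse(path, format="xml")              # OWL is commonly serialised as RDF/XML

for cls in g.subjects(RDF.type, OWL.Class):
    for label in g.objects(cls, RDFS.label):
        print(cls, "-", label)
```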

  12. Beam dynamics simulations using a parallel version of PARMILA

    International Nuclear Information System (INIS)

    Ryne, R.D.

    1996-01-01

    The computer code PARMILA has been the primary tool for the design of proton and ion linacs in the United States for nearly three decades. Previously it was sufficient to perform simulations with of order 10000 particles, but recently the need to perform high resolution halo studies for next-generation, high intensity linacs has made it necessary to perform simulations with of order 100 million particles. With the advent of massively parallel computers such simulations are now within reach. Parallel computers already make it possible, for example, to perform beam dynamics calculations with tens of millions of particles, requiring over 10 GByte of core memory, in just a few hours. Also, parallel computers are becoming easier to use thanks to the availability of mature, Fortran-like languages such as Connection Machine Fortran and High Performance Fortran. We will describe our experience developing a parallel version of PARMILA and the performance of the new code

  13. Beam dynamics simulations using a parallel version of PARMILA

    International Nuclear Information System (INIS)

    Ryne, Robert

    1996-01-01

    The computer code PARMILA has been the primary tool for the design of proton and ion linacs in the United States for nearly three decades. Previously it was sufficient to perform simulations with of order 10000 particles, but recently the need to perform high resolution halo studies for next-generation, high intensity linacs has made it necessary to perform simulations with of order 100 million particles. With the advent of massively parallel computers such simulations are now within reach. Parallel computers already make it possible, for example, to perform beam dynamics calculations with tens of millions of particles, requiring over 10 GByte of core memory, in just a few hours. Also, parallel computers are becoming easier to use thanks to the availability of mature, Fortran-like languages such as Connection Machine Fortran and High Performance Fortran. We will describe our experience developing a parallel version of PARMILA and the performance of the new code. (author)

  14. Comparing a Video and Text Version of a Web-Based Computer-Tailored Intervention for Obesity Prevention: A Randomized Controlled Trial.

    Science.gov (United States)

    Walthouwer, Michel Jean Louis; Oenema, Anke; Lechner, Lilian; de Vries, Hein

    2015-10-19

    Web-based computer-tailored interventions often suffer from small effect sizes and high drop-out rates, particularly among people with a low level of education. Using videos as a delivery format can possibly improve the effects and attractiveness of these interventions. The main aim of this study was to examine the effects of a video and text version of a Web-based computer-tailored obesity prevention intervention on dietary intake, physical activity, and body mass index (BMI) among Dutch adults. A second study aim was to examine differences in appreciation between the video and text version. The final study aim was to examine possible differences in intervention effects and appreciation per educational level. A three-armed randomized controlled trial was conducted with a baseline and 6-month follow-up measurement. The intervention consisted of six sessions, lasting about 15 minutes each. In the video version, the core tailored information was provided by means of videos. In the text version, the same tailored information was provided in text format. Outcome variables were self-reported and included BMI, physical activity, energy intake, and appreciation of the intervention. Multiple imputation was used to replace missing values. The effect analyses were carried out with multiple linear regression analyses and adjusted for confounders. The process evaluation data were analyzed with independent samples t tests. The baseline questionnaire was completed by 1419 participants and the 6-month follow-up measurement by 1015 participants (71.53%). No significant interaction effects of educational level were found on any of the outcome variables. Compared to the control condition, the video version resulted in lower BMI (B=-0.25, P=.049) and lower average daily energy intake from energy-dense food products (B=-175.58, P...). The video version of the Web-based computer-tailored obesity prevention intervention was the most effective intervention and most appreciated. Future research needs to examine if the

  15. Assessment of radionuclide databases in CAP88 mainframe version 1.0 and Windows-based version 3.0.

    Science.gov (United States)

    LaBone, Elizabeth D; Farfán, Eduardo B; Lee, Patricia L; Jannik, G Timothy; Donnelly, Elizabeth H; Foley, Trevor Q

    2009-09-01

    In this study the radionuclide databases for two versions of the Clean Air Act Assessment Package-1988 (CAP88) computer model were assessed in detail. CAP88 estimates radiation dose and the risk of health effects to human populations from radionuclide emissions to air. This program is used by several U.S. Department of Energy (DOE) facilities to comply with National Emission Standards for Hazardous Air Pollutants regulations. CAP88 Mainframe, referred to as version 1.0 on the U.S. Environmental Protection Agency Web site (http://www.epa.gov/radiation/assessment/CAP88/), was the very first CAP88 version released in 1988. Some DOE facilities including the Savannah River Site still employ this version (1.0) while others use the more user-friendly personal computer Windows-based version 3.0 released in December 2007. Version 1.0 uses the program RADRISK based on International Commission on Radiological Protection Publication 30 as its radionuclide database. Version 3.0 uses half-life, dose, and risk factor values based on Federal Guidance Report 13. Differences in these values could cause different results for the same input exposure data (same scenario), depending on which version of CAP88 is used. Consequently, the differences between the two versions are being assessed in detail at Savannah River National Laboratory. The version 1.0 and 3.0 database files contain 496 and 838 radionuclides, respectively, and though one would expect the newer version to include all the 496 radionuclides, 35 radionuclides are listed in version 1.0 that are not included in version 3.0. The majority of these has either extremely short or long half-lives or is no longer in production; however, some of the short-lived radionuclides might produce progeny of great interest at DOE sites. In addition, 122 radionuclides were found to have different half-lives in the two versions, with 21 over 3 percent different and 12 over 10 percent different.

  16. Low-dose computed tomography image restoration using previous normal-dose scan

    International Nuclear Information System (INIS)

    Ma, Jianhua; Huang, Jing; Feng, Qianjin; Zhang, Hua; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2011-01-01

    Purpose: In current computed tomography (CT) examinations, the associated x-ray radiation dose is of a significant concern to patients and operators. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) or kVp parameter (or delivering less x-ray energy to the body) as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise and the noise would propagate into the CT image if no adequate noise control is applied during image reconstruction. Since a normal-dose high diagnostic CT image scanned previously may be available in some clinical applications, such as CT perfusion imaging and CT angiography (CTA), this paper presents an innovative way to utilize the normal-dose scan as a priori information to induce signal restoration of the current low-dose CT image series. Methods: Unlike conventional local operations on neighboring image voxels, nonlocal means (NLM) algorithm utilizes the redundancy of information across the whole image. This paper adapts the NLM to utilize the redundancy of information in the previous normal-dose scan and further exploits ways to optimize the nonlocal weights for low-dose image restoration in the NLM framework. The resulting algorithm is called the previous normal-dose scan induced nonlocal means (ndiNLM). Because of the optimized nature of nonlocal weights calculation, the ndiNLM algorithm does not depend heavily on image registration between the current low-dose and the previous normal-dose CT scans. Furthermore, the smoothing parameter involved in the ndiNLM algorithm can be adaptively estimated based on the image noise relationship between the current low-dose and the previous normal-dose scanning protocols. Results: Qualitative and quantitative evaluations were carried out on a physical phantom as well as clinical abdominal and brain perfusion CT scans in terms of accuracy and resolution properties. The gain by the use
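
    The core idea, computing non-local means weights from image patches and borrowing the clean signal of the previous normal-dose scan when restoring the current low-dose image, can be sketched compactly. The Python function below is a bare-bones illustration of a prior-image-guided NLM filter; it omits the paper's optimized weight calculation and the adaptive estimation of the smoothing parameter, and every parameter value and the synthetic phantom are illustrative.

```python
import numpy as np

def ndi_nlm_sketch(low_dose, normal_dose, patch=3, search=7, h=0.05):
    """Simplified prior-image-guided non-local means.

    The patch around each low-dose pixel is compared with patches of the
    previous normal-dose image inside a search window, and the restored value
    is the corresponding weighted average of normal-dose pixels.  This is a
    compact illustration of the idea, not the published ndiNLM algorithm.
    """
    pr, sr = patch // 2, search // 2
    pad = pr + sr
    ld = np.pad(low_dose, pad, mode="reflect")
    nd = np.pad(normal_dose, pad, mode="reflect")
    out = np.zeros_like(low_dose, dtype=float)
    for i in range(low_dose.shape[0]):
        for j in range(low_dose.shape[1]):
            ci, cj = i + pad, j + pad
            ref = ld[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]   # low-dose patch
            wsum = acc = 0.0
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    ni, nj = ci + di, cj + dj
                    cand = nd[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                    w = np.exp(-np.mean((ref - cand) ** 2) / (h * h))
                    wsum += w
                    acc += w * nd[ni, nj]
            out[i, j] = acc / wsum
    return out

# toy demonstration on a synthetic phantom
rng = np.random.default_rng(0)
phantom = np.zeros((48, 48)); phantom[12:36, 12:36] = 1.0
normal_dose = phantom + 0.01 * rng.normal(size=phantom.shape)   # previous clean scan
low_dose = phantom + 0.10 * rng.normal(size=phantom.shape)      # current noisy scan
restored = ndi_nlm_sketch(low_dose, normal_dose)
print("rms error before: %.3f  after: %.3f" %
      (np.sqrt(np.mean((low_dose - phantom) ** 2)),
       np.sqrt(np.mean((restored - phantom) ** 2))))
```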

  17. High-Performance Java Codes for Computational Fluid Dynamics

    Science.gov (United States)

    Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The computational science community is reluctant to write large-scale computationally -intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.

  18. NOAA Climate Data Record (CDR) of AVHRR Daily and Monthly Aerosol Optical Thickness over Global Oceans, Version 2.0 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Version 2 of the dataset has been superseded by a newer version. Users should not use version 2 except in rare cases (e.g., when reproducing previous studies that...

  19. NOAA Climate Data Record (CDR) of AVHRR Daily and Monthly Aerosol Optical Thickness over Global Oceans, Version 1.0 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Version 1 of the dataset has been superseded by a newer version. Users should not use version 1 except in rare cases (e.g., when reproducing previous studies that...

  20. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (MACINTOSH VERSION)

    Science.gov (United States)

    Culbert, C.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh

  1. Versions of the Waste Reduction Model (WARM)

    Science.gov (United States)

    This page provides a brief chronology of changes made to EPA’s Waste Reduction Model (WARM), organized by WARM version number. The page includes brief summaries of changes and updates since the previous version.

  2. Toward a microrealistic version of quantum mechanics. II

    International Nuclear Information System (INIS)

    Maxwell, N.

    1976-01-01

    Possible objections to the propensity microrealistic version of quantum mechanics proposed previously are answered. This version of quantum mechanics is compared with the statistical, particle, microrealistic viewpoint, and a crucial experiment is proposed designed to distinguish between these two microrealistic versions of quantum mechanics

  3. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION)

    Science.gov (United States)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh

  4. A new Fortran 90 program to compute regular and irregular associated Legendre functions (new version announcement)

    Science.gov (United States)

    Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus

    2018-04-01

    This is a revised and updated version of a modern Fortran 90 code to compute the regular Plm(x) and irregular Qlm(x) associated Legendre functions for all x ∈ (-1, +1) (on the cut) and |x| > 1 and integer degree (l) and order (m). The necessity to revise the code comes as a consequence of some comments of Prof. James Bremer of the UC Davis Mathematics Department, who discovered that there were errors in the code for large integer degree and order for the normalized regular Legendre functions on the cut.
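
    On the cut, values produced by such a code can be cross-checked against standard library routines. The short Python sketch below uses scipy.special.lpmv, which covers the regular function P_l^m(x) for -1 < x < 1 (with the Condon-Shortley phase); the irregular Q_l^m and the |x| > 1 region handled by the Fortran 90 code have no equally direct SciPy counterpart, so only the overlapping case is checked here.

```python
import numpy as np
from scipy.special import lpmv

# Evaluate the regular associated Legendre function P_l^m(x) on the cut.
x = np.linspace(-0.9, 0.9, 5)
for l, m in [(3, 0), (3, 2), (10, 5)]:
    print(f"P_{l}^{m}(x) =", lpmv(m, l, x))

# sanity check against the explicit formula P_2^1(x) = -3 x sqrt(1 - x^2)
# (the minus sign is the Condon-Shortley phase used by SciPy)
print(np.allclose(lpmv(1, 2, x), -3.0 * x * np.sqrt(1.0 - x ** 2)))
```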

  5. Verification of the 2.00 WAPPA-B [Waste Package Performance Assessment-B version] code

    International Nuclear Information System (INIS)

    Tylock, B.; Jansen, G.; Raines, G.E.

    1987-07-01

    The old version of the Waste Package Performance Assessment (WAPPA) code has been modified into a new code version, 2.00 WAPPA-B. The input files and the results for two benchmarks at repository conditions are fully documented in the appendixes of the EA reference report. The 2.00 WAPPA-B version of the code is suitable for computation of barrier failure due to uniform corrosion; however, an improved sub-version, 2.01 WAPPA-B, is recommended for general use due to minor errors found in 2.00 WAPPA-B during its verification procedures. The input files and input echoes have been modified to include behavior of both radionuclides and elements, but the 2.00 WAPPA-B version of the WAPPA code is not recommended for computation of radionuclide releases. The 2.00 WAPPA-B version computes only mass balances and the initial presence of radionuclides that can be released. Future code development in the 3.00 WAPPA-C version will include radionuclide release computations. 19 refs., 10 figs., 1 tab

  6. Meeting the requirements of specialists and generalists in Version 3 of the Read Codes: Two illustrative "Case Reports"

    Directory of Open Access Journals (Sweden)

    Fiona Sinclair

    1997-11-01

    The Read Codes have been recognised as the standard for General Practice computing since 1988, and the original 4-byte set continues to be extensively used to record primary health care data. Read Version 3 (the Read Thesaurus) is an expanded clinical vocabulary with an enhanced file structure designed to meet the detailed requirements of specialist practitioners and to address some of the limitations of previous versions. A recent phase of integration of the still widely-used 4-byte set has highlighted the need to ensure that the new Thesaurus continues to support generalist requirements.

  7. GENII Version 2 Users’ Guide

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.

    2004-03-08

    The GENII Version 2 computer code was developed for the Environmental Protection Agency (EPA) at Pacific Northwest National Laboratory (PNNL) to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) and the radiological risk estimating procedures of Federal Guidance Report 13 into updated versions of existing environmental pathway analysis models. The resulting environmental dosimetry computer codes are compiled in the GENII Environmental Dosimetry System. The GENII system was developed to provide a state-of-the-art, technically peer-reviewed, documented set of programs for calculating radiation dose and risk from radionuclides released to the environment. The codes were designed with the flexibility to accommodate input parameters for a wide variety of generic sites. Operation of a new version of the codes, GENII Version 2, is described in this report. Two versions of the GENII Version 2 code system are available, a full-featured version and a version specifically designed for demonstrating compliance with the dose limits specified in 40 CFR 61.93(a), the National Emission Standards for Hazardous Air Pollutants (NESHAPS) for radionuclides. The only differences lie in the limitation of the capabilities of the user to change specific parameters in the NESHAPS version. This report describes the data entry, accomplished via interactive, menu-driven user interfaces. Default exposure and consumption parameters are provided for both the average (population) and maximum individual; however, these may be modified by the user. Source term information may be entered as radionuclide release quantities for transport scenarios, or as basic radionuclide concentrations in environmental media (air, water, soil). For input of basic or derived concentrations, decay of parent radionuclides and ingrowth of radioactive decay products prior to the start of the exposure scenario may be considered. A single code run can

  8. User's guide for ABCI version 9.4 (azimuthal beam cavity interaction) and introducing the ABCI windows application package

    International Nuclear Information System (INIS)

    Chin, Yong Ho

    2005-12-01

    ABCI is a computer program which solves the Maxwell equations directly in the time domain when a bunched beam goes through an axi-symmetric structure on or off axis. An arbitrary charge distribution can be defined by the user (default=Gaussian). This document is meant to be a comprehensive user's guide describing all features of ABCI version 9.4, including all additions since the release of the guide for version 8.8. All appendixes from the previous two user's guides that cover important topics are also reproduced. The main advantages of ABCI lie in its high speed of execution, its minimal use of computer memory, the implementation of the Napoly integration method, and many elaborate options for Fourier transformations. In version 9.4, wake potentials can even be calculated for a counter-rotating beam of opposite charge, instead of only the usual ones for a beam trailing the driving beam. Now, the Windows application version of ABCI is available as a package which includes the ABCI stand-alone executable modules, the sample input files, the source codes, manuals, and the Windows version of TopDrawer, TopDrawW. This package can be downloaded from the ABCI home page: http://abci.kek.jp/abci.htm. Just by drag-and-dropping an input file onto the icon of the ABCI application, all the calculation results pop out. Neither compilation of the source code nor installation of the program on Windows is necessary. Together with TopDrawer for Windows, all work (computation of wake fields, generation of figures, and so on) can be done simply and easily on Windows alone. How to use ABCI on Windows and how to install the program on other computer systems are explained at the end of this manual. (author)

  9. A computationally efficient description of heterogeneous freezing: A simplified version of the Soccer ball model

    Science.gov (United States)

    Niedermeier, Dennis; Ervens, Barbara; Clauss, Tina; Voigtländer, Jens; Wex, Heike; Hartmann, Susan; Stratmann, Frank

    2014-01-01

    In a recent study, the Soccer ball model (SBM) was introduced for modeling and/or parameterizing heterogeneous ice nucleation processes. The model applies classical nucleation theory. It allows for a consistent description of both apparently singular and stochastic ice nucleation behavior, by distributing contact angles over the nucleation sites of a particle population assuming a Gaussian probability density function. The original SBM utilizes the Monte Carlo technique, which hampers its usage in atmospheric models, as fairly time-consuming calculations must be performed to obtain statistically significant results. Thus, we have developed a simplified and computationally more efficient version of the SBM. We successfully used the new SBM to parameterize experimental nucleation data of, e.g., bacterial ice nucleation. Both SBMs give identical results; however, the new model is computationally less expensive as confirmed by cloud parcel simulations. Therefore, it is a suitable tool for describing heterogeneous ice nucleation processes in atmospheric models.
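
    The central idea - per-site contact angles drawn from a Gaussian distribution and combined over all nucleation sites of a particle - can be illustrated with a short Monte Carlo sketch. This is only a toy illustration of that stochastic picture, not the published SBM or its simplified version: the rate function site_rate below is a hypothetical stand-in for the classical-nucleation-theory rate, and all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def site_rate(theta, j0=1.0):
    """Hypothetical per-site nucleation rate [1/s]; a stand-in for the CNT
    rate, which decreases as the contact angle theta increases."""
    f = (2.0 + np.cos(theta)) * (1.0 - np.cos(theta)) ** 2 / 4.0  # geometric factor
    return j0 * np.exp(-10.0 * f)                                 # arbitrary scaling

def frozen_fraction(n_particles=10000, n_sites=100,
                    mu=np.deg2rad(90), sigma=np.deg2rad(10), t=1.0):
    """Fraction of particles with at least one frozen site after time t,
    with per-site contact angles ~ N(mu, sigma), clipped to [0, pi]."""
    theta = rng.normal(mu, sigma, size=(n_particles, n_sites)).clip(0.0, np.pi)
    p_site = 1.0 - np.exp(-site_rate(theta) * t)    # per-site freezing probability
    p_unfrozen = np.prod(1.0 - p_site, axis=1)      # particle stays entirely liquid
    return 1.0 - p_unfrozen.mean()

print(f"frozen fraction: {frozen_fraction():.3f}")
```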

  10. PROSA version 4.0 manual

    International Nuclear Information System (INIS)

    Bicking, U.; Golly, W.; Peter, N.; Seifert, R.

    1991-05-01

    This report includes a comprehensive manual of the computer program PROSA which illustrates the handling and functioning of PROSA. The manual for PROSA 4.0 (FORTRAN 77) describes the PC version of PROSA including its program modules. The PROSA program package is a statistical tool to decide, on the basis of statistical assumptions, whether a loss of material might have occurred in a given sequence of material balance periods. The evaluation of the material balance data is based on statistical test procedures. In the present PROSA Version 4.0, three tests - the CUMUF test, Page's test and the GEMUF test - are applied to a sequence of material balances. PROSA Version 4.0 supports truly sequential evaluation: PROSA can evaluate a series of MUF values not only sequentially after the campaign has finished, but also sequentially during the campaign. PROSA Version 4.0 is a menu-guided computer program. Data input can be performed either from diskette or by keyboard entry. The primary output is an indication of whether or not an alarm is raised. This information can be displayed either numerically or graphically. Therefore, a comfortable graphical output utility is attached to PROSA 4.0. The program modules are compiled and linked with the Ryan-McFarland compiler. The PROSA graphical utility uses the PLOT88 Library of Plotworks, Inc. (orig./HP) [de
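
    Of the three tests, Page's test is a standard cumulative-sum procedure and can be sketched as follows. This is a generic illustration only, not PROSA's implementation; the reference value k, the alarm threshold h and the MUF data below are hypothetical.

```python
import numpy as np

def page_test(muf, sigma, k=0.5, h=5.0):
    """One-sided Page (CUSUM) test on a sequence of MUF values standardized
    by sigma. k is the reference value and h the alarm threshold, both in
    units of sigma; the values here are illustrative, not PROSA's defaults."""
    s = 0.0
    for period, m in enumerate(muf, start=1):
        s = max(0.0, s + m / sigma - k)   # accumulated evidence of a loss
        if s > h:
            return period                  # alarm raised in this balance period
    return None                            # no alarm over the whole campaign

muf_sequence = [0.2, -0.1, 0.4, 1.1, 0.9, 1.3, 1.2, 1.0]   # hypothetical MUF data
print(page_test(muf_sequence, sigma=0.5))                   # alarm in period 6
```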

  11. New developments in program STANSOL version 3

    International Nuclear Information System (INIS)

    Gray, W.H.

    1981-10-01

    STANSOL is a computer program that applies a solution for the mechanical displacement, stress, and strain in rotationally-transversely isotropic, homogeneous, axisymmetric solenoids. Careful application of the solution permits examination of the complex mechanical behavior of multilayered, nonhomogeneous solenoids in which the loads may vary arbitrarily from layer to layer. Loads applied to the solenoid model by program STANSOL may consist of differential temperature, winding preload, internal and/or external surface pressure, and electromagnetic Lorentz body forces. STANSOL version 3, the latest update to the original version of the computer program, also permits structural analysis of solenoid magnets in which frictionless interlayer gaps may open or close. This paper presents the new theory coded into version 3 of the STANSOL program, as well as the new input data format and graphical output display of the resulting analysis

  12. Computer code conversion using HISTORIAN

    International Nuclear Information System (INIS)

    Matsumoto, Kiyoshi; Kumakura, Toshimasa.

    1990-09-01

    When a computer program written for computer A is converted for computer B, the A-version source program is generally rewritten as the B version. However, this way of converting a program has several drawbacks: 1) the original statements rewritten for the B version are lost; 2) if the rewritten A-version statements are kept as comment lines, the B-version source program becomes quite large; 3) when update directives are sent by the organization that developed the program, or when modifications are needed, it is difficult to locate the parts of the B-version source program to be updated or modified. To solve these problems, a conversion method using the general-purpose software management aid system HISTORIAN has been introduced. This method makes a large computer code easy to update, modify or improve after the conversion. This report describes the planning and procedures of the conversion method and, as an example, the MELPROG-PWR/MOD1 code conversion from the CRAY version to the JAERI FACOM version. This report would provide useful information for those who develop or introduce large programs. (author)

  13. WIMSD4 Version 101 and cataloged procedure

    International Nuclear Information System (INIS)

    Roth, M.J.; Taubman, C.J.; Lawrence, J.H.

    1982-06-01

    The changes made to WIMSD4 to produce Version 101 on the Harwell IBM 3033 and the Winfrith ICL 2976 computers are summarised. A detailed description of the amended catalogued procedure for executing WIMSD4 on the Harwell Computer is given. (author)

  14. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    We develop some parts of frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd-frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.
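
    For orientation, the classical (non-effective) notions that the paper renders computable can be stated as follows. This is the standard textbook formulation of Xd-frames and Banach frames, included here as background rather than taken from the article itself.

```latex
% Classical definitions, shown for orientation only.
Let $X$ be a Banach space and $X_d$ an associated Banach sequence space.
A sequence $(g_i)_{i\in\mathbb{N}} \subset X^{*}$ is an \emph{$X_d$-frame} for $X$ if
$(g_i(x))_{i\in\mathbb{N}} \in X_d$ for every $x \in X$ and there exist constants
$0 < A \le B$ such that
\[
  A\,\lVert x \rVert_{X} \;\le\; \lVert (g_i(x))_{i\in\mathbb{N}} \rVert_{X_d} \;\le\; B\,\lVert x \rVert_{X}
  \qquad \text{for all } x \in X .
\]
If, in addition, there is a bounded operator $S : X_d \to X$ with
$S\bigl((g_i(x))_{i}\bigr) = x$ for all $x \in X$, then $\bigl((g_i), S\bigr)$ is a
\emph{Banach frame} for $X$ with respect to $X_d$.
```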

  15. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  16. TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (DEC RISC ULTRIX VERSION)

    Science.gov (United States)

    TAE SUPPORT OFFICE

    1994-01-01

    workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.

  17. Implementing version support for complex objects

    OpenAIRE

    Blanken, Henk

    1991-01-01

    New applications in the area of office information systems, computer-aided design and manufacturing make new demands upon database management systems. Among other things, highly structured objects and their history have to be represented and manipulated. The paper discusses some general problems concerning the access and storage of complex objects with their versions and the solutions developed within the AIM/II project. Queries related to versions are distinguished in ASOF queries (asking informati...

  18. Wien Automatic System Package (WASP). A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 2: Appendices

    International Nuclear Information System (INIS)

    1995-01-01

    With several Member States, the IAEA has completed a new version of the WASP program, which has been called WASP-III Plus since it follows quite closely the methodology of the WASP-III model. The major enhancements in WASP-III Plus with respect to the WASP-III version are: increase in the number of thermal fuel types (from 5 to 10); verification of which configurations generated by CONGEN have already been simulated in previous iterations with MERSIM; direct calculation of the combined Loading Order of FIXSYS and VARSYS plants; simulation of system operation includes consideration of physical constraints imposed on some fuel types (i.e., fuel availability for electricity generation); extended output of the resimulation of the optimal solution; generation of a file that can be used for graphical representation of the results of the resimulation of the optimal solution and cash flows of the investment costs; calculation of cash flows allows inclusion of the capital costs of plants firmly committed or in construction (FIXSYS plants); user control of the distribution of capital cost expenditures during the construction period (if required to be different from the general 'S' curve distribution used as default). This second volume of the document to support use of the WASP-III Plus computer code consists of 5 appendices giving some additional information about the WASP-III Plus program. Appendix A is mainly addressed to the WASP-III Plus system analyst and supplies some information which could help in the implementation of the program on the user's computer facilities. This appendix also includes some aspects about WASP-III Plus that could not be treated in detail in Chapters 1 to 11. Appendix B identifies all error and warning messages that may appear in the WASP printouts and advises the user how to overcome the problem. Appendix C presents the flow charts of the programs along with a brief description of the objectives and structure of each module. Appendix D describes the

  19. ENDF-6 formats manual. Version of Oct. 1991

    International Nuclear Information System (INIS)

    Rose, P.F.; Dunford, C.L.

    1992-01-01

    ENDF-6 is the international computer file format for evaluated nuclear data. In contrast to the earlier versions (ENDF-4 and ENDF-5) the new version ENDF-6 has been designed not only for neutron reaction data but also for photo-nuclear and charged-particle nuclear reaction data. This document gives a detailed description of the formats and procedures adopted for ENDF-6. The present version includes update pages dated Oct. 1991. (author). Refs, figs, and tabs

  20. ASPEN Version 3.0

    Science.gov (United States)

    Rabideau, Gregg; Chien, Steve; Knight, Russell; Schaffer, Steven; Tran, Daniel; Cichy, Benjamin; Sherwood, Robert

    2006-01-01

    The Automated Scheduling and Planning Environment (ASPEN) computer program has been updated to version 3.0. ASPEN is a modular, reconfigurable, application software framework for solving batch problems that involve reasoning about time, activities, states, and resources. Applications of ASPEN can include planning spacecraft missions, scheduling of personnel, and managing supply chains, inventories, and production lines. ASPEN 3.0 can be customized for a wide range of applications and for a variety of computing environments that include various central processing units and random access memories.

  1. ELIPGRID-PC: Upgraded version

    International Nuclear Information System (INIS)

    Davidson, J.R.

    1995-12-01

    Evaluating the need for and the effectiveness of remedial cleanup at waste sites often includes finding average contaminant concentrations and identifying pockets of contamination called hot spots. The standard tool for calculating the probability of detecting such hot spots has been the ELIPGRID code of Singer and Wickman. The ELIPGRID-PC program has recently made this algorithm available for an IBM® personal computer (PC) or compatible. A new version of ELIPGRID-PC, incorporating Monte Carlo test results and simple graphics, is herein described. Various examples of how to use the program for both single and multiple hot spot cases are given. The code for an American National Standards Institute C version of the ELIPGRID algorithm is provided, and limitations and further work are noted. This version of ELIPGRID-PC reliably meets the goal of moving Singer's ELIPGRID algorithm to the PC.
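
    The quantity ELIPGRID computes analytically - the probability that a hot spot of given size is hit by at least one node of a sampling grid - can be checked by simple Monte Carlo simulation, much like the Monte Carlo tests mentioned above. The sketch below assumes the simplest case of a circular hot spot and a square grid; it is illustrative only and is not Singer's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def hit_probability(radius, spacing, n_trials=100000):
    """Monte Carlo estimate of the probability that a circular hot spot of
    the given radius, dropped at a random position, covers at least one node
    of a square sampling grid with the given spacing (illustrative only)."""
    # By symmetry, only the hot-spot centre's position within one grid cell
    # matters, and the nearest grid node is one of that cell's four corners.
    cx = rng.uniform(0.0, spacing, n_trials)
    cy = rng.uniform(0.0, spacing, n_trials)
    hit = np.zeros(n_trials, dtype=bool)
    for gx in (0.0, spacing):
        for gy in (0.0, spacing):
            hit |= (cx - gx) ** 2 + (cy - gy) ** 2 <= radius ** 2
    return hit.mean()

print(hit_probability(radius=0.4, spacing=1.0))   # ~0.50 for radius/spacing = 0.4
```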

  2. SACRD: a data base for fast reactor safety computer codes, contents and glossary of Version 1 of the system

    International Nuclear Information System (INIS)

    Greene, N.M.; Forsberg, V.M.; Raiford, G.B.; Arwood, J.W.; Flanagan, G.F.

    1979-01-01

    SACRD is a data base of material properties and other handbook data needed in computer codes used for fast reactor safety studies. This document lists the contents of Version 1 and also serves as a glossary of terminology used in the data base. Data are available in the thermodynamics, heat transfer, fluid mechanics, structural mechanics, aerosol transport, meteorology, neutronics and dosimetry areas. Tabular, graphical and parameterized data are provided in many cases

  3. Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0

    Science.gov (United States)

    Knox, J. C.

    1996-01-01

    The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connecting the components via flow streams and defining their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users have the capability to control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.

  4. Computer Game Lugram - Version for Blind Children

    Directory of Open Access Journals (Sweden)

    V. Delić

    2011-06-01

    Computer games have undoubtedly become an integral part of educational activities of children. However, since computer games typically abound with audio and visual effects, most of them are completely useless for children with disabilities. Specifically, computer games dealing with the basics of geometry can contribute to mathematics education, but they require significant modifications in order to be suitable for visually impaired children. The paper presents the results of research and adaptation of the educational computer game Lugram to the needs of completely blind children, as well as the testing of the prototype, whose results encourage further research and development in the same direction.

  5. The Gaia Framework: Version Support In Web Based Open Hypermedia

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kejser, Thomas

    2004-01-01

    The GAIA framework prototype, described herein, explores the possibilities and problems that arise when combining versioning and open hypermedia paradigms. It will be argued that it - by adding versioning as a separate service in the hypermedia architecture – is possible to build consistent...... versioning field and GAIA is compared with previous attempts at defining hypermedia versioning frameworks. GAIA is capable of multi-level versioning and versioning of structures and supports freezing mechanisms for both documents and hyperstructure. The experiences from GAIA provide an input to new reference...

  6. The Gaia Framework: Version Support In Web Based Open Hypermedia

    DEFF Research Database (Denmark)

    Kejser, Thomas; Grønbæk, Kaj

    2003-01-01

    The GAIA framework prototype, described herein, explores the possibilities and problems that arise when combining versioning and open hypermedia paradigms. It will be argued that it - by adding versioning as a separate service in the hypermedia architecture - is possible to build consistent...... versioning field and GAIA is compared with previous attempts at defining hypermedia versioning frameworks. GAIA is capable of multi-level versioning and versioning of structures and supports freezing mechanisms for both documents and hyperstructure. The experiences from GAIA provide an input to new reference...

  7. MATLAB Software Versions and Licenses for the Peregrine System

    Science.gov (United States)

    Learn about the MATLAB software versions and licenses for the Peregrine system. The MATLAB version available on Peregrine is R2017b. MATLAB is proprietary software. As such, users have access to a limited number

  8. Computer Game Lugram - Version for Blind Children

    OpenAIRE

    V. Delić; N. Vujnović Sedlar; B. Lučić

    2011-01-01

    Computer games have undoubtedly become an integral part of educational activities of children. However, since computer games typically abound with audio and visual effects, most of them are completely useless for children with disabilities. Specifically, computer games dealing with the basics of geometry can contribute to mathematics education, but they require significant modifications in order to be suitable for the visually impaired children. The paper presents the results of research and ...

  9. Integrated Global Radiosonde Archive (IGRA) - Monthly Means (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  10. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION WITH CLIPSITS)

    Science.gov (United States)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh

  11. NOAA Climate Data Record (CDR) of GPS RO-Calibrated AMSU Channel 9 (Temperatures in the Lower Stratosphere,TLS), Version 1.0 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  12. NOAA Climate Data Record (CDR) of GPS RO-Calibrated AMSU Channel 9 (Temperatures in the Lower Stratosphere,TLS), Version 1.1 (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  13. Stratified B-trees and versioning dictionaries

    OpenAIRE

    Twigg, Andy; Byde, Andrew; Milos, Grzegorz; Moreton, Tim; Wilkes, John; Wilkie, Tom

    2011-01-01

    A classic versioned data structure in storage and computer science is the copy-on-write (CoW) B-tree -- it underlies many of today's file systems and databases, including WAFL, ZFS, Btrfs and more. Unfortunately, it doesn't inherit the B-tree's optimality properties; it has poor space utilization, cannot offer fast updates, and relies on random IO to scale. Yet, nothing better has been developed since. We describe the `stratified B-tree', which beats all known semi-external memory versioned B...
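
    The copy-on-write idea underlying the CoW B-tree - never modify a node in place, copy the path from the root instead, so every old root remains a readable version - can be shown with a toy path-copying search tree. This is not the stratified B-tree (nor a B-tree at all); it is only a minimal sketch of the versioning mechanism under discussion.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Node:
    key: int
    value: str
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def insert(root: Optional[Node], key: int, value: str) -> Node:
    """Copy-on-write insert: only the nodes on the root-to-leaf path are
    copied, so the old root still describes the previous version unchanged."""
    if root is None:
        return Node(key, value)
    if key < root.key:
        return Node(root.key, root.value, insert(root.left, key, value), root.right)
    if key > root.key:
        return Node(root.key, root.value, root.left, insert(root.right, key, value))
    return Node(key, value, root.left, root.right)   # overwrite only in the new version

v1 = insert(insert(None, 5, "a"), 3, "b")   # version 1
v2 = insert(v1, 3, "c")                     # version 2 shares v1's untouched subtrees
print(v1.left.value, v2.left.value)         # "b c" -- both versions stay readable
```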

  14. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-IV. User's manual

    International Nuclear Information System (INIS)

    2001-01-01

    As a continuation of its efforts to provide methodologies and tools to Member States to carry out comparative assessment and analyse priority environmental issues related to the development of the electric power sector, the IAEA has completed a new version of the Wien Automatic System Planning (WASP) Package, WASP-IV, for carrying out power generation expansion planning taking into consideration fuel availability and environmental constraints. This manual constitutes a part of this work and aims to provide users with a guide to the effective use of the new version of the model, WASP-IV. WASP was originally developed in 1972 by the Tennessee Valley Authority and the Oak Ridge National Laboratory in the USA to meet the IAEA needs to analyse the economic competitiveness of nuclear power in comparison to other generation expansion alternatives for supplying the future electricity requirements of a country or region. Previous versions of the model were used by Member States in many national and regional studies to analyse electric power system expansion planning and the role of nuclear energy in particular. Experience gained from its application allowed development of WASP into a very comprehensive planning tool for electric power system expansion analysis. New, improved versions were developed, which took into consideration the needs expressed by the users of the programme in order to address important emerging issues being faced by the electric system planners. In 1979, WASP-III was released and soon after became an indispensable tool in many Member States for generation expansion planning. The WASP-III version was continually upgraded and the development of version WASP-III Plus commenced in 1992. By 1995, WASP-III Plus was completed, which followed closely the methodology of the WASP-III but incorporated new features. In order to meet the needs of electricity planners and following the recommendations of the Helsinki symposium, development of a new version of WASP was

  15. A multidimensional version of the Kolmogorov-Smirnov test

    International Nuclear Information System (INIS)

    Fasano, G.; Franceschini, A.

    1987-01-01

    A generalization of the classical Kolmogorov-Smirnov test, suitable for analysing random samples defined in two or three dimensions, is discussed. This test provides some improvements with respect to an earlier version proposed by a previous author. In particular: (i) it is faster, by a factor equal to the sample size, n, and is therefore usable for quite sizeable samples; (ii) it fully takes into account the dependence of the test statistics on the degree of correlation of data points and on the sample size; (iii) it allows for a generalization to the three-dimensional case which is still viable as regards computing time. A large number of Monte Carlo simulations confirm that this test is sufficiently distribution-free for all practical purposes. (author)
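
    In its two-sample, two-dimensional form, the statistic in question compares the fractions of each sample falling in the four quadrants around each data point. The sketch below is a naive O(n²) version for illustration only; the speed-up by a factor of the sample size described above comes from a more careful evaluation, which is not reproduced here, and no significance levels are computed.

```python
import numpy as np

def ks2d_statistic(a, b):
    """Simplified two-sample, two-dimensional KS statistic in the spirit of
    Fasano & Franceschini: the largest difference between the two samples'
    quadrant fractions, with every data point in turn used as the origin."""
    d = 0.0
    for origin in np.vstack((a, b)):
        for sx in (1, -1):                      # quadrant selectors
            for sy in (1, -1):
                fa = np.mean((sx * (a[:, 0] - origin[0]) > 0) &
                             (sy * (a[:, 1] - origin[1]) > 0))
                fb = np.mean((sx * (b[:, 0] - origin[0]) > 0) &
                             (sy * (b[:, 1] - origin[1]) > 0))
                d = max(d, abs(fa - fb))
    return d

rng = np.random.default_rng(2)
sample1 = rng.normal(0.0, 1.0, size=(200, 2))
sample2 = rng.normal(0.3, 1.0, size=(200, 2))   # shifted sample for illustration
print(ks2d_statistic(sample1, sample2))
```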

  16. System software for the NMFECC CRAY-1 version of GIFTS 4B

    International Nuclear Information System (INIS)

    Gray, W.H.; Baudry, T.V.

    1981-01-01

    The Oak Ridge National Laboratory (ORNL) maintains a version of the GIFTS system structural analysis computer programs. Executable modules are supported on two different types of computer hardware, a DECsystem-10 and a CRAY-1. With no outwardly visible difference to the user, these modules execute equivalently on both types of hardware. Presented herein are the local software enhancements for the ORNL version of GIFTS for the National Magnetic Fusion Energy Computer Center (NMFECC) CRAY-1 computer as well as a description of the ORNL implementation of the system-dependent portions of the GIFTS software library for the NMFECC CRAY-1

  17. Comparison of capability between two versions of reactor transient diagnosis expert system 'DISKET' programmed in different languages

    International Nuclear Information System (INIS)

    Yokobayashi, Masao; Yoshida, Kazuo

    1991-01-01

    An expert system, DISKET, has been developed at JAERI to apply knowledge engineering techniques to the transient diagnosis of nuclear power plants. The first version of DISKET, programmed in UTILISP, was developed on the main-frame computer FACOM M-780 at JAERI. The LISP language is not suitable for on-line diagnostic systems because it is highly dependent on the computer used and requires a large amount of memory. The large mainframe computer is also not suitable because, as a multi-user system, it imposes various restrictions. The second version of DISKET, intended for practical use, was developed in FORTRAN to realize on-line, real-time diagnosis with limited computer resources. These two versions of DISKET, with the same knowledge base, were compared in running capability, and the LISP version of DISKET was found to need more than twice the memory and CPU time of the FORTRAN version. This result shows that the FORTRAN approach is a practical one for developing expert systems for on-line, real-time diagnosis of transients with limited computer resources. (author)

  18. Special issue of Higher-Order and Symbolic Computation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Sabry, Amr

    This issue of HOSC is dedicated to the general topic of continuations. It grew out of the third ACM SIGPLAN Workshop on Continuations (CW'01), which took place in London, UK on January 16, 2001 [3]. The notion of continuation is ubiquitous in many different areas of computer science, including...... and streamline Filinski's earlier work in the previous special issue of HOSC (then LISP and Symbolic Computation) that grew out of the first ACM SIGPLAN Workshop on Continuations [1, 2]. Hasegawa and Kakutani's article is the journal version of an article presented at FOSSACS 2001 and that received the EATCS...

  19. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (UNIX VERSION)

    Science.gov (United States)

    Donnell, B.

    1994-01-01

    COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the

  20. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (MACINTOSH VERSION)

    Science.gov (United States)

    Riley, G.

    1994-01-01

    COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the

  1. TJ-II Library Manual (Version 2)

    International Nuclear Information System (INIS)

    Tribaldos, V.; Milligen, B. Ph. van; Lopez-Fraguas, A.

    2001-01-01

    This is a user manual for the TJ2 Numerical Library, which has been developed for performing numerical computations of different TJ-II configurations. It is a new version of the earlier manual, CIEMAT report 806. (Author)

  2. Efficient conjugate gradient algorithms for computation of the manipulator forward dynamics

    Science.gov (United States)

    Fijany, Amir; Scheid, Robert E.

    1989-01-01

    The applicability of conjugate gradient algorithms for computation of the manipulator forward dynamics is investigated. The redundancies in the previously proposed conjugate gradient algorithm are analyzed. A new version is developed which, by avoiding these redundancies, achieves a significantly greater efficiency. A preconditioned conjugate gradient algorithm is also presented. A diagonal matrix whose elements are the diagonal elements of the inertia matrix is proposed as the preconditioner. In order to increase the computational efficiency, an algorithm is developed which exploits the synergism between the computation of the diagonal elements of the inertia matrix and that required by the conjugate gradient algorithm.
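
    The diagonal preconditioner proposed here can be illustrated with a generic preconditioned conjugate gradient solver for the linear system that forward dynamics poses (inertia matrix times joint accelerations equals the bias-compensated torques). The sketch below is a textbook Jacobi-preconditioned CG under that assumption, not the paper's specialized algorithm; the small test matrix is arbitrary.

```python
import numpy as np

def jacobi_pcg(M, b, tol=1e-10, max_iter=200):
    """Conjugate gradient with a diagonal (Jacobi) preconditioner, solving
    M x = b for symmetric positive-definite M, e.g. a joint-space inertia
    matrix in forward dynamics. Generic sketch only."""
    d = np.diag(M)                      # the preconditioner: diag(M)
    x = np.zeros_like(b)
    r = b - M @ x
    z = r / d
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Mp = M @ p
        alpha = rz / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        if np.linalg.norm(r) < tol:
            break
        z = r / d
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small SPD "inertia-like" test matrix, for illustration only
A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
print(jacobi_pcg(A, b), np.linalg.solve(A, b))   # the two solutions agree
```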

  3. Retrieval of articles in personal computer

    International Nuclear Information System (INIS)

    Choi, Byung Gil; Park, Seog Hee; Kim, Sung Hoon; Shinn, Kyung Sub

    1994-01-01

    Although many useful articles appear in the journals published in Korea, they are not always cited by researchers, mainly due to the absence of an efficient searching system. The authors made a program with 6 predefined filtering forms to find published articles rapidly and accurately. The program was coded using the database management system CA-Clipper Version 5.2 (Computer Associates International, Inc.) through preliminary work over 1 year. We used a 486 DX II (8 Mbyte RAM, VGA, 200 Mbyte hard disk), an ink-jet printer (Hewlett Packard Company), and MS-DOS Version 5.0 (Microsoft Co.). We entered a total of 1,986 articles published in the Journal of the Korean Radiological Society from 1981 to 1993. The searching time was 10 to 15 seconds for each use. We had very flexible user interfaces and simplified searching methods, but more complicated filtering could also be performed. Although the previous version had some bugs, this upgraded version resolves those problems and is well suited to searching articles. The program would be valuable for radiologists in searching for articles published not only in the Journal of the Korean Radiological Society, but also in the Journal of the Korean Society of Medicine Ultrasound and the Korean Journal of Nuclear Medicine

  4. United States Climate Reference Network (USCRN) Processed Data (Version Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  5. V.S.O.P. (99/09) computer code system for reactor physics and fuel cycle simulation. Version 2009

    Energy Technology Data Exchange (ETDEWEB)

    Ruetten, H.J.; Haas, K.A.; Brockmann, H.; Ohlig, U.; Pohl, C.; Scherer, W.

    2010-07-15

    V.S.O.P. (99/09) represents the further development of V.S.O.P. (99/05). Compared to its precursor, the code system has again been improved in many details. The main motivation for this new code version was to update the basic nuclear libraries used by the code system. Thus, all cross section libraries involved in the code are now based on ENDF/B-VII. V.S.O.P. is a computer code system for the comprehensive numerical simulation of the physics of thermal reactors. It comprises the setup of the reactor and of the fuel element, processing of cross sections, neutron spectrum evaluation, neutron diffusion calculation in two or three dimensions, fuel burnup, fuel shuffling, reactor control, thermal hydraulics and fuel cycle costs. The thermal hydraulics part (steady state and time-dependent) is restricted to gas-cooled reactors and to two spatial dimensions. The code can simulate the reactor operation from the initial core towards the equilibrium core. This latest code version was developed and tested under the WINDOWS-XP operating system. (orig.)

  6. V.S.O.P. (99/09) computer code system for reactor physics and fuel cycle simulation. Version 2009

    International Nuclear Information System (INIS)

    Ruetten, H.J.; Haas, K.A.; Brockmann, H.; Ohlig, U.; Pohl, C.; Scherer, W.

    2010-07-01

    V.S.O.P. (99/09) represents the further development of V.S.O.P. (99/05). Compared to its precursor, the code system has again been improved in many details. The main motivation for this new code version was to update the basic nuclear libraries used by the code system. Thus, all cross section libraries involved in the code are now based on ENDF/B-VII. V.S.O.P. is a computer code system for the comprehensive numerical simulation of the physics of thermal reactors. It comprises the setup of the reactor and of the fuel element, processing of cross sections, neutron spectrum evaluation, neutron diffusion calculation in two or three dimensions, fuel burnup, fuel shuffling, reactor control, thermal hydraulics and fuel cycle costs. The thermal hydraulics part (steady state and time-dependent) is restricted to gas-cooled reactors and to two spatial dimensions. The code can simulate the reactor operation from the initial core towards the equilibrium core. This latest code version was developed and tested under the WINDOWS-XP operating system. (orig.)

  7. Planned development and evaluation protocol of two versions of a web-based computer-tailored nutrition education intervention aimed at adults, including cognitive and environmental feedback.

    Science.gov (United States)

    Springvloet, Linda; Lechner, Lilian; Oenema, Anke

    2014-01-17

    Despite decades of nutrition education, the prevalence of unhealthy dietary patterns is still high and inequalities in intake between high and low socioeconomic groups still exist. Therefore, it is important to innovate and improve existing nutrition education interventions. This paper describes the development, design and evaluation protocol of a web-based computer-tailored nutrition education intervention for adults targeting fruit, vegetable, high-energy snack and fat intake. This intervention innovates existing computer-tailored interventions by not only targeting motivational factors, but also volitional and self-regulation processes and environmental-level factors. The intervention development was guided by the Intervention Mapping protocol, ensuring a theory-informed and evidence-based intervention. Two versions of the intervention were developed: a basic version targeting knowledge, awareness, attitude, self-efficacy and volitional and self-regulation processes, and a plus version additionally addressing the home environment arrangement and the availability and price of healthy food products in supermarkets. Both versions consist of four modules: one for each dietary behavior, i.e. fruit, vegetables, high-energy snacks and fat. Based on the self-regulation phases, each module is divided into three sessions. In the first session, feedback on dietary behavior is provided to increase awareness, feedback on attitude and self-efficacy is provided and goals and action plans are stated. In the second session goal achievement is evaluated, reasons for failure are explored, coping plans are stated and goals can be adapted. In the third session, participants can again evaluate their behavioral change and tips for maintenance are provided. Both versions will be evaluated in a three-group randomized controlled trial with measurements at baseline, 1-month, 4-months and 9-months post-intervention, using online questionnaires. Both versions will be compared with a generic

  8. BehavePlus fire modeling system, version 5.0: Variables

    Science.gov (United States)

    Patricia L. Andrews

    2009-01-01

    This publication has been revised to reflect updates to version 4.0 of the BehavePlus software. It was originally published as the BehavePlus fire modeling system, version 4.0: Variables in July 2008. The BehavePlus fire modeling system is a computer program based on mathematical models that describe wildland fire behavior and effects and the...

  9. Three-dimensional biplanar radiography as a new means of accessing femoral version: a comparative study of EOS three-dimensional radiography versus computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Pomerantz, M.L. [University of California San Diego School of Medicine, Orthopaedic Surgery Department, San Diego, CA (United States); Glaser, Diana [Aurora Spine, Carlsbad, CA (United States); Doan, Josh [Orthopedic Biomechanics Research Center, San Diego, CA (United States); Kumar, Sita [University of California, San Diego, CA (United States); Edmonds, Eric W. [University of California San Diego School of Medicine, Orthopaedic Surgery Department, San Diego, CA (United States); Rady Children's Hospital San Diego, Division of Orthopedic Surgery, San Diego, CA (United States)

    2014-10-17

    To validate femoral version measurements made from biplanar radiography (BR), three-dimensional (3D) reconstructions (EOS imaging, France) were made in differing rotational positions against the gold standard of computed tomography (CT). Two cadaveric femurs were scanned with CT and BR in five different femoral versions creating ten total phantoms. The native version was modified by rotating through a mid-diaphyseal hinge twice into increasing anteversion and twice into increased retroversion. For each biplanar scan, the phantom itself was rotated -10°, -5°, 0°, +5° and +10°. Three-dimensional CT reconstructions were designated the true value for femoral version. Two independent observers measured the femoral version on CT axial slices and BR 3D reconstructions twice. The mean error (upper bound of the 95% confidence interval), inter- and intraobserver reliability, and the error compared to the true version were determined for both imaging techniques. Interobserver intraclass correlation for CT axial images ranged from 0.981 to 0.991, and the intraobserver intraclass correlation ranged from 0.994 to 0.996. For the BR 3D reconstructions these values ranged from 0.983 to 0.998 and 0.982 to 0.998, respectively. For the CT measurements the upper bound of error from the true value was 5.4-7.5°, whereas for BR 3D reconstructions it was 4.0-10.1°. There was no statistical difference in the mean error from the true values for any of the measurements done with axial CT or BR 3D reconstructions. BR 3D reconstructions accurately and reliably provide clinical data on femoral version compared to CT even with rotation of the patient of up to 10° from neutral. (orig.)

  10. SERA: Simulation Environment for Radiotherapy Applications - Users Manual Version 1CO

    International Nuclear Information System (INIS)

    Venhuizen, James Robert; Wessol, Daniel Edward; Wemple, Charles Alan; Wheeler, Floyd J; Harkin, G. J.; Frandsen, M. W.; Albright, C. L.; Cohen, M.T.; Rossmeier, M.; Cogliati, J.J.

    2002-01-01

    This document is the user manual for the Simulation Environment for Radiotherapy Applications (SERA) software program developed for boron-neutron capture therapy (BNCT) patient treatment planning by researchers at the Idaho National Engineering and Environmental Laboratory (INEEL) and students and faculty at Montana State University (MSU) Computer Science Department. This manual corresponds to the final release of the program, Version 1C0, developed to run under the RedHat Linux Operating System (version 7.2 or newer) or the Solaris Operating System (version 2.6 or newer). SERA is a suite of command line or interactively launched software modules, including graphical, geometric reconstruction, and execution interface modules for developing BNCT treatment plans. The program allows the user to develop geometric models of the patient as derived from Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) images, perform dose computation for these geometric models, and display the computed doses on overlays of the original images as three dimensional representations. This manual provides a guide to the practical use of SERA, but is not an exhaustive treatment of each feature of the code

  11. SERA: Simulation Environment for Radiotherapy Applications - Users Manual Version 1CO

    Energy Technology Data Exchange (ETDEWEB)

    Venhuizen, James Robert; Wessol, Daniel Edward; Wemple, Charles Alan; Wheeler, Floyd J; Harkin, G. J.; Frandsen, M. W.; Albright, C. L.; Cohen, M.T.; Rossmeier, M.; Cogliati, J.J.

    2002-06-01

    This document is the user manual for the Simulation Environment for Radiotherapy Applications (SERA) software program developed for boron-neutron capture therapy (BNCT) patient treatment planning by researchers at the Idaho National Engineering and Environmental Laboratory (INEEL) and students and faculty at Montana State University (MSU) Computer Science Department. This manual corresponds to the final release of the program, Version 1C0, developed to run under the RedHat Linux Operating System (version 7.2 or newer) or the Solaris™ Operating System (version 2.6 or newer). SERA is a suite of command line or interactively launched software modules, including graphical, geometric reconstruction, and execution interface modules for developing BNCT treatment plans. The program allows the user to develop geometric models of the patient as derived from Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) images, perform dose computation for these geometric models, and display the computed doses on overlays of the original images as three dimensional representations. This manual provides a guide to the practical use of SERA, but is not an exhaustive treatment of each feature of the code.

  12. Navier-Stokes computer

    International Nuclear Information System (INIS)

    Hayder, M.E.

    1988-01-01

    A new scientific supercomputer, known as the Navier-Stokes Computer (NSC), has been designed. The NSC is a multi-purpose machine, and for applications in the field of computational fluid dynamics (CFD), this supercomputer is expected to yield a computational speed far exceeding that of present-day supercomputers. This computer has a few very powerful processors (known as nodes) connected by an internodal network. There are three versions of the NSC nodes: micro-, mini- and full-node. The micro-node was developed to prove, demonstrate and refine the key architectural features of the NSC. Architectures of the two recent versions of the NSC nodes are presented, with the main focus on the full-node. At a clock speed of 20 MHz, the mini- and the full-node have peak computational speeds of 200 and 640 MFLOPS, respectively. The full-node is the final version of the NSC nodes, and an NSC is expected to have 128 full-nodes. To test the suitability of different algorithms on the NSC architecture, an NSC simulator was developed. Some of the existing computational fluid dynamics codes were placed on this simulator to determine important and relevant issues relating to the efficient use of the NSC architecture

  13. MEASUREMENT AND PRECISION, EXPERIMENTAL VERSION.

    Science.gov (United States)

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    THIS DOCUMENT IS AN EXPERIMENTAL VERSION OF A PROGRAMED TEXT ON MEASUREMENT AND PRECISION. PART I CONTAINS 24 FRAMES DEALING WITH PRECISION AND SIGNIFICANT FIGURES ENCOUNTERED IN VARIOUS MATHEMATICAL COMPUTATIONS AND MEASUREMENTS. PART II BEGINS WITH A BRIEF SECTION ON EXPERIMENTAL DATA, COVERING SUCH POINTS AS (1) ESTABLISHING THE ZERO POINT, (2)…

  14. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, Benny Manuel [Los Alamos National Laboratory]; Ballance, Robert [SNL]; Haskell, Karen [SNL]

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  15. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (IBM PC VERSION)

    Science.gov (United States)

    Donnell, B.

    1994-01-01

    COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the

  16. In-vessel source term analysis code TRACER version 2.3. User's manual

    International Nuclear Information System (INIS)

    Toyohara, Daisuke; Ohno, Shuji; Hamada, Hirotsugu; Miyahara, Shinya

    2005-01-01

    A computer code TRACER (Transport Phenomena of Radionuclides for Accident Consequence Evaluation of Reactor) version 2.3 has been developed to evaluate the species and quantities of fission products (FPs) released into the cover gas during a fuel pin failure accident in an LMFBR. TRACER version 2.3 includes the new or modified models listed below: a) Booth model: a new model for FP release from fuel; b) a modified model for FP transfer from fuel to bubbles or sodium coolant; c) a modified model for bubble dynamics in the coolant. Computational models, input data and output data of TRACER version 2.3 are described in this user's manual. (author)

  17. GENXICC2.1: An improved version of GENXICC for hadronic production of doubly heavy baryons

    Science.gov (United States)

    Wang, Xian-You; Wu, Xing-Gang

    2013-03-01

    We present an improved version of GENXICC, which is a generator for hadronic production of the doubly heavy baryons Ξcc, Ξbc and Ξbb and was introduced by C.H. Chang, J.X. Wang and X.G. Wu [Comput. Phys. Commun. 177 (2007) 467; Comput. Phys. Commun. 181 (2010) 1144]. In comparison with the previous GENXICC versions, we have updated the program to generate unweighted baryon events more effectively under various simulation environments; their distributions are now generated with probability proportional to the integrand. One Les Houches Event (LHE) common block has been added to produce a standard LHE data file that contains useful information on the doubly heavy baryon and its accompanying partons. Such LHE data can be conveniently imported into PYTHIA for further hadronization and decay simulation; in particular, the color-flow problem can be solved with PYTHIA8.0. NEW VERSION PROGRAM SUMMARY. Title of program: GENXICC2.1 Program obtained from: CPC Program Library Reference to original program: GENXICC Reference in CPC: Comput. Phys. Commun. 177, 467 (2007); Comput. Phys. Commun. 181, 1144 (2010) Does the new version supersede the old program: No Computer: Any LINUX-based PC with a FORTRAN 77 or FORTRAN 90 compiler and a GNU C compiler Operating systems: LINUX Programming language used: FORTRAN 77/90 Memory required to execute with typical data: About 2.0 MB No. of bytes in distributed program: About 2 MB, including PYTHIA6.4 Distribution format: .tar.gz Nature of physical problem: Hadronic production of doubly heavy baryons Ξcc, Ξbc and Ξbb. Method of solution: The upgraded version with a proper interface to PYTHIA can generate full production and decay events, either weighted or unweighted, conveniently and effectively. In particular, the unweighted events are generated using an improved hit-and-miss approach. Reasons for new version: Responding to the feedback from users of CMS and LHCb groups at the Large Hadron Collider, and based on
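
    The hit-and-miss (acceptance-rejection) unweighting mentioned under "Method of solution" can be sketched generically: each weighted event is kept with probability w/w_max, so the survivors follow the weight distribution while carrying unit weight. The code below illustrates that idea only and is not GENXICC code; the "events" and weights are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def unweight(events, weights):
    """Hit-and-miss unweighting: keep each weighted event with probability
    w / w_max, so the surviving events are distributed according to the
    weights but all carry unit weight. Generic sketch, not GENXICC code."""
    w = np.asarray(weights, dtype=float)
    keep = rng.uniform(0.0, w.max(), size=w.size) < w
    return [e for e, k in zip(events, keep) if k]

# Hypothetical weighted "events" (here just transverse momenta) and weights
pt = rng.uniform(0.0, 50.0, 10000)
weights = np.exp(-pt / 10.0)              # steeply falling toy weight distribution
unweighted = unweight(pt, weights)
print(len(unweighted))                    # acceptance ~ <w>/w_max, about 20% here
```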

  18. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE... In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE...

  19. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. Selection of the terms is context sensitive: initial choices determine the subsequently presented sets of additional choices. This process automatically generates a report and feeds these features into a database...

  20. PC 386-based version of DORT

    International Nuclear Information System (INIS)

    Tanker, E.

    1992-01-01

    Problems encountered during the adaptation of DORT to a personal computer using a Fortran 77 compiler are described, and the modifications made to solve them are explained. Three test cases were run with the modified version and the results are compared with those obtained on an IBM 3090/200. Numerical differences were observed in at most the last three decimal digits of the computations. The running times on the PC were found to be satisfactory for these test cases

  1. Version control of pathway models using XML patches.

    Science.gov (United States)

    Saffrey, Peter; Orton, Richard

    2009-03-17

    Computational modelling has become an important tool in understanding biological systems such as signalling pathways. With an increase in the size and complexity of models comes a need for techniques to manage model versions and their relationship to one another. Model version control for pathway models shares some of the features of software version control but has a number of differences that warrant a specific solution. We present a model version control method, along with a prototype implementation, based on XML patches. We show its application to the EGF/RAS/RAF pathway. Our method allows quick and convenient storage of a wide range of model variations and enables a thorough explanation of these variations. Trying to produce these results without such methods results in slow and cumbersome development that is prone to frustration and human error.
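
    The record gives no implementation details, but the core idea of storing model variants as patches against a base XML file can be sketched as follows. The element names, the toy patch format, and the helper functions are illustrative assumptions, not the published prototype.

```python
import xml.etree.ElementTree as ET

BASE = '<model><parameter id="k1" value="0.5"/><parameter id="k2" value="1.0"/></model>'
VARIANT = '<model><parameter id="k1" value="0.8"/><parameter id="k2" value="1.0"/></model>'

def diff_parameters(base_xml, variant_xml):
    """Record changed 'value' attributes keyed by parameter id (a toy patch format)."""
    base = {p.get("id"): p.get("value") for p in ET.fromstring(base_xml).iter("parameter")}
    variant = {p.get("id"): p.get("value") for p in ET.fromstring(variant_xml).iter("parameter")}
    return {pid: val for pid, val in variant.items() if base.get(pid) != val}

def apply_patch(base_xml, patch):
    """Re-create a model variant by applying the recorded changes to the base model."""
    root = ET.fromstring(base_xml)
    for p in root.iter("parameter"):
        if p.get("id") in patch:
            p.set("value", patch[p.get("id")])
    return ET.tostring(root, encoding="unicode")

patch = diff_parameters(BASE, VARIANT)
print(patch)                      # {'k1': '0.8'}
print(apply_patch(BASE, patch))   # reproduces the variant from base + patch
```

    Storing only such patches keeps a large family of closely related model variations compact and makes the differences between them explicit.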

  2. TRASYS - THERMAL RADIATION ANALYZER SYSTEM (CRAY VERSION WITH NASADIG)

    Science.gov (United States)

    Anderson, G. E.

    1994-01-01

    . The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. The standard distribution medium for MSC-21959 (CRAY version

  3. Amnioinfusion for women with a singleton breech presentation and a previous failed external cephalic version: a randomized controlled trial.

    Science.gov (United States)

    Diguisto, Caroline; Winer, Norbert; Descriaud, Celine; Tavernier, Elsa; Weymuller, Victoire; Giraudeau, Bruno; Perrotin, Franck

    2018-04-01

    Our trial aimed to assess the effectiveness of amnioinfusion for a second attempt at external cephalic version (ECV). This open randomized controlled trial was planned with a sequential design. Women at ≥36 weeks of gestation with a singleton fetus in breech presentation and a first unsuccessful ECV were recruited in two level-3 maternity units. They were randomly allocated to transabdominal amnioinfusion with a 500-mL saline solution under ultrasound surveillance or to no amnioinfusion before the second ECV attempt. Trained senior obstetricians performed all procedures. The primary outcome was the cephalic presentation rate at delivery. Analyses were conducted according to intention to treat (NCT00465712). Recruitment difficulties led to stopping the trial after a 57-month period; 119 women had been randomized: 59 allocated to amnioinfusion + ECV and 60 to ECV only. Data were analyzed without applying the sequential feature of the design. The rate of cephalic presentation at delivery did not differ significantly according to whether the second version attempt was or was not preceded by amnioinfusion (20% versus 12%, p = .20). Premature rupture of the membranes occurred in 15% of the women in the amnioinfusion group. Amnioinfusion before a second attempt at external cephalic version does not significantly increase the rate of cephalic presentation at delivery.

  4. COSY INFINITY version 8

    International Nuclear Information System (INIS)

    Makino, Kyoko; Berz, Martin

    1999-01-01

    The latest version of the particle optics code COSY INFINITY is presented. Using Differential Algebraic (DA) methods, the code allows the computation of aberrations of arbitrary field arrangements to in principle unlimited order. Besides providing a general overview of the code, several recent techniques developed for specific applications are highlighted. These include new features for the direct utilization of detailed measured fields as well as rigorous treatment of remainder bounds
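
    Differential Algebraic methods replace numbers by truncated Taylor expansions, so that ordinary arithmetic automatically carries derivatives (and hence aberrations) to the chosen order. The following one-variable sketch of truncated-series multiplication shows only the underlying arithmetic; it is an illustration under stated assumptions, not COSY INFINITY's implementation.

```python
def ta_mul(a, b):
    """Multiply two truncated Taylor series given as coefficient lists [c0, c1, ..., cN];
    the product is truncated to the same order."""
    n = len(a)
    c = [0.0] * n
    for i in range(n):
        for j in range(n - i):
            c[i + j] += a[i] * b[j]
    return c

# f(x) = 1 + 2x + 3x^2 and g(x) = 4 - x, expanded about x = 0 to order 2
f = [1.0, 2.0, 3.0]
g = [4.0, -1.0, 0.0]
print(ta_mul(f, g))   # coefficients of f*g up to x^2: [4.0, 7.0, 10.0]
```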

  5. FORIG: a computer code for calculating radionuclide generation and depletion in fusion and fission reactors. User's manual

    International Nuclear Information System (INIS)

    Blink, J.A.

    1985-03-01

    In this manual we describe the use of the FORIG computer code to solve isotope-generation and depletion problems in fusion and fission reactors. FORIG runs on a Cray-1 computer and accepts more extensive activation cross sections than ORIGEN2, from which it was adapted. This report is an updated and combined version of the previous ORIGEN2 and FORIG manuals. 7 refs., 15 figs., 13 tabs
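
    Isotope generation and depletion problems of the kind FORIG and ORIGEN2 solve reduce to a coupled linear system dN/dt = A N, where the matrix A holds decay constants and reaction rates. A hedged illustration for a simple two-member decay chain, using a matrix exponential (the nuclides, rates, and solver choice are arbitrary examples, not FORIG data or methods):

```python
import numpy as np
from scipy.linalg import expm

# Two-member decay chain A -> B -> (stable); decay constants in 1/s (illustrative values)
lam_a, lam_b = 1.0e-3, 5.0e-4
A = np.array([[-lam_a, 0.0],
              [ lam_a, -lam_b]])

N0 = np.array([1.0e20, 0.0])      # initial atom inventories of A and B
t = 3600.0                        # elapsed time, one hour
N = expm(A * t) @ N0              # N(t) = exp(A t) N0
print(N)                          # remaining atoms of A and B after one hour
```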

  6. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (DEC VAX VMS VERSION)

    Science.gov (United States)

    Donnell, B.

    1994-01-01

    COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the

  7. Nuclear criticality safety handbook. Version 2

    International Nuclear Information System (INIS)

    1999-03-01

    The Nuclear Criticality Safety Handbook, Version 2 essentially incorporates the Supplement Report to the Nuclear Criticality Safety Handbook, released in 1995, into the first version of the Nuclear Criticality Safety Handbook, published in 1988. The following two points are new: (1) examples of safety margins related to modelled dissolution and extraction processes, and (2) a description of evaluation methods and an alarm system for criticality accidents. The chapter that treats modelling of the fuel system is revised on the basis of previous studies: e.g., the fuel grain size below which the system can be regarded as homogeneous, the non-uniformity effect of fuel solution, and burnup credit. This revision resolves the inconsistencies found in the first version between the evaluation of errors in the JACS code system and the criticality condition data that were calculated based on that evaluation. (author)

  8. IGT-Open: An open-source, computerized version of the Iowa Gambling Task.

    Science.gov (United States)

    Dancy, Christopher L; Ritter, Frank E

    2017-06-01

    The Iowa Gambling Task (IGT) is commonly used to understand the processes involved in decision-making. Though the task was originally run without a computer, using a computerized version of the task has become typical. These computerized versions of the IGT are useful, because they can make the task more standardized across studies and allow for the task to be used in environments where a physical version of the task may be difficult or impossible to use (e.g., while collecting brain imaging data). Though these computerized versions of the IGT have been useful for experimentation, having multiple software implementations of the task could present reliability issues. We present an open-source software version of the Iowa Gambling Task (called IGT-Open) that allows for millisecond visual presentation accuracy and is freely available to be used and modified. This software has been used to collect data from human subjects and also has been used to run model-based simulations with computational process models developed to run in the ACT-R architecture.
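
    A computerized IGT presents four decks with different reward and penalty schedules and records every selection and outcome. The console sketch below shows only the shape of such a trial loop; the payoff schedules, starting loan, and random selections are simplified stand-ins, not the schedules or interface used by IGT-Open.

```python
import random

# Simplified payoff schedules: (gain per pick, probability of a loss, loss size)
DECKS = {
    "A": (100, 0.5, 250),   # "disadvantageous" decks: large gains, larger expected losses
    "B": (100, 0.1, 1250),
    "C": (50, 0.5, 50),     # "advantageous" decks: small gains, small losses
    "D": (50, 0.1, 250),
}

def play(n_trials=20, seed=0):
    rng = random.Random(seed)
    total = 2000                                  # illustrative starting loan
    for trial in range(1, n_trials + 1):
        choice = rng.choice(list(DECKS))          # replace with participant input
        gain, p_loss, loss = DECKS[choice]
        outcome = gain - (loss if rng.random() < p_loss else 0)
        total += outcome
        print(f"trial {trial:2d}: deck {choice}, outcome {outcome:+5d}, total {total}")

if __name__ == "__main__":
    play()
```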

  9. Version pressure feedback mechanisms for speculative versioning caches

    Science.gov (United States)

    Eichenberger, Alexandre E.; Gara, Alan; O'Brien, Kathryn M.; Ohmacht, Martin; Zhuang, Xiaotong

    2013-03-12

    Mechanisms are provided for controlling version pressure on a speculative versioning cache. Raw version pressure data is collected based on one or more threads accessing cache lines of the speculative versioning cache. One or more statistical measures of version pressure are generated based on the collected raw version pressure data. A determination is made as to whether one or more modifications to an operation of a data processing system are to be performed based on the one or more statistical measures of version pressure, the one or more modifications affecting version pressure exerted on the speculative versioning cache. An operation of the data processing system is modified based on the one or more determined modifications, in response to a determination that one or more modifications to the operation of the data processing system are to be performed, to affect the version pressure exerted on the speculative versioning cache.
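
    The mechanism described is essentially a feedback loop: sample raw per-thread pressure counters, reduce them to a few statistics, and throttle speculation when the statistics exceed a threshold. The Python sketch below is only a schematic of that control logic; the thresholds, statistics, and actions are illustrative assumptions, not taken from the patent.

```python
from statistics import mean

def version_pressure_feedback(raw_counts, capacity=1024, soft_limit=0.7, hard_limit=0.9):
    """raw_counts: speculative cache lines currently held by each thread.
    Returns simple pressure statistics and an (illustrative) adjustment decision."""
    utilisation = [count / capacity for count in raw_counts]
    stats = {"mean": mean(utilisation), "max": max(utilisation)}
    if stats["max"] >= hard_limit:
        action = "squash the most speculative thread"
    elif stats["mean"] >= soft_limit:
        action = "reduce the number of speculative threads"
    else:
        action = "no change"
    return stats, action

print(version_pressure_feedback([512, 900, 300, 960]))
```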

  10. SPARK Version 1.1 user manual

    International Nuclear Information System (INIS)

    Weissenburger, D.W.

    1988-01-01

    This manual describes the input required to use Version 1.1 of the SPARK computer code. SPARK 1.1 is a library of FORTRAN main programs and subprograms designed to calculate eddy currents on conducting surfaces where current flow is assumed zero in the direction normal to the surface. Surfaces are modeled with triangular and/or quadrilateral elements. Lorentz forces produced by the interaction of eddy currents with background magnetic fields can be output at element nodes in a form compatible with most structural analysis codes. In addition, magnetic fields due to eddy currents can be determined at points off the surface. Version 1.1 features eddy current streamline plotting with optional hidden-surface-removal graphics and topological enhancements that allow essentially any orientable surface to be modeled. SPARK also has extensive symmetry specification options. In order to make the manual as self-contained as possible, six appendices are included that present summaries of the symmetry options, topological options, coil options and code algorithms, with input and output examples. An edition of SPARK 1.1 is available on the Cray computers at the National Magnetic Fusion Energy Computer Center at Livermore, California. Another more generic edition is operational on the VAX computers at the Princeton Plasma Physics Laboratory and is available on magnetic tape by request. The generic edition requires either the GKS or PLOT10 graphics package and the IMSL or NAG mathematical package. Requests from outside the United States will be subject to applicable federal regulations regarding dissemination of computer programs. 22 refs

  11. SPARK Version 1.1 user manual

    Energy Technology Data Exchange (ETDEWEB)

    Weissenburger, D.W.

    1988-01-01

    This manual describes the input required to use Version 1.1 of the SPARK computer code. SPARK 1.1 is a library of FORTRAN main programs and subprograms designed to calculate eddy currents on conducting surfaces where current flow is assumed zero in the direction normal to the surface. Surfaces are modeled with triangular and/or quadrilateral elements. Lorentz forces produced by the interaction of eddy currents with background magnetic fields can be output at element nodes in a form compatible with most structural analysis codes. In addition, magnetic fields due to eddy currents can be determined at points off the surface. Version 1.1 features eddy current streamline plotting with optional hidden-surface-removal graphics and topological enhancements that allow essentially any orientable surface to be modeled. SPARK also has extensive symmetry specification options. In order to make the manual as self-contained as possible, six appendices are included that present summaries of the symmetry options, topological options, coil options and code algorithms, with input and output examples. An edition of SPARK 1.1 is available on the Cray computers at the National Magnetic Fusion Energy Computer Center at Livermore, California. Another more generic edition is operational on the VAX computers at the Princeton Plasma Physics Laboratory and is available on magnetic tape by request. The generic edition requires either the GKS or PLOT10 graphics package and the IMSL or NAG mathematical package. Requests from outside the United States will be subject to applicable federal regulations regarding dissemination of computer programs. 22 refs.

  12. EASI graphics - Version II

    International Nuclear Information System (INIS)

    Allensworth, J.A.

    1984-04-01

    EASI (Estimate of Adversary Sequence Interruption) is an analytical technique for measuring the effectiveness of physical protection systems. EASI Graphics is a computer graphics extension of EASI which provides a capability for performing sensitivity and trade-off analyses of the parameters of a physical protection system. This document reports on the implementation of the Version II of EASI Graphics and illustrates its application with some examples. 5 references, 15 figures, 6 tables

  13. VizieR Online Data Catalog: SKY2000 Master Catalog, Version 5 (Myers+ 2006)

    Science.gov (United States)

    Myers, J. R.; Sande, C. B.; Miller, A. C.; Warren, W. H., Jr.; Tracewell, D. A.

    2015-02-01

    The SKYMAP Star Catalog System consists of a Master Catalog stellar database and a collection of utility software designed to create and maintain the database and to generate derivative mission star catalogs (run catalogs). It contains an extensive compilation of information on almost 300000 stars brighter than 8.0mag. The original SKYMAP Master Catalog was generated in the early 1970's. Incremental updates and corrections were made over the following years but the first complete revision of the source data occurred with Version 4.0. This revision also produced a unique, consolidated source of astrometric information which can be used by the astronomical community. The derived quantities were removed and wideband and photometric data in the R (red) and I (infrared) systems were added. Version 4 of the SKY2000 Master Catalog was completed in April 2002; it marks the global replacement of the variability identifier and variability data fields. More details can be found in the description file sky2kv4.pdf. The SKY2000 Version 5 Revision 4 Master Catalog differs from Revision 3 in that MK and HD spectral types have been added from the Catalogue of Stellar Spectral Classifications (B. A. Skiff of Lowell Observatory, 2005), which has been assigned source code 50 in this process. 9622 entries now have MK types from this source, while 3976 entries have HD types from this source. SKY2000 V5 R4 also differs globally from preceding MC versions in that the Galactic coordinate computations performed by UPDATE have been increased in accuracy, so that differences from the same quantities from other sources are now typically in the last decimal places carried in the MC. This version supersedes the previous versions 1(V/95), 2(V/102), 3(V/105) and 4(V/109). (6 data files).

  14. Computer code SICHTA-85/MOD 1 for thermohydraulic and mechanical modelling of WWER fuel channel behaviour during LOCA and comparison with original version of the SICHTA code

    International Nuclear Information System (INIS)

    Bujan, A.; Adamik, V.; Misak, J.

    1986-01-01

    A brief description is presented of the extension of the SICHTA-83 computer code, used for the analysis of the thermal history of the fuel channel during large LOCAs, to include modelling of the mechanical behaviour of fuel element cladding. The new version of the code has a more detailed treatment of heat transfer in the fuel-cladding gap because it also accounts for the mechanical (plastic) deformations of the cladding and for the fuel-cladding interaction (magnitude of contact pressure). The change in pressure of the gas filling the fuel element is also taken into account, a mechanical criterion for cladding failure is considered, and the degree of blockage of the coolant flow cross section in the fuel channel is evaluated. A model computation of a WWER-440 LOCA compares the new SICHTA-85/MOD 1 code with the results of the original SICHTA-83 version. (author)

  15. A hybrid version of swan for fast and efficient practical wave modelling

    NARCIS (Netherlands)

    M. Genseberger (Menno); J. Donners

    2016-01-01

    In the Netherlands, for coastal and inland water applications, wave modelling with SWAN has become a main ingredient. However, computational times are relatively high. Therefore we investigated the parallel efficiency of the current MPI and OpenMP versions of SWAN. The MPI version is

  16. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V&V) manual. Volume 9

    International Nuclear Information System (INIS)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M.

    1995-03-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA), the Models And Results Database (MAR-D), and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE. Previous efforts included the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Also, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0, with revisions to include the new SAPHIRE 5.0 features as well as to incorporate lessons learned from the previous effort. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified. Modifications made to SAPHIRE are identified

  17. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 1: Chapters 1-11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    determination of the optimal expansion of combined thermal and hydro power systems, taking into account the optimal operation of the hydro reservoirs throughout the year. Microcomputer (PC) versions of WASP-III and MAED have also been developed as stand-alone programs and as part of an integrated package for energy and electricity planning called ENPEP (Energy and Power Evaluation Program). A PC version of the VALORAGUA model was also completed in 1992. With all these developments, the catalogue of planning methodologies offered by the IAEA to its Member States has been upgraded to facilitate the work of electricity planners; WASP in particular is currently accepted as a powerful tool for electric system expansion planning. Nevertheless, experienced users of the program have indicated the need to introduce more enhancements within the WASP model in order to cope with the problems constantly faced by planners owing to the increasing complexity of this type of analysis. With several Member States, the IAEA has completed a new version of the WASP program, which has been called WASP-III Plus since it follows quite closely the methodology of the WASP-III model. The major enhancements in WASP-III Plus with respect to the WASP-III version are: an increase in the number of thermal fuel types (from 5 to 10); verification of which configurations generated by CONGEN have already been simulated in previous iterations with MERSIM; direct calculation of the combined Loading Order of FIXSYS and VARSYS plants; simulation of system operation that includes consideration of physical constraints imposed on some fuel types (i.e., fuel availability for electricity generation); extended output of the resimulation of the optimal solution; generation of a file that can be used for graphical representation of the results of the resimulation of the optimal solution and of the cash flows of the investment costs; and a calculation of cash flows that allows the inclusion of the capital costs of plants firmly committed or in construction

  18. Emphysema and bronchiectasis in COPD patients with previous pulmonary tuberculosis: computed tomography features and clinical implications

    Directory of Open Access Journals (Sweden)

    Jin J

    2018-01-01

    Jianmin Jin,1 Shuling Li,2 Wenling Yu,2 Xiaofang Liu,1 Yongchang Sun1,3 1Department of Respiratory and Critical Care Medicine, Beijing Tongren Hospital, Capital Medical University, Beijing, 2Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, 3Department of Respiratory and Critical Care Medicine, Peking University Third Hospital, Beijing, China Background: Pulmonary tuberculosis (PTB) is a risk factor for COPD, but the clinical characteristics and the chest imaging features (emphysema and bronchiectasis) of COPD with previous PTB have not been studied well. Methods: The presence, distribution, and severity of emphysema and bronchiectasis in COPD patients with and without previous PTB were evaluated by high-resolution computed tomography (HRCT) and compared. Demographic data, respiratory symptoms, lung function, and sputum culture of Pseudomonas aeruginosa were also compared between patients with and without previous PTB. Results: A total of 231 COPD patients (82.2% ex- or current smokers, 67.5% male) were consecutively enrolled. Patients with previous PTB (45.0%) had more severe (p=0.045) and longer history (p=0.008) of dyspnea, more exacerbations in the previous year (p=0.011), and more positive culture of P. aeruginosa (p=0.001), compared with those without PTB. Patients with previous PTB showed a higher prevalence of bronchiectasis (p<0.001), which was more significant in lungs with tuberculosis (TB) lesions, and a higher percentage of more severe bronchiectasis (Bhalla score ≥2, p=0.031), compared with those without previous PTB. The overall prevalence of emphysema was not different between patients with and without previous PTB, but in those with previous PTB, a higher number of subjects with middle (p=0.001) and lower (p=0.019) lobe emphysema, higher severity score (p=0.028), higher prevalence of panlobular emphysema (p=0.013), and more extensive centrilobular emphysema (p=0.039) were observed. Notably, in patients with

  19. TRASYS - THERMAL RADIATION ANALYZER SYSTEM (DEC VAX VERSION WITH NASADIG)

    Science.gov (United States)

    Anderson, G. E.

    1994-01-01

    . The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. The standard distribution medium for MSC-21959 (CRAY version

  20. TRASYS - THERMAL RADIATION ANALYZER SYSTEM (DEC VAX VERSION WITHOUT NASADIG)

    Science.gov (United States)

    Vogt, R. A.

    1994-01-01

    . The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. The standard distribution medium for MSC-21959 (CRAY version

  1. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—Technical manual for version 2.8

    Science.gov (United States)

    Mueller, David S.

    2016-06-21

    The software program QRev applies common and consistent computational algorithms, combined with automated filtering and quality assessment of the data, to improve the quality and efficiency of streamflow measurements, and it helps ensure that U.S. Geological Survey streamflow measurements are consistent, accurate, and independent of the manufacturer of the instrument used to make the measurement. Software from different manufacturers uses different algorithms for various aspects of the data processing and discharge computation. The algorithms used by QRev to filter data, interpolate data, and compute discharge are documented and compared to the algorithms used in the manufacturers’ software. QRev applies consistent algorithms and creates a data structure that is independent of the data source. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field notes software. This report is the technical manual for version 2.8 of QRev.

  2. Item analysis of the Spanish version of the Boston Naming Test with a Spanish speaking adult population from Colombia.

    Science.gov (United States)

    Kim, Stella H; Strutt, Adriana M; Olabarrieta-Landa, Laiene; Lequerica, Anthony H; Rivera, Diego; De Los Reyes Aragon, Carlos Jose; Utria, Oscar; Arango-Lasprilla, Juan Carlos

    2018-02-23

    The Boston Naming Test (BNT) is a widely used measure of confrontation naming ability that has been criticized for its questionable construct validity for non-English speakers. This study investigated the item difficulty and construct validity of the Spanish version of the BNT to assess the cultural and linguistic impact on performance. Subjects were 1298 healthy Spanish-speaking adults from Colombia. They were administered the 60- and 15-item Spanish versions of the BNT. A Rasch analysis was conducted to assess dimensionality, item hierarchy, targeting, reliability, and item fit. Both versions of the BNT satisfied the requirements for unidimensionality. Although internal consistency was excellent for the 60-item BNT, the order of difficulty did not increase consistently with item number and there were a number of items that did not fit the Rasch model. For the 15-item BNT, a total of 5 items changed position on the item hierarchy, with 7 poorly fitting items. Internal consistency was acceptable. The construct validity of the BNT remains a concern when it is administered to non-English-speaking populations. Similar to previous findings, the order of item presentation did not correspond with increasing item difficulty, and both versions were inadequate at assessing high naming ability.

  3. TPDWR2: thermal power determination for Westinghouse reactors, Version 2. User's guide

    International Nuclear Information System (INIS)

    Kaczynski, G.M.; Woodruff, R.W.

    1985-12-01

    TPDWR2 is a computer program which was developed to determine the amount of thermal power generated by any Westinghouse nuclear power plant. From system conditions, TPDWR2 calculates enthalpies of water and steam and the power transferred to or from various components in the reactor coolant system and to or from the chemical and volume control system. From these results and assuming that the reactor core is operating at constant power and is at thermal equilibrium, TPDWR2 calculates the thermal power generated by the reactor core. TPDWR2 runs on the IBM PC and XT computers when IBM Personal Computer DOS, Version 2.00 or 2.10, and IBM Personal Computer Basic, Version D2.00 or D2.10, are stored on the same diskette with TPDWR2
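
    The heart of such a calorimetric calculation is a steam-generator heat balance, P = m_dot (h_steam - h_feed) summed over the coolant loops and corrected for other heat additions and losses. The sketch below shows that arithmetic only; the flows, enthalpies, and correction terms are illustrative placeholders, not plant data or TPDWR2 output.

```python
def loop_power(feed_flow_kg_s, h_steam_kj_kg, h_feed_kj_kg):
    """Heat removed by one steam generator, in MW."""
    return feed_flow_kg_s * (h_steam_kj_kg - h_feed_kj_kg) / 1.0e3

# Illustrative four-loop plant assumed to be at constant power and thermal equilibrium:
# (feedwater flow kg/s, steam enthalpy kJ/kg, feedwater enthalpy kJ/kg)
loops = [
    (470.0, 2770.0, 960.0),
    (468.0, 2772.0, 958.0),
    (471.0, 2769.0, 961.0),
    (469.0, 2771.0, 959.0),
]

secondary_power = sum(loop_power(*loop) for loop in loops)   # MW removed by the loops
pump_heat, losses = 14.0, 2.0                                # MW, illustrative corrections
core_power = secondary_power - pump_heat + losses
print(f"core thermal power ~ {core_power:.0f} MW")
```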

  4. Monteray Mark-I: Computer program (PC-version) for shielding calculation with Monte Carlo method

    International Nuclear Information System (INIS)

    Pudjijanto, M.S.; Akhmad, Y.R.

    1998-01-01

    A computer program for gamma ray shielding calculations using the Monte Carlo method has been developed. The program is written in the WATFOR77 language. MONTERAY MARK-I was originally developed by James Wood. The program was modified by the authors so that the modified version is easily executed. Applying the Monte Carlo method, the program follows gamma photon transport in an infinite planar shield of various thicknesses. A photon is tracked until it escapes from the shield or its energy falls below the cut-off energy. Pair production is treated as a pure absorption process; the annihilation photons generated in the process are neglected in the calculation. The output data calculated by the program are the total albedo, the build-up factor, and photon spectra. The calculated build-up factors for lead and water slabs with a 6 MeV parallel-beam gamma source agree with published data. Hence the program is adequate as a shielding design tool for studying gamma radiation transport in various media
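
    The tracking rules stated above (follow each photon until it escapes the slab or drops below the cut-off energy, and treat pair production as pure absorption) can be caricatured in a few lines. The sketch below uses made-up constant cross sections and a crude energy-loss rule purely to show the bookkeeping; it is not MONTERAY's physics or data.

```python
import math
import random

def track_photons(n_photons, thickness_cm, mu_total=0.06, absorb_fraction=0.5,
                  e0_mev=6.0, e_cut_mev=0.05, seed=1):
    """Toy 1-D slab transport: exponential path lengths, random redirection,
    constant interaction data (illustrative assumptions only)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        x, mu_dir, energy = 0.0, 1.0, e0_mev          # photon starts normal to the slab
        while energy > e_cut_mev:
            x += mu_dir * -math.log(rng.random()) / mu_total
            if x >= thickness_cm:                     # escaped through the back face
                transmitted += 1
                break
            if x < 0.0:                               # backscattered out (albedo)
                break
            if rng.random() < absorb_fraction:        # absorbed (incl. pair production)
                break
            energy *= rng.uniform(0.3, 0.9)           # crude Compton energy degradation
            mu_dir = rng.uniform(-1.0, 1.0)           # new direction cosine
    return transmitted / n_photons

print("transmitted fraction:", track_photons(20000, thickness_cm=20.0))
```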

  5. Large-scale computer-mediated training for management teachers

    Directory of Open Access Journals (Sweden)

    Gilly Salmon

    1997-01-01

    In 1995/6 the Open University Business School (OUBS) trained 187 tutors in the UK and Continental Western Europe in Computer Mediated Conferencing (CMC) for management education. The medium chosen for the training was FirstClass™. In 1996/7 the OUBS trained a further 106 tutors in FirstClass™ using an improved version of the previous year's training. The online training was based on a previously developed model of learning online. The model was tested both by means of the structure of the training programme and the improvements made. The training programme was evaluated and revised for the second cohort. Comparison was made between the two training programmes.

  6. GEMPAK 5.1 - A GENERAL METEOROLOGICAL PACKAGE (UNIX VERSION)

    Science.gov (United States)

    Desjardins, M. L.

    1994-01-01

    GEMPAK is a general meteorological software package developed at NASA/Goddard Space Flight Center. It includes programs to analyze and display surface, upper-air, and gridded data, including model output. There are very general programs to list, edit, and plot data on maps, to display profiles and time series, to draw and fill contours, to draw streamlines, to plot symbols for clouds, sky cover, and pressure tendency, and draw cross sections in the case of gridded data and sounding data. In addition, there are Barnes objective analysis programs to grid surface and upper-air data. The programs include the capabilities to derive meteorological parameters from those found in the dataset, to perform vertical interpolations of sounding data to different coordinate systems, and to compute an extensive set of gridded diagnostic quantities by specifying various nested combinations of scalars and vector arithmetic, algebraic, and differential operators. The GEMPAK 5.1 graphics/transformation subsystem, GEMPLT, provides device-independent graphics. GEMPLT also has the capability to display output in a variety of map projections or overlaid on satellite imagery. GEMPAK 5.1 is written in FORTRAN 77 and C-language and has been implemented on VAX computers under VMS and on computers running the UNIX operating system. During installation and normal use, this package occupies approximately 100Mb of hard disk space. The UNIX version of GEMPAK includes drivers for several graphic output systems including MIT's X Window System (X11,R4), Sun GKS, PostScript (color and monochrome), Silicon Graphics, and others. The VMS version of GEMPAK also includes drivers for several graphic output systems including PostScript (color and monochrome). The VMS version is delivered with the object code for the Transportable Applications Environment (TAE) program, version 4.1 which serves as a user interface. A color monitor is recommended for displaying maps on video display devices. Data for rendering

  7. An Improved Version of TOPAZ 3D

    International Nuclear Information System (INIS)

    Krasnykh, Anatoly

    2003-01-01

    An improved version of the TOPAZ 3D gun code is presented as a powerful tool for beam optics simulation. In contrast to the previous version of TOPAZ 3D, the geometry of the device under test is introduced into TOPAZ 3D directly from a CAD program, such as Solid Edge or AutoCAD. In order to have this new feature, an interface was developed, using the GiD software package as a meshing code. The article describes this method with two models to illustrate the results

  8. Transfer and development of the PC version of ABAQUS program

    International Nuclear Information System (INIS)

    Li Xiaofeng; Zhu Yuqiao

    1998-01-01

    The transfer and development of the PC version of ABAQUS, a large nonlinear mechanical finite element analysis program, are described. Special problems, such as differences in floating-point data formats between computers and unexpected computer halts during data transfer, were solved, and a visual I/O capability was added during the redevelopment. By utilizing the visual capability, the analysis workload is reduced and the correctness of the analysis is ensured. The PC version of ABAQUS was tested against the standard examples from the VAX version, and the calculation results are correct. The stress and deformation results for the CEFR shell structure calculated with PC ABAQUS and with ADINA agree very well
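
    One concrete instance of the floating-point data format problem mentioned above is byte order: binary results written on one machine must be byte-swapped before another machine can read them. The sketch below shows such a conversion for IEEE doubles; it is illustrative only, since an actual VAX-to-PC transfer involves floating formats that need more than a byte swap.

```python
import struct

def swap_float64(raw: bytes) -> bytes:
    """Re-encode a stream of big-endian IEEE doubles as little-endian bytes."""
    count = len(raw) // 8
    values = struct.unpack(f">{count}d", raw)    # interpret as big-endian
    return struct.pack(f"<{count}d", *values)    # re-pack as little-endian

big_endian = struct.pack(">3d", 1.0, -2.5, 3.25)
little_endian = swap_float64(big_endian)
print(struct.unpack("<3d", little_endian))       # (1.0, -2.5, 3.25)
```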

  9. ARROW (Version 2) Commercial Software Validation and Configuration Control

    Energy Technology Data Exchange (ETDEWEB)

    HEARD, F.J.

    2000-02-10

    ARROW (Version 2), a compressible flow piping network modeling and analysis computer program from Applied Flow Technology, was installed for use at the U.S. Department of Energy Hanford Site near Richland, Washington.

  10. ARROW (Version 2) Commercial Software Validation and Configuration Control

    International Nuclear Information System (INIS)

    HEARD, F.J.

    2000-01-01

    ARROW (Version 2), a compressible flow piping network modeling and analysis computer program from Applied Flow Technology, was installed for use at the U.S. Department of Energy Hanford Site near Richland, Washington

  11. Numerical Recipes in C++: The Art of Scientific Computing (2nd edn). Numerical Recipes Example Book (C++) (2nd edn). Numerical Recipes Multi-Language Code CD ROM with LINUX or UNIX Single-Screen License Revised Version

    International Nuclear Information System (INIS)

    Borcherds, P

    2003-01-01

    The two Numerical Recipes books are marvellous. The principal book, The Art of Scientific Computing, contains program listings for almost every conceivable requirement, and it also contains a well written discussion of the algorithms and the numerical methods involved. The Example Book provides a complete driving program, with helpful notes, for nearly all the routines in the principal book. The first edition of Numerical Recipes: The Art of Scientific Computing was published in 1986 in two versions, one with programs in Fortran, the other with programs in Pascal. There were subsequent versions with programs in BASIC and in C. The second, enlarged edition was published in 1992, again in two versions, one with programs in Fortran (NR(F)), the other with programs in C (NR(C)). In 1996 the authors produced Numerical Recipes in Fortran 90: The Art of Parallel Scientific Computing as a supplement, called Volume 2, with the original (Fortran) version referred to as Volume 1. Numerical Recipes in C++ (NR(C++)) is another version of the 1992 edition. The numerical recipes are also available on a CD ROM: if you want to use any of the recipes, I would strongly advise you to buy the CD ROM. The CD ROM contains the programs in all the languages. When the first edition was published I bought it, and have also bought copies of the other editions as they have appeared. Anyone involved in scientific computing ought to have a copy of at least one version of Numerical Recipes, and there also ought to be copies in every library. If you already have NR(F), should you buy the NR(C++) and, if not, which version should you buy? In the preface to Volume 2 of NR(F), the authors say 'C and C++ programmers have not been far from our minds as we have written this volume, and we think that you will find that time spent in absorbing its principal lessons will be amply repaid in the future as C and C++ eventually develop standard parallel extensions'. In the preface and introduction to NR

  12. NDL-v2.0: A new version of the numerical differentiation library for parallel architectures

    Science.gov (United States)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Voglis, C.; Papageorgiou, D. G.; Lagaris, I. E.

    2014-07-01

    We present a new version of the numerical differentiation library (NDL) used for the numerical estimation of first and second order partial derivatives of a function by finite differencing. In this version we have restructured the serial implementation of the code so as to achieve optimal task-based parallelization. The pure shared-memory parallelization of the library has been based on the lightweight OpenMP tasking model allowing for the full extraction of the available parallelism and efficient scheduling of multiple concurrent library calls. On multicore clusters, parallelism is exploited by means of TORC, an MPI-based multi-threaded tasking library. The new MPI implementation of NDL provides optimal performance in terms of function calls and, furthermore, supports asynchronous execution of multiple library calls within legacy MPI programs. In addition, a Python interface has been implemented for all cases, exporting the functionality of our library to sequential Python codes. Catalog identifier: AEDG_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEDG_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 63036 No. of bytes in distributed program, including test data, etc.: 801872 Distribution format: tar.gz Programming language: ANSI Fortran-77, ANSI C, Python. Computer: Distributed systems (clusters), shared memory systems. Operating system: Linux, Unix. Has the code been vectorized or parallelized?: Yes. RAM: The library uses O(N) internal storage, N being the dimension of the problem. It can use up to O(N2) internal storage for Hessian calculations, if a task throttling factor has not been set by the user. Classification: 4.9, 4.14, 6.5. Catalog identifier of previous version: AEDG_v1_0 Journal reference of previous version: Comput. Phys. Comm. 180
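
    The library estimates derivatives by finite differencing, and each function evaluation in such a scheme is independent, which is exactly what the task-based parallelization exploits. A serial sketch of the central-difference gradient formula, (f(x + h e_i) - f(x - h e_i)) / (2h), is shown below; the step size, test function, and interface are illustrative assumptions, not NDL's API.

```python
import math

def central_gradient(f, x, h=1.0e-6):
    """First partial derivatives of f at x by central differences;
    each pair of evaluations is independent and could run as a separate task."""
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2.0 * h))
    return grad

# Example: f(x, y) = sin(x) * y^2, gradient at (0.5, 2.0)
f = lambda v: math.sin(v[0]) * v[1] ** 2
print(central_gradient(f, [0.5, 2.0]))   # ~ [4*cos(0.5), 4*sin(0.5)]
```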

  13. Antepartum transabdominal amnioinfusion to facilitate external cephalic version after initial failure.

    Science.gov (United States)

    Benifla, J L; Goffinet, F; Darai, E; Madelenat, P

    1994-12-01

    Transabdominal amnioinfusion can be used to facilitate external cephalic version. Our technique involves filling the uterine cavity with 700 or 900 mL of 37°C saline under continuous echographic monitoring. External cephalic version is done the next morning. We have used this procedure in six women, all of whom had previous unsuccessful attempts at external cephalic version. After amnioinfusion, all six patients were converted to cephalic presentation and delivered normally, without obstetric or neonatal complications.

  14. National Radiobiology Archives Distributed Access User's Manual, Version 1. 1

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.K.; Prather, J.C.; Ligotke, E.K.; Watson, C.R.

    1992-06-01

    This supplement to the NRA Distributed Access User's manual (PNL-7877), November 1991, describes installation and use of Version 1.1 of the software package; this is not a replacement of the previous manual. Version 1.1 of the NRA Distributed Access Package is a maintenance release. It eliminates several bugs, and includes a few new features which are described in this manual. Although the appearance of some menu screens has changed, we are confident that the Version 1.0 User's Manual will provide an adequate introduction to the system. Users who are unfamiliar with Version 1.0 may wish to experiment with that version before moving on to Version 1.1.

  15. Development of IMPACTS-BRC, Version 2.1

    International Nuclear Information System (INIS)

    Rao, R.R.; Kozak, M.W.; Rollstin, J.A.

    1991-01-01

    IMPACTS-BRC is a computer program developed to conduct scoping analyses for use in supporting rulemaking on petitions for exemption of waste streams from multiple producers. It was not initially intended for use on individual license applications for specific sites. However, the Federal Register, Volume 51, Number 168, specifies that IMPACTS-BRC be used to evaluate incoming license applications. This creates a problem since IMPACTS-BRC is not being used for its intended purpose. It is a generic code that is now being used for site specific applications. This is only a valid procedure if it can be shown that generic results from IMPACTS-BRC are conservative when compared to results from site specific models. Otherwise, IMPACTS-BRC should not be used. The purpose of this work was to verify that IMPACTS-BRC works as specified in its user's guide. In other words, Sandia National Laboratories (SNL) has determined that the mathematical models given in the user's guide are correctly implemented into the computer code. No direct work has been done to verify that the mathematical models used in the code are appropriate for the purpose that they are being used. In fact, scrutiny of the groundwater transport models in IMPACTS-BRC has led us to recommend that alternate geosphere models should be used. Other work carried out for this project included verifying that the input data for IMPACTS-BRC is correct and traceable. This was carried out, and a new version of the data with these qualities was produced. The new version of the data was used with the verified IMPACTS-BRC, Version 2.0 to produce IMPACTS-BRC, Version 2.1

  16. HECTR [Hydrogen Event Containment Transient Response] Version 1.5N: A modification of HECTR Version 1.5 for application to N Reactor

    International Nuclear Information System (INIS)

    Camp, A.L.; Dingman, S.E.

    1987-05-01

    This report describes HECTR Version 1.5N, which is a special version of HECTR developed specifically for application to the N Reactor. HECTR is a fast-running, lumped-parameter containment analysis computer program that is most useful for performing parametric studies. The main purpose of HECTR is to analyze nuclear reactor accidents involving the transport and combustion of hydrogen, but HECTR can also function as an experiment analysis tool and can solve a limited set of other types of containment problems. Version 1.5N is a modification of Version 1.5 and includes changes to the spray actuation logic, and models for steam vents, vacuum breakers, and building cross-vents. Thus, all of the key features of the N Reactor confinement can be modeled. HECTR is designed for flexibility and provides for user control of many important parameters, if built-in correlations and default values are not desired

  17. FY17 Status Report on the Computing Systems for the Yucca Mountain Project TSPA-LA Models.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reynolds, John Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Garland, Jason P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Sandia National Laboratories (SNL) continued evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and the knowledge capability for TSPA-type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014), Hadgu et al. (2015) and Hadgu and Appel (2016). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim with Versions 9.60.300, 10.5, 11.1 and 12.0 was installed on the cluster head node, and its distributed processing capability was mapped onto the cluster processors. Other supporting software was tested and installed to support the TSPA-type analysis on the server cluster. The current tasks included a preliminary upgrade of the TSPA-LA from Version 9.60.300 to the latest Version 12.0 and addressing DLL-related issues observed in the FY16 work. The model upgrade task successfully converted the Nominal Modeling case to GoldSim Versions 11.1/12. Conversions of the rest of the TSPA models were also attempted, but program and operational difficulties precluded this. Upgrade of the remaining modeling cases and distributed processing tasks is expected to continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.

  18. Prediction of Success in External Cephalic Version under Tocolysis: Still a Challenge.

    Science.gov (United States)

    Vaz de Macedo, Carolina; Clode, Nuno; Mendes da Graça, Luís

    2015-01-01

    External cephalic version is a procedure of fetal rotation to a cephalic presentation through manoeuvres applied to the maternal abdomen. There are several prognostic factors described in the literature for external cephalic version success, and prediction scores have been proposed, but their true implication in clinical practice is controversial. We aim to identify possible factors that could contribute to the success of an external cephalic version attempt in our population. We retrospectively examined 207 consecutive external cephalic version attempts under tocolysis conducted between January 1997 and July 2012. We consulted the department's database for the following variables: race, age, parity, maternal body mass index, gestational age, estimated fetal weight, breech category, placental location and amniotic fluid index. We performed descriptive and analytical statistics for each variable and binary logistic regression. External cephalic version was successful in 46.9% of cases (97/207). None of the included variables was associated with the outcome of external cephalic version attempts after adjustment for confounding factors. We present a success rate similar to what has been previously described in the literature. However, in contrast to previous authors, we could not associate any of the analysed variables with success of the external cephalic version attempt. We believe this discrepancy is partly related to the type of statistical analysis performed. Even though there are numerous prognostic factors identified for success in external cephalic version, care must be taken when counselling and selecting patients for this procedure. The data obtained suggest that external cephalic version should continue being offered to all eligible patients regardless of prognostic factors for success.

  19. Wien Automatic System Planning (WASP) Package. A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 1: Chapters 1-11

    International Nuclear Information System (INIS)

    1995-01-01

    determination of the optimal expansion of combined thermal and hydro power systems, taking into account the optimal operation of the hydro reservoirs throughout the year. Microcomputer (PC) versions of WASP-III and MAED have also been developed as stand-alone programs and as part of an integrated package for energy and electricity planning called ENPEP (Energy and Power Evaluation Program). A PC version of the VALORAGUA model was also completed in 1992. With all these developments, the catalogue of planning methodologies offered by the IAEA to its Member States has been upgraded to facilitate the work of electricity planners; WASP in particular is currently accepted as a powerful tool for electric system expansion planning. Nevertheless, experienced users of the program have indicated the need to introduce more enhancements within the WASP model in order to cope with the problems constantly faced by planners owing to the increasing complexity of this type of analysis. With several Member States, the IAEA has completed a new version of the WASP program, which has been called WASP-III Plus since it follows quite closely the methodology of the WASP-III model. The major enhancements in WASP-III Plus with respect to the WASP-III version are: an increase in the number of thermal fuel types (from 5 to 10); verification of which configurations generated by CONGEN have already been simulated in previous iterations with MERSIM; direct calculation of the combined Loading Order of FIXSYS and VARSYS plants; simulation of system operation that includes consideration of physical constraints imposed on some fuel types (i.e., fuel availability for electricity generation); extended output of the resimulation of the optimal solution; generation of a file that can be used for graphical representation of the results of the resimulation of the optimal solution and cash flows of the investment costs; and calculation of cash flows that allows the inclusion of the capital costs of plants firmly committed or in construction

  20. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer
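
    To make the "quantum version of a random walk" concrete, here is a small, self-contained sketch of a discrete-time Hadamard-coin walk on a line; it is a textbook toy model, not code from the paper.

      # Illustrative discrete-time quantum walk on a line (Hadamard coin).
      import numpy as np

      steps = 100
      pos = 2 * steps + 1                      # positions -steps..steps
      psi = np.zeros((pos, 2), dtype=complex)  # amplitude[position, coin]
      psi[steps, 0] = 1.0                      # walker starts at the origin, coin "up"

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard coin operator

      for _ in range(steps):
          psi = psi @ H.T                      # coin toss
          shifted = np.zeros_like(psi)
          shifted[1:, 0] = psi[:-1, 0]         # coin 0 moves right
          shifted[:-1, 1] = psi[1:, 1]         # coin 1 moves left
          psi = shifted

      prob = (np.abs(psi) ** 2).sum(axis=1)    # position distribution after 100 steps
      print(prob.argmax() - steps)             # spreads ballistically, unlike a classical walk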

  1. RASCAL Version 2.0 workbook

    International Nuclear Information System (INIS)

    Athey, G.F.; McKenna, T.J.

    1993-05-01

    The Radiological Assessment System for Consequence Analysis, Version 2.0 (RASCAL 2.0) has been developed for use by NRC personnel who respond to radiological emergencies. This workbook is intended to complement the RASCAL 2.0 User's Guide (NUREG/CR-5247, Vol. 1). The workbook contains exercises designed to familiarize the user with the computer-based tools of RASCAL through hands-on problem solving. The workbook is composed of four major sections. The first part is a RASCAL familiarization exercise to acquaint the user with the operation of the forms, menus, on-line help, and documentation. The latter three parts contain exercises in using the three tools of RASCAL Version 2.0: DECAY, FM-DOSE, and ST-DOSE. Each section of exercises is followed by a discussion of how the tools could be used to solve the problem
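
    For orientation, the sketch below shows the kind of simple calculation the DECAY tool addresses (decay of a nuclide's activity over an elapsed time); the nuclide, half-life and activity are illustrative values only, not taken from the workbook exercises.

      # Toy activity-decay calculation of the sort a decay tool performs;
      # values are illustrative only.
      import math

      def activity_after(a0_bq, half_life_h, elapsed_h):
          """Activity (Bq) after elapsed_h hours, given initial activity and half-life."""
          lam = math.log(2) / half_life_h          # decay constant (1/h)
          return a0_bq * math.exp(-lam * elapsed_h)

      # Example: 1 TBq of I-131 (half-life about 8.02 days) after 24 hours
      print(activity_after(1.0e12, 8.02 * 24, 24.0))   # roughly 9.2e11 Bq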

  2. The FORM version of MINCER

    International Nuclear Information System (INIS)

    Larin, S.A.; Academy of Sciences of the USSR, Moscow; Tkachov, F.V.; McGill Univ., Montreal, PQ; Academy of Sciences of the USSR, Moscow; Vermaseren, J.A.M.

    1991-01-01

    The program MINCER for massless three-loop Feynman diagrams of the propagator type has been reprogrammed in the language of FORM. The new version is thoroughly optimized and can be run from a utility like the UNIX make, which allows one to conveniently process large numbers of diagrams. It has been used for some calculations that were previously not practical. (author). 22 refs.; 14 figs

  3. Condiment: general synthesis of different versions

    International Nuclear Information System (INIS)

    Mangin, J.P.

    1990-01-01

    CONDIMENT is a code for the computation of ion migration and diffusion in areas close to radwaste storage facilities. This type of application was found to require a mesh pattern and boundary conditions different from the usual ones, which justified the writing of a new code. A first version (version 2) covers only the migration of a single, non-radioactive ion. The discretization, the selection of an implicit scheme, and the various boundary conditions are described. Physical quantities such as the diffusion coefficient, porosity, retardation factor and permeability vary in space but not in time. A first extension consists of taking radioactivity and filiation into consideration. The discretization with respect to time is modified, and a check is performed against the original analytical solutions. In a second extension, consideration is given to non-linear adsorption, which makes it necessary to use the Newton-Raphson method. One can thus model the Freundlich isotherms, in spite of the singular point at the origin. Diffusion, apparent porosity and permeability values can be changed as the computation proceeds. The last extension is the introduction of two ions with the formation of a precipitate. The formulation is derived from that used for non-linear adsorption, the precipitate playing a part similar to that of the adsorbed concentration. Agreement with the original analytical solutions is verified. The case of migration with several interacting ions is approached from the theoretical standpoint. We describe the discretization, which is similar to that in the first version, but involves many additional variables. Numerical stability is shown to be unconditional [fr
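
    A minimal sketch of the kind of implicit (backward-Euler) one-dimensional diffusion-with-retardation step that such a code solves is given below; the grid, coefficients and boundary values are illustrative assumptions, not CONDIMENT's input.

      # Implicit 1-D diffusion with a retardation factor; illustrative values only.
      import numpy as np

      nx, dx, dt, nsteps = 101, 0.01, 3600.0, 240      # 1 m column, hourly steps, 10 days
      D, R = 1.0e-9, 5.0                               # diffusion coeff (m2/s), retardation factor
      c = np.zeros(nx)
      c[0] = 1.0                                       # fixed concentration at the inlet boundary

      r = D * dt / (R * dx * dx)
      A = np.zeros((nx, nx))
      for i in range(1, nx - 1):                       # backward-Euler (implicit) discretization
          A[i, i - 1] = -r
          A[i, i]     = 1.0 + 2.0 * r
          A[i, i + 1] = -r
      A[0, 0] = A[-1, -1] = 1.0                        # Dirichlet inlet, fixed outlet

      for _ in range(nsteps):
          b = c.copy()
          b[0], b[-1] = 1.0, 0.0
          c = np.linalg.solve(A, b)

      print(c[:10])                                    # concentration profile near the inlet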

  4. xdamp Version 4: An IDL Based Data and Image Manipulation Program

    International Nuclear Information System (INIS)

    William P. Ballard

    2002-01-01

    The original DAMP (DAta Manipulation Program) was written by Mark Hedemann of Sandia National Laboratories and used the CA-DISSPLA™ (available from Computer Associates International, Inc., Garden City, NY) graphics package as its engine. It was used to plot, modify, and otherwise manipulate the one-dimensional data waveforms (data vs. time) from a wide variety of accelerators. With the waning of CA-DISSPLA and the increasing popularity of Unix®-based workstations, a replacement was needed. This package uses the IDL® software, available from Research Systems Incorporated, a Xerox company, in Boulder, Colorado, as the engine, and creates a set of widgets to manipulate the data in a manner similar to the original DAMP and earlier versions of xdamp. IDL is currently supported on a wide variety of Unix platforms such as IBM® workstations, Hewlett Packard workstations, SUN® workstations, Microsoft® Windows™ computers, Macintosh® computers and Digital Equipment Corporation VMS® and Alpha® systems. Thus, xdamp is portable across many platforms. We have verified operation, albeit with some minor IDL bugs, on personal computers using Windows 95 and Windows NT; IBM Unix platforms; DEC Alpha and VMS systems; HP 9000/700 series workstations; and Macintosh computers, both regular and PowerPC™ versions. Version 4 is an update that removes some obsolete features and better supports very large arrays and Excel-formatted data import

  5. ATLAS grid compute cluster with virtualized service nodes

    International Nuclear Information System (INIS)

    Mejia, J; Stonjek, S; Kluth, S

    2010-01-01

    The ATLAS Computing Grid consists of several hundred compute clusters distributed around the world as part of the Worldwide LHC Computing Grid (WLCG). The Grid middleware and the ATLAS software which have to be installed on each site often require a certain Linux distribution and sometimes even a specific version thereof. On the other hand, mostly for maintenance reasons, computer centres install the same operating system and version on all computers. This might lead to problems with the Grid middleware if the local version is different from the one for which it has been developed. At RZG we partly solved this conflict by using virtualization technology for the service nodes. We will present the setup used at RZG and show how it helped to solve the problems described above. In addition we will illustrate the additional advantages gained by the above setup.

  6. GEMPAK 5.1 - A GENERAL METEOROLOGICAL PACKAGE (VAX VMS VERSION)

    Science.gov (United States)

    desJardins, M. L.

    1994-01-01

    GEMPAK is a general meteorological software package developed at NASA/Goddard Space Flight Center. It includes programs to analyze and display surface, upper-air, and gridded data, including model output. There are very general programs to list, edit, and plot data on maps, to display profiles and time series, to draw and fill contours, to draw streamlines, to plot symbols for clouds, sky cover, and pressure tendency, and draw cross sections in the case of gridded data and sounding data. In addition, there are Barnes objective analysis programs to grid surface and upper-air data. The programs include the capabilities to derive meteorological parameters from those found in the dataset, to perform vertical interpolations of sounding data to different coordinate systems, and to compute an extensive set of gridded diagnostic quantities by specifying various nested combinations of scalars and vector arithmetic, algebraic, and differential operators. The GEMPAK 5.1 graphics/transformation subsystem, GEMPLT, provides device-independent graphics. GEMPLT also has the capability to display output in a variety of map projections or overlaid on satellite imagery. GEMPAK 5.1 is written in FORTRAN 77 and C-language and has been implemented on VAX computers under VMS and on computers running the UNIX operating system. During installation and normal use, this package occupies approximately 100Mb of hard disk space. The UNIX version of GEMPAK includes drivers for several graphic output systems including MIT's X Window System (X11,R4), Sun GKS, PostScript (color and monochrome), Silicon Graphics, and others. The VMS version of GEMPAK also includes drivers for several graphic output systems including PostScript (color and monochrome). The VMS version is delivered with the object code for the Transportable Applications Environment (TAE) program, version 4.1 which serves as a user interface. A color monitor is recommended for displaying maps on video display devices. Data for rendering
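
    The Barnes objective analysis mentioned above weights scattered observations onto a regular grid with Gaussian distance weights. The single-pass sketch below illustrates the idea only (GEMPAK's scheme uses multiple passes and tuned convergence parameters), and all data values are made up.

      # Single-pass Barnes-style objective analysis: Gaussian distance weighting
      # of scattered observations onto a regular grid. Synthetic data.
      import numpy as np

      def barnes_grid(obs_x, obs_y, obs_val, grid_x, grid_y, kappa):
          """Weight each observation by exp(-r^2/kappa) at every grid point."""
          gx, gy = np.meshgrid(grid_x, grid_y)
          analysis = np.zeros_like(gx, dtype=float)
          weights = np.zeros_like(gx, dtype=float)
          for x, y, v in zip(obs_x, obs_y, obs_val):
              r2 = (gx - x) ** 2 + (gy - y) ** 2
              w = np.exp(-r2 / kappa)
              analysis += w * v
              weights += w
          return analysis / weights

      # Example: grid three synthetic "station" temperatures onto a 5x5 grid
      t = barnes_grid([0.1, 0.5, 0.9], [0.2, 0.8, 0.5], [15.0, 18.0, 16.5],
                      np.linspace(0, 1, 5), np.linspace(0, 1, 5), kappa=0.1)
      print(t.round(2))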

  7. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Plutonium Metals, Oxides, and Solutions on the High Performance Computing Platform Moonlight

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, Bryan Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gough, Sean T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-05

    This report documents a validation of the MCNP6 Version 1.0 computer code on the high performance computing platform Moonlight, for operations at Los Alamos National Laboratory (LANL) that involve plutonium metals, oxides, and solutions. The validation is conducted using the ENDF/B-VII.1 continuous energy group cross section library at room temperature. The results are for use by nuclear criticality safety personnel in performing analysis and evaluation of various facility activities involving plutonium materials.

  8. Nuclear Criticality Safety Handbook, Version 2. English translation

    International Nuclear Information System (INIS)

    2001-08-01

    The Nuclear Criticality Safety Handbook, Version 2 essentially incorporates the description of the Supplement Report to the Nuclear Criticality Safety Handbook, released in 1995, into the first version of the Nuclear Criticality Safety Handbook, published in 1988. The following two points are new: (1) exemplifying safety margins related to modeled dissolution and extraction processes, and (2) describing evaluation methods and an alarm system for criticality accidents. Revisions have been made, based on previous studies, to the chapter that treats modeling of the fuel system: e.g., the fuel grain size at which the system can be regarded as homogeneous, the non-uniformity effect of fuel solution, and burnup credit. This revision has resolved the inconsistencies found in the first version between the evaluation of errors in the JACS code system and the criticality condition data that were calculated based on that evaluation. This report is an English translation of the Nuclear Criticality Safety Handbook, Version 2, originally published in Japanese as JAERI 1340 in 1999. (author)

  9. xdamp Version 3: An IDL reg-sign-based data and image manipulation program

    International Nuclear Information System (INIS)

    Ballard, W.P.

    1998-05-01

    The original DAMP (DAta Manipulation Program) was written by Mark Hedemann of Sandia National Laboratories and used the CA-DISSPLA™ (available from Computer Associates International, Inc., Garden City, NY) graphics package as its engine. It was used to plot, modify, and otherwise manipulate the one-dimensional data waveforms (data vs. time) from a wide variety of accelerators. With the waning of CA-DISSPLA and the increasing popularity of Unix®-based workstations, a replacement was needed. This package uses the IDL® software, available from Research Systems Incorporated in Boulder, Colorado, as the engine, and creates a set of widgets to manipulate the data in a manner similar to the original DAMP and earlier versions of xdamp. IDL is currently supported on a wide variety of Unix platforms such as IBM® workstations, Hewlett Packard workstations, SUN® workstations, Microsoft® Windows™ computers, Macintosh® computers and Digital Equipment Corporation VMS® and Alpha® systems. Thus, xdamp is portable across many platforms. The author has verified operation, albeit with some minor IDL bugs, on personal computers using Windows 95 and Windows NT; IBM Unix platforms; DEC Alpha and VMS systems; HP 9000/700 series workstations; and Macintosh computers, both regular and PowerPC™ versions. Version 3 adds the capability to manipulate images to the original xdamp capabilities

  10. FORM version 4.0

    Science.gov (United States)

    Kuipers, J.; Ueda, T.; Vermaseren, J. A. M.; Vollinga, J.

    2013-05-01

    We present version 4.0 of the symbolic manipulation system FORM. The most important new features are manipulation of rational polynomials and the factorization of expressions. Many other new functions and commands are also added; some of them are very general, while others are designed for building specific high level packages, such as one for Gröbner bases. New is also the checkpoint facility, which allows for periodic backups during long calculations. Finally, FORM 4.0 has become available as open source under the GNU General Public License version 3. Program summary: Program title: FORM. Catalogue identifier: AEOT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 151599. No. of bytes in distributed program, including test data, etc.: 1 078 748. Distribution format: tar.gz. Programming language: The FORM language. FORM itself is programmed in a mixture of C and C++. Computer: All. Operating system: UNIX, LINUX, Mac OS, Windows. Classification: 5. Nature of problem: FORM defines a symbolic manipulation language in which the emphasis lies on fast processing of very large formulas. It has been used successfully for many calculations in Quantum Field Theory and mathematics. In speed and size of formulas that can be handled it outperforms other systems typically by an order of magnitude. Special in this version: Version 4.0 contains many new features. Most important are factorization and rational arithmetic. The program has also become open source under the GPL. The code in CPC is for reference. You are encouraged to upload the most recent sources from www.nikhef.nl/form/formcvs.php because of frequent bug fixes. Solution method: See "Nature of problem", above. Additional comments: NOTE: The code in CPC is for reference. You are encouraged
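
    FORM's input language is its own, but the two headline features (rational-polynomial arithmetic and factorization) can be illustrated with an analogous snippet in a general-purpose computer algebra library; the example below uses SymPy and is not FORM code.

      # Analogous demonstration of rational simplification and factorization in SymPy.
      import sympy as sp

      x, y = sp.symbols("x y")
      expr = (x**2 - y**2) / (x + y)
      print(sp.cancel(expr))                    # rational simplification -> x - y
      print(sp.factor(x**4 - 1))                # factorization -> (x - 1)(x + 1)(x**2 + 1)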

  11. An update to the Surface Ocean CO2 Atlas (SOCAT version 2)

    NARCIS (Netherlands)

    Bakker, D.C.E.; Pfeil, B.; Smith, K.; Hankin, S.; Olsen, A.; Alin, S. R.; Cosca, C.; Harasawa, S.; Kozyr, A.; Nojiri, Y.; O'Brien, K. M.; Schuster, U.; Telszewski, M.; Tilbrook, B.; Wada, C.; Akl, J.; Barbero, L.; Bates, N. R.; Boutin, J.; Bozec, Y.; Cai, W. -J.; Castle, R. D.; Chavez, F. P.; Chen, L.; Chierici, M.; Currie, K.; de Baar, H. J. W.; Evans, W.; Feely, R. A.; Fransson, A.; Gao, Z.; Hales, B.; Hardman-Mountford, N. J.; Hoppema, M.; Huang, W. -J.; Hunt, C. W.; Huss, B.; Ichikawa, T.; Johannessen, T.; Jones, E. M.; Jones, S. D.; Jutterstrom, S.; Kitidis, V.; Koertzinger, A.; Landschuetzer, P.; Lauvset, S. K.; Lefevre, N.; Manke, A. B.; Mathis, J. T.; Merlivat, L.; Metzl, N.; Murata, A.; Newberger, T.; Omar, A. M.; Ono, T.; Park, G. -H.; Paterson, K.; Pierrot, D.; Rios, A. F.; Sabine, C. L.; Saito, S.; Salisbury, J.; Sarma, V. V. S. S.; Schlitzer, R.; Sieger, R.; Skjelvan, I.; Steinhoff, T.; Sullivan, K. F.; Sun, H.; Sutton, A. J.; Suzuki, T.; Sweeney, C.; Takahashi, T.; Tjiputra, J.; Tsurushima, N.; van Heuven, S. M. A. C.; Vandemark, D.; Vlahos, P.; Wallace, D. W. R.; Wanninkhof, R.; Watson, A.J.

    2014-01-01

    The Surface Ocean CO2 Atlas (SOCAT), an activity of the international marine carbon research community, provides access to synthesis and gridded fCO(2) (fugacity of carbon dioxide) products for the surface oceans. Version 2 of SOCAT is an update of the previous release (version 1) with more data

  12. Computer versus paper--does it make any difference in test performance?

    Science.gov (United States)

    Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin

    2015-01-01

    CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit largely from computer-based tests, the question arises if computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room, and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. The groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior. Low performers using the computer version guess significantly more than low-performing students in the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The reason for the longer processing time when using the paper-pencil version might be due to the time needed to write the answer down, controlling for transferring the answer correctly. It is still not known why students using the computer version (particularly low
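
    The randomized matched-pair allocation described above (sort by prior test result, then randomize within each pair) can be sketched as follows; identifiers and scores are invented for illustration.

      # Sketch of matched-pair allocation: sort by prior score, pair neighbours,
      # then randomize each pair between the two test formats. Invented data;
      # an even number of students is assumed.
      import random

      def matched_pair_assign(students):
          """students: list of (student_id, prior_score) tuples."""
          ranked = sorted(students, key=lambda s: s[1], reverse=True)
          allocation = {}
          for i in range(0, len(ranked) - 1, 2):
              pair = [ranked[i][0], ranked[i + 1][0]]
              random.shuffle(pair)
              allocation[pair[0]] = "computer"
              allocation[pair[1]] = "paper"
          return allocation

      print(matched_pair_assign([("s1", 71), ("s2", 55), ("s3", 68), ("s4", 90)]))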

  13. National Radiobiology Archives Distributed Access User's Manual, Version 1.1. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.K.; Prather, J.C.; Ligotke, E.K.; Watson, C.R.

    1992-06-01

    This supplement to the NRA Distributed Access User's Manual (PNL-7877), November 1991, describes installation and use of Version 1.1 of the software package; it is not a replacement for the previous manual. Version 1.1 of the NRA Distributed Access Package is a maintenance release. It eliminates several bugs and includes a few new features, which are described in this manual. Although the appearance of some menu screens has changed, we are confident that the Version 1.0 User's Manual will provide an adequate introduction to the system. Users who are unfamiliar with Version 1.0 may wish to experiment with that version before moving on to Version 1.1.

  14. The Integrated Tiger Series version 5.0

    International Nuclear Information System (INIS)

    Laub, Th.W.; Kensek, R.P.; Franke, B.C.; Lorence, L.J.; Crawford, M.J.; Quirk, Th.J.

    2005-01-01

    The Integrated Tiger Series (ITS) is a powerful and user-friendly software package permitting Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. The package contains programs to perform 1-, 2-, and 3-dimensional simulations. Improvements in the ITS code package since the release of version 3.0 include improved physics, multigroup and adjoint capabilities, Computer-Aided Design geometry tracking, parallel implementations of all ITS codes, and more automated sub-zoning capabilities. These improvements and others are described as current or planned development efforts. The ITS package is currently at version 5.0. (authors)

  15. Development of the unified version of COBRA/RELAP5

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, J J; Ha, K S; Chung, B D; Lee, W J; Sim, S K [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    The COBRA/RELAP5 code, an integrated version of the COBRA-TF and RELAP5/MOD3 codes, has been developed for realistic simulations of complicated, multi-dimensional, two-phase, thermal-hydraulic system transients in light water reactors. Recently, KAERI developed a unified version of the COBRA/RELAP5 code, which can run in serial mode on both workstations and personal computers. This paper provides a brief overview of the code integration scheme, the recent code modifications, the developmental assessments, and the future development plan. 13 refs., 5 figs., 2 tabs. (Author)

  16. Development of the unified version of COBRA/RELAP5

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, J. J.; Ha, K. S.; Chung, B. D.; Lee, W. J.; Sim, S. K. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    The COBRA/RELAP5 code, an integrated version of the COBRA-TF and RELAP5/MOD3 codes, has been developed for realistic simulations of complicated, multi-dimensional, two-phase, thermal-hydraulic system transients in light water reactors. Recently, KAERI developed a unified version of the COBRA/RELAP5 code, which can run in serial mode on both workstations and personal computers. This paper provides a brief overview of the code integration scheme, the recent code modifications, the developmental assessments, and the future development plan. 13 refs., 5 figs., 2 tabs. (Author)

  17. Computer Security: Mac security – nothing for old versions

    CERN Multimedia

    Stefan Lueders, Computer Security Team

    2016-01-01

    A fundamental pillar of computer security is the regular maintenance of your code, operating system and application software – or, in computer lingo: patching, patching, patching. Only software which is up-to-date should be free from any known vulnerabilities and thus provide you with a basic level of computer security. Neglecting regular updates is putting your computer at risk – and consequently your account, your password, your data, your photos, your videos and your money. Therefore, prompt and automatic patching is paramount. But the Microsofts, Googles and Apples of this world do not always help… Software vendors handle their update policy in different ways. While Android is a disaster – not because of Google, but due to the slow adaptation of many smartphone vendors (see “Android’s Armageddon”) – Microsoft provides updates for their Windows 7, Windows 8 and Windows 10 operating systems through their …

  18. The quantum computer game: citizen science

    Science.gov (United States)

    Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob

    2013-05-01

    Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context with a global high-score that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models such as Gross-Pitaevskii and Bose-Hubbard are currently under development. The game has also been incorporated into science education at high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.
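
    As a flavour of the underlying physics, the snippet below integrates a one-dimensional time-dependent Schrödinger equation with a split-step Fourier method (hbar = m = 1); the harmonic potential and all parameters are illustrative and are not the game's actual optical-lattice control problem.

      # Toy split-step (Strang) integration of the 1-D time-dependent Schroedinger equation.
      import numpy as np

      n, L, dt, steps = 512, 40.0, 0.01, 500
      x = np.linspace(-L / 2, L / 2, n, endpoint=False)
      k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)         # angular wavenumbers
      V = 0.5 * x ** 2                                   # harmonic trap (illustrative)

      psi = np.exp(-(x - 2.0) ** 2)                      # displaced Gaussian wave packet
      psi /= np.sqrt(np.trapz(np.abs(psi) ** 2, x))

      half_v = np.exp(-0.5j * V * dt)
      kin = np.exp(-0.5j * k ** 2 * dt)
      for _ in range(steps):                             # split: V/2, kinetic, V/2
          psi = half_v * psi
          psi = np.fft.ifft(kin * np.fft.fft(psi))
          psi = half_v * psi

      print(np.trapz(np.abs(psi) ** 2, x))               # norm stays ~1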

  19. Code OK3 - An upgraded version of OK2 with beam wobbling function

    Science.gov (United States)

    Ogoyski, A. I.; Kawata, S.; Popov, P. H.

    2010-07-01

    For computer simulations of heavy ion beam (HIB) irradiation onto a target with an arbitrary shape and structure in heavy ion fusion (HIF), the code OK2 was developed and presented in Computer Physics Communications 161 (2004). Code OK3 is an upgrade of OK2 that includes an important capability of wobbling-beam illumination. The wobbling beam introduces a unique possibility for a smooth mechanism of inertial fusion target implosion, so that sufficient fusion energy is released to construct a fusion reactor in the future. New version program summary: Program title: OK3. Catalogue identifier: ADST_v3_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADST_v3_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 221 517. No. of bytes in distributed program, including test data, etc.: 2 471 015. Distribution format: tar.gz. Programming language: C++. Computer: PC (Pentium 4, 1 GHz or more recommended). Operating system: Windows or UNIX. RAM: 2048 MBytes. Classification: 19.7. Catalogue identifier of previous version: ADST_v2_0. Journal reference of previous version: Comput. Phys. Comm. 161 (2004) 143. Does the new version supersede the previous version?: Yes. Nature of problem: In heavy ion fusion (HIF), ion cancer therapy, material processing, etc., a precise beam energy deposition is essentially important [1]. Codes OK1 and OK2 have been developed to simulate the heavy ion beam energy deposition in three-dimensional arbitrarily shaped targets [2, 3]. Wobbling beam illumination is important to smooth the beam energy deposition nonuniformity in HIF, so that a uniform target implosion is realized and a sufficient fusion output energy is released. Solution method: OK3 works on the basis of OK1 and OK2 [2, 3]. The code simulates a multi-beam illumination on a target with arbitrary shape and

  20. NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACINTOSH VERSION)

    Science.gov (United States)

    Phillips, T. A.

    1994-01-01

    allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard

  1. Reply to comment by Añel on "Most computational hydrology is not reproducible, so is it really science?"

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made on our previous commentary regarding reproducibility in computational hydrology. Software licensing and version control of code are important technical aspects of making the code and workflows of scientific experiments open and reproducible. However, in our view, it is the cultural change that is the greatest challenge to overcome in achieving reproducible scientific research in computational hydrology. We believe that once the culture and attitudes among hydrological scientists change, the details will evolve to cover more (technical) aspects over time.

  2. URGENCES NOUVELLE VERSION

    CERN Multimedia

    Medical Service

    2002-01-01

    The table of emergency numbers that appeared in Bulletin 10/2002 is out of date. The updated version provided by the Medical Service appears on the following page. Please disregard the previous version.
    URGENT NEED OF A DOCTOR - GENEVA
    Patient not fit to be moved: call your family doctor, or SOS MEDECINS (24H/24H) 748 49 50, or ASSOC. OF GENEVA DOCTORS (7H-23H) 322 20 20
    Patient can be moved: HOPITAL CANTONAL, 24 Micheli du Crest, 372 33 11 / 382 33 11; CHILDREN'S HOSPITAL, 6 rue Willy Donzé, 382 68 18 / 382 45 55; MATERNITY, 24 Micheli du Crest, 382 68 16 / 382 33 11; OPHTALMOLOGY, 22 Alcide Jentzer, 382 84 00; HOPITAL DE LA TOUR, Meyrin, 719 61 11; CENTRE MEDICAL DE MEYRIN, Champs Fréchets, 719 74 00
    EMERGENCIES: FIRE BRIGADE 118; FIRE BRIGADE CERN 767 44 44; URGENT NEED OF AN AMBULANCE (GENEVA AND VAUD): 144; POLICE 117; ANTI-POISON CENTRE 24H/24H 01 251 51 510; EUROPEAN EMERGENCY CALL: 112
    FRANCE
    Patient not fit to be moved: call your family doctor
    Patient can be moved: ST. JULIE...

  3. HECTR Version 1.5 user's manual

    International Nuclear Information System (INIS)

    Dingman, S.E.; Camp, A.L.; Wong, C.C.; King, D.B.; Gasser, R.D.

    1986-04-01

    This report describes the use and features of HECTR Version 1.5. HECTR is a relatively fast-running, lumped-volume containment analysis computer program that is most useful for performing parametric studies. The main purpose of HECTR is to analyze nuclear reactor accidents involving the transport and combustion of hydrogen, but HECTR can also function as an experiment analysis tool and can solve a limited set of other types of containment problems. New models added to HECTR Version 1.5 include fan coolers, containment leakage, continuous burning, and the capability to treat carbon monoxide and carbon dioxide. Models for the ice condenser, sumps, and Mark III suppression pool were upgraded. HECTR is designed for flexibility and provides for user control of many important parameters, particularly those related to hydrogen combustion. Built-in correlations and default values of key parameters are also provided

  4. XTALOPT version r11: An open-source evolutionary algorithm for crystal structure prediction

    Science.gov (United States)

    Avery, Patrick; Falls, Zackary; Zurek, Eva

    2018-01-01

    Version 11 of XTALOPT, an evolutionary algorithm for crystal structure prediction, has now been made available for download from the CPC library or the XTALOPT website, http://xtalopt.github.io. Whereas the previous versions of XTALOPT were published under the GNU Public License (GPL), the current version is made available under the 3-Clause BSD License, which is an open source license that is recognized by the Open Source Initiative. Importantly, the new version can be executed via a command line interface (i.e., it does not require the use of a Graphical User Interface). Moreover, the new version is written as a stand-alone program, rather than as an extension to AVOGADRO.
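
    For readers unfamiliar with the method class, the toy loop below shows the bare structure of an evolutionary algorithm (selection, crossover, mutation) on a scalar objective; XTALOPT's real operators act on crystal structures and dispatch external total-energy codes, which this sketch does not attempt.

      # Bare-bones evolutionary algorithm on a toy objective (minimize sum of squares).
      import random

      def fitness(genome):
          return -sum(g * g for g in genome)

      def evolve(pop_size=20, genes=5, generations=50):
          pop = [[random.uniform(-5, 5) for _ in range(genes)] for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=fitness, reverse=True)
              parents = pop[: pop_size // 2]                     # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = random.sample(parents, 2)
                  cut = random.randrange(1, genes)
                  child = a[:cut] + b[cut:]                      # one-point crossover
                  i = random.randrange(genes)
                  child[i] += random.gauss(0, 0.5)               # Gaussian mutation
                  children.append(child)
              pop = parents + children
          return max(pop, key=fitness)

      print(evolve())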

  5. A Computer-Interpretable Version of the AACE, AME, ETA Medical Guidelines for Clinical Practice for the Diagnosis and Management of Thyroid Nodules

    DEFF Research Database (Denmark)

    Peleg, Mor; Fox, John; Patkar, Vivek

    2014-01-01

    with data that are not necessarily obtained in a rigid flowchart sequence. Tallis, a user-friendly web-based "enactment tool", was then used as the "execution engine" (computer program). This tool records and displays tasks that are done and prompts users to perform the next indicated steps. The development...... GuideLine Interchange Format, version 3, known as GLIF3, which emphasizes the organization of a care algorithm into a flowchart. The flowchart specified the sequence of tasks required to evaluate a patient with a thyroid nodule. PROforma, a second guideline-modeling language, was then employed to work

  6. MULTIPLE PROJECTIONS SYSTEM (MPS) - USER'S MANUAL VERSION 1.0

    Science.gov (United States)

    The report is a user's manual for version 1.0 of the Multiple Projections Systems (MPS), a computer system that can perform "what if" scenario analysis and report the final results (i.e., Rate of Further Progress - ROP - inventories) to EPA (i.e., the Aerometric Information Retri...

  7. The sagittal stem alignment and the stem version clearly influence the impingement-free range of motion in total hip arthroplasty: a computer model-based analysis.

    Science.gov (United States)

    Müller, Michael; Duda, Georg; Perka, Carsten; Tohtz, Stephan

    2016-03-01

    The component alignment in total hip arthroplasty influences the impingement-free range of motion (ROM). While substantiated data are available for cup positioning, little is known about stem alignment. In particular, stem rotation and the sagittal alignment influence the position of the cone in relation to the edge of the socket and thus the impingement-free functioning. Hence, the question arises as to what influence these parameters have on the impingement-free ROM. With the help of a computer model, the influence of the sagittal stem alignment and rotation on the impingement-free ROM was investigated. The computer model was based on the CT dataset of a patient with a non-cemented THA. In the model the stem version was set at 10°/0°/-10° and the sagittal alignment at 5°/0°/-5°, which resulted in nine alternative stem positions. For each position, the maximum impingement-free ROM was investigated. Both stem version and sagittal stem alignment have a relevant influence on the impingement-free ROM. In particular, flexion and extension as well as internal and external rotation capability show evident differences. Within the examined intervals of 10° in sagittal stem alignment and 20° in stem version, differences of about 80° in flexion and 50° in extension capability were found. Likewise, differences of up to 72° in internal rotation and up to 36° in external rotation were observed. The sagittal stem alignment and the stem torsion have a relevant influence on the impingement-free ROM. To clarify the causes of an impingement or accompanying problems, both parameters should be examined and, if possible, a combined assessment of these factors should be made.

  8. User manual for version 4.3 of the Tripoli-4 Monte-Carlo method particle transport computer code

    International Nuclear Information System (INIS)

    Both, J.P.; Mazzolo, A.; Peneliau, Y.; Petit, O.; Roesslinger, B.

    2003-01-01

    This manual relates to Version 4.3 of the TRIPOLI-4 code. TRIPOLI-4 is a computer code simulating the transport of neutrons, photons, electrons and positrons. It can be used for radiation shielding calculations (long-distance propagation with flux attenuation in non-multiplying media) and neutronics calculations (fissile media, on a criticality or sub-criticality basis). This makes it possible to calculate k_eff (for criticality), fluxes, currents, reaction rates and multi-group cross-sections. TRIPOLI-4 is a three-dimensional code that uses the Monte-Carlo method. It allows for a point-wise (in energy) description of cross-sections as well as multi-group homogenized cross-sections, and features two modes of geometrical representation: surface-based and combinatorial. The code uses cross-section libraries in ENDF/B format (such as JEF2-2, ENDF/B-VI and JENDL) for the point-wise description, cross-sections in APOTRIM format (from the APOLLO2 code), or a format specific to TRIPOLI-4 for the multi-group description. (authors)

  9. Overview of MPLNET Version 3 Cloud Detection

    Science.gov (United States)

    Lewis, Jasper R.; Campbell, James; Welton, Ellsworth J.; Stewart, Sebastian A.; Haftings, Phillip

    2016-01-01

    The National Aeronautics and Space Administration Micro Pulse Lidar Network, version 3, cloud detection algorithm is described and differences relative to the previous version are highlighted. Clouds are identified from normalized level 1 signal profiles using two complementary methods. The first method considers vertical signal derivatives for detecting low-level clouds. The second method, which detects high-level clouds like cirrus, is based on signal uncertainties necessitated by the relatively low signal-to-noise ratio exhibited in the upper troposphere by eye-safe network instruments, especially during daytime. Furthermore, a multitemporal averaging scheme is used to improve cloud detection under conditions of a weak signal-to-noise ratio. Diurnal and seasonal cycles of cloud occurrence frequency based on one year of measurements at the Goddard Space Flight Center (Greenbelt, Maryland) site are compared for the new and previous versions. The largest differences, and perceived improvement, in detection occurs for high clouds (above 5 km, above MSL), which increase in occurrence by over 5%. There is also an increase in the detection of multilayered cloud profiles from 9% to 19%. Macrophysical properties and estimates of cloud optical depth are presented for a transparent cirrus dataset. However, the limit to which the cirrus cloud optical depth could be reliably estimated occurs between 0.5 and 0.8. A comparison using collocated CALIPSO measurements at the Goddard Space Flight Center and Singapore Micro Pulse Lidar Network (MPLNET) sites indicates improvements in cloud occurrence frequencies and layer heights.
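
    The first (derivative-based) detection idea can be caricatured in a few lines: flag the lowest range bin where the signal increases sharply with height. The sketch below uses a synthetic profile and an arbitrary threshold; the operational MPLNET algorithm additionally uses signal uncertainties and multitemporal averaging.

      # Rough sketch of derivative-based cloud-base detection in a lidar profile.
      import numpy as np

      def cloud_base_from_derivative(height_km, signal, threshold=0.5):
          """Return the lowest height where d(signal)/dz exceeds the threshold."""
          dz = np.gradient(signal, height_km)
          hits = np.where(dz > threshold)[0]
          return height_km[hits[0]] if hits.size else None

      z = np.linspace(0.1, 15.0, 300)
      profile = np.exp(-z / 8.0)                       # smooth molecular-like background
      profile[(z > 2.0) & (z < 2.4)] += 3.0            # synthetic low cloud layer
      print(cloud_base_from_derivative(z, profile))    # ~2.0 km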

  10. SAGE Version 7.0 Algorithm: Application to SAGE II

    Science.gov (United States)

    Damadeo, R. P; Zawodny, J. M.; Thomason, L. W.; Iyer, N.

    2013-01-01

    This paper details the Stratospheric Aerosol and Gas Experiments (SAGE) version 7.0 algorithm and how it is applied to SAGE II. Changes made between the previous (v6.2) and current (v7.0) versions are described and their impacts on the data products explained for both coincident event comparisons and time-series analysis. Users of the data will notice a general improvement in all of the SAGE II data products, which are now in better agreement with more modern data sets (e.g. SAGE III) and more robust for use with trend studies.

  11. Control rod computer code IAMCOS: general theory and numerical methods

    International Nuclear Information System (INIS)

    West, G.

    1982-11-01

    IAMCOS is a computer code for the description of the mechanical and thermal behavior of cylindrical control rods for fast breeders. This code version was applied, tested and modified from 1979 to 1981. This report describes the basic model (version 02), the theoretical definitions and the computation methods [fr

  12. Axially deformed solution of the Skyrme-Hartree-Fock-Bogoliubov equations using the transformed harmonic oscillator basis (II) HFBTHO v2.00d: A new version of the program

    Science.gov (United States)

    Stoitsov, M. V.; Schunck, N.; Kortelainen, M.; Michel, N.; Nam, H.; Olsen, E.; Sarich, J.; Wild, S.

    2013-06-01

    We describe the new version 2.00d of the code HFBTHO that solves the nuclear Skyrme-Hartree-Fock (HF) or Skyrme-Hartree-Fock-Bogoliubov (HFB) problem by using the cylindrical transformed deformed harmonic oscillator basis. In the new version, we have implemented the following features: (i) the modified Broyden method for non-linear problems, (ii) optional breaking of reflection symmetry, (iii) calculation of axial multipole moments, (iv) finite temperature formalism for the HFB method, (v) linear constraint method based on the approximation of the Random Phase Approximation (RPA) matrix for multi-constraint calculations, (vi) blocking of quasi-particles in the Equal Filling Approximation (EFA), (vii) framework for generalized energy density with arbitrary density-dependences, and (viii) shared memory parallelism via OpenMP pragmas. Program summary: Program title: HFBTHO v2.00d. Catalog identifier: ADUI_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADUI_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License version 3. No. of lines in distributed program, including test data, etc.: 167228. No. of bytes in distributed program, including test data, etc.: 2672156. Distribution format: tar.gz. Programming language: FORTRAN-95. Computer: Intel Pentium-III, Intel Xeon, AMD-Athlon, AMD-Opteron, Cray XT5, Cray XE6. Operating system: UNIX, LINUX, WindowsXP. RAM: 200 Mwords. Word size: 8 bits. Classification: 17.22. Does the new version supersede the previous version?: Yes. Catalog identifier of previous version: ADUI_v1_0. Journal reference of previous version: Comput. Phys. Comm. 167 (2005) 43. Nature of problem: The solution of self-consistent mean-field equations for weakly-bound paired nuclei requires a correct description of the asymptotic properties of nuclear quasi-particle wave functions. In the present implementation, this is achieved by using the single-particle wave functions
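
    Item (i), the modified Broyden method, is a quasi-Newton scheme for accelerating non-linear self-consistency iterations. The toy example below solves a small self-consistency problem with SciPy's broyden1; HFBTHO's implementation mixes large density and field vectors, which this two-component illustration does not represent.

      # Toy self-consistency problem solved with a Broyden quasi-Newton method.
      import numpy as np
      from scipy.optimize import broyden1

      def residual(rho):
          """F(rho) = mapping(rho) - rho; a self-consistent solution has F = 0."""
          mapped = np.array([np.tanh(rho[0] + 0.3 * rho[1]),
                             0.5 * np.cos(rho[0]) + 0.2 * rho[1]])
          return mapped - rho

      rho0 = np.array([0.1, 0.1])                 # initial guess for the "densities"
      rho_sc = broyden1(residual, rho0, f_tol=1e-10)
      print(rho_sc, residual(rho_sc))             # residual ~ 0 at self-consistency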

  13. STOMP Subsurface Transport Over Multiple Phases, Version 4.0, User’s Guide

    Energy Technology Data Exchange (ETDEWEB)

    White, Mark D.; Oostrom, Martinus

    2006-06-09

    This guide describes the general use, input file formatting, compilation and execution of the STOMP (Subsurface Transport Over Multiple Phases) simulator, a scientific tool for analyzing single and multiple phase subsurface flow and transport. A description of the simulator’s governing equations, constitutive functions and numerical solution algorithms are provided in a companion theory guide. In writing these guides for the STOMP simulator, the authors have assumed that the reader comprehends concepts and theories associated with multiple-phase hydrology, heat transfer, thermodynamics, radioactive chain decay, and relative permeability-saturation-capillary pressure constitutive relations. The authors further assume that the reader is familiar with the computing environment on which they plan to compile and execute the STOMP simulator. Source codes for the sequential versions of the simulator are available in pure FORTRAN 77 or mixed FORTRAN 77/90 forms. The pure FORTRAN 77 source code form requires a parameters file to define the memory requirements for the array elements. The mixed FORTRAN 77/90 form of the source code uses dynamic memory allocation to define memory requirements, based on a FORTRAN 90 preprocessor STEP, that reads the input files. The simulator utilizes a variable source code configuration, which allows the execution memory and speed to be tailored to the problem specifics, and essentially requires that the source code be assembled and compiled through a software maintenance utility. The memory requirements for executing the simulator are dependent on the complexity of physical system to be modeled and the size and dimensionality of the computational domain. Likewise execution speed depends on the problem complexity, size and dimensionality of the computational domain, and computer performance. Selected operational modes of the STOMP simulator are available for scalable execution on multiple processor (i.e., parallel) computers. These versions

  14. Axially deformed solution of the Skyrme-Hartree-Fock-Bogolyubov equations using the transformed harmonic oscillator basis (III) HFBTHO (v3.00): A new version of the program

    Science.gov (United States)

    Perez, R. Navarro; Schunck, N.; Lasseri, R.-D.; Zhang, C.; Sarich, J.

    2017-11-01

    We describe the new version 3.00 of the code HFBTHO that solves the nuclear Hartree-Fock (HF) or Hartree-Fock-Bogolyubov (HFB) problem by using the cylindrical transformed deformed harmonic oscillator basis. In the new version, we have implemented the following features: (i) the full Gogny force in both particle-hole and particle-particle channels, (ii) the calculation of the nuclear collective inertia at the perturbative cranking approximation, (iii) the calculation of fission fragment charge, mass and deformations based on the determination of the neck, (iv) the regularization of zero-range pairing forces, (v) the calculation of localization functions, (vi) a MPI interface for large-scale mass table calculations. Program Files doi:http://dx.doi.org/10.17632/c5g2f92by3.1 Licensing provisions: GPL v3 Programming language: FORTRAN-95 Journal reference of previous version: M.V. Stoitsov, N. Schunck, M. Kortelainen, N. Michel, H. Nam, E. Olsen, J. Sarich, and S. Wild, Comput. Phys. Commun. 184 (2013). Does the new version supersede the previous one: Yes Summary of revisions: 1. the Gogny force in both particle-hole and particle-particle channels was implemented; 2. the nuclear collective inertia at the perturbative cranking approximation was implemented; 3. fission fragment charge, mass and deformations were implemented based on the determination of the position of the neck between nascent fragments; 4. the regularization method of zero-range pairing forces was implemented; 5. the localization functions of the HFB solution were implemented; 6. a MPI interface for large-scale mass table calculations was implemented. Nature of problem:HFBTHO is a physics computer code that is used to model the structure of the nucleus. It is an implementation of the energy density functional (EDF) approach to atomic nuclei, where the energy of the nucleus is obtained by integration over space of some phenomenological energy density, which is itself a functional of the neutron and proton

  15. NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACHINE INDEPENDENT VERSION)

    Science.gov (United States)

    Baffes, P. T.

    1994-01-01

    allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard

  16. USAGE OF STANDARD PERSONAL COMPUTER PORTS FOR DESIGNING OF THE DOUBLE REDUNDANT FAULT-TOLERANT COMPUTER CONTROL SYSTEMS

    Directory of Open Access Journals (Sweden)

    Rafig SAMEDOV

    2005-01-01

    Full Text Available In this study, the ports of standard personal computers have been investigated for the design of fault-tolerant control systems, different structure versions have been designed, and a method for choosing an optimal structure has been suggested. Within this scope, first of all, the ÇİFTYAK system has been defined and its working principle determined. Then, the data transmission ports of standard personal computers have been classified and analyzed. After that, the structure versions have been designed and evaluated according to the data transmission methods used, the number of ports, and the criteria of reliability, performance, correctness, control and cost. Finally, a method for choosing the most suitable structure version has been suggested.
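
    The underlying double-redundancy principle (run the same computation on two channels and compare the outputs) can be sketched as follows; the functions and the fault response are purely illustrative and do not model the port-level data exchange analysed in the paper.

      # Illustrative two-channel ("duplicated") computation with output comparison.
      def channel_a(x):
          return 2 * x + 1

      def channel_b(x):
          return 2 * x + 1          # independent implementation of the same function

      def redundant_compute(x):
          a, b = channel_a(x), channel_b(x)
          if a != b:
              raise RuntimeError("channel disagreement - fault detected")
          return a

      print(redundant_compute(21))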

  17. GEANT4 simulations for Proton computed tomography applications

    International Nuclear Information System (INIS)

    Yevseyeva, Olga; Assis, Joaquim T. de; Evseev, Ivan; Schelin, Hugo R.; Shtejer Diaz, Katherin; Lopes, Ricardo T.

    2011-01-01

    Proton radiation therapy is a highly precise form of cancer treatment. In existing proton treatment centers, dose calculations are performed based on X-ray computed tomography (CT). Alternatively, one could image the tumor directly with proton CT (pCT). Proton beams in medical applications deal with relatively thick targets like the human head or trunk. Thus, the fidelity of proton computed tomography (pCT) simulations as a tool for proton therapy planning depends in the general case on the accuracy of results obtained for the proton interaction with thick absorbers. As shown previously, GEANT4 simulations of proton energy spectra after passing through thick absorbers do not agree well with existing experimental data. The spectra simulated for the Bethe-Bloch domain showed an unexpected sensitivity to the choice of low-energy electromagnetic models during the code execution. These observations were made with GEANT4 version 8.2 during our simulations for pCT. This work describes in more detail the simulations of the proton passage through gold absorbers of varied thickness. The simulations were done by modifying only the geometry in the Hadron therapy example, and for all available choices of the electromagnetic physics models. As the most probable reasons for these effects are some specific feature in the code or some specific implicit parameters in the GEANT4 manual, we continued our study with version 9.2 of the code. Some improvements in comparison with our previous results were obtained. The simulations were performed considering further applications for pCT development. The authors want to thank CNPq, CAPES and 'Fundacao Araucaria' for financial support of this work. (Author)

  18. Validation of the Turkish Version of the Cognitive Test Anxiety Scale–Revised

    Directory of Open Access Journals (Sweden)

    Sati Bozkurt

    2017-01-01

    Full Text Available The current study explored the psychometric properties of the newly designed Turkish version of the Cognitive Test Anxiety Scale–Revised (CTAR). Results of an exploratory factor analysis revealed a unidimensional structure consistent with the conceptualized nature of cognitive test anxiety and previous examinations of the English version of the CTAR. Examination of the factor loadings revealed two items that were weakly related to the test anxiety construct and as such were prime candidates for removal. Confirmatory factor analyses were conducted to compare model fit for the 25- and 23-item versions of the measure. Results indicated that the 23-item version of the measure provided a better fit to the data, which supports the removal of the problematic items in the Turkish version of the CTAR. Additional analyses demonstrated the internal consistency, test–retest reliability, concurrent validity, and gender equivalence of responses to the Turkish version of the measure. Results of the analysis revealed that the 23-item Turkish version, the T-CTAR, is a valid and reliable measure of cognitive test anxiety for use among Turkish students.

  19. Modified computation of the nozzle damping coefficient in solid rocket motors

    Science.gov (United States)

    Liu, Peijin; Wang, Muxin; Yang, Wenjing; Gupta, Vikrant; Guan, Yu; Li, Larry K. B.

    2018-02-01

    In solid rocket motors, the bulk advection of acoustic energy out of the nozzle constitutes a significant source of damping and can thus influence the thermoacoustic stability of the system. In this paper, we propose and test a modified version of a historically accepted method of calculating the nozzle damping coefficient. Building on previous work, we separate the nozzle from the combustor, but compute the acoustic admittance at the nozzle entry using the linearized Euler equations (LEEs) rather than with short nozzle theory. We compute the combustor's acoustic modes also with the LEEs, taking the nozzle admittance as the boundary condition at the combustor exit while accounting for the mean flow field in the combustor using an analytical solution to Taylor-Culick flow. We then compute the nozzle damping coefficient via a balance of the unsteady energy flux through the nozzle. Compared with established methods, the proposed method offers competitive accuracy at reduced computational costs, helping to improve predictions of thermoacoustic instability in solid rocket motors.

  20. Radioimmunoassay data processing program for IBM PC computers

    International Nuclear Information System (INIS)

    1989-06-01

    The Medical Applications Section of the International Atomic Energy Agency (IAEA) has previously developed several programs for use on the Hewlett-Packard HP-41C programmable calculator to facilitate better quality control in radioimmunoassay through improved data processing. The program described in this document is designed for off-line analysis using an IBM PC (or compatible) of counting data from standards and unknown specimens (i.e. for analysis of counting data previously recorded by a counter), together with internal quality control (IQC) data both within and between batches. The greater computing power of the IBM PC has enabled the inclusion of the imprecision profile and IQC control curves, which were unavailable in the HP-41C version. It is intended that the program would make good data processing capability available to laboratories having limited financial resources and serious problems of quality control. 3 refs
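    The imprecision profile mentioned above is, in essence, the coefficient of variation of replicate dose estimates plotted against concentration. A minimal sketch of that calculation follows; the numbers are made-up placeholders, not assay data.

    # Sketch: imprecision profile, CV(%) of replicate dose estimates vs. dose.
    import numpy as np

    def imprecision_profile(doses, replicate_sd):
        """Return the CV (%) at each dose level from the SD of replicate estimates."""
        doses = np.asarray(doses, dtype=float)
        replicate_sd = np.asarray(replicate_sd, dtype=float)
        return 100.0 * replicate_sd / doses

    # Hypothetical data; CV typically rises at both ends of the assay range.
    doses = np.array([0.5, 1, 2, 5, 10, 50, 100])
    sd = np.array([0.12, 0.15, 0.2, 0.35, 0.6, 4.5, 14.0])
    for d, cv in zip(doses, imprecision_profile(doses, sd)):
        print(f"dose {d:6.1f}   CV {cv:5.1f} %")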

  1. Development and Evaluation of a Chinese Version of the Questionnaire on Teacher Interaction (QTI)

    Science.gov (United States)

    Sun, Xiaojing; Mainhard, Tim; Wubbels, Theo

    2018-01-01

    Teacher-student interpersonal relationships play an important role in education. The Questionnaire on Teacher Interaction (QTI) was designed to measure students' interpersonal perceptions of their teachers. There are two Chinese versions of the QTI for student use, which inherited the weaknesses of the previous English versions, such as items…

  2. LLNL Yucca Mountain project - near-field environment characterization technical area: Letter report: EQ3/6 version 8: differences from version 7

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.

    1994-09-29

    EQ3/6 is a software package for geochemical modeling of aqueous systems, such as water/rock or waste/water/rock. It is being developed for a variety of applications in geochemical studies for the Yucca Mountain Site Characterization Project. The software has been extensively rewritten for Version 8. The source code has been extensively modernized. The software is now written in Fortran 77 with the most common extensions that are part of the new Fortran 90 standard. The architecture of the software has been improved for better performance and to allow the incorporation of new functional capabilities in Version 8 and planned subsequent versions. In particular, the structure of the major data arrays has been significantly altered and extended. Three new major functional capabilities have been incorporated in Version 8. The first of these allows the treatment of redox disequilibrium in reaction-path modeling. This is a natural extension of the long-running capability of providing for such disequilibrium in static speciation-solubility calculations. Such a capability is important, for example, when dealing with systems containing organic species and certain dissolved gas species. The user defines (and sets the controls for) the components in disequilibrium. The second new capability provides for pressure corrections to thermodynamic data. Such corrections can now be made if the requisite data are present on a supporting data file. At present, this capability is supported only by the SHV data file, which is based on SUPCRT92. Equilibrium constants and other thermodynamic quantities are corrected for pressures which lie off a standard curve, which is defined on the supporting data file and ordinarily corresponds to 1.013 bar up to 100°C, and the steam/liquid water equilibrium pressure up to 300°C. The third new major capability is a generic ion exchange option previously developed in prototype in a branch of the Version 7 level of EQ3/6 by Brian Viani, Bill Bourcier, and Carol Bruton. This option has been modified to fit into the Version 8 data

  3. Portable computers - portable operating systems

    International Nuclear Information System (INIS)

    Wiegandt, D.

    1985-01-01

    Hardware development has made rapid progress over the past decade. Computers used to have attributes like ''general purpose'' or ''universal'', nowadays they are labelled ''personal'' and ''portable''. Recently, a major manufacturing company started marketing a portable version of their personal computer. But even for these small computers the old truth still holds that the biggest disadvantage of a computer is that it must be programmed, hardware by itself does not make a computer. (orig.)

  4. Development of the web-based Spanish and Catalan versions of the Euroqol 5D-Y (EQ-5D-Y) and comparison of results with the paper version.

    Science.gov (United States)

    Robles, Noemí; Rajmil, Luis; Rodriguez-Arjona, Dolors; Azuara, Marta; Codina, Francisco; Raat, Hein; Ravens-Sieberer, Ulrike; Herdman, Michael

    2015-06-03

    The objectives of the study were to develop web-based Spanish and Catalan versions of the EQ-5D-Y, and to compare scores and psychometric properties with the paper version. Web-based and paper versions of EQ-5D-Y were included in a cross-sectional study in Palafolls (Barcelona), Spain and administered to students (n = 923) aged 8 to 18 years from 2 primary and 1 secondary school and their parents. All students completed both the web-based and paper versions during school time with an interval of at least 2 h between administrations. The order of administration was randomized. Participants completed EQ-5D-Y, a measure of mental health status (the Strengths and Difficulties Questionnaire), and sociodemographic variables using a self-administered questionnaire. Parents questionnaire included parental level of education and presence of chronic conditions in children. Missing values, and floor and ceiling effects were compared between versions. Mean score differences were computed for the visual analogue scale (VAS). Percentage of agreement, kappa index (k) and intraclass correlation coefficient (ICC) were computed to analyze the level of agreement between web-based and paper versions on EQ-5D-Y dimensions and VAS. Known groups validity was analyzed and compared between the two formats. Participation rate was 77 % (n = 715). Both formats of EQ-5D-Y showed low percentages of missing values (n = 2, and 4 to 9 for web and paper versions respectively), and a high ceiling effect by dimension (range from 79 % to 96 %). Percent agreement for EQ-5D-Y dimensions on the web and paper versions was acceptable (range 89 % to 97 %), and k ranged from 0.55 (0.48-0.61, usual activities dimension) to 0.75 (0.68-0.82, mobility dimension). Mean score difference on the VAS was 0.07, and the ICC for VAS scores on the two formats was 0.84 (0.82-0.86). Both formats showed acceptable ability to discriminate according to self-perceived health, reporting chronic conditions, and
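    The agreement statistics reported above (percent agreement and kappa for the dimensions, an intraclass correlation for the VAS) can be illustrated with a small sketch. The paired responses below are hypothetical placeholders, not study data; only the statistical procedures are shown.

    # Sketch: percent agreement, Cohen's kappa, and a two-way single-rater ICC.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    def percent_agreement(a, b):
        a, b = np.asarray(a), np.asarray(b)
        return 100.0 * np.mean(a == b)

    def icc_two_way(a, b):
        """ICC(2,1)-style absolute-agreement ICC for two paired ratings."""
        x = np.column_stack([a, b]).astype(float)
        n, k = x.shape
        ms_rows = k * x.mean(axis=1).var(ddof=1)      # between-subjects mean square
        ms_cols = n * x.mean(axis=0).var(ddof=1)      # between-versions mean square
        grand = x.mean()
        ss_err = ((x - x.mean(axis=1, keepdims=True)
                     - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                     + k * (ms_cols - ms_err) / n)

    # Hypothetical paired responses for one dimension and for the VAS:
    web_dim   = [1, 1, 2, 1, 3, 1, 2, 1]
    paper_dim = [1, 1, 2, 2, 3, 1, 1, 1]
    web_vas   = [90, 85, 70, 95, 60, 100, 80, 75]
    paper_vas = [88, 85, 72, 95, 65, 100, 78, 80]

    print("agreement %:", percent_agreement(web_dim, paper_dim))
    print("kappa      :", cohen_kappa_score(web_dim, paper_dim))
    print("ICC (VAS)  :", icc_two_way(web_vas, paper_vas))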

  5. xdamp Version 6 : an IDL-based data and image manipulation program.

    Energy Technology Data Exchange (ETDEWEB)

    Ballard, William Parker

    2012-04-01

    The original DAMP (DAta Manipulation Program) was written by Mark Hedemann of Sandia National Laboratories and used the CA-DISSPLA{trademark} (available from Computer Associates International, Inc., Garden City, NY) graphics package as its engine. It was used to plot, modify, and otherwise manipulate the one-dimensional data waveforms (data vs. time) from a wide variety of accelerators. With the waning of CA-DISSPLA and the increasing popularity of Unix(reg sign)-based workstations, a replacement was needed. This package uses the IDL(reg sign) software, available from Research Systems Incorporated, a Xerox company, in Boulder, Colorado, as the engine, and creates a set of widgets to manipulate the data in a manner similar to the original DAMP and earlier versions of xdamp. IDL is currently supported on a wide variety of Unix platforms such as IBM(reg sign) workstations, Hewlett Packard workstations, SUN(reg sign) workstations, Microsoft(reg sign) Windows{trademark} computers, Macintosh(reg sign) computers and Digital Equipment Corporation VMS(reg sign) and Alpha(reg sign) systems. Thus, xdamp is portable across many platforms. We have verified operation, albeit with some minor IDL bugs, on personal computers using Windows 7 and Windows Vista; Unix platforms; and Macintosh computers. Version 6 is an update that uses the IDL Virtual Machine to resolve the need for licensing IDL.

  6. Research in computer science

    Science.gov (United States)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  7. Revision history aware repositories of computational models of biological systems.

    Science.gov (United States)

    Miller, Andrew K; Yu, Tommy; Britten, Randall; Cooling, Mike T; Lawson, James; Cowan, Dougal; Garny, Alan; Halstead, Matt D B; Hunter, Peter J; Nickerson, David P; Nunns, Geo; Wimalaratne, Sarala M; Nielsen, Poul M F

    2011-01-14

    Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. We have extended the Physiome Model Repository software to be fully revision history aware

  8. Revision history aware repositories of computational models of biological systems

    Directory of Open Access Journals (Sweden)

    Nickerson David P

    2011-01-01

    Full Text Available Abstract Background Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. Results We have extended the Physiome Model

  9. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resource, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  10. Measuring Engagement at Work: Validation of the Chinese Version of the Utrecht Work Engagement Scale

    OpenAIRE

    Ng, Sm; Fong, TCt

    2011-01-01

    Background: Work engagement is a positive work-related state of fulfillment characterized by vigor, dedication, and absorption. Previous studies have operationalized the construct through development of the Utrecht Work Engagement Scale. Apart from the original three-factor 17-item version of the instrument (UWES-17), there exists a nine-item shortened revised version (UWES-9). Purpose: The current study explored the psychometric properties of the Chinese version of the Utrecht Work Engagemen...

  11. Measuring Engagement at Work: Validation of the Chinese Version of the Utrecht Work Engagement Scale

    OpenAIRE

    Fong, Ted Chun-tat; Ng, Siu-man

    2011-01-01

    Background Work engagement is a positive work-related state of fulfillment characterized by vigor, dedication, and absorption. Previous studies have operationalized the construct through development of the Utrecht Work Engagement Scale. Apart from the original three-factor 17-item version of the instrument (UWES-17), there exists a nine-item shortened revised version (UWES-9). Purpose The current study explored the psychometric properties of the Chinese version of the Utrecht Work Engagement ...

  12. TOUGH2 User's Guide Version 2

    International Nuclear Information System (INIS)

    Pruess, K.; Oldenburg, C.M.; Moridis, G.J.

    1999-01-01

    TOUGH2 is a numerical simulator for nonisothermal flows of multicomponent, multiphase fluids in one, two, and three-dimensional porous and fractured media. The chief applications for which TOUGH2 is designed are in geothermal reservoir engineering, nuclear waste disposal, environmental assessment and remediation, and unsaturated and saturated zone hydrology. TOUGH2 was first released to the public in 1991; the 1991 code was updated in 1994 when a set of preconditioned conjugate gradient solvers was added to allow a more efficient solution of large problems. The current Version 2.0 features several new fluid property modules and offers enhanced process modeling capabilities, such as coupled reservoir-wellbore flow, precipitation and dissolution effects, and multiphase diffusion. Numerous improvements in previously released modules have been made and new user features have been added, such as enhanced linear equation solvers, and writing of graphics files. The T2VOC module for three-phase flows of water, air and a volatile organic chemical (VOC), and the T2DM module for hydrodynamic dispersion in 2-D flow systems have been integrated into the overall structure of the code and are included in the Version 2.0 package. Data inputs are upwardly compatible with the previous version. Coding changes were generally kept to a minimum, and were only made as needed to achieve the additional functionalities desired. TOUGH2 is written in standard FORTRAN77 and can be run on any platform, such as workstations, PCs, Macintosh, mainframe and supercomputers, for which appropriate FORTRAN compilers are available. This report is a self-contained guide to application of TOUGH2 to subsurface flow problems. It gives a technical description of the TOUGH2 code, including a discussion of the physical processes modeled, and the mathematical and numerical methods used. Illustrative sample problems are presented along with detailed instructions for preparing input data

  13. NOAA Climate Data Record of Microwave Sounding Unit (MSU) Mean Atmospheric Layer Temperature, Version 1.2 (Superseded)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Please note, this dataset has been superseded by a newer version (see below). Users should not use this version except in rare cases (e.g., when reproducing previous...

  14. Choice of Human-Computer Interaction Mode in Stroke Rehabilitation.

    Science.gov (United States)

    Mousavi Hondori, Hossein; Khademi, Maryam; Dodakian, Lucy; McKenzie, Alison; Lopes, Cristina V; Cramer, Steven C

    2016-03-01

    Advances in technology are providing new forms of human-computer interaction. The current study examined one form of human-computer interaction, augmented reality (AR), whereby subjects train in the real-world workspace with virtual objects projected by the computer. Motor performances were compared with those obtained while subjects used a traditional human-computer interaction, that is, a personal computer (PC) with a mouse. Patients used goal-directed arm movements to play AR and PC versions of the Fruit Ninja video game. The 2 versions required the same arm movements to control the game but had different cognitive demands. With AR, the game was projected onto the desktop, where subjects viewed the game plus their arm movements simultaneously, in the same visual coordinate space. In the PC version, subjects used the same arm movements but viewed the game by looking up at a computer monitor. Among 18 patients with chronic hemiparesis after stroke, the AR game was associated with 21% higher game scores (P = .0001), 19% faster reaching times (P = .0001), and 15% less movement variability (P = .0068), as compared to the PC game. Correlations between game score and arm motor status were stronger with the AR version. Motor performances during the AR game were superior to those during the PC game. This result is due in part to the greater cognitive demands imposed by the PC game, a feature problematic for some patients but clinically useful for others. Mode of human-computer interface influences rehabilitation therapy demands and can be individualized for patients. © The Author(s) 2015.
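    The within-subject comparison described above (each patient plays both the AR and PC versions) lends itself to a paired analysis. The sketch below is generic and uses made-up scores, not the study's data; it simply illustrates a paired t-test and its non-parametric alternative.

    # Sketch: paired comparison of AR vs. PC game scores for the same subjects.
    import numpy as np
    from scipy import stats

    ar_scores = np.array([120, 95, 140, 110, 105, 130, 98, 125])
    pc_scores = np.array([100, 90, 115, 95, 100, 105, 85, 110])

    t, p = stats.ttest_rel(ar_scores, pc_scores)      # paired t-test
    w, p_w = stats.wilcoxon(ar_scores, pc_scores)     # non-parametric alternative
    gain = 100.0 * (ar_scores.mean() / pc_scores.mean() - 1.0)

    print(f"mean AR advantage: {gain:.1f} %")
    print(f"paired t-test: t = {t:.2f}, p = {p:.4f}")
    print(f"Wilcoxon:      W = {w:.1f}, p = {p_w:.4f}")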

  15. [External cephalic version].

    Science.gov (United States)

    Navarro-Santana, B; Duarez-Coronado, M; Plaza-Arranz, J

    2016-08-01

    To analyze the rate of successful external cephalic versions in our center and the caesarean sections that would be avoided with the use of external cephalic versions. From January 2012 to March 2016, a total of 52 external cephalic versions were carried out at our center. We collected data on maternal age, gestational age at the time of the external cephalic version, maternal body mass index (BMI), fetal position and lie, fetal weight, parity, location of the placenta, amniotic fluid index (AFI), tocolysis, analgesia, newborn weight at birth, minor adverse effects (dizziness, hypotension and maternal pain) and major adverse effects (tachycardia, bradycardia, decelerations and emergency cesarean section). 45% of the versions were unsuccessful and 55% were successful. Among successful versions, 84% resulted in vaginal delivery (4% instrumental) and 15% in caesarean section. With respect to the variables studied, significant differences were found only in birth weight, suggesting that birth weight is related to the outcome of external cephalic version. The lack of other significant differences is probably due to the number of patients studied. For women with breech presentation, we recommend external cephalic version before expectant management or performing a cesarean section. External cephalic version increases the proportion of fetuses in cephalic presentation and also decreases the rate of caesarean sections.

  16. Python pocket reference, version 2.4

    CERN Document Server

    Lutz, Mark

    2005-01-01

    Python is optimized for quality, productivity, portability, and integration. Hundreds of thousands of Python developers around the world rely on Python for general-purpose tasks, Internet scripting, systems programming, user interfaces, and product customization. Available on all major computing platforms, including commercial versions of Unix, Linux, Windows, and Mac OS X, Python is portable, powerful and remarkably easy to use. With its convenient, quick-reference format, Python Pocket Reference, 3rd Edition is the perfect on-the-job reference. More importantly, it's now been refreshed

  17. UQTk version 2.0 user manual

    Energy Technology Data Exchange (ETDEWEB)

    Debusschere, Bert J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2013-10-01

    The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 2.0 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
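    As a generic illustration of the non-intrusive propagation mentioned above (and not UQTk code), the sketch below fits a small polynomial chaos expansion to samples of a placeholder model with a standard normal input and compares the resulting moments with plain Monte Carlo. The model, order, and sample size are arbitrary assumptions.

    # Sketch: non-intrusive polynomial chaos expansion by least-squares regression.
    from math import factorial
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    def model(x):
        return np.exp(0.3 * x) + 0.1 * x ** 2     # placeholder "computational model"

    order = 4
    rng = np.random.default_rng(1)
    xi = rng.standard_normal(2000)                # standard normal germ
    samples = model(xi)

    # Least-squares projection onto probabilists' Hermite polynomials He_k
    basis = hermevander(xi, order)                # shape (N, order + 1)
    coeffs, *_ = np.linalg.lstsq(basis, samples, rcond=None)

    # Moments follow from the coefficients, since E[He_k(xi)^2] = k!
    mean = coeffs[0]
    var = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, order + 1))
    print(f"PCE mean = {mean:.4f}, PCE std = {np.sqrt(var):.4f}")
    print(f"MC  mean = {samples.mean():.4f}, MC  std = {samples.std(ddof=1):.4f}")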

  18. The NJOY Nuclear Data Processing System, Version 2016

    Energy Technology Data Exchange (ETDEWEB)

    Macfarlane, Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Muir, Douglas W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Boicourt, R. M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kahler, III, Albert Comstock [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-09

    The NJOY Nuclear Data Processing System, version 2016, is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF-4 through ENDF-6 legacy card-image formats. NJOY works with evaluated files for incident neutrons, photons, and charged particles, producing libraries for a wide variety of particle transport and reactor analysis codes.

  19. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—User’s manual for version 2.8

    Science.gov (United States)

    Mueller, David S.

    2016-05-12

    The software program QRev computes the discharge from moving-boat acoustic Doppler current profiler measurements using data collected with any of the Teledyne RD Instrument or SonTek bottom tracking acoustic Doppler current profilers. The computation of discharge is independent of the manufacturer of the acoustic Doppler current profiler because QRev applies consistent algorithms independent of the data source. In addition, QRev automates filtering and quality checking of the collected data and provides feedback to the user on potential quality issues with the measurement. Various statistics and characteristics of the measurement, in addition to a simple uncertainty assessment, are provided to the user to assist in properly rating the measurement. QRev saves an extensible markup language file that can be imported into databases or electronic field notes software. The user interacts with QRev through a tablet-friendly graphical user interface. This report is the manual for version 2.8 of QRev.
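    For orientation, the core of a moving-boat discharge computation is the cross product of water velocity and boat velocity integrated over depth cells and ensembles. The sketch below is the generic textbook form of that integral, not QRev's actual implementation; all arrays and sizes are hypothetical.

    # Sketch: measured-portion discharge from moving-boat ADCP velocities.
    import numpy as np

    def measured_discharge(u_water, v_water, u_boat, v_boat, cell_size, dt):
        """Water velocities: arrays (n_ensembles, n_cells); boat velocities:
        arrays (n_ensembles,); cell_size in m; dt in s per ensemble."""
        # Cross product of water and boat velocity per cell, summed over depth
        # cells and integrated over ensembles (time).
        cross = u_water * v_boat[:, None] - v_water * u_boat[:, None]
        return np.sum(cross * cell_size * dt[:, None])

    # Tiny made-up example: 3 ensembles x 2 depth cells, boat crossing the stream.
    u_w = np.array([[0.80, 0.70], [0.90, 0.80], [0.85, 0.75]])
    v_w = np.zeros((3, 2))
    u_b = np.zeros(3)
    v_b = np.array([1.2, 1.1, 1.3])
    dt = np.array([1.0, 1.0, 1.0])
    print("measured-portion Q =", measured_discharge(u_w, v_w, u_b, v_b, 0.25, dt), "m^3/s")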

  20. A randomised clinical trial of intrapartum fetal monitoring with computer analysis and alerts versus previously available monitoring

    Directory of Open Access Journals (Sweden)

    Santos Cristina

    2010-10-01

    Full Text Available Abstract Background Intrapartum fetal hypoxia remains an important cause of death and permanent handicap and in a significant proportion of cases there is evidence of suboptimal care related to fetal surveillance. Cardiotocographic (CTG monitoring remains the basis of intrapartum surveillance, but its interpretation by healthcare professionals lacks reproducibility and the technology has not been shown to improve clinically important outcomes. The addition of fetal electrocardiogram analysis has increased the potential to avoid adverse outcomes, but CTG interpretation remains its main weakness. A program for computerised analysis of intrapartum fetal signals, incorporating real-time alerts for healthcare professionals, has recently been developed. There is a need to determine whether this technology can result in better perinatal outcomes. Methods/design This is a multicentre randomised clinical trial. Inclusion criteria are: women aged ≥ 16 years, able to provide written informed consent, singleton pregnancies ≥ 36 weeks, cephalic presentation, no known major fetal malformations, in labour but excluding active second stage, planned for continuous CTG monitoring, and no known contra-indication for vaginal delivery. Eligible women will be randomised using a computer-generated randomisation sequence to one of the two arms: continuous computer analysis of fetal monitoring signals with real-time alerts (intervention arm or continuous CTG monitoring as previously performed (control arm. Electrocardiographic monitoring and fetal scalp blood sampling will be available in both arms. The primary outcome measure is the incidence of fetal metabolic acidosis (umbilical artery pH ecf > 12 mmol/L. Secondary outcome measures are: caesarean section and instrumental vaginal delivery rates, use of fetal blood sampling, 5-minute Apgar score Discussion This study will provide evidence of the impact of intrapartum monitoring with computer analysis and real

  1. ARC2D - EFFICIENT SOLUTION METHODS FOR THE NAVIER-STOKES EQUATIONS (DEC RISC ULTRIX VERSION)

    Science.gov (United States)

    Biyabani, S. R.

    1994-01-01

    ARC2D is a computational fluid dynamics program developed at the NASA Ames Research Center specifically for airfoil computations. The program uses implicit finite-difference techniques to solve two-dimensional Euler equations and thin layer Navier-Stokes equations. It is based on the Beam and Warming implicit approximate factorization algorithm in generalized coordinates. The methods are either time accurate or accelerated non-time accurate steady state schemes. The evolution of the solution through time is physically realistic; good solution accuracy is dependent on mesh spacing and boundary conditions. The mathematical development of ARC2D begins with the strong conservation law form of the two-dimensional Navier-Stokes equations in Cartesian coordinates, which admits shock capturing. The Navier-Stokes equations can be transformed from Cartesian coordinates to generalized curvilinear coordinates in a manner that permits one computational code to serve a wide variety of physical geometries and grid systems. ARC2D includes an algebraic mixing length model to approximate the effect of turbulence. In cases of high Reynolds number viscous flows, thin layer approximation can be applied. ARC2D allows for a variety of solutions to stability boundaries, such as those encountered in flows with shocks. The user has considerable flexibility in assigning geometry and developing grid patterns, as well as in assigning boundary conditions. However, the ARC2D model is most appropriate for attached and mildly separated boundary layers; no attempt is made to model wake regions and widely separated flows. The techniques have been successfully used for a variety of inviscid and viscous flowfield calculations. The Cray version of ARC2D is written in FORTRAN 77 for use on Cray series computers and requires approximately 5Mb memory. The program is fully vectorized. The tape includes variations for the COS and UNICOS operating systems. Also included is a sample routine for CONVEX

  2. Psychometric properties of the Hebrew short version of the Zimbardo Time Perspective Inventory.

    Science.gov (United States)

    Orkibi, Hod

    2015-06-01

    The purpose of this study was to develop a short Hebrew version of the Zimbardo Time Perspective Inventory that can be easily administered by health professionals in research, therapy, and counseling. First, the empirical links of time perspective (TP) to subjective well-being and health protective and health risk behaviors are reviewed. Then, a brief account of the instrument's previous modifications is provided. Results of confirmatory factor analysis (N = 572) verified the five-factor structure of the short version and yielded acceptable internal consistency reliability for each factor. The correlation coefficients between the five subscales of the short (20 items) and the original (56 items) instruments were all above .79, indicating the suitability of the short version for assessing the five TP factors. Support for the discriminant and concurrent validity was also achieved, largely in agreement with previous findings. Finally, limitations and future directions are addressed, and potential applications in therapy and counseling are offered. © The Author(s) 2014.
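    The short-form validation step described above, correlating subscale scores from the retained items with scores from the full instrument, can be sketched briefly. The data and the item-to-subscale mapping below are hypothetical placeholders; only the procedure is illustrated.

    # Sketch: correlation of a short-form subscale with the full-form subscale.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    full = pd.DataFrame(rng.integers(1, 6, size=(300, 56)),
                        columns=[f"q{i+1}" for i in range(56)])

    # Hypothetical mapping: items of one full-form subscale, and the subset
    # retained in the short form.
    full_items = ["q1", "q5", "q9", "q13", "q17", "q21", "q25", "q29"]
    short_items = ["q1", "q9", "q17", "q25"]

    full_score = full[full_items].mean(axis=1)
    short_score = full[short_items].mean(axis=1)
    r = np.corrcoef(full_score, short_score)[0, 1]
    print(f"short vs. full subscale correlation: r = {r:.2f}")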

  3. Fiscal impacts model documentation. Version 1.0

    International Nuclear Information System (INIS)

    Beck, S.L.; Scott, M.J.

    1986-05-01

    The Fiscal Impacts (FI) Model, Version 1.0 was developed under Pacific Northwest Laboratory's Monitored Retrievable Storage (MRS) Program to aid in development of the MRS Reference Site Environmental Document (PNL 5476). It computes estimates of 182 fiscal items for state and local government jurisdictions, using input data from the US Census Bureau's 1981 Survey of Governments and local population forecasts. The model can be adapted for any county or group of counties in the United States

  4. Standard interface files and procedures for reactor physics codes. Version IV

    International Nuclear Information System (INIS)

    O'Dell, R.D.

    1977-09-01

    Standards, procedures, and recommendations of the Committee on Computer Code Coordination for promoting the exchange of reactor physics codes are updated to Version IV status. Standards and procedures covering general programming, program structure, standard interface files, and file management and handling subroutines are included

  5. Energy flow in plate assembles by hierarchical version of finite element method

    DEFF Research Database (Denmark)

    Wachulec, Marcin; Kirkegaard, Poul Henning

    The dynamic analysis of structures in medium and high frequencies is usually focused on frequency and spatial averages of the energy of components, and not on the displacement/velocity fields. This is especially true for structure-borne noise modelling. For the analysis of complicated structures, the finite element method has been used to study the energy flow. The finite element method proved its usefulness despite the computational expense. Therefore studies have been conducted in order to simplify and reduce the computations required. Among others, the use of a hierarchical version of the finite element method has been proposed. In this paper a modified hierarchical version of the finite element method is used for modelling of energy flow in plate assemblies. The formulation includes a description of in-plane forces so that plates lying in different planes can be modelled. Two examples considered are: L......

  6. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    Science.gov (United States)

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-09-04

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
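    To make the format description above more concrete, the sketch below builds the skeleton of a SED-ML document with Python's standard library. The element and attribute names are given to the best of my knowledge of the Level 1 Version 2 schema and should be checked against the specification; the model source and KiSAO algorithm identifier are placeholders.

    # Sketch: generating a minimal SED-ML skeleton with xml.etree.ElementTree.
    import xml.etree.ElementTree as ET

    NS = "http://sed-ml.org/sed-ml/level1/version2"
    ET.register_namespace("", NS)

    root = ET.Element(f"{{{NS}}}sedML", level="1", version="2")

    models = ET.SubElement(root, f"{{{NS}}}listOfModels")
    ET.SubElement(models, f"{{{NS}}}model", id="model1",
                  language="urn:sedml:language:sbml", source="model.xml")

    sims = ET.SubElement(root, f"{{{NS}}}listOfSimulations")
    sim = ET.SubElement(sims, f"{{{NS}}}uniformTimeCourse", id="sim1",
                        initialTime="0", outputStartTime="0",
                        outputEndTime="10", numberOfPoints="100")
    ET.SubElement(sim, f"{{{NS}}}algorithm", kisaoID="KISAO:0000019")

    tasks = ET.SubElement(root, f"{{{NS}}}listOfTasks")
    ET.SubElement(tasks, f"{{{NS}}}task", id="task1",
                  modelReference="model1", simulationReference="sim1")

    ET.ElementTree(root).write("experiment.sedml", xml_declaration=True,
                               encoding="utf-8")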

  7. Procedure guideline for thyroid scintigraphy (version 3); Verfahrensanweisung fuer die Schilddruesenszintigraphie (Version 3)

    Energy Technology Data Exchange (ETDEWEB)

    Dietlein, M.; Schicha, H. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Koeln Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin; Dressler, J. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Nuklearmedizinische Klinik der Henriettenstiftung, Hannover (Germany); Eschner, W. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Deutsche Gesellschaft fuer Medizinische Physik (DGMP) (Germany); Koeln Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin; Leisner, B. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Allgemeines Krankenhaus St. Georg, Hamburg (Germany). Abt. fuer Nuklearmedizin; Reiners, C. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Wuerzburg Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin

    2007-07-01

    The version 3 of the procedure guideline for thyroid scintigraphy is an update of the procedure guideline previously published in 2003. The interpretation of the scintigraphy requires the knowledge of the patients' history, the palpation of the neck, the laboratory parameters and of the sonography. The interpretation of the technetium-99m uptake requires the knowledge of the TSH-level. As a consequence of the improved alimentary iodine supply the {sup 99m}Tc-uptake has decreased; 100 000 counts per scintigraphy should be acquired. For this, an imaging time of 10 minutes is generally needed using a high resolution collimator for thyroid imaging. (orig.)

  8. Conservation Reasoning Ability and Performance on BSCS Blue Version Examinations.

    Science.gov (United States)

    Lawson, Anton E.; Nordland, Floyd H.

    Twenty-three high school biology students were individually administered three conservation tasks (weight, volume, volume displacement). During one semester, they were examined over the course material using published Biological Sciences Curriculum Study (BSCS) Blue Version examination questions which were previously classified as requiring either…

  9. Computer modelling of the WWER fuel elements under high burnup conditions by the computer codes PIN-W and RODQ2D

    Energy Technology Data Exchange (ETDEWEB)

    Valach, M; Zymak, J; Svoboda, R [Nuclear Research Inst. Rez plc, Rez (Czech Republic)

    1997-08-01

    This paper presents the development status of the computer codes for modelling the thermomechanical behavior of WWER fuel elements under high-burnup conditions at the Nuclear Research Institute Rez. The emphasis is on the analysis of the results of the parametric calculations performed by the programmes PIN-W and RODQ2D rather than on their detailed theoretical description. Several new optional correlations for the UO2 thermal conductivity, with the degradation effect caused by burnup, were implemented in both codes. Examples of the calculations performed document the differences between the previous and new versions of both programmes. Some recommendations for further development of the codes are given in the conclusion. (author). 6 refs, 9 figs.

  10. Computer modelling of the WWER fuel elements under high burnup conditions by the computer codes PIN-W and RODQ2D

    International Nuclear Information System (INIS)

    Valach, M.; Zymak, J.; Svoboda, R.

    1997-01-01

    This paper presents the development status of the computer codes for modelling the thermomechanical behavior of WWER fuel elements under high-burnup conditions at the Nuclear Research Institute Rez. The emphasis is on the analysis of the results of the parametric calculations performed by the programmes PIN-W and RODQ2D rather than on their detailed theoretical description. Several new optional correlations for the UO2 thermal conductivity, with the degradation effect caused by burnup, were implemented in both codes. Examples of the calculations performed document the differences between the previous and new versions of both programmes. Some recommendations for further development of the codes are given in the conclusion. (author). 6 refs, 9 figs

  11. Development of environmental dose assessment system (EDAS) code of PC version

    Energy Technology Data Exchange (ETDEWEB)

    Taki, Mitsumasa; Kikuchi, Masamitsu; Kobayashi, Hideo; Yamaguchi, Takenori [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-05-01

    A computer code (EDAS) was developed to assess the public dose for the safety assessment required to obtain the license for nuclear reactor operation. This code system is used for the safety analysis of the public around a nuclear reactor in normal operation and severe accidents. The code was revised and composed for personal computer users according to the Nuclear Safety Guidelines reflecting the ICRP 1990 recommendation. These guidelines were revised by the Nuclear Safety Commission in March 2001; they are 'Weather analysis guideline for the safety assessment of nuclear power reactor', 'Public dose around the facility assessment guideline corresponding to the objective value for nuclear power light water reactor' and 'Public dose assessment guideline for safety review of nuclear power light water reactor'. This code has already been opened to public users by JAERI, and an English version of the code and user manual are also prepared. This English version code is helpful for international cooperation concerning the nuclear safety assessment with JAERI. (author)

  12. Development of environmental dose assessment system (EDAS) code of PC version

    CERN Document Server

    Taki, M; Kobayashi, H; Yamaguchi, T

    2003-01-01

    A computer code (EDAS) was developed to assess the public dose for the safety assessment required to obtain the license for nuclear reactor operation. This code system is used for the safety analysis of the public around a nuclear reactor in normal operation and severe accidents. The code was revised and composed for personal computer users according to the Nuclear Safety Guidelines reflecting the ICRP 1990 recommendation. These guidelines were revised by the Nuclear Safety Commission in March 2001; they are 'Weather analysis guideline for the safety assessment of nuclear power reactor', 'Public dose around the facility assessment guideline corresponding to the objective value for nuclear power light water reactor' and 'Public dose assessment guideline for safety review of nuclear power light water reactor'. This code has already been opened to public users by JAERI, and an English version of the code and user manual are also prepared. This English version code is helpful for international cooperation concerning the nuclear safety assessme...

  13. Modeling of BWR core meltdown accidents - for application in the MELRPI. MOD2 computer code

    Energy Technology Data Exchange (ETDEWEB)

    Koh, B R; Kim, S H; Taleyarkhan, R P; Podowski, M Z; Lahey, Jr, R T

    1985-04-01

    This report summarizes improvements and modifications made in the MELRPI computer code. A major difference between this new, updated version of the code, called MELRPI.MOD2, and the one reported previously, concerns the inclusion of a model for the BWR emergency core cooling systems (ECCS). This model and its computer implementation, the ECCRPI subroutine, account for various emergency injection modes, for both intact and rubblized geometries. Other changes to MELRPI deal with an improved model for canister wall oxidation, rubble bed modeling, and numerical integration of system equations. A complete documentation of the entire MELRPI.MOD2 code is also given, including an input guide, list of subroutines, sample input/output and program listing.

  14. Development of the short version of the informal caregiver burden assessment questionnaire

    Directory of Open Access Journals (Sweden)

    Teresa Martins

    2015-04-01

    Full Text Available OBJECTIVE To create a reduced version of the QASCI which is structurally equivalent to the long one and meets the criteria of reliability and validity. METHOD Using secondary data from previous studies, the participants were divided into two samples, one for the development of the reduced version and the second for the study of factorial validity. Participants responded to the QASCI, the SF 36, the ADHS and demographic questions. RESULTS A reduced version of 14 items showed adequate psychometric properties of validity and internal consistency, adapted to a heptadimensional structure that assesses positive and negative aspects of care. CONCLUSION Confirmatory factor analysis revealed a good fit with the advocated theoretical model.

  15. Poisson/Superfish codes for personal computers

    International Nuclear Information System (INIS)

    Humphries, S.

    1992-01-01

    The Poisson/Superfish codes calculate static E or B fields in two dimensions and electromagnetic fields in resonant structures. New versions for 386/486 PCs and Macintosh computers have capabilities that exceed those of the mainframe versions. Notable improvements are interactive graphical post-processors, improved field calculation routines, and a new program for charged particle orbit tracking. (author). 4 refs., 1 tab., figs
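    For readers unfamiliar with the underlying problem, the sketch below solves the 2-D electrostatic Poisson equation on a uniform grid by simple Jacobi relaxation. It is an illustration of the mathematics only, not the codes' own mesh-based algorithms; the grid size, charge, and iteration count are arbitrary.

    # Sketch: 2-D Poisson equation, laplacian(phi) = -rho/eps0, by Jacobi relaxation.
    import numpy as np

    def solve_poisson_2d(rho, h, n_iter=5000):
        """Solve for phi on a uniform grid with phi = 0 on the boundary."""
        eps0 = 8.8541878128e-12
        phi = np.zeros_like(rho)
        for _ in range(n_iter):
            phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1]
                                      + phi[1:-1, 2:] + phi[1:-1, :-2]
                                      + h * h * rho[1:-1, 1:-1] / eps0)
        return phi

    # Point-like charge in the middle of a grounded square box (illustrative):
    n, h = 65, 1e-2
    rho = np.zeros((n, n))
    rho[n // 2, n // 2] = 1e-9 / h ** 2       # ~1 nC spread over one cell
    phi = solve_poisson_2d(rho, h)
    print("peak potential ~", phi.max(), "V")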

  16. Labour Outcomes After Successful External Cephalic Version Compared With Spontaneous Cephalic Version.

    Science.gov (United States)

    Krueger, Samantha; Simioni, Julia; Griffith, Lauren E; Hutton, Eileen K

    2018-01-01

    This study sought to compare obstetrical outcomes for women with a cephalic presentation at birth resulting from successful external cephalic version (ECV) compared to those resulting from spontaneous cephalic version (SCV). Secondary analysis was performed on Early External Cephalic Version Trial data. A total of 931 study participants had breech presentations between 34 and 36 weeks' gestation and cephalic presentations at birth. The incidence of intrapartum interventions was compared between patients with successful ECV (557) and those with SCV (374). A generalized linear mixed model was used to determine ORs for our primary outcomes. Parity, maternal BMI, previous CS, and enrolment centre were controlled for in the analysis. No differences were found after ECV compared with SCV in the incidence of CS (96 of 557 and 76 of 374, respectively; adjusted OR [aOR] 0.89; 95% CI 0.63-1.26), instrumental birth (68 of 557 and 29 of 373, respectively; aOR 1.55; 95% CI 0.96-2.50), or normal vaginal birth (393 of 557 and 268 of 373, respectively; aOR 0.92; 95% CI 0.68-1.24). Multiparous women with successful ECV were half as likely to require a CS compared with those with SCV and no ECV (28 of 313 and 42 of 258, respectively; aOR 0.45; 95% CI 0.26-0.80). This is the first study to compare birth outcomes of breech pregnancies that convert to cephalic presentation by means of SCV with birth outcomes of breech pregnancies that have ECV. Women with a cephalic-presenting fetus at birth as a result of successful ECV are not at greater risk of obstetrical interventions at birth when compared with women with fetuses who spontaneously turn to a cephalic presentation in the third trimester. Copyright © 2018. Published by Elsevier Inc.

  17. EOS MLS Level 1B Data Processing, Version 2.2

    Science.gov (United States)

    Perun, Vincent; Jarnot, Robert; Pickett, Herbert; Cofield, Richard; Schwartz, Michael; Wagner, Paul

    2009-01-01

    A computer program performs level-1B processing (the term 1B is explained below) of data from observations of the limb of the Earth by the Earth Observing System (EOS) Microwave Limb Sounder (MLS), which is an instrument aboard the Aura spacecraft. This software accepts, as input, the raw EOS MLS scientific and engineering data and the Aura spacecraft ephemeris and attitude data. Its output consists of calibrated instrument radiances and associated engineering and diagnostic data. [This software is one of several computer programs, denoted product generation executives (PGEs), for processing EOS MLS data. Starting from level 0 (representing the aforementioned raw data), the PGEs and their data products are denoted by alphanumeric labels (e.g., 1B and 2) that signify the successive stages of processing.] At the time of this reporting, this software is at version 2.2 and incorporates improvements over a prior version that make the code more robust, improve calibration, provide more diagnostic outputs, improve the interface with the Level 2 PGE, and effect a 15-percent reduction in file sizes by use of data compression.

  18. Relap4/SAS/Mod5 - A version of Relap4/Mod 5 adapted to IPEN/CNEN - SP computer center

    International Nuclear Information System (INIS)

    Sabundjian, G.

    1988-04-01

    In order to improve the safety of nuclear reactor power plants, several computer codes have been developed in the area of thermal-hydraulic accident analysis. Among the publicly available codes, RELAP4, developed by Aerojet Nuclear Company, has been the most popular one. RELAP4 has produced satisfactory results when compared to most of the available experimental data. The purposes of the present work are: optimization of the RELAP4 output and messages, by writing this information to temporary records, and display of RELAP4 results in graphical form through the printer. The sample problem consists of a simplified model of a 150 MW(e) PWR whose primary circuit is simulated by 6 volumes, 8 junctions and 1 heat slab. This new version of RELAP4 (named RELAP4/SAS/MOD5) has produced results which show that the above mentioned purposes have been reached. Obviously the graphical output by RELAP4/SAS/MOD5 favors the interpretation of results by the user. (author) [pt

  19. Solution of the Skyrme-Hartree-Fock-Bogolyubov equations in the Cartesian deformed harmonic-oscillator basis.. (VII) HFODD (v2.49t): A new version of the program

    Science.gov (United States)

    Schunck, N.; Dobaczewski, J.; McDonnell, J.; Satuła, W.; Sheikh, J. A.; Staszczak, A.; Stoitsov, M.; Toivanen, P.

    2012-01-01

    We describe the new version (v2.49t) of the code HFODD which solves the nuclear Skyrme-Hartree-Fock (HF) or Skyrme-Hartree-Fock-Bogolyubov (HFB) problem by using the Cartesian deformed harmonic-oscillator basis. In the new version, we have implemented the following physics features: (i) the isospin mixing and projection, (ii) the finite-temperature formalism for the HFB and HF + BCS methods, (iii) the Lipkin translational energy correction method, (iv) the calculation of the shell correction. A number of specific numerical methods have also been implemented in order to deal with large-scale multi-constraint calculations and hardware limitations: (i) the two-basis method for the HFB method, (ii) the Augmented Lagrangian Method (ALM) for multi-constraint calculations, (iii) the linear constraint method based on the approximation of the RPA matrix for multi-constraint calculations, (iv) an interface with the axial and parity-conserving Skyrme-HFB code HFBTHO, (v) the mixing of the HF or HFB matrix elements instead of the HF fields. Special care has been paid to using the code on massively parallel leadership class computers. For this purpose, the following features are now available with this version: (i) the Message Passing Interface (MPI) framework, (ii) scalable input data routines, (iii) multi-threading via OpenMP pragmas, (iv) parallel diagonalization of the HFB matrix in the simplex-breaking case using the ScaLAPACK library. Finally, several little significant errors of the previous published version were corrected. New version program summaryProgram title:HFODD (v2.49t) Catalogue identifier: ADFL_v3_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADFL_v3_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence v3 No. of lines in distributed program, including test data, etc.: 190 614 No. of bytes in distributed program, including test data, etc.: 985 898 Distribution

  20. DIII-D tokamak control and neutral beam computer system upgrades

    International Nuclear Information System (INIS)

    Penaflor, B.G.; McHarg, B.B.; Piglowski, D.A.; Pham, D.; Phillips, J.C.

    2004-01-01

    This paper covers recent computer system upgrades made to the DIII-D tokamak control and neutral beam computer systems. The systems responsible for monitoring and controlling the DIII-D tokamak and injecting neutral beam power have recently come online with new computing hardware and software. The new hardware and software have provided a number of significant improvements over the previous Modcomp AEG VME and accessware based systems. These improvements include the incorporation of faster, less expensive, and more readily available computing hardware which have provided performance increases of up to a factor 20 over the prior systems. A more modern graphical user interface with advanced plotting capabilities has improved feedback to users on the operating status of the tokamak and neutral beam systems. The elimination of aging and non supportable hardware and software has increased overall maintainability. The distinguishing characteristics of the new system include: (1) a PC based computer platform running the Redhat version of the Linux operating system; (2) a custom PCI CAMAC software driver developed by general atomics for the kinetic systems 2115 serial highway card; and (3) a custom developed supervisory control and data acquisition (SCADA) software package based on Kylix, an inexpensive interactive development environment (IDE) tool from borland corporation. This paper provides specific details of the upgraded computer systems

  1. Computer graphics from basic to application

    International Nuclear Information System (INIS)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-01

    This book covers the concept of computer graphics, its background and history, its necessity, and applied fields such as construction design, image processing, automobile design, fashion design and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress, including an introduction, applications and the operating environment; 3D graphics, with a summary and the differences between versions of 3D Studio and systems; and AutoCAD applications.

  2. Computer graphics from basic to application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-15

    This book covers the concept of computer graphics, its background and history, its necessity, and applied fields such as construction design, image processing, automobile design, fashion design and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress, including an introduction, applications and the operating environment; 3D graphics, with a summary and the differences between versions of 3D Studio and systems; and AutoCAD applications.

  3. Fuel rod computations. The COMETHE code in its CEA version

    International Nuclear Information System (INIS)

    Lenepveu, Dominique.

    1976-01-01

    The COMETHE code (COde d'evolution MEcanique et THermique) is intended for computing the irradiation behavior of water reactor fuel pins. It is concerned with steadily operated cylindrical pins containing fuel pellet stacks (UO2 or PuO2). The pin consists of five different axial zones: two expansion chambers, two blankets, and a central core that may be divided into several stacks parted by plugs. For computation, the pin is divided into slices (maximum 15), in turn divided into rings (maximum 50). Information is obtained for each slice: the radial temperature distribution, heat transfer coefficients, thermal flux at the pin surface, changes in geometry according to temperature conditions, and specific burn-up. The physical models involved take account of heat transfer, fission gas release, fuel expansion, and creep of the can. Results computed with COMETHE are compared with those from ELP and EPEL irradiation experiments [fr
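    One of the quantities such codes evaluate per axial slice, the radial temperature profile in the pellet, can be illustrated with the textbook constant-conductivity solution for uniform volumetric heating. This is a generic sketch, not COMETHE's own models (which include burnup- and temperature-dependent properties); all numbers are hypothetical.

    # Sketch: radial temperature in a fuel pellet, T(r) = T_s + q'''(R^2 - r^2)/(4 k).
    import numpy as np

    def pellet_radial_temperature(r, pellet_radius, q_vol, k_fuel, t_surface):
        """Uniform volumetric heating q_vol (W/m^3), constant conductivity k_fuel."""
        return t_surface + q_vol * (pellet_radius ** 2 - r ** 2) / (4.0 * k_fuel)

    # Hypothetical numbers: 4 mm pellet radius, 300 W/cm^3, k = 3 W/(m K), 400 C surface.
    r = np.linspace(0.0, 4.0e-3, 5)
    T = pellet_radial_temperature(r, 4.0e-3, 300.0e6, 3.0, 400.0)
    for ri, Ti in zip(r, T):
        print(f"r = {ri * 1e3:4.1f} mm   T = {Ti:6.1f} C")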

  4. APPLE-2: an improved version of APPLE code for plotting neutron and gamma ray spectra and reaction rates

    International Nuclear Information System (INIS)

    Kawasaki, Hiromitsu; Seki, Yasushi.

    1982-07-01

    A computer code APPLE-2, which plots the spatial distribution of energy spectra of multi-group neutron and/or gamma ray fluxes and reaction rates, has been developed. This code is an improved version of the previously developed APPLE code and has the following features: (1) It plots energy spectra of neutron and/or gamma ray fluxes calculated by ANISN, DOT and MORSE. (2) It calculates and plots the spatial distribution of neutron and gamma ray fluxes and various types of reaction rates, such as nuclear heating rates, operational dose rates, and displacement damage rates. (3) Input data specification is greatly simplified by the use of standard response libraries and by close coupling with radiation transport calculation codes. (4) Plotting outputs are given in camera-ready form. (author)

  5. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further enhanced by the use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32-bit process computers and comes in two versions: as a functional trainer or as an on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  6. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further enhanced by the use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32-bit process computers and comes in two versions: as a functional trainer or as an on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.) (in German)

  7. Elementary mathematical and computational tools for electrical and computer engineers using Matlab

    CERN Document Server

    Manassah, Jamal T

    2013-01-01

    Ideal for use as a short-course textbook and for self-study, Elementary Mathematical and Computational Tools for Electrical and Computer Engineers Using MATLAB fills that gap. Accessible after just one semester of calculus, it introduces the many practical analytical and numerical tools that are essential to success both in future studies and in professional life. Sharply focused on the needs of the electrical and computer engineering communities, the text provides a wealth of relevant exercises and design problems. Changes in MATLAB's version 6.0 are included in a special addendum.

  8. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism: neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  9. Latest NASA Instrument Cost Model (NICM): Version VI

    Science.gov (United States)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

    The NASA Instrument Cost Model, NICM, is a suite of tools which allow for probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost estimation by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper will focus on the new features released with NICM VI, which include: 1) the NICM-E cost estimating relationship, which is applicable to instruments flying on Explorer-like class missions; 2) the new cluster analysis ability which, alongside the results of the parametric cost estimation for the user's instrument, also provides a visualization of the user's instrument's similarity to previously flown instruments; and 3) new cost estimating relationships for in-situ instruments.

  10. Charged-particle thermonuclear reaction rates: IV. Comparison to previous work

    International Nuclear Information System (INIS)

    Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.

    2010-01-01

    We compare our Monte Carlo reaction rates (see Paper II of this issue) to previous results that were obtained by using the classical method of computing thermonuclear reaction rates. For each reaction, the comparison is presented using two types of graphs: the first shows the change in reaction rate uncertainties, while the second displays our new results normalized to the previously recommended reaction rate. We find that the rates have changed significantly for almost all reactions considered here. The changes are caused by (i) our new Monte Carlo method of computing reaction rates (see Paper I of this issue), and (ii) newly available nuclear physics information (see Paper III of this issue).

  11. SAGE FOR MACINTOSH (MSAGE) VERSION 1.0 SOLVENT ALTERNATIVES GUIDE - USER'S GUIDE

    Science.gov (United States)

    The guide provides instructions for using the Solvent Alternatives Guide (SAGE) for Macintosh, version 1.0. The guide assumes that the user is familiar with the fundamentals of operating a Macintosh personal computer under the System 7.0 (or higher) operating system. SAGE for ...

  12. UQTk Version 3.0.3 User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Sargsyan, Khachik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chowdhary, Kamaljit Singh [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Castorena, Sarah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); De Bord, Sarah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Debusschere, Bert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.0.3 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
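    As a rough illustration of the non-intrusive propagation idea mentioned above (this is not the UQTk API; the model function and input distributions are hypothetical), the following Python sketch pushes sampled input uncertainties through a toy model and summarizes the output:

        import numpy as np

        def model(x1, x2):
            # hypothetical forward model with two uncertain inputs
            return np.exp(-x1) * np.sin(x2)

        rng = np.random.default_rng(0)
        n = 10_000
        x1 = rng.normal(1.0, 0.1, n)    # assumed Gaussian input uncertainty
        x2 = rng.uniform(0.5, 1.5, n)   # assumed uniform input uncertainty
        y = model(x1, x2)
        print("output mean = %.4f, std = %.4f" % (y.mean(), y.std()))
        # crude indicator of how much output variance x1 alone explains
        print("variance fraction with x2 frozen:", model(x1, 1.0).var() / y.var())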

  13. The Case for Teaching Computer Graphics with WebGL: A 25-Year Perspective.

    Science.gov (United States)

    Angel, Ed

    2017-01-01

    OpenGL has been the standard API for teaching computer graphics. There are now multiple versions of the standard, including WebGL, which is the JavaScript implementation of OpenGL ES 2.0. The author argues that WebGL is the version best suited for an introductory course in computer graphics.

  14. Parallel computers and three-dimensional computational electromagnetics

    International Nuclear Information System (INIS)

    Madsen, N.K.

    1994-01-01

    The authors have continued to enhance their ability to use new massively parallel processing computers to solve time-domain electromagnetic problems. New vectorization techniques have improved the performance of their code DSI3D by factors of 5 to 15, depending on the computer used. New radiation boundary conditions and far-field transformations now allow the computation of radar cross-section values for complex objects. A new parallel-data extraction code has been developed that allows the extraction of data subsets from large problems, which have been run on parallel computers, for subsequent post-processing on workstations with enhanced graphics capabilities. A new charged-particle-pushing version of DSI3D is under development. Finally, DSI3D has become a focal point for several new Cooperative Research and Development Agreement activities with industrial companies such as Lockheed Advanced Development Company, Varian, Hughes Electron Dynamics Division, General Atomic, and Cray

  15. RASCAL Version 2.1 workbook. Volume 2, Revision 2

    International Nuclear Information System (INIS)

    Athey, G.F.; Sjoreen, A.L.; McKenna, T.J.

    1994-12-01

    The Radiological Assessment System for Consequence Analysis, Version 2.1 (RASCAL 2.1) was developed for use by the NRC personnel who respond to radiological emergencies. This workbook complements the RASCAL 2.1 User's guide (NUREG/CR-5247, Vol. 1, Rev. 2). The workbook contains exercises designed to familiarize the user with the computer-based tools of RASCAL through hands-on problem solving. The workbook contains four major sections. The first is a RASCAL familiarization exercise to acquaint the user with the operation of the forms, menus, online help, and documentation. The latter three sections contain exercises in using the three tools of RASCAL Version 2.1: DECAY, FM-DOSE, and ST-DOSE. A discussion section describing how the tools could be used to solve the problems follows each set of exercises
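    The DECAY tool mentioned above performs radioactive decay corrections; the Python sketch below shows the generic calculation such a tool automates (an illustrative exponential-decay example, not the RASCAL implementation):

        import math

        def activity(a0_bq, half_life_h, elapsed_h):
            # A(t) = A0 * exp(-ln(2) * t / T_half)
            return a0_bq * math.exp(-math.log(2.0) * elapsed_h / half_life_h)

        # example: 1 MBq of I-131 (half-life about 8.02 days) after 24 hours
        print("%.3e Bq" % activity(1.0e6, 8.02 * 24.0, 24.0))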

  16. A MHD equilibrium code 'EQUCIR version 2' applicable to up-down asymmetric toroidal plasma

    International Nuclear Information System (INIS)

    Shinya, Kichiro; Ninomiya, Hiromasa

    1981-01-01

    Computer code EQUCIR version 2, which can analyse tokamak plasma equilibrium without assuming up-down symmetry with respect to the mid-plane, has been developed. This code is essentially the same as EQUCIR version 1, which has already been reported and can deal only with plasmas symmetrical with respect to the mid-plane. Because the data input stream is slightly different from that of version 1, the physical background of the change and the method of calculation are explained. A data input manual for the points that differ is also provided. The code has been applied to the analysis of INTOR single-null divertor plasmas and to the design of hybrid poloidal coils, proving to be a useful and powerful means for the design. (author)

  17. 7th International Workshop on Natural Computing

    CERN Document Server

    Hagiya, Masami

    2015-01-01

    This book highlights recent advances in natural computing, including biology and its theory, bio-inspired computing, computational aesthetics, computational models and theories, computing with natural media, philosophy of natural computing and educational technology. It presents extended versions of the best papers selected from the symposium “7th International Workshop on Natural Computing” (IWNC7), held in Tokyo, Japan, in 2013. The target audience is not limited to researchers working in natural computing but also includes those active in biological engineering, fine/media art design, aesthetics and philosophy.

  18. SHABERTH - ANALYSIS OF A SHAFT BEARING SYSTEM (CRAY VERSION)

    Science.gov (United States)

    Coe, H. H.

    1994-01-01

    shear forces in the inlet zone of lubricated contacts, which accounts for the degree of lubricant film starvation; modeling normal and friction forces between a ball and a cage pocket, which account for the transition between the hydrodynamic and elastohydrodynamic regimes of lubrication; and a model of the effect on fatigue life of the ratio of the EHD plateau film thickness to the composite surface roughness. SHABERTH is intended to be as general as possible. The models in SHABERTH allow for the complete mathematical simulation of real physical systems. Systems are limited to a maximum of five bearings supporting the shaft, a maximum of thirty rolling elements per bearing, and a maximum of one hundred temperature nodes. The SHABERTH program structure is modular and has been designed to permit refinement and replacement of various component models as the need and opportunities develop. A preprocessor is included in the IBM PC version of SHABERTH to provide a user friendly means of developing SHABERTH models and executing the resulting code. The preprocessor allows the user to create and modify data files with minimal effort and a reduced chance for errors. Data is utilized as it is entered; the preprocessor then decides what additional data is required to complete the model. Only this required information is requested. The preprocessor can accommodate data input for any SHABERTH compatible shaft bearing system model. The system may include ball bearings, roller bearings, and/or tapered roller bearings. SHABERTH is written in FORTRAN 77, and two machine versions are available from COSMIC. The CRAY version (LEW-14860) has a RAM requirement of 176K of 64 bit words. The IBM PC version (MFS-28818) is written for IBM PC series and compatible computers running MS-DOS, and includes a sample MS-DOS executable. For execution, the PC version requires at least 1Mb of RAM and an 80386 or 486 processor machine with an 80x87 math co-processor. The standard distribution medium for the

  19. Rotary engine performance computer program (RCEMAP and RCEMAPPC): User's guide

    Science.gov (United States)

    Bartrand, Timothy A.; Willis, Edward A.

    1993-01-01

    This report is a user's guide for a computer code that simulates the performance of several rotary combustion engine configurations. It is intended to assist prospective users in getting started with RCEMAP and/or RCEMAPPC. RCEMAP (Rotary Combustion Engine performance MAP generating code) is the mainframe version, while RCEMAPPC is a simplified subset designed for the personal computer, or PC, environment. Both versions are based on an open, zero-dimensional combustion system model for the prediction of instantaneous pressures, temperature, chemical composition and other in-chamber thermodynamic properties. Both versions predict overall engine performance and thermal characteristics, including bmep, bsfc, exhaust gas temperature, average material temperatures, and turbocharger operating conditions. Required inputs include engine geometry, materials, constants for use in the combustion heat release model, and turbomachinery maps. Illustrative examples and sample input files for both versions are included.

  20. ASYS2: a new version of computer algebra package ASYS for analysis and simplification of polynomial systems

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Khutornoj, N.V.

    1993-01-01

    In this paper, a new version of the package ASYS for the analysis of nonlinear algebraic equations based on the Groebner basis technique is described. Compared with the first version of the package, ASYS1, the current one has a number of new facilities which provide higher efficiency. Some examples and results of a comparison between ASYS2, ASYS1, and two other REDUCE packages, GROEBNER and CALI, included in REDUCE 3.5 are given. 16 refs., 4 tabs
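    For readers unfamiliar with the technique, the Python sketch below computes a lexicographic Groebner basis for a small polynomial system with SymPy; it illustrates the general Groebner-basis approach only, not the ASYS2 or REDUCE interfaces:

        from sympy import groebner, symbols

        x, y = symbols('x y')
        # a small nonlinear system: x**2 + y - 1 = 0, x + y**2 - 1 = 0
        basis = groebner([x**2 + y - 1, x + y**2 - 1], x, y, order='lex')
        print(basis)
        # the last basis element depends on y alone, so the system can be solved
        # by back-substitution, the kind of simplification such packages exploit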

  1. SAGE FOR WINDOWS (WSAGE) VERSION 1.0 SOLVENT ALTERNATIVES GUIDE - USER'S GUIDE

    Science.gov (United States)

    The guide provides instructions for using the Solvent Alternatives Guide (SAGE) for Windows, version 1.0. The guide assumes that the user is familiar with the fundamentals of operating Windows 3.1 (or higher) on a personal computer under the DOS 5.0 (or higher) operating system. ...

  2. Special issue of Higher-Order and Symbolic Computation

    DEFF Research Database (Denmark)

    The present issue is dedicated to Partial Evaluation and Semantics-Based Program Manipulation. Its first two articles were solicited from papers presented at PEPM 02, the 2002 ACM SIGPLAN Workshop on Partial Evaluation and Semantics-Based Program Manipulation [2], and its last two articles were solicited from papers presented at ASIA-PEPM 02, the 2002 SIGPLAN Symposium on Partial Evaluation and Semantics-Based Program Manipulation [1]. The four articles were subjected to the usual process of journal reviewing. "Cost-Augmented Partial Evaluation of Functional Logic Programs" extends previous narrowing-driven techniques of partial evaluation for functional-logic programs by the inclusion of abstract computation costs into the partial-evaluation process. A preliminary version of this work was presented at PEPM 02. "Specialization Scenarios: A Pragmatic Approach to Declaring Program Specialization...
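    As background for the partial-evaluation theme of the issue, here is a minimal Python sketch of specialization, using the classic textbook example of a power function specialized for a statically known exponent; it is only illustrative and does not reproduce the narrowing-driven, cost-augmented technique of the cited article:

        def power(x, n):
            # generic program: both arguments are dynamic
            result = 1
            for _ in range(n):
                result *= x
            return result

        def specialize_power(n):
            # hand-rolled partial evaluation: n is static, so the loop is unrolled
            # into a residual program that depends only on the dynamic input x
            body = " * ".join(["x"] * n) or "1"
            source = "def power_%d(x):\n    return %s\n" % (n, body)
            namespace = {}
            exec(source, namespace)
            return namespace["power_%d" % n]

        power_3 = specialize_power(3)
        print(power_3(5), power(5, 3))  # both print 125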

  3. WSPEEDI (worldwide version of SPEEDI): A computer code system for the prediction of radiological impacts on Japanese due to a nuclear accident in foreign countries

    Energy Technology Data Exchange (ETDEWEB)

    Chino, Masamichi; Yamazawa, Hiromi; Nagai, Haruyasu; Moriuchi, Shigeru [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ishikawa, Hirohiko

    1995-09-01

    A computer code system has been developed for near real-time dose assessment during radiological emergencies. The system WSPEEDI, the worldwide version of SPEEDI (System for Prediction of Environmental Emergency Dose Information), aims at predicting the radiological impact on the Japanese population due to a nuclear accident in foreign countries. WSPEEDI consists of a mass-consistent wind model, WSYNOP, for large-scale wind fields and a particle random walk model, GEARN, for atmospheric dispersion and dry and wet deposition of radioactivity. The models are integrated into a computer code system together with system control software, a worldwide geographic database, a meteorological data processor and graphic software. The performance of the models has been evaluated using the Chernobyl case with reliable source terms, well-established meteorological data and a comprehensive monitoring database. Furthermore, the response of the system has been examined by near real-time simulations of the European Tracer Experiment (ETEX), carried out over a range of about 2,000 km in Europe. (author).

  4. WSPEEDI (worldwide version of SPEEDI): A computer code system for the prediction of radiological impacts on Japanese due to a nuclear accident in foreign countries

    International Nuclear Information System (INIS)

    Chino, Masamichi; Yamazawa, Hiromi; Nagai, Haruyasu; Moriuchi, Shigeru; Ishikawa, Hirohiko.

    1995-09-01

    A computer code system has been developed for near real-time dose assessment during radiological emergencies. The system WSPEEDI, the worldwide version of SPEEDI (System for Prediction of Environmental Emergency Dose Information), aims at predicting the radiological impact on the Japanese population due to a nuclear accident in foreign countries. WSPEEDI consists of a mass-consistent wind model, WSYNOP, for large-scale wind fields and a particle random walk model, GEARN, for atmospheric dispersion and dry and wet deposition of radioactivity. The models are integrated into a computer code system together with system control software, a worldwide geographic database, a meteorological data processor and graphic software. The performance of the models has been evaluated using the Chernobyl case with reliable source terms, well-established meteorological data and a comprehensive monitoring database. Furthermore, the response of the system has been examined by near real-time simulations of the European Tracer Experiment (ETEX), carried out over a range of about 2,000 km in Europe. (author)
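    The GEARN component described above follows a Lagrangian particle random-walk approach; the Python sketch below shows that idea in its simplest one-dimensional form (uniform wind plus Gaussian turbulent displacements) and is purely illustrative, with all parameter values assumed:

        import numpy as np

        rng = np.random.default_rng(42)
        n_particles = 5000
        dt = 60.0   # time step [s], assumed
        u = 5.0     # mean wind speed [m/s], assumed
        K = 50.0    # horizontal eddy diffusivity [m**2/s], assumed
        x = np.zeros(n_particles)

        for _ in range(60):  # one hour of transport
            # advection by the mean wind plus a random-walk diffusion step
            x += u * dt + rng.normal(0.0, np.sqrt(2.0 * K * dt), n_particles)

        print("plume centre: %.0f m, spread: %.0f m" % (x.mean(), x.std()))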

  5. Ariadne version 4 - a program for simulation of QCD cascades implementing the colour dipole model

    International Nuclear Information System (INIS)

    Loennblad, L.

    1992-01-01

    The fourth version of the Ariadne program for generating QCD cascades in the colour dipole approximation is presented. The underlying physics issues are discussed and a manual for using the program is given together with a few sample programs. The major changes from previous versions are the introduction of photon radiation from quarks and inclusion of interfaces to the LEPTO and PYTHIA programs. (orig.)

  6. Computing Mass Properties From AutoCAD

    Science.gov (United States)

    Jones, A.

    1990-01-01

    Mass properties of structures computed from data in drawings. AutoCAD to Mass Properties (ACTOMP) computer program developed to facilitate quick calculations of mass properties of structures containing many simple elements in such complex configurations as trusses or sheet-metal containers. Mathematically modeled in AutoCAD or compatible computer-aided design (CAD) system in minutes by use of three-dimensional elements. Written in Microsoft Quick-Basic (Version 2.0).
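    To make that kind of calculation concrete, the Python sketch below combines the masses and centroids of a few simple elements into composite mass properties, the bookkeeping a tool like ACTOMP automates (the element data are invented for illustration):

        # each element: (mass [kg], centroid x, y, z [m]); values are illustrative only
        elements = [
            (2.0, 0.0, 0.0, 0.0),
            (1.5, 1.0, 0.0, 0.5),
            (0.5, 0.5, 0.8, 0.2),
        ]

        total_mass = sum(m for m, *_ in elements)
        cx = sum(m * x for m, x, y, z in elements) / total_mass
        cy = sum(m * y for m, x, y, z in elements) / total_mass
        cz = sum(m * z for m, x, y, z in elements) / total_mass
        print("mass = %.2f kg, centroid = (%.3f, %.3f, %.3f) m" % (total_mass, cx, cy, cz))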

  7. Integrated Reliability and Risk Analysis System (IRRAS) Version 2.0 user's guide

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1990-06-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Also provided in the system is an integrated full-screen editor for use when interfacing with remote mainframe computer systems. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.0 and is the subject of this user's guide. Version 2.0 of IRRAS provides all of the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance. 9 refs., 292 figs., 4 tabs

  8. Multiyear interactive computer almanac, 1800-2050

    CERN Document Server

    United States. Naval Observatory

    2005-01-01

    The Multiyear Interactive Computer Almanac (MICA Version 2.2.2) is a software system created by the U.S. Naval Observatory's Astronomical Applications Department that runs on modern versions of Windows and Macintosh computers, designed especially for astronomers, surveyors, meteorologists, navigators and others who regularly need accurate information on the positions, motions, and phenomena of celestial objects. MICA produces high-precision astronomical data in tabular form, tailored for the times and locations specified by the user. Unlike traditional almanacs, MICA computes these data in real time, eliminating the need for table look-ups and additional hand calculations. MICA tables can be saved as standard text files, enabling their use in other applications. Several important new features have been added to this edition of MICA, including: extended date coverage from 1800 to 2050; a redesigned user interface; a graphical sky map; a phenomena calculator (eclipses, transits, equinoxes, solstices, conjunctions, oppo...

  9. 8th International Workshop on Natural Computing

    CERN Document Server

    Hagiya, Masami

    2016-01-01

    This book highlights recent advances in natural computing, including biology and its theory, bio-inspired computing, computational aesthetics, computational models and theories, computing with natural media, philosophy of natural computing, and educational technology. It presents extended versions of the best papers selected from the “8th International Workshop on Natural Computing” (IWNC8), a symposium held in Hiroshima, Japan, in 2014. The target audience is not limited to researchers working in natural computing but also includes those active in biological engineering, fine/media art design, aesthetics, and philosophy.

  10. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core.

    Science.gov (United States)

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2018-03-09

    Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.
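    To show the declarative flavor of the format, here is a deliberately minimal, hypothetical SBML-like fragment read with Python's standard library; it omits most required attributes, so consult the Level 3 Version 2 specification (or use libSBML) for real work:

        import xml.etree.ElementTree as ET

        doc = """<sbml xmlns="http://www.sbml.org/sbml/level3/version2/core"
                       level="3" version="2">
          <model id="toy_model">
            <listOfSpecies>
              <species id="glucose"/>
              <species id="atp"/>
            </listOfSpecies>
          </model>
        </sbml>"""

        ns = {"sbml": "http://www.sbml.org/sbml/level3/version2/core"}
        root = ET.fromstring(doc)
        model = root.find("sbml:model", ns)
        print("model:", model.get("id"))
        for species in model.findall(".//sbml:species", ns):
            print("species:", species.get("id"))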

  11. Validation of the Spanish Addiction Severity Index Multimedia Version (S-ASI-MV).

    Science.gov (United States)

    Butler, Stephen F; Redondo, José Pedro; Fernandez, Kathrine C; Villapiano, Albert

    2009-01-01

    This study aimed to develop and test the reliability and validity of a Spanish adaptation of the ASI-MV, a computer administered version of the Addiction Severity Index, called the S-ASI-MV. Participants were 185 native Spanish-speaking adult clients from substance abuse treatment facilities serving Spanish-speaking clients in Florida, New Mexico, California, and Puerto Rico. Participants were administered the S-ASI-MV as well as Spanish versions of the general health subscale of the SF-36, the work and family unit subscales of the Social Adjustment Scale Self-Report, the Michigan Alcohol Screening Test, the alcohol and drug subscales of the Personality Assessment Inventory, and the Hopkins Symptom Checklist-90. Three-to-five-day test-retest reliability was examined along with criterion validity, convergent/discriminant validity, and factorial validity. Measurement invariance between the English and Spanish versions of the ASI-MV was also examined. The S-ASI-MV demonstrated good test-retest reliability (ICCs for composite scores between .59 and .93), criterion validity (rs for composite scores between .66 and .87), and convergent/discriminant validity. Factorial validity and measurement invariance were demonstrated. These results compared favorably with those reported for the original interviewer version of the ASI and the English version of the ASI-MV.

  12. Utilization of the RELAP4/MOD5/SAS code version in loss of coolant accident in the Angra 1 nuclear power station

    International Nuclear Information System (INIS)

    Sabundjian, G.; Freitas, R.L.

    1991-09-01

    A new version of the computer code RELAP4/MOD5 was developed to improve the output. The new version, called RELAP4/MOD5/SAS, prints the main variables in graphical form. In order to check the program, a 36-volume simulation of the loss-of-coolant accident for Angra 1 was performed, and the results, compared to those of an existing 44-volume simulation, showed satisfactory agreement with a substantial reduction in computing time. (author)

  13. Overlaid Alice: a statistical model computer code including fission and preequilibrium models

    International Nuclear Information System (INIS)

    Blann, M.

    1976-01-01

    This is the most recent edition of an evaporation code written earlier and since then frequently updated and improved. This version replaces the previously described version of Alice. A brief summary is given of the types of calculations which can be done. A listing of the code and the results of several sample calculations are presented.

  14. Data File Standard for Flow Cytometry, version FCS 3.1.

    Science.gov (United States)

    Spidlen, Josef; Moore, Wayne; Parks, David; Goldberg, Michael; Bray, Chris; Bierre, Pierre; Gorombey, Peter; Hyun, Bill; Hubbard, Mark; Lange, Simon; Lefebvre, Ray; Leif, Robert; Novo, David; Ostruszka, Leo; Treister, Adam; Wood, James; Murphy, Robert F; Roederer, Mario; Sudar, Damir; Zigon, Robert; Brinkman, Ryan R

    2010-01-01

    The flow cytometry data file standard provides the specifications needed to completely describe flow cytometry data sets within the confines of the file containing the experimental data. In 1984, the first Flow Cytometry Standard format for data files was adopted as FCS 1.0. This standard was modified in 1990 as FCS 2.0 and again in 1997 as FCS 3.0. We report here on the next generation flow cytometry standard data file format. FCS 3.1 is a minor revision based on suggested improvements from the community. The unchanged goal of the standard is to provide a uniform file format that allows files created by one type of acquisition hardware and software to be analyzed by any other type. The FCS 3.1 standard retains the basic FCS file structure and most features of previous versions of the standard. Changes included in FCS 3.1 address potential ambiguities in the previous versions and provide a more robust standard. The major changes include simplified support for international characters and improved support for storing compensation. The major additions are support for a preferred display scale, a standardized way of capturing the sample volume, information about the originality of the data file, and support for plate and well identification in high-throughput, plate-based experiments. Please see the normative version of the FCS 3.1 specification in the Supporting Information for this manuscript (or at http://www.isac-net.org/ in the Current standards section) for a complete list of changes.
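    As a small illustration of the fixed-width header that every FCS file begins with, the Python sketch below reads the version string and the six ASCII segment-offset fields; the byte positions follow my reading of the FCS 3.1 document and should be checked against the normative specification before use:

        def read_fcs_header(path):
            """Return the FCS version string and the TEXT/DATA/ANALYSIS
            segment byte offsets from the 58-byte fixed-width header."""
            with open(path, "rb") as f:
                header = f.read(58)
            version = header[0:6].decode("ascii")   # e.g. "FCS3.1"
            raw = [header[10 + 8 * i:18 + 8 * i].strip() or b"0" for i in range(6)]
            names = ["text_begin", "text_end", "data_begin",
                     "data_end", "analysis_begin", "analysis_end"]
            return version, dict(zip(names, (int(v) for v in raw)))

        # version, offsets = read_fcs_header("sample.fcs")  # hypothetical file name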

  15. Consumer and provider responses to a computerized version of the Illness Management and Recovery Program.

    Science.gov (United States)

    Wright-Berryman, Jennifer L; Salyers, Michelle P; O'Halloran, James P; Kemp, Aaron S; Mueser, Kim T; Diazoni, Amanda J

    2013-12-01

    To explore mental health consumer and provider responses to a computerized version of the Illness Management and Recovery (IMR) program. Semistructured interviews were conducted to gather data from 6 providers and 12 consumers who participated in a computerized prototype of the IMR program. An inductive-consensus-based approach was used to analyze the interview responses. Qualitative analysis revealed consumers perceived various personal benefits and ease of use afforded by the new technology platform. Consumers also highly valued provider assistance and offered several suggestions to improve the program. The largest perceived barriers to future implementation were lack of computer skills and access to computers. Similarly, IMR providers commented on its ease and convenience, and the reduction of time intensive material preparation. Providers also expressed that the use of technology creates more options for the consumer to access treatment. The technology was acceptable, easy to use, and well-liked by consumers and providers. Clinician assistance with technology was viewed as helpful to get clients started with the program, as lack of computer skills and access to computers was a concern. Access to materials between sessions appears to be desired; however, given perceived barriers of computer skills and computer access, additional supports may be needed for consumers to achieve full benefits of a computerized version of IMR. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  16. Corporations' Resistance to Innovation: The Adoption of the Internet Protocol Version 6

    Science.gov (United States)

    Pazdrowski, Tomasz

    2013-01-01

    Computer networks that brought unprecedented growth in global communication have been using Internet Protocol version 4 (IPv4) as a standard for routing. The exponential increase in the use of the networks caused an acute shortage of available identification numbers (IP addresses). The shortage and other network communication issues are…

  17. Integrative shell of the program complex MARS (Version 1.0) radiation transfer in three-dimensional geometries

    International Nuclear Information System (INIS)

    Degtyarev, I.I.; Lokhovitskij, A.E.; Maslov, M.A.; Yazynin, I.A.

    1994-01-01

    The first version of the integrative shell of the program complex MARS has been written for calculating radiation transfer in three-dimensional geometries. The integrative shell allows the user to work conveniently with the MARS complex, create input data files, and obtain graphic visualization of the calculated functions. Version 1.0 is adapted for personal computers of the IBM-286, 386 and 486 types with an operating memory of not less than 500K. 5 refs

  18. Computer-aided detection system performance on current and previous digital mammograms in patients with contralateral metachronous breast cancer

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min

    2012-01-01

    Background: The computer-aided detection (CAD) system is widely used for screening mammography. The performance of the CAD system for contralateral breast cancer has not been reported for women with a history of breast cancer. Purpose: To retrospectively evaluate the performance of a CAD system on current and previous mammograms in patients with contralateral metachronous breast cancer. Material and Methods: During a 3-year period, 4945 postoperative patients had follow-up examinations, from whom we selected 55 women with contralateral breast cancers. Among them, 38 had visible malignant signs on the current mammograms. We analyzed the sensitivity and false-positive marks of the system on the current and previous mammograms according to lesion type and breast density. Results: The total visible lesion components on the current mammograms included 27 masses and 14 calcifications in 38 patients. The case-based sensitivity for all lesion types was 63.2% (24/38) with false-positive marks of 0.71 per patient. The lesion-based sensitivity for masses and calcifications was 59.3% (16/27) and 71.4% (10/14), respectively. The lesion-based sensitivity for masses in fatty and dense breasts was 68.8% (11/16) and 45.5% (5/11), respectively. The lesion-based sensitivity for calcifications in fatty and dense breasts was 100.0% (3/3) and 63.6% (7/11), respectively. The total visible lesion components on the previous mammograms included 13 masses and three calcifications in 16 patients, and the sensitivity for all lesion types was 31.3% (5/16) with false-positive marks of 0.81 per patient. On these mammograms, the sensitivity for masses and calcifications was 30.8% (4/13) and 33.3% (1/3), respectively. The sensitivity in fatty and dense breasts was 28.6% (2/7) and 33.3% (3/9), respectively. Conclusion: In the women with a history of breast cancer, the sensitivity of the CAD system in visible contralateral breast cancer was lower than in most previous reports using the same CAD
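    The sensitivities quoted above are simple proportions; the Python lines below reproduce the arithmetic from the counts given in the abstract (24/38 cases, 16/27 masses, 10/14 calcifications):

        detected = {"case-based": (24, 38), "masses": (16, 27), "calcifications": (10, 14)}
        for label, (hit, total) in detected.items():
            print("%s sensitivity: %.1f%%" % (label, 100.0 * hit / total))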

  19. xdamp Version 6.100: An IDL(reg sign)-based data and image manipulation program

    International Nuclear Information System (INIS)

    Ballard, William Parker

    2012-01-01

    The original DAMP (DAta Manipulation Program) was written by Mark Hedemann of Sandia National Laboratories and used the CA-DISSPLA™ (available from Computer Associates International, Inc., Garden City, NY) graphics package as its engine. It was used to plot, modify, and otherwise manipulate the one-dimensional data waveforms (data vs. time) from a wide variety of accelerators. With the waning of CA-DISSPLA and the increasing popularity of Unix®-based workstations, a replacement was needed. This package uses the IDL® software, available from Research Systems Incorporated, a Xerox company, in Boulder, Colorado, as the engine, and creates a set of widgets to manipulate the data in a manner similar to the original DAMP and earlier versions of xdamp. IDL is currently supported on a wide variety of Unix platforms such as IBM® workstations, Hewlett Packard workstations, SUN® workstations, Microsoft® Windows™ computers, Macintosh® computers and Digital Equipment Corporation VMS® and Alpha® systems. Thus, xdamp is portable across many platforms. We have verified operation, albeit with some minor IDL bugs, on personal computers using Windows 7 and Windows Vista; Unix platforms; and Macintosh computers. Version 6 is an update that uses the IDL Virtual Machine to resolve the need for licensing IDL.

  20. The Light-Water-Reactor Version of the URANUS Integral fuel-rod code

    Energy Technology Data Exchange (ETDEWEB)

    Laßmann, K; Moreno, A

    1977-07-01

    The LWR version of the URANUS code, a digital computer programme for the thermal and mechanical analysis of fuel rods, is presented. Material properties are discussed and their effect on integral fuel rod behaviour elaborated via URANUS results for some carefully selected reference experiments. The numerical results do not represent post-irradiation analyses of in-pile experiments; rather, they illustrate typical and diverse URANUS capabilities. The performance test shows that URANUS is reliable and efficient; thus the code is a most valuable tool in fuel rod analysis work. K. Laßmann developed the LWR version of the URANUS code; material properties were reviewed and supplied by A. Moreno. (Author) 41 refs.

  1. AUS98 - The 1998 version of the AUS modular neutronic code system

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, G.S.; Harrington, B.V

    1998-07-01

    AUS is a neutronics code system which may be used for calculations of a wide range of fission reactors, fusion blankets and other neutron applications. The present version, AUS98, has a nuclear cross section library based on ENDF/B-VI and includes modules which provide for reactor lattice calculations, one-dimensional transport calculations, multi-dimensional diffusion calculations, cell and whole reactor burnup calculations, and flexible editing of results. Calculations of multi-region resonance shielding, coupled neutron and photon transport, energy deposition, fission product inventory and neutron diffusion are combined within the one code system. The major changes from the previous AUS publications are the inclusion of a cross-section library based on ENDF/B-VI, the addition of the MICBURN module for controlling whole reactor burnup calculations, and changes to the system as a consequence of moving from IBM main-frame computers to UNIX workstations. This report gives details of all system aspects of AUS and all modules except the POW3D multi-dimensional diffusion module. refs., tabs.

  2. AUS98 - The 1998 version of the AUS modular neutronic code system

    International Nuclear Information System (INIS)

    Robinson, G.S.; Harrington, B.V.

    1998-07-01

    AUS is a neutronics code system which may be used for calculations of a wide range of fission reactors, fusion blankets and other neutron applications. The present version, AUS98, has a nuclear cross section library based on ENDF/B-VI and includes modules which provide for reactor lattice calculations, one-dimensional transport calculations, multi-dimensional diffusion calculations, cell and whole reactor burnup calculations, and flexible editing of results. Calculations of multi-region resonance shielding, coupled neutron and photon transport, energy deposition, fission product inventory and neutron diffusion are combined within the one code system. The major changes from the previous AUS publications are the inclusion of a cross-section library based on ENDF/B-VI, the addition of the MICBURN module for controlling whole reactor burnup calculations, and changes to the system as a consequence of moving from IBM main-frame computers to UNIX workstations. This report gives details of all system aspects of AUS and all modules except the POW3D multi-dimensional diffusion module.

  3. TOUGH2 User's Guide Version 2

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Oldenburg, C.M.; Moridis, G.J.

    1999-11-01

    TOUGH2 is a numerical simulator for nonisothermal flows of multicomponent, multiphase fluids in one, two, and three-dimensional porous and fractured media. The chief applications for which TOUGH2 is designed are in geothermal reservoir engineering, nuclear waste disposal, environmental assessment and remediation, and unsaturated and saturated zone hydrology. TOUGH2 was first released to the public in 1991; the 1991 code was updated in 1994 when a set of preconditioned conjugate gradient solvers was added to allow a more efficient solution of large problems. The current Version 2.0 features several new fluid property modules and offers enhanced process modeling capabilities, such as coupled reservoir-wellbore flow, precipitation and dissolution effects, and multiphase diffusion. Numerous improvements in previously released modules have been made and new user features have been added, such as enhanced linear equation solvers, and writing of graphics files. The T2VOC module for three-phase flows of water, air and a volatile organic chemical (VOC), and the T2DM module for hydrodynamic dispersion in 2-D flow systems have been integrated into the overall structure of the code and are included in the Version 2.0 package. Data inputs are upwardly compatible with the previous version. Coding changes were generally kept to a minimum, and were only made as needed to achieve the additional functionalities desired. TOUGH2 is written in standard FORTRAN77 and can be run on any platform, such as workstations, PCs, Macintosh, mainframe and supercomputers, for which appropriate FORTRAN compilers are available. This report is a self-contained guide to application of TOUGH2 to subsurface flow problems. It gives a technical description of the TOUGH2 code, including a discussion of the physical processes modeled, and the mathematical and numerical methods used. Illustrative sample problems are presented along with detailed instructions for preparing input data.

  4. Computer Graphics Research Laboratory Quarterly Progress Report Number 49, July-September 1993

    Science.gov (United States)

    1993-11-22

    Contents include: Texture Sampling and Strength Guided Motion (Jeffry S. Nimeroff); Radiosity (Min-Zhi Shao); Blended Shape Primitives (Douglas DeCarlo). The report discusses extensions of radiosity rendering and blended shape primitives with their applications in computer vision. An improved version of the radiosity renderer is included; this version uses a fast over-relaxation progressive refinement algorithm.

  5. GROGi-F. Modified version of GROGi 2 nuclear evaporation computer code including fission decay channel

    International Nuclear Information System (INIS)

    Delagrange, H.

    1977-01-01

    This report is the user manual of the GROGi-F code, a modified version of the GROGi-2 code. It calculates the cross sections for heavy-ion-induced fission. Fission probabilities are calculated via the Bohr-Wheeler formalism.

  6. Air Space Proportion in Pterosaur Limb Bones Using Computed Tomography and Its Implications for Previous Estimates of Pneumaticity

    Science.gov (United States)

    Martin, Elizabeth G.; Palmer, Colin

    2014-01-01

    Air Space Proportion (ASP) is a measure of how much air is present within a bone, which allows for a quantifiable comparison of pneumaticity between specimens and species. Measured from zero to one, higher ASP means more air and less bone. Conventionally, it is estimated from measurements of the internal and external bone diameter, or by analyzing cross-sections. To date, the only pterosaur ASP study has been carried out by visual inspection of sectioned bones within matrix. Here, computed tomography (CT) scans are used to calculate ASP in a small sample of pterosaur wing bones (mainly phalanges) and to assess how the values change throughout the bone. These results show higher ASPs than previous pterosaur pneumaticity studies, and more significantly, higher ASP values in the heads of wing bones than the shaft. This suggests that pneumaticity has been underestimated previously in pterosaurs, birds, and other archosaurs when shaft cross-sections are used to estimate ASP. Furthermore, ASP in pterosaurs is higher than those found in birds and most sauropod dinosaurs, giving them among the highest ASP values of animals studied so far, supporting the view that pterosaurs were some of the most pneumatized animals to have lived. The high degree of pneumaticity found in pterosaurs is proposed to be a response to the wing bone bending stiffness requirements of flight rather than a means to reduce mass, as is often suggested. Mass reduction may be a secondary result of pneumaticity that subsequently aids flight. PMID:24817312
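    For orientation, the conventional diameter-based estimate mentioned above can be written as a one-line calculation; the Python sketch below assumes a circular, tube-like cross-section (so areas scale with diameter squared), and the measurements are invented for illustration:

        def asp_from_diameters(inner_d, outer_d):
            # ASP = air area / total area = (inner_d / outer_d) ** 2
            # under the assumption of a circular tubular cross-section
            return (inner_d / outer_d) ** 2

        # hypothetical wing-phalanx measurements in mm
        print(round(asp_from_diameters(8.4, 10.0), 2))  # -> 0.71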

  7. The implementation of the CDC version of RELAP5/MOD1/019 on an IBM compatible computer system (AMDAHL 470/V8)

    International Nuclear Information System (INIS)

    Kolar, W.; Brewka, W.

    1984-01-01

    RELAP5/MOD1 is an advanced one-dimensional best-estimate system code, which is used for safety analysis studies of nuclear pressurized water reactor systems and related integral and separate effect test facilities. The program predicts the system response for large break LOCA, small break LOCA and special transients. To a large extent RELAP5/MOD1 is written in Fortran; only a small part of the program is coded in CDC assembler. RELAP5/MOD1 was developed on the CDC CYBER 176 at INEL*. The code development team made use of CDC system programs such as the CDC UPDATE facility and incorporated special-purpose software packages into the program. The report describes the problems which were encountered when implementing the CDC version of RELAP5/MOD1 on an IBM-compatible computer system (AMDAHL 470/V8).

  8. Dental Fear Survey: A Cross-Sectional Study Evaluating the Psychometric Properties of the Brazilian Portuguese Version

    Directory of Open Access Journals (Sweden)

    Maurício Antônio Oliveira

    2014-01-01

    Objective. The aim of this study was to evaluate the psychometric properties of the Brazilian version of the Dental Fear Survey (DFS), previously translated into Brazilian Portuguese and validated. Methods. A cross-sectional study with 1,256 undergraduates from the city of Belo Horizonte, Brazil, was carried out. The DFS and a questionnaire about previous dental experiences were self-administered. Data analysis involved descriptive statistics, principal components analysis (PCA), confirmatory factor analysis (CFA), internal consistency and test-retest reliability, and construct, discriminant, and convergent validity. Results. PCA identified a three-factor structure. CFA confirmed the multidimensionality of the Brazilian version of the DFS. A modified model of the Brazilian version of the DFS fits better than the hypothesized model. The Cronbach’s alpha coefficient for the total DFS scale was 0.95. Conclusion. The DFS demonstrated acceptable construct validity, convergent validity, and discriminant validity. These results supported the reliability and validity of the DFS among Brazilian undergraduates.

  9. PCACE-Personal-Computer-Aided Cabling Engineering

    Science.gov (United States)

    Billitti, Joseph W.

    1987-01-01

    PCACE computer program developed to provide inexpensive, interactive system for learning and using engineering approach to interconnection systems. Basically database system that stores information as files of individual connectors and handles wiring information in circuit groups stored as records. Directly emulates typical manual engineering methods of handling data, thus making interface between user and program very natural. Apple version written in P-Code Pascal and IBM PC version of PCACE written in TURBO Pascal 3.0

  10. High Performance Computing Multicast

    Science.gov (United States)

    2012-02-01

    Front-matter fragment: a reference to “A History of the Virtual Synchrony Replication Model,” in Replication: Theory and Practice, Charron-Bost, B., Pedone, F., and Schiper, A. (Eds.), followed by part of the abbreviation list: High Performance Computing; IP/IPv4, Internet Protocol (version 4.0); IPMC, Internet Protocol MultiCast; LAN, Local Area Network; MCMD, Dr. Multicast; MPI ...

  11. HPC Institutional Computing Project: W15_lesreactiveflow KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Carrington, David Bradley [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Waters, Jiajia [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-05

    KIVA-hpFE is a high performance computer software for solving the physics of multi-species and multiphase turbulent reactive flow in complex geometries having immersed moving parts. The code is written in Fortran 90/95 and can be used on any computer platform with any popular compiler. The code comes in two versions, a serial version and a parallel version utilizing MPICH2-type Message Passing Interface (MPI or Intel MPI) for solving distributed domains. The parallel version is at least 30x faster than the serial version and much faster than our previous generation of parallel engine modeling software, by many factors. The 5th generation algorithm construction is a Galerkin-type Finite Element Method (FEM) solving conservative momentum, species, and energy transport equations along with a two-equation k-ω Reynolds-Averaged Navier-Stokes (RANS) turbulence model and a Vreman-type dynamic Large Eddy Simulation (LES) method. The LES method is capable of modeling transitional flow from laminar to fully turbulent; therefore, this LES method does not require special hybrid or blending treatment at walls. The FEM projection method also uses a Petrov-Galerkin (P-G) stabilization along with pressure stabilization. We employ hierarchical basis sets, constructed on the fly, with enrichment in areas associated with relatively larger error as determined by error estimation methods. In addition, when not using the hp-adaptive module, the code employs Lagrangian basis or shape functions. The shape functions are constructed for hexahedral, prismatic and tetrahedral elements. The software is designed to solve many types of reactive flow problems, from burners to internal combustion engines and turbines. In addition, the formulation allows for direct integration of solid bodies (conjugate heat transfer), as in heat transfer through housings, parts, and cylinders. It can also easily be extended to stress modeling of solids, used in fluid-structure interaction problems, solidification, porous media

  12. Nuclear Engine System Simulation (NESS). Version 2.0: Program user's guide. Final Report

    International Nuclear Information System (INIS)

    Pelaccio, D.G.; Scheil, C.M.; Petrosky, L.

    1993-03-01

    This Program User's Guide discusses the Nuclear Thermal Propulsion (NTP) engine system design features and capabilities modeled in the Nuclear Engine System Simulation (NESS): Version 2.0 program (referred to as NESS throughout the remainder of this document), as well as its operation. NESS was upgraded to include many new modeling capabilities not available in the original version delivered to NASA LeRC in December 1991. NESS's new features include the following: (1) an improved input format; (2) an advanced solid-core NERVA-type reactor system model (ENABLER 2); (3) a bleed-cycle engine system option; (4) an axial-turbopump design option; (5) an automated pump-out turbopump assembly sizing option; (6) an off-design gas generator engine cycle design option; (7) updated hydrogen properties; (8) an improved output format; and (9) personal computer operation capability. Sample design cases are presented in the user's guide that demonstrate many of the new features associated with this upgraded version of NESS, as well as design modeling features associated with the original version of NESS.

  13. Conversion of the COBRA-IV-I code from CDC CYBER to HP 9000/700 version

    International Nuclear Information System (INIS)

    Sohn, D. S.; Yoo, Y. J.; Nahm, K. Y.; Hwang, D. H.

    1996-01-01

    COBRA-IV-I is a multichannel analysis code for the thermal-hydraulic analysis of rod bundle nuclear fuel elements and cores based on the subchannel approach. The existing COBRA-IV-I code is the Control Data Corporation (CDC) CYBER version, which has limitations on the computer core storage and gives some inconvenience to the user interface. To solve these problems, we have converted the COBRA-IV-I code from the CDC CYBER mainframe to a Hewlett-Packard (HP) 9000/700-series workstation version, and have verified the converted code. As a result, we have found almost no difference between the two versions in their calculation results. Therefore we expect the HP 9000/700 version of the COBRA-IV-I code to be the basis for the future development of an improved multichannel analysis code under a more convenient user environment. (author). 3 tabs., 2 figs., 8 refs

  14. Optical computing: introduction by the feature editors.

    Science.gov (United States)

    Li, Y; Tanida, J; Tooley, F; Wagner, K

    1996-03-10

    This feature issue of Applied Optics: Information Processing contains 19 papers on Optical Computing. Many of these papers are expanded versions of presentations given at the Optical Society of America's Sixth Topical Meeting on Optical Computing held in Salt Lake City, Utah, in March 1995. This introduction provides a brief historical account of the series of optical computing meetings and a brief review of the papers contained in this special issue.

  15. Users' manual for LEHGC: A Lagrangian-Eulerian Finite-Element Model of Hydrogeochemical Transport Through Saturated-Unsaturated Media. Version 1.1

    International Nuclear Information System (INIS)

    Yeh, Gour-Tsyh

    1995-11-01

    The computer program LEHGC is a Hybrid Lagrangian-Eulerian Finite-Element Model of HydroGeo-Chemical (LEHGC) Transport Through Saturated-Unsaturated Media. LEHGC iteratively solves two-dimensional transport and geochemical equilibrium equations and is a descendant of HYDROGEOCHEM, a strictly Eulerian finite-element reactive transport code. The hybrid Lagrangian-Eulerian scheme improves on the Eulerian scheme by allowing larger time steps to be used in the advection-dominant transport calculations. This causes less numerical dispersion and alleviates the problem of calculated negative concentrations at sharp concentration fronts. The code also is more computationally efficient than the strictly Eulerian version. LEHGC is designed for generic application to reactive transport problems associated with contaminant transport in subsurface media. Input to the program includes the geometry of the system, the spatial distribution of finite elements and nodes, the properties of the media, the potential chemical reactions, and the initial and boundary conditions. Output includes the spatial distribution of chemical element concentrations as a function of time and space and the chemical speciation at user-specified nodes. LEHGC Version 1.1 is a modification of LEHGC Version 1.0. The modification includes: (1) devising a tracking algorithm with the computational effort proportional to N, where N is the number of computational grid nodes, rather than N² as in LEHGC Version 1.0, (2) including multiple adsorbing sites and multiple ion-exchange sites, (3) using four preconditioned conjugate gradient methods for the solution of matrix equations, and (4) providing a model for some features of solute transport by colloids

  16. Procedure guideline for radioiodine test (version 3); Verfahrensanweisung zum Radioiodtest (Version 3)

    Energy Technology Data Exchange (ETDEWEB)

    Dietlein, M.; Schicha, H. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Koeln Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin; Dressler, J. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Nuklearmedizinische Klinik der Henriettenstiftung, Hannover (Germany); Eschner, W. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Deutsche Gesellschaft fuer Medizinische Physik (DGMP) (Germany); Koeln Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin; Lassmann, M. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Deutsche Gesellschaft fuer Medizinische Physik (DGMP) (Germany); Wuerzburg Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin; Leisner, B. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Allgemeines Krankenhaus St. Georg, Hamburg (Germany). Abt. fuer Nuklearmedizin; Reiners, C. [Deutsche Gesellschaft fuer Nuklearmedizin (DGN) (Germany); Wuerzburg Univ. (Germany). Klinik und Poliklinik fuer Nuklearmedizin

    2007-07-01

    Version 3 of the procedure guideline for the radioiodine test is an update of the guideline previously published in 2003. The procedure guideline discusses the pros and cons of a single measurement or of repeated measurements of the iodine-131 uptake and their optimal timing. Different formulas are described for the cases where one, two or three values of the radioiodine kinetics are available. The probe with a sodium iodide crystal and, alternatively or additionally, the gamma camera using the ROI technique are the instruments for measuring iodine-131 uptake. A possible source of error is an inappropriate measurement (sonography) of the target volume. The patients' preparation includes the withdrawal of antithyroid drugs 2-3 days before radioiodine administration. The patient has to avoid iodine-containing medication, and the possibility of iodine additives in vitamin and electrolyte supplements has to be considered. (orig.)

  17. 76 FR 13984 - Cloud Computing Forum & Workshop III

    Science.gov (United States)

    2011-03-15

    ..., Reference Architecture and Taxonomy, Target USG Agency Business Use Cases and SAJACC were formed. The... Technology Roadmap; a series of high-value target U.S. Government Agency Cloud Computing Business Use Cases; a first version of a neutral cloud computing reference architecture and taxonomy; the NIST Standards...

  18. FastChem: A computer program for efficient complex chemical equilibrium calculations in the neutral/ionized gas phase with applications to stellar and planetary atmospheres

    Science.gov (United States)

    Stock, Joachim W.; Kitzmann, Daniel; Patzer, A. Beate C.; Sedlmayr, Erwin

    2018-06-01

    For the calculation of complex neutral/ionized gas phase chemical equilibria, we present a semi-analytical, versatile and efficient computer program called FastChem. The applied method is based on the solution of a system of coupled nonlinear (and linear) algebraic equations, namely the law of mass action and the element conservation equations including charge balance, in many variables. Specifically, the system of equations is decomposed into a set of coupled nonlinear equations in one variable each, which are solved analytically whenever feasible to reduce computation time. Notably, the electron density is determined by using the method of Nelder and Mead at low temperatures. The program is written in object-oriented C++, which makes it easy to couple the code with other programs, although a stand-alone version is provided. FastChem can be used in parallel or sequentially and is available under the GNU General Public License version 3 at https://github.com/exoclime/FastChem together with several sample applications. The code has been successfully validated against previous studies, and its convergence behavior has been tested even for extreme physical parameter ranges down to 100 K and up to 1000 bar. FastChem converges stably and robustly even in the most demanding chemical situations, which have sometimes posed extreme challenges for previous algorithms.
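
    The decomposition strategy described above can be illustrated with a toy single-element system. The sketch below is not FastChem code; it only shows how a mass-action law and an element-conservation equation for a hypothetical H/H2 mixture collapse into one quadratic equation in a single variable with an analytical root, using arbitrary illustrative numbers for the element abundance and equilibrium constant.

```python
# Toy illustration of the one-variable decomposition idea (not FastChem itself):
# mass action    n_H2 = K * n_H**2
# conservation   n_H + 2 * n_H2 = N_H
# combine into   2*K*n_H**2 + n_H - N_H = 0, solved in closed form.
import math

def h_h2_equilibrium(N_H, K):
    """Return (n_H, n_H2) from the analytical root of the combined quadratic."""
    n_H = (-1.0 + math.sqrt(1.0 + 8.0 * K * N_H)) / (4.0 * K)
    n_H2 = K * n_H**2
    return n_H, n_H2

n_H, n_H2 = h_h2_equilibrium(N_H=1.0e12, K=1.0e-13)   # illustrative numbers only
print(n_H, n_H2, n_H + 2.0 * n_H2)                    # last value recovers N_H
```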

  19. On the interpretability and computational reliability of frequency-domain Granger causality [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-09-01

    This Correspondence article is a comment which directly relates to the paper “A study of problems encountered in Granger causality analysis from a neuroscience perspective” (Stokes and Purdon, 2017). We agree that interpretation issues of Granger causality (GC) in neuroscience exist, partially due to the historically unfortunate use of the name “causality”, as described in previous literature. On the other hand, we think that Stokes and Purdon use a formulation of GC which is outdated (albeit still used) and do not fully account for the potential of the different frequency-domain versions of GC; in doing so, their paper dismisses GC measures based on a suboptimal use of them. Furthermore, since data from simulated systems are used, the pitfalls found with the used formulation are presented as general, and not limited to neuroscience. It would be a pity if this paper, even if written in good faith, became a wildcard against all possible applications of GC, regardless of the large body of work recently published which aims to address faults in methodology and interpretation. In order to provide a balanced view, we replicate the simulations of Stokes and Purdon, using an updated GC implementation and exploiting the combination of spectral and causal information, showing that in this way the pitfalls are mitigated or directly solved.
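
    As background for the discussion above, the sketch below shows the simplest time-domain form of Granger causality (the log ratio of restricted to full prediction-error variances from an order-1 VAR). It is an illustrative stand-in only and does not implement the frequency-domain estimators debated by Stokes and Purdon or by the authors of this Correspondence.

```python
# Minimal time-domain Granger causality sketch (order-1 VAR, two signals).
import numpy as np

def gc_x_to_y(x, y, p=1):
    """GC from x to y as log ratio of restricted vs. full residual variances."""
    n = len(y)
    Y = y[p:]
    # restricted model: y(t) regressed on its own past only
    Xr = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    # full model: y(t) regressed on past of y and past of x
    Xf = np.column_stack([Xr] + [x[p - k - 1:n - k - 1] for k in range(p)])
    res_r = Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]
    res_f = Y - Xf @ np.linalg.lstsq(Xf, Y, rcond=None)[0]
    return np.log(np.var(res_r) / np.var(res_f))

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.zeros_like(x)
for t in range(1, len(x)):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.1 * rng.standard_normal()
print(gc_x_to_y(x, y))   # clearly > 0: the past of x helps predict y
print(gc_x_to_y(y, x))   # near 0: the past of y does not help predict x
```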

  20. Development of EASYQAD version β: A Visualization Code System for QAD-CGGP-A Gamma and Neutron Shielding Calculation Code

    International Nuclear Information System (INIS)

    Kim, Jae Cheon; Lee, Hwan Soo; Ha, Pham Nhu Viet; Kim, Soon Young; Shin, Chang Ho; Kim, Jong Kyung

    2007-01-01

    EASYQAD had previously been developed using the MATLAB GUI (Graphical User Interface) in order to conveniently perform gamma and neutron shielding calculations at Hanyang University. It had been completed as version α of the radiation shielding analysis code. In this study, EASYQAD was upgraded to version β with many additional functions and more user-friendly graphical interfaces. So that general users can run it in a Windows XP environment without any MATLAB installation, this version was developed as a standalone code system.

  1. Translation of ARAC computer codes

    International Nuclear Information System (INIS)

    Takahashi, Kunio; Chino, Masamichi; Honma, Toshimitsu; Ishikawa, Hirohiko; Kai, Michiaki; Imai, Kazuhiko; Asai, Kiyoshi

    1982-05-01

    In 1981 we translated the well-known MATHEW and ADPIC codes and their auxiliary computer codes from the CDC 7600 computer version to the FACOM M-200. The codes form part of the Atmospheric Release Advisory Capability (ARAC) system of Lawrence Livermore National Laboratory (LLNL). MATHEW is a code for three-dimensional wind field analysis. Using observed data, it calculates the mass-consistent wind field of grid cells by a variational method. ADPIC is a code for three-dimensional concentration prediction of gases and particulates released to the atmosphere. It calculates concentrations in grid cells by the particle-in-cell method. The codes are written in LLLTRAN, i.e., the LLNL Fortran language, and are implemented on the CDC 7600 computers of LLNL. In this report, (i) the computational methods of MATHEW/ADPIC and their auxiliary codes, (ii) comparisons of the calculated results with our JAERI particle-in-cell and Gaussian plume models, and (iii) the translation procedures from the CDC version to the FACOM M-200 are described. With the permission of LLNL G-Division, this report is published to keep track of the translation procedures and to serve our JAERI researchers for comparisons and references in their work. (author)

  2. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core

    Directory of Open Access Journals (Sweden)

    Hucka Michael

    2018-03-01

    Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.
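
    To make the declarative, XML-based nature of SBML concrete, the snippet below builds a bare skeleton of an SBML Level 3 Version 2 document with the Python standard library. It is only a shape sketch: the namespace URI and the attribute set shown are assumptions to be checked against the specification at http://sbml.org/, and real models would normally be produced with libSBML rather than by hand.

```python
# Rough shape of an SBML Level 3 Version 2 document (illustrative skeleton only;
# the namespace and required attributes should be verified against the spec).
import xml.etree.ElementTree as ET

SBML_NS = "http://www.sbml.org/sbml/level3/version2/core"   # assumed L3V2 core namespace
ET.register_namespace("", SBML_NS)

sbml = ET.Element(f"{{{SBML_NS}}}sbml", {"level": "3", "version": "2"})
model = ET.SubElement(sbml, f"{{{SBML_NS}}}model", {"id": "example_model"})

compartments = ET.SubElement(model, f"{{{SBML_NS}}}listOfCompartments")
ET.SubElement(compartments, f"{{{SBML_NS}}}compartment",
              {"id": "cell", "constant": "true"})

species_list = ET.SubElement(model, f"{{{SBML_NS}}}listOfSpecies")
ET.SubElement(species_list, f"{{{SBML_NS}}}species",
              {"id": "glucose", "compartment": "cell",
               "hasOnlySubstanceUnits": "false",
               "boundaryCondition": "false", "constant": "false"})

print(ET.tostring(sbml, encoding="unicode"))
```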

  3. Zgoubi user's guide. Version 4

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Fermi National Accelerator Lab., Batavia, IL (United States). Dept. of Physics; Valero, S. [CEA, Gif-sur-Yvette (France)

    1997-10-15

    The computer code Zgoubi calculates trajectories of charged particles in magnetic and electric fields. Originally adapted to the definition and adjustment of beam lines and magnetic spectrometers, it has evolved to allow the study of systems including complex sequences of optical elements such as dipoles, quadrupoles, arbitrary multipoles and other magnetic or electric devices, and it can also handle periodic structures. Compared to other codes, it presents several peculiarities: (1) a numerical method for integrating the Lorentz equation, based on Taylor series, which optimizes computing time and provides high accuracy and strong symplecticity; (2) spin tracking, using the same numerical method as for the Lorentz equation; (3) calculation of the synchrotron radiation electric field and spectra in arbitrary magnetic fields, from the ray-tracing outcomes; (4) the possibility of using a mesh, which allows ray-tracing from simulated or measured (1-D, 2-D or 3-D) field maps; (5) Monte Carlo procedures: unlimited number of trajectories, in-flight decay, etc.; (6) a built-in fitting procedure; (7) multiturn tracking in circular accelerators, including many features proper to machine parameter calculation and survey, and also the simulation of time-varying power supplies. The initial version of the code, dedicated to ray-tracing in magnetic fields, was developed by D. Garreta and J.C. Faivre at CEN-Saclay in the early 1970s. It was perfected for the purpose of studying the four spectrometers (SPES I, II, III, IV) at the Laboratoire National Saturne (CEA-Saclay, France), and SPEG at Ganil (Caen, France). It is now in use in several national and foreign laboratories. This manual is intended only to describe the details of the most recent version of Zgoubi, which is far from being a "finished product".

  4. Survey of computer vision technology for UAV navigation

    Science.gov (United States)

    Xie, Bo; Fan, Xiang; Li, Sijian

    2017-11-01

    Navigation based on computer vision technology, which has the characteristics of strong independence and high precision and is not susceptible to electrical interference, has attracted more and more attention in the field of UAV navigation research. Early navigation projects based on computer vision technology were mainly applied to autonomous ground robots. In recent years, visual navigation systems have been widely applied to unmanned aircraft, deep space probes and underwater robots, which has further stimulated research on integrated navigation algorithms based on computer vision. In China, with the development of many types of UAV and the start of the third phase of the lunar exploration project, there has been significant progress in the study of visual navigation. The paper reviews the development of computer-vision-based navigation in the field of UAV navigation research and concludes that visual navigation is mainly applied to three aspects. (1) Acquisition of UAV navigation parameters. Parameters including UAV attitude, position and velocity can be obtained from the relationship between sensor images and the carrier's attitude, the relationship between instant matching images and reference images, and the relationship between the carrier's velocity and the characteristics of sequential images. (2) Autonomous obstacle avoidance. There are many ways to achieve obstacle avoidance in UAV navigation; the methods based on computer vision, including feature matching, template matching, image frames and so on, are mainly introduced. (3) Target tracking and positioning. Using the obtained images, the UAV position is calculated by using the optical flow method, the MeanShift algorithm, the CamShift algorithm, Kalman filtering and particle filter algorithms. The paper also describes three kinds of mainstream visual system. (1) High speed visual system. It uses a parallel structure, with which image detection and processing are
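
    Of the tracking methods listed above, Kalman filtering is the easiest to show compactly. The sketch below is a generic one-dimensional constant-velocity Kalman filter on a noisy position measurement; it is an illustrative example only, not a UAV tracker operating on image coordinates, and all of the noise parameters are made up.

```python
# Minimal constant-velocity Kalman filter for 1-D target tracking (illustrative).
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition: [position, velocity]
H = np.array([[1.0, 0.0]])                # only position is measured
Q = 1e-3 * np.eye(2)                      # process noise covariance (assumed)
R = np.array([[0.05]])                    # measurement noise covariance (assumed)

x = np.array([[0.0], [0.0]])              # initial state estimate
P = np.eye(2)                             # initial estimate covariance

rng = np.random.default_rng(0)
true_pos = 0.0
for _ in range(50):
    true_pos += 1.0 * dt                                  # target moves at 1 unit/s
    z = np.array([[true_pos + 0.2 * rng.standard_normal()]])
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(x.ravel())   # estimated [position, velocity], close to [5.0, 1.0]
```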

  5. OMWS: A Web Service Interface for Ecological Niche Modelling

    Directory of Open Access Journals (Sweden)

    Renato De Giovanni

    2015-09-01

    Ecological niche modelling (ENM) experiments often involve a large number of tasks to be performed. Such tasks may consume a significant amount of computing resources and take a long time to complete, especially when using personal computers. OMWS is a Web service interface that allows more powerful computing back-ends to be remotely exploited by other applications to carry out ENM tasks. Its latest version includes a new operation that can be used to specify complex workflows in a single request, adding the possibility of using workflow management systems on parallel computing back-ends. In this paper we describe the OMWS protocol and compare its most recent version with the previous one by running the same ENM experiment using two functionally equivalent clients, each designed for one of the OMWS interface versions. Different back-end configurations were used to investigate how the performance scales for each protocol version when more processing power is made available. Results show that the new version outperforms the previous one by a factor of 2 when more computing resources are used.

  6. An update to the Surface Ocean CO2 Atlas (SOCAT version 2)

    Digital Repository Service at National Institute of Oceanography (India)

    Bakker, D.C.E.; Hankin, S.; Olsen, A; Pfeil, B.; Smith, K.; Alin, S.R.; Cosca, C.; Hales, B.; Harasawa, S.; Kozyr, A; Nojiri, Y.; OBrien, K.M.; Schuster, U.; Telszewski, M.; Tilbrook, B.; Wada, C.; Akl, J.; Barbero, L.; Bates, N.; Boutin, J.; Cai, W.J.; Castle, R.D.; Chavez, F.; Chen, L.; Chierici, M.; Currie, K.; Evans, W.; Feely, R.A; Fransson, A; Gao, Z.; Hardman-Mountford, N.; Hoppema, M.; Huang, W.J.; Hunt, C.W.; Huss, B.; Ichikawa, T.; Jacobson, A; Johannessen, T.; Jones, E.M.; Jones, S.; Sara, J.; Kitidis, V.; Kortzinger, A.; Lauvset, S.; Lefevre, N.; Manke, A.B.; Mathis, J.; Metzl, N.; Monteiro, P.; Murata, A.; Newberger, T.; Nobuo, T.; Ono, T.; Paterson, K.; Pierrot, D.; Rios, A.F.; Sabine, C.L.; Saito, S.; Salisbury, J.; Sarma, V.V.S.S.; Schlitzer, R.; Sieger, R.; Skjelvan, I.; Steinhoff, T.; Sullivan, K.; Sutherland, S.C.; Suzuki, T.; Sutton, A.; Sweeney, C.; Takahashi, T.; Tjiputra, J.; VanHeuven, S.; Vandemark, D.; Vlahos, P.; Wallace, D.W.R.; Wanninkhof, R.; Watson, A.J.

    Version 2 of SOCAT is an update of the previous release (version 1) with more data (increased from 6.3 million to 10.1 million surface water fCO2 values) and extended data coverage (from 1968–2007 to 1968–2011). The quality control criteria, while...

  7. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.

  8. UPEML, Computer Independent Emulator of CDC Update Utility

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: UPEML is a machine-portable CDC UPDATE emulation program. It is capable of emulating a significant subset of the standard CDC UPDATE functions, including program library creation and subsequent modification. 2 - Method of solution: UPEML was originally written to facilitate the use of CDC-based scientific packages on alternate computers. In addition to supporting computers such as the VAX/VMS, IBM, and CRAY/COS, Version 3.0 now supports UNIX workstations and the CRAY/UNICOS operating system. Several program bugs have been corrected in Version 3.0. Version 3.0 has several new features, including 1) improved error checking, 2) the ability to use *ADDFILE and READ from nested files, 3) creation of a compile file on creation, 4) allowing identifiers to begin with numbers, and 5) the ability to control warning messages and program termination on error conditions. 3 - Restrictions on the complexity of the problem: None noted.

  9. AERO2S - SUBSONIC AERODYNAMIC ANALYSIS OF WINGS WITH LEADING- AND TRAILING-EDGE FLAPS IN COMBINATION WITH CANARD OR HORIZONTAL TAIL SURFACES (CDC VERSION)

    Science.gov (United States)

    Darden, C. M.

    1994-01-01

    necessary only to add an identification record and the namelist data that are to be changed from the previous run. This code was originally developed in 1989 in FORTRAN V on a CDC 6000 computer system, and was later ported to an MS-DOS environment. Both versions are available from COSMIC. There are only a few differences between the PC version (LAR-14458) and CDC version (LAR-14178) of AERO2S distributed by COSMIC. The CDC version has one main source code file while the PC version has two files which are easier to edit and compile on a PC. The PC version does not require a FORTRAN compiler which supports NAMELIST because a special INPUT subroutine has been added. The CDC version includes two MODIFY decks which can be used to improve the code and prevent the possibility of some infrequently occurring errors while PC-version users will have to make these code changes manually. The PC version includes an executable which was generated with the Ryan McFarland/FORTRAN compiler and requires 253K RAM and an 80x87 math co-processor. Using this executable, the sample case requires about four hours to execute on an 8MHz AT-class microcomputer with a co-processor. The source code conforms to the FORTRAN 77 standard except that it uses variables longer than six characters. With two minor modifications, the PC version should be portable to any computer with a FORTRAN compiler and sufficient memory. The CDC version of AERO2S is available in CDC NOS Internal format on a 9-track 1600 BPI magnetic tape. The PC version is available on a set of two 5.25 inch 360K MS-DOS format diskettes. IBM AT is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. CDC is a registered trademark of Control Data Corporation. NOS is a trademark of Control Data Corporation.

  10. Statistical properties of dynamical systems – Simulation and abstract computation

    International Nuclear Information System (INIS)

    Galatolo, Stefano; Hoyrup, Mathieu; Rojas, Cristóbal

    2012-01-01

    Highlights: ► A survey on results about computation and computability on the statistical properties of dynamical systems. ► Computability and non-computability results for invariant measures. ► A short proof for the computability of the convergence speed of ergodic averages. ► A kind of “constructive” version of the pointwise ergodic theorem. - Abstract: We survey an area of recent development, relating dynamics to theoretical computer science. We discuss some aspects of the theoretical simulation and computation of the long term behavior of dynamical systems. We will focus on the statistical limiting behavior and invariant measures. We present a general method allowing the algorithmic approximation at any given accuracy of invariant measures. The method can be applied in many interesting cases, as we shall explain. On the other hand, we exhibit some examples where the algorithmic approximation of invariant measures is not possible. We also explain how it is possible to compute the speed of convergence of ergodic averages (when the system is known exactly) and how this entails the computation of arbitrarily good approximations of points of the space having typical statistical behaviour (a sort of constructive version of the pointwise ergodic theorem).
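
    A concrete miniature of the kind of computation surveyed above is the approximation of a space average by a Birkhoff (time) average along a single orbit. The sketch below uses the logistic map at parameter 4, whose invariant density is known in closed form, so the ergodic average of f(x) = x should approach 1/2; it illustrates the idea only and is not one of the constructive algorithms in the paper.

```python
# Birkhoff (ergodic) average along an orbit of the logistic map x -> 4x(1-x),
# whose invariant density 1/(pi*sqrt(x(1-x))) has mean 1/2.
def birkhoff_average(f, x0=0.1234, n=10**6):
    """Time average (1/n) * sum f(x_k) along the logistic-map orbit from x0."""
    x, total = x0, 0.0
    for _ in range(n):
        total += f(x)
        x = 4.0 * x * (1.0 - x)
    return total / n

# The pointwise ergodic theorem says the orbit average converges to the
# space average for typical initial points.
print(birkhoff_average(lambda x: x))   # approximately 0.5
```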

  11. CAMAC throughput of a new RISC-based data acquisition computer at the DIII-D tokamak

    International Nuclear Information System (INIS)

    VanderLaan, J.F.; Cummings, J.W.

    1993-10-01

    The amount of experimental data acquired per plasma discharge at DIII-D has continued to grow. The largest shot size in May 1991 was 49 Mbyte; in May 1992, 66 Mbyte; and in April 1993, 80 Mbyte. The increasing load has prompted the installation of a new Motorola 88100-based MODCOMP computer to supplement the existing core of three older MODCOMP data acquisition CPUs. New Kinetic Systems CAMAC serial highway driver hardware runs on the 88100 VME bus. The new operating system is the MODCOMP REAL/IX version of AT&T System V UNIX with real-time extensions and networking capabilities; future plans call for installation of additional computers of this type for tokamak and neutral beam control functions. Experiences with the CAMAC hardware and software will be chronicled, including observation of data throughput. The Enhanced Serial Highway crate controller is advertised as twice as fast as the previous crate controller, and computer I/O speeds are expected to also increase data rates.

  12. CAMAC throughput of a new RISC-based data acquisition computer at the DIII-D tokamak

    Science.gov (United States)

    Vanderlaan, J. F.; Cummings, J. W.

    1993-10-01

    The amount of experimental data acquired per plasma discharge at DIII-D has continued to grow. The largest shot size in May 1991 was 49 Mbyte; in May 1992, 66 Mbyte; and in April 1993, 80 Mbyte. The increasing load has prompted the installation of a new Motorola 88100-based MODCOMP computer to supplement the existing core of three older MODCOMP data acquisition CPU's. New Kinetic Systems CAMAC serial highway driver hardware runs on the 88100 VME bus. The new operating system is MODCOMP REAL/IX version of AT&T System V UNIX with real-time extensions and networking capabilities; future plans call for installation of additional computers of this type for tokamak and neutral beam control functions. Experiences with the CAMAC hardware and software will be chronicled, including observation of data throughput. The Enhanced Serial Highway crate controller is advertised as twice as fast as the previous crate controller, and computer I/O speeds are expected to also increase data rates.

  13. FORECAST: Regulatory effects cost analysis software manual -- Version 4.1. Revision 1

    International Nuclear Information System (INIS)

    Lopez, B.; Sciacca, F.W.

    1996-07-01

    The FORECAST program was developed to facilitate the preparation of the value-impact portion of NRC regulatory analyses. This PC program integrates the major cost and benefit considerations that may result from a proposed regulatory change. FORECAST automates much of the calculation typically needed in a regulatory analysis and thus reduces the time and labor required to perform these analyses. More importantly, its integrated and consistent treatment of the different value-impact considerations should help assure comprehensiveness, uniformity, and accuracy in the preparation of NRC regulatory analyses. The current FORECAST Version 4.1 has been upgraded from the previous version and now includes an uncertainty package, an automatic cost escalation package, and other improvements. In addition, it now explicitly addresses public health impacts, occupational health impacts, onsite property damage, and government costs. Thus, FORECAST Version 4.1 can treat all attributes normally quantified in a regulatory analysis.

  14. The National Energy Audit (NEAT) Engineering Manual (Version 6)

    Energy Technology Data Exchange (ETDEWEB)

    Gettings, M.B.

    2001-04-20

    Government-funded weatherization assistance programs resulted from increased oil prices caused by the 1973 oil embargo. These programs were instituted to reduce US consumption of oil and help low-income families afford the increasing cost of heating their homes. In the summer of 1988, Oak Ridge National Laboratory (ORNL) began providing technical support to the Department of Energy (DOE) Weatherization Assistance Program (WAP). A preliminary study found no suitable means of cost-effectively selecting energy efficiency improvements (measures) for single-family homes that incorporated all the factors seen as beneficial in improving cost-effectiveness and usability. In mid-1989, ORNL was authorized to begin development of a computer-based measure selection technique. In November of 1992 a draft version of the program was made available to all WAP state directors for testing. The first production release, Version 4.3, was made available in October 1993. The Department of Energy's Weatherization Assistance Program has continued funding improvements to the program, increasing its user-friendliness and applicability. Initial publication of this engineering manual coincides with the availability of Version 6.1, November 1997, though the algorithms described generally apply to all prior versions. Periodic updates of specific sections in the manual will permit maintaining a relevant document. This Engineering Manual delineates the assumptions used by NEAT in arriving at the measure recommendations based on the user's input of the building characteristics. Details of the actual data entry are available in the NEAT User's Manual (ORNL/Sub/91-SK078/1) and will not be discussed in this manual.

  15. Reheating breakfast: Age and multitasking on a computer-based and a non-computer-based task

    OpenAIRE

    Feinkohl, I.; Cress, U.; Kimmerle, J.

    2016-01-01

    Computer-based assessments are popular means to measure individual differences, including age differences, in cognitive ability, but are rarely tested for the extent to which they correspond to more realistic behavior. In the present study, we explored the extent to which performance on an existing computer-based task of multitasking ('cooking breakfast') may be generalizable by comparing it with a newly developed version of the same task that required interaction with physical objects. Twent...

  16. Development of the Brazilian version of the Child Hayling Test

    Directory of Open Access Journals (Sweden)

    Larissa de Souza Siqueira

    Abstract Introduction: The Hayling Test assesses the components of initiation, inhibition, cognitive flexibility and verbal speed by means of a sentence completion task. This study presents the process of developing the Brazilian version of the Child Hayling Test (CHT) and reports evidence of its content validity. Methods: 139 people took part in the study. The adaptation was performed by seven translators and 12 specialist judges. An initial sample of 92 healthy children was recruited to test a selection of sentences adapted from previous adult and pediatric versions of the instrument, and a sample of 28 healthy children was recruited for pilot testing of the final version. The instrument was developed in seven stages: 1) translation, 2) back-translation, 3) comparison of translated versions, 4) preparation of new stimuli, 5) data collection with healthy children to analyze comprehension of the stimuli and analyses by the authors against the psycholinguistic criteria adopted, 6) analyses conducted by judges who are specialists in neuropsychology or linguistics, and 7) the pilot study. Results: Twenty-four of the 72 sentences constructed were selected on the basis of 70-100% agreement between judges on what they assessed and their level of comprehensibility. The pilot study revealed better performance by older children, providing evidence of the instrument's sensitivity to developmental factors. Conclusions: Future studies employing this version of the CHT with clinical pediatric populations who have frontal lesions and dysfunctions and in related areas are needed to test functional and differential diagnoses of preserved or impaired executive functions.

  17. Trusted Network Interpretation of the Trusted Computer System Evaluation Criteria. Version 1.

    Science.gov (United States)

    1987-07-01


  18. High-Throughput Computational Assessment of Previously Synthesized Semiconductors for Photovoltaic and Photoelectrochemical Devices

    DEFF Research Database (Denmark)

    Kuhar, Korina; Pandey, Mohnish; Thygesen, Kristian Sommer

    2018-01-01

    Using computational screening we identify materials with potential use as light absorbers in photovoltaic or photoelectrochemical devices. The screening focuses on compounds of up to three different chemical elements which are abundant and nontoxic. A prescreening is carried out based on informat...

  19. Supporting students' learning in the domain of computer science

    Science.gov (United States)

    Gasparinatou, Alexandra; Grigoriadou, Maria

    2011-03-01

    Previous studies have shown that students with low knowledge understand and learn better from more cohesive texts, whereas high-knowledge students have been shown to learn better from texts of lower cohesion. This study examines whether high-knowledge readers in computer science benefit from a text of low cohesion. Undergraduate students (n = 65) read one of four versions of a text concerning Local Network Topologies, orthogonally varying local and global cohesion. Participants' comprehension was examined through free-recall measure, text-based, bridging-inference, elaborative-inference, problem-solving questions and a sorting task. The results indicated that high-knowledge readers benefited from the low-cohesion text. The interaction of text cohesion and knowledge was reliable for the sorting activity, for elaborative-inference and for problem-solving questions. Although high-knowledge readers performed better in text-based and in bridging-inference questions with the low-cohesion text, the interaction of text cohesion and knowledge was not reliable. The results suggest a more complex view of when and for whom textual cohesion affects comprehension and consequently learning in computer science.

  20. Upon Further Review: V. An Examination of Previous Lightcurve Analysis from the Palmer Divide Observatory

    Science.gov (United States)

    Warner, Brian D.

    2011-01-01

    Updated results are given for nine asteroids previously reported from the Palmer Divide Observatory (PDO). The original images were re-measured to obtain new data sets using the latest version of MPO Canopus photometry software, analysis tools, and revised techniques for linking multiple observing runs covering several days to several weeks. Results that were previously not reported or were moderately different were found for 1659 Punkajarju, 1719 Jens, 1987 Kaplan, 2105 Gudy, 2961 Katsurahama, 3285 Ruth Wolfe, 3447 Burckhalter, 7816 Hanoi, and (34817) 2000 SE116. This is one in a series of papers that will examine results obtained during the initial years of the asteroid lightcurve program at PDO.

  1. Safety analysis report for the TRUPACT-II shipping package (condensed version). Volume 1, Rev. 14

    International Nuclear Information System (INIS)

    1994-10-01

    The condensed version of the TRUPACT-II Contact Handled Transuranic Waste Safety Analysis Report for Packaging (SARP) contains essential material required by TRUPACT-II users, plus additional contents (payload) information previously submitted to the U.S. Nuclear Regulatory Commission. All or part of the following sections, which are not required by users of the TRUPACT-II, are deleted from the condensed version: (i) structural analysis, (ii) thermal analysis, (iii) containment analysis, (iv) criticality analysis, (v) shielding analysis, and (vi) hypothetical accident test results

  2. Safety analysis report for the TRUPACT-II shipping package (condensed version). Volume 1, Rev. 14

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-10-01

    The condensed version of the TRUPACT-II Contact Handled Transuranic Waste Safety Analysis Report for Packaging (SARP) contains essential material required by TRUPACT-II users, plus additional contents (payload) information previously submitted to the U.S. Nuclear Regulatory Commission. All or part of the following sections, which are not required by users of the TRUPACT-II, are deleted from the condensed version: (i) structural analysis, (ii) thermal analysis, (iii) containment analysis, (iv) criticality analysis, (v) shielding analysis, and (vi) hypothetical accident test results.

  3. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  4. Psychometric properties of the Finnish version of the Women's Health Questionnaire.

    Science.gov (United States)

    Katainen, Riina E; Engblom, Janne R; Vahlberg, Tero J; Polo-Kantola, Päivi

    2017-08-01

    The Women's Health Questionnaire (WHQ) is a validated and commonly used instrument for measuring climacteric-related symptoms. A revised version was previously developed. However, validation in a Finnish population is lacking. As it is important to use qualified instruments, we performed a validation study of the WHQ in a Finnish population. In all, 3,421 women, aged 41 to 54 years, formed the study population. In the original 36-item WHQ, the items were rated on a 1 to 4 scale and on a binary scale (0-1). The scaling of the revised 23-item WHQ was 0 to 100. We evaluated the psychometric properties (internal consistency, correlations between the symptom domains, factor structure, and sampling adequacy) of all three versions. For the 1 to 4 scale and for the revised version of the WHQ, the internal consistency was acceptable (Cronbach's α coefficients >0.70) for most of the domains. On the binary scale, the majority of the coefficient values were below the acceptable level. The original symptom domains, especially those of the revised version, were recognizable from the factors in the exploratory factor analysis, but there were some limitations. The Kaiser-Meyer-Olkin values were high. The WHQ is a valid instrument for measuring climacteric-related symptoms in Finnish middle-aged women. The psychometric properties of the revised 23-item WHQ were as good or even better than those of the original 36-item WHQ. Thus, we encourage use of the revised version.
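
    The internal-consistency figures quoted above are Cronbach's α coefficients, computed as k/(k-1) · (1 − Σ item variances / total-score variance) for k items. The sketch below applies this formula to made-up item scores on the 1 to 4 scale; it is illustrative only and is not based on the WHQ data.

```python
# Cronbach's alpha on hypothetical questionnaire item scores.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = questionnaire items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

# Hypothetical responses of 5 women to 4 items on the 1-4 scale
scores = [[1, 2, 2, 1],
          [3, 3, 4, 3],
          [2, 2, 3, 2],
          [4, 4, 4, 3],
          [1, 1, 2, 2]]
print(round(cronbach_alpha(scores), 2))
```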

  5. An improved version of the MICROX-2 code

    Energy Technology Data Exchange (ETDEWEB)

    Mathews, D. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-11-01

    The MICROX-2 code prepares broad group neutron cross sections for use in diffusion- and/or transport-theory codes from an input library of fine group and pointwise cross sections. The neutron weighting spectrum is obtained by solving the B₁ neutron balance equations at about 10000 energies in a one-dimensional (planar, spherical or cylindrical), two-region unit cell. The regions are coupled by collision probabilities based upon spatially flat neutron emission. Energy dependent Dancoff factors and bucklings correct the one-dimensional calculations for multi-dimensional lattice effects. A critical buckling search option is also included. The inner region may include two different types of fuel particles (grains). This report describes the present PSI FORTRAN 90 version of the MICROX-2 code which operates on CRAY computers and IBM PCs. The equations which are solved in the various energy ranges are given along with descriptions of various changes that have been made in the present PSI version of the code. A completely re-written description of the user input is also included. (author) 7 figs., 4 tabs., 59 refs.

  6. Particle and heavy ion transport code system, PHITS, version 2.52

    International Nuclear Information System (INIS)

    Sato, Tatsuhiko; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Noda, Shusaku; Ogawa, Tatsuhiko; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Niita, Koji; Iwase, Hiroshi; Chiba, Satoshi; Furuta, Takuya; Sihver, Lembit

    2013-01-01

    An upgraded version of the Particle and Heavy Ion Transport code System, PHITS2.52, was developed and released to the public. The new version has been greatly improved from the previously released version, PHITS2.24, in terms of not only the code itself but also the contents of its package, such as the attached data libraries. In the new version, a higher accuracy of simulation was achieved by implementing several latest nuclear reaction models. The reliability of the simulation was improved by modifying both the algorithms for the electron-, positron-, and photon-transport simulations and the procedure for calculating the statistical uncertainties of the tally results. Estimation of the time evolution of radioactivity became feasible by incorporating the activation calculation program DCHAIN-SP into the new package. The efficiency of the simulation was also improved as a result of the implementation of shared-memory parallelization and the optimization of several time-consuming algorithms. Furthermore, a number of new user-support tools and functions that help users to intuitively and effectively perform PHITS simulations were developed and incorporated. Due to these improvements, PHITS is now a more powerful tool for particle transport simulation applicable to various research and development fields, such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. (author)

  7. Effects of complex feedback on computer-assisted modular instruction

    NARCIS (Netherlands)

    Gordijn, Jan; Nijhof, W.J.

    2002-01-01

    The aim of this study is to determine the effects of two versions of Computer-Based Feedback within a prevocational system of modularized education in The Netherlands. The implementation and integration of Computer-Based Feedback (CBF) in Installation Technology modules in all schools (n=60) in The

  8. Programming for computations Python : a gentle introduction to numerical simulations with Python

    CERN Document Server

    Linge, Svein

    2016-01-01

    This book presents computer programming as a key method for solving mathematical problems. There are two versions of the book, one for MATLAB and one for Python. The book was inspired by the Springer book TCSE 6: A Primer on Scientific Programming with Python (by Langtangen), but the style is more accessible and concise, in keeping with the needs of engineering students. The book outlines the shortest possible path from no previous experience with programming to a set of skills that allows the students to write simple programs for solving common mathematical problems with numerical methods in engineering and science courses. The emphasis is on generic algorithms, clean design of programs, use of functions, and automatic tests for verification.
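
    In the spirit the abstract describes (a generic algorithm wrapped in a function, with an automatic test for verification), a minimal example might look like the sketch below. It is not taken from the book.

```python
# Forward Euler for u' = f(u, t), with an automatic verification test.
import numpy as np

def forward_euler(f, u0, T, n):
    """Solve u' = f(u, t), u(0) = u0, on [0, T] with n forward Euler steps."""
    t = np.linspace(0.0, T, n + 1)
    u = np.zeros(n + 1)
    u[0] = u0
    dt = T / n
    for k in range(n):
        u[k + 1] = u[k] + dt * f(u[k], t[k])
    return u, t

def test_forward_euler_constant_rhs():
    # Euler reproduces the exact solution of u' = const exactly
    u, t = forward_euler(lambda u, t: 3.0, u0=1.0, T=2.0, n=10)
    assert np.allclose(u, 1.0 + 3.0 * t)

test_forward_euler_constant_rhs()
print(forward_euler(lambda u, t: -u, u0=1.0, T=1.0, n=1000)[0][-1])  # approx exp(-1)
```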

  9. Citham a computer code for calculating fuel depletion-description, tests, modifications and evaluation

    International Nuclear Information System (INIS)

    Alvarenga, M.A.B.

    1984-12-01

    The CITHAM computer code was developed at IPEN (Instituto de Pesquisas Energeticas e Nucleares) to link the HAMMER computer code with a fuel depletion routine and to provide neutron cross sections to be read in the appropriate format by the CITATION code. The problem arose from the effort to adapt the new version, named HAMMER-TECHION, to the aforementioned routine. The HAMMER-TECHION computer code was developed by the Haifa Institute, Israel, within a project with EPRI. This version is at CNEN to be used for multigroup constant generation for neutron diffusion calculations within the scope of the new methodology to be adopted by CNEN. The theoretical formulation of the CITHAM computer code, tests and modifications are described. (Author) [pt

  10. High Performance Computing - Power Application Programming Interface Specification Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Levenhagen, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Olivier, Stephen Lecler [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ward, H. Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  11. AERO2S - SUBSONIC AERODYNAMIC ANALYSIS OF WINGS WITH LEADING- AND TRAILING-EDGE FLAPS IN COMBINATION WITH CANARD OR HORIZONTAL TAIL SURFACES (IBM PC VERSION)

    Science.gov (United States)

    Carlson, H. W.

    1994-01-01

    necessary only to add an identification record and the namelist data that are to be changed from the previous run. This code was originally developed in 1989 in FORTRAN V on a CDC 6000 computer system, and was later ported to an MS-DOS environment. Both versions are available from COSMIC. There are only a few differences between the PC version (LAR-14458) and CDC version (LAR-14178) of AERO2S distributed by COSMIC. The CDC version has one main source code file while the PC version has two files which are easier to edit and compile on a PC. The PC version does not require a FORTRAN compiler which supports NAMELIST because a special INPUT subroutine has been added. The CDC version includes two MODIFY decks which can be used to improve the code and prevent the possibility of some infrequently occurring errors while PC-version users will have to make these code changes manually. The PC version includes an executable which was generated with the Ryan McFarland/FORTRAN compiler and requires 253K RAM and an 80x87 math co-processor. Using this executable, the sample case requires about four hours to execute on an 8MHz AT-class microcomputer with a co-processor. The source code conforms to the FORTRAN 77 standard except that it uses variables longer than six characters. With two minor modifications, the PC version should be portable to any computer with a FORTRAN compiler and sufficient memory. The CDC version of AERO2S is available in CDC NOS Internal format on a 9-track 1600 BPI magnetic tape. The PC version is available on a set of two 5.25 inch 360K MS-DOS format diskettes. IBM AT is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. CDC is a registered trademark of Control Data Corporation. NOS is a trademark of Control Data Corporation.

  12. Important notice for Windows 2000 Service Pack 3 computers

    CERN Multimedia

    The NICE Team

    2005-01-01

    Microsoft is ending support for Windows 2000 Service Pack 3, which was introduced in 2002. As a consequence, computers running Windows 2000 Service Pack 3 (or older versions1)) must be updated. It is recommended that Windows 2000 computers be re-installed with Windows XP Service Pack 2 (see http://cern.ch/Win/Services/Installation/Diane). If this is not possible for compatibility reasons, Windows 2000 Service Pack 4 must be installed to ensure the computers continue to receive security patches (see http://cern.ch/Win/Docs/2000SP4). In the next few days, NICE 2000 computers requiring an update will receive a pop-up window with instructions. Users requiring help with the update can contact Helpdesk@cern.ch or call 78888. If your computer needs to be updated you are recommended to read the additional information available at http://cern.ch/Win/Docs/2000SP3. The NICE Team 1) To determine your Windows service pack version, use the ‘Start' button and select ‘Run'. In the new window that open...

  13. Important notice for Windows 2000 Service Pack 3 computers

    CERN Multimedia

    The NICE Team

    2005-01-01

    Microsoft is ending support for Windows 2000 Service Pack 3, which was introduced in 2002. As a consequence, computers running Windows 2000 Service Pack 3 (or older versions1) ) must be updated. It is recommended that Windows 2000 computers be re-installed with Windows XP Service Pack 2 (see http://cern.ch/Win/Services/Installation/Diane). If this is not possible for compatibility reasons, Windows 2000 Service Pack 4 must be installed to ensure the computers continue to receive security patches (see http://cern.ch/Win/Docs/2000SP4). In the next few days, NICE 2000 computers requiring an update will receive a pop-up window with instructions. Users requiring help with the update can contact Helpdesk@cern.ch or call 78888. If your computer needs to be updated you are recommended to read the additional information available at http://cern.ch/Win/Docs/2000SP3. The NICE Team 1) To determine your Windows service pack version, use the ‘Start' button and select ‘Run'. In the new window that opens, type ‘wi...

  14. Versioning Complex Data

    Energy Technology Data Exchange (ETDEWEB)

    Macduff, Matt C.; Lee, Benno; Beus, Sherman J.

    2014-06-29

    Using the history of ARM data files, we designed and demonstrated a data versioning paradigm that is feasible. Assigning versions to sets of files that are modified with some special assumptions and domain specific rules was effective in the case of ARM data, which has more than 5000 datastreams and 500TB of data.

  15. FBR metallic materials test manual (English version)

    International Nuclear Information System (INIS)

    Odaka, Susumu; Kato, Shoichi; Yoshida, Eiichi

    2003-06-01

    For the development of the fast breeder reactor, this manual describes the methods for in-air and in-sodium material tests and the method of organizing the data. The previous manual has been revised in accordance with the revision of the Japanese Industrial Standards (JIS) and the conversion to international (SI) units. The test methods of domestic committees and of bodies such as the VAMAS (Versailles Project on Advanced Materials and Standards) workshop were also referred to. The material test technologies accumulated by this group until now were also incorporated. This English version was prepared in order to provide more engineers with the FBR metallic materials test manual. (author)

  16. 1984 CERN school of computing

    International Nuclear Information System (INIS)

    1985-01-01

    The eighth CERN School of Computing covered subjects mainly related to computing for elementary-particle physics. These proceedings contain written versions of most of the lectures delivered at the School. Notes on the following topics are included: trigger and data-acquisition plans for the LEP experiments; unfolding methods in high-energy physics experiments; Monte Carlo techniques; relational data bases; data networks and open systems; the Newcastle connection; portable operating systems; expert systems; microprocessors - from basic chips to complete systems; algorithms for parallel computers; trends in supercomputers and computational physics; supercomputing and related national projects in Japan; application of VLSI in high-energy physics, and single-user systems. See hints under the relevant topics. (orig./HSI)

  17. Cross-Cultural Adaptation and Psychometric Testing of the Brazilian Version of the Self-Care of Heart Failure Index Version 6.2

    Science.gov (United States)

    Ávila, Christiane Wahast; Riegel, Barbara; Pokorski, Simoni Chiarelli; Camey, Suzi; Silveira, Luana Claudia Jacoby; Rabelo-Silva, Eneida Rejane

    2013-01-01

    Objective. To adapt and evaluate the psychometric properties of the Brazilian version of the SCHFI v 6.2. Methods. With the approval of the original author, we conducted a complete cross-cultural adaptation of the instrument (translation, synthesis, back translation, synthesis of back translation, expert committee review, and pretesting). The adapted version was named Brazilian version of the self-care of heart failure index v 6.2. The psychometric properties assessed were face validity and content validity (by expert committee review), construct validity (convergent validity and confirmatory factor analysis), and reliability. Results. Face validity and content validity were indicative of semantic, idiomatic, experimental, and conceptual equivalence. Convergent validity was demonstrated by a significant though moderate correlation (r = −0.51) on comparison with equivalent question scores of the previously validated Brazilian European heart failure self-care behavior scale. Confirmatory factor analysis supported the original three-factor model as having the best fit, although similar results were obtained for inadequate fit indices. The reliability of the instrument, as expressed by Cronbach's alpha, was 0.40, 0.82, and 0.93 for the self-care maintenance, self-care management, and self-care confidence scales, respectively. Conclusion. The SCHFI v 6.2 was successfully adapted for use in Brazil. Nevertheless, further studies should be carried out to improve its psychometric properties. PMID:24163765

  18. MCNP(trademark) Version 5

    International Nuclear Information System (INIS)

    Cox, Lawrence J.; Barrett, Richard F.; Booth, Thomas Edward; Briesmeister, Judith F.; Brown, Forrest B.; Bull, Jeffrey S.; Giesler, Gregg Carl; Goorley, John T.; Mosteller, Russell D.; Forster, R. Arthur; Post, Susan E.; Prael, Richard E.; Selcow, Elizabeth Carol; Sood, Avneet

    2002-01-01

    The Monte Carlo transport workhorse, MCNP, is undergoing a massive renovation at Los Alamos National Laboratory (LANL) in support of the Eolus Project of the Advanced Simulation and Computing (ASCI) Program. MCNP Version 5 (V5) (expected to be released to RSICC in Spring, 2002) will consist of a major restructuring from FORTRAN-77 (with extensions) to ANSI-standard FORTRAN-90 with support for all of the features available in the present release (MCNP-4C2/4C3). To most users, the look-and-feel of MCNP will not change much except for the improvements (improved graphics, easier installation, better online documentation). For example, even with the major format change, full support for incremental patching will still be provided. In addition to the language and style updates, MCNP V5 will have various new user features. These include improved photon physics, neutral particle radiography, enhancements and additions to variance reduction methods, new source options, and improved parallelism support (PVM, MPI, OpenMP).

  19. THE NASA AMES PAH IR SPECTROSCOPIC DATABASE VERSION 2.00: UPDATED CONTENT, WEB SITE, AND ON(OFF)LINE TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Boersma, C.; Mattioda, A. L.; Allamandola, L. J. [NASA Ames Research Center, MS 245-6, Moffett Field, CA 94035 (United States); Bauschlicher, C. W. Jr.; Ricca, A. [NASA Ames Research Center, MS 230-3, Moffett Field, CA 94035 (United States); Cami, J.; Peeters, E.; De Armas, F. Sánchez; Saborido, G. Puerta [SETI Institute, 189 Bernardo Avenue 100, Mountain View, CA 94043 (United States); Hudgins, D. M., E-mail: Christiaan.Boersma@nasa.gov [NASA Headquarters, MS 3Y28, 300 E St. SW, Washington, DC 20546 (United States)

    2014-03-01

    A significantly updated version of the NASA Ames PAH IR Spectroscopic Database, the first major revision since its release in 2010, is presented. The current version, version 2.00, contains 700 computational and 75 experimental spectra compared, respectively, with 583 and 60 in the initial release. The spectra span the 2.5-4000 μm (4000-2.5 cm⁻¹) range. New tools are available on the site that allow one to analyze spectra in the database and compare them with imported astronomical spectra as well as a suite of IDL object classes (a collection of programs utilizing IDL's object-oriented programming capabilities) that permit offline analysis called the AmesPAHdbIDLSuite. Most noteworthy among the additions are the extension of the computational spectroscopic database to include a number of significantly larger polycyclic aromatic hydrocarbons (PAHs), the ability to visualize the molecular atomic motions corresponding to each vibrational mode, and a new tool that allows one to perform a non-negative least-squares fit of an imported astronomical spectrum with PAH spectra in the computational database. Finally, a methodology is described in the Appendix, and implemented using the AmesPAHdbIDLSuite, that allows the user to enforce charge balance during the fitting procedure.
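
    The fitting tool described above is a non-negative least-squares decomposition of an observed spectrum into PAH template spectra. The sketch below shows the same idea with scipy.optimize.nnls on random stand-in templates; it is illustrative only and does not use the actual PAHdb spectra or the AmesPAHdbIDLSuite.

```python
# Non-negative least-squares decomposition of a spectrum into template spectra
# (random stand-ins, not real PAH spectra).
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_points, n_templates = 200, 5
templates = rng.random((n_points, n_templates))     # columns = template spectra
true_weights = np.array([0.0, 2.0, 0.0, 1.0, 0.5])
observed = templates @ true_weights + 0.01 * rng.standard_normal(n_points)

weights, residual_norm = nnls(templates, observed)  # weights constrained >= 0
print(weights)          # close to true_weights
print(residual_norm)
```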

  20. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    Science.gov (United States)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

    A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  1. Inverse kinetics equations for on line measurement of reactivity using personal computer

    International Nuclear Information System (INIS)

    Ratemi, Wajdi; El Gadamsi, Walied; Beleid, Abdul Kariem

    1993-01-01

    Computers, with their astonishing speed of calculation and their easy connection to real systems, are very appropriate for digital measurement of real system variables. In the nuclear industry, such computer applications will produce compact control rooms for real power plants, where information and results can be displayed at the push of a button. In our study, we use two personal computers for the purposes of simulation and measurement. One of them is used as a digital simulator of a real reactor, where we effectively simulate the reactor power through a cross-talk network. The computed power is passed, at a chosen sampling time, to the other computer. The purpose of the other computer is to use the inverse kinetics equations to calculate the reactivity parameter based on the received power, and then to display the power curve and the reactivity curve on line using color graphics. In this study, we use the one-group version of the inverse kinetics algorithm, which can easily be extended to a larger-group version. The programming language used is Turbo BASIC, which is very comparable, in terms of efficiency, to FORTRAN, besides its effective graphics routines. With the use of the extended version of the inverse kinetics algorithm, we can effectively apply this measurement technique for on-line display of the reactivity of the Tajoura Research Reactor. (author)
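
    The one-group inverse kinetics calculation described above can be sketched in a few lines: the delayed-neutron precursor equation is integrated from the sampled power history, and the reactivity follows from the inverted point-kinetics equation. The kinetics constants and the power trace below are assumed illustrative values, not Tajoura Research Reactor data, and this is not the authors' Turbo BASIC program:

    ```python
    # Minimal one-group inverse point-kinetics sketch: recover reactivity (here
    # reported in dollars, i.e. units of beta) from a sampled power trace.
    # beta, Lambda and lambda_d are assumed illustrative kinetics constants.
    import numpy as np

    beta = 0.0065          # total delayed-neutron fraction (assumed)
    Lambda = 1.0e-4        # prompt neutron generation time, s (assumed)
    lambda_d = 0.08        # one-group precursor decay constant, 1/s (assumed)

    dt = 0.01                                   # sampling interval, s
    t = np.arange(0.0, 20.0, dt)
    power = np.where(t < 5.0, 1.0, np.exp(0.02 * (t - 5.0)))   # synthetic power trace

    # Precursor concentration from dC/dt = (beta/Lambda) n - lambda_d C,
    # integrated with an implicit Euler step; start at equilibrium.
    C = np.empty_like(power)
    C[0] = beta * power[0] / (Lambda * lambda_d)
    for k in range(1, power.size):
        C[k] = (C[k - 1] + dt * beta * power[k] / Lambda) / (1.0 + dt * lambda_d)

    # Inverse kinetics: rho = beta + (Lambda/n) dn/dt - lambda_d * Lambda * C / n
    dndt = np.gradient(power, dt)
    rho = beta + Lambda * dndt / power - lambda_d * Lambda * C / power
    print("reactivity at t = 10 s: %.3f $" % (rho[t.searchsorted(10.0)] / beta))
    ```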

  2. Distributed MRI reconstruction using Gadgetron-based cloud computing.

    Science.gov (United States)

    Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S

    2015-03-01

    To expand the open source Gadgetron reconstruction framework to support distributed computing and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and ℓ1-SPIRiT reconstruction of nine high temporal resolution real-time, cardiac short axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm³ isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed computing enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.
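
    As a toy illustration of the scatter/gather pattern that such a multinode setup relies on (plain Python multiprocessing on one machine, not the Gadgetron cloud components), per-slice reconstructions can be distributed to a worker pool and reassembled:

    ```python
    # Toy scatter/gather reconstruction: farm per-slice inverse FFTs out to a
    # worker pool and reassemble the volume. This only stands in for the general
    # pattern of distributing reconstruction jobs; it is not Gadgetron code.
    import numpy as np
    from multiprocessing import Pool

    def reconstruct_slice(kspace_slice):
        """Trivial 'reconstruction': centered 2-D inverse FFT, magnitude image."""
        img = np.fft.ifft2(np.fft.ifftshift(kspace_slice))
        return np.abs(np.fft.fftshift(img))

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        kspace = rng.normal(size=(16, 128, 128)) + 1j * rng.normal(size=(16, 128, 128))
        with Pool(processes=4) as pool:
            slices = pool.map(reconstruct_slice, list(kspace))   # scatter, then gather
        volume = np.stack(slices)
        print(volume.shape)       # (16, 128, 128)
    ```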

  3. HALE UAS Concept of Operations. Version 3.0

    Science.gov (United States)

    2006-01-01

    This document is a system level Concept of Operations (CONOPS) from the perspective of future High Altitude Long Endurance (HALE) Unmanned Aircraft Systems (UAS) service providers and National Airspace System (NAS) users. It describes current systems (existing UAS), describes HALE UAS functions and operations to be performed (via sample missions), and offers insight into the user's environment (i.e., the UAS as a system of systems). It is intended to be a source document for NAS UAS operational requirements, and provides a construct for government agencies to use in guiding their regulatory decisions, architecture requirements, and investment strategies. Although it does not describe the technical capabilities of a specific HALE UAS system (which do, and will vary widely), it is intended to aid in requirements capture and to be used as input to the functional requirements and analysis process. The document provides a basis for development of functional requirements and operational guidelines to achieve unrestricted access into the NAS. This document is an FY06 update to the FY05 Access 5 Project-approved Concept of Operations document previously published in the Public Domain on the Access 5 open website. This version is recommended to be approved for public release also. The updates are a reorganization of materials from the previous version with the addition of an updated set of operational requirements, inclusion of sample mission scenarios, and identification of roles and responsibilities of interfaces within flight phases.

  4. The Unified Extensional Versioning Model

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred; Christensen, H. B.

    1999-01-01

    Versioning of components in a system is a well-researched field where various adequate techniques have already been established. In this paper, we look at how versioning can be extended to cover also the structural aspects of a system. There exist two basic techniques for versioning - intentional...

  5. ORACLS- OPTIMAL REGULATOR ALGORITHMS FOR THE CONTROL OF LINEAR SYSTEMS (CDC VERSION)

    Science.gov (United States)

    Armstrong, E. S.

    1994-01-01

    computations: the eigenvalues and eigenvectors of real matrices; the relative stability of a given matrix; matrix factorization; the solution of linear constant coefficient vector-matrix algebraic equations; the controllability properties of a linear time-invariant system; the steady-state covariance matrix of an open-loop stable system forced by white noise; and the transient response of continuous linear time-invariant systems. The control law design routines of ORACLS implement some of the more common techniques of time-invariant LQG methodology. For the finite-duration optimal linear regulator problem with noise-free measurements, continuous dynamics, and integral performance index, a routine is provided which implements the negative exponential method for finding both the transient and steady-state solutions to the matrix Riccati equation. For the discrete version of this problem, the method of backwards differencing is applied to find the solutions to the discrete Riccati equation. A routine is also included to solve the steady-state Riccati equation by the Newton algorithms described by Klein, for continuous problems, and by Hewer, for discrete problems. Another routine calculates the prefilter gain to eliminate control state cross-product terms in the quadratic performance index and the weighting matrices for the sampled data optimal linear regulator problem. For cases with measurement noise, duality theory and optimal regulator algorithms are used to calculate solutions to the continuous and discrete Kalman-Bucy filter problems. Finally, routines are included to implement the continuous and discrete forms of the explicit (model-in-the-system) and implicit (model-in-the-performance-index) model following theory. These routines generate linear control laws which cause the output of a dynamic time-invariant system to track the output of a prescribed model. In order to apply ORACLS, the user must write an executive (driver) program which inputs the problem coefficients
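
    As a point of reference for the steady-state regulator routines just listed, the same computation can be done today in a few lines with SciPy: solve the continuous algebraic Riccati equation and form the optimal state-feedback gain. The double-integrator plant and the weighting matrices below are assumed for illustration; this is not ORACLS itself:

    ```python
    # Steady-state LQ regulator for an illustrative double integrator:
    # minimize the integral of x'Qx + u'Ru subject to xdot = Ax + Bu.
    # Solve the continuous algebraic Riccati equation, then K = R^-1 B' P.
    import numpy as np
    from scipy.linalg import solve_continuous_are

    A = np.array([[0.0, 1.0],
                  [0.0, 0.0]])      # double-integrator dynamics (assumed example)
    B = np.array([[0.0],
                  [1.0]])
    Q = np.diag([1.0, 0.1])         # state weighting (assumed)
    R = np.array([[0.5]])           # control weighting (assumed)

    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)             # optimal state-feedback gain
    closed_loop_eigs = np.linalg.eigvals(A - B @ K)
    print("gain K:", np.round(K, 3))
    print("closed-loop eigenvalues:", np.round(closed_loop_eigs, 3))
    ```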

  6. Cross-cultural adaptation and validation of the Danish consensus version of the 10-item Perceived Stress Scale

    DEFF Research Database (Denmark)

    Eskildsen, Anita; Dalgaard, Vita Ligaya; Nielsen, Kent Jacob

    2015-01-01

    OBJECTIVES: The aims of the present study were to (i) cross-culturally adapt a Danish consensus version of the 10-item Perceived Stress Scale (PSS-10) and (ii) evaluate its psychometric properties in terms of agreement, reliability, validity, responsiveness, and interpretability among patients with work-related stress complaints. METHODS: A consensus-building process was performed involving the authors of the three previous Danish translations and the consensus version was back-translated into English and pilot-tested. Psychometric properties of the final version were examined in a sample of 64... patients with work-related stress complaints. RESULTS: The face validity, reliability, and internal consistency of the Danish consensus version of the PSS-10 were satisfactory, and convergent construct validity was confirmed. Receiver operating characteristic (ROC) curves of the change scores showed...

  7. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  8. Infant behaviour questionnaire - revised version: a psychometric study in a Portuguese sample.

    Science.gov (United States)

    Costa, Raquel; Figueiredo, Bárbara

    2018-04-01

    Although the original characteristics of temperament tend to remain constant over the course of development, environmental circumstances may influence infants' reactions and behaviour. Parents' reports of infant temperament are rich informants of infant behaviours in different contexts. This study aimed to examine the psychometric properties of the Portuguese version of the Infant Behaviour Questionnaire - Revised (IBQ-R) and test the adequacy of the original and other previously published structures to the Portuguese data. 330 mothers and 81 fathers of children aged between 3 and 12 months completed the Portuguese version of the IBQ-R. The confirmatory factorial analysis revealed a non-adequate model fit of the IBQ-R original structure to the Portuguese data; nonetheless, it did reveal an adequate model fit of a previous published IBQ-R structure. This structure, although only slightly different from the original one, seems to be more suitable for the Portuguese data. This study provides data that indicates that the IBQ-R is a reliable questionnaire to evaluate infant temperament in the Portuguese culture.

  9. SCELib3.0: The new revision of SCELib, the parallel computational library of molecular properties in the Single Center Approach

    Science.gov (United States)

    Sanna, N.; Baccarelli, I.; Morelli, G.

    2009-12-01

    SCELib is a computer program which implements the Single Center Expansion (SCE) method to describe molecular electronic densities and the interaction potentials between a charged projectile (electron or positron) and a target molecular system. The first version (CPC Catalog identifier ADMG_v1_0) was submitted to the CPC Program Library in 2000, and version 2.0 (ADMG_v2_0) was submitted in 2004. We here announce the new release 3.0, which presents additional features with respect to the previous versions, aimed at significantly enhancing its capabilities to deal with larger molecular systems. SCELib 3.0 allows for ab initio effective core potential (ECP) calculations of the molecular wavefunctions to be used in the SCE method, in addition to the standard all-electron description of the molecule. The list of supported architectures has been updated, the code has been ported to platforms based on accelerating coprocessors such as NVIDIA GPGPUs, and the new parallel model adopted is able to run efficiently on a mixed many-core computing system. Program summary: Program title: SCELib3.0; Catalogue identifier: ADMG_v3_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADMG_v3_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 2 018 862; No. of bytes in distributed program, including test data, etc.: 4 955 014; Distribution format: tar.gz; Programming language: C; Compilers used: xlc V8.x, Intel C V10.x, Portland Group V7.x, nvcc V2.x; Computer: All SMP platforms based on AIX, Linux and SUNOS operating systems over SPARC, POWER, Intel Itanium2, X86, em64t and Opteron processors; Operating system: SUNOS, IBM AIX, Linux RedHat (Enterprise), Linux SuSE (SLES); Has the code been vectorized or parallelized?: Yes, 1 to 32 processors (CPU or GPU) used; RAM: Up to 32 GB depending on the molecular

  10. Solution of the Skyrme-Hartree-Fock-Bogolyubov equations in the Cartesian deformed harmonic-oscillator basis. (VII) HFODD (v2.49t): A new version of the program

    International Nuclear Information System (INIS)

    Schunck, Nicolas F.; McDonnell, J.; Sheikh, J.A.; Staszczak, A.; Stoitsov, Mario; Dobaczewski, J.; Toivanen, P.

    2012-01-01

    We describe the new version (v2.49t) of the code HFODD which solves the nuclear Skyrme Hartree-Fock (HF) or Skyrme Hartree-Fock-Bogolyubov (HFB) problem by using the Cartesian deformed harmonic-oscillator basis. In the new version, we have implemented the following physics features: (i) the isospin mixing and projection, (ii) the finite temperature formalism for the HFB and HF+BCS methods, (iii) the Lipkin translational energy correction method, (iv) the calculation of the shell correction. A number of specific numerical methods have also been implemented in order to deal with large-scale multi-constraint calculations and hardware limitations: (i) the two-basis method for the HFB method, (ii) the Augmented Lagrangian Method (ALM) for multi-constraint calculations, (iii) the linear constraint method based on the approximation of the RPA matrix for multi-constraint calculations, (iv) an interface with the axial and parity-conserving Skyrme-HFB code HFBTHO, (v) the mixing of the HF or HFB matrix elements instead of the HF fields. Special care has been paid to using the code on massively parallel leadership class computers. For this purpose, the following features are now available with this version: (i) the Message Passing Interface (MPI) framework, (ii) scalable input data routines, (iii) multi-threading via OpenMP pragmas, (iv) parallel diagonalization of the HFB matrix in the simplex breaking case using the ScaLAPACK library. Finally, several minor errors in the previously published version were corrected.

  11. Next Generation Computer Resources: Reference Model for Project Support Environments (Version 2.0)

    National Research Council Canada - National Science Library

    Brown, Alan

    1993-01-01

    The objective of the Next Generation Computer Resources (NGCR) program is to restructure the Navy's approach to acquisition of standard computing resources to take better advantage of commercial advances and investments...

  12. Computer virus information update CIAC-2301

    Energy Technology Data Exchange (ETDEWEB)

    Orvis, W.J.

    1994-01-15

    While CIAC periodically issues bulletins about specific computer viruses, these bulletins do not cover all the computer viruses that affect desktop computers. The purpose of this document is to identify most of the known viruses for the MS-DOS and Macintosh platforms and give an overview of the effects of each virus. The authors also include information on some Windows, Atari, and Amiga viruses. This document is revised periodically as new virus information becomes available. This document replaces all earlier versions of the CIAC Computer Virus Information Update. The date on the front cover indicates the date on which the information in this document was extracted from CIAC's Virus database.

  13. Validation of a new radiographic measurement of acetabular version: the transverse axis distance (TAD).

    Science.gov (United States)

    Nitschke, Ashley; Lambert, Jeffery R; Glueck, Deborah H; Jesse, Mary Kristen; Mei-Dan, Omer; Strickland, Colin; Petersen, Brian

    2015-11-01

    This study has three aims: (1) validate a new radiographic measure of acetabular version, the transverse axis distance (TAD), by showing equivalent TAD accuracy in predicting CT equatorial acetabular version when compared to a previously validated, but more cumbersome, radiographic measure, the p/a ratio; (2) establish predictive equations of CT acetabular version from TAD; (3) calculate a sensitive and specific cut point for predicting excessive CT acetabular anteversion using TAD. A 14-month retrospective review was performed of patients who had undergone a dedicated MSK CT pelvis study and who also had a technically adequate AP pelvis radiograph. Two trained observers measured the radiographic p/a ratio, TAD, and CT acetabular equatorial version for 110 hips on a PACS workstation. Mixed model analysis was used to find prediction equations, and ROC analysis was used to evaluate the diagnostic accuracy of p/a ratio and TAD. CT equatorial acetabular version can accurately be predicted from either p/a ratio (p < 0.001) or TAD (p < 0.001). The diagnostic accuracies of p/a ratio and TAD are comparable (p = 0.46). Patients whose TAD is higher than 17 mm may have excessive acetabular anteversion. For that cutpoint, the sensitivity of TAD is 0.73, with specificity of 0.82. TAD is an accurate radiographic predictor of CT acetabular anteversion and provides an easy-to-use and intuitive point-of-care assessment of acetabular version in patients with hip pain.

  14. Tracking code patterns over multiple software versions with Herodotos

    DEFF Research Database (Denmark)

    Palix, Nicolas Jean-Michel; Lawall, Julia; Muller, Gilles

    2010-01-01

    An important element of understanding a software code base is to identify the repetitive patterns of code it contains and how these evolve over time. Some patterns are useful to the software, and may be modularized. Others are detrimental to the software, such as patterns that represent defects...... pattern occurrences over multiple versions of a software project, independent of other changes in the source files. Guided by a user-provided configuration file, Herodotos builds various graphs showing the evolution of the pattern occurrences and computes some statistics. We have evaluated this approach...
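
    As a rough flavor of what tracking a pattern across versions involves, the sketch below simply counts occurrences of a regular expression in several source trees; Herodotos goes much further, correlating each individual occurrence across versions according to its configuration file. The directory names and the pattern are hypothetical:

    ```python
    # Count occurrences of a code pattern in several versions of a source tree.
    # Directory names and the pattern are hypothetical; Herodotos additionally
    # tracks each individual occurrence across versions, which this sketch does not.
    import re
    from pathlib import Path

    PATTERN = re.compile(r"\bkmalloc\s*\(")                          # example pattern (assumed)
    VERSION_DIRS = ["project-1.0", "project-1.1", "project-2.0"]     # hypothetical checkouts

    def count_occurrences(root):
        total = 0
        for path in Path(root).rglob("*.c"):
            try:
                total += len(PATTERN.findall(path.read_text(errors="ignore")))
            except OSError:
                continue
        return total

    for version in VERSION_DIRS:
        if Path(version).is_dir():
            print(f"{version}: {count_occurrences(version)} occurrences")
    ```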

  15. 1987 CERN school of computing

    International Nuclear Information System (INIS)

    Verkerk, C.

    1988-01-01

    These Proceedings contain written versions of most of the lectures delivered at the 1987 CERN School of Computing. Five lecture series treated various aspects of data communications: integrated services networks, standard LANs and optical LANs, open systems networking in practice, and distributed operating systems. Present and future computer architectures were covered and an introduction to vector processing was given, followed by lectures on vectorization of pattern recognition and Monte Carlo code. Aspects of computing in high-energy physics were treated in lectures on data acquisition and analysis at LEP, on data-base systems in high-energy physics experiments, and on Fastbus. The experience gained with personal work stations was also presented. Various other topics were covered: the use of computers in number theory and in astronomy, fractals, and computer security and access control. (orig.)

  16. High Performance Computing - Power Application Programming Interface Specification Version 1.4

    Energy Technology Data Exchange (ETDEWEB)

    Laros III, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); DeBonis, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Levenhagen, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Olivier, Stephen Lecler [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  17. Fuzzy Versions of Epistemic and Deontic Logic

    Science.gov (United States)

    Gounder, Ramasamy S.; Esterline, Albert C.

    1998-01-01

    Epistemic and deontic logics are modal logics, respectively, of knowledge and of the normative concepts of obligation, permission, and prohibition. Epistemic logic is useful in formalizing systems of communicating processes and knowledge and belief in AI (Artificial Intelligence). Deontic logic is useful in computer science wherever we must distinguish between actual and ideal behavior, as in fault tolerance and database integrity constraints. We here discuss fuzzy versions of these logics. In the crisp versions, various axioms correspond to various properties of the structures used in defining the semantics of the logics. Thus, any axiomatic theory will be characterized not only by its axioms but also by the set of properties holding of the corresponding semantic structures. Fuzzy logic does not proceed with axiomatic systems, but fuzzy versions of the semantic properties exist and can be shown to correspond to some of the axioms for the crisp systems in special ways that support dependency networks among assertions in a modal domain. This in turn allows one to implement truth maintenance systems. We then turn to the technical development of epistemic logic, and then to that of deontic logic. To our knowledge, we are the first to address fuzzy epistemic and fuzzy deontic logic explicitly and to consider the different systems and semantic properties available. We give the syntax and semantics of epistemic logic and discuss the correspondence between axioms of epistemic logic and properties of semantic structures. The same topics are covered for deontic logic. For the fuzzy versions of epistemic and deontic logic, we discuss the relationship between axioms and semantic properties. Our results can be exploited in truth maintenance systems.

  18. External cephalic version for breech presentation at term

    International Nuclear Information System (INIS)

    Rauf, B.; Hassan, L.

    2007-01-01

    To assess the success rate of external cephalic version (ECV) at term and its effects on measures of pregnancy outcome. A total of 40 patients were offered ECV over a period of fourteen months. All singleton breech presentations with an otherwise normal antenatal course between 36-41 weeks of gestation were included in the study. Exclusion criteria included contraindications to ECV, i.e., multiple pregnancy, oligohydramnios, growth retardation, antepartum hemorrhage, rupture of membranes, toxemias of pregnancy, non-reassuring fetal monitoring pattern, previous uterine scar, bad obstetric history, any contraindication to vaginal delivery, labour, and patient wishes after thorough counseling. The overall success rate of the procedure and its effect on maternal and fetal outcome were determined. The significance of the results was determined using the chi-square test. A total of 40 patients were recruited for the trial. The overall success rate was 67.5%, with only 30% being primigravidae (p<0.05). Multigravidae showed a higher success rate of 80%. Following successful ECV, spontaneous vaginal delivery was attained in 77.7% (n=21), while caesarean section was performed for various indications in 6 cases (p<0.05). Following failed version, 61.5% (n=8) had elective C/S and only 5 delivered vaginally. Route of delivery did not affect the perinatal outcome except for congenital abnormalities. Following successful ECV, there was only one stillbirth. The overall live birth rate associated with successful version was 96.2% (p<0.05), while in failed versions there were no fetal deaths. ECV at term appears to be a useful procedure to reduce the number and associated complications of term breech presentations. It is safe for the mother and the fetus and helps to avoid a significant number of caesarean sections. (author)

  19. Exact computation of the 9-j symbols

    International Nuclear Information System (INIS)

    Lai Shantao; Chiu Jingnan

    1992-01-01

    A useful algebraic formula for the 9-j symbol has been rewritten for convenient use on a computer. A simple FORTRAN program for the exact computation of 9-j symbols has been written for the VAX with VMS version V5.4-1 according to this formula. The results agree with the approximate values in the existing literature. Some specific values of 9-j symbols needed for the intensity and alignments of three-photon nonresonant transitions are tabulated. Approximate 9-j symbol values beyond the limitation of the computer can also be computed by this program. The computer codes for the exact computation of 3-j, 6-j, and 9-j symbols are available through electronic mail upon request. (orig.)
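
    For comparison with tabulated values, exact 9-j symbols can nowadays also be evaluated symbolically in a few lines, for example with SymPy; this is independent of the FORTRAN program described above:

    ```python
    # Exact evaluation of 9-j symbols with SymPy's angular-momentum routines.
    from sympy import Rational
    from sympy.physics.wigner import wigner_9j

    # Integer arguments
    print(wigner_9j(1, 1, 1, 1, 1, 1, 1, 1, 0))
    # Half-integer arguments are passed as exact rationals
    print(wigner_9j(Rational(1, 2), Rational(1, 2), 1,
                    Rational(1, 2), Rational(1, 2), 1, 1, 1, 0))
    ```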

  20. Calculations of reactor-accident consequences, Version 2. CRAC2: computer code user's guide

    International Nuclear Information System (INIS)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    1983-02-01

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems
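
    The atmospheric dispersion building block that codes of this family rely on can be illustrated with a bare-bones Gaussian plume estimate. The power-law dispersion coefficients below are rough Pasquill class-D values assumed purely for illustration; CRAC2's treatment adds plume rise, deposition, and weather sequence sampling on top of this kind of kernel:

    ```python
    # Bare-bones ground-level, centerline Gaussian plume concentration.
    # sigma_y / sigma_z fits below are approximate rural class-D values (assumed);
    # this is a textbook illustration, not the CRAC2 dispersion model.
    import numpy as np

    def plume_concentration(Q, u, x, H):
        """Q: release rate, u: wind speed (m/s), x: downwind distance (m), H: release height (m)."""
        sigma_y = 0.08 * x / np.sqrt(1.0 + 0.0001 * x)
        sigma_z = 0.06 * x / np.sqrt(1.0 + 0.0015 * x)
        return (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2.0 * sigma_z**2))

    for x in (500.0, 1000.0, 5000.0):
        chi_over_q = plume_concentration(1.0, 5.0, x, 50.0)   # unit release rate
        print(f"x = {x:6.0f} m  chi/Q = {chi_over_q:.3e} s/m^3")
    ```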

  1. CALIPSO lidar calibration at 532 nm: version 4 nighttime algorithm

    Science.gov (United States)

    Kar, Jayanta; Vaughan, Mark A.; Lee, Kam-Pui; Tackett, Jason L.; Avery, Melody A.; Garnier, Anne; Getzewich, Brian J.; Hunt, William H.; Josset, Damien; Liu, Zhaoyan; Lucker, Patricia L.; Magill, Brian; Omar, Ali H.; Pelon, Jacques; Rogers, Raymond R.; Toth, Travis D.; Trepte, Charles R.; Vernier, Jean-Paul; Winker, David M.; Young, Stuart A.

    2018-03-01

    Data products from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) on board Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) were recently updated following the implementation of new (version 4) calibration algorithms for all of the Level 1 attenuated backscatter measurements. In this work we present the motivation for and the implementation of the version 4 nighttime 532 nm parallel channel calibration. The nighttime 532 nm calibration is the most fundamental calibration of CALIOP data, since all of CALIOP's other radiometric calibration procedures - i.e., the 532 nm daytime calibration and the 1064 nm calibrations during both nighttime and daytime - depend either directly or indirectly on the 532 nm nighttime calibration. The accuracy of the 532 nm nighttime calibration has been significantly improved by raising the molecular normalization altitude from 30-34 km to the upper possible signal acquisition range of 36-39 km to substantially reduce stratospheric aerosol contamination. Due to the greatly reduced molecular number density and consequently reduced signal-to-noise ratio (SNR) at these higher altitudes, the signal is now averaged over a larger number of samples using data from multiple adjacent granules. Additionally, an enhanced strategy for filtering the radiation-induced noise from high-energy particles was adopted. Further, the meteorological model used in the earlier versions has been replaced by the improved Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), model. An aerosol scattering ratio of 1.01 ± 0.01 is now explicitly used for the calibration altitude. These modifications lead to globally revised calibration coefficients which are, on average, 2-3 % lower than in previous data releases. Further, the new calibration procedure is shown to eliminate biases at high altitudes that were present in earlier versions and consequently leads to an improved representation of
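
    The molecular normalization at the heart of that procedure amounts to averaging the ratio of the measured, range-corrected signal to a modeled molecular backscatter profile over the chosen high-altitude region, then correcting for the assumed aerosol scattering ratio. The schematic sketch below uses synthetic profiles, not CALIOP data or the actual version 4 code:

    ```python
    # Schematic molecular-normalization calibration: average the ratio of a
    # range-corrected lidar signal to a modeled molecular backscatter profile over
    # the 36-39 km normalization region. All profiles below are synthetic.
    import numpy as np

    altitude_km = np.linspace(0.0, 40.0, 801)
    beta_mol = 1.5e-3 * np.exp(-altitude_km / 8.0)      # modeled molecular profile (a.u.)
    true_C = 2.7e4                                      # "true" calibration constant
    rng = np.random.default_rng(3)
    signal = true_C * beta_mol * (1.0 + 0.05 * rng.normal(size=altitude_km.size))

    norm_region = (altitude_km >= 36.0) & (altitude_km <= 39.0)
    C_estimate = np.mean(signal[norm_region] / beta_mol[norm_region])
    C_estimate /= 1.01    # assumed aerosol scattering ratio at the calibration altitude
    print(f"estimated calibration constant: {C_estimate:.3e} (true {true_C:.3e})")
    ```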

  2. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hendrickson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  3. Risk factors for cesarean section and instrumental vaginal delivery after successful external cephalic version

    NARCIS (Netherlands)

    de Hundt, Marcella; Vlemmix, Floortje; Bais, Joke M. J.; de Groot, Christianne J.; Mol, Ben Willem; Kok, Marjolein

    2016-01-01

    The aim of this article is to examine whether we could identify factors that predict cesarean section and instrumental vaginal delivery in women who had a successful external cephalic version. We used data from a previous randomized trial among 25 hospitals and their referring midwife practices in the

  4. A comparison of the standard and the computerized versions of the Well-being Questionnaire (WBQ) and the Diabetes Treatment Satisfaction Questionnaire (DTSQ)

    DEFF Research Database (Denmark)

    Pouwer, F; Snoek, Frank J; Van Der Ploeg, Henk M

    1998-01-01

    In the present study, the equivalence of paper and pencil assessment versus computer assessment of two self-administered questionnaires was investigated by means of a randomized cross-over design. Therefore, 105 out-patients with diabetes were invited to participate; 76 patients completed both...... the computer and the paper and pencil version of the Well-being Questionnaire (WBQ) and the Diabetes Treatment Satisfaction Questionnaire (DTSQ) in a randomized order, with a mean interval of 7 days. The scales showed high test-retest correlations and the means, dispersions, kurtosis and skewness were found...... of a questionnaire was easy. It is concluded that the paper and pencil and the computerized versions of the WBQ and DTSQ can be considered equivalent. Therefore, the norms and cut-off scores obtained from paper and pencil assessments can be used in computerized versions of the WBQ and DTSQ and vice versa....

  5. Augmentation of Teaching Tools: Outsourcing the HSD Computing for SPSS Application

    Science.gov (United States)

    Wang, Jianjun

    2010-01-01

    The widely-used Tukey's HSD index is not produced in the current version of SPSS (i.e., PASW Statistics, version 18), and a computer program named "HSD Calculator" has been chosen to amend this problem. In comparison to hand calculation, this program application does not require table checking, which eliminates potential concern on the size of a…
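
    For readers who want the same post hoc comparison outside SPSS, the Tukey HSD test is also available in Python's statsmodels. This is an alternative route with synthetic data, not the "HSD Calculator" program the article discusses:

    ```python
    # Tukey HSD post hoc comparisons for a one-way layout, via statsmodels.
    # The three groups of scores below are synthetic illustration data.
    import numpy as np
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(7)
    scores = np.concatenate([rng.normal(70, 8, 20),
                             rng.normal(75, 8, 20),
                             rng.normal(82, 8, 20)])
    groups = np.repeat(["method_A", "method_B", "method_C"], 20)

    result = pairwise_tukeyhsd(endog=scores, groups=groups, alpha=0.05)
    print(result.summary())
    ```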

  6. Reliability and Validity Testing of a Danish Translated Version of Spinal Appearance Questionnaire (SAQ) v 1.1

    DEFF Research Database (Denmark)

    Simony, Ane; Carreon, Leah Y; Hansen, Karen Højmark

    2016-01-01

    Study Design Cross-sectional. Objective To develop a psychometrically reliable and valid Danish version of the Spinal Appearance Questionnaire (SAQ). Summary of Background Data The SAQ was developed as a disease-specific measure of quality of life in patients with adolescent idiopathic scoliosis... (AIS), specifically for younger patients, as it has more visual cues than verbal questions. A reliable and valid Danish version is not available. Methods A Danish version of the SAQ was developed using previously published and widely accepted guidelines. The final Danish SAQ and the Danish SRS22-R were... effect for SAQ Expectations. There was good to excellent internal consistency within each domain. Conclusion The purpose of this study was to translate and validate a Danish version of the SAQ. Although problems were identified with items 7 and 8, the Danish SAQ is reliable and valid....

  7. PR-EDB: Power Reactor Embrittlement Data Base, version 1: Program description

    International Nuclear Information System (INIS)

    Stallmann, F.W.; Kam, F.B.K.; Taylor, B.J.

    1990-06-01

    Data concerning radiation embrittlement of pressure vessel steels in commercial power reactors have been collected from available surveillance reports. The purpose of this NRC-sponsored program is to provide the technical bases for voluntary consensus standards, regulatory guides, standard review plans, and codes. The data can also be used for the exploration and verification of embrittlement prediction models. The data files are given in dBASE 3 Plus format and can be accessed with any personal computer using the DOS operating system. Menu-driven software is provided for easy access to the data, including curve fitting and plotting facilities. This software has drastically reduced the time and effort for data processing and evaluation compared to previous data bases. The current compilation of the Power Reactor Embrittlement Data Base (PR-EDB, version 1) contains results from surveillance capsule reports of 78 reactors with 381 data points from 110 different irradiated base materials (plates and forgings) and 161 data points from 79 different welds. Results from heat-affected-zone materials are also listed. The Electric Power Research Institute (EPRI), reactor vendors, and utilities are in the process of providing back-up quality assurance checks of the PR-EDB and will be supplementing the data base with additional data and documentation. 2 figs., 28 tabs

  8. A comparison of reliability and construct validity between the original and revised versions of the Rosenberg Self-Esteem Scale.

    Science.gov (United States)

    Wongpakaran, Tinakon; Wongpakaran, Nahathai

    2012-03-01

    The Rosenberg Self-Esteem Scale (RSES) is a widely used instrument that has been tested for reliability and validity in many settings; however, some negatively worded items appear to have caused it to show low reliability in a number of studies. In this study, we revised the one negative item that had, in previous studies, produced the worst outcome in terms of the structure of the scale, then re-analyzed the new version for its reliability and construct validity, comparing it to the original version with respect to fit indices. In total, 851 students from Chiang Mai University (mean age: 19.51±1.7; 57% female) participated in this study. Of these, 664 students completed the Thai version of the original RSES, containing five positively worded and five negatively worded items, while 187 students used the revised version containing six positively worded and four negatively worded items. Confirmatory factor analysis was applied, using a uni-dimensional model with method effects and a correlated uniqueness approach. The revised version showed the same level of reliability (good) as the original, but yielded a better model fit. The revised RSES demonstrated excellent fit statistics, with χ²=29.19 (df=19, n=187, p=0.063), GFI=0.970, TFI=0.969, NFI=0.964, CFI=0.987, SRMR=0.040 and RMSEA=0.054. The revised version of the Thai RSES demonstrated an equivalent level of reliability but better construct validity when compared to the original.

  9. Procedure guideline for thyroid scintigraphy (version 3)

    International Nuclear Information System (INIS)

    Dietlein, M.; Schicha, H.; Eschner, W.; Deutsche Gesellschaft fuer Medizinische Physik; Koeln Univ.; Leisner, B.; Allgemeines Krankenhaus St. Georg, Hamburg; Reiners, C.; Wuerzburg Univ.

    2007-01-01

    Version 3 of the procedure guideline for thyroid scintigraphy is an update of the procedure guideline previously published in 2003. The interpretation of the scintigraphy requires knowledge of the patient's history, the palpation of the neck, the laboratory parameters, and the sonography. The interpretation of the technetium-99m uptake requires knowledge of the TSH level. As a consequence of the improved alimentary iodine supply, the 99mTc uptake has decreased; 100 000 counts per scintigraphy should be acquired. For this, an imaging time of 10 minutes is generally needed using a high resolution collimator for thyroid imaging. (orig.)

  10. Students' Mathematics Word Problem-Solving Achievement in a Computer-Based Story

    Science.gov (United States)

    Gunbas, N.

    2015-01-01

    The purpose of this study was to investigate the effect of a computer-based story, which was designed in an anchored instruction framework, on sixth-grade students' mathematics word problem-solving achievement. Problems were embedded in a story presented on a computer as a computer-based story, and then compared with the paper-based version of the same story…

  11. The algebraic manipulation program DIRAC on IBM personal computers

    International Nuclear Information System (INIS)

    Grozin, A.G.; Perlt, H.

    1989-01-01

    The version DIRAC (2.2) for IBM compatible personal computers is described. It is designed for the algebraic manipulation of polynomials and tensors. After a short introduction concerning implementation and usage on personal computers, an example program is given. It contains a detailed user's guide to DIRAC (2.2) and, additionally, some useful applications. 4 refs

  12. Psychometric assessment of the Spiritual Climate Scale Arabic version for nurses in Saudi Arabia.

    Science.gov (United States)

    Cruz, Jonas Preposi; Albaqawi, Hamdan Mohammad; Alharbi, Sami Melbes; Alicante, Jerico G; Vitorino, Luciano M; Abunab, Hamzeh Y

    2017-12-07

    To assess the psychometric properties of the Spiritual Climate Scale Arabic version for Saudi nurses. Evidence showed that a high level of spiritual climate in the workplace is associated with increased productivity and performance, enhanced emotional intelligence, organisational commitment and job satisfaction among nurses. A convenience sample of 165 Saudi nurses was surveyed in this descriptive, cross-sectional study. Cronbach's α and the intraclass correlation coefficient of the 2 week test-retest scores were computed to establish reliability. Exploratory factor analysis was performed to support the validity of the Spiritual Climate Scale Arabic version. The Spiritual Climate Scale Arabic version manifested excellent content validity. Exploratory factor analysis supported a single factor with an explained variance of 73.2%. The Cronbach's α values of the scale ranged from .79 to .88, while the intraclass correlation coefficient value was .90. The perceived spiritual climate was associated with the respondents' hospital, gender, age and years of experience. Findings of this study support the sound psychometric properties of the Spiritual Climate Scale Arabic version. The Spiritual Climate Scale Arabic version can be used by nurse managers to assess the nurses' perception of the spiritual climate in any clinical area. This process can lead to spiritually centred interventions, thereby ensuring a clinical climate that accepts and respects different spiritual beliefs and practices. © 2017 John Wiley & Sons Ltd.

  13. 2MASS Catalog Server Kit Version 2.1

    Science.gov (United States)

    Yamauchi, C.

    2013-10-01

    The 2MASS Catalog Server Kit is open source software for use in easily constructing a high performance search server for important astronomical catalogs. This software utilizes the open source RDBMS PostgreSQL; therefore, any user can set up the database on a local computer by following the step-by-step installation guide. The kit provides highly optimized stored functions for positional searches similar to those of the SDSS SkyServer. Together with these, the powerful SQL environment of PostgreSQL will meet various users' demands. We released 2MASS Catalog Server Kit version 2.1 in 2012 May, which supports the latest WISE All-Sky catalog (563,921,584 rows) and 9 major all-sky catalogs. Local databases are often indispensable for observatories with unstable or narrow-band networks or severe use, such as retrieving large numbers of records within a small period of time. This software is the best for such purposes, and the increased number of supported catalogs and the improvements in version 2.1 can cover a wider range of applications, including advanced calibration systems, scientific studies using complicated SQL queries, etc. Official page: http://www.ir.isas.jaxa.jp/~cyamauch/2masskit/
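
    Querying such a locally installed catalog server from a script looks roughly like the sketch below. The connection parameters and the cone-search stored-function name are placeholders for illustration; consult the kit's installation guide for the actual function names it installs:

    ```python
    # Sketch of a positional (cone) search against a locally installed catalog
    # database. Connection settings and the stored-function name 'fGetNearbyObjEq'
    # are placeholders, not necessarily what the kit creates.
    import psycopg2

    conn = psycopg2.connect(host="localhost", dbname="twomass", user="catalog")
    with conn, conn.cursor() as cur:
        ra, dec, radius_arcmin = 280.25, -15.6, 2.0
        cur.execute(
            "SELECT * FROM fGetNearbyObjEq(%s, %s, %s);",   # hypothetical stored function
            (ra, dec, radius_arcmin),
        )
        for row in cur.fetchmany(10):
            print(row)
    conn.close()
    ```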

  14. Verification of RESRAD-RDD. (Version 2.01)

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Flood, Paul E. [Argonne National Lab. (ANL), Argonne, IL (United States); LePoire, David [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-01

    In this report, the results generated by RESRAD-RDD version 2.01 are compared with those produced by RESRAD-RDD version 1.7 for different scenarios with different sets of input parameters. RESRAD-RDD version 1.7 is spreadsheet-driven, performing calculations with Microsoft Excel spreadsheets. RESRAD-RDD version 2.01 revamped version 1.7 by using command-driven programs designed with Visual Basic.NET to direct calculations with data saved in Microsoft Access database, and re-facing the graphical user interface (GUI) to provide more flexibility and choices in guideline derivation. Because version 1.7 and version 2.01 perform the same calculations, the comparison of their results serves as verification of both versions. The verification covered calculation results for 11 radionuclides included in both versions: Am-241, Cf-252, Cm-244, Co-60, Cs-137, Ir-192, Po-210, Pu-238, Pu-239, Ra-226, and Sr-90. At first, all nuclidespecific data used in both versions were compared to ensure that they are identical. Then generic operational guidelines and measurement-based radiation doses or stay times associated with a specific operational guideline group were calculated with both versions using different sets of input parameters, and the results obtained with the same set of input parameters were compared. A total of 12 sets of input parameters were used for the verification, and the comparison was performed for each operational guideline group, from A to G, sequentially. The verification shows that RESRAD-RDD version 1.7 and RESRAD-RDD version 2.01 generate almost identical results; the slight differences could be attributed to differences in numerical precision with Microsoft Excel and Visual Basic.NET. RESRAD-RDD version 2.01 allows the selection of different units for use in reporting calculation results. The results of SI units were obtained and compared with the base results (in traditional units) used for comparison with version 1.7. The comparison shows that RESRAD

  15. HANFORD TANK WASTE OPERATIONS SIMULATOR VERSION DESCRIPTION DOCUMENT

    International Nuclear Information System (INIS)

    ALLEN, G.K.

    2003-01-01

    This document describes the software version controls established for the Hanford Tank Waste Operations Simulator (HTWOS). It defines: the methods employed to control the configuration of HTWOS; the version of each of the 26 separate modules for the version 1.0 of HTWOS; the numbering rules for incrementing the version number of each module; and a requirement to include module version numbers in each case results documentation. Version 1.0 of HTWOS is the first version under formal software version control. HTWOS contains separate revision numbers for each of its 26 modules. Individual module version numbers do not reflect the major release HTWOS configured version number

  16. ORACLS- OPTIMAL REGULATOR ALGORITHMS FOR THE CONTROL OF LINEAR SYSTEMS (DEC VAX VERSION)

    Science.gov (United States)

    Frisch, H.

    1994-01-01

    computations: the eigenvalues and eigenvectors of real matrices; the relative stability of a given matrix; matrix factorization; the solution of linear constant coefficient vector-matrix algebraic equations; the controllability properties of a linear time-invariant system; the steady-state covariance matrix of an open-loop stable system forced by white noise; and the transient response of continuous linear time-invariant systems. The control law design routines of ORACLS implement some of the more common techniques of time-invariant LQG methodology. For the finite-duration optimal linear regulator problem with noise-free measurements, continuous dynamics, and integral performance index, a routine is provided which implements the negative exponential method for finding both the transient and steady-state solutions to the matrix Riccati equation. For the discrete version of this problem, the method of backwards differencing is applied to find the solutions to the discrete Riccati equation. A routine is also included to solve the steady-state Riccati equation by the Newton algorithms described by Klein, for continuous problems, and by Hewer, for discrete problems. Another routine calculates the prefilter gain to eliminate control state cross-product terms in the quadratic performance index and the weighting matrices for the sampled data optimal linear regulator problem. For cases with measurement noise, duality theory and optimal regulator algorithms are used to calculate solutions to the continuous and discrete Kalman-Bucy filter problems. Finally, routines are included to implement the continuous and discrete forms of the explicit (model-in-the-system) and implicit (model-in-the-performance-index) model following theory. These routines generate linear control laws which cause the output of a dynamic time-invariant system to track the output of a prescribed model. In order to apply ORACLS, the user must write an executive (driver) program which inputs the problem coefficients

  17. Previous medical history of diseases in children with attention deficit hyperactivity disorder and their parents

    Directory of Open Access Journals (Sweden)

    Ayyoub Malek

    2014-02-01

    Introduction: The etiology of attention deficit hyperactivity disorder (ADHD) is complex and most likely includes genetic and environmental factors. This study was conducted to evaluate the role of previous medical history of diseases in ADHD children and their parents during the earlier years of the ADHD children's lives. Methods: In this case-control study, 164 ADHD children attending the Child and Adolescent Psychiatric Clinics of Tabriz University of Medical Sciences, Iran, were compared with 166 normal children selected by a random-cluster method from primary and guidance schools. The ADHD rating scale (parents' version) and a clinical interview based on the Schedule for Affective Disorders and Schizophrenia for School-Age Children-Present and Lifetime Version (K-SADS) were used to diagnose ADHD cases and to select the control group. The two groups were compared for the existence of a previous medical history of diseases in children and parents. Fisher's exact test and a logistic regression model were used for data analysis. Results: The frequency of maternal history of medical disorders (28.7% vs. 12.0%; P = 0.001) was significantly higher in children with ADHD compared with the control group. The frequencies of jaundice, dysentery, epilepsy, asthma, allergy, and head trauma in the medical history of the children did not differ significantly between the two groups. Conclusion: According to this preliminary study, it may be concluded that maternal history of medical disorders is one of the contributing risk factors for ADHD.

  18. Programming for computations MATLAB/Octave : a gentle introduction to numerical simulations with MATLAB/Octave

    CERN Document Server

    Linge, Svein

    2016-01-01

    This book presents computer programming as a key method for solving mathematical problems. There are two versions of the book, one for MATLAB and one for Python. The book was inspired by the Springer book TCSE 6: A Primer on Scientific Programming with Python (by Langtangen), but the style is more accessible and concise, in keeping with the needs of engineering students. The book outlines the shortest possible path from no previous experience with programming to a set of skills that allows the students to write simple programs for solving common mathematical problems with numerical methods in engineering and science courses. The emphasis is on generic algorithms, clean design of programs, use of functions, and automatic tests for verification.

  19. Cloud Quantum Computing of an Atomic Nucleus

    Science.gov (United States)

    Dumitrescu, E. F.; McCaskey, A. J.; Hagen, G.; Jansen, G. R.; Morris, T. D.; Papenbrock, T.; Pooser, R. C.; Dean, D. J.; Lougovski, P.

    2018-05-01

    We report a quantum simulation of the deuteron binding energy on quantum processors accessed via cloud servers. We use a Hamiltonian from pionless effective field theory at leading order. We design a low-depth version of the unitary coupled-cluster ansatz, use the variational quantum eigensolver algorithm, and compute the binding energy to within a few percent. Our work is the first step towards scalable nuclear structure computations on a quantum processor via the cloud, and it sheds light on how to map scientific computing applications onto nascent quantum devices.
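
    Stripped of the quantum hardware, the variational step in such a calculation is simply the minimization of the expectation value of a small Hamiltonian over a parameterized trial state. The classical toy sketch below uses an arbitrary illustrative 2x2 Hermitian matrix, not the paper's pionless-EFT deuteron Hamiltonian, and runs on an ordinary computer rather than a cloud quantum processor:

    ```python
    # Classical toy version of the variational step: minimize <psi(theta)|H|psi(theta)>
    # over a one-parameter trial state. H is an arbitrary illustrative matrix.
    import numpy as np
    from scipy.optimize import minimize_scalar

    H = np.array([[ 0.2, -0.9],
                  [-0.9,  1.5]])

    def energy(theta):
        psi = np.array([np.cos(theta / 2.0), np.sin(theta / 2.0)])   # normalized ansatz
        return psi @ H @ psi

    result = minimize_scalar(energy, bounds=(0.0, 2.0 * np.pi), method="bounded")
    exact = np.linalg.eigvalsh(H)[0]
    print(f"variational minimum: {result.fun:.6f}   exact ground state: {exact:.6f}")
    ```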

  20. Initial water quantification results using neutron computed tomography

    Science.gov (United States)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
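
    In its simplest form, the quantification step reduces to inverting the exponential attenuation law for the water path length along each ray. The sketch below uses an assumed, rough thermal-neutron attenuation coefficient for water and synthetic counts; it is not the calibration used in the study:

    ```python
    # Schematic water quantification from neutron transmission: invert the
    # Beer-Lambert law, t = -ln(I/I0) / mu, for water thickness along each ray.
    # mu_water is a rough effective thermal-neutron value (assumed).
    import numpy as np

    mu_water = 3.5          # 1/cm, approximate effective attenuation coefficient (assumed)

    I0 = np.array([1000.0, 1000.0, 1000.0])        # open-beam counts per pixel
    I  = np.array([ 990.0,  700.0,  350.0])        # counts with the sample in place

    thickness_cm = -np.log(I / I0) / mu_water
    volume_per_pixel = thickness_cm * (0.02 ** 2)  # assumed 200-micron square pixels, cm^3
    print(np.round(thickness_cm, 3), "cm of water along each ray")
    print(np.round(volume_per_pixel * 1e3, 4), "microliters per pixel")
    ```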

  1. Traduire Flaubert : Madame Bovary en version roumaine

    Directory of Open Access Journals (Sweden)

    Florica Courriol

    2012-02-01

    This paper is the result of a translational analysis aimed at re-evaluating a Romanian version of Madame Bovary from the 1970s (Doamna Bovary, translated by Demostene Botez) through concrete work on the text with master's students. As numerous anomalies were uncovered in this way, I was led to retranslate this great classic. It seemed useful to Flaubert scholarship to set out the difficulties and their resolutions, whether purely linguistic or cultural in nature, both at the level of reading (comprehension of the text) and at that of rendering the text in the target language. The choice of contentious points is not limited to discourse alone but also touches on the problem of proper names (can they be translated? must one resort to the famous translator's notes?) and, as a logical consequence, on that of the title. Concrete examples support a translation practice and relate it to criticism, literary history, and Flaubert's creative work as a whole. I intend to address in this paper the concrete problems I encountered when translating into Romanian FLAUBERT's major work, Madame Bovary. First of all, its nature of retranslation has to be specified, since an existing version, published in the 1970s by a Romanian poet (Demostene Botez), was in fact in circulation when I started working on the original text. This previous translation was probably the result of a collective operation, as the split rhythm suggests at several points. In the meantime, other publishers saw fit to print additional versions in great haste. However, a mere examination of the problematic passages is sufficient to realize that the authors of the new versions merely limited themselves to taking up the first one. They improved the text here and there, "updated" some expressions, while keeping the glaring errors of Botez's version.

  2. Validation of a new radiographic measurement of acetabular version: the transverse axis distance (TAD)

    Energy Technology Data Exchange (ETDEWEB)

    Nitschke, Ashley [University of Colorado School of Medicine, University of Colorado Denver, Division of Musculoskeletal Radiology, Department of Radiology, Aurora, CO (United States); Lambert, Jeffery R. [University of Colorado, Department of Biostatistics and Informatics, Colorado School of Public Health, Aurora, CO (United States); Glueck, Deborah H. [University of Colorado, Department of Biostatistics and Informatics, Colorado School of Public Health, Aurora, CO (United States); University of Colorado School of Medicine, University of Colorado Denver, Department of Radiology, Aurora, CO (United States); Jesse, Mary Kristen; Strickland, Colin [University of Colorado School of Medicine, University of Colorado Denver, Division of Musculoskeletal Radiology, Department of Radiology and Orthopaedics, Aurora, CO (United States); Mei-Dan, Omer [University of Colorado School of Medicine, University of Colorado Denver, Division of Sports Medicine and Hip Preservation, Department of Orthopaedics, Aurora, CO (United States); Petersen, Brian [University of Colorado School of Medicine, University of Colorado Denver, Division of Musculoskeletal Radiology, Department of Radiology and Orthopaedics, Aurora, CO (United States); Inland Imaging, Division of Musculoskeletal Radiology, Spokane, WA (United States)

    2015-11-15

    This study has three aims: (1) validate a new radiographic measure of acetabular version, the transverse axis distance (TAD), by showing equivalent TAD accuracy in predicting CT equatorial acetabular version when compared to a previously validated, but more cumbersome, radiographic measure, the p/a ratio; (2) establish predictive equations of CT acetabular version from TAD; (3) calculate a sensitive and specific cut point for predicting excessive CT acetabular anteversion using TAD. A 14-month retrospective review was performed of patients who had undergone a dedicated MSK CT pelvis study and who also had a technically adequate AP pelvis radiograph. Two trained observers measured the radiographic p/a ratio, TAD, and CT acetabular equatorial version for 110 hips on a PACS workstation. Mixed model analysis was used to find prediction equations, and ROC analysis was used to evaluate the diagnostic accuracy of p/a ratio and TAD. CT equatorial acetabular version can accurately be predicted from either p/a ratio (p < 0.001) or TAD (p < 0.001). The diagnostic accuracies of p/a ratio and TAD are comparable (p = 0.46). Patients whose TAD is higher than 17 mm may have excessive acetabular anteversion. For that cut point, the sensitivity of TAD is 0.73, with specificity of 0.82. TAD is an accurate radiographic predictor of CT acetabular anteversion and provides an easy-to-use and intuitive point-of-care assessment of acetabular version in patients with hip pain. (orig.)
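
    As a concrete illustration of the cut-point evaluation described above, the short Python sketch below computes sensitivity and specificity for a candidate TAD threshold. The 17 mm threshold comes from the abstract, but the data arrays are invented for illustration and are not the study's 110 hips.

        import numpy as np

        def sensitivity_specificity(tad_mm, excessive_anteversion, threshold=17.0):
            """Classify hips as 'excessive anteversion' when TAD exceeds the threshold
            and compare against CT-based reference labels."""
            predicted = np.asarray(tad_mm) > threshold
            actual = np.asarray(excessive_anteversion, dtype=bool)
            tp = np.sum(predicted & actual)
            fn = np.sum(~predicted & actual)
            tn = np.sum(~predicted & ~actual)
            fp = np.sum(predicted & ~actual)
            return tp / (tp + fn), tn / (tn + fp)

        # Illustrative data only (not the hips measured in the study).
        tad = [12.0, 18.5, 21.0, 9.5, 16.0, 19.0]
        labels = [False, True, True, False, False, True]
        sens, spec = sensitivity_specificity(tad, labels)
        print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")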

  3. Validation of a new radiographic measurement of acetabular version: the transverse axis distance (TAD)

    International Nuclear Information System (INIS)

    Nitschke, Ashley; Lambert, Jeffery R.; Glueck, Deborah H.; Jesse, Mary Kristen; Strickland, Colin; Mei-Dan, Omer; Petersen, Brian

    2015-01-01

    This study has three aims: (1) validate a new radiographic measure of acetabular version, the transverse axis distance (TAD), by showing equivalent TAD accuracy in predicting CT equatorial acetabular version when compared to a previously validated, but more cumbersome, radiographic measure, the p/a ratio; (2) establish predictive equations of CT acetabular version from TAD; (3) calculate a sensitive and specific cut point for predicting excessive CT acetabular anteversion using TAD. A 14-month retrospective review was performed of patients who had undergone a dedicated MSK CT pelvis study and who also had a technically adequate AP pelvis radiograph. Two trained observers measured the radiographic p/a ratio, TAD, and CT acetabular equatorial version for 110 hips on a PACS workstation. Mixed model analysis was used to find prediction equations, and ROC analysis was used to evaluate the diagnostic accuracy of p/a ratio and TAD. CT equatorial acetabular version can accurately be predicted from either p/a ratio (p < 0.001) or TAD (p < 0.001). The diagnostic accuracies of p/a ratio and TAD are comparable (p = 0.46). Patients whose TAD is higher than 17 mm may have excessive acetabular anteversion. For that cut point, the sensitivity of TAD is 0.73, with specificity of 0.82. TAD is an accurate radiographic predictor of CT acetabular anteversion and provides an easy-to-use and intuitive point-of-care assessment of acetabular version in patients with hip pain. (orig.)

  4. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    International Nuclear Information System (INIS)

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  5. The ALICE Magnetic System Computation.

    CERN Document Server

    Klempt, W; CERN. Geneva; Swoboda, Detlef

    1995-01-01

    In this note we present the first results from the ALICE magnetic system computation, performed in three dimensions with the Vector Fields TOSCA code (version 6.5) [1]. For the calculations we used the IBM RISC System 6000-370 and 6000-550 machines combined in the CERN PaRC UNIX cluster.

  6. Modeling CANDU type fuel behaviour during extended burnup irradiations using a revised version of the ELESIM code

    International Nuclear Information System (INIS)

    Arimescu, V.I.; Richmond, W.R.

    1992-05-01

    The high-burnup database for CANDU fuel, with a variety of cases, offers a good opportunity to check models of fuel behaviour and to identify areas for improvement. Good agreement of calculated values of fission-gas release and sheath hoop strain with experimental data indicates that the global behaviour of the fuel element is adequately simulated by a computer code. Using the ELESIM computer code, the fission-gas release, swelling, and fuel pellet expansion models were analysed, and changes were made to the models for gaseous swelling and for diffusional release of fission-gas atoms to the grain boundaries. Using this revised version of ELESIM, satisfactory agreement between measured and calculated values of fission-gas release was found for most of the high-burnup database cases. It is concluded that the revised version of the ELESIM code is able to simulate high-burnup as well as low-burnup CANDU fuel with reasonable accuracy.

  7. Status of SACRD: a data base for fast reactor safety computer codes

    International Nuclear Information System (INIS)

    Greene, N.M.; Flanagan, G.F.; Alter, H.

    1982-01-01

    In 1975 work was initiated to provide a central computerized data collection of evaluated data for use in fast reactor safety computer codes. This data base is called SACRD and is intended to encompass handbook and other nonproblem-dependent data related to LMFBR's, especially at extreme conditions where little or no experimental data are available. Version 1 of the data base was released in the latter part of 1978 and remained the standard version until Version 81, which was released in October 1981

  8. Is Contralateral Templating Reliable for Establishing Rotational Alignment During Intramedullary Stabilization of Femoral Shaft Fractures? A Study of Individual Bilateral Differences in Femoral Version.

    Science.gov (United States)

    Croom, William P; Lorenzana, Daniel J; Auran, Richard L; Cavallero, Matthew J; Heckmann, Nathanael; Lee, Jackson; White, Eric A

    2018-02-01

    To determine native individual bilateral differences (IBDs) in femoral version in a diverse population. Computed tomography scans with complete imaging of uninjured bilateral femora were used to determine femoral version and IBDs in version. Age, sex, and ethnicity of each subject were also collected. Femoral version and IBDs in version were correlated with demographic variables using univariate and multivariate regression models. One hundred sixty-four subjects were included in the study. The average femoral version was 9.4 degrees (±9.4 degrees). The mean IBD in femoral version was 5.4 degrees (±4.4 degrees, P alignment during intramedullary stabilization of diaphyseal femur fractures. This is also an important consideration when considering malrotation of femur fractures because most studies define malrotation as a greater than 10-15-degree difference compared with the contralateral side. Prognostic Level IV. See Instructions for Authors for a complete description of levels of evidence.

  9. A comparison of the Space Station version of ASTROMAG with two free-flyer versions

    International Nuclear Information System (INIS)

    Green, M.A.

    1992-06-01

    This report compares the Space Station version of ASTROMAG with free-flyer versions of ASTROMAG which could fly on an Atlas IIA rocket or a Delta rocket. Launch with either free-flyer imposes severe weight limits on the magnet and its cryogenic system. Neither free-flyer version of the ASTROMAG magnet has to be charged more than once during the mission. This permits one to simplify the charging system and the cryogenic system. The helium II pump loop which supplies helium to the gas-cooled electrical leads can be eliminated in both of the free-flyer versions of the ASTROMAG magnet. This report also describes the superconducting dipole moment correction coils which are necessary for the magnet to operate on a free-flying satellite.

  10. Versioning of printed products

    Science.gov (United States)

    Tuijn, Chris

    2005-01-01

    During the definition of a printed product in an MIS system, a lot of attention is paid to the production process. The MIS systems typically gather all process-related parameters at such a level of detail that they can determine what the exact cost will be to make a specific product. This information can then be used to make a quote for the customer. Considerably less attention is paid to the content of the products since this does not have an immediate impact on the production costs (assuming that the number of inks or plates is known in advance). The content management is typically carried out either by the prepress systems themselves or by dedicated workflow servers uniting all people that contribute to the manufacturing of a printed product. Special care must be taken when considering versioned products. With versioned products we here mean distinct products that have a number of pages or page layers in common. Typical examples are comic books that have to be printed in different languages. In this case, the color plates can be shared over the different versions and the black plate will be different. Other examples are nation-wide magazines or newspapers that have an area with regional pages or advertising leaflets in different languages or currencies. When considering versioned products, the content will become an important cost factor. First of all, the content management (and associated proofing and approval cycles) becomes much more complex and, therefore, the risk that mistakes will be made increases considerably. Secondly, the real production costs are very much content-dependent because the content will determine whether plates can be shared across different versions or not and how many press runs will be needed. In this paper, we will present a way to manage different versions of a printed product. First, we will introduce a data model for version management. Next, we will show how the content of the different versions can be supplied by the customer

  11. Microgravity computing codes. User's guide

    Science.gov (United States)

    1982-01-01

    Codes used in microgravity experiments to compute fluid parameters and to obtain data graphically are introduced. The computer programs are stored on two diskettes, compatible with the floppy disk drives of the Apple 2. Two versions of both disks are available (DOS-2 and DOS-3). The codes are written in BASIC and are structured as interactive programs. Interaction takes place through the keyboard of any Apple 2-48K standard system with single floppy disk drive. The programs are protected against wrong commands given by the operator. The programs are described step by step in the same order as the instructions displayed on the monitor. Most of these instructions are shown, with samples of computation and of graphics.

  12. Community Land Model Version 3.0 (CLM3.0) Developer's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, FM

    2004-12-21

    This document describes the guidelines adopted for software development of the Community Land Model (CLM) and serves as a reference to the entire code base of the released version of the model. The version of the code described here is Version 3.0 which was released in the summer of 2004. This document, the Community Land Model Version 3.0 (CLM3.0) User's Guide (Vertenstein et al., 2004), the Technical Description of the Community Land Model (CLM) (Oleson et al., 2004), and the Community Land Model's Dynamic Global Vegetation Model (CLM-DGVM): Technical Description and User's Guide (Levis et al., 2004) provide the developer, user, or researcher with details of implementation, instructions for using the model, a scientific description of the model, and a scientific description of the Dynamic Global Vegetation Model integrated with CLM respectively. The CLM is a single column (snow-soil-vegetation) biogeophysical model of the land surface which can be run serially (on a laptop or personal computer) or in parallel (using distributed or shared memory processors or both) on both vector and scalar computer architectures. Written in Fortran 90, CLM can be run offline (i.e., run in isolation using stored atmospheric forcing data), coupled to an atmospheric model (e.g., the Community Atmosphere Model (CAM)), or coupled to a climate system model (e.g., the Community Climate System Model Version 3 (CCSM3)) through a flux coupler (e.g., Coupler 6 (CPL6)). When coupled, CLM exchanges fluxes of energy, water, and momentum with the atmosphere. The horizontal land surface heterogeneity is represented by a nested subgrid hierarchy composed of gridcells, landunits, columns, and plant functional types (PFTs). This hierarchical representation is reflected in the data structures used by the model code. Biophysical processes are simulated for each subgrid unit (landunit, column, and PFT) independently, and prognostic variables are maintained for each subgrid unit
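
    The nested subgrid hierarchy described above (gridcell, landunit, column, plant functional type) can be pictured with a small set of nested data structures. The sketch below is a schematic Python illustration of that hierarchy, not the actual Fortran 90 derived types used in CLM3.0; all field names and values are invented.

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class PFT:                          # plant functional type
            name: str
            leaf_area_index: float = 0.0    # example prognostic variable

        @dataclass
        class Column:                       # shares a snow/soil state
            soil_temperature: List[float] = field(default_factory=list)
            pfts: List[PFT] = field(default_factory=list)

        @dataclass
        class LandUnit:                     # e.g. vegetated, lake, urban, glacier
            kind: str
            columns: List[Column] = field(default_factory=list)

        @dataclass
        class GridCell:
            lat: float
            lon: float
            landunits: List[LandUnit] = field(default_factory=list)

        # Biophysical processes are simulated for each subgrid unit independently:
        cell = GridCell(40.0, -105.0, [LandUnit("vegetated", [Column(pfts=[PFT("grass")])])])
        for lu in cell.landunits:
            for col in lu.columns:
                for pft in col.pfts:
                    pft.leaf_area_index += 0.1   # placeholder for a per-PFT update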

  13. Analyses of Receptive and Productive Korean EFL Vocabulary: Computer-Based Vocabulary Learning Program

    Science.gov (United States)

    Kim, Scott Sungki

    2013-01-01

    The present research study investigated the effects of 8 versions of a computer-based vocabulary learning program on receptive and productive knowledge levels of college students. The participants were 106 male and 103 female Korean EFL students from Kyungsung University and Kwandong University in Korea. Students who participated in versions of…

  14. Exploring individual cognitions, self-regulation skills, and environmental-level factors as mediating variables of two versions of a Web-based computer-tailored nutrition education intervention aimed at adults: A randomized controlled trial.

    Science.gov (United States)

    Springvloet, Linda; Lechner, Lilian; Candel, Math J J M; de Vries, Hein; Oenema, Anke

    2016-03-01

    This study explored whether the determinants that were targeted in two versions of a Web-based computer-tailored nutrition education intervention mediated the effects on fruit, high-energy snack, and saturated fat intake among adults who did not comply with dietary guidelines. An RCT was conducted with a basic group (tailored intervention targeting individual cognitions and self-regulation), a plus group (additionally targeting environmental-level factors), and a control group (generic nutrition information). Participants were recruited from the general Dutch adult population and randomly assigned to one of the study groups. Online self-reported questionnaires assessed dietary intake and potential mediating variables (behavior-specific cognitions, action- and coping planning, environmental-level factors) at baseline and one (T1) and four (T2) months post-intervention (i.e. four and seven months after baseline). The joint-significance test was used to establish mediating variables at different time points (T1-mediating variables - T2-intake; T1-mediating variables - T1-intake; T2-mediating variables - T2-intake). Educational differences were examined by testing interaction terms. The effect of the plus version on fruit intake was mediated (T2-T2) by intention and fruit availability at home and for high-educated participants also by attitude. Among low/moderate-educated participants, high-energy snack availability at home mediated (T1-T1) the effect of the basic version on high-energy snack intake. Subjective norm mediated (T1-T1) the effect of the basic version on fat intake among high-educated participants. Only some of the targeted determinants mediated the effects of both intervention versions on fruit, high-energy snack, and saturated fat intake. A possible reason for not finding a more pronounced pattern of mediating variables is that the educational content was tailored to individual characteristics and that participants only received feedback for relevant and not for all
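
    The joint-significance test mentioned above simply requires both the intervention-to-mediator path and the mediator-to-outcome path (adjusted for the intervention) to be statistically significant. A generic sketch of that check with statsmodels is given below; the variable names and simulated data are invented and do not correspond to the study's dataset.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Invented example data: binary intervention, a candidate mediator, an outcome.
        rng = np.random.default_rng(1)
        n = 300
        group = rng.integers(0, 2, n)                        # 0 = control, 1 = intervention
        mediator = 0.5 * group + rng.normal(size=n)          # e.g. intention
        outcome = 0.4 * mediator + rng.normal(size=n)        # e.g. fruit intake

        df = pd.DataFrame({"group": group, "mediator": mediator, "outcome": outcome})

        path_a = smf.ols("mediator ~ group", df).fit()             # intervention -> mediator
        path_b = smf.ols("outcome ~ mediator + group", df).fit()   # mediator -> outcome | intervention

        p_a = path_a.pvalues["group"]
        p_b = path_b.pvalues["mediator"]
        print("mediation by joint significance:", p_a < 0.05 and p_b < 0.05)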

  15. Measuring engagement at work: validation of the Chinese version of the Utrecht Work Engagement Scale.

    Science.gov (United States)

    Fong, Ted Chun-tat; Ng, Siu-man

    2012-09-01

    Work engagement is a positive work-related state of fulfillment characterized by vigor, dedication, and absorption. Previous studies have operationalized the construct through development of the Utrecht Work Engagement Scale. Apart from the original three-factor 17-item version of the instrument (UWES-17), there exists a nine-item shortened revised version (UWES-9). The current study explored the psychometric properties of the Chinese version of the Utrecht Work Engagement Scale in terms of factorial validity, scale reliability, descriptive statistics, and construct validity. A cross-sectional questionnaire survey was conducted in 2009 among 992 workers from over 30 elderly service units in Hong Kong. Confirmatory factor analyses revealed a better fit for the three-factor model of the UWES-9 than the UWES-17 and the one-factor model of the UWES-9. The three factors showed acceptable internal consistency and strong correlations with factors in the original versions. Engagement was negatively associated with perceived stress and burnout while positively with age and holistic care climate. The UWES-9 demonstrates adequate psychometric properties, supporting its use in future research in the Chinese context.
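
    For readers unfamiliar with the internal-consistency statistic reported above, the snippet below shows one common way to compute Cronbach's alpha from a respondents-by-items matrix. The random data are purely illustrative and unrelated to the UWES sample.

        import numpy as np

        def cronbach_alpha(items):
            """items: 2-D array, rows = respondents, columns = scale items."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(0)
        fake_scores = rng.integers(0, 7, size=(100, 9))   # 9 items scored 0-6, like the UWES-9
        print(round(cronbach_alpha(fake_scores), 2))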

  16. Third-Order Transport with MAD Input: A Computer Program for Designing Charged Particle Beam Transport Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Karl

    1998-10-28

    TRANSPORT has been in existence in various evolutionary versions since 1963. The present version of TRANSPORT is a first-, second-, and third-order matrix multiplication computer program intended for the design of static-magnetic beam transport systems.
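
    To illustrate what a first-order matrix-multiplication transport calculation involves (the second- and third-order terms handled by TRANSPORT are omitted here), the sketch below chains 2x2 horizontal-plane matrices for a drift and a thin-lens quadrupole. Element lengths and strengths are made-up numbers, not from any real beamline.

        import numpy as np

        def drift(length):
            """First-order (x, x') transfer matrix of a field-free drift."""
            return np.array([[1.0, length],
                             [0.0, 1.0]])

        def thin_quad(focal_length):
            """Thin-lens approximation of a focusing quadrupole."""
            return np.array([[1.0, 0.0],
                             [-1.0 / focal_length, 1.0]])

        # Beamline: drift - quad - drift (illustrative values, metres).
        elements = [drift(1.5), thin_quad(0.8), drift(1.5)]
        R = np.eye(2)
        for m in elements:
            R = m @ R           # matrices multiply in beamline order

        x0 = np.array([0.002, 0.001])   # initial offset [m] and angle [rad]
        print("total first-order matrix:\n", R)
        print("transported ray:", R @ x0)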

  17. PAV ontology: provenance, authoring and versioning.

    Science.gov (United States)

    Ciccarese, Paolo; Soiland-Reyes, Stian; Belhajjame, Khalid; Gray, Alasdair Jg; Goble, Carole; Clark, Tim

    2013-11-22

    Provenance is a critical ingredient for establishing trust of published scientific content. This is true whether we are considering a data set, a computational workflow, a peer-reviewed publication or a simple scientific claim with supportive evidence. Existing vocabularies such as Dublin Core Terms (DC Terms) and the W3C Provenance Ontology (PROV-O) are domain-independent and general-purpose and they allow and encourage extensions to cover more specific needs. In particular, to track authoring and versioning information of web resources, PROV-O provides a basic methodology but not any specific classes and properties for identifying or distinguishing between the various roles assumed by agents manipulating digital artifacts, such as author, contributor and curator. We present the Provenance, Authoring and Versioning ontology (PAV, namespace http://purl.org/pav/): a lightweight ontology for capturing "just enough" descriptions essential for tracking the provenance, authoring and versioning of web resources. We argue that such descriptions are essential for digital scientific content. PAV distinguishes between contributors, authors and curators of content and creators of representations in addition to the provenance of originating resources that have been accessed, transformed and consumed. We explore five projects (and communities) that have adopted PAV, illustrating their usage through concrete examples. Moreover, we present mappings that show how PAV extends the W3C PROV-O ontology to support broader interoperability. The initial design of the PAV ontology was driven by requirements from the AlzSWAN project with further requirements incorporated later from other projects detailed in this paper. The authors strived to keep PAV lightweight and compact by including only those terms that have proved to be pragmatically useful in existing applications, and by recommending terms from existing ontologies when plausible. We analyze and compare PAV with related
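
    A small example of how PAV terms might be attached to a web resource using rdflib is sketched below. The property local names (authoredBy, curatedBy, version, previousVersion) are taken from the PAV namespace as described in the abstract, but the resource URIs and literal values are invented for illustration and the choice of rdflib is an assumption.

        from rdflib import Graph, Namespace, URIRef, Literal

        PAV = Namespace("http://purl.org/pav/")

        g = Graph()
        g.bind("pav", PAV)

        doc = URIRef("http://example.org/dataset/v2")        # invented resource URI
        g.add((doc, PAV.authoredBy, URIRef("http://example.org/people/alice")))
        g.add((doc, PAV.curatedBy, URIRef("http://example.org/people/bob")))
        g.add((doc, PAV.version, Literal("2.0")))
        g.add((doc, PAV.previousVersion, URIRef("http://example.org/dataset/v1")))

        print(g.serialize(format="turtle"))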

  18. 7th International Joint Conference on Computational Intelligence

    CERN Document Server

    Rosa, Agostinho; Cadenas, José; Correia, António; Madani, Kurosh; Ruano, António; Filipe, Joaquim

    2017-01-01

    This book includes a selection of revised and extended versions of the best papers from the seventh International Joint Conference on Computational Intelligence (IJCCI 2015), held in Lisbon, Portugal, from 12 to 14 November 2015, which was composed of three co-located conferences: The International Conference on Evolutionary Computation Theory and Applications (ECTA), the International Conference on Fuzzy Computation Theory and Applications (FCTA), and the International Conference on Neural Computation Theory and Applications (NCTA). The book presents recent advances in scientific developments and applications in these three areas, reflecting the IJCCI’s commitment to high quality standards.

  19. Factorial Structure of the French Version of the Rosenberg Self-Esteem Scale among the Elderly

    Science.gov (United States)

    Gana, Kamel; Alaphilippe, Daniel; Bailly, Nathalie

    2005-01-01

    Ten different confirmatory factor analysis models, including ones with correlated traits correlated methods, correlated traits correlated uniqueness, and correlated traits uncorrelated methods, were proposed to examine the factorial structure of the French version of the Rosenberg Self-Esteem Scale (Rosenberg, 1965). In line with previous studies…

  20. A constructive version of AIP revisited

    NARCIS (Netherlands)

    Barros, A.; Hou, T.

    2008-01-01

    In this paper, we review a constructive version of the Approximation Induction Principle. This version states that bisimilarity of regular processes can be decided by observing only a part of their behaviour. We use this constructive version to formulate a complete inference system for the Algebra

  1. FELIX-2.0: New version of the finite element solver for the time dependent generator coordinate method with the Gaussian overlap approximation

    Science.gov (United States)

    Regnier, D.; Dubray, N.; Verrière, M.; Schunck, N.

    2018-04-01

    The time-dependent generator coordinate method (TDGCM) is a powerful method to study the large amplitude collective motion of quantum many-body systems such as atomic nuclei. Under the Gaussian Overlap Approximation (GOA), the TDGCM leads to a local, time-dependent Schrödinger equation in a multi-dimensional collective space. In this paper, we present version 2.0 of the code FELIX that solves the collective Schrödinger equation in a finite element basis. This new version features: (i) the ability to solve a generalized TDGCM+GOA equation with a metric term in the collective Hamiltonian, (ii) support for new kinds of finite elements and different types of quadrature to compute the discretized Hamiltonian and overlap matrices, (iii) the possibility to leverage the spectral element scheme, (iv) an explicit Krylov approximation of the time propagator for time integration instead of the implicit Crank-Nicolson method implemented in the first version, (v) an entirely redesigned workflow. We benchmark this release on an analytic problem as well as on realistic two-dimensional calculations of the low-energy fission of 240Pu and 256Fm. Low to moderate numerical precision calculations are most efficiently performed with simplex elements with a degree 2 polynomial basis. Higher precision calculations should instead use the spectral element method with a degree 4 polynomial basis. We emphasize that in a realistic calculation of fission mass distributions of 240Pu, FELIX-2.0 is about 20 times faster than its previous release (within a numerical precision of a few percent).
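
    The contrast between an implicit Crank-Nicolson step and a Krylov (matrix-exponential) application of the propagator, mentioned in point (iv), can be illustrated on a generic sparse Hamiltonian. The sketch below is purely schematic: it uses SciPy on a toy 1-D kinetic operator rather than the finite-element machinery of FELIX.

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        # Toy 1-D "collective Hamiltonian": kinetic term on a uniform grid (hbar = 1).
        n, dx, dt = 200, 0.1, 0.01
        lap = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n)) / dx**2
        H = -0.5 * lap                       # purely kinetic, for illustration only

        psi0 = np.exp(-0.5 * ((np.arange(n) - n / 2) * dx) ** 2).astype(complex)
        psi0 /= np.linalg.norm(psi0)

        # (a) Implicit Crank-Nicolson step: (I + i dt/2 H) psi_new = (I - i dt/2 H) psi_old
        Id = sp.identity(n, format="csc", dtype=complex)
        A = (Id + 0.5j * dt * H).tocsc()
        B = (Id - 0.5j * dt * H).tocsr()
        psi_cn = spla.spsolve(A, B @ psi0)

        # (b) Krylov-type step: apply exp(-i dt H) to psi directly.
        psi_krylov = spla.expm_multiply(-1j * dt * H.tocsc(), psi0)

        print("norm difference after one step:", np.linalg.norm(psi_cn - psi_krylov))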

  2. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A Version Management System (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection, and merging. Traditional VMS systems are file-based and consider software systems as a set of text files. File-based VMS systems are not adequate for performing software configuration management activities, such as version control, on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise when using models as the central artifact. The goal of this work is to present a generic model-based VMS framework which can be used to overcome the problems of traditional file-based VMS systems and provide model versioning services. (author)

  3. 1990 CERN School of Computing

    International Nuclear Information System (INIS)

    1991-01-01

    These Proceedings contain written versions of lectures delivered at the 1990 CERN School of Computing, covering a variety of topics. Computer networks are treated in three papers: standards in computer networking; evolution of local and metropolitan area networks; asynchronous transfer mode, the solution for broadband ISDN. Data acquisition and analysis are the topic of papers on: data acquisition using MODEL software; graphical event analysis. Two papers in the field of signal processing treat digital image processing and the use of digital signal processors in HEP. Another paper reviews the present state of digital optical computing. Operating systems and programming discipline are covered in two papers: UNIX, evolution towards distributed systems; new developments in program verification. Three papers treat miscellaneous topics: computer security within the CERN environment; numerical simulation in fluid mechanics; fractals. An introduction to transputers and Occam gives an account of the tutorial lectures given at the School. (orig.)

  4. SHADOW3: a new version of the synchrotron X-ray optics modelling package

    International Nuclear Information System (INIS)

    Sanchez del Rio, Manuel; Canestrari, Niccolo; Jiang, Fan; Cerrina, Franco

    2011-01-01

    SHADOW3, a new version of the X-ray tracing code SHADOW, is introduced. A new version of the popular X-ray tracing code SHADOW is presented. An important step has been made in restructuring the code following new computer engineering standards, ending with a modular Fortran 2003 structure and an application programming interface (API). The new code has been designed to be compatible with the original file-oriented SHADOW philosophy, but simplifying the compilation, installation and use. In addition, users can now become programmers using the newly designed SHADOW3 API for creating scripts, macros and programs; being able to deal with optical system optimization, image simulation, and also low transmission calculations requiring a large number of rays (>10^6). Plans for future development and questions on how to accomplish them are also discussed

  5. New Generation General Purpose Computer (GPC) compact IBM unit

    Science.gov (United States)

    1991-01-01

    New Generation General Purpose Computer (GPC) compact IBM unit replaces a two-unit earlier generation computer. The new IBM unit is documented in table top views alone (S91-26867, S91-26868), with the onboard equipment it supports including the flight deck CRT screen and keypad (S91-26866), and next to the two earlier versions it replaces (S91-26869).

  6. VizieR Online Data Catalog: Asiago Supernova Catalogue (Version 2008-Mar)

    Science.gov (United States)

    Barbon, R.; Buondi, V.; Cappellaro, E.; Turatto, M.

    2008-02-01

    This catalogue supersedes the previous version by Barbon et al. (1999A&AS..139..531B, Cat. II/227), and contains data about the supernovae observed since 1895 and their parent galaxies until the beginning of 2008. In addition to the list of newly discovered SNe, the literature has been searched for new information on past SNe as well. The data for the parent galaxies have also been homogenized. (1 data file).

  7. Implementing ASPEN on the CRAY computer

    International Nuclear Information System (INIS)

    Duerre, K.H.; Bumb, A.C.

    1981-01-01

    This paper describes our experience in converting the ASPEN program for use on our CRAY computers at the Los Alamos National Laboratory. The CRAY computer is two-to-five times faster than a CDC-7600 for scalar operations, is equipped with up to two million words of high-speed storage, and has vector processing capability. Thus, the CRAY is a natural candidate for programs that are the size and complexity of ASPEN. Our approach to converting ASPEN and the conversion problems are discussed, including our plans for optimizing the program. Comparisons of run times for test problems between the CRAY and IBM 370 computer versions are presented

  8. SKEMA - A computer code to estimate atmospheric dispersion

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1985-01-01

    This computer code is a modified version of the DWNWND code, developed at Oak Ridge National Laboratory. The SKEMA code estimates the concentration in air of a material released into the atmosphere by a point source. (C.M.) [pt
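
    Codes of this family typically evaluate a Gaussian plume expression for a continuous point source. The formula below is the standard textbook form, shown only to illustrate the kind of estimate such a code produces; the source strength, release height and dispersion coefficients are made-up values, and this is not SKEMA's actual implementation.

        import numpy as np

        def gaussian_plume(q, u, x, y, z, h, a=0.08, b=0.06):
            """Ground-reflected Gaussian plume concentration (arbitrary consistent units).

            q: source strength, u: wind speed, h: release height.
            a, b: crude coefficients giving sigma_y = a*x, sigma_z = b*x (illustrative only).
            """
            sigma_y, sigma_z = a * x, b * x
            lateral = np.exp(-y**2 / (2 * sigma_y**2))
            vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                        + np.exp(-(z + h)**2 / (2 * sigma_z**2)))
            return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

        # Concentration 500 m downwind, on the plume axis, at ground level.
        print(gaussian_plume(q=1.0, u=3.0, x=500.0, y=0.0, z=0.0, h=30.0))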

  9. Determining Optimal Decision Version

    Directory of Open Access Journals (Sweden)

    Olga Ioana Amariei

    2014-06-01

    Full Text Available In this paper we start from the calculation of the product cost, applying the hour-machine cost method (THM) on each of the three cutting machines, namely: the plasma cutting machine, the combined cutting machine (plasma and water jet), and the water-jet cutting machine. Following the cost calculation, and taking into account the manufacturing precision of each machine as well as the quality of the processed surface, the optimal decision version for manufacturing the product needs to be determined. To determine the optimal decision version, we first calculate the optimal version for each criterion, and then the overall optimum using multi-attribute decision methods.
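
    In the spirit of the multi-attribute methods mentioned above, a simple weighted-sum scoring of the three cutting machines against criteria such as cost, precision and surface quality could look like the sketch below. All weights and scores are invented for illustration and are not the paper's figures.

        import numpy as np

        machines = ["plasma", "plasma + water jet", "water jet"]
        criteria = ["hourly cost (lower is better)", "precision", "surface quality"]

        # Normalised scores per machine and criterion (rows = machines), invented values.
        scores = np.array([
            [0.9, 0.5, 0.4],
            [0.6, 0.8, 0.7],
            [0.4, 0.9, 0.9],
        ])
        weights = np.array([0.5, 0.3, 0.2])      # importance of each criterion

        overall = scores @ weights
        best = machines[int(np.argmax(overall))]
        print(dict(zip(machines, overall.round(2))), "->", best)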

  10. Air Traffic Management Technology Demonstration-1 Concept of Operations (ATD-1 ConOps), Version 3.0

    Science.gov (United States)

    Baxley, Brian T.; Johnson, William C.; Scardina, John; Shay, Richard F.

    2016-01-01

    This document describes the goals, benefits, technologies, and procedures of the Concept of Operations (ConOps) for the Air Traffic Management (ATM) Technology Demonstration #1 (ATD-1), and provides an update to the previous versions of the document [ref 1 and ref 2].

  11. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (MACINTOSH VERSION)

    Science.gov (United States)

    Rogers, J. L.

    1994-01-01

    effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.

  12. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SUN VERSION)

    Science.gov (United States)

    Rogers, J. L.

    1994-01-01

    effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.

  13. A Computerized Version of the Scrambled Sentences Test

    Directory of Open Access Journals (Sweden)

    Roberto Viviani

    2018-01-01

    Full Text Available The scrambled sentences test (SST, an experimental procedure that involves participants writing down their cognitions, has been used to elicit individual differences in depressiveness and vulnerability to depression. We describe here a modification of the SST to adapt it to computerized administration, with a particular view of its use in large samples and functional neuroimaging applications. In a first study with the computerized version, we reproduce the preponderance of positive cognitions in the healthy and the inverse association of these cognitions with individual measures of depressiveness. We also report a tendency of self-referential cognitions to elicit higher positive cognition rates. In a second study, we describe the patterns of neural activations elicited by emotional and neutral sentences in a functional neuroimaging study, showing that it replicates and extends previous findings obtained with the original version of the SST. During the formation of emotional cognitions, ventral areas such as the ventral anterior cingulus and the supramarginal gyrus were relatively activated. This activation pattern speaks for the recruitment of mechanisms coordinating motivational and associative processes in the formation of value-based decisions.

  14. The version control service for the ATLAS data acquisition configuration files

    International Nuclear Information System (INIS)

    Soloviev, Igor

    2012-01-01

    The ATLAS experiment at the LHC in Geneva uses a complex and highly distributed Trigger and Data Acquisition system, involving a very large number of computing nodes and custom modules. The configuration of the system is specified by schema and data in more than 1000 XML files, with various experts responsible for updating the files associated with their components. Maintaining an error-free and consistent set of XML files proved a major challenge. Therefore a special service was implemented: to validate any modifications; to check the authorization of anyone trying to modify a file; to record who had made changes, plus when and why; and to provide tools to compare different versions of files and to go back to earlier versions if required. This paper provides details of the implementation and exploitation experience, which may be interesting for other applications using many human-readable files maintained by different people, where consistency of the files and traceability of modifications are key requirements.

  15. ORIGEN2: a revised and updated version of the Oak Ridge isotope generation and depletion code

    International Nuclear Information System (INIS)

    Croff, A.G.

    1980-07-01

    ORIGEN2 is a versatile point depletion and decay computer code for use in simulating nuclear fuel cycles and calculating the nuclide compositions of materials contained therein. This code represents a revision and update of the original ORIGEN computer code which has been distributed world-wide beginning in the early 1970s. The purpose of this report is to give a summary description of a revised and updated version of the original ORIGEN computer code, which has been designated ORIGEN2. A detailed description of the computer code ORIGEN2 is presented. The methods used by ORIGEN2 to solve the nuclear depletion and decay equations are included. Input information necessary to use ORIGEN2 that has not been documented in supporting reports is documented
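
    The point depletion and decay problem solved by codes of this kind reduces to a linear system dN/dt = A N, where the matrix A collects decay constants and transmutation rates. The sketch below solves a two-member decay chain with a matrix exponential; the half-lives are invented, and this is only an illustration of the type of equation involved, not ORIGEN2's own solution algorithm.

        import numpy as np
        from scipy.linalg import expm

        # Two-member decay chain: parent -> daughter -> (stable), illustrative half-lives in days.
        lam1 = np.log(2) / 8.0          # parent decay constant
        lam2 = np.log(2) / 2.0          # daughter decay constant

        A = np.array([[-lam1, 0.0],
                      [ lam1, -lam2]])  # dN/dt = A N

        N0 = np.array([1.0e20, 0.0])    # initial atoms of parent and daughter
        t = 5.0                         # days

        N_t = expm(A * t) @ N0
        print("parent, daughter atoms after 5 days:", N_t)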

  16. Fibonacci’s Computation Methods vs Modern Algorithms

    Directory of Open Access Journals (Sweden)

    Ernesto Burattini

    2013-12-01

    Full Text Available In this paper we discuss some computational procedures given by Leonardo Pisano Fibonacci in his famous Liber Abaci, and we propose their translation into a modern computer language (C++). Among others, we describe the method of "cross" multiplication, evaluate its computational complexity in algorithmic terms, and show the output of a C++ code that traces the development of the method applied to the product of two integers. In a similar way we show the operations performed on fractions introduced by Fibonacci. Thanks to the possibility of reproducing Fibonacci's different computational procedures on a computer, it was possible to identify some calculation errors present in the different versions of the original text.
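
    The general idea of the "cross" multiplication of two two-digit numbers (units times units, then the two crossed products for the tens, then tens times tens, carrying as needed) can be rendered in a few lines. The sketch below is a Python rendering of that idea for illustration only; the paper's own implementation is in C++ and may differ in detail.

        def cross_multiply(x, y):
            """Multiply two two-digit numbers with Fibonacci-style 'cross' products.

            units             <- x0*y0
            tens              <- x0*y1 + x1*y0  (the 'cross' step)
            hundreds and up   <- x1*y1, plus the carries.
            """
            x1, x0 = divmod(x, 10)
            y1, y0 = divmod(y, 10)

            units = x0 * y0
            carry, units = divmod(units, 10)

            tens = x0 * y1 + x1 * y0 + carry
            carry, tens = divmod(tens, 10)

            high = x1 * y1 + carry
            return high * 100 + tens * 10 + units

        assert cross_multiply(48, 37) == 48 * 37
        print(cross_multiply(48, 37))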

  17. User manual for version 4.3 of the Tripoli-4 Monte-Carlo method particle transport computer code; Notice d'utilisation du code Tripoli-4, version 4.3: code de transport de particules par la methode de Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Both, J.P.; Mazzolo, A.; Peneliau, Y.; Petit, O.; Roesslinger, B

    2003-07-01

    This manual relates to Version 4.3 of the TRIPOLI-4 code. TRIPOLI-4 is a computer code simulating the transport of neutrons, photons, electrons and positrons. It can be used for radiation shielding calculations (long-distance propagation with flux attenuation in non-multiplying media) and neutronic calculations (fissile medium, on a criticality or sub-criticality basis). This makes it possible to calculate k_eff (for criticality), flux, currents, reaction rates and multi-group cross-sections. TRIPOLI-4 is a three-dimensional code that uses the Monte-Carlo method. It allows for a point-wise (in energy) description of cross-sections as well as multi-group homogenized cross-sections, and features two modes of geometrical representation: surface and combinatorial. The code uses cross-section libraries in ENDF/B format (such as JEF2-2, ENDF/B-VI and JENDL) for the point-wise description, cross-sections in APOTRIM format (from the APOLLO2 code) or a format specific to TRIPOLI-4 for the multi-group description. (authors)

  18. Detailed analysis of the Japanese version of the Rapid Dementia Screening Test, revised version.

    Science.gov (United States)

    Moriyama, Yasushi; Yoshino, Aihide; Muramatsu, Taro; Mimura, Masaru

    2017-11-01

    The number-transcoding task on the Japanese version of the Rapid Dementia Screening Test (RDST-J) requires mutual conversion between Arabic and Chinese numerals (e.g. 209 and 4054 converted into Chinese numerals, and Chinese numerals converted into 681 and 2027). In this task, the question and answer styles of the Chinese numerals are written horizontally. We investigated the impact of changing the task so that the Chinese numerals are written vertically. Subjects were 211 patients with very mild to severe Alzheimer's disease and 42 normal controls. Mini-Mental State Examination scores ranged from 26 to 12, and Clinical Dementia Rating scores ranged from 0.5 to 3. Scores on all four subtasks of the transcoding task significantly improved in the revised version compared with the original version. The sensitivity and specificity of total scores ≥9 on the RDST-J for discriminating between controls and subjects with Clinical Dementia Rating scores of 0.5 were 63.8% and 76.6% on the original version and 60.1% and 85.8% on the revised version. The revised RDST-J total score had lower sensitivity and higher specificity compared with the original RDST-J for discriminating subjects with Clinical Dementia Rating scores of 0.5 from controls. © 2017 Japanese Psychogeriatric Society.

  19. Version 2 of RSXMULTI

    International Nuclear Information System (INIS)

    Heinicke, P.; Berg, D.; Constanta-Fanourakis, P.; Quigg, E.K.

    1985-01-01

    MULTI is a general-purpose, high-speed, high-energy-physics interface to data acquisition and data investigation systems that runs on the PDP-11 and VAX architectures. This paper describes the latest version of MULTI, which runs under RSX-11M version 4.1 and supports a modular approach to the separate tasks that interface to it, allowing the same system to be used in single-CPU test-beam experiments as well as large-scale experiments with multiple interconnected CPUs. MULTI uses CAMAC (IEEE-583) for control and monitoring of an experiment, and is written in FORTRAN-77 and assembler. The design of this version, which simplified the interface between tasks and eliminated the need for a hard-to-maintain homegrown I/O system, is also discussed.

  20. Initial water quantification results using neutron computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.K. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)], E-mail: axh174@psu.edu; Shi, L.; Brenizer, J.S.; Mench, M.M. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)

    2009-06-21

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  1. An Introduction To PC-TRIM.

    Science.gov (United States)

    John R. Mills

    1989-01-01

    The timber resource inventory model (TRIM) has been adapted to run on personal computers. The personal computer version of TRIM (PC-TRIM) is more widely used than its mainframe parent. Errors that existed in previous versions of TRIM have been corrected. Information is presented to help users with program input and output management in the DOS environment, to...

  2. Astronomy with a home computer

    CERN Document Server

    Monks, Neale

    2005-01-01

    Here is a one-volume guide to just about everything computer-related for amateur astronomers! Today's amateur astronomy is inextricably linked to personal computers. Computer-controlled "go-to" telescopes are inexpensive. CCD and webcam imaging make intensive use of the technology for capturing and processing images. Planetarium software provides information and an easy interface for telescopes. The Internet offers links to other astronomers, information, and software. The list goes on and on. Find out here how to choose the best planetarium program: are commercial versions really better than freeware? Learn how to optimise a go-to telescope, or connect it to a lap-top. Discover how to choose the best webcam and use it with your telescope. Create a mosaic of the Moon, or high-resolution images of the planets... Astronomy with a Home Computer is designed for every amateur astronomer who owns a home computer, whether it is running Microsoft Windows, Mac O/S or Linux. It doesn't matter what kind of telescope you...

  3. GASFLOW: A Computational Fluid Dynamics Code for Gases, Aerosols, and Combustion, Volume 1: Theory and Computational Model

    International Nuclear Information System (INIS)

    Nichols, B.D.; Mueller, C.; Necker, G.A.; Travis, J.R.; Spore, J.W.; Lam, K.L.; Royl, P.; Redlinger, R.; Wilson, T.L.

    1998-01-01

    Los Alamos National Laboratory (LANL) and Forschungszentrum Karlsruhe (FzK) are developing GASFLOW, a three-dimensional (3D) fluid dynamics field code as a best-estimate tool to characterize local phenomena within a flow field. Examples of 3D phenomena include circulation patterns; flow stratification; hydrogen distribution mixing and stratification; combustion and flame propagation; effects of noncondensable gas distribution on local condensation and evaporation; and aerosol entrainment, transport, and deposition. An analysis with GASFLOW will result in a prediction of the gas composition and discrete particle distribution in space and time throughout the facility and the resulting pressure and temperature loadings on the walls and internal structures with or without combustion. A major application of GASFLOW is for predicting the transport, mixing, and combustion of hydrogen and other gases in nuclear reactor containments and other facilities. It has been applied to situations involving transporting and distributing combustible gas mixtures. It has been used to study gas dynamic behavior (1) in low-speed, buoyancy-driven flows, as well as sonic flows or diffusion dominated flows; and (2) during chemically reacting flows, including deflagrations. The effects of controlling such mixtures by safety systems can be analyzed. The code version described in this manual is designated GASFLOW 2.1, which combines previous versions of the United States Nuclear Regulatory Commission code HMS (for Hydrogen Mixing Studies) and the Department of Energy and FzK versions of GASFLOW. The code was written in standard Fortran 90. This manual comprises three volumes. Volume I describes the governing physical equations and computational model. Volume II describes how to use the code to set up a model geometry, specify gas species and material properties, define initial and boundary conditions, and specify different outputs, especially graphical displays. Sample problems are included

  4. Definition of the Flexible Image Transport System (FITS), version 3.0

    Science.gov (United States)

    Pence, W. D.; Chiappetti, L.; Page, C. G.; Shaw, R. A.; Stobie, E.

    2010-12-01

    The Flexible Image Transport System (FITS) has been used by astronomers for over 30 years as a data interchange and archiving format; FITS files are now handled by a wide range of astronomical software packages. Since the FITS format definition document (the “standard”) was last printed in this journal in 2001, several new features have been developed and standardized, notably support for 64-bit integers in images and tables, variable-length arrays in tables, and new world coordinate system conventions which provide a mapping from an element in a data array to a physical coordinate on the sky or within a spectrum. The FITS Working Group of the International Astronomical Union has therefore produced this new version 3.0 of the FITS standard, which is provided here in its entirety. In addition to describing the new features in FITS, numerous editorial changes were made to the previous version to clarify and reorganize many of the sections. Also included are some appendices which are not formally part of the standard. The FITS standard is likely to undergo further evolution, in which case the latest version may be found on the FITS Support Office Web site at http://fits.gsfc.nasa.gov/, which also provides many links to FITS-related resources.
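
    For readers who want to see what handling a FITS file looks like in practice, a short example using the widely used astropy package is given below. The file name is a placeholder, and astropy is only one of the many packages that implement the standard; nothing here is part of the standard document itself.

        from astropy.io import fits
        from astropy.wcs import WCS

        # "example.fits" is a placeholder for any FITS image file.
        with fits.open("example.fits") as hdul:
            hdul.info()                         # list all header-data units
            header = hdul[0].header
            data = hdul[0].data                 # e.g. a 2-D image array

            # World coordinate system keywords map pixel indices to sky coordinates.
            wcs = WCS(header)
            ra, dec = wcs.pixel_to_world_values(100, 200)
            print(header.get("BITPIX"), data.shape if data is not None else None, ra, dec)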

  5. Definition of the Flexible Image Transport System (FITS), Version 3.0

    Science.gov (United States)

    Pence, W. D.; Chiappetti, L.; Page, C. G.; Shaw, R. A.; Stobie, E.

    2010-01-01

    The Flexible Image Transport System (FITS) has been used by astronomers for over 30 years as a data interchange and archiving format; FITS files are now handled by a wide range of astronomical software packages. Since the FITS format definition document (the "standard") was last printed in this journal in 2001, several new features have been developed and standardized, notably support for 64-bit integers in images and tables, variable-length arrays in tables, and new world coordinate system conventions which provide a mapping from an element in a data array to a physical coordinate on the sky or within a spectrum. The FITS Working Group of the International Astronomical Union has therefore produced this new Version 3.0 of the FITS standard, which is provided here in its entirety. In addition to describing the new features in FITS, numerous editorial changes were made to the previous version to clarify and reorganize many of the sections. Also included are some appendices which are not formally part of the standard. The FITS standard is likely to undergo further evolution, in which case the latest version may be found on the FITS Support Office Web site at http://fits.gsfc.nasa.gov/, which also provides many links to FITS-related resources.

  6. A Microsoft Windows version of the MCNP visual editor

    International Nuclear Information System (INIS)

    Schwarz, R.A.; Carter, L.L.; Pfohl, J.

    1999-01-01

    Work has started on a Microsoft Windows version of the MCNP visual editor. The MCNP visual editor provides a graphical user interface for displaying and creating MCNP geometries. The visual editor is currently available from the Radiation Safety Information Computational Center (RSICC) and the Nuclear Energy Agency (NEA) as software package PSR-358. It currently runs on the major UNIX platforms (IBM, SGI, HP, SUN) and Linux. Work has started on converting the visual editor to work in a Microsoft Windows environment. This initial work focuses on converting the display capabilities of the visual editor; the geometry creation capability of the visual editor may be included in future upgrades

  7. The Implications of Pervasive Computing on Network Design

    Science.gov (United States)

    Briscoe, R.

    Mark Weiser's late-1980s vision of an age of calm technology with pervasive computing disappearing into the fabric of the world [1] has been tempered by an industry-driven vision with more of a feel of conspicuous consumption. In the modified version, everyone carries around consumer electronics to provide natural, seamless interactions both with other people and with the information world, particularly for eCommerce, but still through a pervasive computing fabric.

  8. Reliability and validity of the Japanese version of the Resilience Scale and its short version

    Directory of Open Access Journals (Sweden)

    Kondo Maki

    2010-11-01

    Full Text Available Abstract. Background: The clinical relevance of resilience has received considerable attention in recent years. The aim of this study is to demonstrate the reliability and validity of the Japanese version of the Resilience Scale (RS) and the short version of the RS (RS-14). Findings: The original English version of the RS was translated into Japanese and the Japanese version was confirmed by back-translation. Participants were 430 nursing and university psychology students. The RS, Center for Epidemiologic Studies Depression Scale (CES-D), Rosenberg Self-Esteem Scale (RSES), Social Support Questionnaire (SSQ), Perceived Stress Scale (PSS), and Sheehan Disability Scale (SDS) were administered. Internal consistency, convergent validity and factor loadings were assessed at initial assessment. Test-retest reliability was assessed using data collected from 107 students at 3 months after baseline. Mean score on the RS was 111.19. Cronbach's alpha coefficients for the RS and RS-14 were 0.90 and 0.88, respectively. The test-retest correlation coefficients for the RS and RS-14 were 0.83 and 0.84, respectively. Both the RS and RS-14 were negatively correlated with the CES-D and SDS, and positively correlated with the RSES, SSQ and PSS. Conclusions: This study demonstrates that the Japanese version of the RS has psychometric properties with high degrees of internal consistency, high test-retest reliability, and relatively low concurrent validity. RS-14 was equivalent to the RS in internal consistency, test-retest reliability, and concurrent validity. Low scores on the RS, a positive correlation between the RS and perceived stress, and a relatively low correlation between the RS and depressive symptoms in this study suggest that the validity of the Japanese version of the RS might be relatively low compared with the original English version.

  9. Performance of the improved version of Monte Carlo Code A3MCNP for cask shielding design

    International Nuclear Information System (INIS)

    Hasegawa, T.; Ueki, K.; Sato, O.; Sjoden, G.E.; Miyake, Y.; Ohmura, M.; Haghighat, A.

    2004-01-01

    A3MCNP (Automatic Adjoint Accelerated MCNP) is a revised version of the MCNP Monte Carlo code that automatically prepares variance reduction parameters for the CADIS (Consistent Adjoint Driven Importance Sampling) methodology. Using a deterministic "importance" (or adjoint) function, CADIS performs source and transport biasing within the weight-window technique. The current version of A3MCNP uses the 3-D Sn transport code TORT to determine a 3-D importance function distribution. Based on simulations of several real-life problems, it is demonstrated that A3MCNP provides precise calculation results with a remarkably short computation time by using proper and objective variance reduction parameters. However, since the first version of A3MCNP provided only a point-source configuration option for large-scale shielding problems, such as spent-fuel transport casks, a large amount of memory may be necessary to store enough points to properly represent the source. Hence, we have developed an improved version of A3MCNP (referred to as A3MCNPV) which has a volumetric source configuration option. This paper describes the successful use of A3MCNPV for a cask neutron and gamma-ray shielding problem
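
    The weight-window technique that CADIS parameterises can be illustrated, in a very stripped-down form, by the splitting and Russian-roulette rule applied to a particle's statistical weight. The bounds below are arbitrary, and this sketch has none of the adjoint-based machinery of A3MCNP; it only shows the generic rule.

        import random

        def apply_weight_window(weight, w_low, w_high, w_survival=None):
            """Return a list of (possibly split or rouletted) particle weights.

            If weight > w_high the particle is split; if weight < w_low it plays
            Russian roulette and either survives with weight w_survival or is killed.
            """
            if w_survival is None:
                w_survival = 0.5 * (w_low + w_high)

            if weight > w_high:
                n_split = int(weight / w_high) + 1
                return [weight / n_split] * n_split
            if weight < w_low:
                if random.random() < weight / w_survival:
                    return [w_survival]          # survives with boosted weight
                return []                        # killed
            return [weight]                      # inside the window: unchanged

        print(apply_weight_window(5.0, w_low=0.5, w_high=2.0))   # split
        print(apply_weight_window(0.1, w_low=0.5, w_high=2.0))   # roulette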

  10. SHADOW3: a new version of the synchrotron X-ray optics modelling package

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez del Rio, Manuel, E-mail: srio@esrf.eu [European Synchrotron Radiation Facility, 6 Jules Horowitz, 38000 Grenoble (France); Canestrari, Niccolo [CNRS, Grenoble (France); European Synchrotron Radiation Facility, 6 Jules Horowitz, 38000 Grenoble (France); Jiang, Fan; Cerrina, Franco [Boston University, 8 St Mary’s Street, Boston, MA 02215 (United States)

    2011-09-01

    SHADOW3, a new version of the X-ray tracing code SHADOW, is introduced. A new version of the popular X-ray tracing code SHADOW is presented. An important step has been made in restructuring the code following new computer engineering standards, ending with a modular Fortran 2003 structure and an application programming interface (API). The new code has been designed to be compatible with the original file-oriented SHADOW philosophy, but simplifying the compilation, installation and use. In addition, users can now become programmers using the newly designed SHADOW3 API for creating scripts, macros and programs; being able to deal with optical system optimization, image simulation, and also low transmission calculations requiring a large number of rays (>10^6). Plans for future development and questions on how to accomplish them are also discussed.

  11. hp-version discontinuous Galerkin methods on polygonal and polyhedral meshes

    CERN Document Server

    Cangiani, Andrea; Georgoulis, Emmanuil H; Houston, Paul

    2017-01-01

    Over the last few decades discontinuous Galerkin finite element methods (DGFEMs) have witnessed tremendous interest as a computational framework for the numerical solution of partial differential equations. Their success is due to their extreme versatility in the design of the underlying meshes and local basis functions, while retaining key features of both (classical) finite element and finite volume methods. Somewhat surprisingly, DGFEMs on general tessellations consisting of polygonal (in 2D) or polyhedral (in 3D) element shapes have received little attention within the literature, despite the potential computational advantages. This volume introduces the basic principles of hp-version (i.e., locally varying mesh-size and polynomial order) DGFEMs over meshes consisting of polygonal or polyhedral element shapes, presents their error analysis, and includes an extensive collection of numerical experiments. The extreme flexibility provided by the locally variable element-shapes, element-sizes, and elemen...

  12. The SIMRAND 1 computer program: Simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The SIMRAND I Computer Program (Version 5.0 x 0.3) written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles is described. The SIMRAND I Computer Program comprises eleven modules: a main routine and ten subroutines. Two additional files are used at compile time; one inserts the system or task equations into the source code, while the other inserts the dimension statements and common blocks. The SIMRAND I Computer Program can be run on most microcomputers or mainframe computers with only minor modifications to the computer code.

  13. A COMETHE version with transient capability

    International Nuclear Information System (INIS)

    Vliet, J. van; Lebon, G.; Mathieu, P.

    1980-01-01

    A version of the COMETHE code is under development to simulate transient situations. This paper focuses on some aspects of the transient heat transfer models. First, the coupling between transient heat transfer and the other thermomechanical models is discussed. An estimation of the thermal characteristic times shows that the cladding temperatures are often in a quasi-steady state. In order to reduce the computing time, calculations are therefore switched from a transient to a quasi-static numerical procedure as soon as such a quasi-equilibrium is detected. The temperature calculation is performed by use of the Lebon-Lambermont restricted variational principle, with piecewise polynomials as trial functions. The method has been checked by comparison with some exact results and yields good agreement for transient as well as for quasi-static situations. This method therefore provides a valuable tool for the simulation of the transient behaviour of nuclear reactor fuel rods. (orig.)

  14. 4th International Joint Conference on Computational Intelligence

    CERN Document Server

    Correia, António; Rosa, Agostinho; Filipe, Joaquim

    2015-01-01

    The present book includes extended and revised versions of a set of selected papers from the Fourth International Joint Conference on Computational Intelligence (IJCCI 2012), held in Barcelona, Spain, from 5 to 7 October 2012. The conference was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and was organized in cooperation with the Association for the Advancement of Artificial Intelligence (AAAI). The conference brought together researchers, engineers and practitioners in computational technologies, especially those related to the areas of fuzzy computation, evolutionary computation and neural computation. It is composed of three co-located conferences, each specialized in one of the aforementioned knowledge areas, namely: - International Conference on Evolutionary Computation Theory and Applications (ECTA) - International Conference on Fuzzy Computation Theory and Applications (FCTA) - International Conference on Neural Computation Theory a...

  15. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  16. SCALE-4 [Standardized Computer Analyses for Licensing Evaluation]: An improved computational system for spent-fuel cask analysis

    International Nuclear Information System (INIS)

    Parks, C.V.

    1989-01-01

    The purpose of this paper is to provide specific information regarding improvements available with Version 4.0 of the SCALE system and to discuss the future of SCALE within the current computing and regulatory environment. The emphasis is on the improvements in SCALE-4 over those available in SCALE-3. 10 refs., 1 fig., 1 tab

  17. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments...

  18. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  19. Radiation dose and image quality of X-ray volume imaging systems: cone-beam computed tomography, digital subtraction angiography and digital fluoroscopy.

    Science.gov (United States)

    Paul, Jijo; Jacobi, Volkmar; Farhang, Mohammad; Bazrafshan, Babak; Vogl, Thomas J; Mbalisike, Emmanuel C

    2013-06-01

    Radiation dose and image quality estimation of three X-ray volume imaging (XVI) systems. A total of 126 patients were examined using three XVI systems (groups 1-3) and their data were retrospectively analysed from 2007 to 2012. Each group consisted of 42 patients and each patient was examined using cone-beam computed tomography (CBCT), digital subtraction angiography (DSA) and digital fluoroscopy (DF). Dose parameters such as dose-area product (DAP) and skin entry dose (SED), and image quality parameters such as Hounsfield unit (HU), noise, signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR), were estimated and compared using appropriate statistical tests. Mean DAP and SED were lower with the recent XVI than with its previous counterparts in CBCT, DSA and DF. HU showed no significant difference between the groups at any measured location except the hepatic artery. Noise showed a significant difference among groups (P < 0.05). Regarding CNR and SNR, the recent XVI showed significantly higher values than its previous versions. Qualitatively, CBCT showed significant differences between versions, unlike DSA and DF. A reduction of radiation dose was obtained for the recent-generation XVI system in CBCT, DSA and DF. Image noise was significantly lower, and SNR and CNR were higher, than in previous versions. The technological advancements and the reduction in the number of frames led to a significant dose reduction and improved image quality with the recent-generation XVI system. • X-ray volume imaging (XVI) systems are increasingly used for interventional radiological procedures. • More modern XVI systems use lower radiation doses compared with earlier counterparts. • Furthermore, more modern XVI systems provide higher image quality. • Technological advances reduce radiation dose and improve image quality.
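    The image quality figures reported above (SNR and CNR) follow standard region-of-interest definitions: SNR as mean signal over noise, CNR as the absolute difference of two ROI means over the background noise. The sketch below uses one common convention; the actual ROI placement and noise definition used in the study are not stated, so this is illustrative only.

```python
import numpy as np

def snr(roi):
    """Signal-to-noise ratio of a region of interest: mean HU over its standard deviation."""
    roi = np.asarray(roi, dtype=float)
    return roi.mean() / roi.std(ddof=1)

def cnr(roi_lesion, roi_background):
    """Contrast-to-noise ratio: absolute difference of the two ROI means
    divided by the background noise (one common convention)."""
    a = np.asarray(roi_lesion, dtype=float)
    b = np.asarray(roi_background, dtype=float)
    return abs(a.mean() - b.mean()) / b.std(ddof=1)

# usage with made-up HU samples from two regions of one image
print(snr([62, 58, 61, 60, 59]), cnr([62, 58, 61, 60, 59], [30, 28, 33, 31, 29]))
```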

  20. Implementation of an electronic medical record system in previously computer-naïve primary care centres: a pilot study from Cyprus.

    Science.gov (United States)

    Samoutis, George; Soteriades, Elpidoforos S; Kounalakis, Dimitris K; Zachariadou, Theodora; Philalithis, Anastasios; Lionis, Christos

    2007-01-01

    The computer-based electronic medical record (EMR) is an essential new technology in health care, contributing to high-quality patient care and efficient patient management. The majority of southern European countries, however, have not yet implemented universal EMR systems and many efforts are still ongoing. We describe the development of an EMR system and its pilot implementation and evaluation in two previously computer-naïve public primary care centres in Cyprus. One urban and one rural primary care centre along with their personnel (physicians and nurses) were selected to participate. Both qualitative and quantitative evaluation tools were used during the implementation phase. Qualitative data analysis was based on the framework approach, whereas quantitative assessment was based on a nine-item questionnaire and EMR usage parameters. Two public primary care centres participated, and a total of ten health professionals served as EMR system evaluators. Physicians and nurses rated EMR relatively highly, while patients were the most enthusiastic supporters for the new information system. Major implementation impediments were the physicians' perceptions that EMR usage negatively affected their workflow, physicians' legal concerns, lack of incentives, system breakdowns, software design problems, transition difficulties and lack of familiarity with electronic equipment. The importance of combining qualitative and quantitative evaluation tools is highlighted. More efforts are needed for the universal adoption and routine use of EMR in the primary care system of Cyprus as several barriers to adoption exist; however, none is insurmountable. Computerised systems could improve efficiency and quality of care in Cyprus, benefiting the entire population.

  1. Computing Equilibrium Chemical Compositions

    Science.gov (United States)

    Mcbride, Bonnie J.; Gordon, Sanford

    1995-01-01

    Chemical Equilibrium With Transport Properties, 1993 (CET93) computer program provides data on chemical-equilibrium compositions. Aids calculation of thermodynamic properties of chemical systems. Information essential in design and analysis of such equipment as compressors, turbines, nozzles, engines, shock tubes, heat exchangers, and chemical-processing equipment. CET93/PC is version of CET93 specifically designed to run within 640K memory limit of MS-DOS operating system. CET93/PC written in FORTRAN.

  2. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (MACINTOSH VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    antenna required to establish a link with the satellite, the statistical parameters that characterize the rainrate process at the terminal site, the length of the propagation path within the potential rain region, and its projected length onto the local horizontal. The IBM PC version of LeRC-SLAM (LEW-14979) is written in Microsoft QuickBASIC for an IBM PC compatible computer with a monitor and printer capable of supporting an 80-column format. The IBM PC version is available on a 5.25 inch MS-DOS format diskette. The program requires about 30K RAM. The source code and executable are included. The Macintosh version of LeRC-SLAM (LEW-14977) is written in Microsoft Basic, Binary (b) v2.00 for Macintosh II series computers running MacOS. This version requires 400K RAM and is available on a 3.5 inch 800K Macintosh format diskette, which includes source code only. The Macintosh version was developed in 1987 and the IBM PC version was developed in 1989. IBM PC is a trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. Macintosh is a registered trademark of Apple Computer, Inc.

  3. APGEN Version 5.0

    Science.gov (United States)

    Maldague, Pierre; Page, Dennis; Chase, Adam

    2005-01-01

    Activity Plan Generator (APGEN), now at version 5.0, is a computer program that assists in generating an integrated plan of activities for a spacecraft mission that does not oversubscribe spacecraft and ground resources. APGEN generates an interactive display, through which the user can easily create or modify the plan. The display summarizes the plan by means of a time line, whereon each activity is represented by a bar stretched between its beginning and ending times. Activities can be added, deleted, and modified via simple mouse and keyboard actions. The use of resources can be viewed on resource graphs. Resource and activity constraints can be checked. Types of activities, resources, and constraints are defined by simple text files, which the user can modify. In one of two modes of operation, APGEN acts as a planning expert assistant, displaying the plan and identifying problems in the plan. The user is in charge of creating and modifying the plan. In the other mode, APGEN automatically creates a plan that does not oversubscribe resources. The user can then manually modify the plan. APGEN is designed to interact with other software that generates sequences of timed commands for implementing details of planned activities.
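    The core check described above, that a plan does not oversubscribe resources, amounts to verifying that summed resource usage never exceeds capacity at any instant. The sketch below is a minimal, generic illustration under an assumed activity representation (start, end, usage per activity for one resource); it is not APGEN's data model.

```python
def find_oversubscriptions(activities, capacity):
    """activities: list of (start, end, usage) tuples for one resource.
    Returns (time, level) pairs where total usage exceeds the capacity."""
    events = []
    for start, end, usage in activities:
        events.append((start, usage))   # usage begins
        events.append((end, -usage))    # usage ends
    events.sort()                       # ends sort before starts at equal times
    level, violations = 0, []
    for time, delta in events:
        level += delta
        if level > capacity:
            violations.append((time, level))
    return violations

# two overlapping activities of 2 units each against a capacity of 3
print(find_oversubscriptions([(0, 5, 2), (3, 8, 2)], capacity=3))  # [(3, 4)]
```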

  4. Main modelling features of the ASTEC V2.1 major version

    International Nuclear Information System (INIS)

    Chatelard, P.; Belon, S.; Bosland, L.; Carénini, L.; Coindreau, O.; Cousin, F.; Marchetto, C.; Nowack, H.; Piar, L.; Chailan, L.

    2016-01-01

    Highlights: • Recent modelling improvements of the ASTEC European severe accident code are outlined. • Key new physical models now available in the ASTEC V2.1 major version are described. • ASTEC progress towards a multi-design reactor code is illustrated for BWR and PHWR. • ASTEC strong link with the on-going EC CESAM FP7 project is emphasized. • Main remaining modelling issues (on which IRSN efforts are now directed) are given. - Abstract: A new major version of the European severe accident integral code ASTEC, developed by IRSN with some GRS support, was delivered in November 2015 to the ASTEC worldwide community. Main modelling features of this V2.1 version are summarised in this paper. In particular, the in-vessel coupling technique between the reactor coolant system thermal-hydraulics module and the core degradation module has been strongly re-engineered to remove some well-known weaknesses of the former V2.0 series. The V2.1 version also includes new core degradation models specifically addressing BWR and PHWR reactor types, as well as several other physical modelling improvements, notably on reflooding of severely damaged cores, Zircaloy oxidation under air atmosphere, corium coolability during corium concrete interaction and source term evaluation. Moreover, this V2.1 version constitutes the back-bone of the CESAM FP7 project, whose final objective is to further improve ASTEC for use in Severe Accident Management analysis of the Gen.II–III nuclear power plants presently under operation or foreseen in the near future in Europe. As part of this European project, IRSN efforts to continuously improve both code numerical robustness and computing performances at plant scale as well as users’ tools are being intensified. Besides, ASTEC will continue capitalising the whole knowledge on severe accidents phenomenology by progressively keeping physical models at the state of the art through a regular feed-back from the interpretation of the current and

  5. PR-EDB: Power Reactor Embrittlement Database Version 3

    International Nuclear Information System (INIS)

    Wang, Jy-An John; Subramani, Ranjit

    2008-01-01

    The aging and degradation of light-water reactor pressure vessels is of particular concern because of their relevance to plant integrity and the magnitude of the expected irradiation embrittlement. The radiation embrittlement of reactor pressure vessel materials depends on many factors, such as neutron fluence, flux, and energy spectrum, irradiation temperature, and preirradiation material history and chemical compositions. These factors must be considered to reliably predict pressure vessel embrittlement and to ensure the safe operation of the reactor. Large amounts of data from surveillance capsules are needed to develop a generally applicable damage prediction model that can be used for industry standards and regulatory guides. Furthermore, the investigations of regulatory issues such as vessel integrity over plant life, vessel failure, and sufficiency of current codes, Standard Review Plans (SRPs), and Guides for license renewal can be greatly expedited by the use of a well-designed computerized database. The Power Reactor Embrittlement Database (PR-EDB) is such a comprehensive collection of data for U.S. designed commercial nuclear reactors. The current version of the PR-EDB lists the test results of 104 heat-affected-zone (HAZ) materials, 115 weld materials, and 141 base materials, including 103 plates, 35 forgings, and 3 correlation monitor materials that were irradiated in 321 capsules from 106 commercial power reactors. The data files are given in dBASE format and can be accessed with any personal computer using the Windows operating system. 'User-friendly' utility programs have been written to investigate radiation embrittlement using this database. Utility programs allow the user to retrieve, select and manipulate specific data, display data to the screen or printer, and fit and plot Charpy impact data. The PR-EDB Version 3.0 upgrades Version 2.0. The package was developed based on the Microsoft .NET framework technology and uses Microsoft Access for

  6. PR-EDB: Power Reactor Embrittlement Database - Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jy-An John [ORNL; Subramani, Ranjit [ORNL

    2008-03-01

    The aging and degradation of light-water reactor pressure vessels is of particular concern because of their relevance to plant integrity and the magnitude of the expected irradiation embrittlement. The radiation embrittlement of reactor pressure vessel materials depends on many factors, such as neutron fluence, flux, and energy spectrum, irradiation temperature, and preirradiation material history and chemical compositions. These factors must be considered to reliably predict pressure vessel embrittlement and to ensure the safe operation of the reactor. Large amounts of data from surveillance capsules are needed to develop a generally applicable damage prediction model that can be used for industry standards and regulatory guides. Furthermore, the investigations of regulatory issues such as vessel integrity over plant life, vessel failure, and sufficiency of current codes, Standard Review Plans (SRPs), and Guides for license renewal can be greatly expedited by the use of a well-designed computerized database. The Power Reactor Embrittlement Database (PR-EDB) is such a comprehensive collection of data for U.S. designed commercial nuclear reactors. The current version of the PR-EDB lists the test results of 104 heat-affected-zone (HAZ) materials, 115 weld materials, and 141 base materials, including 103 plates, 35 forgings, and 3 correlation monitor materials that were irradiated in 321 capsules from 106 commercial power reactors. The data files are given in dBASE format and can be accessed with any personal computer using the Windows operating system. "User-friendly" utility programs have been written to investigate radiation embrittlement using this database. Utility programs allow the user to retrieve, select and manipulate specific data, display data to the screen or printer, and fit and plot Charpy impact data. The PR-EDB Version 3.0 upgrades Version 2.0. The package was developed based on the Microsoft .NET framework technology and uses Microsoft Access for

  7. Comparison of computed tomography dose reporting software

    International Nuclear Information System (INIS)

    Abdullah, A.; Sun, Z.; Pongnapang, N.; Ng, K. H.

    2008-01-01

    Computed tomography (CT) dose reporting software facilitates the estimation of doses to patients undergoing CT examinations. In this study, a comparison of three software packages, i.e. CT-Expo (version 1.5, Medizinische Hochschule, Hannover (Germany)), ImPACT CT Patients Dosimetry Calculator (version 0.99x, Imaging Performance Assessment on Computed Tomography, www.impactscan.org) and WinDose (version 2.1a, Wellhofer Dosimetry, Schwarzenbruck (Germany)), has been made in terms of their calculation algorithms and the resulting calculated doses. Estimations were performed for head, chest, abdominal and pelvic examinations based on the protocols recommended by European guidelines using single-slice CT (SSCT) (Siemens Somatom Plus 4, Erlangen (Germany)) and multi-slice CT (MSCT) (Siemens Sensation 16, Erlangen (Germany)) for software-based female and male phantoms. The results showed that there are some differences in the final dose reporting provided by these software packages, with deviations among the effective doses they produce. Coefficients of variation range from 3.3 to 23.4 % in SSCT and from 10.6 to 43.8 % in MSCT. It is important that researchers state the name of the software that is used to estimate the various CT dose quantities. Users must also understand the equivalent terminologies between the information obtained from the CT console and the software packages in order to use the software correctly. (authors)
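    The agreement statistic quoted above is the percentage coefficient of variation of the effective-dose estimates across the packages. A minimal sketch of that arithmetic follows; the dose values are invented for illustration and do not come from the study.

```python
import numpy as np

def coefficient_of_variation(values):
    """Percentage coefficient of variation: 100 * sample SD / mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# hypothetical effective doses (mSv) for one protocol from the three packages
doses = {"CT-Expo": 6.2, "ImPACT": 6.9, "WinDose": 7.4}
print(f"CoV = {coefficient_of_variation(list(doses.values())):.1f}%")
```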

  8. Prospective EFL Teachers' Emotional Intelligence and Tablet Computer Use and Literacy

    Science.gov (United States)

    Herguner, Sinem

    2017-01-01

    The aim of this study was to investigate whether there is a relationship between tablet computer use and literacy, and emotional intelligence of prospective English language teachers. The study used a survey approach. In the study, "Prospective Teachers Tablet Computer Use and Literacy Scale" and an adapted and translated version into…

  9. Description of input and examples for PHREEQC version 3: a computer program for speciation, batch-reaction, one-dimensional transport, and inverse geochemical calculations

    Science.gov (United States)

    Parkhurst, David L.; Appelo, C.A.J.

    2013-01-01

    PHREEQC version 3 is a computer program written in the C and C++ programming languages that is designed to perform a wide variety of aqueous geochemical calculations. PHREEQC implements several types of aqueous models: two ion-association aqueous models (the Lawrence Livermore National Laboratory model and WATEQ4F), a Pitzer specific-ion-interaction aqueous model, and the SIT (Specific ion Interaction Theory) aqueous model. Using any of these aqueous models, PHREEQC has capabilities for (1) speciation and saturation-index calculations; (2) batch-reaction and one-dimensional (1D) transport calculations with reversible and irreversible reactions, which include aqueous, mineral, gas, solid-solution, surface-complexation, and ion-exchange equilibria, and specified mole transfers of reactants, kinetically controlled reactions, mixing of solutions, and pressure and temperature changes; and (3) inverse modeling, which finds sets of mineral and gas mole transfers that account for differences in composition between waters within specified compositional uncertainty limits. Many new modeling features were added to PHREEQC version 3 relative to version 2. The Pitzer aqueous model (pitzer.dat database, with keyword PITZER) can be used for high-salinity waters that are beyond the range of application for the Debye-Hückel theory. The Peng-Robinson equation of state has been implemented for calculating the solubility of gases at high pressure. Specific volumes of aqueous species are calculated as a function of the dielectric properties of water and the ionic strength of the solution, which allows calculation of pressure effects on chemical reactions and the density of a solution. The specific conductance and the density of a solution are calculated and printed in the output file. In addition to Runge-Kutta integration, a stiff ordinary differential equation solver (CVODE) has been included for kinetic calculations with multiple rates that occur at widely different time scales
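    One of the quantities produced by the speciation and saturation-index calculations mentioned above is the saturation index of each mineral, SI = log10(IAP/K), where IAP is the ion-activity product of the speciated solution and K the equilibrium constant. The toy sketch below shows only this final arithmetic, with assumed activities and an illustrative log K value; it does not reproduce PHREEQC's input format or databases.

```python
import math

def saturation_index(ion_activity_product, equilibrium_constant):
    """SI = log10(IAP / K); SI > 0 means oversaturated, SI < 0 undersaturated."""
    return math.log10(ion_activity_product / equilibrium_constant)

# toy example for gypsum, CaSO4·2H2O: IAP = a(Ca2+) * a(SO4 2-) * a(H2O)^2
a_ca, a_so4, a_h2o = 1.2e-3, 1.5e-3, 1.0   # assumed activities
log_k_gypsum = -4.58                        # illustrative log K value
si = saturation_index(a_ca * a_so4 * a_h2o**2, 10**log_k_gypsum)
print(f"SI(gypsum) = {si:.2f}")
```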

  10. Comparing short versions of the AUDIT in a community-based survey of young people

    Science.gov (United States)

    2013-01-01

    Background The 10-item Alcohol Use Disorders Identification Test (AUDIT-10) is commonly used to monitor harmful alcohol consumption among high-risk groups, including young people. However, time and space constraints have generated interest for shortened versions. Commonly used variations are the AUDIT-C (three questions) and the Fast Alcohol Screening Test (FAST) (four questions), but their utility in screening young people in non-clinical settings has received little attention. Methods We examined the performance of established and novel shortened versions of the AUDIT in relation to the full AUDIT-10 in a community-based survey of young people (16–29 years) attending a music festival in Melbourne, Australia (January 2010). Among those reporting drinking alcohol in the previous 12 months, the following statistics were systematically assessed for all possible combinations of three or four AUDIT items and established AUDIT variations: Cronbach’s alpha (internal consistency), variance explained (R2) and Pearson’s correlation coefficient (concurrent validity). For our purposes, novel shortened AUDIT versions considered were required to represent all three AUDIT domains and include item 9 on alcohol-related injury. Results We recruited 640 participants (68% female) reporting drinking in the previous 12 months. Median AUDIT-10 score was 10 in males and 9 in females, and 127 (20%) were classified as having at least high-level alcohol problems according to WHO classification. The FAST scored consistently high across statistical measures; it explained 85.6% of variance in AUDIT-10, correlation with AUDIT-10 was 0.92, and Cronbach’s alpha was 0.66. A number of novel four-item AUDIT variations scored similarly high. Comparatively, the AUDIT-C scored substantially lower on all measures except internal consistency. Conclusions Numerous abbreviated variations of the AUDIT may be a suitable alternative to the AUDIT-10 for classifying high-level alcohol problems in a
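    The evaluation loop described above, scoring every three- and four-item subset of the AUDIT-10 by Cronbach's alpha, variance explained and Pearson's correlation against the full scale, is easy to express directly. The sketch below assumes a respondents-by-items score matrix and uses simulated data; variable names and the simulated responses are placeholders, not the study data.

```python
import numpy as np
from itertools import combinations

def cronbach_alpha(items):
    """items: respondents x k matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                            / items.sum(axis=1).var(ddof=1))

def evaluate_short_forms(audit, sizes=(3, 4)):
    """Score every 3- or 4-item subset against the full AUDIT-10 total."""
    full_total = audit.sum(axis=1)
    results = []
    for size in sizes:
        for subset in combinations(range(audit.shape[1]), size):
            items = audit[:, list(subset)]
            short_total = items.sum(axis=1)
            r = np.corrcoef(short_total, full_total)[0, 1]
            results.append((subset, cronbach_alpha(items), r**2, r))
    return sorted(results, key=lambda row: -row[2])  # best variance explained first

# usage with simulated responses (640 respondents, 10 items scored 0-4)
rng = np.random.default_rng(0)
audit = rng.integers(0, 5, size=(640, 10))
print(evaluate_short_forms(audit)[0])
```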

  11. Some neutronics and thermal-hydraulics codes for reactor analysis using personal computers

    International Nuclear Information System (INIS)

    Woodruff, W.L.

    1990-01-01

    Some neutronics and thermal-hydraulics codes formerly available only for main frame computers may now be run on personal computers. Brief descriptions of the codes are provided. Running times for some of the codes are compared for an assortment of personal and main frame computers. With some limitations in detail, personal computer versions of the codes can be used to solve many problems of interest in reactor analyses at very modest costs. 11 refs., 4 tabs

  12. Visualization of unsteady computational fluid dynamics

    Science.gov (United States)

    Haimes, Robert

    1994-11-01

    A brief summary of the computer environment used for calculating three-dimensional unsteady Computational Fluid Dynamics (CFD) results is presented. This environment requires a supercomputer as well as massively parallel processors (MPPs); clusters of workstations acting as a single MPP (by concurrently working on the same task) provide the required computational bandwidth for CFD calculations of transient problems. The cluster of reduced instruction set computers (RISC) is a recent advent based on the low cost and high performance that workstation vendors provide. The cluster, with the proper software, can act as a multiple instruction/multiple data (MIMD) machine. A new set of software tools is being designed specifically to address visualizing 3D unsteady CFD results in these environments. Three user's manuals for the parallel version of Visual3, pV3, revision 1.00 make up the bulk of this report.

  13. NFDRSPC: The National Fire-Danger Rating System on a Personal Computer

    Science.gov (United States)

    Bryan G. Donaldson; James T. Paul

    1990-01-01

    This user's guide is an introductory manual for using the 1988 version (Burgan 1988) of the National Fire-Danger Rating System on an IBM PC or compatible computer. NFDRSPC is a window-oriented, interactive computer program that processes observed and forecast weather with fuels data to produce NFDRS indices. Other program features include user-designed display...

  14. Fetomaternal hemorrhage during external cephalic version.

    Science.gov (United States)

    Boucher, Marc; Marquette, Gerald P; Varin, Jocelyne; Champagne, Josette; Bujold, Emmanuel

    2008-07-01

    To estimate the frequency and volume of fetomaternal hemorrhage during external cephalic version for term breech singleton fetuses and to identify risk factors involved with this complication. A prospective observational study was performed including all patients undergoing a trial of external cephalic version for a breech presentation of at least 36 weeks of gestation between 1987 and 2001 in our center. A search for fetal erythrocytes using the standard Kleihauer-Betke test was obtained before and after each external cephalic version. The frequency and volume of fetomaternal hemorrhage were calculated. Putative risk factors for fetomaternal hemorrhage were evaluated by the chi-square test and the Mann-Whitney U test. A Kleihauer-Betke test result was available before and after 1,311 trials of external cephalic version. The Kleihauer-Betke test was positive in 67 (5.1%) before the procedure. Of the 1,244 women with a negative Kleihauer-Betke test before external cephalic version, 30 (2.4%) had a positive Kleihauer-Betke test after the procedure. Ten (0.8%) had an estimated fetomaternal hemorrhage greater than 1 mL, and one (0.08%) had an estimated fetomaternal hemorrhage greater than 30 mL. The risk of fetomaternal hemorrhage was not influenced by parity, gestational age, body mass index, number of attempts at version, placental location, or amniotic fluid index. The risk of detectable fetomaternal hemorrhage during external cephalic version was 2.4%, with fetomaternal hemorrhage more than 30 mL in less than 0.1% of cases. These data suggest that the performance of a Kleihauer-Betke test is unwarranted in uneventful external cephalic version and that in Rh-negative women, no further Rh immune globulin is necessary other than the routine 300-microgram dose at 28 weeks of gestation and postpartum. Level of evidence: II.

  15. French version validation of the psychotic symptom rating scales (PSYRATS) for outpatients with persistent psychotic symptoms

    Directory of Open Access Journals (Sweden)

    Favrod Jerome

    2012-09-01

    Full Text Available Abstract Background Most scales that assess the presence and severity of psychotic symptoms often measure a broad range of experiences and behaviours, something that restricts the detailed measurement of specific symptoms such as delusions or hallucinations. The Psychotic Symptom Rating Scales (PSYRATS) is a clinical assessment tool that focuses on the detailed measurement of these core symptoms. The goal of this study was to examine the psychometric properties of the French version of the PSYRATS. Methods A sample of 103 outpatients suffering from schizophrenia or schizoaffective disorders and presenting persistent psychotic symptoms over the previous three months was assessed using the PSYRATS. Seventy-five sample participants were also assessed with the Positive And Negative Syndrome Scale (PANSS). Results ICCs were above 0.90 for all items of the PSYRATS. Factor analysis replicated the factorial structure of the original version of the delusions scale. Similar to previous replications, the factor structure of the hallucinations scale was partially replicated. Convergent validity indicated that some specific PSYRATS items do not correlate with the PANSS delusions or hallucinations. The distress items of the PSYRATS are negatively correlated with the grandiosity scale of the PANSS. Conclusions The results of this study are limited by the relatively small sample size as well as the selection of participants with persistent symptoms. The French version of the PSYRATS partially replicates previously published results. Differences in factor structure of the hallucinations scale might be explained by greater variability of its elements. The future development of the scale should take into account the presence of grandiosity in order to better capture details of the psychotic experience.

  16. Progress Towards AIRS Science Team Version-7 at SRT

    Science.gov (United States)

    Susskind, Joel; Blaisdell, John; Iredell, Lena; Kouvaris, Louis

    2016-01-01

    The AIRS Science Team Version-6 retrieval algorithm is currently producing level-3 Climate Data Records (CDRs) from AIRS that have been proven useful to scientists in understanding climate processes. CDRs are gridded level-3 products which include all cases passing AIRS Climate QC. SRT has made significant further improvements to AIRS Version-6. At the last Science Team Meeting, we described results using SRT AIRS Version-6.22. SRT Version-6.22 is now an official build at JPL called 6.2.4. Version-6.22 results are significantly improved compared to Version-6, especially with regard to water vapor and ozone profiles. We have adapted AIRS Version-6.22 to run with CrIS/ATMS, at the Sounder SIPS which processed CrIS/ATMS data for August 2014. JPL AIRS Version-6.22 uses the Version-6 AIRS tuning coefficients. AIRS Version-6.22 has at least two limitations which must be improved before finalization of Version-7: Version-6.22 total O3 has spurious high values in the presence of Saharan dust over the ocean; and Version-6.22 retrieved upper stratospheric temperatures are very poor in polar winter. SRT Version-6.28 addresses the first concern. John Blaisdell ran the analog of AIRS Version-6.28 in his own sandbox at JPL for the 14th and 15th of every month in 2014 and all of July and October for 2014. AIRS Version-6.28a is hot off the presses and addresses the second concern.

  17. Intercomparison of ILAS-II version 1.4 and version 2 target parameters with MIPAS-Envisat measurements

    Directory of Open Access Journals (Sweden)

    A. Griesfeller

    2008-02-01

    Full Text Available This paper assesses the mean differences between the two ILAS-II data versions (1.4 and 2) by comparing them with MIPAS measurements made between May and October 2003. For comparison with ILAS-II results, MIPAS data processed at the Institut für Meteorologie und Klimaforschung, Karlsruhe, Germany (IMK) in cooperation with the Instituto de Astrofísica de Andalucía (IAA) in Granada, Spain, were used. The coincidence criteria of ±300 km in space and ±12 h in time for H2O, N2O, and CH4 and the coincidence criteria of ±300 km in space and ±6 h in time for ClONO2, O3, and HNO3 were used. The ILAS-II data were separated into sunrise (= Northern Hemisphere) and sunset (= Southern Hemisphere). For the sunrise data, a clear improvement from version 1.4 to version 2 was observed for H2O, CH4, ClONO2, and O3. In particular, the ILAS-II version 1.4 mixing ratios of H2O and CH4 were unrealistically small, and those of ClONO2 above altitudes of 30 km unrealistically large. For N2O and HNO3, there were no large differences between the two versions. Contrary to the Northern Hemisphere, where some exceptional profiles deviated significantly from known climatology, no such outlying profiles were found in the Southern Hemisphere for both versions. Generally, the ILAS-II version 2 data were in better agreement with the MIPAS data than the version 1.4, and are recommended for quantitative analysis in the stratosphere. For H2O data in the Southern Hemisphere, further data quality evaluation is necessary.
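    The pairing step implied by the coincidence criteria above (±300 km in space, ±12 h or ±6 h in time) can be sketched as a simple matching pass over the two profile sets, with the spatial distance taken along a great circle. The record structure below is an assumption for illustration, not the format of the ILAS-II or MIPAS products.

```python
import math
from datetime import timedelta

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def coincident_pairs(ilas, mipas, max_km=300.0, max_hours=12.0):
    """ilas, mipas: lists of dicts with 'lat', 'lon' and 'time' (datetime)."""
    pairs = []
    for a in ilas:
        for b in mipas:
            close_in_space = haversine_km(a["lat"], a["lon"], b["lat"], b["lon"]) <= max_km
            close_in_time = abs(a["time"] - b["time"]) <= timedelta(hours=max_hours)
            if close_in_space and close_in_time:
                pairs.append((a, b))
    return pairs
```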

  18. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SGI IRIS VERSION)

    Science.gov (United States)

    Rogers, J. L.

    1994-01-01

    effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
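    The two tracing methods described at the start of this abstract, tracing forward from a changed input to the outputs it affects and tracing backward to find which modules must be re-executed, are both reachability queries on the module dependency graph. A minimal sketch with a hypothetical adjacency-list design matrix follows; it is not DeMAID's internal representation.

```python
from collections import deque

def reachable(graph, start):
    """Breadth-first reachability from `start` in a directed graph {node: [successor, ...]}."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {start}

# modules feeding other modules (hypothetical design)
feeds = {"geometry": ["aero", "structures"], "aero": ["performance"],
         "structures": ["performance"], "performance": []}
# forward trace: what is affected if 'geometry' changes
print(reachable(feeds, "geometry"))
# backward trace: reverse the edges, then ask what 'performance' depends on
reverse = {m: [n for n in feeds if m in feeds[n]] for m in feeds}
print(reachable(reverse, "performance"))
```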

  19. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FY's 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system which has been designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes. This report is Volume 2 of the three volume documentation of the Seismic Module of CARES and represents the User's Manual. 14 refs

  20. Investigating the reliability and validity of the Dutch versions of the illness management and recovery scales among clients with mental disorders.

    Science.gov (United States)

    Goossens, Peter J J; Beentjes, Titus A A; Knol, Suzanne; Salyers, Michelle P; de Vries, Sjoerd J

    2017-12-01

    The Illness Management and Recovery scales (IMRS) can measure the progress of clients' illness self-management and recovery. Previous studies have examined the psychometric properties of the IMRS. This study examined the reliability and validity of the Dutch version of the IMRS. Clients (n = 111) and clinicians (n = 40) completed the client and clinician versions of the IMRS, respectively. The scales were administered again 2 weeks later to assess stability over time. Validity was assessed with the Utrecht Coping List (UCL), Dutch Empowerment Scale (DES), and Brief Symptom Inventory (BSI). The client and clinician versions of the IMRS had moderate internal reliability, with α = 0.69 and 0.71, respectively. The scales showed strong test-retest reliability, r = 0.79, for the client version and r = 0.86 for the clinician version. Correlations between client and clinician versions ranged from r = 0.37 to 0.69 for the total and subscales. We also found relationships in expected directions between the client IMRS and UCL, DES and BSI, which supports validity of the Dutch version of the IMRS. The Dutch version of the IMRS demonstrated good reliability and validity. The IMRS could be useful for Dutch-speaking programs interested in evaluating client progress on illness self-management and recovery.

  1. Development of a Chinese version of the Oswestry Disability Index version 2.1.

    Science.gov (United States)

    Lue, Yi-Jing; Hsieh, Ching-Lin; Huang, Mao-Hsiung; Lin, Gau-Tyan; Lu, Yen-Mou

    2008-10-01

    Cross-cultural adaptation and cross-sectional psychometric testing in a convenience sample of patients with low back pain. To translate and culturally adapt the Oswestry Disability Index version 2.1 (ODI 2.1) into a Mandarin Chinese version and to assess its reliability and validity. The Chinese ODI 2.1 has not been developed and validated. The ODI 2.1 was translated and culturally adapted to the Chinese version. The validity of the translated Chinese version was assessed by examining the relationship between the ODI and other well-known measures. Test-retest reliability was examined in 52 of these patients, who completed a second questionnaire within 1 week. Internal consistency of the ODI 2.1 was excellent with Cronbach's alpha = 0.903. The intraclass correlation coefficient of test-retest reliability was 0.89. The minimal detectable change was 12.8. The convergent validity of the Chinese ODI is supported by its high correlation with other physical functional status measures (Roland Morris Disability Questionnaire and SF-36 physical functioning subscale, r = 0.76 and -0.75, respectively), and moderate correlation with other measures (Visual Analogue Scale, r = 0.68) and certain SF-36 subscales (role-physical, bodily pain, and social functioning, r range: -0.49 to -0.57). As expected, the ODI was least correlated with nonfunctional measures (SF-36 mental subscale and role-emotional subscale, r = -0.25 and -0.33, respectively). The results of this study indicate that the Chinese version of the ODI 2.1 is a reliable and valid instrument for the measurement of functional status in patients with low back pain.
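    The reliability figures reported above (Cronbach's alpha, the test-retest ICC and the minimal detectable change) are linked by standard formulas: SEM = SD * sqrt(1 - ICC) and MDC95 = 1.96 * sqrt(2) * SEM. The sketch below shows that arithmetic; the baseline standard deviation is assumed for illustration because the abstract does not state it.

```python
import math

def minimal_detectable_change(sd, icc, confidence_z=1.96):
    """MDC at roughly 95% confidence from test-retest reliability:
    SEM = SD * sqrt(1 - ICC); MDC = z * sqrt(2) * SEM."""
    sem = sd * math.sqrt(1.0 - icc)
    return confidence_z * math.sqrt(2.0) * sem

# ICC = 0.89 as reported; the ODI score SD of 14 points is an assumption
print(round(minimal_detectable_change(sd=14.0, icc=0.89), 1))
```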

  2. The version control service for ATLAS data acquisition configuration files

    CERN Document Server

    Soloviev, Igor; The ATLAS collaboration

    2012-01-01

    To configure a data-taking session, the ATLAS systems and detectors store more than 160 MBytes of data acquisition related configuration information in OKS XML files [1]. The total number of files exceeds 1300 and they are updated by many system experts. In the past, such updates occasionally caused problems due to XML syntax errors or files left in a state inconsistent with the overall ATLAS configuration, and it was not always possible to know who made the modification causing the problems or how to go back to a previous version of the modified file. A few years ago a special service addressing these issues was implemented and deployed on ATLAS Point-1. It excludes direct write access to XML files stored in a central database repository. Instead, for an update the files are copied into a user repository, validated after modifications and committed using a version control system. The system's callback updates the central repository. Also, it keeps track of all modifications pro...
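    The update path described (copy the file into a user workspace, validate it after editing, commit it under version control, and let a callback refresh the central repository) can be sketched generically as below. This is an illustrative outline only, not the ATLAS service: the validator, repository layout, author format and use of git are placeholders.

```python
import shutil, subprocess
from pathlib import Path
from xml.etree import ElementTree

def update_config(central_repo: Path, user_repo: Path, filename: str, author: str):
    """Copy a configuration file into a user workspace, validate it after editing,
    and commit it so the change stays traceable to its author."""
    work_copy = user_repo / filename
    shutil.copy(central_repo / filename, work_copy)

    # ... the system expert edits work_copy here ...

    ElementTree.parse(work_copy)  # reject files with XML syntax errors
    subprocess.run(["git", "-C", str(user_repo), "add", filename], check=True)
    subprocess.run(["git", "-C", str(user_repo), "commit", "-m", f"update {filename}",
                    f"--author={author} <{author}@example.org>"], check=True)
    # a post-commit hook (the 'callback') would then propagate the change centrally
```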

  3. Japanese evaluated nuclear data library version 3, JENDL-3

    International Nuclear Information System (INIS)

    Asami, Tetsuo; Igarashi, Shun-ichi; Ihara, Hitoshi

    1989-01-01

    The third version of the Japanese Evaluated Nuclear Data Library, JENDL-3, has recently been compiled and issued. The major features of JENDL-3 are that it covers a much larger number of nuclides than the previous versions, that the evaluation is made over all energy regions from 10^-5 eV to 20 MeV using proper nuclear theories for each of the energy regions, and that it contains data on gamma ray generation. The present report first gives an outline of JENDL-3, and then describes the evaluation of nuclear data covering light nuclides, nuclides in structural materials, fission product nuclides, major actinide nuclides, transplutonium nuclides, and gamma ray generation. The applicability of JENDL-3 is examined through a variety of benchmark tests covering the fast reactor, thermal neutron reactor, shielding, neutronics of nuclear fusion reactor, dosimetry, and gamma ray generation data. The report also describes a library for calculation of decay heat, which has close relations with JENDL-3. It has been demonstrated that the experimental data given in this library are very reliable with high reproducibility. Additional activities planned for the future are also outlined briefly. (N.K.)

  4. High-performance computational fluid dynamics: a custom-code approach

    International Nuclear Information System (INIS)

    Fannon, James; Náraigh, Lennon Ó; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain

    2016-01-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier–Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing. (paper)
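    One of the benchmark cases mentioned, laminar pressure-driven channel flow, has the analytic plane Poiseuille solution u(y) = G y (h - y) / (2 mu) for a channel of height h under pressure gradient G = -dp/dx, which makes it a convenient validation target. The sketch below compares a computed profile against that solution; the computed array is fabricated here purely to show the comparison step, and is not TPLS output.

```python
import numpy as np

def poiseuille_profile(y, channel_height, pressure_gradient, viscosity):
    """Analytic laminar velocity profile for plane Poiseuille flow:
    u(y) = G * y * (h - y) / (2 * mu), with G = -dp/dx."""
    return pressure_gradient * y * (channel_height - y) / (2.0 * viscosity)

# validate a (hypothetical) computed profile against the analytic one
h, G, mu = 1.0, 1.0, 1e-2
y = np.linspace(0.0, h, 65)
u_exact = poiseuille_profile(y, h, G, mu)
u_computed = u_exact + 1e-4 * np.random.default_rng(1).standard_normal(y.size)
print("max relative error:", np.max(np.abs(u_computed - u_exact)) / u_exact.max())
```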

  5. High-performance computational fluid dynamics: a custom-code approach

    Science.gov (United States)

    Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.

    2016-07-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing.

  6. Comprehensive Application of the International Classification of Headache Disorders Third Edition, Beta Version.

    Science.gov (United States)

    Kim, Byung-Kun; Cho, Soo-Jin; Kim, Byung-Su; Sohn, Jong-Hee; Kim, Soo-Kyoung; Cha, Myoung-Jin; Song, Tae-Jin; Kim, Jae-Moon; Park, Jeong Wook; Chu, Min Kyung; Park, Kwang-Yeol; Moon, Heui-Soo

    2016-01-01

    The purpose of this study was to test the feasibility and usefulness of the International Classification of Headache Disorders, third edition, beta version (ICHD-3β), and compare the differences with the International Classification of Headache Disorders, second edition (ICHD-2). Consecutive first-visit patients were recruited from 11 headache clinics in Korea. Headache classification was performed in accordance with ICHD-3β. The characteristics of headaches were analyzed and the feasibility and usefulness of this version was assessed by the proportion of unclassified headache disorders compared with ICHD-2. A total of 1,627 patients were enrolled (mean age, 47.4±14.7 yr; 62.8% female). Classification by ICHD-3β was achieved in 97.8% of headache patients, whereas 90.0% could be classified by ICHD-2. Primary headaches (n=1,429, 87.8%) were classified as follows: 697 migraines, 445 tension-type headaches, 22 cluster headaches, and 265 other primary headache disorders. Secondary headache or painful cranial neuropathies/other facial pains were diagnosed in 163 patients (10.0%). Only 2.2% were not classified by ICHD-3β. The main reasons for missing classifications were insufficient information (1.6%) or absence of suitable classification (0.6%). The diagnoses differed from those using ICHD-2 in 243 patients (14.9%). Among them, 165 patients were newly classified from unclassified with ICHD-2 because of the relaxation of the previous strict criteria or the introduction of a new diagnostic category. ICHD-3β would yield a higher classification rate than its previous version, ICHD-2. ICHD-3β is applicable in clinical practice for first-visit headache patients of a referral hospital.

  7. So, you are buying your first computer.

    Science.gov (United States)

    Ferrara-Love, R

    1999-06-01

    Buying your first computer need not be that complicated. The first thing that is needed is an understanding of what you want and need the computer for. By making a list of the various essentials, you will be on your way to purchasing that computer. Once that is completed, you will need an understanding of what each of the components of the computer is, how it works, and what options you have. This way, you will be better able to discuss your needs with the salesperson. The focus of this article is limited to personal computers or PCs (i.e., IBMs [Armonk, NY], IBM clones, Compaq [Houston, TX], Gateway [North Sioux City, SD], and so on). I am not including Macintosh or Apple [Cupertino, CA] in this discussion; most software is made exclusively for personal computers, or at least reaches the market for personal computers before becoming available in a Macintosh version.

  8. Core 2D. A code for non-isothermal water flow and reactive solute transport. Users manual version 2

    Energy Technology Data Exchange (ETDEWEB)

    Samper, J.; Juncosa, R.; Delgado, J.; Montenegro, L. [Universidad de A Coruna (Spain)

    2000-07-01

    Understanding natural groundwater quality patterns, quantifying groundwater pollution and assessing the effects of waste disposal require modeling tools accounting for water flow, and transport of heat and dissolved species, as well as their complex interactions with solid and gas phases. This report contains the users manual of CORE2D Version 2.0, a COde for modeling water flow (saturated and unsaturated), heat transport and multicomponent REactive solute transport under both local chemical equilibrium and kinetic conditions. It is an updated and improved version of CORE-LE-2D V0 (Samper et al., 1988), which in turn is an extended version of TRANQUI, a previous reactive transport code (ENRESA, 1995). All these codes were developed within the context of Research Projects funded by ENRESA and the European Commission. (Author)

  9. Core2D. A code for non-isothermal water flow and reactive solute transport. Users manual version 2

    International Nuclear Information System (INIS)

    Samper, J.; Juncosa, R.; Delgado, J.; Montenegro, L.

    2000-01-01

    Understanding natural groundwater quality patterns, quantifying groundwater pollution and assessing the effects of waste disposal require modeling tools accounting for water flow, and transport of heat and dissolved species, as well as their complex interactions with solid and gas phases. This report contains the users manual of CORE2D Version 2.0, a COde for modeling water flow (saturated and unsaturated), heat transport and multicomponent REactive solute transport under both local chemical equilibrium and kinetic conditions. It is an updated and improved version of CORE-LE-2D V0 (Samper et al., 1988), which in turn is an extended version of TRANQUI, a previous reactive transport code (ENRESA, 1995). All these codes were developed within the context of Research Projects funded by ENRESA and the European Commission. (Author)

  10. Core 2D. A code for non-isothermal water flow and reactive solute transport. Users manual version 2

    Energy Technology Data Exchange (ETDEWEB)

    Samper, J; Juncosa, R; Delgado, J; Montenegro, L [Universidad de A Coruna (Spain)

    2000-07-01

    Understanding natural groundwater quality patterns, quantifying groundwater pollution, and assessing the effects of waste disposal require modeling tools that account for water flow and for the transport of heat and dissolved species, as well as their complex interactions with solid and gas phases. This report contains the user's manual of CORE2D Version 2.0, a COde for modeling water flow (saturated and unsaturated), heat transport, and multicomponent Reactive solute transport under both local chemical equilibrium and kinetic conditions. It is an updated and improved version of CORE-LE-2D V0 (Samper et al., 1988), which in turn is an extended version of TRANQUI, a previous reactive transport code (ENRESA, 1995). All these codes were developed within the context of research projects funded by ENRESA and the European Commission. (Author)

  11. Sorting on STAR. [CDC computer algorithm timing comparison

    Science.gov (United States)

    Stone, H. S.

    1978-01-01

    Timing comparisons are given for three sorting algorithms written for the CDC STAR computer. One algorithm is Hoare's (1962) Quicksort, which is the fastest or nearly the fastest sorting algorithm for most computers. A second algorithm is a vector version of Quicksort that takes advantage of the STAR's vector operations. The third algorithm is an adaptation of Batcher's (1968) sorting algorithm, which makes especially good use of vector operations but has a complexity of N(log N)-squared as compared with a complexity of N log N for the Quicksort algorithms. In spite of its worse complexity, Batcher's sorting algorithm is competitive with the serial version of Quicksort for vectors up to the largest that can be treated by STAR. Vector Quicksort outperforms the other two algorithms and is generally preferred. These results indicate that unusual instruction sets can introduce biases in program execution time that counter results predicted by worst-case asymptotic complexity analysis.
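
    To make the contrast concrete, here is a hedged Python sketch (not the STAR code from the study) of Batcher's bitonic sort; its fixed, data-independent compare-exchange pattern is what maps so well onto vector operations, and it assumes the input length is a power of two.

      def bitonic_merge(a, ascending):
          """Merge a bitonic sequence into sorted order via compare-exchange sweeps."""
          if len(a) <= 1:
              return a
          half = len(a) // 2
          lo, hi = list(a[:half]), list(a[half:])
          for i in range(half):                        # one sweep of independent compare-exchanges
              if (lo[i] > hi[i]) == ascending:
                  lo[i], hi[i] = hi[i], lo[i]
          return bitonic_merge(lo, ascending) + bitonic_merge(hi, ascending)

      def bitonic_sort(a, ascending=True):
          """Batcher-style sort; len(a) must be a power of two."""
          if len(a) <= 1:
              return list(a)
          half = len(a) // 2
          first = bitonic_sort(a[:half], True)         # ascending half
          second = bitonic_sort(a[half:], False)       # descending half completes a bitonic sequence
          return bitonic_merge(first + second, ascending)

      print(bitonic_sort([7, 3, 9, 1, 4, 8, 2, 6]))    # [1, 2, 3, 4, 6, 7, 8, 9]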

  12. Reliability and validity of the Japanese version of the Resilience Scale and its short version.

    Science.gov (United States)

    Nishi, Daisuke; Uehara, Ritei; Kondo, Maki; Matsuoka, Yutaka

    2010-11-17

    The clinical relevance of resilience has received considerable attention in recent years. The aim of this study was to demonstrate the reliability and validity of the Japanese version of the Resilience Scale (RS) and the short version of the RS (RS-14). The original English version of the RS was translated into Japanese and the Japanese version was confirmed by back-translation. Participants were 430 nursing and university psychology students. The RS, Center for Epidemiologic Studies Depression Scale (CES-D), Rosenberg Self-Esteem Scale (RSES), Social Support Questionnaire (SSQ), Perceived Stress Scale (PSS), and Sheehan Disability Scale (SDS) were administered. Internal consistency, convergent validity, and factor loadings were assessed at the initial assessment. Test-retest reliability was assessed using data collected from 107 students 3 months after baseline. The mean score on the RS was 111.19. Cronbach's alpha coefficients for the RS and RS-14 were 0.90 and 0.88, respectively. The test-retest correlation coefficients for the RS and RS-14 were 0.83 and 0.84, respectively. Both the RS and RS-14 were negatively correlated with the CES-D and SDS, and positively correlated with the RSES, SSQ, and PSS (all statistically significant); the RS thus showed good internal consistency and test-retest reliability, and relatively low concurrent validity. The RS-14 was equivalent to the RS in internal consistency, test-retest reliability, and concurrent validity. Low scores on the RS, a positive correlation between the RS and perceived stress, and a relatively low correlation between the RS and depressive symptoms in this study suggest that the validity of the Japanese version of the RS might be relatively low compared with the original English version.
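
    For readers unfamiliar with the internal-consistency statistic quoted above, this small Python sketch (illustrative only; the simulated item data have nothing to do with this study) computes Cronbach's alpha from a respondents-by-items matrix.

      import numpy as np

      def cronbach_alpha(items):
          """items: 2-D array with rows = respondents and columns = scale items."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]                            # number of items
          item_vars = items.var(axis=0, ddof=1)         # variance of each item
          total_var = items.sum(axis=1).var(ddof=1)     # variance of the summed scale score
          return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

      rng = np.random.default_rng(0)
      latent = rng.normal(size=(200, 1))                 # one underlying trait
      items = latent + 0.8 * rng.normal(size=(200, 10))  # ten noisy items measuring it
      print(round(cronbach_alpha(items), 2))             # high alpha, since items share the trait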

  13. Computing and Displaying Isosurfaces in R

    Directory of Open Access Journals (Sweden)

    Dai Feng

    2008-09-01

    This paper presents R utilities for computing and displaying isosurfaces, or three-dimensional contour surfaces, from a three-dimensional array of function values. A version of the marching cubes algorithm that takes into account face and internal ambiguities is used to compute the isosurfaces. Vectorization is used to ensure adequate performance using only R code. Examples are presented showing contours of theoretical densities, density estimates, and medical imaging data. Rendering can use the rgl package or standard or grid graphics, and a set of tools for representing and rendering surfaces using standard or grid graphics is presented.
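
    As a rough Python analogue of the same idea (using scikit-image's marching_cubes in place of the R utilities described in the record; the grid size and isovalue here are arbitrary):

      import numpy as np
      from skimage import measure   # pip install scikit-image

      # Evaluate a function on a 3-D grid: here, distance from the origin
      x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
      vol = np.sqrt(x**2 + y**2 + z**2)

      # Extract the isosurface vol == 0.5 (a sphere) as a triangle mesh
      verts, faces, normals, values = measure.marching_cubes(vol, level=0.5)
      print(verts.shape, faces.shape)   # mesh vertices and face indices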

  14. Test documentation for the GENII Software Version 1.485

    International Nuclear Information System (INIS)

    Rittmann, P.D.

    1994-01-01

    Version 1.485 of the GENII software was released by the PNL GENII custodian in December of 1990. At that time the WHC GENII custodian performed several tests to verify that the advertised revisions were indeed present and that these changes had not introduced errors into the calculations normally done by WHC. These tests were not documented at that time. The purpose of this document is to summarize suitable acceptance tests of GENII and to compare their results with a few hand calculations. The testing is not as thorough as that used by the PNL GENII custodian, but it is sufficient to establish that the GENII program appears to work correctly on WHC-managed personal computers.

  15. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FYs 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes; this report is Volume 3. It presents three sample problems typically encountered in soil-structure interaction analyses. 14 refs., 36 figs., 2 tabs.

  16. TWOS - TIME WARP OPERATING SYSTEM, VERSION 2.5.1

    Science.gov (United States)

    Bellenot, S. F.

    1994-01-01

    The Time Warp Operating System (TWOS) is a special-purpose operating system designed to support parallel discrete-event simulation. TWOS is a complete implementation of the Time Warp mechanism, a distributed protocol for virtual time synchronization based on process rollback and message annihilation. Version 2.5.1 supports simulations and other computations using both virtual time and dynamic load balancing; it does not support general time-sharing or multi-process jobs using conventional message synchronization and communication. The program utilizes the underlying operating system's resources. TWOS runs a single simulation at a time, executing it concurrently on as many processors of a distributed system as are allocated. The simulation needs only to be decomposed into objects (logical processes) that interact through time-stamped messages. TWOS provides transparent synchronization. The user does not have to add any more special logic to aid in synchronization, nor give any synchronization advice, nor even understand much about how the Time Warp mechanism works. The Time Warp Simulator (TWSIM) subdirectory contains a sequential simulation engine that is interface compatible with TWOS. This means that an application designer and programmer who wish to use TWOS can prototype code on TWSIM on a single processor and/or workstation before having to deal with the complexity of working on a distributed system. TWSIM also provides statistics about the application which may be helpful for determining the correctness of an application and for achieving good performance on TWOS. Version 2.5.1 has an updated interface that is not compatible with 2.0. The program's user manual assists the simulation programmer in the design, coding, and implementation of discrete-event simulations running on TWOS. The manual also includes a practical user's guide to the TWOS application benchmark, Colliding Pucks. TWOS supports simulations written in the C programming language. It is designed
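
    To make the rollback mechanism concrete, here is a minimal Python sketch of a single Time Warp style logical process (illustrative only; it is not TWOS code, and it omits anti-messages, fossil collection, and inter-processor communication).

      class LogicalProcess:
          """Executes events optimistically and rolls back when a straggler message arrives."""

          def __init__(self):
              self.lvt = 0.0                   # local virtual time
              self.state = 0                   # toy state: number of events handled
              self.checkpoints = [(0.0, 0)]    # saved (virtual time, state) pairs
              self.executed = []               # timestamps of events already executed

          def _handle(self, t):
              self.state += 1                  # stand-in for real event processing
              self.lvt = t
              self.checkpoints.append((self.lvt, self.state))
              self.executed.append(t)

          def receive(self, t):
              if t >= self.lvt:                # message in the future: execute optimistically
                  self._handle(t)
                  return
              # Straggler: restore the last checkpoint at or before t, then re-execute
              # the undone events in timestamp order.
              i = max(k for k, (ct, _) in enumerate(self.checkpoints) if ct <= t)
              self.lvt, self.state = self.checkpoints[i]
              self.checkpoints = self.checkpoints[:i + 1]
              redo = [ts for ts in self.executed if ts > t]
              self.executed = [ts for ts in self.executed if ts <= t]
              self._handle(t)
              for ts in redo:
                  self._handle(ts)

      lp = LogicalProcess()
      for t in (1.0, 2.0, 5.0, 3.0):           # the 3.0 message arrives late and forces a rollback
          lp.receive(t)
      print(lp.lvt, lp.state)                  # 5.0 4 -- same result as timestamp-ordered execution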

  17. Introducing external cephalic version in a Malaysian setting.

    Science.gov (United States)

    Yong, Stephen P Y

    2007-02-01

    To assess the outcome of external cephalic version for the routine management of malpresenting foetuses at term. Prospective observational study. Tertiary teaching hospital, Malaysia. From September 2003 to June 2004, a study involving 41 pregnant women with malpresentation at term was undertaken. An external cephalic version protocol was implemented. Data were collected to identify characteristics associated with success or failure of external cephalic version. Maternal and foetal outcome measures included the success rate of external cephalic version, maternal and foetal complications, and characteristics associated with success or failure: engagement of the presenting part, placental location, direction of version, attempts at version, use of an intravenous tocolytic agent, eventual mode of delivery, Apgar scores, birth weights, and maternal satisfaction with the procedure. Data were available for 38 women. External cephalic version was successful in 63% of patients, the majority (75%) of whom achieved a vaginal delivery. Multiparity (odds ratio=34.0; 95% confidence interval, 0.67-1730) and a high amniotic fluid index (4.9; 1.3-18.2) were associated with successful external cephalic version. Engagement of the presenting part (odds ratio=0.0001; 95% confidence interval, 0.00001-0.001) and a need to resort to a backward somersault (0.02; 0.00001-0.916) were associated with poor success rates. The emergency caesarean section rate for foetal distress directly resulting from external cephalic version was 8%, but there was no perinatal or maternal adverse outcome. The majority (74%) of women were satisfied with external cephalic version. External cephalic version has acceptable success rates. Multiparity, liquor volume, engagement of the presenting part, and the need for a backward somersault were strong predictors of outcome. External cephalic version is relatively safe, simple to learn and perform, and associated with maternal satisfaction. Modern obstetric units should routinely offer the

  18. Third International Joint Conference on Computational Intelligence (IJCCI 2011)

    CERN Document Server

    Dourado, António; Rosa, Agostinho; Filipe, Joaquim; Computational Intelligence

    2013-01-01

    The present book includes a set of selected extended papers from the Third International Joint Conference on Computational Intelligence (IJCCI 2011), held in Paris, France, from 24 to 26 October 2011. The conference was composed of three co-located conferences: the International Conference on Fuzzy Computation (ICFC), the International Conference on Evolutionary Computation (ICEC), and the International Conference on Neural Computation (ICNC). Recent progress in scientific developments and applications in these three areas is reported in this book. IJCCI received 283 submissions from 59 countries, across all continents. This book includes the revised and extended versions of a strict selection of the best papers presented at the conference.

  19. External cephalic version-related risks: a meta-analysis.

    Science.gov (United States)

    Grootscholten, Kim; Kok, Marjolein; Oei, S Guid; Mol, Ben W J; van der Post, Joris A

    2008-11-01

    To systematically review the literature on external cephalic version-related complications and to assess whether the outcome of a version attempt is related to complications. In March 2007 we searched MEDLINE, EMBASE, and the Cochrane Central Register of Controlled Trials. Studies reporting on complications from an external cephalic version attempt for singleton breech pregnancies after 36 weeks of pregnancy were selected. We calculated odds ratios (ORs) from studies that reported both on complications and on the position of the fetus immediately after the procedure. We found 84 studies, reporting on 12,955 version attempts, that described external cephalic version-related complications. The pooled complication rate was 6.1% (95% confidence interval [CI] 4.7-7.8), 0.24% for serious complications (95% CI 0.17-0.34), and 0.35% for emergency cesarean deliveries (95% CI 0.26-0.47). Complications were not related to external cephalic version outcome (OR 1.2, 95% CI 0.93-1.7). External cephalic version is a safe procedure. Complications are not related to the fetal position after external cephalic version.

  20. Inclusion in the Workplace - Text Version | NREL

    Science.gov (United States)

    This is the text version for the Inclusion: Leading by Example video. I'm Martin Keller. I'm the director of the laboratory. Another very important element in inclusion is diversity. Because if we have a