WorldWideScience

Sample records for program computer modeling

  1. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for compositionally developing asynchronous and concurrent programs that make frequent use of bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized for and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and the FIFO buffers connected to them. It can express bit-level computation naturally and supports compositional development. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for control and operation can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have at most three terminals and four ordered rules, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can concisely express certain kinds of computation, such as synchronization and bidirectional communication. The model's properties make it well suited to compositional bit-level computation, since its uniform computational elements are sufficient to build components with practical functionality. Through future application of the model, our research may enable further work on a base model of fine-grained parallel computer architecture, since the model is suitable for expressing massive concurrency as a network of primitives.
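
    As a rough illustration of the dataflow-like style described above (and not of APEC's actual primitives, rules, or carrier semantics), the following minimal Python sketch connects a bit-level "primitive" to FIFO buffers; all class and method names are hypothetical.

      # Illustrative toy only: a bit-level primitive joined to FIFO buffers,
      # loosely in the spirit of the A-BITS/APEC description above.
      from collections import deque

      class Fifo:
          def __init__(self):
              self.q = deque()
          def put(self, bit):
              self.q.append(bit)
          def get(self):
              return self.q.popleft()

      class AndPrimitive:
          """A primitive with two input terminals and one output terminal."""
          def __init__(self, in_a, in_b, out):
              self.in_a, self.in_b, self.out = in_a, in_b, out
          def step(self):
              # Fire only when a bit (a "carrier") waits on both inputs.
              if self.in_a.q and self.in_b.q:
                  self.out.put(self.in_a.get() & self.in_b.get())

      a, b, y = Fifo(), Fifo(), Fifo()
      gate = AndPrimitive(a, b, y)
      for bit_a, bit_b in [(1, 1), (1, 0), (0, 1)]:
          a.put(bit_a); b.put(bit_b); gate.step()
      print(list(y.q))  # [1, 0, 0]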

  2. Description of mathematical models and computer programs

    International Nuclear Information System (INIS)

    1977-01-01

    The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed describe the material flows, facility construction schedules, capital investment schedules and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules etc.) are subjected to economic optimization procedures to determine the approximate lowest-cost plans from among the available feasible alternatives.

  3. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem...... conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined...... by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  4. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level......., by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only...... executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways and without arcane encodings of data and algorithm); it is also uniform: new “hardware” is not needed to solve new problems; and (last but not least) it is Turing complete...

  5. More scalability, less pain: A simple programming model and its implementation for extreme computing

    International Nuclear Information System (INIS)

    Lusk, E.L.; Pieper, S.C.; Butler, R.M.

    2010-01-01

    This is the story of a simple programming model, its implementation for extreme computing, and a breakthrough in nuclear physics. A critical issue for the future of high-performance computing is the programming model to use on next-generation architectures. Described here is a promising approach: program very large machines by combining a simplified programming model with a scalable library implementation. The presentation takes the form of a case study in nuclear physics. The chosen application addresses fundamental issues in the origins of our Universe, while the library developed to enable this application on the largest computers may have applications beyond this one.
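
    The abstract does not spell out the library's interface; purely as a hedged illustration of the manager/worker, task-pool style of programming it alludes to, the Python sketch below distributes independent work items to a pool of workers. The workload function is invented.

      # Generic illustration only: a task-pool style of programming, the kind
      # of simplified model the abstract alludes to. This is not the library
      # described in the report; names here are hypothetical.
      from multiprocessing import Pool

      def evaluate_configuration(seed):
          # Stand-in for an expensive physics kernel (hypothetical workload).
          total = 0.0
          for i in range(1, 10_000):
              total += ((seed * i) % 7) / i
          return seed, total

      if __name__ == "__main__":
          work_items = range(32)              # the pool of work to be balanced
          with Pool(processes=4) as pool:     # workers pull items as they free up
              for seed, value in pool.imap_unordered(evaluate_configuration, work_items):
                  print(f"task {seed:2d} -> {value:.3f}")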

  6. Mathematical models and algorithms for the computer program 'WOLF'

    International Nuclear Information System (INIS)

    Halbach, K.

    1975-12-01

    The computer program FLOW finds the nonrelativistic self-consistent set of two-dimensional ion trajectories and electric fields (including space charges from ions and electrons) for a given set of initial and boundary conditions for the particles and fields. The combination of FLOW with the optimization code PISA gives the program WOLF, which finds the shape of the emitter that is consistent with the plasma forming it, and in addition varies physical characteristics such as electrode position, shapes, and potentials so that some performance characteristics are optimized. The motivation for developing these programs was the desire to design optimum ion source extractor/accelerator systems in a systematic fashion. The purpose of this report is to explain and derive the mathematical models and algorithms which approximate the real physical processes. It serves primarily to document the computer programs. 10 figures

  7. What do reversible programs compute?

    DEFF Research Database (Denmark)

    Axelsen, Holger Bock; Glück, Robert

    2011-01-01

    Reversible computing is the study of computation models that exhibit both forward and backward determinism. Understanding the fundamental properties of such models is not only relevant for reversible programming, but has also been found important in other fields, e.g., bidirectional model...... transformation, program transformations such as inversion, and general static prediction of program properties. Historically, work on reversible computing has focussed on reversible simulations of irreversible computations. Here, we take the viewpoint that the property of reversibility itself should...... are not strictly classically universal, but that they support another notion of universality; we call this RTM-universality. Thus, even though the RTMs are sub-universal in the classical sense, they are powerful enough as to include a self-interpreter. Lifting this to other computation models, we propose r...
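
    As a minimal sketch of forward and backward determinism (not of the reversible Turing machines studied in the paper), the following Python fragment pairs an injective update with its exact inverse, so every earlier state can be recovered.

      # Minimal sketch, not the RTM formalism from the paper: a computation
      # step that is both forward and backward deterministic, so running it
      # in reverse recovers every earlier state exactly.
      def step(state):
          x, acc = state
          return (x + 1, acc + x)        # injective update: old state is recoverable

      def step_back(state):
          x, acc = state
          return (x - 1, acc - (x - 1))  # exact inverse of step

      s = (0, 0)
      history = [s]
      for _ in range(5):
          s = step(s)
          history.append(s)

      # Undo everything; backward determinism means we revisit the same states.
      for expected in reversed(history[:-1]):
          s = step_back(s)
          assert s == expected
      print("reversed back to", s)  # (0, 0)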

  8. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    The aim of this study is to present an approach to the introduction to pipeline and parallel computing, using a model of the multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included in the curriculum. At the same time, the topic is among the most motivating tasks due to its comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. It allows the implementation of an educational platform for a constructivist learning process, enabling learners to experiment with the provided programming models, to acquire competences in modern scientific research and computational thinking, and to capture the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The C programming language was chosen for developing the programming models, with the Message Passing Interface (MPI) and OpenMP as the parallelization tools.
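
    Purely as a hedged, language-agnostic illustration of the pipeline idea (the article itself builds its learning objects in C with MPI and OpenMP), the following Python sketch chains two stages through queues, each stage acting as a service phase.

      # Hedged illustration of the pipeline idea only: stages connected by
      # queues, each stage a "service phase" of a simple multiphase system.
      import queue, threading

      def stage(name, fn, q_in, q_out):
          while True:
              item = q_in.get()
              if item is None:           # poison pill: shut the stage down
                  if q_out is not None:
                      q_out.put(None)
                  break
              result = fn(item)
              if q_out is not None:
                  q_out.put(result)
              else:
                  print(f"{name}: {result}")

      q1, q2 = queue.Queue(), queue.Queue()
      threads = [
          threading.Thread(target=stage, args=("square", lambda x: x * x, q1, q2)),
          threading.Thread(target=stage, args=("report", lambda x: f"done {x}", q2, None)),
      ]
      for t in threads: t.start()
      for job in range(5): q1.put(job)
      q1.put(None)                       # propagate shutdown through the pipeline
      for t in threads: t.join()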

  9. CORCON: a computer program for modelling molten fuel/concrete interactions

    International Nuclear Information System (INIS)

    Muir, J.F.

    1980-01-01

    A computer program modelling the interaction between molten core materials and structural concrete is being developed to provide a capability for making quantitative estimates of reactor fuel-melt accidents. The principal phenomenological models, inter-component heat transfer, concrete erosion, and melt/gas chemical reactions, are described. A code test comparison calculation is discussed

  10. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  11. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  12. Generalized fish life-cycle population model and computer program

    International Nuclear Information System (INIS)

    DeAngelis, D.L.; Van Winkle, W.; Christensen, S.W.; Blum, S.R.; Kirk, B.L.; Rust, B.W.; Ross, C.

    1978-03-01

    A generalized fish life-cycle population model and computer program have been prepared to evaluate the long-term effect of changes in mortality in age class 0. The general question concerns what happens to a fishery when density-independent sources of mortality are introduced that act on age class 0, particularly entrainment and impingement at power plants. This paper discusses the model formulation and computer program, including sample results. The population model consists of a system of difference equations involving age-dependent fecundity and survival. The fecundity for each age class is assumed to be a function of both the fraction of females sexually mature and the weight of females as they enter each age class. Natural mortality for age classes 1 and older is assumed to be independent of population size. Fishing mortality is assumed to vary with the number and weight of fish available to the fishery. Age class 0 is divided into six life stages. The probability of survival for age class 0 is estimated considering both density-independent mortality (natural and power plant) and density-dependent mortality for each life stage. Two types of density-dependent mortality are included. These are cannibalism of each life stage by older age classes and intra-life-stage competition
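
    A hedged numerical sketch of the kind of age-structured difference equations the abstract describes is given below in Python; the parameter values and the Ricker-type form chosen for the density-dependent age-0 survival are illustrative assumptions, not the report's.

      # Hedged sketch of age-structured difference equations; all numbers and
      # the density-dependence form are invented for illustration.
      import numpy as np

      # n[i] = abundance of age class i+1 (ages 1..5); age class 0 appears
      # only through its survival into age class 1, as in the abstract.
      fecundity = np.array([0.0, 50.0, 120.0, 200.0, 250.0])  # eggs per female entering each age
      survival  = np.array([0.60, 0.70, 0.75, 0.80])          # survival from age i to i+1 (ages 1-4)
      alpha, beta = 5e-2, 1e-6                                 # age-0 density dependence (assumed)
      plant_mortality = 0.10                                   # density-independent age-0 loss (e.g. entrainment)

      n = np.array([2e4, 8e3, 3e3, 1e3, 4e2])
      for year in range(25):
          eggs = np.sum(fecundity * n)
          recruits = (1.0 - plant_mortality) * alpha * eggs * np.exp(-beta * eggs)
          n[1:] = survival * n[:-1]      # age classes 1-4 survive into 2-5
          n[0] = recruits                # survivors of age class 0 enter age class 1
      print("abundance after 25 years:", np.round(n, 1))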

  13. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  14. AGRIS: Description of computer programs

    International Nuclear Information System (INIS)

    Schmid, H.; Schallaboeck, G.

    1976-01-01

    The set of computer programs used at the AGRIS (Agricultural Information System) Input Unit at the IAEA, Vienna, Austria to process the AGRIS computer-readable data is described. The processing flow is illustrated. The configuration of the IAEA's computer, a list of error messages generated by the computer, the EBCDIC code table extended for AGRIS and INIS, the AGRIS 6-bit code, the work sheet format, and job control listings are included as appendixes. The programs are written for an IBM 370, model 145, operating system OS or VS, and require a 130K partition. The programming languages are PL/1 (F-compiler) and Assembler.

  15. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  16. MININR: a geochemical computer program for inclusion in water flow models - an application study

    Energy Technology Data Exchange (ETDEWEB)

    Felmy, A.R.; Reisenauer, A.E.; Zachara, J.M.; Gee, G.W.

    1984-02-01

    MININR is a reduced form of the computer program MINTEQ which calculates equilibrium precipitation/dissolution of solid phases, aqueous speciation, adsorption, and gas phase equilibrium. The user-oriented features in MINTEQ were removed to reduce the size and increase the computational speed. MININR closely resembles the MINEQL computer program developed by Westall (1976). The main differences between MININR and MINEQL involve modifications to accept an initial starting mass of solid and necessary changes for linking with a water flow model. MININR in combination with a simple water flow model which considers only dilution was applied to a laboratory column packed with retorted oil shale and percolated with distilled water. Experimental and preliminary model simulation results are presented for the constituents K⁺, Na⁺, SO₄²⁻, Mg²⁺, Ca²⁺, CO₃²⁻ and pH.

  17. Programming Unconventional Computers: Dynamics, Development, Self-Reference

    Directory of Open Access Journals (Sweden)

    Susan Stepney

    2012-10-01

    Classical computing has well-established formalisms for specifying, refining, composing, proving, and otherwise reasoning about computations. These formalisms have matured over the past 70 years or so. Unconventional Computing includes the use of novel kinds of substrates–from black holes and quantum effects, through to chemicals, biomolecules, even slime moulds–to perform computations that do not conform to the classical model. Although many of these unconventional substrates can be coerced into performing classical computation, this is not how they “naturally” compute. Our ability to exploit unconventional computing is partly hampered by a lack of corresponding programming formalisms: we need models for building, composing, and reasoning about programs that execute in these substrates. What might, say, a slime mould programming language look like? Here I outline some of the issues and properties of these unconventional substrates that need to be addressed to find “natural” approaches to programming them. Important concepts include embodied real values, processes and dynamical systems, generative systems and their meta-dynamics, and embodied self-reference.

  18. RFQ modeling computer program

    International Nuclear Information System (INIS)

    Potter, J.M.

    1985-01-01

    The mathematical background for a multiport-network-solving program is described. A method for accurately numerically modeling an arbitrary, continuous, multiport transmission line is discussed. A modification to the transmission-line equations to accommodate multiple rf drives is presented. An improved model for the radio-frequency quadrupole (RFQ) accelerator that corrects previous errors is given. This model permits treating the RFQ as a true eight-port network for simplicity in interpreting the field distribution and ensures that all modes propagate at the same velocity in the high-frequency limit. The flexibility of the multiport model is illustrated by simple modifications to otherwise two-dimensional systems that permit modeling them as linear chains of multiport networks

  19. Programming Models in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Shipman, Galen M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-13

    These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.

  20. Energy consumption program: A computer model simulating energy loads in buildings

    Science.gov (United States)

    Stoller, F. W.; Lansing, F. L.; Chai, V. W.; Higgins, S.

    1978-01-01

    The JPL energy consumption computer program, developed as a useful tool in the ongoing building modification studies of the DSN energy conservation project, is described. The program simulates building heating and cooling loads and computes thermal and electric energy consumption and cost. The accuracy of the computations is not sacrificed, however, since the results lie within a ±10 percent margin of those read from energy meters. The program is carefully structured to reduce both the user's time and the running cost by requesting minimal information from the user and by reducing many internal time-consuming computational loops. Many unique features not found in any other program were added to handle two-level electronics control rooms.
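
    As a back-of-the-envelope sketch only (the JPL program models loads in far more detail), the following Python fragment accumulates hourly heating and cooling loads from a simple conductance balance and prices the total; the UA value, setpoints, internal gains and tariff are invented inputs.

      # Hedged sketch of hourly load accumulation; every input here is invented.
      def building_energy(outdoor_temps_c, ua_kw_per_c=12.0, internal_gain_kw=40.0,
                          heat_set_c=20.0, cool_set_c=24.0, tariff_usd_per_kwh=0.12):
          heating_kwh = cooling_kwh = 0.0
          for t_out in outdoor_temps_c:               # one value per hour
              heat_load = ua_kw_per_c * (heat_set_c - t_out) - internal_gain_kw
              cool_load = ua_kw_per_c * (t_out - cool_set_c) + internal_gain_kw
              if heat_load > 0:
                  heating_kwh += heat_load            # kW over one hour -> kWh
              elif cool_load > 0:
                  cooling_kwh += cool_load
          total = heating_kwh + cooling_kwh
          return heating_kwh, cooling_kwh, total * tariff_usd_per_kwh

      hours = [5 + 10 * (h % 24) / 24 for h in range(24 * 7)]   # a synthetic cool week
      print(building_energy(hours))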

  1. Computer Programming Languages for Health Care

    Science.gov (United States)

    O'Neill, Joseph T.

    1979-01-01

    This paper advocates the use of standard high level programming languages for medical computing. It recommends that U.S. Government agencies having health care missions implement coordinated policies that encourage the use of existing standard languages and the development of new ones, thereby enabling them and the medical computing community at large to share state-of-the-art application programs. Examples are based on a model that characterizes language and language translator influence upon the specification, development, test, evaluation, and transfer of application programs.

  2. A SCILAB Program for Computing General-Relativistic Models of Rotating Neutron Stars by Implementing Hartle's Perturbation Method

    Science.gov (United States)

    Papasotiriou, P. J.; Geroyannis, V. S.

    We implement Hartle's perturbation method for the computation of relativistic rigidly rotating neutron star models. The program has been written in SCILAB (© INRIA ENPC), a matrix-oriented high-level programming language. The numerical method is described in detail and is applied to many models in slow or fast rotation. We show that, although the method is perturbative, it gives accurate results for all practical purposes and it should prove an efficient tool for computing rapidly rotating pulsars.

  3. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the computational physics group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck calculations, and efficient numerical and programming algorithms. References are included.

  4. Thermal models of buildings. Determination of temperatures, heating and cooling loads. Theories, models and computer programs

    Energy Technology Data Exchange (ETDEWEB)

    Kaellblad, K

    1998-05-01

    The need to estimate indoor temperatures, heating or cooling loads and energy requirements for buildings arises at many stages of a building's life cycle, e.g. at the early layout stage, during the design of a building and in energy retrofitting planning. Other purposes are to meet the authorities' requirements given in building codes. All these situations require good calculation methods. The main purpose of this report is to present the author's work with problems related to thermal models and calculation methods for the determination of temperatures and heating or cooling loads in buildings. Thus the major part of the report deals with the treatment of solar radiation in glazing systems, the shading of solar and sky radiation, and the computer program JULOTTA used to simulate the thermal behavior of rooms and buildings. Other parts of thermal models of buildings are discussed more briefly and included in order to give an overview of existing problems and available solutions. A brief presentation of how thermal models can be built up is also given, and it is hoped that the report can be useful as an introduction to this part of building physics as well as during the development of calculation methods and computer programs. The report may also serve as a help for users of energy-related programs. Independent of which method or program a user chooses to work with, it is his or her own responsibility to understand the limits of the tool; otherwise wrong conclusions may be drawn from the results. 52 refs, 22 figs, 4 tabs

  5. Computer program to solve two-dimensional shock-wave interference problems with an equilibrium chemically reacting air model

    Science.gov (United States)

    Glass, Christopher E.

    1990-08-01

    The computer program EASI, an acronym for Equilibrium Air Shock Interference, was developed to calculate the inviscid flowfield, the maximum surface pressure, and the maximum heat flux produced by six shock wave interference patterns on a 2-D, cylindrical configuration. Thermodynamic properties of the inviscid flowfield are determined using either an 11-species, 7-reaction equilibrium chemically reacting air model or a calorically perfect air model. The inviscid flowfield is solved using the integral form of the conservation equations. Surface heating calculations at the impingement point for the equilibrium chemically reacting air model use variable transport properties and specific heat. However, for the calorically perfect air model, heating rate calculations use a constant Prandtl number. Sample calculations of the six shock wave interference patterns, a listing of the computer program, and flowcharts of the programming logic are included.

  6. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and to focus on learning seismological concepts and processes, taking advantage of basic scientific computation methods and tools.

  7. LUDEP 1.0, a personal computer program to implement the new ICRP respiratory tract model

    Energy Technology Data Exchange (ETDEWEB)

    Jarvis, N.S.; Birchall, A. (National Radiological Protection Board, Chilton (United Kingdom))

    1994-01-01

    The International Commission on Radiological Protection has recently approved a new model of the human respiratory tract. This model has been designed to represent realistically the deposition and biokinetic behaviour of inhaled radionuclides, and to calculate doses to the respiratory tract. In order to examine the practical application and radiological implications of the new model, a Personal Computer program has been developed. LUDEP 1.0 is a user-friendly program for the IBM-compatible PC which enables the user to calculate doses to the respiratory tract and to other organs. (author).

  8. Use of the computer program in a cloud computing

    Directory of Open Access Journals (Sweden)

    Radovanović Sanja

    2013-01-01

    Cloud computing represents a specific form of networking, in which a computer program simulates the operation of one or more server computers. In terms of copyright, all technological processes that take place within cloud computing are covered by the notion of copying computer programs and by the exclusive right of reproduction. However, this right suffers some limitations in order to allow normal use of the computer program by users. Because cloud computing is a virtualized network, the issue of normal use of the computer program requires that all aspects of permitted copying be put into the context of the specific computing environment and the specific processes within the cloud. In this sense, the paper points out that the user of a computer program in cloud computing needs to obtain the consent of the right holder for any act which he undertakes using the program. In other words, copyright applies in full in cloud computing, and thus so does the freedom of contract (in the case of this particular restriction as well).

  9. Computing Programs for Determining Traffic Flows from Roundabouts

    Science.gov (United States)

    Boroiu, A. A.; Tabacu, I.; Ene, A.; Neagu, E.; Boroiu, A.

    2017-10-01

    For modelling road traffic at the level of a road network it is necessary to specify the flows of all traffic streams at each intersection. These data can be obtained by direct measurements at traffic light intersections, but in the case of a roundabout this is not possible directly, and neither the literature nor traffic modelling software offers a way to solve this issue. Two sets of formulas are proposed by which all traffic flows in roundabouts with 3 or 4 arms are calculated from the streams that can be measured. The objective of this paper is to develop computational programs that operate with these formulas. For each of the two sets of analytical relations, a computational program was developed in the Java programming language. The obtained results fully confirm the applicability of the calculation programs. The final stage in capitalizing on these programs will be to make them available as web pages in HTML format, so that they can be accessed and used on the Internet. The achievements presented in this paper are an important step toward providing a necessary tool for traffic modelling, because these computational programs can be easily integrated into specialized software.
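
    The paper's own formulas are not reproduced in the abstract; as a hedged illustration of the kind of relations involved, the Python sketch below recovers the six turning flows of a three-arm roundabout (arms A → B → C in the direction of circulation, U-turns neglected) from the measurable entry flows E and the circulating flows C passing in front of each entry.

      # Illustration only, not the formulas from the paper: three-arm
      # roundabout, no U-turns, entry flows E and circulating flows C measured.
      def three_arm_flows(E_A, E_B, E_C, C_A, C_B, C_C):
          q_AC = C_B            # traffic from A to C is what circulates past B
          q_BA = C_C
          q_CB = C_A
          q_AB = E_A - q_AC     # the rest of each entry turns off at the next arm
          q_BC = E_B - q_BA
          q_CA = E_C - q_CB
          return {"A->B": q_AB, "A->C": q_AC, "B->C": q_BC,
                  "B->A": q_BA, "C->A": q_CA, "C->B": q_CB}

      # Example with invented hourly counts (veh/h):
      print(three_arm_flows(E_A=600, E_B=450, E_C=500, C_A=150, C_B=200, C_C=120))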

  10. Computer program for modelling the history of the in-service bending of fast power reactor fuel assemblies

    International Nuclear Information System (INIS)

    Dienstbier, J.

    1979-04-01

    The studies into stresses and deformations in the core are mainly focused on the fuel rod and the fuel assembly can. At high neutron doses austenitic steel swells, and this is associated with a considerable increase in the volume of the material. The SANDRA computer program is used for solving the problems of can deformation and stress during long-term reactor operation. The block for the mechanical interaction of cans is the key part of the program. The program input data include the temperature distribution, the fast neutron flux distribution and the coolant overpressure inside the cans. Reactor operation is modelled using operating modes A, B and C, which may be combined arbitrarily. Mode A computes bending deformations and the deformations of the can cross-section due to thermal dilatation as the temperature fields in the reactor change; mode B computes deformations due to swelling and creep in long-term operation; mode C computes thermal deformations at reactor shut-down. A flowsheet of the SANDRA program is shown, as are examples of computed deformations. (M.S.)

  11. A resource facility for kinetic analysis: modeling using the SAAM computer programs.

    Science.gov (United States)

    Foster, D M; Boston, R C; Jacquez, J A; Zech, L

    1989-01-01

    Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.
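
    As a minimal sketch of the kind of compartmental kinetic analysis SAAM and CONSAM support (this is not SAAM itself, and the rate constants and sampling times are invented), the following Python fragment simulates a two-compartment model and refits its parameters from noisy data.

      # Hedged sketch of compartmental kinetic modeling and fitting; not SAAM.
      import numpy as np
      from scipy.integrate import odeint
      from scipy.optimize import curve_fit

      def two_compartment(t, k12, k21, k10, dose=1.0):
          # dq1/dt = -(k10 + k12) q1 + k21 q2 ;  dq2/dt = k12 q1 - k21 q2
          def rhs(q, _t):
              q1, q2 = q
              return [-(k10 + k12) * q1 + k21 * q2, k12 * q1 - k21 * q2]
          sol = odeint(rhs, [dose, 0.0], t)
          return sol[:, 0]                       # tracer remaining in compartment 1

      t_obs = np.array([0.0, 0.5, 1, 2, 4, 8, 12, 24], dtype=float)
      true = two_compartment(t_obs, 0.8, 0.3, 0.2)
      data = true * (1 + 0.05 * np.random.default_rng(0).standard_normal(t_obs.size))

      popt, _ = curve_fit(two_compartment, t_obs, data, p0=[0.5, 0.5, 0.1])
      print("fitted k12, k21, k10:", np.round(popt, 3))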

  12. EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.

    Science.gov (United States)

    Jarvis, John J.; And Others

    Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…

  13. The Use of Engineering Design Concept for Computer Programming Course: A Model of Blended Learning Environment

    Science.gov (United States)

    Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit

    2016-01-01

    The aim of this research is to develop a learning model which blends factors from the learning environment with the engineering design concept for learning in a computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…

  14. A computer program for external modes in complex ionic crystals (the rigid molecular-ion model)

    International Nuclear Information System (INIS)

    Chaplot, S.L.

    1978-01-01

    A computer program DISPR has been developed to calculate the external mode phonon dispersion relation in the harmonic approximation for complex ionic crystals using the rigid molecular ion model. A description of the program, the flow diagram and the required input information are given. A sample calculation for α-KNO₃ is presented. The program can handle any type of crystal lattice with any number of atoms and molecules per unit cell with suitable changes in dimension statements. (M.G.B.)

  15. Four-Cylinder Stirling-Engine Computer Program

    Science.gov (United States)

    Daniele, C. J.; Lorenzo, C. F.

    1986-01-01

    A computer program was developed for simulating the steady-state and transient performance of a four-cylinder Stirling engine. In the model, the four cylinders are interconnected by four working spaces. Each working space contains seven volumes: one each for the expansion space, heater, cooler, and compression space, and three for the regenerator. A thermal time constant for the regenerator mass is associated with each regenerator gas volume. The former code generates results very quickly, since it has only 14 state variables and no energy equation. The current code is then used to study various aspects of the Stirling engine in much more detail. The program is written in FORTRAN IV for use on an IBM 370 computer.

  16. Structured Parallel Programming Patterns for Efficient Computation

    CERN Document Server

    McCool, Michael; Robison, Arch

    2012-01-01

    Programming is now parallel programming. Much as structured programming revolutionized traditional serial programming decades ago, a new kind of structured programming, based on patterns, is relevant to parallel programming today. Parallel computing experts and industry insiders Michael McCool, Arch Robison, and James Reinders describe how to design and implement maintainable and efficient parallel algorithms using a pattern-based approach. They present both theory and practice, and give detailed concrete examples using multiple programming models. Examples are primarily given using two of th

  17. The Computational Physics Program of the national MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs

  18. VNAP2: a computer program for computation of two-dimensional, time-dependent, compressible, turbulent flow

    Energy Technology Data Exchange (ETDEWEB)

    Cline, M.C.

    1981-08-01

    VNAP2 is a computer program for calculating turbulent (as well as laminar and inviscid), steady, and unsteady flow. VNAP2 solves the two-dimensional, time-dependent, compressible Navier-Stokes equations. The turbulence is modeled with either an algebraic mixing-length model, a one-equation model, or the Jones-Launder two-equation model. The geometry may be a single- or a dual-flowing stream. The interior grid points are computed using the unsplit MacCormack scheme. Two options to speed up the calculations for high Reynolds number flows are included. The boundary grid points are computed using a reference-plane-characteristic scheme with the viscous terms treated as source functions. An explicit artificial viscosity is included for shock computations. The fluid is assumed to be a perfect gas. The flow boundaries may be arbitrary curved solid walls, inflow/outflow boundaries, or free-jet envelopes. Typical problems that can be solved concern nozzles, inlets, jet-powered afterbodies, airfoils, and free-jet expansions. The accuracy and efficiency of the program are shown by calculations of several inviscid and turbulent flows. The program and its use are described completely, and six sample cases and a code listing are included.
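
    As a hedged one-dimensional illustration of the MacCormack predictor-corrector idea that VNAP2 applies in unsplit two-dimensional form to the Navier-Stokes equations, the Python sketch below advances the linear advection equation u_t + a u_x = 0; it is not part of VNAP2.

      # Hedged 1-D MacCormack illustration on linear advection; invented setup.
      import numpy as np

      nx, a, cfl = 200, 1.0, 0.8
      x = np.linspace(0.0, 1.0, nx)
      dx = x[1] - x[0]
      dt = cfl * dx / a
      u = np.exp(-200.0 * (x - 0.3) ** 2)          # initial Gaussian pulse

      for _ in range(100):
          up = u.copy()
          # Predictor: forward difference.
          up[:-1] = u[:-1] - a * dt / dx * (u[1:] - u[:-1])
          uc = u.copy()
          # Corrector: backward difference on the predicted values, then average.
          uc[1:] = 0.5 * (u[1:] + up[1:] - a * dt / dx * (up[1:] - up[:-1]))
          u = uc                                    # boundaries left frozen (crude)

      print("pulse peak now near x =", x[np.argmax(u)])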

  19. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  20. Geochemical modelling. Column 2: a computer program for simulation of migration

    International Nuclear Information System (INIS)

    Nielsen, O.J.; Carlsen, L.; Bo, P.

    1985-01-01

    COLUMN2 is a 1D FORTRAN77 computer program designed for studies of the effects of various physicochemical processes on migration. It solves the solute transport equation and can take into account dispersion, sorption, ion exchange, and first- and second-order homogeneous chemical reactions. Spatial variations of input pulses and retention factors are possible. The method of solution is based on a finite difference discretization followed by the application of the method of characteristics and two separate grid systems. This report explains the mathematical and numerical methods used, describes the necessary input, contains a number of test examples, provides a listing of the program and explains how to acquire the program, adapt it to other computers and run it. This report serves as a manual for the program.
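
    A crude, hedged sketch of the one-dimensional advection-dispersion-retardation problem COLUMN2 addresses is given below in Python; the real code uses the method of characteristics on two grid systems, which is not reproduced here, and all parameter values are invented.

      # Crude explicit finite-difference sketch; not the COLUMN2 algorithm.
      import numpy as np

      nz, L = 100, 1.0                 # cells, column length (m)
      dz = L / nz
      v, D, R = 1e-5, 1e-7, 2.0        # pore velocity (m/s), dispersion (m2/s), retardation
      dt = 0.25 * min(dz / v, dz * dz / (2 * D))
      c = np.zeros(nz)
      c_in = 1.0                       # constant-concentration input pulse

      for step in range(20000):
          cl = np.concatenate(([c_in], c[:-1]))         # upstream (inlet) ghost value
          cr = np.concatenate((c[1:], [c[-1]]))         # zero-gradient outlet
          adv = -v * (c - cl) / dz                      # upwind advection
          disp = D * (cr - 2 * c + cl) / dz ** 2        # central dispersion
          c = c + dt * (adv + disp) / R
      print("effluent concentration:", round(c[-1], 3))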

  1. Adolescents' Chunking of Computer Programs.

    Science.gov (United States)

    Magliaro, Susan; Burton, John K.

    To investigate what children learn during computer programming instruction, students attending a summer computer camp were asked to recall either single lines or chunks of computer programs from either coherent or scrambled programs. The 16 subjects, ages 12 to 17, were divided into three instructional groups: (1) beginners, who were taught to…

  2. An interactive program for pharmacokinetic modeling.

    Science.gov (United States)

    Lu, D R; Mao, F

    1993-05-01

    A computer program, PharmK, was developed for pharmacokinetic modeling of experimental data. The program was written in the C computer language, based on the high-level user interface of the Macintosh operating system. The intention was to provide a user-friendly tool for users of Macintosh computers. An interactive algorithm based on the exponential stripping method is used for the initial parameter estimation. Nonlinear pharmacokinetic model fitting is based on the maximum likelihood estimation method and is performed by the Levenberg-Marquardt method based on the χ² criterion. Several methods are available to aid the evaluation of the fitting results. Pharmacokinetic data sets have been examined with the PharmK program, and the results are comparable with those obtained with other programs that are currently available for IBM PC-compatible and other types of computers.
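
    Not PharmK itself, but a minimal sketch of the same workflow (a pharmacokinetic model fitted by the Levenberg-Marquardt method under a least-squares criterion) using SciPy; the one-compartment oral-absorption model and the data points are invented.

      # Hedged sketch of model fitting by Levenberg-Marquardt; not PharmK.
      import numpy as np
      from scipy.optimize import curve_fit

      def one_compartment_oral(t, ka, ke, scale):
          # C(t) = (F*Dose/V) * ka/(ka-ke) * (exp(-ke t) - exp(-ka t))
          return scale * ka / (ka - ke) * (np.exp(-ke * t) - np.exp(-ka * t))

      t = np.array([0.25, 0.5, 1, 2, 4, 6, 8, 12], dtype=float)   # hours
      conc = np.array([2.1, 3.4, 4.6, 4.3, 3.0, 2.0, 1.3, 0.55])  # invented data (mg/L)

      popt, pcov = curve_fit(one_compartment_oral, t, conc,
                             p0=[1.5, 0.2, 6.0], method="lm")
      ka, ke, scale = popt
      print(f"ka={ka:.3f}/h  ke={ke:.3f}/h  half-life={np.log(2)/ke:.2f} h")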

  3. Application of computers in a radiological survey program

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.; Doane, R.W.; Little, C.A.; Perdue, P.T.

    1984-01-01

    Computers have become increasingly important in data analysis and data management, as well as in assisting with report preparation, in the Oak Ridge National Laboratory (ORNL) Radiological Survey Activities (RASA) Program. The primary function of the RASA program is to collect, analyze, report, and manage data collected to characterize the radiological condition of potentially contaminated sites identified in the Department of Energy's (DOE) remedial action programs. Three different computer systems are routinely utilized in ORNL/RASA operations. Two of these systems are employed for specific functions. A Nuclear Data (ND) 682 is used to perform isotopic analysis of gamma spectroscopic data generated by high-purity germanium detectors for air, water and soil samples. The ND682 employs a 16,000-channel analyzer that is routinely used with four germanium spectrometers. Word processing and data management are accomplished using the INtext system implemented on a DEC PDP-11 computer. A group of personal computers is used to perform a diverse set of functions. These computer systems are Commodore Business Machines (CBM) Model 8032 units with dual floppy disk storage and line printers (with optional X-Y plotters). The CBMs are utilized for: (1) data analysis -- raw data from radiation detection instrumentation are stored and manipulated with customized computer programs; (2) data reduction -- raw data are converted into report-ready tables using customized programs; (3) data management -- radionuclide data on each air, water and soil sample are stored on diskettes along with the location of archived samples; and (4) program management -- site surveys and report status, as well as program budget information, are tracked in computer files to provide up-to-date information on program status.

  4. Computer programs as accounting object

    Directory of Open Access Journals (Sweden)

    I.V. Perviy

    2015-03-01

    Existing approaches to the regulation of accounting for software as one of the types of intangible assets have been considered. The features and current state of the legal protection of computer programs have been analyzed. The reasons for the need to use patent law as a means of legal protection of individual elements of computer programs have been discovered. The influence of the legal aspects of the use of computer programs under national legislation on their accounting reflection has been analyzed. The possible options for the transfer of rights from the copyright owners of computer programs have been analyzed, which should be considered during the creation of a software accounting system at the enterprise. The characteristics of computer software as an intangible asset under the current law have been identified and analyzed. General economic characteristics of computer programs as one of the types of intangible assets have been grounded. The main distinguishing features of software compared to other types of intellectual property have been allocated.

  5. Repair models of cell survival and corresponding computer program for survival curve fitting

    International Nuclear Information System (INIS)

    Shen Xun; Hu Yiwei

    1992-01-01

    Some basic concepts and formulations of two repair models of survival, the incomplete repair (IR) model and the lethal-potentially lethal (LPL) model, are introduced. An IBM-PC computer program for survival curve fitting with these models was developed and applied to fit the survival of human melanoma HX118 cells irradiated at different dose rates. A comparison was made between the repair models and two non-repair models, the multitarget-single hit model and the linear-quadratic model, in the fitting and analysis of the survival-dose curves. It was shown that either the IR model or the LPL model can fit a set of survival curves at different dose rates with the same parameters and provide information on the repair capacity of cells. These two mathematical models could be very useful in quantitative studies of the radiosensitivity and repair capacity of cells.
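
    As a minimal illustration of survival-curve fitting with one of the non-repair models mentioned above, the linear-quadratic model S(D) = exp(-(αD + βD²)), the Python sketch below fits invented dose-survival data; it is not the IBM-PC program described in the abstract.

      # Hedged sketch of linear-quadratic survival-curve fitting; data invented.
      import numpy as np
      from scipy.optimize import curve_fit

      def lq_model(dose, a, b):
          return np.exp(-(a * dose + b * dose ** 2))

      dose = np.array([0, 1, 2, 4, 6, 8], dtype=float)            # Gy
      surv = np.array([1.0, 0.78, 0.55, 0.22, 0.065, 0.015])      # surviving fraction

      (a, b), _ = curve_fit(lq_model, dose, surv, p0=[0.2, 0.02])
      print(f"alpha={a:.3f} /Gy  beta={b:.4f} /Gy^2  alpha/beta={a/b:.1f} Gy")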

  6. NASA's computer science research program

    Science.gov (United States)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  7. Computer Program Newsletter No. 7

    International Nuclear Information System (INIS)

    Magnuson, W.G. Jr.

    1982-09-01

    This issue of the Computer Program Newsletter updates an earlier newsletter (Number 2, September 1979) and focuses on electrical network analysis computer programs. In particular, five network analysis programs (SCEPTRE, SPICE2, NET2, CALAHAN, and EMTP) will be described. The objective of this newsletter will be to provide a very brief description of the input syntax and semantics for each program, highlight their strong and weak points, illustrate how the programs are run at Lawrence Livermore National Laboratory using the Octopus computer network, and present examples of input for each of the programs to illustrate some of the features of each program. In a sense, this newsletter can be used as a quick reference guide to the programs

  8. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  9. Program MASTERCALC: an interactive computer program for radioanalytical computations. Description and operating instructions

    International Nuclear Information System (INIS)

    Goode, W.

    1980-10-01

    MASTERCALC is a computer program written to support radioanalytical computations in the Los Alamos Scientific Laboratory (LASL) Environmental Surveillance Group. Included in the program are routines for gross alpha and beta, ³H, gross gamma, ⁹⁰Sr and alpha spectroscopic determinations. A description of MASTERCALC is presented and its source listing is included. Operating instructions and example computing sessions are given for each type of analysis.

  10. Computational Materials Program for Alloy Design

    Science.gov (United States)

    Bozzolo, Guillermo

    2005-01-01

    The research program sponsored by this grant, "Computational Materials Program for Alloy Design", covers a period of time of enormous change in the emerging field of computational materials science. The computational materials program started with the development of the BFS method for alloys, a quantum approximate method for atomistic analysis of alloys specifically tailored to effectively deal with the current challenges in the area of atomistic modeling and to support modern experimental programs. During the grant period, the program benefited from steady growth which, as detailed below, far exceeded its original set of goals and objectives. Not surprisingly, by the end of this grant, the methodology and the computational materials program had become an established force in the materials community, with substantial impact in several areas. Major achievements during the duration of the grant include the completion of a Level 1 Milestone for the HITEMP program at NASA Glenn, consisting of the planning, development and organization of an international conference held at the Ohio Aerospace Institute in August of 2002, finalizing a period of rapid insertion of the methodology in the research community worldwide. The conference, attended by citizens of 17 countries representing various fields of the research community, resulted in a special issue of the leading journal in the area of applied surface science. Another element of the Level 1 Milestone was the presentation of the first version of the Alloy Design Workbench software package, currently known as "adwTools". This software package constitutes the first PC-based piece of software for atomistic simulations for both solid alloys and surfaces in the market. Dissemination of results and insertion in the materials community worldwide was a primary focus during this period. As a result, the P.I. was responsible for presenting 37 contributed talks, 19 invited talks, and publishing 71 articles in peer-reviewed journals, as

  11. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
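
    As a hedged toy sketch of the information-sharing idea only (this is not the protocol described in the paper), the Python fragment below lets a publisher push processed telemetry samples to any number of subscribed client callbacks.

      # Toy publish/subscribe sketch; names and parameters are hypothetical.
      from collections import defaultdict

      class TelemetryBus:
          def __init__(self):
              self.subscribers = defaultdict(list)
          def subscribe(self, parameter, callback):
              self.subscribers[parameter].append(callback)
          def publish(self, parameter, value):
              for callback in self.subscribers[parameter]:
                  callback(parameter, value)

      bus = TelemetryBus()
      bus.subscribe("cabin_pressure", lambda p, v: print(f"console A: {p}={v}"))
      bus.subscribe("cabin_pressure", lambda p, v: v < 13.9 and print("console B: ALARM"))
      bus.publish("cabin_pressure", 14.2)
      bus.publish("cabin_pressure", 13.5)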

  13. HEDPIN: a computer program to estimate pinwise power density

    International Nuclear Information System (INIS)

    Cappiello, M.W.

    1976-05-01

    A description is given of the digital computer program, HEDPIN. This program, modeled after a previously developed program, POWPIN, provides a means of estimating the pinwise power density distribution in fast reactor triangular pitched pin bundles. The capability also exists for computing any reaction rate of interest at the respective pin positions within an assembly. HEDPIN was developed in support of FTR fuel and test management as well as fast reactor core design and core characterization planning and analysis. The results of a test devised to check out HEDPIN's computational method are given, and the realm of application is discussed. Nearly all programming is in FORTRAN IV. Variable dimensioning is employed to make efficient use of core memory and maintain short running time for small problems. Input instructions, sample problem, and a program listing are also given

  14. Computer program CDCID: an automated quality control program using CDC update

    International Nuclear Information System (INIS)

    Singer, G.L.; Aguilar, F.

    1984-04-01

    A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program

  15. A computer program for scanning transmission ion microscopy simulation

    International Nuclear Information System (INIS)

    Wu, R.; Shen, H.; Mi, Y.; Sun, M.D.; Yang, M.J.

    2005-01-01

    With the installation of the Scanning Proton Microprobe system at Fudan University, we are in the process of developing a three-dimensional reconstruction technique based on scanning transmission ion microscopy-computed tomography (STIM-CT). As the first step, a computer program for STIM simulation has been established. The program is written in Visual C++[reg] using object-oriented programming (OOP), and it is a standard multiple-document Windows[reg] program that runs on all MS Windows[reg] operating systems. The program is menu-driven and uses a multiple-process technique. The stopping-power theory is based on the Bethe-Bloch formula. To simplify the calculation, an improved cylindrical coordinate model was introduced in the program instead of the usual spherical or cylindrical coordinate model. Simulated results for a sample at several rotation angles are presented.
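
    The STIM simulation code itself is not reproduced in this record. As a rough illustration of the kind of stopping-power calculation the abstract refers to, the following Python sketch evaluates a simplified Bethe formula (no shell or density-effect corrections) for a proton; the constants and the silicon example values are assumptions for illustration only, not taken from the program:

```python
import math

# Physical constants (assumed values; simplified Bethe formula without
# shell or density-effect corrections).
K = 0.307075          # MeV mol^-1 cm^2, coefficient 4*pi*N_A*r_e^2*m_e*c^2
ME_C2 = 0.510999      # electron rest energy, MeV
MP_C2 = 938.272       # proton rest energy, MeV

def bethe_mass_stopping_power(T, Z, A, I_eV, z=1):
    """Mass stopping power -dE/d(rho*x) in MeV cm^2/g for a heavy charged
    particle of kinetic energy T (MeV) and charge z in a medium with
    atomic number Z, mass number A and mean excitation energy I (eV)."""
    gamma = 1.0 + T / MP_C2
    beta2 = 1.0 - 1.0 / gamma**2
    I = I_eV * 1.0e-6                     # convert eV -> MeV
    arg = 2.0 * ME_C2 * beta2 * gamma**2 / I
    return K * z**2 * (Z / A) * (1.0 / beta2) * (math.log(arg) - beta2)

# Example: 2 MeV proton in silicon (Z=14, A=28.09, I ~ 173 eV, assumed values)
print(bethe_mass_stopping_power(2.0, 14, 28.09, 173.0))
```

    A full STIM simulation would integrate such a quantity along the ion path through the sample to obtain the transmitted energy used for tomographic reconstruction.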

  16. The FITS model office ergonomics program: a model for best practice.

    Science.gov (United States)

    Chim, Justine M Y

    2014-01-01

    An effective office ergonomics program can be expected to deliver positive results in reducing musculoskeletal injury rates, enhancing productivity, and improving staff well-being and job satisfaction. Its objective is to provide a systematic solution for managing the potential risk of musculoskeletal disorders among computer users in an office setting. The FITS Model Office Ergonomics Program has been developed, drawing on the legislative requirements for promoting the health and safety of workers who use computers for extended periods as well as on previous research findings. The Model is built on practical industrial knowledge in ergonomics, occupational health and safety management, and human resources management in Hong Kong and overseas. This paper proposes a comprehensive office ergonomics program, the FITS Model, which considers (1) Furniture Evaluation and Selection; (2) Individual Workstation Assessment; (3) Training and Education; and (4) Stretching Exercises and Rest Breaks as elements of an effective program. An experienced ergonomics practitioner should be included in the program design and implementation. Through the FITS Model Office Ergonomics Program, the risk of musculoskeletal disorders among computer users can be eliminated or minimized, and workplace health and safety and employees' wellness enhanced.

  17. Computer Programming Education with Miranda

    NARCIS (Netherlands)

    Joosten, S.M.M.; van den Berg, Klaas

    During the past four years, an experiment has been carried out with an introductory course in computer programming, based on functional programming. This article describes the background of this approach, the aim of the computer programming course, the outline and subject matter of the course parts

  18. Basic design of parallel computational program for probabilistic structural analysis

    International Nuclear Information System (INIS)

    Kaji, Yoshiyuki; Arai, Taketoshi; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of 'development of a damage evaluation method for structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we are examining computational methods for massively parallel computing systems that couple a material strength theory, based on the microscopic fracture mechanics of latent cracks, with a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, the basic formulae, and the parallel-computation programming methods that bear on the principal terms in the basic design of the computational mechanics program. (author)

  19. Basic design of parallel computational program for probabilistic structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kaji, Yoshiyuki; Arai, Taketoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of 'development of a damage evaluation method for structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we are examining computational methods for massively parallel computing systems that couple a material strength theory, based on the microscopic fracture mechanics of latent cracks, with a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, the basic formulae, and the parallel-computation programming methods that bear on the principal terms in the basic design of the computational mechanics program. (author)
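
    The two records above do not include the underlying equations. As a generic illustration of the probabilistic strength evaluation of brittle ceramics they describe, the sketch below evaluates a two-parameter Weibull failure probability; the Weibull modulus and characteristic strength are made-up example values, and the formulation actually used in the reported program may differ:

```python
import math

def weibull_failure_probability(sigma, sigma0, m, volume_ratio=1.0):
    """Two-parameter Weibull probability of failure for a uniformly stressed
    brittle component: Pf = 1 - exp(-(V/V0) * (sigma/sigma0)^m).
    sigma0 is the characteristic strength and m the Weibull modulus."""
    return 1.0 - math.exp(-volume_ratio * (sigma / sigma0) ** m)

# Example: ceramic with Weibull modulus 10 and characteristic strength 400 MPa
# (illustrative values only)
for stress in (200.0, 300.0, 400.0, 500.0):
    print(stress, weibull_failure_probability(stress, 400.0, 10.0))
```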

  20. Mississippi Curriculum Framework for Computer Information Systems Technology. Computer Information Systems Technology (Program CIP: 52.1201--Management Information Systems & Business Data). Computer Programming (Program CIP: 52.1201). Network Support (Program CIP: 52.1290--Computer Network Support Technology). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…

  1. AV Programs for Computer Know-How.

    Science.gov (United States)

    Mandell, Phyllis Levy

    1985-01-01

    Lists 44 audiovisual programs (most released between 1983 and 1984) grouped in seven categories: computers in society, introduction to computers, computer operations, languages and programing, computer graphics, robotics, computer careers. Excerpts from "School Library Journal" reviews, price, and intended grade level are included. Names…

  2. Computer Networking Laboratory for Undergraduate Computer Technology Program

    National Research Council Canada - National Science Library

    Naghedolfeizi, Masoud

    2000-01-01

    ...) To improve the quality of education in the existing courses related to computer networks and data communications as well as other computer science courses such as programming languages and computer...

  3. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments

  4. Thinking processes used by high-performing students in a computer programming task

    Directory of Open Access Journals (Sweden)

    Marietjie Havenga

    2011-07-01

    Full Text Available Computer programmers must be able to understand programming source code and write programs that execute complex tasks to solve real-world problems. This article is a transdisciplinary study at the intersection of computer programming, education and psychology. It outlines the role of mental processes in the process of programming and indicates how successful thinking processes can support computer science students in writing correct and well-defined programs. A mixed methods approach was used to better understand the thinking activities and programming processes of participating students. Data collection involved both computer programs and students’ reflective thinking processes recorded in their journals. This enabled analysis of psychological dimensions of participants’ thinking processes and their problem-solving activities as they considered a programming problem. Findings indicate that the cognitive, reflective and psychological processes used by high-performing programmers contributed to their success in solving a complex programming problem. Based on the thinking processes of high performers, we propose a model of integrated thinking processes, which can support computer programming students. Keywords: Computer programming, education, mixed methods research, thinking processes.  Disciplines: Computer programming, education, psychology

  5. Preliminary evaluation of the BIODOSE computer program

    International Nuclear Information System (INIS)

    Bonner, N.A.; Ng, Y.C.

    1979-09-01

    The BIODOSE computer program simulates the environmental transport of radionuclides released to surface water and predicts the dosage to humans. We have evaluated the program for its suitability to the needs of the Nuclear Regulatory Commission Waste Management Program. In particular, we evaluated whether the BIODOSE models account for the significant pathways and mechanisms that result in radiological doses to man. In general, BIODOSE is a satisfactory code for converting radionuclide releases to the aqueous environment into doses to man

  6. GRAPHIC, time-sharing magnet design computer programs at Argonne

    International Nuclear Information System (INIS)

    Lari, R.J.

    1974-01-01

    This paper describes three magnet design computer programs in use at the Zero Gradient Synchrotron of Argonne National Laboratory. These programs are used in the time-sharing mode in conjunction with a Tektronix model 4012 graphic display terminal. The first program is called TRIM, the second MAGNET, and the third GFUN. (U.S.)

  7. The computational physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generation of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. Another major area is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers

  8. Recommended programming practices to facilitate the portability of science computer programs

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    This standard recommends programming practices to facilitate the portability of computer programs prepared for scientific and engineering computations. These practices are intended to simplify implementation, conversion, and modification of computer programs

  9. COMPSs-Mobile: parallel programming for mobile-cloud computing

    OpenAIRE

    Lordan Gomis, Francesc-Josep; Badia Sala, Rosa Maria

    2016-01-01

    The advent of Cloud and the popularization of mobile devices have led us to a shift in computing access. Computing users will have an interaction display while the real computation will be performed remotely, in the Cloud. COMPSs-Mobile is a framework that aims to ease the development of energy-efficient and high-performing applications for this environment. The framework provides an infrastructure-unaware programming model that allows developers to code regular Android applications that, ...

  10. CMS computing model evolution

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and making more intelligent use of the networking.

  11. Finite difference computing with exponential decay models

    CERN Document Server

    Langtangen, Hans Petter

    2016-01-01

    This text provides a very simple, initial introduction to the complete scientific computing pipeline: models, discretization, algorithms, programming, verification, and visualization. The pedagogical strategy is to use one case study – an ordinary differential equation describing exponential decay processes – to illustrate fundamental concepts in mathematics and computer science. The book is easy to read and only requires a command of one-variable calculus and some very basic knowledge about computer programming. Contrary to similar texts on numerical methods and programming, this text has a much stronger focus on implementation and teaches testing and software engineering in particular.
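
    The case study named in this record, an ordinary differential equation describing exponential decay, lends itself to a compact illustration. The sketch below (written independently of the book's own code; function and variable names are not the book's) applies the theta rule, which contains Forward Euler, Backward Euler and Crank-Nicolson as special cases, to u' = -a*u:

```python
import numpy as np

def theta_rule_decay(I, a, T, dt, theta=0.5):
    """Solve u'(t) = -a*u(t), u(0) = I, on [0, T] with the theta rule:
    theta = 0 (Forward Euler), 1 (Backward Euler), 0.5 (Crank-Nicolson).
    Returns the mesh points and the numerical solution."""
    Nt = int(round(T / dt))
    t = np.linspace(0, Nt * dt, Nt + 1)
    u = np.zeros(Nt + 1)
    u[0] = I
    factor = (1 - (1 - theta) * a * dt) / (1 + theta * a * dt)
    for n in range(Nt):
        u[n + 1] = factor * u[n]
    return t, u

t, u = theta_rule_decay(I=1.0, a=2.0, T=4.0, dt=0.1, theta=0.5)
print(np.max(np.abs(u - np.exp(-2.0 * t))))   # discretization error vs exact solution
```

    Comparing the numerical result against the exact solution, as in the last line, is exactly the kind of verification step the book emphasizes.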

  12. Integer programming theory, applications, and computations

    CERN Document Server

    Taha, Hamdy A

    1975-01-01

    Integer Programming: Theory, Applications, and Computations provides information pertinent to the theory, applications, and computations of integer programming. This book presents the computational advantages of the various techniques of integer programming.Organized into eight chapters, this book begins with an overview of the general categorization of integer applications and explains the three fundamental techniques of integer programming. This text then explores the concept of implicit enumeration, which is general in a sense that it is applicable to any well-defined binary program. Other
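
    To make the idea of implicit enumeration mentioned in this record concrete, the following sketch (not taken from the book) solves a small 0-1 knapsack problem by depth-first enumeration of the binary variables, fathoming branches with a simple optimistic bound; the data are arbitrary example values:

```python
def implicit_enumeration_knapsack(values, weights, capacity):
    """Solve a small 0-1 knapsack problem by implicit enumeration:
    depth-first search over the binary variables, pruning any branch whose
    optimistic bound (current value + all remaining values) cannot beat
    the incumbent solution."""
    n = len(values)
    best = {"value": 0, "x": [0] * n}
    suffix = [0] * (n + 1)
    for i in range(n - 1, -1, -1):           # total value still available from item i on
        suffix[i] = suffix[i + 1] + values[i]

    def search(i, value, weight, x):
        if weight > capacity:
            return                           # infeasible: fathom this branch
        if value > best["value"]:
            best["value"], best["x"] = value, x.copy()
        if i == n or value + suffix[i] <= best["value"]:
            return                           # fathomed by the bound
        x[i] = 1                             # branch: fix x_i = 1
        search(i + 1, value + values[i], weight + weights[i], x)
        x[i] = 0                             # branch: fix x_i = 0
        search(i + 1, value, weight, x)

    search(0, 0, 0, [0] * n)
    return best["value"], best["x"]

print(implicit_enumeration_knapsack([10, 13, 7, 8], [4, 6, 3, 5], 10))  # (23, [1, 1, 0, 0])
```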

  13. ROBOT3: a computer program to calculate the in-pile three-dimensional bowing of cylindrical fuel rods (AWBA Development Program)

    International Nuclear Information System (INIS)

    Kovscek, S.E.; Martin, S.E.

    1982-10-01

    ROBOT3 is a FORTRAN computer program which is used in conjunction with the CYGRO5 computer program to calculate the time-dependent inelastic bowing of a fuel rod using an incremental finite element method. The fuel rod is modeled as a viscoelastic beam whose material properties are derived as perturbations of the CYGRO5 axisymmetric model. Fuel rod supports are modeled as displacement, force, or spring-type nodal boundary conditions. The program input is described and a sample problem is given

  14. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.

  15. Computer model for economic study of unbleached kraft paperboard production

    Science.gov (United States)

    Peter J. Ince

    1984-01-01

    Unbleached kraft paperboard is produced from wood fiber in an industrial papermaking process. A highly specific and detailed model of the process is presented. The model is also presented as a working computer program. A user of the computer program will provide data on physical parameters of the process and on prices of material inputs and outputs. The program is then...

  16. Computer Program Re-layers Engineering Drawings

    Science.gov (United States)

    Crosby, Dewey C., III

    1990-01-01

    RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.

  17. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: Development and application of gyrokinetic particle codes to tokamak transport, development of techniques to take advantage of parallel computers; model dynamo and bootstrap current drive; and in general maintain our broad-based program in basic plasma physics and computer modeling

  18. A PC-based computer program for simulation of containment pressurization

    International Nuclear Information System (INIS)

    Seifaee, F.

    1990-01-01

    This paper reports that a PC-based computer program has been developed to simulate a pressurized water reactor (PWR) containment during various transients. The containment model is capable of determining the pressure and temperature history of a PWR containment in the event of a loss-of-coolant accident, as well as main steam line breaks inside the containment. Conservation of mass and energy equations are applied to the containment model. The program was developed to minimize the required input information and to be user friendly. Calculation efficiency is maximized by replacing the traditional trial-and-error procedure for determining the state variables with an explicit solution for pressure. The program includes simplified models for active heat removal systems. The results of the present model are in close agreement with those of the CONTEMPT-MOD5 computer code for pressure and temperature inside the containment

  19. Interactive differential equations modeling program

    International Nuclear Information System (INIS)

    Rust, B.W.; Mankin, J.B.

    1976-01-01

    Due to the recent emphasis on mathematical modeling, many ecologists are using mathematics and computers more than ever, and engineers, mathematicians and physical scientists are now included in ecological projects. However, the individual ecologist, with intuitive knowledge of the system, still requires the means to critically examine and adjust system models. An interactive program was developed with the primary goal of allowing an ecologist with minimal experience in either mathematics or computers to develop a system model. It has also been used successfully by systems ecologists, engineers, and mathematicians. This program was written in FORTRAN for the DEC PDP-10, a remote terminal system at Oak Ridge National Laboratory. However, with relatively minor modifications, it can be implemented on any remote terminal system with a FORTRAN IV compiler, or equivalent. This program may be used to simulate any phenomenon which can be described as a system of ordinary differential equations. The program allows the user to interactively change system parameters and/or initial conditions, to interactively select a set of variables to be plotted, and to model discontinuities in the state variables and/or their derivatives. One of the most useful features to the non-computer specialist is the ability to interactively address the system parameters by name and to interactively adjust their values between simulations. These and other features are described in greater detail
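
    The original FORTRAN program is not reproduced here. The Python sketch below illustrates the same idea in modern terms: a system of ordinary differential equations whose parameters are addressed by name and adjusted between simulations. The Lotka-Volterra system and its parameter values are placeholders chosen for illustration:

```python
import numpy as np
from scipy.integrate import solve_ivp

def predator_prey(t, y, a, b, c, d):
    """Lotka-Volterra equations, standing in for the kind of ecological
    ODE system such a program would simulate."""
    prey, pred = y
    return [a * prey - b * prey * pred, c * prey * pred - d * pred]

def simulate(params, y0=(10.0, 5.0), t_end=20.0):
    t_eval = np.linspace(0.0, t_end, 201)
    sol = solve_ivp(predator_prey, (0.0, t_end), y0, args=tuple(params), t_eval=t_eval)
    return sol.t, sol.y

# "Interactive" use: adjust a named parameter between runs and re-simulate,
# mimicking the parameter-by-name adjustment described in the abstract.
params = {"a": 1.0, "b": 0.1, "c": 0.02, "d": 0.5}   # made-up values
t, y = simulate(list(params.values()))
params["b"] = 0.2                                    # tweak one parameter by name
t2, y2 = simulate(list(params.values()))
print(y[0, -1], y2[0, -1])                           # prey population at t_end, before and after
```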

  20. Report of the 2014 Programming Models and Environments Summit

    Energy Technology Data Exchange (ETDEWEB)

    Heroux, Michael [US Dept. of Energy, Washington, DC (United States); Lethin, Richard [US Dept. of Energy, Washington, DC (United States)

    2016-09-19

    Programming models and environments play the essential role in high performance computing of enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations therefore mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make the design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and the results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, together with subsequent discussions among the summit participants and contributors to the topics in this report.

  1. Functional programming for computer vision

    Science.gov (United States)

    Breuel, Thomas M.

    1992-04-01

    Functional programming is a style of programming that avoids the use of side effects (like assignment) and uses functions as first-class data objects. Compared with imperative programs, functional programs can be parallelized better, and they provide better encapsulation, type checking, and abstractions. This is important for building and integrating large vision software systems. In the past, efficiency has been an obstacle to the application of functional programming techniques in computationally intensive areas such as computer vision. We discuss and evaluate several 'functional' data structures for efficiently representing the data structures and objects common in computer vision. In particular, we address: automatic storage allocation and reclamation issues; abstraction of control structures; efficient sequential update of large data structures; representing images as functions; and object-oriented programming. Our experience suggests that functional techniques are feasible for high-performance vision systems, and that a functional approach simplifies the implementation and integration of vision systems greatly. Examples in C++ and SML are given.
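
    The paper's examples are in C++ and SML; as a language-neutral illustration of one of the listed ideas, representing images as functions, the following Python sketch (not from the paper) builds images as closures and composes transformations without mutating any pixel data:

```python
from typing import Callable
# An "image" is simply a function (x, y) -> intensity.
Image = Callable[[float, float], float]

def disk(cx: float, cy: float, r: float) -> Image:
    """A synthetic binary image: 1 inside a disk, 0 outside."""
    return lambda x, y: 1.0 if (x - cx) ** 2 + (y - cy) ** 2 <= r * r else 0.0

def translate(im: Image, dx: float, dy: float) -> Image:
    """Translation implemented by composing coordinate transforms."""
    return lambda x, y: im(x - dx, y - dy)

def blend(a: Image, b: Image, alpha: float) -> Image:
    """Pointwise combination of two images."""
    return lambda x, y: alpha * a(x, y) + (1 - alpha) * b(x, y)

img = blend(disk(0, 0, 1), translate(disk(0, 0, 1), 2, 0), 0.5)
print(img(0.0, 0.0), img(2.0, 0.0), img(5.0, 5.0))   # 0.5 0.5 0.0
```

    Because nothing is ever overwritten, such images can be freely shared between threads, which is one of the parallelization advantages the abstract mentions.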

  2. Answer Set Programming and Other Computing Paradigms

    Science.gov (United States)

    Meng, Yunsong

    2013-01-01

    Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods originating from building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to…

  3. The reactor physics computer programs in PC's era

    International Nuclear Information System (INIS)

    Nainer, O.; Serghiuta, D.

    1995-01-01

    The main objective of reactor physics analysis is the evaluation of the flux and power distribution over the reactor core. For CANDU reactors, sophisticated computer programs such as FMDP and RFSP were developed 20 years ago for mainframe computers. These programs were adapted to run on workstations under UNIX or DOS, but they lack a feature that could improve their use, namely user friendliness. To use these programs, users must deal with a great amount of information contained in sophisticated files. Modifying a model is a great challenge: it is necessary to bear in mind all the geometrical dimensions and, accordingly, to modify the core model to match the new requirements, all within a line-oriented input file. For a DOS platform, using an average-performance PC system, would it be possible to represent and modify all the geometrical and physical parameters in a meaningful way, on screen, using an intuitive graphic user interface; to reduce the real time elapsed in order to perform complex fuel-management analysis 'at home'; and to avoid rewriting the mainframe version of the program? The author's answer is a fuel-management computer package operating on a PC that runs 3 times faster than on a CDC-Cyber 830 mainframe when hosted on a 486DX/33MHz/8MB RAM system, or 20 times faster on a Pentium PC. (author). 5 refs., 1 tab., 5 figs

  4. LIAR: A COMPUTER PROGRAM FOR THE SIMULATION AND MODELING OF HIGH PERFORMANCE LINACS

    International Nuclear Information System (INIS)

    Adolphsen, Chris

    2003-01-01

    The computer program LIAR ('LInear Accelerator Research code') is a numerical simulation and tracking program for linear colliders. The LIAR project was started at SLAC in August 1995 in order to provide a computing and simulation tool that specifically addresses the needs of high energy linear colliders. LIAR is designed to be used for a variety of different linear accelerators. It has been applied to and checked against the existing Stanford Linear Collider (SLC) as well as the linacs of the proposed Next Linear Collider (NLC) and the proposed Linac Coherent Light Source (LCLS). The program includes wakefield effects, a 4D coupled beam description, specific optimization algorithms and other advanced features. We describe the most important concepts and highlights of the program. After having presented the LIAR program at the LINAC96 and the PAC97 conferences, we now introduce it to the European particle accelerator community

  5. A computer program to calculate the committed dose equivalent after the inhalation of radioactivity

    International Nuclear Information System (INIS)

    Van der Woude, S.

    1989-03-01

    A growing number of people are, as part of their occupation, at risk of being exposed to radiation originating from sources inside their bodies. The quantification of this exposure is an important part of health physics. The International Commission on Radiological Protection (ICRP) developed a first-order kinetics compartmental model to determine the transport of radioactive material through the human body. The model and the parameters involved in its use, are discussed. A versatile computer program was developed to do the following after the in vivo measurement of either the organ- or whole-body activity: calculate the original amount of radioactive material which was inhaled (intake) by employing the ICRP compartmental model of the human body; compare this intake to calculated reference levels and state any action to be taken for the case under consideration; calculate the committed dose equivalent resulting from this intake. In the execution of the above-mentioned calculations, the computer program makes provision for different aerosol particle sizes and the effect of previous intakes. Model parameters can easily be changed to take the effects of, for instance, medical intervention into account. The computer program and the organization of the data in the input files are such that the computer program can be applied to any first-order kinetics compartmental model. The computer program can also conveniently be used for research on problems related to the application of the ICRP model. 18 refs., 25 figs., 5 tabs
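
    The program itself and the ICRP parameter values are not given in this record. The sketch below shows the general shape of such a first-order kinetics compartmental calculation: a linear system dq/dt = A q solved with a matrix exponential. The compartment layout and rate constants are invented placeholders, not ICRP values:

```python
import numpy as np
from scipy.linalg import expm

# Illustrative 3-compartment first-order kinetics model (lungs -> blood ->
# excretion). The transfer-rate constants below are made-up placeholders,
# NOT ICRP parameter values.
k_lung_blood = 0.05      # 1/day
k_blood_excr = 0.10      # 1/day
lam = 0.02               # radioactive decay constant, 1/day

# dq/dt = A q, with radioactive decay removing activity from every compartment
A = np.array([
    [-(k_lung_blood + lam), 0.0,                  0.0 ],
    [ k_lung_blood,        -(k_blood_excr + lam), 0.0 ],
    [ 0.0,                  k_blood_excr,        -lam ],
])

q0 = np.array([1.0, 0.0, 0.0])       # unit intake deposited in the lungs

for t in (1.0, 10.0, 50.0):          # days after intake
    q = expm(A * t) @ q0             # exact solution of the linear system
    print(t, q)
```

    Inverting this relation, i.e. scaling the predicted retained activity to match an in vivo measurement, is how such a program back-calculates the original intake.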

  6. The deterministic computational modelling of radioactivity

    International Nuclear Information System (INIS)

    Damasceno, Ralf M.; Barros, Ricardo C.

    2009-01-01

    This paper describes a computational application (software) that models simple radioactive decay, decay to stable nuclei, and directly coupled decay chains of up to thirteen radioactive decays. An internal data bank holds the decay constants of the various existing decay modes, which considerably simplifies the use of the program by people who are not connected to the nuclear area and do not have specialist knowledge of it. The paper presents numerical results for typical model problems
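
    For the chain-decay part of what this record describes, the classical Bateman solution gives each chain member analytically. The sketch below (not the authors' code) implements that formula for a chain with distinct decay constants; the chain length and decay constants in the example are arbitrary illustrative values:

```python
import math

def bateman(n1_0, lambdas, t):
    """Bateman solution for a linear decay chain 1 -> 2 -> ... -> n with only
    the parent present at t = 0. Returns the amount of each chain member at
    time t (same units as n1_0). Requires distinct decay constants."""
    n = len(lambdas)
    result = []
    for k in range(1, n + 1):
        prod_lams = math.prod(lambdas[:k - 1])          # lambda_1 * ... * lambda_{k-1}
        total = 0.0
        for i in range(k):
            denom = math.prod(lambdas[j] - lambdas[i] for j in range(k) if j != i)
            total += math.exp(-lambdas[i] * t) / denom
        result.append(n1_0 * prod_lams * total)
    return result

# Illustrative 3-member chain (decay constants in 1/h, made-up values)
print(bateman(1.0e6, [0.1, 0.05, 0.01], t=24.0))
```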

  7. Methods and computer programs for PWR's fuel management: Programs Sothis and Ciclon

    International Nuclear Information System (INIS)

    Aragones, J.M.; Corella, M.R.; Martinez-Val, J.M.

    1976-01-01

    Methods and computer programs developed at JEN for fuel management in PWRs are discussed, including the scope of the model, procedures for the systematic selection of alternatives to be evaluated, the basis of the model for the neutronic calculation, methods for fuel-cost calculation, procedures for equilibrium and transition cycle calculations with the Sothis and Ciclon codes, and validation of the methods by comparison with reference results. (author)

  8. Computer programs for the numerical modelling of water flow in rock masses

    International Nuclear Information System (INIS)

    Croney, P.; Richards, L.R.

    1985-08-01

    Water flow in rock joints provides a very important possible route for the migration of radionuclides from radioactive waste within a repository back to the biosphere. Two computer programs DAPHNE and FPM have been developed to model two dimensional fluid flow in jointed rock masses. They have been developed to run on microcomputer systems suitable for field locations. The fluid flows in a number of jointed rock systems have been examined and certain controlling functions identified. A methodology has been developed for assessing the anisotropic permeability of jointed rock. A number of examples of unconfined flow into surface and underground openings have been analysed and ground water lowering, pore water pressures and flow quantities predicted. (author)

  9. Developing robotic behavior using a genetic programming model

    International Nuclear Information System (INIS)

    Pryor, R.J.

    1998-01-01

    This report describes the methodology for using a genetic programming model to develop tracking behaviors for autonomous, microscale robotic vehicles. The use of such vehicles for surveillance and detection operations has become increasingly important in defense and humanitarian applications. Through an evolutionary process similar to that found in nature, the genetic programming model generates a computer program that when downloaded onto a robotic vehicle's on-board computer will guide the robot to successfully accomplish its task. Simulations of multiple robots engaged in problem-solving tasks have demonstrated cooperative behaviors. This report also discusses the behavior model produced by genetic programming and presents some results achieved during the study

  10. Brine: a computer program to compute brine migration adjacent to a nuclear waste canister in a salt repository

    International Nuclear Information System (INIS)

    Duckworth, G.D.; Fuller, M.E.

    1980-01-01

    This report presents a mathematical model used to predict brine migration toward a nuclear waste canister in a bedded salt repository. The mathematical model is implemented in a computer program called BRINE. The program is written in FORTRAN and executes in the batch mode on a CDC 7600. A description of the program input requirements and output available is included. Samples of input and output are given

  11. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
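
    The patented method is summarized only briefly in this record. A minimal sketch of the grouping step it describes, collecting threads by their calling-instruction addresses so that anomalous threads stand out, might look like this (thread IDs and addresses are made-up toy data):

```python
from collections import defaultdict

def group_threads_by_callsite(thread_stacks):
    """Group threads that share the same sequence of calling-instruction
    addresses, so that outliers (potentially hung or defective threads)
    stand out. thread_stacks maps a thread id to a tuple of addresses."""
    groups = defaultdict(list)
    for tid, addresses in thread_stacks.items():
        groups[tuple(addresses)].append(tid)
    return groups

# Toy example: three threads share a call path, one is stuck elsewhere
stacks = {
    0: (0x400A10, 0x400B2C, 0x400C44),
    1: (0x400A10, 0x400B2C, 0x400C44),
    2: (0x400A10, 0x400B2C, 0x400C44),
    3: (0x400A10, 0x400D98),              # the odd one out
}
for addresses, tids in group_threads_by_callsite(stacks).items():
    print([hex(a) for a in addresses], "->", sorted(tids))
```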

  12. GEOSURF: a computer program for modeling adsorption on mineral surfaces from aqueous solution

    Science.gov (United States)

    Sahai, Nita; Sverjensky, Dimitri A.

    1998-11-01

    A new program, GEOSURF, has been developed for calculating aqueous and surface speciation consistent with the triple-layer model of surface complexation. GEOSURF is an extension of the original programs MINEQL, MICROQL and HYDRAQL. We present, here, the basic algorithm of GEOSURF along with a description of the new features implemented. GEOSURF is linked to internally consistent data bases for surface species (SURFK.DAT) and for aqueous species (AQSOL.DAT). SURFK.DAT contains properties of minerals such as site densities, and equilibrium constants for adsorption of aqueous protons and electrolyte ions on a variety of oxides and hydroxides. The Helgeson, Kirkham and Flowers version of the extended Debye-Huckel Equation for 1:1 electrolytes is implemented for calculating aqueous activity coefficients. This permits the calculation of speciation at ionic strengths greater than 0.5 M. The activity of water is computed explicitly from the osmotic coefficient of the solution, and the total amount of electrolyte cation (or anion) is adjusted to satisfy the electroneutrality condition. Finally, the use of standard symbols for chemical species rather than species identification numbers is included to facilitate use of the program. One of the main limitations of GEOSURF is that aqueous and surface speciation can only be calculated at fixed pH and at fixed concentration of total adsorbate. Thus, the program cannot perform reaction-path calculations: it cannot determine whether or not a solution is over- or under-saturated with respect to one or more solid phases. To check the proper running of GEOSURF, we have compared results generated by GEOSURF with those from two other programs, HYDRAQL and EQ3. The Davies equation and the "bdot" equation, respectively, are used in the latter two programs for calculating aqueous activity coefficients. An example of the model fit to experimental data for rutile in 0.001 M-2.0 M NaNO 3 is included.
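
    GEOSURF's own activity model, the Helgeson-Kirkham-Flowers extended Debye-Huckel equation, is not reproduced in this record. As a simpler, closely related illustration, the sketch below implements the Davies equation that the abstract attributes to HYDRAQL; the Debye-Huckel A parameter is the commonly quoted 25 degree C value:

```python
import math

A_DEBYE_HUCKEL = 0.509   # (kg/mol)^0.5 for water at 25 degrees C (assumed value)

def davies_log_gamma(charge, ionic_strength):
    """Davies equation for the activity coefficient of an ion of the given
    charge at the given molal ionic strength:
    log10(gamma) = -A * z^2 * (sqrt(I)/(1 + sqrt(I)) - 0.3*I)."""
    sqrt_i = math.sqrt(ionic_strength)
    return -A_DEBYE_HUCKEL * charge**2 * (sqrt_i / (1.0 + sqrt_i) - 0.3 * ionic_strength)

for I in (0.001, 0.01, 0.1, 0.5):
    gamma = 10 ** davies_log_gamma(1, I)      # monovalent ion, e.g. Na+
    print(I, round(gamma, 3))
```

    The extended Debye-Huckel form adds an ion-size term in the denominator and a linear correction in I, which is what lets GEOSURF reach ionic strengths above 0.5 M.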

  13. Multithreaded transactions in scientific computing. The Growth06_v2 program

    Science.gov (United States)

    Daniluk, Andrzej

    2009-07-01

    Writing a concurrent program can be more difficult than writing a sequential program. A programmer needs to think about synchronization, race conditions and shared variables. Transactions help reduce the inconvenience of using threads. A transaction is an abstraction which allows programmers to group a sequence of actions on the program into a logical, higher-level computation unit. This paper presents a new version of the GROWTHGr and GROWTH06 programs. New version program summary: Program title: GROWTH06_v2; Catalogue identifier: ADVL_v2_1; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVL_v2_1.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 65 255; No. of bytes in distributed program, including test data, etc.: 865 985; Distribution format: tar.gz; Programming language: Object Pascal; Computer: Pentium-based PC; Operating system: Windows 9x, XP, NT, Vista; RAM: more than 1 MB; Classification: 4.3, 7.2, 6.2, 8, 14; Catalogue identifier of previous version: ADVL_v2_0; Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 678; Does the new version supersede the previous version?: Yes. Nature of problem: The programs compute the RHEED intensities during the growth of thin epitaxial structures prepared using molecular beam epitaxy (MBE). The computations are based on the use of kinematical diffraction theory. Solution method: Epitaxial growth of thin films is modelled by a set of non-linear differential equations [1]. The Runge-Kutta method with adaptive stepsize control was used for solving the initial value problem for the non-linear differential equations [2]. Reasons for new version: According to the users' suggestions, the functionality of the program has been improved. Moreover, new use cases have been added which make the handling of the program easier and more

  14. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
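
    CARE's repository of redundancy equations is not listed in this record. As a generic illustration of the kind of reliability-theoretic function such a tool evaluates, the sketch below computes the reliability of a k-out-of-n redundant configuration, with triple modular redundancy as the example:

```python
from math import comb

def k_of_n_reliability(k, n, r):
    """Reliability of a k-out-of-n redundant configuration of identical,
    independent modules each with reliability r: the system survives if at
    least k of the n modules survive."""
    return sum(comb(n, i) * r**i * (1 - r) ** (n - i) for i in range(k, n + 1))

# Triple modular redundancy (2-of-3 voting); for identical modules this
# reduces to 3r^2 - 2r^3.
for r in (0.90, 0.95, 0.99):
    print(r, k_of_n_reliability(2, 3, r))
```

    A tool like CARE would combine many such building-block expressions, with mission time and failure rates as the ground instances of the variables.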

  15. The psychology of computer programming

    CERN Document Server

    Weinberg, Gerald Marvin

    1998-01-01

    This landmark 1971 classic is reprinted with a new preface, chapter-by-chapter commentary, and straight-from-the-heart observations on topics that affect the professional life of programmers. Long regarded as one of the first books to pioneer a people-oriented approach to computing, The Psychology of Computer Programming endures as a penetrating analysis of the intelligence, skill, teamwork, and problem-solving power of the computer programmer. Finding the chapters strikingly relevant to today's issues in programming, Gerald M. Weinberg adds new insights and highlights the similarities and differences between now and then. Using a conversational style that invites the reader to join him, Weinberg reunites with some of his most insightful writings on the human side of software engineering. Topics include egoless programming, intelligence, psychological measurement, personality factors, motivation, training, social problems on large projects, problem-solving ability, programming language design, team formati...

  16. Exploring Poetry through Interactive Computer Programs.

    Science.gov (United States)

    Nimchinsky, Howard; Camp, Jocelyn

    The goal of a project was to design, test, and evaluate several computer programs that allow students in introductory literature and poetry courses to explore a poem in detail and, through a dialogue with the program, to develop their own interpretation of it. Computer programs were completed on poems by Robert Frost and W.H. Auden. Both programs…

  17. Model description. NUDOS: A computer program for assessing the consequences of airborne releases of radionuclides

    International Nuclear Information System (INIS)

    Poley, A.D.

    1996-02-01

    NUDOS is a computer program that can be used to evaluate the consequences of airborne releases of radioactive material. The consequences which can be evaluated are individual dose and the associated radiological risk, collective dose and the contamination of land. The code is capable of dealing with both continuous (routine) and accidental releases. For accidental releases both deterministic and probabilistic calculations can be performed, and the impact and effectiveness of emergency actions can be evaluated. This report contains a description of the models contained in NUDOS92 and the recommended values for the input parameters of these models. Additionally, a short overview is given of the model improvements planned for the next NUDOS version. (orig.)
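
    The specific dispersion and dose models in NUDOS are described in the report itself, not in this abstract. As a generic illustration of an atmospheric-release calculation of this kind, the sketch below evaluates the standard Gaussian plume formula with ground reflection; the release rate, wind speed, effective height and dispersion parameters are arbitrary example values, not NUDOS inputs:

```python
import math

def gaussian_plume_conc(Q, u, y, z, h, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration (Bq/m^3) at crosswind
    distance y (m) and height z (m), for a continuous release of Q Bq/s from
    an effective height h (m) into a wind of speed u (m/s). sigma_y and
    sigma_z are the dispersion parameters (m) at the downwind distance of
    interest and must be supplied, e.g. from Pasquill-Gifford curves."""
    crosswind = math.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (math.exp(-(z - h) ** 2 / (2.0 * sigma_z**2))
                + math.exp(-(z + h) ** 2 / (2.0 * sigma_z**2)))
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * crosswind * vertical

# Plume centreline, ground level, with assumed dispersion parameters
conc = gaussian_plume_conc(Q=1.0e9, u=5.0, y=0.0, z=0.0, h=30.0,
                           sigma_y=80.0, sigma_z=40.0)
print(conc)   # Bq/m^3; multiplying by a breathing rate and dose coefficient gives inhalation dose
```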

  18. The Y2K program for scientific-analysis computer programs at AECL

    International Nuclear Information System (INIS)

    Popovic, J.; Gaver, C.; Chapman, D.

    1999-01-01

    The evaluation of scientific-analysis computer programs for year-2000 compliance is part of AECL's year-2000 (Y2K) initiative, which addresses both the infrastructure systems at AECL and AECL's products and services. This paper describes the Y2K-compliance program for scientific-analysis computer codes. This program involves the integrated evaluation of the computer hardware, middleware, and third-party software in addition to the scientific codes developed in-house. The project involves several steps: the assessment of the scientific computer programs for Y2K compliance, performing any required corrective actions, porting the programs to Y2K-compliant platforms, and verification of the programs after porting. Some programs or program versions, deemed no longer required in the year 2000 and beyond, will be retired and archived. (author)

  20. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  1. CRYOCOL a computer program to calculate the cryogenic distillation of hydrogen isotopes

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1993-02-01

    This report describes the computer model and mathematical method coded into the AECL Research computer program CRYOCOL. The purpose of CRYOCOL is to calculate the separation of hydrogen isotopes by cryogenic distillation. (Author)

  2. Pair Programming as a Modern Method of Teaching Computer Science

    Directory of Open Access Journals (Sweden)

    Irena Nančovska Šerbec

    2008-10-01

    Full Text Available At the Faculty of Education, University of Ljubljana, we educate future computer science teachers. Besides didactical, pedagogical, mathematical and other interdisciplinary knowledge, students gain knowledge and skills of programming that are crucial for computer science teachers. For all courses, the main emphasis is the absorption of professional competences related to the teaching profession and the programming profile. The latter are selected according to the well-known document, the ACM Computing Curricula. The professional knowledge is therefore associated and combined with the teaching knowledge and skills. In the paper we present how to achieve competences related to programming by using different didactical models (semiotic ladder, cognitive objectives taxonomy, problem solving) and the modern teaching method "pair programming". Pair programming differs from standard methods (individual work, seminars, projects, etc.). It belongs to extreme programming as a discipline of software development and is known to have positive effects on teaching a first programming language. We have experimentally observed pair programming in the introductory programming course. The paper presents and analyzes the results of using this method: the aspects of satisfaction during programming and the level of gained knowledge. The results are in general positive and demonstrate the promising usage of this teaching method.

  3. USERDA computer program summaries. Numbers 177--239

    International Nuclear Information System (INIS)

    1975-10-01

    Since 1960 the Argonne Code Center has served as a U. S. Atomic Energy Commission information center for computer programs developed and used primarily for the solution of problems in nuclear physics, reactor design, reactor engineering and operation. The Center, through a network of registered installations, collects, validates, maintains, and distributes a library of these computer programs and publishes a compilation of abstracts describing them. In 1972 the scope of the Center's activities was officially expanded to include computer programs developed in all of the U. S. Atomic Energy Commission program areas and the compilation and publication of this report. The Computer Program Summary report contains summaries of computer programs at the specification stage, under development, being checked out, in use, or available at ERDA offices, laboratories, and contractor installations. Programs are divided into the following categories: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and reactor economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design programs; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; controlled thermonuclear research; biology and medicine; and data

  4. Multi-level programming paradigm for extreme computing

    International Nuclear Information System (INIS)

    Petiton, S.; Sato, M.; Emad, N.; Calvin, C.; Tsuji, M.; Dandouna, M.

    2013-01-01

    In order to propose a framework and programming paradigms for post peta-scale computing, on the road to exa-scale computing and beyond, we introduced new languages, associated with a hierarchical multi-level programming paradigm, allowing scientific end-users and developers to program highly hierarchical architectures designed for extreme computing. In this paper, we explain the interest of such a hierarchical multi-level programming paradigm for extreme computing and its good adaptation to several large computational science applications, such as linear algebra solvers used for reactor core physics. We describe the YML language and framework, which allows describing graphs of parallel components that may be developed using a PGAS-like language such as XMP, and then scheduled and computed on supercomputers. Then, we propose experiments on supercomputers (such as the 'K' and 'Hooper' ones) with the hybrid method MERAM (Multiple Explicitly Restarted Arnoldi Method) as a case study for iterative methods manipulating sparse matrices, and the block Gauss-Jordan method as a case study for direct methods manipulating dense matrices. We conclude by proposing evolutions for this programming paradigm. (authors)

  5. Cross-scale Efficient Tensor Contractions for Coupled Cluster Computations Through Multiple Programming Model Backends

    Energy Technology Data Exchange (ETDEWEB)

    Ibrahim, Khaled Z. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Epifanovsky, Evgeny [Q-Chem, Inc., Pleasanton, CA (United States); Williams, Samuel W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Krylov, Anna I. [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Chemistry

    2016-07-26

    Coupled-cluster methods provide highly accurate models of molecular structure by explicit numerical calculation of tensors representing the correlation between electrons. These calculations are dominated by a sequence of tensor contractions, motivating the development of numerical libraries for such operations. While based on matrix-matrix multiplication, these libraries are specialized to exploit symmetries in the molecular structure and in electronic interactions, and thus reduce the size of the tensor representation and the complexity of contractions. The resulting algorithms are irregular and their parallelization has been previously achieved via the use of dynamic scheduling or specialized data decompositions. We introduce our efforts to extend the Libtensor framework to work in the distributed memory environment in a scalable and energy efficient manner. We achieve up to a 240× speedup compared with the best optimized shared memory implementation. We attain scalability to hundreds of thousands of compute cores on three distributed-memory architectures (Cray XC30 & XC40, BlueGene/Q), and on a heterogeneous GPU-CPU system (Cray XK7). As the bottlenecks shift from compute-bound DGEMMs to communication-bound collectives as the size of the molecular system scales, we adopt two radically different parallelization approaches for handling load imbalance. Nevertheless, we preserve a unified interface to both programming models to maintain the productivity of computational quantum chemists.
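
    To make the central operation concrete, the sketch below performs one representative tensor contraction with NumPy and shows that it is equivalent to a single matrix-matrix multiply, the DGEMM formulation the abstract refers to. The tensor labels, dimensions and random data are illustrative and do not come from Libtensor:

```python
import numpy as np

# A representative contraction of the kind that dominates coupled-cluster
# codes: a two-electron integral tensor contracted with a T2 amplitude
# tensor over two indices. Real codes additionally exploit permutational
# and point-group symmetry to shrink storage and flops.
no, nv = 4, 6                                  # occupied / virtual orbital counts (toy sizes)
rng = np.random.default_rng(0)
V = rng.standard_normal((nv, nv, nv, nv))      # stands in for <ab||cd>
T2 = rng.standard_normal((no, no, nv, nv))     # stands in for t_ij^ab

# W_ij^cd = sum_ab t_ij^ab <ab||cd>
W = np.einsum("ijab,abcd->ijcd", T2, V)

# The same contraction cast as one matrix-matrix multiply (a DGEMM),
# which is how such libraries actually execute it.
W_gemm = (T2.reshape(no * no, nv * nv) @ V.reshape(nv * nv, nv * nv)).reshape(no, no, nv, nv)
print(np.allclose(W, W_gemm))                  # True
```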

  6. The Radiological Safety Analysis Computer Program (RSAC-5) user's manual

    International Nuclear Information System (INIS)

    Wenzel, D.R.

    1994-02-01

    The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods

  7. Generic Mathematical Programming Formulation and Solution for Computer-Aided Molecular Design

    DEFF Research Database (Denmark)

    Zhang, Lei; Cignitti, Stefano; Gani, Rafiqul

    2015-01-01

    This short communication presents a generic mathematical programming formulation for Computer-Aided Molecular Design (CAMD). A given CAMD problem, based on target properties, is formulated as a Mixed Integer Linear/Non-Linear Program (MILP/MINLP). The mathematical programming model presented here, formulated as an MILP/MINLP problem, considers first-order and second-order molecular groups for molecular structure representation and property estimation. It is shown that various CAMD problems can be formulated and solved through this model.
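
    A minimal sketch of the formulation idea, not the authors' actual MILP/MINLP model: candidate molecules are encoded as integer group counts, a target property is estimated from first-order group contributions, and feasible candidates within a property window are enumerated by brute force. The group names, contribution values, and the structural feasibility rule below are illustrative assumptions only.

    ```python
    from itertools import product

    # Hypothetical first-order groups with illustrative, Joback-like
    # boiling-point contributions; not the paper's data.
    groups = {"CH3": 23.6, "CH2": 22.9, "OH": 92.9}

    target_low, target_high = 300.0, 340.0   # desired property window (K)
    max_count = 4                            # upper bound on each group count

    feasible = []
    for counts in product(range(max_count + 1), repeat=len(groups)):
        n = dict(zip(groups, counts))
        if sum(counts) == 0:
            continue
        # Crude structural feasibility rule (an assumption, not a real valence
        # check): terminal groups must be supported by enough chain groups.
        if n["CH3"] + n["OH"] > n["CH2"] + 2:
            continue
        prop = 198.0 + sum(groups[g] * n[g] for g in groups)  # first-order estimate
        if target_low <= prop <= target_high:
            feasible.append((n, round(prop, 1)))

    for molecule, prop in feasible:
        print(molecule, prop)
    ```

    A real CAMD formulation would hand the same binary/integer group-selection variables and property constraints to an MILP/MINLP solver instead of enumerating them.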

  8. Computational challenges in modeling gene regulatory events.

    Science.gov (United States)

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  9. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  10. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims at trying to identify some methods of syntax analysis which can be used for computer programming languages while putting aside computer devices which influence the choice of the programming language and methods of analysis and compilation. In a first part, the author proposes attempts of formalization of Chomsky grammar languages. In a second part, he studies analytical grammars, and then studies a compiler or analytic grammar for the Fortran language

  11. A Computer Program for Assessing Nuclear Safety Culture Impact

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of)

    2014-10-15

    Several NPP accidents, including Fukushima Daiichi in 2011 and Chernobyl in 1986, have pointed to a lack of safety culture as one of their root causes. Due to its latent influence on safety performance, safety culture has become an important issue in safety research. Most studies describe how to evaluate the state of an organization's safety culture; they do not, however, address the possibility that an accident occurs because of a lack of safety culture. A methodology for evaluating the impact of safety culture on NPP safety is therefore required. In this study, a methodology for assessing safety culture impact is suggested and a computer program is developed for its application. The SCII model is a new methodology for assessing safety culture impact quantitatively using a PSA model, and the computer program developed here applies it. The program visualizes the SCIs and the SCIIs. It might contribute to comparing the level of safety culture among NPPs as well as to improving NPP safety management.

  12. GIGMF - A statistical model program

    International Nuclear Information System (INIS)

    Vladuca, G.; Deberth, C.

    1978-01-01

    The program GIGMF computes the differential and integrated statistical model cross sections for reactions proceeding through a compound nuclear stage. The computational method is based on the Hauser-Feshbach-Wolfenstein theory, modified to include the modern version of Tepel et al. Although the program was written for a PDP-15 computer with 16K high-speed memory, many reaction channels can be taken into account, with the following restrictions: the projectile spin must be less than 2, and the maximum spin momentum of the compound nucleus cannot be greater than 10. These restrictions are due solely to the storage allotments and may be easily relaxed. The energy of the impinging particle, the target and projectile masses, the spins and parities of the projectile, target, emergent and residual nuclei, the maximum orbital momentum, and the transmission coefficients for each reaction channel are the input parameters of the program. (author)

  13. Two-Language, Two-Paradigm Introductory Computing Curriculum Model and its Implementation

    OpenAIRE

    Zanev, Vladimir; Radenski, Atanas

    2011-01-01

    This paper analyzes difficulties with the introduction of object-oriented concepts in introductory computing education and then proposes a two-language, two-paradigm curriculum model that alleviates such difficulties. Our two-language, two-paradigm curriculum model begins with teaching imperative programming using Python programming language, continues with teaching object-oriented computing using Java, and concludes with teaching object-oriented data structures with Java.

  14. Preschool Cookbook of Computer Programming Topics

    Science.gov (United States)

    Morgado, Leonel; Cruz, Maria; Kahn, Ken

    2010-01-01

    A common problem in computer programming use for education in general, not simply as a technical skill, is that children and teachers find themselves constrained by what is possible through limited expertise in computer programming techniques. This is particularly noticeable at the preliterate level, where constructs tend to be limited to…

  15. ALMOD-JRC computer program

    International Nuclear Information System (INIS)

    Cacciabue, P.C.; Lisanti, B.; Tozzi, A.

    1984-01-01

    This paper discusses the details of the newly developed or modified models of the computer program ALMOD-JRC, originating from ALMOD 3/Rel 4. The most important argument for the implementation of the new models was the need to enlarge the spectrum of the simulated phenomena and to improve the simulation of experimental facilities such as LOFT or LOBI. This has led to a better formulation of the heat transfer and pressure drop correlations and to the implementation of the treatment of heat losses to structural materials. In particular, a series of test cases on real power plants, a pre-test examination of a LOBI station blackout ATWS experiment, and the post-test analysis of the L9-3 experiment show the ability of ALMOD-JRC to correctly simulate PWR incident sequences. Although the code capabilities have been expanded in ALMOD-JRC, the limitations of the original version of the program still hold with respect to the treatment of the coolant thermohydraulics as homogeneous flow for two-phase conditions in the primary coolant circuit. The other interesting feature of the new code is the remarkably shorter running times obtained with the introduction of simplified numerical treatments for the solving equations, without significant loss of accuracy in the results.

  16. ADAM: A computer program to simulate selective-breeding schemes for animals

    DEFF Research Database (Denmark)

    Pedersen, L D; Sørensen, A C; Henryon, M

    2009-01-01

    ADAM is a computer program that models selective breeding schemes for animals using stochastic simulation. The program simulates a population of animals and traces the genetic changes in the population under different selective breeding scenarios. It caters to different population structures......, genetic models, selection strategies, and mating designs. ADAM can be used to evaluate breeding schemes and generate genetic data to test statistical tools...

  17. Computer program for diagnostic X-ray exposure conversion

    International Nuclear Information System (INIS)

    Lewis, S.L.

    1984-01-01

    Presented is a computer program designed to convert any given set of diagnostic X-ray exposure factors sequentially into another, yielding either an equivalent photographic density or one increased or decreased by a specifiable proportion. In addition to the facilities for manipulating a set of exposure factors, the program can print hard (paper) copy, enabling the results to be pasted into a notebook and used at any time. This program was originally written as an investigative exercise to examine the potential use of computers for practical radiographic purposes as conventionally encountered. At the same time, its possible use as an educational tool was borne in mind. To these ends, the current version of this program may be used as a means whereby exposure factors used in a diagnostic department may be altered to suit a particular requirement, or may be used in the school as a mathematical model to describe the behaviour of exposure factors under manipulation without patient exposure. (author)

  18. Python for Scientific Computing Education: Modeling of Queueing Systems

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2014-01-01

    In this paper, we present a methodology for the introduction to scientific computing based on model-centered learning. We propose multiphase queueing systems as a basis for learning objects. We use Python and parallel programming for implementing the models and present the computer code and results of stochastic simulations.
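
    A minimal Python sketch in the spirit of the paper's learning objects (not the authors' code): a multiphase (tandem) queueing system with Poisson arrivals and one exponential server per phase, simulated with a Lindley-type recursion. The arrival and service rates are arbitrary illustrative values.

    ```python
    import random

    random.seed(1)

    def simulate_tandem_queue(n_customers=10000, arrival_rate=1.0,
                              service_rates=(1.5, 1.8, 2.0)):
        """Monte Carlo simulation of a multiphase (tandem) queue:
        Poisson arrivals, one exponential server per phase, infinite buffers."""
        t = 0.0
        arrivals = []
        for _ in range(n_customers):
            t += random.expovariate(arrival_rate)
            arrivals.append(t)

        # Departure-time recursion: a customer starts a phase when both it has
        # arrived at that phase and the phase's server is free.
        prev_departures = arrivals
        for mu in service_rates:
            departures = []
            last = 0.0
            for a in prev_departures:
                start = max(a, last)             # wait for the server to free up
                last = start + random.expovariate(mu)
                departures.append(last)
            prev_departures = departures

        sojourn = [d - a for a, d in zip(arrivals, prev_departures)]
        return sum(sojourn) / len(sojourn)

    print("mean time in system:", round(simulate_tandem_queue(), 3))
    ```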

  19. PRETTA: A COMPUTER PROGRAM FOR PWR PRESSURIZER’S TRANSIENT THERMODYNAMICS

    Institute of Scientific and Technical Information of China (English)

    阿谢德; 徐济鋆

    2001-01-01

    A computer program, PRETTA (Pressurizer Transient Thermodynamics Analysis), was developed for predicting pressurizer behavior under transient conditions. It is based on the solution of the conservation laws of heat and mass applied to three separate, non-equilibrium thermodynamic regions. The program considers all of the important thermal-hydraulic phenomena occurring in the pressurizer: stratification of the hot water and incoming cold water, bulk flashing and condensation, wall condensation, and interfacial heat and mass transfer. Bubble-rise and rain-out models are developed to describe bulk flashing and condensation, respectively. To obtain the wall condensation rate, a one-dimensional heat conduction equation is solved by the pivoting method. The computer program predicts the pressure-time behavior of a PWR pressurizer during a variety of transients. The results obtained from the proposed mathematical model are in good agreement with available data on the performance of the CHASHMA nuclear power plant's pressurizer.
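
    As a hedged illustration of one ingredient mentioned above, the sketch below solves a one-dimensional transient heat conduction equation in a wall slab with an implicit finite-difference scheme and a tridiagonal (Thomas) solve. It is a generic stand-in, not PRETTA's wall-condensation model, and all material and boundary values are invented.

    ```python
    def thomas(a, b, c, d):
        """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal, d = rhs."""
        n = len(b)
        cp, dp = [0.0] * n, [0.0] * n
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m if i < n - 1 else 0.0
            dp[i] = (d[i] - a[i] * dp[i - 1]) / m
        x = [0.0] * n
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    # Implicit time step of dT/dt = alpha * d2T/dx2 on a wall slab, fixed-T faces.
    alpha, dx, dt, nx = 1.0e-5, 0.005, 1.0, 21     # illustrative values only
    r = alpha * dt / dx**2
    T = [300.0] * nx                               # initial wall temperature (K)
    T[0], T[-1] = 350.0, 300.0                     # hot fluid side / outer side

    for _ in range(100):                           # march 100 time steps
        a = [0.0] + [-r] * (nx - 2) + [0.0]
        b = [1.0] + [1.0 + 2.0 * r] * (nx - 2) + [1.0]
        c = [0.0] + [-r] * (nx - 2) + [0.0]
        d = [T[0]] + T[1:-1] + [T[-1]]             # boundary rows pin the face temperatures
        T = thomas(a, b, c, d)

    print([round(v, 1) for v in T])
    ```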

  20. Computer Tutorial Programs in Physics.

    Science.gov (United States)

    Faughn, Jerry; Kuhn, Karl

    1979-01-01

    Describes a series of computer tutorial programs which are intended to help college students in introductory physics courses. Information about these programs, which are either calculus or algebra-trig based, is presented. (HM)

  1. Computer Modelling of Photochemical Smog Formation

    Science.gov (United States)

    Huebert, Barry J.

    1974-01-01

    Discusses a computer program that has been used in environmental chemistry courses as an example of modelling as a vehicle for teaching chemical dynamics, and as a demonstration of some of the factors which affect the production of smog. (Author/GS)

  2. A computer program for activation analysis

    International Nuclear Information System (INIS)

    Rantanen, J.; Rosenberg, R.J.

    1983-01-01

    A computer program for calculating the results of activation analysis is described. The program comprises two gamma spectrum analysis programs, STOAV and SAMPO and one program for calculating elemental concentrations, KVANT. STOAV is based on a simple summation of channels and SAMPO is based on fitting of mathematical functions. The programs are tested by analyzing the IAEA G-1 test spectra. In the determination of peak location SAMPO is somewhat better than STOAV and in the determination of peak area SAMPO is more than twice as accurate as STOAV. On the other hand, SAMPO is three times as expensive as STOAV with the use of a Cyber 170 computer. (author)

  3. 32 CFR 701.125 - Computer matching program.

    Science.gov (United States)

    2010-07-01

    Excerpt from the Code of Federal Regulations: 32 CFR 701.125, "Computer matching program" (Title 32 National Defense, Department of the Navy, Documents Affecting the Public, DON Privacy Program; 2010-07-01 edition). The section addresses computer matching programs, including counterintelligence matches done in the course of performing a background check for Federal security clearances.

  4. Modeling of complex melting and solidification behavior in laser-irradiated materials [a description and users guide to the LASER8 computer program

    International Nuclear Information System (INIS)

    Geist, G.A.; Wood, R.F.

    1985-11-01

    The conceptual foundation of a computational model and a computer program based on it have been developed for treating various aspects of the complex melting and solidification behavior observed in pulsed laser-irradiated materials. A particularly important feature of the modeling is the capability of allowing melting and solidification to occur at temperatures other than the thermodynamic phase change temperatures. As a result, interfacial undercooling and overheating can be introduced and various types of nucleation events can be simulated. Calculations on silicon with the model have shown a wide variety of behavior, including the formation and propagation of multiple phase fronts. Although originally developed as a tool for studying certain problems arising in the field of laser annealing of semiconductors, the program should be useful in treating many types of systems in which phase changes and nucleation phenomena play important roles. This report describes the underlying physical and mathematical ideas and the basic relations used in LASER8. It also provides enough specific and detailed information on the program to serve as a guide for its use; a listing of one version of the program is given

  5. Introduction to computation and modeling for differential equations

    CERN Document Server

    Edsberg, Lennart

    2008-01-01

    An introduction to scientific computing for differential equations. Introduction to Computation and Modeling for Differential Equations provides a unified and integrated view of numerical analysis, mathematical modeling in applications, and programming to solve differential equations, which is essential in problem-solving across many disciplines, such as engineering, physics, and economics. This book successfully introduces readers to the subject through a unique "Five-M" approach: Modeling, Mathematics, Methods, MATLAB, and Multiphysics. This approach facilitates a thorough understanding of h...

  6. Computer modeling of commercial refrigerated warehouse facilities

    International Nuclear Information System (INIS)

    Nicoulin, C.V.; Jacobs, P.C.; Tory, S.

    1997-01-01

    The use of computer models to simulate the energy performance of large commercial refrigeration systems typically found in food processing facilities is an area of engineering practice that has seen little development to date. Current techniques for predicting energy consumption by such systems have focused on temperature-bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems. The HVAC and refrigeration system performance models in these simulation tools represent equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to these building types. The applicability of traditional building energy simulation tools to model refrigerated warehouse performance and analyze energy-saving options is therefore limited. The paper presents the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-stage and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper is on the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, are also presented.

  7. Splitting Computation of Answer Set Program and Its Application on E-service

    Directory of Open Access Journals (Sweden)

    Bo Yang

    2011-10-01

    As a primary means for representing and reasoning about knowledge, Answer Set Programming (ASP) has been applied in many areas such as planning, decision making, fault diagnosis, and the increasingly prevalent field of e-services. Based on the stable model semantics of logic programming, ASP can be used to solve various combinatorial search problems by finding the answer sets of logic programs that declaratively describe the problems. It is not an easy task to compute answer sets of a logic program using Gelfond and Lifschitz's definition directly. In this paper, we show some results on the characterization of answer sets of a logic program with constraints, and propose a way to split a program into several non-intersecting parts step by step, so that the computation of answer sets for every subprogram becomes relatively easy. To instantiate our splitting computation theory, an example about personalized product configuration in e-retailing is given to show the effectiveness of our method.

  8. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.

  9. Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.

    Science.gov (United States)

    Park, Eun-Jun; Park, Mihyun

    2015-11-01

    The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making.

  10. Programming model for distributed intelligent systems

    Science.gov (United States)

    Sztipanovits, J.; Biegl, C.; Karsai, G.; Bogunovic, N.; Purves, B.; Williams, R.; Christiansen, T.

    1988-01-01

    A programming model and architecture which was developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared memory based parallel computational models and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.

  11. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  12. Development of a single cell spherical shell model for an investigation of electrical properties with a computing program

    Directory of Open Access Journals (Sweden)

    Boonlamp, M.

    2005-03-01

    A spherical double-shell model (SDM) for a single cell has been developed, using Laplace's equation in spherical coordinates and boundary conditions. Electric field intensities and dielectric constants of each region inside and outside of the cell have been estimated. The dielectrophoretic spectrum of the real part of a complex function (Re[f(ω)]) was computed using Visual FoxPro Version 6, which gave calculated values for the electrical properties of the cell model to be compared with experimental values. The process was repeated until the percentage error was in an acceptable range. The calculated parameters were the dielectric constants and conductivities of the inner cytoplasm (εic, σic), the outer cytoplasm (εoc, σoc), the inner membrane (εim, σim), the outer membrane (εom, σom), and the suspending solution (εs, σs), together with the thickness of each layer (dom, doc, dim), respectively. This computer program provides estimated values of cell electrical properties with high accuracy and minimal computational time.
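
    A hedged sketch of the kind of dielectrophoretic spectrum calculation described, simplified to a homogeneous sphere (a single Clausius-Mossotti factor) rather than the paper's double-shell model; the permittivities and conductivities below are illustrative placeholders, not the fitted cell parameters.

    ```python
    import numpy as np

    def complex_permittivity(eps_r, sigma, omega, eps0=8.854e-12):
        """Complex permittivity eps* = eps_r*eps0 - j*sigma/omega."""
        return eps_r * eps0 - 1j * sigma / omega

    def clausius_mossotti(eps_p, eps_m):
        """Clausius-Mossotti factor for a homogeneous sphere in a medium."""
        return (eps_p - eps_m) / (eps_p + 2.0 * eps_m)

    # Illustrative (not the paper's) properties: cytoplasm-like sphere in a buffer.
    freqs = np.logspace(3, 9, 200)                 # 1 kHz .. 1 GHz
    omega = 2.0 * np.pi * freqs
    eps_particle = complex_permittivity(60.0, 0.5, omega)
    eps_medium = complex_permittivity(78.0, 0.01, omega)

    re_f = clausius_mossotti(eps_particle, eps_medium).real
    crossover = freqs[np.argmin(np.abs(re_f))]     # frequency where Re[f] changes sign
    print(f"DEP crossover frequency ~ {crossover:.3e} Hz")
    ```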

  13. Functional Programming in Computer Science

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Loren James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Davis, Marion Kei [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-19

    We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.

  14. Computer programs simplify optical system analysis

    Science.gov (United States)

    1965-01-01

    The optical ray-trace computer program performs geometrical ray tracing. The energy-trace program calculates the relative monochromatic flux density on a specific target area. This program uses the ray-trace program as a subroutine to generate a representation of the optical system.

  15. The RANDOM computer program: A linear congruential random number generator

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
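
    A minimal sketch of the linear congruential recurrence the report is about, x_{n+1} = (a*x_n + c) mod m. The parameters below are the well-known Park-Miller multiplicative values, chosen here for illustration; they are not necessarily the ones used in the RANDOM program.

    ```python
    class LCG:
        """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m."""
        def __init__(self, seed=12345, a=16807, c=0, m=2**31 - 1):
            self.state = seed
            self.a, self.c, self.m = a, c, m

        def next_int(self):
            self.state = (self.a * self.state + self.c) % self.m
            return self.state

        def next_float(self):
            return self.next_int() / self.m       # uniform in (0, 1)

    rng = LCG(seed=42)
    print([round(rng.next_float(), 6) for _ in range(5)])

    # Parameter selection of the kind RANCYCLE/ARITH assist with hinges on period
    # properties: for a multiplicative LCG (c = 0, m prime), the period divides m - 1.
    ```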

  16. Computer-based modeling in exact sciences research - III ...

    African Journals Online (AJOL)

    Molecular modeling techniques have found wide applicability in the study of the biological sciences and other exact science fields such as agriculture, mathematics, and computer science. In this write-up, a list of computer programs for predicting, for instance, the structure of proteins has been provided. Discussions on ...

  17. Gender Differences in the Use of Computers, Programming, and Peer Interactions in Computer Science Classrooms

    Science.gov (United States)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-01-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…

  18. Gender differences in the use of computers, programming, and peer interactions in computer science classrooms

    Science.gov (United States)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-12-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are more interested in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were not very involved in computing activities, whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoying playing with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.

  19. A COMPUTER PROGRAM FOR INTERPRETATION OF THE DATA OF VERTICAL ELECTRICAL SOUNDING VEZ-4A

    Directory of Open Access Journals (Sweden)

    D. G. Koliushko

    2017-06-01

    Purpose. To create a computer program for interpreting the results of vertical electrical sounding of the soil in the form of the multilayer model most typical for Ukraine. Methodology. The algorithm of the program is built on determining the soil structure with the help of the point current source method, the method of analogy, and the method of equivalence. The option of automatic interpretation is based on the Hooke-Jeeves method. The program is implemented in the Delphi programming language. Results. The computer program «VEZ-4A» offers both interactive and automatic interpretation of sounding results in a multilayered geoelectrical model. Originality. For the first time, a computer program for analyzing and interpreting the results of soil sounding with the Wenner configuration has been created on the basis of the analytical solution for the field of a current point source located in a four-, three- or two-layer structure. The paper reviews and analyzes the basic functions of the program. Practical value. The program «VEZ-4A» has been created and adapted for use in the electromagnetic diagnostics of the grounding of existing power plants and substations.

  20. Several problems of algorithmization in integrated computation programs on third generation computers for short circuit currents in complex power networks

    Energy Technology Data Exchange (ETDEWEB)

    Krylov, V.A.; Pisarenko, V.P.

    1982-01-01

    Methods of modeling complex power networks with short circuits in the networks are described. The methods are implemented in integrated computation programs for short circuit currents and equivalents in electrical networks with a large number of branch points (up to 1000) on a computer with a limited on-line memory capacity (M = 4030 for the computer).

  1. Computer-controlled mechanical lung model for application in pulmonary function studies

    NARCIS (Netherlands)

    A.F.M. Verbraak (Anton); J.E.W. Beneken; J.M. Bogaard (Jan); A. Versprille (Adrian)

    1995-01-01

    A computer-controlled mechanical lung model has been developed for testing lung function equipment, validation of computer programs and simulation of impaired pulmonary mechanics. The construction, function and some applications are described. The physical model is constructed from two

  2. A CAD (Classroom Assessment Design) of a Computer Programming Course

    Science.gov (United States)

    Hawi, Nazir S.

    2012-01-01

    This paper presents a CAD (classroom assessment design) of an entry-level undergraduate computer programming course "Computer Programming I". CAD has been the product of a long experience in teaching computer programming courses including teaching "Computer Programming I" 22 times. Each semester, CAD is evaluated and modified…

  3. SEISRISK II; a computer program for seismic hazard estimation

    Science.gov (United States)

    Bender, Bernice; Perkins, D.M.

    1982-01-01

    The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates, for each site on a grid of sites, the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program, SEISRISK I, which has never been documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program, and in addition includes rupture length and acceleration variability, which were not contained in the original version. We describe the model and how it is implemented in the computer program and provide a flowchart and listing of the code.
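
    A hedged sketch of the hazard arithmetic underlying this kind of program: with Poisson occurrences, the probability of at least one exceedance of a ground-motion level in T years is 1 - exp(-lambda*T). The point-source geometry and the attenuation expression below are invented placeholders, not SEISRISK II's model.

    ```python
    import math

    def exceedance_rate(sources, site, threshold):
        """Annual rate of ground motion > threshold at a site from point sources.
        Each source: (x, y, annual_rate, magnitude). Invented attenuation law."""
        rate = 0.0
        for x, y, lam, mag in sources:
            dist = math.hypot(x - site[0], y - site[1]) + 1.0   # km, avoid r = 0
            accel = 0.05 * math.exp(0.9 * mag) / dist**1.5      # placeholder attenuation (g)
            if accel > threshold:
                rate += lam
        return rate

    sources = [(10.0, 0.0, 0.02, 6.5), (40.0, 30.0, 0.05, 7.0)]  # hypothetical zones
    site = (0.0, 0.0)
    lam = exceedance_rate(sources, site, threshold=0.1)          # 0.1 g, say

    T = 50.0                                                     # exposure time, years
    p_exceed = 1.0 - math.exp(-lam * T)                          # Poisson exceedance probability
    print(f"P(at least one exceedance in {T:.0f} yr) = {p_exceed:.3f}")
    ```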

  4. Computer modeling of the dynamic processes in the Maryland University Training Reactor - (MUTR)

    International Nuclear Information System (INIS)

    White, Bernard H. IV; Ebert, David

    1988-01-01

    The simulator described in this paper models the behaviour of the Maryland University Training Reactor (MUTR), a 250 kW TRIGA reactor. The computer model is based on a system of five primary equations and eight auxiliary equations. The primary equations consist of the prompt jump approximation, a heat balance equation each for the fuel and the moderator, and iodine and xenon buildup equations. For comparison with the computer program, data from the reactor were acquired using a personal computer (PC) containing a Strawberry Tree data acquisition card connected to the reactor. The systems monitored by the PC were: two neutron detectors, fuel temperature, water temperature, three control rod positions, and the period meter. The time-differenced equations were programmed in the BASIC language. This paper shows that the MUTR power rise from low-power critical to high power can be modelled by a relatively simple computer program. The program yields accurate agreement considering its simplicity. The steady-state error between the reactor and computer power is 4.4%. The difference in steady-state temperatures, 112 deg. C and 117 deg. C for the reactor and the computer program, respectively, also yields a 4.5% error. Further fine tuning of the coefficients will yield higher accuracies.
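
    A hedged sketch of the modeling approach described, reduced to one delayed-neutron group: the prompt jump approximation gives power from the precursor concentration, and two lumped heat-balance equations track fuel and moderator temperatures with temperature feedback. All constants are illustrative, not the MUTR coefficients, and the iodine/xenon equations are omitted.

    ```python
    # Illustrative parameters (assumptions, not MUTR data).
    beta, lam, Lam = 0.0065, 0.08, 4.0e-5   # delayed fraction, decay const (1/s), generation time (s)
    alpha_f = -1.0e-5                       # fuel temperature feedback (dk/k per K)
    mcp_f, mcp_m = 2.0e3, 5.0e4             # heat capacities of fuel / moderator (J/K)
    h_fm, h_mc = 400.0, 800.0               # fuel->moderator, moderator->coolant (W/K)

    P = 10.0                                # initial power (W), low-power critical
    C = beta * P / (lam * Lam)              # equilibrium precursor concentration
    Tf = Tm = Tin = 20.0                    # temperatures (deg C)
    rho_ext = 0.002                         # external reactivity step (dk/k)

    dt, t, t_end = 0.01, 0.0, 200.0
    while t < t_end:
        rho = rho_ext + alpha_f * (Tf - 20.0)
        P = Lam * lam * C / (beta - rho)    # prompt jump approximation
        C += dt * (beta * P / Lam - lam * C)
        Tf += dt * (P - h_fm * (Tf - Tm)) / mcp_f
        Tm += dt * (h_fm * (Tf - Tm) - h_mc * (Tm - Tin)) / mcp_m
        t += dt

    print(f"power ~ {P/1000:.1f} kW, fuel T ~ {Tf:.1f} C, moderator T ~ {Tm:.1f} C")
    ```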

  5. Programming Paradigms in Computer Science Education

    OpenAIRE

    Bolshakova, Elena

    2005-01-01

    Main styles, or paradigms of programming – imperative, functional, logic, and object-oriented – are shortly described and compared, and corresponding programming techniques are outlined. Programming languages are classified in accordance with the main style and techniques supported. It is argued that profound education in computer science should include learning base programming techniques of all main programming paradigms.

  6. A program to compute the soft Robinson-Foulds distance between phylogenetic networks.

    Science.gov (United States)

    Lu, Bingxin; Zhang, Louxin; Leong, Hon Wai

    2017-03-14

    Over the past two decades, phylogenetic networks have been studied to model reticulate evolutionary events. The relationships among phylogenetic networks, phylogenetic trees and clusters serve as the basis for the reconstruction and comparison of phylogenetic networks. To understand these relationships, two problems are raised: the tree containment problem, which asks whether a phylogenetic tree is displayed in a phylogenetic network, and the cluster containment problem, which asks whether a cluster is represented at a node in a phylogenetic network. Both problems are NP-complete. A fast exponential-time algorithm for the cluster containment problem on arbitrary networks is developed and implemented in C. The resulting program is further extended into a computer program for fast computation of the soft Robinson-Foulds distance between phylogenetic networks. Two computer programs are thus developed to facilitate the reconstruction and validation of phylogenetic network models in evolutionary and comparative genomics. Our simulation tests indicated that they are fast enough for use in practice. Additionally, our simulation data show that the distribution of the soft Robinson-Foulds distance between phylogenetic networks is unlikely to be normal.

  7. RECON: a computer program for analyzing repository economics. Documentation and user's manual

    International Nuclear Information System (INIS)

    Clark, L.L.; Cole, B.M.; McNair, G.W.; Schutz, M.E.

    1983-05-01

    From 1981 through 1983 the Pacific Northwest Laboratory has been developing a computer model named RECON to calculate repository costs from parametric data input. The objective of the program has been to develop the capability to evaluate the effect on costs of changes in repository design parameters and operating scenario assumptions. This report documents the development of the model through March of 1983. Included in the report are: (1) descriptions of model development and the underlying equations, assumptions and definitions; (2) descriptions of data input, either using card images or an interactive data input program; and (3) detailed listings of the program and definitions of program variables. Cost estimates generated using the model have been verified against independent estimates, and good agreement has been obtained.

  8. Computing Models of M-type Host Stars and their Panchromatic Spectral Output

    Science.gov (United States)

    Linsky, Jeffrey; Tilipman, Dennis; France, Kevin

    2018-06-01

    We have begun a program of computing state-of-the-art model atmospheres, from the photospheres to the coronae, of M stars that are the host stars of known exoplanets. For each model we are computing the emergent radiation at all wavelengths that are critical for assessing photochemistry and mass loss from exoplanet atmospheres. In particular, we are computing the stellar extreme ultraviolet radiation that drives hydrodynamic mass loss from exoplanet atmospheres and is essential for determining whether an exoplanet is habitable. The model atmospheres are computed with the SSRPM radiative transfer/statistical equilibrium code developed by Dr. Juan Fontenla. The code solves for the non-LTE statistical equilibrium populations of 18,538 levels of 52 atomic and ion species and computes the radiation from all species (435,986 spectral lines) and about 20,000,000 spectral lines of 20 diatomic species. The first model computed in this program was for the modestly active M1.5 V star GJ 832 by Fontenla et al. (ApJ 830, 152 (2016)). We will report on a preliminary model for the more active M5 V star GJ 876 and compare this model and its emergent spectrum with GJ 832. In the future, we will compute and intercompare semi-empirical models and spectra for all of the stars observed with the HST MUSCLES Treasury Survey, the Mega-MUSCLES Treasury Survey, and additional stars including Proxima Cen and Trappist-1. This multiyear theory program is supported by a grant from the Space Telescope Science Institute.

  9. A COMPUTER PROGRAM FOR INTERPRETATION OF THE DATA OF VERTICAL ELECTRICAL SOUNDING VEZ-4A

    OpenAIRE

    D. G. Koliushko; S. S. Rudenko

    2017-01-01

    Purpose. To create a computer program for interpreting the results of vertical electrical sounding of the soil in the form of the multilayer model most typical for Ukraine. Methodology. The algorithm of the program is built on determining the soil structure with the help of the point current source method, the method of analogy, and the method of equivalence. The option of automatic interpretation is based on the Hooke-Jeeves method. The program is implemented in the Delphi programming language. Results. The computer ...

  10. Introduction to the Atari Computer. A Program Written in the Pilot Programming Language.

    Science.gov (United States)

    Schlenker, Richard M.

    Designed to be an introduction to the Atari microcomputers for beginners, the interactive computer program listed in this document is written in the Pilot programming language. Instructions are given for entering and storing the program in the computer memory for use by students. (MES)

  11. The SX Solver: A New Computer Program for Analyzing Solvent-Extraction Equilibria

    International Nuclear Information System (INIS)

    McNamara, B.K.; Rapko, B.M.; Lumetta, G.J.

    1999-01-01

    A new computer program, the SX Solver, has been developed to analyze solvent-extraction equilibria. The program operates out of Microsoft Excel and uses the built-in "Solver" function to minimize the sum of the squares of the residuals between measured and calculated distribution coefficients. The extraction of nitric acid by tributyl phosphate has been modeled to illustrate the program's use.
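
    A hedged sketch of the underlying fitting approach (minimizing the sum of squared residuals between measured and calculated distribution coefficients), using Python and scipy in place of Excel's Solver; the toy extraction model and the data points below are invented placeholders, not the HNO3/TBP system of the report.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical measured data: aqueous-phase concentration vs. distribution ratio D.
    aq_conc = np.array([0.1, 0.5, 1.0, 2.0, 4.0])      # mol/L (illustrative)
    D_meas = np.array([0.12, 0.45, 0.70, 0.95, 1.10])  # dimensionless (illustrative)

    def D_calc(params, c):
        """Toy extraction model D = cap*K*c / (1 + K*c); a stand-in for whatever
        equilibrium expression a real analysis would use."""
        K, cap = params
        return cap * K * c / (1.0 + K * c)

    def sum_sq_residuals(params):
        return np.sum((D_meas - D_calc(params, aq_conc)) ** 2)

    result = minimize(sum_sq_residuals, x0=[1.0, 1.0], method="Nelder-Mead")
    K_fit, cap_fit = result.x
    print(f"fitted K = {K_fit:.3f}, capacity = {cap_fit:.3f}, SSR = {result.fun:.4f}")
    ```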

  12. Computer programming and architecture the VAX

    CERN Document Server

    Levy, Henry

    2014-01-01

    Takes a unique systems approach to programming and architecture of the VAX. Using the VAX as a detailed example, the first half of this book offers a complete course in assembly language programming. The second half describes higher-level systems issues in computer architecture. Highlights include the VAX assembler and debugger, other modern architectures such as RISCs, multiprocessing and parallel computing, microprogramming, caches and translation buffers, and an appendix on the Berkeley UNIX assembler.

  13. PHREEQCI; a graphical user interface for the geochemical computer program PHREEQC

    Science.gov (United States)

    Charlton, Scott R.; Macklin, Clifford L.; Parkhurst, David L.

    1997-01-01

    PhreeqcI is a Windows-based graphical user interface for the geochemical computer program PHREEQC. PhreeqcI provides the capability to generate and edit input data files, run simulations, and view text files containing simulation results, all within the framework of a single interface. PHREEQC is a multipurpose geochemical program that can perform speciation, inverse, reaction-path, and 1D advective reaction-transport modeling. Interactive access to all of the capabilities of PHREEQC is available with PhreeqcI. The interface is written in Visual Basic and will run on personal computers under the Windows 3.1, Windows 95, and Windows NT operating systems.

  14. A research program in empirical computer science

    Science.gov (United States)

    Knight, J. C.

    1991-01-01

    During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.

  15. K-TIF: a two-fluid computer program for downcomer flow dynamics. [PWR

    Energy Technology Data Exchange (ETDEWEB)

    Amsden, A.A.; Harlow, F.H.

    1977-10-01

    The K-TIF computer program has been developed for numerical solution of the time-varying dynamics of steam and water in a pressurized water reactor downcomer. The current status of physical and mathematical modeling is presented in detail. The report also contains a complete description of the numerical solution technique, a full description and listing of the computer program, instructions for its use, with a sample printout for a specific test problem. A series of calculations, performed with no change in the modeling parameters, shows consistent agreement with the experimental trends over a wide range of conditions, which gives confidence to the calculations as a basis for investigating the complicated physics of steam-water flows in the downcomer.

  16. Human Memory Organization for Computer Programs.

    Science.gov (United States)

    Norcio, A. F.; Kerst, Stephen M.

    1983-01-01

    Results of a study investigating human memory organization in the processing of computer programming languages indicate that algorithmic logic segments form a cognitive organizational structure in memory for programs. Statement indentation and internal program documentation did not enhance the organizational process of recall of statements in five Fortran…

  17. Structured Design Language for Computer Programs

    Science.gov (United States)

    Pace, Walter H., Jr.

    1986-01-01

    A box language used at all stages of program development. It was developed to provide improved productivity in designing, coding, and maintaining computer programs. The BOX system is written in FORTRAN 77 for batch execution.

  18. SPSS and SAS programming for the testing of mediation models.

    Science.gov (United States)

    Dudley, William N; Benuzillo, Jose G; Carrico, Mineh S

    2004-01-01

    Mediation modeling can explain the nature of the relation among three or more variables. In addition, it can be used to show how a variable mediates the relation between levels of intervention and outcome. The Sobel test, developed in 1990, provides a statistical method for determining the influence of a mediator on an intervention or outcome. Although interactive Web-based and stand-alone methods exist for computing the Sobel test, SPSS and SAS programs that automatically run the required regression analyses and computations increase the accessibility of mediation modeling to nursing researchers. The purpose of this article is to illustrate the utility of the Sobel test and to make this programming available to the Nursing Research audience in both SAS and SPSS. The history, logic, and technical aspects of mediation testing are introduced. The syntax files sobel.sps and sobel.sas, created to automate the computation of the regression analysis and test statistic, are available from the corresponding author. The reported programming allows the user to complete mediation testing with the user's own data in a single-step fashion. A technical manual included with the programming provides instruction on program use and interpretation of the output. Mediation modeling is a useful tool for describing the relation between three or more variables, and programming and manuals for using this model are made available.
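
    The Sobel statistic itself is z = ab / sqrt(b^2*s_a^2 + a^2*s_b^2), where a and b are the two regression path coefficients and s_a, s_b their standard errors. Below is a hedged Python sketch with statsmodels standing in for the SPSS/SAS regressions, on simulated data; it is not the authors' sobel.sps/sobel.sas code.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n = 200
    x = rng.normal(size=n)                       # intervention / predictor
    m = 0.5 * x + rng.normal(size=n)             # mediator
    y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # outcome

    # Path a: mediator regressed on the predictor.
    fit_a = sm.OLS(m, sm.add_constant(x)).fit()
    a, sa = fit_a.params[1], fit_a.bse[1]

    # Path b: outcome regressed on the mediator, controlling for the predictor.
    X = sm.add_constant(np.column_stack([x, m]))
    fit_b = sm.OLS(y, X).fit()
    b, sb = fit_b.params[2], fit_b.bse[2]

    # Sobel test statistic for the indirect (mediated) effect a*b.
    z = (a * b) / np.sqrt(b**2 * sa**2 + a**2 * sb**2)
    p = 2 * (1 - norm.cdf(abs(z)))
    print(f"a={a:.3f}, b={b:.3f}, Sobel z={z:.3f}, p={p:.4f}")
    ```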

  19. Bourbaki's structure theory in the problem of complex systems simulation models synthesis and model-oriented programming

    Science.gov (United States)

    Brodsky, Yu. I.

    2015-01-01

    The work is devoted to the application of Bourbaki's structure theory to substantiate the synthesis of simulation models of complex multicomponent systems, where every component may be a complex system itself. An application of Bourbaki's structure theory offers a new approach to the design and computer implementation of simulation models of complex multicomponent systems: model synthesis and model-oriented programming. It differs from the traditional object-oriented approach. The central concept of this new approach, and at the same time the basic building block for the construction of more complex structures, is the concept of the model-component. A model-component is endowed with a more complicated structure than, for example, an object in object-oriented analysis. This structure provides the model-component with independent behavior: the ability to respond in a standard way to standard requests from its internal and external environment. At the same time, the computer implementation of a model-component's behavior is invariant under the integration of model-components into complexes. This fact allows one, first, to construct fractal models of any complexity and, second, to implement the computational process of such constructions uniformly, by a single universal program. In addition, the proposed paradigm allows one to exclude imperative programming and to generate computer code with a high degree of parallelism.

  20. SHOCK-JR: a computer program to analyze impact response of shipping container

    International Nuclear Information System (INIS)

    Ikushima, Takeshi; Nakazato, Chikara; Shimoda, Osamu; Uchino, Mamoru.

    1983-02-01

    This report describes the use of a computer program, SHOCK-JR, which analyzes the impact response of shipping containers. Described are the mathematical model, the method of analysis, the structure of the program, and the input and output variables. The program solves the equations of motion for a one-dimensional, lumped-mass, nonlinear-spring model. The solution procedure uses the Runge-Kutta-Gill and Newmark-β methods. SHOCK-JR is a revised version of SHOCK, which was developed by ORNL. In SHOCK-JR, SI units are used and graphical output is available. (author)
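
    A hedged sketch of the kind of model described: a one-dimensional lumped mass on a nonlinear spring with damping, integrated here with classical fourth-order Runge-Kutta as a stand-in for the Runge-Kutta-Gill and Newmark-β solvers of the report. The mass, stiffness, and drop height are illustrative values only.

    ```python
    def accel(x, v, m=500.0, k=2.0e6, k3=5.0e8, c=300.0):
        """Nonlinear spring-damper restoring acceleration on a lumped mass
        (illustrative constants; spring acts in both directions for simplicity)."""
        return -(k * x + k3 * x**3 + c * v) / m

    def rk4_step(x, v, dt):
        """One classical Runge-Kutta step for the state (x, v)."""
        k1x, k1v = v, accel(x, v)
        k2x, k2v = v + 0.5 * dt * k1v, accel(x + 0.5 * dt * k1x, v + 0.5 * dt * k1v)
        k3x, k3v = v + 0.5 * dt * k2v, accel(x + 0.5 * dt * k2x, v + 0.5 * dt * k2v)
        k4x, k4v = v + dt * k3v, accel(x + dt * k3x, v + dt * k3v)
        x += dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6.0
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        return x, v

    # Hypothetical 9 m drop onto an unyielding target: impact velocity sqrt(2*g*h).
    x, v = 0.0, -(2 * 9.81 * 9.0) ** 0.5
    dt = 1.0e-5
    peak_g, max_defl = 0.0, 0.0
    for _ in range(20000):                       # 0.2 s of response
        x, v = rk4_step(x, v, dt)
        peak_g = max(peak_g, abs(accel(x, v)) / 9.81)
        max_defl = max(max_defl, abs(x))
    print(f"peak deceleration ~ {peak_g:.0f} g, peak deflection ~ {max_defl:.3f} m")
    ```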

  1. Computer modelling of superconductive fault current limiters

    Energy Technology Data Exchange (ETDEWEB)

    Weller, R.A.; Campbell, A.M.; Coombs, T.A.; Cardwell, D.A.; Storey, R.J. [Cambridge Univ. (United Kingdom). Interdisciplinary Research Centre in Superconductivity (IRC); Hancox, J. [Rolls Royce, Applied Science Division, Derby (United Kingdom)

    1998-05-01

    Investigations are being carried out on the use of superconductors for fault current limiting applications. A number of computer programs are being developed to predict the behavior of different 'resistive' fault current limiter designs under a variety of fault conditions. The programs achieve solution by iterative methods based around real measured data rather than theoretical models in order to achieve accuracy at high current densities. (orig.) 5 refs.

  2. Understanding Computational Thinking before Programming: Developing Guidelines for the Design of Games to Learn Introductory Programming through Game-Play

    Science.gov (United States)

    Kazimoglu, Cagin; Kiernan, Mary; Bacon, Liz; MacKinnon, Lachlan

    2011-01-01

    This paper outlines an innovative game-based approach to learning introductory programming that is grounded in the development of computational thinking at an abstract conceptual level, but also provides a direct contextual relationship between game-play and learning traditional introductory programming. The paper proposes a possible model for,…

  3. Ewe: a computer model for ultrasonic inspection

    International Nuclear Information System (INIS)

    Douglas, S.R.; Chaplin, K.R.

    1991-11-01

    The computer program EWE simulates the propagation of elastic waves in solids and liquids. It has been applied to ultrasonic testing to study the echoes generated by cracks and other types of defects. A discussion of the elastic wave equations is given, including the first-order formulation, shear and compression waves, surface waves and boundaries, numerical method of solution, models for cracks and slot defects, input wave generation, returning echo construction, and general computer issues
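
    A hedged one-dimensional illustration of the kind of simulation described: explicit finite-difference propagation of a compressional tone burst along a rod and the echo returned from the far end, standing in for EWE's full elastic-wave and defect treatment. The material values and source parameters are illustrative assumptions.

    ```python
    import numpy as np

    # 1D wave equation u_tt = c^2 u_xx on a rod, explicit finite differences.
    c, L, nx = 5900.0, 0.5, 501          # steel-like wave speed (m/s), rod length (m)
    dx = L / (nx - 1)
    dt = 0.9 * dx / c                    # CFL-stable time step
    r2 = (c * dt / dx) ** 2

    u_prev = np.zeros(nx)
    u = np.zeros(nx)

    src, f0 = 5, 200e3                   # source node near the left end, 200 kHz burst
    received = []
    for n in range(1200):
        u_next = np.zeros(nx)
        u_next[1:-1] = 2 * u[1:-1] - u_prev[1:-1] + r2 * (u[2:] - 2 * u[1:-1] + u[:-2])
        if n * dt < 3 / f0:              # finite-duration tone burst at the "transducer"
            u_next[src] += np.sin(2 * np.pi * f0 * n * dt) * 1e-3
        u_next[0] = 0.0                  # fixed left boundary
        u_next[-1] = u_next[-2]          # free right end acts as the reflector
        received.append(u[src + 1])      # signal at a monitoring node (direct pulse + echo)
        u_prev, u = u, u_next

    print("peak recorded amplitude:", float(np.max(np.abs(received))))
    ```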

  4. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information.

  5. Application of computers in a Radiological Survey Program

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.; Doane, R.W.; Little, C.A.; Perdue, P.T.

    1984-01-01

    A brief description of some of the applications of computers in a radiological survey program is presented. It has been our experience that computers and computer software have allowed our staff personnel to more productively use their time by using computers to perform the mechanical acquisition, analyses, and storage of data. It is hoped that other organizations may similarly profit from this experience. This effort will ultimately minimize errors and reduce program costs

  6. Development of a fast running accident analysis computer program for use in a simulator

    International Nuclear Information System (INIS)

    Cacciabue, P.C.

    1985-01-01

    This paper describes how a reactor safety nuclear computer program can be modified and improved with the aim of producing a very fast running tool to be used as a physical model in a plant simulator, without penalizing the accuracy of results. It also discusses some ideas on how the physical theoretical model can be combined with a driving statistical tool to build up the entire software package to be implemented in the simulator for risk and reliability analysis. The approach to the problem, although applied to a specific computer program, can be considered quite general if an already existing and well tested code is being used for the purpose. The computer program considered is ALMOD, originally developed for the analysis of the thermohydraulic and neutronic behaviour of the reactor core, primary circuit and steam generator during operational and special transients. (author)

  7. Translator program converts computer printout into braille language

    Science.gov (United States)

    Powell, R. A.

    1967-01-01

    Computer program converts print image tape files into six dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.
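
    A minimal sketch of the print-to-Braille mapping idea, not the original NASA program: each character becomes a six-dot cell (dots 1-4, 2-5, 3-6 by row). Only the letters a-j are mapped here; a real translator would cover the full character set, contractions and the 8-lines-per-inch output format mentioned above.

      # Partial, illustrative mapping of characters to six-dot Braille cells.
      BRAILLE_DOTS = {
          "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
          "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4}, "j": {2, 4, 5},
      }

      def cell_rows(ch):
          """Render one character as three rows of two dot positions."""
          dots = BRAILLE_DOTS.get(ch.lower(), set())
          layout = [(1, 4), (2, 5), (3, 6)]        # dot numbering per row
          return ["".join("o" if d in dots else "." for d in row) for row in layout]

      def translate(text):
          cells = [cell_rows(c) for c in text if c.lower() in BRAILLE_DOTS]
          return "\n".join(" ".join(cell[r] for cell in cells) for r in range(3))

      print(translate("badge"))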

  8. Computer-Aided Corrosion Program Management

    Science.gov (United States)

    MacDowell, Louis

    2010-01-01

    This viewgraph presentation reviews Computer-Aided Corrosion Program Management at John F. Kennedy Space Center. The contents include: 1) Corrosion at the Kennedy Space Center (KSC); 2) Requirements and Objectives; 3) Program Description, Background and History; 4) Approach and Implementation; 5) Challenges; 6) Lessons Learned; 7) Successes and Benefits; and 8) Summary and Conclusions.

  9. Assessing the relationship between computational speed and precision: a case study comparing an interpreted versus compiled programming language using a stochastic simulation model in diabetes care.

    Science.gov (United States)

    McEwan, Phil; Bergenheim, Klas; Yuan, Yong; Tetlow, Anthony P; Gordon, Jason P

    2010-01-01

    Simulation techniques are well suited to modelling diseases yet can be computationally intensive. This study explores the relationship between modelled effect size, statistical precision, and efficiency gains achieved using variance reduction and an executable programming language. A published simulation model designed to model a population with type 2 diabetes mellitus based on the UKPDS 68 outcomes equations was coded in both Visual Basic for Applications (VBA) and C++. Efficiency gains due to the programming language were evaluated, as was the impact of antithetic variates to reduce variance, using predicted QALYs over a 40-year time horizon. The use of C++ provided a 75- and 90-fold reduction in simulation run time when using mean and sampled input values, respectively. For a series of 50 one-way sensitivity analyses, this would yield a total run time of 2 minutes when using C++, compared with 155 minutes for VBA when using mean input values. The use of antithetic variates typically resulted in a 53% reduction in the number of simulation replications and run time required. When drawing all input values to the model from distributions, the use of C++ and variance reduction resulted in a 246-fold improvement in computation time compared with VBA - for which the evaluation of 50 scenarios would correspondingly require 3.8 hours (C++) and approximately 14.5 days (VBA). The choice of programming language used in an economic model, as well as the methods for improving precision of model output can have profound effects on computation time. When constructing complex models, more computationally efficient approaches such as C++ and variance reduction should be considered; concerns regarding model transparency using compiled languages are best addressed via thorough documentation and model validation.
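
    The antithetic-variates technique referred to above is easy to show in miniature. The sketch below uses a toy, monotone QALY function rather than the UKPDS 68 equations; pairing each uniform draw u with 1 - u produces negatively correlated replicates whose paired average has a noticeably smaller standard error for the same number of model evaluations.

      import numpy as np

      rng = np.random.default_rng(1)

      def qaly(u, horizon=40.0):
          """Hypothetical monotone outcome: longer survival for larger u."""
          survival = -np.log(1.0 - u) / 0.08           # exponential survival time
          return 0.75 * np.minimum(survival, horizon)  # utility-weighted years

      n = 20_000
      u = rng.random(n)
      plain = qaly(u)                                   # standard Monte Carlo
      antithetic = 0.5 * (qaly(u[: n // 2]) + qaly(1.0 - u[: n // 2]))

      print("plain MC      : mean %.3f, std err %.4f"
            % (plain.mean(), plain.std(ddof=1) / np.sqrt(n)))
      print("antithetic MC : mean %.3f, std err %.4f"
            % (antithetic.mean(), antithetic.std(ddof=1) / np.sqrt(n // 2)))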

  10. Final Report, Center for Programming Models for Scalable Parallel Computing: Co-Array Fortran, Grant Number DE-FC02-01ER25505

    Energy Technology Data Exchange (ETDEWEB)

    Robert W. Numrich

    2008-04-22

    The major accomplishment of this project is the production of CafLib, an 'object-oriented' parallel numerical library written in Co-Array Fortran. CafLib contains distributed objects such as block vectors and block matrices along with procedures, attached to each object, that perform basic linear algebra operations such as matrix multiplication, matrix transpose and LU decomposition. It also contains constructors and destructors for each object that hide the details of data decomposition from the programmer, and it contains collective operations that allow the programmer to calculate global reductions, such as global sums, global minima and global maxima, as well as vector and matrix norms of several kinds. CafLib is designed to be extensible in such a way that programmers can define distributed grid and field objects, based on vector and matrix objects from the library, for finite difference algorithms to solve partial differential equations. A very important extra benefit that resulted from the project is the inclusion of the co-array programming model in the next Fortran standard called Fortran 2008. It is the first parallel programming model ever included as a standard part of the language. Co-arrays will be a supported feature in all Fortran compilers, and the portability provided by standardization will encourage a large number of programmers to adopt it for new parallel application development. The combination of object-oriented programming in Fortran 2003 with co-arrays in Fortran 2008 provides a very powerful programming model for high-performance scientific computing. Additional benefits from the project, beyond the original goal, include a programto provide access to the co-array model through access to the Cray compiler as a resource for teaching and research. Several academics, for the first time, included the co-array model as a topic in their courses on parallel computing. A separate collaborative project with LANL and PNNL showed how to
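
    CafLib itself is Co-Array Fortran, so the following is only an analogous sketch, in Python with mpi4py, of the pattern the library encapsulates: a block-distributed vector whose global sum is obtained through a collective reduction that hides the data decomposition from the caller. The file name in the comment is hypothetical.

      # Run with, e.g.:  mpiexec -n 4 python distributed_sum.py
      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      n_global = 1_000_000
      counts = [n_global // size + (1 if r < n_global % size else 0) for r in range(size)]
      start = sum(counts[:rank])

      # Each image/rank owns one block of the distributed vector.
      local_block = np.arange(start, start + counts[rank], dtype=np.float64)

      # Collective reduction, in the spirit of CafLib's global-sum routines.
      global_sum = comm.allreduce(local_block.sum(), op=MPI.SUM)

      if rank == 0:
          print("global sum =", global_sum, "(expected", n_global * (n_global - 1) / 2, ")")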

  11. A report on intercomparison studies of computer programs which respectively model: i) radionuclide migration ii) equilibrium chemistry of groundwater

    International Nuclear Information System (INIS)

    Broyd, T.W.; McD Grant, M.; Cross, J.E.

    1985-01-01

    This report describes two intercomparison studies of computer programs which respectively model: i) radionuclide migration ii) equilibrium chemistry of groundwaters. These studies have been performed by running a series of test cases with each program and comparing the various results obtained. The work forms a part of the CEC MIRAGE project (MIgration of RAdionuclides in the GEosphere) and has been jointly funded by the CEC and the United Kingdom Department of the Environment. Presentations of the material contained herein were given at plenary meetings of the MIRAGE project in Brussels in March, 1984 (migration) and March, 1985 (equilibrium chemistry) respectively

  12. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  13. Intruder dose pathway analysis for the onsite disposal of radioactive wastes: The ONSITE/MAXI1 computer program

    International Nuclear Information System (INIS)

    Kennedy, W.E. Jr.; Peloquin, R.A.; Napier, B.A.; Neuder, S.M.

    1987-02-01

    This document summarizes initial efforts to develop human-intrusion scenarios and a modified version of the MAXI computer program for potential use by the NRC in reviewing applications for onsite radioactive waste disposal. Supplement 1 of NUREG/CR-3620 (1986) summarized modifications and improvements to the ONSITE/MAXI1 software package. This document summarizes a modified version of the ONSITE/MAXI1 computer program. This modified version of the computer program operates on a personal computer and permits the user to optionally select radiation dose conversion factors published by the International Commission on Radiological Protection (ICRP) in their Publication No. 30 (ICRP 1979-1982) in place of those published by the ICRP in their Publication No. 2 (ICRP 1959) (as implemented in the previous versions of the ONSITE/MAXI1 computer program). The pathway-to-human models used in the computer program have not been changed from those described previously. Computer listings of the ONSITE/MAXI1 computer program and supporting data bases are included in the appendices of this document

  14. Game Theory and its Relationship with Linear Programming Models ...

    African Journals Online (AJOL)

    Game Theory and its Relationship with Linear Programming Models. ... This paper shows that game theory and linear programming problem are closely related subjects since any computing method devised for ...
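
    The connection the paper points to can be made concrete: the row player's optimal mixed strategy in a finite zero-sum game is the solution of a small linear program. A hedged sketch, using an illustrative 3x3 payoff matrix and SciPy's linprog:

      import numpy as np
      from scipy.optimize import linprog

      # Payoff matrix (row player's winnings) for a rock-paper-scissors style game.
      A = np.array([[ 0.0,  1.0, -1.0],
                    [-1.0,  0.0,  1.0],
                    [ 1.0, -1.0,  0.0]])
      m, n = A.shape

      # Variables: x (m mixed-strategy weights) and v (game value); maximize v.
      c = np.concatenate([np.zeros(m), [-1.0]])          # minimize -v
      # For every column j:  sum_i A[i, j] x_i >= v   ->   -A^T x + v <= 0
      A_ub = np.hstack([-A.T, np.ones((n, 1))])
      b_ub = np.zeros(n)
      A_eq = np.array([np.concatenate([np.ones(m), [0.0]])])   # sum(x) = 1
      b_eq = np.array([1.0])
      bounds = [(0, None)] * m + [(None, None)]                # x >= 0, v free

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      print("optimal mixed strategy:", np.round(res.x[:m], 3),
            "game value:", round(res.x[m], 3))

    By linear programming duality, the dual of the same program yields the column player's optimal strategy and the identical game value.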

  15. Computer-aided performance monitoring program at Diablo Canyon

    International Nuclear Information System (INIS)

    Nelson, T.; Glynn, R. III; Kessler, T.C.

    1992-01-01

    This paper describes the thermal performance monitoring program at Pacific Gas & Electric Company's (PG&E's) Diablo Canyon Nuclear Power Plant. The plant performance monitoring program at Diablo Canyon uses the THERMAC performance monitoring and analysis computer software provided by Expert-EASE Systems. THERMAC is used to collect performance data from the plant process computers, condition that data to adjust for measurement errors and missing data points, evaluate cycle and component-level performance, archive the data for trend analysis and generate performance reports. The current status of the program is that, after a fair amount of 'tuning' of the basic 'thermal kit' models provided with the initial THERMAC installation, we have successfully baselined both units to cycle isolation test data from previous reload cycles. Over the course of the past few months, we have accumulated enough data to generate meaningful performance trends and, as a result, have been able to use THERMAC to track a condenser fouling problem that was costing enough megawatts to attract corporate-level attention. Trends from THERMAC clearly related the megawatt loss to a steadily degrading condenser cleanliness factor and verified the subsequent gain in megawatts after the condenser was cleaned. In the future, we expect to rebaseline THERMAC to a beginning of cycle (BOC) data set and to use the program to help track feedwater nozzle fouling

  16. One-dimensional computational modeling on nuclear reactor problems

    International Nuclear Information System (INIS)

    Alves Filho, Hermes; Baptista, Josue Costa; Trindade, Luiz Fernando Santos; Heringer, Juan Diego dos Santos

    2013-01-01

    In this article, we present a computational model which gives a dynamic view of some applications of Nuclear Engineering, specifically the power distribution and effective multiplication factor (keff) calculations. We work with one-dimensional problems of deterministic neutron transport theory, using the linearized Boltzmann equation in the discrete ordinates (SN) formulation, independent of time, with isotropic scattering, and built a software simulator for the computational problems used in typical calculations. The program used in the implementation of the simulator was Matlab, version 7.0. (author)
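
    The record does not include the simulator's source. The sketch below shows only the outer power (source) iteration used to obtain keff, applied for brevity to a one-group, one-dimensional diffusion discretization with invented cross sections rather than the SN transport formulation the authors implement.

      import numpy as np

      nx, h = 50, 1.0                              # 50 slab cells, 1 cm wide
      D, sigma_a, nu_sigma_f = 1.2, 0.03, 0.035    # illustrative one-group data

      # Loss operator M: -D d2/dx2 + sigma_a with zero-flux boundaries.
      M = np.zeros((nx, nx))
      for i in range(nx):
          M[i, i] = 2.0 * D / h**2 + sigma_a
          if i > 0:
              M[i, i - 1] = -D / h**2
          if i < nx - 1:
              M[i, i + 1] = -D / h**2
      F = nu_sigma_f * np.eye(nx)                  # fission production operator

      phi, keff = np.ones(nx), 1.0
      for _ in range(200):                         # outer power (source) iteration
          phi_new = np.linalg.solve(M, F @ phi / keff)
          keff *= (F @ phi_new).sum() / (F @ phi).sum()
          phi = phi_new
      print("k_eff = %.5f" % keff)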

  17. Dynamical Models for Computer Viruses Propagation

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Nowadays, digital computer systems and networks are the main engineering tools, being used in planning, design, operation, and control of all sizes of building, transportation, machinery, business, and life maintaining devices. Consequently, computer viruses became one of the most important sources of uncertainty, contributing to decrease the reliability of vital activities. A lot of antivirus programs have been developed, but they are limited to detecting and removing infections, based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, a trial on modeling computer virus propagation dynamics relates it to other notable events occurring in the network, making it possible to establish preventive policies in network management. Data from three different viruses are collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, are applied, showing that it is possible to forecast the dynamics of a new virus propagation by using the data collected from other viruses that formerly infected the network.
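
    As a small illustration of the autoregressive identification step mentioned above (no real virus data are used), the sketch below fits an AR(p) model to a synthetic infection-count series by least squares and produces a short forecast.

      import numpy as np

      rng = np.random.default_rng(0)
      t = np.arange(300)
      series = 50 + 30 * np.sin(2 * np.pi * t / 60) + rng.normal(0, 3, t.size)

      def fit_ar(x, p):
          """Least-squares fit of x[k] = a1*x[k-1] + ... + ap*x[k-p] + c."""
          rows = [np.concatenate([x[k - p:k][::-1], [1.0]]) for k in range(p, len(x))]
          X, y = np.array(rows), x[p:]
          coef, *_ = np.linalg.lstsq(X, y, rcond=None)
          return coef                   # p AR coefficients followed by the intercept

      def forecast(x, coef, steps):
          p = len(coef) - 1
          hist = list(x[-p:])
          out = []
          for _ in range(steps):
              nxt = float(np.dot(coef[:-1], hist[::-1]) + coef[-1])
              out.append(nxt)
              hist = hist[1:] + [nxt]
          return out

      coef = fit_ar(series, p=5)
      print("5-step forecast:", np.round(forecast(series, coef, 5), 1))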

  18. A Statistical Model and Computer program for Preliminary Calculations Related to the Scaling of Sensor Arrays; TOPICAL

    International Nuclear Information System (INIS)

    Max Morris

    2001-01-01

    Recent advances in sensor technology and engineering have made it possible to assemble many related sensors in a common array, often of small physical size. Sensor arrays may report an entire vector of measured values in each data collection cycle, typically one value per sensor per sampling time. The larger quantities of data provided by larger arrays certainly contain more information, however in some cases experience suggests that dramatic increases in array size do not always lead to corresponding improvements in the practical value of the data. The work leading to this report was motivated by the need to develop computational planning tools to approximate the relative effectiveness of arrays of different size (or scale) in a wide variety of contexts. The basis of the work is a statistical model of a generic sensor array. It includes features representing measurement error, both common to all sensors and independent from sensor to sensor, and the stochastic relationships between the quantities to be measured by the sensors. The model can be used to assess the effectiveness of hypothetical arrays in classifying objects or events from two classes. A computer program is presented for evaluating the misclassification rates which can be expected when arrays are calibrated using a given number of training samples, or the number of training samples required to attain a given level of classification accuracy. The program is also available via email from the first author for a limited time
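
    The report's program is not reproduced here. The following sketch estimates, by Monte Carlo, the two-class misclassification rate of a simple threshold rule for arrays of different sizes, with the measurement error split, as in the abstract, into a component common to all sensors and a component independent from sensor to sensor. All numerical values are illustrative.

      import numpy as np

      rng = np.random.default_rng(7)

      def misclassification_rate(n_sensors, n_trials=20_000,
                                 delta=0.5, sigma_common=0.4, sigma_indep=1.0):
          errors = 0
          for _ in range(n_trials):
              true_class = rng.integers(2)                 # class 0 or 1
              mean = delta * true_class
              reading = (mean
                         + rng.normal(0, sigma_common)     # shared by every sensor
                         + rng.normal(0, sigma_indep, n_sensors))
              decided = int(reading.mean() > delta / 2)    # simple threshold rule
              errors += decided != true_class
          return errors / n_trials

      # Common noise sets a floor, so error rates stop improving with array size.
      for s in (1, 4, 16, 64):
          print(f"{s:3d} sensors: error rate = {misclassification_rate(s):.3f}")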

  19. A SCILAB Program for Computing Rotating Magnetic Compact Objects

    Science.gov (United States)

    Papasotiriou, P. J.; Geroyannis, V. S.

    We implement the so-called ``complex-plane iterative technique'' (CIT) to the computation of classical differentially rotating magnetic white dwarf and neutron star models. The program has been written in SCILAB (© INRIA-ENPC), a matrix-oriented high-level programming language, which can be downloaded free of charge from the site http://www-rocq.inria.fr/scilab. Due to the advanced capabilities of this language, the code is short and understandable. Highlights of the program are: (a) time-saving character, (b) easy use due to the built-in graphics user interface, (c) easy interfacing with Fortran via online dynamic link. We interpret our numerical results in various ways by extensively using the graphics environment of SCILAB.

  20. Fluid dynamics computer programs for NERVA turbopump

    Science.gov (United States)

    Brunner, J. J.

    1972-01-01

    During the design of the NERVA turbopump, numerous computer programs were developed for the analyses of fluid dynamic problems within the machine. Program descriptions, example cases, users instructions, and listings for the majority of these programs are presented.

  1. Advanced Certification Program for Computer Graphic Specialists. Final Performance Report.

    Science.gov (United States)

    Parkland Coll., Champaign, IL.

    A pioneer program in computer graphics was implemented at Parkland College (Illinois) to meet the demand for specialized technicians to visualize data generated on high performance computers. In summer 1989, 23 students were accepted into the pilot program. Courses included C programming, calculus and analytic geometry, computer graphics, and…

  2. Computer Program NIKE

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2014-01-01

    FORTRAN source code for program NIKE (PC version of QCPE 343). Sample input and output for two model chemical reactions are appended: I. Three consecutive monomolecular reactions, II. A simple chain mechanism.
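
    The first sample system, three consecutive monomolecular reactions (A -> B -> C -> D), can be reproduced in a few lines with a general-purpose integrator; the rate constants below are arbitrary illustrative values, not those of the appended NIKE input.

      import numpy as np
      from scipy.integrate import solve_ivp

      k1, k2, k3 = 1.0, 0.5, 0.2        # illustrative first-order rate constants

      def rhs(t, y):
          a, b, c, d = y
          return [-k1 * a,
                  k1 * a - k2 * b,
                  k2 * b - k3 * c,
                  k3 * c]

      sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0, 0.0, 0.0], dense_output=True)
      print("concentrations at t = 10:", np.round(sol.sol(10.0), 4))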

  3. Computer code qualification program for the Advanced CANDU Reactor

    International Nuclear Information System (INIS)

    Popov, N.K.; Wren, D.J.; Snell, V.G.; White, A.J.; Boczar, P.G.

    2003-01-01

    Atomic Energy of Canada Ltd (AECL) has developed and implemented a Software Quality Assurance program (SQA) to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. This paper provides an overview of the computer programs used in Advanced CANDU Reactor (ACR) safety analysis, and assessment of their applicability in the safety analyses of the ACR design. An outline of the incremental validation program, and an overview of the experimental program in support of the code validation are also presented. An outline of the SQA program used to qualify these computer codes is also briefly presented. To provide context to the differences in the SQA with respect to current CANDUs, the paper also provides an overview of the ACR design features that have an impact on the computer code qualification. (author)

  4. Fermilab advanced computer program multi-microprocessor project

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Biel, J.

    1985-06-01

    Fermilab's Advanced Computer Program is constructing a powerful 128 node multi-microprocessor system for data analysis in high-energy physics. The system will use commercial 32-bit microprocessors programmed in Fortran-77. Extensive software supports easy migration of user applications from a uniprocessor environment to the multiprocessor and provides sophisticated program development, debugging, and error handling and recovery tools. This system is designed to be readily copied, providing computing cost effectiveness of below $2200 per VAX 11/780 equivalent. The low cost, commercial availability, compatibility with off-line analysis programs, and high data bandwidths (up to 160 MByte/sec) make the system an ideal choice for applications to on-line triggers as well as an offline data processor

  5. Photovoltaic subsystem marketing and distribution model: programming manual. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1982-07-01

    Complete documentation of the marketing and distribution (M and D) computer model is provided. The purpose is to estimate the costs of selling and transporting photovoltaic solar energy products from the manufacturer to the final customer. The model adjusts for the inflation and regional differences in marketing and distribution costs. The model consists of three major components: the marketing submodel, the distribution submodel, and the financial submodel. The computer program is explained including the input requirements, output reports, subprograms and operating environment. The program specifications discuss maintaining the validity of the data and potential improvements. An example for a photovoltaic concentrator collector demonstrates the application of the model.

  6. Development and Study the Usage of Blended Learning Environment Model Using Engineering Design Concept Learning Activities to Computer Programming Courses for Undergraduate Students of Rajabhat Universities

    Directory of Open Access Journals (Sweden)

    Kasame Tritrakan

    2017-06-01

    The objectives of this research were to study and synthesise the components of, to develop, and to study the usage of a blended learning environment model using engineering design concept learning activities in computer programming courses for undergraduate students of Rajabhat universities. The research methodology was divided into 3 phases. Phase I: surveying the present conditions, needs and problems in teaching computer programming among 52 lecturers, with in-depth interviews of 5 experienced lecturers; the model's elements were evaluated by 5 experts, and the tools were a questionnaire, an interview form, and a model-elements assessment form. Phase II: developing the model of the blended learning environment and the learning activities based on engineering design processes, with the model confirmed by 8 experts; the tools were the draft learning environment, courseware, and assessment forms. Phase III: evaluating the effects of using the implemented environment. The samples were students formed into 2 groups by cluster random sampling, 25 people in the experiment group and 27 people in the control group; the tools were the learning environment, courseware, and assessment tools, and the statistics used were means, standard deviations, dependent t-tests, and one-way MANOVA. The results found that: 1) lecturers quite agreed with the physical, mental, social, and information learning environments, learning processes, and assessments, and all needs were at a high level; physical environment problems were at a high level but quite low in other aspects; 2) the developed learning environment had 4 components, namely a) the 4 types of environments, b) the inputs, which included the blended learning environment, learning motivation factors, and computer programming content, c) the processes, which were analysis of state objectives, design of the learning environment and activities, development of the learning environment and testing of materials, implementation, and evaluation, and d) the outputs

  7. Development of a multimaterial, two-dimensional, arbitrary Lagrangian-Eulerian mesh computer program

    International Nuclear Information System (INIS)

    Barton, R.T.

    1982-01-01

    We have developed a large, multimaterial, two-dimensional Arbitrary Lagrangian-Eulerian (ALE) computer program. The special feature of an ALE mesh is that it can be either an embedded Lagrangian mesh, a fixed Eulerian mesh, or a partially embedded, partially remapped mesh. Remapping is used to remove Lagrangian mesh distortion. This general purpose program has been used for astrophysical modeling, under the guidance of James R. Wilson. The rationale behind the development of this program will be used to highlight several important issues in program design

  8. Program description of FIBRAM (Fiber Optic Radiation Attenuation Model): a radiation attenuation model for optical fibers

    International Nuclear Information System (INIS)

    Ingram, W.J.

    1987-06-01

    The report describes a fiber-optics system model and its computer implementation. This implementation can calculate the bit error ratio (BER) versus time for optical fibers that have been exposed to gamma radiation. The program is designed so that the user may arbitrarily change any or all of the system input variables and produce separate outputs. The primary output of the program is a table of the BER as a function of time. This table may be stored on magnetic media and later incorporated into computer graphic programs. The program was written in FORTRAN 77 for the IBM PC/AT/XT computers. Flow charts and program listings are included in the report

  9. Architecture and Programming Models for High Performance Intensive Computation

    Science.gov (United States)

    2016-06-29

    commands from the data processing center to the sensors is needed. It has been noted that the ubiquity of mobile communication devices offers the...commands from a Processing Facility by way of mobile Relay Stations. The activity of each component of this model other than the Merge module can be...evaluation of the initial system implementation. Gao also was in charge of the development of Fresh Breeze architecture backend on new many-core computers

  10. Review of calculational models and computer codes for environmental dose assessment of radioactive releases

    International Nuclear Information System (INIS)

    Strenge, D.L.; Watson, E.C.; Droppo, J.G.

    1976-06-01

    The development of technological bases for siting nuclear fuel cycle facilities requires calculational models and computer codes for the evaluation of risks and the assessment of environmental impact of radioactive effluents. A literature search and review of available computer programs revealed that no one program was capable of performing all of the great variety of calculations (i.e., external dose, internal dose, population dose, chronic release, accidental release, etc.). Available literature on existing computer programs has been reviewed and a description of each program reviewed is given

  11. Review of calculational models and computer codes for environmental dose assessment of radioactive releases

    Energy Technology Data Exchange (ETDEWEB)

    Strenge, D.L.; Watson, E.C.; Droppo, J.G.

    1976-06-01

    The development of technological bases for siting nuclear fuel cycle facilities requires calculational models and computer codes for the evaluation of risks and the assessment of environmental impact of radioactive effluents. A literature search and review of available computer programs revealed that no one program was capable of performing all of the great variety of calculations (i.e., external dose, internal dose, population dose, chronic release, accidental release, etc.). Available literature on existing computer programs has been reviewed and a description of each program reviewed is given.

  12. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R&D, space R&T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  13. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biology

  14. Case Studies of Liberal Arts Computer Science Programs

    Science.gov (United States)

    Baldwin, D.; Brady, A.; Danyluk, A.; Adams, J.; Lawrence, A.

    2010-01-01

    Many undergraduate liberal arts institutions offer computer science majors. This article illustrates how quality computer science programs can be realized in a wide variety of liberal arts settings by describing and contrasting the actual programs at five liberal arts colleges: Williams College, Kalamazoo College, the State University of New York…

  15. Impact of changing computer technology on hydrologic and water resource modeling

    OpenAIRE

    Loucks, D.P.; Fedra, K.

    1987-01-01

    The increasing availability of substantial computer power at relatively low costs and the increasing ease of using computer graphics, of communicating with other computers and data bases, and of programming using high-level problem-oriented computer languages, is providing new opportunities and challenges for those developing and using hydrologic and water resources models. This paper reviews some of the progress made towards the development and application of computer support systems designe...

  16. LIAR -- A computer program for the modeling and simulation of high performance linacs

    International Nuclear Information System (INIS)

    Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.; Siemann, R.; Thompson, K.; Zimmermann, F.

    1997-04-01

    The computer program LIAR (LInear Accelerator Research Code) is a numerical modeling and simulation tool for high performance linacs. Amongst others, it addresses the needs of state-of-the-art linear colliders where low emittance, high-intensity beams must be accelerated to energies in the 0.05-1 TeV range. LIAR is designed to be used for a variety of different projects. LIAR allows the study of single- and multi-particle beam dynamics in linear accelerators. It calculates emittance dilutions due to wakefield deflections, linear and non-linear dispersion and chromatic effects in the presence of multiple accelerator imperfections. Both single-bunch and multi-bunch beams can be simulated. Several basic and advanced optimization schemes are implemented. Present limitations arise from the incomplete treatment of bending magnets and sextupoles. A major objective of the LIAR project is to provide an open programming platform for the accelerator physics community. Due to its design, LIAR allows straight-forward access to its internal FORTRAN data structures. The program can easily be extended and its interactive command language ensures maximum ease of use. Presently, versions of LIAR are compiled for UNIX and MS Windows operating systems. An interface for the graphical visualization of results is provided. Scientific graphs can be saved in the PS and EPS file formats. In addition a Mathematica interface has been developed. LIAR now contains more than 40,000 lines of source code in more than 130 subroutines. This report describes the theoretical basis of the program, provides a reference for existing features and explains how to add further commands. The LIAR home page and the ONLINE version of this manual can be accessed under: http://www.slac.stanford.edu/grp/arb/rwa/liar.htm

  17. Computer programming in the UK undergraduate mathematics curriculum

    Science.gov (United States)

    Sangwin, Christopher J.; O'Toole, Claire

    2017-11-01

    This paper reports a study which investigated the extent to which undergraduate mathematics students in the United Kingdom are currently taught to programme a computer as a core part of their mathematics degree programme. We undertook an online survey, with significant follow-up correspondence, to gather data on current curricula and received replies from 46 (63%) of the departments who teach a BSc mathematics degree. We found that 78% of BSc degree courses in mathematics included computer programming in a compulsory module but 11% of mathematics degree programmes do not teach programming to all their undergraduate mathematics students. In 2016, programming is most commonly taught to undergraduate mathematics students through imperative languages, notably MATLAB, using numerical analysis as the underlying (or parallel) mathematical subject matter. Statistics is a very popular choice in optional courses, using the package R. Computer algebra systems appear to be significantly less popular for compulsory first-year courses than a decade ago, and there was no mention of logic programming, functional programming or automatic theorem proving software. The modal form of assessment of computing modules is entirely by coursework (i.e. no examination).

  18. Introduction of handheld computing to a family practice residency program.

    Science.gov (United States)

    Rao, Goutham

    2002-01-01

    Handheld computers are valuable practice tools. It is important for residency programs to introduce their trainees and faculty to this technology. This article describes a formal strategy to introduce handheld computing to a family practice residency program. Objectives were selected for the handheld computer training program that reflected skills physicians would find useful in practice. TRGpro handheld computers preloaded with a suite of medical reference programs, a medical calculator, and a database program were supplied to participants. Training consisted of four 1-hour modules each with a written evaluation quiz. Participants completed a self-assessment questionnaire after the program to determine their ability to meet each objective. Sixty of the 62 participants successfully completed the training program. The mean composite score on quizzes was 36 of 40 (90%), with no significant differences by level of residency training. The mean self-ratings of participants across all objectives was 3.31 of 4.00. Third-year residents had higher mean self-ratings than others (mean of group, 3.62). Participants were very comfortable with practical skills, such as using drug reference software, and less comfortable with theory, such as knowing the different types of handheld computers available. Structured training is a successful strategy for introducing handheld computing to a residency program.

  19. Center for Programming Models for Scalable Parallel Computing - Towards Enhancing OpenMP for Manycore and Heterogeneous Nodes

    Energy Technology Data Exchange (ETDEWEB)

    Barbara Chapman

    2012-02-01

    OpenMP was not well recognized at the beginning of the project, around year 2003, because of its limited use in DoE production applications and the immature hardware support for an efficient implementation. Yet in recent years it has been gradually adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We have observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as to work with the OpenMP standard organization to make sure OpenMP evolves in a direction close to DoE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed memory systems by exploring different programming models, language extensions, compiler optimizations, as well as runtime library support.

  20. Systematic control of large computer programs

    International Nuclear Information System (INIS)

    Goedbloed, J.P.; Klieb, L.

    1986-07-01

    A package of CCL, UPDATE, and FORTRAN procedures is described which facilitates the systematic control and development of large scientific computer programs. The package provides a general tool box for this purpose which contains many conveniences for the systematic administration of files, editing, reformatting of line printer output files, etc. In addition, a small number of procedures is devoted to the problem of structured development of a large computer program which is used by a group of scientists. The essence of the method is contained in three procedures N, R, and X for the creation of a new UPDATE program library, its revision, and execution, resp., and a procedure REVISE which provides a joint editor - UPDATE session which combines the advantages of the two systems, viz. speed and rigor. (Auth.)

  1. Computer Aided Design System for Developing Musical Fountain Programs

    Institute of Scientific and Technical Information of China (English)

    刘丹; 张乃尧; 朱汉城

    2003-01-01

    A computer aided design system for developing musical fountain programs was developed with multiple functions such as intelligent design, 3-D animation, manual modification and synchronized motion to make the development process more efficient. The system first analyzes the music form and sentiment, using many basic features of the music, to select a basic fountain program. This program is then simulated with 3-D animation and modified manually to achieve the desired results. Finally, the program is transformed into a computer control program to control the musical fountain in time with the music. A prototype system for the musical fountain was also developed. It was tested with many styles of music and users were quite satisfied with its performance. By integrating various functions, the proposed computer aided design system for developing musical fountain programs greatly simplifies the design of musical fountain programs.

  2. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

    The evaluation of the effects of ionizing radiation and of the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, as it helps to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a more realistic computational phantom. However, manual description and verification of models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested on several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in the Treatment Plan System (TPS), as well as radiation exposure for the human body in radiation protection. (authors)

  3. An integrated introduction to computer graphics and geometric modeling

    CERN Document Server

    Goldman, Ronald

    2009-01-01

    … this book may be the first book on geometric modelling that also covers computer graphics. In addition, it may be the first book on computer graphics that integrates a thorough introduction to 'freedom' curves and surfaces and to the mathematical foundations for computer graphics. … the book is well suited for an undergraduate course. … The entire book is very well presented and obviously written by a distinguished and creative researcher and educator. It certainly is a textbook I would recommend. …-Computer-Aided Design, 42, 2010… Many books concentrate on computer programming and soon beco

  4. The Use of Molecular Modeling Programs in Medicinal Chemistry Instruction.

    Science.gov (United States)

    Harrold, Marc W.

    1992-01-01

    This paper describes and evaluates the use of a molecular modeling computer program (Alchemy II) in a pharmaceutical education program. Provided are the hardware requirements and basic program features as well as several examples of how this program and its features have been applied in the classroom. (GLR)

  5. 01010000 01001100 01000001 01011001: Play Elements in Computer Programming

    Science.gov (United States)

    Breslin, Samantha

    2013-01-01

    This article explores the role of play in human interaction with computers in the context of computer programming. The author considers many facets of programming including the literary practice of coding, the abstract design of programs, and more mundane activities such as testing, debugging, and hacking. She discusses how these incorporate the…

  6. The Design of Model-Based Training Programs

    Science.gov (United States)

    Polson, Peter; Sherry, Lance; Feary, Michael; Palmer, Everett; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper proposes a model-based training program for the skills necessary to operate advanced avionics systems that incorporate advanced autopilots and flight management systems. The training model is based on a formalism, the operational procedure model, that represents the mission model, the rules, and the functions of a modern avionics system. This formalism has been defined such that it can be understood and shared by pilots, the avionics software, and design engineers. Each element of the software is defined in terms of its intent (What?), the rationale (Why?), and the resulting behavior (How?). The Advanced Computer Tutoring project at Carnegie Mellon University has developed a type of model-based, computer aided instructional technology called cognitive tutors. They summarize numerous studies showing that training to a specified level of competence can be achieved in one third the time of conventional classroom instruction. We are developing a similar model-based training program for the skills necessary to operate the avionics. The model underlying the instructional program, which simulates the effects of pilots' entries and the behavior of the avionics, is based on the operational procedure model. Pilots are given a series of vertical flightpath management problems. Entries that result in violations, such as failure to make a crossing restriction or violating the speed limits, result in error messages with instruction. At any time, the flightcrew can request suggestions on the appropriate set of actions. A similar and successful training program for basic skills for the FMS on the Boeing 737-300 was developed and evaluated. The results strongly support the claim that the training methodology can be adapted to the cockpit.

  7. Computer Hardware, Advanced Mathematics and Model Physics pilot project final report

    International Nuclear Information System (INIS)

    1992-05-01

    The Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) Program was launched in January, 1990. A principal objective of the program has been to utilize the emerging capabilities of massively parallel scientific computers in the challenge of regional scale predictions of decade-to-century climate change. CHAMMP has already demonstrated the feasibility of achieving a 10,000 fold increase in computational throughput for climate modeling in this decade. What we have also recognized, however, is the need for new algorithms and computer software to capitalize on the radically new computing architectures. This report describes the pilot CHAMMP projects at the DOE National Laboratories and the National Center for Atmospheric Research (NCAR). The pilot projects were selected to identify the principal challenges to CHAMMP and to entrain new scientific computing expertise. The success of some of these projects has aided in the definition of the CHAMMP scientific plan. Many of the papers in this report have been or will be submitted for publication in the open literature. Readers are urged to consult with the authors directly for questions or comments about their papers

  8. An introduction to Python and computer programming

    CERN Document Server

    Zhang, Yue

    2015-01-01

    This book introduces Python programming language and fundamental concepts in algorithms and computing. Its target audience includes students and engineers with little or no background in programming, who need to master a practical programming language and learn the basic thinking in computer science/programming. The main contents come from lecture notes for engineering students from all disciplines, and has received high ratings. Its materials and ordering have been adjusted repeatedly according to classroom reception. Compared to alternative textbooks in the market, this book introduces the underlying Python implementation of number, string, list, tuple, dict, function, class, instance and module objects in a consistent and easy-to-understand way, making assignment, function definition, function call, mutability and binding environments understandable inside-out. By giving the abstraction of implementation mechanisms, this book builds a solid understanding of the Python programming language.
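
    The mutability and binding behaviour the book emphasises can be summarised in a short self-contained example (standard Python semantics, not text taken from the book):

      # Lists are mutable objects; assignment binds a second name to the same object.
      a = [1, 2, 3]
      b = a
      b.append(4)
      print(a)               # [1, 2, 3, 4] -- the change is visible through a

      # Strings are immutable; concatenation builds a new object and rebinds t.
      s = "abc"
      t = s
      t = t + "d"
      print(s, t)            # abc abcd -- s is unaffected

      def grow(items, value, bucket=[]):   # mutable default: bound once at def time
          bucket.append(value)
          items = items + [value]          # rebinds the local name only
          return items, bucket

      print(grow([0], 1))    # ([0, 1], [1])
      print(grow([0], 2))    # ([0, 2], [1, 2]) -- the default bucket persists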

  9. Computer modeling of ORNL storage tank sludge mobilization and mixing

    International Nuclear Information System (INIS)

    Terrones, G.; Eyler, L.L.

    1993-09-01

    This report presents and analyzes the results of the computer modeling of mixing and mobilization of sludge in horizontal, cylindrical storage tanks using submerged liquid jets. The computer modeling uses the TEMPEST computational fluid dynamics computer program. The horizontal, cylindrical storage tank configuration is similar to the Melton Valley Storage Tanks (MVST) at Oak Ridge National Laboratory (ORNL). The MVST tank contents exhibit non-homogeneous, non-Newtonian rheology characteristics. The eventual goals of the simulations are to determine under what conditions sludge mobilization using submerged liquid jets is feasible in tanks of this configuration, and to estimate mixing times required to approach homogeneity of the contents of the tanks

  10. Program package for the computation of lenses and deflectors

    International Nuclear Information System (INIS)

    Lencova, B.; Wisselink, G.

    1990-01-01

    In this paper a set of computer programs for the design of electrostatic and magnetic electron lenses and for the design of multipoles for electron microscopy and lithography is described. The two-dimensional field computation is performed by the finite-element method. In order to meet the high demands on accuracy, the programs include the use of a variable step in the fine mesh made with an automeshing procedure, improved methods for coefficient evaluation, a fast solution procedure for the linear equations, and modified algorithms for computation of multipoles and electrostatic lenses. They allow for a fast and accurate computation of electron optical elements. For the input and modification of data, and for presentation of results, graphical menu driven programs written for personal computers are used. For the computation of electron optical properties axial fields are used. (orig.)

  11. Finite Volume Based Computer Program for Ground Source Heat Pump System

    Energy Technology Data Exchange (ETDEWEB)

    Menart, James A. [Wright State University

    2013-02-22

    This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled "Finite Volume Based Computer Program for Ground Source Heat Pump Systems." The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function of time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump
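
    GEO2D and GEO3D themselves are not shown in this record; the sketch below illustrates only the basic idea named in the abstract, a finite-volume discretization of the transient energy equation, reduced to one dimension with the borehole wall held at the working-fluid temperature. Property values and dimensions are illustrative.

      import numpy as np

      k, rho, cp = 2.0, 1800.0, 1200.0        # W/(m K), kg/m^3, J/(kg K)
      alpha = k / (rho * cp)
      nx, dx = 100, 0.05                      # 5 m of ground, 5 cm control volumes
      dt = 0.4 * dx**2 / alpha                # stable explicit time step

      T_fluid, T_far = 2.0, 12.0              # degC, heat-extraction (heating) case
      T = np.full(nx, T_far)

      for step in range(20_000):
          faces = np.concatenate([[T_fluid], T, [T_far]])
          flux = -k * np.diff(faces) / dx     # heat flux across each face, +x direction
          # Control-volume balance: rho*cp*dx * dT/dt = flux_in - flux_out
          T += dt * (flux[:-1] - flux[1:]) / (rho * cp * dx)

      q_wall = k * (T[0] - T_fluid) / dx      # W/m^2 drawn from the ground into the fluid
      print("after %.0f days the wall heat flux is %.1f W/m^2"
            % (20_000 * dt / 86400, q_wall))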

  12. Intruder dose pathway analysis for the onsite disposal of radioactive wastes: the ONSITE/MAXI1 computer program

    International Nuclear Information System (INIS)

    Napier, B.A.; Peloquin, R.A.; Kennedy, W.E. Jr.; Neuder, S.M.

    1984-10-01

    Because of uncertainties associated with assessing the potential risks from onsite burials of radioactive waste, the US Nuclear Regulatory Commission (NRC) has amended its regulations to provide greater assurance that buried radioactive material will not present a hazard to public health and safety. The amended regulations now require licensees to apply for approval of proposed procedures for onsite disposal pursuant to 10 CFR 20.302. The NRC technically reviews these requests on a case-by-case basis. These technical reviews require modeling potential pathways to man and projecting radiation dose commitments. This document contains a summary of our efforts to develop human-intrusion scenarios and to modify a version of the MAXI computer program for potential use by the NRC in reviewing applications for onsite radioactive waste disposal. The documentation of the ONSITE/MAXI computer program is written for two audiences. The first (Audience A) includes persons concerned with the mathematical models and computer algorithms. The second (Audience B) includes persons concerned with exercising the computer program and scenarios for specific onsite disposal applications. Five sample problems are presented and discussed to assist the user in operating the computer program. Summaries of the input and output for the sample problems are included along with a discussion of the hand calculations performed to verify the correct operation of the computer program. Computer listings of the ONSITE/MAXI1 computer program with an abbreviated data base listing are included as Appendix 1 to this document. Finally, complete listings of the data base with listings of the special codes used to create the data base are included in Appendix 2 as a microfiche attachment to this document

  13. STARS: An Integrated, Multidisciplinary, Finite-Element, Structural, Fluids, Aeroelastic, and Aeroservoelastic Analysis Computer Program

    Science.gov (United States)

    Gupta, K. K.

    1997-01-01

    A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.

  14. SAFE users manual. Volume 4. Computer programs

    International Nuclear Information System (INIS)

    Grady, L.M.

    1983-06-01

    Documentation for the Safeguards Automated Facility Evaluation (SAFE) computer programs is presented. The documentation is in the form of subprogram trees, program abstracts, flowcharts, and listings. Listings are provided on microfiche

  15. A Computer-Aided Writing Program for Learning Disabled Adolescents.

    Science.gov (United States)

    Fais, Laurie; Wanderman, Richard

    The paper describes the application of a computer-assisted writing program in a special high school for learning disabled and dyslexic students and reports on a study of the program's effectiveness. Particular advantages of the Macintosh Computer for such a program are identified including use of the mouse pointing tool, graphic icons to identify…

  16. Development of computer program for simulation of an ice bank system operation, Part I: Mathematical modelling

    Energy Technology Data Exchange (ETDEWEB)

    Halasz, Boris; Grozdek, Marino; Soldo, Vladimir [Faculty of Mechanical Engineering and Naval Architecture, University of Zagreb, Ivana Lucica 5, 10 000 Zagreb (Croatia)

    2009-09-15

    Since the use of standard engineering methods in the process of an ice bank performance evaluation offers neither adequate flexibility nor accuracy, the aim of this research was to provide a powerful tool for an industrial design of an ice storage system allowing to account for the various design parameters and system arrangements over a wide range of time varying operating conditions. In this paper the development of a computer application for the prediction of an ice bank system operation is presented. Static, indirect, cool thermal storage systems with external ice on coil building/melting were considered. The mathematical model was developed by means of energy and mass balance relations for each component of the system and is basically divided into two parts, the model of an ice storage system and the model of a refrigeration unit. Heat transfer processes in an ice silo were modelled by use of empirical correlations while the performance of refrigeration unit components were based on manufacturers data. Programming and application design were made in Fortran 95 language standard. Input of data is enabled through drop down menus and dialog boxes, while the results are presented via figures, diagrams and data (ASCII) files. In addition, to demonstrate the necessity for development of simulation program a case study was performed. Simulation results clearly indicate that no simple engineering methods or rule of thumb principles could be utilised in order to validate performance of an ice bank system properly. (author)
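
    The published model resolves individual components and heat-transfer correlations; as a much-reduced illustration of the energy and mass balance approach, the sketch below steps a lumped charging (ice-building) phase with invented capacities.

      # Lumped energy/mass balance for the charging phase of an ice silo.
      CP_WATER = 4186.0        # J/(kg K)
      H_FUSION = 334_000.0     # J/kg
      M_WATER = 20_000.0       # kg of water in the silo
      Q_CHILLER = 80_000.0     # W of refrigeration capacity (assumed constant)

      dt, t = 60.0, 0.0        # 1-minute time steps
      T_water, m_ice = 12.0, 0.0

      while m_ice < 0.5 * M_WATER:          # charge until half the water is frozen
          if T_water > 0.0:
              # Sensible cooling of the water down to the freezing point.
              T_water -= Q_CHILLER * dt / (M_WATER * CP_WATER)
              T_water = max(T_water, 0.0)
          else:
              # Latent storage: all removed heat builds ice on the coils.
              m_ice += Q_CHILLER * dt / H_FUSION
          t += dt

      print("charging time to 50%% ice fraction: %.1f h" % (t / 3600.0))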

  17. An airborne dispersion/dose assessment computer program. Phase 1

    International Nuclear Information System (INIS)

    Scott, C.K.; Kennedy, E.R.; Hughs, R.

    1991-05-01

    The Atomic Energy Control Board (AECB) staff need an airborne dispersion-dose assessment computer program for a microcomputer. The program must be capable of analyzing the dispersion of both radioactive and non-radioactive materials. A further requirement is that it be implemented on the AECB complex of microcomputers and that it have an advanced graphical user interface. A survey of computer programs was conducted to determine which, if any, could meet the AECB's requirements in whole or in part. Ten programs were selected for detailed review, including programs for nuclear and non-radiological emergencies. None of the available programs for radiation dose assessment meets all the requirements, for reasons of user interaction, method of source term estimation or site specificity. It is concluded that the best option for meeting the AECB requirements is to adopt the CAMEO program (specifically the ALOHA portion), which has a superior graphical user interface, and add the necessary models for radiation dose assessment

  18. A uniform approach for programming distributed heterogeneous computing systems.

    Science.gov (United States)

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-12-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater's performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations.

  19. Computer Assistance for Writing Interactive Programs: TICS.

    Science.gov (United States)

    Kaplow, Roy; And Others

    1973-01-01

    Investigators developed an on-line, interactive programing system--the Teacher-Interactive Computer System (TICS)--to provide assistance to those who were not programers, but nevertheless wished to write interactive instructional programs. TICS had two components: an author system and a delivery system. Underlying assumptions were that…

  20. Computer modelling of eddy current probes

    International Nuclear Information System (INIS)

    Sullivan, S.P.

    1992-01-01

    Computer programs have been developed for modelling impedance and transmit-receive eddy current probes in two-dimensional axisymmetric configurations. These programs, which are based on analytic equations, simulate bobbin probes in infinitely long tubes and surface probes on plates. They calculate the probe signal due to uniform variations in conductor thickness, resistivity and permeability. These signals depend on probe design and frequency. A finite element numerical program has been procured to calculate magnetic permeability in non-linear ferromagnetic materials. Permeability values from these calculations can be incorporated into the above analytic programs to predict signals from eddy current probes with permanent magnets in ferromagnetic tubes. These programs were used to test various probe designs for new testing applications. Measurements of magnetic permeability in magnetically biased ferromagnetic materials have been performed by superimposing experimental signals from special laboratory ET probes on impedance-plane diagrams calculated using these programs. (author). 3 refs., 2 figs

  1. Application of a general fluid mechanics program to NTP system modeling

    International Nuclear Information System (INIS)

    Lee, S.K.

    1993-01-01

    An effort is currently underway at NASA and the Department of Energy (DOE) to develop an accurate model for predicting nuclear thermal propulsion (NTP) system performance. The objective of the effort is to develop several levels of computer programs which vary in detail and complexity according to user's needs. The current focus is on the Level 1 steady-state, parametric system model. This system model will combine a general fluid mechanics program, SAFSIM, with the ability to analyze turbines, pumps, nozzles, and reactor physics. SAFSIM (System Analysis Flow SIMulator) is a FORTRAN computer program that simulates integrated performance of systems involving fluid mechanics, heat transfer, and reactor dynamics. SAFSIM has the versatility to allow simulation of almost any system, including a nuclear reactor system. The focus of this paper is the validation of SAFSIM's capabilities as a base computational engine for a nuclear thermal propulsion system model. Validation is being accomplished by modeling of a nuclear engine test using SAFSIM and comparing the results to known experimental data

  2. Mastering cloud computing foundations and applications programming

    CERN Document Server

    Buyya, Rajkumar; Selvi, SThamarai

    2013-01-01

    Mastering Cloud Computing is designed for undergraduate students learning to develop cloud computing applications. Tomorrow's applications won't live on a single computer but will be deployed from and reside on a virtual server, accessible anywhere, any time. Tomorrow's application developers need to understand the requirements of building apps for these virtual systems, including concurrent programming, high-performance computing, and data-intensive systems. The book introduces the principles of distributed and parallel computing underlying cloud architectures and specifical

  3. The Dynamic Geometrisation of Computer Programming

    Science.gov (United States)

    Sinclair, Nathalie; Patterson, Margaret

    2018-01-01

    The goal of this paper is to explore dynamic geometry environments (DGE) as a type of computer programming language. Using projects created by secondary students in one particular DGE, we analyse the extent to which the various aspects of computational thinking--including both ways of doing things and particular concepts--were evident in their…

  4. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    NREL uses computational modeling in its bioenergy research. Plant cell walls are the source of biofuels and biomaterials, and modeling investigates their properties. Quantum mechanical models are used to study chemical and electronic properties and processes to reduce barriers.

  5. RECON: a computer program for analyzing repository economics. Documentation and user's manual. Revision 1

    International Nuclear Information System (INIS)

    Clark, L.L.; Schutz, M.E.; Luksic, A.T.

    1985-07-01

    From 1981 through 1984 the Pacific Northwest Laboratory developed a computer model named RECON to calculate repository costs from parametric data input. The objective of the program has been to develop the capability to evaluate the effect on costs of changes in repository design parameters and operating scenario assumptions. This report documents the development of the model through September of 1984. Included in the report are: (1) descriptions of model development and the underlying equations, assumptions and definitions; (2) descriptions of data input using either card images or an interactive data input program; and (3) detailed listings of the program and definitions of program variables. Cost estimates generated using the model have been verified against independent estimates and good agreement has been obtained. 2 refs

  6. The Harwell TAILS computer program user's manual

    International Nuclear Information System (INIS)

    Rouse, K.D.; Cooper, M.J.

    1980-11-01

    The Harwell TAILS computer program is a versatile program for crystal structure refinement through the analysis of neutron or X-ray diffraction data from single crystals or powders. The main features of the program are described and details are given of the data input and output specifications. (author)

  7. Steerability Analysis of Tracked Vehicles: Theory and User’s Guide for Computer Program TVSTEER

    Science.gov (United States)

    1986-08-01

    Baladi, George Y.; Barnes, Donald E.; Berger, Rebecca P. (Structures Laboratory, Department of the Army, Waterways Experiment Station, Corps of Engineers). The mathematical model was formulated by Drs. George Y. Baladi and Behzad Rohani. The logic and computer programming were accomplished by Dr. Baladi and ...

  8. Finite element and node point generation computer programs used for the design of toroidal field coils in tokamak fusion devices

    International Nuclear Information System (INIS)

    Smith, R.A.

    1975-06-01

    The structural analysis of toroidal field coils in Tokamak fusion machines can be performed with the finite element method. This technique has been employed for design evaluations of toroidal field coils on the Princeton Large Torus (PLT), the Poloidal Diverter Experiment (PDX), and the Tokamak Fusion Test Reactor (TFTR). The application of the finite element method can be simplified with computer programs that are used to generate the input data for the finite element code. There are three areas of data input where significant automation can be provided by supplementary computer codes. These concern the definition of geometry by a node point mesh, the definition of the finite elements from the geometric node points, and the definition of the node point force/displacement boundary conditions. The node point forces in a model of a toroidal field coil are computed from the vector cross product of the coil current and the magnetic field. The computer programs named PDXNODE and ELEMENT are described. The program PDXNODE generates the geometric node points of a finite element model for a toroidal field coil. The program ELEMENT defines the finite elements of the model from the node points and from material property considerations. The program descriptions include input requirements, the output, the program logic, the methods of generating complex geometries with multiple runs, computational time and computer compatibility. The output format of PDXNODE and ELEMENT makes them compatible with PDXFORC and two general-purpose finite element computer codes: ANSYS, the Engineering Analysis System written by Swanson Analysis Systems, Inc., and WECAN, the Westinghouse Electric Computer Analysis general-purpose finite element program. The Fortran listings of PDXNODE and ELEMENT are provided
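    The node point force computation mentioned above is simply the Lorentz force on each current-carrying element; a hedged NumPy illustration (not the PDXFORC code itself, with invented current and field values) is:

      import numpy as np

      # Illustrative Lorentz-force evaluation for one coil segment (not PDXFORC).
      current = 1.0e4                          # A, assumed coil current
      segment = np.array([0.0, 0.5, 0.0])      # m, current-carrying segment vector
      b_field = np.array([0.0, 0.0, 2.5])      # T, assumed local magnetic field

      # F = I * (L x B): force on the segment, to be distributed to its node points.
      force = current * np.cross(segment, b_field)
      print(force)  # N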

  9. The SX Solver: A Computer Program for Analyzing Solvent-Extraction Equilibria: Version 3.0

    International Nuclear Information System (INIS)

    Lumetta, Gregg J.

    2001-01-01

    A new computer program, the SX Solver, has been developed to analyze solvent-extraction equilibria. The program operates out of Microsoft Excel and uses the built-in Solver function to minimize the sum of the squares of the residuals between measured and calculated distribution coefficients. The extraction of nitric acid by tributyl phosphate has been modeled to illustrate the program's use

  10. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using the GPU for general purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.

  11. Geochemical databases. Part 1. Pmatch: a program to manage thermochemical data. Part 2. The experimental validation of geochemical computer models

    International Nuclear Information System (INIS)

    Pearson, F.J. Jr.; Avis, J.D.; Nilsson, K.; Skytte Jensen, B.

    1993-01-01

    This work is carried out under a cost-sharing contract with the European Atomic Energy Community in the framework of its programme on Management and Storage of Radioactive Wastes. Part 1: PMATCH, A Program to Manage Thermochemical Data, describes the development and use of a computer program by means of which new thermodynamic data from the literature may be referenced to a common frame and thereby become internally consistent with an existing database. The report presents the relevant thermodynamic expressions, and their use in the program is discussed. When there are not sufficient thermodynamic data available to describe a species' behaviour under all conceivable conditions, the problems arising are thoroughly discussed and the available data are handled by approximating expressions. Part 2: The Experimental Validation of Geochemical Computer Models presents the results of experimental investigations of the equilibria established in aqueous suspensions of mixtures of carbonate minerals (calcium, magnesium, manganese and europium carbonates), compared with theoretical calculations made by means of the geochemical JENSEN program. The study revealed that the geochemical computer program worked well and that its database was of sufficient validity. However, it was observed that experimental difficulties could hardly be avoided when, as here, a gaseous component took part in the equilibria. Whereas the magnesium and calcium carbonates did not demonstrate mutual solid solubility, mixing manganese and calcium carbonates produced abnormal effects, resulting in a diminished solubility of both manganese and calcium. With tracer amounts of europium added to a suspension of calcite in sodium carbonate solutions, long-term experiments revealed a transition after 1-2 months, whereby the tracer became more strongly adsorbed onto calcite. The transition is interpreted as the nucleation and formation of a surface phase incorporating the 'species' NaEu(CO3)2

  12. Computer model for predicting the effect of inherited sterility on population growth

    International Nuclear Information System (INIS)

    Carpenter, J.E.; Layton, R.C.

    1993-01-01

    A Fortran based computer program was developed to facilitate modelling different inherited sterility data sets under various paradigms. The model was designed to allow variable input for several different parameters, such as rate of increase per generation, release ratio and initial population levels, reproductive rates and sex ratios resulting from different matings, and the number of nights a female is active in mating and oviposition. The model and computer program should be valuable tools for recognizing areas in which information is lacking and for identifying the effect that different parameters can have on the efficacy of the inherited sterility method. (author). 8 refs, 4 figs

  13. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  14. An Analysis of OpenACC Programming Model: Image Processing Algorithms as a Case Study

    Directory of Open Access Journals (Sweden)

    M. J. Mišić

    2014-06-01

    Full Text Available Graphics processing units and similar accelerators have been intensively used in general purpose computations for several years. In the last decade, GPU architecture and organization changed dramatically to support an ever-increasing demand for computing power. Along with changes in hardware, novel programming models have been proposed, such as NVIDIA’s Compute Unified Device Architecture (CUDA) and Open Computing Language (OpenCL) by the Khronos group. Although numerous commercial and scientific applications have been developed using these two models, they still impose a significant challenge for less experienced users. There are users from various scientific and engineering communities who would like to speed up their applications without the need to deeply understand a low-level programming model and the underlying hardware. In 2011, the OpenACC programming model was launched. Much like OpenMP for multicore processors, OpenACC is a high-level, directive-based programming model for manycore processors like GPUs. This paper presents an analysis of the OpenACC programming model and its applicability in typical domains like image processing. Three simple image processing algorithms have been implemented for execution on the GPU with OpenACC. The results were compared with their sequential counterparts and are briefly discussed.

  15. Software survey: VOSviewer, a computer program for bibliometric mapping

    NARCIS (Netherlands)

    N.J.P. van Eck (Nees Jan); L. Waltman (Ludo)

    2010-01-01

    We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps.

  16. Factors that Influence the Success of Male and Female Computer Programming Students in College

    Science.gov (United States)

    Clinkenbeard, Drew A.

    As the demand for a technologically skilled work force grows, experience and skill in computer science have become increasingly valuable for college students. However, the number of students graduating with computer science degrees is not growing proportional to this need. Traditionally several groups are underrepresented in this field, notably women and students of color. This study investigated elements of computer science education that influence academic achievement in beginning computer programming courses. The goal of the study was to identify elements that increase success in computer programming courses. A 38-item questionnaire was developed and administered during the Spring 2016 semester at California State University Fullerton (CSUF). CSUF is an urban public university comprised of about 40,000 students. Data were collected from three beginning programming classes offered at CSUF. In total 411 questionnaires were collected resulting in a response rate of 58.63%. Data for the study were grouped into three broad categories of variables. These included academic and background variables; affective variables; and peer, mentor, and role-model variables. A conceptual model was developed to investigate how these variables might predict final course grade. Data were analyzed using statistical techniques such as linear regression, factor analysis, and path analysis. Ultimately this study found that peer interactions, comfort with computers, computer self-efficacy, self-concept, and perception of achievement were the best predictors of final course grade. In addition, the analyses showed that male students exhibited higher levels of computer self-efficacy and self-concept compared to female students, even when they achieved comparable course grades. Implications and explanations of these findings are explored, and potential policy changes are offered.

  17. Method and computer program product for maintenance and modernization backlogging

    Science.gov (United States)

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
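    A minimal sketch of the arithmetic stated in the abstract (the term names come from the abstract; the function itself is illustrative, not the patented implementation):

      def future_facility_conditions(maintenance_cost, modernization_factor, backlog_factor):
          """Future facility conditions = time-period-specific maintenance cost
          + modernization factor + backlog factor, as stated in the abstract."""
          return maintenance_cost + modernization_factor + backlog_factor

      # Example usage with made-up time-period-specific values.
      print(future_facility_conditions(1.2e6, 0.4e6, 0.9e6))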

  18. TVENT: a computer program for analysis of tornado-induced transients in ventilation systems

    International Nuclear Information System (INIS)

    Duerre, K.H.; Andrae, R.W.; Gregory, W.S.

    1978-07-01

    The report describes TVENT, a portable FORTRAN computer program for predicting flows and pressures in a ventilation system subject to a tornado. The pressure and flow values calculated by TVENT can be used as a basis for structural analysis. TVENT is a one-dimensional, lumped-parameter model with incompressible flow augmented by fluid storage. The theoretical basis for the mathematical modeling and analysis is presented, and a description of the input for the computer code is provided. Modeling techniques specific to ventilation systems are described. Sample problems illustrate the use of TVENT in analyzing ventilation systems. Other sample problems illustrate modeling techniques used in reducing complex systems

  19. Computations of concentration of radon and its decay products against time. Computer program; Obliczanie koncentracji radonu i jego produktow rozpadu w funkcji czasu. Program komputerowy

    Energy Technology Data Exchange (ETDEWEB)

    Machaj, B. [Institute of Nuclear Chemistry and Technology, Warsaw (Poland)

    1996-12-31

    This research aims to develop a device for continuous monitoring of radon in the air by measuring the alpha activity of radon and its short-lived decay products. The influence of the variation of the alpha activity of radon and its daughters on the measured results is important and requires knowledge of this variation with time. Measurement of the alpha radiation of radon and of its short-lived decay products therefore requires knowledge of how the radon concentration and its decay products vary with time. A computer program in the Turbo Pascal language was therefore developed to perform the computations employing the known relations involved, the program being adapted for IBM PC computers. The program enables computation of the activity of ²²²Rn and its daughter products ²¹⁸Po, ²¹⁴Pb, ²¹⁴Bi and ²¹⁴Po every 1 min within the period of 0-255 min for any state of radiation equilibrium between the radon and its daughter products. The program also permits computation of the alpha activity of ²²²Rn + ²¹⁸Po + ²¹⁴Po against time and of the total alpha activity over a selected interval of time. The results of the computations are stored on the computer hard disk in ASCII format and can be used by a graphics program, e.g. DrawPerfect, to make diagrams. The equations employed for computation of the alpha activity of radon and its decay products, as well as a description of the program functions, are given. (author). 2 refs, 4 figs.
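    For readers unfamiliar with how such activities are obtained, the sketch below integrates the ²²²Rn → ²¹⁸Po → ²¹⁴Pb → ²¹⁴Bi decay chain numerically with SciPy (²¹⁴Po is taken to be in equilibrium with ²¹⁴Bi because of its microsecond half-life); it is not the Turbo Pascal program described above, and the half-lives are rounded textbook values.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Approximate half-lives in seconds: Rn-222, Po-218, Pb-214, Bi-214.
      HALF_LIVES = np.array([3.82 * 86400, 3.1 * 60, 26.8 * 60, 19.9 * 60])
      LAM = np.log(2) / HALF_LIVES  # decay constants (1/s)

      def chain(t, n):
          """dN/dt for a series decay chain with no fresh radon supply."""
          dn = np.empty_like(n)
          dn[0] = -LAM[0] * n[0]
          for i in range(1, len(n)):
              dn[i] = LAM[i - 1] * n[i - 1] - LAM[i] * n[i]
          return dn

      # Start from pure radon with 1 Bq of activity (N = A / lambda), daughters absent.
      n0 = np.array([1.0 / LAM[0], 0.0, 0.0, 0.0])
      t_eval = np.arange(256) * 60.0                     # every minute, 0-255 min
      sol = solve_ivp(chain, (0.0, t_eval[-1]), n0, t_eval=t_eval, rtol=1e-8)
      activities = LAM[:, None] * sol.y                  # A_i(t) = lambda_i * N_i(t)
      print(activities[:, -1])                           # daughter activities at 255 min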

  20. Computer programs for display. [magnetic tapes - project planning/NASA programs

    Science.gov (United States)

    1975-01-01

    The developments of an information storage and retrieval system are presented. Computer programs used in the system are described; the programs allow display messages to be placed on disks in an off-line environment permitting a more efficient use of memory. A time table that shows complete and scheduled developments of the system is given.

  1. The NEA computer program library: a possible GDMS application

    International Nuclear Information System (INIS)

    Schuler, W.

    1978-01-01

    The NEA Computer Program Library maintains a series of eleven sequential computer files, used for linked applications in managing its stock of computer codes for nuclear reactor calculations, storing index and program abstract information, and administering its service to requesters. The high data redundancy between the files suggests that a database approach would be valid, and this paper suggests a possible 'schema' for a CODASYL GDMS

  2. 78 FR 15734 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-03-12

    ... 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... computer matching program between the Department of Homeland Security/U.S. Citizenship and Immigration... Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy Protection Amendments of 1990...

  3. 78 FR 38724 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-06-27

    ... 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... Agreement that establishes a computer matching program between the Department of Homeland Security/U.S... and Privacy Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy Protection...

  4. 78 FR 15733 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-03-12

    ... 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... computer matching program between the Department of Homeland Security/U.S. Citizenship and Immigration... Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy Protection Amendments of 1990...

  5. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  6. User's manual for computer program BASEPLOT

    Science.gov (United States)

    Sanders, Curtis L.

    2002-01-01

    The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.

  7. SONATINA-2H: a computer program for seismic analysis of the two-dimensional horizontal slice HTGR core

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1990-02-01

    A computer program, SONATINA-2H, has been developed for predicting the behavior of a two-dimensional horizontal HTGR core under seismic excitation. SONATINA-2H is a general two-dimensional computer program capable of analyzing the horizontal slice HTGR core with the fixed side reflector blocks, its restraint structures and the core support structure. In the analytical model, each block is treated as a rigid body, represents one column of the reactor core, and is connected to the core support structure by means of column springs and viscous dampers. A single dashpot model is used for the collision process between adjacent blocks. The core support structure is represented by a single block. The computer program SONATINA-2H is capable of analyzing the core behavior for an excitation input applied simultaneously in two mutually perpendicular horizontal directions. The present report gives the theoretical formulation of the analytical model, a user's manual describing the input and output formats, and sample problems. (author)

  8. POBE: A Computer Program for Optimal Design of Multi-Subject Blocked fMRI Experiments

    Directory of Open Access Journals (Sweden)

    Bärbel Maus

    2014-01-01

    Full Text Available For functional magnetic resonance imaging (fMRI) studies, researchers can use multi-subject blocked designs to identify active brain regions for a certain stimulus type of interest. Before performing such an experiment, careful planning is necessary to obtain efficient stimulus effect estimators within the available financial resources. The optimal number of subjects and the optimal scanning time for a multi-subject blocked design with fixed experimental costs can be determined using optimal design methods. In this paper, the user-friendly computer program POBE 1.2 (program for optimal design of blocked experiments, version 1.2) is presented. POBE provides a graphical user interface for fMRI researchers to easily and efficiently design their experiments. The computer program POBE calculates the optimal number of subjects and the optimal scanning time for user-specified experimental factors and model parameters so that the statistical efficiency is maximised for a given study budget. POBE can also be used to determine the minimum budget for a given power. Furthermore, a maximin design can be determined as an efficient design for a possible range of values for the unknown model parameters. In this paper, the computer program is described and illustrated with typical experimental factors for a blocked fMRI experiment.

  9. HAL/SM language specification. [programming languages and computer programming for space shuttles

    Science.gov (United States)

    Williams, G. P. W., Jr.; Ross, C.

    1975-01-01

    A programming language is presented for the flight software of the NASA Space Shuttle program. It is intended to satisfy virtually all of the flight software requirements of the space shuttle. To achieve this, it incorporates a wide range of features, including applications-oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks. It is a higher order language designed to allow programmers, analysts, and engineers to communicate with the computer in a form approximating natural mathematical expression. Parts of the English language are combined with standard notation to provide a tool that readily encourages programming without demanding computer hardware expertise. Block diagrams and flow charts are included. The semantics of the language is discussed.

  10. Computationally intensive econometrics using a distributed matrix-programming language.

    Science.gov (United States)

    Doornik, Jurgen A; Hendry, David F; Shephard, Neil

    2002-06-15

    This paper reviews the need for powerful computing facilities in econometrics, focusing on concrete problems which arise in financial economics and in macroeconomics. We argue that the profession is being held back by the lack of easy-to-use generic software which is able to exploit the availability of cheap clusters of distributed computers. Our response is to extend, in a number of directions, the well-known matrix-programming interpreted language Ox developed by the first author. We note three possible levels of extensions: (i) Ox with parallelization explicit in the Ox code; (ii) Ox with a parallelized run-time library; and (iii) Ox with a parallelized interpreter. This paper studies and implements the first case, emphasizing the need for deterministic computing in science. We give examples in the context of financial economics and time-series modelling.

  11. A Computer Program for Practical Semivariogram Modeling and Ordinary Kriging: A Case Study of Porosity Distribution in an Oil Field

    Science.gov (United States)

    Mert, Bayram Ali; Dag, Ahmet

    2017-12-01

    In this study, a practical and educational geostatistical program (JeoStat) was first developed, and then an example analysis of porosity parameter distribution using oilfield data was presented. With this program, two- or three-dimensional variogram analysis can be performed using normal, log-normal or indicator-transformed data. In these analyses, JeoStat offers seven commonly used theoretical variogram models (Spherical, Gaussian, Exponential, Linear, Generalized Linear, Hole Effect and Paddington Mix) to the users. These theoretical models can be easily and quickly fitted to experimental models using a mouse. JeoStat uses the ordinary kriging interpolation technique for computation of point or block estimates, and cross-validation techniques for validation of the fitted theoretical model. All the results obtained by the analysis, as well as all the graphics such as histograms, variograms and kriging estimation maps, can be saved to the hard drive, including digitised graphics and maps. In addition, the numerical values of any point in the map can be monitored using a mouse and text boxes. This program is available to students, researchers, consultants and corporations of any size free of charge. The JeoStat software package and source codes are available at: http://www.jeostat.com/JeoStat_2017.0.rar.
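    As a small, generic illustration of one of the theoretical models listed above (not JeoStat's own code), a spherical semivariogram can be written as follows; the nugget, sill and range values in the example are arbitrary:

      import numpy as np

      def spherical_variogram(h, nugget, sill, rng):
          """Spherical semivariogram: rises from the nugget and levels off at
          the sill once the lag distance h reaches the range rng."""
          h = np.asarray(h, dtype=float)
          gamma = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
          return np.where(h < rng, gamma, sill)

      # Example: evaluate the fitted model at a few lag distances (arbitrary parameters).
      print(spherical_variogram([0.0, 50.0, 150.0, 400.0], nugget=0.1, sill=1.0, rng=300.0))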

  12. Flexible building stock modelling with array-programming

    DEFF Research Database (Denmark)

    Brøgger, Morten; Wittchen, Kim Bjarne

    2017-01-01

    Many building stock models employ archetype-buildings in order to capture the essential characteristics of a diverse building stock. However, these models often require multiple archetypes, which make them inflexible. This paper proposes an array-programming based model, which calculates the heat...... tend to overestimate potential energy-savings, if we do not consider these discrepancies. The proposed model makes it possible to compute and visualize potential energy-savings in a flexible and transparent way....
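    The array-programming idea can be sketched in a few lines of NumPy: each entry of an array holds one building (or archetype) and the heat loss of the whole stock is computed in a single vectorised expression; the coefficients below are invented for illustration and are not the model's actual inputs.

      import numpy as np

      # Heat-loss coefficients U*A (W/K), one entry per building; values are illustrative.
      ua = np.array([180.0, 240.0, 95.0, 310.0])
      degree_hours = 90_000.0   # K*h per year for an assumed climate

      # Annual heat demand per building (kWh) and for the whole stock, in one expression.
      heat_demand_kwh = ua * degree_hours / 1000.0
      print(heat_demand_kwh, heat_demand_kwh.sum())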

  13. Regression modeling methods, theory, and computation with SAS

    CERN Document Server

    Panik, Michael

    2009-01-01

    Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs. The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,

  14. Computer programs in accelerator physics

    International Nuclear Information System (INIS)

    Keil, E.

    1984-01-01

    Three areas of accelerator physics are discussed in which computer programs have been applied with much success: i) single-particle beam dynamics in circular machines, i.e. the design and matching of machine lattices; ii) computations of electromagnetic fields in RF cavities and similar objects, useful for the design of RF cavities and for the calculation of wake fields; iii) simulation of betatron and synchrotron oscillations in a machine with non-linear elements, e.g. sextupoles, and of bunch lengthening due to longitudinal wake fields. (orig.)

  15. Evolution of a minimal parallel programming model

    International Nuclear Information System (INIS)

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    2017-01-01

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.

  16. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling have advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  17. PEDIC - A COMPUTER PROGRAM TO ESTIMATE THE EFFECT OF EVACUATION ON POPULATION EXPOSURE FOLLOWING ACUTE RADIONUCLIDE RELEASES TO THE ATMOSPHERE

    Energy Technology Data Exchange (ETDEWEB)

    Strenge, D. L.; Peloquin, R. A.

    1981-01-01

    The computer program PEDIC, for estimating the effect of evacuation on population exposure, is described. The program uses joint-frequency, annual-average meteorological data and a simple population evacuation model to estimate the exposure reduction due to movement of people away from radioactive plumes following an acute release of activity. Atmospheric dispersion is based on a sector-averaged Gaussian model with consideration of plume rise and building wake effects. Appendices to the report provide details of the computer program design, a program listing, input card preparation instructions and sample problems.
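    The sector-averaged Gaussian model mentioned above has a standard closed form; a hedged sketch (using the common 16-sector, 22.5-degree average and assumed input values, not PEDIC's own routines) is:

      import math

      def sector_averaged_chi_over_q(x_m, sigma_z_m, wind_speed_ms, release_height_m):
          """Sector-averaged Gaussian plume dilution factor chi/Q (s/m^3) for a
          16-sector wind rose; 2.032 = sqrt(2/pi) / (2*pi/16)."""
          return (2.032 / (sigma_z_m * wind_speed_ms * x_m)
                  * math.exp(-release_height_m**2 / (2.0 * sigma_z_m**2)))

      # Example: 1 km downwind, sigma_z = 30 m (assumed), 3 m/s wind, 20 m release height.
      print(sector_averaged_chi_over_q(1000.0, 30.0, 3.0, 20.0))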

  18. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation

  19. Injecting Artificial Memory Errors Into a Running Computer Program

    Science.gov (United States)

    Bornstein, Benjamin J.; Granat, Robert A.; Wagstaff, Kiri L.

    2008-01-01

    Single-event upsets (SEUs) or bitflips are computer memory errors caused by radiation. BITFLIPS (Basic Instrumentation Tool for Fault Localized Injection of Probabilistic SEUs) is a computer program that deliberately injects SEUs into another computer program, while the latter is running, for the purpose of evaluating the fault tolerance of that program. BITFLIPS was written as a plug-in extension of the open-source Valgrind debugging and profiling software. BITFLIPS can inject SEUs into any program that can be run on the Linux operating system, without needing to modify the program s source code. Further, if access to the original program source code is available, BITFLIPS offers fine-grained control over exactly when and which areas of memory (as specified via program variables) will be subjected to SEUs. The rate of injection of SEUs is controlled by specifying either a fault probability or a fault rate based on memory size and radiation exposure time, in units of SEUs per byte per second. BITFLIPS can also log each SEU that it injects and, if program source code is available, report the magnitude of effect of the SEU on a floating-point value or other program variable.
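    To make the idea concrete without referring to BITFLIPS' actual Valgrind plug-in, the toy Python sketch below flips one randomly chosen bit in a buffer, which is the essential operation an SEU injector performs:

      import random

      def inject_seu(buffer, rng=None):
          """Flip one randomly chosen bit of a bytearray in place and return its
          bit index. Toy illustration of SEU injection, not the BITFLIPS tool."""
          rng = rng or random.Random()
          bit_index = rng.randrange(len(buffer) * 8)
          buffer[bit_index // 8] ^= 1 << (bit_index % 8)
          return bit_index

      data = bytearray(b"running program state")
      print(inject_seu(data, random.Random(42)), data)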

  20. Computer programs for nonlinear algebraic equations

    International Nuclear Information System (INIS)

    Asaoka, Takumi

    1977-10-01

    We have provided principal computer subroutines for obtaining numerical solutions of nonlinear algebraic equations through a review of the various methods. Benchmark tests were performed on these subroutines to grasp their characteristics in comparison with existing subroutines. As computer programs based on the secant method, subroutines for Muller's method using Chambers' algorithm were newly developed, in addition to subroutines for Muller's method itself. The programs based on the Muller-Chambers method are especially useful for low-order polynomials with complex coefficients, except for cases such as triple roots or three close roots. In addition, we have provided subroutines based on Madsen's algorithm, a variant of Newton's method. These subroutines have proved very useful as standard programs because all the roots are found accurately in every case, though they take longer computing time than other subroutines for low-order polynomials. It is also shown that an existing subroutine based on Bairstow's method gives the fastest algorithm for polynomials with complex coefficients, except for cases such as triple roots. We have also provided subroutines to estimate error bounds for all the roots produced with the various algorithms. (auth.)
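    For readers unfamiliar with the secant-family methods discussed above, a compact generic Python version of Muller's method is sketched below; it is not the FORTRAN subroutine from the report and omits the Chambers refinements.

      import cmath

      def muller(f, x0, x1, x2, tol=1e-12, max_iter=100):
          """Muller's method: fit a parabola through three iterates and step to
          its nearest root; complex roots are handled naturally."""
          for _ in range(max_iter):
              f0, f1, f2 = f(x0), f(x1), f(x2)
              h1, h2 = x1 - x0, x2 - x1
              d1, d2 = (f1 - f0) / h1, (f2 - f1) / h2
              a = (d2 - d1) / (h2 + h1)
              b = a * h2 + d2
              c = f2
              root = cmath.sqrt(b * b - 4 * a * c)
              denom = b + root if abs(b + root) >= abs(b - root) else b - root
              dx = -2 * c / denom
              x3 = x2 + dx
              if abs(dx) < tol:
                  return x3
              x0, x1, x2 = x1, x2, x3
          return x3

      # Example: converges to a complex cube root of unity for x**3 - 1.
      print(muller(lambda x: x**3 - 1, -1.0, 0.5j, 1.0j))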

  1. The Outlook for Computer Professions: 1985 Rewrites the Program.

    Science.gov (United States)

    Drake, Larry

    1986-01-01

    The author states that graduates of junior college programs who learn COBOL will continue to find jobs, but employers will increasingly seek college graduates when filling positions for computer programers and systems analysts. Areas of growth for computer applications (services, military, data communications, and artificial intelligence) are…

  2. Near-Surface Seismic Velocity Data: A Computer Program For ...

    African Journals Online (AJOL)

    A computer program (NESURVELANA) has been developed in the Visual Basic programming language to carry out near-surface velocity analysis. The method of analysis includes algorithm design and Visual Basic code generation for plotting arrival time (ms) against geophone depth (m), employing the ...

  3. Computer modeling of flow induced in-reactor vibrations

    International Nuclear Information System (INIS)

    Turula, P.; Mulcahy, T.M.

    1977-01-01

    An assessment of the reliability of finite element method computer models, as applied to the computation of flow induced vibration response of components used in nuclear reactors, is presented. The prototype under consideration was the Fast Flux Test Facility reactor being constructed for US-ERDA. Data were available from an extensive test program which used a scale model simulating the hydraulic and structural characteristics of the prototype components, subjected to scaled prototypic flow conditions as well as to laboratory shaker excitations. Corresponding analytical solutions of the component vibration problems were obtained using the NASTRAN computer code. Modal analyses and response analyses were performed. The effect of the surrounding fluid was accounted for. Several possible forcing function definitions were considered. Results indicate that modal computations agree well with experimental data. Response amplitude comparisons are good only under conditions favorable to a clear definition of the structural and hydraulic properties affecting the component motion. 20 refs

  4. Multi-binding site model-based curve-fitting program for the computation of RIA data

    International Nuclear Information System (INIS)

    Malan, P.G.; Ekins, R.P.; Cox, M.G.; Long, E.M.R.

    1977-01-01

    In this paper, a comparison will be made of model-based and empirical curve-fitting procedures. The implementation of a multiple binding-site curve-fitting model, which successfully fits a wide range of assay data and which can be run on a mini-computer, is described. The model-based approach also provides estimates of the binding-site concentrations and of the respective equilibrium constants; these have been used for refining assay conditions using computer optimisation techniques. (orig./AJ) [de]
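    The law-of-mass-action relation underlying such a model can be illustrated with a generic two-site least-squares fit (a sketch with synthetic data, not the mini-computer program described above):

      import numpy as np
      from scipy.optimize import curve_fit

      def bound(free, q1, k1, q2, k2):
          """Bound ligand for two independent binding-site classes with
          capacities q1, q2 and equilibrium constants k1, k2 (mass action)."""
          return q1 * k1 * free / (1 + k1 * free) + q2 * k2 * free / (1 + k2 * free)

      # Synthetic "assay" data generated from assumed parameters plus noise.
      rng = np.random.default_rng(0)
      free = np.logspace(-3, 2, 40)
      observed = bound(free, 1.0, 5.0, 4.0, 0.05) * (1 + 0.03 * rng.standard_normal(free.size))

      params, _ = curve_fit(bound, free, observed, p0=[0.5, 1.0, 1.0, 0.1])
      print(params)  # recovered estimates of q1, k1, q2, k2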

  5. Distributed parallel computing in stochastic modeling of groundwater systems.

    Science.gov (United States)

    Dong, Yanhui; Li, Guomin; Xu, Haizhen

    2013-03-01

    Stochastic modeling is a rapidly evolving, popular approach to the study of the uncertainty and heterogeneity of groundwater systems. However, the use of Monte Carlo-type simulations to solve practical groundwater problems often encounters computational bottlenecks that hinder the acquisition of meaningful results. To improve the computational efficiency, a system that combines stochastic model generation with MODFLOW-related programs and distributed parallel processing is investigated. The distributed computing framework, called the Java Parallel Processing Framework, is integrated into the system to allow the batch processing of stochastic models in distributed and parallel systems. As an example, the system is applied to the stochastic delineation of well capture zones in the Pinggu Basin in Beijing. Through the use of 50 processing threads on a cluster with 10 multicore nodes, the execution times of 500 realizations are reduced to 3% compared with those of a serial execution. Through this application, the system demonstrates its potential in solving difficult computational problems in practical stochastic modeling. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.
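    The batch-processing pattern described above (many independent realizations farmed out to workers) can be sketched with Python's standard library; run_realization below is a stand-in for a real MODFLOW run, not the authors' Java Parallel Processing Framework.

      from concurrent.futures import ProcessPoolExecutor
      import random

      def run_realization(seed):
          """Pretend 'model run': draw a random hydraulic conductivity and return
          a fake capture-zone metric. A real study would launch MODFLOW here."""
          rng = random.Random(seed)
          conductivity = rng.lognormvariate(0.0, 1.0)
          return 100.0 / conductivity  # placeholder result

      if __name__ == "__main__":
          with ProcessPoolExecutor(max_workers=4) as pool:
              results = list(pool.map(run_realization, range(500)))
          print(sum(results) / len(results))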

  6. A REDUCE program for symbolic computation of Puiseux expansions

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Tiller, P.

    1991-01-01

    A program is described for the computation of Puiseux expansions of algebraic functions. The Newton diagram method is used for construction of the initial coefficients of all the Puiseux series at the given point. The program is written in the computer algebra language REDUCE. Some illustrative examples are given. 20 refs.

  7. The Implementation of Blended Learning Using Android-Based Tutorial Video in Computer Programming Course II

    Science.gov (United States)

    Huda, C.; Hudha, M. N.; Ain, N.; Nandiyanto, A. B. D.; Abdullah, A. G.; Widiaty, I.

    2018-01-01

    Computer programming courses are theoretical. Sufficient practice is necessary to facilitate conceptual understanding and to encourage creativity in designing computer programs/animations. The development of tutorial videos for Android-based blended learning is needed to guide students. Using Android-based instructional material, students can learn independently anywhere and anytime. The tutorial video can facilitate students’ understanding of the concepts, materials, and procedures of programming/animation making in detail. This study employed a Research and Development method adapting Thiagarajan’s 4D model. The developed Android-based instructional material and tutorial video were validated by experts in instructional media and experts in physics education. The expert validation results showed that the Android-based material was comprehensive and very feasible. The tutorial video was deemed feasible as it received an average score of 92.9%. It was also revealed that students’ conceptual understanding, skills, and creativity in designing computer programs/animations improved significantly.

  8. FISS: a computer program for reactor systems studies

    International Nuclear Information System (INIS)

    Tamm, H.; Sherman, G.R.; Wright, J.H.; Nieman, R.E.

    1979-08-01

    FISS is a computer code for use in investigating alternative fuel cycle strategies for Canadian and world nuclear programs. The code performs a system simulation accounting for dynamic effects of growing nuclear systems. Facilities in the model include storage for irradiated fuel, mines, plants for enrichment, fuel fabrication, fuel reprocessing and heavy water, and reactors. FISS is particularly useful for comparing various reactor strategies and studying sensitivities of resource consumption, capital investment and energy costs with changes in fuel cycle parameters, reactor parameters and financial variables. (author)

  9. Technical review of the dispersion and dose models used in the MILDOS computer program

    International Nuclear Information System (INIS)

    Horst, T.W.; Soldat, J.K.; Bander, T.J.

    1982-05-01

    The MILDOS computer code is used to estimate impacts of radioactive emissions from uranium milling facilities. This report reviews the technical basis of the models used in the MILDOS computer code. The models were compared with state-of-the-art predictions, taking into account the intended uses of the MILDOS code. Several suggested modifications are presented and the technical basis for those changes is given

  10. The Caltech Concurrent Computation Program - Project description

    Science.gov (United States)

    Fox, G.; Otto, S.; Lyzenga, G.; Rogstad, D.

    1985-01-01

    The Caltech Concurrent Computation Program, which studies basic issues in computational science, is described. The research builds on initial work in which novel concurrent hardware, the necessary systems software to use it, and twenty significant scientific implementations running on the initial 32-, 64-, and 128-node hypercube machines have been constructed. A major goal of the program will be to extend this work into new disciplines and more complex algorithms, including general packages that decompose arbitrary problems in major application areas. New high-performance concurrent processors with up to 1024 nodes, over a gigabyte of memory and multigigaflop performance are being constructed. The implementations cover a wide range of problems in areas such as high energy physics and astrophysics, condensed matter, chemical reactions, plasma physics, applied mathematics, geophysics, simulation, CAD for VLSI, graphics and image processing. The products of the research program include the concurrent algorithms, hardware, systems software, and complete program implementations.

  11. Heat exchanger performance analysis programs for the personal computer

    International Nuclear Information System (INIS)

    Putman, R.E.

    1992-01-01

    Numerous utility industry heat exchange calculations are repetitive and thus lend themselves to being performed on a personal computer. These programs may be regarded as engineering tools which, when put together, form a toolbox. The practicing Results Engineer in the utility industry, however, wants programs that are not only robust and easy to use but that also run on both desktop and laptop PCs. Laptops offer the opportunity to take the computer into the plant or control room and to process test or operating data right on the spot. Most programs evolve through needs which arise in the course of day-to-day work. This paper describes several of the more useful programs of this type and outlines some of the guidelines to be followed when designing personal computer programs for use by the practicing Results Engineer.

  12. The SIMRAND 1 computer program: Simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The SIMRAND I Computer Program (Version 5.0 x 0.3), written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles, is described. The SIMRAND I Computer Program comprises eleven modules: a main routine and ten subroutines. Two additional files are used at compile time; one inserts the system or task equations into the source code, while the other inserts the dimension statements and common blocks. The SIMRAND I Computer Program can be run on most microcomputers or mainframe computers with only minor modifications to the computer code.

  13. Computer-based Programs in Speech Therapy of Dyslalia and Dyslexia- Dysgraphia

    Directory of Open Access Journals (Sweden)

    Mirela Danubianu

    2010-04-01

    Full Text Available In recent years, researchers and therapists in speech therapy have become increasingly concerned with the development and use of computer programs in speech disorders therapy. The main objective of this study was to evaluate the therapeutic effectiveness of computer-based programs for the Romanian language in speech therapy. We present experimental research assessing the effectiveness of computer programs in the therapy of the speech disorders dyslalia, dyslexia and dysgraphia. Methodologically, the use of the computer in the therapeutic phases was carried out with computer-based programs (Logomon, Dislex-Test, etc.) that we elaborated and experimented with during several years of therapeutic activity. The sample used in our experiments comprised 120 subjects: for each disorder group, 60 children with speech disorders were selected, 30 for the experimental ('computer-based') group and 30 for the control ('classical method') group. The study hypotheses tested whether the results obtained by the subjects in the experimental group improved significantly after using the computer-based program compared to the subjects in the control group, who did not use the program but received classical therapy. The hypotheses were confirmed for the speech disorders included in this research. The conclusions of the study confirm the advantages of using computer-based programs in speech therapy, both in correcting these disorders and through the positive influence these programs have on the development of children's personality.

  14. SONATINA-2V: a computer program for seismic analysis of the two-dimensional vertical slice HTGR core

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1982-07-01

    A computer program, SONATINA-2V, has been developed for predicting the behavior of a two-dimensional vertical slice of an HTGR core under seismic excitation. SONATINA-2V is a general two-dimensional computer program capable of analyzing the vertical slice HTGR core with the permanent side reflector blocks and its restraint structures. In the analytical model, each block is treated as a rigid body and is restrained by dowel pins which restrict relative horizontal movement but allow vertical and rocking motions between upper and lower blocks. Coulomb friction is taken into account between blocks and between dowel pin and hole. A spring-dashpot model is used for the collision process between adjacent blocks. The core support structure is represented by a single block. The computer program SONATINA-2V is capable of analyzing the core behavior for an excitation input applied simultaneously in both the vertical and horizontal directions. Analytical results obtained from SONATINA-2V are compared with experimental results and are found to be in good agreement. The computer program can thus be used to predict, with good accuracy, the behavior of the HTGR core under seismic excitation. The present report gives the theoretical formulation of the analytical model, a user's manual describing the input and output format, and sample problems. (author)

  15. Program POD; A computer code to calculate nuclear elastic scattering cross sections with the optical model and neutron inelastic scattering cross sections by the distorted-wave born approximation

    International Nuclear Information System (INIS)

    Ichihara, Akira; Kunieda, Satoshi; Chiba, Satoshi; Iwamoto, Osamu; Shibata, Keiichi; Nakagawa, Tsuneo; Fukahori, Tokio; Katakura, Jun-ichi

    2005-07-01

    The computer code POD was developed to calculate angle-differential cross sections and analyzing powers for shape-elastic scattering in collisions of neutrons or light ions with a target nucleus. The cross sections are computed with the optical model. Angle-differential cross sections for neutron inelastic scattering can also be calculated with the distorted-wave Born approximation. The optical model potential parameters are the most essential inputs for these model computations. In this program, the cross sections and analyzing powers are obtained by using existing local or global parameters. The parameters can also be input by users. In this report, the theoretical formulas, the computational methods, and the input parameters are explained. Sample inputs and outputs are also presented. (author)

  16. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming. Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can and cannot do, from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational phenomena.

  17. Input data instructions - simplified documentation of the computer program ANSYS. Report for 10 June 1976--31 March 1978

    International Nuclear Information System (INIS)

    Chang, P.Y.

    1978-02-01

    A simplified version of the input instructions for the computer program 'ANSYS' is presented for the non-linear elastoplastic analysis of a ship collision protection barrier structure. All essential information necessary for the grillage model is summarized, while the instructions for other types of problems are omitted. A benchmark example is given for checking the computer program.

  18. ACRO - a computer program for calculating organ doses from acute or chronic inhalation and ingestion of radionuclides

    International Nuclear Information System (INIS)

    Hirayama, Akio; Kishimoto, Yoichiro; Shinohara, Kunihiko.

    1978-01-01

    The computer program ACRO has been developed to calculate organ doses from acute or chronic inhalation and ingestion of radionuclides. The ICRP Task Group Lung Model (TGLM) was used as the inhalation model, and a simple one-compartment model for ingestion. The program is written in FORTRAN IV and can be executed with storage requirements of about 260 K bytes. (auth.)

  19. A Computer Program for Modeling the Conversion of Organic Waste to Energy

    Directory of Open Access Journals (Sweden)

    Pragasen Pillay

    2011-11-01

    Full Text Available This paper presents a tool for the analysis of the conversion of organic waste into energy. The tool is a program that uses waste characterization parameters and mass flow rates at each stage of the waste treatment process to predict the resulting products. The specific waste treatment process analysed in this paper is anaerobic digestion. The waste treatment stages of the anaerobic digestion process are: conditioning of input waste, secondary treatment, drying of sludge, conditioning of digestate, treatment of digestate, storage of liquid and solid effluent, disposal of liquid and solid effluents, and purification, utilization and storage of combustible gas. The program uses mass balance equations to compute the amount of CH4, NH3, CO2 and H2S produced from anaerobic digestion of organic waste, and hence the energy available. Case studies are also presented.
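    The paper's own mass-balance relations are not reproduced in this record. As a rough illustration of the idea, the sketch below estimates digester gas yield and energy with the classical Buswell stoichiometry; the substrate formula, heating value and all other numbers are assumptions, not values from the paper, and the sulfur term needed for H2S is omitted.

      # Rough sketch only: classical Buswell stoichiometry for an assumed
      # substrate C5H7O2N (a common proxy for organic waste biomass).  The
      # paper's own mass-balance equations and data are not used here, and
      # H2S is not modelled because the substrate carries no sulfur term.
      def buswell_yields(c, h, o, n):
          """Moles of CH4, CO2 and NH3 per mole of substrate CcHhOoNn."""
          ch4 = c / 2 + h / 8 - o / 4 - 3 * n / 8
          co2 = c / 2 - h / 8 + o / 4 + 3 * n / 8
          return ch4, co2, n

      def energy_from_waste(mass_kg, c=5, h=7, o=2, n=1):
          """Approximate CH4 volume (m3) and energy (MJ) from a substrate mass."""
          molar_mass = 12.0 * c + 1.0 * h + 16.0 * o + 14.0 * n   # g/mol
          moles = mass_kg * 1000.0 / molar_mass
          ch4, co2, nh3 = buswell_yields(c, h, o, n)
          ch4_m3 = moles * ch4 * 0.0224      # ideal-gas molar volume at STP, m3/mol
          energy_mj = ch4_m3 * 35.8          # lower heating value of CH4, MJ/m3
          return ch4_m3, energy_mj

      if __name__ == "__main__":
          volume, energy = energy_from_waste(1000.0)   # one tonne of substrate
          print(f"CH4: {volume:.0f} m3, energy: {energy:.0f} MJ")

    A model of the kind the paper describes would additionally track the mass flows through every treatment stage and include the sulfur balance that yields H2S.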

  20. Algebraic computing program for studying the gauge theory

    International Nuclear Information System (INIS)

    Zet, G.

    2005-01-01

    An algebraic computing program running on the Maple V platform is presented. The program is devoted to the study of gauge theory with an internal Lie group as local symmetry. The physical quantities (gauge potentials, strength tensors, dual tensors, etc.) are introduced either as equations in terms of previously defined quantities (tensors), or by manual entry of the component values. The components of the strength tensor and of its dual are obtained with respect to a given metric of the space-time used for describing the gauge theory. We choose a Minkowski space-time endowed with spherical symmetry and give some examples of algebraic computing that are adequate for studying electroweak or gravitational interactions. The field equations are also obtained and their solutions are determined using the DEtools facilities of the Maple V computing program. (author)

  1. Measuring attitudes towards nuclear and technological risks (computer programs in SPSS language)

    International Nuclear Information System (INIS)

    Leonin, T.V. Jr.

    1981-04-01

    A number of methodologies have been developed for measuring public attitudes towards nuclear and other technological risks. The Fishbein model, as modified by the IAEA Risk Assessment group, which was found to be applicable to Philippine public attitude measurements, is briefly explained together with two other models that are used for comparative correlations. A step-by-step guide to the procedures involved and the calculations required in measuring and analyzing attitudes using these models is likewise given, with special emphasis on the computer processing aspect. The use of the Statistical Package for the Social Sciences (SPSS) in the analysis is also described, and a number of computer programs in SPSS for the various statistical calculations required in the analysis are presented. (author)

  2. Computer program for computing the properties of seventeen fluids. [cryogenic liquids

    Science.gov (United States)

    Brennan, J. A.; Friend, D. G.; Arp, V. D.; Mccarty, R. D.

    1992-01-01

    The present study describes modifications and additions to the MIPROPS computer program for calculating the thermophysical properties of 17 fluids. These changes include adding new fluids, new properties, and a new interface to the program. The new program allows the user to select the input and output parameters and the units to be displayed for each parameter. Fluids added to the MIPROPS program are carbon dioxide, carbon monoxide, deuterium, helium, normal hydrogen, and xenon. The most recent modifications to the MIPROPS program are the addition of viscosity and thermal conductivity correlations for parahydrogen and the addition of the fluids normal hydrogen and xenon. The recently added interface considerably increases the program's utility.

  3. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.

    2010-01-01

    An approach to the automatic control of commercial computer programs is presented. The automation system of the EXAFS spectrometer (managed by a PC running DOS) was connected to the commercial program controlling the CCD detector (managed by a PC running Windows). The combined system is used to automate the intermediate processing of amplitude spectra in EXAFS spectrum measurements at the Kurchatov SR source.

  4. Radiation and pregnancy: a self-teaching computer program

    International Nuclear Information System (INIS)

    Kumar, Pratik; Rehani, M.M.

    1994-01-01

    Two interactive self-teaching computer programs have been developed. The first program, which consists of fifteen topics, apprises users of the broad spectrum of radiation risks to the unborn during pregnancy and of the status of various views in this regard. The second program estimates the dose to the uterus in sixteen radiological examinations, depending upon the radiographic parameters used. The doses to the uterus, and hence to the fetus, calculated by the computer program for the different radiographic views were found to be in agreement with those reported in the NRPB-R200 survey report. Together, the two programs provide a better understanding of the rather confusing situation regarding the dilemma about termination of pregnancy following inadvertent radiation exposure, the apprehension about radiation effects in the minds of prescribing doctors and patients, dose estimation, and advice to pregnant workers and the like. (author). 10 refs

  5. A computer program for two-particle intrinsic coefficients of fractional parentage

    Science.gov (United States)

    Deveikis, A.

    2012-06-01

    A Fortran 90 program CESOS for the calculation of the two-particle intrinsic coefficients of fractional parentage for several j-shells with isospin and an arbitrary number of oscillator quanta (CESOs) is presented. The implemented procedure for CESOs calculation consistently follows the principles of antisymmetry and translational invariance. The approach is based on a simple enumeration scheme for antisymmetric many-particle states, efficient algorithms for calculation of the coefficients of fractional parentage for j-shells with isospin, and construction of the subspace of the center-of-mass Hamiltonian eigenvectors corresponding to the minimal eigenvalue equal to 3/2 (in ℏω). The program provides fast calculation of CESOs for a given particle number and produces results possessing small numerical uncertainties. The introduced CESOs may be used for calculation of expectation values of two-particle nuclear shell-model operators within the isospin formalism. Program summaryProgram title: CESOS Catalogue identifier: AELT_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AELT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 10 932 No. of bytes in distributed program, including test data, etc.: 61 023 Distribution format: tar.gz Programming language: Fortran 90 Computer: Any computer with a Fortran 90 compiler Operating system: Windows XP, Linux RAM: The memory demand depends on the number of particles A and the excitation energy of the system E. Computation of the A=6 particle system with the total angular momentum J=0 and the total isospin T=1 requires around 4 kB of RAM at E=0,˜3 MB at E=3, and ˜172 MB at E=5. Classification: 17.18 Nature of problem: The code CESOS generates a list of two-particle intrinsic coefficients of fractional parentage for several

  6. Web Program for Development of GUIs for Cluster Computers

    Science.gov (United States)

    Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward

    2003-01-01

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.

  7. Nuclear model codes available at the Nuclear Energy Agency Computer Program Library (NEA-CPL)

    International Nuclear Information System (INIS)

    Sartori, E.; Garcia Viedma, L. de

    1976-01-01

    This paper briefly outlines the objectives of the NEA-CPL and its activities in the field of Nuclear Model Computer Codes. A short description of the computer codes available from the CPL in this field is also presented. (author)

  8. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at the LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on achieving maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community.

  9. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  10. Newnes circuit calculations pocket book with computer programs

    CERN Document Server

    Davies, Thomas J

    2013-01-01

    Newnes Circuit Calculations Pocket Book: With Computer Programs presents equations, examples, and problems in circuit calculations. The text includes 300 computer programs that help solve the problems presented. The book is comprised of 20 chapters that tackle different aspects of circuit calculation. The coverage of the text includes dc voltage, dc circuits, and network theorems. The book also covers oscillators, phasors, and transformers. The text will be useful to electrical engineers and other professionals whose work involves electronic circuitry.

  11. Modeling the Isentropic Head Value of Centrifugal Gas Compressor using Genetic Programming

    Directory of Open Access Journals (Sweden)

    Safiyullah Ferozkhan

    2016-01-01

    Full Text Available Gas compressor performance is vital in the oil and gas industry because of the criticality of the equipment, which requires continuous operation. Plant operators often face difficulties in predicting the appropriate time for maintenance and usually rely on the time-based predictive maintenance intervals recommended by the original equipment manufacturer (OEM). The objective of this work is to develop a computational model that finds the isentropic head value using genetic programming. The isentropic head value is calculated from the OEM performance chart; the inlet mass flow rate and the speed of the compressor are taken as the input values. The results obtained from the GP computational models show good agreement with experimental and target data, with an average prediction error of 1.318%. The genetic programming computational model will assist machinery engineers in quantifying the performance deterioration of gas compressors, and the results from this study will then be utilized to estimate future maintenance requirements based on historical data. In general, this genetic programming modelling provides a powerful solution for gas compressor operators to realize a predictive maintenance approach in their operations.
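    Purely as an illustration of this kind of symbolic regression, the sketch below fits a head surface to synthetic (flow, speed) data with the third-party gplearn library; neither the library, the synthetic "chart" surface nor any of the parameter values come from the paper.

      # Illustrative sketch only: synthetic data stand in for the OEM chart,
      # and gplearn's SymbolicRegressor stands in for the authors' own
      # genetic-programming implementation.
      import numpy as np
      from gplearn.genetic import SymbolicRegressor

      rng = np.random.default_rng(0)
      flow = rng.uniform(50.0, 150.0, 200)        # inlet mass flow (synthetic units)
      speed = rng.uniform(6000.0, 9000.0, 200)    # shaft speed, rpm (synthetic)
      # Invented smooth "chart" surface used only to generate training targets.
      head = 1.0e-6 * speed**2 - 0.002 * flow**2 + 0.1 * flow

      X = np.column_stack([flow, speed])
      model = SymbolicRegressor(population_size=500,
                                generations=20,
                                function_set=('add', 'sub', 'mul', 'div'),
                                parsimony_coefficient=0.001,
                                random_state=0)
      model.fit(X, head)
      print(model._program)                        # evolved symbolic expression
      print("example prediction:", model.predict(X[:1]))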

  12. DITTY - a computer program for calculating population dose integrated over ten thousand years

    International Nuclear Information System (INIS)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.

    1986-03-01

    The computer program DITTY (Dose Integrated Over Ten Thousand Years) was developed to determine the collective dose from long-term nuclear waste disposal sites resulting from groundwater pathways. DITTY estimates the time integral of collective dose over a ten-thousand-year period for time-variant radionuclide releases to surface waters, wells, or the atmosphere. This document includes the following information on DITTY: a description of the mathematical models, the program design, data file requirements, input preparation, output interpretation, sample problems, and program-generated diagnostic messages.
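    As a toy illustration of the kind of integral DITTY evaluates (none of its pathway or transport models are represented), the sketch below integrates a collective dose rate from a single exponentially decaying release over a ten-thousand-year window; the half-life, release rate and dose factor are placeholders.

      # Toy time integral of a collective dose rate over 10 000 years.
      # The release rate, dose conversion factor and half-life below are
      # placeholders, not DITTY data or models.
      import numpy as np

      half_life_y = 5730.0                      # years (C-14-like, illustrative)
      lam = np.log(2.0) / half_life_y
      release_rate0 = 1.0e9                     # Bq/year at time zero (placeholder)
      dose_per_bq = 5.0e-10                     # person-Sv per Bq released (placeholder)

      t = np.linspace(0.0, 1.0e4, 10_001)       # years, one-year resolution
      dose_rate = release_rate0 * np.exp(-lam * t) * dose_per_bq   # person-Sv/year

      collective_dose = np.trapz(dose_rate, t)  # person-Sv over 10 000 years
      print(f"collective dose over 10 000 years: {collective_dose:.3e} person-Sv")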

  13. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Science.gov (United States)

    Wang, Danli; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  14. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Directory of Open Access Journals (Sweden)

    Danli Wang

    2014-01-01

    Full Text Available Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity.

  15. Field load and displacement boundary condition computer program used for the finite element analysis and design of toroidal field coils in a tokamak

    International Nuclear Information System (INIS)

    Smith, R.A.

    1975-06-01

    The design evaluation of toroidal field coils on the Princeton Large Torus (PLT), the Poloidal Divertor Experiment (PDX) and the Tokamak Fusion Test Reactor (TFTR) has been performed by structural analysis with the finite element method. The technique employed has been simplified with supplementary computer programs that are used to generate the input data for the finite element computer program. Significant automation has been provided by computer codes in three areas of data input: the definition of coil geometry by a mesh of node points, the definition of finite elements via the node points, and the definition of the node point force/displacement boundary conditions. The computer programs used to perform these functions are named PDXNODE, ELEMENT and PDXFORC. The geometric finite element modeling options for toroidal field coils provided by PDXNODE include one-fourth or one-half symmetric sections of circular coils, oval-shaped coils or dee-shaped coils with or without a beveled wedging surface. The program ELEMENT, which defines the finite elements for input to the finite element computer code, can provide considerable time and labor savings when defining the model of coils of non-uniform cross-section or when defining the model of coils whose material properties are different in the R and THETA directions due to the laminations of alternate epoxy and copper windings. The modeling features provided by the program ELEMENT have been used to analyze the PLT and the TFTR toroidal field coils with integral support structures. The computer program PDXFORC computes the node point forces in a model of a toroidal field coil from the vector cross product of the coil current and the magnetic field. The model can be of one-half or one-fourth symmetry, to be consistent with the node model defined by PDXNODE, and the magnetic field is computed from toroidal or poloidal coils.
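    A minimal sketch of the cross-product force computation described for PDXFORC is given below; the toy geometry, current and field values are placeholders, and the lumping of half of each segment force onto its end nodes is an assumption about how nodal loads might be formed, not a description of the actual code.

      # Minimal sketch: force on each coil segment is I * (dL x B), lumped
      # half-and-half onto the segment's end nodes.  Geometry, current and
      # field are placeholders, not PDX/PLT/TFTR data.
      import numpy as np

      nodes = np.array([[1.0, 0.0, 0.0],
                        [1.0, 0.5, 0.0],
                        [1.0, 1.0, 0.0]])       # m, toy coil polyline
      current = 20.0e3                          # A (placeholder)
      B = np.array([0.0, 0.0, 5.0])             # T, uniform field (placeholder)

      node_force = np.zeros_like(nodes)
      for a in range(len(nodes) - 1):
          b = a + 1
          dL = nodes[b] - nodes[a]              # segment vector
          F = current * np.cross(dL, B)         # Lorentz force on the segment
          node_force[a] += 0.5 * F              # lump half to each end node
          node_force[b] += 0.5 * F

      print(node_force)                         # N, nodal load vectors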

  16. Medium-term generation programming in competitive environments: a new optimisation approach for market equilibrium computing

    International Nuclear Information System (INIS)

    Barquin, J.; Centeno, E.; Reneses, J.

    2004-01-01

    The paper proposes a model to represent medium-term hydro-thermal operation of electrical power systems in deregulated frameworks. The model objective is to compute the oligopolistic market equilibrium point in which each utility maximises its profit, based on other firms' behaviour. This problem is not an optimisation one. The main contribution of the paper is to demonstrate that, nevertheless, under some reasonable assumptions, it can be formulated as an equivalent minimisation problem. A computer program has been coded by using the proposed approach. It is used to compute the market equilibrium of a real-size system. (author)

  17. Concentrator optical characterization using computer mathematical modelling and point source testing

    Science.gov (United States)

    Dennison, E. W.; John, S. L.; Trentelman, G. F.

    1984-01-01

    The optical characteristics of a paraboloidal solar concentrator are analyzed using the intercept factor curve (a format for image data) to describe the results of a mathematical model and to represent reduced data from experimental testing. This procedure makes it possible not only to test an assembled concentrator, but also to evaluate single optical panels or to conduct non-solar tests of an assembled concentrator. The use of three-dimensional ray tracing computer programs to calculate the mathematical model is described. These ray tracing programs can include any type of optical configuration from simple paraboloids to array of spherical facets and can be adapted to microcomputers or larger computers, which can graphically display real-time comparison of calculated and measured data.
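    The sketch below is a simplified example of the kind of three-dimensional ray trace described: rays parallel to the axis of a paraboloid reflect through the focus, Gaussian slope errors blur the focal image, and the intercept factor is the fraction of rays landing within a given receiver radius. The focal length, slope error, receiver radii and the small-angle error model are all assumptions, not values or methods from the paper.

      # Hedged sketch: rays parallel to the axis of a paraboloid
      # z = r^2 / (4 f) reflect through the focus; Gaussian slope errors
      # (doubled on reflection, small-angle model) blur the focal image.
      # The intercept factor is the fraction of rays inside a receiver
      # radius at the focal plane.  All numbers are placeholders.
      import numpy as np

      rng = np.random.default_rng(1)
      f = 5.0                      # focal length, m
      R_ap = 4.0                   # aperture radius, m
      sigma_slope = 2.0e-3         # rms surface slope error, rad
      n_rays = 100_000

      # Hit points sampled uniformly over the aperture area.
      r = R_ap * np.sqrt(rng.uniform(size=n_rays))
      phi = rng.uniform(0.0, 2.0 * np.pi, n_rays)
      hit = np.column_stack([r * np.cos(phi), r * np.sin(phi), r**2 / (4.0 * f)])

      # Ideal reflected direction passes through the focus (0, 0, f).
      d = np.array([0.0, 0.0, f]) - hit
      d /= np.linalg.norm(d, axis=1, keepdims=True)

      # Perturb directions by roughly twice the slope error.
      d[:, 0] += 2.0 * sigma_slope * rng.standard_normal(n_rays)
      d[:, 1] += 2.0 * sigma_slope * rng.standard_normal(n_rays)
      d /= np.linalg.norm(d, axis=1, keepdims=True)

      # Propagate to the focal plane z = f and measure the radial miss.
      t = (f - hit[:, 2]) / d[:, 2]
      spot = hit[:, :2] + t[:, None] * d[:, :2]
      miss = np.hypot(spot[:, 0], spot[:, 1])

      for r_rec in (0.02, 0.05, 0.10):          # receiver radii, m
          print(f"intercept factor within {r_rec:.2f} m: {np.mean(miss <= r_rec):.3f}")

    Sweeping the receiver radius in this way yields the intercept factor curve the abstract uses as its common format for calculated and measured image data.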

  18. The AAHA Computer Program. American Animal Hospital Association.

    Science.gov (United States)

    Albers, J W

    1986-07-01

    The American Animal Hospital Association Computer Program should benefit all small animal practitioners. Through the availability of well-researched and well-developed certified software, veterinarians will have increased confidence in their purchase decisions. With the expansion of computer applications to improve practice management efficiency, veterinary computer systems will further justify their initial expense. The development of the Association's veterinary computer network will provide a variety of important services to the profession.

  19. Using the Computer in Special Vocational Programs. Inservice Activities.

    Science.gov (United States)

    Lane, Kenneth; Ward, Raymond

    This inservice manual is intended to assist vocational education teachers in using the techniques of computer-assisted instruction in special vocational education programs. Addressed in the individual units are the following topics: the basic principles of computer-assisted instruction (TRS-80 computers and typing on a computer keyboard); money…

  20. Programs for Testing Processor-in-Memory Computing Systems

    Science.gov (United States)

    Katz, Daniel S.

    2006-01-01

    The Multithreaded Microbenchmarks for Processor-In-Memory (PIM) Compilers, Simulators, and Hardware are computer programs arranged in a series for use in testing the performances of PIM computing systems, including compilers, simulators, and hardware. The programs at the beginning of the series test basic functionality; the programs at subsequent positions in the series test increasingly complex functionality. The programs are intended to be used while designing a PIM system, and can be used to verify that compilers, simulators, and hardware work correctly. The programs can also be used to enable designers of these system components to examine tradeoffs in implementation. Finally, these programs can be run on non-PIM hardware (either single-threaded or multithreaded) using the POSIX pthreads standard to verify that the benchmarks themselves operate correctly. [POSIX (Portable Operating System Interface for UNIX) is a set of standards that define how programs and operating systems interact with each other. pthreads is a library of pre-emptive thread routines that comply with one of the POSIX standards.]

  1. Computer program user's manual for FIREFINDER digital topographic data verification library dubbing system

    Science.gov (United States)

    Ceres, M.; Heselton, L. R., III

    1981-11-01

    This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.

  2. TRIGASIM: A computer program to simulate a TRIGA Mark I Reactor

    International Nuclear Information System (INIS)

    Ruby, Lawrence

    1992-01-01

    A Fortran-77 computer program has been written which simulates the operation of a TRIGA Mark I Reactor. The 'operator' has options, at 1-second intervals, of raising rods, lowering rods, holding rods steady, dropping a rod, or scramming the reactor. Results are printed to the screen and to two output files: a tabular record and a logarithmic plot of the power. The point kinetics equations are programmed with six delayed groups, quasi-static power feedback, and forward differencing. A pulsing option is available, with a simulation which employs the Fuchs model. A pulse-tail model has been devised to simulate behavior for a few minutes following a pulse. Both graphic and tabular output are also available for the pulses. (author)
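    As a minimal sketch of the forward-difference point-kinetics scheme the abstract describes, the snippet below integrates the six-delayed-group equations for a small step reactivity insertion; the kinetic parameters and the insertion are generic illustrative values, not TRIGA data, and no temperature feedback, pulsing or pulse-tail model is included.

      # Minimal forward-difference (explicit Euler) point kinetics with six
      # delayed-neutron groups.  All parameters are generic illustrative
      # values, not TRIGA data; no feedback or pulse model is included.
      import numpy as np

      beta_i = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])
      lam_i = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # 1/s
      beta = beta_i.sum()
      Lambda = 1.0e-4            # prompt neutron generation time, s (illustrative)
      rho = 0.3 * beta           # step reactivity insertion (illustrative)

      dt, t_end = 1.0e-4, 10.0
      n = 1.0                                    # relative power
      C = beta_i / (lam_i * Lambda) * n          # equilibrium precursor levels
      print_every = int(round(1.0 / dt))         # report once per second

      for step in range(int(t_end / dt)):
          dn = ((rho - beta) / Lambda * n + np.dot(lam_i, C)) * dt
          dC = (beta_i / Lambda * n - lam_i * C) * dt
          n += dn
          C += dC
          if step % print_every == 0:
              print(f"t = {step * dt:5.1f} s   relative power = {n:8.3f}")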

  3. Scientific Computing in the CH Programming Language

    Directory of Open Access Journals (Sweden)

    Harry H. Cheng

    1993-01-01

    Full Text Available We have developed a general-purpose block-structured interpretive programming language. The syntax and semantics of this language, called CH, are similar to those of C. CH retains most features of C from the scientific computing point of view. In this paper, the extension of C to CH for the numerical computation of real numbers is described. The metanumbers −0.0, 0.0, Inf, −Inf, and NaN are introduced in CH. Through these metanumbers, the power of the IEEE 754 arithmetic standard is easily available to the programmer. These metanumbers are extended to commonly used mathematical functions in the spirit of the IEEE 754 standard and ANSI C. The definitions for manipulation of these metanumbers in I/O; arithmetic, relational, and logic operations; and built-in polymorphic mathematical functions are given. The capabilities of bitwise, assignment, address and indirection, increment and decrement, as well as type conversion operations in ANSI C are extended in CH. In this paper, mainly the new linguistic features of CH in comparison to C are described. Example programs written in CH with metanumbers and polymorphic mathematical functions demonstrate the capabilities of CH in scientific computing.

  4. Implementation of a Thermodynamic Solver within a Computer Program for Calculating Fission-Product Release Fractions

    Science.gov (United States)

    Barber, Duncan Henry

    During some postulated accidents at nuclear power stations, fuel cooling may be impaired. In such cases, the fuel heats up and the subsequent increased fission-gas release from the fuel to the gap may result in fuel sheath failure. After fuel sheath failure, the barrier between the coolant and the fuel pellets is lost or impaired, so gases and vapours from the fuel-to-sheath gap and other open voids in the fuel pellets can be vented. Gases and steam from the coolant can enter the broken fuel sheath and interact with the fuel pellet surfaces and the fission-product inclusions on the fuel surface (including material at the surface of the fuel matrix). The chemistry of this interaction is an important mechanism to model in order to assess fission-product releases from fuel. Starting in 1995, the computer program SOURCE 2.0 was developed by the Canadian nuclear industry to model fission-product release from fuel during such accidents. SOURCE 2.0 has employed an early thermochemical model of irradiated uranium dioxide fuel developed at the Royal Military College of Canada. To overcome the limitations of computers of that time, the implementation of the RMC model employed lookup tables of pre-calculated equilibrium conditions. In the intervening years, the RMC model has been improved, the power of computers has increased significantly, and thermodynamic subroutine libraries have become available. This thesis is the result of extensive work based on these three factors. A prototype computer program (referred to as SC11) has been developed that uses a thermodynamic subroutine library to calculate thermodynamic equilibria using Gibbs energy minimization. The Gibbs energy minimization requires the system temperature (T) and pressure (P), and the inventory of chemical elements (n) in the system. In order to calculate the inventory of chemical elements in the fuel, the list of nuclides and nuclear isomers modelled in SC11 had to be expanded from the list used by SOURCE 2.0. A
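    The sketch below is a toy Gibbs-energy minimisation of the sort a thermodynamic subroutine library performs, applied to an ideal-gas H2/O2/H2O mixture; scipy stands in for the library actually used in SC11, and the dimensionless standard chemical potentials are rough illustrative numbers (the H2O value of about -8.1 corresponds very roughly to 2000 K), not data from the thesis.

      # Toy constrained Gibbs-energy minimisation for an ideal-gas mixture.
      # The standard chemical potentials are rough illustrative values and
      # scipy is only a stand-in for a real thermodynamic library.
      import numpy as np
      from scipy.optimize import minimize

      species = ["H2", "O2", "H2O"]
      mu0_RT = np.array([0.0, 0.0, -8.1])       # dimensionless standard potentials
      A = np.array([[2, 0, 2],                  # H atoms per molecule
                    [0, 2, 1]])                 # O atoms per molecule
      b = np.array([2.0, 1.0])                  # element inventory: 2 H, 1 O
      P = 1.0                                   # bar, relative to 1 bar standard state

      def gibbs(n):
          n = np.clip(n, 1e-12, None)           # keep the logarithms finite
          return float(np.sum(n * (mu0_RT + np.log(n * P / n.sum()))))

      cons = [{"type": "eq", "fun": lambda n: A @ n - b}]   # element balance
      n0 = np.array([1.0, 0.5, 1e-3])           # rough starting guess, mol
      res = minimize(gibbs, n0, method="SLSQP", constraints=cons,
                     bounds=[(1e-12, None)] * 3)

      for name, ni in zip(species, res.x):
          print(f"{name:4s} {ni:10.4e} mol")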

  5. COLUMN2 - A computer program for simulating migration

    International Nuclear Information System (INIS)

    Nielsen, O.J.; Bo, P.; Carlsen, L.

    1985-10-01

    COLUMN2 is a one-dimensional FORTRAN 77 computer program designed for studies of the effects of various physicochemical processes on migration. It solves the solute transport equation and can take into account dispersion, sorption, ion exchange, and first- and second-order homogeneous chemical reactions. Spatial variations of input pulses and retention factors are possible. The method of solution is based on a finite-difference discretization followed by the application of the method of characteristics and two separate grid systems. This report explains the mathematical and numerical methods used, describes the necessary input, contains a number of test examples, provides a listing of the program, and explains how to acquire the program, adapt it to other computers, and run it. This report serves as a manual for the program. (author)
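    For illustration only, the sketch below solves the same kind of one-dimensional advection-dispersion equation with a retardation factor using a plain explicit upwind finite-difference scheme; it is not the COLUMN2 algorithm (which couples finite differences with the method of characteristics), and all parameters and the square input pulse are invented.

      # Explicit upwind finite-difference sketch of
      #   R dC/dt = D d2C/dx2 - v dC/dx
      # with a square input pulse.  Parameters are illustrative only.
      import numpy as np

      L, nx = 1.0, 101                     # column length (m), grid points
      v, D, R = 1.0e-5, 1.0e-8, 2.0        # velocity m/s, dispersion m2/s, retardation
      dx = L / (nx - 1)
      dt = 0.25 * min(dx / v, dx**2 / (2 * D)) * R    # explicit stability margin

      C = np.zeros(nx)
      t, t_end, pulse_end = 0.0, 5.0e4, 1.0e4         # seconds
      while t < t_end:
          C_in = 1.0 if t < pulse_end else 0.0        # square input pulse
          new = C.copy()
          new[1:-1] = C[1:-1] + dt / R * (
              D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2
              - v * (C[1:-1] - C[:-2]) / dx)          # upwind advection
          new[0] = C_in
          new[-1] = new[-2]                           # zero-gradient outlet
          C, t = new, t + dt

      print("peak concentration in column:", C.max())
      print("breakthrough at outlet:", C[-1])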

  6. TASAC a computer program for thermal analysis of severe accident conditions. Version 3/01, Dec 1991. Model description and user's guide

    International Nuclear Information System (INIS)

    Stempniewicz, M.; Marks, P.; Salwa, K.

    1992-06-01

    TASAC (Thermal Analysis of Severe Accident Conditions) is a computer code, developed at the Institute of Atomic Energy and written in FORTRAN 77, for the digital computer analysis of PWR rod bundle behaviour during severe accident conditions. The code can model an early stage of core degradation, including heat transfer inside the rods, convective and radiative heat exchange, cladding interactions with coolant and fuel, hydrogen generation, and melting, relocation and refreezing of fuel rod materials with dissolution of UO2 and ZrO2 in the liquid phase. The code was applied to the simulation of International Standard Problem number 28, performed on the PHEBUS test facility. This report contains a description of the program's physical models, a detailed description of the input data requirements, and the results of code verification. The main directions for future TASAC code development are formulated. (author). 20 refs, 39 figs, 4 tabs

  7. Computing risk for oil prospects: principles and programs

    International Nuclear Information System (INIS)

    Harbaugh, J.W.; Davis, J.C.; Wendebourg, J.

    1995-01-01

    This volume in the series Computer Methods in the Geosciences examines the challenge of risk assessment, field size distributions, and success, sequence and gambler's ruin. The estimation of the discovery size from the prospect size, outcome probabilities and success ratios, modeling prospects, and mapping properties and uncertainties are reviewed, and discriminating discoveries and dry holes, forecasting cash flow for a prospect, the worth of money, and use of risk analysis tables, decision tables and trees are considered. Appendices cover the installation of the RISK program and user manuals, and two disks are included with the volume. (UK)

  8. Program description of FIBRAM: a radiation attenuation model for optical fibers

    International Nuclear Information System (INIS)

    Ingram, W.J.

    1987-06-01

    The report describes a fiber optics system model and its computer implementation. This implementation can calculate the bit error ratio (BER) versus time for optical fibers that have been exposed to gamma radiation. The program is designed so that the user may arbitrarily change any or all of the system input variables and produce separate output calculations. The primary output of the program is a table of the BER as a function of time. This table may be stored on magnetic media and later incorporated into computer graphics programs

  9. Integration of a Multizone Airflow Model into a Thermalsimulation Program

    DEFF Research Database (Denmark)

    Jensen, Rasmus Lund; Sørensen, Karl Grau; Heiselberg, Per

    2007-01-01

    An existing computer model for dynamic hygrothermal analysis of buildings has been extended with a multizone airflow model based on loop equations to account for the coupled thermal and airflow behaviour of naturally and hybrid ventilated buildings. In water distribution networks and related fields, loop... a methodology adopted from water distribution networks that automatically sets up the independent loops and is easy to implement in a computer program. Finally, an example of verification of the model is given which demonstrates the ability of the models to accurately predict the airflow of a simple multizone...

  10. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  11. Computer-modeling codes to improve exploration nuclear-logging methods. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Wilson, R.D.; Price, R.K.; Kosanke, K.L.

    1983-03-01

    As part of the Department of Energy's National Uranium Resource Evaluation (NURE) project's Technology Development effort, a number of computer codes and accompanying data bases were assembled for use in modeling the responses of nuclear borehole logging sondes. The logging methods include fission neutron, active and passive gamma-ray, and gamma-gamma. These CDC-compatible computer codes and data bases are available on magnetic tape from the DOE Technical Library at its Grand Junction Area Office. Some of the computer codes are standard radiation-transport programs that have been available to the radiation shielding community for several years. Other codes were specifically written to model the response of borehole radiation detectors or are specialized borehole modeling versions of existing Monte Carlo transport programs. Results from several radiation modeling studies are available as two large data bases (neutron and gamma-ray). These data bases are accompanied by appropriate processing programs that permit the user to model a wide range of borehole and formation-parameter combinations for fission-neutron, neutron-activation, and gamma-gamma logs. The first part of this report consists of a brief abstract for each code or data base. The abstract gives the code name and title, a short description, auxiliary requirements, typical running time (CDC 6600), and a list of references. The next section gives format specifications and/or a directory for the tapes. The final section of the report presents listings for programs used to convert data bases between machine floating-point and EBCDIC.

  12. Logic integer programming models for signaling networks.

    Science.gov (United States)

    Haus, Utz-Uwe; Niermann, Kathrin; Truemper, Klaus; Weismantel, Robert

    2009-05-01

    We propose a static and a dynamic approach to model biological signaling networks, and show how each can be used to answer relevant biological questions. For this, we use the two different mathematical tools of Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been exploited in molecular biology, which is mostly driven by experimental research, relying on first-order or statistical models. The arising logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems the logic models reduce to a polynomial-time solvable satisfiability algorithm. Additionally, a more dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.
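    As a tiny, self-contained illustration of the kind of logic-to-integer-program encoding discussed here (not the authors' own formulation), the sketch below writes the signalling-style rule y = x1 AND (NOT x2) as linear 0/1 constraints and checks the encoding against the truth table.

      # Encode the Boolean rule  y = x1 AND (NOT x2)  as linear constraints
      # over 0/1 variables and verify the encoding by brute force.  This is
      # only an illustration of the general technique, not the paper's model.
      from itertools import product

      def satisfies_ilp(x1, x2, y):
          """Linear 0/1 constraints encoding y = x1 AND (1 - x2)."""
          return (y <= x1 and
                  y <= 1 - x2 and
                  y >= x1 - x2)        # i.e. y >= x1 + (1 - x2) - 1

      def truth(x1, x2):
          return int(x1 == 1 and x2 == 0)

      for x1, x2, y in product((0, 1), repeat=3):
          assert satisfies_ilp(x1, x2, y) == (y == truth(x1, x2)), (x1, x2, y)
      print("ILP encoding matches the Boolean rule on all assignments")

    A full signalling-network model of this kind strings together many such constraints, one per logical rule, and hands the resulting integer program to a standard solver.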

  13. Applying Model Checking to Industrial-Sized PLC Programs

    CERN Document Server

    AUTHOR|(CDS)2079190; Darvas, Daniel; Blanco Vinuela, Enrique; Tournier, Jean-Charles; Bliudze, Simon; Blech, Jan Olaf; Gonzalez Suarez, Victor M

    2015-01-01

    Programmable logic controllers (PLCs) are embedded computers widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software but is still underused in industry due to the complexity of building and managing formal models of real applications. In this paper, we propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an intermediate model (IM), meant to transform PLC programs written in various standard languages (ST, SFC, etc.) to different modeling languages of verification tools. We present the syntax and semantics of the IM and the transformation rules of the ST and SFC languages to the nuXmv model checker, passing through the intermediate model. Finally, two real case studies of CERN PLC programs, written mainly in th...

  14. A Programming Model for Massive Data Parallelism with Data Dependencies

    International Nuclear Information System (INIS)

    Cui, Xiaohui; Mueller, Frank; Potok, Thomas E.; Zhang, Yongpeng

    2009-01-01

    Accelerating processors can often be more cost- and energy-effective for a wide range of data-parallel computing problems than general-purpose processors. For graphics processing units (GPUs), this is particularly the case when program development is aided by environments such as NVIDIA's Compute Unified Device Architecture (CUDA), which dramatically reduces the gap between domain-specific architectures and general-purpose programming. Nonetheless, general-purpose GPU (GPGPU) programming remains subject to several restrictions. Most significantly, the separation of host (CPU) and accelerator (GPU) address spaces requires explicit management of GPU memory resources, especially for massive data parallelism that well exceeds the memory capacity of GPUs. One solution to this problem is to transfer data between the GPU and host memories frequently. In this work, we investigate another approach. We run massively data-parallel applications on GPU clusters. We further propose a programming model for massive data parallelism with data dependencies for this scenario. Experience from micro benchmarks and real-world applications shows that our model provides not only ease of programming but also significant performance gains.

  15. Computer-Supported Modelling of Multi modal Transportation Networks Rationalization

    Directory of Open Access Journals (Sweden)

    Ratko Zelenika

    2007-09-01

    Full Text Available This paper deals with issues of shaping and functioning of computer programs in the modelling and solving of multimodal transportation network problems. A methodology for the integrated use of a programming language for mathematical modelling is defined, as well as spreadsheets for the solving of complex multimodal transportation network problems. The paper contains a comparison of the partial and integral methods of solving multimodal transportation networks. The basic hypothesis set forth in this paper is that the integral method results in better multimodal transportation network rationalization effects, whereas a multimodal transportation network model based on the integral method, once built, can be used as the basis for all kinds of transportation problems within multimodal transport. As opposed to linear transport problems, multimodal transportation networks can assume very complex shapes. This paper contains a comparison of the partial and integral approach to transportation network solving. In the partial approach, a straightforward model of a transportation network, which can be solved through the use of the Solver computer tool within the Excel spreadsheet interface, is quite sufficient. In the solving of a multimodal transportation problem through the integral method, it is necessary to apply sophisticated mathematical modelling programming languages which support the use of complex matrix functions and the processing of a vast number of variables and limitations. The LINGO programming language is more abstract than the Excel spreadsheet, and it requires a certain programming knowledge. The definition and presentation of a problem logic within Excel, in a manner which is acceptable to computer software, is an ideal basis for modelling in the LINGO programming language, as well as a faster and more effective implementation of the mathematical model. This paper provides proof for the fact that it is more rational to solve the problem of multimodal transportation networks by
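    The paper itself works with Excel Solver and LINGO; purely as an illustration of the underlying class of problem, the sketch below solves a small single-mode transportation problem with scipy's linprog. The costs, supplies and demands are invented, and scipy is not a tool used by the author.

      # Small single-mode transportation problem solved as a linear program.
      # Costs, supplies and demands are made up for illustration.
      import numpy as np
      from scipy.optimize import linprog

      cost = np.array([[4.0, 6.0, 9.0],        # unit cost origin i -> destination j
                       [5.0, 3.0, 7.0]])
      supply = np.array([60.0, 80.0])
      demand = np.array([50.0, 40.0, 50.0])

      n_o, n_d = cost.shape
      c = cost.ravel()                          # decision variables x[i, j], flattened

      # Supply constraints: sum_j x[i, j] <= supply[i]
      A_ub = np.zeros((n_o, n_o * n_d))
      for i in range(n_o):
          A_ub[i, i * n_d:(i + 1) * n_d] = 1.0

      # Demand constraints: sum_i x[i, j] == demand[j]
      A_eq = np.zeros((n_d, n_o * n_d))
      for j in range(n_d):
          A_eq[j, j::n_d] = 1.0

      res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
                    bounds=(0, None), method="highs")
      print("minimum total cost:", res.fun)
      print("flows:\n", res.x.reshape(n_o, n_d))

    A genuinely multimodal model would add one set of flow variables per transport mode plus linking constraints between modes, but the structure of the linear program stays the same.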

  16. COMPUTATIONAL MODELS USED FOR MINIMIZING THE NEGATIVE IMPACT OF ENERGY ON THE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Oprea D.

    2012-04-01

    Full Text Available Optimizing energy systems is a problem that has been studied extensively for many years by scientists. This problem can be studied from different points of view and with different computer programs. The work characterizes one of the calculation methods used in Europe for modelling and optimizing power systems. The method is based on reducing the impact of the energy system on the environment. The computer program used and characterized in this article is GEMIS.

  17. The ASC Sequoia Programming Model

    Energy Technology Data Exchange (ETDEWEB)

    Seager, M

    2008-08-06

    In the late 1980's and early 1990's, Lawrence Livermore National Laboratory was deeply engrossed in determining the next generation programming model for the Integrated Design Codes (IDC) beyond vectorization for the Cray 1s series of computers. The vector model, developed in mid 1970's first for the CDC 7600 and later extended from stack based vector operation to memory to memory operations for the Cray 1s, lasted approximately 20 years (See Slide 5). The Cray vector era was deemed an extremely long lived era as it allowed vector codes to be developed over time (the Cray 1s were faster in scalar mode than the CDC 7600) with vector unit utilization increasing incrementally over time. The other attributes of the Cray vector era at LLNL were that we developed, supported and maintained the Operating System (LTSS and later NLTSS), communications protocols (LINCS), Compilers (Civic Fortran77 and Model), operating system tools (e.g., batch system, job control scripting, loaders, debuggers, editors, graphics utilities, you name it) and math and highly machine optimized libraries (e.g., SLATEC, and STACKLIB). Although LTSS was adopted by Cray for early system generations, they later developed COS and UNICOS operating systems and environment on their own. In the late 1970s and early 1980s two trends appeared that made the Cray vector programming model (described above including both the hardware and system software aspects) seem potentially dated and slated for major revision. These trends were the appearance of low cost CMOS microprocessors and their attendant, departmental and mini-computers and later workstations and personal computers. With the wide spread adoption of Unix in the early 1980s, it appeared that LLNL (and the other DOE Labs) would be left out of the mainstream of computing without a rapid transition to these 'Killer Micros' and modern OS and tools environments. The other interesting advance in the period is that systems were being

  18. Introduction to programming multiple-processor computers

    International Nuclear Information System (INIS)

    Hicks, H.R.; Lynch, V.E.

    1985-04-01

    FORTRAN applications programs can be executed on multiprocessor computers in either a unitasking (traditional) or multitasking form. The latter allows a single job to use more than one processor simultaneously, with a consequent reduction in wall-clock time and, perhaps, the cost of the calculation. An introduction to programming in this environment is presented. The concepts of synchronization and data sharing using EVENTS and LOCKS are illustrated with examples. The strategy of strong synchronization and the use of synchronization templates are proposed. We emphasize that incorrect multitasking programs can produce irreproducible results, which makes debugging more difficult
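    The EVENTS and LOCKS discussed here are Cray multitasking FORTRAN constructs; the sketch below only illustrates the same two ideas, signalling between tasks and protecting shared data in a critical section, using Python's threading module rather than the original FORTRAN interface.

      # Illustration of the two synchronization ideas from the abstract:
      # an event that workers wait on before starting, and a lock that
      # guards the shared accumulator.  This is Python's threading module,
      # not the Cray multitasking FORTRAN interface.
      import threading

      shared_sum = 0
      lock = threading.Lock()
      ready = threading.Event()

      def worker(values):
          global shared_sum
          ready.wait()                      # block until the "go" event is posted
          local = sum(values)               # independent work, no shared data
          with lock:                        # critical section: shared update
              shared_sum += local

      threads = [threading.Thread(target=worker,
                                  args=(range(i * 100, (i + 1) * 100),))
                 for i in range(4)]
      for t in threads:
          t.start()

      ready.set()                           # post the event: all workers proceed
      for t in threads:
          t.join()                          # strong synchronization point

      print("shared_sum =", shared_sum)     # deterministic: sum of 0..399

    Because the shared update is guarded by the lock and the join acts as a strong synchronization point, the printed result is reproducible, which is exactly the property the abstract warns is lost in incorrect multitasking programs.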

  19. Utilization of logistic computer programs in the power plant piping industry

    International Nuclear Information System (INIS)

    Motzel, E.

    1982-01-01

    Starting from the general situation of the power plant piping industry, the use of computer programs, and the particular complexity involved in project realisation, the necessity of using logistic computer programs, especially in the case of nuclear power plants, is explained. The term logistics and the relevant logistic data are described. The practical use of such programs is shown using the example of the nuclear power plant KRB II, Gundremmingen, Block B/C. Planning, scheduling and supervision are carried out with computer support by means of network techniques. Material management, prefabrication and installation, including the management of certificates for welding and testing activities, are likewise planned and controlled by computer programs. Once the piping systems are installed, complete erection documentation is available, which also serves as the basis for billing the client. The budgeted costs are continuously controlled by means of a cost control program. Summing up, the further development of computer-supported control of piping contracts is described with regard to software, hardware and the organisational structure. Furthermore, the concept of a self-supporting field computer is introduced for the first time. (orig.) [de]

  20. On Computational Power of Quantum Read-Once Branching Programs

    Directory of Open Access Journals (Sweden)

    Farid Ablayev

    2011-03-01

    Full Text Available In this paper we review our current results concerning the computational power of quantum read-once branching programs. First, based on the circuit presentation of quantum branching programs and our variant of the quantum fingerprinting technique, we show that any Boolean function with a linear polynomial presentation can be computed by a quantum read-once branching program using a relatively small (usually logarithmic in the size of the input) number of qubits. Then we show that the described class of Boolean functions is closed under polynomial projections.

  1. Computation and experiment results of the grounding model of Three Gorges Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Wen Xishan; Zhang Yuanfang; Yu Jianhui; Chen Cixuan [Wuhan University of Hydraulic and Electrical Engineering (China); Qin Liming; Xu Jun; Shu Lianfu [Yangtze River Water Resources Commission, Wuhan (China)

    1999-07-01

    A model for the computation of the grounding parameters of the grids of the Three Gorges Power Plant (TGPP) on the Yangtze River is presented in this paper. Using this model, computation and analysis of the grounding grids are carried out. The results show that the reinforcement grid of the dam is the main body of current dissipation; it must be reliably welded to form a good grounding grid. The experimental results show that the method and program of the computations are correct. (UK)

  2. Department of Energy Mathematical, Information, and Computational Sciences Division: High Performance Computing and Communications Program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-11-01

    This document is intended to serve two purposes. Its first purpose is that of a program status report of the considerable progress that the Department of Energy (DOE) has made since 1993, the time of the last such report (DOE/ER-0536, The DOE Program in HPCC), toward achieving the goals of the High Performance Computing and Communications (HPCC) Program. The second purpose is that of a summary report of the many research programs administered by the Mathematical, Information, and Computational Sciences (MICS) Division of the Office of Energy Research under the auspices of the HPCC Program and to provide, wherever relevant, easy access to pertinent information about MICS-Division activities via universal resource locators (URLs) on the World Wide Web (WWW).

  3. Computational methods for a three-dimensional model of the petroleum-discovery process

    Science.gov (United States)

    Schuenemeyer, J.H.; Bawiec, W.J.; Drew, L.J.

    1980-01-01

    A discovery-process model devised by Drew, Schuenemeyer, and Root can be used to predict the amount of petroleum to be discovered in a basin from some future level of exploratory effort; the predictions are based on historical drilling and discovery data. Because marginal costs of discovery and production are a function of field size, the model can be used to make estimates of future discoveries within deposit size classes. The modeling approach is a geometric one in which the area searched is a function of the size and shape of the targets being sought. A high correlation is assumed between the surface-projection area of the fields and the volume of petroleum. To predict how much oil remains to be found, the area searched must be computed, and the basin size and discovery efficiency must be estimated. The basin is assumed to be explored randomly rather than by pattern drilling. The model may be used to compute independent estimates of future oil at different depth intervals for a play involving multiple producing horizons. We have written FORTRAN computer programs that are used with Drew, Schuenemeyer, and Root's model to merge the discovery and drilling information and perform the necessary computations to estimate undiscovered petroleum. These programs may easily be modified for the estimation of remaining quantities of commodities other than petroleum. © 1980.

  4. A directory of computer programs for assessment of radioactive waste disposal in geological formations

    International Nuclear Information System (INIS)

    Broyd, T.W.; Dean, R.B.; Hobbs, G.D.; Knowles, N.C.; Putney, J.M.; Wrigley, J.

    1984-01-01

    This Directory describes computer programs suitable for the assessment of radioactive waste disposal facilities in geological formations. The programs, which are mainly applicable to the post-closure analysis of the repository, address combinations of the following topics: nuclide inventory, corrosion, leaching, geochemistry, stress analysis, heat transfer, groundwater flow and radionuclide transport. Biosphere modelling, surface water flow and risk analysis are not covered. A total of 248 programs are identified, of which 50 are reviewed in detail, 134 in summary and 64 in tabular fashion. The Directory has been compiled using a combination of literature searches, telephone and postal correspondence, and meetings with recognised experts in the respective areas of work covered. It differs from previous reviews of computer programs for similar topic areas in two main respects. Firstly, the method of obtaining information has resulted in program descriptions of considerable breadth and detail. Secondly, the Directory has concentrated wherever possible on European codes, whereas most previous work of this nature has looked solely at programs developed in North America. The reviews are presented in good faith, but it has not been possible to run any of the programs on a computer, and so truly objective comparisons may not be made. Finally, although the Directory is specific to the post-closure assessment of a repository site, some of the programs described could also be used in other areas of repository analysis (e.g. repository design)

  5. Approach to Computer Implementation of Mathematical Model of 3-Phase Induction Motor

    Science.gov (United States)

    Pustovetov, M. Yu

    2018-03-01

    This article discusses the development of a computer model of an induction motor based on the mathematical model in a three-phase stator reference frame. It uses an approach that allows two methods to be combined during preparation of the computer model: visual circuit programming (in the form of electrical schematics) and logical programming (in the form of block diagrams). The approach enables easy integration of the induction motor model as part of more complex models of electrical complexes and systems. The developed computer model gives the user access to the beginning and the end of the winding of each of the three phases of the stator and rotor. This property is particularly important when considering asymmetric modes of operation or when the motor is powered by special semiconductor converter circuitry.

  6. Recovery Act: Finite Volume Based Computer Program for Ground Source Heat Pump Systems

    Energy Technology Data Exchange (ETDEWEB)

    James A Menart, Professor

    2013-02-22

    This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled Finite Volume Based Computer Program for Ground Source Heat Pump Systems. The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function of time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly, COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump. The
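
    The record describes GEO2D and GEO3D only at a high level and gives no source code, so the following is just a minimal one-dimensional finite-volume sketch of transient heat conduction in the ground (explicit time stepping, a fixed borehole-wall temperature, and made-up material properties); it is not the DOE-funded program itself.

      # Minimal 1-D finite-volume sketch of transient heat conduction
      # (explicit Euler in time); properties and geometry are illustrative only.
      import numpy as np

      n, L = 50, 10.0                    # cells, domain length [m]
      dx = L / n
      k, rho, cp = 2.0, 1800.0, 1500.0   # conductivity, density, heat capacity (assumed)
      alpha = k / (rho * cp)
      dt = 0.4 * dx**2 / alpha           # satisfies the explicit stability limit

      T = np.full(n, 283.0)              # initial ground temperature [K]
      T_left = 300.0                     # fixed temperature at the borehole wall (assumed)

      for step in range(20000):
          # fluxes across the n+1 cell faces; ghost values supply the boundaries
          Tg = np.concatenate(([T_left], T, [T[-1]]))   # right boundary: zero gradient
          flux = -k * np.diff(Tg) / dx
          # finite-volume update: energy balance (net inflow) for each cell
          T += dt * (flux[:-1] - flux[1:]) / (rho * cp * dx)

      print(T[:5])   # temperatures nearest the borehole after the simulated period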

  7. Computer programs for the in-core fuel management of power reactors

    International Nuclear Information System (INIS)

    1981-08-01

    This document gives a survey of the presently tested and used computer programs applicable to the in-core fuel management of light and heavy water moderated nuclear power reactors. Each computer program is described (provided that enough information was supplied) such that the nature of the physical problem solved and the basic mathematical or calculational approach are evident. In addition, further information regarding computer requirements, up-to-date applications and experiences and specific details concerning implementation, staff requirements, etc., are provided. Program procurement conditions, possible program implementation assistance and commercial conditions (where applicable) are given. (author)

  8. PDDP, A Data Parallel Programming Model

    Directory of Open Access Journals (Sweden)

    Karen H. Warren

    1996-01-01

    PDDP, the parallel data distribution preprocessor, is a data parallel programming model for distributed memory parallel computers. PDDP implements High Performance Fortran-compatible data distribution directives and parallelism expressed by the use of Fortran 90 array syntax, the FORALL statement, and the WHERE construct. Distributed data objects belong to a global name space; other data objects are treated as local and replicated on each processor. PDDP allows the user to program in a shared memory style and generates code that is portable to a variety of parallel machines. For interprocessor communication, PDDP uses the fastest communication primitives on each platform.
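
    PDDP expresses parallelism through Fortran 90 array syntax, FORALL, and WHERE on distributed arrays. As a loose analogy only, the same data-parallel style of whole-array expressions and masked assignment can be sketched with NumPy; the sketch below runs in a single shared-memory process and is not PDDP.

      # Loose analogy to the data-parallel style (array syntax + WHERE) in NumPy;
      # PDDP itself targets distributed-memory machines, which this sketch does not.
      import numpy as np

      a = np.linspace(0.0, 1.0, 1_000_000)
      b = np.sin(a)

      # Fortran 90 array syntax  c = a + 2.0*b      -> whole-array expression
      c = a + 2.0 * b

      # FORALL (i=1:n) d(i) = c(i)**2               -> elementwise operation
      d = c ** 2

      # WHERE (d > 0.5) d = 0.5                     -> masked assignment
      d = np.where(d > 0.5, 0.5, d)

      print(d.min(), d.max())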

  9. Demand side management scheme in smart grid with cloud computing approach using stochastic dynamic programming

    Directory of Open Access Journals (Sweden)

    S. Sofana Reka

    2016-09-01

    This paper proposes a cloud computing framework in a smart grid environment by creating a small integrated energy hub that supports real-time computing for handling large volumes of data. A stochastic programming model is developed with a cloud computing scheme for effective demand side management (DSM) in the smart grid. Simulation results are obtained using a GUI and the Gurobi optimizer in Matlab in order to reduce the electricity demand by creating energy networks in a smart hub approach.

  10. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex

  11. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    International Nuclear Information System (INIS)

    Brown, D.L.

    2009-01-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex networked systems

  12. Development of computer program for safety of nuclear power plant against tsunami

    International Nuclear Information System (INIS)

    Jin, S. B.; Choi, K. R.; Lee, S. K.; Cho, Y. S.

    2001-01-01

    The main objective of this study is the development of a computer program to check the safety of nuclear power plants along the coastline of the Korean Peninsula. The computer program describes the propagation and associated run-up process of tsunamis by solving linear and nonlinear shallow-water equations with finite difference methods. The computer program has been applied to several ideal and simplified problems. The numerical solutions obtained are compared to existing and available solutions and measurements, and very good agreement between numerical solutions and existing measurements is observed. The computer program developed in this study can be used in the safety analysis of nuclear power plants against tsunamis. The program can also be used to study the propagation of tsunamis over long distances, and the associated run-up and run-down processes along a shoreline. Furthermore, the computer program can be used to provide proper design criteria for coastal facilities and structures.
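
    The program itself solves linear and nonlinear shallow-water equations in two dimensions; as a hedged illustration of just the linear one-dimensional case, with a flat bottom, closed ends, and invented parameters, a forward-backward finite-difference sketch on a staggered grid could look like this (it is not the safety-analysis code described above).

      # 1-D linear shallow-water equations on a staggered grid (illustrative only):
      #   d(eta)/dt = -h d(u)/dx ,   d(u)/dt = -g d(eta)/dx
      import numpy as np

      g, h = 9.81, 4000.0            # gravity, constant ocean depth [m] (assumed)
      n, L = 400, 400e3              # grid points, domain length [m]
      dx = L / n
      c = np.sqrt(g * h)
      dt = 0.5 * dx / c              # CFL-limited time step

      x = np.linspace(0.0, L, n)
      eta = np.exp(-((x - L / 2) / 20e3) ** 2)   # initial free-surface hump
      u = np.zeros(n + 1)                        # velocities on cell faces

      for step in range(600):
          # update face velocities from the free-surface gradient (interior faces)
          u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx
          u[0] = u[-1] = 0.0                     # closed (fully reflective) ends
          # update the free surface from the divergence of the volume flux
          eta -= dt * h * (u[1:] - u[:-1]) / dx

      print(float(eta.max()), float(eta.min()))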

  13. MaMR: High-performance MapReduce programming model for material cloud applications

    Science.gov (United States)

    Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng

    2017-02-01

    With the increasing data size in materials science, existing programming models no longer satisfy the application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related data, and the processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications, called MaMR, that supports multiple different Map and Reduce functions running concurrently on top of a hybrid shared-memory BSP model. An optimized data sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework present effective performance improvements compared to previous work.
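
    MaMR itself is a hybrid shared-memory BSP framework with multiple concurrent Map and Reduce functions plus an extra merge phase; the sketch below shows only the basic map-shuffle-reduce pattern on a single machine, with an invented word-count workload, and is not the MaMR framework.

      # Minimal single-process sketch of the map -> shuffle -> reduce pattern
      # (illustrative word count; not the MaMR framework described in the record).
      from collections import defaultdict

      def map_phase(record):
          # emit (key, value) pairs for one input record
          for word in record.split():
              yield word.lower(), 1

      def shuffle(pairs):
          # group intermediate values by key
          groups = defaultdict(list)
          for key, value in pairs:
              groups[key].append(value)
          return groups

      def reduce_phase(key, values):
          return key, sum(values)

      records = ["materials science big data", "big data on cloud computing", "cloud data"]
      pairs = (kv for rec in records for kv in map_phase(rec))
      result = dict(reduce_phase(k, vs) for k, vs in shuffle(pairs).items())
      print(result)   # e.g. {'big': 2, 'data': 3, ...}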

  14. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  15. Computing, Information, and Communications Technology (CICT) Program Overview

    Science.gov (United States)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies

  16. Use of computer programs to evaluate effectiveness of security systems

    International Nuclear Information System (INIS)

    Harris, L. Jr.; Goldman, L.A.; Mc Daniel, T.L.

    1987-01-01

    Thirty or more computer programs for security vulnerability analysis were developed from 1975 through 1980. Most of these programs are intended for evaluating security system effectiveness against outsider threats, but at least six programs are primarily oriented to insider threats. Some strengths and weaknesses of these programs are described. Six of these programs, four for outsider threats and two for insider threats, have been revised and adapted for use with IBM personal computers. The vulnerability analysis process is discussed with emphasis on data collection. The difference between design data and operational data is described. For performance-type operational data, such as detection probabilities and barrier delay times, the difference between unstressed and stressed performance data is discussed. Stressed performance data correspond to situations where an adversary attempts to weaken a security system by mitigating certain security measures. Suggestions are made on the combined use of manual analysis and computer analysis

  17. Computer program system for evaluation of FP nuclear data for JENDL. Smooth part

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Watanabe, Takashi; Iijima, Shungo

    1997-12-01

    This report describes computer programs used to evaluate nuclear data of fission product (FP) nuclides stored in an evaluated nuclear data library JENDL, especially in the smooth part above the resonance region. Many programs were used for determination of nuclear model parameters, calculation of nuclear data, handling of experimental and/or calculated data, and so on. Among them, reported here are programs for determination of level density parameters (ENSDFRET, LVLPLOT, LEVDES), for making sets of JCL and input data for the theoretical calculation program CASTHY (JOBSETTER, INDES/CASTHY), and for conversion of format of CASTHY output files to the ENDF format (CTOB2). (author). 51 refs.

  18. Method for Statically Checking an Object-oriented Computer Program Module

    Science.gov (United States)

    Bierhoff, Kevin M. (Inventor); Aldrich, Jonathan (Inventor)

    2012-01-01

    A method for statically checking an object-oriented computer program module includes the step of identifying objects within a computer program module, at least one of the objects having a plurality of references thereto, possibly from multiple clients. A discipline of permissions is imposed on the objects identified within the computer program module. The permissions enable tracking, from among a discrete set of changeable states, a subset of states each object might be in. A determination is made regarding whether the imposed permissions are violated by a potential reference to any of the identified objects. The results of the determination are output to a user.
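
    The patented method performs this check statically; as a loose, hedged illustration of the underlying idea only, tracking the subset of states an object might be in and rejecting operations that the imposed discipline does not permit, the toy run-time checker below uses an invented file-like protocol.

      # Toy illustration of tracking an object's state from a small, discrete set
      # and rejecting disallowed operations; the record's method checks this
      # statically, which this run-time sketch does not reproduce.
      class ProtocolError(Exception):
          pass

      class TrackedFile:
          # allowed operations per state for the imposed discipline: open -> read* -> close
          _allowed = {"closed": {"open"}, "open": {"read", "close"}}

          def __init__(self):
              self.state = "closed"

          def _check(self, operation):
              if operation not in self._allowed[self.state]:
                  raise ProtocolError(f"{operation!r} not permitted in state {self.state!r}")

          def open(self):
              self._check("open")
              self.state = "open"

          def read(self):
              self._check("read")
              return "data"

          def close(self):
              self._check("close")
              self.state = "closed"

      f = TrackedFile()
      f.open()
      print(f.read())
      f.close()
      f.read()   # raises ProtocolError: 'read' not permitted in state 'closed'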

  19. Automatic generation of Fortran programs for algebraic simulation models

    International Nuclear Information System (INIS)

    Schopf, W.; Rexer, G.; Ruehle, R.

    1978-04-01

    This report documents a generator program by which econometric simulation models formulated in an application-oriented language can be transformed automatically into a Fortran program. Thus the model designer is able to build up, test and modify models without the need for a Fortran programmer. The development of a computer model is therefore simplified and shortened appreciably; in chapters 1-3 of this report all rules are presented for applying the generator to model design. Algebraic models, including exogenous and endogenous time series variables and lead and lag functions, can be generated. In addition to these language elements, Fortran sequences can be applied to the formulation of models in the case of complex model interrelations. The generated model is automatically a module of the program system RSYST III and is therefore able to exchange input and output data with the central data bank of the system; in connection with the method library modules it can be used to handle planning problems. (orig.) [de

  20. Emotion Oriented Programming: Computational Abstractions for AI Problem Solving

    OpenAIRE

    Darty, Kevin; Sabouret, Nicolas

    2012-01-01

    In this paper, we present a programming paradigm for AI problem solving based on computational concepts drawn from Affective Computing. It is believed that emotions participate in human adaptability and reactivity, in behaviour selection, and in complex and dynamic environments. We propose to define a mechanism inspired by this observation for general AI problem solving. To this purpose, we synthesize emotions as programming abstractions that represent the perception ...

  1. Computational model for superconducting toroidal-field magnets for a tokamak reactor

    International Nuclear Information System (INIS)

    Turner, L.R.; Abdou, M.A.

    1978-01-01

    A computational model for predicting the performance characteristics and cost of superconducting toroidal-field (TF) magnets in tokamak reactors is presented. The model can be used to compare the technical and economic merits of different approaches to the design of TF magnets for a reactor system. The model has been integrated into the ANL Systems Analysis Program. Samples of results obtainable with the model are presented

  2. Teaching and Learning of Computational Modelling in Creative Shaping Processes

    Directory of Open Access Journals (Sweden)

    Daniela REIMANN

    2017-10-01

    Today, diverse design-related disciplines are not the only ones required to deal actively with the digitization of information and its potentials and side effects for education processes. In Germany, technology didactics developed in vocational education and computer science education in general education, both separated from media pedagogy as an after-school program. Media education is not yet a school subject in Germany. However, in the paper we argue for an interdisciplinary approach to learning about computational modeling in creative processes and aesthetic contexts, one that crosses the borders of programming technology and of arts and design processes in meaningful contexts. Educational scenarios using smart textile environments are introduced and reflected upon for project-based learning.

  3. Computer Programming Games and Gender Oriented Cultural Forms

    Science.gov (United States)

    AlSulaiman, Sarah Abdulmalik

    I present the design and evaluation of two games designed to help elementary and middle school students learn computer programming concepts. The first game was designed to be "gender neutral", aligning with what might be described as a consensus opinion on best practices for computational learning environments. The second game, based on the cultural form of dress-up dolls, was deliberately designed to appeal to females. I recruited 70 participants in an international two-phase study to investigate the relationship between games, gender, attitudes towards computer programming, and learning. My findings suggest that while the two games were equally effective in terms of learning outcomes, I saw differences in motivation between players of the two games. Specifically, participants who reported a preference for female-oriented games were more motivated to learn about computer programming when they played a game that they perceived as designed for females. In addition, I describe how the two games seemed to encourage different types of social activity between players in a classroom setting. Based on these results, I reflect on the strategy of exclusively designing games and activities as "gender neutral", and suggest that employing cultural forms, including gendered ones, may help create a more productive experience for learners.

  4. Low-Budget Computer Programming in Your School (An Alternative to the Cost of Large Computers). Illinois Series on Educational Applications of Computers. No. 14.

    Science.gov (United States)

    Dennis, J. Richard; Thomson, David

    This paper is concerned with a low cost alternative for providing computer experience to secondary school students. The brief discussion covers the programmable calculator and its relevance for teaching the concepts and the rudiments of computer programming and for computer problem solving. A list of twenty-five programming activities related to…

  5. An Analysis on Distance Education Computer Programming Students' Attitudes Regarding Programming and Their Self-Efficacy for Programming

    Science.gov (United States)

    Ozyurt, Ozcan

    2015-01-01

    This study aims to analyze the attitudes of students studying computer programming through the distance education regarding programming, and their self-efficacy for programming and the relation between these two factors. The study is conducted with 104 students being thought with distance education in a university in the north region of Turkey in…

  6. Mathematica: A System of Computer Programs

    OpenAIRE

    Maiti, Santanu K.

    2006-01-01

    Starting from the basic level of Mathematica, here we illustrate how to use a Mathematica notebook and write a program in the notebook. Next, we investigate in detail the way of linking external programs with Mathematica, the so-called MathLink operation. Using this technique we can run very tedious jobs quite efficiently, and the operations become extremely fast. Sometimes it is quite desirable to run jobs in the background of a computer which can take a considerable amount of time to finish, ...

  7. Interactive computer programs for applied nutrition education.

    Science.gov (United States)

    Wise, A

    1985-12-01

    DIET2 and DIET3 are programs written for a DEC 2050 computer and intended for teaching applied nutrition to students of nutrition, dietetics, home economics, and hotel and institutional administration. DIET2 combines all the facilities of the separate dietary programs already available at Robert Gordon's Institute of Technology into a single package, and extends these to give students a large amount of relevant information about the nutritional balance of foods (including DHSS and NACNE recommendations) prior to choosing them for meals. Students are also helped by the inclusion of typical portion weights. They are presented with an analysis of nutrients and their balance in the menu created, with an easy mechanism for amendment of the menu and addition of foods which provide the nutrients that are lacking. At any stage the computer can give the proportion of total nutrient provided by each meal. DIET3 is a relatively simple program that displays the nutritional profile of foods and diets semigraphically.

  8. Seventy Years of Computing in the Nuclear Weapons Program

    Energy Technology Data Exchange (ETDEWEB)

    Archer, Billy Joe [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-30

    Los Alamos has continuously been on the forefront of scientific computing since it helped found the field. This talk will explore the rich history of computing in the Los Alamos weapons program. The current status of computing will be discussed, as will the expectations for the near future.

  9. Use of computer programs STLK1 and STWT1 for analysis of stream-aquifer hydraulic interaction

    Science.gov (United States)

    Desimone, Leslie A.; Barlow, Paul M.

    1999-01-01

    Quantifying the hydraulic interaction of aquifers and streams is important in the analysis of stream base flow, flood-wave effects, and contaminant transport between surface- and ground-water systems. This report describes the use of two computer programs, STLK1 and STWT1, to analyze the hydraulic interaction of streams with confined, leaky, and water-table aquifers during periods of stream-stage fluctuations and uniform, areal recharge. The computer programs are based on analytical solutions to the ground-water-flow equation in stream-aquifer settings and calculate ground-water levels, seepage rates across the stream-aquifer boundary, and bank storage that result from arbitrarily varying stream stage or recharge. Analysis of idealized, hypothetical stream-aquifer systems is used to show how aquifer type, aquifer boundaries, and aquifer and streambank hydraulic properties affect aquifer response to stresses. Published data from alluvial and stratified-drift aquifers in Kentucky, Massachusetts, and Iowa are used to demonstrate application of the programs to field settings. Analytical models of these three stream-aquifer systems are developed on the basis of available hydrogeologic information. Stream-stage fluctuations and recharge are applied to the systems as hydraulic stresses. The models are calibrated by matching ground-water levels calculated with computer program STLK1 or STWT1 to measured ground-water levels. The analytical models are used to estimate hydraulic properties of the aquifer, aquitard, and streambank; to evaluate hydrologic conditions in the aquifer; and to estimate seepage rates and bank-storage volumes resulting from flood waves and recharge. Analysis of field examples demonstrates the accuracy and limitations of the analytical solutions and programs when applied to actual ground-water systems and the potential uses of the analytical methods as alternatives to numerical modeling for quantifying stream-aquifer interactions.

  10. Design and evaluation of the computer-based training program Calcularis for enhancing numerical cognition

    Directory of Open Access Journals (Sweden)

    Tanja Käser

    2013-08-01

    This article presents the design and a first pilot evaluation of the computer-based training program Calcularis for children with developmental dyscalculia (DD) or difficulties in learning mathematics. The program has been designed according to insights on the typical and atypical development of mathematical abilities. The learning process is supported through multimodal cues, which encode different properties of numbers. To offer optimal learning conditions, a user model completes the program and allows flexible adaptation to a child's individual learning and knowledge profile. 32 children with difficulties in learning mathematics completed the 6- to 12-week computer training. The children played the game for 20 minutes per day, 5 days a week. The training effects were evaluated using neuropsychological tests. Generally, children benefited significantly from the training regarding number representation and arithmetic operations. Furthermore, children liked playing with the program and reported that the training improved their mathematical abilities.

  11. Computer modelling of an underground mine ventilation system

    International Nuclear Information System (INIS)

    1984-12-01

    The ability to control workplace short-lived radon daughter concentrations to appropriate levels is crucial to the underground mining of uranium ores. Recognizing that mine ventilation models can be used to design ventilation facilities in new mines and to evaluate proposed ventilation changes in existing mines, the Atomic Energy Control Board (AECB) initiated this study to first investigate existing mine ventilation models and then develop a suitable model for use by AECB staff. At the start of the study, the available literature on mine ventilation models, in particular models suitable for the unique task of predicting radon daughter levels, was reviewed. While the details of the models varied, it was found that the basic calculation procedures used by the various models were similar. Consequently, a model developed at Queen's University, which not only already incorporated most of the desired features but was also readily available, was selected for implementation. Subsequently, the Queen's computer program (actually two programs, one for mine ventilation and one to calculate radon daughter levels) was extended and tested. The following report provides the relevant documentation for setting up and running the models. The mathematical basis of the calculational procedures used in the models is also described

  12. Client/server models for transparent, distributed computational resources

    International Nuclear Information System (INIS)

    Hammer, K.E.; Gilman, T.L.

    1991-01-01

    Client/server models are proposed to address issues of shared resources in a distributed, heterogeneous UNIX environment. The recent development of an automated Remote Procedure Call (RPC) interface generator has simplified the development of client/server models; previously, implementation of the models was only possible at the UNIX socket level. An overview of RPCs and the interface generator will be presented and will include a discussion of generation and installation of remote services, the RPC paradigm, and the three levels of RPC programming. Two applications, the Nuclear Plant Analyzer (NPA) and a fluids simulation using molecular modelling, will be presented to demonstrate how client/server models using RPCs and External Data Representations (XDR) have been used in production/computation situations. The NPA incorporates a client/server interface for the transfer and translation of TRAC or RELAP results from the UNICOS Cray to a UNIX workstation. The fluids simulation program utilizes the client/server model to access the Cray via a single function, allowing it to become a shared co-processor to the workstation application. 5 refs., 6 figs
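
    The applications above use UNIX RPC with XDR and an automated interface generator; as a rough stand-in for the same client/server idea, Python's standard-library xmlrpc exposes a remote procedure in a few lines. The service name, port, and placeholder computation are invented for the example, and this is not the NPA or fluids-simulation interface described in the record.

      # Client/server remote procedure call sketch using Python's standard library
      # (a stand-in for the UNIX RPC/XDR services described in the record).
      import threading
      from xmlrpc.server import SimpleXMLRPCServer
      from xmlrpc.client import ServerProxy

      def simulate(pressure, temperature):
          # placeholder "co-processor" computation performed on the server
          return {"density": pressure / (287.05 * temperature)}

      server = SimpleXMLRPCServer(("localhost", 8901), logRequests=False, allow_none=True)
      server.register_function(simulate, "simulate")
      threading.Thread(target=server.serve_forever, daemon=True).start()

      # the client calls the remote procedure as if it were a local function
      proxy = ServerProxy("http://localhost:8901")
      print(proxy.simulate(101325.0, 293.15))   # e.g. {'density': 1.204...}
      server.shutdown()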

  13. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/5524--17-9751. High Performance Computing Modernization Program Kerberos Throughput Test Report, by Daniel G. Gdula and

  14. Implementation of a Sales and Purchasing Application Program Using the Rapid Application Development Model

    Directory of Open Access Journals (Sweden)

    Annisa Febriani

    2017-09-01

    The development of information technology is currently fast and rapid, supported by the computer. Computers are equipped with particular applications used to help people manage the data of an organization or company so that accurate results are obtained according to need. Observations showed that sales and purchase activities are still carried out using manual systems, among them at a clothing store: processing of goods data, checking of stock, purchase transactions, sales transactions, as well as other data storage associated with these activities. This can cause losses for the store owner, errors in record keeping and less accurate reports. Given the large number of transactions at clothing stores, a faster and more accurate information system is required. Thus, the author builds a computer-based program architecture, using the Microsoft Visual Basic.net programming language and the MySQL database, so that the information and activities that occur can be handled quickly and accurately. The method used in building the architecture of the program is the Rapid Application Development (RAD) model. This RAD model is an adaptation of the waterfall model for a high-speed version of the development of each of its software components. The result of this work is a ready-made sales and purchasing application program. In this case, the use of the application program is the best solution to the existing problems, and with it the activity can be carried out effectively and efficiently, especially with regard to sales and purchasing.   Keywords: Sales Program, Purchasing Program.

  15. TET_2MCNP: A conversion program to implement tetrahedral-mesh models in MCNP

    International Nuclear Information System (INIS)

    Han, Min Cheol; Yeom, Yeon Soo; Nguyen, Thng Tat; Choi, Chan Soo; Lee, Hyun Su; Kim, Chan Hyeong

    2016-01-01

    Tetrahedral-mesh geometries can be used in the MCNP code, but the MCNP code accepts the geometry only in the Abaqus input file format; hence, existing tetrahedral-mesh models first need to be converted to the Abaqus input file format to be used in the MCNP code. In the present study, we developed a simple but useful computer program, TET_2MCNP, for converting TetGen-generated tetrahedral-mesh models to the Abaqus input file format. TET_2MCNP is written in C++ and contains two components: one for converting a TetGen output file to an Abaqus input file and the other for the reverse conversion process. The TET_2MCNP program also produces an MCNP input file. Further, the program provides some MCNP-specific functions: the maximum number of elements (i.e., tetrahedrons) per part can be limited, and the material density of each element can be transferred to the MCNP input file. To test the developed program, two tetrahedral-mesh models were generated using TetGen and converted to the Abaqus input file format using TET_2MCNP. Subsequently, the converted files were used in the MCNP code to calculate the object- and organ-averaged absorbed dose in the sphere and phantom, respectively. The results show that the converted models provide, within statistical uncertainties, dose values identical to those obtained using the PHITS code, which uses the original tetrahedral-mesh models produced by the TetGen program. This demonstrates that the developed program can successfully convert TetGen tetrahedral-mesh models to Abaqus input files. In summary, we have developed a computer program, TET_2MCNP, which can be used to convert TetGen-generated tetrahedral-mesh models to the Abaqus input file format for use in the MCNP code. We believe this program will be used by many MCNP users for implementing complex tetrahedral-mesh models, including computational human phantoms, in the MCNP code
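
    TET_2MCNP itself is a C++ program and its source is not part of this record; the sketch below only illustrates the core conversion idea, reading TetGen-style .node/.ele files and writing Abaqus-style *NODE/*ELEMENT records, with the file layouts assumed from common TetGen conventions rather than taken from the paper, and without the MCNP-specific features mentioned above.

      # Rough sketch of a TetGen -> Abaqus-style conversion (file layouts assumed
      # from common TetGen conventions; this is not the TET_2MCNP program itself).
      def read_tetgen(basename):
          with open(basename + ".node") as f:
              n_nodes = int(f.readline().split()[0])
              nodes = [f.readline().split() for _ in range(n_nodes)]
          with open(basename + ".ele") as f:
              n_elems = int(f.readline().split()[0])
              elems = [f.readline().split() for _ in range(n_elems)]
          return nodes, elems

      def write_abaqus_inp(basename, nodes, elems):
          with open(basename + ".inp", "w") as out:
              out.write("*NODE\n")
              for idx, x, y, z, *rest in nodes:
                  out.write(f"{idx}, {x}, {y}, {z}\n")
              out.write("*ELEMENT, TYPE=C3D4, ELSET=TETMESH\n")   # 4-node tetrahedra
              for idx, n1, n2, n3, n4, *rest in elems:
                  out.write(f"{idx}, {n1}, {n2}, {n3}, {n4}\n")

      if __name__ == "__main__":
          nodes, elems = read_tetgen("phantom")      # expects phantom.node / phantom.ele
          write_abaqus_inp("phantom", nodes, elems)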

  16. TASAC, a computer program for thermal analysis of severe accident conditions. Version 3/01, Dec 1991. Model description and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Stempniewicz, M; Marks, P; Salwa, K

    1992-06-01

    TASAC (Thermal Analysis of Severe Accident Conditions) is a computer code developed at the Institute of Atomic Energy, written in FORTRAN 77, for the digital computer analysis of PWR rod bundle behaviour during severe accident conditions. The code is able to model an early stage of core degradation, including heat transfer inside the rods, convective and radiative heat exchange, cladding interactions with coolant and fuel, hydrogen generation, and melting, relocation and refreezing of fuel rod materials with dissolution of UO2 and ZrO2 in the liquid phase. The code was applied to the simulation of International Standard Problem number 28, performed on the PHEBUS test facility. This report contains a description of the program's physical models, a detailed description of the input data requirements, and the results of code verification. The main directions for future TASAC code development are formulated. (author). 20 refs, 39 figs, 4 tabs.

  17. Programming While Construction of Engineering 3D Models of Complex Geometry

    Science.gov (United States)

    Kheyfets, A. L.

    2017-11-01

    The capabilities of constructing geometrically accurate computational 3D models with the use of programming are presented. The construction of models of an architectural arch and a globoid worm gear is considered as an example. The models are designed in the AutoCAD package. Three construction programs are given. The first program is for designing a multi-section architectural arch. Control of the arch's geometry by varying its main parameters is shown. The second program is for designing and studying the working surface of a globoid gear's worm. The article shows how to animate the formation of this surface. The third program is for the formation of a worm gear cavity surface. The cavity formation dynamics is studied. The programs are written in the AutoLISP programming language. The program texts are provided.

  18. Designing Educational Games for Computer Programming: A Holistic Framework

    Science.gov (United States)

    Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios

    2014-01-01

    Computer science is continuously evolving during the past decades. This has also brought forth new knowledge that should be incorporated and new learning strategies must be adopted for the successful teaching of all sub-domains. For example, computer programming is a vital knowledge area within computer science with constantly changing curriculum…

  19. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    Models are playing important roles in design and analysis of chemicals based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working...... development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines....... In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse...

  20. EQ6, a computer program for reaction path modeling of aqueous geochemical systems: Theoretical manual, user's guide, and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Wolery, T.J.; Daveler, S.A.

    1992-01-01

    EQ6 is a FORTRAN computer program in the EQ3/6 software package (Wolery, 1979). It calculates reaction paths (chemical evolution) in reacting water-rock and water-rock-waste systems. Speciation in aqueous solution is an integral part of these calculations. EQ6 computes models of titration processes (including fluid mixing), irreversible reaction in closed systems, irreversible reaction in some simple kinds of open systems, and heating or cooling processes, as well as solving "single-point" thermodynamic equilibrium problems. A reaction path calculation normally involves a sequence of thermodynamic equilibrium calculations. Chemical evolution is driven by a set of irreversible reactions (i.e., reactions out of equilibrium) and/or changes in temperature and/or pressure. These irreversible reactions usually represent the dissolution or precipitation of minerals or other solids. The code computes the appearance and disappearance of phases in solubility equilibrium with the water. It finds the identities of these phases automatically. The user may specify which potential phases are allowed to form and which are not. There is an option to fix the fugacities of specified gas species, simulating contact with a large external reservoir. Rate laws for irreversible reactions may be either relative rates or actual rates. If any actual rates are used, the calculation has a time frame. Several forms for actual rate laws are programmed into the code. EQ6 is presently able to model both mineral dissolution and growth kinetics
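
    EQ6 couples full aqueous speciation, kinetics, and automatic phase selection to the reaction-path stepping; as a heavily simplified, hedged illustration of just the basic loop, advancing a reaction-progress variable, updating the fluid composition, and testing saturation against an equilibrium constant, consider the single-mineral toy example below, with unit activity coefficients and an assumed solubility product.

      # Toy reaction-path loop: irreversible dissolution of a Ca-sulfate mineral
      # into water, stepping reaction progress until saturation.  Unit activity
      # coefficients and the Ksp value are simplifying assumptions, not EQ6 data.
      import math

      log_ksp = -4.58          # assumed solubility product for the toy mineral
      ca, so4 = 0.0, 0.0       # molalities of Ca2+ and SO4(2-) in the fluid
      d_xi = 1.0e-4            # reaction-progress increment (mol per kg water)

      step = 0
      while True:
          step += 1
          ca += d_xi           # each increment releases 1 Ca2+ and 1 SO4(2-)
          so4 += d_xi
          log_q = math.log10(ca) + math.log10(so4)     # ion activity product (ideal)
          if log_q >= log_ksp:                         # saturation index >= 0
              print(f"saturated after {step} steps, dissolved {step * d_xi:.4f} mol/kg")
              break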

  1. Creating the computer player: an engaging and collaborative approach to introduce computational thinking by combining ‘unplugged’ activities with visual programming

    Directory of Open Access Journals (Sweden)

    Anna Gardeli

    2017-11-01

    Ongoing research is being conducted on appropriate course design, practices and teacher interventions for improving the efficiency of computer science and programming courses in K-12 education. The trend is towards a more constructivist problem-based learning approach. Computational thinking, which refers to formulating and solving problems in a form that can be efficiently processed by a computer, raises an important educational challenge. Our research aims to explore possible ways of enriching computer science teaching with a focus on development of computational thinking. We have prepared and evaluated a learning intervention for introducing computer programming to children between 10 and 14 years old; this involves students working in groups to program the behavior of the computer player of a well-known game. The programming process is split into two parts. First, students design a high-level version of their algorithm during an ‘unplugged’ pen & paper phase, and then they encode their solution as an executable program in a visual programming environment. Encouraging evaluation results have been achieved regarding the educational and motivational value of the proposed approach.

  2. SONATINA-1: a computer program for seismic response analysis of column in HTGR core

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1980-11-01

    A computer program, SONATINA-1, for predicting the behavior of a prismatic high-temperature gas-cooled reactor (HTGR) core under seismic excitation has been developed. In this analytical method, blocks are treated as rigid bodies and are constrained by dowel pins which restrict relative horizontal movement but allow vertical and rocking motions. Coulomb friction between blocks and between dowel holes and pins is also considered. A spring-dashpot model is used for the collision process between adjacent blocks and between blocks and boundary walls. Analytical results are compared with experimental results and are found to be in good agreement. The computer program can be used to predict the behavior of the HTGR core under seismic excitation. (author)
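
    SONATINA-1 treats many blocks, dowel pins, and friction; purely as a hedged, minimal illustration of the spring-dashpot contact law mentioned above, the sketch below integrates a single rigid block hitting a wall in one dimension, with invented stiffness and damping values.

      # Minimal spring-dashpot (Kelvin-Voigt) contact sketch: a rigid block moving
      # toward a wall; parameters are illustrative, not SONATINA-1 input data.
      m = 50.0          # block mass [kg]
      k = 2.0e6         # contact spring stiffness [N/m]  (assumed)
      c = 2.0e3         # contact dashpot coefficient [N*s/m]  (assumed)
      dt = 1.0e-5

      x, v = -0.01, 1.0          # start 1 cm from the wall, moving toward it at 1 m/s
      for step in range(20000):
          penetration = max(0.0, x)              # overlap with the wall at x = 0
          if penetration > 0.0:
              force = -k * penetration - c * v   # spring + dashpot, only while in contact
          else:
              force = 0.0
          v += dt * force / m                    # symplectic Euler: velocity first,
          x += dt * v                            # then position with the new velocity
      print(f"rebound speed: {abs(v):.3f} m/s")  # below 1 m/s: the dashpot dissipates energy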

  3. F-Nets and Software Cabling: Deriving a Formal Model and Language for Portable Parallel Programming

    Science.gov (United States)

    DiNucci, David C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    Parallel programming is still being based upon antiquated sequence-based definitions of the terms "algorithm" and "computation", resulting in programs which are architecture dependent and difficult to design and analyze. By focusing on obstacles inherent in existing practice, a more portable model is derived here, which is then formalized into a model called Soviets which utilizes a combination of imperative and functional styles. This formalization suggests more general notions of algorithm and computation, as well as insights into the meaning of structured programming in a parallel setting. To illustrate how these principles can be applied, a very-high-level graphical architecture-independent parallel language, called Software Cabling, is described, with many of the features normally expected from today's computer languages (e.g. data abstraction, data parallelism, and object-based programming constructs).

  4. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis. This paper proposes and elaborates on a novel model for use in computer profiling: the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...

  5. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    Science.gov (United States)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  6. History and future perspectives of the Monte Carlo shell model -from Alphleet to K computer-

    International Nuclear Information System (INIS)

    Shimizu, Noritaka; Otsuka, Takaharu; Utsuno, Yutaka; Mizusaki, Takahiro; Honma, Michio; Abe, Takashi

    2013-01-01

    We report on the history of the development of the Monte Carlo shell model (MCSM). The MCSM was proposed in order to perform large-scale shell-model calculations which the direct diagonalization method cannot reach. Since 1999, PC clusters have been used for parallel computation of the MCSM. Since 2011 we have participated in the High Performance Computing Infrastructure Strategic Program and developed a new MCSM code for current massively parallel computers such as the K computer. We discuss future perspectives concerning a new framework and parallel computation of the MCSM by incorporating the conjugate gradient method and energy-variance extrapolation

  7. Computability, complexity, and languages fundamentals of theoretical computer science

    CERN Document Server

    Davis, Martin D; Rheinboldt, Werner

    1983-01-01

    Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science provides an introduction to the various aspects of theoretical computer science. Theoretical computer science is the mathematical study of models of computation. This text is composed of five parts encompassing 17 chapters, and begins with an introduction to the use of proofs in mathematics and the development of computability theory in the context of an extremely simple abstract programming language. The succeeding parts demonstrate the performance of abstract programming language using a macro expa

  8. A general method for generating bathymetric data for hydrodynamic computer models

    Science.gov (United States)

    Burau, J.R.; Cheng, R.T.

    1989-01-01

    To generate water depth data from randomly distributed bathymetric data for numerical hydrodynamic models, raw input data from field surveys, water depth data digitized from nautical charts, or a combination of the two are sorted to give an ordered data set on which a search algorithm is used to isolate data for interpolation. Water depths at locations required by hydrodynamic models are interpolated from the bathymetric database using linear or cubic shape functions used in the finite-element method. The bathymetric database organization and preprocessing, the search algorithm used in finding the bounding points for interpolation, the mathematics of the interpolation formulae, and the features of the automatic generation of water depths at hydrodynamic model grid points are included in the analysis. This report includes documentation of two computer programs which are used to: (1) organize the input bathymetric data; and (2) interpolate depths for hydrodynamic models. An example of computer program operation is drawn from a realistic application to the San Francisco Bay estuarine system. (Author's abstract)
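
    As a hedged sketch of the interpolation step described above, linear finite-element shape functions evaluated at a model grid point inside a triangle of bathymetric data points, the function below uses barycentric coordinates; the search for the bounding points and the programs' file handling are omitted, and the survey data are invented.

      # Linear (finite-element style) interpolation of water depth inside a triangle
      # of bathymetric points, using barycentric coordinates; search step omitted.
      def interpolate_depth(p, tri):
          (x1, y1, d1), (x2, y2, d2), (x3, y3, d3) = tri
          x, y = p
          det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
          w1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det
          w2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
          w3 = 1.0 - w1 - w2
          if min(w1, w2, w3) < -1e-12:
              raise ValueError("point lies outside the bounding triangle")
          # the barycentric weights are exactly the linear shape-function values
          return w1 * d1 + w2 * d2 + w3 * d3

      # three surveyed points (x, y, depth) and a hydrodynamic-model grid point
      triangle = [(0.0, 0.0, 10.0), (100.0, 0.0, 14.0), (0.0, 100.0, 20.0)]
      print(interpolate_depth((25.0, 25.0), triangle))   # 13.5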

  9. STRAY - An interactive program for the computation of stray radiation in infrared telescopes

    Science.gov (United States)

    St. Clair Dinger, Ann

    1987-01-01

    The STRAY program to model the amount of stray radiation reaching the focal plane of a well-baffled telescope is described. The STRAY telescope model is addressed, including the aperture shade, barrel baffle, optics, mirror sectioning and chopping, and off-axis points in focal plane. The possible illumination paths are shown, and calculation options using STRAY are discussed. The stored data and computational aspects of STRAY are addressed. STRAY is compared to the MINI-APART model, and applications of STRAY are described.

  10. BALANCER: A Computer Program for Balancing Chemical Equations.

    Science.gov (United States)

    Jones, R. David; Schwab, A. Paul

    1989-01-01

    Describes the theory and operation of a computer program which was written to balance chemical equations. Software consists of a compiled file of 46K for use under MS-DOS 2.0 or later on IBM PC or compatible computers. Additional specifications of courseware and availability information are included. (Author/RT)
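
    The record does not describe BALANCER's algorithm; one standard way to balance a chemical equation, shown purely as an illustration and not necessarily the program's method, is to find the nullspace of the element-by-species composition matrix, for example with sympy.

      # Balancing C3H8 + O2 -> CO2 + H2O by finding the integer nullspace of the
      # element-by-species matrix (a standard approach; not necessarily BALANCER's).
      from math import lcm
      from sympy import Matrix

      # rows: C, H, O; columns: C3H8, O2, CO2, H2O (products entered with minus signs)
      A = Matrix([
          [3, 0, -1,  0],   # carbon balance
          [8, 0,  0, -2],   # hydrogen balance
          [0, 2, -2, -1],   # oxygen balance
      ])

      null = A.nullspace()[0]                       # one-dimensional nullspace
      scale = lcm(*[int(term.q) for term in null])  # clear denominators
      coeffs = [int(term * scale) for term in null]
      print(coeffs)   # [1, 5, 3, 4]  ->  C3H8 + 5 O2 -> 3 CO2 + 4 H2O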

  11. Computational Software to Fit Seismic Data Using Epidemic-Type Aftershock Sequence Models and Modeling Performance Comparisons

    Science.gov (United States)

    Chu, A.

    2016-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work implements three of the homogeneous ETAS models described in Ogata (1998). With a model's log-likelihood function, my software finds the Maximum-Likelihood Estimates (MLEs) of the model's parameters to estimate the homogeneous background rate and the temporal and spatial parameters that govern triggering effects. The EM algorithm is employed for its advantages of stability and robustness (Veen and Schoenberg, 2008). My work also presents comparisons among the three models in robustness, convergence speed, and implementation from theory to computing practice. Up-to-date regional seismic data of seismically active areas such as Southern California and Japan are used to demonstrate the comparisons. Data analysis has been done using the computer languages Java and R. Java has the advantages of being strongly typed and of easy control over memory resources, while R has the advantage of numerous available functions in statistical computing. Comparisons are also made between the two programming languages in convergence and stability, computational speed, and ease of implementation. Issues that may affect convergence, such as spatial shapes, are discussed.
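
    The software described above fits full spatial-temporal ETAS models by EM in Java and R; purely as a much-reduced, hedged illustration of the maximum-likelihood step, the sketch below fits a temporal Hawkes process with an exponential triggering kernel by direct numerical optimization on synthetic event times (scipy instead of EM, and far simpler than ETAS).

      # Reduced illustration: maximum-likelihood fit of a temporal Hawkes process
      # with intensity mu + sum alpha*beta*exp(-beta*(t - t_i)); this is a much
      # simpler model than the spatial-temporal ETAS models in the record.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      T = 1000.0
      times = np.sort(rng.uniform(0.0, T, size=300))   # synthetic stand-in catalog

      def neg_log_likelihood(params):
          mu, alpha, beta = np.exp(params)             # work with log-parameters > 0
          A = 0.0
          loglik = 0.0
          for i in range(len(times)):
              if i > 0:
                  # recursion for sum_{j<i} exp(-beta*(t_i - t_j))
                  A = np.exp(-beta * (times[i] - times[i - 1])) * (1.0 + A)
              loglik += np.log(mu + alpha * beta * A)
          # compensator: integral of the conditional intensity over [0, T]
          loglik -= mu * T + alpha * np.sum(1.0 - np.exp(-beta * (T - times)))
          return -loglik

      result = minimize(neg_log_likelihood, x0=np.log([0.1, 0.2, 1.0]), method="Nelder-Mead")
      mu_hat, alpha_hat, beta_hat = np.exp(result.x)
      print(f"mu={mu_hat:.3f}, alpha={alpha_hat:.3f}, beta={beta_hat:.3f}")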

  12. Computational chemistry with transputers: A direct SCF program

    International Nuclear Information System (INIS)

    Wedig, U.; Burkhardt, A.; Schnering, H.G. von

    1989-01-01

    By using transputers it is possible to build up networks of parallel processors with varying topology. Due to the architecture of the processors it is appropriate to use the MIMD (multiple instruction, multiple data) concept of parallel computing. The most suitable programming language is OCCAM. We investigate the use of transputer networks in computational chemistry, starting with the direct SCF method. The most time-consuming step, the calculation of the two-electron integrals, is executed in parallel. Each node in the network calculates whole batches of integrals. The main program is written in OCCAM. For some large-scale arithmetic processes running on a single node, however, we used FORTRAN subroutines from standard ab-initio programs to reduce the programming effort. Test calculations show that the integral calculation step can be parallelized very efficiently. We observed a speed-up of almost 8 using eight network processors. Even in consideration of the scalar part of the SCF iteration, the speed-up is not less than 7.1. (orig.)
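
    The batch-parallel strategy can be illustrated, in Python rather than OCCAM and with a placeholder workload standing in for the two-electron integrals, roughly as follows.

        from multiprocessing import Pool

        def compute_batch(batch):
            # Placeholder for evaluating one batch of two-electron integrals;
            # here we just return a dummy partial sum over the batch indices.
            return sum(i * 1.0e-3 for i in batch)

        if __name__ == "__main__":
            # Split the full index set into batches, one batch per task.
            batches = [range(i, i + 1000) for i in range(0, 8000, 1000)]
            with Pool(processes=8) as pool:       # eight workers, mirroring eight transputer nodes
                partial_sums = pool.map(compute_batch, batches)
            print(sum(partial_sums))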

  13. Human Inspired Self-developmental Model of Neural Network (HIM): Introducing Content/Form Computing

    Science.gov (United States)

    Krajíček, Jiří

    This paper presents cross-disciplinary research connecting medical/psychological evidence on human abilities with the need in informatics to update current models in computer science to support alternative methods of computation and communication. In [10] we have already proposed a hypothesis introducing the concept of the human information model (HIM) as a cooperative system. Here we continue the HIM design in detail. In our design, we first introduce the Content/Form computing system, which is a new principle relative to present methods in evolutionary computing (genetic algorithms, genetic programming). Then we apply this system to the HIM (a type of artificial neural network) model as a basic network self-developmental paradigm. The main inspiration for our natural/human design comes from the well-known concept of artificial neural networks, medical/psychological evidence, and Sheldrake's theory of "Nature as Alive" [22].

  14. Analyzing C2 Structures and Self-Synchronization with Simple Computational Models

    Science.gov (United States)

    2011-06-01

    Paper presented at the 16th ICCRTS, "Collective C2 in Multinational Civil-Military Operations": Analyzing C2 Structures and Self-Synchronization with Simple Computational Models. The Kuramoto Model, though with some serious limitations, provides a representation of information flow and self-synchronization in an…

  15. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  16. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  17. Automated a complex computer aided design concept generated using macros programming

    Science.gov (United States)

    Rizal Ramly, Mohammad; Asrokin, Azharrudin; Abd Rahman, Safura; Zulkifly, Nurul Ain Md

    2013-12-01

    Changing a complex Computer Aided Design profile such as car and aircraft surfaces has always been difficult and challenging. The capability of CAD software such as AutoCAD and CATIA shows that a simple configuration of a CAD design can be easily modified without hassle, but this is not the case with complex design configurations. Design changes help users to test and explore various configurations of the design concept before the production of a model. The purpose of this study is to look into macros programming as a parametric method for commercial aircraft design. Macros programming is a method where the configuration of the design is changed by recording a script of commands, editing the data values and adding certain new command lines to create an element of parametric design. The steps and the procedure to create a macros program are discussed, besides looking into some difficulties during the process of creation and the advantages of its usage. Generally, the advantages of macros programming as a method of parametric design are: allowing flexibility for design exploration, increasing the usability of the design solution, allowing the appropriate parts to be contained by the model while restricting others, and providing real-time feedback on changes.

  18. Automated a complex computer aided design concept generated using macros programming

    International Nuclear Information System (INIS)

    Ramly, Mohammad Rizal; Asrokin, Azharrudin; Rahman, Safura Abd; Zulkifly, Nurul Ain Md

    2013-01-01

    Changing a complex Computer Aided Design profile such as car and aircraft surfaces has always been difficult and challenging. The capability of CAD software such as AutoCAD and CATIA shows that a simple configuration of a CAD design can be easily modified without hassle, but this is not the case with complex design configurations. Design changes help users to test and explore various configurations of the design concept before the production of a model. The purpose of this study is to look into macros programming as a parametric method for commercial aircraft design. Macros programming is a method where the configuration of the design is changed by recording a script of commands, editing the data values and adding certain new command lines to create an element of parametric design. The steps and the procedure to create a macros program are discussed, besides looking into some difficulties during the process of creation and the advantages of its usage. Generally, the advantages of macros programming as a method of parametric design are: allowing flexibility for design exploration, increasing the usability of the design solution, allowing the appropriate parts to be contained by the model while restricting others, and providing real-time feedback on changes.

  19. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    Science.gov (United States)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

    A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch-processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  20. Computer program for optical systems ray tracing

    Science.gov (United States)

    Ferguson, T. J.; Konn, H.

    1967-01-01

    Program traces rays of light through optical systems consisting of up to 65 different optical surfaces and computes the aberrations. For design purposes, paraxial tracings with astigmatism and third-order tracings are provided.
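
    A paraxial trace of the kind such programs perform can be sketched with 2x2 ray-transfer matrices in the reduced-angle (y, n·u) convention; the surface radii, spacings and indices below are invented for illustration and are not taken from the documented program.

        import numpy as np

        # Paraxial ray-transfer matrices in the (y, n*u) "reduced angle" convention:
        # refraction at a spherical surface has power P = (n2 - n1) / R, and free
        # propagation over distance d in a medium of index n uses the reduced distance d / n.

        def refraction(n1, n2, R):
            P = (n2 - n1) / R
            return np.array([[1.0, 0.0], [-P, 1.0]])

        def transfer(d, n):
            return np.array([[1.0, d / n], [0.0, 1.0]])

        ray = np.array([1.0, 0.0])                  # marginal ray: height 1, reduced angle 0
        surfaces = [refraction(1.0, 1.5, 50.0),     # air -> glass, R = +50 mm (hypothetical lens)
                    transfer(5.0, 1.5),             # 5 mm of glass
                    refraction(1.5, 1.0, -50.0),    # glass -> air, R = -50 mm
                    transfer(40.0, 1.0)]            # propagate 40 mm in air
        for M in surfaces:
            ray = M @ ray
        print("height, n*u at final plane:", ray)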

  1. The Main Tendencies in the Development of Startup Projects as a Form of Innovative-Creative Enterprises in the Ukrainian Computer Programming Market

    Directory of Open Access Journals (Sweden)

    Garafonova Olga I.

    2017-10-01

    The article is aimed at studying the main tendencies in the development of startup projects as a form of innovative-creative enterprises in the Ukrainian computer programming market. A definition of «innovative-creative enterprises» has been proposed, and the main features of startups as a form of innovative-creative enterprises have been considered. The directions of development of the computer programming market were analyzed, considering the most significant future trends, products and services in the computer programming sector. An analysis of startups in the Ukrainian computer programming market, based on the volume of investments made, was carried out. A model for the development of startup projects as a form of innovative-creative enterprises has been designed. The promising but as yet unexplored spheres in the Ukrainian computer programming market, in which startups have not yet been launched, have been indicated.

  2. Final Report for Award #DE-SC3956 Separating Algorithm and Implementation via programming Model Injection (SAIMI)

    Energy Technology Data Exchange (ETDEWEB)

    Strout, Michelle [Colorado State Univ., Fort Collins, CO (United States)

    2015-08-15

    Programming parallel machines is fraught with difficulties: the obfuscation of algorithms due to implementation details such as communication and synchronization, the need for transparency between language constructs and performance, the difficulty of performing program analysis to enable automatic parallelization techniques, and the existence of important "dusty deck" codes. The SAIMI project developed abstractions that enable the orthogonal specification of algorithms and implementation details within the context of existing DOE applications. The main idea is to enable the injection of small programming models such as expressions involving transcendental functions, polyhedral iteration spaces with sparse constraints, and task graphs into full programs through the use of pragmas. These smaller, more restricted programming models enable the orthogonal specification of many implementation details such as how to map the computation onto parallel processors, how to schedule the computation, and how to allocate storage for the computation. At the same time, these small programming models enable the expression of the most computationally intense and communication-heavy portions of many scientific simulations. The ability to orthogonally manipulate the implementation of such computations will significantly ease performance-programming efforts and expose transformation possibilities and parameters to automated approaches such as autotuning. At Colorado State University, the SAIMI project was supported through DOE grant DE-SC3956 from April 2010 through August 2015. The SAIMI project has contributed a number of important results to programming abstractions that enable the orthogonal specification of implementation details in scientific codes. This final report summarizes the research that was funded by the SAIMI project.

  3. Computer programs for TRIGA calibration, burnup evaluation, and bookkeeping

    International Nuclear Information System (INIS)

    Nelson, George W.

    1978-01-01

    Several computer programs have been developed at the University of Arizona to assist the direction and operation of the TRIGA Reactor Laboratory. The programs fall into the following three categories: 1. Programs for calculation of burnup of each fuel element in the reactor core, for maintaining an inventory of fuel element location and fissile content at any time, and for evaluation of the reactivity effects of burnup or proposed fuel element rearrangement in the core. 2. Programs for evaluation, function fitting, and tabulation of control rod measurements. 3. Bookkeeping programs to summarize and tabulate reactor runs and irradiations according to time, energy release, purpose, responsible party, etc. These summarized data are reported in an annual operating report for the facility. The use of these programs has saved innumerable hours of repetitious work, assuring more accurate, objective results, and requiring a minimum of effort to repeat calculations when input data are modified. The programs are written in FORTRAN-IV, and have been used on a CDC-6400 computer. (author)

  4. Present situation and necessity of computer programmed X-ray equipment

    International Nuclear Information System (INIS)

    Shiraishi, Junji; Hatagawa, Masakatsu

    1989-01-01

    In order to take radiographic photos, it is necessary to know a great deal of information: for example, mA, kV, exposure time, screen, grid, phototimer, etc. In this study, we examined four widely used types of computer-programmed X-ray equipment. These computer-assisted processes are designed to look at the data and make programmed individual decisions. The purpose is to reduce the differences in radiographic quality and the risk of retakes caused by differences in experience among technicians. We concluded that none of them has ideal functions, but each can be used as a source to supply information if a little remodeling is done, and we confirm that computer-programmed X-ray equipment is, and will continue to be, necessary. (author)

  5. HEATUP: a computer program for the thermal analysis of a LOFC accident in an HTGR

    International Nuclear Information System (INIS)

    Siman-Tov, I.I.; Turner, W.D.

    1976-11-01

    The HEATUP code, a modification of the general, time-dependent, one-, two-, and three-dimensional program HEATING5, was designed for the thermal analysis of a Loss of Forced Circulation accident in a High Temperature Gas-Cooled Reactor. This report contains a description of the computational model which includes: a description of the basic problem; a short review of preliminary results related to the choice of thermal properties, boundary conditions and initial conditions; a full description of a typical three-dimensional R-Z model and a limited one of a two-dimensional RZ model. HEATUP's additional computations are presented together with the method of input preparation. The three-dimensional model of the Fulton Generating Station Loss of Forced Circulation accident is used as a sample problem. A complete presentation of the input data is made. Also, the computer printout of the sample problem input data and results are given

  6. HEATUP: a computer program for the thermal analysis of a LOFC accident in an HTGR

    Energy Technology Data Exchange (ETDEWEB)

    Siman-Tov, I.I.; Turner, W.D.

    1976-11-01

    The HEATUP code, a modification of the general, time-dependent, one-, two-, and three-dimensional program HEATING5, was designed for the thermal analysis of a Loss of Forced Circulation accident in a High Temperature Gas-Cooled Reactor. This report contains a description of the computational model which includes: a description of the basic problem; a short review of preliminary results related to the choice of thermal properties, boundary conditions and initial conditions; a full description of a typical three-dimensional R-Z model and a limited one of a two-dimensional RZ model. HEATUP's additional computations are presented together with the method of input preparation. The three-dimensional model of the Fulton Generating Station Loss of Forced Circulation accident is used as a sample problem. A complete presentation of the input data is made. Also, the computer printout of the sample problem input data and results are given.

  7. Thermal model of laser-induced skin damage: computer program operator's manual. Final report, September 1976--April 1977

    Energy Technology Data Exchange (ETDEWEB)

    Takata, A.N.

    1977-12-01

    A user-oriented description is given of a computer program for predicting temperature rises, irreversible damage, and degree of burns caused to skin by laser exposures. This report describes the parameters necessary to run the program and provides suggested values for the parameters. Input data are described in detail as well as the capabilities and limitations of the program. (Author)

  8. Precision Modeling Of Targets Using The VALUE Computer Program

    Science.gov (United States)

    Hoffman, George A.; Patton, Ronald; Akerman, Alexander

    1989-08-01

    The 1976-vintage LASERX computer code has been augmented to produce realistic electro-optical images of targets. Capabilities lacking in LASERX but recently incorporated into its VALUE successor include: shadows cast onto the ground; shadows cast onto parts of the target; see-through transparencies (e.g., canopies); apparent images due both to atmospheric scattering and turbulence; and surfaces characterized by multiple bi-directional reflectance functions. VALUE provides not only realistic target modeling through its precise and comprehensive representation of all target attributes, but is additionally very user friendly. Specifically, setup of runs is accomplished by screen-prompting menus in a sequence of queries that is logical to the user. VALUE also incorporates the Optical Encounter (OPEC) software developed by Tricor Systems, Inc., Elgin, IL.

  9. Programming Non-Trivial Algorithms in the Measurement Based Quantum Computation Model

    Energy Technology Data Exchange (ETDEWEB)

    Alsing, Paul [United States Air Force Research Laboratory, Wright-Patterson Air Force Base; Fanto, Michael [United States Air Force Research Laboratory, Wright-Patterson Air Force Base; Lott, Capt. Gordon [United States Air Force Research Laboratory, Wright-Patterson Air Force Base; Tison, Christoper C. [United States Air Force Research Laboratory, Wright-Patterson Air Force Base

    2014-01-01

    We provide a set of prescriptions for implementing a quantum circuit model algorithm as a measurement-based quantum computing (MBQC) algorithm [1, 2] via a large cluster state. As a means of illustration we draw upon our numerical modeling experience to describe a large graph state capable of searching a logical 8-element list (a non-trivial version of Grover's algorithm [3] with feedforward). We develop several prescriptions based on analytic evaluation of cluster states and graph state equations which can be generalized into any circuit model operations. Such a resulting cluster state will be able to carry out the desired operation with appropriate measurements and feedforward error correction. We also discuss the physical implementation and the analysis of the principal 3-qubit entangling gate (Toffoli) required for a non-trivial feedforward realization of an 8-element Grover search algorithm.

  10. An ODP computational model of a cooperative binding object

    Science.gov (United States)

    Logé, Christophe; Najm, Elie; Chen, Ken

    1997-12-01

    The next generation of systems will have to manage several geographically distributed users simultaneously. These systems belong to the class of computer-supported cooperative work (CSCW) systems. The development of such complex systems requires rigorous development methods and flexible open architectures. Open distributed processing (ODP) is a standardization effort that aims at providing such architectures. ODP features appropriate abstraction levels and a clear articulation between requirements, programming and infrastructure support. ODP advocates the use of formal methods for the specification of systems and components. The computational model, an object-based model and one of the abstraction levels identified within ODP, plays a central role in the global architecture. In this model, basic objects can be composed with communication and distribution abstractions (called binding objects) to form a computational specification of distributed systems, or applications. Computational specifications can then be mapped (in a mechanism akin to compilation) onto an engineering solution. We use an ODP-inspired method to computationally specify a cooperative system. We start from a general-purpose component that we progressively refine into a collection of basic and binding objects. We focus on two issues of a co-authoring application, namely, dynamic reconfiguration and multiview synchronization. We discuss solutions for these issues and formalize them using the MT-LOTOS specification language that is currently studied in the ISO formal description techniques standardization group.

  11. RISKIND: A computer program for calculating radiological consequences and health risks from transportation of spent nuclear fuel

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Y.C. [Square Y, Orchard Park, NY (United States); Chen, S.Y.; LePoire, D.J. [Argonne National Lab., IL (United States). Environmental Assessment and Information Sciences Div.; Rothman, R. [USDOE Idaho Field Office, Idaho Falls, ID (United States)

    1993-02-01

    This report presents the technical details of RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel. RISKIND is a user-friendly, semi-interactive program that can be run on an IBM or equivalent personal computer. The program language is FORTRAN-77. Several models are included in RISKIND that have been tailored to calculate the exposure to individuals under various incident-free and accident conditions. The incident-free models assess exposures from both gamma and neutron radiation and can account for different cask designs. The accident models include accidental release, atmospheric transport, and the environmental pathways of radionuclides from spent fuels; these models also assess health risks to individuals and the collective population. The models are supported by databases that are specific to spent nuclear fuels and include a radionuclide inventory and dose conversion factors.
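
    The basic dose bookkeeping such codes perform can be illustrated with a toy calculation; the nuclide intakes and dose conversion factors below are placeholders, not values from the RISKIND databases.

        # Toy committed-dose calculation: dose = sum over nuclides of intake * dose conversion factor.
        # All numbers are illustrative placeholders, not values from the RISKIND databases.
        intake_bq = {"Cs-137": 1.0e3, "Sr-90": 2.0e2, "Co-60": 5.0e1}        # hypothetical intakes (Bq)
        dcf_sv_per_bq = {"Cs-137": 4.6e-9, "Sr-90": 3.0e-8, "Co-60": 1.0e-8}  # hypothetical dose factors

        dose_sv = sum(intake_bq[n] * dcf_sv_per_bq[n] for n in intake_bq)
        print(f"committed effective dose: {dose_sv:.2e} Sv")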

  12. RISKIND: A computer program for calculating radiological consequences and health risks from transportation of spent nuclear fuel

    International Nuclear Information System (INIS)

    Yuan, Y.C.; Chen, S.Y.; LePoire, D.J.

    1993-02-01

    This report presents the technical details of RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel. RISKIND is a user-friendly, semi-interactive program that can be run on an IBM or equivalent personal computer. The program language is FORTRAN-77. Several models are included in RISKIND that have been tailored to calculate the exposure to individuals under various incident-free and accident conditions. The incident-free models assess exposures from both gamma and neutron radiation and can account for different cask designs. The accident models include accidental release, atmospheric transport, and the environmental pathways of radionuclides from spent fuels; these models also assess health risks to individuals and the collective population. The models are supported by databases that are specific to spent nuclear fuels and include a radionuclide inventory and dose conversion factors.

  13. Employing subgoals in computer programming education

    Science.gov (United States)

    Margulieux, Lauren E.; Catrambone, Richard; Guzdial, Mark

    2016-01-01

    The rapid integration of technology into our professional and personal lives has left many education systems ill-equipped to deal with the influx of people seeking computing education. To improve computing education, we are applying techniques that have been developed for other procedural fields. The present study applied such a technique, subgoal labeled worked examples, to explore whether it would improve programming instruction. The first two experiments, conducted in a laboratory, suggest that the intervention improves undergraduate learners' problem-solving performance and affects how learners approach problem-solving. The third experiment demonstrates that the intervention has similar, and perhaps stronger, effects in an online learning environment with in-service K-12 teachers who want to become qualified to teach computing courses. By implementing this subgoal intervention as a tool for educators to teach themselves and their students, education systems could improve computing education and better prepare learners for an increasingly technical world.

  14. Cognitive computing and eScience in health and life science research: artificial intelligence and obesity intervention programs.

    Science.gov (United States)

    Marshall, Thomas; Champagne-Langabeer, Tiffiany; Castelli, Darla; Hoelscher, Deanna

    2017-12-01

    To present research models based on artificial intelligence and discuss the concept of cognitive computing and eScience as disruptive factors in health and life science research methodologies. The paper identifies big data as a catalyst to innovation and the development of artificial intelligence, presents a framework for computer-supported human problem solving and describes a transformation of research support models. This framework includes traditional computer support; federated cognition using machine learning and cognitive agents to augment human intelligence; and a semi-autonomous/autonomous cognitive model, based on deep machine learning, which supports eScience. The paper provides a forward view of the impact of artificial intelligence on our human-computer support and research methods in health and life science research. By augmenting or amplifying human task performance with artificial intelligence, cognitive computing and eScience research models are discussed as novel and innovative systems for developing more effective adaptive obesity intervention programs.

  15. Computer Presentation Programs and Teaching Research Methodologies

    Directory of Open Access Journals (Sweden)

    Vahid Motamedi

    2015-05-01

    Supplementing traditional chalk-and-board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems contributing to student errors while taking class notes, such as the transcription of numbers to the board and the instructor's handwriting, can be resolved by careful construction of computer presentations. The use of computer presentation programs promises to increase the effectiveness of learning by making content more readily available, by reducing the cost and effort of producing quality content, and by allowing content to be more easily shared. This paper describes how these problems can be overcome by using presentation packages for instruction.

  16. Computer programs for eddy-current defect studies

    Energy Technology Data Exchange (ETDEWEB)

    Pate, J. R.; Dodd, C. V. [Oak Ridge National Lab., TN (USA)

    1990-06-01

    Several computer programs to aid in the design of eddy-current tests and probes have been written. The programs, written in Fortran, deal in various ways with the response to defects exhibited by four types of probes: the pancake probe, the reflection probe, the circumferential boreside probe, and the circumferential encircling probe. Programs are included which calculate the impedance or voltage change in a coil due to a defect, which calculate and plot the defect sensitivity factor of a coil, and which invert calculated or experimental readings to obtain the size of a defect. The theory upon which the programs are based is the Burrows point defect theory, and thus the calculations of the programs will be more accurate for small defects. 6 refs., 21 figs.

  17. Computer programs for eddy-current defect studies

    International Nuclear Information System (INIS)

    Pate, J.R.; Dodd, C.V.

    1990-06-01

    Several computer programs to aid in the design of eddy-current tests and probes have been written. The programs, written in Fortran, deal in various ways with the response to defects exhibited by four types of probes: the pancake probe, the reflection probe, the circumferential boreside probe, and the circumferential encircling probe. Programs are included which calculate the impedance or voltage change in a coil due to a defect, which calculate and plot the defect sensitivity factor of a coil, and which invert calculated or experimental readings to obtain the size of a defect. The theory upon which the programs are based is the Burrows point defect theory, and thus the calculations of the programs will be more accurate for small defects. 6 refs., 21 figs

  18. ITAC, an insider threat assessment computer program

    International Nuclear Information System (INIS)

    Eggers, R.F.; Giese, E.W.

    1988-01-01

    The insider threat assessment computer program, ITAC, is used to evaluate the vulnerability of nuclear material processing facilities to theft of special nuclear material by one or more authorized insider adversaries. The program includes two main parts: one is used to determine the timeliness of nuclear material accounting tests for loss of special nuclear material, and the other determines pathway aggregate detection probabilities for physical protection systems and material control procedures that could detect the theft. Useful features of ITAC include its ability to (1) evaluate and quantify the timeliness of material accounting tests, (2) analyze branching systems of physical pathways and adversary strategies, (3) analyze trickle or abrupt theft situations for combinations of insiders, (4) accept input probabilities and times in the form of ranges rather than discrete points, and (5) simulate input data using Monte Carlo methods to produce statistically distributed aggregate delay times and detection probabilities. The ITAC program was developed by the Security Applications Center of Westinghouse Hanford Company and Boeing Computer Services, Richland, WA

  19. Computational programs for shielding calculations with one-dimensional, monoenergetic SN transport

    International Nuclear Information System (INIS)

    Nunes, Carlos Eduardo A.; Barros, Ricardo C.

    2009-01-01

    This paper describes a computational program for simulating neutron transport problems at one velocity with isotropic scattering in Cartesian one-dimensional geometry. After the physical modelling, the next phase is the mathematical modelling of the physical problem for simulation of the neutron distribution. The mathematical modelling uses the linearized Boltzmann equation, which represents a balance between the production and loss of particles. The discrete ordinates (SN) formulation consists of the discretization of the angular variable into N directions (discrete ordinates) and the use of a set of angular quadratures for the approximation of the integral scattering-source terms. The SN equations are solved numerically. This work describes three numerical methods: diamond difference, step, and characteristic step. The paper also presents numerical results to illustrate the efficiency of the developed program.
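
    A minimal sketch of the diamond-difference scheme for a one-group, one-dimensional slab with isotropic scattering and vacuum boundaries is given below; the cross sections and mesh are arbitrary, and the sketch is illustrative rather than a reproduction of the documented program.

        import numpy as np

        # One-group, 1-D slab, isotropic scattering, vacuum boundaries.
        # Diamond-difference S_N sweep with source iteration (illustrative parameters only).
        nx, L = 50, 10.0
        dx = L / nx
        sig_t, sig_s = 1.0, 0.5          # total and scattering cross sections (1/cm), hypothetical
        Q = np.ones(nx)                  # flat external source
        N = 8                            # S_8 quadrature
        mu, w = np.polynomial.legendre.leggauss(N)

        phi = np.zeros(nx)
        for _ in range(200):             # source iterations
            q = 0.5 * (sig_s * phi + Q)  # isotropic emission density per unit direction cosine
            phi_new = np.zeros(nx)
            for m in range(N):
                psi_in = 0.0             # vacuum boundary
                cells = range(nx) if mu[m] > 0 else range(nx - 1, -1, -1)
                for i in cells:
                    # Diamond difference: cell-average flux from incoming edge flux and source.
                    psi = (q[i] * dx + 2.0 * abs(mu[m]) * psi_in) / (sig_t * dx + 2.0 * abs(mu[m]))
                    psi_in = 2.0 * psi - psi_in   # outgoing edge flux feeds the next cell
                    phi_new[i] += w[m] * psi
            if np.max(np.abs(phi_new - phi)) < 1e-6:
                break
            phi = phi_new
        print(phi[:5])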

  20. Design and Curriculum Considerations for a Computer Graphics Program in the Arts.

    Science.gov (United States)

    Leeman, Ruedy W.

    This history and state-of-the-art review of computer graphics describes computer graphics programs and proposed programs at Sheridan College (Canada), the Rhode Island School of Design, the University of Oregon, Northern Illinois University, and Ohio State University. These programs are discussed in terms of their philosophy, curriculum, student…

  1. Overhead Crane Computer Model

    Science.gov (United States)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

    The paper describes a computer model of an overhead crane system. The designed overhead crane system consists of hoisting, trolley and crane mechanisms as well as a two-axis payload system. With the help of the differential equations of motion of the specified mechanisms, derived through the Lagrange equation of the second kind, it is possible to build an overhead crane computer model. The computer model was obtained using Matlab software. Transients of coordinate, linear speed and motor torque of the trolley and crane mechanism systems were simulated. In addition, transients of payload swaying were obtained with respect to the vertical axis. A trajectory of the trolley mechanism operating simultaneously with the crane mechanism is presented in the paper, as well as a two-axis trajectory of the payload. The designed computer model of an overhead crane is a useful means for studying positioning control and anti-sway control systems.
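
    A much-reduced version of such a model, with the trolley driving a payload treated as a single pendulum and with hoisting and the second axis ignored, follows from the Lagrange equations of the second kind and can be integrated numerically; the parameters and drive force below are arbitrary.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Trolley (mass M) plus payload pendulum (mass m, rope length l); theta is the sway
        # angle from the vertical. The two equations of motion come from the Lagrange
        # equations of the second kind for this reduced system. Parameters are arbitrary.
        M, m, l, g = 100.0, 20.0, 5.0, 9.81

        def force(t):
            return 200.0 if t < 2.0 else 0.0     # hypothetical drive-force profile (N)

        def rhs(t, y):
            x, xdot, th, thdot = y
            F = force(t)
            xddot = (F + m * l * thdot**2 * np.sin(th) + m * g * np.sin(th) * np.cos(th)) \
                    / (M + m * np.sin(th)**2)
            thddot = -(xddot * np.cos(th) + g * np.sin(th)) / l
            return [xdot, xddot, thdot, thddot]

        sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0, 0.0, 0.0], max_step=0.01)
        print("final trolley position:", sol.y[0, -1], "m; final sway angle:", sol.y[2, -1], "rad")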

  2. A phenomenographic study of the ways of understanding conditional and repetition structures in computer programming languages

    Science.gov (United States)

    Bucks, Gregory Warren

    Computers have become an integral part of how engineers complete their work, allowing them to collect and analyze data, model potential solutions and aiding in production through automation and robotics. In addition, computers are essential elements of the products themselves, from tennis shoes to construction materials. An understanding of how computers function, both at the hardware and software level, is essential for the next generation of engineers. Despite the need for engineers to develop a strong background in computing, little opportunity is given for engineering students to develop these skills. Learning to program is widely seen as a difficult task, requiring students to develop not only an understanding of specific concepts, but also a way of thinking. In addition, students are forced to learn a new tool, in the form of the programming environment employed, along with these concepts and thought processes. Because of this, many students will not develop a sufficient proficiency in programming, even after progressing through the traditional introductory programming sequence. This is a significant problem, especially in the engineering disciplines, where very few students receive more than one or two semesters' worth of instruction in an already crowded engineering curriculum. To address these issues, new pedagogical techniques must be investigated in an effort to enhance the ability of engineering students to develop strong computing skills. However, these efforts are hindered by the lack of published assessment instruments available for probing an individual's understanding of programming concepts across programming languages. Traditionally, programming knowledge has been assessed by producing written code in a specific language. This can be an effective method, but does not lend itself well to comparing the pedagogical impact of different programming environments, languages or paradigms. This dissertation presents a phenomenographic research study

  3. COMMIX-1AR/P: A three-dimensional transient single-phase computer program for thermal hydraulic analysis of single and multicomponent systems

    International Nuclear Information System (INIS)

    Garner, P.L.; Blomquist, R.N.; Gelbard, E.M.

    1992-09-01

    The COMMIX-1AR/P computer program is designed for analyzing the steady-state and transient aspects of single-phase fluid flow and heat transfer in three spatial dimensions. This version is an extension of the modeling in COMMIX-1A to include multiple fluids in physically separate regions of the computational domain, modeling descriptions for pumps, radiation heat transfer between surfaces of the solids which are embedded in or surround the fluid, a k-ε model for fluid turbulence, and improved numerical techniques. The porous-medium formulation in COMMIX allows the program to be applied to a wide range of problems involving both simple and complex geometrical arrangements. The input preparation and execution procedures are presented for the COMMIX-1AR/P program and several postprocessor programs which produce graphical displays of the calculated results

  4. COMMIX-1AR/P: A three-dimensional transient single-phase computer program for thermal hydraulic analysis of single and multicomponent systems

    Energy Technology Data Exchange (ETDEWEB)

    Garner, P.L.; Blomquist, R.N.; Gelbard, E.M.

    1992-09-01

    The COMMIX-1AR/P computer program is designed for analyzing the steady-state and transient aspects of single-phase fluid flow and heat transfer in three spatial dimensions. This version is an extension of the modeling in COMMIX-1A to include multiple fluids in physically separate regions of the computational domain, modeling descriptions for pumps, radiation heat transfer between surfaces of the solids which are embedded in or surround the fluid, a k-ε model for fluid turbulence, and improved numerical techniques. The porous-medium formulation in COMMIX allows the program to be applied to a wide range of problems involving both simple and complex geometrical arrangements. The input preparation and execution procedures are presented for the COMMIX-1AR/P program and several postprocessor programs which produce graphical displays of the calculated results.

  5. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    This paper presents an extension to the traditional room acoustic modelling methods allowing computer modelling of huge machinery in industrial spaces. The program in question is Odeon 3.0 Industrial and Odeon 3.0 Combined which allows the modelling of point sources, surface sources and line...... of an omnidirectional sound source and a microphone. This allows the comparison of simulated results with the ones measured in real rooms. However when simulating the acoustic environment in industrial rooms, the sound sources are often far from being point like, as they can be distributed over a large space...

  6. Survey of using GPU CUDA programming model in medical image analysis

    Directory of Open Access Journals (Sweden)

    T. Kalaiselvi

    2017-01-01

    With the technological development of the medical industry, the data to be processed are expanding rapidly and computation time also increases due to many factors, such as 3D and 4D treatment planning, the increasing sophistication of MRI pulse sequences and the growing complexity of algorithms. The graphics processing unit (GPU) addresses these problems and provides solutions through features such as high computation throughput, high memory bandwidth, support for floating-point arithmetic and low cost. Compute Unified Device Architecture (CUDA) is a popular GPU programming model introduced by NVIDIA for parallel computing. This review paper briefly discusses the need for GPU CUDA computing in medical image analysis. The GPU performance of existing algorithms is analyzed and the computational gain is discussed. A few open issues, hardware configurations and optimization principles of existing methods are discussed. The survey concludes with optimization techniques for medical imaging algorithms on the GPU. Finally, the limitations and future scope of GPU programming are discussed.
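
    A toy example of the programming model, written with Numba's CUDA bindings as an assumed stand-in for the CUDA C implementations the survey covers, launches one thread per pixel for a simple threshold operation.

        import numpy as np
        from numba import cuda   # assumes Numba with CUDA support and an NVIDIA GPU are available

        @cuda.jit
        def threshold_kernel(img, out, level):
            # One thread per pixel: a simple intensity threshold, standing in for a
            # medical image-analysis step.
            i, j = cuda.grid(2)
            if i < img.shape[0] and j < img.shape[1]:
                out[i, j] = 1.0 if img[i, j] > level else 0.0

        img = np.random.rand(512, 512).astype(np.float32)   # placeholder "image"
        d_img = cuda.to_device(img)
        d_out = cuda.device_array(img.shape, dtype=np.float32)

        threads = (16, 16)
        blocks = ((img.shape[0] + threads[0] - 1) // threads[0],
                  (img.shape[1] + threads[1] - 1) // threads[1])
        threshold_kernel[blocks, threads](d_img, d_out, 0.5)
        print(d_out.copy_to_host().sum())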

  7. TET2MCNP: A conversion program to implement tetrahedral-mesh models in MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Han, Min Cheol; Yeom, Yeon Soo; Nguyen, Thng Tat; Choi, Chan Soo; Lee, Hyun Su; Kim, Chan Hyeong [Dept. of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of)

    2016-12-15

    Tetrahedral-mesh geometries can be used in the MCNP code, but the MCNP code accepts only geometry in the Abaqus input file format; hence, existing tetrahedral-mesh models first need to be converted to the Abaqus input file format to be used in the MCNP code. In the present study, we developed a simple but useful computer program, TET2MCNP, for converting TetGen-generated tetrahedral-mesh models to the Abaqus input file format. TET2MCNP is written in C++ and contains two components: one for converting a TetGen output file to an Abaqus input file and the other for the reverse conversion process. The TET2MCNP program also produces an MCNP input file. Further, the program provides some MCNP-specific functions: the maximum number of elements (i.e., tetrahedrons) per part can be limited, and the material density of each element can be transferred to the MCNP input file. To test the developed program, two tetrahedral-mesh models were generated using TetGen and converted to the Abaqus input file format using TET2MCNP. Subsequently, the converted files were used in the MCNP code to calculate the object- and organ-averaged absorbed dose in the sphere and phantom, respectively. The results show that the converted models provide, within statistical uncertainties, identical dose values to those obtained using the PHITS code, which uses the original tetrahedral-mesh models produced by the TetGen program. The results show that the developed program can successfully convert TetGen tetrahedral-mesh models to Abaqus input files. In the present study, we have developed a computer program, TET2MCNP, which can be used to convert TetGen-generated tetrahedral-mesh models to the Abaqus input file format for use in the MCNP code. We believe this program will be used by many MCNP users for implementing complex tetrahedral-mesh models, including computational human phantoms, in the MCNP code.
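
    The core of such a conversion, reading TetGen .node/.ele files and emitting Abaqus-style *NODE/*ELEMENT records, can be sketched in a few lines; the file-format details follow common TetGen and Abaqus conventions and are assumptions rather than a description of the authors' C++ code.

        def read_tetgen(basename):
            # Read TetGen .node and .ele files (assumed format: a header line, then one
            # record per line; comment lines starting with '#' are skipped).
            def rows(path):
                with open(path) as f:
                    return [ln.split() for ln in f if ln.strip() and not ln.startswith("#")]
            node_rows, ele_rows = rows(basename + ".node"), rows(basename + ".ele")
            nodes = {int(r[0]): tuple(map(float, r[1:4])) for r in node_rows[1:]}
            elems = {int(r[0]): tuple(map(int, r[1:5])) for r in ele_rows[1:]}
            return nodes, elems

        def write_abaqus(nodes, elems, path):
            # Emit a minimal Abaqus-style mesh: *NODE and first-order tetrahedra (C3D4).
            with open(path, "w") as f:
                f.write("*NODE\n")
                for nid, (x, y, z) in sorted(nodes.items()):
                    f.write(f"{nid}, {x}, {y}, {z}\n")
                f.write("*ELEMENT, TYPE=C3D4, ELSET=Part1\n")
                for eid, conn in sorted(elems.items()):
                    f.write(f"{eid}, {conn[0]}, {conn[1]}, {conn[2]}, {conn[3]}\n")

        nodes, elems = read_tetgen("phantom")     # expects hypothetical phantom.node / phantom.ele
        write_abaqus(nodes, elems, "phantom.inp")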

  8. A computer program for the pointwise functions generation

    International Nuclear Information System (INIS)

    Caldeira, Alexandre D.

    1995-01-01

    A computer program that was developed with the objective of generating pointwise functions, by a combination of tabulated values and/or mathematical expressions, to be used as weighting functions for nuclear data is presented. This simple program can be an important tool for researchers involved in group constants generation. (author). 5 refs, 2 figs

  9. Phenomenological optical potentials and optical model computer codes

    International Nuclear Information System (INIS)

    Prince, A.

    1980-01-01

    An introduction to the optical model is presented. Starting with the purpose and nature of the physical problems to be analyzed, a general formulation and the various phenomenological methods of solution are discussed. This includes the calculation of observables based on assumed potentials, both local and non-local, and their forms, e.g. Woods-Saxon, folded model, etc. Also discussed are the various calculational methods and model codes employed to describe nuclear reactions in the spherical and deformed regions (e.g. coupled-channel analysis). An examination of the numerical solutions and minimization techniques associated with the various codes is briefly touched upon. Several computer programs are described for carrying out the calculations. The preparation of input (formats and options), determination of model parameters and analysis of output are described. The class is given a series of problems to carry out using the available computer. Interpretation and evaluation of the samples include the effect of varying parameters and comparison of calculations with the experimental data. Also included is an intercomparison of the results from the various model codes, along with their advantages and limitations. (author)
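
    For reference, the Woods-Saxon form mentioned above is straightforward to evaluate; the depth, radius and diffuseness values below are typical-looking placeholders rather than fitted parameters.

        import numpy as np

        def woods_saxon(r, V0=50.0, r0=1.25, a=0.65, A=56):
            # Real central Woods-Saxon potential V(r) = -V0 / (1 + exp((r - R)/a)),
            # with nuclear radius R = r0 * A**(1/3). All parameter values are illustrative.
            R = r0 * A ** (1.0 / 3.0)
            return -V0 / (1.0 + np.exp((r - R) / a))

        r = np.linspace(0.0, 12.0, 7)     # radii in fm
        print(woods_saxon(r))             # potential in MeV at each radius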

  10. Positioning Continuing Education Computer Programs for the Corporate Market.

    Science.gov (United States)

    Tilney, Ceil

    1993-01-01

    Summarizes the findings of the market assessment phase of Bellevue Community College's evaluation of its continuing education computer training program. Indicates that marketing efforts must stress program quality and software training to help overcome strong antiacademic client sentiment. (MGB)

  11. Improving science and mathematics education with computational modelling in interactive engagement environments

    Science.gov (United States)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  12. An introduction to programming multiple-processor computers

    International Nuclear Information System (INIS)

    Hicks, H.R.; Lynch, V.E.

    1986-01-01

    Fortran application programs can be executed on multiprocessor computers in either a unitasking (traditional) or multitasking form. The latter allows a single job to use more than one processor simultaneously, with a consequent reduction in elapsed time and, perhaps, in the cost of the calculation. An introduction to programming in this environment is presented. The concepts of synchronization and data sharing using EVENTS and LOCKS are illustrated with examples. The strategy of strong synchronization and the use of synchronization templates are proposed. We emphasize that incorrect multitasking programs can produce irreproducible results, which makes debugging more difficult
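
    The role of a LOCK can be illustrated, in Python threads rather than multitasked Fortran, by protecting a shared counter; without the lock the final value may differ from run to run.

        import threading

        counter = 0
        lock = threading.Lock()

        def worker(n):
            global counter
            for _ in range(n):
                with lock:        # critical section: only one task updates the shared data at a time
                    counter += 1

        threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        print(counter)            # deterministically 400000 because the updates are synchronized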

  13. Innovative Partnerships Assist Community College Computing Programs.

    Science.gov (United States)

    O'Banion, Terry

    1987-01-01

    Relates efforts of major corporations in providing assistance to community college computing programs. Explains the goals of the League for Innovation in the Community College, a consortium of 19 community colleges, and cites examples of collaborative projects. (ML)

  14. Development of a 3-D flow analysis computer program for integral reactor

    International Nuclear Information System (INIS)

    Youn, H. Y.; Lee, K. H.; Kim, H. K.; Whang, Y. D.; Kim, H. C.

    2003-01-01

    A 3-D computational fluid dynamics program, TASS-3D, is being developed for the flow analysis of a primary coolant system consisting of complex geometries, such as that of SMART. A pre/post processor is also being developed to reduce the pre/post-processing work, such as computational grid generation, set-up of the analysis conditions and analysis of the calculated results. The TASS-3D solver employs a non-orthogonal coordinate system and an FVM based on a non-staggered grid system. The program includes various models to simulate the physical phenomena expected to occur in the integral reactor and will be coupled with the core dynamics code, core T/H code and the secondary system code modules. Currently, the application of TASS-3D is limited to the single phase of liquid, but the code will be further developed to include the 2-phase phenomena expected for normal operation and the various transients of the integral reactor in the next stage

  15. Optimization and mathematical modeling in computer architecture

    CERN Document Server

    Sankaralingam, Karu; Nowatzki, Tony

    2013-01-01

    In this book we give an overview of modeling techniques used to describe computer systems to mathematical optimization tools. We give a brief introduction to various classes of mathematical optimization frameworks with special focus on mixed integer linear programming, which provides a good balance between solver time and expressiveness. We present four detailed case studies -- instruction set customization, data center resource management, spatial architecture scheduling, and resource allocation in tiled architectures -- showing how MILP can be used and quantifying by how much it outperforms t…
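
    A tiny illustration of casting a resource-allocation question as a MILP, using the PuLP modeling library as an assumed front end rather than anything prescribed by the book, is to choose which kernels to map onto an accelerator of limited area; the names, areas and speedups are invented.

        # Toy MILP: pick which kernels to map onto an accelerator of limited area so that
        # total speedup is maximized. PuLP is used here as an assumed, readily available
        # MILP front end; the kernel names, areas and speedups are invented.
        from pulp import LpProblem, LpMaximize, LpVariable, lpSum, value

        kernels = {"fft": (3, 5.0), "conv": (4, 7.5), "sort": (2, 2.0), "hash": (1, 1.5)}  # (area, speedup)
        area_budget = 6

        prob = LpProblem("accelerator_mapping", LpMaximize)
        x = {k: LpVariable(f"use_{k}", cat="Binary") for k in kernels}

        prob += lpSum(kernels[k][1] * x[k] for k in kernels)                 # objective: total speedup
        prob += lpSum(kernels[k][0] * x[k] for k in kernels) <= area_budget  # area constraint

        prob.solve()
        print({k: int(value(x[k])) for k in kernels}, "objective =", value(prob.objective))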

  16. Simulation model for wind energy storage systems. Volume III. Program descriptions. [SIMWEST CODE

    Energy Technology Data Exchange (ETDEWEB)

    Warren, A.W.; Edsinger, R.W.; Burroughs, J.D.

    1977-08-01

    The effort developed a comprehensive computer program for the modeling of wind energy/storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic). An acronym for the program is SIMWEST (Simulation Model for Wind Energy Storage). The level of detail of SIMWEST is consistent with a role of evaluating the economic feasibility as well as the general performance of wind energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. Volume III, the SIMWEST program description, contains program descriptions, flow charts and program listings for the SIMWEST Model Generation Program, the Simulation program, the File Maintenance program and the Printer Plotter program. Volume III would generally not be required by the SIMWEST user.

  17. Computational Hydrodynamics: How Portable and Scalable Are Heterogeneous Programming Paradigms?

    DEFF Research Database (Denmark)

    Pawlak, Wojciech; Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter

    New many-core era applications at the interface of mathematics and computer science adopt modern parallel programming paradigms and expose parallelism through proper algorithms. We present new performance results for a novel massively parallel free surface wave model suitable for advanced......-device system sizes from desktops to large HPC systems such as superclusters and in the cloud utilizing heterogeneous devices like multi-core CPUs, GPUs, and Xeon Phi coprocessors. The numerical efficiency is evaluated on heterogeneous devices like multi-core CPUs, GPUs and Xeon Phi coprocessors to test...

  18. Pair Programming as a Modern Method of Teaching Computer Science

    OpenAIRE

    Irena Nančovska Šerbec; Branko Kaučič; Jože Rugelj

    2008-01-01

    At the Faculty of Education, University of Ljubljana we educate future computer science teachers. Beside didactical, pedagogical, mathematical and other interdisciplinary knowledge, students gain knowledge and skills of programming that are crucial for computer science teachers. For all courses, the main emphasis is the absorption of professional competences, related to the teaching profession and the programming profile. The latter are selected according to the well-known document, the ACM C...

  19. Computer program for calculation of ideal gas thermodynamic data

    Science.gov (United States)

    Gordon, S.; Mc Bride, B. J.

    1968-01-01

    Computer program calculates ideal gas thermodynamic properties for any species for which molecular constant data are available. Partition functions and derivatives from formulas based on statistical mechanics are provided by the program, which is written in FORTRAN 4 and MAP.

  20. 78 FR 15731 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-03-12

    ... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2013-0011] Privacy Act of 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... amended by the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503) and the Computer...

  1. The 12-th INS scientific computational programs

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    This issue is the collection of the paper on INS scientific computational programs. Separate abstracts were presented for 3 of the papers in this report. The remaining 5 were considered outside the subject scope of INIS. (J.P.N.)

  2. HAMOC: a computer program for fluid hammer analysis

    International Nuclear Information System (INIS)

    Johnson, H.G.

    1975-12-01

    A computer program has been developed for fluid hammer analysis of piping systems attached to a vessel which has undergone a known rapid pressure transient. The program is based on the characteristics method for solution of the partial differential equations of motion and continuity. Column separation logic is included for situations in which pressures fall to saturation values

  3. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo eLee

    2013-12-01

    We present a computational model capable of predicting, above human accuracy, the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strengths of experimental manipulation and machine learning not only to design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.

  4. Multi-objective reverse logistics model for integrated computer waste management.

    Science.gov (United States)

    Ahluwalia, Poonam Khanijo; Nema, Arvind K

    2006-12-01

    This study aimed to address the issues involved in the planning and design of a computer waste management system in an integrated manner. A decision-support tool is presented for selecting an optimum configuration of computer waste management facilities (segregation, storage, treatment/processing, reuse/recycle and disposal) and allocation of waste to these facilities. The model is based on an integer linear programming method with the objectives of minimizing environmental risk as well as cost. The issue of uncertainty in the estimated waste quantities from multiple sources is addressed using the Monte Carlo simulation technique. An illustrated example of computer waste management in Delhi, India is presented to demonstrate the usefulness of the proposed model and to study tradeoffs between cost and risk. The results of the example problem show that it is possible to reduce the environmental risk significantly by a marginal increase in the available cost. The proposed model can serve as a powerful tool to address the environmental problems associated with exponentially growing quantities of computer waste which are presently being managed using rudimentary methods of reuse, recovery and disposal by various small-scale vendors.

  5. Case Study: Creation of a Degree Program in Computer Security. White Paper.

    Science.gov (United States)

    Belon, Barbara; Wright, Marie

    This paper reports on research into the field of computer security, and undergraduate degrees offered in that field. Research described in the paper reveals only one computer security program at the associate's degree level in the entire country. That program, at Texas State Technical College in Waco, is a 71-credit-hour program leading to an…

  6. Comparison of structural computer programs used for the analysis of spent fuel shipping casks

    International Nuclear Information System (INIS)

    Friley, J.R.

    1984-09-01

    Several structural analysis computer programs were selected and used in analyses relevant to the hypothetical impact requirements for spent fuel shipping cask designs. The objective of the study was to evaluate the computer codes by performing a series of analyses and comparing results. The code evaluation efforts treated end and side impact situations only. As a result, the models were either one or two dimensional. Both clad lead and solid wall construction types were considered. For clad lead models, frictionless sliding between the lead and cladding was assumed. General agreement was achieved between the codes for problems involving non-clad models. For clad models, agreement between the codes was poor. This work was sponsored by the Department of Energy through the Transportation Technology Center at Sandia National Laboratories, Albuquerque, New Mexico. 16 references, 18 figures, 9 tables

  7. FLANGE: a computer program for the analysis of flanged joints with ring-type gaskets

    International Nuclear Information System (INIS)

    Rodabaugh, E.C.; O'Hara, F.M. Jr.; Moore, S.E.

    1976-01-01

    The computer program FLANGE was written to calculate not only the stresses due to moment loads on the flange ring but also stresses due to internal pressure; stresses due to a temperature difference between the hub and ring; and stresses due to the variations in bolt load that result from pressure, hub-ring temperature gradient, and/or bolt-ring temperature difference. The program FLANGE is applicable to tapered-hub, straight, and blind flanges. The analysis method is based on the differential equations for thin plates and shells. The stresses due to moment loading calculated by the two methods are essentially identical for identical boundary conditions. A description of the general model of flanges used in the theoretical development of the computer code is provided. The actual mathematical expressions for calculating stresses and displacements due to moment and pressure loads are derived. Example calculations, listings, and flowcharts of the program and its subroutines are included as appendices

  8. 78 FR 73195 - Privacy Act of 1974: CMS Computer Matching Program Match No. 2013-01; HHS Computer Matching...

    Science.gov (United States)

    2013-12-05

    ... 1974: CMS Computer Matching Program Match No. 2013-01; HHS Computer Matching Program Match No. 1312 AGENCY: Centers for Medicare & Medicaid Services (CMS), Department of Health and Human Services (HHS... Privacy Act of 1974 (5 U.S.C. 552a), as amended, this notice announces the renewal of a CMP that CMS plans...

  9. An object-oriented programming paradigm for parallelization of computational fluid dynamics

    International Nuclear Information System (INIS)

    Ohta, Takashi.

    1997-03-01

    We propose an object-oriented programming paradigm for the parallelization of scientific computing programs and show that the approach can be a very useful strategy. Generally, parallelization of scientific programs tends to be complicated and unportable because of the specific requirements of each parallel computer or compiler. In this paper, we show that an object-oriented design, which separates the parallel-processing parts from the solver of the application, can achieve a large improvement in the maintainability of the code as well as high portability. We design a program for the two-dimensional Euler equations according to the paradigm and evaluate its parallel performance on an IBM SP2. (author)
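
    The design idea, keeping every parallel-processing concern behind an interface so that the solver never references a particular parallel computer or compiler, can be sketched as follows. This is a generic, single-process illustration in Python (the paper's own code solves the two-dimensional Euler equations on an IBM SP2); the class and method names are invented.

        from abc import ABC, abstractmethod
        import numpy as np

        class Parallelizer(ABC):
            """Encapsulates all parallel-processing concerns (halo exchange, reductions)."""
            @abstractmethod
            def exchange_halo(self, u): ...
            @abstractmethod
            def global_max(self, x): ...

        class SerialParallelizer(Parallelizer):
            """Single-process stand-in; an MPI version could be swapped in unchanged."""
            def exchange_halo(self, u):
                return u                      # nothing to exchange on one process
            def global_max(self, x):
                return x

        class HeatSolver:
            """The solver knows nothing about how (or whether) the run is parallel."""
            def __init__(self, u0, par: Parallelizer, alpha=0.1):
                self.u, self.par, self.alpha = u0.astype(float), par, alpha
            def step(self, dt, dx):
                u = self.par.exchange_halo(self.u)
                lap = np.zeros_like(u)
                lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
                self.u = u + self.alpha * dt * lap
                return self.par.global_max(np.abs(self.u).max())

        # the solver code is unchanged if SerialParallelizer is replaced by an MPI version
        solver = HeatSolver(np.linspace(0.0, 1.0, 11), SerialParallelizer())
        residual = solver.step(dt=1e-3, dx=0.1)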

  10. Analytical SN solutions in heterogeneous slabs using symbolic algebra computer programs

    International Nuclear Information System (INIS)

    Warsa, J.S.

    2002-01-01

    A modern symbolic algebra computer program, MAPLE, is used to compute the well-known analytical discrete ordinates, or SN, solutions in one-dimensional slab geometry. Symbolic algebra programs compute the solutions with arbitrary precision and are free of spatial discretization error, so they can be used to investigate new discretizations for one-dimensional, slab-geometry SN methods. Pointwise scalar flux solutions are computed for several sample calculations of interest. Sample MAPLE command scripts are provided to illustrate how easily the theory can be translated into a working solution; they serve as a complete tool capable of computing analytical SN solutions for mono-energetic, one-dimensional transport problems.

  11. Computer-Assisted Language Learning: Current Programs and Projects. ERIC Digest.

    Science.gov (United States)

    Higgins, Chris

    For many years, foreign language teachers have used the computer to provide supplemental exercises in the instruction of foreign languages. In recent years, advances in computer technology have motivated teachers to reassess the computer and consider it a valuable part of daily foreign language learning. Innovative software programs, authoring…

  12. 78 FR 1275 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-01-08

    ... Social Security Administration (Computer Matching Agreement 1071). SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer Matching and Privacy Protection Act of... of its new computer matching program with the Social Security Administration (SSA). DATES: OPM will...

  13. SYNCOM: A general syntax conversion language and computer program

    International Nuclear Information System (INIS)

    Bindon, D.C.

    1972-09-01

    The problems of syntax conversion are discussed, and the reasons are given for the choice of the interpretive method. A full description is given of the SYNCOM language and computer program, together with brief details of some programs written in the language. (author)

  14. The Application of Visual Basic Computer Programming Language to Simulate Numerical Iterations

    Directory of Open Access Journals (Sweden)

    Abdulkadir Baba HASSAN

    2006-06-01

    Full Text Available This paper examines the application of the Visual Basic computer programming language to simulate numerical iterations, the merits of Visual Basic as a programming language, and the difficulties faced when solving numerical iterations analytically. The paper encourages the use of computer programming methods for executing numerical iterations and develops a reliable solution, written in Visual Basic, for some selected iteration problems.
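
    For concreteness, the two iteration schemes most often used in such exercises, fixed-point iteration and Newton-Raphson, can be written in a few lines. The sketch below is in Python rather than Visual Basic, and the test equation is chosen arbitrarily.

        def fixed_point(g, x0, tol=1e-10, max_iter=100):
            """Fixed-point iteration x_{k+1} = g(x_k)."""
            x = x0
            for k in range(max_iter):
                x_new = g(x)
                if abs(x_new - x) < tol:
                    return x_new, k + 1
                x = x_new
            raise RuntimeError("did not converge")

        def newton(f, df, x0, tol=1e-12, max_iter=50):
            """Newton-Raphson iteration x_{k+1} = x_k - f(x_k)/f'(x_k)."""
            x = x0
            for k in range(max_iter):
                step = f(x) / df(x)
                x -= step
                if abs(step) < tol:
                    return x, k + 1
            raise RuntimeError("did not converge")

        # root of x^3 - x - 2 = 0 (about 1.5214) by both schemes
        root_fp, its_fp = fixed_point(lambda x: (x + 2) ** (1 / 3), 1.0)
        root_nr, its_nr = newton(lambda x: x**3 - x - 2, lambda x: 3 * x**2 - 1, 1.0)
        print(root_fp, its_fp, root_nr, its_nr)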

  15. Computer program for calculation of complex chemical equilibrium compositions and applications. Part 1: Analysis

    Science.gov (United States)

    Gordon, Sanford; Mcbride, Bonnie J.

    1994-01-01

    This report presents the latest in a number of versions of chemical equilibrium and applications programs developed at the NASA Lewis Research Center over more than 40 years. These programs have changed over the years to include additional features and improved calculation techniques and to take advantage of constantly improving computer capabilities. The minimization-of-free-energy approach to chemical equilibrium calculations has been used in all versions of the program since 1967. The two principal purposes of this report are presented in two parts. The first purpose, which is accomplished here in part 1, is to present in detail a number of topics of general interest in complex equilibrium calculations. These topics include mathematical analyses and techniques for obtaining chemical equilibrium; formulas for obtaining thermodynamic and transport mixture properties and thermodynamic derivatives; criteria for inclusion of condensed phases; calculations at a triple point; inclusion of ionized species; and various applications, such as constant-pressure or constant-volume combustion, rocket performance based on either a finite- or infinite-chamber-area model, shock wave calculations, and Chapman-Jouguet detonations. The second purpose of this report, to facilitate the use of the computer code, is accomplished in part 2, entitled 'Users Manual and Program Description'. Various aspects of the computer code are discussed, and a number of examples are given to illustrate its versatility.
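
    The minimization-of-free-energy idea can be illustrated with a toy problem: minimize the dimensionless Gibbs energy of an ideal-gas mixture subject to element balances. The sketch below uses a general-purpose constrained optimizer rather than the report's own iteration scheme, and the standard-state chemical potentials are placeholder values, not NASA thermodynamic data.

        import numpy as np
        from scipy.optimize import minimize

        species = ["H2", "O2", "H2O"]
        # dimensionless standard-state chemical potentials mu0/RT -- placeholder values
        mu0_RT = np.array([0.0, 0.0, -35.0])
        # element matrix: rows = elements (H, O), columns = species
        A = np.array([[2, 0, 2],
                      [0, 2, 1]], float)
        b = A @ np.array([1.0, 0.5, 0.0])   # start from 1 mol H2 + 0.5 mol O2
        P = 1.0                              # pressure (bar), ideal gas assumed

        def gibbs(n):
            ntot = n.sum()
            return np.sum(n * (mu0_RT + np.log(P) + np.log(n / ntot)))

        cons = {"type": "eq", "fun": lambda n: A @ n - b}   # element conservation
        res = minimize(gibbs, x0=np.full(3, 0.3), bounds=[(1e-10, None)] * 3,
                       constraints=cons, method="SLSQP")
        print(dict(zip(species, res.x.round(6))))           # essentially all H2O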

  16. Computer programs for developing source terms for a UF{sub 6} dispersion model to simulate postulated UF{sub 6} releases from buildings

    Energy Technology Data Exchange (ETDEWEB)

    Williams, W.R.

    1985-03-01

    Calculational methods and computer programs for the analysis of source terms for postulated releases of UF{sub 6} are presented. Required thermophysical properties of UF{sub 6}, HF, and H{sub 2}O are described in detail. UF{sub 6} reacts with moisture in the ambient environment to form HF and UO{sub 2}F{sub 2}. The coexistence of HF and H{sub 2}O significantly alters their pure-component properties, and HF vapor polymerizes. Transient compartment models for simulating UF{sub 6} releases inside gaseous diffusion plant feed and withdrawal buildings and cascade buildings are also described. The basic compartment-model mass and energy balances are supported by simple heat transfer, ventilation system, and deposition models. A model that can simulate either a closed compartment or a steady-state ventilation system is also discussed. The output of the transient compartment models provides source-term input to an atmospheric dispersion model.
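
    The compartment-model idea, a well-mixed volume whose airborne inventory is driven by a release source and removed by ventilation, reduces to a single mass balance per compartment. The sketch below integrates one such balance with an explicit Euler step; the volume, ventilation rate, and puff release are assumed values, and the chemistry, deposition, and multi-compartment coupling of the actual programs are omitted.

        import numpy as np

        V, Q_vent = 5.0e4, 10.0            # free volume (m^3), ventilation flow (m^3/s)

        def release(t):                    # source term (kg/s): 10-minute puff
            return 2.0 if t < 600.0 else 0.0

        dt, t_end = 1.0, 3600.0
        t = np.arange(0.0, t_end + dt, dt)
        C = np.zeros_like(t)               # airborne concentration (kg/m^3)
        for k in range(1, len(t)):
            dCdt = release(t[k - 1]) / V - (Q_vent / V) * C[k - 1]
            C[k] = C[k - 1] + dt * dCdt    # explicit Euler step
        mass_exhausted = Q_vent * np.sum(C) * dt        # kg vented to the atmosphere
        print(f"peak concentration {C.max():.2e} kg/m^3, "
              f"{mass_exhausted:.1f} kg exhausted in 1 h")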

  17. Electromagnetic Physics Models for Parallel Computing Architectures

    Science.gov (United States)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Duhem, L.; Elvira, D.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2016-10-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and type of parallelization needed to achieve optimal performance. In this paper we describe implementation of electromagnetic physics models developed for parallel computing architectures as a part of the GeantV project. Results of preliminary performance evaluation and physics validation are presented as well.

  18. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, image-based clinical segmentation, and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, the ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  19. Fuzzy Goal Programming Approach in Selective Maintenance Reliability Model

    Directory of Open Access Journals (Sweden)

    Neha Gupta

    2013-12-01

    Full Text Available In the present paper, we have considered the allocation problem of repairable components for a parallel-series system as a multi-objective optimization problem and have discussed two different models. In the first model, the reliabilities of the subsystems are considered as different objectives. In the second model, the cost and the time spent on repairing the components are considered as two different objectives. These two models are formulated as multi-objective nonlinear programming problems (MONLPP), and a fuzzy goal programming method is used to work out the compromise allocation in the multi-objective selective maintenance reliability model: we define the membership function of each objective function, transform the membership functions into equivalent linear membership functions by a first-order Taylor series, and finally, by forming a fuzzy goal programming model, obtain a desired compromise allocation of maintenance components. A numerical example is also worked out to illustrate the computational details of the method.
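
    In its standard max-min (Zimmermann-type) form, which appears to be the backbone of the approach described above, each objective f_k is given a linear membership function between an aspiration level L_k and an upper tolerance U_k, and a single auxiliary variable is maximized:

        \mu_k\bigl(f_k(x)\bigr)=
        \begin{cases}
          1, & f_k(x)\le L_k,\\
          \dfrac{U_k-f_k(x)}{U_k-L_k}, & L_k<f_k(x)<U_k,\\
          0, & f_k(x)\ge U_k,
        \end{cases}
        \qquad
        \max\ \lambda \quad\text{s.t.}\quad \mu_k\bigl(f_k(x)\bigr)\ge\lambda,\ \ k=1,\dots,K,\quad x\in X,\ 0\le\lambda\le 1.

    In the paper's setting the reliability, cost, and repair-time objectives are nonlinear, so the memberships are nonlinear as well, which is why they are first linearized by a first-order Taylor expansion before the goal program is solved.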

  20. Computer program for source distribution process in radiation facility

    International Nuclear Information System (INIS)

    Al-Kassiri, H.; Abdul Ghani, B.

    2007-08-01

    A computer simulation of dose distribution using Visual Basic has been carried out according to the arrangement and activities of the Co-60 sources. This program provides the dose distribution in treated products depending on the product density and the desired dose. The program is useful for optimizing the source distribution during the loading process. There is good agreement between the data calculated by the program and experimental data. (Author)

  1. A review of small canned computer programs for survey research and demographic analysis.

    Science.gov (United States)

    Sinquefield, J C

    1976-12-01

    A variety of small canned computer programs for survey research and demographic analysis appropriate for use in developing countries are reviewed in this article. The programs discussed are SPSS (Statistical Package for the Social Sciences); CENTS, CO-CENTS, CENTS-AID, and CENTS-AID II; MINI-TAB EDIT, FREQUENCIES, TABLES, REGRESSION, CLIENT RECORD, DATES, MULT, LIFE, and PREGNANCY HISTORY; FIVFIV and SINSIN; DCL (Demographic Computer Library); and MINI-TAB Population Projection, Functional Population Projection, and Family Planning Target Projection. For each program, a description and evaluation of its uses, instruction manuals, computer requirements, and procedures for obtaining the manuals and programs are provided. This information is intended to facilitate and encourage the use of the computer by data processors in developing countries.

  2. UCODE, a computer code for universal inverse modeling

    Science.gov (United States)

    Poeter, E.P.; Hill, M.C.

    1999-01-01

    This article presents the US Geological Survey computer program UCODE, which was developed in collaboration with the US Army Corps of Engineers Waterways Experiment Station and the International Ground Water Modeling Center of the Colorado School of Mines. UCODE performs inverse modeling, posed as a parameter-estimation problem, using nonlinear regression. Any application model or set of models can be used; the only requirement is that they have numerical (ASCII or text-only) input and output files and that the numbers in these files have sufficient significant digits. Application models can include preprocessors and postprocessors as well as models related to the processes of interest (physical, chemical, and so on), making UCODE extremely powerful for model calibration. Estimated parameters can be defined flexibly with user-specified functions. Observations to be matched in the regression can be any quantity for which a simulated equivalent value can be produced; simulated equivalent values are calculated from values that appear in the application model output files and can be manipulated with additive and multiplicative functions, if necessary. Prior, or direct, information on estimated parameters can also be included in the regression. The nonlinear regression problem is solved by minimizing a weighted least-squares objective function with respect to the parameter values using a modified Gauss-Newton method. Sensitivities needed for the method are calculated approximately by forward or central differences, and problems and solutions related to this approximation are discussed. Statistics are calculated and printed for use in (1) diagnosing inadequate data or identifying parameters that probably cannot be estimated with the available data, (2) evaluating estimated parameter values, (3) evaluating the model representation of the actual processes, and (4) quantifying the uncertainty of model-simulated values. UCODE is intended for use on any computer operating
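
    The core numerical step, a Gauss-Newton update with finite-difference sensitivities, can be sketched as follows for a toy "application model". This is a bare-bones illustration, not UCODE's implementation, which adds damping, weighting of prior information, and the diagnostic statistics described above.

        import numpy as np

        def gauss_newton(model, p0, obs, weights, h=1e-6, tol=1e-8, max_iter=50):
            """Weighted Gauss-Newton with forward-difference sensitivities."""
            p = np.asarray(p0, float)
            W = np.diag(weights)
            for _ in range(max_iter):
                r = obs - model(p)                       # residuals
                J = np.empty((len(obs), len(p)))
                for j in range(len(p)):                  # forward-difference Jacobian
                    dp = np.zeros_like(p)
                    dp[j] = h * max(1.0, abs(p[j]))
                    J[:, j] = (model(p + dp) - model(p)) / dp[j]
                step = np.linalg.solve(J.T @ W @ J, J.T @ W @ r)
                p = p + step
                if np.max(np.abs(step)) < tol * (1 + np.max(np.abs(p))):
                    break
            return p

        # toy "application model": exponential decay with two parameters (assumed data)
        t = np.linspace(0, 10, 20)
        obs = 5.0 * np.exp(-0.3 * t)
        est = gauss_newton(lambda p: p[0] * np.exp(-p[1] * t), [1.0, 1.0],
                           obs, np.ones_like(t))
        print(est)     # recovers approximately [5.0, 0.3]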

  3. 43 Computer Assisted Programmed Instruction and Cognitive ...

    African Journals Online (AJOL)

    cce

    between Cognitive Preference Style and Computer Assisted Programmed ... teaching the subjects makes a wide range of students who have moderate numerical ability and ... on achievement of physics students, more so when such strategy has .... explaining prompting, thinking, discussing, clarifying concepts, asking ...

  4. Department of Energy: MICS (Mathematical Information, and Computational Sciences Division). High performance computing and communications program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    This document is intended to serve two purposes. Its first purpose is that of a program status report on the considerable progress that the Department of Energy (DOE) has made since 1993, the time of the last such report (DOE/ER-0536, "The DOE Program in HPCC"), toward achieving the goals of the High Performance Computing and Communications (HPCC) Program. The second purpose is that of a summary report on the many research programs administered by the Mathematical, Information, and Computational Sciences (MICS) Division of the Office of Energy Research under the auspices of the HPCC Program and to provide, wherever relevant, easy access to pertinent information about MICS-Division activities via uniform resource locators (URLs) on the World Wide Web (WWW). The information pointed to by the URLs is updated frequently, and the interested reader is urged to access the WWW for the latest information.

  5. Computer modeling of the gyrocon

    International Nuclear Information System (INIS)

    Tallerico, P.J.; Rankin, J.E.

    1979-01-01

    A gyrocon computer model is discussed in which the electron beam is followed from the gun output to the collector region. The initial beam may be selected either as a uniform circular beam or may be taken from the output of an electron gun simulated by the program of William Herrmannsfeldt. The fully relativistic equations of motion are then integrated numerically to follow the beam successively through a drift tunnel, a cylindrical rf beam-deflection cavity, a combination drift space and magnetic bender region, and an output rf cavity. The parameters for each region are variable input data from a control file. The program calculates power losses in the cavity wall, power required by beam loading, power transferred from the beam to the output cavity fields, and electronic and overall efficiency. Space-charge effects are approximated if selected. Graphical displays of beam motions are produced. We discuss the Los Alamos Scientific Laboratory (LASL) prototype design as an example of code usage. The design shows a gyrocon with an output of about two-thirds of a megawatt at 450 MHz and up to 86% overall efficiency.

  6. Permitted decompilation of a computer program in order to protect the general interest

    Directory of Open Access Journals (Sweden)

    Radovanović Sanja M.

    2015-01-01

    Full Text Available A computer program is an intellectual creation protected by copyright. However, unlike other items with equivalent legal protection, a computer program has a strong technical functionality, which is, in today's society, an indispensable factor in everyday business activities, the exchange of information, entertainment, and other similar purposes. Precisely because of this feature, a computer program can rarely be used in isolation from its hardware and software environment; in other words, the functionality of a computer program reaches its full scope only in interaction with another computer program or device. Bearing in mind that this intellectual creation is at the focus of technological, and thus social, development, legislators are trying to provide a legal framework in which these interactions can take place unhindered. Indeed, since every aspect of the use of a computer program is an exclusive right of the author, making the connections necessary between the various components dependent on his or her consent could put further technological development at risk. Therefore, lawmakers provide that, in certain cases and under certain conditions, the author's exclusive right may be restricted or excluded. This paper analyzes the normative contribution to achieving these interactions, which are technically and technologically necessary and therefore justified in terms of the general interest.

  7. Software for the ACP [Advanced Computer Program] multiprocessor system

    International Nuclear Information System (INIS)

    Biel, J.; Areti, H.; Atac, R.

    1987-01-01

    Software has been developed for use with the Fermilab Advanced Computer Program (ACP) multiprocessor system. The software was designed to make a system of a hundred independent node processors as easy to use as a single, powerful CPU. Subroutines have been developed by which a user's host program can send data to and get results from the program running in each of his ACP node processors. Utility programs make it easy to compile and link host and node programs, to debug a node program on an ACP development system, and to submit a debugged program to an ACP production system

  8. Design and performance analysis of solid-propellant rocket motors using a simplified computer program

    Science.gov (United States)

    Sforzini, R. H.

    1972-01-01

    An analysis and a computer program are presented which represent a compromise between the more sophisticated programs using precise burning geometric relations and the textbook type of solutions. The program requires approximately 900 computer cards, including a set of 20 input data cards required for a typical problem. The computer operating time for a single configuration is approximately 1 minute and 30 seconds on the IBM 360 computer. About 1 minute and 15 seconds of the time is compilation time, so that additional configurations input at the same time require approximately 15 seconds each. The program uses approximately 11,000 words on the IBM 360. The program is written in FORTRAN IV and is readily adaptable for use on a number of different computers: IBM 7044, IBM 7094, and Univac 1108.

  9. Interactive house investigation and radon diagnostics computer program

    International Nuclear Information System (INIS)

    Gillette, L.M.

    1990-01-01

    This paper reports on the interactive computer program called Dungeons and Radon, which was developed as part of the Environmental Protection Agency's (EPA's) Radon Contractor Proficiency (RCP) Program's Radon Technology for Mitigators (RTM) course, currently being offered at the Regional Radon Training Centers (RRTCs). The program was designed by Terry Brennan to be used in training radon mitigation contractors. The Macintosh-based program consists of a series of animated house scenes enhanced with sound and voice. Participants choose where and what to investigate and where to perform diagnostic tests in order to gather enough information to design a successful mitigation system.

  10. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2009-03-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.

  11. Radiological Safety Analysis Computer (RSAC) Program Version 7.2 Users’ Manual

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Bradley J Schrader

    2010-10-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.2 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.

  12. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users Manual

    International Nuclear Information System (INIS)

    Schrader, Bradley J.

    2009-01-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods

  13. Implementing a Computer Program that Captures Students' Work on Customizable, Periodic-System Data Assignments

    Science.gov (United States)

    Wiediger, Susan D.

    2009-01-01

    The periodic table and the periodic system are central to chemistry and thus to many introductory chemistry courses. A number of existing activities use various data sets to model the development process for the periodic table. This paper describes an image arrangement computer program developed to mimic a paper-based card sorting periodic table…

  14. 78 FR 15732 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-03-12

    ... 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy Protection Amendments of 1990 (Pub. L. 101...

  15. The computer program system for structural design of nuclear power plants

    International Nuclear Information System (INIS)

    Aihara, S.; Atsumi, K.; Sasagawa, K.; Satoh, S.

    1979-01-01

    In recent years, the design of nuclear power plants has become more complex than in the past. In particular, the finite element method (FEM) applied to the analysis of nuclear power plants requires extensive computer use. Recent computers have made remarkable progress, so the manpower and time necessary for analysis in design work have been reduced considerably; however, the volume of output to be arranged has increased tremendously. Therefore, a computer program system was developed for performing all of the processes, from data preparation to output arrangement and rebar evaluation. This report introduces the computer program system pertaining to the design flow of the reactor building. (orig.)

  16. TITAN: a computer program for accident occurrence frequency analyses by component Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasushi [Department of Fuel Cycle Safety Research, Nuclear Safety Research Center, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Tamaki, Hitoshi [Department of Safety Research Technical Support, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Kanai, Shigeru [Fuji Research Institute Corporation, Tokyo (Japan)

    2000-04-01

    In a plant system consisting of complex equipment and components for a reprocessing facility, there might be grace time between an initiating event and a resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such a complex reliability model, including the grace time, without difficulty and to obtain an accident occurrence frequency. First, the basic methods of the component Monte Carlo simulation for obtaining an accident occurrence frequency are introduced; then the basic performance, such as precision, convergence, and parallelization of the calculation, is shown through calculation of a prototype accident sequence model. As an example to illustrate applicability to a real-scale plant model, a red-oil explosion in a German reprocessing plant model is simulated to show that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to demonstrate a further capability, and a new input-data format suited to the component Monte Carlo simulation is proposed. The present paper describes the calculational method, performance, applicability to a real scale, and the new proposal for the TITAN code. In the appendixes, a conventional analytical method is shown in order to avoid the complex and laborious calculation needed to obtain an exact solution of the accident occurrence frequency, for comparison with the Monte Carlo method. The user's manual and the program list and structure are also contained in the appendixes to facilitate usage of the TITAN computer program. (author)
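
    The grace-time idea can be illustrated with a very small component Monte Carlo sketch: an initiating event is assumed to occur at a known rate, and each trial samples whether the standby mitigation fails on demand and whether the operators complete the remedial action within the grace time. All rates and distributions below are invented for illustration and have nothing to do with the red-oil example.

        import numpy as np

        rng = np.random.default_rng(2024)
        lam_init = 1.0e-2       # initiating-event frequency (per year), assumed
        grace = 8.0             # grace time available for remedial action (h), assumed
        p_demand = 5.0e-2       # mitigation failure-on-demand probability, assumed
        n = 1_000_000           # Monte Carlo trials

        recovery = rng.lognormal(mean=np.log(3.0), sigma=0.8, size=n)  # hours to recover
        mitigation_fails = rng.random(n) < p_demand
        accident = mitigation_fails & (recovery > grace)   # neither barrier succeeds
        p_acc = accident.mean()
        freq = lam_init * p_acc
        sigma = lam_init * np.sqrt(p_acc * (1 - p_acc) / n)   # statistical uncertainty
        print(f"accident frequency ~ {freq:.2e} /yr (+/- {sigma:.1e})")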

  17. TITAN: a computer program for accident occurrence frequency analyses by component Monte Carlo simulation

    International Nuclear Information System (INIS)

    Nomura, Yasushi; Tamaki, Hitoshi; Kanai, Shigeru

    2000-04-01

    In a plant system consisting of complex equipment and components for a reprocessing facility, there might be grace time between an initiating event and a resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such a complex reliability model, including the grace time, without difficulty and to obtain an accident occurrence frequency. First, the basic methods of the component Monte Carlo simulation for obtaining an accident occurrence frequency are introduced; then the basic performance, such as precision, convergence, and parallelization of the calculation, is shown through calculation of a prototype accident sequence model. As an example to illustrate applicability to a real-scale plant model, a red-oil explosion in a German reprocessing plant model is simulated to show that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to demonstrate a further capability, and a new input-data format suited to the component Monte Carlo simulation is proposed. The present paper describes the calculational method, performance, applicability to a real scale, and the new proposal for the TITAN code. In the appendixes, a conventional analytical method is shown in order to avoid the complex and laborious calculation needed to obtain an exact solution of the accident occurrence frequency, for comparison with the Monte Carlo method. The user's manual and the program list and structure are also contained in the appendixes to facilitate usage of the TITAN computer program. (author)

  18. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but they generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability in existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions and obtain result probability distributions. This work is applicable to low-level radioactive waste disposal system performance assessment.
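
    Derivative-based uncertainty propagation rests on the first-order relation var(y) ≈ Σ (∂y/∂x_i)² σ_i². A minimal sketch, using forward differences in place of the compiler-generated derivatives provided by GRESS/ADGEN, with a toy response function and assumed parameter values:

        import numpy as np

        def propagate(f, x0, sigmas, h=1e-6):
            """First-order variance propagation: var(y) ~ sum_i (df/dx_i)^2 sigma_i^2."""
            x0 = np.asarray(x0, float)
            y0 = f(x0)
            grad = np.empty_like(x0)
            for i in range(len(x0)):
                dx = np.zeros_like(x0)
                dx[i] = h * max(1.0, abs(x0[i]))
                grad[i] = (f(x0 + dx) - y0) / dx[i]      # forward-difference derivative
            return y0, float(np.sqrt(np.sum((grad * sigmas) ** 2))), grad

        # toy response: a Darcy-like flux q = K * i * A (all inputs assumed)
        y, sigma_y, sens = propagate(lambda p: p[0] * p[1] * p[2],
                                     x0=[1.0e-5, 0.02, 100.0],
                                     sigmas=[2.0e-6, 0.005, 5.0])
        print(f"q = {y:.2e} +/- {sigma_y:.2e}")

    The DUA method described above propagates full probability distributions rather than just variances; the formula in this sketch is only the simplest first-order special case of that idea.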

  19. MULGRES: a computer program for stepwise multiple regression analysis

    Science.gov (United States)

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
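
    The stepwise-deletion idea (fit the full model, then repeatedly drop the least significant variable until every remaining variable is significant) can be sketched with ordinary least squares and t-tests. This is a generic backward-elimination illustration, not a reconstruction of MULGRES itself.

        import numpy as np
        from scipy import stats

        def stepwise_deletion(X, y, alpha=0.05):
            """Backward elimination: drop the least significant predictor each pass."""
            cols = list(range(X.shape[1]))
            while cols:
                A = np.column_stack([np.ones(len(y)), X[:, cols]])
                beta, *_ = np.linalg.lstsq(A, y, rcond=None)
                resid = y - A @ beta
                dof = len(y) - A.shape[1]
                s2 = resid @ resid / dof
                cov = s2 * np.linalg.inv(A.T @ A)
                t = beta / np.sqrt(np.diag(cov))
                p = 2 * stats.t.sf(np.abs(t), dof)       # two-sided p-values
                worst = int(np.argmax(p[1:]))            # ignore the intercept
                if p[1:][worst] <= alpha:
                    break                                # everything left is significant
                cols.pop(worst)
            return cols, beta

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 5))
        y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=200)
        kept, coeffs = stepwise_deletion(X, y)
        print("kept predictors:", kept)      # expected: columns 0 and 3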

  20. An introduction to NASA's advanced computing program: Integrated computing systems in advanced multichip modules

    Science.gov (United States)

    Fang, Wai-Chi; Alkalai, Leon

    1996-01-01

    Recent changes within NASA's space exploration program favor the design, implementation, and operation of low cost, lightweight, small and micro spacecraft with multiple launches per year. In order to meet the future needs of these missions with regard to the use of spacecraft microelectronics, NASA's advanced flight computing (AFC) program is currently considering industrial cooperation and advanced packaging architectures. In relation to this, the AFC program is reviewed, considering the design and implementation of NASA's AFC multichip module.

  1. A computer program for multiple decrement life table analyses.

    Science.gov (United States)

    Poole, W K; Cooley, P C

    1977-06-01

    Life table analysis has traditionally been the tool of choice for analyzing the distribution of "survival" times when a parametric form for the survival curve cannot reasonably be assumed. Chiang, in two papers [1,2], formalized the theory of life table analysis in a Markov chain framework and derived maximum likelihood estimates of the relevant parameters for the analyses. He also discussed how the techniques could be generalized to consider competing risks and follow-up studies. Although various computer programs exist for doing different types of life table analysis [3], to date there has not been a generally available, well-documented computer program to carry out multiple decrement analyses, either by Chiang's or any other method. This paper describes such a program, developed by Research Triangle Institute. A user's manual, available at printing cost, supplements the contents of this paper with a discussion of the formulas used and the program listing.

  2. Program system for computation of the terrestrial gamma-radiation field

    International Nuclear Information System (INIS)

    Kirkegaard, P.; Loevborg, L.

    1979-02-01

    A system of computer programs intended for solution of the plane one-dimensional photon transport equation in the case of two adjacent media is described, and user's guides for the programs are given. One medium represents a natural ground with uniformly distributed potassium, uranium, and thorium gamma-ray emitters. The other medium is usually air with no radioactive contaminants. The solution method is the double-P1 approximation with logarithmic energy spacing. The complete data-processing system GB contains the transport-theory code GAMP1, the code GFX for computation of scalar flux and dose rate, and a number of auxiliary programs and data files. (author)

  3. BWR Refill-Reflood Program, Task 4.7 - model development: TRAC-BWR component models

    International Nuclear Information System (INIS)

    Cheung, Y.K.; Parameswaran, V.; Shaug, J.C.

    1983-09-01

    TRAC (Transient Reactor Analysis Code) is a computer code for best-estimate analysis for the thermal hydraulic conditions in a reactor system. The development and assessment of the BWR component models developed under the Refill/Reflood Program that are necessary to structure a BWR-version of TRAC are described in this report. These component models are the jet pump, steam separator, steam dryer, two-phase level tracking model, and upper-plenum mixing model. These models have been implemented into TRAC-B02. Also a single-channel option has been developed for individual fuel-channel analysis following a system-response calculation

  4. Algorithm-structured computer arrays and networks architectures and processes for images, percepts, models, information

    CERN Document Server

    Uhr, Leonard

    1984-01-01

    Computer Science and Applied Mathematics: Algorithm-Structured Computer Arrays and Networks: Architectures and Processes for Images, Percepts, Models, Information examines the parallel-array, pipeline, and other network multi-computers.This book describes and explores arrays and networks, those built, being designed, or proposed. The problems of developing higher-level languages for systems and designing algorithm, program, data flow, and computer structure are also discussed. This text likewise describes several sequences of successively more general attempts to combine the power of arrays wi

  5. Personalized Computer-Assisted Mathematics Problem-Solving Program and Its Impact on Taiwanese Students

    Science.gov (United States)

    Chen, Chiu-Jung; Liu, Pei-Lin

    2007-01-01

    This study evaluated the effects of a personalized computer-assisted mathematics problem-solving program on the performance and attitude of Taiwanese fourth grade students. The purpose of this study was to determine whether the personalized computer-assisted program improved student performance and attitude over the nonpersonalized program.…

  6. Gender and stereotypes in motivation to study computer programming for careers in multimedia

    Science.gov (United States)

    Doubé, Wendy; Lang, Catherine

    2012-03-01

    A multimedia university programme with relatively equal numbers of male and female students in elective programming subjects provided a rare opportunity to investigate female motivation to study and pursue computer programming in a career. The MSLQ was used to survey 85 participants. In common with research into deterrence of females from STEM domains, females displayed significantly lower self-efficacy and expectancy for success. In contrast to research into deterrence of females from STEM domains, both genders placed similar high values on computer programming and shared high extrinsic and intrinsic goal orientation. The authors propose that the stereotype associated with a creative multimedia career could attract female participation in computer programming whereas the stereotype associated with computer science could be a deterrent.

  7. Viscous wing theory development. Volume 2: GRUMWING computer program user's manual

    Science.gov (United States)

    Chow, R. R.; Ogilvie, P. L.

    1986-01-01

    This report is a user's manual which describes the operation of the computer program GRUMWING. The program computes the viscous transonic flow over three-dimensional wings using a boundary-layer-type viscid-inviscid interaction approach. The inviscid solution is obtained by an approximate factorization (AFZ) method for the full potential equation. The boundary layer solution is based on integral entrainment methods.

  8. Operational procedure for computer program for design point characteristics of a compressed-air generator with through-flow combustor for V/STOL applications

    Science.gov (United States)

    Krebs, R. P.

    1971-01-01

    The computer program described in this report calculates the design-point characteristics of a compressed-air generator for use in V/STOL applications such as systems with a tip-turbine-driven lift fan. The program computes the dimensions and mass, as well as the thermodynamic performance of a model air generator configuration which involves a straight through-flow combustor. Physical and thermodynamic characteristics of the air generator components are also given. The program was written in FORTRAN IV language. Provision has been made so that the program will accept input values in either SI units or U.S. customary units. Each air generator design-point calculation requires about 1.5 seconds of 7094 computer time for execution.

  9. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India, during December 17-19, 2015. The book presents innovations in broad areas of research such as computational modeling, computational intelligence, and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing the design, analysis, and modeling of the aforementioned key areas.

  10. Computer model for estimating electric utility environmental noise

    International Nuclear Information System (INIS)

    Teplitzky, A.M.; Hahn, K.J.

    1991-01-01

    This paper reports on a computer code (Model) for estimating environmental noise emissions from the operation and construction of electric power plants, developed from a set of noise-estimation algorithms. The code is used to predict octave-band sound power levels for power plant operation and construction activities on the basis of the equipment operating characteristics, and it calculates off-site sound levels for each noise source and for the entire plant. Estimated noise levels are presented either as A-weighted sound level contours around the power plant or as octave-band levels at user-defined receptor locations. Calculated sound levels can be compared with user-designated noise criteria, and the program can assist the user in analyzing alternative noise control strategies.
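
    The last step of such a calculation, turning octave-band sound power levels into an A-weighted sound level at a receptor, is sketched below for a free-field point source with spherical spreading only; the example spectrum is invented, and real siting studies add ground, barrier, and air-absorption terms.

        import numpy as np

        # octave-band centre frequencies (Hz) and standard A-weighting corrections (dB)
        bands = np.array([31.5, 63, 125, 250, 500, 1000, 2000, 4000, 8000])
        a_wt  = np.array([-39.4, -26.2, -16.1, -8.6, -3.2, 0.0, 1.2, 1.0, -1.1])

        def a_level_at_receptor(Lw_band, r):
            """A-weighted level at distance r (m) from a point source with octave-band
            sound power levels Lw_band (dB re 1 pW), free-field spherical spreading."""
            Lp_band = Lw_band - 20 * np.log10(r) - 11          # spreading loss
            return 10 * np.log10(np.sum(10 ** ((Lp_band + a_wt) / 10)))

        # illustrative octave spectrum (assumed, not taken from the paper)
        Lw = np.array([95, 98, 100, 102, 100, 97, 94, 90, 85], float)
        print(round(a_level_at_receptor(Lw, 150.0), 1), "dBA at 150 m")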

  11. Gstat: a program for geostatistical modelling, prediction and simulation

    Science.gov (United States)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

    Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ascii and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
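
    The first step of any such workflow, computing an empirical (sample) semivariogram to which a model is then fitted interactively, can be sketched in a few lines; gstat's own estimator offers many more options (directional variograms, cross-variograms, robust estimators) than this minimal version.

        import numpy as np

        def empirical_semivariogram(coords, z, lag_edges):
            """Matheron estimator: gamma(h) = sum (z_i - z_j)^2 / (2 N(h)) per lag bin."""
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            dz2 = (z[:, None] - z[None, :]) ** 2
            iu = np.triu_indices(len(z), k=1)           # each pair counted once
            d, dz2 = d[iu], dz2[iu]
            gamma, npairs = [], []
            for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
                m = (d >= lo) & (d < hi)
                npairs.append(int(m.sum()))
                gamma.append(0.5 * dz2[m].mean() if m.any() else np.nan)
            return np.array(gamma), np.array(npairs)

        rng = np.random.default_rng(1)
        pts = rng.uniform(0, 100, size=(300, 2))        # synthetic sample locations
        vals = np.sin(pts[:, 0] / 15.0) + 0.2 * rng.normal(size=300)
        gamma, n = empirical_semivariogram(pts, vals, np.linspace(0, 50, 11))
        print(np.round(gamma, 3))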

  12. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance, and they have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  13. Research on uranium resource models. Part IV. Logic: a computer graphics program to construct integrated logic circuits for genetic-geologic models. Progress report

    International Nuclear Information System (INIS)

    Scott, W.A.; Turner, R.M.; McCammon, R.B.

    1981-01-01

    Integrated logic circuits were described as a means of formally representing genetic-geologic models for estimating undiscovered uranium resources. The logic circuits are logical combinations of selected geologic characteristics judged to be associated with particular types of uranium deposits. Each combination takes on a value which corresponds to the combined "present", "absent", or "don't know" states of the selected characteristics within a specified geographic cell. Within each cell, the output of the logic circuit is taken as a measure of the favorability of occurrence of an undiscovered deposit of the type being considered. In this way, geological, geochemical, and geophysical data are incorporated explicitly into potential uranium resource estimates. The present report describes how integrated logic circuits are constructed with a computer graphics program. A user's guide is also included.
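
    One simple way to realize such a circuit is with a three-valued (Kleene-style) logic in which "present", "don't know", and "absent" map to 1, 0.5, and 0, AND is the minimum, and OR is the maximum. The characteristics and the circuit below are invented for illustration, and the report's own scoring conventions may differ.

        PRESENT, UNKNOWN, ABSENT = 1.0, 0.5, 0.0

        def AND(*v):            # Kleene conjunction: the weakest link dominates
            return min(v)

        def OR(*v):             # Kleene disjunction: the strongest evidence dominates
            return max(v)

        def favorability(cell):
            """Toy circuit: host rock AND reductant AND (source rock OR anomaly)."""
            return AND(cell["sandstone_host"],
                       cell["reductant"],
                       OR(cell["granitic_source"], cell["radiometric_anomaly"]))

        cell = {"sandstone_host": PRESENT, "reductant": UNKNOWN,
                "granitic_source": ABSENT, "radiometric_anomaly": PRESENT}
        print(favorability(cell))   # 0.5: the unknown reductant caps the cell's score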

  14. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP), a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  15. Scientific and computational challenges of the fusion simulation program (FSP)

    International Nuclear Information System (INIS)

    Tang, William M.

    2011-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) - a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  16. 76 FR 50460 - Privacy Act of 1974; Notice of a Computer Matching Program

    Science.gov (United States)

    2011-08-15

    ... records will be disclosed for the purpose of this computer match are as follows: OPM will use the system... entitled to health care under TRS and TRR.'' E. Description of Computer Matching Program: Under the terms...; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, Department of Defense (DoD...

  17. The Design of an Undergraduate Degree Program in Computer & Digital Forensics

    Directory of Open Access Journals (Sweden)

    Gary C. Kessler

    2006-09-01

    Full Text Available Champlain College formally started an undergraduate degree program in Computer & Digital Forensics in 2003. The underlying goals were that the program be multidisciplinary, bringing together the law, computer technology, and the basics of digital investigations; that it would be available as an online and on-campus offering; and that it would have a process-oriented focus. The success of this program has largely been due to working closely with practitioners, maintaining activity in events related to both industry and academia, and the flexibility to respond to ever-changing needs. This paper provides an overview of how this program was conceived, developed, and implemented; its evolution over time; and current and planned initiatives.

  18. A computer program for two-particle generalized coefficients of fractional parentage

    Science.gov (United States)

    Deveikis, A.; Juodagalvis, A.

    2008-10-01

    We present a FORTRAN90 program GCFP for the calculation of the generalized coefficients of fractional parentage (generalized CFPs or GCFP). The approach is based on the observation that the multi-shell CFPs can be expressed in terms of single-shell CFPs, while the latter can be readily calculated employing a simple enumeration scheme of antisymmetric A-particle states and an efficient method of construction of the idempotent matrix eigenvectors. The program provides fast calculation of GCFPs for a given particle number and produces results possessing numerical uncertainties below the desired tolerance. A single j-shell is defined by four quantum numbers, (e, l, j, t). A supplemental C++ program parGCFP allows calculations to be done in batches and/or in parallel. Program summary: Program title: GCFP, parGCFP. Catalogue identifier: AEBI_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBI_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 17 199. No. of bytes in distributed program, including test data, etc.: 88 658. Distribution format: tar.gz. Programming language: FORTRAN 77/90 (GCFP), C++ (parGCFP). Computer: Any computer with suitable compilers; the program GCFP requires a FORTRAN 77/90 compiler, the auxiliary program parGCFP requires a GNU-C++ compatible compiler, and its parallel version additionally requires MPI-1 standard libraries. Operating system: Linux (Ubuntu, Scientific) (all programs); also checked on Windows XP (GCFP, serial version of parGCFP). RAM: The memory demand depends on the computation and output mode. If this mode is not 4, the program GCFP demands the following amounts of memory on a computer with a Linux operating system: around 2 MB of RAM for the A=12 system at E⩽2; computation of the A=50 particle system requires around 60 MB of

  19. A model surveillance program based on regulatory experience

    International Nuclear Information System (INIS)

    Conte, R.J.

    1980-01-01

A model surveillance program is presented based on regulatory experience. The program consists of three phases: Program Delineation, Data Acquisition, and Data Analysis. Each phase is described in terms of key quality assurance elements and some current philosophies in the United States licensing program. Other topics include the application of these ideas to test equipment used in the surveillance program and audits of the established program. Program Delineation covers the establishment of administrative controls for organization and the description of responsibilities using the 'Program Coordinator' concept, with assistance from Data Acquisition and Analysis Teams. Ideas regarding the frequency of surveillance testing are also presented. The Data Acquisition phase discusses various methods for acquiring data, including operator observations, test procedures, operator logs, and computer output, for trending equipment performance. The Data Analysis phase discusses the process for drawing conclusions regarding component/equipment service life, proper application, and generic problems through the use of trend analysis and failure rate data. (orig.)

  20. The algebraic manipulation program DIRAC on IBM personal computers

    International Nuclear Information System (INIS)

    Grozin, A.G.; Perlt, H.

    1989-01-01

The version DIRAC (2.2) for IBM-compatible personal computers is described. It is designed for algebraic manipulation of polynomials and tensors. After a short introduction concerning implementation and usage on personal computers, an example program is given. The report contains a detailed user's guide to DIRAC (2.2) and, additionally, some useful applications. 4 refs

  1. The ISCB Student Council Internship Program: Expanding computational biology capacity worldwide.

    Science.gov (United States)

    Anupama, Jigisha; Francescatto, Margherita; Rahman, Farzana; Fatima, Nazeefa; DeBlasio, Dan; Shanmugam, Avinash Kumar; Satagopam, Venkata; Santos, Alberto; Kolekar, Pandurang; Michaut, Magali; Guney, Emre

    2018-01-01

    Education and training are two essential ingredients for a successful career. On one hand, universities provide students a curriculum for specializing in one's field of study, and on the other, internships complement coursework and provide invaluable training experience for a fruitful career. Consequently, undergraduates and graduates are encouraged to undertake an internship during the course of their degree. The opportunity to explore one's research interests in the early stages of their education is important for students because it improves their skill set and gives their career a boost. In the long term, this helps to close the gap between skills and employability among students across the globe and balance the research capacity in the field of computational biology. However, training opportunities are often scarce for computational biology students, particularly for those who reside in less-privileged regions. Aimed at helping students develop research and academic skills in computational biology and alleviating the divide across countries, the Student Council of the International Society for Computational Biology introduced its Internship Program in 2009. The Internship Program is committed to providing access to computational biology training, especially for students from developing regions, and improving competencies in the field. Here, we present how the Internship Program works and the impact of the internship opportunities so far, along with the challenges associated with this program.

  2. The ISCB Student Council Internship Program: Expanding computational biology capacity worldwide.

    Directory of Open Access Journals (Sweden)

    Jigisha Anupama

    2018-01-01

    Full Text Available Education and training are two essential ingredients for a successful career. On one hand, universities provide students a curriculum for specializing in one's field of study, and on the other, internships complement coursework and provide invaluable training experience for a fruitful career. Consequently, undergraduates and graduates are encouraged to undertake an internship during the course of their degree. The opportunity to explore one's research interests in the early stages of their education is important for students because it improves their skill set and gives their career a boost. In the long term, this helps to close the gap between skills and employability among students across the globe and balance the research capacity in the field of computational biology. However, training opportunities are often scarce for computational biology students, particularly for those who reside in less-privileged regions. Aimed at helping students develop research and academic skills in computational biology and alleviating the divide across countries, the Student Council of the International Society for Computational Biology introduced its Internship Program in 2009. The Internship Program is committed to providing access to computational biology training, especially for students from developing regions, and improving competencies in the field. Here, we present how the Internship Program works and the impact of the internship opportunities so far, along with the challenges associated with this program.

  3. Program computes single-point failures in critical system designs

    Science.gov (United States)

    Brown, W. R.

    1967-01-01

    Computer program analyzes the designs of critical systems that will either prove the design is free of single-point failures or detect each member of the population of single-point failures inherent in a system design. This program should find application in the checkout of redundant circuits and digital systems.
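
    The record does not describe the program's algorithm. Purely as a hedged illustration of the underlying idea, the hypothetical Python sketch below enumerates single-point failures by failing one component at a time and re-evaluating a Boolean system-success function; the component names and example system are invented for the sketch.

```python
# Hypothetical sketch (not the program from the record): find single-point
# failures by failing one component at a time and re-checking system success.

def system_ok(up):
    """Example redundant system: it works if either feed A or feed B is up,
    and the single controller is up. Purely illustrative."""
    return (up["feed_A"] or up["feed_B"]) and up["controller"]

def single_point_failures(components, ok):
    """Return every component whose individual failure brings the system down."""
    spofs = []
    for c in components:
        up = {name: True for name in components}
        up[c] = False                      # fail exactly one component
        if not ok(up):
            spofs.append(c)
    return spofs

if __name__ == "__main__":
    comps = ["feed_A", "feed_B", "controller"]
    print(single_point_failures(comps, system_ok))   # -> ['controller']
```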

  4. Investigating Difficulties of Learning Computer Programming in Saudi Arabia

    Science.gov (United States)

    Alakeel, Ali M.

    2015-01-01

    Learning computer programming is one of the main requirements of many educational study plans in higher education. Research has shown that many students face difficulties acquiring reasonable programming skills during their first year of college. In Saudi Arabia, there are twenty-three state-owned universities scattered around the country that…

  5. HIGHTEX: a computer program for the steady-state simulation of steam-methane reformers used in a nuclear process heat plant

    International Nuclear Information System (INIS)

    Tadokoro, Yoshihiro; Seya, Toko

    1977-08-01

This report describes the computational model and input procedure of HIGHTEX, a computer program for steady-state simulation of the steam-methane reformers used in a nuclear process heat plant. The HIGHTEX program rapidly simulates a single reformer tube and treats the reactants as a single phase in a two-dimensional catalyst bed. Outputs of the program are the radial distributions of temperature and reaction products in the catalyst-packed bed, the pressure loss of the packed bed, the stress in the reformer tube, the hydrogen permeation rate through the reformer tube, the heat rate of reaction, and the heat-transfer rate between helium and process gas. The running time (CPU) for a 9 m-long bayonet-type reformer tube is 12 min on a FACOM-230/75. (auth.)
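
    HIGHTEX itself resolves a two-dimensional catalyst bed with reforming chemistry, and the report's model is not reproduced here. As a much simpler, hedged sketch of the kind of marching heat balance such a simulator performs, the following Python fragment integrates a one-dimensional co-current heat exchange between helium and process gas along a tube; all parameter values are illustrative assumptions, not HIGHTEX data.

```python
# Minimal 1-D sketch of helium-to-process-gas heat transfer along a reformer
# tube, neglecting the chemistry, pressure loss, and radial gradients that the
# actual HIGHTEX model resolves. All numbers are illustrative assumptions.

def march_tube(length=9.0, n=900, T_gas=500.0, T_he=950.0,
               U=400.0, perim=0.35, mdot_gas=0.1, cp_gas=2500.0,
               mdot_he=0.3, cp_he=5193.0):
    dz = length / n
    for _ in range(n):
        q = U * perim * (T_he - T_gas) * dz      # heat transferred over the slice (W)
        T_gas += q / (mdot_gas * cp_gas)          # process gas heats up
        T_he -= q / (mdot_he * cp_he)             # helium cools down (co-current)
    return T_gas, T_he

if __name__ == "__main__":
    Tg, Th = march_tube()
    print(f"outlet process gas: {Tg:.1f} K, outlet helium: {Th:.1f} K")
```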

  6. Use of a radiation therapy treatment planning computer in a hospital health physics program

    International Nuclear Information System (INIS)

    Addison, S.J.

    1984-01-01

An onsite treatment planning computer has become the state of the art in the care of radiation therapy patients, but in most installations the computer is used for therapy planning only a small fraction of the day. At St. Mary's Hospital, arrangements have been negotiated for part-time use of the treatment planning computer for health physics purposes. Computerized Medical Systems, Inc. (CMS) produces the Modulex radiotherapy planning system, which is programmed in MUMPS, a user-oriented language specially adapted for handling text-string information. St. Mary's Hospital's CMS computer has been programmed to assist in data collection and write-up of diagnostic x-ray surveys, meter calibrations, and wipe/leak tests. The computer is set up to provide timely reminders of tests and surveys, and billing for consultation work. Programs are currently being developed for radionuclide inventories. Use of a therapy planning computer for health physics purposes can enhance the radiation safety program and provide additional grounds for the acquisition of such a computer system.

  7. Electromagnetic Physics Models for Parallel Computing Architectures

    International Nuclear Information System (INIS)

    Amadio, G; Bianchini, C; Iope, R; Ananya, A; Apostolakis, J; Aurora, A; Bandieramonte, M; Brun, R; Carminati, F; Gheata, A; Gheata, M; Goulas, I; Nikitina, T; Bhattacharyya, A; Mohanty, A; Canal, P; Elvira, D; Jun, S Y; Lima, G; Duhem, L

    2016-01-01

The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next-generation detector simulation framework, has been designed to exploit both the vector capability of mainstream CPUs and the multi-threading capabilities of coprocessors, including NVIDIA GPUs and the Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and the type of parallelization needed to achieve optimal performance. In this paper we describe the implementation of electromagnetic physics models developed for parallel computing architectures as part of the GeantV project. Results of preliminary performance evaluation and physics validation are presented as well. (paper)
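
    No code accompanies the record. As a language-agnostic, hedged illustration of the data-level parallelism the paper targets, the NumPy sketch below advances a whole batch of particle tracks with array operations instead of a per-track loop, which is the same restructuring that SIMD vectorization exploits; the field value, step size, and track data are made up.

```python
import numpy as np

# Illustrative only: propagate a batch of charged-particle tracks one step,
# operating on whole arrays (one instruction, many data) instead of looping
# track by track. Field strength and step size are made-up constants.

def step_tracks(pos, mom, charge, dt=1e-9, Bz=1.0):
    """pos, mom: (N, 3) arrays; charge: (N,). Simple magnetic kick plus drift."""
    # Lorentz-like force with B = (0, 0, Bz); mom is treated as velocity-like.
    force = np.empty_like(mom)
    force[:, 0] = charge * mom[:, 1] * Bz
    force[:, 1] = -charge * mom[:, 0] * Bz
    force[:, 2] = 0.0
    mom = mom + force * dt
    pos = pos + mom * dt
    return pos, mom

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1024
    pos = rng.normal(size=(n, 3))
    mom = rng.normal(size=(n, 3))
    charge = rng.choice([-1.0, 1.0], size=n)
    pos, mom = step_tracks(pos, mom, charge)
    print(pos.shape, mom.shape)
```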

  8. Contributions to computational stereology and parallel programming

    DEFF Research Database (Denmark)

    Rasmusson, Allan

rotator, even without the need for isotropic sections. To meet the need for computational power to perform image restoration of virtual tissue sections, parallel programming on GPUs has also been part of the project. This has led to a significant change in paradigm for a previously developed surgical...

  9. New Developments in Modeling MHD Systems on High Performance Computing Architectures

    Science.gov (United States)

    Germaschewski, K.; Raeder, J.; Larson, D. J.; Bhattacharjee, A.

    2009-04-01

Modeling the wide range of time and length scales present even in fluid models of plasmas like MHD and X-MHD (Extended MHD including two-fluid effects like the Hall term, electron inertia, and the electron pressure gradient) is challenging even on state-of-the-art supercomputers. In recent years, HPC capacity has continued to grow exponentially, but at the expense of making the computer systems more and more difficult to program in order to get maximum performance. In this paper, we present a new approach to managing the complexity caused by the need to write efficient codes: separating the numerical description of the problem, in our case a discretized right-hand side (r.h.s.), from the actual implementation of efficiently evaluating it. An automatic code generator is used to describe the r.h.s. in a quasi-symbolic form while leaving the translation into efficient and parallelized code to a computer program itself. We implemented this approach for OpenGGCM (Open General Geospace Circulation Model), a model of the Earth's magnetosphere, which was accelerated by a factor of three on regular x86 architecture and a factor of 25 on the Cell BE architecture (commonly known for its deployment in Sony's PlayStation 3).
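
    The OpenGGCM code generator itself is not shown in the record. As a minimal, hedged analogue of describing the right-hand side quasi-symbolically and letting software emit the low-level implementation, the sketch below uses SymPy to turn a symbolic finite-difference term into C source; the expression is illustrative and far simpler than the MHD right-hand sides the paper handles.

```python
# Toy analogue of "describe the r.h.s. symbolically, generate the fast code":
# a symbolic advection-like term is turned into compilable C. The expression
# is illustrative and far simpler than the OpenGGCM right-hand sides.

import sympy as sp

vx, dx = sp.symbols("vx dx")
rho_m, rho_p = sp.symbols("rho_m rho_p")   # neighbouring cell densities

# Central-difference approximation of -vx * d(rho)/dx for one cell.
rhs = -vx * (rho_p - rho_m) / (2 * dx)

print("double drho_dt;")
print(sp.ccode(rhs, assign_to="drho_dt"))
# Prints valid C, e.g.:  drho_dt = -1.0/2.0*vx*(-rho_m + rho_p)/dx;
```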

  10. Modeling Computer Virus and Its Dynamics

    Directory of Open Access Journals (Sweden)

    Mei Peng

    2013-01-01

Full Text Available Based on the assumptions that a computer can be infected by both infected and exposed computers, and that some computers in the susceptible and exposed states can gain immunity through antivirus protection, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for the spread of the computer virus on the Internet, is determined. Second, the model has a virus-free equilibrium P0, which means that the infected portion of computers disappears and the virus dies out; P0 is globally asymptotically stable if R0 < 1. If R0 > 1, the model has a unique viral equilibrium P*, which means that the infection persists at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
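
    The record states the qualitative results but not the equations. Purely to illustrate how a compartment model of this kind behaves on either side of the threshold R0, the sketch below integrates a generic SEIR-style system with made-up rate constants; it is not the model from the paper.

```python
# Generic SEIR-style computer-virus model, integrated numerically.
# The rate constants are illustrative; the paper's exact equations differ.

import numpy as np
from scipy.integrate import odeint

def rhs(y, t, beta, sigma, gamma, mu):
    S, E, I, R = y
    dS = mu - beta * S * I - mu * S          # new machines minus infections
    dE = beta * S * I - (sigma + mu) * E     # exposed but not yet infectious
    dI = sigma * E - (gamma + mu) * I        # infectious
    dR = gamma * I - mu * R                  # immunized by antivirus
    return [dS, dE, dI, dR]

def final_infected(beta, sigma=0.5, gamma=0.2, mu=0.05, days=400):
    t = np.linspace(0.0, days, 2000)
    y0 = [0.99, 0.0, 0.01, 0.0]
    sol = odeint(rhs, y0, t, args=(beta, sigma, gamma, mu))
    return sol[-1, 2]                        # infected fraction at the end

if __name__ == "__main__":
    # Below the threshold the infection dies out; above it, it persists.
    for beta in (0.1, 0.8):
        print(f"beta={beta}: final infected fraction ~ {final_infected(beta):.4f}")
```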

  11. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  12. Programming guidelines for computer systems of NPPs

    International Nuclear Information System (INIS)

    Suresh babu, R.M.; Mahapatra, U.

    1999-09-01

    Software quality is assured by systematic development and adherence to established standards. All national and international software quality standards have made it mandatory for the software development organisation to produce programming guidelines as part of software documentation. This document contains a set of programming guidelines for detailed design and coding phases of software development cycle. These guidelines help to improve software quality by increasing visibility, verifiability, testability and maintainability. This can be used organisation-wide for various computer systems being developed for our NPPs. This also serves as a guide for reviewers. (author)

  13. Computer models and simulations of IGCC power plants with Canadian coals

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, L.; Furimsky, E.

    1999-07-01

In this paper, three steady-state computer models for the simulation of IGCC power plants with Shell, Texaco, and BGL (British Gas Lurgi) gasifiers are presented. All models were based on a study by Bechtel for Nova Scotia Power Corporation. They were built using the Advanced System for Process Engineering (ASPEN) steady-state simulation software together with Fortran programs developed in house. Each model is integrated from several sections which can be simulated independently, such as coal preparation, gasification, gas cooling, acid gas removal, sulfur recovery, gas turbine, heat recovery steam generation, and steam cycle. A general description of each process, each model's overall structure, capability, testing results, and background references is given. The performance of some Canadian coals on these models is discussed as well. The authors also built a computer model of an IGCC power plant with a Kellogg-Rust-Westinghouse gasifier; however, due to paper length limitations, it is not presented here.
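
    The ASPEN flowsheets themselves are not part of the record. As a schematic, hedged sketch of how independently simulated sections can be chained into one plant model, the following Python fragment passes a simple stream dictionary from one section to the next; the stream variables and conversion factors are placeholders, not the Bechtel design data.

```python
# Schematic of an IGCC flowsheet built from sections that can also be run
# independently. Stream contents and conversion numbers are placeholders.

def coal_preparation(coal_feed_kg_s):
    return {"coal": coal_feed_kg_s, "moisture": 0.05 * coal_feed_kg_s}

def gasification(stream, o2_ratio=0.8):
    syngas = 1.8 * stream["coal"] * o2_ratio              # crude yield placeholder
    return {"syngas": syngas, "T_K": 1600.0}

def gas_cooling(stream):
    return {**stream, "T_K": 310.0, "steam_raised": 0.4 * stream["syngas"]}

def acid_gas_removal(stream, sulfur_frac=0.01):
    clean = stream["syngas"] * (1.0 - sulfur_frac)
    return {"clean_syngas": clean, "sulfur": stream["syngas"] * sulfur_frac}

def gas_turbine(stream, efficiency=0.38):
    return {"power_MW": efficiency * 20.0 * stream["clean_syngas"]}  # placeholder

if __name__ == "__main__":
    s = coal_preparation(30.0)
    for section in (gasification, gas_cooling, acid_gas_removal, gas_turbine):
        s = section(s)                                     # chain the sections
    print(s)
```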

  14. CRUSH1: a simplified computer program for impact analysis of radioactive material transport casks

    Energy Technology Data Exchange (ETDEWEB)

    Ikushima, Takeshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1996-07-01

Detailed drop impact analyses for radioactive material transport casks can now be performed with interaction-evaluation computer programs such as DYNA2D, DYNA3D, PISCES, and HONDO. However, considerable cost and computer time are needed to run these programs. To meet practical requirements, a simplified computer program, CRUSH1, has been developed. CRUSH1 is a static calculation program capable of evaluating the maximum acceleration of cask bodies and the maximum deformation of shock absorbers using a Uniaxial Displacement Method (UDM). CRUSH1 is a revised version of CRUSH. The main revisions are as follows: (1) not only mainframe computers but also workstations (UNIX) and personal computers (Windows 3.1 or Windows NT) can be used to run CRUSH1, and (2) the input data set has been revised. The paper gives a brief illustration of the calculation method using UDM. The second section presents comparisons between UDM and the detailed method. The third section provides a user's guide for CRUSH1. (author)

  15. CRUSH1: a simplified computer program for impact analysis of radioactive material transport casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1996-07-01

Detailed drop impact analyses for radioactive material transport casks can now be performed with interaction-evaluation computer programs such as DYNA2D, DYNA3D, PISCES, and HONDO. However, considerable cost and computer time are needed to run these programs. To meet practical requirements, a simplified computer program, CRUSH1, has been developed. CRUSH1 is a static calculation program capable of evaluating the maximum acceleration of cask bodies and the maximum deformation of shock absorbers using a Uniaxial Displacement Method (UDM). CRUSH1 is a revised version of CRUSH. The main revisions are as follows: (1) not only mainframe computers but also workstations (UNIX) and personal computers (Windows 3.1 or Windows NT) can be used to run CRUSH1, and (2) the input data set has been revised. The paper gives a brief illustration of the calculation method using UDM. The second section presents comparisons between UDM and the detailed method. The third section provides a user's guide for CRUSH1. (author)
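
    The CRUSH1 report is not reproduced in these records, and the sketch below is not the UDM algorithm itself. As a rough, hedged illustration of a static energy-balance crush estimate in the same spirit, the code deepens a shock absorber's crush until the absorbed energy matches the drop energy and then reports the deformation and the implied deceleration; the force-deflection curve and cask data are invented.

```python
# Hypothetical static crush estimate: crush a shock absorber with a given
# force-deflection curve until its absorbed energy equals the cask's drop
# energy, then report maximum deformation and the implied deceleration.
# The force-deflection curve and cask data are illustrative only.

G = 9.81

def crush_estimate(mass_kg, drop_height_m, force_of_deflection, dx=1e-4):
    energy_target = mass_kg * G * drop_height_m   # potential energy of the drop
    absorbed, x = 0.0, 0.0
    while absorbed < energy_target:
        x += dx
        absorbed += force_of_deflection(x) * dx   # rectangle rule is enough here
    peak_force = force_of_deflection(x)
    return x, peak_force / (mass_kg * G)          # deformation (m), peak g-load

if __name__ == "__main__":
    # Simple hardening absorber: F = k1*x + k2*x**3 (made-up stiffnesses).
    absorber = lambda x: 2.0e7 * x + 5.0e9 * x ** 3
    defl, g_load = crush_estimate(80_000.0, 9.0, absorber)
    print(f"max deformation ~ {defl:.3f} m, peak deceleration ~ {g_load:.1f} g")
```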

  16. Computationally-optimized bone mechanical modeling from high-resolution structural images.

    Directory of Open Access Journals (Sweden)

    Jeremy F Magland

Full Text Available Image-based mechanical modeling of the complex micro-structure of human bone has shown promise as a non-invasive method for characterizing bone strength and fracture risk in vivo. In particular, elastic moduli obtained from image-derived micro-finite element (μFE) simulations have been shown to correlate well with results obtained by mechanical testing of cadaveric bone. However, most existing large-scale finite-element simulation programs require significant computing resources, which hamper their use in common laboratory and clinical environments. In this work, we theoretically derive and computationally evaluate the resources needed to perform such simulations (in terms of computer memory and computation time), which are dependent on the number of finite elements in the image-derived bone model. A detailed description of our approach is provided, which is specifically optimized for μFE modeling of the complex three-dimensional architecture of trabecular bone. Our implementation includes domain decomposition for parallel computing, a novel stopping criterion, and a system for speeding up convergence by pre-iterating on coarser grids. The performance of the system is demonstrated on a system with dual quad-core Xeon 3.16 GHz CPUs and 40 GB of RAM. Models of the distal tibia derived from 3D in-vivo MR images of a patient, comprising 200,000 elements, required less than 30 seconds to converge (and 40 MB of RAM). To illustrate the system's potential for large-scale μFE simulations, axial stiffness was estimated from high-resolution micro-CT images of a voxel array of 90 million elements comprising the human proximal femur in seven hours of CPU time. In conclusion, the system described should enable image-based finite-element bone simulations in practical computation times on high-end desktop computers, with applications to laboratory studies and clinical imaging.
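
    The paper's resource derivation is not reproduced in this record. As a hedged back-of-the-envelope sketch of the scaling idea (memory and solution time growing roughly with element count), the fragment below back-calculates per-element constants from the figures quoted above (about 40 MB and 30 s for 200,000 elements) and extrapolates linearly; real solver scaling is more complicated than this.

```python
# Rough linear-scaling estimate of μFE solver resources from element count.
# Per-element constants are back-calculated from the figures quoted in the
# abstract (about 40 MB and 30 s for a 200,000-element tibia model); they are
# assumptions for illustration, not values from the paper's derivation.

BYTES_PER_ELEMENT = 40e6 / 200_000      # ~200 B per element
SECONDS_PER_ELEMENT = 30.0 / 200_000    # ~0.15 ms per element (single worker)

def estimate(n_elements, n_workers=1):
    mem_gb = n_elements * BYTES_PER_ELEMENT / 1e9
    time_h = n_elements * SECONDS_PER_ELEMENT / n_workers / 3600.0
    return mem_gb, time_h

if __name__ == "__main__":
    for n in (200_000, 90_000_000):
        mem, hours = estimate(n)
        print(f"{n:>12,d} elements: ~{mem:6.1f} GB, ~{hours:6.2f} h (1 worker)")
```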

  17. COMPUTATIONAL MODELS FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar

    2011-01-01

    Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...

  18. How robotics programs influence young women's career choices : a grounded theory model

    Science.gov (United States)

    Craig, Cecilia Dosh-Bluhm

The fields of engineering, computer science, and physics have a paucity of women despite decades of intervention by universities and organizations. Women's graduation rates in these fields continue to stagnate, posing a critical problem for society. This qualitative grounded theory (GT) study sought to understand how robotics programs influenced young women's career decisions and the program's effect on engineering, physics, and computer science career interests. To test this, a study was mounted to explore how the FIRST (For Inspiration and Recognition of Science and Technology) Robotics Competition (FRC) program influenced young women's college major and career choices. Career theories suggested that experiential programs coupled with supportive relationships strongly influence career decisions, especially for science, technology, engineering, and mathematics careers. The study explored how and when young women made career decisions and how the experiential program and its mentors and role models influenced career choice. Online focus groups and interviews (online and face-to-face) with 10 female FRC alumnae and GT processes (inductive analysis, open coding, categorizations using mind maps and content clouds) were used to generate a general systems theory style model of the career decision process for these young women. The study identified gender stereotypes and other career obstacles for women. The study's conclusions include recommendations to foster connections to real-world challenges, to develop training programs for mentors, and to nurture social cohesion, a mostly untapped area. Implementing these recommendations could help grow a critical mass of women in engineering, physics, and computer science careers, a social change worth pursuing.

  19. A depth-first search algorithm to compute elementary flux modes by linear programming.

    Science.gov (United States)

    Quek, Lake-Ee; Nielsen, Lars K

    2014-07-30

The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is near impossible; even for moderately sized models, exhaustive enumeration is computationally demanding. The algorithm described here performs a depth-first search and uses linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment onto computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints.
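
    The record describes the strategy only at a high level. As a minimal, hedged sketch of the flux-feasibility test that such an approach relies on, the code below asks an LP solver whether any steady-state flux distribution exists in a toy network when a chosen reaction is forced to carry flux; the network and bounds are invented, and this is not the paper's implementation.

```python
# Toy flux-feasibility test with linear programming: is there a steady-state
# flux vector (S v = 0, bounds respected) in which a chosen reaction carries
# at least a minimal flux? The stoichiometric matrix is a small made-up example.

import numpy as np
from scipy.optimize import linprog

# Metabolites (rows) x reactions (columns):
#  v0: -> A,  v1: A -> B,  v2: B -> C,  v3: C ->,  v4: A -> C,  v5: B -> D (dead end)
S = np.array([
    [1, -1,  0,  0, -1,  0],   # A
    [0,  1, -1,  0,  0, -1],   # B
    [0,  0,  1, -1,  1,  0],   # C
    [0,  0,  0,  0,  0,  1],   # D
])

def flux_feasible(S, forced, eps=1e-3, vmax=10.0):
    """One LP call: can reaction `forced` carry at least `eps` flux at steady state?"""
    n = S.shape[1]
    bounds = [(0.0, vmax)] * n          # all reactions irreversible in this toy
    bounds[forced] = (eps, vmax)        # force this reaction to be active
    res = linprog(c=np.zeros(n), A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=bounds, method="highs")
    return res.status == 0              # 0 means a feasible optimum was found

if __name__ == "__main__":
    for r in range(S.shape[1]):
        print(f"reaction v{r}: can carry steady-state flux -> {flux_feasible(S, r)}")
```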

  20. Computer model for refinery operations with emphasis on jet fuel production. Volume 3: Detailed systems and programming documentation

    Science.gov (United States)

    Dunbar, D. N.; Tunnah, B. G.

    1978-01-01

The FORTRAN computer program predicts flow streams and material, energy, and economic balances of a typical petroleum refinery, with particular emphasis on the production of aviation turbine fuels of varying end-point and hydrogen-content specifications. The program has a provision for shale oil and coal oil in addition to petroleum crudes. A case-study feature permits dependent cases to be run for parametric or optimization studies by inputting only the variables that are changed from the base case.
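
    Volume 3 of the documentation is not included here. As a small, hedged sketch of the case-study idea described above, a dependent case is expressed as only the variables that differ from the base case; the variable names and the toy yield relation are illustrative, not the refinery model's actual inputs.

```python
# Sketch of a "case study" mechanism: a dependent case supplies only the
# inputs that change relative to the base case. Variable names and the toy
# yield relation are illustrative, not the refinery model's actual inputs.

BASE_CASE = {
    "crude_rate_bbl_d": 100_000,
    "jet_fuel_end_point_F": 520,
    "jet_fuel_h2_wt_pct": 13.8,
    "crude_type": "petroleum",
}

def run_case(overrides=None):
    case = {**BASE_CASE, **(overrides or {})}    # base case plus changed variables
    # Placeholder "model": jet yield falls as the end-point spec is tightened.
    jet_yield = 0.12 * case["crude_rate_bbl_d"] * (case["jet_fuel_end_point_F"] / 520.0)
    return case, jet_yield

if __name__ == "__main__":
    for overrides in (None,
                      {"jet_fuel_end_point_F": 480},
                      {"crude_type": "shale_oil", "crude_rate_bbl_d": 80_000}):
        case, yield_bbl = run_case(overrides)
        print(case["crude_type"], case["jet_fuel_end_point_F"], round(yield_bbl))
```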