WorldWideScience

Sample records for program computer modeling

  1. STEW A Nonlinear Data Modeling Computer Program

    CERN Document Server

    Chen, H

    2000-01-01

    A nonlinear data modeling computer program, STEW, employing the Levenberg-Marquardt algorithm, has been developed to model the experimental ²³⁹Pu(n,f) and ²³⁵U(n,f) cross sections. This report presents results of the modeling of the ²³⁹Pu(n,f) and ²³⁵U(n,f) cross-section data. The calculation of the fission transmission coefficient is based on the double-humped-fission-barrier model of Bjornholm and Lynn. Incident neutron energies of up to 5 MeV are considered.
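
    As an editorial illustration of the fitting approach the abstract describes (Levenberg-Marquardt minimization of the misfit between a model and measured cross sections), the short Python sketch below fits a placeholder model to synthetic data with SciPy. The model form, starting values, and data are assumptions made for illustration; they are not the double-humped-barrier model or the evaluated ²³⁹Pu/²³⁵U data used by STEW.

```python
import numpy as np
from scipy.optimize import least_squares

# Placeholder stand-in for the cross-section model; STEW instead evaluates
# fission transmission coefficients from the double-humped-barrier formalism.
def model(params, energy_mev):
    a, b, c = params
    return a + b * np.tanh(c * energy_mev)

def residuals(params, energy_mev, sigma_obs):
    # Misfit between modeled and "measured" cross sections.
    return model(params, energy_mev) - sigma_obs

# Synthetic "measurements" on an energy grid up to 5 MeV.
energy = np.linspace(0.1, 5.0, 50)
rng = np.random.default_rng(0)
observed = 1.8 + 0.3 * np.tanh(1.2 * energy) + rng.normal(0.0, 0.02, energy.size)

# method="lm" selects the Levenberg-Marquardt algorithm.
fit = least_squares(residuals, x0=[1.0, 1.0, 1.0],
                    args=(energy, observed), method="lm")
print("fitted parameters:", fit.x)
```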

  2. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  3. Human operator identification model and related computer programs

    Science.gov (United States)

    Kessler, K. M.; Mohr, J. N.

    1978-01-01

    Four computer programs which provide computational assistance in the analysis of man/machine systems are reported. The programs are: (1) Modified Transfer Function Program (TF); (2) Time Varying Response Program (TVSR); (3) Optimal Simulation Program (TVOPT); and (4) Linear Identification Program (SCIDNT). The TF program converts the time-domain state-variable system representation to a frequency-domain transfer-function system representation. The TVSR program computes time histories of the input/output responses of the human operator model. The TVOPT program is an optimal simulation program and is similar to TVSR in that it produces time histories of system states associated with an operator-in-the-loop system. The differences between the two programs are presented. The SCIDNT program is an open-loop identification code which operates on the simulated data from TVOPT (or TVSR) or real operator data from motion simulators.

  4. Architecture and Programming Models for High Performance Intensive Computation

    Science.gov (United States)

    2016-06-29

    AFRL-AFOSR-VA-TR-2016-0230. Architecture and Programming Models for High Performance Intensive Computation. XiaoMing Li, University of Delaware. Final report; grant number FA9550-13-1-0213. The project focused on developing an efficient system architecture and software tools for building and running Dynamic Data Driven Application Systems (DDDAS). The foremost

  5. Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    John Mellor-Crummey

    2008-02-29

    Rice University's achievements as part of the Center for Programming Models for Scalable Parallel Computing include: (1) design and implementation of cafc, the first multi-platform CAF compiler for distributed and shared-memory machines, (2) performance studies of the efficiency of programs written using the CAF and UPC programming models, (3) a novel technique to analyze explicitly-parallel SPMD programs that facilitates optimization, (4) design, implementation, and evaluation of new language features for CAF, including communication topologies, multi-version variables, and distributed multithreading to simplify development of high-performance codes in CAF, and (5) a synchronization strength reduction transformation for automatically replacing barrier-based synchronization with more efficient point-to-point synchronization. The prototype Co-array Fortran compiler cafc developed in this project is available as open source software from http://www.hipersoft.rice.edu/caf.

  6. Modeling of rolling element bearing mechanics. Computer program user's manual

    Science.gov (United States)

    Greenhill, Lyn M.; Merchant, David H.

    1994-10-01

    This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It comprises two main programs: the Preprocessor for Bearing Analysis (PREBAN), which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  7. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  8. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present-day science and engineering. Raymond Turner provides a logical framework and foundation for the specification and design of specification languages, and uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications, from programming language semantics and specification languages through to knowledge representation languages and formalisms for natural language semantics. They are al

  9. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to the rapid improvement of climate models. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes, that fully utilizes the hardware and software capabilities of new computer architectures, that probes the limits of climate predictability, and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  10. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  11. Computer programs for forward and inverse modeling of acoustic and electromagnetic data

    Science.gov (United States)

    Ellefsen, Karl J.

    2011-01-01

    A suite of computer programs was developed by U.S. Geological Survey personnel for forward and inverse modeling of acoustic and electromagnetic data. This report describes the computer resources that are needed to execute the programs, the installation of the programs, the program designs, some tests of their accuracy, and some suggested improvements.

  12. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level....

  13. Introduction to "Interactive models of computation and program behaviour"

    OpenAIRE

    Curien, Pierre-Louis

    2009-01-01

    Since the mid-eighties of the last century, a fruitful interplay between computer scientists and mathematicians has led to much progress in the understanding of programming languages, and has given new impetus to areas of mathematics such as proof theory or category theory. The volume of which this text is an introduction contains three contributions: Categorical semantics of linear logic, by P.-A. Melliès; Realizability in classical logic, by J.-L. Krivine; Abstract machines for dialogue gam...

  14. Quantum computer simulation using the CUDA programming model

    Science.gov (United States)

    Gutiérrez, Eladio; Romero, Sergio; Trenas, María A.; Zapata, Emilio L.

    2010-02-01

    Quantum computing emerges as a field that captures a great theoretical interest. Its simulation represents a problem with high memory and computational requirements, which makes the use of parallel platforms advisable. In this work we deal with the simulation of an ideal quantum computer on the Compute Unified Device Architecture (CUDA), as such a problem can benefit from the high computational capacities of Graphics Processing Units (GPUs). After all, modern GPUs are becoming very powerful computational architectures, which is causing a growing interest in their application to general-purpose computation. CUDA provides an execution model oriented towards a more general exploitation of the GPU, allowing it to be used as a massively parallel SIMT (Single-Instruction Multiple-Thread) multiprocessor. A simulator that takes into account memory reference locality issues is proposed, showing that the challenge of achieving high performance depends strongly on the explicit exploitation of the memory hierarchy. Several strategies have been experimentally evaluated, obtaining good performance results in comparison with conventional platforms.
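
    For orientation only: the basic operation such a simulator performs is a strided update of the state vector each time a single-qubit gate is applied, and it is this strided access pattern that makes memory-reference locality the central performance issue on GPUs. The sketch below uses plain NumPy rather than CUDA and is not the authors' code; the little-endian qubit-index convention is an assumption.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target):
    """Apply a 2x2 gate to the `target` qubit of a state vector.

    Amplitudes are paired across a stride of 2**target, so the memory
    access pattern changes with the target qubit.
    """
    stride = 1 << target
    new_state = state.copy()
    for base in range(0, len(state), 2 * stride):
        for offset in range(stride):
            i0 = base + offset      # amplitude with target bit = 0
            i1 = i0 + stride        # amplitude with target bit = 1
            a0, a1 = state[i0], state[i1]
            new_state[i0] = gate[0, 0] * a0 + gate[0, 1] * a1
            new_state[i1] = gate[1, 0] * a0 + gate[1, 1] * a1
    return new_state

n_qubits = 3
state = np.zeros(2 ** n_qubits, dtype=complex)
state[0] = 1.0                                   # start in |000>
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
print(np.round(apply_single_qubit_gate(state, hadamard, target=0), 3))
```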

  15. Execution Models for Mapping Programs onto Distributed Memory Parallel Computers

    Science.gov (United States)

    1992-03-01

    Alan Sussman, Institute for Computer Applications in Science and Engineering (ICASE), NASA Langley Research Center, Hampton, VA 23665. Contract No. NAS1-18605, March 1992. Reference: ...Computation onto Distributed Memory Parallel Computers. PhD thesis, Carnegie Mellon University, September 1991. Also available as Technical Report

  16. Computer Programs.

    Science.gov (United States)

    Anderson, Tiffoni

    This module provides information on development and use of a Material Safety Data Sheet (MSDS) software program that seeks to link literacy skills education, safety training, and human-centered design. Section 1 discusses the development of the software program that helps workers understand the MSDSs that accompany the chemicals with which they…

  17. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  18. Thermal models of buildings. Determination of temperatures, heating and cooling loads. Theories, models and computer programs

    Energy Technology Data Exchange (ETDEWEB)

    Kaellblad, K.

    1998-05-01

    The need to estimate indoor temperatures, heating or cooling loads and energy requirements for buildings arises at many stages of a building's life cycle, e.g. at the early layout stage, during the design of a building and for energy retrofitting planning. Other purposes are to meet the authorities' requirements given in building codes. All these situations require good calculation methods. The main purpose of this report is to present the author's work with problems related to thermal models and calculation methods for determination of temperatures and heating or cooling loads in buildings. Thus the major part of the report deals with treatment of solar radiation in glazing systems, shading of solar and sky radiation and the computer program JULOTTA used to simulate the thermal behavior of rooms and buildings. Other parts of thermal models of buildings are more briefly discussed and included in order to give an overview of existing problems and available solutions. A brief presentation of how thermal models can be built up is also given, and it is hoped that the report can be useful as an introduction to this part of building physics as well as during development of calculation methods and computer programs. The report may also serve as a help for the users of energy-related programs. Independent of which method or program a user chooses to work with, it is his or her own responsibility to understand the limits of the tool, or else wrong conclusions may be drawn from the results. 52 refs, 22 figs, 4 tabs

  19. Causal Analysis for Performance Modeling of Computer Programs

    Directory of Open Access Journals (Sweden)

    Jan Lemeire

    2007-01-01

    Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing the complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of experimental data is estimated by kernel density estimation. We then reported on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports on the results for a LU decomposition algorithm and on the study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.
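
    As a rough sketch of the form-free, information-theoretic dependence measure mentioned above (mutual information estimated from kernel density estimates), the Python fragment below integrates the estimated densities over a grid. The grid size, SciPy's default bandwidth and the test data are placeholder choices, not the authors' implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

def mutual_information_kde(x, y, grid_size=50):
    """Estimate I(X;Y) from samples via Gaussian kernel density estimation."""
    kde_xy = gaussian_kde(np.vstack([x, y]))
    kde_x = gaussian_kde(x)
    kde_y = gaussian_kde(y)

    xs = np.linspace(x.min(), x.max(), grid_size)
    ys = np.linspace(y.min(), y.max(), grid_size)
    gx, gy = np.meshgrid(xs, ys)
    pts = np.vstack([gx.ravel(), gy.ravel()])

    p_xy = kde_xy(pts)              # joint density on the grid
    p_x = kde_x(gx.ravel())         # marginal densities at the same points
    p_y = kde_y(gy.ravel())

    dx, dy = xs[1] - xs[0], ys[1] - ys[0]
    integrand = p_xy * np.log(p_xy / (p_x * p_y) + 1e-12)
    return np.sum(integrand) * dx * dy

rng = np.random.default_rng(1)
a = rng.normal(size=500)
b = 0.8 * a + 0.2 * rng.normal(size=500)   # strongly dependent on a
print("estimated I(a;b):", round(mutual_information_kde(a, b), 3))
```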

  20. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    The aim of this study is to present an approach to the introduction into pipeline and parallel computing, using a model of the multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. The modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included in the curriculum. At the same time, the topic is among the most motivating tasks due to the comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. It allows implementing an educational platform of constructivist learning process, thus enabling learners’ experimentation with the provided programming models, obtaining learners’ competences of the modern scientific research and computational thinking, and capturing the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The programming language C for developing programming models and message passing interface (MPI) and OpenMP parallelization tools have been chosen for implementation.
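
    The study's programming models are built in C with MPI and OpenMP; purely to illustrate the pipeline pattern itself (stages connected by queues, with items streaming through the stages), here is a minimal sketch using Python's multiprocessing module. The stage functions and data are arbitrary placeholders, not the paper's multiphase queueing model.

```python
from multiprocessing import Process, Queue

SENTINEL = None  # marks the end of the stream

def stage(in_q, out_q, func):
    """Generic pipeline stage: read items, transform them, pass downstream."""
    while True:
        item = in_q.get()
        if item is SENTINEL:
            out_q.put(SENTINEL)
            break
        out_q.put(func(item))

def square(x):
    return x * x

def increment(x):
    return x + 1

if __name__ == "__main__":
    q0, q1, q2 = Queue(), Queue(), Queue()
    stages = [Process(target=stage, args=(q0, q1, square)),
              Process(target=stage, args=(q1, q2, increment))]
    for p in stages:
        p.start()
    for x in range(5):
        q0.put(x)           # feed the first stage
    q0.put(SENTINEL)
    results = []
    while (item := q2.get()) is not SENTINEL:
        results.append(item)
    for p in stages:
        p.join()
    print(results)          # [1, 2, 5, 10, 17]
```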

  1. STRUCTURE AND INTERFACE OF PROGRAM FACILITIES FOR RESEARCH OF PHYSICAL PROCESSES ON COMPUTER MODELS

    Directory of Open Access Journals (Sweden)

    Vitalii M. Bazurin

    2014-11-01

    Research on physical processes using computer models is one of the research-based approaches in the study of general physics at pedagogical universities. The article identifies the basic elements of software for studying physical processes with computer models: the structure of the program and the features of its interface. The author offers his vision of the structure of the software by means of which computer models of physical phenomena and processes are implemented: a registration block, an entrance-check block, a physical-process simulation block and a results-verification block. Such a structure is, in the author's opinion, universal. The software developed by the author is described in the article.

  2. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  3. Manual of phosphoric acid fuel cell stack three-dimensional model and computer program

    Science.gov (United States)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    A detailed distributed mathematical model of a phosphoric acid fuel cell stack has been developed, along with the FORTRAN computer program, for analyzing the temperature distribution in the stack and the associated current density distribution on the cell plates. Energy, mass, and electrochemical analyses in the stack were combined to develop the model. Several reasonable assumptions were made to solve this mathematical model by means of the finite-difference numerical method.

  4. Phosphoric acid fuel cell power plant system performance model and computer program

    Science.gov (United States)

    Alkasab, K. A.; Lu, C. Y.

    1984-01-01

    A FORTRAN computer program was developed for analyzing the performance of phosphoric acid fuel cell power plant systems. Energy, mass, and electrochemical analyses in the reformer, the shift converters, the heat exchangers, and the fuel cell stack were combined to develop a mathematical model of the power plant for both atmospheric and pressurized conditions, and for several commercial fuels.

  5. Manual of phosphoric acid fuel cell power plant optimization model and computer program

    Science.gov (United States)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    An optimized cost and performance model for a phosphoric acid fuel cell power plant system was derived and developed into a modular FORTRAN computer code. Cost, energy, mass, and electrochemical analyses were combined to develop a mathematical model for optimizing the steam-to-methane ratio in the reformer, the hydrogen utilization in the PAFC, and the plates per stack. The nonlinear programming code COMPUTE was used to solve this model, in which the method of mixed penalty function combined with Hooke and Jeeves pattern search was chosen to evaluate this specific optimization problem.

  6. An inverse modeling strategy and a computer program to model garnet growth and resorption

    Science.gov (United States)

    Lanari, Pierre; Giuntoli, Francesco

    2017-04-01

    GrtMod is a computer program that allows numerical simulation of the pressure-temperature (P-T) evolution of garnet porphyroblasts based on the composition of successive growth zones preserved in natural samples. For each garnet growth stage, a new reactive bulk composition is optimized, allowing for resorption and/or fractionation of the previously crystallized garnet. The successive minimizations are performed using a heuristic search method and an objective function that quantifies the amount by which the predicted garnet composition deviates from the measured values. The automated strategy of GrtMod includes two optimization stages and one refinement stage. In this contribution, we will present several application examples. The new strategy provides quantitative estimates of the optimal P-T conditions, whereas such estimates were previously derived only qualitatively by using garnet isopleth intersections in equilibrium phase diagrams. GrtMod can also be used to model the evolution of the reactive bulk composition along any P-T trajectory. The results for typical MORB and metapelite compositions demonstrate that fractional crystallization models are required to derive accurate P-T information from garnet compositional zoning. GrtMod can also be used to retrieve complex garnet histories involving several stages of resorption. For instance, it has been used to model the P-T conditions of garnet growth in grains from the Sesia Zone (Western Alps). The compositional variability of successive growth zones is characterized using standardized X-ray maps and the program XMapTools. Permian garnet cores crystallized under granulite facies conditions (T > 800°C and P = 6 kbar), whereas Alpine garnet rims grew at eclogite facies conditions (650°C and 16 kbar) involving several successive episodes of resorption. The model predicts that up to 50 vol% of garnet was dissolved before a new episode of garnet growth.

  7. The Use of Computer-Based Programming Environments as Computer Modelling Tools in Early Science Education: The Cases of Textual and Graphical Program Languages

    Science.gov (United States)

    Louca, Loucas T.; Zacharia, Zacharia C.

    2008-01-01

    This is an interpretive case study seeking to develop detailed and comparative descriptions of how two groups of fifth-grade students used two different Computer-based Programming Environments (CPEs) (namely Microworlds Logo and Stagecast Creator) during scientific modelling. The primary sources of data that were used in this 4-month study include…

  8. Implications of the Turing machine model of computation for processor and programming language design

    Science.gov (United States)

    Hunter, Geoffrey

    2004-01-01

    A computational process is classified according to the theoretical model that is capable of executing it; computational processes that require a non-predeterminable amount of intermediate storage for their execution are Turing-machine (TM) processes, while those whose storage is predeterminable are Finite Automaton (FA) processes. Simple processes (such as a traffic light controller) are executable by a Finite Automaton, whereas the most general kind of computation requires a Turing Machine for its execution. This implies that a TM process must have a non-predeterminable amount of memory allocated to it at intermediate instants of its execution; i.e. dynamic memory allocation. Many processes encountered in practice are TM processes. The implication for computational practice is that the hardware (CPU) architecture and its operating system must facilitate dynamic memory allocation, and that the programming language used to specify TM processes must have statements with the semantic attribute of dynamic memory allocation, for in Alan Turing's thesis on computation (1936) the "standard description" of a process is invariant over the most general data that the process is designed to process; i.e. the program describing the process should never have to be modified to allow for differences in the data that is to be processed in different instantiations; i.e. data-invariant programming. Any non-trivial program is partitioned into sub-programs (procedures, subroutines, functions, modules, etc.). Examination of the calls/returns between the subprograms reveals that they are nodes in a tree-structure; this tree-structure is independent of the programming language used to encode (define) the process. Each sub-program typically needs some memory for its own use (to store values intermediate between its received data and its computed results); this locally required memory is not needed before the subprogram commences execution, and it is not needed after its execution terminates

  9. Creation of a Course in Computer Methods and Modeling for Undergraduate Earth Science Programs

    Science.gov (United States)

    Menking, K. M.; Dashnaw, J. M.

    2003-12-01

    In recent years computer modeling has gained importance in geological research as a means to generate and test hypotheses and to allow simulation of processes in places inaccessible to humans (e.g., outer core fluid dynamics), too slow to permit observation (e.g., erosionally-induced uplift of topography), or too large to facilitate construction of physical models (e.g., faulting on the San Andreas). Entire fields within the Earth sciences now exist in which computer modeling has become the core work of the discipline. Undergraduate geology/Earth science programs have been slow to adapt to this change, and computer science curricular offerings often do not meet geology students' needs. To address these problems, a course in Computer Methods and Modeling in the Earth Sciences is being developed at Vassar College. The course uses the STELLA iconographical box modeling software developed by High Performance Systems, Inc. to teach students the fundamentals of dynamical systems modeling and then builds on the knowledge students have constructed with STELLA to teach introductory computer programming in Fortran. Fully documented and debugged STELLA and Fortran models along with reading lists, answer keys, and course notes are being developed for distribution to anyone interested in teaching a course such as this. Modeling topics include U-Pb concordia/discordia dating techniques, the global phosphorus cycle, Earth's energy balance and temperature, the impact of climate change on a chain of lakes in eastern California, heat flow in permafrost, and flow of ice in glaciers by plastic deformation. The course has been taught twice at Vassar and has been enthusiastically received by students who reported not only that they enjoyed learning the process of modeling, but also that they had a newfound appreciation for the role of mathematics in geology and intended to enroll in more math courses in the future.

  10. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  11. Flexible Animation Computer Program

    Science.gov (United States)

    Stallcup, Scott S.

    1990-01-01

    FLEXAN (Flexible Animation), computer program animating structural dynamics on Evans and Sutherland PS300-series graphics workstation with VAX/VMS host computer. Typical application is animation of spacecraft undergoing structural stresses caused by thermal and vibrational effects. Displays distortions in shape of spacecraft. Program displays single natural mode of vibration, mode history, or any general deformation of flexible structure. Written in FORTRAN 77.

  12. Model-Driven Development for scientific computing. An upgrade of the RHEEDGr program

    Science.gov (United States)

    Daniluk, Andrzej

    2009-11-01

    Model-Driven Engineering (MDE) is the software engineering discipline which considers models as the most important element for software development, and for the maintenance and evolution of software, through model transformation. Model-Driven Architecture (MDA) is the approach for software development under the Model-Driven Engineering framework. This paper surveys the core MDA technology that was used to upgrade the RHEEDGR program to C++0x language standards. New version program summary: Program title: RHEEDGR-09. Catalogue identifier: ADUY_v3_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADUY_v3_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 21 263. No. of bytes in distributed program, including test data, etc.: 1 266 982. Distribution format: tar.gz. Programming language: Code Gear C++ Builder. Computer: Intel Core Duo-based PC. Operating system: Windows XP, Vista, 7. RAM: more than 1 MB. Classification: 4.3, 7.2, 6.2, 8, 14. Does the new version supersede the previous version?: Yes. Nature of problem: Reflection High-Energy Electron Diffraction (RHEED) is a very useful technique for studying growth and surface analysis of thin epitaxial structures prepared by Molecular Beam Epitaxy (MBE). The RHEED technique can reveal, almost instantaneously, changes either in the coverage of the sample surface by adsorbates or in the surface structure of a thin film. Solution method: The calculations are based on the use of a dynamical diffraction theory in which the electrons are taken to be diffracted by a potential which is periodic in the dimension perpendicular to the surface. Reasons for new version: Responding to user feedback, the graphical version of the RHEED program has been upgraded to C++0x language standards. Also, functionality and documentation of the

  13. Computer Security Assistance Program

    Science.gov (United States)

    1997-09-01

    OPR: HQ AFCA/SYS (CMSgt Hogan). Certified by: HQ USAF/SCXX (Lt Col Francis X. McGovern). Pages: 5. Distribution: F. This instruction implements Air Force Policy Directive (AFPD) 33-2, Information Protection, establishes the Air Force Computer Security Assistance... Force single point of contact for reporting and handling computer security incidents and vulnerabilities including AFCERT advisories and Defense

  14. A system approach to pharmacodynamics. III: An algorithm and computer program, COLAPS, for pharmacodynamic modeling.

    Science.gov (United States)

    Veng-Pedersen, P; Mandema, J W; Danhof, M

    1991-05-01

    Many pharmacodynamic (PD) models may be generalized in the form E(t) = N(L[c(t)]), where E(t) is a recorded effect response, c(t) is a sampled drug level, N is a nonlinear autonomic function, and L is a linear operator that commonly is a convolution operation. The NL class of PD models includes the traditional effect compartment PD models as a subclass, but is not limited to such models. An algorithm and computer program named COLAPS, based on system analysis principles and hysteresis minimization, that enable N and L to be empirically determined for the NL class of models without addressing specific kinetic structure aspects ("model independence") are presented. The kinetic concepts of biophase conduction and transduction functions are used by COLAPS. Such an approach is more general than the effect compartment approaches because it does not assume first-order transport principles. The pitfalls of hysteresis minimization in PD modeling are discussed and the procedures taken by COLAPS to avoid these pitfalls are outlined. A transformation technique prevents improper convergence to a point. A novel reparameterization scheme is introduced that maximizes the flexibility of the kinetic functions and extends the generality of the analysis. Inequality function constraints are maintained without the need for troublesome constrained nonlinear optimization procedures. Usage of the COLAPS program is illustrated in the analysis of the PD of amobarbital. The COLAPS program resulted in an excellent minimization of the effect versus biophase level hysteresis. The biophase conduction function, the biophase drug level (normalized), and the transduction curve were determined. The transduction curve showed clear biphasic behavior.
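
    To make the generalized form E(t) = N(L[c(t)]) concrete, the sketch below convolves a drug-level curve with a conduction kernel (the linear operator L) and then applies a nonlinear transduction curve (N). The mono-exponential kernel and Hill-type sigmoid are illustrative assumptions only; COLAPS determines both functions empirically from the data rather than assuming these forms.

```python
import numpy as np

def biophase_conduction(c, t, k):
    """L: convolve the drug-level curve c(t) with a mono-exponential
    conduction function k*exp(-k*t) (one possible, assumed, choice)."""
    dt = t[1] - t[0]
    kernel = k * np.exp(-k * t)
    return np.convolve(c, kernel)[:len(t)] * dt

def transduction(x, emax, x50, n):
    """N: a sigmoid (Hill-type) curve mapping biophase level to effect."""
    return emax * x**n / (x50**n + x**n)

t = np.linspace(0.0, 24.0, 241)                    # hours
c = 10 * (np.exp(-0.2 * t) - np.exp(-1.5 * t))     # placeholder drug level c(t)
biophase = biophase_conduction(c, t, k=0.4)        # L[c(t)]
effect = transduction(biophase, emax=100, x50=3.0, n=2)  # E(t) = N(L[c(t)])
print("peak effect:", effect.max())
```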

  15. A Solution Methodology and Computer Program to Efficiently Model Thermodynamic and Transport Coefficients of Mixtures

    Science.gov (United States)

    Ferlemann, Paul G.

    2000-01-01

    A solution methodology has been developed to efficiently model multi-specie, chemically frozen, thermally perfect gas mixtures. The method relies on the ability to generate a single (composite) set of thermodynamic and transport coefficients prior to beginning a CFD solution. While not fundamentally a new concept, many applied CFD users are not aware of this capability nor have a mechanism to easily and confidently generate new coefficients. A database of individual specie property coefficients has been created for 48 species. The seven-coefficient form of the thermodynamic functions is currently used rather than the ten-coefficient form due to the similarity of the calculated properties, low temperature behavior and reduced CPU requirements. Sutherland laminar viscosity and thermal conductivity coefficients were computed in a consistent manner from available reference curves. A computer program has been written to provide CFD users with a convenient method to generate composite specie coefficients for any mixture. Mach 7 forebody/inlet calculations demonstrated nearly equivalent results and significant CPU time savings compared to a multi-specie solution approach. Results from high-speed combustor analysis also illustrate the ability to model inert test gas contaminants without additional computational expense.
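
    As a toy illustration of the composite-coefficient idea (not the program's actual database, coefficient form, or values): for a chemically frozen mixture, the mixture cp is the mass-fraction-weighted sum of the species cp's, and since each species cp is a polynomial in temperature, a single composite coefficient set can be assembled once, before the CFD solution begins. All numbers below are placeholders.

```python
import numpy as np

# Placeholder per-species polynomial coefficients for cp(T) per unit mass,
# cp_i(T) = c0 + c1*T + c2*T^2 + c3*T^3 + c4*T^4 (not real thermochemical data).
species_coeffs = {
    "N2": np.array([1.04e3, 1.0e-2, 0.0, 0.0, 0.0]),
    "O2": np.array([9.20e2, 2.0e-2, 0.0, 0.0, 0.0]),
}
mass_fractions = {"N2": 0.77, "O2": 0.23}

# Composite coefficients: mass-fraction-weighted sum, computed once up front.
composite = sum(mass_fractions[s] * species_coeffs[s] for s in species_coeffs)

def cp(coeffs, temperature):
    """Evaluate the cp polynomial at the given temperature."""
    powers = np.array([1.0, temperature, temperature**2,
                       temperature**3, temperature**4])
    return coeffs @ powers

print("mixture cp at 1000 K:", cp(composite, 1000.0))
```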

  16. Cross-scale Efficient Tensor Contractions for Coupled Cluster Computations Through Multiple Programming Model Backends

    Energy Technology Data Exchange (ETDEWEB)

    Ibrahim, Khaled Z. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Epifanovsky, Evgeny [Q-Chem, Inc., Pleasanton, CA (United States); Williams, Samuel W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Krylov, Anna I. [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Chemistry

    2016-07-26

    Coupled-cluster methods provide highly accurate models of molecular structure by explicit numerical calculation of tensors representing the correlation between electrons. These calculations are dominated by a sequence of tensor contractions, motivating the development of numerical libraries for such operations. While based on matrix-matrix multiplication, these libraries are specialized to exploit symmetries in the molecular structure and in electronic interactions, and thus reduce the size of the tensor representation and the complexity of contractions. The resulting algorithms are irregular and their parallelization has been previously achieved via the use of dynamic scheduling or specialized data decompositions. We introduce our efforts to extend the Libtensor framework to work in the distributed memory environment in a scalable and energy efficient manner. We achieve up to 240× speedup compared with the best optimized shared memory implementation. We attain scalability to hundreds of thousands of compute cores on three distributed-memory architectures (Cray XC30 and XC40, BlueGene/Q), and on a heterogeneous GPU-CPU system (Cray XK7). As the bottlenecks shift from being compute-bound DGEMMs to communication-bound collectives as the size of the molecular system scales, we adopt two radically different parallelization approaches for handling load-imbalance. Nevertheless, we preserve a unified interface to both programming models to maintain the productivity of computational quantum chemists.

  17. The updated algorithm of the Energy Consumption Program (ECP): A computer model simulating heating and cooling energy loads in buildings

    Science.gov (United States)

    Lansing, F. L.; Strain, D. M.; Chai, V. W.; Higgins, S.

    1979-01-01

    The Energy Consumption Program (ECP) was developed to simulate building heating and cooling loads and compute thermal and electric energy consumption and cost. This article reports on the new additional algorithms and modifications made in an effort to widen the areas of application. The program structure was rewritten accordingly to refine and advance the building model and to further reduce the processing time and cost. The program is noted for its very low cost and ease of use compared to other available codes. The accuracy of computations is not sacrificed, however, since the results are expected to lie within ±10% of actual energy meter readings.

  18. PhasePlot: A Software Program for Visualizing Phase Relations Computed Using Thermochemical Models and Databases

    Science.gov (United States)

    Ghiorso, M. S.

    2011-12-01

    A new software program has been developed for Macintosh computers that permits the visualization of phase relations calculated from thermodynamic data-model collections. The data-model collections of MELTS (Ghiorso and Sack, 1995, CMP 119, 197-212), pMELTS (Ghiorso et al., 2002, G-cubed 3, 10.1029/2001GC000217) and the deep mantle database of Stixrude and Lithgow-Bertelloni (2011, GJI 184, 1180-1213) are currently implemented. The software allows users to enter a system bulk composition and a range of reference conditions and then calculate a grid of phase relations. These relations may be visualized in a variety of ways including phase diagrams, phase proportion plots, and contour diagrams of phase compositions and abundances. Results may be exported into Excel or similar spreadsheet applications. Flexibility in stipulating reference conditions permit the construction of temperature-pressure, temperature-volume, entropy-pressure, or entropy-volume display grids. Calculations on the grid are performed for fixed bulk composition or in open systems governed by user specified constraints on component chemical potentials (e.g., specified oxygen fugacity buffers). The calculation engine for the software is optimized for multi-core compute architectures and is very fast, allowing a typical grid of 64 points to be calculated in under 10 seconds on a dual-core laptop/iMac. The underlying computational thermodynamic algorithms have been optimized for speed and robust behavior. Taken together, both of these advances facilitate classroom demonstrations and permit novice users to work with the program effectively, focusing on problem specification and interpretation of results rather than on manipulation and mechanics of computation - a key feature of an effective instructional tool. The emphasis in this software package is graphical visualization, which aids in better comprehension of complex phase relations in multicomponent systems. Anecdotal experience in using Phase

  19. Liquid rocket combustion computer model with distributed energy release. DER computer program documentation and user's guide, volume 1

    Science.gov (United States)

    Combs, L. P.

    1974-01-01

    A computer program for analyzing rocket engine performance was developed. The program is concerned with the formation, distribution, flow, and combustion of liquid sprays and combustion product gases in conventional rocket combustion chambers. The capabilities of the program to determine the combustion characteristics of the rocket engine are described. Sample data code sheets show the correct sequence and formats for variable values and include notes concerning options to bypass the input of certain data. A separate list defines the variables and indicates their required dimensions.

  20. CONNEC3D: a computer program for connectivity analysis of 3D random set models

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Dowd, Peter A.

    2003-07-01

    Geostatistical simulation is used in risk analysis studies to incorporate the spatial uncertainty of experimental variables that are significantly under-sampled. For example, the values of hydraulic conductivity or porosity are critical in petroleum reservoir production modelling and prediction, in assessing underground sites as waste repositories, and in modelling the transport of contaminants in aquifers. In all these examples connectivity of the permeable phase or permeable lithofacies is a critical issue. Given an indicator map on a regular two- or three-dimensional grid, which can be obtained from continuous-valued or from categorical variables, CONNEC3D performs a connectivity analysis of the phase of interest (coded 0 or 1 by an indicator function). 3D maps of multiple indicators, categories or continuous variables can also be analysed for connectivity by suitable coding of the input map. Connectivity analysis involves the estimation of the connectivity function τ(h) for different spatial directions and a number of connectivity statistics. Included in the latter are the number of connected components (ncc), average size of a connected component (cc), mean length of a cc in the X, Y and Z directions, size of the largest cc, maximum length of a cc along X, Y and Z and the numbers of percolating components along X, Y and Z. In addition, the program provides as output a file in which each cc is identified by an integer number ranging from 1 to ncc. The implementation of the program is demonstrated on a random set model generated by the sequential indicator algorithm. This provides a means of estimating the computational time required for different grid sizes and is also used to demonstrate computationally that when the semi-variogram of the indicator function is anisotropic the connectivity function is also anisotropic. There are options within the program for 6-connectivity analysis, 18-connectivity analysis and 26-connectivity analysis. The software is
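
    A minimal sketch of the kind of connectivity analysis described above, using SciPy's labeling of a 3D indicator array; the 6-, 18- and 26-connectivity options correspond to structuring elements of connectivity rank 1, 2 and 3. The statistics shown are only a subset of those CONNEC3D reports, and the random test field is a stand-in for a sequential-indicator realization.

```python
import numpy as np
from scipy import ndimage

def connectivity_statistics(indicator, rank=1):
    """Label connected components of the phase coded 1 in a 3D indicator
    array and return a few summary statistics (rank 1/2/3 = 6/18/26-connectivity)."""
    structure = ndimage.generate_binary_structure(3, rank)
    labels, ncc = ndimage.label(indicator, structure=structure)
    sizes = np.bincount(labels.ravel())[1:]          # skip background label 0
    return {
        "number of connected components": ncc,
        "mean component size": sizes.mean() if ncc else 0.0,
        "largest component size": int(sizes.max()) if ncc else 0,
    }

rng = np.random.default_rng(0)
phase = (rng.random((40, 40, 40)) < 0.3).astype(int)   # random indicator field
print(connectivity_statistics(phase, rank=1))           # 6-connectivity
print(connectivity_statistics(phase, rank=3))           # 26-connectivity
```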

  1. LIAR -- A computer program for the modeling and simulation of high performance linacs

    Energy Technology Data Exchange (ETDEWEB)

    Assmann, R.; Adolphsen, C.; Bane, K.; Emma, P.; Raubenheimer, T.; Siemann, R.; Thompson, K.; Zimmermann, F.

    1997-04-01

    The computer program LIAR (LInear Accelerator Research Code) is a numerical modeling and simulation tool for high performance linacs. Amongst others, it addresses the needs of state-of-the-art linear colliders where low emittance, high-intensity beams must be accelerated to energies in the 0.05-1 TeV range. LIAR is designed to be used for a variety of different projects. LIAR allows the study of single- and multi-particle beam dynamics in linear accelerators. It calculates emittance dilutions due to wakefield deflections, linear and non-linear dispersion and chromatic effects in the presence of multiple accelerator imperfections. Both single-bunch and multi-bunch beams can be simulated. Several basic and advanced optimization schemes are implemented. Present limitations arise from the incomplete treatment of bending magnets and sextupoles. A major objective of the LIAR project is to provide an open programming platform for the accelerator physics community. Due to its design, LIAR allows straightforward access to its internal FORTRAN data structures. The program can easily be extended and its interactive command language ensures maximum ease of use. Presently, versions of LIAR are compiled for UNIX and MS Windows operating systems. An interface for the graphical visualization of results is provided. Scientific graphs can be saved in the PS and EPS file formats. In addition a Mathematica interface has been developed. LIAR now contains more than 40,000 lines of source code in more than 130 subroutines. This report describes the theoretical basis of the program, provides a reference for existing features and explains how to add further commands. The LIAR home page and the ONLINE version of this manual can be accessed under: http://www.slac.stanford.edu/grp/arb/rwa/liar.htm.

  2. Computer Program NIKE

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2014-01-01

    FORTRAN source code for program NIKE (PC version of QCPE 343). Sample input and output for two model chemical reactions are appended: I. Three consecutive monomolecular reactions, II. A simple chain mechanism...

  3. OPR-PPR, a Computer Program for Assessing Data Importance to Model Predictions Using Linear Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Matthew J. Tonkin; Claire R. Tiedeman; D. Matthew Ely; and Mary C. Hill

    2007-08-16

    The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one

  4. OPR-PPR, a Computer Program for Assessing Data Importance to Model Predictions Using Linear Statistics

    Science.gov (United States)

    Tonkin, Matthew J.; Tiedeman, Claire R.; Ely, D. Matthew; Hill, Mary C.

    2007-01-01

    The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one
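
    A schematic sketch of the linear-theory quantity behind the OPR statistic: the prediction standard deviation computed from observation and prediction sensitivities, re-evaluated after omitting an observation. The matrix sizes, weights and random sensitivities below are placeholders; the actual program reads sensitivity files produced by MODFLOW-2000 and UCODE_2005 and handles the bookkeeping described in the report.

```python
import numpy as np

def prediction_std(jac_obs, jac_pred, weights):
    """Linear-theory standard deviation of a prediction (up to a common
    scale factor), from observation and prediction sensitivities."""
    xtwx = jac_obs.T @ (weights[:, None] * jac_obs)   # weighted normal matrix
    cov_params = np.linalg.inv(xtwx)                  # parameter covariance
    return float(np.sqrt(jac_pred @ cov_params @ jac_pred))

rng = np.random.default_rng(0)
jac_obs = rng.normal(size=(12, 3))    # sensitivities: 12 observations, 3 parameters
weights = np.ones(12)
jac_pred = rng.normal(size=3)         # sensitivity of one prediction

base = prediction_std(jac_obs, jac_pred, weights)
omit = 4                              # hypothetically drop the observation with index 4
reduced = prediction_std(np.delete(jac_obs, omit, axis=0),
                         jac_pred, np.delete(weights, omit))
opr = 100.0 * (reduced - base) / base  # percent increase in prediction std dev
print(f"OPR for omitting observation {omit}: {opr:.1f}%")
```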

  5. Understanding student computational thinking with computational modeling

    Science.gov (United States)

    Aiken, John M.; Caballero, Marcos D.; Douglas, Scott S.; Burk, John B.; Scanlon, Erin M.; Thoms, Brian D.; Schatz, Michael F.

    2013-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". 9th Grade students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than observational) terms tended to have more success in the programming exercise.
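
    For context on what such a programming assignment involves, here is a plain-Python analogue of the baseball-in-motion model (the study itself had students use VPython); the launch conditions, time step and optional drag term are arbitrary illustrative values, not the course's assignment.

```python
# Step a baseball forward in time under gravity (and, optionally, quadratic
# air drag) with a simple Euler update.
import math

g = 9.8            # m/s^2
dt = 0.01          # s
mass = 0.145       # kg (baseball)
drag_coeff = 0.0   # set > 0 to include a quadratic drag force

x, y = 0.0, 1.0
vx = 30.0 * math.cos(math.radians(35))
vy = 30.0 * math.sin(math.radians(35))

while y > 0.0:
    speed = math.hypot(vx, vy)
    fx = -drag_coeff * speed * vx
    fy = -mass * g - drag_coeff * speed * vy
    vx += (fx / mass) * dt    # update velocity from net force
    vy += (fy / mass) * dt
    x += vx * dt              # update position from velocity
    y += vy * dt

print(f"range ~ {x:.1f} m")
```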

  6. Risk-Assessment Computer Program

    Science.gov (United States)

    Dias, William C.; Mittman, David S.

    1993-01-01

    RISK D/C is prototype computer program assisting in attempts to do program risk modeling for Space Exploration Initiative (SEI) architectures proposed in Synthesis Group Report. Risk assessment performed with respect to risk events, probabilities, and severities of potential results. Enables ranking, with respect to effectiveness, of risk-mitigation strategies proposed for exploration program architecture. Allows for fact that risk assessment in early phases of planning is subjective. Although specific to SEI in present form, also used as software framework for development of risk-assessment programs for other specific uses. Developed for Macintosh(TM) series computer. Requires HyperCard(TM) 2.0 or later, as well as 2 Mb of random-access memory and System 6.0.8 or later.

  7. Understanding Student Computational Thinking with Computational Modeling

    CERN Document Server

    Aiken, John M; Douglas, Scott S; Burk, John B; Scanlon, Erin M; Thoms, Brian D; Schatz, Michael F

    2012-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than obs...

  8. Wusor II: A Computer Aided Instruction Program with Student Modelling Capabilities.

    Science.gov (United States)

    1977-06-01

    Model. In some cases, this degrading of the Student Knowledge Model will cause the Student Learning Model to decide that the player does not...

  9. Programming Models in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Shipman, Galen M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-13

    These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.

  10. Program management model study

    Science.gov (United States)

    Connelly, J. J.; Russell, J. E.; Seline, J. R.; Sumner, N. R., Jr.

    1972-01-01

    Two models, a system performance model and a program assessment model, have been developed to assist NASA management in the evaluation of development alternatives for the Earth Observations Program. Two computer models were developed and demonstrated on the Goddard Space Flight Center Computer Facility. Procedures have been outlined to guide the user of the models through specific evaluation processes, and the preparation of inputs describing earth observation needs and earth observation technology. These models are intended to assist NASA in increasing the effectiveness of the overall Earth Observation Program by providing a broader view of system and program development alternatives.

  11. CMS Computing Model Evolution

    CERN Document Server

    Grandi, Claudio; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers, which were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and making more intelligent use of the networking.

  12. Computer Assisted Parallel Program Generation

    CERN Document Server

    Kawata, Shigeo

    2015-01-01

    Parallel computation is widely employed in scientific research, engineering activities and product development. Writing parallel programs is not always a simple task, depending on the problem being solved. Large-scale scientific computing, huge data analyses and precise visualizations, for example, require parallel computation, and parallel computing needs parallelization techniques. In this chapter, support for parallel program generation is discussed, and a computer-assisted parallel program generation system, P-NCAS, is introduced. Computer-assisted problem solving is one of the key methods for promoting innovation in science and engineering, and contributes to enriching our society and our lives toward a programming-free environment in computing science. Problem solving environment (PSE) research activities started in the 1970's with the aim of enhancing programming power. P-NCAS is one of the PSEs; the PSE concept provides an integrated, human-friendly computational software and hardware system to solve a target ...

  13. A Computer Program for Modeling the Conversion of Organic Waste to Energy

    Directory of Open Access Journals (Sweden)

    Pragasen Pillay

    2011-11-01

    Full Text Available This paper presents a tool for the analysis of conversion of organic waste into energy. The tool is a program that uses waste characterization parameters and mass flow rates at each stage of the waste treatment process to predict the given products. The specific waste treatment process analysed in this paper is anaerobic digestion. The different waste treatment stages of the anaerobic digestion process are: conditioning of input waste, secondary treatment, drying of sludge, conditioning of digestate, treatment of digestate, storage of liquid and solid effluent, disposal of liquid and solid effluents, purification, utilization and storage of combustible gas. The program uses mass balance equations to compute the amount of CH4, NH3, CO2 and H2S produced from anaerobic digestion of organic waste, and hence the energy available. Case studies are also presented.
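
    The mass-balance idea can be illustrated with the classical Buswell/Boyle stoichiometry for a substrate of composition CaHbOcNdSe, which predicts the moles of CH4, CO2, NH3 and H2S per mole of substrate. This is a textbook approximation with an invented substrate composition, not necessarily the equations used in the paper.

    ```python
    # Theoretical biogas composition from the Buswell/Boyle stoichiometry for a
    # substrate C_a H_b O_c N_d S_e. Classical textbook mass balance, not
    # necessarily the paper's model; the composition below is illustrative.
    def buswell_boyle(a, b, c, d=0.0, e=0.0):
        ch4 = a / 2 + b / 8 - c / 4 - 3 * d / 8 - e / 4
        co2 = a / 2 - b / 8 + c / 4 + 3 * d / 8 + e / 4
        return {"CH4": ch4, "CO2": co2, "NH3": d, "H2S": e}  # mol per mol substrate

    LHV_CH4 = 802.3e3  # J per mol CH4 (lower heating value), approximate

    moles = buswell_boyle(a=18, b=26, c=10, d=1, e=0.05)  # invented "food waste"
    energy_per_mol_substrate = moles["CH4"] * LHV_CH4
    print(moles)
    print(f"energy available ~ {energy_per_mol_substrate / 1e6:.2f} MJ per mol substrate")
    ```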

  14. Computer programs as accounting object

    Directory of Open Access Journals (Sweden)

    I.V. Perviy

    2015-03-01

    Full Text Available Existing approaches to the regulation of accounting for software as one of the types of intangible assets have been considered. The features and current state of the legal protection of computer programs have been analyzed. The reasons for the need to use patent law as a means of legal protection of individual elements of computer programs have been identified. The influence of the legal aspects of the use of computer programs under national legislation on their reflection in accounting has been analyzed. The possible options for the transfer of rights from the copyright owners of computer programs, which should be considered when creating a software accounting system at an enterprise, have been analyzed. The characteristics of computer software as an intangible asset under the current law have been identified and analyzed. The general economic characteristics of computer programs as one of the types of intangible assets have been substantiated. The main distinguishing features of software compared to other types of intellectual property have been allocated.

  15. Programming Non-Trivial Algorithms in the Measurement Based Quantum Computation Model

    Energy Technology Data Exchange (ETDEWEB)

    Alsing, Paul [United States Air Force Research Laboratory, Wright-Patterson Air Force Base; Fanto, Michael [United States Air Force Research Laboratory, Wright-Patterson Air Force Base; Lott, Capt. Gordon [United States Air Force Research Laboratory, Wright-Patterson Air Force Base; Tison, Christoper C. [United States Air Force Research Laboratory, Wright-Patterson Air Force Base

    2014-01-01

    We provide a set of prescriptions for implementing a quantum circuit model algorithm as a measurement-based quantum computing (MBQC) algorithm [1, 2] via a large cluster state. As a means of illustration we draw upon our numerical modeling experience to describe a large graph state capable of searching a logical 8-element list (a non-trivial version of Grover's algorithm [3] with feedforward). We develop several prescriptions based on analytic evaluation of cluster states and graph state equations, which can be generalized to any circuit model operations. Such a resulting cluster state will be able to carry out the desired operation with appropriate measurements and feed-forward error correction. We also discuss the physical implementation and the analysis of the principal 3-qubit entangling gate (Toffoli) required for a non-trivial feedforward realization of an 8-element Grover search algorithm.

  16. Computer Program Newsletter No. 7

    Energy Technology Data Exchange (ETDEWEB)

    Magnuson, W.G. Jr.

    1982-09-01

    This issue of the Computer Program Newsletter updates an earlier newsletter (Number 2, September 1979) and focuses on electrical network analysis computer programs. In particular, five network analysis programs (SCEPTRE, SPICE2, NET2, CALAHAN, and EMTP) will be described. The objective of this newsletter will be to provide a very brief description of the input syntax and semantics for each program, highlight their strong and weak points, illustrate how the programs are run at Lawrence Livermore National Laboratory using the Octopus computer network, and present examples of input for each of the programs to illustrate some of the features of each program. In a sense, this newsletter can be used as a quick reference guide to the programs.

  17. NASA's computer science research program

    Science.gov (United States)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  18. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  19. Computer-based construction of gene models using the GRAIL Gene Assembly Program

    Energy Technology Data Exchange (ETDEWEB)

    Einstein, J.R.; Mural, R.J.; Guan, X.; Uberbacher, E.C.

    1992-09-01

    The Gene Assembly Program (GAP), a module of GRAIL, assembles and scores gene models, given a DNA sequence and the outputs of other GRAIL modules for the sequence. The latter modules determine the positions of coding regions, the positions and scores of possible splice junctions, the positions of possible translation-initiation sites, the coding strand for the gene, and the probable-translation-frame function over the sequence. GAP tests combinations of those splice junctions which are within acceptable distances from the initial estimated edges of the coding regions. Every complete gene model, comprising translation-initiation site, splice junctions and stop codon, which agrees with GAP's set of rules is scored, and the ten highest-scoring models are saved. Each gene-model score depends on the input scores of splice junctions used in the model, their positions relative to the initial estimated edges of the included coding regions, and the degree of agreement of the entire model with the probable-translation-frame function. If error conditions are detected, the present version of GAP attempts to correct them by the insertion and/or deletion of one or more coding regions. These insertions and deletions have resulted in a net improvement of gene models, and a particularly large improvement in the recognition and characterization of very short coding regions. The results of GRAIL including the GAP module for 26 sequences from GenBank, each with a biochemically characterized single gene, are quite promising and demonstrate the feasibility of constructing largely accurate gene models strictly on the basis of sequence data.
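
    The assemble-and-score strategy described, enumerate splice-junction combinations, score each complete gene model, and keep the ten best, can be sketched generically as below. The candidate junctions and the toy scoring function are invented placeholders; GAP's actual rules and scoring are richer.

    ```python
    # Generic "assemble and keep the ten best gene models" sketch in the spirit of
    # GAP. The candidate junctions and the toy scoring function are placeholders;
    # GAP's real rules (distances to exon edges, frame agreement, etc.) are richer.
    import heapq
    from itertools import product

    # Candidate (position, score) donor/acceptor pairs for two introns (invented).
    donors = [[(120, 0.9), (131, 0.6)], [(305, 0.8), (290, 0.4)]]
    acceptors = [[(210, 0.7), (198, 0.5)], [(400, 0.9), (412, 0.3)]]

    def score_model(junctions):
        # Toy score: sum of junction scores, penalized for implausibly short introns.
        total = sum(s for _, s in junctions)
        for (d_pos, _), (a_pos, _) in zip(junctions[0::2], junctions[1::2]):
            if a_pos - d_pos < 60:
                total -= 1.0
        return total

    best = []  # min-heap holding the ten highest-scoring models seen so far
    for combo in product(donors[0], acceptors[0], donors[1], acceptors[1]):
        entry = (score_model(combo), combo)
        if len(best) < 10:
            heapq.heappush(best, entry)
        else:
            heapq.heappushpop(best, entry)

    for score, combo in sorted(best, reverse=True):
        print(f"score {score:.2f}: junction positions {[p for p, _ in combo]}")
    ```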

  1. YASEIS: Yet Another computer program to calculate synthetic SEISmograms for a spherically multi-layered Earth model

    Science.gov (United States)

    Ma, Yanlu

    2013-04-01

    Although most research nowadays focuses on the lateral heterogeneity of the 3D Earth, a spherically multi-layered model whose parameters depend only on depth still represents a good first-order approximation of the real Earth. Such 1D models can be used as starting models for seismic tomographic inversion or as background models for inverting source mechanisms. The problem of wave propagation in a spherically layered model was solved theoretically long ago (Takeuchi and Saito, 1972). Existing computer programs such as Mineos (developed by G. Master, J. Woodhouse and F. Gilbert), Gemini (Friederich and Dalkolmo 1995), DSM (Kawai et al. 2006) and QSSP (Wang 1999) tackled the computational aspects of the problem. A new, simple and fast program for computing the Green's function of a stack of spherical dissipative layers is presented here. The analytical solutions within each homogeneous spherical layer are joined through the continuity boundary conditions and propagated from the center of the model up to the level of the source depth. Another solution is built by propagating downward from the free surface of the model to the source level. The final solution is then constructed in the frequency domain from the previous two solutions so as to satisfy the discontinuities of displacements and stresses at the source level that are required by the focal mechanism. The numerical instability of the propagator approach is resolved by complementing the matrix propagation with an orthonormalization procedure (Wang 1999). Another difficulty, instability due to the high attenuation in the upper-mantle low-velocity zone, is overcome by switching the bases of the solutions from spherical Bessel functions to spherical Hankel functions when necessary. We compared the synthetic seismograms obtained from the new program YASEIS with those computed by Gemini and QSSP. At near distances, the synthetics from a reflectivity code for horizontal layers are also compared with

  2. Designing computer programs

    CERN Document Server

    Haigh, Jim

    1994-01-01

    This is a book for students at every level who are learning to program for the first time - and for the considerable number who learned how to program but were never taught to structure their programs. The author presents a simple set of guidelines that show the programmer how to design in a manageable structure from the outset. The method is suitable for most languages, and is based on the widely used 'JSP' method, to which the student may easily progress if it is needed at a later stage.Most language specific texts contain very little if any information on design, whilst books on des

  3. Structured Parallel Programming Patterns for Efficient Computation

    CERN Document Server

    McCool, Michael; Robison, Arch

    2012-01-01

    Programming is now parallel programming. Much as structured programming revolutionized traditional serial programming decades ago, a new kind of structured programming, based on patterns, is relevant to parallel programming today. Parallel computing experts and industry insiders Michael McCool, Arch Robison, and James Reinders describe how to design and implement maintainable and efficient parallel algorithms using a pattern-based approach. They present both theory and practice, and give detailed concrete examples using multiple programming models. Examples are primarily given using two of th

  4. What do reversible programs compute?

    DEFF Research Database (Denmark)

    Axelsen, Holger Bock; Glück, Robert

    2011-01-01

    ... be the starting point of a computational theory of reversible computing. We provide a novel semantics-based approach to such a theory, using reversible Turing machines (RTMs) as the underlying computation model. We show that the RTMs can compute exactly all injective, computable functions. We find that the RTMs are not strictly classically universal, but that they support another notion of universality; we call this RTM-universality. Thus, even though the RTMs are sub-universal in the classical sense, they are powerful enough as to include a self-interpreter. Lifting this to other computation models, we propose r...

  5. Solar Pilot Plant, Phase I. Preliminary design report. Volume II, Book 3. Dynamic simulation model and computer program descriptions. CDRL item 2. [SPP dynamics simulation program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-05-01

    The mathematical models and computer program comprising the SPP Dynamic Simulation are described. The SPP Dynamic Simulation is a computerized model representing the time-varying performance characteristics of the SPP. The model incorporates all the principal components of the pilot plant. Time-dependent direct normal solar insolation, as corrupted by simulated cloud passages, is transformed into absorbed radiant power by actions of the heliostat field and enclosed receiver cavity. The absorbed power then drives the steam generator model to produce superheated steam for the turbine and/or thermal storage subsystems. The thermal storage subsystem can, in turn, also produce steam for the turbine. Using the steam flow energy, the turbine produces the mechanical shaft power that the generator converts to electrical power. This electrical power is subsequently transmitted to a transmission grid system. Exhaust steam from the turbine is condensed, reheated, deaerated, and pressurized by pumps for return as feedwater to the thermal storage and/or steam generator. A master control/instrumentation system is utilized to coordinate the various plant operations. The master controller reacts to plant operator demands and control settings to effect the desired output response. The SPP Dynamic Simulation computer program is written in the FORTRAN language. Various input options (e.g., insolation values, load demands, initial pressures/temperatures/flows) are permitted. Plant performance may be monitored via computer printout or computer generated plots. The remainder of this document describes the detailed pilot plant dynamic model, the basis for this simulation, and the utilization of this simulation to obtain analytical plant performance results.

  6. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
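
    The solution-vector idea described, order components by inlet dependency once at initialization and then update them in that order at every time step, can be sketched as below. The component class, its "outlet function", and the small network are illustrative stand-ins, not PCTAP's C++ implementation.

    ```python
    # Generic sketch of the solution-vector idea described above: order components
    # by inlet dependency, then update them in that order at every time step.
    # The components and their "outlet function" here are illustrative stand-ins.
    from graphlib import TopologicalSorter

    class Component:
        def __init__(self, name, warm_by=0.0):
            self.name, self.warm_by, self.outlet_temp = name, warm_by, None
        def update(self, inlet_temp, dt):
            # "Outlet function": trivially offsets the inlet temperature.
            self.outlet_temp = inlet_temp + self.warm_by * dt

    components = {"tank": Component("tank"), "tube": Component("tube"),
                  "cold_plate": Component("cold_plate", warm_by=2.0)}
    upstream = {"tube": "tank", "cold_plate": "tube"}   # inlet dependencies

    # Build the solution vector once, at initialization of the run.
    order = list(TopologicalSorter({k: {v} for k, v in upstream.items()}).static_order())

    temps = {"tank": 20.0}
    for step in range(3):                      # march the transient forward in time
        for name in order:
            comp = components[name]
            inlet = temps[upstream[name]] if name in upstream else temps[name]
            comp.update(inlet, dt=1.0)
            temps[name] = comp.outlet_temp
        print(f"t={step + 1}s outlet temps:", {n: round(t, 1) for n, t in temps.items()})
    ```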

  7. SGOTL: A Computer Program for Modeling High-Resolution, Height-Dependent Gravity Effect of Ocean Tide Loading

    Directory of Open Access Journals (Sweden)

    Cheinway Hwang Jiu-Fu Huang

    2012-01-01

    Full Text Available SGOTL, a computer package coded in FORTRAN, has been developed to model the gravity effect due to ocean tide loading (OTL), especially for a coastal station with large ocean tides. SGOTL uses a regional and a global tide model to account separately for near (inner) and far (outer) zone contributions, and optimizes an inner-zone region and grid interval for numerical convolution. Height-dependent Green's functions for Newtonian and elastic effects are developed. The coastline is defined by the full-resolution GMT shoreline, and optionally a digital elevation model (DEM). A case study using gravity observations at the Hsinchu superconducting gravity station and some offshore islands around the Taiwan Strait suggests that SGOTL outperforms some selected global OTL programs and achieves an accuracy of 0.1 μGal for 8 leading tidal constituents.

  8. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  9. Computer Security Models

    Science.gov (United States)

    1984-09-01

    September 1984, MTR9S31, J. K. Millen and C. M. Cerniglia, Computer Security Models (contract sponsor: OUSDRE/C3I). ...given in Section 5, in the form of questions and answers about security modeling. A glossary of terms used in the context of computer security is... model, so we will not be able to pursue it in this report. MODEL CHARACTERISTICS: Computer security models are engineering models, giving them somewhat

  10. Development of a single cell spherical shell model for an investigation of electrical properties with a computing program

    Directory of Open Access Journals (Sweden)

    Boonlamp, M.

    2005-03-01

    Full Text Available A spherical double shell model (SDM) for a single cell has been developed, using Laplace's equation in spherical coordinates and boundary conditions. Electric field intensities and dielectric constants of each region inside and outside of the cell have been estimated. The dielectrophoretic spectrum of the real part of a complex function (Re[f(ω)]) was computed using Visual FoxPro Version 6, which gave calculated values pertaining to electrical properties of the cell model as compared with experimental values. The process was repeated until the percentage error was in an acceptable range. The calculated parameters were the dielectric constants and the conductivities of the inner cytoplasm (εic, σic), the outer cytoplasm (εoc, σoc), the inner membrane (εim, σim), the outer membrane (εom, σom), and the suspending solution (εs, σs), and the thickness of each layer (dom, doc, dim), respectively. This computer program provides estimated values of cell electrical properties with high accuracy and requires minimal computational time.
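
    For the simplest, homogeneous-sphere case, Re[f(ω)] is the real part of the Clausius-Mossotti factor built from the complex permittivities of the particle and the suspending medium; the double-shell model described above nests this idea shell by shell. The sketch below shows only the single-sphere version, with illustrative property values.

    ```python
    # Real part of the Clausius-Mossotti factor, Re[f(w)], for the simplest case of
    # a homogeneous sphere in a suspending medium. The double-shell model described
    # above nests this idea layer by layer; the property values here are illustrative.
    import numpy as np

    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def re_cm_factor(freq_hz, eps_p, sigma_p, eps_m, sigma_m):
        w = 2 * np.pi * freq_hz
        ep = eps_p * EPS0 - 1j * sigma_p / w      # complex permittivity of the particle
        em = eps_m * EPS0 - 1j * sigma_m / w      # complex permittivity of the medium
        return ((ep - em) / (ep + 2 * em)).real

    freqs = np.logspace(3, 8, 6)                   # 1 kHz .. 100 MHz
    for f in freqs:
        print(f"{f:10.0f} Hz  Re[f] = {re_cm_factor(f, 60, 0.5, 78, 0.01):+.3f}")
    ```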

  11. Line-Editor Computer Program

    Science.gov (United States)

    Scott, Peter J.

    1989-01-01

    ZED, an editing program for the DEC VAX computer, is a simple, powerful line editor for text, program source code, and nonbinary data. It excels in the processing of text by use of procedure files. It also features versatile search qualifiers, global changes, conditionals, online help, hexadecimal mode, space compression, looping, logical combinations of search strings, journaling, visible control characters, and automatic detabbing. Users of the Cambridge implementation have devised such ZED procedures as chess games, calculators, and programs for evaluating pi. Written entirely in C.

  12. The psychology of computer programming

    CERN Document Server

    Weinberg, Gerald Marvin

    1998-01-01

    This landmark 1971 classic is reprinted with a new preface, chapter-by-chapter commentary, and straight-from-the-heart observations on topics that affect the professional life of programmers. Long regarded as one of the first books to pioneer a people-oriented approach to computing, The Psychology of Computer Programming endures as a penetrating analysis of the intelligence, skill, teamwork, and problem-solving power of the computer programmer. Finding the chapters strikingly relevant to today's issues in programming, Gerald M. Weinberg adds new insights and highlights the similarities and differences between now and then. Using a conversational style that invites the reader to join him, Weinberg reunites with some of his most insightful writings on the human side of software engineering. Topics include egoless programming, intelligence, psychological measurement, personality factors, motivation, training, social problems on large projects, problem-solving ability, programming language design, team formati...

  13. Functional Programming in Computer Science

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Loren James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Davis, Marion Kei [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-19

    We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.
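
    The lambda-calculus point, that functions alone can encode data and arithmetic, can be illustrated with Church numerals. The sketch below uses Python as a stand-in for a functional language and is unrelated to the STG compiler work itself.

    ```python
    # Church numerals: representing numbers and arithmetic purely with functions,
    # in the spirit of the lambda-calculus discussion above (Python stand-in for a
    # functional language; not related to the STG compiler work itself).
    zero = lambda f: lambda x: x
    succ = lambda n: lambda f: lambda x: f(n(f)(x))
    add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
    mul = lambda m: lambda n: lambda f: m(n(f))

    def to_int(n):
        return n(lambda k: k + 1)(0)   # apply "+1" n times to 0

    two = succ(succ(zero))
    three = succ(two)
    print(to_int(add(two)(three)))     # 5
    print(to_int(mul(two)(three)))     # 6
    ```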

  14. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex

  15. Sonic boom research. [computer program

    Science.gov (United States)

    Zakkay, V.; Ting, L.

    1976-01-01

    A computer program for the CDC 6600 is developed for nonlinear sonic boom analysis, including the asymmetric effect of lift near the vertical plane of symmetry. The program is written in the FORTRAN 4 language. It carries out the numerical integration of the nonlinear governing equations, starting from input data given at a finite distance from the airplane configuration at flight altitude, to yield the pressure signature at the ground. The required input data and the format of the output are described. A complete program listing and a sample calculation are given.

  16. Manual of phosphoric acid fuel cell power plant cost model and computer program

    Science.gov (United States)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    Cost analysis of a phosphoric acid fuel cell power plant includes two parts: a method for estimating system capital costs, and an economic analysis that determines the levelized annual cost of operating the system used in the capital cost estimation. A FORTRAN computer program has been developed for this cost analysis.
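
    The levelized annual cost part of such an analysis typically annualizes the capital cost with a capital recovery factor and adds yearly operating costs. The sketch below uses that standard textbook formulation with invented numbers; it is not the report's FORTRAN model.

    ```python
    # Standard levelized-annual-cost arithmetic: annualize capital with a capital
    # recovery factor and add yearly O&M and fuel. Textbook formulation with
    # illustrative numbers, not the FORTRAN cost model described in the record.
    def capital_recovery_factor(rate, years):
        return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

    capital_cost = 12_000_000      # $, installed plant cost (illustrative)
    annual_om = 350_000            # $/yr operating and maintenance
    annual_fuel = 900_000          # $/yr fuel
    rate, life = 0.08, 20          # discount rate and plant life in years
    annual_mwh = 40_000            # yearly net generation, MWh

    levelized_annual_cost = capital_cost * capital_recovery_factor(rate, life) \
                            + annual_om + annual_fuel
    print(f"levelized annual cost: ${levelized_annual_cost:,.0f}/yr")
    print(f"levelized cost of energy: ${levelized_annual_cost / annual_mwh:.2f}/MWh")
    ```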

  17. Creating a Pipeline for African American Computing Science Faculty: An Innovative Faculty/Research Mentoring Program Model

    Science.gov (United States)

    Charleston, LaVar J.; Gilbert, Juan E.; Escobar, Barbara; Jackson, Jerlando F. L.

    2014-01-01

    African Americans represent 1.3% of all computing sciences faculty in PhD-granting departments, underscoring the severe underrepresentation of Black/African American tenure-track faculty in computing (CRA, 2012). The Future Faculty/Research Scientist Mentoring (FFRM) program, funded by the National Science Foundation, was found to be an effective…

  18. Computational modelling of polymers

    Science.gov (United States)

    Celarier, Edward A.

    1991-01-01

    Polymeric materials and polymer/graphite composites show a very diverse range of material properties, many of which make them attractive candidates for a variety of high performance engineering applications. Their properties are ultimately determined largely by their chemical structure and the conditions under which they are processed. It is the aim of computational chemistry to be able to simulate candidate polymers on a computer and determine what their likely material properties will be. A number of commercially available software packages purport to predict the material properties of samples, given the chemical structures of their constituent molecules. One such system, Cerius, has been in use at LaRC. It comprises a number of modules, each of which performs a different kind of calculation on a molecule in the program's workspace. Of particular interest is evaluating the suitability of this program to aid in the study of microcrystalline polymeric materials. One of the first model systems examined was benzophenone. The results of this investigation are discussed.

  19. Interpreting beyond Syntactics: A Semiotic Learning Model for Computer Programming Languages

    Science.gov (United States)

    May, Jeffrey; Dhillon, Gurpreet

    2009-01-01

    In the information systems field there are numerous programming languages that can be used in specifying the behavior of concurrent and distributed systems. In the literature it has been argued that a lack of pragmatic and semantic consideration decreases the effectiveness of such specifications. In other words, to simply understand the syntactic…

  20. The Specification and Modeling of Computer Security

    Science.gov (United States)

    1990-01-01

    Computer security models are specifications designed, among other things, to limit the damage caused by Trojan Horse programs such as computer... computer security modeling in general, the Bell and LaPadula model in particular, and the limitations of the model. Many of the issues raised are of

  1. Center for Programming Models for Scalable Parallel Computing - Towards Enhancing OpenMP for Manycore and Heterogeneous Nodes

    Energy Technology Data Exchange (ETDEWEB)

    Barbara Chapman

    2012-02-01

    OpenMP was not well recognized at the beginning of the project, around the year 2003, because of its limited use in DoE production applications and the immature hardware support for an efficient implementation. Yet in recent years it has been gradually adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We have observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as to work with the OpenMP standards organization to make sure OpenMP evolves in a direction close to DoE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed memory systems by exploring different programming models, language extensions, compiler optimizations, and runtime library support.

  2. A Computer Program for Modeling the Conversion of Organic Waste to Energy

    OpenAIRE

    Namuli, Rachel; Laflamme, Claude B.; Pillay, Pragasen

    2011-01-01

    This paper presents a tool for the analysis of conversion of organic waste into energy. The tool is a program that uses waste characterization parameters and mass flow rates at each stage of the waste treatment process to predict the given products. The specific waste treatment process analysed in this paper is anaerobic digestion. The different waste treatment stages of the anaerobic digestion process are: conditioning of input waste, secondary treatment, drying of sludge, conditioning of di...

  3. Elliptical Orbit Performance Computer Program

    Science.gov (United States)

    Myler, T.

    1984-01-01

    The Elliptical Orbit Performance (ELOPE) computer program for analyzing the orbital performance of space boosters uses orbit insertion data obtained from trajectory simulation to generate parametric data on apogee and perigee altitudes as a function of payload data. These data are used to generate presentation plots that display the elliptical-orbit performance capability of a space booster.

  4. K-FIX(GT): A computer program for modeling the expansion phase of steam explosions within complex three dimensional cavities

    Energy Technology Data Exchange (ETDEWEB)

    Hyder, M.L. [Westinghouse Savannah River Co., Aiken, SC (United States); Farawila, Y.M.; Abdel-Khalik, S.I.; Halvorson, P.J. [Georgia Inst. of Tech., Atlanta, GA (US)

    1992-05-01

    In the development of the Severe Accident Analysis Program for the Savannah River production reactors, it was recognized that certain accidents have the potential for causing damaging steam explosions. The massive SRS reactor buildings are likely to withstand any imaginable steam explosion. However, reactor components and building structures including hatches, ventilation ducts, etc., could be at risk if such an explosion occurred. No tools were available to estimate the effects of such explosions on actual structures. To meet this need, the Savannah River Laboratory contracted with the Georgia Institute of Technology Research Institute for development of a computer-based calculational tool for estimating the effects of steam explosions. The goal for this study was to develop a computer code that could be used parametrically to predict the effects of various steam explosions on their surroundings. This would be able to predict whether a steam explosion of a given magnitude would be likely to fail a particular structure. This would require, of course, that the magnitude of the explosion be specified through some combination of judgment and calculation. The requested code, identified as the K-FIX(GT) code, was developed and delivered by the contractor, along with extensive documentation. The several individual reports that constitute the documentation are each being issued as a separate WSRC report. Documentation includes several model calculations, and representation of these in graphic form. This report gives detailed instructions for the use of the code, including identification of all input parameters required.

  5. Computer Program Re-layers Engineering Drawings

    Science.gov (United States)

    Crosby, Dewey C., III

    1990-01-01

    The RULCHK computer program aids in structuring layers of information pertaining to a part or assembly designed with the software described in the article "Software for Drawing Design Details Concurrently" (MFS-28444). It checks, and optionally updates, the structure of layers for a part. It enables the designer to construct a model and annotate its documentation without the burden of manually layering the part to conform to standards at design time.

  6. The Computational Physics Program of the national MFE Computer Center

    Energy Technology Data Exchange (ETDEWEB)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate the fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs.

  7. Statistical modeling of program performance

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2014-01-01

    Full Text Available The task of evaluating program performance often arises in the design of computer systems or during iterative compilation. A traditional way to solve this problem is emulation of program execution on the target system. A modern alternative approach to evaluating program performance is based on statistical modeling of program performance on the computer under investigation. This work introduces such a statistical method of modeling program performance, called Velocitas, and presents the method and its implementation in the Adaptor framework. An investigation of the method's effectiveness showed that its predictions of program performance are highly adequate.

  8. Programs=data=first-class citizens in a computational world

    DEFF Research Database (Denmark)

    Jones, Neil; Simonsen, Jakob Grue

    2012-01-01

    From a programming perspective, Alan Turing's epochal 1936 paper on computable functions introduced several new concepts, including what is today known as self-interpreters and programs as data, and invented a great many now-common programming techniques. We begin by reviewing Turing's contribution...... from a programming perspective; and then systematize and mention some of the many ways that later developments in models of computation (MOCs) have interacted with computability theory and programming language research. Next, we describe the ‘blob’ MOC: a recent stored-program computational model...... without pointers. In the blob model, programs are truly first-class citizens, capable of being automatically compiled, or interpreted, or executed directly. Further, the blob model appears closer to being physically realizable than earlier computation models. In part, this is due to strong finiteness...

  9. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.

  10. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or model’s general inadequacy to its target should be blamed in the case of the model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling and validation, which shares the common epistemology with experimentation, to simulation. To explain reasons of their intermittent entanglement I propose a weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation generally applicable in practice and based on differences in epistemic strategies and scopes

  11. Computer Education in Dental Laboratory Technology Programs.

    Science.gov (United States)

    Rogers, William A.; Hawkins, Robert Ross

    1991-01-01

    A 1990 survey of 37 dental technology programs investigated 3 areas of computer use: current and anticipated general computer education courses; incorporation of computer applications into technology and management courses; and faculty use of the computer. Most programs are beginning to expand use of technology. (MSE)

  12. Computational human body models

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Dommelen, J.A.W. van

    2005-01-01

    Computational human body models are widely used for automotive crashsafety research and design and as such have significantly contributed to a reduction of traffic injuries and fatalities. Currently crash simulations are mainly performed using models based on crash-dummies. However crash dummies

  13. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  14. Computationally modeling interpersonal trust

    Science.gov (United States)

    Lee, Jin Joo; Knox, W. Bradley; Wormwood, Jolie B.; Breazeal, Cynthia; DeSteno, David

    2013-01-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to investigate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust. PMID:24363649

  15. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo eLee

    2013-12-01

    Full Text Available We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.

  16. Development and Study the Usage of Blended Learning Environment Model Using Engineering Design Concept Learning Activities to Computer Programming Courses for Undergraduate Students of Rajabhat Universities

    Directory of Open Access Journals (Sweden)

    Kasame Tritrakan

    2017-06-01

    Full Text Available The objectives of this research were to study and synthesise the components of, to develop, and to study the usage of a blended learning environment model using engineering design concept learning activities in computer programming courses for undergraduate students of Rajabhat universities. The research methodology was divided into 3 phases. Phase I: surveying the present conditions, needs and problems in teaching computer programming among 52 lecturers, together with in-depth interviews with 5 experienced lecturers. The model's elements were evaluated by 5 experts. The tools were a questionnaire, an interview form, and a model-elements assessment form. Phase II: developing the model of the blended learning environment and the learning activities based on engineering design processes, and confirming the model with 8 experts. The tools were the draft of the learning environment, courseware, and assessment forms. Phase III: evaluating the effects of using the implemented environment. The samples were students formed into 2 groups by cluster random sampling: 25 in the experimental group and 27 in the control group. The tools were the learning environment, courseware, and assessment tools. The statistics used in this research were means, standard deviations, dependent t-tests, and one-way MANOVA. The results found that: (1) lecturers quite agreed with the physical, mental, social, and information learning environments, learning processes, and assessments; all needs were at a high level; however, physical-environment problems were at a high level while problems in other aspects were quite low; (2) the developed learning environment had 4 components: (a) the 4 types of environments, (b) the inputs, which included the blended learning environment, learning motivation factors, and computer programming content, (c) the processes, which were analysis of state objectives, design of the learning environment and activities, development of the learning environment and testing materials, implementation, and evaluation, and (d) the outputs

  17. Note on: 'EMLCLLER-A program for computing the EM response of a large loop source over a layered earth model' by N.P. Singh and T. Mogi, Computers & Geosciences 29 (2003) 1301-1307

    Science.gov (United States)

    Jamie, Majid

    2016-11-01

    Singh and Mogi (2003) presented a forward modeling (FWD) program coded in FORTRAN 77, called "EMLCLLER", which is capable of computing the frequency-domain electromagnetic (EM) response of a large circular loop, in terms of the vertical magnetic component (Hz), over 1D layered earth models; computations in this program can be performed for variable transmitter-receiver configurations and can incorporate both conduction and displacement currents. Integral equations in this program are evaluated through digital linear filters based on the Hankel transforms, together with analytic solutions based on hyper-geometric functions. Despite the capabilities of EMLCLLER, there are some mistakes in this program that make its FWD results unreliable. The mistakes in EMLCLLER arise from using a wrong algorithm for computing the reflection coefficient of the EM wave in TE-mode (rTE), and from using flawed algorithms for computing the phase and normalized phase values relating to Hz; in this paper the corrected forms of these mistakes are presented. Moreover, in order to illustrate how these mistakes affect FWD results, EMLCLLER and the corrected version of the program presented in this paper, titled "EMLCLLER_Corr", are run on different two- and three-layered earth models; afterwards their FWD results, in terms of the real and imaginary parts of Hz, its normalized amplitude, and the corresponding normalized phase curves, are plotted versus frequency and compared with each other. In addition, Singh and Mogi (2003) also presented extra derivations for computing the radial component of the magnetic field (Hr) and the angular component of the electric field (Eϕ), where the numerical solution presented for Hr is incorrect; in this paper the correct numerical solution for this derivation is also presented.
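
    For reference, one widely used textbook form of rTE for a stack of horizontal layers is the recursive surface-admittance formulation sketched below. The layer properties are invented and the generic recursion is shown only for orientation; it is not the corrected EMLCLLER_Corr routine.

    ```python
    # Generic recursive-admittance form of the TE-mode reflection coefficient rTE
    # for N horizontal layers over a half-space, including displacement currents.
    # Textbook-style sketch with illustrative layer values; this is NOT the
    # corrected EMLCLLER_Corr routine, only the standard recursion for comparison.
    import numpy as np

    MU0, EPS0 = 4e-7 * np.pi, 8.854e-12

    def rte(lam, freq, sigmas, eps_rel, thicknesses):
        """lam: horizontal wavenumber; layer 0 is air; the last layer is the half-space."""
        w = 2 * np.pi * freq
        k2 = [w**2 * MU0 * er * EPS0 - 1j * w * MU0 * s for s, er in zip(sigmas, eps_rel)]
        u = [np.sqrt(lam**2 - k) for k in k2]          # vertical wavenumbers
        Y = [un / (1j * w * MU0) for un in u]          # intrinsic layer admittances

        Yhat = Y[-1]                                   # start from the basal half-space
        for n in range(len(Y) - 2, 0, -1):             # recurse up through the layers
            t = np.tanh(u[n] * thicknesses[n])
            Yhat = Y[n] * (Yhat + Y[n] * t) / (Y[n] + Yhat * t)
        return (Y[0] - Yhat) / (Y[0] + Yhat)

    # Air over a 100 m conductive layer over a resistive half-space (illustrative).
    print(rte(lam=1e-3, freq=100.0,
              sigmas=[1e-9, 0.1, 0.01], eps_rel=[1.0, 10.0, 10.0],
              thicknesses=[0.0, 100.0, 0.0]))
    ```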

  18. Assessing the relationship between computational speed and precision: a case study comparing an interpreted versus compiled programming language using a stochastic simulation model in diabetes care.

    Science.gov (United States)

    McEwan, Phil; Bergenheim, Klas; Yuan, Yong; Tetlow, Anthony P; Gordon, Jason P

    2010-01-01

    Simulation techniques are well suited to modelling diseases yet can be computationally intensive. This study explores the relationship between modelled effect size, statistical precision, and efficiency gains achieved using variance reduction and an executable programming language. A published simulation model designed to model a population with type 2 diabetes mellitus based on the UKPDS 68 outcomes equations was coded in both Visual Basic for Applications (VBA) and C++. Efficiency gains due to the programming language were evaluated, as was the impact of antithetic variates to reduce variance, using predicted QALYs over a 40-year time horizon. The use of C++ provided a 75- and 90-fold reduction in simulation run time when using mean and sampled input values, respectively. For a series of 50 one-way sensitivity analyses, this would yield a total run time of 2 minutes when using C++, compared with 155 minutes for VBA when using mean input values. The use of antithetic variates typically resulted in a 53% reduction in the number of simulation replications and run time required. When drawing all input values to the model from distributions, the use of C++ and variance reduction resulted in a 246-fold improvement in computation time compared with VBA - for which the evaluation of 50 scenarios would correspondingly require 3.8 hours (C++) and approximately 14.5 days (VBA). The choice of programming language used in an economic model, as well as the methods for improving precision of model output can have profound effects on computation time. When constructing complex models, more computationally efficient approaches such as C++ and variance reduction should be considered; concerns regarding model transparency using compiled languages are best addressed via thorough documentation and model validation.
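
    The variance-reduction idea used in the study, antithetic variates, can be illustrated with a generic Monte Carlo toy problem. The sketch below is not the UKPDS diabetes model; the "QALY" function and all numbers are illustrative stand-ins, chosen only to be monotone in the driving uniform draw so that pairing (u, 1-u) induces negative correlation.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_qalys(u):
    """Toy stand-in for one model replication driven by a uniform draw u (monotone in u)."""
    return 10.0 + 5.0 * u**3  # illustrative only; the real model is far more complex

n = 10_000
u = rng.uniform(size=n)

plain = simulate_qalys(u)                                   # standard Monte Carlo, n runs
anti = 0.5 * (simulate_qalys(u) + simulate_qalys(1.0 - u))  # antithetic pairs (u, 1-u), 2n runs

print("plain MC:      mean %.4f  s.e. %.5f" % (plain.mean(), plain.std(ddof=1) / np.sqrt(n)))
print("antithetic MC: mean %.4f  s.e. %.5f" % (anti.mean(), anti.std(ddof=1) / np.sqrt(n)))
```

    Each antithetic estimate uses two model runs, so the fair comparison is per pair; the point of the technique, as in the paper, is that fewer replications are needed for a given precision.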

  19. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  20. Use of Approximate Bayesian Computation to Assess and Fit Models of Mycobacterium leprae to Predict Outcomes of the Brazilian Control Program.

    Directory of Open Access Journals (Sweden)

    Rebecca Lee Smith

    Full Text Available Hansen's disease (leprosy) elimination has proven difficult in several countries, including Brazil, and there is a need for a mathematical model that can predict control program efficacy. This study applied the Approximate Bayesian Computation algorithm to fit 6 different proposed models to each of the 5 regions of Brazil, then fitted hierarchical models based on the best-fit regional models to the entire country. The best-fitting model for most regions was a simple model. Posterior checks found that the model results were more similar to the observed incidence after fitting than before, and that parameters varied slightly by region. Current control programs were predicted to require additional measures to eliminate Hansen's disease as a public health problem in Brazil.
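
    The core of the Approximate Bayesian Computation (ABC) approach used here is a rejection sampler: draw parameters from the prior, simulate the model, and keep draws whose simulated summaries fall close to the observed data. Below is a minimal generic sketch; the forward model, prior, summary statistic, and tolerance are illustrative placeholders, not the leprosy transmission models fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

observed_incidence = 1.7  # illustrative observed summary statistic

def simulate_incidence(beta):
    """Toy forward model: incidence summary given a transmission-rate parameter beta."""
    return beta * 10.0 + rng.normal(scale=0.1)

def abc_rejection(n_draws=100_000, tolerance=0.05):
    """Accept prior draws whose simulated summary lies within `tolerance` of the data."""
    accepted = []
    for _ in range(n_draws):
        beta = rng.uniform(0.0, 1.0)            # prior on the parameter
        if abs(simulate_incidence(beta) - observed_incidence) < tolerance:
            accepted.append(beta)
    return np.array(accepted)

posterior = abc_rejection()
print("accepted %d draws, posterior mean beta = %.3f" % (len(posterior), posterior.mean()))
```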

  1. Use of the computer program in a cloud computing

    Directory of Open Access Journals (Sweden)

    Radovanović Sanja

    2013-01-01

    Full Text Available Cloud computing represents a specific form of networking in which a computer program simulates the operation of one or more server computers. In terms of copyright, all technological processes that take place within cloud computing are covered by the notion of copying computer programs and by the exclusive right of reproduction. However, this right suffers some limitations in order to allow normal use of the computer program by users. Because cloud computing is a virtualized network, the issue of normal use of the computer program requires putting all aspects of the permitted copying into the context of a specific computing environment and the specific processes within the cloud. In this sense, the paper points out that the user of a computer program in cloud computing needs to obtain the consent of the right holder for any act undertaken using the program. In other words, copyright in cloud computing applies in full, and thus so does the freedom of contract (in the case of this particular restriction) as well.

  2. The ATLAS Computing Model

    CERN Document Server

    Adams, D; Bee, C P; Hawkings, R; Jarp, S; Jones, R; Malon, D; Poggioli, L; Poulard, G; Quarrie, D; Wenaus, T

    2005-01-01

    The ATLAS Offline Computing Model is described. The main emphasis is on the steady state, when normal running is established. The data flow from the output of the ATLAS trigger system through processing and analysis stages is analysed, in order to estimate the computing resources, in terms of CPU power, disk and tape storage and network bandwidth, which will be necessary to guarantee speedy access to ATLAS data to all members of the Collaboration. Data Challenges and the commissioning runs are used to prototype the Computing Model and test the infrastructure before the start of LHC operation. The initial planning for the early stages of data-taking is also presented. In this phase, a greater degree of access to the unprocessed or partially processed raw data is envisaged.

  3. Chaos Modelling with Computers

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 1, Issue 5: Chaos Modelling with Computers, Unpredictable Behaviour of ... Author affiliations: Balakrishnan Ramasamy (1), T S K V Iyer (2); Siemens Communication Software, 10th floor, Raheja Towers 26-27, M G Road, Bangalore 560 001, India.

  4. The computational physics program of the National MFE Computer Center

    Energy Technology Data Exchange (ETDEWEB)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generation of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. Another major area is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers.

  5. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
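
    The debugging idea described, grouping threads by the addresses of their calling instructions so that outlier groups point at defective threads, can be sketched in a few lines. The snapshot data and grouping function below are illustrative only and are not the patented implementation.

```python
from collections import defaultdict

# Hypothetical snapshot: thread id -> addresses of its calling instructions (outermost first)
stacks = {
    0: [0x4007A0, 0x400B10, 0x400C54],
    1: [0x4007A0, 0x400B10, 0x400C54],
    2: [0x4007A0, 0x400B10, 0x400C54],
    3: [0x4007A0, 0x400B10, 0x400D98],  # one thread stuck somewhere else
}

def group_threads_by_stack(stacks):
    """Assign threads to groups keyed on their tuples of call addresses."""
    groups = defaultdict(list)
    for tid, addrs in stacks.items():
        groups[tuple(addrs)].append(tid)
    return groups

for addrs, tids in group_threads_by_stack(stacks).items():
    print("threads %s at call chain %s" % (tids, [hex(a) for a in addrs]))
# A group containing very few threads (here thread 3) is a candidate defective thread.
```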

  6. Computer Assistance for Writing Interactive Programs: TICS

    Science.gov (United States)

    Kaplow, Ray; And Others

    1973-01-01

    A description of an on-line and interactive programming system (TICS - Teacher-Interactive-Computer-System), which is aimed at facilitating the authoring of interactive, instructional computer programs by persons who are experts on the subject matter being addressed, but not necessarily programmers. (Author)

  7. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  8. High performance computing and communications program

    Science.gov (United States)

    Holcomb, Lee

    1992-01-01

    A review of the High Performance Computing and Communications (HPCC) program is provided in vugraph format. The goals and objectives of this federal program are as follows: extend U.S. leadership in high performance computing and computer communications; disseminate the technologies to speed innovation and to serve national goals; and spur gains in industrial competitiveness by making high performance computing integral to design and production.

  9. Applications of computer modeling to fusion research

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: Development and application of gyrokinetic particle codes to tokamak transport, development of techniques to take advantage of parallel computers; model dynamo and bootstrap current drive; and in general maintain our broad-based program in basic plasma physics and computer modeling.

  10. Computationally modeling interpersonal trust

    OpenAIRE

    Lee, Jin Joo; Knox, W. Bradley; Wormwood, Jolie B.; Breazeal, Cynthia; DeSteno, David

    2013-01-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our pr...

  11. Integer programming theory, applications, and computations

    CERN Document Server

    Taha, Hamdy A

    1975-01-01

    Integer Programming: Theory, Applications, and Computations provides information pertinent to the theory, applications, and computations of integer programming. This book presents the computational advantages of the various techniques of integer programming.Organized into eight chapters, this book begins with an overview of the general categorization of integer applications and explains the three fundamental techniques of integer programming. This text then explores the concept of implicit enumeration, which is general in a sense that it is applicable to any well-defined binary program. Other

  12. NASA High Performance Computing and Communications program

    Science.gov (United States)

    Holcomb, Lee; Smith, Paul; Hunter, Paul

    1994-01-01

    The National Aeronautics and Space Administration's HPCC program is part of a new Presidential initiative aimed at producing a 1000-fold increase in supercomputing speed and a 100-fold improvement in available communications capability by 1997. As more advanced technologies are developed under the HPCC program, they will be used to solve NASA's 'Grand Challenge' problems, which include improving the design and simulation of advanced aerospace vehicles, allowing people at remote locations to communicate more effectively and share information, increasing scientists' abilities to model the Earth's climate and forecast global environmental trends, and improving the development of advanced spacecraft. NASA's HPCC program is organized into three projects which are unique to the agency's mission: the Computational Aerosciences (CAS) project, the Earth and Space Sciences (ESS) project, and the Remote Exploration and Experimentation (REE) project. An additional project, the Basic Research and Human Resources (BRHR) project, exists to promote long term research in computer science and engineering and to increase the pool of trained personnel in a variety of scientific disciplines. This document presents an overview of the objectives and organization of these projects, as well as summaries of early accomplishments and the significance, status, and plans for individual research and development programs within each project. Areas of emphasis include benchmarking, testbeds, software and simulation methods.

  13. Mastering cloud computing foundations and applications programming

    CERN Document Server

    Buyya, Rajkumar; Selvi, SThamarai

    2013-01-01

    Mastering Cloud Computing is designed for undergraduate students learning to develop cloud computing applications. Tomorrow's applications won't live on a single computer but will be deployed from and reside on a virtual server, accessible anywhere, any time. Tomorrow's application developers need to understand the requirements of building apps for these virtual systems, including concurrent programming, high-performance computing, and data-intensive systems. The book introduces the principles of distributed and parallel computing underlying cloud architectures and specifical

  14. Implementation of a damage model in a finite element program for computation of structures under dynamic loading

    Directory of Open Access Journals (Sweden)

    Nasserdine Oudni

    2016-01-01

    Full Text Available This work is a numerical simulation of nonlinear problems of the damage process and fracture of quasi-brittle materials, especially concrete. In this study, we model the macroscopic behavior of concrete, taking into account the phenomenon of damage. The J. Mazars model, whose principle is based on damage mechanics, has been implemented in a finite element program written in Fortran 90; it takes into account the dissymmetry of concrete behavior in tension and in compression, capturing tensile cracking and rupture in compression. It is a model commonly used for static and pseudo-static systems, but in this work it was used in the dynamic case.
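
    The flavour of a Mazars-type damage law can be shown for the simple uniaxial-tension case: stress is the undamaged elastic stress scaled by (1 - D), with D growing exponentially once an equivalent-strain threshold is passed. The sketch below is a generic illustration with made-up parameter values, not the authors' Fortran 90 finite element implementation, and it omits the compression branch and the tension/compression weighting of the full model.

```python
import numpy as np

# Illustrative Mazars-type parameters for the tension branch
E = 30.0e9        # Young's modulus [Pa]
eps_d0 = 1.0e-4   # damage threshold strain
A_t, B_t = 0.8, 2.0e4

def damage_tension(eps_eq):
    """Tension damage variable D_t as a function of the equivalent strain."""
    if eps_eq <= eps_d0:
        return 0.0
    d = 1.0 - eps_d0 * (1.0 - A_t) / eps_eq - A_t * np.exp(-B_t * (eps_eq - eps_d0))
    return min(max(d, 0.0), 1.0)

def stress(eps):
    """Uniaxial tensile stress with isotropic damage: sigma = (1 - D) * E * eps."""
    return (1.0 - damage_tension(eps)) * E * eps

for eps in (0.5e-4, 1.0e-4, 2.0e-4, 5.0e-4, 1.0e-3):
    print("eps = %.1e  ->  D = %.3f, sigma = %.2f MPa"
          % (eps, damage_tension(eps), stress(eps) / 1e6))
```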

  15. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  16. Temperature based daily incoming solar radiation modeling based on gene expression programming, neuro-fuzzy and neural network computing techniques.

    Science.gov (United States)

    Landeras, G.; López, J. J.; Kisi, O.; Shiri, J.

    2012-04-01

    The correct observation/estimation of surface incoming solar radiation (RS) is very important for many agricultural, meteorological and hydrological applications. While most weather stations are provided with sensors for air temperature detection, sensors for the detection of solar radiation are less common and the data quality they provide is sometimes poor. In these cases it is necessary to estimate this variable. Temperature-based modeling procedures are reported in this study for estimating daily incoming solar radiation by using Gene Expression Programming (GEP) for the first time, and other artificial intelligence models such as Artificial Neural Networks (ANNs) and the Adaptive Neuro-Fuzzy Inference System (ANFIS). Traditional temperature-based solar radiation equations were also included in this study and compared with the artificial intelligence based approaches. Root mean square error (RMSE), mean absolute error (MAE), RMSE-based skill score (SSRMSE), MAE-based skill score (SSMAE) and the R2 criterion of Nash and Sutcliffe were used to assess the models' performances. An ANN (a four-input multilayer perceptron with ten neurons in the hidden layer) presented the best performance among the studied models (2.93 MJ m-2 d-1 of RMSE). A four-input ANFIS model emerged as an interesting alternative to ANNs (3.14 MJ m-2 d-1 of RMSE). A very limited number of studies have been done on estimation of solar radiation based on ANFIS, and the present one demonstrates the ability of ANFIS to model solar radiation based on temperatures and extraterrestrial radiation. This study also demonstrated, for the first time, the ability of GEP models to model solar radiation based on daily atmospheric variables. Although the accuracy of the GEP models was slightly lower than that of the ANFIS and ANN models, the genetic programming models (i.e., GEP) are superior to other artificial intelligence models in giving a simple explicit equation for the
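
    A typical example of the "traditional temperature-based solar radiation equations" referred to above is the Hargreaves-Samani form, which scales extraterrestrial radiation by the square root of the daily temperature range. The sketch below uses that well-known relation with illustrative inputs; it is not one of the GEP, ANN or ANFIS models developed in the study.

```python
import math

def hargreaves_samani_rs(t_max, t_min, ra, k_rs=0.17):
    """Estimate daily incoming solar radiation [MJ m-2 d-1] from air temperature.

    t_max, t_min : daily maximum/minimum air temperature [deg C]
    ra           : extraterrestrial radiation for the day [MJ m-2 d-1]
    k_rs         : empirical coefficient (commonly around 0.16 inland, 0.19 coastal)
    """
    return k_rs * math.sqrt(max(t_max - t_min, 0.0)) * ra

# Illustrative values only
print(hargreaves_samani_rs(t_max=28.0, t_min=14.0, ra=38.0))
```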

  17. Computer programs for developing source terms for a UF{sub 6} dispersion model to simulate postulated UF{sub 6} releases from buildings

    Energy Technology Data Exchange (ETDEWEB)

    Williams, W.R.

    1985-03-01

    Calculational methods and computer programs for the analysis of source terms for postulated releases of UF{sub 6} are presented. Required thermophysical properties of UF{sub 6}, HF, and H{sub 2}O are described in detail. UF{sub 6} reacts with moisture in the ambient environment to form HF and H{sub 2}O. The coexistence of HF and H{sub 2}O significantly alters their pure component properties, and HF vapor polymerizes. Transient compartment models for simulating UF{sub 6} releases inside gaseous diffusion plant feed and withdrawal buildings and cascade buildings are also described. The basic compartment model mass and energy balances are supported by simple heat transfer, ventilation system, and deposition models. A model that can simulate either a closed compartment or a steady-state ventilation system is also discussed. The output of the transient compartment models provides input to an atmospheric dispersion model.

  18. On Verified Numerical Computations in Convex Programming

    OpenAIRE

    Jansson, Christian

    2009-01-01

    This survey contains recent developments for computing verified results of convex constrained optimization problems, with emphasis on applications. Especially, we consider the computation of verified error bounds for non-smooth convex conic optimization in the framework of functional analysis, for linear programming, and for semidefinite programming. A discussion of important problem transformations to special types of convex problems and convex relaxations is included...

  19. Deterministic and Stochastic Study for an Infected Computer Network Model Powered by a System of Antivirus Programs

    Directory of Open Access Journals (Sweden)

    Youness El Ansari

    2017-01-01

    Full Text Available We investigate the various conditions that control the extinction and stability of a nonlinear mathematical spread model with stochastic perturbations. This model describes the spread of viruses in an infected computer network which is protected by a system of antivirus software. The system is analyzed using the stability theory of stochastic differential equations and computer simulations. First, we study the global stability of the virus-free equilibrium state and the virus-epidemic equilibrium state. Furthermore, we use the Itô formula and other theorems from the theory of stochastic differential equations to discuss the extinction and the stationary distribution of our system. The analysis gives a sufficient condition for the infection to become extinct (i.e., the number of viruses tends exponentially to zero). The ergodicity of the solution and the stationary distribution can be obtained if the basic reproduction number Rp is greater than 1 and the intensities of the stochastic fluctuations are small enough. Numerical simulations are carried out to illustrate the theoretical results.
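
    Numerical simulations of this kind of stochastically perturbed spread model are commonly done with the Euler-Maruyama scheme. The sketch below integrates a generic SIS-type infected-fraction equation with multiplicative noise; the drift terms, parameters and noise intensity are illustrative stand-ins, not the exact system analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters: infection rate, cure rate (antivirus), noise intensity
beta, gamma, sigma = 0.5, 0.2, 0.1
N = 1.0                    # normalised network size
dt, steps = 0.01, 5000

def euler_maruyama(i0=0.05):
    """dI = (beta*I*(N - I) - gamma*I) dt + sigma*I dW, integrated with Euler-Maruyama."""
    i = np.empty(steps + 1)
    i[0] = i0
    for k in range(steps):
        dw = rng.normal(scale=np.sqrt(dt))            # Brownian increment
        drift = beta * i[k] * (N - i[k]) - gamma * i[k]
        i[k + 1] = max(i[k] + drift * dt + sigma * i[k] * dw, 0.0)
    return i

path = euler_maruyama()
print("final infected fraction: %.3f" % path[-1])
```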

  20. Programs=data=first-class citizens in a computational world.

    Science.gov (United States)

    Jones, Neil D; Simonsen, Jakob Grue

    2012-07-28

    From a programming perspective, Alan Turing's epochal 1936 paper on computable functions introduced several new concepts, including what is today known as self-interpreters and programs as data, and invented a great many now-common programming techniques. We begin by reviewing Turing's contribution from a programming perspective; and then systematize and mention some of the many ways that later developments in models of computation (MOCs) have interacted with computability theory and programming language research. Next, we describe the 'blob' MOC: a recent stored-program computational model without pointers. In the blob model, programs are truly first-class citizens, capable of being automatically compiled, or interpreted, or executed directly. Further, the blob model appears closer to being physically realizable than earlier computation models. In part, this is due to strong finiteness owing to early binding in the program; and a strong adjacency property: the active instruction is always adjacent to the piece of data on which it operates. The model is Turing complete in a strong sense: a universal interpretation algorithm exists that is able to run any program in a natural way and without arcane data encodings. Next, some of the best known among the numerous existing MOCs are described, and we develop a list of traits an 'ideal' MOC should possess from our perspective. We make no attempt to consider all models put forth since Turing's 1936 paper, and the selection of models covered concerns only models with discrete, atomic computation steps. The next step is to classify the selected models by qualitative rather than quantitative features. Finally, we describe how the blob model differs from an 'ideal' MOC, and identify some natural next steps to achieve such a model.

  1. A research program in empirical computer science

    Science.gov (United States)

    Knight, J. C.

    1991-01-01

    During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.

  2. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  3. Finite difference computing with exponential decay models

    CERN Document Server

    Langtangen, Hans Petter

    2016-01-01

    This text provides a very simple, initial introduction to the complete scientific computing pipeline: models, discretization, algorithms, programming, verification, and visualization. The pedagogical strategy is to use one case study – an ordinary differential equation describing exponential decay processes – to illustrate fundamental concepts in mathematics and computer science. The book is easy to read and only requires a command of one-variable calculus and some very basic knowledge about computer programming. Contrary to similar texts on numerical methods and programming, this text has a much stronger focus on implementation and teaches testing and software engineering in particular. .
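
    The case study the book is built around, finite differences for the exponential decay equation u' = -a*u, fits in a few lines. Below is a minimal theta-rule sketch in the spirit of that pipeline (theta = 0 gives Forward Euler, 1 gives Backward Euler, 0.5 gives Crank-Nicolson); it is an independent illustration, not code taken from the book.

```python
import numpy as np

def solve_decay(I, a, T, dt, theta=0.5):
    """Solve u' = -a*u, u(0) = I, on [0, T] with the theta-rule finite difference scheme."""
    n = int(round(T / dt))
    u = np.empty(n + 1)
    u[0] = I
    # (u[k+1] - u[k])/dt = -a*(theta*u[k+1] + (1 - theta)*u[k])  =>  constant update factor
    factor = (1.0 - (1.0 - theta) * a * dt) / (1.0 + theta * a * dt)
    for k in range(n):
        u[k + 1] = factor * u[k]
    return np.linspace(0.0, n * dt, n + 1), u

t, u = solve_decay(I=1.0, a=2.0, T=4.0, dt=0.1, theta=0.5)  # Crank-Nicolson
print("numerical u(T) = %.5f, exact = %.5f" % (u[-1], np.exp(-2.0 * 4.0)))
```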

  4. EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.

    Science.gov (United States)

    Jarvis, John J.; And Others

    Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…

  5. Parallel computing in enterprise modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.

    2008-08-01

    This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'Entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principal makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.

  6. EQ6, a computer program for reaction path modeling of aqueous geochemical systems: Theoretical manual, user's guide, and related documentation (Version 7.0); Part 4

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.; Daveler, S.A.

    1992-10-09

    EQ6 is a FORTRAN computer program in the EQ3/6 software package (Wolery, 1979). It calculates reaction paths (chemical evolution) in reacting water-rock and water-rock-waste systems. Speciation in aqueous solution is an integral part of these calculations. EQ6 computes models of titration processes (including fluid mixing), irreversible reaction in closed systems, irreversible reaction in some simple kinds of open systems, and heating or cooling processes, and it also solves 'single-point' thermodynamic equilibrium problems. A reaction path calculation normally involves a sequence of thermodynamic equilibrium calculations. Chemical evolution is driven by a set of irreversible reactions (i.e., reactions out of equilibrium) and/or changes in temperature and/or pressure. These irreversible reactions usually represent the dissolution or precipitation of minerals or other solids. The code computes the appearance and disappearance of phases in solubility equilibrium with the water. It finds the identities of these phases automatically. The user may specify which potential phases are allowed to form and which are not. There is an option to fix the fugacities of specified gas species, simulating contact with a large external reservoir. Rate laws for irreversible reactions may be either relative rates or actual rates. If any actual rates are used, the calculation has a time frame. Several forms for actual rate laws are programmed into the code. EQ6 is presently able to model both mineral dissolution and growth kinetics.

  7. Modeling EERE deployment programs

    Energy Technology Data Exchange (ETDEWEB)

    Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2007-11-01

    The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.

  8. Computer modeling and urban recycling

    Energy Technology Data Exchange (ETDEWEB)

    Biddle, D.C.; Storey, M

    1989-09-01

    A computer model developed by the Philadelphia Recycling Office (PRO) to determine the operational constraints of various policy choices in planning municipal recycling collection services is described. Such a computer model can provide quick and organized summaries of policy options without overwhelming decision makers with detailed and time-consuming calculations. Named OMAR (Operations Model for the Analysis of Recycling), this program is a Lotus 1-2-3 spreadsheet. Data collected from the city's pilot project are central to some of the indices used by the model. Pilot project data and indices are imported from other files in a somewhat lengthy procedure. There are two components to the structure of the analytical section of OMAR. The first, the material component, is based on the algorithm which estimates the amount of material that is available for collection on a given day. The second, the capacity component, is derived from the algorithm which estimates the amount of material that can be collected by a single crew in a day. Equations for calculating such components are presented. The feasibility of using OMAR as a reporting tool for planners is also discussed.

  9. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    Full Text Available I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  10. Computer Programs in Marine Science

    Science.gov (United States)

    1976-04-01

    Catalog excerpt (partially recoverable from OCR): entries describe FORTRAN programs for CDC 3600/3800 computers, including a routine for computing the distance between two locations (requires subroutines COS, SIN, ARCOS; author Ralph Johnson, Oceanographic Services Branch), a stereographic projection program, pie scattering computations, and programs that plot track and profile data.

  11. FORTRAN computer program for seismic risk analysis

    Science.gov (United States)

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.

  12. Computer programming and architecture the VAX

    CERN Document Server

    Levy, Henry

    2014-01-01

    Takes a unique systems approach to programming and architecture of the VAXUsing the VAX as a detailed example, the first half of this book offers a complete course in assembly language programming. The second describes higher-level systems issues in computer architecture. Highlights include the VAX assembler and debugger, other modern architectures such as RISCs, multiprocessing and parallel computing, microprogramming, caches and translation buffers, and an appendix on the Berkeley UNIX assembler.

  13. The Dynamic Geometrisation of Computer Programming

    Science.gov (United States)

    Sinclair, Nathalie; Patterson, Margaret

    2018-01-01

    The goal of this paper is to explore dynamic geometry environments (DGE) as a type of computer programming language. Using projects created by secondary students in one particular DGE, we analyse the extent to which the various aspects of computational thinking--including both ways of doing things and particular concepts--were evident in their…

  14. 43 Computer Assisted Programmed Instruction and Cognitive ...

    African Journals Online (AJOL)


    Computer Assisted Programmed Instruction and Cognitive Preference Style as Determinant of Achievement of Secondary School Physics Students. Sotayo, M. A. O., Federal College of Education, Osiele, Abeokuta, Nigeria. Abstract: The study probes into the effect of Computer Assisted Instruction and Cognitive preference.

  15. NASA High-End Computing Program Website

    Science.gov (United States)

    Cohen, Jarrett S.

    2008-01-01

    If you are a NASA-sponsored scientist or engineer, computing time is available to you at the High-End Computing (HEC) Program's NASA Advanced Supercomputing (NAS) Facility and NASA Center for Computational Sciences (NCCS). The Science Mission Directorate will select from requests submitted to the e-Books online system for awards beginning on May 1. Current projects set to expire on April 30 must have a request in e-Books to be considered for renewal.

  16. Computing the Line Index of Balance Using Integer Programming Optimisation

    OpenAIRE

    Aref, Samin; Andrew J. Mason; Wilson, Mark C.

    2017-01-01

    An important measure of a signed graph is the line index of balance which has several applications in many fields. However, this graph-theoretic measure was underused for decades because of the inherent complexity in its computation which is closely related to solving NP-hard graph optimisation problems like MAXCUT. We develop new quadratic and linear programming models to compute the line index of balance exactly. Using the Gurobi integer programming optimisation solver, we evaluate the line...
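
    The line index of balance (frustration index) of a signed graph is the minimum, over all two-colourings of the vertices, of the number of frustrated edges: positive edges joining the two colour classes plus negative edges inside a class. The brute-force sketch below computes that minimum for tiny graphs; the paper's contribution is solving the same quantity exactly at scale with quadratic and linear integer programming models (e.g. via Gurobi), which this sketch does not attempt.

```python
from itertools import product

def frustration_index(nodes, signed_edges):
    """Minimum number of frustrated edges over all +/-1 labellings of the nodes.

    signed_edges: list of (u, v, sign) with sign = +1 or -1.
    Feasible only for small graphs (2^n labellings); ILP is needed for large instances.
    """
    best = len(signed_edges)
    for labels in product((1, -1), repeat=len(nodes)):
        s = dict(zip(nodes, labels))
        # an edge is frustrated when s[u]*s[v] disagrees with its sign
        frustrated = sum(1 for u, v, sign in signed_edges if s[u] * s[v] * sign < 0)
        best = min(best, frustrated)
    return best

# Small example: a triangle with one negative edge is unbalanced, so its line index is 1
nodes = ["a", "b", "c"]
edges = [("a", "b", +1), ("b", "c", +1), ("a", "c", -1)]
print(frustration_index(nodes, edges))  # -> 1
```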

  17. Computer-Aided Corrosion Program Management

    Science.gov (United States)

    MacDowell, Louis

    2010-01-01

    This viewgraph presentation reviews Computer-Aided Corrosion Program Management at John F. Kennedy Space Center. The contents include: 1) Corrosion at the Kennedy Space Center (KSC); 2) Requirements and Objectives; 3) Program Description, Background and History; 4) Approach and Implementation; 5) Challenges; 6) Lessons Learned; 7) Successes and Benefits; and 8) Summary and Conclusions.

  18. Computer model for economic study of unbleached kraft paperboard production

    Science.gov (United States)

    Peter J. Ince

    1984-01-01

    Unbleached kraft paperboard is produced from wood fiber in an industrial papermaking process. A highly specific and detailed model of the process is presented. The model is also presented as a working computer program. A user of the computer program will provide data on physical parameters of the process and on prices of material inputs and outputs. The program is then...

  19. Developing computer training programs for blood bankers.

    Science.gov (United States)

    Eisenbrey, L

    1992-01-01

    Two surveys were conducted in July 1991 to gather information about computer training currently performed within American Red Cross Blood Services Regions. One survey was completed by computer trainers from software developer-vendors and regional centers. The second survey was directed to the trainees, to determine their perception of the computer training. The surveys identified the major concepts, length of training, evaluations, and methods of instruction used. Strengths and weaknesses of training programs were highlighted by trainee respondents. Using the survey information and other sources, recommendations (including those concerning which computer skills and tasks should be covered) are made that can be used as guidelines for developing comprehensive computer training programs at any blood bank or blood center.

  20. Computer modeling of human decision making

    Science.gov (United States)

    Gevarter, William B.

    1991-01-01

    Models of human decision making are reviewed. Models which treat just the cognitive aspects of human behavior are included as well as models which include motivation. Both models which have associated computer programs, and those that do not, are considered. Since flow diagrams, that assist in constructing computer simulation of such models, were not generally available, such diagrams were constructed and are presented. The result provides a rich source of information, which can aid in construction of more realistic future simulations of human decision making.

  1. Modeling EERE Deployment Programs

    Energy Technology Data Exchange (ETDEWEB)

    Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2007-11-01

    This report compiles information and conclusions gathered as part of the “Modeling EERE Deployment Programs” project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge in which future research is needed.

  2. Final Report, Center for Programming Models for Scalable Parallel Computing: Co-Array Fortran, Grant Number DE-FC02-01ER25505

    Energy Technology Data Exchange (ETDEWEB)

    Robert W. Numrich

    2008-04-22

    The major accomplishment of this project is the production of CafLib, an 'object-oriented' parallel numerical library written in Co-Array Fortran. CafLib contains distributed objects such as block vectors and block matrices along with procedures, attached to each object, that perform basic linear algebra operations such as matrix multiplication, matrix transpose and LU decomposition. It also contains constructors and destructors for each object that hide the details of data decomposition from the programmer, and it contains collective operations that allow the programmer to calculate global reductions, such as global sums, global minima and global maxima, as well as vector and matrix norms of several kinds. CafLib is designed to be extensible in such a way that programmers can define distributed grid and field objects, based on vector and matrix objects from the library, for finite difference algorithms to solve partial differential equations. A very important extra benefit that resulted from the project is the inclusion of the co-array programming model in the next Fortran standard called Fortran 2008. It is the first parallel programming model ever included as a standard part of the language. Co-arrays will be a supported feature in all Fortran compilers, and the portability provided by standardization will encourage a large number of programmers to adopt it for new parallel application development. The combination of object-oriented programming in Fortran 2003 with co-arrays in Fortran 2008 provides a very powerful programming model for high-performance scientific computing. Additional benefits from the project, beyond the original goal, include a program to provide access to the co-array model through access to the Cray compiler as a resource for teaching and research. Several academics, for the first time, included the co-array model as a topic in their courses on parallel computing. A separate collaborative project with LANL and PNNL showed how to

  3. Using computer modelled life expectancy to evaluate the impact of Australian Primary Care Incentive programs for patients with type 2 diabetes.

    Science.gov (United States)

    Staff, Michael; Chen, Jian Sheng; March, Lyn

    2015-08-01

    To evaluate the impact of enhanced primary care and practice incentive programs on the care of patients with type 2 diabetes in the Australian primary care setting using routinely collected data and computer modelling software. Primary care patient data were electronically extracted from practices and inputted into the United Kingdom Prospective Diabetes Study (UKPDS) Outcomes model. A retrospective cohort study design was employed with predicted life expectancies compared between patients who had a recorded diabetes cycle of care (DCoC) and those who did not. Changes in glycated haemoglobin (HbA1c) were also analysed using a mixed-effects regression model. Potential life expectancy gains were estimated by inputting theoretical risk factors data consistent with current guidelines. Twelve primary care practices were recruited and suitable data were available for 559 people with type 2 diabetes. Two hundred and twenty five patients (40%) were identified as having completed at least one DCoC and as a group had a predicted additional life expectancy of 0.65 years (95% CI, -0.22 to 1.5). However, once this was adjusted for comorbidities the difference reduced to 0.03 years. There was no significant difference in HbA1c levels attributable to the intervention. An estimated 0.5 year of additional life expectancy was predicted should all patients have complied with current risk factor guideline recommendations. Computer modelling using routinely collected primary care data can be used to evaluate the effectiveness of primary care programs. However, there are some data availability and linkage limitations in the Australian setting. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. Methods, Devices and Computer Program Products Providing for Establishing a Model for Emulating a Physical Quantity Which Depends on at Least One Input Parameter, and Use Thereof

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention proposes methods, devices and computer program products. To this extent, there is defined a set X including N distinct parameter values x_i for at least one input parameter x, N being an integer greater than or equal to 1; the physical quantity Pm1 is first measured for each of the N distinct parameter values x_i of the at least one input parameter x, while keeping all other input parameters fixed; a Vandermonde matrix VM is constructed using the set of N parameter values x_i of the at least one input parameter x; and the model W for emulating the physical quantity P is computed based on the Vandermonde matrix and the first measured physical quantity according to the equation W = (VM^T * VM)^(-1) * VM^T * Pm1. The model is iteratively refined so as to obtain a desired emulation precision. The model can later be used to emulate the physical quantity based on input parameters or logs taken...
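
    The fitting step described above is essentially an ordinary least-squares polynomial fit via a Vandermonde matrix. The numpy sketch below shows that calculation with the normal-equations form W = (VM^T VM)^(-1) VM^T Pm1; the measured values, polynomial degree and function names are illustrative, and the patent's iterative refinement step is not shown.

```python
import numpy as np

# Illustrative: N measured values Pm1 of the physical quantity at N parameter values x_i
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
pm1 = np.array([1.0, 1.8, 3.1, 4.9, 7.2])

degree = 2
vm = np.vander(x, degree + 1, increasing=True)   # Vandermonde matrix [1, x, x^2, ...]

# Normal equations; np.linalg.lstsq would be the numerically safer route in practice
w = np.linalg.solve(vm.T @ vm, vm.T @ pm1)

def emulate(x_new):
    """Emulate the physical quantity at new parameter values using the fitted model W."""
    return np.vander(np.atleast_1d(x_new), degree + 1, increasing=True) @ w

print("model coefficients W:", w)
print("emulated value at x = 1.25:", emulate(1.25))
```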

  5. Computational modeling in biomechanics

    CERN Document Server

    Mofrad, Mohammad

    2010-01-01

    This book provides a glimpse of the diverse and important roles that modern computational technology is playing in various areas of biomechanics. It includes unique chapters on ab initio quantum mechanical, molecular dynamics, and scale-coupling methods.

  6. An introduction to Python and computer programming

    CERN Document Server

    Zhang, Yue

    2015-01-01

    This book introduces Python programming language and fundamental concepts in algorithms and computing. Its target audience includes students and engineers with little or no background in programming, who need to master a practical programming language and learn the basic thinking in computer science/programming. The main contents come from lecture notes for engineering students from all disciplines, and has received high ratings. Its materials and ordering have been adjusted repeatedly according to classroom reception. Compared to alternative textbooks in the market, this book introduces the underlying Python implementation of number, string, list, tuple, dict, function, class, instance and module objects in a consistent and easy-to-understand way, making assignment, function definition, function call, mutability and binding environments understandable inside-out. By giving the abstraction of implementation mechanisms, this book builds a solid understanding of the Python programming language.

  7. TRAC-P1: an advanced best estimate computer program for PWR LOCA analysis. I. Methods, models, user information, and programming details

    Energy Technology Data Exchange (ETDEWEB)

    1978-05-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced 'best estimate' predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations.

  8. Geometric Modeling for Computer Vision

    Science.gov (United States)

    1974-10-01

    The main contribution of this thesis is the development of a three-dimensional geometric modeling system for application to computer vision. In computer vision, geometric models provide a goal for descriptive image analysis, an origin for verification image synthesis, and a context for spatial

  9. Mathematical Modeling and Computational Thinking

    Science.gov (United States)

    Sanford, John F.; Naidu, Jaideep T.

    2017-01-01

    The paper argues that mathematical modeling is the essence of computational thinking. Learning a computer language is a valuable assistance in learning logical thinking but of less assistance when learning problem-solving skills. The paper is third in a series and presents some examples of mathematical modeling using spreadsheets at an advanced…

  10. Standard test method for accelerated leach test for diffusive releases from solidified waste and a computer program to model diffusive, fractional leaching from cylindrical waste forms

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method provides procedures for measuring the leach rates of elements from a solidified matrix material, determining if the releases are controlled by mass diffusion, computing values of diffusion constants based on models, and verifying projected long-term diffusive releases. This test method is applicable to any material that does not degrade or deform during the test. 1.1.1 If mass diffusion is the dominant step in the leaching mechanism, then the results of this test can be used to calculate diffusion coefficients using mathematical diffusion models. A computer program developed for that purpose is available as a companion to this test method (Note 1). 1.1.2 It should be verified that leaching is controlled by diffusion by a means other than analysis of the leach test solution data. Analysis of concentration profiles of species of interest near the surface of the solid waste form after the test is recommended for this purpose. 1.1.3 Potential effects of partitioning on the test results can...
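
    When leaching is diffusion-controlled and the cumulative fraction leached (CFL) is still small, a commonly used semi-infinite-medium approximation relates it to an effective diffusion coefficient by CFL = 2*(S/V)*sqrt(De*t/pi). The sketch below simply inverts that relation for De as an illustration of the model-based calculation the test method refers to; it is not the companion computer program, and the cylinder dimensions and CFL value are illustrative.

```python
import math

def effective_diffusivity(cfl, t, surface_area, volume):
    """Back out De [cm^2/s] from cumulative fraction leached (CFL) at time t [s].

    Uses the semi-infinite-medium approximation CFL = 2*(S/V)*sqrt(De*t/pi),
    which is only reasonable while CFL remains small (roughly below ~0.2).
    """
    sv = surface_area / volume
    return math.pi * (cfl / (2.0 * sv * math.sqrt(t))) ** 2

# Illustrative cylindrical waste form: radius 2.5 cm, height 5 cm
r, h = 2.5, 5.0
surface = 2.0 * math.pi * r * (r + h)   # 2*pi*r^2 + 2*pi*r*h
volume = math.pi * r**2 * h
print(effective_diffusivity(cfl=0.05, t=5 * 86400.0, surface_area=surface, volume=volume))
```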

  11. Montgomery Blair Science, Mathematics and Computer Science Magnet Program: A Successful Model for Meeting the Needs of Highly Able STEM Learners

    Science.gov (United States)

    Stein, David; Ostrander, Peter; Lee, G. Maie

    2016-01-01

    The Magnet Program at Montgomery Blair High School is an application-based magnet program utilizing a curriculum focused on science, mathematics, and computer science catering to interested, talented, and eager to learn students in Montgomery County, Maryland. This article identifies and discusses some of the unique aspects of the Magnet Program…

  12. Role of logic programming in computer studies

    Directory of Open Access Journals (Sweden)

    Nicolae PELIN

    2016-09-01

    Full Text Available The paper analyses the opinions of a number of scholars and specialists on the importance and role of logic programming methodology in the study of computer science, on the philosophy of logic programs and interpreters, and on how the presence of a logic interpreter shifts part of the burden away from the programmer. According to the author, the material presented is meant to help the reader understand this multifaceted problem more easily.

  13. Program computes turbine steam rates and properties

    Energy Technology Data Exchange (ETDEWEB)

    Ganapathy, V. (ABCO Industries, Inc., Abilene, TX (US))

    1988-11-01

    BASIC computer program quickly evaluates steam properties and rates during expansion in a steam turbine. Engineers involved in cogeneration projects and power plant studies often need to calculate the steam properties during expansion in a steam turbine to evaluate the theoretical and actual steam rates and hence, the electrical power output. With the help of this program written in BASIC, one can quickly evaluate all the pertinent data. Correlations used for steam property evaluation are also presented.
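
    The headline quantity such a program reports can be illustrated directly from enthalpies: the theoretical steam rate is 3412.14/(h1 - h2s) lb/kWh (3412.14 Btu per kWh), and the actual rate is the theoretical rate divided by the isentropic efficiency. The enthalpy values below are illustrative placeholders; the BASIC program itself computes steam properties from its own correlations, which are not reproduced here.

```python
def steam_rates(h_in, h_out_isentropic, isentropic_efficiency):
    """Theoretical and actual steam rates [lb/kWh] from enthalpies in Btu/lb."""
    tsr = 3412.14 / (h_in - h_out_isentropic)   # 3412.14 Btu = 1 kWh
    asr = tsr / isentropic_efficiency
    return tsr, asr

# Illustrative enthalpies (Btu/lb) for a condensing turbine stage
tsr, asr = steam_rates(h_in=1460.0, h_out_isentropic=1000.0, isentropic_efficiency=0.75)
print("theoretical steam rate: %.2f lb/kWh, actual: %.2f lb/kWh" % (tsr, asr))
```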

  14. A Computational Theory of Modelling

    Science.gov (United States)

    Rossberg, Axel G.

    2003-04-01

    A metatheory is developed which characterizes the relationship between a modelled system, which complies with some 'basic theory', and a model, which does not, and yet reproduces important aspects of the modelled system. A model is represented by an (in a certain sense, see below) optimal algorithm which generates data that describe the model's state or evolution complying with a 'reduced theory'. Theories are represented by classes of (in a similar sense) optimal algorithms that test if their input data comply with the theory. The metatheory does not prescribe the formalisms (data structure, language) to be used for the description of states or evolutions. Transitions to other formalisms and loss of accuracy, common to theory reduction, are explicitly accounted for. The basic assumption of the theory is that resources such as the code length (~ programming time) and the computation time for modelling and testing are costly, but the relative cost of each resource is unknown. Thus, if there is an algorithm a for which there is no other algorithm b solving the same problem but using less of each resource, then a is considered optimal. For tests (theories), the set X of wrongly admitted inputs is treated as another resource. It is assumed that X1 is cheaper than X2 when X1 ⊂ X2 (X1 ≠ X2). Depending on the problem, the algorithmic complexity of a reduced theory can be smaller or larger than that of the basic theory. The theory might help to distinguish actual properties of complex systems from mere mental constructs. An application to complex spatio-temporal patterns is discussed.

  15. Documentation of Computer Program INFIL3.0 - A Distributed-Parameter Watershed Model to Estimate Net Infiltration Below the Root Zone

    Science.gov (United States)

    ,

    2008-01-01

    This report documents the computer program INFIL3.0, which is a grid-based, distributed-parameter, deterministic water-balance watershed model that calculates the temporal and spatial distribution of daily net infiltration of water across the lower boundary of the root zone. The bottom of the root zone is the estimated maximum depth below ground surface affected by evapotranspiration. In many field applications, net infiltration below the bottom of the root zone can be assumed to equal net recharge to an underlying water-table aquifer. The daily water balance simulated by INFIL3.0 includes precipitation as either rain or snow; snowfall accumulation, sublimation, and snowmelt; infiltration into the root zone; evapotranspiration from the root zone; drainage and water-content redistribution within the root-zone profile; surface-water runoff from, and run-on to, adjacent grid cells; and net infiltration across the bottom of the root zone. The water-balance model uses daily climate records of precipitation and air temperature and a spatially distributed representation of drainage-basin characteristics defined by topography, geology, soils, and vegetation to simulate daily net infiltration at all locations, including stream channels with intermittent streamflow in response to runoff from rain and snowmelt. The model does not simulate streamflow originating as ground-water discharge. Drainage-basin characteristics are represented in the model by a set of spatially distributed input variables uniquely assigned to each grid cell of a model grid. The report provides a description of the conceptual model of net infiltration on which the INFIL3.0 computer code is based and a detailed discussion of the methods by which INFIL3.0 simulates the net-infiltration process. The report also includes instructions for preparing input files necessary for an INFIL3.0 simulation, a description of the output files that are created as part of an INFIL3.0 simulation, and a sample problem that
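
    For intuition, the per-cell daily water balance INFIL3.0 performs (precipitation split into rain and snow, snowmelt, evapotranspiration from a root-zone store, and drainage below the root zone as net infiltration) can be caricatured as a single-bucket calculation. The partitioning rules, parameter values and function names below are illustrative simplifications and are not the INFIL3.0 algorithms; in particular, run-on/run-off routing between grid cells is omitted.

```python
def daily_water_balance(precip, tavg, state, root_zone_capacity=150.0,
                        melt_rate=3.0, et_fraction=0.4):
    """Advance a one-cell root-zone store by one day; returns net infiltration [mm].

    precip [mm], tavg [deg C]; state holds the 'snow' and 'soil' storages [mm].
    """
    # Partition precipitation into rain and snow by air temperature
    if tavg <= 0.0:
        state["snow"] += precip
        rain = 0.0
    else:
        rain = precip
    # Degree-day snowmelt
    melt = min(state["snow"], max(tavg, 0.0) * melt_rate)
    state["snow"] -= melt
    # Add rain and melt to the root zone, remove a crude evapotranspiration demand
    state["soil"] += rain + melt
    state["soil"] -= min(state["soil"], et_fraction * max(tavg, 0.0))
    # Water above the root-zone capacity drains downward as net infiltration
    net_infiltration = max(state["soil"] - root_zone_capacity, 0.0)
    state["soil"] -= net_infiltration
    return net_infiltration

state = {"snow": 20.0, "soil": 140.0}
print(daily_water_balance(precip=25.0, tavg=4.0, state=state))
```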

  16. Computational modeling of microstructure

    OpenAIRE

    Luskin, Mitchell

    2003-01-01

    Many materials such as martensitic or ferromagnetic crystals are observed to be in metastable states exhibiting a fine-scale, structured spatial oscillation called microstructure; and hysteresis is observed as the temperature, boundary forces, or external magnetic field changes. We have developed a numerical analysis of microstructure and used this theory to construct numerical methods that have been used to compute approximations to the deformation of crystals with microstructure.

  17. A CAD (Classroom Assessment Design) of a Computer Programming Course

    Science.gov (United States)

    Hawi, Nazir S.

    2012-01-01

    This paper presents a CAD (classroom assessment design) of an entry-level undergraduate computer programming course "Computer Programming I". CAD has been the product of a long experience in teaching computer programming courses including teaching "Computer Programming I" 22 times. Each semester, CAD is evaluated and modified…

  18. Computer program documentation for the enhanced stream-water quality model QUAL2E. Final report, August 1984-June 1985

    Energy Technology Data Exchange (ETDEWEB)

    Brown, L.C.; Barnwell, T.O.

    1985-08-01

    Presented in the manual are recent modifications and improvements to the widely used stream water quality model QUAL-II. Called QUAL2E, the enhanced model incorporates improvements in eight areas: (1) algal, nitrogen, phosphorus, and dissolved oxygen interactions; (2) algal growth rate; (3) temperature; (4) dissolved oxygen; (5) arbitrary non-conservative constituents; (6) hydraulics; (7) downstream boundary concentrations; and (8) input/output modifications. QUAL2E, which can be operated either as a steady-state or as a dynamic model, is intended for use as a water-quality planning tool.

  19. Introduction to programming multiple-processor computers

    Energy Technology Data Exchange (ETDEWEB)

    Hicks, H.R.; Lynch, V.E.

    1985-04-01

    FORTRAN applications programs can be executed on multiprocessor computers in either a unitasking (traditional) or multitasking form. The latter allows a single job to use more than one processor simultaneously, with a consequent reduction in wall-clock time and, perhaps, the cost of the calculation. An introduction to programming in this environment is presented. The concepts of synchronization and data sharing using EVENTS and LOCKS are illustrated with examples. The strategy of strong synchronization and the use of synchronization templates are proposed. We emphasize that incorrect multitasking programs can produce irreproducible results, which makes debugging more difficult.
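
    The report's examples are written in multitasking FORTRAN with EVENTS and LOCKS; the Python sketch below is not from the report and only illustrates the data-sharing hazard it warns about: without a lock, concurrent updates of shared data can yield irreproducible results.

```python
# Illustration of the data-sharing hazard described above, using Python threads.
# The FORTRAN EVENT/LOCK mechanisms are roughly analogous to threading.Event/threading.Lock.
import threading

counter = 0
lock = threading.Lock()

def add_many(n, use_lock):
    global counter
    for _ in range(n):
        if use_lock:
            with lock:              # synchronized update: reproducible result
                counter += 1
        else:
            counter += 1            # unsynchronized update: result may vary

def run(use_lock):
    global counter
    counter = 0
    threads = [threading.Thread(target=add_many, args=(100_000, use_lock))
               for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print("with lock:   ", run(True))    # always 400000
print("without lock:", run(False))   # may fall short of 400000 on some runs
```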

  20. Computer Model Of Fragmentation Of Atomic Nuclei

    Science.gov (United States)

    Wilson, John W.; Townsend, Lawrence W.; Tripathi, Ram K.; Norbury, John W.; Khan, Ferdous; Badavi, Francis F.

    1995-01-01

    High Charge and Energy Semiempirical Nuclear Fragmentation Model (HZEFRG1) computer program developed to be computationally efficient, user-friendly, physics-based program for generating data bases on fragmentation of atomic nuclei. Data bases generated used in calculations pertaining to such radiation-transport applications as shielding against radiation in outer space, radiation dosimetry in outer space, cancer therapy in laboratories with beams of heavy ions, and simulation studies for designing detectors for experiments in nuclear physics. Provides cross sections for production of individual elements and isotopes in breakups of high-energy heavy ions by combined nuclear and Coulomb fields of interacting nuclei. Written in ANSI FORTRAN 77.

  1. Computer Assisted Programmed Instruction and Cognitive ...

    African Journals Online (AJOL)

    The achievement of students of application learning mode was also significantly higher than those of recall and principle respectively. There was no significant interaction effect between Cognitive Preference Style and Computer Assisted Programmed Instruction. The implications of the result to the stakeholder were ...

  2. Computer program package for PIXE spectra evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kajfosz, J. [Institute of Nuclear Physics, Cracow (Poland)]

    1992-12-31

    The computer programs described here were developed for calculating the concentrations of elements in samples analysed by the PIXE (Proton Induced X-ray Emission) method from the X-ray spectra obtained in those analyses. (author). 10 refs, 2 figs.

  3. Computer programming students head to Tokyo

    OpenAIRE

    Crumbley, Liz

    2007-01-01

    "The Milk's Gone Bad," a team of three undergraduate students from the Virginia Tech College of Engineering, will compete in the World Finals of the Association of Computing Machinery International Collegiate Programming Contest (ACM-ICPC) March 12-16 in Tokyo, Japan.

  4. Data systems and computer science programs: Overview

    Science.gov (United States)

    Smith, Paul H.; Hunter, Paul

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. The topics are presented in viewgraph form and include the following: onboard memory and storage technology; advanced flight computers; special purpose flight processors; onboard networking and testbeds; information archive, access, and retrieval; visualization; neural networks; software engineering; and flight control and operations.

  5. Computational models of syntactic acquisition.

    Science.gov (United States)

    Yang, Charles

    2012-03-01

    The computational approach to syntactic acquisition can be fruitfully pursued by integrating results and perspectives from computer science, linguistics, and developmental psychology. In this article, we first review some key results in computational learning theory and their implications for language acquisition. We then turn to examine specific learning models, some of which exploit distributional information in the input while others rely on a constrained space of hypotheses, yet both approaches share a common set of characteristics to overcome the learning problem. We conclude with a discussion of how computational models connect with the empirical study of child grammar, making the case for computationally tractable, psychologically plausible and developmentally realistic models of acquisition. WIREs Cogn Sci 2012, 3:205-213. doi: 10.1002/wcs.1154 For further resources related to this article, please visit the WIREs website. Copyright © 2011 John Wiley & Sons, Ltd.

  6. Application of an object-oriented programming paradigm in three-dimensional computer modeling of mechanically active gastrointestinal tissues.

    Science.gov (United States)

    Rashev, P Z; Mintchev, M P; Bowes, K L

    2000-09-01

    The aim of this study was to develop a novel three-dimensional (3-D) object-oriented modeling approach incorporating knowledge of the anatomy, electrophysiology, and mechanics of externally stimulated excitable gastrointestinal (GI) tissues and emphasizing the "stimulus-response" principle of extracting the modeling parameters. The modeling method used clusters of class hierarchies representing GI tissues from three perspectives: 1) anatomical; 2) electrophysiological; and 3) mechanical. We elaborated on the first four phases of the object-oriented system development life-cycle: 1) analysis; 2) design; 3) implementation; and 4) testing. Generalized cylinders were used for the implementation of 3-D tissue objects modeling the cecum, the descending colon, and the colonic circular smooth muscle tissue. The model was tested using external neural electrical tissue excitation of the descending colon with virtual implanted electrodes and the stimulating current density distributions over the modeled surfaces were calculated. Finally, the tissue deformations invoked by electrical stimulation were estimated and represented by a mesh-surface visualization technique.

  7. Modeling EERE Deployment Programs

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Hostick, Donna J.; Belzer, David B.; Livingston, Olga V.

    2007-11-08

    The purpose of this report is to compile information and conclusions gathered as part of three separate tasks undertaken as part of the overall project, “Modeling EERE Deployment Programs,” sponsored by the Planning, Analysis, and Evaluation office within the Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE). The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address improvements to modeling in the near term, and note gaps in knowledge where future research is needed.

  8. Employing subgoals in computer programming education

    Science.gov (United States)

    Margulieux, Lauren E.; Catrambone, Richard; Guzdial, Mark

    2016-01-01

    The rapid integration of technology into our professional and personal lives has left many education systems ill-equipped to deal with the influx of people seeking computing education. To improve computing education, we are applying techniques that have been developed for other procedural fields. The present study applied such a technique, subgoal labeled worked examples, to explore whether it would improve programming instruction. The first two experiments, conducted in a laboratory, suggest that the intervention improves undergraduate learners' problem-solving performance and affects how learners approach problem-solving. The third experiment demonstrates that the intervention has similar, and perhaps stronger, effects in an online learning environment with in-service K-12 teachers who want to become qualified to teach computing courses. By implementing this subgoal intervention as a tool for educators to teach themselves and their students, education systems could improve computing education and better prepare learners for an increasingly technical world.

  9. Computer Presentation Programs and Teaching Research Methodologies

    Directory of Open Access Journals (Sweden)

    Vahid Motamedi

    2015-05-01

    Full Text Available Supplementing traditional chalk and board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems that contribute to student errors while taking class notes, such as the transcription of numbers onto the board and the instructor's handwriting, can be resolved by careful construction of computer presentations. The use of computer presentation programs promises to increase the effectiveness of learning by making content more readily available, by reducing the cost and effort of producing quality content, and by allowing content to be more easily shared. This paper describes how these problems can be overcome by using presentation packages for instruction.

  10. Integrated Computational Model Development

    Science.gov (United States)

    2014-03-01

    and manufacturing. To start to achieve these goals, the program contained four task or thrust areas dealing with a) materials and processes (M&P), b...Cr, and its lattice parameter after HIP was a = 324.76 ± 0.16 pm. The BCC2 phase was enriched with Zr and Ti and considerably depleted with Mo, Cr...and Ta, and its lattice parameter after HIP was estimated to be a = 341.0 ± 1.0 pm. The FCC phase was highly enriched with Cr and it was identified

  11. Overview of ASC Capability Computing System Governance Model

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott W. [Los Alamos National Laboratory

    2012-07-11

    This document contains a description of the Advanced Simulation and Computing Program's Capability Computing System Governance Model. Objectives of the Governance Model are to ensure that the capability system resources are allocated on a priority-driven basis according to the Program requirements; and to utilize ASC Capability Systems for the large capability jobs for which they were designed and procured.

  12. PDEPTH—A computer program for the geophysical interpretation of magnetic and gravity profiles through Fourier filtering, source-depth analysis, and forward modeling

    Science.gov (United States)

    Phillips, Jeffrey D.

    2018-01-10

    PDEPTH is an interactive, graphical computer program used to construct interpreted geological source models for observed potential-field geophysical profile data. The current version of PDEPTH has been adapted to the Windows platform from an earlier DOS-based version. The input total-field magnetic anomaly and vertical gravity anomaly profiles can be filtered to produce derivative products such as reduced-to-pole magnetic profiles, pseudogravity profiles, pseudomagnetic profiles, and upward-or-downward-continued profiles. A variety of source-location methods can be applied to the original and filtered profiles to estimate (and display on a cross section) the locations and physical properties of contacts, sheet edges, horizontal line sources, point sources, and interface surfaces. Two-and-a-half-dimensional source bodies having polygonal cross sections can be constructed using a mouse and keyboard. These bodies can then be adjusted until the calculated gravity and magnetic fields of the source bodies are close to the observed profiles. Auxiliary information such as the topographic surface, bathymetric surface, seismic basement, and geologic contact locations can be displayed on the cross section using optional input files. Test data files, used to demonstrate the source location methods in the report, and several utility programs are included.

  13. Computer models of concrete structures

    OpenAIRE

    Cervenka, Vladimir; Eligehausen, Rolf; Pukl, Radomir

    1991-01-01

    The application of the nonlinear finite element analysis of concrete structures as a design tool is discussed. A computer program for structures in plane stress state is described and examples of its application in the research of fastening technique and in engineering practice are shown.

  14. Foresters' Metric Conversions program (version 1.0). [Computer program]

    Science.gov (United States)

    Jefferson A. Palmer

    1999-01-01

    The conversion of scientific measurements has become commonplace in the fields of engineering, research, and forestry. Foresters' Metric Conversions is a Windows-based computer program that quickly converts user-defined measurements from English to metric and from metric to English. Foresters' Metric Conversions was derived from the publication "Metric...

  15. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, clinical image segmentation, and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  16. Trust Models in Ubiquitous Computing

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2008-01-01

    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.

  17. Documentation of a computer program to simulate lake-aquifer interaction using the MODFLOW ground water flow model and the MOC3D solute-transport model

    Science.gov (United States)

    Merritt, Michael L.; Konikow, Leonard F.

    2000-01-01

    Heads and flow patterns in surficial aquifers can be strongly influenced by the presence of stationary surface-water bodies (lakes) that are in direct contact, vertically and laterally, with the aquifer. Conversely, lake stages can be significantly affected by the volume of water that seeps through the lakebed that separates the lake from the aquifer. For these reasons, a set of computer subroutines called the Lake Package (LAK3) was developed to represent lake/aquifer interaction in numerical simulations using the U.S. Geological Survey three-dimensional, finite-difference, modular ground-water flow model MODFLOW and the U.S. Geological Survey three-dimensional method-of-characteristics solute-transport model MOC3D. In the Lake Package described in this report, a lake is represented as a volume of space within the model grid which consists of inactive cells extending downward from the upper surface of the grid. Active model grid cells bordering this space, representing the adjacent aquifer, exchange water with the lake at a rate determined by the relative heads and by conductances that are based on grid cell dimensions, hydraulic conductivities of the aquifer material, and user-specified leakance distributions that represent the resistance to flow through the material of the lakebed. Parts of the lake may become "dry" as upper layers of the model are dewatered, with a concomitant reduction in lake surface area, and may subsequently rewet when aquifer heads rise. An empirical approximation has been encoded to simulate the rewetting of a lake that becomes completely dry. The variations of lake stages are determined by independent water budgets computed for each lake in the model grid. This lake budget process makes the package a simulator of the response of lake stage to hydraulic stresses applied to the aquifer. Implementation of a lake water budget requires input of parameters including those representing the rate of lake atmospheric recharge and evaporation.
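
    The head-dependent exchange described above can be illustrated with a minimal sketch (this is not the LAK3 code; cell sizes, leakances, and heads are made up): seepage through each lakebed face is the face conductance times the difference between lake stage and aquifer head.

```python
# Sketch of the head-dependent lake/aquifer exchange concept described above.
# Not the actual LAK3 code; all names and values are illustrative.

def cell_conductance(cell_area_m2, lakebed_leakance_per_day):
    """Conductance of the lakebed across one grid-cell face (m^2/day).
    Leakance is lakebed hydraulic conductivity divided by lakebed thickness."""
    return cell_area_m2 * lakebed_leakance_per_day

def seepage_rate(lake_stage_m, aquifer_head_m, conductance_m2_per_day):
    """Positive values: flow from the lake into the aquifer (m^3/day)."""
    return conductance_m2_per_day * (lake_stage_m - aquifer_head_m)

# One lake in contact with three aquifer cells, each 100 m x 100 m.
cells = [
    {"head": 11.8, "leakance": 0.002},
    {"head": 12.1, "leakance": 0.002},
    {"head": 12.4, "leakance": 0.001},
]
lake_stage = 12.3

total_seepage = sum(
    seepage_rate(lake_stage, c["head"], cell_conductance(100.0 * 100.0, c["leakance"]))
    for c in cells
)
print(f"net lake-to-aquifer seepage: {total_seepage:.1f} m^3/day")
```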

  18. Building Program Models Incrementally from Informal Descriptions.

    Science.gov (United States)

    1979-10-01

    Building Program Models Incrementally from Informal Descriptions, by Brian P. McCune. Stanford University, Department of Computer Science, Report SCI.ICS.U.79.2, technical report, October 1979. Research sponsored by the Defense Advanced Research Projects Agency.

  19. Tpetra, and the Use of Generic Programming in Scientific Computing

    Directory of Open Access Journals (Sweden)

    C.G. Baker

    2012-01-01

    Full Text Available We present Tpetra, a Trilinos package for parallel linear algebra primitives implementing the Petra object model. We describe Tpetra's design, based on generic programming via C++ templated types and template metaprogramming. We discuss some benefits of this approach in the context of scientific computing, with illustrations consisting of code and notable empirical results.

  20. GAP: A computer program for gene assembly

    Energy Technology Data Exchange (ETDEWEB)

    Einstein, J.R.; Uberbacher, E.C.; Guan, X.; Mural, R.J.; Mann, R.C.

    1991-09-01

    A computer program, GAP (Gene Assembly Program), has been written to assemble and score hypothetical genes, given a DNA sequence containing the gene, and the outputs of several other programs which analyze the sequence. These programs include the coding-recognition and splice-junction-recognition modules developed in this laboratory. GAP is a prototype of a planned system in which it will be integrated with an expert system and rule base. Initial tests of GAP have been carried out with four sequences, the exons of which have been determined by biochemical methods. The highest-scoring hypothetical genes for each of the four sequences had percent correct splice junctions ranging from 50 to 100% (average 81%) and percent correct bases ranging from 92 to 100% (average 96%). 9 refs., 1 tab.

  1. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    Models are playing important roles in design and analysis of chemicals based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working...... development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines...... as the user can then generate many problem-specific models for different applications. The templates are part of the model generation feature of the framework. Also, the model development and use for a product performance evaluation has been developed. The application of the modeling template is highlighted...

  2. Rapid freeze-drying cycle optimization using computer programs developed based on heat and mass transfer models and facilitated by tunable diode laser absorption spectroscopy (TDLAS).

    Science.gov (United States)

    Kuu, Wei Y; Nail, Steven L

    2009-09-01

    Computer programs in FORTRAN were developed to rapidly determine the optimal shelf temperature, T(f), and chamber pressure, P(c), to achieve the shortest primary drying time. The constraint for the optimization is to ensure that the product temperature profile, T(b), is below the target temperature, T(target). Five percent mannitol was chosen as the model formulation. After obtaining the optimal sets of T(f) and P(c), each cycle was assigned with a cycle rank number in terms of the length of drying time. Further optimization was achieved by dividing the drying time into a series of ramping steps for T(f), in a cascading manner (termed the cascading T(f) cycle), to further shorten the cycle time. For the purpose of demonstrating the validity of the optimized T(f) and P(c), four cycles with different predicted lengths of drying time, along with the cascading T(f) cycle, were chosen for experimental cycle runs. Tunable diode laser absorption spectroscopy (TDLAS) was used to continuously measure the sublimation rate. As predicted, maximum product temperatures were controlled slightly below the target temperature of -25 degrees C, and the cascading T(f)-ramping cycle is the most efficient cycle design. In addition, the experimental cycle rank order closely matches with that determined by modeling.

  3. Testing computational toxicology models with phytochemicals.

    Science.gov (United States)

    Valerio, Luis G; Arvidson, Kirk B; Busta, Emily; Minnier, Barbara L; Kruhlak, Naomi L; Benz, R Daniel

    2010-02-01

    Computational toxicology employing quantitative structure-activity relationship (QSAR) modeling is an evidence-based predictive method being evaluated by regulatory agencies for risk assessment and scientific decision support for toxicological endpoints of interest such as rodent carcinogenicity. Computational toxicology is being tested for its usefulness to support the safety assessment of drug-related substances (e.g. active pharmaceutical ingredients, metabolites, impurities), indirect food additives, and other applied uses of value for protecting public health including safety assessment of environmental chemicals. The specific use of QSAR as a chemoinformatic tool for estimating the rodent carcinogenic potential of phytochemicals present in botanicals, herbs, and natural dietary sources is investigated here by an external validation study, which is the most stringent scientific method of measuring predictive performance. The external validation statistics for predicting rodent carcinogenicity of 43 phytochemicals, using two computational software programs evaluated at the FDA, are discussed. One software program showed very good performance for predicting non-carcinogens (high specificity), but both exhibited poor performance in predicting carcinogens (sensitivity), which is consistent with the design of the models. When predictions were considered in combination with each other rather than based on any one software program, the performance for sensitivity was enhanced. However, Chi-square values indicated that the overall predictive performance decreases when using the two computational programs with this particular data set. This study suggests that multiple complementary computational toxicology software programs need to be carefully selected to improve global QSAR predictions for this complex toxicological endpoint.
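
    The performance measures quoted above (specificity for non-carcinogens, sensitivity for carcinogens) are standard confusion-matrix statistics; a minimal sketch with made-up counts, not the study's data:

```python
# Standard confusion-matrix statistics used in the external validation above.
# The counts below are made up for illustration; they are not the study's data.

def classification_stats(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # fraction of true carcinogens predicted positive
    specificity = tn / (tn + fp)   # fraction of non-carcinogens predicted negative
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

sens, spec, acc = classification_stats(tp=5, fn=10, tn=25, fp=3)
print(f"sensitivity={sens:.2f}  specificity={spec:.2f}  accuracy={acc:.2f}")
```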

  4. Introducing Program Evaluation Models

    Directory of Open Access Journals (Sweden)

    Raluca GÂRBOAN

    2008-02-01

    Full Text Available Program and project evaluation models can be extremely useful in project planning and management. The aim is to set the right questions as soon as possible in order to identify and deal with unwanted program effects in time, as well as to encourage the positive elements of the project impact. In short, different evaluation models are used in order to minimize losses and maximize the benefits of the interventions upon small or large social groups. This article introduces some of the most recently used evaluation models.

  5. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  6. COMPUTER PROGRAMMING AND ROBOTICS IN BASIC EDUCATION

    Directory of Open Access Journals (Sweden)

    José Manuel Cabrera Delgado

    2015-12-01

    Full Text Available This article aims to provide an overview of the process of including computer programming and robotics in the basic-education curriculum of several European countries, including Spain. For this purpose, the cases of Estonia and France, two European Union countries that can be considered pioneers in implementing such teaching, are briefly analyzed. In relation to Spain, some of the current initiatives implemented by the Autonomous Communities in this regard are also discussed.

  7. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation – Modelling Deterministic Systems. N. K. Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46–54.

  8. Scientific Computing in the CH Programming Language

    Directory of Open Access Journals (Sweden)

    Harry H. Cheng

    1993-01-01

    Full Text Available We have developed a general-purpose block-structured interpretive programming language. The syntax and semantics of this language called CH are similar to C. CH retains most features of C from the scientific computing point of view. In this paper, the extension of C to CH for numerical computation of real numbers will be described. Metanumbers of −0.0, 0.0, Inf, −Inf, and NaN are introduced in CH. Through these metanumbers, the power of the IEEE 754 arithmetic standard is easily available to the programmer. These metanumbers are extended to commonly used mathematical functions in the spirit of the IEEE 754 standard and ANSI C. The definitions for manipulation of these metanumbers in I/O; arithmetic, relational, and logic operations; and built-in polymorphic mathematical functions are defined. The capabilities of bitwise, assignment, address and indirection, increment and decrement, as well as type conversion operations in ANSI C are extended in CH. In this paper, mainly new linguistic features of CH in comparison to C will be described. Example programs programmed in CH with metanumbers and polymorphic mathematical functions will demonstrate capabilities of CH in scientific computing.
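
    CH's metanumbers correspond to the special values of IEEE 754 arithmetic that most modern languages also expose. The sketch below is Python rather than CH code, but it demonstrates the same behaviour of −0.0, Inf, −Inf, and NaN.

```python
# IEEE 754 special values analogous to the CH "metanumbers" described above
# (this is Python, not CH; CH exposes the same values with C-like syntax).
import math

inf = float("inf")
nan = float("nan")
nzero = -0.0

print(1.0 / inf)                     # 0.0
print(inf + inf)                     # inf
print(inf - inf)                     # nan: undefined results propagate as NaN
print(nan == nan)                    # False: NaN compares unequal to everything
print(nzero == 0.0)                  # True: -0.0 and 0.0 compare equal ...
print(math.copysign(1.0, nzero))     # ... but the sign bit is preserved (-1.0)
print(math.atan2(0.0, -1.0))         # pi
print(math.atan2(-0.0, -1.0))        # -pi: signed zero matters in some functions
```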

  9. Computing Linear Mathematical Models Of Aircraft

    Science.gov (United States)

    Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.

    1991-01-01

    Derivation and Definition of Linear Aircraft Model (LINEAR) computer program provides user with powerful, flexible, standard, documented, and verified software tool for linearization of mathematical models of aerodynamics of aircraft. Intended for use in software tool to drive linear analysis of stability and design of control laws for aircraft. Capable of both extracting such linearized engine effects as net thrust, torque, and gyroscopic effects, and including these effects in linear model of system. Designed to provide easy selection of state, control, and observation variables used in particular model. Also provides flexibility of allowing alternate formulations of both state and observation equations. Written in FORTRAN.
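
    LINEAR itself is a FORTRAN program built around aircraft-specific aerodynamic and engine models; the sketch below only illustrates the general idea of extracting a linear state-space model x' ≈ A(x − x0) + B(u − u0) from a nonlinear model by finite differences about an operating point. The toy model and all names are hypothetical.

```python
# Generic finite-difference linearization of a nonlinear model x' = f(x, u)
# about an operating point -- illustrating the idea behind LINEAR, not its code.
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Return A, B such that f(x, u) ~ A (x - x0) + B (u - u0) near (x0, u0)."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m)
        du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Toy model loosely resembling longitudinal dynamics:
# states [speed, pitch rate], control [elevator].  Purely illustrative.
def f(x, u):
    v, q = x
    return np.array([-0.02 * v - 9.81 * q + 0.1 * u[0],
                     -0.5 * q + 2.0 * u[0]])

A, B = linearize(f, x0=np.array([60.0, 0.0]), u0=np.array([0.0]))
print(A)
print(B)
```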

  10. Computing Programs for Determining Traffic Flows from Roundabouts

    Science.gov (United States)

    Boroiu, A. A.; Tabacu, I.; Ene, A.; Neagu, E.; Boroiu, A.

    2017-10-01

    For modelling road traffic at the level of a road network it is necessary to specify the flows of all traffic currents at each intersection. These data can be obtained by direct measurements at the traffic light intersections, but in the case of a roundabout this is not possible directly, and neither the literature nor traffic modelling software offers ways to solve this issue. Two sets of formulas are proposed by which all traffic flows from the roundabouts with 3 or 4 arms are calculated based on the streams that can be measured. The objective of this paper is to develop computational programs to operate with these formulas. For each of the two sets of analytical relations, a computational program was developed in the Java programming language. The obtained results fully confirm the applicability of the calculation programs. The final stage for capitalizing these programs will be to make them web pages in HTML format, so that they can be accessed and used on the Internet. The achievements presented in this paper are an important step towards providing a necessary tool for traffic modelling, because these computational programs can be easily integrated into specialized software.

  11. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  12. User's manual for computer program BASEPLOT

    Science.gov (United States)

    Sanders, Curtis L.

    2002-01-01

    The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.

  13. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments

  14. Computational models of adult neurogenesis

    Science.gov (United States)

    Cecchi, Guillermo A.; Magnasco, Marcelo O.

    2005-10-01

    Experimental results in recent years have shown that adult neurogenesis is a significant phenomenon in the mammalian brain. Little is known, however, about the functional role played by the generation and destruction of neurons in the context of an adult brain. Here, we propose two models where new projection neurons are incorporated. We show that in both models, using incorporation and removal of neurons as a computational tool, it is possible to achieve a higher computational efficiency than in purely static, synapse-learning-driven networks. We also discuss the implications for understanding the role of adult neurogenesis in specific brain areas like the olfactory bulb and the dentate gyrus.

  15. A program for reading DNA sequence gels using a small computer equipped with a graphics tablet.

    Science.gov (United States)

    Lautenberger, J A

    1982-01-01

    A program has been written in BASIC that allows DNA sequence gels to be read by a Tektronix model 4052 computer equipped with a graphics tablet. Sequences from each gel are stored on tape for later transfer to a larger computer where they are melded into a complete overall sequence. The program should be adaptable to other small computers. PMID:7063401

  16. Gender Differences in the Use of Computers, Programming, and Peer Interactions in Computer Science Classrooms

    Science.gov (United States)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-01-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…

  17. Computer Implementation of the Two-Factor DP Model for ...

    African Journals Online (AJOL)

    A computer program known as Program Simplex, which takes advantage of this sparseness, has been applied to obtain an optimal solution to the manpower planning problem presented. It has also been observed that LP models with few nonzero coefficients can easily be solved by using a computer to obtain an optimal ...
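
    The abstract does not reproduce the LP model itself; the sketch below merely shows how a small manpower-planning-style LP with mostly zero coefficients can be handed to an off-the-shelf solver (here SciPy's HiGHS-based linprog). All coefficients are made up.

```python
# A small sparse LP solved with SciPy's linprog (HiGHS), illustrating the kind
# of manpower-planning LP discussed above.  All coefficients are made up.
from scipy.optimize import linprog

# Decision variables: hires in periods 1-3.  Staff carry over between periods.
# minimize  c @ x  subject to  A_ub @ x <= b_ub,  x >= 0
c = [5.0, 5.0, 5.0]                        # cost per hire in each period
A_ub = [
    [-1.0,  0.0,  0.0],                    # period 1: h1 >= 10
    [-1.0, -1.0,  0.0],                    # period 2: h1 + h2 >= 18
    [-1.0, -1.0, -1.0],                    # period 3: h1 + h2 + h3 >= 25
]
b_ub = [-10.0, -18.0, -25.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
print(res.x, res.fun)                      # optimal hires and total cost
```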

  18. General Purpose Cost Distribution Model for Computer Assisted Instruction.

    Science.gov (United States)

    Voeller, Rick

    To compare the unit cost of computer-assisted instruction (CAI) programs, there must be a standard model for calculating the cost of computer services. Such cost can be classified into direct costs--expenditures made directly by the group in charge of CAI programs, and indirect costs--expenditures made by other groups in support of CAI services.…

  19. Airline return-on-investment model for technology evaluation. [computer program to measure economic value of advanced technology applied to passenger aircraft

    Science.gov (United States)

    1974-01-01

    This report presents the derivation, description, and operating instructions for a computer program (TEKVAL) which measures the economic value of advanced technology features applied to long range commercial passenger aircraft. The program consists of three modules: an airplane sizing routine, a direct operating cost routine, and an airline return-on-investment routine. These modules are linked such that they may be operated sequentially or individually, with one routine generating the input for the next or with the option of externally specifying the input for either of the economic routines. A very simple airplane sizing technique was previously developed, based on the Breguet range equation. For this program, that sizing technique has been greatly expanded and combined with the formerly separate DOC and ROI programs to produce TEKVAL.
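
    The simple sizing technique mentioned above is based on the Breguet range equation; a minimal sketch of that equation follows (not the TEKVAL code, and with illustrative numbers).

```python
# Breguet range equation, the basis of the simple sizing technique mentioned
# above.  Values below are illustrative, not from TEKVAL.
import math

def breguet_range_km(velocity_kmh, lift_to_drag, tsfc_per_h,
                     initial_weight, final_weight):
    """Range for a jet aircraft at constant speed and lift-to-drag ratio.
    tsfc_per_h: thrust-specific fuel consumption (1/h);
    weights may be in any consistent unit."""
    return (velocity_kmh / tsfc_per_h) * lift_to_drag * \
           math.log(initial_weight / final_weight)

# Example: 900 km/h cruise, L/D = 17, TSFC = 0.6 1/h, 20% of weight burned as fuel.
print(breguet_range_km(900.0, 17.0, 0.6, initial_weight=1.0, final_weight=0.8))
```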

  20. The Impact of IBM Cell Technology on the Programming Paradigm in the Context of Computer Systems for Climate and Weather Models

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Shujia; Duffy, Daniel; Clune, Thomas; Suarez, Max; Williams, Samuel; Halem, Milton

    2009-01-10

    The call for ever-increasing model resolutions and physical processes in climate and weather models demands a continual increase in computing power. The IBM Cell processor's order-of-magnitude peak performance increase over conventional processors makes it very attractive to fulfill this requirement. However, the Cell's characteristics, 256KB local memory per SPE and the new low-level communication mechanism, make it very challenging to port an application. As a trial, we selected the solar radiation component of the NASA GEOS-5 climate model, which: (1) is representative of column physics components (half the total computational time), (2) has an extremely high computational intensity: the ratio of computational load to main memory transfers, and (3) exhibits embarrassingly parallel column computations. In this paper, we converted the baseline code (single-precision Fortran) to C and ported it to an IBM BladeCenter QS20. For performance, we manually SIMDize four independent columns and include several unrolling optimizations. Our results show that when compared with the baseline implementation running on one core of Intel's Xeon Woodcrest, Dempsey, and Itanium2, the Cell is approximately 8.8x, 11.6x, and 12.8x faster, respectively. Our preliminary analysis shows that the Cell can also accelerate the dynamics component (approximately 25 percent of the total computational time). We believe these dramatic performance improvements make the Cell processor very competitive as an accelerator.

  1. DFI Computer Modeling Software (CMS)

    Energy Technology Data Exchange (ETDEWEB)

    Cazalet, E.G.; Deziel, L.B. Jr.; Haas, S.M.; Martin, T.W.; Nesbitt, D.M.; Phillips, R.L.

    1979-10-01

    The data base management system used to create, edit and store models data and solutions for the LEAP system is described. The software is entirely in FORTRAN-G for the IBM 370 series of computers and provides interface with a commercial data base system SYSTEM-2000.

  2. Pervasive Computing and Prosopopoietic Modelling

    DEFF Research Database (Denmark)

    Michelsen, Anders Ib

    2011-01-01

    of the classical rhetoric term of ’prosopopoeia’ into the debate on large technological systems. First, the paper introduces the paradoxical distinction/complicity by debating Gilbert Simondon’s notion of a ‘margin of indeterminacy’ vis-a-vis computing. Second, it debates the idea of prosopopoietic modeling...

  3. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  4. Computer programming: Science, art, or both?

    Science.gov (United States)

    Gum, Sandra Trent

    The purpose of this study was to determine if spatial intelligence contributes to a student's success in a computer science major or if mathematical-logical intelligence is sufficient data on which to base a prediction of success. The study was performed at a small university. The sample consisted of 15 computer science (CS) majors, enrolled in a computer science class, and 15 non-CS-majors, enrolled in a statistics class. Seven of the CS-majors were considered advanced and seven were considered less advanced. The independent measures were: the mathematics and the English scores from the ACT/SAT (CS-majors); a questionnaire to obtain personal information; the major area of study which compared CS-majors to all other majors; and the number of completed computer science classes (CS-majors) to determine advanced and less advanced CS-majors. The dependent measures were: a multiple intelligence inventory for adults to determine perception of intelligences; the GEFT to determine field independence; the Card Rotations Test to determine spatial orientation ability; the Maze Tracing Speed Test to determine spatial scanning ability; and the Surface Development test to determine visualization ability. The visualization measure correlated positively and significantly with the GEFT. The year in college correlated positively and significantly with the GEFT and visualization measure for CS-majors and correlated negatively for non-CS-majors. Although non-CS-majors scored higher on the spatial orientation measure, CS-majors scored significantly higher on the spatial scanning measure. The year in college correlated negatively with many of the measures and perceptions of intelligences among both groups; however, there were more significant negative correlations among non-CS-majors. Results indicated that experience in computer programming may increase field independence, visualization ability, and spatial scanning ability while decreasing spatial orientation ability.

  5. Implementing and assessing computational modeling in introductory mechanics

    CERN Document Server

    Caballero, Marcos D; Schatz, Michael F

    2011-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term 1357 students in this course solved a suite of fourteen computational modeling homework questions delivered using an online commercial course management system. Their proficiency with computational modeling was evaluated in a proctored environment using a novel central force problem. The majority of students (60.4%) successfully completed the evaluation. Analysis of erroneous student-submitted programs indicated that a small set of student errors explained why most programs failed. We discuss the design and implementation of the computational modeling homework and evaluation, the results from the evaluation and the implications for instruction in computational modeling in introductory STEM courses.
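
    The evaluation problem is not given in the abstract; the sketch below shows the general style of computational model students write in such courses, an Euler-Cromer integration of motion under a central inverse-square force. Plain Python/NumPy is used instead of VPython, and the parameters are made up.

```python
# Euler-Cromer integration of motion under a central inverse-square force,
# the style of model students write in the course described above.
# Plain Python/NumPy is used here instead of VPython; parameters are made up.
import numpy as np

G_M = 1.0                      # gravitational parameter of the central body
r = np.array([1.0, 0.0])       # initial position
v = np.array([0.0, 0.9])       # initial velocity (slightly sub-circular)
dt = 0.001

for step in range(20000):
    a = -G_M * r / np.linalg.norm(r) ** 3   # central inverse-square acceleration
    v = v + a * dt                           # update velocity first (Euler-Cromer)
    r = r + v * dt                           # then update position with new velocity
    if step % 5000 == 0:
        print(f"t={step * dt:5.2f}  r={np.linalg.norm(r):.3f}")
```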

  6. Computational Models of Face Perception.

    Science.gov (United States)

    Martinez, Aleix M

    2017-06-01

    Faces are one of the most important means of communication in humans. For example, a short glance at a person's face provides information on identity and emotional state. What are the computations the brain uses to solve these problems so accurately and seemingly effortlessly? This article summarizes current research on computational modeling, a technique used to answer this question. Specifically, my research studies the hypothesis that this algorithm is tasked to solve the inverse problem of production. For example, to recognize identity, our brain needs to identify shape and shading image features that are invariant to facial expression, pose and illumination. Similarly, to recognize emotion, the brain needs to identify shape and shading features that are invariant to identity, pose and illumination. If one defines the physics equations that render an image under different identities, expressions, poses and illuminations, then gaining invariance to these factors is readily resolved by computing the inverse of this rendering function. I describe our current understanding of the algorithms used by our brains to resolve this inverse problem. I also discuss how these results are driving research in computer vision to design computer systems that are as accurate, robust and efficient as humans.

  7. Computer Model Locates Environmental Hazards

    Science.gov (United States)

    2008-01-01

    Catherine Huybrechts Burton founded San Francisco-based Endpoint Environmental (2E) LLC in 2005 while she was a student intern and project manager at Ames Research Center with NASA's DEVELOP program. The 2E team created the Tire Identification from Reflectance model, which algorithmically processes satellite images using turnkey technology to retain only the darkest parts of an image. This model allows 2E to locate piles of rubber tires, which often are stockpiled illegally and cause hazardous environmental conditions and fires.
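
    The model description above amounts to keeping only the darkest pixels of a satellite image as candidate tire piles; a generic thresholding sketch of that operation (not the 2E model itself, with random data standing in for imagery):

```python
# Keeping only the darkest pixels of an image, the basic operation described
# above.  This is a generic sketch, not the Tire Identification model itself.
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8))    # stand-in for a satellite image band

dark_threshold = np.percentile(image, 10)    # keep roughly the darkest 10% of pixels
dark_mask = image <= dark_threshold

print("threshold:", dark_threshold)
print("candidate dark-object pixels:", int(dark_mask.sum()))
```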

  8. Cosmic logic: a computational model

    Science.gov (United States)

    Vanchurin, Vitaly

    2016-02-01

    We initiate a formal study of logical inferences in context of the measure problem in cosmology or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal that halts in finite time and immortal that runs forever. In context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.

  9. Python for Scientific Computing Education: Modeling of Queueing Systems

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2014-01-01

    Full Text Available In this paper, we present a methodology for introducing scientific computing based on model-centered learning. We propose multiphase queueing systems as a basis for learning objects. We use Python and parallel programming for implementing the models and present the computer code and results of stochastic simulations.
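
    The paper's learning objects are multiphase queueing systems; as a minimal indication of the approach (not the authors' code), a single-server M/M/1 simulation in Python is sketched below. A multiphase system would chain several such servers.

```python
# Minimal stochastic simulation of a single-server (M/M/1) queue, indicating
# the style of model-centered learning object described above.  Not the
# authors' code; a multiphase system would chain several such servers.
import random

def simulate_mm1(arrival_rate, service_rate, n_customers, seed=1):
    random.seed(seed)
    arrival_time = 0.0
    server_free_at = 0.0
    total_wait = 0.0
    for _ in range(n_customers):
        arrival_time += random.expovariate(arrival_rate)     # next arrival
        start_service = max(arrival_time, server_free_at)    # wait if server busy
        total_wait += start_service - arrival_time
        server_free_at = start_service + random.expovariate(service_rate)
    return total_wait / n_customers

# Queueing theory predicts a mean wait of rho / (mu - lambda) = 4.0 here.
print(simulate_mm1(arrival_rate=0.8, service_rate=1.0, n_customers=200_000))
```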

  10. Generation of river discharge using water balance computer model ...

    African Journals Online (AJOL)

    The paper presents a study on river discharge generation using a water balance computer model. The results of the data generated shows that the computer program designed gave a good prediction of the recorded discharge within 95% confidence interval. The model is therefore recommended for other catchments with ...

  11. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
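
    For binary outputs the models described above reduce to a logistic function of linear and quadratic projections of the stimulus; a minimal sketch of such a second-order response function follows (not the authors' fitting code; parameters are arbitrary rather than fitted).

```python
# A second-order logistic response function of the kind described above:
# spike probability depends on linear and quadratic projections of the stimulus.
# Parameters here are arbitrary; fitting them to data is the maximum-noise-entropy
# / conditional-random-field step discussed in the abstract.
import numpy as np

def spike_probability(stimulus, a, h, J):
    """P(spike | stimulus) = 1 / (1 + exp(-(a + h.s + s.J.s)))."""
    quadratic = stimulus @ J @ stimulus
    return 1.0 / (1.0 + np.exp(-(a + h @ stimulus + quadratic)))

rng = np.random.default_rng(0)
s = rng.normal(size=3)                      # one stimulus sample, 3 dimensions
a = -1.0                                    # bias term (sets the mean firing rate)
h = np.array([0.5, -0.2, 0.1])              # first-order (linear) kernel
J = np.diag([0.3, 0.0, -0.1])               # second-order kernel

print(spike_probability(s, a, h, J))
```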

  12. Computational Models of Rock Failure

    Science.gov (United States)

    May, Dave A.; Spiegelman, Marc

    2017-04-01

    Practitioners in computational geodynamics, as per many other branches of applied science, typically do not analyse the underlying PDE's being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or simply have not been proven. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDE's, we first attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers / risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of

  13. Molecular Sieve Bench Testing and Computer Modeling

    Science.gov (United States)

    Mohamadinejad, Habib; DaLee, Robert C.; Blackmon, James B.

    1995-01-01

    The design of an efficient four-bed molecular sieve (4BMS) CO2 removal system for the International Space Station depends on many mission parameters, such as duration, crew size, cost of power, volume, fluid interface properties, etc. A need for space vehicle CO2 removal system models capable of accurately performing extrapolated hardware predictions is inevitable due to the change of the parameters which influences the CO2 removal system capacity. The purpose is to investigate the mathematical techniques required for a model capable of accurate extrapolated performance predictions and to obtain test data required to estimate mass transfer coefficients and verify the computer model. Models have been developed to demonstrate that the finite difference technique can be successfully applied to sorbents and conditions used in spacecraft CO2 removal systems. The nonisothermal, axially dispersed, plug flow model with linear driving force for 5X sorbent and pore diffusion for silica gel are then applied to test data. A more complex model, a non-darcian model (two dimensional), has also been developed for simulation of the test data. This model takes into account the channeling effect on column breakthrough. Four FORTRAN computer programs are presented: a two-dimensional model of flow adsorption/desorption in a packed bed; a one-dimensional model of flow adsorption/desorption in a packed bed; a model of thermal vacuum desorption; and a model of a tri-sectional packed bed with two different sorbent materials. The programs are capable of simulating up to four gas constituents for each process, which can be increased with a few minor changes.

  14. The Model of Computation of CUDA and its Formal Semantics

    OpenAIRE

    Habermaier, Axel

    2011-01-01

    We formalize the model of computation of modern graphics cards based on the specification of Nvidia's Compute Unified Device Architecture (CUDA). CUDA programs are executed by thousands of threads concurrently and have access to several different types of memory with unique access patterns and latencies. The underlying hardware uses a single instruction, multiple threads execution model that groups threads into warps. All threads of the same warp execute the program in lockstep. If threads of...

  15. Computational modeling in cognitive science: a manifesto for change.

    Science.gov (United States)

    Addyman, Caspar; French, Robert M

    2012-07-01

    Computational modeling has long been one of the traditional pillars of cognitive science. Unfortunately, the computer models of cognition being developed today have not kept up with the enormous changes that have taken place in computer technology and, especially, in human-computer interfaces.  For all intents and purposes, modeling is still done today as it was 25, or even 35, years ago. Everyone still programs in his or her own favorite programming language, source code is rarely made available, accessibility of models to non-programming researchers is essentially non-existent, and even for other modelers, the profusion of source code in a multitude of programming languages, written without programming guidelines, makes it almost impossible to access, check, explore, re-use, or continue to develop. It is high time to change this situation, especially since the tools are now readily available to do so. We propose that the modeling community adopt three simple guidelines that would ensure that computational models would be accessible to the broad range of researchers in cognitive science. We further emphasize the pivotal role that journal editors must play in making computational models accessible to readers of their journals. Copyright © 2012 Cognitive Science Society, Inc.

  16. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  17. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of the main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology...... adoption theories, such as Diffusion of Innovations, Technology Acceptance Model, Unified Theory of Acceptance and Use of Technology. Further on, a research model for identification of Cloud Computing Adoption factors from a business model perspective is presented. The following business model building...

  18. Modeling Computer Virus and Its Dynamics

    OpenAIRE

    Peng, Mei; He, Xing; Huang, Junjian; Dong, Tao

    2013-01-01

    Based on the observations that a computer can be infected by infected and exposed computers, and that some computers in susceptible or exposed status can gain immunity through antivirus capability, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for the computer virus spreading on the internet, is determined. Second, this model has a virus-free equilibrium P0, which means that th...

  19. Computationally intensive econometrics using a distributed matrix-programming language.

    Science.gov (United States)

    Doornik, Jurgen A; Hendry, David F; Shephard, Neil

    2002-06-15

    This paper reviews the need for powerful computing facilities in econometrics, focusing on concrete problems which arise in financial economics and in macroeconomics. We argue that the profession is being held back by the lack of easy-to-use generic software which is able to exploit the availability of cheap clusters of distributed computers. Our response is to extend, in a number of directions, the well-known matrix-programming interpreted language Ox developed by the first author. We note three possible levels of extensions: (i) Ox with parallelization explicit in the Ox code; (ii) Ox with a parallelized run-time library; and (iii) Ox with a parallelized interpreter. This paper studies and implements the first case, emphasizing the need for deterministic computing in science. We give examples in the context of financial economics and time-series modelling.

  20. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and their performance characteristics of such hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics using the identified objectives of computing which can be used in any platform, any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  1. Experiences with Efficient Methodologies for Teaching Computer Programming to Geoscientists

    Science.gov (United States)

    Jacobs, Christian T.; Gorman, Gerard J.; Rees, Huw E.; Craig, Lorraine E.

    2016-01-01

    Computer programming was once thought of as a skill required only by professional software developers. But today, given the ubiquitous nature of computation and data science it is quickly becoming necessary for all scientists and engineers to have at least a basic knowledge of how to program. Teaching how to program, particularly to those students…

  2. Computer Programs for Characteristic Modes of Bodies of Revolution

    Science.gov (United States)

    Computer programs are given for calculating the characteristic currents and characteristic gain patterns of conducting bodies of revolution. Also given are computer programs for using these characteristic currents in aperture radiation and plane-wave scattering problems. Plot programs for use with

  3. 01010000 01001100 01000001 01011001: Play Elements in Computer Programming

    Science.gov (United States)

    Breslin, Samantha

    2013-01-01

    This article explores the role of play in human interaction with computers in the context of computer programming. The author considers many facets of programming including the literary practice of coding, the abstract design of programs, and more mundane activities such as testing, debugging, and hacking. She discusses how these incorporate the…

  4. ROUTES: a computer program for preliminary route location.

    Science.gov (United States)

    S.E. Reutebuch

    1988-01-01

    An analytical description of the ROUTES computer program is presented. ROUTES is part of the integrated preliminary harvest- and transportation-planning software package, PLANS. The ROUTES computer program is useful where grade and sideslope limitations are important in determining routes for vehicular travel. With the program, planners can rapidly identify route...

  5. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India, during December 17 – 19, 2015. The book is enriched with innovations in broad areas of research such as computational modeling, computational intelligence and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The volume encompasses theory and applications, providing design, analysis and modeling of the aforementioned key areas.

  6. Generating Turing Machines by Use of Other Computation Models

    Directory of Open Access Journals (Sweden)

    Leszek Dubiel

    2003-01-01

    Full Text Available For each problem that can be solved there exists an algorithm, which can be described as a Turing machine program. Because the Turing machine is a very simple model, such programs tend to be very complicated and hard for a human to analyse. The best practice for solving a given type of problem is to define a new model of computation that allows quick and easy programming, and then to emulate its operation with a Turing machine. This article shows how to define a suitable model for computation on natural numbers and defines a Turing machine that emulates its operation.
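    As a toy illustration of the idea (define a convenient model, then emulate it with a Turing machine), the sketch below simulates a single-tape Turing machine in Python and uses it to increment a natural number written in unary. The transition table is an invented example, not the machine constructed in the article.

```python
# Minimal single-tape Turing machine simulator; the increment machine below
# is an illustrative example, not the one defined in the article.
def run_turing_machine(tape, transitions, state="q0", blank="_", halt="qH"):
    """Simulate a Turing machine given transitions:
    (state, symbol) -> (new_state, write_symbol, move) with move in {-1, +1}."""
    tape = dict(enumerate(tape))
    head = 0
    while state != halt:
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Increment a unary natural number: skip the existing 1s, append one more.
inc = {
    ("q0", "1"): ("q0", "1", +1),   # move right over the 1s
    ("q0", "_"): ("qH", "1", +1),   # write an extra 1 and halt
}
print(run_turing_machine("111", inc))  # '1111', i.e. 3 + 1 = 4
```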

  7. Some queuing network models of computer systems

    Science.gov (United States)

    Herndon, E. S.

    1980-01-01

    Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.
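    The flavour of such calculations can be conveyed by exact Mean Value Analysis for a closed, single-class network, shown below in Python rather than on the SR-52; the row-by-row G and H matrix manipulation of the original algorithm is not reproduced, and the service demands are invented.

```python
# Hedged sketch: exact Mean Value Analysis (MVA) for a closed single-class
# queueing network. Visit counts and service times are illustrative values.
def mva(visits, service_times, n_jobs):
    """Return system throughput and per-station mean queue lengths."""
    K = len(visits)
    q = [0.0] * K                        # queue lengths at population n - 1
    for n in range(1, n_jobs + 1):
        # residence time at each station with one more job in the network
        r = [visits[k] * service_times[k] * (1.0 + q[k]) for k in range(K)]
        x = n / sum(r)                   # system throughput
        q = [x * r[k] for k in range(K)]
    return x, q

# Example: a CPU and two disks serving a workload of 6 circulating jobs.
throughput, queues = mva(visits=[1.0, 0.6, 0.4],
                         service_times=[0.02, 0.03, 0.025],
                         n_jobs=6)
print(throughput, queues)
```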

  8. Optimization and mathematical modeling in computer architecture

    CERN Document Server

    Sankaralingam, Karu; Nowatzki, Tony

    2013-01-01

    In this book we give an overview of modeling techniques used to describe computer systems to mathematical optimization tools. We give a brief introduction to various classes of mathematical optimization frameworks with special focus on mixed integer linear programming which provides a good balance between solver time and expressiveness. We present four detailed case studies -- instruction set customization, data center resource management, spatial architecture scheduling, and resource allocation in tiled architectures -- showing how MILP can be used and quantifying by how much it outperforms t

  9. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. Biocellion: accelerating computer simulation of multicellular biological system models

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  11. Computing Models for FPGA-Based Accelerators

    Science.gov (United States)

    Herbordt, Martin C.; Gu, Yongfeng; VanCourt, Tom; Model, Josh; Sukhwani, Bharat; Chiu, Matt

    2011-01-01

    Field-programmable gate arrays are widely considered as accelerators for compute-intensive applications. A critical phase of FPGA application development is finding and mapping to the appropriate computing model. FPGA computing enables models with highly flexible fine-grained parallelism and associative operations such as broadcast and collective response. Several case studies demonstrate the effectiveness of using these computing models in developing FPGA applications for molecular modeling. PMID:21603152

  12. A multilingual programming model for coupled systems.

    Energy Technology Data Exchange (ETDEWEB)

    Ong, E. T.; Larson, J. W.; Norris, B.; Tobis, M.; Steder, M.; Jacob, R. L.; Mathematics and Computer Science; Univ. of Wisconsin; Univ. of Chicago; The Australian National Univ.

    2008-01-01

    Multiphysics and multiscale simulation systems share a common software requirement-infrastructure to implement data exchanges between their constituent parts-often called the coupling problem. On distributed-memory parallel platforms, the coupling problem is complicated by the need to describe, transfer, and transform distributed data, known as the parallel coupling problem. Parallel coupling is emerging as a new grand challenge in computational science as scientists attempt to build multiscale and multiphysics systems on parallel platforms. An additional coupling problem in these systems is language interoperability between their constituent codes. We have created a multilingual parallel coupling programming model based on a successful open-source parallel coupling library, the Model Coupling Toolkit (MCT). This programming model's capabilities reach beyond MCT's native Fortran implementation to include bindings for the C++ and Python programming languages. We describe the method used to generate the interlanguage bindings. This approach enables an object-based programming model for implementing parallel couplings in non-Fortran coupled systems and in systems with language heterogeneity. We describe the C++ and Python versions of the MCT programming model and provide short examples. We report preliminary performance results for the MCT interpolation benchmark. We describe a major Python application that uses the MCT Python bindings, a Python implementation of the control and coupling infrastructure for the community climate system model. We conclude with a discussion of the significance of this work to productivity computing in multidisciplinary computational science.

  13. Computer models for economic and silvicultural decisions

    Science.gov (United States)

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decisionmaking to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  14. The ASC Sequoia Programming Model

    Energy Technology Data Exchange (ETDEWEB)

    Seager, M

    2008-08-06

    In the late 1980's and early 1990's, Lawrence Livermore National Laboratory was deeply engrossed in determining the next generation programming model for the Integrated Design Codes (IDC) beyond vectorization for the Cray 1s series of computers. The vector model, developed in mid 1970's first for the CDC 7600 and later extended from stack based vector operation to memory to memory operations for the Cray 1s, lasted approximately 20 years (See Slide 5). The Cray vector era was deemed an extremely long lived era as it allowed vector codes to be developed over time (the Cray 1s were faster in scalar mode than the CDC 7600) with vector unit utilization increasing incrementally over time. The other attributes of the Cray vector era at LLNL were that we developed, supported and maintained the Operating System (LTSS and later NLTSS), communications protocols (LINCS), Compilers (Civic Fortran77 and Model), operating system tools (e.g., batch system, job control scripting, loaders, debuggers, editors, graphics utilities, you name it) and math and highly machine optimized libraries (e.g., SLATEC, and STACKLIB). Although LTSS was adopted by Cray for early system generations, they later developed COS and UNICOS operating systems and environment on their own. In the late 1970s and early 1980s two trends appeared that made the Cray vector programming model (described above including both the hardware and system software aspects) seem potentially dated and slated for major revision. These trends were the appearance of low cost CMOS microprocessors and their attendant, departmental and mini-computers and later workstations and personal computers. With the wide spread adoption of Unix in the early 1980s, it appeared that LLNL (and the other DOE Labs) would be left out of the mainstream of computing without a rapid transition to these 'Killer Micros' and modern OS and tools environments. The other interesting advance in the period is that systems were being

  15. Integrating Numerical Computation into the Modeling Instruction Curriculum

    CERN Document Server

    Caballero, Marcos D; Aiken, John M; Douglas, Scott S; Scanlon, Erin M; Thoms, Brian; Schatz, Michael F

    2012-01-01

    We describe a way to introduce physics high school students with no background in programming to computational problem-solving experiences. Our approach builds on the great strides made by the Modeling Instruction reform curriculum. This approach emphasizes the practices of "Developing and using models" and "Computational thinking" highlighted by the NRC K-12 science standards framework. We taught 9th-grade students in a Modeling-Instruction-based physics course to construct computational models using the VPython programming environment. Numerical computation within the Modeling Instruction curriculum provides coherence among the curriculum's different force and motion models, links the various representations which the curriculum employs, and extends the curriculum to include real-world problems that are inaccessible to a purely analytic approach.

  16. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  17. Computational Modeling and Simulation of Developmental ...

    Science.gov (United States)

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher order-predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems Toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro HTS data ToxCast. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative predic

  18. Macroevolution simulated with autonomously replicating computer programs.

    Science.gov (United States)

    Yedid, Gabriel; Bell, Graham

    The process of adaptation occurs on two timescales. In the short term, natural selection merely sorts the variation already present in a population, whereas in the longer term genotypes quite different from any that were initially present evolve through the cumulation of new mutations. The first process is described by the mathematical theory of population genetics. However, this theory begins by defining a fixed set of genotypes and cannot provide a satisfactory analysis of the second process because it does not permit any genuinely new type to arise. The evolutionary outcome of selection acting on novel variation arising over long periods is therefore difficult to predict. The classical problem of this kind is whether 'replaying the tape of life' would invariably lead to the familiar organisms of the modern biota. Here we study the long-term behaviour of populations of autonomously replicating computer programs and find that the same type, introduced into the same simple environment, evolves on any given occasion along a unique trajectory towards one of many well-adapted end points.

  19. Dynamical Models for Computer Viruses Propagation

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Full Text Available Nowadays, digital computer systems and networks are the main engineering tools, being used in planning, design, operation, and control of all sizes of building, transportation, machinery, business, and life maintaining devices. Consequently, computer viruses became one of the most important sources of uncertainty, reducing the reliability of vital activities. A lot of antivirus programs have been developed, but they are limited to detecting and removing infections, based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, an attempt to model computer virus propagation dynamics relates it to other notable events occurring in the network, permitting preventive policies to be established in network management. Data from three different viruses are collected on the Internet and two different identification techniques, autoregressive and Fourier analyses, are applied, showing that it is possible to forecast the dynamics of a new virus propagation by using the data collected from other viruses that formerly infected the network.
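    A minimal sketch of the autoregressive identification step is given below, assuming synthetic infection counts in place of the collected virus data: an AR model is fitted by ordinary least squares and iterated forward to forecast the next observations.

```python
# Illustrative AR identification and forecasting; the synthetic series stands
# in for the virus data collected in the paper, which are not reproduced here.
import numpy as np

def fit_ar(series, order):
    """Estimate AR(order) coefficients by ordinary least squares."""
    y = series[order:]
    X = np.column_stack([series[order - i:-i] for i in range(1, order + 1)])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def forecast(series, coeffs, steps):
    """Iterate the fitted AR recursion forward for a number of steps."""
    history = list(series)
    for _ in range(steps):
        lags = history[-1 : -len(coeffs) - 1 : -1]   # most recent values first
        history.append(float(np.dot(coeffs, lags)))
    return history[len(series):]

rng = np.random.default_rng(0)
counts = np.cumsum(rng.poisson(5, size=120)).astype(float)  # synthetic infections
phi = fit_ar(counts, order=3)
print(forecast(counts, phi, steps=5))
```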

  20. Basis And Application Of The CARES/LIFE Computer Program

    Science.gov (United States)

    Nemeth, Noel N.; Janosik, Lesley A.; Gyekenyesi, John P.; Powers, Lynn M.

    1996-01-01

    Report discusses physical and mathematical basis of Ceramics Analysis and Reliability Evaluation of Structures LIFE prediction (CARES/LIFE) computer program, described in "Program for Evaluation of Reliability of Ceramic Parts" (LEW-16018).

  1. Positioning Continuing Education Computer Programs for the Corporate Market.

    Science.gov (United States)

    Tilney, Ceil

    1993-01-01

    Summarizes the findings of the market assessment phase of Bellevue Community College's evaluation of its continuing education computer training program. Indicates that marketing efforts must stress program quality and software training to help overcome strong antiacademic client sentiment. (MGB)

  2. Soft Computing Models in Industrial and Environmental Applications

    CERN Document Server

    Abraham, Ajith; Corchado, Emilio; 7th International Conference, SOCO’12

    2013-01-01

    This volume of Advances in Intelligent and Soft Computing contains accepted papers presented at SOCO 2012, held in the beautiful and historic city of Ostrava (Czech Republic), in September 2012.   Soft computing represents a collection or set of computational techniques in machine learning, computer science and some engineering disciplines, which investigate, simulate, and analyze very complex issues and phenomena.   After a thorough peer-review process, the SOCO 2012 International Program Committee selected 75 papers for publication in these conference proceedings, representing an acceptance rate of 38%. In this edition a special emphasis was put on the organization of special sessions. Three special sessions were organized on relevant topics such as: Soft computing models for Control Theory & Applications in Electrical Engineering, Soft computing models for biomedical signals and data processing and Advanced Soft Computing Methods in Computer Vision and Data Processing.   The selecti...

  3. Thinking processes used by high-performing students in a computer programming task

    Directory of Open Access Journals (Sweden)

    Marietjie Havenga

    2011-07-01

    Full Text Available Computer programmers must be able to understand programming source code and write programs that execute complex tasks to solve real-world problems. This article is a transdisciplinary study at the intersection of computer programming, education and psychology. It outlines the role of mental processes in the process of programming and indicates how successful thinking processes can support computer science students in writing correct and well-defined programs. A mixed methods approach was used to better understand the thinking activities and programming processes of participating students. Data collection involved both computer programs and students’ reflective thinking processes recorded in their journals. This enabled analysis of psychological dimensions of participants’ thinking processes and their problem-solving activities as they considered a programming problem. Findings indicate that the cognitive, reflective and psychological processes used by high-performing programmers contributed to their success in solving a complex programming problem. Based on the thinking processes of high performers, we propose a model of integrated thinking processes, which can support computer programming students. Keywords: Computer programming, education, mixed methods research, thinking processes.  Disciplines: Computer programming, education, psychology

  4. Introduction to computation and modeling for differential equations

    CERN Document Server

    Edsberg, Lennart

    2008-01-01

    An introduction to scientific computing for differential equations. Introduction to Computation and Modeling for Differential Equations provides a unified and integrated view of numerical analysis, mathematical modeling in applications, and programming to solve differential equations, which is essential in problem-solving across many disciplines, such as engineering, physics, and economics. This book successfully introduces readers to the subject through a unique "Five-M" approach: Modeling, Mathematics, Methods, MATLAB, and Multiphysics. This approach facilitates a thorough understanding of h

  5. Enhanced absorption cycle computer model

    Science.gov (United States)

    Grossman, G.; Wilk, M.

    1993-09-01

    Absorption heat pumps have received renewed and increasing attention in the past two decades. The rising cost of electricity has made the particular features of this heat-powered cycle attractive for both residential and industrial applications. Solar-powered absorption chillers, gas-fired domestic heat pumps, and waste-heat-powered industrial temperature boosters are a few of the applications recently subjected to intensive research and development. The absorption heat pump research community has begun to search for both advanced cycles in various multistage configurations and new working fluid combinations with potential for enhanced performance and reliability. The development of working absorption systems has created a need for reliable and effective system simulations. A computer code has been developed for simulation of absorption systems at steady state in a flexible and modular form, making it possible to investigate various cycle configurations with different working fluids. The code is based on unit subroutines containing the governing equations for the system's components and property subroutines containing thermodynamic properties of the working fluids. The user conveys to the computer an image of his cycle by specifying the different subunits and their interconnections. Based on this information, the program calculates the temperature, flow rate, concentration, pressure, and vapor fraction at each state point in the system, and the heat duty at each unit, from which the coefficient of performance (COP) may be determined. This report describes the code and its operation, including improvements introduced into the present version. Simulation results are described for LiBr-H2O triple-effect cycles, LiCl-H2O solar-powered open absorption cycles, and NH3-H2O single-effect and generator-absorber heat exchange cycles. An appendix contains the user's manual.

  6. Towards the Epidemiological Modeling of Computer Viruses

    Directory of Open Access Journals (Sweden)

    Xiaofan Yang

    2012-01-01

    Full Text Available Epidemic dynamics of computer viruses is an emerging discipline aiming to understand the way that computer viruses spread on networks. This paper is intended to establish a series of rational epidemic models of computer viruses. First, a close inspection of some common characteristics shared by all typical computer viruses clearly reveals the flaws of previous models. Then, a generic epidemic model of viruses, named the SLBS model, is proposed. Finally, diverse generalizations of the SLBS model are suggested. We believe this work opens a door to the full understanding of how computer viruses prevail on the Internet.

  7. A Computer Program for Assessing Nuclear Safety Culture Impact

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of)

    2014-10-15

    Through several NPP accidents, including Fukushima Daiichi in 2011 and Chernobyl in 1986, a lack of safety culture was identified as one of the root causes. Due to its latent influence on safety performance, safety culture has become an important issue in safety research. Most studies describe how to evaluate the state of an organization's safety culture; however, they do not include the possibility that an accident occurs because of a lack of safety culture. A methodology for evaluating the impact of safety culture on NPP safety is therefore required. In this study, such a methodology, the SCII model, is suggested for assessing safety culture impact quantitatively using a PSA model, and a computer program is developed for its application. The program visualizes the SCIs and the SCIIs. It may contribute to comparing the level of safety culture among NPPs as well as improving the safety management of NPPs.

  8. Quantum Computation Beyond the Circuit Model

    OpenAIRE

    Jordan, Stephen P.

    2008-01-01

    The quantum circuit model is the most widely used model of quantum computation. It provides both a framework for formulating quantum algorithms and an architecture for the physical construction of quantum computers. However, several other models of quantum computation exist which provide useful alternative frameworks for both discovering new quantum algorithms and devising new physical implementations of quantum computers. In this thesis, I first present necessary background material for a ge...

  9. Computational modeling of epithelial tissues.

    Science.gov (United States)

    Smallwood, Rod

    2009-01-01

    There is an extensive literature on the computational modeling of epithelial tissues at all levels from subcellular to whole tissue. This review concentrates on behavior at the individual cell to whole tissue level, and particularly on organizational aspects, and provides an indication of where information from other areas, such as the modeling of angiogenesis, is relevant. The skin, and the lining of all of the body cavities (lung, gut, cervix, bladder etc) are epithelial tissues, which in a topological sense are the boundary between inside and outside the body. They are thin sheets of cells (usually of the order of 0.5 mm thick) without extracellular matrix, have a relatively simple structure, and contain few types of cells. They have important barrier, secretory and transport functions, which are essential for the maintenance of life, so homeostasis and wound healing are important aspects of the behavior of epithelial tissues. Carcinomas originate in epithelial tissues.There are essentially two approaches to modeling tissues--to start at the level of the tissue (i.e., a length scale of the order of 1 mm) and develop generalized equations for behavior (a continuum approach); or to start at the level of the cell (i.e., a length scale of the order of 10 µm) and develop tissue behavior as an emergent property of cellular behavior (an individual-based approach). As will be seen, these are not mutually exclusive approaches, and they come in a variety of flavors.

  10. Model dynamics for quantum computing

    Science.gov (United States)

    Tabakin, Frank

    2017-08-01

    A model master equation suitable for quantum computing dynamics is presented. In an ideal quantum computer (QC), a system of qubits evolves in time unitarily and, by virtue of their entanglement, interfere quantum mechanically to solve otherwise intractable problems. In the real situation, a QC is subject to decoherence and attenuation effects due to interaction with an environment and with possible short-term random disturbances and gate deficiencies. The stability of a QC under such attacks is a key issue for the development of realistic devices. We assume that the influence of the environment can be incorporated by a master equation that includes unitary evolution with gates, supplemented by a Lindblad term. Lindblad operators of various types are explored; namely, steady, pulsed, gate friction, and measurement operators. In the master equation, we use the Lindblad term to describe short time intrusions by random Lindblad pulses. The phenomenological master equation is then extended to include a nonlinear Beretta term that describes the evolution of a closed system with increasing entropy. An external Bath environment is stipulated by a fixed temperature in two different ways. Here we explore the case of a simple one-qubit system in preparation for generalization to multi-qubit, qutrit and hybrid qubit-qutrit systems. This model master equation can be used to test the stability of memory and the efficacy of quantum gates. The properties of such hybrid master equations are explored, with emphasis on the role of thermal equilibrium and entropy constraints. Several significant properties of time-dependent qubit evolution are revealed by this simple study.
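    A stripped-down numerical sketch of this kind of master equation is shown below for a single qubit with a drive Hamiltonian and one amplitude-damping Lindblad operator, integrated with a fixed-step RK4 scheme; the pulsed Lindblad terms, the Beretta term and the bath-temperature treatment of the paper are omitted, and all rates are illustrative.

```python
# Hedged one-qubit Lindblad sketch with illustrative parameters; it is not the
# hybrid master equation of the paper, only its simplest dissipative core.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sm = np.array([[0, 1], [0, 0]], dtype=complex)   # lowering operator
H = 0.5 * sx                                     # drive Hamiltonian (hbar = 1)
gamma = 0.05                                     # damping rate
L = np.sqrt(gamma) * sm

def drho_dt(rho):
    """Lindblad equation: -i[H,rho] + L rho L^+ - (1/2){L^+ L, rho}."""
    comm = -1j * (H @ rho - rho @ H)
    diss = L @ rho @ L.conj().T - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L)
    return comm + diss

rho = np.array([[0, 0], [0, 1]], dtype=complex)  # start in the excited state |1>
dt, steps = 0.01, 2000
for _ in range(steps):                           # classical RK4 time stepping
    k1 = drho_dt(rho)
    k2 = drho_dt(rho + 0.5 * dt * k1)
    k3 = drho_dt(rho + 0.5 * dt * k2)
    k4 = drho_dt(rho + dt * k3)
    rho = rho + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

print(np.real(np.trace(rho)), np.real(rho[1, 1]))  # trace stays 1; excited population decays
```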

  11. An integrated introduction to computer graphics and geometric modeling

    CERN Document Server

    Goldman, Ronald

    2009-01-01

    … this book may be the first book on geometric modelling that also covers computer graphics. In addition, it may be the first book on computer graphics that integrates a thorough introduction to 'freedom' curves and surfaces and to the mathematical foundations for computer graphics. … the book is well suited for an undergraduate course. … The entire book is very well presented and obviously written by a distinguished and creative researcher and educator. It certainly is a textbook I would recommend. …-Computer-Aided Design, 42, 2010… Many books concentrate on computer programming and soon beco

  12. Transforming High School Physics with Modeling and Computation

    CERN Document Server

    Aiken, John M

    2013-01-01

    The Engage to Excel (PCAST) report, the National Research Council's Framework for K-12 Science Education, and the Next Generation Science Standards all call for transforming the physics classroom into an environment that teaches students real scientific practices. This work describes the early stages of one such attempt to transform a high school physics classroom. Specifically, a series of model-building and computational modeling exercises were piloted in a ninth grade Physics First classroom. Student use of computation was assessed using a proctored programming assignment, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Student views on computation and its link to mechanics was assessed with a written essay and a series of think-aloud interviews. This pilot study shows computation's ability for connecting scientific practice to the high school science classroom.
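    The proctored assignment described above, a baseball in motion, can be sketched along the following lines (plain Python here rather than VPython, with typical textbook values rather than the study's actual exercise): a projectile with quadratic air drag stepped forward by an Euler-Cromer update.

```python
# Illustrative baseball-in-flight model; mass, drag constant and launch values
# are common textbook assumptions, not taken from the classroom study.
import numpy as np

m = 0.145                  # baseball mass [kg]
g = np.array([0.0, -9.8])  # gravitational acceleration [m/s^2]
C = 0.0013                 # lumped quadratic-drag constant [kg/m]

pos = np.array([0.0, 1.0])                                 # initial position [m]
vel = np.array([30.0 * np.cos(0.6), 30.0 * np.sin(0.6)])   # ~30 m/s at ~34 degrees
dt = 0.001

while pos[1] > 0.0:                            # integrate until the ball lands
    drag = -C * np.linalg.norm(vel) * vel      # quadratic drag opposes the motion
    acc = g + drag / m
    vel = vel + acc * dt                       # Euler-Cromer: update velocity first
    pos = pos + vel * dt

print("range with drag:", round(pos[0], 1), "m")
```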

  13. Computational Hydrodynamics: How Portable and Scalable Are Heterogeneous Programming Paradigms?

    DEFF Research Database (Denmark)

    Pawlak, Wojciech; Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter

    New many-core era applications at the interface of mathematics and computer science adopt modern parallel programming paradigms and expose parallelism through proper algorithms. We present new performance results for a novel massively parallel free surface wave model suitable for advanced......-device system sizes from desktops to large HPC systems such as superclusters and in the cloud utilizing heterogeneous devices like multi-core CPUs, GPUs, and Xeon Phi coprocessors. The numerical efficiency is evaluated on heterogeneous devices like multi-core CPUs, GPUs and Xeon Phi coprocessors to test...

  14. Computational Modeling aided Near Net Shape Manufacturing for Aluminum Alloys Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This program will focus on developing and validating computational models for near-net shape processing of aluminum alloys. Computational models will be developed...

  15. Generic Mathematical Programming Formulation and Solution for Computer-Aided Molecular Design

    DEFF Research Database (Denmark)

    Zhang, Lei; Cignitti, Stefano; Gani, Rafiqul

    2015-01-01

    This short communication presents a generic mathematical programming formulation for Computer-Aided Molecular Design (CAMD). A given CAMD problem, based on target properties, is formulated as a Mixed Integer Linear/Non-Linear Program (MILP/MINLP). The mathematical programming model presented here...
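    A deliberately small instance of such a formulation is sketched below using the PuLP modelling library: integer variables count how many of each functional group appear in the molecule, a group-contribution estimate must meet a property target, and a simple valency rule keeps the structure feasible. Group names, contribution values and bounds are illustrative assumptions, not the communication's actual model.

```python
# Hedged toy CAMD MILP: pick group counts so a group-contribution property
# estimate meets a target. All numbers here are illustrative placeholders.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpStatus

groups = ["CH3", "CH2", "OH"]
tb_contrib = {"CH3": 23.6, "CH2": 22.9, "OH": 92.9}   # boiling-point contributions [K]
valence = {"CH3": 1, "CH2": 2, "OH": 1}

prob = LpProblem("camd_sketch", LpMinimize)
n = {g: LpVariable(f"n_{g}", lowBound=0, upBound=8, cat="Integer") for g in groups}

prob += lpSum(n[g] for g in groups)                          # objective: smallest molecule
prob += lpSum(tb_contrib[g] * n[g] for g in groups) >= 180   # property target
prob += lpSum((2 - valence[g]) * n[g] for g in groups) == 2  # acyclic valency rule
prob += lpSum(n[g] for g in groups) >= 2                     # at least two groups

prob.solve()
print(LpStatus[prob.status], {g: int(n[g].value()) for g in groups})
```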

  16. The Outlook for Computer Professions: 1985 Rewrites the Program.

    Science.gov (United States)

    Drake, Larry

    1986-01-01

    The author states that graduates of junior college programs who learn COBOL will continue to find jobs, but employers will increasingly seek college graduates when filling positions for computer programers and systems analysts. Areas of growth for computer applications (services, military, data communications, and artificial intelligence) are…

  17. Basic BASIC; An Introduction to Computer Programming in BASIC Language.

    Science.gov (United States)

    Coan, James S.

    With the increasing availability of computer access through remote terminals and time sharing, more and more schools and colleges are able to introduce programming to substantial numbers of students. This book is an attempt to incorporate computer programming, using the BASIC language, and the teaching of mathematics. The general approach of the book…

  18. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    This paper describes the mathematical basis and computational framework of a computer program developed for short circuit studies of electric power systems. The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric ...
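    The core balanced-fault calculation behind such a program can be sketched as follows, assuming a made-up per-unit bus impedance matrix: the bolted three-phase fault current at a bus is the pre-fault voltage divided by the Thevenin (diagonal Zbus) impedance at that bus, and the post-fault bus voltages follow from the corresponding Zbus column.

```python
# Illustrative balanced three-phase fault calculation; the 3-bus Zbus data are
# invented per-unit values, not taken from the paper or from SCAP itself.
import numpy as np

# Bus impedance matrix (Zbus) of a small 3-bus system, per unit
Zbus = np.array([[0.15j, 0.08j, 0.05j],
                 [0.08j, 0.20j, 0.10j],
                 [0.05j, 0.10j, 0.25j]])

v_prefault = 1.0          # pre-fault voltage, per unit
fault_bus = 1             # fault applied at bus 2 (0-indexed)

i_fault = v_prefault / Zbus[fault_bus, fault_bus]   # bolted three-phase fault current
print("fault current magnitude:", abs(i_fault), "pu")

# Post-fault voltages at all buses: V_i = V_prefault - Z_ik * I_fault
v_post = v_prefault - Zbus[:, fault_bus] * i_fault
print("post-fault bus voltages:", np.round(np.abs(v_post), 3))
```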

  19. Software survey: VOSviewer, a computer program for bibliometric mapping

    NARCIS (Netherlands)

    N.J.P. van Eck (Nees Jan); L. Waltman (Ludo)

    2010-01-01

    We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps.

  20. Near-Surface Seismic Velocity Data: A Computer Program For ...

    African Journals Online (AJOL)

    A computer program (NESURVELANA) has been developed in the Visual Basic programming language to carry out near-surface velocity analysis. The method of analysis includes algorithm design and Visual Basic code generation for plotting arrival time (ms) against geophone depth (m), employing the ...
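    The regression at the heart of such an analysis is sketched below in Python (the program itself is in Visual Basic), with invented sample measurements: a straight line is fitted to first-arrival time versus geophone depth, and the near-surface velocity is the reciprocal of the slope.

```python
# Illustrative uphole-style velocity estimate; depths and arrival times below
# are invented sample data, not measurements from the paper.
import numpy as np

depth_m = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])     # geophone depth [m]
time_ms = np.array([3.4, 6.9, 10.1, 13.6, 17.2, 20.5])   # first-arrival time [ms]

slope, intercept = np.polyfit(depth_m, time_ms / 1000.0, 1)  # fit time [s] vs depth
velocity = 1.0 / slope                                       # layer velocity [m/s]
print(f"near-surface velocity ~ {velocity:.0f} m/s, intercept {intercept * 1000:.1f} ms")
```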

  1. Case Studies of Liberal Arts Computer Science Programs

    Science.gov (United States)

    Baldwin, D.; Brady, A.; Danyluk, A.; Adams, J.; Lawrence, A.

    2010-01-01

    Many undergraduate liberal arts institutions offer computer science majors. This article illustrates how quality computer science programs can be realized in a wide variety of liberal arts settings by describing and contrasting the actual programs at five liberal arts colleges: Williams College, Kalamazoo College, the State University of New York…

  2. Programming language for computations in the Interkosmos program

    Science.gov (United States)

    Schmidt, K.

    1975-01-01

    The programming system for Intercosmos data processing, based on the structural programming theory, which considers a program as an ordered set of standardized elementary parts, from which the user programs are automatically generated, is described. The programs are comprised of several modules, which are briefly summarized. The general structure of the programming system is presented in a block diagram. A programming control language developed to formulate the problem quickly and completely is presented along with basic symbols which are characteristic of the Intercosmos programming system.

  3. Method and computer program product for maintenance and modernization backlogging

    Science.gov (United States)

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
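    The relation claimed in the embodiment is a straightforward sum, rendered below as a small Python function with placeholder figures.

```python
# Direct rendering of the stated relation; the example amounts are placeholders.
def future_facility_conditions(maintenance_cost, modernization_factor, backlog_factor):
    """Future facility conditions for one time period, per the claimed method."""
    return maintenance_cost + modernization_factor + backlog_factor

print(future_facility_conditions(maintenance_cost=1.2e6,
                                 modernization_factor=3.5e5,
                                 backlog_factor=4.8e5))
```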

  4. An Experimental Investigation of Computer Program Development Approaches and Computer Programming Metrics.

    Science.gov (United States)

    1979-12-01

    Technical Report TR-853, December 1979: An Experimental Investigation of Computer Program Development Approaches and Computer Programming Metrics. Approved for public release; distribution unlimited.

  5. Designing Educational Games for Computer Programming: A Holistic Framework

    Science.gov (United States)

    Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios

    2014-01-01

    Computer science is continuously evolving during the past decades. This has also brought forth new knowledge that should be incorporated and new learning strategies must be adopted for the successful teaching of all sub-domains. For example, computer programming is a vital knowledge area within computer science with constantly changing curriculum…

  6. Seventy Years of Computing in the Nuclear Weapons Program

    Energy Technology Data Exchange (ETDEWEB)

    Archer, Billy Joe [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-30

    Los Alamos has continuously been on the forefront of scientific computing since it helped found the field. This talk will explore the rich history of computing in the Los Alamos weapons program. The current status of computing will be discussed, as will the expectations for the near future.

  7. A Methodology for Teaching Computer Programming: first year students’ perspective

    OpenAIRE

    Bassey Isong

    2014-01-01

    The teaching of computer programming is one of the greatest challenges that has remained for years in computer science education. A particular case is the computer programming course for beginners. While traditional objectivist lecture-based approaches do not actively engage students in achieving their learning outcomes, we believe that integrating cutting-edge processes and practices such as agile methods into the teaching approach will provide leverage. Agile software development has gained...

  8. Attitude, Gender and Achievement in Computer Programming

    Science.gov (United States)

    Baser, Mustafa

    2013-01-01

    The aim of this research was to explore the relationship among students' attitudes toward programming, gender and academic achievement in programming. The scale used for measuring students' attitudes toward programming was developed by the researcher and consisted of 35 five-point Likert type items in four subscales. The scale was administered to…

  9. The Use of Molecular Modeling Programs in Medicinal Chemistry Instruction.

    Science.gov (United States)

    Harrold, Marc W.

    1992-01-01

    This paper describes and evaluates the use of a molecular modeling computer program (Alchemy II) in a pharmaceutical education program. Provided are the hardware requirements and basic program features as well as several examples of how this program and its features have been applied in the classroom. (GLR)

  10. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  11. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  12. ADAM: A computer program to simulate selective-breeding schemes for animals

    DEFF Research Database (Denmark)

    Pedersen, L D; Sørensen, A C; Henryon, M

    2009-01-01

    ADAM is a computer program that models selective breeding schemes for animals using stochastic simulation. The program simulates a population of animals and traces the genetic changes in the population under different selective breeding scenarios. It caters to different population structures......, genetic models, selection strategies, and mating designs. ADAM can be used to evaluate breeding schemes and generate genetic data to test statistical tools...

  13. Modeling Reality - How Computers Mirror Life

    Science.gov (United States)

    Bialynicki-Birula, Iwo; Bialynicka-Birula, Iwona

    2005-01-01

    The book Modeling Reality covers a wide range of fascinating subjects, accessible to anyone who wants to learn about the use of computer modeling to solve a diverse range of problems, but who does not possess a specialized training in mathematics or computer science. The material presented is pitched at the level of high-school graduates, even though it covers some advanced topics (cellular automata, Shannon's measure of information, deterministic chaos, fractals, game theory, neural networks, genetic algorithms, and Turing machines). These advanced topics are explained in terms of well known simple concepts: Cellular automata - Game of Life, Shannon's formula - Game of twenty questions, Game theory - Television quiz, etc. The book is unique in explaining in a straightforward, yet complete, fashion many important ideas, related to various models of reality and their applications. Twenty-five programs, written especially for this book, are provided on an accompanying CD. They greatly enhance its pedagogical value and make learning of even the more complex topics an enjoyable pleasure.

  14. Model for teaching distributed computing in a distance-based educational environment

    CSIR Research Space (South Africa)

    le Roux, P

    2010-10-01

    Full Text Available , teaching students in these new technologies remains a challenge. Even though several models for teaching computer programming and teaching programming in a distance-based educational environment (DEE) exist, limited literature is available on models...

  15. Quantum vertex model for reversible classical computing.

    Science.gov (United States)

    Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C

    2017-05-12

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  16. Quantum vertex model for reversible classical computing

    Science.gov (United States)

    Chamon, C.; Mucciolo, E. R.; Ruckenstein, A. E.; Yang, Z.-C.

    2017-05-01

    Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without `learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.

  17. Modeling Computer Virus and Its Dynamics

    Directory of Open Access Journals (Sweden)

    Mei Peng

    2013-01-01

    Full Text Available Based on the fact that a computer can be infected by both infected and exposed computers, and that some computers in the susceptible and exposed states can gain immunity through antivirus protection, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for the spread of the computer virus on the Internet, is determined. Second, this model has a virus-free equilibrium P0, which means that the infected part of the computer population disappears and the virus dies out; P0 is a globally asymptotically stable equilibrium if R0 < 1. If R0 > 1, then this model has a unique viral equilibrium P*, which means that the virus persists at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
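
    A compartment model of the kind sketched in this abstract is typically written as a small system of ordinary differential equations whose long-run behaviour is governed by the basic reproduction number R0. The following is a minimal, hypothetical SEIR-style sketch in Python; the rate constants and the exact form of the equations are illustrative placeholders, not the model or parameters of the cited paper.

        import numpy as np
        from scipy.integrate import odeint

        # Hypothetical SEIR-style computer-virus model; the rates below are
        # illustrative placeholders, not the parameters used in the cited paper.
        beta, sigma, gamma, alpha = 0.5, 0.3, 0.2, 0.05   # infection, latency, cure, immunization

        def virus_model(y, t):
            S, E, I, R = y                      # susceptible, exposed, infected, immune
            dS = -beta * S * (E + I) - alpha * S
            dE = beta * S * (E + I) - sigma * E - alpha * E
            dI = sigma * E - gamma * I
            dR = gamma * I + alpha * (S + E)
            return [dS, dE, dI, dR]

        t = np.linspace(0, 100, 1000)
        trajectory = odeint(virus_model, [0.97, 0.02, 0.01, 0.0], t)
        # Whether the infected fraction decays to zero or settles at an endemic
        # level depends on the basic reproduction number of the chosen rates.
        print(trajectory[-1])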

  18. The IceCube Computing Infrastructure Model

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Besides the big LHC experiments, a number of mid-size experiments are coming online which need to define new computing models to meet their processing and storage requirements. We present the hybrid computing model of IceCube, which leverages GRID models with a more flexible direct user model, as an example of a possible solution. In IceCube a central datacenter at UW-Madison serves as Tier-0, with a single Tier-1 datacenter at DESY Zeuthen. We describe the setup of the IceCube computing infrastructure and report on our experience in successfully provisioning the IceCube computing needs.

  19. Computational Models for Nonlinear Aeroelastic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  20. The CRAFT Fortran Programming Model

    Directory of Open Access Journals (Sweden)

    Douglas M. Pase

    1994-01-01

    Full Text Available Many programming models for massively parallel machines exist, and each has its advantages and disadvantages. In this article we present a programming model that combines features from other programming models that (1) can be efficiently implemented on present and future Cray Research massively parallel processor (MPP) systems and (2) are useful in constructing highly parallel programs. The model supports several styles of programming: message-passing, data parallel, global address (shared) data, and work-sharing. These styles may be combined within the same program. The model includes features that allow a user to define a program in terms of the behavior of the system as a whole, where the behavior of individual tasks is implicit from this systemic definition. (In general, features marked as shared are designed to support this perspective.) It also supports an opposite perspective, where a program may be defined in terms of the behaviors of individual tasks, and a program is implicitly the sum of the behaviors of all tasks. (Features marked as private are designed to support this perspective.) Users can exploit any combination of either set of features without ambiguity and thus are free to define a program from whatever perspective is most appropriate to the problem at hand.

  1. Generic Assessment Rubrics for Computer Programming Courses

    Science.gov (United States)

    Mustapha, Aida; Samsudin, Noor Azah; Arbaiy, Nurieze; Mohammed, Rozlini; Hamid, Isredza Rahmi

    2016-01-01

    In programming, one problem can usually be solved using different logics and constructs but still produce the same output. Sometimes students get marked down inappropriately if their solutions do not follow the answer scheme. In addition, lab exercises and programming assignments are not necessarily graded by the instructors but most of the time…

  2. Instructional Uses of the Computer: Program Force

    Science.gov (United States)

    Ostrander, P.

    1975-01-01

    Describes a program which simulates motion in two dimensions of a point mass subject to a force which is a function of position, velocity, or time. Sample applications are noted and a source of a complete list of applications and programs is given. (GH)

  3. Introduction of handheld computing to a family practice residency program.

    Science.gov (United States)

    Rao, Goutham

    2002-01-01

    Handheld computers are valuable practice tools. It is important for residency programs to introduce their trainees and faculty to this technology. This article describes a formal strategy to introduce handheld computing to a family practice residency program. Objectives were selected for the handheld computer training program that reflected skills physicians would find useful in practice. TRGpro handheld computers preloaded with a suite of medical reference programs, a medical calculator, and a database program were supplied to participants. Training consisted of four 1-hour modules, each with a written evaluation quiz. Participants completed a self-assessment questionnaire after the program to determine their ability to meet each objective. Sixty of the 62 participants successfully completed the training program. The mean composite score on quizzes was 36 of 40 (90%), with no significant differences by level of residency training. The mean self-rating of participants across all objectives was 3.31 of 4.00. Third-year residents had higher mean self-ratings than others (mean of group, 3.62). Participants were very comfortable with practical skills, such as using drug reference software, and less comfortable with theory, such as knowing the different types of handheld computers available. Structured training is a successful strategy for introducing handheld computing to a residency program.

  4. Teacher Training Programs for Computer Education and Computer Assisted Education in Turkey

    Science.gov (United States)

    Usun, Salih

    2007-01-01

    The aim of this descriptive study is to review the applications and problems on the teacher training programs for computer education and computer assisted education (CAE) in Turkey. The study, firstly, introduces some applications and major problems on using instructional media and computers in developing countries and instructional technology…

  5. A computer program for planimetric analysis of digitized images

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O; Homøe, P

    1992-01-01

    Planimetrical measurements are made to calculate the area of an entity. By digitizing the entity, the planimetrical measurements may be done by computer. This computer program was developed in conjunction with a research project involving measurement of the pneumatized cell system of the temporal bones as seen on X-rays. By placing the X-rays on a digitizer tablet and tracing the outline of the cell system, the area was calculated by the program. The calculated data and traced images could be stored and printed. The program is written in BASIC; the necessary hardware is an IBM-compatible personal computer, a digitizer tablet and a printer.

  6. Computational nanophotonics modeling and applications

    CERN Document Server

    Musa, Sarhan M

    2013-01-01

    This reference offers tools for engineers, scientists, biologists, and others working with the computational techniques of nanophotonics. It introduces the key concepts of computational methods in a manner that is easily digestible for newcomers to the field. The book also examines future applications of nanophotonics in the technical industry and covers new developments and interdisciplinary research in engineering, science, and medicine. It provides an overview of key computational nanophotonics methods and describes the technologies with an emphasis on how they work and their key benefits.

  7. APPLICATION OF CLOUD COMPUTING IN PROGRAMMING INTELLIGENT ELECTRIC NETWORKS IN PROSUMERS’ HOUSEHOLDS

    Directory of Open Access Journals (Sweden)

    Marek Horyński

    2016-11-01

    Full Text Available Currently, cloud computing models are being provided with an increasing number of new functionalities. The paper describes a laboratory model of an intelligent KNX system which makes it possible to test the practical use of cloud computing in programming components for this system. Cloud computing elements were used for this purpose. The innovative feature of services transferred to cloud computing models consists in the integration of advanced IT techniques and Internet-technology-based systems with user services, in this case with the management of an intelligent building system. The ETS5 software installed on a workstation is another important component of the station being discussed.

  8. Climate Ocean Modeling on Parallel Computers

    Science.gov (United States)

    Wang, P.; Cheng, B. N.; Chao, Y.

    1998-01-01

    Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.

  9. Computational Intelligence. Mortality Models for the Actuary

    NARCIS (Netherlands)

    Willemse, W.J.

    2001-01-01

    This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, this thesis deals with life insurance where mortality modelling is important. Actuaries use ancient models (mortality laws) from the nineteenth century, for example Gompertz' and Makeham's

  10. Intelligent physical blocks for introducing computer programming in developing countries

    CSIR Research Space (South Africa)

    Smith, Adrew C

    2007-05-01

    Full Text Available This paper reports on the evaluation of a novel affordable system that incorporates intelligent physical blocks to introduce illiterate children in developing countries to the logical thinking process required in computer programming. Both...

  11. Advanced wellbore thermal simulator GEOTEMP2. Appendix. Computer program listing

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, R.F.

    1982-02-01

    This appendix gives the program listing of GEOTEMP2 with comments and discussion to make the program organization more understandable. This appendix is divided into an introduction and four main blocks of code: main program, program initiation, wellbore flow, and wellbore heat transfer. The purpose and use of each subprogram is discussed and the program listing is given. Flowcharts will be included to clarify code organization when needed. GEOTEMP2 was written in FORTRAN IV. Efforts have been made to keep the programing as conventional as possible so that GEOTEMP2 will run without modification on most computers.

  12. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998

  13. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  14. Software survey: VOSviewer, a computer program for bibliometric mapping

    OpenAIRE

    van Eck, Nees Jan; Waltman, Ludo

    2010-01-01

    We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps. The functionality of VOSviewer is especially useful for displaying large bibliometric maps in an easy-to-interpret way. The paper consists of three parts. In the first part, an overview of VOSviewer'...

  15. Newnes circuit calculations pocket book with computer programs

    CERN Document Server

    Davies, Thomas J

    2013-01-01

    Newnes Circuit Calculations Pocket Book: With Computer Programs presents equations, examples, and problems in circuit calculations. The text includes 300 computer programs that help solve the problems presented. The book is comprised of 20 chapters that tackle different aspects of circuit calculation. The coverage of the text includes dc voltage, dc circuits, and network theorems. The book also covers oscillators, phasors, and transformers. The text will be useful to electrical engineers and other professionals whose work involves electronic circuitry.

  16. Computer program compatible with a laser nephelometer

    Science.gov (United States)

    Paroskie, R. M.; Blau, H. H., Jr.; Blinn, J. C., III

    1975-01-01

    The laser nephelometer data system was updated to provide magnetic tape recording of data, and real time or near real time processing of data to provide particle size distribution and liquid water content. Digital circuits were provided to interface the laser nephelometer to a Data General Nova 1200 minicomputer. Communications are via a teletypewriter. A dual Linc Magnetic Tape System is used for program storage and data recording. Operational programs utilize the Data General Real-Time Operating System (RTOS) and the ERT AIRMAP Real-Time Operating System (ARTS). The programs provide for acquiring data from the laser nephelometer, acquiring data from auxiliary sources, keeping time, performing real time calculations, recording data and communicating with the teletypewriter.

  17. Contributions to computational stereology and parallel programming

    DEFF Research Database (Denmark)

    Rasmusson, Allan

    Stereology is the science of interpreting 3D structures from 2D section planes, and it is used in a multitude of disciplines including bioscience, material science and more. At its core is the use of random systematic sampling and geometrical probes which allow valid statistical inference...... between computer science and stereology, we try to overcome these problems by developing new virtual stereological probes and virtual tissue sections. A concrete result is the development of a new virtual 3D probe, the spatial rotator, which was found to have lower variance than the widely used planar...... simulator and a memory-efficient GPU implementation of connected components labeling. This was furthermore extended to produce signed distance fields and Voronoi diagrams, all with real-time performance. It has during the course of the project been realized that many disciplines within computer science...
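
    One of the building blocks mentioned above, connected components labeling, is illustrated below with the standard CPU-side routine in SciPy; this is only a small stand-in for the memory-efficient GPU implementation developed in the thesis.

        import numpy as np
        from scipy import ndimage

        # Binary "tissue section" with three blobs; 1 marks foreground pixels.
        image = np.array([[0, 1, 1, 0, 0],
                          [0, 1, 0, 0, 1],
                          [0, 0, 0, 1, 1],
                          [1, 0, 0, 0, 0]])

        labels, num_components = ndimage.label(image)   # 4-connectivity by default
        print(num_components)   # 3
        print(labels)           # same shape as the image, one integer label per blob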

  18. The engineering design integration (EDIN) system. [digital computer program complex

    Science.gov (United States)

    Glatt, C. R.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Reiners, S. J.

    1974-01-01

    A digital computer program complex for the evaluation of aerospace vehicle preliminary designs is described. The system consists of a Univac 1100 series computer and peripherals using the Exec 8 operating system, a set of demand access terminals of the alphanumeric and graphics types, and a library of independent computer programs. Modification of the partial run streams, data base maintenance and construction, and control of program sequencing are provided by a data manipulation program called the DLG processor. The executive control of library program execution is performed by the Univac Exec 8 operating system through a user established run stream. A combination of demand and batch operations is employed in the evaluation of preliminary designs. Applications accomplished with the EDIN system are described.

  19. 77 FR 38610 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2012-06-28

    ... education at the time of the parent or guardian's death. Beginning July 1, 2010, students who are otherwise... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF EDUCATION Privacy Act of 1974; Computer Matching Program AGENCY: Department of Education. ACTION: Notice--Computer...

  20. Computing, Information, and Communications Technology (CICT) Program Overview

    Science.gov (United States)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies

  1. 78 FR 1275 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-01-08

    ...: Notice--computer matching between the Office of Personnel Management and the Social Security... matching program with the Social Security Administration (SSA). DATES: OPM will file a report of the..., as amended, regulates the use of computer matching by Federal agencies when records in a system of...

  2. 77 FR 74518 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2012-12-14

    ...: Notice--computer matching between the Office of Personnel Management and the Social Security... Personnel Management (OPM) is publishing notice of its new computer matching program with the Social... matching by Federal agencies when records in a system of records are matched with other Federal, State, or...

  3. A computer program for analysis of fuelwood harvesting costs

    Science.gov (United States)

    George B. Harpole; Giuseppe Rensi

    1985-01-01

    The fuelwood harvesting computer program (FHP) is written in FORTRAN 60 and designed to select a collection of harvest units and systems from among alternatives to satisfy specified energy requirements at a lowest cost per million Btu's as recovered in a boiler, or thousand pounds of H2O evaporative capacity kiln drying. Computed energy costs are used as a...

  4. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    Science.gov (United States)

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  5. Program Analysis as Model Checking

    DEFF Research Database (Denmark)

    Olesen, Mads Chr.

    Software programs are proliferating throughout modern life, to a point where even the simplest appliances such as lightbulbs contain software, in addition to the software embedded in cars and airplanes. The correct functioning of these programs is therefore of the utmost importance, for the quality...... and sustenance of life. Due to the complexity inherent in the software it can be very difficult for the software developer to guarantee the absence of errors; automated support in the form of automated program analysis is therefore essential. Two methods have traditionally been proposed: model checking...... and abstract interpretation. Model checking views the program as a finite automaton and tries to prove logical properties over the automaton model, or present a counter-example if not possible — with a focus on precision. Abstract interpretation translates the program semantics into abstract semantics...

  6. Computer Programming with Infants and Juniors.

    Science.gov (United States)

    Hind, Jim

    1984-01-01

    The article argues that even extremely young children can be taught to program microcomputers from their very first contact. A teaching strategy is proposed, having more in common with the teaching of language than with the more traditional didactic-reinforcement cycle commonly employed in the text books. (Author/CL)

  7. Programming Languages for Distributed Computing Systems

    NARCIS (Netherlands)

    Bal, H.E.; Steiner, J.G.; Tanenbaum, A.S.

    1989-01-01

    When distributed systems first appeared, they were programmed in traditional sequential languages, usually with the addition of a few library procedures for sending and receiving messages. As distributed applications became more commonplace and more sophisticated, this ad hoc approach became less

  8. Model Railroading and Computer Fundamentals

    Science.gov (United States)

    McCormick, John W.

    2007-01-01

    Less than one half of one percent of all processors manufactured today end up in computers. The rest are embedded in other devices such as automobiles, airplanes, trains, satellites, and nearly every modern electronic device. Developing software for embedded systems requires a greater knowledge of hardware than developing for a typical desktop…

  9. Manned space station environmental control and life support system computer-aided technology assessment program

    Science.gov (United States)

    Hall, J. B., Jr.; Pickett, S. J.; Sage, K. H.

    1984-01-01

    A computer program for assessing manned space station environmental control and life support systems technology is described. The methodology, mission model parameters, evaluation criteria, and data base for 17 candidate technologies for providing metabolic oxygen and water to the crew are discussed. Examples are presented which demonstrate the capability of the program to evaluate candidate technology options for evolving space station requirements.

  10. Demand side management scheme in smart grid with cloud computing approach using stochastic dynamic programming

    Directory of Open Access Journals (Sweden)

    S. Sofana Reka

    2016-09-01

    Full Text Available This paper proposes a cloud computing framework in a smart grid environment by creating a small integrated energy hub supporting real-time computing for handling huge volumes of stored data. A stochastic programming model is developed with a cloud computing scheme for effective demand side management (DSM) in the smart grid. Simulation results are obtained using a GUI interface and the Gurobi optimizer in Matlab in order to reduce the electricity demand by creating energy networks in a smart hub approach.
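
    The core of a demand side management formulation is an optimisation that shifts flexible load toward cheap hours subject to scheduling limits. A deterministic toy version of such a problem is sketched below with scipy.optimize.linprog; the prices, energy total, and per-hour limit are made up, and the cited work itself uses a stochastic formulation solved with the Gurobi optimizer in Matlab.

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical day-ahead prices and a flexible load that must total 10 kWh,
        # with at most 3 kWh schedulable in any one hour.
        prices = np.array([0.30, 0.25, 0.10, 0.08, 0.12, 0.28])   # $/kWh per hour
        hours = len(prices)

        result = linprog(
            c=prices,                                   # minimise total energy cost
            A_eq=np.ones((1, hours)), b_eq=[10.0],      # total shifted energy = 10 kWh
            bounds=[(0.0, 3.0)] * hours,                # per-hour scheduling limit
            method="highs",
        )
        print(result.x)        # kWh scheduled in each hour
        print(result.fun)      # minimum cost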

  11. Dynamic programming models and applications

    CERN Document Server

    Denardo, Eric V

    2003-01-01

    Introduction to sequential decision processes covers use of dynamic programming in studying models of resource allocation, methods for approximating solutions of control problems in continuous time, production control, more. 1982 edition.

  12. From Clover to computer. Towards programmed anaesthesia?

    Science.gov (United States)

    Mapleson, W W

    1979-02-01

    The control of depth of anaesthesia has been viewed as a control-system problem the solution of which can involve both feedback and feedforward techniques. The nature of the problem in Clover's day and the solutions he found have been examined. A similar analysis has been made in respect of the modern anaesthetist. Finally, the way in which computers may aid the anaesthetist in his task has been illustrated by reference to various attempts reported from around the world and, in particular, by describing the development in Cardiff of a system which should produce, in the brain of the patient, any tension of an inhaled anaesthetic which the anaesthetist chooses to specify.

  13. A Linguistic Model in Component Oriented Programming

    Science.gov (United States)

    Crăciunean, Daniel Cristian; Crăciunean, Vasile

    2016-12-01

    Component-oriented programming, when well organized, can bring a large increase in efficiency in the development of large software systems. This paper proposes a model for building software systems by assembling components that can operate independently of each other. The model is based on a computing environment that runs parallel and distributed applications. The paper introduces concepts such as the abstract aggregation scheme and the aggregation application. Basically, an aggregation application is an application that is obtained by combining corresponding components. In our model an aggregation application is a word in a language.

  14. MULGRES: a computer program for stepwise multiple regression analysis

    Science.gov (United States)

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
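
    Stepwise deletion of the kind MULGRES performs can be sketched as backward elimination on p-values. The example below uses statsmodels on synthetic data; the 0.05 threshold and the data are illustrative, and this is not the MULGRES source itself.

        import numpy as np
        import statsmodels.api as sm

        def backward_stepwise(X, y, threshold=0.05):
            """Repeatedly drop the least significant predictor until all p-values pass."""
            columns = list(range(X.shape[1]))
            while columns:
                fit = sm.OLS(y, sm.add_constant(X[:, columns])).fit()
                pvalues = fit.pvalues[1:]            # skip the intercept
                worst = pvalues.argmax()
                if pvalues[worst] <= threshold:
                    return columns, fit
                del columns[worst]                   # stepwise deletion of one variable
            return columns, None

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 4))
        y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=100)   # columns 1, 3 are noise
        kept, fit = backward_stepwise(X, y)
        print(kept)            # typically [0, 2]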

  15. A Successful Course of Study in Computer Programming

    Science.gov (United States)

    Seeger, David H.

    1977-01-01

    Three keys to the successful development of the program of the computer programming department of the Technical Institute of Oklahoma State University are discussed: Community involvement, faculty/administration commitment to the basic principles of technical career education, and availability of appropriate equipment for student use. (HD)

  16. A Research Program in Computer Technology

    Science.gov (United States)

    1979-01-01


  17. Statistics, Computation, and Modeling in Cosmology

    Science.gov (United States)

    Jewell, Jeff; Guiness, Joe; SAMSI 2016 Working Group in Cosmology

    2017-01-01

    Current and future ground and space based missions are designed to not only detect, but map out with increasing precision, details of the universe in its infancy to the present-day. As a result we are faced with the challenge of analyzing and interpreting observations from a wide variety of instruments to form a coherent view of the universe. Finding solutions to a broad range of challenging inference problems in cosmology is one of the goals of the “Statistics, Computation, and Modeling in Cosmology” working groups, formed as part of the year-long program on ‘Statistical, Mathematical, and Computational Methods for Astronomy’, hosted by the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation funded institute. Two application areas have emerged for focused development in the cosmology working group involving advanced algorithmic implementations of exact Bayesian inference for the Cosmic Microwave Background, and statistical modeling of galaxy formation. The former includes study and development of advanced Markov Chain Monte Carlo algorithms designed to confront challenging inference problems including inference for spatial Gaussian random fields in the presence of sources of galactic emission (an example of a source separation problem). Extending these methods to future redshift survey data probing the nonlinear regime of large scale structure formation is also included in the working group activities. In addition, the working group is also focused on the study of ‘Galacticus’, a galaxy formation model applied to dark matter-only cosmological N-body simulations operating on time-dependent halo merger trees. The working group is interested in calibrating the Galacticus model to match statistics of galaxy survey observations; specifically stellar mass functions, luminosity functions, and color-color diagrams. The group will use subsampling approaches and fractional factorial designs to statistically and

  18. A new computer program for QSAR-analysis: ARTE-QSAR.

    Science.gov (United States)

    Van Damme, Sofie; Bultinck, Patrick

    2007-08-01

    A new computer program has been designed to build and analyze quantitative structure-activity relationship (QSAR) models through regression analysis. The user is provided with a range of regression and validation techniques. The emphasis of the program lies mainly in the validation of QSAR models in chemical applications. ARTE-QSAR produces an easily interpretable output from which the user can conclude whether the obtained model is suitable for prediction and analysis.
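
    A common validation technique in QSAR work of this kind is leave-one-out cross-validation, summarised by the q-squared statistic. A minimal NumPy sketch on synthetic descriptor data is shown below; it is a generic illustration, not code from ARTE-QSAR.

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.normal(size=(30, 3))                    # 30 compounds, 3 molecular descriptors
        y = 1.2 * X[:, 0] - 0.7 * X[:, 1] + rng.normal(scale=0.3, size=30)   # synthetic activities

        def loo_q2(X, y):
            """Leave-one-out cross-validated q^2 for an ordinary least-squares QSAR model."""
            n = len(y)
            predictions = np.empty(n)
            A = np.column_stack([np.ones(n), X])        # add an intercept column
            for i in range(n):
                mask = np.arange(n) != i
                coef, *_ = np.linalg.lstsq(A[mask], y[mask], rcond=None)
                predictions[i] = A[i] @ coef
            press = np.sum((y - predictions) ** 2)
            return 1.0 - press / np.sum((y - y.mean()) ** 2)

        print(loo_q2(X, y))     # q^2 close to 1 indicates good predictive ability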

  19. PDDP, A Data Parallel Programming Model

    Directory of Open Access Journals (Sweden)

    Karen H. Warren

    1996-01-01

    Full Text Available PDDP, the parallel data distribution preprocessor, is a data parallel programming model for distributed memory parallel computers. PDDP implements High Performance Fortran-compatible data distribution directives and parallelism expressed by the use of Fortran 90 array syntax, the FORALL statement, and the WHERE construct. Distributed data objects belong to a global name space; other data objects are treated as local and replicated on each processor. PDDP allows the user to program in a shared memory style and generates codes that are portable to a variety of parallel machines. For interprocessor communication, PDDP uses the fastest communication primitives on each platform.
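
    The whole-array programming style that PDDP builds on (Fortran 90 array syntax, FORALL, WHERE) has a rough, purely sequential analogue in NumPy, shown below only to convey the style of expressing computations over whole arrays; it is not PDDP and involves no data distribution.

        import numpy as np

        a = np.arange(10, dtype=float)
        b = np.full(10, 2.0)

        # Fortran 90 array syntax  c = a * b + 1.0  maps directly to whole-array operations.
        c = a * b + 1.0

        # The WHERE construct (update only elements satisfying a mask) maps to np.where.
        c = np.where(c > 10.0, 0.0, c)

        # A FORALL-style index expression, here c(i) = a(i) + b(i-1) for interior i.
        c[1:] = a[1:] + b[:-1]
        print(c)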

  20. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  1. A Categorisation of Cloud Computing Business Models

    OpenAIRE

    Chang, V; Bacigalupo, D; Wills, G; De Roure, D

    2010-01-01

    This paper reviews current cloud computing business models and presents proposals on how organisations can achieve sustainability by adopting appropriate models. We classify cloud computing business models into eight types: (1) Service Provider and Service Orientation; (2) Support and Services Contracts; (3) In-House Private Clouds; (4) All-In-One Enterprise Cloud; (5) One-Stop Resources and Services; (6) Government funding; (7) Venture Capitals; and (8) Entertainment and Social Networking. U...

  2. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Science.gov (United States)

    Wang, Danli; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  3. On Computational Power of Quantum Read-Once Branching Programs

    Directory of Open Access Journals (Sweden)

    Farid Ablayev

    2011-03-01

    Full Text Available In this paper we review our current results concerning the computational power of quantum read-once branching programs. First of all, based on the circuit presentation of quantum branching programs and our variant of quantum fingerprinting technique, we show that any Boolean function with linear polynomial presentation can be computed by a quantum read-once branching program using a relatively small (usually logarithmic in the size of input) number of qubits. Then we show that the described class of Boolean functions is closed under the polynomial projections.

  4. Computer modeling of platinum reforming

    African Journals Online (AJOL)

    eobe

    Usually, the composition of the reformate leaving any stage is assessed by laboratory analysis, which in most cases is avoided because of the long time involved. The approach here has considered a computer model as a means of assessing the bed reactors in platforming units ...

  5. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  6. Injecting Artificial Memory Errors Into a Running Computer Program

    Science.gov (United States)

    Bornstein, Benjamin J.; Granat, Robert A.; Wagstaff, Kiri L.

    2008-01-01

    Single-event upsets (SEUs) or bitflips are computer memory errors caused by radiation. BITFLIPS (Basic Instrumentation Tool for Fault Localized Injection of Probabilistic SEUs) is a computer program that deliberately injects SEUs into another computer program, while the latter is running, for the purpose of evaluating the fault tolerance of that program. BITFLIPS was written as a plug-in extension of the open-source Valgrind debugging and profiling software. BITFLIPS can inject SEUs into any program that can be run on the Linux operating system, without needing to modify the program s source code. Further, if access to the original program source code is available, BITFLIPS offers fine-grained control over exactly when and which areas of memory (as specified via program variables) will be subjected to SEUs. The rate of injection of SEUs is controlled by specifying either a fault probability or a fault rate based on memory size and radiation exposure time, in units of SEUs per byte per second. BITFLIPS can also log each SEU that it injects and, if program source code is available, report the magnitude of effect of the SEU on a floating-point value or other program variable.
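
    The underlying idea of probabilistic SEU injection -- flipping randomly chosen bits of live data at a specified fault probability -- can be sketched in a few lines of Python, as below. This is only a self-contained illustration on a NumPy array, not the Valgrind-based mechanism that BITFLIPS uses to instrument a running program.

        import numpy as np

        def inject_seus(values, fault_probability, rng=np.random.default_rng()):
            """Flip random bits of a float64 array in place with the given per-byte probability."""
            raw = values.view(np.uint8)                      # reinterpret the same memory as bytes
            hits = rng.random(raw.shape) < fault_probability
            bits = rng.integers(0, 8, size=raw.shape, dtype=np.uint8)
            raw[hits] ^= (np.uint8(1) << bits[hits])         # XOR flips the chosen bit
            return int(hits.sum())

        data = np.ones(1000)
        n_flips = inject_seus(data, fault_probability=0.001)
        print(n_flips, np.flatnonzero(data != 1.0))          # where the upsets landed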

  7. Computational Model for Corneal Transplantation

    Science.gov (United States)

    Cabrera, Delia

    2003-10-01

    We evaluated the refractive consequences of corneal transplants using a biomechanical model with homogeneous and inhomogeneous Young's modulus distributions within the cornea, taking into account ablation of some stromal tissue. A FEM model was used to simulate corneal transplants in diseased cornea. The diseased cornea was modeled as an axisymmetric structure taking into account a nonlinearly elastic, isotropic formulation. The model simulating the penetrating keratoplasty procedure gives more change in the postoperative corneal curvature when compared to the models simulating the anterior and posterior lamellar graft procedures. When a lenticle shaped tissue was ablated in the graft during the anterior and posterior keratoplasty, the models provided an additional correction of about -3.85 and -4.45 diopters, respectively. Despite the controversy around the corneal thinning disorders treatment with volume removal procedures, results indicate that significant changes in corneal refractive power could be introduced by a corneal transplantation combined with myopic laser ablation.

  8. Notions of similarity for computational biology models

    KAUST Repository

    Waltemath, Dagmar

    2016-03-21

    Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.

  9. Material Programming: a Design Practice for Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna; Boer, Laurens; Tsaknaki, Vasiliki

    2016-01-01

    In this paper we propose the notion of material programming as a future design practice for computational composites. Material programming would be a way for the interaction designer to better explore the dynamic potential of computational materials at hand and through that familiarity be able...... to compose more sophisticated and complex temporal forms in their designs. The contribution of the paper is an analysis of qualities that we find a material programming practice would and should support: designs grounded in material properties and experiences, embodied programming practice, real-time on-site explorations, and finally a reasonable level of complexity in couplings between input and output. We propose material programming knowing that the technology and materials are not entirely ready to support this practice yet; however, we are certain they will be and that the interaction design community...

  10. VNAP2: a computer program for computation of two-dimensional, time-dependent, compressible, turbulent flow

    Energy Technology Data Exchange (ETDEWEB)

    Cline, M.C.

    1981-08-01

    VNAP2 is a computer program for calculating turbulent (as well as laminar and inviscid), steady, and unsteady flow. VNAP2 solves the two-dimensional, time-dependent, compressible Navier-Stokes equations. The turbulence is modeled with either an algebraic mixing-length model, a one-equation model, or the Jones-Launder two-equation model. The geometry may be a single- or a dual-flowing stream. The interior grid points are computed using the unsplit MacCormack scheme. Two options to speed up the calculations for high Reynolds number flows are included. The boundary grid points are computed using a reference-plane-characteristic scheme with the viscous terms treated as source functions. An explicit artificial viscosity is included for shock computations. The fluid is assumed to be a perfect gas. The flow boundaries may be arbitrary curved solid walls, inflow/outflow boundaries, or free-jet envelopes. Typical problems that can be solved concern nozzles, inlets, jet-powered afterbodies, airfoils, and free-jet expansions. The accuracy and efficiency of the program are shown by calculations of several inviscid and turbulent flows. The program and its use are described completely, and six sample cases and a code listing are included.
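
    The MacCormack predictor-corrector scheme used for the interior points can be shown in its simplest setting, one-dimensional linear advection on a periodic grid. The sketch below is far removed from the compressible Navier-Stokes equations and turbulence models solved by VNAP2, and the grid size, CFL number, and initial pulse are arbitrary.

        import numpy as np

        # MacCormack predictor-corrector for u_t + a u_x = 0 on a periodic domain.
        nx, a, cfl = 200, 1.0, 0.8
        x = np.linspace(0.0, 1.0, nx, endpoint=False)
        dx = x[1] - x[0]
        dt = cfl * dx / a
        u = np.exp(-200.0 * (x - 0.3) ** 2)          # initial Gaussian pulse

        for _ in range(int(0.4 / dt)):
            # Predictor: forward difference.
            u_p = u - a * dt / dx * (np.roll(u, -1) - u)
            # Corrector: backward difference on the predicted field, then average.
            u = 0.5 * (u + u_p - a * dt / dx * (u_p - np.roll(u_p, 1)))

        print(u.max())    # pulse advected by a * t with modest numerical dissipation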

  11. Computational fluid dynamics modeling in yarn engineering

    CSIR Research Space (South Africa)

    Patanaik, A

    2011-07-01

    Full Text Available This chapter deals with the application of computational fluid dynamics (CFD) modeling in reducing yarn hairiness during the ring spinning process and thereby “engineering” yarn with desired properties. Hairiness significantly affects the appearance...

  12. A new epidemic model of computer viruses

    Science.gov (United States)

    Yang, Lu-Xing; Yang, Xiaofan

    2014-06-01

    This paper addresses the epidemiological modeling of computer viruses. By incorporating the effect of removable storage media, considering the possibility of connecting infected computers to the Internet, and removing the conservative restriction on the total number of computers connected to the Internet, a new epidemic model is proposed. Unlike most previous models, the proposed model has no virus-free equilibrium and has a unique endemic equilibrium. With the aid of the theory of asymptotically autonomous systems as well as the generalized Poincare-Bendixson theorem, the endemic equilibrium is shown to be globally asymptotically stable. By analyzing the influence of different system parameters on the steady number of infected computers, a collection of policies is recommended to prohibit the virus prevalence.

  13. Computational Models for Nonlinear Aeroelastic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate a new and efficient computational method of modeling nonlinear aeroelastic systems. The...

  14. Efficient sampling and meta-modeling in computational economic models

    NARCIS (Netherlands)

    Salle, I.; Yıldızoğlu, M.

    2014-01-01

    Extensive exploration of simulation models comes at a high computational cost, all the more when the model involves a lot of parameters. Economists usually rely on random explorations, such as Monte Carlo simulations, and basic econometric modeling to approximate the properties of computational

  15. Predictive Models and Computational Embryology

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  16. Preliminary Phase Field Computational Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, Shenyang Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xu, Ke [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suter, Jonathan D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCloy, John S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Bradley R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using the monocrystalline Fe (i.e., ferrite) film as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in
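
    The phase-field models described here rest on the numerical solution of the Landau-Lifshitz-Gilbert equation. As a much-reduced illustration, the sketch below integrates the damped precession of a single macrospin (written in the equivalent Landau-Lifshitz form) with a plain Euler step and renormalisation; the field, damping constant, and time step are arbitrary, and this is not the PNNL phase-field code.

        import numpy as np

        # Damped precession of a single macrospin, dm/dt = -gamma m x H - gamma*alpha m x (m x H),
        # with |m| = 1 enforced after every step.
        gamma, alpha = 1.76e11, 0.1            # gyromagnetic ratio (rad/s/T) and damping
        H = np.array([0.0, 0.0, 1.0])          # effective field along z, in tesla
        m = np.array([1.0, 0.0, 0.0])          # initial magnetization along x
        dt = 1e-13

        for _ in range(200_000):
            mxH = np.cross(m, H)
            dm = -gamma * mxH - gamma * alpha * np.cross(m, mxH)
            m = m + dt * dm
            m /= np.linalg.norm(m)             # keep the magnetization a unit vector

        print(m)   # relaxes toward the field direction (0, 0, 1)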

  17. Computer programs for analysis of geophysical data

    Energy Technology Data Exchange (ETDEWEB)

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution.

  18. A Computational Framework for Realistic Retina Modeling.

    Science.gov (United States)

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas.

  19. Dynamics of Population and Economic Growth: A Computer-Based Instruction Program.

    Science.gov (United States)

    Roh, Chaisung; Handler, Paul

    A computer-assisted instructional (CAI) program at the University of Illinois is used to teach the dynamics of population growth. Socio-economic models are also developed to show the consequences of population growth upon variables such as income, productivity, and the demand for food. A one-sex population projection model allows students to…

  20. Model to Implement Virtual Computing Labs via Cloud Computing Services

    Directory of Open Access Journals (Sweden)

    Washington Luna Encalada

    2017-07-01

    Full Text Available In recent years, we have seen a significant number of new technological ideas appearing in literature discussing the future of education. For example, E-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs), and bring your own device (BYOD) are all new concepts of immersive and global education that have emerged in educational literature. One of the greatest challenges presented to e-learning solutions is the reproduction of the benefits of an educational institution’s physical laboratory. For a university without a computing lab, to obtain hands-on IT training with software, operating systems, networks, servers, storage, and cloud computing similar to that which could be received on a university campus computing lab, it is necessary to use a combination of technological tools. Such teaching tools must promote the transmission of knowledge, encourage interaction and collaboration, and ensure students obtain valuable hands-on experience. That, in turn, allows the universities to focus more on teaching and research activities than on the implementation and configuration of complex physical systems. In this article, we present a model for implementing ecosystems which allow universities to teach practical Information Technology (IT) skills. The model utilizes what is called a “social cloud”, which utilizes all cloud computing services, such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Additionally, it integrates the cloud learning aspects of a MOOC and several aspects of social networking and support. Social clouds have striking benefits such as centrality, ease of use, scalability, and ubiquity, providing a superior learning environment when compared to that of a simple physical lab. The proposed model allows students to foster all the educational pillars such as learning to know, learning to be, learning

  1. Computer Modeling of Direct Metal Laser Sintering

    Science.gov (United States)

    Cross, Matthew

    2014-01-01

    A computational approach to modeling direct metal laser sintering (DMLS) additive manufacturing process is presented. The primary application of the model is for determining the temperature history of parts fabricated using DMLS to evaluate residual stresses found in finished pieces and to assess manufacturing process strategies to reduce part slumping. The model utilizes MSC SINDA as a heat transfer solver with imbedded FORTRAN computer code to direct laser motion, apply laser heating as a boundary condition, and simulate the addition of metal powder layers during part fabrication. Model results are compared to available data collected during in situ DMLS part manufacture.
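
    Determining the temperature history of a part under a scanned laser source reduces, in its simplest form, to transient heat conduction with a time-varying surface flux. The sketch below is a one-dimensional explicit finite-difference stand-in for that idea; the material properties, flux level, and timing are illustrative placeholders and have nothing to do with the MSC SINDA model described above.

        import numpy as np

        # Explicit 1D transient conduction through a thin metal layer; all property
        # values are illustrative placeholders, not DMLS process parameters.
        k, rho, cp = 20.0, 7800.0, 500.0        # W/m-K, kg/m^3, J/kg-K
        alpha = k / (rho * cp)
        nz, depth = 50, 1e-3                    # 1 mm slab
        dz = depth / (nz - 1)
        dt = 0.4 * dz**2 / alpha                # stable explicit time step
        T = np.full(nz, 300.0)                  # initial temperature, K

        q_laser = 5e5                           # surface heat flux during the "laser pass", W/m^2
        for step in range(2000):
            flux_on = step * dt < 0.01          # laser illuminates the surface for 10 ms
            q = q_laser if flux_on else 0.0
            T_new = T.copy()
            T_new[1:-1] = T[1:-1] + alpha * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
            # Surface node: conduction plus the applied flux (insulated when the laser is off).
            T_new[0] = T[0] + alpha * dt / dz**2 * 2 * (T[1] - T[0]) + 2 * q * dt / (rho * cp * dz)
            T_new[-1] = T_new[-2]               # insulated back face
            T = T_new

        print(T[0], T[-1])                      # surface and back-face temperatures at the end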

  2. Web Program for Development of GUIs for Cluster Computers

    Science.gov (United States)

    Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward

    2003-01-01

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.

  3. An introduction to NASA's advanced computing program: Integrated computing systems in advanced multichip modules

    Science.gov (United States)

    Fang, Wai-Chi; Alkalai, Leon

    1996-01-01

    Recent changes within NASA's space exploration program favor the design, implementation, and operation of low cost, lightweight, small and micro spacecraft with multiple launches per year. In order to meet the future needs of these missions with regard to the use of spacecraft microelectronics, NASA's advanced flight computing (AFC) program is currently considering industrial cooperation and advanced packaging architectures. In relation to this, the AFC program is reviewed, considering the design and implementation of NASA's AFC multichip module.

  4. Visual and Computational Modelling of Minority Games

    Directory of Open Access Journals (Sweden)

    Robertas Damaševičius

    2017-02-01

    Full Text Available The paper analyses the Minority Game and focuses on the analysis and computational modelling of several variants (variable payoff, coalition-based, and ternary voting) of the Minority Game using the UAREI (User-Action-Rule-Entities-Interface) model. UAREI is a model for the formal specification of software gamification, and the UAREI visual modelling language is a language used for graphical representation of game mechanics. The UAREI model also provides an embedded executable modelling framework to evaluate how the rules of the game will work for the players in practice. We demonstrate the flexibility of the UAREI model for modelling different variants of Minority Game rules for game design.
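
    The basic Minority Game underlying all of the variants above is easy to simulate: an odd number of agents repeatedly choose one of two sides, the minority side wins, and agents score their strategies against the outcome. A minimal NumPy sketch is given below; it implements only the standard game, not the variable-payoff, coalition-based, or ternary-voting variants, and not the UAREI modelling layer.

        import numpy as np

        rng = np.random.default_rng(42)
        n_agents, memory, n_strategies, rounds = 101, 3, 2, 500

        # Each strategy maps every possible history (2^memory of them) to a choice in {0, 1}.
        strategies = rng.integers(0, 2, size=(n_agents, n_strategies, 2 ** memory))
        scores = np.zeros((n_agents, n_strategies))
        history = 0                                     # encoded as an integer in [0, 2^memory)
        wins = np.zeros(n_agents)

        for _ in range(rounds):
            best = scores.argmax(axis=1)                # each agent plays its best-scoring strategy
            choices = strategies[np.arange(n_agents), best, history]
            minority = int(choices.sum() < n_agents / 2)   # the less-chosen side
            wins += (choices == minority)
            # Reward every strategy that would have predicted the minority side.
            scores += (strategies[:, :, history] == minority)
            history = ((history << 1) | minority) % (2 ** memory)

        print(wins.mean(), choices.sum())               # average payoff and last attendance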

  5. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Likewise, ships and buildings are built by naval and civil architects. While these are useful, they are, in most cases, static models. ... The basic theory of transition from one state to another was developed by the Russian mathematician Andrei Markov, hence the name Markov chains. Andrei Markov [1856-1922] ...

  6. Computational aspects of premixing modelling

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, D.F. [Sydney Univ., NSW (Australia). Dept. of Chemical Engineering; Witt, P.J.

    1998-01-01

    In the steam explosion research field there is currently considerable effort being devoted to the modelling of premixing. Practically all models are based on the multiphase flow equations which treat the mixture as an interpenetrating continuum. Solution of these equations is non-trivial and a wide range of solution procedures are in use. This paper addresses some numerical aspects of this problem. In particular, we examine the effect of the differencing scheme for the convective terms and show that use of hybrid differencing can cause qualitatively wrong solutions in some situations. Calculations are performed for the Oxford tests, the BNL tests, a MAGICO test and to investigate various sensitivities of the solution. In addition, we show that use of a staggered grid can result in a significant error which leads to poor predictions of `melt` front motion. A correction is given which leads to excellent convergence to the analytic solution. Finally, we discuss the issues facing premixing model developers and highlight the fact that model validation is hampered more by the complexity of the process than by numerical issues. (author)
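    To make the differencing issue above concrete, the following sketch shows a common textbook form of the hybrid scheme for the convective terms of a one-dimensional control volume. The notation and formulas are the standard finite-volume ones and are an illustration only, not the authors' premixing code:

        # Hybrid differencing: blend central and upwind differencing depending on
        # the cell Peclet number Pe = F/D at each face.
        def hybrid_coefficients(F_w, D_w, F_e, D_e):
            """Neighbour coefficients for one interior control volume.

            F = rho*u (convective mass flux per unit area) at a face,
            D = Gamma/dx (diffusive conductance) at a face.
            """
            a_W = max(F_w, D_w + 0.5 * F_w, 0.0)   # west neighbour
            a_E = max(-F_e, D_e - 0.5 * F_e, 0.0)  # east neighbour
            a_P = a_W + a_E + (F_e - F_w)          # central coefficient (continuity)
            return a_W, a_E, a_P

        # For |Pe| < 2 this reproduces central differencing; for |Pe| >= 2 it falls
        # back to first-order upwinding, which is what can smear or distort fronts.
        print(hybrid_coefficients(F_w=1.0, D_w=1.0, F_e=1.0, D_e=1.0))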

  7. Computational Modeling of Culture's Consequences

    NARCIS (Netherlands)

    Hofstede, G.J.; Jonker, C.M.; Verwaart, T.

    2010-01-01

    This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, (b) knowledge acquisition based on a dimensional theory of culture,

  8. Computational modeling of concrete flow

    DEFF Research Database (Denmark)

    Roussel, Nicolas; Geiker, Mette Rica; Dufour, Frederic

    2007-01-01

    particle flow, and numerical techniques allowing the modeling of particles suspended in a fluid. The general concept behind each family of techniques is described. Pros and cons for each technique are given along with examples and references to applications to fresh cementitious materials....

  9. A digital computer program for the dynamic interaction simulation of controls and structure (DISCOS), volume 1

    Science.gov (United States)

    Bodley, C. S.; Devers, A. D.; Park, A. C.; Frisch, H. P.

    1978-01-01

    A theoretical development and associated digital computer program system for the dynamic simulation and stability analysis of passive and actively controlled spacecraft are presented. The dynamic system (spacecraft) is modeled as an assembly of rigid and/or flexible bodies not necessarily in a topological tree configuration. The computer program system is used to investigate total system dynamic characteristics, including interaction effects between rigid and/or flexible bodies, control systems, and a wide range of environmental loadings. In addition, the program system is used for designing attitude control systems and for evaluating total dynamic system performance, including time domain response and frequency domain stability analyses.

  10. Computational Economic Modeling of Migration

    OpenAIRE

    Klabunde, Anna

    2014-01-01

    In this paper an agent-based model of endogenously evolving migrant networks is developed to identify the determinants of migration and return decisions. Individuals are connected by links, the strength of which declines over time and distance. Methodologically, this paper combines parameterization using data from the Mexican Migration Project with calibration. It is shown that expected earnings, an idiosyncratic home bias, network ties to other migrants, strength of links to the home country...

  11. Ablative Rocket Deflector Testing and Computational Modeling

    Science.gov (United States)

    Allgood, Daniel C.; Lott, Jeffrey W.; Raines, Nickey

    2010-01-01

    A deflector risk mitigation program was recently conducted at the NASA Stennis Space Center. The primary objective was to develop a database that characterizes the behavior of industry-grade refractory materials subjected to rocket plume impingement conditions commonly experienced on static test stands. The program consisted of short and long duration engine tests where the supersonic exhaust flow from the engine impinged on an ablative panel. Quasi time-dependent erosion depths and patterns generated by the plume impingement were recorded for a variety of different ablative materials. The erosion behavior was found to be highly dependent on the material's composition and corresponding thermal properties. For example, in the case of the HP CAST 93Z ablative material, the erosion rate actually decreased under continued thermal heating conditions due to the formation of a low thermal conductivity "crystallization" layer. The "crystallization" layer produced near the surface of the material provided an effective insulation from the hot rocket exhaust plume. To gain further insight into the complex interaction of the plume with the ablative deflector, computational fluid dynamic modeling was performed in parallel to the ablative panel testing. The results from the current study demonstrated that locally high heating occurred due to shock reflections. These localized regions of shock-induced heat flux resulted in non-uniform erosion of the ablative panels. In turn, it was observed that the non-uniform erosion exacerbated the localized shock heating, causing eventual plume separation and reversed flow for long duration tests under certain conditions. Overall, the flow simulations compared very well with the available experimental data obtained during this project.

  12. Model Checker for Java Programs

    Science.gov (United States)

    Visser, Willem

    2007-01-01

    Java Pathfinder (JPF) is a verification and testing environment for Java that integrates model checking, program analysis, and testing. JPF consists of a custom-made Java Virtual Machine (JVM) that interprets bytecode, combined with a search interface to allow the complete behavior of a Java program to be analyzed, including interleavings of concurrent programs. JPF is implemented in Java, and its architecture is highly modular to support rapid prototyping of new features. JPF is an explicit-state model checker, because it enumerates all visited states and, therefore, suffers from the state-explosion problem inherent in analyzing large programs. It is suited to analyzing programs less than 10kLOC, but has been successfully applied to finding errors in concurrent programs up to 100kLOC. When an error is found, a trace from the initial state to the error is produced to guide the debugging. JPF works at the bytecode level, meaning that all of Java can be model-checked. By default, the software checks for all runtime errors (uncaught exceptions), assertion violations (supporting Java's assert), and deadlocks. JPF uses garbage collection and symmetry reductions of the heap during model checking to reduce state-explosion, as well as dynamic partial order reductions to lower the number of interleavings analyzed. JPF is capable of symbolic execution of Java programs, including symbolic execution of complex data such as linked lists and trees. JPF is extensible as it allows for the creation of listeners that can subscribe to events during searches. The creation of dedicated code to be executed in place of regular classes is supported and allows users to easily handle native calls and to improve the efficiency of the analysis.

  13. Transnational nursing programs: models, advantages and challenges.

    Science.gov (United States)

    Wilson, Michael

    2002-07-01

    Conducting transnational programs can be a very rewarding activity for a School, Faculty or University. Apart from increasing the profile of the university, the conduct of transnational programs can also provide the university with openings for business opportunities, consultative activities, and collaborative research. It can also be a costly exercise, placing an enormous strain on limited resources with little reward for the provider. Transnational ventures can become nonviable entities in a very short period of time due to unanticipated global economic trends. Transnational courses offered by Faculties of Business and Computing are commonplace; however, there is a growing number of health science programs, particularly nursing, that are being offered transnationally. This paper presents an overview of several models employed for the delivery of transnational nursing courses and discusses several key issues pertaining to conducting courses outside the host university's country.

  14. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions; CRECTJ5 treats the data in the ENDF/B-IV and ENDF/B-V format, and CRECTJ6 the data in the ENDF-6 format. These programs have been frequently used to make Japanese Evaluated Nuclear Data Library (JENDL). This report describes input data and examples of CRECTJ. (author)

  15. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  16. Department of Energy Mathematical, Information, and Computational Sciences Division: High Performance Computing and Communications Program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-11-01

    This document is intended to serve two purposes. Its first purpose is that of a program status report of the considerable progress that the Department of Energy (DOE) has made since 1993, the time of the last such report (DOE/ER-0536, The DOE Program in HPCC), toward achieving the goals of the High Performance Computing and Communications (HPCC) Program. The second purpose is that of a summary report of the many research programs administered by the Mathematical, Information, and Computational Sciences (MICS) Division of the Office of Energy Research under the auspices of the HPCC Program and to provide, wherever relevant, easy access to pertinent information about MICS-Division activities via universal resource locators (URLs) on the World Wide Web (WWW).

  17. Department of Energy: MICS (Mathematical Information, and Computational Sciences Division). High performance computing and communications program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    This document is intended to serve two purposes. Its first purpose is that of a program status report of the considerable progress that the Department of Energy (DOE) has made since 1993, the time of the last such report (DOE/ER-0536, "The DOE Program in HPCC"), toward achieving the goals of the High Performance Computing and Communications (HPCC) Program. The second purpose is that of a summary report of the many research programs administered by the Mathematical, Information, and Computational Sciences (MICS) Division of the Office of Energy Research under the auspices of the HPCC Program and to provide, wherever relevant, easy access to pertinent information about MICS-Division activities via universal resource locators (URLs) on the World Wide Web (WWW). The information pointed to by the URL is updated frequently, and the interested reader is urged to access the WWW for the latest information.

  18. Introductory review of computational cell cycle modeling.

    Science.gov (United States)

    Kriete, Andres; Noguchi, Eishi; Sell, Christian

    2014-01-01

    Recent advances in the modeling of the cell cycle through computer simulation demonstrate the power of systems biology. By definition, systems biology has the goal to connect a parts list, prioritized through experimental observation or high-throughput screens, by the topology of interactions defining intracellular networks to predict system function. Computer modeling of biological systems is often compared to a process of reverse engineering. Indeed, designed or engineered technical systems share many systems-level properties with biological systems; thus studying biological systems within an engineering framework has proven successful. Here we review some aspects of this process as it pertains to cell cycle modeling.

  19. A computational model of the cerebellum

    Energy Technology Data Exchange (ETDEWEB)

    Travis, B.J.

    1990-01-01

    The need for realistic computational models of neural microarchitecture is growing increasingly apparent. While traditional neural networks have made inroads on understanding cognitive functions, more realism (in the form of structural and connectivity constraints) is required to explain processes such as vision or motor control. A highly detailed computational model of mammalian cerebellum has been developed. It is being compared to physiological recordings for validation purposes. The model is also being used to study the relative contributions of each component to cerebellar processing. 28 refs., 4 figs.

  20. Computational modeling of failure in composite laminates

    NARCIS (Netherlands)

    Van der Meer, F.P.

    2010-01-01

    There is no state of the art computational model that is good enough for predictive simulation of the complete failure process in laminates. Already on the single ply level controversy exists. Much work has been done in recent years in the development of continuum models, but these fail to predict

  1. Modeling User Behavior in Computer Learning Tasks.

    Science.gov (United States)

    Mantei, Marilyn M.

    Model building techniques from Artificial Intelligence and Information-Processing Psychology are applied to human-computer interface tasks to evaluate existing interfaces and suggest new and better ones. The model is in the form of an augmented transition network (ATN) grammar which is built by applying grammar induction heuristics on a sequential…

  2. Generating Computational Models for Serious Gaming

    NARCIS (Netherlands)

    Westera, Wim

    2018-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  3. Generating computational models for serious gaming

    NARCIS (Netherlands)

    Westera, Wim

    2014-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  4. Do's and Don'ts of Computer Models for Planning

    Science.gov (United States)

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  5. Models of neuromodulation for computational psychiatry.

    Science.gov (United States)

    Iglesias, Sandra; Tomiello, Sara; Schneebeli, Maya; Stephan, Klaas E

    2017-05-01

    Psychiatry faces fundamental challenges: based on a syndrome-based nosology, it presently lacks clinical tests to infer on disease processes that cause symptoms of individual patients and must resort to trial-and-error treatment strategies. These challenges have fueled the recent emergence of a novel field-computational psychiatry-that strives for mathematical models of disease processes at physiological and computational (information processing) levels. This review is motivated by one particular goal of computational psychiatry: the development of 'computational assays' that can be applied to behavioral or neuroimaging data from individual patients and support differential diagnosis and guiding patient-specific treatment. Because the majority of available pharmacotherapeutic approaches in psychiatry target neuromodulatory transmitters, models that infer (patho)physiological and (patho)computational actions of different neuromodulatory transmitters are of central interest for computational psychiatry. This article reviews the (many) outstanding questions on the computational roles of neuromodulators (dopamine, acetylcholine, serotonin, and noradrenaline), outlines available evidence, and discusses promises and pitfalls in translating these findings to clinical applications. WIREs Cogn Sci 2017, 8:e1420. doi: 10.1002/wcs.1420

  6. Recent Advances in Computational Modeling of Thrombosis

    OpenAIRE

    Yesudasan, Sumith; Averett, Rodney D.

    2018-01-01

    The study of thrombosis is crucial to understand and develop new therapies for diseases like deep vein thrombosis, diabetes-related strokes, pulmonary embolism, etc. The last two decades have seen an exponential growth in studies related to blood clot formation using computational tools and through experiments. Despite this growth, the complete mechanism behind thrombus formation and hemostasis is not known yet. The computational models and methods used in this context are diversified i...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  8. The Implementation of Blended Learning Using Android-Based Tutorial Video in Computer Programming Course II

    Science.gov (United States)

    Huda, C.; Hudha, M. N.; Ain, N.; Nandiyanto, A. B. D.; Abdullah, A. G.; Widiaty, I.

    2018-01-01

    The computer programming course is theoretical. Sufficient practice is necessary to facilitate conceptual understanding and to encourage creativity in designing computer programs/animations. The development of a tutorial video for Android-based blended learning is needed to guide students. Using Android-based instructional material, students can independently learn anywhere and anytime. The tutorial video can facilitate students’ understanding of the concepts, materials, and procedures of programming/animation making in detail. This study employed a Research and Development method adapting Thiagarajan’s 4D model. The developed Android-based instructional material and tutorial video were validated by experts in instructional media and experts in physics education. The expert validation results showed that the Android-based material was comprehensive and very feasible. The tutorial video was deemed feasible as it received an average score of 92.9%. It was also revealed that students’ conceptual understanding, skills, and creativity in designing computer programs/animations improved significantly.

  9. Computer Program Plagiarism Detection: The Limits of the Halstead Metric.

    Science.gov (United States)

    Berghel, H. L.; Sallach, David L.

    1985-01-01

    Discusses two alternative metrics to detect computer software plagiarism: the Halstead metric drawn from the software science discipline and an ad hoc method drawn from program grading experience and identified by factor analysis. Possible explanations as to why the ad hoc method is more useful in identical-task environments are considered.…
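    For reference, the basic Halstead "software science" measures on which the metric is built can be computed from operator and operand counts, as in the following sketch. The token classification below is a deliberately naive assumption for illustration, not the method used in the article:

        import math
        import re

        def halstead(source: str):
            # Very rough token split: symbols and a few keywords count as operators,
            # remaining identifiers and literals count as operands.
            operators = re.findall(r"[+\-*/=<>!&|%]+|\b(?:if|else|for|while|return|def)\b", source)
            operands = re.findall(r"\b(?!if|else|for|while|return|def)\w+\b", source)
            n1, n2 = len(set(operators)), len(set(operands))   # distinct operators/operands
            N1, N2 = len(operators), len(operands)             # total occurrences
            vocabulary = n1 + n2
            length = N1 + N2
            volume = length * math.log2(vocabulary) if vocabulary > 1 else 0.0
            return {"n1": n1, "n2": n2, "N1": N1, "N2": N2, "volume": volume}

        print(halstead("def f(x):\n    return x * x + 1"))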

  10. Computer Programming with Early Elementary Students with Down Syndrome

    Science.gov (United States)

    Taylor, Matthew S.; Vasquez, Eleazar; Donehower, Claire

    2017-01-01

    Students of all ages and abilities must be given the opportunity to learn academic skills that can shape future opportunities and careers. Researchers in the mid-1970s and 1980s began teaching young students the processes of computer programming using basic coding skills and limited technology. As technology became more personalized and easily…

  11. 77 FR 56824 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2012-09-14

    ... information contained in the USCIS database is referred to as the Verification Information System (VIS), which... records entitled ``Verification Information System Records Notice (DHS-2007-0010).'' Where there is a... Information: Privacy Act of 1974; Computer Matching Program between the U.S. Department of Education and the...

  12. Computers for All Children: A Handbook for Program Design.

    Science.gov (United States)

    Sharp, Pamela; Crist-Whitzel, Janet

    One of three publications of the Research on Equitable Access to Technology (REAT) project, this practitioner's handbook is designed to assist educators in the design and implementation of computer instruction programs for underserved groups of students, including low-income, minority, low-achieving, limited-English speaking, female, and rural…

  13. What's New in Software? Integrated Computer Programs and Daily Living.

    Science.gov (United States)

    Hedley, Carolyn N.

    1989-01-01

    Various kinds of electronic information media can now be integrated to plan educational programs, through use of computer videodiscs, hypercards, and hypertexts. Discussed are the components of integrative technology, including audio technology, video technology, and electronic text and graphics, and possibilities for interfacing the various…

  14. Intellectual Property Law and the Protection of Computer Programs.

    Science.gov (United States)

    Lomio, J. Paul

    1990-01-01

    Briefly reviews the laws pertaining to copyrights, patents, and trade secrets, and discusses how each of these may be applied to the protection of computer programs. The comparative merits and limitations of each category of law are discussed and recent court decisions are summarized. (CLB)

  15. Learning Computer Programming: Implementing a Fractal in a Turing Machine

    Science.gov (United States)

    Pereira, Hernane B. de B.; Zebende, Gilney F.; Moret, Marcelo A.

    2010-01-01

    It is common to start a course on computer programming logic by teaching the algorithm concept from the point of view of natural languages, but in a schematic way. In this sense we note that the students have difficulties in understanding and implementing the problems proposed by the teacher. The main idea of this paper is to show that the…

  16. Individual Differences in Learning Computer Programming: A Social Cognitive Approach

    Science.gov (United States)

    Akar, Sacide Guzin Mazman; Altun, Arif

    2017-01-01

    The purpose of this study is to investigate and conceptualize the ranks of importance of social cognitive variables on university students' computer programming performances. Spatial ability, working memory, self-efficacy, gender, prior knowledge and the universities students attend were taken as variables to be analyzed. The study has been…

  17. A Domain-Specific Programming Language for Secure Multiparty Computation

    DEFF Research Database (Denmark)

    Nielsen, Janus Dam; Schwartzbach, Michael Ignatieff

    2007-01-01

    We present a domain-specific programming language for Secure Multiparty Computation (SMC). Information is a resource of vital importance and considerable economic value to individuals, public administration, and private companies. This means that the confidentiality of information is crucial, but...... application development. The language is implemented in a prototype compiler that generates Java code exploiting a distributed cryptographic runtime....

  18. A Research Program in Computer Technology. 1987 Annual Technical Report

    Science.gov (United States)

    1990-07-01

    The record includes fragments of the report's bibliography, among them a chapter on a "mathematical approach to computational network design" in E. E. Swartzlander (ed.), Systolic Signal Processing Systems, chapter 1, Marcel Dekker, 1987; W. L. Johnson, Intention-Based Diagnosis of Novice Programming Errors, Morgan Kaufmann, Los Altos, California, 1986; and Johnson, W. L., and E. Soloway, "PROUST" ...

  19. An Analysis on Distance Education Computer Programming Students' Attitudes Regarding Programming and Their Self-Efficacy for Programming

    Science.gov (United States)

    Ozyurt, Ozcan

    2015-01-01

    This study aims to analyze the attitudes of students studying computer programming through distance education regarding programming, and their self-efficacy for programming and the relation between these two factors. The study is conducted with 104 students being taught through distance education in a university in the north region of Turkey in…

  20. TVENT: a computer program for analysis of tornado-induced transients in ventilation systems. [TVENT

    Energy Technology Data Exchange (ETDEWEB)

    Duerre, K.H.; Andrae, R.W.; Gregory, W.S.

    1978-07-01

    The report describes TVENT, a portable FORTRAN computer program for predicting flows and pressures in a ventilation system subject to a tornado. The pressure and flow values calculated by TVENT can be used as a basis for structural analysis. TVENT is a one-dimensional, lumped-parameter model with incompressible flow augmented by fluid storage. The theoretical basis for the mathematical modeling and analysis is presented, and a description of the input for the computer code is provided. Modeling techniques specific to ventilation systems are described. Sample problems illustrate the use of TVENT in analyzing ventilation systems. Other sample problems illustrate modeling techniques used in reducing complex systems.

  1. Photovoltaic subsystem marketing and distribution model: programming manual. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1982-07-01

    Complete documentation of the marketing and distribution (M and D) computer model is provided. The purpose is to estimate the costs of selling and transporting photovoltaic solar energy products from the manufacturer to the final customer. The model adjusts for the inflation and regional differences in marketing and distribution costs. The model consists of three major components: the marketing submodel, the distribution submodel, and the financial submodel. The computer program is explained including the input requirements, output reports, subprograms and operating environment. The program specifications discuss maintaining the validity of the data and potential improvements. An example for a photovoltaic concentrator collector demonstrates the application of the model.

  2. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Background: Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results: The workshop, "ESF Exploratory Workshop on Computational disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion: During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  3. Implementation of the Distributed Parallel Program for Geoid Heights Computation Using MPI and Openmp

    Science.gov (United States)

    Lee, S.; Kim, J.; Jung, Y.; Choi, J.; Choi, C.

    2012-07-01

    Much research has been carried out on optimization algorithms for developing high-performance programs in parallel computing environments, following the evolution of computer hardware such as multi-core processors. However, studies applying parallel computing in the geodesy and surveying fields are still relatively few. The present study aims to reduce the running time of geoid height computation, and of the least-squares collocation used to improve its accuracy, by means of distributed parallel technology. A distributed parallel program was developed for a multi-core CPU-based PC cluster using the MPI and OpenMP libraries. Geoid heights were calculated by spherical harmonic analysis using the Earth geopotential model of the National Geospatial-Intelligence Agency (2008). The geoid heights around the Korean Peninsula were calculated and tested in a diskless PC cluster environment. The results confirmed that, for computing geoid heights from an Earth geopotential model, the distributed parallel program reduces the computational time substantially compared to the sequential program.
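    The block-decomposition idea described above can be sketched as follows using mpi4py (an assumed library choice; geoid_height is a hypothetical stand-in for the spherical-harmonic synthesis, and the grid bounds are illustrative, not those of the study):

        import numpy as np
        from mpi4py import MPI

        def geoid_height(lat, lon):
            # Placeholder for the expensive spherical-harmonic evaluation.
            return np.cos(np.radians(lat)) * np.sin(np.radians(lon))

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        lats = np.linspace(33.0, 43.0, 201)          # illustrative grid around the Korean Peninsula
        lons = np.linspace(124.0, 132.0, 161)

        my_lats = np.array_split(lats, size)[rank]   # static block decomposition over latitude rows
        my_block = np.array([[geoid_height(la, lo) for lo in lons] for la in my_lats])

        blocks = comm.gather(my_block, root=0)       # rank 0 assembles the full grid
        if rank == 0:
            grid = np.vstack(blocks)
            print("geoid grid shape:", grid.shape)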

  4. Computer programs for eddy-current defect studies

    Energy Technology Data Exchange (ETDEWEB)

    Pate, J. R.; Dodd, C. V. [Oak Ridge National Lab., TN (USA)

    1990-06-01

    Several computer programs to aid in the design of eddy-current tests and probes have been written. The programs, written in Fortran, deal in various ways with the response to defects exhibited by four types of probes: the pancake probe, the reflection probe, the circumferential boreside probe, and the circumferential encircling probe. Programs are included which calculate the impedance or voltage change in a coil due to a defect, which calculate and plot the defect sensitivity factor of a coil, and which invert calculated or experimental readings to obtain the size of a defect. The theory upon which the programs are based is the Burrows point defect theory, and thus the calculations of the programs will be more accurate for small defects. 6 refs., 21 figs.

  5. Introductory Computer Programming Course Teaching Improvement Using Immersion Language, Extreme Programming, and Education Theories

    Science.gov (United States)

    Velez-Rubio, Miguel

    2013-01-01

    Teaching computer programming to freshman students in Computer Sciences and other Information Technology areas has been identified as a complex activity. Different approaches have been studied in search of the one that best helps to improve this teaching process. A proposed approach was implemented which is based on language immersion…

  6. A parallel computational model for GATE simulations.

    Science.gov (United States)

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced.

  7. A COMPUTER PROGRAM FOR INTERPRETATION OF THE DATA OF VERTICAL ELECTRICAL SOUNDING VEZ-4A

    Directory of Open Access Journals (Sweden)

    D. G. Koliushko

    2017-06-01

    Purpose. To create a computer program for interpreting the results of vertical electrical sounding of the soil in the form of the multilayer models most typical for Ukraine. Methodology. The algorithm of the program is based on determining the soil structure using the point current source method, the method of analogy and the method of equivalents. The automatic interpretation option is based on the Hooke-Jeeves method. The program is implemented in the programming language Delphi. Results. The computer program «VEZ-4A» supports both interactive and automatic interpretation of sounding results within a multi-layered geoelectrical model. Originality. For the first time, a computer program for analyzing and interpreting the results of soil sounding with the Wenner configuration has been created on the basis of an analytical solution for the field of a point current source located in a four-, three- or two-layer structure. The paper reviews and analyzes the basic functions of the program. Practical value. The program «VEZ-4A» has been created and adapted for use in the electromagnetic diagnostics of the grounding of existing power plants and substations.
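    A compact variant of the Hooke-Jeeves pattern search mentioned above is sketched below. The objective function is a placeholder for a data misfit, not the VEZ-4A forward model:

        import numpy as np

        def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
            x = np.asarray(x0, dtype=float)
            fx = f(x)
            for _ in range(max_iter):
                # Exploratory moves: probe +/- step along each coordinate.
                x_new, f_new = x.copy(), fx
                for i in range(len(x)):
                    for d in (step, -step):
                        trial = x_new.copy()
                        trial[i] += d
                        ft = f(trial)
                        if ft < f_new:
                            x_new, f_new = trial, ft
                            break
                if f_new < fx:
                    # Pattern move: extrapolate along the successful direction.
                    pattern = x_new + (x_new - x)
                    fp = f(pattern)
                    x, fx = (pattern, fp) if fp < f_new else (x_new, f_new)
                else:
                    step *= shrink              # no improvement: shrink the step
                    if step < tol:
                        break
            return x, fx

        misfit = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.5) ** 2   # placeholder objective
        print(hooke_jeeves(misfit, x0=[0.0, 0.0]))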

  8. Portable Computer Technology (PCT) Research and Development Program Phase 2

    Science.gov (United States)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    The subject of this project report, focused on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low power Pentium processor, a high resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces. (2) Use these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives. The focus being on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  9. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation

  10. Solutions manual and computer programs for physical and computational aspects of convective heat transfer

    CERN Document Server

    Cebeci, Tuncer

    1989-01-01

    This book is designed to accompany Physical and Computational Aspects of Convective Heat Transfer by T. Cebeci and P. Bradshaw and contains solutions to the exercises and computer programs for the numerical methods contained in that book. Physical and Computational Aspects of Convective Heat Transfer begins with a thorough discussion of the physical aspects of convective heat transfer and presents in some detail the partial differential equations governing the transport of thermal energy in various types of flows. The book is intended for senior undergraduate and graduate students of aeronautical, chemical, civil and mechanical engineering. It can also serve as a reference for the practitioner.

  11. Computer program system for evaluation of FP nuclear data for JENDL. Smooth part

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Watanabe, Takashi; Iijima, Shungo

    1997-12-01

    This report describes computer programs used to evaluate nuclear data of fission product (FP) nuclides stored in an evaluated nuclear data library JENDL, especially in the smooth part above the resonance region. Many programs were used for determination of nuclear model parameters, calculation of nuclear data, handling of experimental and/or calculated data, and so on. Among them, reported here are programs for determination of level density parameters (ENSDFRET, LVLPLOT, LEVDES), for making sets of JCL and input data for the theoretical calculation program CASTHY (JOBSETTER, INDES/CASTHY), and for conversion of format of CASTHY output files to the ENDF format (CTOB2). (author). 51 refs.

  12. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state-of-art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under the mechanical stimulus up to optimizing the performance of sports equipment, through the patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease.   The book will be of interest to researchers, Ph.D. students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, experimental analysis.

  13. Analysis of a Cloud Computing-Based Incident Management Model

    Directory of Open Access Journals (Sweden)

    Anggi Sukamto

    2015-05-01

    The information technology support adopted by an organization requires management so that its use can meet the goals of adopting that technology. One information technology service management framework that organizations can adopt is the Information Technology Infrastructure Library (ITIL). Service support is part of the ITIL process. In general, service support activities are carried out using technology that can be accessed via the Internet. This situation points toward the concept of cloud computing. Cloud computing enables an institution or company to manage resources over the Internet. The focus of this research is to analyze the processes and actors involved in service support, particularly in the incident management process, and to identify the potential for handing those roles over to cloud computing services. Based on the analysis performed, the proposed cloud-based incident management model can be applied in an organization that already uses computer technology to support its operational activities. Keywords—Cloud computing, ITIL, Incident Management, Service Support, Service Desk.

  14. My Program Is Ok--Am I? Computing Freshmen's Experiences of Doing Programming Assignments

    Science.gov (United States)

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

    This article provides insight into how computing majors experience the process of doing programming assignments in their first programming course. This grounded theory study sheds light on the various processes and contexts through which students constantly assess their self-efficacy as a programmer. The data consists of a series of four…

  15. Motivating Programming: Using Storytelling to Make Computer Programming Attractive to Middle School Girls

    Science.gov (United States)

    2006-11-01

    In Generic Alice, there are two common default positions for characters and, consequently, two ... balance or generating the nth Fibonacci number. Often, students write programs in introductory computer science using professional programming ... handle without fundamentally changing the common control structures found in general-purpose languages. Consequently, when a student moves from one of

  16. Computer Programming Games and Gender Oriented Cultural Forms

    Science.gov (United States)

    AlSulaiman, Sarah Abdulmalik

    I present the design and evaluation of two games designed to help elementary and middle school students learn computer programming concepts. The first game was designed to be "gender neutral", aligning with what might be described as a consensus opinion on best practices for computational learning environments. The second game, based on the cultural form of dress-up dolls, was deliberately designed to appeal to females. I recruited 70 participants in an international two-phase study to investigate the relationship between games, gender, attitudes towards computer programming, and learning. My findings suggest that while the two games were equally effective in terms of learning outcomes, I saw differences in motivation between players of the two games. Specifically, participants who reported a preference for female-oriented games were more motivated to learn about computer programming when they played a game that they perceived as designed for females. In addition, I describe how the two games seemed to encourage different types of social activity between players in a classroom setting. Based on these results, I reflect on the strategy of exclusively designing games and activities as "gender neutral", and suggest that employing cultural forms, including gendered ones, may help create a more productive experience for learners.

  17. Computational methods of the Advanced Fluid Dynamics Model

    Energy Technology Data Exchange (ETDEWEB)

    Bohl, W.R.; Wilhelm, D.; Parker, F.R.; Berthier, J.; Maudlin, P.J.; Schmuck, P.; Goutagny, L.; Ichikawa, S.; Ninokata, H.; Luck, L.B.

    1987-01-01

    To more accurately treat severe accidents in fast reactors, a program has been set up to investigate new computational models and approaches. The product of this effort is a computer code, the Advanced Fluid Dynamics Model (AFDM). This paper describes some of the basic features of the numerical algorithm used in AFDM. Aspects receiving particular emphasis are the fractional-step method of time integration, the semi-implicit pressure iteration, the virtual mass inertial terms, the use of three velocity fields, higher order differencing, convection of interfacial area with source and sink terms, multicomponent diffusion processes in heat and mass transfer, the SESAME equation of state, and vectorized programming. A calculated comparison with an isothermal tetralin/ammonia experiment is performed. We conclude that significant improvements are possible in reliably calculating the progression of severe accidents with further development.

  18. Computer-Guided Deep Brain Stimulation Programming for Parkinson's Disease.

    Science.gov (United States)

    Heldman, Dustin A; Pulliam, Christopher L; Urrea Mendoza, Enrique; Gartner, Maureen; Giuffrida, Joseph P; Montgomery, Erwin B; Espay, Alberto J; Revilla, Fredy J

    2016-02-01

    Pilot study to evaluate computer-guided deep brain stimulation (DBS) programming designed to optimize stimulation settings using objective motion sensor-based motor assessments. Seven subjects (five males; 54-71 years) with Parkinson's disease (PD) and recently implanted DBS systems participated in this pilot study. Within two months of lead implantation, each subject returned to the clinic to undergo computer-guided programming and parameter selection. A motion sensor was placed on the index finger of the more affected hand. Software guided a monopolar survey during which monopolar stimulation on each contact was iteratively increased, followed by an automated assessment of tremor and bradykinesia. After completing assessments at each setting, a software algorithm determined stimulation settings designed to minimize symptom severities, side effects, and battery usage. Optimal DBS settings were chosen based on the average severity of motor symptoms measured by the motion sensor. Settings chosen by the software algorithm identified a therapeutic window and improved tremor and bradykinesia by an average of 35.7% compared with baseline in the "off" state. Computer-guided DBS programming identified stimulation parameters that significantly improved tremor and bradykinesia with minimal clinician involvement. Automated motion sensor-based mapping is worthy of further investigation and may one day serve to extend programming to populations without access to specialized DBS centers.

  19. Computational modeling for irrigated agriculture planning. Part I: general description and linear programming

    Directory of Open Access Journals (Sweden)

    João C. F. Borges Júnior

    2008-09-01

    Linear programming models are effective tools to support initial or periodic planning of agricultural enterprises, requiring, however, technical coefficients that can be determined using computer simulation models. This paper, presented in two parts, deals with the development, application and tests of a methodology and of a computational modeling tool to support planning of irrigated agriculture activities. Part I aimed at the development and application, including sensitivity analysis, of a multiyear linear programming model to optimize the financial return and water use at farm level for the Jaíba irrigation scheme, Minas Gerais State, Brazil, using data on crop irrigation requirement and yield obtained from previous simulation with the MCID model. The linear programming model produced a crop pattern for which a maximum total net present value of R$ 372,723.00 for the four-year period was obtained. Constraints on monthly water availability, labor, land and production were critical in the optimal solution. In relation to water use optimization, it was verified that substantial reductions in irrigation requirements may be achieved through small reductions in the maximum total net present value.
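    A toy example of the kind of linear program described above, using scipy.optimize.linprog (an assumed dependency; crop names, returns, and resource limits are illustrative and are not the Jaíba model):

        from scipy.optimize import linprog

        net_return = [1200.0, 900.0]     # assumed R$ per hectare for crop A and crop B
        c = [-v for v in net_return]     # linprog minimises, so negate to maximise

        A_ub = [
            [1.0, 1.0],                  # total area (ha) cannot exceed the farm size
            [4500.0, 2500.0],            # peak-month water use (m3/ha) vs. availability
        ]
        b_ub = [50.0, 160000.0]

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
        print("areas (ha):", res.x, " max net return (R$):", -res.fun)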

  20. Computer Aided Modelling – Opportunities and Challenges

    DEFF Research Database (Denmark)

    2011-01-01

    This chapter considers the opportunities that are present in developing, extending and applying aspects of computer-aided modelling principles and practice. What are the best tasks to be done by modellers and what needs the application of CAPE tools? How do we efficiently develop model-based solutions to significant problems? The important issues of workflow and data flow are discussed together with fit-for-purpose model development. As well, the lack of tools around multiscale modelling provides opportunities for the development of efficient tools to address such challenges. The ability to easily generate new models from underlying phenomena continues to be a challenge, especially in the face of time and cost constraints. Integrated frameworks that allow flexibility of model development and access to a range of embedded tools are central to future model developments. The challenges...

  1. Computer Adaptive Testing for Small Scale Programs and Instructional Systems

    Science.gov (United States)

    Rudner, Lawrence M.; Guo, Fanmin

    2011-01-01

    This study investigates measurement decision theory (MDT) as an underlying model for computer adaptive testing when the goal is to classify examinees into one of a finite number of groups. The first analysis compares MDT with a popular item response theory model and finds little difference in terms of the percentage of correct classifications. The…
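    The classification idea behind measurement decision theory can be sketched as a simple Bayes rule over mastery groups, as below. All probabilities are illustrative assumptions, not values from the study:

        import numpy as np

        # P(correct | group) for 5 items and 2 groups (non-master, master) -- assumed values.
        p_correct = np.array([
            [0.3, 0.8],
            [0.4, 0.9],
            [0.2, 0.7],
            [0.5, 0.9],
            [0.3, 0.8],
        ])
        prior = np.array([0.5, 0.5])
        responses = np.array([1, 1, 0, 1, 1])        # observed item scores

        # Likelihood of the response string under each group, then posterior by Bayes rule.
        likelihood = np.prod(np.where(responses[:, None] == 1, p_correct, 1 - p_correct), axis=0)
        posterior = prior * likelihood
        posterior /= posterior.sum()
        print("P(group | responses):", posterior, "-> classify as group", posterior.argmax())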

  2. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  3. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
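    As an illustration of the kind of executable logical model students build in such platforms, the following is a tiny synchronous Boolean-network sketch. The nodes and rules are purely hypothetical, and this is not the Cell Collective API:

        # Each node's next state is a Boolean function of the current state.
        rules = {
            "signal":      lambda s: s["signal"],                      # external input, held fixed
            "receptor":    lambda s: s["signal"],
            "kinase":      lambda s: s["receptor"] and not s["phosphatase"],
            "phosphatase": lambda s: s["kinase"],
            "gene_on":     lambda s: s["kinase"],
        }

        state = {"signal": True, "receptor": False, "kinase": False,
                 "phosphatase": False, "gene_on": False}

        for step in range(6):                       # synchronous updates from the previous state
            state = {node: bool(rule(state)) for node, rule in rules.items()}
            print(step, state)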

  4. Integrating Interactive Computational Modeling in Biology Curricula

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E.; Dahlquist, Lauren M.; Herek, Tyler A.; Larson, Joshua J.; Rogers, Jim A.

    2015-01-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by “building and breaking it” via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the “Vision and Change” call to action in undergraduate biology education by providing a hands-on approach to biology. PMID:25790483

  5. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  6. The Performance Improvement of the Lagrangian Particle Dispersion Model (LPDM) Using Graphics Processing Unit (GPU) Computing

    Science.gov (United States)

    2017-08-01

    The Performance Improvement of the Lagrangian Particle Dispersion Model (LPDM) Using Graphics Processing Unit (GPU) Computing, by Leelinda P. Dawson. Approved for public release; distribution unlimited.

  7. Engineering and programming manual: Two-dimensional kinetic reference computer program (TDK)

    Science.gov (United States)

    Nickerson, G. R.; Dang, L. D.; Coats, D. E.

    1985-01-01

    The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket thrust chamber performance prediction methodology. The development of a methodology that includes all aspects of rocket engine performance from analytical calculation to test measurements, that is physically accurate and consistent, and that serves as an industry and government reference is presented. Recent interest in rocket engines that operate at high expansion ratio, such as most Orbit Transfer Vehicle (OTV) engine designs, has required an extension of the analytical methods used by the TDK computer program. Thus, the version of TDK that is described in this manual is in many respects different from the 1973 version of the program. This new material reflects the new capabilities of the TDK computer program, the most important of which are described.

  8. Techniques for Engaging Students in an Online Computer Programming Course

    Directory of Open Access Journals (Sweden)

    Eman M. El-Sheikh

    2009-02-01

    Full Text Available Many institutions of higher education are significantly expanding their online program and course offerings to deal with the rapidly increasing demand for flexible educational alternatives. One of the main challenges that faculty who teach online courses face is determining how to engage students in an online environment. Teaching computer programming effectively requires demonstration of programming techniques, examples, and environments, and interaction with the students, making online delivery even more challenging. This paper describes efforts to engage students in an online introductory programming course at our institution. The tools and methods used to promote student engagement in the course are described, in addition to the lessons learned from the design and delivery of the online course and opportunities for future work.

  9. A Stochastic Dynamic Model of Computer Viruses

    Directory of Open Access Journals (Sweden)

    Chunming Zhang

    2012-01-01

    Full Text Available A stochastic computer virus spread model is proposed and its dynamic behavior is fully investigated. Specifically, we prove the existence and uniqueness of positive solutions, and the stability of the virus-free equilibrium and viral equilibrium by constructing Lyapunov functions and applying Ito's formula. Some numerical simulations are finally given to illustrate our main results.
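
    A minimal sketch of the kind of stochastic virus-spread simulation the abstract refers to — not the paper's model — is an Euler–Maruyama integration of a noisily perturbed SIS-type equation; all parameters below are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's model): Euler-Maruyama simulation of a
# stochastically perturbed SIS-type virus spread model
#   dI = [beta * I * (N - I) - gamma * I] dt + sigma * I dW
# Parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
N, beta, gamma, sigma = 1000.0, 0.0005, 0.2, 0.05
dt, steps = 0.01, 20000

I = 10.0                              # initially infected computers
trajectory = [I]
for _ in range(steps):
    drift = beta * I * (N - I) - gamma * I
    dW = rng.normal(0.0, np.sqrt(dt))
    I = max(I + drift * dt + sigma * I * dW, 0.0)
    trajectory.append(I)

print(f"final infected level: {trajectory[-1]:.1f}")
```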

  10. Computational Failure Modeling of Lower Extremities

    Science.gov (United States)

    2012-01-01


  11. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  12. Computational modelling for dry-powder inhalers

    NARCIS (Netherlands)

    Kröger, Ralf; Woolhouse, Robert; Becker, Michael; Wachtel, Herbert; de Boer, Anne; Horner, Marc

    2012-01-01

    Computational fluid dynamics (CFD) is a simulation tool used for modelling powder flow through inhalers to allow optimisation both of device design and drug powder. Here, Ralf Kröger, Consulting Senior CFD Engineer, ANSYS Germany GmbH; Marc Horner, Lead Technical Services Engineer, Healthcare,

  13. Multithreaded transactions in scientific computing. The Growth06_v2 program

    Science.gov (United States)

    Daniluk, Andrzej

    2009-07-01

    Writing a concurrent program can be more difficult than writing a sequential program. The programmer needs to think about synchronization, race conditions and shared variables. Transactions help reduce the inconvenience of using threads. A transaction is an abstraction which allows programmers to group a sequence of actions on the program into a logical, higher-level computation unit. This paper presents a new version of the GROWTHGr and GROWTH06 programs. New version program summary: Program title: GROWTH06_v2; Catalogue identifier: ADVL_v2_1; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVL_v2_1.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 65 255; No. of bytes in distributed program, including test data, etc.: 865 985; Distribution format: tar.gz; Programming language: Object Pascal; Computer: Pentium-based PC; Operating system: Windows 9x, XP, NT, Vista; RAM: more than 1 MB; Classification: 4.3, 7.2, 6.2, 8, 14; Catalogue identifier of previous version: ADVL_v2_0; Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 678; Does the new version supersede the previous version?: Yes. Nature of problem: The programs compute the RHEED intensities during the growth of thin epitaxial structures prepared using molecular beam epitaxy (MBE). The computations are based on the use of kinematical diffraction theory. Solution method: Epitaxial growth of thin films is modelled by a set of non-linear differential equations [1]. The Runge-Kutta method with adaptive stepsize control was used for solving the initial value problem for the non-linear differential equations [2]. Reasons for new version: According to the users' suggestions, the functionality of the program has been improved. Moreover, new use cases have been added which make the handling of the program easier and more
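
    The stated solution method is adaptive-stepsize Runge–Kutta integration of a non-linear initial value problem. A generic sketch of that technique, using SciPy's adaptive RK45 on a placeholder growth-type equation (not the GROWTH06 model), might look as follows.

```python
# Generic sketch of the stated solution method (adaptive-stepsize Runge-Kutta
# for a non-linear initial value problem); the toy equation below is a
# placeholder and is NOT the GROWTH06 growth model.
import numpy as np
from scipy.integrate import solve_ivp

def growth_rhs(t, theta, flux=1.0, tau=5.0):
    # d(theta)/dt: deposition minus a non-linear relaxation term (illustrative)
    return flux * (1.0 - theta) - theta / tau

sol = solve_ivp(growth_rhs, t_span=(0.0, 10.0), y0=[0.0],
                method="RK45", rtol=1e-8, atol=1e-10)  # adaptive step control
print(sol.t[-1], sol.y[0, -1])
```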

  14. A Multilayer Model of Computer Networks

    OpenAIRE

    Shchurov, Andrey A.

    2015-01-01

    The fundamental concept of applying the system methodology to network analysis declares that network architecture should take into account services and applications which this network provides and supports. This work introduces a formal model of computer networks on the basis of the hierarchical multilayer networks. In turn, individual layers are represented as multiplex networks. The concept of layered networks provides conditions of top-down consistency of the model. Next, we determined the...

  15. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science outstanding results are yielded by advanced simulation methods, based on state of the art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extractions from big data, better understanding and predicting of social behaviour and modelling health and environment changes.

  16. RECON: a computer program for analyzing repository economics. Documentation and user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Clark, L.L.; Cole, B.M.; McNair, G.W.; Schutz, M.E.

    1983-05-01

    From 1981 through 1983 the Pacific Northwest Laboratory has been developing a computer model named RECON to calculate repository costs from parametric data input. The objective of the program has been to develop the capability to evaluate the effect on costs of changes in repository design parameters and operating scenario assumptions. This report documents the development of the model through March of 1983. Included in the report are: (1) descriptions of model development and the underlying equations, assumptions and definitions; (2) descriptions of data input either using card images or an interactive data input program; and (3) detailed listings of the program and definitions of program variables. Cost estimates generated using the model have been verified against independent estimates and good agreement has been obtained.

  17. Qualification of a computer program for drill string dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Stone, C.M.; Carne, T.G.; Caskey, B.C.

    1985-01-01

    A four point plan for the qualification of the GEODYN drill string dynamics computer program is described. The qualification plan investigates both modal response and transient response of a short drill string subjected to simulated cutting loads applied through a polycrystalline diamond compact (PDC) bit. The experimentally based qualification shows that the analytical techniques included in Phase 1 GEODYN correctly simulate the dynamic response of the bit-drill string system. 6 refs., 8 figs.

  18. PET computer programs for use with the 88-inch cyclotron

    Energy Technology Data Exchange (ETDEWEB)

    Gough, R.A.; Chlosta, L.

    1981-06-01

    This report describes in detail several offline programs written for the PET computer which provide an efficient data management system to assist with the operation of the 88-Inch Cyclotron. This function includes the capability to predict settings for all cyclotron and beam line parameters for all beams within the present operating domain of the facility. The establishment of a data base for operational records is also described from which various aspects of the operating history can be projected.

  19. Designing, programming, and optimizing a (small) quantum computer

    Science.gov (United States)

    Svore, Krysta

    In 1982, Richard Feynman proposed to use a computer founded on the laws of quantum physics to simulate physical systems. In the more than thirty years since, quantum computers have shown promise to solve problems in number theory, chemistry, and materials science that would otherwise take longer than the lifetime of the universe to solve on an exascale classical machine. The practical realization of a quantum computer requires understanding and manipulating subtle quantum states while experimentally controlling quantum interference. It also requires an end-to-end software architecture for programming, optimizing, and implementing a quantum algorithm on the quantum device hardware. In this talk, we will introduce recent advances in connecting abstract theory to present-day real-world applications through software. We will highlight recent advancement of quantum algorithms and the challenges in ultimately performing a scalable solution on a quantum device.

  20. Integrated use of computer programs for analysis and optimization of aircraft structures

    Science.gov (United States)

    Moon, Young IN

    1990-01-01

    The objective is to present results from investigating the integrated use of five computer programs: ANALYZE, ASTROS, NASTRAN, OPTSTAT, and VAASEL for analysis and optimization of a given structure. The structure designated for study purposes was the F-15E vertical tail. The concept of integrated use is limited to the capability of using a NASTRAN structural model to run each of the other specified programs. This was actually accomplished in practice by converting a NASTRAN model of the designated structure into an appropriate analysis model. For optimization purposes, the torque-box of the F-15E vertical tail was used.

  1. Computational modelling of evolution: ecosystems and language

    CERN Document Server

    Lipowski, Adam

    2008-01-01

    Recently, computational modelling became a very important research tool that enables us to study problems that for decades evaded scientific analysis. Evolutionary systems are certainly examples of such problems: they are composed of many units that might reproduce, diffuse, mutate, die, or in some cases for example communicate. These processes might be of some adaptive value, they influence each other and occur on various time scales. That is why such systems are so difficult to study. In this paper we briefly review some computational approaches, as well as our contributions, to the evolution of ecosystems and language. We start from Lotka-Volterra equations and the modelling of simple two-species prey-predator systems. Such systems are canonical example for studying oscillatory behaviour in competitive populations. Then we describe various approaches to study long-term evolution of multi-species ecosystems. We emphasize the need to use models that take into account both ecological and evolutionary processe...
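
    The canonical starting point mentioned above, the Lotka–Volterra prey–predator equations, can be integrated numerically in a few lines; the coefficients below are illustrative only.

```python
# Minimal numerical integration of the classical Lotka-Volterra prey-predator
# equations (coefficients are illustrative):
#   dx/dt = a*x - b*x*y      (prey)
#   dy/dt = c*b*x*y - d*y    (predator)
import numpy as np
from scipy.integrate import solve_ivp

a, b, c, d = 1.0, 0.1, 0.5, 0.75

def lotka_volterra(t, z):
    x, y = z
    return [a * x - b * x * y, c * b * x * y - d * y]

sol = solve_ivp(lotka_volterra, (0, 50), [10.0, 5.0], dense_output=True)
t = np.linspace(0, 50, 500)
prey, predator = sol.sol(t)
print(prey[:5], predator[:5])   # oscillatory population cycles
```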

  2. Rough – Granular Computing knowledge discovery models

    Directory of Open Access Journals (Sweden)

    Mohammed M. Eissa

    2016-11-01

    Full Text Available The medical domain has become one of the most important areas of research owing to the richness of huge amounts of medical information about the symptoms of diseases and how to distinguish between them to diagnose them correctly. Knowledge discovery models play a vital role in the refinement and mining of medical indicators to help medical experts settle treatment decisions. This paper introduces four hybrid Rough–Granular Computing knowledge discovery models based on Rough Sets Theory, Artificial Neural Networks, Genetic Algorithms and Rough Mereology Theory. A comparative analysis of various knowledge discovery models that use different techniques for data pre-processing, reduction, and data mining supports medical experts in extracting the main medical indicators, reducing misdiagnosis rates and improving decision-making for medical diagnosis and treatment. The proposed models utilized two medical datasets: a Coronary Heart Disease dataset and a Hepatitis C Virus dataset. The main purpose of this paper was to explore and evaluate the proposed models, based on the Granular Computing methodology for knowledge extraction, according to different evaluation criteria for the classification of medical datasets. Another purpose was to enhance the framework of KDD processes for supervised learning using the Granular Computing methodology.

  3. The ACP (Advanced Computer Program) multiprocessor system at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Case, G.; Cook, A.; Fischler, M.; Gaines, I.; Hance, R.; Husby, D.

    1986-09-01

    The Advanced Computer Program at Fermilab has developed a multiprocessor system which is easy to use and uniquely cost effective for many high energy physics problems. The system is based on single board computers which cost under $2000 each to build including 2 Mbytes of on board memory. These standard VME modules each run experiment reconstruction code in Fortran at speeds approaching that of a VAX 11/780. Two versions have been developed: one uses Motorola's 68020 32-bit microprocessor, the other runs with AT&T's 32100. Both include the corresponding floating point coprocessor chip. The first system, when fully configured, uses 70 each of the two types of processors. A 53 processor system has been operated for several months with essentially no down time by computer operators in the Fermilab Computer Center, performing at nearly the capacity of 6 CDC Cyber 175 mainframe computers. The VME crates in which the processing "nodes" sit are connected via a high speed "Branch Bus" to one or more MicroVAX computers which act as hosts handling system resource management and all I/O in offline applications. An interface from Fastbus to the Branch Bus has been developed for online use which has been tested error free at 20 Mbytes/sec for 48 hours. ACP hardware modules are now available commercially. A major package of software, including a simulator that runs on any VAX, has been developed. It allows easy migration of existing programs to this multiprocessor environment. This paper describes the ACP Multiprocessor System and early experience with it at Fermilab and elsewhere.

  4. Towards quantum computing for the classical O(2) model

    CERN Document Server

    Zou, Haiyuan; Lai, Chen-Yen; Unmuth-Yockey, J; Bazavov, A; Xie, Z Y; Xiang, T; Chandrasekharan, S; Tsai, S -W; Meurice, Y

    2014-01-01

    We construct a sequence of steps connecting the classical $O(2)$ model in 1+1 dimensions, a model having common features with those considered in lattice gauge theory, to physical models potentially implementable on optical lattices and evolving at physical time. We show that the tensor renormalization group formulation of the classical model allows reliable calculations of the largest eigenvalues of the transfer matrix. We take the time continuum limit and check that finite dimensional projections used in recent proposals for quantum simulators provide controllable approximations of the original model. We propose two-species Bose-Hubbard models corresponding to these finite dimensional projections at strong coupling and discuss their possible implementations on optical lattices. The full completion of this program would provide a proof of principle that quantum computing is possible for classical lattice models.

  5. Natural Resources Research Program: Catalog of Computer Programs for Project Management.

    Science.gov (United States)

    project management. These include programs developed for use on a microcomputer, as well as those which run on a host computer but are accessed by a terminal in a field office. A one-page description of each program contains the title; preparing agency; abstract; a summary of the data inputs and outputs; equipment, disk, and memory requirements; operating system and programming language; and a contact for further information. The programs described in this publication are not limited to those available within the Corps, but also include those available from other

  6. Analysis of a Model for Computer Virus Transmission

    OpenAIRE

    Qin, Peng

    2015-01-01

    Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are equipped with antivirus software. The computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our t...

  7. Computational modeling of neurostimulation in brain diseases.

    Science.gov (United States)

    Wang, Yujiang; Hutchings, Frances; Kaiser, Marcus

    2015-01-01

    Neurostimulation as a therapeutic tool has been developed and used for a range of different diseases such as Parkinson's disease, epilepsy, and migraine. However, it is not known why the efficacy of the stimulation varies dramatically across patients or why some patients suffer from severe side effects. This is largely due to the lack of mechanistic understanding of neurostimulation. Hence, theoretical computational approaches to address this issue are in demand. This chapter provides a review of mechanistic computational modeling of brain stimulation. In particular, we will focus on brain diseases, where mechanistic models (e.g., neural population models or detailed neuronal models) have been used to bridge the gap between cellular-level processes of affected neural circuits and the symptomatic expression of disease dynamics. We show how such models have been, and can be, used to investigate the effects of neurostimulation in the diseased brain. We argue that these models are crucial for the mechanistic understanding of the effect of stimulation, allowing for a rational design of stimulation protocols. Based on mechanistic models, we argue that the development of closed-loop stimulation is essential in order to avoid interference with healthy ongoing brain activity. Furthermore, patient-specific data, such as neuroanatomic information and connectivity profiles obtainable from neuroimaging, can be readily incorporated to address the clinical issue of variability in efficacy between subjects. We conclude that mechanistic computational models can and should play a key role in the rational design of effective, fully integrated, patient-specific therapeutic brain stimulation. © 2015 Elsevier B.V. All rights reserved.

  8. Design and evaluation of the computer-based training program Calcularis for enhancing numerical cognition

    Directory of Open Access Journals (Sweden)

    Tanja eKäser

    2013-08-01

    Full Text Available This article presents the design and a first pilot evaluation of the computer-based training program Calcularis for children with developmental dyscalculia (DD or difficulties in learning mathematics. The program has been designed according to insights on the typical and atypical development of mathematical abilities. The learning process is supported through multimodal cues, which encode different properties of numbers. To offer optimal learning conditions, a user model completes the program and allows flexible adaptation to a child’s individual learning and knowledge profile. 32 children with difficulties in learning mathematics completed the 6 to 12-weeks computer training. The children played the game for 20 minutes per day for 5 days a week. The training effects were evaluated using neuropsychological tests. Generally, children benefited significantly from the training regarding number representation and arithmetic operations. Furthermore, children liked to play with the program and reported that the training improved their mathematical abilities.

  9. Computational Modeling of Pollution Transmission in Rivers

    Science.gov (United States)

    Parsaie, Abbas; Haghiabi, Amir Hamzeh

    2017-06-01

    Modeling of river pollution contributes to better management of water quality, which in turn leads to improved human health. The advection-dispersion equation (ADE) is the governing equation for pollutant transmission in a river. Modeling the pollution transmission involves numerical solution of the ADE and estimation of the longitudinal dispersion coefficient (LDC). In this paper, a novel approach is proposed for numerical modeling of pollution transmission in rivers. It combines the finite volume method as the numerical scheme with an artificial neural network (ANN) as a soft computing technique in a single simulation. In this approach, the LDC predicted by the ANN is used as an input parameter for the numerical solution of the ADE. To validate the model performance on real engineering problems, pollutant transmission in the Severn River has been simulated. Comparison of the final model results with measured data from the Severn River showed that the model performs well. Predicting the LDC with the ANN model significantly improved the accuracy of the computer simulation of pollution transmission in the river.
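
    A sketch of the numerical half of the described approach — an explicit upwind finite-volume update of the one-dimensional ADE — is given below; the dispersion coefficient is a fixed number standing in for the ANN prediction, and all parameters are illustrative rather than taken from the Severn River study.

```python
# Sketch of the numerical half of the described approach: an explicit upwind
# finite-volume update of the 1D advection-dispersion equation
#   dC/dt + u dC/dx = D d2C/dx2
# Here D (the longitudinal dispersion coefficient) is a fixed number standing
# in for the ANN prediction; all parameters are illustrative.
import numpy as np

nx, L = 200, 1000.0          # cells, reach length [m]
dx = L / nx
u, D = 0.5, 10.0             # velocity [m/s], dispersion coefficient [m^2/s]
dt = 0.4 * min(dx / u, dx**2 / (2 * D))   # stable explicit time step

C = np.zeros(nx)
C[5:10] = 100.0              # initial pollutant slug [mg/L]

for _ in range(2000):
    adv = -u * (C - np.roll(C, 1)) / dx                         # upwind advection
    dif = D * (np.roll(C, -1) - 2 * C + np.roll(C, 1)) / dx**2  # dispersion
    C = C + dt * (adv + dif)
    C[0] = 0.0               # clean inflow boundary

print(f"peak concentration after {2000 * dt:.0f} s: {C.max():.2f} mg/L")
```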

  10. Towards a Serious Game to Help Students Learn Computer Programming

    Directory of Open Access Journals (Sweden)

    Mathieu Muratet

    2009-01-01

    Full Text Available Video games are part of our culture like TV, movies, and books. We believe that this kind of software can be used to increase students' interest in computer science. Video games with other goals than entertainment, serious games, are present, today, in several fields such as education, government, health, defence, industry, civil security, and science. This paper presents a study around a serious game dedicated to strengthening programming skills. Real-Time Strategy, which is a popular game genre, seems to be the most suitable kind of game to support such a serious game. From programming teaching features to video game characteristics, we define a teaching organisation to experiment if a serious game can be adapted to learn programming.

  11. Teaching and Learning of Computational Modelling in Creative Shaping Processes

    Directory of Open Access Journals (Sweden)

    Daniela REIMANN

    2017-10-01

    Full Text Available Today, it is not only the diverse design-related disciplines that are required to deal actively with the digitization of information and its potentials and side effects for educational processes. In Germany, technology didactics developed within vocational education, and computer science education within general education, both separate from media pedagogy as an after-school program. Media education is not yet a subject in German schools. In this paper, however, we argue for an interdisciplinary approach to learning about computational modeling in creative processes and aesthetic contexts, one that crosses the borders of programming technology and arts and design processes in meaningful contexts. Educational scenarios using smart textile environments are introduced and reflected upon for project-based learning.

  12. Report of the 2014 Programming Models and Environments Summit

    Energy Technology Data Exchange (ETDEWEB)

    Heroux, Michael [US Dept. of Energy, Washington, DC (United States); Lethin, Richard [US Dept. of Energy, Washington, DC (United States)

    2016-09-19

    Programming models and environments play the essential roles in high performance computing of enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability since our codes have lifespans measured in decades, so the advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  13. HEATUP: a computer program for the thermal analysis of a LOFC accident in an HTGR

    Energy Technology Data Exchange (ETDEWEB)

    Siman-Tov, I.I.; Turner, W.D.

    1976-11-01

    The HEATUP code, a modification of the general, time-dependent, one-, two-, and three-dimensional program HEATING5, was designed for the thermal analysis of a Loss of Forced Circulation accident in a High Temperature Gas-Cooled Reactor. This report contains a description of the computational model which includes: a description of the basic problem; a short review of preliminary results related to the choice of thermal properties, boundary conditions and initial conditions; a full description of a typical three-dimensional R-Z model and a limited one of a two-dimensional RZ model. HEATUP's additional computations are presented together with the method of input preparation. The three-dimensional model of the Fulton Generating Station Loss of Forced Circulation accident is used as a sample problem. A complete presentation of the input data is made. Also, the computer printout of the sample problem input data and results are given.

  14. Applications of computer modeling to fusion research. Progress report, 1988--1989

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, J.M.

    1989-12-31

    Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling.

  15. Computer Aided Modeling of Aquaculture Plants

    Directory of Open Access Journals (Sweden)

    Arne Tyssø

    1986-10-01

    Full Text Available Mathematical modeling of dynamic processes is often considered an intricate and time consuming task. Program packages for simulation, time series analysis and identification in combination with modern data logging equipment allow the task to be handled in a simpler and more efficient way.

  16. Distributed parallel computing in stochastic modeling of groundwater systems.

    Science.gov (United States)

    Dong, Yanhui; Li, Guomin; Xu, Haizhen

    2013-03-01

    Stochastic modeling is a rapidly evolving, popular approach to the study of the uncertainty and heterogeneity of groundwater systems. However, the use of Monte Carlo-type simulations to solve practical groundwater problems often encounters computational bottlenecks that hinder the acquisition of meaningful results. To improve the computational efficiency, a system that combines stochastic model generation with MODFLOW-related programs and distributed parallel processing is investigated. The distributed computing framework, called the Java Parallel Processing Framework, is integrated into the system to allow the batch processing of stochastic models in distributed and parallel systems. As an example, the system is applied to the stochastic delineation of well capture zones in the Pinggu Basin in Beijing. Through the use of 50 processing threads on a cluster with 10 multicore nodes, the execution times of 500 realizations are reduced to 3% compared with those of a serial execution. Through this application, the system demonstrates its potential in solving difficult computational problems in practical stochastic modeling. © 2012, The Author(s). Groundwater © 2012, National Ground Water Association.
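
    The batch processing of stochastic realizations described above can be illustrated, on a single machine, with Python's multiprocessing; the toy random-conductivity model below is a stand-in for MODFLOW, not the authors' setup.

```python
# Minimal single-machine analogue of the described batch processing of
# stochastic realizations (the toy model below is a stand-in, not MODFLOW):
# each worker evaluates one realization with its own random conductivity field.
import numpy as np
from multiprocessing import Pool

def run_realization(seed):
    rng = np.random.default_rng(seed)
    # toy "model": travel-time proxy through 100 cells with lognormal conductivity
    k = rng.lognormal(mean=0.0, sigma=1.0, size=100)
    return (1.0 / k).sum()

if __name__ == "__main__":
    with Pool(processes=8) as pool:
        results = pool.map(run_realization, range(500))   # 500 realizations
    print(f"mean: {np.mean(results):.2f}, 95th pct: {np.percentile(results, 95):.2f}")
```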

  17. WATRE: a program for computing water and gas released from heated concrete

    Energy Technology Data Exchange (ETDEWEB)

    Claybrook, S.W.; Muhlestein, L.D.

    1985-01-01

    The WATRE computer program calculates the rate and quantity of water and carbon dioxide gas released from heated concrete. Recent development efforts have improved the numerical solution scheme, resulting in increased computational efficiency. The WATRE model is presented and the numerical procedure used to solve the governing equations is outlined. Validation of the WATRE model by comparison with extensive experimental data is emphasized. Results of a sensitivity study which investigated the effects that changes in input data have on WATRE calculations are also discussed.

  18. UDATE1: A computer program for the calculation of uranium-series isotopic ages

    Science.gov (United States)

    Rosenbauer, Robert J.

    UDATE1 is a FORTRAN-77 program with an interface for an Apple Macintosh computer that calculates isotope activities from measured count rates to date geologic materials by uranium-series disequilibria. Dates on pure samples can be determined directly by the accumulation of 230Th from 234U and of 231Pa from 235U. Dates for samples contaminated by clays containing abundant natural thorium can be corrected by the program using various mixing models. Input to the program and file management are made simple and user friendly by a series of Macintosh modal dialog boxes.
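
    A sketch of the closed-system 230Th/U age calculation that such a program performs (not the UDATE1 code itself) is shown below; the decay constants are commonly quoted values and should be treated as assumptions.

```python
# Sketch (not the UDATE1 code): solve the standard closed-system 230Th/U age
# equation for t given measured activity ratios,
#   (230Th/238U) = 1 - exp(-l230*t)
#                  + ((234U/238U) - 1) * l230/(l230 - l234) * (1 - exp(-(l230 - l234)*t))
# Decay constants below are commonly quoted values; treat them as assumptions.
import numpy as np
from scipy.optimize import brentq

l230 = np.log(2) / 75_690.0     # 230Th decay constant [1/yr]
l234 = np.log(2) / 245_250.0    # 234U decay constant [1/yr]

def th230_u238(t, u234_u238):
    return (1.0 - np.exp(-l230 * t)
            + (u234_u238 - 1.0) * l230 / (l230 - l234)
            * (1.0 - np.exp(-(l230 - l234) * t)))

def age(th230_u238_meas, u234_u238_meas):
    """Age in years from measured (230Th/238U) and (234U/238U) activity ratios."""
    return brentq(lambda t: th230_u238(t, u234_u238_meas) - th230_u238_meas,
                  1.0, 600_000.0)

print(f"{age(0.65, 1.10):,.0f} years")   # example ratios (illustrative)
```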

  19. DITTY - a computer program for calculating population dose integrated over ten thousand years

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.

    1986-03-01

    The computer program DITTY (Dose Integrated Over Ten Thousand Years) was developed to determine the collective dose from long term nuclear waste disposal sites resulting from the ground-water pathways. DITTY estimates the time integral of collective dose over a ten-thousand-year period for time-variant radionuclide releases to surface waters, wells, or the atmosphere. This document includes the following information on DITTY: a description of the mathematical models, program designs, data file requirements, input preparation, output interpretations, sample problems, and program-generated diagnostic messages.
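
    The quantity DITTY reports — the time integral of collective dose over ten thousand years — can be illustrated with a toy calculation; the dose-rate function below is purely illustrative and unrelated to DITTY's transport models.

```python
# Toy sketch of the quantity DITTY reports (not its transport models): the time
# integral of collective dose over a ten-thousand-year period for a time-variant
# release. The dose-rate function below is purely illustrative.
import numpy as np

years = np.linspace(0.0, 10_000.0, 10_001)           # annual grid
# illustrative collective dose rate [person-Sv/yr]: delayed release that decays
dose_rate = 2.0e-3 * (1 - np.exp(-years / 500.0)) * np.exp(-years / 3000.0)

# trapezoidal rule for the time integral over the 10,000-year period
collective_dose = np.sum(0.5 * (dose_rate[1:] + dose_rate[:-1]) * np.diff(years))
print(f"integrated collective dose: {collective_dose:.1f} person-Sv")
```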

  20. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  1. Method of generating a computer readable model

    DEFF Research Database (Denmark)

    2008-01-01

    A method of generating a computer readable model of a geometrical object constructed from a plurality of interconnectable construction elements, wherein each construction element has a number of connection elements for connecting the construction element with another construction element. The method comprises encoding a first and a second one of the construction elements as corresponding data structures, each representing the connection elements of the corresponding construction element, and each of the connection elements having associated with it a predetermined connection type. The method
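
    A hypothetical, minimal encoding of the data structures described — field names and connection types are assumptions, not the method's actual schema — might look like this.

```python
# Hypothetical minimal encoding of the data structures described (field names
# and connection types are assumptions, not the method's actual schema): each
# construction element carries its connection elements, each with a
# predetermined connection type.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ConnectionElement:
    position: Tuple[float, float, float]    # location on the element
    connection_type: str                    # e.g. "stud" or "tube" (assumed types)

@dataclass
class ConstructionElement:
    element_id: int
    connections: List[ConnectionElement] = field(default_factory=list)

def compatible(a: ConnectionElement, b: ConnectionElement) -> bool:
    # assumed rule: a "stud" connects to a "tube" and vice versa
    return {a.connection_type, b.connection_type} == {"stud", "tube"}

brick = ConstructionElement(1, [ConnectionElement((0, 0, 1), "stud")])
plate = ConstructionElement(2, [ConnectionElement((0, 0, 0), "tube")])
print(compatible(brick.connections[0], plate.connections[0]))   # True
```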

  2. STARS: An Integrated, Multidisciplinary, Finite-Element, Structural, Fluids, Aeroelastic, and Aeroservoelastic Analysis Computer Program

    Science.gov (United States)

    Gupta, K. K.

    1997-01-01

    A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.

  3. A hybrid computational model for phagocyte transmigration

    OpenAIRE

    Xue, Jiaxing; Gao, Jean; Tang, Liping

    2008-01-01

    Phagocyte transmigration is the initiation of a series of phagocyte responses that are believed to be important in the formation of fibrotic capsules surrounding implanted medical devices. Understanding the molecular mechanisms governing phagocyte transmigration is highly desirable in order to improve the stability and functionality of the implanted devices. A hybrid computational model that combines control theory and a kinetic Monte Carlo (KMC) algorithm is proposed to simulate and predict phagocyte...

  4. An Impulse Model for Computer Viruses

    Directory of Open Access Journals (Sweden)

    Chunming Zhang

    2012-01-01

    Full Text Available A computer virus spread model incorporating an impulsive control strategy is proposed and analyzed. We prove that there exists a globally attractive infection-free periodic solution when the vaccination rate is larger than θ0. Moreover, we show that the system is uniformly persistent if the vaccination rate is less than θ1. Some numerical simulations are finally given to illustrate the main results.
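
    A hypothetical sketch of a pulse-vaccination virus model of this general kind (not the paper's equations, and without computing the thresholds θ0 and θ1) is given below; parameters are illustrative.

```python
# Hypothetical sketch (not the paper's equations): an SIR-type computer virus
# model integrated between impulses, with a fraction p of susceptible machines
# vaccinated at fixed intervals T. Parameters are illustrative; the thresholds
# theta_0 / theta_1 from the abstract are not computed here.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 0.4, 0.1        # infection and cure rates
p, T = 0.3, 5.0               # pulse vaccination fraction and period

def virus_rhs(t, y):
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

state = np.array([0.95, 0.05, 0.0])
for k in range(20):                       # 20 vaccination periods
    sol = solve_ivp(virus_rhs, (k * T, (k + 1) * T), state)
    state = sol.y[:, -1]
    vaccinated = p * state[0]             # impulsive vaccination of susceptibles
    state[0] -= vaccinated
    state[2] += vaccinated

print(f"infected fraction after {20 * T:.0f} time units: {state[1]:.4f}")
```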

  5. Programming models used on Many-Core architectures

    Science.gov (United States)

    Novotný, Jan

    2014-12-01

    The time in which we live is characterized by an ever-increasing amount of data that we are able to acquire and explore; examples can be found in all fields of science. Processing large volumes of information therefore requires engaging computational science. With increasing demands on data processing, it is advantageous to adopt new technology and to start using parallel computation. Effective use of current technology requires new knowledge and skills from programmers, who are confronted with countless new programming models and tools. In this article, we summarize the most commonly used programming models and the criteria a good programming model should meet. The article also highlights the reasons why one should use structured parallel programming.

  6. HAL/SM language specification. [programming languages and computer programming for space shuttles

    Science.gov (United States)

    Williams, G. P. W., Jr.; Ross, C.

    1975-01-01

    A programming language is presented for the flight software of the NASA Space Shuttle program. It is intended to satisfy virtually all of the flight software requirements of the space shuttle. To achieve this, it incorporates a wide range of features, including applications-oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks. It is a higher order language designed to allow programmers, analysts, and engineers to communicate with the computer in a form approximating natural mathematical expression. Parts of the English language are combined with standard notation to provide a tool that readily encourages programming without demanding computer hardware expertise. Block diagrams and flow charts are included. The semantics of the language is discussed.

  7. Computational fluid dynamics modelling in cardiovascular medicine.

    Science.gov (United States)

    Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2016-01-01

    This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges. Published by the BMJ Publishing Group Limited. For permission

  8. Computer-Supported Modelling of Multimodal Transportation Networks Rationalization

    Directory of Open Access Journals (Sweden)

    Ratko Zelenika

    2007-09-01

    Full Text Available This paper deals with issues of shaping and functioning of computer programs in the modelling and solving of multimodal transportation network problems. A methodology for the integrated use of a programming language for mathematical modelling is defined, as well as spreadsheets for the solving of complex multimodal transportation network problems. The paper contains a comparison of the partial and integral methods of solving multimodal transportation networks. The basic hypothesis set forth in this paper is that the integral method results in better multimodal transportation network rationalization effects, whereas a multimodal transportation network model based on the integral method, once built, can be used as the basis for all kinds of transportation problems within multimodal transport. As opposed to linear transport problems, a multimodal transport network can assume very complex shapes. This paper contains a comparison of the partial and integral approach to transportation network solving. In the partial approach, a straightforward model of a transportation network, which can be solved through the use of the Solver computer tool within the Excel spreadsheet interface, is quite sufficient. In the solving of a multimodal transportation problem through the integral method, it is necessary to apply sophisticated mathematical modelling programming languages which support the use of complex matrix functions and the processing of a vast amount of variables and limitations. The LINGO programming language is more abstract than the Excel spreadsheet, and it requires a certain programming knowledge. The definition and presentation of a problem logic within Excel, in a manner which is acceptable to computer software, is an ideal basis for modelling in the LINGO programming language, as well as a faster and more effective implementation of the mathematical model. This paper provides proof for the fact that it is more rational to solve the problem of multimodal transportation networks by
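
    As an illustration of the kind of optimization the "partial approach" handles with a spreadsheet Solver, a minimal single-mode transportation problem can be solved with SciPy's linear programming routine; the data are invented and the model has none of the multimodal structure discussed in the paper.

```python
# Minimal single-mode transportation problem of the kind the "partial approach"
# handles (illustrative data, not the paper's multimodal model): minimize total
# shipping cost subject to supply and demand constraints.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 9.0],      # cost[i, j]: source i -> destination j
                 [5.0, 3.0, 7.0]])
supply = np.array([60.0, 80.0])
demand = np.array([50.0, 40.0, 50.0])

m, n = cost.shape
A_eq = []
# each destination's demand must be met exactly
for j in range(n):
    row = np.zeros(m * n)
    row[j::n] = 1.0
    A_eq.append(row)
A_ub = []
# each source cannot ship more than its supply
for i in range(m):
    row = np.zeros(m * n)
    row[i * n:(i + 1) * n] = 1.0
    A_ub.append(row)

res = linprog(cost.ravel(), A_ub=np.array(A_ub), b_ub=supply,
              A_eq=np.array(A_eq), b_eq=demand, bounds=(0, None))
print(res.x.reshape(m, n), res.fun)
```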

  9. Computer Based Modelling and Simulation-Modelling and ...

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation: Modelling and Simulation with Probability and Throwing Dice, by N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 4, April 2001, pp. 69–77.

  10. Parallel Computation of the Regional Ocean Modeling System (ROMS)

    Energy Technology Data Exchange (ETDEWEB)

    Wang, P; Song, Y T; Chao, Y; Zhang, H

    2005-04-05

    The Regional Ocean Modeling System (ROMS) is a regional ocean general circulation modeling system solving the free surface, hydrostatic, primitive equations over varying topography. It is free software distributed world-wide for studying both complex coastal ocean problems and the basin-to-global scale ocean circulation. The original ROMS code could only be run on shared-memory systems. With the increasing need to simulate larger model domains with finer resolutions and on a variety of computer platforms, there is a need in the ocean-modeling community to have a ROMS code that can be run on any parallel computer ranging from 10 to hundreds of processors. Recently, we have explored parallelization for ROMS using the MPI programming model. In this paper, an efficient parallelization strategy for such a large-scale scientific software package, based on an existing shared-memory computing model, is presented. In addition, scientific applications and data-performance issues on a couple of SGI systems, including Columbia, the world's third-fastest supercomputer, are discussed.
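
    The MPI-based decomposition described above can be sketched schematically (assuming mpi4py is available; this is not ROMS code) as a one-dimensional domain split with halo exchange between neighbouring ranks.

```python
# Schematic sketch of the MPI-style domain decomposition used to parallelize
# grid-based ocean models (assumes mpi4py; this is NOT ROMS code): a 1D field
# is split across ranks and one-cell halos are exchanged each step.
# Run with e.g.:  mpiexec -n 4 python halo_demo.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

nlocal = 100                                   # interior cells per rank
field = np.full(nlocal + 2, float(rank))       # +2 ghost cells
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(10):
    # exchange halos with neighbours (send edge cells, receive into ghost cells)
    comm.Sendrecv(sendbuf=field[nlocal:nlocal + 1], dest=right,
                  recvbuf=field[0:1], source=left)
    comm.Sendrecv(sendbuf=field[1:2], dest=left,
                  recvbuf=field[nlocal + 1:nlocal + 2], source=right)
    # simple diffusion update on the interior using the ghost cells
    field[1:nlocal + 1] += 0.1 * (field[0:nlocal] - 2 * field[1:nlocal + 1]
                                  + field[2:nlocal + 2])

print(rank, field[1], field[nlocal])
```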

  11. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in. Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti
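
    As a tiny, generic example of the kind of equation-based model such a text teaches (not an excerpt from the book), the closed-form M/M/1 queue metrics are easy to compute.

```python
# Tiny generic illustration of analytical performance modeling (not taken from
# the book): closed-form M/M/1 queue metrics for a server with Poisson arrivals
# (rate lam) and exponential service (rate mu), valid for lam < mu.
def mm1_metrics(lam: float, mu: float) -> dict:
    assert lam < mu, "queue is unstable unless arrival rate < service rate"
    rho = lam / mu                     # utilization
    n_mean = rho / (1.0 - rho)         # mean number of jobs in the system
    r_mean = 1.0 / (mu - lam)          # mean response time (Little's law: N = lam * R)
    return {"utilization": rho, "mean_jobs": n_mean, "mean_response_time": r_mean}

print(mm1_metrics(lam=8.0, mu=10.0))   # e.g. 8 req/s arriving, 10 req/s service
```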

  12. Factors that Influence the Success of Male and Female Computer Programming Students in College

    Science.gov (United States)

    Clinkenbeard, Drew A.

    As the demand for a technologically skilled work force grows, experience and skill in computer science have become increasingly valuable for college students. However, the number of students graduating with computer science degrees is not growing proportional to this need. Traditionally several groups are underrepresented in this field, notably women and students of color. This study investigated elements of computer science education that influence academic achievement in beginning computer programming courses. The goal of the study was to identify elements that increase success in computer programming courses. A 38-item questionnaire was developed and administered during the Spring 2016 semester at California State University Fullerton (CSUF). CSUF is an urban public university comprised of about 40,000 students. Data were collected from three beginning programming classes offered at CSUF. In total 411 questionnaires were collected resulting in a response rate of 58.63%. Data for the study were grouped into three broad categories of variables. These included academic and background variables; affective variables; and peer, mentor, and role-model variables. A conceptual model was developed to investigate how these variables might predict final course grade. Data were analyzed using statistical techniques such as linear regression, factor analysis, and path analysis. Ultimately this study found that peer interactions, comfort with computers, computer self-efficacy, self-concept, and perception of achievement were the best predictors of final course grade. In addition, the analyses showed that male students exhibited higher levels of computer self-efficacy and self-concept compared to female students, even when they achieved comparable course grades. Implications and explanations of these findings are explored, and potential policy changes are offered.

  13. Computational acoustic modeling of cetacean vocalizations

    Science.gov (United States)

    Gurevich, Michael Dixon

    A framework for computational acoustic modeling of hypothetical vocal production mechanisms in cetaceans is presented. As a specific example, a model of a proposed source in the larynx of odontocetes is developed. Whales and dolphins generate a broad range of vocal sounds, but the exact mechanisms they use are not conclusively understood. In the fifty years since it has become widely accepted that whales can and do make sound, how they do so has remained particularly confounding. Cetaceans' highly divergent respiratory anatomy, along with the difficulty of internal observation during vocalization have contributed to this uncertainty. A variety of acoustical, morphological, ethological and physiological evidence has led to conflicting and often disputed theories of the locations and mechanisms of cetaceans' sound sources. Computational acoustic modeling has been used to create real-time parametric models of musical instruments and the human voice. These techniques can be applied to cetacean vocalizations to help better understand the nature and function of these sounds. Extensive studies of odontocete laryngeal morphology have revealed vocal folds that are consistently similar to a known but poorly understood acoustic source, the ribbon reed. A parametric computational model of the ribbon reed is developed, based on simplified geometrical, mechanical and fluid models drawn from the human voice literature. The physical parameters of the ribbon reed model are then adapted to those of the odontocete larynx. With reasonable estimates of real physical parameters, both the ribbon reed and odontocete larynx models produce sounds that are perceptually similar to their real-world counterparts, and both respond realistically under varying control conditions. Comparisons of acoustic features of the real-world and synthetic systems show a number of consistencies. While this does not on its own prove that either model is conclusively an accurate description of the source, it

  14. Computational Design Modelling : Proceedings of the Design Modelling Symposium

    CERN Document Server

    Kilian, Axel; Palz, Norbert; Scheurer, Fabian

    2012-01-01

    This book publishes the peer-reviewed proceedings of the third Design Modelling Symposium Berlin. The conference constitutes a platform for dialogue on experimental practice and research within the field of computationally informed architectural design. More than 60 leading experts examine the computational processes within this field in order to develop a broader and less exotic building practice that bears more subtle but powerful traces of the complex tool set and approaches we have developed and studied over recent years. The outcome is a set of new strategies for a reasonable and innovative implementation of digital potential in truly innovative and radical design, guided both by responsibility towards processes and by the consequences they initiate.

  15. [An ALGOL program for the computation of empirical regressions].

    Science.gov (United States)

    Peil, J; Schmerling, S

    1977-01-01

    An explanation is given of the meaning of empirical regression and of the domain of application of this biomathematical-statistical procedure. It may be helpful in data handling after the measurements and in a first stage of data processing, especially if there is a large amount of data. An empirical regression can provide the basis for a functional relationship analysis by giving hints for the choice of empirical mathematical functions. This will be useful and necessary in cases where the measured values have a large dispersion and one wants to obtain an analytical expression for the course of the measured points. In the appendix, a listing of the ALGOL program for empirical regression is presented. Detailed remarks are made in the text concerning the program structure, the data input and output, and the program control parameters, to enable the biological or medical user to adapt the program to their special problems without the help of a mathematician and without deeper knowledge of mathematics or detailed insight into the computer-technical aspects of data processing.

  16. Toward a computational model of hemostasis

    Science.gov (United States)

    Leiderman, Karin; Danes, Nicholas; Schoeman, Rogier; Neeves, Keith

    2017-11-01

    Hemostasis is the process by which a blood clot forms to prevent bleeding at a site of injury. The formation time, size and structure of a clot depend on the local hemodynamics and the nature of the injury. Our group has previously developed computational models to study intravascular clot formation, a process confined to the interior of a single vessel. Here we present the first stage of an experimentally-validated, computational model of extravascular clot formation (hemostasis) in which blood flowing through a single vessel initially escapes through a hole in the vessel wall and out a separate injury channel. This stage of the model consists of a system of partial differential equations that describe platelet aggregation and hemodynamics, solved via the finite element method. We also present results from the analogous in vitro microfluidic model. In both models, formation of a blood clot occludes the injury channel and stops flow from escaping while blood in the main vessel retains its fluidity. We discuss the different biochemical and hemodynamic effects on clot formation using distinct geometries representing intra- and extravascular injuries.

  17. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  18. Computer Science and Perl Programming Best of Perl Journal

    CERN Document Server

    2002-01-01

    With more than a million dedicated programmers, Perl has proven to be the best computing language for the latest trends in computing and business. While other languages have stagnated, Perl remains fresh, thanks to its community-based development model, which encourages the sharing of information among users. This tradition of knowledge-sharing allows developers to find answers to almost any Perl question they can dream up.And you can find many of those answers right here in Perl Hacks. Like all books in O'Reilly's Hacks Series, Perl Hacks appeals to a variety of programmers, whether you're a

  19. Computer Modeling of Human Delta Opioid Receptor

    Directory of Open Access Journals (Sweden)

    Tatyana Dzimbova

    2013-04-01

    Full Text Available The development of selective agonists of the δ-opioid receptor, as well as models of the interaction of ligands with this receptor, are subjects of increased interest. In the absence of crystal structures of opioid receptors, 3D homology models based on different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study are: (1) to choose, from recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group and their biological activity was evaluated. Several models of DOR were generated using different templates. All these models were evaluated by PROCHECK and MolProbity, and the relationship between docking data and in vitro results was determined. For the tested models of DOR, the best correlations were found between the efficacy (erel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from the docking studies. A new model of DOR was generated and evaluated by different approaches. This model has a good GA341 value (0.99) from MODELLER and good values from PROCHECK (92.6% of most favored regions) and MolProbity (99.5% of favored regions). The scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with erel of a series of enkephalin analogues calculated from in vitro experiments. This investigation therefore allows us to suggest a reliable model of DOR. The newly generated model of DOR could be used further for in silico experiments, and it will make possible faster and more accurate design of selective and effective ligands for the δ-opioid receptor.

  20. Stochastic computations in cortical microcircuit models.

    Directory of Open Access Journals (Sweden)

    Stefan Habenschuss

    Full Text Available Experimental data from neuroscience suggest that a substantial amount of knowledge is stored in the brain in the form of probability distributions over network states and trajectories of network states. We provide a theoretical foundation for this hypothesis by showing that even very detailed models for cortical microcircuits, with data-based diverse nonlinear neurons and synapses, have a stationary distribution of network states and trajectories of network states to which they converge exponentially fast from any initial state. We demonstrate that this convergence holds in spite of the non-reversibility of the stochastic dynamics of cortical microcircuits. We further show that, in the presence of background network oscillations, separate stationary distributions emerge for different phases of the oscillation, in accordance with experimentally reported phase-specific codes. We complement these theoretical results by computer simulations that investigate resulting computation times for typical probabilistic inference tasks on these internally stored distributions, such as marginalization or marginal maximum-a-posteriori estimation. Furthermore, we show that the inherent stochastic dynamics of generic cortical microcircuits enables them to quickly generate approximate solutions to difficult constraint satisfaction problems, where stored knowledge and current inputs jointly constrain possible solutions. This provides a powerful new computing paradigm for networks of spiking neurons, that also throws new light on how networks of neurons in the brain could carry out complex computational tasks such as prediction, imagination, memory recall and problem solving.

  1. Stochastic computations in cortical microcircuit models.

    Science.gov (United States)

    Habenschuss, Stefan; Jonke, Zeno; Maass, Wolfgang

    2013-01-01

    Experimental data from neuroscience suggest that a substantial amount of knowledge is stored in the brain in the form of probability distributions over network states and trajectories of network states. We provide a theoretical foundation for this hypothesis by showing that even very detailed models for cortical microcircuits, with data-based diverse nonlinear neurons and synapses, have a stationary distribution of network states and trajectories of network states to which they converge exponentially fast from any initial state. We demonstrate that this convergence holds in spite of the non-reversibility of the stochastic dynamics of cortical microcircuits. We further show that, in the presence of background network oscillations, separate stationary distributions emerge for different phases of the oscillation, in accordance with experimentally reported phase-specific codes. We complement these theoretical results by computer simulations that investigate resulting computation times for typical probabilistic inference tasks on these internally stored distributions, such as marginalization or marginal maximum-a-posteriori estimation. Furthermore, we show that the inherent stochastic dynamics of generic cortical microcircuits enables them to quickly generate approximate solutions to difficult constraint satisfaction problems, where stored knowledge and current inputs jointly constrain possible solutions. This provides a powerful new computing paradigm for networks of spiking neurons, that also throws new light on how networks of neurons in the brain could carry out complex computational tasks such as prediction, imagination, memory recall and problem solving.

  2. Computational model of a copper laser

    Energy Technology Data Exchange (ETDEWEB)

    Boley, C.D.; Molander, W.A.; Warner, B.E.

    1997-03-26

    This report describes a computational model of a copper laser amplifier. The model contains rate equations for copper and the buffer gas species (neon and hydrogen), along with equations for the electron temperature, the laser intensity, and the diffusing magnetic field of the discharge. Rates are given for all pertinent atomic reactions. The radial profile of the gas temperature is determined by the time-averaged power deposited in the gas. The presence of septum inserts, which aid gas cooling, is taken into account. Fields are calculated consistently throughout the plasma and the surrounding insulation. Employed in conjunction with a modulator model, the model is used to calculate comprehensive performance predictions for a high-power operational amplifier.

  3. Computational Modeling of Large Wildfires: A Roadmap

    KAUST Repository

    Coen, Janice L.

    2010-08-01

    Wildland fire behavior, particularly that of large, uncontrolled wildfires, has not been well understood or predicted. Our methodology to simulate this phenomenon uses high-resolution dynamic models made of numerical weather prediction (NWP) models coupled to fire behavior models to simulate fire behavior. NWP models are capable of modeling very high resolution (< 100 m) atmospheric flows. The wildland fire component is based upon semi-empirical formulas for fireline rate of spread, post-frontal heat release, and a canopy fire. The fire behavior is coupled to the atmospheric model such that low level winds drive the spread of the surface fire, which in turn releases sensible heat, latent heat, and smoke fluxes into the lower atmosphere, feeding back to affect the winds directing the fire. These coupled dynamic models capture the rapid spread downwind, flank runs up canyons, bifurcations of the fire into two heads, and rough agreement in area, shape, and direction of spread at periods for which fire location data is available. Yet, intriguing computational science questions arise in applying such models in a predictive manner, including physical processes that span a vast range of scales, processes such as spotting that cannot be modeled deterministically, estimating the consequences of uncertainty, the efforts to steer simulations with field data ("data assimilation"), lingering issues with short term forecasting of weather that may show skill only on the order of a few hours, and the difficulty of gathering pertinent data for verification and initialization in a dangerous environment. © 2010 IEEE.

  4. A Neural Computational Model of Incentive Salience

    Science.gov (United States)

    Zhang, Jun; Berridge, Kent C.; Tindell, Amy J.; Smith, Kyle S.; Aldridge, J. Wayne

    2009-01-01

    Incentive salience is a motivational property with ‘magnet-like’ qualities. When attributed to reward-predicting stimuli (cues), incentive salience triggers a pulse of ‘wanting’ and an individual is pulled toward the cues and reward. A key computational question is how incentive salience is generated during a cue re-encounter, which combines both learning and the state of limbic brain mechanisms. Learning processes, such as temporal-difference models, provide one way for stimuli to acquire cached predictive values of rewards. However, empirical data show that subsequent incentive values are also modulated on the fly by dynamic fluctuation in physiological states, altering cached values in ways requiring additional motivation mechanisms. Dynamic modulation of incentive salience for a Pavlovian conditioned stimulus (CS or cue) occurs during certain states, without necessarily requiring (re)learning about the cue. In some cases, dynamic modulation of cue value occurs during states that are quite novel, never having been experienced before, and even prior to experience of the associated unconditioned reward in the new state. Such cases can include novel drug-induced mesolimbic activation and addictive incentive-sensitization, as well as natural appetite states such as salt appetite. Dynamic enhancement specifically raises the incentive salience of an appropriate CS, without necessarily changing that of other CSs. Here we suggest a new computational model that modulates incentive salience by integrating changing physiological states with prior learning. We support the model with behavioral and neurobiological data from empirical tests that demonstrate dynamic elevations in cue-triggered motivation (involving natural salt appetite, and drug-induced intoxication and sensitization). Our data call for a dynamic model of incentive salience, such as presented here. Computational models can adequately capture fluctuations in cue-triggered ‘wanting’ only by

  5. Ethics of prevention: an interactive computer-tailored program.

    Science.gov (United States)

    Van Hooren, Rob H; Van den Borne, Bart W; Curfs, Leopold M G; Widdershoven, Guy A M

    2007-01-01

    This article describes the contents of an interactive computer-tailored program. The program is based on previous studies of the practice of care for persons with Prader-Willi syndrome. This genetic condition is associated with a constant overeating behaviour with the risk of obesity. The aim of the program is to start a process of awareness, reflection, and discussion by caregivers who are confronted with the moral dilemma of respect for autonomy versus restricting overeating behaviour. The program focuses on values (such as health and well-being) that are relevant to caregivers in daily practice. Furthermore, the focus is on various ways of interaction with the client. Caregivers were expected to focus mainly on health, and on both paternalistic and interpretive/deliberative forms of interaction. Sixteen professionals and 12 parents pilot-tested the program contents. With a pre-test, responses on one central case were collected for tailored feedback; with a post-test, the effects of the program were measured. Significant correlations were found between the values of autonomy and consultation and between autonomy and well-being. In contrast to our expectations respondents valued all categories (autonomy, consultation, health, well-being, and liveability for others) as equally important in the pre-test. No significant changes in scores were found between pre- and post-test. The open answers and remarks of participants support the program contents. Participants' responses support previous research findings, advocating a concept of autonomy in terms of positive freedom, through support by others. The promotion of the client's self-understanding and self-development is central in this concept.

  6. High Performance Computing - Power Application Programming Interface Specification.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H.,; Kelly, Suzanne M.; Pedretti, Kevin; Grant, Ryan; Olivier, Stephen Lecler; Levenhagen, Michael J.; DeBonis, David

    2014-08-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  7. Building Computer-Based Experiments in Psychology without Programming Skills.

    Science.gov (United States)

    Ruisoto, Pablo; Bellido, Alberto; Ruiz, Javier; Juanes, Juan A

    2016-06-01

    Research in Psychology usually requires building and running experiments. Although this task has traditionally required scripting, recent computer tools based on graphical interfaces offer new opportunities in this field for researchers without programming skills. The purpose of this study is to illustrate and provide a comparative overview of two of the main free, open-source "point and click" software packages for building and running experiments in Psychology: PsychoPy and OpenSesame. Recommendations for their potential use are further discussed.

  8. Computer modeling for optimal placement of gloveboxes

    Energy Technology Data Exchange (ETDEWEB)

    Hench, K.W.; Olivas, J.D. [Los Alamos National Lab., NM (United States); Finch, P.R. [New Mexico State Univ., Las Cruces, NM (United States)

    1997-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex have presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.

  9. Activity computer program for calculating ion irradiation activation

    Science.gov (United States)

    Palmer, Ben; Connolly, Brian; Read, Mark

    2017-07-01

    A computer program, Activity, was developed to predict the activity and gamma lines of materials irradiated with an ion beam. It uses the TENDL (Koning and Rochman, 2012) [1] proton reaction cross section database, the Stopping and Range of Ions in Matter (SRIM) (Biersack et al., 2010) code, a Nuclear Data Services (NDS) radioactive decay database (Sonzogni, 2006) [2] and an ENDF gamma decay database (Herman and Chadwick, 2006) [3]. An extended version of Bateman's equation is used to calculate the activity at time t, and this equation is solved analytically, with the option to also solve by numeric inverse Laplace Transform as a failsafe. The program outputs the expected activity and gamma lines of the activated material.
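
    The core of the calculation described above is the Bateman solution for a linear decay chain. The following Python sketch illustrates that step only and is not code from the Activity program; it implements the classic Bateman formula for a chain with distinct decay constants, whereas the program uses an extended form and falls back to a numeric inverse Laplace transform.

```python
import math

def bateman_chain(n1_0, lambdas, t):
    """Atoms of each member of a linear decay chain at time t, assuming only the
    first nuclide is present at t = 0 and all decay constants are distinct
    (the classic Bateman solution).  lambdas: decay constants [1/s]; t: time [s]."""
    result = []
    for k in range(len(lambdas)):
        prefactor = n1_0 * math.prod(lambdas[:k])
        total = 0.0
        for i in range(k + 1):
            denom = 1.0
            for j in range(k + 1):
                if j != i:
                    denom *= lambdas[j] - lambdas[i]
            total += math.exp(-lambdas[i] * t) / denom
        result.append(prefactor * total)
    return result

# Example: parent with a 10 min half-life feeding a daughter with a 1 h half-life.
lams = [math.log(2) / 600.0, math.log(2) / 3600.0]
atoms = bateman_chain(1.0e12, lams, t=1800.0)
print("activities [Bq]:", [lam * n for lam, n in zip(lams, atoms)])
```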

  10. Programming a massively parallel, computation universal system: static behavior

    Energy Technology Data Exchange (ETDEWEB)

    Lapedes, A.; Farber, R.

    1986-01-01

    In previous work by the authors, the "optimum finding" properties of Hopfield neural nets were applied to the nets themselves to create a "neural compiler." This was done in such a way that the problem of programming the attractors of one neural net (called the Slave net) was expressed as an optimization problem that was in turn solved by a second neural net (the Master net). In this series of papers, that approach is extended to programming nets that contain interneurons (sometimes called "hidden neurons"), and thus deals with nets capable of universal computation. 22 refs.
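
    For readers unfamiliar with what "programming the attractors" of a Hopfield net means, the short Python sketch below stores two patterns with the conventional Hebbian rule and recalls one of them from a corrupted cue. It is background illustration only: the paper's point is precisely that the Master net replaces this hand-coded rule with an optimization carried out by a second network.

```python
import numpy as np

def train_hopfield(patterns):
    """Conventional Hebbian rule for storing +/-1 patterns as attractors."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)
    return W / len(patterns)

def recall(W, state, sweeps=20, seed=1):
    """Asynchronous updates that descend the network energy toward a stored attractor."""
    s = state.copy()
    rng = np.random.default_rng(seed)
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1.0 if W[i] @ s >= 0.0 else -1.0
    return s

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]], dtype=float)
W = train_hopfield(patterns)
noisy = patterns[0].copy()
noisy[0] *= -1                      # corrupt one bit
print(recall(W, noisy))             # recovers the first stored pattern
```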

  11. Accelerated Strategic Computing Initiative (ASCI) Program Plan [FY2000

    Energy Technology Data Exchange (ETDEWEB)

    None

    2000-01-01

    In August 1995, the United States took a significant step to reduce the nuclear danger. The decision to pursue a zero-yield Comprehensive Test Ban Treaty will allow greater control over the proliferation of nuclear weapons and will halt the growth of new nuclear systems. This step is only possible because of the Stockpile Stewardship Program, which provides an alternative means of ensuring the safety, performance, and reliability of the United States' enduring stockpile. At the heart of the Stockpile Stewardship Program is ASCI, which will create the high-confidence simulation capabilities needed to integrate fundamental science, experiments, and archival data into the stewardship of the actual weapons in the stockpile. ASCI will also serve to drive the development of simulation as a national resource by working closely with the computer industry and with universities.

  12. A Computer Program for a Canonical Problem in Underwater Shock

    Directory of Open Access Journals (Sweden)

    Thomas L. Geers

    1994-01-01

    Full Text Available Finite-element/boundary-element codes are widely used to analyze the response of marine structures to underwater explosions. An important step in verifying the correctness and accuracy of such codes is the comparison of code-generated results for canonical problems with corresponding analytical or semianalytical results. At the present time, such comparisons rely on hardcopy results presented in technical journals and reports. This article describes a computer program available from SAVIAC that produces user-selected numerical results for a step-wave-excited spherical shell submerged in and (optionally) filled with an acoustic fluid. The method of solution employed in the program is based on classical expansion of the field quantities in generalized Fourier series in the meridional coordinate. Convergence of the series is enhanced by judicious application of modified Cesàro summation and partial closed-form solution.
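
    The convergence-acceleration step mentioned above can be illustrated with plain (C,1) Cesàro summation, which averages the partial sums of a slowly convergent Fourier-type series. The Python sketch below is generic; the article's "modified" Cesàro summation and the generalized Fourier expansion itself are not reproduced here.

```python
import numpy as np

def cesaro_sums(terms):
    """(C,1) Cesàro sums: running averages of the partial sums of a series."""
    partial = np.cumsum(terms)
    return np.cumsum(partial) / np.arange(1, len(terms) + 1)

# Fourier series of a square wave evaluated at one point: the raw partial sums
# oscillate (Gibbs-like behaviour), while the Cesàro averages settle smoothly.
x = 1.0
n = np.arange(1, 201)
terms = 4.0 / np.pi * np.sin((2 * n - 1) * x) / (2 * n - 1)
print("raw partial sum:", np.cumsum(terms)[-1])
print("Cesaro sum:     ", cesaro_sums(terms)[-1])
```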

  13. The Effects of a Robot Game Environment on Computer Programming Education for Elementary School Students

    Science.gov (United States)

    Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu

    2017-01-01

    In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…

  14. Easy-to-use application programs for decay heat and delayed neutron calculations on personal computers

    Energy Technology Data Exchange (ETDEWEB)

    Oyamatsu, Kazuhiro [Nagoya Univ. (Japan)

    1998-03-01

    Application programs for personal computers are developed to calculate the decay heat power and delayed neutron activity from fission products. The main programs can be used on any computer, from personal computers to mainframes, because their sources are written in Fortran. These programs have user-friendly interfaces so that they can be used easily not only for research activities but also for educational purposes. (author)

  15. UCODE, a computer code for universal inverse modeling

    Science.gov (United States)

    Poeter, Eileen P.; Hill, Mary C.

    1999-05-01

    This article presents the US Geological Survey computer program UCODE, which was developed in collaboration with the US Army Corps of Engineers Waterways Experiment Station and the International Ground Water Modeling Center of the Colorado School of Mines. UCODE performs inverse modeling, posed as a parameter-estimation problem, using nonlinear regression. Any application model or set of models can be used; the only requirement is that they have numerical (ASCII or text only) input and output files and that the numbers in these files have sufficient significant digits. Application models can include preprocessors and postprocessors as well as models related to the processes of interest (physical, chemical and so on), making UCODE extremely powerful for model calibration. Estimated parameters can be defined flexibly with user-specified functions. Observations to be matched in the regression can be any quantity for which a simulated equivalent value can be produced; simulated equivalent values are calculated from values that appear in the application model output files and can be manipulated with additive and multiplicative functions, if necessary. Prior, or direct, information on estimated parameters also can be included in the regression. The nonlinear regression problem is solved by minimizing a weighted least-squares objective function with respect to the parameter values using a modified Gauss-Newton method. Sensitivities needed for the method are calculated approximately by forward or central differences, and problems and solutions related to this approximation are discussed. Statistics are calculated and printed for use in (1) diagnosing inadequate data or identifying parameters that probably cannot be estimated with the available data, (2) evaluating estimated parameter values, (3) evaluating the model representation of the actual processes and (4) quantifying the uncertainty of model-simulated values. UCODE is intended for use on any computer operating
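
    A minimal sketch of the estimation loop described above (weighted least squares, a Gauss-Newton step, finite-difference sensitivities) is given below in Python. It is illustrative only and does not reproduce UCODE's input/output file mechanism; the `simulate` callable, the parameter names and the toy exponential model are assumptions made for the example.

```python
import numpy as np

def gauss_newton(simulate, p0, obs, weights, iters=20, fd_eps=1e-4, damping=1e-8):
    """Weighted least-squares parameter estimation by Gauss-Newton iterations with
    forward-difference sensitivities.  `simulate(p)` stands in for running the
    application model and extracting simulated equivalents of the observations."""
    p = np.asarray(p0, dtype=float)
    W = np.diag(weights)
    for _ in range(iters):
        sim = simulate(p)
        r = obs - sim
        J = np.empty((len(obs), len(p)))          # sensitivity (Jacobian) matrix
        for j in range(len(p)):
            dp = np.zeros_like(p)
            dp[j] = fd_eps * max(abs(p[j]), 1.0)
            J[:, j] = (simulate(p + dp) - sim) / dp[j]
        A = J.T @ W @ J + damping * np.eye(len(p))
        step = np.linalg.solve(A, J.T @ W @ r)    # damped Gauss-Newton step
        p = p + step
        if np.linalg.norm(step) < 1e-10:
            break
    return p

# Toy use: recover the parameters of an exponential drawdown from exact data.
t = np.linspace(0.0, 10.0, 25)
def model(p):
    return p[0] * np.exp(-p[1] * t)
obs = model(np.array([5.0, 0.3]))
print(gauss_newton(model, p0=[4.0, 0.5], obs=obs, weights=np.ones_like(t)))
```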

  16. COMPUTER MODEL OF TEMPERATURE DISTRIBUTION IN OPTICALLY PUMPED LASER RODS

    Science.gov (United States)

    Farrukh, U. O.

    1994-01-01

    Managing the thermal energy that accumulates within a solid-state laser material under active pumping is of critical importance in the design of laser systems. Earlier models that calculated the temperature distribution in laser rods were one-dimensional and assumed laser rods of infinite length. This program presents a new model which solves the temperature distribution problem for finite-dimensional laser rods and calculates both the radial and axial components of temperature distribution in these rods. The modeled rod is either side-pumped or end-pumped by a continuous or a single-pulse pump beam. (At the present time, the model cannot handle a multiple-pulse pump source.) The optical axis is assumed to be along the axis of the rod. The program also assumes that it is possible to cool different surfaces of the rod at different rates. The user defines the laser rod material characteristics, determines the types of cooling and pumping to be modeled, and selects the time frame desired via the input file. The program contains several self-checking schemes to prevent overwriting memory blocks and to provide simple tracing of information in case of trouble. Output for the program consists of 1) an echo of the input file, 2) diffusion properties, radius and length, and time for each data block, 3) the radial increments from the center of the laser rod to the outer edge of the laser rod, and 4) the axial increments from the front of the laser rod to the other end of the rod. This program was written in Microsoft FORTRAN77 and implemented on a Tandon AT with a 287 math coprocessor. The program can also run on a VAX 750 mini-computer. It has a memory requirement of about 147 KB and was developed in 1989.
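
    The kind of calculation the program performs can be sketched with an explicit finite-difference step of the axisymmetric heat equation on an (r, z) grid. The Python fragment below is a simplified stand-in, not the FORTRAN77 code itself: it assumes fixed-temperature (Dirichlet) cooling on all cooled surfaces and uniform side pumping, whereas the program supports different cooling rates on different surfaces and both continuous and single-pulse pumping. All material and pump values are illustrative.

```python
import numpy as np

def step_temperature(T, Q, dr, dz, dt, k, rho_c, T_cool):
    """One explicit finite-difference step of the axisymmetric heat equation
    rho*c*dT/dt = k*[(1/r) d/dr (r dT/dr) + d2T/dz2] + Q on an (r, z) grid,
    with the rod surface and both end faces held at the coolant temperature."""
    nr, nz = T.shape
    r = np.arange(nr) * dr
    Tn = T.copy()
    for i in range(nr - 1):
        for j in range(1, nz - 1):
            d2z = (T[i, j + 1] - 2.0 * T[i, j] + T[i, j - 1]) / dz**2
            if i == 0:          # on-axis limit of the radial operator
                lap_r = 4.0 * (T[1, j] - T[0, j]) / dr**2
            else:
                lap_r = ((T[i + 1, j] - 2.0 * T[i, j] + T[i - 1, j]) / dr**2
                         + (T[i + 1, j] - T[i - 1, j]) / (2.0 * dr * r[i]))
            Tn[i, j] = T[i, j] + dt * (k * (lap_r + d2z) + Q[i, j]) / rho_c
    Tn[-1, :] = T_cool              # cooled outer surface
    Tn[:, 0] = Tn[:, -1] = T_cool   # cooled end faces
    return Tn

# Toy run: Nd:YAG-like rod under uniform side pumping.
T = np.full((21, 41), 300.0)
Q = np.full_like(T, 2.0e7)          # deposited heat, W/m^3
for _ in range(1000):
    T = step_temperature(T, Q, dr=2e-4, dz=1e-3, dt=1e-4,
                         k=13.0, rho_c=2.6e6, T_cool=300.0)
print("peak temperature [K]:", round(T.max(), 2))
```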

  17. SPSS and SAS programming for the testing of mediation models.

    Science.gov (United States)

    Dudley, William N; Benuzillo, Jose G; Carrico, Mineh S

    2004-01-01

    Mediation modeling can explain the nature of the relation among three or more variables. In addition, it can be used to show how a variable mediates the relation between levels of intervention and outcome. The Sobel test, developed in 1990, provides a statistical method for determining the influence of a mediator on an intervention or outcome. Although interactive Web-based and stand-alone methods exist for computing the Sobel test, SPSS and SAS programs that automatically run the required regression analyses and computations increase the accessibility of mediation modeling to nursing researchers. The aim of this article is to illustrate the utility of the Sobel test and to make this programming available to the Nursing Research audience in both SAS and SPSS. The history, logic, and technical aspects of mediation testing are introduced. The syntax files sobel.sps and sobel.sas, created to automate the computation of the regression analysis and test statistic, are available from the corresponding author. The reported programming allows the user to complete mediation testing with the user's own data in a single-step fashion. A technical manual included with the programming provides instruction on program use and interpretation of the output. Mediation modeling is a useful tool for describing the relation between three or more variables. Programming and manuals for using this model are made available.
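
    For readers without SPSS or SAS, the Sobel statistic itself is straightforward to compute from raw data. The Python sketch below is a stand-in illustration, not the sobel.sps/sobel.sas syntax: it estimates path a (mediator on predictor) and path b (outcome on mediator, controlling for the predictor) and forms z = ab / sqrt(b^2*sa^2 + a^2*sb^2); the variable names and toy data are invented.

```python
import numpy as np
from math import erfc, sqrt

def ols(X, y):
    """Ordinary least squares: coefficients and their standard errors."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se

def sobel_test(x, m, y):
    """Sobel z statistic and two-sided p-value for the indirect effect x -> m -> y."""
    ones = np.ones(len(x))
    (_, a), (_, sa) = ols(np.column_stack([ones, x]), m)           # path a: m ~ x
    (_, _, b), (_, _, sb) = ols(np.column_stack([ones, x, m]), y)  # path b: y ~ x + m
    z = a * b / sqrt(b**2 * sa**2 + a**2 * sb**2)
    return z, erfc(abs(z) / sqrt(2.0))

# Toy data with a genuine mediated effect.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
m = 0.6 * x + rng.normal(size=200)
y = 0.5 * m + rng.normal(size=200)
print(sobel_test(x, m, y))
```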

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. A computational model of consciousness for artificial emotional agents.

    Directory of Open Access Journals (Sweden)

    Kotov Artemy A.

    2017-10-01

    Full Text Available Background. The structure of consciousness has long been a cornerstone problem in the cognitive sciences. Recently it took on applied significance in the design of computer agents and mobile robots. This problem can thus be examined from perspectives of philosophy, neuropsychology, and computer modeling. Objective. In the present paper, we address the problem of the computational model of consciousness by designing computer agents aimed at simulating “speech understanding” and irony. Further, we look for a “minimal architecture” that is able to mimic the effects of consciousness in computing systems. Method. For the base architecture, we used a software agent, which was programmed to operate with scripts (productions or inferences), to process incoming texts (or events) by extracting their semantic representations, and to select relevant reactions. Results. It is shown that the agent can simulate speech irony by replacing a direct aggressive behavior with a positive sarcastic utterance. This is achieved by balancing between several scripts available to the agent. We suggest that the extension of this scheme may serve as a minimal architecture of consciousness, wherein the agent distinguishes its own representations from potential cognitive representations of other agents. Within this architecture, there are two stages of processing. First, the agent activates several scripts by placing their if-statements or actions (inferences) within a processing scope. Second, the agent differentiates the scripts depending on their activation by another script. This multilevel scheme allows the agent to simulate imaginary situations, one’s own imaginary actions, and imaginary actions of other agents, i.e. the agent demonstrates features considered essential for conscious agents in the philosophy of mind and cognitive psychology. Conclusion. Our computer systems for understanding speech and simulation of irony can serve as a basis for further

  20. A computational model of consciousness for artificial emotional agents

    Directory of Open Access Journals (Sweden)

    Kotov A. A.

    2017-09-01

    Full Text Available Background. The structure of consciousness has long been a cornerstone problem in the cognitive sciences. Recently it took on applied significance in the design of computer agents and mobile robots. This problem can thus be examined from perspectives of philosophy, neuropsychology, and computer modeling. Objective. In the present paper, we address the problem of the computational model of consciousness by designing computer agents aimed at simulating “speech understanding” and irony. Further, we look for a “minimal architecture” that is able to mimic the effects of consciousness in computing systems. Method. For the base architecture, we used a software agent, which was programmed to operate with scripts (productions or inferences), to process incoming texts (or events) by extracting their semantic representations, and to select relevant reactions. Results. It is shown that the agent can simulate speech irony by replacing a direct aggressive behavior with a positive sarcastic utterance. This is achieved by balancing between several scripts available to the agent. We suggest that the extension of this scheme may serve as a minimal architecture of consciousness, wherein the agent distinguishes its own representations from potential cognitive representations of other agents. Within this architecture, there are two stages of processing. First, the agent activates several scripts by placing their if-statements or actions (inferences) within a processing scope. Second, the agent differentiates the scripts depending on their activation by another script. This multilevel scheme allows the agent to simulate imaginary situations, one’s own imaginary actions, and imaginary actions of other agents, i.e. the agent demonstrates features considered essential for conscious agents in the philosophy of mind and cognitive psychology. Conclusion. Our computer systems for understanding speech and simulation of irony can serve as a basis for further

  1. Visual Teaching Model for Introducing Programming Languages

    Science.gov (United States)

    Shehane, Ronald; Sherman, Steven

    2014-01-01

    This study examines detailed usage of online training videos that were designed to address specific course problems that were encountered in an online computer programming course. The study presents the specifics of a programming course where training videos were used to provide students with a quick start path to learning a new programming…

  2. Computer Program Recognizes Patterns in Time-Series Data

    Science.gov (United States)

    Hand, Charles

    2003-01-01

    A computer program recognizes selected patterns in time-series data like digitized samples of seismic or electrophysiological signals. The program implements an artificial neural network (ANN) and a set of N clocks for the purpose of determining whether N or more instances of a certain waveform, W, occur within a given time interval, T. The ANN must be trained to recognize W in the incoming stream of data. The first time the ANN recognizes W, it sets clock 1 to count down from T to zero; the second time it recognizes W, it sets clock 2 to count down from T to zero, and so forth through the Nth instance. On the N + 1st instance, the cycle is repeated, starting with clock 1. If any clock has not reached zero when it is reset, then N instances of W have been detected within time T, and the program so indicates. The program can readily be encoded in a field-programmable gate array or an application-specific integrated circuit that could be used, for example, to detect electroencephalographic or electrocardiographic waveforms indicative of epileptic seizures or heart attacks, respectively.
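
    The clock logic described above is easy to state in code. The Python sketch below keeps the N countdown clocks and the reset rule but abstracts the trained ANN away as a per-sample boolean flag; the class and variable names are illustrative and are not taken from the program.

```python
class BurstDetector:
    """N countdown clocks of length T; following the scheme described above, a burst
    is reported when the clock being reset has not yet counted down to zero."""

    def __init__(self, n, t):
        self.n, self.t = n, t
        self.clocks = [0] * n      # remaining ticks on each clock (0 = expired)
        self.next = 0              # index of the clock to (re)start next

    def step(self, detected):
        """Advance one sample; `detected` stands in for the trained ANN's output."""
        self.clocks = [max(c - 1, 0) for c in self.clocks]
        burst = False
        if detected:
            if self.clocks[self.next] > 0:
                burst = True       # enough detections fell within the window T
            self.clocks[self.next] = self.t
            self.next = (self.next + 1) % self.n
        return burst

det = BurstDetector(n=3, t=10)
samples = [1, 1, 0, 1, 1]          # 1 = waveform W recognized in this sample
print([det.step(bool(s)) for s in samples])   # the final detection triggers a burst
```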

  3. Program Predicts Time Courses of Human/Computer Interactions

    Science.gov (United States)

    Vera, Alonso; Howes, Andrew

    2005-01-01

    CPM X is a computer program that predicts sequences of, and amounts of time taken by, routine actions performed by a skilled person performing a task. Unlike programs that simulate the interaction of the person with the task environment, CPM X predicts the time course of events as consequences of encoded constraints on human behavior. The constraints determine which cognitive and environmental processes can occur simultaneously and which have sequential dependencies. The input to CPM X comprises (1) a description of a task and strategy in a hierarchical description language and (2) a description of architectural constraints in the form of rules governing interactions of fundamental cognitive, perceptual, and motor operations. The output of CPM X is a Program Evaluation Review Technique (PERT) chart that presents a schedule of predicted cognitive, motor, and perceptual operators interacting with a task environment. The CPM X program allows direct, a priori prediction of skilled user performance on complex human-machine systems, providing a way to assess critical interfaces before they are deployed in mission contexts.
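
    The scheduling idea behind the PERT-chart output can be illustrated with a tiny earliest-start scheduler: each operator begins as soon as every operator it depends on has finished. The Python sketch below is only a toy stand-in for CPM X; the operator names, durations and dependencies are invented for the example and do not reflect the program's description language or constraint rules.

```python
def schedule(durations, deps):
    """Earliest-start schedule: an operator starts once all operators it depends on
    have finished.  Returns (start, finish) times in the same units as `durations`."""
    start, finish = {}, {}

    def resolve(op):
        if op not in finish:
            start[op] = max((resolve(d) for d in deps.get(op, [])), default=0.0)
            finish[op] = start[op] + durations[op]
        return finish[op]

    for op in durations:
        resolve(op)
    return start, finish

# Toy cue-response cycle with invented operators and durations (milliseconds).
durations = {"perceive-cue": 100.0, "decide": 50.0, "move-hand": 70.0, "press-key": 30.0}
deps = {"decide": ["perceive-cue"], "move-hand": ["perceive-cue"],
        "press-key": ["decide", "move-hand"]}
start, finish = schedule(durations, deps)
print(start)
print(finish)
```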

  4. Functional computational model for optimal color coding.

    Science.gov (United States)

    Romney, A Kimball; Chiao, Chuan-Chin

    2009-06-23

    This paper presents a computational model for color coding that provides a functional explanation of how humans perceive colors in a homogeneous color space. Beginning with known properties of human cone photoreceptors, the model estimates the locations of the reflectance spectra of Munsell color chips in perceptual color space as represented in the CIE L*a*b* color system. The fit between the two structures is within the limits of expected measurement error. Estimates of the structure of perceptual color space for color-anomalous dichromats missing one of the normal cone photoreceptors correspond closely to results from the Farnsworth-Munsell color test. An unanticipated outcome of the model provides a functional explanation of why additive lights are always red, green, and blue and why they provide maximum gamut for color monitors and color television even though they do not correspond to human cone absorption spectra.

  5. Computer models in the design of FXR

    Energy Technology Data Exchange (ETDEWEB)

    Vogtlin, G.; Kuenning, R.

    1980-01-01

    Lawrence Livermore National Laboratory is developing a 15 to 20 MeV electron accelerator with a beam current goal of 4 kA. This accelerator will be used for flash radiography and has a requirement of high reliability. Components being developed include spark gaps, Marx generators, water Blumleins and oil insulation systems. A SCEPTRE model was developed that takes into consideration the non-linearity of the ferrite and the time dependency of the emission from a field emitter cathode. This model was used to predict an optimum charge time to obtain maximum magnetic flux change from the ferrite. This model and its application will be discussed. JASON was used extensively to determine optimum locations and shapes of supports and insulators. It was also used to determine stress within bubbles adjacent to walls in oil. Computer results will be shown and bubble breakdown will be related to bubble size.

  6. Computational fluid dynamic modelling of cavitation

    Science.gov (United States)

    Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    Models of sheet cavitation in cryogenic fluids are developed for use in Euler and Navier-Stokes codes. The models are based upon earlier potential-flow models but enable the cavity inception point, length, and shape to be determined as part of the computation. In the present paper, numerical solutions are compared with experimental measurements for both pressure distribution and cavity length. Comparisons between models are also presented. The CFD model provides a relatively simple modification to an existing code to enable cavitation performance predictions to be included. The analysis also has the added ability of incorporating the thermodynamic effects of cryogenic fluids. Extensions of the current two-dimensional steady-state analysis to three dimensions and/or time-dependent flows are, in principle, straightforward, although geometrical issues become more complicated. Linearized models, however, offer promise of providing effective cavitation modeling in three dimensions. This analysis presents good potential for improved understanding of many phenomena associated with cavity flows.

  7. Computational models of neurophysiological correlates of tinnitus.

    Science.gov (United States)

    Schaette, Roland; Kempter, Richard

    2012-01-01

    The understanding of tinnitus has progressed considerably in the past decade, but the details of the mechanisms that give rise to this phantom perception of sound without a corresponding acoustic stimulus have not yet been pinpointed. It is now clear that tinnitus is generated in the brain, not in the ear, and that it is correlated with pathologically altered spontaneous activity of neurons in the central auditory system. Both increased spontaneous firing rates and increased neuronal synchrony have been identified as putative neuronal correlates of phantom sounds in animal models, and both phenomena can be triggered by damage to the cochlea. Various mechanisms could underlie the generation of such aberrant activity. At the cellular level, decreased synaptic inhibition and increased neuronal excitability, which may be related to homeostatic plasticity, could lead to an over-amplification of natural spontaneous activity. At the network level, lateral inhibition could amplify differences in spontaneous activity, and structural changes such as reorganization of tonotopic maps could lead to self-sustained activity in recurrently connected neurons. However, it is difficult to disentangle the contributions of different mechanisms in experiments, especially since not all changes observed in animal models of tinnitus are necessarily related to tinnitus. Computational modeling presents an opportunity to evaluate these mechanisms and their relation to tinnitus. Here we review the computational models for the generation of neurophysiological correlates of tinnitus that have been proposed so far, evaluate their predictions, and compare them to available data. We also assess the limits of their explanatory power, thus demonstrating where an understanding is still lacking and where further research may be needed. Identifying appropriate models is important for finding therapies, and we therefore also summarize the implications of the models for approaches to treating tinnitus.

  8. Computational models of neurophysiological correlates of tinnitus

    Directory of Open Access Journals (Sweden)

    Roland eSchaette

    2012-05-01

    Full Text Available The understanding of tinnitus has progressed considerably in the past decade, but the details of the mechanisms that give rise to this phantom perception of sound without a corresponding acoustic stimulus have not been pinpointed yet. It is now clear that tinnitus is generated in the brain, not in the ear, and that it is correlated with pathologically altered spontaneous activity of neurons in the central auditory system. Both increased spontaneous firing rates and increased neuronal synchrony have been identified as putative neuronal correlates of phantom sounds in animal models, and both phenomena can be triggered by damage to the cochlea. Various mechanisms could underlie the generation of such aberrant activity. At the cellular level, decreased synaptic inhibition and increased neuronal excitability, which may be related to homeostatic plasticity, could lead to an over-amplification of natural spontaneous activity. At the network level, lateral inhibition could amplify differences in spontaneous activity, and structural changes such as reorganization of tonotopic maps could lead to self-sustained activity in recurrently connected neurons. It is difficult to disentangle the contributions of different mechanisms in experiments, especially since not all changes observed in animal models of tinnitus are necessarily related to tinnitus. Computational modelling presents an opportunity of evaluating these mechanisms and their relation to tinnitus. Here we review the computational models for the generation of neurophysiological correlates of tinnitus that have been proposed so far, evaluate predictions and compare them to available data. We also evaluate the limits of their explanatory power, thus demonstrating where an understanding is still lacking and where further research may be needed. Identifying appropriate models is important for finding therapies and we therefore also summarize the implications of the models for approaches to treat

  9. Computer-aided modeling framework – a generic modeling template

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    This work focuses on the development of a computer-aided modeling framework. The framework is a knowledge-based system that is built on a generic modeling language and structured on workflows for different modeling tasks. The overall objective is to support model developers and users to generate ... and test models systematically, efficiently and reliably. In this way, development of products and processes can be made faster, cheaper and more efficient. In this contribution, as part of the framework, a generic modeling template for the systematic derivation of problem specific models is presented. ... The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene

  10. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t had begun more than 20 years ago. Since that time, significant advancements have taken place in the requirements for safety assessment as well as in computer hardware. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/ whose development had begun in the early nineteen-nineties. However, UG had recently been advanced to the C++-based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the

  11. A Unified Approach to Modeling and Programming

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann; Møller-Pedersen, Birger

    2010-01-01

    SIMULA was a language for modeling and programming and provided a unified approach to modeling and programming, in contrast to methodologies based on structured analysis and design. The current development seems to be going in the direction of separation of modeling and programming. The goal of this paper is to go back to the future and get inspiration from SIMULA and propose a unified approach. In addition to reintroducing the contributions of SIMULA and the Scandinavian approach to object-oriented programming, we do this by discussing a number of issues in modeling and programming and argue why we ...

  12. Intruder dose pathway analysis for the onsite disposal of radioactive wastes: The ONSITE/MAXI1 computer program

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, W.E. Jr.; Peloquin, R.A.; Napier, B.A.; Neuder, S.M.

    1987-02-01

    This document summarizes initial efforts to develop human-intrusion scenarios and a modified version of the MAXI computer program for potential use by the NRC in reviewing applications for onsite radioactive waste disposal. Supplement 1 of NUREG/CR-3620 (1986) summarized modifications and improvements to the ONSITE/MAXI1 software package. This document summarizes a modified version of the ONSITE/MAXI1 computer program. This modified version of the computer program operates on a personal computer and permits the user to optionally select radiation dose conversion factors published by the International Commission on Radiological Protection (ICRP) in their Publication No. 30 (ICRP 1979-1982) in place of those published by the ICRP in their Publication No. 2 (ICRP 1959) (as implemented in the previous versions of the ONSITE/MAXI1 computer program). The pathway-to-human models used in the computer program have not been changed from those described previously. Computer listings of the ONSITE/MAXI1 computer program and supporting data bases are included in the appendices of this document.

  13. Chapter 24: Computational modeling of self-organized spindle formation.

    Science.gov (United States)

    Schaffner, Stuart C; José, Jorge V

    2008-01-01

    In this chapter, we provide a derivation and computational details of a biophysical model we introduced to describe the self-organized mitotic spindle formation properties in the chromosome dominated pathway studied in Xenopus meiotic extracts. The mitotic spindle is a biological structure composed of microtubules. This structure forms the scaffold on which mitosis and cytokinesis occurs. Despite the seeming mechanical simplicity of the spindle itself, its formation and the way in which it is used in mitosis and cytokinesis is complex and not fully understood. Biophysical modeling of a system as complex as mitosis requires contributions from biologists, biochemists, mathematicians, physicists, and software engineers. This chapter is written for biologists and biochemists who wish to understand how biophysical modeling can complement a program of biological experimentation. It is also written for a physicist, computer scientist, or mathematician unfamiliar with this class of biological physics model. We will describe how we built such a mathematical model and its numerical simulator to obtain results that agree with many of the results found experimentally. The components of this system are large enough to be described in terms of coarse-grained approximations. We will discuss how to properly model such systems and will suggest effective tradeoffs between reliability, simulation speed, and accuracy. At all times we have in mind the realistic biophysical properties of the system we are trying to model.

  14. Computational Model for Internal Relative Humidity Distributions in Concrete

    Directory of Open Access Journals (Sweden)

    Wondwosen Ali

    2014-01-01

    Full Text Available A computational model is developed for predicting nonuniform internal relative humidity distribution in concrete. Internal relative humidity distribution is known to have a direct effect on the nonuniform drying shrinkage strains. These nonuniform drying shrinkage strains result in the buildup of internal stresses, which may lead to cracking of concrete. This may be particularly true at early ages of concrete since the concrete is relatively weak while the difference in internal relative humidity is probably high. The results obtained from this model can be used by structural and construction engineers to predict critical drying shrinkage stresses induced by differential internal humidity distribution. The model uses combined finite-element and finite-difference numerical methods. The finite element method is used for spatial discretization, while the finite difference method is used to obtain transient solutions of the model. The numerical formulations are then programmed in Matlab. The numerical results were compared with experimental results found in the literature and demonstrated very good agreement.
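
    A minimal one-dimensional version of the finite-element-in-space, finite-difference-in-time scheme described above can be written in a few lines. The Python sketch below uses linear elements and backward Euler time stepping with a constant diffusivity and a fixed surface humidity; it illustrates the numerical approach only, not the authors' Matlab program, and the material values are invented.

```python
import numpy as np

def humidity_profile(n_el, length, D, h_surface, h_init, dt, steps):
    """1D moisture diffusion with linear finite elements in space and backward Euler
    in time; the exposed face is held at the ambient humidity, the far face is sealed."""
    n = n_el + 1
    le = length / n_el
    M = np.zeros((n, n))
    K = np.zeros((n, n))
    me = le / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])      # element mass matrix
    ke = D / le * np.array([[1.0, -1.0], [-1.0, 1.0]])      # element diffusion matrix
    for e in range(n_el):
        idx = [e, e + 1]
        M[np.ix_(idx, idx)] += me
        K[np.ix_(idx, idx)] += ke
    h = np.full(n, h_init)
    A = M + dt * K                                          # backward Euler matrix
    for _ in range(steps):
        A_bc, b_bc = A.copy(), M @ h
        A_bc[0, :], A_bc[0, 0], b_bc[0] = 0.0, 1.0, h_surface   # Dirichlet surface node
        h = np.linalg.solve(A_bc, b_bc)
    return h

# Toy run: 0.1 m slab drying at one face for about four weeks.
profile = humidity_profile(n_el=20, length=0.1, D=1e-10,
                           h_surface=0.5, h_init=1.0, dt=3600.0, steps=24 * 28)
print(profile.round(3))
```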

  15. Computational modeling of corneal refractive surgery

    Science.gov (United States)

    Cabrera Fernandez, Delia; Niazy, Abdel-Salam M.; Kurtz, Ronald M.; Djotyan, Gagik P.; Juhasz, Tibor

    2004-07-01

    A finite element method was used to study the biomechanical behavior of the cornea and its response to refractive surgery when stiffness inhomogeneities varying with depth are considered. Side-by-side comparisons of different constitutive laws that have been commonly used to model refractive surgery were also performed. To facilitate the comparison, the material property constants were identified from the same experimental data, which were obtained from mechanical tests on corneal strips and membrane inflation experiments. We then validated the resulting model by comparing computed refractive power changes with clinical results. The model developed provides a much more predictable refractive outcome when the stiffness inhomogeneities of the cornea and nonlinearities of the deformations are included in the finite element simulations. Thus, it can be stated that the inhomogeneous model is a more accurate representation of the corneal material properties in order to model the biomechanical effects of refractive surgery. The simulations also revealed that the para-central and peripheral parts of the cornea deformed less in response to pressure loading compared to the central cornea and the limbus. Furthermore, the deformations in response to pressure loading predicted by the non-homogeneous and nonlinear model, showed that the para-central region is mechanically enhanced in the meridional direction. This result is in agreement with the experimentally documented regional differences reported in the literature by other investigators.

  16. Computational social dynamic modeling of group recruitment.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Nina M.; Lee, Marinna; Pickett, Marc; Turnley, Jessica Glicken (Sandia National Laboratories, Albuquerque, NM); Smrcka, Julianne D. (Sandia National Laboratories, Albuquerque, NM); Ko, Teresa H.; Moy, Timothy David (Sandia National Laboratories, Albuquerque, NM); Wu, Benjamin C.

    2004-01-01

    The Seldon software toolkit combines concepts from agent-based modeling and social science to create a computational social dynamic model for group recruitment. The underlying recruitment model is based on a unique three-level hybrid agent-based architecture that contains simple agents (level one), abstract agents (level two), and cognitive agents (level three). The uniqueness of this architecture begins with abstract agents, which permit the model to include social concepts (gang) or institutional concepts (school) in a typical software simulation environment. The future addition of cognitive agents to the recruitment model will provide a unique entity that does not exist in any agent-based modeling toolkits to date. We use social networks to provide an integrated mesh within and between the different levels. This Java-based toolkit is used to analyze different social concepts based on initialization input from the user. The input alters a set of parameters used to influence the values associated with the simple agents, abstract agents, and the interactions (simple agent-simple agent or simple agent-abstract agent) between these entities. The results of the phase-1 Seldon toolkit provide insight into how certain social concepts apply to different scenario development for inner-city gang recruitment.

  17. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered, and the computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through analysis of the model, the disease-free and endemic equilibrium points are calculated, and the stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
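
    A numerical sketch in the spirit of the kind of model described (an SIR-type compartment model with computers joining and leaving the network and an antivirus-driven recovery term). The equations, parameter values, and the threshold formula below are illustrative assumptions, not the model analysed in the paper.

```python
from scipy.integrate import solve_ivp

# Illustrative SIR-type computer-virus model with node turnover.
# b: rate at which new computers join; d: rate at which old computers are
# removed; beta: infection rate; gamma: antivirus cleaning rate.  Assumed.
b, d, beta, gamma = 0.05, 0.05, 0.5, 0.2

def rhs(t, y):
    S, I, R = y
    N = S + I + R
    dS = b * N - beta * S * I / N - d * S
    dI = beta * S * I / N - gamma * I - d * I
    dR = gamma * I - d * R
    return [dS, dI, dR]

sol = solve_ivp(rhs, (0.0, 400.0), [990.0, 10.0, 0.0])
S, I, R = sol.y[:, -1]
R0 = beta / (gamma + d)   # threshold of this toy model: endemic if R0 > 1
print(f"R0 = {R0:.2f}; long-run state approx S = {S:.0f}, I = {I:.0f}, R = {R:.0f}")
```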

  18. POBE: A Computer Program for Optimal Design of Multi-Subject Blocked fMRI Experiments

    Directory of Open Access Journals (Sweden)

    Bärbel Maus

    2014-01-01

    Full Text Available For functional magnetic resonance imaging (fMRI) studies, researchers can use multi-subject blocked designs to identify active brain regions for a certain stimulus type of interest. Before performing such an experiment, careful planning is necessary to obtain efficient stimulus effect estimators within the available financial resources. The optimal number of subjects and the optimal scanning time for a multi-subject blocked design with fixed experimental costs can be determined using optimal design methods. In this paper, the user-friendly computer program POBE 1.2 (program for optimal design of blocked experiments, version 1.2) is presented. POBE provides a graphical user interface for fMRI researchers to easily and efficiently design their experiments. The computer program POBE calculates the optimal number of subjects and the optimal scanning time for user-specified experimental factors and model parameters so that the statistical efficiency is maximised for a given study budget. POBE can also be used to determine the minimum budget for a given power. Furthermore, a maximin design can be determined as an efficient design for a possible range of values of the unknown model parameters. In this paper, the computer program is described and illustrated with typical experimental factors for a blocked fMRI experiment.
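
    A hedged sketch of the optimisation idea (not POBE itself): a grid search over the number of subjects and the scanning time per subject under a fixed budget, maximising an assumed efficiency criterion whose variance has within-subject and between-subject components. The cost model, variance model, and every constant are illustrative assumptions.

```python
# Toy grid search for the number of subjects N and scanning time T (minutes)
# that maximise design efficiency under a fixed budget.  The cost model,
# variance model, and all constants are assumptions for illustration.
BUDGET = 30_000.0          # total study budget
C_SUBJECT = 200.0          # fixed cost per subject (recruitment, etc.)
C_MINUTE = 10.0            # scanner cost per subject per minute
SIGMA2_WITHIN = 4.0        # assumed within-subject noise variance
SIGMA2_BETWEEN = 1.0       # assumed between-subject variance

def efficiency(n_subjects, scan_time):
    # Efficiency taken as the inverse variance of the group-level stimulus
    # effect estimate under an assumed two-level variance model.
    var = SIGMA2_WITHIN / (n_subjects * scan_time) + SIGMA2_BETWEEN / n_subjects
    return 1.0 / var

best = None
for n in range(2, 200):
    for t in range(5, 121):                    # 5 to 120 minutes per subject
        cost = n * (C_SUBJECT + C_MINUTE * t)
        if cost > BUDGET:
            continue
        eff = efficiency(n, t)
        if best is None or eff > best[0]:
            best = (eff, n, t, cost)

eff, n, t, cost = best
print(f"best design: {n} subjects, {t} min each, cost {cost:.0f}, efficiency {eff:.3f}")
```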

  19. Addressing Dynamic Issues of Program Model Checking

    Science.gov (United States)

    Lerda, Flavio; Visser, Willem

    2001-01-01

    Model checking real programs has recently become an active research area. Programs, however, exhibit two characteristics that make model checking difficult: the complexity of their state and their dynamic nature. Here we address both of these issues within the context of the Java PathFinder (JPF) model checker. First, we show how the state of a Java program can be encoded efficiently and how this encoding can be exploited to improve model checking. Next, we show how to use symmetry reductions to alleviate some of the problems introduced by the dynamic nature of Java programs. Lastly, we show how distributed model checking of a dynamic program can be achieved and, furthermore, how dynamic partitions of the state space can improve model checking. We support all our findings with results from applying these techniques within the JPF model checker.
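
    A toy sketch of the explicit-state exploration and state-encoding idea behind tools of this kind (not JPF's actual implementation): breadth-first exploration of a small transition system, storing a canonical encoding of every visited state so that revisits are pruned. The example system and the safety property are assumptions for illustration.

```python
from collections import deque

# Toy explicit-state model checker: breadth-first search over a small
# transition system with a visited set of canonically encoded states.
# The example system (two bounded counters) and the property are assumed.

def successors(state):
    x, y = state
    if x < 3:
        yield (x + 1, y)
    if y < 3:
        yield (x, y + 1)
    if x > 0 and y > 0:
        yield (x - 1, y - 1)

def violates(state):
    # Safety property under test: the two counters never both reach 3.
    return state == (3, 3)

def model_check(initial):
    visited = {initial}                    # encoded states already explored
    frontier = deque([(initial, [initial])])
    while frontier:
        state, path = frontier.popleft()
        if violates(state):
            return path                    # counterexample trace
        for nxt in successors(state):
            if nxt not in visited:         # duplicate-state pruning
                visited.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None

trace = model_check((0, 0))
print("counterexample:" if trace else "property holds:", trace)
```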

  20. Model Energy Efficiency Program Impact Evaluation Guide

    Science.gov (United States)

    This document provides guidance on model approaches for calculating the energy, demand, and emissions savings resulting from energy efficiency programs. It describes several standard approaches that can be used to make these programs more efficient.