WorldWideScience

Sample records for machine code programs

  1. Assembly processor program converts symbolic programming language to machine language

    Science.gov (United States)

    Pelto, E. V.

    1967-01-01

    Assembly processor program converts symbolic programming language to machine language. This program translates symbolic codes into computer-understandable instructions, assigns locations in storage for successive instructions, and computes locations from symbolic addresses.
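
    A minimal sketch of the two-pass idea described above follows. The opcode table, instruction syntax, and word layout are hypothetical, chosen only to show how a first pass assigns storage locations to labels and a second pass resolves symbolic addresses into numeric ones.

        OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFE, "DATA": 0x00}

        def assemble(source):
            """Translate symbolic source lines into (location, machine word) pairs."""
            symbols, parsed, loc = {}, [], 0
            # Pass 1: assign a storage location to each statement and record labels.
            for line in source:
                label, _, stmt = line.rpartition(":")
                if label:
                    symbols[label.strip()] = loc
                tokens = stmt.split()
                if tokens:
                    parsed.append((loc, tokens))
                    loc += 1
            # Pass 2: replace symbolic operands with the locations recorded above.
            program = []
            for addr, tokens in parsed:
                opcode = OPCODES[tokens[0]]
                operand = symbols[tokens[1]] if len(tokens) > 1 else 0
                program.append((addr, (opcode << 8) | operand))
            return program

        demo = ["start: LOAD x", "ADD y", "STORE x", "HALT", "x: DATA", "y: DATA"]
        print(assemble(demo))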

  2. Ocean circulation code on the Connection Machine

    International Nuclear Information System (INIS)

    Vitart, F.

    1993-01-01

    This work is part of the development of a global climate model based on a coupling between an ocean model and an atmosphere model. The objective was to develop this global model on a massively parallel machine (CM2). The author presents the OPA7 code (equations, boundary conditions, equation system resolution) and its parallelization on the CM2 machine. The CM2 data structure is briefly evoked, and two tests are reported (on a flat-bottom basin, and on a topography with eight islands). The author then gives an overview of studies aimed at improving the ocean circulation code: use of a new state equation, use of a formulation of surface pressure, use of a new mesh. He reports on the study of the use of multi-block domains on the CM2 through advection tests and two-block tests.

  3. Reversible machine code and its abstract processor architecture

    DEFF Research Database (Denmark)

    Axelsen, Holger Bock; Glück, Robert; Yokoyama, Tetsuo

    2007-01-01

    A reversible abstract machine architecture and its reversible machine code are presented and formalized. For machine code to be reversible, both the underlying control logic and each instruction must be reversible. A general class of machine instruction sets was proven to be reversible, building...

  4. Model-Driven Engineering of Machine Executable Code

    Science.gov (United States)

    Eichberg, Michael; Monperrus, Martin; Kloppenburg, Sven; Mezini, Mira

    Implementing static analyses of machine-level executable code is labor-intensive and complex. We show how to leverage model-driven engineering to facilitate the design and implementation of programs doing static analyses. Further, we report on important lessons learned on the benefits and drawbacks while using the following technologies: using the Scala programming language as the target of code generation, using XML-Schema to express a metamodel, and using XSLT to implement (a) transformations and (b) a lint-like tool. Finally, we report on the use of Prolog for writing model transformations.

  5. Optimal interference code based on machine learning

    Science.gov (United States)

    Qian, Ye; Chen, Qian; Hu, Xiaobo; Cao, Ercong; Qian, Weixian; Gu, Guohua

    2016-10-01

    In this paper, we analyze the characteristics of pseudo-random codes, taking the m-sequence as a case study. Drawing on coding theory, we introduce jamming methods and use MATLAB simulations of the interference effect and its probability model to support the analysis. Based on the decoding time the adversary requires, we derive the optimal formula and optimal coefficients with machine learning and thereby obtain a new optimal interference code. First, in the recognition phase, the study judges the effect of interference by simulating the time needed over the decoding period of the laser seeker. In the tracking phase, we then simulate the interference process using laser active deception jamming, the method chosen for this study. To improve interference performance, the model is simulated in MATLAB. We determine the least number of pulse intervals that must be received, and from it the precise interval count of the laser pointer for m-sequence encoding. To find the shortest spacing, we apply the greatest-common-divisor method. Combining this with the coding regularity found earlier, we restore the pulse interval of the pseudo-random code already received. Finally, we can control the time period of laser interference, obtain the optimal interference code, and increase the probability of successful interference.
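
    Two ingredients mentioned above, m-sequence generation and the greatest-common-divisor search for the shortest pulse spacing, can be illustrated with a small sketch. The LFSR width, tap positions, and interval values below are arbitrary examples, not the parameters used in the study.

        from functools import reduce
        from math import gcd

        def m_sequence(taps, state, length):
            """Generate bits of an m-sequence with a Fibonacci LFSR; `taps` are the
            bit positions XORed together to form the feedback bit."""
            width = max(taps) + 1
            bits = []
            for _ in range(length):
                bits.append(state & 1)
                feedback = 0
                for t in taps:
                    feedback ^= (state >> t) & 1
                state = (state >> 1) | (feedback << (width - 1))
            return bits

        def shortest_spacing(intervals):
            """Estimate the elementary pulse interval as the GCD of the observed
            pulse-to-pulse spacings (the greatest-common-divisor method)."""
            return reduce(gcd, intervals)

        # 4-bit LFSR with feedback taps at bits 0 and 3: period 2**4 - 1 = 15.
        print(m_sequence(taps=[0, 3], state=0b1001, length=15))
        print(shortest_spacing([6, 9, 15, 21]))  # -> 3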

  6. Understanding and Writing G & M Code for CNC Machines

    Science.gov (United States)

    Loveland, Thomas

    2012-01-01

    In modern CAD and CAM manufacturing companies, engineers design parts for machines and consumable goods. Many of these parts are cut on CNC machines. Whether using a CNC lathe, milling machine, or router, the ideas and designs of engineers must be translated into a machine-readable form called G & M Code that can be used to cut parts to precise…
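
    As a concrete illustration of G & M code, the sketch below emits a generic RS-274-style program that traces a square at a fixed depth. G0/G1 (rapid and feed moves) and M30 (program end) are standard words, but the feed rate, depth, and safe height are arbitrary example values, and a real controller would also need its own header, tool, and spindle commands.

        def square_gcode(side, depth, feed):
            """Return a minimal G-code program that traces a square of `side` mm."""
            return "\n".join([
                "G21 G90",                      # millimetres, absolute coordinates
                "G0 Z5.0",                      # rapid to a safe height
                "G0 X0 Y0",                     # rapid to the start corner
                f"G1 Z-{depth:.3f} F{feed}",    # plunge to cutting depth
                f"G1 X{side:.3f} Y0",           # first edge
                f"G1 X{side:.3f} Y{side:.3f}",  # second edge
                f"G1 X0 Y{side:.3f}",           # third edge
                "G1 X0 Y0",                     # close the square
                "G0 Z5.0",                      # retract
                "M30",                          # end of program
            ])

        print(square_gcode(side=40.0, depth=2.0, feed=200))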

  7. Machine Learning via Mathematical Programming

    National Research Council Canada - National Science Library

    Mangasarian, Olvi

    1999-01-01

    Mathematical programming approaches were applied to a variety of problems in machine learning in order to gain deeper understanding of the problems and to come up with new and more efficient computational algorithms...

  8. A portable virtual machine target for proof-carrying code

    DEFF Research Database (Denmark)

    Franz, Michael; Chandra, Deepak; Gal, Andreas

    2005-01-01

    Virtual Machines (VMs) and Proof-Carrying Code (PCC) are two techniques that have been used independently to provide safety for (mobile) code. Existing virtual machines, such as the Java VM, have several drawbacks: First, the effort required for safety verification is considerable. Second and mor...... simultaneously providing efficient just-in-time compilation and target-machine independence. In particular, our approach reduces the complexity of the required proofs, resulting in fewer proof obligations that need to be discharged at the target machine....

  9. Machine function based control code algebras

    NARCIS (Netherlands)

    Bergstra, J.A.

    Machine functions have been introduced by Earley and Sturgis in [6] in order to provide a mathematical foundation of the use of the T-diagrams proposed by Bratman in [5]. Machine functions describe the operation of a machine at a very abstract level. A theory of hardware and software based on

  10. Complete permutation Gray code implemented by finite state machine

    Directory of Open Access Journals (Sweden)

    Li Peng

    2014-09-01

    An enumerating method for complete permutation arrays is proposed. The list of n! permutations, based on a Gray code defined over the finite symbol set Z(n) = {1, 2, …, n}, is implemented by a finite state machine, named an n-RPGCF. An RPGCF can be used to search for permutation codes and provides improved lower bounds on the maximum cardinality of a permutation code in some cases.
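
    The record above does not give the n-RPGCF construction itself, but the general idea of a permutation Gray code, listing all n! permutations so that neighbours differ by a single adjacent transposition, can be sketched with the classical Steinhaus-Johnson-Trotter algorithm, used here purely as a stand-in:

        def sjt_permutations(n):
            """Yield all n! permutations of 1..n such that consecutive permutations
            differ by one adjacent transposition (Steinhaus-Johnson-Trotter)."""
            perm = list(range(1, n + 1))
            direction = [-1] * n        # -1: element "looks" left, +1: looks right
            yield tuple(perm)
            while True:
                # Find the largest mobile element (one looking at a smaller neighbour).
                mobile = -1
                for i in range(n):
                    j = i + direction[i]
                    if 0 <= j < n and perm[j] < perm[i]:
                        if mobile == -1 or perm[i] > perm[mobile]:
                            mobile = i
                if mobile == -1:
                    return              # no mobile element left: enumeration done
                value = perm[mobile]
                j = mobile + direction[mobile]
                # Swap the mobile element with the neighbour it is looking at.
                perm[mobile], perm[j] = perm[j], perm[mobile]
                direction[mobile], direction[j] = direction[j], direction[mobile]
                # Reverse the direction of every element larger than the moved one.
                for i in range(n):
                    if perm[i] > value:
                        direction[i] = -direction[i]
                yield tuple(perm)

        print(list(sjt_permutations(3)))  # 3! = 6 permutations, adjacent swaps only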

  11. 3D equilibrium codes for mirror machines

    International Nuclear Information System (INIS)

    Kaiser, T.B.

    1983-01-01

    The codes developed for computing three-dimensional guiding center equilibria for quadrupole tandem mirrors are discussed. TEBASCO (Tandem equilibrium and ballooning stability code) is a code developed at LLNL that uses a further expansion of the paraxial equilibrium equation in powers of β (plasma pressure/magnetic pressure). It has been used to guide the design of the TMX-U and MFTF-B experiments at Livermore. Its principal weakness is its perturbative nature, which renders its validity for high-β calculations open to question. In order to compute high-β equilibria, the reduced MHD technique that has proven useful for determining toroidal equilibria was adapted to the tandem mirror geometry. In this approach, the paraxial expansion of the MHD equations yields a set of coupled nonlinear equations of motion, valid for arbitrary β, that are solved as an initial-value problem. Two particular formulations have been implemented in computer codes developed at NYU/Kyoto U and LLNL. They differ primarily in the type of grid, the location of the lateral boundary, the damping techniques employed, and the method of calculating pressure-balance equilibrium. Discussions of these codes are presented in this paper. (Kato, T.)

  11. A friendly man-machine interface for thermo-hydraulic simulation codes of nuclear installations

    International Nuclear Information System (INIS)

    Araujo Filho, F. de; Belchior Junior, A.; Barroso, A.C.O.; Gebrim, A.

    1994-01-01

    This work presents the development of a Man-Machine Interface to the TRAC-PF1 code, a computer program to perform best estimate analysis of transients and accidents at nuclear power plants. The results were considered satisfactory and a considerable productivity gain was achieved in the activity of preparing and analyzing simulations. (author)

  13. The Three Pillars of Machine Programming

    OpenAIRE

    Gottschlich, Justin; Solar-Lezama, Armando; Tatbul, Nesime; Carbin, Michael; Rinard, Martin; Barzilay, Regina; Amarasinghe, Saman; Tenenbaum, Joshua B; Mattson, Tim

    2018-01-01

    In this position paper, we describe our vision of the future of machine programming through a categorical examination of three pillars of research. Those pillars are: (i) intention, (ii) invention, and (iii) adaptation. Intention emphasizes advancements in the human-to-computer and computer-to-machine-learning interfaces. Invention emphasizes the creation or refinement of algorithms or core hardware and software building blocks through machine learning (ML). Adaptation emphasizes advances in t...

  14. Abstract Machines for Programming Language Implementation

    NARCIS (Netherlands)

    Diehl, Stephan; Hartel, Pieter H.; Sestoft, Peter

    We present an extensive, annotated bibliography of the abstract machines designed for each of the main programming paradigms (imperative, object oriented, functional, logic and concurrent). We conclude that whilst a large number of efficient abstract machines have been designed for particular

  15. Perspex machine: V. Compilation of C programs

    Science.gov (United States)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  16. CNC LATHE MACHINE PRODUCING NC CODE BY USING DIALOG METHOD

    Directory of Open Access Journals (Sweden)

    Yakup TURGUT

    2004-03-01

    In this study, an NC code generation program utilising the Dialog Method was developed for turning centres. Initially, CNC lathe turning methods and tool path development techniques were reviewed briefly. By using geometric definition methods, the tool path was generated and a CNC part program was developed for a FANUC control unit. The developed program made the CNC part program generation process easy. The program was developed using the BASIC 6.0 programming language, while the material and cutting tool databases were built and supported with the help of ACCESS 7.0.

  17. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  18. Implications of Structured Programming for Machine Architecture

    NARCIS (Netherlands)

    Tanenbaum, A.S.

    1978-01-01

    Based on an empirical study of more than 10,000 lines of program text written in a GOTO-less language, a machine architecture specifically designed for structured programs is proposed. Since assignment, CALL, RETURN, and IF statements together account for 93 percent of all executable statements,

  19. Teaching Machines and Programmed Instruction; an Introduction.

    Science.gov (United States)

    Fry, Edward B.

    Teaching machines and programed instruction represent new methods in education, but they are based on teaching principles established before the development of media technology. Today programed learning materials based on the new technology enjoy increasing popularity for several reasons: they apply sound psychological theories; the materials can…

  20. Simulation program for multiple expansion Stirling machines

    International Nuclear Information System (INIS)

    Walker, G.; Weiss, M.; Fauvel, R.; Reader, G.; Bingham, E.R.

    1992-01-01

    Multiple expansion Stirling machines have been a topic of interest at the University of Calgary for some years. Recently, a second-order computer simulation program with an integral graphics package for Stirling cryocoolers with up to four stages of expansion was developed and made available to the Stirling community. Adaptation of the program to multiple expansion Stirling power systems is anticipated. This paper briefly introduces the program and presents a specimen result.

  1. Code quality issues in student programs

    NARCIS (Netherlands)

    Keuning, H.W.; Heeren, B.J.; Jeuring, J.T.

    2017-01-01

    Because low quality code can cause serious problems in software systems, students learning to program should pay attention to code quality early. Although many studies have investigated mistakes that students make during programming, we do not know much about the quality of their code. This study

  2. Programming Entity Framework Code First

    CERN Document Server

    Lerman, Julia

    2011-01-01

    Take advantage of the Code First data modeling approach in ADO.NET Entity Framework, and learn how to build and configure a model based on existing classes in your business domain. With this concise book, you'll work hands-on with examples to learn how Code First can create an in-memory model and database by default, and how you can exert more control over the model through further configuration. Code First provides an alternative to the database first and model first approaches to the Entity Data Model. Learn the benefits of defining your model with code, whether you're working with an exis

  3. Program Design Report of the CNC Machine Tool (II)

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Kiun; Youm, K. U.; Kim, K. S.; Lee, I. B.; Yoon, K. B.; Lee, C. K.; Youm, J. H.

    2007-06-15

    The application of CNC machine tools is expanding widely, owing to the variety of machining methods and the rapid advancement of machine tools and cutting tools for high-speed, efficient machining. To carry out the project of manufacturing and maintaining laboratory equipment, production design and machining technology are continually developed; in particular, the application of CNC machine tools is very important for improving productivity and quality and for relieving a manpower shortage. We publish this technical report, which includes CNC machine tool programs and drawings, to contribute to the systematic development of CNC program design and machining technology.

  4. Program Design Report of the CNC Machine Tool (II)

    International Nuclear Information System (INIS)

    Kim, Jong Kiun; Youm, K. U.; Kim, K. S.; Lee, I. B.; Yoon, K. B.; Lee, C. K.; Youm, J. H.

    2007-06-01

    The application of CNC machine tools is expanding widely, owing to the variety of machining methods and the rapid advancement of machine tools and cutting tools for high-speed, efficient machining. To carry out the project of manufacturing and maintaining laboratory equipment, production design and machining technology are continually developed; in particular, the application of CNC machine tools is very important for improving productivity and quality and for relieving a manpower shortage. We publish this technical report, which includes CNC machine tool programs and drawings, to contribute to the systematic development of CNC program design and machining technology.

  5. Program Design Report of the CNC Machine Tool (III)

    International Nuclear Information System (INIS)

    Kim, Jong Kiun; Youm, K. U.; Kim, K. S.; Lee, I. B.; Yoon, K. B.; Lee, C. K.; Youm, J. H.

    2008-08-01

    The application of CNC machine tools is expanding widely, owing to the variety of machining methods and the rapid advancement of machine tools and cutting tools for high-speed, efficient machining. To carry out the project of manufacturing and maintaining laboratory equipment, production design and machining technology are continually developed; in particular, the application of CNC machine tools is very important for improving productivity and quality and for relieving a manpower shortage. We publish this technical report, which includes CNC machine tool programs and drawings, to contribute to the systematic development of CNC program design and machining technology.

  6. Program Design Report of the CNC Machine Tool (IV)

    International Nuclear Information System (INIS)

    Youm, Ki Un; Lee, I. B.; Youm, J. H.

    2009-09-01

    The application of CNC machine tools is expanding widely, owing to the variety of machining methods and the rapid advancement of machine tools and cutting tools for high-speed, efficient machining. To carry out the project of manufacturing and maintaining laboratory equipment, production design and machining technology are continually developed; in particular, the application of CNC machine tools is very important for improving productivity and quality and for relieving a manpower shortage. We publish this technical report, which includes CNC machine tool programs and drawings, to contribute to the systematic development of CNC program design and machining technology.

  7. Program Design Report of the CNC Machine Tool (I)

    International Nuclear Information System (INIS)

    Kim, Jong Kiun; Youm, K. U.; Kim, K. S.

    2006-08-01

    The application of CNC machine tools is expanding widely, owing to the variety of machining methods and the rapid advancement of machine tools and cutting tools for high-speed, efficient machining. To carry out the project of manufacturing and maintaining laboratory equipment, production design and machining technology are continually developed; in particular, the application of CNC machine tools is very important for improving productivity and quality and for relieving a manpower shortage. We publish this technical report, which includes CNC machine tool programs and drawings, to contribute to the systematic development of CNC program design and machining technology.

  8. An Automatic Instruction-Level Parallelization of Machine Code

    Directory of Open Access Journals (Sweden)

    MARINKOVIC, V.

    2018-02-01

    Prevailing multicores and novel manycores pose a great challenge of the modern day: parallelization of embedded software that is still written as sequential code. In this paper, automatic code parallelization is considered, focusing on developing a parallelization tool at the binary level as well as on the validation of this approach. A novel instruction-level parallelization algorithm for assembly code is developed, which uses the register names after SSA to find independent blocks of code and then schedules the independent blocks using METIS to achieve good load balance. Sequential consistency is verified, and validation is done by measuring the program execution time on the target architecture. A great speedup, taken as the performance measure in the validation process, and optimal load balancing are achieved for multicore RISC processors with 2 to 16 cores (e.g., MIPS, MicroBlaze, etc.). In particular, for 16 cores, the average speedup is 7.92x, while in some cases it reaches 14x. The approach to automatic parallelization provided by this paper is useful to researchers and developers in the area of parallelization as the basis for further optimizations, as the back-end of a compiler, or as a code parallelization tool for an embedded system.
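
    A toy sketch of the central idea, treating blocks of code as independent when they share no register hazards and then balancing them across cores, is given below. The block table, cost figures, and greedy placement are invented for illustration; the tool described above works on real assembly after SSA renaming and uses METIS for the partitioning.

        from collections import defaultdict

        # Hypothetical blocks of straight-line code: registers written/read and a cost.
        blocks = {
            "B0": {"writes": {"r1"}, "reads": {"r0"}, "cost": 4},
            "B1": {"writes": {"r2"}, "reads": {"r0"}, "cost": 3},
            "B2": {"writes": {"r3"}, "reads": {"r1"}, "cost": 5},  # needs B0's result
            "B3": {"writes": {"r4"}, "reads": {"r0"}, "cost": 2},
        }

        def independent(a, b):
            """True when neither block writes a register the other reads or writes
            (no RAW/WAR/WAW hazard), so the two blocks may run in parallel."""
            return not (a["writes"] & (b["reads"] | b["writes"]) or
                        b["writes"] & (a["reads"] | a["writes"]))

        # Greedy placement: put each block on the least-loaded core whose blocks it is
        # independent from; otherwise fall back to the least-loaded core, where it
        # will simply run sequentially after the conflicting block.
        n_cores = 2
        cores, load = defaultdict(list), [0] * n_cores
        for name, blk in blocks.items():
            ok = [c for c in range(n_cores)
                  if all(independent(blk, blocks[other]) for other in cores[c])]
            target = min(ok or range(n_cores), key=load.__getitem__)
            cores[target].append(name)
            load[target] += blk["cost"]

        print(dict(cores), load)  # e.g. {0: ['B0', 'B3'], 1: ['B1', 'B2']} [6, 8]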

  9. A method of numerically controlled machine part programming

    Science.gov (United States)

    1970-01-01

    A computer program is designed for automatically programmed tools. The preprocessor computes the desired tool path, and the postprocessor computes the actual commands that cause the machine tool to follow a specific path. It is used on a Cincinnati ATC-430 numerically controlled machine tool.

  10. Nevada Administrative Code for Special Education Programs.

    Science.gov (United States)

    Nevada State Dept. of Education, Carson City. Special Education Branch.

    This document presents excerpts from Chapter 388 of the Nevada Administrative Code, which concerns definitions, eligibility, and programs for students who are disabled or gifted/talented. The first section gathers together 36 relevant definitions from the Code for such concepts as "adaptive behavior,""autism,""gifted and…

  11. National machine guarding program: Part 1. Machine safeguarding practices in small metal fabrication businesses

    OpenAIRE

    Parker, David L.; Yamin, Samuel C.; Brosseau, Lisa M.; Xi, Min; Gordon, Robert; Most, Ivan G.; Stanley, Rodney

    2015-01-01

    Background Metal fabrication workers experience high rates of traumatic occupational injuries. Machine operators in particular face high risks, often stemming from the absence or improper use of machine safeguarding or the failure to implement lockout procedures. Methods The National Machine Guarding Program (NMGP) was a translational research initiative implemented in conjunction with two workers' compensation insurers. Insurance safety consultants trained in machine guarding used standardize...

  12. Generic programming for deterministic neutron transport codes

    International Nuclear Information System (INIS)

    Plagne, L.; Poncot, A.

    2005-01-01

    This paper discusses the implementation of neutron transport codes via generic programming techniques. Two different Boltzmann equation approximations have been implemented, namely the Sn and SPn methods. This implementation experiment shows that generic programming allows us to improve maintainability and readability of source codes with no performance penalties compared to classical approaches. In the present implementation, matrices and vectors as well as linear algebra algorithms are treated separately from the rest of source code and gathered in a tool library called 'Generic Linear Algebra Solver System' (GLASS). Such a code architecture, based on a linear algebra library, allows us to separate the three different scientific fields involved in transport codes design: numerical analysis, reactor physics and computer science. Our library handles matrices with optional storage policies and thus applies both to Sn code, where the matrix elements are computed on the fly, and to SPn code where stored matrices are used. Thus, using GLASS allows us to share a large fraction of source code between Sn and SPn implementations. Moreover, the GLASS high level of abstraction allows the writing of numerical algorithms in a form which is very close to their textbook descriptions. Hence the GLASS algorithms collection, disconnected from computer science considerations (e.g. storage policy), is very easy to read, to maintain and to extend. (authors)

  13. Machine-Learning Algorithms to Code Public Health Spending Accounts.

    Science.gov (United States)

    Brady, Eoghan S; Leider, Jonathon P; Resnick, Beth A; Alfonso, Y Natalia; Bishai, David

    Government public health expenditure data sets require time- and labor-intensive manipulation to summarize results that public health policy makers can use. Our objective was to compare the performances of machine-learning algorithms with manual classification of public health expenditures to determine if machines could provide a faster, cheaper alternative to manual classification. We used machine-learning algorithms to replicate the process of manually classifying state public health expenditures, using the standardized public health spending categories from the Foundational Public Health Services model and a large data set from the US Census Bureau. We obtained a data set of 1.9 million individual expenditure items from 2000 to 2013. We collapsed these data into 147 280 summary expenditure records, and we followed a standardized method of manually classifying each expenditure record as public health, maybe public health, or not public health. We then trained 9 machine-learning algorithms to replicate the manual process. We calculated recall, precision, and coverage rates to measure the performance of individual and ensembled algorithms. Compared with manual classification, the machine-learning random forests algorithm produced 84% recall and 91% precision. With algorithm ensembling, we achieved our target criterion of 90% recall by using a consensus ensemble of ≥6 algorithms while still retaining 93% coverage, leaving only 7% of the summary expenditure records unclassified. Machine learning can be a time- and cost-saving tool for estimating public health spending in the United States. It can be used with standardized public health spending categories based on the Foundational Public Health Services model to help parse public health expenditure information from other types of health-related spending, provide data that are more comparable across public health organizations, and evaluate the impact of evidence-based public health resource allocation.
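
    The consensus-ensemble idea can be compressed into a short sketch. Synthetic features and labels stand in for the Census Bureau expenditure records, and three scikit-learn classifiers stand in for the nine algorithms used in the study; a record is coded only when every classifier agrees, otherwise it is left unclassified, trading coverage for recall in the same spirit as the paper.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.naive_bayes import GaussianNB

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 10))           # stand-in for expenditure features
        y = (X[:, 0] + X[:, 1] > 0).astype(int)   # stand-in for "public health" labels
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        models = [RandomForestClassifier(random_state=0),
                  LogisticRegression(max_iter=1000),
                  GaussianNB()]
        preds = np.array([m.fit(X_tr, y_tr).predict(X_te) for m in models])

        votes = preds.sum(axis=0)
        consensus = (votes == 0) | (votes == len(models))  # unanimous records only
        labels = (votes > len(models) / 2).astype(int)

        coverage = consensus.mean()
        accuracy = (labels[consensus] == y_te[consensus]).mean()
        print(f"coverage={coverage:.1%}  accuracy on covered records={accuracy:.1%}")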

  14. Towards a universal code formatter through machine learning

    NARCIS (Netherlands)

    Parr, T. (Terence); J.J. Vinju (Jurgen)

    2016-01-01

    There are many declarative frameworks that allow us to implement code formatters relatively easily for any specific language, but constructing them is cumbersome. The first problem is that "everybody" wants to format their code differently, leading to either many formatter variants or a

  15. Code-Expanded Random Access for Machine-Type Communications

    DEFF Research Database (Denmark)

    Kiilerich Pratas, Nuno; Thomsen, Henning; Stefanovic, Cedomir

    2012-01-01

    The random access methods used for support of machine-type communications (MTC) in current cellular standards are derivatives of traditional framed slotted ALOHA and therefore do not support high user loads efficiently. Motivated by the random access method employed in LTE, we propose...

  16. Code-expanded radio access protocol for machine-to-machine communications

    DEFF Research Database (Denmark)

    Thomsen, Henning; Kiilerich Pratas, Nuno; Stefanovic, Cedomir

    2013-01-01

    The random access methods used for support of machine-to-machine, also referred to as Machine-Type Communications, in current cellular standards are derivatives of traditional framed slotted ALOHA and therefore do not support high user loads efficiently. We propose an approach that is motivated by the random access method employed in LTE, which significantly increases the amount of contention resources without increasing the system resources, such as contention subframes and preambles. This is accomplished by a logical, rather than physical, extension of the access method: by combining the available system subframes and orthogonal preambles, the amount of available contention resources is drastically increased, enabling the massive support of Machine-Type Communication users that is beyond the reach of current systems.
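
    A back-of-the-envelope comparison makes the expansion effect concrete. The counting below assumes that a device picks one of M orthogonal preambles, or stays idle, in each of L contention subframes of a virtual frame; it is a sketch of the general idea, not the exact formula from the paper.

        def contention_resources(preambles: int, subframes: int) -> tuple[int, int]:
            """Conventional access offers preambles * subframes opportunities; letting
            a device choose a preamble (or idle) in every subframe yields up to
            (preambles + 1) ** subframes - 1 distinguishable code words."""
            conventional = preambles * subframes
            code_expanded = (preambles + 1) ** subframes - 1  # exclude all-idle word
            return conventional, code_expanded

        for L in (1, 2, 3):
            conv, expanded = contention_resources(preambles=64, subframes=L)
            print(f"subframes={L}: conventional={conv}, code-expanded={expanded}")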

  17. The semaphore codes attached to a Turing machine via resets and their various limits

    OpenAIRE

    Rhodes, John; Schilling, Anne; Silva, Pedro V.

    2016-01-01

    We introduce semaphore codes associated to a Turing machine via resets. Semaphore codes provide an approximation theory for resets. In this paper we generalize the set-up of our previous paper "Random walks on semaphore codes and delay de Bruijn semigroups" to the infinite case by taking the profinite limit of $k$-resets to obtain $(-\omega)$-resets. We mention how this opens new avenues to attack the P versus NP problem.

  18. A Machine Learning Perspective on Predictive Coding with PAQ

    OpenAIRE

    Knoll, Byron; de Freitas, Nando

    2011-01-01

    PAQ8 is an open source lossless data compression algorithm that currently achieves the best compression rates on many benchmarks. This report presents a detailed description of PAQ8 from a statistical machine learning perspective. It shows that it is possible to understand some of the modules of PAQ8 and use this understanding to improve the method. However, intuitive statistical explanations of the behavior of other modules remain elusive. We hope the description in this report will be a sta...

  19. National Machine Guarding Program: Part 1. Machine safeguarding practices in small metal fabrication businesses.

    Science.gov (United States)

    Parker, David L; Yamin, Samuel C; Brosseau, Lisa M; Xi, Min; Gordon, Robert; Most, Ivan G; Stanley, Rodney

    2015-11-01

    Metal fabrication workers experience high rates of traumatic occupational injuries. Machine operators in particular face high risks, often stemming from the absence or improper use of machine safeguarding or the failure to implement lockout procedures. The National Machine Guarding Program (NMGP) was a translational research initiative implemented in conjunction with two workers' compensation insurers. Insurance safety consultants trained in machine guarding used standardized checklists to conduct a baseline inspection of machine-related hazards in 221 businesses. Safeguards at the point of operation were missing or inadequate on 33% of machines. Safeguards for other mechanical hazards were missing on 28% of machines. Older machines were both widely used and less likely than newer machines to be properly guarded. Lockout/tagout procedures were posted at only 9% of machine workstations. The NMGP demonstrates a need for improvement in many aspects of machine safety and lockout in small metal fabrication businesses. © 2015 The Authors. American Journal of Industrial Medicine published by Wiley Periodicals, Inc.

  20. Utilities programs for the WIMSD4 code

    International Nuclear Information System (INIS)

    Leszczynski, F.

    1990-01-01

    The WIMSD4 code is widely known around the world. To make better use of it, it is convenient to have auxiliary programs available. Two such programs, developed in FORTRAN 77 on the VAX computer of the Bariloche Atomic Center, are presented here: WINTER (Wims INTERactive), to generate WIMSD4 input data interactively, and AMICO (Anisn MIx and COndense), to handle cross-section data from a multigroup data library and from WIMS output for use in other programs, such as ANISN, DOT, CITATION, DIPOBAR, etc. (Author) [es]

  1. Findings From the National Machine Guarding Program-A Small Business Intervention: Machine Safety.

    Science.gov (United States)

    Parker, David L; Yamin, Samuel C; Xi, Min; Brosseau, Lisa M; Gordon, Robert; Most, Ivan G; Stanley, Rodney

    2016-09-01

    The purpose of this nationwide intervention was to improve machine safety in small metal fabrication businesses (3 to 150 employees). Failures to implement machine safety programs related to guarding and lockout/tagout (LOTO) are frequent causes of Occupational Safety and Health Administration (OSHA) citations and may result in serious traumatic injury. Insurance safety consultants conducted a standardized evaluation of machine guarding, safety programs, and LOTO. Businesses received a baseline evaluation, two intervention visits, and a 12-month follow-up evaluation. The intervention was completed by 160 businesses. Adding a safety committee was associated with a 10% point increase in business-level machine scores and with an increase in LOTO program scores (P < 0.0001). Insurance safety consultants proved effective at disseminating a machine safety and LOTO intervention via management-employee safety committees.

  2. Data calculation program for RELAP 5 code

    International Nuclear Information System (INIS)

    Silvestre, Larissa J.B.; Sabundjian, Gaiane

    2015-01-01

    As the criteria and requirements for a nuclear power plant are extremely rigid, computer programs for simulation and safety analysis are required for certifying and licensing a plant. Based on this scenario, some sophisticated computational tools have been used, such as the Reactor Excursion and Leak Analysis Program (RELAP5), which is the most used code for the thermo-hydraulic analysis of accidents and transients in nuclear reactors. A major difficulty in simulation using the RELAP5 code is the amount of information required for the simulation of thermal-hydraulic accidents or transients. The preparation of the input data leads to a very large number of mathematical operations for calculating the geometry of the components. Therefore, a user-friendly mathematical preprocessor was developed in order to perform these calculations and prepare RELAP5 input data. Visual Basic for Applications (VBA) combined with Microsoft Excel demonstrated to be an efficient tool to perform a number of tasks in the development of the program. Due to the absence of necessary information about some RELAP5 components, this work aims to make improvements to the Mathematic Preprocessor for the RELAP5 code (PREREL5). For the new version of the preprocessor, new screens for some components that were not programmed in the original version were designed; moreover, screens of pre-existing components were redesigned to improve the program. In addition, an English version was provided for the new version of PREREL5. The new design of PREREL5 contributes to saving time and minimizing mistakes made by users of the RELAP5 code. The final version of this preprocessor will be applied to Angra 2. (author)

  3. Data calculation program for RELAP 5 code

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre, Larissa J.B.; Sabundjian, Gaiane, E-mail: larissajbs@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    As the criteria and requirements for a nuclear power plant are extremely rigid, computer programs for simulation and safety analysis are required for certifying and licensing a plant. Based on this scenario, some sophisticated computational tools have been used, such as the Reactor Excursion and Leak Analysis Program (RELAP5), which is the most used code for the thermo-hydraulic analysis of accidents and transients in nuclear reactors. A major difficulty in simulation using the RELAP5 code is the amount of information required for the simulation of thermal-hydraulic accidents or transients. The preparation of the input data leads to a very large number of mathematical operations for calculating the geometry of the components. Therefore, a user-friendly mathematical preprocessor was developed in order to perform these calculations and prepare RELAP5 input data. Visual Basic for Applications (VBA) combined with Microsoft Excel demonstrated to be an efficient tool to perform a number of tasks in the development of the program. Due to the absence of necessary information about some RELAP5 components, this work aims to make improvements to the Mathematic Preprocessor for the RELAP5 code (PREREL5). For the new version of the preprocessor, new screens for some components that were not programmed in the original version were designed; moreover, screens of pre-existing components were redesigned to improve the program. In addition, an English version was provided for the new version of PREREL5. The new design of PREREL5 contributes to saving time and minimizing mistakes made by users of the RELAP5 code. The final version of this preprocessor will be applied to Angra 2. (author)

  4. Lean coding machine. Facilities target productivity and job satisfaction with coding automation.

    Science.gov (United States)

    Rollins, Genna

    2010-07-01

    Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.

  5. Machine-Checked Sequencer for Critical Embedded Code Generator

    Science.gov (United States)

    Izerrouken, Nassima; Pantel, Marc; Thirioux, Xavier

    This paper presents the development of a correct-by-construction block sequencer for GeneAuto, a qualifiable (according to the DO-178B/ED-12B recommendation) automatic code generator. It transforms Simulink models into MISRA C code for safety-critical systems. Our approach, which combines a classical development process with formal specification and verification using proof assistants, has led to preliminary fruitful exchanges with certification authorities. We present parts of the classical user and tool requirements and the derived formal specifications, implementation, and verification for the correctness and termination of the block sequencer. This sequencer has been successfully applied to real-size industrial use cases from various transportation-domain partners and led to the detection of requirement errors and a correct-by-construction implementation.

  6. National machine guarding program: Part 1. Machine safeguarding practices in small metal fabrication businesses

    Science.gov (United States)

    Yamin, Samuel C.; Brosseau, Lisa M.; Xi, Min; Gordon, Robert; Most, Ivan G.; Stanley, Rodney

    2015-01-01

    Background Metal fabrication workers experience high rates of traumatic occupational injuries. Machine operators in particular face high risks, often stemming from the absence or improper use of machine safeguarding or the failure to implement lockout procedures. Methods The National Machine Guarding Program (NMGP) was a translational research initiative implemented in conjunction with two workers' compensation insurers. Insurance safety consultants trained in machine guarding used standardized checklists to conduct a baseline inspection of machine-related hazards in 221 businesses. Results Safeguards at the point of operation were missing or inadequate on 33% of machines. Safeguards for other mechanical hazards were missing on 28% of machines. Older machines were both widely used and less likely than newer machines to be properly guarded. Lockout/tagout procedures were posted at only 9% of machine workstations. Conclusions The NMGP demonstrates a need for improvement in many aspects of machine safety and lockout in small metal fabrication businesses. Am. J. Ind. Med. 58:1174–1183, 2015. © 2015 The Authors. American Journal of Industrial Medicine published by Wiley Periodicals, Inc. PMID:26332060

  7. Automatic generation of data merging program codes.

    OpenAIRE

    Hyensook, Kim; Oussena, Samia; Zhang, Ying; Clark, Tony

    2010-01-01

    Data merging is an essential part of ETL (Extract-Transform-Load) processes used to build a data warehouse system. To avoid reinventing merging techniques, we propose a Data Merging Meta-model (DMM) and its transformation into executable program codes in the manner of model-driven engineering. DMM allows defining relationships of different model entities and their merging types at the conceptual level. Our formalized transformation, described using ATL (ATLAS Transformation Language), enables automatic g...

  8. Accuracy comparison among different machine learning techniques for detecting malicious codes

    Science.gov (United States)

    Narang, Komal

    2016-03-01

    In this paper, a machine learning based model for malware detection is proposed. It can detect newly released malware, i.e. zero-day attacks, by analyzing operation codes on the Android operating system. The accuracies of Naïve Bayes, Support Vector Machine (SVM) and Neural Network classifiers for detecting malicious code have been compared for the proposed model. In the experiment, 400 benign files, 100 system files and 500 malicious files were used to construct the model. The model yields its best accuracy, 88.9%, when a neural network is used as the classifier, and achieves 95% and 82.8% for sensitivity and specificity, respectively.
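
    A minimal sketch of such a comparison is shown below, with random numbers standing in for the opcode-frequency features of the benign, system, and malicious files. The classifier choices mirror the abstract, but the data, features, and any scores this prints are purely illustrative.

        import numpy as np
        from sklearn.metrics import accuracy_score, recall_score
        from sklearn.model_selection import train_test_split
        from sklearn.naive_bayes import GaussianNB
        from sklearn.neural_network import MLPClassifier
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        X = rng.poisson(3.0, size=(1000, 20)).astype(float)  # opcode histogram stand-in
        y = rng.integers(0, 2, size=1000)                    # 1 = malicious (synthetic)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

        for name, clf in [("Naive Bayes", GaussianNB()),
                          ("SVM", SVC()),
                          ("Neural network", MLPClassifier(max_iter=500))]:
            y_hat = clf.fit(X_tr, y_tr).predict(X_te)
            print(name,
                  "accuracy", round(accuracy_score(y_te, y_hat), 3),
                  "sensitivity", round(recall_score(y_te, y_hat, pos_label=1), 3),
                  "specificity", round(recall_score(y_te, y_hat, pos_label=0), 3))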

  9. Using machine-coded event data for the micro-level study of political violence

    Directory of Open Access Journals (Sweden)

    Jesse Hammond

    2014-07-01

    Machine-coded datasets likely represent the future of event data analysis. We assess the use of one of these datasets—the Global Database of Events, Language and Tone (GDELT)—for the micro-level study of political violence by comparing it to two hand-coded conflict event datasets. Our findings indicate that GDELT should be used with caution for geo-spatial analyses at the subnational level: its overall correlation with hand-coded data is mediocre, and at the local level major issues of geographic bias exist in how events are reported. Overall, our findings suggest that due to these issues, researchers studying local conflict processes may want to wait for a more reliable geocoding method before relying too heavily on this set of machine-coded data.

  10. Monte Carlo codes and Monte Carlo simulator program

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.

    1990-03-01

    Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report, the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)

  11. Statistical and Machine Learning Models to Predict Programming Performance

    OpenAIRE

    Bergin, Susan

    2006-01-01

    This thesis details a longitudinal study on factors that influence introductory programming success and on the development of machine learning models to predict incoming student performance. Although numerous studies have developed models to predict programming success, the models struggled to achieve high accuracy in predicting the likely performance of incoming students. Our approach overcomes this by providing a machine learning technique, using a set of three significant...

  12. Numerical code to determine the particle trapping region in the LISA machine

    International Nuclear Information System (INIS)

    Azevedo, M.T. de; Raposo, C.C. de; Tomimura, A.

    1984-01-01

    A numerical code is constructed to determine the trapping region in machines like LISA. The variable magnetic field is two-dimensional and is coupled to the Runge-Kutta integration through Tchebichev polynomials. Various particle orbits, including particle interactions, were analysed. Besides this, a strong electric field is introduced to see the possible effects occurring inside the plasma. (Author) [pt]

  13. Using supervised machine learning to code policy issues: Can classifiers generalize across contexts?

    NARCIS (Netherlands)

    Burscher, B.; Vliegenthart, R.; de Vreese, C.H.

    2015-01-01

    Content analysis of political communication usually covers large amounts of material and makes the study of dynamics in issue salience a costly enterprise. In this article, we present a supervised machine learning approach for the automatic coding of policy issues, which we apply to news articles

  14. The vector and parallel processing of MORSE code on Monte Carlo Machine

    International Nuclear Information System (INIS)

    Hasegawa, Yukihiro; Higuchi, Kenji.

    1995-11-01

    The multi-group Monte Carlo code for particle transport, MORSE, has been modified for high-performance computing on the Monte Carlo machine Monte-4. The method and the results are described. Monte-4 was specially developed to realize high-performance computing of Monte Carlo codes for particle transport, for which it has been difficult to obtain high performance by vector processing on conventional vector processors. Monte-4 has four vector processor units with special hardware called Monte Carlo pipelines. The vectorization and parallelization of the MORSE code and the performance evaluation on Monte-4 are described. (author)

  15. Machine-learning-assisted correction of correlated qubit errors in a topological code

    Directory of Open Access Journals (Sweden)

    Paul Baireuther

    2018-01-01

    A fault-tolerant quantum computation requires an efficient means to detect and correct errors that accumulate in encoded quantum information. In the context of machine learning, neural networks are a promising new approach to quantum error correction. Here we show that a recurrent neural network can be trained, using only experimentally accessible data, to detect errors in a widely used topological code, the surface code, with a performance above that of the established minimum-weight perfect matching (or blossom) decoder. The performance gain is achieved because the neural network decoder can detect correlations between bit-flip (X) and phase-flip (Z) errors. The machine learning algorithm adapts to the physical system, hence no noise model is needed. The long short-term memory layers of the recurrent neural network maintain their performance over a large number of quantum error correction cycles, making it a practical decoder for forthcoming experimental realizations of the surface code.
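
    The shape of such a decoder can be sketched with a small recurrent network: a sequence of syndrome measurements (one vector of stabilizer outcomes per error-correction cycle) goes in, and a probability that a logical bit flip occurred comes out. The sketch uses tf.keras with random placeholder data; the layer sizes, number of stabilizers, and training setup are illustrative assumptions, not those of the paper.

        import numpy as np
        import tensorflow as tf

        # Placeholder data: n_cycles syndrome vectors per shot, binary logical label.
        n_shots, n_cycles, n_stabilizers = 1000, 20, 24
        X = np.random.randint(0, 2, (n_shots, n_cycles, n_stabilizers)).astype("float32")
        y = np.random.randint(0, 2, (n_shots,)).astype("float32")

        model = tf.keras.Sequential([
            tf.keras.layers.LSTM(64, input_shape=(n_cycles, n_stabilizers)),
            tf.keras.layers.Dense(1, activation="sigmoid"),  # P(logical bit flip)
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
        model.fit(X, y, epochs=2, batch_size=32, verbose=0)
        print(model.predict(X[:4], verbose=0).ravel())  # decoder outputs for 4 shots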

  16. Investigation of roughing machining simulation by using visual basic programming in NX CAM system

    Science.gov (United States)

    Hafiz Mohamad, Mohamad; Nafis Osman Zahid, Muhammed

    2018-03-01

    This paper outlines a simulation study to investigate the characteristics of roughing machining simulation in 4th-axis milling processes by utilizing Visual Basic programming in the NX CAM system. The selection and optimization of cutting orientation in rough milling operations is critical in 4th-axis machining. The main purpose of the roughing operation is to approximately shape the machined parts into finished form by removing the bulk of material from the workpieces. In this paper, the simulations are executed by manipulating a set of different cutting orientations to generate the estimated volume removed from the machined parts. The cutting orientation with high volume removal is denoted as the optimum value and chosen to execute the roughing operation. In order to run the simulation, customized software is developed to assist the routines. Operation build-up instructions in the NX CAM interface are translated into programming code via advanced tools available in Visual Basic Studio. The code is customized and equipped with decision-making tools to run and control the simulations. It permits integration with independent program files to execute specific operations. This paper aims to discuss the simulation program and identify optimum cutting orientations for roughing processes. The output of this study will broaden the simulation routines performed in NX CAM systems.

  17. Summary of UCRL pyrotron (mirror machine) program

    Energy Technology Data Exchange (ETDEWEB)

    Post, R F [Radiation Laboratory, University of California, Livermore, CA (United States)

    1958-07-01

    Under the sponsorship of the Atomic Energy Commission, work has been going forward at the University of California Radiation Laboratory since 1952 to investigate the application of the so-called 'magnetic mirror' effect to the creation and confinement of a high temperature plasma. This report presents some of the theory of operation of the Mirror Machine, and summarizes the experimental work which has been carried out.

  18. Understanding Notional Machines through Traditional Teaching with Conceptual Contraposition and Program Memory Tracing

    Directory of Open Access Journals (Sweden)

    Jeisson Hidalgo-Céspedes

    2016-08-01

    A correct understanding of how computers run code is mandatory in order to learn to program effectively. Lectures have historically been used in programming courses to teach how computers execute code, and students are assessed through traditional evaluation methods, such as exams. Constructivism learning theory objects to students' passiveness during lessons and to traditional quantitative methods for evaluating a complex cognitive process such as understanding. Constructivism proposes complementary techniques, such as conceptual contraposition and colloquies. We enriched the lectures of a "Programming II" (CS2) course by combining conceptual contraposition with program memory tracing, then we evaluated students' understanding of programming concepts through colloquies. Results revealed that these techniques applied to the lectures are insufficient to help students develop satisfactory mental models of the C++ notional machine, and colloquies behaved as the most comprehensive traditional evaluations conducted in the course.

  19. "Hour of Code": Can It Change Students' Attitudes toward Programming?

    Science.gov (United States)

    Du, Jie; Wimmer, Hayden; Rada, Roy

    2016-01-01

    The Hour of Code is a one-hour introduction to computer science organized by Code.org, a non-profit dedicated to expanding participation in computer science. This study investigated the impact of the Hour of Code on students' attitudes towards computer programming and their knowledge of programming. A sample of undergraduate students from two…

  20. OPMILL - MICRO COMPUTER PROGRAMMING ENVIRONMENT FOR CNC MILLING MACHINES THREE AXIS EQUATION PLOTTING CAPABILITIES

    Science.gov (United States)

    Ray, R. B.

    1994-01-01

    OPMILL is a computer operating system for a Kearney and Trecker milling machine that provides a fast and easy way to program machine part manufacture with an IBM compatible PC. The program gives the machinist an "equation plotter" feature which plots any set of equations that define axis moves (up to three axes simultaneously) and converts those equations to a machine milling program that will move a cutter along a defined path. Other supported functions include: drill with peck, bolt circle, tap, mill arc, quarter circle, circle, circle 2 pass, frame, frame 2 pass, rotary frame, pocket, loop and repeat, and copy blocks. The system includes a tool manager that can handle up to 25 tools and automatically adjusts tool length for each tool. It will display all tool information and stop the milling machine at the appropriate time. Information for the program is entered via a series of menus and compiled to the Kearney and Trecker format. The program can then be loaded into the milling machine, the tool path graphically displayed, and tool change information or the program in Kearney and Trecker format viewed. The program has a complete file handling utility that allows the user to load the program into memory from the hard disk, save the program to the disk with comments, view directories, merge a program on the disk with one in memory, save a portion of a program in memory, and change directories. OPMILL was developed on an IBM PS/2 running DOS 3.3 with 1 MB of RAM. OPMILL was written for an IBM PC or compatible 8088 or 80286 machine connected via an RS-232 port to a Kearney and Trecker Data Mill 700/C Control milling machine. It requires a "D:" drive (fixed-disk or virtual), a browse or text display utility, and an EGA or better display. Users wishing to modify and recompile the source code will also need Turbo BASIC, Turbo C, and Crescent Software's QuickPak for Turbo BASIC. IBM PC and IBM PS/2 are registered trademarks of International Business Machines. Turbo

  1. Notional Machines and Introductory Programming Education

    Science.gov (United States)

    Sorva, Juha

    2013-01-01

    This article brings together, summarizes, and comments on several threads of research that have contributed to our understanding of the challenges that novice programmers face when learning about the runtime dynamics of programs and the role of the computer in program execution. More specifically, the review covers the literature on programming…

  2. Development of non-linear vibration analysis code for CANDU fuelling machine

    International Nuclear Information System (INIS)

    Murakami, Hajime; Hirai, Takeshi; Horikoshi, Kiyomi; Mizukoshi, Kaoru; Takenaka, Yasuo; Suzuki, Norio.

    1988-01-01

    This paper describes the development of a non-linear, dynamic analysis code for the CANDU 600 fuelling machine (F-M), which includes a number of non-linearities such as gaps with or without Coulomb friction, special multi-linear spring connections, etc. The capabilities and features of the code and the mathematical treatment of the non-linearities are explained. The modeling and numerical methodology for the non-linearities employed in the code are verified experimentally. Finally, simulation analyses for the full-scale F-M vibration testing are carried out, and the applicability of the code to such multi-degree-of-freedom systems as the F-M is demonstrated. (author)

  3. Experiences and results multitasking a hydrodynamics code on global and local memory machines

    International Nuclear Information System (INIS)

    Mandell, D.

    1987-01-01

    A one-dimensional, time-dependent Lagrangian hydrodynamics code using a Godunov solution method has been multitasked for the Cray X-MP/48, the Intel iPSC hypercube, the Alliant FX series and the IBM RP3 computers. Actual multitasking results have been obtained for the Cray, Intel and Alliant computers, and simulated results were obtained for the Cray and RP3 machines. The differences in the methods required to multitask on each of the machines are discussed. Results are presented for a sample problem involving a shock wave moving down a channel. Comparisons are made between theoretical speedups, predicted by Amdahl's law, and the actual speedups obtained. The problems of debugging on the different machines are also described.

  4. C4.5 programs for machine learning

    CERN Document Server

    Quinlan, J Ross

    1992-01-01

    Classifier systems play a major role in machine learning and knowledge-based systems, and Ross Quinlan's work on ID3 and C4.5 is widely acknowledged to have made some of the most significant contributions to their development. This book is a complete guide to the C4.5 system as implemented in C for the UNIX environment. It contains a comprehensive guide to the system's use, the source code (about 8,800 lines), and implementation notes. The source code and sample datasets are also available for download (see below). C4.5 starts with large sets of cases belonging to known classes. The cases,

  5. Group program procedure for machining seal rings of steam turbines on digital computer controlled machines

    International Nuclear Information System (INIS)

    Glukhikh, V.K.; Skvortsov, S.B.; Sidorov, V.A.

    1982-01-01

    A group program procedure has been developed for the turning machining of seal rings. It covers the use of new, advanced high-accuracy equipment, a universal fixture for securing the full range of machined seal rings, the necessary cutting tools, and program control of the machining process. Introduction of the new technological process made it possible to improve the quality of the machined seal rings and to reduce labour consumption by 30-40%.

  6. Parallelization of MCNP Monte Carlo neutron and photon transport code in parallel virtual machine and message passing interface

    International Nuclear Information System (INIS)

    Deng Li; Xie Zhongsheng

    1999-01-01

    The coupled neutron and photon transport Monte Carlo code MCNP (version 3B) has been parallelized under the parallel virtual machine (PVM) and message passing interface (MPI) environments by modifying the previous serial code. The new code has been verified by solving sample problems. The speedup increases linearly with the number of processors, and the average efficiency is up to 99% for 12 processors. (author)

  7. Extraction of state machines of legacy C code with Cpp2XMI

    NARCIS (Netherlands)

    Brand, van den M.G.J.; Serebrenik, A.; Zeeland, van D.; Serebrenik, A.

    2008-01-01

    Analysis of legacy code is often focussed on extracting either metrics or relations, e.g. call relations or structure relations. For object-oriented programs, e.g. Java or C++ code, such relations are commonly represented as UML diagrams: e.g., such tools as Columbus [1] and Cpp2XMI [2] are capable

  8. Joint Machine Learning and Game Theory for Rate Control in High Efficiency Video Coding.

    Science.gov (United States)

    Gao, Wei; Kwong, Sam; Jia, Yuheng

    2017-08-25

    In this paper, a joint machine learning and game theory modeling (MLGT) framework is proposed for inter-frame coding tree unit (CTU) level bit allocation and rate control (RC) optimization in High Efficiency Video Coding (HEVC). First, a support vector machine (SVM) based multi-classification scheme is proposed to improve the prediction accuracy of the CTU-level rate-distortion (R-D) model. The legacy "chicken-and-egg" dilemma in video coding is overcome by the learning-based R-D model. Second, a mixed R-D model based cooperative bargaining game is proposed for bit allocation optimization, where the convexity of the mixed R-D model based utility function is proved, and the Nash bargaining solution (NBS) is achieved by the proposed iterative solution search method. The minimum utility is adjusted by the reference coding distortion and the frame-level quantization parameter (QP) change. Lastly, the intra-frame QP and the inter-frame adaptive bit ratios are adjusted to give inter frames more bit resources, maintaining smooth quality and bit consumption in the bargaining game optimization. Experimental results demonstrate that the proposed MLGT-based RC method achieves much better R-D performance, quality smoothness, bit rate accuracy, buffer control results and subjective visual quality than other state-of-the-art one-pass RC methods, and the achieved R-D performance is very close to the performance limits of the FixedQP method.

  9. TEACHING MACHINES AND PROGRAMMED LEARNING, A SOURCE BOOK.

    Science.gov (United States)

    LUMSDAINE, A.A., ED.; GLASER, ROBERT, ED.

    BROUGHT TOGETHER HERE IS THE WIDELY-SCATTERED LITERATURE ON SELF-INSTRUCTIONAL PROGRAMS AND DEVICES BY LEADERS, PAST AND PRESENT, IN THEIR DEVELOPMENT. S.L. PRESSEY IN HIS ARTICLES DESCRIBES THE APPARATUS, METHODS, THEORY, AND RESULTS ATTENDANT UPON USE OF HIS TEST-SCORING DEVICES. B.F. SKINNER IN HIS ARTICLES DEVELOPS THEORY, DESCRIBES MACHINES,…

  10. Asnuntuck Community College's Machine Technology Certificate and Degree Programs.

    Science.gov (United States)

    Irlen, Harvey S.; Gulluni, Frank D.

    2002-01-01

    States that although manufacturing remains a viable sector in Connecticut, it is experiencing skills shortages in the workforce. Describes the machine technology program's purpose, the development of the Asnuntuck Community College's (Connecticut) partnership with private sector manufacturers, the curriculum, the outcomes, and benefits of…

  11. A Teaching System To Learn Programming: the Programmer's Learning Machine

    OpenAIRE

    Quinson , Martin; Oster , Gérald

    2015-01-01

    The Programmer's Learning Machine (PLM) is an interactive exerciser for learning programming and algorithms. Using an integrated and graphical environment that provides a short feedback loop, it allows students to learn in a (semi)-autonomous way. This generic platform also enables teachers to create specific programming microworlds that match their teaching goals. This paper discusses our design goals and motivations, introduces the existing material and the proposed ...

  12. A Navier-Stokes Chimera Code on the Connection Machine CM-5: Design and Performance

    Science.gov (United States)

    Jespersen, Dennis C.; Levit, Creon; Kwak, Dochan (Technical Monitor)

    1994-01-01

    We have implemented a three-dimensional compressible Navier-Stokes code on the Connection Machine CM-5. The code is set up for implicit time-stepping on single or multiple structured grids. For multiple grids and geometrically complex problems, we follow the 'chimera' approach, where flow data on one zone is interpolated onto another in the region of overlap. We will describe our design philosophy and give some timing results for the current code. A parallel machine like the CM-5 is well-suited for finite-difference methods on structured grids. The regular pattern of connections of a structured mesh maps well onto the architecture of the machine. So the first design choice, finite differences on a structured mesh, is natural. We use centered differences in space, with added artificial dissipation terms. When numerically solving the Navier-Stokes equations, there are liable to be some mesh cells near a solid body that are small in at least one direction. This mesh cell geometry can impose a very severe CFL (Courant-Friedrichs-Lewy) condition on the time step for explicit time-stepping methods. Thus, though explicit time-stepping is well-suited to the architecture of the machine, we have adopted implicit time-stepping. We have further taken the approximate factorization approach. This creates the need to solve large banded linear systems and creates the first possible barrier to an efficient algorithm. To overcome this first possible barrier we have considered two options. The first is just to solve the banded linear systems with data spread over the whole machine, using whatever fast method is available. This option is adequate for solving scalar tridiagonal systems, but for scalar pentadiagonal or block tridiagonal systems it is somewhat slower than desired. The second option is to 'transpose' the flow and geometry variables as part of the time-stepping process: Start with x-lines of data in-processor. Form explicit terms in x, then transpose so y-lines of data are
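
    The transpose strategy described above can be illustrated with a small sketch: solve scalar tridiagonal systems along the axis whose lines are contiguous in memory, then transpose so the other axis becomes contiguous and solve again. The operator, coefficients, and grid sizes below are placeholders, not the factored Navier-Stokes operators of the paper.

```python
# Generic sketch of the data-transpose idea: an implicit x-sweep followed by a
# transpose so that y-lines are contiguous ("in-processor") for the y-sweep.
# The tridiagonal operator (a simple implicit diffusion step) is illustrative.
import numpy as np
from scipy.linalg import solve_banded

def solve_tridiagonal_lines(rhs):
    """Solve a scalar tridiagonal system along axis 0 for every line in rhs."""
    n = rhs.shape[0]
    ab = np.zeros((3, n))
    ab[0, 1:] = -0.25      # super-diagonal
    ab[1, :] = 1.5         # main diagonal
    ab[2, :-1] = -0.25     # sub-diagonal
    return solve_banded((1, 1), ab, rhs)

q = np.random.rand(64, 32)          # flow variable; x varies along axis 0
q = solve_tridiagonal_lines(q)      # x-sweep: x-lines already contiguous
q = solve_tridiagonal_lines(q.T).T  # y-sweep: transpose so y-lines are contiguous
```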

  13. Program Design Report of the CNC Machine Tool(V-1)

    International Nuclear Information System (INIS)

    Youm, Ki Un; Moon, J. S.; Lee, I. B.; Youn, J. H.

    2010-08-01

    The application of CNC machine tools is expanding widely as machining methods diversify and machine tools and cutting tools rapidly advance toward high-speed, efficient machining. To carry out the project of manufacturing and maintaining laboratory equipment, production design and machining technology are continually developed; in particular, the application of CNC machine tools is very important for improving productivity and quality and for relieving a manpower shortage. We publish this technical report, which includes CNC machine tool programs and drawings, to contribute to the systematic development of CNC program design and machining technology.

  14. Brain cells in the avian 'prefrontal cortex' code for features of slot-machine-like gambling.

    Directory of Open Access Journals (Sweden)

    Damian Scarf

    2011-01-01

    Slot machines are the most common and addictive form of gambling. In the current study, we recorded from single neurons in the 'prefrontal cortex' of pigeons while they played a slot-machine-like task. We identified four categories of neurons that coded for different aspects of our slot-machine-like task. Reward-Proximity neurons showed a linear increase in activity as the opportunity for a reward drew near. I-Won neurons fired only when the fourth stimulus of a winning (four-of-a-kind) combination was displayed. I-Lost neurons changed their firing rate at the presentation of the first nonidentical stimulus, that is, when it was apparent that no reward was forthcoming. Finally, Near-Miss neurons also changed their activity the moment it was recognized that a reward was no longer available, but more importantly, the activity level was related to whether the trial contained one, two, or three identical stimuli prior to the display of the nonidentical stimulus. These findings not only add to recent neurophysiological research employing simulated gambling paradigms, but also add to research addressing the functional correspondence between the avian NCL and primate PFC.

  15. Computer code qualification program for the Advanced CANDU Reactor

    International Nuclear Information System (INIS)

    Popov, N.K.; Wren, D.J.; Snell, V.G.; White, A.J.; Boczar, P.G.

    2003-01-01

    Atomic Energy of Canada Ltd (AECL) has developed and implemented a Software Quality Assurance program (SQA) to ensure that its analytical, scientific and design computer codes meet the required standards for software used in safety analyses. This paper provides an overview of the computer programs used in Advanced CANDU Reactor (ACR) safety analysis, and assessment of their applicability in the safety analyses of the ACR design. An outline of the incremental validation program, and an overview of the experimental program in support of the code validation are also presented. An outline of the SQA program used to qualify these computer codes is also briefly presented. To provide context to the differences in the SQA with respect to current CANDUs, the paper also provides an overview of the ACR design features that have an impact on the computer code qualification. (author)

  16. On the linear programming bound for linear Lee codes.

    Science.gov (United States)

    Astola, Helena; Tabus, Ioan

    2016-01-01

    Based on an invariance-type property of the Lee-compositions of a linear Lee code, additional equality constraints can be introduced into the linear programming problem for linear Lee codes. In this paper, we formulate this property in terms of an action of the multiplicative group of the field [Formula: see text] on the set of Lee-compositions. We show some useful properties of certain sums of Lee-numbers, which are the eigenvalues of the Lee association scheme, appearing in the linear programming problem for linear Lee codes. Using the additional equality constraints, we formulate the linear programming problem in a very compact form, leading to fast execution, which allows the bounds to be computed efficiently for large parameter values of the linear codes.

  17. AECL's advanced code program

    Energy Technology Data Exchange (ETDEWEB)

    McGee, G.; Ball, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    2012-07-01

    This paper discusses the advanced code project at AECL. The current suite of Analytical, Scientific and Design (ASD) computer codes in use by the Canadian nuclear power industry was mostly developed 20 or more years ago. It is increasingly difficult to develop and maintain, and it consists of many independent tools, so integrated analysis is difficult, time consuming and error-prone. The objectives of this project are to demonstrate that nuclear facility systems, structures and components meet their design objectives in terms of function, cost, and safety; to demonstrate that the nuclear facility meets licensing requirements in terms of the consequences of off-normal events (dose to the public and workers, and impact on the environment); and to demonstrate that the nuclear facility meets operational requirements with respect to on-power fuelling and outage management.

  18. Code Development and Analysis Program: developmental checkout of the BEACON/MOD2A code

    International Nuclear Information System (INIS)

    Ramsthaler, J.A.; Lime, J.F.; Sahota, M.S.

    1978-12-01

    A best-estimate transient containment code, BEACON, is being developed by EG and G Idaho, Inc. for the Nuclear Regulatory Commission's reactor safety research program. This is an advanced, two-dimensional fluid flow code designed to predict temperatures and pressures in a dry PWR containment during a hypothetical loss-of-coolant accident. The most recent version of the code, MOD2A, is presently in the final stages of production prior to being released to the National Energy Software Center. As part of the final code checkout, seven sample problems were selected to be run with BEACON/MOD2A

  19. Argonne Code Center: compilation of program abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.; Harrison, C. Jr.; Hughes, C.E.; Jorgensen, R.; Legan, M.; Menozzi, T.; Ranzini, L.; Strecok, A.J.

    1977-08-01

    This publication is the eleventh supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the complete document ANL-7411 are as follows: preface, history and acknowledgements, abstract format, recommended program package contents, program classification guide and thesaurus, and the abstract collection. (RWR)

  20. Argonne Code Center: compilation of program abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.

    1976-08-01

    This publication is the tenth supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the document are as follows: preface; history and acknowledgements; abstract format; recommended program package contents; program classification guide and thesaurus; and abstract collection. (RWR)

  1. Argonne Code Center: compilation of program abstracts

    International Nuclear Information System (INIS)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.; Harrison, C. Jr.; Hughes, C.E.; Jorgensen, R.; Legan, M.; Menozzi, T.; Ranzini, L.; Strecok, A.J.

    1977-08-01

    This publication is the eleventh supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the complete document ANL-7411 are as follows: preface, history and acknowledgements, abstract format, recommended program package contents, program classification guide and thesaurus, and the abstract collection

  2. Argonne Code Center: compilation of program abstracts

    International Nuclear Information System (INIS)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.

    1976-08-01

    This publication is the tenth supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the document are as follows: preface; history and acknowledgements; abstract format; recommended program package contents; program classification guide and thesaurus; and abstract collection

  3. Energy Efficiency Program Administrators and Building Energy Codes

    Science.gov (United States)

    Explores how energy efficiency program administrators have helped advance building energy codes at the federal, state, and local levels, using technical, institutional, financial, and other resources, and discusses potential next steps.

  4. Particle-in-cell plasma simulation codes on the connection machine

    International Nuclear Information System (INIS)

    Walker, D.W.

    1991-01-01

    Methods for implementing three-dimensional, electromagnetic, relativistic PIC plasma simulation codes on the Connection Machine (CM-2) are discussed. The gather and scatter phases of the PIC algorithm involve indirect indexing of data, which results in a large amount of communication on the CM-2. Different data decompositions are described that seek to reduce the amount of communication while maintaining good load balance. These methods require the particles to be spatially sorted at the start of each time step, which introduced another form of overhead. The different methods are implemented in CM Fortran on the CM-2 and compared. It was found that the general router is slow in performing the communication in the gather and scatter steps, which precludes an efficient CM Fortran implementation. An alternative method that uses PARIS calls and the NEWS communication network to pipeline data along the axes of the VP set is suggested as a more efficient algorithm

  5. PERMUTATION-BASED POLYMORPHIC STEGO-WATERMARKS FOR PROGRAM CODES

    Directory of Open Access Journals (Sweden)

    Denys Samoilenko

    2016-06-01

    Purpose: One of the most topical trends in program code protection is code marking. The problem consists in creating digital "watermarks" that allow different copies of the same program code to be distinguished. Such marks could be useful for authorship protection, for numbering code copies, for monitoring program propagation, and for information security purposes in client-server communication processes. Methods: We used methods of digital steganography adapted for program codes as text objects. The same-shape-symbols method was transformed into a same-semantic-element method owing to features of codes that make them different from ordinary texts. We use a dynamic principle of mark formation, making the codes polymorphic. Results: We examined the combinatorial capacity of the permutations possible in program codes. It was shown that a set of 5-7 polymorphic variables is suitable for most modern network applications. Mark creation and restoration algorithms were proposed and discussed. The main algorithm is based on full and partial permutations of variable names and their declaration order. The algorithm for partial permutation enumeration was optimized for computational complexity. PHP code fragments realizing the algorithms are listed. Discussion: The method proposed in this work allows each client-server connection to be distinguished. If a clone of some network resource is found, the method can recover the embedded marks and thereby the IP address, date and time, and authentication information of the client that copied the resource. Usage of polymorphic stego-watermarks should improve information security in network communications.
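
    The core permutation idea can be sketched as follows. This is an illustrative Python sketch, not the authors' PHP implementation: an integer mark is hidden by choosing one particular ordering of a code fragment's variable declarations, so k polymorphic variables give k! distinguishable copies (120 for five variables, 5040 for seven). The variable names and the mark value are hypothetical.

```python
# Encode an integer mark as a permutation of variable declaration order
# (Lehmer-code style), and recover it later from a marked copy.
from math import factorial

def index_to_permutation(items, index):
    """Return the index-th permutation of items (0 <= index < len(items)!)."""
    items, result = list(items), []
    for i in range(len(items), 0, -1):
        pos, index = divmod(index, factorial(i - 1))
        result.append(items.pop(pos))
    return result

def permutation_to_index(original, permuted):
    """Inverse of index_to_permutation: recover the hidden mark."""
    items, index = list(original), 0
    for name in permuted:
        pos = items.index(name)
        index += pos * factorial(len(items) - 1)
        items.pop(pos)
    return index

names = ["userId", "token", "buffer", "offset", "state"]   # hypothetical variables
mark = 42                                                  # e.g. a client/copy number
stego_order = index_to_permutation(names, mark)            # declaration order to emit
assert permutation_to_index(names, stego_order) == mark    # mark is recoverable
```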

  6. International Code Assessment and Applications Program: Annual report

    International Nuclear Information System (INIS)

    Ting, P.; Hanson, R.; Jenks, R.

    1987-03-01

    This is the first annual report of the International Code Assessment and Applications Program (ICAP). The ICAP was organized by the Office of Nuclear Regulatory Research, United States Nuclear Regulatory Commission (USNRC) in 1985. The ICAP is an international cooperative reactor safety research program planned to continue over a period of approximately five years. To date, eleven European and Asian countries/organizations have joined the program through bilateral agreements with the USNRC. Seven proposed agreements are currently under negotiation. The primary mission of the ICAP is to provide independent assessment of the three major advanced computer codes (RELAP5, TRAC-PWR, and TRAC-BWR) developed by the USNRC. However, program activities can be expected to enhance the assessment process throughout member countries. The codes were developed to calculate the reactor plant response to transients and loss-of-coolant accidents. Accurate prediction of normal and abnormal plant response using the codes enhances procedures and regulations used for the safe operation of the plant and also provides technical basis for assessing the safety margin of future reactor plant designs. The ICAP is providing required assessment data that will contribute to quantification of the code uncertainty for each code. The first annual report is devoted to coverage of program activities and accomplishments during the period between April 1985 and March 1987

  7. Machine medical ethics

    CERN Document Server

    Pontier, Matthijs

    2015-01-01

    The essays in this book, written by researchers from both humanities and sciences, describe various theoretical and experimental approaches to adding medical ethics to a machine in medical settings. Medical machines are in close proximity with human beings, and getting closer: with patients who are in vulnerable states of health, who have disabilities of various kinds, with the very young or very old, and with medical professionals. In such contexts, machines are undertaking important medical tasks that require emotional sensitivity, knowledge of medical codes, human dignity, and privacy. As machine technology advances, ethical concerns become more urgent: should medical machines be programmed to follow a code of medical ethics? What theory or theories should constrain medical machine conduct? What design features are required? Should machines share responsibility with humans for the ethical consequences of medical actions? How ought clinical relationships involving machines to be modeled? Is a capacity for e...

  8. Machine Learning: developing an image recognition program : with Python, Scikit Learn and OpenCV

    OpenAIRE

    Nguyen, Minh

    2016-01-01

    Machine Learning is one of the most debated topics in the computer world these days, especially after the first computer Go program beat a human Go world champion. Among the endless applications of Machine Learning is image recognition, a problem that involves processing enormous amounts of data from dynamic input. This thesis will present the basic concepts of Machine Learning, Machine Learning algorithms, the Python programming language and Scikit Learn – a simple and efficient tool for data analysis in P...

  9. RunJumpCode: An Educational Game for Educating Programming

    Science.gov (United States)

    Hinds, Matthew; Baghaei, Nilufar; Ragon, Pedrito; Lambert, Jonathon; Rajakaruna, Tharindu; Houghton, Travers; Dacey, Simon

    2017-01-01

    Programming promotes critical thinking, problem solving and analytic skills through creating solutions that can solve everyday problems. However, learning programming can be a daunting experience for a lot of students. "RunJumpCode" is an educational 2D platformer video game, designed and developed in Unity, to teach players the…

  10. Epigenetic codes programming class switch recombination

    Directory of Open Access Journals (Sweden)

    Bharat eVaidyanathan

    2015-09-01

    Class switch recombination imparts B cells with a fitness-associated adaptive advantage during a humoral immune response by using a precision-tailored DNA excision and ligation process to swap the default constant region gene of the antibody with a new one that has unique effector functions. This secondary diversification of the antibody repertoire is a hallmark of the adaptability of B cells when confronted with environmental and pathogenic challenges. Given that the nucleotide sequence of genes during class switching remains unchanged (genetic constraints), it is logical and necessary, therefore, to integrate the adaptability of B cells to an epigenetic state, which is dynamic and can be heritably modulated before, after or even during an antibody-dependent immune response. Epigenetic regulation encompasses heritable changes that affect function (phenotype) without altering the sequence information embedded in a gene, and include histone, DNA and RNA modifications. Here, we review current literature on how B cells use an epigenetic code language as a means to ensure antibody plasticity in light of pathogenic insults.

  11. Interactive game programming with Python (CodeSkulptor)

    OpenAIRE

    Ajayi, Richard Olugbenga

    2014-01-01

    Over the years, several types of gaming platforms have been created to encourage a more organised and friendly atmosphere for game lovers across various walks of life, cultures, and environments. This thesis focuses on the concept of interactive programming using Python. It encourages the use of Python to create simple interactive game applications based on basic human concepts and ideas. CodeSkulptor is a browser-based IDE programming environment and uses the Python programming language. O...

  12. LFTPLT8: plotter program for RELAP5 code

    International Nuclear Information System (INIS)

    Yamano, Kazuaki; Abe, Nobuaki; Tasaka, Kanji

    1981-02-01

    The plotter program LFTPLT8 is a new version of LFTPLT7, developed to plot results calculated by the RELAP5 code. The RELAP5/MOD0 code has also been revised for LFTPLT8. LFTPLT8 is capable of plotting any combination of experimental data and results calculated by RELAP4J, RELAP4/MOD5, ALARM-P1, and RELAP5/MOD0 on the same plot. (author)

  13. Optimization and Openmp Parallelization of a Discrete Element Code for Convex Polyhedra on Multi-Core Machines

    Science.gov (United States)

    Chen, Jian; Matuttis, Hans-Georg

    2013-02-01

    We report our experiences with the optimization and parallelization of a discrete element code for convex polyhedra on multi-core machines and introduce a novel variant of the sort-and-sweep neighborhood algorithm. While in theory the code parallelizes ideally, in practice the results on different architectures with different compilers and performance measurement tools depend very much on the particle number and the optimization of the code. After difficulties with the interpretation of the speedup and efficiency data were overcome, respectable parallelization speedups were obtained.
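
    The sort-and-sweep idea the paper builds on can be sketched in a few lines (the paper's novel variant is more involved). The interval data below are toy values; in a discrete element code they would be the particles' bounding-box extents along one axis, and the resulting candidate pairs would then go to the expensive polyhedron-contact test.

```python
# Minimal sort-and-sweep broad phase: sort intervals by their lower bound and
# sweep, keeping a list of "open" intervals to collect candidate overlap pairs.
def sort_and_sweep(intervals):
    """intervals: list of (particle_id, lo, hi) projected onto one axis."""
    candidates, active = [], []                      # active: intervals still open
    for pid, lo, hi in sorted(intervals, key=lambda iv: iv[1]):
        active = [(a_id, a_hi) for a_id, a_hi in active if a_hi >= lo]
        candidates.extend((a_id, pid) for a_id, _ in active)
        active.append((pid, hi))
    return candidates

print(sort_and_sweep([(0, 0.0, 1.0), (1, 0.5, 1.5), (2, 2.0, 3.0)]))   # [(0, 1)]
```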

  14. Virtual machine provisioning, code management, and data movement design for the Fermilab HEPCloud Facility

    Science.gov (United States)

    Timm, S.; Cooper, G.; Fuess, S.; Garzoglio, G.; Holzman, B.; Kennedy, R.; Grassano, D.; Tiradani, A.; Krishnamurthy, R.; Vinayagam, S.; Raicu, I.; Wu, H.; Ren, S.; Noh, S.-Y.

    2017-10-01

    The Fermilab HEPCloud Facility Project has as its goal to extend the current Fermilab facility interface to provide transparent access to disparate resources including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity to respond to peaks of demand without overprovisioning local resources. Full scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics Experiments, CMS and NOνA, at the scale of 58000 simultaneous cores. This paper describes the significant improvements that were made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions, and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontiersquid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various services provided by AWS to build the infrastructure (stack). We discuss the architecture and load testing benchmarks on the squid servers. We also describe various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions that were used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test, and our future plans to expand and improve the Fermilab HEP Cloud Facility.

  15. Virtual Machine Provisioning, Code Management, and Data Movement Design for the Fermilab HEPCloud Facility

    Energy Technology Data Exchange (ETDEWEB)

    Timm, S. [Fermilab; Cooper, G. [Fermilab; Fuess, S. [Fermilab; Garzoglio, G. [Fermilab; Holzman, B. [Fermilab; Kennedy, R. [Fermilab; Grassano, D. [Fermilab; Tiradani, A. [Fermilab; Krishnamurthy, R. [IIT, Chicago; Vinayagam, S. [IIT, Chicago; Raicu, I. [IIT, Chicago; Wu, H. [IIT, Chicago; Ren, S. [IIT, Chicago; Noh, S. Y. [KISTI, Daejeon

    2017-11-22

    The Fermilab HEPCloud Facility Project has as its goal to extend the current Fermilab facility interface to provide transparent access to disparate resources including commercial and community clouds, grid federations, and HPC centers. This facility enables experiments to perform the full spectrum of computing tasks, including data-intensive simulation and reconstruction. We have evaluated the use of the commercial cloud to provide elasticity to respond to peaks of demand without overprovisioning local resources. Full scale data-intensive workflows have been successfully completed on Amazon Web Services for two High Energy Physics Experiments, CMS and NOνA, at the scale of 58000 simultaneous cores. This paper describes the significant improvements that were made to the virtual machine provisioning system, code caching system, and data movement system to accomplish this work. The virtual image provisioning and contextualization service was extended to multiple AWS regions, and to support experiment-specific data configurations. A prototype Decision Engine was written to determine the optimal availability zone and instance type to run on, minimizing cost and job interruptions. We have deployed a scalable on-demand caching service to deliver code and database information to jobs running on the commercial cloud. It uses the frontiersquid server and CERN VM File System (CVMFS) clients on EC2 instances and utilizes various services provided by AWS to build the infrastructure (stack). We discuss the architecture and load testing benchmarks on the squid servers. We also describe various approaches that were evaluated to transport experimental data to and from the cloud, and the optimal solutions that were used for the bulk of the data transport. Finally, we summarize lessons learned from this scale test, and our future plans to expand and improve the Fermilab HEP Cloud Facility.

  16. A new method for species identification via protein-coding and non-coding DNA barcodes by combining machine learning with bioinformatic methods.

    Directory of Open Access Journals (Sweden)

    Ai-bing Zhang

    Species identification via DNA barcodes is contributing greatly to current bioinventory efforts. The initial, and widely accepted, proposal was to use the protein-coding cytochrome c oxidase subunit I (COI) region as the standard barcode for animals, but recently non-coding internal transcribed spacer (ITS) genes have been proposed as candidate barcodes for both animals and plants. However, achieving a robust alignment for non-coding regions can be problematic. Here we propose two new methods (DV-RBF and FJ-RBF) to address this issue for species assignment by both coding and non-coding sequences that take advantage of the power of machine learning and bioinformatics. We demonstrate the value of the new methods with four empirical datasets, two representing typical protein-coding COI barcode datasets (neotropical bats and marine fish) and two representing non-coding ITS barcodes (rust fungi and brown algae). Using two random sub-sampling approaches, we demonstrate that the new methods significantly outperformed existing Neighbor-joining (NJ) and Maximum likelihood (ML) methods for both coding and non-coding barcodes when there was complete species coverage in the reference dataset. The new methods also out-performed NJ and ML methods for non-coding sequences in circumstances of potentially incomplete species coverage, although then the NJ and ML methods performed slightly better than the new methods for protein-coding barcodes. A 100% success rate of species identification was achieved with the two new methods for 4,122 bat queries and 5,134 fish queries using COI barcodes, with 95% confidence intervals (CI) of 99.75-100%. The new methods also obtained a 96.29% success rate (95% CI: 91.62-98.40%) for 484 rust fungi queries and a 98.50% success rate (95% CI: 96.60-99.37%) for 1094 brown algae queries, both using ITS barcodes.

  17. A new method for species identification via protein-coding and non-coding DNA barcodes by combining machine learning with bioinformatic methods.

    Science.gov (United States)

    Zhang, Ai-bing; Feng, Jie; Ward, Robert D; Wan, Ping; Gao, Qiang; Wu, Jun; Zhao, Wei-zhong

    2012-01-01

    Species identification via DNA barcodes is contributing greatly to current bioinventory efforts. The initial, and widely accepted, proposal was to use the protein-coding cytochrome c oxidase subunit I (COI) region as the standard barcode for animals, but recently non-coding internal transcribed spacer (ITS) genes have been proposed as candidate barcodes for both animals and plants. However, achieving a robust alignment for non-coding regions can be problematic. Here we propose two new methods (DV-RBF and FJ-RBF) to address this issue for species assignment by both coding and non-coding sequences that take advantage of the power of machine learning and bioinformatics. We demonstrate the value of the new methods with four empirical datasets, two representing typical protein-coding COI barcode datasets (neotropical bats and marine fish) and two representing non-coding ITS barcodes (rust fungi and brown algae). Using two random sub-sampling approaches, we demonstrate that the new methods significantly outperformed existing Neighbor-joining (NJ) and Maximum likelihood (ML) methods for both coding and non-coding barcodes when there was complete species coverage in the reference dataset. The new methods also out-performed NJ and ML methods for non-coding sequences in circumstances of potentially incomplete species coverage, although then the NJ and ML methods performed slightly better than the new methods for protein-coding barcodes. A 100% success rate of species identification was achieved with the two new methods for 4,122 bat queries and 5,134 fish queries using COI barcodes, with 95% confidence intervals (CI) of 99.75-100%. The new methods also obtained a 96.29% success rate (95%CI: 91.62-98.40%) for 484 rust fungi queries and a 98.50% success rate (95%CI: 96.60-99.37%) for 1094 brown algae queries, both using ITS barcodes.
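
    As a rough illustration of the alignment-free, learning-based idea (not the paper's DV-RBF or FJ-RBF methods), barcode sequences can be turned into k-mer count vectors and classified with an RBF-kernel support vector machine. The sequences and species labels below are toy placeholders.

```python
# Generic sketch: species assignment from DNA barcodes via 3-mer counts and an
# RBF-kernel SVM. This only illustrates the machine-learning approach; it is
# not the DV-RBF / FJ-RBF method proposed in the paper.
from itertools import product
import numpy as np
from sklearn.svm import SVC

KMERS = ["".join(p) for p in product("ACGT", repeat=3)]

def kmer_vector(seq):
    return np.array([seq.count(k) for k in KMERS], dtype=float)

train_seqs = ["ACGTACGTACGT", "ACGAACGAACGA", "TTTTGGGGCCCC", "TTTCGGGCCCCC"]
train_species = ["species_A", "species_A", "species_B", "species_B"]

clf = SVC(kernel="rbf", gamma="scale")
clf.fit(np.vstack([kmer_vector(s) for s in train_seqs]), train_species)
print(clf.predict([kmer_vector("ACGTACGAACGT")]))   # likely assigned to species_A
```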

  18. Sequence Coding and Search System Backfit Quality Assurance Program Plan

    International Nuclear Information System (INIS)

    Lovell, C.J.; Stepina, P.L.

    1985-03-01

    The Sequence Coding and Search System is a computer-based encoding system for events described in Licensee Event Reports. This data system contains LERs from 1981 to present. Backfit of the data system to include LERs prior to 1981 is required. This report documents the Quality Assurance Program Plan that EG and G Idaho, Inc. will follow while encoding 1980 LERs

  19. The linear programming bound for binary linear codes

    NARCIS (Netherlands)

    Brouwer, A.E.

    1993-01-01

    Combining Delsarte's (1973) linear programming bound with the information that certain weights cannot occur, new upper bounds for d_min(n, k), the maximum possible minimum distance of a binary linear code with given word length n and dimension k, are derived.
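
    The underlying Delsarte linear program can be written down directly; the sketch below implements the basic LP bound for binary codes of length n and minimum distance d, without the extra "certain weights cannot occur" constraints that the paper adds for linear codes. The parameters in the final line are arbitrary examples.

```python
# Basic Delsarte LP bound: maximize sum_i A_i (i = d..n) subject to A_i >= 0 and
# nonnegativity of the Krawtchouk transform; the code-size bound is 1 + optimum.
from math import comb
from scipy.optimize import linprog

def krawtchouk(n, k, i):
    return sum((-1) ** j * comb(i, j) * comb(n - i, k - j) for j in range(k + 1))

def lp_bound(n, d):
    weights = list(range(d, n + 1))          # A_1 .. A_{d-1} are forced to zero
    c = [-1.0] * len(weights)                # maximize sum A_i -> minimize -sum
    A_ub = [[-krawtchouk(n, k, i) for i in weights] for k in range(1, n + 1)]
    b_ub = [comb(n, k) for k in range(1, n + 1)]          # K_k(0) = C(n, k)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    return 1.0 - res.fun                     # add A_0 = 1 back in

print(lp_bound(8, 4))    # upper bound on the size of a binary code with n=8, d=4
```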

  20. The Glasgow Parallel Reduction Machine: Programming Shared-memory Many-core Systems using Parallel Task Composition

    Directory of Open Access Journals (Sweden)

    Ashkan Tousimojarad

    2013-12-01

    We present the Glasgow Parallel Reduction Machine (GPRM), a novel, flexible framework for parallel task-composition based many-core programming. We allow the programmer to structure programs into task code, written as C++ classes, and communication code, written in a restricted subset of C++ with functional semantics and parallel evaluation. In this paper we discuss the GPRM, the virtual machine framework that enables the parallel task composition approach. We focus the discussion on GPIR, the functional language used as the intermediate representation of the bytecode running on the GPRM. Using examples in this language we show the flexibility and power of our task composition framework. We demonstrate the potential using an implementation of a merge sort algorithm on a 64-core Tilera processor, as well as on a conventional Intel quad-core processor and an AMD 48-core processor system. We also compare our framework with OpenMP tasks in a parallel pointer chasing algorithm running on the Tilera processor. Our results show that the GPRM programs outperform the corresponding OpenMP codes on all test platforms, and can greatly facilitate writing of parallel programs, in particular non-data parallel algorithms such as reductions.

  1. Small machine tools for small workpieces final report of the DFG priority program 1476

    CERN Document Server

    Sanders, Adam

    2017-01-01

    This contributed volume presents the research results of the program “Small machine tools for small work pieces” (SPP 1476), funded by the German Research Foundation (DFG). The book contains the final report of the priority program, presenting novel approaches for size-adapted, reconfigurable micro machine tools. The target audience primarily comprises research experts and practitioners in the field of micro machine tools, but the book may also be beneficial for graduate students.

  2. Findings From the National Machine Guarding Program?A Small Business Intervention

    OpenAIRE

    Parker, David L.; Yamin, Samuel C.; Xi, Min; Brosseau, Lisa M.; Gordon, Robert; Most, Ivan G.; Stanley, Rodney

    2016-01-01

    Objectives: The purpose of this nationwide intervention was to improve machine safety in small metal fabrication businesses (3 to 150 employees). Failures to implement machine safety programs related to guarding and lockout/tagout (LOTO) are frequent causes of Occupational Safety and Health Administration (OSHA) citations and may result in serious traumatic injury. Methods: Insurance safety consultants conducted a standardized evaluation of machine guarding, safety programs, and LOTO. Busi...

  3. TEACHING MACHINES AND PROGRAMED LEARNING, A SURVEY OF THE INDUSTRY, 1962.

    Science.gov (United States)

    FINN, JAMES D.; AND OTHERS

    THIS PAPER REPORTS THE DEVELOPMENT OF THE TEACHING MACHINES AND PROGRAMED INSTRUCTION INDUSTRY THROUGH 1961. THIS EFFORT IS AN OUTGROWTH OF TWO LARGER SURVEYS--ONE ON MATERIALS OF INSTRUCTION, THE OTHER ON HARDWARE OR DEVICES. A CATALOG AND A STATUS REPORT ARE GIVEN FOR AVAILABLE TEACHING MACHINES, PROGRAMS, AND MANUFACTURERS. (GD)

  4. From Algorithmic Black Boxes to Adaptive White Boxes: Declarative Decision-Theoretic Ethical Programs as Codes of Ethics

    OpenAIRE

    van Otterlo, Martijn

    2017-01-01

    Ethics of algorithms is an emerging topic in various disciplines such as social science, law, and philosophy, but also artificial intelligence (AI). The value alignment problem expresses the challenge of (machine) learning values that are, in some way, aligned with human requirements or values. In this paper I argue for looking at how humans have formalized and communicated values, in professional codes of ethics, and for exploring declarative decision-theoretic ethical programs (DDTEP) to fo...

  5. Programming a real code in a functional language (part 1)

    Energy Technology Data Exchange (ETDEWEB)

    Hendrickson, C.P.

    1991-09-10

    For some, functional languages hold the promise of allowing ease of programming massively parallel computers that imperative languages such as Fortran and C do not offer. At LLNL, we have initiated a project to write the physics of a major production code in Sisal, a functional language developed at LLNL in collaboration with researchers throughout the world. We are investigating the expressibility of Sisal, as well as its performance on a shared-memory multiprocessor, the Y-MP. An interesting aspect of the project is that Sisal modules can call Fortran modules, and are callable by them. This eliminates the rewriting of 80% of the production code that would not benefit from parallel execution. Preliminary results indicate that the restrictive nature of the language does not cause problems in expressing the algorithms we have chosen. Some interesting aspects of programming in a mixed functional-imperative environment have surfaced, but can be managed. 8 refs.

  6. Step by step parallel programming method for molecular dynamics code

    International Nuclear Information System (INIS)

    Orii, Shigeo; Ohta, Toshio

    1996-07-01

    Parallel programming of a numerical molecular dynamics simulation program is carried out with a step-by-step programming technique using the two-phase method. It is found that, within a certain range of computing parameters, parallel performance is obtained by using do-loop-level parallel programming, which distributes the calculation across processors according to the indices of do-loops, on the vector-parallel computer VPP500 and the scalar-parallel computer Paragon. It is also found that the VPP500 shows parallel performance over a wider range of computing parameters. The reason is that the time cost of the program parts that cannot be reduced by do-loop-level parallel programming can be reduced to a negligible level by vectorization. After that, the time-consuming parts of the program are concentrated in fewer parts that can be accelerated by do-loop-level parallel programming. This report presents the step-by-step parallel programming method and the parallel performance of the molecular dynamics code on the VPP500 and Paragon. (author)

  7. PCRELAP5: data calculation program for RELAP 5 code

    International Nuclear Information System (INIS)

    Silvestre, Larissa Jacome Barros

    2016-01-01

    Nuclear accidents around the world have led international regulatory bodies to establish rigorous criteria and requirements for nuclear power plant operation. Simulations of the various accidents and transients likely to occur at a nuclear power plant, performed with specific computer programs, are required for certifying and licensing the plant. In this context, sophisticated computational tools are used, such as the Reactor Excursion and Leak Analysis Program (RELAP5), which is the most widely used code for the thermal-hydraulic analysis of accidents and transients in nuclear reactors in Brazil and worldwide. A major difficulty in simulating with the RELAP5 code is the amount of information required for the simulation of thermal-hydraulic accidents or transients. Preparation of the input data requires a great number of mathematical operations to calculate the geometry of the components. Thus, a friendly mathematical preprocessor was designed to perform those calculations and prepare the RELAP5 input data. Visual Basic for Applications (VBA) for Microsoft Excel proved to be an effective tool to perform a number of tasks in the development of the program. In order to meet the needs of RELAP5 users, the RELAP5 Calculation Program (Programa de Calculo do RELAP5 - PCRELAP5) was designed. The components of the code were codified; all input cards, including the optional cards for each one, were programmed. In addition, an English version of PCRELAP5 was provided. Furthermore, a friendly design was developed in order to minimize the time needed to prepare the input data and the errors committed by users. In this work, the final version of this preprocessor was successfully applied to the Safety Injection System (SIS) of Angra 2. (author)

  8. Programming peptidomimetic syntheses by translating genetic codes designed de novo.

    Science.gov (United States)

    Forster, Anthony C; Tan, Zhongping; Nalam, Madhavi N L; Lin, Hening; Qu, Hui; Cornish, Virginia W; Blacklow, Stephen C

    2003-05-27

    Although the universal genetic code exhibits only minor variations in nature, Francis Crick proposed in 1955 that "the adaptor hypothesis allows one to construct, in theory, codes of bewildering variety." The existing code has been expanded to enable incorporation of a variety of unnatural amino acids at one or two nonadjacent sites within a protein by using nonsense or frameshift suppressor aminoacyl-tRNAs (aa-tRNAs) as adaptors. However, the suppressor strategy is inherently limited by compatibility with only a small subset of codons, by the ways such codons can be combined, and by variation in the efficiency of incorporation. Here, by preventing competing reactions with aa-tRNA synthetases, aa-tRNAs, and release factors during translation and by using nonsuppressor aa-tRNA substrates, we realize a potentially generalizable approach for template-encoded polymer synthesis that unmasks the substantially broader versatility of the core translation apparatus as a catalyst. We show that several adjacent, arbitrarily chosen sense codons can be completely reassigned to various unnatural amino acids according to de novo genetic codes by translating mRNAs into specific peptide analog polymers (peptidomimetics). Unnatural aa-tRNA substrates do not uniformly function as well as natural substrates, revealing important recognition elements for the translation apparatus. Genetic programming of peptidomimetic synthesis should facilitate mechanistic studies of translation and may ultimately enable the directed evolution of small molecules with desirable catalytic or pharmacological properties.

  9. Complementary Machine Intelligence and Human Intelligence in Virtual Teaching Assistant for Tutoring Program Tracing

    Science.gov (United States)

    Chou, Chih-Yueh; Huang, Bau-Hung; Lin, Chi-Jen

    2011-01-01

    This study proposes a virtual teaching assistant (VTA) to share teacher tutoring tasks in helping students practice program tracing and proposes two mechanisms of complementing machine intelligence and human intelligence to develop the VTA. The first mechanism applies machine intelligence to extend human intelligence (teacher answers) to evaluate…

  10. High explosive programmed burn in the FLAG code

    Energy Technology Data Exchange (ETDEWEB)

    Mandell, D.; Burton, D.; Lund, C.

    1998-02-01

    The models used to calculate programmed-burn high-explosive lighting times in two and three dimensions in the FLAG code are described. FLAG uses an unstructured polyhedral grid. The calculations were compared to exact solutions for a square in two dimensions and for a cube in three dimensions. The maximum error was 3.95 percent in two dimensions and 4.84 percent in three dimensions. The high-explosive lighting time model described has the advantage that only one cell at a time needs to be considered.
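
    The reference solution that such lighting-time models are compared against is simply distance over detonation velocity from the detonation point. The sketch below computes that exact lighting time at a set of cell centres; the grid, detonation point, and detonation velocity are illustrative values, not those of the paper.

```python
# Exact programmed-burn lighting times: each cell lights at
# (distance from the detonation point) / (detonation velocity).
import numpy as np

def exact_lighting_times(cell_centers, det_point, det_velocity):
    return np.linalg.norm(cell_centers - det_point, axis=1) / det_velocity

centers = np.array([[x + 0.5, y + 0.5] for x in range(4) for y in range(4)])
print(exact_lighting_times(centers, det_point=np.array([0.0, 0.0]),
                           det_velocity=8.8))   # velocity in mm/us, a representative value
```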

  11. Nonlinear programming for classification problems in machine learning

    Science.gov (United States)

    Astorino, Annabella; Fuduli, Antonio; Gaudioso, Manlio

    2016-10-01

    We survey some nonlinear models for classification problems arising in machine learning. In recent years this field has become more and more relevant owing to many practical applications, such as text and web classification, object recognition in machine vision, gene expression profile analysis, DNA and protein analysis, medical diagnosis, customer profiling, etc. Classification deals with the separation of sets by means of appropriate separation surfaces, which is generally obtained by solving a numerical optimization model. While linear separability is the basis of the most popular approach to classification, the Support Vector Machine (SVM), in recent years the use of nonlinear separating surfaces has received some attention. The objective of this work is to recall some of these proposals, mainly in terms of the numerical optimization models. In particular we tackle the polyhedral, ellipsoidal, spherical and conical separation approaches and, for some of them, we also consider the semisupervised versions.

  12. Development of hole inspection program using touch trigger probe on CNC machine tools

    International Nuclear Information System (INIS)

    Lee, Chan Ho; Lee, Eung Suk

    2012-01-01

    In response to many customers' requests, optical measurement module (OMM) applications that use automatic measuring devices to measure machined parts rapidly on a machine tool have increased steeply. Touch trigger probes are used as automatic measuring devices for job setup and feature inspection, which makes quality checking and machining compensation possible. Therefore, in this study, the use of touch trigger probes for accurate measurement of machined parts has been studied and a macro program for a hole-measuring cycle has been developed. The hole is the most common feature to be measured, but conventional methods are still not free from measuring error. In addition, the change in eccentricity of the least-squares circle was simulated as a function of the roundness error in a hole measurement. To evaluate the reliability of this study, the developed hole-measuring program was executed to measure a hole plate on the machine and verify the roundness error in the eccentricity simulation result.
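
    The least-squares circle evaluation that underlies such a hole-measuring cycle can be sketched as follows, using the algebraic (Kasa) fit of a centre and radius to probed points. The probe points below are synthetic.

```python
# Fit centre (a, b) and radius r of a circle to probed (x, y) points by
# linear least squares: 2ax + 2by + (r^2 - a^2 - b^2) = x^2 + y^2.
import numpy as np

def fit_circle(points):
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    c, *_ = np.linalg.lstsq(A, x ** 2 + y ** 2, rcond=None)
    a, b = c[0], c[1]
    r = np.sqrt(c[2] + a ** 2 + b ** 2)
    return (a, b), r

theta = np.linspace(0, 2 * np.pi, 8, endpoint=False)      # 8 simulated probe hits
probed = np.column_stack([5.0 + 10.0 * np.cos(theta), 3.0 + 10.0 * np.sin(theta)])
print(fit_circle(probed))                                  # centre ~(5, 3), radius ~10
```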

  13. An optimal maintenance policy for machine replacement problem using dynamic programming

    OpenAIRE

    Mohsen Sadegh Amalnik; Morteza Pourgharibshahi

    2017-01-01

    In this article, we present an acceptance sampling plan for the machine replacement problem based on a backward dynamic programming model. Discounted dynamic programming is used to solve a two-state machine replacement problem. We design a model for maintenance by considering the quality of the item produced. The purpose of the proposed model is to determine the optimal threshold policy for maintenance over a finite time horizon. We create a decision tree based on a sequential sampling inc...
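
    A backward dynamic programming recursion of the kind used for such machine replacement problems can be sketched as follows. All states, costs, and probabilities below are illustrative assumptions, not the paper's model or data.

```python
# Two-state machine replacement by backward dynamic programming: in each period
# choose KEEP or REPLACE to minimize expected discounted cost over a finite horizon.
GOOD, WORN = 0, 1
P_DEGRADE = 0.3          # chance a good machine becomes worn next period
RUN_COST = [1.0, 6.0]    # per-period operating cost in states GOOD, WORN
REPLACE_COST = 10.0
DISCOUNT = 0.9

def optimal_policy(horizon):
    value = [0.0, 0.0]                        # terminal cost-to-go
    policy = []
    for _ in range(horizon):                  # backward recursion
        future_good = DISCOUNT * ((1 - P_DEGRADE) * value[GOOD] + P_DEGRADE * value[WORN])
        keep_good = RUN_COST[GOOD] + future_good
        keep_worn = RUN_COST[WORN] + DISCOUNT * value[WORN]
        replace = REPLACE_COST + RUN_COST[GOOD] + future_good
        policy.append(["KEEP" if keep_good <= replace else "REPLACE",
                       "KEEP" if keep_worn <= replace else "REPLACE"])
        value = [min(keep_good, replace), min(keep_worn, replace)]
    return value, policy[::-1]                # policy[t][state], t = 0 .. horizon-1

print(optimal_policy(5))
```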

  14. The FELIX program of experiments and code development

    International Nuclear Information System (INIS)

    Turner, L.R.

    1983-01-01

    An experimental program and test bed called FELIX (Fusion Electromagnetic Induction Experiment), under construction at Argonne National Laboratory, is described. The facility includes: (a) a sizable constant field, analogous to a tokamak toroidal field or the confining field of a mirror reactor; (b) a pulsed field with a sizable rate of change, analogous to a pulsed poloidal field or to the changing field of a plasma disruption, perpendicular to the constant field; and (c) a sufficiently large volume to assure that large, complex test pieces can be tested, and that the forces, torques, currents, and field distortions which are developed are large enough to be measured accurately. The development of the necessary computer codes and the experimental program are examined. (U.K.)

  15. Joint FAM/Line Management Assessment Report on LLNL Machine Guarding Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, J. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-07-19

    The LLNL Safety Program for Machine Guarding is implemented to comply with requirements in the ES&H Manual Document 11.2, "Hazards-General and Miscellaneous," Section 13 Machine Guarding (Rev 18, issued Dec. 15, 2015). The primary goal of this LLNL Safety Program is to ensure that LLNL operations involving machine guarding are managed so that workers, equipment and government property are adequately protected. This means that all such operations are planned and approved using the Integrated Safety Management System to provide the most cost effective and safest means available to support the LLNL mission.

  16. Teaching Machines, Programming, Computers, and Instructional Technology: The Roots of Performance Technology.

    Science.gov (United States)

    Deutsch, William

    1992-01-01

    Reviews the history of the development of the field of performance technology. Highlights include early teaching machines, instructional technology, learning theory, programed instruction, the systems approach, needs assessment, branching versus linear program formats, programing languages, and computer-assisted instruction. (LRW)

  17. On Coding the States of Sequential Machines with the Use of Partition Pairs

    DEFF Research Database (Denmark)

    Zahle, Torben U.

    1966-01-01

    This article introduces a new technique of making state assignment for sequential machines. The technique is in line with the approach used by Hartmanis [l], Stearns and Hartmanis [3], and Curtis [4]. It parallels the work of Dolotta and McCluskey [7], although it was developed independently...

  18. Usage of I++ Simulator to Program Coordinate Measuring Machines when Common Programming Methods are difficult to apply

    Directory of Open Access Journals (Sweden)

    Gąska A.

    2014-02-01

    Nowadays, simulators facilitate tasks performed daily by engineers of different branches, including coordinate metrologists. Sometimes it is difficult or almost impossible to program a Coordinate Measuring Machine (CMM) using standard methods. This happens, for example, during measurements of nano elements, or when measurements are performed on high-precision (accurate) measuring machines that work in strictly air-conditioned rooms, where the presence of the operator during CMM programming could cause a temperature increase, which in turn could make it necessary to wait some time until conditions stabilize. This article describes the functioning of a simulator and its use for programming a Coordinate Measuring Machine in the latter situation. The article also describes the general process of programming CMMs, which ensures correct machine performance after starting the program on a real machine. As an example supporting the presented considerations, the measurement of an exemplary workpiece, performed on a machine working in a strictly air-conditioned room, is described.

  19. Development of an Auto-Validation Program for MARS Code Assessments

    International Nuclear Information System (INIS)

    Lee, Young Jin; Chung, Bub Dong

    2006-01-01

    MARS (Multi-dimensional Analysis of Reactor Safety) code is a best-estimate thermal-hydraulic system analysis code developed at KAERI. It is important for a thermal-hydraulic computer code to be assessed against theoretical and experimental data to verify and validate its performance and the integrity of its structure, models and correlations. The code assessment effort for a complex thermal-hydraulics code such as MARS can be tedious and time-consuming, and it requires a large amount of human intervention in data transfer to see the results in graphic form. Code developers produce many versions of a code during development, and each version needs to be verified for integrity. Thus, for the MARS code developers, it is desirable to have an automatic way of carrying out the code assessment calculations. In the present work, an Auto-Validation program that carries out the code assessment effort has been developed. The program uses a user-supplied configuration file (with a '.vv' extension) that contains commands to read the input file, to execute the user-selected MARS program, and to generate result graphs. The program can be useful if the same set of code assessments is repeated with different versions of the code. The program is written in the Delphi programming language and runs under the Microsoft Windows environment.

  20. A Computer Program for Simplifying Incompletely Specified Sequential Machines Using the Paull and Unger Technique

    Science.gov (United States)

    Ebersole, M. M.; Lecoq, P. E.

    1968-01-01

    This report presents a description of a computer program mechanized to perform the Paull and Unger process of simplifying incompletely specified sequential machines. An understanding of the process, as given in Ref. 3, is a prerequisite to the use of the techniques presented in this report. This process has specific application in the design of asynchronous digital machines and was used in the design of operational support equipment for the Mariner 1966 central computer and sequencer. A typical sequential machine design problem is presented to show where the Paull and Unger process has application. A description of the Paull and Unger process, together with a description of the computer algorithms used to develop the program mechanization, is presented. Several examples are used to clarify the Paull and Unger process and the computer algorithms. Program flow diagrams, program listings, and program user operating procedures are included as appendixes.
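
    As a rough illustration of the first step of the Paull and Unger process, the sketch below computes the compatible state pairs of an incompletely specified machine. The dictionary-based machine encoding, with None marking unspecified entries, is an assumption made for illustration, not the report's input format; the later steps (enumerating maximal compatibles and selecting a closed cover) are not shown.

        # Find all compatible state pairs of an incompletely specified machine.
        from itertools import combinations


        def compatible_pairs(states, inputs, next_state, output):
            """Return the set of unordered compatible state pairs."""
            incompatible = set()

            # Pairs whose specified outputs conflict are incompatible outright.
            for a, b in combinations(states, 2):
                for x in inputs:
                    oa, ob = output.get((a, x)), output.get((b, x))
                    if oa is not None and ob is not None and oa != ob:
                        incompatible.add(frozenset((a, b)))
                        break

            # Propagate: a pair is incompatible if some input maps it onto an
            # incompatible pair of next states.  Repeat until nothing changes.
            changed = True
            while changed:
                changed = False
                for a, b in combinations(states, 2):
                    pair = frozenset((a, b))
                    if pair in incompatible:
                        continue
                    for x in inputs:
                        na, nb = next_state.get((a, x)), next_state.get((b, x))
                        if (na is not None and nb is not None and na != nb
                                and frozenset((na, nb)) in incompatible):
                            incompatible.add(pair)
                            changed = True
                            break

            return {p for p in map(frozenset, combinations(states, 2))
                    if p not in incompatible}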

  1. Monte Carlo simulation of a multi-leaf collimator design for telecobalt machine using BEAMnrc code

    Directory of Open Access Journals (Sweden)

    Ayyangar Komanduri

    2010-01-01

    This investigation aims to design a practical multi-leaf collimator (MLC) system for the cobalt teletherapy machine and check its radiation properties using the Monte Carlo (MC) method. The cobalt machine was modeled using the BEAMnrc Omega-Beam MC system, which could be freely downloaded from the website of the National Research Council (NRC), Canada. Comparison with standard depth dose data tables and the theoretically modeled beam showed good agreement within 2%. An MLC design with low melting point alloy (LMPA) was tested for leakage properties of leaves. The LMPA leaves with a width of 7 mm and height of 6 cm, with tongue and groove of size 2 mm wide by 4 cm height, produced only 4% extra leakage compared to 10 cm height tungsten leaves. With finite 60Co source size, the interleaf leakage was insignificant. This analysis helped to design a prototype MLC as an accessory mount on a cobalt machine. The complete details of the simulation process and analysis of results are discussed.

  2. Monte Carlo simulation of a multi-leaf collimator design for telecobalt machine using BEAMnrc code

    International Nuclear Information System (INIS)

    Ayyangar, Komanduri M.; Narayan, Pradush; Jesuraj, Fenedit; Raju, M.R.; Dinesh Kumar, M.

    2010-01-01

    This investigation aims to design a practical multi-leaf collimator (MLC) system for the cobalt teletherapy machine and check its radiation properties using the Monte Carlo (MC) method. The cobalt machine was modeled using the BEAMnrc Omega-Beam MC system, which could be freely downloaded from the website of the National Research Council (NRC), Canada. Comparison with standard depth dose data tables and the theoretically modeled beam showed good agreement within 2%. An MLC design with low melting point alloy (LMPA) was tested for leakage properties of leaves. The LMPA leaves with a width of 7 mm and height of 6 cm, with tongue and groove of size 2 mm wide by 4 cm height, produced only 4% extra leakage compared to 10 cm height tungsten leaves. With finite 60Co source size, the interleaf leakage was insignificant. This analysis helped to design a prototype MLC as an accessory mount on a cobalt machine. The complete details of the simulation process and analysis of results are discussed. (author)

  3. GetData Digitizing Program Code: Description, Testing, Training

    International Nuclear Information System (INIS)

    Taova, S.

    2013-01-01

    90 percent of the compilation work in our center is obtained by data digitizing, so we are rather interested in the development of different techniques of data digitizing. Plots containing a great number of points and solid lines are the most complicated to digitize. From our point of view, including automatic or semi-automatic digitizing procedures in the Exfor-Digitizer would allow this process to be simplified significantly. We managed to test some freely available program codes. The program GetData Graph Digitizer (www.getdata-graph-digitizer.com) looks most suitable for our purposes. GetData Graph Digitizer is a program for digitizing graphs, plots and maps. Its main features are: - supported graphics formats are TIFF, JPEG, BMP and PCX; - two algorithms for automatic digitizing; - convenient manual digitizing; - a reorder tool for easy point reordering; - save/open workspace, which allows the work to be saved and returned to later; - obtained data can be exported to the clipboard; - export to the formats TXT (text file), XLS (MS Excel), XML, DXF (AutoCAD) and EPS (PostScript). GetData Graph Digitizer includes two algorithms for automatic digitizing. Auto trace lines: this method is designed to digitize solid lines. Choose the starting point, and the program will trace the line, stopping at its end. To trace a line use the Operations => Auto trace lines menu or the context menu ('Auto trace lines' item). To choose the starting point click the left mouse button, or click the right mouse button to additionally choose the direction for line tracing. Digitize area: the second way is to set a digitizing area. This method works for any type of line, including dashed lines. Data points are set at the intersections of a grid with the line. You can choose the type of grid (X grid or Y grid) and set the distance between grid lines. You can also shift the grid in such a way that it passes through a specific X (or Y) value. To digitize an area use the Operations => Digitize area menu
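
    The coordinate step that underlies any such digitizer, mapping picked pixel positions to data values from two calibration points per axis, can be sketched in a few lines of Python. Linear axes and the specific pixel numbers below are assumptions for illustration; logarithmic axes would need a log/exp pair around the same mapping.

        def make_axis_map(pixel_lo, pixel_hi, value_lo, value_hi):
            """Return a function converting one pixel coordinate to a data value."""
            scale = (value_hi - value_lo) / (pixel_hi - pixel_lo)
            return lambda p: value_lo + (p - pixel_lo) * scale


        # Calibration: pixel positions of two known ticks on each axis (assumed numbers).
        x_of = make_axis_map(pixel_lo=80, pixel_hi=620, value_lo=0.0, value_hi=10.0)
        y_of = make_axis_map(pixel_lo=400, pixel_hi=40, value_lo=0.0, value_hi=1.0)

        # Points picked on the curve, as (px, py) pixel pairs.
        picked = [(80, 400), (350, 220), (620, 40)]
        data = [(x_of(px), y_of(py)) for px, py in picked]
        print(data)   # [(0.0, 0.0), (5.0, 0.5), (10.0, 1.0)]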

  4. Computer codes for the calculation of vibrations in machines and structures

    International Nuclear Information System (INIS)

    1989-01-01

    After an introductory paper on the typical requirements to be met by vibration calculations, the first two sections of the conference papers present universal finite-element codes as well as specific codes tailored to solve individual problems. For the calculation of dynamic processes, the method of multi-component systems, which takes into account rigid bodies or partial structures and their linking and joining elements, is now increasingly applied in addition to finite elements. This method, too, is explained with reference to universal computer codes and to special versions. In mechanical engineering, rotary vibrations are a major problem, and under this topic the conference papers deal exclusively with codes that also take into account special effects such as electromechanical coupling, non-linearities in clutches, etc. (orig./HP) [de

  5. TEACHING MACHINES AND PROGRAMED LEARNING, 1962--A SURVEY OF THE INDUSTRY.

    Science.gov (United States)

    FINN, JAMES D.; AND OTHERS

    THE PURPOSE OF THIS STUDY WAS TO (1) LOCATE COMPANIES AND ORGANIZATIONS IN THE UNITED STATES PREPARING PROGRAMS AND MANUFACTURING TEACHING MACHINES FOR COMMERCIAL DISTRIBUTION, (2) OBTAIN ACCURATE DESCRIPTIONS INSOFAR AS POSSIBLE OF THE TYPES, VARIETY, AND CAPABILITIES OF EQUIPMENT BEING MANUFACTURED AND THE TYPE AND CONTENT OF PROGRAMS BEING…

  6. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

    Performance growth of single-core processors came to a halt in the past decade, but further gains were re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with Graphics Processing Units, have broadened the scope for parallelism. Several compilers have been updated to address the resulting synchronization and threading challenges. Appropriate program and algorithm classification gives software engineers a considerable advantage in identifying opportunities for effective parallelization. In the present work we investigated current algorithm species for classification; related work on classification is discussed along with a comparison of the issues that challenge classification. A set of algorithms was chosen whose structure matches different issues and performs the given tasks. We tested these algorithms using existing automatic species-extraction tools along with the Bones compiler. We added functionality to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants and mathematical functions. With this, we can retain significant information that is not captured by the original algorithm species. We implemented the new extensions in the tool, enabling automatic characterization of program code.

  7. US Accelerator R&D Program Toward Intensity Frontier Machines

    Energy Technology Data Exchange (ETDEWEB)

    Shiltsev, Vladimir [Fermilab

    2016-09-15

    The 2014 P5 report indicated the accelerator-based neutrino and rare decay physics research as a centerpiece of the US domestic HEP program. Operation, upgrade and development of the accelerators for the near-term and longer-term particle physics program at the Intensity Frontier face formidable challenges. Here we discuss key elements of the accelerator physics and technology R&D program toward future multi-MW proton accelerators.

  8. Manipulator system man-machine interface evaluation program. [technology assessment

    Science.gov (United States)

    Malone, T. B.; Kirkpatrick, M.; Shields, N. L.

    1974-01-01

    Application and requirements for remote manipulator systems for future space missions were investigated. A manipulator evaluation program was established to study the effects of various systems parameters on operator performance of tasks necessary for remotely manned missions. The program and laboratory facilities are described. Evaluation criteria and philosophy are discussed.

  9. INL Experimental Program Roadmap for Thermal Hydraulic Code Validation

    Energy Technology Data Exchange (ETDEWEB)

    Glenn McCreery; Hugh McIlroy

    2007-09-01

    Advanced computer modeling and simulation tools and protocols will be heavily relied on for a wide variety of system studies, engineering design activities, and other aspects of the Next Generation Nuclear Plant (NGNP) Very High Temperature Reactor (VHTR), the DOE Global Nuclear Energy Partnership (GNEP), and light-water reactors. The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Recent literature identifies specific experimental principles that must be followed in order to ensure that experimental data meet the standards required for a "benchmark" database. Even for well-conducted experiments, missing experimental details, such as geometrical definition, data reduction procedures, and manufacturing tolerances, have led to poor benchmark calculations. The INL has a long and deep history of research in thermal hydraulics, especially in the 1960s through 1980s when many programs such as LOFT and Semiscale were devoted to light-water reactor safety research, the EBR-II fast reactor was in operation, and a strong geothermal energy program was established. The past can serve as a partial guide for reinvigorating thermal hydraulic research at the laboratory. However, new research programs need to fully incorporate modern experimental methods such as measurement techniques using the latest instrumentation, computerized data reduction, and scaling methodology. The path forward for establishing experimental research for code model validation will require benchmark experiments conducted in suitable facilities located at the INL. This document describes thermal hydraulic facility requirements and candidate buildings and presents examples of suitable validation experiments related

  10. US/DOE Man-Machine Integration program for liquid metal reactors

    International Nuclear Information System (INIS)

    D'Zmura, A.P.; Seeman, S.E.

    1985-03-01

    The United States Department of Energy (DOE) Man-Machine Integration program was started in 1980 as an addition to the existing Liquid Metal Fast Breeder Reactor safety base technology program. The overall goal of the DOE program is to enhance the operational safety of liquid metal reactors by optimum integration of humans and machines in the overall reactor plant system and by application of the principles of human-factors engineering to the design of equipment, subsystems, facilities, operational aids, procedures and environments. In the four years since its inception the program has concentrated on understanding the control process for Liquid Metal Reactors (LMRs) and on applying advanced computer concepts to this process. This paper describes the products that have been developed in this program, present computer-related programs, and plans for the future

  11. US Liquid Metal Fast Breeder Reactor man-machine interface program

    International Nuclear Information System (INIS)

    Vaurio, J.K.; Change, S.A.

    1982-01-01

    The US LMFBR Man-Machine Interface Program is supportive to and an integral part of the LMFBR Safety Program. This paper describes the goal and objectives of the program, and the necessary research and development efforts with a logical structure for the orderly and timely implementation of the program. Current status and near-term and long-term priority activities are also summarized

  12. BLAKE - A Thermodynamics Code Based on TIGER: Users' Guide to the Revised Program

    National Research Council Canada - National Science Library

    Freedman, Eli

    1998-01-01

    .... This code, which was derived from the original version of SRI's TIGER program, is intended primarily for making calculations on the combustion products from conventional military and electrically...

  13. lncRScan-SVM: A Tool for Predicting Long Non-Coding RNAs Using Support Vector Machine.

    Science.gov (United States)

    Sun, Lei; Liu, Hui; Zhang, Lin; Meng, Jia

    2015-01-01

    Functional long non-coding RNAs (lncRNAs) have been bringing novel insight into biological study; however, it is still not trivial to accurately distinguish lncRNA transcripts (LNCTs) from protein-coding ones (PCTs). As various information and data about lncRNAs have been preserved by previous studies, it is appealing to develop novel methods to identify lncRNAs more accurately. Our method lncRScan-SVM aims at classifying PCTs and LNCTs using a support vector machine (SVM). The gold-standard datasets for lncRScan-SVM model training, lncRNA prediction and method comparison were constructed according to the GENCODE gene annotations of human and mouse, respectively. By integrating features derived from gene structure, transcript sequence, potential codon sequence and conservation, lncRScan-SVM outperforms other approaches, as evaluated by several criteria such as sensitivity, specificity, accuracy, Matthews correlation coefficient (MCC) and area under the curve (AUC). In addition, several known human lncRNA datasets were assessed using lncRScan-SVM. LncRScan-SVM is an efficient tool for predicting lncRNAs, and it is quite useful for current lncRNA study.
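
    A generic sketch of this kind of SVM-based transcript classification is shown below. It is not the authors' pipeline; the random feature matrix stands in for the structure, sequence, codon and conservation features used by lncRScan-SVM, and the labels are placeholders.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import matthews_corrcoef, roc_auc_score

        rng = np.random.default_rng(0)

        # X: one row per transcript; y: 1 = protein coding, 0 = long non-coding.
        X = rng.normal(size=(200, 4))                     # placeholder feature matrix
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)     # placeholder labels

        clf = SVC(kernel="rbf", C=1.0, probability=True)
        proba = cross_val_predict(clf, X, y, cv=10, method="predict_proba")[:, 1]
        pred = (proba >= 0.5).astype(int)

        print("AUC:", roc_auc_score(y, proba))
        print("MCC:", matthews_corrcoef(y, pred))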

  14. Learning Computer Programming: Implementing a Fractal in a Turing Machine

    Science.gov (United States)

    Pereira, Hernane B. de B.; Zebende, Gilney F.; Moret, Marcelo A.

    2010-01-01

    It is common to start a course on computer programming logic by teaching the algorithm concept from the point of view of natural languages, but in a schematic way. In this sense we note that students have difficulties in understanding and implementing the problems proposed by the teacher. The main idea of this paper is to show that the…

  15. Survey of particle codes in the Magnetic Fusion Energy Program

    International Nuclear Information System (INIS)

    1977-12-01

    In the spring of 1976, the Fusion Plasma Theory Branch of the Division of Magnetic Fusion Energy conducted a survey of all the physics computer codes being supported at that time. The purpose of that survey was to allow DMFE to prepare a description of the codes for distribution to the plasma physics community. This document is the first of several planned and covers those types of codes which treat the plasma as a group of particles

  16. An optimal maintenance policy for machine replacement problem using dynamic programming

    Directory of Open Access Journals (Sweden)

    Mohsen Sadegh Amalnik

    2017-06-01

    In this article, we present an acceptance sampling plan for the machine replacement problem based on a backward dynamic programming model. Discounted dynamic programming is used to solve a two-state machine replacement problem. We design a maintenance model that considers the quality of the item produced. The purpose of the proposed model is to determine the optimal threshold policy for maintenance over a finite time horizon. We create a decision tree based on sequential sampling, with the actions renew, repair and do nothing, and seek an optimal threshold for deciding whether to renew, repair or continue production so as to minimize the expected cost. Results show that the optimal policy is sensitive to the data, namely the probability of defective machines and the parameters defined in the model. This can be clearly demonstrated by a sensitivity analysis technique.
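
    A minimal backward-dynamic-programming sketch of a finite-horizon, two-state machine replacement problem with the actions do nothing, repair, and renew is given below. All costs, transition probabilities and the discount factor are illustrative assumptions, and the product-quality sampling of the paper's full model is omitted.

        GOOD, WORN = 0, 1
        ACTIONS = {
            # action: (action cost, transition probabilities to [GOOD, WORN] per state)
            "nothing": (0.0, {GOOD: [0.8, 0.2], WORN: [0.0, 1.0]}),
            "repair":  (3.0, {GOOD: [0.9, 0.1], WORN: [0.7, 0.3]}),
            "renew":   (8.0, {GOOD: [1.0, 0.0], WORN: [1.0, 0.0]}),
        }
        RUNNING_COST = {GOOD: 1.0, WORN: 6.0}   # per-period operating cost in each state
        HORIZON = 12
        DISCOUNT = 0.95


        def solve():
            value = [0.0, 0.0]                  # terminal values V_T(s) = 0
            plan = []                           # best decisions, collected backwards
            for _ in range(HORIZON):            # backward induction
                new_value, decision = [0.0, 0.0], [None, None]
                for s in (GOOD, WORN):
                    best_cost, best_action = None, None
                    for name, (cost, trans) in ACTIONS.items():
                        expected = sum(p * value[s2] for s2, p in enumerate(trans[s]))
                        total = cost + RUNNING_COST[s] + DISCOUNT * expected
                        if best_cost is None or total < best_cost:
                            best_cost, best_action = total, name
                    new_value[s], decision[s] = best_cost, best_action
                value = new_value
                plan.append(decision)
            return value, plan[::-1]            # values at t = 0 and per-period policy


        values, plan = solve()
        print("expected cost starting from a good machine:", round(values[GOOD], 2))
        print("first-period decisions [good, worn]:", plan[0])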

  17. Programming and machining of complex parts based on CATIA solid modeling

    Science.gov (United States)

    Zhu, Xiurong

    2017-09-01

    The programming and simulated machining of complex parts are designed using CATIA solid modeling, and the importance of programming and processing technology in the field of CNC machining is elaborated. In the part design process, the working principle is first analyzed in depth, and then the dimensions are designed, with the dimension chains linked to one another. Back-calculation and a variety of other methods are then used to determine the final dimensions of the parts. The part material was selected after careful study and repeated testing; 6061 aluminum alloy was finally chosen. According to the actual situation of the processing site, various factors in the machining process must be considered comprehensively. The simulation should be based on the actual machining process, not merely on the part shape. The result can be used as a reference for machining.

  18. Make man-machine communication easier: fuzzy programming

    Energy Technology Data Exchange (ETDEWEB)

    Farreny, H; Prade, H

    1982-06-01

    Procedures and data used by the human brain are not always accurately specified; fuzzy programming may help in the realisation of languages for the manipulation of such fuzzy entities. After having considered fuzzy instruction and its requirements, arguments, functions, predicates and designations, the authors present the outlines of a fuzzy filtering system. Two applications are given as examples; these are the accessing of a database and an expert system which may be used to solve problems in robotics.

  19. THE WORLD OF TEACHING MACHINES, PROGRAMED LEARNING AND SELF-INSTRUCTIONAL DEVICES.

    Science.gov (United States)

    FOLTZ, CHARLES I.

    TEACHING MACHINES HAVE SEVERAL ADVANTAGES--TIME IS SAVED, CORRECT RESPONSE IS REINFORCED IMMEDIATELY, AND STUDENTS ARE NOT PUBLICLY CONFRONTED WITH FAILURE. THERE ARE TWO APPROACHES TO THE PROGRAMED INSTRUCTION INTRINSIC TO THEIR USE--ONE (THE SKINNER METHOD) IS BASED ON FILLING IN BLANK SPACES, THE OTHER (THE CROWDER METHOD) EMPLOYS…

  20. PC-based support programs coupled with the sets code for large fault tree analysis

    International Nuclear Information System (INIS)

    Hioki, K.; Nakai, R.

    1989-01-01

    Power Reactor and Nuclear Fuel Development Corporation (PNC) has developed four PC programs: IEIQ (Initiating Event Identification and Quantification), MODESTY (Modular Event Description for a Variety of Systems), FAUST (Fault Summary Tables Generation Program) and ETAAS (Event Tree Analysis Assistant System). These programs prepare the input data for the SETS (Set Equation Transformation System) code and construct and quantify event trees (E/Ts) using the output of the SETS code. The capability of these programs is described and some examples of the results are presented in this paper. With these PC programs and the SETS code, PSA can now be performed with more consistency and less manpower

  1. Routine human-competitive machine intelligence by means of genetic programming

    Science.gov (United States)

    Koza, John R.; Streeter, Matthew J.; Keane, Martin

    2004-01-01

    Genetic programming is a systematic method for getting computers to automatically solve a problem. Genetic programming starts from a high-level statement of what needs to be done and automatically creates a computer program to solve the problem. The paper demonstrates that genetic programming (1) now routinely delivers high-return human-competitive machine intelligence; (2) is an automated invention machine; (3) can automatically create a general solution to a problem in the form of a parameterized topology; and (4) has delivered a progression of qualitatively more substantial results in synchrony with five approximately order-of-magnitude increases in the expenditure of computer time. Recent results involving the automatic synthesis of the topology and sizing of analog electrical circuits and controllers demonstrate these points.

  2. Simplified modeling and code usage in the PASC-3 code system by the introduction of a programming environment

    International Nuclear Information System (INIS)

    Pijlgroms, B.J.; Oppe, J.; Oudshoorn, H.L.; Slobben, J.

    1991-06-01

    A brief description is given of the PASC-3 (Petten-AMPX-SCALE) Reactor Physics code system and associated UNIPASC work environment. The PASC-3 code system is used for criticality and reactor calculations and consists of a selection from the Oak Ridge National Laboratory AMPX-SCALE-3 code collection complemented with a number of additional codes and nuclear data bases. The original codes have been adapted to run under the UNIX operating system. The recommended nuclear data base is a complete 219 group cross section library derived from JEF-1 of which some benchmark results are presented. By the addition of the UNIPASC work environment the usage of the code system is greatly simplified. Complex chains of programs can easily be coupled together to form a single job. In addition, the model parameters can be represented by variables instead of literal values which enhances the readability and may improve the integrity of the code inputs. (author). 8 refs.; 6 figs.; 1 tab

  3. National machine guarding program: Part 2. Safety management in small metal fabrication enterprises

    OpenAIRE

    Parker, David L.; Yamin, Samuel C.; Brosseau, Lisa M.; Xi, Min; Gordon, Robert; Most, Ivan G.; Stanley, Rodney

    2015-01-01

    Background: Small manufacturing businesses often lack important safety programs. Many reasons have been set forth on why this has remained a persistent problem. Methods: The National Machine Guarding Program (NMGP) was a nationwide intervention conducted in partnership with two workers' compensation insurers. Insurance safety consultants collected baseline data in 221 businesses using a 33-question safety management audit. Audits were completed during an interview with the business owner or manag...

  4. SLACINPT - a FORTRAN program that generates boundary data for the SLAC gun code

    International Nuclear Information System (INIS)

    Michel, W.L.; Hepburn, J.D.

    1982-03-01

    The FORTRAN program SLACINPT was written to simplify the preparation of boundary data for the SLAC gun code. In SLACINPT, the boundary is described by a sequence of straight line or arc segments. From these, the program generates the individual boundary mesh point data, required as input by the SLAC gun code
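
    The idea of generating boundary mesh points from straight-line and arc segments can be sketched as follows. The segment descriptions and example numbers are assumptions for illustration and do not reflect the actual SLACINPT input or output formats.

        import math


        def line_points(p0, p1, n):
            """n+1 evenly spaced points from p0 to p1 (inclusive)."""
            (x0, y0), (x1, y1) = p0, p1
            return [(x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n)
                    for i in range(n + 1)]


        def arc_points(center, radius, a0, a1, n):
            """n+1 points along a circular arc from angle a0 to a1 (radians)."""
            cx, cy = center
            return [(cx + radius * math.cos(a0 + (a1 - a0) * i / n),
                     cy + radius * math.sin(a0 + (a1 - a0) * i / n))
                    for i in range(n + 1)]


        # Example boundary: a straight segment followed by a rounded corner.
        boundary = line_points((0.0, 0.0), (2.0, 0.0), 10)
        boundary += arc_points(center=(2.0, 0.5), radius=0.5,
                               a0=-math.pi / 2, a1=0.0, n=8)[1:]
        print(len(boundary), "boundary mesh points")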

  5. RF upgrade program in LHC injectors and LHC machine

    International Nuclear Information System (INIS)

    Jensen, E.

    2012-01-01

    The main themes of the RF upgrade program are: the Linac4 project, the LLRF upgrade and the study of a tuning-free wide-band system for the PSB, the upgrade of the SPS 800 MHz amplifiers and beam controls, and the upgrade of the transverse dampers of the LHC. While LHC Splice Consolidation is certainly the top priority for LS1, some RF consolidation and upgrade work is also necessary to assure the LHC performance for the next 3-year run period. This includes: 1) necessary maintenance and consolidation work that could not fit into the shorter technical stops of the last years, 2) the upgrade of the SPS 200 MHz system from presently 4 to 6 cavities and possibly 3) the replacement of one LHC cavity module. On the longer term, the LHC luminosity upgrade requires crab cavities, for which some preparatory work in SPS Coldex must be scheduled during LS1. (author)

  6. A Case for Dynamic Reverse-code Generation to Debug Non-deterministic Programs

    Directory of Open Access Journals (Sweden)

    Jooyong Yi

    2013-09-01

    Backtracking (i.e., reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing. These implementations, however, inherently do not scale. Meanwhile, a more recent backtracking method based on reverse-code generation seems promising, because executing reverse code can restore the previous states of a program without state saving. In the literature, two methods that generate reverse code can be found: (a) static reverse-code generation, which pre-generates reverse code through static analysis before starting a debugging session, and (b) dynamic reverse-code generation, which generates reverse code by applying dynamic analysis on the fly during a debugging session. In particular, we espoused the latter in our previous work to accommodate the non-determinism of a program caused by, e.g., multi-threading. To demonstrate the usefulness of our dynamic reverse-code generation, this article presents a case study of various backtracking methods, including ours. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation outperforms the existing backtracking methods in terms of memory efficiency.
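
    A toy illustration of the reverse-code idea (not the cited tool) is sketched below: an invertible update such as x += k reverses to x -= k with nothing saved, whereas a destructive assignment must record the overwritten value. The class and method names are invented for this sketch.

        class ReversibleStore:
            def __init__(self):
                self.vars = {}
                self.reverse_log = []          # reverse "instructions", newest last

            def add(self, name, k):            # forward: x += k; reverse: x -= k
                self.vars[name] += k
                self.reverse_log.append(("add", name, -k))

            def assign(self, name, value):     # forward: x = v; reverse restores old value
                old = self.vars.get(name)
                self.vars[name] = value
                self.reverse_log.append(("assign", name, old))

            def step_back(self):
                op, name, arg = self.reverse_log.pop()
                if op == "add":
                    self.vars[name] += arg
                else:
                    self.vars[name] = arg


        s = ReversibleStore()
        s.assign("x", 10)     # forward: x = 10
        s.add("x", 5)         # forward: x += 5  -> 15
        s.step_back()         # reverse: x -= 5  -> 10
        s.step_back()         # reverse: restore pre-assignment value (None here)
        print(s.vars)         # {'x': None}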

  7. Detonation of high explosives in Lagrangian hydrodynamic codes using the programmed burn technique

    International Nuclear Information System (INIS)

    Berger, M.E.

    1975-09-01

    Two initiation methods were developed for improving the programmed burn technique for detonation of high explosives in smeared-shock Lagrangian hydrodynamic codes. The methods are verified by comparing the improved programmed burn with existing solutions in one-dimensional plane, converging, and diverging geometries. Deficiencies in the standard programmed burn are described. One of the initiation methods has been determined to be better for inclusion in production hydrodynamic codes

  8. Guidelines and procedures for the International Code Assessment and Applications Program

    International Nuclear Information System (INIS)

    1987-04-01

    This document presents the guidelines and procedures by which the International Code Assessment and Applications Program (ICAP) will be conducted. The document summarizes the management structure of the program and the relationships between and responsibilities of the United States Nuclear Regulatory Commission (USNRC) and the international participants. The procedures for code maintenance and necessary documentation are described. Guidelines for the performance and documentation of code assessment studies are presented. An overview of an effort to quantify code uncertainty, which the ICAP supports, is included

  9. Measuring coding intensity in the Medicare Advantage program.

    Science.gov (United States)

    Kronick, Richard; Welch, W Pete

    2014-01-01

    In 2004, Medicare implemented a system of paying Medicare Advantage (MA) plans that gave them greater incentive than fee-for-service (FFS) providers to report diagnoses. The data were risk scores for all Medicare beneficiaries 2004-2013 and Medicare Current Beneficiary Survey (MCBS) data, 2006-2011. The measures were the change in average risk score for all enrollees and for stayers (beneficiaries who were in either FFS or MA for two consecutive years), and prevalence rates by Hierarchical Condition Category (HCC). Each year the average MA risk score increased faster than the average FFS score. Using the risk adjustment model in place in 2004, the average MA score as a ratio of the average FFS score would have increased from 90% in 2004 to 109% in 2013. Using the model partially implemented in 2014, the ratio would have increased from 88% to 102%. The increase in relative MA scores appears to largely reflect changes in diagnostic coding, not real increases in the morbidity of MA enrollees. In survey-based data for 2006-2011, the MA-FFS ratio of risk scores remained roughly constant at 96%. Intensity of coding varies widely by contract, with some contracts coding very similarly to FFS and others coding much more intensely than the MA average. Underpinning this relative growth in scores is particularly rapid relative growth in a subset of HCCs. Medicare has taken significant steps to mitigate the effects of coding intensity in MA, including implementing a 3.4% coding intensity adjustment in 2010 and revising the risk adjustment model in 2013 and 2014. Given the continuous relative increase in the average MA risk score, further policy changes will likely be necessary.
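
    A back-of-envelope sketch of the trend implied by the figures above (an MA-to-FFS risk score ratio rising from 90% in 2004 to 109% in 2013 under the 2004 model); the per-year growth rate is derived here for illustration and is not reported by the paper.

        # Implied average annual growth of the MA-to-FFS risk score ratio.
        ratio_2004, ratio_2013, years = 0.90, 1.09, 9

        annual_growth = (ratio_2013 / ratio_2004) ** (1 / years) - 1
        print(f"implied average relative growth: {annual_growth:.1%} per year")  # ~2.1%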

  10. FLUKA A multi-particle transport code (program version 2005)

    CERN Document Server

    Ferrari, A; Fassò, A; Ranft, Johannes

    2005-01-01

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner’s guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  11. Ways to increase the effectiveness of using computers and machine programs

    Energy Technology Data Exchange (ETDEWEB)

    Bulgakov, R T; Bagautdinov, G M; Kovalenko, Yu M

    1979-01-01

    An analysis is conducted of the statistical data on the operation of the computers at the computer center of the Tatar Scientific Research and Design Institute for Oil. The reasons affecting the effectiveness of the use of the computers and the machine programs are identified through an expert questionnaire, and an ''effectiveness tree'' is compiled. Organizational measures are formulated for the executor (the computer center), the user, management, and the senior leadership, which are required in order to successfully use the computers.

  12. Generation of Efficient High-Level Hardware Code from Dataflow Programs

    OpenAIRE

    Siret , Nicolas; Wipliez , Matthieu; Nezan , Jean François; Palumbo , Francesca

    2012-01-01

    High-level synthesis (HLS) aims at reducing the time-to-market by providing an automated design process that interprets and compiles high-level abstraction programs into hardware. However, HLS tools still face limitations regarding the performance of the generated code, due to the difficulties of compiling input imperative languages into efficient hardware code. Moreover, the hardware code generated by the HLS tools is usually target-dependent and at a low level of abstraction (i.e. gate-level...

  13. RGENDF - An interface program between the NJOY code and codes using multigroup cross-sections

    International Nuclear Information System (INIS)

    Chalhoub, E.S.; Anaf, J.

    1988-02-01

    An interface program for reformatting multigroup cross-section libraries generated by NJOY into ENDF/B-V format and the EXPANDA, PFCOND and COMPAR input formats is presented. (author). 7 refs, 1 fig., 1 tab

  14. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    Science.gov (United States)

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…

  15. Rapid Prediction of Bacterial Heterotrophic Fluxomics Using Machine Learning and Constraint Programming.

    Directory of Open Access Journals (Sweden)

    Stephen Gang Wu

    2016-04-01

    13C metabolic flux analysis (13C-MFA) has been widely used to measure in vivo enzyme reaction rates (i.e., metabolic flux) in microorganisms. Mining the relationship between environmental and genetic factors and the metabolic fluxes hidden in existing fluxomic data will lead to predictive models that can significantly accelerate flux quantification. In this paper, we present a web-based platform MFlux (http://mflux.org) that predicts the bacterial central metabolism via machine learning, leveraging data from approximately 100 13C-MFA papers on heterotrophic bacterial metabolisms. Three machine learning methods, namely Support Vector Machine (SVM), k-Nearest Neighbors (k-NN), and Decision Tree, were employed to study the sophisticated relationship between influential factors and metabolic fluxes. We performed a grid search of the best parameter set for each algorithm and verified their performance through 10-fold cross validations. SVM yields the highest accuracy among all three algorithms. Further, we employed quadratic programming to adjust flux profiles to satisfy stoichiometric constraints. Multiple case studies have shown that MFlux can reasonably predict fluxomes as a function of bacterial species, substrate types, growth rate, oxygen conditions, and cultivation methods. Due to the interest in studying model organisms under particular carbon sources, bias of the fluxome in the dataset may limit the applicability of the machine learning models. This problem can be resolved after more papers on 13C-MFA are published for non-model species.
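
    The model-selection step described above can be sketched generically with scikit-learn. This is not the MFlux implementation; the random feature matrix stands in for substrate type, growth rate, oxygen condition and the other influential factors, and the target is a placeholder flux value.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.model_selection import GridSearchCV

        rng = np.random.default_rng(1)
        X = rng.normal(size=(100, 6))                              # placeholder factors
        y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=100)    # placeholder flux

        # Grid search over SVM hyper-parameters, scored by 10-fold cross-validation.
        search = GridSearchCV(
            SVR(kernel="rbf"),
            param_grid={"C": [0.1, 1, 10], "gamma": ["scale", 0.1, 1.0]},
            cv=10,
            scoring="neg_mean_squared_error",
        )
        search.fit(X, y)
        print("best parameters:", search.best_params_)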

  16. Comparative analysis of lockout programs and procedures applied to industrial machines

    Energy Technology Data Exchange (ETDEWEB)

    Chinniah, Y.; Champoux, M.; Burlet-Vienney, D.; Daigle, R. [Institut de recherche Robert-Sauve en sante et en securite du travail, Montreal, PQ (Canada)

    2008-09-15

    In 2005, approximately 20 workers in Quebec were killed by dangerous machines. Approximately 13,000 accidents in the province were linked to the use of machines. The resulting cost associated with these accidents was estimated to be $70 million to the Quebec Occupational Health and Safety Commission (CSST) in compensation and salary replacement. According to article 185 of the Quebec Occupational Health and Safety Regulation (RSST), workers intervening in hazardous zones of machines and processes during maintenance, repairs, and unjamming activities must apply lockout procedures. Lockout is defined as the placement of a lock or tag on an energy-isolating device in accordance with an established procedure, indicating that the energy-isolating device is not to be operated until removal of the lock or tag in accordance with an established procedure. This report presented a comparative analysis of lockout programs and procedures applied to industrial machines. The study attempted to answer several questions regarding the concept of lockout and its definition in the literature; the differences between legal lockout requirements among provinces and countries; different standards on lockout; the contents of lockout programs as described by different documents; and the compliance of lockout programs in a sample of industries in Quebec in terms of Canadian standard on lockout, the CSA Z460-05 (2005). The report discussed the research objectives, methodology, and results of the study. It was concluded that the concept of lockout has different meanings or definitions in the literature, especially in regulations. However, definitions of lockout which are found in standards have certain similarities. 50 refs., 52 tabs., 2 appendices.

  17. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  18. PENNON: A code for convex nonlinear and semidefinite programming

    Czech Academy of Sciences Publication Activity Database

    Kočvara, Michal; Stingl, M.

    2003-01-01

    Roč. 18, č. 3 (2003), s. 317-333 ISSN 1055-6788 R&D Projects: GA ČR GA201/00/0080 Grant - others:BMBF(DE) 03ZOM3ER Institutional research plan: CEZ:AV0Z1075907 Keywords : convex programming * semidefinite programming * large-scale problems Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.306, year: 2003

  19. International Code Assessment and Applications Program: Summary of code assessment studies concerning RELAP5/MOD2, RELAP5/MOD3, and TRAC-B

    International Nuclear Information System (INIS)

    Schultz, R.R.

    1993-12-01

    Members of the International Code Assessment Program (ICAP) have assessed the US Nuclear Regulatory Commission (USNRC) advanced thermal-hydraulic codes over the past few years in a concerted effort to identify deficiencies, to define user guidelines, and to determine the state of each code. The results of sixty-two code assessment reviews, conducted at INEL, are summarized. Code deficiencies are discussed and user recommended nodalizations investigated during the course of conducting the assessment studies and reviews are listed. All the work that is summarized was done using the RELAP5/MOD2, RELAP5/MOD3, and TRAC-B codes

  20. Object-Oriented Programming in the Development of Containment Analysis Code

    International Nuclear Information System (INIS)

    Han, Tae Young; Hong, Soon Joon; Hwang, Su Hyun; Lee, Byung Chul; Byun, Choong Sup

    2009-01-01

    After the mid 1980s, a new programming concept, Object-Oriented Programming (OOP), was introduced and designed, with features such as information hiding, encapsulation, modularity and inheritance. These offered a much more convenient programming paradigm to code developers. The OOP concept was readily embodied in programming languages such as C++ in the 1990s and is widely used in the modern software industry. In this paper, we show that the OOP concept is successfully applicable to the development of a safety analysis code for containment and propose a more explicit and easy OOP design for developers
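
    A minimal illustration of the OOP features listed above (encapsulation, modularity, inheritance), written here in Python rather than the C++ mentioned in the abstract; the class names are invented and the physics is a placeholder, not the containment code's models.

        class Component:
            """Common interface every component module implements (modularity)."""
            def advance(self, dt: float) -> None:
                raise NotImplementedError


        class ControlVolume(Component):
            def __init__(self, pressure: float, temperature: float):
                self._pressure = pressure          # internal state is encapsulated
                self._temperature = temperature

            @property
            def pressure(self) -> float:
                return self._pressure

            def advance(self, dt: float) -> None:
                # Placeholder physics: a real model would solve balance equations here.
                self._pressure *= 1.0 + 0.001 * dt


        volumes = [ControlVolume(101325.0, 300.0), ControlVolume(120000.0, 350.0)]
        for v in volumes:                          # polymorphic call via the base class
            v.advance(dt=0.1)
        print([round(v.pressure, 1) for v in volumes])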

  1. Study on Production Management in Programming of Computer Numerical Control Machines

    Directory of Open Access Journals (Sweden)

    Gheorghe Popovici

    2014-12-01

    The paper presents the results of a study regarding the need for technological knowledge in programming for machine tools with computer-aided command. Engineering is the science of making skilled things. That is why, in the "factory of the future", programming engineering will have to realise the part processing on MU-CNCs (Computer Numerical Control Machines) in the optimum economic variant. There is no "recipe" when it comes to technologies. In order to select the correct variant from among several technical variants, 10 technological requirements are put forward for the engineer to take into account in MU-CNC programming. It is the first argued synthesis of the need for technological knowledge in MU-CNC programming.

  2. Promoting Probabilistic Programming System (PPS) Development in Probabilistic Programming for Advancing Machine Learning (PPAML)

    Science.gov (United States)

    2018-03-01

    invested in the future developments of PPSs. 3.0 METHODS, ASSUMPTIONS, AND PROCEDURES: Section 3 describes the methods for each of the primary areas of ... approaches for solving machine learning problems of interest to defense, science, and the economy. Within DoD, there are different needs for ... Datasets include social network data and vaccination statistics. Those data have different characteristics (e.g., percentages for CDC regional

  3. Lex genetica: the law and ethics of programming biological code.

    Science.gov (United States)

    Burk, Dan L

    2002-01-01

    Recent advances in genetic engineering now allow the design of programmable biological artifacts. Such programming may include usage constraints that will alter the balance of ownership and control for biotechnology products. Similar changes have been analyzed in the context of digital content management systems, and while this previous work is useful in analyzing issues related to biological programming, the latter technology presents new conceptual problems that require more comprehensive evaluation of the interplay between law and technologically embedded values. In particular, the ability to embed contractual terms in technological artifacts now requires a re-examination of disclosure and consent in transactions involving such artifacts.

  4. Python Source Code Plagiarism Attacks on Introductory Programming Course Assignments

    Science.gov (United States)

    Karnalim, Oscar

    2017-01-01

    This paper empirically enlists Python plagiarism attacks that have been found on Introductory Programming course assignments for undergraduate students. According to our observation toward 400 plagiarism-suspected cases, there are 35 plagiarism attacks that have been conducted by students. It starts with comment & whitespace modification as…

  5. Fast and intuitive programming of adaptive laser cutting of lace enabled by machine vision

    Science.gov (United States)

    Vaamonde, Iago; Souto-López, Álvaro; García-Díaz, Antón

    2015-07-01

    A machine vision system has been developed, validated, and integrated in a commercial laser robot cell. It permits an offline graphical programming of laser cutting of lace. The user interface allows loading CAD designs and aligning them with images of lace pieces. Different thread widths are discriminated to generate proper cutting program templates. During online operation, the system aligns CAD models of pieces and lace images, pre-checks quality of lace cuts and adapts laser parameters to thread widths. For pieces detected with the required quality, the program template is adjusted by transforming the coordinates of every trajectory point. A low-cost lace feeding system was also developed for demonstration of full process automation.
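
    The final adaptation step described above, transforming every trajectory point of the program template into the coordinates of the detected piece, amounts to applying a 2D rigid transform once the alignment is known. A sketch follows; the rotation angle and offsets would come from the vision system and are assumed numbers here.

        import math


        def transform_trajectory(points, angle_rad, tx, ty):
            """Apply a 2D rigid transform (rotation then translation) to (x, y) points."""
            c, s = math.cos(angle_rad), math.sin(angle_rad)
            return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]


        template = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0)]      # template cut path (mm)
        adapted = transform_trajectory(template, math.radians(3.0), tx=1.2, ty=-0.4)
        print(adapted)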

  6. Use of Data to Develop a Code Blue Training Program

    Science.gov (United States)

    2017-01-28

    Recoverable fragments of this abstract describe learning objectives for a data-driven Code Blue training program: using data to create a sense of urgency for change, describing methods for linking learning objectives to performance gaps, enlisting buy-in, and garnering leadership support (https://www.paceorg.com/).

  7. TMAP/Mod 1: Tritium Migration Analysis Program code description and user's manual

    International Nuclear Information System (INIS)

    Merrill, B.J.; Jones, J.L.; Holland, D.F.

    1986-01-01

    The Tritium Migration Analysis Program (TMAP) has been developed by the Fusion Safety Program of EG and G Idaho, Inc., at the Idaho National Engineering Laboratory (INEL) as a safety analysis code to analyze tritium loss from fusion systems during normal operation and under accident conditions. TMAP is a one-dimensional code that determines tritium movement and inventories in a system of interconnected enclosures and wall structures. In addition, the thermal response of structures is modeled to provide temperature information required for calculations of tritium movement. The program is written in FORTRAN 4 and has been implemented on the National Magnetic Fusion Energy Computing Center (NMFECC)

  8. Remodularizing Java Programs for Improved Locality of Feature Implementations in Source Code

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    Explicit traceability between features and source code is known to help programmers to understand and modify programs during maintenance tasks. However, the complex relations between features and their implementations are not evident from the source code of object-oriented Java programs. Consequently, the implementations of individual features are difficult to locate, comprehend, and modify in isolation. In this paper, we present a novel remodularization approach that improves the representation of features in the source code of Java programs. Both forward- and reverse restructurings are supported through on-demand bidirectional restructuring between feature-oriented and object-oriented decompositions. The approach includes a feature location phase based on tracing program execution, and a feature representation phase that reallocates classes into a new package structure based on single...

  9. Coding and Billing in Surgical Education: A Systems-Based Practice Education Program.

    Science.gov (United States)

    Ghaderi, Kimeya F; Schmidt, Scott T; Drolet, Brian C

    Despite increased emphasis on systems-based practice through the Accreditation Council for Graduate Medical Education core competencies, few studies have examined what surgical residents know about coding and billing. We sought to create and measure the effectiveness of a multifaceted approach to improving resident knowledge and performance in documenting and coding outpatient encounters. We identified knowledge gaps and barriers to documentation and coding in the outpatient setting. We implemented a series of educational and workflow interventions with a group of 12 residents in a surgical clinic at a tertiary care center. To measure the effect of this program, we compared billing codes for 1 year before the intervention (FY2012) with prospectively collected data from the postintervention period (FY2013). All related documentation and coding were verified by study-blinded auditors. Interventions took place at the outpatient surgical clinic at Rhode Island Hospital, a tertiary-care center. A cohort of 12 plastic surgery residents ranging from postgraduate year 2 through postgraduate year 6 participated in the interventional sequence. A total of 1285 patient encounters in the preintervention group were compared with 1170 encounters in the postintervention group. Using evaluation and management (E&M) codes as a measure of documentation and coding, we demonstrated a significant and durable increase in billing with supporting clinical documentation after the intervention. For established patient visits, the monthly average E&M code level increased significantly, from 2.14 to 3.05. The program improved resident documentation, coding, and billing of outpatient clinic encounters; using externally audited coding data, we demonstrate significantly increased rates of higher-complexity E&M coding in a stable patient population, based on improved documentation and billing awareness by the residents.

  10. Automatic Creation of Machine Learning Workflows with Strongly Typed Genetic Programming

    Czech Academy of Sciences Publication Activity Database

    Křen, T.; Pilát, M.; Neruda, Roman

    2017-01-01

    Roč. 26, č. 5 (2017), č. článku 1760020. ISSN 0218-2130 R&D Projects: GA ČR GA15-19877S Grant - others:GA MŠk(CZ) LM2015042 Institutional support: RVO:67985807 Keywords : genetic programming * machine learning workflows * asynchronous evolutionary algorithm Subject RIV: IN - Informatics, Computer Science OBOR OECD: Computer sciences, information science, bioinformathics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 0.778, year: 2016

  11. Machine Shop Grinding Machines.

    Science.gov (United States)

    Dunn, James

    This curriculum manual is one in a series of machine shop curriculum manuals intended for use in full-time secondary and postsecondary classes, as well as part-time adult classes. The curriculum can also be adapted to open-entry, open-exit programs. Its purpose is to equip students with basic knowledge and skills that will enable them to enter the…

  12. Identification and Analysis of Critical Gaps in Nuclear Fuel Cycle Codes Required by the SINEMA Program

    International Nuclear Information System (INIS)

    Miron, Adrian; Valentine, Joshua; Christenson, John; Hawwari, Majd; Bhatt, Santosh; Dunzik-Gougar, Mary Lou; Lineberry, Michael

    2009-01-01

    The current state of the art in nuclear fuel cycle (NFC) modeling is an eclectic mixture of codes with various levels of applicability, flexibility, and availability. In support of the advanced fuel cycle systems analyses, especially those by the Advanced Fuel Cycle Initiative (AFCI), University of Cincinnati in collaboration with Idaho State University carried out a detailed review of the existing codes describing various aspects of the nuclear fuel cycle and identified the research and development needs required for a comprehensive model of the global nuclear energy infrastructure and the associated nuclear fuel cycles. Relevant information obtained on the NFC codes was compiled into a relational database that allows easy access to various codes' properties. Additionally, the research analyzed the gaps in the NFC computer codes with respect to their potential integration into programs that perform comprehensive NFC analysis.

  13. Identification and Analysis of Critical Gaps in Nuclear Fuel Cycle Codes Required by the SINEMA Program

    Energy Technology Data Exchange (ETDEWEB)

    Adrian Miron; Joshua Valentine; John Christenson; Majd Hawwari; Santosh Bhatt; Mary Lou Dunzik-Gougar; Michael Lineberry

    2009-10-01

    The current state of the art in nuclear fuel cycle (NFC) modeling is an eclectic mixture of codes with various levels of applicability, flexibility, and availability. In support of the advanced fuel cycle systems analyses, especially those by the Advanced Fuel Cycle Initiative (AFCI), the University of Cincinnati in collaboration with Idaho State University carried out a detailed review of the existing codes describing various aspects of the nuclear fuel cycle and identified the research and development needs required for a comprehensive model of the global nuclear energy infrastructure and the associated nuclear fuel cycles. Relevant information obtained on the NFC codes was compiled into a relational database that allows easy access to various codes' properties. Additionally, the research analyzed the gaps in the NFC computer codes with respect to their potential integration into programs that perform comprehensive NFC analysis.

  14. Shader programming for computational arts and design: A comparison between creative coding frameworks

    OpenAIRE

    Gomez, Andres Felipe; Colubri, Andres; Charalambos, Jean Pierre

    2016-01-01

    We describe an Application Program Interface (API) that facilitates the use of GLSL shaders in computational design, interactive arts, and data visualization. This API was first introduced in the version 2.0 of Processing, a programming language and environment widely used for teaching and production in the context of media arts and design, and has been recently completed in the 3.0 release. It aims to incorporate low-level shading programming into code-based design, by int...

  15. Parallel phase model : a programming model for high-end parallel machines with manycores.

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Junfeng (Syracuse University, Syracuse, NY); Wen, Zhaofang; Heroux, Michael Allen; Brightwell, Ronald Brian

    2009-04-01

    This paper presents a parallel programming model, Parallel Phase Model (PPM), for next-generation high-end parallel machines based on a distributed memory architecture consisting of a networked cluster of nodes with a large number of cores on each node. PPM has a unified high-level programming abstraction that facilitates the design and implementation of parallel algorithms to exploit both the parallelism of the many cores and the parallelism at the cluster level. The programming abstraction will be suitable for expressing both fine-grained and coarse-grained parallelism. It includes a few high-level parallel programming language constructs that can be added as an extension to an existing (sequential or parallel) programming language such as C; and the implementation of PPM also includes a light-weight runtime library that runs on top of an existing network communication software layer (e.g. MPI). Design philosophy of PPM and details of the programming abstraction are also presented. Several unstructured applications that inherently require high-volume random fine-grained data accesses have been implemented in PPM with very promising results.

  16. Contributions of the ORNL piping program to nuclear piping design codes and standards

    International Nuclear Information System (INIS)

    Moore, S.E.

    1975-11-01

    The ORNL Piping Program was conceived and established to develop basic information on the structural behavior of nuclear power plant piping components and to prepare this information in forms suitable for use in design analysis and codes and standards. One of the objectives was to develop and qualify stress indices and flexibility factors for direct use in Code-prescribed design analysis methods. Progress in this area is described

  17. Description of computer code PRINS, Program for Interpreting Gamma Spectra, developed at ENEA

    Energy Technology Data Exchange (ETDEWEB)

    Borsari, R. [ENEA, Centro Ricerche 'E. Clementel', Bologna (Italy). Dip. Energia

    1995-11-01

    The computer code PRINS, program for interpreting gamma Spectra, has been developed in collaboration with CENG/SECC (Centre Etude Nucleaire Grenoble / Service Etude Comportement du Combustible). Later it has been updated and improved at ENEA. Properties of the PRINS code are: (1) A powerful algorithm to locate the peaks; (2) An accurate evaluation of the errors; (3) Possibility of an automatic channels-energy calibration.

  18. Description of computer code PRINS, Program for Interpreting Gamma Spectra, developed at ENEA

    International Nuclear Information System (INIS)

    Borsari, R.

    1995-12-01

    The computer code PRINS, PRogram for INterpreting gamma Spectra, has been developed in collaboration with CENG/SECC (Centre Etude Nucleaire Grenoble / Service Etude Comportement du Combustible). Later it has been updated and improved at ENEA. Properties of the PRINS code are: 1) A powerful algorithm to locate the peaks; 2) An accurate evaluation of the errors; 3) Possibility of an automatic channels-energy calibration

  19. Annual coded wire tag program, Washington: Missing production groups. Annual report for 1998

    International Nuclear Information System (INIS)

    Byrne, J.; Fuss, H.

    1999-01-01

    The Bonneville Power Administration (BPA) funds the ''Annual Coded Wire Tag Program--Missing Production Groups for Columbia River Hatcheries'' project. The WDFW project has three main objectives: (1) coded-wire tag at least one production group of each species at each Columbia Basin hatchery to enable evaluation of survival and catch distribution over time, (2) recover coded-wire tags from the snouts of fish tagged under objective 1 and estimate survival, contribution, and stray rates for each group, and (3) report the findings under objective 2 for all broods of chinook, and coho released from WDFW Columbia Basin hatcheries

  20. Use of system code to estimate equilibrium tritium inventory in fusion DT machines, such as ARIES-AT and components testing facilities

    International Nuclear Information System (INIS)

    Wong, C.P.C.; Merrill, B.

    2014-01-01

    Highlights: • With the use of a system code, the tritium burn-up fraction (f_burn) can be determined. • The initial tritium inventory for steady-state DT machines can be estimated. • f_burn of ARIES-AT, CFETR and FNSF-AT is in the range of 1–2.8%. • The respective total tritium inventories are 7.6 kg, 6.1 kg, and 5.2 kg. - Abstract: ITER is under construction and will begin operation in 2020. This is the first DT device in the 500 MW fusion power class, and since it is not going to breed tritium, it will consume most of the limited supply of tritium resources in the world. Yet, in parallel, DT fusion nuclear component testing machines will be needed to provide technical data for the design of DEMO. It becomes necessary to estimate the tritium burn-up fraction, the corresponding initial tritium inventory and the doubling time of these machines for the planning of future supply and utilization of tritium. With the use of a system code, the tritium burn-up fraction and initial tritium inventory for steady-state DT machines can be estimated. Estimated tritium burn-up fractions of FNSF-AT, CFETR-R and ARIES-AT are in the range of 1–2.8%. The corresponding total equilibrium tritium inventories of the plasma flow and tritium processing system, with the DCLL blanket option, are 7.6 kg, 6.1 kg, and 5.2 kg for ARIES-AT, CFETR-R and FNSF-AT, respectively.
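
    Although the system-code models themselves are not given in the abstract, the quantities it discusses can be illustrated with a back-of-the-envelope calculation. The burn-rate constant (roughly 0.154 g of tritium burned per MW·day of DT fusion power) is a standard figure, while the fueling rate and residence time below are purely hypothetical inputs, not the values used by the authors.

```python
# Simplified illustration of burn-up fraction and equilibrium inventory.
# The relations and input numbers are assumptions for illustration only.

def burnup_fraction(fusion_power_mw, fueling_rate_g_per_day):
    """Fraction of injected tritium burned: burn rate / fueling rate.
    Roughly 0.154 g of tritium is burned per MW-day of DT fusion power."""
    burn_rate_g_per_day = 0.154 * fusion_power_mw
    return burn_rate_g_per_day / fueling_rate_g_per_day

def equilibrium_inventory_kg(throughput_g_per_day, residence_time_days):
    """Steady-state inventory ~ throughput x mean residence time in the fuel cycle."""
    return throughput_g_per_day * residence_time_days / 1000.0

print(burnup_fraction(fusion_power_mw=1800, fueling_rate_g_per_day=15000))      # ~0.018
print(equilibrium_inventory_kg(throughput_g_per_day=15000, residence_time_days=0.4))  # ~6 kg
```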

  1. Use of system code to estimate equilibrium tritium inventory in fusion DT machines, such as ARIES-AT and components testing facilities

    Energy Technology Data Exchange (ETDEWEB)

    Wong, C.P.C., E-mail: wongc@fusion.gat.com [General Atomics, San Diego, CA (United States); Merrill, B. [Idaho National Laboratory, Idaho Falls, ID (United States)

    2014-10-15

    Highlights: • With the use of a system code, the tritium burn-up fraction (f_burn) can be determined. • The initial tritium inventory for steady-state DT machines can be estimated. • f_burn of ARIES-AT, CFETR and FNSF-AT is in the range of 1–2.8%. • The respective total tritium inventories are 7.6 kg, 6.1 kg, and 5.2 kg. - Abstract: ITER is under construction and will begin operation in 2020. This is the first DT device in the 500 MW fusion power class, and since it is not going to breed tritium, it will consume most of the limited supply of tritium resources in the world. Yet, in parallel, DT fusion nuclear component testing machines will be needed to provide technical data for the design of DEMO. It becomes necessary to estimate the tritium burn-up fraction, the corresponding initial tritium inventory and the doubling time of these machines for the planning of future supply and utilization of tritium. With the use of a system code, the tritium burn-up fraction and initial tritium inventory for steady-state DT machines can be estimated. Estimated tritium burn-up fractions of FNSF-AT, CFETR-R and ARIES-AT are in the range of 1–2.8%. The corresponding total equilibrium tritium inventories of the plasma flow and tritium processing system, with the DCLL blanket option, are 7.6 kg, 6.1 kg, and 5.2 kg for ARIES-AT, CFETR-R and FNSF-AT, respectively.

  2. AutoBayes/CC: Combining Program Synthesis with Automatic Code Certification: System Description

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Code certification is a lightweight approach to formally demonstrate software quality. It concentrates on aspects of software quality that can be defined and formalized via properties, e.g., operator safety or memory safety. Its basic idea is to require code producers to provide formal proofs that their code satisfies these quality properties. The proofs serve as certificates which can be checked independently, by the code consumer or by certification authorities, e.g., the FAA. It is the idea underlying such approaches as proof-carrying code [6]. Code certification can be viewed as a more practical version of traditional Hoare-style program verification. The properties to be verified are fairly simple and regular so that it is often possible to use an automated theorem prover to automatically discharge all emerging proof obligations. Usually, however, the programmer must still splice auxiliary annotations (e.g., loop invariants) into the program to facilitate the proofs. For complex properties or larger programs this quickly becomes the limiting factor for the applicability of current certification approaches.

  3. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  4. Utility residential new construction programs: Going beyond the code. A report from the Database on Energy Efficiency Programs (DEEP) Project

    Energy Technology Data Exchange (ETDEWEB)

    Vine, E.

    1995-08-01

    Based on an evaluation of 10 residential new construction programs, primarily sponsored by investor-owned utilities in the United States, we find that many of these programs are in dire straits and are in danger of being discontinued because current inclusion of only direct program effects leads to the conclusion that they are not cost-effective. We believe that the cost-effectiveness of residential new construction programs can be improved by: (1) promoting technologies and advanced building design practices that significantly exceed state and federal standards; (2) reducing program marketing costs and developing more effective marketing strategies; (3) recognizing the role of these programs in increasing compliance with existing state building codes; and (4) allowing utilities to obtain an "energy-savings credit" from utility regulators for program spillover (market transformation) impacts. Utilities can also leverage their resources in seizing these opportunities by forming strong and trusting partnerships with the building community and with local and state government.

  5. Delay-Aware Program Codes Dissemination Scheme in Internet of Everything

    Directory of Open Access Journals (Sweden)

    Yixuan Xu

    2016-01-01

    Due to recent advancements in big data, connection technologies, and smart devices, our environment is transforming into an "Internet of Everything" (IoE) environment. These smart devices can obtain new or special functions by reprogramming: they upgrade their soft systems by receiving new versions of program codes. However, bulk code dissemination suffers from large delay, energy consumption, and number of retransmissions because of the unreliability of wireless links. In this paper, a delay-aware program dissemination (DAPD) scheme is proposed to disseminate program codes in a fast, reliable, and energy-efficient manner. We observe that although total energy is limited in a wireless sensor network, there exists residual energy in nodes deployed far from the base station. Therefore, the DAPD scheme improves the performance of bulk code dissemination through the following two aspects. (1) Because a high transmitting power can significantly improve the quality of wireless links, the transmitting power of sensors with more residual energy is increased to improve link quality. (2) Because the performance of correlated dissemination tends to degrade in a highly dynamic environment, link correlation is autonomously updated in DAPD during code dissemination to maintain the improvements brought by correlated dissemination. Theoretical analysis and experimental results show that, compared with previous work, the DAPD scheme improves the dissemination performance in terms of completion time, transmission cost, and the efficiency of energy utilization.
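
    A minimal sketch of the first mechanism, raising the transmit power of nodes with more residual energy, is shown below. The linear power mapping and the dBm limits are assumptions for illustration rather than the DAPD formula.

```python
# Toy mapping from residual-energy ratio to transmit power (illustrative only).

def transmit_power_dbm(residual_j, capacity_j, p_min_dbm=-5.0, p_max_dbm=5.0):
    """Scale transmit power linearly with the node's residual-energy ratio."""
    ratio = max(0.0, min(1.0, residual_j / capacity_j))
    return p_min_dbm + ratio * (p_max_dbm - p_min_dbm)

# Nodes far from the base station tend to retain more residual energy,
# so they are assigned a higher power to improve their link quality.
print(transmit_power_dbm(residual_j=8.0, capacity_j=10.0))   # 3.0 dBm
print(transmit_power_dbm(residual_j=2.0, capacity_j=10.0))   # -3.0 dBm
```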

  6. Annual coded wire tag program (Washington) missing production groups : annual report 2000; ANNUAL

    International Nuclear Information System (INIS)

    Dammers, Wolf; Mills, Robin D.

    2002-01-01

    The Bonneville Power Administration (BPA) funds the ''Annual Coded-wire Tag Program - Missing Production Groups for Columbia River Hatcheries'' project. The Washington Department of Fish and Wildlife (WDFW), Oregon Department of Fish and Wildlife (ODFW) and the United States Fish and Wildlife Service (USFWS) all operate salmon and steelhead rearing programs in the Columbia River basin. The intent of the funding is to coded-wire tag at least one production group of each species at each Columbia Basin hatchery to provide a holistic assessment of survival and catch distribution over time and to meet various measures of the Northwest Power Planning Council's (NWPPC) Fish and Wildlife Program. The WDFW project has three main objectives: (1) coded-wire tag at least one production group of each species at each Columbia Basin hatchery to enable evaluation of survival and catch distribution over time, (2) recover coded-wire tags from the snouts of fish tagged under objective 1 and estimate survival, contribution, and stray rates for each group, and (3) report the findings under objective 2 for all broods of chinook, and coho released from WDFW Columbia Basin hatcheries. Objective 1 for FY-00 was met with few modifications to the original FY-00 proposal. Under Objective 2, snouts containing coded-wire tags that were recovered during FY-00 were decoded. Under Objective 3, this report summarizes available recovery information through 2000 and includes detailed information for brood years 1989 to 1994 for chinook and 1995 to 1997 for coho

  7. An object-oriented extension for debugging the virtual machine

    Energy Technology Data Exchange (ETDEWEB)

    Pizzi, Jr, Robert G. [Univ. of California, Davis, CA (United States)

    1994-12-01

    A computer is nothing more than a virtual machine programmed by source code to perform a task. The program's source code expresses abstract constructs which are compiled into some lower level target language. When a virtual machine breaks, it can be very difficult to debug because typical debuggers provide only low-level target implementation information to the software engineer. We believe that the debugging task can be simplified by introducing aspects of the abstract design and data into the source code. We introduce OODIE, an object-oriented extension to programming languages that allows programmers to specify a virtual environment by describing the meaning of the design and data of a virtual machine. This specification is translated into symbolic information such that an augmented debugger can present engineers with a programmable debugging environment specifically tailored for the virtual machine that is to be debugged.

  8. A randomized, controlled intervention of machine guarding and related safety programs in small metal-fabrication businesses.

    Science.gov (United States)

    Parker, David L; Brosseau, Lisa M; Samant, Yogindra; Xi, Min; Pan, Wei; Haugan, David

    2009-01-01

    Metal fabrication employs an estimated 3.1 million workers in the United States. The absence of machine guarding and related programs such as lockout/tagout may result in serious injury or death. The purpose of this study was to improve machine-related safety in small metal-fabrication businesses. We used a randomized trial with two groups: management only and management-employee. We evaluated businesses for the adequacy of machine guarding (machine scorecard) and related safety programs (safety audit). We provided all businesses with a report outlining deficiencies and prioritizing their remediation. In addition, the management-employee group received four one-hour interactive training sessions from a peer educator. We evaluated 40 metal-fabrication businesses at baseline and 37 (93%) one year later. Of the three nonparticipants, two had gone out of business. More than 40% of devices required for adequate guarding were missing or inadequate, and 35% of required safety programs and practices were absent at baseline. Both measures improved significantly during the course of the intervention. No significant differences in changes occurred between the two intervention groups. Machine-guarding practices and programs improved by up to 13% and safety audit scores by up to 23%. Businesses that added safety committees or those that started with the lowest baseline measures showed the greatest improvements. Simple and easy-to-use assessment tools allowed businesses to significantly improve their safety practices, and safety committees facilitated this process.

  9. Mixed Integer Linear Programming based machine learning approach identifies regulators of telomerase in yeast.

    Science.gov (United States)

    Poos, Alexandra M; Maicher, André; Dieckmann, Anna K; Oswald, Marcus; Eils, Roland; Kupiec, Martin; Luke, Brian; König, Rainer

    2016-06-02

    Understanding telomere length maintenance mechanisms is central in cancer biology as their dysregulation is one of the hallmarks for immortalization of cancer cells. Important for this well-balanced control is the transcriptional regulation of the telomerase genes. We integrated Mixed Integer Linear Programming models into a comparative machine learning based approach to identify regulatory interactions that best explain the discrepancy of telomerase transcript levels in yeast mutants with deleted regulators showing aberrant telomere length, when compared to mutants with normal telomere length. We uncover novel regulators of telomerase expression, several of which affect histone levels or modifications. In particular, our results point to the transcription factors Sum1, Hst1 and Srb2 as being important for the regulation of EST1 transcription, and we validated the effect of Sum1 experimentally. We compiled our machine learning method leading to a user friendly package for R which can straightforwardly be applied to similar problems integrating gene regulator binding information and expression profiles of samples of e.g. different phenotypes, diseases or treatments. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
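
    The paper's models are not reproduced here, but the flavor of the approach, selecting a small set of regulators whose known binding targets account for the genes with aberrant transcript levels, can be sketched as a tiny set-cover style MILP. The binding table, the gene set other than EST1, and the use of the PuLP package are illustrative assumptions, not the authors' formulation.

```python
# Toy MILP: choose the smallest set of regulators whose binding targets "cover"
# every gene with aberrant expression. Data are made up; requires the PuLP package.
from pulp import LpProblem, LpVariable, LpMinimize, lpSum

binds = {                      # regulator -> genes it is known to bind (hypothetical)
    "SUM1": {"EST1", "EST2"},
    "HST1": {"EST1"},
    "SRB2": {"EST1", "EST3"},
    "XYZ9": {"EST3"},
}
aberrant = {"EST1", "EST3"}    # genes whose transcript levels need explaining

prob = LpProblem("regulator_selection", LpMinimize)
use = {r: LpVariable(f"use_{r}", cat="Binary") for r in binds}

prob += lpSum(use.values())                           # use as few regulators as possible
for gene in aberrant:                                 # every aberrant gene must be covered
    prob += lpSum(use[r] for r in binds if gene in binds[r]) >= 1

prob.solve()
print([r for r in binds if use[r].value() == 1])      # e.g. ['SRB2']
```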

  10. The Teaching of the Code of Ethics and Standard Practices for Texas Educator Preparation Programs

    Science.gov (United States)

    Davenport, Marvin; Thompson, J. Ray; Templeton, Nathan R.

    2015-01-01

    The purpose of this descriptive quantitative research study was to answer three basic informational questions: (1) To what extent ethics training, as stipulated in Texas Administrative Code Chapter 247, was included in the EPP curriculum; (2) To what extent Texas public universities with approved EPP programs provided faculty opportunities for…

  11. Towards provably correct code generation for a hard real-time programming language

    DEFF Research Database (Denmark)

    Fränzle, Martin; Müller-Olm, Markus

    1994-01-01

    This paper sketches a hard real-time programming language featuring operators for expressing timeliness requirements in an abstract, implementation-independent way and presents parts of the design and verification of a provably correct code generator for that language. The notion of implementation...

  12. Instructional Coding System for Mathematics Program of Studies. MET, A Title IV-C Project.

    Science.gov (United States)

    Fairfax County Public Schools, VA. Dept. of Instructional Services.

    This document is part of the Management for Effective Teaching (MET) support kit, a pilot project designed by the Fairfax County (Virginia) Public Schools to assist elementary school teachers in planning, managing, and implementing the county's Program of Studies (POS). This document provides an alpha-numeric coding system to be used in…

  13. Functions of Arabic-English Code-Switching: Sociolinguistic Insights from a Study Abroad Program

    Science.gov (United States)

    Al Masaeed, Khaled

    2013-01-01

    This sociolinguistic study examines the functions and motivations of code-switching, which is used here to mean the use of more than one language in the same conversation. The conversations studied here take place in a very particular context: one-on-one speaking sessions in a study abroad program in Morocco where English is the L1 and Arabic the…

  14. Assessment of Programming Language Learning Based on Peer Code Review Model: Implementation and Experience Report

    Science.gov (United States)

    Wang, Yanqing; Li, Hang; Feng, Yuqiang; Jiang, Yu; Liu, Ying

    2012-01-01

    The traditional assessment approach, in which one single written examination counts toward a student's total score, no longer meets new demands of programming language education. Based on a peer code review process model, we developed an online assessment system called "EduPCR" and used a novel approach to assess the learning of computer…

  15. Prediction of Student Dropout in E-Learning Program Through the Use of Machine Learning Method

    Directory of Open Access Journals (Sweden)

    Mingjie Tan

    2015-02-01

    The high rate of dropout is a serious problem in e-learning programs; thus it has received extensive concern from education administrators and researchers. Predicting the potential dropout students is a workable solution to prevent dropout. Based on the analysis of related literature, this study selected students' personal characteristics and academic performance as input attributes. Prediction models were developed using Artificial Neural Networks (ANN), Decision Trees (DT) and Bayesian Networks (BNs). A large sample of 62,375 students was utilized in the procedures of model training and testing. The results of each model were presented in a confusion matrix and analyzed by calculating the rates of accuracy, precision, recall, and F-measure. The results suggested that all three machine learning methods were effective for student dropout prediction, and DT presented a better performance. Finally, some suggestions were made for future research.
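
    The decision-tree variant of this setup is easy to reproduce in outline. The sketch below uses synthetic data and scikit-learn rather than the study's attribute set, data, or tooling, and reports the same confusion-matrix-derived metrics.

```python
# Decision-tree dropout prediction on synthetic data (requires scikit-learn).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import (confusion_matrix, accuracy_score,
                             precision_score, recall_score, f1_score)

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.integers(18, 60, n),          # age (hypothetical attribute)
    rng.random(n),                    # normalized grade average
    rng.integers(0, 20, n),           # logins per week
])
# Synthetic rule: low grades and low activity raise the dropout probability.
p = 1 / (1 + np.exp(-(1.5 - 4 * X[:, 1] - 0.15 * X[:, 2])))
y = (rng.random(n) < p).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)

print(confusion_matrix(y_te, pred))
print(accuracy_score(y_te, pred), precision_score(y_te, pred),
      recall_score(y_te, pred), f1_score(y_te, pred))
```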

  16. A trace display and editing program for data from fluorescence based sequencing machines.

    Science.gov (United States)

    Gleeson, T; Hillier, L

    1991-12-11

    'Ted' (Trace editor) is a graphical editor for sequence and trace data from automated fluorescence sequencing machines. It provides facilities for viewing sequence and trace data (in top or bottom strand orientation), for editing the base sequence, for automated or manual trimming of the head (vector) and tail (uncertain data) from the sequence, for vertical and horizontal trace scaling, for keeping a history of sequence editing, and for output of the edited sequence. Ted has been used extensively in the C.elegans genome sequencing project, both as a stand-alone program and integrated into the Staden sequence assembly package, and has greatly aided in the efficiency and accuracy of sequence editing. It runs in the X windows environment on Sun workstations and is available from the authors. Ted currently supports sequence and trace data from the ABI 373A and Pharmacia A.L.F. sequencers.

  17. National machine guarding program: Part 2. Safety management in small metal fabrication enterprises

    Science.gov (United States)

    Yamin, Samuel C.; Brosseau, Lisa M.; Xi, Min; Gordon, Robert; Most, Ivan G.; Stanley, Rodney

    2015-01-01

    Background: Small manufacturing businesses often lack important safety programs. Many reasons have been set forth on why this has remained a persistent problem. Methods: The National Machine Guarding Program (NMGP) was a nationwide intervention conducted in partnership with two workers' compensation insurers. Insurance safety consultants collected baseline data in 221 businesses using a 33-question safety management audit. Audits were completed during an interview with the business owner or manager. Results: Most measures of safety management improved with an increasing number of employees. This trend was particularly strong for lockout/tagout. However, size was only significant for businesses without a safety committee. Establishments with a safety committee scored higher (55% vs. 36%) on the safety management audit compared with those lacking a committee (P < 0.0001). Critical safety management programs were frequently absent. A safety committee appears to be a more important factor than business size in accounting for differences in outcome measures. Am. J. Ind. Med. 58:1184–1193, 2015. © 2015 The Authors. American Journal of Industrial Medicine Published by Wiley Periodicals, Inc. PMID:26345591

  18. National Machine Guarding Program: Part 2. Safety management in small metal fabrication enterprises.

    Science.gov (United States)

    Parker, David L; Yamin, Samuel C; Brosseau, Lisa M; Xi, Min; Gordon, Robert; Most, Ivan G; Stanley, Rodney

    2015-11-01

    Small manufacturing businesses often lack important safety programs. Many reasons have been set forth on why this has remained a persistent problem. The National Machine Guarding Program (NMGP) was a nationwide intervention conducted in partnership with two workers' compensation insurers. Insurance safety consultants collected baseline data in 221 businesses using a 33-question safety management audit. Audits were completed during an interview with the business owner or manager. Most measures of safety management improved with an increasing number of employees. This trend was particularly strong for lockout/tagout. However, size was only significant for businesses without a safety committee. Establishments with a safety committee scored higher (55% vs. 36%) on the safety management audit compared with those lacking a committee (P < 0.0001). Critical safety management programs were frequently absent. A safety committee appears to be a more important factor than business size in accounting for differences in outcome measures. © 2015 The Authors. American Journal of Industrial Medicine Published by Wiley Periodicals, Inc.

  19. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression tests that are needed repeatedly during the development process of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a great deal of time and human resources is needed during the development period of a code, which may delay development. To reduce the cost and the human resources and to prevent wasted time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).

  20. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    International Nuclear Information System (INIS)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W.; Suh, J. S.; Cho, Y. S.; Jeong, J. J.

    2012-01-01

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression tests that are needed repeatedly during the development process of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a great deal of time and human resources is needed during the development period of a code, which may delay development. To reduce the cost and the human resources and to prevent wasted time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).

  1. EGS code system: computer programs for the Monte Carlo simulation of electromagnetic cascade showers. Version 3

    International Nuclear Information System (INIS)

    Ford, R.L.; Nelson, W.R.

    1978-06-01

    A code to simulate almost any electron–photon transport problem conceivable is described. The report begins with a lengthy historical introduction and a description of the shower generation process. Then the detailed physics of the shower processes and the methods used to simulate them are presented. Ideas of sampling theory, transport techniques, particle interactions in general, and programming details are discussed. Next, EGS calculations are compared with various experiments and other Monte Carlo results. The remainder of the report consists of user manuals for the EGS, PEGS, and TESTSR codes; options, input specifications, and typical output are included. 38 figures, 12 tables

  2. Software Quality and Security in Teachers' and Students' Codes When Learning a New Programming Language

    Directory of Open Access Journals (Sweden)

    Arnon Hershkovitz

    2015-09-01

    In recent years, schools (as well as universities) have added cyber security to their computer science curricula. This topic is still new for most of the current teachers, who would normally have a standard computer science background; therefore the teachers are trained and then teach their students what they have just learned. In order to explore differences in both populations' learning, we compared measures of software quality and security between high-school teachers and students. We collected 109 source files, written in Python by 18 teachers and 31 students, and engineered 32 features based on common standards for software quality (PEP 8) and security (derived from the CERT Secure Coding Standards). We use a multi-view, data-driven approach, by (a) using hierarchical clustering to bottom-up partition the population into groups based on their code-related features and (b) building a decision tree model that predicts whether a student or a teacher wrote a given code (resulting in a LOOCV kappa of 0.751). Overall, our findings suggest that the teachers' codes have a better quality than the students' – with a sub-group of the teachers, mostly males, demonstrating better coding than their peers and the students – and that the students' codes are slightly better secured than the teachers' codes (although both populations show very low security levels). The findings imply that teachers might benefit from their prior knowledge and experience, but also emphasize the lack of continuous involvement of some of the teachers with code-writing. Therefore, the findings shed light on computer science teachers as lifelong learners. Findings also highlight the difference between quality and security in today's programming paradigms. Implications of these findings are discussed.

  3. A program for undergraduate research into the mechanisms of sensory coding and memory decay

    Energy Technology Data Exchange (ETDEWEB)

    Calin-Jageman, R J

    2010-09-28

    This is the final technical report for this DOE project, entitled "A program for undergraduate research into the mechanisms of sensory coding and memory decay". The report summarizes progress on the three research aims: 1) to identify physiological and genetic correlates of long-term habituation, 2) to understand mechanisms of olfactory coding, and 3) to foster a world-class undergraduate neuroscience program. Progress on the first aim has enabled comparison of learning-regulated transcripts across closely related learning paradigms and species, and results suggest that only a small core of transcripts serve truly general roles in long-term memory. Progress on the second aim has enabled testing of several mutant phenotypes for olfactory behaviors, and results show that responses are not fully consistent with the combinatorial coding hypothesis. Finally, 14 undergraduate students participated in this research, the neuroscience program attracted extramural funding, and we completed a successful summer program to enhance transitions for community-college students into 4-year colleges to pursue STEM fields.

  4. Annual coded wire tag program (Washington) missing production groups: annual report for 1997; ANNUAL

    International Nuclear Information System (INIS)

    Byrne, J.; Fuss, H.; Ashbrook, C.

    1998-01-01

    The Bonneville Power Administration (BPA) funds the ''Annual Coded Wire Tag Program - Missing Production Groups for Columbia River Hatcheries'' project. The Washington Department of Fish and Wildlife (WDFW), Oregon Department of Fish and Wildlife (ODFW) and the United States Fish and Wildlife Service (USFWS) all operate salmon and steelhead rearing programs in the Columbia River basin. The intent of the funding is to coded-wire tag at least one production group of each species at each Columbia Basin hatchery to provide a holistic assessment of survival and catch distribution over time and to meet various measures of the Northwest Power Planning Council's (NWPPC) Fish and Wildlife Program. The WDFW project has three main objectives: (1) coded-wire tag at least one production group of each species at each Columbia Basin hatchery to enable evaluation of survival and catch distribution over time, (2) recover coded-wire tags from the snouts of fish tagged under objective 1 and estimate survival, contribution, and stray rates for each group, and (3) report the findings under objective 2 for all broods of chinook and coho released from WDFW Columbia Basin hatcheries. Objective 1 for FY-97 was met with few modifications to the original FY-97 proposal. Under Objective 2, snouts containing coded-wire tags that were recovered during FY-97 were decoded. Under Objective 3, survival, contribution and stray rate estimates for the 1991-96 broods of chinook and 1993-96 broods of coho have not been made because recovery data for 1996-97 fisheries and escapement are preliminary. This report summarizes recovery information through 1995

  5. Construction of Fixed Rate Non-Binary WOM Codes Based on Integer Programming

    Science.gov (United States)

    Fujino, Yoju; Wadayama, Tadashi

    In this paper, we propose a construction of non-binary WOM (Write-Once-Memory) codes for WOM storages such as flash memories. The WOM codes discussed in this paper are fixed rate WOM codes where messages in a fixed alphabet of size $M$ can be sequentially written in the WOM storage at least $t^*$ times. In this paper, a WOM storage is modeled by a state transition graph. The proposed construction has the following two features. First, it includes a systematic method to determine the encoding regions in the state transition graph. Second, the proposed construction includes a labeling method for states by using integer programming. Several novel WOM codes for $q$-level flash memories with 2 cells are constructed by the proposed construction. They achieve worst-case numbers of writes $t^*$ that meet the known upper bound in many cases. In addition, we constructed fixed rate non-binary WOM codes with the capability to reduce ICI (inter-cell interference) of flash cells. One of the advantages of the proposed construction is its flexibility. It can be applied to various storage devices, to various dimensions (i.e., numbers of cells), and to various kinds of additional constraints.

  6. DOG -II input generator program for DOT3.5 code

    International Nuclear Information System (INIS)

    Hayashi, Katsumi; Handa, Hiroyuki; Yamada, Koubun; Kamogawa, Susumu; Takatsu, Hideyuki; Koizumi, Kouichi; Seki, Yasushi

    1992-01-01

    DOT3.5 is widely used for radiation transport analysis of fission reactors, fusion experimental facilities and particle accelerators. We developed an input generator program for the DOT3.5 code with the aim of preparing input data efficiently. The former program, DOG, was developed and used internally at Hitachi Engineering Company. In this new version, DOG-II, the limitation regarding R-Θ geometry has been removed. All the input data are created interactively in front of a color display, without consulting the DOT3.5 manual. The geometry-related input is also created easily, without calculation of precise curved mesh points. By using DOG-II, reliable input data for the DOT3.5 code are obtained easily and quickly.

  7. Computer Program of SIE ASME-NH (Revision 1.0) Code

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Gyeong Hoi; Lee, J. H

    2008-01-15

    In this report, the SIE ASME-NH code (Structural Integrity Evaluations by ASME-NH), Revision 1.0, which is a computerized implementation of the ASME Pressure Vessels and Piping Code Section III Subsection NH rules, is developed for application to next-generation reactor designs subject to elevated temperature operation above 500 °C and design lifetimes of more than 30 years, and the user's manual for this program is described in detail.

  8. Computer Program of SIE ASME-NH (Revision 1.0) Code

    International Nuclear Information System (INIS)

    Koo, Gyeong Hoi; Lee, J. H.

    2008-01-01

    In this report, the SIE ASME-NH code (Structural Integrity Evaluations by ASME-NH), Revision 1.0, which is a computerized implementation of the ASME Pressure Vessels and Piping Code Section III Subsection NH rules, is developed for application to next-generation reactor designs subject to elevated temperature operation above 500 °C and design lifetimes of more than 30 years, and the user's manual for this program is described in detail.

  9. Development of a model of machine hand eye coordination and program specifications for a topological machine vision system

    Science.gov (United States)

    1972-01-01

    A unified approach to computer vision and manipulation is developed which is called choreographic vision. In the model, objects to be viewed by a projected robot in the Viking missions to Mars are seen as objects to be manipulated within choreographic contexts controlled by a multimoded remote, supervisory control system on Earth. A new theory of context relations is introduced as a basis for choreographic programming languages. A topological vision model is developed for recognizing objects by shape and contour. This model is integrated with a projected vision system consisting of a multiaperture image dissector TV camera and a ranging laser system. System program specifications integrate eye-hand coordination and topological vision functions and an aerospace multiprocessor implementation is described.

  10. Responding to the Effects of Extreme Heat: Baltimore City's Code Red Program.

    Science.gov (United States)

    Martin, Jennifer L

    2016-01-01

    Heat response plans are becoming increasingly more common as US cities prepare for heat waves and other effects of climate change. Standard elements of heat response plans exist, but plans vary depending on geographic location and distribution of vulnerable populations. Because heat events vary over time and affect populations differently based on vulnerability, it is difficult to compare heat response plans and evaluate responses to heat events. This article provides an overview of the Baltimore City heat response plan, the Code Red program, and discusses the city's response to the 2012 Ohio Valley/Mid Atlantic Derecho, a complex heat event. Challenges with and strategies for evaluating the program are reviewed and shared.

  11. Cracking the code: a decode strategy for the international business machines punch cards of Korean war soldiers.

    Science.gov (United States)

    Mitsunaga, Erin M

    2006-05-01

    During the Korean War, International Business Machines (IBM) punch cards were created for every individual involved in military combat. Each card contained all pertinent personal information about the individual and was utilized to keep track of all soldiers involved. However, at present, all of the information known about these punch cards reveals only their format and their significance; there is little to no information on how these cards were created or how to interpret the information contained without the aid of the computer system used during the war. Today, it is believed there is no one available to explain this computerized system, nor do the original computers exist. This decode strategy is the result of an attempt to decipher the information on these cards through the use of all available medical and dental records for each individual examined. By cross-referencing the relevant personal information with the known format of the cards, a basic guess-and-check method was utilized. After examining hundreds of IBM punch cards, however, it has become clear that the punch card method of recording information was not infallible. In some cases, there are gaps of information on cards where there are data recorded on personal records; in others, information is punched incorrectly onto the cards, perhaps as the result of a transcription error. Taken all together, it is clear that the information contained on each individual's card should be taken solely as another form of personal documentation.

  12. Development of a computer program to support an efficient non-regression test of a thermal-hydraulic system code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jun Yeob; Jeong, Jae Jun [School of Mechanical Engineering, Pusan National University, Busan (Korea, Republic of); Suh, Jae Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Kim, Kyung Doo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    During the development process of a thermal-hydraulic system code, a non-regression test (NRT) must be performed repeatedly in order to prevent software regression. The NRT process, however, is time-consuming and labor-intensive; thus, automation of this process is an ideal solution. In this study, we have developed a program to support an efficient NRT for the SPACE code and demonstrated its usability, which results in a high degree of efficiency for code development. The program was developed using Visual Basic for Applications and designed so that it can be easily customized for the NRT of other computer codes.
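
    The authors' tool is written in VBA for Excel; as a language-neutral illustration of the core check, the sketch below compares a candidate output file against a stored reference within a numerical tolerance. The file format (one numeric value per line) and the file names are assumptions for the example, not the SPACE output format.

```python
# Minimal non-regression check: compare new output against a stored reference.

def read_values(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def non_regression_test(reference_path, candidate_path, rel_tol=1e-6):
    ref = read_values(reference_path)
    new = read_values(candidate_path)
    if len(ref) != len(new):
        return False, "different number of output values"
    for i, (r, n) in enumerate(zip(ref, new)):
        if abs(n - r) > rel_tol * max(abs(r), 1e-30):
            return False, f"mismatch at value {i}: {r} vs {n}"
    return True, "no regression detected"

# Usage (hypothetical file names):
# ok, message = non_regression_test("space_reference.out", "space_candidate.out")
```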

  13. Development and application of the PCRELAP5 - Data Calculation Program for RELAP 5 Code

    International Nuclear Information System (INIS)

    Silvestre, Larissa J.B.; Sabundjian, Gaianê

    2017-01-01

    Nuclear accidents in the world have led to the establishment of rigorous criteria and requirements for nuclear power plant operation by the international regulatory bodies. Simulations of various accidents and transients likely to occur at any nuclear power plant, performed with specific computer programs, are required for certifying and licensing a nuclear power plant. Sophisticated computational tools have been used for this purpose, such as the Reactor Excursion and Leak Analysis Program (RELAP5), which is the most widely used code for the thermal-hydraulic analysis of accidents and transients in nuclear reactors in Brazil and worldwide. A major difficulty in simulation with the RELAP5 code is the amount of information required for the simulation of thermal-hydraulic accidents or transients. Thus, to support the performance of those calculations and the preparation of RELAP5 input data, a friendly mathematical preprocessor was designed. Visual Basic for Applications (VBA) for Microsoft Excel proved to be an effective tool to perform a number of tasks in the development of the program. In order to meet the needs of RELAP5 users, the RELAP5 Calculation Program (Programa de Cálculo do RELAP5 – PCRELAP5) was designed. The components of the code were codified; all input cards, including the optional cards of each one, have been programmed. An English version of PCRELAP5 was provided. Furthermore, a friendly design was developed in order to minimize the time of preparation of input data and the errors committed by users. The final version of this preprocessor was successfully applied to the Safety Injection System (SIS) of Angra-2. (author)

  14. Development and application of the PCRELAP5 - Data Calculation Program for RELAP 5 Code

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre, Larissa J.B.; Sabundjian, Gaianê, E-mail: larissajbs@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Nuclear accidents in the world have led to the establishment of rigorous criteria and requirements for nuclear power plant operation by the international regulatory bodies. Simulations of various accidents and transients likely to occur at any nuclear power plant, performed with specific computer programs, are required for certifying and licensing a nuclear power plant. Sophisticated computational tools have been used for this purpose, such as the Reactor Excursion and Leak Analysis Program (RELAP5), which is the most widely used code for the thermal-hydraulic analysis of accidents and transients in nuclear reactors in Brazil and worldwide. A major difficulty in simulation with the RELAP5 code is the amount of information required for the simulation of thermal-hydraulic accidents or transients. Thus, to support the performance of those calculations and the preparation of RELAP5 input data, a friendly mathematical preprocessor was designed. Visual Basic for Applications (VBA) for Microsoft Excel proved to be an effective tool to perform a number of tasks in the development of the program. In order to meet the needs of RELAP5 users, the RELAP5 Calculation Program (Programa de Cálculo do RELAP5 – PCRELAP5) was designed. The components of the code were codified; all input cards, including the optional cards of each one, have been programmed. An English version of PCRELAP5 was provided. Furthermore, a friendly design was developed in order to minimize the time of preparation of input data and the errors committed by users. The final version of this preprocessor was successfully applied to the Safety Injection System (SIS) of Angra-2. (author)

  15. Combining machine learning and matching techniques to improve causal inference in program evaluation.

    Science.gov (United States)

    Linden, Ariel; Yarnold, Paul R

    2016-12-01

    Program evaluations often utilize various matching approaches to emulate the randomization process for group assignment in experimental studies. Typically, the matching strategy is implemented, and then covariate balance is assessed before estimating treatment effects. This paper introduces a novel analytic framework utilizing a machine learning algorithm called optimal discriminant analysis (ODA) for assessing covariate balance and estimating treatment effects, once the matching strategy has been implemented. This framework holds several key advantages over the conventional approach: application to any variable metric and number of groups; insensitivity to skewed data or outliers; and use of accuracy measures applicable to all prognostic analyses. Moreover, ODA accepts analytic weights, thereby extending the methodology to any study design where weights are used for covariate adjustment or more precise (differential) outcome measurement. One-to-one matching on the propensity score was used as the matching strategy. Covariate balance was assessed using standardized difference in means (conventional approach) and measures of classification accuracy (ODA). Treatment effects were estimated using ordinary least squares regression and ODA. Using empirical data, ODA produced results highly consistent with those obtained via the conventional methodology for assessing covariate balance and estimating treatment effects. When ODA is combined with matching techniques within a treatment effects framework, the results are consistent with conventional approaches. However, given that it provides additional dimensions and robustness to the analysis versus what can currently be achieved using conventional approaches, ODA offers an appealing alternative. © 2016 John Wiley & Sons, Ltd.
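
    The conventional half of this workflow (propensity-score estimation, one-to-one matching, and an OLS effect estimate) can be sketched as follows on synthetic data; the ODA step itself is not reproduced here, and the data-generating process is invented for illustration.

```python
# Propensity-score 1:1 matching and an OLS treatment-effect estimate on toy data.
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 3))                         # covariates
treat = (rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)
y = 2.0 * treat + X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=n)

# 1. Estimate propensity scores.
ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]

# 2. One-to-one nearest-neighbor matching on the propensity score (without replacement).
treated = np.flatnonzero(treat == 1)
controls = list(np.flatnonzero(treat == 0))
pairs = []
for t in treated:
    if not controls:
        break
    j = min(controls, key=lambda c: abs(ps[c] - ps[t]))
    pairs.append((t, j))
    controls.remove(j)

# 3. Estimate the treatment effect on the matched sample with OLS.
idx = np.array([i for pair in pairs for i in pair])
effect = LinearRegression().fit(treat[idx].reshape(-1, 1), y[idx]).coef_[0]
print(effect)    # roughly recovers the simulated effect of 2.0
```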

  16. A program system for ab initio MO calculations on vector and parallel processing machines. Pt. 1

    International Nuclear Information System (INIS)

    Ernenwein, R.; Rohmer, M.M.; Benard, M.

    1990-01-01

    We present a program system for ab initio molecular orbital calculations on vector and parallel computers. The present article is devoted to the computation of one- and two-electron integrals over contracted Gaussian basis sets involving s-, p-, d- and f-type functions. The McMurchie and Davidson (MMD) algorithm has been implemented and parallelized by distributing over a limited number of logical tasks the calculation of the 55 relevant classes of integrals. All sections of the MMD algorithm have been efficiently vectorized, leading to a scalar/vector ratio of 5.8. Different algorithms are proposed and compared for an optimal vectorization of the contraction of the 'intermediate integrals' generated by the MMD formalism. Advantage is taken of the dynamic storage allocation for tuning the length of the vector loops (i.e. the size of the vectorization buffer) as a function of (i) the total memory available for the job, (ii) the number of logical tasks defined by the user (≤13), and (iii) the storage requested by each specific class of integrals. Test calculations carried out on a CRAY-2 computer show that the average number of finite integrals computed over a (s, p, d, f) CGTO basis set is about 1180000 per second and per processor. The combination of vectorization and parallelism on this 4-processor machine reduces the CPU time by a factor larger than 20 with respect to the scalar and sequential performance. (orig.)

  17. Man-machine interface systems and operator training program for ABWR in Japan

    International Nuclear Information System (INIS)

    Kunito, Susumu

    2004-01-01

    The Tokyo Electric Power Company (TEPCO) has developed a new main control room design for the Advanced Boiling Water Reactor (ABWR) to improve the man-machine interface. A new configuration of panels and enhanced automation are some of the features of the ABWR-type main control room design. Various technologies, such as Cathode Ray Tubes (CRTs) and Flat Displays (FDs) with touch-sensitive operation, have contributed to the development of the ABWR-type control room design. This design will first be applied to Kashiwazaki-Kariwa Nuclear Power Station unit 6 (K-6). To train the operators sufficiently, TEPCO reviewed the operator training program. Compared with the conventional training, new training menus will be added and the training of ABWR operators will be started 6 months earlier. An ABWR simulator is under construction and training using this simulator is scheduled to start in August 1994, which is 18 months before fuel loading of K-6. We are reviewing malfunction modes on the simulator. (author)

  18. Development of Client-Server Application by Using UDP Socket Programming for Remotely Monitoring CNC Machine Environment in Fixture Process

    Directory of Open Access Journals (Sweden)

    Darmawan Darmawan

    2016-08-01

    The use of computer technology in manufacturing industries can improve manufacturing flexibility significantly, especially in manufacturing processes; many software applications have been utilized to improve machining performance. However, none of them has discussed the ability to perform direct machining. In this paper, an integrated system for remote operation and monitoring of Computer Numerical Control (CNC) machines is put into consideration. The integrated system includes computerization, network technology, and an improved holding mechanism. The work proposed by this research is mainly the software development for such an integrated system. It uses Java three-dimensional (3D) programming and the Virtual Reality Modeling Language (VRML) at the client side for visualization of the machining environment. This research is aimed at developing a control system to remotely operate and monitor a self-reconfiguring fixture mechanism of a CNC milling machine through an internet connection, integrating a Personal Computer (PC)-based CNC controller, a server side, a client side and the CNC milling machine. The performance of the developed system was evaluated by testing with one common protocol, in particular the User Datagram Protocol (UDP). Using UDP, the developed system requires 3.9 seconds to complete the close clamping, less than 1 second to release the clamping, and it can deliver 463 kilobytes.
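
    The UDP exchange underlying such a client-server setup can be illustrated with a few lines of socket code. The port, the message strings, and the use of Python (rather than the Java client described in the paper) are assumptions made for the sketch.

```python
# Minimal UDP client/server exchange (fixture command and acknowledgement).
import socket, threading

HOST, PORT = "127.0.0.1", 9999
ready = threading.Event()

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind((HOST, PORT))
        ready.set()                               # tell the client we are listening
        data, addr = s.recvfrom(1024)             # e.g. b"CLAMP_CLOSE"
        s.sendto(b"ACK " + data, addr)            # report back to the monitoring client

def client():
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(2.0)
        s.sendto(b"CLAMP_CLOSE", (HOST, PORT))
        reply, _ = s.recvfrom(1024)
        print(reply.decode())                     # "ACK CLAMP_CLOSE"

t = threading.Thread(target=server)
t.start()
ready.wait()
client()
t.join()
```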

  19. THE ROLE OF REVIEW MATERIAL IN CONTINUOUS PROGRAMMING WITH TEACHING MACHINES.

    Science.gov (United States)

    FERSTER, C.B.

    STUDENTS WERE PRESENTED 61 LESSONS BY MEANS OF SEMIAUTOMATIC TEACHING MACHINES. LESSONS WERE ARRANGED SO THAT EACH PARTICIPATING STUDENT STUDIED PART OF THE COURSE MATERIAL WITH A SINGLE REPETITION AND PART WITHOUT REPETITION. DATA WERE OBTAINED FROM TWO TESTS SHOWING TEACHING-MACHINE RESULTS AND ONE FINAL COURSE EXAMINATION. NO SIGNIFICANT…

  20. Model-based machine learning.

    Science.gov (United States)

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
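
    As a toy illustration of the model-based idea (write down the model explicitly, then let a generic inference step do the work rather than picking a named off-the-shelf algorithm), the snippet below specifies a Beta-Binomial model and computes its exact posterior. It is plain Python and stands in only conceptually for a probabilistic-programming system such as Infer.NET; the click-rate scenario is an invented example.

```python
# Model: each user clicks with unknown probability p ~ Beta(alpha, beta);
# observed: k clicks out of n impressions. Posterior is Beta(alpha + k, beta + n - k).

def posterior_mean_click_rate(k, n, alpha=1.0, beta=1.0):
    """Exact conjugate update for the Beta-Binomial model."""
    return (alpha + k) / (alpha + beta + n)

print(posterior_mean_click_rate(k=7, n=50))   # ~0.154
```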

  1. Computer Aided Simulation Machining Programming In 5-Axis Nc Milling Of Impeller Leaf

    Science.gov (United States)

    Huran, Liu

    At present, CAD/CAM (computer-aided design and manufacturing) has found wider and wider application in the mechanical industry. For complex surfaces, traditional machine tools can no longer satisfy the requirements of such tasks; only with the help of CAD/CAM can those requirements be fulfilled. The machining of the vane surface of the impeller leaf has been considered one of the hardest challenges. Because of their complex shape, a 5-axis CNC machine tool is needed for the machining of such parts. The material is hard to cut and the requirements for surface finish and clearance are very high, so the manufacturing quality of the impeller leaf represents the level of 5-axis machining. This paper opens a new approach to machining such complicated surfaces, based on a more rigorous mathematical foundation, and the theory presented here is more systematic. Owing to the lack of theoretical guidance, earlier work had to rely on repeated machining trials; this situation is changed here. The movement of the cutter determined by this method is definite, the residual is the smallest, and the number of tool passes is the fewest. The criterion is simple and the calculation is easy.

  2. Evaluation of cleaning and disinfection performance of automatic washer disinfectors machines in programs presenting different cycle times and temperatures

    OpenAIRE

    Bergo,Maria do Carmo Noronha Cominato

    2006-01-01

    Thermal washer-disinfectors represent a technology that brought about great advantages, such as the establishment of protocols and standard operating procedures and a reduction in occupational risk of a biological and environmental nature. The efficacy of the cleaning and disinfection obtained by automatic washer-disinfector machines running programs with different times and temperatures, as determined by the different official agencies, was validated according to recommendations from ISO Standards 15883-

  3. MINI-TRAC code: a driver program for assessment of constitutive equations of two-fluid model

    International Nuclear Information System (INIS)

    Akimoto, Hajime; Abe, Yutaka; Ohnuki, Akira; Murao, Yoshio

    1991-05-01

    The MINI-TRAC code, a driver program for the assessment of constitutive equations of the two-fluid model, has been developed so that assessment and improvement of these constitutive equations can be performed widely and efficiently. The MINI-TRAC code uses one-dimensional conservation equations for mass, momentum and energy based on the two-fluid model. The code can work on a personal computer because it can be operated with a core memory size of less than 640 KB. The MINI-TRAC code includes the constitutive equations of the TRAC-PF1/MOD1, TRAC-BF1 and RELAP5/MOD2 codes. The code is modularized so that one can easily change constitutive equations to perform a test calculation. This report is a manual of the MINI-TRAC code. The basic equations, numerics, and constitutive equations included in the MINI-TRAC code are described. User information such as the input description is presented, and the program structure and the contents of the main variables are also covered in this report. (author)

  4. Aspects regarding the aided programming of the electroerosion machine ROBOFIL 310

    OpenAIRE

    Ioan Mocian; Răzvan Cazacu

    2011-01-01

    This paper presents the solutions to some practical issues regarding the design of technologies with the wire electroerosion numerical command machine ROBOFIL 310, produced by the Swiss manufacturer Charmilles. As part of the study, an AutoCAD application was designed using Visual Basic and the .NET platform, aimed at helping the designer identify the minimum radius of a contour before sending it to the machine.

  5. Aspects regarding the aided programming of the electroerosion machine ROBOFIL 310

    Directory of Open Access Journals (Sweden)

    Ioan Mocian

    2011-12-01

    This paper presents the solutions to some practical issues regarding the design of technologies with the wire electroerosion numerical command machine ROBOFIL 310, produced by the Swiss manufacturer Charmilles. As part of the study, an AutoCAD application was designed using Visual Basic and the .NET platform, aimed at helping the designer identify the minimum radius of a contour before sending it to the machine.

  6. Experimental program based on a High Beta Q Machine. Final report, 1 May 1978-30 September 1980

    International Nuclear Information System (INIS)

    Ribe, F.L.

    1980-07-01

    This report summarizes work done in designing and constructing the High Beta Q Machine from the inception of the work in May 1978 until the present time. It is a 3-m long, low-compression theta pinch with a 22-cm-diameter segmented compression coil with a minimum axial periodicity length of 10 cm. The capability of driving the machine as a simple, low-density theta pinch, and also of independently applying periodic magnetic fields before or after formation of the plasma column, gives the device considerable flexibility. Reported here are the construction and testing of the machine, the development of its diagnostics, and initial measurements of the plasma at early times in the duration of the crowbarred magnetic field. The experimental effort has been paralleled by theoretical work to model the diffuse-profile, collisionless plasma in its response to the periodic RF magnetic fields. The model chosen is the Freidberg-Pearlstein Vlasov-fluid model, which provides an MHD-like description while accounting for ion kinetic effects over diffuse equilibrium profiles. A computer code has been developed to accurately calculate the resistive response of the plasma column, giving the power absorption by ion Landau damping and, more recently, ion-cyclotron damping.

  7. Battelle integrity of nuclear piping program. Summary of results and implications for codes/standards

    International Nuclear Information System (INIS)

    Miura, Naoki

    2005-01-01

    The BINP (Battelle Integrity of Nuclear Piping) program was proposed by Battelle to elaborate pipe fracture evaluation methods and to improve LBB and in-service flaw evaluation criteria. The program was conducted from October 1998 to September 2003. In Japan, CRIEPI participated in the program on behalf of electric utilities and fabricators in order to keep abreast of the technical background for a possible future revision of LBB and in-service flaw evaluation standards and to identify the issues that needed to be reflected in current domestic standards. A series of results obtained from the program has been used for the new LBB Regulatory Guide Program of the USNRC and for the proposal of revised in-service flaw evaluation criteria to the ASME Code Committee. The results were assessed as to whether they had implications for existing or future domestic standards. As a result, the impact of many of the issues that had been suspected of adversely affecting LBB approval or the allowable flaw sizes in flaw evaluation criteria was found to be relatively minor under actual plant conditions. At the same time, some issues that need to be resolved in order to develop advanced and rational standards in the future were identified. (author)

  8. IPAD applications to the design, analysis, and/or machining of aerospace structures. [Integrated Program for Aerospace-vehicle Design

    Science.gov (United States)

    Blackburn, C. L.; Dovi, A. R.; Kurtze, W. L.; Storaasli, O. O.

    1981-01-01

    A computer software system for the processing and integration of engineering data and programs, called IPAD (Integrated Programs for Aerospace-Vehicle Design), is described. The ability of the system to relieve the engineer of the mundane task of input data preparation is demonstrated by the application of a prototype system to the design, analysis, and/or machining of three simple structures. Future work to further enhance the system's automated data handling and its ability to handle larger and more varied design problems is also presented.

  9. Coding a Weather Model: DOE-FIU Science & Technology Workforce Development Program.

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, Jon David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    DOE Fellow, Andres Cremisini, completed a 10-week internship with Sandia National Laboratories (SNL) in Albuquerque, New Mexico. Under the management of Kristopher Klingler and the mentorship of Jon Bradley, he was tasked with conceiving and coding a realistic weather model for use in physical security applications. The objective was to make a weather model that could use real data to accurately predict wind and precipitation conditions at any location of interest on the globe at any user-determined time. The intern received guidance on software design, the C++ programming language and clear communication of project goals and ongoing progress. In addition, Mr. Cremisini was given license to structure the program however he best saw fit, an experience that will benefit ongoing research endeavors.

  10. Development of a Wrapper Object, TRelap, for RELAP5 Code for Use in Object Oriented Programs

    International Nuclear Information System (INIS)

    Lee, Young Jin

    2008-01-01

    The TRelap object class has been developed to enable object-oriented programming techniques to be used where the functionality of the RELAP5 thermal hydraulic system analysis code is needed. TRelap is an object front-end for the Dynamic Link Library (DLL) manifestation of the RELAP5 code, Relap5.dll. In making Relap5.dll, the topmost structure of RELAP5 was altered to allow external calling procedures to control the code and access its memory. The alteration was performed in such a way that the entire 'fa' and 'ftb' memory spaces are accessible to the calling procedure. Thus, any variable contained within the 'fa' array, such as the parameters for the components, volumes, junctions, and heat structures, can be accessed by the external calling procedure through TRelap. Various methods and properties to control the RELAP5 calculation and to access and manipulate the variables are built into TRelap to enable easy manipulation. As a verification effort, a simple program was written to demonstrate the capability of TRelap.
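
    The wrapper-object idea can be sketched in a language-neutral way as follows; the snippet uses Python's ctypes purely for illustration, and the exported routine names (advance_step, get_fa) and the variable index are hypothetical stand-ins, not the actual Relap5.dll interface described in the report.

      import ctypes

      class ThWrapper:
          """Object front-end over a thermal-hydraulic solver exposed as a DLL."""

          def __init__(self, dll_path):
              self._lib = ctypes.CDLL(dll_path)        # load e.g. Relap5.dll

          def advance(self, dt):
              # hypothetical exported routine advancing the transient by dt seconds
              self._lib.advance_step(ctypes.c_double(dt))

          def read_fa(self, index):
              # hypothetical accessor into the solver's 'fa' work array
              value = ctypes.c_double()
              self._lib.get_fa(ctypes.c_int(index), ctypes.byref(value))
              return value.value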

  11. Consolidated fuel-reprocessing program: man/machine interface development for the REMOTEX concept

    International Nuclear Information System (INIS)

    Garin, J.; Clarke, M.M.

    1981-01-01

    This paper describes ongoing research at ORNL to develop a man/machine interface system that can be used to remotely control a system composed of a transporter base and a force-reflecting, servo-controlled manipulator. A unique feature of the concept is the incorporation of totally remote operation. Thus, a major objective is the requirement that an operator have a sense of presence in the remote environment. Man/machine interface requirements for this totally remote operation remain to be developed. Therefore, a simulator is being built to optimize these requirements, and the developments are discussed.

  12. Fabrication Process for Machined and Shrink-Fitted Impactor-Type Liners for the Los Alamos HEDP Program

    Science.gov (United States)

    Randolph, B.

    2004-11-01

    Composite liners have been fabricated for the Los Alamos liner-driven High Energy Density Physics (HEDP) experiments using impactors formed by physical vapor deposition, and by machining and shrink fitting. Chemical vapor deposition has been proposed for some ATLAS liner applications. This paper describes the processes used to fabricate machined and shrink-fitted impactors; these processes have been used for copper impactors in 1100 aluminum liners and for 6061 T-6 aluminum impactors in 1100 aluminum liners. The most successful processes have been largely empirically developed and rely upon a combination of shrink-fitting and light press fitting. The processes used to date will be described along with some considerations for future composite liners for the HEDP Program.

  13. Thermochemistry in BWR. An overview of applications of program codes and databases

    International Nuclear Information System (INIS)

    Hermansson, H-P.; Becker, R.

    2010-01-01

    The Swedish work on the thermodynamics of metal-water systems relevant to BWR conditions has been ongoing since the 1970s, and at present a compilation and adaptation of codes and thermodynamic databases is in progress. In the previous work, basic thermodynamic data were compiled for parts of the system Fe-Cr-Ni-Co-Zn-S-H2O at 25-300 °C. Since some thermodynamic information necessary for temperature extrapolation of data up to 300 °C was not published in the earlier works, these data have now been partially recalculated. This applies especially to the parameters of the HKF model, which are used to extrapolate the thermodynamic data for ionic and neutral aqueous species from 25 °C to BWR temperatures. Using the completed data, e.g. the change in standard Gibbs energy (ΔG°) and the equilibrium constant (log K) can be calculated for further applications at BWR/LWR conditions. In addition, a computer program is currently being developed at Studsvik for the calculation of equilibrium conductivity in high temperature water. The program is intended for PWR applications, but can also be applied to the BWR environment. Data as described above will be added to the database of this program. It will be relatively easy to further develop the program, e.g. to calculate Pourbaix diagrams, and these graphs could then be produced at any temperature. This means that there will be no limitation to the temperatures and total concentrations (usually 10^-6 to 10^-8 mol/kg) as reported in earlier work. It is also easy to add a function generating ΔG° and log K values at selected temperatures. One of the fundamentals of this work was also to review and collect publicly available thermodynamic program codes and databases of relevance for BWR conditions found in open sources. The focus has been on finding existing compilations and reviews, and some 40 codes and 15 databases were found. Codes and databases are often integrated, and such a package is often developed for
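
    For reference, the standard relation behind such calculations is log10 K = -ΔG°/(ln(10)·R·T); the short sketch below applies it with a purely illustrative ΔG° value (not taken from the report).

      import math

      R = 8.314462  # gas constant, J/(mol K)

      def log10_K(delta_G0_J_per_mol, T_kelvin):
          """Base-10 log of the equilibrium constant from the standard Gibbs energy change."""
          return -delta_G0_J_per_mol / (math.log(10.0) * R * T_kelvin)

      # hypothetical reaction with dG0 = -40 kJ/mol, evaluated at 25 degC and at 300 degC
      for T in (298.15, 573.15):
          print(T, log10_K(-40.0e3, T))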

  14. The Milling Assistant, Case-Based Reasoning, and machining strategy: A report on the development of automated numerical control programming systems at New Mexico State University

    Energy Technology Data Exchange (ETDEWEB)

    Burd, W. [Sandia National Labs., Albuquerque, NM (United States); Culler, D.; Eskridge, T.; Cox, L.; Slater, T. [New Mexico State Univ., Las Cruces, NM (United States)

    1993-08-01

    The Milling Assistant (MA) programming system demonstrates the automated development of tool paths for Numerical Control (NC) machine tools. By integrating a Case-Based Reasoning decision processor with commercial CAD/CAM software, intelligent tool path files for milled and point-to-point features can be created. The operational system is capable of reducing the time required to program a variety of parts and improving product quality by collecting and utilizing 'best of practice' machining strategies.

  15. Physics 30 Program Machine-Scorable Open-Ended Questions: Unit 2: Electric and Magnetic Forces. Diploma Examinations Program.

    Science.gov (United States)

    Alberta Dept. of Education, Edmonton.

    This document outlines the use of machine-scorable open-ended questions for the evaluation of Physics 30 in Alberta. Contents include: (1) an introduction to the questions; (2) sample instruction sheet; (3) fifteen sample items; (4) item information including the key, difficulty, and source of each item; (5) solutions to items having multiple…

  16. Tourist Affiliate Program while Using Online Booking System with Possibility of Entering B2B Code

    Directory of Open Access Journals (Sweden)

    Slivar Iva

    2008-01-01

    Affiliate marketing programs are one of the most powerful tools for online marketing, since the merchant presenting a product or a service decides on the commissioning model and the commission is granted only if the desired results have been reached. Affiliate marketing is as much offline-based as tourism itself, and it relies on the commission that tourist companies pay to their partners (affiliates) who bring new guests. This paper presents the basics of how online affiliate programs work, the benefits they bring, and the steps for their implementation. It explains in detail how to establish an affiliate program for dynamic web pages which use online booking system platforms that offer the possibility of entering a B2B code. Special attention is paid to SEO (Search Engine Optimisation). The paper also presents the results of a survey of Croatian hotel web pages regarding their implementation of online booking systems and affiliate programs. Having in mind the insufficient exploitation of online potential, the aim of the paper is to stress the need for an effective method of monitoring changes and updates in the online world, as well as for implementing new promotional possibilities, all aimed at increasing sales. The goal of the paper is to explore the advantages and disadvantages of the affiliate program as a new sales channel and to promote its implementation in one of the biggest Croatian hotel companies, Maistra d.d. Rovinj. Along with methods of data acquisition and different techniques of creative thinking, the following scientific research methods were also used: statistical, historical, descriptive, comparison, interview, analysis and synthesis, induction and deduction.

  17. Standardization of computer programs - basis of the Czechoslovak library of nuclear codes

    International Nuclear Information System (INIS)

    Gregor, M.

    1987-01-01

    A standardized form of computer code documentation has been established in the CSSR in the field of reactor safety. The structure and content of the documentation are described, and the codes already subject to this process are mentioned. The aim is the formation of a Czechoslovak nuclear code library and easier review of safety reports containing results from standardized codes.

  18. Ces-VP: consultation expert system for vector programming of nuclear codes

    International Nuclear Information System (INIS)

    Fujisaki, Masahide; Makino, Mitsuhiro; Ishiguro, Misako

    1988-08-01

    Ces-VP is a prototype rule-based expert system for consultation on vector programming, based on the knowledge gained from the vectorization of nuclear codes at JAERI over the last ten years. Experts in vectorization can restructure nuclear codes for high performance on vector processors because they have the know-how to choose the best technique among the many acquired from past vectorization experience. The amount of trial and error will be reduced if a beginner can easily use this expert know-how. This report first presents the contents of Ces-VP and its purpose. Then the methods for acquiring the vectorization know-how and for turning that know-how into rules are described. The outline of Ces-VP as implemented on the Fujitsu expert tool ESHELL is given. Finally, the usefulness of Ces-VP is evaluated using data gathered from practical use, and its present problems are discussed. (author)

  19. Implementation of Neutronics Analysis Code using the Features of Object Oriented Programming via Fortran90/95

    Energy Technology Data Exchange (ETDEWEB)

    Han, Tae Young; Cho, Beom Jin [KEPCO Nuclear Fuel, Daejeon (Korea, Republic of)

    2011-05-15

    The object-oriented programming (OOP) concept became firmly established during the 1990s and was successfully incorporated into Fortran 90/95. The features of OOP, such as information hiding, encapsulation, modularity and inheritance, lead to code that satisfies the three R's: reusability, reliability and readability. The major OOP concepts, however, except for the Module, are not widely used in neutronics analysis codes even when the code is written in Fortran 90/95. In this work, we show that the OOP concept can be employed to develop the neutronics analysis code ASTRA1D (Advanced Static and Transient Reactor Analyzer for 1-Dimension) via Fortran 90/95, and that this can be a more efficient and reasonable programming approach.

  20. ARDISC (Argonne Dispersion Code): computer programs to calculate the distribution of trace element migration in partially equilibrating media

    International Nuclear Information System (INIS)

    Strickert, R.; Friedman, A.M.; Fried, S.

    1979-04-01

    A computer program (ARDISC, the Argonne Dispersion Code) is described which simulates the migration of nuclides in porous media and includes first-order kinetic effects on the retention constants. The code allows for different absorption and desorption rates and solves the coupled migration equations by arithmetic reiteration. The input data needed are the absorption and desorption rates, the equilibrium surface absorption coefficients, the flow rates and volumes, and the media porosities.
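
    A toy sketch of this kind of calculation (not the ARDISC code itself): one-dimensional advective transport through a column of cells with first-order adsorption/desorption kinetics, stepped explicitly in time; every parameter value below is invented for illustration.

      import numpy as np

      n_cells, n_steps = 50, 2000
      dt       = 0.01      # time step
      velocity = 1.0       # pore-water velocity, cells per time unit
      k_ads    = 0.5       # first-order adsorption rate
      k_des    = 0.1       # first-order desorption rate

      c = np.zeros(n_cells)      # concentration in solution
      s = np.zeros(n_cells)      # concentration on the solid phase
      c[0] = 1.0                 # pulse injected into the first cell

      for _ in range(n_steps):
          # kinetic exchange between solution and solid (partial equilibration)
          transfer = (k_ads * c - k_des * s) * dt
          c -= transfer
          s += transfer
          # upwind advection of the dissolved phase
          courant = velocity * dt
          c[1:] += courant * (c[:-1] - c[1:])
          c[0]  -= courant * c[0]

      print("peak dissolved concentration is now in cell", int(np.argmax(c)))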

  1. Your Sewing Machine.

    Science.gov (United States)

    Peacock, Marion E.

    The programed instruction manual is designed to aid the student in learning the parts, uses, and operation of the sewing machine. Drawings of sewing machine parts are presented, and space is provided for the student's written responses. Following an introductory section identifying sewing machine parts, the manual deals with each part and its…

  2. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes.

    Science.gov (United States)

    Sam, Jonathan; Pierse, Michael; Al-Qahtani, Abdullah; Cheng, Adam

    2012-02-01

    To develop, implement and evaluate a simulation-based acute care curriculum in a paediatric residency program using an integrated and longitudinal approach. Curriculum framework consisting of three modular, year-specific courses and longitudinal just-in-time, in situ mock codes. Paediatric residency program at BC Children's Hospital, Vancouver, British Columbia. The three year-specific courses focused on the critical first 5 min, complex medical management and crisis resource management, respectively. The just-in-time in situ mock codes simulated the acute deterioration of an existing ward patient, prepared the actual multidisciplinary code team, and primed the surrounding crisis support systems. Each curriculum component was evaluated with surveys using a five-point Likert scale. A total of 40 resident surveys were completed after each of the modular courses, and an additional 28 surveys were completed for the overall simulation curriculum. The highest Likert scores were for hands-on skill stations, immersive simulation environment and crisis resource management teaching. Survey results also suggested that just-in-time mock codes were realistic, reinforced learning, and prepared ward teams for patient deterioration. A simulation-based acute care curriculum was successfully integrated into a paediatric residency program. It provides a model for integrating simulation-based learning into other training programs, as well as a model for any hospital that wishes to improve paediatric resuscitation outcomes using just-in-time in situ mock codes.

  3. Development of application program and building database to increase facilities for using the radiation effect assessment computer codes

    International Nuclear Information System (INIS)

    Hyun Seok Ko; Young Min Kim; Suk-Hoon Kim; Dong Hoon Shin; Chang-Sun Kang

    2005-01-01

    The current radiation effect assessment system requires skillful technique in the application of the various codes and a high level of specialized knowledge in each field. Therefore, it is in fact very difficult for radiation users who do not have sufficient specialized knowledge to assess or recognize radiation effects properly. For this reason, we have already developed five Windows-based computer codes, which together constitute the radiation effect assessment system, for fields that use radiation, including nuclear power generation. A computer program is needed so that non-specialists can easily use the five computer codes already developed. We therefore implemented an AI-based expert system that can infer the appropriate assessment approach by itself, according to the characteristics of a given problem. The expert program can guide users, search data, and allow direct inquiries to the administrator. Conceptually, considering the circumstances that a user applying the five computer codes may actually encounter, the following aspects were taken into account. First, the accessibility of the necessary concepts and data must be improved. Second, the acquisition of the relevant theory and the use of the corresponding computer code must be easy. Third, a Q and A function, needed to resolve users' questions, had previously been left out of consideration. Finally, the database must be updated continuously. To meet these needs, we developed a client program to organize the reference data, to build an access methodology (queries) for the organized data, and to provide a visual display of the retrieved data. An instruction method (an effective procedure and methodology for acquiring the theory) referring to the five computer codes was also implemented. A data structure access program (DBMS) was developed to easily keep the data up to date. For the Q and A function, a Q and A board was implemented within the client program so that users can search the contents of questions and answers. (authors)

  4. Multiple access to sterile syringes for injection drug users: vending machines, needle exchange programs and legal pharmacy sales in Marseille, France.

    Science.gov (United States)

    Moatti, J P; Vlahov, D; Feroni, I; Perrin, V; Obadia, Y

    2001-03-01

    In Marseille, southeastern France, HIV prevention programs for injection drug users (IDUs) simultaneously include access to sterile syringes through needle exchange programs (NEPs), legal pharmacy sales and, since 1996, vending machines that mechanically exchange new syringes for used ones. The purpose of this study was to compare the characteristics of IDUs according to the site where they last obtained new syringes. During 3 days in September 1997, all IDUs who obtained syringes from 32 pharmacies, four NEPs and three vending machines were offered the opportunity to complete a self-administered questionnaire on demographics, drug use characteristics and program utilization. Of 485 individuals approached, the number who completed the questionnaire was 141 in pharmacies, 114 in NEPs and 88 at vending machines (response rate = 70.7%). Compared to NEP users, vending machine users were younger and less likely to be enrolled in a methadone program or to report being HIV infected, but more likely to misuse buprenorphine. They also had lower financial resources and were less likely to be heroin injectors than both pharmacy and NEP users. Our results suggest that vending machines attract a very different group of IDUs than NEPs, and that both programs are useful adjuncts to legal pharmacy sales for covering the needs of IDUs for sterile syringes in a single city. Assessment of the effectiveness and cost-effectiveness of combining such programs for the prevention of HIV and other infectious diseases among IDUs requires further comparative research. Copyright 2001 S. Karger AG, Basel

  5. The proposed human factors engineering program plan for man-machine interface system design of the next generation NPP in Korea

    International Nuclear Information System (INIS)

    Oh, I.S.; Lee, H.C.; Seo, S.M.; Cheon, S.W.; Park, K.O.; Lee, J.W.; Sim, B.S.

    1994-01-01

    Human factors application to nuclear power plant (NPP) design, especially to man-machine interface system (MMIS) design, has become an important issue among the licensing requirements. Recently, the nuclear regulatory bodies have required evidence of the systematic application of human factors to MMIS design. The Human Factors Engineering Program Plan (HFEPP) serves as a basis and a central document for the application of human factors by the MMIS designers. This paper describes the framework of the HFEPP for the MMIS design of the next generation NPP (NG-NPP) in Korea. This framework provides an integral plan and a basis for the systematic application of human factors to MMIS design, and consists of purpose and scope, codes and standards, the human factors organization, human factors tasks, engineering control methodology, human factors documentation, and milestones. The proposed HFEPP is a top-level document that defines and describes the human factors tasks, based on each step of the MMIS design process, in terms of how, what, when and by whom they are to be performed. (author). 11 refs, 1 fig

  6. Nuclear model codes available at the Nuclear Energy Agency Computer Program Library (NEA-CPL)

    International Nuclear Information System (INIS)

    Sartori, E.; Garcia Viedma, L. de

    1976-01-01

    This paper briefly outlines the objectives of the NEA-CPL and its activities in the field of Nuclear Model Computer Codes. A short description of the computer codes available from the CPL in this field is also presented. (author)

  7. Machine protection systems

    CERN Document Server

    Macpherson, A L

    2010-01-01

    A summary of the Machine Protection System of the LHC is given, with particular attention given to the outstanding issues to be addressed, rather than the successes of the machine protection system from the 2009 run. In particular, the issues of Safe Machine Parameter system, collimation and beam cleaning, the beam dump system and abort gap cleaning, injection and dump protection, and the overall machine protection program for the upcoming run are summarised.

  8. Paracantor: A two group, two region reactor code

    Energy Technology Data Exchange (ETDEWEB)

    Stone, Stuart

    1956-07-01

    Paracantor I is a two-energy-group, two-region, time-independent reactor code which obtains a closed solution for a critical reactor assembly. The code deals with cylindrical reactors of finite length and with a radial reflector of finite thickness. It is programmed for the IBM Magnetic Drum Data-Processing Machine, Type 650. The limited memory space available does not permit a flux solution to be included in the basic Paracantor code. A supplementary code, Paracantor II, has been programmed which computes fluxes, including adjoint fluxes, from the output of Paracantor I.

  9. Simulation of linac operation using the tracking code L

    International Nuclear Information System (INIS)

    Drevlak, M.; Timm, M.; Weiland, T.

    1996-01-01

    In linear accelerators, misalignments of the machine elements can cause considerable emittance growth due to wake fields, dispersion and other effects. Hence, tight limits are imposed on machine tolerances, design parameters and methods of machine operation. In order to simulate the beam dynamics in linacs, the tracking code L has been developed. Including both single- and multi-bunch effects, the behaviour of the beam in the machine can be simulated, and adjustments of the parameters of the machine elements, up to complete correction techniques and operation procedures, can be applied. Use of the program is facilitated by a graphical user interface. In this paper we give an overview of the capabilities of this code and demonstrate its efficiency in attacking the problems associated with large linear accelerators. (author)

  10. A fast reactor transient analysis methodology for PCs: Volume 3, LTC program manual of the QuickBASIC code

    International Nuclear Information System (INIS)

    Ott, K.O.; Chung, L.

    1992-06-01

    This manual augments the detailed manual of the GW-BASIC version of the LTC code for an application in QuickBASIC. As most of the GW-BASIC coding of this program for 'LMR Transient Calculations' is compatible with QuickBASIC, this manual pertains primarily to the required changes, such as the handling of input and output. The considerable reduction in computation time achieved by this conversion is demonstrated for two sample problems, using a variety of hardware and execution options. The revised code is listed. Although the severe storage limitations of GW-BASIC no longer apply, the LOF transient path has not been completed in this QuickBASIC code. Its advantages are thus primarily the much faster running time for TOP and LOHS transients. For the fastest PC hardware (486) and execution option, the computation time is reduced by a factor of 124 compared to GW-BASIC on a 386/20.

  11. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. The program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consist of 11 files, one for the organ codes and the others for the pathology codes. The organ code is obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected entry, which are displayed simultaneously on the screen. According to the first digit of the selected organ code, the corresponding pathology code file is chosen automatically. The proper pathology code is then obtained in a fashion similar to the organ code selection. An example of an obtained ACR code is '131.3661'. This procedure is reproducible regardless of the number of fields of data. Because the program was written in 'User's Defined Function' form, decoding of the stored ACR code is achieved by this same program, and incorporation of the program into another data processing program is possible. The program has the merits of simple operation, accurate and detailed coding, and easy adaptation to other programs. Therefore, this program can be used for automation of routine work in the department of radiology.
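
    The selection logic described above can be sketched in a few lines of Python (the original program was written in FoxBASE); the table entries below are invented placeholders, and only the final '131.3661' string comes from the abstract.

      # Hypothetical organ and pathology tables; the real ACR dictionaries are not reproduced here.
      ORGAN_CODES = {"131": "example organ entry"}
      PATHOLOGY_FILES = {"1": {"3661": "example pathology entry"}}   # keyed by first digit of the organ code

      def acr_code(organ_code, pathology_code):
          """Combine an organ code and a pathology code into an ACR-style string,
          picking the pathology table from the first digit of the organ code."""
          if organ_code not in ORGAN_CODES:
              raise KeyError("unknown organ code")
          pathology_table = PATHOLOGY_FILES[organ_code[0]]
          if pathology_code not in pathology_table:
              raise KeyError("unknown pathology code")
          return f"{organ_code}.{pathology_code}"

      print(acr_code("131", "3661"))   # -> '131.3661', the example quoted in the abstract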

  12. Reactor Vessel External Cooling for Corium Retention SULTAN Experimental Program and Modelling with CATHARE Code

    International Nuclear Information System (INIS)

    Rouge, S.; Dor, I.; Geffraye, G.

    1999-01-01

    In case of a severe accident, a molten pool may form at the bottom of the lower head, and some pessimistic scenarios estimate that heat fluxes of up to 1.5 MW/m² would have to be transferred through the vessel wall. An efficient, though completely passive, removal of this heat flux over a long time is necessary to prevent total wall ablation, and a possible solution is to flood the cavity with water and establish boiling in natural convection. High heat transfer rates are expected, especially if the system design (deflector along the vessel, riser...) promotes natural water circulation, but they are unfortunately limited by the critical heat flux (CHF) phenomenon. CHF data are very scarce in the relevant range of hydraulic and geometric parameters and are clearly dependent on the system effect in natural convection. The system effect can modify both the flow velocity and the two-phase flow regimes, counter-current phenomena and static or dynamic flow instabilities. The purpose of the SULTAN experimental program was twofold: to extend the CHF data to realistic situations, and to improve the modelling of large 3D two-phase flow circuits in natural convection. The CATHARE thermal-hydraulic code is used for interpreting the data and for extrapolation to the real geometry. As a first step, a one-dimensional model is used. It is shown that some closure laws have to be improved. Reasonable predictions may be obtained but, for some test conditions, multi-dimensional effects such as recirculation appear to be dominant. Therefore the 3-dimensional module of CATHARE is also used to investigate these effects. This model qualitatively predicts the existence and development of a two-phase layer along the heated wall as well as the existence of a recirculation zone. However, modelling problems still require further development as part of a long-term program for a better prediction of multi-dimensional two-phase flows

  13. Application of CAMP code to analysis of debris coolability experiments in ALPHA program

    International Nuclear Information System (INIS)

    Maruyama, Yu; Moriyama, Kiyofumi; Park, Hyun-Sun; Yang, Yanhua; Sugimoto, Jun

    1999-01-01

    An analytical code for the thermo-fluid dynamics of a molten debris, CAMP, was applied to the analysis of the ex-vessel and in-vessel debris coolability experiments performed in the ALPHA program. The analysis of the ex-vessel debris coolability experiments, where water was added onto a layer of thermite melt, indicated that the upper surface of the melt remained molten during the period when melt eruptions followed by a mild steam explosion were observed. This may imply that coarse mixing between the melt and the overlying water could have formed if a sufficient force was generated at the interface between the two fluids. In the analysis of the in-vessel debris coolability experiments, where an aluminum oxide (Al2O3) melt was poured into a water-filled lower head experimental vessel, the temperature increase at the outer surface of the vessel was qualitatively reproduced when a gap was assumed to exist at the interface between the solidified Al2O3 and the vessel wall. (author)

  14. The development of depletion program coupled with Monte Carlo computer code

    International Nuclear Information System (INIS)

    Nguyen Kien Cuong; Huynh Ton Nghiem; Vuong Huu Tan

    2015-01-01

    The paper presents the development of a depletion code for light water reactors coupled with the MCNP5 code, called the MCDL code (Monte Carlo Depletion for Light Water Reactors). The first-order differential depletion equations for 21 actinide isotopes and 50 fission product isotopes are solved by the Radau IIA implicit Runge-Kutta (IRK) method after receiving the neutron flux, the one-group reaction rates and the multiplication factors for a fuel pin, a fuel assembly or the whole reactor core from the calculation results of the MCNP5 code. The calculation of beryllium poisoning and cooling time is also integrated in the code. To verify and validate the MCDL code, high enriched uranium (HEU) and low enriched uranium (LEU) fuel assemblies of the VVR-M2 type, as well as cores of the Dalat Nuclear Research Reactor (DNRR) with 89 fresh HEU fuel assemblies and with 92 fresh LEU fuel assemblies, have been investigated and compared with the results calculated by the SRAC code and the MCNP-REBUS linkage system code. The results show good agreement between the calculated data of the MCDL code and the reference codes. (author)
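
    As a rough illustration of the numerical scheme named above (only the scheme; the 71-nuclide MCDL chain and the MCNP5 coupling are not reproduced), the sketch below integrates a three-nuclide capture/decay chain with the Radau implicit Runge-Kutta method available in SciPy; the flux, cross sections and decay constants are made-up numbers.

      import numpy as np
      from scipy.integrate import solve_ivp

      phi     = 1.0e14                                   # one-group flux, n/cm^2/s, held constant over the step
      sigma_c = np.array([5.0e-24, 20.0e-24, 1.0e-24])   # capture cross sections, cm^2
      lam     = np.array([0.0, 1.0e-6, 1.0e-7])          # decay constants, 1/s

      def depletion_rhs(t, N):
          # dN_i/dt = production from nuclide i-1 minus removal from nuclide i
          removal    = (lam + sigma_c * phi) * N
          production = np.r_[0.0, removal[:-1]]
          return production - removal

      N0  = np.array([1.0e22, 0.0, 0.0])                 # initial number densities, 1/cm^3
      sol = solve_ivp(depletion_rhs, (0.0, 30 * 86400.0), N0, method="Radau", rtol=1e-8)
      print(sol.y[:, -1])                                # densities after a 30-day depletion step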

  15. Engine Lathe Operator. Instructor's Guide. Part of Single-Tool Skills Program Series. Machine Industries Occupations.

    Science.gov (United States)

    New York State Education Dept., Albany. Bureau of Secondary Curriculum Development.

    Expected to help meet the need for trained operators in metalworking and suitable for use in the adult education programs of school districts, in manpower development and training programs, and in secondary schools, this guide consists of four sections: Introduction, General Job Content, Shop Projects, and Drawings for the Projects. General Job…

  16. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using the Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for the control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  17. Autocoding State Machine in Erlang

    DEFF Research Database (Denmark)

    Guo, Yu; Hoffman, Torben; Gunder, Nicholas

    2008-01-01

    This paper presents an autocoding tool suite which supports the development of state machines in a model-driven fashion, where models are central to all phases of the development process. The tool suite, which is built on the Eclipse platform, provides facilities for the graphical specification of a state machine model. Once the state machine is specified, it is used as input to a code generation engine that generates source code in Erlang.

  18. LISA. A code for safety assessment in nuclear waste disposals program description and user guide

    International Nuclear Information System (INIS)

    Saltelli, A.; Bertozzi, G.; Stanners, D.A.

    1984-01-01

    The code LISA (Long term Isolation Safety Assessment), developed at the Joint Research Centre, Ispra, is a useful tool in the analysis of the hazard due to the disposal of nuclear waste in geological formations. The risk linked to pre-established release scenarios is assessed by the code in terms of the dose rate to a maximally exposed individual. The various submodels in the code simulate the system of barriers, both natural and man-made, which are interposed between the contaminants and man. After a description of the code features, a guide for the user is supplied, and then a test case is presented.

  19. Detecting Source Code Plagiarism on .NET Programming Languages using Low-level Representation and Adaptive Local Alignment

    Directory of Open Access Journals (Sweden)

    Oscar Karnalim

    2017-01-01

    Even though there are various source code plagiarism detection approaches, only a few works focus on low-level representation for deducing similarity; most of them consider only the lexical token sequence extracted from the source code. In our view, a low-level representation is more beneficial than lexical tokens since its form is more compact than the source code itself: it considers only semantics-preserving instructions and ignores many source code delimiter tokens. This paper proposes a source code plagiarism detection approach which relies on low-level representation. As a case study, we focus our work on .NET programming languages with the Common Intermediate Language as the low-level representation. In addition, we incorporate Adaptive Local Alignment for detecting similarity. According to Lim et al., this algorithm outperforms the state-of-the-art code similarity algorithm (i.e. Greedy String Tiling) in terms of effectiveness. According to our evaluation, which involves various plagiarism attacks, our approach is more effective and efficient when compared with the standard lexical-token approach.
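
    The alignment step can be made concrete with a small sketch: the function below scores a local alignment between two instruction-token sequences using plain Smith-Waterman scoring (not the paper's Adaptive Local Alignment variant), and the CIL-like opcode streams are invented examples.

      def local_alignment_score(a, b, match=2, mismatch=-1, gap=-1):
          """Best local-alignment score between token sequences a and b."""
          rows, cols = len(a) + 1, len(b) + 1
          H = [[0] * cols for _ in range(rows)]
          best = 0
          for i in range(1, rows):
              for j in range(1, cols):
                  diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
                  best = max(best, H[i][j])
          return best

      # hypothetical CIL-like opcode streams from two submissions
      p1 = ["ldarg.0", "ldarg.1", "add", "stloc.0", "ldloc.0", "ret"]
      p2 = ["ldarg.1", "ldarg.0", "add", "stloc.0", "ldloc.0", "ret"]
      print(local_alignment_score(p1, p2))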

  20. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...
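
    A toy sketch of the source-code-transformation idea follows; it is not how Tangent itself is implemented, and the restriction to a single argument and +, -, * expressions is an assumption made purely for brevity. The function's source is parsed, a derivative expression is built symbolically, and new Python source for the derivative function is generated and compiled (requires Python 3.9+ for ast.unparse).

      import ast
      import inspect

      def grad(fn):
          """Generate a new Python function computing d(fn)/dx for a one-argument
          function whose body is a single 'return' of +, -, * over x and constants."""
          tree = ast.parse(inspect.getsource(fn)).body[0]   # the FunctionDef node
          arg = tree.args.args[0].arg
          expr = tree.body[0].value                         # the returned expression

          def d(node):                                      # symbolic derivative of an AST node
              if isinstance(node, ast.Name):
                  return ast.Constant(1.0 if node.id == arg else 0.0)
              if isinstance(node, ast.Constant):
                  return ast.Constant(0.0)
              if isinstance(node, ast.BinOp) and isinstance(node.op, (ast.Add, ast.Sub)):
                  return ast.BinOp(d(node.left), node.op, d(node.right))
              if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
                  # product rule: (u*v)' = u'*v + u*v'
                  return ast.BinOp(ast.BinOp(d(node.left), ast.Mult(), node.right),
                                   ast.Add(),
                                   ast.BinOp(node.left, ast.Mult(), d(node.right)))
              raise NotImplementedError(ast.dump(node))

          new_src = f"def d{fn.__name__}({arg}):\n    return {ast.unparse(d(expr))}\n"
          namespace = {}
          exec(new_src, namespace)                          # compile the generated source
          return namespace[f"d{fn.__name__}"]

      def f(x):
          return x * x + 3.0 * x

      df = grad(f)      # generated source is roughly: def df(x): return 1.0*x + x*1.0 + (0.0*x + 3.0*1.0)
      print(df(2.0))    # 2*x + 3 evaluated at x = 2 -> 7.0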

  1. Quantum machine learning.

    Science.gov (United States)

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-13

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  2. The Evolution of a Coding Schema in a Paced Program of Research

    Science.gov (United States)

    Winters, Charlene A.; Cudney, Shirley; Sullivan, Therese

    2010-01-01

    A major task involved in the management, analysis, and integration of qualitative data is the development of a coding schema to facilitate the analytic process. Described in this paper is the evolution of a coding schema that was used in the analysis of qualitative data generated from online forums of middle-aged women with chronic conditions who…

  3. The Live Coding of Slub - Art Oriented Programming as Media Critique

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digital audio/images in music, video, stage design, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated code, etc.). This paper wi...

  4. Technology Roadmap Instrumentation, Control, and Human-Machine Interface to Support DOE Advanced Nuclear Energy Programs

    Energy Technology Data Exchange (ETDEWEB)

    Donald D Dudenhoeffer; Burce P Hallbert

    2007-03-01

    Instrumentation, Controls, and Human-Machine Interface (ICHMI) technologies are essential to ensuring delivery and effective operation of optimized advanced Generation IV (Gen IV) nuclear energy systems. In 1996, the Watts Bar I nuclear power plant in Tennessee was the last U.S. nuclear power plant to go on line. It was, in fact, built based on pre-1990 technology. Since this last U.S. nuclear power plant was designed, there have been major advances in the field of ICHMI systems. Computer technology employed in other industries has advanced dramatically, and computing systems are now replaced every few years as they become functionally obsolete. Functional obsolescence occurs when newer, more functional technology replaces or supersedes an existing technology, even though an existing technology may well be in working order. Although ICHMI architectures are comprised of much of the same technology, they have not been updated nearly as often in the nuclear power industry. For example, some newer Personal Digital Assistants (PDAs) or handheld computers may, in fact, have more functionality than the 1996 computer control system at the Watts Bar I plant. This illustrates the need to transition and upgrade current nuclear power plant ICHMI technologies.

  5. Annual Coded Wire Tag Program; Oregon Missing Production Groups, 1995 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Garrison, Robert L.; Mallette, Christine; Lewis, Mark A.

    1995-12-01

    The Bonneville Power Administration is the funding source for the Oregon Department of Fish and Wildlife's Annual Coded Wire Tag Program - Oregon Missing Production Groups Project. Tule brood fall chinook were caught primarily in the British Columbia, Washington and northern Oregon ocean commercial fisheries. The up-river bright fall chinook contributed primarily to the Alaska and British Columbia ocean commercial fisheries and the Columbia River gillnet fishery. Contribution of Rogue fall chinook released in the lower Columbia River system occurred primarily in the Oregon ocean commercial and Columbia River gillnet fisheries. Willamette spring chinook salmon contributed primarily to the Alaska and British Columbia ocean commercial, Oregon freshwater sport and Columbia River gillnet fisheries. Restricted ocean sport and commercial fisheries limited the contribution of the Columbia coho released in the Umatilla River, which survived at an average rate of 1.05% and contributed primarily to the Washington, Oregon and California ocean sport and commercial fisheries and the Columbia River gillnet fishery. The 1987 to 1991 brood years of coho released in the Yakima River survived at an average rate of 0.64% and contributed primarily to the Washington, Oregon and California ocean sport and commercial fisheries and the Columbia River gillnet fishery. Survival rates of salmon and steelhead are influenced not only by factors in the hatchery (disease, density, diet, and size and time of release), but also by environmental factors in the river and ocean. These environmental factors are controlled by large-scale weather patterns such as El Nino, over which man has no influence. Man could have some influence over river flow conditions, but political and economic pressures generally outweigh the biological needs of the fish.

  6. Development of a Wrapper Object for MARS TH Systems Code and Its Applications in Object Oriented Programs

    International Nuclear Information System (INIS)

    Park, Sun Byung; Lee, Young Jin; Kim, Hyong Chol; Han, Sam Hee; Kim, Hyun Jik

    2013-01-01

    TMARS is written in the Object Pascal programming language and 'wraps' the Dynamic Link Library (DLL) manifestation of the MARS-KS code, which is written in Fortran 90. TMARS behaves as a true object: it can be instantiated and inherited, and its methods can be overloaded. The functionality of TMARS was verified and demonstrated using two programs built in an object-oriented programming environment. One is a text-based program for reviewing the data interface of TMARS, and the other is a graphics-intensive prototype NPA program for testing the overall performance of TMARS. The prototype NPA was also used to assess the real-time capability of TMARS. The demonstration programs show that application of TMARS is straightforward and that its functions facilitate easy application development. TMARS, a wrapper object class encapsulating the calculation functions of the MARS-KS code, was successfully developed, and verification of its functionality was carried out using custom-made programs. The verification results show that TMARS is capable of providing reliable TH calculation results and sufficient performance to realize real-time calculations.

  7. Towards Compatible and Interderivable Semantic Specifications for the Scheme Programming Language, Part I: Denotational Semantics, Natural Semantics, and Abstract Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2009-01-01

    We derive two big-step abstract machines, a natural semantics, and the valuation function of a denotational semantics based on the small-step abstract machine for Core Scheme presented by Clinger at PLDI'98. Starting from a functional implementation of this small-step abstract machine, (1) we fus...

  8. The use of multimedia and programmed teaching machines for remote sensing education

    Science.gov (United States)

    Ulliman, J. J.

    1980-01-01

    The advantages, limitations, and uses of various audio visual equipments and techniques used in various universities for individualized and group instruction in the interpretation and classification of remotely sensed data are considered as well as systems for programmed and computer-assisted instruction.

  9. Preprocessor for RELAP5 code, nuclear reactor thermal hydraulics accident analysis program, using Microsoft MS-EXCEL tool

    International Nuclear Information System (INIS)

    Biaty, Patricia Andrea Paladino; Sabundjian, Gaiane

    2005-01-01

    Thermal hydraulic analysis of accidents and transients in nuclear power plants is carried out with special-purpose tools. These programs use best-estimate analyses and have been developed to simulate accidents and transients in Pressurized Water Reactors (PWR) and their auxiliary systems. The RELAP5 code has been used as a tool for licensing nuclear facilities in our country, and it is the focus of this study. The main problem when the RELAP5 code is used is the large amount of information necessary to simulate thermal hydraulic accidents. Moreover, a considerable number of mathematical operations is needed to calculate the geometry of the components involved. Therefore, in order to facilitate the manipulation of this information, it is necessary to develop a friendly preprocessor that carries out the mathematical calculations for the RELAP5 code. One of the tools used for some of these calculations is MS-EXCEL, which is used in this work. (author)
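
    As a small example of the kind of component-geometry arithmetic such a preprocessor automates (the formulas are standard pipe geometry; the numbers, and the use of Python rather than a spreadsheet, are illustrative assumptions):

      import math

      def pipe_geometry(diameter_m, length_m, n_subvolumes):
          """Flow area, sub-volume size and hydraulic diameter of a round pipe component."""
          area = math.pi * diameter_m ** 2 / 4.0
          return {
              "flow_area_m2": area,
              "subvolume_length_m": length_m / n_subvolumes,
              "subvolume_m3": area * length_m / n_subvolumes,
              "hydraulic_diameter_m": diameter_m,   # equals the bore for a circular pipe
          }

      print(pipe_geometry(diameter_m=0.05, length_m=3.0, n_subvolumes=6))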

  10. Code development and analysis program. RELAP4/MOD7 (Version 2): user's manual

    International Nuclear Information System (INIS)

    1978-08-01

    This manual describes RELAP4/MOD7 (Version 2), which is the latest version of the RELAP4 LPWR blowdown code. Version 2 is a precursor to the final version of RELAP4/MOD7, which will address LPWR LOCA analysis in integral fashion (i.e., blowdown, refill, and reflood in continuous fashion). This manual describes the new code models and provides application information required to utilize the code. It must be used in conjunction with the RELAP4/MOD5 User's Manual (ANCR-NUREG-1335, dated September 1976), and the RELAP4/MOD6 User's Manual

  11. Findings From the National Machine Guarding Program: A Small Business Intervention: Lockout/Tagout.

    Science.gov (United States)

    Parker, David L; Yamin, Samuel C; Xi, Min; Brosseau, Lisa M; Gordon, Robert; Most, Ivan G; Stanley, Rodney

    2016-01-01

    Failure to implement lockout/tagout (LOTO) procedures adversely affects the rate of work-related fatalities and serious traumatic injury and is one of the most frequently cited Occupational Safety and Health Administration standards. This study assesses the impact of a nationwide intervention to improve LOTO in small metal fabrication businesses. Insurance safety consultants conducted a standardized and validated evaluation of LOTO programs and procedures. Businesses received a baseline evaluation, two intervention visits, and a 12-month follow-up evaluation. The intervention was completed by 160 businesses. The mean LOTO procedure score improved from 8% to 33% (P < 0.0001), the mean program score went from 55% to 76% (P < 0.0001), and the presence of lockable disconnects went from 88% to 92% (P < 0.0001). This nationwide intervention showed substantial improvements in LOTO. It provides a framework for assessing and improving LOTO.

  12. Towards Compatible and Interderivable Semantic Specifications for the Scheme Programming Language, Part I: Denotational Semantics, Natural Semantics, and Abstract Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2008-01-01

    We derive two big-step abstract machines, a natural semantics, and the valuation function of a denotational semantics based on the small-step abstract machine for Core Scheme presented by Clinger at PLDI'98. Starting from a functional implementation of this small-step abstract machine, (1) we fuse its transition function with its driver loop, obtaining the functional implementation of a big-step abstract machine; (2) we adjust this big-step abstract machine so that it is in defunctionalized form, obtaining the functional implementation of a second big-step abstract machine; (3) we refunctionalize this adjusted abstract machine, obtaining the functional implementation of a natural semantics in continuation style; and (4) we closure-unconvert this natural semantics, obtaining a compositional continuation-passing evaluation function which we identify as the functional implementation...

  13. Prediction of Student Dropout in E-Learning Program Through the Use of Machine Learning Method

    OpenAIRE

    Mingjie Tan; Peiji Shao

    2015-01-01

    The high rate of dropout is a serious problem in E-learning programs, and it has therefore received extensive concern from education administrators and researchers. Predicting the potential dropout students is a workable way to prevent dropout. Based on an analysis of the related literature, this study selected students' personal characteristics and academic performance as the input attributes. Prediction models were developed using Artificial Neural Network (ANN), Decision Tree (DT) and Bayesian Ne...
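
    A minimal sketch of such a comparison could look as follows in scikit-learn, covering two of the classifier families mentioned; this is not the study's actual data or pipeline, and the feature columns and the synthetic dropout label are invented for illustration.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(0)
      n = 1000
      # hypothetical attributes: age, hours online per week, quiz average
      X = np.column_stack([
          rng.integers(18, 50, n),
          rng.uniform(0.0, 20.0, n),
          rng.uniform(0.0, 100.0, n),
      ])
      # synthetic dropout label loosely tied to low engagement and low scores
      y = ((X[:, 1] < 5.0) & (X[:, 2] < 60.0)).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      for model in (MLPClassifier(max_iter=2000, random_state=0),
                    DecisionTreeClassifier(max_depth=5, random_state=0)):
          model.fit(X_tr, y_tr)
          print(type(model).__name__, accuracy_score(y_te, model.predict(X_te)))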

  14. Automated detection and classification of cryptographic algorithms in binary programs through machine learning

    OpenAIRE

    Hosfelt, Diane Duros

    2015-01-01

    Threats from the internet, particularly malicious software (i.e., malware) often use cryptographic algorithms to disguise their actions and even to take control of a victim's system (as in the case of ransomware). Malware and other threats proliferate too quickly for the time-consuming traditional methods of binary analysis to be effective. By automating detection and classification of cryptographic algorithms, we can speed program analysis and more efficiently combat malware. This thesis wil...

  15. Validation of the REL2005 code package on Gd-poisoned PWR type assemblies through the CAMELEON experimental program

    International Nuclear Information System (INIS)

    Blaise, Patrick; Vidal, Jean-Francois; Santamarina, Alain

    2009-01-01

    This paper details the validation of Gd-poisoned 17x17 PWR lattices, through several configurations of the CAMELEON experimental program, using the newly qualified REL2005 French code package. After a general presentation of the CAMELEON program, which took place in the EOLE critical facility in Cadarache, the new REL2005 code package is described; it relies on the deterministic transport code APOLLO2.8, based on the method of characteristics (MOC), and on its new CEA2005 library based on the latest JEFF-3.1.1 nuclear data evaluation. For critical masses, the average calculation-to-experiment (C/E) values on keff are (136 ± 80) pcm and (300 ± 76) pcm for the reference 281-group MOC and the optimized 26-group MOC schemes, respectively. These values also include a drastic improvement of about 250 pcm due to the change of library from JEF-2.2 to JEFF-3.1. For pin-by-pin radial power distributions, the reference and REL2005 results are very close, with maximum discrepancies of the order of 2%, i.e. within the experimental uncertainty limits. The optimized REL2005 code package predicts the reactivity worth of the Gd clusters (averaged over 9 experimental configurations) as C/E Δρ(Gd clusters) = +1.3% ± 2.3%. (author)

  16. IAEA program for the preparation of safety codes and guides for nuclear power plants

    International Nuclear Information System (INIS)

    1975-01-01

    On 13 September 1974, the IAEA Governors' Council gave its consent to the programme for the establishment of safety codes and guides (annex VII to IAEA document G.C. (XVIII/526)). The programme envisages the establishment of one code of practice for each of the areas of governmental organization, siting, design, operation and quality assurance, and also of about 50 safety guides between 1975 and 1980. These codes will contain the minimum requirements for the safety of nuclear power stations and their systems and components. The guides will recommend methods to achieve the aims stated in the codes. The purpose of these IAEA activities is to provide recommendations and guiding rules which may serve as standards for the assessment of the safety of nuclear power stations for all nations which may become participants in the peaceful use of nuclear energy within the next few years. (orig./AK) [de

  17. Analysis of programming properties and the row-column generation method for 1-norm support vector machines.

    Science.gov (United States)

    Zhang, Li; Zhou, WeiDa

    2013-12-01

    This paper deals with fast methods for training a 1-norm support vector machine (SVM). First, we define a specific class of linear programming with many sparse constraints, i.e., row-column sparse constraint linear programming (RCSC-LP). By nature, the 1-norm SVM is a sort of RCSC-LP. In order to construct subproblems for RCSC-LP and solve them, a family of row-column generation (RCG) methods is introduced. RCG methods belong to a category of decomposition techniques, and perform row and column generation in a parallel fashion. Specifically, for the 1-norm SVM, the maximum size of the subproblems of RCG is identical to the number of support vectors (SVs). We also introduce a semi-deleting rule for RCG methods and prove the convergence of RCG methods when using the semi-deleting rule. Experimental results on toy data and real-world datasets illustrate that it is efficient to use RCG to train the 1-norm SVM, especially when the number of SVs is small. Copyright © 2013 Elsevier Ltd. All rights reserved.
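
    To illustrate why the 1-norm SVM is itself a linear program (the observation the abstract builds on), the following sketch states the standard 1-norm soft-margin SVM as an LP and hands it to a generic solver. It is only a plain textbook formulation; it does not implement the paper's row-column generation or semi-deleting rule, and the toy data and parameter C are placeholders.

        # Minimal sketch (not the paper's RCG algorithm): the standard 1-norm SVM
        # written as a linear program and solved with a generic LP solver.
        #   minimize  ||w||_1 + C * sum(xi)
        #   s.t.      y_i (w.x_i + b) >= 1 - xi_i,   xi >= 0
        # w is split into w = wp - wn with wp, wn >= 0 to linearize the 1-norm.
        import numpy as np
        from scipy.optimize import linprog

        def l1_svm(X, y, C=1.0):
            n, d = X.shape
            # variable layout: [wp (d) | wn (d) | b (1, free) | xi (n)]
            c = np.concatenate([np.ones(2 * d), [0.0], C * np.ones(n)])
            A = np.hstack([-y[:, None] * X, y[:, None] * X, -y[:, None], -np.eye(n)])
            b_ub = -np.ones(n)
            bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n
            res = linprog(c, A_ub=A, b_ub=b_ub, bounds=bounds, method="highs")
            z = res.x
            return z[:d] - z[d:2 * d], z[2 * d]          # (w, b)

        X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -1.0], [-1.0, -2.0]])
        y = np.array([1, 1, -1, -1])
        w, b = l1_svm(X, y)
        print("w =", w, "b =", b)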

  18. Lessons learned from new construction utility demand side management programs and their implications for implementing building energy codes

    Energy Technology Data Exchange (ETDEWEB)

    Wise, B.K.; Hughes, K.R.; Danko, S.L.; Gilbride, T.L.

    1994-07-01

    This report was prepared for the US Department of Energy (DOE) Office of Codes and Standards by the Pacific Northwest Laboratory (PNL) through its Building Energy Standards Program (BESP). The purpose of this task was to identify demand-side management (DSM) strategies for new construction that utilities have adopted or developed to promote energy-efficient design and construction. PNL conducted a survey of utilities and used the information gathered to extrapolate lessons learned and to identify evolving trends in utility new-construction DSM programs. The ultimate goal of the task is to identify opportunities where states might work collaboratively with utilities to promote the adoption, implementation, and enforcement of energy-efficient building energy codes.

  19. Expanding Options. A Model to Attract Secondary Students into Nontraditional Vocational Programs. For Emphasis in: Building Trades, Electronics, Health Services, Machine Shop, Welding.

    Science.gov (United States)

    Good, James D.; DeVore, Mary Ann

    This model has been designed for use by Missouri secondary schools in attracting females and males into nontraditional occupational programs. The research-based strategies are intended for implementation in the following areas: attracting females into building trades, electronics, machine shop, and welding; and males into secondary health…

  20. Maintenance Of The EPS 3000 Electron Beam Machine As Part Of Quality Assurance Program For Irradiation Service At ALURTRON, Nuclear Malaysia

    International Nuclear Information System (INIS)

    Siti Aiasah Hashim; Shari Jahar; Ayub Muhammad; Azmi Ali; Abdul Basit Shafiei; Sarada Idris

    2012-01-01

    The EPS 3000 electron beam machine is the first of its kind in the country and was installed at Nuclear Malaysia in 1991. It was manufactured by Nissin High Voltage and has variable energies from 0.5 to 3.0 MeV and a maximum power of 90 kW. The machine is currently used for commercial irradiation serving local industries. The Alurtron facility where the EPS is housed is an ISO 9000 certified plant. The maintenance program for the EPS is an essential part of Alurtron's Quality Assurance program, ensuring that the machine is in good condition and can serve customers as demand requires. Preventive maintenance is carried out at scheduled periods based on the recommendations of the machine's manufacturer. Corrective maintenance and repairs are carried out in-house by Alurtron's technical staff, with assistance sought from the manufacturer if necessary. Over the years, Alurtron has built its own capabilities in terms of operation and maintenance of Cockcroft-Walton type electron beam machines. (author)

  1. PROBABILISTIC PROGRAMMING FOR ADVANCED MACHINE LEARNING (PPAML) DISCRIMINATIVE LEARNING FOR GENERATIVE TASKS (DILIGENT)

    Science.gov (United States)

    2017-11-29

    ...follows, to see the performance of the SVM Standard algorithm: python mamiStd.py --nJobs 2 --trainSize 80, where nJobs tells the computer to use ... follows: python mamiLupi.py --nJobs 2 --trainSize 80, where nJobs tells the computer to use 2 processors and trainSize tells it to run the ... in the course of the DARPA PPAML program. ... As explained in the Introduction, the focus of our project is to enable the use of discriminative

  2. Report on a study of the feasibility of use of the COLUMN2 computer program in a probabilistic risk assessment code

    International Nuclear Information System (INIS)

    Hall, R.C.; Liew, S.K.

    1986-10-01

    This report contains the results of a feasibility study carried out on the COLUMN2 computer program to assess its potential for use in a time-dependent probabilistic risk assessment (PRA) code for radiological assessment purposes. COLUMN2 is a program which predicts nuclide transport in a one-dimensional geosphere configuration using a simple K_d approach for sorption. A moving-grid numerical solver is used, and the program is therefore potentially capable of handling time-dependent parameters. The reported work covers code acquisition, loading, theory and structure, necessary code modifications, and testing, the last including two documented test cases from the COLUMN2 manual and verification tests derived from international comparison exercises. Recommendations are made as to the code development required for COLUMN2 to fulfil its prime role, as defined by the System Design Working Group, in a PRA code. (author)
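
    For readers unfamiliar with the K_d approach, the following is a minimal, assumption-laden sketch of 1-D advective-dispersive nuclide transport with linear sorption folded into a retardation factor; it uses a fixed explicit grid rather than COLUMN2's moving-grid solver, and all parameter values are illustrative only.

        # Illustrative sketch only (not the COLUMN2 solver): 1-D advection-dispersion
        # of a nuclide with linear Kd sorption folded into a retardation factor R,
        # plus radioactive decay, on a fixed explicit finite-difference grid.
        import numpy as np

        L, nx = 10.0, 200                 # column length [m], grid points
        v, D = 1.0, 0.05                  # pore velocity [m/a], dispersion [m^2/a]
        rho_b, theta, Kd = 1600.0, 0.3, 0.001   # bulk density, porosity, Kd [m^3/kg]
        lam = 0.01                        # decay constant [1/a]
        R = 1.0 + rho_b * Kd / theta      # retardation factor

        dx = L / (nx - 1)
        dt = 0.4 * min(dx / v, dx**2 / (2 * D)) * R   # conservative stability limit
        C = np.zeros(nx)
        C[0] = 1.0                        # constant-concentration inlet boundary

        for _ in range(int(5.0 / dt)):    # simulate 5 years
            adv = -v * (C[1:-1] - C[:-2]) / dx            # upwind advection
            dif = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2
            C[1:-1] += dt * ((adv + dif) / R - lam * C[1:-1])
            C[0], C[-1] = 1.0, C[-2]      # boundary conditions

        print("concentration profile at 5 a:", np.round(C[::20], 4))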

  3. A study on the nuclear computer code maintenance and management system

    International Nuclear Information System (INIS)

    Kim, Yeon Seung; Huh, Young Hwan; Lee, Jong Bok; Choi, Young Gil; Suh, Soong Hyok; Kang, Byong Heon; Kim, Hee Kyung; Kim, Ko Ryeo; Park, Soo Jin

    1990-12-01

    In line with current software development and quality assurance trends, it is necessary to develop a computer code management system for nuclear programs. For this reason, the project started in 1987. The main objectives of the project are to establish a nuclear computer code management system, to secure software reliability, and to develop nuclear computer code packages. Work performed this year included operating and maintaining the computer code information system for KAERI computer codes, developing an application tool, AUTO-i, for computing the first and second moments of inertia of polygons and circles, and researching nuclear computer code conversion between different machines. To better support nuclear code availability and reliability, assistance from the users of the codes is required. Lastly, for easy reference to code information, we present a list of code names and information on the codes which were introduced or developed during this year. (Author)

  4. ALGORITHMS FOR THE PROGRAMMING OF FOOTWEAR SOLES MOULDS ON WORKING POSTS OF INJECTION MACHINES

    Directory of Open Access Journals (Sweden)

    LUCA Cornelia

    2014-05-01

    Full Text Available The stock of moulds needed to produce a given volume of footwear soles at a steady rhythm depends on several criteria, such as: the range of sole types to be produced daily, the size structure of those soles and the corresponding size series, the technological cycle of a mould in use depending on equipment efficiency, the need to provide spare moulds, and the conditions of use and fixing. From the efficiency point of view, the equipment may have two working posts or more (always an even number, such as 6, 12, 24 or 40 posts). Footwear sole manufacturing takes into account the percentage distribution of the size numbers within the size series. When a portative assembly with "n" working posts is used for injection moulding of footwear soles, an optimal distribution of the moulds over the working posts is very important. The disadvantage of this equipment is unbalanced programming of the moulds, in which some working posts stand idle at a given time. The paper presents practical and theoretical solutions for programming the mould stock on portative assemblies for footwear sole injection, so that an optimal degree of balance among the working posts is obtained.

  5. Developmental programming of long non-coding RNAs during postnatal liver maturation in mice.

    Directory of Open Access Journals (Sweden)

    Lai Peng

    Full Text Available The liver is a vital organ with critical functions in metabolism, protein synthesis, and immune defense. Most liver functions are not mature at birth and many changes happen during postnatal liver development. However, it is unclear what changes occur in the liver after birth, at what developmental stages they occur, and how the developmental processes are regulated. Long non-coding RNAs (lncRNAs) are involved in organ development and cell differentiation. Here, we analyzed the transcriptome of lncRNAs in mouse liver from perinatal (day -2) to adult (day 60) stages by RNA sequencing, with an attempt to understand the role of lncRNAs in liver maturation. We found around 15,000 genes expressed, including about 2,000 lncRNAs. Most lncRNAs were expressed at a lower level than coding RNAs. Both coding RNAs and lncRNAs displayed three major ontogenic patterns: enriched at neonatal, adolescent, or adult stages. Neighboring coding and non-coding RNAs tended to exhibit highly correlated ontogenic expression patterns. Gene ontology (GO) analysis revealed that some lncRNAs enriched at neonatal ages have neighboring protein-coding genes that are also enriched at neonatal ages and associated with cell proliferation, immune activation related processes, tissue organization pathways, and hematopoiesis; other lncRNAs enriched at adolescent ages have neighboring protein-coding genes associated with different metabolic processes. These data reveal a significant functional transition during postnatal liver development and imply the potential importance of lncRNAs in liver maturation.

  6. Simulation model for wind energy storage systems. Volume III. Program descriptions. [SIMWEST CODE

    Energy Technology Data Exchange (ETDEWEB)

    Warren, A.W.; Edsinger, R.W.; Burroughs, J.D.

    1977-08-01

    This effort developed a comprehensive computer program for the modeling of wind energy/storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic). The acronym for the program is SIMWEST (Simulation Model for Wind Energy Storage). The level of detail of SIMWEST is consistent with its role of evaluating the economic feasibility as well as the general performance of wind energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. Volume III, the SIMWEST program description, contains program descriptions, flow charts and program listings for the SIMWEST Model Generation Program, the Simulation program, the File Maintenance program and the Printer Plotter program. Volume III generally would not be required by the SIMWEST user.

  7. Pembuatan Kakas Pendeteksi Unused Method pada Kode Program PHP dengan Framework CodeIgniter Menggunakan Call Graph

    Directory of Open Access Journals (Sweden)

    Divi Galih Prasetyo Putri

    2014-03-01

    Full Text Available The evolution and maintenance of a system is a very important process in software engineering, and web applications are no exception. In this process, most developers no longer refer to the system design. This leads to the appearance of unused methods: parts of the program that are no longer used but remain in the system. This condition increases complexity and reduces the understandability of the system. Detecting unused methods in a program requires a code analysis technique. The static analysis technique used here exploits a call graph built from the program code to identify unused methods. The call graph is constructed from the calls between methods. This application detects unused methods in PHP program code built with the CodeIgniter framework. The input source code is parsed into an Abstract Syntax Tree (AST), which is then used to analyze the program code. The analysis produces a call graph, from which the methods that cannot be reached are identified as unused methods. The tool was tested on 5 PHP applications, with an average precision of 0.749 and a recall of 1.
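
    A minimal sketch of the call-graph idea described above, written against Python's ast module rather than PHP/CodeIgniter: parse the source, record direct function calls, and flag functions unreachable from the entry points. The entry-point name and the toy source are assumptions for illustration.

        # Minimal sketch of the idea (in Python rather than PHP/CodeIgniter): parse
        # source into an AST, build a call graph of direct function calls, then flag
        # functions unreachable from the declared entry points as "unused".
        import ast
        from collections import defaultdict

        SOURCE = '''
        def main():
            helper_a()

        def helper_a():
            helper_b()

        def helper_b():
            pass

        def orphan():
            helper_b()
        '''

        tree = ast.parse(SOURCE)
        calls = defaultdict(set)
        defined = set()

        for node in ast.walk(tree):
            if isinstance(node, ast.FunctionDef):
                defined.add(node.name)
                for inner in ast.walk(node):
                    if isinstance(inner, ast.Call) and isinstance(inner.func, ast.Name):
                        calls[node.name].add(inner.func.id)

        # reachability from the entry points (here just "main")
        reachable, stack = set(), ["main"]
        while stack:
            fn = stack.pop()
            if fn in reachable:
                continue
            reachable.add(fn)
            stack.extend(calls[fn])

        print("unused:", sorted(defined - reachable))   # -> ['orphan']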

  8. Code requirements document: MODFLOW 2.1: A program for predicting moderator flow patterns

    International Nuclear Information System (INIS)

    Peterson, P.F.

    1992-03-01

    Sudden changes in the temperature of flowing liquids can result in transient buoyancy forces which strongly impact the flow hydrodynamics via flow stratification. These effects have been studied for the case of potential flow of stratified liquids to line sinks, but not for moderator flow in SRS reactors. Standard codes, such as TRAC and COMMIX, do not have the capability to capture the stratification effect, due to strong numerical diffusion which smears away the hot/cold fluid interface. A related problem with standard codes is the inability to track plumes injected into the liquid flow, again due to numerical diffusion. The combined effects of buoyant stratification and plume dispersion have been identified as being important in operation of the Supplementary Safety System which injects neutron-poison ink into SRS reactors to provide safe shutdown in the event of safety rod failure. The MODFLOW code discussed here provides transient moderator flow pattern information with stratification effects, and tracks the location of ink plumes in the reactor. The code, written in Fortran, is compiled for Macintosh II computers, and includes subroutines for interactive control and graphical output. Removing the graphics capabilities, the code can also be compiled on other computers. With graphics, in addition to the capability to perform safety related computations, MODFLOW also provides an easy tool for becoming familiar with flow distributions in SRS reactors

  9. Advanced man-machine interface systems and advanced information management systems programs

    International Nuclear Information System (INIS)

    Naser, J.; Gray, S.; Machiels, A.

    1997-01-01

    The Advanced Light Water Reactor (ALWR) Program started in the early 1980s. This work involves the development and NRC review of the ALWR Utility Requirements Documents, the development and design certification of ALWR designs, the analysis of the Early Site Permit process, and the First-of-a-Kind Engineering for two of the ALWR plant designs. ALWRs will embody modern proven technology. However, the technologies expected to be used in these plants are changing very rapidly, so that additional capabilities will become available that will be beneficial for future plants. To remain competitive on a life-cycle basis in the future, the ALWR must take advantage of the best and most modern technologies available. 1 ref

  10. A program system for ab initio MO calculations on vector and parallel processing machines. Pt. 3

    International Nuclear Information System (INIS)

    Wiest, R.; Demuynck, J.; Benard, M.; Rohmer, M.M.; Ernenwein, R.

    1991-01-01

    This series of three papers presents a program system for ab initio molecular orbital calculations on vector and parallel computers. Part III is devoted to the four-index transformation, onto a molecular orbital basis of size NMO, of the file of two-electron integrals (pq∥rs) generated by a contracted Gaussian set of size NATO (number of atomic orbitals). A fast Yoshimine algorithm first sorts the (pq∥rs) integrals with respect to the index pq only. This file of half-sorted integrals, labelled by their rs index, can be processed without further modification to generate either the transformed integrals or the supermatrix elements. The large memory available on the CRAY-2 has made it possible to implement the transformation algorithm proposed by Bender in 1972, which requires a core-storage allocation varying as (NATO)^3. Two versions of Bender's algorithm are included in the present program. The first is an in-core version, where the complete file of accumulated contributions to the transformed integrals is stored and updated in central memory. This version has been parallelized by distributing over a limited number of logical tasks the NATO steps corresponding to the scanning of the outermost loop. The second is an out-of-core version, in which twin files are used alternately as input and output for the accumulated contributions to the transformed integrals. This version is not parallel. The choice of one version or the other and (for version 1) the determination of the number of tasks depend upon the balance between the available and requested amounts of storage. The storage management and the choice of the proper version are carried out automatically using dynamic storage allocation. Both versions are vectorized and take advantage of molecular symmetry. (orig.)
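
    As a hedged illustration of the four-index transformation itself (not of the paper's Yoshimine sort, symmetry handling or out-of-core machinery), the following dense numpy sketch performs the standard four quarter-transformations on random AO integrals and MO coefficients.

        # Minimal dense sketch of the AO->MO four-index transformation, done as four
        # successive quarter-transformations (O(N^5)); the Yoshimine pre-sort, point-
        # group symmetry and out-of-core twin files of the paper are not reproduced.
        import numpy as np

        nao, nmo = 8, 6
        rng = np.random.default_rng(1)
        eri_ao = rng.random((nao, nao, nao, nao))     # (pq|rs) in the AO basis
        C = rng.random((nao, nmo))                    # MO coefficients, columns = MOs

        tmp = np.einsum("pqrs,pi->iqrs", eri_ao, C)   # quarter transformation 1
        tmp = np.einsum("iqrs,qj->ijrs", tmp, C)      # quarter transformation 2
        tmp = np.einsum("ijrs,rk->ijks", tmp, C)      # quarter transformation 3
        eri_mo = np.einsum("ijks,sl->ijkl", tmp, C)   # quarter transformation 4

        # brute-force check against the direct O(N^8) contraction for one element
        direct = np.einsum("pqrs,p,q,r,s->", eri_ao, C[:, 0], C[:, 1], C[:, 2], C[:, 3])
        assert np.isclose(eri_mo[0, 1, 2, 3], direct)
        print("transformed integral (01|23):", eri_mo[0, 1, 2, 3])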

  11. Towards Compatible and Interderivable Semantic Specifications for the Scheme Programming Language, Part II: Reduction Semantics and Abstract Machines

    DEFF Research Database (Denmark)

    Biernacka, Malgorzata; Danvy, Olivier

    2008-01-01

    We present a context-sensitive reduction semantics for a lambda-calculus with explicit substitutions and store, and we show that the functional implementation of this small-step semantics mechanically corresponds to that of an abstract machine. This abstract machine is very close to the abstract machine for Core Scheme presented by Clinger at PLDI'98. This lambda-calculus with explicit substitutions and store therefore aptly accounts for Core Scheme.

  12. Procedure for the use of the code SAGAPO-A and auxiliary programs

    International Nuclear Information System (INIS)

    Cevolani, S.

    1981-06-01

    This paper describes the procedure developed to optimize the use of the computer code SAGAPO-A for the thermo-fluid-dynamic analysis of gas-cooled fuel element bundles. The first item of this procedure concerns the dynamic dimensioning of the code, aimed at optimizing the computer storage requirement. The second item concerns the graphical output: the results of the calculation are plotted together with the experimental results, in order to allow an immediate evaluation of the calculation. (orig.) [de]

  13. Hanford Dose Overview Program. Comparison of AIRDOS-EPA and Hanford site dose codes

    International Nuclear Information System (INIS)

    Aaberg, R.L.; Napier, B.A.

    1985-11-01

    Radiation dose commitments for persons in the Hanford environs calculated using AIRDOS-EPA were compared with those calculated using a suite of Hanford codes: FOOD, PABLM, DACRIN, and KRONIC. Dose commitments to the population and to the maximally exposed individual (MI), based on annual releases of eight radionuclides from the N-Reactor, were calculated by these codes. Dose commitments from each pathway to the total body, lung, thyroid, and lower large intestine (LLI) are given for the population and the MI, respectively. 11 refs., 25 tabs

  14. HETERO code, heterogeneous procedure for reactor calculation; Program Hetero, heterogeni postupak proracuna reaktora

    Energy Technology Data Exchange (ETDEWEB)

    Jovanovic, S M; Raisic, N M [Boris Kidric Institute of Nuclear Sciences Vinca, Beograd (Yugoslavia)

    1966-11-15

    This report describes the procedure for calculating the parameters of a heterogeneous reactor system, taking into account the interaction between fuel elements for a given geometry. The first part contains the analysis of a single fuel element in a diffusion medium, and the criticality condition of the reactor system described by superposition of the element interactions. The possibility of performing such an analysis by determining the heterogeneous system lattice is described in the second part. The computer code HETERO, with the code KETAP (calculation of the criticality factor η_n and flux distribution), is part of this report, together with an example of the RB reactor square lattice.

  15. Keyboard with Universal Communication Protocol Applied to CNC Machine

    Directory of Open Access Journals (Sweden)

    Mejía-Ugalde Mario

    2014-04-01

    Full Text Available This article describes the use of a universal communication protocol for an industrial microcontroller-based keyboard applied to a computer numerically controlled (CNC) machine. The main difference among keyboard manufacturers is that each has its own source-code programming, producing a different communication protocol and leading to improper interpretation of the intended function. As a result, commercial industrial keyboards are expensive and incompatible when connected to different machines. In the present work, the protocol allows the designed universal keyboard and the standard PC keyboard to be connected at the same time; it is compatible with all computers through USB, AT or PS/2 communication, for use in CNC machines, with extension to other machines such as robots, blowing machines, injection molding machines and others. The advantages of this design include easy reprogramming, decreased costs, handling of various machine functions, and easy expansion of input and output signals. The results obtained from performance tests were satisfactory: each key can be programmed and reprogrammed in different ways, generating codes for different functions depending on the application in which it is used.

  16. Machine learning with R

    CERN Document Server

    Lantz, Brett

    2015-01-01

    Perhaps you already know a bit about machine learning but have never used R, or perhaps you know a little R but are new to machine learning. In either case, this book will get you up and running quickly. It would be helpful to have a bit of familiarity with basic programming concepts, but no prior experience is required.

  17. Development of a Data Acquisition Program for the Purpose of Monitoring Processing Statistics Throughout the BaBar Online Computing Infrastructure's Farm Machines

    Energy Technology Data Exchange (ETDEWEB)

    Stonaha, P.

    2004-09-03

    A current shortcoming of the BaBar monitoring system is the lack of systematic gathering, archiving, and access to the running statistics of the BaBar Online Computing Infrastructure's farm machines. Using C, a program has been written to gather the raw data of each machine's running statistics and compute various rates and percentages that can be used for system monitoring. These rates and percentages can then be stored in an EPICS database for graphing, archiving, and future access. Graphical outputs show the reception of the data into the EPICS database. The C program can detect whether the data are 32- or 64-bit and correct for overflows. This program is not exclusive to BaBar and can easily be modified for any system.
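
    A minimal sketch of the counter-sampling idea, in Python rather than the original C and not the actual BaBar/EPICS code: derive a rate from two samples of a monotonically increasing counter, correcting for a single 32-bit wrap-around.

        # Minimal sketch (not the actual BaBar/EPICS code): turn two samples of a
        # monotonically increasing kernel counter into a rate, correcting for a
        # wrap-around when the counter is only 32 bits wide.
        def counter_rate(prev, curr, dt_seconds, width_bits=32):
            """Rate in counts/second; assumes at most one wrap between samples."""
            delta = curr - prev
            if delta < 0:                       # counter overflowed and wrapped
                delta += 1 << width_bits
            return delta / dt_seconds

        # a 32-bit counter wrapped from near 2**32 back to a small value
        print(counter_rate(prev=4_294_967_000, curr=200, dt_seconds=10))   # ~49.6/s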

  18. Programming Video Games and Simulations in Science Education: Exploring Computational Thinking through Code Analysis

    Science.gov (United States)

    Garneli, Varvara; Chorianopoulos, Konstantinos

    2018-01-01

    Various aspects of computational thinking (CT) could be supported by educational contexts such as simulations and video-games construction. In this field study, potential differences in student motivation and learning were empirically examined through students' code. For this purpose, we performed a teaching intervention that took place over five…

  19. Evaluation of cleaning and disinfection performance of automatic washer disinfectors machines in programs presenting different cycle times and temperatures.

    Science.gov (United States)

    Bergo, Maria do Carmo Noronha Cominato

    2006-01-01

    Thermal washer-disinfectors represent a technology that brought about great advantages, such as the establishment of protocols and standard operating procedures and a reduction in occupational risks of a biological and environmental nature. The efficacy of the cleaning and disinfection obtained by automatic washer-disinfector machines running programs with different times and temperatures, as determined by the different official agencies, was validated according to the recommendations of ISO Standard 15883-1/1999 and HTM 2030 (NHS Estates, 1997) for determining the Minimum Lethality and the DAL, both theoretically and with thermocouples. To determine the cleaning efficacy, the Soil Test, Biotrace Pro-tect and the Protein Test Kit were used. The CFU count of viable microorganisms was performed before and after the thermal disinfection. This article shows that the results comply with the ISO and HTM standards. The validation steps confirmed the high efficacy of the medical washer-disinfectors. This protocol enabled an evidence-based evaluation of the procedure, aiming to provide the multi-professional personnel of the Supply Center with information and the possibility of developing further research.

  20. Hydride heat pump. Volume I. Users manual for HYCSOS system design program. [HYCSOS code

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, R.; Moritz, P.

    1978-05-01

    A method for the design and costing of a metal hydride heat pump for residential use and a computer program, HYCSOS, which automates that method are described. The system analyzed is one in which a metal hydride heat pump can provide space heating and space cooling powered by energy from solar collectors and electric power generated from solar energy. The principles and basic design of the system are presented, and the computer program is described giving detailed design and performance equations used in the program. The operation of the program is explained, and a sample run is presented. This computer program is part of an effort to design, cost, and evaluate a hydride heat pump for residential use. The computer program is written in standard Fortran IV and was run on a CDC Cyber 74 and Cyber 174 computer. A listing of the program is included as an appendix. This report is Volume 1 of a two-volume document.

  1. Test of user- and system programs coded in real time languages - requirements on program language and testing tool

    International Nuclear Information System (INIS)

    Hertlin, J.; Mackert, M.

    1979-01-01

    The present paper describes the functions that should be part of a test system for user programs written in a high-level real-time programming language, taking into account time sequences and concurrent processes. As the problem of testing shows, the use of high-level real-time programming languages makes program development essentially easier; however, performing test procedures without appropriate test systems is very difficult. After presenting notions and methods for the testing of programs, general requirements on testing tools are described and the test-system functions for a program test that is not time-critical are put together. For every individual function, the interface between the test system, the program under test, and the rest of the program-generation system (compiler, linker, operating system, run-time system, and loader) is also given. For the time-critical test, a series of desirable functions is described which can be implemented at acceptable expense. (orig.) [de]

  2. Analyzing Array Manipulating Programs by Program Transformation

    Science.gov (United States)

    Cornish, J. Robert M.; Gange, Graeme; Navas, Jorge A.; Schachte, Peter; Sondergaard, Harald; Stuckey, Peter J.

    2014-01-01

    We explore a transformational approach to the problem of verifying simple array-manipulating programs. Traditionally, verification of such programs requires intricate analysis machinery to reason with universally quantified statements about symbolic array segments, such as "every data item stored in the segment A[i] to A[j] is equal to the corresponding item stored in the segment B[i] to B[j]." We define a simple abstract machine which allows for set-valued variables and we show how to translate programs with array operations to array-free code for this machine. For the purpose of program analysis, the translated program remains faithful to the semantics of array manipulation. Based on our implementation in LLVM, we evaluate the approach with respect to its ability to extract useful invariants and the cost in terms of code size.
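
    The following toy sketch (not the paper's LLVM-based transformation) conveys the flavor of the idea: a concrete array-copying loop next to an "array-free" version in which the array is summarized by a set-valued variable, so the abstract program over-approximates the concrete one.

        # Rough illustration of the translation idea (not the paper's algorithm):
        # a concrete array-manipulating loop, and an "array-free" abstract version
        # where each array is summarized by a set-valued variable; the abstraction
        # over-approximates the set of values the concrete array may hold.
        def concrete(B):
            A = [0] * len(B)
            for i in range(len(B)):
                A[i] = B[i] + 1
            return A

        def abstract(set_B):
            set_A = {0}                              # initial contents of A
            set_A = set_A | {b + 1 for b in set_B}   # effect of the whole loop
            return set_A

        B = [3, 7, 7, 9]
        assert set(concrete(B)) <= abstract(set(B))  # soundness: over-approximation
        print(abstract(set(B)))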

  3. Medical Applications of the PHITS Code (3): User Assistance Program for Medical Physics Computation.

    Science.gov (United States)

    Furuta, Takuya; Hashimoto, Shintaro; Sato, Tatsuhiko

    2016-01-01

    DICOM2PHITS and PSFC4PHITS are user assistance programs for medical physics PHITS applications. DICOM2PHITS is a program to construct the voxel PHITS simulation geometry from patient CT DICOM image data by using a conversion table from CT number to material composition. PSFC4PHITS is a program to convert the IAEA phase-space file data to PHITS format to be used as a simulation source of PHITS. Both of the programs are useful for users who want to apply PHITS simulation to verification of the treatment planning of radiation therapy. We are now developing a program to convert dose distribution obtained by PHITS to DICOM RT-dose format. We also want to develop a program which is able to implement treatment information included in other DICOM files (RT-plan and RT-structure) as a future plan.

  4. EAC european accident code. A modular system of computer programs to simulate LMFBR hypothetical accidents

    International Nuclear Information System (INIS)

    Wider, H.; Cametti, J.; Clusaz, A.; Devos, J.; VanGoethem, G.; Nguyen, H.; Sola, A.

    1985-01-01

    One aspect of fast reactor safety analysis consists of calculating the strongly coupled system of physical phenomena which contribute to the reactivity balance in hypothetical whole-core accidents: these phenomena are neutronics, fuel behaviour and heat transfer, together with coolant thermohydraulics in single- and two-phase flow. Temperature variations in fuel, coolant and neighbouring structures induce thermal reactivity feedbacks which are summed and fed into the neutronics calculation to predict the neutron flux and the subsequent heat generation in the reactor. At this point a whole-core analysis code is necessary to examine, for any hypothetical transient, whether the various feedbacks effectively result in a negative balance, which is the basic condition for ensuring stability and safety. The European Accident Code (EAC), developed at the Joint Research Centre of the CEC at Ispra (Italy), fulfils this objective. It is a modular software structure (quasi-2-D multichannel approach) aimed at collecting stand-alone computer codes for neutronics, fuel pin mechanics and hydrodynamics, developed both in national laboratories and in the JRC itself. EAC makes these modules interact with each other and produces results for these hypothetical accidents in terms of core damage and total energy release. 10 refs
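
    As a heavily simplified illustration of the feedback loop described above (EAC itself is a quasi-2-D multichannel code, not a point model), the toy sketch below couples point kinetics with one delayed-neutron group to a single lumped fuel temperature through a negative reactivity coefficient; all constants are invented for illustration only.

        # Toy illustration of the coupled feedback loop (point kinetics with one
        # delayed-neutron group and a single fuel-temperature coefficient); this
        # only sketches the sign convention of the reactivity balance.
        beta, Lam, lam = 0.0035, 4.0e-7, 0.08   # delayed fraction, gen. time, decay
        alpha_T = -1.0e-5        # negative Doppler/expansion feedback [1/K]
        rho_ext = 0.5 * beta     # external reactivity insertion (0.5 $)
        h, K, T0 = 50.0, 2.0, 800.0   # heating and cooling constants, nominal temp [K]

        n, C, T = 1.0, beta / (Lam * lam), T0   # steady-state initial conditions
        dt = 1.0e-5
        for step in range(int(2.0 / dt)):       # 2 s transient, explicit Euler
            rho = rho_ext + alpha_T * (T - T0)  # net reactivity balance
            dn = ((rho - beta) / Lam * n + lam * C) * dt
            dC = (beta / Lam * n - lam * C) * dt
            dT = (h * (n - 1.0) - K * (T - T0)) * dt
            n, C, T = n + dn, C + dC, T + dT

        print(f"power {n:.3f} x nominal, fuel temp {T:.1f} K, net rho {rho/beta:.3f} $")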

  5. Application programming interface document for the modernized Transient Reactor Analysis Code (TRAC-M)

    International Nuclear Information System (INIS)

    Mahaffy, J.; Boyack, B.E.; Steinke, R.G.

    1998-05-01

    The objective of this document is to ease the task of adding new system components to the Transient Reactor Analysis Code (TRAC) or altering old ones. Sufficient information is provided to permit replacement or modification of physical models and correlations. Within TRAC, information is passed at two levels. At the upper level, information is passed by system-wide and component-specific data modules at and above the level of component subroutines. At the lower level, information is passed through a combination of module-based data structures and argument lists. This document describes the basic mechanics involved in the flow of information within the code. The discussion of interfaces in the body of this document has been kept to a general level to highlight key considerations. The appendices cover instructions for obtaining a detailed list of variables used to communicate in each subprogram, definitions and locations of key variables, and proposed improvements to intercomponent interfaces that are not available in the first level of code modernization

  6. Overview of the U.S. DOE Hydrogen Safety, Codes and Standards Program. Part 4: Hydrogen Sensors; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, William J.; Rivkin, Carl; Burgess, Robert; Brosha, Eric; Mukundan, Rangachary; James, C. Will; Keller, Jay

    2016-12-01

    Hydrogen sensors are recognized as a critical element in the safety design for any hydrogen system. In this role, sensors can perform several important functions including indication of unintended hydrogen releases, activation of mitigation strategies to preclude the development of dangerous situations, activation of alarm systems and communication to first responders, and to initiate system shutdown. The functionality of hydrogen sensors in this capacity is decoupled from the system being monitored, thereby providing an independent safety component that is not affected by the system itself. The importance of hydrogen sensors has been recognized by DOE and by the Fuel Cell Technologies Office's Safety and Codes Standards (SCS) program in particular, which has for several years supported hydrogen safety sensor research and development. The SCS hydrogen sensor programs are currently led by the National Renewable Energy Laboratory, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory. The current SCS sensor program encompasses the full range of issues related to safety sensors, including development of advance sensor platforms with exemplary performance, development of sensor-related code and standards, outreach to stakeholders on the role sensors play in facilitating deployment, technology evaluation, and support on the proper selection and use of sensors.

  7. Two-Level Semantics and Code Generation

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis

    1988-01-01

    A two-level denotational metalanguage that is suitable for defining the semantics of Pascal-like languages is presented. The two levels allow for an explicit distinction between computations taking place at compile time and computations taking place at run time. While this distinction is perhaps not absolutely necessary for describing the input-output semantics of programming languages, it is necessary when issues such as data flow analysis and code generation are considered. For an example stack machine, the authors show how to generate code for the run-time computations and still perform the compile...
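
    A toy sketch of the compile-time/run-time split for an example stack machine: the "compile-time" function translates an arithmetic expression to stack-machine code, and the "run-time" function executes it. The instruction set is invented here and is not the metalanguage or machine of the paper.

        # Toy sketch of the compile-time / run-time distinction: at "compile time"
        # an arithmetic expression is translated to code for a small stack machine,
        # and at "run time" that code is executed; the instruction set is made up
        # purely for illustration.
        def compile_expr(expr):
            """expr is a nested tuple like ('+', ('num', 1), ('*', ('num', 2), ('num', 3)))."""
            op = expr[0]
            if op == "num":
                return [("PUSH", expr[1])]
            left, right = compile_expr(expr[1]), compile_expr(expr[2])
            return left + right + [("ADD",) if op == "+" else ("MUL",)]

        def run(code):
            stack = []
            for instr in code:
                if instr[0] == "PUSH":
                    stack.append(instr[1])
                elif instr[0] == "ADD":
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b)
                elif instr[0] == "MUL":
                    b, a = stack.pop(), stack.pop()
                    stack.append(a * b)
            return stack.pop()

        program = ("+", ("num", 1), ("*", ("num", 2), ("num", 3)))
        code = compile_expr(program)       # compile time
        print(code, "->", run(code))       # run time: 7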

  8. Using Program Theory-Driven Evaluation Science to Crack the Da Vinci Code

    Science.gov (United States)

    Donaldson, Stewart I.

    2005-01-01

    Program theory-driven evaluation science uses substantive knowledge, as opposed to method proclivities, to guide program evaluations. It aspires to update, clarify, simplify, and make more accessible the evolving theory of evaluation practice commonly referred to as theory-driven or theory-based evaluation. The evaluator in this chapter provides a…

  9. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)

  10. Development of Web-based Virtual Training Environment for Machining

    Science.gov (United States)

    Yang, Zhixin; Wong, S. F.

    2010-05-01

    With the boom in the manufacturing of shoes, garments, toys, etc. in the Pearl River Delta region, training in the usage of various facilities and in facility layout design has become crucial for the success of industrial companies. There is evidence that the use of virtual training may provide benefits in improving learning outcomes and reducing risk in the physical work environment. This paper proposes an advanced web-based training environment that demonstrates the usage of a CNC machine in terms of working conditions and parameter selection. The developed virtual environment provides training at junior and advanced levels. Junior-level training explains machining knowledge including safety factors and machine parameters (e.g., material, speed, feed rate). Advanced-level training enables interactive programming of NC code and simulation of its effect. An operation sequence is used to assist the user in choosing the appropriate machining conditions. Several case studies were also carried out with animations of milling and turning operations.

  11. SAFIPA-Meraka Institute code-sprints program; a mechanism to enhance the development capacity of emerging developers – observations and lessons learned

    CSIR Research Space (South Africa)

    Coetzee, L

    2010-05-01

    Full Text Available ...-MERAKA code-sprints program, a possibility for development is identified. NAP, an initiative to enhance inclusion and empower persons with disabilities, has shown that initiatives need to come from within the community to succeed. A popular slogan... ownership of development efforts. This paper investigates the feasibility of the "ICT for Society through Society" paradigm on the basis of the SAFIPA-MERAKA code-sprints program, an analysis of the Information and Communications Technology...

  12. ECHO: Machine feasibility program

    Science.gov (United States)

    Philip H. Steele; Craig Boden; Philip A. Araman

    2000-01-01

    Reductions in saw kerf (the term saw kerf refers to both the sawtooth width as well as the actual sawline made in sawing) on headrigs and resaws can dramatically increase lumber recovery. Research has also shown that lumber target size reductions are even more important than kerf reductions in providing increased lumber recovery. Decreases in either kerf or lumber size...

  13. COMSAT: Residue contact prediction of transmembrane proteins based on support vector machines and mixed integer linear programming.

    Science.gov (United States)

    Zhang, Huiling; Huang, Qingsheng; Bei, Zhendong; Wei, Yanjie; Floudas, Christodoulos A

    2016-03-01

    In this article, we present COMSAT, a hybrid framework for residue contact prediction of transmembrane (TM) proteins, integrating a support vector machine (SVM) method and a mixed integer linear programming (MILP) method. COMSAT consists of two modules: COMSAT_SVM which is trained mainly on position-specific scoring matrix features, and COMSAT_MILP which is an ab initio method based on optimization models. Contacts predicted by the SVM model are ranked by SVM confidence scores, and a threshold is trained to improve the reliability of the predicted contacts. For TM proteins with no contacts above the threshold, COMSAT_MILP is used. The proposed hybrid contact prediction scheme was tested on two independent TM protein sets based on the contact definition of 14 Å between Cα-Cα atoms. First, using a rigorous leave-one-protein-out cross validation on the training set of 90 TM proteins, an accuracy of 66.8%, a coverage of 12.3%, a specificity of 99.3% and a Matthews' correlation coefficient (MCC) of 0.184 were obtained for residue pairs that are at least six amino acids apart. Second, when tested on a test set of 87 TM proteins, the proposed method showed a prediction accuracy of 64.5%, a coverage of 5.3%, a specificity of 99.4% and a MCC of 0.106. COMSAT shows satisfactory results when compared with 12 other state-of-the-art predictors, and is more robust in terms of prediction accuracy as the length and complexity of TM protein increase. COMSAT is freely accessible at http://hpcc.siat.ac.cn/COMSAT/. © 2016 Wiley Periodicals, Inc.
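
    The reported accuracy, coverage, specificity and MCC are all confusion-matrix statistics; the small helper below shows the usual definitions (with "accuracy" taken as precision over predicted contacts, as is customary in contact prediction). The counts used are placeholders, not COMSAT's outputs.

        # Confusion-matrix statistics as commonly reported for contact prediction;
        # the counts passed in below are invented placeholders.
        import math

        def contact_metrics(tp, fp, tn, fn):
            accuracy    = tp / (tp + fp)              # precision over predicted contacts
            coverage    = tp / (tp + fn)              # recall over true contacts
            specificity = tn / (tn + fp)
            mcc = (tp * tn - fp * fn) / math.sqrt(
                (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
            return accuracy, coverage, specificity, mcc

        print([round(x, 3) for x in contact_metrics(tp=120, fp=60, tn=9000, fn=950)])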

  14. Comparison of Test Procedures and Energy Efficiency Criteria in Selected International Standards and Labeling Programs for Clothes Washers, Water Dispensers, Vending Machines and CFLs

    Energy Technology Data Exchange (ETDEWEB)

    Fridley, David; Zheng, Nina; Zhou, Nan

    2010-06-01

    Since the late 1970s, energy labeling programs and mandatory energy performance standards have been used in many different countries to improve the efficiency levels of major residential and commercial equipment. As more countries and regions launch programs covering a greater range of products that are traded worldwide, greater attention has been given to harmonizing the specific efficiency criteria in these programs and the test methods for measurements. For example, an international compact fluorescent light (CFL) harmonization initiative was launched in 2006 to focus on collaboration between Australia, China, Europe and North America. Given the long history of standards and labeling programs, most major energy-consuming residential appliances and commercial equipment are already covered under minimum energy performance standards (MEPS) and/or energy labels. For these products, such as clothes washers and CFLs, harmonization may still be possible when national MEPS or labeling thresholds are revised. Greater opportunity for harmonization exists in newer energy-consuming products that are not commonly regulated but are under consideration for new standards and labeling programs. This may include commercial products such as water dispensers and vending machines, which are only covered by MEPS or energy labels in a few countries or regions. As China continues to expand its appliance standards and labeling programs and revise existing standards and labels, it is important to learn from recent international experiences with efficiency criteria and test procedures for the same products. Specifically, various types of standards and labeling programs already exist in North America, Europe and throughout Asia for products in China's 2010 standards and labeling programs, namely clothes washers, water dispensers, vending machines and CFLs. This report thus examines similarities and critical differences in energy efficiency values, test procedure specifications and

  15. Integer-linear-programing optimization in scalable video multicast with adaptive modulation and coding in wireless networks.

    Science.gov (United States)

    Lee, Dongyul; Lee, Chaewoo

    2014-01-01

    The advancement in wideband wireless networks supports real-time services such as IPTV and live video streaming. However, because of the shared nature of the wireless medium, efficient resource allocation has been studied to achieve a high level of acceptability and proliferation of wireless multimedia. Scalable video coding (SVC) with adaptive modulation and coding (AMC) provides an excellent solution for wireless video streaming. By assigning different modulation and coding schemes (MCSs) to video layers, SVC can provide good video quality to users in good channel conditions and also basic video quality to users in bad channel conditions. For optimal resource allocation, a key issue in applying SVC in the wireless multicast service is how to assign MCSs and the time resources to each SVC layer in heterogeneous channel conditions. We formulate this problem with integer linear programming (ILP) and provide numerical results to show the performance in an 802.16m environment. The results show that our methodology enhances the overall system throughput compared to an existing algorithm.
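
    To make the ILP idea concrete, the sketch below builds a deliberately tiny assignment model with the PuLP library: one MCS per SVC layer under a frame-time budget, maximizing weighted (user, layer) coverage. The rates, channel qualities and weights are invented, and the inter-layer decoding dependency of SVC is ignored, so this is far simpler than the paper's actual formulation.

        # Toy ILP in the spirit of the formulation (all numbers invented, and the
        # SVC inter-layer decoding dependency ignored to keep the model tiny):
        # pick one MCS per video layer under a frame-time budget so as to maximize
        # weighted (user, layer) coverage.
        import pulp

        layers   = {0: 2.0, 1: 3.0, 2: 5.0}          # layer -> source rate [Mbit/s]
        mcs_rate = {0: 4.0, 1: 8.0, 2: 16.0}         # MCS -> channel rate [Mbit/s]
        user_best_mcs = {"A": 2, "B": 1, "C": 0}     # highest MCS each user decodes
        weight = {0: 3.0, 1: 2.0, 2: 1.0}            # base layer worth the most

        prob = pulp.LpProblem("svc_mcs_assignment", pulp.LpMaximize)
        x = pulp.LpVariable.dicts(
            "x", [(l, m) for l in layers for m in mcs_rate], cat="Binary")

        prob += pulp.lpSum(weight[l] * x[(l, m)]
                           for l in layers for m in mcs_rate
                           for u in user_best_mcs if user_best_mcs[u] >= m)

        for l in layers:                              # exactly one MCS per layer
            prob += pulp.lpSum(x[(l, m)] for m in mcs_rate) == 1
        prob += pulp.lpSum(layers[l] / mcs_rate[m] * x[(l, m)]   # airtime budget
                           for l in layers for m in mcs_rate) <= 1

        prob.solve(pulp.PULP_CBC_CMD(msg=0))
        for l in layers:
            chosen = [m for m in mcs_rate if x[(l, m)].value() > 0.5]
            print(f"layer {l} -> MCS {chosen[0]}")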

  16. Integer-Linear-Programing Optimization in Scalable Video Multicast with Adaptive Modulation and Coding in Wireless Networks

    Directory of Open Access Journals (Sweden)

    Dongyul Lee

    2014-01-01

    Full Text Available The advancement in wideband wireless networks supports real-time services such as IPTV and live video streaming. However, because of the shared nature of the wireless medium, efficient resource allocation has been studied to achieve a high level of acceptability and proliferation of wireless multimedia. Scalable video coding (SVC) with adaptive modulation and coding (AMC) provides an excellent solution for wireless video streaming. By assigning different modulation and coding schemes (MCSs) to video layers, SVC can provide good video quality to users in good channel conditions and also basic video quality to users in bad channel conditions. For optimal resource allocation, a key issue in applying SVC in the wireless multicast service is how to assign MCSs and the time resources to each SVC layer in heterogeneous channel conditions. We formulate this problem with integer linear programming (ILP) and provide numerical results to show the performance in an 802.16m environment. The results show that our methodology enhances the overall system throughput compared to an existing algorithm.

  17. Preliminary design and manufacturing feasibility study for a machined Zircaloy triangular pitch fuel rod support system (grids) (AWBA development program)

    International Nuclear Information System (INIS)

    Horwood, W.A.

    1981-07-01

    General design features and manufacturing operations for a high precision machined Zircaloy fuel rod support grid intended for use in advanced light water prebreeder or breeder reactor designs are described. The grid system consists of a Zircaloy main body with fuel rod and guide tube cells machined using wire EDM, a separate AM-350 stainless steel insert spring which fits into a full length T-slot in each fuel rod cell, and a thin (0.025'' or 0.040'' thick) wire EDM machined Zircaloy coverplate laser welded to each side of the grid body to retain the insert springs. The fuel rods are placed in a triangular pitch array with a tight rod-to-rod spacing of 0.063 inch nominal. Two dimples are positioned at the mid-thickness of the grid (single level) with a 90° included angle. Data is provided on the effectiveness of the manufacturing operations chosen for grid machining and assembly

  18. DOE FreedomCAR and vehicle technologies program advanced power electronic and electrical machines annual review report

    Energy Technology Data Exchange (ETDEWEB)

    Olszewski, Mitch [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2006-10-11

    This report is a summary of the Review Panel at the FY06 DOE FreedomCAR and Vehicle Technologies (FCVT) Annual Review of Advanced Power Electronics and Electric Machine (APEEM) research activities held on August 15-17, 2006.

  19. Code Maintenance and Design for a Visual Programming Language Graphical User Interface

    National Research Council Canada - National Science Library

    Pierson, Graham

    2004-01-01

    This work adds new functionality to an existing visual programming environment. It applies software maintenance techniques for use with the Java Language in a Microsoft Windows operating system environment...

  20. International training program: 3D S.UN.COP - Scaling, uncertainty and 3D thermal-hydraulics/neutron-kinetics coupled codes seminar

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Bajs, T.; Reventos, F.

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for the analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best-estimate codes. In other words, the program aims at contributing towards solving the problem of the user effect. The 3D S.UN.COP 2005 (Scaling, Uncertainty and 3D COuPled code calculations) seminar was organized by the University of Pisa and the University of Zagreb as a follow-up to the proposal to the IAEA for a Permanent Training Course for System Code Users (D'Auria, 1998). It was recognized that such a course represented both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The seminar-training was successfully held with the participation of 19 persons coming from 9 countries and 14 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 15 scientists were involved in the organization of the seminar, presenting theoretical aspects of the proposed methodologies and holding the training and the final examination. A certificate (LA Code User grade) was released

  1. A program code generator for multiphysics biological simulation using markup languages.

    Science.gov (United States)

    Amano, Akira; Kawabata, Masanari; Yamashita, Yoshiharu; Rusty Punzalan, Florencio; Shimayoshi, Takao; Kuwabara, Hiroaki; Kunieda, Yoshitoshi

    2012-01-01

    To cope with the complexity of biological function simulation models, model representation with description languages is becoming popular. However, the simulation software itself becomes complex in these environments, and thus it is difficult to modify the simulation conditions, the target computation resources, or the calculation methods. Complex biological function simulation software involves 1) model equations, 2) boundary conditions and 3) calculation schemes. Using a model description file helps with the first point and partly with the second, but the third point is difficult to handle, since various calculation schemes are required for simulation models constructed from two or more elementary models. We introduce a simulation software generation system which uses a description-language-based specification of the coupling calculation scheme together with the cell model description file. Using this software, we can easily generate biological simulation code with a variety of coupling calculation schemes. To show the efficiency of our system, an example of a coupling calculation scheme with three elementary models is shown.
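
    A minimal sketch of the general idea of generating simulation code from a markup description (not the authors' generator, and with an invented markup schema): read a toy XML model and emit Python source for one explicit-Euler step.

        # Minimal sketch of the general idea (not the authors' generator): read a toy
        # XML model description and emit Python source for one explicit-Euler step;
        # the markup schema below is invented for illustration.
        import xml.etree.ElementTree as ET

        MODEL_XML = """
        <model name="decay_pair">
          <variable name="a" initial="1.0" rhs="-0.3 * a"/>
          <variable name="b" initial="0.0" rhs="0.3 * a - 0.1 * b"/>
        </model>
        """

        def generate_step_function(xml_text):
            root = ET.fromstring(xml_text.strip())
            names = [v.get("name") for v in root.findall("variable")]
            lines = [f"def step({', '.join(names)}, dt):"]
            for v in root.findall("variable"):
                lines.append(f"    d_{v.get('name')} = ({v.get('rhs')}) * dt")
            updated = ", ".join(f"{n} + d_{n}" for n in names)
            lines.append(f"    return {updated}")
            return "\n".join(lines)

        source = generate_step_function(MODEL_XML)
        print(source)                       # the generated simulation code
        namespace = {}
        exec(source, namespace)             # compile the generated code
        state = (1.0, 0.0)
        for _ in range(10):
            state = namespace["step"](*state, dt=0.1)
        print("state after 10 steps:", state)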

  2. Sustainable machining

    CERN Document Server

    2017-01-01

    This book provides an overview on current sustainable machining. Its chapters cover the concept in economic, social and environmental dimensions. It provides the reader with proper ways to handle several pollutants produced during the machining process. The book is useful on both undergraduate and postgraduate levels and it is of interest to all those working with manufacturing and machining technology.

  3. Annual Coded Wire Tag Program; Oregon Missing Production Groups, 1994 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Garrison, Robert L.; Isaac, Dennis L.; Lewis, Mark A.

    1994-12-01

    The goal of this program is to develop the ability to estimate hatchery production survival values and evaluate the effectiveness of Oregon hatcheries. To accomplish this goal, we are tagging missing production groups within hatcheries to ensure that each production group is identifiable, allowing future evaluation upon recovery of tag data.

  4. CodeMaster--Automatic Assessment and Grading of App Inventor and Snap! Programs

    Science.gov (United States)

    von Wangenheim, Christiane Gresse; Hauck, Jean C. R.; Demetrio, Matheus Faustino; Pelle, Rafael; da Cruz Alves, Nathalia; Barbosa, Heliziane; Azevedo, Luiz Felipe

    2018-01-01

    The development of computational thinking is a major topic in K-12 education. Many of these experiences focus on teaching programming using block-based languages. As part of these activities, it is important for students to receive feedback on their assignments. Yet, in practice it may be difficult to provide personalized, objective and consistent…

  5. The Coded Schoolhouse: One-to-One Tablet Computer Programs and Urban Education

    Science.gov (United States)

    Crooks, Roderic N.

    2016-01-01

    Using a South Los Angeles charter school of approximately 650 students operated by a non-profit charter management organization (CMO) as the primary field site, this two-year, ethnographic research project examines the implementation of a one-to-one tablet computer program in a public high school. This dissertation examines the variety of ways…

  6. Precision machining commercialization

    International Nuclear Information System (INIS)

    1978-01-01

    To accelerate precision machining development so as to realize more of the potential savings within the next few years of known Department of Defense (DOD) part procurement, the Air Force Materials Laboratory (AFML) is sponsoring the Precision Machining Commercialization Project (PMC). PMC is part of the Tri-Service Precision Machine Tool Program of the DOD Manufacturing Technology Five-Year Plan. The technical resources supporting PMC are provided under sponsorship of the Department of Energy (DOE). The goal of PMC is to minimize precision machining development time and cost risk for interested vendors. PMC will do this by making available the high precision machining technology as developed in two DOE contractor facilities, the Lawrence Livermore Laboratory of the University of California and the Union Carbide Corporation, Nuclear Division, Y-12 Plant, at Oak Ridge, Tennessee

  7. CDC 7600 LTSS programming stratagens: preparing your first production code for the Livermore Timesharing System

    International Nuclear Information System (INIS)

    Fong, K.W.

    1977-01-01

    This report deals with some techniques in applied programming using the Livermore Timesharing System (LTSS) on the CDC 7600 computers at the National Magnetic Fusion Energy Computer Center (NMFECC) and the Lawrence Livermore Laboratory Computer Center (LLLCC or Octopus network). This report is based on a document originally written specifically about the system as it is implemented at NMFECC but has been revised to accommodate differences between LLLCC and NMFECC implementations. Topics include: maintaining programs, debugging, recovering from system crashes, and using the central processing unit, memory, and input/output devices efficiently and economically. Routines that aid in these procedures are mentioned. The companion report, UCID-17556, An LTSS Compendium, discusses the hardware and operating system and should be read before reading this report

  8. Designing machines for lattice physics and algorithm investigation

    International Nuclear Information System (INIS)

    Fischler, M.; Atac, R.; Cook, A.

    1989-10-01

    Special-purpose computers are appropriate tools for the study of lattice gauge theory. While these machines deliver considerable processing power, it is also important to be able to program complex physics ideas and investigate algorithms on them. We examine features that facilitate coding of physics problems, and flexibility in algorithms. Appropriate balances among power, memory, communications and I/O capabilities are presented. 10 refs

  9. VMIL 2011 : the 5th Workshop on Virtual Machines and Intermediate Languages

    NARCIS (Netherlands)

    Rajan, Hridesh; Hauptmann, Michael; Bockisch, Christoph; Dyer, Robert

    2011-01-01

    The VMIL workshop is a forum for research in virtual machines and intermediate languages. It is dedicated to identifying programming mechanisms and constructs that are currently realized as code transformations or implemented in libraries but should rather be supported at VM level. Candidates for

  10. 6th Workshop on Virtual Machines and Intermediate Languages (VMIL’12)

    NARCIS (Netherlands)

    Rajan, Hridesh; Hauptmann, Michael; Bockisch, Christoph; Blackburn, Steve

    2012-01-01

    The VMIL workshop is a forum for research in virtual machines and intermediate languages. It is dedicated to identifying programming mechanisms and constructs that are currently realized as code transformations or implemented in libraries but should rather be supported at VM level. Candidates for

  11. The GEM code. A simulation program for the evaporation and the fission process of an excited nucleus

    Energy Technology Data Exchange (ETDEWEB)

    Furihata, Shiori [Mitsubishi Research Institute Inc., Tokyo (Japan); Niita, Koji [Research Organization for Information Science and Technology, Tokai, Ibaraki (Japan); Meigo, Shin-ichiro; Ikeda, Yujiro; Maekawa, Fujio [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-03-01

    The GEM code is a simulation program which describes the de-excitation process of an excited nucleus; it is based on the Generalized Evaporation Model and the Atchison fission model. It has been shown that the combination of the Bertini intranuclear cascade model and GEM accurately predicts the cross sections of light fragments, such as Be, produced from proton-induced reactions. It has also been shown that the use of the reevaluated parameters in the Atchison model improves predictions of cross sections of fission fragments produced from the proton-induced reaction on Au. In this report, we present details and the usage of the GEM code. Furthermore, the results of benchmark calculations are shown using the combination of the Bertini intranuclear cascade model and the GEM code (INC/GEM). Neutron spectra and isotope production cross sections from the reactions on various targets irradiated by protons are calculated with INC/GEM. Those results are compared with experimental data as well as with the calculation results of LAHET. INC/GEM reproduces the experimental double-differential neutron emissions from the reactions on Al and Pb. The isotopic distributions for He, Li, and Be produced from the reaction on Ag are in good agreement with experimental data within 50%, although INC/GEM underestimates those of nuclei heavier than O. It is also shown that the predictions with INC/GEM for isotope production of light fragments, such as Li and Be, are better than those calculated with LAHET, particularly for heavy targets. INC/GEM also gives better estimates of the cross sections of fission products than LAHET. (author)

  12. Machine musicianship

    Science.gov (United States)

    Rowe, Robert

    2002-05-01

    The training of musicians begins by teaching basic musical concepts, a collection of knowledge commonly known as musicianship. Computer programs designed to implement musical skills (e.g., to make sense of what they hear, perform music expressively, or compose convincing pieces) can similarly benefit from access to a fundamental level of musicianship. Recent research in music cognition, artificial intelligence, and music theory has produced a repertoire of techniques that can make the behavior of computer programs more musical. Many of these were presented in a recently published book/CD-ROM entitled Machine Musicianship. For use in interactive music systems, we are interested in those which are fast enough to run in real time and that need only make reference to the material as it appears in sequence. This talk will review several applications that are able to identify the tonal center of musical material during performance. Beyond this specific task, the design of real-time algorithmic listening through the concurrent operation of several connected analyzers is examined. The presentation includes discussion of a library of C++ objects that can be combined to perform interactive listening and a demonstration of their capability.

  13. Melt/concrete interactions: the Sandia experimental program, model development, and code comparison test

    International Nuclear Information System (INIS)

    Powers, D.A.; Muir, J.F.

    1979-01-01

    High temperature melt/concrete interactions have been studied both experimentally and analytically at Sandia under sponsorship of Reactor Safety Research of the US Nuclear Regulatory Commission. The purpose of these studies has been to develop an understanding of these interactions suitable for risk assessment. Results of the experimental program are summarized and a computer model of melt/concrete interactions is described. A melt/concrete interaction test that will allow this and other models of the interaction to be compared is also described

  14. Tool set for distributed real-time machine control

    Science.gov (United States)

    Carrott, Andrew J.; Wright, Christopher D.; West, Andrew A.; Harrison, Robert; Weston, Richard H.

    1997-01-01

    Demands for increased control capabilities require next generation manufacturing machines to comprise intelligent building elements, physically located at the point where the control functionality is required. Networks of modular intelligent controllers are increasingly designed into manufacturing machines and usable standards are slowly emerging. To implement a control system using off-the-shelf intelligent devices from multi-vendor sources requires a number of well defined activities, including (a) the specification and selection of interoperable control system components, (b) device independent application programming and (c) device configuration, management, monitoring and control. This paper briefly discusses the support for the above machine lifecycle activities through the development of an integrated computing environment populated with an extendable software toolset. The toolset supports machine builder activities such as initial control logic specification, logic analysis, machine modeling, mechanical verification, application programming, automatic code generation, simulation/test, version control, distributed run-time support and documentation. The environment itself consists of system management tools and a distributed object-oriented database which provides storage for the outputs from machine lifecycle activities and specific target control solutions.

  15. Machine Learning for Hackers

    CERN Document Server

    Conway, Drew

    2012-01-01

    If you're an experienced programmer interested in crunching data, this book will get you started with machine learning-a toolkit of algorithms that enables computers to train themselves to automate useful tasks. Authors Drew Conway and John Myles White help you understand machine learning and statistics tools through a series of hands-on case studies, instead of a traditional math-heavy presentation. Each chapter focuses on a specific problem in machine learning, such as classification, prediction, optimization, and recommendation. Using the R programming language, you'll learn how to analyz

  16. A Turing Machine Simulator.

    Science.gov (United States)

    Navarro, Aaron B.

    1981-01-01

    Presents a program in Level II BASIC for a TRS-80 computer that simulates a Turing machine and discusses the nature of the device. The program is run interactively and is designed to be used as an educational tool by computer science or mathematics students studying computational or automata theory. (MP)
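
    As a compact illustration of what such a simulator does (a transition table driving a tape, a head and a state), here is a minimal sketch in Python rather than Level II BASIC; the rule format and the example machine are invented for illustration and are not taken from the original program.

        # Minimal Turing machine simulator: a transition table drives a tape
        # (a dict of cell positions -> symbols), a head position, and a state.
        def run_turing_machine(transitions, tape, state="q0", blank="_", max_steps=10_000):
            cells = dict(enumerate(tape))          # position -> symbol
            head = 0
            for _ in range(max_steps):
                symbol = cells.get(head, blank)
                rule = transitions.get((state, symbol))
                if rule is None:                   # no rule: halting configuration
                    break
                state, write, move = rule          # rule: (new state, symbol to write, L/R)
                cells[head] = write
                head += 1 if move == "R" else -1
            return "".join(cells[i] for i in sorted(cells))

        # Example machine: flip every bit, halt on the first blank cell.
        flip = {("q0", "0"): ("q0", "1", "R"),
                ("q0", "1"): ("q0", "0", "R")}
        print(run_turing_machine(flip, "1011"))    # -> "0100"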

  17. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of a network coding approach focused on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  18. Conceptual Design of Object Oriented Program (OOP) for Pilot Code of Two-Fluid, Three-field Model with C++ 6.0

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Young Jin

    2006-01-01

    Engineering software for design purposes in the nuclear industry has been developed since the early 1970s and was well established by the 1980s. The most popular and common language for such software has been the FORTRAN series, until more sophisticated GUIs and software coupling were needed. Advanced computer languages such as C++ and C were developed to ease GUI programming and the reuse of well-developed routines by adopting object-oriented programming. A recent trend in programming is toward object orientation, since the results are often more intuitive and easier to maintain than procedural programs. The main motivation of this work is to capture object-oriented concepts for conventional safety analysis programs, which consist of many functions and procedure-oriented structures. In this work, object-oriented programming with the C++ 6.0 language has been tried for the PILOT code written in FORTRAN, and a conceptual OOP design of the system safety analysis code has been carried out.

  19. Establishing a communications link between two different, incompatible, personal computers: with practical examples and illustrations and program code.

    Science.gov (United States)

    Davidson, R W

    1985-01-01

    The increasing need to exchange data can be handled by personal microcomputers. The necessity of transferring information stored in one type of personal computer to another, incompatible type is often encountered when integrating multiple sources of information in medical research and practice. A practical example is demonstrated with two relatively inexpensive, commonly used computers, the IBM PC jr. and the Apple IIe. The basic input/output (I/O) serial-communication interface chips of the two computers are joined together using a null-modem connector and cable to form a communications link. Using the BASIC (Beginner's All-purpose Symbolic Instruction Code) computer language and the Disk Operating System (DOS), the communications handshaking protocol and file transfer are established between the two computers. The BASIC programming languages used are Applesoft (Apple personal computer) and PC BASIC (IBM personal computer).
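
    The article's listings are in Applesoft and PC BASIC against the machines' serial I/O hardware; purely as a present-day sketch of the same null-modem idea, the fragment below uses the pyserial package as a modern stand-in. The port name, baud rate, chunk size and ACK byte are assumptions for illustration, not values from the paper.

        import serial  # pyserial, a modern stand-in for direct serial-chip programming

        # Open the port wired through a null-modem cable to the other machine.
        # Port name and line settings are illustrative; match them to the real link.
        port = serial.Serial("/dev/ttyS0", baudrate=9600, bytesize=8,
                             parity=serial.PARITY_NONE, stopbits=1, timeout=2)

        def send_file(path, chunk_size=128):
            """Send a file in fixed-size chunks, waiting for a one-byte ACK per chunk."""
            with open(path, "rb") as f:
                while chunk := f.read(chunk_size):
                    port.write(chunk)
                    if port.read(1) != b"\x06":   # expect an ASCII ACK from the receiver
                        raise IOError("receiver did not acknowledge the chunk")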

  20. Towards Compatible and Interderivable Semantic Specifications for the Scheme Programming Language, Part II: Reduction Semantics and Abstract Machines

    DEFF Research Database (Denmark)

    Biernacka, Malgorzata; Danvy, Olivier

    2009-01-01

    We present a context-sensitive reduction semantics for a lambda-calculus with explicit substitutions and we show that the functional implementation of this small-step semantics mechanically corresponds to that of the abstract machine for Core Scheme presented by Clinger at PLDI'98, including fir...

  1. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  2. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  3. Description of the TREBIL, CRESSEX and STREUSL computer programs, that belongs to RALLY computer code pack for the analysis of reliability systems

    International Nuclear Information System (INIS)

    Fernandes Filho, T.L.

    1982-11-01

    The RALLY computer code pack (RALLY pack) is a set of computer codes intended for the reliability analysis of complex systems, aiming at risk analysis. Three of the six codes are described, presenting their purpose, input description, calculation methods and the results obtained with each of them. The computer codes are: TREBIL, to obtain the logical equivalent of the fault tree; CRESSEX, to obtain the minimal cut sets and the point values of the unreliability and unavailability of the system; and STREUSL, to calculate the dispersion of those values around the mean. Although CRESSEX, in the version available at CNEN, uses a somewhat lengthy method to obtain the minimal cut sets on the HB-CNEN system, the three computer programs show good results, especially STREUSL, which permits the simulation of various components. (E.G.)
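
    As a toy illustration of the kind of quantities these codes produce (minimal cut sets of a fault tree and a point value for the top-event unavailability), here is a small sketch; it is not the TREBIL/CRESSEX algorithm, and the gate structure and failure probabilities below are invented.

        import math
        from itertools import product

        # A node is ("basic", name) or (gate, [children]) with gate "AND" / "OR".
        def minimal_cut_sets(node):
            if node[0] == "basic":
                return [frozenset([node[1]])]
            child_sets = [minimal_cut_sets(c) for c in node[1]]
            if node[0] == "OR":      # OR gate: union of the children's cut sets
                sets = [cs for child in child_sets for cs in child]
            else:                    # AND gate: combine one cut set from each child
                sets = [frozenset().union(*combo) for combo in product(*child_sets)]
            # keep only minimal sets (drop any strict superset of another)
            return [s for s in sets if not any(t < s for t in sets)]

        def point_unavailability(cut_sets, q):
            """Rare-event approximation: sum over cut sets of the product of
            the component unavailabilities q[name]."""
            return sum(math.prod(q[name] for name in cs) for cs in cut_sets)

        # Invented example: TOP fails if A and B fail together, or if C fails.
        top = ("OR", [("AND", [("basic", "A"), ("basic", "B")]), ("basic", "C")])
        cuts = minimal_cut_sets(top)                         # [{A, B}, {C}]
        print(point_unavailability(cuts, {"A": 1e-2, "B": 2e-2, "C": 1e-3}))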

  4. Development of a simple detector response function generation program: The CEARDRFs code

    Energy Technology Data Exchange (ETDEWEB)

    Wang Jiaxin, E-mail: jwang3@ncsu.edu [Center for Engineering Applications of Radioisotopes (CEAR), Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Wang Zhijian; Peeples, Johanna [Center for Engineering Applications of Radioisotopes (CEAR), Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Yu Huawei [Center for Engineering Applications of Radioisotopes (CEAR), Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); College of Geo-Resources and Information, China University of Petroleum, Qingdao, Shandong 266555 (China); Gardner, Robin P. [Center for Engineering Applications of Radioisotopes (CEAR), Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States)

    2012-07-15

    A simple Monte Carlo program named CEARDRFs has been developed to generate very accurate detector response functions (DRFs) for scintillation detectors. It utilizes relatively rigorous gamma-ray transport with simple electron transport, and accounts for two phenomena that have rarely been treated: scintillator non-linearity and the variable flat continuum part of the DRF. It has been proven that these physics and treatments work well for 3×3″ and 6×6″ cylindrical NaI detectors in CEAR's previous work. Now this approach has been expanded to cover more scintillation detectors with various common shapes and sizes. Benchmark experiments of a 2×2″ cylindrical BGO detector and a 2×4×16″ rectangular NaI detector have been carried out at CEAR with various radioactive sources. The simulation results of CEARDRFs have also been compared with MCNP5 calculations. The benchmark and comparison show that CEARDRFs can generate very accurate DRFs (more accurate than MCNP5) at a very fast speed (hundreds of times faster than MCNP5). The use of this program can significantly increase the accuracy of applications relying on detector spectroscopy like prompt gamma-ray neutron activation analysis, X-ray fluorescence analysis, oil well logging and homeland security. - Highlights: ► CEARDRFs has been developed to generate detector response functions (DRFs) for scintillation detectors. ► Generated DRFs are very accurate. ► Simulation speed is hundreds of times faster than MCNP5. ► It utilizes rigorous gamma-ray transport with simple electron transport. ► It also accounts for scintillator non-linearity and the variable flat continuum part.
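
    As a much-simplified picture of what a scintillator detector response function contains (the Gaussian photopeak plus the flat continuum mentioned above), here is a toy sketch; the resolution and continuum fraction are made-up illustration values, and none of this reproduces the CEARDRFs physics.

        import numpy as np

        def toy_drf(energies_keV, e0_keV, fwhm_frac=0.08, continuum_frac=0.3):
            """Toy response to a monoenergetic gamma ray at e0_keV: a Gaussian
            photopeak plus a flat continuum extending below the peak energy."""
            sigma = fwhm_frac * e0_keV / 2.355              # FWHM = 2.355 * sigma
            peak = np.exp(-0.5 * ((energies_keV - e0_keV) / sigma) ** 2)
            peak /= peak.sum()
            continuum = np.where(energies_keV < e0_keV, 1.0, 0.0)
            continuum /= max(continuum.sum(), 1.0)
            return (1.0 - continuum_frac) * peak + continuum_frac * continuum

        energies = np.arange(0.0, 800.0, 1.0)
        response_662 = toy_drf(energies, 662.0)             # response to 662 keV gammas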

  5. Development of a simple detector response function generation program: The CEARDRFs code

    International Nuclear Information System (INIS)

    Wang Jiaxin; Wang Zhijian; Peeples, Johanna; Yu Huawei; Gardner, Robin P.

    2012-01-01

    A simple Monte Carlo program named CEARDRFs has been developed to generate very accurate detector response functions (DRFs) for scintillation detectors. It utilizes relatively rigorous gamma-ray transport with simple electron transport, and accounts for two phenomena that have rarely been treated: scintillator non-linearity and the variable flat continuum part of the DRF. It has been proven that these physics and treatments work well for 3×3″ and 6×6″ cylindrical NaI detectors in CEAR's previous work. Now this approach has been expanded to cover more scintillation detectors with various common shapes and sizes. Benchmark experiments of a 2×2″ cylindrical BGO detector and a 2×4×16″ rectangular NaI detector have been carried out at CEAR with various radioactive sources. The simulation results of CEARDRFs have also been compared with MCNP5 calculations. The benchmark and comparison show that CEARDRFs can generate very accurate DRFs (more accurate than MCNP5) at a very fast speed (hundreds of times faster than MCNP5). The use of this program can significantly increase the accuracy of applications relying on detector spectroscopy like prompt gamma-ray neutron activation analysis, X-ray fluorescence analysis, oil well logging and homeland security. - Highlights: ► CEARDRFs has been developed to generate detector response functions (DRFs) for scintillation detectors. ► Generated DRFs are very accurate. ► Simulation speed is hundreds of times faster than MCNP5. ► It utilizes rigorous gamma-ray transport with simple electron transport. ► It also accounts for scintillator non-linearity and the variable flat continuum part.

  6. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    A computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other is a mix of computer programming syntax and human language. In this sense queer code can ... be understood as both an object and subject of study that intervenes in the world’s ‘becoming’ and how material bodies are produced via human and nonhuman practices. Through mixing natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  7. User's guide to GASPAR code (a computer program for calculating radiation exposure to man from routine air releases of nuclear reactor effluents). Technical report

    International Nuclear Information System (INIS)

    Eckerman, K.F.; Congel, F.J.; Roecklein, A.K.; Pasciak, W.J.

    1980-06-01

    The document is a user's guide for the GASPAR code, a computer program written for the evaluation of radiological impacts due to the release of radioactive material to the atmosphere during normal operation of light water reactors. The GASPAR code implements the radiological impact models of NRC Regulatory Guide 1.109, Revision 1, for atmospheric releases. The code is currently used by NRC in reactor licensing evaluations to estimate (1) the collective or population dose to the population within a 50-mile radius of a facility, (2) the total collective dose to the U.S. population, and (3) the maximum individual doses at selected locations in the vicinity of the plant

  8. Training program for energy conservation in new building construction. Volume III. Energy conservation technology for plan examiners and code administrators. Energy Conservation Technology Series 200

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    Under the sponsorship of the United States Department of Energy, a Model Code for Energy Conservation in New Building Construction has been developed by those national organizations primarily concerned with the development and promulgation of model codes. The technical provisions are based on ASHRAE Standard 90-75 and are intended for use by state and local officials. The subject of regulation of new building construction to assure energy conservation is recognized as one in which code officials have not had previous exposure. It was also determined that application of the model code would be made at varying levels by officials with both a specific requirement for knowledge and a differing degree of prior training in the state-of-the-art. Therefore, a training program and instructional materials were developed for code officials to assist them in the implementation and enforcement of energy-efficient standards and codes. The training program for Energy Conservation Technology for Plan Examiners and Code Administrators (ECT Series 200) is presented.

  9. Qualification testing program plan for SIMMER. A computer code for LMFBR disrupted core analysis

    International Nuclear Information System (INIS)

    Basdekas, D.L.; Silberberg, M.; Curtis, R.T.; Kelber, C.N.

    1978-07-01

    The objective of SIMMER qualification testing program is to assure that the mathematical models and input parameters are derived from experimental data, which, on the basis of criteria still to be established, are representative of the phenomena and processes governing the progression of a CDA in an LMFBR. At the present time, the work to meet this objective can be classified into three general task areas as they relate to the use of SIMMER in CDA analysis: (1) The whole-core energetic disassembly accident, or the ''vessel problem'': The objective here is to predict the partition of the total energy release, by a postulated severe power excursion, between the primary containment and the energy absorbed through nondestructive dissipative processes. (2) Single subassembly accident: The objective here is to determine the pertinent phenomena and to develop the capability to assess the significance and likelihood that such accidents might propagate to involvement of larger fraction of the core. (3) The whole-core transition phase accident: The objective here is to advance the understanding of the phenomena and processes involved, so that reliable predictions can be made of the possible consequences of a CDA and the potential for further nuclear excursions through recriticality

  10. Simple machines

    CERN Document Server

    Graybill, George

    2007-01-01

    Just how simple are simple machines? With our ready-to-use resource, they are simple to teach and easy to learn! Chock-full of information and activities, we begin with a look at force, motion and work, and examples of simple machines in daily life are given. With this background, we move on to different kinds of simple machines including: Levers, Inclined Planes, Wedges, Screws, Pulleys, and Wheels and Axles. An exploration of some compound machines follows, such as the can opener. Our resource is a real time-saver, as all the reading passages and student activities are provided. Presented in s

  11. Comparison of Test Procedures and Energy Efficiency Criteria in Selected International Standards & Labeling Programs for Copy Machines, External Power Supplies, LED Displays, Residential Gas Cooktops and Televisions

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Nina [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Zhou, Nan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fridley, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-03-01

    This report presents a technical review of international minimum energy performance standards (MEPS), voluntary and mandatory energy efficiency labels and test procedures for five products being considered for new or revised MEPS in China: copy machines, external power supply, LED displays, residential gas cooktops and flat-screen televisions. For each product, an overview of the scope of existing international standards and labeling programs, energy values and energy performance metrics and description and detailed summary table of criteria and procedures in major test standards are presented.

  12. Gamma ray shielding study of barium-bismuth-borosilicate glasses as transparent shielding materials using MCNP-4C code, XCOM program, and available experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Bagheri, Reza; Yousefinia, Hassan [Nuclear Fuel Cycle Research School (NFCRS), Nuclear Science and Technology Research Institute (NSTRI), Atomic Energy Organization of Iran, Tehran (Iran, Islamic Republic of); Moghaddam, Alireza Khorrami [Radiology Department, Paramedical Faculty, Mazandaran University of Medical Sciences, Sari (Iran, Islamic Republic of)

    2017-02-15

    In this work, linear and mass attenuation coefficients, effective atomic number and electron density, mean free paths, and half value layer and 10th value layer values of barium-bismuth-borosilicate glasses were obtained for 662 keV, 1,173 keV, and 1,332 keV gamma ray energies using MCNP-4C code and XCOM program. Then obtained data were compared with available experimental data. The MCNP-4C code and XCOM program results were in good agreement with the experimental data. Barium-bismuth-borosilicate glasses have good gamma ray shielding properties from the shielding point of view.
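
    The half-value layer, tenth-value layer and mean free path quoted in such studies all follow directly from the linear attenuation coefficient μ at a given energy; the sketch below only encodes those standard textbook relations, and the example μ value is a placeholder rather than a result from this work.

        import math

        def shielding_parameters(mu_per_cm):
            """Standard relations for a linear attenuation coefficient mu (1/cm):
            HVL = ln(2)/mu, TVL = ln(10)/mu, mean free path = 1/mu."""
            return {"HVL_cm": math.log(2.0) / mu_per_cm,
                    "TVL_cm": math.log(10.0) / mu_per_cm,
                    "MFP_cm": 1.0 / mu_per_cm}

        # Placeholder coefficient for illustration only, not a measured value.
        print(shielding_parameters(mu_per_cm=0.5))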

  13. Gamma Ray Shielding Study of Barium–Bismuth–Borosilicate Glasses as Transparent Shielding Materials using MCNP-4C Code, XCOM Program, and Available Experimental Data

    Directory of Open Access Journals (Sweden)

    Reza Bagheri

    2017-02-01

    Full Text Available In this work, linear and mass attenuation coefficients, effective atomic number and electron density, mean free paths, and half value layer and 10th value layer values of barium–bismuth–borosilicate glasses were obtained for 662 keV, 1,173 keV, and 1,332 keV gamma ray energies using MCNP-4C code and XCOM program. Then obtained data were compared with available experimental data. The MCNP-4C code and XCOM program results were in good agreement with the experimental data. Barium–bismuth–borosilicate glasses have good gamma ray shielding properties from the shielding point of view.

  14. International training program in support of safety analysis. 3D S.UN.COP-scaling uncertainty and 3D thermal-hydraulics/neutron-kinetics coupled codes seminars

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Bajs, Tomislav; Reventos, Francesc; Hassan, Yassin

    2007-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up of the proposal to IAEA for the Permanent Training Course for System Code Users. Six seminars have been held at University of Pisa (2003, 2004), at The Pennsylvania State University (2004), at University of Zagreb (2005), at the School of Industrial Engineering of Barcelona (January-February 2006) and in Buenos Aires, Argentina (October 2006), the last one having been requested by ARN (Autoridad Regulatoria Nuclear), NA-SA (Nucleoelectrica Argentina S.A.) and CNEA (Comision Nacional de Energia Atomica). It was recognized that such courses represented both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2006 in Barcelona was successfully held with the attendance of 33

  15. International Training Program in Support of Safety Analysis: 3D S.UN.COP-Scaling, Uncertainty and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes Seminars

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Bajs, Tomislav; Reventos, Francesc

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up of the proposal to IAEA for the Permanent Training Course for System Code Users [1]. Five seminars have been held at University of Pisa (2003, 2004), at The Pennsylvania State University (2004), at University of Zagreb (2005) and at the School of Industrial Engineering of Barcelona (2006). It was recognized that such courses represented both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2006 was successfully held with the attendance of 33 participants coming from 18 countries and 28 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 30 scientists (coming from 13 countries and 23 different institutions) were

  16. Quantum Virtual Machine (QVM)

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    There is a lack of state-of-the-art HPC simulation tools for simulating general quantum computing. Furthermore, there are no real software tools that integrate current quantum computers into existing classical HPC workflows. This product, the Quantum Virtual Machine (QVM), solves this problem by providing an extensible framework for pluggable virtual, or physical, quantum processing units (QPUs). It enables the execution of low level quantum assembly codes and returns the results of such executions.

  17. Face machines

    Energy Technology Data Exchange (ETDEWEB)

    Hindle, D.

    1999-06-01

    The article surveys the latest equipment available from the world's manufacturers of a range of machines for tunnelling. These are grouped under the headings: excavators; impact hammers; road headers; and shields and tunnel boring machines. Products of thirty manufacturers are referred to. Addresses and fax numbers of companies are supplied. 5 tabs., 13 photos.

  18. Electric machine

    Science.gov (United States)

    El-Refaie, Ayman Mohamed Fawzi [Niskayuna, NY; Reddy, Patel Bhageerath [Madison, WI

    2012-07-17

    An interior permanent magnet electric machine is disclosed. The interior permanent magnet electric machine comprises a rotor comprising a plurality of radially placed magnets each having a proximal end and a distal end, wherein each magnet comprises a plurality of magnetic segments and at least one magnetic segment towards the distal end comprises a high resistivity magnetic material.

  19. Machine Learning.

    Science.gov (United States)

    Kirrane, Diane E.

    1990-01-01

    As scientists seek to develop machines that can "learn," that is, solve problems by imitating the human brain, a gold mine of information on the processes of human learning is being discovered, expert systems are being improved, and human-machine interactions are being enhanced. (SK)

  20. Nonplanar machines

    International Nuclear Information System (INIS)

    Ritson, D.

    1989-05-01

    This talk examines methods available to minimize, but never entirely eliminate, degradation of machine performance caused by terrain following. Breaking of planar machine symmetry for engineering convenience and/or monetary savings must be balanced against small performance degradation, and can only be decided on a case-by-case basis. 5 refs

  1. Object-Oriented Support for Adaptive Methods on Parallel Machines

    Directory of Open Access Journals (Sweden)

    Sandeep Bhatt

    1993-01-01

    Full Text Available This article reports on experiments from our ongoing project whose goal is to develop a C++ library which supports adaptive and irregular data structures on distributed memory supercomputers. We demonstrate the use of our abstractions in implementing "tree codes" for large-scale N-body simulations. These algorithms require dynamically evolving treelike data structures, as well as load-balancing, both of which are widely believed to make the application difficult and cumbersome to program for distributed-memory machines. The ease of writing the application code on top of our C++ library abstractions (which themselves are application independent), and the low overhead of the resulting C++ code (over hand-crafted C code), supports our belief that object-oriented approaches are eminently suited to programming distributed-memory machines in a manner that (to the applications programmer) is architecture-independent. Our contribution in parallel programming methodology is to identify and encapsulate general classes of communication and load-balancing strategies useful across applications and MIMD architectures. This article reports experimental results from simulations of half a million particles using multiple methods.
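
    The "tree codes" referred to here replace groups of distant particles with tree cells that cache an aggregate mass and centre of mass; the following is a minimal serial sketch of such a dynamically built tree in one dimension, with invented class and method names, and none of the authors' C++ abstractions, distribution, or load-balancing machinery.

        class Cell:
            """One node of a 1-D tree code: an interval that either holds a single
            particle or splits into two children, caching total mass and centre of
            mass so a distant group can be treated as one pseudo-particle."""
            def __init__(self, lo, hi):
                self.lo, self.hi = lo, hi
                self.particle = None        # (position, mass) if this is an occupied leaf
                self.children = None        # [left, right] Cells once the node splits
                self.mass, self.com = 0.0, 0.0

            def insert(self, x, m):
                if self.particle is None and self.children is None:
                    self.particle = (x, m)          # empty leaf: just store the particle
                else:
                    if self.children is None:       # occupied leaf: split and reinsert
                        mid = 0.5 * (self.lo + self.hi)
                        self.children = [Cell(self.lo, mid), Cell(mid, self.hi)]
                        px, pm = self.particle
                        self.particle = None
                        self._child_for(px).insert(px, pm)
                    self._child_for(x).insert(x, m)
                # update the cached aggregates with the newly inserted particle
                self.com = (self.com * self.mass + x * m) / (self.mass + m)
                self.mass += m

            def _child_for(self, x):
                return self.children[0] if x < self.children[0].hi else self.children[1]

        root = Cell(0.0, 1.0)
        for x, m in [(0.1, 1.0), (0.2, 2.0), (0.9, 1.0)]:
            root.insert(x, m)
        print(root.mass, root.com)   # total mass and centre of mass of all particles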

  2. Clojure for machine learning

    CERN Document Server

    Wali, Akhil

    2014-01-01

    A book that brings out the strengths of Clojure programming as they apply to machine learning. Each topic is described in substantial detail, and examples and libraries in Clojure are also demonstrated. This book is intended for Clojure developers who want to explore the area of machine learning. A basic understanding of the Clojure programming language is required, but thorough acquaintance with the standard Clojure library or any other libraries is not required. Familiarity with theoretical concepts and notation of mathematics and statistics would be an added advantage.

  3. Neutron transport on the connection machine

    International Nuclear Information System (INIS)

    Robin, F.

    1991-12-01

    Monte Carlo methods are heavily used at CEA and account for a large part of the total CPU time of industrial codes. In the present work (done within the framework of the Parallel Computing Project of the CEL-V Applied Mathematics Department) we study and implement on the Connection Machine an optimised Monte Carlo algorithm for solving the neutron transport equation. This allows us to investigate the suitability of such an architecture for this kind of problem. This report describes the chosen methodology, the algorithm and its performances. We found that programming the CM-2 in CM Fortran is relatively easy, and we obtained interesting performances: on a 16k CM-2 they are at the same level as those obtained on one processor of a CRAY X-MP with a well optimized vector code

  4. 76 FR 13101 - Building Energy Codes Program: Presenting and Receiving Comments to DOE Proposed Changes to the...

    Science.gov (United States)

    2011-03-10

    ... DEPARTMENT OF ENERGY 10 CFR Part 430 [Docket No. EERE-2011-BT-BC-0009] Building Energy Codes.... The IgCC is intended to provide a green model building code provisions for new and existing commercial... Code (IgCC) AGENCY: Office of Energy Efficiency and Renewable Energy, Department of Energy. ACTION...

  5. Creating a Multi-axis Machining Postprocessor

    Directory of Open Access Journals (Sweden)

    Petr Vavruška

    2012-01-01

    Full Text Available This paper focuses on the postprocessor creation process. When using standard commercially available postprocessors it is often very difficult to modify their internal source code, and it is a very complex process, in many cases even impossible, to implement newly-developed functions. It is therefore very important to have a method for creating a postprocessor for any CAM system which allows CL data (Cutter Location data) to be generated to a separate text file. The goal of our work is to verify the proposed method for creating a postprocessor. Postprocessor functions for multi-axis machining are dealt with in this work. A file with CL data must be translated by the postprocessor into an NC program that has been customized for a specific production machine and its control system. The postprocessor is therefore verified by applications for machining free-form surfaces of complex parts, and by executing the NC programs that are generated on real machine tools. This is also presented here.
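
    As a stripped-down illustration of the translation step the paper describes (CL data records turned into machine-specific NC blocks), the sketch below maps APT-style GOTO/x,y,z records to plain G00/G01 moves; the record subset, feed handling and block numbering are simplified assumptions, not the authors' postprocessor.

        def postprocess(cl_lines, feed=200.0):
            """Translate a tiny subset of CL data into NC blocks: RAPID marks the next
            GOTO as a rapid (G00) move, otherwise GOTO becomes a feed (G01) move."""
            nc, n, rapid = [], 10, False
            for line in cl_lines:
                line = line.strip()
                if line.startswith("RAPID"):
                    rapid = True
                elif line.startswith("GOTO/"):
                    x, y, z = (float(v) for v in line[len("GOTO/"):].split(","))
                    word = "G00" if rapid else "G01"
                    feed_word = "" if rapid else f" F{feed:.0f}"
                    nc.append(f"N{n} {word} X{x:.3f} Y{y:.3f} Z{z:.3f}{feed_word}")
                    n, rapid = n + 10, False
            return nc

        cl = ["RAPID", "GOTO/ 10.0, 0.0, 5.0", "GOTO/ 10.0, 0.0, -2.0"]
        print("\n".join(postprocess(cl)))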

  6. Program EPICSHOW. A computer code to allow interactive viewing of the EPIC data libraries (Version 98-1). Summary documentation

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; McLaughlin, P.K.

    1999-01-01

    EPICSHOW (Electron Photon Interactive Code - Show Data) is an interactive graphics code that allows users to view and interact with neutron, photon, electron and light charged particle data. Besides on screen graphics the code provides hard copy in the form of tabulated listings and Postscript output files. The code has been implemented on UNIX, IBM-PC, Power MAC, and even Laptop computers. It should be relatively easy to use it on almost any computer. All of the data included in this system is based on the Lawrence Livermore National Laboratory Databases and the neutron and photon data is used in the TART97 Monte Carlo transport code system. (author)

  7. DNA-based machines.

    Science.gov (United States)

    Wang, Fuan; Willner, Bilha; Willner, Itamar

    2014-01-01

    The base sequence in nucleic acids encodes substantial structural and functional information into the biopolymer. This encoded information provides the basis for the tailoring and assembly of DNA machines. A DNA machine is defined as a molecular device that exhibits the following fundamental features. (1) It performs a fuel-driven mechanical process that mimics macroscopic machines. (2) The mechanical process requires an energy input, "fuel." (3) The mechanical operation is accompanied by an energy consumption process that leads to "waste products." (4) The cyclic operation of the DNA devices, involves the use of "fuel" and "anti-fuel" ingredients. A variety of DNA-based machines are described, including the construction of "tweezers," "walkers," "robots," "cranes," "transporters," "springs," "gears," and interlocked cyclic DNA structures acting as reconfigurable catenanes, rotaxanes, and rotors. Different "fuels", such as nucleic acid strands, pH (H⁺/OH⁻), metal ions, and light, are used to trigger the mechanical functions of the DNA devices. The operation of the devices in solution and on surfaces is described, and a variety of optical, electrical, and photoelectrochemical methods to follow the operations of the DNA machines are presented. We further address the possible applications of DNA machines and the future perspectives of molecular DNA devices. These include the application of DNA machines as functional structures for the construction of logic gates and computing, for the programmed organization of metallic nanoparticle structures and the control of plasmonic properties, and for controlling chemical transformations by DNA machines. We further discuss the future applications of DNA machines for intracellular sensing, controlling intracellular metabolic pathways, and the use of the functional nanostructures for drug delivery and medical applications.

  8. International Training Program: 3D S. Un. Cop - Scaling, Uncertainty and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes Seminar

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Bajs, T.; Reventos, F.

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up of the proposal to IAEA for the Permanent Training Course for System Code Users (D'Auria, 1998). Four seminars have been held at University of Pisa (2003, 2004), at The Pennsylvania State University (2004) and at University of Zagreb (2005). It was recognized that such courses represented both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2005 was successfully held with the participation of 19 persons coming from 9 countries and 14 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 15 scientists were involved in the organization of the seminar, presenting theoretical aspects of the proposed methodologies and

  9. Machine learning with R cookbook

    CERN Document Server

    Chiu, Yu-Wei

    2015-01-01

    If you want to learn how to use R for machine learning and gain insights from your data, then this book is ideal for you. Regardless of your level of experience, this book covers the basics of applying R to machine learning through to advanced techniques. While it is helpful if you are familiar with basic programming or machine learning concepts, you do not require prior experience to benefit from this book.

  10. The Machine within the Machine

    CERN Multimedia

    Katarina Anthony

    2014-01-01

    Although Virtual Machines are widespread across CERN, you probably won't have heard of them unless you work for an experiment. Virtual machines - known as VMs - allow you to create a separate machine within your own, allowing you to run Linux on your Mac, or Windows on your Linux - whatever combination you need.   Using a CERN Virtual Machine, a Linux analysis software runs on a Macbook. When it comes to LHC data, one of the primary issues collaborations face is the diversity of computing environments among collaborators spread across the world. What if an institute cannot run the analysis software because they use different operating systems? "That's where the CernVM project comes in," says Gerardo Ganis, PH-SFT staff member and leader of the CernVM project. "We were able to respond to experimentalists' concerns by providing a virtual machine package that could be used to run experiment software. This way, no matter what hardware they have ...

  11. Creating Tomorrow's Technologists: Contrasting Information Technology Curriculum in North American Library and Information Science Graduate Programs against Code4lib Job Listings

    Science.gov (United States)

    Maceli, Monica

    2015-01-01

    This research study explores technology-related course offerings in ALA-accredited library and information science (LIS) graduate programs in North America. These data are juxtaposed against a text analysis of several thousand LIS-specific technology job listings from the Code4lib jobs website. Starting in 2003, as a popular library technology…

  12. Machine translation

    Energy Technology Data Exchange (ETDEWEB)

    Nagao, M

    1982-04-01

    Each language has its own structure. In translating one language into another one, language attributes and grammatical interpretation must be defined in an unambiguous form. In order to parse a sentence, it is necessary to recognize its structure. A so-called context-free grammar can help in this respect for machine translation and machine-aided translation. Problems to be solved in studying machine translation are taken up in the paper, which discusses subjects for semantics and for syntactic analysis and translation software. 14 references.

  13. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  14. Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady state or transient behavior; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. Flowrate in parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flowrate conditions, variable or not with respect to time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors)

  15. Preliminary Test of Upgraded Conventional Milling Machine into PC Based CNC Milling Machine

    International Nuclear Information System (INIS)

    Abdul Hafid

    2008-01-01

    A CNC (Computerized Numerical Control) milling machine poses a challenge for innovation in the field of machining. To obtain machining quality equivalent to that of a CNC milling machine, a conventional milling machine was upgraded into a PC-based CNC milling machine. The machine was modified mechanically and instrumentally; as replacement controls, a servo drive and proximity sensors were used. A computer program was constructed to send instructions to the milling machine. The program structure consists of a GUI model and a ladder diagram, implemented on a programming system called RTX software. The results of the upgrade are the computer program and the CNC instruction set. This is a first step, and the work will be continued. With the upgraded milling machine, the user can work more optimally and more safely with respect to accident risk. (author)

  16. Machine learning and medical imaging

    CERN Document Server

    Shen, Dinggang; Sabuncu, Mert

    2016-01-01

    Machine Learning and Medical Imaging presents state-of-the-art machine learning methods in medical image analysis. It first summarizes cutting-edge machine learning algorithms in medical imaging, including not only classical probabilistic modeling and learning methods, but also recent breakthroughs in deep learning, sparse representation/coding, and big data hashing. In the second part leading research groups around the world present a wide spectrum of machine learning methods with application to different medical imaging modalities, clinical domains, and organs. The biomedical imaging modalities include ultrasound, magnetic resonance imaging (MRI), computed tomography (CT), histology, and microscopy images. The targeted organs span the lung, liver, brain, and prostate, while there is also a treatment of examining genetic associations. Machine Learning and Medical Imaging is an ideal reference for medical imaging researchers, industry scientists and engineers, advanced undergraduate and graduate students, a...

  17. Machine Learning an algorithmic perspective

    CERN Document Server

    Marsland, Stephen

    2009-01-01

    Traditional books on machine learning can be divided into two groups - those aimed at advanced undergraduates or early postgraduates with reasonable mathematical knowledge and those that are primers on how to code algorithms. The field is ready for a text that not only demonstrates how to use the algorithms that make up machine learning methods, but also provides the background needed to understand how and why these algorithms work. Machine Learning: An Algorithmic Perspective is that text.Theory Backed up by Practical ExamplesThe book covers neural networks, graphical models, reinforcement le

  18. Machine Learning

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Machine learning, which builds on ideas in computer science, statistics, and optimization, focuses on developing algorithms to identify patterns and regularities in data, and using these learned patterns to make predictions on new observations. Boosted by its industrial and commercial applications, the field of machine learning is quickly evolving and expanding. Recent advances have seen great success in the realms of computer vision, natural language processing, and broadly in data science. Many of these techniques have already been applied in particle physics, for instance for particle identification, detector monitoring, and the optimization of computer resources. Modern machine learning approaches, such as deep learning, are only just beginning to be applied to the analysis of High Energy Physics data to approach more and more complex problems. These classes will review the framework behind machine learning and discuss recent developments in the field.

  19. Machine Translation

    Indian Academy of Sciences (India)

    Research Mt System Example: The 'Janus' Translating Phone Project. The Janus ... based on laptops, and simultaneous translation of two speakers in a dialogue. For more ..... The current focus in MT research is on using machine learning.

  20. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendent of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids
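
    As a minimal illustration of the Monte Carlo transport idea that MCNP implements on an industrial scale (this is not the MCNP algorithm, and the attenuation coefficient below is an assumed value), one can estimate the uncollided transmission of photons through a slab by sampling path lengths from an exponential distribution:

        import math
        import random

        def transmission_fraction(mu, thickness, n_histories=100_000, seed=1):
            """Toy Monte Carlo estimate of uncollided photon transmission through a slab.

            mu        -- total attenuation coefficient in 1/cm (assumed value)
            thickness -- slab thickness in cm
            """
            rng = random.Random(seed)
            transmitted = 0
            for _ in range(n_histories):
                # Sample the distance to the first interaction from p(x) = mu * exp(-mu * x).
                path = -math.log(1.0 - rng.random()) / mu
                if path > thickness:      # the photon crosses the slab without interacting
                    transmitted += 1
            return transmitted / n_histories

        # Compare with the analytic answer exp(-mu * t).
        print(transmission_fraction(mu=0.2, thickness=5.0), math.exp(-0.2 * 5.0))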

  1. Using Peephole Optimization on Intermediate Code

    NARCIS (Netherlands)

    Tanenbaum, A.S.; van Staveren, H.; Stevenson, J.W.

    1982-01-01

    Many portable compilers generate an intermediate code that is subsequently translated into the target machine's assembly language. In this paper a stack-machine-based intermediate code suitable for algebraic languages (e.g., PASCAL, C, FORTRAN) and most byte-addressed mini- and microcomputers is

  2. The Machine Scoring of Writing

    Science.gov (United States)

    McCurry, Doug

    2010-01-01

    This article provides an introduction to the kind of computer software that is used to score student writing in some high stakes testing programs, and that is being promoted as a teaching and learning tool to schools. It sketches the state of play with machines for the scoring of writing, and describes how these machines work and what they do.…

  3. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  4. Visualization program development using Java

    International Nuclear Information System (INIS)

    Sasaki, Akira; Suto, Keiko

    2002-03-01

    Methods of developing visualization programs in Java for PCs with a graphical user interface (GUI) are discussed and applied to the visualization and analysis of 1D and 2D data from experiments and numerical simulations. Based on an investigation of programming techniques such as drawing graphics and event-driven programming, example codes are provided in which the GUI is implemented using the Abstract Window Toolkit (AWT). The marked advantage of Java comes from the inclusion of library routines for graphics and networking in its language specification, which enables ordinary scientific programmers to make interactive visualization a part of their simulation codes. Moreover, Java programs are machine independent at the source level. Object-oriented programming (OOP) methods used in Java programming will be useful for developing large scientific codes which include a number of modules and require good maintainability. (author)

  5. Structural analysis program of plant piping system. Introduction of AutoPIPE V8i new feature. JSME PPC-class 2 piping code

    International Nuclear Information System (INIS)

    Motohashi, Kazuhiko

    2009-01-01

    After integration with ADLPipe, AutoPIPE V8i (ver. 9.1) became a structural analysis program for plant piping systems with analysis capability for the ASME NB Class 1 and JSME PPC-Class 2 piping codes, as well as ASME NC Class 2 and ASME ND Class 3. This article describes the analysis capability for the JSME PPC-Class 2 piping code as well as new general features such as static analysis of up to 100 thermal, 10 seismic and 10 wind load cases, including different loading scenarios, and a pipe segment edit function: join, split, reverse and re-order segments. (T. Tanaka)

  6. EGS code system: computer programs for the Monte Carlo simulation of electromagnetic cascade showers. Version 3. [EGS, PEGS, TESTSR, in MORTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Ford, R.L.; Nelson, W.R.

    1978-06-01

    A code to simulate almost any electron-photon transport problem conceivable is described. The report begins with a lengthy historical introduction and a description of the shower generation process. Then the detailed physics of the shower processes and the methods used to simulate them are presented. Ideas of sampling theory, transport techniques, particle interactions in general, and programming details are discussed. Next, EGS calculations and various experiments and other Monte Carlo results are compared. The remainder of the report consists of user manuals for the EGS, PEGS, and TESTSR codes; options, input specifications, and typical output are included. 38 figures, 12 tables. (RWR)

  7. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  8. User's manual for seismic analysis code 'SONATINA-2V'

    International Nuclear Information System (INIS)

    Hanawa, Satoshi; Iyoku, Tatsuo

    2001-08-01

    The seismic analysis code, SONATINA-2V, has been developed to analyze the behavior of the HTTR core graphite components under seismic excitation. The SONATINA-2V code is a two-dimensional computer program capable of analyzing the vertical arrangement of the HTTR graphite components, such as fuel blocks, replaceable reflector blocks, permanent reflector blocks, as well as their restraint structures. In the analytical model, each block is treated as a rigid body and is restrained by dowel pins which restrict relative horizontal movement but allow vertical and rocking motions between upper and lower blocks. Moreover, the SONATINA-2V code is capable of analyzing the core vibration behavior under simultaneous excitation in the vertical and horizontal directions. The SONATINA-2V code is composed of the main program, a pre-processor for preparing the input data for SONATINA-2V, and a post-processor for data processing and for producing graphics from the analytical results. Although the SONATINA-2V code was developed to run on the MSP computer system of the Japan Atomic Energy Research Institute (JAERI), that computer system was retired as computer technology advanced. Therefore, the analysis code was improved so that it can be operated on the UNIX machine, the SR8000 computer system, of JAERI. The user's manual for the seismic analysis code, SONATINA-2V, including the pre- and post-processors, is given in the present report. (author)

  9. Runtime Verification of C Programs

    Science.gov (United States)

    Havelund, Klaus

    2008-01-01

    We present in this paper a framework, RMOR, for monitoring the execution of C programs against state machines, expressed in a textual (nongraphical) format in files separate from the program. The state machine language has been inspired by a graphical state machine language RCAT recently developed at the Jet Propulsion Laboratory, as an alternative to using Linear Temporal Logic (LTL) for requirements capture. Transitions between states are labeled with abstract event names and Boolean expressions over such. The abstract events are connected to code fragments using an aspect-oriented pointcut language similar to ASPECTJ's or ASPECTC's pointcut language. The system is implemented in the C analysis and transformation package CIL, and is programmed in OCAML, the implementation language of CIL. The work is closely related to the notion of stateful aspects within aspect-oriented programming, where pointcut languages are extended with temporal assertions over the execution trace.
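
    RMOR instruments C programs, but the core idea of checking an execution trace against a state machine can be sketched in a few lines of Python; the states, events, and transitions below are invented for illustration and do not come from RCAT or RMOR:

        # Minimal runtime monitor: a state machine driven by abstract events.
        TRANSITIONS = {
            ("closed", "open"):  "opened",
            ("opened", "write"): "opened",
            ("opened", "close"): "closed",
        }

        class Monitor:
            def __init__(self, start="closed"):
                self.state = start

            def emit(self, event):
                """Advance the monitor; an unexpected event signals a property violation."""
                key = (self.state, event)
                if key not in TRANSITIONS:
                    raise RuntimeError(f"violation: event '{event}' in state '{self.state}'")
                self.state = TRANSITIONS[key]

        # The instrumented program reports abstract events as it runs:
        monitor = Monitor()
        for event in ["open", "write", "close"]:
            monitor.emit(event)
        print("trace accepted, final state:", monitor.state)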

  10. Formal modeling of virtual machines

    Science.gov (United States)

    Cremers, A. B.; Hibbard, T. N.

    1978-01-01

    Systematic software design can be based on the development of a 'hierarchy of virtual machines', each representing a 'level of abstraction' of the design process. The reported investigation presents the concept of 'data space' as a formal model for virtual machines. The presented model of a data space combines the notions of data type and mathematical machine to express the close interaction between data and control structures which takes place in a virtual machine. One of the main objectives of the investigation is to show that control-independent data type implementation is only of limited usefulness as an isolated tool of program development, and that the representation of data is generally dictated by the control context of a virtual machine. As a second objective, a better understanding is to be developed of virtual machine state structures than was heretofore provided by the view of the state space as a Cartesian product.

  11. Development of Fractal Pattern Making Application using L-System for Enhanced Machine Controller

    Directory of Open Access Journals (Sweden)

    Gunawan Alexander A S

    2014-03-01

    Full Text Available One big issue facing industry today is an automated machine's lack of flexibility for customization, because it is designed by the manufacturer to certain standards. In this research, customized application software was developed for CNC (Computer Numerically Controlled) machines using an open source platform. The application enables us to create designs by means of fractal patterns using an L-System, developed with a turtle geometry interpretation and the Python programming language. The output of the application is the G-code of the fractal pattern formed by the L-System method. In the experiment on the CNC machine, the G-code of a fractal pattern involving a branching structure was able to run well.
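
    A rough sketch of this pipeline, L-System rewriting interpreted with turtle geometry and emitted as G-code, is given below; the axiom, production rule, angle, step length, and feed rate are illustrative values, not those used by the authors:

        import math

        def expand(axiom, rules, iterations):
            """Rewrite the axiom with the L-System production rules."""
            s = axiom
            for _ in range(iterations):
                s = "".join(rules.get(ch, ch) for ch in s)
            return s

        def lsystem_to_gcode(commands, step=5.0, angle=25.0, feed=200):
            """Interpret the string with turtle geometry and emit linear G-code moves."""
            x, y, heading = 0.0, 0.0, 90.0
            stack = []
            lines = ["G21 ; millimetres", "G90 ; absolute coordinates", f"G0 X{x:.3f} Y{y:.3f}"]
            for ch in commands:
                if ch == "F":                          # move forward while cutting
                    x += step * math.cos(math.radians(heading))
                    y += step * math.sin(math.radians(heading))
                    lines.append(f"G1 X{x:.3f} Y{y:.3f} F{feed}")
                elif ch == "+":
                    heading += angle                   # turn left
                elif ch == "-":
                    heading -= angle                   # turn right
                elif ch == "[":
                    stack.append((x, y, heading))      # start of a branch
                elif ch == "]":
                    x, y, heading = stack.pop()        # end of branch: rapid move back
                    lines.append(f"G0 X{x:.3f} Y{y:.3f}")
            return "\n".join(lines)

        # A small branching pattern, 3 iterations (illustrative only).
        pattern = expand("F", {"F": "F[+F]F[-F]F"}, 3)
        print(lsystem_to_gcode(pattern))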

  12. ENERGY STAR Certified Vending Machines

    Data.gov (United States)

    U.S. Environmental Protection Agency — Certified models meet all ENERGY STAR requirements as listed in the Version 3.1 ENERGY STAR Program Requirements for Refrigerated Beverage Vending Machines that are...

  13. Neural Decoder for Topological Codes

    Science.gov (United States)

    Torlai, Giacomo; Melko, Roger G.

    2017-07-01

    We present an algorithm for error correction in topological codes that exploits modern machine learning techniques. Our decoder is constructed from a stochastic neural network called a Boltzmann machine, of the type extensively used in deep learning. We provide a general prescription for the training of the network and a decoding strategy that is applicable to a wide variety of stabilizer codes with very little specialization. We demonstrate the neural decoder numerically on the well-known two-dimensional toric code with phase-flip errors.

  14. Technology Roadmap on Instrumentation, Control, and Human-Machine Interface to Support DOE Advanced Nuclear Energy Programs

    International Nuclear Information System (INIS)

    Donald D Dudenhoeffer; Bruce P Hallbert

    2007-01-01

    Instrumentation, Controls, and Human-Machine Interface (ICHMI) technologies are essential to ensuring delivery and effective operation of optimized advanced Generation IV (Gen IV) nuclear energy systems. In 1996, the Watts Bar I nuclear power plant in Tennessee was the last U.S. nuclear power plant to go on line. It was, in fact, built based on pre-1990 technology. Since this last U.S. nuclear power plant was designed, there have been major advances in the field of ICHMI systems. Computer technology employed in other industries has advanced dramatically, and computing systems are now replaced every few years as they become functionally obsolete. Functional obsolescence occurs when newer, more functional technology replaces or supersedes an existing technology, even though an existing technology may well be in working order. Although ICHMI architectures are comprised of much of the same technology, they have not been updated nearly as often in the nuclear power industry. For example, some newer Personal Digital Assistants (PDAs) or handheld computers may, in fact, have more functionality than the 1996 computer control system at the Watts Bar I plant. This illustrates the need to transition and upgrade current nuclear power plant ICHMI technologies

  15. Summary Report for ASC L2 Milestone #4782: Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes

    Energy Technology Data Exchange (ETDEWEB)

    Neely, J. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hornung, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Black, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Robinson, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-29

    This document serves as a detailed companion to the powerpoint slides presented as part of the ASC L2 milestone review for Integrated Codes milestone #4782 titled “Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes”, due on 9/30/2014, and presented for formal program review on 9/12/2014. The program review committee is represented by Mike Zika (A Program Project Lead for Kull), Brian Pudliner (B Program Project Lead for Ares), Scott Futral (DEG Group Lead in LC), and Mike Glass (Sierra Project Lead at Sandia). This document, along with the presentation materials, and a letter of completion signed by the review committee will act as proof of completion for this milestone.

  16. Machine Protection

    International Nuclear Information System (INIS)

    Zerlauth, Markus; Schmidt, Rüdiger; Wenninger, Jörg

    2012-01-01

    The present architecture of the machine protection system is being recalled and the performance of the associated systems during the 2011 run will be briefly summarized. An analysis of the causes of beam dumps as well as an assessment of the dependability of the machine protection systems (MPS) itself is being presented. Emphasis will be given to events that risked exposing parts of the machine to damage. Further improvements and mitigations of potential holes in the protection systems will be evaluated along with their impact on the 2012 run. The role of rMPP during the various operational phases (commissioning, intensity ramp up, MDs...) will be discussed along with a proposal for the intensity ramp up for the start of beam operation in 2012

  17. Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Chikkagoudar, Satish; Chatterjee, Samrat; Thomas, Dennis G.; Carroll, Thomas E.; Muller, George

    2017-04-21

    The absence of a robust and unified theory of cyber dynamics presents challenges and opportunities for using machine learning based data-driven approaches to further the understanding of the behavior of such complex systems. Analysts can also use machine learning approaches to gain operational insights. In order to be operationally beneficial, cybersecurity machine learning based models need to have the ability to: (1) represent a real-world system, (2) infer system properties, and (3) learn and adapt based on expert knowledge and observations. Probabilistic models and Probabilistic graphical models provide these necessary properties and are further explored in this chapter. Bayesian Networks and Hidden Markov Models are introduced as an example of a widely used data driven classification/modeling strategy.

  18. Machine Protection

    CERN Document Server

    Zerlauth, Markus; Wenninger, Jörg

    2012-01-01

    The present architecture of the machine protection system is being recalled and the performance of the associated systems during the 2011 run will be briefly summarized. An analysis of the causes of beam dumps as well as an assessment of the dependability of the machine protection systems (MPS) itself is being presented. Emphasis will be given to events that risked exposing parts of the machine to damage. Further improvements and mitigations of potential holes in the protection systems will be evaluated along with their impact on the 2012 run. The role of rMPP during the various operational phases (commissioning, intensity ramp up, MDs...) will be discussed along with a proposal for the intensity ramp up for the start of beam operation in 2012.

  19. Machine Protection

    Energy Technology Data Exchange (ETDEWEB)

    Zerlauth, Markus; Schmidt, Rüdiger; Wenninger, Jörg [European Organization for Nuclear Research, Geneva (Switzerland)

    2012-07-01

    The present architecture of the machine protection system is being recalled and the performance of the associated systems during the 2011 run will be briefly summarized. An analysis of the causes of beam dumps as well as an assessment of the dependability of the machine protection systems (MPS) itself is being presented. Emphasis will be given to events that risked exposing parts of the machine to damage. Further improvements and mitigations of potential holes in the protection systems will be evaluated along with their impact on the 2012 run. The role of rMPP during the various operational phases (commissioning, intensity ramp up, MDs...) will be discussed along with a proposal for the intensity ramp up for the start of beam operation in 2012.

  20. R5FORCE: a program to compute fluid induced forces using hydrodynamic output from the RELAP5 code

    International Nuclear Information System (INIS)

    Watkins, J.C.

    1983-01-01

    This paper describes the computer code R5FORCE, a postprocessor to the RELAP5/MOD1 thermal-hydraulics code. R5FORCE computes piping hydraulic force/time histories that can be input into various structural analysis computer codes. R5FORCE solves the momentum conservation equation using the pressure and wall shear force terms rather than the pressure and fluid acceleration terms, eliminating potential instabilities associated with computing the time derivative in the fluid acceleration term. The updates to RELAP5 required to generate the input data for R5FORCE are also discussed

  1. Coding completeness and quality of relative survival-related variables in the National Program of Cancer Registries Cancer Surveillance System, 1995-2008.

    Science.gov (United States)

    Wilson, Reda J; O'Neil, M E; Ntekop, E; Zhang, Kevin; Ren, Y

    2014-01-01

    Calculating accurate estimates of cancer survival is important for various analyses of cancer patient care and prognosis. Current US survival rates are estimated based on data from the National Cancer Institute's (NCI's) Surveillance, Epidemiology, and End Results (SEER) program, covering approximately 28 percent of the US population. The National Program of Cancer Registries (NPCR) covers about 96 percent of the US population. Using a population-based database with greater US population coverage to calculate survival rates at the national, state, and regional levels can further enhance the effective monitoring of cancer patient care and prognosis in the United States. The first step is to establish the coding completeness and coding quality of the NPCR data needed for calculating survival rates and conducting related validation analyses. Using data from the NPCR-Cancer Surveillance System (CSS) from 1995 through 2008, we assessed coding completeness and quality on 26 data elements that are needed to calculate cancer relative survival estimates and conduct related analyses. Data elements evaluated consisted of demographic, follow-up, prognostic, and cancer identification variables. Analyses were performed showing trends of these variables by diagnostic year, state of residence at diagnosis, and cancer site. Mean overall percent coding completeness by each NPCR central cancer registry averaged across all data elements and diagnosis years ranged from 92.3 percent to 100 percent. Results showing the mean percent coding completeness for the relative survival-related variables in NPCR data are presented. All data elements but 1 have a mean coding completeness greater than 90 percent as was the mean completeness by data item group type. Statistically significant differences in coding completeness were found in the ICD revision number, cause of death, vital status, and date of last contact variables when comparing diagnosis years. The majority of data items had a coding

  2. Chatter and machine tools

    CERN Document Server

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  3. Teletherapy machine

    International Nuclear Information System (INIS)

    Panyam, Vinatha S.; Rakshit, Sougata; Kulkarni, M.S.; Pradeepkumar, K.S.

    2017-01-01

    Radiation Standards Section (RSS), RSSD, BARC is the national metrology institute for ionizing radiation. RSS develops and maintains radiation standards for X-ray, beta, gamma and neutron radiations. In radiation dosimetry, traceability, accuracy and consistency of radiation measurements is very important especially in radiotherapy where the success of patient treatment is dependent on the accuracy of the dose delivered to the tumour. Cobalt teletherapy machines have been used in the treatment of cancer since the early 1950s and India had its first cobalt teletherapy machine installed at the Cancer Institute, Chennai in 1956

  4. Portable LQCD Monte Carlo code using OpenACC

    Science.gov (United States)

    Bonati, Claudio; Calore, Enrico; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Fabio Schifano, Sebastiano; Silvi, Giorgio; Tripiccione, Raffaele

    2018-03-01

    Varying from multi-core CPU processors to many-core GPUs, the present scenario of HPC architectures is extremely heterogeneous. In this context, code portability is increasingly important for easy maintainability of applications; this is relevant in scientific computing where code changes are numerous and frequent. In this talk we present the design and optimization of a state-of-the-art production level LQCD Monte Carlo application, using the OpenACC directives model. OpenACC aims to abstract parallel programming to a descriptive level, where programmers do not need to specify the mapping of the code on the target machine. We describe the OpenACC implementation and show that the same code is able to target different architectures, including state-of-the-art CPUs and GPUs.

  5. Superconducting magnetic systems and electrical machines

    International Nuclear Information System (INIS)

    Glebov, I.A.

    1975-01-01

    The use of superconductors for magnets and electrical machines attracts close attention of designers and scientists. A description is given of an ongoing research program to create superconductive magnetic systems, commutator motors, homopolar machines, topological generators and turbogenerators with superconductive field windings. All the machines are tentative experimental models and serve as a basis for further developments

  6. Nuclear code abstracts (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-02-01

    Nuclear Code Abstracts is compiled in the Nuclear Code Committee to exchange information of the nuclear code developments among members of the committee. Enlarging the collection, the present one includes nuclear code abstracts obtained in 1975 through liaison officers of the organizations in Japan participating in the Nuclear Energy Agency's Computer Program Library at Ispra, Italy. The classification of nuclear codes and the format of code abstracts are the same as those in the library. (auth.)

  7. Indonesian Stock Prediction using Support Vector Machine (SVM

    Directory of Open Access Journals (Sweden)

    Santoso Murtiyanto

    2018-01-01

    Full Text Available This project is part of the development of software to provide predictive, information technology-based services (machine intelligence or machine learning) that will be utilized in the money market community. The prediction method used in these early stages combines a Gaussian Mixture Model and a Support Vector Machine, programmed in Python. The system predicts the price of Astra International (stock code: ASII.JK) stock data. The data used were taken during a 17-year period, from January 2000 until September 2017. Some of the data were used for training/modeling (80% of the data) and the remainder (20%) was used for testing. An integrated model comprising a Gaussian Mixture Model and a Support Vector Machine has been tested to predict the ASII.JK stock market 1 d in advance. This model has been compared with the Market Cumulative Return. The results show that the Gaussian Mixture Model-Support Vector Machine based stock prediction model offers a significant improvement over the compared models, resulting in a Sharpe ratio of 3.22.
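
    A rough sketch of this kind of pipeline, Gaussian Mixture Model features feeding a Support Vector Machine that predicts next-day direction, is shown below with scikit-learn; the feature construction, model parameters, and synthetic price series are assumptions, not the authors' exact model or data:

        import numpy as np
        from sklearn.mixture import GaussianMixture
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        prices = 100 * np.cumprod(1 + rng.normal(0, 0.01, 4000))   # synthetic price series
        returns = np.diff(np.log(prices))

        # Features: a short window of past returns; target: direction of the next return.
        window = 5
        X = np.array([returns[i - window:i] for i in range(window, len(returns))])
        y = (returns[window:] > 0).astype(int)

        # Unsupervised regime features from a Gaussian Mixture Model, appended to the window.
        gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
        X_aug = np.hstack([X, gmm.predict_proba(X)])

        # Chronological 80/20 split for training and testing, as in the paper.
        split = int(0.8 * len(X_aug))
        clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_aug[:split], y[:split])
        print("test accuracy:", clf.score(X_aug[split:], y[split:]))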

  8. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P. [and others]

    1996-12-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  9. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    International Nuclear Information System (INIS)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P.

    1996-01-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising

  10. Evaluation the total exposure of soil sample in Adaya site and the obtain risk assessments for the worker by Res Rad code program

    International Nuclear Information System (INIS)

    Mahadi, A. M.; Khadim, A. A. N.; Ibrahim, Z. H.; Ali, S. A.

    2012-12-01

    The present study aims to evaluate the total exposure of workers at the Adaya site and to obtain risk assessments using the RESRAD code program. The study includes soil samples from 5 areas of the site, analyzed with a High Purity Germanium (HPGe) system made by the CANBERRA company. The soil samples were simulated with the RESRAD code program by entering the radioactive isotope concentrations and the specifications of the contaminated zone: its area, depth, and cover depth. The total exposure for the same samples was about 9 mSv/year, and the risk (HEAST 2001 Morbidity, FGR 13 Morbidity) about 2.045 cases per 100 workers per year. There are slight differences between HEAST 2001 Morbidity and FGR 13 Morbidity according to the Dose Conversion Factor (DCF) used. The FGR 13 Morbidity was about 2.041 cases per 100 workers per year. (Author)

  11. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork, Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN......, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  12. Modular programming method at JAERI

    International Nuclear Information System (INIS)

    Asai, Kiyoshi; Katsuragi, Satoru

    1982-02-01

    In this report the histories, concepts and a method for the construction and maintenance of the nuclear code systems of the Japan Atomic Energy Research Institute (JAERI) are presented. The method mainly consists of novel computer features. The development of these features and the experiences with them, which required many man-months of effort by scientists and engineers of JAERI and a computer manufacturer, are also described. One of the features is a file handling program named datapool. The program is being used in code systems which are under development at JAERI. The others are computer features such as dynamic linking, reentrant coding of Fortran programs, an interactive programming facility, a document editor, a quick system output viewer and editor, a flexible man-machine interactive Fortran executor, and selective use of time-sharing or batch oriented computing in an interactive programming environment. In 1980 JAERI replaced its two old computer systems with three FACOM M-200 computer systems which have the features mentioned above. Since 1981 most code systems, or even big single codes, can be changed into modular code systems, even if the developers or users of the systems do not recognize that they are using modular code systems. The purpose of this report is to describe our methodology of modular programming from the aspect of computer features and some of their applications to nuclear codes, in order to gain a sympathetic understanding of it from persons in organizations who are concerned with the effective use of computers, especially in nuclear research fields. (author)

  13. Machine testning

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo

    This document is used in connection with a laboratory exercise of 3 hours duration as a part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercise includes a series of tests carried out by the student on a conventional and a numerically controlled lathe, respectively. This document...

  14. Machine rates for selected forest harvesting machines

    Science.gov (United States)

    R.W. Brinker; J. Kinard; Robert Rummer; B. Lanford

    2002-01-01

    Very little new literature has been published on the subject of machine rates and machine cost analysis since 1989 when the Alabama Agricultural Experiment Station Circular 296, Machine Rates for Selected Forest Harvesting Machines, was originally published. Many machines discussed in the original publication have undergone substantial changes in various aspects, not...

  15. Legacy program acquires special equipment for CRL spent fuel. Canadian arm of Rolls Royce delivers unique machines

    International Nuclear Information System (INIS)

    Boyd, F.

    2012-01-01

    Six years ago the federal government accepted its responsibility for the radioactive waste that has resulted from the six decades of nuclear research and development conducted by Atomic Energy of Canada Limited and its predecessor operator of the Chalk River Laboratories (CRL) the National Research Council (1944 - 1952). Nuclear research and development and, particularly, reactor operation at CRL has resulted in outdated and unused research facilities and buildings and a wide variety of buried and stored radioactive waste. In 2006 the federal government established the Nuclear Legacy Liabilities Program (NLLP) with an initial funding of $520 million. The mandate of the NLLP is to deal with the radioactive waste arising from the nuclear research and development program of AECL and also prototype reactors in which it was involved. The timeline for the NLLP extends several decades into the future. The NLLP is implemented through a partnership of Natural Resources Canada (NRCan) and AECL. NRCan is responsible for policy direction and oversight, while AECL is responsible for program implementation and all licences, facilities and lands. About 70 percent of the liabilities from AECL activities are at CRL. Other sites that will be restored under the NLLP are: the Whiteshell Laboratories and Underground Research Laboratory in Manitoba; NPD and Douglas Point reactors in Ontario; and the Gentilly 1 reactor in Quebec. (author)

  16. Recent upgrading of the modelling program COMFORT

    International Nuclear Information System (INIS)

    Hawkes, C.; Lee, M.

    1986-01-01

    The computer code COMFORT, developed for the online control of machine functions at the SLC, has recently undergone several modifications to overcome some of its limitations. This note describes the reasons for these changes, the methods employed, some test results and the applications of the new version of the program

  17. Using Machine Learning to Predict Swine Movements within a Regional Program to Improve Control of Infectious Diseases in the US.

    Science.gov (United States)

    Valdes-Donoso, Pablo; VanderWaal, Kimberly; Jarvis, Lovell S; Wayne, Spencer R; Perez, Andres M

    2017-01-01

    Between-farm animal movement is one of the most important factors influencing the spread of infectious diseases in food animals, including in the US swine industry. Understanding the structural network of contacts in a food animal industry is prerequisite to planning for efficient production strategies and for effective disease control measures. Unfortunately, data regarding between-farm animal movements in the US are not systematically collected and thus, such information is often unavailable. In this paper, we develop a procedure to replicate the structure of a network, making use of partial data available, and subsequently use the model developed to predict animal movements among sites in 34 Minnesota counties. First, we summarized two networks of swine producing facilities in Minnesota, then we used a machine learning technique referred to as random forest, an ensemble of independent classification trees, to estimate the probability of pig movements between farms and/or markets sites located in two counties in Minnesota. The model was calibrated and tested by comparing predicted data and observed data in those two counties for which data were available. Finally, the model was used to predict animal movements in sites located across 34 Minnesota counties. Variables that were important in predicting pig movements included between-site distance, ownership, and production type of the sending and receiving farms and/or markets. Using a weighted-kernel approach to describe spatial variation in the centrality measures of the predicted network, we showed that the south-central region of the study area exhibited high aggregation of predicted pig movements. Our results show an overlap with the distribution of outbreaks of porcine reproductive and respiratory syndrome, which is believed to be transmitted, at least in part, though animal movements. While the correspondence of movements and disease is not a causal test, it suggests that the predicted network may approximate
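
    As a rough illustration of the approach (not the authors' code, variables, or data), the sketch below builds pairwise farm-to-farm features such as between-site distance and production types, trains a random forest on synthetic observed movements, and scores the movement probability of a new candidate edge:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(42)
        n_pairs = 5000

        # Hypothetical pairwise features for candidate farm-to-farm edges.
        distance_km = rng.uniform(1, 300, n_pairs)
        same_owner = rng.integers(0, 2, n_pairs)
        src_type = rng.integers(0, 3, n_pairs)        # e.g. sow farm / nursery / finisher
        dst_type = rng.integers(0, 3, n_pairs)
        X = np.column_stack([distance_km, same_owner, src_type, dst_type])

        # Synthetic labels: closer farms under the same ownership move pigs more often.
        p_move = 1.0 / (1.0 + np.exp(0.02 * distance_km - 2.5 * same_owner))
        y = rng.random(n_pairs) < p_move

        rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=0).fit(X, y)
        print("out-of-bag accuracy:", round(rf.oob_score_, 3))

        # Score a new candidate edge: [distance_km, same_owner, src_type, dst_type].
        print("movement probability:", rf.predict_proba([[20.0, 1, 0, 2]])[0, 1])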

  18. Phase 3 of a Brushless Doubly-Fed Machine System Development Program : Final Technical Report for Period January 1, 1992-June 30, 1993.

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, Gerald C.; Spee, Rene; Wallace, Alan K.

    1993-12-31

    Since the inception of the BDFM development program in 1989, the value of BDFM technology has become apparent. The BDFM provides for adjustable speed, synchronous operation while keeping costs associated with the required power conversion equipment lower than in competing technologies. This provides for an advantage in initial as well as maintenance expenses over conventional drive system. Thus, the BDFM enables energy efficient, adjustable speed process control for applications where established drive technology has not been able to deliver satisfactory returns on investment. At the same time, the BDFM challenges conventional drive technologies in established markets by providing for improved performance at lower cost. BDFM converter rating is kept at a minimum, which significantly improves power quality at the utility interface over competing power conversion equipment. In summary, BDFM technology can be expected to provide significant benefits to utilities as well as their customers. This report discusses technical research and development activities related to Phase 3 of the Brushless Doubly-Fed Machine System Development Program, including work made possible by supplemental funds for laboratory improvement and prototype construction. Market research for the BDFM was provided by the College of Business at Oregon State University; market study results will be discussed in a separate report.

  19. Factors Associated with HIV Testing Among Participants from Substance Use Disorder Treatment Programs in the US: A Machine Learning Approach.

    Science.gov (United States)

    Pan, Yue; Liu, Hongmei; Metsch, Lisa R; Feaster, Daniel J

    2017-02-01

    HIV testing is the foundation for consolidated HIV treatment and prevention. In this study, we aim to discover the most relevant variables for predicting HIV testing uptake among substance users in substance use disorder treatment programs by applying random forest (RF), a robust multivariate statistical learning method. We also provide a descriptive introduction to this method for those who are unfamiliar with it. We used data from the National Institute on Drug Abuse Clinical Trials Network HIV testing and counseling study (CTN-0032). A total of 1281 HIV-negative or status unknown participants from 12 US community-based substance use disorder treatment programs were included and were randomized into three HIV testing and counseling treatment groups. The a priori primary outcome was self-reported receipt of HIV test results. Classification accuracy of RF was compared to logistic regression, a standard statistical approach for binary outcomes. Variable importance measures for the RF model were used to select the most relevant variables. RF based models produced much higher classification accuracy than those based on logistic regression. Treatment group is the most important predictor among all covariates, with a variable importance index of 12.9%. RF variable importance revealed that several types of condomless sex behaviors, condom use self-efficacy and attitudes towards condom use, and level of depression are the most important predictors of receipt of HIV testing results. There is a non-linear negative relationship between count of condomless sex acts and the receipt of HIV testing. In conclusion, RF seems promising in discovering important factors related to HIV testing uptake among large numbers of predictors and should be encouraged in future HIV prevention and treatment research and intervention program evaluations.
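
    The comparison described, random forest accuracy and variable importance against a logistic regression baseline, can be sketched with scikit-learn as below; the feature names and data are placeholders and do not reproduce the CTN-0032 variables:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 1281                                   # sample size reported in the study
        feature_names = ["treatment_group", "condomless_sex_acts",
                         "condom_self_efficacy", "depression_score", "age"]   # placeholders
        X = rng.normal(size=(n, len(feature_names)))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)    # synthetic outcome

        rf = RandomForestClassifier(n_estimators=500, random_state=0)
        lr = LogisticRegression(max_iter=1000)
        print("random forest accuracy:      ", cross_val_score(rf, X, y, cv=5).mean().round(3))
        print("logistic regression accuracy:", cross_val_score(lr, X, y, cv=5).mean().round(3))

        # Variable importance from the forest fitted on the full data.
        rf.fit(X, y)
        for name, importance in sorted(zip(feature_names, rf.feature_importances_),
                                       key=lambda pair: -pair[1]):
            print(f"{name:22s} {importance:.3f}")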

  20. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs
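
    The workflow of sweeping the two design choices and picking the minimum-cost design can be sketched schematically as below; the cost model is entirely invented and merely stands in for the code's detailed beam, transport, and pulsed-power constraints:

        import numpy as np

        def system_cost(rise_time_ns, aspect_ratio):
            """Placeholder cost model: fast rise times and extreme core aspect ratios are penalised."""
            pulsed_power = 50.0 / rise_time_ns                    # arbitrary units
            core_material = 2.0 * (aspect_ratio + 1.0 / aspect_ratio)
            return pulsed_power + core_material

        rise_times = np.linspace(10, 100, 46)                     # injector voltage rise time, ns
        aspect_ratios = np.linspace(0.5, 4.0, 36)                 # ferrite core aspect ratio
        cost = np.array([[system_cost(t, a) for a in aspect_ratios] for t in rise_times])

        # Locate the minimum-cost design over the two design choices.
        i, j = np.unravel_index(np.argmin(cost), cost.shape)
        print(f"optimum: rise time {rise_times[i]:.0f} ns, aspect ratio {aspect_ratios[j]:.2f}, "
              f"cost {cost[i, j]:.2f} (arbitrary units)")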

  1. PCRELAP5: data calculation program for RELAP 5 code; PCRELAP5: programa de calculo dos dados de entrada para o codigo RELAP5

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre, Larissa Jacome Barros

    2016-07-01

    Nuclear accidents around the world have led international regulatory bodies to establish rigorous criteria and requirements for nuclear power plant operation. Certifying and licensing a nuclear power plant requires simulating, with specific computer programs, the various accidents and transients likely to occur at the plant. In this scenario, sophisticated computational tools have been used, such as the Reactor Excursion and Leak Analysis Program (RELAP5), which is the most widely used code for the thermal-hydraulic analysis of accidents and transients in nuclear reactors in Brazil and worldwide. A major difficulty in simulation with the RELAP5 code is the amount of information required for the simulation of thermal-hydraulic accidents or transients. The preparation of the input data requires a great number of mathematical operations to calculate the geometry of the components. Thus, to perform those calculations and prepare the RELAP5 input data, a friendly mathematical preprocessor was designed. Visual Basic for Applications (VBA) for Microsoft Excel proved to be an effective tool to perform a number of tasks in the development of the program. In order to meet the needs of RELAP5 users, the RELAP5 Calculation Program (Programa de Calculo do RELAP5 - PCRELAP5) was designed. The components of the code were codified; all entry cards, including the optional cards of each one, have been programmed. In addition, an English version of PCRELAP5 was provided. Furthermore, a friendly design was developed in order to minimize the time needed to prepare input data and the errors committed by users. In this work, the final version of this preprocessor was successfully applied to the Safety Injection System (SIS) of Angra 2. (author)

  2. Probe code: a set of programs for processing and analysis of the left ventricular function - User's manual

    International Nuclear Information System (INIS)

    Piva, R.M.V.

    1987-01-01

    The User's Manual of the Probe Code is an addendum to the M.Sc. thesis entitled A Microcomputer System of Nuclear Probe to Check the Left Ventricular Function. The Probe Code is software developed for the processing and off-line analysis of left ventricular function curves obtained in vivo. These curves are produced by means of an external scintigraphic probe, collimated and placed over the left ventricle, after venous injection of Tc-99m. (author)

  3. Electric machines

    CERN Document Server

    Gross, Charles A

    2006-01-01

    BASIC ELECTROMAGNETIC CONCEPTS: Basic Magnetic Concepts; Magnetically Linear Systems: Magnetic Circuits; Voltage, Current, and Magnetic Field Interactions; Magnetic Properties of Materials; Nonlinear Magnetic Circuit Analysis; Permanent Magnets; Superconducting Magnets; The Fundamental Translational EM Machine; The Fundamental Rotational EM Machine; Multiwinding EM Systems; Leakage Flux; The Concept of Ratings in EM Systems; Summary; Problems. TRANSFORMERS: The Ideal n-Winding Transformer; Transformer Ratings and Per-Unit Scaling; The Nonideal Three-Winding Transformer; The Nonideal Two-Winding Transformer; Transformer Efficiency and Voltage Regulation; Practical Considerations; The Autotransformer; Operation of Transformers in Three-Phase Environments; Sequence Circuit Models for Three-Phase Transformer Analysis; Harmonics in Transformers; Summary; Problems. BASIC MECHANICAL CONSIDERATIONS: Some General Perspectives; Efficiency; Load Torque-Speed Characteristics; Mass Polar Moment of Inertia; Gearing; Operating Modes; Translational Systems; A Comprehensive Example: The Elevator; P...

  4. Charging machine

    International Nuclear Information System (INIS)

    Medlin, J.B.

    1976-01-01

    A charging machine for loading fuel slugs into the process tubes of a nuclear reactor includes a tubular housing connected to the process tube, a charging trough connected to the other end of the tubular housing, a device for loading the charging trough with a group of fuel slugs, means for equalizing the coolant pressure in the charging trough with the pressure in the process tubes, means for pushing the group of fuel slugs into the process tube and a latch and a seal engaging the last object in the group of fuel slugs to prevent the fuel slugs from being ejected from the process tube when the pusher is removed and to prevent pressure liquid from entering the charging machine. 3 claims, 11 drawing figures

  5. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  6. Genesis machines

    CERN Document Server

    Amos, Martyn

    2014-01-01

    Silicon chips are out. Today's scientists are using real, wet, squishy, living biology to build the next generation of computers. Cells, gels and DNA strands are the 'wetware' of the twenty-first century. Much smaller and more intelligent, these organic computers open up revolutionary possibilities. Tracing the history of computing and revealing a brave new world to come, Genesis Machines describes how this new technology will change the way we think not just about computers - but about life itself.

  7. PREREM: an interactive data preprocessing code for INREM II. Part I: user's manual. Part II: code structure

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, M.T.; Fields, D.E.

    1981-05-01

    PREREM is an interactive computer code developed as a data preprocessor for the INREM-II (Killough, Dunning, and Pleasant, 1978a) internal dose program. PREREM is intended to provide easy access to current and self-consistent nuclear decay and radionuclide-specific metabolic data sets. Provision is made for revision of metabolic data, and the code is intended for both production and research applications. Documentation for the code is in two parts. Part I is a user's manual which emphasizes interpretation of program prompts and choice of user input. Part II stresses internal structure and flow of program control and is intended to assist the researcher who wishes to revise or modify the code or add to its capabilities. PREREM is written for execution on a Digital Equipment Corporation PDP-10 System and much of the code will require revision before it can be run on other machines. The source program length is 950 lines (116 blocks) and computer core required for execution is 212 K bytes. The user must also have sufficient file space for metabolic and S-factor data sets. Further, 64 100 K byte blocks of computer storage space are required for the nuclear decay data file. Computer storage space must also be available for any output files produced during the PREREM execution. 9 refs., 8 tabs.

  8. Programming Scala Scalability = Functional Programming + Objects

    CERN Document Server

    Wampler, Dean

    2009-01-01

    Learn how to be more productive with Scala, a new multi-paradigm language for the Java Virtual Machine (JVM) that integrates features of both object-oriented and functional programming. With this book, you'll discover why Scala is ideal for highly scalable, component-based applications that support concurrency and distribution. Programming Scala clearly explains the advantages of Scala as a JVM language. You'll learn how to leverage the wealth of Java class libraries to meet the practical needs of enterprise and Internet projects more easily. Packed with code examples, this book provides us

  9. Performance evaluation of scientific programs on advanced architecture computers

    International Nuclear Information System (INIS)

    Walker, D.W.; Messina, P.; Baille, C.F.

    1988-01-01

    Recently a number of advanced architecture machines have become commercially available. These new machines promise better cost-performance than traditional computers, and some of them have the potential of competing with current supercomputers, such as the Cray X-MP, in terms of maximum performance. This paper describes an on-going project to evaluate a broad range of advanced architecture computers using a number of complete scientific application programs. The computers to be evaluated include distributed-memory machines such as the NCUBE, INTEL and Caltech/JPL hypercubes, and the MEIKO computing surface, shared-memory, bus architecture machines such as the Sequent Balance and the Alliant, very long instruction word machines such as the Multiflow Trace 7/200 computer, traditional supercomputers such as the Cray X-MP and Cray-2, and SIMD machines such as the Connection Machine. Currently 11 application codes from a number of scientific disciplines have been selected, although it is not intended to run all codes on all machines. Results are presented for two of the codes (QCD and missile tracking), and future work is proposed

  10. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, both code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
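
    The kind of annotation the authors generate automatically, e.g. a loop invariant attached to the code, can be illustrated with a small hand-annotated function; the example below is illustrative and is not output of AUTOBAYES (the invariant is merely checked at runtime, where a certifier would demand a proof):

        def sum_of_squares(xs):
            """Sum of squares with its loop invariant written out as a runtime check."""
            total = 0.0
            for i, x in enumerate(xs):
                # Invariant: before processing element i, total equals the sum of
                # squares of xs[0:i]. A certificate would carry a proof of this fact.
                assert abs(total - sum(v * v for v in xs[:i])) < 1e-9
                total += x * x
            return total

        print(sum_of_squares([1.0, 2.0, 3.0]))   # 14.0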

  11. An A.P.L. micro-programmed machine: implementation on a Multi-20 mini-computer, memory organization, micro-programming and flowcharts

    International Nuclear Information System (INIS)

    Granger, Jean-Louis

    1975-01-01

    This work presents an APL interpreter implemented on a MULTI 20 mini-computer. It includes a left-to-right syntax analyser and a recursive routine for generation and execution. This routine uses a beating method for array processing. Moreover, dynamic memory allocation is used during the execution of all APL statements. The execution of basic operations has been micro-programmed. The basic APL interpreter has a length of 10 K bytes. It uses overlay methods. (author) [fr

  12. Abstract quantum computing machines and quantum computational logics

    Science.gov (United States)

    Chiara, Maria Luisa Dalla; Giuntini, Roberto; Sergioli, Giuseppe; Leporini, Roberto

    2016-06-01

    Classical and quantum parallelism are deeply different, although it is sometimes claimed that quantum Turing machines are nothing but special examples of classical probabilistic machines. We introduce the concepts of deterministic state machine, classical probabilistic state machine and quantum state machine. On this basis, we discuss the question: To what extent can quantum state machines be simulated by classical probabilistic state machines? Each state machine is devoted to a single task determined by its program. Real computers, however, behave differently, being able to solve different kinds of problems. This capacity can be modeled, in the quantum case, by the mathematical notion of abstract quantum computing machine, whose different programs determine different quantum state machines. The computations of abstract quantum computing machines can be linguistically described by the formulas of a particular form of quantum logic, termed quantum computational logic.

  13. The path of code linting

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Join the path of code linting and discover how it can help you reach higher levels of programming enlightenment. Today we will cover how to embrace code linters to offload the cognitive strain of preserving style standards in your code base, as well as avoiding error-prone constructs. Additionally, I will show you the journey ahead for integrating several code linters into the programming tools you already use, with very little effort.
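
    As a toy illustration of what a linter automates (not one of the tools covered in the talk), the sketch below uses Python's standard ast module to flag a single error-prone construct, the mutable default argument; real linters bundle hundreds of such checks plus style rules.

        import ast

        # Toy linter sketch: flag mutable default arguments in function definitions.
        SOURCE = """
        def append(item, bucket=[]):   # lint: mutable default argument
            bucket.append(item)
            return bucket
        """

        def lint(source):
            findings = []
            tree = ast.parse(source)
            for node in ast.walk(tree):
                if isinstance(node, ast.FunctionDef):
                    for default in node.args.defaults:
                        if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                            findings.append(
                                f"line {node.lineno}: mutable default in '{node.name}'")
            return findings

        for finding in lint(SOURCE):
            print(finding)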

  14. Note related to the elaboration of a coding by key sentences for the programming of a document automatic selection system

    International Nuclear Information System (INIS)

    Leroy, A.; Braffort, P.

    1959-01-01

    This note deals with providing CEA documentalists with a tool for coding studies. The authors first discuss issues related to code selection criteria (author classification, topic classification, and so on), and propose an overview and a discussion of linguistic models. They also explain how diagrams illustrating relationships between words are built up, and propose an example diagram representation which includes different concepts such as conditions, properties, objects, tools or processes (for example hardness for a steel, batch processing for a condition, or sintering for a process), and also the introduction of negation. Then, the authors address how basic concepts can be highlighted, describe how key sentences can be built up, and propose an example analysis of a published article dealing with nuclear reactors (in this case, the study of a liquid-metal neutron absorber for the control of a gas-cooled power reactor). Perspectives of evolution are finally discussed.

  15. Untyped Memory in the Java Virtual Machine

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    We have implemented a virtual execution environment that executes legacy binary code on top of the type-safe Java Virtual Machine by recompiling native code instructions to type-safe bytecode. As it is essentially impossible to infer static typing into untyped machine code, our system emulates untyped memory on top of Java's type system. While this approach allows native code to be executed on any off-the-shelf JVM, the resulting runtime performance is poor. We propose a set of virtual machine extensions that add type-unsafe memory objects to the JVM. We contend that these JVM extensions do not relax Java's type system, as the same functionality can be achieved in pure Java, albeit much less efficiently.

  16. Evaluating the ONEBFP transport code for possible use in the proton radiography program. Final report, Task 47

    International Nuclear Information System (INIS)

    Marr, D.R.; Prael, R.E.; Adams, K.J.

    1996-10-01

    This is notification of the completion of Task 47 and a summary of the fulfillment of the requirements thereof. Deliverables for Task 47 include the data test files and a final report. The test files have been delivered to the customer and the attached paper satisfies the requirements for a final report. Details on the completion of each of the subtasks described in the Statement of Work follow. The author repeats the complete list of subtasks for Task 47: (1) The software engineer will modify the ONEBFP code to generate a logarithmic distribution of discrete angles and an associated set of quadrature weights; (2) The software engineer will work with Group XTM personnel to obtain the required cross-section data for protons/nuclear cascade particles; and (3) The software engineer will perform 5 test calculations using the modified ONEBFP code to assess its accuracy and efficiency for proton transport problems. The test calculations will be documented in a brief report. Appendix C of the paper describes the quadrature set capability installed in the ONEBFP code pertinent to the fulfillment of subtask 1. A portion of the body of the paper describes the source and modeling, and Appendix A describes the extraction of the cross section data used in this study, fulfilling subtask 2. The bulk of the attached report describes the test problems, states the modeling used for each problem, shows the results in both graphical and tabular form, and discusses the implications of the results. This fulfills the requirements of subtask 3.
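
    The actual quadrature set is defined in Appendix C of the report; purely as an illustration of the idea in subtask 1, the sketch below builds a set of logarithmically spaced discrete angles, clustered toward the forward direction, with weights normalized to unit sum. The bin limits and weighting rule are assumptions, not those of ONEBFP.

        import numpy as np

        # Generic sketch, not the quadrature from Appendix C: discrete polar angles
        # crowded logarithmically toward the forward direction (useful for
        # forward-peaked scattering), each weighted by the angular bin it represents.

        def log_quadrature(n=8, theta_min=1e-3, theta_max=0.5):
            edges = np.logspace(np.log10(theta_min), np.log10(theta_max), n + 1)
            angles = np.sqrt(edges[:-1] * edges[1:])        # geometric bin centres
            weights = np.diff(edges)                        # bin widths as raw weights
            weights /= weights.sum()                        # normalise to unit sum
            return angles, weights

        angles, weights = log_quadrature()
        print(angles)          # radians, clustered toward small angles
        print(weights.sum())   # 1.0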

  17. Representational Machines

    DEFF Research Database (Denmark)

    Photography not only represents space. Space is produced photographically. Since its inception in the 19th century, photography has brought to light a vast array of represented subjects. Always situated in some spatial order, photographic representations have been operatively underpinned by social ... to the enterprises of the medium. This is the subject of Representational Machines: How photography enlists the workings of institutional technologies in search of establishing new iconic and social spaces. Together, the contributions to this edited volume span historical epochs, social environments, technological possibilities, and genre distinctions. Presenting several distinct ways of producing space photographically, this book opens a new and important field of inquiry for photography research.

  18. Shear machines

    International Nuclear Information System (INIS)

    Astill, M.; Sunderland, A.; Waine, M.G.

    1980-01-01

    A shear machine for irradiated nuclear fuel elements has a replaceable shear assembly comprising a fuel element support block, a shear blade support and a clamp assembly which hold the fuel element to be sheared in contact with the support block. A first clamp member contacts the fuel element remote from the shear blade and a second clamp member contacts the fuel element adjacent the shear blade and is advanced towards the support block during shearing to compensate for any compression of the fuel element caused by the shear blade (U.K.)

  19. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  20. Electricity of machine tool

    International Nuclear Information System (INIS)

    Gijeon media editorial department

    1977-10-01

    This book is divided into three parts. The first part deals with electrical machines for machine tools, covering topics ranging from generators to motors: the motor as the power source of the machine tool, and electrical devices for machine tools such as main-circuit switches, automatic machines, knife switches and push buttons, snap switches, protection devices, timers, solenoids, and rectifiers. The second part handles wiring diagrams, covering the basic electric circuits of machine tools and the wiring diagrams of machines such as milling machines, planers and grinding machines. The third part introduces fault diagnosis of machines, giving practical solutions according to the diagnosis and describing the diagnostic method of voltage and resistance measurement with a tester.

  1. Environmentally Friendly Machining

    CERN Document Server

    Dixit, U S; Davim, J Paulo

    2012-01-01

    Environment-Friendly Machining provides an in-depth overview of environmentally-friendly machining processes, covering numerous different types of machining in order to identify which practice is the most environmentally sustainable. The book discusses three systems at length: machining with minimal cutting fluid, air-cooled machining and dry machining. Also covered is a way to conserve energy during machining processes, along with useful data and detailed descriptions for developing and utilizing the most efficient modern machining tools. Researchers and engineers looking for sustainable machining solutions will find Environment-Friendly Machining to be a useful volume.

  2. Machine Protection

    CERN Document Server

    Schmidt, R

    2014-01-01

    The protection of accelerator equipment is as old as accelerator technology and was for many years related to high-power equipment. Examples are the protection of powering equipment from overheating (magnets, power converters, high-current cables), of superconducting magnets from damage after a quench and of klystrons. The protection of equipment from beam accidents is more recent. It is related to the increasing beam power of high-power proton accelerators such as ISIS, SNS, ESS and the PSI cyclotron, to the emission of synchrotron light by electron–positron accelerators and FELs, and to the increase of energy stored in the beam (in particular for hadron colliders such as LHC). Designing a machine protection system requires an excellent understanding of accelerator physics and operation to anticipate possible failures that could lead to damage. Machine protection includes beam and equipment monitoring, a system to safely stop beam operation (e.g. dumping the beam or stopping the beam at low energy) and an interlock system providing the glue between these systems.

  3. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the ``unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover we conjecture that the canonical partition satisfies such a hypothesis. Finally we consider also some relationships between coding partitions and varieties of codes.
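
    The canonical-partition algorithm itself is not reproduced here; as background, the sketch below implements the classical Sardinas-Patterson test for the Unique Decipherability property that coding partitions generalize, for a finite code given as a set of strings.

        # Sardinas-Patterson test for unique decipherability (UD) of a finite code;
        # this is the UD notion the coding-partition concept generalises, not the
        # canonical-partition algorithm from the paper itself.

        def dangling_suffixes(code, suffixes):
            out = set()
            for c in code:
                for s in suffixes:
                    if c != s:
                        if c.startswith(s):
                            out.add(c[len(s):])
                        if s.startswith(c):
                            out.add(s[len(c):])
            return out

        def is_uniquely_decipherable(code):
            code = set(code)
            seen = set()
            current = dangling_suffixes(code, code)
            while current and not (current & code) and current - seen:
                seen |= current
                current = dangling_suffixes(code, current)
            return not (current & code)

        print(is_uniquely_decipherable({"0", "01", "11"}))    # True  (UD)
        print(is_uniquely_decipherable({"0", "01", "10"}))    # False ("010" is ambiguous)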

  4. Rain VM: Portable Concurrency through Managing Code

    OpenAIRE

    Brown, Neil C.C.

    2006-01-01

    A long-running recent trend in computer programming is the growth in popularity of virtual machines. However, few have included good support for concurrency - a natural mechanism in the Rain programming language. This paper details the design and implementation of a secure virtual machine with support for concurrency, which enables portability of concurrent programs. Possible implementation ideas of many-to-many threading models for the virtual machine kernel are discussed, and initial benchmarks ...

  5. Object-Oriented Parallel Particle-in-Cell Code for Beam Dynamics Simulation in Linear Accelerators

    International Nuclear Information System (INIS)

    Qiang, J.; Ryne, R.D.; Habib, S.; Decyk, V.

    1999-01-01

    In this paper, we present an object-oriented three-dimensional parallel particle-in-cell code for beam dynamics simulation in linear accelerators. A two-dimensional parallel domain decomposition approach is employed within a message passing programming paradigm along with dynamic load balancing. Implementing object-oriented software design provides the code with better maintainability, reusability, and extensibility compared with conventional structure-based code. This also helps to encapsulate the details of communication syntax. Performance tests on SGI/Cray T3E-900 and SGI Origin 2000 machines show good scalability of the object-oriented code. Some important features of this code also include employing symplectic integration with linear maps of external focusing elements and using z as the independent variable, as is typical in accelerators. A successful application was the simulation of beam transport through three superconducting sections in the APT linac design.
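
    None of the object-oriented, parallel machinery is reproduced here; the following toy, serial, one-dimensional electrostatic particle-in-cell loop only illustrates the basic cycle (charge deposition, field solve, field gather, particle push) that such codes parallelize. The normalization, grid size and particle loading are arbitrary assumptions.

        import numpy as np

        # Toy 1D electrostatic PIC cycle (serial, periodic). Normalised units:
        # eps0 = 1, electron charge-to-mass ratio q/m = -1, uniform ion background.

        ng, npart, L, dt, steps = 64, 10000, 2 * np.pi, 0.05, 200
        dx = L / ng
        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, L, npart)            # electron macroparticle positions
        v = rng.normal(0.0, 0.05, npart)          # velocities
        qp = -L / npart                           # macroparticle charge (mean density -1)

        k = 2.0 * np.pi * np.fft.fftfreq(ng, d=dx)
        k[0] = 1.0                                # dummy value; the k=0 mode is zeroed below

        for step in range(steps):
            cells = np.floor(x / dx).astype(int) % ng
            rho = np.bincount(cells, minlength=ng) * qp / dx + 1.0   # deposit + ion background
            rho_k = np.fft.fft(rho)
            rho_k[0] = 0.0                        # neutral plasma: remove the mean
            E = np.fft.ifft(-1j * rho_k / k).real # Gauss's law in Fourier space: E_k = -i rho_k / k
            v += -E[cells] * dt                   # gather field, push: a = (q/m) E = -E
            x = (x + v * dt) % L                  # advance positions, periodic wrap

        print("kinetic energy:", 0.5 * np.mean(v**2))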

  6. Machine learning an artificial intelligence approach

    CERN Document Server

    Banerjee, R; Bradshaw, Gary; Carbonell, Jaime Guillermo; Mitchell, Tom Michael; Michalski, Ryszard Spencer

    1983-01-01

    Machine Learning: An Artificial Intelligence Approach contains tutorial overviews and research papers representative of trends in the area of machine learning as viewed from an artificial intelligence perspective. The book is organized into six parts. Part I provides an overview of machine learning and explains why machines should learn. Part II covers important issues affecting the design of learning programs-particularly programs that learn from examples. It also describes inductive learning systems. Part III deals with learning by analogy, by experimentation, and from experience. Parts IV a

  7. FRAC (failure rate analysis code): a computer program for analysis of variance of failure rates. An application user's guide

    International Nuclear Information System (INIS)

    Martz, H.F.; Beckman, R.J.; McInteer, C.R.

    1982-03-01

    Probabilistic risk assessments (PRAs) require estimates of the failure rates of various components whose failure modes appear in the event and fault trees used to quantify accident sequences. Several reliability data bases have been designed for use in providing the necessary reliability data to be used in constructing these estimates. In the nuclear industry, the Nuclear Plant Reliability Data System (NPRDS) and the In-Plant Reliability Data System (IRPDS), among others, were designed for this purpose. An important characteristic of such data bases is the selection and identification of numerous factors used to classify each component that is reported and the subsequent failures of each component. However, the presence of such factors often complicates the analysis of reliability data in the sense that it is inappropriate to group (that is, pool) data for those combinations of factors that yield significantly different failure rate values. These types of data can be analyzed by analysis of variance. FRAC (Failure Rate Analysis Code) is a computer code that performs an analysis of variance of failure rates. In addition, FRAC provides failure rate estimates

  8. TURING MACHINE AS UNIVERSAL ALGORITHM EXECUTOR AND ITS APPLICATION IN THE PROCESS OF HIGH-SCHOOL STUDENTS' ADVANCED STUDY OF ALGORITHMIZATION AND PROGRAMMING FUNDAMENTALS

    Directory of Open Access Journals (Sweden)

    Oleksandr B. Yashchyk

    2016-05-01

    Full Text Available The article discusses the importance of studying the notion of algorithm and its formal specification using Turing machines. It identifies Turing's basic hypothesis of the theory of algorithms, reviews the research of modern scientists devoted to this issue, and presents the main principles of the Turing machine as an abstract mathematical model. The process of forming components of information competence, information culture, and students' logical thinking through the inclusion of the topic “Study and Application of Turing machine as Universal Algorithm Executor” in the Informatics course is analyzed.
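
    In the classroom spirit of the article (though not code from it), a Turing machine executor can be written in a few lines; the example machine below simply flips every bit of its input and halts at the first blank.

        # Minimal Turing machine executor; the rule table and example are invented
        # for illustration. transitions: (state, symbol) -> (new state, write, move)
        RULES = {
            ("scan", "0"): ("scan", "1", +1),
            ("scan", "1"): ("scan", "0", +1),
            ("scan", "_"): ("halt", "_", 0),     # blank symbol: stop
        }

        def run(tape, state="scan", head=0):
            tape = list(tape)
            while state != "halt":
                symbol = tape[head] if head < len(tape) else "_"
                state, write, move = RULES[(state, symbol)]
                if head < len(tape):
                    tape[head] = write
                head += move
            return "".join(tape)

        print(run("10110"))   # -> 01001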

  9. Analysis of machining and machine tools

    CERN Document Server

    Liang, Steven Y

    2016-01-01

    This book delivers the fundamental science and mechanics of machining and machine tools by presenting systematic and quantitative knowledge in the form of process mechanics and physics. It gives readers a solid command of machining science and engineering, and familiarizes them with the geometry and functionality requirements of creating parts and components in today’s markets. The authors address traditional machining topics such as single- and multiple-point cutting processes, grinding, component accuracy and metrology, shear stress in cutting, cutting temperature and analysis, and chatter. They also address non-traditional machining, such as electrical discharge machining, electrochemical machining, and laser and electron beam machining. A chapter on biomedical machining is also included. This book is appropriate for advanced undergraduate and graduate mechanical engineering students, manufacturing engineers, and researchers. Each chapter contains examples, exercises and their solutions, and homework problems that reinforce ...

  10. Machine Protection

    International Nuclear Information System (INIS)

    Schmidt, R

    2014-01-01

    The protection of accelerator equipment is as old as accelerator technology and was for many years related to high-power equipment. Examples are the protection of powering equipment from overheating (magnets, power converters, high-current cables), of superconducting magnets from damage after a quench and of klystrons. The protection of equipment from beam accidents is more recent. It is related to the increasing beam power of high-power proton accelerators such as ISIS, SNS, ESS and the PSI cyclotron, to the emission of synchrotron light by electron–positron accelerators and FELs, and to the increase of energy stored in the beam (in particular for hadron colliders such as LHC). Designing a machine protection system requires an excellent understanding of accelerator physics and operation to anticipate possible failures that could lead to damage. Machine protection includes beam and equipment monitoring, a system to safely stop beam operation (e.g. dumping the beam or stopping the beam at low energy) and an interlock system providing the glue between these systems. The most recent accelerator, the LHC, will operate with about 3 × 10^14 protons per beam, corresponding to an energy stored in each beam of 360 MJ. This energy can cause massive damage to accelerator equipment in case of uncontrolled beam loss, and a single accident damaging vital parts of the accelerator could interrupt operation for years. This article provides an overview of the requirements for protection of accelerator equipment and introduces the various protection systems. Examples are mainly from LHC, SNS and ESS.

  11. An abstract model of rogue code insertion into radio frequency wireless networks. The effects of computer viruses on the Program Management Office

    Science.gov (United States)

    Feudo, Christopher V.

    1994-04-01

    This dissertation demonstrates that inadequately protected wireless LANs are more vulnerable to rogue program attack than traditional LANs. Wireless LANs not only run the same risks as traditional LANs, but they also run additional risks associated with an open transmission medium. Intruders can scan radio waves and, given enough time and resources, intercept, analyze, decipher, and reinsert data into the transmission medium. This dissertation describes the development and instantiation of an abstract model of the rogue code insertion process into a DOS-based wireless communications system using radio frequency (RF) atmospheric signal transmission. The model is general enough to be applied to widely used target environments such as UNIX, Macintosh, and DOS operating systems. The methodology and three modules, the prober, activator, and trigger modules, to generate rogue code and insert it into a wireless LAN were developed to illustrate the efficacy of the model. Also incorporated into the model are defense measures against remotely introduced rogue programs and a cost-benefit analysis that determined that such defenses for a specific environment were cost-justified.

  12. Metallizing of machinable glass ceramic

    International Nuclear Information System (INIS)

    Seigal, P.K.

    1976-02-01

    A satisfactory technique has been developed for metallizing Corning (Code 9658) machinable glass ceramic for brazing. Analyses of several bonding materials suitable for metallizing were made using microprobe analysis, optical metallography, and tensile strength tests. The effect of different cleaning techniques on the microstructure and the effect of various firing temperatures on the bonding interface were also investigated. A nickel paste, used for thick-film application, has been applied to obtain braze joints with strength in excess of 2000 psi

  13. Study of on-machine error identification and compensation methods for micro machine tools

    International Nuclear Information System (INIS)

    Wang, Shih-Ming; Yu, Han-Jen; Lee, Chun-Yi; Chiu, Hung-Sheng

    2016-01-01

    Micro machining plays an important role in the manufacturing of miniature products which are made of various materials with complex 3D shapes and tight machining tolerances. To further improve the accuracy of a micro machining process without increasing the manufacturing cost of a micro machine tool, an effective machining error measurement method and a software-based compensation method are essential. To avoid introducing additional errors caused by the re-installation of the workpiece, the measurement and compensation method should be conducted on-machine. In addition, because the contour of a miniature workpiece machined with a micro machining process is very tiny, the measurement method should be non-contact. By integrating an image reconstruction method, camera pixel correction, coordinate transformation, an error identification algorithm, and a trajectory auto-correction method, a vision-based error measurement and compensation method was developed in this study that can inspect micro machining errors on-machine and automatically generate an error-corrected numerical control (NC) program for error compensation. With the use of the Canny edge detection algorithm and camera pixel calibration, the edges of the contour of a machined workpiece were identified and used to reconstruct the actual contour of the workpiece. The actual contour was then mapped to the theoretical contour to identify the actual cutting points and compute the machining errors. With the use of a moving matching window and calculation of the similarity between the actual and theoretical contours, the errors between the actual cutting points and theoretical cutting points were calculated and used to correct the NC program. With the use of the error-corrected NC program, the accuracy of a micro machining process can be effectively improved. To prove the feasibility and effectiveness of the proposed methods, micro-milling experiments on a micro machine tool were conducted, and the results
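
    As a sketch of just the edge-detection and contour-extraction step described above, the snippet below uses OpenCV's Canny detector and contour finder; the camera calibration, error-identification algorithm and NC-program correction are not reproduced, and the file name, threshold values and mm-per-pixel factor are placeholder assumptions.

        import cv2
        import numpy as np

        MM_PER_PIXEL = 0.005                              # assumed calibration factor

        img = cv2.imread("workpiece.png", cv2.IMREAD_GRAYSCALE)   # placeholder file name
        if img is None:
            raise SystemExit("provide a grayscale photograph of the machined part")

        blurred = cv2.GaussianBlur(img, (5, 5), 0)        # suppress sensor noise
        edges = cv2.Canny(blurred, 50, 150)               # binary edge map

        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        contour = max(contours, key=cv2.contourArea)      # keep the dominant contour

        # Convert pixel coordinates of the machined contour to millimetres so they
        # can later be compared against the theoretical tool path.
        points_mm = contour.reshape(-1, 2).astype(np.float32) * MM_PER_PIXEL
        print(points_mm[:5])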

  14. Machine terms dictionary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1979-04-15

    This book gives descriptions of machine terms, covering machine design, drawing, machining methods, machine tools, machine materials, automobiles, measurement and control, electricity, basic electronics, information technology, quality assurance, AutoCAD and FA terms, and important formulas of mechanical engineering.

  15. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code, and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  16. The Design and Realization of Virtual Machine of Embedded Soft PLC Running System

    Directory of Open Access Journals (Sweden)

    Qingzhao Zeng

    2014-11-01

    Full Text Available Currently, soft PLC is a focus of study in many countries. A soft PLC system consists of a developing system and a running system. The virtual machine is an important part of the running system and indeed of the whole soft PLC system: it interprets and executes the intermediate code generated by the developing system and updates the I/O status of the PLC in order to carry out its control function. This paper introduces the implementation scheme and execution process of the virtual machine of an embedded soft PLC running system, and mainly its software implementation, including the realization of the input sampling program, the instruction execution program, and the output refresh program. Besides, an operation code matching method is put forward for the instruction execution program design. Finally, a test was carried out with the PowerPC P1010 (Freescale) as the hardware platform and VxWorks as the operating system; the results show the accuracy, real-time performance and reliability of the virtual machine.
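
    A toy sketch of the instruction-execution phase may help: an operation-code matching loop dispatches over intermediate code and updates the output image. The opcodes, ladder-logic program and I/O images below are illustrative stand-ins, not the instruction set described in the paper.

        # Toy dispatch loop for a soft PLC virtual machine's instruction execution phase.
        inputs = {"X0": True, "X1": False}     # input image (filled by the sampling phase)
        outputs = {"Y0": False}                # output image (written by the refresh phase)

        # intermediate code: (opcode, operand) pairs for "Y0 := X0 AND NOT X1"
        PROGRAM = [("LD", "X0"), ("ANDN", "X1"), ("OUT", "Y0")]

        def execute(program):
            acc = False                        # accumulator ("result of logic operation")
            for opcode, operand in program:
                if opcode == "LD":             # load an input into the accumulator
                    acc = inputs[operand]
                elif opcode == "ANDN":         # AND with the negated input
                    acc = acc and not inputs[operand]
                elif opcode == "OUT":          # store the result in the output image
                    outputs[operand] = acc
                else:
                    raise ValueError(f"unknown opcode {opcode}")

        execute(PROGRAM)
        print(outputs)                         # {'Y0': True}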

  17. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
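
    The paper supplies sample code for R packages; a rough Python analogue of the probability-machine idea (an assumption of this sketch, using scikit-learn rather than the paper's R implementations) is to regress the 0/1 response with a random forest, so that the forest's prediction estimates P(Y = 1 | x) directly. The synthetic data below is purely illustrative.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 4))
        p_true = 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] - X[:, 1])))   # true P(Y=1|x)
        y = rng.binomial(1, p_true)                                  # binary outcomes

        X_tr, X_te, y_tr, y_te, p_tr, p_te = train_test_split(
            X, y, p_true, test_size=0.25, random_state=0)

        forest = RandomForestRegressor(n_estimators=300, random_state=0)
        forest.fit(X_tr, y_tr)                      # regression on the 0/1 labels
        p_hat = forest.predict(X_te)                # individual probability estimates

        print("mean abs. error vs. true probabilities:",
              np.abs(p_hat - p_te).mean())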

  18. Analogy Mapping Development for Learning Programming

    Science.gov (United States)

    Sukamto, R. A.; Prabawa, H. W.; Kurniawati, S.

    2017-02-01

    Programming skill is an important skill for computer science students, whereas nowadays many computer science students in Indonesia lack programming skills and information technology knowledge. This is at odds with the implementation of the ASEAN Economic Community (AEC) since the end of 2015, which requires qualified workers. This study makes an effort to build programming skills by mapping program code to visual analogies as learning media. The developed media was based on state machine and compiler principles and was implemented for the C programming language. The states of every basic construct in programming were successfully determined as analogy visualizations.

  19. FCG: a code generator for lazy functional languages

    NARCIS (Netherlands)

    Kastens, U.; Langendoen, K.G.; Hartel, Pieter H.; Pfahler, P.

    1992-01-01

    The FCG code generator produces portable code that supports efficient two-space copying garbage collection. The code generator transforms the output of the FAST compiler front end into an abstract machine code. This code explicitly uses a call stack, which is accessible to the garbage collector. In

  20. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowledge ...
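
    In the spirit of the book (though not an excerpt from it), the following minimal scikit-learn example uses cross-validation to compare polynomial models of increasing complexity on noisy synthetic data, which is one common way the bias/variance trade-off is made concrete.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        x = np.sort(rng.uniform(-1, 1, 60))
        y = np.sin(3 * x) + rng.normal(0, 0.2, x.size)    # noisy synthetic target

        for degree in (1, 3, 9, 15):
            model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
            scores = cross_val_score(model, x.reshape(-1, 1), y,
                                     cv=5, scoring="neg_mean_squared_error")
            # low degree underfits (bias), high degree overfits (variance)
            print(f"degree {degree:2d}: CV MSE = {-scores.mean():.3f}")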

  1. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample given, followed by a discussion of the present status and future development plans

  2. Machine learning topological states

    Science.gov (United States)

    Deng, Dong-Ling; Li, Xiaopeng; Das Sarma, S.

    2017-11-01

    Artificial neural networks and machine learning have now reached a new era after several decades of improvement where applications are to explode in many fields of science, industry, and technology. Here, we use artificial neural networks to study an intriguing phenomenon in quantum physics—the topological phases of matter. We find that certain topological states, either symmetry-protected or with intrinsic topological order, can be represented with classical artificial neural networks. This is demonstrated by using three concrete spin systems, the one-dimensional (1D) symmetry-protected topological cluster state and the 2D and 3D toric code states with intrinsic topological orders. For all three cases, we show rigorously that the topological ground states can be represented by short-range neural networks in an exact and efficient fashion—the required number of hidden neurons is as small as the number of physical spins and the number of parameters scales only linearly with the system size. For the 2D toric-code model, we find that the proposed short-range neural networks can describe the excited states with Abelian anyons and their nontrivial mutual statistics as well. In addition, by using reinforcement learning we show that neural networks are capable of finding the topological ground states of nonintegrable Hamiltonians with strong interactions and studying their topological phase transitions. Our results demonstrate explicitly the exceptional power of neural networks in describing topological quantum states, and at the same time provide valuable guidance to machine learning of topological phases in generic lattice models.

  3. Addiction Machines

    Directory of Open Access Journals (Sweden)

    James Godley

    2011-10-01

    Full Text Available Entry into the crypt William Burroughs shared with his mother opened and shut around a failed re-enactment of William Tell’s shot through the prop placed upon a loved one’s head. The accidental killing of his wife Joan completed the installation of the addictation machine that spun melancholia as manic dissemination. An early encryptment to which was added the audio portion of abuse deposited an undeliverable message in WB. William could never tell, although his corpus bears the inscription of this impossibility as another form of possibility. James Godley is currently a doctoral candidate in English at SUNY Buffalo, where he studies psychoanalysis, Continental philosophy, and nineteenth-century literature and poetry (British and American). His work on the concept of mourning and “the dead” in Freudian and Lacanian approaches to psychoanalytic thought and in Gothic literature has also spawned an essay on zombie porn. Since entering the Academy of Fine Arts Karlsruhe in 2007, Valentin Hennig has studied in the classes of Silvia Bächli, Claudio Moser, and Corinne Wasmuht. In 2010 he spent a semester at the Dresden Academy of Fine Arts. His work has been shown in group exhibitions in Freiburg and Karlsruhe.

  4. TVF-NMCRC-A powerful program for writing and executing simulation inputs for the FLUKA Monte Carlo Code system

    International Nuclear Information System (INIS)

    Mark, S.; Khomchenko, S.; Shifrin, M.; Haviv, Y.; Schwartz, J.R.; Orion, I.

    2007-01-01

    We at the Negev Monte Carlo Research Center (NMCRC) have developed a powerful new interface for writing and executing FLUKA input files-TVF-NMCRC. With the TVF tool a FLUKA user has the ability to easily write an input file without requiring any previous experience. The TVF-NMCRC tool is a LINUX program that has been verified for the most common LINUX-based operating systems, and is suitable for the latest version of FLUKA (FLUKA 2006.3)

  5. Technical Note: Defining cyclotron-based clinical scanning proton machines in a FLUKA Monte Carlo system.

    Science.gov (United States)

    Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank

    2018-02-01

    Cyclotron-based pencil beam scanning (PBS) proton machines represent nowadays the majority and most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than that of passively scattered proton systems or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum energy to the desired energy, resulting in a unique spot size, divergence, and energy spread depending on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the MC code FLUKA using the experimental commissioning data. The code was then validated using experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water show that the same doses are calculated by both programs inside the target areas, while penumbrae differences are found at the field edges. These differences are lower for the MC, with a γ(3%-3 mm) index never below 95%. Extensive explanations on how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with experimental data are lower for the MC than for the TPS, implying that the created FLUKA beam model is better able to describe the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  6. User's guide for FRMOD, a zero dimensional FRM burn code

    International Nuclear Information System (INIS)

    Driemeyer, D.; Miley, G.H.

    1979-01-01

    The zero-dimensional FRM plasma burn code, FRMOD is written in the FORTRAN language and is currently available on the Control Data Corporation (CDC) 7600 computer at the Magnetic Fusion Energy Computer Center (MFECC), sponsored by the US Department of Energy, in Livermore, CA. This guide assumes that the user is familiar with the system architecture and some of the utility programs available on the MFE-7600 machine, since online documentation is available for system routines through the use of the DOCUMENT utility. Users may therefore refer to it for answers to system related questions

  7. Aeroelastic code development activities in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Wright, A.D. [National Renewable Energy Lab., Golden, Colorado (United States)]

    1996-09-01

    Designing wind turbines to be fatigue resistant and to have long lifetimes at minimal cost is a major goal of the federal wind program and the wind industry in the United States. To achieve this goal, we must be able to predict critical loads for a wide variety of different wind turbines operating under extreme conditions. The codes used for wind turbine dynamic analysis must be able to analyze a wide range of different wind turbine configurations as well as rapidly predict the loads due to turbulent wind inflow with a minimal set of degrees of freedom. Code development activities in the US have taken a two-pronged approach in order to satisfy both of these criteria: (1) development of a multi-purpose code which can be used to analyze a wide variety of wind turbine configurations without having to develop new equations of motion with each configuration change, and (2) development of specialized codes with minimal sets of specific degrees of freedom for analysis of two- and three-bladed horizontal axis wind turbines and calculation of machine loads due to turbulent inflow. In the first method we have adapted a commercial multi-body dynamics simulation package for wind turbine analysis. In the second approach we are developing specialized codes with limited degrees of freedom, usually specified in the modal domain. This paper will summarize progress to date in the development, validation, and application of these codes. (au) 13 refs.

  8. Using Phun to Study ``Perpetual Motion'' Machines

    Science.gov (United States)

    Koreš, Jaroslav

    2012-05-01

    The concept of "perpetual motion" has a long history. The Indian astronomer and mathematician Bhaskara II (12th century) was the first person to describe a perpetual motion (PM) machine. An example of a 13th-century PM machine is shown in Fig. 1. Although the law of conservation of energy clearly implies the impossibility of PM construction, over the centuries numerous proposals for PM have been made, involving ever more elements of modern science in their construction. It is possible to test a variety of PM machines in the classroom using a program called Phun or its commercial version Algodoo. The programs are designed to simulate physical processes and we can easily simulate mechanical machines using them. They provide an intuitive graphical environment controlled with a mouse; a programming language is not needed. This paper describes simulations of four different (supposed) PM machines.

  9. Trunnion Collar Removal Machine - Gap Analysis Table

    International Nuclear Information System (INIS)

    Johnson, M.

    2005-01-01

    The purpose of this document is to review the existing trunnion collar removal machine against the ''Nuclear Safety Design Bases for License Application'' (NSDB) [Ref. 10] requirements and to identify codes and standards and supplemental requirements needed to meet these requirements. If these codes and standards cannot fully meet these requirements, then a ''gap'' is identified. These gaps are identified here and addressed using the ''Trunnion Collar Removal Machine Design Development Plan'' [Ref. 15]. The codes and standards, supplemental requirements, and design development requirements for the trunnion collar removal machine are provided in the gap analysis table (Appendix A, Table 1). Because the trunnion collar removal machine is credited with performing functions important to safety (ITS) in the NSDB [Ref. 10], design basis requirements are applicable to ensure that equipment is available and performs its required safety functions when needed. The gap analysis table is used to identify design objectives and provide a means to satisfy safety requirements. To ensure that the trunnion collar removal machine performs its required safety functions and meets performance criteria, this portion of the gap analysis table supplies codes and standards sections and supplemental requirements and identifies design development requirements, if needed.

  10. Operating System For Numerically Controlled Milling Machine

    Science.gov (United States)

    Ray, R. B.

    1992-01-01

    OPMILL program is operating system for Kearney and Trecker milling machine providing fast easy way to program manufacture of machine parts with IBM-compatible personal computer. Gives machinist "equation plotter" feature, which plots equations that define movements and converts equations to milling-machine-controlling program moving cutter along defined path. System includes tool-manager software handling up to 25 tools and automatically adjusts to account for each tool. Developed on IBM PS/2 computer running DOS 3.3 with 1 MB of random-access memory.
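
    A generic illustration of the "equation plotter" idea, not OPMILL itself: sample a user-supplied curve y = f(x) and emit straight-line milling moves. The G-code dialect, feed rate and units below are placeholder assumptions, and real cutter-path generation would also account for tool radius.

        import math

        def equation_to_moves(f, x_start, x_end, steps=20, feed=120.0):
            lines = ["G21 (millimetres)", "G90 (absolute coordinates)"]
            for i in range(steps + 1):
                x = x_start + (x_end - x_start) * i / steps
                y = f(x)
                word = "G00" if i == 0 else "G01"          # rapid to start, then feed
                lines.append(f"{word} X{x:.3f} Y{y:.3f} F{feed:.0f}")
            return lines

        # Example: cut along one arch of a sine wave, 50 mm long, 10 mm amplitude.
        for line in equation_to_moves(lambda x: 10.0 * math.sin(x * math.pi / 50.0),
                                      0.0, 50.0):
            print(line)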

  11. SSCTRK: A particle tracking code for the SSC

    International Nuclear Information System (INIS)

    Ritson, D.

    1990-07-01

    While many indirect methods are available to evaluate dynamic aperture, there appears at this time to be no reliable substitute for tracking particles through realistic machine lattices for a number of turns determined by the storage times. Machine lattices are generated by ''Monte Carlo'' techniques from the expected rms fabrication and survey errors. Any given generated machine can potentially be a lucky or unlucky fluctuation from the average. Therefore, simulation to serve as a predictor of future performance must be done for an ensemble of generated machines. Further, several amplitudes and momenta are necessary to predict machine performance. Thus, making Monte Carlo type simulations for the SSC requires very considerable computer resources. Hitherto, it has been assumed that this was not feasible, and alternative indirect methods have been proposed or tried to answer the problem. We reexamined the feasibility of using direct computation. Previous codes have represented lattices by a succession of thin elements separated by bend-drifts. With ''kick-drift'' configurations, tracking time is linear in the multipole order included, and the code is symplectic. Modern vector processors simultaneously handle a large number of cases in parallel. Combining the efficiencies of kick-drift tracking with vector processing, in fact, makes realistic Monte Carlo simulation entirely feasible. SSCTRK uses the above features. It is structured to have a very friendly interface, a very wide latitude of choice for cases to be run in parallel, and, by using pure FORTRAN 77, the ability to run interchangeably on a wide variety of computers. We describe in this paper the program structure, operational checks, and results achieved.
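
    The essence of kick-drift tracking can be sketched in one degree of freedom: a drift updates the position using only the momentum, and a thin-lens kick updates the momentum using only the position, so each step is exactly symplectic. The toy lattice below (one drift plus a thin quadrupole-and-sextupole kick) and its coefficients are invented for illustration and bear no relation to an SSC lattice or to SSCTRK's vectorised tracking.

        def track(x, px, turns, length=1.0, k1=1.2, k2=3.0):
            for turn in range(turns):
                x = x + length * px                   # drift: position update uses only px
                px = px - k1 * x - k2 * x * x         # thin-lens kick: quad + sextupole, uses only x
                if abs(x) > 10.0:                     # particle lost far outside the aperture
                    return turn, x, px
            return turns, x, px

        # Track a few initial amplitudes and watch which survive (a crude picture
        # of a dynamic aperture scan).
        for x0 in (0.05, 0.20, 0.50):
            survived, x, px = track(x0, 0.0, turns=10000)
            print(f"x0 = {x0:.2f}: survived {survived} turns")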

  12. Bacteriological quality of drinks from vending machines.

    Science.gov (United States)

    Hunter, P. R.; Burge, S. H.

    1986-01-01

    A survey on the bacteriological quality of both drinking water and flavoured drinks from coin-operated vending machines is reported. Forty-four per cent of 25 drinking water samples examined contained coliforms and 84% had viable counts of greater than 1000 organisms/ml at 30 degrees C. Thirty-one flavoured drinks were examined; 6% contained coliforms and 39% had total counts greater than 1000 organisms/ml. It is suggested that the D.H.S.S. code of practice on coin-operated vending machines is not being followed. It is also suggested that drinking water alone should not be dispensed from such machines. PMID:3794325

  13. Design Control Systems of Human Machine Interface in the NTVS-2894 Seat Grinder Machine to Increase the Productivity

    Science.gov (United States)

    Ardi, S.; Ardyansyah, D.

    2018-02-01

    In the manufacturing of automotive spare parts, increased vehicle sales have resulted in increased demand for production of the engine valves required by the customer. To meet customer demand, we carried out an improvement and overhaul of the NTVS-2894 seat grinder machine on a machining line. The NTVS-2894 seat grinder machine had suffered decreased productivity and increased trouble counts and downtime. The overhaul of the NTVS-2894 seat grinder machine therefore covers both the mechanics and the programs, including the design and manufacture of the HMI (Human Machine Interface) GP-4501T program, because prior to the overhaul the NTVS-2894 seat grinder machine did not have a backup of its HMI program. The goals of this design work are to improve production achievement and to allow an operator to operate the machine more easily and a technician to troubleshoot it more easily, thereby reducing downtime on the NTVS-2894 seat grinder machine. After the design, the HMI program was successfully rebuilt, machine productivity increased by 34.8%, and the number of troubles and the downtime decreased by 40%, from 3,160 minutes to 1,700 minutes. The implication of our design is that it makes it easier for the operator to operate the machine and for the technician to maintain it and troubleshoot machine problems.

  14. Implementing particle-in-cell plasma simulation code on the BBN TC2000

    International Nuclear Information System (INIS)

    Sturtevant, J.E.; Maccabe, A.B.

    1990-01-01

    The BBN TC2000 is a multiple instruction, multiple data (MIMD) machine that combines a physically distributed memory with a logically shared memory programming environment using the unique Butterfly switch. Particle-In-Cell (PIC) plasma simulations model the interaction of charged particles with electric and magnetic fields. This paper describes the implementation of both a 1-D electrostatic and a 2 1/2-D electromagnetic PIC (particle-in-cell) plasma simulation code on a BBN TC2000. Performance is compared to implementations of the same code on the shared memory Sequent Balance and distributed memory Intel iPSC hypercube

  15. User's manual for seismic analysis code 'SONATINA-2V'

    Energy Technology Data Exchange (ETDEWEB)

    Hanawa, Satoshi; Iyoku, Tatsuo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment]

    2001-08-01

    The seismic analysis code, SONATINA-2V, has been developed to analyze the behavior of the HTTR core graphite components under seismic excitation. The SONATINA-2V code is a two-dimensional computer program capable of analyzing the vertical arrangement of the HTTR graphite components, such as fuel blocks, replaceable reflector blocks, and permanent reflector blocks, as well as their restraint structures. In the analytical model, each block is treated as a rigid body and is restrained by dowel pins which restrict relative horizontal movement but allow vertical and rocking motions between upper and lower blocks. Moreover, the SONATINA-2V code is capable of analyzing the core vibration behavior under simultaneous excitations in the vertical and horizontal directions. The SONATINA-2V code is composed of the main program, a pre-processor for preparing the input data for SONATINA-2V, and a post-processor for data processing and producing graphics from the analytical results. Though the SONATINA-2V code was developed to run on the MSP computer system of the Japan Atomic Energy Research Institute (JAERI), that computer system was retired as computer technology progressed. Therefore, the analysis code was improved so that it can be run on JAERI's UNIX machine, the SR8000 computer system. The user's manual for the seismic analysis code, SONATINA-2V, including the pre- and post-processors, is given in the present report. (author)

  16. An abstract machine for module replacement

    OpenAIRE

    Walton, Chris; Kirli, Dilsun; Gilmore, Stephen

    1998-01-01

    In this paper we define an abstract machine model for the mλ typed intermediate language. This abstract machine is used to give a formal description of the operation of run-time module replacement from the programming language Dynamic ML. The essential technical device which we employ for module replacement is a modification of two-space copying garbage collection.

  17. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
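
    A generic sketch of stabilized linear inversion (Tikhonov regularization), in the spirit of the approach described above but not the INVERT code itself; the forward kernel, noise level and damping parameter below are toy assumptions rather than a gravity forward model.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 50
        xs = np.linspace(0.0, 1.0, n)

        # Toy linear forward model d = G m: each datum is a smoothed average of the model.
        G = np.exp(-((xs[:, None] - xs[None, :]) ** 2) / 0.01)
        G /= G.sum(axis=1, keepdims=True)

        m_true = np.where((xs > 0.4) & (xs < 0.6), 1.0, 0.0)   # "topography" anomaly
        d = G @ m_true + rng.normal(0.0, 0.01, n)              # noisy synthetic data

        # Stabilised least squares: minimise ||G m - d||^2 + alpha^2 ||m||^2
        alpha = 0.05
        m_est = np.linalg.solve(G.T @ G + alpha**2 * np.eye(n), G.T @ d)

        print("mean model misfit:", np.abs(m_est - m_true).mean())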

  18. ENERGY STAR Certified Vending Machines

    Science.gov (United States)

    Certified models meet all ENERGY STAR requirements as listed in the Version 3.0 ENERGY STAR Program Requirements for Refrigerated Beverage Vending Machines that are effective as of March 1, 2013. A detailed listing of key efficiency criteria is available at

  19. Machine intelligence and knowledge bases

    Energy Technology Data Exchange (ETDEWEB)

    Furukawa, K

    1981-09-01

    The basic functions necessary in machine intelligence are a knowledge base and a logic programming language, such as PROLOG, using deductive reasoning. Recently, inductive reasoning based on meta-knowledge and default reasoning have been developed. The creative thought model of Lenat is reviewed and the concept of knowledge engineering is introduced. 17 references.

  20. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    ... in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level ..., by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways and without arcane encodings of data and algorithm); it is also uniform: new “hardware” is not needed to solve new problems; and (last but not least) it is Turing complete...

  1. Evaluation of leak rate by EPRI code

    International Nuclear Information System (INIS)

    Isozaki, Toshikuni; Hashiguchi, Issei; Kato, Kiyoshi; Miyazono, Shohachiro

    1987-08-01

    Starting in 1987, research on the leak rate from a cracked pipe under BWR or PWR operating conditions is being carried out at the authors' laboratory. This report describes the results computed by EPRI's leak rate code, which was installed on the JAERI FACOM-M380 machine. Henry's critical flow model is used in this program. For the planning of the experimental research, the leak rate from a crack under BWR or PWR operating conditions is computed, varying the crack length 2c, the crack opening diameter COD, and the pipe diameter. The COD value at which the minimum detectable leak rate of 5 gpm is reached is 0.22 mm or 0.21 mm under the BWR or PWR condition, respectively, with 2c = 100 mm and 16B pipe geometry. The complete listings are given in the appendix. (author)

  2. Connection machine: a computer architecture based on cellular automata

    Energy Technology Data Exchange (ETDEWEB)

    Hillis, W D

    1984-01-01

    This paper describes the connection machine, a programmable computer based on cellular automata. The essential idea behind the connection machine is that a regular locally-connected cellular array can be made to behave as if the processing cells are connected into any desired topology. When the topology of the machine is chosen to match the topology of the application program, the result is a fast, powerful computing engine. The connection machine was originally designed to implement knowledge retrieval operations in artificial intelligence programs, but the hardware and the programming techniques are apparently applicable to a much larger class of problems. A machine with 100000 processing cells is currently being constructed. 27 references.

  3. Computer-modeling codes to improve exploration nuclear-logging methods. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Wilson, R.D.; Price, R.K.; Kosanke, K.L.

    1983-03-01

    As part of the Department of Energy's National Uranium Resource Evaluation (NURE) project's Technology Development effort, a number of computer codes and accompanying data bases were assembled for use in modeling responses of nuclear borehole logging sondes. The logging methods include fission neutron, active and passive gamma-ray, and gamma-gamma. These CDC-compatible computer codes and data bases are available on magnetic tape from the DOE Technical Library at its Grand Junction Area Office. Some of the computer codes are standard radiation-transport programs that have been available to the radiation shielding community for several years. Other codes were specifically written to model the response of borehole radiation detectors or are specialized borehole modeling versions of existing Monte Carlo transport programs. Results from several radiation modeling studies are available as two large data bases (neutron and gamma-ray). These data bases are accompanied by appropriate processing programs that permit the user to model a wide range of borehole and formation-parameter combinations for fission-neutron, neutron-activation, and gamma-gamma logs. The first part of this report consists of a brief abstract for each code or data base. The abstract gives the code name and title, short description, auxiliary requirements, typical running time (CDC 6600), and a list of references. The next section gives format specifications and/or a directory for the tapes. The final section of the report presents listings for programs used to convert data bases between machine floating-point and EBCDIC.

  4. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  5. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  6. Machine technology: a survey

    International Nuclear Information System (INIS)

    Barbier, M.M.

    1981-01-01

    An attempt was made to find existing machines that have been upgraded and that could be used for large-scale decontamination operations outdoors. Such machines exist in the building industry, the mining industry, and the road construction industry. The machines presented here come from the road construction industry. A review is given of the operations that can be performed with the machines available.

  7. Machine Shop Lathes.

    Science.gov (United States)

    Dunn, James

    This guide, the second in a series of five machine shop curriculum manuals, was designed for use in machine shop courses in Oklahoma. The purpose of the manual is to equip students with basic knowledge and skills that will enable them to enter the machine trade at the machine-operator level. The curriculum is designed so that it can be used in…

  8. Superconducting rotating machines

    International Nuclear Information System (INIS)

    Smith, J.L. Jr.; Kirtley, J.L. Jr.; Thullen, P.

    1975-01-01

    The opportunities and limitations of the applications of superconductors in rotating electric machines are given. The relevant properties of superconductors and the fundamental requirements for rotating electric machines are discussed. The current state-of-the-art of superconducting machines is reviewed. Key problems, future developments and the long range potential of superconducting machines are assessed

  9. Procedure and code for calculating black control rods taking into account epithermal absorption, code CAS-1; Postupak i program za proracun crnih kontrolnih sipki, uzimajuci u obzir i epitermalnu apsorpciju, CAS-1

    Energy Technology Data Exchange (ETDEWEB)

    Martinc, R; Trivunac, N; Zivkovic, Z [Boris Kidric Institute of nuclear sciences Vinca, Belgrade (Yugoslavia)

    1964-12-15

    This report describes the computer code CAS-1 and the calculation method and procedure applied for calculating black control rods, taking into account epithermal neutron absorption. Results obtained with the supercell method, applied to a regular lattice reflected in the multiplying medium, are included in this report together with the computer code manual.

  10. Machine-to-machine communications architectures, technology, standards, and applications

    CERN Document Server

    Misic, Vojislav B

    2014-01-01

    With the number of machine-to-machine (M2M)-enabled devices projected to reach 20 to 50 billion by 2020, there is a critical need to understand the demands imposed by such systems. Machine-to-Machine Communications: Architectures, Technology, Standards, and Applications offers rigorous treatment of the many facets of M2M communication, including its integration with current technology. Presenting the work of a different group of international experts in each chapter, the book begins by supplying an overview of M2M technology. It considers proposed standards, cutting-edge applications, architectures, and traffic modeling and includes case studies that highlight the differences between traditional and M2M communications technology. It details a practical scheme for forward error correction code design, investigates the effectiveness of the IEEE 802.15.4 low-data-rate wireless personal area network standard for use in M2M communications, and identifies algorithms that will ensure functionality, performance, reliability, ...
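
    Forward error correction of the kind mentioned above can be illustrated with the classic Hamming(7,4) code, which corrects any single-bit error in a 7-bit block. The Python sketch below is a generic textbook example, not the scheme proposed in the book.

```python
# Hamming(7,4): encode 4 data bits into 7 bits and correct any single-bit error.
# Codeword layout (1-indexed positions): p1 p2 d1 p3 d2 d3 d4.
def hamming74_encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]      # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]      # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]      # parity check over positions 4,5,6,7
    error_pos = s1 + 2 * s2 + 4 * s3    # 0 means no detected error
    if error_pos:
        c[error_pos - 1] ^= 1           # flip the erroneous bit
    return [c[2], c[4], c[5], c[6]]     # recover d1..d4

if __name__ == "__main__":
    word = [1, 0, 1, 1]
    code = hamming74_encode(word)
    code[5] ^= 1                        # inject a single-bit channel error
    assert hamming74_decode(code) == word
```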

  11. Functional Programming

    OpenAIRE

    Chitil, Olaf

    2009-01-01

    Functional programming is a programming paradigm like object-oriented programming and logic programming. Functional programming comprises both a specific programming style and a class of programming languages that encourage and support this programming style. Functional programming enables the programmer to describe an algorithm on a high-level, in terms of the problem domain, without having to deal with machine-related details. A program is constructed from functions that only map inputs to ...
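
    As a small illustration of the style described (a high-level description in terms of the problem domain, built from functions that only map inputs to outputs), here is a short sketch in Python; the word-frequency task itself is an arbitrary example chosen for this record, not taken from the cited work.

```python
# Functional style: pure functions composed together, no mutation of shared state.
from functools import reduce

def words(text):                       # map a string to a list of lowercase words
    return text.lower().split()

def tally(counts, word):               # map (old counts, word) to new counts
    return {**counts, word: counts.get(word, 0) + 1}

def word_frequencies(text):            # the whole computation is a composition
    return reduce(tally, words(text), {})

if __name__ == "__main__":
    print(word_frequencies("the quick brown fox jumps over the lazy dog"))
    # {'the': 2, 'quick': 1, ...}
```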

  12. Quality Management Program (QMP) report: A review of quality management programs developed in response to Title 10, Section 35.32 of the Code of Federal Regulations

    Energy Technology Data Exchange (ETDEWEB)

    Witte, M.C.

    1994-10-01

    In July of 1991, the Nuclear Regulatory Commission published a Final Rule in the Federal Register amending regulations governing medical therapeutic administrations of byproduct material and certain uses of radioactive sodium iodide. These amendments required implementation of a Quality Management Program (QMP) to provide high confidence that the byproduct material -- or radiation from byproduct material -- will be administered as directed by an authorized user physician. Herein, this rule is referred to as the QM rule. The Final Rule was published after two proposed rules had been published in the Federal Register.

  13. TRIPOLI 01, a three-dimensional polykinetic Monte Carlo program. Pt.1. Presentation of the TRIPOLI code

    International Nuclear Information System (INIS)

    Baur, A.; Bourdet, L.; Gonnord, J.; Nimal, J.C.; Vergnaud, T.

    1977-01-01

    TRIPOLI is a package of programs intended for solving polykinetic neutron transport problems in any three-dimensional geometry. It is written in FORTRAN for IBM computers and requires no more than 400 kilobytes of memory (buffers excluded). The Monte Carlo method is used. Particular emphasis is placed on reducing the computing time in two ways: weighting and smoothing techniques are used to treat strong attenuations at a reasonable cost in computer time, and quantities are pre-calculated to keep the simulation time to a minimum. TRIPOLI has been designed to solve a wide range of neutron propagation problems involving fast neutrons (calculation of radiation damage in materials, biological dose, or inelastic γ production) and slow neutrons (activation of mechanical structures, neutron flux on control chambers, or sources of capture γ radiation), near the cores (materials irradiation inside power or experimental reactors) or at large distances from the sources (activation of the secondary fluid or radiation streaming through the shields). Three new possibilities appear in TRIPOLI 2: calculations in unsteady (time-dependent) operation, point calculations of the reaction rates using the 'uncollided flux after the collision' method, and the FINE RESPONSE method as opposed to INTEGRAL RESPONSES.
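
    The weighting technique mentioned above for strong attenuations can be illustrated with implicit capture (survival biasing): instead of killing a particle when it is absorbed, its statistical weight is reduced at every collision, and very low weights are removed by Russian roulette. The slab-transmission sketch below is a generic textbook illustration in Python, not TRIPOLI itself; all cross-sections and dimensions are arbitrary assumptions.

```python
# Implicit capture in a 1-D monoenergetic slab-transmission problem: reduce
# particle weight at each collision instead of absorbing, with Russian
# roulette to terminate low-weight histories.
import math
import random

def transmission(n_hist, thickness=10.0, sigma_t=1.0, sigma_s=0.6,
                 w_cut=0.1, seed=1):
    rng = random.Random(seed)
    p_scatter = sigma_s / sigma_t
    score = 0.0
    for _ in range(n_hist):
        x, mu, w = 0.0, 1.0, 1.0                            # position, direction cosine, weight
        while True:
            x += mu * (-math.log(rng.random()) / sigma_t)   # sample distance to next collision
            if x >= thickness:
                score += w                                  # weighted tally of transmitted particles
                break
            if x < 0.0:
                break                                       # leaked back out of the slab
            w *= p_scatter                                  # implicit capture: reduce weight
            if w < w_cut:                                   # Russian roulette on small weights
                if rng.random() < 0.5:
                    break
                w *= 2.0
            mu = 2.0 * rng.random() - 1.0                   # isotropic scattering
    return score / n_hist

if __name__ == "__main__":
    print("estimated transmission:", transmission(100_000))
```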

  14. VOA: a 2-d plasma physics code

    International Nuclear Information System (INIS)

    Eltgroth, P.G.

    1975-12-01

    A 2-dimensional relativistic plasma physics code was written and tested. The non-thermal components of the particle distribution functions are represented by an expansion into moments in momentum space. These moments are computed directly from numerical equations. Currently three species are included: electrons, ions, and 'beam electrons'. The computer code runs on either the 7600 or STAR machines at LLL. Both the physics and the operation of the code are discussed.
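
    A moment representation replaces the full distribution function by a few of its integrals over momentum space (density, mean momentum, spread, and so on). The short NumPy sketch below shows only that bookkeeping and is unrelated to the actual VOA equations; the grid and the drifting Maxwellian test function are arbitrary assumptions.

```python
# Compute the first few momentum-space moments of a 1-D distribution f(p)
# sampled on a uniform grid, as a toy illustration of a moment representation.
import numpy as np

def moments(p, f):
    dp = p[1] - p[0]                                 # uniform momentum-grid spacing
    n = np.sum(f) * dp                               # zeroth moment: number density
    mean_p = np.sum(p * f) * dp / n                  # first moment: mean momentum
    var_p = np.sum((p - mean_p) ** 2 * f) * dp / n   # second central moment: spread
    return n, mean_p, var_p

if __name__ == "__main__":
    p = np.linspace(-5.0, 5.0, 1001)
    f = np.exp(-(p - 0.5) ** 2)                      # drifting Maxwellian-like test function
    print(moments(p, f))                             # roughly (sqrt(pi), 0.5, 0.5)
```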

  15. Parallel processing of Monte Carlo code MCNP for particle transport problem

    Energy Technology Data Exchange (ETDEWEB)

    Higuchi, Kenji; Kawasaki, Takuji

    1996-06-01

    It is possible to vectorize or parallelize Monte Carlo (MC) codes for photon and neutron transport problems by making use of the independence of the calculation for each particle. The applicability of existing MC codes to parallel processing is discussed. As for parallel computers, we have used both a vector-parallel processor and a scalar-parallel processor in the performance evaluation. We have carried out (i) vector-parallel processing of the MCNP code on the Monte Carlo machine Monte-4 with four vector processors, and (ii) parallel processing on the Paragon XP/S with 256 processors. In this report we describe the methodology and results for parallel processing on these two types of parallel or distributed-memory computers. In addition, we discuss the evaluation of parallel programming environments for the parallel computers used in the present work, carried out as part of the work developing the STA (Seamless Thinking Aid) Basic Software. (author)
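
    The independence of particle histories that makes MC codes parallelizable is easy to illustrate: each worker simulates its own batch of histories with its own random-number stream, and the partial tallies are summed at the end. The sketch below uses Python's multiprocessing rather than the vendor systems discussed in the report, and the toy pure-absorption problem is an arbitrary assumption.

```python
# Embarrassingly parallel Monte Carlo: each process tracks an independent
# batch of particle histories; partial tallies are combined afterwards.
import math
import random
from multiprocessing import Pool

def batch_transmission(args):
    n_hist, thickness, sigma_t, seed = args
    rng = random.Random(seed)                       # independent stream per worker
    transmitted = 0
    for _ in range(n_hist):
        path = -math.log(rng.random()) / sigma_t    # distance to first collision
        if path > thickness:
            transmitted += 1                        # toy model: any collision absorbs
    return transmitted

if __name__ == "__main__":
    n_workers, n_per_worker = 4, 250_000
    jobs = [(n_per_worker, 2.0, 1.0, seed) for seed in range(n_workers)]
    with Pool(n_workers) as pool:
        total = sum(pool.map(batch_transmission, jobs))
    print("transmission estimate:", total / (n_workers * n_per_worker))
    # analytic answer for this toy model is exp(-2), about 0.135
```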

  16. An Efficient Platform for the Automatic Extraction of Patterns in Native Code

    Directory of Open Access Journals (Sweden)

    Javier Escalada

    2017-01-01

    Different software tools, such as decompilers, code quality analyzers, recognizers of packed executable files, authorship analyzers, and malware detectors, search for patterns in binary code. The use of machine learning algorithms, trained with programs taken from the huge number of applications in the existing open source code repositories, allows finding patterns not detected with the manual approach. To this end, we have created a versatile platform for the automatic extraction of patterns from native code, capable of processing big binary files. Its implementation has been parallelized, providing important runtime performance benefits for multicore architectures. Compared to single-processor execution, the average speedup obtained with the best configuration is a factor of 3.5, against the maximum theoretical gain of a factor of 4.
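
    A simplified form of this kind of pattern extraction, counting byte n-grams over a set of binary files distributed across worker processes, can be sketched as follows. It is only a schematic stand-in for the platform described; the n-gram length, worker count, and example file paths are assumptions.

```python
# Parallel extraction of byte n-gram counts from binary files, a simplified
# stand-in for machine-learning feature extraction over native code.
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

def ngram_counts(path, n=3):
    with open(path, "rb") as fh:
        data = fh.read()
    return Counter(data[i:i + n] for i in range(len(data) - n + 1))

def extract_features(paths, workers=4):
    total = Counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for counts in pool.map(ngram_counts, paths):
            total.update(counts)                   # merge per-file partial counts
    return total

if __name__ == "__main__":
    # hypothetical sample binaries; the most common 3-grams become features
    features = extract_features(["/bin/ls", "/bin/cat"])
    print(features.most_common(20))
```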

  17. Optimizing fusion PIC code performance at scale on Cori Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    Koskela, T. S.; Deslippe, J.

    2017-07-23

    In this paper we present the results of optimizing the performance of the gyrokinetic full-f fusion PIC code XGC1 on the Cori Phase Two Knights Landing system. The code has undergone substantial development to enable the use of vector instructions in its most expensive kernels within the NERSC Exascale Science Applications Program. We study the single-node performance of the code on an absolute scale using the roofline methodology to guide optimization efforts. We have obtained a 2x speedup in single-node performance by enabling vectorization and performing memory-layout optimizations. On multiple nodes, the code is shown to scale well up to 4000 nodes, nearly half the size of the machine. We discuss some communication bottlenecks that were identified and resolved during the work.
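
    One memory-layout optimization typical of such work is storing particle data as a structure of arrays rather than an array of structures, so that the hot loops become unit-stride and vectorizable. The NumPy sketch below illustrates the layout idea only; it is not XGC1 code, and the trivial drift push and particle counts are assumptions.

```python
# Structure-of-arrays particle layout: each coordinate lives in its own
# contiguous array, so the push is a handful of unit-stride vector operations.
import numpy as np

class Particles:
    def __init__(self, n, seed=0):
        rng = np.random.default_rng(seed)
        self.x = rng.random(n)                  # positions, one contiguous array each
        self.y = rng.random(n)
        self.vx = rng.standard_normal(n)        # velocities
        self.vy = rng.standard_normal(n)

    def push(self, dt):
        # vectorized update of every particle at once
        self.x += self.vx * dt
        self.y += self.vy * dt

if __name__ == "__main__":
    p = Particles(1_000_000)
    for _ in range(10):
        p.push(1.0e-3)
    print(p.x[:3], p.y[:3])
```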

  18. PGHPF – An Optimizing High Performance Fortran Compiler for Distributed Memory Machines

    Directory of Open Access Journals (Sweden)

    Zeki Bozkus

    1997-01-01

    High Performance Fortran (HPF) is the first widely supported, efficient, and portable parallel programming language for shared and distributed memory systems. HPF is realized through a set of directive-based extensions to Fortran 90. It enables application developers and Fortran end-users to write compact, portable, and efficient software that will compile and execute on workstations, shared memory servers, clusters, traditional supercomputers, or massively parallel processors. This article describes a production-quality HPF compiler for a set of parallel machines. Compilation techniques such as data and computation distribution, communication generation, run-time support, and optimization issues are elaborated as the basis for an HPF compiler implementation on distributed memory machines. The performance of this compiler on benchmark programs demonstrates that high efficiency can be achieved executing HPF code on parallel architectures.
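
    The data-distribution analysis such a compiler performs can be illustrated with the arithmetic behind an HPF-style BLOCK distribution: each processor owns one contiguous block of the global array, and every global index maps to an (owner, local index) pair. The Python sketch below shows only that bookkeeping and is not the PGHPF implementation; the array size and processor count are arbitrary.

```python
# HPF-style BLOCK distribution bookkeeping: which processor owns a global
# index, and what the corresponding local index and owned range are.
import math

def block_size(n, p):
    return math.ceil(n / p)                      # elements per processor block

def owner_and_local(i, n, p):
    b = block_size(n, p)
    return i // b, i % b                         # (owner processor, local index)

def local_range(rank, n, p):
    b = block_size(n, p)
    lo = rank * b
    return lo, min(lo + b, n)                    # half-open global range owned by rank

if __name__ == "__main__":
    n, p = 10, 3                                 # 10 elements over 3 processors
    print([local_range(r, n, p) for r in range(p)])   # [(0, 4), (4, 8), (8, 10)]
    print(owner_and_local(5, n, p))                   # (1, 1)
```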

  19. Program Development Tools and Infrastructures

    International Nuclear Information System (INIS)

    Schulz, M.

    2012-01-01

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.
