WorldWideScience

Sample records for automated calculation program

  1. GoSam. A program for automated one-loop calculations

    Energy Technology Data Exchange (ETDEWEB)

    Cullen, G. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Greiner, N.; Heinrich, G.; Reiter, T. [Max-Planck-Institut fuer Physik, Muenchen (Germany); Luisoni, G. [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology; Mastrolia, P. [Max-Planck-Institut fuer Physik, Muenchen (Germany); Padua Univ. (Italy). Dipt. di Fisica; Ossola, G. [City Univ. of New York, NY (United States). New York City College of Technology; Tramontano, F. [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2011-11-15

    The program package GoSam is presented which aims at the automated calculation of one-loop amplitudes for multi-particle processes. The amplitudes are generated in terms of Feynman diagrams and can be reduced using either D-dimensional integrand-level decomposition or tensor reduction, or a combination of both. GoSam can be used to calculate one-loop corrections to both QCD and electroweak theory, and model files for theories Beyond the Standard Model can be linked as well. A standard interface to programs calculating real radiation is also included. The flexibility of the program is demonstrated by various examples. (orig.)
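
    At its core, the integrand-level decomposition mentioned above amounts to sampling the loop-momentum-dependent numerator at suitable points and solving a linear system for the coefficients of a predetermined basis; in the full algorithm the sample points are chosen where subsets of propagators vanish and the basis is the set of residues of scalar integrals. The toy sketch below (plain NumPy, not GoSam code, with an invented numerator) illustrates only that sample-and-solve step.

    ```python
    # Schematic illustration (not GoSam itself): the essence of integrand-level
    # reduction is to sample a loop-momentum-dependent numerator at chosen points
    # and solve a linear system for the coefficients of a known function basis.
    import numpy as np

    rng = np.random.default_rng(0)

    def numerator(q):
        # A stand-in "one-loop numerator" with hidden coefficients 3, -2, 0.5, 1.25.
        q1, q2 = q
        return 3.0 - 2.0 * q1 + 0.5 * q2 + 1.25 * q1 * q2

    basis = [lambda q: 1.0,
             lambda q: q[0],
             lambda q: q[1],
             lambda q: q[0] * q[1]]

    samples = rng.normal(size=(len(basis), 2))      # one sample point per unknown
    A = np.array([[b(q) for b in basis] for q in samples])
    y = np.array([numerator(q) for q in samples])

    coeffs = np.linalg.solve(A, y)                  # reconstructed coefficients
    print(coeffs)                                   # ~ [3.0, -2.0, 0.5, 1.25]
    ```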

  2. GoSam. A program for automated one-loop calculations

    International Nuclear Information System (INIS)

    Cullen, G.; Greiner, N.; Heinrich, G.; Reiter, T.; Luisoni, G.

    2011-11-01

    The program package GoSam is presented which aims at the automated calculation of one-loop amplitudes for multi-particle processes. The amplitudes are generated in terms of Feynman diagrams and can be reduced using either D-dimensional integrand-level decomposition or tensor reduction, or a combination of both. GoSam can be used to calculate one-loop corrections to both QCD and electroweak theory, and model files for theories Beyond the Standard Model can be linked as well. A standard interface to programs calculating real radiation is also included. The flexibility of the program is demonstrated by various examples. (orig.)

  3. GoSam: A program for automated one-loop calculations

    International Nuclear Information System (INIS)

    Cullen, G; Greiner, N; Heinrich, G; Mastrolia, P; Reiter, T; Luisoni, G; Ossola, G; Tramontano, F

    2012-01-01

    The program package GoSam is presented which aims at the automated calculation of one-loop amplitudes for multi-particle processes. The amplitudes are generated in terms of Feynman diagrams and can be reduced using either D-dimensional integrand-level decomposition or tensor reduction, or a combination of both. GoSam can be used to calculate one-loop corrections to both QCD and electroweak theory, and model files for theories Beyond the Standard Model can be linked as well. A standard interface to programs calculating real radiation is also included. The flexibility of the program is demonstrated by various examples.

  4. Automated one-loop calculations with GOSAM

    International Nuclear Information System (INIS)

    Cullen, Gavin; Greiner, Nicolas; Heinrich, Gudrun; Reiter, Thomas; Luisoni, Gionata

    2011-11-01

    We present the program package GoSam which is designed for the automated calculation of one-loop amplitudes for multi-particle processes in renormalisable quantum field theories. The amplitudes, which are generated in terms of Feynman diagrams, can be reduced using either D-dimensional integrand-level decomposition or tensor reduction. GoSam can be used to calculate one-loop QCD and/or electroweak corrections to Standard Model processes and offers the flexibility to link model files for theories Beyond the Standard Model. A standard interface to programs calculating real radiation is also implemented. We demonstrate the flexibility of the program by presenting examples of processes with up to six external legs attached to the loop. (orig.)

  5. Automated one-loop calculations with GOSAM

    Energy Technology Data Exchange (ETDEWEB)

    Cullen, Gavin [Edinburgh Univ. (United Kingdom). School of Physics and Astronomy; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Greiner, Nicolas [Illinois Univ., Urbana-Champaign, IL (United States). Dept. of Physics; Max-Planck-Institut fuer Physik, Muenchen (Germany); Heinrich, Gudrun; Reiter, Thomas [Max-Planck-Institut fuer Physik, Muenchen (Germany); Luisoni, Gionata [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology; Mastrolia, Pierpaolo [Max-Planck-Institut fuer Physik, Muenchen (Germany); Padua Univ. (Italy). Dipt. di Fisica; Ossola, Giovanni [New York City Univ., NY (United States). New York City College of Technology; New York City Univ., NY (United States). The Graduate School and University Center; Tramontano, Francesco [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2011-11-15

    We present the program package GoSam which is designed for the automated calculation of one-loop amplitudes for multi-particle processes in renormalisable quantum field theories. The amplitudes, which are generated in terms of Feynman diagrams, can be reduced using either D-dimensional integrand-level decomposition or tensor reduction. GoSam can be used to calculate one-loop QCD and/or electroweak corrections to Standard Model processes and offers the flexibility to link model files for theories Beyond the Standard Model. A standard interface to programs calculating real radiation is also implemented. We demonstrate the flexibility of the program by presenting examples of processes with up to six external legs attached to the loop. (orig.)

  6. Automated one-loop calculations with GoSam

    International Nuclear Information System (INIS)

    Cullen, Gavin; Greiner, Nicolas; Heinrich, Gudrun; Reiter, Thomas; Luisoni, Gionata; Mastrolia, Pierpaolo; Ossola, Giovanni; Tramontano, Francesco

    2012-01-01

    We present the program package GoSam which is designed for the automated calculation of one-loop amplitudes for multi-particle processes in renormalisable quantum field theories. The amplitudes, which are generated in terms of Feynman diagrams, can be reduced using either D-dimensional integrand-level decomposition or tensor reduction. GoSam can be used to calculate one-loop QCD and/or electroweak corrections to Standard Model processes and offers the flexibility to link model files for theories Beyond the Standard Model. A standard interface to programs calculating real radiation is also implemented. We demonstrate the flexibility of the program by presenting examples of processes with up to six external legs attached to the loop. (orig.)

  7. Automated One-Loop Calculations with GoSam

    CERN Document Server

    Cullen, Gavin; Heinrich, Gudrun; Luisoni, Gionata; Mastrolia, Pierpaolo; Ossola, Giovanni; Reiter, Thomas; Tramontano, Francesco

    2012-01-01

    We present the program package GoSam which is designed for the automated calculation of one-loop amplitudes for multi-particle processes in renormalisable quantum field theories. The amplitudes, which are generated in terms of Feynman diagrams, can be reduced using either D-dimensional integrand-level decomposition or tensor reduction. GoSam can be used to calculate one-loop QCD and/or electroweak corrections to Standard Model processes and offers the flexibility to link model files for theories Beyond the Standard Model. A standard interface to programs calculating real radiation is also implemented. We demonstrate the flexibility of the program by presenting examples of processes with up to six external legs attached to the loop.

  8. Automated atomic absorption spectrophotometer, utilizing a programmable desk calculator

    International Nuclear Information System (INIS)

    Futrell, T.L.; Morrow, R.W.

    1977-01-01

    A commercial, double-beam atomic absorption spectrophotometer has been interfaced with a sample changer and a Hewlett-Packard 9810A calculator to yield a completely automated analysis system. The interface electronics can be easily constructed and should be adaptable to any double-beam atomic absorption instrument. The calculator is easily programmed and can be used for general laboratory purposes when not operating the instrument. The automated system has been shown to perform very satisfactorily when operated unattended to analyze a large number of samples. Performance statistics agree well with a manually operated instrument

  9. Robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming.

    Science.gov (United States)

    Baran, Richard; Northen, Trent R

    2013-10-15

    Untargeted metabolite profiling using liquid chromatography and mass spectrometry coupled via electrospray ionization is a powerful tool for the discovery of novel natural products, metabolic capabilities, and biomarkers. However, the elucidation of the identities of uncharacterized metabolites from spectral features remains challenging. A critical step in the metabolite identification workflow is the assignment of redundant spectral features (adducts, fragments, multimers) and calculation of the underlying chemical formula. Inspection of the data by experts using computational tools solving partial problems (e.g., chemical formula calculation for individual ions) can be performed to disambiguate alternative solutions and provide reliable results. However, manual curation is tedious and not readily scalable or standardized. Here we describe an automated procedure for the robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming optimization (RAMSI). Chemical rules among related ions are expressed as linear constraints and both the spectra interpretation and chemical formula calculation are performed in a single optimization step. This approach is unbiased in that it does not require predefined sets of neutral losses and positive and negative polarity spectra can be combined in a single optimization. The procedure was evaluated with 30 experimental mass spectra and was found to effectively identify the protonated or deprotonated molecule ([M + H](+) or [M - H](-)) while being robust to the presence of background ions. RAMSI provides a much-needed standardized tool for interpreting ions for subsequent identification in untargeted metabolomics workflows.
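
    As a rough illustration of the constraint idea described above (not the published RAMSI formulation), the sketch below uses the PuLP package to assign hypothetical peaks to adduct types that must all be consistent with one shared neutral mass; the peak values, tolerance and big-M constant are assumptions.

    ```python
    # Minimal sketch: adduct assignment as a MILP in which binary variables pick
    # an ion type per peak and a shared continuous variable is the neutral mass M.
    from pulp import LpProblem, LpVariable, LpMaximize, LpBinary, lpSum

    peaks = [181.071, 203.053, 150.000]                  # observed m/z (hypothetical)
    adducts = [("[M+H]+", 1.00728), ("[M+Na]+", 22.98922)]
    tol, big = 0.01, 1000.0                              # mass tolerance and big-M

    prob = LpProblem("adduct_assignment", LpMaximize)
    M = LpVariable("neutral_mass", lowBound=0)
    x = {(i, j): LpVariable(f"x_{i}_{j}", cat=LpBinary)
         for i in range(len(peaks)) for j in range(len(adducts))}

    prob += lpSum(x.values())                            # explain as many peaks as possible
    for i, mz in enumerate(peaks):
        prob += lpSum(x[i, j] for j in range(len(adducts))) <= 1
        for j, (_, shift) in enumerate(adducts):
            # |mz - (M + shift)| <= tol whenever x[i, j] = 1 (relaxed otherwise)
            prob += mz - (M + shift) <= tol + big * (1 - x[i, j])
            prob += (M + shift) - mz <= tol + big * (1 - x[i, j])

    prob.solve()
    print("neutral mass ~", M.value())
    for (i, j), var in x.items():
        if var.value() > 0.5:
            print(f"peak {peaks[i]} assigned {adducts[j][0]}")
    ```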

  10. Automated [inservice testing] IST program

    International Nuclear Information System (INIS)

    Wright, W.M.

    1990-01-01

    There are two methods used to manage a Section XI program: manual and automated. The manual method usually consists of handwritten records of test results and scheduling requirements. This method, while initially lower in cost, results in problems later in the life of a plant as data continue to accumulate. Automation allows instant access to forty years of test results. Due to the lower cost and higher performance of today's personal computers, an automated method via a computer program provides an excellent means of managing the vast amount of data that accumulates over the forty-year life of a plant. Through the use of a computer, special functions involving these data are available which would not be practical with a manual method. This paper describes some of the advantages of using a computer program to manage the Section XI IST program. ISTBASE consists of program code and numerous databases. The source code is written and compiled in the CLIPPER (tm) language. Graphing routines are performed by the dGE (tm) graphics library, and graphs are displayed in EGA form. Since it was estimated that the total compiled code would exceed 640K of RAM, overlays, through the use of modular programming, were used to work within the DOS restriction of 640K RAM. The use of overlays still requires the user to gain access to ISTBASE through the PASSWORD module. The database files are designed to be compatible with the dBASE III+ (tm) data structure, which allows transfer of data between ISTBASE and other database managers/applications. A math co-processor is utilized to speed up calculations for graphs and other mathematical operations. Program code and data files require a hard disk drive with at least 28 MB capacity. While ISTBASE will execute on an 8088-based computer, an 80286 computer with a 12 MHz operating speed should be considered the minimum system configuration

  11. Automated one-loop calculations with GoSam

    International Nuclear Information System (INIS)

    Cullen, G.; Greiner, N.; Heinrich, G.; Reiter, T.; Luisoni, G.; Mastrolia, P.; Padua Univ.; Ossola, G.; Tramontano, F.

    2012-01-01

    In this talk, the program package GOSAM is presented which can be used for the automated calculation of one-loop amplitudes for multi-particle processes. The integrands are generated in terms of Feynman diagrams and can be reduced by d-dimensional integrand-level decomposition, or tensor reduction, or a combination of both. Through various examples we show that GOSAM can produce one-loop amplitudes for both QCD and electroweak theory; model files for theories Beyond the Standard Model can be linked as well. (orig.)

  12. Automated one-loop calculations with GoSam

    Energy Technology Data Exchange (ETDEWEB)

    Cullen, G. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Greiner, N.; Heinrich, G.; Reiter, T. [Max-Planck-Institut fuer Physik, Muenchen (Germany); Luisoni, G. [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology; Mastrolia, P. [Max-Planck-Institut fuer Physik, Muenchen (Germany); Padua Univ. (Italy). Dipt. di Fisica; Ossola, G. [City Univ. of New York, NY (United States). New York City College of Technology; Tramontano, F. [CERN, Geneva (Switzerland). AS Div.

    2012-01-15

    In this talk, the program package GOSAM is presented which can be used for the automated calculation of one-loop amplitudes for multi-particle processes. The integrands are generated in terms of Feynman diagrams and can be reduced by d-dimensional integrand-level decomposition, or tensor reduction, or a combination of both. Through various examples we show that GOSAM can produce one-loop amplitudes for both QCD and electroweak theory; model files for theories Beyond the Standard Model can be linked as well. (orig.)

  13. Program support of the automated system of planned calculations of the Oil and Gas Extracting Administration

    Energy Technology Data Exchange (ETDEWEB)

    Ashkinuze, V G; Reznikovskiy, P T

    1978-01-01

    An examination is made of the program support of the Automated System of Planned Calculations (ASPC) of the Oil and Gas Extracting Administration (OGEA). Specific requirements for the ASPC of the OGEA are indicated, along with features of its program realization. In developing the program support of the system, a parametric programming approach was used. A formal model of the ASPC of the OGEA, formulated in set-theoretic language, is described in detail. Sets with a tree structure are examined; they represent the production and administrative hierarchy of the planning objects in the oil region, with the top of the tree corresponding to the OGEA as a whole. In the simplest realization, the tree has two levels of hierarchy: association and field. A procedure for possible use of the system by planning workers is described in general terms. A plan of the program support for the ASPC of the OGEA is presented; owing to its specific nature, a large part of the programs realizing the system are written in assembler language.

  14. Prospective validation of a near real-time EHR-integrated automated SOFA score calculator.

    Science.gov (United States)

    Aakre, Christopher; Franco, Pablo Moreno; Ferreyra, Micaela; Kitson, Jaben; Li, Man; Herasevich, Vitaly

    2017-07-01

    We created an algorithm for automated Sequential Organ Failure Assessment (SOFA) score calculation within the Electronic Health Record (EHR) to facilitate detection of sepsis based on the Third International Consensus Definitions for Sepsis and Septic Shock (SEPSIS-3) clinical definition. We evaluated the accuracy of near real-time and daily automated SOFA score calculation compared with manual score calculation. Automated SOFA scoring computer programs were developed using available EHR data sources and integrated into a critical care focused patient care dashboard at Mayo Clinic in Rochester, Minnesota. We prospectively compared the accuracy of automated versus manual calculation for a sample of patients admitted to the medical intensive care unit at Mayo Clinic Hospitals in Rochester, Minnesota and Jacksonville, Florida. Agreement was calculated with Cohen's kappa statistic, and reasons for discrepancy were tabulated during manual review. Random spot-check comparisons were performed 134 times on 27 unique patients, and daily SOFA score comparisons were performed for 215 patients over a total of 1206 patient-days. Agreement between automatically scored and manually scored SOFA components was high for both random spot checks (696 pairs, κ=0.89) and daily calculation (5972 pairs, κ=0.89). The most common discrepancies were in the respiratory component (inaccurate fraction of inspired oxygen retrieval; 200/1206) and the creatinine component (normal creatinine in patients with no urine output on dialysis; 128/1094). Of 147 patients at risk of developing sepsis after intensive care unit admission, 10 later developed sepsis confirmed by chart review; all were identified before onset of sepsis by the ΔSOFA≥2 point criterion, with 46 false-positive patients. Near real-time automated SOFA scoring was found to have strong agreement with manual score calculation and may be useful for the detection of sepsis utilizing the new SEPSIS-3 definition. Copyright © 2017 Elsevier B.V. All rights reserved.
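
    For orientation, a hedged sketch of the kind of component scoring such a program automates is shown below. The cut-offs follow the standard published SOFA criteria in simplified form, but the function and variable names are invented; this does not reproduce the Mayo Clinic implementation and is not intended for clinical use.

    ```python
    # Illustrative sketch only (simplified thresholds, not the Mayo Clinic
    # algorithm, not for clinical use): two SOFA components computed from
    # values an EHR query might return.
    def sofa_coagulation(platelets_k_per_uL: float) -> int:
        """Platelet count in 10^3/uL -> SOFA coagulation sub-score."""
        for score, limit in ((4, 20), (3, 50), (2, 100), (1, 150)):
            if platelets_k_per_uL < limit:
                return score
        return 0

    def sofa_respiration(pao2_fio2: float, ventilated: bool) -> int:
        """PaO2/FiO2 ratio (mmHg) -> SOFA respiration sub-score."""
        if pao2_fio2 < 100 and ventilated:
            return 4
        if pao2_fio2 < 200 and ventilated:
            return 3
        if pao2_fio2 < 300:
            return 2
        if pao2_fio2 < 400:
            return 1
        return 0

    # Example: the SEPSIS-3 screen flags an acute rise of >= 2 points.
    baseline = sofa_coagulation(180) + sofa_respiration(420, False)   # 0
    current  = sofa_coagulation(90)  + sofa_respiration(180, True)    # 2 + 3 = 5
    print("delta-SOFA =", current - baseline)                         # 5 -> flag
    ```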

  15. Automated protein structure calculation from NMR data

    International Nuclear Information System (INIS)

    Williamson, Mike P.; Craven, C. Jeremy

    2009-01-01

    Current software is almost at the stage to permit completely automatic structure determination of small proteins of <15 kDa, from NMR spectra to structure validation with minimal user interaction. This goal is welcome, as it makes structure calculation more objective and therefore more easily validated, without any loss in the quality of the structures generated. Moreover, it releases expert spectroscopists to carry out research that cannot be automated. It should not take much further effort to extend automation to ca 20 kDa. However, there are technological barriers to further automation, of which the biggest are identified as: routines for peak picking; adoption and sharing of a common framework for structure calculation, including the assembly of an automated and trusted package for structure validation; and sample preparation, particularly for larger proteins. These barriers should be the main target for development of methodology for protein structure determination, particularly by structural genomics consortia

  16. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-08-29

    ... Modification of two National Customs Automation Program (NCAP) tests concerning Automated Commercial Environment (ACE) document imaging, known as the Document Image...

  17. Ultrasound automated volume calculation in reproduction and in pregnancy.

    Science.gov (United States)

    Ata, Baris; Tulandi, Togas

    2011-06-01

    To review studies assessing the application of ultrasound automated volume calculation in reproductive medicine. We performed a literature search using the keywords "SonoAVC, sonography-based automated volume calculation, automated ultrasound, 3D ultrasound, antral follicle, follicle volume, follicle monitoring, follicle tracking, in vitro fertilization, controlled ovarian hyperstimulation, embryo volume, embryonic volume, gestational sac, and fetal volume" and conducted the search in PubMed, Medline, EMBASE, and the Cochrane Database of Systematic Reviews. Reference lists of identified reports were manually searched for other relevant publications. Automated volume measurements are in very good agreement with actual volumes of the assessed structures or with other validated measurement methods. The technique seems to provide reliable and highly reproducible results under a variety of conditions. Automated measurements take less time than manual measurements. Ultrasound automated volume calculation is a promising new technology which is already used in daily practice especially for assisted reproduction. Improvements to the technology will undoubtedly render it more effective and increase its use. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  18. Aviation Safety/Automation Program Conference

    Science.gov (United States)

    Morello, Samuel A. (Compiler)

    1990-01-01

    The Aviation Safety/Automation Program Conference - 1989 was sponsored by the NASA Langley Research Center on 11 to 12 October 1989. The conference, held at the Sheraton Beach Inn and Conference Center, Virginia Beach, Virginia, was chaired by Samuel A. Morello. The primary objective of the conference was to ensure effective communication and technology transfer by providing a forum for technical interchange of current operational problems and program results to date. The Aviation Safety/Automation Program has as its primary goal to improve the safety of the national airspace system through the development and integration of human-centered automation technologies for aircraft crews and air traffic controllers.

  19. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Science.gov (United States)

    2013-11-04

    ... plan to both rename and modify the National Customs Automation Program (NCAP) test concerning Automated Commercial Environment (ACE) Cargo Release (formerly...)... data elements required to obtain release for cargo transported by air. The test will now be known as...

  20. BREED: a CDC-7600 computer program for the automation of breeder reactor design analysis (LWBR Development Program)

    International Nuclear Information System (INIS)

    Candelore, N.R.; Maher, C.M.

    1985-03-01

    BREED is an executive CDC-7600 program which was developed to facilitate the sequence of calculations and movement of data through a prescribed series of breeder reactor design computer programs in an uninterrupted single-job mode. It provides the capability to interface different application programs into a single computer run to provide a complete design function. The automation that can be achieved as a result of using BREED significantly reduces not only the time required for data preparation and hand transfer of data, but also the time required to complete an iteration of the total design effort. Data processing within a technical discipline and data transfer between technical disciplines can be accommodated. The input/output data processing is achieved with BREED by using a set of simple, easily understood user commands, usually short descriptive words, which the user inserts in his input deck. The input deck completely identifies and controls the calculational sequence needed to produce a desired end product. This report has been prepared to provide instructional material on the use of BREED and its user-oriented procedures to facilitate computer automation of design calculations
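
    A hedged sketch of the "executive program" pattern described above is given below; the command words, design-code names and file names are invented for illustration and are not BREED's actual input language.

    ```python
    # Hedged sketch of an executive driver: read short commands from an input
    # deck and run the corresponding design codes in sequence, handing each one
    # the previous step's output file. Names below are hypothetical.
    DECK = """
    NEUTRONICS  core.inp
    THERMAL     thermal.inp
    DEPLETION   depletion.inp
    """

    PROGRAMS = {"NEUTRONICS": "./neutronics", "THERMAL": "./thermal",
                "DEPLETION": "./depletion"}

    previous_output = None
    for line in DECK.strip().splitlines():
        command, deck_file = line.split()
        args = [PROGRAMS[command], deck_file]
        if previous_output:                    # hand over the previous step's data
            args.append(previous_output)
        # A real driver would launch the code, e.g. subprocess.run(args, check=True).
        print("would run:", " ".join(args))
        previous_output = f"{command.lower()}.out"
    ```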

  1. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2012-08-14

    DEPARTMENT OF HOMELAND SECURITY, U.S. Customs and Border Protection... National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE) Simplified Entry: modification of the test concerning the simplified entry functionality in the...

  2. Risk Assessment on the Transition Program for Air Traffic Control Automation System Upgrade

    Directory of Open Access Journals (Sweden)

    Li Dong Bin

    2016-01-01

    We analyzed the safety risks of the transition program for an Air Traffic Control (ATC) automation system upgrade using the event tree analysis method. We decomposed the process of the three transition phases and built the event trees corresponding to the three stages; we then determined the probability of success of each factor and calculated the probability of success of the ATC automation system upgrade transition. In conclusion, we assess the safety risk of the transition program according to these results.
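
    A minimal sketch of the event-tree arithmetic described above, assuming independent factors and using made-up phase and factor probabilities (the paper's actual event trees may include additional branches):

    ```python
    # The transition succeeds only if every factor in every phase succeeds, so
    # under independence the overall success probability is the product over
    # phases of the products of the per-factor success probabilities.
    from math import prod

    phases = {                       # hypothetical per-factor success probabilities
        "phase 1": [0.99, 0.98, 0.995],
        "phase 2": [0.97, 0.99],
        "phase 3": [0.98, 0.96, 0.99],
    }

    p_overall = 1.0
    for name, factor_probs in phases.items():
        p_phase = prod(factor_probs)
        p_overall *= p_phase
        print(f"{name}: P(success) = {p_phase:.4f}")

    print(f"overall transition: P(success) = {p_overall:.4f}, "
          f"P(failure) = {1 - p_overall:.4f}")
    ```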

  3. 78 FR 44142 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-07-23

    ... (CBP's) plan to modify the National Customs Automation Program (NCAP) tests concerning Automated Commercial Environment (ACE) document imaging... entry process by reducing the number of data elements required to obtain release for cargo transported...

  4. 76 FR 34246 - Automated Commercial Environment (ACE); Announcement of National Customs Automation Program Test...

    Science.gov (United States)

    2011-06-13

    ... Automated Commercial Environment (ACE); Announcement of National Customs Automation Program (NCAP) Test of Automated Procedures for In-... The test relates to highway movements of commercial goods that are transported in-bond through the United States from one point in Canada to another point in Canada. The NCAP...

  5. [The use of programmed microcalculators for automation of leukocyte count].

    Science.gov (United States)

    Plykin, D L

    1989-01-01

    Soviet programmable microcalculators are recommended for calculating the leukocyte formula in serial blood analyses at clinical laboratories. The suggested program completely automates the tallying of the leukocyte types identified during microscopic examination of blood smears; the results may be obtained as a percentage ratio of the cells (the most prevalent form nowadays) and as their quantity per microliter of blood. Service elements in the program considerably simplify the work, making it convenient for an untrained user of the microcalculator. Since commercial Soviet programmable microcalculators differ somewhat in their systems of program steps, two variants of the program are suggested, adapted to the two most prevalent designs.
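
    A short sketch of the calculation such a program automates (written in Python rather than calculator program steps; the tallies and white-cell count are illustrative): tally cell types during a differential count, then report percentages and absolute counts per microliter.

    ```python
    # Tally leukocyte types during a 100-cell differential, then report each
    # type as a percentage and as an absolute number per microliter of blood.
    from collections import Counter

    # Keystrokes recorded while scanning the smear (hypothetical):
    observed = ["neut"] * 60 + ["lymph"] * 30 + ["mono"] * 6 + ["eos"] * 3 + ["baso"]
    wbc_per_uL = 7200                  # total white cell count from the analyser

    tally = Counter(observed)
    total = sum(tally.values())
    for cell, n in tally.items():
        percent = 100.0 * n / total
        absolute = wbc_per_uL * n / total
        print(f"{cell:>5}: {percent:5.1f} %   {absolute:7.0f} / uL")
    ```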

  6. Mining Repair Actions for Guiding Automated Program Fixing

    OpenAIRE

    Martinez , Matias; Monperrus , Martin

    2012-01-01

    Automated program fixing consists of generating source code in order to fix bugs in an automated manner. Our intuition is that automated program fixing can imitate human-based program fixing. Hence, we present a method to mine repair actions from software repositories. A repair action is a small semantic modification on code such as adding a method call. We then decorate repair actions with a probability distribution also learnt from software repositories. Our probabilistic repair models enable...
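
    A minimal sketch of how mined repair-action counts can be turned into a probability distribution by relative frequency; the action names and counts are hypothetical, and this is not the authors' exact probabilistic model.

    ```python
    # Turn mined repair-action counts into a maximum-likelihood (relative
    # frequency) distribution that a repair engine could sample or rank by.
    from collections import Counter

    mined_actions = Counter({          # hypothetical counts from commit diffs
        "add method call": 412,
        "change if condition": 263,
        "add null check": 199,
        "change method call arguments": 154,
        "add try/catch": 87,
    })

    total = sum(mined_actions.values())
    repair_model = {action: n / total for action, n in mined_actions.items()}

    for action, p in sorted(repair_model.items(), key=lambda kv: -kv[1]):
        print(f"P({action}) = {p:.3f}")
    ```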

  7. Laboratory automation in a functional programming language.

    Science.gov (United States)

    Runciman, Colin; Clare, Amanda; Harkness, Rob

    2014-12-01

    After some years of use in academic and research settings, functional languages are starting to enter the mainstream as an alternative to more conventional programming languages. This article explores one way to use Haskell, a functional programming language, in the development of control programs for laboratory automation systems. We give code for an example system, discuss some programming concepts that we need for this example, and demonstrate how the use of functional programming allows us to express and verify properties of the resulting code. © 2014 Society for Laboratory Automation and Screening.

  8. Automating the personnel dosimeter monitoring program

    International Nuclear Information System (INIS)

    Compston, M.W.

    1982-12-01

    The personnel dosimetry monitoring program at the Portsmouth uranium enrichment facility has been improved by using thermoluminescent dosimetry to monitor for ionizing radiation exposure, and by automating most of the operations and all of the associated information handling. A thermoluminescent dosimeter (TLD) card, worn by personnel inside security badges, stores the energy of ionizing radiation. The dosimeters are changed-out periodically and are loaded 150 cards at a time into an automated reader-processor. The resulting data is recorded and filed into a useful form by computer programming developed for this purpose

  9. Automation of data processing and calculation of retention parameters and thermodynamic data for gas chromatography

    Science.gov (United States)

    Makarycheva, A. I.; Faerman, V. A.

    2017-02-01

    An analysis of automation patterns is performed, and a programming solution for automating the processing of chromatographic data and their subsequent storage is developed with the help of a software package combining Mathcad and MS Excel spreadsheets. The proposed approach allows the data-processing algorithm to be modified and does not require the participation of programming experts. The approach provides the measurement of retention times and retention volumes, specific retention volumes, differential molar free energies of adsorption, partial molar solution enthalpies, and isosteric heats of adsorption. The developed solution is intended for use in a small research group and has been tested on a series of new gas chromatography sorbents. More than 20 analytes were submitted to the calculation of retention parameters and thermodynamic sorption quantities. The resulting data are presented in a form suitable for comparative analysis and make it possible to identify sorbents with the most favourable properties for specific analytical problems.
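
    As an example of one of the listed quantities, the sketch below estimates an isosteric heat of adsorption from the temperature dependence of specific retention volumes via a van't Hoff-type fit, q_st = R·d(ln Vg)/d(1/T); the retention data are hypothetical and this is not the authors' code.

    ```python
    # Isosteric heat of adsorption from specific retention volumes at several
    # column temperatures: slope of ln(Vg) versus 1/T times the gas constant.
    import numpy as np

    R = 8.314                                    # J / (mol K)
    T = np.array([423.0, 443.0, 463.0, 483.0])   # column temperatures, K (assumed)
    Vg = np.array([58.1, 37.4, 25.0, 17.3])      # specific retention volumes, mL/g

    slope, intercept = np.polyfit(1.0 / T, np.log(Vg), 1)
    q_st = R * slope                             # J/mol
    print(f"isosteric heat of adsorption ~ {q_st / 1000:.1f} kJ/mol")
    ```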

  10. The NASA automation and robotics technology program

    Science.gov (United States)

    Holcomb, Lee B.; Montemerlo, Melvin D.

    1986-01-01

    The development and objectives of the NASA automation and robotics technology program are reviewed. The objectives of the program are to utilize AI and robotics to increase the probability of mission success; decrease the cost of ground control; and increase the capability and flexibility of space operations. There is a need for real-time computational capability; an effective man-machine interface; and techniques to validate automated systems. Current programs in the areas of sensing and perception, task planning and reasoning, control execution, operator interface, and system architecture and integration are described. Programs aimed at demonstrating the capabilities of telerobotics and system autonomy are discussed.

  11. 76 FR 69755 - National Customs Automation Program Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2011-11-09

    DEPARTMENT OF HOMELAND SECURITY, U.S. Customs and Border Protection... announces U.S. Customs and Border Protection's (CBP's) plan to conduct a National Customs Automation Program (NCAP) test... conveyance transporting the cargo to the United States. These data will fulfill merchandise entry requirements...

  12. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  13. Audit of the Reporting Requirements for Major Automated Information System Programs

    National Research Council Canada - National Science Library

    2000-01-01

    .... There are 71 Major Automated Information System programs with total program costs of $26 billion. To qualify as a Major Automated Information System, the program must meet the following criteria...

  14. GoSam 2.0. Automated one loop calculations within and beyond the standard model

    International Nuclear Information System (INIS)

    Greiner, Nicolas; Deutsches Elektronen-Synchrotron

    2014-10-01

    We present GoSam 2.0, a fully automated framework for the generation and evaluation of one loop amplitudes in multi leg processes. The new version offers numerous improvements both on generational aspects as well as on the reduction side. This leads to a faster and more stable code for calculations within and beyond the Standard Model. Furthermore it contains the extended version of the standardized interface to Monte Carlo programs which allows for an easy combination with other existing tools. We briefly describe the conceptual innovations and present some phenomenological results.

  15. Lithography-based automation in the design of program defect masks

    Science.gov (United States)

    Vakanas, George P.; Munir, Saghir; Tejnil, Edita; Bald, Daniel J.; Nagpal, Rajesh

    2004-05-01

    In this work, we are reporting on a lithography-based methodology and automation in the design of Program Defect masks (PDMs). Leading edge technology masks have ever-shrinking primary features and more pronounced model-based secondary features such as optical proximity corrections (OPC), sub-resolution assist features (SRAFs) and phase-shifted mask (PSM) structures. In order to define defect disposition specifications for critical layers of a technology node, experience alone in deciding worst-case scenarios for the placement of program defects is necessary but may not be sufficient. MEEF calculations initiated from layout pattern data and their integration in a PDM layout flow provide a natural approach for improvements, relevance and accuracy in the placement of programmed defects. This methodology provides closed-loop feedback between layout and hard defect disposition specifications, thereby minimizing engineering test restarts, improving quality and reducing cost of high-end masks. Apart from SEMI and industry standards, best-known methods (BKMs) in integrated lithographically-based layout methodologies and automation specific to PDMs are scarce. The contribution of this paper lies in the implementation of Design-For-Test (DFT) principles to a synergistic interaction of CAD Layout and Aerial Image Simulator to drive layout improvements, highlight layout-to-fracture interactions and output accurate program defect placement coordinates to be used by tools in the mask shop.

  16. Automated calculation of complete Pxy and Txy diagrams for binary systems

    DEFF Research Database (Denmark)

    Cismondi, Martin; Michelsen, Michael Locht

    2007-01-01

    phase equilibrium calculations in binary systems, in: Proceedings of the CD-ROM EQUIFASE 2006, Morelia, Michoacan, Mexico, October 21-25, 2006; www.gpec.plapiqui.edu.ar]. In this work we present the methods and computational strategy for the automated calculation of complete Pxy and Txy diagrams...

  17. Automation and robotics for the National Space Program

    Science.gov (United States)

    1985-01-01

    The emphasis on automation and robotics in the augmentation of the human centered systems as it concerns the space station is discussed. How automation and robotics can amplify the capabilities of humans is detailed. A detailed developmental program for the space station is outlined.

  18. An ultraviolet-visible spectrophotometer automation system. Part 3: Program documentation

    Science.gov (United States)

    Roth, G. S.; Teuschler, J. M.; Budde, W. L.

    1982-07-01

    The Ultraviolet-Visible Spectrophotometer (UVVIS) automation system accomplishes 'on-line' spectrophotometric quality assurance determinations, report generation, plot generation and data reduction for chlorophyll or color analysis. This system also has the capability to process manually entered data for the analysis of chlorophyll or color. For each program of the UVVIS system, this document contains a program description, flowchart, variable dictionary, code listing, and symbol cross-reference table. Also included are descriptions of file structures and of routines common to all automated analyses. The programs are written in Data General extended BASIC, Revision 4.3, under the RDOS operating system, Revision 6.2. The BASIC code has been enhanced for real-time data acquisition, which is accomplished by CALLs to assembly language subroutines. Two other related publications are 'An Ultraviolet-Visible Spectrophotometer Automation System - Part I Functional Specifications,' and 'An Ultraviolet-Visible Spectrophotometer Automation System - Part II User's Guide.'

  19. Computer automation for protection factor calculations of buildings

    International Nuclear Information System (INIS)

    Farafat, M.A.Z.; Madian, A.H.

    2011-01-01

    The protection factors of buildings differ according to their constructional and architectural specifications. The UK and USA developed a manual calculation method for determining the protection factor of any building, i.e. the degree to which it shields its occupants from gamma rays and fall-out. The manual calculation method is complex and difficult to use; for that reason the researchers simplify this method into a proposed form that is easier to understand and apply. The researchers have also designed a computer program, in Visual Basic, to calculate the different protection factors of buildings. The program aims to eliminate the time lost in the calculation process for the protection offered by the spaces of any building: after the specification data for a building are entered, the program returns the protection factor in a very short time, saving effort and time in comparison with the manual calculation.
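
    A toy sketch of the quantity being automated is shown below: the protection factor as the ratio of the reference outdoor dose rate to the dose rate indoors, with each contribution attenuated by the barrier it crosses. The attenuation coefficient, contribution fractions and mass thicknesses are assumptions, and this is far simpler than the UK/USA engineering method or the authors' program.

    ```python
    # Toy model only: protection factor = outdoor reference dose rate divided by
    # the indoor dose rate, with each contribution exponentially attenuated by
    # the mass thickness of the barrier it passes through.
    import math

    MU_OVER_RHO = 0.05   # cm^2/g, rough effective attenuation coefficient (assumed)

    def transmission(mass_thickness_g_cm2: float) -> float:
        """Simple exponential attenuation through a barrier."""
        return math.exp(-MU_OVER_RHO * mass_thickness_g_cm2)

    # Hypothetical building: fractional outdoor contributions and the barrier
    # mass thickness (density x thickness) each one passes through.
    contributions = [
        ("roof  (fallout overhead)",  0.45, 120.0),   # g/cm^2
        ("walls (ground deposition)", 0.55,  90.0),
    ]

    dose_in = sum(frac * transmission(mt) for _, frac, mt in contributions)
    protection_factor = 1.0 / dose_in
    print(f"protection factor ~ {protection_factor:.0f}")
    ```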

  20. GoSam-2.0. A tool for automated one-loop calculations within the Standard Model and beyond

    International Nuclear Information System (INIS)

    Cullen, Gavin; Deurzen, Hans van; Greiner, Nicolas

    2014-05-01

    We present the version 2.0 of the program package GoSam for the automated calculation of one-loop amplitudes. GoSam is devised to compute one-loop QCD and/or electroweak corrections to multi-particle processes within and beyond the Standard Model. The new code contains improvements in the generation and in the reduction of the amplitudes, performs better in computing time and numerical accuracy, and has an extended range of applicability. The extended version of the "Binoth-Les-Houches-Accord" interface to Monte Carlo programs is also implemented. We give a detailed description of installation and usage of the code, and illustrate the new features in dedicated examples.

  1. A computational framework for automation of point defect calculations

    International Nuclear Information System (INIS)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei

    2017-01-01

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
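
    The listed corrections enter the standard formation energy expression for a defect in charge state q, E_f = E_defect − E_host − Σ_i n_i μ_i + q(E_VBM + E_F + ΔV_align) + E_corr. The sketch below shows that bookkeeping with invented numbers; it is not the package's API.

    ```python
    # Sketch of the bookkeeping such a framework automates (illustrative values):
    #   E_f = E_defect - E_host - sum_i n_i * mu_i
    #         + q * (E_VBM + E_F + dV_align) + E_image_charge + E_band_filling
    def defect_formation_energy(E_defect, E_host, added_atoms, chem_potentials,
                                q, E_vbm, E_fermi, dV_align=0.0,
                                E_image=0.0, E_bandfill=0.0):
        # n_i > 0 for atoms added to the cell, n_i < 0 for atoms removed.
        reservoir = sum(n * chem_potentials[el] for el, n in added_atoms.items())
        return (E_defect - E_host - reservoir
                + q * (E_vbm + E_fermi + dV_align)
                + E_image + E_bandfill)

    # Hypothetical oxygen vacancy in charge state +2 (all energies in eV):
    E_f = defect_formation_energy(
        E_defect=-1026.50, E_host=-1030.12,
        added_atoms={"O": -1},                 # one O atom removed
        chem_potentials={"O": -4.95},
        q=+2, E_vbm=1.20, E_fermi=0.50,
        dV_align=-0.02, E_image=0.35, E_bandfill=0.0)
    print(f"E_f ~ {E_f:.2f} eV at E_F = 0.5 eV above the VBM")
    ```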

  2. Automated calculations for massive fermion production with aITALC

    International Nuclear Information System (INIS)

    Lorca, A.; Riemann, T.

    2004-01-01

    The package aITALC has been developed for the automated calculation of radiative corrections to two-fermion production at e⁺e⁻ colliders. The package uses Diana, Qgraf, Form, Fortran, FF, LoopTools, and further unix/linux tools. Numerical results are presented for e⁺e⁻ → e⁺e⁻, μ⁺μ⁻, bs̄ and tc̄.

  3. Iterative User Interface Design for Automated Sequential Organ Failure Assessment Score Calculator in Sepsis Detection.

    Science.gov (United States)

    Aakre, Christopher Ansel; Kitson, Jaben E; Li, Man; Herasevich, Vitaly

    2017-05-18

    The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation, and the clerical burden of information retrieval makes this score ideal for automated calculation. The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR)-integrated, automated SOFA score calculator with a subsequent usability evaluation study. First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. The overall mean (SD) time to complete a manual SOFA score calculation was 61.6 (33) s. Among the 24% (12/50) of usability survey respondents, our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians' needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as "apps." A user-centered design process and usability evaluation should be considered during creation of these tools. ©Christopher Ansel Aakre, Jaben E Kitson, Man Li, Vitaly Herasevich. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 18.05.2017.

  4. 78 FR 27984 - Modification of the National Customs Automation Program Test (NCAP) Regarding Reconciliation for...

    Science.gov (United States)

    2013-05-13

    ... Modification of the National Customs Automation Program (NCAP) Test Regarding Reconciliation for Filing Certain Post-Importation Claims... modification of the NCAP Reconciliation prototype test to include the filing of post-importation claims... Reconciliation is a planned component of the National Customs Automation Program (NCAP), as...

  5. Automated calculation of ptosis on lateral clinical photographs.

    Science.gov (United States)

    Lee, Juhun; Kim, Edward; Reece, Gregory P; Crosby, Melissa A; Beahm, Elisabeth K; Markey, Mia K

    2015-10-01

    The goal is to fully automate the calculation of a breast ptosis measure from clinical photographs through automatic localization of fiducial points relevant to the measure. Sixty-eight women (97 clinical photographs) who underwent or were scheduled for breast reconstruction were included. The photographs were divided into a development set (N = 49) and an evaluation set (N = 48). The breast ptosis measure is obtained automatically from distances between three fiducial points: the nipple, the lowest visible point of breast (LVP), and the lateral terminus of the inframammary fold (LT). The nipple is localized using the YIQ colour space to highlight the contrast between the areola and the surrounding breast skin. The areola is localized using its shape, location and high Q component intensity. The breast contour is estimated using Dijkstra's shortest path algorithm on the gradient of the photograph in greyscale. The lowest point of the estimated contour is set as the LVP. To locate the anatomically subtle LT, the location of patient's axilla is used as a reference. The algorithm's efficacy was evaluated by comparing manual and automated localizations of the fiducial points. The average nipple diameter was used as a cut-off to define success. The algorithm showed 90, 91 and 83% accuracy for locating the nipple, LVP and LT in the evaluation set, respectively. This study presents a new automated algorithm that may facilitate the quantification of breast ptosis from lateral views of patients' photographs. © 2015 John Wiley & Sons, Ltd.
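
    The colour-space step can be reproduced with the Python standard library alone; the sketch below extracts the Q component of YIQ for each pixel (the toy pixel values are assumptions, and the rest of the published pipeline is not shown).

    ```python
    # Sketch of the colour-space step only: convert each RGB pixel to YIQ and
    # keep the Q component, in which the areola tends to contrast with skin.
    import colorsys

    def q_channel(rgb_image):
        """rgb_image: nested lists of (r, g, b) tuples with values in 0..255."""
        return [[colorsys.rgb_to_yiq(r / 255.0, g / 255.0, b / 255.0)[2]
                 for (r, g, b) in row]
                for row in rgb_image]

    # Toy 1x2 "image": areola-like pink vs. a lighter skin tone (values assumed).
    tiny = [[(190, 110, 120), (225, 190, 175)]]
    print(q_channel(tiny))   # the first pixel has the larger Q value
    ```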

  6. Development of mathematical models for automation of strength calculation during plastic deformation processing

    Science.gov (United States)

    Steposhina, S. V.; Fedonin, O. N.

    2018-03-01

    Dependencies that make it possible to automate the force calculation during surface plastic deformation (SPD) processing and, thus, to shorten the time for technological preparation of production have been developed.

  7. Pilot program for an automated data collection system

    International Nuclear Information System (INIS)

    Burns, R.S.; Johnson, P.S.; Denny, E.C.

    1984-01-01

    This report describes the pilot program of an automated data collection system and presents some of the managerial experiences during its startup. The pilot program demonstrated that improvements can be made in data collection and handling, even when a key hardware item does not meet requirements. 2 figures, 1 table

  8. Comparison of Size Modulation Standard Automated Perimetry and Conventional Standard Automated Perimetry with a 10-2 Test Program in Glaucoma Patients.

    Science.gov (United States)

    Hirasawa, Kazunori; Takahashi, Natsumi; Satou, Tsukasa; Kasahara, Masayuki; Matsumura, Kazuhiro; Shoji, Nobuyuki

    2017-08-01

    This prospective observational study compared the performance of size modulation standard automated perimetry with the Octopus 600 10-2 test program, in which the stimulus size is modulated during testing according to stimulus intensity, with that of conventional standard automated perimetry with the Humphrey 10-2 test program in glaucoma patients. Eighty-seven eyes of 87 glaucoma patients underwent size modulation standard automated perimetry with the Dynamic strategy and conventional standard automated perimetry using the SITA standard strategy. The main outcome measures were global indices, point-wise threshold, visual defect size and depth, reliability indices, and test duration; these were compared between size modulation standard automated perimetry and conventional standard automated perimetry. Global indices and point-wise threshold values obtained with the two tests were moderately to strongly correlated. The visual-field defect was deeper with size modulation standard automated perimetry than with conventional standard automated perimetry, but the visual-field defect size was smaller with size modulation standard automated perimetry than with conventional standard automated perimetry. The reliability indices, particularly the false-negative responses, of size modulation standard automated perimetry were worse than those of conventional standard automated perimetry, and the test duration differed between the two tests (p = 0.02). Global indices and the point-wise threshold values of the two testing modalities correlated well. However, the large stimulus presented at areas of decreased sensitivity with size modulation standard automated perimetry could underestimate the actual threshold in the 10-2 test protocol, as compared with conventional standard automated perimetry.

  9. Use of an automated bolus calculator in MDI-treated type 1 diabetes

    DEFF Research Database (Denmark)

    Schmidt, Signe; Meldgaard, Merete; Serifovski, Nermin

    2012-01-01

    To investigate the effect of flexible intensive insulin therapy (FIIT) and an automated bolus calculator (ABC) in a Danish type 1 diabetes population treated with multiple daily injections. Furthermore, to test the feasibility of teaching FIIT in a 3-h structured course....

  10. MO-D-213-07: RadShield: Semi- Automated Calculation of Air Kerma Rate and Barrier Thickness

    International Nuclear Information System (INIS)

    DeLorenzo, M; Wu, D; Rutel, I; Yang, K

    2015-01-01

    Purpose: To develop the first Java-based semi-automated calculation program intended to aid professional radiation shielding design. Air-kerma rate and barrier thickness calculations are performed by implementing NCRP Report 147 formalism into a Graphical User Interface (GUI). The ultimate aim of this newly created software package is to reduce errors and improve radiographic and fluoroscopic room designs over manual approaches. Methods: Floor plans are first imported as images into the RadShield software program. These plans serve as templates for drawing barriers, occupied regions and x-ray tube locations. We have implemented sub-GUIs that allow the specification, for each region and item of equipment, of occupancy factors, design goals, numbers of patients, primary beam directions, source-to-patient distances and workload distributions. Once the user enters the above parameters, the program automatically calculates air-kerma rate at sampled points beyond all barriers. For each sample point, a corresponding minimum barrier thickness is calculated to meet the design goal. RadShield allows control over preshielding, sample point location and material types. Results: A functional GUI package was developed and tested. Examination of sample walls and source distributions yields a maximum percent difference of less than 0.1% between hand-calculated air-kerma rates and RadShield. Conclusion: The initial results demonstrated that RadShield calculates air-kerma rates and required barrier thicknesses with reliable accuracy and can be used to make radiation shielding design more efficient and accurate. This newly developed approach differs from conventional calculation methods in that it finds air-kerma rates and thickness requirements for many points outside the barriers, stores the information and selects the largest value needed to comply with NCRP Report 147 design goals. Floor plans, parameters, designs and reports can be saved and accessed later for modification and recalculation.
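
    For orientation, a hedged sketch of the NCRP Report 147 relations being automated is shown below: the required barrier transmission from the design goal, distance, workload and occupancy, and the thickness obtained by inverting an Archer-type transmission fit. The α, β, γ fit parameters and workload numbers are placeholders, not values from the report or from RadShield.

    ```python
    # Hedged sketch of the shielding relations (placeholder parameters):
    #   required transmission  B = P d^2 / (K1 N U T)
    #   barrier thickness      x = ln((B^-g + b/a) / (1 + b/a)) / (a g)
    import math

    def required_transmission(P, d, K1, N, U=1.0, T=1.0):
        """P: design goal per week; d: distance (m); K1: unshielded air kerma per
        patient at 1 m; N: patients per week; U: use factor; T: occupancy."""
        return P * d**2 / (K1 * N * U * T)

    def barrier_thickness(B, a, b, g):
        """Invert the Archer curve B(x) = [(1 + b/a) e^{a g x} - b/a]^{-1/g};
        the thickness comes out in the units implied by a (e.g. 1/mm -> mm)."""
        return math.log((B**-g + b / a) / (1.0 + b / a)) / (a * g)

    B = required_transmission(P=0.02, d=3.0, K1=5.0, N=120, U=0.25, T=1.0)
    x = barrier_thickness(B, a=2.3, b=15.0, g=0.55)   # placeholder fit parameters
    print(f"required transmission {B:.2e} -> thickness ~ {x:.2f} mm")
    ```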

  11. MO-D-213-07: RadShield: Semi- Automated Calculation of Air Kerma Rate and Barrier Thickness

    Energy Technology Data Exchange (ETDEWEB)

    DeLorenzo, M [Oklahoma University Health Sciences Center, Oklahoma City, OK (United States); Wu, D [University of Oklahoma Health Sciences Center, Oklahoma City, Ok (United States); Rutel, I [University of Oklahoma Health Science Center, Oklahoma City, OK (United States); Yang, K [Massachusetts General Hospital, Boston, MA (United States)

    2015-06-15

    Purpose: To develop the first Java-based semi-automated calculation program intended to aid professional radiation shielding design. Air-kerma rate and barrier thickness calculations are performed by implementing NCRP Report 147 formalism into a Graphical User Interface (GUI). The ultimate aim of this newly created software package is to reduce errors and improve radiographic and fluoroscopic room designs over manual approaches. Methods: Floor plans are first imported as images into the RadShield software program. These plans serve as templates for drawing barriers, occupied regions and x-ray tube locations. We have implemented sub-GUIs that allow the specification, for each region and item of equipment, of occupancy factors, design goals, numbers of patients, primary beam directions, source-to-patient distances and workload distributions. Once the user enters the above parameters, the program automatically calculates air-kerma rate at sampled points beyond all barriers. For each sample point, a corresponding minimum barrier thickness is calculated to meet the design goal. RadShield allows control over preshielding, sample point location and material types. Results: A functional GUI package was developed and tested. Examination of sample walls and source distributions yields a maximum percent difference of less than 0.1% between hand-calculated air-kerma rates and RadShield. Conclusion: The initial results demonstrated that RadShield calculates air-kerma rates and required barrier thicknesses with reliable accuracy and can be used to make radiation shielding design more efficient and accurate. This newly developed approach differs from conventional calculation methods in that it finds air-kerma rates and thickness requirements for many points outside the barriers, stores the information and selects the largest value needed to comply with NCRP Report 147 design goals. Floor plans, parameters, designs and reports can be saved and accessed later for modification and recalculation.

  12. Performance of a glucose meter with a built-in automated bolus calculator versus manual bolus calculation in insulin-using subjects.

    Science.gov (United States)

    Sussman, Allen; Taylor, Elizabeth J; Patel, Mona; Ward, Jeanne; Alva, Shridhara; Lawrence, Andrew; Ng, Ronald

    2012-03-01

    Patients consider multiple parameters in adjusting prandial insulin doses for optimal glycemic control. Difficulties in calculations can lead to incorrect doses or induce patients to administer fixed doses, rely on empirical estimates, or skip boluses. A multicenter study was conducted with 205 subjects with diabetes who were on multiple daily injections of rapid- or short-acting insulin. Using the formula provided, the subjects manually calculated two prandial insulin doses, based on one high and one normal glucose test result, respectively. They also determined the two doses using the FreeStyle InsuLinx Blood Glucose Monitoring System, which has a built-in, automated bolus calculator. After dose determinations, the subjects completed opinion surveys. Of the 409 insulin doses manually calculated by the subjects, 256 (63%) were incorrect. Only 23 (6%) of the same 409 dose determinations were incorrect using the meter, and these errors were due to either confirmed or potential deviations from the study instructions by the subjects when determining the dose with the meter. In the survey, 83% of the subjects expressed more confidence in the meter-calculated doses than in the manually calculated doses. Furthermore, 87% of the subjects preferred to use the meter rather than manual calculation to determine prandial insulin doses. Insulin-using patients made errors in more than half of the manually calculated insulin doses. Use of the automated bolus calculator in the FreeStyle InsuLinx meter minimized errors in dose determination. The patients also expressed confidence in and preference for using the meter. This may increase adherence and help optimize the use of mealtime insulin. © 2012 Diabetes Technology Society.
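
    A sketch of the kind of rule the subjects applied by hand, and which a built-in bolus calculator automates, is shown below; it is a commonly taught formula with hypothetical parameters, not the FreeStyle InsuLinx algorithm.

    ```python
    # Commonly taught prandial bolus rule (illustrative only, not medical advice):
    #   dose = carbs / insulin-to-carb ratio + (glucose - target) / correction factor
    def prandial_bolus(carbs_g, glucose_mg_dl, target_mg_dl=120,
                       carb_ratio_g_per_u=10, correction_mg_dl_per_u=40):
        meal = carbs_g / carb_ratio_g_per_u
        correction = max(0.0, (glucose_mg_dl - target_mg_dl) / correction_mg_dl_per_u)
        return round(meal + correction, 1)

    print(prandial_bolus(60, 220))   # high reading: 6.0 + 2.5 = 8.5 U
    print(prandial_bolus(60, 110))   # normal reading: meal insulin only, 6.0 U
    ```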

  13. Computer program CDCID: an automated quality control program using CDC update

    International Nuclear Information System (INIS)

    Singer, G.L.; Aguilar, F.

    1984-04-01

    A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG&G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program.

  14. Will the Measurement Robots Take Our Jobs? An Update on the State of Automated M&V for Energy Efficiency Programs

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Touzani, Samir [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Taylor, Cody [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fernandes, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-28

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The rising availability of “smart” meters and devices that report near-real-time data, combined with new analytical approaches to quantifying savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer M&V capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline some parts of M&V. We detail metrics to assess the performance of these new M&V approaches and a framework for computing them. We also discuss the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. Finally, we discuss the potential evolution of M&V and early results of pilots currently underway to incorporate M&V automation into ratepayer-funded programs and professional implementation and evaluation practice.
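
    Automated M&V tools of this kind are typically judged with goodness-of-fit and savings metrics of the sort the paper details. The sketch below computes two widely used examples, NMBE and CV(RMSE), plus avoided energy use from a baseline projection; the exact metric set and formulations used in the study may differ, so treat this as an illustrative framework only.

    ```python
    import numpy as np

    def nmbe(actual, predicted):
        """Normalized mean bias error (%) of a baseline model."""
        a, p = np.asarray(actual, float), np.asarray(predicted, float)
        return 100.0 * np.sum(a - p) / ((len(a) - 1) * np.mean(a))

    def cv_rmse(actual, predicted):
        """Coefficient of variation of the RMSE (%), a common model-accuracy metric."""
        a, p = np.asarray(actual, float), np.asarray(predicted, float)
        rmse = np.sqrt(np.sum((a - p) ** 2) / (len(a) - 1))
        return 100.0 * rmse / np.mean(a)

    def avoided_energy(baseline_projection, metered_post):
        """Savings = counterfactual baseline projected into the post period minus metered use."""
        return float(np.sum(np.asarray(baseline_projection) - np.asarray(metered_post)))
    ```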

  15. Design Automation Using Script Languages. High-Level CAD Templates in Non-Parametric Programs

    Science.gov (United States)

    Moreno, R.; Bazán, A. M.

    2017-10-01

    The main purpose of this work is to study the advantages offered by applying traditional techniques of technical drawing to processes for design automation, using non-parametric CAD programs equipped with scripting languages. Given that an example drawing can be solved with traditional step-by-step detailed procedures, it is possible to do the same with CAD applications and to generalize the solution later by incorporating references. Today’s CAD applications show striking absences of solutions for building engineering: oblique projections (military and cavalier), 3D modelling of complex stairs, roofs, furniture, and so on. The use of geometric references (using variables in script languages) and their incorporation into high-level CAD templates allows the automation of processes. Instead of repeatedly creating similar designs or modifying their data, users should be able to use these templates to generate future variations of the same design. This paper presents the automation process for several complex drawing examples based on CAD script files aided by parametric geometry calculation tools. The proposed method allows us to solve complex geometry designs not currently incorporated in CAD applications and to subsequently create other new derivatives without user intervention. Automation in the generation of complex designs not only saves time but also increases the quality of the presentations and reduces the possibility of human errors.
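
    As an illustration of the template idea (in Python rather than any particular CAD scripting language), the hypothetical function below generates the profile of a straight stair from reference variables, so that new variants are produced by changing parameters instead of redrawing geometry.

    ```python
    def straight_stair_profile(total_rise, total_run, n_steps):
        """Return the 2D polyline (x, y) of a straight stair section, derived from
        reference variables rather than hard-coded coordinates (hypothetical example)."""
        riser, tread = total_rise / n_steps, total_run / n_steps
        points, x, y = [(0.0, 0.0)], 0.0, 0.0
        for _ in range(n_steps):
            y += riser; points.append((x, y))   # vertical riser
            x += tread; points.append((x, y))   # horizontal tread
        return points

    # Regenerating a variant only requires new parameter values, not redrawing:
    print(straight_stair_profile(total_rise=3.0, total_run=4.2, n_steps=15)[:4])
    ```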

  16. SOLGAS refined: A computerized thermodynamic equilibrium calculation tool

    International Nuclear Information System (INIS)

    Trowbridge, L.D.; Leitnaker, J.M.

    1993-11-01

    SOLGAS, an early computer program for calculating equilibrium in a chemical system, has been made more user-friendly, and several "bells and whistles" have been added. The necessity to include elemental species has been eliminated. The input of large numbers of starting conditions has been automated. A revised format for entering data simplifies and reduces chances for error. Calculated errors by SOLGAS are flagged, and several programming errors are corrected. Auxiliary programs are available to assemble and partially automate plotting of large amounts of data. Thermodynamic input data can be changed "on line." The program can be operated with or without a co-processor. Copies of the program, suitable for the IBM-PC or compatible with at least 384 bytes of low RAM, are available from the authors.

  17. BioXTAS RAW, a software program for high-throughput automated small-angle X-ray scattering data reduction and preliminary analysis

    DEFF Research Database (Denmark)

    Nielsen, S.S.; Toft, K.N.; Snakenborg, Detlef

    2009-01-01

    A fully open source software program for automated two-dimensional and one-dimensional data reduction and preliminary analysis of isotropic small-angle X-ray scattering (SAXS) data is presented. The program is freely distributed, following the open-source philosophy, and does not rely on any commercial software packages. BioXTAS RAW is a fully automated program that, via an online feature, reads raw two-dimensional SAXS detector output files and processes and plots data as the data files are created during measurement sessions. The software handles all steps in the data reduction. This includes mask creation, radial averaging, error bar calculation, artifact removal, normalization and q calibration. Further data reduction such as background subtraction and absolute intensity scaling is fast and easy via the graphical user interface. BioXTAS RAW also provides preliminary analysis of one-dimensional data…
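
    The central reduction step named here, radial averaging of a masked 2D detector image into a one-dimensional I(q) curve with q calibration, can be sketched in a few lines. The following Python sketch is a generic illustration of that step under simplifying assumptions (simple per-bin averaging, standard q = 4π sin(θ)/λ geometry); it is not the BioXTAS RAW implementation.

    ```python
    import numpy as np

    def radial_average(image, mask, center, pixel_mm, det_dist_mm, wavelength_A, n_bins=200):
        """Reduce a masked 2D SAXS image to a 1D I(q) profile by radial binning.
        `center` is the beam position in pixels as (column, row); mask is 1 = keep."""
        ny, nx = image.shape
        y, x = np.indices((ny, nx))
        r_mm = np.hypot(x - center[0], y - center[1]) * pixel_mm
        theta = 0.5 * np.arctan(r_mm / det_dist_mm)        # scattering half-angle
        q = 4.0 * np.pi * np.sin(theta) / wavelength_A     # momentum transfer (1/A)
        keep = mask.astype(bool)
        edges = np.linspace(q[keep].min(), q[keep].max(), n_bins + 1)
        idx = np.digitize(q[keep], edges) - 1
        vals = image[keep]
        intensity = np.array([vals[idx == i].mean() if np.any(idx == i) else np.nan
                              for i in range(n_bins)])
        return 0.5 * (edges[:-1] + edges[1:]), intensity
    ```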

  18. Automated Assessment in a Programming Tools Course

    Science.gov (United States)

    Fernandez Aleman, J. L.

    2011-01-01

    Automated assessment systems can be useful for both students and instructors. Ranking and immediate feedback can have a strongly positive effect on student learning. This paper presents an experience using automatic assessment in a programming tools course. The proposal aims at extending the traditional use of an online judging system with a…

  19. Automated procedure for calculating time-dependent sensitivities in ORIGEN2

    International Nuclear Information System (INIS)

    Worley, B.A.; Wright, R.Q.

    1985-10-01

    ORIGEN2 is a widely used point-depletion and radioactive-decay computer code for use in simulating nuclear fuel cycles and/or spent fuel characteristics. This paper presents the application of the GRESS procedure to the ORIGEN2 code for performing a sensitivity analysis of a high-level waste disposal problem. The GRESS procedure uses computer calculus and the GRESS precompiler to automate the generation and calculation of gradients in a computer code. The GRESS version of ORIGEN2 is used to calculate the nuclide-dependent sensitivities of the decay heat and radioactivity of 1008 nuclides comprising reprocessed high-level waste to changes in data and input parameters. The sensitivities are calculated in a single execution of the revised code as compared to the conventional method of rerunning the code numerous times. The availability of sensitivity data as an option in ORIGEN2 reveals relationships not easily recognized even with reruns
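
    The "computer calculus" idea that GRESS automates is essentially forward-mode automatic differentiation: every arithmetic operation carries a derivative along with its value, so a single run yields both results and sensitivities. The sketch below illustrates the mechanism on a toy single-nuclide decay-heat model; the dual-number class, the model and all numbers are illustrative and have nothing to do with ORIGEN2's actual physics or the GRESS precompiler.

    ```python
    from dataclasses import dataclass
    import math

    @dataclass
    class Dual:
        """Value plus derivative with respect to one chosen input (forward-mode AD)."""
        val: float
        der: float
        def __mul__(self, other):
            return Dual(self.val * other.val, self.val * other.der + self.der * other.val)
        def __rmul__(self, const):
            return Dual(const * self.val, const * self.der)

    def dual_exp(x):
        e = math.exp(x.val)
        return Dual(e, e * x.der)

    def decay_heat(n0, q_per_decay, lam, t):
        """Toy heat rate H(t) = q * lambda * N0 * exp(-lambda * t); lam carries d/d(lambda)."""
        return q_per_decay * n0 * (lam * dual_exp(Dual(-t, 0.0) * lam))

    lam = Dual(1.0e-9, 1.0)                      # seed: derivative w.r.t. the decay constant
    H = decay_heat(1e20, 1.0e-13, lam, 3.15e7)
    print(H.val, H.der)                          # heat rate and its sensitivity dH/d(lambda)
    ```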

  20. Automated Literature Searches for Longitudinal Tracking of Cancer Research Training Program Graduates.

    Science.gov (United States)

    Padilla, Luz A; Desmond, Renee A; Brooks, C Michael; Waterbor, John W

    2018-06-01

    A key outcome measure of cancer research training programs is the number of cancer-related peer-reviewed publications after training. Because program graduates do not routinely report their publications, staff must periodically conduct electronic literature searches on each graduate. The purpose of this study is to compare findings of an innovative computer-based automated search program versus repeated manual literature searches to identify post-training peer-reviewed publications. In late 2014, manual searches for publications by former R25 students identified 232 cancer-related articles published by 112 of 543 program graduates. In 2016, a research assistant was instructed in performing Scopus literature searches for comparison with individual PubMed searches on our 543 program graduates. Through 2014, Scopus found 304 cancer publications, 220 of which had been retrieved manually, plus an additional 84 papers. However, Scopus missed 12 publications found manually. Together, both methods found 316 publications. The automated method found 96.2% of the 316 publications, while individual searches found only 73.4%. An automated search method such as using the Scopus database is a key tool for conducting comprehensive literature searches, but it must be supplemented with periodic manual searches to find the initial publications of program graduates. A time-saving feature of Scopus is the periodic automatic alerts of new publications. Although a training period is needed and initial costs can be high, an automated search method is worthwhile due to its high sensitivity and efficiency in the long term.

  1. SOLGAS refined: A computerized thermodynamic equilibrium calculation tool

    Energy Technology Data Exchange (ETDEWEB)

    Trowbridge, L.D.; Leitnaker, J.M.

    1993-11-01

    SOLGAS, an early computer program for calculating equilibrium in a chemical system, has been made more user-friendly, and several "bells and whistles" have been added. The necessity to include elemental species has been eliminated. The input of large numbers of starting conditions has been automated. A revised format for entering data simplifies and reduces chances for error. Calculated errors by SOLGAS are flagged, and several programming errors are corrected. Auxiliary programs are available to assemble and partially automate plotting of large amounts of data. Thermodynamic input data can be changed "on line." The program can be operated with or without a co-processor. Copies of the program, suitable for the IBM-PC or compatible with at least 384 bytes of low RAM, are available from the authors.

  2. Automated-biasing approach to Monte Carlo shipping-cask calculations

    International Nuclear Information System (INIS)

    Hoffman, T.J.; Tang, J.S.; Parks, C.V.; Childs, R.L.

    1982-01-01

    Computer Sciences at Oak Ridge National Laboratory, under a contract with the Nuclear Regulatory Commission, has developed the SCALE system for performing standardized criticality, shielding, and heat transfer analyses of nuclear systems. During the early phase of shielding development in SCALE, it was established that Monte Carlo calculations of radiation levels exterior to a spent fuel shipping cask would be extremely expensive. This cost can be substantially reduced by proper biasing of the Monte Carlo histories. The purpose of this study is to develop and test an automated biasing procedure for the MORSE-SGC/S module of the SCALE system

  3. Automated Peak Picking and Peak Integration in Macromolecular NMR Spectra Using AUTOPSY

    Science.gov (United States)

    Koradi, Reto; Billeter, Martin; Engeli, Max; Güntert, Peter; Wüthrich, Kurt

    1998-12-01

    A new approach for automated peak picking of multidimensional protein NMR spectra with strong overlap is introduced, which makes use of the program AUTOPSY (automated peak picking for NMR spectroscopy). The main elements of this program are a novel function for local noise level calculation, the use of symmetry considerations, and the use of lineshapes extracted from well-separated peaks for resolving groups of strongly overlapping peaks. The algorithm generates peak lists with precise chemical shifts and integral intensities, and a reliability measure for the recognition of each peak. The results of automated peak picking of NOESY spectra with AUTOPSY were tested in combination with the combined automated NOESY cross peak assignment and structure calculation routine NOAH implemented in the program DYANA. The quality of the resulting structures was found to be comparable with those from corresponding data obtained with manual peak picking.

  4. An overview of the contaminant analysis automation program

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.; Beugelsdijk, T.J.

    1992-01-01

    The Department of Energy (DOE) has significant amounts of radioactive and hazardous wastes stored, buried, and still being generated at many sites within the United States. These wastes must be characterized to determine the elemental, isotopic, and compound content before remediation can begin. In this paper, the authors project that sampling requirements will necessitate generating more than 10 million samples by 1995, which will far exceed the capabilities of our current manual chemical analysis laboratories. The Contaminant Analysis Automation effort (CAA), with Los Alamos National Laboratory (LANL) as the coordinating laboratory, is designing and fabricating robotic systems that will standardize and automate both the hardware and the software of the most common environmental chemical methods. This will be accomplished by designing and producing several unique analysis systems called Standard Analysis Methods (SAM). Each SAM will automate a specific chemical method, including sample preparation, the analysis itself, and the data interpretation, by using a building block known as the Standard Laboratory Module (SLM). This concept allows the chemist to assemble an automated environmental method using standardized SLMs easily and without the worry of hardware compatibility or the necessity of generating complicated control programs.

  5. Calculational model used in the analysis of nuclear performance of the Light Water Breeder Reactor (LWBR) (LWBR Development Program)

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.B. (ed.)

    1978-08-01

    The calculational model used in the analysis of LWBR nuclear performance is described. The model was used to analyze the as-built core and predict core nuclear performance prior to core operation. The qualification of the nuclear model using experiments and calculational standards is described. Features of the model include: an automated system of processing manufacturing data; an extensively analyzed nuclear data library; an accurate resonance integral calculation; space-energy corrections to infinite medium cross sections; an explicit three-dimensional diffusion-depletion calculation; a transport calculation for high energy neutrons; explicit accounting for fuel and moderator temperature feedback, clad diameter shrinkage, and fuel pellet growth; and an extensive testing program against experiments and a highly developed analytical standard.

  6. Automated processing of data generated by molecular dynamics

    International Nuclear Information System (INIS)

    Lobato Hoyos, Ivan; Rojas Tapia, Justo; Instituto Peruano de Energia Nuclear, Lima

    2008-01-01

    A new integrated tool for automated processing of data generated by molecular dynamics packages and programs has been developed. The program can calculate important quantities such as the pair correlation function, perform common-neighbor analysis, count nanoparticles and determine their size distribution, and convert output files between different formats. The work explains in detail the modules of the tool and the interfaces between them. The use of the program is illustrated with application examples involving the calculation of various properties of silver nanoparticles. (author)
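
    One of the quantities listed, the pair correlation function, is a good example of what such post-processing automates. Below is a minimal, generic Python sketch of g(r) for particles in a periodic cubic box (minimum-image convention, simple shell normalization); it is not the authors' code and assumes orthogonal cubic boxes only.

    ```python
    import numpy as np

    def pair_correlation(positions, box_length, dr=0.1, r_max=None):
        """g(r) for N particles in a periodic cubic box (minimum-image convention)."""
        pos = np.asarray(positions, float)
        n = len(pos)
        r_max = r_max if r_max is not None else box_length / 2.0
        edges = np.arange(0.0, r_max + dr, dr)
        counts = np.zeros(len(edges) - 1)
        for i in range(n - 1):
            d = pos[i + 1:] - pos[i]
            d -= box_length * np.round(d / box_length)       # wrap to nearest image
            r = np.linalg.norm(d, axis=1)
            counts += np.histogram(r[r < r_max], bins=edges)[0]
        rho = n / box_length**3
        shell_vol = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
        ideal_pairs = 0.5 * n * rho * shell_vol              # expected pair counts, ideal gas
        return 0.5 * (edges[:-1] + edges[1:]), counts / ideal_pairs
    ```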

  7. Automated variance reduction of Monte Carlo shielding calculations using the discrete ordinates adjoint function

    International Nuclear Information System (INIS)

    Wagner, J.C.; Haghighat, A.

    1998-01-01

    Although the Monte Carlo method is considered to be the most accurate method available for solving radiation transport problems, its applicability is limited by its computational expense. Thus, biasing techniques, which require intuition, guesswork, and iterations involving manual adjustments, are employed to make reactor shielding calculations feasible. To overcome this difficulty, the authors have developed a method for using the S_N adjoint function for automated variance reduction of Monte Carlo calculations through source biasing and consistent transport biasing with the weight window technique. They describe the implementation of this method into the standard production Monte Carlo code MCNP and its application to a realistic calculation, namely, the reactor cavity dosimetry calculation. The computational effectiveness of the method is demonstrated and quantified through the increase in calculational efficiency. Important issues associated with this method and its efficient use are addressed and analyzed. Additional benefits in terms of the reduction in time and effort required of the user are difficult to quantify but are possibly as important as the computational efficiency. In general, the automated variance reduction method presented is capable of increases in computational performance on the order of thousands, while at the same time significantly reducing the current requirements for user experience, time, and effort. Therefore, this method can substantially increase the applicability and reliability of Monte Carlo for large, real-world shielding applications.
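
    The essence of the scheme described, often referred to as CADIS, is to turn a deterministic adjoint solution into a biased source and consistent weight windows. The Python sketch below illustrates that recipe in its simplest cell-wise form; the arrays, the weight-window width ratio, and the normalization are simplified assumptions and do not represent the MCNP implementation.

    ```python
    import numpy as np

    def cadis_parameters(source, adjoint, window_ratio=5.0):
        """Biased source PDF and weight-window lower bounds from a cell-wise adjoint function.
        This is the textbook CADIS recipe in its simplest form, not the MCNP implementation."""
        source = np.asarray(source, float)
        adjoint = np.asarray(adjoint, float)
        response = np.sum(source * adjoint)            # estimate of the detector response
        q_biased = source * adjoint / response         # biased source PDF (sums to 1)
        # Particles born from q_biased start with weight w = q / q_biased = R / adjoint,
        # so the weight window in each cell is centred on that weight.
        ww_center = response / adjoint
        ww_lower = 2.0 * ww_center / (window_ratio + 1.0)
        return q_biased, ww_lower

    src = np.array([1.0, 0.5, 0.1, 0.0])
    adj = np.array([1e-4, 1e-3, 1e-2, 1e-1])
    print(cadis_parameters(src, adj))
    ```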

  8. Automating with SIMATIC S7-1500 configuring, programming and testing with STEP 7 Professional

    CERN Document Server

    Berger, Hans

    2014-01-01

    With many innovations, the SIMATIC S7-1500 programmable logic controller (PLC) sets new standards in productivity and efficiency in control technology. With its outstanding system performance and with PROFINET as the standard interface, it ensures extremely short system response times and the highest control quality with a maximum of flexibility for the most demanding automation tasks. The engineering software STEP 7 Professional operates inside TIA Portal, a user interface that is designed for intuitive operation. Functionality includes all aspects of automation: from the configuration of the controllers, via programming in the IEC languages LAD, FBD, STL, and SCL, up to the program test. In the book, the hardware components of the automation system S7-1500 are presented, including the description of their configuration and parameterization. A comprehensive introduction into STEP 7 Professional illustrates the basics of programming and troubleshooting. Beginners learn the basics of automation with Simatic…

  9. Concentration creation of system and applied software of reactor calculations

    International Nuclear Information System (INIS)

    Zizin, M.N.

    1995-01-01

    Basic concept provisions are presented, including modularity, openness and machine independence of the programs; accumulation of procedural knowledge in the form of a computer module library; creation of an environment facilitating module development and maintenance; the possibility of automated generation of head programs; and a user-friendly interface. The intellectual program shell ShIPR for mathematical reactor modeling, whose final goal is the automated generation of machine-independent programs in the Fortran 77 language on the basis of calculation modules and an accumulated knowledge base, has been developed on the basis of the above concept.

  10. The elaboration of motor programs for the automation of letter production.

    Science.gov (United States)

    Thibon, Laurence Séraphin; Gerber, Silvain; Kandel, Sonia

    2018-01-01

    We investigated how children learn to write letters. Letter writing evolves from stroke-by-stroke to whole-letter programming. Children of ages 6 to 9 (N=98) wrote letters of varying complexity on a digitizer. At ages 6 and 7, movement duration, dysfluency and trajectory increased with stroke number. This indicates that the motor program they activated mainly coded information on stroke production. Stroke number affected the older children's production much less, suggesting that they programmed stroke chunks or the whole letter. The fact that movement duration and dysfluency decreased from ages 6 to 8 and remained stable at ages 8 and 9 suggests that automation of letter writing begins at age 8. Automation seems to require the elaboration of stroke chunks and/or letter-sized motor programs. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Automation of program model developing for complex structure control objects

    International Nuclear Information System (INIS)

    Ivanov, A.P.; Sizova, T.B.; Mikhejkina, N.D.; Sankovskij, G.A.; Tyufyagin, A.N.

    1991-01-01

    A brief description is given of software for automated development of models: an integrating modular programming system, a program module generator, and a program module library providing thermal-hydraulic calculation of process dynamics in power unit equipment components and simulation of on-line control system operation. Technical recommendations for model development are based on experience in creating specific models of NPP power units. 8 refs., 1 tab., 4 figs

  12. HP-67 calculator programs for thermodynamic data and phase diagram calculations

    International Nuclear Information System (INIS)

    Brewer, L.

    1978-01-01

    This report is a supplement to a tabulation of the thermodynamic and phase data for the 100 binary systems of Mo with the elements from H to Lr. The calculations of thermodynamic data and phase equilibria were carried out from 5000 K to low temperatures. This report presents the methods of calculation used. The thermodynamics involved is rather straightforward and the reader is referred to any advanced thermodynamics text. The calculations were largely carried out using an HP-65 programmable calculator. In this report, those programs are reformulated for use with the HP-67 calculator; this results in a great reduction in the number of programs required to carry out the calculations.

  13. Automated Critical Peak Pricing Field Tests: Program Description and Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila; Xu, Peng

    2006-04-06

    California utilities have been exploring the use of critical peak prices (CPP) to help reduce needle peaks in customer end-use loads. CPP is a form of price-responsive demand response (DR). Recent experience has shown that customers have limited knowledge of how to operate their facilities in order to reduce their electricity costs under CPP (Quantum 2004). While the lack of knowledge about how to develop and implement DR control strategies is a barrier to participation in DR programs like CPP, another barrier is the lack of automation of DR systems. During 2003 and 2004, the PIER Demand Response Research Center (DRRC) conducted a series of tests of fully automated electric demand response (Auto-DR) at 18 facilities. Overall, the average of the site-specific average coincident demand reductions was 8% from a variety of building types and facilities. Many electricity customers have suggested that automation will help them institutionalize their electric demand savings and improve their overall response and DR repeatability. This report focuses on and discusses the specific results of the Automated Critical Peak Pricing (Auto-CPP, a specific type of Auto-DR) tests that took place during 2005, which build on the automated demand response (Auto-DR) research conducted through PIER and the DRRC in 2003 and 2004. The long-term goal of this project is to understand the technical opportunities of automating demand response and to remove technical and market impediments to large-scale implementation of automated demand response (Auto-DR) in buildings and industry. A second goal of this research is to understand and identify best practices for DR strategies and opportunities. The specific objectives of the Automated Critical Peak Pricing test were as follows: (1) Demonstrate how an automated notification system for critical peak pricing can be used in large commercial facilities for demand response (DR). (2) Evaluate effectiveness of such a system. (3) Determine how customers

  14. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.

    Science.gov (United States)

    Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar

    2016-06-28

    eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.

  15. Costs to Automate Demand Response - Taxonomy and Results from Field Studies and Programs

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schetrit, Oren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cheung, Iris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Li, Becky Z [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    During the past decade, the technology to automate demand response (DR) in buildings and industrial facilities has advanced significantly. Automation allows rapid, repeatable, reliable operation. This study focuses on costs for DR automation in commercial buildings with some discussion on residential buildings and industrial facilities. DR automation technology relies on numerous components, including communication systems, hardware and software gateways, standards-based messaging protocols, controls and integration platforms, and measurement and telemetry systems. This report compares cost data from several DR automation programs and pilot projects, evaluates trends in the cost per unit of DR and kilowatts (kW) available from automated systems, and applies a standard naming convention and classification or taxonomy for system elements. Median costs for the 56 installed automated DR systems studied here are about $200/kW. The deviation around this median is large, with costs in some cases being an order of magnitude greater or less than the median. This wide range is a result of variations in system age, size of load reduction, sophistication, and type of equipment included in the cost analysis. The costs to automate fast DR systems for ancillary services are not fully analyzed in this report because additional research is needed to determine the total cost to install, operate, and maintain these systems. However, recent research suggests that they could be developed at costs similar to those of existing hot-summer DR automation systems. This report considers installation and configuration costs and does not include the costs of owning and operating DR automation systems. Future analysis of the latter costs should include costs to the building or facility manager as well as to the utility or third-party program manager.

  16. Automated complex spectra processing of actinide α-radiation

    International Nuclear Information System (INIS)

    Anichenkov, S.V.; Popov, Yu.S.; Tselishchev, I.V.; Mishenev, V.B.; Timofeev, G.A.

    1989-01-01

    Previously described algorithms for automated processing of complex α-spectra of actinides were implemented using an Ehlektronika D3-28 computer connected to an ICA-070 multichannel pulse-amplitude analyzer. The program calculates peak intensities and relative isotope contents, performs energy calibration of spectra, computes peak centers of gravity and energy resolution, and performs integral counting in a selected part of the spectrum. The error of the automated processing method depends on the complexity of the spectrum and lies within 1-12%. 8 refs.; 4 figs.; 2 tabs
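
    Two of the quantities mentioned, the peak center of gravity and the energy resolution, reduce to simple spectrum arithmetic. The sketch below shows a generic version for a background-free region of interest in channel units; it is an illustration only, not the described D3-28 software.

    ```python
    import numpy as np

    def peak_centroid_and_fwhm(counts, channels=None):
        """Centre of gravity and FWHM (in channel units) of a background-free peak region."""
        counts = np.asarray(counts, float)
        channels = np.arange(len(counts)) if channels is None else np.asarray(channels, float)
        centroid = np.sum(channels * counts) / np.sum(counts)
        above = np.where(counts >= counts.max() / 2.0)[0]
        fwhm = channels[above[-1]] - channels[above[0]] if len(above) > 1 else 0.0
        return centroid, fwhm

    roi = [2, 5, 20, 60, 95, 100, 90, 55, 18, 4, 1]
    print(peak_centroid_and_fwhm(roi))
    ```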

  17. Computer programs for lattice calculations

    International Nuclear Information System (INIS)

    Keil, E.; Reich, K.H.

    1984-01-01

    The aim of the workshop was to find out whether some standardisation could be achieved for future work in this field. A certain amount of useful information was unearthed, and desirable features of a "standard" program emerged. Progress is not expected to be breathtaking, although participants (practically from all interested US, Canadian and European accelerator laboratories) agreed that the mathematics of the existing programs is more or less the same. Apart from the NIH (not invented here) effect, there is a (to quite some extent understandable) tendency to stay with a program one knows and to add to it if unavoidable rather than to start using a new one. Users of the well-supported program TRANSPORT (designed for beam line calculations) would prefer to have it fully extended for lattice calculations (to some extent already possible now), while SYNCH users wish to see that program provided with a user-friendly input, rather than spending time and effort on mastering a new program.

  18. Biocoder: A programming language for standardizing and automating biology protocols.

    Science.gov (United States)

    Ananthanarayanan, Vaishnavi; Thies, William

    2010-11-08

    Published descriptions of biology protocols are often ambiguous and incomplete, making them difficult to replicate in other laboratories. However, there is increasing benefit to formalizing the descriptions of protocols, as laboratory automation systems (such as microfluidic chips) are becoming increasingly capable of executing them. Our goal in this paper is to improve both the reproducibility and automation of biology experiments by using a programming language to express the precise series of steps taken. We have developed BioCoder, a C++ library that enables biologists to express the exact steps needed to execute a protocol. In addition to being suitable for automation, BioCoder converts the code into a readable, English-language description for use by biologists. We have implemented over 65 protocols in BioCoder; the most complex of these was successfully executed by a biologist in the laboratory using BioCoder as the only reference. We argue that BioCoder exposes and resolves ambiguities in existing protocols, and could provide the software foundations for future automation platforms. BioCoder is freely available for download at http://research.microsoft.com/en-us/um/india/projects/biocoder/. BioCoder represents the first practical programming system for standardizing and automating biology protocols. Our vision is to change the way that experimental methods are communicated: rather than publishing a written account of the protocols used, researchers will simply publish the code. Our experience suggests that this practice is tractable and offers many benefits. We invite other researchers to leverage BioCoder to improve the precision and completeness of their protocols, and also to adapt and extend BioCoder to new domains.

  19. Application of automated blind for daylighting in tropical region

    International Nuclear Information System (INIS)

    Chaiwiwatworakul, Pipat; Chirarattananon, Surapong; Rakkwamsuk, Pattana

    2009-01-01

    This paper reports an experimental and simulation study on the application of an automated Venetian blind for daylighting in a tropical climate. A horizontal blind system operating automatically under programmed control was constructed and integrated onto the glazed windows to form a window system with an automated blind in a room of a laboratory building. A dimming controller was also integrated into the lighting system of the room. Different operation schemes of the window system were devised and tested in an attempt to maximize energy savings while maintaining the quality of the visual environment in the room. Intensive measurement of the illuminance of the interior space was undertaken during the experiments. A methodology for the calculation of interior daylight illuminance and associated glare corresponding to the configurations of the experiments was adopted. The method was coded into a computer program. Results of calculation from the program agree well with those from the experiments for all the schemes of operation conducted. The program was used to simulate the situation when each scheme of operation was implemented for a whole year. It was found that such a window system with an automated blind enabled energy savings of 80%, while a more sophisticated scheme also helped maintain the interior visual quality at a high level.

  20. Calculational-theoretical studies of the system of local automated regulators and lateral ionization chambers

    International Nuclear Information System (INIS)

    Aleksakov, A.N.; Emel'yanov, I.Ya.; Nikolaev, E.V.; Panin, V.M.; Podlazov, L.N.; Rogova, V.D.

    1987-01-01

    Methods are described for the engineering synthesis of systems for local automated power regulation of a nuclear reactor and stabilization of the radial-azimuthal energy distribution, operating on signals from lateral ionization chambers. Results of calculational-theoretical investigations into the system's efficiency and the peculiarities of its response to some perturbations typical of RBMK-type reactors are considered.

  1. Automated Manufacturing/Robotics Technology: Certificate and Associate Degree Programs.

    Science.gov (United States)

    McQuay, Paul L.

    A description is provided of the Automated Manufacturing/Robotics program to be offered at Delaware County Community College beginning in September 1984. Section I provides information on the use of reprogrammable industrial robots in manufacturing and the rapid changes in production that can be effected through the application of automated…

  2. Effects of advanced carbohydrate counting guided by an automated bolus calculator in Type 1 diabetes mellitus (StenoABC)

    DEFF Research Database (Denmark)

    Hommel, E; Schmidt, S; Vistisen, D

    2017-01-01

    AIMS: To test whether concomitant use of an automated bolus calculator for people with Type 1 diabetes carrying out advanced carbohydrate counting would induce further improvements in metabolic control. METHODS: We conducted a 12-month, randomized, parallel-group, open-label, single-centre, investigator-initiated clinical study. We enrolled advanced carbohydrate counting-naïve adults with Type 1 diabetes and HbA1c levels 64-100 mmol/mol (8.0-11.3%), who were receiving multiple daily insulin injection therapy. In a 1:1 ratio, participants were randomized to receive training in either advanced carbohydrate counting using mental calculations (MC group) or advanced carbohydrate counting using an automated bolus calculator (ABC group) during a 3.5-h group training course. For 12 months after training, participants attended a specialized diabetes centre quarterly. The primary outcome was change in HbA1c…

  3. Testing Automation of Context-Oriented Programs Using Separation Logic

    Directory of Open Access Journals (Sweden)

    Mohamed A. El-Zawawy

    2014-01-01

    Context-oriented programming (COP) is a programming approach that enables switching among contexts of commands during program execution. This technique is more structured and modular than object-oriented and aspect-oriented programming and hence more flexible. This paper introduces an accurate operational semantics for context-oriented programming as implemented in COP languages such as ContextJ* and ContextL. The language model of this paper uses Java concepts and is equipped with layer techniques for activation/deactivation of layer contexts. This paper also presents a logical system for COP programs. This logic, an extension of separation logic, is necessary for the automation of testing, developing, and validating partial correctness specifications for COP programs. A mathematical soundness proof for the logical system against the proposed operational semantics is presented in the paper.

  4. The automated testing system of programs with the graphic user interface within the context of educational process

    OpenAIRE

    Sychev, O.; Kiryushkin, A.

    2009-01-01

    The paper describes the problems of automating the educational process in the course "Programming in a high-level language. Algorithmic languages". The complexities of testing programs with a user interface are noted. Existing analogues are considered, and methods for automating the testing of students' assignments are proposed.

  5. Calculation methods in program CCRMN

    Energy Technology Data Exchange (ETDEWEB)

    Chonghai, Cai [Nankai Univ., Tianjin (China). Dept. of Physics; Qingbiao, Shen [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    CCRMN is a program for calculating complex reactions of a medium-heavy nucleus with six light particles. In CCRMN, the incoming particles can be neutrons, protons, ⁴He, deuterons, tritons and ³He. The CCRMN code is constructed within the framework of the optical model, pre-equilibrium statistical theory based on the exciton model, and the evaporation model. CCRMN is valid in the energy region from about 1 MeV upward; it can give correct results for optical model quantities and all kinds of reaction cross sections. This program has been applied in practical calculations and has given reasonable results.

  6. Automation of ORIGEN2 calculations for the transuranic waste baseline inventory database using a pre-processor and a post-processor

    International Nuclear Information System (INIS)

    Liscum-Powell, J.

    1997-06-01

    The purpose of the work described in this report was to automate ORIGEN2 calculations for the Waste Isolation Pilot Plant (WIPP) Transuranic Waste Baseline Inventory Database (WTWBID); this was done by developing a pre-processor to generate ORIGEN2 input files from WTWBID inventory files and a post-processor to remove excess information from the ORIGEN2 output files. The calculations performed with ORIGEN2 estimate the radioactive decay and buildup of various radionuclides in the waste streams identified in the WTWBID. The resulting radionuclide inventories are needed for performance assessment calculations for the WIPP site. The work resulted in the development of PreORG, which requires interaction with the user to generate ORIGEN2 input files on a site-by-site basis, and PostORG, which processes ORIGEN2 output into more manageable files. Both programs are written in the FORTRAN 77 computer language. After running PreORG, the user will run ORIGEN2 to generate the desired data; upon completion of the ORIGEN2 calculations, the user can run PostORG to process the output and make it more manageable. All the programs run on a 386 PC or higher with a math co-processor, or on a computer platform running the VMS operating system. The pre- and post-processors for ORIGEN2 were generated for use with Rev. 1 data of the WTWBID and can also be used with Rev. 2 and 3 data of the TWBID (Transuranic Waste Baseline Inventory Database).

  7. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    International Nuclear Information System (INIS)

    Christianson, Olav; Li Xiang; Frush, Donald; Samei, Ehsan

    2012-01-01

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA)-compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj values that differed by up to 44% from effective dose estimates that were not
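
    The dose arithmetic described, an effective dose from the dose-length product via a protocol-specific k-factor, then scaled by a patient-size correction derived from the measured thickness, can be sketched as follows. The exponential form of the size correction and all coefficients below are illustrative assumptions, not the fitted values used by the authors.

    ```python
    import math

    def size_adjusted_effective_dose(dlp_mGy_cm, k_mSv_per_mGy_cm,
                                     thickness_cm, ref_thickness_cm=30.0, b=0.035):
        """ED_adj = k * DLP, scaled by an exponential size-correction factor.
        The exponential form and the coefficients here are placeholders, not fitted values."""
        effective_dose = k_mSv_per_mGy_cm * dlp_mGy_cm
        size_factor = math.exp(b * (ref_thickness_cm - thickness_cm))
        return effective_dose * size_factor

    # Example: DLP = 600 mGy*cm, k = 0.015 mSv/(mGy*cm), 24 cm patient thickness
    print(round(size_adjusted_effective_dose(600, 0.015, 24.0), 1))   # ~11.1 mSv
    ```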

  8. Isochronous cyclotron closed equilibrium orbit calculation program description

    International Nuclear Information System (INIS)

    Kiyan, I.N.; Vorozhtsov, S.B.; Tarashkevich, R.

    2003-01-01

    The Equilibrium Orbit Research Program (EORP), written in C++ with the use of Visual C++, is described. The program is intended for the calculation of the particle rotation frequency and particle kinetic energy in the closed equilibrium orbits of an isochronous cyclotron, where the closed equilibrium orbits are described through the radius and particle momentum angle: r_eo(θ) and φ_p(θ). The program algorithm was developed on the basis of articles, lecture notes and original analytic calculations. The results of calculations by the EORP were checked and confirmed by using the results of calculations by numerical methods. The discrepancies between the EORP results and the numerical method results for the calculations of the particle rotation frequency and particle kinetic energy are within the limits of ±1·10⁻⁴. The EORP results and the numerical method results for the calculations of r_eo(θ) and φ_p(θ) practically coincide. All this proves the accuracy of calculations by the EORP for isochronous cyclotrons with azimuthally varied fields. As is evident from the results of the calculations, the program can be used for the calculations of both straight-sector and spiral-sector isochronous cyclotrons. (author)

  9. Report of the workshop on Aviation Safety/Automation Program

    Science.gov (United States)

    Morello, Samuel A. (Editor)

    1990-01-01

    As part of NASA's responsibility to encourage and facilitate active exchange of information and ideas among members of the aviation community, an Aviation Safety/Automation workshop was organized and sponsored by the Flight Management Division of NASA Langley Research Center. The one-day workshop was held on October 10, 1989, at the Sheraton Beach Inn and Conference Center in Virginia Beach, Virginia. Participants were invited from industry, government, and universities to discuss critical questions and issues concerning the rapid introduction and utilization of advanced computer-based technology into the flight deck and air traffic controller workstation environments. The workshop was attended by approximately 30 discipline experts, automation and human factors researchers, and research and development managers. The goal of the workshop was to address major issues identified by the NASA Aviation Safety/Automation Program. Here, the results of the workshop are documented. The ideas, thoughts, and concepts were developed by the workshop participants. The findings, however, have been synthesized into a final report primarily by the NASA researchers.

  10. Automated Critical Peak Pricing Field Tests: 2006 Pilot Program Description and Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila

    2007-06-19

    During 2006 Lawrence Berkeley National Laboratory (LBNL) and the Demand Response Research Center (DRRC) performed a technology evaluation for the Pacific Gas and Electric Company (PG&E) Emerging Technologies Programs. This report summarizes the design, deployment, and results from the 2006 Automated Critical Peak Pricing Program (Auto-CPP). The program was designed to evaluate the feasibility of deploying automation systems that allow customers to participate in critical peak pricing (CPP) with a fully-automated response. The 2006 program was in operation during the entire six-month CPP period from May through October. The methodology for this field study included site recruitment, control strategy development, automation system deployment, and evaluation of sites' participation in actual CPP events through the summer of 2006. LBNL recruited sites in PG&E's territory in northern California through contacts from PG&E account managers, conferences, and industry meetings. Each site contact signed a memorandum of understanding with LBNL that outlined the activities needed to participate in the Auto-CPP program. Each facility worked with LBNL to select and implement control strategies for demand response and developed automation system designs based on existing Internet connectivity and building control systems. Once the automation systems were installed, LBNL conducted communications tests to ensure that the Demand Response Automation Server (DRAS) correctly provided and logged the continuous communications of the CPP signals with the energy management and control system (EMCS) for each site. LBNL also observed and evaluated Demand Response (DR) shed strategies to ensure proper commissioning of controls. The communication system allowed sites to receive day-ahead as well as day-of signals for pre-cooling, a DR strategy used at a few sites. Measurement of demand response was conducted using two different baseline models for estimating peak load savings. One

  11. Automated generation of lattice QCD Feynman rules

    Energy Technology Data Exchange (ETDEWEB)

    Hart, A.; Mueller, E.H. [Edinburgh Univ. (United Kingdom). SUPA School of Physics and Astronomy; von Hippel, G.M. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Horgan, R.R. [Cambridge Univ. (United Kingdom). DAMTP, CMS

    2009-04-15

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)

  12. Automated generation of lattice QCD Feynman rules

    International Nuclear Information System (INIS)

    Hart, A.; Mueller, E.H.; Horgan, R.R.

    2009-04-01

    The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams. (orig.)

  13. Some calculator programs for particle physics

    International Nuclear Information System (INIS)

    Wohl, C.G.

    1982-01-01

    Seven calculator programs that do simple chores that arise in elementary particle physics are given. LEGENDRE evaluates the Legendre polynomial series Σ a_n P_n(x) at a series of values of x. ASSOCIATED LEGENDRE evaluates the first-associated Legendre polynomial series Σ b_n P_n^1(x) at a series of values of x. CONFIDENCE calculates confidence levels for χ², Gaussian, or Poisson probability distributions. TWO BODY calculates the c.m. energy, the initial- and final-state c.m. momenta, and the extreme values of t and u for a 2-body reaction. ELLIPSE calculates coordinates of points for drawing an ellipse plot showing the kinematics of a 2-body reaction or decay. DALITZ RECTANGULAR calculates coordinates of points on the boundary of a rectangular Dalitz plot. DALITZ TRIANGULAR calculates coordinates of points on the boundary of a triangular Dalitz plot. There are short versions of CONFIDENCE (EVEN N and POISSON) that calculate confidence levels for the even-degree-of-freedom χ² and Poisson cases, and there is a short version of TWO BODY (CM) that calculates just the c.m. energy and initial-state momentum. The programs are written for the HP-97 calculator.
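
    As an example of the kinematics these programs package, the sketch below reproduces the TWO BODY-style quantities for a fixed-target 1 + 2 → 3 + 4 reaction: the c.m. energy and the initial- and final-state c.m. momenta via the Källén triangle function. The reaction and masses in the example are arbitrary illustrations.

    ```python
    import math

    def kallen(a, b, c):
        """Källén triangle function lambda(a, b, c)."""
        return a*a + b*b + c*c - 2.0*(a*b + b*c + c*a)

    def two_body(m1, m2, m3, m4, p_lab):
        """Fixed-target 1 + 2 -> 3 + 4: sqrt(s) and the initial/final c.m. momenta (GeV)."""
        e1_lab = math.hypot(p_lab, m1)                 # beam total energy
        s = m1*m1 + m2*m2 + 2.0*m2*e1_lab              # invariant mass squared
        p_cm_i = math.sqrt(kallen(s, m1*m1, m2*m2)) / (2.0*math.sqrt(s))
        p_cm_f = math.sqrt(kallen(s, m3*m3, m4*m4)) / (2.0*math.sqrt(s))
        return math.sqrt(s), p_cm_i, p_cm_f

    # pi- p -> K0 Lambda at 2 GeV/c beam momentum (masses in GeV)
    print(two_body(0.1396, 0.9383, 0.4976, 1.1157, 2.0))
    ```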

  14. Utilizing Facebook and Automated Telephone Calls to Increase Adoption of a Local Smoke Alarm Installation Program.

    Science.gov (United States)

    Frattaroli, Shannon; Schulman, Eric; McDonald, Eileen M; Omaki, Elise C; Shields, Wendy C; Jones, Vanya; Brewer, William

    2018-05-17

    Innovative strategies are needed to improve the prevalence of working smoke alarms in homes. To our knowledge, this is the first study to report on the effectiveness of Facebook advertising and automated telephone calls as population-level strategies to encourage an injury prevention behavior. We examine the effectiveness of Facebook advertising and automated telephone calls as strategies to enroll individuals in Baltimore City's Fire Department's free smoke alarm installation program. We directed our advertising efforts toward Facebook users eligible for the Baltimore City Fire Department's free smoke alarm installation program and all homes with a residential phone line included in Baltimore City's automated call system. The Facebook campaign targeted Baltimore City residents 18 years of age and older. In total, an estimated 300 000 Facebook users met the eligibility criteria. Facebook advertisements were delivered to users' desktop and mobile device newsfeeds. A prerecorded message was sent to all residential landlines listed in the city's automated call system. By the end of the campaign, the 3 advertisements generated 456 666 impressions reaching 130 264 Facebook users. Of the users reached, 4367 individuals (1.3%) clicked the advertisement. The automated call system included approximately 90 000 residential phone numbers. Participants attributed 25 smoke alarm installation requests to Facebook and 458 to the automated call. Facebook advertisements are a novel approach to promoting smoke alarms and appear to be effective in exposing individuals to injury prevention messages. However, converting Facebook message recipients to users of a smoke alarm installation program occurred infrequently in this study. Residents who participated in the smoke alarm installation program were more likely to cite the automated call as the impetus for their participation. Additional research is needed to understand the circumstances and strategies to effectively use the social

  15. PASA - A Program for Automated Protein NMR Backbone Signal Assignment by Pattern-Filtering Approach

    International Nuclear Information System (INIS)

    Xu Yizhuang; Wang Xiaoxia; Yang Jun; Vaynberg, Julia; Qin Jun

    2006-01-01

    We present a new program, PASA (Program for Automated Sequential Assignment), for assigning protein backbone resonances based on multidimensional heteronuclear NMR data. Distinct from existing programs, PASA emphasizes a per-residue-based pattern-filtering approach during the initial stage of the automated ¹³Cα and/or ¹³Cβ chemical shift matching. The pattern filter employs one or multiple constraints, such as ¹³Cα/¹³Cβ chemical shift ranges for different amino acid types and side-chain spin systems, which helps to rule out, in a stepwise fashion, improbable assignments resulting from resonance degeneracy or missing signals. Such a stepwise filtering approach substantially minimizes early false-linkage problems that often propagate, amplify, and ultimately cause complication or combinatorial explosion of the automation process. Our program (http://www.lerner.ccf.org/moleccard/qin/) was tested on four representative small- to large-sized proteins with various degrees of resonance degeneracy and missing signals, and we show that PASA achieved, efficiently and rapidly, assignments that are fully consistent with those obtained by laborious manual protocols. The results demonstrate that PASA may be a valuable tool for NMR-based structural analyses, genomics, and proteomics.
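
    The per-residue pattern filter described operates on the simple idea that observed Cα/Cβ shifts are only compatible with certain residue types. The Python sketch below illustrates that filtering step with a tiny table of approximate chemical shift windows; the ranges and the tolerance are textbook-style placeholders, not PASA's internal values.

    ```python
    # Approximate 13C ranges (ppm); illustrative only, not PASA's internal tables.
    CA_CB_RANGES = {
        "ALA": ((50.0, 55.0), (16.0, 21.0)),
        "GLY": ((43.0, 47.5), None),          # glycine has no CB
        "SER": ((55.5, 61.5), (62.0, 66.0)),
        "THR": ((59.0, 66.0), (68.0, 72.5)),
    }

    def filter_residue_types(ca_obs, cb_obs, tol=0.5):
        """Keep only residue types whose CA/CB windows can explain the observed shifts."""
        hits = []
        for res, (ca_rng, cb_rng) in CA_CB_RANGES.items():
            ca_ok = ca_rng[0] - tol <= ca_obs <= ca_rng[1] + tol
            cb_ok = (cb_obs is None and cb_rng is None) or \
                    (cb_obs is not None and cb_rng is not None
                     and cb_rng[0] - tol <= cb_obs <= cb_rng[1] + tol)
            if ca_ok and cb_ok:
                hits.append(res)
        return hits

    print(filter_residue_types(60.2, 63.5))   # -> ['SER'] (consistent with serine)
    ```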

  16. Program for the surface muon spectra calculation

    International Nuclear Information System (INIS)

    Arkatov, Yu.M.; Voloshchuk, V.I.; Zolenko, V.A.; Prokhorets, I.M.; Soldatov, S.A.

    1987-01-01

    A program for the calculation of the "surface" muon spectrum is described. The algorithm is based on simulating the coordinates of the π-meson birth point and the direction of its escape from the meson-forming target (MFT) according to the angular distribution, using the Monte Carlo method. Ionization losses of π(μ)-mesons in the target are taken into account in the program. The calculation of the "surface" muon spectrum is performed in the range of electron energies from 150 MeV up to 1000 MeV. Spectra of π-mesons are calculated with and without account of ionization losses in the target. Distributions over the lengths of π-meson paths in the MFT and the contributions of separate sections of the target to the pion flux at the outlet of the meson channel are calculated as well. The meson-forming target can be made of any material. The program provides for the MFT itself to serve as the photon converter, or for a photon converter to be located in front of the target. The program is composed of 13 subprograms; two of them are generators of pseudorandom numbers distributed uniformly between 0 and 1 and of numbers with a Gaussian distribution. An example calculation for a copper target 3 cm long, an electron beam current of 1 μA and an energy of 300 MeV is presented.

  17. Software complex for developing dynamically packed program system for experiment automation

    International Nuclear Information System (INIS)

    Baluka, G.; Salamatin, I.M.

    1985-01-01

    A software complex for developing dynamically packed program systems for experiment automation is considered. The complex includes a general-purpose programming system, represented by the standard RT-11 operating system, and specially developed problem-oriented modules providing execution of specific jobs. The described complex is implemented in the Pascal and MACRO-2 languages and is flexible with respect to variations in the experimental technique

  18. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  19. CONCEPTUAL STRUCTURAL-LOGIC DIAGRAM FOR PRODUCTION AUTOMATION OF AN EXPERT STUDY ON THE ISSUE OF CORRECTNESS OF CALCULATION OF THE TAX ON PROFIT OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Andrey N. Ishchenko

    2014-01-01

    In this article, the possibility of automating an expert study on the question of the correctness of calculation of the tax on profit of organizations is considered. The problems of formalizing expert research in this field are examined, and the structure of the expert conclusion is specified. The author proposes a conceptual structural-logic diagram for automating expert research in this area.

  20. PONDEROSA, an automated 3D-NOESY peak picking program, enables automated protein structure determination.

    Science.gov (United States)

    Lee, Woonghee; Kim, Jin Hae; Westler, William M; Markley, John L

    2011-06-15

    PONDEROSA (Peak-picking Of Noe Data Enabled by Restriction of Shift Assignments) accepts input information consisting of a protein sequence, backbone and sidechain NMR resonance assignments, and 3D-NOESY ((13)C-edited and/or (15)N-edited) spectra, and returns assignments of NOESY crosspeaks, distance and angle constraints, and a reliable NMR structure represented by a family of conformers. PONDEROSA incorporates and integrates external software packages (TALOS+, STRIDE and CYANA) to carry out different steps in the structure determination. PONDEROSA implements internal functions that identify and validate NOESY peak assignments and assess the quality of the calculated three-dimensional structure of the protein. The robustness of the analysis results from PONDEROSA's hierarchical processing steps that involve iterative interaction among the internal and external modules. PONDEROSA supports a variety of input formats: SPARKY assignment table (.shifts) and spectrum file formats (.ucsf), XEASY proton file format (.prot), and NMR-STAR format (.star). To demonstrate the utility of PONDEROSA, we used the package to determine 3D structures of two proteins: human ubiquitin and Escherichia coli iron-sulfur scaffold protein variant IscU(D39A). The automatically generated structural constraints and ensembles of conformers were as good as or better than those determined previously by much less automated means. The program, in the form of binary code along with tutorials and reference manuals, is available at http://ponderosa.nmrfam.wisc.edu/.

  1. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    Energy Technology Data Exchange (ETDEWEB)

    Christianson, Olav; Li Xiang; Frush, Donald; Samei, Ehsan [Clinical Imaging Physics Group, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Department of Physics, Duke University, Durham, North Carolina 27710 (United States); Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708 (United States)]

    2012-11-15

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act-compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj that differed by up to 44% from effective dose
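
    A minimal sketch of the kind of size-adjusted effective dose computation described above, assuming an illustrative protocol-specific k-factor and an exponential size correction in the spirit of AAPM Report 204; the coefficients and the correction form are assumptions, not the authors' values.

    ```python
    import math

    # Illustrative values only; the real program extracts these via OCR from the dose report
    DLP_MGY_CM = 450.0            # dose-length product from the scanner dose report
    K_FACTOR = 0.015              # mSv per mGy*cm, assumed chest-protocol conversion factor
    PATIENT_THICKNESS_CM = 26.0
    REFERENCE_THICKNESS_CM = 30.0
    ALPHA = 0.04                  # assumed exponential size-correction coefficient

    def size_adjusted_effective_dose(dlp, k, thickness):
        """Effective dose scaled by an exponential patient-size correction."""
        effective_dose = k * dlp                                    # standard ED estimate
        size_factor = math.exp(ALPHA * (REFERENCE_THICKNESS_CM - thickness))
        return effective_dose * size_factor                         # ED_adj

    print(f"ED_adj = {size_adjusted_effective_dose(DLP_MGY_CM, K_FACTOR, PATIENT_THICKNESS_CM):.2f} mSv")
    ```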

  2. Utilization of a system of automated radiotherapy of malignant tumors using optimum programs of irradiation

    International Nuclear Information System (INIS)

    Pavlov, A.S.; Kostromina, K.N.; Fadeeva, M.A.

    1983-01-01

    The clinical experience gained in implementing optimized irradiation programs for tumors of different sites is summarized for the first serial specimen of the Altai-MT automated irradiation control system. Use of the system saves time and avoids errors in the implementation of complex irradiation programs, and also lowers the radiation exposure of medical personnel. The automated irradiation programs meet the requirements for conformity and homogeneity of the dose field within the lesion focus, for gradient conditions at the border with normal tissues, and for minimization of the radiation exposure of critical organs

  3. A spreadsheet-coupled SOLGAS: A computerized thermodynamic equilibrium calculation tool. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Trowbridge, L.D.; Leitnaker, J.M. [Oak Ridge K-25 Site, TN (United States). Technical Analysis and Operations Div.

    1995-07-01

    SOLGAS, an early computer program for calculating equilibrium in a chemical system, has been made more user-friendly, and several "bells and whistles" have been added. The necessity to include elemental species has been eliminated. The input of large numbers of starting conditions has been automated. A revised spreadsheet-based format for entering data, including non-ideal binary and ternary mixtures, simplifies input and reduces the chance of error. Calculational errors by SOLGAS are flagged, and several programming errors are corrected. Auxiliary programs are available to assemble and partially automate the plotting of large amounts of data. Thermodynamic input data can be changed on line. The program can be operated with or without a co-processor. Copies of the program, suitable for the IBM-PC or compatibles with at least 384K bytes of low RAM, are available from the authors. This user manual contains appendices with examples of the use of SOLGAS. These range from elementary examples, such as the relationships among water, ice, and water vapor, to more complex systems: phase diagram calculation of the UF4 and UF6 system; burning UF4 in fluorine; thermodynamic calculation of the Cl-F-O-H system; equilibria calculations in the CCl4-CH3OH system; and limitations applicable to aqueous solutions. An appendix also contains the source code.

  4. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  5. The organization of professional predictions on the development of automation for stope equipment

    Energy Technology Data Exchange (ETDEWEB)

    Kanygin, U.M.; Markashov, V.E.; Pashchevskii, U.G.

    1980-01-01

    The problems of organizing and conducting expert predictions of the development of automation for stope equipment are examined. Expert evaluations are developed, and the procedure for processing the results is given, together with a calculation program for use with the ES-1020 computer. Several results from predictive studies of the development of automation for stope equipment are given.

  6. Isochronous Cyclotron Closed Equilibrium Orbit Calculation Program Description

    CERN Document Server

    Kian, I N; Tarashkevich, R

    2003-01-01

    The Equilibrium Orbit Research Program (EORP), written in C++ with the use of Visual C++, is described. The program is intended for the calculation of the particle rotation frequency and particle kinetic energy in the closed equilibrium orbits of an isochronous cyclotron, where the closed equilibrium orbits are described through the radius and particle momentum angle: r_eo(θ) and φ_p(θ). The program algorithm was developed on the basis of articles, lecture notes and original analytic calculations. The results of calculations by the EORP were checked and confirmed by using the results of calculations by numerical methods. The discrepancies between the EORP results and the numerical method results for the calculations of the particle rotation frequency and particle kinetic energy are within the limits of ±1·10⁻⁴. The EORP results and the numerical method results for the calculations of r_eo(θ) and φ_p(θ) practically coincide. All this proves the accuracy of ca...

  7. Aviation safety/automation program overview

    Science.gov (United States)

    Morello, Samuel A.

    1990-01-01

    The goal is to provide a technology base leading to improved safety of the national airspace system through the development and integration of human-centered automation technologies for aircraft crews and air traffic controllers. Information on the problems, specific objectives, human-automation interaction, intelligent error-tolerant systems, and air traffic control/cockpit integration is given in viewgraph form.

  8. A nonproprietary, nonsecret program for calculating Stirling cryocoolers

    Science.gov (United States)

    Martini, W. R.

    1985-01-01

    A design program for an integrated Stirling cycle cryocooler was written for an IBM-PC computer. The program is easy to use, shows the trends and itemizes the losses. The calculated results were compared with some measured performance values. The program predicts somewhat optimistic performance and needs to be calibrated further against experimental measurements. Adding a multiplier to the friction factor can bring the calculated results in line with the limited test results available so far. The program is offered as a good framework on which to build a truly useful design program for all types of cryocoolers.

  9. Newnes circuit calculations pocket book with computer programs

    CERN Document Server

    Davies, Thomas J

    2013-01-01

    Newnes Circuit Calculations Pocket Book: With Computer Programs presents equations, examples, and problems in circuit calculations. The text includes 300 computer programs that help solve the problems presented. The book is comprised of 20 chapters that tackle different aspects of circuit calculation. The coverage of the text includes dc voltage, dc circuits, and network theorems. The book also covers oscillators, phasors, and transformers. The text will be useful to electrical engineers and other professionals whose work involves electronic circuitry.

  10. Performance of a fully automated program for measurement of left ventricular ejection fraction

    International Nuclear Information System (INIS)

    Douglass, K.H.; Tibbits, P.; Kasecamp, W.; Han, S.T.; Koller, D.; Links, J.M.; Wagner, H.H. Jr.

    1982-01-01

    A fully automated program developed by us for measurement of left ventricular ejection fraction from equilibrium gated blood pool studies was evaluated in 130 additional patients. Both 6-min (130 studies) and 2-min (142 studies in 31 patients) gated blood pool studies were acquired and processed. The program successfully generated ejection fractions in 86% of the studies. These automatically generated ejection fractions were compared with ejection fractions derived from manually drawn regions of interest. When studies were acquired for 6 min with the patient at rest, the correlation between automated and manual ejection fractions was 0.92. When studies were acquired for 2 min, both at rest and during bicycle exercise, the correlation was 0.81. In 25 studies from patients who also underwent contrast ventriculography, the program successfully generated regions of interest in 22 (88%). The correlation between the ejection fraction determined by contrast ventriculography and the automatically generated radionuclide ejection fraction was 0.79. (orig.)

  11. Elementary function calculation programs for the central processor-6

    International Nuclear Information System (INIS)

    Dobrolyubov, L.V.; Ovcharenko, G.A.; Potapova, V.A.

    1976-01-01

    Subprograms for calculating elementary functions on the central processor of the AS-6 (CP AS-6) are given. A procedure is described for obtaining the computational formulae which represent the elementary functions as polynomials. Standard programs for random numbers are also considered. All the programs described are based upon the algorithms of the corresponding programs for the BESM computer
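
    As an illustration of the polynomial-approximation idea (not the AS-6 routines themselves), the sketch below fits a low-degree polynomial to exp(x) on [0, 1] and evaluates it with Horner's scheme; the degree and interval are arbitrary choices.

    ```python
    import numpy as np

    # Fit a degree-5 polynomial to exp(x) on [0, 1] (illustrative choice of function and interval)
    xs = np.linspace(0.0, 1.0, 200)
    coeffs = np.polyfit(xs, np.exp(xs), deg=5)   # highest-order coefficient first

    def horner(c, x):
        """Evaluate a polynomial given highest-order-first coefficients."""
        acc = 0.0
        for ci in c:
            acc = acc * x + ci
        return acc

    x = 0.37
    approx = horner(coeffs, x)
    print(f"poly ≈ {approx:.8f}, exp = {np.exp(x):.8f}, error = {abs(approx - np.exp(x)):.2e}")
    ```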

  12. Concurrent validity of an automated algorithm for computing the center of pressure excursion index (CPEI).

    Science.gov (United States)

    Diaz, Michelle A; Gibbons, Mandi W; Song, Jinsup; Hillstrom, Howard J; Choe, Kersti H; Pasquale, Maria R

    2018-01-01

    Center of Pressure Excursion Index (CPEI), a parameter computed from the distribution of plantar pressures during the stance phase of barefoot walking, has been used to assess dynamic foot function. The original custom program developed to calculate CPEI required the oversight of a user who could manually correct for certain exceptions to the computational rules. A new fully automatic program has been developed to calculate CPEI with an algorithm that accounts for these exceptions. The purpose of this paper is to compare the CPEI values computed by these two programs on plantar pressure data from both asymptomatic and pathologic subjects. If comparable, the new program offers significant benefits: reduced potential for variability due to rater discretion and faster CPEI calculation. CPEI values were calculated from barefoot plantar pressure distributions during comfortable-paced walking in 61 healthy asymptomatic adults, 19 diabetic adults with moderate hallux valgus, and 13 adults with mild hallux valgus. Right-foot data for each subject were analyzed with linear regression and a Bland-Altman plot. The automated algorithm yielded CPEI values that were linearly related to those of the original program (R² = 0.99), with close agreement between the two computation methods. Results of this analysis suggest that the new automated algorithm may be used to calculate CPEI on both healthy and pathologic feet. Copyright © 2017 Elsevier B.V. All rights reserved.
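
    A brief sketch of the kind of agreement analysis described above, assuming two arrays of CPEI values (one per program) are already available; the variable names and data are illustrative.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical CPEI values from the manual-oversight program and the automated one
    cpei_manual = np.array([18.2, 21.5, 19.9, 24.1, 17.3, 22.8])
    cpei_auto   = np.array([18.0, 21.7, 19.8, 24.4, 17.1, 22.9])

    # Linear regression between the two methods
    fit = stats.linregress(cpei_manual, cpei_auto)
    print(f"slope={fit.slope:.3f} intercept={fit.intercept:.3f} R^2={fit.rvalue**2:.3f}")

    # Bland-Altman statistics: mean difference (bias) and limits of agreement
    diff = cpei_auto - cpei_manual
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    print(f"bias={bias:.3f}, limits of agreement = [{bias - loa:.3f}, {bias + loa:.3f}]")
    ```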

  13. Fully automated segmentation of left ventricle using dual dynamic programming in cardiac cine MR images

    Science.gov (United States)

    Jiang, Luan; Ling, Shan; Li, Qiang

    2016-03-01

    Cardiovascular diseases are becoming a leading cause of death all over the world. The cardiac function could be evaluated by global and regional parameters of left ventricle (LV) of the heart. The purpose of this study is to develop and evaluate a fully automated scheme for segmentation of LV in short axis cardiac cine MR images. Our fully automated method consists of three major steps, i.e., LV localization, LV segmentation at end-diastolic phase, and LV segmentation propagation to the other phases. First, the maximum intensity projection image along the time phases of the midventricular slice, located at the center of the image, was calculated to locate the region of interest of LV. Based on the mean intensity of the roughly segmented blood pool in the midventricular slice at each phase, end-diastolic (ED) and end-systolic (ES) phases were determined. Second, the endocardial and epicardial boundaries of LV of each slice at ED phase were synchronously delineated by use of a dual dynamic programming technique. The external costs of the endocardial and epicardial boundaries were defined with the gradient values obtained from the original and enhanced images, respectively. Finally, with the advantages of the continuity of the boundaries of LV across adjacent phases, we propagated the LV segmentation from the ED phase to the other phases by use of dual dynamic programming technique. The preliminary results on 9 clinical cardiac cine MR cases show that the proposed method can obtain accurate segmentation of LV based on subjective evaluation.

  14. Automation and schema acquisition in learning elementary computer programming : implications for the design of practice

    NARCIS (Netherlands)

    van Merrienboer, Jeroen J.G.; van Merrienboer, J.J.G.; Paas, Fred G.W.C.

    1990-01-01

    Two complementary processes may be distinguished in learning a complex cognitive skill such as computer programming. First, automation offers task-specific procedures that may directly control programming behavior; second, schema acquisition offers cognitive structures that provide analogies in new

  15. Space Missions for Automation and Robotics Technologies (SMART) Program

    Science.gov (United States)

    Cliffone, D. L.; Lum, H., Jr.

    1985-01-01

    NASA is currently considering the establishment of a Space Mission for Automation and Robotics Technologies (SMART) Program to define, develop, integrate, test, and operate a spaceborne national research facility for the validation of advanced automation and robotics technologies. Initially, the concept is envisioned to be implemented through a series of shuttle-based flight experiments which will utilize telepresence technologies and real-time operation concepts. However, eventually the facility will be capable of a more autonomous role and will be supported by either the shuttle or the space station. To ensure incorporation of leading-edge technology in the facility, performance capability will periodically and systematically be upgraded by the solicitation of recommendations from a user advisory group. The facility will be managed by NASA, but will be available to all potential investigators. Experiments for each flight will be selected by a peer review group. Detailed definition and design are proposed to take place during FY 86, with the first SMART flight projected for FY 89.

  16. Program for TI programmable 59 calculator for calculation of 3H concentration of water samples

    International Nuclear Information System (INIS)

    Hussain, S.D.; Asghar, G.

    1982-09-01

    A program has been developed for the TI Programmable 59 calculator of Texas Instruments Inc. to calculate the 3H (tritium) concentration of water samples, processed with or without prior electrolytic enrichment, from observed parameters such as the count rate. The procedure for using the program is described in detail. A brief description of the laboratory treatment of samples and of the mathematical equations used in the calculations is given. (orig./A.B.)

  17. Population-level effects of automated smoking cessation help programs: a randomized controlled trial.

    Science.gov (United States)

    Borland, Ron; Balmford, James; Benda, Peter

    2013-03-01

    To test the population impact of offering automated smoking cessation interventions via the internet and/or by mobile phone. Pragmatic randomized controlled trial with five conditions: offer of (i) minimal intervention control; (ii) QuitCoach personalized tailored internet-delivered advice program; (iii) onQ, an interactive automated text-messaging program; (iv) an integration of both QuitCoach and onQ; and (v) a choice of either alone or the combined program. Australia, via a mix of internet and telephone contacts. A total of 3530 smokers or recent quitters recruited from those interested in quitting, and seeking self-help resources (n = 1335) or cold-contacted from internet panels (n = 2195). The primary outcome was self-report of 6 months sustained abstinence at 7 months post-recruitment. Only 42.5% of those offered one of the interventions took it up to a minimal level. The intervention groups combined had a non-significantly higher 6-month sustained abstinence rate than the control [odds ratio (OR) = 1.48; 95% confidence interval (CI): 0.98-2.24] (missing cases treated as smokers), with no differences between the interventions. Among those who used an intervention, there was a significant overall increase in abstinence (OR = 1.95; CI: 1.04-3.67), but not clearly so when analysing only cases with reported outcomes. Success rates were greater among those recruited after seeking information compared to those cold-contacted. Smokers interested in quitting who were assigned randomly to an offer of either the QuitCoach internet-based support program and/or the interactive automated text-messaging program had non-significantly greater odds of quitting for at least 6 months than those randomized to an offer of a simple information website. © 2012 The Authors, Addiction © 2012 Society for the Study of Addiction.
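
    For readers who want to reproduce the style of effect estimate reported above, here is a small sketch that computes an odds ratio and its 95% confidence interval from 2×2 counts; the counts are invented for illustration and are not the trial's data.

    ```python
    import math

    # Hypothetical counts: quitters / non-quitters in the intervention and control arms
    a, b = 150, 2500   # intervention: quitters, non-quitters
    c, d = 40, 840     # control: quitters, non-quitters

    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)        # standard error of ln(OR)
    lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
    print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```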

  18. Automated calculation of myocardial external efficiency from a single 11C-acetate PET/CT scan

    DEFF Research Database (Denmark)

    Harms, Hans; Tolbod, Lars Poulsen; Hansson, Nils Henrik

    Background: Dynamic PET with 11C-acetate can be used to assess myocardial oxygen use, which in turn is used to calculate myocardial external efficiency (MEE), an early marker of heart failure. MEE is defined as the ratio of total work (TW) and total energy use (TE). Calculation of TW and TE requires ... The aim of this study was to develop and validate an automated method of calculating MEE from a single dynamic 11C-acetate PET scan. Methods: 21 subjects underwent a dynamic 27 min 11C-acetate PET scan on a Siemens Biograph TruePoint 64 PET/CT scanner. Using cluster analysis, the LV-aortic time-activity curve (TACLV) ... Conclusion: Myocardial efficiency can be derived directly and automatically from a single dynamic 11C-acetate PET scan. This eliminates the need for a separate CMR scan and eliminates any potential errors due to different loading conditions between CMR and PET scans.

  19. Automation of testing the metrological reliability of nondestructive control systems

    International Nuclear Information System (INIS)

    Zhukov, Yu.A.; Isakov, V.B.; Karlov, Yu.K.; Kovalevskij, Yu.A.

    1987-01-01

    The capabilities of microcomputers are used to solve the problem of testing control and measurement systems. Besides the main program, a data-processing program for characterizing nondestructive control systems is written in the microcomputer. The program includes two modules. The first module contains test programs that determine the accuracy of the functional elements of the microcomputer and of the interface elements, issuing a message to the operator on the readiness of the elements for operation or on the failure of a particular element. The second module includes calculational programs for determining the metrological reliability of the measuring channels, and a calculational subprogram for the random statistical measuring error, time instability and "dead time". Automation of testing the metrological reliability of nondestructive control systems increases the reliability of determining metrological parameters and reduces the time of system testing

  20. Automated Inadvertent Intruder Application

    International Nuclear Information System (INIS)

    Koffman, Larry D.; Lee, Patricia L.; Cook, James R.; Wilhite, Elmer L.

    2008-01-01

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and
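
    The dose arithmetic described above is straightforward; a minimal sketch, assuming illustrative radionuclide concentrations and pathway-specific dose conversion factors (not the SRS values), might look like this.

    ```python
    # Hypothetical inventory concentrations (Ci per m^3 of waste) and
    # pathway dose conversion factors (mrem/yr per Ci/m^3) for one intruder scenario
    concentrations = {"Cs-137": 2.0e-3, "Sr-90": 5.0e-4, "Pu-239": 1.0e-5}
    dose_conversion = {            # agriculture-scenario factors, purely illustrative
        "Cs-137": {"ingestion": 4.0e2, "external": 9.0e2, "inhalation": 1.0e1},
        "Sr-90":  {"ingestion": 6.0e2, "external": 1.0e0, "inhalation": 2.0e1},
        "Pu-239": {"ingestion": 3.0e3, "external": 5.0e-1, "inhalation": 8.0e4},
    }

    def scenario_dose(conc, dcf):
        """Sum dose over nuclides and exposure pathways for one intruder scenario."""
        return sum(conc[n] * sum(dcf[n].values()) for n in conc)

    print(f"Total scenario dose: {scenario_dose(concentrations, dose_conversion):.3f} mrem/yr")
    ```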

  1. Automation of Educational Tasks for Academic Radiology.

    Science.gov (United States)

    Lamar, David L; Richardson, Michael L; Carlson, Blake

    2016-07-01

    The process of education involves a variety of repetitious tasks. We believe that appropriate computer tools can automate many of these chores, and allow both educators and their students to devote a lot more of their time to actual teaching and learning. This paper details tools that we have used to automate a broad range of academic radiology-specific tasks on Mac OS X, iOS, and Windows platforms. Some of the tools we describe here require little expertise or time to use; others require some basic knowledge of computer programming. We used TextExpander (Mac, iOS) and AutoHotKey (Win) for automated generation of text files, such as resident performance reviews and radiology interpretations. Custom statistical calculations were performed using TextExpander and the Python programming language. A workflow for automated note-taking was developed using Evernote (Mac, iOS, Win) and Hazel (Mac). Automated resident procedure logging was accomplished using Editorial (iOS) and Python. We created three variants of a teaching session logger using Drafts (iOS) and Pythonista (iOS). Editorial and Drafts were used to create flashcards for knowledge review. We developed a mobile reference management system for iOS using Editorial. We used the Workflow app (iOS) to automatically generate a text message reminder for daily conferences. Finally, we developed two separate automated workflows, one with Evernote (Mac, iOS, Win) and one with Python (Mac, Win), that generate simple automated teaching file collections. We have beta-tested these workflows, techniques, and scripts on several of our fellow radiologists. All of them expressed enthusiasm for these tools and were able to use one or more of them to automate their own educational activities. Appropriate computer tools can automate many educational tasks, and thereby allow both educators and their students to devote a lot more of their time to actual teaching and learning. Copyright © 2016 The Association of University Radiologists

  2. Data acquisition and processing for flame spectrophotometry using a programmable desk calculator

    International Nuclear Information System (INIS)

    Hurteau, M.T.; Ashley, R.W.

    1976-02-01

    A programmable calculator has been used to provide automatic data acquisition and processing for flame spectrophotometric measurements. When coupled with an automatic wavelength selector, it provides complete automation of sample analysis for one or more elements in solution. The program takes into account the deviation of analytical curves from linearity. Increased sensitivity and precision over manual calculations are obtained. (author)

  3. Exploring the Lived Experiences of Program Managers Regarding an Automated Logistics Environment

    Science.gov (United States)

    Allen, Ronald Timothy

    2014-01-01

    Automated Logistics Environment (ALE) is a new term used by Navy and aerospace industry executives to describe the aggregate of logistics-related information systems that support modern aircraft weapon systems. The development of logistics information systems is not always well coordinated among programs, often resulting in solutions that cannot…

  4. Automated computation of one-loop integrals in massless theories

    International Nuclear Information System (INIS)

    Hameren, A. van; Vollinga, J.; Weinzierl, S.

    2005-01-01

    We consider one-loop tensor and scalar integrals, which occur in a massless quantum field theory, and we report on the implementation into a numerical program of an algorithm for the automated computation of these one-loop integrals. The number of external legs of the loop integrals is not restricted. All calculations are done within dimensional regularization. (orig.)

  5. Embedded design based virtual instrument program for positron beam automation

    International Nuclear Information System (INIS)

    Jayapandian, J.; Gururaj, K.; Abhaya, S.; Parimala, J.; Amarendra, G.

    2008-01-01

    Automation of a positron beam experiment with a single-chip embedded design using a programmable system on chip (PSoC), which provides easy interfacing of the high-voltage DC power supply, is reported. The Virtual Instrument (VI) control program written in Visual Basic 6.0 ensures the following functions: (i) adjustment of the sample high voltage by interacting with the programmed PSoC hardware, (ii) control of a personal computer (PC) based multichannel analyzer (MCA) card for energy spectroscopy, (iii) analysis of the obtained spectrum to extract the relevant line-shape parameters, (iv) plotting of relevant parameters and (v) saving the file in the appropriate format. The present study highlights the hardware features of the PSoC module as well as the control of the MCA and other units through programming in Visual Basic

  6. Automation of the computational programs and codes used in the methodology of neutronic and thermohydraulic calculation for the IEA-R1 nuclear reactor

    International Nuclear Information System (INIS)

    Stefani, Giovanni Laranjo de

    2009-01-01

    This work presents a computational program that automates the execution of the various programs making up the neutronic and thermal-hydraulic calculation methodology for the IEA-R1 reactor (Sao Paulo, Brazil), making the process more practical and safe and automating the handling of the output data of each program. This reactor is largely used for the production of radioisotopes for medical use, material irradiation, personnel training and basic research. For these purposes it is necessary to change its core configuration in order to adapt the reactor to different uses. The work turns the various existing programs into subroutines of a main program, i.e., a program which calls each of the codes automatically when necessary, and creates additional programs for manipulating the output data, thereby making the process practical

  7. Monte Carlo shielding analyses using an automated biasing procedure

    International Nuclear Information System (INIS)

    Tang, J.S.; Hoffman, T.J.

    1988-01-01

    A systematic and automated approach for biasing Monte Carlo shielding calculations is described. In particular, adjoint fluxes from a one-dimensional discrete ordinates calculation are used to generate biasing parameters for a Monte Carlo calculation. The entire procedure of adjoint calculation, biasing parameters generation, and Monte Carlo calculation has been automated. The automated biasing procedure has been applied to several realistic deep-penetration shipping cask problems. The results obtained for neutron and gamma-ray transport indicate that with the automated biasing procedure Monte Carlo shielding calculations of spent-fuel casks can be easily performed with minimum effort and that accurate results can be obtained at reasonable computing cost
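
    The essential mechanics of this kind of biasing is importance sampling: draw source particles from a biased distribution shaped by the adjoint (importance) function and carry a statistical weight that undoes the bias. Below is a generic sketch, with a made-up one-dimensional importance profile standing in for the discrete-ordinates adjoint flux.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Source cells along a 1D shield axis with an assumed adjoint-flux (importance) profile
    cells = np.arange(10)
    source_pdf = np.full(10, 0.1)                  # true (unbiased) source distribution
    importance = np.exp(0.5 * cells)               # adjoint flux grows toward the detector side

    # Biased sampling distribution proportional to source * importance
    biased_pdf = source_pdf * importance
    biased_pdf /= biased_pdf.sum()

    n = 100_000
    samples = rng.choice(cells, size=n, p=biased_pdf)
    weights = source_pdf[samples] / biased_pdf[samples]   # weight restores the unbiased estimate

    # Any tally scored with these weights is unbiased; e.g. the mean source cell index
    print("weighted mean cell:", np.average(samples, weights=weights))
    print("true mean cell    :", np.average(cells, weights=source_pdf))
    ```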

  8. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    Energy Technology Data Exchange (ETDEWEB)

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  9. A versatile program for the calculation of linear accelerator room shielding.

    Science.gov (United States)

    Hassan, Zeinab El-Taher; Farag, Nehad M; Elshemey, Wael M

    2018-03-22

    This work aims at designing a computer program to calculate the necessary amount of shielding for a given or proposed linear accelerator room design in radiotherapy. The program (Shield Calculation in Radiotherapy, SCR) has been developed using Microsoft Visual Basic. It applies the treatment room shielding calculations of NCRP report no. 151 to calculate proper shielding thicknesses for a given linear accelerator treatment room design. The program is composed of six main user-friendly interfaces. The first enables the user to upload their choice of treatment room design and to measure the distances required for shielding calculations. The second interface enables the user to calculate the primary barrier thickness in the case of three-dimensional conventional radiotherapy (3D-CRT), intensity modulated radiotherapy (IMRT) and total body irradiation (TBI). The third interface calculates the required secondary barrier thickness due to both scattered and leakage radiation. The fourth and fifth interfaces provide a means to calculate the photon dose equivalent in door and maze areas for low and high energy radiation, respectively. The sixth interface enables the user to calculate the skyshine radiation for photons and neutrons. The SCR program has been successfully validated, precisely reproducing all of the calculated examples presented in NCRP report no. 151 in a simple and fast manner. Moreover, it easily performed the same calculations for a test design that was also calculated manually, and produced the same results. The program includes a new and important feature: the ability to calculate the required treatment room barrier thickness in the case of IMRT and TBI. It is characterised by simplicity, precision, and data saving, printing and retrieval, in addition to providing a means for uploading and testing any proposed treatment room shielding design. The SCR program provides comprehensive, simple, fast and accurate room shielding calculations in radiotherapy.
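
    As a flavor of the NCRP Report No. 151 primary-barrier arithmetic that such a program implements, here is a small sketch; the workload, use factor, tenth-value layers and design goal are illustrative placeholders, not the report's tabulated data.

    ```python
    import math

    # Illustrative inputs (consult NCRP 151 tables for real values)
    P = 2.0e-5      # shielding design goal at the protected point (Sv/week, i.e. 0.02 mSv/week)
    d = 6.0         # distance from target to the protected point (m)
    W = 500.0       # workload (Gy/week at 1 m)
    U = 0.25        # use factor for this barrier
    T = 1.0         # occupancy factor
    TVL1, TVLE = 0.37, 0.33   # assumed first and equilibrium tenth-value layers for concrete (m)

    # Required transmission factor and barrier thickness (two-TVL formulation)
    B = P * d**2 / (W * U * T)
    n_tvl = math.log10(1.0 / B)
    thickness = TVL1 + (n_tvl - 1.0) * TVLE
    print(f"transmission B = {B:.2e}, required thickness ≈ {thickness:.2f} m of concrete")
    ```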

  10. Batch calculations in CalcHEP

    International Nuclear Information System (INIS)

    Pukhov, A.

    2003-01-01

    CalcHEP is a clone of the CompHEP project which is developed by the author outside of the CompHEP group. CompHEP/CalcHEP are packages for the automatic calculation of elementary particle decay and collision properties in the lowest order of perturbation theory. The main idea built into the packages is to make the passage from the Lagrangian to the final distributions available with a high level of automation. Accordingly, the packages were created as menu-driven, user-friendly programs for calculations in the interactive mode. On the other hand, long calculations should be done in a non-interactive regime. Thus, from the beginning CompHEP has had the problem of batch calculations. In CompHEP 33.23 the batch session was realized by means of an interactive menu which allows the user to formulate the task for the batch; after that the non-interactive session is launched. This approach is too restricted, not flexible, and leads to duplication in programming. In this article I discuss another approach: how one can force an interactive program to work in non-interactive mode. This approach was realized in CalcHEP 2.1, available at http://theory.sinp.msu.ru/~pukhov/calchep.html

  11. Finite difference program for calculating hydride bed wall temperature profiles

    International Nuclear Information System (INIS)

    Klein, J.E.

    1992-01-01

    A QuickBASIC finite difference program was written for calculating one-dimensional temperature profiles in up to two media with flat, cylindrical, or spherical geometries. The development of the program was motivated by the need to calculate the maximum temperature differences across the walls of the metal hydride beds in the Tritium Facility for thermal fatigue analysis. The purpose of this report is to document the equations and the computer program used to calculate transient wall temperatures in stainless steel hydride vessels.
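
    A bare-bones sketch of an explicit one-dimensional finite-difference conduction solve through a single slab wall; the material properties, grid and boundary temperatures are illustrative, and the original program additionally handles two media and curved geometries.

    ```python
    import numpy as np

    # Illustrative stainless-steel-like properties and a 10 mm slab
    k, rho, cp = 16.0, 8000.0, 500.0          # W/m-K, kg/m^3, J/kg-K
    alpha = k / (rho * cp)                    # thermal diffusivity
    L, nx = 0.010, 21
    dx = L / (nx - 1)
    dt = 0.4 * dx**2 / alpha                  # satisfies the explicit stability limit (<= 0.5)

    T = np.full(nx, 25.0)                     # initial wall temperature, deg C
    T_inner, T_outer = 150.0, 25.0            # imposed surface temperatures (assumed step change)

    for _ in range(2000):                     # march in time with the explicit FTCS scheme
        T[0], T[-1] = T_inner, T_outer
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

    print(f"max temperature difference across the wall: {T.max() - T.min():.1f} K")
    ```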

  12. Program for calculating multi-component high-intense ion beam transport

    International Nuclear Information System (INIS)

    Kazarinov, N.Yu.; Prejzendorf, V.A.

    1985-01-01

    The CANAL program for calculating the transport of high-intensity beams containing ions with different charges in a channel consisting of dipole magnets and quadrupole lenses is described. The calculation is based on equations, derived by the method of moments of the distribution function, that describe the variation along the channel of the local mass centres and of the r.m.s. transverse sizes of the beams with different charges. The program is adapted for the CDC-6500 and SM-4 computers. The program runs in an interactive mode, which permits the parameters of any channel element to be varied and the optimum version to be chosen quickly in the course of the calculation. The calculation time on the CDC-6500 computer for a 30-40 m channel with an integration step of 1 cm is about 1 min. The program is used for calculating the channel for injection of the uranium ion beam from the collective accelerator into the heavy-ion synchrotron

  13. Early identification of hERG liability in drug discovery programs by automated patch clamp

    Directory of Open Access Journals (Sweden)

    Timm eDanker

    2014-09-01

    Blockade of the cardiac ion channel coded by hERG can lead to cardiac arrhythmia, which has become a major concern in drug discovery and development. Automated electrophysiological patch clamp allows assessment of hERG channel effects early in drug development to aid medicinal chemistry programs and has become routine in pharmaceutical companies. However, a number of potential sources of errors in setting up hERG channel assays by automated patch clamp can lead to misinterpretation of data or false effects being reported. This article describes protocols for automated electrophysiology screening of compound effects on the hERG channel current. Protocol details and the translation of criteria known from manual patch clamp experiments to automated patch clamp experiments to achieve good quality data are emphasized. Typical pitfalls and artifacts that may lead to misinterpretation of data are discussed. While this article focuses on hERG channel recordings using the QPatch (Sophion A/S, Copenhagen, Denmark) technology, many of the assay and protocol details given in this article can be transferred for setting up different ion channel assays by automated patch clamp and are similar on other planar patch clamp platforms.

  14. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    Science.gov (United States)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Lee, Andrew J.; Xiao, Ying

    2013-07-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials, evaluating the plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of the indices, there are minimal differences between the two methods. The evaluation time is reduced from 10-20 min to 2 min by applying the semi-automated plan-quality evaluation program.
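
    A small sketch of the dosimetry-parameter side of such a tool: computing a few common plan-quality indices (mean dose, D_x%, V_xGy) from a flattened dose array for one structure; the dose values and thresholds are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical per-voxel doses (Gy) inside one structure, e.g. a PTV
    dose = np.array([58.1, 59.4, 60.2, 60.8, 61.0, 61.3, 62.0, 62.4, 63.1, 64.0])

    def d_percent(dose_voxels, pct):
        """Minimum dose received by the hottest pct% of the volume (D_pct)."""
        return np.percentile(dose_voxels, 100.0 - pct)

    def v_dose(dose_voxels, threshold_gy):
        """Fraction of the volume receiving at least threshold_gy (V_threshold)."""
        return float(np.mean(dose_voxels >= threshold_gy))

    print(f"mean dose = {dose.mean():.1f} Gy")
    print(f"D95       = {d_percent(dose, 95):.1f} Gy")
    print(f"V60Gy     = {100.0 * v_dose(dose, 60.0):.0f}% of the structure volume")
    ```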

  15. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    International Nuclear Information System (INIS)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Xiao, Ying; Lee, Andrew J

    2013-01-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials, evaluating the plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of the indices, there are minimal differences between the two methods. The evaluation time is reduced from 10–20 min to 2 min by applying the semi-automated plan-quality evaluation program. (note)

  16. Software complex AS (automation of spectrometry). The spectrometer interactive control program

    International Nuclear Information System (INIS)

    Astakhova, N.V.; Beskrovnyj, A.I.; Butorin, P.E.; Vasilovskij, S.E.; Salamatin, I.M.; Shvetsov, V.N.; Maznyj, N.G.

    2004-01-01

    In the development of an Experiment Automation System (EAS), important and complicated challenges are the integration of components into the system and the reliability of its operation. First of all this concerns the driver layer of the programs. To address these tasks, a special technique for assembling an EAS from ready-made modules is used. To check this technique of EAS integration under real experimental conditions, the program MC was developed. Apart from that, MC is a convenient tool for diagnostics of the equipment and for carrying out experiments in an interactive mode. During experimental runs on the spectrometer DN2, the expected properties of the developed technique were confirmed. The program MC can be used on various spectrometers without modification. (author)

  17. Software Complex AS (Automation of Spectrometry). The Spectrometer Interactive Control Program

    CERN Document Server

    Astakhova, N V; Bytorin, P E; Vasilivskii, S E; Maznyi, N G; Salamatin, I M; Shvetsov, V N

    2004-01-01

    In the development of an Experiment Automation System (EAS), important and complicated challenges are the integration of components into the system and the reliability of its operation. First of all this concerns the driver layer of the programs. To address these tasks, a special technique for assembling an EAS from ready-made modules is used. To check this technique of EAS integration under real experimental conditions, the program MC was developed. Apart from that, MC is a convenient tool for diagnostics of the equipment and for carrying out experiments in an interactive mode. During experimental runs on the spectrometer DN2, the expected properties of the developed technique were confirmed. The program MC can be used on various spectrometers without modification.

  18. AUTOMATED INADVERTENT INTRUDER APPLICATION

    International Nuclear Information System (INIS)

    Koffman, L; Patricia Lee, P; Jim Cook, J; Elmer Wilhite, E

    2007-01-01

    The Environmental Analysis and Performance Modeling group of Savannah River National Laboratory (SRNL) conducts performance assessments of the Savannah River Site (SRS) low-level waste facilities to meet the requirements of DOE Order 435.1. These performance assessments, which result in limits on the amounts of radiological substances that can be placed in the waste disposal facilities, consider numerous potential exposure pathways that could occur in the future. One set of exposure scenarios, known as inadvertent intruder analysis, considers the impact on hypothetical individuals who are assumed to inadvertently intrude onto the waste disposal site. Inadvertent intruder analysis considers three distinct scenarios for exposure referred to as the agriculture scenario, the resident scenario, and the post-drilling scenario. Each of these scenarios has specific exposure pathways that contribute to the overall dose for the scenario. For the inadvertent intruder analysis, the calculation of dose for the exposure pathways is a relatively straightforward algebraic calculation that utilizes dose conversion factors. Prior to 2004, these calculations were performed using an Excel spreadsheet. However, design checks of the spreadsheet calculations revealed that errors could be introduced inadvertently when copying spreadsheet formulas cell by cell and finding these errors was tedious and time consuming. This weakness led to the specification of functional requirements to create a software application that would automate the calculations for inadvertent intruder analysis using a controlled source of input parameters. This software application, named the Automated Inadvertent Intruder Application, has undergone rigorous testing of the internal calculations and meets software QA requirements. The Automated Inadvertent Intruder Application was intended to replace the previous spreadsheet analyses with an automated application that was verified to produce the same calculations and

  19. Calculation of cosmic ray induced single event upsets: Program CRUP (Cosmic Ray Upset Program)

    Science.gov (United States)

    Shapiro, P.

    1983-09-01

    This report documents PROGRAM CRUP, COSMIC RAY UPSET PROGRAM. The computer program calculates cosmic ray induced single-event error rates in microelectronic circuits exposed to several representative cosmic-ray environments.

  20. Estimates of Radionuclide Loading to Cochiti Lake from Los Alamos Canyon Using Manual and Automated Sampling

    Energy Technology Data Exchange (ETDEWEB)

    McLean, Christopher T. [Univ. of New Mexico, Albuquerque, NM (United States)

    2000-07-01

    Los Alamos National Laboratory has a long-standing program of sampling storm water runoff inside the Laboratory boundaries. In 1995, the Laboratory started collecting the samples using automated storm water sampling stations; prior to this time the samples were collected manually. The Laboratory has also been periodically collecting sediment samples from Cochiti Lake. This paper presents the data for Pu-238 and Pu-239 bound to the sediments for Los Alamos Canyon storm water runoff and compares the sampling types by mass loading and as a percentage of the sediment deposition to Cochiti Lake. The data for both manual and automated sampling are used to calculate mass loads from Los Alamos Canyon on a yearly basis. The automated samples show mass loading 200-500 percent greater for Pu-238 and 300-700 percent greater for Pu-239 than the manual samples. Using the mean manual flow volume for the mass loading calculations, the automated samples are over 900 percent greater for Pu-238 and over 1800 percent greater for Pu-239. Evaluating the Pu-238 and Pu-239 activities as a percentage of deposition to Cochiti Lake indicates that the automated samples are 700-1300 percent greater for Pu-238 and 200-500 percent greater for Pu-239. The variance was calculated by two methods. The first method calculates the variance for each sample event. The second method calculates the variance using the total volume of water discharged in Los Alamos Canyon for the year.
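
    The mass-loading arithmetic behind such comparisons is simple: activity concentration on suspended sediment times sediment concentration times runoff volume, summed over events. A sketch with invented numbers (not measured LANL data):

    ```python
    # Hypothetical storm-event data
    events = [
        # (Pu-239 activity on sediment [pCi/g], suspended sediment [g/L], runoff volume [L])
        (0.8, 2.5, 1.2e6),
        (1.1, 3.0, 0.9e6),
        (0.6, 1.8, 2.0e6),
    ]

    def annual_mass_load(event_data):
        """Sum event loads: (pCi/g) * (g/L) * L gives total activity (pCi) for the year."""
        return sum(act * sed * vol for act, sed, vol in event_data)

    load_pci = annual_mass_load(events)
    print(f"annual Pu-239 load ≈ {load_pci:.3e} pCi ≈ {load_pci * 1e-6:.2f} µCi")
    ```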

  1. Can automation in radiotherapy reduce costs?

    Science.gov (United States)

    Massaccesi, Mariangela; Corti, Michele; Azario, Luigi; Balducci, Mario; Ferro, Milena; Mantini, Giovanna; Mattiucci, Gian Carlo; Valentini, Vincenzo

    2015-01-01

    Computerized automation is likely to play an increasingly important role in radiotherapy. The objective of this study was to report the results of the first part of a program to implement a model for economic evaluation based on the micro-costing method. To test the efficacy of the model, the financial impact of the introduction of an automation tool was estimated. A single- and multi-center validation of the model by prospective collection of data is planned as the second step of the program. The model was implemented using an interactive spreadsheet (Microsoft Excel, 2010). The variables to be included were identified across three components: productivity, staff, and equipment. To calculate staff requirements, the workflow of the Gemelli ART center was mapped out and relevant workload measures were defined. Profit and loss, productivity and staffing were identified as significant outcomes. Results were presented in terms of earnings before interest and taxes (EBIT). Three different scenarios were hypothesized: the baseline situation at Gemelli ART (scenario 1); reduction by 2 minutes of the average duration of treatment fractions (scenario 2); and increased incidence of advanced treatment modalities (scenario 3). Using the model, predicted EBIT values for each scenario were calculated across a period of eight years (from 2015 to 2022). For both scenarios 2 and 3, costs are expected to increase slightly compared to the baseline situation, mainly owing to a small increase in clinical personnel costs. However, in both cases EBIT values are more favorable than in the baseline situation (EBIT values: scenario 1, 27%; scenario 2, 30%; scenario 3, 28% of revenues). A model based on a micro-costing method was able to estimate the financial consequences of the introduction of an automation tool in our radiotherapy department. A prospective collection of data at Gemelli ART and in a consortium of centers is currently under way to prospectively validate the model.
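
    A toy version of the spreadsheet logic described above: EBIT as revenues minus staff and equipment costs across scenarios. All figures are invented to show the structure, not Gemelli ART data.

    ```python
    # Illustrative annual figures in arbitrary currency units
    scenarios = {
        "baseline":                 {"fractions": 30000, "fee": 110.0, "staff": 1_800_000, "equipment": 700_000},
        "shorter fractions":        {"fractions": 33000, "fee": 110.0, "staff": 1_850_000, "equipment": 700_000},
        "more advanced treatments": {"fractions": 30000, "fee": 125.0, "staff": 1_850_000, "equipment": 720_000},
    }

    for name, s in scenarios.items():
        revenue = s["fractions"] * s["fee"]
        ebit = revenue - s["staff"] - s["equipment"]       # earnings before interest and taxes
        print(f"{name:28s} EBIT = {ebit:>12,.0f} ({100.0 * ebit / revenue:.0f}% of revenues)")
    ```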

  2. RADSHI: shielding calculation program for different geometries sources

    International Nuclear Information System (INIS)

    Gelen, A.; Alvarez, I.; Lopez, H.; Manso, M.

    1996-01-01

    A computer code written in Pascal for the IBM PC is described. The program calculates the optimum thickness of a slab shield for sources of different geometries. The point kernel method is employed, which enables calculation of the ionizing radiation flux density. The calculation takes into account the possibility of self-absorption in the source. The air kerma rate for gamma radiation is determined, and the shield is obtained using the concept of attenuation length through the equivalent attenuation length. Scattering and exponential attenuation inside the shield material are considered in the program. The shield materials can be concrete, water, iron or lead. The program also calculates the shield for a point isotropic neutron source, using paraffin, concrete or water as shield materials. (authors). 13 refs
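
    The point-kernel estimate at the heart of such codes is the attenuated inverse-square flux with a buildup factor. Below is a minimal sketch for a point gamma source behind a slab, using a simple linear buildup approximation; the coefficients are placeholders, not the code's data.

    ```python
    import math

    # Illustrative inputs for a point isotropic gamma source behind a slab shield
    S = 3.7e8          # source strength (photons/s)
    r = 2.0            # distance from source to dose point (m)
    mu = 60.0          # assumed linear attenuation coefficient of the shield (1/m)
    t = 0.05           # slab thickness (m)

    def point_kernel_flux(source, distance, mu_shield, thickness):
        """Uncollided flux times a simple linear buildup factor B = 1 + mu*t (assumption)."""
        uncollided = source * math.exp(-mu_shield * thickness) / (4.0 * math.pi * distance**2)
        buildup = 1.0 + mu_shield * thickness
        return buildup * uncollided   # photons / (m^2 s)

    print(f"flux ≈ {point_kernel_flux(S, r, mu, t):.3e} photons/m^2/s")
    ```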

  3. Automated Reporting of DXA Studies Using a Custom-Built Computer Program.

    Science.gov (United States)

    England, Joseph R; Colletti, Patrick M

    2018-06-01

    Dual-energy x-ray absorptiometry (DXA) scans are a critical population health tool and relatively simple to interpret but can be time consuming to report, often requiring manual transfer of bone mineral density and associated statistics into commercially available dictation systems. We describe here a custom-built computer program for automated reporting of DXA scans using Pydicom, an open-source package built in the Python computer language, and regular expressions to mine DICOM tags for patient information and bone mineral density statistics. This program, easy to emulate by any novice computer programmer, has doubled our efficiency at reporting DXA scans and has eliminated dictation errors.
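
    The general approach described above, reading DICOM tags with Pydicom and mining embedded text with regular expressions, can be sketched as follows; the file name, the tag assumed to hold the results text, and the regex pattern are assumptions, not the authors' implementation:

      import re
      import pydicom

      # Hedged sketch: pull patient demographics from DICOM tags and mine an embedded
      # results text block for BMD statistics. Tag choice and report format are hypothetical.
      ds = pydicom.dcmread("dxa_study.dcm")  # hypothetical file
      patient = {
          "name": str(ds.get("PatientName", "")),
          "id": ds.get("PatientID", ""),
          "dob": ds.get("PatientBirthDate", ""),
      }

      # Assume the scanner embeds a plain-text results block in this tag.
      results_text = str(ds.get("ImageComments", ""))

      # Hypothetical pattern: lines such as "L1-L4 BMD: 1.021 g/cm2  T-score: -1.3"
      pattern = re.compile(
          r"(?P<region>[\w\-]+)\s+BMD:\s*(?P<bmd>[\d.]+)\s*g/cm2\s+T-score:\s*(?P<t>-?[\d.]+)")
      measurements = [m.groupdict() for m in pattern.finditer(results_text)]

      report_lines = [f"Patient {patient['name']} ({patient['id']}):"]
      report_lines += [f"  {m['region']}: BMD {m['bmd']} g/cm2, T-score {m['t']}" for m in measurements]
      print("\n".join(report_lines))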

  4. Mass: Fortran program for calculating mass-absorption coefficients

    International Nuclear Information System (INIS)

    Nielsen, Aa.; Svane Petersen, T.

    1980-01-01

    Determination of mass-absorption coefficients in the x-ray analysis of trace elements is an important and time-consuming part of the arithmetic calculation. Over time, different methods have been used. The program MASS calculates the mass-absorption coefficients from a given major-element analysis at the x-ray wavelengths normally used in trace element determinations, and lists the chemical analysis and the mass-absorption coefficients. The program is coded in FORTRAN IV, and is operational on the IBM 370/165 computer, on the UNIVAC 1110 and on the PDP 11/05. (author)
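
    The mixture rule underlying such calculations can be sketched in a few lines of Python; the composition and coefficient values below are placeholders, not the tabulated data used by MASS:

      # The mass-absorption coefficient of a sample at a given wavelength is the
      # weight-fraction-weighted sum of the constituent coefficients.

      def mass_absorption(major_element_fractions, mu_rho_table):
          """major_element_fractions: {constituent: weight fraction}, summing to ~1.
          mu_rho_table: {constituent: mass-absorption coefficient, cm^2/g} at one wavelength."""
          total = sum(major_element_fractions.values())
          return sum(frac / total * mu_rho_table[el]
                     for el, frac in major_element_fractions.items())

      # Hypothetical major-element analysis and placeholder coefficients at one wavelength:
      composition = {"SiO2": 0.60, "Al2O3": 0.15, "Fe2O3": 0.10, "CaO": 0.15}
      mu_rho = {"SiO2": 35.0, "Al2O3": 31.0, "Fe2O3": 215.0, "CaO": 120.0}
      print(f"mu/rho of sample: {mass_absorption(composition, mu_rho):.1f} cm^2/g")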

  5. Security Measures in Automated Assessment System for Programming Courses

    Directory of Open Access Journals (Sweden)

    Jana Šťastná

    2015-12-01

    A desirable characteristic of programming code assessment is to provide the learner with the most appropriate information regarding the code's functionality, as well as a chance to improve. This can hardly be achieved when the number of learners is high (500 or more). In this paper we address the problem of testing risky code and the availability of the assessment platform Arena, dealing with potential security risks when providing automated assessment for a large set of source code. Looking at students' programs as if they were potentially malicious inspired us to investigate separated execution environments, used by security experts for secure software analysis. The results also show that availability issues of our assessment platform can be conveniently resolved with task queues. Special attention is paid to Docker, a virtual container ensuring that no risky code can affect the assessment system's security. The assessment platform Arena enables students' source code in various programming courses to be assessed regularly, effectively and securely. In addition, it is a motivating factor and helps students engage in the educational process.
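
    A minimal sketch of the separated-execution idea, running an untrusted submission in a throwaway Docker container with resource limits, is shown below; the image name, mount path and limits are assumptions, not the Arena platform's actual configuration:

      import subprocess

      # Run a student submission inside an isolated, disposable container.
      def run_submission(submission_dir, timeout_s=10):
          cmd = [
              "docker", "run", "--rm",
              "--network", "none",          # no network access for the graded code
              "--memory", "128m",           # cap memory
              "--cpus", "0.5",              # cap CPU
              "--pids-limit", "64",         # limit process count (fork bombs)
              "-v", f"{submission_dir}:/submission:ro",  # read-only mount of the sources
              "python:3.11-slim",           # assumed base image
              "python", "/submission/main.py",
          ]
          try:
              result = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout_s)
              return result.returncode, result.stdout, result.stderr
          except subprocess.TimeoutExpired:
              return None, "", "submission exceeded the time limit"

      code, out, err = run_submission("/tmp/student42")
      print(code, out, err)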

  6. Weighted curve-fitting program for the HP 67/97 calculator

    International Nuclear Information System (INIS)

    Stockli, M.P.

    1983-01-01

    The HP 67/97 calculator provides in its standard equipment a curve-fit program for linear, logarithmic, exponential and power functions that is quite useful and popular. However, in more sophisticated applications, proper weights for the data are often essential. For this purpose a program package was created which is very similar to the standard curve-fit program but includes the weights of the data for proper statistical analysis. This allows accurate calculation of the uncertainties of the fitted curve parameters, as well as the uncertainties of interpolations or extrapolations; optionally, the uncertainties can be normalized with chi-square. The program is very versatile and allows one to perform quite difficult data analysis in a convenient way with the pocket calculator HP 67/97
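
    The weighted least-squares machinery such a package implements can be sketched conceptually in NumPy; this is not a transcription of the HP 67/97 program, and the weights are assumed to be one over the squared absolute uncertainties:

      import numpy as np

      # Weighted linear fit y = a + b*x with parameter uncertainties from the
      # covariance matrix (A^T W A)^-1 and the chi-square of the fit.
      def weighted_linear_fit(x, y, sigma):
          x, y, sigma = map(np.asarray, (x, y, sigma))
          w = 1.0 / sigma**2
          A = np.column_stack([np.ones_like(x), x])   # design matrix for y = a + b*x
          W = np.diag(w)
          cov = np.linalg.inv(A.T @ W @ A)            # parameter covariance matrix
          params = cov @ A.T @ W @ y                  # [a, b]
          chi2 = np.sum(w * (y - A @ params) ** 2)
          return params, np.sqrt(np.diag(cov)), chi2

      x = [1.0, 2.0, 3.0, 4.0]
      y = [2.1, 3.9, 6.2, 7.8]
      sigma = [0.2, 0.1, 0.3, 0.2]
      p, perr, chi2 = weighted_linear_fit(x, y, sigma)
      print(f"a = {p[0]:.3f} +/- {perr[0]:.3f}, b = {p[1]:.3f} +/- {perr[1]:.3f}, chi2 = {chi2:.2f}")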

  7. Chattanooga Electric Power Board Case Study Distribution Automation

    Energy Technology Data Exchange (ETDEWEB)

    Glass, Jim [Chattanooga Electric Power Board (EPB), TN (United States); Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Starke, Michael R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ollis, Ben [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-10-01

    In 2009, the U.S. Department of Energy under the American Recovery and Reinvestment Act (ARRA) awarded a grant to the Chattanooga, Tennessee, Electric Power Board (EPB) as part of the Smart Grid Investment Grant Program. The grant had the objective “to accelerate the transformation of the nation’s electric grid by deploying smart grid technologies.” This funding award enabled EPB to expedite the original smart grid implementation schedule from an estimated 10-12 years to 2.5 years. With this funding, EPB invested heavily in distribution automation technologies, including installing over 1,200 automated circuit switches and sensors on 171 circuits. For utilities considering a commitment to distribution automation, there are underlying questions such as the following: “What is the value?” and “What are the costs?” This case study attempts to answer these questions. The primary benefit of distribution automation is increased reliability, or reduced power outage duration and frequency. Power outages directly impact customer economics by interfering with business functions. In the past, this economic driver has been difficult to evaluate effectively. However, as this case study demonstrates, tools and analysis techniques are now available. In this case study, the customer costs associated with power outages before and after the implementation of distribution automation are compared. Two example evaluations are performed to demonstrate the benefits: 1) a savings baseline for customers under normal operations and 2) customer savings for a single severe weather event. Cost calculations for customer power outages are performed using the US Department of Energy (DOE) Interruption Cost Estimate (ICE) calculator. This tool uses standard metrics associated with outages and the customers to calculate cost impact. The analysis shows that EPB customers have seen significant reliability improvements from the implementation of distribution automation. Under

  8. Guiding automated NMR structure determination using a global optimization metric, the NMR DP score

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yuanpeng Janet, E-mail: yphuang@cabm.rutgers.edu; Mao, Binchen; Xu, Fei; Montelione, Gaetano T., E-mail: gtm@rutgers.edu [Rutgers, The State University of New Jersey, Department of Molecular Biology and Biochemistry, Center for Advanced Biotechnology and Medicine, and Northeast Structural Genomics Consortium (United States)

    2015-08-15

    ASDP is an automated NMR NOE assignment program. It uses a distinct bottom-up topology-constrained network anchoring approach for NOE interpretation, with 2D, 3D and/or 4D NOESY peak lists and resonance assignments as input, and generates unambiguous NOE constraints for iterative structure calculations. ASDP is designed to function interactively with various structure determination programs that use distance restraints to generate molecular models. In the CASD–NMR project, ASDP was tested and further developed using blinded NMR data, including resonance assignments, either raw or manually-curated (refined) NOESY peak list data, and in some cases 15N–1H residual dipolar coupling data. In these blinded tests, in which the reference structure was not available until after structures were generated, the fully-automated ASDP program performed very well on all targets using both the raw and refined NOESY peak list data. Improvements of ASDP relative to its predecessor program for automated NOESY peak assignments, AutoStructure, were driven by challenges provided by these CASD–NMR data. These algorithmic improvements include (1) using a global metric of structural accuracy, the discriminating power score, for guiding model selection during the iterative NOE interpretation process, and (2) identifying incorrect NOESY cross peak assignments caused by errors in the NMR resonance assignment list. These improvements provide a more robust automated NOESY analysis program, ASDP, with the unique capability of being utilized with alternative structure generation and refinement programs including CYANA, CNS, and/or Rosetta.

  9. Studies of criticality Monte Carlo method convergence: use of a deterministic calculation and automated detection of the transient

    International Nuclear Information System (INIS)

    Jinaphanh, A.

    2012-01-01

    Monte Carlo criticality calculation allows estimation of the effective multiplication factor as well as local quantities such as local reaction rates. Some configurations presenting weak neutronic coupling (high burn-up profile, complete reactor core, ...) may induce biased estimates of k_eff or reaction rates. In order to improve the robustness of the iterative Monte Carlo methods, coupling with a deterministic code was studied. An adjoint flux is obtained by a deterministic calculation and then used in the Monte Carlo. The initial guess is then automated, the sampling of fission sites is modified, and the random walk of neutrons is modified using splitting and Russian roulette strategies. An automated convergence detection method has been developed. It locates and suppresses the transient due to the initialization in an output series, applied here to k_eff and Shannon entropy. It relies on modeling stationary series by a first-order autoregressive process and applying statistical tests based on a Student bridge statistic. This method can easily be extended to every output of an iterative Monte Carlo. The methods developed in this thesis are tested on different test cases. (author)
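
    A simplified Python stand-in for the transient-detection idea (not the Student-bridge test of the thesis) models the assumed-stationary tail of an output series as an AR(1) process and discards leading cycles until a sliding-window mean falls inside a confidence band:

      import numpy as np

      def ar1_coefficient(x):
          # Lag-1 autocorrelation estimate of a (presumed stationary) series.
          x = np.asarray(x, dtype=float) - np.mean(x)
          return float(np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1]))

      def detect_transient(series, window=20, z_crit=1.96):
          series = np.asarray(series, dtype=float)
          tail = series[len(series) // 2:]              # assumed free of the transient
          mean, sigma = np.mean(tail), np.std(tail, ddof=1)
          phi = ar1_coefficient(tail)
          n_eff = window * (1 - phi) / (1 + phi)        # effective samples per window under AR(1)
          band = z_crit * sigma / np.sqrt(max(n_eff, 1.0))
          for start in range(0, len(series) - window):
              if abs(np.mean(series[start:start + window]) - mean) < band:
                  return start                          # first window consistent with stationarity
          return len(series)

      # Synthetic k_eff series: a 50-cycle initialization transient, then stationary noise.
      rng = np.random.default_rng(0)
      keff = np.concatenate([np.linspace(0.95, 1.0, 50), 1.0 + 0.002 * rng.standard_normal(300)])
      print("discard the first", detect_transient(keff), "cycles")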

  10. Cell verification of parallel burnup calculation program MCBMPI based on MPI

    International Nuclear Information System (INIS)

    Yang Wankui; Liu Yaoguang; Ma Jimin; Wang Guanbo; Yang Xin; She Ding

    2014-01-01

    The parallel burnup calculation program MCBMPI was developed. The program is modularized. The parallelized MCNP5 program MCNP5MPI is employed as the neutron transport calculation module, and a combination of three solution methods is used to solve the burnup equations, i.e. the matrix exponential technique, the TTA analytical solution, and Gauss-Seidel iteration. An MPI parallel zone decomposition strategy is adopted in the program. The program system only consists of MCNP5MPI and a burnup subroutine. The latter performs three main functions, i.e. zone decomposition, nuclide transfer and decay, and data exchange with MCNP5MPI. The program was verified with the pressurized water reactor (PWR) cell burnup benchmark. The results show that it is possible to apply the program to burnup calculations of multiple zones, and that the computation efficiency can be significantly improved with the development of computer hardware. (authors)
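
    The matrix-exponential branch of such a burnup solver can be illustrated with a minimal sketch: over a time step with constant flux the nuclide vector obeys dN/dt = AN, so N(t+Δt) = exp(AΔt)N(t). The two-nuclide chain and rate constants below are made up; this is not the MCBMPI code:

      import numpy as np
      from scipy.linalg import expm

      lambda_1 = 1.0e-9      # decay constant of nuclide 1 (1/s), hypothetical
      capture_1 = 3.0e-10    # capture rate of nuclide 1 (1/s), hypothetical
      lambda_2 = 5.0e-8      # decay constant of nuclide 2 (1/s), hypothetical

      # Nuclide 1 is destroyed by decay + capture; capture feeds nuclide 2, which decays.
      A = np.array([
          [-(lambda_1 + capture_1), 0.0],
          [capture_1, -lambda_2],
      ])

      N0 = np.array([1.0e24, 0.0])        # initial atom densities
      dt = 30 * 24 * 3600.0               # one 30-day burnup step, in seconds
      N1 = expm(A * dt) @ N0
      print("atom densities after one step:", N1)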

  11. Simple Calculation Programs for Biology Methods in Molecular ...

    Indian Academy of Sciences (India)

    Simple Calculation Programs for Biology Methods in Molecular Biology. GMAP: a program for mapping potential restriction sites. RE sites in ambiguous and non-ambiguous DNA sequences; minimum number of silent mutations required for introducing an RE site; Set ...

  12. Building an IDE for the Calculational Derivation of Imperative Programs

    Directory of Open Access Journals (Sweden)

    Dipak L. Chaudhari

    2015-08-01

    In this paper, we describe an IDE called CAPS (Calculational Assistant for Programming from Specifications) for the interactive, calculational derivation of imperative programs. In building CAPS, our aim has been to make the IDE accessible to non-experts while retaining the overall flavor of the pen-and-paper calculational style. We discuss the overall architecture of the CAPS system, the main features of the IDE, the GUI design, and the trade-offs involved.

  13. Software complex AS (automation of spectrometry). User interface of experiment automation system implementation

    International Nuclear Information System (INIS)

    Astakhova, N.V.; Beskrovnyj, A.I.; Bogdzel', A.A.; Butorin, P.E.; Vasilovskij, S.G.; Gundorin, N.A.; Zlokazov, V.B.; Kutuzov, S.A.; Salamatin, I.M.; Shvetsov, V.N.

    2003-01-01

    An instrumental software complex for the automation of spectrometry (AS) has been developed that enables prompt realization of experiment automation systems for spectrometers that use data buffering. New programming methods and approaches to building automation systems, together with novel network technologies, were employed in the development. It is suggested that programs to schedule and conduct experiments be based on a parametric model of the spectrometer, an approach that makes it possible to write programs suitable for any FLNP (Frank Laboratory of Neutron Physics) spectrometer and experimental technique applied, and to use different hardware interfaces for introducing the spectrometric data into the data acquisition system. The article describes the possibilities provided to the user for scheduling and controlling the experiment, viewing data, and controlling the spectrometer parameters. The current spectrometer state, programs and experimental data can be presented on the Internet in the form of dynamically generated protocols and graphs, and the experiment can be controlled via the Internet. No applied programs are needed to use these Internet facilities on the client side; it suffices to know how to use the two programs to carry out experiments in automated mode. The package is designed for experiments in condensed matter and nuclear physics and is ready for use. (author)

  14. Calculation program development for spinning reserve

    International Nuclear Information System (INIS)

    1979-01-01

    This study concerns the optimal holding of spinning reserve and its optimal operation. It covers the purpose and contents of the study, an introduction to spinning reserve electricity, the special characteristics of spinning reserve power, the results of the calculations, an analysis of methods for limiting optimum load, the calculation of spinning reserve requirements, and an analysis of system stability measurement, together with a summary, the purpose of the analysis, the causes and impact of accidents, the basics of spinning reserve measurement, and conclusions. It includes references explaining the design of the spinning reserve power program and the use of and trends in spinning reserve power in Korea.

  15. Station Program Note Pull Automation

    Science.gov (United States)

    Delgado, Ivan

    2016-01-01

    Upon commencement of my internship, I was in charge of maintaining the CoFR (Certificate of Flight Readiness) Tool. The tool acquires data from existing Excel workbooks on NASA's and Boeing's databases to create a new spreadsheet listing out all the potential safety concerns for upcoming flights and software transitions. Since the application was written in Visual Basic, I had to learn a new programming language and prepare to handle any malfunctions within the program. Shortly afterwards, I was given the assignment to automate the Station Program Note (SPN) Pull process. I developed an application, in Python, that generated a GUI (Graphical User Interface) that will be used by the International Space Station Safety & Mission Assurance team here at Johnson Space Center. The application will allow its users to download online files with the click of a button, import SPN's based on three different pulls, instantly manipulate and filter spreadsheets, and compare the three sources to determine which active SPN's (Station Program Notes) must be reviewed for any upcoming flights, missions, and/or software transitions. Initially, to perform the NASA SPN pull (one of three), I had created the program to allow the user to login to a secure webpage that stores data, input specific parameters, and retrieve the desired SPN's based on their inputs. However, to avoid any conflicts with sustainment, I altered it so that the user may login and download the NASA file independently. After the user has downloaded the file with the click of a button, I defined the program to check for any outdated or pre-existing files, for successful downloads, to acquire the spreadsheet, convert it from a text file to a comma separated file and finally into an Excel spreadsheet to be filtered and later scrutinized for specific SPN numbers. Once this file has been automatically manipulated to provide only the SPN numbers that are desired, they are stored in a global variable, shown on the GUI, and

  16. Automated nuclear materials accounting

    International Nuclear Information System (INIS)

    Pacak, P.; Moravec, J.

    1982-01-01

    An automated state system of accounting for nuclear materials data was established in Czechoslovakia in 1979. A file was compiled of 12 programs in the PL/1 language. The file is divided into four groups according to logical associations, namely programs for data input and checking, programs for handling the basic data file, programs for report outputs in the form of worksheets and magnetic tape records, and programs for book inventory listing, document inventory handling and materials balance listing. A similar automated system of nuclear fuel inventory for a light water reactor was introduced for internal purposes in the Institute of Nuclear Research (UJV). (H.S.)

  17. Use of the PASKAL' language for programming in experiment automation systems

    International Nuclear Information System (INIS)

    Ostrovnoj, A.I.

    1985-01-01

    A set of standard solutions intended for realizing the main functions that any experiment automation system must provide is suggested. These include recording and accumulation of experimental data; visualization and preliminary processing of incoming data; interaction with the operator and system control; and data filing. It is advisable to use standard software, to represent data processing algorithms as parallel processes, and to use the Pascal language for programming. Programming of CAMAC equipment is provided by a set of procedures similar to the set of subprograms in the FORTRAN language. The use of a simple data file in the accumulation and processing programs ensures a unified representation of experimental data and uniform access to it by a large number of programs operating in both on-line and off-line regimes. The suggested approach was applied in developing systems based on the SM-3, SM-4 and MERA-60 computers with the RAFOS operating system

  18. MONO: A program to calculate synchrotron beamline monochromator throughputs

    International Nuclear Information System (INIS)

    Chapman, D.

    1989-01-01

    A set of Fortran programs has been developed to calculate the expected throughput of x-ray monochromators with a filtered synchrotron source and is applicable to bending magnet and wiggler beamlines. These programs calculate the normalized throughput and filtered synchrotron spectrum passed by multiple-element, flat, unfocussed monochromator crystals of the Bragg or Laue type as a function of incident beam divergence, energy and polarization. The reflected and transmitted beam of each crystal is calculated using the dynamical theory of diffraction. Multiple crystal arrangements in the dispersive and non-dispersive mode are allowed, as well as crystal asymmetry and energy or angle offsets. Filters or windows of arbitrary elemental composition may be used to filter the incident synchrotron beam. This program should be useful for predicting the intensities available from many beamline configurations as well as assisting in the design of new monochromator and analyzer systems. 6 refs., 3 figs

  19. aMCfast: automation of fast NLO computations for PDF fits

    CERN Document Server

    Bertone, Valerio; Frixione, Stefano; Rojo, Juan; Sutton, Mark

    2014-01-01

    We present the interface between MadGraph5_aMC@NLO, a self-contained program that calculates cross sections up to next-to-leading order accuracy in an automated manner, and APPLgrid, a code that parametrises such cross sections in the form of look-up tables which can be used for the fast computations needed in the context of PDF fits. The main characteristic of this interface, which we dub aMCfast, is its being fully automated as well, which removes the need to extract manually the process-specific information for additional physics processes, as is the case with other matrix element calculators, and renders it straightforward to include any new process in the PDF fits. We demonstrate this by studying several cases which are easily measured at the LHC, have a good constraining power on PDFs, and some of which were previously unavailable in the form of a fast interface.

  20. A Fully Automated Diabetes Prevention Program, Alive-PD: Program Design and Randomized Controlled Trial Protocol.

    Science.gov (United States)

    Block, Gladys; Azar, Kristen Mj; Block, Torin J; Romanelli, Robert J; Carpenter, Heather; Hopkins, Donald; Palaniappan, Latha; Block, Clifford H

    2015-01-21

    In the United States, 86 million adults have pre-diabetes. Evidence-based interventions that are both cost effective and widely scalable are needed to prevent diabetes. Our goal was to develop a fully automated diabetes prevention program and determine its effectiveness in a randomized controlled trial. Subjects with verified pre-diabetes were recruited to participate in a trial of the effectiveness of Alive-PD, a newly developed, 1-year, fully automated behavior change program delivered by email and Web. The program involves weekly tailored goal-setting, team-based and individual challenges, gamification, and other opportunities for interaction. An accompanying mobile phone app supports goal-setting and activity planning. For the trial, participants were randomized by computer algorithm to start the program immediately or after a 6-month delay. The primary outcome measures are change in HbA1c and fasting glucose from baseline to 6 months. The secondary outcome measures are change in HbA1c, glucose, lipids, body mass index (BMI), weight, waist circumference, and blood pressure at 3, 6, 9, and 12 months. Randomization and delivery of the intervention are independent of clinic staff, who are blinded to treatment assignment. Outcomes will be evaluated for the intention-to-treat and per-protocol populations. A total of 340 subjects with pre-diabetes were randomized to the intervention (n=164) or delayed-entry control group (n=176). Baseline characteristics were as follows: mean age 55 (SD 8.9); mean BMI 31.1 (SD 4.3); male 68.5%; mean fasting glucose 109.9 (SD 8.4) mg/dL; and mean HbA1c 5.6 (SD 0.3)%. Data collection and analysis are in progress. We hypothesize that participants in the intervention group will achieve statistically significant reductions in fasting glucose and HbA1c as compared to the control group at 6 months post baseline. The randomized trial will provide rigorous evidence regarding the efficacy of this Web- and Internet-based program in reducing or

  1. ASI's space automation and robotics programs: The second step

    Science.gov (United States)

    Dipippo, Simonetta

    1994-01-01

    The strategic decisions taken by ASI in the last few years in building up the overall A&R program, represent the technological drivers for other applications (i.e., internal automation of the Columbus Orbital Facility in the ESA Manned Space program, applications to mobile robots both in space and non-space environments, etc...). In this context, the main area of application now emerging is the scientific missions domain. Due to the broad range of applications of the developed technologies, both in the in-orbit servicing and maintenance of space structures and scientific missions, ASI foresaw the need to have a common technological development path, mainly focusing on: (1) control; (2) manipulation; (3) on-board computing; (4) sensors; and (5) teleoperation. Before entering into new applications in the scientific missions field, a brief overview of the status of the SPIDER related projects is given, underlining also the possible new applications for the LEO/GEO space structures.

  2. Programmable calculator: alternative to minicomputer-based analyzer

    International Nuclear Information System (INIS)

    Hochel, R.C.

    1979-01-01

    Described are a number of typical field and laboratory counting systems that use standard stand-alone multichannel analyzers (MCAs) interfaced to a Hewlett-Packard Company (HP 9830) programmable calculator. Such systems can offer significant advantages in cost and flexibility over a minicomputer-based system. Because most laboratories tend to accumulate MCAs over the years, the programmable calculator also offers an easy way to upgrade the laboratory while making optimum use of existing systems. Software programs are easily tailored to fit a variety of general or specific applications. The only disadvantage of the calculator versus a computer-based system is in the speed of analyses; however, for most applications this handicap is minimal. The applications discussed give a brief overview of the power and flexibility of the MCA-calculator approach to automated counting and data reduction

  3. Calculation of Complexity Costs – An Approach for Rationalizing a Product Program

    DEFF Research Database (Denmark)

    Hansen, Christian Lindschou; Mortensen, Niels Henrik; Hvam, Lars

    2012-01-01

    This paper proposes an operational method for rationalizing a product program based on the calculation of complexity costs. The method takes its starting point in the calculation of complexity costs on a product program level. This is done throughout the value chain, ranging from component inventories at the factory sites all the way to the distribution of finished goods from distribution centers to the customers. The method proposes a step-wise approach including the analysis, quantification and allocation of product program complexity costs by means of identifying a number ... These findings represent an improved decision basis for the planning of reactive and proactive initiatives for rationalizing a product program.

  4. TEMP-M program for thermal-hydraulic calculation of fast reactor fuel assemblies

    International Nuclear Information System (INIS)

    Bogoslovskaya, C.P.; Sorokin, A.P.; Tikhomirov, B.B.; Titov, P.A.; Ushakov, P.A.

    1983-01-01

    The TEMP-M program (Fortran, BESM-6 computer) for the thermal-hydraulic calculation of fast reactor fuel assemblies is described. Results of the calculation of the temperature field in a 127-fuel-element assembly of the BN-600 reactor, carried out with the TEMP-M program, are considered as an example. The algorithm realized in the program enables calculation of the distributions of coolant heating, fuel element temperature (over perimeter and length) and assembly shell temperature. The distribution of coolant heating in the assembly channels is determined from the solution of a system of balance equations which accounts for interchannel exchange and non-adiabatic conditions on the assembly shell. The TEMP-M program gives the information necessary for calculating the strength and serviceability of fast reactor core elements, and serves as an effective instrument for calculations when designing reactor cores and analyzing the thermal-hydraulic characteristics of operating reactor fuel assemblies

  5. Surrogate-Assisted Genetic Programming With Simplified Models for Automated Design of Dispatching Rules.

    Science.gov (United States)

    Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen

    2017-09-01

    Automated design of dispatching rules for production systems has been an interesting research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to dealing with this design problem. However, intensive computational requirements, accuracy and interpretability are still its limitations. This paper aims at developing a new surrogate-assisted GP to help improve the quality of the evolved rules without significant computational costs. The experiments have verified the effectiveness and efficiency of the proposed algorithms as compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches have shown great potential and proved to be a critical part of the automated design system.

  6. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application

  7. An Internet-based program for depressive symptoms using human and automated support: a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Mira A

    2017-03-01

    Purpose: The purpose of this study was to analyze the efficacy of an Internet-based program for depressive symptoms using automated support by information and communication technologies (ICTs) and human support. Patients and methods: An Internet-based program was used to teach adaptive ways to cope with depressive symptoms and daily problems. A total of 124 participants who were experiencing at least one stressful event that caused interference in their lives, many of whom had clinically significant depressive symptoms, were randomly assigned to either an intervention group with ICT support (automated mobile phone messages, automated emails, and continued feedback through the program); an intervention group with ICT support plus human support (a brief weekly support phone call without clinical content); or a waiting-list control. At pre-, post-, and 12-month follow-up, they completed depression, anxiety, positive and negative affect, and perceived stress measures. Results were analyzed using both intention-to-treat and completers data. The majority were women (67.7%), with a mean age of 35.6 years (standard deviation = 9.7). Results: The analysis showed that the two intervention groups improved significantly from pre- to posttreatment, compared with the control group. Furthermore, improvements were maintained at the 12-month follow-up. Adherence to and satisfaction with the program were high in both conditions. Conclusion: The Internet-based program was effective and well

  8. Computer program for the automated attendance accounting system

    Science.gov (United States)

    Poulson, P.; Rasmusson, C.

    1971-01-01

    The automated attendance accounting system (AAAS) was developed under the auspices of the Space Technology Applications Program. The task is basically the adaptation of a small digital computer, coupled with specially developed pushbutton terminals located in school classrooms and offices for the purpose of taking daily attendance, maintaining complete attendance records, and producing partial and summary reports. Especially developed for high schools, the system is intended to relieve both teachers and office personnel from the time-consuming and dreary task of recording and analyzing the myriad classroom attendance data collected throughout the semester. In addition, since many school district budgets are related to student attendance, the increase in accounting accuracy is expected to augment district income. A major component of this system is the real-time AAAS software system, which is described.

  9. Development and benchmark verification of a parallelized Monte Carlo burnup calculation program MCBMPI

    International Nuclear Information System (INIS)

    Yang Wankui; Liu Yaoguang; Ma Jimin; Yang Xin; Wang Guanbo

    2014-01-01

    MCBMPI, a parallelized burnup calculation program, was developed. The program is modularized. The neutron transport calculation module employs the parallelized MCNP5 program MCNP5MPI, and the burnup calculation module employs ORIGEN2, with an MPI parallel zone decomposition strategy. The program system only consists of MCNP5MPI and an interface subroutine. The interface subroutine performs three main functions, i.e. zone decomposition, nuclide transfer and decay, and data exchange with MCNP5MPI. The program was verified with the Pressurized Water Reactor (PWR) cell burnup benchmark; the results showed that it is possible to apply the program to burnup calculations of multiple zones, and that the computation efficiency can be significantly improved with the development of computer hardware. (authors)

  10. A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.

    Science.gov (United States)

    Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie

    2018-06-04

    Designing effective dispatching rules for production systems is a difficult and time-consuming task if it is done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible, and automatically discovered rules are competitive with or outperform existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may restrict genetic programming from finding near-optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, evolved rules are also significantly smaller and contain more relevant attributes.

  11. Automation of calculation of fastening of non-standard freights on sea vessels

    Directory of Open Access Journals (Sweden)

    Андрій Валерійович Пархотько

    2015-11-01

    Correct positioning and securing of cargo are important conditions for safe navigation. Unreliable positioning and securing of cargo can result in shipwreck and is a cause of injuries and loss of human life both at sea and during loading and unloading. To address these problems, the International Maritime Organization publishes guidance in the form of either Assembly resolutions or circulars approved by the Maritime Safety Committee. The correct determination of the necessary number of lashings and their positioning has the greatest impact on safe cargo securing. In a rough sea, the vessel experiences accelerations in the longitudinal, vertical and, predominantly, transverse directions. The forces created by these accelerations generate the majority of the problems in securing. The article presents the procedure for calculating the forces and force moments acting upon cargo shipped by sea vessels. To determine the proper number of lashings, the forces acting upon the cargo must be compared with the forces holding it, taking into account the strength, the number and the fastening angle of the lashings. An option for implementing the calculation algorithm in a computer program is offered. Recommendations are given so that the program can be used by the ship's management, surveyor companies and port technologists, together with an example of such a calculation
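
    The core balance check of such a calculation can be sketched as follows, loosely in the spirit of the IMO CSS Code transverse-sliding case; the accelerations, friction coefficient and safety factor are assumptions, and the full Code also covers tipping and the longitudinal case:

      import math

      def sliding_ok(cargo_mass_t, transverse_accel, mu, lashings, safety=1.35):
          """lashings: list of (maximum securing load in kN, vertical angle in degrees)."""
          g = 9.81
          sliding_force = cargo_mass_t * transverse_accel          # kN, since mass is in tonnes
          friction = mu * cargo_mass_t * g                         # kN
          # Each lashing holds with its calculated strength times (cos a + mu * sin a).
          lashing_hold = sum(msl / safety * (math.cos(math.radians(a)) + mu * math.sin(math.radians(a)))
                             for msl, a in lashings)
          return sliding_force <= friction + lashing_hold, sliding_force, friction + lashing_hold

      ok, acting, holding = sliding_ok(
          cargo_mass_t=40.0, transverse_accel=6.5, mu=0.3,
          lashings=[(100.0, 40.0)] * 4)
      print("secure" if ok else "add lashings", f"({acting:.0f} kN acting vs {holding:.0f} kN holding)")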

  12. Buying Program of the Standard Automated Materiel Management System. Automated Small Purchase System: Defense Supply Center Philadelphia

    National Research Council Canada - National Science Library

    2001-01-01

    The Standard Automated Materiel Management System Automated Small Purchase System is a fully automated micro-purchases system used by the General and Industrial Directorate at the Defense Supply Center Philadelphia...

  13. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved, such as the difficulty of checking safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as an intermediate model (IM) to transform PLC programs written in the ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  14. Computer program for calculation of ideal gas thermodynamic data

    Science.gov (United States)

    Gordon, S.; Mc Bride, B. J.

    1968-01-01

    Computer program calculates ideal gas thermodynamic properties for any species for which molecular constant data are available. Partition functions and derivatives from formulas based on statistical mechanics are provided by the program, which is written in FORTRAN 4 and MAP.
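
    The statistical-mechanics step can be illustrated with a rigid-rotor/harmonic-oscillator sketch for a linear molecule; this is not the NASA program, and the vibrational wavenumbers used are only roughly representative:

      import math

      K_B = 1.380649e-23       # J/K
      H = 6.62607015e-34       # J*s
      C_LIGHT = 2.99792458e10  # cm/s
      R = 8.314462618          # J/(mol*K)

      def cp_linear_molecule(wavenumbers_cm, T):
          """Cp/R = 7/2 (translation + rotation + PV term) plus the vibrational contributions."""
          cp_over_R = 3.5
          for nu in wavenumbers_cm:
              x = H * C_LIGHT * nu / (K_B * T)                     # h*nu / kT
              cp_over_R += x * x * math.exp(x) / (math.exp(x) - 1.0) ** 2
          return cp_over_R * R

      # Illustrative wavenumbers roughly like CO2's fundamentals (degenerate bend listed twice):
      print(f"Cp ~ {cp_linear_molecule([667.0, 667.0, 1333.0, 2349.0], 298.15):.1f} J/(mol*K)")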

  15. Computer program 'TRIO' for third order calculation of ion trajectory

    International Nuclear Information System (INIS)

    Matsuo, Takekiyo; Matsuda, Hisashi; Fujita, Yoshitaka; Wollnik, H.

    1976-01-01

    A computer program for the calculation of ion trajectories is described. This program, ''TRIO'' (Third Order Ion Optics), is applicable to any ion optical system consisting of drift spaces, cylindrical or toroidal electric sector fields, homogeneous or inhomogeneous magnetic sector fields, and magnetic and electrostatic Q-lenses. The influence of the fringing field is taken into consideration. A special device is introduced into the method of matrix multiplication to shorten the calculation time; as a result, the required time proves to be about 40 times shorter than with the ordinary method. The trajectory calculation can be executed with accuracy up to third order. Any one of three dispersion bases (momentum; energy; mass and energy) can be selected. A full listing of the computer program and an example are given. (auth.)
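
    The first-order core of such trajectory codes is transfer-matrix multiplication; a sketch for a drift space and a homogeneous magnetic sector in the horizontal plane follows (no fringing fields, dispersion or third-order terms, so far simpler than TRIO itself):

      import numpy as np

      def drift(length_m):
          return np.array([[1.0, length_m],
                           [0.0, 1.0]])

      def homogeneous_sector_magnet(radius_m, angle_rad):
          c, s = np.cos(angle_rad), np.sin(angle_rad)
          return np.array([[c, radius_m * s],
                           [-s / radius_m, c]])

      # System: drift -> 90-degree sector magnet -> drift. Matrices multiply right to left,
      # so the first element a particle meets is the right-most factor in the product.
      system = drift(0.5) @ homogeneous_sector_magnet(0.35, np.pi / 2) @ drift(0.5)

      # Propagate a ray with 1 mm offset and 2 mrad divergence.
      x0 = np.array([1.0e-3, 2.0e-3])
      print("final (x [m], x' [rad]):", system @ x0)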

  16. GENGTC-JB: a computer program to calculate temperature distribution for cylindrical geometry capsule

    International Nuclear Information System (INIS)

    Someya, Hiroyuki; Kobayashi, Toshiki; Niimi, Motoji; Hoshiya, Taiji; Harayama, Yasuo

    1987-09-01

    In the design of JMTR irradiation capsules containing specimens, a program (named GENGTC) has generally been used to evaluate temperature distributions in the capsules. The program was originally compiled by ORNL (USA) and consisted of very simple calculation methods. Because of these simple methods, the program is easy to use and has many applications in capsule design. However, when the program was reviewed in light of recent computer capabilities, it was considered desirable to replace the original computing methods with more advanced ones and to simplify the data input, which was complicated. Therefore, the program was upgraded with the aim of improving the calculations and the input method. The present report describes the revised calculation methods and provides an input/output guide for the upgraded program. (author)

  17. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  18. Program software for the automated processing of gravity and magnetic survey data for the Mir computer

    Energy Technology Data Exchange (ETDEWEB)

    Lyubimov, G.A.

    1980-01-01

    The content of the program software for the automated processing of gravity and magnetic survey data on the small Mir-1 and Mir-2 computers, as developed by the Voronezh geophysical expedition, is presented.

  19. Error rate of automated calculation for wound surface area using a digital photography.

    Science.gov (United States)

    Yang, S; Park, J; Lee, H; Lee, J B; Lee, B U; Oh, B H

    2018-02-01

    Although measuring wound size using digital photography is a quick and simple method to evaluate a skin wound, its accuracy has not been fully validated. To investigate the error rate of our newly developed wound surface area calculation using digital photography. Using a smartphone and a digital single-lens reflex (DSLR) camera, four photographs of variously sized wounds (diameter: 0.5-3.5 cm) were taken from a facial skin model together with color patches. The quantitative values of the wound areas were automatically calculated. The relative error (RE) of this method with regard to wound size and type of camera was analyzed. The RE of individual calculated areas ranged from 0.0329% (DSLR, diameter 1.0 cm) to 23.7166% (smartphone, diameter 2.0 cm). In spite of the correction for lens curvature, the smartphone had a significantly higher error rate than the DSLR camera (3.9431±2.9772 vs 8.1303±4.8236). However, for wound diameters below 3 cm, the REs of the average values of the four photographs were below 5%. In addition, there was no difference in the average value of the wound area taken by smartphone and DSLR camera in those cases. For the follow-up of small skin defects (diameter <3 cm), our newly developed automated wound area calculation method can be applied to multiple photographs, and the average values are a relatively useful index of wound healing with an acceptable error rate. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
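
    The two arithmetic steps behind such a measurement — scaling a segmented pixel count by a reference color patch of known size, then computing the relative error against a known true area — can be sketched as follows (segmentation is assumed to be done elsewhere and all numbers are made up):

      def pixel_area_to_cm2(wound_pixels, patch_pixels, patch_area_cm2):
          # The reference patch of known physical size fixes the cm^2-per-pixel scale.
          cm2_per_pixel = patch_area_cm2 / patch_pixels
          return wound_pixels * cm2_per_pixel

      def relative_error_percent(measured, true_value):
          return 100.0 * abs(measured - true_value) / true_value

      measured = pixel_area_to_cm2(wound_pixels=15320, patch_pixels=9800, patch_area_cm2=4.0)
      true_area = 6.16   # hypothetical known area of the skin-model wound, cm^2
      print(f"measured {measured:.2f} cm^2, relative error {relative_error_percent(measured, true_area):.2f}%")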

  20. Injection Molding Parameters Calculations by Using Visual Basic (VB) Programming

    Science.gov (United States)

    Tony, B. Jain A. R.; Karthikeyen, S.; Alex, B. Jeslin A. R.; Hasan, Z. Jahid Ali

    2018-03-01

    Nowadays, the manufacturing industry plays a vital role in production sectors. To fabricate a component, many design calculations have to be done, and there is a chance of human error occurring during these calculations. The aim of this project is to create a special module using Visual Basic (VB) programming to calculate injection molding parameters and thereby avoid human errors. To create an injection mold for a spur gear component, the following parameters have to be calculated: cooling capacity, cooling channel diameter, cooling channel length, runner length and runner diameter, and gate diameter and gate pressure. To calculate the above injection molding parameters, a separate module has been created using Visual Basic (VB) programming to reduce human errors. The module outputs the dimensions of injection molding components such as the mold cavity and core design and the ejector plate design.

  1. Simple Calculation Programs for Biology Immunological Methods

    Indian Academy of Sciences (India)

    Simple Calculation Programs for Biology Immunological Methods. Computation of Ab/Ag concentration from ELISA data. Graphical method; Raghava et al., 1992, J. Immunol. Methods 153: 263. Determination of affinity of monoclonal antibody. Using non-competitive ...

  2. Distribution automation at BC Hydro : a case study

    Energy Technology Data Exchange (ETDEWEB)

    Siew, C. [BC Hydro, Vancouver, BC (Canada). Smart Grid Development Program

    2009-07-01

    This presentation discussed a distribution automation study conducted by BC Hydro to determine methods of improving grid performance by supporting intelligent transmission and distribution systems. The utility's smart grid program includes a number of utility-side and customer-side applications, including enabled demand response, microgrid, and operational efficiency applications. The smart grid program will improve reliability and power quality by 40 per cent, improve conservation and energy efficiency throughout the province, and provide enhanced customer service. Programs and initiatives currently underway at the utility include distribution management, smart metering, distribution automation, and substation automation programs. The utility's automation functionality will include fault interruption and locating, restoration capability, and restoration success. A decision support system has also been established to assist control room and field operating personnel with monitoring and control of the electric distribution system. Protection, control and monitoring (PCM) and volt VAR optimization upgrades are also planned. Reclosers are also being automated, and an automation guide has been developed for switches. tabs., figs.

  3. Second AIAA/NASA USAF Symposium on Automation, Robotics and Advanced Computing for the National Space Program

    Science.gov (United States)

    Myers, Dale

    1987-01-01

    An introduction is given to NASA goals in the development of automation (expert systems) and robotics technologies in the Space Station program. Artificial intelligence (AI) has been identified as a means to lowering ground support costs. Telerobotics will enhance space assembly, servicing and repair capabilities, and will be used for an estimated half of the necessary EVA tasks. The general principles guiding NASA in the design, development, ground-testing, interactions with industry and construction of the Space Station component systems are summarized. The telerobotics program has progressed to a point where a telerobot servicer is a firm component of the first Space Station element launch, to support assembly, maintenance and servicing of the Station. The University of Wisconsin has been selected for the establishment of a Center for the Commercial Development of Space, specializing in space automation and robotics.

  4. Simple Calculation Programs for Biology Other Methods

    Indian Academy of Sciences (India)

    Simple Calculation Programs for Biology Other Methods. Hemolytic potency of drugs: Raghava et al., (1994) Biotechniques 17: 1148. FPMAP: methods for classification and identification of microorganisms by 16S rRNA. Graphical display of restriction and fragment map of ...

  5. Comparison of Integer Programming (IP) Solvers for Automated Test Assembly (ATA). Research Report. ETS RR-15-05

    Science.gov (United States)

    Donoghue, John R.

    2015-01-01

    At the heart of van der Linden's approach to automated test assembly (ATA) is a linear programming/integer programming (LP/IP) problem. A variety of IP solvers are available, ranging in cost from free to hundreds of thousands of dollars. In this paper, I compare several approaches to solving the underlying IP problem. These approaches range from…
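
    A toy version of the underlying IP — select a fixed number of items maximizing information at a cutoff subject to a content constraint — can be written with the open-source PuLP modeler (just one possible solver, not necessarily one of those compared in the report; the item pool is made up):

      import pulp

      # Hypothetical item pool: Fisher information at the cutoff plus a content domain.
      items = {
          "i1": {"info": 0.42, "domain": "algebra"},
          "i2": {"info": 0.35, "domain": "algebra"},
          "i3": {"info": 0.51, "domain": "geometry"},
          "i4": {"info": 0.28, "domain": "geometry"},
          "i5": {"info": 0.47, "domain": "algebra"},
          "i6": {"info": 0.33, "domain": "geometry"},
      }

      x = pulp.LpVariable.dicts("select", items.keys(), cat="Binary")
      prob = pulp.LpProblem("test_assembly", pulp.LpMaximize)
      prob += pulp.lpSum(items[i]["info"] * x[i] for i in items)               # objective
      prob += pulp.lpSum(x[i] for i in items) == 4                             # test length
      prob += pulp.lpSum(x[i] for i in items if items[i]["domain"] == "algebra") >= 2

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      print("selected items:", [i for i in items if x[i].value() == 1])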

  6. Development and implementation of an automated quantitative film digitizer quality control program

    Science.gov (United States)

    Fetterly, Kenneth A.; Avula, Ramesh T. V.; Hangiandreou, Nicholas J.

    1999-05-01

    A semi-automated, quantitative film digitizer quality control program that is based on the computer analysis of the image data from a single digitized test film was developed. This program includes measurements of the geometric accuracy, optical density performance, signal to noise ratio, and presampled modulation transfer function. The variability of the measurements was less than plus or minus 5%. Measurements were made on a group of two clinical and two laboratory laser film digitizers during a trial period of approximately four months. Quality control limits were established based on clinical necessity, vendor specifications and digitizer performance. During the trial period, one of the digitizers failed the performance requirements and was corrected by calibration.

  7. Poster - 09: A MATLAB-based Program for Automated Quality Assurance of a Prostate Brachytherapy Ultrasound System

    Energy Technology Data Exchange (ETDEWEB)

    Poon, Justin; Sabondjian, Eric; Sankreacha, Raxa [University of British Columbia, Dept. of Physics and Astronomy, Vancouver, BC (Canada); Trillium Health Partners – Credit Valley Hospital, Peel Regional Cancer Centre, Mississauga, ON (Canada); University of Toronto, Dept. of Radiation Oncology, Toronto, ON (Canada)

    2016-08-15

    Purpose: A robust Quality Assurance (QA) program is essential for prostate brachytherapy ultrasound systems due to the importance of imaging accuracy during treatment and planning. Task Group 128 of the American Association of Physicists in Medicine has recommended a set of QA tests covering grayscale visibility, depth of penetration, axial and lateral resolution, distance measurement, area measurement, volume measurement, and template/electronic grid alignment. Making manual measurements on the ultrasound system can be slow and inaccurate, so a MATLAB program was developed for automation of the described tests. Methods: Test images were acquired using a BK Medical Flex Focus 400 ultrasound scanner and 8848 transducer with the CIRS Brachytherapy QA Phantom – Model 045A. For each test, the program automatically segments the inputted image(s), makes the appropriate measurements, and indicates if the test passed or failed. The program was tested by analyzing two sets of images, where the measurements from the first set were used as baseline values. Results: The program successfully analyzed the images for each test and determined if any action limits were exceeded. All tests passed – the measurements made by the program were consistent and met the requirements outlined by Task Group 128. Conclusions: The MATLAB program we have developed can be used for automated QA of an ultrasound system for prostate brachytherapy. The GUI provides a user-friendly way to analyze images without the need for any manual measurement, potentially removing intra- and inter-user variability for more consistent results.

  8. Poster - 09: A MATLAB-based Program for Automated Quality Assurance of a Prostate Brachytherapy Ultrasound System

    International Nuclear Information System (INIS)

    Poon, Justin; Sabondjian, Eric; Sankreacha, Raxa

    2016-01-01

    Purpose: A robust Quality Assurance (QA) program is essential for prostate brachytherapy ultrasound systems due to the importance of imaging accuracy during treatment and planning. Task Group 128 of the American Association of Physicists in Medicine has recommended a set of QA tests covering grayscale visibility, depth of penetration, axial and lateral resolution, distance measurement, area measurement, volume measurement, and template/electronic grid alignment. Making manual measurements on the ultrasound system can be slow and inaccurate, so a MATLAB program was developed for automation of the described tests. Methods: Test images were acquired using a BK Medical Flex Focus 400 ultrasound scanner and 8848 transducer with the CIRS Brachytherapy QA Phantom – Model 045A. For each test, the program automatically segments the inputted image(s), makes the appropriate measurements, and indicates if the test passed or failed. The program was tested by analyzing two sets of images, where the measurements from the first set were used as baseline values. Results: The program successfully analyzed the images for each test and determined if any action limits were exceeded. All tests passed – the measurements made by the program were consistent and met the requirements outlined by Task Group 128. Conclusions: The MATLAB program we have developed can be used for automated QA of an ultrasound system for prostate brachytherapy. The GUI provides a user-friendly way to analyze images without the need for any manual measurement, potentially removing intra- and inter-user variability for more consistent results.

  9. Automated design and optimization of flexible booster autopilots via linear programming, volume 1

    Science.gov (United States)

    Hauser, F. D.

    1972-01-01

    A nonlinear programming technique was developed for the automated design and optimization of autopilots for large flexible launch vehicles. This technique, which resulted in the COEBRA program, uses the iterative application of linear programming. The method deals directly with the three main requirements of booster autopilot design: to provide (1) good response to guidance commands; (2) response to external disturbances (e.g. wind) to minimize structural bending moment loads and trajectory dispersions; and (3) stability with specified tolerances on the vehicle and flight control system parameters. The method is applicable to very high order systems (30th and greater per flight condition). Examples are provided that demonstrate the successful application of the employed algorithm to the design of autopilots for both single and multiple flight conditions.

  10. SAMPO 90 - High resolution interactive gamma spectrum analysis including automation with macros

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Nikkinen, M.T.; Routti, J.T.

    1991-01-01

    SAMPO 90 is a high-performance gamma spectrum analysis program for personal computers. It uses high resolution color graphics to display calibrations, spectra, fitting results as multiplet components, and analysis results. All the analysis phases can be done either under full interactive user control or by using macros for automated measurement and analysis sequences, including the control of MCAs and sample changers. Semi-automated calibrations for peak shapes (Gaussian with exponential tails), detector efficiency, and energy are available, with a possibility for user intervention through interactive graphics. Accurate peak area determination of even the most complex multiplets, of up to 32 components, is accomplished using linear, non-linear and mixed mode fitting, where the component energies and areas can be either frozen or allowed to float in arbitrary combinations. Nuclide identification is done using associated-lines techniques which allow interference correction for fully overlapping peaks. Peaked Background Subtraction can be performed and Minimum Detectable Activities calculated. Attenuation corrections can be taken into account in detector efficiency calculation. The most common PC-based MCA spectrum formats (Canberra S100, Ortec ACE, Nucleus PCA, ND AccuSpec) are supported as well as ASCII spectrum files. A gamma-line library is included together with an editor for user-configurable libraries. The analysis reports and program parameters are fully customizable. Function key macros can be used to automate the most common analysis procedures. Small batch-type modules are additionally available for routine work. SAMPO 90 is a result of over twenty man-years of programming and contains 25,000 lines of Fortran, 10,000 lines of C, and 12,000 lines of assembler
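
    As a simplified illustration of the peak-fitting step, the sketch below fits a single Gaussian on a linear background with non-linear least squares. SAMPO 90's actual peak shape (Gaussian with exponential tails) and its multiplet handling are considerably richer; the synthetic spectrum and starting values here are illustrative only.

      import numpy as np
      from scipy.optimize import curve_fit

      def peak_model(ch, area, centroid, sigma, bkg0, bkg1):
          gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2)
          return gauss + bkg0 + bkg1 * ch

      # Synthetic spectrum segment around a peak at channel 120.
      channels = np.arange(100, 141, dtype=float)
      true = peak_model(channels, area=5000, centroid=120, sigma=2.5, bkg0=40, bkg1=0.1)
      counts = np.random.poisson(true).astype(float)

      p0 = [counts.sum(), channels[np.argmax(counts)], 2.0, counts.min(), 0.0]
      popt, pcov = curve_fit(peak_model, channels, counts, p0=p0, sigma=np.sqrt(counts + 1))
      area, centroid = popt[0], popt[1]
      area_err = np.sqrt(pcov[0, 0])
      print(f"net area = {area:.0f} +/- {area_err:.0f} at channel {centroid:.2f}")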

  11. Data calculation program for RELAP 5 code

    International Nuclear Information System (INIS)

    Silvestre, Larissa J.B.; Sabundjian, Gaiane

    2015-01-01

    As the criteria and requirements for a nuclear power plant are extremely rigid, computer programs for simulation and safety analysis are required for certifying and licensing a plant. Based on this scenario, some sophisticated computational tools have been used, such as the Reactor Excursion and Leak Analysis Program (RELAP5), which is the most widely used code for the thermo-hydraulic analysis of accidents and transients in nuclear reactors. A major difficulty in simulation using the RELAP5 code is the amount of information required for the simulation of thermal-hydraulic accidents or transients. The preparation of the input data leads to a very large number of mathematical operations for calculating the geometry of the components. Therefore, a user-friendly mathematical preprocessor was developed in order to perform these calculations and prepare RELAP5 input data. Visual Basic for Applications (VBA) combined with Microsoft Excel proved to be an efficient tool to perform a number of tasks in the development of the program. Due to the absence of necessary information about some RELAP5 components, this work aims to make improvements to the Mathematic Preprocessor for RELAP5 code (PREREL5). For the new version of the preprocessor, new screens for some components that were not programmed in the original version were designed; moreover, screens of pre-existing components were redesigned to improve the program. In addition, an English version was provided for the new version of PREREL5. The new design of PREREL5 contributes to saving time and minimizing mistakes made by users of the RELAP5 code. The final version of this preprocessor will be applied to Angra 2. (author)
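
    The sketch below shows the kind of component-geometry arithmetic such a preprocessor automates when assembling hydrodynamic volume data (flow area, hydraulic diameter, subvolume length and volume for a circular pipe). The function and field names are illustrative and do not reproduce PREREL5 or the RELAP5 card format.

      import math

      def pipe_geometry(inner_diameter_m, length_m, n_subvolumes):
          # Return flow area, hydraulic diameter, subvolume length and subvolume volume.
          flow_area = math.pi * inner_diameter_m ** 2 / 4.0
          hydraulic_diameter = inner_diameter_m        # circular channel: D_h = D
          dz = length_m / n_subvolumes
          return {
              "flow_area_m2": flow_area,
              "hydraulic_diameter_m": hydraulic_diameter,
              "subvolume_length_m": dz,
              "subvolume_volume_m3": flow_area * dz,
          }

      print(pipe_geometry(inner_diameter_m=0.05, length_m=3.0, n_subvolumes=6))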

  12. Data calculation program for RELAP 5 code

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre, Larissa J.B.; Sabundjian, Gaiane, E-mail: larissajbs@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    As the criteria and requirements for a nuclear power plant are extremely rigid, computer programs for simulation and safety analysis are required for certifying and licensing a plant. Based on this scenario, some sophisticated computational tools have been used, such as the Reactor Excursion and Leak Analysis Program (RELAP5), which is the most widely used code for the thermo-hydraulic analysis of accidents and transients in nuclear reactors. A major difficulty in simulation using the RELAP5 code is the amount of information required for the simulation of thermal-hydraulic accidents or transients. The preparation of the input data leads to a very large number of mathematical operations for calculating the geometry of the components. Therefore, a user-friendly mathematical preprocessor was developed in order to perform these calculations and prepare RELAP5 input data. Visual Basic for Applications (VBA) combined with Microsoft Excel proved to be an efficient tool to perform a number of tasks in the development of the program. Due to the absence of necessary information about some RELAP5 components, this work aims to make improvements to the Mathematic Preprocessor for RELAP5 code (PREREL5). For the new version of the preprocessor, new screens for some components that were not programmed in the original version were designed; moreover, screens of pre-existing components were redesigned to improve the program. In addition, an English version was provided for the new version of PREREL5. The new design of PREREL5 contributes to saving time and minimizing mistakes made by users of the RELAP5 code. The final version of this preprocessor will be applied to Angra 2. (author)

  13. Method and program for complex calculation of heterogeneous reactor

    International Nuclear Information System (INIS)

    Kalashnikov, A.G.; Glebov, A.P.; Elovskaya, L.F.; Kuznetsova, L.I.

    1988-01-01

    An algorithm and the GITA program for complex one-dimensional calculation of a heterogeneous reactor, which permit calculations for the reactor and its cell to be conducted simultaneously using the same algorithm, are described. Multigroup macroscopic cross sections for reactor zones in the thermal energy range are determined according to the technique for calculating a cell with a complicated structure, and then a continuous multigroup calculation of the reactor in the thermal energy range and in the range of neutron thermalization is performed. The kinetic equation is solved using the Pi- and DSn-approximations.

  14. Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  15. Implementation of a scalable, web-based, automated clinical decision support risk-prediction tool for chronic kidney disease using C-CDA and application programming interfaces.

    Science.gov (United States)

    Samal, Lipika; D'Amore, John D; Bates, David W; Wright, Adam

    2017-11-01

    Clinical decision support tools for risk prediction are readily available, but typically require workflow interruptions and manual data entry so are rarely used. Due to new data interoperability standards for electronic health records (EHRs), other options are available. As a clinical case study, we sought to build a scalable, web-based system that would automate calculation of kidney failure risk and display clinical decision support to users in primary care practices. We developed a single-page application, web server, database, and application programming interface to calculate and display kidney failure risk. Data were extracted from the EHR using the Consolidated Clinical Document Architecture interoperability standard for Continuity of Care Documents (CCDs). EHR users were presented with a noninterruptive alert on the patient's summary screen and a hyperlink to details and recommendations provided through a web application. Clinic schedules and CCDs were retrieved using existing application programming interfaces to the EHR, and we provided a clinical decision support hyperlink to the EHR as a service. We debugged a series of terminology and technical issues. The application was validated with data from 255 patients and subsequently deployed to 10 primary care clinics where, over the course of 1 year, 569 533 CCD documents were processed. We validated the use of interoperable documents and open-source components to develop a low-cost tool for automated clinical decision support. Since Consolidated Clinical Document Architecture-based data extraction extends to any certified EHR, this demonstrates a successful modular approach to clinical decision support. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  16. Automated Flight Routing Using Stochastic Dynamic Programming

    Science.gov (United States)

    Ng, Hok K.; Morando, Alex; Grabbe, Shon

    2010-01-01

    Airspace capacity reduction due to convective weather impedes air traffic flows and causes traffic congestion. This study presents an algorithm that reroutes flights in the presence of winds, enroute convective weather, and congested airspace based on stochastic dynamic programming. A stochastic disturbance model incorporates into the reroute design process the capacity uncertainty. A trajectory-based airspace demand model is employed for calculating current and future airspace demand. The optimal routes minimize the total expected traveling time, weather incursion, and induced congestion costs. They are compared to weather-avoidance routes calculated using deterministic dynamic programming. The stochastic reroutes have smaller deviation probability than the deterministic counterpart when both reroutes have similar total flight distance. The stochastic rerouting algorithm takes into account all convective weather fields with all severity levels while the deterministic algorithm only accounts for convective weather systems exceeding a specified level of severity. When the stochastic reroutes are compared to the actual flight routes, they have similar total flight time, and both have about 1% of travel time crossing congested enroute sectors on average. The actual flight routes induce slightly less traffic congestion than the stochastic reroutes but intercept more severe convective weather.
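
    A toy backward-induction example in the spirit of the stochastic formulation is sketched below: at each stage the flight chooses a lateral lane and pays a travel cost, a lane-change cost and an expected weather-incursion cost. The grid, probabilities and penalties are invented for illustration and are far simpler than the paper's demand and weather models.

      import numpy as np

      n_stages, n_lanes = 6, 5
      travel_cost = 1.0                   # per stage, any lane
      lateral_cost = 0.5                  # per lane changed
      p_blocked = np.random.default_rng(0).uniform(0.0, 0.6, size=(n_stages, n_lanes))
      blocked_penalty = 10.0              # expected cost of penetrating convective weather

      V = np.zeros(n_lanes)               # value beyond the last stage
      policy = np.zeros((n_stages, n_lanes), dtype=int)
      for t in range(n_stages - 1, -1, -1):
          V_new = np.empty(n_lanes)
          for lane in range(n_lanes):
              best_cost, best_next = np.inf, lane
              for nxt in range(n_lanes):
                  step = (travel_cost
                          + lateral_cost * abs(nxt - lane)
                          + p_blocked[t, nxt] * blocked_penalty
                          + V[nxt])
                  if step < best_cost:
                      best_cost, best_next = step, nxt
              V_new[lane], policy[t, lane] = best_cost, best_next
          V = V_new

      print("expected cost from centre lane:", round(V[n_lanes // 2], 2))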

  17. Calculator Programming Engages Visual and Kinesthetic Learners

    Science.gov (United States)

    Tabor, Catherine

    2014-01-01

    Inclusion and differentiation--hallmarks of the current educational system--require a paradigm shift in the way that educators run their classrooms. This article enumerates the need for techno-kinesthetic, visually based activities and offers an example of a calculator-based programming activity that addresses that need. After discussing the use…

  18. MP.EXE Microphone pressure sensitivity calibration calculation program

    DEFF Research Database (Denmark)

    Rasmussen, Knud

    1999-01-01

    MP.EXE is a program which calculates the pressure sensitivity of LS1 microphones as defined in IEC 61094-1, based on measurement results performed as laid down in IEC 61094-2. A very early program was developed and written by K. Rasmussen. The code of the present heavily extended version is written by E.S. Olsen. The present manual is written by K. Rasmussen and E.S. Olsen.

  19. The Automation-by-Expertise-by-Training Interaction.

    Science.gov (United States)

    Strauch, Barry

    2017-03-01

    I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval, demonstrated identical automation-related operator errors suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human-computer interaction, expertise, and training and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.

  20. Future Autonomous and Automated Systems Testbed

    Data.gov (United States)

    National Aeronautics and Space Administration — Trust is the greatest obstacle to implementing greater autonomy and automation (A&A) in the human spaceflight program. The Future Autonomous and Automated...

  1. Automated registration of tail bleeding in rats.

    Science.gov (United States)

    Johansen, Peter B; Henriksen, Lars; Andresen, Per R; Lauritzen, Brian; Jensen, Kåre L; Juhl, Trine N; Tranholm, Mikael

    2008-05-01

    An automated system for registration of tail bleeding in rats using a camera and a user-designed PC-based software program has been developed. The live and processed images are displayed on the screen and are exported together with a text file for later statistical processing of the data allowing calculation of e.g. number of bleeding episodes, bleeding times and bleeding areas. Proof-of-principle was achieved when the camera captured the blood stream after infusion of rat whole blood into saline. Suitability was assessed by recording of bleeding profiles in heparin-treated rats, demonstrating that the system was able to capture on/off bleedings and that the data transfer and analysis were conducted successfully. Then, bleeding profiles were visually recorded by two independent observers simultaneously with the automated recordings after tail transection in untreated rats. Linear relationships were found in the number of bleedings, demonstrating, however, a statistically significant difference in the recording of bleeding episodes between observers. Also, the bleeding time was longer for visual compared to automated recording. No correlation was found between blood loss and bleeding time in untreated rats, but in heparinized rats a correlation was suggested. Finally, the blood loss correlated with the automated recording of bleeding area. In conclusion, the automated system has proven suitable for replacing visual recordings of tail bleedings in rats. Inter-observer differences can be eliminated, monotonous repetitive work avoided, and a higher through-put of animals in less time achieved. The automated system will lead to an increased understanding of the nature of bleeding following tail transection in different rodent models.

  2. Transcranial Magnetic Stimulation: An Automated Procedure to Obtain Coil-specific Models for Field Calculations

    DEFF Research Database (Denmark)

    Madsen, Kristoffer Hougaard; Ewald, Lars; Siebner, Hartwig R.

    2015-01-01

    Background: Field calculations for transcranial magnetic stimulation (TMS) are increasingly implemented online in neuronavigation systems and in more realistic offline approaches based on finite-element methods. They are often based on simplified and/or non-validated models of the magnetic vector potential of the TMS coils. Objective: To develop an approach to reconstruct the magnetic vector potential based on automated measurements. Methods: We implemented a setup that simultaneously measures the three components of the magnetic field with high spatial resolution. This is complemented by a novel approach to determine the magnetic vector potential via volume integration of the measured field. Results: The integration approach reproduces the vector potential with very good accuracy. The vector potential distribution of a standard figure-of-eight shaped coil determined with our setup corresponds well...

  3. Communications and Information: Strategic Automated Command Control System-Data Transmission Subsystem (SACCS-DTS) Network Security Program. Volume 2

    National Research Council Canada - National Science Library

    1997-01-01

    ...) Systems, and 33-2, Information Protection. This instruction prescribes the requirements, responsibilities and procedures for the security program for the Strategic Automated Command Control System-Data Transmission Subsystem (SACCS-DTS...

  4. Bringing Automated Formal Verification to PLC Program Development

    CERN Document Server

    Fernández Adiego, Borja; Blanco Viñuela, Enrique

    Automation is the field of engineering that deals with the development of control systems for operating systems such as industrial processes, railways, machinery or aircraft without human intervention. In most of the cases, a failure in these control systems can cause a disaster in terms of economic losses, environmental damages or human losses. For that reason, providing safe, reliable and robust control systems is a first priority goal for control engineers. Ideally, control engineers should be able to guarantee that both software and hardware fulfill the design requirements. This is an enormous challenge in which industry and academia have been working and making progresses in the last decades. This thesis focuses on one particular type of control systems that operates industrial processes, the PLC (Programmable Logic Controller) - based control systems. Moreover it targets one of the main challenges for these systems, guaranteeing that PLC programs are compliant with their specifications. Traditionally ...

  5. Improved reliability, accuracy and quality in automated NMR structure calculation with ARIA

    Energy Technology Data Exchange (ETDEWEB)

    Mareuil, Fabien [Institut Pasteur, Cellule d' Informatique pour la Biologie (France); Malliavin, Thérèse E.; Nilges, Michael; Bardiaux, Benjamin, E-mail: bardiaux@pasteur.fr [Institut Pasteur, Unité de Bioinformatique Structurale, CNRS UMR 3528 (France)

    2015-08-15

    In biological NMR, assignment of NOE cross-peaks and calculation of atomic conformations are critical steps in the determination of reliable high-resolution structures. ARIA is an automated approach that performs NOE assignment and structure calculation in a concomitant manner in an iterative procedure. The log-harmonic shape for distance restraint potential and the Bayesian weighting of distance restraints, recently introduced in ARIA, were shown to significantly improve the quality and the accuracy of determined structures. In this paper, we propose two modifications of the ARIA protocol: (1) the softening of the force field together with adapted hydrogen radii, which is meaningful in the context of the log-harmonic potential with Bayesian weighting, (2) a procedure that automatically adjusts the violation tolerance used in the selection of active restraints, based on the fitting of the structure to the input data sets. The new ARIA protocols were fine-tuned on a set of eight protein targets from the CASD–NMR initiative. As a result, the convergence problems previously observed for some targets were resolved and the obtained structures exhibited better quality. In addition, the new ARIA protocols were applied for the structure calculation of ten new CASD–NMR targets in a blind fashion, i.e. without knowing the actual solution. Even though optimisation of parameters and pre-filtering of unrefined NOE peak lists were necessary for half of the targets, ARIA consistently and reliably determined very precise and highly accurate structures for all cases. In the context of integrative structural biology, an increasing number of experimental methods are used that produce distance data for the determination of 3D structures of macromolecules, stressing the importance of methods that successfully make use of ambiguous and noisy distance data.

  6. A generic template for automated bioanalytical ligand-binding assays using modular robotic scripts in support of discovery biotherapeutic programs.

    Science.gov (United States)

    Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J

    2013-07-01

    Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed the users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided the flexibility in adapting to various LBA formats and the significant time saving in script writing and scientist training. Data generated by the automated process were comparable to those by manual process while the bioanalytical productivity was significantly improved using the modular robotic scripts.

  7. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  8. Computer automation of continuous-flow analyzers for trace constituents in water. Volume 4. Description of program segments. Part 3. TAASTART

    International Nuclear Information System (INIS)

    Crawford, R.W.

    1979-01-01

    This report describes TAASTART, the third program in the series of programs necessary in automating the Technicon AutoAnalyzer. Included is a flow chart that illustrates the program logic and a description of each section and subroutine. In addition, all arrays, variables and strings are listed and defined, and a sample program listing with a complete list of symbols and references is provided

  9. Automated detection and classification of cryptographic algorithms in binary programs through machine learning

    OpenAIRE

    Hosfelt, Diane Duros

    2015-01-01

    Threats from the internet, particularly malicious software (i.e., malware) often use cryptographic algorithms to disguise their actions and even to take control of a victim's system (as in the case of ransomware). Malware and other threats proliferate too quickly for the time-consuming traditional methods of binary analysis to be effective. By automating detection and classification of cryptographic algorithms, we can speed program analysis and more efficiently combat malware. This thesis wil...

  10. A system-level approach to automation research

    Science.gov (United States)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  11. Dipole showers and automated NLO matching in Herwig++

    International Nuclear Information System (INIS)

    Plaetzer, Simon; Gieseke, Stefan

    2011-09-01

    We report on the implementation of a coherent dipole shower algorithm along with an automated implementation for dipole subtraction and for performing POWHEG- and MC@NLO-type matching to next-to-leading order (NLO) calculations. Both programs are implemented as add-on modules to the event generator HERWIG++. A preliminary tune of parameters to data acquired at LEP, HERA and Drell-Yan pair production at the Tevatron has been performed, and we find an overall very good description which is slightly improved by the NLO matching. (orig.)

  12. Dipole showers and automated NLO matching in Herwig++

    Energy Technology Data Exchange (ETDEWEB)

    Plaetzer, Simon [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Gieseke, Stefan [Karlsruher Institut fuer Technologie, Karlsruhe (Germany). Inst. fuer Theoretische Teilchenphysik

    2011-09-15

    We report on the implementation of a coherent dipole shower algorithm along with an automated implementation for dipole subtraction and for performing POWHEG- and MC@NLO-type matching to next-to-leading order (NLO) calculations. Both programs are implemented as add-on modules to the event generator HERWIG++. A preliminary tune of parameters to data acquired at LEP, HERA and Drell-Yan pair production at the Tevatron has been performed, and we find an overall very good description which is slightly improved by the NLO matching. (orig.)

  13. NLOM - a program for nonlocal optical model calculations

    International Nuclear Information System (INIS)

    Kim, B.T.; Kyum, M.C.; Hong, S.W.; Park, M.H.; Udagawa, T.

    1992-01-01

    A FORTRAN program NLOM for nonlocal optical model calculations is described. It is based on a method recently developed by Kim and Udagawa, which utilizes the Lanczos technique for solving integral equations derived from the nonlocal Schroedinger equation. (orig.)

  14. Determination of the burn-up of TRIGA fuel elements by calculation with new TRIGLAV program

    International Nuclear Information System (INIS)

    Zagar, T.; Ravnik, M.

    1996-01-01

    The results of fuel element burn-up calculations with the new TRIGLAV program are presented. The TRIGLAV program uses a two-dimensional model. The results are compared to results calculated with the program that uses a one-dimensional model. The results of fuel element burn-up measurements with the reactivity method are presented and compared with the calculated results. (author)

  15. SU-G-BRB-05: Automation of the Photon Dosimetric Quality Assurance Program of a Linear Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Lebron, S; Lu, B; Yan, G; Li, J; Liu, C [University of Florida, Gainesville, FL (United States)

    2016-06-15

    Purpose: To develop an automated method to calculate a linear accelerator (LINAC) photon radiation field size, flatness, symmetry, output and beam quality in a single delivery for flattened (FF) and flattening-filter-free (FFF) beams using an ionization chamber array. Methods: The proposed method consists of three control points that deliver 30×30, 10×10 and 5×5 cm² fields (FF or FFF) in a step-and-shoot sequence where the number of monitor units is weighted for each field size. The IC Profiler (Sun Nuclear Inc.) with 5 mm detector spacing was used for this study. The corrected counts (CCs) were calculated, and the locations of the maxima and minima of the first-order gradient were determined for each subfield. Then, all CCs for each field size are summed in order to obtain the final profiles. For each profile, the radiation field size, symmetry, flatness, output factor and beam quality were calculated. For field size calculation, a parameterized gradient method was used. For method validation, profiles were collected with the detector array both individually and as part of the step-and-shoot plan, with 9.9 cm of buildup for FF and FFF beams at 90 cm source-to-surface distance. The same data were collected with the device (plus buildup) placed on a movable platform to achieve a 1 mm resolution. Results: The differences between the dosimetric quantities calculated from both deliveries, individually and step-and-shoot, were within 0.31±0.20% and 0.04±0.02 mm. The differences between the calculated field sizes with 5 mm and 1 mm resolution were within ±0.1 mm. Conclusion: The proposed single-delivery method proved to be simple and efficient in automating the monthly and annual photon dosimetric quality assurance.
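
    The sketch below computes field size, flatness and symmetry from a one-dimensional profile using one common set of definitions (50% field edges, variation over the central 80% of the field). The abstract's exact formulas, its parameterized-gradient field-size method and the array corrections are not reproduced here; the synthetic profile is illustrative only.

      import numpy as np

      def profile_metrics(pos_mm, dose):
          # Assumes pos_mm is increasing and the profile has a single clean peak.
          pos_mm = np.asarray(pos_mm, dtype=float)
          dose = np.asarray(dose, dtype=float)
          cax = dose[np.argmin(np.abs(pos_mm))]
          half = 0.5 * cax
          peak = int(dose.argmax())
          # 50% field edges by linear interpolation on each (assumed monotonic) side.
          left = np.interp(half, dose[:peak], pos_mm[:peak])
          right = np.interp(half, dose[peak:][::-1], pos_mm[peak:][::-1])
          field_size = right - left
          # Flatness and symmetry over the central 80% of the field (one common definition).
          core = (pos_mm > left + 0.1 * field_size) & (pos_mm < right - 0.1 * field_size)
          d = dose[core]
          flatness = 100.0 * (d.max() - d.min()) / (d.max() + d.min())
          mirrored = np.interp(-pos_mm[core], pos_mm, dose)
          symmetry = 100.0 * np.max(np.abs(d - mirrored)) / cax
          return field_size, flatness, symmetry

      def sigmoid(x):
          return 1.0 / (1.0 + np.exp(-x / 3.0))

      positions = np.arange(-100, 101, 5.0)                                 # detector positions, mm
      profile = 100.0 * sigmoid(positions + 50) * sigmoid(50 - positions)   # ~10 cm field, soft edges
      print(profile_metrics(positions, profile))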

  16. An automated sensitivity analysis procedure for the performance assessment of nuclear waste isolation systems

    International Nuclear Information System (INIS)

    Pin, F.G.; Worley, B.A.; Oblow, E.M.; Wright, R.Q.; Harper, W.V.

    1986-01-01

    To support an effort in making large-scale sensitivity analyses feasible, cost efficient and quantitatively complete, the authors have developed an automated procedure making use of computer calculus. The procedure, called GRESS (GRadient Enhanced Software System), is embodied in a precompiler that can process Fortran computer codes and add derivative-taking capabilities to the normal calculation scheme. In this paper, the automated GRESS procedure is described and applied to the code UCB-NE-10.2, which simulates the migration through a sorption medium of the radionuclide members of a decay chain. The sensitivity calculations for a sample problem are verified using comparison with analytical and perturbation analysis results. Conclusions are drawn relative to the applicability of GRESS for more general large-scale sensitivity studies, and the role of such techniques in an overall sensitivity and uncertainty analysis program is discussed
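
    GRESS instruments Fortran source, but the underlying idea of propagating derivatives alongside values can be illustrated with a minimal forward-mode (dual-number) sketch; the toy model and names below are only a conceptual analogue, not the GRESS precompiler.

      import math

      class Dual:
          def __init__(self, value, deriv=0.0):
              self.value, self.deriv = value, deriv
          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value + other.value, self.deriv + other.deriv)
          __radd__ = __add__
          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value * other.value,
                          self.deriv * other.value + self.value * other.deriv)
          __rmul__ = __mul__

      def dual_exp(x):
          return Dual(math.exp(x.value), math.exp(x.value) * x.deriv)

      # Sensitivity of a toy "retention" model R(k) = 2*exp(-k*t) + k to the rate k.
      t = 3.0
      k = Dual(0.5, 1.0)                      # seed derivative d/dk = 1
      R = 2 * dual_exp(Dual(-t) * k) + k
      print("R =", R.value, "dR/dk =", R.deriv)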

  17. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies. They refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. They present a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum saving simultaneously, a total of approximately 2 MW of DR is available from these twelve sites that represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. They are continuing field demonstrations and economic evaluations to pursue increasing penetrations of automated DR that has demonstrated ability to provide a valuable DR resource for California.

  18. Application of advanced technology to space automation

    Science.gov (United States)

    Schappell, R. T.; Polhemus, J. T.; Lowrie, J. W.; Hughes, C. A.; Stephens, J. R.; Chang, C. Y.

    1979-01-01

    Automated operations in space provide the key to optimized mission design and data acquisition at minimum cost for the future. The results of this study strongly accentuate this statement and should provide further incentive for immediate development of specific automation technology as defined herein. Essential automation technology requirements were identified for future programs. The study was undertaken to address the future role of automation in the space program, the potential benefits to be derived, and the technology efforts that should be directed toward obtaining these benefits.

  19. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
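
    A toy assembly model of this kind is sketched below: select exactly 10 items maximizing information at a cut score, subject to simple content-area constraints, solved as a mixed-integer program with SciPy's milp. The data are simulated and real ATA models carry many more constraint types than shown here.

      import numpy as np
      from scipy.optimize import milp, LinearConstraint, Bounds

      rng = np.random.default_rng(1)
      n_items = 40
      info = rng.uniform(0.1, 1.0, n_items)            # item information at the cut score
      area = rng.integers(0, 2, n_items)               # content-area label, 0 or 1

      constraints = [
          LinearConstraint(np.ones((1, n_items)), 10, 10),              # test length = 10
          LinearConstraint((area == 0).astype(float)[None, :], 3, 10),  # >= 3 items from area 0
          LinearConstraint((area == 1).astype(float)[None, :], 3, 10),  # >= 3 items from area 1
      ]
      res = milp(c=-info,                               # maximize information -> minimize -info
                 constraints=constraints,
                 integrality=np.ones(n_items),          # integer decision variables ...
                 bounds=Bounds(0, 1))                   # ... restricted to {0, 1}
      selected = np.flatnonzero(res.x > 0.5)
      print("selected items:", selected.tolist(),
            "| total information:", round(float(info[selected].sum()), 2))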

  20. Some problems of software development for the plant-level automated control system of NPPs with the RBMK reactors

    International Nuclear Information System (INIS)

    Gorbunov, V.P.; Egorov, A.K.; Isaev, N.V.; Saprykin, E.M.

    1987-01-01

    Problems in the development and operation of automated control system (ACS) software for NPPs with RBMK reactors are discussed. An ES computer with large on-line storage (not less than 1 Mbyte) and fast response (not less than 300,000 operations per second) should be part of the ACS. Several program complexes are used in the NPP ACS. The programs collected in the EhNERGIYa library are used to support central control system operation. The information-retrieval system called the Fuel File is used to automate NPP fuel movement accounting, as well as to estimate the efficiency of fuel utilization and to calculate the fuel component of the electric and heat energy production cost. The automated information system for unit operation efficiency analysis, which solves both plant- and unit-level problems, including engineering and economic factors and the compilation of an operating-parameter data bank, is under trial operation

  1. Automated procedures for sizing aerospace vehicle structures /SAVES/

    Science.gov (United States)

    Giles, G. L.; Blackburn, C. L.; Dixon, S. C.

    1972-01-01

    Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development called SAVES is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.

  2. Automation of radiation dosimetry using PTW dosemeter and LabVIEWTM

    International Nuclear Information System (INIS)

    Weiss, C.; Al-Frouh, K.; Anjak, O.

    2011-01-01

    Automation of the UNIDOS dosemeter using a personal computer (PC) is discussed in this paper. In order to save time and eliminate human operation errors during radiation dosimetry, suitable software, using the LabVIEW TM graphical programming language, was written to automate and facilitate the processes of measurement, analysis and data storage. The software calculates the calibration factor of the ionization chamber in terms of air kerma or absorbed dose to water according to IAEA dosimetry protocols. It also has the ability to print a calibration certificate. The results obtained using this software are found to be more reliable and flexible than those obtained by the manual methods previously employed. Using LabVIEW TM as a development tool is extremely convenient when software modifications and improvements are needed.
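
    The sketch below shows the basic arithmetic behind such a calibration: correct the chamber reading to reference temperature and pressure, then form the air-kerma calibration coefficient as the reference kerma divided by the corrected reading. The reference conditions and numerical inputs are placeholders, and other protocol corrections (e.g. recombination, polarity) are omitted.

      def k_tp(temp_c, pressure_kpa, ref_temp_c=20.0, ref_pressure_kpa=101.325):
          # Temperature-pressure correction for an open (vented) ionization chamber.
          return ((273.2 + temp_c) / (273.2 + ref_temp_c)) * (ref_pressure_kpa / pressure_kpa)

      def air_kerma_calibration_coefficient(reference_kerma_gy, reading_c, temp_c, pressure_kpa):
          corrected_reading = reading_c * k_tp(temp_c, pressure_kpa)
          return reference_kerma_gy / corrected_reading        # Gy per coulomb

      print(air_kerma_calibration_coefficient(
          reference_kerma_gy=5.0e-3, reading_c=1.04e-10, temp_c=23.5, pressure_kpa=99.8))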

  3. Programming PHREEQC calculations with C++ and Python: a comparative study

    Science.gov (United States)

    Charlton, Scott R.; Parkhurst, David L.; Muller, Mike

    2011-01-01

    The new IPhreeqc module provides an application programming interface (API) to facilitate coupling of other codes with the U.S. Geological Survey geochemical model PHREEQC. Traditionally, loose coupling of PHREEQC with other applications required methods to create PHREEQC input files, start external PHREEQC processes, and process PHREEQC output files. IPhreeqc eliminates most of this effort by providing direct access to PHREEQC capabilities through a component object model (COM), a library, or a dynamically linked library (DLL). Input and calculations can be specified through internally programmed strings, and all data exchange between an application and the module can occur in computer memory. This study compares simulations programmed in C++ and Python that are tightly coupled with IPhreeqc modules to the traditional simulations that are loosely coupled to PHREEQC. The study compares performance, quantifies effort, and evaluates lines of code and the complexity of the design. The comparisons show that IPhreeqc offers a more powerful and simpler approach for incorporating PHREEQC calculations into transport models and other applications that need to perform PHREEQC calculations. The IPhreeqc module facilitates the design of coupled applications and significantly reduces run times. Even a moderate knowledge of one of the supported programming languages allows more efficient use of PHREEQC than the traditional loosely coupled approach.
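
    For contrast with the in-memory IPhreeqc route, the sketch below illustrates the traditional loose-coupling workflow the study compares against: write an input file, run an external PHREEQC process and read the selected-output file back. The executable name, database and file names are placeholders and the input block is only a minimal example.

      import subprocess

      input_text = """
      SOLUTION 1
          temp  25
          pH    7.0
          Ca    1.0
          Cl    2.0  charge
      SELECTED_OUTPUT
          -file   coupled_step.sel
          -pH     true
      END
      """

      with open("coupled_step.pqi", "w") as f:
          f.write(input_text)

      # Placeholder executable and database names; batch PHREEQC takes input, output, database.
      subprocess.run(["phreeqc", "coupled_step.pqi", "coupled_step.out", "phreeqc.dat"],
                     check=True)

      with open("coupled_step.sel") as f:
          header, values = f.readline().split(), f.readline().split()
      print(dict(zip(header, values)))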

  4. PTOLEMY, a program for heavy-ion direction-reaction calculations

    International Nuclear Information System (INIS)

    Gloeckner, D.H.; Macfarlane, M.H.; Pieper, S.C.

    1976-03-01

    Ptolemy is an IBM/360 program for the computation of nuclear elastic and direct-reaction cross sections. It carries out both optical-model fits to elastic-scattering data at one or more energies, and DWBA calculations for nucleon-transfer reactions. Ptolemy has been specifically designed for heavy-ion calculations. It is fast and does not require large amounts of core. The input is exceptionally flexible and easy to use. This report outlines the types of calculation that Ptolemy can carry out, summarizes the formulas used, and gives a detailed description of its input

  5. GRUCAL, a computer program for calculating macroscopic group constants

    International Nuclear Information System (INIS)

    Woll, D.

    1975-06-01

    Nuclear reactor calculations require material- and composition-dependent, energy-averaged nuclear data to describe the interaction of neutrons with individual isotopes in the material compositions of reactor zones. The code GRUCAL calculates these macroscopic group constants for given compositions from the material-dependent data of the group constant library GRUBA. The instructions for calculating group constants are not fixed in the program, but are read at execution time from a separate instruction file. This allows GRUCAL to be adapted to various problems or different group constant concepts. (orig.)
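
    The core arithmetic behind such a code is the combination of number densities with microscopic cross sections, Sigma_g = sum_i N_i * sigma_i,g. The sketch below shows that step with illustrative two-group numbers that are not GRUBA library data.

      AVOGADRO = 6.02214076e23          # 1/mol

      def number_density(density_g_cm3, weight_fraction, atomic_mass_g_mol):
          return density_g_cm3 * weight_fraction * AVOGADRO / atomic_mass_g_mol   # atoms/cm3

      def macroscopic_xs(composition, micro_xs_barn):
          # composition: {isotope: (density, weight fraction, A)}; micro_xs_barn: {isotope: [per group]}
          n_groups = len(next(iter(micro_xs_barn.values())))
          sigma = [0.0] * n_groups
          for iso, (rho, w, A) in composition.items():
              N = number_density(rho, w, A)
              for g in range(n_groups):
                  sigma[g] += N * micro_xs_barn[iso][g] * 1e-24   # barn -> cm^2
          return sigma                                             # 1/cm per group

      comp = {"U238": (10.4, 0.85, 238.05), "U235": (10.4, 0.15, 235.04)}
      micro = {"U238": [0.30, 2.7], "U235": [1.50, 90.0]}          # illustrative 2-group absorption
      print(macroscopic_xs(comp, micro))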

  6. GENMOD - A program for internal dosimetry calculations

    International Nuclear Information System (INIS)

    Dunford, D.W.; Johnson, J.R.

    1987-12-01

    The computer code GENMOD was created to calculate the retention and excretion, and the integrated retention for selected radionuclides under a variety of exposure conditions. Since the creation of GENMOD new models have been developed and interfaced to GENMOD. This report describes the models now included in GENMOD, the dosimetry factors database, and gives a brief description of the GENMOD program
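
    The quantities named above can be illustrated with a retention function written as a sum of exponentials and its time integral; the compartment fractions and rate constants below are invented and are not GENMOD model parameters.

      import math

      def retention(t_days, terms):
          # terms: list of (fraction, rate_per_day); R(t) = sum a_i * exp(-lambda_i * t)
          return sum(a * math.exp(-lam * t_days) for a, lam in terms)

      def integrated_retention(T_days, terms):
          # Integral of R(t) from 0 to T: sum a_i / lambda_i * (1 - exp(-lambda_i * T))
          return sum(a / lam * (1.0 - math.exp(-lam * T_days)) for a, lam in terms)

      model = [(0.3, math.log(2) / 0.25), (0.7, math.log(2) / 10.0)]   # fast + slow compartments
      print("R(1 d) =", round(retention(1.0, model), 4))
      print("50-year integral [d]:", round(integrated_retention(50 * 365.25, model), 2))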

  7. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMS) can provide continuous, unattended recording of the reactor's fueling activity for later, qualitative review by a safeguards inspector. A quantitative analysis of this collected data could prove to be a great asset to inspectors because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype for an automated software analysis system capable of identifying when fuel bundle pushes occurred and monitoring the power level of the reactor. Neural network models were developed for calculating the region on the reactor face from which the fuel was discharged and predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguarding agencies to gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. This type of system can provide a cost-effective solution for automated monitoring of on-load reactors significantly reducing time and effort

  8. Automation of the ANSTO working standard of measurement for the activity of radionuclides

    International Nuclear Information System (INIS)

    Buckman, S.M.

    1990-08-01

    The ANSTO working standard ion chamber is used routinely for the standardisation of a range of gamma emitting radio-isotopes. The ion chamber has recently been automated by replacing the AAEC type 292 Recycling Discriminator, timer module and model 43 teletype printer with the HP86B computer, HP-59501B voltage programmer and HP-6181C current source. The program 'MEASION', running on the Deltacom IBM AT clone, calculates the radioactivity, with full error statements, from the ion chamber measurements. Each of these programs is listed and discussed. 13 refs., 5 figs., tabs

  9. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program, 2. ASTM F 2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program, 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  10. Automated data processing and radioassays.

    Science.gov (United States)

    Samols, E; Barrows, G H

    1978-04-01

    Radioassays include (1) radioimmunoassays, (2) competitive protein-binding assays based on competition for limited antibody or specific binding protein, (3) immunoradiometric assay, based on competition for excess labeled antibody, and (4) radioreceptor assays. Most mathematical models describing the relationship between labeled ligand binding and unlabeled ligand concentration have been based on the law of mass action or the isotope dilution principle. These models provide useful data reduction programs, but are theoretically unsatisfactory because competitive radioassay usually is not based on classical dilution principles, labeled and unlabeled ligand do not have to be identical, antibodies (or receptors) are frequently heterogeneous, equilibrium usually is not reached, and there is probably steric and cooperative influence on binding. An alternative, more flexible mathematical model based on the probability of binding collisions being restricted by the surface area of reactive divalent sites on antibody and on univalent antigen has been derived. Application of these models to automated data reduction allows standard curves to be fitted by a mathematical expression, and unknown values are calculated from binding data. The virtues and pitfalls of point-to-point data reduction, linear transformations, and curvilinear fitting approaches are presented. A third-order polynomial using the square root of concentration closely approximates the mathematical model based on probability, and in our experience this method provides the most acceptable results with all varieties of radioassays. With this curvilinear system, linear point connection should be used between the zero standard and the beginning of significant dose response, and also towards saturation. The importance of limiting the range of reported automated assay results to that portion of the standard curve that delivers optimal sensitivity is stressed. Published methods for automated data reduction of Scatchard plots
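
    The favoured curve-fitting approach can be sketched as follows: fit counts against the square root of the standard concentration with a third-order polynomial, then read unknowns off the fitted curve. The standard-curve data below are synthetic and the simple grid-based inversion is only one possible implementation.

      import numpy as np

      std_conc = np.array([0.0, 0.5, 1.0, 2.5, 5.0, 10.0, 25.0, 50.0])    # e.g. ng/mL
      std_counts = np.array([9800, 9100, 8400, 7100, 5900, 4500, 2900, 1900.0])

      x = np.sqrt(std_conc)
      coeffs = np.polyfit(x, std_counts, deg=3)            # third-order polynomial fit

      # Invert numerically: evaluate the fitted curve on a fine grid and interpolate.
      grid = np.linspace(x.min(), x.max(), 2001)
      fitted = np.polyval(coeffs, grid)
      order = np.argsort(fitted)                           # counts fall as dose rises

      def estimate_concentration(counts):
          root_conc = np.interp(counts, fitted[order], grid[order])
          return root_conc ** 2

      print(round(estimate_concentration(5200.0), 2), "ng/mL (interpolated unknown)")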

  11. PCRELAP5: data calculation program for RELAP 5 code

    International Nuclear Information System (INIS)

    Silvestre, Larissa Jacome Barros

    2016-01-01

    Nuclear accidents in the world led to the establishment of rigorous criteria and requirements for nuclear power plant operations by the international regulatory bodies. By using specific computer programs, simulations of various accidents and transients likely to occur at any nuclear power plant are required for certifying and licensing a nuclear power plant. Based on this scenario, some sophisticated computational tools have been used, such as the Reactor Excursion and Leak Analysis Program (RELAP5), which is the most widely used code for the thermo-hydraulic analysis of accidents and transients in nuclear reactors in Brazil and worldwide. A major difficulty in simulation using the RELAP5 code is the amount of information required for the simulation of thermal-hydraulic accidents or transients. The preparation of the input data requires a great number of mathematical operations to calculate the geometry of the components. Thus, a user-friendly mathematical preprocessor was designed to perform those calculations and prepare the RELAP5 input data. Visual Basic for Applications (VBA) for Microsoft Excel proved to be an effective tool to perform a number of tasks in the development of the program. In order to meet the needs of RELAP5 users, the RELAP5 Calculation Program (Programa de Calculo do RELAP5 - PCRELAP5) was designed. The components of the code were implemented; all input cards, including the optional cards of each component, have been programmed. In addition, an English version for PCRELAP5 was provided. Furthermore, a user-friendly design was developed to minimize input-preparation time and user errors. In this work, the final version of this preprocessor was successfully applied to the Safety Injection System (SIS) of Angra 2. (author)

  12. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In advanced main control rooms (MCRs), various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it is necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation is achieved by calculating the failure probability of human performance related to the cognitive activities

  13. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    experiments. The result showed that a decreased rate of the operator working time was significantly related to the cognitive automation rate and that the calculation of the cognitive task load was useful as a measure of the cognitive automation rate

  14. HEINBE; the calculation program for helium production in beryllium under neutron irradiation

    International Nuclear Information System (INIS)

    Shimakawa, Satoshi; Ishitsuka, Etsuo; Sato, Minoru

    1992-11-01

    HEINBE is a personal-computer program for calculating helium production in beryllium under neutron irradiation. The program can also calculate tritium production in beryllium. By considering many nuclear reactions and their multi-step chains, helium and tritium production in beryllium materials irradiated in fusion or fission reactors can be calculated with high accuracy. The calculation method, user's manual, calculated examples and a comparison with experimental data are described. This report also describes a neutronics simulation method to generate additional data on the swelling of beryllium, in the 3,000-15,000 appm helium range, for the end-of-life of the proposed ITER fusion blanket design. The calculation results indicate that helium production of 2,000-8,000 appm could be achieved for a lithium-doped beryllium sample after 50 days of irradiation in a fission reactor such as the JMTR. (author)
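
    A leading-order, single-reaction estimate of the helium concentration in appm is sketched below; HEINBE also tracks the multi-step reaction chains that matter in practice, and the cross section and flux used here are placeholders rather than evaluated data.

      import math

      def helium_appm(sigma_barn, flux_n_cm2_s, time_days):
          sigma_cm2 = sigma_barn * 1e-24
          fluence = flux_n_cm2_s * time_days * 86400.0
          burn_fraction = 1.0 - math.exp(-sigma_cm2 * fluence)   # fraction of Be atoms reacted
          return 1e6 * burn_fraction                              # He atoms per million Be atoms

      print(round(helium_appm(sigma_barn=0.05, flux_n_cm2_s=3e14, time_days=50), 1), "appm")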

  15. Ptolemy: a program for heavy-ion direct-reaction calculations

    International Nuclear Information System (INIS)

    Macfarlane, M.H.; Pieper, S.C.

    1978-04-01

    Ptolemy is an IBM/360 program for the computation of nuclear elastic and direct-reaction cross sections. It carries out optical-model fits to elastic-scattering data at one or more energies and for one or more combinations of projectile and target, collective model DWBA calculations of excitation processes, and finite-range DWBA calculations of nucleon-transfer reactions. It is fast and does not require large amounts of core. The input is exceptionally flexible and easy to use. The types of calculations that Ptolemy can carry out are outlined, the formulas used are summarized, and a detailed description of its input is given

  16. Calculational Tool for Skin Contamination Dose Assessment

    CERN Document Server

    Hill, R L

    2002-01-01

    A spreadsheet calculational tool was developed to automate the calculations performed for dose assessment of skin contamination. This document reports on the design and testing of the spreadsheet calculational tool.

  17. Automating with SIMATIC S7-400 inside TIA portal configuring, programming and testing with STEP 7 Professional

    CERN Document Server

    Berger, Hans

    2014-01-01

    This book presents a comprehensive description of the configuration of devices and network for the S7-400 components inside the engineering framework TIA Portal. You learn how to formulate and test a control program with the programming languages LAD, FBD, STL, and SCL. The book is rounded off by configuring the distributed I/O with PROFIBUS DP and PROFINET IO using SIMATIC S7-400 and data exchange via Industrial Ethernet. SIMATIC is the globally established automation system for implementing industrial controllers for machines, production plants and processes. SIMATIC S7-400 is the most powerful automation system within SIMATIC. This process controller is ideal for data-intensive tasks that are especially typical for the process industry. With superb communication capability and integrated interfaces it is optimized for larger tasks such as the coordination of entire systems. Open-loop and closed-loop control tasks are formulated with the STEP 7 Professional V11 engineering software in the field-proven progr...

  18. SHIELD 1.0: development of a shielding calculator program in diagnostic radiology

    International Nuclear Information System (INIS)

    Santos, Romulo R.; Real, Jessica V.; Luz, Renata M. da; Friedrich, Barbara Q.; Silva, Ana Maria Marques da

    2013-01-01

    In the shielding calculation of radiological facilities, several parameters are required, such as occupancy, use factor, number of patients, source-barrier distance, area type (controlled or uncontrolled), radiation type (primary or secondary) and the material used in the barrier. Shielding design optimization requires a review of several options for the physical facility design and, mainly, the achievement of the best cost-benefit relationship for the shielding material. To facilitate this kind of design, a program to calculate shielding in diagnostic radiology was implemented, based on data and limits established by National Council on Radiation Protection and Measurements (NCRP) Report 147 and SVS-MS 453/98. The program was developed in the C# language and presents a graphical interface for user data input and reporting capabilities. The module initially implemented, called SHIELD 1.0, calculates barriers for conventional X-ray rooms. The program was validated by comparison with the results of example shielding calculations presented in NCRP 147.
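
    The two-step barrier calculation that NCRP Report 147 describes can be sketched as follows: derive the required transmission from the design goal, workload and geometry, then invert the Archer transmission model B(x) = [(1 + b/a)exp(a g x) - b/a]^(-1/g) for the thickness. The fitting parameters and inputs below are placeholders for demonstration, not the report's tabulated values.

      import math

      def required_transmission(design_goal_mgy_wk, distance_m, workload_mgy_per_patient,
                                patients_per_week, use_factor, occupancy):
          unshielded = workload_mgy_per_patient * patients_per_week * use_factor * occupancy
          return design_goal_mgy_wk * distance_m ** 2 / unshielded

      def barrier_thickness_mm(B, alpha, beta, gamma):
          # Invert the Archer equation; thickness comes out in the unit of 1/alpha and 1/beta.
          return (1.0 / (alpha * gamma)) * math.log((B ** -gamma + beta / alpha) /
                                                    (1.0 + beta / alpha))

      B = required_transmission(design_goal_mgy_wk=0.02, distance_m=3.0,
                                workload_mgy_per_patient=0.5, patients_per_week=100,
                                use_factor=1.0, occupancy=1.0)
      # Placeholder lead parameters (per mm), purely for demonstration:
      print(round(barrier_thickness_mm(B, alpha=2.3, beta=7.3, gamma=0.5), 2), "mm")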

  19. Highway Electrification And Automation

    OpenAIRE

    Shladover, Steven E.

    1992-01-01

    This report addresses how the California Department of Transportation and the California PATH Program have made efforts to evaluate the feasibility and applicability of highway electrification and automation technologies. In addition to describing how the work was conducted, the report also describes the findings on highway electrification and highway automation, with experimental results, design study results, and a region-wide application impacts study for Los Angeles.

  20. Automation U.S.A.: Overcoming Barriers to Automation.

    Science.gov (United States)

    Brody, Herb

    1985-01-01

    Although labor unions and inadequate technology play minor roles, the principal barrier to factory automation is "fear of change." Related problems include long-term benefits, nontechnical executives, and uncertainty of factory cost accounting. Industry support for university programs is helping to educate engineers to design, implement, and…

  1. Introduction of an automated mine surveying system - a method for effective control of mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Mazhdrakov, M.

    1987-04-01

    Reviews developments in automated processing of mine survey data in Bulgaria for 1965-1970. This development has occurred in three phases. In the first phase, computers calculated coordinates of mine survey points; in the second phase, these data were electronically processed; in the third phase, surface and underground mine development is controlled by electronic data processing equipment. Centralized and decentralized electronic processing of data has been introduced at major coal mines. The Bulgarian Pravets 82 microcomputer and the ASMO-MINI program package are in current use at major coal mines. A lack of plotters, due to financial limitations, handicaps large-scale application of automated mine surveying in Bulgaria.

  2. Calculations in cytogenetic dosimetry by means of the dosgen program

    International Nuclear Information System (INIS)

    Garcia Lima, O.; Zerquera, J.T.

    1996-01-01

    The DOSGEN program brings together the different calculation routines that are most often used in cytogenetic dosimetry. It can be run on an IBM PC compatible by cytogenetic experts having a basic knowledge of computing. The program has been successfully applied using experimental data, and its advantages have been acknowledged by Latin American and Asian laboratories dealing with this medical branch. The program is written in the Pascal language and requires 42 Kbytes
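
    The abstract does not list DOSGEN's individual routines, but a step common to most cytogenetic dosimetry packages is inverting the linear-quadratic dicentric yield curve Y = C + aD + bD^2 to estimate dose. The sketch below shows that step only, with hypothetical calibration coefficients; it is not code from DOSGEN.

      # Illustrative cytogenetic dose estimation (not the actual DOSGEN code):
      # invert the linear-quadratic dicentric yield curve Y = C + a*D + b*D**2.
      import math

      def dose_from_yield(y, c, a, b):
          """Return absorbed dose D (Gy) for an observed dicentric yield y,
          given background c and calibration coefficients a (Gy^-1), b (Gy^-2)."""
          if b == 0.0:
              return (y - c) / a
          disc = a * a + 4.0 * b * (y - c)
          if disc < 0.0:
              raise ValueError("yield below background for these coefficients")
          return (-a + math.sqrt(disc)) / (2.0 * b)

      if __name__ == "__main__":
          dicentrics, cells = 25, 500          # hypothetical scoring result
          y = dicentrics / cells               # dicentrics per cell
          # Illustrative gamma-ray calibration coefficients (placeholders)
          print(f"estimated dose: {dose_from_yield(y, c=0.001, a=0.02, b=0.06):.2f} Gy")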

  3. Use of the 'DRAGON' program for the calculation of reactivity devices

    International Nuclear Information System (INIS)

    Mollerach, Ricardo; Fink, Jose

    2003-01-01

    DRAGON is a computer program developed at the Ecole Polytechnique of the University of Montreal and adopted by AECL for the transport calculations associated with reactivity devices. This report presents aspects of the implementation of the DRAGON program at NASA. Several cases of interest were evaluated, and comparisons were made with the results of known programs such as WIMS D5 and with experiments. a) Embalse (CANDU 6) cell without burnup and leakage: calculations of macroscopic cross sections with WIMS and DRAGON show very good agreement, with smaller differences in the thermal constants. b) Embalse fresh cell with different leakage options. c) Embalse cell with leakage and burnup: a comparison of k-infinity and k-effective from WIMS and DRAGON as a function of burnup shows that the differences ((D-W)/D) for fresh fuel are -0.17%, roughly constant up to about 2500 MWd/tU, and then decrease to -0.06% at 8500 MWd/tU. Experiments made in 1977 in the ZED-2 critical facility, reported in [3], were used as a benchmark for the cell and supercell DRAGON calculations; calculated fluxes were compared with experimental values and the agreement is good. d) ZED-2 cell calculation: the measured buckling was used as the geometric buckling, so this case can be considered an experimental verification. The reactivity calculated with DRAGON is about 2 mk, which can be considered satisfactory; the WIMS k-effective value is about one mk higher. e) Supercell calculations for the ZED-2 vertical and horizontal tube and rod adjusters using 2D and 3D models were done, and measured and calculated fluxes in the vicinity of the adjuster rods were compared. Incremental cross sections for these adjusters were calculated using different options. f) ZED-2 reactor calculations with PUMA show good agreement with the critical heights measured in experiments. The report also describes particular features of the code and recommendations regarding its use that may be useful for new users. (author)

  4. Blow.MOD2: a program for blowdown transient calculations

    International Nuclear Information System (INIS)

    Doval, A.

    1990-01-01

    The BLOW.MOD2 program has been developed to calculate the blowdown phase in a pressurized vessel after a break or valve is opened. It is a one-volume model in which the break height and flow area are specified. The Moody critical flow model was adopted, under saturation conditions, for the flow calculation through the break. Heat transfer from structures and internals has been taken into account. Long-term depressurization results compare satisfactorily with those of a more complex model. (Author)

  5. Computer automation of a health physics program record

    International Nuclear Information System (INIS)

    Bird, E.M.; Flook, B.A.; Jarrett, R.D.

    1984-01-01

    A multi-user computer data base management system (DBMS) has been developed to automate USDA's national radiological safety program. It maintains information on approved users of radioactive material and radiation-emanating equipment as a central file which is accessed whenever information on a user is required. Files of inventory, personnel dosimetry records, laboratory and equipment surveys, leak tests, bioassay reports, and all other information are linked to each approved user by an assigned code that identifies the user by state, agency, and facility. The DBMS is menu-driven with provisions for addition, modification and report generation of information maintained in the system. This DBMS was designed as a single-entry system to reduce the redundancy of data entry. Prompts guide the user at decision points and data validation routines check for proper data entry. The DBMS generates lists of current inventories, leak test forms, and inspection reports, scans for overdue reports from users, and generates follow-up letters. The DBMS operates on a Wang OIS computer and utilizes its compiled BASIC, List Processing, Word Processing, and indexed (ISAM) file features. This system is a very fast relational database supporting many users simultaneously while providing several methods of data protection. All data files are compatible with List Processing. Information in these files can be examined, sorted, modified, or output to word processing documents using software supplied by Wang. This has reduced the need for special one-time programs and provides alternative access to the data

  6. Automated Subsystem Control for Life Support System (ASCLSS)

    Science.gov (United States)

    Block, Roger F.

    1987-01-01

    The Automated Subsystem Control for Life Support Systems (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of space station subsystems. The automation system features a hierarchical and distributed real-time control architecture which places maximum control authority at the lowest, or process control, level, which enhances system autonomy. The ASCLSS demonstration system pioneered many automation and control concepts currently being considered in the space station data management system (DMS). Heavy emphasis is placed on controls hardware and software commonality implemented in accepted standards. The approach successfully demonstrates the placement of real-time process authority and accountability with the subsystem or process developer. The ASCLSS system completely automates a space station subsystem (the air revitalization group of the ASCLSS), which moves the crew/operator into a role of supervisory control authority. The ASCLSS program developed over 50 lessons learned which will aid future space station developers in the area of automation and controls.

  7. Automation from pictures

    International Nuclear Information System (INIS)

    Kozubal, A.J.

    1992-01-01

    The state transition diagram (STD) model has been helpful in the design of real time software, especially with the emergence of graphical computer aided software engineering (CASE) tools. Nevertheless, the translation of the STD to real time code has in the past been primarily a manual task. At Los Alamos we have automated this process. The designer constructs the STD using a CASE tool (Cadre Teamwork) using a special notation for events and actions. A translator converts the STD into an intermediate state notation language (SNL), and this SNL is compiled directly into C code (a state program). Execution of the state program is driven by external events, allowing multiple state programs to effectively share the resources of the host processor. Since the design and the code are tightly integrated through the CASE tool, the design and code never diverge, and we avoid design obsolescence. Furthermore, the CASE tool automates the production of formal technical documents from the graphic description encapsulated by the CASE tool. (author)
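
    The record above describes translating state transition diagrams into event-driven state programs. The following sketch shows the general table-driven pattern such a translator produces, written here in Python for brevity; it is not the Los Alamos SNL-to-C toolchain, and the states, events and actions are made up.

      # Generic table-driven, event-driven state machine in the spirit of the
      # STD-to-state-program translation described above (not the actual SNL/C code).

      TRANSITIONS = {
          # (state, event): (next_state, action)
          ("idle",    "start"): ("running", lambda: print("open valve")),
          ("running", "fault"): ("halted",  lambda: print("close valve, raise alarm")),
          ("running", "stop"):  ("idle",    lambda: print("close valve")),
          ("halted",  "reset"): ("idle",    lambda: print("clear alarm")),
      }

      def run(events, state="idle"):
          """Drive the state program from a stream of external events."""
          for ev in events:
              next_state, action = TRANSITIONS.get((state, ev), (state, None))
              if action:
                  action()
              state = next_state
          return state

      if __name__ == "__main__":
          print("final state:", run(["start", "fault", "reset", "start", "stop"]))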

  8. TRING: a computer program for calculating radionuclide transport in groundwater

    International Nuclear Information System (INIS)

    Maul, P.R.

    1984-12-01

    The computer program TRING is described which enables the transport of radionuclides in groundwater to be calculated for use in long term radiological assessments using methods described previously. Examples of the areas of application of the program are activity transport in groundwater associated with accidental spillage or leakage of activity, the shutdown of reactors subject to delayed decommissioning, shallow land burial of intermediate level waste and geologic disposal of high level waste. Some examples of the use of the program are given, together with full details to enable users to run the program. (author)

  9. Monteray Mark-I: Computer program (PC-version) for shielding calculation with Monte Carlo method

    International Nuclear Information System (INIS)

    Pudjijanto, M.S.; Akhmad, Y.R.

    1998-01-01

    A computer program for gamma-ray shielding calculations using the Monte Carlo method has been developed. The program is written in the WATFOR77 language. MONTERAY MARK-I was originally developed by James Wood. The program was modified by the authors so that the modified version is easily executed. Applying the Monte Carlo method, the program follows gamma photon transport in an infinite planar shield of various thicknesses. A photon is followed until it escapes from the shield or its energy falls below the cut-off energy. The pair production process is treated as a pure absorption process; annihilation photons generated in the process are neglected in the calculation. The output data calculated by the program are the total albedo, build-up factor, and photon spectra. The calculated build-up factors for lead and water slabs with a 6 MeV parallel-beam gamma source are in agreement with published data. Hence the program is adequate as a shielding design tool for studying gamma radiation transport in various media
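
    To make the kind of calculation concrete, the sketch below runs a deliberately simplified one-group Monte Carlo for a beam normally incident on a slab, lumping pair production and photoelectric absorption into a single absorption probability (the abstract notes that pair production is treated as pure absorption). Real codes such as the one described sample energy-dependent cross sections and Compton kinematics; everything here is illustrative.

      # Highly simplified one-group Monte Carlo for a gamma beam normally incident
      # on a slab (not the MONTERAY code): isotropic re-scattering, with absorption
      # events covering photoelectric capture and pair production.
      import math, random

      def slab_transmission(thickness_mfp, p_absorb, histories=100_000):
          transmitted = 0
          for _ in range(histories):
              x, mu = 0.0, 1.0                              # depth in mean free paths, direction cosine
              while True:
                  x += mu * -math.log(1.0 - random.random())  # sample free flight
                  if x >= thickness_mfp:
                      transmitted += 1
                      break
                  if x < 0.0:                                 # backscattered out (albedo)
                      break
                  if random.random() < p_absorb:              # absorbed
                      break
                  mu = 2.0 * random.random() - 1.0            # crude isotropic re-scatter
          return transmitted / histories

      if __name__ == "__main__":
          t = slab_transmission(thickness_mfp=2.0, p_absorb=0.3)
          print(f"transmitted fraction {t:.3f}, crude build-up ~ {t / math.exp(-2.0):.2f}")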

  10. Calculation of new snow densities from sub-daily automated snow measurements

    Science.gov (United States)

    Helfricht, Kay; Hartl, Lea; Koch, Roland; Marty, Christoph; Lehning, Michael; Olefs, Marc

    2017-04-01

    ...the 1:10 approximation (i.e. 100 kg m-3), which is mainly based on daily values in the Alps. Variations in new snow density could not be explained in a satisfactory manner using meteorological data measured at the same location. Likewise, some of the tested parametrizations of new snow density, which primarily use air temperature as a proxy, result in median new snow densities close to the ones from automated measurements, but show only a low correlation between calculated and measured new snow densities. The case study on the influence of snow settling on HN resulted on average in an underestimation of HN by 17%, which corresponds to 2-3% of the cumulated HN from the previous 24 hours. Therefore, the mean hourly new snow densities may be overestimated by 14%. The analysis in this study is especially limited with respect to the meteorological influence on the HS measurement using ultra-sonic rangers. Nevertheless, the reasonable mean values encourage calculating new snow densities from standard hydro-meteorological measurements using more precise observation devices such as optical snow depth sensors and more sensitive scales for SWE measurements also on sub-daily time-scales.
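
    The quantity at the heart of the study is straightforward: new snow density is the ratio of the sub-daily SWE increase to the corresponding snow depth increase. A minimal sketch of that relation, with hypothetical sensor increments, is given below.

      # Minimal sketch of the basic relation used to derive new snow density from
      # automated measurements: rho_new = delta_SWE / delta_HS (names illustrative).

      def new_snow_density(delta_swe_mm, delta_hs_cm):
          """delta_swe_mm: SWE increase in mm w.e. (= kg m-2);
          delta_hs_cm: snow depth increase in cm. Returns density in kg m-3."""
          if delta_hs_cm <= 0:
              raise ValueError("no new snow accumulation in this interval")
          return delta_swe_mm / (delta_hs_cm / 100.0)

      if __name__ == "__main__":
          # e.g. 6 mm w.e. gained while the snowpack deepened by 8 cm in one hour
          print(f"{new_snow_density(6.0, 8.0):.0f} kg m-3")   # -> 75 kg m-3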

  11. Computer automation of continuous-flow analyzers for trace constituents in water. Volume 4. Description of program segments. Part 2. TAAINRE

    International Nuclear Information System (INIS)

    Crawford, R.W.

    1979-01-01

    TAAINRE, the second program in the series of programs necessary for automating the Technicon AutoAnalyzer, is presented. A flow chart and sequence list that describes and illustrates the function of each logical group of coding, and a description of the contents and function of each section and subroutine in the program, is included. In addition, all arrays, strings, and variables are listed and defined, and a sample program listing with a complete list of symbols and references is provided

  12. Computer automation of continuous-flow analyzers for trace constituents in water. Volume 4. Description of program segments. Part 1. TAAIN

    International Nuclear Information System (INIS)

    Crawford, R.W.

    1979-01-01

    This report describes TAAIN, the first program in the series of programs necessary in automating the Technicon AutoAnalyzer. A flow chart and sequence list that describes and illustrates each logical group of coding, and a description of the contents and functions of each section and subroutine in the program is included. In addition, all arrays, strings, and variables are listed and defined, and a sample program listing with a complete list of symbols and references is provided

  13. Space power subsystem automation technology

    Science.gov (United States)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  14. Study on the output from programs in calculating lattice with transverse coupling

    International Nuclear Information System (INIS)

    Xu Jianming

    1994-01-01

    SYNCH and MAD outputs in calculating a lattice with coordinate rotation have been studied. The result shows that the four dispersion functions given by the SYNCH output in this case are wrong. There are large discrepancies between the Twiss parameters given by these two programs. One has to be careful when using these programs to calculate or match lattices with coordinate rotations (coupling between the two transverse motions) so as to avoid wrong results

  15. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUIs. Some Python programming experience is assumed.

  16. MP.EXE, a Calculation Program for Pressure Reciprocity Calibration of Microphones

    DEFF Research Database (Denmark)

    Rasmussen, Knud

    1998-01-01

    A computer program is described which calculates the pressure sensitivity of microphones based on measurements of the electrical transfer impedance in a reciprocity calibration set-up. The calculations are performed according to the International Standard IEC 61094-2. In addition, a number of options...
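
    The record does not reproduce the IEC 61094-2 equations, but the final algebraic step of a three-microphone reciprocity calibration is easy to illustrate: once the pairwise sensitivity products have been determined from the measured electrical transfer impedances and the coupler's acoustic transfer impedance, the individual sensitivities follow from the three products. The sketch below shows only that step, with hypothetical numbers; it is not MP.EXE itself.

      # Last algebraic step of a three-microphone reciprocity calibration
      # (illustrative, not MP.EXE): recover each sensitivity from the pairwise
      # products M_A*M_B determined earlier in the procedure.
      import math

      def sensitivities(p_ab, p_ac, p_bc):
          """p_xy = M_x * M_y in (V/Pa)^2; returns (M_a, M_b, M_c) in V/Pa."""
          m_a = math.sqrt(p_ab * p_ac / p_bc)
          return m_a, p_ab / m_a, p_ac / m_a

      if __name__ == "__main__":
          # Hypothetical products for three laboratory standard microphones
          ma, mb, mc = sensitivities(2.5e-3, 2.4e-3, 2.6e-3)
          for name, m in zip("ABC", (ma, mb, mc)):
              print(f"M_{name} = {m*1000:.2f} mV/Pa ({20*math.log10(m):.1f} dB re 1 V/Pa)")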

  17. Automatic capability to store and retrieve component data and to calculate structural integrity of these components

    International Nuclear Information System (INIS)

    McKinnis, C.J.; Toor, P.M.

    1985-01-01

    In structural analysis, assimilation of material, geometry, and service history input parameters is very cumbersome. Quite often with changing service history and revised material properties and geometry, an analysis has to be repeated. To overcome the above mentioned difficulties, a computer program was developed to provide the capability to establish a computerized library of all material, geometry, and service history parameters for components. The program also has the capability to calculate the structural integrity based on the Arrhenius type equations, including the probability calculations. This unique combination of computerized input information storage and automated analysis procedure assures consistency, efficiency, and accuracy when the hardware integrity has to be reassessed
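
    The abstract mentions structural integrity calculations based on Arrhenius-type equations. The sketch below illustrates the general form of such a calculation, a temperature-dependent degradation rate accumulated over a stored service history, using entirely hypothetical parameters rather than anything from the program described.

      # Illustrative Arrhenius-type degradation-rate and consumed-life estimate
      # (all parameter values are hypothetical, not from the program described above).
      import math

      K_BOLTZMANN_EV = 8.617e-5   # eV/K

      def arrhenius_rate(a, ea_ev, temp_k):
          """Degradation rate = A * exp(-Ea / (k*T))."""
          return a * math.exp(-ea_ev / (K_BOLTZMANN_EV * temp_k))

      def consumed_life_fraction(history, a, ea_ev):
          """history: list of (hours, temperature_K) service segments.
          Sums damage as rate * time against an allowed damage of 1.0."""
          return sum(arrhenius_rate(a, ea_ev, t_k) * hours for hours, t_k in history)

      if __name__ == "__main__":
          service = [(8000, 313.15), (4000, 333.15), (2000, 353.15)]   # hypothetical history
          frac = consumed_life_fraction(service, a=1.0e8, ea_ev=0.8)
          print(f"consumed life fraction: {frac:.3f}"
                + ("  (limit exceeded)" if frac > 1.0 else ""))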

  18. Home automation with Intel Galileo

    CERN Document Server

    Dundar, Onur

    2015-01-01

    This book is for anyone who wants to learn Intel Galileo for home automation and cross-platform software development. No knowledge of programming with Intel Galileo is assumed, but knowledge of the C programming language is essential.

  19. Validation experience with the core calculation program karate

    International Nuclear Information System (INIS)

    Hegyi, Gy.; Hordosy, G.; Kereszturi, A.; Makai, M.; Maraczy, Cs.

    1995-01-01

    A relatively fast and easy-to-handle modular code system named KARATE-440 has been elaborated for steady-state operational calculations of VVER-440 type reactors. It is built up from cell, assembly and global calculations. Within the framework of the program, the neutron-physical and thermohydraulic processes of the core at normal startup, steady state and slow transients can be simulated. The verification and validation of the global code have been carried out recently. The test cases include mathematical benchmarks and measurements on operating VVER-440 units. A summary of the results, such as startup parameters, boron letdown curves, and radial and axial power distributions for some cycles of the Paks NPP, is presented. (author)

  20. Program for photon shielding calculations. Examination of approximations on irradiation geometries

    International Nuclear Information System (INIS)

    Isozumi, Yasuhito; Ishizuka, Fumihiko; Miyatake, Hideo; Kato, Takahisa; Tosaki, Mitsuo

    2004-01-01

    Penetration factors and related numerical data in the 'Manual of Practical Shield Calculation of Radiation Facilities (2000)', which correspond to the irradiation geometries of a point isotropic source in an infinitely thick material (PI), a point isotropic source in a material of finite thickness (PF) and vertical incidence on a material of finite thickness (VF), have been carefully examined. The shield calculation based on the PI geometry is usually performed with the effective dose penetration factors of radioisotopes given in the 'manual'. The present work clearly shows that such a calculation may lead to an overestimate of more than a factor of two, especially for thick shields of concrete and water. Employing the numerical data in the 'manual', we have constructed a simple computer program for the estimation of penetration factors and effective doses of radioisotopes in the different irradiation geometries, i.e., PI, PF and VF. The program can also be used to calculate the effective dose from a set of radioisotopes at different positions, which is necessary for the γ-ray shielding of radioisotope facilities. (author)
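
    The kind of summation the program performs for a set of radioisotopes can be sketched as follows: each source contributes its activity times a dose-rate constant over distance squared, reduced by the geometry-dependent penetration factor of the shield. The constants and penetration factors below are placeholders; in the program they come from the tables of the 'manual'.

      # Minimal sketch of the effective-dose summation described above. All numbers
      # are placeholders; the program takes dose-rate constants and penetration
      # factors from the tables of the shielding 'manual'.

      def dose_rate_uSv_h(sources):
          """sources: list of dicts with activity A_MBq (MBq), dose-rate constant
          gamma (uSv*m^2/(MBq*h)), distance d_m (m) and penetration factor P."""
          return sum(s["A_MBq"] * s["gamma"] * s["P"] / s["d_m"] ** 2 for s in sources)

      if __name__ == "__main__":
          sources = [
              {"A_MBq": 3.7e3, "gamma": 0.078, "d_m": 2.0, "P": 1.2e-2},  # hypothetical source 1
              {"A_MBq": 1.0e3, "gamma": 0.310, "d_m": 3.0, "P": 4.0e-3},  # hypothetical source 2
          ]
          print(f"effective dose rate: {dose_rate_uSv_h(sources):.2f} uSv/h")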

  1. "First generation" automated DNA sequencing technology.

    Science.gov (United States)

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  2. An automated dose tracking system for adaptive radiation therapy.

    Science.gov (United States)

    Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J

    2018-02-01

    The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Data were used from patient images, exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h and the manual work was limited to 13 min per case with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient

  3. Thermal Hydraulic Fortran Program for Steady State Calculations of Plate Type Fuel Research Reactors

    International Nuclear Information System (INIS)

    Khedr, H.

    2008-01-01

    The safety assessment of Research and Power Reactors is a continuous process over their lifetime and requires verified and validated codes. Power Reactor codes all over the world are well established and qualified against real measured data and qualified experimental facilities. These codes are usually sophisticated, require special skills and consume considerable running time. On the other hand, most Research Reactor codes still require more data for validation and qualification. Therefore it is beneficial for a regulatory body and for companies working in the area of Research Reactor assessment and design to have their own program that gives them a quick judgment. The present paper introduces a simple one-dimensional Fortran program called THDSN for steady-state, best-estimate Thermal Hydraulic (TH) calculations of plate-type fuel RRs. Besides calculating the fuel and coolant temperature distributions and the pressure gradient in an average and a hot channel, the program calculates the safety limits and margins against the critical phenomena encountered in RRs, such as the burnout heat flux and the onset of flow instability. Well-known TH correlations for calculating the safety parameters are used. The THDSN program is verified by comparing its results for the 2 and 10 MW benchmark reactors with those published in IAEA publications, and good agreement is found. The program results are also compared with those published for other programs such as PARET and TERMIC. An extension of this program is underway to cover transient TH calculations
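
    The core of any such steady-state channel calculation is an axial energy balance on the coolant. The sketch below integrates a chopped-cosine heat flux along a plate-fuel channel to obtain the coolant temperature profile; it shows only the energy-balance step with hypothetical channel data, not THDSN itself, which additionally evaluates clad and fuel temperatures, pressure drop and safety margins from engineering correlations.

      # Sketch of the steady-state coolant energy balance along a heated plate-fuel
      # channel (illustrative only; channel data are hypothetical).
      import math

      def coolant_axial_profile(t_in, m_dot, cp, q_peak, heated_width, length, n=50):
          """Axial coolant temperatures for a chopped-cosine heat flux
          q''(z) = q_peak * cos(pi*(z - L/2)/L), integrated step by step."""
          dz = length / n
          temps, t = [], t_in
          for i in range(n):
              z = (i + 0.5) * dz
              q_flux = q_peak * math.cos(math.pi * (z - length / 2.0) / length)
              t += q_flux * heated_width * dz / (m_dot * cp)   # dT = dQ / (m_dot*cp)
              temps.append(t)
          return temps

      if __name__ == "__main__":
          # Hypothetical channel: 0.1 kg/s water flow, cp ~ 4180 J/(kg K)
          profile = coolant_axial_profile(t_in=38.0, m_dot=0.1, cp=4180.0,
                                          q_peak=4.0e5, heated_width=0.07, length=0.6)
          print(f"coolant outlet temperature ~ {profile[-1]:.1f} C")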

  4. BUCKL: a program for rapid calculation of x-ray deposition

    International Nuclear Information System (INIS)

    Cole, R.K. Jr.

    1970-07-01

    A computer program is described which has the fast execution time of exponential codes but also evaluates the effects of fluorescence and scattering. The program makes use of diffusion calculations with a buckling correction included to approximate the effects of finite transverse geometry. Theory and derivations necessary for the BUCKL code are presented, and the code results are compared with those of earlier codes for a variety of problems. Inputs and outputs of the program are described, and a FORTRAN listing is provided. Shortcomings of the program are discussed and suggestions are provided for possible future improvement. (U.S.)

  5. Development of HyPEP, A Hydrogen Production Plant Efficiency Calculation Program

    International Nuclear Information System (INIS)

    Lee, Young Jin; Park, Ji Won; Lee, Won Jae; Shin, Young Joon; Kim, Jong Ho; Hong, Sung Deok; Lee, Seung Wook; Hwang, Moon Kyu

    2007-12-01

    Development of the HyPEP program for assessing the steady-state hydrogen production efficiency of nuclear hydrogen production facilities was carried out. The main developmental aims of the HyPEP program are the extensive application of a GUI for enhanced user friendliness and a fast numerical solution scheme. These features make the program suitable for calculations such as optimisation studies. HyPEP was developed with object-oriented programming techniques. The components of the facility were modelled as objects in a hierarchical structure in which the inheritance property of object-oriented programming was extensively applied. The Delphi programming language, which is based on Object Pascal, was used for the HyPEP development. The conservation equations for the thermal hydraulic flow network were set up, and the numerical solution scheme was developed and implemented into the HyPEP beta version. The HyPEP beta version has been developed with a working GUI and an implementation of the numerical solution scheme. Due to the premature end of this project, a fully working version of HyPEP was not produced

  6. Automated calculation of point A coordinates for CT-based high-dose-rate brachytherapy of cervical cancer

    Directory of Open Access Journals (Sweden)

    Hyejoo Kang

    2017-07-01

    Full Text Available Purpose: The goal is to develop a stand-alone application, which automatically and consistently computes the coordinates of the dose calculation point recommended by the American Brachytherapy Society (i.e., point A) based solely on the implanted applicator geometry for cervical cancer brachytherapy. Material and methods: The application calculates point A coordinates from the source dwell geometries in the computed tomography (CT) scans, and outputs the 3D coordinates in the left and right directions. The algorithm was tested on 34 CT scans of 7 patients treated with high-dose-rate (HDR) brachytherapy using tandem and ovoid applicators. A single experienced user retrospectively and manually inserted point A into each CT scan, whose coordinates were used as the "gold standard" for all comparisons. The gold standard was subtracted from the automatically calculated points, a second manual placement by the same experienced user, and the clinically used point coordinates inserted by multiple planners. Coordinate differences and corresponding variances were compared using nonparametric tests. Results: Automatically calculated, manually placed, and clinically used points agree with the gold standard to < 1 mm, 1 mm, 2 mm, respectively. When compared to the gold standard, the average and standard deviation of the 3D coordinate differences were 0.35 ± 0.14 mm from automatically calculated points, 0.38 ± 0.21 mm from the second manual placement, and 0.71 ± 0.44 mm from the clinically used point coordinates. Both the mean and standard deviations of the 3D coordinate differences were statistically significantly different from the gold standard when point A was placed by multiple users (p < 0.05, but not when placed repeatedly by a single user or when calculated automatically. There were no statistical differences in doses, which agree to within 1-2% on average for all three groups. Conclusions: The study demonstrates that the automated algorithm
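
    The paper's exact algorithm is not reproduced here, but the geometric idea can be sketched: point A lies 2 cm superior along the tandem axis from the cervical os dwell and 2 cm perpendicular to it on each side. The sketch below implements that classical construction from hypothetical dwell coordinates; the assignment of the two perpendicular directions to 'left' and 'right' is an assumption made for illustration.

      # Geometric sketch of left/right point A from applicator geometry (an
      # illustration of the classical definition, not the published algorithm).
      # All inputs are hypothetical reconstructed dwell coordinates in mm.
      import numpy as np

      def point_a_coordinates(os_dwell, tandem_tip, ovoid_left, ovoid_right):
          """All arguments are 3D points (mm) from the reconstructed applicator."""
          t = np.asarray(tandem_tip, float) - np.asarray(os_dwell, float)
          t /= np.linalg.norm(t)                       # unit vector along tandem
          lr = np.asarray(ovoid_right, float) - np.asarray(ovoid_left, float)
          lr -= np.dot(lr, t) * t                      # component perpendicular to tandem
          lr /= np.linalg.norm(lr)                     # unit left->right vector
          base = np.asarray(os_dwell, float) + 20.0 * t
          return base - 20.0 * lr, base + 20.0 * lr    # (point A left, point A right)

      if __name__ == "__main__":
          a_left, a_right = point_a_coordinates(
              os_dwell=(0, 0, 0), tandem_tip=(0, 20, 55),
              ovoid_left=(-15, 5, -5), ovoid_right=(15, 5, -5))
          print("point A left :", np.round(a_left, 1))
          print("point A right:", np.round(a_right, 1))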

  7. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    International Nuclear Information System (INIS)

    Lee, Seungmin; Seong, Poonghyun; Kim, Jonghyun

    2013-01-01

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These suggested measures express how much the automation supports human operators, but they cannot express the change in the human operators' workload, i.e., whether the workload is increased or decreased. Before considering automation rates, whether the adopted automation is good or bad should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task load index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and much work that was previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system can affect the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, which means there is no change in the human operators' task load

  8. Knowledge-based automated radiopharmaceutical manufacturing for Positron Emission Tomography

    International Nuclear Information System (INIS)

    Alexoff, D.L.

    1991-01-01

    This article describes the application of basic knowledge engineering principles to the design of automated synthesis equipment for radiopharmaceuticals used in Positron Emission Tomography (PET). Before discussing knowledge programming, an overview of the development of automated radiopharmaceutical synthesis systems for PET will be presented. Since knowledge systems will rely on information obtained from machine transducers, a discussion of the uses of sensory feedback in today's automated systems follows. Next, the operation of these automated systems is contrasted to radiotracer production carried out by chemists, and the rationale for and basic concepts of knowledge-based programming are explained. Finally, a prototype knowledge-based system supporting automated radiopharmaceutical manufacturing of 18FDG at Brookhaven National Laboratory (BNL) is described using 1stClass, a commercially available PC-based expert system shell

  9. SCMAG series of programs for calculating superconducting dipole and quadrupole magnets

    International Nuclear Information System (INIS)

    Green, M.A.

    1974-10-01

    Programs SCMAG1, SCMAG2, SCMAG3, and SCMAG4 are a group of programs used to design and calculate the characteristics of conductor dominated superconducting dipole and quadrupole magnets. These magnets are used to bend and focus beams of high energy particles and are being used to design the superconducting magnets for the LBL ESCAR accelerator. The four programs are briefly described. (TFD)

  10. Development of the RSAC Automation System for Reload Core of WH NPP

    International Nuclear Information System (INIS)

    Choi, Yu Sun; Bae, Sung Man; Koh, Byung Marn; Hong, Sun Kwan

    2006-01-01

    The Nuclear Design for a Reload Core of a Westinghouse Nuclear Power Plant consists of the 'Reload Core Model Search', 'Safety Analysis (RSAC)', and 'NDR (Nuclear Design Report) and OCAP (Operational Core Analysis Package) Generation' phases. Since scores of calculations for various accidents are required to confirm that the safety analysis assumptions are valid, the Safety Analysis (RSAC) is the most important and most time- and effort-consuming phase of the reload core design sequence. The Safety Analysis Automation System supports the core designer by automating the safety analysis calculations in the 'Safety Analysis' phase (about 20 calculations). More than 10 kinds of codes, APA (ALPHA/PHOENIX/ANC), APOLLO, VENUS, PHIRE XEFIT, INCORE, etc., are being used for Safety Analysis calculations. The Westinghouse code system needs numerous inputs and outputs, so the possibility of human error cannot be ignored during Safety Analysis calculations. To remove these inefficiencies, all input files for the Safety Analysis calculations are automatically generated and executed by the Safety Analysis Automation System. All calculation notes are generated and the calculation results are summarized in the RSAC (Reload Safety Analysis Checklist) by this system. Therefore, the Safety Analysis Automation System helps the reload core designer to perform the safety analysis of the reload core model promptly and correctly

  11. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  12. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify, that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces his task to defining fuel pin types, enrichments, assembly maps and operational parameters all through a very nice and user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  13. The standard calibration instrument automation system for the atomic absorption spectrophotometer. Part 3: Program documentation

    Science.gov (United States)

    Ryan, D. P.; Roth, G. S.

    1982-04-01

    Complete documentation of the 15 programs and 11 data files of the EPA Atomic Absorption Instrument Automation System is presented. The system incorporates the following major features: (1) multipoint calibration using first, second, or third degree regression or linear interpolation, (2) timely quality control assessments for spiked samples, duplicates, laboratory control standards, reagent blanks, and instrument check standards, (3) reagent blank subtraction, and (4) plotting of calibration curves and raw data peaks. The programs of this system are written in Data General Extended BASIC, Revision 4.3, as enhanced for multi-user, real-time data acquisition. They run in a Data General Nova 840 minicomputer under the operating system RDOS, Revision 6.2. There is a functional description, a symbol definitions table, a functional flowchart, a program listing, and a symbol cross reference table for each program. The structure of every data file is also detailed.
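
    As an illustration of the multipoint-calibration feature described above, the sketch below fits a first- to third-degree calibration curve to standards and reads a sample concentration back off the curve. It is a Python analogue of the idea, not the original Data General BASIC code, and the standard values are made up.

      # Sketch of the multipoint-calibration step (Python analogue, not the original
      # BASIC): fit absorbance vs. concentration with a 1st-3rd degree polynomial,
      # then read sample concentrations off the fitted curve.
      import numpy as np

      def fit_calibration(conc, absorbance, degree=2):
          if not 1 <= degree <= 3:
              raise ValueError("system supports 1st to 3rd degree regression")
          return np.polyfit(conc, absorbance, degree)     # coefficients, high to low

      def concentration(coeffs, abs_value, conc_range):
          """Invert the calibration curve numerically on a dense grid."""
          grid = np.linspace(*conc_range, 2001)
          curve = np.polyval(coeffs, grid)
          return grid[np.argmin(np.abs(curve - abs_value))]

      if __name__ == "__main__":
          standards_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])          # mg/L (hypothetical)
          standards_abs  = np.array([0.002, 0.051, 0.099, 0.190, 0.352])
          coeffs = fit_calibration(standards_conc, standards_abs, degree=2)
          print(f"sample ~ {concentration(coeffs, 0.150, (0.0, 4.0)):.2f} mg/L")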

  14. USB port compatible virtual instrument based automation for x-ray diffractometer setup

    International Nuclear Information System (INIS)

    Jayapandian, J.; Sheela, O.K.; Mallika, R.; Thiruarul, A.; Purniah, B.

    2004-01-01

    Windows-based virtual instrument (VI) programs in a graphical language simplify design automation in R and D laboratories. With minimal hardware and maximum support from software, the automation becomes easier and more user friendly. A novel design approach for the automation of a SIEMENS-make x-ray diffractometer setup is described in this paper. The automation is achieved with an indigenously developed virtual instrument program in LabVIEW ver. 6.0 and with a simple hardware design using an 89C2051 micro-controller compatible with the PC's USB port for the total automation of the experiment. (author)

  15. SCMAG series of programs for calculating superconducting dipole and quadrupole magnets

    International Nuclear Information System (INIS)

    Green, M.A.

    1974-01-01

    A general description is given of four computer programs for calculating the characteristics of superconducting magnets used in the bending and focusing of high-energy particle beams. The programs are being used in the design of magnets for the LBL ESCAR (Experimental Superconducting Accelerator Ring) accelerator. (U.S.)

  16. Development and application of the automated Monte Carlo biasing procedure in SAS4

    International Nuclear Information System (INIS)

    Tang, J.S.; Broadhead, B.L.

    1993-01-01

    An automated approach for biasing Monte Carlo shielding calculations is described. In particular, adjoint fluxes from a one-dimensional discrete-ordinates calculation are used to generate biasing parameters for a three-dimensional Monte Carlo calculation. The automated procedure consisting of cross-section processing, adjoint flux determination, biasing parameter generation, and the initiation of a MORSE-SGC/S Monte Carlo calculation has been implemented in the SAS4 module of the SCALE computer code system. The automated procedure has been used extensively in the investigation of both computational and experimental benchmarks for the NEACRP working group on shielding assessment of transportation packages. The results of these studies indicate that with the automated biasing procedure, Monte Carlo shielding calculations of spent fuel casks can be easily performed with minimum effort and that accurate results can be obtained at reasonable computing cost. The systematic biasing approach described in this paper can also be applied to other similar shielding problems
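
    The idea behind the biasing can be illustrated with a toy example: source cells are sampled in proportion to the product of the true source strength and the adjoint importance, and each history carries a weight equal to the ratio of the true to the biased probability so the estimate remains unbiased. The sketch below shows only this idea, not the SAS4/MORSE implementation.

      # Toy illustration of adjoint-based source biasing (the idea used in SAS4,
      # not its implementation): sample source cells in proportion to q * importance
      # and carry weights so the tally stays unbiased.
      import random

      def biased_source_sampler(q, adjoint):
          """q: true source strengths per cell; adjoint: importance per cell.
          Returns (biased_pdf, weights)."""
          scores = [qi * ai for qi, ai in zip(q, adjoint)]
          total_q, total_s = sum(q), sum(scores)
          pdf = [s / total_s for s in scores]
          true_pdf = [qi / total_q for qi in q]
          weights = [t / b for t, b in zip(true_pdf, pdf)]   # w = p_true / p_biased
          return pdf, weights

      if __name__ == "__main__":
          q       = [1.0, 1.0, 1.0, 1.0]        # uniform true source over 4 cells
          adjoint = [0.01, 0.1, 1.0, 10.0]      # hypothetical importance toward the detector
          pdf, w = biased_source_sampler(q, adjoint)
          cells = random.choices(range(4), weights=pdf, k=5)
          print("sampled cells:", cells, "weights:", [round(w[c], 3) for c in cells])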

  17. Experimental verification of photon: A program for use in x-ray shielding calculations

    International Nuclear Information System (INIS)

    Brauer, E.; Thomlinson, W.

    1987-01-01

    At the National Synchrotron Light Source, a computer program named PHOTON has been developed to calculate radiation dose values around a beam line. The output from the program must be an accurate guide to beam line shielding. To test the program, a series of measurements of radiation dose were carried out using existing beam lines; the results were compared to the theoretical calculations of PHOTON. Several different scattering geometries, scattering materials, and sets of walls and shielding materials were studied. Results of the measurements allowed many advances to be made in the program, ultimately resulting in good agreement between the theory and experiment. 3 refs., 6 figs

  18. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated comparable data to a previous study using single mice per time point liquid samples while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice studies in discovery-stage studies of protein therapeutics.

  19. Controls and automation in the SPIRAL project

    International Nuclear Information System (INIS)

    Bothner, U.; Boulot, A.; Maherault, J.; Martial, L.

    1999-01-01

    The control and automation team of the Accelerator-Exotic Beam R and D Department has had the following tasks in the framework of the SPIRAL collaboration: 1. automation of the resonator high frequency equipment of the CIME cyclotron; 2. automation of the vacuum equipment, i.e. the low energy line (TBE), the CIME cyclotron, and the low energy line (BE); 3. automation of load safety for the power supply; 4. for each of these tasks, a circuitry file based on the SCHEMA software has been worked out. The programs required in the automation of load safety for the power supply (STEP5, PROTOOL, DESIGNER 4.1) were developed and implemented for PC

  20. JTst - An Automated Unit Testing Tool for Java Program

    OpenAIRE

    Kamal Z.  Zamli; Nor A. M.  Isa

    2008-01-01

    Software testing is an integral part of the software development lifecycle. Lack of testing can often lead to disastrous consequences including loss of data, fortunes, and even lives. Despite its importance, current software testing practice lacks automation and is still primarily based on highly manual processes, from the generation of test cases up to the actual execution of the test. Although the emergence of helpful automated testing tools in the market is blooming, their adoptions are lackin...

  1. FORTRAN program for calculating liquid-phase and gas-phase thermal diffusion column coefficients

    International Nuclear Information System (INIS)

    Rutherford, W.M.

    1980-01-01

    A computer program (COLCO) was developed for calculating thermal diffusion column coefficients from theory. The program, which is written in FORTRAN IV, can be used for both liquid-phase and gas-phase thermal diffusion columns. Column coefficients for the gas phase can be based on gas properties calculated from kinetic theory using tables of omega integrals or on tables of compiled physical properties as functions of temperature. Column coefficients for the liquid phase can be based on compiled physical property tables. Program listings, test data, sample output, and a user's manual are supplied as appendices

  2. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seungmin; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Kim, Jonghyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-05-15

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These suggested measures express how much the automation supports human operators, but they cannot express the change in the human operators' workload, i.e., whether the workload is increased or decreased. Before considering automation rates, whether the adopted automation is good or bad should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task load index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and much work that was previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system can affect the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, which means there is no change in the human operators' task load.

  3. COMPUTER PROGRAM FOR CALCULATION MICROCHANNEL HEAT EXCHANGERS FOR AIR CONDITIONING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Olga V. Olshevska

    2016-08-01

    Full Text Available The aim is to create a computer program for calculating microchannel air condensers in order to reduce design time and to carry out variant calculations. Software packages for the thermophysical properties of the working substance and the coolant, correlation equations for calculating heat transfer, aerodynamics and hydrodynamics, and the thermodynamic equations for the irreversible losses and their minimization in the heat exchanger were used in the development process. Borland Delphi 7 was used for creating the software package.

  4. The Strategic Technologies for Automation and Robotics (STEAR) program: Protection of materials in the space environment subprogram

    Science.gov (United States)

    Schmidt, Lorne R.; Francoeur, J.; Aguero, Alina; Wertheimer, Michael R.; Klemberg-Sapieha, J. E.; Martinu, L.; Blezius, J. W.; Oliver, M.; Singh, A.

    1995-01-01

    Three projects are currently underway for the development of new coatings for the protection of materials in the space environment. These coatings are based on vacuum deposition technologies. The projects will go as far as the proof-of-concept stage when the commercial potential for the technology will be demonstrated on pilot-scale fabrication facilities in 1996. These projects are part of a subprogram to develop supporting technologies for automation and robotics technologies being developed under the Canadian Space Agency's STEAR Program, part of the Canadian Space Station Program.

  5. NASA Systems Autonomy Demonstration Project - Development of Space Station automation technology

    Science.gov (United States)

    Bull, John S.; Brown, Richard; Friedland, Peter; Wong, Carla M.; Bates, William

    1987-01-01

    A 1984 Congressional expansion of the 1958 National Aeronautics and Space Act mandated that NASA conduct programs, as part of the Space Station program, which will yield the U.S. material benefits, particularly in the areas of advanced automation and robotics systems. Demonstration programs are scheduled for automated systems such as the thermal control, expert system coordination of Station subsystems, and automation of multiple subsystems. The programs focus the R&D efforts and provide a gateway for transfer of technology to industry. The NASA Office of Aeronautics and Space Technology is responsible for directing, funding and evaluating the Systems Autonomy Demonstration Project, which will include simulated interactions between novice personnel and astronauts and several automated, expert subsystems to explore the effectiveness of the man-machine interface being developed. Features and progress on the TEXSYS prototype thermal control system expert system are outlined.

  6. Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS

    Science.gov (United States)

    Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    We present views and analysis of the execution of several PVM (Parallel Virtual Machine) codes for Computational Fluid Dynamics on a network of Sparcstations, including: (1) NAS Parallel Benchmarks CG and MG; (2) a multi-partitioning algorithm for NAS Parallel Benchmark SP; and (3) an overset grid flowsolver. These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We will describe the architecture, operation and application of AIMS. The AIMS toolkit contains: (1) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (2) Monitor, a library of runtime trace-collection routines; (3) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (4) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran 77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (1) the impact of long message latencies; (2) the impact of multiprogramming overheads and associated load imbalance; (3) cache and virtual-memory effects; and (4) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs. Some of the features planned for the near future include: (1) ConfigView, showing the physical topology

  7. Automated system for calculating the uncertainty of standards

    International Nuclear Information System (INIS)

    Harvel, C.D.

    1990-01-01

    Working Calibration and Test Material (WCTM) solutions are essential as standards in the surveillance of analytical methods, the calibration of equipment and methods, and the training and testing of laboratory personnel. Before the WCTM can be used it must be characterized. That is, the WCTM concentration and its associated uncertainty must be estimated. The characterization of a WCTM is a tedious process. The chemistry and subsequent statistical analysis require a significant amount of care. For a nonstatistician, the statistical analysis of a WCTM characterization can be quite difficult. In addition, the WCTM traceability and characterization must be thoroughly documented as required by DOE Order 5633.3 [1]. An automated system can easily do the statistical analysis and provide the necessary documentation. 3 refs., 2 figs
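
    The abstract does not give the statistical treatment, but the arithmetic such a system automates is typically a GUM-style combination of a repeatability component with other independent standard uncertainties, expanded by a coverage factor. A minimal sketch with hypothetical replicate data follows.

      # Minimal sketch of the kind of uncertainty arithmetic such a system automates
      # (generic root-sum-square of independent components; the actual WCTM
      # statistical analysis is not described in the abstract).
      import math, statistics

      def characterize(replicate_concs, u_systematic, k=2.0):
          """replicate_concs: repeated analyses of the WCTM (same units).
          u_systematic: standard uncertainties from other sources (calibration,
          purity, volumetric), assumed independent. k: coverage factor."""
          mean = statistics.mean(replicate_concs)
          u_repeat = statistics.stdev(replicate_concs) / math.sqrt(len(replicate_concs))
          u_combined = math.sqrt(u_repeat ** 2 + sum(u ** 2 for u in u_systematic))
          return mean, u_combined, k * u_combined

      if __name__ == "__main__":
          mean, u_c, U = characterize([10.12, 10.08, 10.15, 10.11, 10.09],
                                      u_systematic=[0.02, 0.015], k=2.0)
          print(f"WCTM = {mean:.3f} +/- {U:.3f} (k=2), u_c = {u_c:.3f}")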

  8. Automated electrocardiogram interpretation programs versus cardiologists' triage decision making based on teletransmitted data in patients with suspected acute coronary syndrome

    DEFF Research Database (Denmark)

    Clark, Elaine N; Ripa, Maria Sejersten; Clemmensen, Peter

    2010-01-01

    The aims of this study were to assess the effectiveness of 2 automated electrocardiogram interpretation programs in patients with suspected acute coronary syndrome transported to hospital by ambulance in 1 rural region of Denmark with hospital discharge diagnosis used as the gold standard...... infarction with respect to discharge diagnosis were 78%, 91%, and 81% for LIFEPAK 12 and 78%, 94%, and 87% for the Glasgow program. Corresponding data for attending cardiologists were 85%, 90%, and 81%. In conclusion, the Glasgow program had significantly higher specificity than the LIFEPAK 12 program (p = 0...

  9. Program system for calculating streaming neutron radiation field in reactor cavity

    International Nuclear Information System (INIS)

    He Zhongliang; Zhao Shu.

    1986-01-01

    The A23 neutron albedo data base, based on the Monte Carlo method, agrees well with the SAIL albedo data base. The RSCAM program system, using the Monte Carlo method with an albedo approach, is used to calculate the streaming neutron radiation field in the reactor cavity and containment operating hall. The dose rate distributions calculated with RSCAM in a square concrete duct agree well with experiments

  10. Activity computer program for calculating ion irradiation activation

    Science.gov (United States)

    Palmer, Ben; Connolly, Brian; Read, Mark

    2017-07-01

    A computer program, Activity, was developed to predict the activity and gamma lines of materials irradiated with an ion beam. It uses the TENDL (Koning and Rochman, 2012) [1] proton reaction cross section database, the Stopping and Range of Ions in Matter (SRIM) (Biersack et al., 2010) code, a Nuclear Data Services (NDS) radioactive decay database (Sonzogni, 2006) [2] and an ENDF gamma decay database (Herman and Chadwick, 2006) [3]. An extended version of Bateman's equation is used to calculate the activity at time t, and this equation is solved analytically, with the option to also solve by numeric inverse Laplace Transform as a failsafe. The program outputs the expected activity and gamma lines of the activated material.
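
    The analytic solution of Bateman's equation mentioned above can be sketched for a simple linear decay chain with distinct decay constants and an initial population in the first member only; the three-member chain in the example is hypothetical and is not taken from the Activity code or the TENDL data.

```python
import math

def bateman(n1_0, lambdas, t):
    """Atom populations of a linear decay chain at time t, starting from
    n1_0 atoms of the first member only (classic Bateman solution; all
    decay constants must be distinct)."""
    populations = []
    for i in range(len(lambdas)):
        prod = 1.0
        for j in range(i):                     # product of preceding lambdas
            prod *= lambdas[j]
        total = 0.0
        for j in range(i + 1):
            denom = 1.0
            for k in range(i + 1):
                if k != j:
                    denom *= lambdas[k] - lambdas[j]
            total += math.exp(-lambdas[j] * t) / denom
        populations.append(n1_0 * prod * total)
    return populations

# Hypothetical 3-member chain with half-lives of 10 h, 2 h and 30 h
lams = [math.log(2) / (h * 3600.0) for h in (10.0, 2.0, 30.0)]
atoms = bateman(1e20, lams, t=3600.0)            # populations after 1 hour
print([lam * n for lam, n in zip(lams, atoms)])  # activities in Bq
```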

  11. Preliminary Framework for Human-Automation Collaboration

    International Nuclear Information System (INIS)

    Oxstrand, Johanna Helene; Le Blanc, Katya Lee; Spielman, Zachary Alexander

    2015-01-01

    The Department of Energy's Advanced Reactor Technologies Program sponsors research, development and deployment activities through its Next Generation Nuclear Plant, Advanced Reactor Concepts, and Advanced Small Modular Reactor (aSMR) Programs to promote safety, technical, economical, and environmental advancements of innovative Generation IV nuclear energy technologies. The Human Automation Collaboration (HAC) Research Project is located under the aSMR Program, which identifies developing advanced instrumentation and controls and human-machine interfaces as one of four key research areas. It is expected that the new nuclear power plant designs will employ technology significantly more advanced than the analog systems in the existing reactor fleet as well as utilizing automation to a greater extent. Moving towards more advanced technology and more automation does not necessarily imply more efficient and safer operation of the plant. Instead, a number of concerns about how these technologies will affect human performance and the overall safety of the plant need to be addressed. More specifically, it is important to investigate how the operator and the automation work as a team to ensure effective and safe plant operation, also known as the human-automation collaboration (HAC). The focus of the HAC research is to understand how various characteristics of automation (such as its reliability, processes, and modes) affect an operator's use and awareness of plant conditions. In other words, the research team investigates how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. This report addresses the Department of Energy milestone M4AT-15IN2302054, Complete Preliminary Framework for Human-Automation Collaboration, by discussing the two-phased development of a preliminary HAC framework. The framework developed in the first phase was used as

  12. Preliminary Framework for Human-Automation Collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna Helene [Idaho National Lab. (INL), Idaho Falls, ID (United States); Le Blanc, Katya Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States); Spielman, Zachary Alexander [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The Department of Energy’s Advanced Reactor Technologies Program sponsors research, development and deployment activities through its Next Generation Nuclear Plant, Advanced Reactor Concepts, and Advanced Small Modular Reactor (aSMR) Programs to promote safety, technical, economical, and environmental advancements of innovative Generation IV nuclear energy technologies. The Human Automation Collaboration (HAC) Research Project is located under the aSMR Program, which identifies developing advanced instrumentation and controls and human-machine interfaces as one of four key research areas. It is expected that the new nuclear power plant designs will employ technology significantly more advanced than the analog systems in the existing reactor fleet as well as utilizing automation to a greater extent. Moving towards more advanced technology and more automation does not necessarily imply more efficient and safer operation of the plant. Instead, a number of concerns about how these technologies will affect human performance and the overall safety of the plant need to be addressed. More specifically, it is important to investigate how the operator and the automation work as a team to ensure effective and safe plant operation, also known as the human-automation collaboration (HAC). The focus of the HAC research is to understand how various characteristics of automation (such as its reliability, processes, and modes) affect an operator’s use and awareness of plant conditions. In other words, the research team investigates how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. This report addresses the Department of Energy milestone M4AT-15IN2302054, Complete Preliminary Framework for Human-Automation Collaboration, by discussing the two-phased development of a preliminary HAC framework. The framework developed in the first phase was used as the

  13. New automated pellet/powder assay system

    International Nuclear Information System (INIS)

    Olsen, R.N.

    1975-01-01

    This paper discusses an automated, high-precision pellet/powder assay system. The system is an active assay system using a small isotopic neutron source and a coincidence detection system. The handling of the pellet/powder samples has been automated, and a programmable calculator has been integrated into the system to provide control and data analysis. The versatile system can assay uranium or plutonium in either active or passive modes

  14. Commentary on "Performance of a glucose meter with a built-in automated bolus calculator versus manual bolus calculation in insulin-using subjects".

    Science.gov (United States)

    Rossetti, Paolo; Vehí, Josep; Revert, Ana; Calm, Remei; Bondia, Jorge

    2012-03-01

    Since the early 2000s, there has been an exponentially increasing development of new diabetes-applied technology, such as continuous glucose monitoring, bolus calculators, and "smart" pumps, with the expectation of partially overcoming clinical inertia and low patient compliance. However, its long-term efficacy in glucose control has not been unequivocally proven. In this issue of Journal of Diabetes Science and Technology, Sussman and colleagues evaluated a tool for the calculation of the prandial insulin dose. A total of 205 insulin-treated patients were asked to compute a bolus dose in two simulated conditions either manually or with the bolus calculator built into the FreeStyle InsuLinx meter, revealing the high frequency of wrong calculations when performed manually. Although the clinical impact of this study is limited, it highlights the potential implications of low diabetes-related numeracy in poor glycemic control. Educational programs aiming to increase patients' empowerment and caregivers' knowledge are needed in order to get the full benefit of the technology. © 2012 Diabetes Technology Society.
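
    The meter's built-in algorithm is not described in the abstract, but the manual calculation it replaces is commonly the textbook bolus formula: carbohydrate coverage plus a correction dose minus insulin on board. The sketch below implements only that generic formula, with illustrative parameter values; it does not reproduce the FreeStyle InsuLinx logic.

```python
def suggested_bolus(carbs_g, bg_mgdl, target_mgdl,
                    icr_g_per_u, isf_mgdl_per_u, iob_u=0.0):
    """Generic prandial bolus estimate (insulin units): carbohydrate
    coverage plus correction dose minus insulin on board, clipped at zero."""
    meal = carbs_g / icr_g_per_u                     # carbohydrate coverage
    correction = (bg_mgdl - target_mgdl) / isf_mgdl_per_u
    return max(0.0, meal + correction - iob_u)

# 60 g carbs, glucose 180 mg/dL, target 110 mg/dL, ICR 10 g/U, ISF 35 mg/dL/U
print(round(suggested_bolus(60, 180, 110, 10, 35), 1))   # -> 8.0 units
```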

  15. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    derivation process, the optimized automation rate is estimated through integrating the automation rate and ostracism rate according to the decreasing rate of working time. The automation rate that induces the most decreased rate of working time is calculated, and it is explained that the estimated automation rate is the optimized automation rate that provides the best operator performance for the circumstances. It is expected that the proposed automation rate optimization method will be useful in introducing automation with assurance of the best human performance.

  16. Automating ASW fusion

    OpenAIRE

    Pabelico, James C.

    2011-01-01

    Approved for public release; distribution is unlimited. This thesis examines ASW eFusion, an anti-submarine warfare (ASW) tactical decision aid (TDA) that utilizes Kalman filtering to improve battlespace awareness by simplifying and automating the track management process involved in anti-submarine warfare (ASW) watchstanding operations. While this program can currently help the ASW commander manage uncertainty and make better tactical decisions, the program has several limitations. Comman...

  17. Automation of the Jarrell--Ash model 70-314 emission spectrometer

    International Nuclear Information System (INIS)

    Morris, W.F.; Fisher, E.R.; Taber, L.

    1978-01-01

    Automation of the Jarrell-Ash 3.4-Meter Ebert direct-reading emission spectrometer with digital scaler readout is described. The readout is interfaced to a Data General NOVA 840 minicomputer. The automation code consists of BASIC language programs for interactive routines, data processing, and report generation. Call statements within the BASIC programs invoke assembly language routines for real-time data acquisition and control. In addition, the automation objectives as well as the spectrometer-computer system functions, coding, and operating instructions are presented

  18. Development, application and current status of the Reactor Imitator calculation program

    International Nuclear Information System (INIS)

    Aver'yanova, S.P.; Kovel', A.I.; Mamichev, V.V.; Filimonov, P.E.

    2008-01-01

    Features of the Reactor Imitator (IR) calculation program for simulating WWER-1000 operation are discussed. It is noted that using IR at an NPP enables on-line operation of the design program (BIPR-7). This offers a new means, on the one hand, for efficient prediction and informational support of the operator and, on the other hand, for verification and development of the calculational scheme and neutron-physics model of the WWER-1000 design program [ru]

  19. Automated Speech Rate Measurement in Dysarthria

    Science.gov (United States)

    Martens, Heidi; Dekens, Tomas; Van Nuffelen, Gwen; Latacz, Lukas; Verhelst, Werner; De Bodt, Marc

    2015-01-01

    Purpose: In this study, a new algorithm for automated determination of speech rate (SR) in dysarthric speech is evaluated. We investigated how reliably the algorithm calculates the SR of dysarthric speech samples when compared with calculation performed by speech-language pathologists. Method: The new algorithm was trained and tested using Dutch…

  20. DWPI: a computer program to calculate the inelastic scattering of pions from nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, R A; Miller, G A [Carnegie-Mellon Univ., Pittsburgh, Pa. (USA). Dept. of Physics

    1976-02-01

    Angular distributions for the inelastic scattering of pions are generated using the distorted wave impulse approximation (DWIA). The cross section for a given transition is calculated by summing a partial wave expansion. The T-matrix elements are calculated using distorted pion waves from the program PIRK, and therefore include elastic scattering to all orders. The excitation is treated in first order only. Several optical potentials and nuclear densities are available in the program. The transition form factor may be uncoupled from the ground-state density. Coulomb excitation, which interferes coherently with the strong interaction, is a program option.

  1. Automated Work Packages Prototype: Initial Design, Development, and Evaluation. Light Water Reactor Sustainability Program

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna Helene [Idaho National Lab. (INL), Idaho Falls, ID (United States); Al Rashdan, Ahmad [Idaho National Lab. (INL), Idaho Falls, ID (United States); Le Blanc, Katya Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bly, Aaron Douglas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Agarwal, Vivek [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-07-01

    The goal of the Automated Work Packages (AWP) project is to demonstrate how to enhance work quality, cost management, and nuclear safety through the use of advanced technology. The work described in this report is part of the digital architecture for a highly automated plant project of the technical program plan for advanced instrumentation, information, and control (II&C) systems technologies. This report addresses the DOE Milestone M2LW-15IN0603112: Describe the outcomes of field evaluations/demonstrations of the AWP prototype system and plant surveillance and communication framework requirements at host utilities. A brief background to the need for AWP research is provided, then two human factors field evaluation studies are described. These studies focus on the user experience of conducting a task (in this case a preventive maintenance and a surveillance test) while using an AWP system. The remaining part of the report describes an II&C effort to provide real time status updates to the technician by wireless transfer of equipment indications and a dynamic user interface.

  2. Automated Work Packages Prototype: Initial Design, Development, and Evaluation. Light Water Reactor Sustainability Program

    International Nuclear Information System (INIS)

    Oxstrand, Johanna Helene; Al Rashdan, Ahmad; Le Blanc, Katya Lee; Bly, Aaron Douglas; Agarwal, Vivek

    2015-01-01

    The goal of the Automated Work Packages (AWP) project is to demonstrate how to enhance work quality, cost management, and nuclear safety through the use of advanced technology. The work described in this report is part of the digital architecture for a highly automated plant project of the technical program plan for advanced instrumentation, information, and control (II&C) systems technologies. This report addresses the DOE Milestone M2LW-15IN0603112: Describe the outcomes of field evaluations/demonstrations of the AWP prototype system and plant surveillance and communication framework requirements at host utilities. A brief background to the need for AWP research is provided, then two human factors field evaluation studies are described. These studies focus on the user experience of conducting a task (in this case a preventive maintenance and a surveillance test) while using an AWP system. The remaining part of the report describes an II&C effort to provide real time status updates to the technician by wireless transfer of equipment indications and a dynamic user interface.

  3. WAD, a program to calculate the heat produced by alpha decay

    International Nuclear Information System (INIS)

    Jarvis, R.G.; Bretzlaff, C.I.

    1982-09-01

    The FORTRAN program WAD (Watts from Alpha Decay) deals with the alpha and beta decay chains to be encountered in advanced fuel cycles for CANDU reactors. The data library covers all necessary alpha-emitting and beta-emitting nuclides and the program calculates the heat produced by alpha decay. Any permissible chain can be constructed very simply

  4. Operational experiences with automated acoustic burst classification by neural networks

    International Nuclear Information System (INIS)

    Olma, B.; Ding, Y.; Enders, R.

    1996-01-01

    Monitoring of Loose Parts Monitoring System sensors for signal bursts associated with metallic impacts of loose parts has proved to be a useful methodology for on-line assessment of the mechanical integrity of components in the primary circuit of nuclear power plants. With the availability of neural networks, new powerful possibilities for classification and diagnosis of burst signals can be realized for acoustic monitoring with the on-line system RAMSES. In order to look for relevant burst signals, an automated classification is needed; that is, acoustic signature analysis and assessment has to be performed automatically on-line. A back-propagation neural network based on five pre-calculated signal parameter values has been set up for the identification of different signal types. During a three-month monitoring program of medium-operated check valves, burst signals have been measured and classified separately according to their cause. The successful results of the three measurement campaigns with an automated burst type classification are presented. (author)

  5. Development and application of the automated Monte Carlo biasing procedure in SAS4

    International Nuclear Information System (INIS)

    Tang, J.S.; Broadhead, B.L.

    1995-01-01

    An automated approach for biasing Monte Carlo shielding calculations is described. In particular, adjoint fluxes from a one-dimensional discrete-ordinates calculation are used to generate biasing parameters for a three-dimensional Monte Carlo calculation. The automated procedure consisting of cross-section processing, adjoint flux determination, biasing parameter generation, and the initiation of a MORSE-SGC/S Monte Carlo calculation has been implemented in the SAS4 module of the SCALE computer code system. (author)

  6. ParShield: A computer program for calculating attenuation parameters of the gamma rays and the fast neutrons

    International Nuclear Information System (INIS)

    Elmahroug, Y.; Tellili, B.; Souga, C.; Manai, K.

    2015-01-01

    Highlights: • Description of the theoretical method used by the ParShield program. • Description of the ParShield program. • Test and validation of the ParShield program. - Abstract: This study aims to present a new computer program, called ParShield, which determines neutron and gamma-ray shielding parameters. The program can calculate the total mass attenuation coefficients (μ_t), the effective atomic numbers (Z_eff) and the effective electron densities (N_eff) for gamma rays, and it can also calculate the effective removal cross-sections (Σ_R) for fast neutrons for mixtures and compounds. The results obtained for gamma rays with ParShield were compared with the results calculated by the WinXcom program and with measured results. The obtained values of Σ_R were tested by comparing them with measured results, manually calculated results and the results obtained with the MERCSFN program, and excellent agreement was found between them. The ParShield program can be used as a fast and effective tool to choose and compare shielding materials, especially for the determination of Z_eff and N_eff; there are no other programs in the literature which can calculate
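
    The two quantities most often needed from such a program follow from simple mixture rules: the total mass attenuation coefficient of a compound is the weight-fraction-weighted sum of the elemental coefficients, and the effective fast-neutron removal cross-section is the sum of partial densities times elemental mass removal coefficients. The sketch below illustrates those rules with placeholder elemental values; it is not ParShield code.

```python
def mixture_mu_rho(weight_fractions, elemental_mu_rho):
    """Mass attenuation coefficient of a mixture (cm^2/g) via the mixture
    rule: (mu/rho)_mix = sum_i w_i * (mu/rho)_i."""
    return sum(w * elemental_mu_rho[el] for el, w in weight_fractions.items())

def removal_cross_section(density_g_cm3, weight_fractions, sigma_r_over_rho):
    """Effective fast-neutron removal cross-section (cm^-1):
    Sigma_R = sum_i rho_i * (Sigma_R / rho)_i with rho_i = w_i * rho."""
    return sum(density_g_cm3 * w * sigma_r_over_rho[el]
               for el, w in weight_fractions.items())

# Water-like example; the elemental coefficients are placeholders, not
# values taken from any evaluated library.
w = {"H": 0.112, "O": 0.888}
mu_rho = {"H": 0.1540, "O": 0.0870}      # cm^2/g at some photon energy
sr_rho = {"H": 0.598, "O": 0.0405}       # cm^2/g for fast-neutron removal
print(mixture_mu_rho(w, mu_rho), removal_cross_section(1.0, w, sr_rho))
```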

  7. FISPRO: a simplified computer program for general fission product formation and decay calculations

    International Nuclear Information System (INIS)

    Jiacoletti, R.J.; Bailey, P.G.

    1979-08-01

    This report describes a computer program that solves a general form of the fission product formation and decay equations over given time steps for arbitrary decay chains composed of up to three nuclides. All fission product data and operational history data are input through user-defined input files. The program is very useful in the calculation of fission product activities of specific nuclides for various reactor operational histories and accident consequence calculations

  8. DEVELOPMENT OF A DICTIONARY FOR INFORMATION SYSTEM OF AUTOMATED FIGURING OUT PERSON’S META-PROGRAMS PROFILE

    Directory of Open Access Journals (Sweden)

    Valitova, Y.O.

    2017-06-01

    Full Text Available The article describes the process of forming a dictionary for automated determination of a person's meta-program profile. Analysis revealed that personal orientation in accordance with a meta-program is manifested not only in the content of speech but also in the morphological characteristics of words, which makes it practically impossible to use existing dictionaries. During the development of a dictionary it is also important to take into consideration the presence of shortened words, abbreviations and slang words in the text. In this paper, the process of forming the dictionary is described in detail and accompanied by examples.

  9. GRUCAL: a program system for the calculation of macroscopic group constants

    International Nuclear Information System (INIS)

    Woll, D.

    1984-01-01

    Nuclear reactor calculations require material- and composition-dependent, energy-averaged neutron physical data in order to describe the interaction between neutrons and isotopes. The multigroup cross section code GRUCAL calculates these macroscopic group constants for given material compositions from the material-dependent data of the group constant library GRUBA. The instructions for calculating group constants are not fixed in the program, but are read in from an instruction file. This makes it possible to adapt GRUCAL to various problems or different group constant concepts
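
    The basic step behind any macroscopic group constant calculation is the conversion of a material composition into atom number densities and the summation Sigma_g = sum_i N_i * sigma_{i,g} over the isotopes present. A minimal sketch of that step is given below; the two-group microscopic data are placeholders, and the GRUBA library format is not modelled.

```python
AVOGADRO = 6.02214076e23

def number_densities(density_g_cm3, weight_fractions, atomic_masses):
    """Atom number densities N_i (atoms/cm^3) from the material density,
    isotope weight fractions and atomic masses (g/mol)."""
    return {iso: density_g_cm3 * w * AVOGADRO / atomic_masses[iso]
            for iso, w in weight_fractions.items()}

def macroscopic_xs(number_dens, micro_xs_barns):
    """Macroscopic cross-section per energy group (cm^-1):
    Sigma_g = sum_i N_i * sigma_{i,g}, with sigma given in barns."""
    n_groups = len(next(iter(micro_xs_barns.values())))
    return [sum(number_dens[iso] * micro_xs_barns[iso][g] * 1.0e-24
                for iso in number_dens)
            for g in range(n_groups)]

# Illustrative two-group example with placeholder microscopic data
nd = number_densities(10.4, {"U238": 0.85, "O16": 0.15},
                      {"U238": 238.05, "O16": 16.0})
print(macroscopic_xs(nd, {"U238": [8.9, 12.1], "O16": [3.7, 3.9]}))
```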

  10. Computer program FPIP-REV calculates fission product inventory for U-235 fission

    Science.gov (United States)

    Brown, W. S.; Call, D. W.

    1967-01-01

    Computer program calculates fission product inventories and source strengths associated with the operation of U-235 fueled nuclear power reactor. It utilizes a fission-product nuclide library of 254 nuclides, and calculates the time dependent behavior of the fission product nuclides formed by fissioning of U-235.

  11. Adaptive Algorithms for Automated Processing of Document Images

    Science.gov (United States)

    2011-01-01

    Adaptive Algorithms for Automated Processing of Document Images. Ph.D. dissertation, Mudit Agrawal, 2011.

  12. CALCMIN - an EXCEL™ Visual Basic application for calculating mineral structural formulae from electron microprobe analyses

    Science.gov (United States)

    Brandelik, Andreas

    2009-07-01

    CALCMIN, an open source Visual Basic program, was implemented in EXCEL™. The program was primarily developed to support geoscientists in their routine task of calculating structural formulae of minerals on the basis of chemical analysis mainly obtained by electron microprobe (EMP) techniques. Calculation programs for various minerals are already included in the form of sub-routines. These routines are arranged in separate modules containing a minimum of code. The architecture of CALCMIN allows the user to easily develop new calculation routines or modify existing routines with little knowledge of programming techniques. By means of a simple mouse-click, the program automatically generates a rudimentary framework of code using the object model of the Visual Basic Editor (VBE). Within this framework simple commands and functions, which are provided by the program, can be used, for example, to perform various normalization procedures or to output the results of the computations. For the clarity of the code, element symbols are used as variables initialized by the program automatically. CALCMIN does not set any boundaries in complexity of the code used, resulting in a wide range of possible applications. Thus, matrix and optimization methods can be included, for instance, to determine end member contents for subsequent thermodynamic calculations. Diverse input procedures are provided, such as the automated read-in of output files created by the EMP. Furthermore, a subsequent filter routine enables the user to extract specific analyses in order to use them for a corresponding calculation routine. An event-driven, interactive operating mode was selected for easy application of the program. CALCMIN leads the user from the beginning to the end of the calculation process.
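
    The common core of such structural-formula routines is the normalization of oxide weight percentages to cations per formula unit on a fixed oxygen basis. The sketch below shows that normalization for a small placeholder set of oxides; it is not CALCMIN's Visual Basic code and omits Fe2+/Fe3+ recalculation and other refinements.

```python
# Oxide: (molar mass in g/mol, cations per oxide, oxygens per oxide).
# A small placeholder subset, not CALCMIN's own tables.
OXIDES = {
    "SiO2":  (60.084, 1, 2),
    "Al2O3": (101.961, 2, 3),
    "MgO":   (40.304, 1, 1),
    "FeO":   (71.844, 1, 1),
    "CaO":   (56.077, 1, 1),
}

def structural_formula(wt_percent, oxygen_basis):
    """Cations per formula unit, normalized to a fixed number of oxygens."""
    cation_moles, oxygen_moles = {}, 0.0
    for oxide, wt in wt_percent.items():
        molar_mass, n_cat, n_oxy = OXIDES[oxide]
        moles = wt / molar_mass
        cation_moles[oxide] = moles * n_cat
        oxygen_moles += moles * n_oxy
    scale = oxygen_basis / oxygen_moles
    return {ox: round(c * scale, 3) for ox, c in cation_moles.items()}

# Forsterite-like olivine analysis normalized to 4 oxygens
print(structural_formula({"SiO2": 41.0, "MgO": 50.0, "FeO": 9.0},
                         oxygen_basis=4))
```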

  13. Calculating Program for Decommissioning Work Productivity based on Decommissioning Activity Experience Data

    Energy Technology Data Exchange (ETDEWEB)

    Song, Chan-Ho; Park, Seung-Kook; Park, Hee-Seong; Moon, Jei-kwon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    KAERI is performing research to calculate a coefficient of decommissioning work-unit productivity, which is used to estimate the time and cost of decommissioning work on the basis of decommissioning activity experience data for KRR-2. KAERI calculates decommissioning costs and manages decommissioning activity experience data through systems such as the decommissioning information management system (DECOMMIS), the Decommissioning Facility Characterization DB System (DEFACS), and the decommissioning work-unit productivity calculation system (DEWOCS). In particular, KAERI bases its decommissioning cost calculations on a coded work breakdown structure (WBS) built from the KRR-2 decommissioning activity experience data, and the defined WBS codes are used by each system to calculate decommissioning costs. In this paper, we develop a program that can calculate the decommissioning cost using the decommissioning experience of KRR-2, UCP, and other countries through the mapping of similar target facilities between an NPP and KRR-2. The paper is organized as follows: Chapter 2 discusses the decommissioning work productivity calculation method and the method of mapping decommissioning target facilities used in the calculating program for decommissioning work productivity. At KAERI, research on various decommissioning methodologies for domestic NPPs will be conducted in the near future. In particular, it is difficult to determine the cost of decommissioning an NPP facility because of the number of variables involved, such as the material, size, and radiological conditions of the target facility.

  14. Calculating Program for Decommissioning Work Productivity based on Decommissioning Activity Experience Data

    International Nuclear Information System (INIS)

    Song, Chan-Ho; Park, Seung-Kook; Park, Hee-Seong; Moon, Jei-kwon

    2014-01-01

    KAERI is performing research to calculate a coefficient of decommissioning work-unit productivity, which is used to estimate the time and cost of decommissioning work on the basis of decommissioning activity experience data for KRR-2. KAERI calculates decommissioning costs and manages decommissioning activity experience data through systems such as the decommissioning information management system (DECOMMIS), the Decommissioning Facility Characterization DB System (DEFACS), and the decommissioning work-unit productivity calculation system (DEWOCS). In particular, KAERI bases its decommissioning cost calculations on a coded work breakdown structure (WBS) built from the KRR-2 decommissioning activity experience data, and the defined WBS codes are used by each system to calculate decommissioning costs. In this paper, we develop a program that can calculate the decommissioning cost using the decommissioning experience of KRR-2, UCP, and other countries through the mapping of similar target facilities between an NPP and KRR-2. The paper is organized as follows: Chapter 2 discusses the decommissioning work productivity calculation method and the method of mapping decommissioning target facilities used in the calculating program for decommissioning work productivity. At KAERI, research on various decommissioning methodologies for domestic NPPs will be conducted in the near future. In particular, it is difficult to determine the cost of decommissioning an NPP facility because of the number of variables involved, such as the material, size, and radiological conditions of the target facility.

  15. Programmable Automated Welding System (PAWS)

    Science.gov (United States)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.

  16. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    International Nuclear Information System (INIS)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip; Meinel, Felix G.; Geyer, Lucas L.; Schoepf, U.J.; Apfaltrer, Paul; Canstein, Christian; De Cecco, Carlo Nicola

    2014-01-01

    This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV_M segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV_A calculation (automated approach). EFV and time required for measuring EFV (including software processing time and manual optimization time) for each method were recorded. Intraobserver and interobserver reliability was assessed on the prototype software measurements. T test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV_A (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970); intraobserver and interobserver reliability of the automated measurements was high (> 0.9). Automated EFV_A quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)

  17. Slicken 1.0: Program for calculating the orientation of shear on reactivated faults

    Science.gov (United States)

    Xu, Hong; Xu, Shunshan; Nieto-Samaniego, Ángel F.; Alaniz-Álvarez, Susana A.

    2017-07-01

    The slip vector on a fault is an important parameter in the study of the movement history of a fault and its faulting mechanism. Although there exist many graphical programs to represent the shear stress (or slickenline) orientations on faults, programs to quantitatively calculate the orientation of fault slip based on a given stress field are scarce. In consequence, we develop Slicken 1.0, a software to rapidly calculate the orientation of maximum shear stress on any fault plane. For this direct method of calculating the resolved shear stress on a planar surface, the input data are the unit vector normal to the involved plane, the unit vectors of the three principal stress axes, and the stress ratio. The advantage of this program is that the vertical or horizontal principal stresses are not necessarily required. Due to its nimble design using Java SE 8.0, it runs on most operating systems with the corresponding Java VM. The software program will be practical for geoscience students, geologists and engineers and will help resolve a deficiency in field geology, and structural and engineering geology.
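
    The direct method referred to above amounts to building the stress tensor from the three principal axes and the stress ratio, computing the traction on the fault plane, and removing its normal component. The sketch below follows that recipe with the ratio convention Phi = (sigma2 - sigma3)/(sigma1 - sigma3); the convention actually used by Slicken 1.0 may differ, and this is not its Java code.

```python
import numpy as np

def shear_direction(normal, s1_axis, s2_axis, s3_axis, phi):
    """Unit vector of maximum resolved shear stress on a plane.

    normal           : vector normal to the fault plane
    s1_axis..s3_axis : orthonormal unit vectors of the principal stress axes
    phi              : stress ratio, taken here as (s2 - s3) / (s1 - s3)
    """
    s1, s2, s3 = 1.0, phi, 0.0                  # relative magnitudes
    R = np.column_stack([s1_axis, s2_axis, s3_axis]).astype(float)
    sigma = R @ np.diag([s1, s2, s3]) @ R.T     # stress tensor
    n = np.asarray(normal, float)
    n /= np.linalg.norm(n)
    traction = sigma @ n
    shear = traction - (traction @ n) * n       # remove the normal component
    return shear / np.linalg.norm(shear)

# Principal axes along the coordinate axes and a plane dipping 45 degrees
n = [0.0, np.sin(np.radians(45.0)), np.cos(np.radians(45.0))]
print(shear_direction(n, [1, 0, 0], [0, 1, 0], [0, 0, 1], phi=0.5))
```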

  18. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.

  19. Automated electrocardiogram interpretation programs versus cardiologists' triage decision making based on teletransmitted data in patients with suspected acute coronary syndrome

    DEFF Research Database (Denmark)

    Clark, Elaine N; Ripa, Maria Sejersten; Clemmensen, Peter

    2010-01-01

    The aims of this study were to assess the effectiveness of 2 automated electrocardiogram interpretation programs in patients with suspected acute coronary syndrome transported to hospital by ambulance in 1 rural region of Denmark with hospital discharge diagnosis used as the gold standard...

  20. Programmable calculator programs to solve softwood volume and value equations.

    Science.gov (United States)

    Janet K. Ayer. Sachet

    1982-01-01

    This paper presents product value and product volume equations as programs for handheld calculators. These tree equations are for inland Douglas-fir, young-growth Douglas-fir, western white pine, ponderosa pine, and western larch. Operating instructions and an example are included.

  1. CREST : a computer program for the calculation of composition dependent self-shielded cross-sections

    International Nuclear Information System (INIS)

    Kapil, S.K.

    1977-01-01

    A computer program CREST for the calculation of the composition and temperature dependent self-shielded cross-sections using the shielding factor approach has been described. The code includes the editing and formation of the data library, calculation of the effective shielding factors and cross-sections, a fundamental mode calculation to generate the neutron spectrum for the system which is further used to calculate the effective elastic removal cross-sections. Studies to explore the sensitivity of reactor parameters to changes in group cross-sections can also be carried out by using the facility available in the code to temporarily change the desired constants. The final self-shielded and transport corrected group cross-sections can be dumped on cards or magnetic tape in a suitable form for their direct use in a transport or diffusion theory code for detailed reactor calculations. The program is written in FORTRAN and can be accommodated in a computer with 32 K work memory. The input preparation details, sample problem and the listing of the program are given. (author)

  2. DCHAIN: A user-friendly computer program for radioactive decay and reaction chain calculations

    International Nuclear Information System (INIS)

    East, L.V.

    1994-05-01

    A computer program for calculating the time-dependent daughter populations in radioactive decay and nuclear reaction chains is described. Chain members can have non-zero initial populations and be produced from the preceding chain member as the result of radioactive decay, a nuclear reaction, or both. As presently implemented, chains can contain up to 15 members. Program input can be supplied interactively or read from ASCII data files. Time units for half-lives, etc. can be specified during data entry. Input values are verified and can be modified if necessary, before used in calculations. Output results can be saved in ASCII files in a format suitable for including in reports or other documents. The calculational method, described in some detail, utilizes a generalized form of the Bateman equations. The program is written in the C language in conformance with current ANSI standards and can be used on multiple hardware platforms

  3. Automation and robotics technology for intelligent mining systems

    Science.gov (United States)

    Welsh, Jeffrey H.

    1989-01-01

    The U.S. Bureau of Mines is approaching the problems of accidents and efficiency in the mining industry through the application of automation and robotics to mining systems. This technology can increase safety by removing workers from hazardous areas of the mines or from performing hazardous tasks. The short-term goal of the Automation and Robotics program is to develop technology that can be implemented in the form of an autonomous mining machine using current continuous mining machine equipment. In the longer term, the goal is to conduct research that will lead to new intelligent mining systems that capitalize on the capabilities of robotics. The Bureau of Mines Automation and Robotics program has been structured to produce the technology required for the short- and long-term goals. The short-term goal of application of automation and robotics to an existing mining machine, resulting in autonomous operation, is expected to be accomplished within five years. Key technology elements required for an autonomous continuous mining machine are well underway and include machine navigation systems, coal-rock interface detectors, machine condition monitoring, and intelligent computer systems. The Bureau of Mines program is described, including status of key technology elements for an autonomous continuous mining machine, the program schedule, and future work. Although the program is directed toward underground mining, much of the technology being developed may have applications for space systems or mining on the Moon or other planets.

  4. Neural Signatures of Trust During Human-Automation Interactions

    Science.gov (United States)

    2016-04-01

    Neural Signatures of Trust during Human-Automation Interactions. Frank Krueger, George Mason University. Final Report AFRL-AFOSR-VA-TR-2016-0160, contract FA9550-13-1-0017. Only fragments of the abstract are recoverable from this record, noting that trust applies not only to other people but also to automated devices such as a Global Positioning System.

  5. An application program for fission product decay heat calculations

    International Nuclear Information System (INIS)

    Pham, Ngoc Son; Katakura, Jun-ichi

    2007-10-01

    The precise knowledge of decay heat is one of the most important factors in the safety design and operation of nuclear power facilities. Furthermore, decay heat data also play an important role in the design of fuel discharges, fuel storage and transport flasks, and in spent fuel management and processing. In this study, a new application program, called DHP (Decay Heat Power program), has been developed for exact decay heat summation calculations, uncertainty analysis, and determination of the individual contribution of each fission product. The analytical methods are applied in the program without any simplification or approximation: all linear and non-linear decay chains and 12 decay modes, including ground states and meta-stable states, are automatically identified and processed by using a decay data library and a fission yield data file, both in ENDF/B-VI format. The program's window interface is designed with optional properties and makes it very easy for users to run the code. (author)
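
    A decay heat summation reduces to P(t) = sum_i lambda_i N_i(t) E_i over the fission-product inventory, where E_i is the mean energy released per decay. The sketch below shows only that summation step for a hand-specified inventory; the nuclide data are illustrative stand-ins for what DHP reads from ENDF/B-format files.

```python
import math

# Hypothetical nuclide data: decay constant (1/s) and mean energy released
# per decay (MeV), standing in for values read from evaluated data files.
NUCLIDES = {
    "Sr-90":  {"lam": math.log(2) / (28.8 * 3.156e7), "e_mev": 0.196},
    "Y-90":   {"lam": math.log(2) / (64.0 * 3600.0),  "e_mev": 0.933},
    "Cs-137": {"lam": math.log(2) / (30.1 * 3.156e7), "e_mev": 0.187},
}
MEV_PER_S_TO_W = 1.602176634e-13

def decay_heat(inventory_atoms):
    """Total decay heat (W) and the contribution of each fission product,
    given an inventory {nuclide: number of atoms}."""
    parts = {nuc: NUCLIDES[nuc]["lam"] * n * NUCLIDES[nuc]["e_mev"]
                  * MEV_PER_S_TO_W
             for nuc, n in inventory_atoms.items()}
    return sum(parts.values()), parts

total_w, by_nuclide = decay_heat({"Sr-90": 1e22, "Y-90": 2e18, "Cs-137": 1e22})
print(total_w, by_nuclide)
```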

  6. Programs for data processing in radioimmunoassay using the HP-41C programmable calculator

    International Nuclear Information System (INIS)

    1981-09-01

    The programs described provide for analysis, with the Hewlett Packard HP-41C calculator, of counting data collected in radioimmunoassays or other related in-vitro assays. The immediate reason for their development was to assist laboratories having limited financial resources and serious problems of quality control. The programs are structured both for ''off-line'' use, with manual entry of counting data into the calculator through the keyboard, and, in a slightly altered version, for ''on-line'' use, with automatic data entry from an automatic well scintillation counter originally designed at the IAEA. Only the off-line variant of the programs is described. The programs determine from appropriate counting data the concentration of analyte in unknown specimens, and provide supplementary information about the reliability of these results and the consistency of current and past assay performance

  7. Computer program for calculation of complex chemical equilibrium compositions and applications. Part 1: Analysis

    Science.gov (United States)

    Gordon, Sanford; Mcbride, Bonnie J.

    1994-01-01

    This report presents the latest in a number of versions of chemical equilibrium and applications programs developed at the NASA Lewis Research Center over more than 40 years. These programs have changed over the years to include additional features and improved calculation techniques and to take advantage of constantly improving computer capabilities. The minimization-of-free-energy approach to chemical equilibrium calculations has been used in all versions of the program since 1967. The two principal purposes of this report are presented in two parts. The first purpose, which is accomplished here in part 1, is to present in detail a number of topics of general interest in complex equilibrium calculations. These topics include mathematical analyses and techniques for obtaining chemical equilibrium; formulas for obtaining thermodynamic and transport mixture properties and thermodynamic derivatives; criteria for inclusion of condensed phases; calculations at a triple point; inclusion of ionized species; and various applications, such as constant-pressure or constant-volume combustion, rocket performance based on either a finite- or infinite-chamber-area model, shock wave calculations, and Chapman-Jouguet detonations. The second purpose of this report, to facilitate the use of the computer code, is accomplished in part 2, entitled 'Users Manual and Program Description'. Various aspects of the computer code are discussed, and a number of examples are given to illustrate its versatility.
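
    The minimization-of-free-energy approach can be illustrated for an ideal-gas mixture by minimizing G/RT = sum_j n_j [g_j + ln(n_j P / n_tot)] subject to elemental balance. The sketch below does this numerically for a two-species H2/H system with placeholder g_j values; it is not the NASA Lewis code, which uses its own iteration scheme and thermodynamic data.

```python
import numpy as np
from scipy.optimize import minimize

# Two gaseous species, H2 and H. The dimensionless standard-state chemical
# potentials g_j = mu0_j / (RT) below are placeholders, not NASA data.
SPECIES = ["H2", "H"]
G0 = np.array([-15.0, -3.0])
A = np.array([[2.0, 1.0]])       # moles of element H per mole of species
B = np.array([2.0])              # total moles of element H available
P_BAR = 1.0                      # pressure in bar

def gibbs(n):
    """Dimensionless Gibbs energy G/RT of an ideal-gas mixture."""
    n = np.maximum(n, 1e-12)                 # keep the logarithms finite
    return float(np.sum(n * (G0 + np.log(n * P_BAR / n.sum()))))

constraints = [{"type": "eq", "fun": lambda n: A @ n - B}]   # element balance
result = minimize(gibbs, x0=np.array([0.9, 0.2]),
                  bounds=[(1e-12, None)] * len(SPECIES),
                  constraints=constraints, method="SLSQP")
print(dict(zip(SPECIES, result.x)))          # equilibrium moles of H2 and H
```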

  8. FORTRAN programs for transient eddy current calculations using a perturbation-polynomial expansion technique

    International Nuclear Information System (INIS)

    Carpenter, K.H.

    1976-11-01

    A description is given of FORTRAN programs for transient eddy current calculations in thin, non-magnetic conductors using a perturbation-polynomial expansion technique. Basic equations are presented as well as flow charts for the programs implementing them. The implementation is in two steps--a batch program to produce an intermediate data file and interactive programs to produce graphical output. FORTRAN source listings are included for all program elements, and sample inputs and outputs are given for the major programs

  9. ptchg: A FORTRAN program for point-charge calculations of electric field gradients (EFGs)

    Science.gov (United States)

    Spearing, Dane R.

    1994-05-01

    ptchg, a FORTRAN program, has been developed to calculate electric field gradients (EFG) around an atomic site in crystalline solids using the point-charge direct-lattice summation method. It uses output from the crystal structure generation program Atoms as its input. As an application of ptchg, a point-charge calculation of the EFG quadrupolar parameters around the oxygen site in SiO 2 cristobalite is demonstrated. Although point-charge calculations of electric field gradients generally are limited to ionic compounds, the computed quadrupolar parameters around the oxygen site in SiO 2 cristobalite, a highly covalent material, are in good agreement with the experimentally determined values from nuclear magnetic resonance (NMR) spectroscopy.
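
    The point-charge EFG tensor at a site is V_ab = sum_k q_k (3 x_a x_b - r^2 delta_ab) / r^5 over the surrounding charges; Vzz and the asymmetry parameter follow from its eigenvalues. The sketch below evaluates that sum for a small toy cluster rather than a converged direct-lattice summation, and it is not the ptchg FORTRAN code.

```python
import numpy as np

def efg_tensor(site, charges):
    """Point-charge EFG tensor at `site` (arbitrary units):
    V_ab = sum_k q_k * (3 x_a x_b - r^2 delta_ab) / r^5."""
    V = np.zeros((3, 3))
    for q, pos in charges:
        d = np.asarray(pos, float) - np.asarray(site, float)
        r = np.linalg.norm(d)
        V += q * (3.0 * np.outer(d, d) - r**2 * np.eye(3)) / r**5
    return V

def quadrupolar_parameters(V):
    """Vzz and asymmetry eta, with eigenvalues ordered |Vxx|<=|Vyy|<=|Vzz|."""
    vxx, vyy, vzz = sorted(np.linalg.eigvalsh(V), key=abs)
    return vzz, (vxx - vyy) / vzz

# Toy cluster of four +1 charges around the origin (not a real lattice sum)
cluster = [(1.0, (1.6, 0.0, 0.0)), (1.0, (-1.6, 0.0, 0.0)),
           (1.0, (0.0, 1.5, 0.0)), (1.0, (0.0, 0.0, 1.5))]
print(quadrupolar_parameters(efg_tensor((0.0, 0.0, 0.0), cluster)))
```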

  10. Automated x-ray inspection of composites at northrop aircraft

    International Nuclear Information System (INIS)

    Murphy, W.J. Jr.; Nutter, R.L.; Patricelli, F.

    1985-01-01

    The Northrop automated x-ray inspection system (AXIS) has evolved from a research and development program initiated in 1981 to reduce increasing inspection costs and to reduce inspection times to keep pace with increasing F/A-18A production. The goal of the program was to develop an automated production system that would meet existing inspection requirements, automate handling and alignment, and replace film for the inspection of F/A-18A composite assemblies and laminates. Originally, the program was supported entirely by Northrop internal funding. However, in 1984 it became part of the Navy Industrial Modernization Incentive Program (IMIP) with joint funding. The program was selected by the Navy because of its great potential to reduce and stabilize costs associated with F/A-18A inspections. Currently the AXIS is in the last stage of development, with final integration expected by the end of July 1985 and production implementation by the end of the year. This paper briefly describes the equipment and operation of the AXIS. Slides will be presented at the conference to further illustrate the system, including inspection results

  11. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.

  12. Are the program packages for molecular structure calculations really black boxes?

    Directory of Open Access Journals (Sweden)

    ANA MRAKOVIC

    2007-12-01

    Full Text Available In this communication it is shown that the widely held opinion that compact program packages for quantum–mechanical calculations of molecular structure can safely be used as black boxes is completely wrong. In order to illustrate this, the results of computations of equilibrium bond lengths, vibrational frequencies and dissociation energies for all homonuclear diatomic molecules involving the atoms from the first two rows of the Periodic Table, performed using the Gaussian program package are presented. It is demonstrated that the sensible use of the program requires a solid knowledge of quantum chemistry.

  13. 'PRIZE': A program for calculating collision probabilities in R-Z geometry

    International Nuclear Information System (INIS)

    Pitcher, H.H.W.

    1964-10-01

    PRIZE is an IBM7090 program which computes collision probabilities for systems with axial symmetry and outputs them on cards in suitable format for the PIP1 program. Its method of working, data requirements, output, running time and accuracy are described. The program has been used to compute non-escape (self-collision) probabilities of finite circular cylinders, and a table is given by which non-escape probabilities of slabs, finite and infinite circular cylinders, infinite square cylinders, cubes, spheres and hemispheres may quickly be calculated to 1/2% or better. (author)

  14. Automated patient dose reporting systems in digital radiology

    International Nuclear Information System (INIS)

    Collado Chamorro, P.; Sanz Freire, C. J.; Martinez Mirallas, O.; Tejada San Juan, S.; Lopez de Gammarra, M. S.

    2013-01-01

    An automated procedure for reporting patient doses in radiology has been developed. This procedure saves the time required to record the data used to calculate patient doses from measured yields, as well as the time spent transcribing these data for the necessary calculations. The system has been developed using open source software. The capabilities of digital radiography systems for automating procedures, in particular the registration of patient dose, should be taken advantage of. This procedure has been validated and is currently in use at our institution. (Author)

  15. Calculation of pressure distribution in vacuum systems using a commercial finite element program

    International Nuclear Information System (INIS)

    Howell, J.; Wehrle, B.; Jostlein, H.

    1991-01-01

    The finite element method has proven to be a very useful tool for calculating pressure distributions in complex vacuum systems. A number of finite element programs have been developed for this specific task. For those who do not have access to one of these specialized programs and do not wish to develop their own program, another option is available. Any commercial finite element program with heat transfer analysis capabilities can be used to calculate pressure distributions. The approach uses an analogy between thermal conduction and gas conduction with the quantity temperature substituted for pressure. The thermal analogies for pumps, gas loads and tube conductances are described in detail. The method is illustrated for an example vacuum system. A listing of the ANSYS data input file for this example is included. 2 refs., 4 figs., 1 tab
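
    The analogy can be made concrete with a lumped network: pressures play the role of temperatures, tube conductances of thermal conductances, gas loads of heat inputs, and pump speeds of heat sinks to a zero-pressure ambient, so the steady state is a single linear solve. The sketch below sets up such a network directly rather than through a commercial finite element program; the three-node example is hypothetical.

```python
import numpy as np

def steady_pressures(n_nodes, conductances, gas_loads, pump_speeds):
    """Steady-state pressures (mbar) of a lumped vacuum network, using the
    thermal analogy: pressure ~ temperature, tube conductance ~ thermal
    conductance, gas load ~ heat input, pump speed ~ sink to zero pressure.

    conductances : list of (i, j, C) tube conductances in L/s
    gas_loads    : per-node outgassing in mbar*L/s
    pump_speeds  : per-node pump speed in L/s (0 where there is no pump)
    """
    A = np.zeros((n_nodes, n_nodes))
    for i, j, c in conductances:
        A[i, i] += c
        A[j, j] += c
        A[i, j] -= c
        A[j, i] -= c
    A += np.diag(pump_speeds)            # pumps drain throughput to "ground"
    return np.linalg.solve(A, np.asarray(gas_loads, float))

# Three chamber sections joined by two tubes, with a pump on the last one
print(steady_pressures(3,
                       conductances=[(0, 1, 20.0), (1, 2, 20.0)],
                       gas_loads=[1e-6, 1e-6, 1e-6],
                       pump_speeds=[0.0, 0.0, 100.0]))
```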

  16. The Navy/NASA Engine Program (NNEP89): Interfacing the program for the calculation of complex Chemical Equilibrium Compositions (CEC)

    Science.gov (United States)

    Gordon, Sanford

    1991-01-01

    The NNEP is a general computer program for calculating aircraft engine performance. NNEP has been used extensively to calculate the design and off-design (matched) performance of a broad range of turbine engines, ranging from subsonic turboprops to variable cycle engines for supersonic transports. Recently, however, there has been increased interest in applications for which NNEP is not capable of simulating, such as the use of alternate fuels including cryogenic fuels and the inclusion of chemical dissociation effects at high temperatures. To overcome these limitations, NNEP was extended by including a general chemical equilibrium method. This permits consideration of any propellant system and the calculation of performance with dissociation effects. The new extended program is referred to as NNEP89.

  17. Approach to plant automation with evolving technology

    International Nuclear Information System (INIS)

    White, J.D.

    1989-01-01

    The US Department of Energy has provided support to Oak Ridge National Laboratory in order to pursue research leading to advanced, automated control of new innovative liquid-metal-cooled nuclear power plants. The purpose of this effort is to conduct research that will help to ensure improved operability, reliability, and safety for advanced LMRs. The plan adopted to achieve these program goals in an efficient and timely manner consists of utilizing, and advancing where required, state-of-the-art controls technology through close interaction with other national laboratories, universities, industry and utilities. A broad range of applications for the control systems strategies and the design environment developed in the course of this program is likely. A natural evolution of automated control in nuclear power plants is envisioned by ORNL to be a phased transition from today's situation of some analog control at the subsystem level with significant operator interaction to the future capability for completely automated digital control with operator supervision. The technical accomplishments provided by this program will assist the industry to accelerate this transition and provide greater economy and safety. The development of this transition to advanced, automated control system designs is expected to have extensive benefits in reduced operating costs, fewer outages, enhanced safety, improved licensability, and improved public acceptance for commercial nuclear power plants. 24 refs

  18. Approach to plant automation with evolving technology

    International Nuclear Information System (INIS)

    White, J.D.

    1989-01-01

    This paper reports that the U.S. Department of Energy has provided support to Oak Ridge National Laboratory in order to pursue research leading to advanced, automated control of new innovative liquid-metal-cooled nuclear power plants. The purpose of this effort is to conduct research that will help to ensure improved operability, reliability, and safety for advanced LMRs. The plan adopted to achieve these program goals in an efficient and timely manner consists of utilizing, and advancing where required, state-of-the-art controls technology through close interaction with other national laboratories, universities, industry and utilities. A broad range of applications for the control systems strategies and the design environment developed in the course of this program is likely. A natural evolution of automated control in nuclear power plants is envisioned by ORNL to be a phased transition from today's situation of some analog control at the subsystem level with significant operator interaction to the future capability for completely automated digital control with operator supervision. The technical accomplishments provided by this program will assist the industry to accelerate this transition and provide greater economy and safety. The development of this transition to advanced, automated control system designs is expected to have extensive benefits in reduced operating costs, fewer outages, enhanced safety, improved licensability, and improved public acceptance for commercial nuclear power plants

  19. Preparation of a program for the independent verification of the brachytherapy planning systems calculations

    International Nuclear Information System (INIS)

    V Carmona, V.; Perez-Calatayud, J.; Lliso, F.; Richart Sancho, J.; Ballester, F.; Pujades-Claumarchirant, M.C.; Munoz, M.

    2010-01-01

    In this work a program is presented that independently checks, for each patient, the treatment planning system calculations in low dose rate, high dose rate and pulsed dose rate brachytherapy. The treatment planning system output text files are loaded automatically into the program in order to obtain the source coordinates, the coordinates of the desired calculation points and, where applicable, the dwell times. The source strength and the reference dates are introduced by the user. The program allows the recommendations on independent verification of clinical brachytherapy dosimetry to be implemented in a simple and accurate way, in a few minutes. (Author).
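
    The kind of check such a program performs can be illustrated with a short, hedged sketch in Python: decay-correct the source strength from its reference date and accumulate simple inverse-square, dwell-time-weighted contributions at a calculation point. The dose-rate constant, source strength and geometry below are hypothetical placeholders; this is not the verification algorithm of the paper, only an illustration of the idea.

        # Illustrative sketch only (not the authors' program): decay-correct the
        # air-kerma strength to the treatment date and form a crude inverse-square
        # estimate at a calculation point. All numerical values are placeholders.
        import math

        T_HALF_IR192_DAYS = 73.83                     # Ir-192 half-life
        LAMBDA = math.log(2.0) / T_HALF_IR192_DAYS

        def decayed_strength(sk_ref, days_since_ref):
            """Air-kerma strength (U) decayed from the reference date."""
            return sk_ref * math.exp(-LAMBDA * days_since_ref)

        def point_dose(sk, source_xyz, point_xyz, dwell_s, dose_rate_const=1.109):
            """Crude point-source dose (cGy): Sk * Lambda / r^2 * dwell time."""
            r2 = sum((s - p) ** 2 for s, p in zip(source_xyz, point_xyz))   # cm^2
            return sk * dose_rate_const / r2 * dwell_s / 3600.0

        sk = decayed_strength(sk_ref=40800.0, days_since_ref=10.0)          # hypothetical
        dwells = [(12.3, (0.0, 0.0, 0.0)), (8.7, (0.5, 0.0, 0.0))]
        total = sum(point_dose(sk, pos, (1.0, 1.0, 0.0), t) for t, pos in dwells)
        print(f"independent estimate at the check point: {total:.2f} cGy")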

  20. Comparison of the results of radiation transport calculation obtained by means of different programs

    International Nuclear Information System (INIS)

    Gorbatkov, D.V.; Kruchkov, V.P.

    1995-01-01

    Verification of the radiation transport results obtained with well-known programs and constant libraries (MCNP+ENDF/B, ANISN+HILO, FLUKA92) is carried out by comparing them with precision calculations performed with the ROZ-6N+Sadko program-constant complex and with experimental data. Satisfactory agreement with the MCNP+ENDF/B package data is shown for the energy range E<14 MeV. The deviations of the results obtained with the ANISN+HILO package for E<400 MeV and with the FLUKA92 program for E<200 GeV are analysed. 25 refs., 12 figs., 3 tabs

  1. Complex of programs for calculating radiation fields outside plane protecting shields, bombarded by high-energy nucleons

    International Nuclear Information System (INIS)

    Gel'fand, E.K.; Man'ko, B.V.; Serov, A.Ya.; Sychev, B.S.

    1979-01-01

    A complex of programs for modelling various radiation situations at high energy proton accelerators is considered. The programs are divided into three main groups according to their purposes. The first group includes programs for preparing constants describing the processes of interaction of different particles with matter. The second group of programs calculates the complete particle distribution functions arising in shields under irradiation by high energy nucleons. Concrete radiation situations arising at high energy proton accelerators are calculated by means of the programs of the third group. A list of the programs as well as their brief characteristics are given

  2. Toward a human-centered aircraft automation philosophy

    Science.gov (United States)

    Billings, Charles E.

    1989-01-01

    The evolution of automation in civil aircraft is examined in order to discern trends in the respective roles and functions of automation technology and the humans who operate these aircraft. The effects of advances in automation technology on crew reaction are considered, and it appears that, though automation may well have decreased the frequency of certain types of human errors in flight, it may also have enabled new categories of human errors, some perhaps less obvious and therefore more serious than those it has alleviated. It is suggested that automation could be designed to keep the pilot closer to the control of the vehicle, while providing an array of information management and aiding functions designed to provide the pilot with data regarding flight replanning, degraded system operation, and the operational status and limits of the aircraft, its systems, and the physical and operational environment. The automation would serve as the pilot's assistant, providing and calculating data, watching for the unexpected, and keeping track of resources and their rate of expenditure.

  3. Novel insights in agent-based complex automated negotiation

    CERN Document Server

    Lopez-Carmona, Miguel; Ito, Takayuki; Zhang, Minjie; Bai, Quan; Fujita, Katsuhide

    2014-01-01

    This book focuses on all aspects of complex automated negotiations, which are studied in the field of autonomous agents and multi-agent systems. The book consists of two parts: I, Agent-Based Complex Automated Negotiations, and II, Automated Negotiation Agents Competition. The chapters in Part I are extended versions of papers presented at the 2012 international workshop on Agent-Based Complex Automated Negotiation (ACAN), after peer review by three Program Committee members. Part II examines in detail ANAC 2012 (The Third Automated Negotiating Agents Competition), in which automated agents that have different negotiation strategies and are implemented by different developers negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of universities and research institutes across the world, are evaluated in tournament style. The purpose of the competition is to steer the research in the area of bilate...

  4. Computer Programs for Calculating and Plotting the Stability Characteristics of a Balloon Tethered in a Wind

    Science.gov (United States)

    Bennett, R. M.; Bland, S. R.; Redd, L. T.

    1973-01-01

    Computer programs for calculating the stability characteristics of a balloon tethered in a steady wind are presented. Equilibrium conditions, characteristic roots, and modal ratios are calculated for a range of discrete values of velocity for a fixed tether-line length. Separate programs are used: (1) to calculate longitudinal stability characteristics, (2) to calculate lateral stability characteristics, (3) to plot the characteristic roots versus velocity, (4) to plot the characteristic roots in root-locus form, (5) to plot the longitudinal modes of motion, and (6) to plot the lateral modes of motion. The basic equations, program listings, and the input and output data for sample cases are presented, with a brief discussion of the overall operation and limitations. The programs are based on a linearized, stability-derivative type of analysis, including balloon aerodynamics, apparent mass, buoyancy effects, and static forces which result from the tether line.

  5. MCFT: a program for calculating fast and thermal neutron multigroup constants

    International Nuclear Information System (INIS)

    Yang Shunhai; Sang Xinzeng

    1993-01-01

    MCFT is a program for calculating fast and thermal neutron multigroup constants. It was redesigned from earlier codes for the generation of thermal neutron multigroup constants and of fast neutron multigroup constants, and adapted to the CYBER 825 computer. It accepts as basic input the evaluated nuclear data contained in the ENDF/B (US), KEDAK (Germany) and UK (United Kingdom) libraries. The code includes a section devoted to the generation of resonant Doppler-broadened cross sections in the framework of the single- or multi-level Breit-Wigner formalism. The program can accept the thermal neutron scattering law S(α, β, T) as input data in tabular, free gas or diffusion motion form. It can treat up to 200 energy groups and Legendre moments up to P5. The output consists of multigroup constants for the various reactions over the whole neutron energy range needed in nuclear reactor design and calculation. Three options in the input file are available to the user. The output format is arbitrary and is defined by the user with a minimum of program modification. The program comprises about 15,000 cards and 184 subroutines. The FORTRAN 5 computer language is used. The operating system is NOS 2 on the CYBER 825 computer

  6. TRIGLAV-W a Windows computer program package with graphical users interface for TRIGA reactor core management calculations

    International Nuclear Information System (INIS)

    Zagar, T.; Zefran, B.; Slavic, S.; Snoj, L.; Ravnik, M.

    2006-01-01

    TRIGLAV-W is a program package for reactor calculations of TRIGA Mark II research reactor cores. The package runs under the Microsoft Windows operating system and has a new, friendly graphical user interface (GUI). The main part of the package is the TRIGLAV code, based on the two-dimensional diffusion approximation for flux distribution calculations. The new GUI helps the user to prepare the input files, run the main code and display the output files. TRIGLAV-W also has a user-friendly GUI for the visualisation of the calculation results, which can be displayed as 2D and 3D coloured graphs for easy presentation and analysis. In the paper the options of the new GUI are presented along with the results of extensive testing of the program. The results of the TRIGLAV-W program package were compared with the results of the WIMS-D and MCNP codes for calculations of the TRIGA benchmark. TRIGLAV-W was also tested using several libraries developed under the IAEA WIMS-D Library Update Project. Additional literature and an application form for TRIGLAV-W beta testing can be found at http://www.rcp.ijs.si/triglav/. (author)

  7. A computer program to calculate the committed dose equivalent after the inhalation of radioactivity

    International Nuclear Information System (INIS)

    Van der Woude, S.

    1989-03-01

    A growing number of people are, as part of their occupation, at risk of being exposed to radiation originating from sources inside their bodies. The quantification of this exposure is an important part of health physics. The International Commission on Radiological Protection (ICRP) developed a first-order kinetics compartmental model to determine the transport of radioactive material through the human body. The model and the parameters involved in its use, are discussed. A versatile computer program was developed to do the following after the in vivo measurement of either the organ- or whole-body activity: calculate the original amount of radioactive material which was inhaled (intake) by employing the ICRP compartmental model of the human body; compare this intake to calculated reference levels and state any action to be taken for the case under consideration; calculate the committed dose equivalent resulting from this intake. In the execution of the above-mentioned calculations, the computer program makes provision for different aerosol particle sizes and the effect of previous intakes. Model parameters can easily be changed to take the effects of, for instance, medical intervention into account. The computer program and the organization of the data in the input files are such that the computer program can be applied to any first-order kinetics compartmental model. The computer program can also conveniently be used for research on problems related to the application of the ICRP model. 18 refs., 25 figs., 5 tabs
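
    As a hedged illustration of the first-order kinetics on which such intake calculations rest (a generic two-compartment toy model with made-up transfer rates, not the ICRP model itself), the retention fraction can be obtained from a matrix exponential and the intake estimated from a whole-body measurement:

        # Minimal sketch of first-order compartmental kinetics (not the actual ICRP
        # model): the transfer-rate values below are hypothetical. The measured
        # whole-body activity divided by the predicted retention fraction gives an
        # intake estimate, as described in the abstract.
        import numpy as np
        from scipy.linalg import expm

        # Hypothetical 2-compartment chain: lungs -> blood -> excretion, rates in 1/day
        k_lung_blood, k_blood_excr = 0.10, 0.05
        M = np.array([[-k_lung_blood, 0.0],
                      [ k_lung_blood, -k_blood_excr]])   # compartments: [lungs, blood]
        A0 = np.array([1.0, 0.0])                        # unit intake deposited in the lungs

        def retention(t_days):
            """Fraction of a unit intake still in the body t_days after intake."""
            return expm(M * t_days).dot(A0).sum()

        measured_bq = 2.0e3            # whole-body measurement (Bq), hypothetical
        t = 30.0                       # days since intake
        intake_bq = measured_bq / retention(t)
        print(f"retention({t:.0f} d) = {retention(t):.3f}, estimated intake = {intake_bq:.0f} Bq")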

  8. Thermal-hydraulic Fortran program for steady-state calculations of plate-type fuel research reactors

    Directory of Open Access Journals (Sweden)

    Khedr Ahmed

    2008-01-01

    Full Text Available The safety assessment of research and power reactors is a continuous process covering their lifespan and requiring verified and validated codes. Power reactor codes all over the world are well established and qualified against real measured data and qualified experimental facilities. These codes are usually sophisticated, require special skills and consume a lot of running time. On the other hand, most research reactor codes still require much more data for validation and qualification. It is, therefore, of benefit to any regulatory body to develop its own codes for the review and assessment of research reactors. The present paper introduces a simple, one-dimensional Fortran program called THDSN for steady-state thermal-hydraulic calculations of plate-type fuel research reactors. Besides calculating the fuel and coolant temperature distributions and pressure gradients in an average and hot channel, the program calculates the safety limits and margins against the critical phenomena encountered in research reactors, such as the onset of nucleate boiling, critical heat flux and flow instability. Well-known thermal-hydraulic correlations for calculating the safety parameters and several formulas for the heat transfer coefficient have been used. The THDSN program was verified by comparing its results for 2 and 10 MW benchmark reactors with those published in IAEA publications and a good agreement was found. Also, the results of the program are compared with those published for other programs, such as PARET and TERMIC.
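
    A minimal sketch of the kind of single-channel energy balance and heat-transfer-coefficient correlation such a code uses is given below; the geometry, power and water properties are hypothetical, the tube-based Dittus-Boelter form is used only as a stand-in for the program's actual correlations, and no safety-margin logic is included.

        # Illustrative sketch (not THDSN itself): axial coolant heat-up in a single
        # heated channel and a wall-temperature estimate from the Dittus-Boelter
        # correlation. All values are hypothetical.
        import math

        m_dot, cp = 0.25, 4180.0          # mass flow (kg/s), specific heat (J/(kg K))
        t_in, q_lin = 40.0, 2.0e4         # inlet temperature (C), linear power (W/m)
        d_h, height = 0.005, 0.6          # hydraulic diameter (m), channel height (m)
        mu, k_w = 6.5e-4, 0.63            # water viscosity (Pa s), conductivity (W/(m K))
        n_nodes = 20

        re = 4.0 * m_dot / (math.pi * d_h * mu)         # Reynolds number (tube approximation)
        pr = mu * cp / k_w
        h = 0.023 * re**0.8 * pr**0.4 * k_w / d_h       # Dittus-Boelter heat transfer coeff.

        t_cool = t_in
        dz = height / n_nodes
        for i in range(n_nodes):
            t_cool += q_lin * dz / (m_dot * cp)         # coolant energy balance
            q_flux = q_lin / (math.pi * d_h)            # surface heat flux (W/m^2)
            t_wall = t_cool + q_flux / h                # clad surface temperature
        print(f"outlet coolant {t_cool:.1f} C, outlet wall {t_wall:.1f} C")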

  9. Transportable educational programs for scientific and technical professionals: More effective utilization of automated scientific and technical data base systems

    Science.gov (United States)

    Dominick, Wayne D.

    1987-01-01

    This grant final report executive summary documents a major, long-term program addressing innovative educational issues associated with the development, administration, evaluation, and widespread distribution of transportable educational programs for scientists and engineers to increase their knowledge of, and facilitate their utilization of, automated scientific and technical information storage and retrieval systems. This educational program is of very broad scope, being targeted at Colleges of Engineering and Colleges of Physical Sciences at a large number of colleges and universities throughout the United States. The educational program is designed to incorporate extensive hands-on, interactive usage of the NASA RECON system and is supported by a number of microcomputer-based software systems to facilitate the delivery and usage of the educational course materials developed as part of the program.

  10. Automated 741 document preparation: Oak Ridge National Laboratory's Automated Safeguards Information System (OASIS)

    International Nuclear Information System (INIS)

    Austin, H.C.; Gray, L.M.

    1982-01-01

    OASIS has been providing for Oak Ridge National Laboratory's total safeguards needs since being placed on line in April 1980. The system supports near real-time nuclear materials safeguards and accountability control. The original design of OASIS called for an automated facsimile of a 741 document to be prepared as a functional by-product of updating the inventory. An attempt was made to utilize, intact, DOE-Albuquerque's automated 741 system to generate the facsimile; however, the five-page document produced proved too cumbersome. Albuquerque's programs were modified to print an original 741 document utilizing standard DOE/NRC 741 forms. It is felt that the best features of both the automated and manually generated 741 documents have been incorporated. Automation of the source data for 741 shipping documents produces greater efficiency while reducing possible errors. Through utilization of the standard DOE/NRC form, continuity within the NMMSS system is maintained, thus minimizing the confusion and redundancy associated with facsimiles. OASIS now fulfills the original concept of near real-time accountability by furnishing a viable 741 document as a function of updating the inventory

  11. BASIC Program for the calculation of radioactive activities

    International Nuclear Information System (INIS)

    Cortes P, A.; Tejera R, A.; Becerril V, A.

    1990-04-01

    When a measurement of radioactive activity is made with a detection system that operates with a gamma radiation detector (a Ge or NaI(Tl) detector), it is necessary to take into account parameters and correction factors that make the calculations difficult and tedious, requiring considerable time from the person who carries out the measurements and frequently leading to erroneous results. In this work a computer program in BASIC language that solves this problem is presented. (Author)
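
    The chain of corrections such a program automates can be sketched in a few lines of Python; the efficiency, emission probability and decay data below are hypothetical placeholders and the sketch is not the BASIC program of the record.

        # Activity from a net photopeak area with efficiency, emission probability,
        # live-time and decay corrections (illustrative values only).
        import math

        def activity_bq(net_counts, efficiency, gamma_yield, live_time_s,
                        half_life_s, cooling_time_s):
            """Source activity at the reference (end-of-sampling) time."""
            lam = math.log(2.0) / half_life_s
            rate = net_counts / (efficiency * gamma_yield * live_time_s)
            return rate * math.exp(lam * cooling_time_s)   # decay-correct back in time

        a = activity_bq(net_counts=15400, efficiency=0.012, gamma_yield=0.851,
                        live_time_s=3600.0, half_life_s=2.62e6, cooling_time_s=86400.0)
        print(f"activity ~ {a:.1f} Bq")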

  12. A microcomputer program for coupled cycle burnup calculations

    International Nuclear Information System (INIS)

    Driscoll, M.J.; Downar, T.J.; Taylor, E.L.

    1986-01-01

    A program, designated BRACC (Burnup, Reactivity, And Cycle Coupling), has been developed for fuel management scoping calculations, and coded in the BASIC language in an interactive format for use with microcomputers. BRACC estimates batch and cycle burnups for sequential reloads for a variety of initial core conditions, and permits the user to specify either reload batch properties (enrichment, burnable poison reactivity) or the target cycle burnup. Most important fuel management tactics (out-in or low-leakage loading, coastdown, variation in number of assemblies charged) can be simulated
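
    The abstract does not spell out BRACC's equations, but scoping estimates of this kind are commonly based on the linear reactivity model, in which core reactivity falls linearly with burnup; a hedged sketch of that textbook model, with invented coefficients, is given below.

        # Hedged sketch of a standard linear-reactivity scoping estimate (not
        # necessarily BRACC's method): reactivity is assumed linear in burnup,
        # rho(B) = rho0 - A*B, for an N-batch equilibrium core.
        def equilibrium_burnups(rho0, a_coeff, n_batches):
            """Return (cycle burnup, discharge burnup) for an N-batch core."""
            b_single = rho0 / a_coeff                    # single-batch discharge burnup
            b_cycle = 2.0 * b_single / (n_batches + 1)   # equilibrium cycle burnup
            b_discharge = n_batches * b_cycle            # batch discharge burnup
            return b_cycle, b_discharge

        # Hypothetical numbers: rho0 = 0.18, A = 5.0e-3 per MWd/kg, 3-batch core
        bc, bd = equilibrium_burnups(0.18, 5.0e-3, 3)
        print(f"cycle burnup ~ {bc:.1f} MWd/kg, discharge burnup ~ {bd:.1f} MWd/kg")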

  13. Parallelization for first principles electronic state calculation program

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Oguchi, Tamio.

    1997-03-01

    In this report we study the parallelization of a first-principles electronic state calculation program. The target machines are the NEC SX-4 for shared-memory parallelization and the FUJITSU VPP300 for distributed-memory parallelization. The features of each parallel machine are surveyed, and the parallelization methods suitable for each are proposed. It is shown that a speedup of 1.60 is achieved with 2-CPU parallelization on the SX-4 and a speedup of 4.97 with 12-PE parallelization on the VPP300. (author)

  14. ADGEN: An automated adjoint code generator for large-scale sensitivity analysis

    International Nuclear Information System (INIS)

    Pin, F.G.; Oblow, E.M.; Horwedel, J.E.; Lucius, J.L.

    1987-01-01

    This paper describes a new automated system, named ADGEN, which makes use of the strengths of computer calculus to automate the costly and time-consuming calculation of derivatives in FORTRAN computer codes, and automatically generate adjoint solutions of computer codes

  15. A computer program for unilateral renal clearance calculation by a modified Oberhausen method

    International Nuclear Information System (INIS)

    Brueggemann, G.

    1980-01-01

    A FORTRAN program is presented which, on the basis of data obtained with NUKLEOPAN M, calculates the glomerular filtration rate with 99mTc-DTPA, the unilateral effective renal plasma flow with 131I-hippuran, and the parameters describing the isotope nephrogram (ING) with 131I-hippuran. The results are calculated fully automatically upon entry of the data, and are processed and printed out. The theoretical fundamentals of the ING and of whole-body clearance calculation are presented, as well as the methods available for unilateral clearance calculation, and the FORTRAN program is described in detail. The standard values of the method are documented, together with a comparative gamma camera study of 48 patients carried out to determine the accuracy of unilateral imaging with the NUKLEOPAN M instrument, a comparison of unilateral clearances by the Oberhausen and Taplin methods, and a comparison between 7/17' plasma clearance and whole-body clearance. Problems and findings of the method are discussed. (orig./MG) [de

  16. Streamlining resummed QCD calculations using Monte Carlo integration

    Energy Technology Data Exchange (ETDEWEB)

    Farhi, David; Feige, Ilya; Freytsis, Marat; Schwartz, Matthew D. [Center for the Fundamental Laws of Nature, Harvard University,17 Oxford St., Cambridge, MA 02138 (United States)

    2016-08-18

    Some of the most arduous and error-prone aspects of precision resummed calculations are related to the partonic hard process, having nothing to do with the resummation. In particular, interfacing to parton-distribution functions, combining various channels, and performing the phase space integration can be limiting factors in completing calculations. Conveniently, however, most of these tasks are already automated in many Monte Carlo programs, such as MADGRAPH http://dx.doi.org/10.1007/JHEP07(2014)079, ALPGEN http://dx.doi.org/10.1088/1126-6708/2003/07/001 or SHERPA http://dx.doi.org/10.1088/1126-6708/2009/02/007. In this paper, we show how such programs can be used to produce distributions of partonic kinematics with associated color structures representing the hard factor in a resummed distribution. These distributions can then be used to weight convolutions of jet, soft and beam functions producing a complete resummed calculation. In fact, only around 1000 unweighted events are necessary to produce precise distributions. A number of examples and checks are provided, including e+e− two- and four-jet event shapes, n-jettiness and jet-mass related observables at hadron colliders at next-to-leading-log (NLL) matched to leading order (LO). Attached code can be used to modify MADGRAPH to export the relevant LO hard functions and color structures for arbitrary processes.

  17. Examinations on Applications of Manual Calculation Programs on Lung Cancer Radiation Therapy Using Analytical Anisotropic Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Min; Kim, Dae Sup; Hong, Dong Ki; Back, Geum Mun; Kwak, Jung Won [Dept. of Radiation Oncology, , Seoul (Korea, Republic of)

    2012-03-15

    There was a problem in using MU verification programs based on the Pencil Beam Convolution (PBC) algorithm: MU errors arose when they were applied to radiation treatment plans around the lung calculated with the Analytical Anisotropic Algorithm (AAA). In this study, we investigated methods that can verify treatment plans calculated with AAA. Using the Eclipse treatment planning system (Version 8.9, Varian, USA), each of 57 fields from 7 cases of lung Stereotactic Body Radiation Therapy (SBRT) was calculated with both the PBC and AAA dose calculation algorithms. The MUs of the established plans were compared and analyzed against the MUs of the manual calculation programs. We analyzed the relationship between the errors and 4 variables that can affect the errors produced by the PBC algorithm and AAA in commonly used programs: field size, lung path distance of the beam, tumor path distance of the beam, and effective depth. The errors of the PBC algorithm were 0.2±1.0% and the errors of AAA were 3.5±2.8%. As a result of analyzing the 4 variables, the relationship between error and lung path distance showed a correlation coefficient of 0.648 (P=0.000), and an MU correction factor, A.E = L.P × 0.00903 + 0.02048, could be derived; after applying it to the manual calculation program, the errors of 3.5±2.8% decreased to within 0.4±2.0%. In this study, we found that the errors of the manual calculation program increase as the lung path distance of the beam increases, and that the MU of AAA can be verified with a simple method, the MU correction factor.

  18. Examinations on Applications of Manual Calculation Programs on Lung Cancer Radiation Therapy Using Analytical Anisotropic Algorithm

    International Nuclear Information System (INIS)

    Kim, Jung Min; Kim, Dae Sup; Hong, Dong Ki; Back, Geum Mun; Kwak, Jung Won

    2012-01-01

    There was a problem in using MU verification programs based on the Pencil Beam Convolution (PBC) algorithm: MU errors arose when they were applied to radiation treatment plans around the lung calculated with the Analytical Anisotropic Algorithm (AAA). In this study, we investigated methods that can verify treatment plans calculated with AAA. Using the Eclipse treatment planning system (Version 8.9, Varian, USA), each of 57 fields from 7 cases of lung Stereotactic Body Radiation Therapy (SBRT) was calculated with both the PBC and AAA dose calculation algorithms. The MUs of the established plans were compared and analyzed against the MUs of the manual calculation programs. We analyzed the relationship between the errors and 4 variables that can affect the errors produced by the PBC algorithm and AAA in commonly used programs: field size, lung path distance of the beam, tumor path distance of the beam, and effective depth. The errors of the PBC algorithm were 0.2±1.0% and the errors of AAA were 3.5±2.8%. As a result of analyzing the 4 variables, the relationship between error and lung path distance showed a correlation coefficient of 0.648 (P=0.000), and an MU correction factor, A.E = L.P × 0.00903 + 0.02048, could be derived; after applying it to the manual calculation program, the errors of 3.5±2.8% decreased to within 0.4±2.0%. In this study, we found that the errors of the manual calculation program increase as the lung path distance of the beam increases, and that the MU of AAA can be verified with a simple method, the MU correction factor.
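
    Applying the quoted correction factor is straightforward arithmetic; the sketch below assumes, as placeholders, that L.P is expressed in centimetres and that A.E is a fractional error, since the record does not state the units.

        # Sketch of applying the correction factor quoted in the record,
        # A.E = L.P x 0.00903 + 0.02048, to a manually calculated MU. Units of L.P
        # (assumed cm) and the meaning of A.E (assumed fraction) are guesses.
        def corrected_mu(manual_mu, lung_path_cm):
            additional_error = lung_path_cm * 0.00903 + 0.02048   # A.E from the record
            return manual_mu * (1.0 + additional_error)

        for lp in (0.0, 5.0, 10.0):
            print(f"lung path {lp:4.1f} cm: 100 MU (manual) -> {corrected_mu(100.0, lp):6.2f} MU")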

  19. System Operations Studies for Automated Guideway Transit Systems : Discrete Event Simulation Model Programmer's Manual

    Science.gov (United States)

    1982-07-01

    In order to examine specific automated guideway transit (AGT) developments and concepts, UMTA undertook a program of studies and technology investigations called the Automated Guideway Transit Technology (AGTT) Program. The objectives of one segment of t...

  20. Performance of automated and manual coding systems for occupational data: a case study of historical records.

    Science.gov (United States)

    Patel, Mehul D; Rose, Kathryn M; Owens, Cindy R; Bang, Heejung; Kaufman, Jay S

    2012-03-01

    Occupational data are a common source of workplace exposure and socioeconomic information in epidemiologic research. We compared the performance of two occupation coding methods, an automated software and a manual coder, using occupation and industry titles from U.S. historical records. We collected parental occupational data from 1920-40s birth certificates, Census records, and city directories on 3,135 deceased individuals in the Atherosclerosis Risk in Communities (ARIC) study. Unique occupation-industry narratives were assigned codes by a manual coder and the Standardized Occupation and Industry Coding software program. We calculated agreement between coding methods of classification into major Census occupational groups. Automated coding software assigned codes to 71% of occupations and 76% of industries. Of this subset coded by software, 73% of occupation codes and 69% of industry codes matched between automated and manual coding. For major occupational groups, agreement improved to 89% (kappa = 0.86). Automated occupational coding is a cost-efficient alternative to manual coding. However, some manual coding is required to code incomplete information. We found substantial variability between coders in the assignment of occupations although not as large for major groups.
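
    The headline agreement statistic here, Cohen's kappa, is simple to compute from paired code assignments; the sketch below uses invented major-group labels purely for illustration.

        # Cohen's kappa from two coders' major-group assignments (made-up labels).
        from collections import Counter

        def cohens_kappa(codes_a, codes_b):
            n = len(codes_a)
            observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
            freq_a, freq_b = Counter(codes_a), Counter(codes_b)
            expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
            return (observed - expected) / (1.0 - expected)

        manual    = ["manager", "clerical", "operative", "clerical", "service", "manager"]
        automated = ["manager", "clerical", "operative", "service",  "service", "clerical"]
        print(f"kappa = {cohens_kappa(manual, automated):.2f}")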

  1. Data structures and language elements for automated transport calculations for neutron and gamma radiation

    International Nuclear Information System (INIS)

    Rexer, G.

    1978-12-01

    Computer-aided design of nuclear shielding and irradiation facilities is characterized by studies of different design variants in order to determine which facilities are safe and still economical. The design engineer has a very complex task, including the formulation of calculation models, the linking of programs and data, and the management of large data stores. Integrated modular program systems with centralized module and data management make it possible to treat these problems in a simpler and more automated manner. The paper describes a system of this type for the field of radiation transport and radiation shielding. The basis is the modular system RSYST II, which has a dynamic hierarchical scheme for the structuring of problem data in a central data base. (orig./RW) [de

  2. Calculator: A Hardware Design, Math and Software Programming Project Base Learning

    Directory of Open Access Journals (Sweden)

    F. Criado

    2015-03-01

    Full Text Available This paper presents the implementation by students of a complex calculator in hardware. The project meets hardware design goals and also strongly motivates the students to use competences learned in other subjects. The learning process associated with system design is hard enough, because the students have to deal with parallel execution, signal delay, synchronization, and so on. To strengthen their knowledge of hardware design, a project-based learning (PBL) methodology is proposed. Moreover, it is also used to reinforce cross subjects like mathematics and software programming. This methodology creates a course dynamic that is closer to a professional environment, where the students will work with software and mathematics to solve the hardware design problems. The students design the functionality of the calculator from scratch. It is they who make the decisions about the mathematical operations it is able to solve, the operand format, and how to introduce a complex equation into the calculator. This increases the students' intrinsic motivation. In addition, since the choices may have consequences for the reliability of the calculator, students are encouraged to program in software the decisions about how to implement the selected mathematical algorithm. Although mathematics and hardware design are two tough subjects for students, the perception they have at the end of the course is quite positive.

  3. A semi-automated approach to derive elevation time-series and calculate glacier mass balance from historical aerial imagery

    Science.gov (United States)

    Whorton, E.; Headman, A.; Shean, D. E.; McCann, E.

    2017-12-01

    Understanding the implications of glacier recession on water resources in the western U.S. requires quantifying glacier mass change across large regions over several decades. Very few glaciers in North America have long-term continuous field measurements of glacier mass balance. However, systematic aerial photography campaigns began in 1957 on many glaciers in the western U.S. and Alaska. These historical, vertical aerial stereo-photographs documenting glacier evolution have recently become publically available. Digital elevation models (DEM) of the transient glacier surface preserved in each imagery timestamp can be derived, then differenced to calculate glacier volume and mass change to improve regional geodetic solutions of glacier mass balance. In order to batch process these data, we use Python-based algorithms and Agisoft Photoscan structure from motion (SfM) photogrammetry software to semi-automate DEM creation, and orthorectify and co-register historical aerial imagery in a high-performance computing environment. Scanned photographs are rotated to reduce scaling issues, cropped to the same size to remove fiducials, and batch histogram equalization is applied to improve image quality and aid pixel-matching algorithms using the Python library OpenCV. Processed photographs are then passed to Photoscan through the Photoscan Python library to create DEMs and orthoimagery. To extend the period of record, the elevation products are co-registered to each other, airborne LiDAR data, and DEMs derived from sub-meter commercial satellite imagery. With the exception of the placement of ground control points, the process is entirely automated with Python. Current research is focused on: one, applying these algorithms to create geodetic mass balance time series for the 90 photographed glaciers in Washington State and two, evaluating the minimal amount of positional information required in Photoscan to prevent distortion effects that cannot be addressed during co
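
    The preprocessing chain described here (rotate, crop the fiducial border, equalize the histogram) can be sketched with OpenCV as below; the file name, rotation and crop margin are placeholders rather than values from the authors' pipeline.

        # Minimal sketch of the scanned-photo preprocessing described above using
        # OpenCV; paths and margins are placeholders.
        import cv2

        def preprocess(path, out_path, rotate_code=cv2.ROTATE_90_CLOCKWISE, margin=150):
            img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            if img is None:
                raise FileNotFoundError(path)
            img = cv2.rotate(img, rotate_code)                 # align scan orientation
            h, w = img.shape
            img = img[margin:h - margin, margin:w - margin]    # crop fiducial border
            img = cv2.equalizeHist(img)                        # improve contrast for matching
            cv2.imwrite(out_path, img)

        # Example (hypothetical file names):
        # preprocess("scan_1957_frame_012.tif", "prepped/frame_012.tif")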

  4. DNAStat, version 2.1--a computer program for processing genetic profile databases and biostatistical calculations.

    Science.gov (United States)

    Berent, Jarosław

    2010-01-01

    This paper presents the new DNAStat version 2.1 for processing genetic profile databases and biostatistical calculations. The popularization of DNA studies employed in the judicial system has led to the necessity of developing appropriate computer programs. Such programs must, above all, address two critical problems, i.e. the broadly understood data processing and data storage, and biostatistical calculations. Moreover, in case of terrorist attacks and mass natural disasters, the ability to identify victims by searching related individuals is very important. DNAStat version 2.1 is an adequate program for such purposes. The DNAStat version 1.0 was launched in 2005. In 2006, the program was updated to 1.1 and 1.2 versions. There were, however, slight differences between those versions and the original one. The DNAStat version 2.0 was launched in 2007 and the major program improvement was an introduction of the group calculation options with the potential application to personal identification of mass disasters and terrorism victims. The last 2.1 version has the option of language selection--Polish or English, which will enhance the usage and application of the program also in other countries.

  5. 24 CFR 4001.203 - Calculation of upfront and annual mortgage insurance premiums for Program mortgages.

    Science.gov (United States)

    2010-04-01

    ... mortgage insurance premiums for Program mortgages. 4001.203 Section 4001.203 Housing and Urban Development... HOMEOWNERS PROGRAM HOPE FOR HOMEOWNERS PROGRAM Rights and Obligations Under the Contract of Insurance § 4001.203 Calculation of upfront and annual mortgage insurance premiums for Program mortgages. (a...

  6. Method of computer algebraic calculation of the matrix elements in the second quantization language

    International Nuclear Information System (INIS)

    Gotoh, Masashi; Mori, Kazuhide; Itoh, Reikichi

    1995-01-01

    An automated method, implemented in the algebraic programming language REDUCE3, for evaluating matrix elements expressed in the second quantization language is presented and then applied to the matrix elements of the TDHF theory. The program works in a very straightforward way by commuting the electron creation and annihilation operators (a† and a) until these operators have completely vanished from the expression of the matrix element under the appropriate elimination conditions. An improved method using singlet generators of unitary transformations in place of the electron creation and annihilation operators is also presented. This improvement reduces the time and memory required for the calculation. These methods will make programming in the field of quantum chemistry much easier. 11 refs., 1 tab
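
    The same bookkeeping, contracting an annihilation operator with each creation operator to its right while tracking the fermionic sign, can be illustrated in a few lines of Python with SymPy; this is only a toy vacuum-expectation-value evaluator, not the REDUCE3 procedure of the paper.

        # Tiny illustration: vacuum expectation value of a product of fermionic
        # operators via pairwise contractions, <0| a_p a_q^dagger |0> = delta_pq,
        # with fermionic sign tracking (Wick's theorem for a simple product).
        from sympy import KroneckerDelta, Integer, symbols

        def vev(ops):
            """ops is a list of (index, is_creation) in left-to-right order."""
            if not ops:
                return Integer(1)
            idx, is_creation = ops[0]
            if is_creation:                      # <0| a^dagger ... = 0
                return Integer(0)
            total = Integer(0)
            for k in range(1, len(ops)):
                jdx, j_creation = ops[k]
                if j_creation:                   # contract a_idx with a_jdx^dagger
                    sign = (-1) ** (k - 1)       # operators anticommuted past
                    rest = ops[1:k] + ops[k + 1:]
                    total += sign * KroneckerDelta(idx, jdx) * vev(rest)
            return total

        p, q, r, s = symbols("p q r s")
        # <0| a_p a_q^dagger a_r a_s^dagger |0>  ->  KroneckerDelta(p, q)*KroneckerDelta(r, s)
        print(vev([(p, False), (q, True), (r, False), (s, True)]))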

  7. Learning of hydraulic and pneumatic systems using Automation Studio

    Directory of Open Access Journals (Sweden)

    Adi Dewanto

    2015-02-01

    Full Text Available Students find it difficult to master hydraulic and pneumatic systems due to a lack of imagination of the component movements, which affects the students' learning of hydraulic and pneumatic system applications. In order to solve this problem, the lecturer of the Mechatronics course used the Automation Studio application. This software is helpful for designing various automations, such as combinations of hydraulic systems, pneumatic systems, electrical systems, and PLCs. The lecturing process and design simulation were conducted using Automation Studio. In general, the students were greatly helped by this program in mastering the theory and practice of hydraulics and pneumatics. On the other hand, some problems were found in applying Automation Studio in the classroom, namely the limited menu options as well as technical aspects related to the number of computers. The implications of the writers' experience in using Automation Studio are that there is an opportunity for computer programmers to create learning media/software for specific competences which is relevant, accessible and applicable, and that software preparation should be carried out by the lecturers and the students before the learning process. Keywords: automation studio program, learning process, Pneumatic and hydraulic learning

  8. Package of programs for calculating accidents involving melting of the materials in a fast-reactor vessel

    International Nuclear Information System (INIS)

    Vlasichev, G.N.

    1994-01-01

    Methods for calculating one-dimensional nonstationary temperature distribution in a system of physically coupled materials are described. Six computer programs developed for calculating accident processes for fast reactor core melt are described in the article. The methods and computer programs take into account melting, solidification, and, in some cases, vaporization of materials. The programs perform calculations for heterogeneous systems consisting of materials with arbitrary but constant composition and heat transfer conditions at material boundaries. Additional modules provide calculations of specific conditions of heat transfer between materials, the change in these conditions and configuration of the materials as a result of coolant boiling, melting and movement of the fuel and structural materials, temperature dependences of thermophysical properties of the materials, and heat release in the fuel. 11 refs., 3 figs
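
    As a hedged illustration of the kind of one-dimensional nonstationary heat conduction these programs solve (leaving out melting, solidification and material motion), an explicit finite-difference step with an internal heat source can be written as follows; every number is hypothetical.

        # Illustrative sketch of the underlying numerics (phase change and material
        # motion omitted): explicit finite-difference solution of 1-D transient heat
        # conduction with internal heat generation. All values are hypothetical.
        import numpy as np

        n, dx = 50, 1.0e-4                   # nodes, node spacing (m)
        k, rho, cp = 3.0, 10000.0, 300.0     # conductivity, density, specific heat (fuel-like)
        q_vol = 3.0e8                        # volumetric heat source (W/m^3)
        alpha = k / (rho * cp)
        dt = 0.4 * dx**2 / alpha             # stable explicit time step
        T = np.full(n, 600.0)                # initial temperature (C)

        for _ in range(10000):
            T_new = T.copy()
            T_new[1:-1] = (T[1:-1]
                           + alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
                           + q_vol * dt / (rho * cp))
            T_new[0] = T_new[1]              # insulated centreline
            T_new[-1] = 400.0                # fixed coolant-side boundary
            T = T_new
        print(f"peak temperature ~ {T.max():.0f} C")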

  9. Fission product inventory calculation by a CASMO/ORIGEN coupling program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Heon; Kim, Jong Kyung [Hanyang University, Seoul (Korea, Republic of); Choi, Hang Bok; Roh, Gyu Hong; Jung, In Ha [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A CASMO/ORIGEN coupling utility program was developed to predict the composition of all the fission products in spent PWR fuels. The coupling program reads the CASMO output file, modifies the ORIGEN cross section library and reconstructs the ORIGEN input file at each depletion step. In ORIGEN, the burnup equation is solved for actinides and fission products based on the fission reaction rates and depletion flux of CASMO. A sample calculation has been performed using a 14 x 14 PWR fuel assembly and the results are given in this paper. 3 refs., 1 fig., 1 tab. (Author)
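
    The burnup equation referred to here couples nuclide densities through production and destruction terms; the toy two-nuclide capture chain below, solved analytically with invented one-group cross sections and flux, shows the form of the equations that ORIGEN integrates for hundreds of nuclides.

        # Toy version of the burnup equation (illustrative one-group data, capture
        # chain A -> B only):
        #   dN_A/dt = -sigma_A * phi * N_A
        #   dN_B/dt = +sigma_A * phi * N_A - sigma_B * phi * N_B
        import math

        barn = 1.0e-24                       # cm^2
        sigma_a, sigma_b = 100.0 * barn, 50.0 * barn
        phi = 3.0e14                         # one-group flux (n/cm^2/s), hypothetical
        n_a0, n_b0 = 1.0e21, 0.0             # initial atom densities (1/cm^3)

        def deplete(t_s):
            la, lb = sigma_a * phi, sigma_b * phi
            n_a = n_a0 * math.exp(-la * t_s)
            n_b = n_b0 * math.exp(-lb * t_s) + n_a0 * la / (lb - la) * (
                math.exp(-la * t_s) - math.exp(-lb * t_s))
            return n_a, n_b

        for days in (0, 100, 300):
            na, nb = deplete(days * 86400.0)
            print(f"{days:4d} d: N_A = {na:.3e}, N_B = {nb:.3e}")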

  10. Fission product inventory calculation by a CASMO/ORIGEN coupling program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Heon; Kim, Jong Kyung [Hanyang University, Seoul (Korea, Republic of); Choi, Hang Bok; Roh, Gyu Hong; Jung, In Ha [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    A CASMO/ORIGEN coupling utility program was developed to predict the composition of all the fission products in spent PWR fuels. The coupling program reads the CASMO output file, modifies the ORIGEN cross section library and reconstructs the ORIGEN input file at each depletion step. In ORIGEN, the burnup equation is solved for actinides and fission products based on the fission reaction rates and depletion flux of CASMO. A sample calculation has been performed using a 14 x 14 PWR fuel assembly and the results are given in this paper. 3 refs., 1 fig., 1 tab. (Author)

  11. Data base to compare calculations and observations

    International Nuclear Information System (INIS)

    Tichler, J.L.

    1985-01-01

    Meteorological and climatological data bases were compared with known tritium release points and diffusion calculations to determine if calculated concentrations could replace measured concentrations at the monitoring stations. Daily tritium concentrations were monitored at 8 stations and 16 possible receptors. Automated data retrieval strategies are listed

  12. Clinical evaluation of automated processing of electrocardiograms by the Veterans Administration program (AVA 3.4).

    Science.gov (United States)

    Brohet, C R; Richman, H G

    1979-06-01

    Automated processing of electrocardiograms by the Veterans Administration program was evaluated for both agreement with physician interpretation and interpretative accuracy as assessed with nonelectrocardiographic criteria. One thousand unselected electrocardiograms were analyzed by two reviewer groups, one familiar and the other unfamiliar with the computer program. A significant number of measurement errors involving repolarization changes and left axis deviation occurred; however, interpretative disagreements related to statistical decision were largely language-related. Use of a printout with a more traditional format resulted in agreement with physician interpretation by both reviewer groups in more than 80 percent of cases. Overall sensitivity based on agreement with nonelectrocardiographic criteria was significantly greater with use of the computer program than with use of the conventional criteria utilized by the reviewers. This difference was particularly evident in the subgroup analysis of myocardial infarction and left ventricular hypertrophy. The degree of overdiagnosis of left ventricular hypertrophy and posteroinferior infarction was initially unacceptable, but this difficulty was corrected by adjustment of probabilities. Clinical acceptability of the Veterans Administration program appears to require greater physician education than that needed for other computer programs of electrocardiographic analysis; the flexibility of interpretation by statistical decision offers the potential for better diagnostic accuracy.

  13. 75 FR 64737 - Automated Commercial Environment (ACE): Announcement of a National Customs Automation Program...

    Science.gov (United States)

    2010-10-20

    ... interchange (EDI) for submitting advance ocean and rail data and intends to amend the regulations as [email protected] . Please describe in the body of the e-mail any past EDI history with CBP. Written... (EDI). For ocean and rail carriers, the CBP-approved EDI is the Automated Manifest System (AMS). Ocean...

  14. DEVELOPMENT OF PROGRAM MODULE FOR CALCULATING SPEED OF TITANIUM PLASMA SEDIMENTATION IN ENVIRONMENT OF TECHNOLOGICAL GAS

    Directory of Open Access Journals (Sweden)

    S. A. Ivaschenko

    2006-01-01

    Full Text Available A program module has been developed on the basis of the MATLAB package of applied programs which allows the speed of coating sedimentation over the cross-section of the plasma stream to be calculated, taking into account the influence of the magnetic field of a stabilizing coil, and also allows the obtained value of the sedimentation speed to be corrected depending on the value of the negative accelerating potential, the arc current and the technological gas pressure. The program provides visualization of the calculation results.

  15. Development and Applications of a Prototypic SCALE Control Module for Automated Burnup Credit Analysis

    International Nuclear Information System (INIS)

    Gauld, I.C.

    2001-01-01

    Consideration of the depletion phenomena and isotopic uncertainties in burnup-credit criticality analysis places an increasing reliance on computational tools and significantly increases the overall complexity of the calculations. An automated analysis and data management capability is essential for practical implementation of large-scale burnup credit analyses that can be performed in a reasonable amount of time. STARBUCS is a new prototypic analysis sequence being developed for the SCALE code system to perform automated criticality calculations of spent fuel systems employing burnup credit. STARBUCS is designed to help analyze the dominant burnup credit phenomena including spatial burnup gradients and isotopic uncertainties. A search capability also allows STARBUCS to iterate to determine the spent fuel parameters (e.g., enrichment and burnup combinations) that result in a desired keff for a storage configuration. Although STARBUCS was developed to address the analysis needs for spent fuel transport and storage systems, it provides sufficient flexibility to allow virtually any configuration of spent fuel to be analyzed, such as storage pools and reprocessing operations. STARBUCS has been used extensively at Oak Ridge National Laboratory (ORNL) to study burnup credit phenomena in support of the NRC Research program
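
    The search capability mentioned above amounts to a root-finding loop around a criticality calculation; the sketch below bisects on enrichment against a purely hypothetical surrogate keff model standing in for the full depletion-plus-criticality sequence.

        # Bisection on enrichment until a target keff is met. The keff model here is
        # a made-up surrogate; in STARBUCS each evaluation would be a full depletion
        # and criticality calculation.
        def keff_surrogate(enrichment_wt, burnup_gwd_t):
            """Stand-in model: keff rises with enrichment, falls with burnup."""
            return 0.90 + 0.045 * enrichment_wt - 0.006 * burnup_gwd_t

        def find_enrichment(target_keff, burnup, lo=1.5, hi=6.0, tol=1e-4):
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                if keff_surrogate(mid, burnup) < target_keff:
                    lo = mid
                else:
                    hi = mid
                if hi - lo < tol:
                    break
            return 0.5 * (lo + hi)

        e = find_enrichment(target_keff=0.94, burnup=30.0)
        print(f"enrichment giving keff ~ 0.94 at 30 GWd/t: {e:.2f} wt%")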

  16. Media Magic: Automating a K-12 Library Program in a Rural District.

    Science.gov (United States)

    Adams, Helen

    1994-01-01

    Describes the automation process in a library resources center in a small rural school district. Topics discussed include long-range planning; retrospective conversion for an online catalog; library automation software vendors; finances; training; time savings; CD-ROM products; telecomputing; computer literacy skills; professional development…

  17. Qgui: A high-throughput interface for automated setup and analysis of free energy calculations and empirical valence bond simulations in biological systems.

    Science.gov (United States)

    Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav

    2015-07-01

    Structural information and activity data has increased rapidly for many protein targets during the last decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present calculation of high-precision Arrhenius plots to obtain the thermodynamic activation enthalpy and entropy with Qgui from running a large number of EVB simulations. Copyright © 2015 Elsevier Inc. All rights reserved.
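
    For the LIE part, the binding estimate reduces to a weighted sum of average interaction-energy differences between the bound and free simulations; the coefficients and energy averages in this sketch are generic placeholders, not values from Qgui or the ERα application.

        # Linear interaction energy (LIE) estimate from MD averages:
        #   dG_bind ~ alpha*<dU_vdW> + beta*<dU_el> + gamma   (kcal/mol)
        # The averages and coefficients below are hypothetical placeholders.
        def lie_binding_energy(d_vdw, d_el, alpha=0.18, beta=0.5, gamma=0.0):
            return alpha * d_vdw + beta * d_el + gamma

        # <U>_bound - <U>_free from the two simulations (kcal/mol), made-up numbers
        d_vdw = -35.2
        d_el = -12.6
        print(f"LIE estimate: {lie_binding_energy(d_vdw, d_el):.1f} kcal/mol")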

  18. Dijkstra's interpretation of the approach to solving a problem of program correctness

    Directory of Open Access Journals (Sweden)

    Markoski Branko

    2010-01-01

    Full Text Available Proving program correctness and designing correct programs are two connected theoretical problems of great practical importance. The first is solved within program analysis, the second within program synthesis, although the two processes are often intertwined because of the connection between the analysis and synthesis of programs. Nevertheless, when the automated methods of proving correctness and the methods of automatic program synthesis are considered, the difference is easy to tell. This paper presents a denotational interpretation of the programming calculus, explaining the semantics by formulae φ and ψ in such a way that they can be used for defining state sets for a program P.

  19. Abstract of programs for nuclear reactor calculation and kinetic equations solution

    International Nuclear Information System (INIS)

    Marakazov, A.A.

    1977-01-01

    The collection includes about 50 annotations of programmes developed in the Kurchatov Atomic Energy Institute in 1971-1976. The programmes are intended for calculating the neutron flux, for solving systems of multigroup equations in the P3 approximation, for calculating the reactor cell, for analysing the system stability, breeding ratio, etc. The programme annotations are compiled according to the following scheme: 1. Programme title. 2. Computer type. 3. Physical problem. 4. Solution method. 5. Calculation limitations. 6. Characteristic computer time. 7. Programme characteristic features. 8. Bound programmes. 9. Programme state. 10. Literature allusions in the programme. 11. Required memory resources. 12. Programming language. 13. Operating system. 14. Names of authors and place of programme adjusting

  20. Magnetic particle movement program to calculate particle paths in flow and magnetic fields

    International Nuclear Information System (INIS)

    Inaba, Toru; Sakazume, Taku; Yamashita, Yoshihiro; Matsuoka, Shinya

    2014-01-01

    We developed an analysis program for predicting the movement of magnetic particles in flow and magnetic fields. This magnetic particle movement simulation was applied to a capturing process in a flow cell and a magnetic separation process in a small vessel of an in-vitro diagnostic system. The distributions of captured magnetic particles on a wall were calculated and compared with experimentally obtained distributions. The calculations involved evaluating not only the drag, pressure gradient, gravity, and magnetic force in a flow field but also the friction force between the particle and the wall, and the calculated particle distributions were in good agreement with the experimental distributions. The friction force was simply modeled as static and kinetic friction, with the coefficients of friction determined by comparing the calculated and measured results. This simulation method for solving multiphysics problems is very effective at predicting the movements of magnetic particles and is an excellent tool for studying the design and application of devices. - Highlights: ●We developed a magnetic particle movement program for flow and magnetic fields. ●The friction force on the wall is simply modeled as static and kinetic friction. ●The program was applied to capturing and separation in an in-vitro diagnostic system. ●The predicted particle distributions on the wall agreed with the experimental ones. ●The method is very effective at predicting the movements of magnetic particles
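
    A stripped-down version of such a particle-tracking step (Stokes drag toward the local flow velocity, gravity, a constant magnetic force, and a capture test based on static friction at the wall) is sketched below with entirely made-up values; it is an illustration of the force balance, not the authors' model.

        # Illustrative force-balance sketch (not the authors' program): a particle in
        # a uniform flow with Stokes drag, gravity and a constant magnetic force
        # pulling it toward the wall at y = 0; the particle counts as captured when
        # static friction can hold it against the flow drag. All values are made up.
        import numpy as np

        radius, rho_p, mu_f = 1.5e-6, 5000.0, 1.0e-3    # particle radius (m), density, viscosity
        mass = rho_p * (4.0 / 3.0) * np.pi * radius**3
        drag_c = 6.0 * np.pi * mu_f * radius            # Stokes drag coefficient
        gravity = mass * np.array([0.0, -9.81])
        u_flow = np.array([1.0e-3, 0.0])                # uniform flow velocity (m/s)
        f_mag = np.array([0.0, -2.0e-10])               # constant magnetic force (N)
        mu_static = 0.3                                 # static friction coefficient at the wall

        pos, vel = np.array([0.0, 5.0e-6]), np.array([0.0, 0.0])
        dt = 5.0e-7
        for step in range(10000):
            force = drag_c * (u_flow - vel) + gravity + f_mag
            vel = vel + force / mass * dt
            pos = pos + vel * dt
            if pos[1] <= 0.0:                           # particle has reached the wall
                pos[1], vel[1] = 0.0, 0.0
                normal = -(gravity[1] + f_mag[1])       # wall reaction to the downward pull
                tangential = drag_c * abs(u_flow[0])    # flow drag on a held particle
                if tangential < mu_static * normal:     # static friction holds it: captured
                    print(f"captured at x = {pos[0] * 1e6:.2f} um "
                          f"after {(step + 1) * dt * 1e3:.2f} ms")
                    break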

  1. ORBITALES. A program for the calculation of wave functions with an analytical central potential

    International Nuclear Information System (INIS)

    Yunta Carretero; Rodriguez Mayquez, E.

    1974-01-01

    This paper describes the objective, basis, FORTRAN implementation and use of the program ORBITALES. The program calculates atomic wave functions in the case of an analytical central potential (Author) 8 refs

  2. The Weak Link HP-41C hand-held calculator program

    Science.gov (United States)

    Ross A. Phillips; Penn A. Peters; Gary D. Falk

    1982-01-01

    The Weak Link hand-held calculator program (HP-41C) quickly analyzes a system for logging production and costs. The production equations model conventional chain saw, skidder, loader, and tandemaxle truck operations in eastern mountain areas. Production of each function of the logging system may be determined so that the system may be balanced for minimum cost. The...

  3. Automated external defibrillators in National Collegiate Athletic Association Division I Athletics.

    Science.gov (United States)

    Coris, Eric E; Sahebzamani, Frances; Walz, Steve; Ramirez, Arnold M

    2004-01-01

    Sudden cardiac death is the leading cause of death in athletes. Evidence on current sudden cardiac death prevention through preparticipation history, physicals, and noninvasive cardiovascular diagnostics has demonstrated a low sensitivity for detection of athletes at high risk of sudden cardiac death. Data are lacking on automated external defibrillator programs specifically initiated to respond to rare dysrhythmia in younger, relatively low-risk populations. Surveys were mailed to the head athletic trainers of all National Collegiate Athletic Association Division I athletics programs listed in the National Athletic Trainers' Association directory. In all, 303 surveys were mailed; 186 departments (61%) responded. Seventy-two percent (133) of responding National Collegiate Athletic Association Division I athletics programs have access to automated external defibrillator units; 54% (101) own their units. Proven medical benefit (55%), concern for liability (51%), and affordability (29%) ranked highest in frequency of reasons for automated external defibrillator purchase. Unit cost (odds ratio = 1.01; 95% confidence interval, 1.01-1.0), donated units (odds ratio = 1.92; confidence interval, 3.66-1.01), institution size (odds ratio = 0.0001; confidence interval, 1.3E-4 to 2.2E-5), and proven medical benefit of automated external defibrillators (odds ratio = 24; confidence interval, 72-8.1) were the most significant predictors of departmental defibrillator ownership. Emergency medical service response time and sudden cardiac death event history were not significantly predictive of departmental defibrillator ownership. The majority of automated external defibrillator interventions occurred on nonathletes. Many athletics medicine programs are obtaining automated external defibrillators without apparent criteria for determination of need. Usage and maintenance policies vary widely among departments with unit ownership or access. Programs need to approach the issue of unit

  4. Program realization of mathematical model of kinetostatical calculation of flat lever mechanisms

    Directory of Open Access Journals (Sweden)

    M. A. Vasechkin

    2016-01-01

    Full Text Available Global computerization has made analytical methods dominant in the study of mechanisms. As a result, the kinetostatic analysis of mechanisms using software packages is an important part of the scientific and practical work of engineers and designers. Therefore, the software implementation of mathematical models for the kinetostatic calculation of mechanisms is of practical interest. The mathematical model was obtained in [1]. A computer procedure was developed in the Turbo Pascal language that calculates the forces in the kinematic pairs of Assur groups (GA) and the balancing force at the primary link. Before using the corresponding computational procedures it is necessary to know all external forces and moments acting on the GA and to determine the inertial forces and moments of inertia forces. The process of calculating and drawing the positions of the mechanism can be summarized as follows. A cycle is organized in which the position of the initial link of the mechanism is calculated. The positions of the remaining links of the mechanism are calculated by calling the relevant procedures of the module DIADA for each GA [2,3]. Using the graphics mode of the computer, the position of the mechanism is displayed. The inertial forces and moments of inertia forces are computed. By calling the corresponding procedures of the module, all the forces in the kinematic pairs and the balancing force at the primary link are calculated. In each kinematic pair the forces and their directions are drawn with the help of simple graphical procedures. The magnitudes of these forces and their directions are displayed in a special text-mode window. This work contains listings of the test program MyTest, which is an example of using the computing capabilities of the developed module. As a check on the calculation procedures of the module, an example of calculating the balancing force according to the method of Zhukovsky (Zhukovsky lever) is reproduced in the program.

  5. Integrating Test-Form Formatting into Automated Test Assembly

    Science.gov (United States)

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
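
    A hedged sketch of the mixed integer programming idea follows, written with the PuLP modelling library (assumed to be available together with its bundled CBC solver): select items maximizing total information subject to a test-length constraint and a per-content-area quota. The item data are invented.

        # Toy automated test assembly as a 0/1 integer program (illustration only).
        from pulp import LpProblem, LpVariable, LpMaximize, lpSum, LpBinary

        items = {                        # item id: (information, content area)
            "i1": (0.62, "algebra"), "i2": (0.55, "algebra"), "i3": (0.48, "geometry"),
            "i4": (0.71, "geometry"), "i5": (0.39, "algebra"), "i6": (0.66, "geometry"),
        }
        test_length, min_per_area = 4, 2

        x = {i: LpVariable(f"x_{i}", cat=LpBinary) for i in items}
        prob = LpProblem("test_assembly", LpMaximize)
        prob += lpSum(items[i][0] * x[i] for i in items)                 # total information
        prob += lpSum(x.values()) == test_length                         # fixed test length
        for area in {a for _, a in items.values()}:
            prob += lpSum(x[i] for i in items if items[i][1] == area) >= min_per_area

        prob.solve()
        print("selected:", sorted(i for i in items if x[i].value() == 1))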

  6. Bar-code automated waste tracking system

    International Nuclear Information System (INIS)

    Hull, T.E.

    1994-10-01

    The Bar-Code Automated Waste Tracking System was designed to be a site-specific program with a general-purpose application for transportability to other facilities. The system is user-friendly, totally automated, and incorporates the use of a drive-up window that is close to the areas dealing in container preparation, delivery, pickup, and disposal. The system features ''stop-and-go'' operation rather than a long, tedious, error-prone manual entry. The system is designed for automation but allows operators to concentrate on proper handling of waste while maintaining manual entry of data as a backup. A large wall plaque filled with bar-code labels is used to input specific details about any movement of waste

  7. Automation of gamma-therapy

    International Nuclear Information System (INIS)

    Al'bitskij, L.L.; Brikker, I.N.; Bychkov, V.N.; Voronin, V.V.; Mirzoyan, A.R.; Rogozhin, A.S.; Sarkisyan, Yu.Kh.

    1989-01-01

    An automated control system, Aspect-2, was developed for the automation of gamma therapy on units of the Rokus series. The system consists of the following hardware and software complexes: a complex for pre-irradiation preparation, Centrator-imitator; a complex, Accord, for anatomo-topographic data coding; a software complex; and a gamma-therapeutic complex, Rokus-AM. The Centrator-imitator and Rokus-AM complexes are fitted with built-in microcomputers with specially developed system software. The Rokus-AM complex has automatic punch-tape programmed control of 9 degrees of freedom of the gamma unit and the treatment table and provides 5 modes of irradiation: positional, rotating, rotating-convergent, sectoral rotating-convergent and scanning

  8. SAMPO 90 high resolution interactive gamma-spectrum analysis including automation with macros

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Nikkinen, M.T.; Routti, J.T.

    1992-01-01

    SAMPO 90 is a high-performance gamma-spectrum analysis program for personal computers. It uses color graphics to display calibrations, spectra, fitting results as multiplet components, and analysis results. All the analysis phases can be done either under full interactive user control, or macros and programmable function keys can be used for completely automated measurement and analysis sequences, including the control of MCAs and sample changers. Accurate peak area determination of even the most complex multiplets, of up to 32 components, is accomplished using linear and mixed-mode fitting. Nuclide identification is done using associated-lines techniques allowing interference correction for fully overlapping peaks. Peaked Background Subtraction can be performed and Minimum Detectable Activities calculated. The analysis reports and program parameters are fully customizable. (author) 13 refs.; 1 fig

  9. A finite element computer program for the calculation of the resonant frequencies of anisotropic materials

    International Nuclear Information System (INIS)

    Fleury, W.H.; Rosinger, H.E.; Ritchie, I.G.

    1975-09-01

    A set of computer programs for the calculation of the flexural and torsional resonant frequencies of rectangular-section bars of materials of orthotropic or higher symmetry is described. The calculations are used in the experimental determination and verification of the elastic constants of anisotropic materials. The simple finite-element technique employed separates the inertial and elastic properties of the beam element into station and field transfer matrices, respectively. It includes the Timoshenko beam corrections for flexure and Lekhnitskii's theory for torsion-flexure coupling. The programs also calculate the vibration shapes and surface nodal contours or Chladni figures of the vibration modes. (author)

  10. Effective Dose Calculation Program (EDCP) for the usage of NORM-added consumer product.

    Science.gov (United States)

    Yoo, Do Hyeon; Lee, Jaekook; Min, Chul Hee

    2018-04-09

    The aim of this study is to develop the Effective Dose Calculation Program (EDCP) for the usage of Naturally Occurring Radioactive Material (NORM) added consumer products. The EDCP was developed based on a database of effective dose conversion coefficient and the Matrix Laboratory (MATLAB) program to incorporate a Graphic User Interface (GUI) for ease of use. To validate EDCP, the effective dose calculated with EDCP by manually determining the source region by using the GUI and that by using the reference mathematical algorithm were compared for pillow, waist supporter, eye-patch and sleeping mattress. The results show that the annual effective dose calculated with EDCP was almost identical to that calculated using the reference mathematical algorithm in most of the assessment cases. With the assumption of the gamma energy of 1 MeV and activity of 1 MBq, the annual effective doses of pillow, waist supporter, sleeping mattress, and eye-patch determined using the reference algorithm were 3.444 mSv year⁻¹, 2.770 mSv year⁻¹, 4.629 mSv year⁻¹, and 3.567 mSv year⁻¹, respectively, while those calculated using EDCP were 3.561 mSv year⁻¹, 2.630 mSv year⁻¹, 4.740 mSv year⁻¹, and 3.780 mSv year⁻¹, respectively. The differences in the annual effective doses were less than 5%, despite the different calculation methods employed. The EDCP can therefore be effectively used for radiation protection management in the context of the usage of NORM-added consumer products. Additionally, EDCP can be used by members of the public through the GUI for various studies in the field of radiation protection, thus facilitating easy access to the program. Copyright © 2018. Published by Elsevier Ltd.

  11. Evaluation of automated residential demand response with flat and dynamic pricing

    International Nuclear Information System (INIS)

    Swisher, Joel; Wang, Kitty; Stewart, Stewart

    2005-01-01

    This paper reviews the performance of two recent automated load management programs for residential customers of electric utilities in two American states. Both pilot programs have been run with about 200 participant houses each, and both programs have control populations of similar customers without the technology or program treatment. In both cases, the technology used in the pilot is GoodWatts, an advanced, two-way, real-time, comprehensive home energy management system. The purpose of each pilot is to determine the household kW reduction in coincident peak electric load from the energy management technology. Nevada Power has conducted a pilot program for Air-Conditioning Load Management (ACLM), in which customers are sent an electronic curtailment signal for three-hour intervals during times of maximum peak demand. The participating customers receive an annual incentive payment, but otherwise they are on a conventional utility tariff. In California, three major utilities are jointly conducting a pilot demonstration of an Automated Demand Response System (ADRS). Customers are on a time-of-use (ToU) tariff, which includes a critical peak pricing (CPP) element. During times of maximum peak demand, customers are sent an electronic price signal that is three times higher than the normal on-peak price. Houses with the automated GoodWatts technology reduced their demand in both the ACLM and the ADRS programs by about 50% consistently across the summer curtailment or super peak events, relative to homes without the technology or any load management program or tariff in place. The absolute savings were greater in the ACLM program, due to the higher baseline air conditioning loads in the hotter Las Vegas climate. The results suggest that either automated technology or dynamic pricing can deliver significant demand response in low-consumption houses. However, for high-consumption houses, automated technology can reduce load by a greater absolute kWh difference. Targeting

  12. The transition equation of the state intensities for exciton model and the calculation program

    International Nuclear Information System (INIS)

    Yu Xian; Zheng Jiwen; Liu Guoxing; Chen Keliang

    1995-01-01

    An equation set for the exciton model is given and a calculation program is developed. The process of approaching the equilibrium state has been investigated with the program for the ¹²C + ⁶⁴Ni reaction at an energy of 72 MeV.

  13. ERATO - a computer program for the calculation of induced eddy-currents in three-dimensional conductive structures

    International Nuclear Information System (INIS)

    Benner, J.

    1985-10-01

    The computer code ERATO is used for the calculation of eddy currents in three-dimensional conductive structures and of their secondary magnetic field. ERATO is a revised version of the code FEDIFF, developed at IPP Garching. The calculation uses the Finite-Element-Network (FEN) method, in which the structure is simulated by an equivalent electric network. In the ERATO code, the finite-element discretization, the eddy-current analysis, and the final evaluation of the results are done in separate programs, so the eddy-current analysis, as the central step, is completely independent of any particular geometry. For the finite-element discretization there are two so-called preprocessors, which treat a torus segment and a rectangular flat plate. For the final evaluation, postprocessors are used by which the current distributions can be printed and plotted. The report discusses the theoretical foundation of the FEN method, describes the structure and use of the programs (preprocessors, analysis program, postprocessors, supporting programs), and presents two calculation examples. (orig.) [de

  14. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts

  15. DEPDOSE: An interactive, microcomputer based program to calculate doses from exposure to radionuclides deposited on the ground

    International Nuclear Information System (INIS)

    Beres, D.A.; Hull, A.P.

    1991-12-01

    DEPDOSE is an interactive, menu-driven, microcomputer-based program designed to rapidly calculate the committed dose from radionuclides deposited on the ground. The program is designed to require little or no computer expertise on the part of the user. The program consists of a dose calculation section and a library maintenance section; both are available to the user from the main menu. The dose calculation section provides the user with the ability to calculate committed doses, determine the decay time needed to reach a particular dose, cross-compare deposition data from separate locations, and approximate a committed dose based on a measured exposure rate. The library maintenance section allows the user to review and update dose modifier data as well as to build and maintain libraries of radionuclide data, dose conversion factors, and default deposition data. The program is structured to provide the user easy access for reviewing data prior to running the calculation. Deposition data can either be entered by the user or imported from other databases. Results can either be displayed on the screen or sent to the printer
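
    The abstract does not give DEPDOSE's internal equations, but the kind of committed-dose arithmetic such a program performs can be illustrated with a short sketch. The nuclide data and dose-rate conversion factors below are hypothetical placeholders, not values from DEPDOSE's libraries; the calculation simply integrates a ground-shine dose rate over an exposure period while the deposited activity decays.

```python
import math

# Hypothetical example of a ground-shine committed-dose calculation of the
# kind DEPDOSE performs; nuclide data and conversion factors are illustrative
# placeholders, not values from the program's libraries.
nuclides = {
    # name: (deposition Bq/m^2, half-life s, dose-rate factor (Sv/s)/(Bq/m^2))
    "Cs-137": (5.0e4, 30.1 * 365.25 * 86400, 5.8e-16),
    "I-131":  (2.0e5, 8.02 * 86400,          1.3e-15),
}

def committed_dose(deposition, half_life, dcf, exposure_time):
    """Dose (Sv) from surface activity decaying during the exposure period."""
    lam = math.log(2.0) / half_life
    # integral of deposition * exp(-lam*t) * dcf from 0 to exposure_time
    return deposition * dcf * (1.0 - math.exp(-lam * exposure_time)) / lam

one_year = 365.25 * 86400
for name, (dep, t_half, dcf) in nuclides.items():
    print(f"{name}: {committed_dose(dep, t_half, dcf, one_year):.3e} Sv in first year")
```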

  16. CRYOCOL a computer program to calculate the cryogenic distillation of hydrogen isotopes

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1993-02-01

    This report describes the computer model and mathematical method coded into the AECL Research computer program CRYOCOL. The purpose of CRYOCOL is to calculate the separation of hydrogen isotopes by cryogenic distillation. (Author)

  17. HEXANN-EVALU - a Monte Carlo program system for pressure vessel neutron irradiation calculation

    International Nuclear Information System (INIS)

    Lux, Ivan

    1983-08-01

    The Monte Carlo program HEXANN and the evaluation program EVALU are intended to calculate Monte Carlo estimates of reaction rates and currents in segments of concentric angular regions around a hexagonal reactor-core region. The report describes the theoretical basis, structure and operation of the programs. Input data preparation guides and a sample problem are also included. Theoretical considerations as well as numerical experimental results suggest to the user a nearly optimal way of making use of the Monte Carlo efficiency-increasing options included in the program

  18. SUBDOSA: a computer program for calculating external doses from accidental atmospheric releases of radionuclides

    International Nuclear Information System (INIS)

    Strenge, D.L.; Watson, E.C.; Houston, J.R.

    1975-06-01

    A computer program, SUBDOSA, was developed for calculating external γ and β doses to individuals from the accidental release of radionuclides to the atmosphere. Characteristics of SUBDOSA are: doses from both γ and β radiation are calculated as a function of depth in tissue, summed and reported as skin, eye, gonadal, and total body dose; doses are calculated for releases within each of several release time intervals and nuclide inventories and atmospheric dispersion conditions are considered for each time interval; radioactive decay is considered during the release and/or transit using a chain decay scheme with branching to account for transitions to and from isomeric states; the dose from gamma radiation is calculated using a numerical integration technique to account for the finite size of the plume; and the program computes and lists the normalized air concentrations at ground level as a function of distance from the point of release. (auth)
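
    The chain-decay scheme mentioned above is not spelled out in the abstract. As a hedged illustration, the sketch below implements the standard two-member Bateman solution on which such schemes build; the half-lives and initial inventory are placeholders, not SUBDOSA input data.

```python
import math

def bateman_two_member(n1_0, lam1, lam2, t):
    """Parent/daughter atom numbers for a two-member Bateman chain.

    Returns (N1(t), N2(t)), assuming N2(0) = 0 and lam1 != lam2.
    """
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Illustrative parent/daughter pair (half-lives are placeholders).
lam_parent   = math.log(2) / 3600.0     # 1 h half-life
lam_daughter = math.log(2) / 600.0      # 10 min half-life
for t in (600.0, 1800.0, 3600.0):
    n1, n2 = bateman_two_member(1.0e10, lam_parent, lam_daughter, t)
    print(f"t = {t:6.0f} s   parent {n1:.3e}   daughter {n2:.3e}")
```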

  19. An Automated Self-Learning Quantification System to Identify Visible Areas in Capsule Endoscopy Images.

    Science.gov (United States)

    Hashimoto, Shinichi; Ogihara, Hiroyuki; Suenaga, Masato; Fujita, Yusuke; Terai, Shuji; Hamamoto, Yoshihiko; Sakaida, Isao

    2017-08-01

    Visibility in capsule endoscopic images is presently evaluated through intermittent analysis of frames selected by a physician. It is thus subjective and not quantitative. A method to automatically quantify the visibility on capsule endoscopic images has not been reported. Generally, when designing automated image recognition programs, physicians must provide a training image; this process is called supervised learning. We aimed to develop a novel automated self-learning quantification system to identify visible areas on capsule endoscopic images. The technique was developed using 200 capsule endoscopic images retrospectively selected from each of three patients. The rate of detection of visible areas on capsule endoscopic images was compared between a supervised learning program, using training images labeled by a physician, and our novel automated self-learning program, using unlabeled training images without intervention by a physician. The rate of detection of visible areas was equivalent for the supervised learning program and for our automated self-learning program. The visible areas automatically identified by the self-learning program correlated with the areas identified by an experienced physician. We developed a novel self-learning automated program to identify visible areas in capsule endoscopic images.

  20. DIDACTIC AUTOMATED STATION OF COMPLEX KINEMATICS

    Directory of Open Access Journals (Sweden)

    Mariusz Sosnowski

    2014-03-01

    Full Text Available The paper presents the design, control system and software that control the automated station of complex kinematics. The control interface and software were developed and produced at the West Pomeranian University of Technology in Szczecin, in the Department of Automated Manufacturing Systems Engineering and Quality. The control system and software were installed to support classes that teach the programming and design of structures and systems for monitoring robot kinematic components with non-standard structures.

  1. Role of automation in the ACRV operations

    Science.gov (United States)

    Sepahban, S. F.

    1992-01-01

    The Assured Crew Return Vehicle (ACRV) will provide the Space Station Freedom with contingency means of return to earth (1) of one disabled crew member during medical emergencies, (2) of all crew members in case of accidents or failures of SSF systems, and (3) in case of interruption of the Space Shuttle flights. A wide range of vehicle configurations and system approaches are currently under study. The Program requirements focus on minimizing life cycle costs by ensuring simple operations, built-in reliability and maintainability. The ACRV philosophy of embedded operations is based on maximum use of existing facilities, resources and processes, while minimizing the interfaces and impacts to the Space Shuttle and Freedom programs. A preliminary integrated operations concept based on this philosophy and covering the ground, flight, mission support, and landing and recovery operations has been produced. To implement the ACRV operations concept, the underlying approach has been to rely on vehicle autonomy and automation, to the extent possible. Candidate functions and processes which may benefit from current or near-term automation and robotics technologies are identified. These include, but are not limited to, built-in automated ground tests and checkouts; use of the Freedom and the Orbiter remote manipulator systems, for ACRV berthing; automated passive monitoring and performance trend analysis, and periodic active checkouts during dormant periods. The major ACRV operations concept issues as they relate to the use of automation are discussed.

  2. FUP1--an unified program for calculating all fast neutron data of fissile nucleus

    International Nuclear Information System (INIS)

    Cai Chonghai; Zuo Yixin

    1990-01-01

    FUP1 is the first edition of a unified program for calculating all the fast neutron data in ENDF/B-4 format for a fissile nucleus. The following data are calculated with the FUP1 code: the total cross section, the elastic scattering cross section, the nonelastic cross section, and the total inelastic cross section including up to 40 isolated levels and the continuum state. In FUP1 the energy of the incident neutron is restricted to the region from 10 keV to 20 MeV. The advantages of this program are its complete functionality, its convenience for users, and its fast execution

  3. Super Phenix. Monitoring of structures subject to irradiation. Neutron dosimetry measurement and calculation program

    International Nuclear Information System (INIS)

    Cabrillat, J.C.; Arnaud, G.; Calamand, D.; Manent, G.; Tavassoli, A.A.

    1984-09-01

    For the Super Phenix reactor, the evolution with irradiation of the mechanical properties of the core diagrid steel is the subject of studies and is closely monitored. The specimens are currently being irradiated in PHENIX and will later be irradiated in SUPER PHENIX from its first operating cycles onward. An extensive dosimetry program coupling calculation and measurement is carried out in parallel. This paper presents the rationale, the definition of the structure, the development and the materials used in this dosimetry program, as well as the first results of a calculation-measurement comparison [fr

  4. Cylindrization of a PWR core for neutronic calculations

    International Nuclear Information System (INIS)

    Santos, Rubens Souza dos

    2005-01-01

    In this work we propose a cylindrization of the core, starting from a PWR core configuration, through the use of an algorithm that automates the process in the program independently of the discretization. This approach overcomes the problem stemming from the use of neutron transport theory on the core boundary, together with the singularities associated with the corners of the outermost fuel elements present in light water reactors (LWR). The algorithm was implemented in a computational program used for the identification of the control rod drop accident in a typical PWR core. The results showed that the algorithm gives results consistent with a production code for a problem with uniform properties. As future work, we suggest analyzing the effect of mesh size for the cylindrical geometry and comparing transport theory calculations with diffusion theory calculations for boundary conditions with corners in typical PWR cores. (author)
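
    The paper's cylindrization algorithm is not reproduced in the abstract. A common starting point for turning a Cartesian core map into an equivalent cylinder is to preserve the total cross-sectional area, as in the sketch below; the assembly count and pitch are hypothetical and the author's actual algorithm may differ.

```python
import math

def equivalent_radius(n_assemblies, pitch_cm):
    """Radius of a cylinder with the same cross-sectional area as the
    square-lattice core (area-preserving cylindrization)."""
    core_area = n_assemblies * pitch_cm ** 2
    return math.sqrt(core_area / math.pi)

# Hypothetical PWR-like core: 157 assemblies with a 21.5 cm pitch.
print(f"equivalent core radius = {equivalent_radius(157, 21.5):.1f} cm")
```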

  5. UNIDOSE - a computer program for the calculation of individual and collective doses from airborne radioactive pollutants

    International Nuclear Information System (INIS)

    Karlberg, O.; Schwartz, H.; Forssen, B.-H.; Marklund, J.-E.

    1979-01-01

    UNIDOSE is a program system for calculating the consequences of a radioactive release to the atmosphere. The program is applicable for the computation of dispersion in a range of 0-50 km from the release point. The Gaussian plume model is used for calculating the external dose from activity in the atmosphere and on the ground, and the internal dose via inhalation. Radioactive decay, as well as growth and decay of daughter products, is accounted for. The influence of dry deposition and wash-out is also considered. It is possible to treat time-dependent release rates of 1-24 hours duration and constant release rates for up to one year. The program system also contains routines for the calculation of collective dose and health effects. The system operates in a statistical manner: many weather situations, based on measured data, can be analysed and statistical properties, such as cumulative frequencies, can be calculated. (author)
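
    As a rough illustration of the dispersion step described above, the sketch below evaluates the standard ground-level Gaussian plume concentration with ground reflection. The release rate, wind speed and dispersion parameters are illustrative values only and are not taken from UNIDOSE's own sigma parameterization.

```python
import math

def gaussian_plume_ground(q, u, sigma_y, sigma_z, y, h):
    """Ground-level air concentration (Bq/m^3) from a continuous release.

    q: release rate (Bq/s), u: wind speed (m/s), sigma_y/sigma_z: dispersion
    parameters (m) at the downwind distance of interest, y: crosswind offset (m),
    h: effective release height (m). Ground reflection is included.
    """
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * math.exp(-h**2 / (2 * sigma_z**2)))

# Illustrative numbers only: 1e9 Bq/s release, 5 m/s wind, sigma_y = 200 m and
# sigma_z = 100 m (roughly a few km downwind in neutral conditions), 50 m stack.
print(f"{gaussian_plume_ground(1e9, 5.0, 200.0, 100.0, 0.0, 50.0):.2e} Bq/m^3")
```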

  6. Automated Accounting. Instructor Guide.

    Science.gov (United States)

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  7. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    International Nuclear Information System (INIS)

    Bottaro, Marcio; Nagy, Balazs Vince; Soares, Fernanda Cristina Salvador; Rosendo, Danilo Cabral

    2017-01-01

    Introduction: To analyze edge detection and optical contrast calculation of light field-indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give exact definition for light field edge determination. Methods: Automated light sensor array was used to measure the penumbra zone of the edge in the standard X-ray equipment, while trained and naive human observers were asked to mark the light field edge according to their own determination. Different interpretations of the contrast were then calculated and compared. Results: In contrast to automated measurements of edge definition and detection, measurements by human observers showed large inter-observer variation independent of their training with X-ray equipment. Different contrast calculations considering the different edge definitions gave very different contrast values. Conclusion: As the main conclusion, we propose a more exact edge definition of the X-ray light field, corresponding well to the average human observer's edge determination. The new edge definition method with automated systems would reduce human variability in edge determination. Such errors could potentially affect the approval of X-ray equipment, and also increase the radiation dose. The automated measurement based on human observers’ edge definition and the corresponding contrast calculation may lead to a more precise light field calibration, which enables reduced irradiation doses on radiology patients. (author)
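
    The abstract stresses that different contrast definitions give very different values for the same edge. The sketch below makes that point with two common definitions (Michelson and Weber) applied to a pair of hypothetical illuminance readings; the study's own contrast formulas may differ.

```python
def michelson_contrast(l_max, l_min):
    """(Lmax - Lmin) / (Lmax + Lmin); bounded between 0 and 1."""
    return (l_max - l_min) / (l_max + l_min)

def weber_contrast(l_field, l_background):
    """(Lfield - Lbackground) / Lbackground; unbounded above."""
    return (l_field - l_background) / l_background

# Illustrative illuminance readings (lux), not measurements from the study.
inside_field, outside_field = 120.0, 40.0
print(f"Michelson: {michelson_contrast(inside_field, outside_field):.2f}")
print(f"Weber    : {weber_contrast(inside_field, outside_field):.2f}")
```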

  8. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    Directory of Open Access Journals (Sweden)

    Márcio Bottaro

    Full Text Available Abstract Introduction To analyze edge detection and optical contrast calculation of light field-indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give exact definition for light field edge determination. Methods Automated light sensor array was used to measure the penumbra zone of the edge in the standard X-ray equipment, while trained and naïve human observers were asked to mark the light field edge according to their own determination. Different interpretations of the contrast were then calculated and compared. Results In contrast to automated measurements of edge definition and detection, measurements by human observers showed large inter-observer variation independent of their training with X-ray equipment. Different contrast calculations considering the different edge definitions gave very different contrast values. Conclusion As the main conclusion, we propose a more exact edge definition of the X-ray light field, corresponding well to the average human observer’s edge determination. The new edge definition method with automated systems would reduce human variability in edge determination. Such errors could potentially affect the approval of X-ray equipment, and also increase the radiation dose. The automated measurement based on human observers’ edge definition and the corresponding contrast calculation may lead to a more precise light field calibration, which enables reduced irradiation doses on radiology patients.

  9. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    Energy Technology Data Exchange (ETDEWEB)

    Bottaro, Marcio; Nagy, Balazs Vince; Soares, Fernanda Cristina Salvador; Rosendo, Danilo Cabral, E-mail: marcio@iee.usp.br [Universidade de Sao Paulo (USP), SP (Brazil); Optics and Engineering Informatics, Budapest University of Technology and Economics, Budapest (Hungary)

    2017-04-15

    Introduction: To analyze edge detection and optical contrast calculation of light field-indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give exact definition for light field edge determination. Methods: Automated light sensor array was used to measure the penumbra zone of the edge in the standard X-ray equipment, while trained and naive human observers were asked to mark the light field edge according to their own determination. Different interpretations of the contrast were then calculated and compared. Results: In contrast to automated measurements of edge definition and detection, measurements by human observers showed large inter-observer variation independent of their training with X-ray equipment. Different contrast calculations considering the different edge definitions gave very different contrast values. Conclusion: As the main conclusion, we propose a more exact edge definition of the X-ray light field, corresponding well to the average human observer's edge determination. The new edge definition method with automated systems would reduce human variability in edge determination. Such errors could potentially affect the approval of X-ray equipment, and also increase the radiation dose. The automated measurement based on human observers’ edge definition and the corresponding contrast calculation may lead to a more precise light field calibration, which enables reduced irradiation doses on radiology patients. (author)

  10. Instrument calls and real-time code for laboratory automation

    International Nuclear Information System (INIS)

    Taber, L.; Ames, H.S.; Yamauchi, R.K.; Barton, G.W. Jr.

    1978-01-01

    These programs are the result of a joint Lawrence Livermore Laboratory and Environmental Protection Agency project to automate water quality laboratories. They form the interface between the analytical instruments and the BASIC language programs for data reduction and analysis. They operate on Data General NOVA 840's at Cincinnati and Chicago and on a Data General ECLIPSE C330 at Livermore. The operating system consists of unmodified RDOS, Data General's disk operating system, and Data General's multiuser BASIC modified to provide the instrument CALLs and other functions described. Instruments automated at various laboratories include Technicon AutoAnalyzers, atomic absorption spectrophotometers, total organic carbon analyzers, an emission spectrometer, an electronic balance, sample changers, and an optical spectrophotometer. Other instruments may be automated using these same CALLs, or new CALLs may be written as described

  11. Study of high-performance canonical molecular orbitals calculation for proteins

    Science.gov (United States)

    Hirano, Toshiyuki; Sato, Fumitoshi

    2017-11-01

    The canonical molecular orbital (CMO) calculation can help in understanding chemical properties and reactions in proteins. However, it is difficult to perform CMO calculations of proteins because of the self-consistent field (SCF) convergence problem and the expensive computational cost. To reliably obtain the CMOs of proteins, we carry out research and development of high-performance CMO applications and perform experimental studies. We have proposed the third-generation density-functional calculation method for the SCF calculation, which is more advanced than the file and direct methods. Our method is based on Cholesky decomposition for the two-electron integral calculation and the modified grid-free method for the pure-XC term evaluation. With the third-generation density-functional calculation method, the Coulomb, Fock-exchange, and pure-XC terms can all be evaluated by simple linear-algebraic procedures in the SCF loop. Therefore, good parallel performance in solving the SCF problem can be expected by using a well-optimized linear algebra library such as BLAS on distributed-memory parallel computers. The third-generation density-functional calculation method is implemented in our program, ProteinDF. To compute the electronic structure of a large molecule, it is necessary not only to overcome the expensive computational cost but also to provide a good initial guess for safe SCF convergence. In order to prepare a precise initial guess for a macromolecular system, we have developed the quasi-canonical localized orbital (QCLO) method. The QCLO has the characteristics of both localized and canonical orbitals in a certain region of the molecule. We have succeeded in CMO calculations of proteins by using the QCLO method. For simplified and semi-automated calculation with the QCLO method, we have also developed a Python-based program, QCLObot.
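
    A minimal sketch of the linear-algebra pattern described above, under the assumption of a precomputed, positive-definite two-electron-integral supermatrix: the matrix is Cholesky-decomposed once, and the Coulomb term is then assembled from the factors and the density matrix with two matrix-vector products per SCF cycle. The tiny random system is purely illustrative and unrelated to ProteinDF; production codes typically use a truncated (pivoted) Cholesky factorization to exploit the low rank of the integral matrix, whereas the full factorization here is only for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                      # number of basis functions (illustrative)
npair = n * n

# Fake symmetric positive-definite ERI supermatrix V[(pq),(rs)] = (pq|rs).
A = rng.standard_normal((npair, npair))
V = A @ A.T + npair * np.eye(npair)

# Symmetric density matrix (illustrative).
C = rng.standard_normal((n, n))
D = C @ C.T

# Cholesky decomposition of the ERI supermatrix: V = L L^T.
L = np.linalg.cholesky(V)

# Coulomb matrix J_pq = sum_rs (pq|rs) D_rs, evaluated through the factors:
d_vec = D.reshape(npair)
j_vec = L @ (L.T @ d_vec)          # two matrix-vector products per SCF cycle
J = j_vec.reshape(n, n)

# Check against direct contraction with the full supermatrix.
J_ref = (V @ d_vec).reshape(n, n)
print(np.allclose(J, J_ref))       # True
```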

  12. Automated lung nodule classification following automated nodule detection on CT: A serial approach

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Altman, Michael B.; Wilkie, Joel; Sone, Shusuke; Li, Feng; Doi, Kunio; Roy, Arunabha S.

    2003-01-01

    We have evaluated the performance of an automated classifier applied to the task of differentiating malignant and benign lung nodules in low-dose helical computed tomography (CT) scans acquired as part of a lung cancer screening program. The nodules classified in this manner were initially identified by our automated lung nodule detection method, so that the output of automated lung nodule detection was used as input to automated lung nodule classification. This study begins to narrow the distinction between the 'detection task' and the 'classification task'. Automated lung nodule detection is based on two- and three-dimensional analyses of the CT image data. Gray-level-thresholding techniques are used to identify initial lung nodule candidates, for which morphological and gray-level features are computed. A rule-based approach is applied to reduce the number of nodule candidates that correspond to non-nodules, and the features of remaining candidates are merged through linear discriminant analysis to obtain final detection results. Automated lung nodule classification merges the features of the lung nodule candidates identified by the detection algorithm that correspond to actual nodules through another linear discriminant classifier to distinguish between malignant and benign nodules. The automated classification method was applied to the computerized detection results obtained from a database of 393 low-dose thoracic CT scans containing 470 confirmed lung nodules (69 malignant and 401 benign nodules). Receiver operating characteristic (ROC) analysis was used to evaluate the ability of the classifier to differentiate between nodule candidates that correspond to malignant nodules and nodule candidates that correspond to benign lesions. The area under the ROC curve for this classification task attained a value of 0.79 during a leave-one-out evaluation
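
    The classification step described above merges candidate features through a linear discriminant and is evaluated with leave-one-out ROC analysis. The sketch below reproduces that pattern on synthetic feature vectors using scikit-learn (assumed available); it is an illustration of the technique, not the authors' code or data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import roc_auc_score

# Synthetic morphological/gray-level features for nodule candidates:
# label 1 = malignant, 0 = benign (illustrative data only).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (60, 5)), rng.normal(0.8, 1.0, (40, 5))])
y = np.array([0] * 60 + [1] * 40)

# Leave-one-out evaluation: train on all but one candidate, score the held-out one.
scores = np.empty(len(y))
for train_idx, test_idx in LeaveOneOut().split(X):
    lda = LinearDiscriminantAnalysis().fit(X[train_idx], y[train_idx])
    scores[test_idx] = lda.decision_function(X[test_idx])

print(f"leave-one-out ROC AUC = {roc_auc_score(y, scores):.2f}")
```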

  13. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    Science.gov (United States)

    2018-01-01

    ARL-TR-8284 ● JAN 2018 US Army Research Laboratory Semi-Automated Processing of Trajectory Simulator Output Files for Model...Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT...although some minor changes may be needed. The program processes a GTRAJ output text file that contains results from 2 or more simulations , where each

  14. Bayesian ISOLA: new tool for automated centroid moment tensor inversion

    Science.gov (United States)

    Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John

    2017-04-01

    Focal mechanisms are important for understanding the seismotectonics of a region, and they serve as a basic input for seismic hazard assessment. Usually, the point source approximation and the moment tensor (MT) are used. We have developed a new, fully automated tool for centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection in which station components with instrumental disturbances or low signal-to-noise ratios are rejected, and full-waveform inversion in a space-time grid around a provided hypocenter. The method is innovative in the following aspects: (i) The CMT inversion is fully automated; no user interaction is required, although the details of the process can be visually inspected later on the many figures which are automatically plotted. (ii) The automated process includes detection of disturbances based on the MouseTrap code, so disturbed recordings do not affect the inversion. (iii) A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequencies. (iv) A Bayesian approach is used, so not only the best solution is obtained, but also the posterior probability density function. (v) A space-time grid search effectively combined with the least-squares inversion of the moment tensor components speeds up the inversion and allows more accurate results to be obtained compared to stochastic methods. The method has been tested on synthetic and observed data. It has been tested by comparison with manually processed moment tensors of all events with M≥3 in the Swiss catalogue over 16 years using data available at the Swiss data center (http://arclink.ethz.ch). The quality of the results of the presented automated process is comparable with careful manual processing of data. The software package programmed in Python has been designed to be as versatile as possible in

  15. TU-AB-201-02: An Automated Treatment Plan Quality Assurance Program for Tandem and Ovoid High Dose-Rate Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Tan, J; Shi, F; Hrycushko, B; Medin, P; Stojadinovic, S; Pompos, A; Yang, M; Albuquerque, K; Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2015-06-15

    Purpose: For tandem and ovoid (T&O) HDR brachytherapy in our clinic, it is required that the planning physicist manually capture ∼10 images during planning, perform a secondary dose calculation and generate a report, combine them into a single PDF document, and upload it to a record-and-verify system to prove to an independent plan checker that the case was planned correctly. Not only does this slow down the already time-consuming clinical workflow, the PDF document also limits the number of parameters that can be checked. To solve these problems, we have developed a web-based automatic quality assurance (QA) program. Methods: We set up a QA server accessible through a web interface. A T&O plan and CT images are exported as DICOM-RT files and uploaded to the server. The software checks 13 geometric features, e.g. whether the dwell positions are reasonable, and 10 dosimetric features, e.g. secondary dose calculations via the TG-43 formalism and D2cc to critical structures. A PDF report is automatically generated with errors and potential issues highlighted. It also contains images showing important geometric and dosimetric aspects to prove the plan was created following standard guidelines. Results: The program has been clinically implemented in our clinic. In each of the 58 T&O plans we tested, a 14-page QA report was automatically generated. It took ∼45 sec to export the plan and CT images and ∼30 sec to perform the QA tests and generate the report. In contrast, our manual QA document preparation took on average ∼7 minutes under optimal conditions and up to 20 minutes when mistakes were made during the document assembly. Conclusion: We have tested the efficiency and effectiveness of an automated process for treatment plan QA of HDR T&O cases. This software was shown to improve the workflow compared to our conventional manual approach.
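
    The secondary dose calculation mentioned under Methods uses the TG-43 formalism. As a hedged illustration, the sketch below evaluates the TG-43 point-source approximation of the dose rate; the air-kerma strength, dose-rate constant, radial dose function and anisotropy factor are illustrative placeholders rather than consensus Ir-192 data, and the QA software's actual check is more detailed.

```python
def tg43_point_dose_rate(sk, dose_rate_constant, r_cm, g_r, phi_an):
    """TG-43 point-source approximation of the dose rate (cGy/h).

    sk: air-kerma strength (U), dose_rate_constant: Lambda (cGy h^-1 U^-1),
    r_cm: distance from the source (cm), g_r: radial dose function at r,
    phi_an: 1D anisotropy factor at r.  Reference distance r0 = 1 cm.
    """
    r0 = 1.0
    return sk * dose_rate_constant * (r0 / r_cm) ** 2 * g_r * phi_an

# Illustrative values only (NOT consensus Ir-192 data): a 40000 U source,
# Lambda = 1.11 cGy/(h U), g(2 cm) ~ 1.00, phi_an(2 cm) ~ 0.98.
print(f"{tg43_point_dose_rate(40000.0, 1.11, 2.0, 1.00, 0.98):.0f} cGy/h at 2 cm")
```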

  16. TU-AB-201-02: An Automated Treatment Plan Quality Assurance Program for Tandem and Ovoid High Dose-Rate Brachytherapy

    International Nuclear Information System (INIS)

    Tan, J; Shi, F; Hrycushko, B; Medin, P; Stojadinovic, S; Pompos, A; Yang, M; Albuquerque, K; Jia, X

    2015-01-01

    Purpose: For tandem and ovoid (T&O) HDR brachytherapy in our clinic, it is required that the planning physicist manually capture ∼10 images during planning, perform a secondary dose calculation and generate a report, combine them into a single PDF document, and upload it to a record-and-verify system to prove to an independent plan checker that the case was planned correctly. Not only does this slow down the already time-consuming clinical workflow, the PDF document also limits the number of parameters that can be checked. To solve these problems, we have developed a web-based automatic quality assurance (QA) program. Methods: We set up a QA server accessible through a web interface. A T&O plan and CT images are exported as DICOM-RT files and uploaded to the server. The software checks 13 geometric features, e.g. whether the dwell positions are reasonable, and 10 dosimetric features, e.g. secondary dose calculations via the TG-43 formalism and D2cc to critical structures. A PDF report is automatically generated with errors and potential issues highlighted. It also contains images showing important geometric and dosimetric aspects to prove the plan was created following standard guidelines. Results: The program has been clinically implemented in our clinic. In each of the 58 T&O plans we tested, a 14-page QA report was automatically generated. It took ∼45 sec to export the plan and CT images and ∼30 sec to perform the QA tests and generate the report. In contrast, our manual QA document preparation took on average ∼7 minutes under optimal conditions and up to 20 minutes when mistakes were made during the document assembly. Conclusion: We have tested the efficiency and effectiveness of an automated process for treatment plan QA of HDR T&O cases. This software was shown to improve the workflow compared to our conventional manual approach

  17. An automated thermoluminescence dosimetry (TLD) system

    International Nuclear Information System (INIS)

    Kicken, P.J.H.; Huyskens, C.J.

    1979-01-01

    In the Health Physics Division of the Eindhoven University of Technology, work is in progress on the development of an automated TLD system. Process automation, statistical computation, dose calculation and dose recording are carried out using a microcomputer and a floppy disk unit. The main features of this TLD system are its low cost, flexibility, ease of operation, and suitability for use in routine dosimetry as well as in complex TLD research. Because of its modular set-up, several components of the system are multifunctional in other operations. The system seems suited for medium-sized health physics groups. (Auth.)

  18. Computer automation in veterinary hospitals.

    Science.gov (United States)

    Rogers, H

    1996-05-01

    Computers have been used to automate complex and repetitive tasks in veterinary hospitals since the 1960s. Early systems were expensive, but their use was justified because they performed jobs which would have been impossible or which would have required greater resources in terms of time and personnel had they been performed by other methods. Systems found in most veterinary hospitals today are less costly, magnitudes more capable, and often underused. Modern multitasking operating systems and graphical interfaces bring many opportunities for automation. Commercial and custom programs developed and used in a typical multidoctor mixed species veterinary practice are described.

  19. MOST-7 program for calculation of nonstationary operation modes of the nuclear steam generating plant with WWER

    International Nuclear Information System (INIS)

    Mysenkov, A.I.

    1979-01-01

    The MOST-7 program, intended for calculating nonstationary emergency modes of a nuclear steam generating plant (NSGP) with a WWER reactor, is considered in detail. The program consists of the main MOST-7 subprogram, two main subprograms and 98 function subprograms. The MOST-7 program is written in FORTRAN and runs on the BESM-6 computer. The program occupies 73400 words of storage on the BESM-6. Primary information is input into the program from punched cards by means of an input operator and through DATA operators. The parameter lists introduced both from punched cards and by means of DATA operators are tabulated. The procedure for outputting calculational results to printing and plotting devices is considered. An example is given of the calculation of the nonstationary process related to the loss of power to the six main circulating pumps of an NSGP with the WWER-440 reactor

  20. Calculations of Financial Incentives for Providers in a Pay-for-Performance Program: Manual Review Versus Data From Structured Fields in Electronic Health Records.

    Science.gov (United States)

    Urech, Tracy H; Woodard, LeChauncy D; Virani, Salim S; Dudley, R Adams; Lutschg, Meghan Z; Petersen, Laura A

    2015-10-01

    Hospital report cards and financial incentives linked to performance require clinical data that are reliable, appropriate, timely, and cost-effective to process. Pay-for-performance plans are transitioning to automated electronic health record (EHR) data as an efficient method to generate data needed for these programs. To determine how well data from automated processing of structured fields in the electronic health record (AP-EHR) reflect data from manual chart review and the impact of these data on performance rewards. Cross-sectional analysis of performance measures used in a cluster randomized trial assessing the impact of financial incentives on guideline-recommended care for hypertension. A total of 2840 patients with hypertension assigned to participating physicians at 12 Veterans Affairs hospital-based outpatient clinics. Fifty-two physicians and 33 primary care personnel received incentive payments. Overall, positive and negative agreement indices and Cohen's kappa were calculated for assessments of guideline-recommended antihypertensive medication use, blood pressure (BP) control, and appropriate response to uncontrolled BP. Pearson's correlation coefficient was used to assess how similar participants' calculated earnings were between the data sources. By manual chart review data, 72.3% of patients were considered to have received guideline-recommended antihypertensive medications compared with 65.0% by AP-EHR review (κ=0.51). Manual review indicated 69.5% of patients had controlled BP compared with 66.8% by AP-EHR review (κ=0.87). Compared with 52.2% of patients per the manual review, 39.8% received an appropriate response by AP-EHR review (κ=0.28). Participants' incentive payments calculated using the 2 methods were highly correlated (r≥0.98). Using the AP-EHR data to calculate earnings, participants' payment changes ranged from a decrease of $91.00 (-30.3%) to an increase of $18.20 (+7.4%) for medication use (interquartile range, -14.4% to 0
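
    The agreement statistics named above can be reproduced with a few lines of code. The sketch below computes Cohen's kappa for two binary raters on a hypothetical set of paired manual-review and AP-EHR judgments; the data are invented for illustration and are not the study's.

```python
from collections import Counter

def cohens_kappa(pairs):
    """Cohen's kappa for two binary raters given (rater1, rater2) pairs."""
    n = len(pairs)
    counts = Counter(pairs)
    p_observed = (counts[(1, 1)] + counts[(0, 0)]) / n
    p1_yes = (counts[(1, 1)] + counts[(1, 0)]) / n
    p2_yes = (counts[(1, 1)] + counts[(0, 1)]) / n
    p_expected = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical manual-review vs AP-EHR judgments (1 = guideline met, 0 = not met).
manual = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
ap_ehr = [1, 1, 0, 0, 1, 0, 1, 1, 1, 1]
print(f"kappa = {cohens_kappa(list(zip(manual, ap_ehr))):.2f}")
```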

  1. REITP3-Hazard evaluation program for heat release based on thermochemical calculation

    Energy Technology Data Exchange (ETDEWEB)

    Akutsu, Yoshiaki.; Tamura, Masamitsu. [The University of Tokyo, Tokyo (Japan). School of Engineering; Kawakatsu, Yuichi. [Oji Paper Corp., Tokyo (Japan); Wada, Yuji. [National Institute for Resources and Environment, Tsukuba (Japan); Yoshida, Tadao. [Hosei University, Tokyo (Japan). College of Engineering

    1999-06-30

    REITP3, a hazard evaluation program for heat release based on thermochemical calculation, has been developed by modifying REITP2 (Revised Estimation of Incompatibility from Thermochemical Properties). The main modifications are as follows. (1) Reactants are retrieved from the database by chemical formula. (2) As products are listed in an external file, the addition of products and changes in the order of products can easily be made. (3) Part of the program has been changed with its use on a personal computer or workstation in mind. These modifications will promote the usefulness of the program for energy hazard evaluation. (author)
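
    The core of a thermochemical heat-release estimate of this kind is the standard-state reaction enthalpy obtained from heats of formation. The sketch below applies that textbook relation to the decomposition of hydrogen peroxide; the heats of formation are common textbook values rather than entries from the REITP3 database, and the program's actual product-selection logic is more elaborate.

```python
# Standard-state reaction enthalpy from heats of formation (kJ/mol):
# dH_rxn = sum(n_i * dHf_products) - sum(n_j * dHf_reactants).
# Values are textbook numbers, not entries from the REITP3 database.
heats_of_formation = {"H2O2(l)": -187.8, "H2O(l)": -285.8, "O2(g)": 0.0}

# Decomposition: 2 H2O2(l) -> 2 H2O(l) + O2(g)
reactants = {"H2O2(l)": 2}
products = {"H2O(l)": 2, "O2(g)": 1}

dh_rxn = (sum(n * heats_of_formation[s] for s, n in products.items())
          - sum(n * heats_of_formation[s] for s, n in reactants.items()))
print(f"reaction enthalpy = {dh_rxn:.1f} kJ per 2 mol H2O2 (negative = exothermic)")
```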

  2. Logistics Automation Master Plan (LAMP). Better Logistics Support through Automation.

    Science.gov (United States)

    1983-06-01

    ...productivity and efficiency of DARCOM human resources through the design, development, and deployment of workspace automation tools. 16. Develop Area Oriented... Purpose and...screen, video disc machine and a microcomputer. Pressure from a human hand or light pen on the user-friendly screen tells the computer to retrieve

  3. Development of codes for physical calculations of WWER

    International Nuclear Information System (INIS)

    Novikov, A.N.

    2000-01-01

    A package of codes for physical calculations of WWER reactors used at the RRC 'Kurchatov Institute' is discussed, including the purpose of these codes, the approximations used, the degree of data verification, the possibilities for automation of calculations and presentation of results, and trends in the further development of the codes. (Authors)

  4. ASOP, Shield Calculation, 1-D, Discrete Ordinates Transport

    International Nuclear Information System (INIS)

    1993-01-01

    1 - Nature of physical problem solved: ASOP is a shield optimization calculational system based on the one-dimensional discrete ordinates transport program ANISN. It has been used to design optimum shields for space applications of SNAP zirconium-hydride-uranium- fueled reactors and uranium-oxide fueled thermionic reactors and to design beam stops for the ORELA facility. 2 - Method of solution: ASOP generates coefficients of linear equations describing the logarithm of the dose and dose-weight derivatives as functions of position from data obtained in an automated sequence of ANISN calculations. With the dose constrained to a design value and all dose-weight derivatives required to be equal, the linear equations may be solved for a new set of shield dimensions. Since changes in the shield dimensions may cause the linear functions to change, the entire procedure is repeated until convergence is obtained. The detailed calculations of the radiation transport through shield configurations for every step in the procedure distinguish ASOP from other shield optimization computer code systems which rely on multiple component sources and attenuation coefficients to describe the transport. 3 - Restrictions on the complexity of the problem: Problem size is limited only by machine size

  5. TRAFIC, a computer program for calculating the release of metallic fission products from an HTGR core

    International Nuclear Information System (INIS)

    Smith, P.D.

    1978-02-01

    A special purpose computer program, TRAFIC, is presented for calculating the release of metallic fission products from an HTGR core. The program is based upon Fick's law of diffusion for radioactive species. One-dimensional transient diffusion calculations are performed for the coated fuel particles and for the structural graphite web. A quasi steady-state calculation is performed for the fuel rod matrix material. The model accounts for nonlinear adsorption behavior in the fuel rod gap and on the coolant hole boundary. The TRAFIC program is designed to operate in a core survey mode; that is, it performs many repetitive calculations for a large number of spatial locations in the core. This is necessary in order to obtain an accurate volume integrated release. For this reason the program has been designed with calculational efficiency as one of its main objectives. A highly efficient numerical method is used in the solution. The method makes use of the Duhamel superposition principle to eliminate interior spatial solutions from consideration. Linear response functions relating the concentrations and mass fluxes on the boundaries of a homogeneous region are derived. Multiple regions are numerically coupled through interface conditions. Algebraic elimination is used to reduce the equations as far as possible. The problem reduces to two nonlinear equations in two unknowns, which are solved using a Newton Raphson technique
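
    The abstract states that the problem finally reduces to two nonlinear equations in two unknowns solved by a Newton-Raphson technique. The sketch below shows a generic two-dimensional Newton-Raphson iteration on a placeholder system; TRAFIC's actual boundary equations are not given in the abstract.

```python
import numpy as np

def newton_raphson_2x2(f, jac, x0, tol=1e-10, max_iter=50):
    """Solve f(x) = 0 for x in R^2 with Newton-Raphson iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(jac(x), f(x))   # J(x) * step = f(x)
        x -= step
        if np.linalg.norm(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Placeholder system standing in for TRAFIC's two boundary equations.
f = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 4.0, np.exp(x[0]) + x[1] - 1.0])
jac = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [np.exp(x[0]), 1.0]])
print(newton_raphson_2x2(f, jac, x0=[-1.0, 1.0]))
```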

  6. EPCARD (European Program Package for the Calculation of Aviation Route Doses). User's manual for version 3.2

    International Nuclear Information System (INIS)

    Schraube, H.; Leuthold, G.P.; Schraube, G.; Heinrich, W.; Roesler, S.; Mares, V.

    2002-01-01

    The GSF-National Research Center has developed the computer program EPCARD (European Program Package for the Calculation of Aviation Route Doses) jointly with scientists from Siegen University. With the program it is possible to calculate the radiation dose received by individuals along any aviation route at flight altitudes between 5000 m and 25000 m, both in terms of 'ambient dose equivalent' and 'effective dose'. Dose rates at any point in the atmosphere may be calculated for comparison with verification experiments, as well as simulated instrument readings, if the response characteristics of the instruments are known. The program fulfills the requirements of the European Council Directive 96/29/EURATOM and of the subsequent European national regulations. This report contains essentially all the information necessary to run EPCARDv3.2 from a standard PC. The program structure is depicted and the file structure is described in detail, which permits the calculation of the large number of data sets needed for the daily record keeping of airline crews and other frequently flying persons. Additionally, some information is given on the basic physical data, which is available from referenced publications. (orig.)

  7. Web-based automation of green building rating index and life cycle cost analysis

    Science.gov (United States)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    The sudden decline in financial markets and the economic meltdown have slowed adoption and lowered investors' interest in green-certified buildings because of their higher initial costs. It is therefore essential to attract investors' attention towards further development of green buildings through automated tools for construction projects. However, there is a historical dearth of work on the automation of green building rating tools, which constitutes an essential gap and motivates the development of an automated computerized programming tool. This paper presents proposed research that aims to develop an integrated web-based automated computerized program that applies a green building rating assessment tool, green technology and life cycle cost (LCC) analysis. It also aims to identify the variables of MyCrest and LCC to be integrated and developed in a framework, which is then transformed into the automated program. A mixed methodology of qualitative and quantitative surveys is planned to carry the MyCrest-LCC integration to an automated level. In this study, the preliminary literature review enriches the understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute towards green buildings and future agendas.
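
    A life cycle cost analysis of the kind referred to above typically reduces to a net-present-value comparison, as in the sketch below; the discount rate, study horizon and cost figures are hypothetical and are not tied to MyCrest or to the proposed tool.

```python
def life_cycle_cost(initial_cost, annual_cost, discount_rate, years, salvage=0.0):
    """Net present value of a building's life cycle cost."""
    pv_operating = sum(annual_cost / (1.0 + discount_rate) ** t
                       for t in range(1, years + 1))
    pv_salvage = salvage / (1.0 + discount_rate) ** years
    return initial_cost + pv_operating - pv_salvage

# Hypothetical comparison: green building with a higher first cost but lower
# annual energy/maintenance cost, over a 30-year horizon at a 5% discount rate.
conventional = life_cycle_cost(1_000_000, 60_000, 0.05, 30)
green        = life_cycle_cost(1_150_000, 42_000, 0.05, 30)
print(f"conventional LCC: {conventional:,.0f}")
print(f"green LCC       : {green:,.0f}")
```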

  8. A comparison of recent results from HONDO III with the JSME nuclear shipping cask benchmark calculations

    International Nuclear Information System (INIS)

    Key, S.W.

    1985-01-01

    The results of two calculations related to the impact response of spent nuclear fuel shipping casks are compared to the benchmark results reported in a recent study by the Japan Society of Mechanical Engineers Subcommittee on Structural Analysis of Nuclear Shipping Casks. Two idealized impacts are considered. The first calculation utilizes a right circular cylinder of lead subjected to a 9.0 m free fall onto a rigid target, while the second calculation utilizes a stainless steel clad cylinder of lead subjected to the same impact conditions. For the first problem, four calculations from graphical results presented in the original study have been singled out for comparison with HONDO III. The results from DYNA3D, STEALTH, PISCES, and ABAQUS are reproduced. In the second problem, the results from four separate computer programs in the original study, ABAQUS, ANSYS, MARC, and PISCES, are used and compared with HONDO III. The current version of HONDO III contains a fully automated implementation of the explicit-explicit partitioning procedure for the central difference method time integration which results in a reduction of computational effort by a factor in excess of 5. The results reported here further support the conclusion of the original study that the explicit time integration schemes with automated time incrementation are effective and efficient techniques for computing the transient dynamic response of nuclear fuel shipping casks subject to impact loading. (orig.)

  9. A computer program for calculation of the fuel cycle in pressurized water reactors

    International Nuclear Information System (INIS)

    Solanilla, R.

    1976-01-01

    The purpose of the FUCEFURE program is two-fold. First, it is designed to solve the problem of nuclear fuel cycle cost in a single pressurized light water reactor calculation. The code was developed primarily for comparative and sensitivity studies. The program contains simple correlations between exposure and available depletion data used to predict the uranium and plutonium content of the fuel as a function of the fuel's initial enrichment. Second, it has been devised to evaluate the nuclear fuel demand associated with an expanding nuclear power system. The evaluation can be carried out at any time and stage in the fuel cycle. The program can calculate the natural uranium and separative work requirements for any final and tails enrichment. It also can determine the nuclear power share of each reactor in the system when a decision has been made about the long-term nuclear power installations to be used and the types of PWR and fast breeder reactor characteristics to be involved in them. (author)
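
    The natural uranium and separative work requirements mentioned above follow from the standard value-function formulas of enrichment cascade theory. The sketch below applies them to an illustrative enrichment case; the numbers are not output from FUCEFURE.

```python
import math

def value_function(x):
    """SWU value function V(x) for assay x (weight fraction U-235)."""
    return (2.0 * x - 1.0) * math.log(x / (1.0 - x))

def feed_and_swu(product_kg, x_product, x_feed, x_tails):
    """Natural-uranium feed (kg) and separative work (kg SWU) per product batch."""
    feed = product_kg * (x_product - x_tails) / (x_feed - x_tails)
    tails = feed - product_kg
    swu = (product_kg * value_function(x_product)
           + tails * value_function(x_tails)
           - feed * value_function(x_feed))
    return feed, swu

# Illustrative case: 1000 kg of 4.0% enriched product from 0.711% natural
# uranium with 0.25% tails.
feed, swu = feed_and_swu(1000.0, 0.040, 0.00711, 0.0025)
print(f"feed = {feed:.0f} kg U, separative work = {swu:.0f} kg SWU")
```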

  10. Model calculations as one means of satisfying the neutron cross-section requirements of the CTR program

    International Nuclear Information System (INIS)

    Gardner, D.G.

    1975-01-01

    A large amount of cross section and spectral information for neutron-induced reactions will be required for the CTR design program. To undertake to provide the required data through a purely experimental measurement program alone may not be the most efficient way of attacking the problem. It is suggested that a preliminary theoretical calculation be made of all relevant reactions on the dozen or so elements that now seem to comprise the inventory of possible construction materials to find out which are actually important, and over what energy ranges they are important. A number of computer codes for calculating cross sections for neutron induced reactions have been evaluated and extended. These will be described and examples will be given of various types of calculations of interest to the CTR program. (U.S.)

  11. Automation through the PIP [Program Implementation Plan] concurrence system improves information sharing among DOE [Dept. of Energy] managers

    International Nuclear Information System (INIS)

    Imholz, R.M.; Berube, D.S.; Peterson, J.L.

    1990-01-01

    The Program Implementation Plan (PIP) Concurrence System is designed to improve information sharing within the U.S. Department of Energy (DOE) and between DOE and the Field. Effectively sharing information enables DOE managers to make more informed, effective decisions. The PIP Concurrence System improved information sharing among DOE managers by defining an automated process for concurring on a DOE document, reducing the time required to concur on the document by 75%. The first step in defining an automated process is to structure the process for concurring on a document. Only those DOE managers with approved access can review certain parts of a document on the concurrence system. The concurrence process is a sign-off procedure, unlike the commentary process, in which comments need not be restricted to certain people. The commentary process is the beginning of the concurrence process: the commentary process builds a document; the concurrence process approves it. 6 refs., 7 figs.

  12. Workload Capacity: A Response Time-Based Measure of Automation Dependence.

    Science.gov (United States)

    Yamani, Yusuke; McCarley, Jason S

    2016-05-01

    An experiment used the workload capacity measure C(t) to quantify the processing efficiency of human-automation teams and identify operators' automation usage strategies in a speeded decision task. Although response accuracy rates and related measures are often used to measure the influence of an automated decision aid on human performance, aids can also influence response speed. Mean response times (RTs), however, conflate the influence of the human operator and the automated aid on team performance and may mask changes in the operator's performance strategy under aided conditions. The present study used a measure of parallel processing efficiency, or workload capacity, derived from empirical RT distributions as a novel gauge of human-automation performance and automation dependence in a speeded task. Participants performed a speeded probabilistic decision task with and without the assistance of an automated aid. RT distributions were used to calculate two variants of a workload capacity measure, COR(t) and CAND(t). Capacity measures gave evidence that a diagnosis from the automated aid speeded human participants' responses, and that participants did not moderate their own decision times in anticipation of diagnoses from the aid. Workload capacity provides a sensitive and informative measure of human-automation performance and operators' automation dependence in speeded tasks. © 2016, Human Factors and Ergonomics Society.
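
    For readers unfamiliar with the measure, the sketch below computes a race-model (OR) workload capacity coefficient C_OR(t) from empirical response-time samples via cumulative hazard functions; the channel labels and RT distributions are hypothetical stand-ins, not the study's actual aided and unaided conditions.

      # Hedged sketch of an OR workload capacity coefficient C_OR(t) computed
      # from empirical response-time samples; all RT data below are simulated.
      import numpy as np

      def cumulative_hazard(rts, t_grid):
          """H(t) = -log S(t) from the empirical survivor function of the RTs."""
          rts = np.asarray(rts, dtype=float)
          surv = np.array([(rts > t).mean() for t in t_grid])
          surv = np.clip(surv, 1e-12, 1.0)          # avoid log(0) in the tail
          return -np.log(surv)

      def c_or(rts_team, rts_a, rts_b, t_grid):
          """C_OR(t) = H_team(t) / (H_a(t) + H_b(t)); 1.0 means unlimited capacity."""
          denom = cumulative_hazard(rts_a, t_grid) + cumulative_hazard(rts_b, t_grid)
          with np.errstate(divide="ignore", invalid="ignore"):
              return np.where(denom > 0,
                              cumulative_hazard(rts_team, t_grid) / denom, np.nan)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          # hypothetical RT samples (seconds): two single-channel baselines and the team
          rts_a = rng.gamma(4.0, 0.15, 500)
          rts_b = rng.gamma(4.0, 0.15, 500)
          rts_team = np.minimum(rng.gamma(4.0, 0.15, 500), rng.gamma(4.0, 0.15, 500))
          t_grid = np.linspace(0.2, 1.5, 14)
          print(np.round(c_or(rts_team, rts_a, rts_b, t_grid), 2))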

  13. A model for steady-state and transient determination of subcooled boiling for calculations coupling a thermohydraulic and a neutron physics calculation program for reactor core calculation

    International Nuclear Information System (INIS)

    Mueller, R.G.

    1987-06-01

    Due to the strong influence of vapour bubbles on the nuclear chain reaction, an exact calculation of neutron physics and thermal hydraulics in light water reactors requires consideration of subcooled boiling. To this purpose, in the present study a dynamic model is derived from the time-dependent conservation equations. It contains new methods for the time-dependent determination of evaporation and condensation heat flow and for the heat transfer coefficient in subcooled boiling. Furthermore, it enables the complete two-phase flow region to be treated in a consistent manner. The calculation model was verified using measured data of experiments covering a wide range of thermodynamic boundary conditions. In all cases very good agreement was reached. The results from the coupling of the new calculation model with a neutron kinetics program proved its suitability for the steady-state and transient calculation of reactor cores. (orig.) [de

  14. Automated absolute activation analysis with californium-252 sources

    International Nuclear Information System (INIS)

    MacMurdo, K.W.; Bowman, W.W.

    1978-09-01

    A 100-mg ²⁵²Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a 6 half-life group model of delayed neutron emission; calculations include corrections for delayed neutron interference from ¹⁷O. Detection sensitivities of ²³⁹Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppm level.
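
    A minimal sketch of the absolute conversion step described above - turning a net photopeak area into an elemental mass from irradiation, decay and counting times, a capture cross section, flux, detector efficiency and decay data - is given below; it is not the Savannah River code, and every numerical input is an illustrative placeholder.

      # Hedged sketch of the absolute activation-analysis relation: invert
      # counts = (m/M) N_A a sigma phi eps I (1-e^-lt_i) e^-lt_d (1-e^-lt_c)/l for m.
      import math

      N_AVOGADRO = 6.02214076e23

      def mass_from_peak_area(peak_area, flux, sigma_cm2, atomic_mass,
                              isotopic_abundance, gamma_intensity, efficiency,
                              half_life_s, t_irradiate, t_decay, t_count):
          lam = math.log(2.0) / half_life_s
          saturation = 1.0 - math.exp(-lam * t_irradiate)    # activation build-up
          decay = math.exp(-lam * t_decay)                   # cooling before counting
          counting = (1.0 - math.exp(-lam * t_count)) / lam  # decays during the count
          counts_per_gram = (N_AVOGADRO / atomic_mass) * isotopic_abundance * \
              sigma_cm2 * flux * gamma_intensity * efficiency * \
              saturation * decay * counting
          return peak_area / counts_per_gram

      if __name__ == "__main__":
          # illustrative numbers only (roughly Mn-56 produced from Mn-55)
          grams = mass_from_peak_area(peak_area=5.0e4, flux=1.0e9, sigma_cm2=13.3e-24,
                                      atomic_mass=54.94, isotopic_abundance=1.0,
                                      gamma_intensity=0.989, efficiency=0.02,
                                      half_life_s=2.579 * 3600,
                                      t_irradiate=600, t_decay=300, t_count=600)
          print(f"estimated element mass: {grams:.3e} g")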

  15. PABLM: a computer program to calculate accumulated radiation doses from radionuclides in the environment

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Kennedy, W.E. Jr.; Soldat, J.K.

    1980-03-01

    A computer program, PABLM, was written to facilitate the calculation of internal radiation doses to man from radionuclides in food products and external radiation doses from radionuclides in the environment. This report contains details of mathematical models used and calculational procedures required to run the computer program. Radiation doses from radionuclides in the environment may be calculated from deposition on the soil or plants during an atmospheric or liquid release, or from exposure to residual radionuclides in the environment after the releases have ended. Radioactive decay is considered during the release of radionuclides, after they are deposited on the plants or ground, and during holdup of food after harvest. The radiation dose models consider several exposure pathways. Doses may be calculated for either a maximum-exposed individual or for a population group. The doses calculated are accumulated doses from continuous chronic exposure. A first-year committed dose is calculated as well as an integrated dose for a selected number of years. The equations for calculating internal radiation doses are derived from those given by the International Commission on Radiological Protection (ICRP) for body burdens and MPC's of each radionuclide. The radiation doses from external exposure to contaminated water and soil are calculated using the basic assumption that the contaminated medium is large enough to be considered an infinite volume or plane relative to the range of the emitted radiations. The equations for calculations of the radiation dose from external exposure to shoreline sediments include a correction for the finite width of the contaminated beach.
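
    The following is a minimal sketch, not the PABLM models themselves, of the kind of accumulated-dose integral described above: chronic exposure to a deposited radionuclide whose dose rate decays away after the release ends. The initial dose rate and half-life used are hypothetical.

      # Minimal sketch: accumulated dose from a decaying chronic exposure,
      # D(T) = integral of R0 * exp(-lambda t) from 0 to T.
      import math

      def accumulated_dose(initial_rate_per_yr, half_life_yr, years):
          """Accumulated dose in the same units as initial_rate_per_yr * years."""
          lam = math.log(2.0) / half_life_yr
          return initial_rate_per_yr * (1.0 - math.exp(-lam * years)) / lam

      if __name__ == "__main__":
          # e.g. a hypothetical initial dose rate of 0.5 mrem/yr from a nuclide
          # with a 30-year half-life, accumulated over several horizons
          for horizon in (1, 10, 50):
              print(horizon, "yr:", round(accumulated_dose(0.5, 30.0, horizon), 3), "mrem")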

  16. PABLM: a computer program to calculate accumulated radiation doses from radionuclides in the environment

    International Nuclear Information System (INIS)

    Napier, B.A.; Kennedy, W.E. Jr.; Soldat, J.K.

    1980-03-01

    A computer program, PABLM, was written to facilitate the calculation of internal radiation doses to man from radionuclides in food products and external radiation doses from radionuclides in the environment. This report contains details of mathematical models used and calculational procedures required to run the computer program. Radiation doses from radionuclides in the environment may be calculated from deposition on the soil or plants during an atmospheric or liquid release, or from exposure to residual radionuclides in the environment after the releases have ended. Radioactive decay is considered during the release of radionuclides, after they are deposited on the plants or ground, and during holdup of food after harvest. The radiation dose models consider several exposure pathways. Doses may be calculated for either a maximum-exposed individual or for a population group. The doses calculated are accumulated doses from continuous chronic exposure. A first-year committed dose is calculated as well as an integrated dose for a selected number of years. The equations for calculating internal radiation doses are derived from those given by the International Commission on Radiological Protection (ICRP) for body burdens and MPC's of each radionuclide. The radiation doses from external exposure to contaminated water and soil are calculated using the basic assumption that the contaminated medium is large enough to be considered an infinite volume or plane relative to the range of the emitted radiations. The equations for calculations of the radiation dose from external exposure to shoreline sediments include a correction for the finite width of the contaminated beach

  17. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    Energy Technology Data Exchange (ETDEWEB)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Meinel, Felix G.; Geyer, Lucas L. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Ludwig-Maximilians-University Hospital, Institute for Clinical Radiology, Munich (Germany); Schoepf, U.J. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Apfaltrer, Paul [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University Medical Center Mannheim, Medical Faculty Mannheim - Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, Mannheim (Germany); Canstein, Christian [Siemens Medical Solutions USA, Inc., Malvern, PA (United States); De Cecco, Carlo Nicola [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome 'Sapienza' - Polo Pontino, Department of Radiological Sciences, Oncology and Pathology, Latina (Italy)

    2014-02-15

    This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV_M segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV_A calculation (automated approach). EFV and time required for measuring EFV (including software processing time and manual optimization time) for each method were recorded. Intraobserver and interobserver reliability was assessed on the prototype software measurements. The t test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV_A (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970, P < 0.001). A mean of 3.9 ± 1.9 manual border edits were performed to optimize the automated process. The software prototype required significantly less time to perform the measurements (135.6 ± 24.6 s vs. 314.3 ± 76.3 s, P < 0.001) and showed high reliability (ICC > 0.9). Automated EFV_A quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)

  18. Effects of a direct refill program for automated dispensing cabinets on medication-refill errors.

    Science.gov (United States)

    Helmons, Pieter J; Dalton, Ashley J; Daniels, Charles E

    2012-10-01

    The effects of a direct refill program for automated dispensing cabinets (ADCs) on medication-refill errors were studied. This study was conducted in designated acute care areas of a 386-bed academic medical center. A wholesaler-to-ADC direct refill program, consisting of prepackaged delivery of medications and bar-code-assisted ADC refilling, was implemented in the inpatient pharmacy of the medical center in September 2009. Medication-refill errors in 26 ADCs from the general medicine units, the infant special care unit, the surgical and burn intensive care units, and intermediate units were assessed before and after the implementation of this program. Medication-refill errors were defined as an ADC pocket containing the wrong drug, wrong strength, or wrong dosage form. ADC refill errors decreased by 77%, from 62 errors per 6829 refilled pockets (0.91%) to 8 errors per 3855 refilled pockets (0.21%), a statistically significant reduction. The most common error type detected before the intervention was an incorrect medication (wrong drug, wrong strength, or wrong dosage form) in the ADC pocket. Of the 54 incorrect medications found before the intervention, 38 (70%) were loaded in a multiple-drug drawer. After the implementation of the new refill process, 3 of the 5 incorrect medications were loaded in a multiple-drug drawer. There were 3 instances of expired medications before and only 1 expired medication after implementation of the program. A redesign of the ADC refill process using a wholesaler-to-ADC direct refill program that included delivery of prepackaged medication and bar-code-assisted refill significantly decreased the occurrence of ADC refill errors.

  19. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
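
    A Python analogue of this workflow (the tool itself is written in VBA) might look like the sketch below, which walks a folder of output files, extracts one value per file and compiles the results into a CSV; the "SCF Done" pattern assumes Gaussian-style log files and would need adjusting for other packages.

      # Hedged Python analogue of the ExcelAutomat workflow: parse a folder of
      # quantum-chemistry output files and compile one value per file into a CSV.
      import csv, pathlib, re

      ENERGY_RE = re.compile(r"SCF Done:\s+E\([^)]+\)\s*=\s*(-?\d+\.\d+)")

      def last_scf_energy(path):
          """Return the last matched SCF energy in the file, or None if absent."""
          energy = None
          with open(path, errors="replace") as fh:
              for line in fh:
                  m = ENERGY_RE.search(line)
                  if m:
                      energy = float(m.group(1))
          return energy

      def compile_results(folder, out_csv="energies.csv"):
          rows = [(p.name, last_scf_energy(p))
                  for p in sorted(pathlib.Path(folder).glob("*.log"))]
          with open(out_csv, "w", newline="") as fh:
              writer = csv.writer(fh)
              writer.writerow(["file", "scf_energy_hartree"])
              writer.writerows(rows)
          return rows

      if __name__ == "__main__":
          print(compile_results("."))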

  20. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Bonnie [Adventium Labs; Boddy, Mark [Adventium Labs; Doyle, Frank [Univ. of California, Santa Barbara, CA (United States); Jamshidi, Mo [Univ. of New Mexico, Albuquerque, NM (United States); Ogunnaike, Tunde [Univ. of Delaware, Newark, DE (United States)

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  1. Development of a program for second dose calculation and safety checks in high dose rate brachytherapy

    International Nuclear Information System (INIS)

    Esteve Sanchez, S.; Martinez Albaladejo, M.; Garcia Fuentes, J. D.; Bejar Navarro, M. J.; Capuz Suarez, B.; Moris de Pablos, R.; Colmenares Fernandez, R.

    2015-01-01

    The reliability of the program was assessed with 80 patients at the usual prescription points for each pathology. The average error at the calculation points is less than 0.3% in 95% of cases, with the largest differences found on the applicator axes (maximum error -0.798%). The program proved effective when previously tested against deliberately erroneous dosimetry. Thanks to this program, the second dose calculation and part of the quality assurance process are completed in a few minutes, which is particularly valuable for HDR prostate treatments where time is limited. Keeping the data in a separate sheet allows each institution to modify the parameters to match its own protocols. (Author)

  2. Programs OPTMAN and SHEMMAN Version 6 (1999) - Coupled-Channels optical model and collective nuclear structure calculation -

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Jong Hwa; Lee, Jeong Yeon; Lee, Young Ouk; Sukhovitski, Efrem Sh [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-01-01

    Programs SHEMMAN and OPTMAN (Version 6) have been developed for the determination of nuclear Hamiltonian parameters and for optical model calculations, respectively. The optical model calculations by OPTMAN, with coupling schemes built on the wave functions of the non-axial soft rotator, are self-consistent, since the parameters of the nuclear Hamiltonian are determined by adjusting the energies of collective levels to experimental values with SHEMMAN prior to the optical model calculation. The programs have been installed at the Nuclear Data Evaluation Laboratory of KAERI. This report is intended as a brief manual of these codes. 43 refs., 9 figs., 1 tab. (Author)

  3. Automated x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    O'Connell, A.M.

    1977-01-01

    A fully automated x-ray fluorescence analytical system is described. The hardware is based on a Philips PW1220 sequential x-ray spectrometer. Software for on-line analysis of a wide range of sample types has been developed for the Hewlett-Packard 9810A programmable calculator. Routines to test the system hardware are also described. (Author)

  4. A FORTRAN program for an IBM PC compatible computer for calculating kinematical electron diffraction patterns

    International Nuclear Information System (INIS)

    Skjerpe, P.

    1989-01-01

    This report describes a computer program which is useful in transmission electron microscopy. The program is written in FORTRAN and calculates kinematical electron diffraction patterns in any zone axis from a given crystal structure. Quite large unit cells, containing up to 2250 atoms, can be handled by the program. The program runs on both the Hercules graphics card and the standard IBM CGA card.

  5. An Intelligent Automation Platform for Rapid Bioprocess Design.

    Science.gov (United States)

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.

  6. KAPSIES: A program for the calculation of multi-step direct reaction cross sections

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1994-09-01

    We present a program for the calculation of continuum cross sections, sepctra, angular distributions and analyzing powers according to various quantum-mechanical theories for statistical multi-step direct nuclear reactions. (orig.)

  7. Expert Performance and Time Pressure: Implications for Automation Failures in Aviation

    Science.gov (United States)

    2016-09-30

    ...settled by these two studies. To help resolve the disagreement between the previous research findings, the present work used a computerized chess... communication between the automation and the pilots should also be helpful, but it is doubtful that the system designer or the real-time automation can...

  8. A program for calculating and plotting soft x-ray optical interaction coefficients for molecules

    International Nuclear Information System (INIS)

    Thomas, M.M.; Davis, J.C.; Jacobsen, C.J.; Perera, R.C.C.

    1989-08-01

    Comprehensive tables for the atomic scattering factor components f1 and f2 were compiled by Henke et al. for the extended photon energy region 50 - 10000 eV. Accurate calculations of optical interaction coefficients for absorption, reflection and scattering by material systems (e.g. filters, multilayers, etc.), which have widespread application, can be based simply upon the atomic scattering factors of the elements comprising the material, except near the absorption threshold energies. These calculations, based upon the weighted sum of f1 and f2 for each atomic species present, can be very tedious if done by hand. This led us to develop a user-friendly program to perform these calculations on an IBM PC or compatible computer. By entering the chemical formula, density and thickness of up to six molecules, values of f1, f2, mass absorption, transmission efficiencies, attenuation lengths, mirror reflectivities and complex indices of refraction can be calculated and plotted as a function of energy or wavelength. This program will be available for distribution. 7 refs., 1 fig.
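
    As a hedged sketch of the weighted-sum calculation described above, the following function estimates the transmission of a thin filter from the stoichiometric sum of f2 values, using the standard relation mu = 2 r_e lambda n_molecules * sum(n_i f2_i); the material data in the example are invented placeholders rather than Henke table entries.

      # Hedged sketch: filter transmission from atomic scattering factors.
      import math

      R_E = 2.8179403262e-13   # classical electron radius, cm
      N_A = 6.02214076e23      # Avogadro's number

      def transmission(energy_ev, thickness_cm, density_g_cm3, molar_mass_g,
                       stoichiometric_f2_sum):
          """T = exp(-mu t), mu = 2 r_e lambda n_molecules * sum_i(n_i * f2_i)."""
          wavelength_cm = 12398.42e-8 / energy_ev            # E[eV] -> lambda[cm]
          n_molecules = density_g_cm3 * N_A / molar_mass_g   # molecules per cm^3
          mu = 2.0 * R_E * wavelength_cm * n_molecules * stoichiometric_f2_sum
          return math.exp(-mu * thickness_cm)

      if __name__ == "__main__":
          # hypothetical example: a 1000-angstrom organic film with sum(n_i*f2_i) = 1.5
          print(transmission(energy_ev=500.0, thickness_cm=1000e-8,
                             density_g_cm3=1.2, molar_mass_g=100.0,
                             stoichiometric_f2_sum=1.5))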

  9. Tegen - a one-dimensional program to calculate a thermoelectric generator

    International Nuclear Information System (INIS)

    Rosa, M.A.P.; Ferreira, P.A.; Castro Lobo, P.D. de.

    1990-01-01

    A computer program is presented for the solution of the one-dimensional, steady-state temperature equation in the arms of a thermoelectric generator. The discretized equations obtained through a finite difference scheme are solved by Gaussian elimination. Due to nonlinearities caused by the temperature dependence of the coefficients of these equations, an iterative procedure is used to obtain the temperature distribution in the arms. These distributions are used in the calculation of the efficiency, electric power, load voltage and other parameters relevant to the design of a thermoelectric generator. (author)
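
    A minimal sketch of this solution strategy (not the Tegen code, whose equations also carry thermoelectric source terms) is shown below: a one-dimensional steady conduction equation with temperature-dependent conductivity is discretized by finite differences and re-solved iteratively until the coefficients stop changing; all material values are hypothetical, and Joule/Thomson heating is lumped into a single volumetric source.

      # Hedged sketch: 1-D steady conduction with T-dependent conductivity,
      # finite differences + direct (Gaussian) elimination + Picard iteration.
      import numpy as np

      def solve_arm(t_hot, t_cold, length, n, conductivity, q_vol,
                    tol=1e-8, max_iter=50):
          """Return nodal temperatures along the arm (n interior nodes)."""
          dx = length / (n + 1)
          temp = np.linspace(t_hot, t_cold, n + 2)      # initial guess incl. ends
          for _ in range(max_iter):
              k_face = conductivity(0.5 * (temp[:-1] + temp[1:]))  # face values
              a = np.zeros((n, n))
              b = np.full(n, -q_vol * dx * dx)
              for i in range(n):
                  a[i, i] = -(k_face[i] + k_face[i + 1])
                  if i > 0:
                      a[i, i - 1] = k_face[i]
                  if i < n - 1:
                      a[i, i + 1] = k_face[i + 1]
              b[0] -= k_face[0] * t_hot                 # boundary contributions
              b[-1] -= k_face[-1] * t_cold
              interior = np.linalg.solve(a, b)
              new_temp = np.concatenate(([t_hot], interior, [t_cold]))
              if np.max(np.abs(new_temp - temp)) < tol:
                  return new_temp
              temp = new_temp
          return temp

      if __name__ == "__main__":
          # hypothetical conductivity law [W/(m K)] and volumetric heating [W/m^3]
          k_of_t = lambda t: 1.5 + 1e-3 * t
          print(np.round(solve_arm(500.0, 300.0, 0.01, 9, k_of_t, q_vol=2.0e5), 1))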

  10. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (ruby, perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the Soap 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web services.

  11. A program to calculate pulse transmission responses through transversely isotropic media

    Science.gov (United States)

    Li, Wei; Schmitt, Douglas R.; Zou, Changchun; Chen, Xiwei

    2018-05-01

    We provide a program (AOTI2D) to model responses of ultrasonic pulse transmission measurements through arbitrarily oriented transversely isotropic rocks. The program is built with the distributed point source method that treats the transducers as a series of point sources. The response of each point source is calculated according to the ray-tracing theory of elastic plane waves. The program could offer basic wave parameters including phase and group velocities, polarization, anisotropic reflection coefficients and directivity patterns, and model the wave fields, static wave beam, and the observed signals for pulse transmission measurements considering the material's elastic stiffnesses and orientations, sample dimensions, and the size and positions of the transmitters and the receivers. The program could be applied to exhibit the ultrasonic beam behaviors in anisotropic media, such as the skew and diffraction of ultrasonic beams, and analyze its effect on pulse transmission measurements. The program would be a useful tool to help design the experimental configuration and interpret the results of ultrasonic pulse transmission measurements through either isotropic or transversely isotropic rock samples.

  12. Automating dipole subtraction

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, K.; Moch, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Uwer, P. [Karlsruhe Univ. (T.H.) (Germany). Inst. fuer Theoretische Teilchenphysik

    2008-07-15

    We report on automating the Catani-Seymour dipole subtraction which is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. The automatization rests on three essential steps: the creation of the dipole terms, the calculation of the color linked squared Born matrix elements, and the evaluation of different helicity amplitudes. The routines have been tested for a number of complex processes, such as the real emission process gg → t anti-t ggg. (orig.)

  13. Automating dipole subtraction

    International Nuclear Information System (INIS)

    Hasegawa, K.; Moch, S.; Uwer, P.

    2008-07-01

    We report on automating the Catani-Seymour dipole subtraction which is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. The automatization rests on three essential steps: the creation of the dipole terms, the calculation of the color linked squared Born matrix elements, and the evaluation of different helicity amplitudes. The routines have been tested for a number of complex processes, such as the real emission process gg → t anti-t ggg. (orig.)

  14. Automating an integrated spatial data-mining model for landfill site selection

    Science.gov (United States)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

    An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in the integrated model is the complicated processing and modelling due to the programming stages and several limitations. An automation process helps avoid these limitations and improves the interoperability between integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) and a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. 22 criteria were selected for use as input data and to build the training and testing datasets. The outcomes show a high accuracy of 98.2% on the testing dataset using 10-fold cross-validation. The automated spatial data-mining model provides a solid platform for decision makers for performing landfill site selection and planning operations on a regional scale.

  15. 'PRIZE': A program for calculating collision probabilities in R-Z geometry

    Energy Technology Data Exchange (ETDEWEB)

    Pitcher, H.H.W. [General Reactor Physics Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1964-10-15

    PRIZE is an IBM7090 program which computes collision probabilities for systems with axial symmetry and outputs them on cards in suitable format for the PIP1 program. Its method of working, data requirements, output, running time and accuracy are described. The program has been used to compute non-escape (self-collision) probabilities of finite circular cylinders, and a table is given by which non-escape probabilities of slabs, finite and infinite circular cylinders, infinite square cylinders, cubes, spheres and hemispheres may quickly be calculated to 1/2% or better. (author)
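
    For orientation, the sketch below estimates the first-flight non-escape (self-collision) probability of a finite circular cylinder by Monte Carlo sampling of interior points and isotropic directions; PRIZE itself is a deterministic collision-probability code, so this is only an illustration of the quantity tabulated, with an arbitrary total cross section.

      # Hedged Monte Carlo sketch of a cylinder's first-flight non-escape probability.
      import numpy as np

      def non_escape_probability(radius, height, sigma_t, n_samples=200_000, seed=1):
          rng = np.random.default_rng(seed)
          # uniform points inside the cylinder (z measured from 0 to height)
          r = radius * np.sqrt(rng.random(n_samples))
          phi = 2.0 * np.pi * rng.random(n_samples)
          x, y = r * np.cos(phi), r * np.sin(phi)
          z = height * rng.random(n_samples)
          # isotropic flight directions
          mu = 2.0 * rng.random(n_samples) - 1.0
          psi = 2.0 * np.pi * rng.random(n_samples)
          s = np.sqrt(1.0 - mu * mu)
          dx, dy, dz = s * np.cos(psi), s * np.sin(psi), mu
          # distance to the lateral surface (positive root of the quadratic)
          a = dx * dx + dy * dy
          b = 2.0 * (x * dx + y * dy)
          c = x * x + y * y - radius * radius
          disc = np.sqrt(np.maximum(b * b - 4.0 * a * c, 0.0))
          with np.errstate(divide="ignore", invalid="ignore"):
              t_side = np.where(a > 1e-12, (-b + disc) / (2.0 * a), np.inf)
              t_cap = np.where(dz > 0.0, (height - z) / dz,
                               np.where(dz < 0.0, -z / dz, np.inf))
          chord = np.minimum(t_side, t_cap)
          # probability that the first flight collides before reaching the surface
          return float(np.mean(1.0 - np.exp(-sigma_t * chord)))

      if __name__ == "__main__":
          # illustrative case: unit radius and height, one mean free path per unit length
          print(round(non_escape_probability(1.0, 1.0, sigma_t=1.0), 4))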

  16. Programming of Canberra Industries 8100/Quanta System

    International Nuclear Information System (INIS)

    Yoshida, Hiroshi; Kubo, Katsumi

    1980-03-01

    This report describes the usage of the interactive programming language ''CLASS'' (Canberra Laboratory Automation Software System), which is a feature of the software for the Canberra Industries 8100/Quanta System consisting of a Canberra Industries 8100 multichannel analyzer (MCA) and a PDP-11/05 mini-computer, and the programs developed with CLASS to process and analyze the data of gamma spectra obtained with semiconductor detectors. The programs are (1) to compute the coefficients in the formulae that relate the channel numbers of gamma-ray photopeaks obtained from the MCA to the energy values; (2) to subtract the background component from the total count of a photopeak obtained from the MCA; and (3) to calculate the lapse of time in days or years following the preparation of a radiation source. (author)
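
    The first two tasks - fitting a channel-to-energy calibration and subtracting the background under a photopeak - can be illustrated with the short Python sketch below; the peak energies, channel numbers and synthetic spectrum are hypothetical, and the method shown (linear fit plus linear baseline) is a generic textbook approach, not necessarily the one coded in CLASS.

      # Hedged sketch: energy calibration and net photopeak area on made-up data.
      import numpy as np

      def energy_calibration(channels, energies_kev):
          """Least-squares fit E = a*channel + b; returns (a, b)."""
          a, b = np.polyfit(channels, energies_kev, 1)
          return a, b

      def net_peak_area(spectrum, lo, hi, n_bg=3):
          """Subtract a linear baseline estimated from n_bg channels on each side."""
          gross = spectrum[lo:hi + 1].sum()
          left = spectrum[lo - n_bg:lo].mean()
          right = spectrum[hi + 1:hi + 1 + n_bg].mean()
          background = 0.5 * (left + right) * (hi - lo + 1)
          return gross - background

      if __name__ == "__main__":
          # hypothetical channel positions for three Eu-152 gamma lines
          a, b = energy_calibration([121, 344, 1408], [121.8, 344.3, 1408.0])
          print(f"E(keV) = {a:.4f}*ch + {b:.2f}")
          spec = np.full(60, 100.0)
          spec[28:33] += [50, 400, 900, 380, 45]   # synthetic peak on flat background
          print("net area:", net_peak_area(spec, 27, 33))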

  17. Version of ORIGEN2 with automated sensitivity-calculation capability

    International Nuclear Information System (INIS)

    Worley, B.A.; Wright, R.Q.; Pin, F.G.

    1986-01-01

    ORIGEN2 is a widely used point-depletion and radioactive-decay computer code for use in simulating nuclear fuel cycles and/or spent fuel characteristics. The code calculates the amount of each nuclide being considered in the problem at a specified number of times, and upon request, a database of conversion factors relating mass compositions to specific material characteristics is used to calculate and print the total nuclide-dependent radioactivity, thermal power, and toxicity, as well as absorption, fission, neutron emission, and photon emission rates. The purpose of this paper is to report on the availability of a version of ORIGEN2 that will calculate, on option, the derivative of all responses with respect to any variable used in the code.

  18. Automated toxicological screening reports of modified Agilent MSD Chemstation combined with Microsoft Visual Basic application programs.

    Science.gov (United States)

    Choe, Sanggil; Kim, Suncheun; Choi, Hyeyoung; Choi, Hwakyoung; Chung, Heesun; Hwang, Bangyeon

    2010-06-15

    Agilent GC-MS MSD Chemstation offers automated library search reports for toxicological screening using the total ion chromatogram (TIC) and mass spectra in normal mode. Numerous peaks appear in the chromatogram of a biological specimen such as blood or urine, and large migrating peaks often obscure small target peaks; in addition, target peaks of low abundance regularly give wrong library search results or low matching scores. As a result, the retention time and mass spectrum of all the peaks in the chromatogram have to be checked to see if they are relevant. These repeated actions are very tedious and time-consuming for toxicologists. The MSD Chemstation software operates using a number of macro files which give commands and instructions on how to work on and extract data from the chromatogram and spectra. These macro files are built with the software's own compiler. All the original macro files can be modified, and new macro files can be added to the original software by users. To get more accurate results with a more convenient method and to save time in data analysis, we developed new macro files for report generation and inserted new menus in the Enhanced Data Analysis program. Toxicological screening reports generated by these new macro files are in text mode or graphic mode, and these reports can be generated with three different automated subtraction options. Text reports have a Brief mode and a Full mode, and graphic reports have the option with or without a mass spectrum mode. Matched mass spectra and matching scores for detected compounds are printed in the reports by the modified library searching modules. We have also developed an independent application program named DrugMan. This program manages the drug groups, lists, and parameters that are in use in MSD Chemstation. The incorporation of DrugMan with the modified macro modules provides a powerful tool for toxicological screening and saves a lot of valuable time on toxicological work. (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  19. Application of the REMIX thermal mixing calculation program for the Loviisa reactor

    International Nuclear Information System (INIS)

    Kokkonen, I.; Tuomisto, H.

    1987-08-01

    The REMIX computer program has been validated for use in the pressurized thermal shock study of the Loviisa reactor pressure vessel. The program has been verified against data from thermal and fluid mixing experiments. These experiments were carried out at Imatran Voima Oy to study the thermal mixing of high-pressure safety injection water in the Loviisa VVER-440 type pressurized water reactor. The verified REMIX versions were applied to reactor calculations in the probabilistic pressurized thermal shock study of the Loviisa plant.

  20. AFG-MONSU. A program for calculating axial heterogeneities in cylindrical pin cells

    International Nuclear Information System (INIS)

    Neltrup, H.; Kirkegaard, P.

    1978-08-01

    The AFG-MONSU program complex is designed to calculate the flux in cylindrical fuel pin cells into which heterogeneities are introduced in a regular array. The theory - integral transport theory combined with Monte Carlo with the help of a superposition principle - is described in some detail. A detailed derivation of the superposition principle, as well as the formulas used in the DIT (Discrete Integral Transport) method, is given in the appendices, along with a description of the input structure of the AFG-MONSU program complex. (author)

  1. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints imbedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed after exposure to a task with automation assistance, user acceptance of high(er) levels of automation ('tipping point') decreased; suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. 'tipping point') is constant or can be manipulated. The results revealed after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased; suggesting it is possible to alter automation acceptance.

  2. FLOWNET: A Computer Program for Calculating Secondary Flow Conditions in a Network of Turbomachinery

    Science.gov (United States)

    Rose, J. R.

    1978-01-01

    The program requires the network parameters, the flow component parameters, the reservoir conditions, and the gas properties as input. It will then calculate all unknown pressures and the mass flow rate in each flow component in the network. The program can treat networks containing up to fifty flow components and twenty-five unknown network pressures. The types of flow components that can be treated are face seals, narrow slots, and pipes. The program is written in both structured FORTRAN (SFTRAN) and FORTRAN 4. The program must be run in an interactive (conversational) mode.
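
    A hedged sketch of the kind of network balance such a program solves is given below: each component is reduced to a simple m_dot = C*sqrt(dP) law (not FLOWNET's actual seal, slot and pipe models), and Newton iteration drives the mass imbalance at the unknown nodes to zero; the network layout, coefficients and reservoir pressures are invented.

      # Hedged sketch: solve unknown node pressures in a small secondary-flow network.
      import numpy as np

      # component list: (upstream node, downstream node, flow coefficient C)
      # nodes 0 and 3 are reservoirs with known pressure; nodes 1 and 2 are unknown
      COMPONENTS = [(0, 1, 2.0e-4), (1, 2, 1.0e-4), (2, 3, 1.5e-4)]
      KNOWN = {0: 500_000.0, 3: 100_000.0}          # reservoir pressures, Pa
      UNKNOWN = [1, 2]

      def flow(c, p_up, p_dn):
          """Toy component law: mass flow proportional to sqrt of the pressure drop."""
          dp = p_up - p_dn
          return c * np.sign(dp) * np.sqrt(abs(dp))

      def residual(p_unknown):
          """Net mass imbalance at each unknown node (zero at the solution)."""
          pressures = {**KNOWN, **dict(zip(UNKNOWN, p_unknown))}
          imbalance = np.zeros(len(UNKNOWN))
          for up, dn, c in COMPONENTS:
              q = flow(c, pressures[up], pressures[dn])
              if up in UNKNOWN:
                  imbalance[UNKNOWN.index(up)] -= q
              if dn in UNKNOWN:
                  imbalance[UNKNOWN.index(dn)] += q
          return imbalance

      def solve_network(p0, tol=1e-9, max_iter=50):
          """Newton iteration with a finite-difference Jacobian (1 Pa perturbation)."""
          p = np.array(p0, dtype=float)
          for _ in range(max_iter):
              r = residual(p)
              if np.max(np.abs(r)) < tol:
                  break
              jac = np.zeros((len(p), len(p)))
              for j in range(len(p)):
                  dp = np.zeros(len(p))
                  dp[j] = 1.0
                  jac[:, j] = residual(p + dp) - r
              p -= np.linalg.solve(jac, r)
          return p

      if __name__ == "__main__":
          print(np.round(solve_network([400_000.0, 200_000.0])))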

  3. Automation of PCXMC and ImPACT for NASA Astronaut Medical Imaging Dose and Risk Tracking

    Science.gov (United States)

    Bahadori, Amir; Picco, Charles; Flores-McLaughlin, John; Shavers, Mark; Semones, Edward

    2011-01-01

    To automate astronaut organ and effective dose calculations from occupational X-ray and computed tomography (CT) examinations incorporating PCXMC and ImPACT tools and to estimate the associated lifetime cancer risk per the National Council on Radiation Protection & Measurements (NCRP) using MATLAB(R). Methods: NASA follows guidance from the NCRP on its operational radiation safety program for astronauts. NCRP Report 142 recommends that astronauts be informed of the cancer risks from reported exposures to ionizing radiation from medical imaging. MATLAB(R) code was written to retrieve exam parameters for medical imaging procedures from a NASA database, calculate associated dose and risk, and return results to the database, using the Microsoft .NET Framework. This code interfaces with the PCXMC executable and emulates the ImPACT Excel spreadsheet to calculate organ doses from X-rays and CTs, respectively, eliminating the need to utilize the PCXMC graphical user interface (except for a few special cases) and the ImPACT spreadsheet. Results: Using MATLAB(R) code to interface with PCXMC and replicate ImPACT dose calculation allowed for rapid evaluation of multiple medical imaging exams. The user inputs the exam parameter data into the database and runs the code. Based on the imaging modality and input parameters, the organ doses are calculated. Output files are created for record, and organ doses, effective dose, and cancer risks associated with each exam are written to the database. Annual and post-flight exposure reports, which are used by the flight surgeon to brief the astronaut, are generated from the database. Conclusions: Automating PCXMC and ImPACT for evaluation of NASA astronaut medical imaging radiation procedures allowed for a traceable and rapid method for tracking projected cancer risks associated with over 12,000 exposures. This code will be used to evaluate future medical radiation exposures, and can easily be modified to accommodate changes to the risk

  4. ROBOT3: a computer program to calculate the in-pile three-dimensional bowing of cylindrical fuel rods (AWBA Development Program)

    International Nuclear Information System (INIS)

    Kovscek, S.E.; Martin, S.E.

    1982-10-01

    ROBOT3 is a FORTRAN computer program which is used in conjunction with the CYGRO5 computer program to calculate the time-dependent inelastic bowing of a fuel rod using an incremental finite element method. The fuel rod is modeled as a viscoelastic beam whose material properties are derived as perturbations of the CYGRO5 axisymmetric model. Fuel rod supports are modeled as displacement, force, or spring-type nodal boundary conditions. The program input is described and a sample problem is given

  5. Easy-to-use application programs for decay heat and delayed neutron calculations on personal computers

    Energy Technology Data Exchange (ETDEWEB)

    Oyamatsu, Kazuhiro [Nagoya Univ. (Japan)

    1998-03-01

    Application programs for personal computers are developed to calculate the decay heat power and delayed neutron activity from fission products. The main programs can be used in any computers from personal computers to main frames because their sources are written in Fortran. These programs have user friendly interfaces to be used easily not only for research activities but also for educational purposes. (author)
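
    As an illustration of what such a decay-heat summation does, the sketch below sums lambda*N(t)*E over a tiny made-up nuclide "library", ignoring parent-daughter build-in; a real calculation would use a full evaluated decay-data file.

      # Minimal sketch of a fission-product decay-heat summation; the three-nuclide
      # library below is purely illustrative.
      import math

      # (name, half-life [s], atoms at shutdown, recoverable energy per decay [MeV])
      LIBRARY = [
          ("FP-A", 3.0e2, 1.0e18, 1.2),
          ("FP-B", 8.6e4, 5.0e19, 0.8),
          ("FP-C", 2.7e6, 2.0e20, 0.5),
      ]
      MEV_TO_JOULE = 1.602176634e-13

      def decay_heat(t_after_shutdown_s):
          """Sum over nuclides of lambda * N(t) * E per decay, in watts."""
          total = 0.0
          for _, t_half, n0, e_mev in LIBRARY:
              lam = math.log(2.0) / t_half
              total += lam * n0 * math.exp(-lam * t_after_shutdown_s) * e_mev * MEV_TO_JOULE
          return total

      if __name__ == "__main__":
          for t in (0.0, 3.6e3, 8.64e4, 6.05e5):
              print(f"t = {t:8.0f} s   P = {decay_heat(t):.3e} W")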

  6. Automation; The New Industrial Revolution.

    Science.gov (United States)

    Arnstein, George E.

    Automation is a word that describes the workings of computers and the innovations of automatic transfer machines in the factory. As the hallmark of the new industrial revolution, computers displace workers and create a need for new skills and retraining programs. With improved communication between industry and the educational community to…

  7. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    LAVA: Large-scale Automated Vulnerability Addition. Brendan Dolan-Gavitt, Patrick Hulin, Tim Leek, Fredrich Ulrich, Ryan Whelan (Authors listed...) ...released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag... low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities...

  8. A computer vision-based automated Figure-8 maze for working memory test in rodents.

    Science.gov (United States)

    Pedigo, Samuel F; Song, Eun Young; Jung, Min Whan; Kim, Jeansok J

    2006-09-30

    The benchmark test for prefrontal cortex (PFC)-mediated working memory in rodents is a delayed alternation task utilizing variations of T-maze or Figure-8 maze, which requires the animals to make specific arm entry responses for reward. In this task, however, manual procedures involved in shaping target behavior, imposing delays between trials and delivering rewards can potentially influence the animal's performance on the maze. Here, we report an automated Figure-8 maze which does not necessitate experimenter-subject interaction during shaping, training or testing. This system incorporates a computer vision system for tracking, motorized gates to impose delays, and automated reward delivery. The maze is controlled by custom software that records the animal's location and activates the gates according to the animal's behavior and a control algorithm. The program performs calculations of task accuracy, tracks movement sequence through the maze, and provides other dependent variables (such as running speed, time spent in different maze locations, activity level during delay). Testing in rats indicates that the performance accuracy is inversely proportional to the delay interval, decreases with PFC lesions, and that animals anticipate timing during long delays. Thus, our automated Figure-8 maze is effective at assessing working memory and provides novel behavioral measures in rodents.

  9. User's Guide to Handlens - A Computer Program that Calculates the Chemistry of Minerals in Mixtures

    Science.gov (United States)

    Eberl, D.D.

    2008-01-01

    HandLens is a computer program, written in Excel macro language, that calculates the chemistry of minerals in mineral mixtures (for example, in rocks, soils and sediments) for related samples from inputs of quantitative mineralogy and chemistry. For best results, the related samples should contain minerals having the same chemical compositions; that is, the samples should differ only in the proportions of minerals present. This manual describes how to use the program, discusses the theory behind its operation, and presents test results of the program's accuracy. Required input for HandLens includes quantitative mineralogical data, obtained, for example, by RockJock analysis of X-ray diffraction (XRD) patterns, and quantitative chemical data, obtained, for example, by X-ray fluorescence (XRF) analysis of the same samples. Other quantitative data, such as sample depth, temperature, and surface area, also can be entered. The minerals present in the samples are selected from a list, and the program is started. The results of the calculation include: (1) a table of linear coefficients of determination (r²'s) which relate pairs of input data (for example, Si versus quartz weight percents); (2) a utility for plotting all input data, either as pairs of variables, or as sums of up to eight variables; (3) a table that presents the calculated chemical formulae for minerals in the samples; (4) a table that lists the calculated concentrations of major, minor, and trace elements in the various minerals; and (5) a table that presents chemical formulae for the minerals that have been corrected for possible systematic errors in the mineralogical and/or chemical analyses. In addition, the program contains a method for testing the assumption of constant chemistry of the minerals within a sample set.
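
    The core inversion described above can be sketched as a least-squares problem: if each sample's bulk chemistry is a mineral-fraction-weighted sum of fixed mineral compositions, then the per-mineral element contents X solve M X ≈ C. The example below uses synthetic data and is only a schematic of the idea, not the HandLens implementation.

      # Hedged sketch: recover per-mineral element contents from bulk chemistry
      # and mineral fractions by least squares, on synthetic data.
      import numpy as np

      def mineral_chemistry(mineral_fractions, bulk_chemistry):
          """Least-squares solve (samples x minerals) @ X = (samples x elements)."""
          x, *_ = np.linalg.lstsq(mineral_fractions, bulk_chemistry, rcond=None)
          return x

      if __name__ == "__main__":
          rng = np.random.default_rng(3)
          true_x = np.array([[46.7, 0.0],    # e.g. quartz: Si wt%, K wt%
                             [30.3, 14.0]])  # e.g. a K-feldspar-like phase
          m = rng.dirichlet((2.0, 2.0), size=8)            # mineral weight fractions
          c = m @ true_x + rng.normal(0.0, 0.1, (8, 2))    # bulk Si and K with noise
          print(np.round(mineral_chemistry(m, c), 2))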

  10. Automated Technology for Verificiation and Analysis

    DEFF Research Database (Denmark)

    This volume contains the papers presented at the 7th International Symposium on Automated Technology for Verification and Analysis held during October 13-16 in Macao SAR, China. The primary objective of the ATVA conferences remains the same: to exchange and promote the latest advances of state-of-the-art research on theoretical and practical aspects of automated analysis, verification, and synthesis. Among 74 research papers and 10 tool papers submitted to ATVA 2009, the Program Committee accepted 23 as regular papers and 3 as tool papers. In all, 33 experts from 17 countries worked hard to make sure...

  11. An Intelligent Automation Platform for Rapid Bioprocess Design

    Science.gov (United States)

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  12. The Automated Assembly Team contributions to the APRIMED Agile Manufacturing Project

    International Nuclear Information System (INIS)

    Jones, R.E.; Ames, A.L.; Calton, T.L.

    1995-06-01

    The Automated Assembly Team of the APRIMED Project (abbreviated as A') consists of two parts: the Archimedes Project, which is an ongoing project developing automated assembly technology, and the A' Robot Team. Archimedes is a second generation assembly planning system that both provides a general high-level assembly sequencing capability and, for a smaller class of products, facilitates automatic programming of a robotic workcell to assemble them. The A' robot team designed, developed, and implemented a flexible robot workcell which served as the automated factory of the A' project. In this document we briefly describe the role of automated assembly planning in agile manufacturing, and specifically describe the contributions of the Archimedes project and the A' robot team to the A' project. We introduce the concepts of the Archimedes automated assembly planning project, and discuss the enhancements to Archimedes which were developed in response to the needs of the A' project. We also present the work of the A' robot team in designing and developing the A' robot workcell, including all tooling and programming to support assembly of the A' discriminator devices. Finally, we discuss the process changes which these technologies have enabled in the A' project

  13. Program realization of mathematical model of kinematic calculation of flat lever mechanisms

    Directory of Open Access Journals (Sweden)

    M. A. Vasechkin

    2016-01-01

    Full Text Available. The kinematic calculation of mechanisms is very time-consuming work. Because it consists of a large number of similar operations, it can be automated using computers. For this purpose, a software implementation of the mathematical model for the kinematic calculation of second-class mechanisms is required. The article presents, in Turbo Pascal, the text of a module with library procedures for all kinematic studies of second-class planar lever mechanisms. The determination of the kinematic characteristics of the mechanism and the construction of its plans of positions, velocities and accelerations are carried out using a six-link mechanism as an example. The origin of the fixed coordinate system coincides with the axis of rotation of the crank AB. It is assumed that the lengths of all links, the positions of all additional points of the links and the coordinates of all kinematic pairs of the mechanism frame are known; that is, this stage of the work, determining the kinematics of the mechanism, must be preceded by a synthesis stage (determining the missing link dimensions). The coordinates of point C are specified and, considering that the analogues of the velocity and acceleration of this point are zero (a stationary point), the procedure that computes the kinematics of an Assur group (GA) of the third kind is called. The kinematic parameters of point D are then specified; taking the beginning of the slider guide E at point C, with the guide angle, the analogue of the angular velocity and the analogue of the angular acceleration of the guide all equal to zero, and knowing the length of the connecting rod DE and the length of link 5, the procedure for a GA of the second kind is called. The use of the library routines of the kinematic calculation module makes it relatively simple to simulate the motion of the mechanism, to calculate the projections of the velocity and acceleration analogues of all links of the mechanism, and to build the plans of velocities and accelerations at each position of the mechanism.
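
    As a small illustration of the crank step of such an analysis (the Assur-group procedures themselves are not reproduced here), the sketch below returns the position of point B on crank AB together with its velocity and acceleration analogues, i.e. derivatives with respect to the crank angle; the crank length used is arbitrary.

      # Hedged sketch: position and velocity/acceleration analogues of point B on
      # crank AB, with the crank pivot A at the origin of the fixed frame.
      import math

      def crank_point(length_ab, phi):
          """Return (x, y), (dx/dphi, dy/dphi), (d2x/dphi2, d2y/dphi2) for point B."""
          pos = (length_ab * math.cos(phi), length_ab * math.sin(phi))
          vel_analogue = (-length_ab * math.sin(phi), length_ab * math.cos(phi))
          acc_analogue = (-length_ab * math.cos(phi), -length_ab * math.sin(phi))
          return pos, vel_analogue, acc_analogue

      if __name__ == "__main__":
          # sweep one revolution in 12 positions for a hypothetical 0.05 m crank
          for k in range(12):
              phi = 2.0 * math.pi * k / 12
              pos, vel, _ = crank_point(0.05, phi)
              print(f"phi={phi:5.2f}  B=({pos[0]:+.3f},{pos[1]:+.3f})  "
                    f"dB/dphi=({vel[0]:+.3f},{vel[1]:+.3f})")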

  14. Automated calculation of matrix elements and physics motivated observables

    Science.gov (United States)

    Was, Z.

    2017-11-01

    The central aspect of my personal scientific activity has focused on calculations useful for the interpretation of High Energy accelerator experimental results, especially in the domain of precision tests of the Standard Model. My activities started in the early 80's, when computer support for algebraic manipulations was in its infancy. But already then it was important for my work. It brought a multitude of benefits, but at the price of some inconvenience for physics intuition. Calculations became more complex, work had to be distributed over teams of researchers and, due to automatization, some aspects of the intermediate results became more difficult to identify. In my talk I will not be very exhaustive; I will present examples from my personal research only: (i) calculations of spin effects for the process e⁺e⁻ → τ⁺τ⁻γ at Petra/PEP energies, calculations (with the help of the Grace system of the Minami-tateya group) and phenomenology of spin amplitudes for (ii) e⁺e⁻ → 4f and for (iii) e⁺e⁻ → ν_e ν̄_e γγ processes, and (iv) phenomenology of CP-sensitive observables for Higgs boson parity in H → τ⁺τ⁻, τ± → ν 2(3)π cascade decays.

  15. Automated bar coding of air samples at Hanford (ABCASH)

    International Nuclear Information System (INIS)

    Troyer, G.L.; Brayton, D.D.; McNeece, S.G.

    1992-10-01

    This article describes the basis, main features and benefits of an automated system for tracking and reporting radioactive air particulate samples. The system was developed in response to a recognized need for improving the quality and integrity of air sample data related to personnel and environmental protection. The capture, storage, and retrieval of air sample data are described. The automation of the associated data input eliminates a large potential for human error. The system utilizes personal computers, handheld computers, a commercial personal computer database package, commercial programming languages, and complete documentation to satisfy the system's automation objective.

  16. Automated validation of a computer operating system

    Science.gov (United States)

    Dervage, M. M.; Milberg, B. A.

    1970-01-01

    Programs apply selected input/output loads to complex computer operating system and measure performance of that system under such loads. Technique lends itself to checkout of computer software designed to monitor automated complex industrial systems.

  17. A data automation system at Los Alamos National Laboratory

    International Nuclear Information System (INIS)

    Betts, S.E.; Schneider, C.M.; Pickrell, M.M.

    2001-01-01

    Idaho National Engineering and Environmental Laboratory (INEEL) has developed an automated computer program, the Data Review Expert System (DRXS), for reviewing nondestructive assay (NDA) data. DRXS significantly reduces the data review time needed to meet characterization requirements for the Waste Isolation Pilot Plant (WIPP). Los Alamos National Laboratory (LANL) is in the process of developing a computer program, Software System Logic for Intelligent Certification (SSLIC), to automate other tasks associated with the characterization of transuranic (TRU) waste samples. LANL has incorporated a version of DRXS specific to LANL's isotopic data into SSLIC. This version of SSLIC was audited by the National Transuranic Program on October 24, 2001. This paper will present the results of the audit and discuss future plans for SSLIC, including the integration of the INEEL/LANL-developed Rule Editor.

  18. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  19. Demonstration of automated robotic workcell for hazardous waste characterization

    International Nuclear Information System (INIS)

    Holliday, M.; Dougan, A.; Gavel, D.; Gustaveson, D.; Johnson, R.; Kettering, B.; Wilhelmsen, K.

    1993-02-01

    An automated robotic workcell to classify hazardous waste stream items with previously unknown characteristics has been designed, tested and demonstrated. The object attributes being quantified are radiation signature, metal content, and object orientation and volume. The multi-sensor information is used to make segregation decisions and to perform automatic grasping of objects. The workcell control program uses an off-line programming system by Cimetrix Inc. as a server to perform both simulation control and actual hardware control of the workcell. This paper discusses the overall workcell layout, sensor specifications, workcell supervisory control, 2D vision-based automated grasp planning, and object classification algorithms.

  20. Development of an automated desktop procedure for defining macro ...

    African Journals Online (AJOL)

    2006-07-03

    break points' such as ... An automated desktop procedure was developed for computing statistically defensible, multiple change .... from source to mouth. .... the calculated value was less than the test statistic given in Owen.

  1. 'BLOC' program for elasto-plastic calculation of fissured media

    International Nuclear Information System (INIS)

    Pouyet, P.; Picaut, J.; Costaz, J.L.; Dulac, J.

    1983-01-01

    The method described is used to test failure mechanisms and to calculate the corresponding ultimate loads. The main advantages it offers are simple modelling, the possibility of representing all the prestressing and reinforcement steels simply and correctly, and fewer degrees of freedom, hence lower cost (the program can be run on a microcomputer). However, the model is sensitive to the arrangement of the interface elements, presupposing a given failure mechanism. This normally means testing several different models with different kinematically possible failure patterns. But the ease of modelling and low costs are ideal for this type of approach. (orig./RW)

  2. KOP program for calculating cross sections of neutron and charged particle interactions with atomic nuclei using the optical model

    International Nuclear Information System (INIS)

    Grudzevich, O.D.; Zelenetskij, A.V.; Pashchenko, A.B.

    1986-01-01

    The latest version of the KOP program for calculating cross sections of neutron and charged particle interactions with atomic nuclei within the scope of the optical model is described. The structure and organization of the program, its library of optical potential parameters, its identifiers, the peculiarities of its operation, the input of source data and the output of calculated results for printing are described in detail. The KOP program is written in Fortran and adapted for the EC-1033 computer.

  3. A program for calculating group constants on the basis of libraries of evaluated neutron data

    International Nuclear Information System (INIS)

    Sinitsa, V.V.

    1987-01-01

    The GRUKON program is designed for processing libraries of evaluated neutron data into group and fine-group (having some 300 groups) microscopic constants. In structure it is a package of applications programs with three basic components: a monitor, a command language and a library of functional modules. The first operative version of the package was restricted to obtaining mid-group non-block cross-sections from evaluated neutron data libraries in the ENDF/B format. This was then used to process other libraries. In the next two versions, cross-section table conversion modules and self-shielding factor calculation modules, respectively, were added to the functions already in the package. Currently, a fourth version of the GRUKON applications program package, for calculation of sub-group parameters, is under preparation. (author)
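
    As a companion illustration of the core operation such a package performs, the hedged sketch below collapses pointwise cross-section data into group constants by flux weighting. The 1/E weighting spectrum, the trapezoidal integration and all names are generic textbook choices and assumptions, not GRUKON's actual algorithm or data formats.

```python
import numpy as np

def _trapz(y, x):
    """Trapezoidal integral of y over x (1-D arrays on the same grid)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def collapse_to_groups(energy, sigma, group_bounds, weight=lambda e: 1.0 / e):
    """Flux-weighted group constants: sigma_g = int sigma*w dE / int w dE per group."""
    energy = np.asarray(energy, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    w = weight(energy)
    groups = []
    for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
        mask = (energy >= lo) & (energy <= hi)
        e, s, wg = energy[mask], sigma[mask], w[mask]
        num, den = _trapz(s * wg, e), _trapz(wg, e)
        groups.append(num / den if den > 0.0 else 0.0)
    return np.array(groups)

# Example: a synthetic 1/v-like cross section collapsed onto three coarse groups
e_grid = np.logspace(-2, 6, 2000)                 # energy grid, eV
sigma_pt = 10.0 / np.sqrt(e_grid) + 2.0           # pointwise cross section, barns (synthetic)
print(collapse_to_groups(e_grid, sigma_pt, [1e-2, 1.0, 1e3, 1e6]))
```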

  4. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1984-01-01

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  5. Fatigue analysis through automated cycle counting using ThermAND

    International Nuclear Information System (INIS)

    Burton, G.R.; Ding, Y.; Scovil, A.; Yetisir, M.

    2008-01-01

    The potential for fatigue damage due to thermal transients is one of the degradation mechanisms that needs to be managed for plant components. The original design of CANDU stations accounts for projected fatigue usage for specific components over a specified design lifetime. Fatigue design calculations were based on estimates of the number and severity of expected transients for 30 years operation at 80% power. Many CANDU plants are now approaching the end of their design lives and are being considered for extended operation. Industry practice is to have a comprehensive fatigue management program in place for extended operation beyond the original design life. A CANDU-specific framework for fatigue management has recently been developed to identify the options for implementation, and the critical components and locations requiring long-term fatigue monitoring. An essential element of fatigue monitoring is to identify, count and monitor the number of plant transients to ensure that the number assumed in the original design is not exceeded. The number and severity of actual CANDU station thermal transients at key locations in critical systems have been assessed using ThermAND, AECL's health monitor for systems and components, based on archived station operational data. The automated cycle counting has demonstrated that actual transients are generally less numerous than the quantity assumed in the design basis, and are almost always significantly less severe. This paper will discuss the methodology to adapt ThermAND for automated cycle counting of specific system transients, illustrate and test this capability for cycle-based fatigue monitoring using CANDU station data, report the results, and provide data for stress-based fatigue calculations. (author)
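
    To illustrate what automated cycle counting of plant transients involves, the sketch below flags and counts excursions in an archived temperature trace whose amplitude exceeds a hysteresis threshold. It is a generic illustration under assumed names and thresholds, not ThermAND's method; a production fatigue monitor would additionally bin cycles by severity (for example, rainflow counting per ASTM E1049) before any usage-factor calculation.

```python
def count_transients(temps, threshold=20.0):
    """Count thermal transients in a sampled temperature history.

    An excursion is counted each time the trace reverses direction by more
    than `threshold` after having moved by more than `threshold` away from
    the previous turning point. Returns the count and excursion amplitudes.
    """
    ref = temps[0]       # last confirmed turning point
    ext = temps[0]       # running extremum of the current excursion
    direction = 0        # +1 heating, -1 cooling, 0 not yet established
    amplitudes = []
    for t in temps[1:]:
        if direction == 0:
            if abs(t - ref) > threshold:
                direction = 1 if t > ref else -1
                ext = t
        elif direction == 1:
            if t > ext:
                ext = t
            elif ext - t > threshold:          # peak confirmed
                amplitudes.append(abs(ext - ref))
                ref, ext, direction = ext, t, -1
        else:
            if t < ext:
                ext = t
            elif t - ext > threshold:          # trough confirmed
                amplitudes.append(abs(ext - ref))
                ref, ext, direction = ext, t, 1
    return len(amplitudes), amplitudes

# Example: a synthetic trace with three large heat-up/cool-down excursions
trace = [280, 282, 305, 310, 285, 281, 300, 308, 284, 283]
print(count_transients(trace, threshold=15.0))
```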

  6. Automating the Technical Library at Los Angeles' Department of Information Systems.

    Science.gov (United States)

    Gillette, Robert

    1992-01-01

    Description of the automation of the technical library of the City of Los Angeles Department of Information Services provides background information on the department and its library; lists the automation project goals and objectives; and describes the two software programs--ObjectVision and Paradox Engine--used as applications development tools…

  7. Implementation of the optimization for the methodology of the neutronic calculation and thermo-hydraulic in IEA-R1 reactor

    International Nuclear Information System (INIS)

    Stefani, Giovanni Laranjo de; Conti, Thadeu das Neves; Fedorenko, Giuliana G.; Castro, Vinicius A.; Maio, Mireia F.; Santos, Thiago Augusto dos

    2011-01-01

    The objective of this work was to create a manager program to automate the programs and computer codes used for neutronic and thermo-hydraulic calculations of the IEA-R1 reactor, thus making the process of calculating safety parameters and of changing the core configuration up to 98% faster than the process used in the reactor today. The process was tested together with the reactor operators and is being implemented by the quality department. The main codes and programs involved in the configuration change calculations are Leopard, Hammer-Technion, Twodb, Citation and Cobra. Calculations of the delayed neutron and criticality coefficients required for the safety parameters are performed by Hammer-Technion and Citation in a process that involves about eleven repetitions so that all the necessary conditions (such as different moderator and fuel temperatures) are met. The results are entirely consistent with expectations and identical to those given by the manual process. The work thus demonstrates its reliability as well as the advantage of saving time: a process that could take up to four hours was turned into one that takes around five minutes when done on a home computer. Much of this advantage is due to subprograms that were created to treat the output of each program used and transform it into the input of the next, extracting the essential intermediate data and thereby also avoiding possible human error in handling the large amount of data involved. (author)
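
    The chaining described above, where each code's output is post-processed into the next code's input, is essentially a small pipeline driver. A minimal sketch of that idea is shown below; the stage names follow the codes listed in the abstract, but the command lines, file names and the extract_for_next_stage parser are placeholders, not the actual IEA-R1 tooling.

```python
import subprocess
from pathlib import Path

def extract_for_next_stage(raw_output):
    """Placeholder parser: in practice this would pull cross sections, fluxes,
    etc. out of the previous code's listing and reformat them as an input deck."""
    return raw_output

# (name, command line, input file, output file) -- all placeholders
PIPELINE = [
    ("LEOPARD",  "./leopard  < leopard.inp  > leopard.out",  "leopard.inp",  "leopard.out"),
    ("HAMMER",   "./hammer   < hammer.inp   > hammer.out",   "hammer.inp",   "hammer.out"),
    ("TWODB",    "./twodb    < twodb.inp    > twodb.out",    "twodb.inp",    "twodb.out"),
    ("CITATION", "./citation < citation.inp > citation.out", "citation.inp", "citation.out"),
    ("COBRA",    "./cobra    < cobra.inp    > cobra.out",    "cobra.inp",    "cobra.out"),
]

def run_pipeline():
    deck = None
    for name, command, input_file, output_file in PIPELINE:
        if deck is not None:
            Path(input_file).write_text(deck)   # feed data carried from the previous code
        print(f"running {name} ...")
        subprocess.run(command, shell=True, check=True)
        deck = extract_for_next_stage(Path(output_file).read_text())

if __name__ == "__main__":
    run_pipeline()
```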

  8. Automated Calculation of DIII-D Neutral Beam Availability

    International Nuclear Information System (INIS)

    Phillips, J.C.; Hong, R.M.; Scoville, B.G.

    1999-01-01

    The neutral beam systems for the DIII-D tokamak are an extremely reliable source of auxiliary plasma heating, capable of supplying up to 20 MW of injected power from eight separate beam sources into each tokamak discharge. The high availability of these systems for tokamak operations is sustained by careful monitoring of performance and following up on failures. One of the metrics for this performance is the requested injected power profile as compared to the power profile delivered for a particular pulse. Calculating this was once a relatively straightforward task; however, innovations such as the ability to modulate the beams and, more recently, the ability to substitute an idle beam for one that has failed during a plasma discharge have made the task very complex. For example, with this latest advance it is possible for one or more beams to have failed, yet the delivered power profile may appear perfect. Availability used to be calculated manually. This paper presents the methods and algorithms used to produce a system which performs the calculations based on information concerning the neutral beam and plasma current waveforms, along with post-discharge information from the Plasma Control System, which has the ability to issue commands for beams in real time. Plots representing both the requested and actual power profiles, along with statistics, are automatically displayed and updated each shot on a web-based interface viewable both at DIII-D and by our remote collaborators using no-cost software.
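
    The availability figure discussed above reduces to comparing time-integrated delivered beam power against the requested power over the same window, a bookkeeping that tolerates modulation naturally. The sketch below is a hedged illustration of that calculation only; waveform names and the tolerance are assumptions, and the real system additionally reconciles beam substitutions using Plasma Control System records.

```python
import numpy as np

def beam_availability(time, requested_kw, delivered_kw, tol=0.05):
    """Fraction of the requested injected energy actually delivered.

    `time` is a common time base (s); `requested_kw` and `delivered_kw` are
    power waveforms for one beam (or the sum over beams). Modulated beams are
    handled naturally because both waveforms are integrated over the window.
    """
    time = np.asarray(time, dtype=float)
    req = np.clip(np.asarray(requested_kw, dtype=float), 0.0, None)
    dlv = np.clip(np.asarray(delivered_kw, dtype=float), 0.0, None)
    dt = np.diff(time)
    e_req = np.sum(0.5 * (req[1:] + req[:-1]) * dt)   # trapezoidal energy
    e_dlv = np.sum(0.5 * (dlv[1:] + dlv[:-1]) * dt)
    if e_req <= 0.0:
        return 1.0                                    # nothing was requested
    availability = min(e_dlv / e_req, 1.0)
    return 1.0 if availability >= 1.0 - tol else availability

# Example: a beam that drops out for the last 20% of a 2 s request
t = np.linspace(0.0, 2.0, 201)
requested = np.full_like(t, 2500.0)                   # kW
delivered = np.where(t < 1.6, 2500.0, 0.0)
print(f"availability = {beam_availability(t, requested, delivered):.2f}")
```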

  9. Quanty4RIXS: a program for crystal field multiplet calculations of RIXS and RIXS-MCD spectra using Quanty.

    Science.gov (United States)

    Zimmermann, Patric; Green, Robert J; Haverkort, Maurits W; de Groot, Frank M F

    2018-05-01

    Some initial instructions for the Quanty4RIXS program, written in MATLAB®, are provided. The program assists in the calculation of 1s2p RIXS and 1s2p RIXS-MCD spectra using Quanty. Furthermore, 1s XAS and 2p3d RIXS calculations in different symmetries can also be performed. It includes the Hartree-Fock values for the Slater integrals and spin-orbit interactions for several 3d transition metal ions that are required to create the .lua scripts containing all necessary parameters and quantum mechanical definitions for the calculations. The program can be used free of charge and is designed to allow for further adjustments of the scripts.

  10. Evaluation of an automated karyotyping system for chromosome aberration analysis

    International Nuclear Information System (INIS)

    Prichard, H.M.

    1987-01-01

    Chromosome aberration analysis is a promising complement to conventional radiation dosimetry, particularly in the complex radiation fields encountered in the space environment. The capabilities of a recently developed automated karyotyping system were evaluated both to determine current capabilities and limitations and to suggest areas where future development should be emphasized. Cells exposed to radiomimetic chemicals and to photon and particulate radiation were evaluated by manual inspection and by automated karyotyping. It was demonstrated that the evaluated programs were appropriate for image digitization, storage, and transmission. However, automated and semi-automated scoring techniques must be advanced significantly if in-flight chromosome aberration analysis is to be practical. A degree of artificial intelligence may be necessary to realize this goal.

  11. A fascinating country in the world of computing your guide to automated reasoning

    CERN Document Server

    Wos, Larry

    1999-01-01

    This book shows you - through examples and puzzles and intriguing questions - how to make your computer reason logically. To help you, the book includes a CD-ROM with OTTER, the world's most powerful general-purpose reasoning program. The automation of reasoning has advanced markedly in the past few decades, and this book discusses some of the remarkable successes that automated reasoning programs have had in tackling challenging problems in mathematics, logic, program verification, and circuit design. Because the intended audience includes students and teachers, the book provides many exercises…

  12. Porting oxbash to linux and its application in SD-shell calculations

    International Nuclear Information System (INIS)

    Suman, H.; Suleiman, S.

    1998-01-01

    Oxbash, a code for nuclear structure calculations within the shell model approach, was ported to Linux, a UNIX clone for PCs. Due to many faults in the code version we had, deep corrective actions in the code had to be undertaken. This was done through intensive use of UNIX utilities like sed, nm and make, in addition to proper shell script programming. Our version contained calls to missing subroutines; some of these were included from C and f90 libraries, while others had to be written separately. All these actions were organized and automated through a robust system of Makefiles. Finally the code was tested and applied to nuclei with 18 and 20 nucleons. (author)

  13. Automated multi-lesion detection for referable diabetic retinopathy in indigenous health care.

    Science.gov (United States)

    Pires, Ramon; Carvalho, Tiago; Spurling, Geoffrey; Goldenstein, Siome; Wainer, Jacques; Luckie, Alan; Jelinek, Herbert F; Rocha, Anderson

    2015-01-01

    Diabetic Retinopathy (DR) is a complication of diabetes mellitus that affects more than one-quarter of the population with diabetes, and can lead to blindness if not discovered in time. An automated screening enables the identification of patients who need further medical attention. This study aimed to classify retinal images of Aboriginal and Torres Strait Islander peoples utilizing an automated computer-based multi-lesion eye screening program for diabetic retinopathy. The multi-lesion classifier was trained on 1,014 images from the São Paulo Eye Hospital and tested on retinal images containing no DR-related lesion, single lesions, or multiple types of lesions from the Inala Aboriginal and Torres Strait Islander health care centre. The automated multi-lesion classifier has the potential to enhance the efficiency of clinical practice delivering diabetic retinopathy screening. Our program does not necessitate image samples for training from any specific ethnic group or population being assessed and is independent of image pre- or post-processing to identify retinal lesions. In this Aboriginal and Torres Strait Islander population, the program achieved 100% sensitivity and 88.9% specificity in identifying bright lesions, while detection of red lesions achieved a sensitivity of 67% and specificity of 95%. When both bright and red lesions were present, 100% sensitivity with 88.9% specificity was obtained. All results obtained with this automated screening program meet WHO standards for diabetic retinopathy screening.
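
    For readers less familiar with the screening metrics quoted above, sensitivity and specificity are simple ratios over the confusion matrix; the short sketch below shows the arithmetic with illustrative counts that are not the study's data.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Illustrative counts only: these reproduce 67% sensitivity and 95% specificity
print(sensitivity_specificity(tp=67, fn=33, tn=95, fp=5))
```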

  14. A Python tool to set up relative free energy calculations in GROMACS.

    Science.gov (United States)

    Klimovich, Pavel V; Mobley, David L

    2015-11-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen a tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper (LOMAP; Liu et al. in J Comput Aided Mol Des 27(9):755-770, 2013), recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge (Mobley et al. in J Comput Aided Mol Des 28(4):135-150, 2014). Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations.
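
    As a rough illustration of the boilerplate such a setup tool removes, the sketch below writes one run directory per alchemical lambda window and fills a free-energy block of an .mdp template. The option names (free-energy, init-lambda-state, fep-lambdas, nstdhdl) follow my recollection of the GROMACS documentation and should be verified against your GROMACS version; this is an assumption-laden illustration, not alchemical-setup.py's actual output.

```python
from pathlib import Path

N_WINDOWS = 11
LAMBDAS = [i / (N_WINDOWS - 1) for i in range(N_WINDOWS)]

# Assumed mdp option names -- verify against the GROMACS free energy documentation.
MDP_TEMPLATE = """\
integrator          = sd
nsteps              = 500000
free-energy         = yes
init-lambda-state   = {state}
fep-lambdas         = {lambdas}
nstdhdl             = 100
"""

def write_windows(root="fep"):
    """Create one directory per lambda state, each with its own .mdp file."""
    lam_str = " ".join(f"{l:.2f}" for l in LAMBDAS)
    for state in range(N_WINDOWS):
        d = Path(root) / f"lambda_{state:02d}"
        d.mkdir(parents=True, exist_ok=True)
        (d / f"lambda_{state:02d}.mdp").write_text(
            MDP_TEMPLATE.format(state=state, lambdas=lam_str))

if __name__ == "__main__":
    write_windows()
```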

  15. A new program for calculating matrix elements of one-particle operators in jj-coupling

    International Nuclear Information System (INIS)

    Pyper, N.C.; Grant, I.P.; Beatham, N.

    1978-01-01

    The aim of this paper is to calculate the matrix elements of one-particle tensor operators occurring in atomic and nuclear theory between configuration state functions representing states containing any number of open shells in jj-coupling. The program calculates the angular part of these matrix elements. The program is essentially a new version of RDMEJJ, written by J.J. Chang. The aims of this version are to eliminate inconsistencies from RDMEJJ, to modify its input requirements for consistency with MCP75, and to modify its output so that it can be stored in a disc file for access by other compatible programs. The program assumes that the configurational states are built from a common orthonormal set of basis orbitals. The number of electrons in a shell having j ≥ 9/2 is restricted to be not greater than 2 by the available CFP routines. The present version allows up to 40 orbitals and 50 configurational states with ≤ 10 open shells; these numbers can be changed by recompiling with modified COMMON/DIMENSION statements. The user should ensure that the CPC library subprograms AAGD, ACRI incorporate all current updates and have been converted to use double precision floating point arithmetic. (Auth.)

  16. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet (TM)) for efficient data management.

  17. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

    Typically, oil field production operations have only been automated at fields with long term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper will describe: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; the research and development of the new electronic custody transfer method

  18. Designing Abstractions for JavaScript Program Analysis

    DEFF Research Database (Denmark)

    Andreasen, Esben Sparre

    JavaScript is a widely used dynamic programming language. What started out as a client-side scripting language for browsers is now used for large applications in many different settings. As for other dynamic languages, JavaScript makes it easy to write programs quickly without being constrained by the language, and programmers exploit that power to write highly dynamic programs. Automated tools for helping programmers and optimizing programs are used successfully for many programming languages. Unfortunately, the automated tools for JavaScript are not as good as for other programming languages. The program analyses that the automated tools are built upon are poorly suited to deal with the highly dynamic nature of JavaScript programs. The lack of language restrictions on the programmer is detrimental to the quality of program analyses for JavaScript. The aim of this dissertation is to address…

  19. PERL-2 and LAVR-2 programs for Monte Carlo calculation of reactivity disturbances with trajectory correlation using random numbers

    International Nuclear Information System (INIS)

    Kamaeva, O.B.; Polevoj, V.B.

    1983-01-01

    The realization on the BESM-6 computer of a technique for calculating a wide class of reactivity disturbances, by plotting trajectories in the undisturbed and disturbed systems using one sequence of random numbers, is described. The technique was realized on the basis of earlier created programs for the calculation of widespread (PERL) and local (LAVR) reactivity disturbances. The efficiency of the technique and programs is demonstrated by calculating the change of the effective neutron multiplication factor when an absorber is substituted for a fuel element in a BFS-40 critical assembly, and by calculating control drum characteristics.
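
    The key trick described above, following trajectories in the undisturbed and disturbed systems with one sequence of random numbers, is commonly known as correlated sampling (common random numbers). The toy sketch below estimates a small change in a transmission-like quantity with and without a shared sequence to show the variance cancellation; the one-dimensional absorption model and all numbers are purely illustrative and unrelated to BFS-40.

```python
import math
import random

def transmission(sigma, n_histories, rng):
    """Toy 1-D, absorption-only model: fraction of particles that cross a
    slab one unit thick when the absorption cross section is `sigma`."""
    passed = 0
    for _ in range(n_histories):
        # free path sampled from an exponential distribution with rate sigma
        if -math.log(1.0 - rng.random()) / sigma > 1.0:
            passed += 1
    return passed / n_histories

N = 50_000
sigma0, sigma1 = 1.00, 1.02           # unperturbed and perturbed systems

# Independent sequences: the small difference is buried in statistical noise.
d_indep = transmission(sigma1, N, random.Random(1)) - transmission(sigma0, N, random.Random(2))

# Correlated sampling: one random sequence drives both systems, so most of
# the statistical noise cancels in the difference.
d_corr = transmission(sigma1, N, random.Random(7)) - transmission(sigma0, N, random.Random(7))

exact = math.exp(-sigma1) - math.exp(-sigma0)
print(f"independent sequences : {d_indep:+.5f}")
print(f"common random numbers : {d_corr:+.5f}   (exact {exact:+.5f})")
```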

  20. A program for calculating load coefficient matrices utilizing the force summation method, L218 (LOADS). Volume 1: Engineering and usage

    Science.gov (United States)

    Miller, R. D.; Anderson, L. R.

    1979-01-01

    The LOADS program L218, a digital computer program that calculates dynamic load coefficient matrices utilizing the force summation method, is described. The load equations are derived for a flight vehicle in straight and level flight and excited by gusts and/or control motions. In addition, sensor equations are calculated for use with an active control system. The load coefficient matrices are calculated for the following types of loads: translational and rotational accelerations, velocities, and displacements; panel aerodynamic forces; net panel forces; shears and moments. Program usage and a brief description of the analysis used are presented. A description of the design and structure of the program to aid those who will maintain and/or modify the program in the future is included.