WorldWideScience

Sample records for unix process control

  1. Utilities in UNIX; Programas de Utilidad UNIX

    Energy Technology Data Exchange (ETDEWEB)

    Perez, L

    2002-07-01

    This manual is aimed at users with some or extensive experience of the UNIX operating system, so that they can work more efficiently with the UNIX variants of most vendors. It covers the majority of UNIX commands, the shell built-in functions used to create scripts, and a brief explanation of the variables in several environments. In addition, other products are described that are increasingly integrated into most UNIX operating systems: for example, the scanning and processing language awk, the LPRng print server, the GNU utilities, the batch subsystem, etc. The manual was initially based on one specific UNIX, but it has been written to be usable with most existing UNIX systems: Tru64 UNIX, AIX, IRIX, HP-UX, Solaris and Linux. To this end, many examples have been included in the chapters. The purpose of this manual is to provide a UNIX reference for advanced users of any operating system in the UNIX family. (Author)

  2. Utilities in UNIX

    International Nuclear Information System (INIS)

    Perez, L.

    2002-01-01

    This manual is aimed at users with some or extensive experience of the UNIX operating system, so that they can work more efficiently with the UNIX variants of most vendors. It covers the majority of UNIX commands, the shell built-in functions used to create scripts, and a brief explanation of the variables in several environments. In addition, other products are described that are increasingly integrated into most UNIX operating systems: for example, the scanning and processing language awk, the LPRng print server, the GNU utilities, the batch subsystem, etc. The manual was initially based on one specific UNIX, but it has been written to be usable with most existing UNIX systems: Tru64 UNIX, AIX, IRIX, HP-UX, Solaris and Linux. To this end, many examples have been included in the chapters. The purpose of this manual is to provide a UNIX reference for advanced users of any operating system in the UNIX family. (Author)

  3. The hybrid UNIX controller for real-time data acquisition

    International Nuclear Information System (INIS)

    Huesman, R.H.; Klein, G.J.; Fleming, T.K.

    1996-01-01

    The authors describe a hybrid data acquisition architecture integrating a conventional UNIX workstation with CAMAC-based real-time hardware. The system combines the high-level programming simplicity and user interface of a UNIX workstation with the low-level timing control available from conventional real-time hardware. They detail this architecture as it has been implemented for control of the Donner 600-Crystal Positron Tomograph (PET600). Low-level data acquisition is carried out in this system using eight LeCroy 3588 histogrammers, which together, after derandomization, acquire events at rates up to 4 MHz, and two dedicated Motorola 6809 microprocessors, which arbitrate fine timing control during acquisition. A Sun Microsystems UNIX workstation is used for high-level control, allowing an easily extensible user interface in an X-Windows environment, as well as real-time communications to the low-level acquisition units. Communication between the high- and low-level units is carried out via a Jorway 73A SCSI-CAMAC crate controller and a serial interface. For this application, the hybrid configuration separates low-level from high-level control for ease of maintenance and provides a low-cost upgrade from dated high-level control hardware

  4. Implementation of a control system test environment in UNIX

    International Nuclear Information System (INIS)

    Brittain, C.R.; Otaduy, P.J.; Rovere, L.A.

    1990-01-01

    This paper discusses how UNIX features such as shared memory, remote procedure calls, and signalling have been used to implement a distributed computational environment ideal for the development and testing of digital control systems. The resulting environment, based on features commonly available in commercial workstations, is flexible, allows process simulation and controller development to proceed in parallel, and provides for testing and validation in a realistic environment. In addition, the use of shared memory to exchange data allows other tasks such as user interfaces and recorders to be added without affecting the process simulation or controllers. A library of functions is presented which provides a simple interface to the features described. These functions can be used in either C or FORTRAN programs and have been tested on a network of Sun workstations and an ENCORE parallel computer. 6 refs., 2 figs

  5. The equipment access software for a distributed UNIX-based accelerator control system

    International Nuclear Information System (INIS)

    Trofimov, Nikolai; Zelepoukine, Serguei; Zharkov, Eugeny; Charrue, Pierre; Gareyte, Claire; Poirier, Herve

    1994-01-01

    This paper presents a generic equipment access software package for a distributed control system using computers with UNIX or UNIX-like operating systems. The package consists of three main components: an application Equipment Access Library, a Message Handler and an Equipment Data Base. An application task, which may run on any computer in the network, sends requests to access equipment through Equipment Library calls. The basic request is of the form Equipment-Action-Data and is routed via a remote procedure call to the computer to which the given equipment is connected. In this computer the request is received by the Message Handler. According to the type of the equipment connection, the Message Handler either passes the request to the specific process software in the same computer or forwards it to a lower-level network of equipment controllers using MIL1553B, GPIB, RS232 or BITBUS communication. The answer is then returned to the calling application. Descriptive information required for request routing and processing is stored in the real-time Equipment Data Base. The package has been written to be portable and is currently available on DEC Ultrix, LynxOS, HP-UX, XENIX, OS-9 and Apollo Domain. ((orig.))

  6. Progress of data processing system in JT-60 utilizing the UNIX-based workstations

    International Nuclear Information System (INIS)

    Sakata, Shinya; Kiyono, Kimihiro; Oshima, Takayuki; Sato, Minoru; Ozeki, Takahisa

    2007-07-01

    The JT-60 data processing system (DPS) has a three-level hierarchy. At the top of the hierarchy is the JT-60 inter-shot processor (MSP-ISP), a mainframe computer, which provides communication with the JT-60 supervisory control system and supervises the internal communication inside the DPS. The middle level of the hierarchy has minicomputers, and the bottom level has the individual diagnostic subsystems, which consist of CAMAC and VME modules. To meet the demand for advanced diagnostics, the DPS has progressed in stages from a three-level hierarchy, dependent on the processing power of the MSP-ISP, to a two-level hierarchy: a decentralized data processing system (New-DPS) built on UNIX-based workstations and network technology. This replacement was accomplished, and the New-DPS started operation in October 2005. In this report, we describe the development and improvement of the New-DPS, whose functions were decentralized from the MSP-ISP to the UNIX-based workstations. (author)

  7. Doing accelerator physics using SDDS, UNIX, and EPICS

    International Nuclear Information System (INIS)

    Borland, M.; Emery, L.; Sereno, N.

    1995-01-01

    The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Control System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of the application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization

  8. UNIX by examples

    International Nuclear Information System (INIS)

    Nee, F.

    1990-10-01

    This report discusses the following topics in basic UNIX programming: file structure; frequently used commands/utilities; control structures used in shell scripts; C-shell programming; and Bourne shell programming

  9. UNIX code management and distribution

    International Nuclear Information System (INIS)

    Hung, T.; Kunz, P.F.

    1992-09-01

    We describe a code management and distribution system based on tools freely available for UNIX systems. At the master site, version control is managed with CVS, which is a layer on top of RCS, and distribution is done via NFS-mounted file systems. At remote sites, small modifications to CVS provide for interactive transactions with the CVS system at the master site, such that remote developers are true peers in the code development process

  10. Development of a UNIX network compatible reactivity computer

    International Nuclear Information System (INIS)

    Sanchez, R.F.; Edwards, R.M.

    1996-01-01

    A state-of-the-art UNIX network compatible controller and a UNIX host workstation with MATLAB/SIMULINK software were used to develop, implement, and validate a digital reactivity calculation. One objective of the development was to determine why the reactivity output of a Macintosh-based reactivity computer drifted intolerably

  11. Learning Unix for Mac OS X Tiger Unlock the Power of Unix

    CERN Document Server

    Taylor, Dave

    2005-01-01

    Thoroughly revised and updated for Mac OS X Tiger, this new edition introduces Mac users to the Terminal application and shows you how to navigate the command interface, explore hundreds of Unix applications that come with the Mac, and, most importantly, how to take advantage of both the Mac and Unix interfaces. If you want to master the command-line, this gentle guide to using Unix on Mac OS X Tiger is well worth its cover price

  12. Multimedia Synchronization and UNIX-or-If Multimedia Support is the Problem, Is UNIX the Solution?

    NARCIS (Netherlands)

    D.C.A. Bulterman (Dick); G. van Rossum (Guido); D.T. Winter (Dik)

    1991-01-01

    This paper considers the role of UNIX in supporting multimedia applications. In particular, we consider the ability of the UNIX operating system (in general) and the UNIX I/O system (in particular) to support the synchronization of a number of high-bandwidth data sets that must be

  13. Practical Unix and Internet Security

    CERN Document Server

    Garfinkel, Simson; Spafford, Gene

    2003-01-01

    When Practical Unix Security was first published more than a decade ago, it became an instant classic. Crammed with information about host security, it saved many a Unix system administrator from disaster. The second edition added much-needed Internet security coverage and doubled the size of the original volume. The third edition is a comprehensive update of this very popular book - a companion for the Unix/Linux system administrator who needs to secure his or her organization's system, networks, and web presence in an increasingly hostile world. Focusing on the four most popular Unix varia

  14. Fermi UNIX™ environment

    International Nuclear Information System (INIS)

    Nicholls, J.

    1991-03-01

    The introduction of UNIX at Fermilab involves multiple platforms and multiple vendors. Additionally, a single user may have to use more than one platform. This heterogeneity and multiplicity make it necessary to define a Fermilab environment for UNIX so that, as much as possible, the systems "look and feel" the same. We describe our environment, including both the commercial products and the local tools used to support it. Other products designed for the UNIX environment are also described. 19 refs

  15. Mac OS X Tiger for Unix Geeks

    CERN Document Server

    Jepson, Brian

    2005-01-01

    If you're one of the many Unix developers drawn to Mac OS X for its Unix core, you'll find yourself in surprisingly unfamiliar territory. Unix and Mac OS X are kissing cousins, but there are enough pitfalls and minefields in going from one to another that even a Unix guru can stumble, and most guides to Mac OS X are written for Mac aficionados. For a Unix developer, approaching Tiger from the Mac side is a bit like learning Russian by reading the Russian side of a Russian-English dictionary. Fortunately, O'Reilly has been the Unix authority for over 25 years, and in Mac OS X Tiger for Unix Gee

  16. Real-time UNIX in HEP data acquisition

    International Nuclear Information System (INIS)

    Buono, S.; Gaponenko, I.; Jones, R.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P.Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Aguer, M.; Huet, M.

    1994-01-01

    Today's experimentation in high energy physics is characterized by an increasing need for sensitivity to rare phenomena and complex physics signatures, which require the use of huge and sophisticated detectors and consequently a high-performance readout and data acquisition system. Multi-level triggering, hierarchical data collection and an ever-increasing amount of processing power, distributed throughout the data acquisition layers, will impose a number of features on the software environment, especially the need for a high level of standardization. Real-time UNIX seems today to be the best solution for the platform independence, operating system interface standards and real-time features necessary for data acquisition in HEP experiments. We present the results of the evaluation, in a realistic application environment, of a real-time UNIX operating system: the EP/LX real-time UNIX system. ((orig.))

  17. Switching the JLab Accelerator Operations Environment from an HP-UX Unix-based to a PC/Linux-based environment

    International Nuclear Information System (INIS)

    Mcguckin, Theodore

    2008-01-01

    The Jefferson Lab Accelerator Controls Environment (ACE) was predominantly based on the HP-UX Unix platform from 1987 through the summer of 2004. During this period the Accelerator Machine Control Center (MCC) underwent a major renovation which included introducing Redhat Enterprise Linux machines, first as specialized process servers and then gradually as general login servers. As computer programs and scripts required to run the accelerator were modified, and inherent problems with the HP-UX platform compounded, more development tools became available for use with Linux and the MCC began to be converted over. In May 2008 the last HP-UX Unix login machine was removed from the MCC, leaving only a few Unix-based remote-login servers still available. This presentation will explore the process of converting an operational Control Room environment from the HP-UX to Linux platform as well as the many hurdles that had to be overcome throughout the transition period

  18. Mac OS X for Unix Geeks (Leopard)

    CERN Document Server

    Rothman, Ernest E; Rosen, Rich

    2009-01-01

    If you've been lured to Mac OS X because of its Unix roots, this invaluable book serves as a bridge between Apple's Darwin OS and the more traditional Unix systems. The new edition offers a complete tour of Mac OS X's Unix shell for Leopard and Tiger, and helps you find the facilities that replace or correspond to standard Unix utilities. Learn how to compile code, link to libraries, and port Unix software to Mac OS X and much more with this concise guide.

  19. A secure file manager for UNIX

    Energy Technology Data Exchange (ETDEWEB)

    DeVries, R.G.

    1990-12-31

    The development of a secure file management system for a UNIX-based computer facility with supercomputers and workstations is described. Specifically, UNIX in its usual form does not address: (1) Operation which would satisfy rigorous security requirements. (2) Online space management in an environment where total data demands would be many times the actual online capacity. (3) Making the file management system part of a computer network in which users of any computer in the local network could retrieve data generated on any other computer in the network. The characteristics of UNIX can be exploited to develop a portable, secure file manager which would operate on computer systems ranging from workstations to supercomputers. Implementation considerations making unusual use of UNIX features, rather than requiring extensive internal system changes, are described, and implementation using the Cray Research Inc. UNICOS operating system is outlined.

  20. Unix Philosophy and the Real World: Control Software for Humanoid Robots

    Directory of Open Access Journals (Sweden)

    Neil Thomas Dantam

    2016-03-01

    Robot software combines the challenges of general purpose and real-time software, requiring complex logic and bounded resource use. Physical safety, particularly for dynamic systems such as humanoid robots, depends on correct software. General purpose computation has converged on unix-like operating systems -- standardized as POSIX, the Portable Operating System Interface -- for devices from cellular phones to supercomputers. The modular, multi-process design typical of POSIX applications is effective for building complex and reliable software. Absent from POSIX, however, is an interprocess communication mechanism that prioritizes newer data as typically desired for control of physical systems. We address this need in the Ach communication library, which provides suitable semantics and performance for real-time robot control. Although initially designed for humanoid robots, Ach has broader applicability to complex mechatronic devices -- humanoid and otherwise -- that require real-time coupling of sensors, control, planning, and actuation. The initial user space implementation of Ach was limited in the ability to receive data from multiple sources. We remove this limitation by implementing Ach as a Linux kernel module, enabling Ach's high-performance and latest-message-favored semantics within conventional POSIX communication pipelines. We discuss how these POSIX interfaces and design principles apply to robot software, and we present a case study using the Ach kernel module for communication on the Baxter robot.

  1. The UNIX/XENIX Advantage: Applications in Libraries.

    Science.gov (United States)

    Gordon, Kelly L.

    1988-01-01

    Discusses the application of the UNIX/XENIX operating system to support administrative office automation functions--word processing, spreadsheets, database management systems, electronic mail, and communications--at the Central Michigan University Libraries. Advantages and disadvantages of the XENIX operating system and system configuration are…

  2. A small Unix-based data acquisition system

    International Nuclear Information System (INIS)

    Engberg, D.; Glanzman, T.

    1993-06-01

    The proposed SLAC B Factory detector plans to use Unix-based machines for all aspects of computing, including real-time data acquisition and experimental control. An R&D program has been established to investigate the use of Unix in the various aspects of experimental computation. Earlier R&D work investigated the basic real-time aspects of the IBM RS/6000 workstation running AIX. The next step in this R&D is the construction of a prototype data acquisition system which attempts to exercise many of the features needed in the final on-line system in a realistic situation. For this project, we have combined efforts with a team studying the use of novel cell designs and gas mixtures in a new prototype drift chamber

  3. Unix Application Migration Guide

    CERN Document Server

    Microsoft. Redmond

    2003-01-01

    Drawing on the experience of Microsoft consultants working in the field, as well as external organizations that have migrated from UNIX to Microsoft® Windows®, this guide offers practical, prescriptive guidance on the issues you are likely to face when porting existing UNIX applications to the Windows operating system environment. Senior IT decision makers, network managers, and operations managers will get real-world guidance and best practices on planning and implementation issues to understand the different methods through which migration or co-existence can be accomplished. Also detailing

  4. Integrating UNIX workstation into existing online data acquisition systems for Fermilab experiments

    International Nuclear Information System (INIS)

    Oleynik, G.

    1991-03-01

    With the availability of cost-effective computing power from multiple vendors of UNIX workstations, experiments at Fermilab are adding such computers to their VMS-based online data acquisition systems. In anticipation of this trend, we have extended the software products available in our widely used VAXONLINE and PANDA data acquisition software systems to provide support for integrating these workstations into existing distributed online systems. The software packages we are providing pave the way for the smooth migration of applications from the current Data Acquisition Host and Monitoring computers running the VMS operating system to UNIX-based computers of various flavors. We report on software for Online Event Distribution from VAXONLINE and PANDA, integration of Message Reporting Facilities, and a framework under UNIX for experiments to monitor and view the raw event data produced at any level in their DA system. We have developed software that allows host UNIX computers to communicate with intelligent front-end embedded read-out controllers and processor boards running the pSOS operating system. Both RS-232 and Ethernet control paths are supported. This enables calibration and hardware-monitoring applications to be migrated to these platforms. 6 refs., 5 figs

  5. Work with Apple's Rhapsody Operating System which Allows Simultaneous UNIX Program Development, UNIX Program Execution, and PC Application Execution

    OpenAIRE

    Summers, Don; Riley, Chris; Cremaldi, Lucien; Sanders, David

    2001-01-01

    Over the past decade, UNIX workstations have provided a very powerful program development environment. However, workstations are more expensive than PCs and Macintoshes and require a system manager for day-to-day tasks such as disk backup, adding users, and setting up print queues. Native commercial software for system maintenance and "PC applications" has been lacking under UNIX. Apple's new Rhapsody operating system puts the current MacOS on a NeXT UNIX foundation and adds an enhanced NeXTS...

  6. From a UNIX to a PC Based SCADA System

    CERN Document Server

    Momal, F

    1999-01-01

    In order to facilitate the development of supervisory applications involved in slow process control (such as cryogenic control), the LHC/IAS Group (Equipment Controls Group) opted, a few years ago, for an industrial SCADA package which runs on UNIX® platforms. However, to reduce costs and following market trends, it has been decided to move over to a PC-based package. Several processes relating to the testing of the prototypes of the LHC magnets are already controlled in this way. However, it was still necessary to provide all the services previously available to the users, for example, data archiving in central databases, real-time access through the Web, automatic GSM calls, etc. This paper presents the advantages and drawbacks of a PC-based package versus a Unix-based system. It also lists the criteria used in the market survey to arrive at the final selection, as well as the overall architecture, highlighting the developments needed to integrate the package into the global computing environment.

  7. Evaluation of Unix-Based Integrated Office Automation Products.

    Science.gov (United States)

    1994-04-01

    recipient preferences of networked UNIX users. An e-mail directory contains the preferred applications (e.g., FrameMaker, Excel) for each user.

  8. Protection against hostile algorithms in UNIX software

    Science.gov (United States)

    Radatti, Peter V.

    1996-03-01

    Protection against hostile algorithms contained in Unix software is a growing concern without easy answers. Traditional methods used against similar attacks in other operating system environments such as MS-DOS or Macintosh are insufficient in the more complex environment provided by Unix. Additionally, Unix presents a special and significant problem in this regard due to its open and heterogeneous nature. These problems are expected to become both more common and more pronounced as 32-bit multiprocess network operating systems become popular. Therefore, the problems experienced today are a good indicator of the problems, and the solutions, that will be experienced in the future, no matter which operating system becomes predominant.

  9. Development of a new discharge control system utilizing UNIX workstations and VME-bus systems for JT-60

    Energy Technology Data Exchange (ETDEWEB)

    Akasaka, Hiromi; Sueoka, Michiharu; Takano, Shoji; Totsuka, Toshiyuki; Yonekawa, Izuru; Kurihara, Kenichi; Kimura, Toyoaki [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2002-01-01

    The JT-60 discharge control system, which had used HIDIC-80E 16-bit minicomputers and CAMAC systems since the start of the JT-60 experiment in 1985, was renewed in March 2001. The new system consists of a UNIX workstation and a VME-bus system, and features distributed control. The workstation performs message communication with the VME-bus system and the controllers of the JT-60 sub-systems, and carries out processing for discharge control, because of its flexibility in constructing a new network and modifying software. The VME-bus system performs discharge sequence control because it is suitable for fast real-time control and flexible for hardware extension. The replacement has improved the control function and reliability of the discharge control system and also provides sufficient performance for future modifications of JT-60. The new system has been running successfully since April 2001. The data acquisition speed was confirmed to be twice as fast as that of the previous system. This report describes the major functions of the discharge control system, the technical ideas behind its development, and the results of initial operation in detail. (author)

  10. A unix configuration engine

    International Nuclear Information System (INIS)

    Burgess, M.

    1994-06-01

    A high level description language is presented for the purpose of automatically configuring large heterogeneous networked unix environments, based on class-oriented abstractions. The configuration engine is portable and easily extensible

  11. Unix version of CALOR89 for calorimeter applications

    International Nuclear Information System (INIS)

    Handler, T.

    1992-01-01

    CALOR89 is a system of coupled Monte Carlo particle transport computer codes which has been successfully employed for the estimation of calorimeter parameters in High Energy Physics. In the past, CALOR89 ran on various IBM machines and on the CRAY X-MP at Lawrence Livermore Lab, machines with non-UNIX operating systems. In this report we present a UNIX version of CALOR89, which is especially suited to UNIX workstations. Moreover, CALOR89 has been supplemented with two new program packages which make it more user-friendly: CALPREP, a program for the preparation of input files for CALOR89 in general geometry, and ANALYZ, an analysis package to extract from CALOR89 the final results relevant to calorimeters. This report also provides two script files, LCALOR and PCALOR. LCALOR runs the CALOR89 sequence of programs and EGS4 for a given configuration sequentially on a single processor, and PCALOR runs them concurrently on a multiprocessor UNIX workstation

  12. Understanding Unix/Linux programming a guide to theory and practice

    CERN Document Server

    Molay, Bruce

    2003-01-01

    This book explains in a clear and coherent manner how Unix works, how to understand existing Unix programs, and how to design and create new Unix programs. The book is organized by subsystem, each presented in visual terms and explained using vivid metaphors. It breaks the information into manageable parts that can be presented, explained, and mastered. By using case studies and an extremely reader-friendly manner to illustrate complex ideas and concepts, the book covers the basics of systems programming, users, files and manuals, how to read a directory, using ls, writing pwd, studying stty, writing a video game, studying sh, environment and shell variables, I/O redirection and pipes, servers and sockets, writing a web server, license servers, and concurrent functions. For Unix system administrators and programmers, network programmers, and others who have used other operating systems and need to learn Unix programming to expand their skill sets.

  13. A UNIX-based prototype biomedical virtual image processor

    International Nuclear Information System (INIS)

    Fahy, J.B.; Kim, Y.

    1987-01-01

    The authors have developed a multiprocess virtual image processor for the IBM PC/AT, in order to maximize image processing software portability for biomedical applications. An interprocess communication scheme, based on two-way metacode exchange, has been developed and verified for this purpose. Application programs call a device-independent image processing library, which transfers commands over a shared data bridge to one or more Autonomous Virtual Image Processors (AVIP). Each AVIP runs as a separate process in the UNIX operating system, and implements the device-independent functions on the image processor to which it corresponds. Application programs can control multiple image processors at a time, change the image processor configuration used at any time, and are completely portable among image processors for which an AVIP has been implemented. Run-time speeds have been found to be acceptable for higher level functions, although rather slow for lower level functions, owing to the overhead associated with sending commands and data over the shared data bridge

  14. gLExec: gluing grid computing to the Unix world

    Science.gov (United States)

    Groep, D.; Koeroo, O.; Venekamp, G.

    2008-07-01

    The majority of compute resources in today's scientific grids are based on Unix and Unix-like operating systems. In this world, user and user-group management are based around the concepts of a numeric 'user ID' and 'group ID' that are local to the resource. In contrast, grid concepts of user and group management are centered around globally assigned identifiers and VO membership, structures that are independent of any specific resource. At the fabric boundary, these 'grid identities' have to be translated to Unix user IDs. New job submission methodologies, such as job-execution web services, community-deployed local schedulers, and the late binding of user jobs in a grid-wide overlay network of 'pilot jobs', push this fabric boundary ever further down into the resource. gLExec, a light-weight (and thereby auditable) credential mapping and authorization system, addresses these issues. It can be run both on the fabric boundary, as part of an execution web service, and on the worker node in a late-binding scenario. In this contribution we describe the rationale for gLExec, how it interacts with site authorization and credential mapping frameworks such as LCAS, LCMAPS and GUMS, and how it can be used to improve site control and traceability in a pilot-job system.

  15. gLExec: gluing grid computing to the Unix world

    International Nuclear Information System (INIS)

    Groep, D; Koeroo, O; Venekamp, G

    2008-01-01

    The majority of compute resources in today's scientific grids are based on Unix and Unix-like operating systems. In this world, user and user-group management are based around the concepts of a numeric 'user ID' and 'group ID' that are local to the resource. In contrast, grid concepts of user and group management are centered around globally assigned identifiers and VO membership, structures that are independent of any specific resource. At the fabric boundary, these 'grid identities' have to be translated to Unix user IDs. New job submission methodologies, such as job-execution web services, community-deployed local schedulers, and the late binding of user jobs in a grid-wide overlay network of 'pilot jobs', push this fabric boundary ever further down into the resource. gLExec, a light-weight (and thereby auditable) credential mapping and authorization system, addresses these issues. It can be run both at the fabric boundary, as part of an execution web service, and on the worker node in a late-binding scenario. In this contribution we describe the rationale for gLExec, how it interacts with site authorization and credential mapping frameworks such as LCAS, LCMAPS and GUMS, and how it can be used to improve site control and traceability in a pilot-job system.

  16. Unix Security Cookbook

    Science.gov (United States)

    Rehan, S. C.

    This document has been written to help Site Managers secure their Unix hosts from being compromised by hackers. I have given brief introductions to the security tools, along with information on downloading, configuring and running them. I have also included a section with my recommendations for installing these security tools, starting from an absolute minimum security requirement.

  17. Use of UNIX in large online processor farms

    Science.gov (United States)

    Biel, Joseph R.

    1990-08-01

    There has been a recent rapid increase in the power of RISC computers running the UNIX operating system. Fermilab has begun to make use of these computers in the next generation of offline computer farms. It is also planning to use such computers in online computer farms. Issues involved in constructing online UNIX farms are discussed.

  18. The Newcastle connection: A software subsystem for constructing distributed UNIX systems

    International Nuclear Information System (INIS)

    Randell, B.

    1985-01-01

    The Newcastle connection is a software subsystem that can be added to each of a set of physically interconnected UNIX or UNIX look-alike systems, so as to construct a distributed system which is functionally indistinguishable at both the user and the program level from a conventional single-processor UNIX system. The techniques used are applicable to a variety and multiplicity of both local and wide area networks, and enable all issues of inter-processor communication, network protocols, etc., to be hidden. A brief account is given of experience with such distributed systems, the first of which was constructed in 1982 using a set of PDP11s running UNIX Version 7, and connected by a Cambridge Ring - since this date the Connection has been used to construct distributed systems based on various other computers and versions of UNIX, both at Newcastle and elsewhere. The final sections compare our scheme to various precursor schemes and discuss its potential relevance to other operating systems. (orig.)

  19. Script-viruses Attacks on UNIX OS

    Directory of Open Access Journals (Sweden)

    D. M. Mikhaylov

    2010-06-01

    In this article attacks on UNIX OS are considered. Antivirus developers currently concentrate on protecting systems from the viruses that are most common and that attack popular operating systems. If a system or its components are rarely attacked, antivirus products do not protect those components, as doing so is not profitable. The situation is the same for script viruses targeting UNIX OS, as most experts consider it impossible for such viruses to obtain enough rights to attack. Nevertheless, the main conclusion of this article is that such viruses can be very powerful, can attack systems, and can obtain sufficient rights.

  20. CERN's common Unix and X terminal environment

    International Nuclear Information System (INIS)

    Cass, Tony

    1996-01-01

    The Desktop Infrastructure Group of CERN's Computing and Networks Division has developed a Common Unix and X Terminal Environment to ease the migration to Unix-based interactive computing. The CUTE architecture relies on a distributed filesystem - currently Transarc's AFS - to enable essentially interchangeable client workstations to access both home directory and program files transparently. Additionally, we provide a suite of programs to configure workstations for CUTE and to ensure continued compatibility. This paper describes the different components and the development of the CUTE architecture. (author)

  1. TkPl_SU: An Open-source Perl Script Builder for Seismic Unix

    Science.gov (United States)

    Lorenzo, J. M.

    2017-12-01

    TkPl_SU (beta) is a graphical user interface (GUI) for selecting parameters for Seismic Unix (SU) modules. Seismic Unix (Stockwell, 1999) is a widely distributed free software package for seismic reflection processing and signal processing. Perl/Tk is a mature, well-documented and free object-oriented graphical user interface for Perl. In a classroom environment, shell scripting of SU modules engages students and helps focus on the theoretical limitations and strengths of signal processing. However, complex interactive processing stages, e.g., selection of optimal stacking velocities, killing bad data traces, or spectral analysis, require advanced flows beyond the scope of introductory classes. In a research setting, special functionality from other free seismic processing software such as SioSeis (UCSD-NSF) can be incorporated readily via an object-oriented style of programming. An object-oriented approach is a first step toward efficient extensible programming of multi-step processes, and a simple GUI simplifies parameter selection and decision making. Currently, in TkPl_SU, Perl 5 packages wrap 19 of the most common SU modules that are used in teaching undergraduate and first-year graduate classes (e.g., filtering, display, velocity analysis and stacking). Perl packages (classes) can advantageously add new functionality around each module and clarify parameter names for easier usage. For example, through the use of methods, packages can isolate the user from repetitive control structures, as well as replace the names of abbreviated parameters with self-describing names. Moose, an extension of the Perl 5 object system, greatly facilitates an object-oriented style. Perl wrappers are self-documenting via Perl programming document markup language.
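    The kind of wrapping the abstract describes - replacing terse SU parameter names with self-describing ones - can be illustrated in a few lines. The class, name map, and parameter spellings below are a hypothetical Python sketch, not TkPl_SU's actual Perl API:

    ```python
    class SUModule:
        """Toy wrapper around a Seismic Unix module: callers use
        self-describing parameter names, and the wrapper emits the
        module's terse name=value command-line syntax."""

        def __init__(self, program, name_map):
            self._program = program    # SU executable name
            self._name_map = name_map  # long name -> terse SU parameter
            self._params = {}

        def set(self, **kwargs):
            for long_name, value in kwargs.items():
                if long_name not in self._name_map:
                    raise ValueError("unknown parameter: %s" % long_name)
                self._params[self._name_map[long_name]] = value
            return self

        def command(self):
            args = " ".join("%s=%s" % (k, v)
                            for k, v in sorted(self._params.items()))
            return ("%s %s" % (self._program, args)).strip()

    # Hypothetical wrapper for an SU band-pass filter module.
    sufilter = SUModule("sufilter", {"frequencies": "f", "amplitudes": "amps"})
    sufilter.set(frequencies="5,10,40,50", amplitudes="0,1,1,0")
    print(sufilter.command())  # sufilter amps=0,1,1,0 f=5,10,40,50
    ```

    A GUI such as TkPl_SU can then present the long names as labeled widgets while the wrapper, not the student, remembers the abbreviated spellings.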

  2. Python for Unix and Linux system administration

    CERN Document Server

    Gift, Noah

    2007-01-01

    Python is an ideal language for solving problems, especially in Linux and Unix networks. With this pragmatic book, administrators can review various tasks that often occur in the management of these systems, and learn how Python can provide a more efficient and less painful way to handle them. Each chapter in Python for Unix and Linux System Administration presents a particular administrative issue, such as concurrency or data backup, and presents Python solutions through hands-on examples. Once you finish this book, you'll be able to develop your own set of command-line utilities with Python.

  3. Real-time on a standard UNIX workstation?

    International Nuclear Information System (INIS)

    Glanzman, T.

    1992-09-01

    This is a report of an ongoing R&D project which is investigating the use of standard UNIX workstations for real-time data acquisition for a major new experimental initiative, the SLAC B Factory (PEP II). For this work an IBM RS/6000 workstation running the AIX operating system is used. Real-time extensions to the UNIX operating system are explored and their performance measured. These extensions comprise a set of AIX-specific and POSIX-compliant system services. Benchmark comparisons are made with embedded processor technologies. Results are presented for a simple prototype on-line system for laboratory testing of a new prototype drift chamber.

  4. The Fermi Unix Environment - Dealing with Adolescence

    Science.gov (United States)

    Pordes, Ruth; Nicholls, Judy; Wicks, Matt

    Fermilab's Computing Division started early in the definition, implementation, and promulgation of a common environment for users across the Laboratory's UNIX platforms and installations. Based on our experience over nearly five years, we discuss the status of the effort, ongoing developments and needs, and some analysis of where we could have done better, and identify future directions to allow us to provide better and more complete service to our customers. In particular, with the power of the new PCs making enthusiastic converts of physicists to the PC world, we are faced with the challenge of expanding the paradigm to non-UNIX platforms in a uniform and consistent way.

  5. Unix Domain Sockets Applied in Android Malware Should Not Be Ignored

    Directory of Open Access Journals (Sweden)

    Xu Jiang

    2018-03-01

    Increasingly, malicious Android apps use various methods to steal private user data without their knowledge. Detecting the leakage of private data is the focus of mobile information security. An initial investigation found that none of the existing security analysis systems can track the flow of information through Unix domain sockets to detect the leakage of private data through such sockets, which can result in zero-day exploits in the information security field. In this paper, we conduct the first systematic study on Unix domain sockets as applied in Android apps. Then, we identify scenarios in which such apps can leak private data through Unix domain sockets, which the existing dynamic taint analysis systems do not catch. Based on these insights, we propose and implement JDroid, a taint analysis system that can track information flows through Unix domain sockets effectively to detect such privacy leaks.
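    The channel in question is easy to demonstrate. The sketch below (plain Python on any Unix host, not Android code) creates an AF_UNIX socket pair and passes a stand-in "secret" across it; nothing touches the filesystem or the network, which is why a taint tracker that does not model these sockets never observes the flow:

    ```python
    import socket

    # An AF_UNIX socket pair: in-host IPC of the kind the study above found
    # untracked by existing taint-analysis systems. Bytes written on one end
    # (e.g. by a component holding private data) reappear on the other end
    # with no file or network activity for a monitor to intercept.
    sender, receiver = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)

    secret = b"device-id:1234"  # stand-in for private user data
    sender.sendall(secret)
    leaked = receiver.recv(1024)
    print(leaked.decode())

    sender.close()
    receiver.close()
    ```

    A named `AF_UNIX` socket bound to a filesystem or abstract address behaves the same way between two cooperating apps, which is the cross-app leakage scenario the paper targets.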

  6. Introduction of the UNIX International Performance Management Work Group

    Science.gov (United States)

    Newman, Henry

    1993-01-01

    In this paper we present the planned direction of the UNIX International Performance Management Work Group. The group consists of concerned system developers and users who have organized to synthesize recommendations for standard UNIX performance-management subsystem interfaces and architectures. These recommendations are intended to provide a core set of performance-management functions that hardware system developers, vertical application software developers, and performance application software developers can use to build tools.

  7. UNIX secure server : a free, secure, and functional server example

    OpenAIRE

    Sastre, Hugo

    2016-01-01

    The purpose of this thesis work was to introduce a UNIX server as a personal server but also as a starting point for investigation and development at a professional level. The objective of this thesis was to build a secure server providing not only an FTP server but also an HTTP server and a cloud system for remote backups. OpenBSD was used as the operating system. OpenBSD is a UNIX-like operating system made by hackers for hackers. The difference with other systems that might partially provid...

  8. Improved operating scenarios of the DIII-D tokamak as a result of the addition of UNIX computer systems

    International Nuclear Information System (INIS)

    Henline, P.A.

    1995-10-01

    The increased use of UNIX-based computer systems for machine control, data handling and analysis has greatly enhanced the operating scenarios and operating efficiency of the DIII-D tokamak. This paper will describe some of these UNIX systems and their specific uses. These include the plasma control system, the electron cyclotron heating control system, the analysis of electron temperature and density measurements, and the general data acquisition system (which is collecting over 130 Mbytes of data). The speed and total capability of these systems has dramatically affected the ability to operate DIII-D. The improved operating scenarios include better plasma shape control, due to the more thorough MHD calculations done between shots, and the new ability to see the time dependence of profile data as it relates across different spatial locations in the tokamak. Other analysis which engenders improved operating abilities will be described.

  9. HUMPF [Heterogeneous Unix Montecarlo Production Facility] users guide

    International Nuclear Information System (INIS)

    Cahill, P.; Edgecock, R.; Fisher, S.M.; Gee, C.N.P.; Gordon, J.C.; Kidd, T.; Leake, J.; Rigby, D.J.; Roberts, J.H.C.

    1992-11-01

    The Heterogeneous Unix Monte Carlo Production Facility (HUMPF) simplifies the running of particle physics simulation programs on Unix workstations. Monte Carlo is the largest consumer of IBM CPU capacity within the Atlas centre at Rutherford Appleton Laboratory (RAL). It is likely that the future computing requirements of the LEP and HERA experiments cannot be satisfied by the IBM 3090 system. HUMPF adds extra capacity, and can be expanded with minimal effort. Monte Carlo programs are CPU-bound, and make little use of the vector or input/output capacity of the IBM 3090. Such programs are therefore excellent candidates to use the spare capacity of powerful workstations. The main data storage is still handled centrally by the IBM 3090 and its peripherals. The HUMPF facility is suitable for any program with a similar profile. (author)

  10. UNIX at high energy physics Laboratories

    Energy Technology Data Exchange (ETDEWEB)

    Silverman, Alan

    1994-03-15

    With more and more high energy physics Laboratories "downsizing" from large central proprietary mainframe computers towards distributed networks, usually involving UNIX operating systems, the need was expressed at the 1991 Computers in HEP (CHEP) Conference to create a group to consider the implications of this trend and perhaps work towards some common solutions to ease the transition for HEP users worldwide.

  11. UNIX at high energy physics Laboratories

    International Nuclear Information System (INIS)

    Silverman, Alan

    1994-01-01

    With more and more high energy physics Laboratories "downsizing" from large central proprietary mainframe computers towards distributed networks, usually involving UNIX operating systems, the need was expressed at the 1991 Computers in HEP (CHEP) Conference to create a group to consider the implications of this trend and perhaps work towards some common solutions to ease the transition for HEP users worldwide.

  12. San Onofre 2/3 simulator: The move from Unix to Windows

    International Nuclear Information System (INIS)

    Paquette, C.; Desouky, C.; Gagnon, V.

    2006-01-01

    CAE has been developing nuclear power plant (NPP) simulators for over 30 years for customers around the world. While numerous operating systems are used today for simulators, many of the existing simulators were developed to run on workstation-type computers using a variant of the Unix operating system. Today, thanks to the advances in the power and capabilities of personal computers (PCs), and because most simulators will eventually need to be upgraded, more and more of these RISC processor-based simulators will be converted to PC-based platforms running either the Windows or Linux operating systems. CAE's multi-platform simulation environment runs on the UNIX, Linux and Windows operating systems, enabling simulators to be 'open' and highly interoperable systems using industry-standard software components and methods. The result is simulators that are easier to maintain and modify as reference plants evolve. In early January 2003, CAE set out to upgrade Southern California Edison's San Onofre Unit 2/3 UNIX-based simulator with its latest integrated simulation environment. This environment includes CAE's instructor station Isis, the latest ROSE modeling and runtime tool, as well as the deployment of a new reactor kinetics model (COMET) and a new nuclear steam supply system model (ANTHEM2000). The chosen simulation platform is PC-based and runs the Windows XP operating system. The main features and achievements of the San Onofre 2/3 Simulator's modernization from RISC/Unix to Intel/Windows XP, running CAE's current simulation environment, are the subject of this paper. (author)

  13. Development of control and data processing system for CO2 laser interferometer

    International Nuclear Information System (INIS)

    Chiba, Shinichi; Kawano, Yasunori; Tsuchiya, Katsuhiko; Inoue, Akira

    2001-11-01

    The CO2 laser interferometer diagnostic has been operating to measure the central electron density in JT-60U plasmas. We have developed a control and data processing system for the CO2 laser interferometer with flexible functions for data acquisition, data processing and data transfer in accordance with the sequence of JT-60U discharges. The system is mainly composed of two UNIX workstations and CAMAC clusters, and high reliability was obtained by distributing the data-processing functions across the workstations. Consequently, the control and data processing system can now routinely provide electron density data immediately after a JT-60U discharge. Real-time feedback control of the electron density in JT-60U has also become available, using a reference density signal from the CO2 laser interferometer. (author)

  14. ADAM (Affordable Desktop Application Manager): a Unix desktop application manager

    International Nuclear Information System (INIS)

    Liebana, M.; Marquina, M.; Ramos, R.

    1996-01-01

    ADAM stands for Affordable Desktop Application Manager. It is a GUI developed at CERN with the aim of easing access to applications. The motivation to develop ADAM came from the unavailability of environments like COSE/CDE and their heavy resource consumption. ADAM has proven to be user friendly: new users are able to customize it to their needs in a few minutes. Groups of users may share a common application environment through ADAM. ADAM also integrates the Unix and PC worlds: PC users can access Unix applications in the same way as their usual Windows applications. This paper describes all the ADAM features, how they are used at CERN Public Services, and the future plans for ADAM. (author)

  15. Development of control and data processing system for CO{sub 2} laser interferometer

    Energy Technology Data Exchange (ETDEWEB)

    Chiba, Shinichi; Kawano, Yasunori; Tsuchiya, Katsuhiko; Inoue, Akira [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2001-11-01

    The CO{sub 2} laser interferometer diagnostic has been operating to measure the central electron density in JT-60U plasmas. We have developed a control and data processing system for the CO{sub 2} laser interferometer with flexible functions for data acquisition, data processing and data transfer in accordance with the sequence of JT-60U discharges. The system is mainly composed of two UNIX workstations and CAMAC clusters, and high reliability was obtained by distributing the data-processing functions across the workstations. Consequently, the control and data processing system can now routinely provide electron density data immediately after a JT-60U discharge. Real-time feedback control of the electron density in JT-60U has also become available, using a reference density signal from the CO{sub 2} laser interferometer. (author)

  16. MULTITASKER, Multitasking Kernel for C and FORTRAN Under UNIX

    International Nuclear Information System (INIS)

    Brooks, E.D. III

    1988-01-01

    1 - Description of program or function: MULTITASKER implements a multitasking kernel for the C and FORTRAN programming languages that runs under UNIX. The kernel provides a multitasking environment which serves two purposes. The first is to provide an efficient portable environment for the development, debugging, and execution of production multiprocessor programs. The second is to provide a means of evaluating the performance of a multitasking program on model multiprocessor hardware. The performance evaluation features require no changes in the application program source and are implemented as a set of compile- and run-time options in the kernel. 2 - Method of solution: The FORTRAN interface to the kernel is identical in function to the CRI multitasking package provided for the Cray XMP. This provides a migration path to high speed (but small N) multiprocessors once the application has been coded and debugged. With use of the UNIX m4 macro preprocessor, source compatibility can be achieved between the UNIX code development system and the target Cray multiprocessor. The kernel also provides a means of evaluating a program's performance on model multiprocessors. Execution traces may be obtained which allow the user to determine kernel overhead, memory conflicts between various tasks, and the average concurrency being exploited. The kernel may also be made to switch tasks every cpu instruction with a random execution ordering. This allows the user to look for unprotected critical regions in the program. These features, implemented as a set of compile- and run-time options, cause extra execution overhead which is not present in the standard production version of the kernel.

  17. A UNIX device driver for a Translink II Transputer board

    International Nuclear Information System (INIS)

    Wiley, J.C.

    1991-01-01

    A UNIX device driver for a TransLink II Transputer board is described. A complete listing of the code is presented. The device driver allows a transputer array to be used with the A/UX operating system.

  18. UNIX-a solution to the compatibility problem

    International Nuclear Information System (INIS)

    Gulbranson, R.L.

    1983-01-01

    The UNIX operating system (TM Bell Laboratories) has achieved a high degree of popularity in recent years. It is rapidly becoming a de facto standard as the operating system for 16- and 32-bit microcomputers. The adoption of this operating system by the physics community offers several substantial advantages: a portable software environment (editors, file system, etc.), freedom to choose among a variety of higher-level languages for software applications, and computer hardware vendor independence.

  19. Contribution to data acquisition software of the Eurogam and Diamant multidetectors in a distributed Unix/VXWorks environment; Contribution aux logiciels d'acquisition de donnees des multidetecteurs Eurogam et Diamant dans un environnement reparti Unix/VXWorks

    Energy Technology Data Exchange (ETDEWEB)

    Diarra, C

    1994-06-01

    Questions about nuclear matter require new, high-performance equipment. Eurogam, a 4π gamma-radiation multidetector, is a precious tool in gamma spectroscopy, but a charged-particle detector is also needed, and for this purpose Diamant is Eurogam's partner. These two multidetectors needed dedicated data-acquisition software. Acquisition control and management as a whole is based on Sun stations running the UNIX system. 56 figs., 64 refs.

  20. UNIX-based operating systems robustness evaluation

    Science.gov (United States)

    Chang, Yu-Ming

    1996-01-01

    Robust operating systems are required for reliable computing. Techniques for robustness evaluation of operating systems not only enhance the understanding of the reliability of computer systems, but also provide valuable feedback to system designers. This thesis presents results from robustness evaluation experiments on five UNIX-based operating systems, which include Digital Equipment's OSF/1, Hewlett Packard's HP-UX, Sun Microsystems' Solaris and SunOS, and Silicon Graphics' IRIX. Three sets of experiments were performed. The methodology for evaluation tested (1) the exception handling mechanism, (2) system resource management, and (3) system capacity under high workload stress. An exception generator was used to evaluate the exception handling mechanism of the operating systems. Results included the exit status of the exception generator and the system state. Resource management techniques used by individual operating systems were tested using programs designed to usurp system resources such as physical memory and process slots. Finally, the workload stress testing evaluated the effect of the workload on system performance by running a synthetic workload and recording the response time of local and remote user requests. Moderate to severe performance degradations were observed on the systems under stress.
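    The first class of experiments, exception-handling evaluation, can be pictured as a toy "exception generator" that invokes system calls with invalid arguments and records whether the kernel returns a clean, identifiable error. The thesis tool is far more systematic; the probes below are merely illustrative:

    ```python
    import errno
    import os

    def probe(call, *args):
        """Invoke a system call with (possibly invalid) arguments and
        report either 'ok' or the symbolic errno the kernel returned."""
        try:
            call(*args)
            return "ok"
        except OSError as e:
            return errno.errorcode.get(e.errno, "UNKNOWN")

    # Each probe hands the kernel a deliberately bad argument and
    # records how the exception is surfaced to user space.
    results = {
        "stat_bad_fd": probe(os.fstat, -1),              # invalid descriptor
        "close_bad_fd": probe(os.close, 10**6),          # descriptor not open
        "chdir_missing": probe(os.chdir, "/no/such/dir"),
    }
    print(results)
    ```

    A robustness study scales this idea up to thousands of call/argument combinations and also records the system state afterwards, since a hang or panic is a far worse outcome than a clean errno.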

  1. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. 
It also requires a math

  2. GEMPAK 5.1 - A GENERAL METEOROLOGICAL PACKAGE (UNIX VERSION)

    Science.gov (United States)

    Desjardins, M. L.

    1994-01-01

    GEMPAK is a general meteorological software package developed at NASA/Goddard Space Flight Center. It includes programs to analyze and display surface, upper-air, and gridded data, including model output. There are very general programs to list, edit, and plot data on maps, to display profiles and time series, to draw and fill contours, to draw streamlines, to plot symbols for clouds, sky cover, and pressure tendency, and draw cross sections in the case of gridded data and sounding data. In addition, there are Barnes objective analysis programs to grid surface and upper-air data. The programs include the capabilities to derive meteorological parameters from those found in the dataset, to perform vertical interpolations of sounding data to different coordinate systems, and to compute an extensive set of gridded diagnostic quantities by specifying various nested combinations of scalars and vector arithmetic, algebraic, and differential operators. The GEMPAK 5.1 graphics/transformation subsystem, GEMPLT, provides device-independent graphics. GEMPLT also has the capability to display output in a variety of map projections or overlaid on satellite imagery. GEMPAK 5.1 is written in FORTRAN 77 and C-language and has been implemented on VAX computers under VMS and on computers running the UNIX operating system. During installation and normal use, this package occupies approximately 100Mb of hard disk space. The UNIX version of GEMPAK includes drivers for several graphic output systems including MIT's X Window System (X11,R4), Sun GKS, PostScript (color and monochrome), Silicon Graphics, and others. The VMS version of GEMPAK also includes drivers for several graphic output systems including PostScript (color and monochrome). The VMS version is delivered with the object code for the Transportable Applications Environment (TAE) program, version 4.1 which serves as a user interface. A color monitor is recommended for displaying maps on video display devices. 
Data for rendering

  3. Contribution to data acquisition software of the Eurogam and Diamant multidetectors in a distributed Unix/VXWorks environment

    International Nuclear Information System (INIS)

    Diarra, C.

    1994-06-01

    Questions about nuclear matter require new, high-performance equipment. Eurogam, a 4π gamma-radiation multidetector, is a precious tool in gamma spectroscopy, but a charged-particle detector is also needed, and for this purpose Diamant is Eurogam's partner. These two multidetectors needed dedicated data-acquisition software. Acquisition control and management as a whole is based on Sun stations running the UNIX system. 56 figs., 64 refs.

  4. Palantiri: a distributed real-time database system for process control

    International Nuclear Information System (INIS)

    Tummers, B.J.; Heubers, W.P.J.

    1992-01-01

    The medium-energy accelerator MEA, located in Amsterdam, is controlled by a heterogeneous computer network. A large real-time database contains the parameters involved in the control of the accelerator and the experiments. This database system was implemented about ten years ago and has since been extended several times. In response to increased needs the database system has been redesigned. The new database environment, as described in this paper, consists of two new concepts: (1) A Palantir, a per-machine process that stores the locally declared data and forwards all non-local requests for data access to the appropriate machine. It acts as a storage device for data and a looking glass upon the world. (2) Golems: working units that define the data within the Palantir, and that have knowledge of the hardware they control. Applications access the data of a Golem by name (names which resemble Unix path names). The Palantir that runs on the same machine as the application handles the distribution of access requests. This paper focuses on the Palantir concept as a distributed data storage and event-handling device for process control. (author)
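    A minimal sketch of the Palantir idea, with invented names and methods: each per-machine store answers requests for locally declared data and forwards everything else to the owning machine, where the owner is identified by the first component of a Unix-path-like name:

    ```python
    class Palantir:
        """Toy per-machine data store: serves locally declared names,
        forwards the rest. Names and methods are illustrative only."""

        def __init__(self, machine, peers=None):
            self.machine = machine
            self.local = {}           # locally declared data, keyed by name
            self.peers = peers or {}  # machine name -> remote Palantir

        def declare(self, name, value):
            """Register a locally owned datum (done by a Golem)."""
            self.local[name] = value

        def read(self, name):
            # A name like "/mea/magnet1/current" identifies its owning
            # machine by the first path component, much as Unix paths
            # identify mounted filesystems.
            owner = name.split("/")[1]
            if owner == self.machine:
                return self.local[name]
            return self.peers[owner].read(name)  # forward non-local request

    control = Palantir("mea")
    console = Palantir("console", peers={"mea": control})
    control.declare("/mea/magnet1/current", 42.0)
    print(console.read("/mea/magnet1/current"))  # resolved via forwarding
    ```

    The application never knows whether a name resolved locally or remotely; the Palantir on its own machine handles the distribution, which is the transparency the paper is after.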

  5. Migration of the UNIX Application for eFAST CANDU Nuclear Power Plant Analyzer

    International Nuclear Information System (INIS)

    Suh, Jae Seung; Sohn, Dae Seong; Kim, Sang Jae; Jeun, Gyoo Dong

    2006-01-01

    Since the mid 1980s, corporate data centers have been moving away from mainframes running dedicated operating systems to mini-computers, often using one or other of the myriad flavors of UNIX. At the same time, the users' experience of these systems has, in many cases, stayed the same, involving text-based interaction with dumb terminals or a terminal-emulation session on a Personal Computer. More recently, IT managers have questioned this approach, and have been looking at changes in the UNIX marketplace and the increasing expense of being tied in to single-vendor software and hardware solutions. The growth of Linux as a lightweight version of UNIX has fueled this interest, raising the number of organizations that are considering a migration to alternative platforms. The various implementations of the UNIX operating system have served industry well, as witnessed by the very large base both of installed systems and large-scale applications installed on those systems. However, there are increasing signs of dissatisfaction with expensive, often proprietary solutions and a growing sense that perhaps the concept of 'big iron' has had its day in the same way as it has for most of the mainframes of the type portrayed in 1970s science fiction films. One of the most extraordinary and unexpected successes of the Intel PC architecture is the extent to which this basic framework has been extended to encompass very large server and data center environments. Large-scale hosting companies are now offering enterprise level services to multiple client companies at availability levels of over 99.99 percent on what are simply racks of relatively cheap PCs. Technologies such as clustering, Network Load Balancing, and Component Load Balancing enable the personal computer to take on and match the levels of throughput, availability, and reliability of all but the most expensive 'big iron' solutions and the supercomputers.

  6. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (UNIX VERSION)

    Science.gov (United States)

    Donnell, B.

    1994-01-01

COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs, including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. 
Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the

  7. Implementation of the ALEPH detector simulation code using UNIX with on-line graphics display

    International Nuclear Information System (INIS)

    Corden, M.J.; Georgiopoulos, C.H.; Mermikides, M.E.; Streets, J.

    1989-01-01

GALEPH, the detector simulation program of the ALEPH detector, was ported to an ETA10 running under AT&T UNIX System V. The program on the ETA10 can be driven using standard UNIX socket connections between the ETA and a Silicon Graphics Iris-3020 workstation. The simulated data on the ETA are transferred, using the machine-independent binary format EPIO, and displayed on the workstation using a locally developed software package for the visualization of the ALEPH detector. The client (Iris-3020) can also pass parameters to the server (ETA10) and thus interactively change the type of events produced using the same socket connection. (orig.)
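The client/server arrangement this record describes (a workstation client steering a remote simulation server and reading back events over one socket connection) can be sketched in miniature. The parameter string, event format, and function names below are illustrative assumptions, not the ALEPH code:

```python
import socket
import threading

def simulation_server(listener):
    """Server role (the ETA10 side in the record): accept one client,
    read a parameter update, and stream back a simulated event record."""
    conn, _ = listener.accept()
    with conn:
        params = conn.recv(1024).decode().strip()   # e.g. "EVENT_TYPE=dimuon"
        conn.sendall(f"EVENT {params}\n".encode())  # fake event record

def request_events(port, params):
    """Client role (the Iris workstation side): push parameters to the
    server and read the resulting events over the same socket."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall((params + "\n").encode())
        return sock.recv(1024).decode().strip()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))   # port 0: let the OS choose a free port
listener.listen(1)
port = listener.getsockname()[1]

server = threading.Thread(target=simulation_server, args=(listener,))
server.start()
reply = request_events(port, "EVENT_TYPE=dimuon")
server.join()
listener.close()
print(reply)   # EVENT EVENT_TYPE=dimuon
```

The same two-way socket carries both the steering parameters and the event stream, which is the essential point of the record's design.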

  8. DET/MPS - THE GSFC ENERGY BALANCE PROGRAM, DIRECT ENERGY TRANSFER/MULTIMISSION SPACECRAFT MODULAR POWER SYSTEM (UNIX VERSION)

    Science.gov (United States)

    Jagielski, J. M.

    1994-01-01

The DET/MPS programs model and simulate the Direct Energy Transfer and Multimission Spacecraft Modular Power System in order to aid both in design and in analysis of orbital energy balance. Typically, the DET power system has the solar array connected directly to the spacecraft bus, and the central building block of MPS is the Standard Power Regulator Unit. DET/MPS allows a minute-by-minute simulation of the power system's performance as it responds to various orbital parameters, focusing its output on solar array output and battery characteristics. While this package is limited in terms of orbital mechanics, it is sufficient to calculate eclipse and solar array data for circular or non-circular orbits. DET/MPS can be adjusted to run one orbit or a sequence of orbits spanning up to about one week of simulated time. These programs have been used on a variety of Goddard Space Flight Center spacecraft projects. DET/MPS is written in FORTRAN 77 with some VAX-type extensions. Any FORTRAN 77 compiler that includes VAX extensions should be able to compile and run the program with little or no modification. The compiler must at least support free-form (or tab-delineated) source format and 'do while ... end do' control structures. DET/MPS is available for three platforms: GSC-13374, for DEC VAX series computers running VMS, is available in DEC VAX Backup format on a 9-track 1600 BPI tape (standard distribution) or TK50 tape cartridge; GSC-13443, for UNIX-based computers, is available on a .25 inch streaming magnetic tape cartridge in UNIX tar format; and GSC-13444, for Macintosh computers running A/UX with either the NKR FORTRAN or AbSoft MacFORTRAN II compilers, is available on a 3.5 inch 800K Macintosh format diskette. Source code and test data are supplied. The UNIX version of DET requires 90K of main memory for execution. DET/MPS was developed in 1990. A/UX and Macintosh are registered trademarks of Apple Computer, Inc. VMS, DEC VAX and TK50 are trademarks of Digital Equipment Corporation. 
UNIX is a

  9. AUTOMATED CONTROL AND REAL-TIME DATA PROCESSING OF WIRE SCANNER/HALO SCRAPER MEASUREMENTS

    International Nuclear Information System (INIS)

    Day, L.A.; Gilpatrick, J.D.

    2001-01-01

The Low-Energy Demonstration Accelerator (LEDA), assembled and operating at Los Alamos National Laboratory, provides the platform for obtaining measurements of high-power proton beam-halo formation. Control system software and hardware have been integrated and customized to enable the production of real-time beam-halo profiles. The Experimental Physics and Industrial Control System (EPICS) hosted on a VXI platform, Interactive Data Language (IDL) programs hosted on UNIX platforms, and LabVIEW (LV) Virtual Instruments hosted on a PC platform have been integrated and customized to provide real-time, synchronous motor control, data acquisition, and analysis of data acquired through specialized DSP instrumentation. These modules communicate through EPICS Channel Access (CA) communication protocol extensions to control and manage execution flow, ensuring synchronous data acquisition and real-time processing of measurement data. This paper describes the software integration and management scheme implemented to produce these real-time beam profiles.

  10. FASTRAN II - FATIGUE CRACK GROWTH STRUCTURAL ANALYSIS (UNIX VERSION)

    Science.gov (United States)

    Newman, J. C.

    1994-01-01

loads may be either tensile or compressive. Several standardized aircraft flight-load histories, such as TWIST, Mini-TWIST, FALSTAFF, Inverted FALSTAFF, Felix and Gaussian, are included as options. FASTRAN II also includes two other methods that help the user input spectrum load histories: (1) a list of stress points, and (2) a flight-by-flight history of stress points. Examples are provided in the user manual. Developed as a research program, FASTRAN II has successfully predicted crack growth in many metallic materials under various aircraft spectrum loadings. A computer program, DKEFF, which is part of the FASTRAN II package, was also developed to analyze crack growth rate data from laboratory specimens to obtain the effective stress-intensity factor against crack growth rate relations used in FASTRAN II. FASTRAN II is written in standard FORTRAN 77. It has been successfully compiled and implemented on Sun4 series computers running SunOS and on IBM PC compatibles running MS-DOS using the Lahey F77L FORTRAN compiler. Sample input and output data are included with the FASTRAN II package. The UNIX version requires 660K of RAM for execution. The standard distribution medium for the UNIX version (LAR-14865) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. The standard distribution medium for the MS-DOS version (LAR-14944) is a 5.25 inch 360K MS-DOS format diskette. The contents of the diskette are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The program was developed in 1984 and revised in 1992. Sun4 and SunOS are trademarks of Sun Microsystems, Inc. IBM PC is a trademark of International Business Machines Corp. MS-DOS is a trademark of Microsoft, Inc. F77L is a trademark of Lahey Computer Systems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. 
PKWARE and PKUNZIP are trademarks of PKWare

  11. With Unix under the hood, the Mac has a toehold in the geophysical sector

    Energy Technology Data Exchange (ETDEWEB)

    Roche, P.

    2004-05-01

A new lease on life in the geophysical sector is predicted for Apple Computer as a result of the company's decision first to convert to the Unix system, and then to develop a new operating system called OS X (O-S-Ten) which runs under a version of Unix called FreeBSD. While Apple shows no sign of interest in marketing its hardware and software to the oil industry, at least one oil company, Houston-based Seitel Inc., is using Apple products for its high performance computing and technical desktop applications, such as storing its onshore and offshore 3-D seismic data on Apple's Xserve RAID rack storage system. In another application, Virginia Polytechnic Institute built a supercomputer using 1,100 64-bit G5 Power Macs. The result is the third fastest supercomputer in the world, running at a blazing 10.28 teraflops, or 10.28 trillion calculations per second. By all accounts, it is well suited to oilpatch tasks such as seismic data processing and running large-scale simulations such as fluid flow through porous media. Seitel is also interested in Apple's 64-bit G5 computers to run its seismic data processing operations, which require large amounts of computing power. While Apple's hardware and software appear to be well adapted to oilpatch tasks, proliferation of Apple products among oilpatch companies is hindered by the almost complete absence of oilpatch software. Parallel Geoscience Corporation, which had long been interested in creating Mac-based software for geoscience applications, is in the process of filling that gap by refocusing its attention on an OS X version of its Seismic Processing Workshop (SPW). How long it will take to make the conversion will depend on customer demand. 2 figs.

  12. Montecarlo Simulations for a Lep Experiment with Unix Workstation Clusters

    Science.gov (United States)

    Bonesini, M.; Calegari, A.; Rossi, P.; Rossi, V.

Modular systems of RISC CPU based computers have been implemented for large productions of Monte Carlo simulated events for the DELPHI experiment at CERN. From a pilot system based on DEC 5000 CPUs, a full size system based on a CONVEX C3820 UNIX supercomputer and a cluster of HP 735 workstations has been put into operation as a joint effort between INFN Milano and CILEA.

  13. FORTRAN data files transference from VAX/VMS to ALPHA/UNIX; Traspaso de ficheros FORTRAN de datos de VAX/VMS a ALPHA/UNIX

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, E.; Milligen, B. Ph van [CIEMAT (Spain)

    1997-09-01

Several tools have been developed to access the TJ-I and TJ-IU databases, which currently reside in VAX/VMS servers, from the TJ-II Data Acquisition System DEC ALPHA 8400 server. The TJ-I/TJ-IU databases are not homogeneous and contain several types of data files, namely, SADE, CAMAC and FORTRAN unformatted files. The tools presented in this report allow one to transfer CAMAC and those FORTRAN unformatted files defined herein, from a VAX/VMS server, for data manipulation on the ALPHA/Digital UNIX server. (Author)

  14. Access to CAMAC from VxWorks and UNIX in DART

    International Nuclear Information System (INIS)

    Streets, J.; Meadows, J.; Moore, C.

    1995-05-01

    As part of the DART Project the authors have developed a package of software for CAMAC access from UNIX and VxWorks platforms, with support for several hardware interfaces. They report on developments for the CES CBD8210 VME to parallel CAMAC, the Hytec VSD2992 VME to serial CAMAC and Jorway 411S SCSI to parallel and serial CAMAC branch drivers, and give a summary of the timings obtained

  15. Addressing the Digital Divide in Contemporary Biology: Lessons from Teaching UNIX.

    Science.gov (United States)

    Mangul, Serghei; Martin, Lana S; Hoffmann, Alexander; Pellegrini, Matteo; Eskin, Eleazar

    2017-10-01

    Life and medical science researchers increasingly rely on applications that lack a graphical interface. Scientists who are not trained in computer science face an enormous challenge analyzing high-throughput data. We present a training model for use of command-line tools when the learner has little to no prior knowledge of UNIX. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. A real-time data-acquisition and analysis system with distributed UNIX workstations

    International Nuclear Information System (INIS)

    Yamashita, H.; Miyamoto, K.; Maruyama, K.; Hirosawa, H.; Nakayoshi, K.; Emura, T.; Sumi, Y.

    1996-01-01

    A compact data-acquisition system using three RISC/UNIX TM workstations (SUN TM /SPARCstation TM ) with real-time capabilities of monitoring and analysis has been developed for the study of photonuclear reactions with the large-acceptance spectrometer TAGX. One workstation acquires data from memory modules in the front-end electronics (CAMAC and TKO) with a maximum speed of 300 Kbytes/s, where data size times instantaneous rate is 1 Kbyte x 300 Hz. Another workstation, which has real-time capability for run monitoring, gets the data with a buffer manager called NOVA. The third workstation analyzes the data and reconstructs the event. In addition to a general hardware and software description, priority settings and run control by shell scripts are described. This system has recently been used successfully in a two month long experiment. (orig.)

  17. UNIX trademark in high energy physics: What we can learn from the initial experiences at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Butler, J.N.

    1991-03-01

The reasons why Fermilab decided to support the UNIX operating system are reviewed and placed in the context of an overall model for high energy physics data analysis. The strengths and deficiencies of the UNIX environment for high energy physics are discussed. Fermilab's early experience in dealing with an ''open'' multivendor environment, both for computers and for peripherals, is described. The human resources required to fully exploit the opportunities are clearly growing. The possibility of keeping the development and support efforts within reasonable bounds may depend on our ability to collaborate or at least to share information even more effectively than we have in the past. 7 refs., 4 figs., 5 tabs.

  18. UNIX trademark in high energy physics: What we can learn from the initial experiences at Fermilab

    International Nuclear Information System (INIS)

    Butler, J.N.

    1991-03-01

The reasons why Fermilab decided to support the UNIX operating system are reviewed and placed in the context of an overall model for high energy physics data analysis. The strengths and deficiencies of the UNIX environment for high energy physics are discussed. Fermilab's early experience in dealing with an ''open'' multivendor environment, both for computers and for peripherals, is described. The human resources required to fully exploit the opportunities are clearly growing. The possibility of keeping the development and support efforts within reasonable bounds may depend on our ability to collaborate or at least to share information even more effectively than we have in the past. 7 refs., 4 figs., 5 tabs.

  19. The COSY control system, a distributed realtime operating system: First practical experience at the COSY-injector

    International Nuclear Information System (INIS)

    Stephan, M.; Hacker, U.; Henn, K.; Richert, A.; Sobotta, K.; Weinert, A.

    1991-01-01

The COSY control system is hierarchically organized with distributed intelligence and autonomous processing units for dedicated components. Data communication is performed via LAN and over a fieldbus. The host systems are UNIX-based, whereas the field controllers run a modular realtime operating system, RT/OS, which has been developed at KFA. The computer hardware consists of RISC minicomputers, VME computers in the field, and G64 equipment-control modules in geographical expansion of the controller by a fieldbus based on the PDV standard. The man-machine interface consists of X-window based workstations. On top of X-windows, a graphical user interface based on object-oriented methods is used. A distributed realtime database allows access to the accelerator state from every workstation. A special high-level language debugger hosted on the UNIX based workstations and connected over LAN to the VME targets will be used. Together with the software development system for UNIX applications, a uniform view of the system appears to the programmer. First practical experience at the COSY injector is presented

  20. Interfacing ANSYS to user's programs using UNIX shell program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, In Yong; Kim, Beom Shig [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-01-01

It has long been considered impossible to interface ANSYS, a commercial finite element code whose source is not open to the public, with a user's own programs. When an analysis needs to be iterated, the user must wait until the analysis is finished and read the ANSYS results to prepare the input data for every iteration. In this report, direct interfacing techniques between ANSYS and other user programs using UNIX shell programming are proposed. Detailed program listings and an application example are also provided. (Author) 19 refs., 6 figs., 7 tabs.
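The report's approach of wrapping a closed-source batch solver in a script that regenerates its input from the previous output can be sketched as follows. A trivial one-line command stands in for the hypothetical ANSYS batch invocation, and the convergence rule is invented for illustration:

```python
import subprocess
import sys

def run_solver(x):
    """One batch analysis run. A one-line Python command stands in for a
    real batch solver invocation (hypothetical; the report drives ANSYS)."""
    cmd = [sys.executable, "-c", f"print({x} / 2 + 1)"]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return float(out.stdout)   # "read the result" from the solver's output

# Iterate: regenerate the solver's input from its last output until the
# answer stops changing -- the step the report automates with shell scripts
# instead of having the user wait and re-edit the input deck by hand.
x = 8.0
for _ in range(40):
    new_x = run_solver(x)
    if abs(new_x - x) < 1e-6:   # invented convergence criterion
        break
    x = new_x
print(round(x, 3))   # 2.0 (the fixed point of x/2 + 1)
```

The shell-script version in the report plays the same role as the loop above: launch the solver, harvest its output file, rewrite the input file, repeat.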

  1. FORTRAN data files transference from VAX/VMS to ALPHA/UNIX

    International Nuclear Information System (INIS)

    Sanchez, E.; Milligen, B.Ph. van

    1997-01-01

Several tools have been developed to access the TJ-I and TJ-IU databases, which currently reside in VAX/VMS servers, from the TJ-II Data Acquisition System DEC ALPHA 8400 server. The TJ-I/TJ-IU databases are not homogeneous and contain several types of data files, namely, SADE, CAMAC and FORTRAN unformatted files. The tools presented in this report allow one to transfer CAMAC and those FORTRAN unformatted files defined herein, from a VAX/VMS server, for data manipulation on the ALPHA/Digital UNIX server. (Author) 5 refs

  2. Development of GUS for control applications at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Chung, Y.; Barr, D.; Borland, M.; Kirchman, J.; Decker, G.; Kim, K.

    1994-01-01

A script-based interpretive shell GUS (General Purpose Data Acquisition for Unix Shell) has been developed for control applications at the Advanced Photon Source (APS). The primary design objective of GUS is to provide a mechanism for efficient data flow among modularized objects called Data Access Modules (DAMs). GUS consists of four major components: user interface, kernel, built-in command module, and DAMs. It also incorporates the Unix shell to make use of the existing utility programs for file manipulation and data analysis. At this time, DAMs have been written for device access through EPICS (Experimental Physics and Industrial Control System), data I/O for SDDS (Self-Describing Data Set) files, matrix manipulation, graphics display, digital signal processing, and beam position feedback system control. The modular and object-oriented construction of GUS will facilitate the addition of more DAMs with other functions in the future

  3. Development and upgrade of new real time processor in JT-60 data processing system

    International Nuclear Information System (INIS)

    Sakata, Shinya; Koiwa, Motonao; Matsuda, Toshiaki; Aoyagi, Tetsuo

    2000-07-01

At the beginning of JT-60 experiments, the real time processor (RTP) in the data processing system was built mainly around a PANAFACOM U-1500. As this computer aged, however, it gradually became difficult to maintain both its hardware and software. Meanwhile, the performance of UNIX workstations has progressed remarkably: a UNIX workstation offers great flexibility for user application programs, easy hardware maintenance, and ready expansion to peripheral devices. The RTP system has therefore been reconstructed around UNIX workstations. This report describes the overview, the basic design and the recent upgrade of the RTP in the data processing system. (author)

  4. The transition of GTDS to the Unix workstation environment

    Science.gov (United States)

    Carter, D.; Metzinger, R.; Proulx, R.; Cefola, P.

    1995-01-01

    Future Flight Dynamics systems should take advantage of the possibilities provided by current and future generations of low-cost, high performance workstation computing environments with Graphical User Interface. The port of the existing mainframe Flight Dynamics systems to the workstation environment offers an economic approach for combining the tremendous engineering heritage that has been encapsulated in these systems with the advantages of the new computing environments. This paper will describe the successful transition of the Draper Laboratory R&D version of GTDS (Goddard Trajectory Determination System) from the IBM Mainframe to the Unix workstation environment. The approach will be a mix of historical timeline notes, descriptions of the technical problems overcome, and descriptions of associated SQA (software quality assurance) issues.

  5. A Survey of Research in Supervisory Control and Data Acquisition (SCADA)

    Science.gov (United States)

    2014-09-01

    RISC ) platforms running some version of UNIX.4 Around the turn of the millennium work began on applying Web technologies to SCADA systems.5–8 Lately...the 2 trend has been to move from the UNIX/ RISC system to commodity hardware and Microsoft solutions although there is some Linux,1 to move from...Control Center MTU Master Terminal Unit OS operating system PKI Public Key Infrastructure PLC Programmable Logic Controller RISC Reduced Instruction

  6. Exporting Variables in a Hierarchically Distributed Control System

    Energy Technology Data Exchange (ETDEWEB)

    Chamizo Llatas, M

    1995-07-01

    We describe the Remote Variable Access Service (RVAS), a network service developed and used in the distributed control and monitoring system of the TJ-II Heliac, which is under construction at CIEMAT (Madrid, Spain) and devoted to plasma studies in the nuclear fusion field. The architecture of the TJ-II control system consists of one central Sun workstation Sparc 10 and several autonomous subsystems based on VME crates with embedded processors running the OS-9 (V.24) real time operating system. The RVAS service allows state variables in local control processes running in subsystems to be exported to remote processes running in the central control workstation. Thus we extend the concept of exporting of file systems in UNIX machines to variables in processes running in different machines. (Author) 6 refs.

  7. Exporting Variables in a Hierarchically Distributed Control System

    International Nuclear Information System (INIS)

    Diaz Martin; Martinez Laso, L.

    1995-01-01

We describe the Remote Variable Access Service (RVAS), a network service developed and used in the distributed control and monitoring system of the TJ-II Heliac, which is under construction at CIEMAT (Madrid, Spain) and devoted to plasma studies in the nuclear fusion field. The architecture of the TJ-II control system consists of one central Sun workstation Sparc 10 and several autonomous subsystems based on VME crates with embedded processors running the OS-9 (V.24) real time operating system. The RVAS service allows state variables in local control processes running in subsystems to be exported to remote processes running in the central control workstation. Thus we extend the concept of exporting of file systems in UNIX machines to variables in processes running in different machines. (Author)

  8. Exporting Variables in a Hierarchically Distributed Control System

    International Nuclear Information System (INIS)

    Chamizo Llatas, M.

    1995-01-01

    We describe the Remote Variable Access Service (RVAS), a network service developed and used in the distributed control and monitoring system of the TJ-II Heliac, which is under construction at CIEMAT (Madrid, Spain) and devoted to plasma studies in the nuclear fusion field. The architecture of the TJ-II control system consists of one central Sun workstation Sparc 10 and several autonomous subsystems based on VME crates with embedded processors running the OS-9 (V.24) real time operating system. The RVAS service allows state variables in local control processes running in subsystems to be exported to remote processes running in the central control workstation. Thus we extend the concept of exporting of file systems in UNIX machines to variables in processes running in different machines. (Author) 6 refs

  9. SPS/LEP beam transfer equipment control using industrial automation components

    International Nuclear Information System (INIS)

    Aimar, A.; Berard, G.; Bretin, J.L.; Carlier, E.; Dieperink, J.H.; Laffin, M.; Mertens, V.; Verhagen, H.

    1992-01-01

    Several control systems for SPS and LEP beam transfer equipment have to be commissioned in the near future. Tools for fast software development, easy maintenance and modifications, compliance with industrial standards, and independence of specific suppliers are considered to be essential. A large fraction of the systems can be realized using off-the-shelf industrial automation components like industrial I/O systems, programmable logic controllers, or diskless PCs. Specific electronics built up in G-64 can be integrated. Diskless systems running UNIX and X Windows are foreseen as process controllers and local access media. (author)

  10. Flexible human machine interface for process diagnostics

    International Nuclear Information System (INIS)

    Reifman, J.; Graham, G.E.; Wei, T.Y.C.; Brown, K.R.; Chin, R.Y.

    1996-01-01

    A flexible human machine interface to design and display graphical and textual process diagnostic information is presented. The system operates on different computer hardware platforms, including PCs under MS Windows and UNIX Workstations under X-Windows, in a client-server architecture. The interface system is customized for specific process applications in a graphical user interface development environment by overlaying the image of the process piping and instrumentation diagram with display objects that are highlighted in color during diagnostic display. Customization of the system is presented for Commonwealth Edison's Braidwood PWR Chemical and Volume Control System with transients simulated by a full-scale operator-training simulator and diagnosed by a computer-based system

  11. SPAM- SPECTRAL ANALYSIS MANAGER (UNIX VERSION)

    Science.gov (United States)

    Solomon, J. E.

    1994-01-01

The Spectral Analysis Manager (SPAM) was developed to allow easy qualitative analysis of multi-dimensional imaging spectrometer data. Imaging spectrometers provide sufficient spectral sampling to define unique spectral signatures on a per pixel basis. Thus direct material identification becomes possible for geologic studies. SPAM provides a variety of capabilities for carrying out interactive analysis of the massive and complex datasets associated with multispectral remote sensing observations. In addition to normal image processing functions, SPAM provides multiple levels of on-line help, flexible command interpretation, graceful error recovery, and a program structure which can be implemented in a variety of environments. SPAM was designed to be visually oriented and user-friendly, with liberal employment of graphics for rapid and efficient exploratory analysis of imaging spectrometry data. SPAM provides functions to enable arithmetic manipulations of the data, such as normalization, linear mixing, band ratio discrimination, and low-pass filtering. SPAM can be used to examine the spectra of an individual pixel or the average spectra over a number of pixels. SPAM also supports image segmentation, fast spectral signature matching, spectral library usage, mixture analysis, and feature extraction. High speed spectral signature matching is performed by using a binary spectral encoding algorithm to separate and identify mineral components present in the scene. The same binary encoding allows automatic spectral clustering. Spectral data may be entered from a digitizing tablet, stored in a user library, compared to the master library containing mineral standards, and then displayed as a time-sequence spectral movie. The output plots, histograms, and stretched histograms produced by SPAM can be sent to a lineprinter, stored as separate RGB disk files, or sent to a Quick Color Recorder. 
SPAM is written in C for interactive execution and is available for two different
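Binary spectral encoding of the kind the SPAM record mentions is commonly implemented by thresholding each band at the spectrum's mean and comparing the resulting bit patterns by Hamming distance. The sketch below follows that common scheme with an invented 4-band mineral library; it is an illustration of the general technique, not SPAM's actual algorithm:

```python
def binary_encode(spectrum):
    """Encode a spectrum as bits: 1 where a band exceeds the spectrum's
    mean, 0 otherwise (one simple binary-encoding scheme)."""
    mean = sum(spectrum) / len(spectrum)
    return tuple(1 if band > mean else 0 for band in spectrum)

def hamming(a, b):
    """Count differing bits; a small distance means similar spectral shape."""
    return sum(x != y for x, y in zip(a, b))

def best_match(pixel, library):
    """Return the library entry whose encoded signature is closest."""
    code = binary_encode(pixel)
    return min(library, key=lambda name: hamming(code, library[name]))

# Invented 4-band library of pre-encoded mineral signatures
library = {
    "kaolinite": binary_encode([0.9, 0.2, 0.8, 0.1]),
    "alunite":   binary_encode([0.1, 0.9, 0.2, 0.8]),
}
print(best_match([0.85, 0.25, 0.75, 0.15], library))   # kaolinite
```

Because the codes are tiny bit vectors, matching and clustering reduce to fast integer comparisons, which is what makes this kind of encoding suitable for "high speed" signature matching on large scenes.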

  12. DB2 9 for Linux, Unix, and Windows database administration certification study guide

    CERN Document Server

    Sanders, Roger E

    2007-01-01

In DB2 9 for Linux, UNIX, and Windows Database Administration Certification Study Guide, Roger E. Sanders, one of the world's leading DB2 authors and an active participant in the development of IBM's DB2 certification exams, covers everything a reader needs to know to pass the DB2 9 UDB DBA Certification Test (731). This comprehensive study guide steps you through all of the topics that are covered on the test, including server management, data placement, database access, analyzing DB2 activity, DB2 utilities, high availability, security, and much more. Each chapter contains an extensive set of p

  13. The RTC - a software support system for the control of the CERN SPS

    International Nuclear Information System (INIS)

    Herr, W.; Lauckner, R.; Morpurga, G.

    1989-01-01

The RTC (Run Time Coordinator) is a software support system designed for the SPS control system to provide a runtime environment for application software. It coordinates the execution of individual programs or processes and supervises process control, i.e. process synchronization, interprocess communication, data transfer and operator I/O. This supervision includes the control of processes distributed on a UNIX based network. A standard language-independent data interface is part of the system. The system includes tools for data presentation, error logging and contention resolution. This strategy of separating system dependent features from the body of the application programs leads to high flexibility and simplifies software development. The basic philosophy of the RTC is discussed and its implementation is described. 7 refs

  14. Interface and integration of a silicon graphics UNIX computer with the Encore based SCE SONGS 2/3 simulator

    International Nuclear Information System (INIS)

    Olmos, J.; Lio, P.; Chan, K.S.

    1991-01-01

    The SONGS Unit 2/3 simulator was originally implemented in 1983 on a Master/Slave 32/7780 Encore MPX platform by the Singer-Link Company. In 1986, a 32/9780 MPX Encore computer was incorporated into the simulator computer system to provide the additional CPU processing needed to install the PACE plant monitoring system and to enable the upgrade of the NSSS Simulation to the advanced RETACT/STK models. Since the spring of 1990, the SCE SONGS Nuclear Training Division simulator technical staff, in cooperation with Micro Simulation Inc., has undertaken a project to integrate a Silicon Graphics UNIX based computer with the Encore MPX SONGS 2/3 simulation computer system. In this paper the authors review the objectives, advantages to be gained, software and hardware approaches utilized, and the results so far achieved by the authors' project

  15. STAR- A SIMPLE TOOL FOR AUTOMATED REASONING SUPPORTING HYBRID APPLICATIONS OF ARTIFICIAL INTELLIGENCE (UNIX VERSION)

    Science.gov (United States)

    Borchardt, G. C.

    1994-01-01

    The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input

  16. A UNIX-like microkernel operating system for an FPGA processor

    OpenAIRE

    Liepkalns, Ansis

    2012-01-01

    In solutions that use field-programmable gate array (FPGA) processors, software availability is important in order to reduce the time needed to obtain the final product. UNIX-like operating systems have gained broad software support, and their combination with FPGA processors can provide the desired development speed. To satisfy quality requirements, the use of a microkernel operating system is proposed. The thesis examines the construction of a system-on-chip for operation with the "Minix 3" microkernel...

  17. CEBAF control system

    International Nuclear Information System (INIS)

    Bork, R.; Grubb, C.; Lahti, G.; Navarro, E.; Sage, J.

    1989-01-01

    A logic-based computer control system is in development at CEBAF. This Unix/C language software package, running on a distributed, hierarchical system of workstation and supervisory minicomputers, interfaces to hardware via CAMAC. Software aspects to be covered are ladder logic, interactive database generation, networking, and graphic user interfaces. 1 fig

  18. Visual Motion Perception and Visual Attentive Processes.

    Science.gov (United States)

    1988-04-01

    88-0551 Visual Motion Perception and Visual Attentive Processes. George Sperling, New York University. Grant AFOSR 85-0364... Sperling, HIPS: A Unix-based image processing system. Computer Vision, Graphics, and Image Processing, 1984, 25, 331-347. HIPS is the Human Information Processing Laboratory's Image Processing System. 1985: van Santen, Jan P. H., and George Sperling. Elaborated Reichardt detectors. Journal of the Optical

  19. INVITATION TO PERFORM Y2K TESTING UNDER UNIX

    CERN Multimedia

    CERN Y2K Co-ordinator

    1999-01-01

    Introduction: A special AFS cell 'y2k.cern.ch' has been established to allow service managers and users to test y2k compliance. In addition to AFS, the cluster consists of machines representing all the Unix flavours in use at CERN (AIX, DUNIX, HP-UX, IRIX, LINUX, and SOLARIS). More information can be obtained from the page: http://wwwinfo.cern.ch/pdp/bis/y2k/y2kplus.html Testing schedule: The cluster will be set to 25 December 1999 on fixed days and then left running for three weeks. This gives people one week to prepare test programs in 1999 and two weeks to check the consequences of passing into year 2000. These fixed dates are set as follows: 19 May 1999, date set to 25/12/99 (year 2000 starts on 26 May); 9 June 1999, date set to 25/12/99 (year 2000 starts on 16 June); 30 June 1999, date set to 25/12/99 (year 2000 starts on 7 July). If more than these three sessions are needed, an announcement will be made later. Registration: The following Web page should be used for r...

  20. Feasibility study of BES data off-line processing and D/Ds physics analysis on a PC/Linux platform

    International Nuclear Information System (INIS)

    Rong Gang; He Kanglin; Heng Yuekun; Zhang Chun; Liu Huaimin; Cheng Baosen; Yan Wuguang; Mai Jimao; Zhao Haiwen

    2000-01-01

    The authors report a feasibility study of BES data off-line processing (BES data off-line reconstruction and Monte Carlo simulation) and D/Ds physics analysis on a PC/Linux platform. The authors compared the results obtained from PC/Linux with those from an HP/UNIX workstation. It shows that the PC/Linux platform can perform BES data off-line analysis as well as the HP/UNIX workstation, and is more powerful and economical

  1. Modeling in control of the Advanced Light Source

    International Nuclear Information System (INIS)

    Bengtsson, J.; Forest, E.; Nishimura, H.; Schachinger, L.

    1991-05-01

    A software system for control of accelerator physics parameters of the Advanced Light Source (ALS) is being designed and implemented at LBL. Some of the parameters we wish to control are tunes, chromaticities, and closed orbit distortions as well as linear lattice distortions and, possibly, amplitude- and momentum-dependent tune shifts. In all our applications, the goal is to allow the user to adjust physics parameters of the machine, instead of turning knobs that control magnets directly. This control will take place via a highly graphical user interface, with both a model appropriate to the application and any correction algorithm running alongside as separate processes. Many of these applications will run on a Unix workstation, separate from the controls system, but communicating with the hardware database via Remote Procedure Calls (RPCs)

  2. Multilingual Speech and Language Processing

    Science.gov (United States)

    2003-04-01

    client software handles the user end of the transaction. Historically, four clients were provided: e-mail, web, FrameMaker, and command line. By...command-line client and an API. The API allows integration of CyberTrans into a number of processes including word processing packages (FrameMaker...preservation and logging, and others. The available clients remain e-mail, Web, and FrameMaker. Platforms include both Unix and PC for clients, with

  3. New Control Software of the 188cm Telescope of Okayama Astrophysical Observatory

    Science.gov (United States)

    Yoshida, Michitoshi; Shimizu, Yasuhiro; Watanabe, Etsuji; Yanagisawa, Kenshi; Uraguchi, Fumihiro

    2002-12-01

    We developed the telescope control software for the 188cm telescope of Okayama Astrophysical Observatory (OAO) based on Java technology. Basically, the software consists of two processes running on separate Java virtual machines: one is the "Command Dispatcher (CD)" and the other is the "User Interface (UI)". Of the two, CD is the main engine/server of the telescope control, whereas UI is just a client. The "standard" UI we provide is a graphical user interface written in Java/Swing. CD communicates with the local control units (LCUs) of the telescope through RS232C. CD is a Java multi-thread program, in which a number of threads run simultaneously. The threads running in CD are as follows: UNIX socket servers for external communications, a socket opener for on-demand open/close of a socket port, a socket client manager, an auto-guider and dome watcher, an internal command dispatcher, a status manager, a status collector, an RS232C writer and reader, a logger, and control units. The above "control units" are software models ("objects") of the telescope system. We introduced four control units, "Telescope", "Dome", "Weather-Monitor", and "Pointing", for telescope control. The first three units are simple software models of real-world devices. The last one, "Pointing", is a unit which abstracts the pointing procedure of the telescope. CD and UI communicate with each other using a UNIX socket. The command protocol of this communication is fairly simple, and observation instruments, the auto guider, or an additional UI for remote observation can also communicate with CD through a socket using this protocol. CD opens and closes socket ports for communication on demand according to the requests of client processes (UI, instruments, etc.), so that any client can be connected to CD dynamically.
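The CD/UI exchange described above, a simple line-based request/reply protocol over a socket with ports opened on demand, can be sketched as follows. The command names and replies here are invented for illustration; the actual OAO protocol is not given in the abstract:

```python
import socket
import threading

# Hypothetical command set standing in for the (unspecified) OAO protocol.
def handle_command(line):
    cmd, *args = line.split()
    if cmd == "STATUS":
        return "OK telescope idle"
    if cmd == "POINT" and len(args) == 2:
        return f"OK pointing ra={args[0]} dec={args[1]}"
    return "ERR unknown command"

def serve_one_client(server):
    conn, _ = server.accept()
    with conn, conn.makefile("rw") as f:
        for line in f:                      # one reply line per command line
            f.write(handle_command(line.strip()) + "\n")
            f.flush()

def protocol_demo():
    server = socket.socket()
    server.bind(("127.0.0.1", 0))           # port chosen on demand
    server.listen(1)
    port = server.getsockname()[1]
    t = threading.Thread(target=serve_one_client, args=(server,))
    t.start()
    with socket.create_connection(("127.0.0.1", port)) as client:
        with client.makefile("rw") as f:
            f.write("POINT 05h35m -05d23m\nSTATUS\n")
            f.flush()
            replies = [f.readline().strip() for _ in range(2)]
    t.join()
    server.close()
    return replies
```

Because the protocol is plain newline-delimited text, any client process (instrument, auto guider, remote UI) can speak it without linking against the server's code, which is the design point the abstract emphasizes.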

  4. The CEBAF control system

    International Nuclear Information System (INIS)

    Watson, W.A. III.

    1995-01-01

    CEBAF has recently upgraded its accelerator control system to use EPICS, a control system toolkit being developed by a collaboration among laboratories in the US and Europe. The migration to EPICS has taken place during a year of intense commissioning activity, with new and old control systems operating concurrently. Existing CAMAC hardware was preserved by adding a CAMAC serial highway link to VME; newer hardware developments are now primarily in VME. Software is distributed among three tiers of computers: first, workstations and X terminals for operator interfaces and high level applications; second, VME single board computers for distributed access to hardware and for local control processing; third, embedded processors where needed for faster closed loop operation. This system has demonstrated the ability to scale EPICS to controlling thousands of devices, including hundreds of embedded processors, with control distributed among dozens of VME processors executing more than 125,000 EPICS database records. To deal with the large size of the control system, CEBAF has integrated an object oriented database, providing data management capabilities for both low level I/O and high level machine modeling. A new callable interface which is control system independent permits access to live EPICS data, data in other Unix processes, and data contained in the object oriented database

  5. Doing It Right: 366 answers to computing questions you didn't know you had

    Energy Technology Data Exchange (ETDEWEB)

    Herring, Stuart Davis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-19

    Slides include information on history: version control, version control: branches, version control: Git, releases, requirements, readability, readability control flow, global variables, architecture, architecture redundancy, processes, input/output, unix, etcetera.

  6. CHIMERA II - A real-time multiprocessing environment for sensor-based robot control

    Science.gov (United States)

    Stewart, David B.; Schmitz, Donald E.; Khosla, Pradeep K.

    1989-01-01

    A multiprocessing environment for a wide variety of sensor-based robot systems, providing the flexibility, performance, and UNIX-compatible interface needed for fast development of real-time code, is addressed. The requirements imposed on the design of a programming environment for sensor-based robotic control are outlined. The details of the current hardware configuration are presented, along with the details of the CHIMERA II software. Emphasis is placed on the kernel, low-level interboard communication, the user interface, the extended file system, user-definable and dynamically selectable real-time schedulers, remote process synchronization, and generalized interprocess communication. A possible implementation of a hierarchical control model, the NASA/NBS standard reference model for telerobot control systems, is demonstrated.

  7. New control system for the KEK-linac

    International Nuclear Information System (INIS)

    Kamikubota, N.; Furukawa, K.; Nakahara, K.; Abe, I.; Akimoto, H.

    1993-01-01

    A new control system for the KEK-Linac has been developed. Unix-based workstations and VME-bus computers are introduced. They are inter-connected by an Ethernet, which is used as a high-speed data-exchange network. The new system will start operation after October 1993. (author)

  8. Fermilab DART run control

    International Nuclear Information System (INIS)

    Oleynik, G.; Engelfried, J.; Mengel, L.

    1996-01-01

    DART is the high speed, Unix based data acquisition system being developed by Fermilab in collaboration with seven High Energy Physics Experiments. This paper describes DART run control, which has been developed over the past year and is a flexible, distributed, extensible system for the control and monitoring of the data acquisition systems. The authors discuss the unique and interesting concepts of the run control and some of the experiences in developing it. They also give a brief update and status of the whole DART system

  9. Fermilab DART run control

    International Nuclear Information System (INIS)

    Oleynik, G.; Engelfried, J.; Mengel, L.

    1995-05-01

    DART is the high speed, Unix based data acquisition system being developed by Fermilab in collaboration with seven High Energy Physics Experiments. This paper describes DART run control, which has been developed over the past year and is a flexible, distributed, extensible system for the control and monitoring of the data acquisition systems. We discuss the unique and interesting concepts of the run control and some of our experiences in developing it. We also give a brief update and status of the whole DART system

  10. DB2 9 for Linux, Unix, and Windows database administration upgrade certification study guide

    CERN Document Server

    Sanders, Roger E

    2007-01-01

    Written by one of the world's leading DB2 authors who is an active participant in the development of the DB2 certification exams, this resource covers everything a database administrator needs to know to pass the DB2 9 for Linux, UNIX, and Windows Database Administration Certification Upgrade exam (Exam 736). This comprehensive study guide discusses all exam topics: server management, data placement, XML concepts, analyzing activity, high availability, database security, and much more. Each chapter contains an extensive set of practice questions along with carefully explained answers. Both information-technology professionals who have experience as database administrators and have a current DBA certification on version 8 of DB2 and individuals who would like to learn the new features of DB2 9 will benefit from the information in this reference guide.

  11. Control system reliability at Jefferson Lab

    International Nuclear Information System (INIS)

    White, K.S.; Areti, H.; Garza, O.

    1997-01-01

    At Thomas Jefferson National Accelerator Facility (Jefferson Lab), the availability of the control system is crucial to the operation of the accelerator for experimental programs. Jefferson Lab's control system uses 68040-based microprocessors running VxWorks, Unix workstations, and a variety of VME, CAMAC, GPIB, and serial devices. The software consists of control system toolkit software, commercial packages, and over 200 custom and generic applications, some of which are highly complex. The challenge is to keep this highly diverse and still growing system, with over 162,000 control points, operating reliably, while managing changes and upgrades to both the hardware and software. Downtime attributable to the control system includes the time to troubleshoot and repair problems and the time to restore the machine to operation for the scheduled program. This paper describes the availability of the control system during the last year, the heaviest contributors to downtime, and the response to problems. Strategies for improving the robustness of the control system are detailed and include changes in hardware, software, procedures, and processes. The improvements range from routine preventive hardware maintenance to improving the ability to detect, predict, and prevent problems. This paper also describes the software tools used to assist in control system troubleshooting, maintenance, and failure recovery processes

  12. Customizing graphical user interface technology for spacecraft control centers

    Science.gov (United States)

    Beach, Edward; Giancola, Peter; Gibson, Steven; Mahmot, Ronald

    1993-01-01

    The Transportable Payload Operations Control Center (TPOCC) project is applying the latest in graphical user interface technology to the spacecraft control center environment. This project of the Mission Operations Division's (MOD) Control Center Systems Branch (CCSB) at NASA Goddard Space Flight Center (GSFC) has developed an architecture for control centers which makes use of a distributed processing approach and the latest in Unix workstation technology. The TPOCC project is committed to following industry standards and using commercial off-the-shelf (COTS) hardware and software components wherever possible to reduce development costs and to improve operational support. TPOCC's most successful use of commercial software products and standards has been in the development of its graphical user interface. This paper describes TPOCC's successful use and customization of four separate layers of commercial software products to create a flexible and powerful user interface that is uniquely suited to spacecraft monitoring and control.

  13. Control system for the Spanish Stellarator TJ-II

    International Nuclear Information System (INIS)

    Pacios, L.; Blaumoser, M.; Pena, A. de la; Carrasco, R.; Labrador, I.; Lapayese, F.; Diaz, J.C.; Laso, L.M.

    1995-01-01

    We describe the distributed control and monitoring system for the Spanish Stellarator TJ-II, which is under construction at CIEMAT in Madrid. It consists of one central UNIX workstation and several autonomous subsystems based on VME crates with embedded processors under OS-9 real-time operating system and PLCs. The system integrates the machine and discharge control. An operator can perform the control and plasma discharge by means of a user-friendly graphic interface. (orig.)

  14. Development and testing of a diagnostic system for intelligent distributed control at EBR-2

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ruhl, D.W.; Klevans, E.H.; Robinson, G.E.

    1990-01-01

    A diagnostic system is under development for demonstration of Intelligent Distributed Control at the Experimental Breeder Reactor (EBR-II). In the first phase of the project, a diagnostic system is being developed for the EBR-II steam plant based on the DISYS expert systems approach. Current testing uses recorded plant data and data from simulated plant faults. The dynamical simulation of the EBR-II steam plant uses the Babcock and Wilcox (B&W) Modular Modeling System (MMS). At EBR-II the diagnostic system operates on a UNIX workstation and receives live plant data from the plant Data Acquisition System (DAS). Future work will seek implementation of the steam plant diagnostic in a distributed manner using UNIX-based computers and a Bailey microprocessor-based control system. 10 refs., 6 figs

  15. Process control device

    International Nuclear Information System (INIS)

    Hayashi, Toshifumi; Kobayashi, Hiroshi.

    1994-01-01

    A process control device comprises a memory device for memorizing a plant operation target, a plant state, or the states of mutually related equipment as control data; a read-only memory device for storing programs; a plant instrumentation control device or other process control devices; an input/output device for performing input/output with an operator; and a processing device which conducts processing in accordance with the program and sends a control demand or a display demand to the input/output device. The program reads out control data relative to a predetermined operation target, compares and verifies them against actual values, and reads out the control data that constitute a prerequisite condition for operation (and, if necessary, further prerequisite conditions), thereby automatically controlling the plant or requesting or displaying input. Prerequisite conditions for the operation target can be examined successively in accordance with the program without constructing complicated logic diagrams and AND/OR graphs. (N.H.)

  16. Aplicación de RT-Linux en el control de motores de pasos. Parte II; Appication of RT-Linux in the Control of Steps Motors. Part II

    Directory of Open Access Journals (Sweden)

    Ernesto Duany Renté

    2011-02-01

    This work complements the one presented earlier, "Application of RT-Linux in the Control of Steps Motors. First Part", so that the acquisition and control tasks can be fully related in order to make the whole control system as accurate as possible. The techniques employed are real-time techniques that take advantage of the possibilities of the RT-Linux microkernel and the free software distributed with Unix/Linux operating systems. The signals are obtained by means of an AD converter and are shown on screen using Gnuplot.

  17. Control system technology for particle accelerators

    International Nuclear Information System (INIS)

    Tsumura, Yoshihiko; Matsuo, Keiichi; Maruyama, Takayuki.

    1995-01-01

    Control systems for particle accelerators are being designed around open-architecture systems, which allows easy upgrading, high-speed networks and high-speed processors. Mitsubishi Electric is applying realtime Unix operating systems, fiber-distributed data interface (FDDI), shared memory networks and remote I/O systems to achieve these objectives. In the area of vacuum control systems, which requires large-scale sequence control, the corporation is employing general-purpose programmable logic controllers (PLCs) to achieve cost-effective design. Software for these applications is designed around a library of application program interfaces (APIs) that give users direct access to key system functions. (author)

  18. The high level programmer and user interface of the NSLS control system

    International Nuclear Information System (INIS)

    Tang, Y.N.; Smith, J.D.; Sathe, S.

    1993-01-01

    This paper presents the major components of the high-level software in the NSLS upgraded control system. Both programmer and user interfaces are discussed. The use of high-speed workstations, fast network communications, the UNIX system, X-window, and Motif has greatly changed and improved these interfaces

  19. A study on the fusion reactor - Design study for tokamak control system

    Energy Technology Data Exchange (ETDEWEB)

    Ko, In Soo; Nam, Kung Won; Cho, Moo Hyun; Kim, Ji Hwa; Kim, Jae Myung; Kim, Sung Chul; Lee, Ki Sun [Pohang University of Science and Technology, Pohang (Korea, Republic of)

    1996-09-01

    Based on the experience accumulated during the construction of the Pohang Light Source, we strongly suggest that the control system have a hierarchical structure linked by multiple Ethernets. UNIX-based workstations are recommended in the top layer, where KSTAR operators are faced with several control/monitoring windows. As supervisors of the individual subsystems, real-time computers such as VME systems are recommended. The choice of real-time operating system is up to the actual developer; however, we strongly suggest Lynx, a UNIX-based real-time OS that has been gaining attention since its introduction to the market. Graphic interfaces for operators will be developed by adopting development toolkits such as RTWorks and EPICS. We also recommend EPICS because it was introduced by Los Alamos National Laboratory, USA, and several laboratories around the world are now adopting it for their control systems. This report also points out several key issues relating to system development, among them the cabling plan, the signal list, and noise suppression. It is also suggested that an early decision on the communication protocol will help to integrate the control system in the future. 11 refs., 5 tabs., 4 figs. (author)

  20. RHIC control system

    Energy Technology Data Exchange (ETDEWEB)

    Barton, D.S. E-mail: dsbarton@bnl.gov; Binello, S.; Buxton, W.; Clifford, T.; D' Ottavio, T.; Hartmann, H.; Hoff, L.T.; Katz, R.; Kennell, S.; Kerner, T.; Laster, J.; Lee, R.C.; Marusic, A.; Michnoff, R.; Morris, J.; Oerter, B.R.; Olsen, R.; Piacentino, J.; Skelly, J.F

    2003-03-01

    The RHIC control system architecture is hierarchical and consists of two physical layers with a fiber-optic network connection. The Front-End Level systems consist of VME chassis with processors running a real-time operating system and both VME I/O modules and remote bus interfaces. Accelerator device software interfaces are implemented as objects in C++. The network implementation uses high speed, switched Ethernet technology. Specialized hardware modules were built for waveform control of power supplies, multiplexed signal acquisition, and timing services. The Console Level systems are Unix workstations. A strong emphasis has been given to developing highly reusable, standard software tools for use in building physics and diagnostic application software.

  1. RHIC control system

    International Nuclear Information System (INIS)

    Barton, D.S.; Binello, S.; Buxton, W.; Clifford, T.; D'Ottavio, T.; Hartmann, H.; Hoff, L.T.; Katz, R.; Kennell, S.; Kerner, T.; Laster, J.; Lee, R.C.; Marusic, A.; Michnoff, R.; Morris, J.; Oerter, B.R.; Olsen, R.; Piacentino, J.; Skelly, J.F.

    2003-01-01

    The RHIC control system architecture is hierarchical and consists of two physical layers with a fiber-optic network connection. The Front-End Level systems consist of VME chassis with processors running a real-time operating system and both VME I/O modules and remote bus interfaces. Accelerator device software interfaces are implemented as objects in C++. The network implementation uses high speed, switched Ethernet technology. Specialized hardware modules were built for waveform control of power supplies, multiplexed signal acquisition, and timing services. The Console Level systems are Unix workstations. A strong emphasis has been given to developing highly reusable, standard software tools for use in building physics and diagnostic application software

  2. Take control of permissions in Leopard

    CERN Document Server

    Tanaka, Brian

    2009-01-01

    Permissions problems got you down? Turn to Unix expert Brian Tanaka's unique guide to the permissions in Mac OS X 10.5 Leopard that control access to your files, folders, and disks. You'll learn how to keep files private, when to set Ignore Permissions, what happens when you repair permissions, how to delete stuck files, and the best ways to solve permissions-related problems. Advanced concepts include the sticky bit, Leopard's more-important access control lists, bit masks, and symbolic versus absolute ways to set permissions. The book covers how to take control of permissions via the Finder
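The absolute-versus-symbolic distinction the book covers can be illustrated with standard POSIX calls. This is a minimal sketch, not taken from the book; the helper name and demo file are invented for the example:

```python
import os
import stat
import tempfile

def symbolic(mode):
    """Render a permission bit mask in the rwx form that `ls -l` prints."""
    bits = [(stat.S_IRUSR, "r"), (stat.S_IWUSR, "w"), (stat.S_IXUSR, "x"),
            (stat.S_IRGRP, "r"), (stat.S_IWGRP, "w"), (stat.S_IXGRP, "x"),
            (stat.S_IROTH, "r"), (stat.S_IWOTH, "w"), (stat.S_IXOTH, "x")]
    return "".join(ch if mode & bit else "-" for bit, ch in bits)

def chmod_demo():
    fd, path = tempfile.mkstemp()
    os.close(fd)
    os.chmod(path, 0o640)                 # absolute form: an octal bit mask
    mode = stat.S_IMODE(os.stat(path).st_mode)
    result = (oct(mode), symbolic(mode))  # the same mode, both notations
    os.remove(path)
    return result
```

The same nine bits underlie both notations: 0o640 and "rw-r-----" describe one permission set, which is why repairing or reasoning about permissions is ultimately bit-mask arithmetic.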

  3. New software of the control and data acquisition system for the Nuclotron internal target station

    International Nuclear Information System (INIS)

    Isupov, A.Yu.

    2012-01-01

    The control and data acquisition system for the Internal Target Station (ITS) of the Nuclotron (LHEP, JINR) is implemented. The new software is based on the ngdp framework under the Unix-like operating system FreeBSD to allow easy network distribution of the on-line data collected from ITS, as well as the internal target remote control

  4. Ground test accelerator control system software

    International Nuclear Information System (INIS)

    Burczyk, L.; Dalesio, R.; Dingler, R.; Hill, J.; Howell, J.A.; Kerstiens, D.; King, R.; Kozubal, A.; Little, C.; Martz, V.; Rothrock, R.; Sutton, J.

    1988-01-01

    This paper reports on the GTA control system, which provides an environment in which the automation of a state-of-the-art accelerator can be developed. It makes use of commercially available computers, workstations, computer networks, industrial I/O equipment, and software. This system has built-in supervisory control (like most accelerator control systems), tools to support continuous control (like the process control industry), and sequential control for automatic start-up and fault recovery (like few other accelerator control systems). Several software tools support these levels of control: a real-time operating system (VxWorks) with a real-time kernel (VRTX), a configuration database, a sequencer, and a graphics editor. VxWorks supports multitasking, fast context-switching, and preemptive scheduling. VxWorks/VRTX is a network-based development environment specifically designed to work in partnership with the UNIX operating system. A database provides the interface to the accelerator components. It consists of a run-time library and a database configuration and editing tool. A sequencer initiates and controls the operation of all sequence programs (expressed as state programs). A graphics editor gives the user the ability to create color graphic displays showing the state of the machine in either text or graphics form
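A sequencer that runs "state programs" of the kind described above can be sketched as a transition table. This toy example is not the GTA software; the state and event names are invented to show automatic fault recovery during start-up:

```python
# A sequence expressed as a state program: (state, event) -> next state.
FAULT = "fault"

TRANSITIONS = {
    ("idle", "start"): "warming_up",
    ("warming_up", "ready"): "running",
    ("warming_up", FAULT): "recovering",   # automatic fault recovery path
    ("running", FAULT): "recovering",
    ("recovering", "cleared"): "idle",
}

def run_sequence(events, state="idle"):
    """Drive the state program with a stream of events; return the history."""
    history = [state]
    for ev in events:
        state = TRANSITIONS.get((state, ev), state)  # ignore undefined events
        history.append(state)
    return history
```

Expressing the sequence as data rather than nested conditionals is what lets a sequencer initiate, inspect, and restart sequence programs generically.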

  5. Multi-microprocessor control of the main ring magnet power supply of the 12 GeV KEK proton synchrotron

    International Nuclear Information System (INIS)

    Sueno, T.; Mikawa, K.; Toda, M.; Toyama, T.; Sato, H.; Matsumoto, S.

    1992-01-01

    A general description of the computer control system of the KEK 12 GeV PS main ring magnet power supply is given, including its peripheral devices. The system consists of the main HIDIC-V90/25 CPU and of the input and output controllers HISEC-04M. The main CPU, supervised by UNIX, provides the man-machine interfacing and implements the repetitive control algorithm to correct for any magnet current deviation from reference. Two sub-CPU's are linked by a LAN and supported by a real time multi-task monitor. The output process controller distributes the control patterns to 16-bit DAC's, at 1.67 ms clock period in synchronism with the 3-phase ac line systems. The input controller logs the magnet current and voltage, via 16-bit ADC's at the same clock rate. (author)

  6. Robot welding process control

    Science.gov (United States)

    Romine, Peter L.

    1991-01-01

    This final report documents the development and installation of software and hardware for Robotic Welding Process Control. Primary emphasis is on serial communications between the CYRO 750 robotic welder, Heurikon minicomputer running Hunter & Ready VRTX, and an IBM PC/AT, for offline programming and control and closed-loop welding control. The requirements for completion of the implementation of the Rocketdyne weld tracking control are discussed. The procedure for downloading programs from the Intergraph, over the network, is discussed. Conclusions are made on the results of this task, and recommendations are made for efficient implementation of communications, weld process control development, and advanced process control procedures using the Heurikon.

  7. Opportunities and challenges for process control in process intensification

    NARCIS (Netherlands)

    Nikacevic, N.M.; Huesman, A.E.M.; Hof, Van den P.M.J.; Stankiewicz, A.

    2012-01-01

    This is a review and position article discussing the role and prospective for process control in process intensification. Firstly, the article outlines the classical role of control in process systems, presenting an overview of control systems’ development, from basic PID control to the advanced

  8. The Nuclotron internal target control and data acquisition system

    Energy Technology Data Exchange (ETDEWEB)

    Isupov, A.Yu., E-mail: isupov@moonhe.jinr.ru [Joint Institute for Nuclear Research, Dubna 141980 (Russian Federation); Krasnov, V.A.; Ladygin, V.P.; Piyadin, S.M.; Reznikov, S.G. [Joint Institute for Nuclear Research, Dubna 141980 (Russian Federation)

    2013-01-11

    The new control system of the Nuclotron (JINR, Dubna) internal target is described in both hardware and software aspects. The CAMAC hardware is based on the use of the standard CAMAC modules developed and manufactured at JINR. The internal target control and data acquisition (IntTarg CDAQ) system software is implemented using the ngdp framework under the Unix-like operating system (OS) FreeBSD to allow easy network distribution of the online data collected from internal target and accompanying detectors, as well as the internal target remote control.

  9. Resonance – Journal of Science Education | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences, Resonance – Journal of Science Education, Volume 17, Issue 8. UNIX: Genesis and Design Features. Keywords: operating system design; UNIX shell; UNIX kernel; file systems; process management; interprocess communication; versioning; documentation and high-level language C; tools and utilities.

  10. Process control using modern systems of information processing

    International Nuclear Information System (INIS)

    Baldeweg, F.

    1984-01-01

    Modern digital automation techniques allow the application of demanding types of process control. These types of process control are characterized by their belonging to higher levels in a multilevel model. Functional and technical aspects of the performance of digital automation plants are presented and explained. A modern automation system is described considering special procedures of process control (e.g. real time diagnosis)

  11. Process control program development

    International Nuclear Information System (INIS)

    Dameron, H.J.

    1985-01-01

    This paper details the development and implementation of a ''Process Control Program'' at Duke Power's three nuclear stations - Oconee, McGuire, and Catawba. Each station is required by Technical Specification to have a ''Process Control Program'' (PCP) to control all dewatering and/or solidification activities for radioactive wastes

  12. Chemical process control using Matlab

    International Nuclear Information System (INIS)

    Kang, Sin Chun; Kim, Raeh Yeon; Kim, Yang Su; Oh, Min; Yeo, Yeong Gu; Jung, Yeon Su

    2001-07-01

    This book is about chemical process control, covering: the basics of process control (concepts, functions, and composition of control systems); Laplace transforms and linearization; modeling of chemical processes; transfer functions and block diagrams; first-order process dynamics; second-order process dynamics; dynamics of combined processes; feedback control and the components of a control system; dynamics of the feedback control loop; stability of closed-loop control structures; representation of processes; modification and composition of controllers; and frequency-response analysis with controller tuning using the frequency response.

  13. DUAL-PROCESS, a highly reliable process control system

    International Nuclear Information System (INIS)

    Buerger, L.; Gossanyi, A.; Parkanyi, T.; Szabo, G.; Vegh, E.

    1983-02-01

    A multiprocessor process control system is described. During its development the reliability was the most important aspect because it is used in the computerized control of a 5 MW research reactor. DUAL-PROCESS is fully compatible with the earlier single processor control system PROCESS-24K. The paper deals in detail with the communication, synchronization, error detection and error recovery problems of the operating system. (author)

  14. Experimental Physics and Industrial Control System (EPICS): Application source/release control for EPICS R3.11.6

    International Nuclear Information System (INIS)

    Zieman, B.; Kraimer, M.

    1994-01-01

    This manual describes a set of tools that can be used to develop software for EPICS-based control systems. It provides the following features. Multiple Applications: the entire system is composed of an arbitrary number of applications. Source/Release Control: all files created or modified by the application developers can be put under sccs (a UNIX source/release control utility). Multiple Developers: a number of application developers can work separately during the development phase but combine their applications for system testing and for a production system. Makefiles: makefiles are provided to automatically rebuild various application components; for C and state notation programs, Imagefiles are provided

  15. User interface for rail traffic control system

    OpenAIRE

    Križaj, Blaž

    2013-01-01

    The thesis presents a solution to the problem of remote access to a server application, its management, and a graphical user interface (GUI) built on top of a console application. The TRIS server operates in the UNIX environment as a service, without a graphical user interface. Access to the application, archive files and management without a graphical user interface is user-unfriendly, as it requires knowledge of the UNIX environment and the Bash scripting language. Remote access via the internet represen...

  16. Process control for sheet-metal stamping process modeling, controller design and shop-floor implementation

    CERN Document Server

    Lim, Yongseob; Ulsoy, A Galip

    2014-01-01

    Process Control for Sheet-Metal Stamping presents a comprehensive and structured approach to the design and implementation of controllers for the sheet metal stamping process. The use of process control for sheet-metal stamping greatly reduces defects in deep-drawn parts and can also yield large material savings from reduced scrap. Sheet-metal forming is a complex process and most often characterized by partial differential equations that are numerically solved using finite-element techniques. In this book, twenty years of academic research are reviewed and the resulting technology transitioned to the industrial environment. The sheet-metal stamping process is modeled in a manner suitable for multiple-input multiple-output control system design, with commercially available sensors and actuators. These models are then used to design adaptive controllers and real-time controller implementation is discussed. Finally, experimental results from actual shopfloor deployment are presented along with ideas for further...

  17. Upgrading NASA/DOSE laser ranging system control computers

    Science.gov (United States)

    Ricklefs, Randall L.; Cheek, Jack; Seery, Paul J.; Emenheiser, Kenneth S.; Hanrahan, William P., III; Mcgarry, Jan F.

    1993-01-01

    Laser ranging systems now managed by the NASA Dynamics of the Solid Earth (DOSE) program and operated by the Bendix Field Engineering Corporation, the University of Hawaii, and the University of Texas have produced a wealth of interdisciplinary scientific data over the last three decades. Despite upgrades to most of the ranging station subsystems, the control computers remain a mix of 1970's vintage minicomputers. These encompass a wide range of vendors, operating systems, and languages, making hardware and software support increasingly difficult. Current technology allows replacement of the controller computers at relatively low cost while maintaining excellent processing power and a friendly operating environment. The new controller systems are being designed around IBM-PC-compatible 80486-based microcomputers, a real-time Unix operating system (LynxOS), and X-windows/Motif; serial interfaces have also been chosen. This design minimizes short- and long-term costs by relying on proven standards for both hardware and software components. Currently, the project is in the design and prototyping stage, with the first systems targeted for production in mid-1993.

  18. New multi-microprocessor control system of the main ring magnet power supply for the KEK 12 GeV PS

    International Nuclear Information System (INIS)

    Matsumoto, S.; Sueno, T.; Mikawa, K.; Toda, M.; Baba, H.

    1987-01-01

    The new control computer system was installed last June. The main CPU, supervised by UNIX, covers man-machine interfacing and correction of current deviation in a periodic control algorithm. The two sub-CPUs are linked by a LAN and supported by a real-time multi-task monitor. One, the output process controller, distributes 15 control patterns to 16-bit D/A converters every 1.67 ms, synchronized with the 6-phase ac line. The other, the input controller, logs the B, Qf and Qd magnet currents and voltages with 7 sets of 16-bit ADCs at the same clock. The resulting precision of the currents from beam injection to flat top is estimated at about the resolution limit of the ADCs, except for current ripple; ripple currents are estimated at several times the intrinsic ripple of the DCCT

  19. Evolution in controls methods for the SPS power converters

    CERN Document Server

    Dinius, A H; Brazier, J C L

    1995-01-01

    In common with much accelerator specific material, there is a constant need to improve both hardware and software for power converter control. Maintenance and performance improvements of older systems have become extremely tedious and in some areas impossible. By using modern real-time software and the latest high-performance processors, such problems should be substantially reduced. This paper describes the software concepts and the hardware chosen for the upgrade of the existing facilities. Using the UNIX compatible LynxOS real time kernel, running on a PowerPC 603 in a VME environment, this new approach provides excellent performance while retaining the desired flexibility for future enhancements. The 64 channel system is implemented as a set of cooperating processes, several of which are multi-threaded. Processes include analogue function generation, analogue measurement and digital I/O, all of which are accurately scheduled by the accelerator timing system. This generalised structure, which performs comp...

  20. [Network Design of the Spaceport Command and Control System

    Science.gov (United States)

    Teijeiro, Antonio

    2017-01-01

    I helped the Launch Control System (LCS) hardware team sustain the network design of the Spaceport Command and Control System. I wrote the procedure that will be used to satisfy an official hardware test for the hardware carrying data from the Launch Vehicle. I installed hardware and updated design documents in support of the ongoing development of the Spaceport Command and Control System and applied firewall experience I gained during my spring 2017 semester to inspect and create firewall security policies as requested. Finally, I completed several online courses concerning networking fundamentals and Unix operating systems.

  1. MX: A beamline control system toolkit

    Science.gov (United States)

    Lavender, William M.

    2000-06-01

    The development of experimental and beamline control systems for two Collaborative Access Teams at the Advanced Photon Source has resulted in the creation of a portable data acquisition and control toolkit called MX. MX consists of a set of servers, application programs and libraries that enable the creation of command line and graphical user interface applications that may be easily retargeted to new and different kinds of motor and device controllers. The source code for MX is written in ANSI C and Tcl/Tk with interprocess communication via TCP/IP. MX is available for several versions of Unix, Windows 95/98/NT and DOS. It may be downloaded from the web site http://www.imca.aps.anl.gov/mx/.

  2. Upgrade of the Los Alamos Plutonium Facility control system

    International Nuclear Information System (INIS)

    Pope, N.G.; Turner, W.J.; Brown, R.E.; Bibeau, R.A.; Davis, R.R.; Hogan, K.

    1996-01-01

    After 20 years of service, the Los Alamos Plutonium Facility is undergoing an upgrade to its aging Facility Control System. The new system design includes a network of redundantly-paired programmable logic controllers that will interface with about 2200 field data points. The data communications network that has been designed includes a redundant, self-healing fiber optic data highway as well as a fiber optic Ethernet. Commercially available human-machine interface software running on a UNIX-based system displays facility subsystem status on operator X-terminals. Project design features, methods, costs, and schedule are discussed

  3. Development of a data acquisition system using a RISC/UNIX workstation

    International Nuclear Information System (INIS)

    Takeuchi, Y.; Tanimori, T.; Yasu, Y.

    1993-01-01

    We have developed a compact data acquisition system on RISC/UNIX workstations. A Sun SPARCstation IPC was used, in which an extension bus, SBus, was linked to a VMEbus. The transfer rate achieved was better than 7 Mbyte/s between the VMEbus and the SUN. A device driver for CAMAC was developed in order to realize an interrupt feature in UNIX. In addition, list processing has been incorporated in order to keep the high priority of the data handling process in UNIX. The successful development of both the device driver and list processing has made it possible to realize a good real-time feature on the RISC/UNIX system. Based on this architecture, a portable and versatile data taking system has been developed, which consists of a graphical user interface, I/O handler, user analysis process, process manager and a CAMAC device driver. (orig.)

  4. Process control in biogas plants

    DEFF Research Database (Denmark)

    Holm-Nielsen, Jens Bo; Oleskowicz-Popiel, Piotr

    2013-01-01

    Efficient monitoring and control of anaerobic digestion (AD) processes are necessary in order to enhance biogas plant performance. The aim of monitoring and controlling the biological processes is to stabilise and optimise the production of biogas. The principles of process analytical technology...

  5. Data triggered data processing at MFTF-B

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1985-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory we schedule jobs to process experimental data to be collected during a five minute shot cycle. Our data-driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs ''fire'', or execute, as input data becomes available. Similar to UNIX ''pipes'', data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on our networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. We report here on details of diagnostic data processing and our experiences
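
    The "fire when inputs become available" scheduling described above can be sketched in a few lines; the node names and processing steps below are invented for illustration, not taken from the MFTF-B system:

```python
# Data-triggered scheduling sketch: a node fires once all of its inputs
# exist, and its output may feed downstream nodes (like a UNIX pipe).

def run_dataflow(nodes, initial_data):
    """nodes: list of (inputs, output, func). Fire nodes as data appears."""
    data = dict(initial_data)
    pending = list(nodes)
    while pending:
        ready = [n for n in pending if all(i in data for i in n[0])]
        if not ready:
            break                      # remaining nodes can never fire
        for inputs, output, func in ready:
            data[output] = func(*(data[i] for i in inputs))
            pending.remove((inputs, output, func))
    return data

# "calibrate" must fire before "integrate": a pipe-like dependency. Note the
# nodes are deliberately listed out of order; firing order follows the data.
nodes = [
    (["calibrated"], "total", lambda xs: sum(xs)),
    (["raw"], "calibrated", lambda xs: [2 * x for x in xs]),
]
result = run_dataflow(nodes, {"raw": [1, 2, 3]})
```

    As in the system described, the schedule is declared before any data exist; execution order is determined entirely by data availability.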

  6. Microcrystalline silicon deposition: Process stability and process control

    International Nuclear Information System (INIS)

    Donker, M.N. van den; Kilper, T.; Grunsky, D.; Rech, B.; Houben, L.; Kessels, W.M.M.; Sanden, M.C.M. van de

    2007-01-01

    Applying in situ process diagnostics, we identified several process drifts occurring in the parallel plate plasma deposition of microcrystalline silicon (μc-Si:H). These process drifts are powder formation (visible from a diminishing dc bias and a changing spatial emission profile on a time scale of 10^0 s), transient SiH4 depletion (visible from a decreasing SiH emission intensity on a time scale of 10^2 s), plasma heating (visible from an increasing substrate temperature on a time scale of 10^3 s) and a still puzzling long-term drift (visible from a decreasing SiH emission intensity on a time scale of 10^4 s). The effect of these drifts on the crystalline volume fraction in the deposited films is investigated by selected area electron diffraction and depth-profiled Raman spectroscopy. An example shows how the transient depletion and long-term drift can be prevented by suitable process control. Solar cells deposited using this process control show enhanced performance. Options for process control of plasma heating and powder formation are discussed

  7. Upgrade plan for the control system of the KEK e-/e+ linac

    International Nuclear Information System (INIS)

    Furukawa, K.; Kamikubota, N.; Nakahara, K.; Abe, I.

    1992-01-01

    The KEK 2.5-GeV linac has been operating since 1982. However, in order to maintain reliable operation, the control system should be upgraded within a few years. We plan to replace the minicomputers and the main network connecting them. Thus, the architecture of the control software will also be revised. In the new system we should adopt software and hardware standards. In the next control system we will employ the TCP/IP (DARPA) protocol suite for the main network and Unix workstations to replace the minicomputers. For connections to the local controllers, VME bus (IEEE 1014) will be utilized. (author)

  8. An integrated approach to process control

    NARCIS (Netherlands)

    Schippers, W.A.J.

    2001-01-01

    The control of production processes is the subject of several disciplines, such as statistical process control (SPC), total productive maintenance (TPM), and automated process control (APC). Although these disciplines are traditionally separated (both in science and in business practice), their

  9. Statistical process control in wine industry using control cards

    OpenAIRE

    Dimitrieva, Evica; Atanasova-Pacemska, Tatjana; Pacemska, Sanja

    2013-01-01

    This paper is based on the research of the technological process of automatic filling of bottles of wine in winery in Stip, Republic of Macedonia. The statistical process control using statistical control card is created. The results and recommendations for improving the process are discussed.

  10. Memory-type control charts in statistical process control

    NARCIS (Netherlands)

    Abbas, N.

    2012-01-01

    The control chart is the most important statistical tool for managing business processes. It is a graph of measurements on a quality characteristic of the process on the vertical axis plotted against time on the horizontal axis. The graph is completed with control limits that mark the bounds of common-cause variation. Once
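
    The chart just described plots measurements against time with control limits; conventionally these are placed at the mean ± 3 standard deviations of in-control data. A minimal sketch (the sample values are invented):

```python
# Shewhart-style control limits computed from an in-control baseline, then
# used to flag later points that fall outside them. Data values are invented.

def control_limits(baseline):
    n = len(baseline)
    mean = sum(baseline) / n
    sigma = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples, baseline):
    """Indices of samples outside the baseline's 3-sigma control limits."""
    lcl, ucl = control_limits(baseline)
    return [i for i, x in enumerate(samples) if not lcl <= x <= ucl]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0]   # process in control
flags = out_of_control([10.1, 10.9, 9.95], baseline)
```

    Computing the limits from a stable baseline, rather than from the data being judged, is what lets the chart separate common-cause variation from special causes.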

  11. Fuzzy control of pressurizer dynamic process

    International Nuclear Information System (INIS)

    Ming Zhedong; Zhao Fuyu

    2006-01-01

    Considering the characteristics of the pressurizer dynamic process, a fuzzy control system that takes the advantages of both a fuzzy controller and a PID controller is designed for the dynamic process in the pressurizer. The simulation results illustrate that this type of composite control system has better qualities than either a single fuzzy controller or a single PID controller. (authors)
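
    The PID half of the composite controller above can be illustrated generically; the discrete PID law and first-order plant below are a textbook sketch with invented gains, not the authors' pressurizer model:

```python
# Generic discrete PID loop driving a first-order plant dx/dt = -x + u
# toward a setpoint. Gains and plant are placeholders, not the paper's model.

def pid_run(kp=2.0, ki=0.8, kd=0.1, dt=0.1, steps=500, setpoint=1.0):
    x, integ, prev_e = 0.0, 0.0, None
    for _ in range(steps):
        e = setpoint - x
        integ += e * dt
        deriv = 0.0 if prev_e is None else (e - prev_e) / dt
        prev_e = e
        u = kp * e + ki * integ + kd * deriv   # the three PID terms
        x += dt * (-x + u)                     # Euler step of the plant
    return x

final = pid_run()
```

    The integral term is what removes the steady-state offset; a fuzzy supervisor, as in the paper, would typically adjust such gains or blend in its own action during large transients.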

  12. Statistical Process Control for KSC Processing

    Science.gov (United States)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course complete with animation and video excerpts from the course when it was taught at KSC was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished as well as an evaluation of SPC software for KSC use in the future. A final accomplishment of the orientation of the author to NASA changes, terminology, data format, and new NASA task definitions will allow future consultation when the needs arise.

  13. Optimization of CernVM early boot process

    CERN Document Server

    Mazdin, Petra

    2015-01-01

    The CernVM virtual machine is a Linux-based virtual appliance optimized for High Energy Physics experiments. It is used for cloud computing, volunteer computing, and software development by the four large LHC experiments. The goal of this project is profiling and optimizing the boot process of the CernVM. A key part was the development of a performance profiler for shell scripts as an extension to the popular BusyBox open-source UNIX tool suite. Based on the measurements, costly shell code was replaced by more efficient, custom C programs. The results are compared to the original ones and the optimization is shown to be successful.
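
    The per-command profiling idea is easy to illustrate outside BusyBox; the stand-alone Python sketch below (command list invented) simply times each command of a script and ranks them slowest-first, which is the information needed to decide what to rewrite in C:

```python
# Stand-in illustration of shell-script profiling: time every command and
# report them slowest-first. This is NOT the BusyBox extension from the
# project, just the underlying idea.
import subprocess
import time

def profile_commands(commands):
    """Run each shell command and return (command, seconds), slowest first."""
    timings = []
    for cmd in commands:
        t0 = time.perf_counter()
        subprocess.run(cmd, shell=True, check=True)
        timings.append((cmd, time.perf_counter() - t0))
    return sorted(timings, key=lambda t: t[1], reverse=True)

report = profile_commands(["sleep 0.2", "true"])
```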

  14. Control system for technological processes in tritium processing plants with process analysis

    International Nuclear Information System (INIS)

    Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu; Bucur, Ciprian

    2005-01-01

    Integration of a large variety of installations and equipment into a unitary system for controlling the technological process in tritium processing nuclear facilities is a rather complex undertaking, particularly when experimental or new technologies are developed. Ensuring a high degree of versatility that allows easy modification of configurations and process parameters is a major requirement imposed on experimental installations. The large amount of data which must be processed, stored and easily accessed for subsequent analyses imposes the development of a large information network based on a highly integrated system containing the acquisition, control and technological process analysis data as well as a database system. On such a basis, integrated systems for computation and control able to conduct the technological process can be developed, as well as protection systems for cases of failure or breakdown. The integrated system responds to the control and security requirements in case of emergency, as needed for technological processes specific to industries that handle radioactive or toxic substances with severe consequences in case of technological failure, as in a tritium processing nuclear plant. In order to lower the risk of technological failure of these processes, integrated software, database and process analysis systems are developed which, based on an identification algorithm for the parameters important to the protection and security systems, will display the process evolution trend. The system was checked on an existing plant that includes a tritium removal unit, finally to be used in a nuclear power plant, by simulating the failure events as well as the process. The system will also include a complete database monitoring all the parameters and process analysis software for the main modules of the tritium processing plant, namely isotope separation, catalytic purification and cryogenic distillation

  15. APS controls overview

    International Nuclear Information System (INIS)

    1996-01-01

    The APS accelerator control system described in this report is a distributed system consisting of operator interfaces, a network, and interfaces to hardware. The operator interface is a UNIX-based workstation with an X-windows graphical user interface. The workstation may be located at any point on the facility network and maintain full functionality. The user has the ability to generate and alter control displays and to access the alarm handler, the archiver, interactive control programs, custom code, and other tools. The TCP/IP networking protocol has been selected as the underlying protocol for the control system network. TCP/IP is a commercial standard and readily available from network hardware vendors. Its implementation is independent of the particular network medium selected to implement the controls network. In the development environment copper Ethernet is the network medium; however, in the actual implementation a fiber-based system using hub technology will be utilized. The function of the network is to provide a generalized communication path between the host computers, operator workstations, input/output crates, and other hardware that comprise the control system

  16. Welding process modelling and control

    Science.gov (United States)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, software developed, and hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  17. Variation and Control of Process Behavior

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Whitaker, Matthew

    2008-01-01

    The purpose of this work was to highlight the importance of controlling process variability for successful quality assurance (QA). We describe the method of statistical process control for characterizing and controlling a process. Traditionally, QA has been performed by comparing some important measurement (e.g., linear accelerator output) against a corresponding specification. Although useful in determining the fitness of a particular measurement, this approach does not provide information about the underlying process behavior over time. A modern view of QA is to consider the time-ordered behavior of a process. Every process displays characteristic behaviors that are independent of the specifications imposed on it. The goal of modern QA is not only to ensure that a process is on-target, but also that it is operating with minimal variation. This is accomplished by way of a data-driven approach using process behavior charts. The development of process behavior charts, historically known as control charts, and process behavior (action) limits are described. The effect these concepts have on quality management is also discussed

  18. Science-based information processing in the process control of power stations

    International Nuclear Information System (INIS)

    Weisang, C.

    1992-01-01

    Through the application of specialized systems, future-oriented information processing integrates the sciences of processes, control systems, process control strategies, user behaviour and ergonomics. Improvements in process control can be attained, inter alia, by preparation of the information content (e.g. by suppressing the raw flow of signals and replacing it with signals founded on substance) and also by an ergonomic representation of the process. (orig.) [de]

  19. Network communication libraries for the next control system of the KEK e-/e+ Linac

    International Nuclear Information System (INIS)

    Kamikubota, Norihiko; Furukawa, Kazuro; Nakahara, Kazuo; Abe, Isamu

    1992-01-01

    The network communication libraries for the next control system of the KEK Linac have been developed. They are based on TCP/IP sockets, and show high availability among the different operating systems: UNIX, VAX/VMS, and MS-DOS. They also show high source portability of application programs among the different computer systems provided by various vendors. The performance and problems are presented in detail. (author)
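
    The portability claim above rests on the Berkeley-socket API being available on UNIX, VAX/VMS, and MS-DOS alike. A minimal TCP exchange over the loopback interface shows the pattern such libraries wrap; this is a generic sketch, not the KEK Linac library itself:

```python
# Minimal TCP client/server exchange over loopback, illustrating the
# Berkeley-socket calls (socket/bind/listen/accept/connect/send/recv)
# on which portable communication libraries are built.
import socket
import threading

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
server.listen(1)

def echo_once():
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))   # echo one message back

threading.Thread(target=echo_once, daemon=True).start()

client = socket.create_connection(server.getsockname())
client.sendall(b"ping")
reply = client.recv(1024)
client.close()
server.close()
```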

  20. Supporting Cross-Organizational Process Control

    Science.gov (United States)

    Angelov, Samuil; Vonk, Jochem; Vidyasankar, Krishnamurthy; Grefen, Paul

    E-contracts express the rights and obligations of parties through a formal, digital representation of the contract provisions. In process intensive relationships, e-contracts contain business processes that a party promises to perform for the counter party, optionally allowing monitoring of the execution of the promised processes. In this paper, we describe an approach in which the counter party is allowed to control the process execution. This approach will lead to more flexible and efficient business relations which are essential in the context of modern, highly dynamic and complex collaborations among companies. We present a specification of the process controls available to the consumer and their support in the private process specification of the provider.

  1. DSISoft—a MATLAB VSP data processing package

    Science.gov (United States)

    Beaty, K. S.; Perron, G.; Kay, I.; Adam, E.

    2002-05-01

    DSISoft is a public domain vertical seismic profile processing software package developed at the Geological Survey of Canada. DSISoft runs under MATLAB version 5.0 and above and hence is portable between computer operating systems supported by MATLAB (i.e. Unix, Windows, Macintosh, Linux). The package includes processing modules for reading and writing various standard seismic data formats, and for performing data editing, sorting, filtering, and other basic processing tasks. The processing sequence can be scripted, allowing batch processing and easy documentation. A structured format has been developed to ensure future additions to the package are compatible with existing modules. Interactive modules have been created using MATLAB's graphical user interface builder for displaying seismic data, picking first-break times, examining frequency spectra, doing f-k filtering, and plotting the trace header information. DSISoft's modular design facilitates the incorporation of new processing algorithms as they are developed. This paper gives an overview of the scope of the software and serves as a guide for the addition of new modules.

  2. Feasibility study of BES data processing and physics analysis on a PC/Linux platform

    International Nuclear Information System (INIS)

    Rong Gang; He Kanglin; Zhao Jiawei; Heng Yuekun; Zhang Chun

    1999-01-01

    The authors report a feasibility study of off-line BES data processing (data reconstruction and detector simulation) on a PC/Linux platform and an application of the PC/Linux system in D/Ds physics analysis. The authors compared the results obtained from the PC/Linux with those from an HP workstation. It shows that the PC/Linux platform can do BES offline data analysis as well as a UNIX workstation does, but is much more powerful and economical

  3. Processes subject to integrated pollution control. Petroleum processes: oil refining and associated processes

    International Nuclear Information System (INIS)

    1995-01-01

    This document, part of a series offering guidance on pollution control regulations issued by Her Majesty's Inspectorate of Pollution (HMIP), focuses on petroleum processes such as oil refining and other associated processes. The various industrial processes used, their associated pollution release routes into the environment and techniques for controlling these releases are all discussed. Environmental quality standards are related to national and international agreements on pollution control and abatement. HMIP's work on air, water and land pollution monitoring is also reported. (UK)

  4. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some

  5. Microprocessors control of fermentation process

    Energy Technology Data Exchange (ETDEWEB)

    Fawzy, A S; Hinton, O R

    1980-01-01

    This paper presents three schemes for the solution of the optimal control of a fermentation process. It also shows the advantages of using microprocessors in controlling and monitoring this process. A linear model of the system is considered. An optimal feedback controller is determined which maintains the states (substrate and organism concentrations) at desired values when the system is subjected to disturbances in the influent substrate and organism concentrations. Simulation results are presented for the three cases.
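
The feedback idea in the abstract, holding a state at a desired value despite disturbances, can be illustrated with a minimal sketch; the scalar model, gains and disturbance below are invented for illustration and do not reproduce the paper's optimal controller.

```python
# Minimal state-feedback sketch: regulate a scalar state x toward a
# reference despite a step disturbance d (model and gains are assumed).

def step(x, u, d, dt=0.1):
    # Assumed linear plant: dx/dt = -0.5*x + u + d
    return x + dt * (-0.5 * x + u + d)

x, x_ref, k = 1.0, 1.0, 4.0
for i in range(400):
    d = 0.3 if i >= 200 else 0.0         # disturbance enters halfway through
    u = 0.5 * x_ref - k * (x - x_ref)    # feedforward + proportional feedback
    x = step(x, u, d)
print(round(x, 3))
```

With purely proportional feedback the disturbance leaves a small steady-state offset; an integral term (or the paper's optimal design) would remove it.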

  6. Graphical user interfaces for McClellan Nuclear Radiation Center (MNRC)

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1998-01-01

    McClellan's Nuclear Radiation Center (MNRC) control console is in the process of being replaced due to spurious scrams, outdated software, and obsolete parts. The intent of the new control console is to eliminate the existing problems by installing a UNIX-based computer system with industry-standard interface software and incorporating human factors during all stages of the graphical user interface (GUI) development and control console design

  7. Association between product quality control and process quality control of bulk milk

    NARCIS (Netherlands)

    Velthuis, A.; Asseldonk, van M.A.P.M.

    2010-01-01

    Assessment of dairy-milk quality is based on product quality control (testing bulk-milk samples) and process quality control (auditing dairy farms). It is unknown whether process control improves product quality. To quantify possible association between product control and process control a

  8. Process and apparatus for controlling control rods

    International Nuclear Information System (INIS)

    Gebelin, B.; Couture, R.

    1987-01-01

    This process and apparatus are characterized by two methods for the examination of clusters of nuclear control rods: a Foucault (eddy) current analyzer examines all the control rods fraction by fraction, the examination being made by rotation of the cluster; doubtful rods are then analysed by an ultrasonic probe [fr

  9. Third Dutch Process Security Control Event

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2009-01-01

    On June 4th, 2009, the third Dutch Process Control Security Event took place in Amsterdam. The event, organised by the Dutch National Infrastructure against Cybercrime (NICC), attracted both Dutch process control experts and members of the European SCADA and Control Systems Information Exchange

  10. Control measurement system in purex process

    International Nuclear Information System (INIS)

    Mani, V.V.S.

    1985-01-01

    The dependence of a bulk facility handling the Purex process on the control measurement system for evaluating process performance hardly needs to be emphasized. Process control, plant control, inventory control and quality control are the four components of the control measurement system. The scope and requirements of each component are different and the measurement methods are selected accordingly. However, each measurement system has six important elements, which are described in detail. The quality assurance programme carried out by the laboratory, as a mechanism through which the quality of measurements is regularly tested and stated in quantitative terms, is also explained in terms of internal and external quality assurance, with examples. Suggestions for making the control measurement system more responsive to operational needs in the future are also briefly discussed. (author)

  11. Operator interface for the PEP-II low level RF control system

    International Nuclear Information System (INIS)

    Allison, S.; Claus, R.

    1997-05-01

    This paper focuses on the operational aspects of the low level RF control system being built for the PEP-II storage rings at SLAC. Subsystems requiring major operational considerations include displays for monitoring and control from UNIX workstations, slow feedback loops and control sequences residing on microprocessors, and various client applications in the existing SLAC Linear Collider (SLC) control system. Since commissioning of the PEP-II RF is currently in progress, only those parts of the control system used during this phase are discussed in detail. Based on past experience with the SLC control system, it is expected that effort expended during commissioning on a solid user interface will result in a smoother transition to full, reliable 24-hour-a-day operation

  12. Chado controller: advanced annotation management with a community annotation system.

    Science.gov (United States)

    Guignon, Valentin; Droc, Gaëtan; Alaux, Michael; Baurens, Franc-Christophe; Garsmeur, Olivier; Poiron, Claire; Carver, Tim; Rouard, Mathieu; Bocs, Stéphanie

    2012-04-01

    We developed a controller that is compliant with the Chado database schema, GBrowse and genome annotation-editing tools such as Artemis and Apollo. It enables the management of public and private data, monitors manual annotation (with controlled vocabularies, structural and functional annotation controls) and stores versions of annotation for all modified features. The Chado Controller uses PostgreSQL and Perl. The package is available for download at http://www.gnpannot.org/content/chado-controller and runs on any Unix-like operating system; documentation is available at http://www.gnpannot.org/content/chado-controller-doc. The system can be tested using the GNPAnnot Sandbox at http://www.gnpannot.org/content/gnpannot-sandbox-form. Contact: valentin.guignon@cirad.fr; stephanie.sidibe-bocs@cirad.fr. Supplementary data are available at Bioinformatics online.

  13. Robust control charts in statistical process control

    NARCIS (Netherlands)

    Nazir, H.Z.

    2014-01-01

    The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust

  14. The UNK control system

    International Nuclear Information System (INIS)

    Alferov, V.N.; Brook, V.L.; Dunaitsev, A.F.

    1992-01-01

    The IHEP proton Accelerating and Storage Complex (UNK) includes in its first stage a 400 GeV conventional and a 3000 GeV superconducting ring placed in the same underground tunnel of 20.7 km circumference. The beam will be injected into UNK from the existing 70 GeV accelerator U-70. The experimental programme, which is planned to start in 1995, will include 3000 GeV fixed target and 400 + 3000 GeV colliding beams physics. The size and complexity of the UNK dictate a distributed multiprocessor architecture for the control system. About 4000 8/16-bit controllers, directly attached to the UNK equipment, will perform low level control and data acquisition tasks. The equipment controllers will be connected via the MIL-1553 field bus to VME based 32-bit front end computers. The TCP/IP network will interconnect front end computers in the UNK equipment buildings with UNIX workstations and servers in the Main Control Room. The report presents the general architecture and current status of the UNK control system. (author)

  15. Distributed computer controls for accelerator systems

    Science.gov (United States)

    Moore, T. L.

    1989-04-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multiuser Tandem Facility using an extremely modular approach in hardware and software. The two tiered, geographically organized design allowed total system implementation within four months with a computer and instrumentation cost of approximately $100k. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the effective implementation of the system. The definition, design, implementation, operation and total system performance will be discussed.

  16. Distributed computer controls for accelerator systems

    International Nuclear Information System (INIS)

    Moore, T.L.

    1989-01-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multiuser Tandem Facility using an extremely modular approach in hardware and software. The two tiered, geographically organized design allowed total system implementation within four months with a computer and instrumentation cost of approximately $100k. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the effective implementation of the system. The definition, design, implementation, operation and total system performance will be discussed. (orig.)

  17. Distributed computer controls for accelerator systems

    International Nuclear Information System (INIS)

    Moore, T.L.

    1988-09-01

    A distributed control system has been designed and installed at the Lawrence Livermore National Laboratory Multi-user Tandem Facility using an extremely modular approach in hardware and software. The two tiered, geographically organized design allowed total system implementation within four months with a computer and instrumentation cost of approximately $100K. Since the system structure is modular, application to a variety of facilities is possible. Such a system allows rethinking of the operational style of the facilities, making possible highly reproducible and unattended operation. The impact of industry standards, i.e., UNIX, CAMAC, and IEEE-802.3, and the use of a graphics-oriented controls software suite allowed the efficient implementation of the system. The definition, design, implementation, operation and total system performance will be discussed. 3 refs

  18. Neural PID Control Strategy for Networked Process Control

    Directory of Open Access Journals (Sweden)

    Jianhua Zhang

    2013-01-01

    Full Text Available A new method with a two-layer hierarchy is presented, based on a neural proportional-integral-derivative (PID) iterative learning method over the communication network, for the closed-loop automatic tuning of a PID controller. It can enhance the performance of the well-known simple PID feedback control loop in the local field when networked process control is applied to systems with uncertain factors, such as external disturbance or randomly delayed measurements. The proposed PID iterative learning method is implemented by backpropagation neural networks whose weights are updated via minimizing the tracking error entropy of the closed-loop system. Convergence in the mean square sense is analysed for closed-loop networked control systems. To demonstrate the potential applications of the proposed strategies, a pressure-tank experiment is provided to show the usefulness and effectiveness of the proposed design method in networked process control systems.
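
The closed-loop tuning idea, repeating trials and keeping gain changes that reduce tracking error, can be sketched crudely in Python; the first-order plant and the hill-climbing update below are illustrative stand-ins for the paper's neural iterative learning method, not a reproduction of it.

```python
# Crude iterative PID tuning sketch (plant model and update rule assumed).

def run_trial(kp, ki, kd, setpoint=1.0, dt=0.05, steps=200):
    """Run one closed-loop trial and return the integrated squared error."""
    y, integral, prev_err, sse = 0.0, 0.0, setpoint, 0.0
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv   # discrete PID law
        prev_err = err
        y += dt * (-y + u)                          # assumed plant: dy/dt = -y + u
        sse += err * err * dt
    return sse

# Trial-to-trial learning: keep a proportional-gain change only if it
# reduced the tracking error of the previous trial.
kp, best = 1.0, run_trial(1.0, 0.5, 0.01)
for _ in range(10):
    trial = run_trial(kp + 0.2, 0.5, 0.01)
    if trial < best:
        best, kp = trial, kp + 0.2
print(round(kp, 1), round(best, 4))
```

The paper replaces this naive update with a neural network trained on the closed-loop error, but the trial-and-update loop structure is the same.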

  19. Data triggered data processing at the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1986-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory the authors schedule jobs to process experimental data to be collected during a five minute shot cycle. The data driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs "fire", or execute, as input data becomes available. Similar to UNIX "pipes", data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on the networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. The authors report here on details of diagnostic data processing and their experiences
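
The firing rule described, where a pre-scheduled job executes once all of its inputs exist and its outputs feed downstream jobs, can be sketched as a toy scheduler; the node names and functions here are invented for illustration.

```python
# Toy data-driven scheduler: jobs are declared before the data exists
# and "fire" when all of their inputs become available, passing outputs
# downstream like a UNIX pipe (node names are invented).

class Node:
    def __init__(self, name, inputs, func):
        self.name, self.inputs, self.func = name, inputs, func

def run(nodes, data):
    """Repeatedly fire any node whose inputs are all present."""
    pending = list(nodes)
    while pending:
        ready = [n for n in pending if all(i in data for i in n.inputs)]
        if not ready:
            break                      # remaining jobs still wait for data
        for node in ready:
            data[node.name] = node.func(*[data[i] for i in node.inputs])
            pending.remove(node)
    return data

nodes = [
    Node("spectrum", ["raw"], lambda raw: [abs(v) for v in raw]),
    Node("peak", ["spectrum"], max),
]
result = run(nodes, {"raw": [-3, 1, 2]})
print(result["peak"])
```

Note that "peak" is declared before "spectrum" exists; it fires only after the upstream node has produced its output, which is the essence of the data-triggered design.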

  20. A NEW BENCHMARK FOR PLANTWIDE PROCESS CONTROL

    Directory of Open Access Journals (Sweden)

    N. Klafke

    Full Text Available Abstract The hydrodealkylation process of toluene (HDA) has been used as a case study in a large number of control studies. However, in terms of industrial application, this process has become obsolete and is nowadays superseded by new technologies capable of processing heavy aromatic compounds, which increase the added value of the raw materials, such as the process of transalkylation and disproportionation of toluene (TADP). TADP also presents more complex feed and product streams and more challenging operational characteristics, in both the reactor and separator sections, than HDA. This work is aimed at proposing the TADP process as a new benchmark for plantwide control studies in lieu of the HDA process. For this purpose, a nonlinear dynamic rigorous model for the TADP process was developed using Aspen Plus™ and Aspen Dynamics™ and industrial conditions. Plantwide control structures (oriented to control and to the process) were adapted and applied for the first time to this process. The results show that, even though both strategies are similar in terms of control performance, the optimization of economic factors must still be sought.

  1. Closed-Loop Process Control for Electron Beam Freeform Fabrication and Deposition Processes

    Science.gov (United States)

    Taminger, Karen M. (Inventor); Hafley, Robert A. (Inventor); Martin, Richard E. (Inventor); Hofmeister, William H. (Inventor)

    2013-01-01

    A closed-loop control method for an electron beam freeform fabrication (EBF(sup 3)) process includes detecting a feature of interest during the process using a sensor(s), continuously evaluating the feature of interest to determine, in real time, a change occurring therein, and automatically modifying control parameters to control the EBF(sup 3) process. An apparatus provides closed-loop control of the process, and includes an electron gun for generating an electron beam, a wire feeder for feeding a wire toward a substrate, wherein the wire is melted and progressively deposited in layers onto the substrate, a sensor(s), and a host machine. The sensor(s) measure the feature of interest during the process, and the host machine continuously evaluates the feature of interest to determine, in real time, a change occurring therein. The host machine automatically modifies control parameters to the EBF(sup 3) apparatus to control the EBF(sup 3) process in a closed-loop manner.

  2. Markov processes and controlled Markov chains

    CERN Document Server

    Filar, Jerzy; Chen, Anyue

    2002-01-01

    The general theory of stochastic processes and the more specialized theory of Markov processes evolved enormously in the second half of the last century. In parallel, the theory of controlled Markov chains (or Markov decision processes) was being pioneered by control engineers and operations researchers. Researchers in Markov processes and controlled Markov chains have been, for a long time, aware of the synergies between these two subject areas. However, this may be the first volume dedicated to highlighting these synergies and, almost certainly, it is the first volume that emphasizes the contributions of the vibrant and growing Chinese school of probability. The chapters that appear in this book reflect both the maturity and the vitality of modern day Markov processes and controlled Markov chains. They also will provide an opportunity to trace the connections that have emerged between the work done by members of the Chinese school of probability and the work done by the European, US, Central and South Ameri...

  3. Fundamentals of semiconductor manufacturing and process control

    CERN Document Server

    May, Gary S

    2006-01-01

    A practical guide to semiconductor manufacturing from process control to yield modeling and experimental design Fundamentals of Semiconductor Manufacturing and Process Control covers all issues involved in manufacturing microelectronic devices and circuits, including fabrication sequences, process control, experimental design, process modeling, yield modeling, and CIM/CAM systems. Readers are introduced to both the theory and practice of all basic manufacturing concepts. Following an overview of manufacturing and technology, the text explores process monitoring methods, including those that focus on product wafers and those that focus on the equipment used to produce wafers. Next, the text sets forth some fundamentals of statistics and yield modeling, which set the foundation for a detailed discussion of how statistical process control is used to analyze quality and improve yields. The discussion of statistical experimental design offers readers a powerful approach for systematically varying controllable p...

  4. Control System Design for Cylindrical Tank Process Using Neural Model Predictive Control Technique

    Directory of Open Access Journals (Sweden)

    M. Sridevi

    2010-10-01

    Full Text Available Chemical manufacturing and process industries require innovative technologies for process identification. This paper deals with model identification and control of a cylindrical tank process. Model identification of the process was done using the ARMAX technique. A neural model predictive controller was designed for the identified model. The performance of the controllers was evaluated using MATLAB software. The performance of the NMPC controller was compared with a Smith Predictor controller and an IMC controller on the basis of rise time, settling time, overshoot and ISE, and it was found that the NMPC controller is better suited for this process.

  5. A survey of process control computers at the Idaho Chemical Processing Plant

    International Nuclear Information System (INIS)

    Dahl, C.A.

    1989-01-01

    The Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering Laboratory is charged with the safe processing of spent nuclear fuel elements for the United States Department of Energy. The ICPP was originally constructed in the late 1950s and used state-of-the-art technology for process control at that time. The state of process control instrumentation at the ICPP has steadily improved to keep pace with emerging technology. Today, the ICPP is a collage of emerging computer technology in process control, with some systems as simple as standalone measurement computers while others are state-of-the-art distributed control systems controlling the operations of an entire facility within the plant. The ICPP has made maximal use of process computer technology aimed at increasing the surety, safety, and efficiency of the process operations. Many benefits have been derived from the use of the computers at minimal cost, including decreased misoperations in the facility, and more benefits are expected in the future

  6. Control of Pressure Process Using Infineon Microcontroller

    Directory of Open Access Journals (Sweden)

    A. Siddique

    2007-07-01

    Full Text Available The main objective of this paper is to design a cost-effective controller for real-time implementation of a pressure process using an Infineon microcontroller (SAB 80C517A). Model identification is performed and the process is found to be a first-order plus dead time (FOPDT) process. The performance measure is tabulated for different parameters and it is found that a proportional (P) controller is suitable for controlling the process.

  7. Low Activity Waste Feed Process Control Strategy

    International Nuclear Information System (INIS)

    STAEHR, T.W.

    2000-01-01

    The primary purpose of this document is to describe the overall process control strategy for monitoring and controlling the functions associated with the Phase 1B high-level waste feed delivery. This document provides the basis for the process monitoring and control functions and requirements needed throughout the double-shell tank system during Phase 1 high-level waste feed delivery. This document is intended to be used by (1) the developers of the future Process Control Plan and (2) the developers of the monitoring and control system

  8. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, it becomes a normal occurrence that we are confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim...... is to identify the “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high-dimensional data with an excessive...... amount of cross correlation, practitioners are often recommended to use latent structure methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts...
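
The Hotelling-type statistic mentioned in the abstract can be computed directly for the two-variable case; a minimal pure-Python sketch, with made-up reference data, writing out the 2x2 covariance inverse by hand:

```python
# Minimal two-variable Hotelling's T^2 sketch for multivariate SPC
# (reference data values are invented for illustration).

def mean(v):
    return sum(v) / len(v)

def cov2(x, y):
    """Sample variances and covariance of paired samples x, y."""
    mx, my = mean(x), mean(y)
    n = len(x) - 1
    sxx = sum((a - mx) ** 2 for a in x) / n
    syy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return sxx, syy, sxy

def t_squared(point, x, y):
    """T^2 distance of a new observation from the reference sample."""
    mx, my = mean(x), mean(y)
    sxx, syy, sxy = cov2(x, y)
    det = sxx * syy - sxy * sxy
    dx, dy = point[0] - mx, point[1] - my
    # [dx dy] S^-1 [dx dy]^T with the 2x2 inverse expanded
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

x = [9.8, 10.1, 10.0, 9.9, 10.2]
y = [5.0, 5.2, 5.1, 4.9, 5.3]
print(round(t_squared((10.0, 5.1), x, y), 3))
print(round(t_squared((11.0, 4.0), x, y), 3))
```

A chart would flag points whose T^2 exceeds a control limit; an observation at the sample mean scores zero, while one far from the correlation structure scores large.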

  9. Development of Control Applications for High-Throughput Protein Crystallography Experiments

    International Nuclear Information System (INIS)

    Gaponov, Yurii A.; Matsugaki, Naohiro; Honda, Nobuo; Sasajima, Kumiko; Igarashi, Noriyuki; Hiraki, Masahiko; Yamada, Yusuke; Wakatsuki, Soichi

    2007-01-01

    An integrated client-server control system (PCCS) with a unified relational database (PCDB) has been developed for high-throughput protein crystallography experiments on synchrotron beamlines. The major steps in protein crystallographic experiments (purification, crystallization, crystal harvesting, data collection, and data processing) are integrated into the software. All information necessary for performing protein crystallography experiments is stored in the PCDB database (except raw X-ray diffraction data, which is stored in the Network File Server). To allow all members of a protein crystallography group to participate in experiments, the system was developed as a multi-user system with secure network access based on TCP/IP secure UNIX sockets. Secure remote access to the system is possible from any operating system with X-terminal and SSH/X11 (Secure Shell with graphical user interface) support. Currently, the system covers the high-throughput X-ray data collection stages and is being commissioned at BL5A and NW12A (PF, PF-AR, KEK, Tsukuba, Japan)

  10. Linearizing control of continuous anaerobic fermentation processes

    Energy Technology Data Exchange (ETDEWEB)

    Babary, J.P. [Centre National d'Etudes Spatiales (CNES), 31 - Toulouse (France). Laboratoire d'Analyse et d'Architecture des Systemes; Simeonov, I. [Institute of Microbiology, Bulgarian Academy of Sciences (Bulgaria); Ljubenova, V. [Institute of Control and System Research, BAS (Country unknown/Code not available); Dochain, D. [Universite Catholique de Louvain (UCL), Louvain-la-Neuve (Belgium)

    1997-09-01

    Biotechnological processes (BTP) involve living organisms. In anaerobic fermentation (the biogas production process) the organic matter is mineralized by microorganisms into biogas (methane and carbon dioxide) in the absence of oxygen. The biogas is an additional energy source. Generally this process is carried out as a continuous BTP. It has been widely used in practice and has been confirmed as a promising method of solving some energy and ecological problems in agriculture and industry. Because of the very restrictive on-line information, the control of this process in continuous mode is often reduced to control of the biogas production rate or of the concentration of the polluting organic matter (de-pollution control) at a desired value in the presence of perturbations. Investigations show that classical linear controllers give good performance only in the linear zone of the strongly non-linear input-output characteristics. More sophisticated robust and variable structure (VSC) controllers have been studied. Due to the strongly non-linear dynamics of the process, the performance of the closed-loop system may degrade in these cases. The aim of this paper is to investigate different linearizing algorithms for control of a continuous non-linear methane fermentation process, using the dilution rate as the control action and taking into account some practical implementation aspects. (authors) 8 refs.

  11. Cognitive process modelling of controllers in en route air traffic control.

    Science.gov (United States)

    Inoue, Satoru; Furuta, Kazuo; Nakata, Keiichi; Kanno, Taro; Aoyama, Hisae; Brown, Mark

    2012-01-01

    In recent years, various efforts have been made in air traffic control (ATC) to maintain traffic safety and efficiency in the face of increasing air traffic demands. ATC is a complex process that depends to a large degree on human capabilities, and so understanding how controllers carry out their tasks is an important issue in the design and development of ATC systems. In particular, the human factor is considered to be a serious problem in ATC safety and has been identified as a causal factor in both major and minor incidents. There is, therefore, a need to analyse the mechanisms by which errors occur due to complex factors and to develop systems that can deal with these errors. From the cognitive process perspective, it is essential that system developers have an understanding of the more complex working processes that involve the cooperative work of multiple controllers. Distributed cognition is a methodological framework for analysing cognitive processes that span multiple actors mediated by technology. In this research, we attempt to analyse and model interactions that take place in en route ATC systems based on distributed cognition. We examine the functional problems in an ATC system from a human factors perspective, and conclude by identifying certain measures by which to address these problems. This research focuses on the analysis of air traffic controllers' tasks in en route ATC and on modelling controllers' cognitive processes, through an experimental study designed to gain a better understanding of those processes. We conducted ethnographic observations and then analysed the data to develop a model of controllers' cognitive process. This analysis revealed that strategic routines are applicable to decision making.

  12. A taxonomy of control in intensified processes

    International Nuclear Information System (INIS)

    Barzin, R.; Abd Shukor, S.R.; Ahmad, A.L.

    2006-01-01

    Process Intensification (PI) is a revolutionary approach to the design, development and implementation of processes and plants. PI technology offers an improved environment in a chemical process in terms of better products, and processes which are safer, cleaner, smaller and cheaper. PI is a strategy of making dramatic reductions in the size of unit operations within chemical plants in order to achieve given production objectives. However, PI technology would be handicapped if such systems were not properly controlled. There are some foreseeable problems in controlling such processes, for instance the dynamic interaction between the components that make up a control loop, the response time of the instrumentation, the availability of proper sensors, etc. In some cases, advanced control solutions have been applied to control these systems, i.e. model predictive controllers (MPC) and their different algorithms such as quadratic generalized predictive control (QGPC) and self-tuning quadratic generalized predictive control (STQGPC). Nevertheless, in some cases simpler solutions can be applied to control such systems, for example a proportional-integral controller in the control of reactive distillation systems. As mentioned, conventional control systems like proportional-integral and proportional-integral-derivative (PID) controllers and their different structures can be used in PI systems, but due to the inherent nonlinearity and fast responsiveness of PI systems, digital controllers, owing to their robustness, are mostly applied to control PI systems. Given that choosing the appropriate control strategy is the most essential part of making PI systems easy to handle, a taxonomy of the control structures used in controlling PI systems is proposed. This paper offers an overview and discussion identifying potential problems of instrumentation in PI technology and the available control strategies

  13. Dosimetry and process control for radiation processing

    International Nuclear Information System (INIS)

    Mod Ali, N.

    2002-01-01

    Complete text of publication follows. Accurate radiation dosimetry can provide quality assurance in radiation processing. Considerable relevant experience in dosimetry at the SSDL-MINT has necessitated the development of methods making measurements at the gamma plant traceable to the national standard. This involves the establishment of a proper calibration procedure and the selection of an appropriate transfer system/technique to assure adequate traceability to a primary radiation standard. The effort forms the basis for irradiation process control, the legal approval of the process by the public health authorities (medical product sterilization and food preservation) and the safety and acceptance of the product

  14. Modeling and Advanced Control for Sustainable Process ...

    Science.gov (United States)

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-inspired, multi-agent-based method. The sustainability and performance assessment of process operating points is carried out using the U.S. E.P.A.’s GREENSCOPE assessment tool that provides scores for the selected economic, material management, environmental and energy indicators. The indicator results supply information on whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous bioethanol fermentation process whose dynamics are characterized by steady-state multiplicity and oscillatory behavior. This book chapter contribution demonstrates the application of novel process control strategies for sustainability by increasing material management, energy efficiency, and pollution prevention, as needed for SHC Sustainable Uses of Wastes and Materials Management.

  15. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  16. Welding process decoupling for improved control

    International Nuclear Information System (INIS)

    Hardt, D.E.; Eagar, T.W.; Lang, J.H.; Jones, L.

    1993-01-01

    The Gas Metal Arc Welding process is characterized by many important process outputs, all of which should be controlled to ensure consistently high-performance joints. However, application of multivariable control methods is confounded by the strong physical coupling of the typical outputs of bead shape and thermal properties. This coupling arises from the three-dimensional thermal diffusion processes inherent in welding, and cannot be overcome without significant process modification. This paper presents data on the extent of coupling of the process, and proposes process changes to overcome such strong output coupling. Work on rapid torch vibration to change the heat input distribution is detailed, and methods for changing the heat balance between base and fill material are described

  17. Internal Decoupling in Nonlinear Process Control

    Directory of Open Access Journals (Sweden)

    Jens G. Balchen

    1988-07-01

    Full Text Available A simple method has been investigated for the total or partial removal of the effect of non-linear process phenomena in multi-variable feedback control systems. The method is based upon computing the control variables which will drive the process at desired rates. It is shown that the effect of model errors in the linearization of the process can be partly removed through the use of large feedback gains. In practice there will be limits on how large the gains can be. The sensitivity to parameter errors is less pronounced and the transient behaviour is superior to that of ordinary PI controllers.
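
    The idea described in this record, computing the control input that drives the process at a desired rate and letting feedback gain suppress the residual model error, can be sketched for a scalar process. All numbers below (plant and model coefficients, gains) are illustrative assumptions, not values from the paper:

```python
def simulate(K, xsp=1.0, a_true=1.2, a_model=1.0, dt=0.001, t_end=20.0):
    """Internal-decoupling sketch: u cancels the *modeled* dynamics and
    commands a desired rate; the model error (a_true != a_model) is then
    suppressed by the feedback gain K."""
    x = 0.0
    for _ in range(int(t_end / dt)):
        v_des = K * (xsp - x)        # desired rate of change of the state
        u = v_des + a_model * x      # invert the model: dx/dt ~ -a_model*x + u
        dxdt = -a_true * x + u       # true (mismatched) plant dynamics
        x += dxdt * dt
    return xsp - x                   # steady-state tracking error

err_low = simulate(K=1.0)
err_high = simulate(K=10.0)
print(abs(err_high) < abs(err_low))  # larger gain -> smaller model-error effect
```

    The steady-state error here is 0.2*xsp/(K + 0.2), which shrinks as K grows, mirroring the paper's observation that large gains partly remove linearization errors but are bounded in practice.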

  18. A contribution to the test software for the VXI electronic cards of the Eurogam multidetector in a Unix/VXWorks distributed environment

    International Nuclear Information System (INIS)

    Kadionik, P.

    1992-01-01

    The Eurogam gamma-ray multidetector involves, in a first phase, 45 hyperpure Ge detectors, each surrounded by an anti-Compton shield of 10 BGO detectors. In order to ensure the highest reliability and an easy upgrade of the array, the electronic cards have been designed in the new VXI (VME Bus Extension to Instrumentation) standard; this makes it possible to drive the 495 detectors, with 4300 parameters to be adjusted by software. The data acquisition architecture is distributed over an Ethernet network. The set-up and test software for the VXI cards has been written in C; it uses a real-time kernel (VxWorks from Wind River Systems) interfaced to the Sun Unix environment. Inter-task communications use the Remote Procedure Call protocol. The inner shell of the software is connected to a database and to a graphics interface which gives engineers and physicists a very easy set-up for the many parameters to adjust
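
    The pattern described here, a Unix host adjusting card parameters on remote targets via Remote Procedure Calls, can be illustrated with Python's stdlib XML-RPC as a stand-in (the Eurogam software used ONC RPC between Sun Unix and VxWorks; the parameter name, port, and parameter store below are hypothetical):

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Hypothetical parameter store standing in for a VXI card's registers.
params = {}

def set_parameter(name, value):
    params[name] = value
    return True

# "Target side": an RPC server exposing the set-up call.
server = SimpleXMLRPCServer(("127.0.0.1", 8901), logRequests=False)
server.register_function(set_parameter)
threading.Thread(target=server.serve_forever, daemon=True).start()

# "Host side": adjust a detector parameter remotely, as the Eurogam
# set-up software did for its 4300 parameters.
client = ServerProxy("http://127.0.0.1:8901")
client.set_parameter("ge_hv_setpoint", 3500)
print(params["ge_hv_setpoint"])  # -> 3500
```

    The RPC call is synchronous, so the client knows the parameter was applied before proceeding, the same property that makes request-response RPC convenient for set-up and test sequences.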

  19. Run control techniques for the Fermilab DART data acquisition system

    International Nuclear Information System (INIS)

    Oleynik, G.; Engelfried, J.; Mengel, L.; Moore, C.; Pordes, R.; Udumula, L.; Votava, M.; Drunen, E. van; Zioulas, G.

    1996-01-01

    DART is the high speed, Unix based data acquisition system being developed by the Fermilab Computing Division in collaboration with eight High Energy Physics Experiments. This paper describes DART run-control which implements flexible, distributed, extensible and portable paradigms for the control and monitoring of data acquisition systems. We discuss the unique and interesting aspects of the run-control - why we chose the concepts we did, the benefits we have seen from the choices we made, as well as our experiences in deploying and supporting it for experiments during their commissioning and sub-system testing phases. We emphasize the software and techniques we believe are extensible to future use, and potential future modifications and extensions for those we feel are not. (author)

  20. Run control techniques for the Fermilab DART data acquisition system

    International Nuclear Information System (INIS)

    Oleynik, G.; Engelfried, J.; Mengel, L.

    1995-10-01

    DART is the high speed, Unix based data acquisition system being developed by the Fermilab Computing Division in collaboration with eight High Energy Physics Experiments. This paper describes DART run-control which implements flexible, distributed, extensible and portable paradigms for the control and monitoring of data acquisition systems. We discuss the unique and interesting aspects of the run-control - why we chose the concepts we did, the benefits we have seen from the choices we made, as well as our experiences in deploying and supporting it for experiments during their commissioning and sub-system testing phases. We emphasize the software and techniques we believe are extensible to future use, and potential future modifications and extensions for those we feel are not

  1. Statistical process control for serially correlated data

    NARCIS (Netherlands)

    Wieringa, Jakob Edo

    1999-01-01

    Statistical Process Control (SPC) aims at quality improvement through reduction of variation. The best known tool of SPC is the control chart. Over the years, the control chart has proved to be a successful practical technique for monitoring process measurements. However, its usefulness in practice
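
    The control chart mentioned in this record can be sketched as an individuals chart with 3-sigma limits estimated from the moving range. Note that these classic limits assume independent observations, exactly the assumption that serial correlation (the subject of this thesis) violates. The data are illustrative:

```python
import statistics

def shewhart_limits(samples):
    """Individuals control chart: center line and 3-sigma limits, with
    sigma estimated from the mean moving range (d2 = 1.128 for moving
    ranges of size 2). Valid only for independent observations."""
    mr = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma = statistics.mean(mr) / 1.128
    center = statistics.mean(samples)
    return center - 3 * sigma, center, center + 3 * sigma

# Illustrative in-control baseline measurements of some process variable.
baseline = [10.1, 9.9, 10.2, 10.0, 9.8, 10.1, 10.0, 10.2]
lcl, cl, ucl = shewhart_limits(baseline)
print(lcl <= 10.05 <= ucl)   # True: common-cause variation, no signal
print(lcl <= 12.5 <= ucl)    # False: special cause, the chart signals
```

    With serially correlated data the moving range underestimates or overestimates the true process sigma, which is why the thesis's adaptations are needed.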

  2. An X window based graphics user interface for radiation information processing system developed with object-oriented programming technology

    International Nuclear Information System (INIS)

    Gao Wenhuan; Fu Changqing; Kang Kejun

    1993-01-01

    X Window is a network-oriented and network-transparent windowing system, now dominant in the Unix domain. Object-oriented programming technology can remarkably improve the extensibility of a software system. An introduction to graphics user interfaces is given, and how to develop a graphics user interface for a radiation information processing system with object-oriented programming technology, based on X Window and independent of the application, is described briefly

  3. Expert systems in process control systems

    International Nuclear Information System (INIS)

    Wittig, T.

    1987-01-01

    To illustrate where the fundamental difference between expert systems in classical diagnosis and in industrial control lies, the work of process control instrumentation is used as an example of the job of expert systems. Starting from the general process of problem-solving, two classes of expert systems can be defined accordingly. (orig.) [de

  4. Engineering Process Monitoring for Control Room Operation

    CERN Document Server

    Bätz, M

    2001-01-01

    A major challenge in process operation is to reduce costs and increase system efficiency, whereas the complexity of automated process engineering, control and monitoring systems increases continuously. To cope with this challenge, the design, implementation and operation of process monitoring systems for control room operation have to be treated as an ensemble. This is only possible if the engineering of the monitoring information is focused on the production objective and is led in close collaboration by control room teams, exploitation personnel and process specialists. In this paper some principles for the engineering of monitoring information for control room operation are developed using the example of the exploitation of a particle accelerator at the European Laboratory for Nuclear Research (CERN).

  5. Modern control of mineral wool production process

    Directory of Open Access Journals (Sweden)

    Stankov Stanko P.

    2013-01-01

    Full Text Available In this paper, the control of a plant for mineral wool production, consisting of a number of technological units of different sizes and complexity, is considered. The application of modern equipment based on PLC (Programmable Logic Controller) and SCADA (Supervisory Control And Data Acquisition) configurations provides optimal control of the technological process. The described supervisory and control system consists of a number of units performing decentralized, distributed control of the technological entities, where all possible situations arising during the operation of the machines and devices are considered; the drives are electrically actuated and protected from technological and electrical accidents. The transformer station and diesel engine, raw material transport and dosage, the processes in the dome oven, the centrifuges, the polycondensation (PC) chamber, the burners, the compressor station, binder preparation and dosage, wool cutting, finished panel packing and transport to the storehouse are all controlled. Process variables and parameters such as level, flow, velocity, temperature and pressure are controlled. The control system identifies changes of process state, performs diagnostics and prediction of errors, predicts the behavior of the controlled objects given the input flows of materials, and generates optimal values of the control variables so as to decrease downtime and to meet the technical and economic requirements connected to wool quality. The supervisory and control system either eliminates unwanted changes in the production line or restricts them within the limits allowed by the technology. In this way, optimization of energy and raw material consumption and appropriate product quality are achieved, with requirements satisfied in accordance with process safety and environmental standards. SCADA provides a visual representation of controlled and uncontrolled parts of the technological process, processing of alarms and events, monitoring of the changes of relevant

  6. Design of SPring-8 control system

    International Nuclear Information System (INIS)

    Wada, T.; Kumahara, T.; Yonehara, H.; Yoshikawa, H.; Masuda, T.; Wang Zhen

    1992-01-01

    The control system of the SPring-8 facility is designed. A distributed computer system is adopted, with three hierarchical levels. All the computers are linked by computer networks. The upper-level network is a high-speed multi-media LAN such as FDDI, which links the sub-system control computers; the middle level uses Ethernet or MAP networks, which link front-end processors (FEP) such as VME systems. The lowest level is a field bus which links the VME systems and the controlled devices. Workstations (WS) or X-terminals serve as man-machine interfaces. For the operating system (OS), UNIX is used for the upper-level computers and real-time OSs for the FEPs. We will select hardware and OSs whose specifications are close to international standards. Since the cost of software has recently become higher than that of hardware, we introduce as many computer-aided tools as possible for program development. (author)

  7. New Process Controls for the Hera Cryogenic Plant

    Science.gov (United States)

    Böckmann, T.; Clausen, M.; Gerke, Chr.; Prüß, K.; Schoeneburg, B.; Urbschat, P.

    2010-04-01

    The cryogenic plant built for the HERA accelerator at DESY in Hamburg (Germany) has now been in operation for more than two decades. The commercial process control system for the cryogenic plant has been in operation for the same period. Since then, the operator stations, the control network and the CPU boards in the process controllers have gone through several upgrade stages. Only the centralized input/output system was kept unchanged. Many components have been running beyond their expected lifetime. The control system for one of the three parts of the cryogenic plant has recently been replaced by a distributed I/O system. The I/O nodes are connected to several Profibus-DP field buses. Profibus provides the infrastructure to attach intelligent sensors and actuators directly to the process controllers, which run the open-source process control software EPICS. This paper describes the modification process on all levels, from cabling through I/O configuration and the process control software up to the operator displays.

  8. Advanced Map For Real-Time Process Control

    Science.gov (United States)

    Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto

    1987-10-01

    MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of the data exchanged with various devices for distributed control are important in the case of real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point, request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicast communication voluntarily and periodically, in the priority order of the data to be exchanged.
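
    The proposed transmission control, sending data voluntarily and periodically in priority order rather than by request-response, can be modeled as a toy scheduler. The tag names, priorities and periods are invented for illustration; the actual MAP extension operates at the protocol layer, not in application code:

```python
import heapq

def schedule(tags, cycles):
    """Emit tags cyclically, each at its own period, serving the
    highest-priority (lowest number) due tag first: a toy model of
    voluntary, periodic, priority-ordered transmission."""
    # heap entries: (next_due_cycle, priority, name, period)
    heap = [(0, prio, name, period) for name, prio, period in tags]
    heapq.heapify(heap)
    log = []
    for now in range(cycles):
        while heap and heap[0][0] <= now:
            due, prio, name, period = heapq.heappop(heap)
            log.append((now, name))
            heapq.heappush(heap, (now + period, prio, name, period))
    return log

# (name, priority, period-in-cycles); all values are illustrative.
tags = [("alarm_status", 0, 1), ("flow_rate", 1, 2), ("batch_report", 2, 5)]
log = schedule(tags, 5)
print(log[:3])  # cycle 0 serves every tag, highest priority first
```

    High-priority data (here the per-cycle alarm word) dominates the schedule while low-priority bulk data still gets its periodic slot, which is the throughput argument the paper makes against pure request-response transactions.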

  9. On-line optimal control improves gas processing

    International Nuclear Information System (INIS)

    Berkowitz, P.N.; Papadopoulos, M.N.

    1992-01-01

    The authors' companies jointly funded the first phase of a gas processing liquids optimization project with the specific purposes to: improve the return from processing natural gas liquids; develop sets of control algorithms; make available a low-cost solution suitable for small to medium-sized gas processing plants; and test and demonstrate the feasibility of on-line control. The ARCO Willard CO2 gas recovery processing plant was chosen as the initial test site to demonstrate the application of multivariable on-line optimal control. One objective of this project is to support an R&D effort to provide a standardized solution for the various types of gas processing plants in the U.S. Processes involved in these gas plants include cryogenic separations, demethanization, lean oil absorption, fractionation and gas treating. The proposed solutions had to be simple yet comprehensive enough to allow an operator to maintain product specifications while operating over a wide range of gas input flow and composition. This had to be a supervisory system that remained on-line more than 95% of the time, and achieved reduced plant operating variability and improved variable cost control. It took more than a year to study various gas processes and to develop a control approach before a real application was finally exercised. An initial process for C2 and CO2 recoveries was chosen

  10. The Digital Motion Control System for the Submillimeter Array Antennas

    Science.gov (United States)

    Hunter, T. R.; Wilson, R. W.; Kimberk, R.; Leiker, P. S.; Patel, N. A.; Blundell, R.; Christensen, R. D.; Diven, A. R.; Maute, J.; Plante, R. J.; Riddle, P.; Young, K. H.

    2013-09-01

    We describe the design and performance of the digital servo and motion control system for the 6-meter parabolic antennas of the Submillimeter Array (SMA) on Mauna Kea, Hawaii. The system is divided into three nested layers, each operating at a different, appropriate bandwidth. (1) A rack-mounted, real-time Unix system runs the position loop, which reads the high resolution azimuth and elevation encoders and sends velocity and acceleration commands at 100 Hz to a custom-designed servo control board (SCB). (2) The microcontroller-based SCB reads the motor axis tachometers and implements the velocity loop by sending torque commands to the motor amplifiers at 558 Hz. (3) The motor amplifiers implement the torque loop by monitoring and sending current to the three-phase brushless drive motors at 20 kHz. The velocity loop uses a traditional proportional-integral-derivative (PID) control algorithm, while the position loop uses only a proportional term and implements a command shaper based on the Gauss error function. Calibration factors and software filters are applied to the tachometer feedback prior to the application of the servo gains in the torque computations. All of these parameters are remotely adjustable in the software. The three layers of the control system monitor each other and are capable of shutting down the system safely if a failure or anomaly occurs. The Unix system continuously relays the antenna status to the central observatory computer via reflective memory. In each antenna, a Palm Vx hand controller displays the complete system status and allows full local control of the drives in an intuitive touchscreen user interface. The hand controller can also be connected outside the cabin, a major convenience during the frequent reconfigurations of the interferometer. Excellent tracking performance (~0.3 arcsec rms) is achieved with this system. It has been in reliable operation on 8 antennas for over 10 years and has required minimal maintenance.
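
    The velocity-loop PID and the erf-based command shaper described in this record can be sketched as follows. The gains, motor time constant, and the mapping of time onto the erf argument are illustrative guesses, not the SMA's actual servo parameters:

```python
import math

class PID:
    """Textbook PID for a velocity loop; gains here are illustrative."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def erf_shaper(t, t_total):
    """Smooth 0 -> 1 command profile built on the Gauss error function,
    in the spirit of the position-loop command shaper; the argument
    mapping (t spanning erf(-2)..erf(2)) is an assumption."""
    return 0.5 * (1.0 + math.erf(4.0 * t / t_total - 2.0))

# Drive a first-order motor model (time constant tau) with the PID.
tau, dt = 0.05, 0.001
pid = PID(kp=40.0, ki=400.0, kd=0.0, dt=dt)
v = 0.0
for i in range(2000):                      # 2 s of simulated motion
    cmd = erf_shaper(i * dt, 2.0)          # shaped velocity command
    torque = pid.step(cmd, v)
    v += (torque - v / tau) * dt           # motor model: dv/dt = u - v/tau
print(abs(v - erf_shaper(2.0, 2.0)) < 0.05)  # loop tracks the shaped command
```

    The shaper's value is that commanded acceleration starts and ends near zero, so the drive never receives a step, which suppresses structural excitation during slews.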

  11. PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    B.P. Mahesh

    2010-09-01

    Full Text Available Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on process rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.

  12. Quantum Control of Molecular Processes

    CERN Document Server

    Shapiro, Moshe

    2012-01-01

    Written by two of the world's leading researchers in the field, this is a systematic introduction to the fundamental principles of coherent control and to the underlying physics and chemistry. This fully updated second edition is enhanced by 80% and covers the latest techniques and applications, including nanostructures, attosecond processes, optical control of chirality, and weak and strong field quantum control. Developments and challenges in decoherence-sensitive condensed phase control as well as in bimolecular control are clearly described. Indispensable for atomic, molecular and chemical

  13. Practical Implementations of Advanced Process Control for Linear Systems

    DEFF Research Database (Denmark)

    Knudsen, Jørgen K . H.; Huusom, Jakob Kjøbsted; Jørgensen, John Bagterp

    2013-01-01

    This paper describes some practical problems encountered when implementing Advanced Process Control (APC) schemes on linear processes. The implemented APC controllers discussed are LQR, Riccati MPC and Condensed MPC controllers, illustrated by simulation of the Four Tank Process and a lineari... on pilot plant equipment at the Department of Chemical Engineering, DTU Lyngby.

  14. The Vivitron process control

    International Nuclear Information System (INIS)

    Lutz, J.R.; Marsaudon, J.C.

    1989-10-01

    The operation of the VIVITRON electrostatic accelerator, designed since 1981 and under construction at the CRN since 1985, needs a dedicated process control set-up. The study and design of this control system started in 1987. Electrostatic accelerators are rarely operated by modern control systems, so little knowledge is available in this field. The timing constraints are generally weak, but the Vivitron's specific structure, with seven porticos in the tank and sophisticated beam handling in the terminal, imposes control equipment inside the tank under extremely severe conditions. Several steps are necessary to achieve the full-size control system. Tests in the MP, used as a pilot machine, supplied practical information about the accelerator conditions inside the tank. They also provided better knowledge of the beam behavior, especially inside the accelerator tube

  15. Intelligent Controller Design for a Chemical Process

    OpenAIRE

    Glan Devadhas, G.; Pushpakumar, S.

    2010-01-01

    Chemical process control is a challenging problem due to the strong on-line non-linearity and extreme sensitivity to disturbances of the process. Ziegler-Nichols tuned PI and PID controllers are found to provide poor performance for higher-order and non-linear systems. This paper presents an application of a one-step-ahead fuzzy as well as ANFIS (adaptive-network-based fuzzy inference system) tuning scheme for a Continuous Stirred Tank Reactor (CSTR) process. The controller is designed based ...

  16. Fermentation process using specific oxygen uptake rates as a process control

    Science.gov (United States)

    Van Hoek, Pim [Minnetonka, MN; Aristidou, Aristos [Maple Grove, MN; Rush, Brian [Minneapolis, MN

    2011-05-10

    Specific oxygen uptake rate (OUR) is used as a process control parameter in fermentation processes. OUR is determined during at least the production phase of a fermentation process, and process parameters are adjusted to maintain the OUR within desired ranges. The invention is particularly applicable when the fermentation is conducted using a microorganism having a natural PDC pathway that has been disrupted so that it no longer functions. Microorganisms of this sort often produce poorly under strictly anaerobic conditions. Microaeration controlled by monitoring OUR allows the performance of the microorganism to be optimized.
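
    The control strategy in this record, measure OUR and trim aeration to hold it in a target band, can be sketched minimally. The band, step size, units, and readings below are invented for illustration and are not values from the patent:

```python
def adjust_airflow(our, airflow, our_lo=2.0, our_hi=3.0, step=0.1):
    """Keep specific OUR (here in arbitrary illustrative units) inside a
    target band by trimming the air sparge rate: a bang-bang outline of
    the microaeration control described above."""
    if our < our_lo:
        return airflow + step   # cells are oxygen-limited: aerate more
    if our > our_hi:
        return airflow - step   # too aerobic: throttle the sparge back
    return airflow              # in band: leave the setpoint alone

airflow = 1.0
readings = [1.8, 1.9, 2.5, 3.4, 3.2, 2.6]   # successive OUR measurements
for our in readings:
    airflow = adjust_airflow(our, airflow)
print(round(airflow, 2))  # -> 1.0 (two increments, then two decrements)
```

    A production controller would typically use a PI law on the OUR error rather than fixed steps, but the structure, measure, compare to band, move the manipulated variable, is the same.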

  17. A commercial real-time manufacturing integration platform for the new control system on FTU

    International Nuclear Information System (INIS)

    Panella, M.; Bertocchi, A.; Bozzolan, V.; Buceti, G.; Centioli, C.; Imparato, A.; Mazza, G.; Torelli, C.; Vitale, V.

    1999-01-01

    In 1994 a working group was set up in Frascati to investigate how to build a new control system for FTU (Frascati tokamak upgrade), considering the evolution of information technology. Strong emphasis was placed on the use of standard solutions (be they de-facto or de-jure) and commercial platforms wherever possible. This paper describes our operational experience with the new control system based on the commercial DEC BASEstar family of products. BASEstar is based on client/server computing technologies, providing an environment to collect, process, manage, distribute and integrate real-time manufacturing data. UNIX, VMS, PC Windows and OS-9 are integrated to handle hosts, PCs and VME CPUs. A 4GL programming language, CIMfast, has been used to handle the tokamak discharge via automatic procedures. X11-based mimics are available to display the plant status. Real flexibility of the whole system has been experienced, and further use of this system has been planned for the ITER DTP (divertor test platform). (orig.)

  18. Screen-based process control in nuclear plants

    International Nuclear Information System (INIS)

    Hinz, W.; Arnoldt, C.; Hessler, C.

    1993-01-01

    Requirements, development and conceptual design of a screen-based control room for nuclear power plants are outlined. The control room consists of three or four equally equipped operator workstations comprising screens for process information and manual process control. A plant overview assists coordination among the operators. A safety-classified backup system (safety control area) is provided to cover postulated failures of the control means. Some aspects of ergonomic validation and of future development trends are discussed. (orig.) [de

  19. Microeconomics of process control in semiconductor manufacturing

    Science.gov (United States)

    Monahan, Kevin M.

    2003-06-01

    Process window control enables accelerated design-rule shrinks for both logic and memory manufacturers, but simple microeconomic models that directly link the effects of process window control to maximum profitability are rare. In this work, we derive these links using a simplified model for the maximum rate of profit generated by the semiconductor manufacturing process. We show that the ability of process window control to achieve these economic objectives may be limited by variability in the larger manufacturing context, including measurement delays and process variation at the lot, wafer, x-wafer, x-field, and x-chip levels. We conclude that x-wafer and x-field CD control strategies will be critical enablers of density, performance and optimum profitability at the 90 and 65nm technology nodes. These analyses correlate well with actual factory data and often identify millions of dollars in potential incremental revenue and cost savings. As an example, we show that a scatterometry-based CD Process Window Monitor is an economically justified, enabling technology for the 65nm node.

  20. Security of legacy process control systems : Moving towards secure process control systems

    NARCIS (Netherlands)

    Oosterink, M.

    2012-01-01

    This white paper describes solutions which organisations may use to improve the security of their legacy process control systems. When we refer to a legacy system, we generally refer to old methodologies, technologies, computer systems or applications which are still in use, despite the fact that

  1. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    Science.gov (United States)

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It is therefore necessary to put portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to state whether it is possible to substitute portal dosimetry for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphic tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation, showing drifts both slow and weak and both strong and fast, have been established and have revealed a special cause (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to

  2. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry

    International Nuclear Information System (INIS)

    Villani, N.; Noel, A.; Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A.; Francois, P.

    2010-01-01

    Purpose: The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It is therefore necessary to put portal dosimetry measurements under control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to state whether it is possible to substitute portal dosimetry for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. Patients and methods: At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphic tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Results: Control charts of the mean and standard deviation, showing drifts both slow and weak and both strong and fast, have been established and have revealed a special cause (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion: The study made it possible to

  3. Application of Statistical Process Control (SPC) in Quality Control

    Directory of Open Access Journals (Sweden)

    Carlos Hernández-Pedrera

    2015-12-01

    Full Text Available The overall objective of this paper is to use SPC to assess the possibility of improving the process of obtaining a sanitary device. As specific objectives, we set out to identify the variables to be analyzed for statistical process control (SPC), to analyze the possible errors and variations indicated by the control charts, and to evaluate and compare the results achieved with SPC before and after direct monitoring in the production line. Sampling and laboratory methods were used to determine the quality of the finished product; statistical methods were then applied, emphasizing the importance and contribution of their application in monitoring corrective actions and supporting production processes. It was shown that the process is under control, because the results fell within the established control limits. There is a tendency for the distribution to be displaced toward one end of the limits, and under certain conditions it exceeds them, creating the possibility that the process goes out of control. The results also showed that the process, while within the quality control limits, is operating far from optimal conditions. In none of the study situations were products obtained outside the weight and discoloration limits, but defective products were obtained.
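
    The condition this study reports, a process inside its control limits yet operating far from optimal conditions, is exactly what capability indices quantify. A minimal sketch follows; the specification limits and weight data are illustrative, not the paper's:

```python
import statistics

def cp_cpk(samples, lsl, usl):
    """Process capability indices: Cp compares the specification width
    to the 6-sigma process spread; Cpk also penalizes off-center
    operation (in control, but away from the optimum)."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative product weights (g) against hypothetical spec limits.
weights = [98.2, 98.5, 98.1, 98.4, 98.3, 98.6, 98.2, 98.4]
cp, cpk = cp_cpk(weights, lsl=97.0, usl=100.0)
print(cp > cpk)  # True: the process runs off-center within its specs
```

    A Cpk markedly below Cp signals the drift toward one specification limit that the abstract describes, even while every individual result stays inside the limits.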

  4. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center ({+-}4% of deviation between the calculated and measured doses) by calculating a control process capability (C{sub pc}) index. The C{sub pc} index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should

  5. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. 
Therefore, when analyzed in real time, during quality controls, they should improve the
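
The EWMA chart named in the two records above can be sketched generically. This is not the authors' chart or data; the smoothing weight (lambda = 0.2) and limit width (L = 3) are common textbook choices, used here to show how an EWMA chart flags a slow drift well before raw values leave fixed 3-sigma limits:

```python
# Generic EWMA control chart with time-varying limits.
import math

def ewma_chart(x, target, sigma, lambda_=0.2, L=3.0):
    """Return (EWMA values, indices where the chart signals)."""
    z, zs, signals = target, [], []
    for i, xi in enumerate(x):
        z = lambda_ * xi + (1 - lambda_) * z
        zs.append(z)
        # exact (time-varying) EWMA limit half-width
        var = (lambda_ / (2 - lambda_)) * (1 - (1 - lambda_) ** (2 * (i + 1)))
        if abs(z - target) > L * sigma * math.sqrt(var):
            signals.append(i)
    return zs, signals

# A slow drift of +0.3 sigma per step: the EWMA signals before the raw
# values ever exceed the fixed +/- 3 sigma Shewhart limits (at step 10).
drift = [0.3 * k for k in range(12)]
_, sig = ewma_chart(drift, target=0.0, sigma=1.0)
print(sig[:1])
```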

  6. Discrete Control Processes, Dynamic Games and Multicriterion Control Problems

    Directory of Open Access Journals (Sweden)

    Dumitru Lozovanu

    2002-07-01

    Full Text Available Discrete control processes with state evaluation in time of a dynamical system are considered. A general model of control problems with an integral-time cost criterion along a trajectory is studied, and a general scheme for solving such classes of problems is proposed. In addition, game-theoretical and multicriterion models for control problems are formulated and studied.
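
A toy instance of this class of problems can make the setup concrete: a finite-horizon discrete control process whose summed ("integral-time") cost is minimized by backward dynamic programming. The states, controls and cost function below are invented for illustration and are not from the paper:

```python
# Backward dynamic programming for a finite-horizon discrete control
# process: V[t][x] = min_u cost(x, u) + V[t+1][step(x, u)].
def solve(states, controls, step, cost, T, terminal):
    V = {x: terminal(x) for x in states}
    policy = []
    for _ in range(T):
        newV, pol = {}, {}
        for x in states:
            best = min((cost(x, u) + V[step(x, u)], u) for u in controls)
            newV[x], pol[x] = best
        V = newV
        policy.append(pol)
    policy.reverse()           # policy[0] is the first-stage decision rule
    return V, policy

# Drive an integer state toward 0 on the grid {-2..2} with moves u in
# {-1, 0, 1}; the stage cost |x| + |u|/2 penalizes both error and effort.
clamp = lambda v: max(-2, min(2, v))
V, policy = solve(range(-2, 3), (-1, 0, 1),
                  step=lambda x, u: clamp(x + u),
                  cost=lambda x, u: abs(x) + 0.5 * abs(u),
                  T=4, terminal=lambda x: abs(x))
print(V[2], policy[0][2])
```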

  7. Processing implicit control: evidence from reading times

    Directory of Open Access Journals (Sweden)

    Michael McCourt

    2015-10-01

    Full Text Available Sentences such as The ship was sunk to collect the insurance exhibit an unusual form of anaphora, implicit control, where neither anaphor nor antecedent is audible. The nonfinite reason clause has an understood subject, PRO, that is anaphoric; here it may be understood as naming the agent of the event of the host clause. Yet since the host is a short passive, this agent is realized by no audible dependent. The putative antecedent to PRO is therefore implicit, which it normally cannot be. What sorts of representations subserve the comprehension of this dependency? Here we present four self-paced reading time studies directed at this question. Previous work showed no processing cost for implicit versus explicit control, and took this to support the view that PRO is linked syntactically to a silent argument in the passive. We challenge this conclusion by reporting that we also find no processing cost for remote implicit control, as in: The ship was sunk. The reason was to collect the insurance. Here the dependency crosses two independent sentences, and so cannot, we argue, be mediated by syntax. Our Experiments 1-4 examined the processing of both implicit (short passive and explicit (active or long passive control in both local and remote configurations. Experiments 3 and 4 added either three days ago or just in order to the local conditions, to control for the distance between the passive and infinitival verbs, and for the predictability of the reason clause, respectively. We replicate the finding that implicit control does not impose an additional processing cost. But critically we show that remote control does not impose a processing cost either. Reading times at the reason clause were never slower when control was remote. In fact they were always faster. 
Thus efficient processing of local implicit control cannot show that implicit control is mediated by syntax; nor, in turn, that there is a silent but grammatically active argument in passives.

  8. Object oriented programming interfaces for accelerator control

    International Nuclear Information System (INIS)

    Hoff, L.T.

    1997-01-01

    Several years ago, the AGS controls group was given the task of developing software for the RHIC accelerator. Like the AGS, the RHIC control system needs to control and monitor equipment distributed around a relatively large geographic area. A local area network connects this equipment to a collection of UNIX workstations in a central control room. Similar software had been developed for the AGS about a decade earlier, but it isn't well suited for RHIC use for a number of reasons. Rather than adapt the AGS software for RHIC use, the controls group opted to start with a clean slate and develop software that would address the shortcomings of the AGS software, while preserving the useful features that evolved through years of use. A current trend in control system design is to provide an object oriented programming interface for application developers. This talk will discuss important aspects and features of object oriented application programming interfaces (APIs) for accelerator control systems, and explore why such interfaces are becoming the norm.

  9. High-Level Waste (HLW) Feed Process Control Strategy

    International Nuclear Information System (INIS)

    STAEHR, T.W.

    2000-01-01

    The primary purpose of this document is to describe the overall process control strategy for monitoring and controlling the functions associated with the Phase 1B high-level waste feed delivery. This document provides the basis for the process monitoring and control functions and requirements needed throughout the double-shell tank system during Phase 1 high-level waste feed delivery. This document is intended to be used by (1) the developers of the future Process Control Plan and (2) the developers of the monitoring and control system.

  10. Trusted Unix Working Group (TRUSIX) Rationale for Selecting Access Control List Features for the Unix System

    National Research Council Canada - National Science Library

    1989-01-01

    .... By addressing the class B3 issues, the NCSC believes that this information will also help vendors understand how evaluation interpretations will be made at the levels of trust below this class...

  11. The monitoring and control of TRUEX processes

    International Nuclear Information System (INIS)

    Regalbuto, M.C.; Misra, B.; Chamberlain, D.B.; Leonard, R.A.; Vandegrift, G.F.

    1992-04-01

    The Generic TRUEX Model (GTM) was used to design a flowsheet for the TRUEX solvent extraction process that would be used to determine its instrumentation and control requirements. Sensitivity analyses of the key process variables, namely, the aqueous and organic flow rates, feed compositions, and the number of contactor stages, were carried out to assess their impact on the operation of the TRUEX process. Results of these analyses provide a basis for the selection of an instrument and control system and the eventual implementation of a control algorithm. Volume Two of this report is an evaluation of the instruments available for measuring many of the physical parameters. Equations that model the dynamic behavior of the TRUEX process have been generated. These equations can be used to describe the transient or dynamic behavior of the process for a given flowsheet in accordance with the TRUEX model. Further work will be done with the dynamic model to determine how and how quickly the system responds to various perturbations. The use of perturbation analysis early in the design stage will lead to a robust flowsheet, namely, one that will meet all process goals and allow for wide control bounds. The process time delay, that is, the speed with which the system reaches a new steady state, is an important parameter in monitoring and controlling a process. In the future, instrument selection and point-of-variable measurement, now done using the steady-state results reported here, will be reviewed and modified as necessary based on this dynamic method of analysis

  12. Controlling Laboratory Processes From A Personal Computer

    Science.gov (United States)

    Will, H.; Mackin, M. A.

    1991-01-01

    Computer program provides natural-language process control from IBM PC or compatible computer. Sets up process-control system that either runs without an operator or is run by workers who have limited programming skills. Includes three smaller programs. Two of them, written in FORTRAN 77, record data and control research processes. Third program, written in Pascal, generates FORTRAN subroutines used by other two programs to identify user commands with device-driving routines written by user. Also includes set of input data allowing user to define user commands to be executed by computer. Requires personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. Also requires FORTRAN 77 compiler and device drivers written by user.

  13. Case Studies in Modelling, Control in Food Processes.

    Science.gov (United States)

    Glassey, J; Barone, A; Montague, G A; Sabou, V

    This chapter discusses the importance of modelling and control in increasing food process efficiency and ensuring product quality. Various approaches to both modelling and control in food processing are set in the context of the specific challenges in this industrial sector and latest developments in each area are discussed. Three industrial case studies are used to demonstrate the benefits of advanced measurement, modelling and control in food processes. The first case study illustrates the use of knowledge elicitation from expert operators in the process for the manufacture of potato chips (French fries) and the consequent improvements in process control to increase the consistency of the resulting product. The second case study highlights the economic benefits of tighter control of an important process parameter, moisture content, in potato crisp (chips) manufacture. The final case study describes the use of NIR spectroscopy in ensuring effective mixing of dry multicomponent mixtures and pastes. Practical implementation tips and infrastructure requirements are also discussed.

  14. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  15. Fuzzy Control in the Process Industry

    DEFF Research Database (Denmark)

    Jantzen, Jan; Verbruggen, Henk; Østergaard, Jens-Jørgen

    1999-01-01

    Control problems in the process industry are dominated by non-linear and time-varying behaviour, many inner loops, and much interaction between the control loops. Fuzzy controllers have in some cases nevertheless mimicked the control actions of a human operator. Simple fuzzy controllers can be designed starting from PID controllers, and in more complex cases these can be used in connection with model-based predictive control. For high level control and supervisory control several simple controllers can be combined in a priority hierarchy such as the one developed in the cement industry...

  16. Control of Neutralization Process Using Soft Computing

    Directory of Open Access Journals (Sweden)

    G. Balasubramanian

    2008-03-01

    Full Text Available A novel model-based nonlinear control strategy is proposed using an experimental pH neutralization process. The control strategy involves a nonlinear neural network (NN) model in the context of internal model control (IMC). When integrated into the internal model control scheme, the resulting controller is shown to have favorable practical implications as well as superior performance. The designed model-based online IMC controller was implemented on a laboratory-scale pH process in real time using a dSPACE 1104 interface card. The responses of pH and acid flow rate show good tracking for both set-point and load changes over the entire nonlinear region.

  17. Implementación de un servidor FTP utilizando el modelo cliente/servidor mediante el uso de sockets en lenguaje c UNIX con el fin de mejorar los tiempos de respuesta en la red

    OpenAIRE

    Murillo, Juan de Dios; Caamaño Polini, Santiago

    2016-01-01

    This work aims to evaluate the latency of file transfers using an FTP server with a client-server model, employing a computer running the Fedora operating system to execute the client/server code with sockets in C under UNIX, in order to simulate a server holding files of different formats and sizes, and to measure the transmission latency when uploading and downloading files from the server using different buffer sizes. With the r...
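
The measurement described above can be approximated in a few lines of Python (rather than the C of the study) for illustration: a throwaway loopback TCP server and a client that times an upload for different buffer sizes. The payload and buffer sizes are invented; this is not the authors' code.

```python
# Time a chunked upload over a loopback TCP connection.
import socket, threading, time

def serve(sock):
    """Accept one connection, count the bytes received, reply with the count."""
    conn, _ = sock.accept()
    with conn:
        total = 0
        while data := conn.recv(65536):
            total += len(data)
        conn.sendall(str(total).encode())

def timed_transfer(payload, bufsize):
    """Time an upload of `payload` sent in `bufsize` chunks over loopback."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    threading.Thread(target=serve, args=(srv,), daemon=True).start()
    t0 = time.perf_counter()
    with socket.create_connection(srv.getsockname()) as cli:
        for i in range(0, len(payload), bufsize):
            cli.sendall(payload[i:i + bufsize])
        cli.shutdown(socket.SHUT_WR)   # signal end of upload
        ack = cli.recv(64)
    srv.close()
    assert int(ack) == len(payload)
    return time.perf_counter() - t0

payload = b"x" * 1_000_000
for bufsize in (1024, 65536):
    print(bufsize, round(timed_transfer(payload, bufsize), 4))
```

On loopback the timing differences between buffer sizes are small; over a real network, syscall count and round trips make the buffer size matter much more.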

  18. The product composition control system at Savannah River: Statistical process control algorithm

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate and less radioactive water soluble salts. In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository. Described here is the Product Composition Control System (PCCS) process control algorithm. The PCCS is the amalgam of computer hardware and software intended to ensure that the melt will be processable and that the glass wasteform produced will be acceptable. Within PCCS, the Statistical Process Control (SPC) Algorithm is the means which guides control of the DWPF process. The SPC Algorithm is necessary to control the multivariate DWPF process in the face of uncertainties arising from the process, its feeds, sampling, modeling, and measurement systems. This article describes the functions performed by the SPC Algorithm, characterization of DWPF prior to making product, accounting for prediction uncertainty, accounting for measurement uncertainty, monitoring a SME batch, incorporating process information, and advantages of the algorithm. 9 refs., 6 figs

  19. Overview of advanced process control in welding within ERDA

    International Nuclear Information System (INIS)

    Armstrong, R.E.

    1977-01-01

    The special kinds of demands placed on ERDA weapons and reactors require them to have very reliable welds. Process control is critical in achieving this reliability. ERDA has a number of advanced process control projects underway with much of the emphasis being on electron beam welding. These include projects on voltage measurement, beam-current control, beam focusing, beam spot tracking, spike suppression, and computer control. A general discussion of process control in welding is followed by specific examples of some of the advanced joining process control projects in ERDA

  20. Functional Dual Adaptive Control with Recursive Gaussian Process Model

    International Nuclear Information System (INIS)

    Prüher, Jakub; Král, Ladislav

    2015-01-01

    The paper deals with the dual adaptive control problem, where the functional uncertainties in the system description are modelled by a non-parametric Gaussian process regression model. Current approaches to adaptive control based on Gaussian process models are severely limited in their practical applicability, because the model is re-adjusted using all the currently available data, which keeps growing with every time step. We propose the use of a recursive Gaussian process regression algorithm for a significant reduction in computational requirements, thus bringing Gaussian process-based adaptive controllers closer to practical applicability. In this work, we design a bi-criterial dual controller based on a recursive Gaussian process model for discrete-time stochastic dynamic systems given in an affine-in-control form. Using Monte Carlo simulations, we show that the proposed controller achieves comparable performance with the full Gaussian process-based controller in terms of control quality while keeping the computational demands bounded. (paper)
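
The paper's recursive algorithm is not reproduced here; the sketch below is only a plain batch Gaussian process regression (squared-exponential kernel, arbitrary hyperparameters) showing the kind of mean/variance prediction such a model supplies to the controller. The recursive variant would update these quantities incrementally instead of refactorizing the full kernel matrix at every step.

```python
# Batch GP regression: posterior mean and variance at test points.
import numpy as np

def kernel(a, b, length=1.0, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x, y, xs, noise=1e-4):
    K = kernel(x, x) + noise * np.eye(len(x))
    Ks, Kss = kernel(x, xs), kernel(xs, xs)
    mean = Ks.T @ np.linalg.solve(K, y)
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

x = np.linspace(0, 3, 20)
y = np.sin(x)                       # toy "unknown" system response
mean, var = gp_predict(x, y, np.array([1.5]))
print(float(mean[0]))               # close to sin(1.5)
```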

  1. Two-process approach to electron beam welding control

    International Nuclear Information System (INIS)

    Lastovirya, V.N.

    1987-01-01

    The analysis and synthesis of multi-dimensional welding control systems, which require the use of computers, should be conducted in the time domain. From the standpoint of general control theory, two approaches to electron beam welding are possible: one-process and two-process. In the two-process approach, the subprocesses of heat source formation and direct metal melting are separated. The two-process approach leads to a two-profile control system and provides complete controllability of electron beam welding in the framework of systems with lumped as well as distributed parameters. The choice of approach for a given problem is determined primarily by the degree of stability of the heat source during welding.

  2. Process-based quality for thermal spray via feedback control

    Science.gov (United States)

    Dykhuizen, R. C.; Neiser, R. A.

    2006-09-01

    Quality control of a thermal spray system manufacturing process is difficult due to the many input variables that need to be controlled. Great care must be taken to ensure that the process remains constant to obtain a consistent quality of the parts. Control is greatly complicated by the fact that measurement of particle velocities and temperatures is a noisy stochastic process. This article illustrates the application of quality control concepts to a wire flame spray process. A central feature of the real-time control system is an automatic feedback control scheme that provides fine adjustments to ensure that uncontrolled variations are accommodated. It is shown how the control vectors can be constructed from simple process maps to independently control particle velocity and temperature. This control scheme is shown to perform well in a real production environment. We also demonstrate that slight variations in the feed wire curvature can greatly influence the process. Finally, the geometry of the spray system and sensor must remain constant for the best reproducibility.

  3. Improving Accuracy of Processing Through Active Control

    Directory of Open Access Journals (Sweden)

    N. N. Barbashov

    2016-01-01

    Full Text Available An important task of modern mathematical statistics, with its methods based on probability theory, is the scientific assessment of measurement results. Control has a certain cost, and ineffective control, where a customer receives defective products, costs significantly more because parts must be recalled. When machining parts, errors shift the scatter of part dimensions toward the tolerance limit. Improving processing accuracy and avoiding defective products means reducing the component errors in machining, i.e., improving the accuracy of the machine and tool, tool life, rigidity of the system, and accuracy of adjustment. The machine must also be re-adjusted at given intervals. To improve accuracy and machining rate, in-process gaging devices and controlled machining, which uses adaptive control systems for process monitoring, are currently becoming widespread. Improving accuracy in this case means compensating the majority of technological errors. In-cycle measuring sensors (sensors of active control) allow processing accuracy to be improved by one or two quality grades and enable simultaneous operation of several machines. Efficient use of in-cycle measuring sensors requires methods to control accuracy through appropriate adjustments. Methods based on the moving average appear to be the most promising for accuracy control, since they include data on the change in the last few measured values of the parameter under control.
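
The moving-average approach mentioned above can be sketched generically: watch the mean of the last k measured dimensions and trigger a compensating adjustment when it drifts past a threshold. The window size, threshold and drift model below are invented for illustration:

```python
# Moving-average drift detection with compensating adjustments.
from collections import deque

def adjustments(measurements, k=5, nominal=0.0, threshold=0.025):
    """Return indices at which the machine would be re-adjusted."""
    window, corrections, offset = deque(maxlen=k), [], 0.0
    for i, m in enumerate(measurements):
        window.append(m + offset)        # dimension after compensation
        if len(window) == k:
            drift = sum(window) / k - nominal
            if abs(drift) > threshold:
                offset -= drift          # re-adjust to cancel the drift
                corrections.append(i)
                window.clear()
    return corrections

# Tool wear modelled as a slow linear drift of the produced dimension.
parts = [0.004 * i for i in range(30)]
print(adjustments(parts))
```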

  4. Fault tolerant control of multivariable processes using auto-tuning PID controller.

    Science.gov (United States)

    Yu, Ding-Li; Chang, T K; Yu, Ding-Wen

    2005-02-01

    Fault tolerant control of dynamic processes is investigated in this paper using an auto-tuning PID controller. A fault tolerant control scheme is proposed comprising an auto-tuning PID controller based on an adaptive neural network model. The model is trained online using the extended Kalman filter (EKF) algorithm to learn system post-fault dynamics. Based on this model, the PID controller adjusts its parameters to compensate for the effects of the faults, so that the control performance recovers from degradation. The auto-tuning algorithm for the PID controller is derived with the Lyapunov method; therefore, the model-predicted tracking error is guaranteed to converge asymptotically. The method is applied to a simulated two-input two-output continuous stirred tank reactor (CSTR) with various faults, which demonstrates the applicability of the developed scheme to industrial processes.
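
For orientation, the loop that such an auto-tuner would be retuning is an ordinary discrete PID. The sketch below is not the paper's EKF-trained scheme; it is a generic PID applied to an invented first-order plant, with arbitrary gains:

```python
# Discrete PID loop on a simulated first-order process dy/dt = -y + u.
def pid_step(state, error, kp, ki, kd, dt):
    integ, prev = state
    integ += error * dt
    deriv = (error - prev) / dt
    return (integ, error), kp * error + ki * integ + kd * deriv

setpoint, y, state = 1.0, 0.0, (0.0, 0.0)
dt = 0.05
for _ in range(400):                       # 20 s of simulated time
    state, u = pid_step(state, setpoint - y, kp=2.0, ki=1.0, kd=0.05, dt=dt)
    y += dt * (-y + u)                     # Euler step of the plant
print(round(y, 3))
```

An auto-tuning layer like the one in the paper would adapt kp, ki and kd online as the (post-fault) plant model changes, rather than leaving them fixed as here.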

  5. Model-Based Integrated Process Design and Controller Design of Chemical Processes

    DEFF Research Database (Denmark)

    Abd Hamid, Mohd Kamaruddin Bin

    This thesis describes the development and application of a new systematic model-based methodology for performing integrated process design and controller design (IPDC) of chemical processes. The new methodology is simple to apply, easy to visualize and efficient to solve. Here, the IPDC problem, which is typically formulated as a mathematical programming (optimization with constraints) problem, is solved by the so-called reverse approach by decomposing it into four sequential hierarchical sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection. ... The candidate designs are ordered according to the defined performance criteria (objective function). The final selected design is then verified through rigorous simulation. In the pre-analysis sub-problem, the concepts of attainable region and driving force are used to locate the optimal process-controller design solution...

  6. Fuzzy systems for process identification and control

    International Nuclear Information System (INIS)

    Gorrini, V.; Bersini, H.

    1994-01-01

    Various issues related to the automatic construction and on-line adaptation of fuzzy controllers are addressed. A Direct Adaptive Fuzzy Control (an adaptive control methodology requiring minimal knowledge of the processes it is coupled with), derived in a way reminiscent of neurocontrol methods, is presented. A classical fuzzy controller and a fuzzy realization of a PID controller are discussed. These systems implement a highly non-linear control law and prove to be quite robust, even in the case of noisy inputs. In order to identify dynamic processes of order higher than one, we introduce a more complex architecture, called a Recurrent Fuzzy System, that uses fuzzy internal variables to perform inferential chaining.

  7. Pre- and post-processing of TORT data and preliminary experience with TORT version 3

    International Nuclear Information System (INIS)

    Hoogenboom, J.E.; John, T.M.; Hersman, A.; Leege, P.F.A. de

    1997-01-01

    As the cross-section input to the TORT 3-D transport code is very rigid, subroutines have been included in the local version of TORT to process other cross-section libraries. A mixing table routine was added in order to prepare macroscopic cross-sections from microscopic cross-section libraries. Post-processing was added through additional output flux files in the CCCC-format together with the GEODST file describing the geometry. Recently the new TORT version 3 was successfully installed. However, many problems had to be solved to properly extract the source code and documentation from the UNIX script delivered with the code package. Preliminary tests did not show big differences in performance with the older version. (R.P.)

  8. Integrated Process Design, Control and Analysis of Intensified Chemical Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil

    chemical processes; for example, intensified processes such as reactive distillation. Most importantly, it identifies and eliminates potentially promising design alternatives that may have controllability problems later. To date, a number of methodologies have been proposed and applied on various problems. ... The same principles that apply to a binary non-reactive compound system are shown to be valid also for a binary-element or a multi-element system. Therefore, it is advantageous to employ the element based method for multicomponent reaction-separation systems. It is shown that the same design-control principles...

  9. Improving Instruction Using Statistical Process Control.

    Science.gov (United States)

    Higgins, Ronald C.; Messer, George H.

    1990-01-01

    Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)

  10. Engineering Process Monitoring for Control Room Operation

    OpenAIRE

    Bätz, M

    2001-01-01

    A major challenge in process operation is to reduce costs and increase system efficiency, whereas the complexity of automated process engineering, control and monitoring systems increases continuously. To cope with this challenge, the design, implementation and operation of process monitoring systems for control room operation have to be treated as an ensemble. This is only possible if the engineering of the monitoring information is focused on the production objective and is led in close coll...

  11. Integrated control system for electron beam processes

    Science.gov (United States)

    Koleva, L.; Koleva, E.; Batchkova, I.; Mladenov, G.

    2018-03-01

    The ISO/IEC 62264 standard is widely used for integration of the business systems of a manufacturer with the corresponding manufacturing control systems based on hierarchical equipment models, functional data and manufacturing operations activity models. In order to achieve the integration of control systems, formal object communication models must be developed, together with manufacturing operations activity models, which coordinate the integration between different levels of control. In this article, the development of integrated control system for electron beam welding process is presented as part of a fully integrated control system of an electron beam plant, including also other additional processes: surface modification, electron beam evaporation, selective melting and electron beam diagnostics.

  12. Integrated Controlling System and Unified Database for High Throughput Protein Crystallography Experiments

    International Nuclear Information System (INIS)

    Gaponov, Yu.A.; Igarashi, N.; Hiraki, M.; Sasajima, K.; Matsugaki, N.; Suzuki, M.; Kosuge, T.; Wakatsuki, S.

    2004-01-01

    An integrated controlling system and a unified database for high throughput protein crystallography experiments have been developed. Main features of protein crystallography experiments (purification, crystallization, crystal harvesting, data collection, data processing) were integrated into the software under development. All information necessary to perform protein crystallography experiments is stored (except raw X-ray data that are stored in a central data server) in a MySQL relational database. The database contains four mutually linked hierarchical trees describing protein crystals, data collection of protein crystal and experimental data processing. A database editor was designed and developed. The editor supports basic database functions to view, create, modify and delete user records in the database. Two search engines were realized: direct search of necessary information in the database and object oriented search. The system is based on TCP/IP secure UNIX sockets with four predefined sending and receiving behaviors, which support communications between all connected servers and clients with remote control functions (creating and modifying data for experimental conditions, data acquisition, viewing experimental data, and performing data processing). Two secure login schemes were designed and developed: a direct method (using the developed Linux clients with secure connection) and an indirect method (using the secure SSL connection using secure X11 support from any operating system with X-terminal and SSH support). A part of the system has been implemented on a new MAD beam line, NW12, at the Photon Factory Advanced Ring for general user experiments

  13. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2004-01-01

    Statistical process control (SPC) is used to decide when to stop a process as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric

  14. Testing a Constrained MPC Controller in a Process Control Laboratory

    Science.gov (United States)

    Ricardez-Sandoval, Luis A.; Blankespoor, Wesley; Budman, Hector M.

    2010-01-01

    This paper describes an experiment performed by the fourth year chemical engineering students in the process control laboratory at the University of Waterloo. The objective of this experiment is to test the capabilities of a constrained Model Predictive Controller (MPC) to control the operation of a Double Pipe Heat Exchanger (DPHE) in real time.…

  15. EPICS Input/Output Controller (IOC) application developer's guide. APS Release 3.12

    International Nuclear Information System (INIS)

    Kraimer, M.R.

    1994-11-01

    This document describes the core software that resides in an Input/Output Controller (IOC), one of the major components of EPICS. The basic components are: the Operator Interface (OPI), a UNIX-based workstation which can run various EPICS tools; the Input/Output Controller (IOC), a VME/VXI-based chassis containing a Motorola 68xxx processor, various I/O modules, and VME modules that provide access to other I/O buses such as GPIB; and the Local Area Network (LAN), the communication network which allows the IOCs and OPIs to communicate. EPICS provides a software component, Channel Access, which provides network-transparent communication between a Channel Access client and an arbitrary number of Channel Access servers.

  16. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
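
    The capability analysis mentioned above reduces to two standard indices, Cp and Cpk, computed from the sample mean and standard deviation against the specification limits. A minimal sketch with hypothetical tablet-weight data and invented specification limits (not taken from the study):

```python
import statistics

def capability(samples, lsl, usl):
    """Return (Cp, Cpk) for a sample against lower/upper spec limits."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)                 # sample std deviation
    cp = (usl - lsl) / (6 * sd)                    # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sd)   # actual capability
    return cp, cpk

# Hypothetical tablet weights (mg) with a 295-305 mg specification.
weights = [299.8, 300.2, 300.5, 299.5, 300.1, 299.9, 300.4, 299.7]
cp, cpk = capability(weights, 295.0, 305.0)
print(round(cp, 2), round(cpk, 2))
```

    On the usual short-term scale, Cpk = 2.0 corresponds to a six-sigma process and Cpk = 1.33 to roughly four sigma.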

  17. Development of computer-aided software engineering tool for sequential control of JT-60U

    International Nuclear Information System (INIS)

    Shimono, M.; Akasaka, H.; Kurihara, K.; Kimura, T.

    1995-01-01

    Discharge sequential control (DSC) is an essential control function for the intermittent and pulsed discharge operation of a tokamak device, so that many subsystems may work with each other in the correct order and/or synchronously. In developing the DSC program, block diagrams of the logical operations for sequential control are first drawn in the design stage. Then, the logical operators and I/Os involved in the block diagrams are compiled and converted to a particular form. Since the block diagrams for sequential control amount to about 50 sheets in the case of a JT-60 upgrade tokamak (JT-60U) high-power discharge, and since these development steps had so far been performed manually, a great effort was required for program development. In order to remove inefficiency from such development processes, a computer-aided software engineering (CASE) tool has been developed on a UNIX workstation. This paper reports how the authors designed it for the development of the sequential control programs. The tool is composed of the following three tools: (1) an automatic drawing tool, (2) an editing tool, and (3) a trace tool. This CASE tool, an object-oriented programming tool with graphical formalism, can powerfully accelerate the development cycle for the sequential control function commonly associated with pulsed discharges in a tokamak fusion device

  18. Development of Chemical Process Design and Control for ...

    Science.gov (United States)

    This contribution describes a novel process systems engineering framework that couples advanced control with sustainability evaluation and decision making for the optimization of process operations to minimize environmental impacts associated with products, materials, and energy. The implemented control strategy combines a biologically inspired method with optimal control concepts for finding more sustainable operating trajectories. The sustainability assessment of process operating points is carried out by using the U.S. EPA's Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Process Evaluator (GREENSCOPE) tool, which provides scores for the selected indicators in the economic, material efficiency, environmental and energy areas. The indicator scores describe process performance on a sustainability measurement scale, effectively determining which operating point is more sustainable when more than one steady state exists for manufacturing a specific product. Through comparisons between a representative benchmark and the optimal steady states obtained through implementation of the proposed controller, a systematic decision can be made in terms of whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous fermentation process for fuel production, whose materi

  19. Optimal control of a CSTR process

    Directory of Open Access Journals (Sweden)

    A. Soukkou

    2008-12-01

    Full Text Available Designing an effective criterion and learning algorithm for finding the best structure is a major problem in the control design process. In this paper, the fuzzy optimal control methodology is applied to the design of the feedback loops of an exothermic continuous stirred tank reactor system. The objective of the design process is to find an optimal structure/gains of the Robust and Optimal Takagi-Sugeno Fuzzy Controller (ROFLC). The control signal thus obtained will minimize a performance index, which is a function of the tracking/regulating errors, the quantity of the energy of the control signal applied to the system, and the number of fuzzy rules. Genetic learning is proposed for constructing the ROFLC. The chromosome genes are arranged into two parts: the binary-coded part contains the control genes and the real-coded part contains the gene parameters representing the fuzzy knowledge base. This chromosome formulation enables the fuzzy sets and rules to be optimally reduced. The performances of the ROFLC are compared to those found by a traditional PD controller with Genetic Optimization (PD_GO). Simulations demonstrate that the proposed ROFLC and PD_GO have successfully met the design specifications.
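
    The mixed binary/real chromosome idea can be sketched in miniature. This is not the paper's ROFLC: it evolves a toy controller gene tuple (one binary "structure" gene enabling integral action, two real-coded gains) against a hypothetical first-order plant, with a small penalty on structural complexity standing in for the rule-count term:

```python
import random

random.seed(0)  # deterministic run for this sketch

def cost(genes):
    """Closed-loop tracking cost for a hypothetical first-order plant
    y' = -y + u. genes = (use_integral_bit, kp, ki), mirroring the
    binary control gene plus real-coded parameters idea."""
    use_i, kp, ki = genes
    y = ie = j = 0.0
    dt, setpoint = 0.1, 1.0
    for _ in range(100):
        e = setpoint - y
        ie += e * dt
        u = kp * e + (ki * ie if use_i else 0.0)
        y += dt * (-y + u)             # explicit Euler step
        j += abs(e) * dt
    return j + 0.05 * use_i            # small structural penalty

def mutate(genes):
    use_i, kp, ki = genes
    if random.random() < 0.3:          # flip the structure gene
        use_i = 1 - use_i
    kp = max(0.0, kp + random.gauss(0.0, 0.5))
    ki = max(0.0, ki + random.gauss(0.0, 0.5))
    return (use_i, kp, ki)

def evolve(generations=40, pop_size=20):
    pop = [(random.randint(0, 1), random.uniform(0, 5), random.uniform(0, 5))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]          # truncation selection
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in survivors]
    return min(pop, key=cost)

best = evolve()
print(best, round(cost(best), 3))
```

    The real algorithm would additionally use crossover and a fuzzy-rule-base encoding; the skeleton of evaluate/select/mutate is the same.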

  20. Advanced Process Control Application and Optimization in Industrial Facilities

    Directory of Open Access Journals (Sweden)

    Howes S.

    2015-01-01

    Full Text Available This paper describes the application of a new method and tool for system identification and PID tuning/advanced process control (APC) optimization using the new 3G (geometric, gradient, gravity) optimization method. It helps to design and implement control schemes directly inside the distributed control system (DCS) or programmable logic controller (PLC). The algorithm also helps to identify process dynamics in closed-loop mode, optimizes controller parameters, and helps to develop adaptive control and model-based control (MBC). Application of the new 3G algorithm for designing and implementing APC schemes is presented. Optimization of primary and advanced control schemes stabilizes the process and allows the plant to run closer to process, equipment and economic constraints. This increases production rates, minimizes operating costs and improves product quality.

  1. Frontal Control Process in Intentional Forgetting: Electrophysiological Evidence

    Directory of Open Access Journals (Sweden)

    Heming Gao

    2018-01-01

    Full Text Available In this study, we aimed to seek the neural evidence of the inhibition control process in directed forgetting (DF). We adopted a modified item-method DF paradigm, in which four kinds of cues were involved. In some trials, the words were followed by only a forgetting (F) cue. In the other trials, after a word was presented, a maintenance (M) cue was presented, followed by an explicit remembering (M-R) cue or a forgetting (M-F) cue. Data from 19 healthy adult participants showed that (1) compared with the remembering cue (i.e., M-R cue), forgetting cues (i.e., M-F cue and F cue) evoked enhanced frontal N2 and reduced parietal P3 and late positive complex (LPC) components, indicating that the forgetting cues might trigger a more intensive cognitive control process and that fewer cognitive resources were recruited for the further rehearsal process. (2) Both the M cue and the F cue evoked enhanced N2 and decreased P3 and LPC components compared with the M-R or M-F cue. These results might indicate that, compared with the M-R and M-F cues, both the M and F cues evoked a more intensive cognitive control process and a decreased attentional resource allocation process. (3) The F cue evoked a decreased P2 component and an enhanced N2 component relative to the other cues (i.e., M-R, M-F, M), indicating that the F cue received fewer attentional resources and evoked a more intensive cognitive control process. Taken together, forgetting cues were associated with enhanced N2 activity relative to the maintenance rehearsal process or the remembering process, suggesting an enhanced cognitive control process under DF. This cognitive control process might reflect the role of inhibition in DF as attempting to suppress the ongoing encoding.

  2. Intelligent Predictive Control of Nonlinear Processes Using

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Sørensen, Paul Haase; Poulsen, Niels Kjølstad

    1996-01-01

    This paper presents a novel approach to the design of generalized predictive controllers (GPC) for nonlinear processes. A neural network is used for modelling the process and a gain-scheduling type of GPC is subsequently designed. The combination of neural network models and predictive control has frequently been discussed in the neural network community. This paper proposes an approximate scheme, the approximate predictive control (APC), which facilitates the implementation and gives a substantial reduction in the required amount of computations. The method is based on a technique for extracting linear models from a nonlinear neural network and using them in designing the control system. The performance of the controller is demonstrated in a simulation study of a pneumatic servo system.

  3. A methodology to describe process control requirements

    International Nuclear Information System (INIS)

    Carcagno, R.; Ganni, V.

    1994-01-01

    This paper presents a methodology to describe process control requirements for helium refrigeration plants. The SSC requires a greater level of automation for its refrigeration plants than is common in the cryogenics industry, and traditional methods (e.g., written descriptions) used to describe process control requirements are not sufficient. The methodology presented in this paper employs tabular and graphic representations in addition to written descriptions. The resulting document constitutes a tool for efficient communication among the different people involved in the design, development, operation, and maintenance of the control system. The methodology is not limited to helium refrigeration plants, and can be applied to any process with similar requirements. The paper includes examples.

  4. Adaptive neural network controller for the molten steel level control of strip casting processes

    International Nuclear Information System (INIS)

    Chen, Hung Yi; Huang, Shiuh Jer

    2010-01-01

    The twin-roll strip casting process is a steel-strip production method which combines continuous casting and hot rolling processes. The production line from molten liquid steel to the final steel strip is shortened and the production cost is reduced significantly as compared to conventional continuous casting. The quality of the strip casting process depends on many process parameters, such as the molten steel level in the pool, the solidification position, and the roll gap. Their relationships are complex and the strip casting process has the properties of nonlinear uncertainty and time-varying characteristics. It is difficult to establish an accurate process model for designing a model-based controller to monitor the strip quality. In this paper, a model-free adaptive neural network controller is developed to overcome this problem. The proposed control strategy is based on a neural network structure combined with a sliding-mode control scheme. An adaptive rule is employed to adjust on-line the weights of radial basis functions by using the reaching condition of a specified sliding surface. This surface has the on-line learning ability to respond to the system's nonlinear and time-varying behaviors. Since this model-free controller has a simple control structure and a small number of control parameters, it is easy to implement. Simulation results, based on a semi-experimental system dynamic model and parameters, are executed to show the control performance of the proposed intelligent controller. In addition, the control performance is compared with that of a traditional PID controller

  5. Modeling and Control of Multivariable Process Using Intelligent Techniques

    Directory of Open Access Journals (Sweden)

    Subathra Balasubramanian

    2010-10-01

    Full Text Available For nonlinear dynamic systems, first-principles-based modeling and control are difficult to implement. In this study, a fuzzy controller and a recurrent fuzzy controller are developed for a MIMO process. A fuzzy logic controller is a model-free controller designed based on knowledge about the process. Two types of rule-based fuzzy models are available: the linguistic (Mamdani) model and the Takagi-Sugeno model. Of these two, the Takagi-Sugeno (TS) model has attracted the most attention. The application of fuzzy controllers is limited to static processes due to their feedforward structure. However, most real-time processes are dynamic and require the history of input/output data. In order to store the past values, a memory unit is needed, which is introduced by the recurrent structure. The proposed recurrent fuzzy structure is used to develop a controller for the two-tank heating process. Both controllers are designed and implemented in a real-time environment and their performance is compared.
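
    The Takagi-Sugeno model mentioned above computes its output as a membership-weighted average of linear rule consequents. A minimal sketch with two invented rules (the membership shapes and local laws below are illustrative, not from the study):

```python
def triangle(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ts_inference(x, rules):
    """Takagi-Sugeno inference: membership-weighted average of the
    linear rule consequents y_i = a_i * x + b_i."""
    num = den = 0.0
    for membership, (a, b) in rules:
        w = membership(x)
        num += w * (a * x + b)
        den += w
    return num / den if den else 0.0

# Two hypothetical temperature rules, each with a local linear law.
rules = [
    (lambda x: triangle(x, -50, 0, 50), (0.5, 10.0)),   # "low"
    (lambda x: triangle(x, 0, 50, 100), (0.2, 30.0)),   # "high"
]
print(ts_inference(25.0, rules))  # 28.75
```

    At x = 25 both rules fire with weight 0.5, so the output is the average of the two local laws (22.5 and 35.0); this smooth blending between local linear models is what makes TS models attractive for control.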

  6. 21 CFR 820.70 - Production and process controls.

    Science.gov (United States)

    2010-04-01

    ...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Production and Process Controls § 820.70 Production and process... used as part of production or the quality system, the manufacturer shall validate computer software for... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Production and process controls. 820.70 Section...

  7. Advanced coking process control at Rautaruukki Steel

    Energy Technology Data Exchange (ETDEWEB)

    Ritamaki, O.; Luhtaniemi, H. [Rautaruukki Engineering (Finland)

    1999-12-01

    The paper presents the latest development of the Coking Process Management System (CPMS) at Raahe Steel. The latest, third-generation system is based on the previous system with the addition of fuzzy logic controllers. (The previous second-generation system was based on simultaneous feed-forward and feedback control.) The system development has resulted in balanced coke oven battery heating, decreased variation in process regulation between shifts and an increase in process information for operators. The economic results are very satisfactory. 7 figs.

  8. Learning-based controller for biotechnology processing, and method of using

    Science.gov (United States)

    Johnson, John A.; Stoner, Daphne L.; Larsen, Eric D.; Miller, Karen S.; Tolle, Charles R.

    2004-09-14

    The present invention relates to process control where some of the controllable parameters are difficult or impossible to characterize. The present invention relates to, but is not limited to, process control of such systems in biotechnology. Additionally, the present invention relates to process control in biotechnology minerals processing. In the inventive method, an application of the present invention manipulates a minerals bioprocess to find local extrema (maxima or minima) for selected output variables/process goals by using a learning-based controller for bioprocess oxidation of minerals during hydrometallurgical processing. The learning-based controller operates with or without human supervision and works to find process optima without previously defined optima, due to the non-characterized nature of the process being manipulated.
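
    The patent does not disclose its learning algorithm, but "finding local extrema of an uncharacterized process" is the classic perturb-and-observe setting. A minimal hill-climbing sketch against a hypothetical yield curve (the curve and its peak are invented for illustration):

```python
def hill_climb(measure, u0, step=0.1, iterations=100):
    """Perturb-and-observe search for a local maximum of an
    unmodelled process response `measure(u)`."""
    u, best = u0, measure(u0)
    direction = 1.0
    for _ in range(iterations):
        candidate = u + direction * step
        y = measure(candidate)
        if y > best:
            u, best = candidate, y      # keep moving the same way
        else:
            direction = -direction      # reverse on failure
            step *= 0.9                 # and shrink the step
    return u, best

# Hypothetical bioleaching yield as a function of aeration rate,
# peaking at u = 2.0 (unknown to the controller).
yield_curve = lambda u: 1.0 - (u - 2.0) ** 2
u_opt, y_opt = hill_climb(yield_curve, u0=0.5)
print(round(u_opt, 2), round(y_opt, 3))
```

    A real bioprocess controller would have to average noisy measurements and respect safety constraints, but the accept/reverse/shrink loop is the core of model-free extremum seeking.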

  9. High level model predictive control for plug-and-play process control with stability guaranty

    DEFF Research Database (Denmark)

    Michelsen, Axel Gottlieb; Stoustrup, Jakob

    2010-01-01

    In this paper a method for designing a stabilizing high-level model predictive controller for a hierarchical plug-and-play process is presented. This is achieved by abstracting the lower layers of the controller structure as low-order models with uncertainty and by using a robust model predictive controller for generating the references for these. A simulation example, in which the actuators in a process control system are changed, is reported to show the potential of this approach for plug-and-play process control.

  10. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    Science.gov (United States)

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  11. Dosimetry control for radiation processing - basic requirements and standards

    International Nuclear Information System (INIS)

    Ivanova, M.; Tsrunchev, Ts.

    2004-01-01

    A brief review of the basic international codes and standards for dosimetry control for radiation processing (high-dose dosimetry), setting up dosimetry control for radiation processing, and metrology control of the dosimetry system is given. The present state of dosimetry control for food processing and the long Bulgarian experience in food irradiation (three irradiation facilities are operational at the moment) are presented. The absence of both a national standard for high doses and an accredited laboratory for calibration and audit of radiation processing dosimetry systems is also discussed.

  12. Fault Tolerant Control Using Gaussian Processes and Model Predictive Control

    Directory of Open Access Journals (Sweden)

    Yang Xiaoke

    2015-03-01

    Full Text Available Essential ingredients for fault-tolerant control are the ability to represent system behaviour following the occurrence of a fault, and the ability to exploit this representation for deciding control actions. Gaussian processes seem to be very promising candidates for the first of these, and model predictive control has a proven capability for the second. We therefore propose to use the two together to obtain fault-tolerant control functionality. Our proposal is illustrated by several reasonably realistic examples drawn from flight control.

  13. Simulation of process identification and controller tuning for flow control system

    Science.gov (United States)

    Chew, I. M.; Wong, F.; Bono, A.; Wong, K. I.

    2017-06-01

    PID controller is undeniably the most popular method used in controlling various industrial processes. The ability to tune the three elements of PID has allowed the controller to deal with the specific needs of industrial processes. This paper discusses the three control actions and improving the robustness of controllers through combinations of these actions in various forms. A plant model is simulated using the Process Control Simulator in order to evaluate controller performance. First, the open-loop response of the plant is studied by applying a step input and collecting the output data. Then, a FOPDT model of the physical plant is formed using both Matlab-Simulink and the PRC method. Next, the controller settings are calculated to find the values of Kc and τi that give satisfactory closed-loop control. The closed-loop performance is then analyzed through setpoint tracking and disturbance rejection tests. To optimize the overall system performance, refined PID tuning or detuning is further conducted to ensure consistent closed-loop responses to setpoint changes and disturbances applied to the physical model. As a result, PB = 100 (%) and τi = 2.0 (s) are preferably chosen for setpoint tracking while PB = 100 (%) and τi = 2.5 (s) are selected for rejecting the imposed disturbance. In a nutshell, the selection of tuning values likewise depends on the required control objective for the stability performance of the overall physical model.
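
    Once a FOPDT model is fitted from the step test, PI settings follow from standard correlations. The paper's exact correlation is not stated, so the sketch below uses the classical Ziegler-Nichols open-loop rules as an assumption, with invented FOPDT parameters, and converts the gain to a proportional band (PB) as used in the abstract:

```python
def zn_pi(K, tau, theta):
    """Ziegler-Nichols open-loop PI settings for a FOPDT model
    G(s) = K * exp(-theta*s) / (tau*s + 1)."""
    kc = 0.9 * tau / (K * theta)   # controller gain
    ti = theta / 0.3               # integral time (= 3.33 * theta)
    return kc, ti

def proportional_band(kc):
    """Convert controller gain to proportional band in percent."""
    return 100.0 / kc

# Hypothetical FOPDT fit of a flow-loop step test.
K, tau, theta = 2.0, 3.0, 0.6
kc, ti = zn_pi(K, tau, theta)
print(round(kc, 2), round(ti, 2), round(proportional_band(kc), 1))
```

    Detuning, as described above, then amounts to raising PB (lowering Kc) or lengthening τi until the closed-loop response is acceptably damped.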

  14. A Taguchi approach on optimal process control parameters for HDPE pipe extrusion process

    Science.gov (United States)

    Sharma, G. V. S. S.; Rao, R. Umamaheswara; Rao, P. Srinivasa

    2017-06-01

    High-density polyethylene (HDPE) pipes find versatile applicability for transportation of water, sewage and slurry from one place to another. Hence, these pipes undergo tremendous pressure from the fluid carried. The present work entails the optimization of the withstanding pressure of the HDPE pipes using the Taguchi technique. The traditional heuristic methodology stresses a trial-and-error approach and relies heavily upon the accumulated experience of the process engineers for determining the optimal process control parameters. This results in the setting of less-than-optimal values. Hence, there arises a need to determine optimal process control parameters for the pipe extrusion process, which can ensure robust pipe quality and process reliability. In the proposed optimization strategy, design of experiments (DoE) is conducted wherein different control parameter combinations are analyzed by considering multiple setting levels of each control parameter. The concept of the signal-to-noise ratio (S/N ratio) is applied and ultimately optimum values of the process control parameters are obtained as: a pushing zone temperature of 166 °C, a dimmer speed of 8 rpm, and a die head temperature of 192 °C. A confirmation experimental run is also conducted to verify the analysis, and the resulting values proved to be in synchronization with the main experimental findings; the withstanding pressure showed a significant improvement from 0.60 to 1.004 MPa.
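
    Since withstanding pressure is a larger-the-better response, the Taguchi S/N ratio used for ranking parameter combinations is -10·log10 of the mean of 1/y². A minimal sketch with hypothetical pressure replicates (the numbers below are illustrative, not the study's DoE data):

```python
import math

def sn_larger_the_better(values):
    """Taguchi larger-the-better signal-to-noise ratio in dB."""
    n = len(values)
    return -10.0 * math.log10(sum(1.0 / v ** 2 for v in values) / n)

# Hypothetical withstanding-pressure replicates (MPa) for two
# control-parameter combinations.
run_a = [0.60, 0.62, 0.59]
run_b = [1.00, 1.01, 0.99]
sn_a = sn_larger_the_better(run_a)
sn_b = sn_larger_the_better(run_b)
print(round(sn_a, 2), round(sn_b, 2))   # higher S/N is better
```

    In a full Taguchi analysis these per-run S/N values are averaged per factor level to pick the optimum setting of each control parameter.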

  15. A reconfigurable hybrid supervisory system for process control

    International Nuclear Information System (INIS)

    Garcia, H.E.; Ray, A.; Edwards, R.M.

    1994-01-01

    This paper presents a reconfigurable approach to decision and control systems for complex dynamic processes. The proposed supervisory control system is a reconfigurable hybrid architecture structured into three functional levels of hierarchy, namely, execution, supervision, and coordination. While the bottom execution level is constituted by either reconfigurable continuously varying or discrete event systems, the top two levels are necessarily governed by reconfigurable sets of discrete event decision and control systems. Based on the process status, the set of active control and supervisory algorithms is chosen. The reconfigurable hybrid system is briefly described along with a discussion of its implementation at the Experimental Breeder Reactor II of Argonne National Laboratory. A process control application of this hybrid system is presented and evaluated in an in-plant experiment
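
    The supervision layer described above is essentially a discrete-event switch that selects the active execution-level controller from the process status. A minimal sketch with invented status and controller names (the EBR-II implementation details are not in the abstract):

```python
# Hypothetical status-to-controller table; the actual reconfiguration
# logic of the reported system is not specified in the abstract.
SUPERVISOR_TABLE = {
    "startup": "manual_pid",
    "nominal": "optimal_lqr",
    "upset":   "fallback_pid",
}

class Supervisor:
    """Discrete-event supervision layer that reconfigures the active
    execution-level controller based on reported process status."""
    def __init__(self, table):
        self.table = table
        self.active = None

    def update(self, status):
        controller = self.table.get(status, "fallback_pid")
        if controller != self.active:
            self.active = controller     # reconfiguration event
        return self.active

sup = Supervisor(SUPERVISOR_TABLE)
history = [sup.update(s) for s in ["startup", "nominal", "nominal", "upset"]]
print(history)  # ['manual_pid', 'optimal_lqr', 'optimal_lqr', 'fallback_pid']
```

    The coordination layer would sit above this, arbitrating between several such supervisors; unknown statuses deliberately fall back to a safe controller.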

  17. Evaluation of control strategies in forming processes

    Directory of Open Access Journals (Sweden)

    Calmano Stefan

    2015-01-01

    Full Text Available Products of forming processes are subject to quality fluctuations due to uncertainty in semi-finished part properties as well as process conditions and environment. An approach to cope with these uncertainties is the implementation of a closed-loop control taking into account the actual product properties, measured by sensors or estimated by a mathematical process model. Both methods of uncertainty control come at a financial cost. In the case of sensor integration, the effort is the cost of the sensor, including signal processing, as well as the design and manufacturing effort for integration. In the case of an estimation model, the effort is mainly determined by the time and knowledge needed to derive the model, identify its parameters and implement it in the PLC. The risk of mismatch between model and reality, as well as the risk of wrong parameter identification, can be treated as additional uncertainty (model uncertainty). This paper evaluates controlled and additional uncertainty by taking into account process boundary conditions such as the degree of fluctuation in semi-finished part properties. The proposed evaluation is demonstrated by the analysis of exemplary processes.

  18. A Combined Control Chart for Identifying Out–Of–Control Points in Multivariate Processes

    Directory of Open Access Journals (Sweden)

    Marroquín–Prado E.

    2010-10-01

    Full Text Available Hotelling's T2 control chart is widely used to identify out-of-control signals in multivariate processes. However, this chart is not sensitive to small shifts in the process mean vector. In this work we propose a control chart to identify out-of-control signals. The proposed chart is a combination of Hotelling's T2 chart, the M chart proposed by Hayter et al. (1994), and a new chart based on principal components. The combination of these charts identifies any type and size of change in the process mean vector. Using simulation and the average run length (ARL), the performance of the proposed control chart is evaluated. The ARL is the average number of points within control limits before an out-of-control point is detected. The results of the simulation show that the proposed chart is more sensitive than each of the three charts individually
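
    The T2 statistic itself is the squared Mahalanobis distance of each observation from the in-control mean. A minimal two-variable sketch with hypothetical in-control parameters and observations (the control limit shown is the chi-square quantile chi2(0.9973, df=2) ≈ 11.83, a common large-sample choice):

```python
def t2_statistics(data, mean, cov):
    """Hotelling T2 statistic for each 2-variable observation, using a
    hand-rolled 2x2 matrix inverse to stay dependency-free."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))
    stats = []
    for x1, x2 in data:
        d1, d2 = x1 - mean[0], x2 - mean[1]
        t2 = (d1 * (inv[0][0] * d1 + inv[0][1] * d2)
              + d2 * (inv[1][0] * d1 + inv[1][1] * d2))
        stats.append(t2)
    return stats

# Hypothetical in-control mean/covariance and three observations;
# the second observation is a deliberate large shift.
mean = (0.0, 0.0)
cov = ((1.0, 0.3), (0.3, 1.0))
obs = [(0.1, -0.2), (4.0, 4.2), (-0.5, 0.4)]
ucl = 11.83
flags = [t2 > ucl for t2 in t2_statistics(obs, mean, cov)]
print(flags)  # [False, True, False]
```

    As the abstract notes, small sustained shifts keep T2 under the limit for a long time, which is why it is combined here with more sensitive charts.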

  19. Statistical process control applied to the liquid-fed ceramic melter process

    International Nuclear Information System (INIS)

    Pulsipher, B.A.; Kuhn, W.L.

    1987-09-01

    In this report, an application of control charts to the apparent feed composition of a Liquid-Fed Ceramic Melter (LFCM) is demonstrated by using results from a simulation of the LFCM system. Usual applications of control charts require the assumption of uncorrelated observations over time. This assumption is violated in the LFCM system because of the heels left in tanks from previous batches. Methods for dealing with this problem have been developed to create control charts for individual batches sent to the feed preparation tank (FPT). These control charts are capable of detecting changes in the process average as well as changes in the process variation. All numbers reported in this document were derived from a simulated demonstration of a plausible LFCM system. In practice, site-specific data must be used as input to a simulation tailored to that site. These data directly affect all variance estimates used to develop control charts. 64 refs., 3 figs., 2 tabs

  20. Optical metrology for advanced process control: full module metrology solutions

    Science.gov (United States)

    Bozdog, Cornel; Turovets, Igor

    2016-03-01

    Optical metrology is the workhorse metrology in manufacturing and a key enabler of patterning process control. Recent advances in device architecture are gradually shifting the need for process control from the lithography module to other patterning processes (etch, trim, clean, LER/LWR treatments, etc.). Complex multi-patterning integration solutions, where the final pattern is the result of multiple process steps, require step-by-step holistic process control and a uniformly accurate holistic metrology solution for pattern transfer across the entire module. For effective process control, more process "knobs" are needed, along with a tighter integration of metrology with process architecture.

  1. Multi-Model Adaptive Fuzzy Controller for a CSTR Process

    Directory of Open Access Journals (Sweden)

    Shubham Gogoria

    2015-09-01

    Continuous Stirred Tank Reactors are intensively used to control exothermic reactions in chemical industries. A CSTR is a complex multi-variable system with non-linear characteristics. This paper deals with linearization of the mathematical model of a CSTR process. A multi-model adaptive fuzzy controller has been designed to control the reactor concentration and temperature of the CSTR process. This method combines the outputs of multiple fuzzy controllers, which are operated at various operating points. The proposed solution is a straightforward implementation of a fuzzy controller with a gain scheduler to control the linearly inseparable parameters of a highly non-linear process.
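The multi-model idea of blending local controllers by operating point can be sketched minimally as follows. This is a generic illustration, not the paper's design: the triangular memberships, PI form, and gain values are all assumptions:

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

class BlendedPI:
    """Blend several local PI controllers by operating-point membership."""
    def __init__(self, controllers):
        # controllers: list of (membership_fn, kp, ki), one per operating point
        self.controllers = controllers
        self.integral = 0.0

    def update(self, error, operating_point, dt):
        self.integral += error * dt
        weights = [m(operating_point) for m, _, _ in self.controllers]
        total = sum(weights) or 1.0
        u = sum(w * (kp * error + ki * self.integral)
                for w, (_, kp, ki) in zip(weights, self.controllers))
        return u / total  # weighted average of the local control actions
```

A single shared integrator is used here so the blended output does not jump when the operating point moves between regions; the paper's fuzzy controllers would replace the local PI laws.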

  2. Multivariate Statistical Process Control Charts: An Overview

    OpenAIRE

    Bersimis, Sotiris; Psarakis, Stelios; Panaretos, John

    2006-01-01

    In this paper we discuss the basic procedures for the implementation of multivariate statistical process control via control charting. Furthermore, we review multivariate extensions for all kinds of univariate control charts, such as multivariate Shewhart-type control charts, multivariate CUSUM control charts and multivariate EWMA control charts. In addition, we review unique procedures for the construction of multivariate control charts, based on multivariate statistical techniques such as p...
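The simplest multivariate chart statistic reviewed here, Hotelling's T², can be computed without library support for the two-variable case. A minimal phase-I sketch (the data and variable count are illustrative):

```python
def hotelling_t2_2d(xs, ys):
    """Phase-I Hotelling T^2 for two quality variables measured together.

    Each statistic is d' S^{-1} d, with d the deviation of one observation
    from the sample mean and S the 2x2 sample covariance matrix.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    det = sxx * syy - sxy * sxy  # determinant of S (must be non-zero)
    t2 = []
    for x, y in zip(xs, ys):
        dx, dy = x - mx, y - my
        # d' S^{-1} d expanded for the 2x2 inverse
        t2.append((syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det)
    return t2
```

A point whose T² exceeds the chart's control limit signals a shift in the joint mean, which is exactly what separate univariate charts can miss when the variables are correlated.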

  3. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Roč. 22, č. 2 (2011), s. 58-62 ISSN 0929-2268 Institutional research plan: CEZ:AV0Z10300504 Keywords: process models * PID control * second order dynamics Subject RIV: JB - Sensors, Measurement, Regulation

  4. Distributed process control system for remote control and monitoring of the TFTR tritium systems

    International Nuclear Information System (INIS)

    Schobert, G.; Arnold, N.; Bashore, D.; Mika, R.; Oliaro, G.

    1989-01-01

    This paper reviews the progress made in the application of a commercially available distributed process control system to support the requirements established for the Tritium REmote Control And Monitoring System (TRECAMS) of the Tokamak Fusion Test Reactor (TFTR). The system that will be discussed was purchased from the Texas Instruments (TI) Automation Controls Division; it was previously marketed by Rexnord Automation. It consists of three, fully redundant, distributed process controllers interfaced to over 1800 analog and digital I/O points. The operator consoles located throughout the facility are supported by four Digital Equipment Corporation (DEC) PDP-11/73 computers. The PDP-11/73's and the three process controllers communicate over a fully redundant one-megabaud fiber optic network. All system functionality is based on a set of completely integrated databases loaded to the process controllers and the PDP-11/73's. (author). 2 refs.; 2 figs

  5. Design of central control system for large helical device (LHD)

    International Nuclear Information System (INIS)

    Yamazaki, K.; Kaneko, H.; Yamaguchi, S.; Watanabe, K.Y.; Taniguchi, Y.; Motojima, O.

    1993-11-01

    The world's largest superconducting fusion machine, LHD (Large Helical Device), is under construction in Japan, aiming at steady-state operation. Its basic control system consists of UNIX computers, FDDI/Ethernet LANs, VME multiprocessors and the VxWorks real-time OS. For flexible and reliable operation of the LHD machine, a cooperative distributed system with more than 30 experimental devices is controlled by the central computer and the main timing system, and is supervised by the main protective interlock system. Intelligent control systems, such as applications of fuzzy logic and neural networks, are planned to be adopted for flexible feedback control of plasma configurations besides the classical PID control scheme. Design studies of its control system and related R and D programs with coil-plasma simulation systems are now being performed. The construction of the LHD Control Building at a new site will begin in 1995 after the construction of the LHD Experimental Building is finished, and the hardware construction of the LHD central control equipment will be started in 1996. A first plasma production by means of this control system is expected in 1997. (author)

  6. Ventilation equations for improved exothermic process control.

    Science.gov (United States)

    McKernan, John L; Ellenbecker, Michael J

    2007-04-01

    Exothermic or heated processes create potentially unsafe work environments for an estimated 5-10 million American workers each year. Excessive heat and process contaminants have the potential to cause acute health effects such as heat stroke, and chronic effects such as manganism in welders. Although millions of workers are exposed to exothermic processes, insufficient attention has been given to continuously improving engineering technologies for these processes to provide effective and efficient control. Currently there is no specific occupational standard established by OSHA regarding exposure to heat from exothermic processes; it is therefore important to investigate techniques that can mitigate known and potential adverse occupational health effects. The current understanding of engineering controls for exothermic processes is primarily based on a book chapter written by W. C. L. Hemeon in 1955. Improvements in heat transfer and meteorological theory necessary to design improved process controls have occurred since that time. The research presented involved a review of the physical properties, heat transfer and meteorological theories governing buoyant air flow created by exothermic processes. These properties and theories were used to identify parameters and develop equations required for the determination of buoyant volumetric flow to assist in improving ventilation controls. The goals of this research were to develop and describe a new (i.e. proposed) flow equation, and compare it to currently accepted ones by Hemeon and the American Conference of Governmental Industrial Hygienists (ACGIH). Numerical assessments were conducted to compare solutions from the proposed equations for plume area, mean velocity and flow to those from the ACGIH and Hemeon equations. Parameters were varied for the dependent variables and solutions from the proposed, ACGIH, and Hemeon equations for plume area, mean velocity and flow were analyzed using a randomized complete block statistical

  7. Multivariable adaptive control of bio process

    Energy Technology Data Exchange (ETDEWEB)

    Maher, M.; Bahhou, B.; Roux, G. [Centre National de la Recherche Scientifique (CNRS), 31 - Toulouse (France); Maher, M. [Faculte des Sciences, Rabat (Morocco). Lab. de Physique

    1995-12-31

    This paper presents a multivariable adaptive control of a continuous-flow fermentation process for alcohol production. The linear quadratic control strategy is used for the regulation of substrate and ethanol concentrations in the bioreactor. The control inputs are the dilution rate and the influent substrate concentration. A robust identification algorithm is used for the on-line estimation of the linear MIMO model's parameters. Experimental results of a pilot-plant fermenter application are reported and show the control performances. (authors) 8 refs.

  8. Knowledge-based processing for aircraft flight control

    Science.gov (United States)

    Painter, John H.; Glass, Emily; Economides, Gregory; Russell, Paul

    1994-01-01

    This Contractor Report documents research in Intelligent Control using knowledge-based processing in a manner dual to methods found in the classic stochastic decision, estimation, and control discipline. Such knowledge-based control has also been called Declarative and Hybrid. Software architectures were sought, employing the parallelism inherent in modern object-oriented modeling and programming. The viewpoint adopted was that Intelligent Control employs a class of domain-specific software architectures having features common over a broad variety of implementations, such as management of aircraft flight, power distribution, etc. As much attention was paid to software engineering issues as to artificial intelligence and control issues. This research considered that particular processing methods from the stochastic and knowledge-based worlds are duals, that is, similar in a broad context. They provide architectural design concepts which serve as bridges between the disparate disciplines of decision, estimation, control, and artificial intelligence. This research was applied to the control of a subsonic transport aircraft in the airport terminal area.

  9. Data processing system for real-time control

    International Nuclear Information System (INIS)

    Oasa, K.; Mochizuki, O.; Toyokawa, R.; Yahiro, K.

    1983-01-01

    Real-time control of the large tokamak JT-60 requires various data processing between the diagnostic devices and the control system. This processing demands high-speed performance, since it must provide the information necessary for feedback control during discharges. The architecture of this system is therefore a hierarchical structure of processors. These processors are connected to each other by CAMAC modules and an optical communication network, which is the 5 Mbytes/second CAMAC serial highway. This system has two kinds of intelligence for this purpose. One is the ACM-PU pairs in some torus hall crates, each of which has a microcomputerized auxiliary controller and a preprocessing unit. The other is the real-time processor, which has a minicomputer and a preprocessing unit. Most of the real-time processing, for example Abel inversion, is characteristic of the diagnostic devices. Such processing is carried out by an ACM-PU pair in the crate dedicated to the diagnostic device. Some processing, however, is also necessary to compute secondary parameters as functions of primary parameters. A typical example is Zeff, which is a function of Te, Ne and bremsstrahlung intensity. The real-time processor is equipped for such secondary processing and transfers the results. The preprocessing unit (PU) attached to the ACM and the real-time processor contains a signal processor, which executes in parallel such functions as move, add and multiply during one micro-instruction cycle of 200 nsec. As the experiment progressed, higher-speed processing was required, so the authors developed the PU-X module, which contains multiple signal processors. After a shot, the inter-shot processor, which consists of general-purpose computers, gathers data into the database, analyzes them, and refines these processes to make them more effective
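A secondary-parameter computation of the kind described, Zeff from primary measurements, can be sketched as follows. This is a hedged illustration only: it assumes the standard proportionality of bremsstrahlung emissivity to Zeff·ne²·√Te, and the calibration constant and function names are invented for the example, not taken from the JT-60 system:

```python
import math

def z_eff(i_brem, n_e, t_e, calib=1.0):
    """Estimate effective charge Zeff from bremsstrahlung intensity.

    Assumes emissivity ~ calib * Zeff * n_e**2 * sqrt(T_e), so
    Zeff = i_brem / (calib * n_e**2 * sqrt(T_e)).  calib is an
    illustrative instrument calibration constant.
    """
    return i_brem / (calib * n_e ** 2 * math.sqrt(t_e))
```

In the system described, this kind of function-of-primary-parameters evaluation is exactly what the central real-time processor performs between the per-diagnostic ACM-PU computations and the feedback loop.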

  10. Computations on Wings With Full-Span Oscillating Control Surfaces Using Navier-Stokes Equations

    Science.gov (United States)

    Guruswamy, Guru P.

    2013-01-01

    A dual-level parallel procedure is presented for computing large databases to support aerospace vehicle design. This procedure has been developed as a single Unix script within the Parallel Batch Submission environment, utilizing MPIexec to run MPI-based analysis software. It has been developed to provide a process for aerospace designers to generate data for large numbers of cases with the highest possible fidelity and reasonable wall clock time. A single job submission environment has been created to avoid keeping track of multiple jobs and the associated system administration overhead. The process has been demonstrated by computing large databases for the design of typical aerospace configurations, a launch vehicle and a rotorcraft.
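The single-script, many-case submission pattern can be sketched in Python. This is only an illustration of the idea (the actual work used a Unix script in a Parallel Batch Submission environment); the solver name, case files and rank count below are invented for the example:

```python
import subprocess

def build_mpi_command(solver, case_file, ranks):
    """Command line for one analysis case (solver/case names illustrative)."""
    return ["mpiexec", "-n", str(ranks), solver, case_file]

def submit_all(case_files, solver="flow_solver", ranks=64, dry_run=True):
    """Build one command per case; run them sequentially unless dry_run."""
    commands = [build_mpi_command(solver, c, ranks) for c in case_files]
    if not dry_run:
        for cmd in commands:
            subprocess.run(cmd, check=True)  # stop on the first failed case
    return commands
```

Wrapping every case in one driver like this gives the single point of bookkeeping the abstract describes: one submission, one log, no per-case job tracking.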

  11. Functional graphical languages for process control

    International Nuclear Information System (INIS)

    1996-01-01

    A wide variety of safety systems are in use today in the process industries. Most of these systems rely on control software using procedural programming languages. This study investigates the use of functional graphical languages for controls in the process industry. Different vendor proprietary software and languages are investigated and evaluation criteria are outlined based on ability to meet regulatory requirements, reference sites involving applications with similar safety concerns, QA/QC procedures, community of users, type and user-friendliness of the man-machine interface, performance of operational code, and degree of flexibility. (author) 16 refs., 4 tabs

  12. An Overview of Pharmaceutical Validation and Process Controls in ...

    African Journals Online (AJOL)

    It has always been known that the processes involved in pharmaceutical production impact significantly on the quality of the products. The processes include raw material and equipment inspections as well as in-process controls. Process controls are mandatory in good manufacturing practice (GMP). The purpose is to ...

  13. Modern operator's consoles for accelerator control at Fermilab

    International Nuclear Information System (INIS)

    Lucas, P.; Cahill, K.; Peters, R.; Smedinghoff, J.

    1991-01-01

    Since the construction of the Tevatron, the Fermilab accelerator complex has been controlled from operator's consoles based on PDP-11 computers, interacting with display hardware via CAMAC. In addition, the Linac has been controllable from microprocessor-based local consoles. The new generation of console devices is based on VAXstation computers, networked by Ethernet and Token Ring, and utilizing the X-windows protocol. Under X the physical display (server) can be driven by any network node, and need not be part of the console computer (client). This allows great flexibility in configuring display devices, with X-terminals, Unix workstations, and Macintoshes all having been utilized. Over half of the 800 application programs on the system have been demonstrated to work properly in the new environment. The modern version of a Linac local console runs on a Macintosh. These are networked via Token Ring to Linac local control stations. They provide color graphics and a hard copy capability which was previously lacking.

  14. Quality control of static irradiation processing products

    International Nuclear Information System (INIS)

    Bao Jianzhong; Chen Xiulan; Cao Hong; Zhai Jianqing

    2002-01-01

    Based on the irradiation processing practice of the nuclear technique application laboratory of Yangzhou Institute of Agricultural Science, the quality control of irradiation processing products is discussed

  15. Process Control System Cyber Security Standards - An Overview

    Energy Technology Data Exchange (ETDEWEB)

    Robert P. Evans

    2006-05-01

    The use of cyber security standards can greatly assist in the protection of process control systems by providing guidelines and requirements for the implementation of computer-controlled systems. These standards are most effective when the engineers and operators, using the standards, understand what each standard addresses. This paper provides an overview of several standards that deal with the cyber security of process measurements and control systems.

  16. Optimization and control of metal forming processes

    NARCIS (Netherlands)

    Havinga, Gosse Tjipke

    2016-01-01

    Inevitable variations in process and material properties limit the accuracy of metal forming processes. Robust optimization methods or control systems can be used to improve the production accuracy. Robust optimization methods are used to design production processes with low sensitivity to the

  17. SPring-8 beamline control system.

    Science.gov (United States)

    Ohata, T; Konishi, H; Kimura, H; Furukawa, Y; Tamasaku, K; Nakatani, T; Tanabe, T; Matsumoto, N; Ishii, M; Ishikawa, T

    1998-05-01

    The SPring-8 beamline control system is now taking part in the control of the insertion device (ID), front end, beam transportation channel and all interlock systems of the beamline: it will supply a highly standardized environment of apparatus control for collaborative researchers. In particular, ID operation is very important in a third-generation synchrotron light source facility. It is also very important to consider the security system because the ID is part of the storage ring and is therefore governed by the synchrotron ring control system. The progress of computer networking systems and the technology of security control require the development of a highly flexible control system. An interlock system that is independent of the control system has increased the reliability. For the beamline control system the so-called standard model concept has been adopted. VME-bus (VME) is used as the front-end control system and a UNIX workstation as the operator console. CPU boards of the VME-bus are RISC processor-based board computers operated by a LynxOS-based HP-RT real-time operating system. The workstation and the VME are linked to each other by a network, and form the distributed system. The HP 9000/700 series with HP-UX and the HP 9000/743rt series with HP-RT are used. All the controllable apparatus may be operated from any workstation.

  18. Human factors challenges for advanced process control

    International Nuclear Information System (INIS)

    Stubler, W.F.; O'Hara, J.M.

    1996-01-01

    New human-system interface technologies provide opportunities for improving operator and plant performance. However, if these technologies are not properly implemented, they may introduce new challenges to performance and safety. This paper reports the results from a survey of human factors considerations that arise in the implementation of advanced human-system interface technologies in process control and other complex systems. General trends were identified for several areas based on a review of technical literature and a combination of interviews and site visits with process control organizations. Human factors considerations are discussed for two of these areas, automation and controls

  19. Flex & Bison

    CERN Document Server

    Levine, John

    2009-01-01

    If you need to parse or process text data in Linux or Unix, this useful book explains how to use flex and bison to solve your problems quickly. flex & bison is the long-awaited sequel to the classic O'Reilly book, lex & yacc. In the nearly two decades since the original book was published, the flex and bison utilities have proven to be more reliable and more powerful than the original Unix tools. flex & bison covers the same core functionality vital to Linux and Unix program development, along with several important new topics. You'll find revised tutorials for novices and references for ad

  20. Monitoring a PVC batch process with multivariate statistical process control charts

    NARCIS (Netherlands)

    Tates, A. A.; Louwerse, D. J.; Smilde, A. K.; Koot, G. L. M.; Berndt, H.

    1999-01-01

    Multivariate statistical process control charts (MSPC charts) are developed for the industrial batch production process of poly(vinyl chloride) (PVC). With these MSPC charts different types of abnormal batch behavior were detected on-line. With batch contribution plots, the probable causes of these

  1. High-volume manufacturing device overlay process control

    Science.gov (United States)

    Lee, Honggoo; Han, Sangjun; Woo, Jaeson; Lee, DongYoung; Song, ChangRock; Heo, Hoyoung; Brinster, Irina; Choi, DongSub; Robinson, John C.

    2017-03-01

    Overlay control based on DI metrology of optical targets has been the primary basis for run-to-run process control for many years. In previous work we described a scenario where optical overlay metrology is performed on metrology targets on a high-frequency basis, including every lot (or most lots) at DI. SEM-based FI metrology is performed on-device, in-die, as-etched on an infrequent basis. Hybrid control schemes of this type have been in use for many process nodes. What is new is the relative size of the NZO as compared to the overlay spec, and the need to find more comprehensive solutions to characterize and control the size and variability of NZO at the 1x nm node: sampling, modeling, temporal frequency and control aspects, as well as trade-offs between SEM throughput and accuracy.
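Run-to-run overlay control of the kind referred to here is commonly implemented as an EWMA update of a per-lot correction. The sketch below is a generic textbook formulation, not the paper's controller; the unit process gain, the lambda value, and all names are assumptions:

```python
class EwmaR2R:
    """EWMA run-to-run controller for a unit-gain process y = a + u + noise.

    a_hat is the filtered estimate of the process disturbance; u is the
    correction applied to the next lot to drive y toward the target.
    """
    def __init__(self, lam=0.3, target=0.0):
        self.lam = lam          # EWMA weight on the newest lot
        self.target = target
        self.a_hat = 0.0
        self.u = 0.0

    def update(self, y_measured):
        # Back out the disturbance seen on this lot, then EWMA-filter it.
        self.a_hat = (self.lam * (y_measured - self.u)
                      + (1 - self.lam) * self.a_hat)
        self.u = self.target - self.a_hat  # correction for the next lot
        return self.u
```

With per-lot DI measurements feeding `update`, a constant disturbance is rejected geometrically; the infrequent SEM FI data described in the abstract would then be used to correct the bias (NZO) that the optical targets cannot see.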

  2. Fuzzy Coordinated PI Controller: Application to the Real-Time Pressure Control Process

    Directory of Open Access Journals (Sweden)

    N. Kanagaraj

    2008-01-01

    This paper presents the real-time implementation of a fuzzy coordinated classical PI control scheme for controlling the pressure in a pilot pressure tank system. The fuzzy system has been designed to track the parameter variations in a feedback loop and tune the classical controller to achieve better control action under load disturbances and set-point changes. The error and process inputs are chosen as the inputs of the fuzzy system to tune the conventional PI controller according to the process condition. This online conventional controller tuning technique reduces human involvement in controller tuning and increases the operating range of the conventional controller. The proposed control algorithm is experimentally implemented for the real-time pressure control of a pilot air tank system and validated using a high-speed 32-bit ARM7 embedded microcontroller board (ATMEL AT91M55800A). To demonstrate the performance of the fuzzy coordinated PI control scheme, results are compared with a classical PI and a PI-type fuzzy control method. It is observed that the proposed controller structure is able to quickly track parameter variations and performs better under load disturbances and set-point changes.
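The idea of a fuzzy supervisor retuning a conventional PI controller online can be illustrated with a single-rule sketch. This is not the paper's rule base: the membership breakpoint, gain ranges, and the "large error" rule are all invented for the example:

```python
class FuzzyTunedPI:
    """PI controller whose gains are scheduled by a fuzzy rule on |error|:
    large errors favor proportional action, small errors favor integral
    action.  Breakpoints and gain ranges are illustrative."""
    def __init__(self, kp=(1.0, 2.0), ki=(0.1, 0.5), e_max=10.0):
        self.kp_lo, self.kp_hi = kp
        self.ki_lo, self.ki_hi = ki
        self.e_max = e_max      # error at which "large" is fully true
        self.integral = 0.0

    def update(self, error, dt):
        # Membership of the fuzzy set "error is large", clipped to [0, 1].
        mu_large = min(abs(error) / self.e_max, 1.0)
        kp = self.kp_lo + mu_large * (self.kp_hi - self.kp_lo)
        ki = self.ki_hi - mu_large * (self.ki_hi - self.ki_lo)
        self.integral += error * dt
        return kp * error + ki * self.integral
```

The underlying loop stays a plain PI law, which is the point of the scheme: the fuzzy layer only moves the gains, so the controller degrades gracefully to a classical PI when the error is small.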

  3. Applying interactive control to waste processing operations

    International Nuclear Information System (INIS)

    Grasz, E.L.; Merrill, R.D.; Couture, S.A.

    1992-08-01

    At present, waste and residue processing includes steps that require human interaction. The risk of exposure to unknown hazardous materials and the potential for radiation contamination motivate the desire to remove operators from these processes. Technologies that facilitate this include glove box robotics, modular systems for remote and automated servicing, and interactive controls that minimize human intervention. LLNL is developing an automated system which is designed to supplant the operator for glove box tasks, thus protecting the operator from the risk of radiation exposure and minimizing operator-associated waste. Although most of the processing can be automated with minimal human interaction, there are some tasks where intelligent intervention is both desirable and necessary to adapt to unexpected circumstances and events. These activities require that the operator interact with the process using a remote manipulator which provides or reflects a natural feel to the operator. The remote manipulation system which was developed incorporates sensor fusion and interactive control, and provides the operator with an effective means of controlling the robot in a potentially unknown environment. This paper describes recent accomplishments in technology development and integration, and outlines the future goals of Lawrence Livermore National Laboratory for achieving this integrated interactive control capability

  4. Internal control in the management system of meat processing enterprises

    Directory of Open Access Journals (Sweden)

    Volodymyr Kushnir

    2018-03-01

    The article describes the theoretical basis of internal control and its practical aspects in the work of meat processing enterprises (a case from the meat processing industry in Ukraine). The purpose of the research is to establish the theoretical foundations of internal control and its improvement in the activity of meat processing plants of various forms of management. It is proposed to use precisely the term internal control among other names for domestic control. The definition of internal control, its subject and purpose are improved. The subjects and objects of internal control are determined, and the principles of its implementation are supplemented. Specific control tasks in meat processing plants are outlined according to the needs of this industry. Specific examples of control subjects are presented, and the role of the revision commission is emphasized. The state of internal control in meat processing plants in Ukraine is investigated, and it is established that it is in poor condition, with managers of meat processing enterprises taking an unfounded approach to its implementation. To improve the situation, we recommend that each meat processing enterprise have on its staff a revision commission or an auditor. It is established that internal control is more effective in joint-stock companies than in limited liability companies. The necessity of internal control as an important element of the enterprise management system is emphasized.

  5. Data-based control of a multi-step forming process

    Science.gov (United States)

    Schulte, R.; Frey, P.; Hildenbrand, P.; Vogel, M.; Betz, C.; Lechner, M.; Merklein, M.

    2017-09-01

    The fourth industrial revolution represents a new stage in the organization and management of the entire value chain. Concerning the field of forming technology, however, the fourth industrial revolution has so far arrived only gradually. In order to make a valuable contribution to the digital factory, the control of a multistage forming process was investigated. Within the framework of the investigation, an abstracted and transferable model is used to outline which data have to be collected, how an interface between the different forming machines can be designed in a tangible way, and which control tasks must be fulfilled. The goal of this investigation was to control the subsequent process step based on the data recorded in the first step. The investigated process chain links various metal forming processes, which are typical elements of a multi-step forming process. Data recorded in the first step of the process chain are analyzed and processed for improved process control of the subsequent process. On the basis of the scientific knowledge gained, it is possible to make forming operations more robust and at the same time more flexible, and thus create the foundation for linking various production processes in an efficient way.

  6. Systematic Integrated Process Design and Control of Binary Element Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted

    2016-01-01

    In this work, integrated process design and control of reactive distillation processes is considered through a computer-aided framework. First, a set of simple design methods for reactive distillation columns that are similar in concept to non-reactive distillation design methods are extended to design-control of reactive distillation columns. These methods are based on the element concept, where the reacting system of compounds is represented as elements. When only two elements are needed to represent the reacting system of more than two compounds, a binary element system is identified. It is shown that the same design-control principles that apply to a non-reacting binary system of compounds are also valid for a reactive binary system of elements for distillation columns. Application of this framework shows that designing the reactive distillation process at the maximum driving force

  7. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    Science.gov (United States)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the tasks of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level and qualitative descriptions of processes and thus make the process behavior easy to monitor, predict and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages, G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).

  8. Instrumentation and control for fossil-energy processes

    Energy Technology Data Exchange (ETDEWEB)

    1982-09-01

    The 1982 symposium on instrumentation and control for fossil energy processes was held June 7 through 9, 1982, at Adam's Mark Hotel, Houston, Texas. It was sponsored by the US Department of Energy, Office of Fossil Energy; Argonne National Laboratory; and the Society for Control and Instrumentation of Energy Processes. Fifty-two papers have been entered individually into EDB and ERA; eleven papers had been entered previously from other sources. (LTN)

  9. Iterative Controller Tuning for Process with Fold Bifurcations

    DEFF Research Database (Denmark)

    Huusom, Jakob Kjøbsted; Poulsen, Niels Kjølstad; Jørgensen, Sten Bay

    2007-01-01

    Processes involving fold bifurcations are notoriously difficult to control in the vicinity of the fold, where optimal productivity is most often achieved. In cases with limited process insight, a model-based control synthesis is not possible. This paper uses a data-driven approach with an improved version of iterative feedback tuning to optimize a closed-loop performance criterion, as a systematic tool for tuning processes with fold bifurcations.

  10. Biomolecular Modeling in a Process Dynamics and Control Course

    Science.gov (United States)

    Gray, Jeffrey J.

    2006-01-01

    I present modifications to the traditional course entitled "Process dynamics and control," which I renamed "Modeling, dynamics, and control of chemical and biological processes." Additions include the central dogma of biology, pharmacokinetic systems, population balances, control of gene transcription, and large-scale…

  11. Development of an intelligent CAI system for a distributed processing environment

    International Nuclear Information System (INIS)

    Fujii, M.; Sasaki, K.; Ohi, T.; Itoh, T.

    1993-01-01

    In order to operate a nuclear power plant optimally in both normal and abnormal situations, the operators are trained using an operator training simulator in addition to classroom instruction. Individual instruction using a CAI (Computer-Assisted Instruction) system has become popular as a method of learning plant information, such as plant dynamics, operational procedures, plant systems, plant facilities, etc. An outline is given of a proposed network-based intelligent CAI system (ICAI) incorporating multimedia PWR plant dynamics simulation, teaching aids and educational record management, using the following environment: existing standard workstations and graphic workstations with a live video processing function, the TCP/IP protocol of Unix through Ethernet, and the X window system. (Z.S.) 3 figs., 2 refs

  12. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    Science.gov (United States)

    Fournelle, John; Carpenter, Paul

    2006-01-01

    Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from an exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.

  13. Automatic process control in anaerobic digestion technology: A critical review.

    Science.gov (United States)

    Nguyen, Duc; Gadhamshetty, Venkataramana; Nitayavardhana, Saoharit; Khanal, Samir Kumar

    2015-10-01

    Anaerobic digestion (AD) is a mature technology that relies upon the synergistic effort of a diverse group of microbial communities to metabolize diverse organic substrates. However, AD is highly sensitive to process disturbances, and thus it is advantageous to use online monitoring and process control techniques to operate the AD process efficiently. A range of electrochemical, chromatographic and spectroscopic devices can be deployed for on-line monitoring and control of the AD process. While the complexity of the control strategy ranges from simple feedback control to advanced control systems, there is still debate over implementing advanced instrumentation and advanced control strategies. Centralized AD plants could be the setting in which advanced automatic control finds its application. This article provides a critical overview of the available automatic control technologies that can be implemented in AD processes at different scales. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Process control measurements in the SRP fuel separations plants

    International Nuclear Information System (INIS)

    McKibben, J.M.; Pickett, C.E.; Dickert, H.D.

    1982-02-01

    Programs were started to develop new in-line and at-line analytical techniques. Among the more promising techniques being investigated are: (1) an in-line instrument to analyze for percent tributyl phosphate in process solvent, (2) remote laser optrode techniques (using laser light transmitted to and from the sample cell via light pipes) for a variety of possible analyses, and (3) sonic techniques for concentration analyses in two-component systems. A subcommittee was also formed to investigate the state of the technology for process control. The final recommendation was to use a distributed control approach to upgrade the process control system. The system selected should be modular, easy to expand, and simple to change control strategies. A distributed system using microprocessor-based controllers would allow installation of the control intelligence near the process, thereby simplifying field wiring. Process information collected and stored in the controllers will be transmitted to operating consoles, via a data highway, for process management and display. The overall program has a number of distinct benefits. There are a number of cost savings that will be realized. Excellent annual return on investment - up to 110% - has been predicted for several of the projects in this program that are already funded. In addition, many of the instrument modifications will improve safety performance and production throughput in specific ways.

  15. Creys-Malville control room and data processing

    International Nuclear Information System (INIS)

    Decuyper, J.

    1984-01-01

    After a brief definition of plant control, this article presents the Creys-Malville control room: its control and display facilities, ergonomic considerations, and its specific features with respect to a PWR control room. The Creys-Malville data processing is then presented briefly, with a short description, the different data treatments, and the specific features of the centralised data computer [fr

  16. Using Statistical Process Control to Enhance Student Progression

    Science.gov (United States)

    Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

    2012-01-01

    Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

  17. 10 CFR 72.158 - Control of special processes.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Control of special processes. 72.158 Section 72.158 Energy... NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Quality Assurance § 72.158 Control of special processes. The licensee, applicant for a license, certificate holder...

  18. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    Science.gov (United States)

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  19. 77 FR 22707 - Electronic Reporting Under the Toxic Substances Control Act

    Science.gov (United States)

    2012-04-17

    ... to the Agency. The tool is available for use with Windows, Macs, Linux, and UNIX based computers... a fielded format, e.g., the Organisation for Economic Co-operation and Development (OECD) harmonized...

  20. In-process and post-process measurements of drill wear for control of the drilling process

    Science.gov (United States)

    Liu, Tien-I.; Liu, George; Gao, Zhiyu

    2011-12-01

    Optical inspection was used in this research for the post-process measurement of drill wear; a precision toolmakers' microscope was used. An indirect index, the cutting force, is used for in-process drill wear measurement. Using in-process measurements to estimate drill wear for control purposes can decrease the operating cost and enhance product quality and safety. The challenge is to correlate the in-process cutting force measurements with the post-process optical inspection of drill wear. To find the most important feature, the energy principle was used in this research: it is necessary to select only the cutting force feature which shows the highest sensitivity to drill wear. The best feature selected is the peak of torque in the drilling process. Neuro-fuzzy systems were used for correlation purposes. The Adaptive-Network-Based Fuzzy Inference System (ANFIS) can construct fuzzy rules with membership functions to generate an input-output pair. A 1x6 ANFIS architecture with products of sigmoid membership functions can measure the drill wear in-process with an error as low as 0.15%. This is extremely important for control of the drilling process. Furthermore, the measurement of drill wear was performed under different drilling conditions. This shows that ANFIS has the capability of generalization.
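
    The fuzzy-inference idea can be illustrated with a minimal zeroth-order Sugeno-style sketch: sigmoid membership functions fire rules whose weighted average gives a wear estimate. This is not the authors' trained 1x6 network; all rule parameters below are invented for illustration.

```python
import math

def sigmoid(x, a, c):
    """Sigmoidal membership function, as in the first layer of an ANFIS."""
    return 1.0 / (1.0 + math.exp(-a * (x - c)))

def fuzzy_wear(torque, rules):
    """Zeroth-order Sugeno inference: weighted average of rule outputs.
    Each rule is (a, c, wear): firing strength = sigmoid(torque; a, c)."""
    weights = [sigmoid(torque, a, c) for a, c, _ in rules]
    total = sum(weights)
    return sum(w * wear for w, (_, _, wear) in zip(weights, rules)) / total

# Invented, untrained rule set: higher peak torque in the drilling cycle
# maps to a higher estimated flank wear (mm).
rules = [(-2.0, 5.0, 0.05), (2.0, 5.0, 0.15), (2.0, 8.0, 0.30)]
```

    In a real ANFIS the membership parameters and rule outputs would be fitted to measured torque/wear pairs rather than chosen by hand.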

  1. Advanced Control Synthesis for Reverse Osmosis Water Desalination Processes.

    Science.gov (United States)

    Phuc, Bui Duc Hong; You, Sam-Sang; Choi, Hyeung-Six; Jeong, Seok-Kwon

    2017-11-01

    In this study, robust control synthesis has been applied to a reverse osmosis desalination plant whose product water flow and salinity are chosen as the two controlled variables. The reverse osmosis process was selected for study because it typically uses less energy than thermal distillation. The aim of the robust design is to overcome the limitations of classical controllers in dealing with large parametric uncertainties, external disturbances, sensor noise, and unmodeled process dynamics. The analyzed desalination process is modeled as a multi-input multi-output (MIMO) system with varying parameters. The control system is decoupled using a feed-forward decoupling method to reduce the interactions between control channels. Both the nominal and the perturbed reverse osmosis systems have been analyzed using structured singular values for their stability and performance. Simulation results show that the system responses meet all the control requirements against various uncertainties. Finally, the reduced-order controller provides excellent robust performance, achieving decoupling, disturbance attenuation, and noise rejection. It can help to reduce membrane cleanings, increase robustness against uncertainties, and lower the energy consumption for process monitoring.
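
    One common form of feed-forward decoupling for a 2x2 plant like this is static decoupling: choose the decoupler D as the inverse of the steady-state gain matrix G, so that G·D is (ideally) the identity and each loop sees only its own channel. The gain values below are invented for illustration, not taken from the paper.

```python
def inverse_2x2(g):
    """Inverse of a 2x2 matrix given as nested lists."""
    (a, b), (c, d) = g
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matmul_2x2(x, y):
    """Product of two 2x2 matrices."""
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Invented steady-state gains: rows = (permeate flow, salinity),
# columns = (pump speed, pH-adjustment) inputs.
G = [[0.002, -0.04],
     [-5.0, 90.0]]
D = inverse_2x2(G)   # static decoupler
GD = matmul_2x2(G, D)  # should be close to the identity matrix
```

    Static decoupling only cancels interactions at steady state; the paper's robust synthesis additionally has to cope with dynamic coupling and parameter variation.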

  2. Development of Chemical Process Design and Control for Sustainability

    Directory of Open Access Journals (Sweden)

    Shuyun Li

    2016-07-01

    This contribution describes a novel process systems engineering framework that couples advanced control with sustainability evaluation for the optimization of process operations to minimize environmental impacts associated with products, materials and energy. The implemented control strategy combines a biologically-inspired method with optimal control concepts for finding more sustainable operating trajectories. The sustainability assessment of process operating points is carried out by using the U.S. EPA’s Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Process Evaluator (GREENSCOPE) tool, which provides scores for the selected indicators in the economic, material efficiency, environmental and energy areas. The indicator scores describe process performance on a sustainability measurement scale, effectively determining which operating point is more sustainable when more than one steady state exists for manufacturing a specific product. Through comparisons between a representative benchmark and the optimal steady states obtained through the implementation of the proposed controller, a systematic decision can be made in terms of whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous fermentation process for fuel production, whose material and energy time-variation models are characterized by multiple steady states and oscillatory conditions.

  3. Process control upgrades yield huge operational improvements

    International Nuclear Information System (INIS)

    Fitzgerald, W.V.

    2001-01-01

    Most nuclear plants in North America were designed and built in the late 1960s and 1970s. The regulatory nature of this industry has made design changes at the plant level difficult, if not impossible, to implement over the years. As a result, many plants in this region have been getting by on technology that is over 40 years behind the times, which means they have not been able to take advantage of the huge gains that have been made in process control during this period. Consequently, most of these plants are much less efficient and productive than they could be. One particular area of the plant that is receiving a lot of attention is the feedwater heaters. These systems were put in place to improve efficiency, but most are not operating correctly. This paper presents a case study in which one progressive mid-western utility decided that enough was enough and implemented a process control audit of its heater systems. The audit clearly pointed out the problems with the existing process control system. It resulted in a proposal for the implementation of a state-of-the-art distributed digital process control system for the heaters, along with a complete upgrade of the level controls and field devices, that will stabilize heater levels, resulting in significant efficiency gains and lower maintenance bills. Overall, the payback period for this investment should be less than 6 months, and the plant is now looking for more opportunities that can provide even bigger gains. (author)

  4. Optimal control of switched systems arising in fermentation processes

    CERN Document Server

    Liu, Chongyang

    2014-01-01

    The book presents, in a systematic manner, the optimal controls under different mathematical models of fermentation processes. Variant mathematical models – i.e., those for multistage systems; switched autonomous systems; time-dependent and state-dependent switched systems; multistage time-delay systems and switched time-delay systems – for fed-batch fermentation processes are proposed, and the theories and algorithms of their optimal control problems are studied and discussed. By putting forward novel methods and innovative tools, the book provides a state-of-the-art and comprehensive systematic treatment of optimal control problems arising in fermentation processes. It not only develops nonlinear dynamical systems theory, optimal control theory and optimization algorithms, but can also help to increase productivity and provides valuable reference material on commercial fermentation processes.

  5. Control system design specification of advanced spent fuel management process units

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, S. H.; Kim, S. H.; Yoon, J. S

    2003-06-01

    In this study, the design specifications of the instrumentation and control system for advanced spent fuel management process units are presented. The advanced spent fuel management process consists of several process units, such as the slitting device, dry pulverizing/mixing device, metallizer, etc. In this study, the control and operation characteristics of the advanced spent fuel management mockup process devices and the process devices developed in 2001 and 2002 are analysed. Also, an integrated processing system for the unit process control signals is proposed, which improves operation efficiency, and a redundant PLC control system is constructed, which improves reliability. A control scheme is proposed for time-delayed systems that compensates for the control performance degradation caused by time delay. The control system design specification is presented for the advanced spent fuel management process units. These design specifications can be used effectively for the detailed design of the advanced spent fuel management process.
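
    The abstract does not spell out the time-delay compensation scheme; one classical option for this kind of problem is a Smith predictor, in which a PI controller acts on a delay-free internal model and only the model-mismatch signal carries the dead time. The sketch below assumes a first-order plant with input dead time; all gains and plant numbers are invented.

```python
from collections import deque

def smith_control(setpoint, steps=400, dt=1.0):
    gain, tau, L = 1.0, 10.0, 8        # assumed plant gain, time constant, dead time (steps)
    kp, ki = 3.0, 0.4                  # assumed PI gains
    y = ym = integral = 0.0            # plant output, delay-free model output
    u_hist = deque([0.0] * L, maxlen=L)   # control inputs still "in the pipe"
    ym_hist = deque([0.0] * L, maxlen=L)  # delayed copy of the model output
    for _ in range(steps):
        # Smith feedback: delay-free model plus measured model mismatch
        feedback = y + ym - ym_hist[0]
        err = setpoint - feedback
        integral += err * dt
        u = kp * err + ki * integral
        u_delayed = u_hist[0]
        u_hist.append(u)
        ym += dt * (gain * u - ym) / tau        # internal model, no delay
        ym_hist.append(ym)
        y += dt * (gain * u_delayed - y) / tau  # plant sees input L steps late
    return y
```

    With a perfect model the mismatch term cancels, the feedback signal reduces to the delay-free model output, and the PI loop behaves as if no dead time were present.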

  6. FLUKA-LIVE-an embedded framework, for enabling a computer to execute FLUKA under the control of a Linux OS

    International Nuclear Information System (INIS)

    Cohen, A.; Battistoni, G.; Mark, S.

    2008-01-01

    This paper describes a Linux-based OS framework for integrating the FLUKA Monte Carlo software (currently distributed only for Linux) into a CD-ROM, resulting in a complete environment in which a scientist can edit, link and run FLUKA routines without the need to install a UNIX/Linux operating system. The building process includes generating from scratch a complete operating system distribution which, when operative, builds all components necessary for successful operation of the FLUKA software and libraries. Various source packages, as well as the latest kernel sources, are freely available from the Internet. These sources are used to create a functioning Linux system that integrates several core utilities in line with the main idea: enabling FLUKA to act as if it were running under a popular Linux distribution or even a proprietary UNIX workstation. On boot-up a file system is created and the contents of the CD are uncompressed and completely loaded into RAM, after which the presence of the CD is no longer necessary, and it can be removed and used on a second computer. The system can operate on any i386 PC as long as it can boot from a CD

  7. Modular and Adaptive Control of Sound Processing

    Science.gov (United States)

    van Nort, Douglas

    This dissertation presents research into the creation of systems for the control of sound synthesis and processing. The focus differs from much of the work related to digital musical instrument design, which has rightly concentrated on the physicality of the instrument and interface: sensor design, choice of controller, feedback to the performer and so on. Oftentimes a particular choice of sound processing is made, and the resultant parameters from the physical interface are conditioned and mapped to the available sound parameters in an exploratory fashion. The main goal of the work presented here is to demonstrate the importance of the space that lies between physical interface design and the choice of sound manipulation algorithm, and to present a new framework for instrument design that strongly considers this essential part of the design process. In particular, this research takes the viewpoint that instrument designs should be considered in a musical control context, and that both control and sound dynamics must be considered in tandem. In order to achieve this holistic approach, the work presented in this dissertation assumes complementary points of view. Instrument design is first seen as a function of musical context, focusing on electroacoustic music and leading to a view of gesture that relates perceived musical intent to the dynamics of an instrumental system. The important design concept of mapping is then discussed from a theoretical and conceptual point of view, relating perceptual, systems- and mathematically-oriented ways of examining the subject. This theoretical framework gives rise to a mapping design space, functional analysis of the pertinent existing literature, implementations of mapping tools, instrumental control designs and several perceptual studies that explore the influence of mapping structure. Each of these reflects a high-level approach in which control structures are imposed on top of a high-dimensional space of control and sound synthesis

  8. Statistical process control for electron beam monitoring.

    Science.gov (United States)

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess statistical process control (SPC) of electron beam monitoring in linear accelerator (linac) daily quality control, we present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
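
    The quantities the abstract refers to (control limits on a daily record and a capability ratio at a 2% specification) can be sketched as follows. The moving-range estimate with d2 = 1.128 is the standard individuals-chart convention; the daily dose deviations below are invented, not the paper's data.

```python
def individuals_limits(x):
    """Shewhart individuals (X) chart limits from the average moving range."""
    n = len(x)
    mean = sum(x) / n
    mr = [abs(x[i] - x[i - 1]) for i in range(1, n)]
    sigma_hat = (sum(mr) / len(mr)) / 1.128   # d2 constant for subgroups of 2
    return mean - 3 * sigma_hat, mean, mean + 3 * sigma_hat

def capability_ratio(x, spec):
    """Process capability Cp for symmetric specifications at +/- spec."""
    n = len(x)
    mean = sum(x) / n
    sigma = (sum((v - mean) ** 2 for v in x) / (n - 1)) ** 0.5
    return 2 * spec / (6 * sigma)

daily_output = [0.3, 0.5, 0.1, 0.4, 0.2, 0.6, 0.3, 0.4]  # invented % deviations
lcl, centre, ucl = individuals_limits(daily_output)
cp = capability_ratio(daily_output, spec=2.0)            # 2% specification level
print(f"LCL={lcl:.2f}  centre={centre:.2f}  UCL={ucl:.2f}  Cp={cp:.2f}")
```

    A point outside (LCL, UCL) signals an out-of-control measurement; Cp well above 1 indicates the process variability fits comfortably inside the specification, as in the paper's reported range of 1.6 to 9.3.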

  9. Employing expert systems for process control

    International Nuclear Information System (INIS)

    Ahrens, W.

    1987-01-01

    The characteristic features of expert systems are explained in detail, together with the systems' application in process control engineering. There are four main fields of interest, namely: applications for diagnostic tasks, for safety analyses, for planning, and for training of experts. For the modelling of the technical systems involved in all four task fields mentioned above, an object-centred approach has proven to be a suitable method, as process control techniques are determined by technical objects that in principle are specified by data sheets, schematic representations, flow charts, and plans. The graphical interface allows these data to be taken into account, so that each object can be displayed in the way best suited to the individual purpose. (orig./GL) [de

  10. Applying Statistical Process Control to Clinical Data: An Illustration.

    Science.gov (United States)

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…

  11. Using Paper Helicopters to Teach Statistical Process Control

    Science.gov (United States)

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  12. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    Science.gov (United States)

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptations of production reactors towards the state of the art were more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing: the cultivation of genetically modified Escherichia coli bacteria.

  13. Novel strategies for control of fermentation processes

    DEFF Research Database (Denmark)

    Mears, Lisa

    to highly optimised industrial host strains. The focus of this project is instead on engineering of the process. The question to be answered in this thesis is, given a highly optimised industrial host strain, how can we operate the fermentation process in order to maximise the productivity of the system...... (2012). This model describes the fungal processes operated in the fermentation pilot plant at Novozymes A/S. This model is investigated using uncertainty analysis methods in order to assess the applicability to control applications. A mechanistic model approach is desirable, as it is a predictive....... This provides a prediction of the future trajectory of the process, so that it is possible to guide the system to the desired target mass. The control strategy is applied on-line at 550L scale in the Novozymes A/S fermentation pilot plant, and the method is challenged with four different sets of process

  14. Training change control process at Cernavoda NPP

    International Nuclear Information System (INIS)

    Valache, Cornelia Mariana

    2005-01-01

    The paper presents the process of 'Training Change Control' at Cernavoda NPP. This process is a systematic approach that allows determination of the most effective training and/or non-training solutions for challenges that may influence the content and conditions of a training program or course. Changes may be the result of: - response to station systems or equipment modifications; - new or revised procedures; - regulatory requirements; - external organizations' requirements; - internal evaluations, meaning feedback from trainees, trainers, management or post-training evaluations; - self-assessments; - station condition reports; - operating experience (OPEX); - modifications of job scope; - management input. The Training Change Control process at Cernavoda NPP includes the following aspects. The first step is the identification of all the initiating factors for a potential training change. Then only those that could have an impact on training are retained and classified into two categories: deficiencies or enhancement suggestions. The process is different for the two categories. The deficiency category involves applying the Training Needs Analysis (TNA) process. This is a performance-oriented process, resulting in more competent employees and solving existing and potential performance problems. By using needs analysis to systematically determine what people or courses and programs are expected to do, and by gathering data to reveal what they are really doing, we can obtain a clear picture of the problem and then establish corrective action plans to fix it. The process is supported by plant subject matter experts and by training specialists. On the other hand, enhancement suggestions are assessed by designated experienced persons and then implemented in the training process. Regarding these two types of initiating factors for the training change control process, the final result consists of a training improvement, raising the effectiveness, efficiency or

  15. A process control software package for the SRS

    International Nuclear Information System (INIS)

    Atkins, V.R.; Poole, D.E.; Rawlinson, W.R.

    1980-03-01

    The development of software to give high-level access from application programs for monitoring and control of the Daresbury Synchrotron Radiation Source on a network-wide basis is described. The design and implementation of the control system database, a special supervisor call and 'executive'-type task handling of all process input/output services for the 7/32 (which runs under OS/32-MT), and process control 'device driver' software for the 7/16 (run under OS/16-MT) are included. (UK)

  16. Positive affect improves working memory: implications for controlled cognitive processing.

    Science.gov (United States)

    Yang, Hwajin; Yang, Sujin; Isen, Alice M

    2013-01-01

    This study examined the effects of positive affect on working memory (WM) and short-term memory (STM). Given that WM involves both storage and controlled processing and that STM primarily involves storage processing, we hypothesised that if positive affect facilitates controlled processing, it should improve WM more than STM. The results demonstrated that positive affect, compared with neutral affect, significantly enhanced WM, as measured by the operation span task. The influence of positive affect on STM, however, was weaker. These results suggest that positive affect enhances WM, a task that involves controlled processing, not just storage processing. Additional analyses of recall and processing times and accuracy further suggest that improved WM under positive affect is not attributable to motivational differences, but results instead from improved controlled cognitive processing.

  17. Interfacing industrial process control systems to LEP/LHC

    International Nuclear Information System (INIS)

    Rabany, M.

    1992-01-01

    Modern industrial process control systems have developed to meet the needs of industry to increase production while decreasing costs. Although particle accelerator designers pioneered control systems during the seventies, it is now possible for them to profit from industrial solutions as substitutes for, or complements to, the more traditional home-made ones. Adapting and integrating such industrial systems into the accelerator control area will certainly benefit the field in terms of finance, human resources and the technical facilities offered off-the-shelf by the widely experienced industrial controls community; however, this cannot be done without slightly affecting the overall accelerator control architecture. The paper briefly describes the industrial controls arena and takes as an example an industrial process control system recently installed at CERN to discuss the related choices and issues in detail. (author)

  18. Process control analysis of IMRT QA: implications for clinical trials

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Rice, Roger K; Yoo, Sua; Court, Laurence E; McMillan, Sharon K; Russell, J Donald; Pacyniak, John M; Woo, Milton K; Basran, Parminder S; Boyer, Arthur L; Bonilla, Claribel

    2008-01-01

    The purpose of this study is two-fold: first, to investigate the process of IMRT QA using control charts and, second, to compare control chart limits to limits calculated using the standard deviation (σ). Head and neck and prostate IMRT QA cases from seven institutions in both academic and community settings are considered. The percent difference between the point dose measurement in phantom and the corresponding result from the treatment planning system (TPS) is used for analysis. The average of the percent difference calculations defines the accuracy of the process and is called the process target. This represents the degree to which the process meets the clinical goal of 0% difference between the measurements and the TPS. IMRT QA process ability defines the ability of the process to meet clinical specifications (e.g. 5% difference between the measurement and TPS). The process ability is defined in two ways: (1) the half-width of the control chart limits, and (2) the half-width of the ±3σ limits. Process performance is characterized as being in one of four possible states that describe the stability of the process and its ability to meet clinical specifications. For the head and neck cases, the average process target across institutions was 0.3% (range: -1.5% to 2.9%). The average process ability using control chart limits was 7.2% (range: 5.3% to 9.8%) compared to 6.7% (range: 5.3% to 8.2%) using standard deviation limits. For the prostate cases, the average process target across the institutions was 0.2% (range: -1.8% to 1.4%). The average process ability using control chart limits was 4.4% (range: 1.3% to 9.4%) compared to 5.3% (range: 2.3% to 9.8%) using standard deviation limits. Using the standard deviation to characterize IMRT QA process performance resulted in processes being preferentially placed in one of the four states. This is in contrast to using control charts for process characterization, where the IMRT QA processes were spread over three of the
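
    The four-state characterization can be sketched as a simple cross of stability (is the process in statistical control?) against capability (does the process ability fit within the clinical specification, e.g. 5%?). The state labels below are generic descriptions, not the paper's terminology.

```python
def qa_state(in_control, process_ability, spec=5.0):
    """Cross stability with capability to get one of four process states.
    `process_ability` is a half-width in percent (e.g. 3*sigma or the
    half-width of the control-chart limits); `spec` is the clinical limit."""
    capable = process_ability <= spec
    if in_control and capable:
        return "in control, capable"
    if in_control and not capable:
        return "in control, not capable"
    if not in_control and capable:
        return "out of control, capable"
    return "out of control, not capable"

# e.g. a stable head-and-neck process whose 7.2% ability exceeds a 5% spec:
state = qa_state(in_control=True, process_ability=7.2)
```

    The choice of ability definition (control-chart half-width versus 3σ half-width) can move a process between states, which is the comparison the study makes.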

  19. The application of mean control chart in managing industrial processes

    Directory of Open Access Journals (Sweden)

    Papić-Blagojević Nataša

    2013-01-01

    Along with the advent of mass production came the problem of monitoring and maintaining product quality, which stressed the need to apply selected statistical and mathematical methods in the control process. The main objective of applying the methods of statistical control is continuous quality improvement through permanent monitoring of the process in order to discover the causes of errors. Shewhart charts are the most popular method of statistical process control; they separate controlled from uncontrolled variation and detect increased variation. This paper presents an example of the Shewhart mean control chart with an application to managing an industrial process.
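
    A minimal sketch of the Shewhart mean (x-bar) chart computation, using the conventional range method with tabulated A2 factors. The subgroup measurements are invented; the last subgroup is deliberately shifted so that its mean falls outside the limits.

```python
A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}  # standard tabulated constants

def xbar_chart(subgroups):
    """Return (LCL, grand mean, UCL, indices of out-of-control subgroups)."""
    n = len(subgroups[0])                        # subgroup size
    means = [sum(s) / n for s in subgroups]
    ranges = [max(s) - min(s) for s in subgroups]
    grand_mean = sum(means) / len(means)
    r_bar = sum(ranges) / len(ranges)
    lcl = grand_mean - A2[n] * r_bar
    ucl = grand_mean + A2[n] * r_bar
    out = [i for i, m in enumerate(means) if not (lcl <= m <= ucl)]
    return lcl, grand_mean, ucl, out

# Invented measurements, four subgroups of size 4; subgroup 3 is shifted.
data = [[10, 11, 9, 10], [10, 12, 10, 10], [9, 10, 11, 10], [15, 16, 15, 14]]
lcl, gm, ucl, out = xbar_chart(data)
```

    In practice the limits would be established from an in-control reference period and only then used to judge new subgroups; here the shifted subgroup is included purely to show a signal.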

  20. Process controller for induction vacuum brazing

    International Nuclear Information System (INIS)

    Aldea, A.

    2016-01-01

    A brazing operation involves joining two parts made of different materials, using a filler material whose melting temperature is lower than that of the base materials. The temperature of the process must be carefully controlled, sometimes to an accuracy of about 1°C, because overshooting the prescribed temperature results in detrimental metallurgical phenomena and joints of poor quality. The brazing system is composed of an operating cabinet, a mid-frequency generator, a vacuum chamber with an induction coil inside, and the parts to be brazed. Until now, operating this system required two operators: one to continuously read the temperature with an optical pyrometer and another to manually adjust the current in the induction coil according to intuition and prediction gained only by experience. The improvement we made to the system was an automatic temperature control unit: a closed-loop PID controller that reads the temperature of the parts and automatically adjusts the current in the coil. Using the PID controller, the brazing engineer can implement a certain temperature slope for the current brazing process. (authors)
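
    The closed loop described above can be sketched as a discrete PID controller driving a toy thermal plant. Everything here is invented for illustration (gains, plant constants, the 700°C target); it is not the authors' actual controller, only a minimal sketch of the technique.

```python
# Minimal discrete PID temperature loop: the controller reads the part
# temperature and adjusts the coil current command (0-100%).
# Gains and the first-order plant model are invented for illustration.

def pid_step(setpoint, measured, state, kp=8.0, ki=0.4, kd=1.0, dt=0.1):
    error = setpoint - measured
    # anti-windup: clamp the integral so ki * integral stays within actuator range
    state["integral"] = max(-250.0, min(250.0, state["integral"] + error * dt))
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    out = kp * error + ki * state["integral"] + kd * derivative
    return max(0.0, min(100.0, out))  # clamp coil current command (%)

# Toy plant: temperature rises with coil current, loses heat to ambient.
temp, ambient = 20.0, 20.0
state = {"integral": 0.0, "prev_error": 0.0}
for _ in range(2000):
    current = pid_step(700.0, temp, state)  # hypothetical braze target, deg C
    temp += (0.5 * current - 0.05 * (temp - ambient)) * 0.1

print("final temperature: %.1f C" % temp)
```

    A ramped setpoint (the "temperature slope" mentioned in the abstract) would simply replace the constant 700.0 with a time-varying profile.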

  1. Development of COMPAS, computer aided process flowsheet design and analysis system of nuclear fuel reprocessing

    International Nuclear Information System (INIS)

    Homma, Shunji; Sakamoto, Susumu; Takanashi, Mitsuhiro; Nammo, Akihiko; Satoh, Yoshihiro; Soejima, Takayuki; Koga, Jiro; Matsumoto, Shiro

    1995-01-01

    A computer aided process flowsheet design and analysis system, COMPAS, has been developed in order to carry out flowsheet calculations on the process flow diagram of nuclear fuel reprocessing. All equipment in the process flowsheet diagram, such as dissolvers and mixer-settlers, is graphically visualized as icons on the bitmap display of a UNIX workstation. A flowsheet can be drawn easily by mouse operation. Both published numerical simulation codes and the user's own codes can be used within COMPAS. Equipment specifications and stream component concentrations are displayed as tables that can be edited by the user. Calculation results can also be displayed graphically. Two examples show that COMPAS can be used to determine operating conditions for the Purex process and to analyze extraction behavior in a mixer-settler extractor. (author)

  2. Intelligent process control of fiber chemical vapor deposition

    Science.gov (United States)

    Jones, John Gregory

    Chemical Vapor Deposition (CVD) is a widely used process for the application of thin films. In this case, CVD is being used to apply a thin film interface coating to single crystal monofilament sapphire (Al2O3) fibers for use in Ceramic Matrix Composites (CMCs). The hot-wall reactor operates at near atmospheric pressure which is maintained using a venturi pump system. Inert gas seals obviate the need for a sealed system. A liquid precursor delivery system has been implemented to provide precise stoichiometry control. Neural networks have been implemented to create real-time process description models trained using data generated based on a Navier-Stokes finite difference model of the process. Automation of the process to include full computer control and data logging capability is also presented. In situ sensors including a quadrupole mass spectrometer, thermocouples, laser scanner, and Raman spectrometer have been implemented to determine the gas phase reactants and coating quality. A fuzzy logic controller has been developed to regulate either the gas phase or the in situ temperature of the reactor using oxygen flow rate as an actuator. Scanning electron microscope (SEM) images of various samples are shown. A hierarchical control structure upon which the control structure is based is also presented.

  3. HYBRID SYSTEM BASED FUZZY-PID CONTROL SCHEMES FOR UNPREDICTABLE PROCESS

    Directory of Open Access Journals (Sweden)

    M.K. Tan

    2011-07-01

    Full Text Available In general, the primary aim of the polymerization industry is to enhance process operation in order to obtain a high quality and purity product. However, a sudden and large amount of heat is released rapidly during the mixing of the two reactants, phenol and formalin, due to the exothermic behavior of the reaction. The unpredictable heat will cause deviation of the process temperature and hence affect the quality of the product. Therefore, it is vital to control the process temperature during polymerization. In modern industry, fuzzy logic is commonly used to auto-tune PID controllers to control the process temperature. However, this method needs an experienced operator to fine tune the fuzzy membership function and universe of discourse via a trial and error approach. Hence, the setting of the fuzzy inference system might not be accurate due to human error. Besides that, control of the process can be challenging due to rapid changes in the plant parameters, which increase the process complexity. This paper proposes an optimization scheme using a hybrid of Q-learning (QL) and genetic algorithm (GA) to optimize the fuzzy membership function in order to allow the conventional fuzzy-PID controller to control the process temperature more effectively. The performance of the proposed optimization scheme is compared with the existing fuzzy-PID scheme. The results show that the proposed optimization scheme is able to control the process temperature more effectively even when a disturbance is introduced.
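
    The idea of evolving a fuzzy membership function can be sketched with a toy genetic-style search. Everything below is invented for illustration (the plant, the triangular membership, the fitness function); it only illustrates the shape of the technique, not the paper's actual QL+GA scheme.

```python
# Toy sketch: a genetic-style search tunes the width of a triangular fuzzy
# membership function used by a simple fuzzy controller on a toy plant.
import random

def fuzzy_output(error, width):
    """Triangular membership on |error|; act harder when 'error is small' is false."""
    mu = max(0.0, 1.0 - abs(error) / width)  # degree of "error is small"
    return 5.0 * (1.0 - mu) * (1 if error > 0 else -1)

def cost(width):
    """Summed squared error of a toy first-order plant under the controller."""
    temp, sse = 0.0, 0.0
    for _ in range(200):
        u = fuzzy_output(10.0 - temp, width)  # setpoint is 10 (arbitrary units)
        temp += 0.5 * u - 0.05 * temp
        sse += (10.0 - temp) ** 2
    return sse

random.seed(0)
population = [random.uniform(1.0, 50.0) for _ in range(10)]
for _ in range(30):
    # evolve: keep the best half as parents, mutate them to refill the population
    population.sort(key=cost)
    parents = population[:5]
    population = parents + [max(0.5, p + random.gauss(0, 2.0)) for p in parents]

best = min(population, key=cost)
print("best membership width: %.2f, cost: %.1f" % (best, cost(best)))
```

    The interesting point is that both very narrow and very wide memberships do poorly (chatter versus sluggishness), so the search settles on an interior optimum, which is the role the QL/GA hybrid plays in the paper.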

  4. Production process and quality control for the HTTR fuel

    International Nuclear Information System (INIS)

    Yoshimuta, S.; Suzuki, N.; Kaneko, M.; Fukuda, K.

    1991-01-01

    Development of the production and inspection technology for High Temperature Engineering Test Reactor (HTTR) fuel has been carried out as cooperative work between the Japan Atomic Energy Research Institute (JAERI) and Nuclear Fuel Industries, Ltd (NFI). The performance and the quality level of the developed fuel are well established to meet the design requirements of the HTTR. For commercial scale production of the fuel, statistical quality control and quality assurance must be carefully considered in order to assure the safety of the HTTR. It is also important to produce the fuel under well-controlled process conditions. To meet these requirements in the production of the HTTR fuel, a new production process and quality control system is to be introduced in the new facilities. The main feature of the system is a computer integrated control system. Process control data at each production stage of products and semi-products are all gathered by terminal computers and processed by a host computer. The processed information is effectively used for production, quality and accountancy control. With the aid of this system, all products will be easily traceable from starting materials to final stages, and the statistical evaluation of product quality becomes more reliable. (author). 8 figs

  5. Metrology and process control: dealing with measurement uncertainty

    Science.gov (United States)

    Potzick, James

    2010-03-01

    Metrology is often used in designing and controlling manufacturing processes. A product sample is processed, some relevant property is measured, and the process adjusted to bring the next processed sample closer to its specification. This feedback loop can be remarkably effective for the complex processes used in semiconductor manufacturing, but there is some risk involved because measurements have uncertainty and product specifications have tolerances. There is a finite risk that good product will fail testing or that faulty product will pass. Standard methods for quantifying measurement uncertainty have been presented, but the question arises: how much measurement uncertainty is tolerable in a specific case? Or, how does measurement uncertainty relate to manufacturing risk? This paper looks at some of the components inside this process control feedback loop and describes methods to answer these questions.
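
    The risk question posed above can be made concrete with a small sketch: given Gaussian measurement uncertainty, what is the chance a conforming part fails the test (false reject) or a nonconforming part passes (false accept)? The tolerance band, uncertainty, and part values below are all invented.

```python
# False accept / false reject risk under Gaussian measurement uncertainty.
# Tolerance band, sigma, and part values are invented for illustration.
import math

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

tol_lo, tol_hi = 98.0, 102.0   # product tolerance band (arbitrary units)
sigma_meas = 0.5               # 1-sigma measurement uncertainty

def p_pass(true_value):
    """Probability the measured value lands inside the tolerance band."""
    return norm_cdf(tol_hi, true_value, sigma_meas) - norm_cdf(tol_lo, true_value, sigma_meas)

good_part = 101.5   # conforming, but near the upper limit
bad_part = 102.3    # just out of spec
print("false reject risk (good part): %.3f" % (1 - p_pass(good_part)))
print("false accept risk (bad part):  %.3f" % p_pass(bad_part))
```

    Parts near a tolerance limit carry substantial risk in both directions, which is why "how much uncertainty is tolerable" depends on where the product distribution sits relative to the limits.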

  6. Gemstone Grinding Process Improvement by using Impedance Force Control

    Directory of Open Access Journals (Sweden)

    Hamprommarat Chumpol

    2015-01-01

    Full Text Available The Chula Automatic Faceting Machine has been developed by the Advanced Manufacturing Research Lab, Chulalongkorn University, to support Thailand's gems industry. The machine has high precision motion control using position and force control. A contact stiffness model is used to estimate grinding force. Although polished gems from the faceting machine have uniform size and acceptable shape, the force of the grinding and polishing process cannot be maintained constant and fluctuates due to indirect force control. Therefore, this work proposes a new controller for this process based on impedance direct force control to improve grinding performance during the polishing process. The grinding force can be measured through motor current. The results show that gems polished using impedance direct force control maintain uniform size as well as good shape and a high quality surface.

  7. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    In this paper the basic principles of path modeling are presented. The mathematics is presented for processes having only one stage, having two stages, and having three or more stages. The methods are applied to process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. Corrections can be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multivariate variation in data.

  8. Process and control systems for composites manufacturing

    Science.gov (United States)

    Tsiang, T. H.; Wanamaker, John L.

    1992-01-01

    A precise control of composite material processing would not only improve part quality, but it would also directly reduce the overall manufacturing cost. The development and incorporation of sensors will help to generate real-time information for material processing relationships and equipment characteristics. In the present work, the thermocouple, pressure transducer, and dielectrometer technologies were investigated. The monitoring sensors were integrated with the computerized control system in three non-autoclave fabrication techniques: hot-press, self contained tool (self heating and pressurizing), and pressure vessel). The sensors were implemented in the parts and tools.

  9. Apparatus and process for controlling fluidized beds

    Science.gov (United States)

    Rehmat, Amirali G.; Patel, Jitendra G.

    1985-10-01

    An apparatus and process for control and maintenance of fluidized beds under non-steady state conditions. An ash removal conduit is provided for removing solid particulates from a fluidized bed separate from an ash discharge conduit in the lower portion of the grate supporting such a bed. The apparatus and process of this invention is particularly suitable for use in ash agglomerating fluidized beds and provides control of the fluidized bed before ash agglomeration is initiated and during upset conditions resulting in stable, sinter-free fluidized bed maintenance.

  10. Development, validation and routine control of a radiation process

    International Nuclear Information System (INIS)

    Kishor Mehta

    2010-01-01

    Today, radiation is used in industrial processing for a variety of applications: from low doses for blood irradiation, to very high doses for materials modification, and even higher doses for gemstone colour enhancement. At present, radiation is mainly provided by either radionuclide or machine sources; cobalt-60 is the predominant radionuclide in use. Currently, there are several hundred irradiation facilities worldwide. As in other industries, quality management systems can assist radiation processing facilities in enhancing customer satisfaction and in maintaining and improving product quality. To help fulfill quality management requirements, several national and international organizations have developed standards related to radiation processing. They all have requirements and guidelines for development, validation and routine control of the radiation process. For radiation processing, these three phases involve the following activities. The development phase includes selecting the type of radiation source, the irradiation facility and the dose required for the process. The validation phase includes conducting activities that give assurance that the process will be successful. Routine control then involves activities that provide evidence that the process has been successfully realized. These standards require documentary evidence that process validation and process control have been followed. Dosimetry information gathered during these processes provides this evidence. (authors)

  11. Statistical process control support during Defense Waste Processing Facility chemical runs

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Product Composition Control System (PCCS) has been developed to ensure that the wasteforms produced by the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will satisfy the regulatory and processing criteria that will be imposed. The PCCS provides rigorous, statistically-defensible management of a noisy, multivariate system subject to multiple constraints. The system has been successfully tested and has been used to control the production of the first two melter feed batches during DWPF Chemical Runs. These operations will demonstrate the viability of the DWPF process. This paper provides a brief discussion of the technical foundation for the statistical process control algorithms incorporated into PCCS, and describes the results obtained and lessons learned from DWPF Cold Chemical Run operations. The DWPF will immobilize approximately 130 million liters of high-level nuclear waste currently stored at the Site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive sludge and precipitate streams and less radioactive water soluble salts. (In a separate facility, soluble salts are disposed of as low-level waste in a mixture of cement, slag, and flyash.) In DWPF, the precipitate stream (Precipitate Hydrolysis Aqueous or PHA) is blended with the insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository

  12. Food Processing Control

    Science.gov (United States)

    1997-01-01

    When NASA started planning for manned space travel in 1959, the myriad challenges of sustaining life in space included a seemingly mundane but vitally important problem: How and what do you feed an astronaut? There were two main concerns: preventing food crumbs from contaminating the spacecraft's atmosphere or floating into sensitive instruments, and ensuring complete freedom from potentially catastrophic disease-producing bacteria, viruses, and toxins. To address these concerns, NASA enlisted the help of the Pillsbury Company. Pillsbury quickly solved the first problem by coating bite-size foods to prevent crumbling. They developed the hazard analysis and critical control point (HACCP) concept to ensure against bacterial contamination. Hazard analysis is a systematic study of a product, its ingredients, processing conditions, handling, storage, packing, distribution, and directions for consumer use to identify sensitive areas that might prove hazardous. Hazard analysis provides a basis for blueprinting the Critical Control Points (CCPs) to be monitored. CCPs are points in the chain from raw materials to the finished product where loss of control could result in unacceptable food safety risks. In early 1970, Pillsbury plants were following HACCP in production of food for Earthbound consumers. Pillsbury's subsequent training courses for Food and Drug Administration (FDA) personnel led to the incorporation of HACCP in the FDA's Low Acid Canned Foods Regulations, set down in the mid-1970s to ensure the safety of all canned food products in the U.S.

  13. Selection of input devices and controls for modern process control consoles

    International Nuclear Information System (INIS)

    Hasenfuss, O.; Zimmermann, R.

    1975-06-01

    In modern process control consoles, man-machine communication is realized more and more by computer-driven CRT displays, the most efficient communication system today. This paper describes the most important input devices and controls for such control consoles. A number of factors that should be considered during selection are given. The aptitude of the described devices for particular tasks is discussed, and recommendations for carrying out a selection are given. (orig.) [de

  14. A fast PID controller Design for Modern PLC for Process Control Application

    International Nuclear Information System (INIS)

    Mirza, A.; Nafis, A.; Anees, R.M.; Idris, S.

    2004-01-01

    PID is the most widely used control scheme in the process industry. PID controllers are utilized for the control of such varied parameters as pressure, flow, temperature, etc. One characteristic of these parameters is that they possess slow dynamics. Most of the available digital controllers can manipulate only a single parameter; multiple controllers are required for control of more than one parameter. The Fast PID Controller for Modern PLC (Programmable Logic Controller) developed by the authors provides control of several parameters at a time (through a single PID control element), enhanced programmability including variable sampling period, parameter monitoring and data storage, all of which may be easily implemented in a PLC. (author)

  15. A modelling and control structure for product quality control in climate-controlled processing of agro-material

    NARCIS (Netherlands)

    Verdijck, G.J.C.; Straten, van G.

    2002-01-01

    In this paper a modelling and control structure for product quality control is presented for a class of operations that processes agro-material. This class can be characterised as climate-controlled operations, such as storage, transport and drying. The basic model consists of three parts. These are

  16. The CANDU 9 distributed control system design process

    International Nuclear Information System (INIS)

    Harber, J.E.; Kattan, M.K.; Macbeth, M.J.

    1997-01-01

    Canadian designed CANDU pressurized heavy water nuclear reactors have been world leaders in electrical power generation. The CANDU 9 project is AECL's next reactor design. Plant control for the CANDU 9 station design is performed by a distributed control system (DCS) as compared to centralized control computers, analog control devices and relay logic used in previous CANDU designs. The selection of a DCS as the platform to perform the process control functions and most of the data acquisition of the plant, is consistent with the evolutionary nature of the CANDU technology. The control strategies for the DCS control programs are based on previous CANDU designs but are implemented on a new hardware platform taking advantage of advances in computer technology. This paper describes the design process for developing the CANDU 9 DCS. Various design activities, prototyping and analyses have been undertaken in order to ensure a safe, functional, and cost-effective design. (author)

  17. Online sensing and control of oil in process wastewater

    Science.gov (United States)

    Khomchenko, Irina B.; Soukhomlinoff, Alexander D.; Mitchell, T. F.; Selenow, Alexander E.

    2002-02-01

    Industrial processes that must eliminate high concentrations of oil from their waste streams find it extremely difficult to measure and control the water purification process. Most oil separation processes involve chemical separation using highly corrosive caustics, acids, surfactants, and emulsifiers. The output of this chemical treatment process includes highly adhesive tar-like globules, emulsified and surface oils, and other emulsified chemicals, in addition to suspended solids. The oil/hydrocarbon concentration in the process wastewater may fluctuate from 1 ppm to 10,000 ppm, depending upon the specifications of the industry and the level of water quality control. The authors have developed a sensing technology which provides the accuracy of scatter/absorption sensing in a contactless environment by combining these methodologies with reflective measurement. The sensitivity of the sensor may be modified by changing the fluid level control in the flow cell, allowing for a broad range of accurate measurement from 1 ppm to 10,000 ppm. Because this sensing system has been designed to work in a highly invasive environment, it can be placed close to the process source to allow for accurate real-time measurement and control.

  18. Controlled versus automatic processes: which is dominant to safety? The moderating effect of inhibitory control.

    Directory of Open Access Journals (Sweden)

    Yaoshan Xu

    Full Text Available This study explores the precursors of employees' safety behaviors based on a dual-process model, which suggests that human behaviors are determined by both controlled and automatic cognitive processes. Employees' responses to a self-reported survey on safety attitudes capture their controlled cognitive process, while the automatic association concerning safety measured by an Implicit Association Test (IAT reflects employees' automatic cognitive processes about safety. In addition, this study investigates the moderating effects of inhibition on the relationship between self-reported safety attitude and safety behavior, and that between automatic associations towards safety and safety behavior. The results suggest significant main effects of self-reported safety attitude and automatic association on safety behaviors. Further, the interaction between self-reported safety attitude and inhibition and that between automatic association and inhibition each predict unique variances in safety behavior. Specifically, the safety behaviors of employees with lower level of inhibitory control are influenced more by automatic association, whereas those of employees with higher level of inhibitory control are guided more by self-reported safety attitudes. These results suggest that safety behavior is the joint outcome of both controlled and automatic cognitive processes, and the relative importance of these cognitive processes depends on employees' individual differences in inhibitory control. The implications of these findings for theoretical and practical issues are discussed at the end.

  19. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    Science.gov (United States)

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

    The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with a three-degree-of-freedom (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI, respectively.
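
    The tracking metric described above can be sketched in a few lines: NCC between two intensity patches, plus a patient-specific lower limit from the first five fractions. The image data and NCC values below are invented for illustration; real CBCT registration would compute NCC over a 3D volume.

```python
# Normalized cross-correlation (NCC) between two intensity lists, and a
# patient-specific lower action limit from the first five fractions' NCC
# values. All data here are invented for illustration.

def ncc(a, b):
    """Normalized cross-correlation of two equal-length intensity lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

reference = [10, 12, 15, 20, 18, 14, 11, 9]
shifted = [9, 11, 14, 19, 19, 15, 12, 10]    # well-registered fraction
garbled = [20, 9, 14, 10, 12, 18, 11, 15]    # misregistered fraction

print("NCC well-registered: %.3f" % ncc(reference, shifted))
print("NCC misregistered:   %.3f" % ncc(reference, garbled))

# Patient-specific limit: mean - 3*stdev of the first five fractions' NCC values.
first_five = [0.98, 0.97, 0.99, 0.98, 0.96]
m = sum(first_five) / 5
s = (sum((x - m) ** 2 for x in first_five) / 4) ** 0.5
print("lower control limit: %.3f" % (m - 3 * s))
```

    A fraction whose NCC falls below the patient-specific limit would be flagged for review, which is the SPC framework the study demonstrates.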

  20. Flexible distributed architecture for semiconductor process control and experimentation

    Science.gov (United States)

    Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.

    1997-01-01

    Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD interferometry based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: specific implementation of any one task does not restrict the implementation of another. The low level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between these equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server manages connections from internet/intranet (web) based clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers independent of hardware or software platform.
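
    A "predefined set of TCP/IP socket based messages" usually means a small framing convention. The sketch below shows one plausible shape: a length-prefixed frame carrying a typed JSON body. The frame format, message type, and parameter names are invented, not the MIT system's actual protocol.

```python
# Tiny length-prefixed message framing, as one plausible way the components
# described above might exchange commands over TCP. Format is invented.
import json
import struct

def encode_message(msg_type, payload):
    """Frame: 4-byte big-endian length, then a JSON body with type + payload."""
    body = json.dumps({"type": msg_type, "payload": payload}).encode("utf-8")
    return struct.pack(">I", len(body)) + body

def decode_message(frame):
    (length,) = struct.unpack(">I", frame[:4])
    body = json.loads(frame[4:4 + length].decode("utf-8"))
    return body["type"], body["payload"]

# Hypothetical command from the cell controller to an equipment controller.
frame = encode_message("SET_ETCH_PARAMS", {"rf_power_w": 350, "pressure_mtorr": 120})
msg_type, payload = decode_message(frame)
print(msg_type, payload)
```

    The length prefix lets a receiver read exactly one message from a TCP stream, which is what makes a fixed message set practical across heterogeneous components.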

  1. Processing Controlled PROs in Spanish

    Science.gov (United States)

    Betancort, Moises; Carreiras, Manuel; Acuna-Farina, Carlos

    2006-01-01

    Two experiments were carried out to investigate the processing of the empty category PRO and the time-course of this in Spanish. Eye movements were recorded while participants read sentences in which a matrix clause was followed by a subordinate infinitival clause, so that the subject or the object of the main clause could act as controller of…

  2. Process control and optimization with simple interval calculation method

    DEFF Research Database (Denmark)

    Pomerantsev, A.; Rodionova, O.; Høskuldsson, Agnar

    2006-01-01

    Methods of process control and optimization are presented and illustrated with a real world example. The optimization methods are based on PLS block modeling as well as on the simple interval calculation methods of interval prediction and object status classification. It is proposed to employ the series of expanding PLS/SIC models in order to support on-line process improvements. This method helps to predict the effect of planned actions on the product quality and thus enables passive quality control. We have also considered an optimization approach that proposes the correcting actions for quality improvement in the course of production. The latter is an active quality optimization, which takes into account the actual history of the process. The advocated approach is allied to the conventional method of multivariate statistical process control (MSPC), as it also employs the historical process data.

  3. Stepless control system for reciprocating compressors: energy savings + process control improvement

    Energy Technology Data Exchange (ETDEWEB)

    Grande, Alvaro; Wenisch, Markus [Hoerbiger Ventilwerke GmbH and Co KG, Wien (Austria); Jacobs, Denis [HOERBIGER do Brasil Industria de Equipamentos, Cajamar, SP (Brazil)

    2012-07-01

    In the past, the capacity of reciprocating compressors was typically controlled by on/off unloaders (step control) and recycle valves. But because the power ratings of new reciprocating compressors for the oil and gas industry have increased significantly, advanced control systems are required to reduce power costs and save energy. On top of that, multi-stage compressors are frequently integrated into complex process plants that demand precise control and operational flexibility. There are several solutions to this problem, but perhaps the most successful is the use of the reverse flow principle applied to an electronically controlled and hydraulically actuated suction valve unloader system. (author)

  4. Subfemtosecond directional control of chemical processes in molecules

    Science.gov (United States)

    Alnaser, Ali S.; Litvinyuk, Igor V.

    2017-02-01

    Laser pulses with a waveform-controlled electric field and broken inversion symmetry establish the opportunity to achieve directional control of molecular processes on a subfemtosecond timescale. Several techniques can be used to break the inversion symmetry of an electric field. The most common ones include combining a fundamental laser frequency with its second harmonic or with higher-frequency pulses (or pulse trains), as well as using few-cycle pulses with known carrier-envelope phase (CEP). In the case of CEP, control over chemical transformations, typically occurring on a timescale of many femtoseconds, is driven by much faster sub-cycle processes of subfemtosecond to few-femtosecond duration. This is possible because electrons are much lighter than nuclei and fast electron motion is coupled to the much slower nuclear motion. The control originates from populating coherent superpositions of different electronic or vibrational states with relative phases that are dependent on the CEP or phase offset between components of a two-color pulse. In this paper, we review the recent progress made in the directional control over chemical processes, driven by intense few-cycle laser pulses of a waveform-tailored electric field, in different molecules.

  5. Improving industrial process control systems security

    CERN Document Server

    Epting, U; CERN. Geneva. TS Department

    2004-01-01

    System providers today are creating process control systems based on remote connectivity using internet technology, effectively exposing these systems to the same threats as corporate computers. It is becoming increasingly difficult and costly to patch and maintain the technical infrastructure of monitoring and control systems to remove these vulnerabilities. A strategy covering risk assessment, security policy issues, and service-level agreements between the IT department and the controls engineering groups must be defined. In addition, increased awareness of IT security in the controls system engineering domain is needed. As a consequence of these new factors, control system architectures have to take security requirements into account, which often affects both operational aspects and project and maintenance costs. Manufacturers of industrial control system equipment are, however, also progressively proposing security-related solutions that can be used in our active projects. The paper discusses ...

  6. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Science.gov (United States)

    2012-08-02

    ...] Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled: ``Statistical Process Controls for Blood Establishments.'' The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  7. New Principles of Process Control in Geotechnics by Acoustic Methods

    OpenAIRE

    Leššo, I.; Flegner, P.; Pandula, B.; Horovčák, P.

    2007-01-01

    The contribution describes a new solution for the control of the rotary drilling process, an elementary process in geotechnics. The article presents the first results of research on the use of acoustic methods for identification in the optimal control of rotary drilling.

  8. Studies of neutron methods for process control and criticality surveillance of fissile material processing facilities

    International Nuclear Information System (INIS)

    Zoltowski, T.

    1988-01-01

    The development of radiochemical processes for fissile material processing and spent fuel handling needs new control procedures that enable improved plant throughput. This is closely related to implementing a continuous criticality control policy and developing reliable methods for monitoring the reactivity of radiochemical plant operations in the presence of process perturbations. Neutron methods appear applicable to fissile material control in some technological facilities. Measuring epithermal neutron source multiplication, combined with heuristic evaluation of the measured data, enables surveillance of anomalous reactivity increases that could lead to unsafe states. 80 refs., 47 figs., 33 tabs. (author)

  9. Application of artificial intelligence in process control

    CERN Document Server

    Krijgsman, A

    1993-01-01

    This book is the result of a joint effort by six European universities to create an overall course on the application of artificial intelligence (AI) in process control. The book includes an introduction to key areas, including knowledge representation and expert-system, logic, fuzzy-logic, neural-network, and object-oriented approaches in AI. Part two covers applications to control engineering; part three, real-time issues; part four, CAD systems and expert systems; part five, intelligent control; and part six, supervisory control, monitoring and optimization.

  10. Scheduling algorithms for automatic control systems for technological processes

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

    The wide use of automatic process control systems, together with high-performance systems containing multiple computers (processors), makes it possible to create high-quality, fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computations, and processing of big data arrays all require a high level of productivity combined with minimal time for data handling and delivery of results. To achieve the best time, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. This paper considers some basic task scheduling methods for multi-machine process control systems, highlights their advantages and disadvantages, and offers some considerations for their use when developing software for automatic process control systems.
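    As an illustration of the class of methods such surveys cover, the classic longest-processing-time (LPT) list-scheduling heuristic for identical machines can be sketched as follows. This is a generic example, not one of the specific algorithms considered in the paper; task durations are hypothetical.

    ```python
    import heapq

    def lpt_schedule(durations, machines):
        """Longest-Processing-Time-first list scheduling.

        Sorts tasks by decreasing duration and always assigns the next
        task to the currently least-loaded machine. Returns the makespan
        and the per-machine assignment of task indices.
        """
        # Min-heap of (current_load, machine_index)
        loads = [(0.0, m) for m in range(machines)]
        heapq.heapify(loads)
        assignment = {m: [] for m in range(machines)}
        for task, dur in sorted(enumerate(durations), key=lambda x: -x[1]):
            load, m = heapq.heappop(loads)     # least-loaded machine
            assignment[m].append(task)
            heapq.heappush(loads, (load + dur, m))
        makespan = max(load for load, _ in loads)
        return makespan, assignment

    makespan, plan = lpt_schedule([7, 5, 4, 3, 3, 2], machines=2)
    ```

    LPT is simple and fast, which is why variants of it appear in multiprocessor control software; its makespan is provably within 4/3 of optimal for identical machines.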

  11. Design and Implementation of Distributed Relational DBMS

    OpenAIRE

    都司, 達夫; 丸山, 正理; 大木下, 俊也; 冨士, 竹仁; 田中, 仁士; 上坂, 利文; 加藤, 昌央; 木本, 茂; 林, 利治; 渡辺, 勝正; TSUJI, Tatsuo; MARUYAMA, Masari; OGINOSIDTA, Toshiya; FUJI, Takehito; TANAKA, Hitoshi

    1992-01-01

    This paper describes a distributed relational database management system, with emphasis on design and implementation. We have designed and constructed the system on the UNIX local area network in our university. It is based upon the SQL standard, and distributed processing is implemented using RPC (Remote Procedure Calls). The main features of the system include: (1) Multi-client and multi-server system, (2) Client-based distribution management, (3) Deadlock-free concurrency control scheme, (...
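    The RPC-based distributed query mechanism described here can be illustrated with a minimal client/server pair. The sketch below uses Python's stdlib XML-RPC rather than the RPC mechanism of the original UNIX system, and the table and query are hypothetical; it only shows the shape of remote query execution.

    ```python
    import threading
    from xmlrpc.server import SimpleXMLRPCServer
    from xmlrpc.client import ServerProxy

    # Server side: expose a query endpoint on an ephemeral port.
    server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
    port = server.server_address[1]

    def query(sql):
        """Toy query handler standing in for a real SQL executor."""
        if sql.upper().startswith("SELECT"):
            return [("row1",)]   # XML-RPC marshals tuples as arrays
        return []

    server.register_function(query, "query")
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # Client side: a remote procedure call that looks like a local call.
    proxy = ServerProxy(f"http://127.0.0.1:{port}")
    rows = proxy.query("SELECT * FROM t")
    server.shutdown()
    ```

    In a real client-based distribution scheme, the client would issue such calls to several servers and merge the partial results itself.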

  12. New Principles of Process Control in Geotechnics by Acoustic Methods

    Directory of Open Access Journals (Sweden)

    Leššo, I.

    2007-01-01

    Full Text Available The contribution describes a new solution for the control of the rotary drilling process, an elementary process in geotechnics. The article presents the first results of research on the use of acoustic methods for identification in the optimal control of rotary drilling.

  13. IPCS: An integrated process control system for enhanced in-situ bioremediation

    International Nuclear Information System (INIS)

    Huang, Y.F.; Wang, G.Q.; Huang, G.H.; Xiao, H.N.; Chakma, A.

    2008-01-01

    To date, there has been little or no research related to process control of subsurface remediation systems. In this study, a framework for developing an integrated process control system to improve remediation efficiency and reduce operating costs was proposed, based on physical and numerical models, stepwise cluster analysis, non-linear optimization, and artificial neural networks. Process control for enhanced in-situ bioremediation was accomplished by combining the developed forecasters and optimizers with genetic algorithm and neural network modeling methods. Application of the proposed approach to a bioremediation process in a pilot-scale system indicated that it was effective for dynamic optimization and real-time process control of sophisticated bioremediation systems. - A framework for a process control system was developed to improve in-situ bioremediation efficiency and reduce operating costs

  14. An avoidance layer in hierarchical process control

    International Nuclear Information System (INIS)

    De Callatay, A.

    1994-01-01

    A layered software architecture project is proposed: a safety-critical, real-time, non-stop, simple kernel system includes a layer that blocks threatening actions from operators or from programs in other control systems. Complex process-control applications (such as fuzzy systems) are useful for smooth operation of the system, optimum productivity, efficient diagnostics, and safe management of degraded modes of operation. Defects in these complex process-control applications have no impact on safety if their commands must first be accepted by a safety-critical module. The development, testing, and certification of complex applications running in the outer layers can be made simpler and less expensive than for those in the kernel. Avoidance systems use rule-based systems with negative fuzzy conditions and actions. Animal and human behaviour cannot be explained without active avoidance
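    The idea that outer-layer commands must first be accepted by a safety-critical module can be sketched as a minimal command filter. This is a generic illustration, not the architecture proposed in the record; actuator names and limits are hypothetical.

    ```python
    def make_safety_kernel(limits):
        """Return a filter that accepts or rejects actuator commands.

        `limits` maps actuator names to (low, high) bounds. Commands
        from the complex outer-layer controllers are forwarded only if
        they pass these simple, separately certifiable checks.
        """
        def accept(actuator, value):
            if actuator not in limits:
                return False          # unknown actuator: reject
            low, high = limits[actuator]
            return low <= value <= high
        return accept

    kernel = make_safety_kernel({"valve_pct": (0, 100), "heater_kw": (0, 50)})
    ```

    Keeping the kernel this small is the point: only the filter needs safety certification, while the fuzzy or optimizing controllers above it can evolve freely.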

  15. Knowledge-based information processing in the process control of power stations. Wissensbasierte Informationsverarbeitung in der Prozessfuehrung von Kraftwerken

    Energy Technology Data Exchange (ETDEWEB)

    Weisang, C. (Asea Brown Boveri AG, Heidelberg (Germany). Konzernforschungszentrum)

    1992-02-01

    Through the application of expert systems, future-oriented information processing integrates knowledge of the process, the control systems, process control strategies, user behaviour, and ergonomics. Improvements in process control can be attained, among other ways, by condensing the information presented (e.g. by suppressing the raw flow of signals and replacing it with substance-based information) and by an ergonomically designed representation of the process. (orig.)

  16. Low-level wastewater treatment facility process control operational test report

    International Nuclear Information System (INIS)

    Bergquist, G.G.

    1996-01-01

    This test report documents the results obtained while conducting operational testing of a new TK 102 level controller and a total outflow integrator added to the NHCON software that controls the Low-Level Wastewater Treatment Facility (LLWTF). The test was performed with WHC-SD-CP-OTP 154, PFP Low-Level Wastewater Treatment Facility Process Control Operational Test; a complete test copy is included in appendix A. The new TK 102 level controller provides a signal, hereafter referred to as its cascade mode, to the treatment train flow controller, which enables the water treatment process to run for long periods without continuous operator monitoring. The test successfully demonstrated the functionality of the new controller under the standard and abnormal conditions expected in LLWTF operation. In addition, a flow totalizer is now displayed on the LLWTF outlet MICON screen, which tallies the process output in gallons. This feature substantially improves the ability to retrieve daily process volumes for maintaining accurate material balances
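    A flow totalizer of the kind described is, in essence, a time integral of the outlet flow signal. A minimal sketch (generic trapezoidal integration, not the MICON/NHCON implementation; sample values are hypothetical):

    ```python
    def totalize(samples):
        """Integrate a sampled flow signal into total volume (gallons).

        `samples` is a list of (time_minutes, flow_gpm) pairs;
        trapezoidal integration approximates the running total an
        outflow integrator would display.
        """
        total = 0.0
        for (t0, f0), (t1, f1) in zip(samples, samples[1:]):
            total += 0.5 * (f0 + f1) * (t1 - t0)
        return total

    # Constant 10 gpm sampled over one hour
    volume = totalize([(0, 10.0), (30, 10.0), (60, 10.0)])
    ```

    A real integrator would accumulate continuously and persist the tally across scans, but the material-balance arithmetic is the same.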

  17. In-line instrumentation and computer-controlled process supervision in reprocessing

    International Nuclear Information System (INIS)

    Mache, H.R.; Groll, P.

    Measuring equipment is needed for continuous monitoring of concentration in radioactive process solutions. A review is given of existing in-line apparatus and of computer-controlled data processing. A process control system is described for TAMARA, a model extraction facility for the U/HNO3/TBP system

  18. 40 CFR 63.1322 - Batch process vents-reference control technology.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 11 2010-07-01 2010-07-01 true Batch process vents-reference control technology. 63.1322 Section 63.1322 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Batch process vents—reference control technology. (a) Batch process vents. The owner or operator of a...

  19. Minicomputer controlled test system for process control and monitoring systems

    International Nuclear Information System (INIS)

    Worster, L.D.

    A minicomputer-controlled test system for testing process control and monitoring systems is described. This system, in service for over one year, has demonstrated that computerized control of such testing has real potential for expanding the scope of the testing, improving its accuracy, and significantly reducing the time required to perform it. The test system is built around a 16-bit minicomputer with 12K of memory. The system programming language is BASIC, with the addition of assembly-level routines for communication with the peripheral devices. The peripheral devices include a 100-channel scanner, an analog-to-digital converter, a visual display, and a strip printer. (auth)

  20. 21 CFR 114.80 - Processes and controls.

    Science.gov (United States)

    2010-04-01

    ... scheduled process and maintained in all finished foods. Manufacturing shall be in accordance with the... occur often enough to ensure that the container suitably protects the food from leakage or contamination... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Processes and controls. 114.80 Section 114.80 Food...

  1. Performance of Globally Linearized Controller and Two Region Fuzzy Logic Controller on a Nonlinear Process

    Directory of Open Access Journals (Sweden)

    N. Jaya

    2008-10-01

    Full Text Available In this work, the design and implementation of a conventional PI controller, a single-region fuzzy logic controller, a two-region fuzzy logic controller, and a Globally Linearized Controller (GLC) for a two-capacity interacting nonlinear process is carried out. The performance of this process using the single-region FLC, the two-region FLC, and the GLC is compared with that of the conventional PI controller about an operating point of 50%. It has been observed that the GLC and the two-region FLC provide better performance. Further, this procedure is also validated by real-time experimentation using dSPACE.
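    For readers unfamiliar with the baseline, a conventional PI loop can be sketched on a simplified linear surrogate plant. The study itself uses a nonlinear two-capacity interacting process; the first-order plant and gains below are illustrative assumptions only.

    ```python
    def simulate_pi(kp, ki, setpoint, steps=500, dt=0.1):
        """Simulate a discrete PI controller on a first-order plant.

        Plant: dx/dt = -x + u (a linear stand-in for one tank of the
        interacting two-capacity process).
        """
        x, integral = 0.0, 0.0
        for _ in range(steps):
            error = setpoint - x
            integral += error * dt
            u = kp * error + ki * integral   # PI control law
            x += dt * (-x + u)               # Euler step of the plant
        return x

    level = simulate_pi(kp=2.0, ki=1.0, setpoint=50.0)
    ```

    The integral term is what removes steady-state offset; fuzzy and globally linearized controllers aim to improve on this fixed-gain behavior across the nonlinear operating range.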

  2. Use of neural networks in process engineering. Thermodynamics, diffusion, and process control and simulation applications

    International Nuclear Information System (INIS)

    Otero, F

    1998-01-01

    This article presents the current status of the use of Artificial Neural Networks (ANNs) in process engineering applications where common mathematical methods do not completely represent the behavior shown by experimental observations, results, and plant operating data. Three examples of the use of ANNs in typical process engineering applications are shown: prediction of activity in solvent-polymer binary systems, prediction of the surfactant self-diffusion coefficient of micellar systems, and process control and simulation. These examples are important for polymerization applications, enhanced oil recovery, and automatic process control

  3. Graphical user interfaces for McClellan Nuclear Radiation Center

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S.A.; Power, M.; Forsmann, H.

    1998-01-01

    The control console of the TRIGA reactor at McClellan's Nuclear Radiation Center (MNRC) is in the process of being replaced because of spurious scrams, outdated software, and obsolete parts. The intent of the new control console is to eliminate the existing problems by installing a UNIX-based computer system with industry-standard interface software and by incorporating human factors during all stages of the graphical user interface (GUI) development and control console design. This paper gives a brief description of some of the guidelines used in developing the MNRC's GUIs as continuous, real-time displays

  4. 21 CFR 212.50 - What production and process controls must I have?

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false What production and process controls must I have... DRUGS (Eff. 12-12-2011) Production and Process Controls § 212.50 What production and process controls must I have? You must have adequate production and process controls to ensure the consistent production...

  5. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    International Nuclear Information System (INIS)

    Currier, R.P.

    1994-01-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous flow tubular reactor vessel. The waste is maintained at reaction temperature of 300--550 C where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g. axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression to experimental data. Safety-related calculations are reported which estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. Development and numerical solution of a generalized one-dimensional mathematical model is also summarized. The difficulties encountered in using commercially available software to correlate the behavior of high temperature, high pressure aqueous electrolyte mixtures are summarized. Finally, details of the control system and experiments conducted to empirically determine the system response are reported

  6. Quality control process improvement of flexible printed circuit board by FMEA

    Science.gov (United States)

    Krasaephol, Siwaporn; Chutima, Parames

    2018-02-01

    This research focuses on improving the quality control process for Flexible Printed Circuit Boards (FPCB), centred on model 7-Flex, by using the Failure Mode and Effect Analysis (FMEA) method to decrease the proportion of defective finished goods found at the final inspection process. Because many defective units are caught only at final inspection, defective product may escape to customers. The problem stems from a quality control process that is not efficient enough to filter out defective products in-process, because there is no In-Process Quality Control (IPQC) or sampling inspection in the process. Therefore, the quality control process has to be improved by setting inspection gates and IPQCs at critical processes in order to filter out defective products. The critical processes are analysed with the FMEA method. IPQC is used to detect defective products and reduce the chance of defective finished goods escaping to customers. Reducing the proportion of defective finished goods also decreases scrap cost, because finished goods incur a higher scrap cost than work in-process. Moreover, defective products found during the process can reveal abnormal processes, allowing engineers and operators to solve the problems in a timely manner. The improved quality control was implemented on the 7-Flex production lines from July 2017 to September 2017. The results show decreases in the average proportion of defective finished goods and in the average Customer Manufacturers Lot Reject Rate (%LRR of CMs) of 4.5% and 4.1%, respectively. Furthermore, the cost saving from this quality control process equals 100K Baht.
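    The FMEA prioritization step used to select critical processes can be sketched as a Risk Priority Number (RPN) ranking. The failure modes and ratings below are hypothetical, not taken from the 7-Flex study; RPN = severity x occurrence x detection, each rated 1-10 as on a standard FMEA worksheet.

    ```python
    def rank_failure_modes(modes):
        """Rank failure modes by Risk Priority Number (RPN).

        Each mode is (name, severity, occurrence, detection); the
        highest-RPN processes are the candidates for inspection gates
        and IPQC.
        """
        scored = [(name, s * o * d) for name, s, o, d in modes]
        return sorted(scored, key=lambda x: -x[1])

    ranking = rank_failure_modes([
        ("delamination",    8, 4, 6),   # hypothetical ratings
        ("open circuit",    9, 3, 4),
        ("misregistration", 5, 6, 3),
    ])
    ```

    Inspection gates would then be placed at the processes responsible for the top-ranked modes, and the RPNs recomputed after countermeasures to verify the improvement.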

  7. Can we (control) Engineer the degree learning process?

    Science.gov (United States)

    White, A. S.; Censlive, M.; Neilsen, D.

    2014-07-01

    This paper investigates how control theory could be applied to learning processes in engineering education. The starting point for the analysis is White's double-loop learning model of human automation control, modified for the education process, where a set of governing principles is chosen, probably by the course designer. After initial training, the student unknowingly settles on a mental map or model. After observing how the real world is behaving, a strategy to achieve the governing variables is chosen and a set of actions selected. This may not be a conscious operation; it may be completely instinctive. These actions will have consequences, but only after a certain time delay. The current model is compared with the work of Hollenbeck on goal setting, Nelson's model of self-regulation, and the work of Abdulwahed, Nagy and Blanchard at Loughborough, who investigated control methods applied to the learning process.

  8. Can we (control) Engineer the degree learning process?

    International Nuclear Information System (INIS)

    White, A S; Censlive, M; Neilsen, D

    2014-01-01

    This paper investigates how control theory could be applied to learning processes in engineering education. The starting point for the analysis is White's double-loop learning model of human automation control, modified for the education process, where a set of governing principles is chosen, probably by the course designer. After initial training, the student unknowingly settles on a mental map or model. After observing how the real world is behaving, a strategy to achieve the governing variables is chosen and a set of actions selected. This may not be a conscious operation; it may be completely instinctive. These actions will have consequences, but only after a certain time delay. The current model is compared with the work of Hollenbeck on goal setting, Nelson's model of self-regulation, and the work of Abdulwahed, Nagy and Blanchard at Loughborough, who investigated control methods applied to the learning process

  9. Monitoring and controlling the biogas process

    Energy Technology Data Exchange (ETDEWEB)

    Ahring, B K; Angelidaki, I [The Technical Univ. of Denmark, Dept. of Environmental Science and Engineering, Lyngby (Denmark)

    1997-08-01

    Many modern large-scale biogas plants have been constructed recently, increasing the demand for proper monitoring and control of these large reactor systems. For monitoring the biogas process, an easily measured and reliable indicator is required which reflects the metabolic state and the activity of the bacterial populations in the reactor. In this paper, we discuss existing indicators, as well as indicators under development, which can potentially be used to monitor the state of the biogas process in a reactor. Furthermore, data are presented from two large-scale thermophilic biogas plants that were subjected to temperature changes and where the concentration of volatile fatty acids (VFAs) was monitored. The results clearly demonstrated that significant changes in the concentrations of the individual VFAs occurred even though the biogas production did not change significantly. The concentrations of butyrate, isobutyrate and isovalerate in particular showed significant changes. Future improvements in process control could therefore be based on monitoring the concentrations of specific VFAs together with information about the bacterial populations in the reactor; the latter information could be supplied by modern molecular techniques. (au) 51 refs.

  10. Automatic and controlled processing and the Broad Autism Phenotype.

    Science.gov (United States)

    Camodeca, Amy; Voelker, Sylvia

    2016-01-30

    Research related to verbal fluency in the Broad Autism Phenotype (BAP) is limited and dated, but generally suggests intact abilities in the context of weaknesses in other areas of executive function (Hughes et al., 1999; Wong et al., 2006; Delorme et al., 2007). Controlled processing, the generation of search strategies after initial, automated responses are exhausted (Spat, 2013), has yet to be investigated in the BAP, and may be evidenced in verbal fluency tasks. One hundred twenty-nine participants completed the Delis-Kaplan Executive Function System Verbal Fluency test (D-KEFS; Delis et al., 2001) and the Broad Autism Phenotype Questionnaire (BAPQ; Hurley et al., 2007). The BAP group (n=53) produced significantly fewer total words during the 2nd 15" interval compared to the Non-BAP (n=76) group. Partial correlations indicated similar relations between verbal fluency variables for each group. Regression analyses predicting 2nd 15" interval scores suggested differentiation between controlled and automatic processing skills in both groups. Results suggest adequate automatic processing, but slowed development of controlled processing strategies in the BAP, and provide evidence for similar underlying cognitive constructs for both groups. Controlled processing was predictive of Block Design score for Non-BAP participants, and was predictive of Pragmatic Language score on the BAPQ for BAP participants. These results are similar to past research related to strengths and weaknesses in the BAP, respectively, and suggest that controlled processing strategy use may be required in instances of weak lower-level skills. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Dosimetry and control of radiation processing

    International Nuclear Information System (INIS)

    1988-01-01

    Eight invited papers on the general theme of 'Dosimetry and Control of Radiation Processing', presented at a one day symposium held at the National Physical Laboratory, are collected together in this document. Seven of the papers are selected and indexed separately. (author)

  12. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    Science.gov (United States)

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for monitoring the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables, encompassing three major units (a twin-screw high-shear granulator, a fluid bed dryer and a product control unit), were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions over a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow, and in the powder dosing unit mass flow were used to evaluate the model's monitoring performance. The impact of the imposed deviations on process continuity was also evaluated using Hotelling's T² and Q residual statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
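    The T² and Q monitoring statistics used in such an MSPC scheme can be sketched generically: fit a PCA model on normal-operating data, then flag samples whose variation inside the model (T²) or residual outside it (Q) is abnormal. The data below are synthetic, not the ConsiGma™-25 dataset.

    ```python
    import numpy as np

    def fit_pca_monitor(X, n_pc):
        """Fit a PCA monitoring model on normal-operating-condition data."""
        mu = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
        P = Vt[:n_pc].T                       # loadings, shape (vars, n_pc)
        var = (s[:n_pc] ** 2) / (len(X) - 1)  # score variances
        return mu, P, var

    def t2_q(x, mu, P, var):
        """Hotelling's T^2 and Q (squared prediction error) for one sample."""
        xc = x - mu
        t = P.T @ xc                          # scores in the PCA subspace
        t2 = float(np.sum(t**2 / var))
        resid = xc - P @ t                    # part the model cannot explain
        q = float(resid @ resid)
        return t2, q

    rng = np.random.default_rng(0)
    scores = rng.normal(size=(200, 2))
    load = np.linalg.qr(rng.normal(size=(5, 2)))[0]   # orthonormal loadings
    X = scores @ load.T + 0.01 * rng.normal(size=(200, 5))

    mu, P, var = fit_pca_monitor(X, n_pc=2)
    normal_q = max(t2_q(x, mu, P, var)[1] for x in X)
    disturbed = X[0].copy()
    disturbed[3] += 1.0                       # break the correlation structure
    _, q_fault = t2_q(disturbed, mu, P, var)
    ```

    In practice the control limits come from distributional approximations (or percentiles of the normal-operating runs) rather than the training maximum used here, and contribution plots decompose a violated statistic back onto the individual variables.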

  13. Statistical process control for residential treated wood

    Science.gov (United States)

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  14. Radionuclides for process control and inspection

    International Nuclear Information System (INIS)

    Hadden, R.J.B.

    1987-01-01

    Radiation sources have been used in process control for over 40 years. Their use in inspection, implying visual examination, although of much earlier origin in the form of gamma radiography, has also recently re-emerged in the form of tomographic methods. This paper first reviews the justification for the continued world-wide use of isotopic methods. It then reviews a selection of innovative process control applications based on radiation sources, as illustrations of the present state of the art, and describes recent progress in inspection methods, including progress in the development of on-line facilities. For all applications involving radiation sources, careful selection of parameters is required to achieve the highest efficiency compatible with an integrity suitable for the intended application. The paper concludes with a brief discussion of the common principles on which the fabrication of sources is based in order to satisfy national and international safety legislation. (author)

  15. Real-time control data wrangling for development of mathematical control models of technological processes

    Science.gov (United States)

    Vasilyeva, N. V.; Koteleva, N. I.; Fedorova, E. R.

    2018-05-01

    The relevance of the research stems from the need to stabilize the composition of the melting products of copper-nickel sulfide raw materials in the Vanyukov furnace. The goal of this research is to identify the most suitable methods of aggregating real-time data for the development of a mathematical model to control the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace. Statistical methods for analyzing the historical data of a real technological object, and correlation analysis of the process parameters, are described. Factors that exert the greatest influence on the main output parameter (copper content in matte) and ensure the physical-chemical transformations are identified. An approach to processing real-time data for the development of a control model of the melting process is proposed, and the stages of processing the real-time information are considered. The adopted methodology for aggregating data suitable for developing a control model of the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace allows the obtained results to be interpreted for further practical application.
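    The correlation-screening step described above amounts to ranking candidate inputs by their correlation with the output parameter. A minimal sketch (the parameter names and logged values are hypothetical, not furnace data):

    ```python
    from math import sqrt

    def pearson(xs, ys):
        """Pearson correlation coefficient between two equal-length series."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    def rank_inputs(history, output_key):
        """Rank candidate input parameters by |correlation| with the output."""
        ys = history[output_key]
        scores = {k: abs(pearson(v, ys))
                  for k, v in history.items() if k != output_key}
        return sorted(scores, key=scores.get, reverse=True)

    # Hypothetical logged parameters; 'cu_in_matte' is the output of interest
    history = {
        "cu_in_matte": [52, 55, 51, 58, 54, 57],
        "oxygen_flow": [10, 12, 9, 14, 11, 13],   # tracks the output closely
        "feed_rate":   [30, 29, 31, 30, 28, 32],  # weakly related
    }
    best = rank_inputs(history, "cu_in_matte")[0]
    ```

    Real wrangling of control-system logs adds steps the sketch omits: time-aligning signals with different sampling rates, filtering outliers, and lagging inputs against the output before correlating.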

  16. Some Aspects of Process Computers Configuration Control in Nuclear Power Plant Krsko - Process Computer Signal Configuration Database (PCSCDB)

    International Nuclear Information System (INIS)

    Mandic, D.; Kocnar, R.; Sucic, B.

    2002-01-01

    During the operation of NEK and other nuclear power plants, it has been recognized that certain issues related to the use of digital equipment and associated software in NPP technological process protection, control and monitoring are not adequately addressed in the existing programs and procedures. The term and process of Process Computer Configuration Control join three 10CFR50 Appendix B quality requirements for process computer applications in NPPs: Design Control, Document Control, and Identification and Control of Materials, Parts and Components. This paper describes the Process Computer Signal Configuration Database (PCSCDB), which was developed and implemented to resolve some aspects of Process Computer Configuration Control related to the signals, or database points, that exist in the life cycle of the different Process Computer Systems (PCS) in Nuclear Power Plant Krsko. PCSCDB is a controlled master database covering the definition and description of the configurable database points associated with all Process Computer Systems in NEK. PCSCDB holds attributes related to the configuration of addressable and configurable real-time database points, and attributes related to signal life cycle references and history data, such as: Input/Output signals; manually input database points; program constants; setpoints; calculated (by application program or SCADA calculation tools) database points; control flags (for example, enabling or disabling a certain program feature); signal acquisition design references to the DCM (Document Control Module, application software for document control within the Management Information System - MIS) and MECL (Master Equipment and Component List, MIS application software for identification and configuration control of plant equipment and components); usage of particular database points in particular application software packages and in man-machine interface features (display mimics, printout reports, ...); and signal history (EEAR Engineering

  17. AN OVERVIEW OF PHARMACEUTICAL PROCESS VALIDATION AND PROCESS CONTROL VARIABLES OF TABLETS MANUFACTURING PROCESSES IN INDUSTRY

    OpenAIRE

    Mahesh B. Wazade*, Sheelpriya R. Walde and Abhay M. Ittadwar

    2012-01-01

    Validation is an integral part of quality assurance; product quality derives from careful attention to a number of factors, including selection of quality parts and materials, adequate product and manufacturing process design, control of process variables, and in-process and end-product testing. Recently, validation has become one of the pharmaceutical industry's most recognized and discussed subjects. It is a critical success factor in product approval and ongoing commercialization, fac...

  18. Quality Control in Automated Manufacturing Processes – Combined Features for Image Processing

    Directory of Open Access Journals (Sweden)

    B. Kuhlenkötter

    2006-01-01

    In production processes the use of image processing systems is widespread, and hardware solutions and cameras are available for nearly every application. One important challenge for image processing systems is the development and selection of appropriate algorithms and software solutions in order to realise ambitious quality control of production processes. This article characterises the development of innovative software that combines features for automatic defect classification on product surfaces. The artificial intelligence method Support Vector Machine (SVM) is used to execute the classification task according to the combined features. This software is one crucial element in the automation of a manually operated production process.

  19. Potential use of advanced process control for safety purposes during attack of a process plant

    International Nuclear Information System (INIS)

    Whiteley, James R.

    2006-01-01

    Many refineries and commodity chemical plants employ advanced process control (APC) systems to improve throughputs and yields. These APC systems utilize empirical process models for control purposes and enable operation closer to constraints than can be achieved with traditional PID regulatory feedback control. Substantial economic benefits are typically realized from the addition of APC systems. This paper considers leveraging the control capabilities of existing APC systems to minimize the potential impact of a terrorist attack on a process plant (e.g., petroleum refinery). Two potential uses of APC are described. The first is a conventional application of APC and involves automatically moving the process to a reduced operating rate when an attack first begins. The second is a non-conventional application and involves reconfiguring the APC system to optimize safety rather than economics. The underlying intent in both cases is to reduce the demands on the operator to allow focus on situation assessment and optimal response planning. An overview of APC is provided along with a brief description of the modifications required for the proposed new applications of the technology

  20. Parallel computing for event reconstruction in high-energy physics

    International Nuclear Information System (INIS)

    Wolbers, S.

    1993-01-01

    Parallel computing has been recognized as a solution to large computing problems. In high-energy physics, offline event reconstruction of detector data is a very large computing problem that has been solved with parallel computing techniques. A review is given of the parallel programming package CPS (Cooperative Processes Software), developed and used at Fermilab for offline reconstruction of terabytes of data requiring the delivery of hundreds of VAX-years of computing per experiment. The Fermilab UNIX farms, consisting of 180 Silicon Graphics workstations and 144 IBM RS6000 workstations, provide the computing power for the experiments. Fermilab has a long history of providing production parallel computing, starting with the ACP (Advanced Computer Project) farms in 1986. The Fermilab UNIX farms have been in production for over two years with 24 hour/day service to experimental user groups. Additional tools for managing, controlling and monitoring these large systems are described, and possible future directions for parallel computing in high-energy physics are given.
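
The farm model described above works because detector events are statistically independent, so each one can be reconstructed in isolation. A minimal sketch of that pattern in Python (a toy stand-in, not CPS itself; a thread-backed Pool is used for brevity, and `multiprocessing.Pool` would give true process parallelism):

```python
from multiprocessing.dummy import Pool  # thread-backed Pool; same API as process pools

def reconstruct(event):
    # Stand-in for per-event reconstruction work: here, sum the raw hit energies.
    return sum(event)

# Events are independent, so a farm parallelises them trivially.
events = [[1.0, 2.0], [3.0, 4.0], [5.0]]
with Pool(2) as pool:
    results = pool.map(reconstruct, events)
print(results)  # one reconstructed value per event, in input order
```

Because `pool.map` preserves input order, results can be merged back into an output stream exactly as a sequential job would produce them.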

  1. An introduction to statistical process control in research proteomics.

    Science.gov (United States)

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. The aim is to introduce statistical process control as an objective strategy for quality control and to show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier.
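
The "easy to derive and implement" control rules mentioned above can be as simple as a Shewhart individuals chart: estimate 3-sigma limits from in-control baseline runs, then flag later runs falling outside them. A minimal Python sketch (the QC metric and all values are hypothetical):

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Derive 3-sigma limits for an individuals (X) chart from in-control data."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def out_of_control(values, lcl, ucl):
    """Flag indices violating the basic Shewhart rule: outside the 3-sigma limits."""
    return [i for i, x in enumerate(values) if x < lcl or x > ucl]

# Hypothetical QC metric, e.g. a spiked-peptide peak area tracked per run.
baseline = [100, 102, 98, 101, 99, 100, 103, 97]
lcl, ucl = control_limits(baseline)
new_runs = [101, 99, 115, 100]
flags = out_of_control(new_runs, lcl, ucl)
print(flags)  # indices of runs needing investigation
```

Richer rule sets (e.g. runs of points on one side of the centre line) extend `out_of_control` without changing the overall workflow.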

  2. Process controls for radiation hardened aluminum gate bulk silicon CMOS

    International Nuclear Information System (INIS)

    Gregory, B.L.

    1975-01-01

    Optimized dry oxides have recently yielded notable improvements in CMOS radiation-hardness. By following the proper procedures and recipes, it is now possible to produce devices which will function satisfactorily after exposure to a total ionizing dose in excess of 10^6 rads(Si). This paper is concerned with the controls required on processing parameters once the optimized process is defined. In this process, the pre-irradiation electrical parameters must be closely controlled to ensure that devices will function after irradiation. In particular, the specifications on n- and p-channel threshold voltages require tight control of fixed oxide charge, surface-state density, oxide thickness, and substrate and p-well surface concentrations. In order to achieve the above level of radiation hardness, certain processing procedures and parameters must also be closely controlled. Higher levels of cleanliness are required in the hardened process than are commonly required for commercial CMOS since, for hardened dry oxides, no impurity gettering can be employed during or after oxidation. Without such gettering, an unclean oxide is unacceptable due to bias-temperature instability. Correct pre-oxidation cleaning, residual surface damage removal, proper oxidation and annealing temperatures and times, and the correct metal sintering cycle are all important in determining device hardness. In a reproducible, hardened process, each of these processing steps must be closely controlled. (U.S.)

  3. Model Predictive Control of Mineral Column Flotation Process

    Directory of Open Access Journals (Sweden)

    Yahui Tian

    2018-06-01

    Column flotation is an efficient method commonly used in the mineral industry to separate useful minerals from ores of low grade and complex mineral composition. Its main purpose is to achieve maximum recovery while ensuring desired product grade. This work addresses a model predictive control design for a mineral column flotation process modeled by a set of nonlinear coupled heterodirectional hyperbolic partial differential equations (PDEs) and ordinary differential equations (ODEs), which accounts for the interconnection of well-stirred regions represented by continuous stirred tank reactors (CSTRs) and transport systems given by heterodirectional hyperbolic PDEs, with these two regions combined through the PDEs' boundaries. The model predictive control considers both optimality of the process operations and naturally present input and state/output constraints. For the discrete controller design, spatially varying steady-state profiles are obtained by linearizing the coupled ODE–PDE model, and then the discrete system is obtained by using the Cayley–Tustin time discretization transformation without any spatial discretization and/or model reduction. The model predictive controller is designed by solving an optimization problem with input and state/output constraints as well as input disturbance to minimize the objective function, which leads to an online-solvable finite constrained quadratic regulator problem. Finally, the controller's ability to keep the output at the steady state within the constraint range is demonstrated by simulation studies, and it is concluded that the optimal control scheme presented in this work makes this flotation process more efficient.
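
Stripped of the PDE/ODE machinery, the receding-horizon idea behind model predictive control can be sketched with a one-dimensional linear plant: at each step, search over candidate input sequences, score a cost over the prediction horizon, apply only the first input, then repeat. A toy Python illustration (plant, horizon and weights are invented for the example; a real design solves a constrained quadratic program rather than enumerating inputs):

```python
from itertools import product

def simulate(x, u_seq, a=0.8, b=0.5):
    """Roll the toy linear plant x' = a*x + b*u forward over an input sequence."""
    states = []
    for u in u_seq:
        x = a * x + b * u
        states.append(x)
    return states

def mpc_step(x, horizon=3, u_grid=(-1.0, 0.0, 1.0), setpoint=0.0):
    """Brute-force receding horizon: evaluate every admissible input sequence
    on the prediction model, score tracking error plus input effort, and
    return only the first move of the best sequence."""
    best_u, best_cost = 0.0, float("inf")
    for u_seq in product(u_grid, repeat=horizon):
        cost = sum((xk - setpoint) ** 2 for xk in simulate(x, u_seq))
        cost += 0.1 * sum(u * u for u in u_seq)  # penalise control effort
        if cost < best_cost:
            best_cost, best_u = cost, u_seq[0]
    return best_u

# Closed loop: the controller steers the state toward the setpoint.
x = 2.0
for _ in range(5):
    x = 0.8 * x + 0.5 * mpc_step(x)
print(round(x, 3))
```

Input constraints appear here for free: the grid of candidate inputs simply never contains an inadmissible move, which is the enumeration analogue of the constrained quadratic regulator in the paper.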

  4. First Dutch Process Control Security Event

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2008-01-01

    On May 21st , 2008, the Dutch National Infrastructure against Cyber Crime (NICC) organised their first Process Control Security Event. Mrs. Annemarie Zielstra, the NICC programme manager, opened the event. She welcomed the over 100 representatives of key industry sectors. “Earlier studies in the

  5. Memory-type control charts for monitoring the process dispersion

    NARCIS (Netherlands)

    Abbas, N.; Riaz, M.; Does, R.J.M.M.

    2014-01-01

    Control charts have been broadly used for monitoring the process mean and dispersion. Cumulative sum (CUSUM) and exponentially weighted moving average (EWMA) control charts are memory control charts as they utilize the past information in setting up the control structure. This makes CUSUM and
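
Both memory charts named above reduce to a one-line recursion over the data: the EWMA geometrically discounts past observations, while the CUSUM accumulates deviations beyond a slack value. A minimal Python sketch (target, smoothing constant and slack are illustrative):

```python
def ewma(values, lam=0.2, start=0.0):
    """EWMA statistic z_i = lam*x_i + (1-lam)*z_{i-1}: past data decay geometrically."""
    z, out = start, []
    for x in values:
        z = lam * x + (1 - lam) * z
        out.append(round(z, 4))
    return out

def cusum(values, target=0.0, k=0.5):
    """One-sided upper CUSUM: accumulate deviations above target beyond slack k."""
    c, out = 0.0, []
    for x in values:
        c = max(0.0, c + (x - target - k))
        out.append(round(c, 4))
    return out

data = [0.2, -0.1, 1.5, 1.2, 1.4]
print(ewma(data))
print(cusum(data))
```

A signal is raised when either statistic crosses its control limit; because both carry memory, they detect small sustained shifts far sooner than a Shewhart chart would.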

  6. Process qualification and control in electron beams--requirements, methods, new concepts and challenges

    International Nuclear Information System (INIS)

    Mittendorfer, J.; Gratzl, F.; Hanis, D.

    2004-01-01

    In this paper the status of process qualification and control in electron beam irradiation is analyzed in terms of requirements, concepts, methods and challenges for a state-of-the-art process control concept for medical device sterilization. Aspects from process qualification to routine process control are described together with the associated process variables. As a case study the 10 MeV beams at Mediscan GmbH are considered. Process control concepts like statistical process control (SPC) and a new concept to determine process capability is briefly discussed

  7. Computer programme for control and maintenance and object oriented database: application to the realisation of a particle accelerator, the VIVITRON

    International Nuclear Information System (INIS)

    Diaz, A.

    1996-01-01

    The command and control system for the Vivitron, a new generation electrostatic particle accelerator, has been implemented using workstations and front-end computers using VME standards, the whole within an environment of UNIX/VxWorks. This architecture is distributed over an Ethernet network. Measurements and commands of the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software giving better performance and more functionality is described. X11-based communication has been utilised to transmit all the necessary information from the front-end computers to the graphic screens for parameter display. All other communication between processes uses the Remote Procedure Call method (RPC). The conception of the system is based largely on the object oriented database O2, which integrates a full description of the equipment and the code necessary to manage it. This code is generated by the database. This innovation permits easy maintenance of the system and bypasses the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author)

  8. 10 CFR 20.1701 - Use of process or other engineering controls.

    Science.gov (United States)

    2010-01-01

    The licensee shall use, to the extent practical, process or other engineering controls (e.g., containment, decontamination, or ventilation) to control the concentration of...

  9. Milestones in screen-based process control

    International Nuclear Information System (INIS)

    Guesnier, G.P.

    1995-01-01

    The German approach is based on the utilisation of the conceptual elements of the PRISCA information system developed by Siemens and on operational experience with screen-based process control in a conventional power plant. In the French approach, the screen-based control room for the N4 plants, designed from scratch, underwent extensive simulator tests for validation before going into realisation; it is now used in the commissioning phase of the first N4 plants. The design of the control room for the European Pressurized Water Reactor will be based on the common experience of Siemens and Electricite de France. Its main elements are several separate operator workstations, a safety control area used as a back-up for postulated failures of the workstations, and a shared plant overview supporting the operators' coordination. (orig./HP) [de

  10. Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System

    Directory of Open Access Journals (Sweden)

    Stephan Birle

    2016-01-01

    In the food industry, bioprocesses like fermentation are often a crucial part of the manufacturing process and decisive for the final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes with traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. However, in order to maintain process safety and product quality it is necessary to specify the controller performance and to tune the controller parameters. In this work, an approach is presented to establish an intelligent control system for oxidoreductive yeast propagation as a representative process biased by the aforementioned uncertainties. The presented approach is based on statistical process control and fuzzy logic feedback control. As the cognitive uncertainty among different experts about the limits that define the control performance as still acceptable may differ a lot, a data-driven design method is performed. Based upon a historic data pool, statistical process corridors are derived for the controller inputs (control error and change in control error). This approach follows the hypothesis that if the control performance criteria stay within predefined statistical boundaries, the final process state meets the required quality definition. In order to keep the process on its optimal growth trajectory (model-based reference trajectory), a fuzzy logic controller is used that adjusts the process temperature. Additionally, in order to stay within the process corridors, a genetic algorithm was applied to tune the input and output fuzzy sets of a preliminarily parameterized fuzzy controller. The presented experimental results show that the genetic tuned fuzzy controller is able to keep the process within its allowed limits. The average absolute error to the
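
The fuzzy feedback idea itself can be reduced to a small sketch: triangular membership functions weigh a set of rules over the control error, and a centroid over singleton outputs yields the control move. The rule base and values below are hypothetical, far simpler than the genetically tuned controller of the study:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b with support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_temp_change(error):
    """Map control error to a temperature change via three rules and
    centroid defuzzification over singleton outputs (all values hypothetical)."""
    rules = [
        (tri(error, -2.0, -1.0, 0.0), -0.5),  # error negative -> cool
        (tri(error, -1.0,  0.0, 1.0),  0.0),  # error near zero -> hold
        (tri(error,  0.0,  1.0, 2.0), +0.5),  # error positive -> heat
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(fuzzy_temp_change(0.5))  # partial firing of "hold" and "heat" rules
```

Tuning such a controller, genetically or by hand, amounts to moving the breakpoints of `tri` and the output singletons, which is exactly what the study's genetic algorithm automates.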

  11. Digital signal processing in power system protection and control

    CERN Document Server

    Rebizant, Waldemar; Wiszniewski, Andrzej

    2011-01-01

    Digital Signal Processing in Power System Protection and Control bridges the gap between the theory of protection and control and the practical applications of protection equipment. Understanding how protection functions is crucial not only for equipment developers and manufacturers, but also for their users who need to install, set and operate the protection devices in an appropriate manner. After introductory chapters related to protection technology and functions, Digital Signal Processing in Power System Protection and Control presents the digital algorithms for signal filtering, followed

  12. Diagnostic system for process control at NPP Dukovany load follow

    International Nuclear Information System (INIS)

    Rubek, J.; Petruzela, I.

    1998-01-01

    The NPP Dukovany has been operated in frequency control mode since 1996. Last year a project for plant load follow was developed; one part of the project is to install a diagnostic system for process control. At present the main control loops of the plant control system are regularly tested only after unit refuelling. The functionality and the adjustment of control system parameters are tested by certificated procedures. This state is unsuitable in view of the planned load follow operation. The relevant operational modes are based on minimising the influence on plant component lifetime and on achieving the planned unit parameters. It is therefore necessary to test the main parts of the control system over shorter periods, mainly when the unit is actually in load follow operation. The paper describes the diagnostic system for process control which will be implemented at NPP Dukovany. The principle of the system is the evaluation of real and expected changes of technological variables. The system utilises the thermohydraulic relations among the main technological variables and the relations between controlled and manipulated variables. Outputs of the system will be used to support the operational staff during plant operation. It enables: determination of the control system state, estimation and checking of the future control system state, early indication of deviations of the process from normal conditions, and checking of the efficiency of operational staff interventions into plant control. The system gives the plant operator new information for plant process control. Simultaneously, the coupling of the new system outputs to the existing signalisation is solved. (author)

  13. Requirements for a modern process control system (PLS); Anforderungen an ein modernes Prozessleitsystem (PLS)

    Energy Technology Data Exchange (ETDEWEB)

    Maurer, Michael [SAR Elektronic GmbH, Dingolfing (Germany)

    2012-11-01

    The process control system is of crucial importance for the process management of process engineering plants. It has to enable the operation and surveillance of processes, register critical process conditions and provide process data for process evaluation. The process control system delivers real-time process data to superior process engineering systems (MES, ERP) and implements control commands of superior systems on the plant. The market for process control systems is characterized by a variety of different systems, a wide range of project-specific and customer-specific configurations, and different releases. Control systems are in competition with programmable logic controllers and black boxes. The satisfaction of the user with his process control system depends significantly on the quality of execution attained by the supplying company and on the power plant library used; it does not depend on the chosen brand of process control system. The availability of a process control system depends on the chosen system architecture and the chosen components, but likewise not on the brand.

  14. Comparison Analysis of Model Predictive Controller with Classical PID Controller For pH Control Process

    Directory of Open Access Journals (Sweden)

    V. Balaji

    2016-12-01

    pH control plays an important role in chemical plants and the process industries. For the past four decades the classical PID controller has dominated industrial practice, but faster computing technology and industrial demands now call for tighter, more advanced control strategies. To fulfill these needs and requirements, Model Predictive Control (MPC) is the best among the advanced control algorithms available in the present scenario. A study and analysis has been done for a First Order plus Delay Time (FOPDT) model controlled by a Proportional Integral Derivative (PID) controller and by MPC, using the Matlab software. This paper explores the capability of the MPC strategy and compares its control effects with the conventional control strategy in pH control. A comparison of results between the PID and MPC controllers is plotted using the software. The results clearly show that MPC provides better performance than the classical controller.
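
For readers without Matlab, the PID half of such a comparison is easy to reproduce: simulate an FOPDT plant with a dead-time buffer and Euler integration, and close the loop with a discrete PID law. A Python sketch (plant and tuning values are invented for illustration, not taken from the paper):

```python
from collections import deque

def simulate_pid(kp, ki, kd, setpoint=1.0, steps=200, dt=0.1):
    """Discrete PID driving a hypothetical FOPDT plant
    (gain 2, time constant 5 s, dead time 1 s), Euler-integrated.
    Returns the plant output after `steps` samples."""
    K, tau, theta = 2.0, 5.0, 1.0
    delay = deque([0.0] * int(theta / dt))  # dead-time buffer
    y, integral, prev_err = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt
        u = kp * err + ki * integral + kd * (err - prev_err) / dt
        prev_err = err
        delay.append(u)
        u_delayed = delay.popleft()      # control acts after the dead time
        y += dt * (K * u_delayed - y) / tau  # first-order response
    return y

print(round(simulate_pid(kp=1.0, ki=0.2, kd=0.0), 3))
```

With the integral term active, the output settles near the setpoint despite the dead time; an MPC comparison would replace the PID law with a receding-horizon optimisation over the same plant model.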

  15. Materials of the Regional Training Course on Validation and Process Control for Electron Beam Radiation Processing

    International Nuclear Information System (INIS)

    Kaluska, I.; Gluszewski, W.

    2007-01-01

    Irradiation with electron beams is used in the polymer industry, food, pharmaceutical and medical device industries for sterilization of surfaces. About 20 lectures presented during the Course were devoted to all aspects of control and validation of low energy electron beam processes. They should help the product manufacturers better understand the application of the ANSI/AAMI/ISO 11137 norm, which defines the requirements and standard practices for validation of the irradiation process and the process controls required during routine processing

  16. Process automation using combinations of process and machine control technologies with application to a continuous dissolver

    International Nuclear Information System (INIS)

    Spencer, B.B.; Yarbro, O.O.

    1991-01-01

    Operation of a continuous rotary dissolver, designed to leach uranium-plutonium fuel from chopped sections of reactor fuel cladding using nitric acid, has been automated. The dissolver is a partly continuous, partly batch process that interfaces at both ends with batchwise processes, thereby requiring synchronization of certain operations. Liquid acid is fed and flows through the dissolver continuously, whereas chopped fuel elements are fed to the dissolver in small batches and move through the compartments of the dissolver stagewise. Sequential logic (or machine control) techniques are used to control discrete activities such as the sequencing of isolation valves. Feedback control is used to control acid flowrates and temperatures. Expert systems technology is used for on-line material balances and diagnostics of process operation. 1 ref., 3 figs
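
The sequential-logic layer of such an automation scheme is naturally expressed as a finite state machine: discrete events drive transitions between process phases, and illegal events are ignored. A toy Python sketch with hypothetical states and events (not the actual dissolver logic):

```python
# Allowed transitions for a hypothetical batch-feed cycle, mirroring the
# sequential-logic (machine control) layer described above.
TRANSITIONS = {
    ("idle", "charge"): "feeding",
    ("feeding", "fed"): "leaching",
    ("leaching", "done"): "discharging",
    ("discharging", "empty"): "idle",
}

def step(state, event):
    """Advance the sequence; events illegal in the current state are ignored."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["charge", "fed", "done", "empty"]:
    state = step(state, event)
print(state)  # back to "idle" after one full cycle
```

The feedback layer (flow and temperature loops) runs continuously alongside this machine; only the discrete batch synchronisation lives in the state table.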

  17. Statistical process control for alpha spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, W; Majoras, R E [Oxford Instruments, Inc. P.O. Box 2560, Oak Ridge TN 37830 (United States); Joo, I O; Seymour, R S [Accu-Labs Research, Inc. 4663 Table Mountain Drive, Golden CO 80403 (United States)

    1995-10-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why a problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs.
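
Alongside the control charts, the process capability plots mentioned above rest on two standard indices: Cp compares the specification width to the process spread, and Cpk additionally penalises an off-centre mean. A minimal Python sketch (the readings and specification limits are hypothetical):

```python
from statistics import mean, stdev

def capability(values, lsl, usl):
    """Process capability indices from measured data:
    Cp = spec width / 6-sigma spread; Cpk uses the nearer spec limit."""
    m, s = mean(values), stdev(values)
    cp = (usl - lsl) / (6 * s)
    cpk = min(usl - m, m - lsl) / (3 * s)
    return round(cp, 3), round(cpk, 3)

# Hypothetical detector-efficiency checks against specification limits.
readings = [0.248, 0.252, 0.250, 0.249, 0.251, 0.250]
print(capability(readings, lsl=0.240, usl=0.260))
```

A common rule of thumb treats Cpk above about 1.33 as a capable process; when the mean is centred between the limits, Cp and Cpk coincide, as they do here.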

  18. Statistical process control for alpha spectroscopy

    International Nuclear Information System (INIS)

    Richardson, W.; Majoras, R.E.; Joo, I.O.; Seymour, R.S.

    1995-01-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC, which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why a problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs

  19. Advanced process monitoring and feedback control to enhance cell culture process production and robustness.

    Science.gov (United States)

    Zhang, An; Tsang, Valerie Liu; Moore, Brandon; Shen, Vivian; Huang, Yao-Ming; Kshirsagar, Rashmi; Ryll, Thomas

    2015-12-01

    It is a common practice in biotherapeutic manufacturing to define a fixed-volume feed strategy for nutrient feeds, based on historical cell demand. However, once the feed volumes are defined, they are inflexible to batch-to-batch variations in cell growth and physiology and can lead to inconsistent productivity and product quality. In an effort to control critical quality attributes and to apply process analytical technology (PAT), a fully automated cell culture feedback control system has been explored in three different applications. The first study illustrates that frequent monitoring and automatically controlling the complex feed based on a surrogate (glutamate) level improved protein production. More importantly, the resulting feed strategy was translated into a manufacturing-friendly manual feed strategy without impact on product quality. The second study demonstrates the improved process robustness of an automated feed strategy based on online bio-capacitance measurements for cell growth. In the third study, glucose and lactate concentrations were measured online and were used to automatically control the glucose feed, which in turn changed lactate metabolism. These studies suggest that the auto-feedback control system has the potential to significantly increase productivity and improve robustness in manufacturing, with the goal of ensuring process performance and product quality consistency. © 2015 Wiley Periodicals, Inc.
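
The third study's glucose-based feed control can be caricatured as a clamped proportional law: raise the feed rate when the online glucose reading falls below target, lower it when glucose accumulates. A deliberately simple Python sketch (all parameter values are illustrative, not from the cited work, and the actual controller is considerably more sophisticated):

```python
def glucose_feed_rate(glucose, target=2.0, base_rate=1.0, gain=0.5,
                      min_rate=0.0, max_rate=3.0):
    """Proportional feed adjustment from an online glucose reading (g/L):
    feed more when glucose runs low, less when it accumulates.
    All parameter values are hypothetical."""
    rate = base_rate + gain * (target - glucose)
    return max(min_rate, min(max_rate, rate))  # clamp to pump limits

print(glucose_feed_rate(1.0))  # below target -> increased feed
print(glucose_feed_rate(3.5))  # above target -> reduced feed
```

In the automated system, this calculation would run on each online measurement cycle, closing the loop between the analyser and the feed pump.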

  20. Computer programme for control and maintenance and object oriented database: application to the realisation of a particle accelerator, the VIVITRON; Logiciel de controle et commande et base de donnees orientee objet: application dans le cadre de la mise en oeuvre d'un accelerateur de particules, le VIVITRON

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, A

    1996-01-11

    The command and control system for the Vivitron, a new generation electrostatic particle accelerator, has been implemented using workstations and front-end computers using VME standards, the whole within an environment of UNIX/VxWorks. This architecture is distributed over an Ethernet network. Measurements and commands of the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software giving better performance and more functionality is described. X11-based communication has been utilised to transmit all the necessary information from the front-end computers to the graphic screens for parameter display. All other communication between processes uses the Remote Procedure Call method (RPC). The conception of the system is based largely on the object oriented database O2, which integrates a full description of the equipment and the code necessary to manage it. This code is generated by the database. This innovation permits easy maintenance of the system and bypasses the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author) 38 refs.

  1. Novel strategies for control of fermentation processes

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart; Sin, Gürkan

    Bioprocesses are inherently sensitive to fluctuations in processing conditions and must be tightly regulated to maintain cellular productivity. Industrial fermentations are often difficult to replicate across production sites or between facilities as the small operating differences in the equipment...... of a fermentation. Industrial fermentation processes are typically operated in fed batch mode, which also poses specific challenges for process monitoring and control. This is due to many reasons including non-linear behaviour, and a relatively poor understanding of the system dynamics. It is therefore challenging...

  2. Process management and controlling in diagnostic radiology

    International Nuclear Information System (INIS)

    Gocke, P.; Debatin, J.F.; Duerselen, L.F.J.

    2002-01-01

    Systematic process management and efficient quality control are rapidly gaining importance in our healthcare system. What does this mean for diagnostic radiology departments? To improve efficiency, quality and productivity, the workflow within the department of diagnostic and interventional radiology at the University Hospital of Essen was restructured over the last two years, and a controlling system was established. One of the pursued aims was to create a quality management system as a basis for subsequent certification according to the ISO EN 9001:2000 norm. Central to the success of the workflow reorganisation was the training of selected members of the department's staff in process and quality management theory. Thereafter, a dedicated working group was created to prepare the reorganisation and the subsequent ISO certification with the support of a consulting partner. To assure a smooth implementation of the restructured workflow and create acceptance for the required ISO-9001 documentation, the entire staff was familiarized with the basic ideas of process and quality management in several training sessions. This manuscript summarizes the basic concepts of process and quality management as they were taught to our staff. A direct relationship to diagnostic radiology is maintained throughout the text. (orig.) [de

  3. Development of the operational information processing platform

    International Nuclear Information System (INIS)

    Shin, Hyun Kook; Park, Jeong Seok; Baek, Seung Min; Kim, Young Jin; Joo, Jae Yoon; Lee, Sang Mok; Jeong, Young Woo; Seo, Ho Jun; Kim, Do Youn; Lee, Tae Hoon

    1996-02-01

    The Operational Information Processing Platform (OIPP) is a platform system designed to provide development and operation environments for plant operation and plant monitoring. It is based on the Plant Computer Systems (PCS) of the Yonggwang 3 and 4, Ulchin 3 and 4, and Yonggwang 5 and 6 Nuclear Power Plants (NPP). A UNIX-based workstation, a real-time kernel, and a graphics design tool were selected and installed after reviewing the functions of the PCS. In order to construct a development environment for an open system architecture and a distributed computer system, an open computer system architecture was adopted in both hardware and software. For verification of the system design and evaluation of technical methodologies, the PCS running under the OIPP is being designed and implemented. In this system, the man-machine interface and system functions are being designed and implemented to evaluate the differences between the UCN 3, 4 PCS and the OIPP. 15 tabs., 32 figs., 11 refs. (Author)

  4. Process theory for supervisory control of stochastic systems with data

    NARCIS (Netherlands)

    Markovski, J.

    2012-01-01

    We propose a process theory for supervisory control of stochastic nondeterministic plants with data-based observations. The Markovian process theory with data relies on the notion of Markovian partial bisimulation to capture controllability of stochastic nondeterministic systems. It presents a

  5. Controlling the Instructional Development Process. Training Development and Research Center Project Number Fifteen.

    Science.gov (United States)

    Sleezer, Catherine M.; Swanson, Richard A.

    Process control is a way for training managers in business and industry to plan, monitor, and communicate the instructional development process of training projects. Two simple and useful tools that managers use in controlling the process of instructional development are the Process Control Planning Sheet and the Process Control Record. The Process…

  6. Optimal Control of Beer Fermentation Process Using Differential ...

    African Journals Online (AJOL)

    Optimal Control of Beer Fermentation Process Using Differential Transform Method. ... Journal of Applied Sciences and Environmental Management ... The method of differential transform was used to obtain the solution governing the fermentation process; the system of equation was transformed using the differential ...

  7. Design issues of a reinforcement-based self-learning fuzzy controller for petrochemical process control

    Science.gov (United States)

    Yen, John; Wang, Haojin; Daugherity, Walter C.

    1992-01-01

    Fuzzy logic controllers have some often-cited advantages over conventional techniques such as PID control, including easier implementation, accommodation to natural language, and the ability to cover a wider range of operating conditions. One major obstacle that hinders the broader application of fuzzy logic controllers is the lack of a systematic way to develop and modify their rules; as a result the creation and modification of fuzzy rules often depends on trial and error or pure experimentation. One of the proposed approaches to address this issue is a self-learning fuzzy logic controller (SFLC) that uses reinforcement learning techniques to learn the desirability of states and to adjust the consequent part of its fuzzy control rules accordingly. Due to the different dynamics of the controlled processes, the performance of a self-learning fuzzy controller is highly contingent on its design. The design issue has not received sufficient attention. The issues related to the design of a SFLC for application to a petrochemical process are discussed, and its performance is compared with that of a PID and a self-tuning fuzzy logic controller.
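A minimal sketch of the fuzzy-control idea the abstract builds on: triangular membership functions over the error signal and weighted-average defuzzification. The three-rule base and its breakpoints are illustrative assumptions; the paper's self-learning SFLC and its reinforcement mechanism are not reproduced here.

```python
# Minimal fuzzy logic controller sketch (illustrative only; not the SFLC
# from the abstract). Triangular memberships over the error signal,
# defuzzified by a weighted average of the rule consequents.

def tri(x, a, b, c):
    """Triangular membership: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(error):
    # Rule base: (firing strength over error, crisp consequent).
    rules = [
        (tri(error, -2.0, -1.0, 0.0), -1.0),  # error negative -> decrease output
        (tri(error, -1.0,  0.0, 1.0),  0.0),  # error near zero -> hold
        (tri(error,  0.0,  1.0, 2.0),  1.0),  # error positive -> increase output
    ]
    num = sum(w * u for w, u in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

An error halfway between two rule centres, e.g. `fuzzy_control(0.5)`, fires two rules equally and yields the interpolated output 0.5; this smooth interpolation between rules is the property that lets such controllers cover a wide operating range.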

  8. Design and application of process control charting methodologies to gamma irradiation practices

    Science.gov (United States)

    Saylor, M. C.; Connaghan, J. P.; Yeadon, S. C.; Herring, C. M.; Jordan, T. M.

    2002-12-01

    The relationship between the contract irradiation facility and the customer has historically been based upon a "PASS/FAIL" approach, with few or no quality metrics used to gauge the control of the irradiation process. Application of process control charts, designed in coordination with mathematical simulation of routine radiation processing, can provide a basis for understanding irradiation events. By using tools that simulate the physical rules associated with the irradiation process, end-users can explore process-related boundaries and the effects of process changes. Consequently, the relationship between contractor and customer can evolve based on the derived knowledge. The resulting level of mutual understanding of the irradiation process and its control benefits both the customer and the contract operation, and provides necessary assurances to regulators. In this article we examine the complementary nature of theoretical (point kernel) and experimental (dosimetric) process evaluation, and the resulting by-product of improved understanding, communication and control generated through the implementation of effective process control charting strategies.

  9. Improving the effectiveness of machining parts by dynamic control of processing in the high-speed range

    Directory of Open Access Journals (Sweden)

    Yu.V. Shapoval

    2017-12-01

    Full Text Available In this article the possibility of increasing the efficiency of machining parts with a diameter of up to 20 mm is analyzed, namely: the vibration resistance of the cutting process during turning through control of the cutting speed; the forecasting and selection of rotational frequencies which ensure the stability of the machining system; and control of the process dynamics by displacement of an additional mass. A method for investigating vibration processes during turning was developed. Processing of the experimental data showed that when an oscillatory motion is superimposed on the spindle rotation, the overall level of oscillation decreases, which is reflected in the quality of the machined surface. Choosing in advance a spindle speed range in which the amplitude of tool oscillation in the radial direction to the workpiece is lowest allows the machining efficiency to be increased, while maintaining the drawing requirements for roughness, by raising the spindle speed. Aligning the node of the natural modes of oscillation with the cutting zone, by dynamically controlling the oscillations of the lathe through increased inertia of the machine and reduced tool oscillation amplitude, can improve machining accuracy and the roughness of the machined surface at higher spindle speeds.

  10. Quality control of CANDU6 fuel element in fabrication process

    International Nuclear Information System (INIS)

    Li Yinxie; Zhang Jie

    2012-01-01

    To enhance fine control over all aspects of the production process and improve product quality, quality process control activities for the CANDU6 fuel element fabrication process were carried out in a mode combining professional techniques with management technology. Strict control was focused on the weak links in CANDU6 fuel element quality, the end plug and brazing processes, and on the procedures associated with them. Measures included improving staff quality consciousness, strengthening equipment maintenance, improving tooling and fixtures, optimizing process tests, strengthening supervision and fine inspection operations, timely communication of quality information, and attention to the production environment. Problems and the factors affecting quality were traced to their source and resolved, forming a quality management approach of active control and comprehensive, systematic problem analysis. This effectively reduced the number of end plug weld microstructure failures and the number of zirconium alloy brazing defects, improved product quality, and created economic benefits, while staff quality consciousness, attention to detail, and interdepartmental collaboration and communication were greatly improved, achieving very good management effectiveness. (authors)

  11. In-line metallurgical process control in the steel industry

    International Nuclear Information System (INIS)

    Wanin, M.

    1993-01-01

    Steel product manufacturing involves a long line of complex processes: liquid metal elaboration, solidification, hot and cold transformation by rolling, and surface protection by coating. Process control aims at improving global productivity and the quality of the resulting products by optimizing each elementary process as well as the management of interfaces between tools or workshops. Complex processes, generally involving many variables, require more or less sophisticated models for their control. These process models are analytical when the physical and thermodynamic mechanisms are known, and statistical or knowledge-based otherwise. In any case, reliable and precise instrumentation is necessary to adjust undetermined parameters during model development and to take into account the variability of external parameters during routine operation. This instrumentation concerns both the running of machines and the testing of manufactured materials under the harsh environmental conditions of the iron and steel industry: temperature, dust, steam, electromagnetic interference, vibrations, etc. In this context, in-line non-destructive testing methods contribute efficiently because they can give product characteristics directly and in real time, integrating both the drift of machines and sensors due to ageing and abnormal spread in the material entering the process. These methods drive the development of sophisticated inspection equipment whose strategic significance is such that a failure to operate can require a production shutdown.
The paper gives some representative examples of improving the accuracy of an in-line measurement or of controlling elementary processes or process interfaces: temperature measurement by infrared pyrometry, thickness profile determination by X-ray array sensor, continuous recrystallization control by X-ray and ultrasonic methods, automatic detection and identification of surface defects by optics, cracks detection on

  12. Fourth Dutch Process Security Control Event

    NARCIS (Netherlands)

    Luiijf, H.A.M.; Zielstra, A.

    2010-01-01

    On December 1st, 2009, the fourth Dutch Process Control Security Event took place in Baarn, The Netherlands. The security event, with the title ‘Manage IT!’, was organised by the Dutch National Infrastructure against Cybercrime (NICC). In mid-November, a group of over thirty people participated in the

  13. Modified Smith-predictor multirate control utilizing secondary process measurements

    Directory of Open Access Journals (Sweden)

    Rolf Ergon

    2007-01-01

    Full Text Available The Smith predictor is a well-known control structure for industrial time-delay systems. The basic idea is to estimate the non-delayed process output by use of a process model, and to use this estimate in an inner feedback control loop combined with an outer feedback loop based on the delayed estimation error. The model used may be either mechanistic or identified from input-output data. The paper discusses improvements of the Smith predictor for systems where secondary process measurements without time delay are also available as a basis for the primary output estimation. The estimator may then be identified even in the common case where the primary outputs are sampled at a lower rate than the secondary outputs. A simulation example demonstrates the feasibility and advantages of the suggested control structure.
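The basic Smith-predictor structure described above can be sketched in discrete time for a first-order plant with input delay. The plant parameters, gain, and delay below are illustrative assumptions, as is the perfect internal model; the paper's multirate secondary-measurement estimator is not reproduced.

```python
# Discrete-time Smith predictor sketch (illustrative parameters).
# Plant: y[k+1] = a*y[k] + b*u[k-d]; the internal model shares a and b
# and is assumed perfect, so the delayed estimation error stays zero.

from collections import deque

a, b, d = 0.9, 0.1, 5        # plant pole, input gain, input delay (samples)
Kc = 2.0                     # proportional gain acting on the prediction
r = 1.0                      # setpoint

y = 0.0                      # measured (delayed) plant output
ym_fast = 0.0                # internal model output without the delay
u_buf = deque([0.0] * d)     # inputs still on their way into the plant
m_buf = deque([0.0] * d)     # model outputs delayed for the outer loop

for _ in range(300):
    ym_delayed = m_buf[0]
    # Inner loop on the undelayed model plus the outer correction on the
    # delayed estimation error (zero here, since the model is perfect).
    u = Kc * (r - (ym_fast + (y - ym_delayed)))
    y = a * y + b * u_buf.popleft()   # true plant, fed the delayed input
    u_buf.append(u)
    m_buf.popleft()
    m_buf.append(ym_fast)
    ym_fast = a * ym_fast + b * u     # the model sees the input immediately

print(round(y, 3))  # → 0.667: proportional-only control leaves an offset
```

Because the controller acts on the undelayed model output, the loop behaves as if the delay were outside the feedback path; the steady-state offset is the usual proportional-control offset, not a delay effect.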

  14. Process control of Low and Intermediate-level radioactive wastes solidification

    International Nuclear Information System (INIS)

    1993-01-01

    Safety guidelines issued by the Spanish Council of Nuclear Safety (CSN) set out the basic criteria which must be adopted for control of the waste solidification process, establishing, in addition, a series of protocols and basic contents to assist the elaboration of Process Control Programs

  15. State Space Reduction of Linear Processes using Control Flow Reconstruction

    NARCIS (Netherlands)

    van de Pol, Jan Cornelis; Timmer, Mark

    2009-01-01

    We present a new method for fighting the state space explosion of process algebraic specifications, by performing static analysis on an intermediate format: linear process equations (LPEs). Our method consists of two steps: (1) we reconstruct the LPE's control flow, detecting control flow parameters

  16. State Space Reduction of Linear Processes Using Control Flow Reconstruction

    NARCIS (Netherlands)

    van de Pol, Jan Cornelis; Timmer, Mark; Liu, Zhiming; Ravn, Anders P.

    2009-01-01

    We present a new method for fighting the state space explosion of process algebraic specifications, by performing static analysis on an intermediate format: linear process equations (LPEs). Our method consists of two steps: (1) we reconstruct the LPE's control flow, detecting control flow parameters

  17. Quality control of the documentation process in electronic economic activities

    Directory of Open Access Journals (Sweden)

    Krutova A.S.

    2017-06-01

    Full Text Available It is argued that the main tool for providing adequate information resources for electronic economic activity in social and economic relations is quality control of documentation processes, as the basis of the global information space. Problem directions are identified for evaluating information resources in the documentation process, namely the development of tools to assess the efficiency of system components (qualitative assessment) and the development of mathematical modeling tools (quantitative evaluation). A qualitative assessment of the electronic documentation of economic activity covers performance, efficiency of communication, document management efficiency, effectiveness of flow control operations, and relationship management effectiveness. The concept of quality control of the electronic documentation processes of economic activity is defined through components which include: the level of workflow; adequacy of information forms; consumer quality of documents; quality attributes; type of input data; condition monitoring systems; and the organizational level of process documentation. The components of the control system for electronic documents of economic entities are grounded. Components of IT audit of economic activity management systems are identified: compliance audit; audit of internal control; detailed multilevel analysis; and corporate risk assessment methodology. The stages and methods of processing electronic transactions in economic activity during condition monitoring of electronic economic activity are described.

  18. Assessing Subjectivity in Sensor Data Post Processing via a Controlled Experiment

    Science.gov (United States)

    Jones, A. S.; Horsburgh, J. S.; Eiriksson, D.

    2017-12-01

    Environmental data collected by in situ sensors must be reviewed to verify validity, and conducting quality control often requires making edits in post processing to generate approved datasets. This process involves decisions by technicians, data managers, or data users on how to handle problematic data. Options include: removing data from a series, retaining data with annotations, and altering data based on algorithms related to adjacent data points or the patterns of data at other locations or of other variables. Ideally, given the same dataset and the same quality control guidelines, multiple data quality control technicians would make the same decisions in data post processing. However, despite the development and implementation of guidelines aimed to ensure consistent quality control procedures, we have faced ambiguity when performing post processing, and we have noticed inconsistencies in the practices of individuals performing quality control post processing. Technicians with the same level of training and using the same input datasets may produce different results, affecting the overall quality and comparability of finished data products. Different results may also be produced by technicians that do not have the same level of training. In order to assess the effect of subjective decision making by the individual technician on the end data product, we designed an experiment where multiple users performed quality control post processing on the same datasets using a consistent set of guidelines, field notes, and tools. We also assessed the effect of technician experience and training by conducting the same procedures with a group of novices unfamiliar with the data and the quality control process and compared their results to those generated by a group of more experienced technicians. 
In this presentation, we report our observations of the degree of subjectivity in sensor data post processing, assessing and quantifying the impacts of individual technician as

  19. Statistical process control charts for monitoring military injuries.

    Science.gov (United States)

    Schuh, Anna; Canham-Chervak, Michelle; Jones, Bruce H

    2017-12-01

    An essential aspect of an injury prevention process is surveillance, which quantifies and documents injury rates in populations of interest and enables monitoring of injury frequencies, rates and trends. To drive progress towards injury reduction goals, additional tools are needed. Statistical process control charts, a methodology that has not been previously applied to Army injury monitoring, capitalise on existing medical surveillance data to provide information to leadership about injury trends necessary for prevention planning and evaluation. Statistical process control Shewhart u-charts were created for 49 US Army installations using quarterly injury medical encounter rates, 2007-2015, for active duty soldiers obtained from the Defense Medical Surveillance System. Injuries were defined according to established military injury surveillance recommendations. Charts display control limits three standard deviations (SDs) above and below an installation-specific historical average rate determined using 28 data points, 2007-2013. Charts are available in Army strategic management dashboards. From 2007 to 2015, Army injury rates ranged from 1254 to 1494 unique injuries per 1000 person-years. Installation injury rates ranged from 610 to 2312 injuries per 1000 person-years. Control charts identified four installations with injury rates exceeding the upper control limits at least once during 2014-2015, rates at three installations exceeded the lower control limit at least once and 42 installations had rates that fluctuated around the historical mean. Control charts can be used to drive progress towards injury reduction goals by indicating statistically significant increases and decreases in injury rates. Future applications to military subpopulations, other health outcome metrics and chart enhancements are suggested. Published by the BMJ Publishing Group Limited. 
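The Shewhart u-chart described above places control limits three standard deviations around a historical average rate, with the standard deviation depending on each period's exposure: for mean rate ū and person-years n, the limits are ū ± 3·√(ū/n). A sketch with hypothetical rates and exposures (not the Army data):

```python
# Shewhart u-chart sketch for injury rates (hypothetical numbers).
# u_i = injuries per person-year; limits depend on each period's
# exposure n_i through u_bar +/- 3*sqrt(u_bar/n_i).

import math

def u_chart_limits(u_bar, n):
    half = 3.0 * math.sqrt(u_bar / n)
    return u_bar - half, u_bar + half

# Hypothetical historical average: 1.35 injuries per person-year.
u_bar = 1.35
for n, u in [(1000, 1.30), (1200, 1.55), (900, 1.20)]:
    lcl, ucl = u_chart_limits(u_bar, n)
    flag = "in control" if lcl <= u <= ucl else "out of control"
    print(f"n={n}: limits=({lcl:.3f}, {ucl:.3f}), u={u} -> {flag}")
```

Note that larger exposures give tighter limits, so the same observed rate can be in control at a small installation and out of control at a large one.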

  20. Digital Signal Processing and Control for the Study of Gene Networks

    Science.gov (United States)

    Shin, Yong-Jun

    2016-04-01

    Thanks to the digital revolution, digital signal processing and control is widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircraft. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks, since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.
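As an illustration of this discrete-time, digital-control view, a minimal one-gene sketch with a sampled proportional controller driving expression toward a target level. All parameter values and the model itself are assumptions for illustration, not taken from the article.

```python
# Discrete-time sketch of a one-gene expression model under digital
# proportional control (illustrative; not the article's models).
# x[k+1] = x[k] + dt*(u - k_deg*x[k]), with the production input u
# recomputed at each sample from the expression error.

dt, k_deg, target = 0.1, 0.5, 2.0
Kp = 1.0                     # digital proportional gain (assumed value)
x = 0.0                      # expression level
u = 0.0                      # production-rate input

for _ in range(500):
    u = max(0.0, Kp * (target - x))   # controller update at each sample
    x = x + dt * (u - k_deg * x)      # one Euler step of the gene model

print(round(x, 2))  # → 1.33: proportional control settles below the target
```

The steady state solves Kp·(target − x) = k_deg·x, giving x = 4/3 here; as with any purely proportional loop, increasing Kp moves the settling level closer to the target without reaching it.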

  1. 40 CFR 63.113 - Process vent provisions-reference control technology.

    Science.gov (United States)

    2010-07-01

    40 CFR 63.113, Protection of Environment (revised as of 2010-07-01), ENVIRONMENTAL PROTECTION AGENCY. § 63.113 Process vent provisions—reference control technology. (a) The owner or operator of a Group 1...

  2. LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER

    Science.gov (United States)

    Will, H.

    1994-01-01

    The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Sometimes process control schedules require changes frequently, even several times per day. These changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator, or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set which allows the user to define the user commands which are to be executed by the computer. To set the system up the operator writes device driver routines for all of the controlled devices. Once set up, this system requires only an input file containing natural language command lines which tell the system what to do and when to do it. The operator can make up custom commands for operating and taking data from external research equipment at any time of the day or night without the operator in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.

  3. Methods of control the machining process

    Directory of Open Access Journals (Sweden)

    Yu.V. Petrakov

    2017-12-01

    Full Text Available Presents control methods differentiated by the time at which the information used is obtained: a priori, a posteriori, and current. When a priori information is used, the cutting mode is determined by simulating the process of cutting the allowance, where the shapes of the workpiece and the part are represented as wireframes. Control by current information calls for an adaptive control system and modernization of the CNC machine, where the input to the unit is computed using established optimization software. For control by a posteriori information, a method is proposed for correcting the shape-generating trajectory on a second pass from measurement of the workpiece surface formed by the first pass. Programs were developed that automatically design the adjusted file for machining.

  4. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    Science.gov (United States)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project experimented with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  5. Electron backscattering for process control in electron beam welding

    International Nuclear Information System (INIS)

    Ardenne, T. von; Panzer, S.

    1983-01-01

    A number of solutions for the automation of electron beam welding are presented. On the basis of electron backscattering, a complex system of process control has been developed. It allows enlarged imaging of the material's surface, improved adjustment of the beam focusing, and definite focus positioning. Furthermore, both manual and automated positioning of the electron beam before and during the welding process have become possible. Monitoring of the welding process for compliance with standard welding requirements can be achieved with the aid of a control quantity derived from electronic evaluation of the high-frequency electron backscattering

  6. An application of neural networks to process and materials control

    International Nuclear Information System (INIS)

    Howell, J.A.; Whiteson, R.

    1991-01-01

    Process control consists of two basic elements: a model of the process and knowledge of the desired control algorithm. In some cases the level of the control algorithm is merely supervisory, as in an alarm-reporting or anomaly-detection system. If the model of the process is known, then a set of equations may often be solved explicitly to provide the control algorithm. Otherwise, the model has to be discovered through empirical studies. Neural networks have properties that make them useful in this application. They can learn (make internal models from experience or observations). The problem of anomaly detection in materials control systems fits well into this general control framework. To successfully model a process with a neural network, a good set of observables must be chosen. These observables must in some sense adequately span the space of representable events, so that a signature metric can be built for normal operation. In this way, a non-normal event, one that does not fit within the signature, can be detected. In this paper, we discuss the issues involved in applying a neural network model to anomaly detection in materials control systems. These issues include data selection and representation, network architecture, prediction of events, the use of simulated data, and software tools. 10 refs., 4 figs., 1 tab
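The signature-metric idea, learning a model of normal operation and flagging observations whose prediction error falls outside the learned signature, can be sketched with a single trained neuron standing in for the network. Everything below (the data-generating process, the one-neuron model, the thresholds) is an illustrative assumption, not the paper's architecture or data.

```python
# Anomaly detection via a learned internal model (illustrative only).
# A single linear neuron trained by gradient descent predicts the next
# observation from the current one; the prediction error serves as the
# signature metric for flagging non-normal events.

import random

random.seed(0)
w, b, lr = 0.0, 0.0, 0.01

# Normal operation: next observation ~ 0.8 * previous + 0.2 + small noise.
normal = [1.0]
for _ in range(500):
    normal.append(0.8 * normal[-1] + 0.2 + random.gauss(0, 0.01))

# Train the one-neuron model to predict x[t+1] from x[t].
for _ in range(50):
    for x, y in zip(normal[:-1], normal[1:]):
        e = (w * x + b) - y
        w -= lr * e * x
        b -= lr * e

def score(x_prev, x_next):
    """Prediction error used as the anomaly signature."""
    return abs((w * x_prev + b) - x_next)

print(score(1.0, 1.0) < 0.1)   # a normal transition fits the signature
print(score(1.0, 3.0) > 1.0)   # an abnormal jump falls outside it
```

The design choice mirrors the abstract: the model never sees anomalies during training; anything it cannot predict from the normal-operation signature is reported.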

  7. Microstructural evolution and control in laser material processing

    International Nuclear Information System (INIS)

    Kaul, R.; Nath, A.K.

    2005-01-01

    Laser processing, because of its characteristic features, often gives rise to unique microstructures and properties not obtained with other conventional processes. We present several diverse laser-processing case studies involving control of microstructure through judicious selection of processing parameters, carried out with indigenously developed high-power CO2 lasers. The first study describes microstructural control during end-plug laser welding of PFBR fuel pins, involving a crack-prone alloy D9 tube and a type 316M stainless steel (SS) plug, through preferential displacement of the focused laser beam. The crater and associated cracks were eliminated by suitable laser power ramping. Another case study describes how the low heat input characteristic of the laser cladding process has been exploited to suppress dilution in 'Colmonoy 6' deposits on austenitic SS. The results are in sharp contrast to the extensive dilution noticed in Colmonoy 6 hardfaced deposits made by GTAW. A novel laser surface melting (LSM) treatment for type 316(N) SS weld metal has been developed to generate a sensitization-resistant microstructure, which leads to enhanced resistance against intergranular corrosion (IGC). The IGC resistance of the laser-treated surface has been found to be critically dependent on the laser processing parameters. Experimental observations have been analyzed with thermal simulation. We have also studied the effect of the laser beam's spatial intensity profile on the microstructure in LSM. We have developed laser-assisted graded hardfacing of austenitic SS substrates with Stellite 6 which, in contrast to direct deposition either by laser or GTAW, produced a smooth transition in chemical composition and hardness. Laser treatment was also used to control grain coarsening and martensite formation in type 430 SS weldments. Laser rapid manufacturing (LRM) is emerging as a new rapid and cost-effective process for low-volume fabrication, especially of expensive materials.
The talk will also present microstructural characteristics of laser

  8. Anaerobic Digestion. Student Manual. Biological Treatment Process Control.

    Science.gov (United States)

    Carnegie, John W., Ed.

    This student manual contains the textual material for a four-lesson unit on anaerobic digestion control. Areas addressed include: (1) anaerobic sludge digestion (considering the nature of raw sludge, purposes of anaerobic digestion, the results of digestion, types of equipment, and other topics); (2) digester process control (considering feeding…

  9. ACHEMA '85: Process control systems

    International Nuclear Information System (INIS)

    Rosskopf, E.

    1985-01-01

    The strategy obviously adopted by the well-established manufacturers is to offer 'easy-to-handle' equipment to gain new customers, and a variety of new compact systems and personal computers is being put on the market. The changes and improvements within the processing sector proceed more or less in silence; high-capacity storage devices and multiprocessor configurations are obtainable at a moderate price, offering a greater variety of basic functions and enhanced control possibilities. Redundancy problems are handled with greater flexibility, and batch programs are advancing. Data communication has become a common feature, and transmission speed and bus length have been improved. Important improvements have been made with regard to data display; even medium-sized systems now offer the possibility of creating dynamic flow-sheets and reserving space for process history displays, and the hierarchy of displays has been considerably simplified. The user software has also been made easier: 'fill-in-the-blanks' is the prevailing motto for dialog configurations, and such big terms as 'process computer' or 'programming skill' are passing into oblivion. (orig./HP) [de

  10. Experimental processing of a model data set using Geobit seismic software

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Sang Yong [Korea Inst. of Geology Mining and Materials, Taejon (Korea, Republic of)

    1995-12-01

    A seismic data processing software package, Geobit, has been developed and is continuously updated to implement newer processing techniques and to support more hardware platforms. Geobit is intended to support all Unix platforms ranging from PC to CRAY. The current version supports two platforms, i.e., PC/Linux and Sun Sparc based SunOS 4.1.x. PC/Linux attracted geophysicists in some universities, who tried to install Geobit in their laboratories for use as a research tool. However, one of the problems is the difficulty of getting seismic data. The primary reason is its huge volume. The field data is too bulky to fit their relatively small storage media, such as PC disks. To solve the problem, KIGAM released a model seismic data set via ftp.kigam.re.kr. This study has two purposes. The first is testing the Geobit software for its suitability in seismic data processing. The test includes reproducing the model through seismic data processing. If it fails to reproduce the original model, the software is considered buggy and incomplete. However, if it can successfully reproduce the input model, I would be proud of what I have accomplished over the last few years in writing Geobit. The second purpose is to give a guide on Geobit usage by providing an example set of job files needed to process the given data. This example will help scientists lacking Geobit experience to concentrate on their study more easily. Once they know the Geobit processing technique, and later on Geobit programming, they can implement their own processing ideas, contributing newer technologies to Geobit. The complete Geobit job files needed to process the model data are written in the following job sequence: (1) data loading, (2) CDP sort, (3) decon analysis, (4) velocity analysis, (5) decon verification, (6) stack, (7) filter analysis, (8) filtered stack, (9) time migration, (10) depth migration. The control variables in the job files are discussed. (author). 10 figs., 1 tab.

  11. Application of fractional control techniques in petrochemical process; Aplicacao de tecnicas de controle fracionario para processos petroquimicos

    Energy Technology Data Exchange (ETDEWEB)

    Isfer, Luis A.D.; Lenzi, Marcelo K. [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil); Lenzi, Ervin K. [Universidade Estadual de Maringa (UEM), PR (Brazil)

    2008-07-01

    This work deals with the modeling and control of petrochemical processes using fractional differential equations. This kind of equation has been successfully used in modeling/identification owing to its powerful phenomenological description capability, generalizing the conventional/classical models commonly used. Experimental data from the literature were used for process identification, and the resulting mathematical model, based on fractional differential equations, was used for the study of servo closed-loop control. Owing to its simplicity, a proportional controller was used for this task; it was able to control the process and keep it stable, despite an offset. Different controller gains were used for simulation purposes, indicating that the higher the value of K{sub c}, the lower the offset. (author)
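
The offset behaviour reported here is generic to proportional-only control. A minimal sketch (not from the paper, which identifies a fractional-order model from experimental data; the first-order plant below is an assumption) showing that the steady-state offset shrinks as the gain Kc grows:

```python
# Proportional-only control of a first-order plant dy/dt = -y + u with
# u = Kc*(r - y). Steady state: y_ss = Kc*r/(1 + Kc), so offset = r/(1 + Kc).
def simulate_p_control(kc, r=1.0, dt=0.001, t_end=20.0):
    y = 0.0
    for _ in range(int(t_end / dt)):
        u = kc * (r - y)      # proportional control law
        y += dt * (-y + u)    # explicit Euler integration step
    return r - y              # remaining steady-state offset

for kc in (1.0, 5.0, 20.0):
    print(f"Kc = {kc:5.1f}  offset = {simulate_p_control(kc):.4f}")
```

The offset never vanishes for finite Kc; integral action would be needed to remove it entirely, which is consistent with the trend the abstract describes.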

  12. Integrated Design and Control of Reactive and Non-Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted

    , an alternative approach is to tackle process design and controllability issues simultaneously, in the early stages of process design. This simultaneous synthesis approach provides optimal/near optimal operation and more efficient control of conventional (non-reactive binary distillation columns) (Hamid et al...... of methodologies have been proposed and applied on various problems to address the interactions between process design and control, and they range from optimization-based approaches to model-based methods (Sharifzadeh, 2013). In this work, integrated design and control of non-reactive distillation, ternary...... reactive distillation processes. The element concept (Pérez Cisneros et al., 1997) is used to translate a ternary system of compounds (A + B ↔ C) to a binary system of element (WA and WB). In the case of multicomponent reactive distillation processes the equivalent element concept is used to translate...

  13. Agents Modeling Experience Applied To Control Of Semi-Continuous Production Process

    Directory of Open Access Journals (Sweden)

    Gabriel Rojek

    2014-01-01

    Full Text Available The lack of proper analytical models of some production processes prevents us from obtaining proper values of process parameters by simply computing optimal values. Possible solutions of control problems in such areas of industrial processes can be found using certain methods from the domain of artificial intelligence: neural networks, fuzzy logic, expert systems, or evolutionary algorithms. The solution to such a control problem presented in this work is an alternative approach that combines control of the industrial process with learning based on production results. In formulating the main assumptions of the proposed methodology, the decision processes of a human operator using his experience are taken into consideration. The researched model of using and gathering the experience of human beings is designed with the contribution of agent technology. The presented solution of the control problem coincides with the case-based reasoning (CBR) methodology.

  14. The informed consent process in randomised controlled trials: a nurse-led process.

    Science.gov (United States)

    Cresswell, Pip; Gilmour, Jean

    2014-03-01

    Clinical trials are carried out with human participants to answer questions about the best way to diagnose, treat and prevent illness. Participants must give informed consent to take part in clinical trials, which requires an understanding of how clinical trials work and their purpose. Randomised controlled trials provide strong evidence but their complex design is difficult for both clinicians and participants to understand. Increasingly, ensuring informed consent in randomised controlled trials has become part of the clinical research nurse role. The aim of this study was to explore in depth the clinical research nurse role in the informed consent process using a qualitative descriptive approach. Three clinical research nurses were interviewed and the data analysed using a thematic analysis approach. Three themes were identified to describe the process of ensuring informed consent. The first theme, Preparatory partnerships, canvassed the relationships required prior to initiation of the informed consent process. The second theme, Partnering the participant, emphasises the need for ensuring voluntariness and understanding, along with patient advocacy. The third theme, Partnership with the project, highlights the clinical research nurse contribution to the capacity of the trial to answer the research question through appropriate recruiting and follow-up of participants. Gaining informed consent in randomised controlled trials was complex and required multiple partnerships. A wide variety of skills was used to protect the safety of trial participants and promote quality research. The information from this study contributes to a greater understanding of the clinical research nurse role, and suggests the informed consent process in trials can be a nurse-led one. In order to gain collegial, employer and industry recognition it is important that this aspect of the nursing role is acknowledged.

  15. Integrated Process Design and Control of Multi-element Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted

    2016-01-01

    In this work, integrated process design and control of reactive distillation processes involving multi-elements is presented. The reactive distillation column is designed using methods and tools which are similar in concept to non-reactive distillation design methods, such as driving force approach....... The methods employed in this work are based on equivalent element concept. This concept facilitates the representation of a multi-element reactive system as equivalent binary light and heavy key elements. First, the reactive distillation column is designed at the maximum driving force where through steady...

  16. Dynamic modeling and control of industrial crude terephthalic acid hydropurification process

    Energy Technology Data Exchange (ETDEWEB)

    Li, Zhi; Zhong, Weimin; Liu, Yang; Luo, Na; Qian, Feng [East China University of Science and Technology, Shanghai (China)

    2015-04-15

    Purified terephthalic acid (PTA) is critical to the development of the polyester industry. PTA production consists of p-xylene oxidation reaction and crude terephthalic acid (CTA) hydropurification. The hydropurification process is necessary to eliminate 4-carboxybenzaldehyde (4-CBA), which is a harmful byproduct of the oxidation reaction process. Based on the dynamic model of the hydropurification process, two control systems are studied using Aspen Dynamics. The first system is the ratio control system, in which the mass flows of CTA and deionized water are controlled. The second system is the multivariable predictive control-proportional-integral-derivative cascade control strategy, in which the concentrations of 4-CBA and carbon monoxide are chosen as control variables and the reaction temperature and hydrogen flow are selected as manipulated variables. A detailed dynamic behavior is investigated through simulation. Results show that the developed control strategies exhibit good control performances, thereby providing theoretical guidance for advanced control of industry-scale PTA production.
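
The first of the two systems studied, ratio control, reduces to slaving one flow setpoint to another measured flow. A hedged sketch of that idea (the variable names, units and numbers below are illustrative assumptions, not values from the plant model):

```python
def ratio_setpoint(wild_flow, ratio):
    """Ratio control: the setpoint of the controlled ('slave') stream
    tracks the measured uncontrolled ('wild') stream.

    wild_flow: measured CTA mass flow (illustrative units, t/h)
    ratio:     desired deionized-water-to-CTA mass ratio
    Returns the setpoint handed to the water flow controller."""
    return ratio * wild_flow

# As the measured CTA feed varies, the water setpoint follows proportionally,
# holding the mass ratio constant without operator intervention.
for cta_flow in (10.0, 12.5, 9.0):
    print(f"CTA {cta_flow:5.1f} t/h -> water setpoint {ratio_setpoint(cta_flow, 3.0):5.1f} t/h")
```

In the plant, the returned setpoint would feed a conventional flow controller on the water stream; the MPC-PID cascade of the second system sits a level above this kind of loop.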

  17. Digital signal processing in power electronics control circuits

    CERN Document Server

    Sozanski, Krzysztof

    2013-01-01

    Many digital control circuits in current literature are described using analog transmittance. This may not always be acceptable, especially if the sampling frequency and power transistor switching frequencies are close to the band of interest. Therefore, a digital circuit is considered as a digital controller rather than an analog circuit. This helps to avoid errors and instability in high frequency components. Digital Signal Processing in Power Electronics Control Circuits covers problems concerning the design and realization of digital control algorithms for power electronics circuits using

  18. Controllable unit concept as applied to a hypothetical tritium process

    International Nuclear Information System (INIS)

    Seabaugh, P.W.; Sellers, D.E.; Woltermann, H.A.; Boh, D.R.; Miles, J.C.; Fushimi, F.C.

    1976-01-01

    A methodology (controllable unit accountability) is described that identifies controlling errors for corrective action, locates areas and time frames of suspected diversions, defines time and sensitivity limits of diversion flags, defines the time frame in which pass-through quantities of accountable material and by inference SNM remain controllable and provides a basis for identification of incremental cost associated with purely safeguards considerations. The concept provides a rationale from which measurement variability and specific safeguard criteria can be converted into a numerical value that represents the degree of control or improvement attainable with a specific measurement system or combination of systems. Currently the methodology is being applied to a high-throughput, mixed-oxide fuel fabrication process. The process described is merely used to illustrate a procedure that can be applied to other more pertinent processes

  19. Radiation process control, study and acceptance of dosimetric methods

    International Nuclear Information System (INIS)

    Radak, B.B.

    1984-01-01

    The methods of primary dosimetric standardization and the calibration of dosimetric monitors suitable for radiation process control were outlined in the form of a logical pattern in which they are in current use on industrial scale in Yugoslavia. The reliability of the process control of industrial sterilization of medical supplies for the last four years was discussed. The preparatory works for the intermittent use of electron beams in cable industry were described. (author)

  20. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become the key part of Internet of Things (IoT). Proactive CEP can predict future system states and execute some actions to avoid unwanted states which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicting states is p...

  1. Artificial neural networks in variable process control: application in particleboard manufacture

    Energy Technology Data Exchange (ETDEWEB)

    Esteban, L. G.; Garcia Fernandez, F.; Palacios, P. de; Conde, M.

    2009-07-01

    Artificial neural networks are an efficient tool for modelling production control processes using data from the actual production as well as simulated or design-of-experiments data. In this study two artificial neural networks were combined with process control charts, and it was checked whether the data obtained by the networks were valid for variable process control in particleboard manufacture. The networks made it possible to obtain the mean and standard deviation of the internal bond strength of the particleboard within acceptable margins using known data of thickness, density, moisture content, swelling and absorption. The networks obtained met the acceptance criteria for test values from non-standard test methods, as well as the criteria for using these values in statistical process control. (Author) 47 refs.

  2. Effects of wireless packet loss in industrial process control systems.

    Science.gov (United States)

    Liu, Yongkang; Candell, Richard; Moayeri, Nader

    2017-05-01

    Timely and reliable sensing and actuation control are essential in networked control. This depends not only on the precision/quality of the sensors and actuators used but also on how well the communications links between the field instruments and the controller have been designed. Wireless networking offers simple deployment, reconfigurability, scalability, and reduced operational expenditure, and is easier to upgrade than wired solutions. However, the adoption of wireless networking has been slow in industrial process control due to the stochastic and less-than-100%-reliable nature of wireless communications and the lack of a model to evaluate the effects of such communication imperfections on the overall control performance. In this paper, we study how control performance is affected by wireless link quality, which in turn is adversely affected by severe propagation loss in harsh industrial environments, co-channel interference, and unintended interference from other devices. We select the Tennessee Eastman Challenge Model (TE) for our study. A decentralized process control system, first proposed by N. Ricker, is adopted that employs 41 sensors and 12 actuators to manage the production process in the TE plant. We consider the scenario where wireless links are used to periodically transmit essential sensor measurement data, such as pressure, temperature and chemical composition, to the controller, as well as control commands to manipulate the actuators according to predetermined setpoints. We consider two models for packet loss in the wireless links, namely, an independent and identically distributed (IID) packet loss model and the two-state Gilbert-Elliot (GE) channel model. While the former is a random loss model, the latter can model bursty losses. With each channel model, the performance of the simulated decentralized controller using wireless links is compared with the one using wired links providing instant and 100% reliable communications. The sensitivity of the
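
The two packet-loss models mentioned can be simulated directly. Below is a minimal sketch of the two-state Gilbert-Elliot channel (the transition and loss probabilities are illustrative, not those of the study); unlike the IID model, losses here cluster while the channel sits in the bad state:

```python
import random

def gilbert_elliot_losses(n, p_gb, p_bg, loss_good=0.0, loss_bad=0.5, seed=1):
    """Simulate n packets through a two-state Gilbert-Elliot channel.

    p_gb: P(good -> bad) per packet; p_bg: P(bad -> good) per packet.
    loss_good / loss_bad: packet-loss probability in each state.
    Returns a list of booleans (True = packet lost)."""
    rng = random.Random(seed)
    state_bad = False
    losses = []
    for _ in range(n):
        # Markov state transition, then loss draw in the current state.
        if state_bad:
            if rng.random() < p_bg:
                state_bad = False
        elif rng.random() < p_gb:
            state_bad = True
        p_loss = loss_bad if state_bad else loss_good
        losses.append(rng.random() < p_loss)
    return losses

losses = gilbert_elliot_losses(100000, p_gb=0.01, p_bg=0.1)
print("overall loss rate:", sum(losses) / len(losses))
```

With these parameters the stationary probability of the bad state is p_gb/(p_gb+p_bg) ≈ 0.09, so the long-run loss rate is around 4-5%, but the losses arrive in bursts rather than independently, which is exactly what distinguishes the GE model from the IID one in a control-performance study.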

  3. Tuning of PID controller using optimization techniques for a MIMO process

    Science.gov (United States)

    Thulasi dharan, S.; Kavyarasan, K.; Bagyaveereswaran, V.

    2017-11-01

    In this paper, two processes are considered: the quadruple-tank process and the CSTR (Continuous Stirred Tank Reactor) process. These are widely used in many industrial applications across various domains, the CSTR especially in chemical plants. First, a mathematical model of each process is developed, followed by linearization of the system, since these are MIMO processes. Controllers are the major part of driving the whole process to the desired operating point for a given application, so tuning of the controller plays a major role in the overall process. For tuning the parameters we use two optimization techniques, Particle Swarm Optimization and the Genetic Algorithm, which are widely used in different applications, and we apply them to obtain the best tuned values among many candidates. Finally, we compare the performance of each process under both techniques.
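
A minimal sketch of the PSO half of such a tuning loop, applied to a PI controller on an illustrative first-order plant (the paper's quadruple-tank and CSTR models are not reproduced here; the plant, the ISE cost, the gain bounds and the PSO constants below are all assumptions):

```python
import random

def ise(gains, dt=0.05, t_end=10.0):
    """Integral of squared error for a PI controller driving the toy plant
    dy/dt = -y + u to a unit step setpoint. Purely illustrative dynamics."""
    kp, ki = gains
    y, integ, cost, r = 0.0, 0.0, 0.0, 1.0
    for _ in range(int(t_end / dt)):
        e = r - y
        integ += e * dt
        u = kp * e + ki * integ          # PI control law
        y += dt * (-y + u)               # explicit Euler step of the plant
        cost += e * e * dt
    return cost

def pso(objective, bounds, n_particles=20, iters=60, seed=7):
    """Bare-bones particle swarm minimization over a box-bounded space."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration constants
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, cost = pso(ise, bounds=[(0.0, 20.0), (0.0, 20.0)])
print("tuned (Kp, Ki):", best, "ISE:", cost)
```

The GA variant replaces the swarm update with selection, crossover and mutation over the same gain vector and the same ISE cost; comparing the two, as the paper does, only requires swapping the optimizer around the shared objective.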

  4. Simulation of Simple Controlled Processes with Dead-Time.

    Science.gov (United States)

    Watson, Keith R.; And Others

    1985-01-01

    The determination of closed-loop response of processes containing dead-time is typically not covered in undergraduate process control, possibly because the solution by Laplace transforms requires the use of Pade approximation for dead-time, which makes the procedure lengthy and tedious. A computer-aided method is described which simplifies the…
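
The computer-aided route the abstract alludes to can avoid Laplace transforms entirely: discretize time and realize the dead time as a FIFO buffer. A hedged sketch (the plant parameters and gain are hypothetical, not from the article) of a first-order-plus-dead-time loop under proportional control:

```python
from collections import deque

def fopdt_p_control(kc, theta, tau=1.0, kp=1.0, dt=0.01, t_end=30.0):
    """Closed-loop unit-step response of tau*dy/dt = -y + kp*u(t - theta)
    under proportional control u = kc*(r - y). The dead time theta is
    realized as a FIFO buffer of control moves, so no Pade approximation
    is needed."""
    delay_steps = int(round(theta / dt))
    buf = deque([0.0] * delay_steps)      # control signal still 'in transit'
    y, r = 0.0, 1.0
    for _ in range(int(t_end / dt)):
        u = kc * (r - y)
        if delay_steps:
            buf.append(u)
            u_delayed = buf.popleft()     # the control move theta seconds ago
        else:
            u_delayed = u
        y += dt * (-y + kp * u_delayed) / tau
    return y

# Proportional-only control still leaves the classic offset: y -> Kc/(1+Kc).
print("final value:", round(fopdt_p_control(kc=1.0, theta=0.5), 3))
```

Raising kc with theta fixed eventually produces sustained oscillation, so the same few lines also let students find the ultimate gain experimentally instead of via a Pade-approximated characteristic equation.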

  5. Neural Correlates of Automatic and Controlled Auditory Processing in Schizophrenia

    Science.gov (United States)

    Morey, Rajendra A.; Mitchell, Teresa V.; Inan, Seniha; Lieberman, Jeffrey A.; Belger, Aysenil

    2009-01-01

    Individuals with schizophrenia demonstrate impairments in selective attention and sensory processing. The authors assessed differences in brain function between 26 participants with schizophrenia and 17 comparison subjects engaged in automatic (unattended) and controlled (attended) auditory information processing using event-related functional MRI. Lower regional neural activation during automatic auditory processing in the schizophrenia group was not confined to just the temporal lobe, but also extended to prefrontal regions. Controlled auditory processing was associated with a distributed frontotemporal and subcortical dysfunction. Differences in activation between these two modes of auditory information processing were more pronounced in the comparison group than in the patient group. PMID:19196926

  6. Levels of integration in cognitive control and sequence processing in the prefrontal cortex.

    Science.gov (United States)

    Bahlmann, Jörg; Korb, Franziska M; Gratton, Caterina; Friederici, Angela D

    2012-01-01

    Cognitive control is necessary to flexibly act in changing environments. Sequence processing is needed in language comprehension to build the syntactic structure in sentences. Functional imaging studies suggest that sequence processing engages the left ventrolateral prefrontal cortex (PFC). In contrast, cognitive control processes additionally recruit bilateral rostral lateral PFC regions. The present study aimed to investigate these two types of processes in one experimental paradigm. Sequence processing was manipulated using two different sequencing rules varying in complexity. Cognitive control was varied with different cue-sets that determined the choice of a sequencing rule. Univariate analyses revealed distinct PFC regions for the two types of processing (i.e. sequence processing: left ventrolateral PFC and cognitive control processing: bilateral dorsolateral and rostral PFC). Moreover, in a common brain network (including left lateral PFC and intraparietal sulcus) no interaction between sequence and cognitive control processing was observed. In contrast, a multivariate pattern analysis revealed an interaction of sequence and cognitive control processing, such that voxels in left lateral PFC and parietal cortex showed different tuning functions for tasks involving different sequencing and cognitive control demands. These results suggest that the difference between the process of rule selection (i.e. cognitive control) and the process of rule-based sequencing (i.e. sequence processing) find their neuronal underpinnings in distinct activation patterns in lateral PFC. Moreover, the combination of rule selection and rule sequencing can shape the response of neurons in lateral PFC and parietal cortex.

  7. Inhibitory control and negative emotional processing in psychopathy and antisocial personality disorder.

    Science.gov (United States)

    Verona, Edelyn; Sprague, Jenessa; Sadeh, Naomi

    2012-05-01

    The field of personality disorders has had a long-standing interest in understanding interactions between emotion and inhibitory control, as well as neurophysiological indices of these processes. More work in particular is needed to clarify differential deficits in offenders with antisocial personality disorder (APD) who differ on psychopathic traits, as APD and psychopathy are considered separate, albeit related, syndromes. Evidence of distinct neurobiological processing in these disorders would have implications for etiology-based personality disorder taxonomies in future psychiatric classification systems. To inform this area of research, we recorded event-related brain potentials during an emotional-linguistic Go/No-Go task to examine modulation of negative emotional processing by inhibitory control in three groups: psychopathy (n = 14), APD (n = 16), and control (n = 15). In control offenders, inhibitory control demands (No-Go vs. Go) modulated frontal-P3 amplitude to negative emotional words, indicating appropriate prioritization of inhibition over emotional processing. In contrast, the psychopathic group showed blunted processing of negative emotional words regardless of inhibitory control demands, consistent with research on emotional deficits in psychopathy. Finally, the APD group demonstrated enhanced processing of negative emotion words in both Go and No-Go trials, suggesting a failure to modulate negative emotional processing when inhibitory control is required. Implications for emotion-cognition interactions and putative etiological processes in these personality disorders are discussed.

  8. Implementation of an FTP server using the client/server model with sockets in UNIX C, with the aim of improving network response times

    Directory of Open Access Journals (Sweden)

    Juan de Dios Murillo

    2016-03-01

    Full Text Available This work evaluates the latency of file transfers using an FTP server under a client-server model, running the client/server code with sockets in UNIX C on a computer with the Fedora operating system, in order to simulate a server holding files of different formats and sizes and to measure the transmission latency when uploading and downloading files from the server using different buffer sizes. Comparing the transfer delays observed in the different scenarios shows that the larger the buffer size, the lower the latency, and that the latency increases as the file size grows.
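
The experiment lends itself to a small sketch. The following version (Python rather than the paper's UNIX C, and loopback rather than a real network; the payload size and buffer sizes are arbitrary) times an upload for different buffer sizes in the same spirit:

```python
import socket
import threading
import time

def receive_all(srv, bufsize, total):
    """Accept one connection and read 'total' bytes in 'bufsize' chunks."""
    conn, _ = srv.accept()
    received = 0
    while received < total:
        chunk = conn.recv(bufsize)
        if not chunk:
            break
        received += len(chunk)
    conn.close()

def measure_upload(payload, bufsize):
    """Time the transfer of 'payload' over loopback using 'bufsize' chunks."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))            # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]
    t = threading.Thread(target=receive_all, args=(srv, bufsize, len(payload)))
    t.start()
    cli = socket.create_connection(("127.0.0.1", port))
    start = time.perf_counter()
    for i in range(0, len(payload), bufsize):
        cli.sendall(payload[i:i + bufsize])
    cli.close()
    t.join()
    srv.close()
    return time.perf_counter() - start

data = b"x" * (1 << 20)                   # a 1 MiB "file"
for bufsize in (512, 8192, 65536):
    print(f"buffer {bufsize:6d} B: {measure_upload(data, bufsize):.4f} s")
```

On loopback the qualitative trend the paper reports (larger buffers, lower latency) generally holds because each send/recv call carries fixed overhead, though the absolute numbers differ from a real FTP transfer over a network.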

  9. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Full Text Available Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision process model with predicted states is proposed as the sequential decision model, and a Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
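
The Q-learning component can be illustrated on a toy congestion MDP: three congestion levels and two actions (keep or switch the signal phase). This is a plain tabular sketch, not the paper's networked distributed model with predicted states; the states, transitions, rewards and learning constants are all invented:

```python
import random

def q_learning(transitions, rewards, n_states, n_actions,
               episodes=2000, steps=20, alpha=0.1, gamma=0.9, eps=0.1, seed=3):
    """Tabular Q-learning on a small deterministic MDP.
    transitions[s][a] -> next state, rewards[s][a] -> immediate reward."""
    rng = random.Random(seed)
    q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s = rng.randrange(n_states)
        for _ in range(steps):
            if rng.random() < eps:                 # epsilon-greedy exploration
                a = rng.randrange(n_actions)
            else:
                a = max(range(n_actions), key=lambda x: q[s][x])
            s2, r = transitions[s][a], rewards[s][a]
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

# States: congestion level 0 (free), 1 (busy), 2 (congested).
# Actions: 0 = keep current phase, 1 = switch phase (small cost of 0.1).
transitions = [[0, 1],     # free:      keeping stays free, switching disturbs flow
               [2, 0],     # busy:      keeping lets queues grow, switching clears them
               [2, 1]]     # congested: keeping stays congested, switching helps
rewards = [[0.0, -0.1],    # reward = -(congestion level) - 0.1 per switch
           [-1.0, -1.1],
           [-2.0, -2.1]]
q = q_learning(transitions, rewards, n_states=3, n_actions=2)
policy = [max(range(2), key=lambda a: q[s][a]) for s in range(3)]
print("greedy policy (0=keep, 1=switch):", policy)
```

With these rewards the learned greedy policy keeps the phase when traffic is free and switches when it is busy or congested, which is the qualitative behaviour one would want the learned controller to discover.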

  10. Process control by microprocessors

    Energy Technology Data Exchange (ETDEWEB)

    Arndt, W [ed.

    1978-12-01

    Papers from the workshop Process Control by Microprocessors, organized by the Karlsruhe Nuclear Research Center, Project PDV, together with the VDI/VDE-Gesellschaft fuer Mess- und Regelungstechnik, are presented. The workshop was held on December 13 and 14, 1978 at the facilities of the Nuclear Research Center. The papers are arranged according to the topics of the workshop: one chapter deals with the current state of the art of microprocessor hardware and software technology, and five chapters are dedicated to applications. The report also contains papers which will not be presented at the workshop. Both the workshop and the report are expected to improve and disseminate know-how about this modern technology.

  11. Myoelectric signal processing for control of powered limb prostheses.

    Science.gov (United States)

    Parker, P; Englehart, K; Hudgins, B

    2006-12-01

    Progress in myoelectric control technology has over the years been incremental, due in part to the alternating focus of the R&D between control methodology and device hardware. The technology has over the past 50 years or so moved from single muscle control of a single prosthesis function to muscle group activity control of multifunction prostheses. Central to these changes have been developments in the means of extracting information from the myoelectric signal. This paper gives an overview of the myoelectric signal processing challenge, a brief look at the challenge from an historical perspective, the state-of-the-art in myoelectric signal processing for prosthesis control, and an indication of where this field is heading. The paper demonstrates that considerable progress has been made in providing clients with useful and reliable myoelectric communication channels, and that exciting work and developments are on the horizon.

  12. Applying Trusted Network Technology To Process Control Systems

    Science.gov (United States)

    Okhravi, Hamed; Nicol, David

    Interconnections between process control networks and enterprise networks expose instrumentation and control systems and the critical infrastructure components they operate to a variety of cyber attacks. Several architectural standards and security best practices have been proposed for industrial control systems. However, they are based on older architectures and do not leverage the latest hardware and software technologies. This paper describes new technologies that can be applied to the design of next generation security architectures for industrial control systems. The technologies are discussed along with their security benefits and design trade-offs.

  13. An instrumentation and control philosophy for high-level nuclear waste processing facilities

    International Nuclear Information System (INIS)

    Weigle, D.H.

    1990-01-01

    The purpose of this paper is to present an instrumentation and control philosophy which may be applied to high-level nuclear waste processing facilities. This philosophy describes the recommended criteria for automatic/manual control, remote/local control, remote/local display, diagnostic instrumentation, interlocks, alarm levels, and redundancy. Due to the hazardous nature of the process constituents of a high-level nuclear waste processing facility, it is imperative that safety and control features required for accident-free operation and maintenance be incorporated. A well-instrumented and controlled process, while initially more expensive in capital and design costs, is generally safer and less expensive to operate. When the long term cost savings of a well designed process is coupled with the high savings enjoyed by accident avoidance, the benefits far outweigh the initial capital and design costs

  14. Control of instability in nitric acid evaporators for plutonium processing

    International Nuclear Information System (INIS)

    1998-03-01

    Improved control of the nitric acid process evaporators requires the detection of spontaneously unstable operating conditions. This process reduces the volume of contaminated liquid by evaporating nitric acid and concentrating salt residues. If an instability is identified quickly, a prompt response can avert distillate contamination. An algorithm applied to the runtime data was evaluated to detect this situation. A snapshot of data from a histogram in the old process control software was captured during the unstable conditions and modeled.

  15. The Usage of Time Series Control Charts for Financial Process Analysis

    Directory of Open Access Journals (Sweden)

    Kovářík Martin

    2012-09-01

    Full Text Available We deal with the financial processes of a company using methods of SPC (Statistical Process Control), specifically time series control charts. The paper outlines the intersection of two disciplines: econometrics and statistical process control. The theoretical part discusses the methodology of time series control charts, and in the research part this methodology is demonstrated in three case studies. The first study focuses on the regulation of simulated financial flows for a company by a CUSUM control chart. The second study involves the regulation of financial flows for a heteroskedastic financial process by an EWMA control chart. The last case study is devoted to applications of ARIMA, EWMA and CUSUM control charts to financial data that are sensitive to mean shifts while exhibiting autocorrelation. In this paper, we highlight the versatility of control charts not only in manufacturing but also in managing the financial stability of cash flows.
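
The two chart statistics named here take only a few lines each. A hedged sketch (the data, smoothing constant and reference value are illustrative, not the paper's case-study series):

```python
def ewma(series, lam=0.2):
    """EWMA statistic z_t = lam*x_t + (1 - lam)*z_{t-1}, seeded with x_0.
    A small lam smooths noise and accumulates evidence of small mean shifts."""
    z = [series[0]]
    for x in series[1:]:
        z.append(lam * x + (1 - lam) * z[-1])
    return z

def cusum(series, target, k=0.5):
    """Tabular CUSUM: one-sided upper/lower sums with allowance k (in data
    units). A signal is raised when either sum exceeds a decision interval h
    (not drawn here)."""
    hi = lo = 0.0
    his, los = [], []
    for x in series:
        hi = max(0.0, hi + (x - target - k))
        lo = max(0.0, lo + (target - k - x))
        his.append(hi)
        los.append(lo)
    return his, los

data = [0.1, -0.2, 0.0, 0.3, 1.1, 1.2, 0.9, 1.3]   # mean shifts upward mid-series
his, _ = cusum(data, target=0.0)
print("upper CUSUM:", [round(v, 2) for v in his])   # climbs only after the shift
```

On autocorrelated or heteroskedastic financial data, as in the paper's later case studies, these statistics are applied to model residuals (e.g. from an ARIMA fit) rather than to the raw cash-flow series.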

  16. Improving Accuracy of Processing by Adaptive Control Techniques

    Directory of Open Access Journals (Sweden)

    N. N. Barbashov

    2016-01-01

    Full Text Available When machining work-pieces, the range of scatter of the work-piece dimensions is displaced toward the tolerance limit in response to errors. To improve the accuracy of machining and prevent defective products it is necessary to diminish the machining error components, i.e. to improve the accuracy of the machine tool, tool life, rigidity of the system, and accuracy of adjustment. It is also necessary to provide on-machine adjustment after a certain time. However, an increasing number of readjustments reduces performance, and high machine and tool requirements lead to a significant increase in the machining cost. To improve accuracy and machining rate, various devices for active control (in-process gaging devices), as well as controlled machining through adaptive systems for technological process control, are now widely used. The accuracy improvement in this case is reached by compensation of the majority of technological errors. The sensors of active control can provide an improvement in the accuracy of processing by one or two quality classes, and simultaneous operation of several machines. For efficient use of sensors of active control it is necessary to develop accuracy control methods that introduce the appropriate adjustments. Methods based on the moving average appear to be the most promising for accuracy control, since they contain information on the change in the last several measured values of the parameter under control. In the proposed method the first three members of the sequence of deviations remain unchanged, i.e. x'_1 = x_1, x'_2 = x_2, x'_3 = x_3. Then, for each i-th member of the sequence we calculate x'_i = x_i - k*m_i, where m_i is the average of the three previous members: m_i = (x_{i-1} + x_{i-2} + x_{i-3})/3. As a criterion for the estimate of the control
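
A sketch of the moving-average correction described above. The formula in this record is partly garbled, so the reading below, in which each deviation is corrected by k times the mean of the three preceding measured values, is an assumption:

```python
def moving_average_correction(x, k):
    """First three deviations pass through unchanged; from the fourth on,
    each value is corrected by k times the mean of the three previous
    measured values. NOTE: assumed reading of a partly garbled source."""
    out = list(x[:3])
    for i in range(3, len(x)):
        avg3 = (x[i - 1] + x[i - 2] + x[i - 3]) / 3.0
        out.append(x[i] - k * avg3)
    return out

# A constant systematic drift of +1 in the deviations is fully cancelled
# from the fourth member onward when k = 1.
print(moving_average_correction([1.0, 1.0, 1.0, 1.0, 1.0], k=1.0))
```

The point of the three-sample window is that the correction reacts to the recent trend of the measured parameter rather than to any single noisy reading.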

  17. Control Improvement for Jump-Diffusion Processes with Applications to Finance

    International Nuclear Information System (INIS)

    Bäuerle, Nicole; Rieder, Ulrich

    2012-01-01

    We consider stochastic control problems with jump-diffusion processes and formulate an algorithm which produces, starting from a given admissible control π, a new control with a better value. If no improvement is possible, then π is optimal. Such an algorithm is well-known for discrete-time Markov Decision Problems under the name Howard’s policy improvement algorithm. The idea can be traced back to Bellman. Here we show with the help of martingale techniques that such an algorithm can also be formulated for stochastic control problems with jump-diffusion processes. As an application we derive some interesting results in financial portfolio optimization.
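
For the discrete-time analogue the abstract invokes, Howard's improvement step is short enough to sketch. The two-state, two-action MDP below is hypothetical and has nothing to do with the paper's jump-diffusion setting; it only shows the evaluate-then-greedify step:

```python
def policy_improvement(P, R, gamma, policy):
    """One step of Howard's policy improvement for a finite MDP.
    P[a][s][t]: transition probability s -> t under action a.
    R[a][s]:    expected one-step reward for action a in state s.
    Returns (improved_policy, value_of_input_policy)."""
    n = len(policy)
    # Policy evaluation by value iteration under the fixed policy.
    v = [0.0] * n
    for _ in range(500):
        v = [R[policy[s]][s] + gamma * sum(P[policy[s]][s][t] * v[t] for t in range(n))
             for s in range(n)]
    # Greedy improvement against the evaluated value function.
    improved = []
    for s in range(n):
        qs = [R[a][s] + gamma * sum(P[a][s][t] * v[t] for t in range(n))
              for a in range(len(P))]
        improved.append(max(range(len(P)), key=lambda a: qs[a]))
    return improved, v

# Two-state example: in state 0, action 1 reaches the rewarding state 1.
P = [[[1.0, 0.0], [0.0, 1.0]],   # action 0: stay put
     [[0.0, 1.0], [0.0, 1.0]]]   # action 1: move to / stay in state 1
R = [[0.0, 1.0],                 # action 0 rewards per state
     [0.5, 1.0]]                 # action 1 rewards per state
pol, v = policy_improvement(P, R, gamma=0.9, policy=[0, 0])
print("improved policy:", pol, "value of old policy:", [round(x, 2) for x in v])
```

Iterating this step until the policy stops changing is exactly Howard's policy iteration; the paper's contribution is showing, via martingale arguments, that the same improvement principle carries over to controlled jump-diffusions.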

  18. Use of statistical process control in the production of blood components

    DEFF Research Database (Denmark)

    Magnussen, K; Quere, S; Winkel, P

    2008-01-01

    Introduction of statistical process control in the setting of a small blood centre was tested, both on the regular red blood cell production and specifically to test if a difference was seen in the quality of the platelets produced, when a change was made from a relatively large inexperienced...... by an experienced staff with four technologists. We applied statistical process control to examine if time series of quality control values were in statistical control. Leucocyte count in red blood cells was out of statistical control. Platelet concentration and volume of the platelets produced by the occasional...... occasional component manufacturing staff to an experienced regular manufacturing staff. Production of blood products is a semi-automated process in which the manual steps may be difficult to control. This study was performed in an ongoing effort to improve the control and optimize the quality of the blood...

  19. Aspects of parallel processing and control engineering

    OpenAIRE

    McKittrick, Brendan J

    1991-01-01

    The concept of parallel processing is not a new one, but its application to control engineering tasks is a relatively recent development, made possible by contemporary hardware and software innovation. It has long been accepted that, if properly orchestrated, several processors/CPUs combined can form a powerful processing entity. What prevented this from being implemented in commercial systems was the adequacy of the microprocessor for most tasks and hence the expense of a multi-pro...

  20. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica

    1998-03-01

    Full Text Available Methods of statistical evaluation of quality – SPC (item 20 of the documentation system of quality control of the ISO norm, series 9000) of various processes, products and services – belong amongst basic qualitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the principles of: - diagnostics of cause and effects, - Pareto analysis and the Lorenz curve, - number distributions and frequency curves of random variable distributions, - Shewhart regulation charts, are presented in the contribution.
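As a concrete illustration of the Shewhart regulation charts listed above, a minimal X-bar chart computation might look as follows (the A2 values are the standard tabulated chart constants; everything else is an illustrative sketch, not code from the paper):

```python
import statistics

# Standard X-bar chart constants A2, indexed by subgroup size.
A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}

def xbar_r_limits(subgroups):
    """Shewhart X-bar chart limits from rational subgroups:
    center line = grand mean, limits = center +/- A2 * mean range."""
    n = len(subgroups[0])
    subgroup_means = [statistics.mean(g) for g in subgroups]
    r_bar = statistics.mean(max(g) - min(g) for g in subgroups)
    center = statistics.mean(subgroup_means)
    return center - A2[n] * r_bar, center, center + A2[n] * r_bar
```

Points falling outside the returned limits signal that the process location has shifted and an intervention should be considered.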

  1. Application of Special Cause Control Charts to Green Sand Process

    Directory of Open Access Journals (Sweden)

    Perzyk M.

    2015-12-01

    Full Text Available Statistical Process Control (SPC), based on the well-known Shewhart control charts, is widely used in contemporary manufacturing industry, including many foundries. However, the classic SPC methods require that the measured quantities, e.g. process or product parameters, are not auto-correlated, i.e. their current values do not depend on the preceding ones. For processes which do not obey this assumption the Special Cause Control (SCC) charts were proposed, utilizing the residual data obtained from time-series analysis. In the present paper the results of application of SCC charts to a green sand processing system are presented. The tests, made on real industrial data collected in a big iron foundry, were aimed at comparing the occurrences of out-of-control signals detected in the original data with those appearing in the residual data. It was found that application of the SCC charts reduces the number of signals in almost all cases. It is concluded that this can be helpful in avoiding false signals, i.e. those resulting from predictable factors.
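The SCC approach charts the residuals of a time-series model instead of the raw auto-correlated data. A minimal sketch, assuming a first-order autoregressive model and the usual ±3-sigma individual limits (function names and the AR(1) choice are illustrative, not the paper's exact model):

```python
import statistics

def ar1_residuals(series):
    """Fit x_t = a + b * x_{t-1} by least squares and return one-step
    residuals, the data actually charted in an SCC chart."""
    xp, xc = series[:-1], series[1:]
    mx, my = statistics.mean(xp), statistics.mean(xc)
    b = sum((u - mx) * (v - my) for u, v in zip(xp, xc)) / \
        sum((u - mx) ** 2 for u in xp)
    a = my - b * mx
    return [v - (a + b * u) for u, v in zip(xp, xc)]

def out_of_control(residuals):
    """Indices of residuals outside +/- 3 sigma individual limits."""
    m = statistics.mean(residuals)
    s = statistics.pstdev(residuals)
    return [i for i, r in enumerate(residuals) if abs(r - m) > 3 * s]
```

On strongly trending data the raw chart would flag nearly every point, while the residual chart only flags genuinely unpredictable disturbances.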

  2. Applying Statistical Process Quality Control Methodology to Educational Settings.

    Science.gov (United States)

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (range), X (individual observations), and MR (moving range) charts.

  3. Integration of process-oriented control with systematic inspection in FRAMATOME-FBFC fuel manufacturing

    International Nuclear Information System (INIS)

    Kopff, G.

    2000-01-01

    The classical approach to quality control is essentially based on final inspection of the product conducted through a qualified process. The main drawback of this approach lies in the separation and, therefore, the low feedback between manufacturing and quality control, leading to a very static quality system. As a remedy, the modern approach to quality management focuses on the need for continuous improvement through process-oriented quality control. In the classical approach, high reliability of nuclear fuel and a high quality level of the main characteristics are assumed to be attained, at the manufacturing step, through 100% inspection of the product, generally with automated inspection equipment. Such 100% final inspection is not appropriate to obtain a homogeneous product with minimum variability, and cannot be a substitute for the SPC (Statistical Process Control) tools which are rightly designed with this aim. On the other hand, SPC methods, which detect process changes and are used to keep the process under control, leading to the optimal distribution of the quality characteristics, do not protect against non-systematic or local disturbances at low frequency. Only systematic control is capable of detecting local quality troubles. In fact, both approaches, SPC and systematic inspection, are complementary, because they are remedies for distinct causes of process and product changes. The term 'statistical' in the expression 'SPC' refers less to the sampling techniques than to the control of global distribution parameters of product or process variables (generally location and dispersion parameters). The successive integration levels of process control methods with systematic inspection are described and illustrated by examples from FRAMATOME-FBFC fuel manufacturing, from the simple control chart for checking the performance stability of automated inspection equipment to the global process control system including systematic inspection. This kind of

  4. Development of Chemical Process Design and Control for Sustainability

    Science.gov (United States)

    This contribution describes a novel process systems engineering framework that couples advanced control with sustainability evaluation and decision making for the optimization of process operations to minimize environmental impacts associated with products, materials, and energy....

  5. Touch-sensitive graphics terminal applied to process control

    International Nuclear Information System (INIS)

    Bennion, S.I.; Creager, J.D.; VanHouten, R.D.

    1981-01-01

    Limited initial demonstrations of the system described took place during September 1980. A single CRT was used as an input device in the control center while operating a furnace and a pellet inspection gage. These two process line devices were completely controlled, despite the longer than desired response times noted, using a single control station located in the control center. The operator could conveniently execute from this remote location any function which could be performed locally at the hard-wired control panels. With the installation of the enhancements, the integrated touchscreen/graphics terminal will provide a preferable alternative to normal keyboard command input devices

  6. Theoretical Background for the Decision-Making Process Modelling under Controlled Intervention Conditions

    Directory of Open Access Journals (Sweden)

    Bakanauskienė Irena

    2017-12-01

    Full Text Available This article is intended to theoretically justify the decision-making process model for cases when active participation of investing entities in controlling the activities of an organisation and their results is noticeable. Based on an analysis of the scientific literature, a concept of controlled conditions is formulated, and, using a rational approach to the decision-making process, a model of the 11-step decision-making process under controlled intervention is presented. Also, the conditions describing the case of controlled interventions have been unified, thus providing preconditions to ensure the adequacy of the proposed decision-making process model.

  7. Online monitoring and control of the biogas process

    Energy Technology Data Exchange (ETDEWEB)

    Boe, K.

    2006-07-01

    The demand for online monitoring and control of the biogas process is increasing, since a better monitoring and control system can improve process stability and enhance process performance for a better economy of the biogas plants. A number of parameters in both the liquid and the gas phase have been suggested as process indicators. These include gas production, pH, alkalinity, volatile fatty acids (VFA) and hydrogen. Of these, VFA is the most widely recognised as a direct, relevant measure of stability. The individual, rather than collective, VFA concentrations are recognised as providing significantly more information for diagnosis. However, classic on-line measurement is based on filtration, which suffers from fouling, especially in particulate or slurry wastes. In this project, a new online VFA monitoring system has been developed using gas-phase VFA extraction to avoid sample filtration. The liquid sample is pumped into a sampling chamber, acidified, mixed with salt and heated to extract VFA into the gas phase before analysis by GC-FID. This allows easy application to manure. The sample and analysis time of the system varies from 25 to 40 min depending on the washing duration. The sampling frequency is fast enough for the dynamics of a manure digester, which are in the range of several hours. This system has been validated over more than 6 months and has shown good agreement with offline VFA measurement. Response from this sensor was compared with other process parameters, such as biogas production, pH and dissolved hydrogen, during overload situations in a laboratory-scale digester, to investigate the suitability of each measure as a process indicator. VFA was most reliable for indicating process imbalance, and propionate was most persistent. However, when the online VFA monitoring was coupled with a simple controller for automatically regulating the propionate level in a digester, it was found that propionate decreased so slowly that the biogas production fluctuated. Therefore, it is more

  8. Development of integrated control system for smart factory in the injection molding process

    Science.gov (United States)

    Chung, M. J.; Kim, C. Y.

    2018-03-01

    In this study, we proposed an integrated control system for automation of the injection molding process, as required for the construction of a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adoption of the integrated control system, the injection molding process can be simplified and the cost of constructing a smart factory can be reduced.

  9. Method for enhanced control of welding processes

    Science.gov (United States)

    Sheaffer, Donald A.; Renzi, Ronald F.; Tung, David M.; Schroder, Kevin

    2000-01-01

    Method and system for producing high quality welds in welding processes, in general, and gas tungsten arc (GTA) welding, in particular, by controlling weld penetration. Light emitted from a weld pool is collected from the backside of a workpiece by optical means during welding and transmitted to a digital video camera for further processing, after the emitted light is first passed through a short wavelength pass filter to remove infrared radiation. By filtering out the infrared component of the light emitted from the backside weld pool image, the present invention provides for the accurate determination of the weld pool boundary. Data from the digital camera is fed to an imaging board which focuses on a 100×100 pixel portion of the image. The board performs a thresholding operation and provides this information to a digital signal processor to compute the backside weld pool dimensions and area. This information is used by a control system, in a dynamic feedback mode, to automatically adjust appropriate parameters of a welding system, such as the welding current, to control weld penetration and thus create a uniform weld bead and high quality weld.
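The thresholding step described above can be illustrated with a toy grid standing in for the 100×100 pixel image; these functions are a hypothetical sketch of the idea, not the patented system's code:

```python
def weld_pool_area(image, threshold):
    """Binarize a grayscale image (list of pixel rows) and count bright
    pixels -- a rough stand-in for the imaging board's thresholding step."""
    return sum(1 for row in image for px in row if px >= threshold)

def weld_pool_width(image, threshold):
    """Widest bright row, a simple proxy for one weld pool dimension."""
    return max(sum(1 for px in row if px >= threshold) for row in image)
```

A feedback controller would then compare the measured area against a setpoint and trim the welding current accordingly.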

  10. Design of a QoS-controlled ATM-based communications system in chorus

    Science.gov (United States)

    Coulson, Geoff; Campbell, Andrew; Robin, Philippe; Blair, Gordon; Papathomas, Michael; Shepherd, Doug

    1995-05-01

    We describe the design of an application platform able to run distributed real-time and multimedia applications alongside conventional UNIX programs. The platform is embedded in a microkernel/PC environment and supported by an ATM-based, QoS-driven communications stack. In particular, we focus on resource-management aspects of the design and deal with CPU scheduling, network resource-management and memory-management issues. An architecture is presented that guarantees QoS levels of both communications and processing with varying degrees of commitment as specified by user-level QoS parameters. The architecture uses admission tests to determine whether or not new activities can be accepted and includes modules to translate user-level QoS parameters into representations usable by the scheduling, network, and memory-management subsystems.
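Admission tests of the kind mentioned above decide whether a new activity's resource demand can be guaranteed. As a hedged illustration (the platform's actual tests are not specified in the record), a classic utilization-based check for periodic CPU tasks uses the Liu-Layland rate-monotonic bound:

```python
def admit(tasks, new_task):
    """Utilization-based admission test for periodic CPU tasks.

    tasks: list of (cost, period) tuples already admitted.
    Accept the new task only if total utilization stays at or below
    the Liu-Layland rate-monotonic bound n * (2**(1/n) - 1)."""
    all_tasks = tasks + [new_task]
    n = len(all_tasks)
    utilization = sum(cost / period for cost, period in all_tasks)
    return utilization <= n * (2 ** (1 / n) - 1)
```

Rejecting a request up front is what lets the scheduler keep its QoS commitments to the activities it has already accepted.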

  11. Central auditory processing and migraine: a controlled study.

    Science.gov (United States)

    Agessi, Larissa Mendonça; Villa, Thaís Rodrigues; Dias, Karin Ziliotto; Carvalho, Deusvenir de Souza; Pereira, Liliane Desgualdo

    2014-11-08

    This study aimed to verify and compare central auditory processing (CAP) performance in migraine with and without aura patients and healthy controls. Forty-one volunteers of both genders, aged between 18 and 40 years, diagnosed with migraine with and without aura by the criteria of "The International Classification of Headache Disorders" (ICDH-3 beta) and a control group of the same age range and with no headache history, were included. Gaps-in-noise (GIN), Duration Pattern test (DPT) and Dichotic Digits Test (DDT) tests were used to assess central auditory processing performance. The volunteers were divided into 3 groups: Migraine with aura (11), migraine without aura (15), and control group (15), matched by age and schooling. Subjects with aura and without aura performed significantly worse in GIN test for right ear (p = .006), for left ear (p = .005) and for DPT test (p UNIFESP.

  12. Environmental control costs for oil shale processes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-10-01

    The studies reported herein are intended to provide more certainty regarding estimates of the costs of controlling environmental residuals from oil shale technologies being readied for commercial application. The need for this study was evident from earlier work conducted by the Office of Environment for the Department of Energy Oil Shale Commercialization Planning, Environmental Readiness Assessment in mid-1978. At that time there was little reliable information on the costs for controlling residuals and for safe handling of wastes from oil shale processes. The uncertainties in estimating costs of complying with yet-to-be-defined environmental standards and regulations for oil shale facilities are a critical element that will affect the decision on proceeding with shale oil production. Until the regulatory requirements are fully clarified and processes and controls are investigated and tested in units of larger size, it will not be possible to provide definitive answers to the cost question. Thus, the objective of this work was to establish ranges of possible control costs per barrel of shale oil produced, reflecting various regulatory, technical, and financing assumptions. Two separate reports make up the bulk of this document. One report, prepared by the Denver Research Institute, is a relatively rigorous engineering treatment of the subject, based on regulatory assumptions and technical judgements as to best available control technologies and practices. The other report examines the incremental cost effect of more conservative technical and financing alternatives. An overview section is included that synthesizes the products of the separate studies and addresses two variations to the assumptions.

  13. Statistical process control of cocrystallization processes: A comparison between OPLS and PLS.

    Science.gov (United States)

    Silva, Ana F T; Sarraguça, Mafalda Cruz; Ribeiro, Paulo R; Santos, Adenilson O; De Beer, Thomas; Lopes, João Almeida

    2017-03-30

    Orthogonal partial least squares regression (OPLS) is being increasingly adopted as an alternative to partial least squares (PLS) regression due to the better generalization that can be achieved. Particularly in multivariate batch statistical process control (BSPC), the use of OPLS for estimating nominal trajectories is advantageous. In OPLS, the nominal process trajectories are expected to be captured in a single predictive principal component while uncorrelated variations are filtered out to orthogonal principal components. In theory, OPLS will yield a better estimation of the Hotelling's T² statistic and corresponding control limits, thus lowering the number of false positives and false negatives when assessing the process disturbances. Although the advantages of OPLS have been demonstrated in the context of regression, its use in BSPC has seldom been reported. This study proposes an OPLS-based approach for BSPC of a cocrystallization process between hydrochlorothiazide and p-aminobenzoic acid monitored on-line with near infrared spectroscopy, and compares the fault detection performance with the same approach based on PLS. A series of cocrystallization batches with imposed disturbances were used to test the ability of the OPLS- and PLS-based BSPC methods to detect abnormal situations. Results demonstrated that OPLS was generally superior in terms of sensitivity and specificity in most situations. In some abnormal batches, it was found that the imposed disturbances were only detected with OPLS. Copyright © 2017 Elsevier B.V. All rights reserved.
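The Hotelling's T² statistic referred to above is computed from the latent-variable scores of the model. A minimal sketch (in practice the score variances come from the calibration batches; names here are illustrative):

```python
def hotelling_t2(scores, score_variances):
    """Hotelling's T^2 from latent-variable scores: the sum of squared
    scores, each scaled by that score's variance over the calibration set."""
    return sum(t * t / s for t, s in zip(scores, score_variances))

def in_control(scores, score_variances, limit):
    """Flag a batch as normal if its T^2 stays under the control limit."""
    return hotelling_t2(scores, score_variances) <= limit
```

With OPLS, only the predictive component's score enters the monitored trajectory, which is why the resulting limits can be tighter than the PLS ones.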

  14. Strategies for Enhancing Nonlinear Internal Model Control of pH Processes

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Qiuping.; Rangaiah, G.P. [The National University of Singapore, Singapore (Singapore). Dept. of Chemical and Environmental Engineering

    1999-02-01

    Control of neutralization processes is very difficult due to nonlinear dynamics, different types of disturbances and modeling errors. The objective of the paper is to evaluate two strategies (augmented internal model control, AuIMC, and adaptive internal model control, AdIMC) for enhancing pH control by nonlinear internal model control (NIMC). A NIMC controller is derived directly from input-output linearization. The AuIMC is composed of the NIMC and an additional loop through which the difference between the process and model outputs is fed back and added to the input of the controller. For the AdIMC, an adaptive law with two tuning parameters is proposed for estimating the unknown parameter. Both AuIMC and AdIMC are extensively tested via simulation for pH neutralization. The theoretical and simulation results show that both proposed strategies can reduce the effect of modeling errors and disturbances, and thereby enhance the performance of NIMC for pH processes. (author)

  15. Development of Statistical Process Control Methodology for an Environmentally Compliant Surface Cleaning Process in a Bonding Laboratory

    Science.gov (United States)

    Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.

    1997-01-01

    Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODC) including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-91 3NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26-inch Typhoon dishwasher cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995; subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor

  16. Dissociation between controlled and automatic processes in the behavioral variant of fronto-temporal dementia.

    Science.gov (United States)

    Collette, Fabienne; Van der Linden, Martial; Salmon, Eric

    2010-01-01

    A decline of cognitive functioning affecting several cognitive domains has frequently been reported in patients with frontotemporal dementia. We were interested in determining whether these deficits can be interpreted as reflecting an impairment of controlled cognitive processes, by using an assessment tool specifically developed to explore the distinction between automatic and controlled processes, namely the process dissociation procedure (PDP) developed by Jacoby. The PDP was applied to a word stem completion task to determine the contribution of automatic and controlled processes to episodic memory performance, and was administered to a group of 12 patients with the behavioral variant of frontotemporal dementia (bv-FTD) and 20 control subjects (CS). Bv-FTD patients obtained a lower performance than CS for the estimates of controlled processes, but no group difference was observed for estimates of automatic processes. The between-groups comparison of the estimates of controlled and automatic processes showed a larger contribution of automatic processes to performance in bv-FTD, while a slightly more important contribution of controlled processes was observed in control subjects. These results are clearly indicative of an alteration of controlled memory processes in bv-FTD.
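Jacoby's process dissociation procedure derives the two estimates from performance on inclusion and exclusion conditions of the task; a minimal sketch of the standard equations (input probabilities here are illustrative):

```python
def pdp_estimates(p_inclusion, p_exclusion):
    """Jacoby's process dissociation equations:
    controlled = P(inclusion) - P(exclusion)
    automatic  = P(exclusion) / (1 - controlled)."""
    controlled = p_inclusion - p_exclusion
    if controlled >= 1.0:
        return controlled, float("nan")  # automatic estimate undefined
    automatic = p_exclusion / (1.0 - controlled)
    return controlled, automatic
```

The logic: in the inclusion condition both processes help produce the studied word, while in the exclusion condition only automatic influences do, so the difference isolates the controlled contribution.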

  17. Specification and development of the sharing memory data management module for a nuclear processes simulator

    International Nuclear Information System (INIS)

    Telesforo R, D.

    2003-01-01

    A simulator of nuclear processes for research and teaching purposes is currently being developed at the Engineering Faculty of UNAM. It consists of several modules, including the shared memory module described in the present work. The module uses the IPC mechanisms of the UNIX System V operating system and was coded in the C language. The RELAP code is used to model the various components of the simulator. The function of the module is to create shared memory segments that hold the variables needed for interaction among the various processes of the simulator. Through these segments, the information generated by the running simulation can be read and written, and the internal variables of the code can be manipulated at execution time. The graphic displays (mimics, pictorials, trend graphics, virtual instrumentation, etc.) also obtain their information from the shared memory. In turn, user actions on the interactive displays modify the shared memory segments, and the information is sent to the RELAP code to alter the course of the simulation. The program has two start-up modes: automatic and manual. In automatic mode, it takes a RELAP input file (indta) and places in shared memory the control variables that appear in it. In manual mode, the user adds, reads and writes the desired control variables, provided they exist in the input file (indta). This is a dynamic way of interacting with the simulator directly, and even of altering values when no board elements are associated with the variables. (Author)
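The record describes System V IPC (shmget/shmat) in C. As a hedged, runnable analogue, Python's standard library offers named shared-memory segments that several processes can attach to; the segment name and double-packing scheme below are illustrative, not the simulator's actual layout:

```python
import struct
from multiprocessing import shared_memory

def publish(name, values):
    """Writer side: create a named shared-memory segment and pack the
    control variables into it as 8-byte doubles."""
    size = 8 * len(values)
    shm = shared_memory.SharedMemory(name=name, create=True, size=size)
    shm.buf[:size] = struct.pack("%dd" % len(values), *values)
    return shm

def read_values(name, count):
    """Reader side: attach to an existing segment, as another simulator
    process (e.g., a graphic display) would, and unpack the variables."""
    shm = shared_memory.SharedMemory(name=name)
    values = list(struct.unpack("%dd" % count, bytes(shm.buf[:8 * count])))
    shm.close()
    return values
```

Any process that knows the segment name can attach, read the current values, or write updated ones back, which mirrors how the displays and the RELAP interface exchange data through the shared segments.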

  18. An Ontology for Identifying Cyber Intrusion Induced Faults in Process Control Systems

    Science.gov (United States)

    Hieb, Jeffrey; Graham, James; Guan, Jian

    This paper presents an ontological framework that permits formal representations of process control systems, including elements of the process being controlled and the control system itself. A fault diagnosis algorithm based on the ontological model is also presented. The algorithm can identify traditional process elements as well as control system elements (e.g., IP network and SCADA protocol) as fault sources. When these elements are identified as a likely fault source, the possibility exists that the process fault is induced by a cyber intrusion. A laboratory-scale distillation column is used to illustrate the model and the algorithm. Coupled with a well-defined statistical process model, this fault diagnosis approach provides cyber security enhanced fault diagnosis information to plant operators and can help identify that a cyber attack is underway before a major process failure is experienced.

  19. Design process and philosophy of TVA's latest advance control room complex

    International Nuclear Information System (INIS)

    Owens, G.R.; Masters, D.W.

    1979-01-01

    TVA's latest nuclear power plant control room design includes a greater emphasis on human factors as compared to their earlier plant designs. This emphasis has resulted in changes in the overall design philosophy and design process. This paper discusses some of the prominent design features of both the control room and the surrounding control room complex. In addition, it also presents some of the important activities involved in the process of developing the advanced control room design

  20. Statistical process control using optimized neural networks: a case study.

    Science.gov (United States)

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two aspects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as the efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced based on the cuckoo optimization algorithm (COA) to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.