WorldWideScience

Sample records for sampled-data control systems

  1. Fault tolerant controllers for sampled-data systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    2004-01-01

    A general compensator architecture for fault tolerant control (FTC) for sampled-data systems is proposed. The architecture is based on the YJBK parameterization of all stabilizing controllers, and uses the dual YJBK parameterization to quantify the performance of the fault tolerant system. The FTC...

  2. Sampled-data and discrete-time H2 optimal control

    NARCIS (Netherlands)

    Trentelman, Harry L.; Stoorvogel, Anton A.

    1993-01-01

    This paper deals with the sampled-data H2 optimal control problem. Given a linear time-invariant continuous-time system, the problem of minimizing the H2 performance over all sampled-data controllers with a fixed sampling period can be reduced to a pure discrete-time H2 optimal control problem. This
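
    The reduction from sampled-data to discrete-time H2 control starts from a zero-order-hold discretization of the continuous-time plant over one sampling period. A minimal sketch of that step (the plant matrices here are an illustrative double integrator, not taken from the paper):

```python
import numpy as np
from scipy.linalg import expm

# Continuous-time plant dx/dt = A x + B u (illustrative double integrator)
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
h = 0.1  # sampling period

# Zero-order-hold discretization via the augmented-matrix trick:
# expm([[A, B], [0, 0]] * h) = [[Ad, Bd], [0, I]]
n, m = A.shape[0], B.shape[1]
M = np.zeros((n + m, n + m))
M[:n, :n] = A
M[:n, n:] = B
Md = expm(M * h)
Ad, Bd = Md[:n, :n], Md[:n, n:]
```

    For the double integrator this yields the familiar Ad = [[1, h], [0, 1]] and Bd = [[h²/2], [h]], the discrete-time model over which the equivalent H2 problem is posed.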

  3. Stabilization of nonlinear systems using sampled-data output-feedback fuzzy controller based on polynomial-fuzzy-model-based control approach.

    Science.gov (United States)

    Lam, H K

    2012-02-01

    This paper investigates the stability of sampled-data output-feedback (SDOF) polynomial-fuzzy-model-based control systems. Representing the nonlinear plant using a polynomial fuzzy model, an SDOF fuzzy controller is proposed to perform the control process using the system output information. As only the system output is available for feedback compensation, it is more challenging for the controller design and system analysis compared to the full-state-feedback case. Furthermore, because of the sampling activity, the control signal is kept constant by the zero-order hold during the sampling period, which complicates the system dynamics and makes the stability analysis more difficult. In this paper, two cases of SDOF fuzzy controllers, which either share the same number of fuzzy rules or not, are considered. The system stability is investigated based on the Lyapunov stability theory using the sum-of-squares (SOS) approach. SOS-based stability conditions are obtained to guarantee the system stability and synthesize the SDOF fuzzy controller. Simulation examples are given to demonstrate the merits of the proposed SDOF fuzzy control approach.
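
    The zero-order-hold behaviour described above can be mimicked in simulation by holding the control input constant between sampling instants while the plant evolves continuously. A minimal sketch with a hypothetical scalar plant dx/dt = x - x³ + u and an assumed sampled state feedback u = -2x (not the paper's fuzzy controller):

```python
import numpy as np

def simulate_sampled_feedback(h=0.05, dt=0.001, T=5.0, x0=1.5):
    """Simulate dx/dt = x - x**3 + u, with u = -2*x held by a zero-order
    hold: u is only refreshed at the sampling instants t = 0, h, 2h, ..."""
    x, u, t = x0, 0.0, 0.0
    next_sample = 0.0
    while t < T:
        if t >= next_sample:          # sampling instant: update control
            u = -2.0 * x              # hypothetical state feedback
            next_sample += h
        x += dt * (x - x**3 + u)      # Euler step between samples
        t += dt
    return x

x_final = simulate_sampled_feedback()
```

    Between samples the closed loop runs "open" on a stale measurement, which is exactly what makes sampled-data stability analysis harder than the continuous-feedback case.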

  4. Optimizing the data acquisition rate for a remotely controllable structural monitoring system with parallel operation and self-adaptive sampling

    International Nuclear Information System (INIS)

    Sheng, Wenjuan; Guo, Aihuang; Liu, Yang; Azmi, Asrul Izam; Peng, Gang-Ding

    2011-01-01

    We present a novel technique that optimizes the real-time remote monitoring and control of dispersed civil infrastructures. The monitoring system is based on fiber Bragg grating (FBG) sensors, and transfers data via Ethernet. This technique combines parallel operation and self-adaptive sampling to increase the data acquisition rate in remotely controllable structural monitoring systems. The compact parallel operation mode is highly efficient at achieving the highest possible data acquisition rate for the FBG-sensor-based local data acquisition system. Self-adaptive sampling is introduced to continuously coordinate local acquisition and remote control for data acquisition rate optimization. Key issues which impact the operation of the whole system, such as the real-time data acquisition rate, data processing capability, and buffer usage, are investigated. The results show that, by introducing parallel operation and self-adaptive sampling, the data acquisition rate can be increased several-fold without affecting the system's operating performance in either local data acquisition or remote process control.
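
    The paper's self-adaptive sampling law is not reproduced in this record, but the idea of coordinating the local acquisition rate with the remote link can be sketched as a simple feedback on buffer usage (the thresholds, scale factors and rate limits below are illustrative assumptions):

```python
def adapt_rate(rate_hz, buffer_fill, lo=0.3, hi=0.8,
               min_rate=100.0, max_rate=10_000.0):
    """Illustrative self-adaptive sampling: back off when the transmit
    buffer fills up (remote link falling behind), speed up when it
    drains, and clamp to the hardware's supported rate range."""
    if buffer_fill > hi:        # congestion: halve the sampling rate
        rate_hz *= 0.5
    elif buffer_fill < lo:      # headroom: raise the sampling rate
        rate_hz *= 1.25
    return min(max(rate_hz, min_rate), max_rate)

rate = adapt_rate(1000.0, buffer_fill=0.9)   # congested -> throttled
```

    Run periodically, a rule of this shape keeps local acquisition near the highest rate the remote control path can sustain.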

  5. Robust sampled-data control of hydraulic flight control actuators

    OpenAIRE

    Kliffken, Markus Gustav

    1997-01-01

    In today's fly-by-wire systems the primary flight control surfaces of modern commercial and transport aircraft are driven by electro-hydraulic linear actuators. Changing flight conditions as well as nonlinear actuator dynamics may be interpreted as parameter uncertainties of the linear actuator model. This demands a robust design for the controller. Here the parameter space design is used for the direct sampled-data controller synthesis. Therefore, a static output controller is chosen, the...

  6. Using sampled-data feedback control and linear feedback synchronization in a new hyperchaotic system

    International Nuclear Information System (INIS)

    Zhao Junchan; Lu Junan

    2008-01-01

    This paper investigates control and synchronization of a new hyperchaotic system which was proposed in [Chen A, Lu J-A, Lü J, Yu S. Generating hyperchaotic Lü attractor via state feedback control. Physica A 2006;364:103-10]. Firstly, we give different sampled-data feedback control schemes with the variation of the system parameter d. Specifically, we use only one controller to drive the system to the origin when d ∈ (−0.35, 0), and use two controllers if d ∈ [0, 1.3]. Next, we combine the PC method with the linear feedback approach to realize synchronization, and derive similar conclusions with varying d. Numerical simulations are also given to validate the proposed approaches.
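
    The cited hyperchaotic system's equations are not reproduced in this record, so as an illustration of the linear-feedback synchronization idea, here is a sketch on the classical Lorenz system with a master-slave coupling u = k(x_m − x_s); the gain, initial states and integration settings are assumptions, not the paper's values:

```python
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Classical Lorenz vector field."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

# Master-slave synchronization with linear feedback u = k * (master - slave)
dt, T, k = 0.001, 10.0, 30.0
m = np.array([1.0, 1.0, 1.0])       # master state
s = np.array([5.0, -5.0, 20.0])     # slave state, far from the master
for _ in range(int(T / dt)):
    u = k * (m - s)                 # linear feedback coupling
    m = m + dt * lorenz(m)          # Euler steps (small dt)
    s = s + dt * (lorenz(s) + u)
err = np.linalg.norm(m - s)         # synchronization error after T seconds
```

    With a sufficiently large coupling gain the error dynamics are contracting and the slave locks onto the master trajectory.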

  7. Sampled-data models for linear and nonlinear systems

    CERN Document Server

    Yuz, Juan I

    2014-01-01

    Sampled-data Models for Linear and Nonlinear Systems provides a fresh new look at a subject with which many researchers may think themselves familiar. Rather than emphasising the differences between sampled-data and continuous-time systems, the authors proceed from the premise that, with modern sampling rates being as high as they are, it is becoming more appropriate to emphasise connections and similarities. The text is driven by three motives: (i) the ubiquity of computers in modern control and signal-processing equipment means that sampling of systems that really evolve continuously is unavoidable; (ii) although superficially straightforward, sampling can easily produce erroneous results when not treated properly; and (iii) the need for a thorough understanding of many aspects of sampling among researchers and engineers dealing with applications to which they are central. The authors tackle many misconceptions which, although appearing reasonable at first sight, are in fact either p...
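
    The second motive, that naive sampling produces erroneous results, is exactly the aliasing phenomenon. A quick illustration: a 90 Hz sine sampled at 100 Hz (Nyquist frequency 50 Hz) is indistinguishable from a 10 Hz sine:

```python
import numpy as np

fs = 100.0          # sampling frequency (Hz)
f_true = 90.0       # signal frequency, above the 50 Hz Nyquist limit
n = np.arange(256)
samples = np.sin(2 * np.pi * f_true * n / fs)

# The same samples are produced by a 10 Hz sine of opposite sign:
# sin(2*pi*90*n/100) = sin(2*pi*n - 2*pi*10*n/100) = -sin(2*pi*10*n/100)
alias = np.sin(2 * np.pi * (fs - f_true) * n / fs)
max_dev = np.max(np.abs(samples + alias))   # ~0 up to rounding error
```

    No post-processing of the samples can tell the two signals apart, which is why anti-alias filtering before the sampler is not optional.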

  8. Computer-controlled sampling system for airborne particulates

    International Nuclear Information System (INIS)

    Hall, C.F.; Anspaugh, L.R.; Koval, J.S.; Phelps, P.L.; Steinhaus, R.J.

    1975-01-01

    A self-contained, mobile, computer-controlled air-sampling system has been designed and fabricated that also collects and records the data from eight meteorological sensors. The air samplers are activated automatically when the collected meteorological data meet the criteria specified at the beginning of the data-collection run. The filters from the samplers are intended to collect airborne ²³⁹Pu for later radionuclide analysis and correlation with the meteorological data for the study of resuspended airborne radioactivity and for the development of a predictive model. This paper describes the system hardware, discusses the system and software concepts, and outlines the operational procedures for the system.
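
    The automatic activation logic, triggering the samplers only when the meteorological readings meet the preconfigured criteria, can be sketched as a simple predicate; the sensor names and acceptance windows below are hypothetical, not the system's actual configuration:

```python
def should_sample(met, criteria):
    """Activate the air samplers only when every meteorological reading
    falls inside its configured (lo, hi) acceptance window."""
    return all(lo <= met[name] <= hi for name, (lo, hi) in criteria.items())

# Hypothetical criteria specified at the start of a data-collection run
criteria = {"wind_speed_mps": (2.0, 15.0), "wind_dir_deg": (180.0, 270.0)}
ok = should_sample({"wind_speed_mps": 5.0, "wind_dir_deg": 225.0}, criteria)
```

    Evaluated on each new meteorological record, this is the gate between passive logging and active filter collection.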

  9. Practical iterative learning control with frequency domain design and sampled data implementation

    CERN Document Server

    Wang, Danwei; Zhang, Bin

    2014-01-01

    This book is on iterative learning control (ILC) with a focus on design and implementation. We approach the ILC design based on frequency-domain analysis and address the ILC implementation based on sampled-data methods. This is the first book on ILC from the frequency-domain and sampled-data methodologies. The frequency-domain design methods offer ILC users insights into the convergence performance, which is of practical benefit. This book presents a comprehensive framework with various methodologies to ensure that the learnable bandwidth in the ILC system is set with a balance between learning performance and learning stability. The sampled-data implementation ensures effective execution of ILC in practical dynamic systems. The presented sampled-data ILC methods also ensure the balance of performance and stability of the learning process. Furthermore, the presented theories and methodologies are tested with an ILC-controlled robotic system. The experimental results show that the machines can work in much h...
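
    The book's designs are frequency-domain, but the core ILC idea can be illustrated in a few lines with a time-domain P-type update u_{k+1} = u_k + γ·e_k on a hypothetical first-order discrete plant; every parameter below is an assumption, chosen so the usual convergence condition |1 − γb| < 1 holds:

```python
import numpy as np

# Hypothetical discrete plant y[t+1] = a*y[t] + b*u[t], run over N steps
a, b, N = 0.3, 1.0, 50
y_ref = np.sin(np.linspace(0, np.pi, N))   # reference to be tracked

def run_trial(u):
    """Execute one repetition of the task with input profile u."""
    y = np.zeros(N)
    for t in range(N - 1):
        y[t + 1] = a * y[t] + b * u[t]
    return y

u = np.zeros(N)
gamma = 0.9                      # learning gain, |1 - gamma*b| < 1
errs = []
for _ in range(30):              # iterative learning trials
    e = y_ref - run_trial(u)
    errs.append(np.linalg.norm(e))
    u = u + gamma * np.roll(e, -1)   # P-type update with one-step lead
```

    The tracking error shrinks from trial to trial even though the plant model is never identified explicitly, which is the practical appeal of ILC for repetitive robotic tasks.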

  10. Iterative learning control with sampled-data feedback for robot manipulators

    Directory of Open Access Journals (Sweden)

    Delchev Kamen

    2014-09-01

    This paper deals with the improvement of the stability of sampled-data (SD) feedback control for nonlinear multiple-input multiple-output time-varying systems, such as robotic manipulators, by incorporating an off-line model-based nonlinear iterative learning controller. The proposed scheme of nonlinear iterative learning control (NILC) with SD feedback is applicable to a large class of robots, because sampled-data feedback is required for model-based feedback controllers, especially for robotic manipulators with complicated dynamics (6 or 7 DOF, or more), while the feedforward control from the off-line iterative learning controller should be assumed to be a continuous one. The robustness and convergence of the proposed NILC law with SD feedback are proven, and the derived sufficient condition for convergence is the same as the condition for an NILC with a continuous feedback control input. Simulation results for the presented NILC algorithm applied to a virtual PUMA 560 robot are given in order to verify the convergence and applicability of the proposed learning controller with the SD feedback controller attached.

  11. Sampled-Data Control of Spacecraft Rendezvous with Discontinuous Lyapunov Approach

    Directory of Open Access Journals (Sweden)

    Zhuoshi Li

    2013-01-01

    This paper investigates the sampled-data stabilization problem of spacecraft relative positional holding with an improved Lyapunov function approach. The classical Clohessy-Wiltshire equation is adopted to describe the relative dynamic model. The relative position holding problem is converted into an output tracking control problem using sampled signals. A time-dependent discontinuous Lyapunov functional approach is developed, which leads to essentially less conservative results for the stability analysis and controller design of the corresponding closed-loop system. Sufficient conditions for the exponential stability analysis and the existence of the proposed controller are provided, respectively. Finally, a simulation result is presented to illustrate the effectiveness of the proposed control scheme.
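
    The record does not give the controller itself, but one standard route to a sampled-data stabilizer for the Clohessy-Wiltshire model is to discretize the in-plane CW dynamics under a zero-order hold and solve a discrete LQR problem; the mean motion, sampling period and weights below are illustrative assumptions, not the paper's design:

```python
import numpy as np
from scipy.linalg import solve_discrete_are
from scipy.signal import cont2discrete

nmo = 0.0011  # mean motion of the reference orbit (rad/s), illustrative LEO value
# In-plane CW dynamics: x'' = 3n^2 x + 2n y' + ux,  y'' = -2n x' + uy
A = np.array([[0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [3 * nmo**2, 0.0, 0.0, 2 * nmo],
              [0.0, 0.0, -2 * nmo, 0.0]])
B = np.vstack([np.zeros((2, 2)), np.eye(2)])
h = 1.0  # sampling period (s)
Ad, Bd, *_ = cont2discrete((A, B, np.eye(4), np.zeros((4, 2))), h, method="zoh")

# Discrete LQR: one standard way to obtain a stabilizing sampled-data gain
Q, R = np.eye(4), np.eye(2)
P = solve_discrete_are(Ad, Bd, Q, R)
K = np.linalg.solve(R + Bd.T @ P @ Bd, Bd.T @ P @ Ad)
rho = np.max(np.abs(np.linalg.eigvals(Ad - Bd @ K)))   # closed-loop spectral radius
```

    The open-loop discrete CW model has all eigenvalues on the unit circle (marginally stable drift), while the closed loop under u = −Kx has spectral radius strictly below one.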

  12. Exponential synchronization of chaotic Lur'e systems with time-varying delay via sampled-data control

    International Nuclear Information System (INIS)

    Rakkiyappan, R.; Sivasamy, R.; Lakshmanan, S.

    2014-01-01

    In this paper, we study the exponential synchronization of chaotic Lur'e systems with time-varying delays via sampled-data control by using sector nonlinearities. In order to make full use of information about sampling intervals and interval time-varying delays, new Lyapunov–Krasovskii functionals with triple integral terms are introduced. Based on the convex combination technique, two kinds of synchronization criteria are derived in terms of linear matrix inequalities, which can be efficiently solved via standard numerical software. Finally, three numerical examples are provided to demonstrate the reduced conservatism and effectiveness of the proposed results.
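
    The derived LMI criteria are checked with standard numerical software; in the same spirit, the simplest such computation, a Lyapunov-equation certificate for a small hypothetical stable error system, looks like this:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical synchronization-error dynamics de/dt = A e.
# Solve A^T P + P A = -Q; P > 0 certifies exponential stability.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)   # solves (A^T) X + X A = -Q
eigs = np.linalg.eigvalsh(P)             # P is symmetric
stable = bool(np.all(eigs > 0))          # positive definite => stable
```

    Full LMI synthesis replaces this single linear equation with semidefinite feasibility problems, but the certificate logic, produce a positive-definite P, is the same.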

  13. Active Fault Diagnosis in Sampled-data Systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2015-01-01

    The focus in this paper is on active fault diagnosis (AFD) in closed-loop sampled-data systems. Applying the same AFD architecture as for continuous-time systems does not directly result in the same set of closed-loop matrix transfer functions. For continuous-time systems, the LFT (linear fractional transformation) structure in the connection between the parametric faults and the matrix transfer function (also known as the fault signature matrix) applied for AFD is not directly preserved for sampled-data systems. As a consequence, the AFD methods cannot directly be applied to sampled-data systems. Two methods are considered in this paper to handle the fault signature matrix for sampled-data systems such that standard AFD methods can be applied. The first method is based on a discretization of the system such that the LFT structure is preserved, resulting in the same LFT structure in the fault...

  14. Sampled-data-based vibration control for structural systems with finite-time state constraint and sensor outage.

    Science.gov (United States)

    Weng, Falu; Liu, Mingxin; Mao, Weijie; Ding, Yuanchun; Liu, Feifei

    2018-05-10

    The problem of sampled-data-based vibration control for structural systems with finite-time state constraints and sensor outage is investigated in this paper. The objective of the controller design is to guarantee the stability and anti-disturbance performance of the closed-loop system even when some sensor outages happen. Firstly, based on matrix transformation, the state-space model of structural systems with sensor outages and uncertainties appearing in the mass, damping and stiffness matrices is established. Secondly, considering that most earthquakes and strong winds act over a very short time, and that it is often the peak responses that damage structures, the finite-time stability analysis method is introduced to constrain the state responses within a given time interval, and H-infinity stability is adopted in the controller design to ensure that the closed-loop system has a prescribed level of disturbance attenuation performance during the whole control process. Furthermore, all stabilization conditions are expressed in the form of linear matrix inequalities (LMIs), whose feasibility can easily be checked using the LMI Toolbox. Finally, numerical examples are given to demonstrate the effectiveness of the proposed theorems. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  15. Controlling a sample changer using the integrated counting system

    International Nuclear Information System (INIS)

    Deacon, S.; Stevens, M.P.

    1985-06-01

    Control of the Sample Changer from a counting system can be achieved by using a Scaler Timer type 6255 and Sample Changer Control Interface type 6263. The interface used, however, has quite complex circuitry. The application therefore lends itself to the use of another 6000 Series module, the Integrated Counting System (ICS). Using this unit, control is carried out through a control program written in BASIC for the Commodore PET (or any other device with an IEEE-488 interface). The ICS then controls the sample changer through an interface unit which is relatively simple. A brief description of how the ICS controls the sample changer is given. The control program is then described: first the running options are given, followed by a program description, listing and flowchart. (author)

  17. Automated facility for analysis of soil samples by neutron activation, counting, and data control

    International Nuclear Information System (INIS)

    Voegele, A.L.; Jesse, R.H.; Russell, W.L.; Baker, J.

    1978-01-01

    An automated facility remotely and automatically analyzes soil, water, and sediment samples for uranium. The samples travel through pneumatic tubes and switches to be first irradiated by neutrons and then counted for the resulting neutron and gamma emission. Samples are loaded into special carriers, or rabbits, which are then automatically loaded into the pneumatic transfer system. The sample carriers have previously been coded with an identification number, which can be read automatically in the system. This number is used for correlating and filing data about the samples. The transfer system, counters, and identification system are controlled by a network of microprocessors. A master microprocessor initiates routines in other microprocessors assigned to specific tasks. The software in the microprocessors is unique for this type of application and lends flexibility to the system.

  18. Sampled-Data Consensus of Linear Multi-agent Systems With Packet Losses.

    Science.gov (United States)

    Zhang, Wenbing; Tang, Yang; Huang, Tingwen; Kurths, Jurgen

    In this paper, the consensus problem is studied for a class of multi-agent systems with sampled data and packet losses, where random and deterministic packet losses are considered, respectively. For random packet losses, a Bernoulli-distributed white sequence is used to describe packet dropouts among agents in a stochastic way. For deterministic packet losses, a switched system with stable and unstable subsystems is employed to model packet dropouts in a deterministic way. The purpose of this paper is to derive consensus criteria, such that linear multi-agent systems with sampled-data and packet losses can reach consensus. By means of the Lyapunov function approach and the decomposition method, the design problem of a distributed controller is solved in terms of convex optimization. The interplay among the allowable bound of the sampling interval, the probability of random packet losses, and the rate of deterministic packet losses are explicitly derived to characterize consensus conditions. The obtained criteria are closely related to the maximum eigenvalue of the Laplacian matrix versus the second minimum eigenvalue of the Laplacian matrix, which reveals the intrinsic effect of communication topologies on consensus performance. Finally, simulations are given to show the effectiveness of the proposed results.
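
    The Bernoulli packet-loss model can be illustrated with a minimal consensus simulation in which each agent independently fails to receive its neighbors' packets with probability p and simply holds its state for that round; the topology, step size and loss rate below are assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
# Path graph on 4 agents: Laplacian of the communication topology
Lap = np.array([[1.0, -1.0, 0.0, 0.0],
                [-1.0, 2.0, -1.0, 0.0],
                [0.0, -1.0, 2.0, -1.0],
                [0.0, 0.0, -1.0, 1.0]])
x = np.array([4.0, -1.0, 2.0, 7.0])     # initial agent states
h, p_loss = 0.1, 0.3                    # step size, Bernoulli loss probability
for _ in range(2000):
    mask = rng.random(4) >= p_loss      # 1 = agent received its packets
    x = x - h * mask * (Lap @ x)        # agents with lost packets hold state
spread = x.max() - x.min()              # disagreement after the run
```

    Each successful update moves an agent to a convex combination of itself and its neighbors, so disagreement never grows and, as long as losses are not persistent, shrinks toward consensus.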

  19. Architecture of SPIDER control and data acquisition system

    International Nuclear Information System (INIS)

    Luchetta, A.; Manduchi, G.; Taliercio, C.; Soppelsa, A.; Barbalace, A.; Paolucci, F.; Sartori, F.; Barbato, P.; Breda, M.; Capobianco, R.; Molon, F.; Moressa, M.; Polato, S.; Simionato, P.; Zampiva, E.

    2012-01-01

    The ITER Heating Neutral Beam injectors will be implemented in three steps: development of the ion source prototype, development of the full injector prototype, and, finally, construction of up to three ITER injectors. The first two steps will be carried out in the ITER neutral beam test facility under construction in Italy. The ion source prototype, referred to as SPIDER, which is currently in the development phase, is a complex experiment involving more than 20 plant units and operating with beam-on pulses lasting up to 1 h. As for control and data acquisition, it requires fast and slow control (cycle times of around 0.1 ms and 10 ms, respectively), synchronization (10 ns resolution), and data acquisition for about 1000 channels (analogue and images) with sampling frequencies up to tens of MS/s, data throughput up to 200 MB/s, and data storage volumes of up to tens of TB/year. The paper describes the architecture of the SPIDER control and data acquisition system, discussing the SPIDER requirements and the ITER CODAC interfaces and specifications for plant system instrumentation and control.

  20. Management of complex data flows in the ASDEX Upgrade plasma control system

    International Nuclear Information System (INIS)

    Treutterer, Wolfgang; Neu, Gregor; Raupp, Gerhard; Zasche, Dieter; Zehetbauer, Thomas; Cole, Richard; Lüddecke, Klaus

    2012-01-01

    Highlights: ► Control system architectures with data-driven workflows are efficient, flexible and maintainable. ► Signal groups provide coherence of interrelated signals and increase the efficiency of process synchronisation. ► Sample tags indicating sample quality form the foundation of a local event handling strategy. ► A self-organising workflow benefits from sample tags consisting of time stamp and stream activity. - Abstract: Establishing adequate technical and physical boundary conditions for a sustained nuclear fusion reaction is a challenging task. Phased feedback control and monitoring for heating, fuelling and magnetic shaping is mandatory, especially for fusion devices aiming at high-performance plasmas. Technical and physical interrelations require close collaboration of many components in sequential as well as in parallel processing flows. Moreover, handling of asynchronous, off-normal events has become a key element of modern plasma performance optimisation and machine protection recipes. The multitude of plasma states and events, the variety of plant system operation states and the diversity in diagnostic data sampling rates can hardly be mastered with a rigid control scheme. Rather, an adaptive system topology in combination with sophisticated synchronisation and process scheduling mechanisms is suited for such an environment. Moreover, the system is subject to real-time control constraints: response times must be deterministic and adequately short. Therefore, the experimental tokamak device ASDEX Upgrade employs a discharge control system DCS, whose core has been designed to meet these requirements. In the paper we compare the scheduling schemes for the parallelised realisation of a control workflow and show the advantage of a data-driven workflow over a managed workflow. The data-driven workflow as used in DCS is based on signals connecting process outputs and inputs. These are implemented as real-time streams of data samples.
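
    The sample-tag idea, each sample carrying a time stamp and a quality flag so that downstream processes fire in a data-driven way, can be sketched minimally as follows; the class and firing rule are illustrative, not DCS code:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Sample:
    """A data sample tagged with its acquisition time and a quality flag."""
    timestamp: float
    value: float
    valid: bool = True

def ready(inputs):
    """Data-driven firing rule: a process runs only when all of its input
    streams supply a valid sample carrying the same time stamp."""
    stamps = {s.timestamp for s in inputs}
    return len(stamps) == 1 and all(s.valid for s in inputs)

a = Sample(0.010, 1.7)
b = Sample(0.010, -0.3)
c = Sample(0.012, 0.9)     # a later stamp from a slower diagnostic
fire_ab = ready([a, b])    # same stamp, both valid -> fire
fire_ac = ready([a, c])    # stamps differ -> wait
```

    Because each process decides locally from the tags on its inputs, the workflow self-organises without a central scheduler, which is the managed-versus-data-driven distinction the abstract draws.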

  2. Systems and methods for self-synchronized digital sampling

    Science.gov (United States)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
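
    The essential computation is deriving the sample clock from the machine's rotational frequency, so that samples fall at fixed shaft angles rather than fixed wall-clock intervals; a trivial sketch (the samples-per-revolution value is an assumption):

```python
def sampling_frequency(rot_hz, samples_per_rev=64):
    """Self-synchronized sampling: derive the ADC sample clock from the
    machine's rotational frequency so that every revolution yields the
    same number of angularly equispaced samples."""
    return rot_hz * samples_per_rev

# A machine spinning at 3000 rpm rotates at 50 Hz:
fs = sampling_frequency(3000 / 60.0)
```

    Tracking the rotational frequency this way keeps vibration samples phase-locked to the shaft even as the machine speeds up or slows down.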

  3. Research of pneumatic control transmission system for small irradiation samples

    International Nuclear Information System (INIS)

    Bai Zhongxiong; Zhang Haibing; Rong Ru; Zhang Tao

    2008-01-01

    In order to reduce the absorbed dose to the operator, pneumatic control has been adopted to realize the rapid transmission of small irradiation samples. The on/off switching of the pneumatic circuit and the transport directions of the rapid transmission system are controlled by the electrical control section. The main program initializes the system, detects the position of the manual/automatic change-over switch, and calls the corresponding subprogram to achieve automatic or manual operation. The automatic subprogram performs automatic sample transmission; the manual subprogram handles deflation and the back-and-forth movement of the irradiation samples. This paper introduces the implementation of the system in detail, in terms of both hardware and software design. (authors)

  4. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay

    International Nuclear Information System (INIS)

    Wang Na; Wu Zhi-Hai; Peng Li

    2014-01-01

    In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay for heterogeneous multi-agent systems is proposed. Then, algebraic graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive necessary and sufficient conditions guaranteeing that the heterogeneous multi-agent systems asymptotically achieve stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results. (interdisciplinary physics and related areas of science and technology)

  5. High speed, locally controlled data acquisition system for TFTR

    International Nuclear Information System (INIS)

    Feng, H.K.; Bradish, G.J.

    1983-01-01

    A high-speed, locally controlled data acquisition and transmission system has been developed by the CICADA (Central Instrumentation Control and Data Acquisition) Group for extracting certain time-critical data during a TFTR pulse and passing it to the control room, 1000 feet distant, to satisfy real-time requirements of frequently sampled variables. The system is designed to utilize any or all of the standard CAMAC (Computer Automated Measurement and Control) modules now employed on the CAMAC links for retrieval of the main body of data, but to operate them in a much faster manner than in a standard CAMAC system. To do this, a pre-programmable ROM sequencer is employed as a controller to transmit commands to the modules at intervals down to one microsecond, replacing the usual CAMAC dedicated computer and increasing the command rate by an order of magnitude over what could be sent down a Branch Highway. Data coming from any number of channels originating within a single CAMAC 'crate' is then time-multiplexed and transmitted over a single conductor pair in bi-phase at a 2.5 MHz bit rate using Manchester coding techniques. Benefits gained from this approach include: reduction in the number of conductors required, elimination of the line-to-line skew found in parallel transmission systems, and the capability of being transformer-coupled or transmitted over a fiber optic cable to avoid safety hazards and ground loops. The main application for this system so far has been as the feedback path in the closed-loop control of currents through the Tokamak's field coils. The paper will treat the system's various applications.
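
    Manchester coding embeds the clock in the data by guaranteeing a transition in the middle of every bit cell, which is what allows the receiver to recover timing from the bi-phase stream itself and what makes transformer coupling possible (the line carries no DC component). A minimal encoder sketch, assuming the IEEE 802.3 convention (0 → high/low, 1 → low/high):

```python
def manchester_encode(bits):
    """Manchester-encode a bit sequence (IEEE 802.3 convention:
    0 -> high then low, 1 -> low then high). Every bit cell contains a
    mid-cell transition, so the receiver can recover the clock from
    the data stream itself."""
    out = []
    for b in bits:
        out.extend([0, 1] if b else [1, 0])
    return out

line = manchester_encode([1, 0, 1, 1])
```

    Note that the encoding doubles the symbol rate on the wire, so a 2.5 MHz bit rate corresponds to 5 Mbaud on the conductor pair.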

  6. FireSignal-Data acquisition and control system software

    Energy Technology Data Exchange (ETDEWEB)

    Neto, A. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisboa (Portugal)], E-mail: andre.neto@cfn.ist.utl.pt; Fernandes, H.; Duarte, A.; Carvalho, B.B.; Sousa, J.; Valcarcel, D.F. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisboa (Portugal); Hron, M. [Asociace EURATOM IPP.CR, Prague (Czech Republic); Varandas, C.A.F. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais, P-1049-001 Lisboa (Portugal)

    2007-10-15

    Control of fusion experiments requires non-ambiguous, easy-to-use user interfaces to configure hardware devices. With that aim, a highly generic system for data control and acquisition has been developed. Among its main features, it allows remote hardware configuration, shot launching, data sharing between connected users and experiment monitoring. The system is fully distributed: the hardware driver nodes, clients and servers are completely independent from each other and might run on different operating systems and be programmed in different languages. All the communication is provided through the Common Object Request Broker Architecture (CORBA) protocol. FireSignal is designed to be as independent as possible from any kind of constraints, as it is a plugin-based system. The database, data viewers and the security system are some examples of what can easily be changed and adapted to the target machine's needs. In this system, every hardware device is described in eXtensible Markup Language (XML), and with this information Graphical User Interfaces (GUIs) are automatically built and the user's parameter configuration validated. Any type of hardware device can be integrated in the system as long as it is described in XML and the respective driver developed. Any modern programming language can be used to develop these drivers; currently Python and Java generic drivers are used. Data storage and indexing is time-stamp event-based. Nodes are responsible for tagging the acquired samples with absolute time stamps and for reacting to machine events. FireSignal is currently being used to control the ISTTOK/PT and CASTOR/CZ tokamaks.
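
    The XML-driven configuration step can be sketched with the standard library: a hypothetical device description (the real FireSignal schema is not reproduced in this record) is parsed, and a user's parameters are validated against the declared ranges, as the auto-generated GUIs would do before a shot is launched:

```python
import xml.etree.ElementTree as ET

# Hypothetical device description in the spirit of FireSignal's
# XML-driven configuration; the element and attribute names are assumed.
XML = """
<device name="adc_board">
  <parameter name="sampling_rate" type="int" min="1000" max="2000000"/>
  <parameter name="channels" type="int" min="1" max="32"/>
</device>
"""

def validate(config, xml_text):
    """Check user-supplied parameters against the ranges declared in
    the device's XML description; unspecified parameters pass."""
    root = ET.fromstring(xml_text)
    for p in root.iter("parameter"):
        name = p.get("name")
        lo, hi = int(p.get("min")), int(p.get("max"))
        if not lo <= config.get(name, lo) <= hi:
            return False
    return True

ok = validate({"sampling_rate": 500_000, "channels": 8}, XML)
```

    Keeping the device knowledge in data rather than code is what lets new hardware join the system with only a driver and an XML file.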

  7. FireSignal-Data acquisition and control system software

    International Nuclear Information System (INIS)

    Neto, A.; Fernandes, H.; Duarte, A.; Carvalho, B.B.; Sousa, J.; Valcarcel, D.F.; Hron, M.; Varandas, C.A.F.

    2007-01-01

Control of fusion experiments requires unambiguous, easy-to-use user interfaces to configure hardware devices. With that aim, a highly generic system for data control and acquisition has been developed. Its main features include remote hardware configuration, shot launching, data sharing between connected users, and experiment monitoring. The system is fully distributed: the hardware driver nodes, clients and servers are completely independent from each other and may run on different operating systems and be programmed in different languages. All communication is provided through the Common Object Request Broker Architecture (CORBA) protocol. FireSignal is designed to be as free as possible from any kind of constraint, as it is a plugin-based system. The database, data viewers and the security system are some examples of what can easily be changed and adapted to the target machine's needs. In this system, every hardware device is described in eXtensible Markup Language (XML); from this information, Graphical User Interfaces (GUI) are automatically built and the user's parameter configurations are validated. Any type of hardware device can be integrated in the system as long as it is described in XML and the respective driver is developed. Any modern programming language can be used to develop these drivers; currently, generic Python and Java drivers are used. Data storage and indexing are time-stamp event-based. Nodes are responsible for tagging the acquired samples with absolute time stamps and for reacting to machine events. FireSignal is currently being used to control the ISTTOK/PT and CASTOR/CZ tokamaks

  8. Synchronization of Hierarchical Time-Varying Neural Networks Based on Asynchronous and Intermittent Sampled-Data Control.

    Science.gov (United States)

    Xiong, Wenjun; Patel, Ragini; Cao, Jinde; Zheng, Wei Xing

In this brief, our purpose is to apply asynchronous and intermittent sampled-data control methods to achieve the synchronization of hierarchical time-varying neural networks. The asynchronous and intermittent sampled-data controllers are proposed for two reasons: 1) the controllers may not transmit the control information simultaneously and 2) the controllers cannot always exist at any time. The synchronization is then discussed for a kind of hierarchical time-varying neural networks based on the asynchronous and intermittent sampled-data controllers. Finally, the simulation results are given to illustrate the usefulness of the developed criteria.

  9. TFTR diagnostic control and data acquisition system

    International Nuclear Information System (INIS)

    Sauthoff, N.R.; Daniels, R.E.; PPL Computer Division

    1985-01-01

General computerized control and data-handling support for TFTR diagnostics is presented within the context of the Central Instrumentation, Control and Data Acquisition (CICADA) System. Procedures, hardware, the interactive man-machine interface, event-driven task scheduling, system-wide arming and data acquisition, and a hierarchical data base of raw data and results are described. Similarities in the data structures involved in control, monitoring, and data acquisition afford a simplification of the system functions, based on "groups" of devices. Emphases and optimizations appropriate for fusion diagnostic system designs are provided. An off-line data reduction computer system is under development

  10. Optimal sampling period of the digital control system for the nuclear power plant steam generator water level control

    International Nuclear Information System (INIS)

    Hur, Woo Sung; Seong, Poong Hyun

    1995-01-01

A great effort has been made to improve nuclear plant control systems by use of digital technologies, and a long-term schedule for the control system upgrade has been prepared with an aim to implementation in the next generation of nuclear plants. For a digital control system, it is important to choose the sampling period for analysis and design, because both the performance and the stability of the system depend on its value. There is, however, currently no systematic method used universally for determining the sampling period of a digital control system. Traditionally, the sampling frequency is selected as 20 to 30 times the bandwidth of the analog control system that has the same configuration and parameters as the digital one. In this paper, a new method to select the sampling period is suggested which takes into account the performance as well as the stability of the digital control system. Using Irving's steam generator model, the optimal sampling period of an assumed digital control system for steam generator level control is estimated and then verified in the digital control simulation system for the Kori-2 nuclear power plant steam generator level control. Consequently, we conclude that the optimal sampling period of the digital control system for Kori-2 nuclear power plant steam generator level control is 1 second for all power ranges. 7 figs., 3 tabs., 8 refs. (Author)
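The traditional rule of thumb mentioned in the abstract (sampling frequency of 20 to 30 times the analog bandwidth) amounts to simple arithmetic. The bandwidth below is an illustrative assumption, not Kori-2 data.

```python
# Rule-of-thumb sampling period: fs = factor * closed-loop bandwidth of the
# equivalent analog controller, with factor traditionally 20 to 30.
def sampling_period(bandwidth_hz, factor=20):
    fs = factor * bandwidth_hz
    return 1.0 / fs

# e.g. an assumed 0.05 Hz level-control bandwidth gives a 1 s period at factor 20
print(sampling_period(0.05, factor=20))  # 1.0
```

The paper's contribution is to replace this heuristic factor with a choice driven by explicit performance and stability criteria.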

  11. Design, analysis, and interpretation of field quality-control data for water-sampling projects

    Science.gov (United States)

    Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.

    2015-01-01

    The process of obtaining and analyzing water samples from the environment includes a number of steps that can affect the reported result. The equipment used to collect and filter samples, the bottles used for specific subsamples, any added preservatives, sample storage in the field, and shipment to the laboratory have the potential to affect how accurately samples represent the environment from which they were collected. During the early 1990s, the U.S. Geological Survey implemented policies to include the routine collection of quality-control samples in order to evaluate these effects and to ensure that water-quality data were adequately representing environmental conditions. Since that time, the U.S. Geological Survey Office of Water Quality has provided training in how to design effective field quality-control sampling programs and how to evaluate the resultant quality-control data. This report documents that training material and provides a reference for methods used to analyze quality-control data.

  12. COED Transactions, Vol. X, No. 10, October 1978. Simulation of a Sampled-Data System on a Hybrid Computer.

    Science.gov (United States)

    Mitchell, Eugene E., Ed.

The simulation of a sampled-data system on a full parallel hybrid computer is described. The simulated sampled-data system illustrates proportional-integral-derivative (PID) discrete control of a continuous second-order process representing a stirred tank. The stirred tank is simulated using continuous analog components, while the PID…
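A minimal sketch of discrete PID control of a continuous process, in the spirit of the simulation described. The gains are illustrative assumptions, and a first-order lag stands in for the stirred tank (the actual simulation uses a second-order process on analog hardware).

```python
# Positional discrete PID driving a first-order plant integrated with Euler
# steps; all numbers are illustrative assumptions.
def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_err": 0.0}
    def pid(setpoint, measurement):
        err = setpoint - measurement
        state["integral"] += err * dt
        deriv = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return pid

pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
y, tau, dt = 0.0, 1.0, 0.1       # plant: tau * y' = u - y
for _ in range(300):             # 30 s of simulated time
    u = pid(1.0, y)              # unit step setpoint
    y += dt * (u - y) / tau
print(round(y, 3))               # settles near the setpoint 1.0
```

The integral term drives the steady-state error to zero, which is the role the continuous analog integrator plays in the hybrid setup.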

  13. Microcomputer-based equipment-control and data-acquisition system for fission-reactor reactivity-worth measurements

    International Nuclear Information System (INIS)

    McDowell, W.P.; Bucher, R.G.

    1980-01-01

    Material reactivity-worth measurements are one of the major classes of experiments conducted on the Zero Power research reactors (ZPR) at Argonne National Laboratory. These measurements require the monitoring of the position of a servo control element as a sample material is positioned at various locations in a critical reactor configuration. In order to guarantee operational reliability and increase experimental flexibility for these measurements, the obsolete hardware-based control unit has been replaced with a microcomputer based equipment control and data acquisition system. This system is based on an S-100 bus, dual floppy disk computer with custom built cards to interface with the experimental system. To measure reactivity worths, the system accurately positions samples in the reactor core and acquires data on the position of the servo control element. The data are then analyzed to determine statistical adequacy. The paper covers both the hardware and software aspects of the design

  14. Microcomputer-based equipment-control and data-acquisition system for fission-reactor reactivity-worth measurements

    Energy Technology Data Exchange (ETDEWEB)

    McDowell, W.P.; Bucher, R.G.

    1980-01-01

    Material reactivity-worth measurements are one of the major classes of experiments conducted on the Zero Power research reactors (ZPR) at Argonne National Laboratory. These measurements require the monitoring of the position of a servo control element as a sample material is positioned at various locations in a critical reactor configuration. In order to guarantee operational reliability and increase experimental flexibility for these measurements, the obsolete hardware-based control unit has been replaced with a microcomputer based equipment control and data acquisition system. This system is based on an S-100 bus, dual floppy disk computer with custom built cards to interface with the experimental system. To measure reactivity worths, the system accurately positions samples in the reactor core and acquires data on the position of the servo control element. The data are then analyzed to determine statistical adequacy. The paper covers both the hardware and software aspects of the design.

  15. FireSignal - Data Acquisition and Control System Software

    International Nuclear Information System (INIS)

    Neto, A.; Fernandes, H.; Duarte, A.; Carvalho, B.; Sousa, J.; Valcarcel, D.; Varandas, C.; Hron, M.

    2006-01-01

Control of fusion devices requires unambiguous, easy-to-use user interfaces to configure hardware devices. To solve this problem, a highly generic system for data control and acquisition has been developed. Its main features include remote hardware configuration, shot launching, data sharing between connected users and experiment monitoring. The system is fully distributed: the hardware driver nodes, clients and server are completely independent from each other and might be running on different operating systems and programmed in different languages. All communication is provided through the Common Object Request Broker Architecture (CORBA) protocol. FireSignal was designed from the beginning to be as free as possible from any kind of constraint, as it is a plugin-based system. The database, data viewers and the security system are some examples of what can easily be changed and adapted to the target machine's needs. All hardware is described in eXtensible Markup Language (XML), and from this information the FireSignal client application can automatically build Graphical User Interfaces (GUI) and validate the user's parameter configuration. Any type of hardware can be integrated in the system as long as it is described in XML and the respective driver is developed. Any modern programming language can be used to develop these drivers; currently, generic Python and Java drivers are used. All data storage and indexing is time-stamp event-based. Nodes are responsible for tagging the acquired samples with absolute time stamps and for reacting to machine events. FireSignal is currently being used to control the ISTTOK/PT and CASTOR/CZ tokamaks. (author)

  16. Robust H2 performance for sampled-data systems

    DEFF Research Database (Denmark)

    Rank, Mike Lind

    1997-01-01

Robust H2 performance conditions under structured uncertainty, analogous to well-known methods for H∞ performance, have recently emerged in both discrete and continuous time. This paper considers the extension to uncertain sampled-data systems, taking into account inter-sample behavior. Convex conditions for robust H2 performance are derived for different uncertainty sets.

  17. Information management system breadboard data acquisition and control system.

    Science.gov (United States)

    Mallary, W. E.

    1972-01-01

    Description of a breadboard configuration of an advanced information management system based on requirements for high data rates and local and centralized computation for subsystems and experiments to be housed on a space station. The system is to contain a 10-megabit-per-second digital data bus, remote terminals with preprocessor capabilities, and a central multiprocessor. A concept definition is presented for the data acquisition and control system breadboard, and a detailed account is given of the operation of the bus control unit, the bus itself, and the remote acquisition and control unit. The data bus control unit is capable of operating under control of both its own test panel and the test processor. In either mode it is capable of both single- and multiple-message operation in that it can accept a block of data requests or update commands for transmission to the remote acquisition and control unit, which in turn is capable of three levels of data-handling complexity.

  18. MoonDB — A Data System for Analytical Data of Lunar Samples

    Science.gov (United States)

    Lehnert, K.; Ji, P.; Cai, M.; Evans, C.; Zeigler, R.

    2018-04-01

    MoonDB is a data system that makes analytical data from the Apollo lunar sample collection and lunar meteorites accessible by synthesizing published and unpublished datasets in a relational database with an online search interface.

  19. Data-driven soft sensor design with multiple-rate sampled data

    DEFF Research Database (Denmark)

    Lin, Bao; Recke, Bodil; Knudsen, Jørgen K.H.

    2007-01-01

Multi-rate systems are common in industrial processes, where quality measurements have a slower sampling rate than other process variables. Since inter-sample information is desirable for effective quality control, different approaches have been reported to estimate the quality between samples, including numerical interpolation, polynomial transformation, data lifting and weighted partial least squares (WPLS). Two modifications to the original data-lifting approach are proposed in this paper: reformulating the extraction of a fast model as an optimization problem and ensuring the desired model properties through Tikhonov regularization. A comparative investigation of the four approaches is then performed. Their applicability, accuracy and robustness to process noise are evaluated on a single-input single-output (SISO) system. The regularized data lifting and WPLS approaches…
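Numerical interpolation, the simplest of the four multirate approaches listed, can be sketched as follows. The sampling ratio and values are illustrative assumptions.

```python
# Linear interpolation of slow-rate quality measurements onto the fast
# sampling grid; `ratio` is the number of fast ticks per slow sample.
def interpolate_slow(slow_samples, ratio):
    fast = []
    for y0, y1 in zip(slow_samples, slow_samples[1:]):
        for i in range(ratio):
            fast.append(y0 + (y1 - y0) * i / ratio)
    fast.append(slow_samples[-1])  # keep the final slow sample
    return fast

print(interpolate_slow([0.0, 1.0, 0.0], ratio=2))  # [0.0, 0.5, 1.0, 0.5, 0.0]
```

Its weakness, which motivates the model-based alternatives in the paper, is that the interpolated values are only available after the next slow sample arrives.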

  20. A microcomputer controlled sample changer system for γ-ray spectroscopy

    International Nuclear Information System (INIS)

    Jost, D.T.; Kraehenbuehl, U.; Gunten, H.R. von

    1982-01-01

    A Z-80 based microcomputer is used to control a sample changer system connected to two γ-ray spectrometers. Samples are changed according to preselected counting criteria (maximum time and/or desired precision). Special precautions were taken to avoid the loss of information and of samples in case of a power failure. (orig.)
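The "maximum time and/or desired precision" stopping criteria follow from Poisson counting statistics, where the relative uncertainty of N counts is 1/sqrt(N). The rates and limits below are illustrative assumptions, not values from the paper.

```python
# Counting-statistics stopping rule: count until N = (1/precision)^2 counts
# are accumulated, or until the maximum counting time elapses.
import math

def required_counts(rel_precision):
    return math.ceil(1.0 / rel_precision**2)

def counting_time(rate_cps, rel_precision, max_time_s):
    t_needed = required_counts(rel_precision) / rate_cps
    return min(t_needed, max_time_s)

print(required_counts(0.01))            # 10000 counts for 1 % precision
print(counting_time(50.0, 0.01, 600))   # 10000 / 50 cps = 200 s, under the cap
```

A sample changer implementing this rule advances to the next sample as soon as either condition is met.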

  1. A microprocessor-based power control data acquisition system

    International Nuclear Information System (INIS)

    Greenberg, S.

    1982-10-01

The reported project deals with one aspect of power plant control and management. In order to perform optimal distribution of power and load switching, one has to solve a specific optimization problem, which requires collecting current and power expenditure data from a large number of channels and having them processed. This procedure is defined as data acquisition, and it constitutes the main topic of this project. A microprocessor-based data acquisition system for power management is investigated and developed. The current and power data of about 100 analog channels are sampled and collected in real time. These data are subsequently processed to calculate the power factor (cos phi) for each channel and the maximum demand. The data are processed by an AMD 9511 Arithmetic Processing Unit and the whole system is controlled by an Intel 8080A CPU. All this information is then transferred to a universal computer through a synchronized communication channel. The optimization computations would be performed by the high-level computer. Different ways of performing the scan of data over a large number of channels have been investigated. A particular software-based solution to overcome the gain and offset drift of the A/D converter has been proposed. The 8080A supervises the collection and routing of data in real time, while the 9511 performs calculations using these data. (Author)
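The per-channel power factor (cos phi) computation described is plain arithmetic: real power divided by apparent power. The readings below are illustrative assumptions.

```python
# Power factor from a channel's real-power, voltage and current readings:
# cos(phi) = P / (V * I); all values are illustrative.
def power_factor(real_power_w, voltage_v, current_a):
    apparent_va = voltage_v * current_a
    return real_power_w / apparent_va

print(round(power_factor(1840.0, 230.0, 10.0), 2))  # 0.8
```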

  2. Jefferson Lab Data Acquisition Run Control System

    International Nuclear Information System (INIS)

    Vardan Gyurjyan; Carl Timmer; David Abbott; William Heyes; Edward Jastrzembski; David Lawrence; Elliott Wolin

    2004-01-01

    A general overview of the Jefferson Lab data acquisition run control system is presented. This run control system is designed to operate the configuration, control, and monitoring of all Jefferson Lab experiments. It controls data-taking activities by coordinating the operation of DAQ sub-systems, online software components and third-party software such as external slow control systems. The main, unique feature which sets this system apart from conventional systems is its incorporation of intelligent agent concepts. Intelligent agents are autonomous programs which interact with each other through certain protocols on a peer-to-peer level. In this case, the protocols and standards used come from the domain-independent Foundation for Intelligent Physical Agents (FIPA), and the implementation used is the Java Agent Development Framework (JADE). A lightweight, XML/RDF-based language was developed to standardize the description of the run control system for configuration purposes

  3. Water sample-collection and distribution system

    Science.gov (United States)

    Brooks, R. R.

    1978-01-01

The collection and distribution system samples water from six designated stations, filters it if desired, and delivers it to various analytical sensors. The system may be controlled by the Water Monitoring Data Acquisition System or operated manually.

  4. Data acquisition and experiment control system for high-data-rate experiments at the National Synchrotron Light Source

    International Nuclear Information System (INIS)

    Alberi, J.L.; Stubblefield, F.W.

    1981-11-01

A data acquisition and experiment control system for experiments at the Biology Small-Angle X-ray Scattering Station at the National Synchrotron Light Source has been developed based on a multiprocessor, functionally distributed architecture. The system controls an x-ray monochromator and spectrometer and acquires data from any one of three position-sensitive x-ray detectors. The average data rate from the position-sensitive detector is approx. 10^6 events/sec. Data is stored in a one-megaword histogramming memory. The experiments at this Station require that x-ray diffraction patterns be correlated with timed stimuli at the sample. Therefore, depending on which detector is in use, up to 10^3 time-correlated diffraction patterns may be held in the system memory simultaneously. The operation of the system is functionally distributed over four processors communicating via a multiport memory
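Time-correlated histogramming of the kind described can be sketched as follows: each detector event carries a position and a timestamp, and events are accumulated into one diffraction pattern per time slot. The detector size and slot width are illustrative assumptions.

```python
# Accumulate (timestamp_ms, position) events into per-time-slot histograms;
# patterns[t][x] counts events at detector position x during time slot t.
def histogram_events(events, n_positions, slot_ms, n_slots):
    patterns = [[0] * n_positions for _ in range(n_slots)]
    for t_ms, pos in events:
        slot = int(t_ms // slot_ms)
        if 0 <= slot < n_slots and 0 <= pos < n_positions:
            patterns[slot][pos] += 1
    return patterns

events = [(0.5, 3), (1.2, 3), (9.9, 3), (10.1, 7)]
patterns = histogram_events(events, n_positions=16, slot_ms=10, n_slots=4)
print(patterns[0][3], patterns[1][7])  # 3 1
```

In hardware this is exactly what a histogramming memory does: the (slot, position) pair forms the memory address and each event increments that word.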

  5. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    Brauer, F.P.; Fager, J.E.

    1976-01-01

A mini-computer based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system

  6. Overview of data acquisition and central control system of steady state superconducting Tokamak (SST-1)

    Energy Technology Data Exchange (ETDEWEB)

    Pradhan, S., E-mail: pradhan@ipr.res.in; Mahajan, K.; Gulati, H.K.; Sharma, M.; Kumar, A.; Patel, K.; Masand, H.; Mansuri, I.; Dhongde, J.; Bhandarkar, M.; Chudasama, H.

    2016-11-15

Highlights: • The paper gives an overview of the SST-1 data acquisition and central control system and future upgrade plans. • The lossless PXI-based data acquisition of SST-1 is capable of acquiring around 130 channels with sampling frequencies ranging from 10 kHz to 1 MHz. • Design, architecture and technologies used for the central control system (CCS) of SST-1. • Functions performed by the CCS. - Abstract: The Steady State Superconducting Tokamak (SST-1) has been commissioned successfully and has been carrying out limiter-assisted ohmic plasma experiments since the beginning of 2014, achieving a maximum plasma current of 75 kA at a central field of 1.5 T and a plasma duration of ∼500 ms. In the near future, SST-1 looks forward to carrying out elongated plasma experiments and stretching plasma pulses beyond 1 s. The data acquisition and central control system (CCS) for SST-1 are distributed, modular, hierarchical and scalable in nature. The CCS has been indigenously designed, developed, implemented, tested and validated for the operation of SST-1. The CCS has been built using well-proven technologies such as Red Hat Linux, the VxWorks RTOS for deterministic control, FPGA-based hardware, Ethernet, a fiber-optic network backbone, DSPs for real-time computation and reflective memory for high-speed data transfer. The CCS in SST-1 controls and monitors various heterogeneous SST-1 subsystems dispersed in the same campus. The CCS consists of the machine control system, basic plasma control system, GPS time synchronization system, a storage area network (SAN) for centralized data storage, the SST-1 networking system, real-time networks, the SST-1 control room infrastructure and many other supporting systems. The Machine Control System (MCS) is a multithreaded, event-driven system running on Linux-based servers, where each thread of the software communicates with a unique subsystem for monitoring and control from the SST-1 central control room through network programming. The CCS hardware

  7. Overview of data acquisition and central control system of steady state superconducting Tokamak (SST-1)

    International Nuclear Information System (INIS)

    Pradhan, S.; Mahajan, K.; Gulati, H.K.; Sharma, M.; Kumar, A.; Patel, K.; Masand, H.; Mansuri, I.; Dhongde, J.; Bhandarkar, M.; Chudasama, H.

    2016-01-01

Highlights: • The paper gives an overview of the SST-1 data acquisition and central control system and future upgrade plans. • The lossless PXI-based data acquisition of SST-1 is capable of acquiring around 130 channels with sampling frequencies ranging from 10 kHz to 1 MHz. • Design, architecture and technologies used for the central control system (CCS) of SST-1. • Functions performed by the CCS. - Abstract: The Steady State Superconducting Tokamak (SST-1) has been commissioned successfully and has been carrying out limiter-assisted ohmic plasma experiments since the beginning of 2014, achieving a maximum plasma current of 75 kA at a central field of 1.5 T and a plasma duration of ∼500 ms. In the near future, SST-1 looks forward to carrying out elongated plasma experiments and stretching plasma pulses beyond 1 s. The data acquisition and central control system (CCS) for SST-1 are distributed, modular, hierarchical and scalable in nature. The CCS has been indigenously designed, developed, implemented, tested and validated for the operation of SST-1. The CCS has been built using well-proven technologies such as Red Hat Linux, the VxWorks RTOS for deterministic control, FPGA-based hardware, Ethernet, a fiber-optic network backbone, DSPs for real-time computation and reflective memory for high-speed data transfer. The CCS in SST-1 controls and monitors various heterogeneous SST-1 subsystems dispersed in the same campus. The CCS consists of the machine control system, basic plasma control system, GPS time synchronization system, a storage area network (SAN) for centralized data storage, the SST-1 networking system, real-time networks, the SST-1 control room infrastructure and many other supporting systems. The Machine Control System (MCS) is a multithreaded, event-driven system running on Linux-based servers, where each thread of the software communicates with a unique subsystem for monitoring and control from the SST-1 central control room through network programming. The CCS hardware

  8. Control and Data Acquisition System of the ATLAS Facility

    International Nuclear Information System (INIS)

    Choi, Ki-Yong; Kwon, Tae-Soon; Cho, Seok; Park, Hyun-Sik; Baek, Won-Pil; Kim, Jung-Taek

    2007-02-01

This report describes the control and data acquisition system of an integral effect test facility, the ATLAS (Advanced Thermal-hydraulic Test Loop for Accident Simulation) facility, which has recently been constructed at KAERI (Korea Atomic Energy Research Institute). The control and data acquisition system of the ATLAS is implemented with a hybrid distributed control system (DCS) by RTP Corp. The ARIDES system on a Linux platform, provided by BNF Technology Inc., is used for the control software. The I/O signals consist of 1995 channels and they are processed at 10 Hz. The Human-Machine Interface (HMI) consists of 43 processing windows and they are classified according to fluid system. All control devices can be controlled by manual, auto, sequence, group, and table control methods. The monitoring system can display the real-time trend or historical data of the selected I/O signals on LCD monitors in graphical form. The data logging system can be started or stopped by the operator, and the logging frequency can be selected among 0.5, 1, 2, and 10 Hz. The fluid system of the ATLAS facility consists of several systems, from the primary system to auxiliary systems. Each fluid system has control similarity to the prototype plant, APR1400/OPR1000

  9. Advanced computer-controlled automatic alpha-beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.; Bruinekool, D.J.; Stapleton, E.E.

    1983-01-01

An improved computer-controlled automatic alpha-beta air sample counter was developed, based upon an earlier automatic air sample counter design. The system consists of an automatic sample changer, an electronic counting system utilizing a large silicon diode detector, a small desk-type microcomputer, a high-speed matrix printer, and the necessary data interfaces. The system is operated by commands from the keyboard and programs stored on magnetic tape cassettes. The programs provide for background counting, the chi-square test, radon subtraction, and sample counting for sample periods of one day to one week. Output data are printed by the matrix printer on standard multifold paper. The data output includes gross beta, gross alpha, and plutonium results. Data are automatically corrected for background, counter efficiency, and, in the gross alpha and plutonium channels, for the presence of radon
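The background and efficiency corrections such counters automate reduce to simple count-rate arithmetic. The values below are illustrative assumptions.

```python
# Net activity rate from a gross count: subtract the background rate, then
# divide by detector efficiency; all numbers are illustrative.
def net_rate(gross_counts, live_time_s, bkg_rate_cps, efficiency):
    gross_rate = gross_counts / live_time_s
    return (gross_rate - bkg_rate_cps) / efficiency

print(net_rate(1200, 60.0, 2.0, 0.25))  # (20 - 2) / 0.25 = 72.0
```

A radon correction of the kind mentioned would subtract a further, separately measured rate from the gross alpha channel before the efficiency division.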

  10. Development of an automatic sample changer and a data acquisition system

    International Nuclear Information System (INIS)

    Bianchini, Ricardo M.; Estevez, Jorge; Vollmer, Alberto E.; Iglicki, Flora A.

    1999-01-01

    An automatic electro-pneumatic sample changer with a rotating sample holder is described. The changer is coupled through an electronic interface with the data acquisition station. The software to automate the system has been designed. (author)

  11. Hanford Environmental Information System (HEIS). Volume 7: Sample and Data Tracking subject area

    International Nuclear Information System (INIS)

    1994-06-01

The Hanford Environmental Information System (HEIS) Sample and Data Tracking subject area allows insertion of tracking information into a central repository where the data are immediately available for viewing. For example, a technical coordinator is able to view the current status of a particular sampling effort, from sample collection to data package validation dates. Four major types of data comprise the Sample and Data Tracking subject area: data about the mechanisms that group a set of samples for a particular sampling effort; data about how constituents are grouped and assigned to a sample; data about when, where, and how samples are sent to a laboratory for analysis; and data about the status of a sample's constituent analysis requirements, i.e., whether the analysis results have been returned from the laboratory

  12. A tracking system for groundwater sampling and data transfer schedules

    International Nuclear Information System (INIS)

    Mercier, T.M.

    1990-12-01

Since groundwater monitoring programs at the Oak Ridge Y-12 Plant have become more complex and varied, and as the occasions to respond to internal and external reporting requirements have become more frequent and time-constrained, the need to track groundwater sampling activities and data transfer from the analytical laboratories has become imperative. If backlogs can be caught early, resources can be added or reallocated in the field and in the laboratory in a timely manner to ensure reporting deadlines are met. The tracking system discussed in this paper starts with a clear definition of the groundwater monitoring program at the facility. This information is input into base datasets at the beginning of the sampling cycle. As the sampling program progresses, information about well sampling dates and data transfer dates is input into the base datasets. From the base program data and the update data, a status report is periodically generated by a computer program which identifies the type and nature of bottlenecks encountered during the implementation of the groundwater monitoring program

  13. Sampled data CT system including analog filter and compensating digital filter

    International Nuclear Information System (INIS)

    Glover, G. H.; DallaPiazza, D. G.; Pelc, N. J.

    1985-01-01

    A CT scanner in which the amount of x-ray information acquired per unit time is substantially increased by using a continuous-on x-ray source and a sampled data system with the detector. An analog filter is used in the sampling system for band limiting the detector signal below the highest frequency of interest, but is a practically realizable filter and is therefore non-ideal. A digital filter is applied to the detector data after digitization to compensate for the characteristics of the analog filter, and to provide an overall filter characteristic more nearly like the ideal
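The compensation idea can be illustrated with a toy model: here the non-ideal analog filter is modeled as a single-pole discrete lowpass, and the digital filter applies its exact inverse so that the cascade is flat. The pole value is an illustrative assumption, not the scanner's actual filter.

```python
# Forward model: non-ideal analog anti-alias filter as a one-pole lowpass.
def lowpass(x, a):
    y, prev = [], 0.0
    for v in x:
        prev = a * v + (1 - a) * prev
        y.append(prev)
    return y

# Compensating digital filter: the exact inverse of the lowpass above.
def compensate(y, a):
    x, prev = [], 0.0
    for v in y:
        x.append((v - (1 - a) * prev) / a)
        prev = v
    return x

x = [0.0, 1.0, 1.0, 0.5, 0.0]
restored = compensate(lowpass(x, a=0.3), a=0.3)
print([round(v, 6) for v in restored])  # recovers x
```

In the patented system the analog filter is only approximately invertible, so the digital stage is designed to bring the overall response close to, rather than exactly equal to, the ideal.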

  14. Microcomputer data acquisition and control.

    Science.gov (United States)

    East, T D

    1986-01-01

In medicine and biology there are many tasks that involve routine, well-defined procedures. These tasks are ideal candidates for computerized data acquisition and control. As the performance of microcomputers rapidly increases and cost continues to go down, the temptation to automate the laboratory becomes great. To the novice computer user the choices of hardware and software are overwhelming, and sadly most computer salespeople are not at all familiar with real-time applications. If you want to bill your patients you have hundreds of packaged systems to choose from; however, if you want to do real-time data acquisition the choices are very limited and confusing. The purpose of this chapter is to provide the novice computer user with the basics needed to set up a real-time data acquisition system with common microcomputers. This chapter covers the following issues necessary to establish a real-time data acquisition and control system: Analysis of the research problem: definition of the problem; description of data and sampling requirements; cost/benefit analysis. Choice of microcomputer hardware and software: choice of microprocessor and bus structure; choice of operating system; choice of layered software. Digital data acquisition: parallel data transmission; serial data transmission; hardware and software available. Analog data acquisition: description of amplitude and frequency characteristics of the input signals; sampling theorem; specification of the analog-to-digital converter; hardware and software available; interface to the microcomputer. Microcomputer control: analog output; digital output; closed-loop control. Microcomputer data acquisition and control in the 21st century: what is in the future? High-speed digital medical equipment networks; medical decision making and artificial intelligence.
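The sampling-theorem considerations the chapter lists can be illustrated by computing the apparent frequency of an undersampled signal: sampling folds any frequency above fs/2 back into the baseband. The frequencies below are illustrative.

```python
# Frequency observed after sampling at fs, folded into [0, fs/2] (aliasing).
def alias_frequency(f_signal, fs):
    f = f_signal % fs
    return min(f, fs - f)

print(alias_frequency(60.0, 100.0))   # 40.0 : 60 Hz aliases to 40 Hz at 100 S/s
print(alias_frequency(30.0, 100.0))   # 30.0 : below Nyquist, unchanged
```

This is why the ADC specification and the input signal's frequency content have to be considered together, as the chapter outline suggests.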

  15. Microcomputer-based systems for automatic control of sample irradiation and chemical analysis of short-lived isotopes

    International Nuclear Information System (INIS)

    Bourret, S.C.

    1974-01-01

    Two systems resulted from the need to study the nuclear decay of short-lived radionuclides. Automation was required for better repeatability, for speed of chemical separation after irradiation, and for protection from the high radiation fields of the samples. An MCS-8 computer was used as the nucleus of the automatic sample irradiation system because the control system required an extensive multiple-sequential circuit; this approach reduced the sequential problem to a computer program. The automatic chemistry control system is a mixture of a fixed and a computer-based programmable control system. The fixed control receives the irradiated liquid sample from the reactor, extracts the liquid and disposes of the used sample container. The programmable control executes the chemistry program that the user has entered through the teletype. (U.S.)

  16. A separation theorem for the stochastic sampled-data LQG problem. [control of continuous linear plant disturbed by white noise

    Science.gov (United States)

    Halyo, N.; Caglayan, A. K.

    1976-01-01

    This paper considers the control of a continuous linear plant disturbed by white plant noise when the control is constrained to be a piecewise constant function of time; i.e. a stochastic sampled-data system. The cost function is the integral of quadratic error terms in the state and control, thus penalizing errors at every instant of time while the plant noise disturbs the system continuously. The problem is solved by reducing the constrained continuous problem to an unconstrained discrete one. It is shown that the separation principle for estimation and control still holds for this problem when the plant disturbance and measurement noise are Gaussian.
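The reduction of the constrained continuous problem to an unconstrained discrete one rests on zero-order-hold discretization of the plant. As a minimal illustration (a scalar plant, not the paper's general matrix case), the exact discrete-time model under piecewise-constant control is:

```python
import math

# Sketch (not the paper's derivation) of reducing a continuous scalar
# plant  dx/dt = a*x + b*u  under zero-order-hold (piecewise-constant)
# control to the exact discrete-time model  x[k+1] = Ad*x[k] + Bd*u[k].

def zoh_discretize(a: float, b: float, T: float):
    """Exact ZOH discretization of a scalar linear plant over period T:
    Ad = e^{aT},  Bd = b * (e^{aT} - 1) / a   (which reduces to b*T
    when a == 0)."""
    Ad = math.exp(a * T)
    Bd = b * T if a == 0 else b * (Ad - 1.0) / a
    return Ad, Bd

Ad, Bd = zoh_discretize(a=-1.0, b=2.0, T=0.1)
# Ad = e^{-0.1}; Bd = 2 * (e^{-0.1} - 1) / (-1)
```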

  17. The BaBar Data Reconstruction Control System

    International Nuclear Information System (INIS)

    Ceseracciu, A

    2005-01-01

    The BaBar experiment is characterized by extremely high luminosity and a very large volume of data produced and stored, with increasing computing requirements each year. To fulfill these requirements a Control System has been designed and developed for the offline distributed data reconstruction system. The control system described in this paper provides the performance and flexibility needed to manage a large number of small computing farms, and takes full advantage of OO design. The infrastructure is well isolated from the processing layer; it is generic and flexible, based on a light framework providing message passing and cooperative multitasking. The system is distributed in a hierarchical way: the top-level system is organized in farms, farms in services, and services in subservices or code modules. It provides a powerful Finite State Machine framework to describe custom processing models in a simple regular language. This paper describes the design and evolution of this control system, currently in use at SLAC and Padova on ∼450 CPUs organized in 9 farms

  18. Integrating a sampling oscilloscope card and spectroscopy ADCs in a data acquisition system

    CERN Document Server

    Maartensson, L

    2001-01-01

    A high-rate sampling oscilloscope card has been integrated into an existing data acquisition system for spectroscopy ADCs. Experiments where pulse-shape analyses are important have then been made possible. Good performance characteristics of the integrated system have been achieved. Spectroscopy ADC data together with pulse-shape data sampled 512 times at 100 MHz are saved to hard disk at event rates up to about 1 kHz with low dead time losses.

  19. Recent developments of the RFX control and data acquisition system

    International Nuclear Information System (INIS)

    Barana, O.; Luchetta, A.; Manduchi, G.; Taliercio, C.

    2004-01-01

    Although the new RFX machine is still under modification, most power supply systems have been used since early 2003 for testing an ITER high-power by-pass switch. This has given us the opportunity to verify the effectiveness of several choices made in the development of the new data acquisition and control system of RFX. The system has been renewed in both its control and data acquisition components. For control, the new system employs Simatic S7 PLCs and a commercial Supervisory Control and Data Acquisition (SCADA) tool. Many improvements have been made to the MDSplus-based data acquisition system. The whole system has been ported from OpenVMS to Linux, using a server for data storage and CAMAC data acquisition, and a set of CompactPCI crates, each hosting a Linux PC board. Device-specific code is now entirely implemented in TDI, the scripting language of MDSplus. Our experience with the new system has been positive, especially for the data acquisition system

  20. An advanced computer-controlled automatic alpha-beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.; Bruinekool, D.J.; Stapleton, E.E.

    1984-01-01

    An improved computer-controlled automatic alpha-beta air sample counter was developed, based upon an earlier automatic air sample counter design. The system consists of an automatic sample changer, an electronic counting system utilizing a large silicon diode detector, a small desk-type microcomputer, a high-speed matrix printer and the necessary data interfaces. The system is operated by commands from the keyboard and programs stored on magnetic tape cassettes. The programs provide for background counting, a chi-squared test, radon subtraction and sample counting for sample periods of one day to one week. Output data are printed by the matrix printer on standard multifold paper. The data output includes gross beta, gross alpha and plutonium results. Data are automatically corrected for background, counter efficiency and, in the gross alpha and plutonium channels, for the presence of radon

  1. Adaptive sampling rate control for networked systems based on statistical characteristics of packet disordering.

    Science.gov (United States)

    Li, Jin-Na; Er, Meng-Joo; Tan, Yen-Kheng; Yu, Hai-Bin; Zeng, Peng

    2015-09-01

    This paper investigates an adaptive sampling rate control scheme for networked control systems (NCSs) subject to packet disordering. The main objectives of the proposed scheme are (a) to avoid heavy packet disordering existing in communication networks and (b) to stabilize NCSs with packet disordering, transmission delay and packet loss. First, a novel sampling rate control algorithm based on statistical characteristics of disordering entropy is proposed; secondly, an augmented closed-loop NCS that consists of a plant, a sampler and a state-feedback controller is transformed into an uncertain and stochastic system, which facilitates the controller design. Then, a sufficient condition for stochastic stability in terms of Linear Matrix Inequalities (LMIs) is given. Moreover, an adaptive tracking controller is designed such that the sampling period tracks a desired sampling period, which represents a significant contribution. Finally, experimental results are given to illustrate the effectiveness and advantages of the proposed scheme. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  2. Data-Driven Based Asynchronous Motor Control for Printing Servo Systems

    Science.gov (United States)

    Bian, Min; Guo, Qingyun

    Modern digital printing equipment aims at an environmentally friendly industry with high dynamic performance, high control precision, and low vibration and abrasion. A high-performance motion control system for printing servo systems is therefore required. A control system for asynchronous motors based on data acquisition is proposed, and an iterative learning control (ILC) algorithm is studied. PID control is widely used in motion control; however, it is sensitive to disturbances and to variation of model parameters. ILC applies the historical error data and present control signals to approximate the control signal directly, in order to fully track the expected trajectory without knowledge of the system model and structure. A motor control algorithm based on ILC and PID is constructed and simulation results are given. The results show that the data-driven control method is effective in dealing with bounded disturbances in the motion control of printing servo systems.
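The ILC idea described here, using past error data to update the control signal without a plant model, can be sketched for a simple static plant. The P-type update law and the learning gain below are illustrative assumptions, not the paper's algorithm:

```python
# Minimal P-type iterative learning control (ILC) sketch: after each
# trial, the control signal is corrected by the learning gain times the
# tracking error, with no plant model required. The plant and gain are
# toy assumptions for illustration.

def run_ilc(plant, ref, trials=20, gain=0.5):
    """Repeat trials on a static plant y[t] = plant(u[t]), learning the
    control signal that tracks ref via u <- u + gain * (ref - y)."""
    u = [0.0] * len(ref)
    for _ in range(trials):
        y = [plant(ui) for ui in u]
        e = [r - yi for r, yi in zip(ref, y)]
        u = [ui + gain * ei for ui, ei in zip(u, e)]
    return u

# For a static plant y = 2u and a constant reference of 1, the ILC
# iteration converges to u = 0.5 without ever using the gain "2".
u = run_ilc(lambda x: 2.0 * x, [1.0] * 5, trials=50, gain=0.4)
```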

  3. Observer-based output feedback control of networked control systems with non-uniform sampling and time-varying delay

    Science.gov (United States)

    Meng, Su; Chen, Jie; Sun, Jian

    2017-10-01

    This paper investigates the problem of observer-based output feedback control for networked control systems with non-uniform sampling and time-varying transmission delay. The sampling intervals are assumed to vary within a given interval. The transmission delay belongs to a known interval. A discrete-time model is first established, which contains time-varying delay and norm-bounded uncertainties coming from non-uniform sampling intervals. It is then converted to an interconnection of two subsystems in which the forward channel is delay-free. The scaled small gain theorem is used to derive the stability condition for the closed-loop system. Moreover, the observer-based output feedback controller design method is proposed by utilising a modified cone complementary linearisation algorithm. Finally, numerical examples illustrate the validity and superiority of the proposed method.
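The observer-based output-feedback structure analyzed here can be sketched in discrete time with a scalar Luenberger observer. The numerical gains below are assumed for illustration only; in the paper they would come from the LMI-based design:

```python
# Sketch of discrete-time observer-based output feedback: the
# controller sees only the output y and runs the observer
#   xhat+ = A*xhat + B*u + L*(y - C*xhat),   u = K*xhat.
# Scalar example with assumed (not designed) gains.

def step(x, xhat, A, B, C, K, L):
    u = K * xhat
    y = C * x
    x_next = A * x + B * u
    xhat_next = A * xhat + B * u + L * (y - C * xhat)
    return x_next, xhat_next

x, xhat = 1.0, 0.0
A, B, C = 1.2, 1.0, 1.0   # open loop unstable (|A| > 1)
K, L = -0.7, 0.9          # assumed gains: A+B*K = 0.5, A-L*C = 0.3
for _ in range(60):
    x, xhat = step(x, xhat, A, B, C, K, L)
# Both the closed loop and the estimation error decay to zero.
```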

  4. Microcomputer-controlled ultrasonic data acquisition system

    International Nuclear Information System (INIS)

    Simpson, W.A. Jr.

    1978-11-01

    The large volume of ultrasonic data generated by computer-aided test procedures has necessitated the development of a mobile, high-speed data acquisition and storage system. This approach offers the decided advantage of on-site data collection and remote data processing. It also utilizes standard, commercially available ultrasonic instrumentation. This system is controlled by an Intel 8080A microprocessor. The MCS80-SDK microcomputer board was chosen, and magnetic tape is used as the storage medium. A detailed description is provided of both the hardware and software developed to interface the magnetic tape storage subsystem to Biomation 8100 and Biomation 805 waveform recorders. A boxcar integrator acquisition system is also described for use when signal averaging becomes necessary. Both assembly language and machine language listings are provided for the software

  5. Analysis of a control and data acquisition system for radiation protection monitors of spent fuel reprocessing plant

    International Nuclear Information System (INIS)

    Liu Boxue

    1997-01-01

    For the radiation protection monitoring of a spent nuclear fuel reprocessing plant, the paper analyzes the composition and requirements of a control and data acquisition system. Following typical distributed, open-architecture models, the hardware consists of an IPC, RS-485 bus communication and a data multiplexer. The software consists of a real-time multi-service operating system and a modelling program. The system can sample monitoring data, control monitor operation, and process data and other information. It offers good expandability and compatibility

  6. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
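The role of Edgeworth expansions can be illustrated with the standard one-term tail correction for a standardized mean (a textbook formula, not the authors' full method): with skewness γ and sample size n, P(T > x) ≈ 1 − Φ(x) + φ(x)·γ·(x² − 1)/(6√n), so for skewed data the true tail probability, and hence the actual false-positive rate, exceeds the nominal normal tail.

```python
import math

# Standard one-term Edgeworth tail correction for the standardized
# mean of n iid observations with skewness `skew` (textbook formula
# used here only to illustrate the abstract's point).

def normal_tail(x):
    """Upper tail 1 - Phi(x) of the standard normal distribution."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def edgeworth_tail(x, skew, n):
    """P(T > x) ~ 1 - Phi(x) + phi(x) * skew * (x^2 - 1) / (6 sqrt(n))."""
    phi = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    return normal_tail(x) + phi * skew * (x * x - 1.0) / (6.0 * math.sqrt(n))

# With skew = 2 and n = 10, the corrected tail at x = 3 exceeds the
# nominal normal tail: the true false-positive rate is understated.
nominal = normal_tail(3.0)
actual = edgeworth_tail(3.0, skew=2.0, n=10)
```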

  7. Data Acquisition for Modular Biometric Monitoring System

    Science.gov (United States)

    Chmiel, Alan J. (Inventor); Humphreys, Bradley T. (Inventor); Grodsinsky, Carlos M. (Inventor)

    2014-01-01

    A modular system for acquiring biometric data includes a plurality of data acquisition modules configured to sample biometric data from at least one respective input channel at a data acquisition rate. A representation of the sampled biometric data is stored in memory of each of the plurality of data acquisition modules. A central control system is in communication with each of the plurality of data acquisition modules through a bus. The central control system is configured to collect data asynchronously, via the bus, from the memory of the plurality of data acquisition modules according to a relative fullness of the memory of the plurality of data acquisition modules.
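The fullness-driven collection policy described above can be sketched as follows; the class and method names are invented for illustration and are not from the patent:

```python
# Sketch of collecting asynchronously according to relative buffer
# fullness: the central controller drains whichever acquisition
# module's memory is proportionally fullest, rather than polling
# modules round-robin.

class Module:
    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []

    def sample(self, value):
        self.buffer.append(value)

    def fullness(self):
        return len(self.buffer) / self.capacity

def collect_next(modules):
    """Pick the module with the greatest relative fullness and drain it."""
    target = max(modules, key=lambda m: m.fullness())
    data, target.buffer = target.buffer, []
    return data

mods = [Module(100), Module(10)]
for i in range(8):
    mods[0].sample(i)   # 8/100 full
    mods[1].sample(i)   # 8/10 full -> drained first
drained = collect_next(mods)
```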

  8. Computer graphics for quality control in the INAA of geological samples

    International Nuclear Information System (INIS)

    Grossman, J.N.; Baedecker, P.A.

    1987-01-01

    A data reduction system for the routine instrumental activation analysis of samples is described, with particular emphasis on interactive graphics capabilities for evaluating analytical quality. Graphics procedures were developed to interactively control the analysis of selected photopeaks during spectral analysis, and to evaluate detector performance during a given counting cycle. Graphics algorithms are also used to compare the data on reference samples with accepted values, to prepare quality control charts to evaluate long term precision and to search for systematic variations in data on reference samples as a function of time. (author)

  9. A distributed timing system for synchronizing control and data correlation

    International Nuclear Information System (INIS)

    Stettler, M.; Thout, M.; Dalesio, L.R.; Cole, R.; Fite, C.; Slentz, G.; Warren, D.

    1992-01-01

    Synchronization is necessary in experimental physics machines to provide positive control over related events. The Ground Test Accelerator (GTA) timing system provides this function through a distributed control system, known as the Experimental Physics and Industrial Control System (EPICS). The EPICS timing system was designed to take advantage of a distributed architecture, and provides time stamping for synchronous data correlation as well as event control. The system has been successfully demonstrated on over a dozen controller nodes for operation and data analysis. The design of the hardware, software, and operational results are discussed

  10. Review of Supervisory Control and Data Acquisition (SCADA) Systems

    Energy Technology Data Exchange (ETDEWEB)

    Reva Nickelson; Briam Johnson; Ken Barnes

    2004-01-01

    A review using open source information was performed to obtain data related to Supervisory Control and Data Acquisition (SCADA) systems used to supervise and control domestic electric power generation, transmission, and distribution. This report provides the technical details for the types of systems used, system disposal, cyber and physical security measures, network connections, and a gap analysis of SCADA security holes.

  11. Using joined minicomputer-microcomputer systems for intricate sample and data manipulations

    International Nuclear Information System (INIS)

    Meng, J.D.

    1980-09-01

    We have produced, over the past three years, three automated x-ray fluorescence based elemental analysis systems, that combine a minicomputer and a microcomputer to perform intricate sample and data manipulations. The mini-micro combination facilitates the reuse of sizable sections of hardware and programs for different x-ray analysis projects. Each of our systems has been a step closer to an optimum general solution. The combination reaps economic benefits throughout development, fabrication and maintenance, an important consideration for designers of custom-built, one-of-a-kind data analysis systems such as these

  12. Control System for Radioactive Contamination in Food Samples in Poland

    International Nuclear Information System (INIS)

    Grabowski, D.; Kurowski, W.; Muszynski, W.; Rubel, B.; Smagala, G.; Swietochowska, J.

    2001-01-01

    Full text: The analyses of the level of radioactive contamination in food samples are carried out by the Service for Measurements of Radioactive Contamination (SMRC) in Poland. The Service was brought into existence in 1961. It comprises a network of measurement stations and the Centre of Radioactive Contamination Measurements (CRCM); the duties of the Centre are carried out by the Central Laboratory for Radiological Protection (CLRP). Uniform methods of sampling are used in the measurement stations. All important foodstuffs (milk, meat, vegetables, fruit and cereals) are controlled in the Service stations. Radiochemical and spectrometric methods are used to determine the activity of radioactive isotopes. The standard equipment of a measurement station is the SAPOS-90 measurement system and a multichannel analyser with a scintillation or germanium detector. The structure of the Service, the kinds of samples tested by each station, and the sampling programs for normal and accident situations are presented in this paper. (author)

  13. DABASCO Experiment Data Acquisition and Control System

    International Nuclear Information System (INIS)

    Alberdi Primicia, J.; Artigao Arteaga, A.; Barcala Rieveira, J. M.; Oller Gonzalez, J. C.

    2000-01-01

    The DABASCO experiment aims to study the thermohydraulic phenomena produced in the containment area during a severe accident at a nuclear power facility. This document describes the characteristics of the data acquisition and control system used in the experiment. The main elements of the system were a data acquisition board, PCI-MIO-16E-4, and an application written in LabVIEW. (Author) 5 refs

  14. A distributed timing system for synchronizing control and data correlation

    International Nuclear Information System (INIS)

    Stettler, M.; Thuot, M.; Dalesio, L.R.; Cole, R.; Fite, C.; Slentz, G.; Warren, D.

    1992-01-01

    Synchronization is necessary in experimental physics machines to provide positive control over related events. The Ground Test Accelerator (GTA) timing system provides this function through a distributed control system, known as the Experimental Physics and Industrial Control System (EPICS). The EPICS timing system was designed to take advantage of a distributed architecture, and provides time stamping for synchronous data correlation as well as event control. The system has been successfully demonstrated on over a dozen controller nodes for operation and data analysis. The design of the hardware, software, and operational results are discussed. (author). 2 refs., 4 figs

  15. Revitalisation of Control and Data Acquisition Systems for Corrosion Test Loop

    International Nuclear Information System (INIS)

    Khairul Handono; Kiswanta; Edy Sumarno

    2008-01-01

    The replacement of the control and data acquisition systems for the Corrosion Test Loop (CTL) has been carried out. The aim of the CTL revitalisation is to improve the performance of the PLC-based Kent 4000 controller system. The data acquisition system, in turn, was revitalised to build a computer-based data retrieval system for the measured parameters in the thermalhydraulic experiments of the CTL. Previously, the data collection system used analog indicator recorders and data were recorded manually, causing very slow response and less accurate results. To improve the quality of the data collection system, a data acquisition system was developed with a Visual Basic application program and a data acquisition card. The result of the CTL revitalisation is a PLC-based control system and a data acquisition system capable of presenting temperature, pressure and cooling water level information interactively, i.e. easy to read, fast, real-time and accurate. This improves the performance of the control and data acquisition systems; acquired data are stored on hard disk as files and can be further processed into tables or graphs to facilitate analysis. (author)

  16. Effects of data sampling rate on image quality in fan-beam-CT system

    International Nuclear Information System (INIS)

    Iwata, Akira; Yamagishi, Nobutoshi; Suzumura, Nobuo; Horiba, Isao.

    1984-01-01

    Investigation was made into the relationship between spatial resolution or artifacts and data sampling rate in order to pursue the causes of the degradation of CT image quality by computer simulation. First the generation of projection data and the reconstruction calculating process are described, and then results are shown on the relation between angular sampling interval and spatial resolution or artifacts, and on the relation between projection data sampling interval and spatial resolution or artifacts. It was clarified that the formulation of the relationship between spatial resolution and data sampling rate performed so far for parallel X-ray beams can also be applied to fan beams. As a conclusion, when other reconstruction parameters are the same in fan beam CT systems, spatial resolution is determined by the projection data sampling rate rather than the angular sampling rate. The mechanism of artifact generation due to an insufficient number of angular samples was made clear. It was also made clear that there is a definite relationship among the measuring region, angular sampling rate and projection data sampling rate, and that the amount of artifacts depending upon the projection data sampling rate is proportional to the amount of spatial frequency components (aliasing components) of a test object above the Nyquist frequency of the projection data. (Wakatsuki, Y.)
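The aliasing mechanism invoked in the conclusion follows the standard frequency-folding relation: a component above the Nyquist frequency of the projection sampling reappears folded into the baseband. A minimal sketch:

```python
# Standard frequency-folding (aliasing) relation: after sampling at
# f_sample, a component at f_signal appears folded into the baseband
# [0, f_sample/2]. Values below are illustrative, not from the paper.

def aliased_frequency(f_signal, f_sample):
    """Apparent frequency of f_signal after sampling at f_sample."""
    f = f_signal % f_sample
    return f if f <= f_sample / 2 else f_sample - f

# A 7 cycles/mm component sampled at 10 samples/mm (Nyquist = 5)
# aliases down to 3 cycles/mm:
fa = aliased_frequency(7.0, 10.0)   # 3.0
```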

  17. The Nuclotron internal target control and data acquisition system

    Energy Technology Data Exchange (ETDEWEB)

    Isupov, A.Yu., E-mail: isupov@moonhe.jinr.ru [Joint Institute for Nuclear Research, Dubna 141980 (Russian Federation); Krasnov, V.A.; Ladygin, V.P.; Piyadin, S.M.; Reznikov, S.G. [Joint Institute for Nuclear Research, Dubna 141980 (Russian Federation)

    2013-01-11

    The new control system of the Nuclotron (JINR, Dubna) internal target is described in both hardware and software aspects. The CAMAC hardware is based on the use of the standard CAMAC modules developed and manufactured at JINR. The internal target control and data acquisition (IntTarg CDAQ) system software is implemented using the ngdp framework under the Unix-like operating system (OS) FreeBSD to allow easy network distribution of the online data collected from internal target and accompanying detectors, as well as the internal target remote control.

  18. Schedulability analysis for systems with data and control dependencies

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2000-01-01

    In this paper we present an approach to schedulability analysis for hard real-time systems with control and data dependencies. We consider distributed architectures consisting of multiple programmable processors, and the scheduling policy is based on a static priority preemptive strategy. Our model of the system captures both data and control dependencies, and the schedulability approach is able to reduce the pessimism of the analysis by using the knowledge about control and data dependencies. Extensive experiments as well as a real-life example demonstrate the efficiency of our approach.

  19. Computer graphics for quality control in the INAA of geological samples

    Science.gov (United States)

    Grossman, J.N.; Baedecker, P.A.

    1987-01-01

    A data reduction system for the routine instrumental activation analysis of samples is described, with particular emphasis on interactive graphics capabilities for evaluating analytical quality. Graphics procedures have been developed to interactively control the analysis of selected photopeaks during spectral analysis, and to evaluate detector performance during a given counting cycle. Graphics algorithms are also used to compare the data on reference samples with accepted values, to prepare quality control charts to evaluate long term precision and to search for systematic variations in data on reference samples as a function of time. © 1987 Akadémiai Kiadó.

  20. Gene expression data from acetaminophen-induced toxicity in human hepatic in vitro systems and clinical liver samples

    Directory of Open Access Journals (Sweden)

    Robim M. Rodrigues

    2016-06-01

    Full Text Available This data set is composed of transcriptomics analyses of (i) liver samples from patients suffering from acetaminophen-induced acute liver failure (ALF) and (ii) hepatic cell systems exposed to acetaminophen, and their respective controls. The in vitro systems include widely employed cell lines, i.e. HepaRG and HepG2 cells, as well as a novel stem cell-derived model, i.e. human skin-precursors-derived hepatocyte-like cells (hSKP-HPC). Data from primary human hepatocytes were also added to the data set from "Open TG-GATEs: a large-scale toxicogenomics database" (Igarashi et al., 2015) [1]. Changes in gene expression due to acetaminophen intoxication as well as comparative information between human in vivo and in vitro samples are provided. The microarray data have been deposited in NCBI's Gene Expression Omnibus and are accessible through GEO Series accession number GEO: GSE74000. The provided data are used to evaluate the predictive capacity of each hepatic in vitro system and can be directly compared with large-scale publicly available toxicogenomics databases. Further interpretation and discussion of these data feature in the corresponding research article "Toxicogenomics-based prediction of acetaminophen-induced liver injury using human hepatic cell systems" (Rodrigues et al., 2016) [2].

  1. The WAIS Melt Monitor: An automated ice core melting system for meltwater sample handling and the collection of high resolution microparticle size distribution data

    Science.gov (United States)

    Breton, D. J.; Koffman, B. G.; Kreutz, K. J.; Hamilton, G. S.

    2010-12-01

    Paleoclimate data are often extracted from ice cores by careful geochemical analysis of meltwater samples. The analysis of the microparticles found in ice cores can also yield unique clues about atmospheric dust loading and transport, dust provenance and past environmental conditions. Determination of microparticle concentration, size distribution and chemical makeup as a function of depth is especially difficult because the particle size measurement either consumes or contaminates the meltwater, preventing further geochemical analysis. Here we describe a microcontroller-based ice core melting system which allows the collection of separate microparticle and chemistry samples from the same depth intervals in the ice core, while logging and accurately depth-tagging real-time electrical conductivity and particle size distribution data. This system was designed specifically to support microparticle analysis of the WAIS Divide WDC06A deep ice core, but many of the subsystems are applicable to more general ice core melting operations. Major system components include: a rotary encoder to measure ice core melt displacement with 0.1 millimeter accuracy; a meltwater tracking system to assign core depths to conductivity, particle and sample vial data; an optical debubbler level control system to protect the Abakus laser particle counter from damage due to air bubbles; a Rabbit 3700 microcontroller which communicates with a host PC, collects encoder and optical sensor data, and autonomously operates Gilson peristaltic pumps and fraction collectors to provide automatic sample handling; melt monitor control software operating on a standard PC, allowing the user to control and view the status of the system; and data logging software operating on the same PC to collect data from the melting, electrical conductivity and microparticle measurement systems. Because microparticle samples can easily be contaminated, we use optical air bubble sensors and high resolution ice core density

  2. ACS sampling system: design, implementation, and performance evaluation

    Science.gov (United States)

    Di Marcantonio, Paolo; Cirami, Roberto; Chiozzi, Gianluca

    2004-09-01

    By means of the ACS (ALMA Common Software) framework we designed and implemented a sampling system which allows sampling of every Characteristic Component Property with a specific, user-defined, sustained frequency limited only by the hardware. Collected data are sent to various clients (one or more Java plotting widgets, a dedicated GUI or a COTS application) using the ACS/CORBA Notification Channel. The data transport is optimized: samples are cached locally and sent in packets with a lower, user-defined frequency to keep the network load under control. Simultaneous sampling of the Properties of different Components is also possible. Together with the design and implementation issues, we present the performance of the sampling system evaluated on two different platforms: on a VME-based system using the VxWorks RTOS (currently adopted by ALMA) and on a PC/104+ embedded platform using the Red Hat 9 Linux operating system. The PC/104+ solution offers, as an alternative, a low-cost PC-compatible hardware environment with a free and open operating system.
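The described transport optimization, caching samples locally and shipping them at a lower packet rate, can be sketched as follows (the class and method names are illustrative, not the ACS API):

```python
# Sketch of the cache-and-batch transport policy: samples arrive at a
# high sustained rate, but network I/O happens only once per packet,
# keeping network load bounded and user-tunable.

class SampledProperty:
    def __init__(self, packet_size, sink):
        self.packet_size = packet_size   # samples per network packet
        self.cache = []
        self.sink = sink                 # callable receiving each packet

    def on_sample(self, value):
        """Called at the fast sampling rate; the sink (network send)
        fires only when a full packet has accumulated."""
        self.cache.append(value)
        if len(self.cache) >= self.packet_size:
            self.sink(self.cache)
            self.cache = []

packets = []
prop = SampledProperty(packet_size=4, sink=packets.append)
for v in range(10):
    prop.on_sample(v)
# Two full packets were shipped; the last two samples remain cached.
```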

  3. The New BaBar Data Reconstruction Control System

    International Nuclear Information System (INIS)

    Ceseracciu, Antonio

    2003-01-01

    The BaBar experiment is characterized by extremely high luminosity, a complex detector, and a huge data volume, with increasing requirements each year. To fulfill these requirements a new control system has been designed and developed for the offline data reconstruction system. The new control system described in this paper provides the performance and flexibility needed to manage a large number of small computing farms, and takes full benefit of OO design. The infrastructure is well isolated from the processing layer, it is generic and flexible, based on a light framework providing message passing and cooperative multitasking. The system is actively distributed, enforces the separation between different processing tiers by using different naming domains, and glues them together by dedicated brokers. It provides a powerful Finite State Machine framework to describe custom processing models in a simple regular language. This paper describes this new control system, currently in use at SLAC and Padova on ∼450 CPUs organized in 12 farms

  4. Synchronization of a Class of Memristive Stochastic Bidirectional Associative Memory Neural Networks with Mixed Time-Varying Delays via Sampled-Data Control

    Directory of Open Access Journals (Sweden)

    Manman Yuan

    2018-01-01

    Full Text Available The paper addresses the issue of synchronization of memristive bidirectional associative memory neural networks (MBAMNNs) with mixed time-varying delays and stochastic perturbation via a sampled-data controller. First, we propose a new model of MBAMNNs with mixed time-varying delays. In the proposed approach, the mixed delays include time-varying distributed delays and discrete delays. Second, we design a new method of sampled-data control for the stochastic MBAMNNs. Traditional control methods lack the capability of reflecting variable synaptic weights. In this paper, the methods are carefully designed to ensure that the synchronization processes are suitable for the features of the memristor. Third, sufficient criteria guaranteeing the synchronization of the systems are derived based on the drive-response concept. Finally, the effectiveness of the proposed mechanism is validated with numerical experiments.

  5. A nuclear data acquisition system flow control model

    International Nuclear Information System (INIS)

    Hack, S.N.

    1988-01-01

    A general Petri Net representation of a nuclear data acquisition system model is presented. This model provides for the unique requirements of a nuclear data acquisition system, including the capabilities of concurrently acquiring asynchronous and synchronous data, of providing multiple priority levels of flow control arbitration, and of permitting multiple input sources to reside at the same priority without the problem of channel lockout caused by a high-rate data source. Finally, a previously implemented gamma camera/physiological signal data acquisition system is described using the models presented.

  6. Design of Data Acquisition and Control System for Indian Test Facility of Diagnostics Neutral Beam

    International Nuclear Information System (INIS)

    Soni, Jignesh; Tyagi, Himanshu; Yadav, Ratnakar; Rotti, Chandramouli; Bandyopadhyay, Mainak; Bansal, Gourab; Gahluat, Agrajit; Sudhir, Dass; Joshi, Jaydeep; Prasad, Rambilas; Pandya, Kaushal; Shah, Sejal; Parmar, Deepak; Chakraborty, Arun

    2015-01-01

    Highlights: • More than 900 channels Data Acquisition and Control System. • INTF DACS has been designed based on ITER-PCDH guidelines. • Separate Interlock and Safety system designed based on IEC 61508 standard. • Hardware selected from ITER slow controller and fast controller catalog. • Software framework based on ITER CODAC Core System and LabVIEW software. - Abstract: The Indian Test Facility (INTF) is a negative hydrogen ion based 100 kV, 60 A, 5 Hz modulated NBI system with a 3 s ON/20 s OFF duty cycle. The prime objective of the facility is to install a full-scale test bed for the qualification of all Diagnostic Neutral Beam (DNB) parameters prior to installation in ITER. The automated and safe operation of the INTF will require a reliable and rugged instrumentation and control system which provides control, data acquisition (DAQ), interlock and safety functions, referred to as the INTF-DACS. The INTF-DACS has been designed based on the ITER CODAC architecture and ITER-PCDH guidelines, since the technical understanding of CODAC technology gained from this will later be helpful in the development of the plant system I&C for the DNB. For complete operation of the INTF, approximately 900 signals have to be supervised by the DACS. The conventional control loops in the INTF require loop times in the range of 5–100 ms, and the DAQ, except for high-end diagnostics, requires sampling rates in the range of 5 samples per second (Sps) to 10 kSps; to fulfill these requirements, hardware components have been selected from the ITER slow and fast controller catalogs. High-end diagnostics require sampling rates up to 100 MSps, normally in case of certain events, so event- and burst-based DAQ hardware has been finalized. The combined use of CODAC Core System (CCS) and NI LabVIEW has been finalized because the full required DAQ support is not available in the present version of CCS. Interlock system for investment protection of facility and Safety system for

  7. Design of Data Acquisition and Control System for Indian Test Facility of Diagnostics Neutral Beam

    Energy Technology Data Exchange (ETDEWEB)

    Soni, Jignesh, E-mail: jsoni@ipr.res.in [Institute for Plasma Research, Bhat, Gandhinagar 382 428, Gujarat (India); Tyagi, Himanshu; Yadav, Ratnakar; Rotti, Chandramouli; Bandyopadhyay, Mainak [ITER-India, Institute for Plasma Research, Gandhinagar 380 025, Gujarat (India); Bansal, Gourab; Gahluat, Agrajit [Institute for Plasma Research, Bhat, Gandhinagar 382 428, Gujarat (India); Sudhir, Dass; Joshi, Jaydeep; Prasad, Rambilas [ITER-India, Institute for Plasma Research, Gandhinagar 380 025, Gujarat (India); Pandya, Kaushal [Institute for Plasma Research, Bhat, Gandhinagar 382 428, Gujarat (India); Shah, Sejal; Parmar, Deepak [ITER-India, Institute for Plasma Research, Gandhinagar 380 025, Gujarat (India); Chakraborty, Arun [Institute for Plasma Research, Bhat, Gandhinagar 382 428, Gujarat (India)

    2015-10-15

    Highlights: • More than 900 channels Data Acquisition and Control System. • INTF DACS has been designed based on ITER-PCDH guidelines. • Separate Interlock and Safety system designed based on IEC 61508 standard. • Hardware selected from ITER slow controller and fast controller catalog. • Software framework based on ITER CODAC Core System and LabVIEW software. - Abstract: The Indian Test Facility (INTF) is a negative hydrogen ion based 100 kV, 60 A, 5 Hz modulated NBI system with a 3 s ON/20 s OFF duty cycle. The prime objective of the facility is to install a full-scale test bed for the qualification of all Diagnostic Neutral Beam (DNB) parameters prior to installation in ITER. The automated and safe operation of the INTF will require a reliable and rugged instrumentation and control system which provides control, data acquisition (DAQ), interlock and safety functions, referred to as the INTF-DACS. The INTF-DACS has been designed based on the ITER CODAC architecture and ITER-PCDH guidelines, since the technical understanding of CODAC technology gained from this will later be helpful in the development of the plant system I&C for the DNB. For complete operation of the INTF, approximately 900 signals have to be supervised by the DACS. The conventional control loops in the INTF require loop times in the range of 5–100 ms, and the DAQ, except for high-end diagnostics, requires sampling rates in the range of 5 samples per second (Sps) to 10 kSps; to fulfill these requirements, hardware components have been selected from the ITER slow and fast controller catalogs. High-end diagnostics require sampling rates up to 100 MSps, normally in case of certain events, so event- and burst-based DAQ hardware has been finalized. The combined use of CODAC Core System (CCS) and NI LabVIEW has been finalized because the full required DAQ support is not available in the present version of CCS. Interlock system for investment protection of facility and Safety system for

  8. Controller synthesis for negative imaginary systems: a data driven approach

    KAUST Repository

    Mabrok, Mohamed

    2016-02-17

    The negative imaginary (NI) property occurs in many important applications. For instance, flexible structure systems with collocated force actuators and position sensors can be modelled as negative imaginary systems. In this study, a data-driven controller synthesis methodology for NI systems is presented. In this approach, measured frequency response data of the plant is used to construct the controller frequency response at every frequency by minimising a cost function. Then, this controller response is used to identify the controller transfer function using system identification methods. © The Institution of Engineering and Technology 2016.
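
The point-by-point construction of a controller response from measured plant frequency-response data can be sketched as below. This is only an analogy to the approach in the abstract: the paper minimises a cost function, whereas this sketch solves a closed-loop matching condition exactly at each frequency, and the final system-identification step that fits a transfer function to the constructed samples is omitted. The plant, desired response, and frequencies are made up.

```python
def controller_response(P, T):
    """Given plant samples P(jw) and a desired closed-loop response T(jw),
    solve T = PC/(1+PC) for the controller C at each frequency point."""
    return [t / (p * (1.0 - t)) for p, t in zip(P, T)]

# Toy first-order plant P(s) = 1/(s+1) "measured" at a few frequencies,
# with a desired first-order closed loop T(s) = 1/(0.1s + 1).
freqs = [0.1, 1.0, 10.0]
P = [1.0 / (1j * w + 1.0) for w in freqs]
T = [1.0 / (0.1j * w + 1.0) for w in freqs]
C = controller_response(P, T)
```

For this toy case the construction recovers C(s) = 10(s+1)/s, i.e. a PI-like controller that cancels the plant pole, which is the kind of frequency-domain shaping the data-driven step produces before identification.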

  9. Computer-controlled data acquisition system for the ISX-B neutral injection system

    International Nuclear Information System (INIS)

    Edmonds, P.H.; Sherrill, B.; Pearce, J.W.

    1980-05-01

    A data acquisition system for the Impurity Study Experiment (ISX-B) neutral injection system at the Oak Ridge National Laboratory is presented. The system is based on CAMAC standards and is controlled by a MIK-11/2 microcomputer. The system operates at the ion source high voltage on the source table, transmitting the analyzed data to a terminal at ground potential. This reduces the complexity of the communications link and also allows much flexibility in the diagnostics and eventual control of the beam line.

  10. A DSP controlled data acquisition system for CELSIUS

    International Nuclear Information System (INIS)

    Bengtsson, M.; Lofnes, T.; Ziemann, V.

    2000-01-01

    We describe a data acquisition system based on two 10 MHz A/D-converters, a SHARC Digital Signal Processor (DSP), and a digital synthesizer used for triggering the A/D-converters. The temporal macrostructure of the data acquisition can be determined by external triggers or by timer interrupts from the DSP. In this way up to two million samples can be stored in DSP external memory. The samples are analyzed by fast Fourier transforming blocks of samples directly. In another mode we use software-based downmixing and filtering techniques to increase the resolution and zoom in on a small frequency band. Spectra of up to 5 MHz can be manipulated and displayed as waterfall plots or spectral maps on the host computer directly. Moreover, signals of up to 70 MHz can be analyzed by undersampling techniques. We use this system to analyze Schottky spectra from electron-cooled ion beams in CELSIUS and report drag rate measurements and observations of instabilities.
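
The software downmix-and-filter mode described above (often called a zoom FFT) can be sketched with NumPy as follows. The sample rates, tone, and block-averaging low-pass are illustrative assumptions, not the CELSIUS DSP code.

```python
import numpy as np

def zoom_fft(x, fs, f_lo, decim):
    """Software downmixing: shift the band around f_lo to baseband with a
    complex mix, low-pass by block averaging, decimate, and FFT the result
    to get a high-resolution spectrum of a narrow band."""
    t = np.arange(len(x)) / fs
    baseband = x * np.exp(-2j * np.pi * f_lo * t)        # complex mix to 0 Hz
    baseband = baseband[: len(baseband) // decim * decim]
    baseband = baseband.reshape(-1, decim).mean(axis=1)  # crude anti-alias + decimate
    spectrum = np.abs(np.fft.fft(baseband))
    freqs = np.fft.fftfreq(len(baseband), d=decim / fs)
    return freqs, spectrum

# A 101 kHz tone sampled at 1 MHz, zoomed around 100 kHz: the peak lands at +1 kHz
# with 10 Hz resolution, far finer than a plain FFT of the same block length.
fs = 1_000_000
t = np.arange(100_000) / fs
x = np.sin(2 * np.pi * 101_000 * t)
freqs, spec = zoom_fft(x, fs, f_lo=100_000, decim=100)
peak = freqs[np.argmax(spec)]
```

Decimating by 100 trades bandwidth for resolution, which is exactly the "zoom in on a small frequency band" behaviour the abstract describes.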

  11. A DSP controlled data acquisition system for CELSIUS

    CERN Document Server

    Bengtsson, M; Ziemann, Volker

    2000-01-01

    We describe a data acquisition system based on two 10 MHz A/D-converters, a SHARC Digital Signal Processor (DSP), and a digital synthesizer used for triggering the A/D-converters. The temporal macrostructure of the data acquisition can be determined by external triggers or by timer interrupts from the DSP. In this way up to two million samples can be stored in DSP external memory. The samples are analyzed by fast Fourier transforming blocks of samples directly. In another mode we use software-based downmixing and filtering techniques to increase the resolution and zoom in on a small frequency band. Spectra of up to 5 MHz can be manipulated and displayed as waterfall plots or spectral maps on the host computer directly. Moreover, signals of up to 70 MHz can be analyzed by undersampling techniques. We use this system to analyze Schottky spectra from electron-cooled ion beams in CELSIUS and report drag rate measurements and observations of instabilities.

  12. Building HVAC control knowledge data schema – Towards a unified representation of control system knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yan; Treado, Stephen J.; Messner, John I.

    2016-12-01

    Building control systems for Heating, Ventilation, and Air Conditioning (HVAC) play a key role in realizing the functionality and operation of building systems and components. Building Control Knowledge (BCK) is the logic and algorithms embedded throughout a building control system. Different methods exist to represent BCK; they differ in the selection of representing elements and in the format of those elements, and there is a lack of a standard data schema for storing, retrieving, and reusing structured BCK. In this study, a modular data schema is created for BCK representation. The data schema contains eleven representing elements, including control module name, operation mode, system schematic, control flow diagram, data point, alarm, parameter, control sequence, function, and programming code. Each element is defined with specific attributes. This data schema is evaluated through a case study demonstration. The demonstration shows a new way to represent BCK with standard formats.
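
A minimal sketch of how the modular BCK elements named in the abstract could be held as a structured record is shown below. The field names follow the elements listed above, but every attribute choice and value here is illustrative, not the paper's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class ControlModule:
    """One BCK record; fields mirror the representing elements in the abstract."""
    name: str                       # control module name
    operation_modes: list = field(default_factory=list)
    system_schematic: str = ""      # e.g. a drawing reference
    control_flow_diagram: str = ""
    data_points: list = field(default_factory=list)
    alarms: list = field(default_factory=list)
    parameters: dict = field(default_factory=dict)
    control_sequence: str = ""      # structured or natural-language sequence
    functions: list = field(default_factory=list)
    programming_code: str = ""      # vendor-specific implementation

# Hypothetical example record for an air-handling-unit control module.
ahu = ControlModule(
    name="AHU-1 discharge air temperature control",
    operation_modes=["occupied", "unoccupied"],
    parameters={"setpoint_degC": 13.0, "deadband_K": 0.5},
)
```

Storing each element with typed attributes is what makes the knowledge retrievable and reusable across projects, which is the schema's stated purpose.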

  13. Data-based control trajectory planning for nonlinear systems

    International Nuclear Information System (INIS)

    Rhodes, C.; Morari, M.; Tsimring, L.S.; Rulkov, N.F.

    1997-01-01

    An open-loop trajectory planning algorithm is presented for computing an input sequence that drives an input-output system such that a reference trajectory is tracked. The algorithm utilizes only input-output data from the system to determine the proper control sequence, and does not require a mathematical or identified description of the system dynamics. From the input-output data, the controlled input trajectory is calculated in a "one-step-ahead" fashion using local modeling. Since the algorithm is calculated in this fashion, the output trajectories to be tracked can be nonperiodic. The algorithm is applied to a driven Lorenz system and an experimental electrical circuit, and the results are analyzed. Issues of stability associated with the implementation of this open-loop scheme are also examined using an analytic example of a driven Hénon map, problems associated with inverse controllers are illustrated, and solutions to these problems are proposed. copyright 1997 The American Physical Society
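
The one-step-ahead, data-only idea can be sketched as follows: fit a local model to the nearest recorded (state, input, next-state) triples and invert it for the input that reaches the reference. The plant here is a toy scalar linear map, not the driven Lorenz system or Hénon map of the paper, and the nearest-neighbour local modelling is a simplified stand-in for the paper's method.

```python
import numpy as np

def plant(x, u):                 # "unknown" dynamics, used only to generate data
    return 0.5 * x + 2.0 * u

rng = np.random.default_rng(0)
xs = rng.uniform(-1, 1, 200)
us = rng.uniform(-1, 1, 200)
xn = plant(xs, us)               # recorded input-output data set

def one_step_control(x, ref, k=20):
    """Local model x+ ~ a*x + b*u fitted to the k nearest recorded states,
    then inverted for the tracking input."""
    idx = np.argsort(np.abs(xs - x))[:k]
    A = np.column_stack([xs[idx], us[idx]])
    a, b = np.linalg.lstsq(A, xn[idx], rcond=None)[0]
    return (ref - a * x) / b

x, ref = 0.8, -0.3
for _ in range(5):
    x = plant(x, one_step_control(x, ref))
```

No global model of the dynamics is ever identified; only locally selected data enter each control computation, which is the essence of the approach described above.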

  14. Stochastic bounded consensus tracking of leader-follower multi-agent systems with measurement noises based on sampled data with general sampling delay

    International Nuclear Information System (INIS)

    Wu Zhi-Hai; Peng Li; Xie Lin-Bo; Wen Ji-Wei

    2013-01-01

    In this paper we provide a unified framework for consensus tracking of leader-follower multi-agent systems with measurement noises based on sampled data with a general sampling delay. First, a stochastic bounded consensus tracking protocol based on sampled data with a general sampling delay is presented by employing the delay decomposition technique. Then, necessary and sufficient conditions are derived for guaranteeing leader-follower multi-agent systems with measurement noises and a time-varying reference state to achieve mean square bounded consensus tracking. The obtained results cover no sampling delay, a small sampling delay and a large sampling delay as three special cases. Last, simulations are provided to demonstrate the effectiveness of the theoretical results. (interdisciplinary physics and related areas of science and technology)

  15. AZ-101 Mixer Pump Demonstration Data Acquisition System and Gamma Cart Data Acquisition Control System Software Configuration Management Plan

    International Nuclear Information System (INIS)

    WHITE, D.A.

    1999-01-01

    This Software Configuration Management Plan (SCMP) provides the instructions for change control of the AZ-101 Mixer Pump Demonstration Data Acquisition System (DAS) and the Sludge Mobilization Cart (Gamma Cart) Data Acquisition and Control System (DACS).

  16. Design and Implementation of the Dynamic Data Acquisition and Historical Data Query System for BEPCII Control System

    International Nuclear Information System (INIS)

    Ma Mei; Huang Song; Zhao Jijiu; Zhao Zhuo; Huang Xianghao; Wang Jincan; Wang Chunhong; Fan Ruohong

    2009-01-01

    The BEPCII is the upgrade project of the Beijing Electron Positron Collider, one of the national fundamental research centers. Based on the dynamic data query system of the BEPCII control system, which has already been put into operation, this paper provides a comprehensive presentation of the dynamic data acquisition and historical data query system, including the system architecture, database design, development schema and technology, the system environment and system functions, and illustrates some results of system operation. (authors)

  17. A quality control system for digital elevation data

    Science.gov (United States)

    Knudsen, Thomas; Kokkendorf, Simon; Flatman, Andrew; Nielsen, Thorbjørn; Rosenkranz, Brigitte; Keller, Kristian

    2015-04-01

    In connection with the introduction of a new version of the Danish national coverage Digital Elevation Model (DK-DEM), the Danish Geodata Agency has developed a comprehensive quality control (QC) and metadata production (MP) system for LiDAR point cloud data. The architecture of the system reflects its origin in a national mapping organization where raw data deliveries are typically outsourced to external suppliers. It also reflects a design decision of aiming, whenever conceivable, at full spatial coverage tests rather than scattered sample checks. Hence, the QC procedure is split in two phases: a reception phase and an acceptance phase. The primary aim of the reception phase is to do a quick assessment of things that can typically go wrong, and which are relatively simple to check: data coverage, data density, strip adjustment. If a data delivery passes the reception phase, the QC continues with the acceptance phase, which checks five different aspects of the point cloud data: vertical accuracy, vertical precision, horizontal accuracy, horizontal precision, and point classification correctness. The vertical descriptors are comparatively simple to measure: the vertical accuracy is checked by direct comparison with previously surveyed patches, and the vertical precision is derived from the observed variance on well-defined flat surface patches. These patches are automatically derived from the road centerlines registered in FOT, the official Danish map database. The horizontal descriptors are less straightforward to measure, since potential reference material for direct comparison is typically expected to be less accurate than the LiDAR data. The selected solution is to compare photogrammetrically derived roof centerlines from FOT with LiDAR-derived roof centerlines. These are constructed by taking the 3D Hough transform of a point cloud patch defined by the photogrammetric roof polygon. The LiDAR-derived roof centerline is then the intersection line of the two primary
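
The two vertical descriptors described in the abstract can be sketched directly: accuracy as the mean signed offset of LiDAR heights against a surveyed reference patch, and precision as the spread of heights on a patch assumed flat. The numbers below are made up for illustration.

```python
from statistics import mean, pstdev

def vertical_accuracy(lidar_z, reference_z):
    """Mean signed error of LiDAR heights against a surveyed control patch."""
    return mean(l - r for l, r in zip(lidar_z, reference_z))

def vertical_precision(flat_patch_z):
    """Dispersion of heights on a patch known to be flat, e.g. a road surface."""
    return pstdev(flat_patch_z)

acc = vertical_accuracy([10.02, 10.05, 9.98], [10.0, 10.0, 10.0])   # metres
prec = vertical_precision([20.01, 20.03, 19.99, 20.01])             # metres
```

Accuracy needs independent reference heights, while precision needs only the flatness assumption, which is why the road-patch trick in the abstract avoids extra surveying.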

  18. Microcomputer-controlled ultrasonic data acquisition system. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, W.A. Jr.

    1978-11-01

    The large volume of ultrasonic data generated by computer-aided test procedures has necessitated the development of a mobile, high-speed data acquisition and storage system. This approach offers the decided advantage of on-site data collection and remote data processing. It also utilizes standard, commercially available ultrasonic instrumentation. This system is controlled by an Intel 8080A microprocessor. The MCS80-SDK microcomputer board was chosen, and magnetic tape is used as the storage medium. A detailed description is provided of both the hardware and software developed to interface the magnetic tape storage subsystem to Biomation 8100 and Biomation 805 waveform recorders. A boxcar integrator acquisition system is also described for use when signal averaging becomes necessary. Both assembly language and machine language listings are provided for the software.

  19. System design specification for rotary mode core sample trucks No. 2, 3, and 4 programmable logic controller

    International Nuclear Information System (INIS)

    Dowell, J.L.; Akers, J.C.

    1995-01-01

    The system this document describes controls several functions of the Core Sample Truck(s) used to obtain nuclear waste samples from various underground storage tanks at Hanford. The system monitors the sampling process and provides alarms and other feedback to ensure the sampling process is performed within the prescribed operating envelope. The intended audience for this document is anyone associated with rotary or push mode core sampling. This document describes the Alarm and Control logic installed on Rotary Mode Core Sample Trucks (RMCST) #2, 3, and 4. It is intended to define the particular requirements of the RMCST alarm and control operation (not defined elsewhere) sufficiently for detailed design to implement on a Programmable Logic Controller (PLC).

  20. Database application research in real-time data access of accelerator control system

    International Nuclear Information System (INIS)

    Chen Guanghua; Chen Jianfeng; Wan Tianmin

    2012-01-01

    The control system of the Shanghai Synchrotron Radiation Facility (SSRF) is a large-scale distributed real-time control system involving many types and large amounts of real-time data access during operation. Database systems have wide application prospects in large-scale accelerator control systems; replacing the differently dedicated data structures with a mature, standardized database system is a future development direction for accelerator control systems. Based on database interface technology, real-time data access testing, and system optimization research, this article discusses the application feasibility of database systems in accelerators and lays the foundation for the wide-scale application of database systems in the SSRF accelerator control system. (authors)

  1. Control and data acquisition system for versatile experiment spherical torus at SNU

    Energy Technology Data Exchange (ETDEWEB)

    An, YoungHwa [Department of Nuclear Engineering, Seoul National University, Seoul 151-742 (Korea, Republic of); Chung, Kyoung-Jae, E-mail: jkjlsh1@snu.ac.kr [Department of Nuclear Engineering, Seoul National University, Seoul 151-742 (Korea, Republic of); Na, DongHyeon; Hwang, Y.S. [Department of Nuclear Engineering, Seoul National University, Seoul 151-742 (Korea, Republic of)

    2013-10-15

    A control and data acquisition system for VEST (Versatile Experiment Spherical Torus) at Seoul National University (SNU) has been developed to enable remote operation from a central control room. The control and data acquisition system consists of three subsystems: a main control and data acquisition system that triggers each device at the preprogrammed timing and collects various diagnostic signals during discharges, a monitoring system that watches and logs the device status continuously, and a data storage and distribution system that stores collected data and provides a data access layer via Ethernet. The system is designed to be cost-effective, extensible and easy to develop by using well-established standard technologies and solutions. Combining broad accessibility with modern information technology, an alarm signal can be sent immediately to registered cell phones when an abnormal device status is found, and the web data distribution system enables data access from almost everywhere using smart phones or tablet computers. Since December 2011, VEST has been operational and the control and data acquisition system has been successfully used for remote operation of VEST.

  2. Control and data acquisition system for versatile experiment spherical torus at SNU

    International Nuclear Information System (INIS)

    An, YoungHwa; Chung, Kyoung-Jae; Na, DongHyeon; Hwang, Y.S.

    2013-01-01

    A control and data acquisition system for VEST (Versatile Experiment Spherical Torus) at Seoul National University (SNU) has been developed to enable remote operation from a central control room. The control and data acquisition system consists of three subsystems: a main control and data acquisition system that triggers each device at the preprogrammed timing and collects various diagnostic signals during discharges, a monitoring system that watches and logs the device status continuously, and a data storage and distribution system that stores collected data and provides a data access layer via Ethernet. The system is designed to be cost-effective, extensible and easy to develop by using well-established standard technologies and solutions. Combining broad accessibility with modern information technology, an alarm signal can be sent immediately to registered cell phones when an abnormal device status is found, and the web data distribution system enables data access from almost everywhere using smart phones or tablet computers. Since December 2011, VEST has been operational and the control and data acquisition system has been successfully used for remote operation of VEST.

  3. Dynamics and Robust Control of Sampled Data Systems for Large Space Structures

    Science.gov (United States)

    1992-11-01

    physical interpretation of J₁ is this: we wish to keep the state near zero without excessive control-energy expenditure. The weighting matrix, Q, ... can be given as follows. Defining v(k) = −[R₁ + Hᵀ P(k+1) H]⁻¹ Hᵀ P(k+1) G x(k) (249), where P(k) is given by a modified version of the Riccati equation.
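
The OCR-damaged snippet above appears to state the standard discrete-time LQR control law. Assuming the state update x(k+1) = G x(k) + H v(k) with state weight Q and input weight R₁ (an interpretation, since the report itself is only partially legible), the usual form of that law and its companion Riccati recursion is:

```latex
\begin{aligned}
v(k) &= -\left[R_1 + H^{\mathsf T} P(k+1)\, H\right]^{-1}
         H^{\mathsf T} P(k+1)\, G\, x(k),\\
P(k) &= G^{\mathsf T} P(k+1)\, G
       - G^{\mathsf T} P(k+1)\, H
         \left[R_1 + H^{\mathsf T} P(k+1)\, H\right]^{-1}
         H^{\mathsf T} P(k+1)\, G + Q .
\end{aligned}
```

The recursion runs backward from the terminal weight, and the feedback gain in v(k) is recomputed at each step, which matches the snippet's reference to a "modified" Riccati equation.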

  4. Compressing Control System Data for Efficient Storage and Retrieval

    International Nuclear Information System (INIS)

    Christopher Larrieu

    2003-01-01

    The controls group at the Thomas Jefferson National Accelerator Facility (Jefferson Lab), acquires multiple terabytes of EPICS control system data per year via CZAR, its new archiving system. By heuristically applying a combination of rudimentary compression techniques, in conjunction with several specialized data transformations and algorithms, the CZAR storage engine reduces the size of this data by approximately 88 percent, without any loss of information. While the compression process requires significant memory and processor time, the decompression routine suffers only slightly in this regard
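
The kind of lossless transformation-plus-compression the abstract alludes to can be sketched as follows: delta-encode slowly varying archiver samples before a general-purpose compressor, then invert both steps to confirm nothing was lost. CZAR's actual transformations are not described in the abstract, so this is only an analogy with invented data.

```python
import struct
import zlib

def compress(samples):
    """Delta-encode integer samples, pack to bytes, then deflate.
    Slowly varying channels produce mostly-zero deltas that compress well."""
    deltas = [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]
    raw = struct.pack(f"{len(deltas)}i", *deltas)
    return zlib.compress(raw, 9)

def decompress(blob, n):
    """Inflate, unpack, and cumulatively sum the deltas back to samples."""
    deltas = struct.unpack(f"{n}i", zlib.decompress(blob))
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

# A slowly drifting channel: 10,000 samples of 4 bytes each (40 kB raw)
# shrink to a small blob, and the round trip is exact.
samples = [100000 + i // 50 for i in range(10000)]
blob = compress(samples)
restored = decompress(blob, len(samples))
```

As in the abstract, the compression side does the heavy lifting (transformation plus deflate) while decompression remains cheap, a sensible trade-off for write-once, read-many archive data.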

  5. Development of PC based control and data acquisition system for photophysics beamline at Indus-1 synchrotron source

    International Nuclear Information System (INIS)

    Mallick, Manika B.; Somkuwar, Sanjay P.; Ravindranath, S.V.G.; Das, N.C.

    2003-12-01

    A control and data acquisition system (DAS) has been designed and developed for the photophysics beamline facility at Indus-1, CAT, Indore. It consists of two independent DASs for two monochromators: a 1-meter Seya-Namioka monochromator developed indigenously and a microcontroller based 1/4-meter Czerny-Turner monochromator procured from M/s CVI Laser. Each DAS performs tasks such as grating movement control, signal detection, digitization and acquisition. For this purpose, hardware has been developed around procured components, e.g. stepper motor, driver, photomultiplier tube, counter/timer/DIO card, etc. These components are interconnected through a control and data acquisition unit developed in-house and are controlled using a PC. User-friendly software has been developed using VC++ to provide modes like move to wavelength, scan wavelength, single step, set parameters, check reference and plot of scan data. The system performance has been tested with a standard source (Hg lamp) and found to be satisfactory. The absorption spectrum of benzene and the fluorescence spectrum of a typical multilayer (Gd₂O₃ + SiO₂ + Gd₂O₃ tri-layer) have been recorded with this setup. The stability of the system is excellent, and the data recorded at very low beam current, i.e. lower than 8 mA for the fluorescence signal, also provide significant information about the sample. The system is being used for characterization of various gaseous and solid-state samples. (author)

  6. Use of robotic systems for radiochemical sample changing and for analytical sample preparation

    International Nuclear Information System (INIS)

    Delmastro, J.R.; Hartenstein, S.D.; Wade, M.A.

    1989-01-01

    Two uses of the Perkin-Elmer (PE) robotic system will be presented. In the first, a PE robot functions as an automatic sample changer for up to five low energy photon spectrometry (LEPS) detectors operated with a Nuclear Data ND 6700 system. The entire system, including the robot, is controlled by an IBM PC-AT using software written in compiled BASIC. Problems associated with the development of the system and modifications to the robot will be presented. In the second, an evaluation study was performed to assess the abilities of the PE robotic system for performing complex analytical sample preparation procedures. For this study, a robotic system based upon the PE robot and auxiliary devices was constructed and programmed to perform the preparation of final product samples (UO₃) for accountability and impurity specification analyses. These procedures require sample dissolution, dilution, and liquid-liquid extraction steps. The results of an in-depth evaluation of all system components will be presented.

  7. Computer data-acquisition and control system for Thomson-scattering measurements

    International Nuclear Information System (INIS)

    Stewart, K.A.; Foskett, R.D.; Kindsfather, R.R.; Lazarus, E.A.; Thomas, C.E.

    1983-03-01

    The Thomson-Scattering Diagnostic System (SCATPAK II) used to measure the electron temperature and density in the Impurity Study Experiment is interfaced to a Perkin-Elmer 8/32 computer that operates under the OS/32 operating system. The calibration, alignment, and operation of this diagnostic are all under computer control. Data acquired from 106 photomultiplier tubes installed on 15 spectrometers are transmitted to the computer by eighteen 12-channel, analog-to-digital integrators along a CAMAC serial highway. With each laser pulse, 212 channels of data are acquired: 106 channels of signal plus background and 106 channels of background only. Extensive use of CAMAC instrumentation enables large amounts of data to be acquired and control processes to be performed in a time-dependent environment. The Thomson-scattering computer system currently operates in three modes: user interaction and control, data acquisition and transmission, and data analysis. This paper discusses the development and implementation of this system as well as data storage and retrieval.

  8. Sampled Data Systems Passivity and Discrete Port-Hamiltonian Systems

    NARCIS (Netherlands)

    Stramigioli, Stefano; Secchi, Cristian; Schaft, Arjan J. van der; Fantuzzi, Cesare

    2005-01-01

    In this paper, we present a novel way to approach the interconnection of a continuous and a discrete time physical system. This is done in a way which preserves passivity of the coupled system independently of the sampling time T. This strategy can be used both in the field of telemanipulation, for

  9. Data acquisition and control system for SMARTEX – C

    Energy Technology Data Exchange (ETDEWEB)

    Yeole, Yogesh Govind, E-mail: yogesh@ipr.res.in [Institute for Plasma Research, Gandhinagar, 382 428 Gujarat (India); Lachhvani, Lavkesh; Bajpai, Manu; Rathod, Surendrasingh; Kumar, Abhijeet; Sathyanarayana, K.; Pujara, H.D. [Institute for Plasma Research, Gandhinagar, 382 428 Gujarat (India); Pahari, Sambaran [BARC, Vishakhapatanam, 530 012 Andhra Pradesh (India); Chattopadhyay, Prabal K. [Institute for Plasma Research, Gandhinagar, 382 428 Gujarat (India)

    2016-11-15

    Highlights: • We have developed a control and data acquisition system for the non-neutral plasma experiment named SMARTEX – C. • The hardware of the system includes a high current power supply, a trigger circuit, a comparator circuit, a PXI system and a computer. • The software has been developed in LabVIEW®. • We have presented the complete time synchronization of the operation of the system. • Results obtained from the equipment have been shown. - Abstract: A PXI based data acquisition system has been developed for the Small Aspect Ratio Toroidal Experiment in C-shaped geometry (SMARTEX – C), a device to create and confine non-neutral plasma. The data acquisition system (DAQ) includes PXI based data acquisition cards, a communication card, a chassis, an optical fiber link, a dedicated computer, a trigger circuit (TC) and a voltage comparator. In this paper, we report the development of a comprehensive code in LabVIEW® 2012 software in order to control the operation of SMARTEX – C as well as to acquire the experimental data from it. The code has been incorporated with features like configuration of card parameters. A hardware based control sequence involving the TC has also been developed and integrated with the DAQ. In the acquisition part, the data from an experimental shot is acquired when a digital pulse from one of the PXI cards triggers the TC, which further triggers the TF power supply and the rest of the DAQ. The data hence acquired is stored on the hard disc in binary format for further analysis.

  10. The Tara control, monitoring, data acquisition and analysis system

    International Nuclear Information System (INIS)

    Sullivan, J.D.; Gaudreau, M.P.J.; Blanter, B.

    1986-09-01

    Experiments at the MIT Tara Tandem Mirror utilize an integrated system for control, monitoring, data acquisition, physics analysis, and archiving. This system consists of two distinct parts with narrowly defined information interchange; one to provide automated control and real time monitoring of engineering functions and one to acquire, analyze, and display data for physics in near real time. Typical machine operation achieves a total cycle time of 3 to 8 minutes with 5 to 7 Mbytes of data stored and with ∼160 individual signals displayed in hardcopy on ∼10 pages

  11. The Tara control, monitoring, data acquisition, and analysis system

    International Nuclear Information System (INIS)

    Sullivan, J.D.; Gaudreau, M.P.J.; Blanter, B.

    1987-01-01

    Experiments at the MIT Tara Tandem Mirror utilize an integrated system for control, monitoring, data acquisition, physics analysis, and archiving. This system consists of two distinct parts with narrowly defined information interchange; one to provide automated control and real time monitoring of engineering functions and one to acquire, analyze, and display data for physics in near real time. Typical machine operation achieves a total cycle time of 3 to 8 minutes with 5 to 7 Mbytes of data stored and with ∼160 individual signals displayed in hardcopy on ∼10 pages

  12. Sample and data management process description

    International Nuclear Information System (INIS)

    Kessner, J.H.

    2000-01-01

    The sample and data management process was initiated in 1994 as a result of a process improvement workshop. The purpose of the workshop was to develop a sample and data management process that would reduce cycle time and costs, simplify systems and procedures, and improve customer satisfaction for sampling, analytical services, and data management activities

  13. Data-Driven H∞ Control for Nonlinear Distributed Parameter Systems.

    Science.gov (United States)

    Luo, Biao; Huang, Tingwen; Wu, Huai-Ning; Yang, Xiong

    2015-11-01

    The data-driven H∞ control problem of nonlinear distributed parameter systems is considered in this paper. An off-policy learning method is developed to learn the H∞ control policy from real system data rather than the mathematical model. First, Karhunen-Loève decomposition is used to compute the empirical eigenfunctions, which are then employed to derive a reduced-order model (ROM) of the slow subsystem based on the singular perturbation theory. The H∞ control problem is reformulated based on the ROM, which can, in theory, be transformed into solving the Hamilton-Jacobi-Isaacs (HJI) equation. To learn the solution of the HJI equation from real system data, a data-driven off-policy learning approach is proposed based on the simultaneous policy update algorithm and its convergence is proved. For implementation purposes, a neural network (NN)-based actor-critic structure is developed, where a critic NN and two action NNs are employed to approximate the value function, control, and disturbance policies, respectively. Subsequently, a least-square NN weight-tuning rule is derived with the method of weighted residuals. Finally, the developed data-driven off-policy learning approach is applied to a nonlinear diffusion-reaction process, and the obtained results demonstrate its effectiveness.

  14. Development of a data base system for quality control of MOX fuels

    International Nuclear Information System (INIS)

    Takahashi, Kuniaki; Yamaguchi, Toshihiro; Mishima, Takeshi

    1988-01-01

    To improve and speed up quality-control work in mixed oxide (MOX) fuel fabrication, we have been developing a database system built on data about fabrication conditions and inspections. The aim is a database system capable of analysis and of providing information for quality control. The system is fully interactive and operates in real time, which makes analyzing and editing data easy. It supports relational searches, numerical analysis, correlation analysis, drawing of control charts, histograms and other figures, and expressing the status of fabrication processes using control charts. (author)

  15. System design description for mini-dacs data acquisition and control system

    International Nuclear Information System (INIS)

    Vargo, F.G. Jr.; Trujillo, L.T.; Smith, S.O.

    1994-01-01

    This document describes the hardware computer system for the mini data acquisition and control system (DACS) that was fabricated by Los Alamos National Laboratory (LANL) to support the testing of the spare mixer pump for SY-101

  16. Embedded systems design for high-speed data acquisition and control

    CERN Document Server

    Di Paolo Emilio, Maurizio

    2015-01-01

    This book serves as a practical guide for practicing engineers who need to design embedded systems for high-speed data acquisition and control systems. A minimum amount of theory is presented, along with a review of analog and digital electronics, followed by detailed explanations of essential topics in hardware design and software development. The discussion of hardware focuses on microcontroller design (ARM microcontrollers and FPGAs), techniques of embedded design, high speed data acquisition (DAQ) and control systems. Coverage of software development includes main programming techniques, culminating in the study of real-time operating systems. All concepts are introduced in a manner to be highly-accessible to practicing engineers and lead to the practical implementation of an embedded board that can be used in various industrial fields as a control system and high speed data acquisition system.   • Describes fundamentals of embedded systems design in an accessible manner; • Takes a problem-solving ...

  17. Towards a new generation of control and data acquisition systems for thermonuclear fusion research

    International Nuclear Information System (INIS)

    Van Haren, P.C.

    1993-01-01

    Because of the complexity of thermonuclear fusion test reactors, control systems are indispensable. The physical properties of the reactor medium, i.e. the plasma, are still not well understood. Therefore, many diagnostic techniques are applied to investigate the plasma and to discover its properties. As a consequence, data acquisition systems play an important role in thermonuclear fusion research. This thesis reports on three projects that were carried out in the field of control and data acquisition. The target experiment is the Rijnhuizen Tokamak Project (RTP), a medium-sized experiment dedicated to studies of transport in the reactor medium. One of the projects is aimed at the development of a new Plasma Position and Current Control feedback System (PPCCS). This system evaluates signals of a large number (about 20) of sensors, computes the actual state of the plasma from these signals and generates command signals for the power supplies that govern the plasma position. The most ambitious project described in this thesis is the development of a data acquisition system, called TRAMP (Transient Recorders and Amoeba Multi Processor), that aims to be a testbed for smart data acquisition strategies. TRAMP attempts to acquire and temporarily store all possible data at a high sampling frequency from a single RTP pulse, and accommodates resampling in software prior to transferring the data to a mass storage facility. The software resampling frequency can be tuned by analysis of the acquired data and, in that way, only interesting data will be stored. In the course of the development of both the above-mentioned systems it turned out that the existing database format applied for managing experimental data provided many hurdles in the realization of efficient solutions. Consequently, a new database format was developed together with software to deal with it. This new database, called DOM4 (Data Organization and Management), is now applied at all data acquisition
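TRAMP's strategy of acquiring everything at a high sampling frequency and then resampling in software before archiving can be illustrated with block-average decimation. This is a hypothetical sketch of the idea, not TRAMP's actual algorithm:

```python
def resample(signal, factor):
    """Decimate by block averaging: keep one averaged point per `factor` raw samples.

    A smart-acquisition layer would choose `factor` after inspecting the
    acquired data, so only the interesting time resolution is archived.
    """
    return [sum(signal[i:i + factor]) / factor
            for i in range(0, len(signal) - factor + 1, factor)]
```

Averaging (rather than plain sample dropping) also suppresses noise in the archived trace, at the cost of bandwidth above the new Nyquist frequency.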

  18. Collection and control of tritium bioassay samples at Pantex

    International Nuclear Information System (INIS)

    Fairrow, N.L.; Ivie, W.E.

    1992-01-01

    Pantex is the final assembly/disassembly point for US nuclear weapons. The Pantex internal dosimetry section monitors radiation workers once a month for tritium exposure. In order to manage the collection and control of bioassay specimens efficiently, a bar code system for sample collection was developed and implemented to speed up the process and reduce the likelihood of errors when transferring data. In the past, all bioassay data were entered manually into a computer database. Transferring the bioassay data from the liquid scintillation counter to each individual's dosimetry record required as much as two weeks of concentrated effort

  19. KENS data acquisition system KENSnet

    International Nuclear Information System (INIS)

    Arai, Masatoshi; Furusaka, Michihiro; Satoh, Setsuo; Johnson, M.W.

    1988-01-01

    The installation of a new data acquisition system KENSnet has been completed at the KENS neutron facility. For data collection, 160 Mbytes are necessary for temporary disk storage, and 1 MIPS of CPU is required. For the computing system, models were chosen from the VAX family of computers running their proprietary operating system VMS. The VMS operating system has a very user friendly interface, and is well suited to instrument control applications. New data acquisition electronics were developed. A gate module receives a signal of proton extraction time from the accelerator, and checks the veto signals from the sample environment equipment (vacuum, temperature, chopper phasing, etc.). Then the signal is issued to a delay-time module. A time-control module starts timing from the delayed start signal from the delay-time module, and distributes an encoded time-boundary address to memory modules at the preset times enabling the memory modules to accumulate data histograms. The data acquisition control program (ICP) and the general data analysis program (Genie) were both developed at ISIS, and have been installed in the new data acquisition system. They give the experimenter 'user-friendly' data acquisition and a good environment for data manipulation. The ICP controls the DAE and transfers the histogram data into the computers. (N.K.)
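The time-control module's job of steering detector events into preset time-boundary bins is, in software terms, histogram accumulation. A minimal sketch of that operation (the boundaries and event times below are invented for illustration, not KENS parameters):

```python
import bisect

def accumulate(histogram, boundaries, event_times):
    """Add each event to the bin whose [boundaries[i], boundaries[i+1]) interval contains it.

    Events outside all preset time boundaries are discarded, as a hardware
    memory module would ignore addresses outside its range.
    """
    for t in event_times:
        i = bisect.bisect_right(boundaries, t) - 1
        if 0 <= i < len(histogram):
            histogram[i] += 1
    return histogram
```

In the KENSnet hardware this accumulation happens in dedicated memory modules between pulses; the software equivalent is useful mainly for replay and analysis.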

  20. Data-Driven Predictive Direct Load Control of Refrigeration Systems

    DEFF Research Database (Denmark)

    Shafiei, Seyed Ehsan; Knudsen, Torben; Wisniewski, Rafal

    2015-01-01

    A predictive control using subspace identification is applied for the smart grid integration of refrigeration systems under a direct load control scheme. A realistic demand response scenario based on regulation of the electrical power consumption is considered. A receding horizon optimal control...... is proposed to fulfil two important objectives: to secure high coefficient of performance and to participate in power consumption management. Moreover, a new method for design of input signals for system identification is put forward. The control method is fully data driven without an explicit use of model...... against real data. The performance improvement results in a 22% reduction in the energy consumption. A comparative simulation is accomplished showing the superiority of the method over the existing approaches in terms of the load following performance....

  1. Lessons learned from the MIT Tara control and data system

    International Nuclear Information System (INIS)

    Gaudreau, M.P.J.; Sullivan, J.D.; Fredian, T.W.; Irby, J.H.; Karcher, C.A.; Rameriz, R.A.; Sevillano, E.; Stillerman, J.A.; Thomas, P.

    1987-10-01

    The control and data system of the MIT Tara Tandem Mirror has worked successfully throughout the lifetime of the experiment (1983 through 1987). As the Tara project winds down, it is appropriate to summarize the lessons learned from the implementation and operation of the control and data system over the years and in its final form. The control system handled ∼2400 I/O points in real time throughout the 5 to 10 minute shot cycle while the data system, in near real time, handled ∼1000 signals with a total of 5 to 7 Mbytes of data each shot. The implementation depended upon a consistent approach based on separating physics and engineering functions and on detailed functional diagrams with narrowly defined cross communication. This paper is a comprehensive treatment of the principal successes, residual problems, and dilemmas that arose from the beginning until the final hardware and software implementation. Suggestions for future systems of either similar size or of larger scale such as CIT are made in the conclusion. 11 refs., 1 fig

  2. Data-based control tuning in master-slave systems

    NARCIS (Netherlands)

    Heertjes, M.F.; Temizer, B.

    2012-01-01

    For improved output synchronization in master-slave systems, a data-based control tuning is presented. Herein the coefficients of two finite-duration impulse response (FIR) filters are found through machine-in-the-loop optimization. One filter is used to shape the input to the slave system while the

  3. Temperature Control Diagnostics for Sample Environments

    International Nuclear Information System (INIS)

    Santodonato, Louis J.; Walker, Lakeisha M.H.; Church, Andrew J.; Redmon, Christopher Mckenzie

    2010-01-01

    In a scientific laboratory setting, standard equipment such as cryocoolers are often used as part of a custom sample environment system designed to regulate temperature over a wide range. The end user may be more concerned with precise sample temperature control than with base temperature. But cryogenic systems tend to be specified mainly in terms of cooling capacity and base temperature. Technical staff at scientific user facilities (and perhaps elsewhere) often wonder how to best specify and evaluate temperature control capabilities. Here we describe test methods and give results obtained at a user facility that operates a large sample environment inventory. Although this inventory includes a wide variety of temperature, pressure, and magnetic field devices, the present work focuses on cryocooler-based systems.

  4. Anomaly Detection for Resilient Control Systems Using Fuzzy-Neural Data Fusion Engine

    Energy Technology Data Exchange (ETDEWEB)

    Ondrej Linda; Milos Manic; Timothy R. McJunkin

    2011-08-01

    Resilient control systems in critical infrastructures require increased cyber-security and state-awareness. One of the necessary conditions for achieving the desired high level of resiliency is timely reporting and understanding of the status and behavioral trends of the control system. This paper describes the design and development of a neural-network based data-fusion system for increased state-awareness of resilient control systems. The proposed system consists of a dedicated data-fusion engine for each component of the control system. Each data-fusion engine implements a three-layered alarm system consisting of: (1) conventional threshold-based alarms, (2) an anomalous behavior detector using self-organizing maps, and (3) prediction error based alarms using neural network based signal forecasting. The proposed system was integrated with a model of the Idaho National Laboratory Hytest facility, which is a testing facility for hybrid energy systems. Experimental results demonstrate that the implemented data fusion system provides timely plant performance monitoring and cyber-state reporting.
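Layers (1) and (3) of the alarm system described above reduce to a range check and a prediction-error check; the SOM-based layer (2) is omitted here for brevity. A hypothetical sketch with made-up limits and tolerance:

```python
def alarms(value, predicted, low=0.0, high=100.0, tol=5.0):
    """Return the set of alarm layers raised for one sensor reading.

    `low`/`high` are the conventional threshold limits (layer 1); `tol` is
    the allowed deviation from a forecast value (layer 3). All numbers here
    are illustrative, not Hytest settings.
    """
    raised = set()
    if not (low <= value <= high):        # layer 1: threshold alarm
        raised.add("threshold")
    if abs(value - predicted) > tol:      # layer 3: prediction-error alarm
        raised.add("prediction_error")
    return raised
```

The value of the layered design is that layer 3 can flag a reading that is inside its static limits but far from where the forecaster expected it to be.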

  5. NetCDF based data archiving system applied to ITER Fast Plant System Control prototype

    International Nuclear Information System (INIS)

    Castro, R.; Vega, J.; Ruiz, M.; De Arcas, G.; Barrera, E.; López, J.M.; Sanz, D.; Gonçalves, B.; Santos, B.; Utzel, N.; Makijarvi, P.

    2012-01-01

    Highlights: ► Implementation of a data archiving solution for a Fast Plant System Controller (FPSC) for ITER CODAC. ► Data archiving solution based on scientific NetCDF-4 file format and Lustre storage clustering. ► EPICS control based solution. ► Test results and detailed analysis of using NetCDF-4 and clustering technologies on fast acquisition data archiving. - Abstract: EURATOM/CIEMAT and Technical University of Madrid (UPM) have been involved in the development of a FPSC (Fast Plant System Control) prototype for ITER, based on PXIe (PCI eXtensions for Instrumentation). One of the main focuses of this project has been data acquisition and all the related issues, including scientific data archiving. Additionally, a new data archiving solution has been developed to demonstrate the obtainable performances and possible bottlenecks of scientific data archiving in Fast Plant System Control. The presented system implements a fault tolerant architecture over a GEthernet network where FPSC data are reliably archived remotely, while remaining accessible for redistribution, within the duration of a pulse. The storing service is supported by a clustering solution to guarantee scalability, so that FPSC management and configuration may be simplified, and a unique view of all archived data provided. All the involved components have been integrated under EPICS (Experimental Physics and Industrial Control System), implementing in each case the necessary extensions, state machines and configuration process variables. The prototyped solution is based on the NetCDF-4 (Network Common Data Format) file format in order to incorporate important features, such as scientific data models support, huge size files management, platform independent codification, or single-writer/multiple-readers concurrency. In this contribution, a complete description of the above mentioned solution is presented, together with the most relevant results of the tests performed, while focusing in the

  6. Asynchronous data change notification between database server and accelerator controls system

    International Nuclear Information System (INIS)

    Fu, W.; Morris, J.; Nemesure, S.

    2011-01-01

    Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMS's which support DCN (such as Oracle and MS SQL server), some server side and/or client side programming may be required to make the DCN system work. This makes the setup of DCN between database server and interested clients tedious and time consuming. In accelerator control systems, there are many well established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. Asynchronous data change notification (ADCN) between database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMS systems, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
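The trigger-based ADCN idea can be sketched with SQLite standing in for Oracle or MS SQL Server: a database trigger appends each change to a log table, which a reflection server would then poll and push to clients over its SET/GET API. The schema and names below are illustrative, not those of the RHIC-AGS system:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE settings (name TEXT PRIMARY KEY, value REAL);
CREATE TABLE change_log (id INTEGER PRIMARY KEY AUTOINCREMENT,
                         name TEXT, value REAL);
-- the trigger records every change; a data reflection server would drain
-- this table and notify clients asynchronously
CREATE TRIGGER notify_update AFTER UPDATE ON settings
BEGIN
    INSERT INTO change_log (name, value) VALUES (NEW.name, NEW.value);
END;
""")
conn.execute("INSERT INTO settings VALUES ('magnet_current', 1.0)")
conn.execute("UPDATE settings SET value = 2.5 WHERE name = 'magnet_current'")

def pending_changes(c):
    """Drain the change log, as a reflection server would on each poll."""
    rows = c.execute("SELECT name, value FROM change_log").fetchall()
    c.execute("DELETE FROM change_log")
    return rows
```

Because the trigger runs inside the DBMS, the approach works with any database that supports triggers, which is exactly the portability argument the abstract makes.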

  7. A self-description data framework for Tokamak control system design

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Ming; Zhang, Jing [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); School of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Zheng, Wei, E-mail: zhengwei@hust.edu.cn [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); School of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Hu, Feiran; Zhuang, Ge [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, Wuhan 430074 (China); School of Electrical and Electronic Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China)

    2015-10-15

    Highlights: • The SDD framework can be applied to different Tokamak devices. • We explain how configuration settings of control systems are described in SDD models, namely components and connections. • Evolving SDD models are stored in a dynamic schema database. • The SDD editor supports plug-and-play SDD models. - Abstract: A Tokamak device consists of numerous control systems, which need to be integrated. CODAC (Control, Data Access and Communication) system requires the configuration settings of these control systems to carry out the integration smoothly. SDD (Self-description data) is designed to describe the static configuration of control systems. ITER CODAC group has released an SDD software package for control system designers to manage the static configuration, but it is specific for ITER plant control systems. Following the idea of ITER SDD, we developed a flexible and scalable SDD framework to develop SDD software for J-TEXT and other sophisticated devices. The SDD framework describes the configuration settings of various control systems, including physical and logical elements and their relation information, in SDD models which are classified into Components and Connections. The framework is composed of three layers: the MongoDB database, an open-source, dynamic schema, NoSQL (Not Only SQL) database; the SDD service, which maps SDD models to MongoDB and handles the transaction and business logic; the SDD applications, which can be used to create and maintain SDD information, and generate various kinds of output using the stored SDD information.
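The Component/Connection split of SDD models suits a dynamic-schema document store such as MongoDB: each model is simply a document carrying whatever attributes its control system needs, with no fixed schema. The field names and element names below are invented for illustration:

```python
import json

# hypothetical SDD documents; a dynamic-schema store accepts them as-is
component = {
    "type": "Component",
    "name": "PF_PowerSupply_01",
    "attributes": {"channels": 4, "interface": "EPICS"},
}
connection = {
    "type": "Connection",
    "from": "PF_PowerSupply_01",
    "to": "PF_Coil_01",
    "signal": "current_setpoint",
}

# serializing to JSON shows the models need no predefined table layout,
# which is why an evolving SDD schema fits a NoSQL backend
doc = json.dumps([component, connection])
```

Adding a new attribute to one component later does not require migrating the others, which is the practical benefit of the dynamic-schema choice described in the abstract.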

  8. A self-description data framework for Tokamak control system design

    International Nuclear Information System (INIS)

    Zhang, Ming; Zhang, Jing; Zheng, Wei; Hu, Feiran; Zhuang, Ge

    2015-01-01

    Highlights: • The SDD framework can be applied to different Tokamak devices. • We explain how configuration settings of control systems are described in SDD models, namely components and connections. • Evolving SDD models are stored in a dynamic schema database. • The SDD editor supports plug-and-play SDD models. - Abstract: A Tokamak device consists of numerous control systems, which need to be integrated. CODAC (Control, Data Access and Communication) system requires the configuration settings of these control systems to carry out the integration smoothly. SDD (Self-description data) is designed to describe the static configuration of control systems. ITER CODAC group has released an SDD software package for control system designers to manage the static configuration, but it is specific for ITER plant control systems. Following the idea of ITER SDD, we developed a flexible and scalable SDD framework to develop SDD software for J-TEXT and other sophisticated devices. The SDD framework describes the configuration settings of various control systems, including physical and logical elements and their relation information, in SDD models which are classified into Components and Connections. The framework is composed of three layers: the MongoDB database, an open-source, dynamic schema, NoSQL (Not Only SQL) database; the SDD service, which maps SDD models to MongoDB and handles the transaction and business logic; the SDD applications, which can be used to create and maintain SDD information, and generate various kinds of output using the stored SDD information.

  9. The control and data acquisition system of a laser in-vessel viewing system

    International Nuclear Information System (INIS)

    Pereira, Rita C.; Cruz, Nuno; Neri, C.; Riva, M.; Correia, C.; Varandas, C.A.F.

    2000-01-01

    This paper presents the dedicated control and data acquisition system (CADAS) of a new laser in-vessel viewing system that has been developed for inspection purposes in fusion experiments. CADAS is based on a MC68060 microprocessor and on-site developed VME instrumentation. Its main aims are to simultaneously control the laser alignment system as well as the laser beam deflection for in-vessel scanning, acquire a high-resolution image and support real-time data flow rates up to 2 Mbyte/s from the acquisition modules to the hard disk and network. The hardware (modules for control and alignment acquisition, scanning acquisition and monitoring) as well as the three levels of software are described

  10. Sampling and Control Circuit Board for an Inertial Measurement Unit

    Science.gov (United States)

    Chelmins, David T (Inventor); Powis, Richard T., Jr. (Inventor); Sands, Obed (Inventor)

    2016-01-01

    A circuit board that serves as a control and sampling interface to an inertial measurement unit ("IMU") is provided. The circuit board is also configured to interface with a local oscillator and an external trigger pulse. The circuit board is further configured to receive the external trigger pulse from an external source that time aligns the local oscillator and initiates sampling of the inertial measurement device for data at precise time intervals based on pulses from the local oscillator. The sampled data may be synchronized by the circuit board with other sensors of a navigation system via the trigger pulse.

  11. Process theory for supervisory control of stochastic systems with data

    NARCIS (Netherlands)

    Markovski, J.

    2012-01-01

    We propose a process theory for supervisory control of stochastic nondeterministic plants with data-based observations. The Markovian process theory with data relies on the notion of Markovian partial bisimulation to capture controllability of stochastic nondeterministic systems. It presents a

  12. Ground-Based Global Navigation Satellite System Data (30-second sampling, 1 hour files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Navigation Satellite System (GNSS) daily 30-second sampled data available from the Crustal Dynamics Data Information System (CDDIS). Global Navigation...

  13. The Los Alamos accelerator control system data base: A generic instrumentation interface

    International Nuclear Information System (INIS)

    Dalesio, L.R.

    1990-01-01

    Controlling experimental-physics applications requires a control system that can be quickly integrated and easily modified. One aspect of the control system is the interface to the instrumentation. An instrumentation set has been chosen to implement the basic functions needed to monitor and control these applications. A data-driven interface to this instrumentation set provides the required quick integration of the control system. This type of interface is limited by its built-in capabilities. Therefore, these capabilities must provide an adequate range of functions to be of any use. The data-driven interface must support the instrumentation range required, the events on which to read or control the instrumentation, and a method for manipulating the data to calculate terms or close control loops. The database for the Los Alamos Accelerator Control System addresses these requirements. (orig.)
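A data-driven instrumentation interface replaces per-channel code with database records that specify address, scan event, conversion, and limits; the runtime interprets the record instead of being recompiled. The sketch below is hypothetical and does not reflect the actual Los Alamos database schema:

```python
# each record describes one instrumentation channel; names, addresses and
# conversions here are invented for illustration
database = {
    "beam_current": {
        "address": (3, 12),                           # (crate, slot) style address
        "scan_event": "every_pulse",                  # when to read the channel
        "conversion": lambda raw: raw * 0.01 - 5.0,   # raw counts -> engineering units
        "limits": (0.0, 40.0),                        # valid engineering range
    },
}

def read_channel(name, raw_value):
    """Interpret one channel's record: convert the raw reading and range-check it."""
    rec = database[name]
    value = rec["conversion"](raw_value)
    in_range = rec["limits"][0] <= value <= rec["limits"][1]
    return value, in_range
```

Adding a new instrument then means adding a record, not writing code, which is the quick-integration property the abstract emphasizes.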

  14. Comparing U.S. Army suicide cases to a control sample: initial data and methodological lessons.

    Science.gov (United States)

    Alexander, Cynthia L; Reger, Mark A; Smolenski, Derek J; Fullerton, Nicole R

    2014-10-01

    Identification of risk and protective factors for suicide is a priority for the United States military, especially in light of the recent steady increase in military suicide rates. The Department of Defense Suicide Event Report contains comprehensive data on suicides for active duty military personnel, but no analogous control data is available to permit identification of factors that differentially determine suicide risk. This proof-of-concept study was conducted to determine the feasibility of collecting such control data. The study employed a prospective case-control design in which control cases were randomly selected from a large Army installation at a rate of four control participants for every qualifying Army suicide. Although 111 Army suicides were confirmed during the study period, just 27 control soldiers completed the study. Despite the small control sample, preliminary analyses comparing suicide cases to controls identified several factors more frequently reported for suicide cases, including recent failed intimate relationships, outpatient mental health history, mood disorder diagnosis, substance abuse history, and prior self-injury. No deployment-related risk factors were found. These data are consistent with existing literature and form a foundation for larger control studies. Methodological lessons learned regarding study design and recruitment are discussed to inform future studies. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  15. Networked control systems with communication constraints :tradeoffs between sampling intervals, delays and performance

    NARCIS (Netherlands)

    Heemels, W.P.M.H.; Teel, A.R.; Wouw, van de N.; Nesic, D.

    2010-01-01

    There are many communication imperfections in networked control systems (NCS) such as varying transmission delays, varying sampling/transmission intervals, packet loss, communication constraints and quantization effects. Most of the available literature on NCS focuses on only some of these aspects,

  16. Data-driven design of fault diagnosis and fault-tolerant control systems

    CERN Document Server

    Ding, Steven X

    2014-01-01

    Data-driven Design of Fault Diagnosis and Fault-tolerant Control Systems presents basic statistical process monitoring, fault diagnosis, and control methods, and introduces advanced data-driven schemes for the design of fault diagnosis and fault-tolerant control systems catering to the needs of dynamic industrial processes. With ever increasing demands for reliability, availability and safety in technical processes and assets, process monitoring and fault-tolerance have become important issues surrounding the design of automatic control systems. This text shows the reader how, thanks to the rapid development of information technology, key techniques of data-driven and statistical process monitoring and control can now become widely used in industrial practice to address these issues. To allow for self-contained study and facilitate implementation in real applications, important mathematical and control theoretical knowledge and tools are included in this book. Major schemes are presented in algorithm form and...

  17. Hazard Control Extensions in a COTS Based Data Handling System

    Science.gov (United States)

    Vogel, Torsten; Rakers, Sven; Gronowski, Matthias; Schneegans, Joachim

    2011-08-01

    EML is an electromagnetic levitator for containerless processing of conductive samples on the International Space Station. This material sciences experiment is running in the European Drawer Rack (EDR) facility. The objective of this experiment is to gain insight into the parameters of liquid metal samples and their crystallisation processes without the influence of container walls. To this end the samples are electromagnetically positioned in a coil system and then heated up beyond their melting point in an ultraclean environment.The EML programme is currently under development by Astrium Space Transportation in Friedrichshafen and Bremen; jointly funded by ESA and DLR (on behalf of BMWi, contract 50WP0808). EML consists of four main modules listed in Table 1. The paper focuses mainly on the architecture and design of the ECM module and its contribution to a safe operation of the experiment. The ECM is a computer system that integrates the power supply to the EML experiment, control functions and video handling and compression features. Experiment control is performed by either telecommand or the execution of predefined experiment scripts.

  18. Data acquisition system in TPE-1RM15

    International Nuclear Information System (INIS)

    Yagi, Yasuyuki; Yahagi, Eiichi; Hirano, Yoichi; Shimada, Toshio; Hirota, Isao; Maejima, Yoshiki

    1991-01-01

    The data acquisition system for the TPE-1RM15 reversed-field pinch machine has recently been completed. The data to be acquired consist of many channels of time-series data from plasma diagnostics. The newly developed system uses CAMAC (Computer Automated Measurement And Control) as the front-end data acquisition system and a micro-VAX II for control, file management and analyses. Special computer programs, DAQR/D, have been developed for the data acquisition routine. Experimental settings and process-control items are managed by a parameter database in a shared common region, which every task can easily refer to. The acquired data are stored in a mass storage system (1.3 GBytes in total, plus a magnetic tape system) including an optical disk system, which saves storage space and allows quick reference. At present, the CAMAC system has 88 channels at 1 MHz sampling and 64 channels at 5 kHz sampling, corresponding to 1.6 MBytes per shot. The data acquisition system can finish one routine with 1.6 MBytes of data within 5 minutes, depending on the amount of graphic output. The hardware and software are specified so that the system can be easily expanded. The computer is connected to the AIST Ethernet, so the system can be accessed remotely and the acquired data can be transferred to the mainframes on the network. Details of the specifications and performance of the system are given in this report. (author)

  19. Control and data acquisition system for rotary compressor

    Directory of Open Access Journals (Sweden)

    Buczaj Marcin

    2017-01-01

    Full Text Available The rotary compressor (crimping machine) is a machine designed for making hollow forgings. The rotary compressor is a prototype device designed and built at the Technical University of Lublin. It is dedicated to laboratory tests on hollow forgings of various shapes made from different materials. Since the rotary compressor is an experimental device, no control and data acquisition system was available for it. The article presents the concept and capabilities of a computer control and data acquisition system supporting the rotary compressing process. The main task of the software is the acquisition of force and kinetic parameters of the analysed rotary forging compression process. The software allows the user to define the course of forming the forgings. The system records and analyses four physical values in real time: feed rate (speed of working-head movement), hydraulic oil pressure at the inlet and outlet of the hydraulic cylinder, and engine torque. The application functions can be divided into three groups: configuration of the pressing process, acquisition and analysis of data from the pressing process, and recording and presentation of stored results. The article contains a detailed description of the hardware and software implementation of these functions.

  20. System control and data acquisition of the two new FWCD RF systems at DIII-D

    International Nuclear Information System (INIS)

    Harris, T.E.; Allen, J.C.; Cary, W.P.; Petty, C.C.

    1995-10-01

    The Fast Wave Current Drive (FWCD) system at DIII-D has increased its available radio frequency (RF) power capabilities with the addition of two new high-power transmitters along with their associated transmission-line systems. A Sun Sparc-10 workstation, functioning as the FWCD operator console, is used to control transmitter operating parameters and transmission-line tuning parameters, and to acquire data and make it available for integration into the DIII-D data acquisition system. LabVIEW, a graphical user interface application, is used to manage and control these processes. This paper discusses the three primary branches of the FWCD computer control system: transmitter control, transmission-line tuning control, and FWCD data acquisition. The main control program uses VXI, GPIB, CAMAC, Serial, and Ethernet protocols to blend the three branches into one cohesive system. Transmitter control uses VXI technology to communicate with the transmitter's digital interface. A GPIB network allows communication with various instruments and CAMAC crate controllers. CAMAC crates are located at each phase-shifter/stub-tuner station and are used to digitize transmission-line parameters and to detect transmission-line faults during RF transmission. The phase-shifter/stub-tuner stations are located throughout the DIII-D facility and are controlled from the FWCD operator console via the workstation's Serial port. The Sun workstation has an Ethernet connection allowing for the utilization of the DIII-D data acquisition 'Open System' architecture, and, of course, providing communication with the rest of the world.

  1. Asynchronous sampled-data approach for event-triggered systems

    Science.gov (United States)

    Mahmoud, Magdi S.; Memon, Azhar M.

    2017-11-01

    While aperiodically triggered network control systems save a considerable amount of communication bandwidth, they also pose challenges such as coupling between the control and event-condition designs, optimisation of available resources (control, communication and computation power), and time delays due to computation and the communication network. With this motivation, the paper presents: separate designs for the control and event-triggering mechanisms, which simplifies the overall analysis; an asynchronous linear quadratic Gaussian controller, which tackles delays and the aperiodic nature of transmissions; and a novel event mechanism, which compares the cost of the aperiodic system against a reference periodic implementation. The proposed scheme is simulated on a linearised wind turbine model for pitch angle control, and the results show significant improvement over the periodic counterpart.
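    The cost-comparison trigger above is specific to the paper's LQG setup; the basic economy of event-triggered transmission can be sketched with a simpler, generic error-threshold rule. In this sketch the plant gains, threshold and trigger condition are all illustrative, not taken from the paper:

    ```python
    # Event-triggered control sketch: the sensor transmits a new state
    # only when it drifts from the last transmitted value by more than a
    # threshold; the controller holds the last *sent* state in between.
    # Plant, gains and threshold are made-up illustrative values.
    def simulate(threshold=0.05, steps=200, dt=0.05):
        a, b, k = 0.5, 1.0, 1.2          # unstable plant x' = a*x + b*u, feedback gain k
        x, x_sent = 1.0, 1.0             # true state and last transmitted value
        transmissions = 0
        for _ in range(steps):
            if abs(x - x_sent) > threshold:   # event condition
                x_sent = x
                transmissions += 1
            u = -k * x_sent                   # control uses the last sent state
            x = x + dt * (a * x + b * u)      # Euler step of the plant
        return x, transmissions

    x_final, n_tx = simulate()
    print(x_final, n_tx)   # state settles near zero with far fewer than 200 transmissions
    ```

    The point of the sketch is the trade-off the paper optimises: the state is still stabilised, but the channel is used only when the trigger fires rather than at every sample instant.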

  2. Robust Adaptive Stabilization of Linear Time-Invariant Dynamic Systems by Using Fractional-Order Holds and Multirate Sampling Controls

    Directory of Open Access Journals (Sweden)

    S. Alonso-Quesada

    2010-01-01

    Full Text Available This paper presents a strategy for designing a robust discrete-time adaptive controller for stabilizing linear time-invariant (LTI) continuous-time dynamic systems. Such systems may be unstable and non-inversely stable in the worst case. A reduced-order model is considered in the design of the adaptive controller. The control design is based on the discretization of the system using a multirate sampling device with a fast-sampled control signal. A suitable on-line adaptation of the multirate gains guarantees the stability of the inverse of the discretized estimated model, which is used to parameterize the adaptive controller. A dead zone is included in the parameter-estimation algorithm for robustness under the presence of unmodeled dynamics in the controlled system. The adaptive controller guarantees the boundedness of the system's measured signal for all time. Some examples illustrate the efficacy of this control strategy.

  3. Object oriented run control for the CEBAF data acquisition system

    International Nuclear Information System (INIS)

    Quarrie, D.R.; Heyes, G.; Jastrzembski, E.; Watson, W.A. III

    1992-01-01

    After an extensive evaluation, the Eiffel object-oriented language was selected for the design and implementation of the run-control portion of the CEBAF Data Acquisition System. The OSF/Motif graphical user interface toolkit and the Data Views process control system have been incorporated into this framework. In this paper, the authors discuss the evaluation process, the status of the implementation, and the lessons learned, particularly in the use of object-oriented techniques

  4. Optimal sampling strategy for data mining

    International Nuclear Information System (INIS)

    Ghaffar, A.; Shahbaz, M.; Mahmood, W.

    2013-01-01

    Modern technologies such as the Internet, corporate intranets, data warehouses, ERPs, satellites, digital sensors, embedded systems and mobile networks are all generating such massive amounts of data that it is becoming very difficult to analyze and understand it all, even using data mining tools. Huge datasets are becoming a difficult challenge for classification algorithms: with increasing amounts of data, data mining algorithms get slower and analysis becomes less interactive. Sampling can be a solution: using a fraction of the computing resources, sampling can often provide the same level of accuracy. The sampling process requires much care, because many factors are involved in determining the correct sample size. The approach proposed in this paper tries to solve this problem. Based on a statistical formula, after setting some parameters, it returns a sample size called the 'sufficient sample size', which is then selected through probability sampling. Results indicate the usefulness of this technique in coping with the problem of huge datasets. (author)
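    The record does not reproduce the paper's statistical formula; a widely used formula of this kind is Cochran's sample-size formula for proportions, shown below purely as an illustration (the defaults are conventional choices, not the paper's parameters):

    ```python
    import math

    # Cochran's sample-size formula for estimating a proportion:
    #   n = z^2 * p * (1 - p) / e^2
    # z: z-score for the confidence level, p: estimated proportion
    # (0.5 is the conservative worst case), e: tolerated margin of error.
    def cochran_sample_size(z=1.96, p=0.5, e=0.05):
        return math.ceil(z**2 * p * (1 - p) / e**2)

    n = cochran_sample_size()
    print(n)   # 385 records suffice at 95% confidence, +/-5% error
    ```

    The key property the paper exploits is that this n depends on the desired accuracy, not on the dataset size, so a fixed-size probability sample can stand in for an arbitrarily large dataset.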

  5. ICH rf system data acquisition and real time control using a microcomputer system

    International Nuclear Information System (INIS)

    Cary, W.P.; Allen, J.A.; Pinsker, R.I.; Petty, C.C.

    1993-10-01

    On the basis of the rapidly increasing power and speed, and the decreasing cost, of the personal computer (microcomputer), it was felt that a real-time data acquisition and control system could be configured quickly and very cost-effectively. It was further felt that by using a high-level or object-oriented programming language, considerable time and expense could be saved while increasing system flexibility. This paper addresses the desired system requirements and performance, both for the control of the high-power transmitters and for the data acquisition and presentation of the information

  6. System of automatic control over data Acquisition and Transmission to IGR NNC RK Data Center

    International Nuclear Information System (INIS)

    Komarov, I.I.; Gordienko, D.D.; Kunakov, A.V.

    2005-01-01

    An automated system for seismic and acoustic data acquisition and transmission in real time was established in the Data Center of IGR NNC RK, where it functions very successfully. The system monitors the quality and volume of acquired information, as well as the status of the system and communication channels. Statistical data on system operation are accumulated in a purpose-built database. Information on system status is reflected on the Center's Web page. (author)

  7. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, often the overall flow in the low-resolution simulation that an animator observes and intends to preserve is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation the overall flow matches the low-resolution simulation while the fine details of the high resolution are preserved. The samples we obtain have both spatial and temporal continuity, allowing smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.

  8. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    Science.gov (United States)

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and performance of urban drainage systems, on both rainfall-event and yearly time scales, is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, and iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
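    The very wide uncertainty range quoted for TSS removal rates follows from first-order uncertainty propagation: when the removal rate R = 1 - Mout/Min is small, moderate load uncertainties translate into a huge relative uncertainty on R. A sketch with made-up values (not the paper's data):

    ```python
    import math

    # First-order propagation of uncertainty for a removal rate
    # R = 1 - Mout/Min. With ~30% relative uncertainty on each load
    # (comparable to the 25-35% quoted for TSS loads), a small removal
    # rate ends up with a very large relative uncertainty.
    def removal_rate_uncertainty(m_in, u_in, m_out, u_out):
        r = 1.0 - m_out / m_in
        ratio = m_out / m_in
        # relative uncertainties of the two loads combine in quadrature
        u_ratio = ratio * math.hypot(u_in / m_in, u_out / m_out)
        return r, u_ratio / r          # removal rate and its relative uncertainty

    r, rel_u = removal_rate_uncertainty(m_in=100.0, u_in=30.0,
                                        m_out=80.0, u_out=24.0)
    print(r, rel_u)   # R = 0.2 with ~170% relative uncertainty
    ```

    This is why the paper recommends reducing sampling uncertainty (more sampled events) before drawing conclusions about interception efficiency.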

  9. RaPToRS Sample Delivery System

    Science.gov (United States)

    Henchen, Robert; Shibata, Kye; Krieger, Michael; Pogozelski, Edward; Padalino, Stephen; Glebov, Vladimir; Sangster, Craig

    2010-11-01

    At various labs (NIF, LLE, NRL), activated material samples are used to measure reaction properties. The Rapid Pneumatic Transport of Radioactive Samples (RaPToRS) system quickly and safely moves these radioactive samples through a closed PVC tube via airflow. The carrier travels from the reaction chamber to the control and analysis station, braking pneumatically at the outlet. A reversible multiplexer routes samples from various locations near the shot chamber to the analysis station; it also allows users to load unactivated samples remotely, without manually approaching the reaction chamber. All elements of the system (pneumatic drivers, flow control valves, optical position sensors, multiplexers, Geiger counters, and release gates at the analysis station) can be controlled manually or automatically using a custom LabVIEW interface. A prototype is currently operating at NRL in Washington, DC. Prospective facilities for RaPToRS systems include LLE and NIF.

  10. Improving the Acquisition and Management of Sample Curation Data

    Science.gov (United States)

    Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan

    2011-01-01

    This paper discusses the current sample documentation processes used during and after a mission, examines the challenges and special considerations involved in designing effective sample curation data systems, and looks at the results of a simulated sample return mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample curation data system being implemented at the NASA Astromaterials Acquisition and Curation department, and discusses how it improves on existing data management systems.

  11. Automatic Control and Data Acquisition System for Combustion Laboratory Applications.

    Science.gov (United States)

    1982-10-01

    Approved for public release; distribution unlimited. The CPU/ROM board includes a 16-bit microprocessor chip which decodes and executes all instructions and controls all data transfers. Within the limited 32K memory space of the HP-85, the ACQDTA program 1) controls devices, 2) acquires photodiode outputs, and 3) stores data to disc.

  12. Run control techniques for the Fermilab DART data acquisition system

    International Nuclear Information System (INIS)

    Oleynik, G.; Engelfried, J.; Mengel, L.; Moore, C.; Pordes, R.; Udumula, L.; Votava, M.; Drunen, E. van; Zioulas, G.

    1996-01-01

    DART is the high speed, Unix based data acquisition system being developed by the Fermilab Computing Division in collaboration with eight High Energy Physics Experiments. This paper describes DART run-control, which implements flexible, distributed, extensible and portable paradigms for the control and monitoring of data acquisition systems. We discuss the unique and interesting aspects of the run-control - why we chose the concepts we did, the benefits we have seen from the choices we made, as well as our experiences in deploying and supporting it for experiments during their commissioning and sub-system testing phases. We emphasize the software and techniques we believe are extensible to future use, and potential future modifications and extensions for those we feel are not. (author)

  13. Run control techniques for the Fermilab DART data acquisition system

    International Nuclear Information System (INIS)

    Oleynik, G.; Engelfried, J.; Mengel, L.

    1995-10-01

    DART is the high speed, Unix based data acquisition system being developed by the Fermilab Computing Division in collaboration with eight High Energy Physics Experiments. This paper describes DART run-control which implements flexible, distributed, extensible and portable paradigms for the control and monitoring of data acquisition systems. We discuss the unique and interesting aspects of the run-control - why we chose the concepts we did, the benefits we have seen from the choices we made, as well as our experiences in deploying and supporting it for experiments during their commissioning and sub-system testing phases. We emphasize the software and techniques we believe are extensible to future use, and potential future modifications and extensions for those we feel are not

  14. Data-Based Predictive Control with Multirate Prediction Step

    Science.gov (United States)

    Barlow, Jonathan S.

    2010-01-01

    Data-based predictive control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. One challenge of MPC is computational requirements increasing with prediction horizon length. This paper develops a closed-loop dynamic output feedback controller that minimizes a multi-step-ahead receding-horizon cost function with multirate prediction step. One result is a reduced influence of prediction horizon and the number of system outputs on the computational requirements of the controller. Another result is an emphasis on portions of the prediction window that are sampled more frequently. A third result is the ability to include more outputs in the feedback path than in the cost function.
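    The core data-driven step, deriving a predictor directly from input-output data rather than from an explicit model, can be illustrated with a toy ARX least-squares fit. This is a stand-in sketch, not the paper's multirate receding-horizon design:

    ```python
    import random

    # Generate input-output data from a hidden first-order system
    # y[k] = a*y[k-1] + b*u[k-1]; the fitting step sees only u and y.
    random.seed(0)
    a_true, b_true = 0.8, 0.5
    N = 200
    u = [random.gauss(0, 1) for _ in range(N)]
    y = [0.0] * N
    for k in range(1, N):
        y[k] = a_true * y[k-1] + b_true * u[k-1]

    # Least-squares fit of [a, b] via the 2x2 normal equations,
    # using only the recorded data (no explicit model of the plant).
    syy = sum(y[k-1] * y[k-1] for k in range(1, N))
    syu = sum(y[k-1] * u[k-1] for k in range(1, N))
    suu = sum(u[k-1] * u[k-1] for k in range(1, N))
    sy1 = sum(y[k-1] * y[k] for k in range(1, N))
    su1 = sum(u[k-1] * y[k] for k in range(1, N))
    det = syy * suu - syu * syu
    a_hat = (sy1 * suu - su1 * syu) / det
    b_hat = (su1 * syy - sy1 * syu) / det
    print(a_hat, b_hat)   # recovers a_true = 0.8, b_true = 0.5 (noise-free data)
    ```

    A predictive controller then rolls this identified predictor forward over the horizon and optimises the control sequence against it, exactly the place where horizon length and output count drive the computational cost the paper reduces.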

  15. A test system and supervisory control and data acquisition application with programmable logic controller for thermoelectric generators

    International Nuclear Information System (INIS)

    Ahiska, Rasit; Mamur, Hayati

    2012-01-01

    Highlights: ► A new TEG test measurement system with a PLC has been carried out. ► A new SCADA program has been written and tested for the test measurement system. ► An operator panel is used to monitor instantaneous TEG data. ► All measurement data of the TEG are aggregated in the system. - Abstract: In this study, a new test measurement system and a supervisory control and data acquisition (SCADA) application with a programmable logic controller (PLC) have been carried out to enable the collection of thermoelectric generator (TEG) data, for the usage of thermoelectric modules as generators. During the production of electric energy from the TEG, the temperatures of its surfaces, the current and voltage values at its output, and the hot and cold flows are measured instantly by the newly established system. All these data are monitored continuously from the computer and recorded by a SCADA program. At the same time, for environments without a computer, an operator panel able to communicate with the PLC has been added for monitoring the instantaneous TEG data. All measurement data of the TEG are aggregated in the new test measurement and SCADA system. The test measurement system has been implemented on a TEG system of about 10 W. Altec-GM-1 brand-coded thermoelectric generators have been examined with the proposed system, and the values of maximum power and TEG efficiency were calculated by the PLC. When the obtained results were compared with the datasheets, the relative error for the maximum power was around 4% and that for the efficiency was below 3%.

  16. Data processing system for real-time control

    International Nuclear Information System (INIS)

    Oasa, K.; Mochizuki, O.; Toyokawa, R.; Yahiro, K.

    1983-01-01

    Real-time control of the large tokamak JT-60 requires various data processings between the diagnostic devices and the control system. These processings require high-speed performance, since they aim to provide the information necessary for feedback control during discharges. The architecture of the system is therefore a hierarchical structure of processors, connected to each other by CAMAC modules and an optical communication network, a 5 MBytes/second CAMAC serial highway. The system has two kinds of intelligence for this purpose. One is the ACM-PU pairs in some torus hall crates, each with a microcomputerized auxiliary controller and a preprocessing unit; the other is the real-time processor, which has a minicomputer and a preprocessing unit. Most of the real-time processing, for example Abel inversion, is characteristic of a particular diagnostic device; such processing is carried out by an ACM-PU pair in the crate dedicated to that device. Some processings, however, compute secondary parameters as functions of primary parameters. A typical example is Zeff, which is a function of Te, Ne and the bremsstrahlung intensity. The real-time processor is equipped for such secondary processings and transfers the results. The preprocessing unit (PU) attached to the ACM and to the real-time processor contains a signal processor, which executes in parallel such functions as move, add and multiply during one 200 nsec micro-instruction cycle. As the experiment progresses, higher-speed processing is required, so the authors developed the PU-X module, which contains multiple signal processors. After a shot, the inter-shot processor, which consists of general-purpose computers, gathers the data into the database and analyzes them, making these processes more effective
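    As an illustration of such a secondary computation, Zeff can be recovered from the bremsstrahlung intensity via the textbook proportionality I_brem ∝ Zeff·Ne²/√Te; the calibration constant and plasma values below are placeholders, not JT-60 data:

    ```python
    import math

    # Secondary-parameter computation sketch: Zeff from Te, Ne and the
    # bremsstrahlung intensity, using I_brem ∝ Zeff * Ne^2 / sqrt(Te).
    # The calibration constant C is an arbitrary placeholder value.
    C = 1.0e-38

    def brem_intensity(zeff, te_ev, ne):
        return C * zeff * ne**2 / math.sqrt(te_ev)

    def zeff_from(intensity, te_ev, ne):
        return intensity * math.sqrt(te_ev) / (C * ne**2)

    # Round trip with synthetic values: a plasma with Zeff = 2.5
    i = brem_intensity(2.5, te_ev=1000.0, ne=5.0e19)
    z = zeff_from(i, 1000.0, 5.0e19)
    print(z)   # recovers 2.5
    ```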

  17. Controller synthesis for negative imaginary systems: a data driven approach

    KAUST Repository

    Mabrok, Mohamed; Petersen, Ian R.

    2016-01-01

    -driven controller synthesis methodology for NI systems is presented. In this approach, measured frequency response data of the plant is used to construct the controller frequency response at every frequency by minimising a cost function. Then, this controller

  18. The IEO Data Center Management System: Tools for quality control, analysis and access marine data

    Science.gov (United States)

    Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei

    2010-05-01

    Since 1994, the Data Centre of the Spanish Oceanographic Institute has developed systems for archiving and quality control of oceanographic data. The work started in the frame of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean data centres began to work on the MEDATLAS project. Over the years, old software modules for MS-DOS were rewritten, improved and migrated to the Windows environment. Oceanographic data quality control now covers not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea-level observations. New powerful routines for analysis and graphic visualization were added. Data presented originally in ASCII format were recently organized in an open-source MySQL database. Nowadays the IEO, as part of the SeaDataNet infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, in order to manage the large and diverse marine data and information originated in Spain by different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are: 1. MEDATLAS format and quality control: QCDAMAR (Quality Control of Marine Data), the main set of tools for working with data presented as text files, including extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data, impossible regional values, ...) and input/output filters; and QCMareas, a set of procedures for the quality control of tide-gauge data according to the standard international Sea Level Observing System, including checks for unexpected anomalies in the time series, interpolation, filtering, and computation of basic statistics and residuals. 2. DAMAR: a relational database (MySQL) designed to
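    A minimal example of the kind of spike test such QC suites apply (the IEO system's actual tests and thresholds are not specified here; this sketch flags a point that deviates from both of its neighbours by more than a tolerance):

    ```python
    # Simple spike check for a vertical profile or time series:
    # a point is flagged if it differs from BOTH neighbours by more
    # than a tolerance, so that an isolated outlier does not cause
    # its valid neighbours to be flagged as well.
    def flag_spikes(series, tol):
        flags = [False] * len(series)
        for i in range(1, len(series) - 1):
            if (abs(series[i] - series[i-1]) > tol and
                    abs(series[i] - series[i+1]) > tol):
                flags[i] = True
        return flags

    temps = [12.1, 12.0, 18.5, 11.9, 11.8]   # 18.5 is an obvious spike
    flags = flag_spikes(temps, tol=2.0)
    print(flags)   # only the spike at index 2 is flagged
    ```

    Real QC chains combine many such tests (duplicates, ship velocity, density inversion, regional ranges) and attach a quality flag to each value rather than deleting it.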

  19. Development of RF non-IQ sampling module for Helium RFQ LLRF system

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Hae-Seong; Ahn, Tae-Sung; Kim, Seong-Gu; Kwon, Hyeok-Jung; Kim, Han-Sung; Song, Young-Gi; Seol, Kyung-Tae; Cho, Yong-Sub [KOMAC, Gyeongju (Korea, Republic of)

    2015-05-15

    KOMAC (Korea Multi-purpose Accelerator Complex) plans to develop a helium irradiation system. This system includes the ion source, LEBT, RFQ and MEBT systems that transport helium particles to the target. In particular, the RFQ (Radio Frequency Quadrupole) system should receive 200 MHz RF within 1% amplitude error stability. To supply stable 200 MHz RF to the RFQ, the low-level radio frequency (LLRF) must be regulated by the control system. The helium RFQ LLRF control system adopted a non-IQ sampling method to sample the analog input RF. The I and Q values are computed from the sampled input data and used to monitor the amplitude and phase of the RF signal. In this paper, the non-IQ sampling logic and the amplitude- and phase-calculation logic of the FPGA are introduced. Using the Xilinx ISE Design Suite, a tool for developing FPGA logic, the non-IQ sampling module and the amplitude and phase computing module were developed. In the future, a PI gain module and a frequency error computing module will be developed.
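    The I/Q computation behind non-IQ sampling can be sketched as follows: take N samples spanning M carrier periods (N and M coprime) and correlate them with cosine and sine to recover amplitude and phase. The frequencies and values below are illustrative, not the KOMAC hardware parameters:

    ```python
    import math

    # Non-IQ sampling sketch: N samples over M carrier periods,
    # I/Q recovered by correlation with cosine/sine references.
    def iq_from_samples(x, M):
        N = len(x)
        I = (2.0 / N) * sum(v * math.cos(2 * math.pi * M * k / N)
                            for k, v in enumerate(x))
        Q = (2.0 / N) * sum(v * math.sin(2 * math.pi * M * k / N)
                            for k, v in enumerate(x))
        return I, Q

    # Synthesize N=16 samples spanning M=3 carrier periods of a tone
    # with amplitude 0.7 and phase 30 degrees, then recover both.
    N, M, amp, phase = 16, 3, 0.7, math.radians(30)
    x = [amp * math.cos(2 * math.pi * M * k / N + phase) for k in range(N)]
    I, Q = iq_from_samples(x, M)
    amp_rec = math.hypot(I, Q)                    # amplitude = sqrt(I^2 + Q^2)
    phase_rec = math.degrees(math.atan2(-Q, I))   # phase from I/Q
    print(amp_rec, phase_rec)   # 0.7 and 30.0
    ```

    An FPGA implementation does the same correlation with fixed-point multiply-accumulate units and a lookup table for the sine/cosine references.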

  20. Preparation and analysis of standardized waste samples for Controlled Ecological Life Support Systems (CELSS)

    Science.gov (United States)

    Carden, J. L.; Browner, R.

    1982-01-01

    The preparation and analysis of standardized waste samples for controlled ecological life support systems (CELSS) are considered. Analysis of samples from wet oxidation experiments, the development of ion chromatographic techniques utilizing conventional high pressure liquid chromatography (HPLC) equipment, and an investigation of techniques for interfacing an ion chromatograph (IC) with an inductively coupled plasma optical emission spectrometer (ICPOES) are discussed.

  1. MINDS: A microcomputer interactive data system for 8086-based controllers

    Science.gov (United States)

    Soeder, J. F.

    1985-01-01

    A microcomputer interactive data system (MINDS) software package for the 8086 family of microcomputers is described. To enhance program understandability and ease of code maintenance, the software is written in PL/M-86, Intel Corporation's high-level system implementation language. The MINDS software is intended to run in residence with real-time digital control software to provide displays of steady-state and transient data. In addition, the MINDS package provides classic monitor capabilities along with extended provisions for debugging an executing control system. The software uses the CP/M-86 operating system developed by Digital Research, Inc., to provide program load capabilities along with a uniform file structure for data and table storage. Finally, a library of input and output subroutines to be used with consoles equipped with PL/M-86 and assembly language is described.

  2. Design of real-time monitoring and control system of 222Rn/220Rn sampling for radon chamber

    International Nuclear Information System (INIS)

    Wu Rongyan; Zhao Xiuliang; Zhang Meiqin; Yu Hong

    2008-01-01

    This paper describes the design of a 222 Rn/ 220 Rn sampling monitoring and control system based on an Intel 51-series single-chip microcomputer. The hardware design involves the selection and use of sensor chips, an A/D conversion chip, a USB interface chip, a keyboard chip, a digital display chip, photoelectric coupling isolation chips, and drive circuit chips for the direct-current pump. The software comprises a personal computer (PC) part and a single-chip microcomputer (SCM) part. Data acquisition and conversion, and flow control of the DC pump, are realized using Visual Basic and assembly language. The program flow charts are given. Furthermore, the stability of the DC pump was improved by means of a PID control algorithm. (authors)
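    A discrete PID loop of the kind used to stabilise the pump flow can be sketched as below; the gains and the first-order pump model are invented for illustration, not taken from the paper:

    ```python
    # Discrete PID regulation of a pump flow rate toward a setpoint.
    # Gains (kp, ki, kd) and the first-order pump response are made up.
    def run_pid(setpoint=2.0, kp=1.5, ki=0.8, kd=0.05, dt=0.1, steps=300):
        flow, integral, prev_err = 0.0, 0.0, setpoint
        for _ in range(steps):
            err = setpoint - flow
            integral += err * dt                      # I term accumulates error
            deriv = (err - prev_err) / dt             # D term on error change
            u = kp * err + ki * integral + kd * deriv
            prev_err = err
            flow += dt * (u - flow)                   # first-order pump response
        return flow

    flow_final = run_pid()
    print(flow_final)   # settles at the 2.0 setpoint
    ```

    The integral term is what removes the steady-state offset a purely proportional pump drive would leave.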

  3. DACS II - A distributed thermal/mechanical loads data acquisition and control system

    Science.gov (United States)

    Zamanzadeh, Behzad; Trover, William F.; Anderson, Karl F.

    1987-01-01

    A distributed data acquisition and control system has been developed for the NASA Flight Loads Research Facility. The DACS II system is composed of seven computer systems and four array processors, configured as a main computer system, three satellite computer systems, and 13 analog input/output systems interconnected through three independent data networks. Up to three independent heating and loading tests can be run concurrently on different test articles, or the entire system can be used on a single large test such as a full-scale hypersonic aircraft. Thermal tests can include up to 512 independent adaptive closed-loop control channels. The control system can apply up to 20 MW of heating to a test specimen while simultaneously applying independent mechanical loads. Each thermal control loop is capable of heating a structure at rates of up to 150 F per second over a temperature range of -300 to +2500 F. Up to 64 independent mechanical load profiles can be commanded along with thermal control. Up to 1280 analog inputs monitor temperature, load, displacement and strain on the test specimens, with real-time data displayed on up to 15 terminals as color plots and tabular data displays. System setup and operation are accomplished with interactive menu-driven displays, with extensive facilities to assist the users in all phases of system operation.

  4. ATLAS Detector Control System Data Viewer

    CERN Document Server

    Tsarouchas, Charilaos; Roe, S; Bitenc, U; Fehling-Kaschek, ML; Winkelmann, S; D’Auria, S; Hoffmann, D; Pisano, O

    2011-01-01

    The ATLAS experiment at CERN is one of the four Large Hadron Collider experiments. The DCS Data Viewer (DDV) is a web interface application that provides access to historical data of ATLAS Detector Control System [1] (DCS) parameters written to the database (DB). It has a modular and flexible design and is structured using a client-server architecture. The server can be operated standalone with a command-line interface to the data, while the client offers a user-friendly, browser-independent interface. The selection of the metadata of DCS parameters is done via a column-tree view or with a powerful search engine. The final visualisation of the data is done using various plugins such as “value over time” charts, data tables, raw ASCII or structured export to ROOT. Excessive access or malicious use of the database is prevented by dedicated protection mechanisms, allowing the exposure of the tool to hundreds of inexperienced users. The metadata selection and data output features can be used separately by XML con...

  5. Physical Samples Linked Data in Action

    Science.gov (United States)

    Ji, P.; Arko, R. A.; Lehnert, K.; Bristol, S.

    2017-12-01

    Most data and metadata related to physical samples currently reside in isolated relational databases built on diverse data models. The challenge of sharing, interchanging and integrating data across these different relational databases motivated us to publish Linked Open Data for collections of physical samples, using Semantic Web technologies including the Resource Description Framework (RDF), the RDF Query Language (SPARQL) and the Web Ontology Language (OWL). In the last few years, we have released four knowledge graphs centred on physical samples: the System for Earth Sample Registration (SESAR), the USGS National Geochemical Database (NGDC), the Ocean Biogeographic Information System (OBIS) and the EarthChem Database. Together, the four knowledge graphs currently contain over 12 million facts (triples) about objects of interest to the geoscience domain. Choosing appropriate domain ontologies to represent the context of the data is at the core of the work. The GeoLink ontology, developed by the EarthCube GeoLink project, was used at the top level to represent common concepts like person, organization, cruise, etc. The physical sample ontology developed by the Interdisciplinary Earth Data Alliance (IEDA) and the Darwin Core vocabulary were used at the second level to describe details of geological samples and biological diversity. We also focused on finding and building the best tool chains to support the whole life cycle of publishing our linked data, including information retrieval, linked-data browsing and data visualization. Currently, Morph, Virtuoso Server, LodView, LodLive and YASGUI are employed for converting, storing, representing and querying data in a knowledge base (RDF triplestore). Persistent digital identifiers are another main focus.
Open Researcher & Contributor IDs (ORCIDs), International Geo Sample Numbers (IGSNs), the Global Research Identifier Database (GRID) and other persistent identifiers were used to link different resources from various graphs with
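The triple/identifier linking described above can be illustrated with a toy in-memory triple store and a SPARQL-style pattern match. All identifiers and predicate names below are fictitious, chosen only to echo the IGSN/ORCID/GeoLink vocabulary in the record:

```python
# Toy triple store: (subject, predicate, object) tuples linking a sample,
# its collector and a cruise via persistent identifiers (all made up).
triples = [
    ("igsn:ABC123", "rdf:type", "ieda:PhysicalSample"),
    ("igsn:ABC123", "geolink:hasCollector", "orcid:0000-0001-2345-6789"),
    ("orcid:0000-0001-2345-6789", "rdf:type", "geolink:Person"),
    ("igsn:ABC123", "geolink:collectedOn", "cruise:KN195-10"),
]

def query(s=None, p=None, o=None):
    """SPARQL-like triple pattern match: None plays the role of a variable."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Who collected sample igsn:ABC123?
collectors = [o for _, _, o in query("igsn:ABC123", "geolink:hasCollector")]
print(collectors)
```

A real deployment would hold such triples in an RDF triplestore (e.g. Virtuoso, as in the record) and run actual SPARQL, but the matching logic is the same idea.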

  6. Tank vapor sampling and analysis data package for tank 241-C-106 waste retrieval sluicing system process test phase III, sampled March 28, 1999

    International Nuclear Information System (INIS)

    LOCKREM, L.L.

    1999-01-01

    This data package presents sampling data and analytical results from the March 28, 1999, vapor sampling of Hanford Site single-shell tank 241-C-106 during active sluicing. Samples were obtained from the 296-C-006 ventilation system stack and ambient air at several locations. Characterization Project Operations (CPO) was responsible for the collection of all SUMMA™ canister samples. The Special Analytical Support (SAS) vapor team was responsible for the collection of all triple sorbent trap (TST), sorbent tube train (STT), polyurethane foam (PUF), and particulate filter samples collected at the 296-C-006 stack. The SAS vapor team used the non-electrical vapor sampling (NEVS) system to collect samples of the air, gases, and vapors from the 296-C-006 stack. The SAS vapor team collected and analyzed these samples for Lockheed Martin Hanford Corporation (LMHC) and Tank Waste Remediation System (TWRS) in accordance with the sampling and analytical requirements specified in the Waste Retrieval Sluicing System Vapor Sampling and Analysis Plan (SAP) for Evaluation of Organic Emissions, Process Test Phase III, HNF-4212, Rev. 0-A, (LMHC, 1999). All samples were stored in a secured Radioactive Materials Area (RMA) until the samples were radiologically released and received by SAS for analysis. The Waste Sampling and Characterization Facility (WSCF) performed the radiological analyses. The samples were received on April 5, 1999.

  7. Robust stability bounds for multi-delay networked control systems

    Science.gov (United States)

    Seitz, Timothy; Yedavalli, Rama K.; Behbahani, Alireza

    2018-04-01

    In this paper, the robust stability of a perturbed linear continuous-time system is examined when controlled using a sampled-data networked control system (NCS) framework. Three new robust stability bounds on the time-invariant perturbations to the original continuous-time plant matrix are presented, guaranteeing stability for the corresponding discrete closed-loop augmented delay-free system (ADFS) with multiple time-varying sensor and actuator delays. The bounds are differentiated from previous work by accounting for the sampled-data nature of the NCS and for separate communication delays for each sensor and actuator, rather than a single delay. This paper therefore expands the knowledge base in multiple-input, multiple-output (MIMO) sampled-data time-delay systems. Bounds are presented for unstructured, semi-structured, and structured perturbations.
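The delay-augmentation idea behind such bounds can be illustrated on a toy scalar example: a one-sample actuator delay is absorbed into an augmented state, and robust stability under a plant perturbation is checked via the spectral radius of the augmented closed-loop matrix. All numbers here are hypothetical, not the paper's bounds:

```python
# Stability of a sampled-data loop with a one-sample actuator delay,
# checked via the spectral radius of the 2x2 augmented closed-loop matrix.
import math

def spectral_radius_2x2(m):
    (m11, m12), (m21, m22) = m
    tr = m11 + m22
    det = m11 * m22 - m12 * m21
    disc = tr * tr - 4 * det
    if disc >= 0:
        r = math.sqrt(disc)
        return max(abs((tr + r) / 2), abs((tr - r) / 2))
    return math.sqrt(det)  # complex-conjugate pair: |lambda| = sqrt(det)

a, b, k = 1.1, 1.0, 0.5    # hypothetical unstable discretized plant and gain

def stable(da):
    # Augmented state z = [x, u_prev]: x+ = (a + da)*x + b*u_prev, u = -k*x.
    return spectral_radius_2x2([[a + da, b], [-k, 0.0]]) < 1.0

# Scan for the largest tested perturbation that keeps every smaller one stable.
bound = max(d / 100 for d in range(100)
            if all(stable(x / 100) for x in range(d + 1)))
print(bound)
```

A numerical scan like this gives an empirical robustness margin; the paper's contribution is analytical bounds that avoid such case-by-case checks.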

  8. The sample handling system for the Mars Icebreaker Life mission: from dirt to data.

    Science.gov (United States)

    Davé, Arwen; Thompson, Sarah J; McKay, Christopher P; Stoker, Carol R; Zacny, Kris; Paulsen, Gale; Mellerowicz, Bolek; Glass, Brian J; Willson, David; Bonaccorsi, Rosalba; Rask, Jon

    2013-04-01

    The Mars Icebreaker Life mission will search for subsurface life on Mars. It consists of three payload elements: a drill to retrieve soil samples from approximately 1 m below the surface, a robotic sample handling system to deliver the sample from the drill to the instruments, and the instruments themselves. This paper will discuss the robotic sample handling system. Collecting samples from ice-rich soils on Mars in search of life presents two challenges: protection of that icy soil--considered a "special region" with respect to planetary protection--from contamination from Earth, and delivery of the icy, sticky soil to spacecraft instruments. We present a sampling device that meets these challenges. We built a prototype system and tested it at martian pressure, drilling into ice-cemented soil, collecting cuttings, and transferring them to the inlet port of the SOLID2 life-detection instrument. The tests successfully demonstrated that the Icebreaker drill, sample handling system, and life-detection instrument can collectively operate in these conditions and produce science data that can be delivered via telemetry--from dirt to data. Our results also demonstrate the feasibility of using an air gap to prevent forward contamination. We define a set of six analog soils for testing over a range of soil cohesion, from loose sand to basalt soil, with angles of repose of 27° and 39°, respectively. Particle size is a key determinant of jamming of mechanical parts by soil particles. Jamming occurs when the clearance between moving parts is equal in size to the most common particle size or equal to three of these particles together. Three particles acting together tend to form bridges and lead to clogging. Our experiments show that rotary-hammer action of the Icebreaker drill influences the particle size, typically reducing particle size by ≈ 100 μm.
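The jamming rule stated in the record — clearances matching one dominant particle, or three particles bridging together — can be captured as a simple screening check (the tolerance and example sizes are illustrative assumptions):

```python
# Screen a mechanism clearance against the 1x / 3x particle-size jamming rule.
def jamming_risk(clearance_um, dominant_particle_um, tol=0.15):
    """Flag clearances within +/-tol of 1x or 3x the dominant particle size."""
    for bridge in (1, 3):  # single particle, or a three-particle bridge
        ratio = clearance_um / (bridge * dominant_particle_um)
        if abs(ratio - 1.0) <= tol:
            return True
    return False

print(jamming_risk(300, 100))   # three 100 um particles can bridge a 300 um gap
print(jamming_risk(500, 100))   # well away from both 1x and 3x: lower risk
```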

  9. Conceptual requirements for large fusion experiment control, data, robotics, and management systems

    International Nuclear Information System (INIS)

    Gaudreau, M.P.J.; Sullivan, J.D.

    1987-05-01

    The conceptual system requirements for the control, data, robotics, and project management (CDRM) system for the next generation of fusion experiments are developed by drawing on the success of the Tara control and data system. The requirements are described in terms of an integrated but separable matrix of well-defined interfaces among the various systems and subsystems. The study stresses modularity, performance, cost effectiveness, and exportability

  10. Real time quality control of meteorological data used in SRP's emergency response system

    International Nuclear Information System (INIS)

    Pendergast, M.M.

    1980-05-01

    The Savannah River Laboratory's WIND minicomputer system allows quick and accurate assessment of an accidental release at the Savannah River Plant using data from eight meteorological towers. The accuracy of the assessment is largely determined by the accuracy of the meteorological data; therefore, quality control is important in an emergency response system. Real-time quality control will be added to the WIND system to identify inaccurate data automatically. Currently, the system averages the measurements from the towers to minimize the influence of inaccurate data on calculations. The computer code used for the real-time quality control has previously been used to identify inaccurate measurements in the archived tower data
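One common way to make cross-tower averaging robust to a failed sensor is to reject readings far from the cross-tower median before averaging. This is a generic quality-control sketch under assumed thresholds, not the WIND system's actual algorithm:

```python
# Robust cross-tower average: reject readings far from the median (in units
# of the median absolute deviation), then average the survivors.
import statistics

def qc_average(readings, max_dev=3.0):
    med = statistics.median(readings)
    mad = statistics.median(abs(r - med) for r in readings) or 1e-9
    good = [r for r in readings if abs(r - med) / mad <= max_dev]
    return sum(good) / len(good), len(readings) - len(good)

# Eight towers report wind speed (m/s); one sensor has failed high.
avg, rejected = qc_average([4.1, 4.3, 4.0, 4.2, 39.0, 4.4, 4.1, 4.2])
print(round(avg, 2), rejected)
```

A naive mean of these eight readings would be pulled to about 8.5 m/s by the single bad sensor; the screened average stays near the true wind speed.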

  11. Development of control and data processing system for CO2 laser interferometer

    International Nuclear Information System (INIS)

    Chiba, Shinichi; Kawano, Yasunori; Tsuchiya, Katsuhiko; Inoue, Akira

    2001-11-01

    A CO2 laser interferometer diagnostic has been operating to measure the central electron density in JT-60U plasmas. We have developed a control and data processing system for the CO2 laser interferometer with flexible functions for data acquisition, data processing and data transfer in accordance with the JT-60U discharge sequence. The system is mainly composed of two UNIX workstations and CAMAC clusters; high reliability was obtained by distributing the data-processing functions between the two workstations. Consequently, the control and data processing system can now routinely provide electron density data immediately after a JT-60U discharge. Real-time feedback control of the electron density in JT-60U has also become available, using a reference density signal from the CO2 laser interferometer. (author)

  12. Data-driven modeling, control and tools for cyber-physical energy systems

    Science.gov (United States)

    Behl, Madhur

    Energy systems are experiencing a gradual but substantial shift from non-interactive, manually controlled operation toward tight integration of both cyber (computation, communications, and control) and physical representations guided by first-principles models, at all scales and levels. Furthermore, peak power reduction programs like demand response (DR) are becoming increasingly important as volatility on the grid continues to increase due to regulation, integration of renewables and extreme weather conditions. In order to shield themselves from the risk of price volatility, end-user electricity consumers must monitor electricity prices and be flexible in the ways they choose to use electricity. This requires the use of control-oriented predictive models of an energy system's dynamics and energy consumption. Such models are needed for understanding and improving the overall energy efficiency and operating costs. However, learning dynamical models using grey/white-box approaches is prohibitively costly and time-consuming, since it often requires significant financial investment in retrofitting the system with several sensors and hiring domain experts to build the model. We present the use of data-driven methods for making model capture easy and efficient for cyber-physical energy systems. We develop Model-IQ, a methodology for analysis of uncertainty propagation for building inverse modeling and controls. Given a grey-box model structure and real input data from a temporary set of sensors, Model-IQ evaluates the effect of the uncertainty propagation from sensor data to model accuracy and to closed-loop control performance. We also developed a statistical method to quantify the bias in the sensor measurement and to determine near-optimal sensor placement and density for accurate data collection for model training and control. Using a real building test-bed, we show how performing an uncertainty analysis can reveal trends about
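The core idea of propagating sensor uncertainty into model accuracy can be sketched with a Monte Carlo experiment: perturb the sensor data with assumed noise, refit a simple grey-box parameter each time, and look at the spread of the identified values. The model and noise levels below are synthetic and are not Model-IQ itself:

```python
# Monte Carlo propagation of sensor noise into an identified model parameter.
import random

random.seed(1)
true_a = 0.9                       # true discrete-time thermal decay pole
data = [25.0]
for _ in range(200):               # simulate a noiseless cooling trace
    data.append(true_a * data[-1])

def fit_pole(series):
    # Least-squares fit of x[k+1] = a * x[k].
    num = sum(x1 * x0 for x0, x1 in zip(series, series[1:]))
    den = sum(x0 * x0 for x0 in series[:-1])
    return num / den

estimates = []
for _ in range(500):               # repeat identification under sensor noise
    noisy = [x + random.gauss(0.0, 0.2) for x in data]
    estimates.append(fit_pole(noisy))

mean_est = sum(estimates) / len(estimates)
spread = max(estimates) - min(estimates)
print(round(mean_est, 3), round(spread, 4))
```

The spread of the estimates quantifies how much the assumed sensor noise degrades model accuracy — the same question Model-IQ asks for building models, with closed-loop performance added on top.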

  13. Actuator digital interface unit (AIU). [control units for space shuttle data system

    Science.gov (United States)

    1973-01-01

    Alternate versions of the actuator interface unit are presented. One alternate is a dual-failure-immune configuration which feeds a look-and-switch dual-failure-immune hydraulic system. The other is a single-failure-immune configuration which feeds a majority-voting hydraulic system. Both systems communicate with the data bus through data terminals dedicated to each user subsystem. Both operational control data and configuration control information are processed in and out of the subsystem via the data terminal, making the actuator interface subsystem self-managing within its failure-immunity capability.

  14. A novel condition for stable nonlinear sampled-data models using higher-order discretized approximations with zero dynamics.

    Science.gov (United States)

    Zeng, Cheng; Liang, Shan; Xiang, Shuwen

    2017-05-01

    Continuous-time systems are usually modelled by ordinary differential equations arising from physical laws. However, before such models can be used in practice, or their data utilized, analyzed or transmitted, they must invariably be discretized. More importantly, for digital control of a continuous-time nonlinear system, a good sampled-data model is required. This paper investigates a new consistency condition which is weaker than previously presented similar results. Moreover, given the stability of the higher-order approximate model with stable zero dynamics, the condition presented stabilizes the exact sampled-data model of the nonlinear system for sufficiently small sampling periods. An insightful interpretation of the obtained results can be made in terms of the stable sampling zero dynamics, and the new consistency condition is, perhaps surprisingly, associated with the relative degree of the nonlinear continuous-time system. Our controller design, based on the higher-order approximate discretized model, extends existing methods, which mainly deal with the Euler approximation. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
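The gap between the Euler approximate model and a higher-order discretization can be seen on a nonlinear system whose exact sampled-data model is known in closed form. Here RK4 stands in for a generic higher-order approximation (the example system is ours, not the paper's):

```python
# Euler vs. a higher-order (RK4) sampled-data model for dx/dt = -x^2,
# whose exact one-step map is x(t+T) = x / (1 + T*x).
def f(x):
    return -x * x

def euler(x, T):
    return x + T * f(x)

def rk4(x, T):
    k1 = f(x)
    k2 = f(x + T * k1 / 2)
    k3 = f(x + T * k2 / 2)
    k4 = f(x + T * k3)
    return x + T * (k1 + 2 * k2 + 2 * k3 + k4) / 6

x0, T = 1.0, 0.1
exact = x0 / (1 + T * x0)
err_euler = abs(euler(x0, T) - exact)
err_rk4 = abs(rk4(x0, T) - exact)
print(err_euler, err_rk4)   # the higher-order model is far more accurate
```

The one-step Euler error is O(T^2) while the RK4 error is O(T^5), which is why higher-order approximate models give a much better stand-in for the exact sampled-data model at practical sampling periods.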

  15. On the optimal sampling of bandpass measurement signals through data acquisition systems

    International Nuclear Information System (INIS)

    Angrisani, L; Vadursi, M

    2008-01-01

    Data acquisition systems (DAS) play a fundamental role in many modern measurement solutions. One of the parameters characterizing a DAS is its maximum sample rate, which imposes constraints on the signals that can be digitized free of aliasing. Bandpass sampling theory singles out separated ranges of admissible sample rates, which can be significantly lower than the carrier frequency. But how should the most convenient sample rate be chosen for the purpose at hand? The paper proposes a method for the automatic selection of the optimal sample rate in measurement applications involving bandpass signals; the effects of sample clock instability and limited resolution are also taken into account. The method allows the user to choose the location of the spectral replicas of the sampled signal in terms of normalized frequency, and the minimum guard band between replicas, thus introducing a feature that no DAS currently available on the market seems to offer. A number of experimental tests on bandpass digitally modulated signals are carried out to assess the agreement of the obtained central frequency with the expected one
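The "separated ranges of admissible sample rates" follow from the classic bandpass-sampling condition 2·f_H/n ≤ f_s ≤ 2·f_L/(n−1) for integer n up to ⌊f_H/B⌋. A short enumerator makes the ranges concrete (the 70–80 MHz example band is ours):

```python
# Enumerate the alias-free sample-rate ranges for a bandpass signal [f_L, f_H].
def valid_sample_rates(f_low, f_high):
    bw = f_high - f_low
    ranges = []
    for n in range(1, int(f_high // bw) + 1):
        lo = 2 * f_high / n                              # lower edge of range n
        hi = float("inf") if n == 1 else 2 * f_low / (n - 1)  # upper edge
        if lo <= hi:
            ranges.append((lo, hi))
    return ranges

# A 10 MHz band centred at 75 MHz can be sampled far below 2*f_H = 160 MHz.
for lo, hi in valid_sample_rates(70e6, 80e6):
    print(lo / 1e6, hi / 1e6)
```

The method in the record then picks, within these ranges, the rate that places the spectral replicas at the desired normalized frequency with the requested guard band.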

  16. On the Protection of Personal Data in the Access Control System

    Directory of Open Access Journals (Sweden)

    A. P. Durakovskiy

    2012-03-01

    Full Text Available The aim is to substantiate the qualification of access control systems (ACS) as information systems for personal data (ISPDn). Applications: physical protection systems for facilities.

  17. Data-Driven Control of Refrigeration System

    DEFF Research Database (Denmark)

    Vinther, Kasper

    Refrigeration is used in a wide range of applications, e.g., for storage of food at low temperatures to prolong shelf life and in air conditioning for occupancy comfort. The main focus of this thesis is control of supermarket refrigeration systems, a very competitive market. In this thesis, a novel maximum slope-seeking (MSS) control method is developed. This has resulted in a control implementation which has successfully controlled the evaporator superheat in four widely different refrigeration test systems, using only the pressure and temperature sensors traditionally available. The method utilizes the qualitative nonlinearity in the system and harmonic analysis of a perturbation signal to reach an unknown, but suitable, operating point. Another important control task in refrigeration systems is to maintain the temperature of the refrigerated space or foodstuff within acceptable limits.

  18. Getting DNA copy numbers without control samples.

    Science.gov (United States)

    Ortiz-Estevez, Maria; Aramburu, Ander; Rubio, Angel

    2012-08-16

    The selection of the reference used to scale the data in a copy number analysis is of paramount importance for achieving accurate estimates. Usually this reference is generated using control samples included in the study. However, control samples are not always available, and in these cases an artificial reference must be created. A proper generation of this signal is crucial in terms of both noise and bias. We propose NSA (Normality Search Algorithm), a scaling method that works with and without control samples. It is based on the assumption that genomic regions enriched in SNPs with identical copy numbers in both alleles are likely to be normal. These normal regions are predicted for each sample individually and used to calculate the final reference signal. NSA can be applied to any CN data regardless of the microarray technology and preprocessing method. It also finds an optimal weighting of the samples, minimizing possible batch effects. Five human datasets (a subset of HapMap samples, and Glioblastoma Multiforme (GBM), Ovarian, Prostate and Lung Cancer experiments) have been analyzed. It is shown that, using only tumoral samples, NSA is able to remove the bias in the copy number estimation, to reduce the noise and, therefore, to increase the ability to detect copy number aberrations (CNAs). These improvements allow NSA to detect recurrent aberrations more accurately than other state-of-the-art methods. NSA provides a robust and accurate reference for scaling probe signal data to CN values without the need for control samples. It minimizes the problems of bias, noise and batch effects in the estimation of CNs. Therefore, the NSA scaling approach helps to detect recurrent CNAs better than current methods. The automatic selection of references makes it useful for bulk analysis of many GEO or ArrayExpress experiments without the need to develop a parser to find the normal samples or possible batches within the data.
The method is available in the open-source R package
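The core scaling step — forcing probes in predicted-normal regions to an average copy number of 2 and applying the same shift genome-wide — can be sketched as follows. This is a simplified illustration in the spirit of NSA, with synthetic data; it is not the published algorithm:

```python
# Reference-free scaling: shift log2 signals so that predicted-normal probes
# average to log2(CN = 2) = 1, then convert all probes to copy numbers.
def scale_to_copy_number(log_ratios, normal_mask):
    normal = [lr for lr, ok in zip(log_ratios, normal_mask) if ok]
    shift = 1.0 - sum(normal) / len(normal)   # move normal mean onto CN = 2
    return [2 ** (lr + shift) for lr in log_ratios]

# Synthetic tumoral sample: biased diploid baseline plus an amplified segment.
lrs = [0.2, 0.2, 0.2, 0.2, 1.2, 1.2]            # raw, biased log2 signals
mask = [True, True, True, True, False, False]    # probes predicted normal
cns = scale_to_copy_number(lrs, mask)
print([round(c, 2) for c in cns])
```

Without the predicted-normal mask, the global bias of 0.2 would shift every estimated copy number; anchoring the scale on normal regions removes it.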

  19. Digital control and data acquisition system for the QUIET experiment

    International Nuclear Information System (INIS)

    Bogdan, Mircea; Kapner, Dan; Samtleben, Dorothea; Vanderlinde, Keith

    2007-01-01

    We present the Digital Control and Data Acquisition System (DCDAQ) for Phase I of the Q/U Imaging Experiment (QUIET): arrays of 91 W-band and 19 Q-band receivers placed on 1.4 m telescopes at Chajnantor, Chile, to measure the polarization of the cosmic microwave background. QUIET uses custom-built electronics boards that control and monitor its polarimeters. Each of these boards is digitally addressable, so that the DCDAQ can set and monitor any of the 1600 biases needed to operate the 91 receivers. The DCDAQ consists of a controller and up to 13 custom-made 32-channel ADC cards. Local FPGAs allow real-time data processing for each channel. This immediate data reduction is necessary, as it is planned to scale this technology beyond Phase I. The DCDAQ system is implemented with this future in mind and can easily be scaled to operate 1000 receivers

  20. A remote data acquisition and control system for Moessbauer spectroscopy

    International Nuclear Information System (INIS)

    Zhou Qingguo; Wang Li; Wang Yanlong; Zhao Hong; Zhou Rongjie

    2004-01-01

    A remote data acquisition and control system for Moessbauer spectroscopy based on an embedded computer running the Mini Real-Time Linux operating system is presented. The system can be accessed through an Internet browser or through a Java application program designed especially for this purpose, so controlling it is simple and the interface is user-friendly. The components of the system are readily available, so it can be built in most laboratories. We have successfully designed, developed and deployed this system at the Key Laboratory for Magnetism and Magnetic Materials of the Ministry of Education, Lanzhou University, PR China

  1. Utilization of Integrated Process Control, Data Capture, and Data Analysis in Construction of Accelerator Systems

    International Nuclear Information System (INIS)

    Bonnie Madre; Charles Reece; Joseph Ozelis; Valerie Bookwalter

    2003-01-01

    Jefferson Lab has developed a web-based system that integrates commercial database, data analysis, document archiving and retrieval, and user interface software, into a coherent knowledge management product (Pansophy). This product provides important tools for the successful pursuit of major projects such as accelerator system development and construction, by offering elements of process and procedure control, data capture and review, and data mining and analysis. After a period of initial development, Pansophy is now being used in Jefferson Lab's SNS superconducting linac construction effort, as a means for structuring and implementing the QA program, for process control and tracking, and for cryomodule test data capture and presentation/analysis. Development of Pansophy is continuing, in particular data queries and analysis functions that are the cornerstone of its utility

  2. Control and data acquisition systems for the Fermi Elettra experimental stations

    International Nuclear Information System (INIS)

    Borghes, R.; Chenda, V.; Curri, A.; Gaio, G.; Kourousias, G.; Lonza, M.; Passos, G.; Passuello, R.; Pivetta, L.; Prica, M.; Pugliese, R.; Strangolino, G.

    2012-01-01

    FERMI-Elettra is a single-pass Free Electron Laser (FEL) user-facility covering the wavelength range from 100 nm to 4 nm. The facility is located in Trieste, Italy, nearby the third-generation synchrotron light source Elettra. Three experimental stations, dedicated to different scientific areas, have been installed in 2011: Low Density Matter (LDM), Elastic and Inelastic Scattering (EIS) and Diffraction and Projection Imaging (DiProI). The experiment control and data acquisition system is the natural extension of the machine control system. It integrates a shot-by-shot data acquisition framework with a centralized data storage and analysis system. Low-level applications for data acquisition and online processing have been developed using the Tango framework on Linux platforms. High-level experimental applications can be developed on both Linux and Windows platforms using C/C++, Python, LabView, IDL or Matlab. The Elettra scientific computing portal allows remote access to the experiment and to the data storage system. (authors)

  3. Real-time control and data-acquisition system for high-energy neutral-beam injectors

    International Nuclear Information System (INIS)

    Glad, A.S.; Jacobson, V.

    1981-12-01

    The need for a real-time control system and a data acquisition, processing and archiving system operating in parallel on the same computer became a requirement on General Atomic's Doublet III fusion energy project with the addition of high-energy neutral beam injectors. The data acquisition, processing and archiving system is driven by external events and is sequenced through each experimental shot utilizing ModComp's intertask message service. This system processes, archives and displays on operator console CRTs all physics diagnostic data related to the neutral beam injectors, such as temperature, beam alignment, etc. The real-time control system is database driven and provides periodic monitoring and control of the numerous dynamic subsystems of the neutral beam injectors, such as power supplies, timing, water cooling, etc

  4. A novel atmospheric tritium sampling system

    Science.gov (United States)

    Qin, Lailai; Xia, Zhenghai; Gu, Shaozhong; Zhang, Dongxun; Bao, Guangliang; Han, Xingbo; Ma, Yuhua; Deng, Ke; Liu, Jiayu; Zhang, Qin; Ma, Zhaowei; Yang, Guo; Liu, Wei; Liu, Guimin

    2018-06-01

    The health hazard of tritium depends on its chemical form, so sampling different chemical forms of tritium simultaneously is significant. Here a novel atmospheric tritium sampling system (TS-212) was developed to collect tritiated water (HTO), tritiated hydrogen (HT) and tritiated methane (CH3T) simultaneously. It consists of an air inlet system, three parallel-connected sampling channels, a hydrogen supply module, a methane supply module and a remote control system. It works at air flow rates of 1 L/min to 5 L/min, with the catalyst furnace at 200 °C for HT sampling and 400 °C for CH3T sampling. Conversion rates of both HT and CH3T to HTO were larger than 99%. The collection efficiency of the two-stage trap sets for HTO was larger than 96% over a 12 h working time without blocking. Therefore, the overall collection efficiencies of TS-212 are larger than 95% for tritium in its different chemical forms in the environment. In addition, the remote control system makes sampling more intelligent, reducing the operator's workload. Based on the performance parameters described above, the TS-212 can be used to sample atmospheric tritium in its different chemical forms.
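The quoted figures are internally consistent: the overall efficiency is the product of the conversion and trapping stages, which the lower bounds above already guarantee exceeds 95%:

```python
# Consistency check of the stated efficiency figures for HT/CH3T sampling.
conversion = 0.99   # HT or CH3T -> HTO conversion efficiency, lower bound
trapping = 0.96     # two-stage HTO trap collection efficiency, lower bound
overall = conversion * trapping
print(round(overall, 4))   # product of the two lower bounds
```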

  5. Computer system requirements specification for 101-SY hydrogen mitigation test project data acquisition and control system (DACS-1)

    International Nuclear Information System (INIS)

    McNeece, S.G.; Truitt, R.W.

    1994-01-01

    The system requirements specification for the SY-101 hydrogen mitigation test project (HMTP) data acquisition and control system (DACS-1) documents the system requirements for the DACS-1 project. The purpose of the DACS is to provide data acquisition and control capabilities for the hydrogen mitigation testing of Tank SY-101. Mitigation testing uses a pump immersed in the waste, directed at varying angles and operated at different speeds and time durations. Tank and supporting instrumentation is brought into the DACS to monitor the status of the tank and to provide information on the effectiveness of the mitigation test. Instrumentation is also provided for closed-loop control of the pump operation. The DACS is also capable of being expanded to control and monitor other mitigation testing. The intended audience for the computer system requirements specification includes the SY-101 hydrogen mitigation test data acquisition and control system designers: analysts, programmers, instrument engineers, operators, maintainers. It is intended for the data users: tank farm operations, mitigation test engineers, the Test Review Group (TRG), data management support staff, data analysts, Hanford data stewards, and external reviewers

  6. Supervisory control and data acquisition system development for superconducting current feeder system of SST-1

    International Nuclear Information System (INIS)

    Patel, R.; Mahesuria, G.; Gupta, N.C.; Sonara, D.; Panchal, R.; Panchal, P.; Tanna, V.L.; Pradhan, S.

    2014-01-01

    The Current Feeders System (CFS) is essentially an optimized bridge between the power supply at room temperature and the Super Conducting Magnet System (SCMS) of the SST-1 machine at 4.5 K. The CFS is a complex electrical and cryogenic network which consists of ten pairs of 10 kA helium vapor-cooled conventional current leads (VCCLs), superconducting (SC) current feeders and associated components. For safe and reliable operation, the CFS is equipped with instruments measuring physical process parameters such as flow, pressure, temperature, level, vacuum and voltage taps, and with final control elements such as control valves, heaters and vacuum pumps. A PLC program was developed in ladder language for acquiring and controlling the process parameters. Independent SCADA applications were developed in Wonderware InTouch software for data communication from the PLC, the front-end Graphical User Interface (GUI), the auto/manual interface, real-time trends, history trends, and event and alarm pages. Time-synchronized communication was established between the CFS control system and an Industrial SQL Server (InSQL) Historian for centralized storage of CFS process parameters, which in turn provides the CFS process data to the SST-1 central control room. The SCADA-based data acquisition and data retrieval system was found to be satisfactory during the recent SST-1 cool-down experiment. This paper describes the SCADA and PLC application development and their communication with the InSQL server. (author)

  7. Sample size determinations for group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms.

    Science.gov (United States)

    Heo, Moonseong; Litwin, Alain H; Blackstock, Oni; Kim, Namhee; Arnsten, Julia H

    2017-02-01

    We derived sample size formulae for detecting main effects in group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms. Such designs are necessary when experimental interventions need to be administered to groups of subjects whereas control conditions need to be administered to individual subjects. This type of trial, often referred to as a partially nested or partially clustered design, has been implemented for management of chronic diseases such as diabetes and is beginning to emerge more commonly in wider clinical settings. Depending on the research setting, the level of hierarchy of data structure for the experimental arm can be three or two, whereas that for the control arm is two or one. Such different levels of data hierarchy assume correlation structures of outcomes that are different between arms, regardless of whether research settings require two or three level data structure for the experimental arm. Therefore, the different correlations should be taken into account for statistical modeling and for sample size determinations. To this end, we considered mixed-effects linear models with different correlation structures between experimental and control arms to theoretically derive and empirically validate the sample size formulae with simulation studies.
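A simplified version of the idea can be computed with the standard design-effect inflation: size the trial as if individually randomized, then inflate only the clustered (experimental) arm by 1 + (m − 1)·ICC. This is an illustrative approximation, not the exact mixed-model formulae derived in the paper:

```python
# Per-arm sample sizes for a partially clustered two-arm trial: the control
# arm is individually treated; the experimental arm is delivered in groups
# of size m with intraclass correlation icc, so only it gets a design effect.
import math
from statistics import NormalDist

def partially_nested_n(delta, sd, icc, m, alpha=0.05, power=0.80):
    z = NormalDist().inv_cdf
    n_flat = 2 * (sd / delta) ** 2 * (z(1 - alpha / 2) + z(power)) ** 2
    n_control = math.ceil(n_flat)
    n_experimental = math.ceil(n_flat * (1 + (m - 1) * icc))  # design effect
    return n_experimental, n_control

# Detect a 0.5 SD effect with groups of 8 and ICC 0.05 in the treated arm.
n_exp, n_ctl = partially_nested_n(delta=0.5, sd=1.0, icc=0.05, m=8)
print(n_exp, n_ctl)
```

The asymmetry of the two arms is the point: ignoring the experimental arm's clustering would undersize it by the full design-effect factor.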

  8. Adaptive intrusion data system (AIDS) software routines

    International Nuclear Information System (INIS)

    Corlis, N.E.

    1980-07-01

    An Adaptive Intrusion Data System (AIDS) was developed to collect information from intrusion alarm sensors as part of an evaluation system to improve sensor performance. AIDS is a unique digital data-compression, storage, and formatting system; it also incorporates a capability for video selection and recording for assessment of the sensors monitored by the system. The system is software reprogrammable to numerous configurations that may be used for the collection of environmental, bilevel, analog, and video data. This report describes the software routines that control the different AIDS data-collection modes, the diagnostic programs to test the operating hardware, and the data format. Sample data printouts are also included

  9. Conceptual Design Approach to Implementing Hardware-based Security Controls in Data Communication Systems

    International Nuclear Information System (INIS)

    Ibrahim, Ahmad Salah; Jung, Jaecheon

    2016-01-01

    In the Korean Advanced Power Reactor (APR1400), the safety control systems network is electrically isolated and physically separated from the non-safety systems data network. Unidirectional gateways, comprising data-diode fiber-optic cabling and computer-based servers, transmit the plant safety-critical parameters to the main control room (MCR) for control and monitoring. Data transmission is one-way only, from safety to non-safety; reverse communication is blocked so that the safety systems network is protected from potential cyberattacks or intrusions from the non-safety side. Most commercial off-the-shelf (COTS) security devices are software-based solutions that require operating systems and processors to perform their functions. Field Programmable Gate Arrays (FPGAs) offer digital hardware solutions for implementing security controls such as data packet filtering and deep packet inspection (DPI). This paper presents a conceptual design for implementing hardware-based network security controls to maintain the availability of the gateway servers. The proposed design aims at utilizing the hardware-based capabilities of FPGAs together with the filtering and DPI functions of COTS software-based firewalls and intrusion detection and prevention systems (IDPS). It implements a network security perimeter between the DCN-I zone and the gateway servers zone, with security control functions protecting the gateway servers from potential DoS attacks that could affect data availability and integrity

  10. Conceptual Design Approach to Implementing Hardware-based Security Controls in Data Communication Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ibrahim, Ahmad Salah; Jung, Jaecheon [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2016-10-15

    In the Korean Advanced Power Reactor (APR1400), the safety control systems network is electrically isolated and physically separated from the non-safety systems data network. Unidirectional gateways, comprising data-diode fiber-optic cabling and computer-based servers, transmit the plant safety-critical parameters to the main control room (MCR) for control and monitoring. Data transmission is one-way only, from safety to non-safety; reverse communication is blocked so that the safety systems network is protected from potential cyberattacks or intrusions from the non-safety side. Most commercial off-the-shelf (COTS) security devices are software-based solutions that require operating systems and processors to perform their functions. Field Programmable Gate Arrays (FPGAs) offer digital hardware solutions for implementing security controls such as data packet filtering and deep packet inspection (DPI). This paper presents a conceptual design for implementing hardware-based network security controls to maintain the availability of the gateway servers. The proposed design aims at utilizing the hardware-based capabilities of FPGAs together with the filtering and DPI functions of COTS software-based firewalls and intrusion detection and prevention systems (IDPS). It implements a network security perimeter between the DCN-I zone and the gateway servers zone, with security control functions protecting the gateway servers from potential DoS attacks that could affect data availability and integrity.
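    The packet-filtering function such a perimeter would enforce can be illustrated in software (a hedged sketch: the field names and whitelist rule are invented, and a real FPGA design would express this logic in an HDL rather than Python):

```python
# One-way whitelist: only safety -> non-safety flows are ever forwarded
ALLOWED = {("safety-gw", "mcr-server", 502)}   # (src, dst, port) - illustrative

def forward(packet):
    """Return True if the packet matches a whitelist rule; everything
    else, including any reverse-direction traffic, is dropped."""
    return (packet["src"], packet["dst"], packet["port"]) in ALLOWED
```

    Note that the reverse flow is rejected not by a special rule but simply by its absence from the whitelist, which mirrors the default-deny posture of the unidirectional gateway.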

  11. Operations and maintenance manual for the LDUA supervisory control and data acquisition system (LDUA System 4200) and control network (LDUA System 4400)

    International Nuclear Information System (INIS)

    Barnes, G.A.

    1998-01-01

    This document defines the requirements applicable to the operation, maintenance and storage of the Supervisory Control and Data Acquisition System (SCADAS) and Control Network in support of the Light Duty Utility Arm (LDUA) operations

  12. Control and data processing systems in UK nuclear power plant and nuclear facilities

    International Nuclear Information System (INIS)

    Baldwin, J.A.; Wall, D.N.

    1997-01-01

    This note identifies some of the data processing and control systems in UK nuclear power plant, with emphasis on direct digital control systems and sequence control. A brief indication is also given of some of the associated research activities on control systems and software. (author). 2 figs

  13. Control and data processing systems in UK nuclear power plant and nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Baldwin, J A; Wall, D N [AEA Technology, Winfrith, Dorchester (United Kingdom)

    1997-07-01

    This note identifies some of the data processing and control systems in UK nuclear power plant, with emphasis on direct digital control systems and sequence control. A brief indication is also given of some of the associated research activities on control systems and software. (author). 2 figs.

  14. System for Earth Sample Registration SESAR: Services for IGSN Registration and Sample Metadata Management

    Science.gov (United States)

    Chan, S.; Lehnert, K. A.; Coleman, R. J.

    2011-12-01

    SESAR, the System for Earth Sample Registration, is an online registry for physical samples collected for Earth and environmental studies. SESAR generates and administers the International Geo Sample Number (IGSN), a unique identifier for samples that is dramatically advancing interoperability amongst information systems for sample-based data. SESAR was developed to provide the complete range of registry services, including definition of IGSN syntax and metadata profiles, registration and validation of name spaces requested by users, tools for users to submit and manage sample metadata, validation of submitted metadata, generation and validation of the unique identifiers, archiving of sample metadata, and public or private access to the sample metadata catalog. With the development of SESAR v3, we placed particular emphasis on creating enhanced tools that make metadata submission easier and more efficient for users, and that provide superior functionality for users to manage the metadata of their samples in their private workspace, MySESAR. For example, SESAR v3 includes a module where users can generate custom spreadsheet templates to enter metadata for their samples, then upload these templates online for sample registration. Once the content of the template is uploaded, it is displayed online in an editable grid format, and validation rules are executed in real time on the grid data to ensure data integrity. Other new features of SESAR v3 include the capability to transfer ownership of samples to other SESAR users, the ability to upload and store images and other files in a sample metadata profile, and the tracking of changes to sample metadata profiles. In the next version of SESAR (v3.5), we will further improve the discovery, sharing, and registration of samples. For example, we are developing a more comprehensive suite of web services that will allow discovery and registration access to SESAR from external systems. Both batch and individual registrations will be possible

  15. System Identification of a Non-Uniformly Sampled Multi-Rate System in Aluminium Electrolysis Cells

    Directory of Open Access Journals (Sweden)

    Håkon Viumdal

    2014-07-01

    Full Text Available Standard system identification algorithms are usually designed to generate mathematical models with equidistant sampling instants that are equal for both input and output variables. Unfortunately, real industrial data sets are often disrupted by missing samples, variations of sampling rates between the different variables (also known as multi-rate systems), and intermittent measurements. In industries with event-based maintenance or manual operational measures, intermittent measurements are performed, leading to uneven sampling rates. Such is the case with aluminium smelters, where in addition the materials fed into the cell create even more irregularity in sampling, and both measurements and feeding are mostly manually controlled. A simplified simulation of the metal level in an aluminium electrolysis cell is performed based on mass balance considerations. System identification methods based on Prediction Error Methods (PEM), such as Ordinary Least Squares (OLS), and the sub-space method combined Deterministic and Stochastic system identification and Realization (DSR) and its variants, are applied to the model of a single electrolysis cell as found in aluminium smelters. Aliasing phenomena due to large sampling intervals can be crucial to avoiding unsuitable models, but with knowledge about the system dynamics it is easier to optimize the sampling performance and hence achieve successful models. The results of the simulation studies of molten aluminium height in the cells using the various algorithms tally well with the synthetic data sets used. System identification on a smaller data set from a real plant is also implemented in this work. Finally, some concrete suggestions are made for using these models in the smelters.
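    The PEM/OLS idea mentioned above can be sketched for the simplest case of a first-order ARX model, y[k] = a·y[k-1] + b·u[k-1], estimated by ordinary least squares on equidistant input-output samples (the model, parameter values and signals below are invented for this illustration, not taken from the paper):

```python
import numpy as np

# Simulate a first-order plant y[k] = a*y[k-1] + b*u[k-1] + noise
rng = np.random.default_rng(0)
a_true, b_true = 0.9, 0.5
u = rng.standard_normal(500)
y = np.zeros(500)
for k in range(1, 500):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.standard_normal()

# OLS estimate: regress y[k] on the lagged pair (y[k-1], u[k-1])
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print(theta)  # estimates close to (0.9, 0.5)
```

    With missing or multi-rate samples, as in the smelter data, the regressor rows containing unavailable lags would simply be dropped or interpolated before the least-squares step, which is where the sampling irregularities discussed above start to matter.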

  16. Human Factors and Data Fusion as Part of Control Systems Resilience

    Energy Technology Data Exchange (ETDEWEB)

    David I. Gertman

    2009-05-01

    Human performance and human decision making are counted upon as a crucial aspect of overall system resilience. Advanced control systems have the potential to provide operators and asset owners a wide range of data, deployed at different levels, that can be used to support operator situation awareness. However, the sheer amount of data available can make it challenging for operators to assimilate information and respond appropriately. This paper reviews some of the challenges and issues associated with providing operators with actionable state awareness and argues for the overarching importance of integrating human factors as part of intelligent control systems design and implementation. It is argued that system resilience is improved by implementing human factors in operations and maintenance. This paper also introduces issues associated with resilience and data fusion and highlights areas in which human factors, including field studies, hold promise.

  17. Normalization of RNA-seq data using factor analysis of control genes or samples

    Science.gov (United States)

    Risso, Davide; Ngai, John; Speed, Terence P.; Dudoit, Sandrine

    2015-01-01

    Normalization of RNA-seq data has proven essential to ensure accurate inference of expression levels. Here we show that usual normalization approaches mostly account for sequencing depth and fail to correct for library preparation and other more-complex unwanted effects. We evaluate the performance of the External RNA Control Consortium (ERCC) spike-in controls and investigate the possibility of using them directly for normalization. We show that the spike-ins are not reliable enough to be used in standard global-scaling or regression-based normalization procedures. We propose a normalization strategy, remove unwanted variation (RUV), that adjusts for nuisance technical effects by performing factor analysis on suitable sets of control genes (e.g., ERCC spike-ins) or samples (e.g., replicate libraries). Our approach leads to more-accurate estimates of expression fold-changes and tests of differential expression compared to state-of-the-art normalization methods. In particular, RUV promises to be valuable for large collaborative projects involving multiple labs, technicians, and/or platforms. PMID:25150836
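    The core of the RUV idea summarized above -- estimate unwanted factors from control genes, then remove them -- can be sketched in a few lines. This is an illustrative simplification with invented names, not the published RUV implementation (which works on count models and negative controls with more care):

```python
import numpy as np

def ruv_sketch(expr, control_idx, k=1):
    """Remove k unwanted factors estimated from control genes.

    expr: log-scale expression matrix, samples x genes.
    control_idx: indices of genes assumed unaffected by the biology
    of interest (e.g. spike-ins), so their variation is nuisance.
    """
    Y = expr - expr.mean(axis=0)                  # center each gene
    # Factor analysis (via SVD) restricted to the control genes
    U, s, _ = np.linalg.svd(Y[:, control_idx], full_matrices=False)
    W = U[:, :k] * s[:k]                          # unwanted factors, samples x k
    # Regress every gene on W and subtract the fitted nuisance signal
    coef, *_ = np.linalg.lstsq(W, Y, rcond=None)
    return Y - W @ coef
```

    The key assumption is exactly the one the abstract states: the control genes carry only unwanted variation, so factors learned from them can safely be projected out of all genes.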

  18. Data bank for a data retrieval system

    Energy Technology Data Exchange (ETDEWEB)

    Vernikovskii, V V

    1980-01-01

    The data bank of a computerized data retrieval system is an organic and constituent part of the system; the level of technology and the performance of the retrieval system as a whole depend on how the data bank is designed and operated. The data bank integrates storage of the entire data set and implements a feasible storage mode for the system dictionary, computer processing procedures, user forms, system archives and other service information. The functions of the data bank are computerized by means of a database control system. The retrieval system data bank described here was designed for the OKA database control system; the selection and feasibility evaluation of the OKA database control system were, in turn, one stage in the design of the system as a whole. The OKA database control system has been used to computerize the data retrieval functions of the system and to maintain the system data bank in an updated state.

  19. Overview of the data acquisition and control system for plasma diagnostics on MFTF-B

    International Nuclear Information System (INIS)

    Wyman, R.H.; Deadrick, F.J.; Lau, N.H.; Nelson, B.C.; Preckshot, G.G.; Throop, A.L.

    1983-01-01

    For MFTF-B, the plasma diagnostics system is expected to grow from a collection of 12 types of diagnostic instruments, initially producing about 1 Megabyte of data per shot, to an expanded set of 22 diagnostics producing about 8 Megabytes of data per shot. To control these diagnostics and acquire and process the data, a system design has been developed that uses an architecture similar to that of the supervisory/local-control computer system used to control the other MFTF-B subsystems. This paper presents an overview of the hardware and software that will control and acquire data from the plasma diagnostics system. Data flow paths from the instruments, through processing, and into final archived storage are described. A discussion of anticipated data rates, including anticipated software overhead at various points of the system, is included, along with the identification of possible bottlenecks. A methodology for processing the data is described, along with the approach to handling the planned growth of the diagnostic system. Motivations are presented for the various design choices that have been made

  20. Distributed Wireless Data Acquisition System with Synchronized Data Flow

    CERN Document Server

    Astakhova, N V; Dikoussar, N D; Eremin, G I; Gerasimov, A V; Ivanov, A I; Kryukov, Yu S; Mazny, N G; Ryabchun, O V; Salamatin, I M

    2006-01-01

    New methods are devised to provide succession of computer codes under changes of the class of problems and to integrate the drivers of special-purpose devices into the application. The scheme and methods worked out for constructing automation systems are used to develop a distributed wireless system intended for registration of the characteristics of pulse processes with a synchronized data flow transmitted over a radio channel. With equipment sampling at 20 kHz, a synchronization accuracy of up to ±50 μs was achieved; modification of part of the equipment (the sampling frequency) permits improving the accuracy to 0.1 μs. The obtained results can be applied to develop systems for monitoring various objects, as well as automation systems for experiments and automated process control systems.

  1. Candidate Mission from Planet Earth control and data delivery system architecture

    Science.gov (United States)

    Shapiro, Phillip; Weinstein, Frank C.; Hei, Donald J., Jr.; Todd, Jacqueline

    1992-01-01

    Using a structured, experience-based approach, Goddard Space Flight Center (GSFC) has assessed the generic functional requirements for a lunar mission control and data delivery (CDD) system. This analysis was based on lunar mission requirements outlined in GSFC-developed user traffic models. The CDD system will facilitate data transportation among user elements, element operations, and user teams by providing functions such as data management, fault isolation, fault correction, and link acquisition. The CDD system for the lunar missions must not only satisfy lunar requirements but also facilitate and provide early development of data system technologies for Mars. Reuse and evolution of existing data systems can help to maximize system reliability and minimize cost. This paper presents a set of existing and currently planned NASA data systems that provide the basic functionality. Reuse of such systems can have an impact on mission design and significantly reduce CDD and other system development costs.

  2. Engineered barrier experiment. Power control and data acquisition systems

    International Nuclear Information System (INIS)

    Alberdi, J.; Barcala, J.M.; Gamero, E.; Martin, P.L.; Molinero, A.; Navarrete, J.J.; Yuste, C.

    1997-01-01

    The engineered barrier concept for the storage of radioactive wastes is being tested at almost full scale at CIEMAT facilities. A data acquisition and control system is an element of this experiment and will be operating for the next three years. (Author)

  3. Getting DNA copy numbers without control samples

    Directory of Open Access Journals (Sweden)

    Ortiz-Estevez Maria

    2012-08-01

    Full Text Available Abstract. Background: The selection of the reference used to scale the data in a copy number analysis is of paramount importance for achieving accurate estimates. Usually this reference is generated using control samples included in the study. However, control samples are not always available, and in these cases an artificial reference must be created; proper generation of this signal is crucial in terms of both noise and bias. We propose NSA (Normality Search Algorithm), a scaling method that works with and without control samples. It is based on the assumption that genomic regions enriched in SNPs with identical copy numbers in both alleles are likely to be normal. These normal regions are predicted for each sample individually and used to calculate the final reference signal. NSA can be applied to any CN data regardless of the microarray technology and preprocessing method. It also finds an optimal weighting of the samples, minimizing possible batch effects. Results: Five human datasets (a subset of HapMap samples, and Glioblastoma Multiforme (GBM), ovarian, prostate and lung cancer experiments) have been analyzed. It is shown that, using only tumoral samples, NSA is able to remove the bias in the copy number estimation, to reduce the noise and, therefore, to increase the ability to detect copy number aberrations (CNAs). These improvements allow NSA to detect recurrent aberrations more accurately than other state-of-the-art methods. Conclusions: NSA provides a robust and accurate reference for scaling probe signal data to CN values without the need of control samples. It minimizes the problems of bias, noise and batch effects in the estimation of CNs, and therefore helps to better detect recurrent CNAs than current methods. The automatic selection of references makes it useful for bulk analysis of many GEO or ArrayExpress experiments without the need of developing a parser to find the normal samples or possible batches within the

  4. Performance Estimation for Embedded Systems with Data and Control Dependencies

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2000-01-01

    In this paper we present an approach to performance estimation for hard real-time systems. We consider architectures consisting of multiple processors. The scheduling policy is based on a preemptive strategy with static priorities. Our model of the system captures both data and control dependencies...

  5. Data acquisition and command system for use with a microprocessor-based control chassis

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.; Martinez, V.A. Jr.

    1980-01-01

    The Pion Generation for Medical Irradiations (PIGMI) program at the Los Alamos Scientific Laboratory is developing the technology to build smaller, less expensive, and more reliable proton linear accelerators for medical applications, and has designed a powerful, simple, inexpensive, and reliable control and data acquisition system that is central to the program development. The system is a NOVA-3D minicomputer interfaced to several outlying microprocessor-based controllers, which accomplish control and data acquisition through data I/O chassis. The equipment interface chassis, which can issue binary commands, read binary data, issue analog commands, and read timed and untimed analog data, is described

  6. Discrete-time control system design with applications

    CERN Document Server

    Rabbath, C A

    2014-01-01

    This book presents practical techniques of discrete-time control system design. In general, the design techniques lead to low-order dynamic compensators that ensure satisfactory closed-loop performance for a wide range of sampling rates. The theory is given in the form of theorems, lemmas, and propositions. The design of the control systems is presented as step-by-step procedures and algorithms. The proposed feedback control schemes are applied to well-known dynamic system models. This book also discusses: Closed-loop performance of generic models of mobile robot and airborne pursuer dynamic systems under discrete-time feedback control with limited computing capabilities Concepts of discrete-time models and sampled-data models of continuous-time systems, for both single- and dual-rate operation Local versus global digital redesign Optimal, closed-loop digital redesign methods Plant input mapping design Generalized holds and samplers for use in feedback control loops, Numerical simulation of fixed-point arithm...

  7. Development of control and data processing system for CO{sub 2} laser interferometer

    Energy Technology Data Exchange (ETDEWEB)

    Chiba, Shinichi; Kawano, Yasunori; Tsuchiya, Katsuhiko; Inoue, Akira [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2001-11-01

    The CO{sub 2} laser interferometer diagnostic has been operated to measure the central electron density in JT-60U plasmas. We have developed a control and data processing system for the CO{sub 2} laser interferometer with flexible functions for data acquisition, data processing and data transfer in accordance with the sequence of JT-60U discharges. The system is mainly composed of two UNIX workstations and CAMAC clusters; high reliability was obtained by distributing the data processing functions between the workstations. Consequently, the control and data processing system can routinely provide electron density data immediately after a JT-60U discharge. Real-time feedback control of the electron density in JT-60U also becomes available by using a reference density signal from the CO{sub 2} laser interferometer. (author)

  8. Computer control and data acquisition system for the R.F. Test Facility

    International Nuclear Information System (INIS)

    Stewart, K.A.; Burris, R.D.; Mankin, J.B.; Thompson, D.H.

    1986-01-01

    The Radio Frequency Test Facility (RFTF) at Oak Ridge National Laboratory, used to test and evaluate high-power ion cyclotron resonance heating (ICRH) systems and components, is monitored and controlled by a multicomponent computer system. This data acquisition and control system consists of three major hardware elements: (1) an Allen-Bradley PLC-3 programmable controller; (2) a VAX 11/780 computer; and (3) a CAMAC serial highway interface. Operating in LOCAL as well as REMOTE mode, the programmable logic controller (PLC) performs all the control functions of the test facility. The VAX computer acts as the operator's interface to the test facility by providing color mimic panel displays and allowing input via a trackball device. The VAX also provides archiving of trend data acquired by the PLC. Communications between the PLC and the VAX are via the CAMAC serial highway. Details of the hardware, software, and the operation of the system are presented in this paper

  9. Multivariate Process Control with Autocorrelated Data

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2011-01-01

    As sensor and computer technology continues to improve, it becomes a normal occurrence that we are confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control and monitoring. Such high-dimensional data often exhibit not only cross-correlation among the quality characteristics of interest but also serial dependence as a consequence of high sampling frequency and system dynamics. In practice, the most common method of monitoring multivariate data is through what is called Hotelling's T2 statistic. In this paper, we discuss the effect of autocorrelation (when it is ignored) on multivariate control charts based on these methods and provide some practical suggestions and remedies to overcome this problem.
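    The Hotelling T2 statistic mentioned above is straightforward to compute from in-control reference data; the snippet below is a minimal sketch of the usual plug-in form (the function name and the simulated reference data are invented for illustration):

```python
import numpy as np

def hotelling_t2(reference, x):
    """T2 = (x - xbar)' S^-1 (x - xbar), with the mean vector xbar and
    covariance matrix S estimated from in-control reference data."""
    xbar = reference.mean(axis=0)
    S = np.cov(reference, rowvar=False)
    d = x - xbar
    return float(d @ np.linalg.solve(S, d))

rng = np.random.default_rng(0)
ref = rng.standard_normal((500, 3))       # simulated in-control history
print(hotelling_t2(ref, np.array([0.1, -0.2, 0.0])))
```

    The paper's point is that this construction assumes serially independent observations; with autocorrelated data, S underestimates or misrepresents the true variability and the chart's false-alarm rate drifts from its nominal value.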

  10. Security of the data transmission in the industrial control system

    Directory of Open Access Journals (Sweden)

    Marcin Bednarek

    2015-12-01

    Full Text Available The theme of this paper is to present the data transmission security system between the stations of an industrial control system. The possible options for secure communications between process stations, as well as between process and operator stations, are described. The transmission security mechanism is based on algorithms for symmetric and asymmetric encryption. The authentication process uses a software token algorithm and a one-way hash function. The algorithm for establishing a secured connection between the stations, including the authentication process and the encryption of data transmission, is given. The process of securing the transmission consists of four sub-processes: (I) authentication; (II) asymmetric public key transmission; (III) symmetric key transmission; (IV) data transmission. The presented process of securing the transmission was realized in an industrial controller and emulator. For this purpose, programming languages in accordance with EN 61131 were used. The functions were implemented as user function blocks, which allows a mixed code in the structure of the block (both ST and FBD). The available function categories are: support of asymmetric encryption; asymmetric encryption utility functions; support of symmetric encryption; symmetric encryption utility functions; support of hash value calculations; and utility functions for conversion. Keywords: transmission security, encryption, authentication, industrial control system
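    The token-plus-one-way-hash authentication step described above can be sketched as a challenge-response exchange built from standard primitives. This is a hedged illustration: the paper does not specify its exact construction, so the names, the shared-key provisioning and the HMAC-SHA-256 choice are assumptions made here:

```python
import hashlib
import hmac
import secrets

SHARED_KEY = b"station-secret"       # provisioned on both stations (illustrative)

def make_challenge():
    """Verifier sends a one-time nonce to the station being authenticated."""
    return secrets.token_bytes(16)

def respond(challenge, key=SHARED_KEY):
    # One-way hash over nonce + secret; the secret itself never crosses the wire
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()

def verify(challenge, response, key=SHARED_KEY):
    return hmac.compare_digest(respond(challenge, key), response)
```

    Only after such an authentication step succeeds would the stations proceed to sub-processes (II)-(IV), exchanging asymmetric public keys, agreeing on a symmetric session key, and transmitting encrypted data.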

  11. GET electronics samples data analysis

    International Nuclear Information System (INIS)

    Giovinazzo, J.; Goigoux, T.; Anvar, S.; Baron, P.; Blank, B.; Delagnes, E.; Grinyer, G.F.; Pancin, J.; Pedroza, J.L.; Pibernat, J.; Pollacco, E.; Rebii, A.

    2016-01-01

    The General Electronics for TPCs (GET) has been developed to equip a generation of time projection chamber detectors for nuclear physics, and may also be used for a wider range of detector types. The goal of this paper is to propose first analysis procedures to be applied on raw data samples from the GET system, in order to correct for systematic effects observed on test measurements. We also present a method to estimate the response function of the GET system channels. The response function is required in analysis where the input signal needs to be reconstructed, in terms of time distribution, from the registered output samples.

  12. Fast control and data acquisition in the neutral beam test facility

    International Nuclear Information System (INIS)

    Luchetta, A.; Manduchi, G.; Taliercio, C.

    2014-01-01

    Highlights: • The paper describes the fast control and data acquisition in the ITER neutral beam test facility. • The usage of real time control in ion beam generation and extraction is proposed. • Real time management of breakdowns is described. • The implementation of event-driven data acquisition is reported. - Abstract: Fast control and data acquisition are required in the ion source test bed of the ITER neutral beam test facility, referred to as SPIDER. Fast control will drive the operation of the power supply systems, with particular reference to special asynchronous events such as breakdowns: short circuits among grids, or between grids and vessel, that can occur repeatedly during beam operation. They are normal events and, as such, will be managed by the fast control system. The cycle time associated with such fast control is down to hundreds of microseconds. Fast data acquisition is required when breakdowns occur: event-driven data acquisition is triggered in real time by fast control at the occurrence of each breakdown, and pre- and post-event samples are acquired, capturing information on transient phenomena in a time window centered on the event. The sampling rate of event-driven data acquisition is up to 5 MS/s. Fast data acquisition may also be independent of breakdowns, as in the case of cavity ring-down spectroscopy, where data chunks are acquired at 100 MS/s in bursts of 1.5 ms every 100 ms and are processed in real time to produce derived measurements. After describing the SPIDER fast control and data acquisition application, the paper reports the system design, based on commercially available hardware and the MARTe and MDSplus software frameworks. The results obtained by running a full prototype of the fast control and data acquisition system are also reported and discussed; they demonstrate that all SPIDER fast control and data acquisition requirements can be met in the prototype solution

  13. Fast control and data acquisition in the neutral beam test facility

    Energy Technology Data Exchange (ETDEWEB)

    Luchetta, A., E-mail: adriano.luchetta@igi.cnr.it; Manduchi, G.; Taliercio, C.

    2014-05-15

    Highlights: • The paper describes the fast control and data acquisition in the ITER neutral beam test facility. • The usage of real time control in ion beam generation and extraction is proposed. • Real time management of breakdowns is described. • The implementation of event-driven data acquisition is reported. - Abstract: Fast control and data acquisition are required in the ion source test bed of the ITER neutral beam test facility, referred to as SPIDER. Fast control will drive the operation of the power supply systems, with particular reference to special asynchronous events such as breakdowns: short circuits among grids, or between grids and vessel, that can occur repeatedly during beam operation. They are normal events and, as such, will be managed by the fast control system. The cycle time associated with such fast control is down to hundreds of microseconds. Fast data acquisition is required when breakdowns occur: event-driven data acquisition is triggered in real time by fast control at the occurrence of each breakdown, and pre- and post-event samples are acquired, capturing information on transient phenomena in a time window centered on the event. The sampling rate of event-driven data acquisition is up to 5 MS/s. Fast data acquisition may also be independent of breakdowns, as in the case of cavity ring-down spectroscopy, where data chunks are acquired at 100 MS/s in bursts of 1.5 ms every 100 ms and are processed in real time to produce derived measurements. After describing the SPIDER fast control and data acquisition application, the paper reports the system design, based on commercially available hardware and the MARTe and MDSplus software frameworks. The results obtained by running a full prototype of the fast control and data acquisition system are also reported and discussed; they demonstrate that all SPIDER fast control and data acquisition requirements can be met in the prototype solution.

  14. Quality Control Of Compton Suppression System As An Environmental Sample Counting System

    International Nuclear Information System (INIS)

    Siswohartoyo, Sudarti; Soepardi, Dewita

    1996-01-01

    Quality control on the Compton Suppression System has been done, i.e.: 1) testing of the HPGe main detector (FWHM, P/C, d.c. level/noise), 2) the NaI(Tl) detector shielding characteristic, 3) timing spectrum (FWHM), and 4) suppression factor. From the collected data, the characteristics of the HPGe were found to be in the same range as shown in the manual. From the NaI(Tl) testing, the resolution was found to be about 9%. From the time spectrum testing, the resolution was about 12-13 ns, while the suppression factor was found to be about 4-4.6

  15. Optimizing data access in the LAMPF control system

    International Nuclear Information System (INIS)

    Schaller, S.C.; Corley, J.K.; Rose, P.A.

    1985-01-01

    The LAMPF control system data access software offers considerable power and flexibility to application programs through symbolic device naming and an emphasis on hardware independence. This paper discusses optimizations aimed at improving the performance of the data access software while retaining these capabilities. The only aspects of the optimizations visible to the application programs are ''vector devices'' and ''aggregate devices.'' A vector device accesses a set of hardware related data items through a single device name. Aggregate devices allow run-time optimization of references to groups of unrelated devices. Optimizations not visible on the application level include careful handling of: network message traffic; the sharing of global resources; and storage allocation
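The round-trip savings behind vector devices can be illustrated with a toy model in which each request to the device server costs one network message, so grouping hardware-related items under a single symbolic name batches them into one message. All names and the `read_many` batching call are hypothetical, not the actual LAMPF software.

```python
class DeviceServer:
    """Toy stand-in for the data-access layer: each read_many() call models
    one network round trip, so batching reads is the optimization sketched."""

    def __init__(self, values):
        self.values = values
        self.round_trips = 0

    def read_many(self, names):
        self.round_trips += 1          # one message for the whole group
        return {n: self.values[n] for n in names}

class VectorDevice:
    """One symbolic name covering a set of hardware-related data items."""
    def __init__(self, name, item_names):
        self.name, self.item_names = name, item_names

    def read(self, server):
        return server.read_many(self.item_names)

server = DeviceServer({"BM01:FIELD": 1.2, "BM01:TEMP": 34.5, "QX07:GRAD": 0.8})
dipole = VectorDevice("BM01", ["BM01:FIELD", "BM01:TEMP"])
print(dipole.read(server), server.round_trips)  # both items, one round trip
```

An aggregate device would work the same way but over unrelated device names chosen at run time.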

  16. Automatic data acquisition and on-line analysis of trace element concentration in serum samples

    International Nuclear Information System (INIS)

    Lecomte, R.; Paradis, P.; Monaro, S.

    1978-01-01

    A completely automated system has been developed to determine trace element concentrations in biological samples by measuring charged-particle-induced X-rays. A CDC-3100 computer with ADC and CAMAC interface is employed to control the data collection apparatus, acquire data, and simultaneously perform the analysis. The experimental set-up consists of a large square plexiglass chamber in which a commercially available Kodak Carousel 750H is suitably arranged as a computer-controlled sample changer. A method of extracting trace element concentrations using reference spectra is presented, and an on-line program has been developed to easily and conveniently obtain final results at the end of each run. (Auth.)

  17. Design and implementation of a control and data acquisition system for pellet injectors

    International Nuclear Information System (INIS)

    Baylor, L.R.; Burris, R.D.; Greenwood, D.E.; Stewart, K.A.

    1985-01-01

    A stand-alone control and data acquisition system for pellet injectors has been designed and implemented to support pellet injector development at Oak Ridge National Laboratory (ORNL) and to enable ORNL pellet injectors to be installed on various fusion experimental devices. The stand-alone system permits LOCAL operation of the injector from a nearby panel and REMOTE operation from the experiment control room. Major components of the system are (1) an Allen-Bradley PLC 2/30 programmable controller, (2) a VAX minicomputer, and (3) a CAMAC serial highway interface. The programmable logic controller (PLC) is used to perform all control functions of the injector. In LOCAL, the operator interface is provided by an intelligent panel system that has a keypad and pushbutton module programmed from the PLC. In REMOTE, the operator interfaces via a VAX-based color graphics display and uses a trackball and keyboard to issue commands. Communications between the remote and local controls and to the fusion experiment supervisory system are via the CAMAC highway. The VAX archives transient data from pellet shots and trend data acquired from the PLC. Details of the hardware and software design and the operation of the system are presented in this paper. 3 refs., 1 fig

  18. A new kind of data acquisition control system with multi-functions

    International Nuclear Information System (INIS)

    Zhang Songshou; Li Shuyuan; Li Runxin; Guo Dunjun

    1996-01-01

    The paper introduces a new kind of multi-function data acquisition control system. The system uses a single-chip microcomputer and the standard STD system. The text describes its functions, principle, and constitution; the software and hardware designs; experimental results; and conclusions

  19. Data-driven adaptive fractional order PI control for PMSM servo system with measurement noise and data dropouts.

    Science.gov (United States)

    Xie, Yuanlong; Tang, Xiaoqi; Song, Bao; Zhou, Xiangdong; Guo, Yixuan

    2018-04-01

    In this paper, data-driven adaptive fractional order proportional integral (AFOPI) control is presented for permanent magnet synchronous motor (PMSM) servo system perturbed by measurement noise and data dropouts. The proposed method directly exploits the closed-loop process data for the AFOPI controller design under unknown noise distribution and data missing probability. Firstly, the proposed method constructs the AFOPI controller tuning problem as a parameter identification problem using the modified l p norm virtual reference feedback tuning (VRFT). Then, iteratively reweighted least squares is integrated into the l p norm VRFT to give a consistent compensation solution for the AFOPI controller. The measurement noise and data dropouts are estimated and eliminated by feedback compensation periodically, so that the AFOPI controller is updated online to accommodate the time-varying operating conditions. Moreover, the convergence and stability are guaranteed by mathematical analysis. Finally, the effectiveness of the proposed method is demonstrated both on simulations and experiments implemented on a practical PMSM servo system. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
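The lp-norm identification step integrated into the VRFT procedure above can be sketched with a generic iteratively reweighted least squares (IRLS) routine: each pass solves a weighted least-squares problem whose weights are |r_i|^(p-2) for the current residuals. This is textbook IRLS under illustrative data, not the authors' exact VRFT formulation.

```python
import numpy as np

def irls_lp(A, b, p=1.5, iters=50, eps=1e-8):
    """Solve min_t ||A t - b||_p by iteratively reweighted least squares.
    Weights |r_i|^(p-2) are refreshed from the residuals at every pass."""
    t = np.linalg.lstsq(A, b, rcond=None)[0]          # l2 starting point
    for _ in range(iters):
        r = A @ t - b
        w = np.maximum(np.abs(r), eps) ** (p - 2)     # eps guards division
        sw = np.sqrt(w)
        t = np.linalg.lstsq(sw[:, None] * A, sw * b, rcond=None)[0]
    return t

# Illustrative identification problem with known parameters [2, -1].
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 2))
b = A @ np.array([2.0, -1.0]) + 0.01 * rng.normal(size=100)
print(irls_lp(A, b, p=1.2))   # close to [2, -1]
```

Choosing p between 1 and 2 trades robustness to outliers (p near 1) against ordinary least-squares behavior (p = 2).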

  20. Preview-based sampling for controlling gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2011-01-01

    In this work, we describe an automated method for directing the control of a high resolution gaseous fluid simulation based on the results of a lower resolution preview simulation. Small variations in accuracy between low and high resolution grids can lead to divergent simulations, which is problematic for those wanting to achieve a desired behavior. Our goal is to provide a simple method for ensuring that the high resolution simulation matches key properties from the lower resolution simulation. We first let a user specify a fast, coarse simulation that will be used for guidance. Our automated method samples the data to be matched at various positions and scales in the simulation, or allows the user to identify key portions of the simulation to maintain. During the high resolution simulation, a matching process ensures that the properties sampled from the low resolution simulation are maintained. This matching process keeps the different resolution simulations aligned even for complex systems, and can ensure consistency of not only the velocity field, but also advected scalar values. Because the final simulation is naturally similar to the preview simulation, only minor controlling adjustments are needed, allowing a simpler control method than that used in prior keyframing approaches. Copyright © 2011 by the Association for Computing Machinery, Inc.
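A much-simplified, one-dimensional stand-in for the matching step, pulling the mean of each high-resolution block toward the corresponding coarse preview value, might look like the following; the blending scheme and numbers are illustrative assumptions, not the paper's method.

```python
import numpy as np

def match_to_preview(hi, lo, strength=0.5):
    """Nudge a high-resolution field toward a coarse preview: the average of
    each hi-res block is pulled toward the preview cell that covers it
    (a toy stand-in for the paper's matching process)."""
    f = len(hi) // len(lo)               # refinement factor
    hi = hi.copy()
    for i, target in enumerate(lo):
        block = slice(i * f, (i + 1) * f)
        hi[block] += strength * (target - hi[block].mean())
    return hi

lo = np.array([0.0, 1.0])                # coarse preview, 2 cells
hi = np.array([0.3, 0.1, 0.6, 1.6])      # drifting hi-res field, 4 cells
print(match_to_preview(hi, lo, strength=1.0))  # block means now 0.0 and 1.0
```

With strength below 1 the correction is partial, which mimics applying the matching gently each timestep so the high-resolution detail is preserved.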

  1. A Fine-Grained Data Access Control System in Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Boniface K. Alese

    2015-12-01

    The evolving realities of Wireless Sensor Networks (WSN) deployed across many areas of life require serving multiple applications. As large amounts of sensed data are distributed and stored in individual sensor nodes, illegal access to these sensitive data can be devastating; consequently, data insecurity becomes a big concern. This study therefore proposes a fine-grained access control system which allows only the right set of users to access particular data, based on their access privileges in the sensor network. It is designed using the Priccess protocol, with access-policy formulation adopting the principle of the Bell-LaPadula model as well as Attribute-Based Encryption (ABE) to control access to sensor data. The functionality of the proposed system is simulated using NetBeans. Performance analysis of the proposed system, in terms of execution time and key size, shows that the larger the key size, the harder it becomes for an attacker to break the system. Additionally, the proposed scheme takes less time to execute than the existing work. Consequently, a secure interactive web-based application is developed that facilitates field officers' access to stored data in a safe and secure manner.
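The access-structure half of ABE-style control (setting the cryptography aside) reduces to evaluating a policy tree over a user's attribute set; a hedged sketch with hypothetical attribute names follows.

```python
def satisfies(policy, attrs):
    """Evaluate an access-policy tree against a user's attribute set.
    Models only the access structure of ABE-style control; the actual
    encryption and key distribution are out of scope for this sketch."""
    op = policy[0]
    if op == "attr":
        return policy[1] in attrs
    if op == "and":
        return all(satisfies(p, attrs) for p in policy[1:])
    if op == "or":
        return any(satisfies(p, attrs) for p in policy[1:])
    raise ValueError(f"unknown node {op!r}")

# Hypothetical policy: (field-officer AND region-west) OR supervisor.
policy = ("or",
          ("and", ("attr", "field-officer"), ("attr", "region-west")),
          ("attr", "supervisor"))
print(satisfies(policy, {"field-officer", "region-west"}))  # True
print(satisfies(policy, {"field-officer", "region-east"}))  # False
```

In real ABE the policy is embedded in the ciphertext (or the key), so decryption itself fails unless the attribute set satisfies the tree.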

  2. Design, development and testing of real time control and data acquisition system for R&D IC H&CD source

    Energy Technology Data Exchange (ETDEWEB)

    Rajnish, Kumar, E-mail: krajnish@iter-india.org [ITER-India, Institute for Plasma Research, Gandhinagar, Gujarat (India); Soni, Dipal; Verma, Sriprakash; Patel, Hriday; Trivedi, Rajesh; Singh, Raghuraj; Patel, Manoj; Mukherjee, Aparajita [ITER-India, Institute for Plasma Research, Gandhinagar, Gujarat (India); Makadia, Keyur [Optimized Solutions Pvt. Ltd., Ahmedabad, Gujarat (India)

    2016-11-15

    Highlights: • The application program of the LCU is developed using LabVIEW™ to operate the megawatt RF source. • A PXI 6133 module is used for acquiring data at a 1 MHz sampling frequency in triggered mode. • A PXI 6255 is used for continuous acquisition at a 1 kHz sampling frequency. • A PXI 7841R is used for hard-wired fast protection functions and for the real-time control loops. • A Schneider PLC is used to implement the sequence control system. - Abstract: One of the important auxiliary heating and current drive systems for the ITER experiment is the Ion Cyclotron Heating & Current Drive (IC H&CD) system. ITER requires 1 prototype and 8 independent RF sources, each capable of delivering constant output power of 2.5 MW at VSWR 2 in the frequency range of 35–65 MHz. To meet the ITER requirement, an R&D program has been launched to identify the final-stage tube (tetrode/diacrode) for the MW-level RF amplifier, along with other critical components. Under this R&D program, a single chain of cascaded RF amplifiers is being developed, consisting of low- and high-power RF sections, auxiliary power supplies, an anode power supply, and a test bench simulating the ITER requirement. To ensure reliable and safe operation of the entire system, a PLC/PXI-based Local Control Unit (LCU) is developed. A PLC-based sequential control system handles the sequential biasing of the electrodes of the high-power vacuum tubes (tetrode/diacrode). To protect the RF source against critical fault conditions, local protection logic is implemented on a PXI-7841R, which ensures fast shut-off (<10 μs) of RF and power supplies. Constant and stable output power, even with variable load conditions (up to VSWR 2), is achieved by employing two real-time feedback control loops: one to hold the output power constant and the other to optimize anode and screen-grid dissipation. For fault/offline analysis, event-based acquisition and logging functionality is provided that acquires and logs data at a 1 μs sampling interval for 100 ms of pre- and post-event time

  3. Design and Architecture of SST-1 basic plasma control system

    Energy Technology Data Exchange (ETDEWEB)

    Patel, Kirit, E-mail: kpatel@ipr.res.in; Raju, D.; Dhongde, J.; Mahajan, K.; Chudasama, H.; Gulati, H.; Chauhan, A.; Masand, H.; Bhandarkar, M.; Pradhan, S.

    2016-11-15

    Highlights: • Reflective Memory network. • FPGA-based timing system for trigger distribution. • IRIG-B network for GPS time synchronization. • PMC-based digital signal processors and VME. • Simultaneous-sampling ADCs. - Abstract: The primary objective of the SST-1 plasma control system is to achieve plasma position, shape, and current profile control. The architecture of the SST-1 control system is distributed in nature. The fastest control-loop time requirement of 100 μs is met using VME-based simultaneous-sampling ADCs, a PMC-based quad-core DSP, a Reflective Memory (RFM) based real-time network, a VME-based real-time trigger distribution network, and an Ethernet network. All the control loops for shape, position, and current profile control share common signals from the magnetic diagnostics, so it is planned to accommodate all the algorithms on the same PMC-based quad-core DSP module, the TS C-43. The RFM-based real-time data network replicates data from one node to the next in a ring network topology at a sustained throughput of 13.4 MB/s. The real-time timing system network provides guaranteed trigger distribution within 3.8 μs from one node to all nodes of the network. Monitoring and configuration of the different systems participating in the operation of SST-1 is done over the Ethernet network. Magnetic sensor data are acquired using a Pentek 6802 simultaneous-sampling ADC card at a rate of 10 kSPS. All the real-time raw data, along with the control data, will be archived using the RFM network and SCSI HDD for the experiment duration of 1000 s. The RFM network is also planned for real-time plotting of key plasma parameters during long experiments. After an experiment, these data are transferred to a central storage server for archival. This paper discusses the architecture and hardware implementation of the control system, describing all the hardware and software involved along with future upgrade plans.

  4. Exploiting H infinity sampled-data control theory for high-precision electromechanical servo control design

    NARCIS (Netherlands)

    Oomen, T.A.E.; Wal, van de M.M.J.; Bosgra, O.H.

    2006-01-01

    Optimal design of digital controllers for industrial electromechanical servo systems using an Hinf-criterion is considered. Present industrial practice is to perform the control design in the continuous time domain and to discretize the controller a posteriori. This procedure involves unnecessary

  5. STATISTIC MODEL OF DYNAMIC DELAY AND DROPOUT ON CELLULAR DATA NETWORKED CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    MUHAMMAD A. MURTI

    2017-07-01

    Delay and dropout are important parameters that influence overall control performance in a Networked Control System (NCS). The goal of this research is to find a model of the delay and dropout of the data communication link in the NCS. Experiments have been carried out on water-level control of a boiler tank, operated as an NCS over an internet connection using High Speed Packet Access (HSPA) cellular technology. From these experiments, the closed-loop system response was obtained, together with the delay and dropout of data packets. This research contributes a model of the NCS as a combination of the controlled plant and the data communication link, as well as a statistical model of delay and dropout on the NCS.
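Extracting a statistical delay/dropout model from a packet log can be sketched as follows. The Gamma delay model, the method-of-moments fit, and all numbers are illustrative assumptions, not the paper's fitted values.

```python
import random
import statistics

def characterize_link(samples):
    """Estimate dropout probability and delay statistics from a packet log.
    `samples` is a list of one-way delays in ms, with None marking a
    dropped packet. The Gamma(k, theta) delay model is an assumption."""
    dropped = sum(1 for s in samples if s is None)
    delays = [s for s in samples if s is not None]
    p_drop = dropped / len(samples)
    mean = statistics.fmean(delays)
    var = statistics.variance(delays)
    theta = var / mean          # method-of-moments scale
    k = mean / theta            # method-of-moments shape
    return p_drop, k, theta

# Synthetic log: Gamma(4, 25) delays in ms, 5% dropout.
random.seed(1)
log = [random.gammavariate(4.0, 25.0) if random.random() > 0.05 else None
       for _ in range(5000)]
p, k, theta = characterize_link(log)
print(p, k, theta)   # near 0.05, 4, 25
```

The fitted (p_drop, k, theta) triple can then drive simulations of the closed loop under realistic delay and loss.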

  6. Automated sample mounting and alignment system for biological crystallography at a synchrotron source

    International Nuclear Information System (INIS)

    Snell, Gyorgy; Cork, Carl; Nordmeyer, Robert; Cornell, Earl; Meigs, George; Yegian, Derek; Jaklevic, Joseph; Jin, Jian; Stevens, Raymond C.; Earnest, Thomas

    2004-01-01

    High-throughput data collection for macromolecular crystallography requires an automated sample mounting system for cryo-protected crystals that functions reliably when integrated into protein-crystallography beamlines at synchrotrons. Rapid mounting and dismounting of the samples increases the efficiency of the crystal screening and data collection processes, where many crystals can be tested for the quality of diffraction. The sample-mounting subsystem has random access to 112 samples, stored under liquid nitrogen. Results of extensive tests regarding the performance and reliability of the system are presented. To further increase throughput, we have also developed a sample transport/storage system based on 'puck-shaped' cassettes, which can hold sixteen samples each. Seven cassettes fit into a standard dry shipping Dewar. The capabilities of the robotic crystal mounting and alignment system, together with instrumentation control software and a relational database, allow automated screening and data collection to be developed

  7. A continuous flow from sample collection to data acceptability determination using an automated system

    International Nuclear Information System (INIS)

    Fisk, J.F.; Leasure, C.; Sauter, A.D.

    1993-01-01

    In its role as regulator, EPA is the recipient of enormous reams of analytical data, especially within the Superfund Program. In order to better manage the volume of paper that comes in daily, Superfund has required its laboratories to provide data that is contained on reporting forms to be delivered also on a diskette for uploading into data bases for various purposes, such as checking for contractual compliance, tracking quality assurance parameters, and, ultimately, for reviewing the data by computer. This last area, automated review of the data, has generated programs that are not necessarily appropriate for use by clients other than Superfund. Such is the case with Los Alamos National Laboratory's Environmental Chemistry Group and its emerging subcontractor community, designed to meet the needs of the remedial action program at LANL. LANL is in the process of implementing an automated system that will be used from the planning stage of sample collection to the production of a project-specific report on analytical data quality. Included are electronic scheduling and tracking of samples, data entry, checking and transmission, data assessment and qualification for use, and report generation that will tie the analytical data quality back to the performance criteria defined prior to sample collection. Industry standard products will be used (e.g., ORACLE, Microsoft Excel) to ensure support for users, prevent dependence on proprietary software, and to protect LANL's investment for the future

  8. A VME based cryogenic data acquisition and control system (CRYO-DACS)

    International Nuclear Information System (INIS)

    Antony, Joby; Rajkumar; Datta, T.S.

    2005-01-01

    This report describes a newly developed VME-based data acquisition and control system named CRYO-DACS for acquiring and controlling various analog and digital cryogenic parameters from equipment such as beam-line cryostats, helium compressors, the liquefier, and the cryogenic distribution line. A new central control room has been set up for remote control and monitoring. The system monitors various analog parameters such as temperature, pressure, vacuum, and cryogenic fluid levels inside the cryostats, and performs closed-loop control of cryogen valves. The hardware architecture of CRYO-DACS is multi-crate distributed VME, linked to workstation clients over a 100 Mb/s LAN for distributed logging, historical trending, analysis, alarm management, and control GUIs. (author)

  9. Requirements for quality control of analytical data

    International Nuclear Information System (INIS)

    Westmoreland, R.D.; Bartling, M.H.

    1990-07-01

    The National Contingency Plan (NCP) of the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) provides procedures for the identification, evaluation, and remediation of past hazardous waste disposal sites. The Hazardous Materials Response section of the NCP consists of several phases: Preliminary Assessment, Site Inspection, Remedial Investigation, Feasibility Study, Remedial Design, and Remedial Action. During any of these phases, analysis of soil, water, and waste samples may be performed. The Hazardous Waste Remedial Actions Program (HAZWRAP) is involved in performing field investigations and sample analyses pursuant to the NCP for the US Department of Energy and other federal agencies. The purpose of this document is to specify the requirements of Martin Marietta Energy Systems, Inc., for the control of accuracy, precision, and completeness of samples and data from the point of collection through analysis. Requirements include data reduction and reporting of resulting environmentally related data. Because not every instance and concern can be addressed in this document, HAZWRAP subcontractors are encouraged to discuss any questions with the Analytical Quality Control Specialist (AQCS) and the HAZWRAP Project Manager. This revision supersedes all previous versions of this document

  10. State estimation for networked control systems using fixed data rates

    Science.gov (United States)

    Liu, Qing-Quan; Jin, Fang

    2017-07-01

    This paper investigates state estimation for linear time-invariant systems where sensors and controllers are geographically separated and connected via a bandwidth-limited and errorless communication channel with the fixed data rate. All plant states are quantised, coded and converted together into a codeword in our quantisation and coding scheme. We present necessary and sufficient conditions on the fixed data rate for observability of such systems, and further develop the data-rate theorem. It is shown in our results that there exists a quantisation and coding scheme to ensure observability of the system if the fixed data rate is larger than the lower bound given, which is less conservative than the one in the literature. Furthermore, we also examine the role that the disturbances have on the state estimation problem in the case with data-rate limitations. Illustrative examples are given to demonstrate the effectiveness of the proposed method.
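The classical data-rate lower bound that results like the one above refine depends only on the unstable eigenvalues of the plant matrix: observability over the channel requires a rate exceeding the sum of log2|λi| over eigenvalues with |λi| > 1. A small sketch of the classical bound (the paper's less conservative condition differs):

```python
import numpy as np

def min_data_rate(A):
    """Classical data-rate theorem lower bound, in bits per sample:
    R must exceed the sum of log2|lambda_i| over the unstable
    eigenvalues of the system matrix A."""
    eig = np.linalg.eigvals(A)
    return sum(np.log2(abs(lam)) for lam in eig if abs(lam) > 1)

A = np.array([[2.0, 1.0],
              [0.0, 0.5]])   # one unstable mode at 2, one stable at 0.5
print(min_data_rate(A))      # log2(2) = 1 bit per sample
```

A fully stable plant contributes nothing to the sum, consistent with the intuition that no information is needed to keep estimating a contracting state.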

  11. Rorschach Comprehensive System data for a sample of 141 adult nonpatients from Denmark.

    Science.gov (United States)

    Ivanouw, Jan

    2007-01-01

    A sample (n = 141) of Danish nonpatients 25-50 years of age, never hospitalized with a psychiatric diagnosis and currently employed, was demographically representative of two geographical areas of Copenhagen with different social strain. The sample, as well as 45 persons not currently employed, was tested with the Rorschach (Exner, 1995), MMPI-2 (Butcher, Dahlstrom, Graham, Tellegen, & Kaemmer, 1989), Word Association Test (Ivanouw, 1999b), WAIS Comprehension subtest (Hess, 1974), and SCL-90-R (Olsen, Mortenson, & Bech, 2006). Half of the persons contacted volunteered for the study. There was no difference in rate of volunteering between a standard no-feedback condition and a feedback condition; the latter, however, tended to attract more psychologically resourceful persons. The employed persons tended to appear healthier than the not employed. Response style of the subjects, quality of the Rorschach protocols, reliability of scoring, and the effect of the data being grouped on geographical area and examiner were examined. Form level, color, texture, cooperative movement, and EA were lower than in the Comprehensive System (CS; n = 450) sample, but higher than in nine international nonpatient Rorschach studies. Unique for the Danish sample was a low number of animal movement answers. The Rorschach data showed women to be healthier than men. Differences in Rorschach variables based on educational level were small.

  12. Data processing system for ETL TPE-2

    International Nuclear Information System (INIS)

    Yahagi, E.; Kiyama, M.

    1988-01-01

    The data processing system for ETL TPE-2 consists of two CPU systems composing a duplex system. One is the data acquisition system, abbreviated DAS, which controls the various data input devices, performs data acquisition, and communicates with the main controller of TPE-2 to confirm safe system operation. The other is the data processing system, abbreviated DPS, which processes the data after acquisition, provides the interconnection with the mainframe, and supports software development. A transient memory system, which has 64 channels of 8-bit ADCs with a maximum sampling frequency of 20 MHz and 4 KB of buffer memory per channel, is used to record the time-sequential experimental data. Two CAMAC crates are used to acquire information on the experimental conditions and the Thomson scattering data; they compose a serial highway system over fiber optics. The CAMAC crate for Thomson scattering data is controlled by a personal computer, an HP-85, and is available for stand-alone use; communication between the CAMAC system and the DAS is easily performed by using a CAMAC memory module as an intermediary, without the complicated procedures involved in connecting different types of computers. Two magnetic disk pack units, each with a formatted storage capacity of 158 KB and able to record data for over 2,000 shots, are used in parallel with a magnetic tape handler for the data file. Thus high-speed data processing over a wide range of experimental shots was realized, and preservation of the data was confirmed. (author)

  13. Environmental data qualification system

    International Nuclear Information System (INIS)

    Hester, O.V.; Groh, M.R.

    1989-01-01

    The Integrated Environmental Data Management System (IEDMS) is a PC-based system that can support environmental investigations from their design stage and throughout the duration of the study. The system integrates data originating from the Sampling and Analysis Plan, field data and analytical findings. The IEDMS automated features include sampling guidance forms, barcoded sample labels and tags, field and analytical forms reproduction, sample tracking, analytical data qualification, completeness reports, and results and QC data reporting. The IEDMS has extensive automated capabilities that support a systematic and comprehensive process for performing quality assessment of EPA-CLP chemical analyses data. One product of this process is a unique and extremely useful tabular presentation of the data. One table contains the complete set of results and QC data included on the CLP data forms while presenting the information consistent with the chronology in which the analysis was performed. 3 refs., 1 fig

  14. Software Design Concepts for Archiving and Retrieving Control System Data

    International Nuclear Information System (INIS)

    Christopher Larrieu; Matt Bickley

    2001-01-01

    To develop and operate the control system effectively at the Thomas Jefferson National Accelerator Facility, users require the ability to diagnose its behavior not only in real-time, but also in retrospect. The new Jefferson Lab data logging system provides an acquisition and storage component capable of archiving enough data to provide suitable context for such analyses. In addition, it provides an extraction and presentation facility which efficiently fulfills requests for both raw and processed data. This paper discusses several technologies and design methodologies which contribute to the system's overall utility. The Application Programming Interface (API) which developers use to access the data derives from a view of the storage system as a specialized relational database. An object-oriented and compartmental design contributes to its portability at several levels, and the use of CORBA facilitates interaction between distributed components in an industry-standard fashion. This work was supported by the U.S. DOE contract No. DE-AC05-84ER40150

  15. Integrated Laser Characterization, Data Acquisition, and Command and Control Test System

    Science.gov (United States)

    Stysley, Paul; Coyle, Barry; Lyness, Eric

    2012-01-01

    Satellite-based laser technology has been developed for topographical measurements of the Earth and of other planets. Lasers for such missions must be highly efficient and stable over long periods in the temperature variations of orbit. In this innovation, LabVIEW is used on an Apple Macintosh to acquire and analyze images of the laser beam as it exits the laser cavity to evaluate the laser's performance over time, and to monitor and control the environmental conditions under which the laser is tested. One computer attached to multiple cameras and instruments running LabVIEW-based software replaces a conglomeration of computers and software packages, saving hours in maintenance and data analysis, and making very long-term tests possible. This all-in-one system was written primarily using LabVIEW for Mac OS X, which allows the combining of data from multiple RS-232, USB, and Ethernet instruments for comprehensive laser analysis and control. The system acquires data from CCDs (charge-coupled devices), power meters, thermistors, and oscilloscopes over a controllable period of time. This data is saved to an html file that can be accessed later from a variety of data analysis programs. Also, through the LabVIEW interface, engineers can easily control laser input parameters such as current, pulse width, chiller temperature, and repetition rates. All of these parameters can be adapted and cycled over a period of time.

  16. Pierre Gy's sampling theory and sampling practice: heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  17. Congestion Control in Data Transmission Networks Sliding Mode and Other Designs

    CERN Document Server

    Ignaciuk, Przemysław

    2013-01-01

    Congestion Control in Data Transmission Networks details the modeling and control of data traffic in communication networks. It shows how various networking phenomena can be represented in a consistent mathematical framework suitable for rigorous formal analysis. The monograph differentiates between fluid-flow continuous-time traffic models, discrete-time processes with constant sampling rates, and sampled-data systems with variable discretization periods. The authors address a number of difficult real-life problems, such as: • optimal control of flows with disparate, time-varying delay; • the existence of source and channel nonlinearities; • the balancing of quality of service and fairness requirements; and • the incorporation of variable rate allocation policies. Appropriate control mechanisms which can handle congestion and guarantee high throughput in various traffic scenarios (with different networking phenomena being considered) are proposed. Systematic design procedures using sound control-theo...

  18. Electronic Integrated Disease Surveillance System and Pathogen Asset Control System

    Directory of Open Access Journals (Sweden)

    Tom G. Wahl

    2012-06-01

    Full Text Available The Electronic Integrated Disease Surveillance System (EIDSS) has been used to strengthen and support monitoring and prevention of dangerous diseases within the One Health concept by integrating veterinary and human surveillance, passive and active approaches, case-based records (including disease-specific clinical data based on standardised case definitions) and aggregated data, and laboratory data including sample tracking linked to each case and event with test results and epidemiological investigations. Information was collected and shared in a secure way by different means: through distributed nodes which are continuously synchronised with each other, through a web service, and through handheld devices. EIDSS provided a near real-time information flow that was then disseminated to the appropriate organisations in a timely manner. It has been used for comprehensive analysis and visualisation, including real-time mapping of case events as they unfold, enhancing decision making. EIDSS has helped countries comply with the IHR 2005 requirements through a data transfer module that reports diseases electronically to the World Health Organisation (WHO) data centre, and has established authorised data exchange with other electronic systems using an Open Architecture approach. The Pathogen Asset Control System (PACS) has been used for accounting, management and control of biological agent stocks; information on samples and strains of any kind has been tracked throughout their entire lifecycle in this comprehensive and flexible solution. The two systems have been used both in combination and individually. EIDSS and PACS are currently deployed in the Republics of Kazakhstan, Georgia and Azerbaijan as part of the Cooperative Biological Engagement Program (CBEP) sponsored by the US Defense Threat Reduction Agency (DTRA).

  19. Electronic integrated disease surveillance system and pathogen asset control system.

    Science.gov (United States)

    Wahl, Tom G; Burdakov, Aleksey V; Oukharov, Andrey O; Zhilokov, Azamat K

    2012-06-20

    The Electronic Integrated Disease Surveillance System (EIDSS) has been used to strengthen and support monitoring and prevention of dangerous diseases within the One Health concept by integrating veterinary and human surveillance, passive and active approaches, case-based records (including disease-specific clinical data based on standardised case definitions) and aggregated data, and laboratory data including sample tracking linked to each case and event with test results and epidemiological investigations. Information was collected and shared in a secure way by different means: through distributed nodes which are continuously synchronised with each other, through a web service, and through handheld devices. EIDSS provided a near real-time information flow that was then disseminated to the appropriate organisations in a timely manner. It has been used for comprehensive analysis and visualisation, including real-time mapping of case events as they unfold, enhancing decision making. EIDSS has helped countries comply with the IHR 2005 requirements through a data transfer module that reports diseases electronically to the World Health Organisation (WHO) data center, and has established authorised data exchange with other electronic systems using an Open Architecture approach. The Pathogen Asset Control System (PACS) has been used for accounting, management and control of biological agent stocks; information on samples and strains of any kind has been tracked throughout their entire lifecycle in this comprehensive and flexible solution. The two systems have been used both in combination and individually. EIDSS and PACS are currently deployed in the Republics of Kazakhstan, Georgia and Azerbaijan as part of the Cooperative Biological Engagement Program (CBEP) sponsored by the US Defense Threat Reduction Agency (DTRA).

  20. The new real-time control and data acquisition system for an experimental tritium removal facility

    International Nuclear Information System (INIS)

    Stefan, Iuliana; Stefan, Liviu; Retevoi, Carmen; Balteanu, Ovidiu; Bucur, Ciprian

    2006-01-01

    Full text: The purpose of this paper is to present a real-time control and data acquisition system based on virtual instrumentation (LabVIEW, Compact I/O) applicable to an experimental heavy water detritiation plant. The initial data acquisition system, based on analogue instruments, has been upgraded to a fully digital system, because digital hardware offers greater flexibility and capability than analogue hardware and allows easy modification of the control system. Virtual instrumentation has lately become widely used for monitoring and controlling operational parameters in plants. In the specific case of the ETRF there are many process parameters that have to be monitored and controlled. The essential improvement in the new system is the collection of all signals and control functions by a PC, which makes any change in configuration easy. The system hardware, a PC with embedded controllers, was selected as the most cost-effective option. The LabVIEW platform provides faster program development with a convenient user interface. The system provides independent digital control of each parameter and records process data. The system is flexible and has the advantage of further extension. (authors)

  1. Storing and accessing radioactivity data in environmental samples: the resources of GEORAD

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Tadeu A. de A.; Gonzalez, Sergio de A.; Reis, Rocio G. dos; Vasconcellos, Luiza M. de H. e; Lauria, Dejanira de C., E-mail: tedsilva@ird.gov.br, E-mail: gonzalez@ird.gov.br, E-mail: rocio@ird.gov.br, E-mail: luiza@ird.gov.br, E-mail: dejanira@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2013-07-01

    A georeferenced information system of radioactivity in environmental samples, named GEORAD, was created with the goal of aggregating, storing and preserving the data produced by Brazilian researchers, and of sharing with the research community a database on radioactivity in Brazil. The system provides information on concentrations of natural-series, cosmogenic and fallout radionuclides in samples of soil, water and food, among others, along with the geographical location of the samples; in this way, the location of each sample can be visualized on a map of Brazil. A spreadsheet containing all the data and information about a sample can also be obtained. As a result, the database enables the available data to be exploited to their maximum potential for further research and allows new research on existing information. The system also provides reference information on where the data were obtained, which enables data citation and the linking of data with publications, increasing the visibility and accessibility of both the data and the research itself. The GEORAD system has been continuously fed and updated and currently contains data from more than 2,000 samples. This paper presents the latest system updates and discusses its resources. (author)

  2. Storing and accessing radioactivity data in environmental samples: the resources of GEORAD

    International Nuclear Information System (INIS)

    Silva, Tadeu A. de A.; Gonzalez, Sergio de A.; Reis, Rocio G. dos; Vasconcellos, Luiza M. de H. e; Lauria, Dejanira de C.

    2013-01-01

    A georeferenced information system of radioactivity in environmental samples, named GEORAD, was created with the goal of aggregating, storing and preserving the data produced by Brazilian researchers, and of sharing with the research community a database on radioactivity in Brazil. The system provides information on concentrations of natural-series, cosmogenic and fallout radionuclides in samples of soil, water and food, among others, along with the geographical location of the samples; in this way, the location of each sample can be visualized on a map of Brazil. A spreadsheet containing all the data and information about a sample can also be obtained. As a result, the database enables the available data to be exploited to their maximum potential for further research and allows new research on existing information. The system also provides reference information on where the data were obtained, which enables data citation and the linking of data with publications, increasing the visibility and accessibility of both the data and the research itself. The GEORAD system has been continuously fed and updated and currently contains data from more than 2,000 samples. This paper presents the latest system updates and discusses its resources. (author)

  3. Exponential Synchronization for Stochastic Neural Networks with Mixed Time Delays and Markovian Jump Parameters via Sampled Data

    Directory of Open Access Journals (Sweden)

    Yingwei Li

    2014-01-01

    Full Text Available The exponential synchronization problem for stochastic neural networks (SNNs) with mixed time delays and Markovian jump parameters under a sampled-data controller is investigated. Based on a novel Lyapunov-Krasovskii functional, stochastic analysis theory, and the linear matrix inequality (LMI) approach, we derive novel sufficient conditions that guarantee that the master systems exponentially synchronize with the slave systems. A design method for the desired sampled-data controller is also proposed. To capture the dynamical behavior of the system more fully, both Markovian jump parameters and stochastic disturbances are considered, where the stochastic disturbances are given in the form of a Brownian motion. The results obtained in this paper are less conservative than previous results in the literature. Finally, two numerical examples are given to illustrate the effectiveness of the proposed methods.
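
As a rough illustration of the sampled-data control idea (not the paper's stochastic delayed neural network model), the sketch below synchronizes a toy linear master-slave pair with a zero-order-hold controller that reads the synchronization error only at sampling instants; the system matrices, gain, and sampling period are assumed:

```python
import numpy as np

# Toy master-slave pair under a sampled-data (zero-order-hold)
# controller. A, K, dt and T_s are illustrative assumptions.
A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # shared dynamics
K = np.array([[-1.0, 0.0], [0.0, -1.0]])   # assumed stabilizing gain

dt, T_s = 1e-3, 0.05          # Euler step, sampling period
steps = int(10.0 / dt)

x = np.array([1.0, 0.0])      # master state
z = np.array([-1.0, 0.5])     # slave state
u = np.zeros(2)
errs = []
for k in range(steps):
    if k % int(T_s / dt) == 0:        # sample-and-hold the control
        u = K @ (z - x)               # feedback on sampled sync error
    x = x + dt * (A @ x)              # master runs open loop
    z = z + dt * (A @ z + u)          # slave is driven by held u
    errs.append(np.linalg.norm(z - x))
# errs decays toward zero when T_s is small enough for the gain K.
```

Between samples the control is constant, exactly the complication the stability analysis in the paper has to handle; too large a sampling period would destroy the synchronization.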

  4. Coil measurement data acquisition and curing press control system for SSC dipole magnet coils

    International Nuclear Information System (INIS)

    Dickey, C.E.

    1989-03-01

    A coil matching program, similar in theory to the methods used to match Tevatron coils, is being developed at Fermilab. Modulus of elasticity and absolute coil size will be determined at 18-inch intervals along the coils while in the coil curing press immediately following the curing process. A data acquisition system is under construction to automatically acquire and manage the large quantities of data that result. Data files will be transferred to Fermilab's VAX Cluster for long-term storage and actual coil matching. The data acquisition system will also provide the control algorithm for the curing press hydraulic system. A description of the SSC Curing Press Data Acquisition and Controls System will be reported. 20 figs

  5. Computer network data communication controller for the Plutonium Protection System (PPS)

    International Nuclear Information System (INIS)

    Rogers, M.S.

    1978-10-01

    Systems which employ several computers for distributed processing must provide communication links between the computers to effectively utilize their capacity. The technique of using a central network controller to supervise and route messages on a multicomputer digital communications net has certain economic and performance advantages over alternative implementations. Conceptually, the number of stations (computers) which can be accommodated by such a controller is unlimited, but practical considerations dictate a maximum of about 12 to 15. A Data Network Controller (DNC) has been designed around a M6800 microprocessor for use in the Plutonium Protection System (PPS) demonstration facilities

  6. Program controlled system for mathematical processing the αp-experiment data

    International Nuclear Information System (INIS)

    Glagolev, V.V.; Govorun, N.N.; Dirner, A.; Ivanov, V.G.; Kretov, A.P.; Mirolyubov, V.P.; Pervushov, V.V.; Shelontsev, I.I.

    1982-01-01

    The ZEUS system, which allows mathematical processing of bubble chamber pictures for the αp-experiment under computer control, is described. A comparison with traditional processing of film information and its basic defects are considered. The structure, operation and further development of the system are described: it consists of the monitoring programs, a directory file, an input request language, a data bank and documentation. The ZEUS system was developed for processing the αp-experiment from the JINR one-metre liquid-hydrogen chamber. It makes it possible to eliminate a great deal of manual work in organizing mass data processing by computer. The system is implemented on the CDC-6500 computer.

  7. On sampling and modeling complex systems

    International Nuclear Information System (INIS)

    Marsili, Matteo; Mastromatteo, Iacopo; Roudi, Yasser

    2013-01-01

    The study of complex systems is limited by the fact that only a few variables are accessible for modeling and sampling, which are not necessarily the most relevant ones to explain the system behavior. In addition, empirical data typically undersample the space of possible states. We study a generic framework where a complex system is seen as a system of many interacting degrees of freedom, which are known only in part, that optimize a given function. We show that the underlying distribution with respect to the known variables has the Boltzmann form, with a temperature that depends on the number of unknown variables. In particular, when the influence of the unknown degrees of freedom on the known variables is not too irregular, the temperature decreases as the number of variables increases. This suggests that models can be predictable only when the number of relevant variables is less than a critical threshold. Concerning sampling, we argue that the information that a sample contains on the behavior of the system is quantified by the entropy of the frequency with which different states occur. This allows us to characterize the properties of maximally informative samples: within a simple approximation, the most informative frequency size distributions have power law behavior and Zipf’s law emerges at the crossover between the undersampled regime and the regime where the sample contains enough statistics to make inferences on the behavior of the system. These ideas are illustrated in some applications, showing that they can be used to identify relevant variables or to select the most informative representations of data, e.g. in data clustering. (paper)
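
The sampling criterion above, that the information a sample carries is quantified by the entropy of the frequency with which different states occur, can be sketched in a few lines (the data below are illustrative):

```python
import math
from collections import Counter

def sample_entropy(states):
    """Entropy (in nats) of the empirical frequency of observed states.

    In the spirit of the paper's argument, this quantifies how much a
    sample reveals about the behavior of the underlying system.
    """
    counts = Counter(states)
    n = len(states)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A sample dominated by a single state is less informative than one
# spread evenly over many states.
peaked = ["a"] * 97 + ["b", "c", "d"]
spread = ["a", "b", "c", "d"] * 25
assert sample_entropy(spread) > sample_entropy(peaked)
```

For the uniform sample the entropy reaches its maximum, log of the number of distinct states; heavily undersampled or peaked data sit well below it.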

  8. Data acquisition and control using ETHERNET

    International Nuclear Information System (INIS)

    Elkins, E.P.

    1985-01-01

    We have developed a distributed computer control system to monitor and control a linear accelerator. This system consists of two PDP-11s and eight LSI 11/23s linked together with ETHERNET. The higher level systems (control consoles, etc.) use the RSX11M operating system, whereas the data acquisition and control is performed using the RSX11S operating system downline loaded from a central host computer. Locally written ETHERNET drivers are used to reduce the CPU overhead and therefore improve system response. The ETHERNET system permits remote file access by means of operator or program interaction, as well as supporting downline system loading. Control-system functions supported are supervisory control, closed-loop control, data monitoring, and data recording. 4 refs., 2 figs., 1 tab

  9. Real-time data acquisition and control system for the 349-pixel TACTIC atmospheric Cherenkov imaging telescope

    Energy Technology Data Exchange (ETDEWEB)

    Yadav, K.K.; Koul, R.; Kanda, A.; Kaul, S.R.; Tickoo, A.K. E-mail: aktickoo@apsara.barc.ernet.in; Rannot, R.C.; Chandra, P.; Bhatt, N.; Chouhan, N.; Venugopal, K.; Kothari, M.; Goyal, H.C.; Dhar, V.K.; Kaul, S.K

    2004-07-21

    An interrupt-based multinode data acquisition and control system has been developed for the imaging element of the TACTIC γ-ray telescope. The system, which has been designed around a 3-node network of PCs running the QNX real-time operating system, provides single-point control with elaborate GUI facilities for operating the multi-pixel camera of the telescope. In addition to acquiring data from the 349-pixel photomultiplier tube based imaging camera in real time, the system also provides continuous monitoring and control of several vital parameters of the telescope for ensuring the quality of the data. The paper describes the salient features of the hardware and software of the data acquisition and control system of the telescope.

  10. Data-driven modeling and real-time distributed control for energy efficient manufacturing systems

    International Nuclear Information System (INIS)

    Zou, Jing; Chang, Qing; Arinez, Jorge; Xiao, Guoxian

    2017-01-01

    As manufacturers face the challenges of increasing global competition and energy saving requirements, it is imperative to seek out opportunities to reduce energy waste and overall cost. In this paper, a novel data-driven stochastic manufacturing system modeling method is proposed to identify and predict energy saving opportunities and their impact on production. A real-time distributed feedback production control policy, which integrates the current and predicted system performance, is established to improve the overall profit and energy efficiency. A case study is presented to demonstrate the effectiveness of the proposed control policy. - Highlights: • A data-driven stochastic manufacturing system model is proposed. • Real-time system performance and energy saving opportunity identification method is developed. • Prediction method for future potential system performance and energy saving opportunity is developed. • A real-time distributed feedback control policy is established to improve energy efficiency and overall system profit.

  11. Computer-based supervisory control and data acquisition system for the radioactive waste evaporator

    International Nuclear Information System (INIS)

    Pope, N.G.; Schreiber, S.B.; Yarbro, S.L.; Gomez, B.G.; Nekimken, H.L.; Sanchez, D.E.; Bibeau, R.A.; Macdonald, J.M.

    1994-12-01

    The evaporator process at TA-55 reduces the amount of transuranic liquid radioactive waste by separating radioactive salts from relatively low-level radioactive nitric acid solution. A computer-based supervisory control and data acquisition (SCADA) system has been installed on the process that allows the operators to easily interface with process equipment. Individual single-loop controllers in the SCADA system allow more precise process operation with less human intervention. With this system, process data can be archived in computer files for later analysis. Data are distributed throughout the TA-55 site through a local area network so that real-time process conditions can be monitored at multiple locations. The entire system has been built using commercially available hardware and software components

  12. QSpec: online control and data analysis system for single-cell Raman spectroscopy

    Directory of Open Access Journals (Sweden)

    Lihui Ren

    2014-06-01

    Full Text Available Single-cell phenotyping is critical to the success of biological reductionism. Raman-activated cell sorting (RACS) has shown promise in resolving the dynamics of living cells at the individual level and in uncovering population heterogeneities, in comparison to established approaches such as fluorescence-activated cell sorting (FACS). Given that the number of single cells is massive in any experiment, the power of the Raman profiling technique for single-cell analysis is fully utilized only when coupled with a high-throughput and intelligent process control and data analysis system. In this work, we established QSpec, an automatic system that supports high-throughput Raman-based single-cell phenotyping. Additionally, a single-cell Raman profile database has been established, upon which data mining can be applied to discover the heterogeneity among single cells under different conditions. To test the effectiveness of this control and data analysis system, a sub-system was also developed to simulate the phenotypes of single cells as well as the device features.

  13. Control and data acquisition system for the macromolecular crystallography beamline of SSRF

    International Nuclear Information System (INIS)

    Wang Qisheng; Huang Sheng; Sun Bo; Tang Lin; He Jianhua

    2012-01-01

    The macromolecular crystallography beamline BL17U1 of the Shanghai Synchrotron Radiation Facility (SSRF) is an important platform for structural biology. High performance of the beamline greatly benefits users in their experiments and data acquisition. To take full advantage of the state-of-the-art mechanical and physical design of the beamline, we have made a series of efforts to develop a robust control and data acquisition system with a user-friendly GUI. This was done by adopting the EPICS and Blu-Ice systems on the BL17U1 beamline, with consideration for easy accommodation of new beamline components. In this paper, we report the integration of EPICS and Blu-Ice. By using the EPICS gateway interface and several new DHS, Blu-Ice was successfully established for the BL17U1 beamline. As a result, the experiment control and data acquisition system is reliable and functional for users. (authors)

  14. Engineering Design of ITER Prototype Fast Plant System Controller

    Science.gov (United States)

    Goncalves, B.; Sousa, J.; Carvalho, B.; Rodrigues, A. P.; Correia, M.; Batista, A.; Vega, J.; Ruiz, M.; Lopez, J. M.; Rojo, R. Castro; Wallander, A.; Utzel, N.; Neto, A.; Alves, D.; Valcarcel, D.

    2011-08-01

    The ITER control, data access and communication (CODAC) design team identified the need for two types of plant systems. A slow control plant system is based on industrial automation technology with maximum sampling rates below 100 Hz, and a fast control plant system is based on embedded technology with higher sampling rates and more stringent real-time requirements than that required for slow controllers. The latter is applicable to diagnostics and plant systems in closed-control loops whose cycle times are below 1 ms. Fast controllers will be dedicated industrial controllers with the ability to supervise other fast and/or slow controllers, interface to actuators and sensors and, if necessary, high performance networks. Two prototypes of a fast plant system controller specialized for data acquisition and constrained by ITER technological choices are being built using two different form factors. This prototyping activity contributes to the Plant Control Design Handbook effort of standardization, specifically regarding fast controller characteristics. Envisaging a general purpose fast controller design, diagnostic use cases with specific requirements were analyzed and will be presented along with the interface with CODAC and sensors. The requirements and constraints that real-time plasma control imposes on the design were also taken into consideration. Functional specifications and technology neutral architecture, together with its implications on the engineering design, were considered. The detailed engineering design compliant with ITER standards was performed and will be discussed in detail. Emphasis will be given to the integration of the controller in the standard CODAC environment. Requirements for the EPICS IOC providing the interface to the outside world, the prototype decisions on form factor, real-time operating system, and high-performance networks will also be discussed, as well as the requirements for data streaming to CODAC for visualization and

  15. Open-closed-loop iterative learning control for a class of nonlinear systems with random data dropouts

    Science.gov (United States)

    Cheng, X. Y.; Wang, H. B.; Jia, Y. L.; Dong, YH

    2018-05-01

    In this paper, an open-closed-loop iterative learning control (ILC) algorithm is constructed for a class of nonlinear systems subject to random data dropouts. The ILC algorithm is implemented by a networked control system (NCS), where only the off-line data is transmitted over the network while the real-time data is delivered in a point-to-point way. Thus, there are two controllers rather than one in the control system, which makes better use of the saved and current information and thereby improves the performance achieved by open-loop control alone. During the transfer of off-line data between the nonlinear plant and the remote controller, data dropout occurs randomly, and the data dropout is modeled as a binary Bernoulli random variable. Both measurement and control data dropouts are taken into consideration simultaneously. The convergence criterion is derived based on rigorous analysis. Finally, the simulation results verify the effectiveness of the proposed method.
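
A minimal sketch of the dropout-tolerant learning idea, assuming an illustrative scalar nonlinear plant, a simple P-type learning law, and a Bernoulli delivery mask (none of which are the paper's exact formulation):

```python
import numpy as np

# P-type ILC update with Bernoulli measurement dropouts; the plant,
# gains, and dropout rate are illustrative assumptions.
rng = np.random.default_rng(1)
T = 50                                              # trial length
yd = np.sin(np.linspace(0, 2 * np.pi, T + 1))[1:]   # desired output

def run_trial(u):
    """Simple nonlinear plant x(t+1) = 0.5*sin(x(t)) + u(t), y = x."""
    x, y = 0.0, np.zeros(T)
    for t in range(T):
        x = 0.5 * np.sin(x) + u[t]
        y[t] = x
    return y

u = np.zeros(T)
gamma, p_recv = 0.8, 0.7        # learning gain, delivery probability
max_err = []
for trial in range(30):
    y = run_trial(u)
    e = yd - y
    received = rng.random(T) < p_recv   # Bernoulli data-dropout mask
    u = u + gamma * e * received        # learn only where data arrived
    max_err.append(np.abs(e).max())
# The tracking error shrinks over trials despite the random dropouts.
```

Each time instant is simply updated less often when its measurement is dropped, so the iteration still contracts, which is the intuition behind the convergence analysis.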

  16. Tank vapor sampling and analysis data package for tank 241-C-106 waste retrieval sluicing system process test phase III

    Energy Technology Data Exchange (ETDEWEB)

    LOCKREM, L.L.

    1999-08-13

    This data package presents sampling data and analytical results from the March 28, 1999, vapor sampling of Hanford Site single-shell tank 241-C-106 during active sluicing. Samples were obtained from the 296-C-006 ventilation system stack and ambient air at several locations. Characterization Project Operations (CPO) was responsible for the collection of all SUMMA™ canister samples. The Special Analytical Support (SAS) vapor team was responsible for the collection of all triple sorbent trap (TST), sorbent tube train (STT), polyurethane foam (PUF), and particulate filter samples collected at the 296-C-006 stack. The SAS vapor team used the non-electrical vapor sampling (NEVS) system to collect samples of the air, gases, and vapors from the 296-C-006 stack. The SAS vapor team collected and analyzed these samples for Lockheed Martin Hanford Corporation (LMHC) and Tank Waste Remediation System (TWRS) in accordance with the sampling and analytical requirements specified in the Waste Retrieval Sluicing System Vapor Sampling and Analysis Plan (SAP) for Evaluation of Organic Emissions, Process Test Phase III, HNF-4212, Rev. 0-A, (LMHC, 1999). All samples were stored in a secured Radioactive Materials Area (RMA) until the samples were radiologically released and received by SAS for analysis. The Waste Sampling and Characterization Facility (WSCF) performed the radiological analyses. The samples were received on April 5, 1999.

  17. PXIe based data acquisition and control system for ECRH systems on SST-1 and Aditya tokamak

    Energy Technology Data Exchange (ETDEWEB)

    Patel, Jatinkumar J., E-mail: jatin@ipr.res.in [Institute for Plasma Research, Bhat, Gandhinagar (India); Shukla, B.K.; Rajanbabu, N.; Patel, H.; Dhorajiya, P.; Purohit, D. [Institute for Plasma Research, Bhat, Gandhinagar (India); Mankadiya, K. [Optimized Solutions Pvt. Ltd (India)

    2016-11-15

    Highlights: • Data Acquisition and Control system (DAQ). • PXIe hardware (PCI bus extension for Instrumentation Express). • RHVPS – Regulated High Voltage Power Supply. • SST-1 – Steady State Superconducting tokamak. - Abstract: In the Steady State Superconducting (SST-1) tokamak, various RF heating sub-systems are used for plasma heating experiments. In SST-1, two Electron Cyclotron Resonance Heating (ECRH) systems have been installed for pre-ionization, heating and current drive experiments. The 42 GHz gyrotron based ECRH system is installed and in operation with SST-1 plasma experiments. The 82.6 GHz gyrotron delivers 200 kW CW power (1000 s) while the 42 GHz gyrotron delivers 500 kW power for 500 ms duration. Each gyrotron system consists of various auxiliary power supplies, the crowbar unit and the water cooling system. The PXIe (PCI bus extension for Instrumentation Express) bus based DAC (Data Acquisition and Control) system has been designed and developed, and is under implementation, for safe and reliable operation of the gyrotron. The control and monitoring software applications have been developed using NI LabVIEW 2014 with real-time support on the Windows platform.

  18. PXIe based data acquisition and control system for ECRH systems on SST-1 and Aditya tokamak

    International Nuclear Information System (INIS)

    Patel, Jatinkumar J.; Shukla, B.K.; Rajanbabu, N.; Patel, H.; Dhorajiya, P.; Purohit, D.; Mankadiya, K.

    2016-01-01

    Highlights: • Data Acquisition and Control system (DAQ). • PXIe hardware (PCI bus extension for Instrumentation Express). • RHVPS – Regulated High Voltage Power Supply. • SST-1 – Steady State Superconducting tokamak. - Abstract: In the Steady State Superconducting (SST-1) tokamak, various RF heating sub-systems are used for plasma heating experiments. In SST-1, two Electron Cyclotron Resonance Heating (ECRH) systems have been installed for pre-ionization, heating and current drive experiments. The 42 GHz gyrotron based ECRH system is installed and in operation with SST-1 plasma experiments. The 82.6 GHz gyrotron delivers 200 kW CW power (1000 s) while the 42 GHz gyrotron delivers 500 kW power for 500 ms duration. Each gyrotron system consists of various auxiliary power supplies, the crowbar unit and the water cooling system. The PXIe (PCI bus extension for Instrumentation Express) bus based DAC (Data Acquisition and Control) system has been designed and developed, and is under implementation, for safe and reliable operation of the gyrotron. The control and monitoring software applications have been developed using NI LabVIEW 2014 with real-time support on the Windows platform.

  19. A newly developed grab sampling system for collecting stratospheric air over Antarctica

    Directory of Open Access Journals (Sweden)

    Hideyuki Honda

    1996-07-01

    Full Text Available In order to measure the concentrations of various minor constituents and their isotopic ratios in the stratosphere over Antarctica, a simple grab sampling system was newly developed. The sampling system was designed to be launched by a small number of personnel using a rubber balloon under severe experimental conditions. Special attention was paid to minimizing contamination of the sample air, as well as to allowing easy handling of the system. The sampler consisted mainly of a 15-l sample container with electromagnetic and manual valves, control electronics for executing the air sampling procedures and sending the position and status information of the sampler to the ground station, batteries and a transmitter. All these parts were assembled in an aluminum frame gondola with a shock absorbing system for landing. The sampler was equipped with a turn-over mechanism of the gondola to minimize contamination from the gondola, as well as with a GPS receiver and a rawinsonde for its tracking. The total weight of the sampler was about 11 kg. To receive, display and store the position and status data of the sampling system at the ground station, a simple data acquisition system with a portable receiver and a microcomputer was also developed. A new gas handling system was prepared to simplify the injection of He gas into the balloon. For air sampling experiments, three sampling systems were launched at Syowa Station (69°00′S, 39°35′E), Antarctica, and then recovered on sea ice near the station on January 22 and 25, 1996.

  20. Stability and performance of propulsion control systems with distributed control architectures and failures

    Science.gov (United States)

    Belapurkar, Rohit K.

    Future aircraft engine control systems will be based on a distributed architecture in which the sensors and actuators are connected to the Full Authority Digital Engine Control (FADEC) through an engine area network. A distributed engine control architecture will allow the implementation of advanced, active control techniques while achieving weight reduction, improved performance and lower life-cycle cost. The performance of a distributed engine control system is predominantly dependent on the performance of the communication network. Due to the serial data transmission policy, network-induced time delays and sampling jitter are introduced between the sensor/actuator nodes and the distributed FADEC. Communication network faults and transient node failures may result in data dropouts, which may not only degrade the control system performance but may even destabilize the engine control system. Three different architectures for a turbine engine control system based on a distributed framework are presented. A partially distributed control system for a turbo-shaft engine is designed based on the ARINC 825 communication protocol. Stability conditions and a control design methodology are developed for the proposed partially distributed turbo-shaft engine control system to guarantee the desired performance in the presence of network-induced time delay and random data loss due to transient sensor/actuator failures. A fault-tolerant control design methodology is proposed to benefit from the availability of additional system bandwidth and from the broadcast feature of the data network. It is shown that a reconfigurable fault-tolerant control design can help to reduce the performance degradation in the presence of node failures. A T-700 turbo-shaft engine model is used to validate the proposed control methodology based on both single-input and multiple-input multiple-output control design techniques.
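
The effect of random data loss described above can be illustrated with a toy sketch in which a dropped control packet causes the actuator to hold its last value; the scalar plant, gain, and drop rate are assumed for illustration and are not the T-700 model:

```python
import numpy as np

# Networked control loop with Bernoulli packet drops: on a drop, the
# actuator holds the last delivered command. All values are assumed.
rng = np.random.default_rng(2)
a, b = 1.05, 1.0          # open-loop unstable scalar plant
K = 0.55                  # state-feedback gain (assumed)
p_drop = 0.2              # packet-drop probability

x, u_held = 1.0, 0.0
traj = []
for t in range(500):
    if rng.random() > p_drop:       # packet delivered this sample
        u_held = -K * x
    x = a * x + b * u_held          # dropped packet -> hold last u
    traj.append(abs(x))
# With a moderate drop rate the loop stays stable: delivered samples
# contract the state by (a - b*K) = 0.5, outweighing the drop steps.
```

Raising `p_drop` lengthens runs of stale commands on the unstable plant, which is exactly the mechanism by which dropouts can destabilize a networked loop.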

  1. Asynchronous data change notification between database server and accelerator control systems

    International Nuclear Information System (INIS)

    Wenge Fu; Seth Nemesure; Morris, J.

    2012-01-01

    Database data change notification (DCN) is a commonly used feature: it allows a client to be informed when data have been changed on the server side by another client. Not all database management systems (DBMSs) provide an explicit DCN mechanism. Even for those that do (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work, which makes setting up DCN between a database server and interested clients tedious and time consuming. In accelerator control systems there are many well-established client/server software architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and its clients. The method works with any DBMS that provides database trigger functionality. (authors)
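    A minimal sketch of the trigger-based ADCN idea, using Python's built-in sqlite3 as a stand-in DBMS (the paper's context is Oracle/MS SQL behind CDEV/EPICS/ADO servers): a database trigger appends each change to a log table, and a toy "reflection server" polls the log and pushes new rows to subscribed callbacks. Table and channel names are invented.

    ```python
    import sqlite3

    # In-memory stand-in for the DBMS; a trigger appends to a change-log table.
    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE settings (name TEXT PRIMARY KEY, value REAL);
        CREATE TABLE change_log (id INTEGER PRIMARY KEY AUTOINCREMENT,
                                 name TEXT, value REAL);
        CREATE TRIGGER settings_upd AFTER UPDATE ON settings
        BEGIN
            INSERT INTO change_log (name, value) VALUES (NEW.name, NEW.value);
        END;
    """)

    subscribers = []
    last_seen = 0

    def poll_and_notify():
        """Reflection-server step: forward new log rows to every subscriber."""
        global last_seen
        rows = db.execute("SELECT id, name, value FROM change_log WHERE id > ?",
                          (last_seen,)).fetchall()
        for row_id, name, value in rows:
            last_seen = row_id
            for cb in subscribers:
                cb(name, value)     # asynchronous push to the control client

    received = []
    subscribers.append(lambda name, value: received.append((name, value)))

    db.execute("INSERT INTO settings VALUES ('setpoint', 1.0)")
    db.execute("UPDATE settings SET value = 2.5 WHERE name = 'setpoint'")
    poll_and_notify()
    print(received)   # [('setpoint', 2.5)]
    ```

    The trigger is the only server-side logic needed; everything downstream of the log table is ordinary client/server code, which is the paper's argument for why the scheme ports to any DBMS with triggers.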

  2. The data acquisition and control system for Thomson Scattering on ATF [Advanced Toroidal Facility]

    International Nuclear Information System (INIS)

    Stewart, K.A.; Kindsfather, R.R.; Rasmussen, D.A.

    1989-01-01

    The two-dimensional Thomson scattering system measuring electron temperatures and densities in the Advanced Toroidal Facility (ATF) is interfaced to a VAX-8700 computer system running in a clustered configuration. Calibration, alignment, and operation of this diagnostic are under computer control. Extensive CAMAC instrumentation is used for timing control, data acquisition, and laser alignment. This paper discusses the computer hardware and software, system operations, and data storage and retrieval. 3 refs

  3. New data acquisition system for the Lujan Center

    International Nuclear Information System (INIS)

    Nelson, R.; Bowling, P.S.; Cooper, G.M.; Kozlowski, T.

    2001-01-01

    To meet the data acquisition requirements for six new neutron scattering instruments at the Los Alamos Neutron Science Center (LANSCE), we are building systems using Web tools, commercial hardware and software, software developed by the controls community, and custom hardware developed by the neutron scattering community. To serve these new instruments as well as seven existing instruments, our data acquisition system needs common core software and hardware capabilities and the means to integrate them flexibly while serving the differing needs of the diverse instrument suite. Neutron events are captured and processed in VXI modules, while controls for the sample environment and beam-line setup are handled by PCs. Users typically access the system through web browsers. (author)

  4. New data acquisition system for the Lujan Center

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.; Bowling, P.S.; Cooper, G.M.; Kozlowski, T. [Los Alamos National Laboratory, Los Alamos, NM (United States)

    2001-03-01

    To meet the data acquisition requirements for six new neutron scattering instruments at the Los Alamos Neutron Science Center (LANSCE), we are building systems using Web tools, commercial hardware and software, software developed by the controls community, and custom hardware developed by the neutron scattering community. To serve these new instruments as well as seven existing instruments, our data acquisition system needs common core software and hardware capabilities and the means to integrate them flexibly while serving the differing needs of the diverse instrument suite. Neutron events are captured and processed in VXI modules, while controls for the sample environment and beam-line setup are handled by PCs. Users typically access the system through web browsers. (author)

  5. Expression and methylation data from SLE patient and healthy control blood samples subdivided with respect to ARID3a levels

    Directory of Open Access Journals (Sweden)

    Julie M. Ward

    2016-12-01

    Full Text Available Previously published studies revealed that variation in expression of the DNA-binding protein ARID3a in B lymphocytes from patients with systemic lupus erythematosus (SLE) correlated with levels of disease activity (“Disease activity in systemic lupus erythematosus correlates with expression of the transcription factor AT-rich-interactive domain 3A” (J.M. Ward, K. Rose, C. Montgomery, I. Adrianto, J.A. James, J.T. Merrill et al., 2014 [1]). The data presented here compare DNA methylation patterns of SLE peripheral blood mononuclear cells from samples with high numbers of ARID3a-expressing B cells (ARID3aH) versus SLE samples with normal numbers of ARID3a+ B cells (ARID3aN). The methylation data are available at the Gene Expression Omnibus (GEO) repository, “Gene Expression Omnibus: NCBI gene expression and hybridization array data repository” (R. Edgar, M. Domrachev, A.E. Lash, 2002 [2]). Isolated B cells from SLE ARID3aH and ARID3aN samples were also evaluated by qRT-PCR for expression levels of Type I interferon (IFN) signature and pathway genes. Similarly, healthy control B cells and B cells stimulated to express ARID3a with the TLR agonist CpG were compared by qRT-PCR. Primers designed to detect 6 IFNa subtype mRNAs were tested in 4 Epstein-Barr virus-transformed B cell lines (“Reduced interferon-alpha production by Epstein-Barr virus transformed B-lymphoblastoid cell lines and lectin-stimulated lymphocytes in congenital dyserythropoietic anemia type I” (S.H. Wickramasinghe, R. Hasan, J. Smythe, 1997 [3]). The data in this article support the publication, “Human effector B lymphocytes express ARID3a and secrete interferon alpha” (J.M. Ward, M.L. Ratliff, M.G. Dozmorov, G. Wiley, J.M. Guthridge, P.M. Gaffney, J.A. James, C.F. Webb, 2016 [4].

  6. Secure Data Transfer Guidance for Industrial Control and SCADA Systems

    Energy Technology Data Exchange (ETDEWEB)

    Mahan, Robert E.; Fluckiger, Jerry D.; Clements, Samuel L.; Tews, Cody W.; Burnette, John R.; Goranson, Craig A.; Kirkham, Harold

    2011-09-01

    This document was developed to provide guidance for the implementation of secure data transfer in a complex computational infrastructure representative of the electric power and oil and natural gas enterprises and the control systems they operate. For the past 20 years the cyber security community has focused on preventative measures intended to keep systems secure by providing a hard outer shell that is difficult to penetrate. Over time, this hard-exterior, soft-interior approach gave way to defense-in-depth: adding multiple layers of protection, introducing intrusion detection systems, improving incident response and cleanup, and many other security measures. Despite much larger expenditures and more layers of defense, successful attacks have only increased in number and severity. Consequently, it is time to re-focus the conventional approach to cyber security. While it is still important to implement measures to keep intruders out, a new protection paradigm is warranted that is aimed at discovering attempted or actual compromises as early as possible. Put simply, organizations should take it as fact that they have been, are now, or will be compromised. These compromises may be intended to steal information for financial gain, as in the theft of intellectual property or of credentials that lead to the theft of financial resources, or to lie silent until instructed to cause physical or electronic damage and/or denial of service. This change in outlook has recently been confirmed by the National Security Agency [19]. The discovery of attempted and actual compromises requires an increased focus on monitoring events by manual and/or automated log monitoring, detecting unauthorized changes to a system's hardware and/or software, detecting intrusions, and discovering the exfiltration of sensitive information or attempts to send inappropriate commands to ICS/SCADA (Industrial Control System/Supervisory Control And Data Acquisition) systems.

  7. Computer software design description for the integrated control and data acquisition system LDUA system

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components

  8. Real-time data acquisition and feedback control using Linux Intel computers

    International Nuclear Information System (INIS)

    Penaflor, B.G.; Ferron, J.R.; Piglowski, D.A.; Johnson, R.D.; Walker, M.L.

    2006-01-01

    This paper describes the experiences of the DIII-D programming staff in adapting Linux based Intel computing hardware for use in real-time data acquisition and feedback control systems. Due to the highly dynamic and unstable nature of magnetically confined plasmas in tokamak fusion experiments, real-time data acquisition and feedback control systems are in routine use with all major tokamaks. At DIII-D, plasmas are created and sustained using a real-time application known as the digital plasma control system (PCS). During each experiment, the PCS periodically samples data from hundreds of diagnostic signals and provides these data to control algorithms implemented in software. These algorithms compute the necessary commands to send to various actuators that affect plasma performance. The PCS consists of a group of rack mounted Intel Xeon computer systems running an in-house customized version of the Linux operating system tailored specifically to meet the real-time performance needs of the plasma experiments. This paper provides a more detailed description of the real-time computing hardware and custom developed software, including recent work to utilize dual Intel Xeon equipped computers within the PCS
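    The PCS cycle described above (sample the diagnostic signals, run each control algorithm, dispatch actuator commands, all on a fixed period) can be sketched generically. The function names, the 1 ms period and the density-control law below are illustrative stand-ins, not DIII-D's actual code.

    ```python
    import time

    def control_cycle(read_sensors, algorithms, send_commands,
                      period_s=0.001, cycles=50):
        """One PCS-like loop: sample diagnostics, run each control algorithm
        on the samples, dispatch actuator commands, and count missed deadlines."""
        missed = 0
        next_deadline = time.monotonic() + period_s
        for _ in range(cycles):
            samples = read_sensors()                 # hundreds of signals in reality
            commands = {}
            for name, algo in algorithms.items():
                commands[name] = algo(samples)       # software control algorithm
            send_commands(commands)                  # drive the actuators
            now = time.monotonic()
            if now > next_deadline:
                missed += 1                          # overran the sample period
            else:
                time.sleep(next_deadline - now)
            next_deadline += period_s
        return missed

    log = []
    missed = control_cycle(
        read_sensors=lambda: {"density": 1.0},
        algorithms={"gas_valve": lambda s: 0.5 * (2.0 - s["density"])},
        send_commands=log.append,
    )
    print(len(log), log[0])   # 50 {'gas_valve': 0.5}
    ```

    The deadline counter is the interesting part: on a general-purpose OS like Linux, verifying that cycles rarely overrun the sample period is exactly the real-time tailoring problem the abstract alludes to.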

  9. A flexible LabVIEW™-based data acquisition and analysis system for scanning microscopy

    International Nuclear Information System (INIS)

    Morse, Daniel H.; Antolak, Arlyn J.; Bench, Graham S.; Roberts, Mark L.

    1998-01-01

    A new data analysis system has been developed with computer-controlled beam and sample positioning, video sample imaging, multiple large-solid-angle detectors for x-rays and gamma-rays, and surface-barrier detectors for charged particles. The system uses the LabVIEW™ programming language, allowing it to be ported easily between different computer operating systems. In the present configuration, digital signal processors are interfaced directly to a SCSI CAMAC controller; the modular software design, however, permits the substitution of other hardware with LabVIEW-supported drivers. On-line displays of histograms and two-dimensional elemental-map images provide a user-friendly data acquisition interface. Subregions of the two-dimensional maps may be selected interactively for detailed analysis or for subsequent scanning. Off-line processing of archived data currently yields elemental maps, analyzed spectra and reconstructions of tomographic data
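    The interactive subregion selection described above amounts to slicing a region of interest out of a two-dimensional elemental map, either for detailed off-line analysis or to define the next scan. A minimal illustration with invented count data:

    ```python
    def subregion(element_map, row0, col0, rows, cols):
        """Extract a rectangular region of interest from a 2-D elemental map
        (list of rows), e.g. to re-scan or analyse a hotspot in detail."""
        return [row[col0:col0 + cols] for row in element_map[row0:row0 + rows]]

    def mean_counts(region):
        """Average counts over a region, a simple per-element summary."""
        flat = [v for row in region for v in row]
        return sum(flat) / len(flat)

    # Hypothetical 4x4 Fe map with an elevated 2x2 hotspot in the corner
    fe_map = [[1, 1, 1, 1],
              [1, 1, 1, 1],
              [1, 1, 9, 9],
              [1, 1, 9, 9]]
    roi = subregion(fe_map, 2, 2, 2, 2)
    print(roi, mean_counts(roi))   # [[9, 9], [9, 9]] 9.0
    ```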

  10. Data acquisition and control system for the High-Level Waste Tank Farm at Hanford, Washington

    International Nuclear Information System (INIS)

    Hoida, H.W.; Hatcher, C.R.; Trujillo, L.T.; Holt, D.H.; Vargo, G.F.; Martin, J.; Stastny, G.; Echave, R.; Eldridge, K.

    1993-01-01

    The High-Level Nuclear Waste Storage Tank 241-SY-101 periodically releases flammable gases. Mitigation experiments that release the gases continuously, to avoid a catastrophic build-up, are planned for FY93 and beyond. Los Alamos has provided a data acquisition and control system (DACS) to monitor and control the mitigation experiments on SY-101. The DACS consists of a data acquisition trailer that houses the electronic components and computers in a friendly environment, a computer system running process-control software for monitoring and controlling the tests, signal conditioners to convert the instrument signals to a form usable by the DACS, programmable logic controllers to process sensor signals and take action quickly, a fast data acquisition system for recording transient data, and a remote monitoring system to follow the progress of the experiment. Equipment to monitor the release of the gases was also provided. The first experiment uses a mixer pump to mix the waste so that the gases are released at the surface of the liquid as they are formed. The initial tests are scheduled for July 1993

  11. Multi-Level Data-Security and Data-Protection in a Distributed Search Infrastructure for Digital Medical Samples.

    Science.gov (United States)

    Witt, Michael; Krefting, Dagmar

    2016-01-01

    Human sample data are stored in biobanks, with software managing the digitally derived sample data. When these stand-alone components are connected and a search infrastructure is employed, users can collect the required research data from different data sources. Data protection, patient rights, data heterogeneity and access control are major challenges for such an infrastructure. This dissertation will investigate concepts for a multi-level security architecture that complies with these requirements.

  12. Design of Tokamak synchronous data acquisition system based on PXI express

    International Nuclear Information System (INIS)

    Liu Rui; Zheng Wei; Zhang Ming; Weng Chuqiao; Zhuang Ge; Ding Tonghai; Yu Kexun

    2014-01-01

    With the development of the J-TEXT device, the original data acquisition system could no longer meet the experiment's requirements for stability, modularity and sampling rate, so a new data acquisition system had to be built. This paper introduces the design and implementation of a distributed, synchronous, high-speed Tokamak data acquisition system based on PXI Express. The acquisition unit consists of a PXIe chassis (NI PXIe-1062Q), a PXIe controller (NI PXIe-8133) and high-speed synchronous data acquisition cards (NI PXIe-6368), and is compatible with the latest ITER CODAC standard, so it offers good mechanical sealing, strong modularity, a high sampling rate, etc. The system performs synchronous differential acquisition of the diagnostic signals. Data storage uses MDSplus, the database generally used in the nuclear fusion field. Test and experimental results show that the system works continuously and stably at a 2 MSps sampling rate and meets the operational requirements of the experimental device well. (authors)

  13. Dependability analysis of the data communication system in train control system

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Communication-based train control (CBTC) is based on mobile communication and overcomes fixed blocks in order to increase track utilization and train safety. The data communication system (DCS) between trains and wayside equipment is a crucial factor in the safe and efficient operation of a CBTC system, so its dependability under various transmission conditions needs to be modeled and evaluated. In this paper, a stochastic reward net (SRN) model for a DCS based on the IEEE 802.11 standard was developed, which captures all relevant failure and recovery behavior of the system in a concise way. We compared the reliability and availability of the DCS with and without redundant access point (AP) and antenna configurations. We also quantitatively evaluated and compared the frame-loss probability for three DCS configurations with different train velocities and numbers of trains in one radio cell. Fixed-point iteration was adopted to simplify the analysis. Numerical results showed significant improvement in the reliability, availability and frame-loss probability for the fully redundant configuration.
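    The paper evaluates dependability with stochastic reward nets; a much simpler closed-form illustration of why AP/antenna redundancy helps is the textbook steady-state availability calculation below. The MTBF/MTTR figures are invented, not taken from the paper.

    ```python
    def availability(mtbf_h, mttr_h):
        """Steady-state availability of one component: uptime fraction."""
        return mtbf_h / (mtbf_h + mttr_h)

    def parallel(*avails):
        """Availability of a redundant (1-out-of-N) configuration: the link
        is down only when every replica is down simultaneously."""
        down = 1.0
        for a in avails:
            down *= (1.0 - a)
        return 1.0 - down

    a_single = availability(mtbf_h=2000.0, mttr_h=8.0)   # one AP + antenna
    a_redund = parallel(a_single, a_single)              # duplicated AP/antenna
    print(round(a_single, 6), round(a_redund, 6))        # 0.996016 0.999984
    ```

    Duplication squares the unavailability, which is the same qualitative conclusion the SRN analysis reaches for the fully redundant DCS configuration (the SRN additionally captures repair dependencies and frame-loss dynamics that this closed form ignores).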

  14. Sampling from a system-theoretic viewpoint

    NARCIS (Netherlands)

    Meinsma, Gjerrit; Mirkin, Leonid

    2009-01-01

    This paper studies a system-theoretic approach to the problem of reconstructing an analog signal from its samples. The idea, borrowed from earlier treatments in the control literature, is to address the problem as a hybrid model-matching problem in which performance is measured by system norms.

  15. Control and Data Acquisition System for the Spanish Beamline (BM25) at the ESRF

    International Nuclear Information System (INIS)

    Pereira Gonzalez, A.; Olalla Garcia, C.; Sanchez Sanz, J.; Castro, G. R.

    2005-01-01

    A new control and data acquisition system has been developed for the BM25 Spanish beamline at the ESRF. The system is based on VMEbus, the Motorola PReP architecture and the Linux operating system, and is linked to a local Ethernet network that provides communication with the servers (PC workstations), on which the data are made available for general use and analysis. The data acquisition consists of many channels, mostly independent of one another, connected to the VME crates and fully programmable through drivers, CLUIs and GUIs, plus a set of independent systems (embedded systems, PLCs and others) controlling the security aspects. This report describes the system in terms of its architecture, the electronics interfacing to the process hardware, and the functionality and application development facilities provided by the software and the data acquisition. (Author) 18 refs

  16. Computer systems for nuclear installation data control

    International Nuclear Information System (INIS)

    1987-09-01

    The computer programs developed by the Divisao de Instalacoes Nucleares (DIN) of the Brazilian CNEN for data control of nuclear installations in Brazil are presented. The following computer programs are described: control of registered companies; control of industrial sources, irradiators and monitors; control of liable persons; control of industry irregularities; elaboration of credence tests; shielding analysis; and control of waste refuge [pt]

  17. Integrating the Theory of Sampling into Underground Mine Grade Control Strategies

    Directory of Open Access Journals (Sweden)

    Simon C. Dominy

    2018-05-01

    Full Text Available Grade control in underground mines aims to deliver quality tonnes to the process plant via the accurate definition of ore and waste. It comprises a decision-making process including data collection and interpretation; local estimation; development and mining supervision; ore and waste destination tracking; and stockpile management. The foundation of any grade control programme is that of high-quality samples collected in a geological context. The requirement for quality samples has long been recognised, where they should be representative and fit-for-purpose. Once a sampling error is introduced, it propagates through all subsequent processes contributing to data uncertainty, which leads to poor decisions and financial loss. Proper application of the Theory of Sampling reduces errors during sample collection, preparation, and assaying. To achieve quality, sampling techniques must minimise delimitation, extraction, and preparation errors. Underground sampling methods include linear (chip and channel, grab (broken rock, and drill-based samples. Grade control staff should be well-trained and motivated, and operating staff should understand the critical need for grade control. Sampling must always be undertaken with a strong focus on safety and alternatives sought if the risk to humans is high. A quality control/quality assurance programme must be implemented, particularly when samples contribute to a reserve estimate. This paper assesses grade control sampling with emphasis on underground gold operations and presents recommendations for optimal practice through the application of the Theory of Sampling.
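    The paper's point that an error introduced at any sampling stage propagates through all subsequent processes can be illustrated with the standard rule that independent relative error variances add. The stage percentages below are hypothetical, not values from the paper.

    ```python
    def total_relative_sd(*component_rsds):
        """Independent sampling-error components (e.g. delimitation,
        extraction, preparation, analysis), each expressed as a relative
        standard deviation, combine by adding their variances."""
        return sum(r * r for r in component_rsds) ** 0.5

    # Hypothetical stage errors: 10% sample collection, 5% preparation, 2% assay
    print(round(total_relative_sd(0.10, 0.05, 0.02), 4))   # 0.1136
    ```

    Note how the total is dominated by the worst stage: halving the 2% assay error would barely change the result, while the 10% collection error controls it, which is why the Theory of Sampling focuses effort on the primary sampling step.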

  18. Online data processing system

    International Nuclear Information System (INIS)

    Nakahara, Yoshinori; Yagi, Hideyuki; Yamada, Takayuki

    1979-02-01

    A pulse height analyzer terminal system, PHATS, has been developed for online data processing via the JAERI-TOKAI computer network. The system is controlled by a microcomputer, MICRO-8, which was developed for the JAERI-TOKAI network. The system program consists of two subprograms: the online control system ONLCS and the pulse height analyzer control system PHACS. ONLCS links the terminal with the conversational programming system of the FACOM 230/75 through the JAERI-TOKAI network and controls data processing in TSS and remote batch modes. PHACS controls the input/output of data between the pulse height analyzer and a cassette MT or typewriter. This report describes the hardware configuration and the system program in detail. The appendix explains the real-time monitor, the message types, and the PEX-to-PEX and Host-to-Host protocols required for the system programming. (author)

  19. Supervisory Control and Data Acquisition (SCADA) Systems and Cyber-Security: Best Practices to Secure Critical Infrastructure

    Science.gov (United States)

    Morsey, Christopher

    2017-01-01

    In the critical infrastructure world, many sectors use a Supervisory Control and Data Acquisition (SCADA) system; among them are the electric power, nuclear power and water sectors. These systems are used to control, monitor and extract data from the systems that give us all the ability to light our homes…

  20. Computer control and data acquisition system for the Mirror Fusion Test Facility Ion Cyclotron Resonant Heating System (ICRH)

    International Nuclear Information System (INIS)

    Cheshire, D.L.; Thomas, R.A.

    1985-01-01

    The Lawrence Livermore National Laboratory (LLNL) large Mirror Fusion Test Facility (MFTF-B) will employ an Ion Cyclotron Resonant Heating (ICRH) system for plasma startup. As the MFTF-B Industrial Participant, TRW is responsible for the ICRH system, including development of its data acquisition and control system. During MFTF-B operation, the ICRH system will be run from the MFTF-B Supervisory Control and Diagnostic System (SCDS). For subsystem development and checkout at TRW, and for verification and acceptance testing at LLNL, the system will be run from a stand-alone computer system designed to simulate the functions of SCDS. This ''SCDS Simulator'' was developed originally for the MFTF-B ECRH system; descriptions of the hardware and software are updated in this paper. The computer control and data acquisition functions implemented for ICRH are described, including development status and the test schedule at TRW and at LLNL. The application software is written for the SCDS Simulator, but it is programmed in PASCAL and designed to facilitate conversion for use on the SCDS computers

  1. Computer system design description for SY-101 hydrogen mitigation test project data acquisition and control system (DACS-1). Revision 1

    International Nuclear Information System (INIS)

    Truitt, R.W.

    1994-01-01

    This document provides descriptions of the components and tasks that are involved in the computer system for data acquisition and control of the mitigation tests conducted on waste tank SY-101 at the Hanford Nuclear Reservation. The system was designed and implemented by Los Alamos National Laboratory and supplied to Westinghouse Hanford Company. The computers (both personal computers and specialized data-taking computers) and the software programs of the system will hereafter collectively be referred to as the DACS (Data Acquisition and Control System)

  2. State and data techniques for control of discontinuous systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1986-01-01

    The need for automated control systems becomes clear as the complexity of nuclear power plants increases and economic incentives demand higher plant availability. A control system with intelligence distributed throughout its controllers allows reduction in operator workload, perhaps reduction in crew size, and potentially a reduction in on-line human error. In automated systems of this kind, each controller should be capable of making decisions and carrying out a plan of action. This paper describes a technique for structured analysis and design of automated control systems. The technique integrates control of continuous and discontinuous nuclear power plant subsystems and components. A hierarchical control system with distributed intelligence follows from applying the technique. Further, it can be applied to all phases of control system design. For simplicity, the example used in the paper is limited to phase I design (basic automatic control action), in which no maintenance, testing, or contingency capability is attempted

  3. Control, data acquisition and remote participation for steady-state operation in LHD

    International Nuclear Information System (INIS)

    Sudo, S.; Nagayama, Y.; Emoto, M.; Nakanishi, H.; Chikaraishi, H.; Imazu, S.; Iwata, C.; Kogi, Y.; Kojima, M.; Komada, S.; Kubo, S.; Kumazawa, R.; Mase, A.; Miyazawa, J.; Mutoh, T.; Nakamura, Y.; Nonomura, M.; Ohsuna, M.; Saito, K.; Sakamoto, R.; Seki, T.; Shoji, M.; Tsuda, K.; Yoshida, M.

    2006-01-01

    Control, data acquisition, plasma monitoring and remote participation for steady-state operation in the Large Helical Device (LHD) are reviewed. By controlling the impedance matching of ICH, the plasma position and the electron density, a high-temperature plasma has been confined for 1905 s. The plasma parameters are monitored in real time. Data are continuously sampled by the YOKOGAWA WE7000 system and by the NATIONAL INSTRUMENTS CompactPCI system. These data are managed by an object-oriented database system based on ObjectStore, in distributed servers with mass storage. By using multi-protocol label switching virtual private network (MPLS-VPN) technology, the local area network of LHD is extended to the Japanese fusion community. This provides the remote participants with the same environment as the LHD control room

  4. Control, data acquisition and remote participation for steady-state operation in LHD

    Energy Technology Data Exchange (ETDEWEB)

    Sudo, S. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan)]. E-mail: sudo@nifs.ac.jp; Nagayama, Y. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Emoto, M. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Nakanishi, H. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Chikaraishi, H. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Imazu, S. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Iwata, C. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Kogi, Y. [KASTEC, Kyushu University, Kasuga 816-8580 (Japan); Kojima, M. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Komada, S. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Kubo, S. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Kumazawa, R. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Mase, A. [KASTEC, Kyushu University, Kasuga 816-8580 (Japan); Miyazawa, J. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Mutoh, T. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Nakamura, Y. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Nonomura, M. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Ohsuna, M. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Saito, K. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan); Sakamoto, R.; Seki, T.; Shoji, M.; Tsuda, K.; Yoshida, M. [National Institute of Natural Sciences, 322-6 Oroshi, Toki 509-5292 (Japan)

    2006-07-15

    Control, data acquisition, plasma monitoring and remote participation for steady-state operation in the Large Helical Device (LHD) are reviewed. By controlling the impedance matching of ICH, the plasma position and the electron density, a high-temperature plasma has been confined for 1905 s. The plasma parameters are monitored in real time. Data are continuously sampled by the YOKOGAWA WE7000 system and by the NATIONAL INSTRUMENTS CompactPCI system. These data are managed by an object-oriented database system based on ObjectStore, in distributed servers with mass storage. By using multi-protocol label switching virtual private network (MPLS-VPN) technology, the local area network of LHD is extended to the Japanese fusion community. This provides the remote participants with the same environment as the LHD control room.

  5. Control, monitoring and data acquisition systems in pilot plant for tritium and deuterium separation

    International Nuclear Information System (INIS)

    Retevoi, Carmen; Balteanu, Ovidiu Ioan

    1999-01-01

    To achieve control, monitoring and data acquisition for a pilot plant for tritium and deuterium separation, we have developed a computer-based system that transfers and processes all the data from the physical system. It consists of six basic elements: 1. a process computer; 2. a National Instruments amplifier/multiplexer SCXI-1000 with an SCXI-1100 module providing 32 differential input channels; 3. a Honeywell digital process recorder DPR 250, with 32 universal inputs, 12 digital inputs and 12 internal relays; 4. a control system for 4 throttle valves; 5. a National Instruments data acquisition board AT-MIO-16XE-10, with 8 differential channels; 6. a system of up to 20 digital programming current units for carbon RTDs. All the parameters from transducers, sensors and transmitters are fed into the multiplexer and then into the data acquisition board. With LabVIEW software support (a National Instruments product), we built a graphical interface that displays the plant, all the parameters and their measurement points, and logs the data to a file. In parallel, all the pressure, flow and level values are monitored by the DPR 250 recorder, which has an RS232/RS485 port for PC communication. The temperatures are measured with carbon RTDs and a system comprising 20 programming current units connected by an RS485 serial bus and an RS485/RS232 converter directly to the serial port of the process computer; a dedicated program performs the voltage-to-temperature conversion. The control system for the throttle valves comprises a central unit that communicates over an RS232 bus with 4 controllers commanding 4 stepping motors, each linked through a reduction gear to its throttle valve. This system can operate in either manual or automatic mode, and the central unit can communicate with the process computer via an RS232 link. In this way the process computer can receive all the parameters over the RS232/RS485 link or directly through the multiplexer.
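    The voltage-to-temperature conversion mentioned for the carbon RTD channels is typically a piecewise-linear interpolation over a calibration table; a generic sketch, with invented calibration points (voltages in millivolts):

    ```python
    import bisect

    def volts_to_temp(cal_table, v):
        """Piecewise-linear interpolation over a (voltage, temperature)
        calibration table sorted by voltage; clamps outside the table."""
        volts = [p[0] for p in cal_table]
        i = bisect.bisect_left(volts, v)
        if i == 0:
            return cal_table[0][1]
        if i == len(cal_table):
            return cal_table[-1][1]
        (v0, t0), (v1, t1) = cal_table[i - 1], cal_table[i]
        return t0 + (t1 - t0) * (v - v0) / (v1 - v0)

    # Hypothetical calibration points for one carbon RTD channel (mV, K)
    cal = [(100, 20.0), (200, 77.0), (400, 300.0)]
    print(volts_to_temp(cal, 150))   # 48.5
    ```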

  6. Joint sampling programme-Verification of data obtained in environmental monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Lauria, D.C. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/no., CEP 22780-160, Rio de Janeiro, RJ (Brazil)], E-mail: dejanira@ird.gov.br; Martins, N.S.F.; Vasconcellos, M.L.H.; Zenaro, R.; Peres, S.S.; Pires do Rio, M.A. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/no., CEP 22780-160, Rio de Janeiro, RJ (Brazil)

    2008-11-15

    The objective of the Environmental Radiological Monitoring Control programme carried out by the Institute of Radiation Protection and Dosimetry (IRD) in Brazil is to verify licensees' compliance with the requirements for environmental monitoring of Brazilian facilities. The Joint Sampling Programme (JSP) is one part of this control programme. To verify that the data reported by the licensees are representative and legitimate, the programme checks sampling procedures, the accuracy and precision of the data, and changes in the environmental conditions. This paper discusses the main findings of the programme, which have allowed IRD to optimize its available resources for controlling the monitoring of the eight facilities in Brazil.

  7. Data acquisition and control system with a programmable logic controller (PLC) for a pulsed chemical oxygen-iodine laser

    Science.gov (United States)

    Yu, Haijun; Li, Guofu; Duo, Liping; Jin, Yuqi; Wang, Jian; Sang, Fengting; Kang, Yuanfu; Li, Liucheng; Wang, Yuanhu; Tang, Shukai; Yu, Hongliang

    2015-02-01

    A user-friendly data acquisition and control system (DACS) for a pulsed chemical oxygen-iodine laser (PCOIL) has been developed. It is implemented with an industrial control computer, a PLC, and distributed input/output (I/O) modules, as well as valves and transmitters. The system is capable of handling 200 analogue/digital channels, performing operations such as on-line acquisition, display, safety measures and control of various valves. These operations are controlled either by switches configured on a PC when the laser is not running, or by a pre-determined sequence of timings during a run. The system is capable of real-time acquisition and on-line estimation of important diagnostic parameters for optimization of a PCOIL. The DACS has been programmed using programmable logic controller (PLC) software. Using this DACS, more than 200 runs have been performed successfully.
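    The pre-determined sequence of timings that drives a run can be pictured as a simple time-ordered action table. The valve names and timings below are invented for illustration, not taken from the PCOIL system:

```python
# Toy run sequence: (time in seconds from run start, action, valve name).
# All entries are illustrative assumptions.
SEQUENCE = [
    (0.00, "open",  "oxidizer_valve"),
    (0.05, "open",  "iodine_valve"),
    (0.25, "close", "iodine_valve"),
    (0.30, "close", "oxidizer_valve"),
]

def actions_until(t):
    """Return all (action, valve) pairs scheduled up to time t (seconds)."""
    return [(act, valve) for when, act, valve in SEQUENCE if when <= t]
```

    A real sequencer would dispatch each action at its scheduled instant and abort on a safety interlock; this sketch only shows the data-driven structure.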

  8. Position-controlled data acquisition embedded system for magnetic NDE of bridge stay cables.

    Science.gov (United States)

    Maldonado-Lopez, Rocio; Christen, Rouven

    2011-01-01

    This work presents a custom-tailored sensing and data acquisition embedded system, designed to be integrated in a new magnetic NDE inspection device under development at Empa, a device intended for routine testing of large diameter bridge stay cables. The data acquisition (DAQ) system fulfills the speed and resolution requirements of the application and is able to continuously capture and store up to 2 GB of data at a sampling rate of 27 kS/s, with 12-bit resolution. This paper describes the DAQ system in detail, including both hardware and software implementation, as well as the key design challenges and the techniques employed to meet the specifications. Experimental results showing the performance of the system are also presented.
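    The stated figures can be cross-checked with a quick calculation. Assuming each 12-bit sample occupies a 16-bit word (an assumption, since the packing is not specified), 2 GB of storage corresponds to roughly 11 hours of continuous single-channel recording at 27 kS/s:

```python
# Back-of-envelope check of the stated DAQ capacity.
# Assumption: one 12-bit sample stored per 16-bit word; 2 GB = 2 * 2**30 bytes.
CAPACITY_BYTES = 2 * 2**30
BYTES_PER_SAMPLE = 2
RATE_SPS = 27_000  # 27 kS/s

samples = CAPACITY_BYTES // BYTES_PER_SAMPLE  # total samples that fit
seconds = samples / RATE_SPS                  # continuous recording time
hours = seconds / 3600                        # about 11 hours
```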

  9. Work plan for SY Farm Integrated Data Acquisition and Control System (DACS-2a)

    International Nuclear Information System (INIS)

    Conner, R.P.; Katz, R.S.

    1994-01-01

    The SY Farm currently has a temporary Data Acquisition ampersand Control System (DACS) housed in a mobile trailer. The system is currently referred to as DACS-1. It was designed and configured to support engineers and scientists conducting the special performance evaluation and testing program for the safety mitigation test equipment located in waste tank 241-SY-101 (101-SY). It is currently being maintained and utilized by engineering personnel to monitor and control the 101-SY mitigation pump activities. Based upon the results of the mitigation testing program, some of the temporary test mitigation equipment (such as mixing pump) will be replaced with longer-term ''operational'' mitigation equipment. This is resulting in new requirements for the Data Acquisition and Control System which will be full-filled by a newer control facility referred to as the DACS-2. A teaming between Westinghouse Hanford Company (WHC) and Los Alamos National Laboratory (LANL) has been established for the SY farm mitigation program in order to develop and implement the ''next generation'' of the data acquisition and control system for the mitigation pump operations. The new system will be configured for use by the tank farm operational personnel. It will support the routine operations necessary for safety mitigation and the future waste retrieval of Project W-211. It is intended to replace the existing DACS-1 and provide the necessary control room space for future integration of W-211

  10. Accurate Sample Time Reconstruction of Inertial FIFO Data

    Directory of Open Access Journals (Sweden)

    Sebastian Stieber

    2017-12-01

    Full Text Available In the context of modern cyber-physical systems, the accuracy of the underlying sensor data plays an increasingly important role in sensor data fusion and feature extraction. The raw events of multiple sensors have to be aligned in time to enable high-quality sensor fusion results. However, the growing number of simultaneously connected sensor devices makes energy-saving data acquisition and processing more and more difficult. Hence, most modern sensors offer a first-in-first-out (FIFO) interface to store multiple data samples and to relax timing constraints when handling multiple sensor devices. However, using the FIFO interface increases the negative influence of individual clock drifts (introduced by fabrication inaccuracies, temperature changes and wear-out effects) on the sampling data reconstruction. Furthermore, additional timing offset errors due to communication and software latencies increase with a growing number of sensor devices. In this article, we present an approach for accurate sample time reconstruction, independent of the actual clock drift, with the help of an internal sensor timer. Such timers are already available in modern sensors manufactured in micro-electromechanical systems (MEMS) technology. The presented approach focuses on calculating accurate time stamps using the sensor FIFO interface in a forward-only processing manner as a robust and energy-saving solution. The proposed algorithm lowers the overall standard deviation of reconstructed sampling periods below 40 μs, while run-time savings of up to 42% are achieved compared to single-sample acquisition.
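    The core idea of drift-independent reconstruction can be sketched as follows: the sensor's internal timer measures the true span of each FIFO batch, and the host spreads the sample timestamps evenly across that span. Function and parameter names are illustrative, not from the article:

```python
# Sketch of drift-independent sample-time reconstruction for one FIFO read.
# The sensor timer, not the nominal data-sheet rate, determines the period,
# so individual clock drift cancels out. Names and the tick resolution
# (1 tick = 1 microsecond) are illustrative assumptions.

def reconstruct_times(t_read, timer_prev, timer_now, n_samples, tick_s=1e-6):
    """Return per-sample host timestamps for one FIFO batch.

    t_read     -- host time (s) when the FIFO was emptied
    timer_prev -- sensor timer value (ticks) at the previous FIFO read
    timer_now  -- sensor timer value (ticks) at this read
    n_samples  -- number of samples pulled from the FIFO
    """
    span = (timer_now - timer_prev) * tick_s   # true batch duration
    period = span / n_samples                  # drift-corrected sample period
    # Assumption: the newest sample coincides with the read instant.
    return [t_read - (n_samples - 1 - i) * period for i in range(n_samples)]
```

    A forward-only implementation, as in the article, would carry `timer_now` over as the next batch's `timer_prev` and never revisit emitted timestamps.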

  11. A Framework for Control System Design Subject to Average Data-Rate Constraints

    DEFF Research Database (Denmark)

    Silva, Eduardo; Derpich, Milan; Østergaard, Jan

    2011-01-01

    This paper studies discrete-time control systems subject to average data-rate limits. We focus on a situation where a noisy linear system has been designed assuming transparent feedback and, due to implementation constraints, a source-coding scheme (with unity signal transfer function) has to be ...

  12. Development of a supervisory control and data acquisitioning system

    International Nuclear Information System (INIS)

    Kamboh, A.M.; Fakhar, H.A.; Rafiq, G.; Kazmi, S.R.

    2004-01-01

    This paper describes a Supervisory Control and Data Acquisitioning system called SCADA that we have developed at NUST. This research is aimed at the development of a network where a central Control Server extends bidirectional data exchange capability to hundreds of geographically remote sensors and actuators spread over a distance greater than 3 km. Several battery-driven handheld terminals called Remote Terminal Units (RTUs) have been designed to provide both wired and wireless connectivity between sensors and the network, also adding limited mobility to the sensors. Simple transceivers give the RTUs wireless access to the network. Human-Machine Interfaces (HMIs) for the RTUs and the Server have been provided. A repeater has also been designed to increase the number of RTUs that can be connected and the maximum allowed distance between units and the server. The wired network gives several times faster connectivity than the wireless network, in addition to the larger area covered, but at the cost of mobility. (author)

  13. DABASCO Experiment Data Acquisition and Control System; Sistema de Toma de Datos y Control del Experimento DABASCO

    Energy Technology Data Exchange (ETDEWEB)

    Alberdi, J.; Artigao, A.; Barcala, J. M.; Oller, J. C. [Ciemat, Madrid (Spain)

    2000-07-01

    The DABASCO experiment studies the thermohydraulic phenomena produced in the containment area during a severe accident at a nuclear power facility. This document describes the characteristics of the data acquisition and control system used in the experiment. The main elements of the system were a data acquisition board, the PCI-MIO-16E-4, and an application written in LabVIEW. (Author) 5 refs.

  14. Identification of Classified Information in Unclassified DoD Systems During the Audit of Internal Controls and Data Reliability in the Deployable Disbursing System

    Science.gov (United States)

    2009-02-17

    Identification of Classified Information in Unclassified DoD Systems During the Audit of Internal Controls and Data Reliability in the Deployable Disbursing System (Report No. D-2009-054).

  15. Reliability estimation system: its application to the nuclear geophysical sampling of ore deposits

    International Nuclear Information System (INIS)

    Khaykovich, I.M.; Savosin, S.I.

    1992-01-01

    The reliability estimation system accepted in the Soviet Union for sampling data in nuclear geophysics is based on unique requirements in metrology and methodology. It involves estimating characteristic errors in calibration, as well as errors in measurement and interpretation. This paper describes the methods of estimating the levels of systematic and random errors at each stage of the problem. The data of nuclear geophysics sampling are considered to be reliable if there are no statistically significant, systematic differences between ore intervals determined by this method and by geological control, or by other methods of sampling; the reliability of the latter having been verified. The difference between the random errors is statistically insignificant. The system allows one to obtain information on the parameters of ore intervals with a guaranteed random error and without systematic errors. (Author)
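    The acceptance criterion (no statistically significant systematic difference between nuclear geophysical sampling and control assays over the same ore intervals) can be illustrated with a paired t statistic computed from scratch. This is a generic sketch of such a significance test, not the system's actual procedure:

```python
# Paired t statistic for ore-interval values measured by two methods
# (e.g. nuclear geophysical sampling vs. geological control).
# The systematic difference is insignificant when |t| is below the
# critical value from t tables for n-1 degrees of freedom (assumption:
# two-sided test at the chosen significance level).
from math import sqrt

def paired_t(method_a, method_b):
    """t statistic for the mean paired difference between two methods."""
    d = [a - b for a, b in zip(method_a, method_b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / sqrt(var / n)
```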

  16. Quality-control design for surface-water sampling in the National Water-Quality Network

    Science.gov (United States)

    Riskin, Melissa L.; Reutter, David C.; Martin, Jeffrey D.; Mueller, David K.

    2018-04-10

    The data-quality objectives for samples collected at surface-water sites in the National Water-Quality Network include estimating the extent to which contamination, matrix effects, and measurement variability affect interpretation of environmental conditions. Quality-control samples provide insight into how well the samples collected at surface-water sites represent the true environmental conditions. Quality-control samples used in this program include field blanks, replicates, and field matrix spikes. This report describes the design for collection of these quality-control samples and the data management needed to properly identify these samples in the U.S. Geological Survey’s national database.
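    Typical ways such quality-control samples are evaluated can be sketched as two small functions: replicate agreement via relative percent difference, and matrix-spike recovery. The function names are generic conventions, not definitions from the report:

```python
# Generic QC metrics of the kind a surface-water QC design supports.
# Acceptance thresholds vary by analyte and are not shown here.

def rpd(a, b):
    """Relative percent difference between a replicate pair."""
    return abs(a - b) / ((a + b) / 2) * 100.0

def spike_recovery(spiked, unspiked, spike_added):
    """Percent recovery of a field matrix spike."""
    return (spiked - unspiked) / spike_added * 100.0
```

    Field blanks are simpler still: any detection above the reporting limit flags potential contamination of the sampling or analysis chain.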

  17. A dedicated database system for handling multi-level data in systems biology.

    Science.gov (United States)

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging. To overcome this, we designed and developed a dedicated database system that can serve and solve the vital issues in data management and thereby facilitate data integration, modeling and analysis in systems biology within a single database. In addition, a yeast data repository was implemented as an integrated database environment which is operated by the database system. Two applications were implemented to demonstrate the extensibility and utilization of the system. Both illustrate how the user can access the database via the web query function and implemented scripts. These scripts are specific for two sample cases: 1) Detecting the pheromone pathway in protein interaction networks; and 2) Finding metabolic reactions regulated by Snf1 kinase. In this study we present the design of a database system which offers an extensible environment to efficiently capture the majority of biological entities and relations encountered in systems biology. Critical functions and control processes were designed and implemented to ensure consistent, efficient, secure and reliable transactions. The two sample cases on the yeast integrated data clearly demonstrate the value of a single database environment for systems biology research.

  18. Oceanids command and control (C2) data system - Marine autonomous systems data for vehicle piloting, scientific data users, operational data assimilation, and big data

    Science.gov (United States)

    Buck, J. J. H.; Phillips, A.; Lorenzo, A.; Kokkinaki, A.; Hearn, M.; Gardner, T.; Thorne, K.

    2017-12-01

    The National Oceanography Centre (NOC) operates a fleet of approximately 36 autonomous marine platforms including submarine gliders, autonomous underwater vehicles, and autonomous surface vehicles. Each platform effectively has the capability to observe the ocean and collect data akin to a small research vessel. This is creating a growth in data volumes and complexity while the amount of resource available to manage data remains static. The Oceanids Command and Control (C2) project aims to solve these issues by fully automating data archival, processing and dissemination. The data architecture being implemented jointly by NOC and the Scottish Association for Marine Science (SAMS) includes a single Application Programming Interface (API) gateway to handle authentication, forwarding and delivery of both metadata and data. Technicians and principal investigators will enter expedition data prior to deployment of vehicles, enabling automated data processing when vehicles are deployed. The system will support automated metadata acquisition from platforms as this technology moves towards operational implementation. The metadata exposure to the web builds on a prototype developed by the European Commission supported SenseOCEAN project and is via open standards including World Wide Web Consortium (W3C) RDF/XML and the use of the Semantic Sensor Network ontology and Open Geospatial Consortium (OGC) SensorML standard. Data will be delivered in the marine domain Everyone's Glider Observatory (EGO) format and OGC Observations and Measurements. Additional formats will be served by implementation of endpoints such as the NOAA ERDDAP tool. This standardised data delivery via the API gateway enables timely near-real-time data to be served to Oceanids users, BODC users, operational users and big data systems. The use of open standards will also enable web interfaces to be rapidly built on the API gateway and delivery to European research infrastructures that include aligned

  19. An airborne meteorological data collection system using satellite relay /ASDAR/

    Science.gov (United States)

    Bagwell, J. W.; Lindow, B. G.

    1978-01-01

    The paper describes the aircraft to satellite data relay (ASDAR) project which processes information collected by the navigation and data systems of widebody jet aircraft which cross data-sparse areas of the tropics and southern hemisphere. The ASDAR system consists of a data acquisition and control unit to acquire, store, and format latitude, longitude, altitude, wind speed, wind direction, and outside air temperature data; a transmitter to relay the formatted data via satellite to the ground; and a clock to time the data sampling and transmission periods.
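    A fixed-width binary report of the kind the data acquisition and control unit formats before satellite relay could look like the following. This little-endian float layout is purely illustrative and is NOT the actual ASDAR message format:

```python
# Illustrative packing/unpacking of one ASDAR-style report: latitude,
# longitude, altitude (m), wind speed, wind direction, outside air
# temperature (deg C). Field order and encoding are assumptions.
import struct

def pack_report(lat, lon, alt_m, wind_speed, wind_dir, oat_c):
    """Encode one report as six little-endian 32-bit floats (24 bytes)."""
    return struct.pack("<6f", lat, lon, alt_m, wind_speed, wind_dir, oat_c)

def unpack_report(buf):
    """Decode a 24-byte report back into its six fields."""
    return struct.unpack("<6f", buf)
```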

  20. Two-Stage Variable Sample-Rate Conversion System

    Science.gov (United States)

    Tkacenko, Andre

    2009-01-01

    A two-stage variable sample-rate conversion (SRC) system has been proposed as part of a digital signal-processing system in a digital communication radio receiver that utilizes a variety of data rates. The proposed system would be used as an interface between (1) an analog-to-digital converter used in the front end of the receiver to sample an intermediate-frequency signal at a fixed input rate and (2) digitally implemented tracking loops in subsequent stages that operate at various sample rates that are generally lower than the input sample rate. This two-stage system would be capable of converting from an input sample rate to a desired lower output sample rate that could be variable and not necessarily a rational fraction of the input rate.
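    The two-stage structure can be illustrated with a toy resampler: a fixed integer decimation stage followed by an arbitrary-ratio interpolation stage. A real design would use polyphase anti-aliasing filters; this sketch only shows the staging, and all names are assumptions:

```python
# Toy two-stage sample-rate converter. Stage 1 performs a fixed integer
# decimation; stage 2 handles an arbitrary (possibly irrational) ratio
# by linear interpolation. No anti-alias filtering is included.

def decimate(x, m):
    """Stage 1: keep every m-th sample."""
    return x[::m]

def resample_linear(x, ratio):
    """Stage 2: fractional rate change, ratio = output_rate / input_rate."""
    n_out = int(len(x) * ratio)
    out = []
    for k in range(n_out):
        pos = k / ratio          # position of output sample k on input grid
        i = int(pos)
        frac = pos - i
        if i + 1 < len(x):
            out.append(x[i] * (1 - frac) + x[i + 1] * frac)
        else:
            out.append(x[i])     # hold the last sample at the edge
    return out
```

    Splitting the conversion this way lets the cheap integer stage do most of the rate reduction, so the interpolation stage runs at a much lower rate.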

  1. Evaluating Information Assurance Control Effectiveness on an Air Force Supervisory Control and Data Acquisition (SCADA) System

    Science.gov (United States)

    2011-03-01

    Thesis by Jason R. Nielsen, Major, USAF (AFIT/GCO/ENG/11-10), presented to the Faculty, Department of Electrical and Computer Engineering, Department of the Air Force, Air University. Cites: Byres, E. J., Lowe, J. (2004). The myths and facts behind cyber security risks for industrial control systems. Berlin, Germany: VDE 2004 Congress.

  2. Quality control of software in dissimilar systems using a common clinical data base

    International Nuclear Information System (INIS)

    Erickson, J.J.; Price, R.R.; Touya, J.J.; Kronenberg, M.W.; Pederson, R.; Rollo, F.D.

    1980-01-01

    There has long been widespread interest in the quality control of diagnostic instrumentation. The increasing dependence on computational systems for clinical results makes it imperative that methods for quality control of diagnostic software be developed. This paper proposes a method based on a collection of patient studies for which the results have been corroborated by independent methods. The data set will be distributed in a format requiring no special handling by the system being tested and will appear identical to studies actually collected by the host system. An example of the use of a preliminary version of the data set for comparison of two systems is shown. The comparison shows that analyses performed on the two systems agree very well and can be reliably compared for follow-up studies of a patient.

  3. Data derandomizer and method of operation for radiation imaging detection systems

    International Nuclear Information System (INIS)

    Hatch, K.F.

    1977-01-01

    A nuclear imaging system includes an analog signal processor which features analog data derandomization for minimizing data loss due to pulse pile-up. A scintillation detector provides a sequence of analog data pulses to the signal processor, the data pulses characterizing the energy level and situs of respective radiation events striking the detector. The signal processor includes sets of novel peak detectors and of sample and hold circuits which are serially connected and are operated to derandomize or space the sequence of analog data pulses so that the system can process pulses corresponding to photopeak events occurring only 1.5 microseconds apart. The analog data pulses are stored in analog pulse form in the peak detectors and are selectively transferred into the sample and hold circuitry from which they are transferred to the display mechanism. The signal processor is multiplexed with several data input channels for accommodating dual isotope operation. A control unit is provided which controls the data processing cycle according to a predetermined processing time, or according to signals from external system apparatus. The control unit provides automatic resetting for assurance that the signal processor does not become locked into an inoperative, nondata processing state. The novel peak detectors are controlled by the control unit and feature input biasing for increased detection sensitivity, proportional dumping for discharging the stored peak value at a rate proportional to the value of the stored peak, and selective input data gating so that only the peak containing portion of the input signal is input into the detector. 28 claims, 10 figures

  4. New software of the control and data acquisition system for the Nuclotron internal target station

    International Nuclear Information System (INIS)

    Isupov, A.Yu.

    2012-01-01

    The control and data acquisition system for the Internal Target Station (ITS) of the Nuclotron (LHEP, JINR) is implemented. The new software is based on the ngdp framework under the Unix-like operating system FreeBSD to allow easy network distribution of the on-line data collected from ITS, as well as the internal target remote control

  5. MAST data acquisition system

    International Nuclear Information System (INIS)

    Shibaev, S.; Counsell, G.; Cunningham, G.; Manhood, S.J.; Thomas-Davies, N.; Waterhouse, J.

    2006-01-01

    The data acquisition system of the Mega-Amp Spherical Tokamak (MAST) presently collects up to 400 MB of data in about 3000 data items per shot, and subsequent fast growth is expected. Since the start of MAST operations (in 1999) the system has changed dramatically. Though we continue to use legacy CAMAC hardware, newer VME-, PCI-, and PXI-based sub-systems now collect most of the data. All legacy software has been redesigned and new software has been developed. Last year a major system improvement was made: the replacement of the message distribution system. The new message system provides easy connection of any sub-system independently of its platform and serves as a framework for many new applications. A new data acquisition controller provides full control of common sub-systems, central error logging, and data acquisition alarms for the MAST plant. A number of new sub-systems using Linux and Windows OSs on VME, PCI, and PXI platforms have been developed. A new PXI unit has been designed as a base sub-system accommodating any type of data acquisition and control devices. Several web applications for real-time MAST monitoring and data presentation have been developed

  6. Data acquisition system for the Hamilton Standard W2 Electron Beam Welder

    International Nuclear Information System (INIS)

    Hopwood, J.

    1979-06-01

    A data acquisition system has been designed which will perform on-line weld-parameter sampling. It is a microprocessor-based program controller and calculator. The parameters sampled are beam current, accelerating voltage, focus-coil current, workpiece rpm, and filament voltage. Sampling in analog form occurs in pre-selected angular-rotation increments from 1 to 9 degrees. There are three data printout options: (A) all data displayed; (B) only out-of-tolerance values displayed; and (C) differences between nominal and sampled values in excess of preselected error bands displayed. A magnetic tape cartridge unit allows long-term data storage and easy retrieval. This report is a manual for system operation. It also describes the design--logic principles, circuitry, and programming--in detail
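    The three printout options can be sketched as one reporting function. Parameter names, nominal values and error bands below are illustrative, not the welder's actual configuration:

```python
# Sketch of the three printout options described above.
# Option "A": all sampled data; "B": only out-of-tolerance values;
# "C": deviations from nominal exceeding a preselected error band.

def report(samples, nominal, band, mode):
    """samples: list of (name, value); nominal/band: per-parameter dicts."""
    rows = []
    for name, value in samples:
        dev = value - nominal[name]
        out_of_tol = abs(dev) > band[name]
        if mode == "A":
            rows.append((name, value))
        elif mode == "B" and out_of_tol:
            rows.append((name, value))
        elif mode == "C" and out_of_tol:
            rows.append((name, dev))
    return rows
```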

  7. Data acquisition system for the Hamilton Standard W2 Electron Beam Welder

    Energy Technology Data Exchange (ETDEWEB)

    Hopwood, J.

    1979-06-01

    A data acquisition system has been designed which will perform on-line weld-parameter sampling. It is a microprocessor-based program controller and calculator. The parameters sampled are beam current, accelerating voltage, focus-coil current, workpiece rpm, and filament voltage. Sampling in analog form occurs in pre-selected angular-rotation increments from 1 to 9 degrees. There are three data printout options: (A) all data displayed; (B) only out-of-tolerance values displayed; and (C) differences between nominal and sampled values in excess of preselected error bands displayed. A magnetic tape cartridge unit allows long-term data storage and easy retrieval. This report is a manual for system operation. It also describes the design--logic principles, circuitry, and programming--in detail.

  8. Submicronic Particle Measurement Instrumentation Test Bench Data Acquisition and Control System

    International Nuclear Information System (INIS)

    Alberdi, J.; Barcala, J. M.; Sanz, D.; Gomez, F. J.; Molinero, A.; Navarrete, J. J.

    1999-01-01

    This document describes the characteristics of the SAD-100 system. The unit performs data acquisition and control for the instrumentation test bench. The SAD-100 was designed and developed by the Electronic and Automation Area (CIEMAT) and the Aerosol Technology in Energy Generation Project (CIEMAT). (Author) 2 refs

  9. Applying Service-Oriented Architecture to Archiving Data in Control and Monitoring Systems

    Energy Technology Data Exchange (ETDEWEB)

    Nogiec, J. M. [Fermilab; Trombly-Freytag, K. [Fermilab

    2017-01-01

    Current trends in the architecture of software systems focus our attention on building systems using a set of loosely coupled components, each providing a specific functionality known as service. It is not much different in control and monitoring systems, where a functionally distinct sub-system can be identified and independently designed, implemented, deployed and maintained. One functionality that renders itself perfectly to becoming a service is archiving the history of the system state. The design of such a service and our experience of using it are the topic of this article. The service is built with responsibility segregation in mind, therefore, it provides for reducing data processing on the data viewer side and separation of data access and modification operations. The service architecture and the details concerning its data store design are discussed. An implementation of a service client capable of archiving EPICS process variables (PV) and LabVIEW shared variables is presented. Data access tools, including a browser-based data viewer and a mobile viewer, are also presented.

  10. Computer system design description for the spare pump mini-dacs data acquisition and control system

    International Nuclear Information System (INIS)

    Vargo, G.F. Jr.

    1994-01-01

    The attached document outlines the computer software design for the mini data acquisition and control system (DACS), that supports the testing of the spare pump for Tank 241-SY-101, at the maintenance and storage facility (MASF)

  11. A flexible LabVIEWTM-based data acquisition and analysis system for scanning microscopy

    International Nuclear Information System (INIS)

    Morse, Daniel H.; Antolak, Arlyn J.; Bench, Graham S.; Roberts, Mark L.

    1999-01-01

    A new data analysis system has been developed with computer-controlled beam and sample positioning, video sample imaging, multiple large solid angle detectors for X-rays and gamma-rays, and surface barrier detectors for charged particles. The system uses the LabVIEW™ programming language, allowing it to be easily ported between different computer operating systems. In the present configuration, digital signal processors are directly interfaced to a SCSI CAMAC controller. However, the modular software design permits the substitution of other hardware with LabVIEW-supported drivers. On-line displays of histogram and two-dimensional elemental map images provide a user-friendly data acquisition interface. Subregions of the two-dimensional maps may be selected interactively for detailed analysis or for subsequent scanning. Off-line processing of archived data currently yields elemental maps, analyzed spectra and reconstructions of tomographic data

  12. Data declaration, control and record of an experiment in nuclear physics. Data acquisition system development under X Window with the system OS-9

    International Nuclear Information System (INIS)

    Michel, L.

    1990-09-01

    To cope with the increase in data produced by experiments in nuclear physics, the development of a data storage system much more compact than magnetic tape is essential. The first goal of this work was to install a data storage unit built on an 8 mm video cartridge (Exabyte) at a given experimental site, the 4pi gamma multidetector array Chateau de Cristal, set up at the CNRS unit in Strasbourg. We built, on a VME crate, a data acquisition system working with the real-time operating system OS-9 and integrating the Exabyte unit. The system is controlled through an original graphical interface developed under X Window. This interface allows the command and monitoring of data acquisition as well as the set-up of acquisition parameters. The system has been in operation since January 1990 [fr

  13. A distributed real-time system for event-driven control and dynamic data acquisition on a fusion plasma experiment

    International Nuclear Information System (INIS)

    Sousa, J.; Combo, A.; Batista, A.; Correia, M.; Trotman, D.; Waterhouse, J.; Varandas, C.A.F.

    2000-01-01

    A distributed real-time trigger and timing system, designed in a tree-type topology and implemented in VME and CAMAC versions, has been developed for a magnetic confinement fusion experiment. It provides sub-microsecond time latencies for the transport of small data objects allowing event-driven discharge control with failure counteraction, dynamic pre-trigger sampling and event recording as well as accurate simultaneous triggers and synchronism on all nodes with acceptable optimality and predictability of timeliness. This paper describes the technical characteristics of the hardware components (central unit composed by one or more reflector crates, event and synchronism reflector cards, event and pulse node module, fan-out and fan-in modules) as well as software for both tests and integration on a global data acquisition system. The results of laboratory operation for several configurations and the overall performance of the system are presented and analysed

  14. Development of a VME and CAMAC based data acquisition and transfer system for JT-60 control

    International Nuclear Information System (INIS)

    Totsuka, Toshiyuki

    1993-08-01

    The development of a VME- and CAMAC-based data acquisition and transfer system for JT-60 control is reported. The present data acquisition and transfer system in JT-60 control is basically composed of CAMAC devices. Since the system, equipped with 16-bit microcomputers, was manufactured more than ten years ago, its performance and program development environment are clearly inferior to those of modern 32-bit microcomputers. To overcome these disadvantages, a new data acquisition and transfer system using VME-based 32-bit microcomputers and CAMAC drivers is under design. As part of this design, a CAMAC handler, which runs on the microcomputer, was newly developed for the VME-based CAMAC driver. Moreover, the functions of the driver and the data transfer performance of the combined VME and CAMAC system were tested. The test results showed that the VME-based microcomputer and CAMAC serial driver can be applied to a fast and reliable acquisition and transfer system for JT-60 control. (author)

  15. Quality data systems

    International Nuclear Information System (INIS)

    Bergman, J.E.; Patterson, R.G.

    1976-01-01

    General Electric's Nuclear Fuel Department data system strategy of multifunctional system integration and specific applications of data systems for the Quality Assurance Programme is detailed. Descriptions of two manufacturing control systems and their function in satisfying quality data requirements are included. The timesharing quality data system developed for processing laboratory, traceability and material release data in the Fuel Manufacturing Operation is described. In addition, specific references are made to those areas where significant time reductions have been realized through the utilization of mechanized data-handling systems. (author)

  16. Event Recording Data Acquisition System and Experiment Data Management System for Neutron Experiments at MLF, J-PARC

    Science.gov (United States)

    Nakatani, T.; Inamura, Y.; Moriyama, K.; Ito, T.; Muto, S.; Otomo, T.

Neutron scattering can be a powerful probe in the investigation of many phenomena in the materials and life sciences. The Materials and Life Science Experimental Facility (MLF) at the Japan Proton Accelerator Research Complex (J-PARC) is a leading center of experimental neutron science and boasts one of the most intense pulsed neutron sources in the world. The MLF currently has 18 experimental instruments in operation that support a wide variety of users from across a range of research fields. The instruments include optical elements, sample environment apparatus and detector systems that are controlled and monitored electronically throughout an experiment. Signals from these components and those from the neutron source are converted into a digital format by the data acquisition (DAQ) electronics and recorded as time-tagged event data in the DAQ computers using "DAQ-Middleware". Operating in event mode, the DAQ system produces extremely large data files (on the order of GB) under various measurement conditions. Simultaneously, the measurement meta-data indicating each measurement condition is recorded in XML format by the MLF control software framework "IROHA". These measurement event data and meta-data are collected in the MLF common storage and cataloged by the MLF Experimental Database (MLF EXP-DB) based on a commercial XML database. The system provides a web interface for users to manage and remotely analyze experimental data.
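The appeal of event mode is that time-tagged events can be re-binned into histograms after the run, under any chosen conditions. A minimal sketch of that post-hoc binning (the function and bin edges are illustrative, not part of DAQ-Middleware):

```python
from bisect import bisect_right

def histogram(event_times, bin_edges):
    """Bin time-tagged events into a histogram after the run.

    bin_edges is an ascending sequence; events falling outside the edges
    are dropped. Because the raw event list is kept, the same data can be
    re-binned later with different edges or filtered by measurement
    condition, which is not possible once data are histogrammed online.
    """
    counts = [0] * (len(bin_edges) - 1)
    for t in event_times:
        i = bisect_right(bin_edges, t) - 1
        if 0 <= i < len(counts):
            counts[i] += 1
    return counts
```

In practice the event records also carry detector IDs and pulse tags, so the same idea extends to per-pixel or per-condition histograms from one raw file.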

  17. Analysis of Clinical Cohort Data Using Nested Case-control and Case-cohort Sampling Designs. A Powerful and Economical Tool.

    Science.gov (United States)

    Ohneberg, K; Wolkewitz, M; Beyersmann, J; Palomar-Martinez, M; Olaechea-Astigarraga, P; Alvarez-Lerma, F; Schumacher, M

    2015-01-01

Sampling from a large cohort in order to derive a subsample that would be sufficient for statistical analysis is a frequently used method for handling large data sets in epidemiological studies with limited resources for exposure measurement. For clinical studies however, when interest is in the influence of a potential risk factor, cohort studies are often the first choice with all individuals entering the analysis. Our aim is to close the gap between epidemiological and clinical studies with respect to design and power considerations. Schoenfeld's formula for the number of events required for a Cox proportional hazards model is fundamental. Our objective is to compare the power of analyzing the full cohort and the power of a nested case-control and a case-cohort design. We compare formulas for power for sampling designs and cohort studies. In our data example we simultaneously apply a nested case-control design with a varying number of controls matched to each case, a case-cohort design with varying subcohort size, a random subsample and a full cohort analysis. For each design we calculate the standard error for estimated regression coefficients and the mean number of distinct persons for whom covariate information is required. The formulas for the power of a nested case-control design and of a case-cohort design are directly connected to the power of a cohort study via the well-known Schoenfeld formula. The loss in precision of parameter estimates is relatively small compared to the saving in resources. Nested case-control and case-cohort studies, but not random subsamples, yield an attractive alternative for analyzing clinical studies in the situation of a low event rate. Power calculations can be conducted straightforwardly to quantify the loss of power compared to the savings in the number of patients using a sampling design instead of analyzing the full cohort.
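The power relationship the abstract describes can be sketched numerically. A minimal illustration, assuming a two-sided test of a binary covariate with exposure prevalence p (Schoenfeld's formula) and the standard (m+1)/m inflation factor for a nested case-control design with m matched controls per case; the hazard ratio and prevalence below are illustrative, not the study's:

```python
import math
from statistics import NormalDist

def schoenfeld_events(log_hr, p_exposed, alpha=0.05, power=0.80):
    """Events needed for a two-sided Cox test of a binary covariate
    (Schoenfeld's formula)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return (z_a + z_b) ** 2 / (p_exposed * (1 - p_exposed) * log_hr ** 2)

def ncc_events(log_hr, p_exposed, m_controls, alpha=0.05, power=0.80):
    """Events needed under a nested case-control design with m controls
    per case: the full-cohort figure inflated by (m + 1) / m."""
    full = schoenfeld_events(log_hr, p_exposed, alpha, power)
    return full * (m_controls + 1) / m_controls

# Detecting a hazard ratio of 2 with 50% exposure prevalence:
full_cohort = schoenfeld_events(math.log(2.0), 0.5)
with_4_controls = ncc_events(math.log(2.0), 0.5, 4)
```

With these inputs the 1:4 nested case-control design needs only 25% more events than the full-cohort analysis while requiring covariate information on far fewer subjects, which is the efficiency argument the abstract makes.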

  18. The structure of control and data transfer management system for the GAMMA-400 scientific complex

    International Nuclear Information System (INIS)

    Arkhangelskiy, A I; Bobkov, S G; Serdin, O V; Gorbunov, M S; Topchiev, N P

    2016-01-01

A description of the control and data transfer management system for the scientific instrumentation involved in the GAMMA-400 space project is given. The technical capabilities of all specialized equipment providing the functioning of the scientific instrumentation and satellite support systems are unified in a single structure. Control of the scientific instruments is maintained using one-time pulse radio commands, as well as program commands in the form of 16-bit code words, which are transmitted via the onboard control system and the scientific data acquisition system. Up to 100 GByte of data per day can be transferred to the ground segment of the project. The correctness of the proposed and implemented structure, the engineering solutions and the selection of electronic components has been verified by experimental testing of the prototype of the GAMMA-400 scientific complex under laboratory conditions. (paper)

  19. Phase II and III the next generation of CLS beamline control and data acquisition systems

    International Nuclear Information System (INIS)

    Matias, E.; Beauregard, D.; Berg, R.; Black, G.; Boots, M.J.; Dolton, W.; Hunter, D.; Igarashi, R.; Liu, D.; Maxwell, D.; Miller, C.D.; Wilson, T.; Wright, G.

    2012-01-01

The Canadian Light Source (CLS) is nearing the completion of its suite of Phase II beamlines and is in detailed design of its Phase III beamlines. The paper presents an overview of the overall approach adopted by the CLS in the development of beamline control and data acquisition systems. Building on the experience of our first phase of beamlines, the CLS has continued to make extensive use of EPICS with EDM and Qt based user interfaces. Increasingly, interpreted languages such as Python are finding a place in the beamline control systems. Web-based environments such as ScienceStudio have also found a prominent place in the control system architecture as we move to tighter integration between data acquisition, visualization and data analysis. (authors)

  20. Measuring the Balance Control System – Review

    Directory of Open Access Journals (Sweden)

    Jitka Jančová

    2008-01-01

Full Text Available Past studies of postural control during standing have employed a wide range of procedures, including the outcome measures used to quantify postural control, the duration of the sample collected, the sampling frequency and the methods for data processing. Due to these differences there remains little, if any, common ground for comparisons between studies to establish a concrete understanding of the features and bounds which characterize normal healthy postural control. This article deals with terms such as the reliability and repeatability of stabilometric measurements, and stabilometric data quantification and analysis; the author of this paper considers clarifying these terms very important. Stabilometric measurements nevertheless differ when dealing with aging adults. Although we note some alterations in the aging systems, this article is not entirely dedicated to the senior population. Measurements of COP and technical notes remain the main axis of the present paper.

  1. Conceptual Design, Implementation and Commissioning of Data Acquisition and Control System for Negative Ion Source at IPR

    International Nuclear Information System (INIS)

    Soni, Jignesh; Gahlaut, A.; Bansal, G.; Parmar, K. G.; Pandya, K.; Chakraborty, A.; Yadav, Ratnakar; Singh, M. J.; Bandyopadhyay, M.

    2011-01-01

A negative ion experimental facility has been set up at IPR. The facility consists of an RF-based negative ion source (ROBIN), procured under a license agreement with IPP Garching as a replica of BATMAN presently operating at IPP, 100 kW 1 MHz RF generators, a set of low- and high-voltage power supplies, a vacuum system and diagnostics. A 35 keV, 10 A H- beam is expected from this setup. Successful automated operation of the system requires an advanced, rugged, time-proven and flexible control system. Further, the data generated in the experimental phase need to be acquired, monitored and analyzed to verify and judge the system performance. In the present test bed, this is done using a combination of a PLC-based control system and a PXI-based data acquisition system. The control system consists of three different Siemens PLC systems, viz. (1) an S7-400 PLC as master control, (2) an S7-300 PLC for vacuum system control and (3) a C7 PLC for RF generator control. The master control PLC directly controls all the subsystems except the vacuum system and the RF generator, which have their own dedicated PLCs (S7-300 and C7, respectively); these two PLC systems work as slaves to the master control PLC. Communication between the S7-400 PLC, the S7-300 PLC and the central control room computer is done through Industrial Ethernet (IE). The control program and GUI are developed in the Siemens Step 7 PLC programming software and the WinCC SCADA software, respectively. Approximately 150 analog and 200 digital control and monitoring signals are required to perform complete closed-loop control of the system. Since the source floats at high potential (~35 kV), a combination of galvanic and fiber-optic isolation has been implemented. The PXI-based data acquisition system (DAS) is a combination of a PXI RT (real-time) system, front-end signal conditioning electronics, a host system and a DAQ program. All the acquisition signals coming from the various sub-systems are connected and

  2. Conceptual Design, Implementation and Commissioning of Data Acquisition and Control System for Negative Ion Source at IPR

    Science.gov (United States)

    Soni, Jignesh; Yadav, Ratnakar; Gahlaut, A.; Bansal, G.; Singh, M. J.; Bandyopadhyay, M.; Parmar, K. G.; Pandya, K.; Chakraborty, A.

    2011-09-01

A negative ion experimental facility has been set up at IPR. The facility consists of an RF-based negative ion source (ROBIN), procured under a license agreement with IPP Garching as a replica of BATMAN presently operating at IPP, 100 kW 1 MHz RF generators, a set of low- and high-voltage power supplies, a vacuum system and diagnostics. A 35 keV, 10 A H- beam is expected from this setup. Successful automated operation of the system requires an advanced, rugged, time-proven and flexible control system. Further, the data generated in the experimental phase need to be acquired, monitored and analyzed to verify and judge the system performance. In the present test bed, this is done using a combination of a PLC-based control system and a PXI-based data acquisition system. The control system consists of three different Siemens PLC systems, viz. (1) an S7-400 PLC as master control, (2) an S7-300 PLC for vacuum system control and (3) a C7 PLC for RF generator control. The master control PLC directly controls all the subsystems except the vacuum system and the RF generator, which have their own dedicated PLCs (S7-300 and C7, respectively); these two PLC systems work as slaves to the master control PLC. Communication between the S7-400 PLC, the S7-300 PLC and the central control room computer is done through Industrial Ethernet (IE). The control program and GUI are developed in the Siemens Step 7 PLC programming software and the WinCC SCADA software, respectively. Approximately 150 analog and 200 digital control and monitoring signals are required to perform complete closed-loop control of the system. Since the source floats at high potential (~35 kV), a combination of galvanic and fiber-optic isolation has been implemented. The PXI-based data acquisition system (DAS) is a combination of a PXI RT (real-time) system, front-end signal conditioning electronics, a host system and a DAQ program. All the acquisition signals coming from the various sub-systems are connected and

  3. Integrated control systems

    International Nuclear Information System (INIS)

    Smith, D.J.

    1991-01-01

This paper reports that instrument manufacturers must develop standard network interfaces to pull together interrelated systems such as automatic start-up, optimization programs, and online diagnostic systems. In the past, individual control system manufacturers have developed their own data highways with proprietary hardware and software designs. In the future, electric utilities will require that systems, irrespective of manufacturer, be able to communicate with each other. Until now, the manufacturers of control systems have not agreed on a standard high-speed data highway system. Currently, the Electric Power Research Institute (EPRI), in conjunction with several electric utilities and equipment manufacturers, is working on developing a standard protocol for communication between various manufacturers' control systems. According to N. Michael of Sargent and Lundy, future control room designs will require that more of the control and display functions be accessible from the control room through CRTs. There will be less emphasis on traditional hard-wired control panels

  4. Statistical transformation and the interpretation of inpatient glucose control data.

    Science.gov (United States)

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-03-01

    To introduce a statistical method of assessing hospital-based non-intensive care unit (non-ICU) inpatient glucose control. Point-of-care blood glucose (POC-BG) data from hospital non-ICUs were extracted for January 1 through December 31, 2011. Glucose data distribution was examined before and after Box-Cox transformations and compared to normality. Different subsets of data were used to establish upper and lower control limits, and exponentially weighted moving average (EWMA) control charts were constructed from June, July, and October data as examples to determine if out-of-control events were identified differently in nontransformed versus transformed data. A total of 36,381 POC-BG values were analyzed. In all 3 monthly test samples, glucose distributions in nontransformed data were skewed but approached a normal distribution once transformed. Interpretation of out-of-control events from EWMA control chart analyses also revealed differences. In the June test data, an out-of-control process was identified at sample 53 with nontransformed data, whereas the transformed data remained in control for the duration of the observed period. Analysis of July data demonstrated an out-of-control process sooner in the transformed (sample 55) than nontransformed (sample 111) data, whereas for October, transformed data remained in control longer than nontransformed data. Statistical transformations increase the normal behavior of inpatient non-ICU glycemic data sets. The decision to transform glucose data could influence the interpretation and conclusions about the status of inpatient glycemic control. Further study is required to determine whether transformed versus nontransformed data influence clinical decisions or evaluation of interventions.
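The two techniques named in the abstract, a Box-Cox transformation followed by an EWMA control chart with time-varying limits, can be sketched as follows. This is a minimal stdlib-only illustration; the fixed λ values and the shifted toy series are illustrative, not the study's POC-BG data:

```python
import math

def boxcox(xs, lam):
    """Box-Cox transform: (x**lam - 1)/lam, or log(x) when lam == 0."""
    return [math.log(x) if lam == 0 else (x ** lam - 1) / lam for x in xs]

def ewma_chart(data, mean, sd, lam=0.2, L=3.0):
    """EWMA control chart with time-varying control limits.

    Returns (ewma values, index of first out-of-control sample or None).
    The limit half-width at sample i (1-based) is
    L * sd * sqrt(lam/(2-lam) * (1 - (1-lam)**(2*i))).
    """
    z, out, zs = mean, None, []
    for i, x in enumerate(data, start=1):
        z = lam * x + (1 - lam) * z
        zs.append(z)
        half = L * sd * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        if out is None and abs(z - mean) > half:
            out = i - 1          # 0-based index of the flagging sample
    return zs, out
```

On a toy series with a sustained upward shift, the chart flags the process a few samples after the shift begins; in the study's workflow the transform would be applied to the glucose values before charting.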

  5. The Sample Handling System for the Mars Icebreaker Life Mission: from Dirt to Data

    Science.gov (United States)

    Dave, Arwen; Thompson, Sarah J.; McKay, Christopher P.; Stoker, Carol R.; Zacny, Kris; Paulsen, Gale; Mellerowicz, Bolek; Glass, Brian J.; Wilson, David; Bonaccorsi, Rosalba; hide

    2013-01-01

The Mars Icebreaker Life mission will search for subsurface life on Mars. It consists of three payload elements: a drill to retrieve soil samples from approx. 1 meter below the surface, a robotic sample handling system to deliver the sample from the drill to the instruments, and the instruments themselves. This paper will discuss the robotic sample handling system.

  6. Hybrid dynamical systems observation and control

    CERN Document Server

    Defoort, Michael

    2015-01-01

    This book is a collection of contributions defining the state of current knowledge and new trends in hybrid systemssystems involving both continuous dynamics and discrete events – as described by the work of several well-known groups of researchers. Hybrid Dynamical Systems presents theoretical advances in such areas as diagnosability, observability and stabilization for various classes of system. Continuous and discrete state estimation and self-triggering control of nonlinear systems are advanced. The text employs various methods, among them, high-order sliding modes, Takagi–Sugeno representation and sampled-data switching to achieve its ends. The many applications of hybrid systems from power converters to computer science are not forgotten; studies of flexible-joint robotic arms and – as representative biological systems – the behaviour of the human heart and vasculature, demonstrate the wide-ranging practical significance of control in hybrid systems. The cross-disciplinary origins of study ...

  7. Overview of data acquisition system for SST-1 diagnostics

    International Nuclear Information System (INIS)

    Sharma, Manika; Mansuri, Imran; Raval, Tushar; Sharma, A.L; Pradhan, S.

    2016-01-01

Highlights: • An account of the architecture and data acquisition activities of the SST-1 data acquisition system (DAS) for SST-1 diagnostics and subsystems. • PXI-based and CAMAC-based data acquisition systems for slow and fast plasma diagnostics. • The SST-1 DAS interface and its communication with the SST-1 central control system; integration of the SST-1 DAS with the timing system. • SST-1 DAS data archival and data analysis. - Abstract: The recent first-phase operations of SST-1 in short-pulse mode have provided an excellent opportunity for the essential initial tests and benchmarking of the SST-1 Data Acquisition System. This paper describes the SST-1 Data Acquisition System (DAS), which, with its heterogeneous composition and distributed architecture, aims to cover a wide range of slow to fast channels interfaced with a large set of diagnostics. The DAS also provides the essential user interface for data acquisition to cater to both on-line and off-line data usage. The central archiving and retrieval service is based on a dual-step architecture combining Network Attached Storage (NAS) and a Storage Area Network (SAN). The SST-1 data acquisition systems have been operated reliably in the SST-1 experimental campaigns. At present, the distributed DAS caters to around 130 channels from the various SST-1 diagnostics and subsystems. PXI-based and CAMAC-based DAS have been chosen to meet this need, with sampling rates varying from 10 ksamples/s to 1 Msamples/s. Acquisition from these large sets of channels across individual diagnostics and subsystems is handled by a combined setup, subjected to a gradual phase of optimization and tests resulting in a series of improvements over the recent operations. To facilitate reliable data acquisition, the model further integrates the objects of the systems with the Central Control System of SST-1 using TCP/IP communication. The associated DAS software essentially addresses the

  8. Overview of data acquisition system for SST-1 diagnostics

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Manika, E-mail: bithi@ipr.res.in; Mansuri, Imran; Raval, Tushar; Sharma, A.L; Pradhan, S.

    2016-11-15

Highlights: • An account of the architecture and data acquisition activities of the SST-1 data acquisition system (DAS) for SST-1 diagnostics and subsystems. • PXI-based and CAMAC-based data acquisition systems for slow and fast plasma diagnostics. • The SST-1 DAS interface and its communication with the SST-1 central control system; integration of the SST-1 DAS with the timing system. • SST-1 DAS data archival and data analysis. - Abstract: The recent first-phase operations of SST-1 in short-pulse mode have provided an excellent opportunity for the essential initial tests and benchmarking of the SST-1 Data Acquisition System. This paper describes the SST-1 Data Acquisition System (DAS), which, with its heterogeneous composition and distributed architecture, aims to cover a wide range of slow to fast channels interfaced with a large set of diagnostics. The DAS also provides the essential user interface for data acquisition to cater to both on-line and off-line data usage. The central archiving and retrieval service is based on a dual-step architecture combining Network Attached Storage (NAS) and a Storage Area Network (SAN). The SST-1 data acquisition systems have been operated reliably in the SST-1 experimental campaigns. At present, the distributed DAS caters to around 130 channels from the various SST-1 diagnostics and subsystems. PXI-based and CAMAC-based DAS have been chosen to meet this need, with sampling rates varying from 10 ksamples/s to 1 Msamples/s. Acquisition from these large sets of channels across individual diagnostics and subsystems is handled by a combined setup, subjected to a gradual phase of optimization and tests resulting in a series of improvements over the recent operations. To facilitate reliable data acquisition, the model further integrates the objects of the systems with the Central Control System of SST-1 using TCP/IP communication. The associated DAS software essentially addresses the

  9. State and data techniques for control of discontinuous systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1986-01-01

    This paper describes a technique for structured analysis and design of automated control systems. The technique integrates control of continuous and discontinuous nuclear power plant subsystems and components. A hierarchical control system with distributed intelligence follows from applying the technique. Further, it can be applied to all phases of control system design. For simplicity, the example used in the paper is limited to phase 1 design (basic automatic control action), in which no maintenance, testing, or contingency capability is attempted. 11 figs

  10. ATROPOS: a versatile data acquisition and analysis system

    International Nuclear Information System (INIS)

    Logg, C.A.; Cottrell, R.L.A.

    1978-10-01

Versatile, portable, rugged, and compact test and control modules for use in the development and testing of detection equipment for high-energy physics experiments are frequently needed at SLAC. The basic system developed is based on an LSI-11 microcomputer with 24K RAM, 4K ROM, 2 serial interfaces (one to the console terminal, the other to the large SLAC IBM computer complex (the TRIPLEX)), a programmable clock, and a CAMAC crate controller. Data logging support is provided for magnetic tape, floppy disk, and an interactive program (ACQUIRE) which runs on the TRIPLEX under the timesharing system ORVYL. Data are read from various CAMAC modules, collected, buffered, and optionally logged. At a lower priority, the data read are sampled and analyzed in real time on the LSI-11 to produce various histograms and tables. Concurrently, a more extensive analysis can be performed by the TRIPLEX program on the data which are logged to it. Interactive facilities provided by the microcomputer operating system enable the user to change CAMAC module addresses and the function codes used with them, specify various data cuts and transformations that are to be performed on the sample data, and specify new histogram limits and titles. Results of the real-time analysis, by both the microcomputer and the TRIPLEX program (if it is attached), may be displayed in graphical or tabular form on the console terminal. The basic system hardware cost (exclusive of the magnetic tape drive and floppy disk drive) is around $7000. The software is written in a modular fashion so that the user can supply his own data reading and analysis routines. This system has been in use for two years by various groups on several LSI-11s at SLAC. 3 figures

  11. The EnzymeTracker: an open-source laboratory information management system for sample tracking.

    Science.gov (United States)

    Triplet, Thomas; Butler, Gregory

    2012-01-26

In many laboratories, researchers store experimental data on their own workstations using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, for which laboratories need to pay expensive license fees. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists.
The EnzymeTracker is freely available online at http

  12. Laboratory technical services provides business opportunities for supervisory control and data acquisition systems

    International Nuclear Information System (INIS)

    Ballard, W.

    1994-01-01

    The author presents some additional information about what he considers are some really great opportunities for the business community to participate in developing the greatest scientific project in the history of mankind. Facility Engineering Services is part of Laboratory Technical Services. As part of this group, it has the responsibility to direct the construction of interim facilities, scientific labs, production process, cooling towers, cooling ponds and the operation and control of SSC Laboratory conventional support systems. These operations and controls will be accomplished through the employment of a Supervisory Control and Data Acquisition system (SCADA)

  13. Stochastic bounded consensus tracking of leader-follower multi-agent systems with measurement noises and sampled-data

    International Nuclear Information System (INIS)

    Wu Zhi-Hai; Peng Li; Xie Lin-Bo; Wen Ji-Wei

    2012-01-01

This paper is concerned with the stochastic bounded consensus tracking problems of leader-follower multi-agent systems, where the control input of an agent can only use the information measured at the sampling instants from its neighbours or the virtual leader with a time-varying reference state, and the measurements are corrupted by random noises. Probability limit theory and algebraic graph theory are employed to derive the necessary and sufficient conditions guaranteeing mean square bounded consensus tracking. It is shown that the maximum allowable upper bound of the sampling period depends simultaneously on the constant feedback gains and the network topology. Furthermore, the effects of the sampling period on the tracking performance are analysed. It turns out that, from the viewpoint of the sampling period, there is a trade-off between the tracking speed and the static tracking error. Simulations are provided to demonstrate the effectiveness of the theoretical results. (interdisciplinary physics and related areas of science and technology)
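A minimal simulation in the spirit of this setup (a sketch, not the paper's exact protocol): two single-integrator followers that sample noisy measurements of each other and of a ramping leader at every sampling instant. With a constant gain, the followers track the leader with a static offset of v/gain, illustrating the tracking-speed-versus-static-error trade-off the abstract describes:

```python
import random

def simulate(steps=200, h=0.05, gain=1.0, noise_sd=0.0, seed=1):
    """Two single-integrator followers tracking a constant-speed leader.

    Each follower updates only at sampling instants (period h), using
    measurements of its neighbour's and the leader's states corrupted
    by additive Gaussian noise of standard deviation noise_sd.
    """
    rng = random.Random(seed)
    leader, v = 0.0, 1.0            # leader ramps at constant speed v
    x = [5.0, -3.0]                 # follower initial states

    def meas(s):                    # sampled measurement with noise
        return s + rng.gauss(0.0, noise_sd)

    for _ in range(steps):
        u = [-gain * ((x[0] - meas(x[1])) + (x[0] - meas(leader))),
             -gain * ((x[1] - meas(x[0])) + (x[1] - meas(leader)))]
        x = [x[i] + h * u[i] for i in range(2)]
        leader += h * v
    return leader, x
```

With noise_sd = 0 both followers settle about v/gain behind the leader, the static tracking error of a constant-gain protocol; with noise_sd > 0 the error instead stays bounded in mean square, matching the "bounded consensus tracking" notion of the paper.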

  14. Software design for the EBT-P data acquisition and control system R and D

    International Nuclear Information System (INIS)

    Boyd, R.A.

    1983-01-01

The instrumentation and control system for the EBT-P device is composed of a hierarchy of programmable logic controllers, microprocessor-based data acquisition computers, and a large minicomputer-based facility computer system. The software being developed to support this data acquisition and control system is necessarily quite complex due to several requirements imposed upon the EBT-P overall design criteria. These requirements, which include such considerations as overall reliability, operator interface, real-time display, interprocessor communication, and minimum cost to build, operate, and maintain, dictate that the software be developed in a well structured and controlled manner. To this end, structured software engineering practices are being applied to the design and development of the EBT-P data acquisition and control software. The design process began with the production of a software Requirements Document which describes the hardware and software environment in which the software development takes place. It identifies the major deliverable software items to be produced and describes the practices to be used to design and develop the software. The software design is split into three components: the facility computer software, the microcomputer software, and the PLC software. Within these physical boundaries, the following five functions are defined: data acquisition, display, communication, storage, and control. The software design is further detailed in a Structured Specification Document for each of the three physical components. Each specification describes the software in detailed terms so that a programmer can directly write the required software. Each specification is composed of: data flow diagrams, a data dictionary, structure diagrams, and program design language mini-specifications. Examples of the design issues exposed and addressed during the structured decomposition of EBT-P software processes are discussed in detail

  15. MODIS information, data and control system (MIDACS) level 2 functional requirements

    Science.gov (United States)

    Han, D.; Salomonson, V.; Ormsby, J.; Sharts, B.; Folta, D.; Ardanuy, P.; Mckay, A.; Hoyt, D.; Jaffin, S.; Vallette, B.

    1988-01-01

    The MODIS Information, Data and Control System (MIDACS) Level 2 Functional Requirements Document establishes the functional requirements for MIDACS and provides a basis for the mutual understanding between the users and the designers of the EosDIS, including the requirements, operating environment, external interfaces, and development plan. In defining the requirements and scope of the system, this document describes how MIDACS will operate as an element of the EOS within the EosDIS environment. This version of the Level 2 Requirements Document follows an earlier release of a preliminary draft version. The sections on functional and performance requirements do not yet fully represent the requirements of the data system needed to achieve the scientific objectives of the MODIS instruments and science teams. Indeed, the team members have not yet been selected and the team has not yet been formed; however, it has been possible to identify many relevant requirements based on the present concept of EosDIS and through interviews and meetings with key members of the scientific community. These requirements have been grouped by functional component of the data system, and by function within each component. These requirements have been merged with the complete set of Level 1 and Level 2 context diagrams, data flow diagrams, and data dictionary.

  16. Data-based fault-tolerant control for affine nonlinear systems with actuator faults.

    Science.gov (United States)

    Xie, Chun-Hua; Yang, Guang-Hong

    2016-09-01

    This paper investigates the fault-tolerant control (FTC) problem for unknown nonlinear systems with actuator faults including stuck, outage, bias and loss-of-effectiveness faults. The upper bounds of stuck faults, bias faults and loss-of-effectiveness faults are unknown. A new data-based FTC scheme is proposed. It consists of online estimations of the bounds and a state-dependent function. The estimations are adjusted online to automatically compensate for the actuator faults. The state-dependent function, solved using real system data, helps stabilize the system. Furthermore, all signals in the resulting closed-loop system are uniformly bounded and the states converge asymptotically to zero. Compared with the existing results, the proposed approach is data-based. Finally, two simulation examples are provided to show the effectiveness of the proposed approach. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  17. Programmable automatic alpha--beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.

    1978-01-01

    A programmable automatic alpha-beta air sample counter was developed for routine sample counting by operational health physics personnel. The system is composed of an automatic sample changer utilizing a large silicon diode detector, an electronic counting system with energy analysis capability, an automatic data acquisition controller, an interface module, and a teletypewriter with paper tape punch and paper tape reader. The system is operated through the teletypewriter keyboard and the paper tape reader, which are used to instruct the automatic data acquisition controller. Paper tape programs are provided for background counting, the chi-squared (χ²) test, and sample counting. Output data are printed by the teletypewriter on standard continuous roll or multifold paper. Data are automatically corrected for background and counter efficiency.
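
The chi-squared test named above is the standard stability check for a counting system: repeated counts of a steady source should scatter like a Poisson distribution. A minimal sketch of the statistic (function names and interface are illustrative, not from the paper):

```python
import statistics

def chi_squared_statistic(counts):
    """Pearson chi-squared statistic for n repeated counts of a steady source.

    For Poisson counting statistics the variance equals the mean, so a
    well-behaved counter should give a value close to n - 1; values far
    outside that range indicate drift or instability.
    """
    mean = statistics.fmean(counts)
    return sum((x - mean) ** 2 for x in counts) / mean
```

For example, five identical counts give a statistic of 0 (suspiciously little scatter), while two counts of 90 and 110 give 2.0, close to the expected n - 1.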

  18. Data card system for filmless radiography

    International Nuclear Information System (INIS)

    Siedband, M.P.

    1987-01-01

    Data cards using the same principles as music compact discs can store 4 MB of digital data. This is sufficient for 4 uncompressed radiographic images or 16 images with 4:1 average compression. Radiographic memory screens (stimulable phosphors) can be scanned at 1023 lines to provide the input signals. A filmless radiographic x-ray system is described which uses digital data cards the size of common credit cards. These can be used in the same way as films are now used: placed in patient folders, copied, mailed, seen on view boxes, etc. The techniques of data acquisition, processing, compression, storage and display are described. The advantages of the system are explained in terms of economy, elimination of film (chemicals and processing), and compatibility with other data transmission methods. Suggestions are made for standardization of data storage and control so that this method may be used for other medical imaging applications, such as CT and ultrasound.
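
The capacity figures quoted (4 uncompressed images, or 16 at 4:1 compression, on a 4 MB card) are consistent with roughly 1 MB per uncompressed 1023-line image. A small sketch of that arithmetic; the pixel geometry below is an assumption chosen to reproduce the abstract's figures:

```python
def images_per_card(card_bytes, lines, pixels_per_line, bytes_per_pixel=1, compression=1.0):
    """Whole number of images that fit on a data card of the given capacity."""
    image_bytes = lines * pixels_per_line * bytes_per_pixel / compression
    return int(card_bytes // image_bytes)

# Assuming ~1024 pixels per line at 1 byte/pixel (about 1 MB per image):
uncompressed = images_per_card(4 * 2**20, 1023, 1024)                  # -> 4
compressed = images_per_card(4 * 2**20, 1023, 1024, compression=4.0)   # -> 16
```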

  19. Joint Design of Control and Power Efficiency in Wireless Networked Control System

    Directory of Open Access Journals (Sweden)

    Yan Wang

    2014-01-01

    This paper presents a joint design method for wireless networked control systems (WNCS) that balances the demands of network service against control performance. Since the problems of power consumption, communication reliability, and system stability exist simultaneously and interdependently in a WNCS, most results achieved for wireless networks and for wired networked control systems cannot be used directly. To coordinate the three problems, the sampling period is used as the linking bridge. An adaptive sampling power efficiency algorithm is proposed to manage power consumption so that the demands on network life span can be met. The sampling period is updated periodically under the constraints of network schedulability and system stability, and the convergence of the power efficiency algorithm is proved. Because the sampling period is no longer a fixed value, however, modeling and controlling such a complicated time-varying system becomes more difficult. In this work, a switched control system scheme is applied to model such a WNCS, and the effect of network-induced delay is considered. Switched feedback controllers are introduced to stabilize the WNCS, and some considerations on the stability condition and the bounds of the update cycle for renewing the sampling period are discussed. A numerical example shows the effectiveness of the proposed method.
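
The adaptive sampling idea can be caricatured in a few lines: lengthen the period when the measured energy budget is running below plan (to save radio power), shorten it otherwise (to recover control performance), and clamp the result between a schedulability lower bound and a stability upper bound. The names, update law and bounds here are illustrative assumptions, not the algorithm from the paper:

```python
def update_sampling_period(h, battery, target, h_min, h_max, gain=0.5):
    """One periodic update of the sampling period h (seconds).

    battery/target are the measured and planned remaining-energy fractions;
    h_min comes from network schedulability, h_max from the stability bound
    of the switched closed-loop system.
    """
    h_new = h * (1.0 + gain * (target - battery) / max(target, 1e-9))
    return min(max(h_new, h_min), h_max)
```

E.g. with h = 0.1 s and the battery ten points behind plan, the period stretches to about 0.11 s; the clamp guarantees the switched controller always sees a period inside its stability range.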

  20. Control, data acquisition, data analysis and remote participation in LHD

    International Nuclear Information System (INIS)

    Nagayama, Y.; Emoto, M.; Nakanishi, H.; Sudo, S.; Imazu, S.; Inagaki, S.; Iwata, C.; Kojima, M.; Nonomura, M.; Ohsuna, M.; Tsuda, K.; Yoshida, M.; Chikaraishi, H.; Funaba, H.; Horiuchi, R.; Ishiguro, S.; Ito, Y.; Kubo, S.; Mase, A.; Mito, T.

    2008-01-01

    This paper presents the control, data acquisition, data analysis and remote participation facilities of the Large Helical Device (LHD), which is designed to confine the plasma in steady state. In LHD the plasma duration exceeds 3000 s by controlling the plasma position, the density and the ICRF heating. The 'LABCOM' data acquisition system takes both the short-pulse and the steady-state data. A two-layer Mass Storage System with RAIDs and Blu-ray Disk jukeboxes in a storage area network has been developed to increase capacity of storage. The steady-state data can be monitored with a Web browser in real time. A high-level data analysis system with Web interfaces is being developed in order to provide easier usage of LHD data and large FORTRAN codes in a supercomputer. A virtual laboratory system for the Japanese fusion community has been developed with Multi-protocol Label Switching Virtual Private Network Technology. Collaborators at remote sites can join the LHD experiment or use the NIFS supercomputer system as if they were working in the LHD control room

  1. Turbidity-controlled sampling for suspended sediment load estimation

    Science.gov (United States)

    Jack Lewis

    2003-01-01

    Abstract - Automated data collection is essential to effectively measure suspended sediment loads in storm events, particularly in small basins. Continuous turbidity measurements can be used, along with discharge, in an automated system that makes real-time sampling decisions to facilitate sediment load estimation. The Turbidity Threshold Sampling method distributes...
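
The core of threshold-based sampling is a simple real-time decision: trigger a pumped sample each time the turbidity trace crosses one of a fixed ladder of thresholds, on either the rising or the falling limb of a storm. A hedged sketch only; the actual Turbidity Threshold Sampling rules are more elaborate (e.g. separate rising/falling ladders and lockout logic):

```python
def threshold_crossings(turbidity, thresholds):
    """Return the indices in a turbidity time series at which a physical
    sample would be triggered: one trigger per upward or downward crossing
    of any threshold."""
    fired = []
    prev = turbidity[0]
    for i, t in enumerate(turbidity[1:], start=1):
        if any(prev < th <= t or t <= th < prev for th in thresholds):
            fired.append(i)
        prev = t
    return fired
```

For a storm trace [0, 5, 12, 30, 12, 5, 0] with thresholds [10, 20], samples fire at indices 2, 3, 4 and 5: twice on the rising limb and twice on the falling limb.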

  2. Characterization of rock samples and mineralogical controls on leachates

    Science.gov (United States)

    Hammarstrom, Jane M.; Cravotta, Charles A.; Galeone, Daniel G.; Jackson, John C.; Dulong, Frank T.; Hornberger, Roger J.; Brady, Keith B.C.

    2009-01-01

    Rocks associated with coal beds typically include shale, sandstone, and (or) limestone. In addition to common rock-forming minerals, all of these rock types may contain sulfide and sulfate minerals, various carbonate minerals, and organic material. These different minerals have inherently different solubility characteristics, as well as different acid-generating or acid-neutralizing potentials. The abundance and composition of sulfur- and carbonate-bearing minerals are of particular interest in interpreting the leaching column data because (1) pyrite and carbonate minerals are the primary controls on the acid-base account of a sample, (2) these minerals incorporate trace metals that can be released during weathering, and (3) these minerals readily react during weathering due to mineral dissolution and oxidation of iron. Rock samples were collected by the Pennsylvania Department of Environmental Protection (PaDEP) from five different sites to assess the draft standardized leaching column method (ADTI-WP2) for the prediction of weathering rates and water quality at coal mines. Samples were sent to USGS laboratories for mineralogical characterization and to ActLabs for chemical analysis. The samples represent a variety of rock types (shales, sandstones, and coal refuse) that are typical of coal overburden in the eastern United States. These particular samples were chosen for testing the weathering protocols because they represent a range of geochemical and lithologic characteristics, sulfur contents, and acid-base accounting characteristics (Hornberger et al., 2003). The rocks contain variable amounts of pyrite and carbonate minerals and vary in texture. This chapter includes bulk rock chemical data and detailed mineralogical and textural data for unweathered starting materials used in the interlaboratory validation study, and for two samples used in the early phases of leaching column tests (Wadesville Sandstone, Leechburg Coal Refuse). We also characterize some of the...

  3. Developments in data acquisition systems with LabView datalogging and supervisory control module for tritium removal plant, with data base and process analysis

    International Nuclear Information System (INIS)

    Moraru, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu; Bucur, Ciprian; Hartescu, Florin

    2006-01-01

    The implementation of new trends in tritium processing nuclear plants, especially those of an experimental character or devoted to new technology development, is highly complex due to the integration of a wide diversity of instrumentation and equipment into a unitary control system for the technological process. Keeping the system flexible is a requirement of experimental plants, for which changes of configuration, process and parameters are routine. The large amount of data that must be monitored, stored and accessed for later analysis demands an information network in which the data acquisition, control and analysis systems of the technological process are integrated with a database system. Thus, the integrated computing and control systems needed to control the technological process will be implemented, followed by a failure protection system, by choosing methods appropriate to the technological processes within tritium processing nuclear plants. (authors)

  4. Computer-based data acquisition system in the Large Coil Test Facility

    International Nuclear Information System (INIS)

    Gould, S.S.; Layman, L.R.; Million, D.L.

    1983-01-01

    The utilization of computers for data acquisition and control is of paramount importance on large-scale fusion experiments because they feature the ability to acquire data from a large number of sensors at various sample rates and provide for flexible data interpretation, presentation, reduction, and analysis. In the Large Coil Test Facility (LCTF) a Digital Equipment Corporation (DEC) PDP-11/60 host computer with the DEC RSX-11M operating system coordinates the activities of five DEC LSI-11/23 front-end processors (FEPs) via direct memory access (DMA) communication links. This provides host control of scheduled data acquisition and FEP event-triggered data collection tasks. Four of the five FEPs have no operating system

  5. A large-scale cryoelectronic system for biological sample banking

    Science.gov (United States)

    Shirley, Stephen G.; Durst, Christopher H. P.; Fuchs, Christian C.; Zimmermann, Heiko; Ihmig, Frank R.

    2009-11-01

    We describe a polymorphic electronic infrastructure for managing biological samples stored over liquid nitrogen. As part of this system we have developed new cryocontainers and carrier plates with attached Flash memory chips, so that a redundant and portable set of data accompanies each sample. Our experimental investigations show that basic Flash operation and endurance are adequate for the application down to liquid nitrogen temperatures. This identification technology can provide the best sample identification, documentation and tracking, bringing added value to each sample. The first application of the system is in worldwide collaborative research towards the production of an AIDS vaccine. The functionality and versatility of the system can lead to an essential optimization of sample and data exchange for global clinical studies.

  6. Risk assessment of safety data link and network communication in digital safety feature control system of nuclear power plant

    International Nuclear Information System (INIS)

    Lee, Sang Hun; Son, Kwang Seop; Jung, Wondea; Kang, Hyun Gook

    2017-01-01

    Highlights:
    • Safety data communication risk assessment framework and quantitative scheme were proposed.
    • Fault-tree model of ESFAS unavailability due to safety data communication failure was developed.
    • Safety data link and network risk were assessed based on various ESF-CCS design specifications.
    • The effect of fault-tolerant algorithm reliability of the safety data network on ESFAS unavailability was assessed.

    Abstract: As one of the safety-critical systems in nuclear power plants (NPPs), the Engineered Safety Feature-Component Control System (ESF-CCS) employs safety data link and network communication for the transmission of safety component actuation signals from the group controllers to the loop controllers to effectively accommodate various safety-critical field controllers. Since the data communication failure risk in the ESF-CCS has yet to be fully quantified, ESF-CCS designs employing data communication systems have not been applied in NPPs. This study therefore developed a fault tree model to assess the data link and data network failure-induced unavailability of a system function used to generate an automated control signal for accident mitigation equipment. The current aim is to provide risk information regarding data communication failure in a digital safety feature control system in consideration of the interconnection between controllers and the fault-tolerant algorithm implemented in the target system. Based on the developed fault tree model, case studies were performed to quantitatively assess the unavailability of ESF-CCS signal generation due to data link and network failure and its risk effect on safety signal generation failure. This study is expected to provide insight into the risk assessment of safety-critical data communication in a digitalized NPP instrumentation and control system.
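
The quantitative structure of such a fault-tree assessment reduces to combining basic-event probabilities through AND gates (all redundant inputs must fail) and OR gates (any element of a series path failing defeats the signal). A generic sketch assuming independent basic events; the gate layout and numbers below are placeholders, not values from the study:

```python
def q_and(*qs):
    """AND gate: unavailability when every redundant input must fail."""
    p = 1.0
    for q in qs:
        p *= q
    return p

def q_or(*qs):
    """OR gate: unavailability when any single input failure is fatal."""
    p = 1.0
    for q in qs:
        p *= 1.0 - q
    return 1.0 - p

# Example: redundant data links (AND) in series with a controller failure
# and a fault-tolerant algorithm coverage failure (OR):
top = q_or(q_and(1e-3, 1e-3), 1e-4, 1e-5)
```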

  7. Integration of autonomous systems for remote control of data acquisition and diagnostics in the TJ-II device

    International Nuclear Information System (INIS)

    Vega, J.; Mollinedo, A.; Lopez, A.; Pacios, L.; Dormido, S.

    1997-01-01

    The data acquisition system for TJ-II will consist of a central computer, containing the database of the device, and a set of independent systems (personal computers, embedded systems, workstations, minicomputers, PLCs, and microprocessor systems, among others) controlling data collection and automated diagnostics. Each autonomous system can be used to isolate and manage specific problems in the most efficient manner. These problems are related to data acquisition, hard (μs–ms) real-time requirements, soft (ms–s) real-time requirements, remote control of diagnostics, etc. In the operation of TJ-II, the programming of systems will be carried out from the central computer. Coordination and synchronization will be performed by linking systems to local area networks. Several Ethernet segments and FDDI rings will be used for these purposes. Programmable logic controller (PLC) devices used for diagnostic low-level control will be linked among themselves through a fast serial link, the RS485 Profibus standard. One VME crate, running the OS-9 real-time operating system, will be assigned as a gateway to connect the PLC-based systems with an Ethernet segment. copyright 1997 American Institute of Physics

  8. Adaptive Kalman Filter Based on Adjustable Sampling Interval in Burst Detection for Water Distribution System

    Directory of Open Access Journals (Sweden)

    Doo Yong Choi

    2016-04-01

    Rapid detection of bursts and leaks in water distribution systems (WDSs) can reduce the social and economic costs incurred through direct loss of water into the ground, additional energy demand for water supply, and service interruptions. Many real-time burst detection models have been developed in accordance with the use of supervisory control and data acquisition (SCADA) systems and the establishment of district meter areas (DMAs). Nonetheless, no consideration has been given to how frequently a flow meter measures and transmits data for predicting breaks and leaks in pipes. This paper analyzes the effect of sampling interval when an adaptive Kalman filter is used for detecting bursts in a WDS. A new sampling algorithm is presented that adjusts the sampling interval depending on the normalized residuals of flow after filtering. The proposed algorithm is applied to a virtual sinusoidal flow curve and real DMA flow data obtained from Jeongeup city in South Korea. The simulation results prove that the self-adjusting algorithm for determining the sampling interval is efficient and maintains reasonable accuracy in burst detection. The proposed sampling method has a significant potential for water utilities to build and operate real-time DMA monitoring systems combined with smart customer metering systems.
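
The burst-detection core described above can be sketched as a scalar Kalman filter on DMA flow, with the normalized innovation both flagging bursts and driving the sampling-interval adjustment. This is a minimal illustration under a random-walk flow model; the names, bounds and doubling/halving rule are assumptions, not the paper's algorithm:

```python
def kalman_step(x, P, z, Q, R):
    """One predict/update step of a scalar Kalman filter (random-walk model).

    Returns the updated state estimate, its variance, and the normalized
    innovation; a large |innovation| suggests a burst or leak.
    """
    x_pred, P_pred = x, P + Q                    # predict
    innov = (z - x_pred) / (P_pred + R) ** 0.5   # normalized residual
    K = P_pred / (P_pred + R)                    # Kalman gain
    return x_pred + K * (z - x_pred), (1.0 - K) * P_pred, innov

def next_interval(dt, innov, lo=60, hi=900, threshold=2.0):
    """Halve the sampling interval when the residual looks suspicious,
    double it when the flow is quiet (bounds in seconds, assumed)."""
    return max(lo, dt // 2) if abs(innov) > threshold else min(hi, dt * 2)
```

A quiet residual lets the meter report less often (saving battery and bandwidth), while a suspicious one tightens the sampling until the event is confirmed or cleared.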

  9. ASN.1 notation for exchange of data in computer-based railway control systems

    Directory of Open Access Journals (Sweden)

    Zbigniew ŁUKASIK

    2009-01-01

    The development of railway control systems is moving towards computerization. In most cases these systems are distributed real-time systems. A major obstacle to putting them into practice, however, is the lack of interface standardization for data structures and information exchange methods. This results in a variety of solutions, and thus in problems concerning the cooperation of systems that come from different software vendors. Specifications of protocols for data-exchanging applications should therefore be created using generally accepted standards. One of them is the ASN.1 (Abstract Syntax Notation One) language, which is presented in this article.

  10. CONTROL: the run control program used on the WA62 VAX data acquisition system at CERN

    International Nuclear Information System (INIS)

    Hand, R.P.

    1980-11-01

    The criteria are described which have been used in the design of CONTROL, a program in the WA62 VAX Native mode data acquisition system which serves the triple purpose of driving a status display, an operator's console and a message reporter. (U.K.)

  11. A sample design for globally consistent biomass estimation using lidar data from the Geoscience Laser Altimeter System (GLAS)

    Science.gov (United States)

    Sean P. Healey; Paul L. Patterson; Sassan S. Saatchi; Michael A. Lefsky; Andrew J. Lister; Elizabeth A. Freeman

    2012-01-01

    Lidar height data collected by the Geosciences Laser Altimeter System (GLAS) from 2002 to 2008 has the potential to form the basis of a globally consistent sample-based inventory of forest biomass. GLAS lidar return data were collected globally in spatially discrete full waveform "shots," which have been shown to be strongly correlated with aboveground forest...

  12. [Project summarize of "reestablishing disease prevention and control system of China"].

    Science.gov (United States)

    Hao, Mo; Yu, Jingjin; Yu, Mingzhu; Duan, Yong

    2005-01-01

    This paper briefly introduces the project "Reestablishing the disease control and prevention system of China", including its background, objectives, funding sources, research objects and sampling methods. The project, funded by the National Outstanding Younger Fund and the research fund of the MOH, aimed at pinning down the key problems in the disease control and prevention system of China, demonstrating their causes and mechanisms, and developing feasible policy ideas and strategies. The paper also discusses some issues concerning the reestablishment of the disease control and prevention system of China: the definition of public functions, the standard of human resource allocation and the standard of financing. The Centers for Disease Control and Prevention in 8 provinces, 80 cities and 80 counties were sampled to provide the information the project needed. In addition, the project also cites some data from an earlier study, in which 3 provinces, 12 counties, 49 towns, 179 villages and 9781 rural families were sampled and investigated.

  13. Contribution of expert systems to data processing in non-destructive control

    International Nuclear Information System (INIS)

    Augendre, H.; Perron, M.C.

    1990-01-01

    The increasing use of non-destructive control in industrial applications requires the development of new data processing methods. The expert system approach can provide signal modelling means that are closer to human reasoning. Combined with more traditional programs, such methods lead to substantial improvements. These investigations are part of our plan to apply sophisticated methods to industrial non-destructive control. For defect characterization in ultrasonic control, various supervised learning methods have been investigated in an experimental study. The traditional approach is concerned with statistics-based methods, whereas the second one lies in learning logical decision rules valid within a numerical description space.

  14. Computer-controlled system for rapid soil analysis of 226Ra

    International Nuclear Information System (INIS)

    Doane, R.W.; Berven, B.A.; Blair, M.S.

    1984-01-01

    A computer-controlled multichannel analysis system has been developed by the Radiological Survey Activities Group at Oak Ridge National Laboratory (ORNL) for the Department of Energy (DOE) in support of the DOE's remedial action programs. The purpose of this system is to provide a rapid estimate of the 226Ra concentration in soil samples using a 6 x 9-in. NaI(Tl) crystal containing a 3.25-in. deep by 3.5-in. diameter well. This gamma detection system is controlled by a minicomputer with a dual floppy disk storage medium. A two-chip interface was also designed at ORNL which handles all control signals generated from the computer keyboard. These computer-generated control signals are processed in machine language for rapid data transfer, and BASIC is used for data processing.
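
The background and efficiency corrections such a counting system applies reduce to a few lines of arithmetic. A sketch with illustrative parameter names and units (not from the paper):

```python
def ra226_concentration(gross_counts, count_time_s, bkg_rate_cps, efficiency, sample_mass_g):
    """Estimate 226Ra activity per gram of soil from a well-counter run.

    gross_counts over count_time_s gives the gross count rate; subtracting
    the background rate and dividing by the detector efficiency (counts per
    decay) yields activity in decays/s, then normalized by sample mass.
    """
    net_rate = gross_counts / count_time_s - bkg_rate_cps   # counts/s above background
    return net_rate / efficiency / sample_mass_g
```

For example, 1200 counts in 60 s against a 5 counts/s background, at 30% efficiency on a 100 g sample, gives 0.5 decays/s per gram.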

  15. Sampling system for in vivo ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jorgen Arendt; Mathorne, Jan

    1991-01-01

    Newly developed algorithms for processing medical ultrasound images use the high-frequency sampled transducer signal. This paper describes demands imposed on a sampling system suitable for acquiring such data and gives details about a prototype constructed. It acquires full clinical images at a sampling frequency of 20 MHz with a resolution of 12 bits. The prototype can be used for real-time image processing. An example of a clinical in vivo image is shown and various aspects of the data acquisition process are discussed.

  16. Summary of control and data acquisition systems for NOVA experiments (invited)

    International Nuclear Information System (INIS)

    McCauley, E.W.; Campbell, E.M.; Auerbach, J.M.; Montgomery, D.S.; Martin, V.A.; Randolph, J.E.; Shaw, J.G.; Stewart, B.L.; Stone, G.F.

    1986-01-01

    The NOVA laser has completed its first year of operation. During this period, emphasis has been placed on activation of the facility and of the numerous target and beam diagnostics. Two separate target chambers are in use. NOVA operation is separated into two broad functions: laser operations and experiments. The operations group provides the laser system control, operation, and data acquisition and the experiments group provides experiment definition, diagnostic instrumentation, and overall data processing. On the operations side, VAX 11/780 computers are used to set up diagnostic operating parameters and collect data recorded by the CAMAC and CCD modules. These data are delivered in files by electronic link to the Laser Experiments and Analysis Facility (LEAF) VAX 11/785 of the experiments group for processing. Film data are digitized at LEAF and the film data files are also processed on the LEAF VAX. The LEAF provides collection, processing, analysis, and archiving of the experimental data. The many applications software packages written for LEAF provide the experimental physicists and NOVA operations staff with programs and data bases for interpretation of experimental results. This software makes fundamental use of the ORACLE relational data base management system to both access the required data and archive the obtained results. Post-shot data processing produces sets of scalar values, x, y profiles and x, y, z contour data. The scalar data are stored in the ORACLE DB; the more extensive results are stored in binary files on disk. All data forms are accessed by a comprehensive software system, the electronic SHOTBOOK, developed around the ORACLE DBMS

  17. A modular and extensible data acquisition and control system for testing superconducting magnets

    International Nuclear Information System (INIS)

    Darryl F. Orris and Ruben H. Carcagno

    2001-01-01

    The Magnet Test Facility at Fermilab tests a variety of full-scale and model superconducting magnets for both R and D and production. As the design characteristics and test requirements of these magnets vary widely, the magnet test stand must accommodate a wide range of data acquisition (DAQ) and control requirements. Such a system must provide several functions, including quench detection, quench protection, power supply control, quench characterization, and slow DAQ of temperature, mechanical strain gauges, liquid helium level, etc. The system must also provide cryogenic valve control, process instrumentation monitoring, and the process interlock logic associated with the test stand. A DAQ and control system architecture that provides the functionality described above has been designed, fabricated, and put into operation. This system utilizes a modular approach that provides both extensibility and flexibility. As a result, the complexity of the hardware is minimized while remaining optimized for future expansion. The architecture of this new system is presented along with a description of the different technologies applied to each module. Commissioning and operating experience as well as plans for future expansion are discussed.

  18. The CEBAF control system

    International Nuclear Information System (INIS)

    Watson, W.A. III.

    1995-01-01

    CEBAF has recently upgraded its accelerator control system to use EPICS, a control system toolkit being developed by a collaboration among laboratories in the US and Europe. The migration to EPICS has taken place during a year of intense commissioning activity, with new and old control systems operating concurrently. Existing CAMAC hardware was preserved by adding a CAMAC serial highway link to VME; newer hardware developments are now primarily in VME. Software is distributed among three tiers of computers: first, workstations and X terminals for operator interfaces and high level applications; second, VME single board computers for distributed access to hardware and for local control processing; third, embedded processors where needed for faster closed loop operation. This system has demonstrated the ability to scale EPICS to controlling thousands of devices, including hundreds of embedded processors, with control distributed among dozens of VME processors executing more than 125,000 EPICS database records. To deal with the large size of the control system, CEBAF has integrated an object oriented database, providing data management capabilities for both low level I/O and high level machine modeling. A new callable interface which is control system independent permits access to live EPICS data, data in other Unix processes, and data contained in the object oriented database

  19. Data acquisition and experiment control system for a large area neutron detector

    International Nuclear Information System (INIS)

    Alberi, J.L.

    1978-10-01

    The system consists of a data input subsystem, a display subsystem and a spectrometer control subsystem. The data input subsystem consists of a two-dimensional analog-to-digital converter with the analog section remote from the rest of the system. The analog-to-digital converter clock runs at 100 MHz. There can be up to 1024 channels in each dimension for a maximum array size of approximately 1 million words. Arrays of this size are easily handled by a multiport memory with 6.71 x 10^7 words of address space. The read/increment/write time for data in this array is 2.5 μs per event. The display and neutron spectrometer control subsystems are also briefly described.

  20. Computerized Analytical Data Management System and Automated Analytical Sample Transfer System at the COGEMA Reprocessing Plants in La Hague

    International Nuclear Information System (INIS)

    Flament, T.; Goasmat, F.; Poilane, F.

    2002-01-01

    Managing the operation of large commercial spent nuclear fuel reprocessing plants, such as UP3 and UP2-800 in La Hague, France, requires an extensive analytical program and the shortest possible analysis response times. COGEMA, together with its engineering subsidiary SGN, decided to build high-performance laboratories to support operations in its plants. These laboratories feature automated equipment, safe environments for operators, and short response times, all in centralized installations. Implementation of a computerized analytical data management system and a fully automated pneumatic system for the transfer of radioactive samples was a key factor contributing to the successful operation of the laboratories and plants

  1. Ground-Based Global Navigation Satellite System (GNSS) Compact Observation Data (1-second sampling, sub-hourly files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) Observation Data (1-second sampling, sub-hourly files) from the NASA Crustal Dynamics...

  2. Data processing system for NBT experiments

    International Nuclear Information System (INIS)

    Takahashi, C.; Hosokawa, M.; Shoji, T.; Fujiwara, M.

    1981-07-01

    A data processing system for the Nagoya Bumpy Torus (NBT) has been developed. Since plasmas are produced and heated in steady state by high power microwaves, data sampling and processing take place on a long time scale, on the order of one minute. The system, which consists of a NOVA 3/12 minicomputer and many data acquisition devices, is designed to sample and process a large amount of data before the next discharge starts. Several features of such a long-time-scale data processing system are described in detail. (author)

  3. ANL small-sample calorimeter system design and operation

    International Nuclear Information System (INIS)

    Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.

    1978-07-01

    The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg

  4. The ATLAS/TILECAL Detector Control System

    CERN Document Server

    Santos, H; The ATLAS collaboration

    2010-01-01

    Tilecal, the barrel hadronic calorimeter of ATLAS, is a sampling calorimeter in which scintillating tiles are embedded in an iron matrix. The tiles are optically coupled to wavelength-shifting fibers that carry the optical signal to photomultipliers. It has a cylindrical shape and is made out of three cylinders: the Long Barrel, with the LBA and LBC partitions, and the two Extended Barrels, with the EBA and EBC partitions. The main task of the Tile calorimeter Detector Control System (DCS) is to enable the coherent and safe operation of the calorimeter. All actions initiated by the operator, as well as all errors, warnings and alarms concerning the hardware of the detector, are handled by the DCS. The Tile calorimeter DCS controls and monitors mainly the low voltage and high voltage power supply systems, but it is also interfaced with the infrastructure (cooling system and racks), the laser and cesium calibration systems, the data acquisition system, configuration and conditions databases and the detector safety system. In...

  5. 21 CFR 203.38 - Sample lot or control numbers; labeling of sample units.

    Science.gov (United States)

    2010-04-01

    ... numbers; labeling of sample units. (a) Lot or control number required on drug sample labeling and sample... identifying lot or control number that will permit the tracking of the distribution of each drug sample unit... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Sample lot or control numbers; labeling of sample...

  6. Definition, analysis and development of an optical data distribution network for integrated avionics and control systems. Part 2: Component development and system integration

    Science.gov (United States)

    Yen, H. W.; Morrison, R. J.

    1984-01-01

    Fiber optic transmission is emerging as an attractive concept in data distribution onboard civil aircraft. Development of an Optical Data Distribution Network for Integrated Avionics and Control Systems for commercial aircraft will provide a data distribution network that gives freedom from EMI-RFI and ground loop problems, eliminates crosstalk and short circuits, provides protection and immunity from lightning-induced transients, and gives a large-bandwidth data transmission capability. In addition, there is a potential for significantly reducing the weight and increasing the reliability over conventional data distribution networks. Wavelength Division Multiplexing (WDM) is a candidate method for data communication between the various avionic subsystems. With WDM, all systems could conceptually communicate with each other without time sharing and without requiring complicated coding schemes for each computer and subsystem to recognize a message. However, the state of the art of optical technology limits the application of fiber optics in advanced integrated avionics and control systems. Therefore, it is necessary to address the architecture for a fiber optics data distribution system for integrated avionics and control systems as well as develop prototype components and systems.

  7. A system architecture for online data interpretation and reduction in fluorescence microscopy

    Science.gov (United States)

    Röder, Thorsten; Geisbauer, Matthias; Chen, Yang; Knoll, Alois; Uhl, Rainer

    2010-01-01

    In this paper we present a high-throughput sample screening system that enables real-time data analysis and reduction for live cell analysis using fluorescence microscopy. We propose a novel system architecture capable of analyzing a large amount of samples during the experiment and thus greatly minimizing the post-analysis phase that is the common practice today. By utilizing data reduction algorithms, relevant information of the target cells is extracted from the online collected data stream, and then used to adjust the experiment parameters in real-time, allowing the system to dynamically react on changing sample properties and to control the microscope setup accordingly. The proposed system consists of an integrated DSP-FPGA hybrid solution to ensure the required real-time constraints, to execute efficiently the underlying computer vision algorithms and to close the perception-action loop. We demonstrate our approach by addressing the selective imaging of cells with a particular combination of markers. With this novel closed-loop system the amount of superfluous collected data is minimized, while at the same time the information entropy increases.

  8. A service-oriented data access control model

    Science.gov (United States)

    Meng, Wei; Li, Fengmin; Pan, Juchen; Song, Song; Bian, Jiali

    2017-01-01

    The development of mobile computing, cloud computing and distributed computing meets growing individual service needs. For complex application systems, ensuring real-time, dynamic, and fine-grained data access control is an urgent problem. After analyzing common data access control models, the paper proposes a service-oriented access control model built on the mandatory access control model. By regarding system services as subjects and database data as objects, the model defines access levels and access identifications for subjects and objects, and ensures that system services access databases securely.
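
    A minimal sketch of the kind of mandatory, level-based check such a model performs (system service as subject, database data as object). The level names, the Bell-LaPadula-style read/write rules and all identifiers below are illustrative assumptions, since the abstract does not publish the model's actual rule set:

    ```python
    from dataclasses import dataclass

    # Hypothetical access levels; the paper defines its own scheme.
    LEVELS = {"public": 0, "internal": 1, "confidential": 2, "secret": 3}

    @dataclass(frozen=True)
    class Subject:          # a system service
        name: str
        level: int

    @dataclass(frozen=True)
    class DataObject:       # a database record or table
        name: str
        level: int

    def can_read(s: Subject, o: DataObject) -> bool:
        """Mandatory 'no read-up' rule: a service may read data at or below its level."""
        return s.level >= o.level

    def can_write(s: Subject, o: DataObject) -> bool:
        """Mandatory 'no write-down' rule: a service may write data at or above its level."""
        return s.level <= o.level

    billing = Subject("billing-service", LEVELS["confidential"])
    audit_log = DataObject("audit_log", LEVELS["secret"])

    print(can_read(billing, audit_log))   # False: read-up is denied
    print(can_write(billing, audit_log))  # True: write-up is allowed
    ```

    The point of the mandatory (as opposed to discretionary) check is that neither the service nor the data owner can override the level comparison at run time.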

  9. Internal control system

    OpenAIRE

    Pavésková, Ivana

    2014-01-01

    This dissertation focuses on the internal control system in enterprises and aims to map the control system with a focus on the purchasing department. I focused on the purchasing process because, with the increasing trend of outsourcing services and the increasing interconnectedness of enterprises, the risk of fraud in the purchasing process is currently rising. For the research, a sample of companies from the banking and non-banking environment was selected, to which a questionnaire was sent focusi...

  10. An Indoor Location-Based Control System Using Bluetooth Beacons for IoT Systems.

    Science.gov (United States)

    Huh, Jun-Ho; Seo, Kyungryong

    2017-12-19

    The indoor location-based control system estimates the indoor position of a user to provide the service he/she requires. The major elements involved in the system are the localization server, the service-provision client, the user application and the positioning technology. The localization server controls access of terminal devices (e.g., smartphones and other wireless devices) to determine their locations within a specified space first, and then the service-provision client initiates required services such as indoor navigation and monitoring/surveillance. The user application provides the necessary data to let the server localize the devices or to allow the user to receive various services from the client. The major technological elements involved in this system are the indoor space partition method, Bluetooth 4.0, RSSI (Received Signal Strength Indication) and trilateration. The system also employs BLE communication technology when determining the position of the user in an indoor space. The position information obtained is then used to control a specific device(s). These technologies are fundamental in achieving a "Smart Living". An indoor location-based control system that provides services by estimating user's indoor locations has been implemented in this study (First scenario). The algorithm introduced in this study (Second scenario) is effective in extracting valid samples from the RSSI dataset, but it has some drawbacks as well. Although we used a range-average algorithm that measures the shortest distance, there are some limitations because the measurement results depend on the sample size and the sample efficiency depends on sampling speeds and environmental changes. However, the Bluetooth system can be implemented at a relatively low cost, so that once the problem of precision is solved, it can be applied to various fields.
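
    The RSSI-plus-trilateration pipeline named in the abstract can be sketched as follows. The log-distance path-loss model, the calibrated 1 m power, the path-loss exponent and the beacon layout are all illustrative assumptions, not values from the paper:

    ```python
    import math

    def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, n=2.0):
        """Log-distance path-loss model: d = 10 ** ((txPower - RSSI) / (10 n)).
        tx_power_dbm is the calibrated RSSI at 1 m; n is the path-loss exponent."""
        return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

    def trilaterate(b1, b2, b3, d1, d2, d3):
        """Position from three beacon coordinates and ranges, by linearizing the
        three circle equations and solving the resulting 2x2 system (Cramer's rule)."""
        (x1, y1), (x2, y2), (x3, y3) = b1, b2, b3
        a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
        a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
        r1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
        r2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
        det = a11 * a22 - a12 * a21
        return (r1 * a22 - r2 * a12) / det, (a11 * r2 - a21 * r1) / det

    # Hypothetical beacon layout (meters) and exact ranges to a known point.
    beacons = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
    true_pos = (1.0, 2.0)
    dists = [math.dist(b, true_pos) for b in beacons]
    x, y = trilaterate(*beacons, *dists)
    print(round(x, 6), round(y, 6))  # 1.0 2.0
    ```

    In practice the ranges come from noisy RSSI samples, which is why the abstract's range-average step (averaging many samples per beacon before ranging) matters.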

  11. An Indoor Location-Based Control System Using Bluetooth Beacons for IoT Systems

    Directory of Open Access Journals (Sweden)

    Jun-Ho Huh

    2017-12-01

    Full Text Available The indoor location-based control system estimates the indoor position of a user to provide the service he/she requires. The major elements involved in the system are the localization server, the service-provision client, the user application and the positioning technology. The localization server controls access of terminal devices (e.g., smartphones and other wireless devices) to determine their locations within a specified space first, and then the service-provision client initiates required services such as indoor navigation and monitoring/surveillance. The user application provides the necessary data to let the server localize the devices or to allow the user to receive various services from the client. The major technological elements involved in this system are the indoor space partition method, Bluetooth 4.0, RSSI (Received Signal Strength Indication) and trilateration. The system also employs BLE communication technology when determining the position of the user in an indoor space. The position information obtained is then used to control a specific device(s). These technologies are fundamental in achieving a “Smart Living”. An indoor location-based control system that provides services by estimating user’s indoor locations has been implemented in this study (First scenario). The algorithm introduced in this study (Second scenario) is effective in extracting valid samples from the RSSI dataset, but it has some drawbacks as well. Although we used a range-average algorithm that measures the shortest distance, there are some limitations because the measurement results depend on the sample size and the sample efficiency depends on sampling speeds and environmental changes. However, the Bluetooth system can be implemented at a relatively low cost, so that once the problem of precision is solved, it can be applied to various fields.

  12. A distributed, hardware reconfigurable and packet switched real-time control and data acquisition system

    International Nuclear Information System (INIS)

    Batista, A.J.N.; Combo, A.; Sousa, J.; Varandas, C.A.F.

    2002-01-01

    The architecture of a synchronized event-based control and data acquisition system that aims to significantly improve the performance of current systems is presented. The design explores recent developments in data transport, signal processing and system synchronization. Data transport between the acquisition, processing and storing devices, and at backplane level, will be performed by InfiniBand, a low-latency packet-switched network standard. Data processing algorithms will be performed in a mixture of digital signal processors and reconfigurable field programmable gate arrays. Both devices will be programmed from a descriptive high-level mathematical language. Acquisition synchronization, data stamping and event management will be performed through a specialized low-latency synchronous optical network for the time-critical signals.

  13. The EnzymeTracker: an open-source laboratory information management system for sample tracking

    Directory of Open Access Journals (Sweden)

    Triplet Thomas

    2012-01-01

    Full Text Available Abstract Background In many laboratories, researchers store experimental data on their own workstation using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, for which laboratories need to pay expensive license fees. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. Results In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. Conclusions The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50

  14. The Safeguards analysis applied to the RRP. Automatic sampling authentication system

    International Nuclear Information System (INIS)

    Ono, Sawako; Nakashima, Shinichi; Iwamoto, Tomonori

    2004-01-01

    The sampling for analysis from vessels and columns at the Rokkasho Reprocessing Plant (RRP) is performed mostly by the automatic sampling system. Safeguards samples for verification will also be taken using these sampling systems and transferred to the OSL through the pneumatic transfer network owned and controlled by the operator. In order to maintain sample integrity and continuity of knowledge (CoK) throughout sample processing, it is essential to develop and establish authentication measures for the automatic sampling system, including the transfer network. We have developed the Automatic Sampling Authentication System (ASAS) in consultation with the IAEA. This paper describes the structure, function and concept of ASAS. (author)

  15. Control and data acquisition system for Beam Line C and HRS

    International Nuclear Information System (INIS)

    Naivar, F.J.; Rogers, W.L.; Simmonds, D.D.; Spencer, J.E.; Spencer, N.C.; Stark, C.F.

    1977-03-01

    The High Resolution Spectrometer (HRS) at LAMPF is a multimillion-dollar facility intended for the study of proton-induced reactions and scattering with 100 to 800 MeV protons. It will be used by research scientists from universities and other national laboratories throughout the United States. The design resolution of ±1.5×10^-5 is better than any similar system ever attempted in this energy range. To obtain this quality, as well as to fully realize the research potential of the facility, a rather sophisticated computer control and data acquisition system is required. The facility, based on a general multiuser, multitask, multiprocessor system, is described.

  16. A real time data acquisition system using the MIL-STD-1553B bus. [for transmission of data to host computer for control law processing

    Science.gov (United States)

    Peri, Frank, Jr.

    1992-01-01

    A flight digital data acquisition system that uses the MIL-STD-1553B bus for transmission of data to a host computer for control law processing is described. The instrument, the Remote Interface Unit (RIU), can accommodate up to 16 input channels and eight output channels. The RIU employs a digital signal processor to perform local digital filtering before sending data to the host. The system allows flexible sensor and actuator data organization to facilitate quick control law computations on the host computer. The instrument can also run simple control laws autonomously without host intervention. The RIU and host computer together have replaced a similar larger, ground minicomputer system with favorable results.

  17. [Automatic adjustment control system for DC glow discharge plasma source].

    Science.gov (United States)

    Wan, Zhen-zhen; Wang, Yong-qing; Li, Xiao-jia; Wang, Hai-zhou; Shi, Ning

    2011-03-01

    There are three important parameters in the DC glow discharge process: the discharge current, the discharge voltage and the argon pressure in the discharge source. These parameters influence each other during the glow discharge process. This paper presents an automatic control system for a DC glow discharge plasma source. The system collects the discharge voltage and controls it automatically by adjusting the discharge source pressure while the discharge current is held constant. The design concept, circuit principle and control program of this automatic control system are described. Accuracy is improved by reducing complex operations and manual control errors. The system enhances the control accuracy of the glow discharge voltage and reduces the time needed to reach discharge voltage stability. Stability test results with the automatic control system are also provided: the accuracy with automatic control is better than 1% FS, improved from 4% FS under manual control, and the time to reach discharge voltage stability has been shortened from more than 90 s under manual control to within 30 s under automatic control. Standard samples such as middle-low alloy steel and tin bronze have been tested with this automatic control system, and the precision of concentration analysis has been significantly improved. The RSDs of all test results are better than 3.5%. For the middle-low alloy steel standard sample, the RSD range of the concentration results for Ti, Co and Mn is reduced from 3.0%-4.3% under manual control to 1.7%-2.4% under automatic control, and that for S and Mo from 5.2%-5.9% to 3.3%-3.5%. For the tin bronze standard sample, the RSD range for Sn, Zn and Al is reduced from 2.6%-4.4% to 1.0%-2.4%, and that for Si, Ni and Fe from 6.6%-13.9% to 2.6%-3.5%. The test data are also shown in this paper.
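
    The regulation scheme described above (hold a voltage setpoint by trimming the source pressure while the current is constant) can be sketched as a toy closed loop. The linear plant model V(p) = 1000 - 3p, the gain and all numbers are hypothetical, not the paper's circuit:

    ```python
    def regulate_voltage(setpoint_v=800.0, steps=200):
        """Toy closed loop: the discharge voltage is driven to a setpoint by
        incrementally trimming the source pressure (velocity-form integral
        control). The plant model V(p) = 1000 - 3 p is purely illustrative."""
        pressure = 50.0      # arbitrary units
        gain = 0.05          # controller gain (assumed)
        v = 1000.0 - 3.0 * pressure
        for _ in range(steps):
            error = setpoint_v - v
            # Raising the pressure lowers the voltage, so actuate with sign flipped.
            pressure -= gain * error
            v = 1000.0 - 3.0 * pressure
        return v, pressure

    v, p = regulate_voltage()
    print(round(v, 2), round(p, 2))  # 800.0 66.67
    ```

    Because the controller adjusts the pressure incrementally rather than setting it absolutely, it has integral action built in and settles with zero steady-state error on this linear plant.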

  18. Multi-level access control in the data pipeline of the international supply chain system

    NARCIS (Netherlands)

    Pruksasri, P.; Berg, J. van den; Hofman, W.; Daskapan, S.

    2013-01-01

    The Seamless Integrated Data Pipeline system was proposed to the European Union in order to overcome the information quality shortcomings of the current international supply chain information exchange systems. Next to identification and authorization of stakeholders, secure access control needs to

  19. Prototype Real-time ATCA-based LLRF Control System

    CERN Document Server

    Makowski, Dariusz; Jezynski, Tomasz; Piotrowski, Adam; Jablonski, Grzegorz; Jalmuzna, Wojciech; Czuba, Krzysztof; Predki, Paweł; Simrock, Stefan

    2011-01-01

    The linear accelerators employed to drive Free Electron Lasers (FELs), such as the X-ray Free Electron Laser (XFEL) currently being built in Hamburg, require sophisticated control systems. The Low Level Radio Frequency (LLRF) control system should stabilize the phase and amplitude of the electromagnetic field in accelerating modules with tolerances below 0.02 % for amplitude and 0.01 degree for phase to produce an ultra-stable electron beam that meets the conditions required for Self-Amplified Spontaneous Emission (SASE). The LLRF control system of the 32-cavity accelerating module of the XFEL accelerator requires acquisition of more than 100 analogue signals sampled at a frequency of around 100 MHz. Data processing in the real-time loop should complete within a few hundred nanoseconds. Moreover, the LLRF control system should be reliable, upgradable and serviceable. The Advanced Telecommunications Computing Architecture (ATCA) standard, developed for telecommunication applications, can fulfil all of the above mentione...

  20. A Self-Calibrating Remote Control Chemical Monitoring System

    Energy Technology Data Exchange (ETDEWEB)

    Jessica Croft

    2007-06-01

    The Susie Mine, part of the Upper Tenmile Mining Area, is located in Rimini, MT about 15 miles southwest of Helena, MT. The Upper Tenmile Creek Mining Area is an EPA Superfund site with 70 abandoned hard rock mines and several residential yards prioritized for clean up. Water from the Susie mine flows into Tenmile Creek from which the city of Helena draws part of its water supply. MSE Technology Applications in Butte, Montana was contracted by the EPA to build a treatment system for the Susie mine effluent and demonstrate a system capable of treating mine waste water in remote locations. The Idaho National Lab was contracted to design, build and demonstrate a low maintenance self-calibrating monitoring system that would monitor multiple sample points, allow remote two-way communications with the control software and allow access to the collected data through a web site. The Automated Chemical Analysis Monitoring (ACAM) system was installed in December 2006. This thesis documents the overall design of the hardware, control software and website, the data collected while MSE-TA’s system was operational, the data collected after MSE-TA’s system was shut down and suggested improvements to the existing system.

  1. Instrumentation, control and data management for the MIST (Modular Integrated Utility System) Facility

    Science.gov (United States)

    Celino, V. A.

    1977-01-01

    An appendix providing the technical data required for computerized control and/or monitoring of selected MIST subsystems is presented. Specific computerized functions to be performed are as follows: (1) Control of the MIST heating load simulator and monitoring of the diesel engine generators' cooling system; (2) Control of the MIST heating load simulator and MIST heating subsystem including the heating load simulator; and (3) Control of the MIST air conditioning load simulator subsystem and the MIST air conditioning subsystem, including cold thermal storage and condenser water flows.

  2. Integrated approach to the development of the ITER control system configuration data

    International Nuclear Information System (INIS)

    Stepanov, D.; Abadie, L.; Bertin, J.; Bourguignon, G.; Darcourt, G.; Liotard, O.

    2012-01-01

    ITER control system (CODAC) will rely on a large number of configuration data coming from different sources. This information is created using different tools, stored in various databases and, generally, has a different life-cycle. In many cases it is difficult for instrumentation and control (I&C) engineers to have a common view of this information or to check data consistency. The plant system profile database, described in this paper, tries to address these issues by gathering all I&C-specific information in the same database and providing means to analyze these data. At the design phase, the current ITER infrastructure and CODAC practices were evaluated, and the following architectural and software decisions were made: (1) the database back-end should be Microsoft SQL Server; (2) the application should have a web interface; (3) business logic and the user interface should be written in Java, with PrimeFaces used for the user interface, Spring for transactional support and Hibernate for interaction with the database; and (4) the data integration tool should be Talend. The task was launched in September 2010; in February 2011 the first version was put in production, and it was gradually introduced into CODAC processes in the following months.

  3. Sampled Data Adaptive Digital Computer Control of Surface Ship Maneuvers

    Science.gov (United States)

    1976-06-01

    0.53 feet. Systems for which fuel considerations are not a motivating factor may be designed without this part of the control law to allow finer...

  4. Communication and synchronization aspects of a mixed hardware control and data acquisition system

    International Nuclear Information System (INIS)

    Schmidt, V.; Flor, G.; Luchetta, A.; Manduchi, G.; Piacentini, I.E.; Vitturi, S.; Hemming, O.N.

    1989-01-01

    The paper deals with some specific aspects of the control and data acquisition system of the RFX nuclear fusion experiment, at present under construction in Padova, Italy. This system is built around a local area network which connects programmable controllers, minicomputers with CAMAC front-end, and personal computers as operator consoles. These three types of nodes use compatible software which contains a set of low-level routines according to levels one to four of the ISO OSI recommendations. The paper describes in detail how the overall system synchronization is achieved. Another aspect described in the paper is the proposed solution for the precision timing and waveform generation (which uses commercial CAMAC hardware) and its integration with the overall system synchronization.

  5. 241-SY-101 data acquisition and control system (DACS) operator interface upgrade operational test report

    International Nuclear Information System (INIS)

    ERMI, A.M.

    1999-01-01

    This procedure provides instructions for evaluating the readiness of the first portion of the upgraded 241-SY-101 Data Acquisition and Control System (DACS) computer system. Its ability to provide proper control and monitoring of the mitigation mixer pump and instrumentation installed in the 241-SY-101 underground storage tank will be systematically evaluated by the performance of this procedure.

  6. Development of a generic system for real-time data access and remote control of multiple in-situ water quality monitoring instruments

    Science.gov (United States)

    Wright, S. A.; Bennett, G. E.; Andrews, T.; Melis, T. S.; Topping, D. J.

    2005-05-01

    Currently, in-situ monitoring of water quality parameters (e.g. water temperature, conductivity, turbidity) in the Colorado River ecosystem typically consists of deploying instruments in the river, retrieving them at a later date, downloading the datalogger, then examining the data; an arduous process in the remote settings of Grand Canyon. Under this protocol, data is not available real-time and there is no way to detect problems with the instrumentation until after retrieval. The next obvious stage in the development of in-situ monitoring in Grand Canyon was the advent of one-way telemetry, i.e. streaming data in real-time from the instrument to the office and/or the world-wide-web. This protocol allows for real-time access to data and the identification of instrumentation problems, but still requires a site visit to address instrument malfunctions, i.e. the user does not have the ability to remotely control the instrument. At some field sites, such as the Colorado River in Grand Canyon, site visitation is restricted by remoteness and lack of traditional access routes (i.e. roads). Even at less remote sites, it may still be desirable to have two-way communication with instruments in order to, for example, diagnose and potentially fix instrumentation problems, change sampling parameters to save battery power, etc., without having to visit the site. To this end, the U.S. Geological Survey, Grand Canyon Monitoring and Research Center, is currently developing and testing a high-speed, two-way communication system that allows for real-time data access and remote control of instrumentation. The approach tested relies on internet access and may be especially useful in areas where land-line or cellular connections are unavailable. The system is composed of off-the-shelf products, uses a commercial broadband satellite service, and is designed in a generic way such that any instrument that communicates through RS-232 communication (i.e. a serial port) is compatible with

  7. Fastbus for data aquisition and control

    International Nuclear Information System (INIS)

    Costrell, L.; Dawson, W.K.

    1983-01-01

    FASTBUS is a standardized modular data-bus system for data acquisition, data processing and control applications. It is the result of an interlaboratory development undertaken to meet the needs of the high energy physics community. However, the versatility, speed and addressing capability of FASTBUS make it attractive for many other types of application. A FASTBUS system consists of bus Segments which operate independently but dynamically link together as needed for operation passing. This parallel processing feature accounts to a great extent for the high throughput of FASTBUS in multisegment systems. Master modules compete for single or multiple Segment Control through a bus arbitration scheme using assigned priorities. Logical, geographical, secondary and broadcast addressing methods are used to access either data space or control and status register space. Features include block transfers, a sparse data scan and interrupts.

  8. Changing an automated drug inventory control system to a data base design.

    Science.gov (United States)

    Bradish, R A

    1982-09-01

    A pharmacy department's change from indexed sequential access files to a data base management system (DBMS) for purposes of automated inventory control is described. The DBMS has three main functional areas: (1) inventory ordering and accountability, (2) charging of interdepartmental and intradepartmental orders, and (3) data manipulation with report design for management control. There are seven files directly related to the inventory ordering and accountability area. Each record can be accessed directly or through another file. Information on the quantity of a drug on hand, drug(s) supplied by a specific vendor, status of a purchase order, or calculation of an estimated order quantity can be retrieved quickly. In the drug master file, two records contain a reorder point and safety-stock level that are determined by searching the entries in the order history file and vendor master file. The intradepartmental and interdepartmental orders section contains five files assigned to record and store information on drug distribution. All items removed from the stockroom and distributed are recorded, and reports can be generated for itemized bills, total cost by area, and as formatted files for the accounts payable department. The design, development, and implementation of the DBMS took approximately a year using a part-time pharmacist and minimal outside help, while the previous system required constant expensive help of a programmer/analyst. The DBMS has given the pharmacy department a flexible inventory management system with increased drug control, decreased operating expenses, increased use of department personnel, and the ability to develop and enhance other systems.
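
    The reorder point and safety-stock calculation the abstract alludes to (derived from the order history and vendor files) might follow the standard textbook formulas sketched below. The article does not state which formulas the DBMS actually uses, so the service-level factor, lead time and demand figures are illustrative:

    ```python
    import math
    import statistics

    def reorder_point(daily_demand, lead_time_days, z=1.65):
        """Reorder point = expected demand over the lead time plus safety stock.
        Safety stock = z * sigma_daily * sqrt(lead time); z = 1.65 corresponds to
        roughly a 95% service level. (Illustrative textbook formulas.)"""
        mu = statistics.mean(daily_demand)
        sigma = statistics.stdev(daily_demand)
        safety_stock = z * sigma * math.sqrt(lead_time_days)
        return mu * lead_time_days + safety_stock, safety_stock

    # Hypothetical order-history figures: units issued per day for one drug.
    demand = [12, 9, 15, 11, 13, 10, 14, 12]
    rop, ss = reorder_point(demand, lead_time_days=4)
    print(round(rop, 1), round(ss, 1))  # 54.6 6.6
    ```

    A stockroom record whose quantity on hand falls below `rop` would then trigger an estimated order quantity, exactly the kind of search across the order history and vendor master files the abstract describes.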

  9. LabData database sub-systems for post-processing and quality control of stable isotope and gas chromatography measurements

    Science.gov (United States)

    Suckow, A. O.

    2013-12-01

    Measurements need post-processing to obtain results that are comparable between laboratories. Raw data may need to be corrected for blank, memory, drift (change of reference values with time), linearity (dependence of reference on signal height) and normalized to international reference materials. Post-processing parameters need to be stored for traceability of results. State-of-the-art stable isotope correction schemes are available based on MS Excel (Geldern and Barth, 2012; Gröning, 2011) or MS Access (Coplen, 1998). These are specialized to stable isotope measurements only, often only to the post-processing of a specific run. Embedding of the algorithms into a multipurpose database system was missing. This is necessary to combine results of different tracers (3H, 3He, 2H, 18O, CFCs, SF6...) or geochronological tools (sediment dating e.g. with 210Pb, 137Cs), to relate to attribute data (submitter, batch, project, geographical origin, depth in core, well information etc.) and for further interpretation tools (e.g. lumped parameter modelling). Database sub-systems to the LabData laboratory management system (Suckow and Dumke, 2001) are presented for stable isotopes and for gas chromatographic CFC and SF6 measurements. The sub-system for stable isotopes allows the following post-processing: 1. automated import from measurement software (Isodat, Picarro, LGR), 2. correction for sample-to-sample memory, linearity, drift, and renormalization of the raw data. The sub-system for gas chromatography covers: 1. storage of all raw data 2. storage of peak integration parameters 3. correction for blank, efficiency and linearity The user interface allows interactive and graphical control of the post-processing and all corrections by export to and plot in MS Excel and is a valuable tool for quality control. The sub-databases are integrated into LabData, a multi-user client server architecture using MS SQL server as back-end and an MS Access front-end and installed in four
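
    Two of the corrections listed (normalization to international reference materials and sample-to-sample memory) can be sketched as follows. The reference values, the single-fraction memory model and the 3% carry-over are illustrative assumptions, not LabData's actual algorithms:

    ```python
    def normalize(delta_raw, ref1_raw, ref1_true, ref2_raw, ref2_true):
        """Two-point normalization of a raw delta value to the international scale
        (e.g. VSMOW/SLAP for water isotopes): a linear map fixed by two reference
        materials measured in the same run."""
        slope = (ref2_true - ref1_true) / (ref2_raw - ref1_raw)
        return ref1_true + slope * (delta_raw - ref1_raw)

    def memory_correct(current_raw, previous_true, memory=0.03):
        """Simple one-step memory correction: the measured value is assumed to
        contain a fraction `memory` of the preceding sample (illustrative model)."""
        return (current_raw - memory * previous_true) / (1.0 - memory)

    # Hypothetical run: two lab standards with known values bracket the samples.
    d_sample = normalize(-12.4, ref1_raw=-10.2, ref1_true=-10.0,
                         ref2_raw=-40.8, ref2_true=-40.0)
    print(round(d_sample, 3))  # -12.157
    ```

    Storing the fitted slope, the memory fraction and the reference assignments alongside the results is what gives the traceability the abstract emphasizes.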

  10. Data Acquisition System

    International Nuclear Information System (INIS)

    Watwood, D.; Beatty, J.

    1991-01-01

    The Data Acquisition System (DAS) comprises a Hewlett-Packard (HP) model 9816, Series 200 Computer System with the appropriate software to acquire, control, and archive data from a Data Acquisition/Control Unit, models HP3497A and HP3498A. The primary storage medium is an HP9153 16-megabyte hard disc. The data is backed up on three floppy discs. One floppy disc drive is contained in the HP9153 chassis; the other two comprise an HP9122 dual disc drive. An HP82906A line printer supplies hard-copy backup. A block diagram of the hardware setup is shown. The HP3497A/3498A Data Acquisition/Control Units read each input channel and transmit the raw voltage reading to the HP9816 CPU via the HPIB bus. The HP9816 converts this voltage to the appropriate engineering units using the calibration curves for the sensor being read. The HP9816 archives both the raw and processed data, along with the time the readings were taken, to hard and floppy discs. The processed values and reading time are printed on the line printer. This system is designed to accommodate several types of sensors; each type is discussed in the following sections
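The voltage-to-engineering-units step can be illustrated with a polynomial calibration curve; the coefficient layout and the example calibration are assumptions, since the abstract does not specify the curve form:

```python
def to_engineering_units(voltage, coeffs):
    """Evaluate a polynomial calibration curve for one sensor.
    coeffs holds the polynomial coefficients in ascending order, so
    coeffs = [c0, c1, c2] means c0 + c1*v + c2*v**2."""
    value = 0.0
    for power, c in enumerate(coeffs):
        value += c * voltage ** power
    return value

# A hypothetical linear sensor calibration: 25 engineering units per volt.
reading = to_engineering_units(2.0, [0.0, 25.0])
```

Archiving both the raw voltage and the converted value, as the DAS does, lets readings be re-derived later if a calibration curve is revised.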

  11. Communications and Information: Strategic Automated Command Control System-Data Transmission Subsystem (SACCS-DTS) Software Configuration Management and Change Control

    National Research Council Canada - National Science Library

    1997-01-01

    .... It prescribes the requirements, responsibilities, and procedures for operation, security, and configuration management of the Strategic Automated Command Control System-Data Transmission Subsystem (SACCS-DTS...

  12. SCADA based radioactive sample bottle delivery system for fuel reprocessing project

    International Nuclear Information System (INIS)

    Kaushik, Subrat; Munj, Niket; Chauhan, R.K.; Jayaram, M.N.; Haneef, K.K.M.

    2014-01-01

    Radioactive samples of process streams need to be analyzed in a centralized control lab to measure the concentration of heavy elements as well as activity at various stages of re-processing plants. The samples are taken remotely from biologically shielded process cells through sampling blisters into sample bottles. These are then transferred to the control lab, located about 50 meters away, using a vacuum transfer system. The bottle movement is tracked from origin to destination in a rich-HMI SCADA system using infra-red non-contact proximity sensors located along the sampling line; these sensors are connected to a PLC in a fail-safe mode. The sample bottle travels at a speed of 10 m/s under the vacuum motive force, and the detection time is of the order of 1 ms. Flow meters are used to measure the air flow in the sampling line. The system has been designed, developed, tested and commissioned, and has been in use for four years. (author)

  13. Calibrating a combined energy systems analysis and controller design method with empirical data

    International Nuclear Information System (INIS)

    Murphy, Gavin Bruce; Counsell, John; Allison, John; Brindley, Joseph

    2013-01-01

    The drive towards low carbon constructions has seen buildings increasingly utilise many different energy systems simultaneously to control the human comfort of the indoor environment, such as ventilation with heat recovery, various heating solutions and applications of renewable energy. This paper describes a dynamic modelling and simulation method (IDEAS – Inverse Dynamics based Energy Assessment and Simulation) for analysing the energy utilisation of a building and its complex servicing systems. The IDEAS case study presented in this paper is based upon small perturbation theory and can be used for the analysis of the performance of complex energy systems and also for the design of smart control systems. This paper presents a process of how any dynamic model can be calibrated against a more empirically based data model, in this case the UK Government's SAP (Standard Assessment Procedure). The research targets of this work are building simulation experts analysing the energy use of a building and also control engineers designing smart control systems for dwellings. The calibration process presented is transferable and has applications for simulation experts to assist in calibrating any dynamic building simulation method with an empirically based method. - Highlights: • Presentation of an energy systems analysis method for assessing the energy utilisation of buildings and their complex servicing systems. • An inverse dynamics based controller design method is detailed. • Method of how a dynamic model can be calibrated with an empirically based model

  14. Working with sample data exploration and inference

    CERN Document Server

    Chaffe-Stengel, Priscilla

    2014-01-01

    Managers and analysts routinely collect and examine key performance measures to better understand their operations and make good decisions. Being able to render the complexity of operations data into a coherent account of significant events requires an understanding of how to work well with raw data and to make appropriate inferences. Although some statistical techniques for analyzing data and making inferences are sophisticated and require specialized expertise, there are methods that are understandable and applicable by anyone with basic algebra skills and the support of a spreadsheet package. By applying these fundamental methods themselves rather than turning over both the data and the responsibility for analysis and interpretation to an expert, managers will develop a richer understanding and potentially gain better control over their environment. This text is intended to describe these fundamental statistical techniques to managers, data analysts, and students. Statistical analysis of sample data is enh...

  15. SRL-NURE hydrogeochemical data management system

    International Nuclear Information System (INIS)

    Maddox, J.H.; Wren, H.F.; Honeck, H.C.; Tharin, C.R.; Howard, M.D.

    1976-07-01

    A data management system was developed to store and retrieve all physical, chemical, and geological data collected for the NURE Hydrogeochemical Reconnaissance program by the Savannah River Laboratory (SRL). In 1975, SRL accepted responsibility for hydrogeochemical reconnaissance of twenty-five states in the eastern United States as part of the National Uranium Resource Evaluation (NURE) program to identify areas favorable for uranium exploration. The SRL-NURE hydrogeochemical data management system is written in FORTRAN IV for an IBM System 360/195 computer. The system is designed to accommodate changes in the types of data collected about a sampling site and the different numbers of samples taken at the sites. The data are accepted as they become available and are combined with relevant data already in the system

  16. Design architecture for multi-zone HVAC control systems from existing single-zone systems using wireless sensor networks

    Science.gov (United States)

    Redfern, Andrew; Koplow, Michael; Wright, Paul

    2007-01-01

    Most residential heating, ventilating, and air-conditioning (HVAC) systems utilize a single zone for conditioning air throughout the entire house. While inexpensive, these systems lead to wide temperature distributions and inefficient cooling due to the difference in thermal loads in different rooms. The end result is additional cost to the end user because the house is over-conditioned. To reduce the total amount of energy used in a home and to increase occupant comfort, there is a need for a better control system using multiple temperature zones. Typical multi-zone systems are costly and require extensive infrastructure to function. Recent advances in wireless sensor networks (WSNs) have enabled a low-cost drop-in wireless vent register control system. The register control system is controlled by a master controller unit, which collects sensor data from a distributed wireless sensor network. Each sensor node samples local settings (occupancy, light, humidity and temperature) and reports the data back to the master control unit. The master control unit compiles the incoming data and then actuates the vent registers to control the airflow throughout the house. The control system also utilizes a smart thermostat with a movable set point to enable the user to define their given comfort levels. The new system can reduce the run time of the HVAC system, thus decreasing the amount of energy used and increasing the comfort of the home's occupants.
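The master controller's decision step can be sketched as a simple per-zone rule. The gain, the minimum opening, and the function names below are illustrative assumptions rather than the paper's actual control law:

```python
def vent_positions(zone_temps, setpoints, occupied, min_open=0.1, gain=0.25):
    """Return a vent-register opening (0.0..1.0) for each zone: keep a
    trickle of airflow in unoccupied zones, and open occupied zones in
    proportion to their temperature error."""
    positions = []
    for temp, target, occ in zip(zone_temps, setpoints, occupied):
        if not occ:
            positions.append(min_open)  # unoccupied: minimal airflow
            continue
        error = abs(target - temp)
        positions.append(min(1.0, min_open + gain * error))
    return positions
```

A rule of this shape directs conditioned air to the zones that need it, which is how per-zone actuation can cut HVAC run time relative to a single-zone thermostat.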

  17. Data archiving system implementation in ITER's CODAC Core System

    International Nuclear Information System (INIS)

    Castro, R.; Abadie, L.; Makushok, Y.; Ruiz, M.; Sanz, D.; Vega, J.; Faig, J.; Román-Pérez, G.; Simrock, S.; Makijarvi, P.

    2015-01-01

    Highlights: • Implementation of ITER's data archiving solution. • Integration of the solution into CODAC Core System. • Data archiving structure. • Highly efficient data transmission into fast plant system controllers. • Fast control and data acquisition in Linux. - Abstract: The aim of this work is to present the implementation of data archiving in ITER's CODAC Core System software. This first approach provides a client-side API and server-side software allowing the creation of a simplified version of the ITERDB data archiving software, and implements all elements required to complete the data archiving flow from data acquisition to persistent storage. The client side includes all necessary components that run on devices that acquire or produce data, distributing and streaming the data to the configured remote archiving servers. The server side comprises an archiving service that stores all received data into HDF5 files. The archiving solution aims at storing data coming from the data acquisition system, the conventional control, and also processed/simulated data.

  18. Air sampling system for airborne surveys

    International Nuclear Information System (INIS)

    Jupiter, C.; Tipton, W.J.

    1975-01-01

    An air sampling system has been designed for installation on the Beechcraft King Air A-100 aircraft as a part of the Aerial Radiological Measuring System (ARMS). It is intended for both particle and whole gas sampling. The sampling probe is designed for isokinetic sampling and is mounted on a removable modified escape hatch cover, behind the co-pilot's seat, and extends about two feet forward of the hatch cover in the air stream lines. Directly behind the sampling probe inside the modified hatch cover is an expansion chamber, space for a 5-inch diameter filter paper cassette, and an optional four-stage cascade impactor for particle size distribution measurements. A pair of motors and blower pumps provide the necessary 0.5 atmosphere pressure across the type MSA 1106 B glass fiber filter paper to allow a flow rate of 50 cfm. The MSA 1106 B filter paper is designed to trap sub-micrometer particles with a high efficiency; it was chosen to enable a quantitative measurement of airborne radon daughters, one of the principal sources of background signals when radiological surveys are being performed. A venturi section and pressure gauges allow air flow rate measurements so that airborne contaminant concentrations may be quantified. A whole gas sampler capable of sampling a cubic meter of air is mounted inside the aircraft cabin. A nuclear counting system on board the aircraft provides capability for α, β and γ counting of filter paper samples. Design data are presented and types of survey missions which may be served by this system are described
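The quantification step the abstract mentions (flow rate times sampling time gives the sampled air volume, hence a concentration) can be sketched directly. The cfm-to-m³ conversion constant is standard; the function name and units are illustrative assumptions:

```python
CFM_TO_M3_PER_MIN = 0.0283168  # 1 cubic foot = 0.0283168 m^3

def airborne_concentration(activity_bq, flow_cfm, minutes):
    """Airborne concentration in Bq/m^3 from the activity collected on the
    filter and the volume of air drawn through it during sampling."""
    volume_m3 = flow_cfm * CFM_TO_M3_PER_MIN * minutes
    return activity_bq / volume_m3
```

At the system's 50 cfm, roughly 1.4 m³ of air passes the filter per minute, which is why the venturi flow measurement is essential for turning filter counts into concentrations.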

  19. Intelligent Control of Micro Grid: A Big Data-Based Control Center

    Science.gov (United States)

    Liu, Lu; Wang, Yanping; Liu, Li; Wang, Zhiseng

    2018-01-01

    In this paper, a structure of a micro grid system with a big data-based control center is introduced. Energy data from distributed generation, storage and load are analyzed through the control center, and from the results new trends will be predicted and applied as feedback to optimize the control. Therefore, each step in the micro grid can be adjusted and organized in a form of comprehensive management. A framework of real-time data collection, data processing and data analysis will be proposed by employing big data technology. Consequently, an integrated distributed generation and an optimized energy storage and transmission process can be implemented in the micro grid system.

  20. Methods and Technologies of XML Data Modeling for IP Mode Intelligent Measuring and Controlling System

    International Nuclear Information System (INIS)

    Liu, G X; Hong, X B; Liu, J G

    2006-01-01

    This paper presents the IP mode intelligent measuring and controlling system (IMIMCS). Based on the object-oriented modeling technology of UML and XML Schema, innovative methods and technologies for some key problems of XML data modeling in the IMIMCS are discussed, including refinement of the system's business by means of UML use-case diagrams, confirmation of the content of the XML data model and the logical relationships of the XML Schema objects with the aid of UML class diagrams, and the mapping rules from the UML object model to XML Schema. Finally, the application of the XML-based IMIMCS for a modern greenhouse is presented. The results show that the modeling methods for the measuring and controlling data in the IMIMCS, involving a multi-layer structure and many operating systems, possess strong reliability and flexibility, guarantee uniformity of complex XML documents and meet the requirement of cross-platform data communication

  1. Data acquisition and control for gamma scanning

    International Nuclear Information System (INIS)

    Barnes, B.K.; Murray, A.S.; Quintana, J.N.

    1980-01-01

    A new computer-based data acquisition and control unit has been installed in the Los Alamos Scientific Laboratory (LASL) system for scanning irradiated reactor fuel pins. The scanning mechanism is controlled by a commercial multichannel analyzer via a CAMAC link with an intelligent crate controller. The scanning and control unit consists of three linked LSI-11 computers. The multitasking capability of the commercial operating system allows control decisions to be based upon data currently being acquired

  2. Toward a Tiered Model to Share Clinical Trial Data and Samples in Precision Oncology.

    Science.gov (United States)

    Broes, Stefanie; Lacombe, Denis; Verlinden, Michiel; Huys, Isabelle

    2018-01-01

    The recent revolution in science and technology applied to medical research has left in its wake a trail of biomedical data and human samples; however, its opportunities remain largely unfulfilled due to a number of legal, ethical, financial, strategic, and technical barriers. Precision oncology has been at the vanguard of leveraging this potential of "Big data" and samples into meaningful solutions for patients, considering the need for new drug development approaches in this area (due to high costs, late-stage failures, and the molecular diversity of cancer). To harness the potential of the vast quantities of data and samples currently fragmented across databases and biobanks, it is critical to engage all stakeholders and share data and samples across research institutes. Here, we identified two general types of sharing strategies. First, open access models, characterized by the absence of any review panel or decision maker, and second, controlled access models where some form of control is exercised by either the donor (i.e., patient), the data provider (i.e., initial organization), or an independent party. Further, we theoretically describe and provide examples of nine different strategies focused on greater sharing of patient data and material. These models provide varying levels of control, access to various data and/or samples, and different types of relationship between the donor, data provider, and data requester. We propose a tiered model to share clinical data and samples that takes into account privacy issues and respects sponsors' legitimate interests. Its implementation would contribute to maximizing the value of existing datasets, enabling researchers to unravel the complexity of tumor biology, identify novel biomarkers, and better re-direct treatment strategies, ultimately to help patients with cancer.

  3. Use of data libraries in dosimetry control systems

    International Nuclear Information System (INIS)

    Babenko, V.V.; Babenko, M.I.; Kazimirov, A.S.

    2002-01-01

    Analysis, prediction and planning of dose loads, adequate dose management of personnel, and evaluation of the expediency and sufficiency of the existing radiation protection system can be realized with the help of the dosimetry control database system at the 'Ukrytie' shelter

  4. Software for NAA sample changer control

    Energy Technology Data Exchange (ETDEWEB)

    Dutra Neto, Aimore; Menezes, Maria Angela de B.C., E-mail: dutraa@cdtn.br, E-mail: menezes@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-RJ), Belo Horizonte, MG (Brazil)

    2015-07-01

    In CDTN/CNEN laboratories, neutron activation analysis (NAA) is an analytical technique routinely employed. The irradiation is performed in the TRIGA MARK I IPR-R1 reactor. After irradiation, the samples must be changed by an operator, creating a bottleneck in the process. To optimize the whole process, automation of sample changing is necessary. To achieve this goal, software was developed to control a sample changer under construction at the CDTN laboratories. Two programs, running in two different environments, manage the entire acquisition process and perform all activities necessary to move the motors that position the samples and to control the vacuum that grips the vials. The high-level routine communicates with the Genie 2000 software to control a Canberra Multiport II, while a low-level program controls the physical assembly. (author)

  5. Software for NAA sample changer control

    International Nuclear Information System (INIS)

    Dutra Neto, Aimore; Menezes, Maria Angela de B.C.

    2015-01-01

    In CDTN/CNEN laboratories, neutron activation analysis (NAA) is an analytical technique routinely employed. The irradiation is performed in the TRIGA MARK I IPR-R1 reactor. After irradiation, the samples must be changed by an operator, creating a bottleneck in the process. To optimize the whole process, automation of sample changing is necessary. To achieve this goal, software was developed to control a sample changer under construction at the CDTN laboratories. Two programs, running in two different environments, manage the entire acquisition process and perform all activities necessary to move the motors that position the samples and to control the vacuum that grips the vials. The high-level routine communicates with the Genie 2000 software to control a Canberra Multiport II, while a low-level program controls the physical assembly. (author)

  6. An instrumentation for control and measurement of activated mineral samples

    International Nuclear Information System (INIS)

    Skaarup, P.

    1976-01-01

    A description is given of an instrumentation for control of a pneumatic tube system used to transport mineral samples for activation in a reactor and from there to a detector arrangement. Any uranium content in the samples can be inferred from the measured radiation. The instrumentation includes a PDP-11 computer and a CAMAC crate

  7. Density meter algorithm and system for estimating sampling/mixing uncertainty

    International Nuclear Information System (INIS)

    Shine, E.P.

    1986-01-01

    The Laboratories Department at the Savannah River Plant (SRP) has installed a six-place density meter with an automatic sampling device. This paper describes the statistical software developed to analyze the density of uranyl nitrate solutions using this automated system. The purpose of this software is twofold: to estimate the sampling/mixing and measurement uncertainties in the process and to provide a measurement control program for the density meter. Non-uniformities in density are analyzed both analytically and graphically. The mean density and its limit of error are estimated. Quality control standards are analyzed concurrently with process samples and used to control the density meter measurement error. The analyses are corrected for concentration due to evaporation of samples waiting to be analyzed. The results of this program have been successful in identifying sampling/mixing problems and controlling the quality of analyses
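The mean density and its limit of error, as estimated by this kind of measurement-control software, can be sketched with textbook formulas. The default t-factor below is a placeholder for the tabulated Student-t value, not SRP's actual procedure:

```python
import math
import statistics

def limit_of_error(values, t_factor=2.0):
    """Mean of replicate density readings and the half-width of its limit
    of error, mean +/- t * s / sqrt(n), from the sample standard deviation."""
    n = len(values)
    mean = statistics.fmean(values)
    half_width = t_factor * statistics.stdev(values) / math.sqrt(n)
    return mean, half_width
```

Running quality-control standards through the same calculation alongside process samples, as the abstract describes, separates measurement error from real sampling/mixing non-uniformity.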

  8. Density meter algorithm and system for estimating sampling/mixing uncertainty

    International Nuclear Information System (INIS)

    Shine, E.P.

    1986-01-01

    The Laboratories Department at the Savannah River Plant (SRP) has installed a six-place density meter with an automatic sampling device. This paper describes the statistical software developed to analyze the density of uranyl nitrate solutions using this automated system. The purpose of this software is twofold: to estimate the sampling/mixing and measurement uncertainties in the process and to provide a measurement control program for the density meter. Non-uniformities in density are analyzed both analytically and graphically. The mean density and its limit of error are estimated. Quality control standards are analyzed concurrently with process samples and used to control the density meter measurement error. The analyses are corrected for concentration due to evaporation of samples waiting to be analyzed. The results of this program have been successful in identifying sampling/mixing problems and controlling the quality of analyses

  9. Signal system data mining

    Science.gov (United States)

    2000-09-01

    Intelligent transportation systems (ITS) include large numbers of traffic sensors that collect enormous quantities of data. The data provided by ITS is necessary for advanced forms of control, however basic forms of control, primarily time-of-day (TO...

  10. Cryogenic Liquid Sample Acquisition System for Remote Space Applications

    Science.gov (United States)

    Mahaffy, Paul; Trainer, Melissa; Wegel, Don; Hawk, Douglas; Melek, Tony; Johnson, Christopher; Amato, Michael; Galloway, John

    2013-01-01

    There is a need to autonomously acquire cryogenic hydrocarbon liquid samples from remote planetary locations, such as the lakes of Titan, for instruments such as mass spectrometers. Several problems had to be solved relative to collecting the right amount of cryogenic liquid sample into a warmer spacecraft, such as not allowing the sample to boil off or fractionate too early; controlling the intermediate and final pressures within carefully designed volumes; designing for various particulates and viscosities; designing to thermal, mass, and power-limited spacecraft interfaces; and reducing risk. Prior-art inlets for similar instruments in spaceflight were designed primarily for atmospheric gas sampling and are not useful for this front-end application. These cryogenic liquid sample acquisition system designs for remote space applications allow for remote, autonomous, controlled sample collection of a range of challenging cryogenic sample types. The design can control the size of the sample, prevent fractionation, control pressures at various stages, and allow for various liquid sample levels. It is capable of collecting repeated samples autonomously in the difficult low-temperature conditions often found in planetary missions. It is capable of collecting samples for use by instruments from difficult sample types such as cryogenic hydrocarbon (methane, ethane, and propane) mixtures with solid particulates such as found on Titan. The design with a warm actuated valve is compatible with various spacecraft thermal and structural interfaces. The design uses controlled volumes, heaters, inlet and vent tubes, a cryogenic valve seat, inlet screens, temperature and cryogenic liquid sensors, seals, and vents to accomplish its task.

  11. A system for on-line monitoring of light element concentration distributions in thin samples

    Energy Technology Data Exchange (ETDEWEB)

    Brands, P.J.M. E-mail: p.j.m.brands@tue.nl; Mutsaers, P.H.A.; Voigt, M.J.A. de

    1999-09-02

    At the Cyclotron Laboratory, a scanning proton microprobe is used to determine concentration distributions in biomedical samples. The data acquired in these measurements used to be analysed in a time-consuming off-line analysis. To avoid the loss of valuable measurement and analysis time, DYANA was developed. DYANA is an on-line method for the analysis of data from biomedical measurements. By using a database of background shapes, light elements such as Na and Mg can be fitted even more precisely than in conventional fitting procedures. The entire analysis takes only several seconds and is performed while the acquisition system is gathering a new subset of data. Data acquisition must be guaranteed and must not be interfered with by other parallel processes. Therefore, the analysis, the data acquisition and the experiment control are performed on a PCI-based Pentium personal computer (PC), running a real-time operating system. A second PC is added to run a graphical user interface for interaction with the experimenter and the monitoring of the analysed results. The system is here illustrated using atherosclerotic tissue but is applicable to all kinds of thin samples.

  12. Progress in control and data acquisition for the ITER neutral beam test facility

    Energy Technology Data Exchange (ETDEWEB)

    Luchetta, Adriano, E-mail: adriano.luchetta@igi.cnr.it [Consorzio RFX, Euratom-ENEA Association, Padova (Italy); Manduchi, Gabriele; Taliercio, Cesare; Soppelsa, Anton [Consorzio RFX, Euratom-ENEA Association, Padova (Italy); Paolucci, Francesco; Sartori, Filippo [Fusion for Energy, Barcelona (Spain); Barbato, Paolo; Capobianco, Roberto; Breda, Mauro; Molon, Federico; Moressa, Modesto; Polato, Sandro; Simionato, Paola; Zampiva, Enrico [Consorzio RFX, Euratom-ENEA Association, Padova (Italy)

    2013-10-15

    Highlights: ► An ion source experiment, referred to as SPIDER, is under construction in the ITER neutral beam test facility. ► The progress in designing and testing the SPIDER control and data acquisition system is reported. ► An original approach is proposed in using ITER CODAC and non-ITER CODAC technology. -- Abstract: SPIDER, the ion source test bed in the ITER neutral beam test facility, is under construction and its operation is expected to start in 2014. Control and data acquisition for SPIDER are undergoing final design. SPIDER CODAS, as the control and data acquisition system is referred to, is requested to manage 25 plant units, to acquire 1000 analogue signals with sampling rates ranging from a few S/s to 10 MS/s, to acquire images with up to 100 frames per second, to operate with long pulses lasting up to 1 h, and to sustain 200 MB/s data throughput into the data archive with an annual data storage amount of up to 50 TB. SPIDER CODAS software architecture integrates three open-source software frameworks, each addressing specific system requirements. Slow control exploits the synergy between EPICS and Siemens S7 programmable controllers. Data handling is by MDSplus, a data-centric framework that is geared towards the collection and organization of scientific data. Diagnostics based on imaging drive the design of data throughput and archive size. Fast control is implemented by using MARTe, a data-driven, object-oriented, real-time environment. The paper will describe in detail the progress of the system hardware and software architecture and will show how the software frameworks interact to provide the functions requested by SPIDER CODAS. The paper will focus on how the performance requirements can be met with the described SPIDER CODAS architecture, describing the progress achieved by carrying out prototyping activities.

  13. Progress in control and data acquisition for the ITER neutral beam test facility

    International Nuclear Information System (INIS)

    Luchetta, Adriano; Manduchi, Gabriele; Taliercio, Cesare; Soppelsa, Anton; Paolucci, Francesco; Sartori, Filippo; Barbato, Paolo; Capobianco, Roberto; Breda, Mauro; Molon, Federico; Moressa, Modesto; Polato, Sandro; Simionato, Paola; Zampiva, Enrico

    2013-01-01

    Highlights: ► An ion source experiment, referred to as SPIDER, is under construction in the ITER neutral beam test facility. ► The progress in designing and testing the SPIDER control and data acquisition system is reported. ► An original approach is proposed in using ITER CODAC and non-ITER CODAC technology. -- Abstract: SPIDER, the ion source test bed in the ITER neutral beam test facility, is under construction and its operation is expected to start in 2014. Control and data acquisition for SPIDER are undergoing final design. SPIDER CODAS, as the control and data acquisition system is referred to, is requested to manage 25 plant units, to acquire 1000 analogue signals with sampling rates ranging from a few S/s to 10 MS/s, to acquire images with up to 100 frames per second, to operate with long pulses lasting up to 1 h, and to sustain 200 MB/s data throughput into the data archive with an annual data storage amount of up to 50 TB. SPIDER CODAS software architecture integrates three open-source software frameworks, each addressing specific system requirements. Slow control exploits the synergy between EPICS and Siemens S7 programmable controllers. Data handling is by MDSplus, a data-centric framework that is geared towards the collection and organization of scientific data. Diagnostics based on imaging drive the design of data throughput and archive size. Fast control is implemented by using MARTe, a data-driven, object-oriented, real-time environment. The paper will describe in detail the progress of the system hardware and software architecture and will show how the software frameworks interact to provide the functions requested by SPIDER CODAS. The paper will focus on how the performance requirements can be met with the described SPIDER CODAS architecture, describing the progress achieved by carrying out prototyping activities

  14. Instrumentation, control and data acquisition system with multiple configurations for test in nuclear environment

    Energy Technology Data Exchange (ETDEWEB)

    Monti, Chiara, E-mail: chiara.monti@enea.it; Neri, Carlo; Pollastrone, Fabio

    2015-10-15

    Highlights: • ENEA developed and characterized a first prototype of the In-Vessel Viewing System (IVVS) probe for ITER. • The piezo motor technology to be used in the IVVS probe was tested under neutron and gamma radiation, high temperature, vacuum and high magnetic fields. • A general architecture of the Data Acquisition and Control System (DACS) was defined and then specialized for each test. • The test campaign has validated instrumentation solutions, which can be effectively used in the final IVVS implementation or in other ITER diagnostics or applications. - Abstract: The In-Vessel Viewing System is a 3D laser scanning system which will be used to inspect the blanket first wall in ITER. To make the IVVS probe design compatible with the harsh environmental conditions present in ITER, a test campaign was performed in 2012–2013 to verify the adequacy of the main components of the IVVS probe. The IVVS components inspected were an optical encoder, passive components and two customized ultrasonic piezoceramic motors that were instrumented with various sensors. A general architecture of the Data Acquisition and Control System (DACS) was defined and then specialized for each test. To be suitable for this test campaign, the DACS had to host various I/O modules and to properly interface the driver of the customized piezo motors, in order to permit full control of the tests and the acquisition of experimental data. This paper presents the instrumentation solutions designed and implemented for the constraints of the different facilities and the related DACS, developed in four specialized versions for the described test campaign.

  15. SCADA based radioactive sample bottle delivery system for fuel reprocessing project

    International Nuclear Information System (INIS)

    Kaushik, Subrat; Munj, Niket; Chauhan, R.K.; Kumar, Pramod; Mishra, A.C.

    2011-01-01

    Radioactive samples of process streams need to be analyzed in a centralized control lab to measure the concentration of heavy elements as well as the activity at various stages of re-processing plants. The samples are taken remotely from biologically shielded process cells through sampling blisters into sample bottles. These are then transferred to the control lab, located about 50 meters away, using a vacuum transfer system. The bottle movement is tracked from origin to destination in a rich-HMI SCADA system using infrared non-contact proximity sensors located along the sampling line; these sensors are connected to a PLC in a fail-safe mode. The sample bottle travels at a speed of 10 m/s under vacuum motive force, and the detection time is of the order of 1 ms. Flow meters have been used to measure the air flow in the sampling line

  16. Telemetry System Data Latency

    Science.gov (United States)

    2017-07-13

    [Figure and caption residue only; the abstract text is not recoverable from this record. Surviving fragments: a Mission Control System (MCS) / Interactive Analysis and Display System (IADS) overview (Figure 1-2), an IADS data flow diagram (Figure 13-1), and Section 13.2 test results for telemetry system data latency.]

  17. Network Model-Assisted Inference from Respondent-Driven Sampling Data.

    Science.gov (United States)

    Gile, Krista J; Handcock, Mark S

    2015-06-01

    Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population.
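
    The degree-based weighting idea underlying design-based RDS estimation can be sketched in a few lines. This is not the authors' model-assisted estimator: it is the simpler degree-weighted (Volz–Heckathorn-style) mean, with a naive i.i.d. resampling bootstrap standing in for the paper's tree-based bootstrap, shown only to illustrate the flavor of the approach:

```python
import random

def vh_estimate(values, degrees):
    """Degree-weighted mean: sampling probability is taken as roughly
    proportional to network degree, so each unit gets weight ~1/degree."""
    weights = [1.0 / d for d in degrees]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

def bootstrap_se(values, degrees, reps=500, seed=7):
    """Naive resampling bootstrap of the estimator (the paper's bootstrap
    resamples along the referral structure; this simple version does not)."""
    rng = random.Random(seed)
    n = len(values)
    estimates = []
    for _ in range(reps):
        idx = [rng.randrange(n) for _ in range(n)]
        estimates.append(vh_estimate([values[i] for i in idx],
                                     [degrees[i] for i in idx]))
    m = sum(estimates) / reps
    return (sum((e - m) ** 2 for e in estimates) / (reps - 1)) ** 0.5

# toy data: infection indicators with reported network degrees
prevalence = vh_estimate([1, 0, 0, 1], [2, 4, 4, 2])  # -> 2/3
```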

  18. A real-time data acquisition and elaboration system for instabilities control in the FTU tokamak

    Energy Technology Data Exchange (ETDEWEB)

    Alessi, E., E-mail: alessi@ifp.cnr.it [Associazione EURATOM-ENEA, IFP-CNR, Milano (Italy); Boncagni, L. [Associazione EURATOM-ENEA, C.R. Frascati (Italy); Galperti, C.; Marchetto, C.; Nowak, S.; Sozzi, C. [Associazione EURATOM-ENEA, IFP-CNR, Milano (Italy); Apruzzese, G. [Associazione EURATOM-ENEA, C.R. Frascati (Italy); Bin, W. [Associazione EURATOM-ENEA, IFP-CNR, Milano (Italy); Belli, F.; Botrugno, A. [Associazione EURATOM-ENEA, C.R. Frascati (Italy); Bruschi, A.; Cirant, S. [Associazione EURATOM-ENEA, IFP-CNR, Milano (Italy); D' Antona, G.; Davoudi, M. [Politecnico di Milano, Dipartimento di Elettrotecnica (Italy); Figini, L. [Associazione EURATOM-ENEA, IFP-CNR, Milano (Italy); Ferrero, R. [Politecnico di Milano, Dipartimento di Elettrotecnica (Italy); Gabellieri, L. [Associazione EURATOM-ENEA, C.R. Frascati (Italy); Garavaglia, S.; Granucci, G. [Associazione EURATOM-ENEA, IFP-CNR, Milano (Italy); Grosso, A. [Associazione EURATOM-ENEA, C.R. Frascati (Italy); and others

    2013-08-21

    A real-time data acquisition and elaboration system is being implemented to control the new ECH launcher recently installed at FTU (Frascati Tokamak Upgrade). The system is aimed at controlling different kinds of magnetohydrodynamic instabilities, in particular the deleterious 3/2 and 2/1 (neoclassical) tearing modes, (N)TMs, and the sawtooth period, in order to prevent the seeding of NTMs. The complete system is presented here together with preliminary offline and real-time tests.

  19. Flexible CP-ABE Based Access Control on Encrypted Data for Mobile Users in Hybrid Cloud System

    Institute of Scientific and Technical Information of China (English)

    Wen-Min Li; Xue-Lei Li; Qiao-Yan Wen; Shuo Zhang; Hua Zhang

    2017-01-01

    In hybrid cloud computing, encrypted data access control can provide a fine-grained access method for organizations to enact policies closer to organizational policies. This paper presents an improved CP-ABE (ciphertext-policy attribute-based encryption) scheme to construct an encrypted data access control solution that is suitable for mobile users in a hybrid cloud system. In our improvement, we split the original decryption key into a control key, a secret key and a set of transformation keys. The private cloud managed by the organization administrator takes charge of updating the transformation keys using the control key. This helps to handle flexible access management and attribute alteration. Meanwhile, the mobile user's single secret key remains unchanged, as does the ciphertext, even if the data user's attribute has been revoked. In addition, we modify the access control list by adding the attributes with the corresponding control key and transformation keys so as to manage user privileges depending upon the system version. Finally, the analysis shows that our scheme is secure, flexible and efficient to be applied in mobile hybrid cloud computing.

  20. Towards steady-state operational design for the data and PF control systems of the HT-7U

    International Nuclear Information System (INIS)

    Luo, J.R.; Zhu, L.; Wang, H.Z.; Ji, Z.S.; Wang, F.

    2003-01-01

    Fusion energy is an ultimate and inexhaustible source of energy for mankind and is expected to be obtained in controlled operation within this century. Among the various possible candidates for fusion, the tokamak is presently the most qualified one, and since it uses superconducting magnetic coils, it will be adequate for steady-state operation. The HT-7U superconducting tokamak is part of a national project in China on fusion research, scheduled to become available on-line by the end of 2004 (Wan Y.X. and HT-7 and HT-7U Groups 2000 Overview of steady state operation of HT-7 and present status of the HT-7U project Nucl. Fusion 40 1057). The control system of the HT-7U is designed as a distributed control system (HT7UDCS), including many subsystems that provide the various functions of supervision, remote control, real-time monitoring, data acquisition and data handling. The major features of the HT-7U tokamak which make long-pulse (∼1000 s) operation possible are the flexible poloidal field (PF) system, an auxiliary heating system, the current-drive system and a divertor system. In order to realize these features simultaneously, real-time data handling and analysis, along with a significant control capability, is required. This paper discusses the design of the HT7UDCS. (author)

  1. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    [Scrambled extraction; recoverable fragments reordered:] Systems must process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and ... categories. These include edge sampling methods, where edges are selected by predetermined criteria, and snowball sampling methods, where algorithms start ...

  2. A hybrid solution using computational prediction and measured data to accurately determine process corrections with reduced overlay sampling

    Science.gov (United States)

    Noyes, Ben F.; Mokaberi, Babak; Mandoy, Ram; Pate, Alex; Huijgen, Ralph; McBurney, Mike; Chen, Owen

    2017-03-01

    Reducing overlay error via an accurate APC feedback system is one of the main challenges in high volume production of the current and future nodes in the semiconductor industry. The overlay feedback system directly affects the number of dies meeting overlay specification and the number of layers requiring dedicated exposure tools through the fabrication flow. Increasing the former number and reducing the latter number is beneficial for the overall efficiency and yield of the fabrication process. An overlay feedback system requires accurate determination of the overlay error, or fingerprint, on exposed wafers in order to determine corrections to be automatically and dynamically applied to the exposure of future wafers. Since current and future nodes require correction per exposure (CPE), the resolution of the overlay fingerprint must be high enough to accommodate CPE in the overlay feedback system, or overlay control module (OCM). Determining a high resolution fingerprint from measured data requires extremely dense overlay sampling that takes a significant amount of measurement time. For static corrections this is acceptable, but in an automated dynamic correction system this method creates extreme bottlenecks for the throughput of said system, as new lots have to wait until the previous lot is measured. One solution is to use a less dense overlay sampling scheme and computationally up-sample the measured data to a dense fingerprint. That method uses a global fingerprint model over the entire wafer; measured localized overlay errors are therefore not always represented in its up-sampled output. This paper will discuss a hybrid system, shown in Fig. 1, that combines a computationally up-sampled fingerprint with the measured data to more accurately capture the actual fingerprint, including local overlay errors. Such a hybrid system is shown to result in reduced modelled residuals while determining the fingerprint, and better on-product overlay performance.
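
    The hybrid idea — a smooth global model densified computationally, then corrected back toward the raw measurements so local errors survive — can be sketched in one dimension. The polynomial order, blend factor and nearest-gridpoint residual injection below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

# Hypothetical 1-D illustration of a hybrid fingerprint: fit a smooth global
# model to sparse overlay measurements, up-sample it onto a dense grid, then
# re-insert the measured residuals so local overlay errors are not lost.
def hybrid_fingerprint(x_meas, y_meas, x_dense, order=3, blend=1.0):
    coeff = np.polyfit(x_meas, y_meas, order)        # global wafer model
    dense = np.polyval(coeff, x_dense)               # computational up-sample
    resid = y_meas - np.polyval(coeff, x_meas)       # local errors model missed
    for xm, r in zip(x_meas, resid):                 # inject at nearest point
        i = int(np.argmin(np.abs(x_dense - xm)))
        dense[i] += blend * r
    return dense

x_meas = np.linspace(-1.0, 1.0, 9)
y_meas = x_meas ** 3
y_meas[4] += 0.05   # a local overlay error a global cubic cannot capture
dense = hybrid_fingerprint(x_meas, y_meas, np.linspace(-1.0, 1.0, 9))
```

    On a grid aligned with the measurement sites, the hybrid output reproduces every measurement exactly, spike included, which a purely global model would smooth away.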

  3. Distributed control system for the FMIT

    International Nuclear Information System (INIS)

    Johnson, J.A.; Machen, D.R.; Suyama, R.M.

    1979-01-01

    The control system for the Fusion Materials Irradiation Test (FMIT) Facility will provide the primary data acquisition, control, and interface components that integrate all of the individual FMIT systems into a functional facility. The control system consists of a distributed computer network, control consoles and instrumentation subsystems. The FMIT Facility will be started, operated and secured from a Central Control Room. All FMIT systems and experimental functions will be monitored from the Central Control Room. The data acquisition and control signals will be handled by a data communications network, which connects dual computers in the Central Control Room to the microcomputers in CAMAC crates near the various subsystems of the facility

  4. Upgrading the data acquisition and control systems of the European Breeding Blanket Test Facility

    International Nuclear Information System (INIS)

    Mannori, Simone; Sermenghi, Valerio; Utili, Marco; Malavasi, Andrea; Gianotti, Daniel

    2013-01-01

    Highlights: • Data Acquisition and Control Systems (DACS) upgrading of an experimental plant for full-size thermo-hydraulic testing of nuclear subsystems. • DACS development using an integrated hardware/software platform with graphical programming (LabVIEW). • Development of simplified models for real-time simulation. • Rapid prototyping with real-time simulation of the complete plant. • Reuse of the code developed for the real-time simulator in the real plant DACS. -- Abstract: The EBBTF (European Breeding Blanket Test Facility) experimental plant is a key component for the development of the breeding blankets (TBMs test blanket modules, HCLL helium cooled lithium lead and HCPB helium cooled pebble bed types) used by ITER. EBBTF is an experimental plant which provides the double breeding/cooling loops (liquid metal and gas) required for HCLL testing. EBBTF is composed of four subsystems (TBM, IELLLO integrated European lead lithium loop, HE-FUS3 helium fusion loop version 3, and a helium compressor built by ATEKO) with dedicated control systems realized with hardware/software combinations covering a 15-year (1995–2010) time span. At the end of 2010 we began to upgrade the HE-FUS3 data acquisition and control system (DACS), replacing the obsolete Siemens S5 PLC with National Instruments Compact FieldPoint and LabVIEW. The control room has been completely reorganized using high-resolution monitors and workstations linked with standard Ethernet interfaces. The data acquisition, control, safety and SCADA software has been completely developed at ENEA using LabVIEW. In this paper we discuss the technical difficulties and the solutions that we have used to accomplish the upgrade

  5. An integrated and accessible sample data library for Mars sample return science

    Science.gov (United States)

    Tuite, M. L., Jr.; Williford, K. H.

    2015-12-01

    Over the course of the next decade or more, many thousands of geological samples will be collected and analyzed in a variety of ways by researchers at the Jet Propulsion Laboratory (California Institute of Technology) in order to facilitate discovery and contextualize observations made of Mars rocks both in situ and here on Earth if samples are eventually returned. Integration of data from multiple analyses of samples including petrography, thin section and SEM imaging, isotope and organic geochemistry, XRF, XRD, and Raman spectrometry is a challenge and a potential obstacle to discoveries that require supporting lines of evidence. We report the development of a web-accessible repository, the Sample Data Library (SDL) for the sample-based data that are generated by the laboratories and instruments that comprise JPL's Center for Analysis of Returned Samples (CARS) in order to facilitate collaborative interpretation of potential biosignatures in Mars-analog geological samples. The SDL is constructed using low-cost, open-standards-based Amazon Web Services (AWS), including web-accessible storage, relational data base services, and a virtual web server. The data structure is sample-centered with a shared registry for assigning unique identifiers to all samples including International Geo-Sample Numbers. Both raw and derived data produced by instruments and post-processing workflows are automatically uploaded to online storage and linked via the unique identifiers. Through the web interface, users are able to find all the analyses associated with a single sample or search across features shared by multiple samples, sample localities, and analysis types. Planned features include more sophisticated search and analytical interfaces as well as data discoverability through NSF's EarthCube program.
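
    A sample-centred registry of the kind described — one table of unique sample identifiers, with every analysis linked back through that identifier — can be sketched with a small relational schema. The table layout, column names and the identifier below are hypothetical, not the SDL's actual schema:

```python
import sqlite3

# Minimal sketch of a sample-centred data model: a registry of unique sample
# IDs, with each analysis row linked back to its sample.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sample (igsn TEXT PRIMARY KEY, locality TEXT);
CREATE TABLE analysis (
    id INTEGER PRIMARY KEY,
    igsn TEXT REFERENCES sample(igsn),
    method TEXT, result TEXT);
""")
con.execute("INSERT INTO sample VALUES ('IEJPL0001', 'Mars-analog site A')")
con.executemany("INSERT INTO analysis (igsn, method, result) VALUES (?,?,?)",
                [("IEJPL0001", "XRD", "quartz, hematite"),
                 ("IEJPL0001", "Raman", "carbonaceous band")])

# "find all the analyses associated with a single sample"
rows = con.execute("""SELECT method FROM analysis
                      WHERE igsn = 'IEJPL0001' ORDER BY method""").fetchall()
```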

  6. Laser velocimeter data acquisition, processing, and control system

    International Nuclear Information System (INIS)

    Croll, R.H. Jr.; Peterson, C.W.

    1975-01-01

    The use of a mini-computer for data acquisition, processing, and control of a two-velocity-component dual beam laser velocimeter in a low-speed wind tunnel is described in detail. Digital stepping motors were programmed to map the mean-flow and turbulent fluctuating velocities in the test section boundary layer and free stream. The mini-computer interface controlled the operation of the LV processor and the high-speed selection of the photomultiplier tube whose output was to be processed. A statistical analysis of the large amount of data from the LV processor was performed by the computer while the experiment was in progress. The resulting velocities are in good agreement with hot-wire survey data obtained in the same facility

  7. PCI-VME bridge device driver design of a high-performance data acquisition and control system on LINUX

    International Nuclear Information System (INIS)

    Sun Yan; Ye Mei; Zhang Nan; Zhao Jingwei

    2000-01-01

    Data Acquisition and Control is an important part of Nuclear Electronic and Nuclear Detection application in HEP. The key methods are introduced for designing LINUX Device Driver of PCI-VME Bridge Device based on the realized Data Acquisition and Control System

  8. System Design Description for the SY-101 Hydrogen Mitigation Test Project Data Acquisition and Control System (DACS-1)

    Energy Technology Data Exchange (ETDEWEB)

    ERMI, A.M.

    2000-01-24

    This document describes the hardware and software of the computer subsystems for the Data Acquisition and Control System (DACS) used in mitigation tests conducted on waste tank 241-SY-101 at the Hanford Nuclear Reservation.

  9. The cryogenic control system of BEPCⅡ

    Institute of Scientific and Technical Information of China (English)

    LI Gang; WANG Ke-Xiang; ZHAO Ji-Jiu; YUE Ke-Juan; DAI Ming-Sui; HUANG Yi-Ling; JIANG Bo

    2008-01-01

    A superconducting cryogenic system has been designed and deployed in the Beijing Electron-Positron Collider Upgrade Project (BEPCⅡ). The system consists of a Siemens PLC (ST-PLC, Programmable Logic Controller) for the compressor control, an Allen Bradley (AB) PLC for the cryogenic equipment, and the Experimental Physics and Industrial Control System (EPICS) that integrates the PLCs. The system fully automates the superconducting cryogenic control with process control, PID (Proportional-Integral-Differential) control loops, real-time data access and data storage, alarm handling and a human-machine interface. It is capable of automatic recovery as well. This paper describes the BEPCⅡ cryogenic control system, data communication between the ST-PLC and EPICS Input/Output Controllers (IOCs), and the integration of the flow control, the low-level interlock, the AB-PLC, and EPICS.
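
    A PID loop of the sort such PLC/EPICS process-control systems run can be sketched in a few lines. The gains, time step and toy first-order plant below are illustrative, not BEPCⅡ parameters:

```python
class PID:
    """Minimal discrete PID controller (illustrative, not the BEPC code)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt            # accumulated error
        deriv = (err - self.prev_err) / self.dt   # error rate of change
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# drive a toy first-order plant (dx/dt = -x + u) to a setpoint of 1.0
pid = PID(kp=2.0, ki=1.0, kd=0.0, dt=0.05)
x = 0.0
for _ in range(400):
    u = pid.update(setpoint=1.0, measured=x)
    x += 0.05 * (-x + u)
```

    The integral term removes the steady-state offset a pure proportional loop would leave on this plant.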

  10. Developing Control and Monitoring Software for the Data Acquisition System of the COMPASS Experiment at CERN

    Directory of Open Access Journals (Sweden)

    Martin Bodlák

    2013-01-01

    This paper focuses on the analysis, design and development of software for the new data acquisition system of the COMPASS experiment at CERN. In this system, the data flow is controlled by custom hardware; the software will therefore be used only for run control and for monitoring. The requirements on the software have been analyzed, and the functionality of the system has been defined. The system consists of several distributed nodes; communication between the nodes is based on a custom protocol and a DIM library. A minimal version of the system has already been implemented. Preliminary results of performance and stability tests have shown that the system fulfills the defined requirements, and is stable. In the next phase of development, the system will be tested on the real hardware. It is expected that the system will be ready for deployment in 2014.

  11. Slurry feed variability in West Valley's melter feed tank and sampling system

    International Nuclear Information System (INIS)

    Fow, C.L.; Kurath, D.E.; Pulsipher, B.A.; Bauer, B.P.

    1989-04-01

    The present plan for disposal of high-level wastes at West Valley is to vitrify the wastes for disposal in a deep geologic repository. The vitrification process involves mixing the high-level wastes with glass-forming chemicals and feeding the resulting slurry to a liquid-fed ceramic melter. Maintaining the quality of the glass product and proficient melter operation depends on the ability of the melter feed system to produce and maintain a homogeneous mixture of waste and glass-former materials. To investigate the mixing properties of the melter feed preparation system at West Valley, a statistically designed experiment was conducted using synthetic melter feed slurry over a range of concentrations. On the basis of the statistical data analysis, it was found that (1) a homogeneous slurry is produced in the melter feed tank, (2) the liquid-sampling system provides slurry samples that are statistically different from the slurry in the tank, and (3) analytical measurements are the major source of variability. A statistical quality control program for the analytical laboratory and a characterization test of the actual sampling system are recommended. 1 ref., 5 figs., 1 tab
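
    Attributing variability to sampling versus analytical measurement, as in finding (3), can be illustrated with a standard one-way variance-components split. This is a generic textbook decomposition, not the study's actual analysis:

```python
from statistics import mean, variance

def variance_components(samples):
    """samples: one list of replicate analytical measurements per slurry
    sample (balanced design assumed). Returns (sampling, analytical)
    variance-component estimates from the one-way ANOVA decomposition."""
    n = len(samples[0])                               # replicates per sample
    analytical = mean(variance(s) for s in samples)   # within-sample variance
    sample_means = [mean(s) for s in samples]
    between = variance(sample_means)                  # variance of the means
    sampling = max(between - analytical / n, 0.0)     # excess over analytical
    return sampling, analytical

s, a = variance_components([[1.0, 3.0], [5.0, 7.0]])  # -> (7.0, 2.0)
```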

  12. Toward a Tiered Model to Share Clinical Trial Data and Samples in Precision Oncology

    Directory of Open Access Journals (Sweden)

    Stefanie Broes

    2018-01-01

    The recent revolution in science and technology applied to medical research has left in its wake a trail of biomedical data and human samples; however, its opportunities remain largely unfulfilled due to a number of legal, ethical, financial, strategic, and technical barriers. Precision oncology has been at the vanguard of leveraging this potential of “Big data” and samples into meaningful solutions for patients, considering the need for new drug development approaches in this area (due to high costs, late-stage failures, and the molecular diversity of cancer). To harness the potential of the vast quantities of data and samples currently fragmented across databases and biobanks, it is critical to engage all stakeholders and share data and samples across research institutes. Here, we identified two general types of sharing strategies: first, open access models, characterized by the absence of any review panel or decision maker, and second, controlled access models, where some form of control is exercised by either the donor (i.e., the patient), the data provider (i.e., the initial organization), or an independent party. Further, we theoretically describe and provide examples of nine different strategies focused on greater sharing of patient data and material. These models provide varying levels of control, access to various data and/or samples, and different types of relationship between the donor, data provider, and data requester. We propose a tiered model to share clinical data and samples that takes into account privacy issues and respects sponsors’ legitimate interests. Its implementation would contribute to maximizing the value of existing datasets, enabling researchers to unravel the complexity of tumor biology, identify novel biomarkers, and re-direct treatment strategies better, ultimately to help patients with cancer.

  13. Network Model-Assisted Inference from Respondent-Driven Sampling Data

    Science.gov (United States)

    Gile, Krista J.; Handcock, Mark S.

    2015-01-01

    Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328

  14. LabVIEW: a software system for data acquisition, data analysis, and instrument control.

    Science.gov (United States)

    Kalkman, C J

    1995-01-01

    Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main feature that distinguishes LabVIEW from other data acquisition programs is its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.

  15. The ALICE data acquisition system

    CERN Document Server

    Carena, F; Chapeland, S; Chibante Barroso, V; Costa, F; Dénes, E; Divià, R; Fuchs, U; Grigore, A; Kiss, T; Simonetti, G; Soós, C; Telesca, A; Vande Vyvre, P; Von Haller, B

    2014-01-01

    In this paper we describe the design, the construction, the commissioning and the operation of the Data Acquisition (DAQ) and Experiment Control Systems (ECS) of the ALICE experiment at the CERN Large Hadron Collider (LHC). The DAQ and the ECS are the systems used respectively for the acquisition of all physics data and for the overall control of the experiment. They are two computing systems made of hundreds of PCs and data storage units interconnected via two networks. The collection of experimental data from the detectors is performed by several hundreds of high-speed optical links. We describe in detail the design considerations for these systems handling the extreme data throughput resulting from central lead ions collisions at LHC energy. The implementation of the resulting requirements into hardware (custom optical links and commercial computing equipment), infrastructure (racks, cooling, power distribution, control room), and software led to many innovative solutions which are described together with ...

  16. Identification of continuous-time systems from samples of input ...

    Indian Academy of Sciences (India)

    Abstract. This paper presents an introductory survey of the methods that have been developed for identification of continuous-time systems from samples of input–output data. The two basic approaches may be described as (i) the indirect method, where first a discrete-time model is estimated from the sampled data and then ...

  17. PCI-VME bridge device driver design of a high-performance data acquisition and control system on LINUX

    International Nuclear Information System (INIS)

    Sun Yan; Ye Mei; Zhang Nan; Zhao Jingwei

    2001-01-01

    Data acquisition and control is an important part of nuclear electronics and nuclear detection applications in HEP. The key methods are introduced for designing a Linux device driver for a PCI-VME bridge device, based on the data acquisition and control system realized by the authors

  18. Statistical process control charts for attribute data involving very large sample sizes: a review of problems and solutions.

    Science.gov (United States)

    Mohammed, Mohammed A; Panesar, Jagdeep S; Laney, David B; Wilson, Richard

    2013-04-01

    The use of statistical process control (SPC) charts in healthcare is increasing. The primary purpose of SPC is to distinguish between common-cause variation which is attributable to the underlying process, and special-cause variation which is extrinsic to the underlying process. This is important because improvement under common-cause variation requires action on the process, whereas special-cause variation merits an investigation to first find the cause. Nonetheless, when dealing with attribute or count data (eg, number of emergency admissions) involving very large sample sizes, traditional SPC charts often produce tight control limits with most of the data points appearing outside the control limits. This can give a false impression of common and special-cause variation, and potentially misguide the user into taking the wrong actions. Given the growing availability of large datasets from routinely collected databases in healthcare, there is a need to present a review of this problem (which arises because traditional attribute charts only consider within-subgroup variation) and its solutions (which consider within and between-subgroup variation), which involve the use of the well-established measurements chart and the more recently developed attribute charts based on Laney's innovative approach. We close by making some suggestions for practice.
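
    Laney's adjustment can be sketched directly: compute z-scores of the subgroup proportions, estimate the between-subgroup inflation factor sigma_z from their moving ranges, and widen the binomial limits by that factor. The code below is a sketch of that published method with made-up data, not the review's own implementation:

```python
import math

def laney_p_limits(counts, sizes):
    """Laney p'-chart control limits. With very large subgroups the plain
    binomial limits collapse to a tight band; sigma_z, estimated from moving
    ranges of the z-scores, restores the between-subgroup variation."""
    pbar = sum(counts) / sum(sizes)
    sd = [math.sqrt(pbar * (1 - pbar) / n) for n in sizes]
    z = [(c / n - pbar) / s for c, n, s in zip(counts, sizes, sd)]
    moving_ranges = [abs(a - b) for a, b in zip(z[1:], z[:-1])]
    sigma_z = (sum(moving_ranges) / len(moving_ranges)) / 1.128  # d2 for n=2
    return [(pbar - 3 * s * sigma_z, pbar + 3 * s * sigma_z) for s in sd]

# monthly emergency admissions out of ~10,000 attendances (made-up counts
# with deliberate between-month overdispersion)
limits = laney_p_limits([100, 200, 100, 200, 100, 200], [10000] * 6)
```

    On these overdispersed data sigma_z is well above 1, so the p'-limits are far wider than the plain binomial limits that would flag nearly every month.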

  19. From Design to Production Control Through the Integration of Engineering Data Management and Workflow Management Systems

    CERN Document Server

    Le Goff, J M; Bityukov, S; Estrella, F; Kovács, Z; Le Flour, T; Lieunard, S; McClatchey, R; Murray, S; Organtini, G; Vialle, J P; Bazan, A; Chevenier, G

    1997-01-01

    At a time when many companies are under pressure to reduce "times-to-market", the management of product information from the early stages of design through assembly to manufacture and production has become increasingly important. Similarly, in the construction of high energy physics devices the collection of (often evolving) engineering data is central to the subsequent physics analysis. Traditionally in industry design engineers have employed Engineering Data Management Systems (also called Product Data Management Systems) to coordinate and control access to documented versions of product designs. However, these systems provide control only at the collaborative design level and are seldom used beyond design. Workflow management systems, on the other hand, are employed in industry to coordinate and support the more complex and repeatable work processes of the production environment. Commercial workflow products cannot support the highly dynamic activities found both in the design stages of product developmen...

  20. The Run Control System and the Central Hint and Information Processor of the Data Acquisition System of the ATLAS Experiment at the LHC

    CERN Document Server

    Anders, G; The ATLAS collaboration; Lehmann Miotto, G; Magnoni, L

    2014-01-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS detector is composed of a large number of distributed hardware and software components (about 3000 machines and more than 15000 concurrent processes at the end of LHC’s Run I) which in a coordinated manner provide the data-taking functionality of the overall system. The Run Control (RC) system steers the data acquisition by starting and stopping processes and by carrying all data-taking elements through well-defined states in a coherent way (finite state machine pattern). The RC is organized as a hierarchical tree (run control tree) of run controllers following the functional decomposition into systems and sub-systems of the ATLAS detector. During the LHC Long Shutdown 1 (LS1) the RC has been completely re-designed and re-implemented in order to better fulfill the new requirements which emerged during LHC Run 1 and were not foreseen during the initial design phase, and in order to improve the error management and recovery mechanisms. Indeed gi...
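
    The hierarchical finite-state-machine pattern named above can be sketched as a tree of controllers that propagate each command to their children before transitioning themselves. The state names and transition table are illustrative, not the ATLAS RC state model:

```python
# Toy hierarchical run controller in the spirit of a run control tree:
# every controller is a small FSM; commands flow down the tree.
class RunController:
    TRANSITIONS = {
        ("NONE", "initialize"): "INITIAL",
        ("INITIAL", "configure"): "CONFIGURED",
        ("CONFIGURED", "start"): "RUNNING",
        ("RUNNING", "stop"): "CONFIGURED",
        ("CONFIGURED", "unconfigure"): "INITIAL",
    }

    def __init__(self, name, children=()):
        self.name, self.state, self.children = name, "NONE", list(children)

    def command(self, cmd):
        for child in self.children:        # propagate down the tree first
            child.command(cmd)
        nxt = self.TRANSITIONS.get((self.state, cmd))
        if nxt is None:                    # reject out-of-order commands
            raise ValueError(f"{self.name}: '{cmd}' invalid in {self.state}")
        self.state = nxt

leaves = [RunController("pixel"), RunController("calorimeter")]
root = RunController("detector", leaves)
for cmd in ("initialize", "configure", "start"):
    root.command(cmd)
```

    Because every node shares the same transition table, the whole tree moves through the states coherently, and an out-of-order command anywhere in the tree raises instead of leaving nodes in inconsistent states.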

  1. Multicopter control with Navio using REX control system

    Science.gov (United States)

    Golembiovsky, Matej; Dedek, Jan; Ozana, Stepan

    2017-06-01

    This article deals with the study of a possible connection of the REXcontrols platform with a Raspberry Pi based control system and the Navio2 expansion board. This board is designed for the development of autonomous robotic platforms such as cars, planes or multicopters. In this article, the control system REXcontrols is introduced and its integration possibilities for the control board Navio2 are discussed. The main aspects discussed are the communication possibilities of the REXcontrols system with external scripts, which in turn allow control of this board. The main reasons for this undertaking are the vast possibilities of archiving, visualization, signal processing and control which the REXcontrols system allows. Control of the Navio2 board itself is done through numerous interfaces: specifically, a pair of SPI data buses, an I2C data bus, UART and multiple GPIO pins. However, since the REXcontrols control system has only limited access to these data buses, it is necessary to establish the communication through external scripts. For this purpose REXcontrols is equipped with the mechanisms SILO, EPC and REXLANG, which are described in the article. Due to its simple implementation into REXcontrols and the option to utilize available libraries for communication with the Navio2 board in an external script, the EPC block was selected for the final implementation.

  2. Dynamic informational system for control and monitoring the tritium removal pilot plant with data transfer and process analyses

    International Nuclear Information System (INIS)

    Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu

    2005-01-01

    The dynamic informational system with a datalogging and supervisory control module includes a motion control module and is a new concept used in a tritium removal installation with isotopic exchange and cryogenic distillation. The control system includes an event-driven engine that maintains a real-time database, logs historical data, processes alarm information, and communicates with I/O devices. It also displays the operator interfaces and performs tasks defined for advanced control algorithms, supervisory control, analysis, and display, with data transfer from the data acquisition room to the control room. By using these parameters, we compute the deuterium and tritium concentrations of the liquid at the inlet of the isotopic exchange column and, consequently, we can compute the tritium concentration in the water vapors at the outlet of the column. (authors)

  3. The Design of Feedback Control Systems Containing a Saturation Type Nonlinearity

    Science.gov (United States)

    Schmidt, Stanley F.; Harper, Eleanor V.

    1960-01-01

    This report presents a derivation of the optimum response for a step input for plant transfer functions which have an unstable pole, together with further data on plants with a single zero in the left half of the s plane. The calculated data are presented in tabulated, normalized form. Optimum control systems are considered. The optimum system is defined as one which keeps the error as small as possible regardless of the input, under the constraint that the input to the plant (or controlled system) is limited. Intuitive arguments show that in the case where only the error can be sensed directly, the optimum system is obtained from the optimum relay or on-off solution. References to known solutions are presented. For the case when the system is of the sampled-data type, arguments are presented which indicate the optimum sampled-data system may be extremely difficult if not impossible to realize practically except for very simple plant transfer functions. Two examples of aircraft attitude autopilots are presented, one for a statically stable and the other for a statically unstable airframe. The rate of change of elevator motion is assumed limited for these examples. It is shown that by use of the nonlinear design techniques described in NASA TN D-20 one can obtain near optimum response for step inputs and reasonable response to sine wave inputs for either case. Also, the nonlinear design prevents inputs from driving the system unstable for either case.

  4. 241-SY-101 data acquisition and control system (DACS) remote operator interface operational test report

    International Nuclear Information System (INIS)

    ERMI, A.M.

    1999-01-01

    The readiness of the upgraded 241-SY-101 Data Acquisition and Control System (DACS) to provide proper control and monitoring of the mixer pump and instrumentation in tank 241-SY-101 was evaluated by performing OTP-440-001. Results of the OTP are reported here.

  5. The integration of two control systems

    International Nuclear Information System (INIS)

    Bickley, M.; White, K.

    1995-01-01

    During the past year the Continuous Electron Beam Accelerator Facility (CEBAF) has installed a new machine control system, based on the Experimental Physics and Industrial Control System (EPICS). The migration from CEBAF's old control system, Thaumaturgic Automated Control Logic (TACL), had to be done concurrently with commissioning of the CEBAF accelerator. The smooth transition to EPICS was made possible by the similarity of the control systems' topological design and network communication protocol. Both systems have operator display computer nodes which are decoupled from the data acquisition and control nodes. The communication between display and control nodes of both control systems is based on making named requests for data, with data being passed on change of value. Due to TACL's use of a central communications process, it was possible to integrate both control systems' network communications in that process. This in turn meant that CEBAF did not require changes to any other software in order to support network communication between TACL and EPICS. CEBAF implemented the machine's control under EPICS in an evolutionary, controlled manner. 4 refs., 3 figs

  6. ITER Fast Plant System Controller prototype based on PXIe platform

    International Nuclear Information System (INIS)

    Ruiz, M.; Vega, J.; Castro, R.; Sanz, D.; López, J.M.; Arcas, G. de; Barrera, E.; Nieto, J.; Gonçalves, B.; Sousa, J.; Carvalho, B.; Utzel, N.; Makijarvi, P.

    2012-01-01

    Highlights: ► Implementation of Fast Plant System Controller (FPSC) for ITER CODAC. ► Efficient data acquisition and data movement using EPICS. ► Performance of PCIe technologies in the implementation of FPSC. - Abstract: The ITER Fast Plant System Controller (FPSC) is based on embedded technologies. The FPSC will be devoted to both data acquisition tasks (sampling rates higher than 1 kHz) and control purposes (feedback loop actuators). Some of the essential requirements of these systems are: (a) data acquisition and data preprocessing; (b) interfacing with different networks and high speed links (Plant Operation Network, timing network based on IEEE1588, synchronous data transference and streaming/archiving networks); and (c) system setup and operation using EPICS (Experimental Physics and Industrial Control System) process variables. CIEMAT and UPM have implemented a prototype of the FPSC using a PXIe (PCI eXtensions for Instrumentation) form factor in an R&D project developed in two phases. The paper presents the main features of the two prototypes developed, named alpha and beta. The former was implemented using LabVIEW development tools as it was focused on modeling the FPSC software modules, using the graphical features of LabVIEW applications, and measuring the basic performance of the system. The alpha version prototype implements data acquisition with time-stamping, EPICS monitoring using waveform process variables (PVs), and archiving. The beta version prototype is a complete IOC implemented using EPICS with different software functional blocks. These functional blocks are integrated and managed using an ASYN driver solution and provide the basic functionalities required by the ITER FPSC, such as data acquisition, data archiving, data pre-processing (using both CPU and GPU) and streaming.

  7. PFP Wastewater Sampling Facility

    International Nuclear Information System (INIS)

    Hirzel, D.R.

    1995-01-01

    This test report documents the results obtained while conducting operational testing of the sampling equipment in the 225-WC building, the PFP Wastewater Sampling Facility. The Wastewater Sampling Facility houses equipment to sample and monitor the PFP's liquid effluents before discharging the stream to the 200 Area Treated Effluent Disposal Facility (TEDF). The majority of the streams are not radioactive and discharges from the PFP Heating, Ventilation, and Air Conditioning (HVAC). The streams that might be contaminated are processed through the Low Level Waste Treatment Facility (LLWTF) before discharging to TEDF. The sampling equipment consists of two flow-proportional composite samplers, an ultrasonic flowmeter, pH and conductivity monitors, chart recorder, and associated relays and current isolators to interconnect the equipment to allow proper operation. Data signals from the monitors are received in the 234-5Z Shift Office which contains a chart recorder and alarm annunciator panel. The data signals are also duplicated and sent to the TEDF control room through the Local Control Unit (LCU). Performing the OTP has verified the operability of the PFP wastewater sampling system. This Operability Test Report documents the acceptance of the sampling system for use

  8. Thermostatic system of sensor in NIR spectrometer based on PID control

    Science.gov (United States)

    Wang, Zhihong; Qiao, Liwei; Ji, Xufei

    2016-11-01

    Aiming at the shortcomings of the original sensor thermostatic control system in the near infrared (NIR) spectrometer, a novel thermostatic control system based on proportional-integral-derivative (PID) control technology was developed to improve the detection precision of the NIR spectrometer. The system comprises five parts: a bridge amplifier circuit, an analog-digital conversion (ADC) circuit, a microcontroller, a digital-analog conversion (DAC) circuit and a drive circuit. The five parts form a closed-loop control system based on a PID algorithm that controls the error between the temperature calculated from the ADC sampling data and the designed temperature, to ensure the stability of the spectrometer's sensor. The experimental results show that, when the operating temperature of the sensor is -11°, compared with the original system, the temperature control precision of the new control system is improved from ±0.64° to ±0.04° and the spectrum signal-to-noise ratio (SNR) is improved from 4891 to 5967.
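    The closed loop described in this abstract, a measured sensor temperature fed back through a PID law to a drive actuator, can be sketched minimally. The gains, toy thermal plant, and time constants below are invented for illustration and are not values from the paper:

```python
class PID:
    """Minimal discrete PID controller: u = kp*e + ki*integral(e) + kd*de/dt."""
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

def simulate(controller, t_init=5.0, ambient=20.0, tau=30.0, dt=0.5, steps=4000):
    """Toy first-order thermal plant: dT/dt = (u - (T - ambient)) / tau."""
    temp = t_init
    for _ in range(steps):
        u = controller.update(temp, dt)
        temp += dt * (u - (temp - ambient)) / tau
    return temp

# drive the sensor toward the -11 degree operating point from the abstract
final = simulate(PID(kp=8.0, ki=0.5, kd=1.0, setpoint=-11.0))
```

    In the real instrument, `u` would be written through the DAC to the drive circuit; the plant here is a toy model chosen only to show the integral term removing the steady-state error.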

  9. The ALICE data acquisition system

    Energy Technology Data Exchange (ETDEWEB)

    Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F. [European Organization for Nuclear Research (CERN), Geneva 23 (Switzerland); Dénes, E. [Research Institute for Particle and Nuclear Physics, Wigner Research Center, Budapest (Hungary); Divià, R.; Fuchs, U. [European Organization for Nuclear Research (CERN), Geneva 23 (Switzerland); Grigore, A. [European Organization for Nuclear Research (CERN), Geneva 23 (Switzerland); Politehnica Univesity of Bucharest, Bucharest (Romania); Kiss, T. [Cerntech Ltd., Budapest (Hungary); Simonetti, G. [Dipartimento Interateneo di Fisica ‘M. Merlin’, Bari (Italy); Soós, C.; Telesca, A.; Vande Vyvre, P. [European Organization for Nuclear Research (CERN), Geneva 23 (Switzerland); Haller, B. von, E-mail: bvonhall@cern.ch [European Organization for Nuclear Research (CERN), Geneva 23 (Switzerland)

    2014-03-21

    In this paper we describe the design, the construction, the commissioning and the operation of the Data Acquisition (DAQ) and Experiment Control Systems (ECS) of the ALICE experiment at the CERN Large Hadron Collider (LHC). The DAQ and the ECS are the systems used respectively for the acquisition of all physics data and for the overall control of the experiment. They are two computing systems made of hundreds of PCs and data storage units interconnected via two networks. The collection of experimental data from the detectors is performed by several hundreds of high-speed optical links. We describe in detail the design considerations for these systems handling the extreme data throughput resulting from central lead ion collisions at LHC energy. The implementation of the resulting requirements into hardware (custom optical links and commercial computing equipment), infrastructure (racks, cooling, power distribution, control room), and software led to many innovative solutions which are described together with a presentation of all the major components of the systems, as currently realized. We also report on the performance achieved during the first period of data taking (from 2009 to 2013), often exceeding the values specified in the DAQ Technical Design Report.

  10. Synchronic, optical transmission data link integrated with FPGA circuits (for TESLA LLRF control system)

    Energy Technology Data Exchange (ETDEWEB)

    Zielinski, J.S.

    2006-07-15

    The X-ray free-electron laser X-FEL that is being planned at the DESY research center in cooperation with European partners will produce high-intensity ultra-short X-ray flashes with the properties of laser light. This new light source, which can only be described in terms of superlatives, will open up a whole range of new possibilities for the natural sciences. It could also offer very promising opportunities for industrial users. SIMCON (SIMulator and CONtroller) is a project of a fast, low-latency digital controller dedicated to the LLRF system in the VUV FEL experiment. It is being developed by the ELHEP group in the Institute of Electronic Systems at Warsaw University of Technology. The main purpose of the project is to create a controller to stabilize the vector sum of fields in the cavities of one cryo-module in the experiment. The device can also be used as a simulator of the cavity and a test bench for other devices. The synchronous optical link project was made for the accelerator X-FEL laser TESLA, the LLRF control system experiment at DESY, Hamburg. The control and diagnostic data is transmitted at up to 2.5 Gbit/s through a plastic fiber over a distance of up to a few hundred meters. The link is synchronized once after power up, and never resynchronized while data is transmitted at maximum speed. The one-way link bit error rate is less than 10⁻¹⁵. The transceiver component is written in VHDL and runs in a dedicated Altera® Stratix® GX FPGA circuit. During the work in the PERG laboratory a 2.5 Gbit/s serial link with a long-vector parallel interface transceiver was created. The Long-Data-Vector transceiver transmits a 16-bit vector every 8 ns with 120 ns latency. (orig.)

  11. Synchronic, optical transmission data link integrated with FPGA circuits (for TESLA LLRF control system)

    International Nuclear Information System (INIS)

    Zielinski, J.S.

    2006-05-01

    The X-ray free-electron laser X-FEL that is being planned at the DESY research center in cooperation with European partners will produce high-intensity ultra-short X-ray flashes with the properties of laser light. This new light source, which can only be described in terms of superlatives, will open up a whole range of new possibilities for the natural sciences. It could also offer very promising opportunities for industrial users. SIMCON (SIMulator and CONtroller) is a project of a fast, low-latency digital controller dedicated to the LLRF system in the VUV FEL experiment. It is being developed by the ELHEP group in the Institute of Electronic Systems at Warsaw University of Technology. The main purpose of the project is to create a controller to stabilize the vector sum of fields in the cavities of one cryo-module in the experiment. The device can also be used as a simulator of the cavity and a test bench for other devices. The synchronous optical link project was made for the accelerator X-FEL laser TESLA, the LLRF control system experiment at DESY, Hamburg. The control and diagnostic data is transmitted at up to 2.5 Gbit/s through a plastic fiber over a distance of up to a few hundred meters. The link is synchronized once after power up, and never resynchronized while data is transmitted at maximum speed. The one-way link bit error rate is less than 10⁻¹⁵. The transceiver component is written in VHDL and runs in a dedicated Altera® Stratix® GX FPGA circuit. During the work in the PERG laboratory a 2.5 Gbit/s serial link with a long-vector parallel interface transceiver was created. The Long-Data-Vector transceiver transmits a 16-bit vector every 8 ns with 120 ns latency. (orig.)

  12. Delays and networked control systems

    CERN Document Server

    Hetel, Laurentiu; Daafouz, Jamal; Johansson, Karl

    2016-01-01

    This edited monograph includes state-of-the-art contributions on continuous time dynamical networks with delays. The book is divided into four parts. The first part presents tools and methods for the analysis of time-delay systems, with particular attention to control problems of large scale or infinite-dimensional systems with delays. The second part of the book is dedicated to the use of time-delay models for the analysis and design of Networked Control Systems. The third part of the book focuses on the analysis and design of systems with asynchronous sampling intervals, which occur in Networked Control Systems. The last part of the book presents several contributions dealing with the design of cooperative control and observation laws for networked control systems. The target audience primarily comprises researchers and experts in the field of control theory, but the book may also be beneficial for graduate students.

  13. Design And Construction Of Controller System And Data Acquisition Of Creep Test Machine

    International Nuclear Information System (INIS)

    Farokhi; Arhatari, B.D.; DT. SonyTj.. Histori; Sudarno; Haryanto, Mudi; Triyadi, Ari

    2001-01-01

    The design and construction of a creep test machine have been carried out to obtain higher performance from the controller system and data acquisition of that machine. The design and construction were realized by adding an automatic power control circuit, an interface and a computer program on a PC. The interface circuit is made in the form of a card applicable to the compatible ISA-IBM PC. The computer program is written in Turbo C++. With this modification, the test results show a reduction in measurement error from 80 μm to 90 μm. The modification also makes the creep test machine semi-automatic, reducing dependence on the operator. Other advantages are easier reading of the result data, display of the result data in real time or from file, easier production of test result curves, and easier analysis of the result data

  14. System Design Description for the SY-101 Hydrogen Mitigation Test Project Data Acquisition and Control System (DACS-1)

    Energy Technology Data Exchange (ETDEWEB)

    ERMI, A.M.

    1999-08-25

    This document describes the hardware and software of the computer subsystems for the Data Acquisition and Control System (DACS) used in mitigation tests conducted on waste tank 241-SY-101 at the Hanford Nuclear Reservation. The original system was designed and implemented by LANL, supplied to WHC, and turned over to LMHC for operation. In 1999, the hardware and software were upgraded to provide a state-of-the-art, Year-2000 compliant system.

  15. ISABELLE control system

    International Nuclear Information System (INIS)

    Humphrey, J.W.; Frankel, R.S.; Niederer, J.A.

    1980-01-01

    Design principles for the control system of the Brookhaven ISABELLE intersecting storage ring accelerator are described. Principal features include a locally networked console and control computer complex, a system-wide process data highway, and intelligent local device controllers. Progress to date is summarized

  16. On the Analysis of Case-Control Studies in Cluster-correlated Data Settings.

    Science.gov (United States)

    Haneuse, Sebastien; Rivera-Rodriguez, Claudia

    2018-01-01

    In resource-limited settings, long-term evaluation of national antiretroviral treatment (ART) programs often relies on aggregated data, the analysis of which may be subject to ecological bias. As researchers and policy makers consider evaluating individual-level outcomes such as treatment adherence or mortality, the well-known case-control design is appealing in that it provides efficiency gains over random sampling. In the context that motivates this article, valid estimation and inference requires acknowledging any clustering, although, to our knowledge, no statistical methods have been published for the analysis of case-control data for which the underlying population exhibits clustering. Furthermore, in the specific context of an ongoing collaboration in Malawi, rather than performing case-control sampling across all clinics, case-control sampling within clinics has been suggested as a more practical strategy. To our knowledge, although similar outcome-dependent sampling schemes have been described in the literature, a case-control design specific to correlated data settings is new. In this article, we describe this design, discuss balanced versus unbalanced sampling techniques, and provide a general approach to analyzing case-control studies in cluster-correlated settings based on inverse probability-weighted generalized estimating equations. Inference is based on a robust sandwich estimator with correlation parameters estimated to ensure appropriate accounting of the outcome-dependent sampling scheme. We conduct comprehensive simulations, based in part on real data on a sample of N = 78,155 program registrants in Malawi between 2005 and 2007, to evaluate small-sample operating characteristics and potential trade-offs associated with standard case-control sampling or when case-control sampling is performed within clusters.
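    The weighting idea behind the estimator, each sampled subject weighted by the inverse of its within-cluster sampling probability, can be sketched on a toy population. The cluster sizes, prevalence, and 1:1 case-control ratio below are invented for illustration and are not the Malawi design:

```python
import random
random.seed(0)

# toy clustered population: 50 clinics, 200 registrants each, ~10% cases
clusters = [[1 if random.random() < 0.1 else 0 for _ in range(200)]
            for _ in range(50)]

sample, weights = [], []
for members in clusters:
    cases = [y for y in members if y == 1]
    controls = [y for y in members if y == 0]
    m = len(cases)
    if m == 0:
        continue
    ctrl_sample = random.sample(controls, m)  # all cases + equal no. of controls
    sample += cases + ctrl_sample
    # inverse-probability weights: cases sampled w.p. 1, controls w.p. m/|controls|
    weights += [1.0] * m + [len(controls) / m] * m

# weighted estimate of the population prevalence
est = sum(w * y for w, y in zip(weights, sample)) / sum(weights)
```

    In the full method the same weights enter inverse probability-weighted generalized estimating equations rather than a simple weighted mean, with a robust sandwich variance accounting for the within-cluster correlation.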

  17. A novel PMT test system based on waveform sampling

    Science.gov (United States)

    Yin, S.; Ma, L.; Ning, Z.; Qian, S.; Wang, Y.; Jiang, X.; Wang, Z.; Yu, B.; Gao, F.; Zhu, Y.; Wang, Z.

    2018-01-01

    Compared with a traditional test system based on a QDC, TDC and scaler, a test system based on waveform sampling is constructed for signal sampling of the 8" R5912 and the 20" R12860 Hamamatsu PMTs in different energy states from single to multiple photoelectrons. In order to achieve high throughput and to reduce the dead time in data processing, the data acquisition software, based on LabVIEW, is developed and runs with a parallel mechanism. The analysis algorithm is realized in LabVIEW and the spectra of charge, amplitude, signal width and rise time are analyzed offline. The results from the Charge-to-Digital Converter, Time-to-Digital Converter and waveform sampling are discussed in a detailed comparison.
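    The offline spectra mentioned above start from per-pulse quantities extracted from each sampled waveform. A minimal sketch of that extraction follows; the baseline window, negative pulse polarity, and units are assumptions for illustration, not details from the paper:

```python
def pulse_parameters(samples, dt_ns, baseline_window=20):
    """Baseline, amplitude and integrated charge of a negative-going PMT pulse."""
    baseline = sum(samples[:baseline_window]) / baseline_window
    signal = [baseline - s for s in samples]   # flip so the pulse is positive
    amplitude = max(signal)                    # pulse height above baseline
    charge = sum(signal) * dt_ns               # area in mV*ns (divide by R for pC)
    return baseline, amplitude, charge

# synthetic waveform: flat 0 mV baseline with a small triangular pulse
wave = [0.0] * 40 + [-5.0, -10.0, -5.0] + [0.0] * 40
b, a, q = pulse_parameters(wave, dt_ns=1.0)   # b=0.0, a=10.0, q=20.0
```

    Signal width and rise time come from the same inverted trace, e.g. as threshold-crossing times at fixed fractions of the amplitude.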

  18. The Control and Configuration Software of the ATLAS Data Acquisition System: Upgrades for LHC Run 2

    CERN Document Server

    Aleksandrov, Igor; The ATLAS collaboration; Avolio, Giuseppe; Caprini, Mihai; Corso-Radu, Alina; D'ascanio, Matteo; De Castro Vargas Fernandes, Julio; Kazarov, Andrei; Kolobara, Bernard; Lankford, Andrew; Laurent, Florian; Lehmann Miotto, Giovanna; Magnoni, Luca; Papaevgeniou, Lykourgos; Ryabov, Yury; Santos, Alejandro; Seixas, Jose; Soloviev, Igor; Unel, Gokhan; Yasu, Yoshiji

    2016-01-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS detector at the Large Hadron Collider (LHC) at CERN is composed of a large number of distributed hardware and software components which in a coordinated manner provide the data-taking functionality of the overall system. The Controls and Configuration (CC) software offers services to configure, control and monitor the TDAQ system. It is a framework which provides essentially the glue that holds the various sub-systems together. While the overall architecture, established at the end of the 90’s, has proven to be solid and flexible, many software components (from core services, like the Run Control and the error management system, to end-user tools) have undergone a complete redesign or re-implementation during the LHC’s Long Shutdown I period. The upgrades were driven by the need to fold in the additional requirements that appeared in the course of LHC’s Run 1, to profit from new technologies and to refactor and clean up the code. This paper...

  19. Development of control and data processing system for JAERI ERL-FEL

    International Nuclear Information System (INIS)

    Kikuzawa, Nobuhiro

    2005-03-01

    A personal computer (PC) based distributed control system has been developed for the Free Electron Laser (FEL) at the Japan Atomic Energy Research Institute (JAERI) and operated since 1992. The control system was implemented on an Ethernet LAN of PCs, Nippon Electric Company (NEC Corp.) PC-9800 series 32-bit personal computers. It became troublesome to maintain the control system, because many application programs did not work on the outdated hardware interfaces and operating system. Furthermore, since security updates of the operating system (OS) were no longer provided, network security became a problem when many PCs were connected to the LAN. To solve these problems and to improve the reliability and safety of the control system, an ITRON-based controller was developed. In Japan, ITRON is very popular and embedded in many products, such as industrial instruments and household appliances, that demand high reliability. When the local controller was installed, a new control program was developed in the Java language, which has high compatibility across many platforms, so that replacement of the console computers will be easy in the future. High reliability and interchangeability have been successfully realized, and the control system has made long continuous operation possible. (author)

  20. Fast sampling from a Hidden Markov Model posterior for large data

    DEFF Research Database (Denmark)

    Bonnevie, Rasmus; Hansen, Lars Kai

    2014-01-01

    Hidden Markov Models are of interest in a broad set of applications including modern data driven systems involving very large data sets. However, approximate inference methods based on Bayesian averaging are precluded in such applications as each sampling step requires a full sweep over the data...
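    Exact posterior path sampling for an HMM is typically done by forward filtering followed by backward sampling (FFBS), and it is exactly this full sweep over the data that becomes the bottleneck at scale. A minimal two-state sketch, with all parameters invented for the demonstration:

```python
import random
random.seed(1)

# two hidden states, two output symbols; all parameters invented for the demo
A = [[0.9, 0.1], [0.2, 0.8]]     # transition matrix
B = [[0.8, 0.2], [0.3, 0.7]]     # emission matrix
prior = [0.5, 0.5]

def ffbs(obs, K=2):
    """Draw one hidden-state path exactly from p(path | obs)."""
    T = len(obs)
    # forward filtering: alpha[t][k] proportional to p(state_t = k | obs[0..t])
    a0 = [prior[k] * B[k][obs[0]] for k in range(K)]
    s = sum(a0)
    alpha = [[p / s for p in a0]]
    for t in range(1, T):
        cur = [B[k][obs[t]] * sum(alpha[-1][j] * A[j][k] for j in range(K))
               for k in range(K)]
        s = sum(cur)
        alpha.append([p / s for p in cur])
    # backward sampling: draw the last state, then each predecessor in turn
    path = [0] * T
    path[-1] = random.choices(range(K), weights=alpha[-1])[0]
    for t in range(T - 2, -1, -1):
        w = [alpha[t][j] * A[j][path[t + 1]] for j in range(K)]
        path[t] = random.choices(range(K), weights=w)[0]
    return path

path = ffbs([0] * 10 + [1] * 10)
```

    Each draw costs one O(T·K²) forward sweep, which is what motivates approximations when T is very large.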

  1. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1990-01-01

    A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfies specific design criteria. The new design method will graphically show how the discrete...

  2. Data acquisition and control system for the ECE imaging diagnostic on the EAST tokamak

    Science.gov (United States)

    Luo, C.; Lan, T.; Zhu, Y.; Xie, J.; Gao, B.; Liu, W.; Yu, C.; Milne, P. G.; Domier, C. W.; Luhmann, N. C.

    2017-06-01

    A 384-channel electron cyclotron emission imaging (ECEI) system is installed on the experimental advanced superconducting tokamak (EAST) and 7 gigabytes of data are produced for each regular discharge of a 10-second pulse. The data acquisition and control (DAC) system for the EAST ECEI diagnostics covers the large data production and embeds the ability to report the data quality instantly after the discharge. The symmetric routing design of the timing signal distributions among the 384 channels provides a low-cost solution to the synchronization of a large number of channels. The application of the load-balance bond service largely reduces the configuration difficulty and the cost in the high-speed data transferring tasks. Benefiting from the various kinds of hardware units with dedicated functionalities, an automated and user interactive DAC work flow is achieved, including the pre-selections of the automation scheme and the observation region, 384-channel data acquisition and local caching, post-discharge imaging data quality evaluation, remote system status monitoring, and inter-discharge imaging system event handling. The system configuration in a specific physics experiment is further optimized through the associated operating software which is enhanced by the input of the tokamak operation status and the region of interest (ROI) from other diagnostics. The DAC system is based on a modularized design and scalable to the long-pulse discharges in the EAST tokamak.

  3. Data acquisition and control system for the ECE imaging diagnostic on the EAST tokamak

    International Nuclear Information System (INIS)

    Luo, C.; Lan, T.; Xie, J.; Gao, B.; Liu, W.; Yu, C.; Zhu, Y.; Domier, C.W.; Luhmann, N.C.; Milne, P.G.

    2017-01-01

    A 384-channel electron cyclotron emission imaging (ECEI) system is installed on the experimental advanced superconducting tokamak (EAST) and 7 gigabytes of data are produced for each regular discharge of a 10-second pulse. The data acquisition and control (DAC) system for the EAST ECEI diagnostics covers the large data production and embeds the ability to report the data quality instantly after the discharge. The symmetric routing design of the timing signal distributions among the 384 channels provides a low-cost solution to the synchronization of a large number of channels. The application of the load-balance bond service largely reduces the configuration difficulty and the cost in the high-speed data transferring tasks. Benefiting from the various kinds of hardware units with dedicated functionalities, an automated and user interactive DAC work flow is achieved, including the pre-selections of the automation scheme and the observation region, 384-channel data acquisition and local caching, post-discharge imaging data quality evaluation, remote system status monitoring, and inter-discharge imaging system event handling. The system configuration in a specific physics experiment is further optimized through the associated operating software which is enhanced by the input of the tokamak operation status and the region of interest (ROI) from other diagnostics. The DAC system is based on a modularized design and scalable to the long-pulse discharges in the EAST tokamak.

  4. MPS Data Acquisition System

    International Nuclear Information System (INIS)

    Eiseman, S.E.; Miller, W.J.

    1975-01-01

    A description is given of the data acquisition system used with the multiparticle spectrometer facility at Brookhaven. Detailed information is provided on that part of the system which connects the detectors to the data handler; namely, the detector electronics, device controller, and device port optical isolator

  5. Industrial variographic analysis for continuous sampling system validation

    DEFF Research Database (Denmark)

    Engström, Karin; Esbensen, Kim Harry

    2017-01-01

    Karin Engström, LKAB mining, Kiruna, Sweden, continues to present illuminative cases from process industry. Here she reveals more from her ongoing PhD project showing application of variographic characterisation for on-line continuous control of process sampling systems, including the one...
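    The experimental variogram at the heart of variographic characterisation is straightforward to compute: V(j) = Σᵢ (xᵢ₊ⱼ − xᵢ)² / (2(N − j)) over a 1-D stream of process increments. A minimal sketch, with a white-noise test series invented for illustration (for an uncorrelated stream the variogram should sit flat at the series variance):

```python
import random

def variogram(series, max_lag):
    """Experimental variogram V(j) of a 1-D process-data series."""
    n = len(series)
    return [sum((series[i + j] - series[i]) ** 2 for i in range(n - j))
            / (2 * (n - j))
            for j in range(1, max_lag + 1)]

random.seed(2)
data = [random.gauss(10.0, 1.0) for _ in range(5000)]  # uncorrelated stream
v = variogram(data, 5)   # each lag should be close to the variance, 1.0
```

    In on-line use, the shape of V(j) at small lags (the nugget and any periodic ripple) is what flags sampling-system problems.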

  6. A new data-driven controllability measure with application in intelligent buildings

    DEFF Research Database (Denmark)

    Shaker, Hamid Reza; Lazarova-Molnar, Sanja

    2017-01-01

    and instrumentation within today's intelligent buildings enable collecting high quality data which could be used directly in data-based analysis and control methods. The area of data-based systems analysis and control concentrates on developing analysis and control methods that rely on data collected from meters and sensors, and on information obtained by data processing. This differs from the traditional model-based approaches that are based on mathematical models of systems. We propose and describe a data-driven controllability measure for discrete-time linear systems. The concept is developed within a data-based system analysis and control framework; therefore, only measured data is used to obtain the proposed controllability measure. The proposed controllability measure not only shows if the system is controllable or not, but also reveals the level of controllability, which is the information its previous...
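    One concrete way to obtain such a measure purely from measured data is to assemble an empirical controllability Gramian from recorded impulse-response snapshots and score its smallest eigenvalue. The toy system and scheme below are an illustrative assumption, not the construction proposed by the authors:

```python
# toy discrete-time system, used here only to generate the "measured" snapshots;
# the measure itself is computed from the recorded states alone
A = [[0.5, 0.1], [0.0, 0.4]]
B = [1.0, 0.5]

def step(x, u):
    return [A[0][0] * x[0] + A[0][1] * x[1] + B[0] * u,
            A[1][0] * x[0] + A[1][1] * x[1] + B[1] * u]

# record the state trajectory after a unit input impulse at k = 0
snapshots, x = [], step([0.0, 0.0], 1.0)
for _ in range(50):
    snapshots.append(x)
    x = step(x, 0.0)

# empirical controllability Gramian W = sum_k x_k x_k^T, built from data alone
W = [[sum(s[i] * s[j] for s in snapshots) for j in range(2)] for i in range(2)]

# scalar measure: smallest eigenvalue of W (zero => an uncontrollable direction)
tr = W[0][0] + W[1][1]
det = W[0][0] * W[1][1] - W[0][1] * W[1][0]
lam_min = (tr - (tr * tr - 4 * det) ** 0.5) / 2
```

    A positive smallest eigenvalue indicates every state direction is reachable from the input, and its magnitude serves as the "level of controllability" the abstract refers to.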

  7. DATA MANAGEMENT SYSTEM FOR MOBILE SATELLITE PROPAGATION DATA

    Science.gov (United States)

    Kantak, A. V.

    1994-01-01

    The "Data Management System for Mobile Satellite Propagation" package is a collection of FORTRAN programs and UNIX shell scripts designed to handle the huge amounts of data resulting from Mobile Satellite propagation experiments. These experiments are designed to assist in defining channels for mobile satellite systems. By understanding the multipath fading characteristics of the channel, Doppler effects, and blockage due to manmade objects as well as natural surroundings, characterization of the channel can be realized. Propagation experiments, then, are performed using a prototype of the system, simulating the ultimate product environment. After the data from these experiments is generated, the researcher must be able to access it with a minimum of effort and derive some standard results. The programs included in this package manipulate the data files generated by the NASA/JPL Mobile Satellite propagation experiment on an interactive basis. In the experiment, a transmitter operating at 869 MHz was carried to an altitude of 32 km by a stratospheric balloon. A vehicle within the line-of-sight of the transmitter was then driven around, splitting the incoming signal into I and Q channels and sampling the resulting signal strength at 1000 samples per second. The data was collected at various antenna elevation angles and different times of day, generating the ancillary data for the experiment. This package contains a program to convert the binary format of the data generated into standard ASCII format suitable for use with a wide variety of machine architectures. Also included is a UNIX shell script designed to parse this ASCII file into those records of data that match the researcher's desired values for the ancillary data parameters. In addition, four FORTRAN programs are included to obtain standard quantities from the data.
Quantities such as probability of signal level greater than or equal to a specified signal level, probability density of the signal levels, frequency

  8. Resolution optimization with irregularly sampled Fourier data

    International Nuclear Information System (INIS)

    Ferrara, Matthew; Parker, Jason T; Cheney, Margaret

    2013-01-01

    Image acquisition systems such as synthetic aperture radar (SAR) and magnetic resonance imaging often measure irregularly spaced Fourier samples of the desired image. In this paper we show the relationship between sample locations, their associated backprojection weights, and image resolution as characterized by the resulting point spread function (PSF). Two new methods for computing data weights, based on different optimization criteria, are proposed. The first method, which solves a maximal-eigenvector problem, optimizes a PSF-derived resolution metric which is shown to be equivalent to the volume of the Cramer–Rao (positional) error ellipsoid in the uniform-weight case. The second approach utilizes as its performance metric the Frobenius error between the PSF operator and the ideal delta function, and is an extension of a previously reported algorithm. Our proposed extension appropriately regularizes the weight estimates in the presence of noisy data and eliminates the superfluous issue of image discretization in the choice of data weights. The Frobenius-error approach results in a Tikhonov-regularized inverse problem whose Tikhonov weights are dependent on the locations of the Fourier data as well as the noise variance. The two new methods are compared against several state-of-the-art weighting strategies for synthetic multistatic point-scatterer data, as well as an ‘interrupted SAR’ dataset representative of in-band interference commonly encountered in very high frequency radar applications. (paper)
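
The abstract's central object, the point spread function induced by irregular Fourier sample locations and their backprojection weights, can be sketched directly. This is a toy 1-D illustration, not the paper's weighting algorithms; the sample locations and the uniform-weight choice are hypothetical:

```python
import numpy as np

# The PSF of a weighted backprojection from irregular 1-D Fourier samples
# k_i with weights w_i, evaluated at image positions x:
#   psf(x) = sum_i w_i * exp(2j*pi*k_i*x)
rng = np.random.default_rng(0)
k = np.sort(rng.uniform(-0.5, 0.5, 64))   # irregular Fourier sample locations
w = np.full_like(k, 1.0 / k.size)         # uniform weights (hypothetical choice)

x = np.linspace(-16, 16, 801)             # image-domain coordinates
psf = (w[:, None] * np.exp(2j * np.pi * np.outer(k, x))).sum(axis=0)

# A PSF with energy concentrated at x = 0 indicates good resolution; other
# weight choices trade mainlobe width against sidelobe level, which is what
# the paper's optimization criteria quantify.
print(abs(psf).argmax() == x.size // 2)   # peak at the origin
```
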

  9. Real-time digital control, data acquisition, and analysis system for the DIII-D multipulse Thomson scattering diagnostic

    International Nuclear Information System (INIS)

    Greenfield, C.M.; Campbell, G.L.; Carlstrom, T.N.; DeBoo, J.C.; Hsieh, C.; Snider, R.T.; Trost, P.K.

    1990-01-01

    A VME-based real-time computer system for laser control, data acquisition, and analysis for the DIII-D multipulse Thomson scattering diagnostic is described. The laser control task requires precise timing of up to eight Nd:YAG lasers, each with an average firing rate of 20 Hz. A cpu module in a real-time multiprocessing computer system will operate the lasers with evenly staggered laser pulses or in a ''burst mode,'' where all available (fully charged) lasers can be fired at 50--100 μs intervals upon receipt of an external event trigger signal. One or more cpu modules, along with a LeCroy FERA (fast encoding and readout ADC) system, will perform real-time data acquisition and analysis. Partial electron temperature and density profiles will be available for plasma feedback control within 1 ms following each laser pulse. The VME-based computer system consists of two or more target processor modules (25 MHz Motorola 68030) running the VMEexec real-time operating system connected to a Unix-based host system (also a 68030). All real-time software is fully interrupt driven to maximize system efficiency. Operator interaction and (non-real-time) data analysis takes place on a MicroVAX 3400 connected via DECnet

  10. SAIL--a software system for sample and phenotype availability across biobanks and cohorts.

    Science.gov (United States)

    Gostev, Mikhail; Fernandez-Banet, Julio; Rung, Johan; Dietrich, Joern; Prokopenko, Inga; Ripatti, Samuli; McCarthy, Mark I; Brazma, Alvis; Krestyaninova, Maria

    2011-02-15

    The Sample avAILability system (SAIL) is a web-based application for searching, browsing and annotating biological sample collections or biobank entries. By providing individual-level information on the availability of specific data types (phenotypes, genetic or genomic data) and samples within a collection, rather than the actual measurement data, resource integration can be facilitated. A flexible data structure enables collection owners to provide descriptive information on their samples using existing or custom vocabularies. Users can query for the available samples by various parameters, combining them via logical expressions. The system can be scaled to hold data from millions of samples with thousands of variables. SAIL is available under the Affero GPL open source license: https://github.com/sail.

  11. The development of neutron activation, sample transportation and γ-ray counting routine system for numbers of geological samples

    International Nuclear Information System (INIS)

    Shibata Shin-nosuke; Tanaka, Tsuyoshi; Minami, Masayo

    2001-01-01

    A new gamma-ray counting and data processing system for non-destructive neutron activation analysis has been set up in the Radioisotope Center at Nagoya University. The system carries out gamma-ray counting, sample changing, and data processing automatically, relieving the operator of much of the complicated manual work in INAA. In this study, we arranged a simple analytical procedure that makes practical work easier than before. The concrete workflow is described, from the preparation of powder rock samples to gamma-ray counting and data processing by the new INAA system. Two Geological Survey of Japan rock reference samples, JB-1a and JG-1a, were then analyzed to evaluate the speed and accuracy of the new procedure for geological materials, with the United States Geological Survey reference samples BCR-1 and G-2 used as standards. Twenty-two elements were analyzed for JB-1a and 25 for JG-1a; the uncertainties are <5% for Na, Sc, Fe, Co, La, Ce, Sm, Eu, Yb, Lu, Hf, Ta and Th, and <10% for Cr, Zn, Cs, Ba, Nd, Tb and U. This system will enable us to analyze more than 1500 geological samples per year. (author)

  12. Sample-hold and analog multiplexer for multidetector systems

    Energy Technology Data Exchange (ETDEWEB)

    Goswami, G C; Ghoshdostidar, M R; Ghosh, B; Chaudhuri, N [North Bengal Univ., Darjeeling (India). Dept. of Physics

    1982-08-15

    A new sample-hold circuit with an analog multiplexer system is described. Designed for multichannel acquisition of data from an air shower array, the system is being used for accurate measurements of pulse heights from 16 channels by the use of a single ADC.

  13. Report: ECHO Data Quality Audit – Phase I Results: The Integrated Compliance Information System Needs Security Controls to Protect Significant Non-Compliance Data

    Science.gov (United States)

    Report #09-P-0226, August 31, 2009. End users of the Permit Compliance System and Integrated Compliance Information System National Pollutant Discharge Elimination System can override the Significant Non-Compliance data field without more access controls.

  14. Admission Control of Integrated Voice and Data CDMA/TDD System Considering Asymmetric Traffic and Power Limit

    Institute of Scientific and Technical Information of China (English)

    CAOYanbo; ZHOUBin; LIChengshu

    2004-01-01

    In this paper, we study an admission control scheme for an integrated voice and data CDMA/TDD (Code Division Multiple Access/Time Division Duplex) system, considering asymmetric traffic and power limits. A new user can access the system only if the outage probabilities it experiences on the uplink and downlink time slots are below a threshold value. Based on the power limit, the results show the voice and data blocking probabilities under different cell coverages, arrival rates, and various uplink/downlink time slot allocation patterns. Furthermore, multicode and multislot schemes are also evaluated under the presented admission control scheme.
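
The stated admission rule can be illustrated with a toy sketch. The outage-probability model, threshold, and load numbers below are invented placeholders, not the paper's analysis:

```python
# Toy illustration of the admission rule: admit a new user only if the
# estimated uplink AND downlink outage probabilities stay below a threshold
# after admission. The linear load -> outage model is purely hypothetical.
def outage_prob(load, capacity):
    """Hypothetical monotone model: no outage until ~70% of capacity."""
    return max(0.0, (load - 0.7 * capacity) / capacity)

def admit(ul_load, dl_load, capacity, threshold=0.05):
    """Check both links with the prospective user's load added."""
    return (outage_prob(ul_load + 1, capacity) < threshold and
            outage_prob(dl_load + 1, capacity) < threshold)

# Asymmetric traffic: the downlink slots carry much more load than the uplink.
print(admit(ul_load=10, dl_load=30, capacity=50),
      admit(ul_load=10, dl_load=40, capacity=50))  # True False
```

A heavier downlink load blocks admission even though the uplink has ample headroom, which is why TDD slot-allocation patterns matter for asymmetric traffic.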

  15. A data-driven fault-tolerant control design of linear multivariable systems with performance optimization.

    Science.gov (United States)

    Li, Zhe; Yang, Guang-Hong

    2017-09-01

    In this paper, an integrated data-driven fault-tolerant control (FTC) design scheme is proposed under the configuration of the Youla parameterization for multiple-input multiple-output (MIMO) systems. With unknown system model parameters, the canonical form identification technique is first applied to design the residual observer in fault-free case. In faulty case, with online tuning of the Youla parameters based on the system data via the gradient-based algorithm, the fault influence is attenuated with system performance optimization. In addition, to improve the robustness of the residual generator to a class of system deviations, a novel adaptive scheme is proposed for the residual generator to prevent its over-activation. Simulation results of a two-tank flow system demonstrate the optimized performance and effect of the proposed FTC scheme. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  16. Synchronizing data from irregularly sampled sensors

    Science.gov (United States)

    Uluyol, Onder

    2017-07-11

    A system and method include receiving a set of sampled measurements for each of multiple sensors, wherein the sampled measurements are at irregular intervals or different rates, re-sampling the sampled measurements of each of the multiple sensors at a higher rate than one of the sensor's set of sampled measurements, and synchronizing the sampled measurements of each of the multiple sensors.
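
A minimal sketch of the claimed method, assuming linear interpolation as the re-sampling step (the patent abstract does not specify one); the sensor names and signals are invented:

```python
import numpy as np

# Re-sample each irregularly sampled sensor onto a common uniform grid whose
# rate exceeds the sensors' native rates, then all series share one time base.
def synchronize(sensors, rate_hz, t_end):
    """sensors: {name: (timestamps, values)} -> (grid, {name: resampled})."""
    t = np.arange(0.0, t_end, 1.0 / rate_hz)
    return t, {name: np.interp(t, ts, vs) for name, (ts, vs) in sensors.items()}

rng = np.random.default_rng(1)
t_a = np.sort(rng.uniform(0, 10, 37))     # ~3.7 Hz average, irregular intervals
t_b = np.sort(rng.uniform(0, 10, 91))     # ~9.1 Hz average, different rate
sensors = {"temp": (t_a, np.sin(t_a)), "press": (t_b, np.cos(t_b))}

t, synced = synchronize(sensors, rate_hz=20, t_end=10.0)  # 20 Hz > both rates
print(len(t), len(synced["temp"]), len(synced["press"]))  # 200 200 200
```

After synchronization, every sensor has a sample at every grid instant, so cross-sensor computations line up element-by-element.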

  17. Development of gas-sampling device for 13N monitoring system

    International Nuclear Information System (INIS)

    Zhao Lihong; Gong Xueyu

    2003-01-01

    The 13N monitoring system is used to monitor the leakage rate of the primary coolant circuit in nuclear power stations. The author introduces a gas-sampling device for the 13N monitoring system. It uses a closed-loop flow control system under the intelligent control of a single-chip microcomputer (SCM), and can monitor and replace the filter paper automatically, increasing the automation of the device and enabling stable long-term operation

  18. Gravimetric dust sampling for control purposes and occupational dust sampling.

    CSIR Research Space (South Africa)

    Unsted, AD

    1997-02-01

    Full Text Available Prior to the introduction of gravimetric dust sampling, konimeters had been used for dust sampling, which was largely for control purposes. Whether or not absolute results were achievable was not an issue since relative results were used to evaluate...

  19. The Marketing & Positive Impacts of Behavioral Control System on Societies & Countries

    Directory of Open Access Journals (Sweden)

    Ahmad Adel Mostafa

    2015-03-01

    Full Text Available Behavioral control systems are one of the most prominent tools used by managers and marketers for different internal and external purposes. One of the most important external purposes is influencing consumer behavior. This paper explores the positive effects of implementing such systems on societies. It discusses consumer perception of the systems, their influence on financial behavior in different contexts, how they can create order, how and to what extent they should be implemented, and finally how they can minimize negative consumer behavior. A judgment-based sample of typical consumers was surveyed using questionnaires to collect primary data on these aspects. Secondary data from Egypt, Singapore and Malaysia was also used as examples of behavioral control systems in use. Results show that consumers in general have a positive attitude towards imposing such systems. However, there were worries about misuse, abuse and overuse of these systems' policies. Consequently, the data show that behavioral control systems can positively enhance and influence consumer behavior as long as they are used to balance both consumer and retailer interests in a moderate, risk-free manner.

  20. Method of software development for tasks of automatic control systems for simulation and designing on the base of the technological systems design data

    International Nuclear Information System (INIS)

    Ajzatulin, A.I.

    2007-01-01

    The factors affecting the design of full-scale simulation facilities, simulation of the design database, and the application of digital computerized process control systems are studied. Problems arising from errors in the process system design data, and methodological problems of algorithm simulation, are described. Based on the experience of designing the full-scale simulation facilities for the Tianwan NPP and the Kudankulam NPP, a procedure is presented for developing new tools to simulate and develop algorithms for computerized process control systems based on the process system design data. The basic components of the program system under development for simulation and design are listed and their functions described. The results of its introduction are briefly described [ru]

  1. Distributed systems status and control

    Science.gov (United States)

    Kreidler, David; Vickers, David

    1990-01-01

    Concepts are investigated for an automated status and control system for a distributed processing environment. System characteristics, data requirements for health assessment, data acquisition methods, system diagnosis methods and control methods were investigated in an attempt to determine the high-level requirements for a system which can be used to assess the health of a distributed processing system and implement control procedures to maintain an accepted level of health for the system. A potential concept for automated status and control includes the use of expert system techniques to assess the health of the system, detect and diagnose faults, and initiate or recommend actions to correct the faults. Therefore, this research included the investigation of methods by which expert systems were developed for real-time environments and distributed systems. The focus is on the features required by real-time expert systems and the tools available to develop real-time expert systems.

  2. Real-time operation without a real-time operating system for instrument control and data acquisition

    Science.gov (United States)

    Klein, Randolf; Poglitsch, Albrecht; Fumi, Fabio; Geis, Norbert; Hamidouche, Murad; Hoenle, Rainer; Looney, Leslie; Raab, Walfried; Viehhauser, Werner

    2004-09-01

    We are building the Field-Imaging Far-Infrared Line Spectrometer (FIFI LS) for the US-German airborne observatory SOFIA. The detector read-out system is driven by a clock signal at a certain frequency. This signal has to be provided, and all other sub-systems have to work synchronously to this clock. The data generated by the instrument have to be received by a computer in a timely manner. Usually these requirements are met with a real-time operating system (RTOS). In this presentation we show how we meet these demands differently, avoiding the stiffness of an RTOS. Digital I/O cards with a large buffer separate the asynchronously working computers from the synchronously working instrument. The advantage is that the data-processing computers do not need to process the data in real time; it is sufficient that they can process the incoming data stream on average. But since the data is read in synchronously, the problem of relating commands to responses (data) has to be solved: the data arrives at a fixed rate, and the receiving I/O card buffers it until the computer can access it. To relate the data to commands sent previously, the data is tagged by counters in the read-out electronics. These counters count the system's heartbeat and signals derived from it. The heartbeat, and control signals synchronous with the heartbeat, are sent by an I/O card working as a pattern generator. Its buffer is continuously programmed with a pattern which is clocked out on the control lines. A counter in the I/O card keeps track of the number of pattern words clocked out. By reading this counter, the computer knows the state of the instrument and the meaning of the data that will arrive with a certain time-tag.
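
The time-tagging scheme described above, matching asynchronously received data to the command pattern that was active at a given heartbeat count, can be sketched as follows. The command names and counter values are illustrative, not FIFI LS's actual pattern:

```python
import bisect

# Commands clocked out by the pattern generator, keyed by the heartbeat count
# at which each became active (hypothetical schedule).
schedule = [(0, "idle"), (100, "chop"), (250, "scan")]
starts = [s for s, _ in schedule]

def command_for(tag):
    """Resolve a data word's heartbeat tag to the command active at that tick."""
    return schedule[bisect.bisect_right(starts, tag) - 1][1]

# Data words may reach the computer late and out of order, but their tags
# still resolve to the correct instrument state.
for tag in [40, 260, 120]:
    print(tag, command_for(tag))  # 40 idle / 260 scan / 120 chop
```
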

  3. Development of MATLAB software to control data acquisition from a multichannel systems multi-electrode array.

    Science.gov (United States)

    Messier, Erik

    2016-08-01

    A Multichannel Systems (MCS) microelectrode array data acquisition (DAQ) unit is used to collect multichannel electrograms (EGM) from a Langendorff-perfused rabbit heart system to study sudden cardiac death (SCD). MCS provides software through which data being processed by the DAQ unit can be displayed and saved, but this software's utility in combination with MATLAB is limited. MCS's software stores recorded EGM data in a MathCad (MCD) format, which is then converted to a text file format. These text files are very large, and it is therefore very time-consuming to import the EGM data into MATLAB for real-time analysis. Therefore, customized MATLAB software was developed to control the acquisition of data from the MCS DAQ unit and provide specific laboratory accommodations for this study of SCD. The developed DAQ unit control software is able to accurately: provide a real-time display of EGM signals; record and save EGM signals in MATLAB in a desired format; and produce real-time analysis of the EGM signals; all through an intuitive GUI.

  4. Wireless Remote Control System

    Directory of Open Access Journals (Sweden)

    Adrian Tigauan

    2012-06-01

    Full Text Available This paper presents the design of a wireless remote control system based on the ZigBee communication protocol. Gathering data from sensors or performing control tasks through wireless communication is advantageous in situations in which the use of cables is impractical. An Atmega328 microcontroller (in the slave device) is used for gathering data from the sensors and transmitting it to a coordinator device with the help of the XBee modules. The ZigBee standard is suitable for low-cost, low-data-rate and low-power wireless network implementations. The XBee-PRO module, designed to meet ZigBee standards, requires minimal power for reliable data exchange between devices over a distance of up to 1600 m outdoors. A key component of the ZigBee protocol is its ability to support networking, and this can be used in a wireless remote control system. This system may be employed, e.g., to control temperature and humidity (SHT11 sensor) and light intensity (TSL230 sensor) levels inside a commercial greenhouse.

  5. The NSTX Central Instrumentation and Control System

    International Nuclear Information System (INIS)

    G. Oliaro; J. Dong; K. Tindall; P. Sichta

    1999-01-01

    Earlier this year the National Spherical Torus Experiment (NSTX) at the Princeton Plasma Physics Laboratory achieved ''first plasma''. The Central Instrumentation and Control System was used to support plasma operations. Major elements of the system include the Process Control System, Plasma Control System, Network System, Data Acquisition System, and Synchronization System. This paper will focus on the Process Control System. Topics include the architecture, hardware interface, operator interface, data management, and system performance

  6. GLODAPv2 data exploration and extraction system

    Science.gov (United States)

    Krassovski, Misha; Kozyr, Alex; Boden, Thomas

    2016-04-01

    The Global Ocean Data Analysis Project (GLODAP) is a cooperative effort of investigators funded for ocean synthesis and modeling projects by the U.S. National Oceanic and Atmospheric Administration (NOAA), Department of Energy (DOE), and National Science Foundation (NSF). Cruises conducted as part of the WOCE, JGOFS, and NOAA Ocean-Atmosphere Carbon Exchange Study (OACES) over the decade of the 1990s generated oceanographic data of unparalleled quality and quantity. GLODAPv2 is a uniformly calibrated open-ocean data product containing inorganic carbon and carbon-relevant variables. This new product includes data from approximately one million individual seawater samples collected from over 700 cruises during the period 1972-2013. Extensive quality control and subsequent calibration were carried out for salinity, oxygen, nutrient, carbon dioxide, total alkalinity, pH, and chlorofluorocarbon data. The Carbon Dioxide Information and Analysis Center (CDIAC), serving as the primary DOE disseminator for climate data and information, developed database and web accessible systems that permit users worldwide to query and retrieve data from the GLODAPv2 collection. This presentation will showcase this new system, discuss technologies used to build the GLODAPv2 resource, and describe integration with a metadata search engine provided by CDIAC as well.

  7. Systems and methods for data quality control and cleansing

    Science.gov (United States)

    Wenzel, Michael; Boettcher, Andrew; Drees, Kirk; Kummer, James

    2016-05-31

    A method for detecting and cleansing suspect building automation system data is shown and described. The method includes using processing electronics to automatically determine which of a plurality of error detectors and which of a plurality of data cleansers to use with building automation system data. The method further includes using processing electronics to automatically detect errors in the data and cleanse the data using a subset of the error detectors and a subset of the cleansers.
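
A hedged sketch of the described detect-then-cleanse pattern. The function names, thresholds, and hold-last-value strategy are illustrative, not the patent's API:

```python
# One possible error detector and one possible cleanser from the plurality
# the method describes; the processing electronics would select a subset of
# each to apply to the building automation data.
def detect_out_of_range(xs, lo, hi):
    """Error detector: flag indices outside a plausible physical range."""
    return {i for i, x in enumerate(xs) if not lo <= x <= hi}

def cleanse_hold_last(xs, bad):
    """Cleanser: replace flagged samples with the last good value."""
    out, last = [], None
    for i, x in enumerate(xs):
        if i in bad and last is not None:
            out.append(last)          # substitute the last trusted sample
        else:
            out.append(x)
            last = x
    return out

temps = [21.5, 21.7, 999.0, 21.9, -40.0, 22.1]       # hypothetical zone temps, deg C
bad = detect_out_of_range(temps, lo=-10.0, hi=60.0)  # {2, 4}: sensor glitches
print(cleanse_hold_last(temps, bad))  # [21.5, 21.7, 21.7, 21.9, 21.9, 22.1]
```
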

  8. Torness computer system turns round data

    International Nuclear Information System (INIS)

    Dowler, E.; Hamilton, J.

    1989-01-01

    The Torness nuclear power station has two advanced gas-cooled reactors. A key feature is the distributed computer system which covers both data processing and auto-control. The complete computer system has over 80 processors with 45000 digital and 22000 analogue input signals. The on-line control and monitoring systems include operating systems, plant data acquisition and processing, alarm and event detection, communications software, process management systems and database management software. Some features of the system are described. (UK)

  9. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    International Nuclear Information System (INIS)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B.

    2015-01-01

    A wireless-based, custom-built aerosol sampling network is designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems are used in a field measurement campaign in which sodium aerosol dispersion experiments have been conducted as part of environmental impact studies related to the sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each with a custom-built sampling head and wireless control networking designed with a Programmable System on Chip (PSoC™) and XBee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult due to the requirement of simultaneous operation and status logging

  10. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B. [Radiation Impact Assessment Section, Radiological Safety Division, Indira Gandhi Centre for Atomic Research, Kalpakkam 603 102 (India)

    2015-07-15

    A wireless-based, custom-built aerosol sampling network is designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems are used in a field measurement campaign in which sodium aerosol dispersion experiments have been conducted as part of environmental impact studies related to the sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each with a custom-built sampling head and wireless control networking designed with a Programmable System on Chip (PSoC™) and XBee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult due to the requirement of simultaneous operation and status logging.

  11. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize operational time and the use of consumables.
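
The three-condition trigger logic described above can be sketched as a small state machine. The step names, threshold values, and units are assumptions for illustration, not NASA's actual software:

```python
# Advance the desalting sequence when a conductivity reading crosses the
# threshold for the current step (values are hypothetical, in uS/cm).
LOW, HIGH = 5.0, 200.0

def next_step(step, conductivity):
    """Return the next step of the sequence, or the same step if not triggered."""
    if step == "water_flush" and conductivity < LOW:
        return "acid_wash"        # neutralized (condition 1) -> acidify
    if step == "acid_wash" and conductivity > HIGH:
        return "base_elution"     # acidified (condition 2) -> elute at high pH
    if step == "base_elution" and conductivity > HIGH:
        return "done"             # basic, high pH (condition 3)
    return step                   # condition not yet reached: keep flushing

step = "water_flush"
for reading in [150.0, 40.0, 3.0, 250.0, 260.0]:   # simulated probe readings
    step = next_step(step, reading)
print(step)  # done
```

Because each transition waits for its conductivity condition, the sequence tolerates arbitrary flush durations instead of relying on fixed timing.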

  12. High Accuracy Evaluation of the Finite Fourier Transform Using Sampled Data

    Science.gov (United States)

    Morelli, Eugene A.

    1997-01-01

    Many system identification and signal processing procedures can be done advantageously in the frequency domain. A required preliminary step for this approach is the transformation of sampled time domain data into the frequency domain. The analytical tool used for this transformation is the finite Fourier transform. Inaccuracy in the transformation can degrade system identification and signal processing results. This work presents a method for evaluating the finite Fourier transform using cubic interpolation of sampled time domain data for high accuracy, and the chirp Zeta-transform for arbitrary frequency resolution. The accuracy of the technique is demonstrated in example cases where the transformation can be evaluated analytically. Arbitrary frequency resolution is shown to be important for capturing details of the data in the frequency domain. The technique is demonstrated using flight test data from a longitudinal maneuver of the F-18 High Alpha Research Vehicle.
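
The core idea, evaluating the finite Fourier transform at arbitrarily spaced frequencies rather than the FFT's fixed 1/T grid, can be sketched with a simple rectangle-rule version. Morelli's method achieves higher accuracy with cubic interpolation and the chirp Zeta-transform; the test signal below is invented:

```python
import numpy as np

# Rectangle-rule approximation of the finite Fourier transform
#   X(f) = integral_0^T x(t) exp(-2j*pi*f*t) dt
# evaluated at an arbitrary frequency grid, from sampled time-domain data.
dt, T = 0.01, 10.0
t = np.arange(0.0, T, dt)
x = np.sin(2 * np.pi * 1.23 * t)     # sampled time-domain data, 1.23 Hz tone

f = np.linspace(1.0, 1.5, 501)       # 1 mHz spacing; an FFT of this record
                                     # would be locked to 1/T = 0.1 Hz bins
X = (x * np.exp(-2j * np.pi * np.outer(f, t))).sum(axis=1) * dt

print(f[np.abs(X).argmax()])         # spectral peak lands near the true 1.23 Hz
```

The fine grid resolves the tone between FFT bins, illustrating why arbitrary frequency resolution matters for capturing details in the frequency domain.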

  13. Adaptive Constrained Optimal Control Design for Data-Based Nonlinear Discrete-Time Systems With Critic-Only Structure.

    Science.gov (United States)

    Luo, Biao; Liu, Derong; Wu, Huai-Ning

    2018-06-01

    Reinforcement learning has proved to be a powerful tool for solving optimal control problems over the past few years. However, the data-based constrained optimal control problem of nonaffine nonlinear discrete-time systems has rarely been studied. To solve this problem, an adaptive optimal control approach is developed using value iteration-based Q-learning (VIQL) with a critic-only structure. Most existing constrained control methods require the use of a certain performance index and suit only linear or affine nonlinear systems, which is unreasonable in practice. To overcome this problem, a system transformation is first introduced with a general performance index, and the constrained optimal control problem is converted to an unconstrained optimal control problem. By introducing the action-state value function, i.e., the Q-function, the VIQL algorithm is proposed to learn the optimal Q-function of the data-based unconstrained optimal control problem. The convergence results of the VIQL algorithm are established with an easy-to-realize initial condition. To implement the VIQL algorithm, the critic-only structure is developed, where only one neural network is required to approximate the Q-function. The converged Q-function obtained from the critic-only VIQL method is employed to design the adaptive constrained optimal controller based on the gradient descent scheme. Finally, the effectiveness of the developed adaptive control method is tested on three examples with computer simulation.
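
The value-iteration update at the heart of VIQL, Q(s, a) <- r + gamma * max over a' of Q(s', a'), applied to recorded transitions, can be shown with a toy tabular stand-in. The paper uses a neural critic for nonaffine nonlinear systems; the three-state chain below is invented:

```python
# Toy tabular value-iteration Q-learning from a fixed batch of data.
# Invented MDP: 3 states, 2 actions; action 1 moves right (capped at state 2),
# and being in state 2 pays reward 1.
S, A, gamma = 3, 2, 0.9
data = [(s, a, 1.0 if s == 2 else 0.0, min(s + a, S - 1))
        for s in range(S) for a in range(A) for _ in range(5)]  # (s, a, r, s')

Q = [[0.0] * A for _ in range(S)]
for _ in range(100):                    # value-iteration sweeps over the data
    for s, a, r, s2 in data:
        Q[s][a] = r + gamma * max(Q[s2])

greedy = [max(range(A), key=lambda a: Q[s][a]) for s in range(S)]
print(greedy[:2])  # [1, 1]: in states 0 and 1 the learned policy moves right
```

The converged table satisfies the Bellman optimality equation on the recorded transitions; the paper's critic network plays the role of this table for continuous state-action spaces.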

  14. Control and diagnostic data structures for the MFTF

    International Nuclear Information System (INIS)

    Wade, J.A.; Choy, J.H.

    1979-01-01

    A Data Base Management System (DBMS) is being written as an integral part of the Supervisory Control and Diagnostics System (SCDS) of programs for control of the Mirror Fusion Test Facility (MFTF). The data upon which the DBMS operates consist of control values and evaluative information required for facilities control, along with control values and diagnostic data acquired as a result of each MFTF shot. The user interface to the DBMS essentially consists of two views: a computer program interface called the Program Level Interface (PLI) and a stand-alone interactive program called the Query Level Interface to support terminal-based queries. This paper deals specifically with the data structure capabilities from the viewpoint of the PLI user

  15. Use of Persistent Identifiers to link Heterogeneous Data Systems in the Integrated Earth Data Applications (IEDA) Facility

    Science.gov (United States)

    Hsu, L.; Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V.; O'hara, S. H.; Walker, J. D.

    2012-12-01

    The Integrated Earth Data Applications (IEDA) facility maintains multiple data systems with a wide range of solid earth data types from the marine, terrestrial, and polar environments. Examples of the different data types include syntheses of ultra-high resolution seafloor bathymetry collected on large collaborative cruises and analytical geochemistry measurements collected by single investigators in small, unique projects. These different data types have historically been channeled into separate, discipline-specific databases with search and retrieval tailored for the specific data type. However, a current major goal is to integrate data from different systems to allow interdisciplinary data discovery and scientific analysis. To increase discovery and access across these heterogeneous systems, IEDA employs several unique IDs, including sample IDs (International Geo Sample Number, IGSN), person IDs (GeoPass ID), funding award IDs (NSF Award Number), cruise IDs (from the Marine Geoscience Data System Expedition Metadata Catalog), dataset IDs (DOIs), and publication IDs (DOIs). These IDs allow linking of a sample registry (System for Earth SAmple Registration), data libraries and repositories (e.g. Geochemical Research Library, Marine Geoscience Data System), integrated synthesis databases (e.g. EarthChem Portal, PetDB), and investigator services (IEDA Data Compliance Tool). The linked systems allow efficient discovery of related data across different levels of granularity. In addition, IEDA data systems maintain links with several external data systems, including digital journal publishers. Links have been established between the EarthChem Portal and ScienceDirect through publication DOIs, returning sample-level objects and geochemical analyses for a particular publication. Linking IEDA-hosted data to digital publications with IGSNs at the sample level and with IEDA-allocated dataset DOIs are under development. As an example, an individual investigator could sign up
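The linking pattern the abstract describes, resolving one persistent ID type (a publication DOI) to records keyed by another (IGSNs), can be sketched in a few lines. Every record, field name, and ID value below is invented for illustration; only the ID types (IGSN, DOI) come from the abstract:

```python
# Minimal sketch of ID-based linking across heterogeneous catalogs.
samples = {"IEXYZ0001": {"name": "basalt core", "cruise": "MGL1234"}}
analyses = [{"igsn": "IEXYZ0001", "SiO2_wt_pct": 49.6}]
publications = [{"doi": "10.1000/example", "igsns": ["IEXYZ0001"]}]

def samples_for_publication(doi):
    """Resolve a publication DOI to sample-level records via IGSN links."""
    for pub in publications:
        if pub["doi"] == doi:
            return [samples[i] for i in pub["igsns"] if i in samples]
    return []

print(samples_for_publication("10.1000/example"))
```

Because each system only needs to store the shared IDs, not each other's schemas, the same join works across registries, repositories, and publishers that evolve independently.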

  16. Multi-channel data acquisition and processing system for Mössbauer spectroscopy

    International Nuclear Information System (INIS)

    Jin Ge; Yang Yanming

    1987-01-01

    A multi-channel data acquisition and processing system for Mössbauer spectroscopy is described, consisting of an intelligent interface and a BC3-80 microcomputer. The system has eight data channels; each channel contains a counting circuit and a memory. A Z80 CPU serves as the main unit for control and access. The microcomputer handles real-time spectrum display, saving data to disk, printing, and data processing. The system is applicable to a high-counting-rate multi-wire proportional chamber and can greatly increase the counting rate for measuring Mössbauer spectra. The signal from each wire in the chamber passes through a corresponding amplifier and differential discriminator and is recorded by a corresponding data channel; the data of all channels are summed by the microcomputer. In addition, two channels can be used to measure an absorption and a scattering spectrum at the same time, so that the internal and surface information of the sample are obtained simultaneously.
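The host-side processing step, summing the independently accumulated per-wire spectra into one Mössbauer spectrum, is straightforward to sketch. The channel count of eight matches the abstract, but the 512-bin spectrum length and the simulated Poisson counts are assumptions:

```python
import numpy as np

# Sketch of the microcomputer's summing step: eight counting channels
# each hold an accumulated spectrum, and the host adds them bin by bin.
n_channels, n_bins = 8, 512
rng = np.random.default_rng(1)
channel_counts = rng.poisson(lam=100.0, size=(n_channels, n_bins))

summed_spectrum = channel_counts.sum(axis=0)  # combine all wires

# Alternatively, two channels can be read out separately, e.g. one
# absorption and one scattering spectrum measured simultaneously.
absorption = channel_counts[0]
scattering = channel_counts[1]
```

Summing after discrimination is what raises the effective counting rate: each wire's dead time applies only to its own channel, while the combined spectrum sees the total flux.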

  17. Power system distributed oscillation detection based on Synchrophasor data

    Science.gov (United States)

    Ning, Jiawei

    Along with increasing demand for electricity, integration of renewable energy, and deregulation of the power market, the power industry is facing unprecedented challenges. Within the last couple of decades, several serious blackouts have taken place in the United States. As an effective approach to preventing them, power system small-signal stability monitoring has been drawing growing interest and attention from researchers. With the widespread deployment of Synchrophasors around the world over the last decade, real-time online monitoring of power systems has become much more feasible. Compared with planning-study analysis, real-time online monitoring benefits control room operators immediately and directly. Among online monitoring methods, Oscillation Modal Analysis (OMA), a modal identification method based on routine measurement data whose input is unmeasured ambient excitation, is a powerful tool for evaluating and monitoring power system small-signal stability. Indeed, high-sampling-rate Synchrophasor data from around the power system fit perfectly as inputs to OMA. Existing OMA methods for power systems are all centralized algorithms running at control centers only; however, with the rapidly growing number of online Synchrophasors, the computational burden at control centers will keep expanding, and the increasing computation time compromises the real-time character of online monitoring. The communication load between substations and the control center may also become prohibitive. Meanwhile, it is difficult or even impossible for centralized algorithms to detect some poorly damped local modes. To avert these shortcomings of centralized OMA methods and embrace the changes in power systems, two new distributed oscillation detection methods with two new decentralized structures are presented in this dissertation. Since the new schemes brought substations into the big oscillation detection picture, the proposed
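The core OMA idea, identifying a mode's frequency and damping from ambient-only measurements, can be sketched with a least-squares AR(2) fit. The 0.5 Hz / 5% mode, the 30-sample-per-second Synchrophasor rate, and the model order are all illustrative assumptions, not values from the dissertation:

```python
import numpy as np

# Sketch of ambient-data modal identification: simulate a lightly
# damped mode driven by random load noise, fit an AR(2) model, and
# recover frequency and damping from the discrete pole.
fs, f0, zeta = 30.0, 0.5, 0.05        # sample rate, mode freq (Hz), damping
dt = 1.0 / fs
rng = np.random.default_rng(2)

wn = 2 * np.pi * f0
wd = wn * np.sqrt(1 - zeta**2)
pole = np.exp((-zeta * wn + 1j * wd) * dt)   # discrete pole z = e^{s dt}
a1, a2 = 2 * pole.real, -abs(pole) ** 2      # true AR(2) coefficients

y = np.zeros(3000)
e = rng.normal(size=y.size)                  # unmeasured ambient excitation
for k in range(2, y.size):
    y[k] = a1 * y[k - 1] + a2 * y[k - 2] + e[k]

# Least-squares AR(2) fit, then poles -> modal frequency and damping.
Phi = np.column_stack([y[1:-1], y[:-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
z = np.roots([1.0, -theta[0], -theta[1]])
z = z[np.argmax(z.imag)]                     # upper-half-plane pole
f_est = np.angle(z) / (2 * np.pi * dt)
zeta_est = -np.log(abs(z)) / np.hypot(np.log(abs(z)), np.angle(z))
```

A distributed scheme would run a fit like this at each substation on local Synchrophasor data, which is exactly what lets poorly damped local modes, invisible in aggregated center-side data, be caught.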

  18. Sampled-data consensus in switching networks of integrators based on edge events

    Science.gov (United States)

    Xiao, Feng; Meng, Xiangyu; Chen, Tongwen

    2015-02-01

    This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.
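The edge-event mechanism described above can be sketched for single integrators in the synchronous periodic detection case: an edge resamples its two endpoints only when one of them has drifted past a threshold, and each agent's zero-order-hold control uses the last sampled values. The chain topology, threshold, and step size below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Sketch of edge-event-triggered sampled-data consensus of integrators.
edges = [(0, 1), (1, 2), (2, 3)]               # fixed connected graph
x = np.array([1.0, -2.0, 0.5, 3.0])            # integrator states
xhat = {e: (x[e[0]], x[e[1]]) for e in edges}  # last sampled pair per edge
h, threshold, steps = 0.05, 0.02, 400          # period, event threshold

events = 0
for _ in range(steps):
    # Edge-event rule: resample an edge when either endpoint has drifted
    # from its last sampled value by more than the threshold.
    for e in edges:
        i, j = e
        if max(abs(x[i] - xhat[e][0]), abs(x[j] - xhat[e][1])) > threshold:
            xhat[e] = (x[i], x[j])
            events += 1
    u = np.zeros_like(x)
    for e in edges:                            # control uses sampled data only
        i, j = e
        u[i] += xhat[e][1] - xhat[e][0]
        u[j] += xhat[e][0] - xhat[e][1]
    x = x + h * u                              # zero-order hold over period h

spread = x.max() - x.min()                     # shrinks toward consensus
```

Because events fire only while states are still drifting apart, the total number of samplings stays well below the `steps * len(edges)` that periodic sampling of every edge would require, which is the communication saving the abstract refers to.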

  19. Autonomic and Apoptotic, Aeronautical and Aerospace Systems, and Controlling Scientific Data Generated Therefrom

    Science.gov (United States)

    Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor)

    2015-01-01

    A self-managing system that uses autonomy and autonomicity is provided with the self-* property of autopoiesis (self-creation). In the event of an agent in the system self-destructing, autopoiesis auto-generates a replacement. A self-esteem reward scheme is also provided and can be used for autonomic agents, based on their performance and trust. An agent with greater self-esteem may clone at a greater rate than an agent with lower self-esteem. A self-managing system is provided for a high volume of distributed autonomic/self-managing mobile agents, and autonomic adhesion is used to attract similar agents together or to repel dissimilar agents from an event horizon. An apoptotic system is also provided that accords an "expiry date" to data and digital objects, for example those available on the internet, which is useful not only in general but also for controlling the loaning and use of space science data.

  20. Data Description of a System

    Directory of Open Access Journals (Sweden)

    P. Nevriva

    1996-04-01

    Full Text Available In this paper, a brief discussion of describing a process by memorized data is given. This insight can offer modified views on optimal control, on data compression in communication systems with respect to the information content of messages, etc. The idea of describing a process by memorized data with different information content is presented here through the classical case study of optimal control: the data-based control algorithm (data algorithm, DA) gathers data from the controlled process and derives the control signal from data accumulated in its data base. For simplicity, the DA is assumed to run on an ideal computer limited neither by speed nor by memory capacity. The accuracy of the data algorithm is then determined by a-priori knowledge of the task and by the information exchange between the controlled process and the computer.
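A minimal sketch of such a data algorithm: the controller's only knowledge is a base of memorized (state, control) records, and it derives each control by nearest-neighbour lookup in that base. The plant, reference law, and gains are invented for illustration:

```python
import numpy as np

# Toy "data algorithm" (DA): control is derived purely from memorized
# (state, control) records rather than from an analytic model.
database = []                    # the memorized process data

def da_control(state, default=0.0):
    """Return the control stored for the closest previously seen state."""
    if not database:
        return default
    s, u = min(database, key=lambda rec: abs(rec[0] - state))
    return u

# Fill the base from a reference law u = -k*x (the a-priori knowledge),
# sampled on a grid; the grid spacing sets the DA's accuracy.
k = 0.8
for s in np.linspace(-2, 2, 41):
    database.append((s, -k * s))

# Regulate the scalar plant x' = u toward zero using memorized data only.
x = 1.5
for _ in range(30):
    x = x + 0.1 * da_control(x)
```

The residual regulation error here is set by the grid spacing of the memorized records, which mirrors the paper's point that the DA's accuracy is governed by the information content of the stored data.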